tox and pre-commit formatting

bobokun 2022-10-29 11:19:09 -04:00
parent 97f97f7c50
commit 218ca69371
34 changed files with 1453 additions and 719 deletions

6
.dockerignore Normal file → Executable file

@@ -17,7 +17,9 @@ config
Dockerfile
venv
.idea
.venv
.venv*
test.py
!config/config.yml.sample
.flake8
.flake8
qbit_manage.egg-info/
.tox

4
.flake8 Normal file → Executable file

@@ -1,5 +1,5 @@
[flake8]
ignore =
ignore =
E226, # E226 Missing whitespace around arithmetic operator
#E302, # E302 Expected 2 blank lines, found 0
E401, # E401 Multiple imports on one line
@@ -10,4 +10,4 @@ ignore =
E722, # E722 Do not use bare except, specify exception instead
W503, # W503 Line break occurred before a binary operator
E402, # E402 module level import not at top of file
max-line-length = 200
max-line-length = 130

2
.github/FUNDING.yml vendored Normal file → Executable file

@@ -1 +1 @@
github: bobokun
github: bobokun

2
.github/ISSUE_TEMPLATE/1.bug_report.yml vendored Normal file → Executable file

@@ -72,4 +72,4 @@ body:
- type: markdown
attributes:
value: |
Make sure to close your issue when it's solved! If you found the solution yourself please comment so that others benefit from it.
Make sure to close your issue when it's solved! If you found the solution yourself please comment so that others benefit from it.

0
.github/ISSUE_TEMPLATE/2.feature_request.yml vendored Normal file → Executable file

2
.github/ISSUE_TEMPLATE/3.docs_request.yml vendored Normal file → Executable file

@@ -22,4 +22,4 @@ body:
label: Does the docs page already exist? Please link to it.
description: 'Example: https://github.com/StuffAnThings/qbit_manage/wiki/existingpagehere'
validations:
required: false
required: false

2
.github/ISSUE_TEMPLATE/config.yml vendored Normal file → Executable file

@@ -8,4 +8,4 @@ contact_links:
about: Please post your question under the `qbit-manage` channel for support issues.
- name: Ask a question
url: https://github.com/StuffAnThings/qbit_manage/discussions
about: Ask questions and discuss with other community members
about: Ask questions and discuss with other community members

4
.github/dependabot.yml vendored Normal file → Executable file

@@ -16,8 +16,8 @@ updates:
directory: '/'
schedule:
interval: daily
assignees:
assignees:
- "bobokun"
target-branch: "develop"
ignore:
- dependency-name: "salsify/action-detect-and-tag-new-version"
- dependency-name: "salsify/action-detect-and-tag-new-version"

2
.github/pull_request_template.md vendored Normal file → Executable file

@@ -26,4 +26,4 @@ Please delete options that are not relevant.
- [ ] I have performed a self-review of my own code
- [ ] I have commented my code, particularly in hard-to-understand areas
- [ ] I have added or updated the docstring for new or existing methods
- [ ] I have added tests when applicable
- [ ] I have added tests when applicable

2
.github/workflows/develop.yml vendored Normal file → Executable file

@@ -48,4 +48,4 @@ jobs:
file: ./Dockerfile
platforms: linux/amd64,linux/arm64,linux/arm/v7
push: true
tags: ${{ secrets.DOCKER_HUB_USERNAME }}/qbit_manage:develop
tags: ${{ secrets.DOCKER_HUB_USERNAME }}/qbit_manage:develop

2
.github/workflows/latest.yml vendored Normal file → Executable file

@@ -44,4 +44,4 @@ jobs:
file: ./Dockerfile
platforms: linux/amd64,linux/arm64,linux/arm/v7
push: true
tags: ${{ secrets.DOCKER_HUB_USERNAME }}/qbit_manage:latest
tags: ${{ secrets.DOCKER_HUB_USERNAME }}/qbit_manage:latest

2
.github/workflows/tag.yml vendored Normal file → Executable file

@@ -15,4 +15,4 @@ jobs:
- uses: salsify/action-detect-and-tag-new-version@v1.0.3
with:
version-command: |
cat VERSION
cat VERSION

2
.github/workflows/version.yml vendored Normal file → Executable file

@@ -42,4 +42,4 @@ jobs:
file: ./Dockerfile
platforms: linux/amd64,linux/arm64,linux/arm/v7
push: true
tags: ${{ secrets.DOCKER_HUB_USERNAME }}/qbit_manage:${{ steps.get_version.outputs.VERSION }}
tags: ${{ secrets.DOCKER_HUB_USERNAME }}/qbit_manage:${{ steps.get_version.outputs.VERSION }}

4
.gitignore vendored Normal file → Executable file

@@ -8,4 +8,6 @@ __pycache__/
.vscode/*
!.github/**
*.svg
.venv
.venv*
qbit_manage.egg-info/
.tox

41
.pre-commit-config.yaml Executable file

@@ -0,0 +1,41 @@
---
repos:
- repo: https://github.com/pre-commit/pre-commit-hooks
rev: v4.3.0
hooks:
- id: trailing-whitespace
- id: end-of-file-fixer
- id: check-json
- id: check-yaml
- id: debug-statements
- id: requirements-txt-fixer
- id: fix-encoding-pragma
args: [--remove]
- id: pretty-format-json
args: [--autofix, --indent, '4', --no-sort-keys]
- repo: https://github.com/adrienverge/yamllint.git
rev: v1.21.0 # or higher tag
hooks:
- id: yamllint
args: [--format, parsable, --strict]
exclude: ^.github/
- repo: https://github.com/lyz-code/yamlfix
rev: 1.1.0
hooks:
- id: yamlfix
exclude: ^.github/
- repo: https://github.com/asottile/reorder_python_imports
rev: v3.9.0
hooks:
- id: reorder-python-imports
- repo: https://github.com/psf/black
rev: 22.10.0
hooks:
- id: black
language_version: python3
args: [--line-length, '130']
- repo: https://gitlab.com/pycqa/flake8
rev: 5.0.4
hooks:
- id: flake8
args: [--config=.flake8]
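With this hook configuration in place, contributors typically enable the hooks once per clone with "pre-commit install" and can apply every hook repository-wide with "pre-commit run --all-files" (standard pre-commit CLI usage, not part of this commit). Note that the black hook's --line-length 130 matches the flake8 max-line-length of 130 set earlier in this commit.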

4
Dockerfile Normal file → Executable file

@@ -1,4 +1,4 @@
FROM python:3.9-alpine
FROM python:3.10-alpine
# install packages
RUN apk add --no-cache gcc g++ libxml2-dev libxslt-dev shadow bash curl wget jq grep sed coreutils findutils unzip p7zip ca-certificates
@@ -12,4 +12,4 @@ RUN echo "**** install python packages ****" \
COPY . /app
WORKDIR /app
VOLUME /config
ENTRYPOINT ["python3", "qbit_manage.py"]
ENTRYPOINT ["python3", "qbit_manage.py"]

0
LICENSE Normal file → Executable file

3
README.md Normal file → Executable file

@@ -4,6 +4,7 @@
[![GitHub commits since latest release (by SemVer)](https://img.shields.io/github/commits-since/StuffAnThings/qbit_manage/latest/develop?label=Commits%20in%20Develop&style=plastic)](https://github.com/StuffAnThings/qbit_manage/tree/develop)
[![Docker Image Version (latest semver)](https://img.shields.io/docker/v/bobokun/qbit_manage?label=docker&sort=semver&style=plastic)](https://hub.docker.com/r/bobokun/qbit_manage)
![Github Workflow Status](https://img.shields.io/github/workflow/status/StuffAnThings/qbit_manage/Docker%20Latest%20Release?style=plastic)
[![pre-commit.ci status](https://results.pre-commit.ci/badge/github/StuffAnThings/qbit_manage/master.svg)](https://results.pre-commit.ci/latest/github/StuffAnThings/qbit_manage/master)
[![Docker Pulls](https://img.shields.io/docker/pulls/bobokun/qbit_manage?style=plastic)](https://hub.docker.com/r/bobokun/qbit_manage)
[![Sponsor or Donate](https://img.shields.io/badge/-Sponsor_or_Donate-blueviolet?style=plastic)](https://github.com/sponsors/bobokun)
@@ -23,7 +24,7 @@ This is a program used to manage your qBittorrent instance such as:
Check out the [wiki](https://github.com/StuffAnThings/qbit_manage/wiki) for installation help
1. Install qbit_manage either by installing Python3 on the localhost and following the [Local Installation](https://github.com/StuffAnThings/qbit_manage/wiki/Local-Installations) Guide or by installing Docker and following the [Docker Installation](https://github.com/StuffAnThings/qbit_manage/wiki/Docker-Installation) Guide or the [unRAID Installation](https://github.com/StuffAnThings/qbit_manage/wiki/Unraid-Installation) Guide.<br>
2. Once installed, you have to [set up your Configuration](https://github.com/StuffAnThings/qbit_manage/wiki/Config-Setup) by create a [Configuration File](https://github.com/StuffAnThings/qbit_manage/blob/master/config/config.yml.sample) filled with all your values to connect to your qBittorrent instance.
2. Once installed, you have to [set up your Configuration](https://github.com/StuffAnThings/qbit_manage/wiki/Config-Setup) by create a [Configuration File](https://github.com/StuffAnThings/qbit_manage/blob/master/config/config.yml.sample) filled with all your values to connect to your qBittorrent instance.
3. Please refer to the list of [Commands](https://github.com/StuffAnThings/qbit_manage/wiki/Commands) that can be used with this tool.
## Usage
To run the script in an interactive terminal with a list of possible commands run:

2
VERSION Normal file → Executable file

@@ -1 +1 @@
3.3.1
3.3.1

16
config/config.yml.sample Normal file → Executable file

@@ -71,7 +71,7 @@ tracker:
# <OPTIONAL> Will set the torrent Maximum seeding time (min) until torrent is stopped from seeding. -2 means the global limit should be used, -1 means no limit.
# max_seeding_time: 129600
# <OPTIONAL> Will ensure that noHL torrents from this tracker are not deleted by cleanup variable if torrent has not yet met the minimum seeding time (min).
# min_seeding_time: 2000
# min_seeding_time: 2000
# <OPTIONAL> Will limit the upload speed KiB/s (KiloBytes/second) (-1 sets the limit to infinity)
# limit_upload_speed: 150
# <OPTIONAL> Set this to the notifiarr react name. This is used to add indexer reactions to the notifications sent by Notifiarr
@@ -148,8 +148,8 @@ nohardlinks:
max_seeding_time: 86400
# <OPTIONAL> Limit Upload Speed var: Will limit the upload speed KiB/s (KiloBytes/second) (`-1` : No Limit)
limit_upload_speed:
# <OPTIONAL> min seeding time var: Will prevent torrent deletion by cleanup variable if torrent has not yet minimum seeding time (min).
# Delete this key from a category's config to use the tracker's configured min_seeding_time. Will default to 0 if not specified for the category or tracker.
# <OPTIONAL> min seeding time var: Will prevent torrent deletion by cleanup variable if torrent has not yet minimum seeding time (min).
# Delete this key from a category's config to use the tracker's configured min_seeding_time. Will default to 0 if not specified for the category or tracker.
min_seeding_time: 43200
# Can have additional categories set with separate ratio/seeding times defined.
@@ -166,14 +166,14 @@ nohardlinks:
max_seeding_time: 86400
# <OPTIONAL> Limit Upload Speed var: Will limit the upload speed KiB/s (KiloBytes/second) (`-1` : No Limit)
limit_upload_speed:
# <OPTIONAL> min seeding time var: Will prevent torrent deletion by cleanup variable if torrent has not yet minimum seeding time (min).
# min_seeding_time: # Not specified for this category; tracker's value will be used. Will default to 0 if not specified for the category or tracker.
# <OPTIONAL> min seeding time var: Will prevent torrent deletion by cleanup variable if torrent has not yet minimum seeding time (min).
# min_seeding_time: # Not specified for this category; tracker's value will be used. Will default to 0 if not specified for the category or tracker.
recyclebin:
# Recycle Bin method of deletion will move files into the recycle bin (Located in /root_dir/.RecycleBin) instead of directly deleting them in qbit
# By default the Recycle Bin will be emptied on every run of the qbit_manage script if empty_after_x_days is defined.
enabled: true
# <OPTIONAL> empty_after_x_days var:
# <OPTIONAL> empty_after_x_days var:
# Will automatically remove all files and folders in recycle bin after x days. (Checks every script run)
# If this variable is not defined it, the RecycleBin will never be emptied.
# WARNING: Setting this variable to 0 will delete all files immediately upon script run!
@@ -183,7 +183,7 @@ recyclebin:
# This will save a copy of your .torrent and .fastresume file in the recycle bin before deleting it from qbittorrent
save_torrents: true
# <OPTIONAL> split_by_category var:
# This will split the recycle bin folder by the save path defined in the `cat` attribute
# This will split the recycle bin folder by the save path defined in the `cat` attribute
# and add the base folder name of the recycle bin that was defined in the `recycle_bin` sub-attribute under directory.
split_by_category: false
@@ -240,4 +240,4 @@ webhooks:
bhd:
# BHD Integration used for checking unregistered torrents
apikey:
apikey:

1
modules/apprise.py Normal file → Executable file

@@ -1,5 +1,6 @@
from modules import util
from modules.util import Failed
logger = util.logger

3
modules/bhd.py Normal file → Executable file

@@ -1,6 +1,7 @@
from json import JSONDecodeError
from modules import util
from modules.util import Failed
from json import JSONDecodeError
logger = util.logger
base_url = "https://beyond-hd.me/api/"

579
modules/config.py Normal file → Executable file

@@ -1,13 +1,21 @@
import os, requests, stat, time, re
from modules import util
from modules.util import Failed, check, YAML
from modules.qbittorrent import Qbt
from modules.webhooks import Webhooks
from modules.notifiarr import Notifiarr
from modules.bhd import BeyondHD
from modules.apprise import Apprise
import os
import re
import stat
import time
import requests
from retrying import retry
from modules import util
from modules.apprise import Apprise
from modules.bhd import BeyondHD
from modules.notifiarr import Notifiarr
from modules.qbittorrent import Qbt
from modules.util import check
from modules.util import Failed
from modules.util import YAML
from modules.webhooks import Webhooks
logger = util.logger
@@ -16,11 +24,16 @@ class Config:
logger.info("Locating config...")
self.args = args
config_file = args["config_file"]
if config_file and os.path.exists(config_file): self.config_path = os.path.abspath(config_file)
elif config_file and os.path.exists(os.path.join(default_dir, config_file)): self.config_path = os.path.abspath(os.path.join(default_dir, config_file))
elif config_file and not os.path.exists(config_file): raise Failed(f"Config Error: config not found at {os.path.abspath(config_file)}")
elif os.path.exists(os.path.join(default_dir, "config.yml")): self.config_path = os.path.abspath(os.path.join(default_dir, "config.yml"))
else: raise Failed(f"Config Error: config not found at {os.path.abspath(default_dir)}")
if config_file and os.path.exists(config_file):
self.config_path = os.path.abspath(config_file)
elif config_file and os.path.exists(os.path.join(default_dir, config_file)):
self.config_path = os.path.abspath(os.path.join(default_dir, config_file))
elif config_file and not os.path.exists(config_file):
raise Failed(f"Config Error: config not found at {os.path.abspath(config_file)}")
elif os.path.exists(os.path.join(default_dir, "config.yml")):
self.config_path = os.path.abspath(os.path.join(default_dir, "config.yml"))
else:
raise Failed(f"Config Error: config not found at {os.path.abspath(default_dir)}")
logger.info(f"Using {self.config_path} as config")
self.util = check(self)
@@ -37,19 +50,19 @@ class Config:
if self.data["commands"] is not None:
logger.info(f"Commands found in {config_file}, ignoring env variables and using config commands instead.")
self.commands = self.data.pop("commands")
if 'dry_run' not in self.commands:
self.commands['dry_run'] = args['dry_run'] if 'dry_run' in args else False
if "dry_run" not in self.commands:
self.commands["dry_run"] = args["dry_run"] if "dry_run" in args else False
# Add default any missing commands as False
for v in [
'cross_seed',
'recheck',
'cat_update',
'tag_update',
'rem_unregistered',
'tag_tracker_error',
'rem_orphaned',
'tag_nohardlinks',
'skip_cleanup'
"cross_seed",
"recheck",
"cat_update",
"tag_update",
"rem_unregistered",
"tag_tracker_error",
"rem_orphaned",
"tag_nohardlinks",
"skip_cleanup",
]:
if v not in self.commands:
self.commands[v] = False
@@ -67,22 +80,34 @@ class Config:
else:
self.commands = args
if "qbt" in self.data: self.data["qbt"] = self.data.pop("qbt")
if "qbt" in self.data:
self.data["qbt"] = self.data.pop("qbt")
self.data["settings"] = self.data.pop("settings") if "settings" in self.data else {}
if "directory" in self.data: self.data["directory"] = self.data.pop("directory")
if "directory" in self.data:
self.data["directory"] = self.data.pop("directory")
self.data["cat"] = self.data.pop("cat") if "cat" in self.data else {}
if "cat_change" in self.data: self.data["cat_change"] = self.data.pop("cat_change")
if "tracker" in self.data: self.data["tracker"] = self.data.pop("tracker")
elif "tags" in self.data: self.data["tracker"] = self.data.pop("tags")
else: self.data["tracker"] = {}
if "nohardlinks" in self.data: self.data["nohardlinks"] = self.data.pop("nohardlinks")
if "recyclebin" in self.data: self.data["recyclebin"] = self.data.pop("recyclebin")
if "orphaned" in self.data: self.data["orphaned"] = self.data.pop("orphaned")
if "apprise" in self.data: self.data["apprise"] = self.data.pop("apprise")
if "notifiarr" in self.data: self.data["notifiarr"] = self.data.pop("notifiarr")
if "cat_change" in self.data:
self.data["cat_change"] = self.data.pop("cat_change")
if "tracker" in self.data:
self.data["tracker"] = self.data.pop("tracker")
elif "tags" in self.data:
self.data["tracker"] = self.data.pop("tags")
else:
self.data["tracker"] = {}
if "nohardlinks" in self.data:
self.data["nohardlinks"] = self.data.pop("nohardlinks")
if "recyclebin" in self.data:
self.data["recyclebin"] = self.data.pop("recyclebin")
if "orphaned" in self.data:
self.data["orphaned"] = self.data.pop("orphaned")
if "apprise" in self.data:
self.data["apprise"] = self.data.pop("apprise")
if "notifiarr" in self.data:
self.data["notifiarr"] = self.data.pop("notifiarr")
if "webhooks" in self.data:
temp = self.data.pop("webhooks")
if 'function' not in temp or ('function' in temp and temp['function'] is None): temp["function"] = {}
if "function" not in temp or ("function" in temp and temp["function"] is None):
temp["function"] = {}
def hooks(attr):
if attr in temp:
@@ -92,6 +117,7 @@ class Config:
if attr not in temp["function"]:
temp["function"][attr] = {}
temp["function"][attr] = None
hooks("cross_seed")
hooks("recheck")
hooks("cat_update")
@@ -101,34 +127,47 @@ class Config:
hooks("tag_nohardlinks")
hooks("cleanup_dirs")
self.data["webhooks"] = temp
if "bhd" in self.data: self.data["bhd"] = self.data.pop("bhd")
if "bhd" in self.data:
self.data["bhd"] = self.data.pop("bhd")
self.session = requests.Session()
self.settings = {
"force_auto_tmm": self.util.check_for_attribute(self.data, "force_auto_tmm", parent="settings", var_type="bool", default=False),
"tracker_error_tag": self.util.check_for_attribute(self.data, "tracker_error_tag", parent="settings", default='issue')
"force_auto_tmm": self.util.check_for_attribute(
self.data, "force_auto_tmm", parent="settings", var_type="bool", default=False
),
"tracker_error_tag": self.util.check_for_attribute(
self.data, "tracker_error_tag", parent="settings", default="issue"
),
}
default_ignore_tags = ['noHL', self.settings["tracker_error_tag"], 'cross-seed']
self.settings["ignoreTags_OnUpdate"] = self.util.check_for_attribute(self.data, "ignoreTags_OnUpdate", parent="settings", default=default_ignore_tags, var_type="list")
default_ignore_tags = ["noHL", self.settings["tracker_error_tag"], "cross-seed"]
self.settings["ignoreTags_OnUpdate"] = self.util.check_for_attribute(
self.data, "ignoreTags_OnUpdate", parent="settings", default=default_ignore_tags, var_type="list"
)
default_function = {
'cross_seed': None,
'recheck': None,
'cat_update': None,
'tag_update': None,
'rem_unregistered': None,
'tag_tracker_error': None,
'rem_orphaned': None,
'tag_nohardlinks': None,
'cleanup_dirs': None,
"cross_seed": None,
"recheck": None,
"cat_update": None,
"tag_update": None,
"rem_unregistered": None,
"tag_tracker_error": None,
"rem_orphaned": None,
"tag_nohardlinks": None,
"cleanup_dirs": None,
}
self.webhooks = {
"error": self.util.check_for_attribute(self.data, "error", parent="webhooks", var_type="list", default_is_none=True),
"run_start": self.util.check_for_attribute(self.data, "run_start", parent="webhooks", var_type="list", default_is_none=True),
"run_end": self.util.check_for_attribute(self.data, "run_end", parent="webhooks", var_type="list", default_is_none=True),
"function": self.util.check_for_attribute(self.data, "function", parent="webhooks", var_type="list", default=default_function)
"run_start": self.util.check_for_attribute(
self.data, "run_start", parent="webhooks", var_type="list", default_is_none=True
),
"run_end": self.util.check_for_attribute(
self.data, "run_end", parent="webhooks", var_type="list", default_is_none=True
),
"function": self.util.check_for_attribute(
self.data, "function", parent="webhooks", var_type="list", default=default_function
),
}
for func in default_function:
self.util.check_for_attribute(self.data, func, parent="webhooks", subparent="function", default_is_none=True)
@@ -140,10 +179,17 @@ class Config:
if self.data["apprise"] is not None:
logger.info("Connecting to Apprise...")
try:
self.AppriseFactory = Apprise(self, {
"api_url": self.util.check_for_attribute(self.data, "api_url", parent="apprise", var_type="url", throw=True),
"notify_url": self.util.check_for_attribute(self.data, "notify_url", parent="apprise", var_type="list", throw=True),
})
self.AppriseFactory = Apprise(
self,
{
"api_url": self.util.check_for_attribute(
self.data, "api_url", parent="apprise", var_type="url", throw=True
),
"notify_url": self.util.check_for_attribute(
self.data, "notify_url", parent="apprise", var_type="list", throw=True
),
},
)
except Failed as e:
logger.error(e)
logger.info(f"Apprise Connection {'Failed' if self.AppriseFactory is None else 'Successful'}")
@@ -153,10 +199,15 @@ class Config:
if self.data["notifiarr"] is not None:
logger.info("Connecting to Notifiarr...")
try:
self.NotifiarrFactory = Notifiarr(self, {
"apikey": self.util.check_for_attribute(self.data, "apikey", parent="notifiarr", throw=True),
"instance": self.util.check_for_attribute(self.data, "instance", parent="notifiarr", default=False, do_print=False, save=False)
})
self.NotifiarrFactory = Notifiarr(
self,
{
"apikey": self.util.check_for_attribute(self.data, "apikey", parent="notifiarr", throw=True),
"instance": self.util.check_for_attribute(
self.data, "instance", parent="notifiarr", default=False, do_print=False, save=False
),
},
)
except Failed as e:
logger.error(e)
logger.info(f"Notifiarr Connection {'Failed' if self.NotifiarrFactory is None else 'Successful'}")
@@ -173,192 +224,337 @@ class Config:
if self.data["bhd"] is not None:
logger.info("Connecting to BHD API...")
try:
self.BeyondHD = BeyondHD(self, {
"apikey": self.util.check_for_attribute(self.data, "apikey", parent="bhd", throw=True)
})
self.BeyondHD = BeyondHD(
self, {"apikey": self.util.check_for_attribute(self.data, "apikey", parent="bhd", throw=True)}
)
except Failed as e:
logger.error(e)
self.notify(e, 'BHD')
self.notify(e, "BHD")
logger.info(f"BHD Connection {'Failed' if self.BeyondHD is None else 'Successful'}")
# nohardlinks
self.nohardlinks = None
if "nohardlinks" in self.data and self.commands['tag_nohardlinks']:
if "nohardlinks" in self.data and self.commands["tag_nohardlinks"]:
self.nohardlinks = {}
for cat in self.data["nohardlinks"]:
if cat in list(self.data["cat"].keys()):
self.nohardlinks[cat] = {}
self.nohardlinks[cat]["exclude_tags"] = self.util.check_for_attribute(self.data, "exclude_tags", parent="nohardlinks", subparent=cat,
var_type="list", default_is_none=True, do_print=False)
self.nohardlinks[cat]["cleanup"] = self.util.check_for_attribute(self.data, "cleanup", parent="nohardlinks", subparent=cat, var_type="bool", default=False, do_print=False)
self.nohardlinks[cat]['max_ratio'] = self.util.check_for_attribute(self.data, "max_ratio", parent="nohardlinks", subparent=cat,
var_type="float", default_int=-2, default_is_none=True, do_print=False)
self.nohardlinks[cat]['max_seeding_time'] = self.util.check_for_attribute(self.data, "max_seeding_time", parent="nohardlinks", subparent=cat,
var_type="int", default_int=-2, default_is_none=True, do_print=False)
self.nohardlinks[cat]['min_seeding_time'] = self.util.check_for_attribute(self.data, "min_seeding_time", parent="nohardlinks", subparent=cat,
var_type="int", default_int=0, default_is_none=True, do_print=False)
self.nohardlinks[cat]['limit_upload_speed'] = self.util.check_for_attribute(self.data, "limit_upload_speed", parent="nohardlinks", subparent=cat,
var_type="int", default_int=-1, default_is_none=True, do_print=False)
self.nohardlinks[cat]["exclude_tags"] = self.util.check_for_attribute(
self.data,
"exclude_tags",
parent="nohardlinks",
subparent=cat,
var_type="list",
default_is_none=True,
do_print=False,
)
self.nohardlinks[cat]["cleanup"] = self.util.check_for_attribute(
self.data, "cleanup", parent="nohardlinks", subparent=cat, var_type="bool", default=False, do_print=False
)
self.nohardlinks[cat]["max_ratio"] = self.util.check_for_attribute(
self.data,
"max_ratio",
parent="nohardlinks",
subparent=cat,
var_type="float",
default_int=-2,
default_is_none=True,
do_print=False,
)
self.nohardlinks[cat]["max_seeding_time"] = self.util.check_for_attribute(
self.data,
"max_seeding_time",
parent="nohardlinks",
subparent=cat,
var_type="int",
default_int=-2,
default_is_none=True,
do_print=False,
)
self.nohardlinks[cat]["min_seeding_time"] = self.util.check_for_attribute(
self.data,
"min_seeding_time",
parent="nohardlinks",
subparent=cat,
var_type="int",
default_int=0,
default_is_none=True,
do_print=False,
)
self.nohardlinks[cat]["limit_upload_speed"] = self.util.check_for_attribute(
self.data,
"limit_upload_speed",
parent="nohardlinks",
subparent=cat,
var_type="int",
default_int=-1,
default_is_none=True,
do_print=False,
)
else:
e = (f"Config Error: Category {cat} is defined under nohardlinks attribute but is not defined in the cat attribute.")
self.notify(e, 'Config')
e = f"Config Error: Category {cat} is defined under nohardlinks attribute "
"but is not defined in the cat attribute."
self.notify(e, "Config")
raise Failed(e)
else:
if self.commands["tag_nohardlinks"]:
e = "Config Error: nohardlinks attribute not found"
self.notify(e, 'Config')
self.notify(e, "Config")
raise Failed(e)
# Add RecycleBin
self.recyclebin = {}
self.recyclebin['enabled'] = self.util.check_for_attribute(self.data, "enabled", parent="recyclebin", var_type="bool", default=True)
self.recyclebin['empty_after_x_days'] = self.util.check_for_attribute(self.data, "empty_after_x_days", parent="recyclebin", var_type="int", default_is_none=True)
self.recyclebin['save_torrents'] = self.util.check_for_attribute(self.data, "save_torrents", parent="recyclebin", var_type="bool", default=False)
self.recyclebin['split_by_category'] = self.util.check_for_attribute(self.data, "split_by_category", parent="recyclebin", var_type="bool", default=False)
self.recyclebin["enabled"] = self.util.check_for_attribute(
self.data, "enabled", parent="recyclebin", var_type="bool", default=True
)
self.recyclebin["empty_after_x_days"] = self.util.check_for_attribute(
self.data, "empty_after_x_days", parent="recyclebin", var_type="int", default_is_none=True
)
self.recyclebin["save_torrents"] = self.util.check_for_attribute(
self.data, "save_torrents", parent="recyclebin", var_type="bool", default=False
)
self.recyclebin["split_by_category"] = self.util.check_for_attribute(
self.data, "split_by_category", parent="recyclebin", var_type="bool", default=False
)
# Assign directories
if "directory" in self.data:
self.root_dir = os.path.join(self.util.check_for_attribute(self.data, "root_dir", parent="directory", default_is_none=True), '')
self.remote_dir = os.path.join(self.util.check_for_attribute(self.data, "remote_dir", parent="directory", default=self.root_dir), '')
if (self.commands["cross_seed"] or self.commands["tag_nohardlinks"] or self.commands["rem_orphaned"]):
self.remote_dir = self.util.check_for_attribute(self.data, "remote_dir", parent="directory", var_type="path", default=self.root_dir)
self.root_dir = os.path.join(
self.util.check_for_attribute(self.data, "root_dir", parent="directory", default_is_none=True), ""
)
self.remote_dir = os.path.join(
self.util.check_for_attribute(self.data, "remote_dir", parent="directory", default=self.root_dir), ""
)
if self.commands["cross_seed"] or self.commands["tag_nohardlinks"] or self.commands["rem_orphaned"]:
self.remote_dir = self.util.check_for_attribute(
self.data, "remote_dir", parent="directory", var_type="path", default=self.root_dir
)
else:
if self.recyclebin['enabled']:
self.remote_dir = self.util.check_for_attribute(self.data, "remote_dir", parent="directory", var_type="path", default=self.root_dir)
if self.recyclebin["enabled"]:
self.remote_dir = self.util.check_for_attribute(
self.data, "remote_dir", parent="directory", var_type="path", default=self.root_dir
)
if self.commands["cross_seed"]:
self.cross_seed_dir = self.util.check_for_attribute(self.data, "cross_seed", parent="directory", var_type="path")
else:
self.cross_seed_dir = self.util.check_for_attribute(self.data, "cross_seed", parent="directory", default_is_none=True)
self.cross_seed_dir = self.util.check_for_attribute(
self.data, "cross_seed", parent="directory", default_is_none=True
)
if self.commands["rem_orphaned"]:
if "orphaned_dir" in self.data["directory"] and self.data["directory"]["orphaned_dir"] is not None:
default_orphaned = os.path.join(self.remote_dir, os.path.basename(self.data['directory']['orphaned_dir'].rstrip(os.sep)))
default_orphaned = os.path.join(
self.remote_dir, os.path.basename(self.data["directory"]["orphaned_dir"].rstrip(os.sep))
)
else:
default_orphaned = os.path.join(self.remote_dir, 'orphaned_data')
self.orphaned_dir = self.util.check_for_attribute(self.data, "orphaned_dir", parent="directory", var_type="path", default=default_orphaned, make_dirs=True)
default_orphaned = os.path.join(self.remote_dir, "orphaned_data")
self.orphaned_dir = self.util.check_for_attribute(
self.data, "orphaned_dir", parent="directory", var_type="path", default=default_orphaned, make_dirs=True
)
else:
self.orphaned_dir = None
if self.recyclebin['enabled']:
if self.recyclebin["enabled"]:
if "recycle_bin" in self.data["directory"] and self.data["directory"]["recycle_bin"] is not None:
default_recycle = os.path.join(self.remote_dir, os.path.basename(self.data['directory']['recycle_bin'].rstrip(os.sep)))
default_recycle = os.path.join(
self.remote_dir, os.path.basename(self.data["directory"]["recycle_bin"].rstrip(os.sep))
)
else:
default_recycle = os.path.join(self.remote_dir, '.RecycleBin')
if self.recyclebin['split_by_category']:
self.recycle_dir = self.util.check_for_attribute(self.data, "recycle_bin", parent="directory", default=default_recycle)
default_recycle = os.path.join(self.remote_dir, ".RecycleBin")
if self.recyclebin["split_by_category"]:
self.recycle_dir = self.util.check_for_attribute(
self.data, "recycle_bin", parent="directory", default=default_recycle
)
else:
self.recycle_dir = self.util.check_for_attribute(self.data, "recycle_bin", parent="directory", var_type="path", default=default_recycle, make_dirs=True)
self.recycle_dir = self.util.check_for_attribute(
self.data, "recycle_bin", parent="directory", var_type="path", default=default_recycle, make_dirs=True
)
else:
self.recycle_dir = None
if self.recyclebin['enabled'] and self.recyclebin['save_torrents']:
if self.recyclebin["enabled"] and self.recyclebin["save_torrents"]:
self.torrents_dir = self.util.check_for_attribute(self.data, "torrents_dir", parent="directory", var_type="path")
if not any(File.endswith(".torrent") for File in os.listdir(self.torrents_dir)):
e = f"Config Error: The location {self.torrents_dir} does not contain any .torrents"
self.notify(e, 'Config')
self.notify(e, "Config")
raise Failed(e)
else:
self.torrents_dir = self.util.check_for_attribute(self.data, "torrents_dir", parent="directory", default_is_none=True)
self.torrents_dir = self.util.check_for_attribute(
self.data, "torrents_dir", parent="directory", default_is_none=True
)
else:
e = "Config Error: directory attribute not found"
self.notify(e, 'Config')
self.notify(e, "Config")
raise Failed(e)
# Add Orphaned
self.orphaned = {}
self.orphaned['empty_after_x_days'] = self.util.check_for_attribute(self.data, "empty_after_x_days", parent="orphaned", var_type="int", default_is_none=True)
self.orphaned['exclude_patterns'] = self.util.check_for_attribute(self.data, "exclude_patterns", parent="orphaned", var_type="list", default_is_none=True, do_print=False)
self.orphaned["empty_after_x_days"] = self.util.check_for_attribute(
self.data, "empty_after_x_days", parent="orphaned", var_type="int", default_is_none=True
)
self.orphaned["exclude_patterns"] = self.util.check_for_attribute(
self.data, "exclude_patterns", parent="orphaned", var_type="list", default_is_none=True, do_print=False
)
if self.commands["rem_orphaned"]:
exclude_orphaned = f"**{os.sep}{os.path.basename(self.orphaned_dir.rstrip(os.sep))}{os.sep}*"
self.orphaned['exclude_patterns'].append(exclude_orphaned) if exclude_orphaned not in self.orphaned['exclude_patterns'] else self.orphaned['exclude_patterns']
if self.recyclebin['enabled']:
self.orphaned["exclude_patterns"].append(exclude_orphaned) if exclude_orphaned not in self.orphaned[
"exclude_patterns"
] else self.orphaned["exclude_patterns"]
if self.recyclebin["enabled"]:
exclude_recycle = f"**{os.sep}{os.path.basename(self.recycle_dir.rstrip(os.sep))}{os.sep}*"
self.orphaned['exclude_patterns'].append(exclude_recycle) if exclude_recycle not in self.orphaned['exclude_patterns'] else self.orphaned['exclude_patterns']
self.orphaned["exclude_patterns"].append(exclude_recycle) if exclude_recycle not in self.orphaned[
"exclude_patterns"
] else self.orphaned["exclude_patterns"]
# Connect to Qbittorrent
self.qbt = None
if "qbt" in self.data:
logger.info("Connecting to Qbittorrent...")
self.qbt = Qbt(self, {
"host": self.util.check_for_attribute(self.data, "host", parent="qbt", throw=True),
"username": self.util.check_for_attribute(self.data, "user", parent="qbt", default_is_none=True),
"password": self.util.check_for_attribute(self.data, "pass", parent="qbt", default_is_none=True)
})
self.qbt = Qbt(
self,
{
"host": self.util.check_for_attribute(self.data, "host", parent="qbt", throw=True),
"username": self.util.check_for_attribute(self.data, "user", parent="qbt", default_is_none=True),
"password": self.util.check_for_attribute(self.data, "pass", parent="qbt", default_is_none=True),
},
)
else:
e = "Config Error: qbt attribute not found"
self.notify(e, 'Config')
self.notify(e, "Config")
raise Failed(e)
# Get tags from config file based on keyword
def get_tags(self, urls):
tracker = {}
tracker['tag'] = None
tracker['max_ratio'] = None
tracker['min_seeding_time'] = None
tracker['max_seeding_time'] = None
tracker['limit_upload_speed'] = None
tracker['notifiarr'] = None
tracker['url'] = None
if not urls: return tracker
tracker["tag"] = None
tracker["max_ratio"] = None
tracker["min_seeding_time"] = None
tracker["max_seeding_time"] = None
tracker["limit_upload_speed"] = None
tracker["notifiarr"] = None
tracker["url"] = None
if not urls:
return tracker
try:
tracker['url'] = util.trunc_val(urls[0], os.sep)
tracker["url"] = util.trunc_val(urls[0], os.sep)
except IndexError as e:
tracker['url'] = None
tracker["url"] = None
logger.debug(f"Tracker Url:{urls}")
logger.debug(e)
if 'tracker' in self.data and self.data["tracker"] is not None:
tag_values = self.data['tracker']
if "tracker" in self.data and self.data["tracker"] is not None:
tag_values = self.data["tracker"]
for tag_url, tag_details in tag_values.items():
for url in urls:
if tag_url in url:
try:
tracker['url'] = util.trunc_val(url, os.sep)
default_tag = tracker['url'].split(os.sep)[2].split(':')[0]
tracker["url"] = util.trunc_val(url, os.sep)
default_tag = tracker["url"].split(os.sep)[2].split(":")[0]
except IndexError as e:
logger.debug(f"Tracker Url:{url}")
logger.debug(e)
# If using Format 1 convert to format 2
if isinstance(tag_details, str):
tracker['tag'] = self.util.check_for_attribute(self.data, tag_url, parent="tracker", default=default_tag, var_type="list")
self.util.check_for_attribute(self.data, "tag", parent="tracker", subparent=tag_url, default=tracker['tag'], do_print=False, var_type="list")
if tracker['tag'] == default_tag:
tracker["tag"] = self.util.check_for_attribute(
self.data, tag_url, parent="tracker", default=default_tag, var_type="list"
)
self.util.check_for_attribute(
self.data,
"tag",
parent="tracker",
subparent=tag_url,
default=tracker["tag"],
do_print=False,
var_type="list",
)
if tracker["tag"] == default_tag:
try:
self.data['tracker'][tag_url]['tag'] = [default_tag]
self.data["tracker"][tag_url]["tag"] = [default_tag]
except Exception:
self.data['tracker'][tag_url] = {'tag': [default_tag]}
self.data["tracker"][tag_url] = {"tag": [default_tag]}
# Using Format 2
else:
tracker['tag'] = self.util.check_for_attribute(self.data, "tag", parent="tracker", subparent=tag_url, default=tag_url, var_type="list")
if tracker['tag'] == [tag_url]: self.data['tracker'][tag_url]['tag'] = [tag_url]
if isinstance(tracker['tag'], str): tracker['tag'] = [tracker['tag']]
tracker['max_ratio'] = self.util.check_for_attribute(self.data, "max_ratio", parent="tracker", subparent=tag_url,
var_type="float", default_int=-2, default_is_none=True, do_print=False, save=False)
tracker['min_seeding_time'] = self.util.check_for_attribute(self.data, "min_seeding_time", parent="tracker", subparent=tag_url,
var_type="int", default_int=0, default_is_none=True, do_print=False, save=False)
tracker['max_seeding_time'] = self.util.check_for_attribute(self.data, "max_seeding_time", parent="tracker", subparent=tag_url,
var_type="int", default_int=-2, default_is_none=True, do_print=False, save=False)
tracker['limit_upload_speed'] = self.util.check_for_attribute(self.data, "limit_upload_speed", parent="tracker", subparent=tag_url,
var_type="int", default_int=-1, default_is_none=True, do_print=False, save=False)
tracker['notifiarr'] = self.util.check_for_attribute(self.data, "notifiarr", parent="tracker", subparent=tag_url, default_is_none=True, do_print=False, save=False)
return (tracker)
if tracker['url']:
default_tag = tracker['url'].split(os.sep)[2].split(':')[0]
tracker['tag'] = self.util.check_for_attribute(self.data, "tag", parent="tracker", subparent=default_tag, default=default_tag, var_type="list")
if isinstance(tracker['tag'], str): tracker['tag'] = [tracker['tag']]
tracker["tag"] = self.util.check_for_attribute(
self.data, "tag", parent="tracker", subparent=tag_url, default=tag_url, var_type="list"
)
if tracker["tag"] == [tag_url]:
self.data["tracker"][tag_url]["tag"] = [tag_url]
if isinstance(tracker["tag"], str):
tracker["tag"] = [tracker["tag"]]
tracker["max_ratio"] = self.util.check_for_attribute(
self.data,
"max_ratio",
parent="tracker",
subparent=tag_url,
var_type="float",
default_int=-2,
default_is_none=True,
do_print=False,
save=False,
)
tracker["min_seeding_time"] = self.util.check_for_attribute(
self.data,
"min_seeding_time",
parent="tracker",
subparent=tag_url,
var_type="int",
default_int=0,
default_is_none=True,
do_print=False,
save=False,
)
tracker["max_seeding_time"] = self.util.check_for_attribute(
self.data,
"max_seeding_time",
parent="tracker",
subparent=tag_url,
var_type="int",
default_int=-2,
default_is_none=True,
do_print=False,
save=False,
)
tracker["limit_upload_speed"] = self.util.check_for_attribute(
self.data,
"limit_upload_speed",
parent="tracker",
subparent=tag_url,
var_type="int",
default_int=-1,
default_is_none=True,
do_print=False,
save=False,
)
tracker["notifiarr"] = self.util.check_for_attribute(
self.data,
"notifiarr",
parent="tracker",
subparent=tag_url,
default_is_none=True,
do_print=False,
save=False,
)
return tracker
if tracker["url"]:
default_tag = tracker["url"].split(os.sep)[2].split(":")[0]
tracker["tag"] = self.util.check_for_attribute(
self.data, "tag", parent="tracker", subparent=default_tag, default=default_tag, var_type="list"
)
if isinstance(tracker["tag"], str):
tracker["tag"] = [tracker["tag"]]
try:
self.data['tracker'][default_tag]['tag'] = [default_tag]
self.data["tracker"][default_tag]["tag"] = [default_tag]
except Exception:
self.data['tracker'][default_tag] = {'tag': [default_tag]}
e = (f'No tags matched for {tracker["url"]}. Please check your config.yml file. Setting tag to {default_tag}')
self.notify(e, 'Tag', False)
self.data["tracker"][default_tag] = {"tag": [default_tag]}
e = f'No tags matched for {tracker["url"]}. Please check your config.yml file. Setting tag to {default_tag}'
self.notify(e, "Tag", False)
logger.warning(e)
return (tracker)
return tracker
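# Illustrative sketch (not part of the diff above; the URL is hypothetical):
# the default tag comes from splitting the tracker URL on os.sep, which is
# "/" on POSIX:
#   url = "https://tracker.example.com:2053/announce"
#   url.split("/")[2]                # "tracker.example.com:2053"
#   url.split("/")[2].split(":")[0]  # "tracker.example.com" -> default_tag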
# Get category from config file based on path provided
def get_category(self, path):
category = ''
path = os.path.join(path, '')
category = ""
path = os.path.join(path, "")
if "cat" in self.data and self.data["cat"] is not None:
cat_path = self.data["cat"]
for cat, save_path in cat_path.items():
if os.path.join(save_path, '') == path:
if os.path.join(save_path, "") == path:
category = cat
break
@@ -366,74 +562,95 @@ class Config:
default_cat = path.split(os.sep)[-2]
category = str(default_cat)
self.util.check_for_attribute(self.data, default_cat, parent="cat", default=path)
self.data['cat'][str(default_cat)] = path
e = (f'No categories matched for the save path {path}. Check your config.yml file. - Setting category to {default_cat}')
self.notify(e, 'Category', False)
self.data["cat"][str(default_cat)] = path
e = f"No categories matched for the save path {path}. Check your config.yml file. - Setting category to {default_cat}"
self.notify(e, "Category", False)
logger.warning(e)
return category
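# Illustrative sketch (hypothetical save path): the fallback category above is
# the innermost folder name; os.path.join(path, "") guarantees a trailing
# separator, so split(os.sep)[-2] is the last real path component:
#   path = "/data/torrents/movies/"
#   path.split("/")[-2]  # "movies" -> default_cat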
# Empty old files from recycle bin or orphaned
def cleanup_dirs(self, location):
dry_run = self.commands['dry_run']
loglevel = 'DRYRUN' if dry_run else 'INFO'
dry_run = self.commands["dry_run"]
loglevel = "DRYRUN" if dry_run else "INFO"
num_del = 0
files = []
size_bytes = 0
skip = self.commands["skip_cleanup"]
if location == "Recycle Bin":
enabled = self.recyclebin['enabled']
empty_after_x_days = self.recyclebin['empty_after_x_days']
enabled = self.recyclebin["enabled"]
empty_after_x_days = self.recyclebin["empty_after_x_days"]
function = "cleanup_dirs"
location_path = self.recycle_dir
elif location == "Orphaned Data":
enabled = self.commands["rem_orphaned"]
empty_after_x_days = self.orphaned['empty_after_x_days']
empty_after_x_days = self.orphaned["empty_after_x_days"]
function = "cleanup_dirs"
location_path = self.orphaned_dir
if not skip:
if enabled and empty_after_x_days is not None:
if location == "Recycle Bin" and self.recyclebin['split_by_category']:
if location == "Recycle Bin" and self.recyclebin["split_by_category"]:
if "cat" in self.data and self.data["cat"] is not None:
save_path = list(self.data["cat"].values())
cleaned_save_path = [os.path.join(s.replace(self.root_dir, self.remote_dir), os.path.basename(location_path.rstrip(os.sep))) for s in save_path]
cleaned_save_path = [
os.path.join(
s.replace(self.root_dir, self.remote_dir), os.path.basename(location_path.rstrip(os.sep))
)
for s in save_path
]
location_path_list = [location_path]
for dir in cleaned_save_path:
if os.path.exists(dir): location_path_list.append(dir)
if os.path.exists(dir):
location_path_list.append(dir)
else:
e = (f'No categories defined. Checking {location} directory {location_path}.')
self.notify(e, f'Empty {location}', False)
e = f"No categories defined. Checking {location} directory {location_path}."
self.notify(e, f"Empty {location}", False)
logger.warning(e)
location_path_list = [location_path]
else:
location_path_list = [location_path]
location_files = [os.path.join(path, name) for r_path in location_path_list for path, subdirs, files in os.walk(r_path) for name in files]
location_files = [
os.path.join(path, name)
for r_path in location_path_list
for path, subdirs, files in os.walk(r_path)
for name in files
]
location_files = sorted(location_files)
if location_files:
body = []
logger.separator(f"Emptying {location} (Files > {empty_after_x_days} days)", space=True, border=True)
prevfolder = ''
prevfolder = ""
for file in location_files:
folder = re.search(f".*{os.path.basename(location_path.rstrip(os.sep))}", file).group(0)
if folder != prevfolder: body += logger.separator(f"Searching: {folder}", space=False, border=False)
if folder != prevfolder:
body += logger.separator(f"Searching: {folder}", space=False, border=False)
fileStats = os.stat(file)
filename = os.path.basename(file)
last_modified = fileStats[stat.ST_MTIME] # in seconds (last modified time)
now = time.time() # in seconds
days = (now - last_modified) / (60 * 60 * 24)
if (empty_after_x_days <= days):
if empty_after_x_days <= days:
num_del += 1
body += logger.print_line(f"{'Did not delete' if dry_run else 'Deleted'} {filename} from {folder} (Last modified {round(days)} days ago).", loglevel)
body += logger.print_line(
f"{'Did not delete' if dry_run else 'Deleted'} "
f"{filename} from {folder} (Last modified {round(days)} days ago).",
loglevel,
)
files += [str(filename)]
size_bytes += os.path.getsize(file)
if not dry_run: os.remove(file)
if not dry_run:
os.remove(file)
prevfolder = re.search(f".*{os.path.basename(location_path.rstrip(os.sep))}", file).group(0)
if num_del > 0:
if not dry_run:
for path in location_path_list:
util.remove_empty_directories(path, "**/*")
body += logger.print_line(f"{'Did not delete' if dry_run else 'Deleted'} {num_del} files ({util.human_readable_size(size_bytes)}) from the {location}.", loglevel)
body += logger.print_line(
f"{'Did not delete' if dry_run else 'Deleted'} {num_del} files "
f"({util.human_readable_size(size_bytes)}) from the {location}.",
loglevel,
)
attr = {
"function": function,
"location": location,
@@ -441,7 +658,7 @@ class Config:
"body": "\n".join(body),
"files": files,
"empty_after_x_days": empty_after_x_days,
"size_in_bytes": size_bytes
"size_in_bytes": size_bytes,
}
self.send_notifications(attr)
else:
@@ -450,7 +667,7 @@ class Config:
def send_notifications(self, attr):
try:
function = attr['function']
function = attr["function"]
config_webhooks = self.Webhooks.function_webhooks
config_function = None
for key in config_webhooks:

21
modules/logs.py Normal file → Executable file

@@ -1,4 +1,9 @@
import io, logging, os, re, sys, traceback
import io
import logging
import os
import re
import sys
import traceback
from logging.handlers import RotatingFileHandler
LOG_DIR = "logs"
@@ -41,8 +46,8 @@ class MyLogger:
os.makedirs(self.log_dir, exist_ok=True)
self._logger = logging.getLogger(self.logger_name)
logging.DRYRUN = DRYRUN
logging.addLevelName(DRYRUN, 'DRYRUN')
setattr(self._logger, 'dryrun', lambda dryrun, *args: self._logger._log(DRYRUN, dryrun, args))
logging.addLevelName(DRYRUN, "DRYRUN")
setattr(self._logger, "dryrun", lambda dryrun, *args: self._logger._log(DRYRUN, dryrun, args))
self._log_level = getattr(logging, log_level.upper())
self._logger.setLevel(self._log_level)
@@ -80,7 +85,7 @@ class MyLogger:
if config_key in self.config_handlers:
self._logger.addHandler(self.config_handlers[config_key])
else:
self.config_handlers[config_key] = self._get_handler(os.path.join(self.log_dir, config_key + '.log'))
self.config_handlers[config_key] = self._get_handler(os.path.join(self.log_dir, config_key + ".log"))
self._logger.addHandler(self.config_handlers[config_key])
def remove_config_handler(self, config_key):
@@ -99,7 +104,7 @@ class MyLogger:
final_text = f"{text}{sep * side}{sep * side}" if left else f"{sep * side}{text}{sep * side}"
return final_text
def separator(self, text=None, space=True, border=True, side_space=True, left=False, loglevel='INFO'):
def separator(self, text=None, space=True, border=True, side_space=True, left=False, loglevel="INFO"):
sep = " " if space else self.separating_character
for handler in self._logger.handlers:
self._formatter(handler, border=False)
@@ -116,7 +121,7 @@ class MyLogger:
self._formatter(handler)
return [text]
def print_line(self, msg, loglevel='INFO', *args, **kwargs):
def print_line(self, msg, loglevel="INFO", *args, **kwargs):
loglvl = getattr(logging, loglevel.upper())
if self._logger.isEnabledFor(loglvl):
self._log(loglvl, str(msg), args, **kwargs)
@@ -245,10 +250,10 @@ class MyLogger:
sinfo = None
if stack_info:
sio = io.StringIO()
sio.write('Stack (most recent call last):\n')
sio.write("Stack (most recent call last):\n")
traceback.print_stack(f, file=sio)
sinfo = sio.getvalue()
if sinfo[-1] == '\n':
if sinfo[-1] == "\n":
sinfo = sinfo[:-1]
sio.close()
rv = (co.co_filename, f.f_lineno, co.co_name, sinfo)

6
modules/notifiarr.py Normal file → Executable file

@@ -1,8 +1,8 @@
from modules import util
from modules.util import Failed
from json import JSONDecodeError
from modules import util
from modules.util import Failed
logger = util.logger
base_url = "https://notifiarr.com/api/v1/"

802
modules/qbittorrent.py Normal file → Executable file

File diff suppressed because it is too large.

107
modules/util.py Normal file → Executable file

@@ -1,40 +1,57 @@
import logging, os, shutil, time, signal, json, ruamel.yaml
import json
import logging
import os
import shutil
import signal
import time
from pathlib import Path
logger = logging.getLogger('qBit Manage')
import ruamel.yaml
logger = logging.getLogger("qBit Manage")
def get_list(data, lower=False, split=True, int_list=False):
if data is None: return None
elif isinstance(data, list): return data
elif isinstance(data, dict): return [data]
elif split is False: return [str(data)]
elif lower is True: return [d.strip().lower() for d in str(data).split(",")]
if data is None:
return None
elif isinstance(data, list):
return data
elif isinstance(data, dict):
return [data]
elif split is False:
return [str(data)]
elif lower is True:
return [d.strip().lower() for d in str(data).split(",")]
elif int_list is True:
try: return [int(d.strip()) for d in str(data).split(",")]
except ValueError: return []
else: return [d.strip() for d in str(data).split(",")]
try:
return [int(d.strip()) for d in str(data).split(",")]
except ValueError:
return []
else:
return [d.strip() for d in str(data).split(",")]
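# Illustrative calls, assuming get_list exactly as reformatted above:
#   get_list(None)                   # None
#   get_list("a, b,c")               # ["a", "b", "c"]
#   get_list("1, 2", int_list=True)  # [1, 2]
#   get_list("1, x", int_list=True)  # [] (the ValueError is swallowed)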
class check:
def __init__(self, config):
self.config = config
def check_for_attribute(self,
data,
attribute,
parent=None,
subparent=None,
test_list=None,
default=None,
do_print=True,
default_is_none=False,
req_default=False,
var_type="str",
default_int=0,
throw=False,
save=True,
make_dirs=False):
def check_for_attribute(
self,
data,
attribute,
parent=None,
subparent=None,
test_list=None,
default=None,
do_print=True,
default_is_none=False,
req_default=False,
var_type="str",
default_int=0,
throw=False,
save=True,
make_dirs=False,
):
endline = ""
if parent is not None:
if subparent is not None:
@@ -75,12 +92,15 @@ class check:
endline = f"\n{parent} sub-attribute {attribute} added to config"
if parent not in yaml.data or not yaml.data[parent]:
yaml.data[parent] = {attribute: default}
elif attribute not in yaml.data[parent] or (attribute in yaml.data[parent] and yaml.data[parent][attribute] is None):
elif attribute not in yaml.data[parent] or (
attribute in yaml.data[parent] and yaml.data[parent][attribute] is None
):
yaml.data[parent][attribute] = default
else:
endline = ""
yaml.save()
if default_is_none and var_type in ["list", "int_list"]: return []
if default_is_none and var_type in ["list", "int_list"]:
return []
elif data[attribute] is None:
if default_is_none and var_type == "list":
return []
@@ -114,14 +134,13 @@ class check:
message = f"{text} must a float >= {float(default_int)}"
elif var_type == "path":
if os.path.exists(os.path.abspath(data[attribute])):
return os.path.join(data[attribute], '')
return os.path.join(data[attribute], "")
else:
message = f"Path {os.path.abspath(data[attribute])} does not exist"
elif var_type == "list":
return get_list(data[attribute], split=False)
elif var_type == "list_path":
temp_list = [p for p in get_list(
data[attribute], split=False) if os.path.exists(os.path.abspath(p))]
temp_list = [p for p in get_list(data[attribute], split=False) if os.path.exists(os.path.abspath(p))]
if len(temp_list) > 0:
return temp_list
else:
@@ -133,10 +152,10 @@ class check:
else:
message = f"{text}: {data[attribute]} is an invalid input"
if var_type == "path" and default and os.path.exists(os.path.abspath(default)):
return os.path.join(default, '')
return os.path.join(default, "")
elif var_type == "path" and default and make_dirs:
os.makedirs(default, exist_ok=True)
return os.path.join(default, '')
return os.path.join(default, "")
elif var_type == "path" and default:
if data and attribute in data and data[attribute]:
message = f"neither {data[attribute]} or the default path {default} could be found"
@@ -147,8 +166,7 @@ class check:
message = message + f" using {default} as default"
message = message + endline
if req_default and default is None:
raise Failed(
f"Config Error: {attribute} attribute must be set under {parent}.")
raise Failed(f"Config Error: {attribute} attribute must be set under {parent}.")
options = ""
if test_list:
for option, description in test_list.items():
@@ -171,8 +189,9 @@ class Failed(Exception):
def list_in_text(text, search_list, match_all=False):
if isinstance(search_list, list): search_list = set(search_list)
contains = set([x for x in search_list if ' ' in x])
if isinstance(search_list, list):
search_list = set(search_list)
contains = set([x for x in search_list if " " in x])
exception = search_list - contains
if match_all:
if all(x == m for m in text.split(" ") for x in exception) or all(x in text for x in contains):
@@ -245,20 +264,20 @@ def remove_empty_directories(pathlib_root_dir, pattern):
# will check if there are any hard links if it passes a file or folder
def nohardlink(file):
check = True
if (os.path.isfile(file)):
if (os.stat(file).st_nlink > 1):
if os.path.isfile(file):
if os.stat(file).st_nlink > 1:
check = False
else:
for path, subdirs, files in os.walk(file):
for x in files:
if (os.stat(os.path.join(path, x)).st_nlink > 1):
if os.stat(os.path.join(path, x)).st_nlink > 1:
check = False
return check
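# Illustrative sketch (hypothetical file names): nohardlink keys off the inode
# link count reported by os.stat, so a file with a second hard link fails it:
#   import os
#   os.link("a.txt", "b.txt")   # create a second hard link to a.txt
#   os.stat("a.txt").st_nlink   # 2, so nohardlink("a.txt") returns False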
# Load json file if exists
def load_json(file):
if (os.path.isfile(file)):
if os.path.isfile(file):
f = open(file, "r")
data = json.load(f)
f.close()
@@ -269,7 +288,7 @@ def load_json(file):
# Save json file overwrite if exists
def save_json(torrent_json, dest):
with open(dest, 'w', encoding='utf-8') as f:
with open(dest, "w", encoding="utf-8") as f:
json.dump(torrent_json, f, ensure_ascii=False, indent=4)
@@ -286,7 +305,7 @@ class GracefulKiller:
def human_readable_size(size, decimal_places=3):
for unit in ['B', 'KiB', 'MiB', 'GiB', 'TiB']:
for unit in ["B", "KiB", "MiB", "GiB", "TiB"]:
if size < 1024.0:
break
size /= 1024.0
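# Worked example of the loop above (the return formatting is outside this hunk):
#   size = 2048.0 -> after one division by 1024.0, size == 2.0 and unit == "KiB"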
@@ -304,7 +323,7 @@ class YAML:
self.data = self.yaml.load(input_data)
else:
if create and not os.path.exists(self.path):
with open(self.path, 'w'):
with open(self.path, "w"):
pass
self.data = {}
else:
@@ -322,5 +341,5 @@
def save(self):
if self.path:
with open(self.path, 'w') as fp:
with open(self.path, "w") as fp:
self.yaml.dump(self.data, fp)

88
modules/webhooks.py Normal file → Executable file

@@ -1,4 +1,5 @@
from json import JSONDecodeError
from modules import util
from modules.util import Failed
@@ -44,7 +45,7 @@ class Webhooks:
logger.warning("Webhook attribute set to apprise but apprise attribute is not configured.")
break
else:
json['urls'] = self.apprise.notify_url
json["urls"] = self.apprise.notify_url
for x in range(6):
response = self.config.post(f"{self.apprise.api_url}/notify", json=json)
if response.status_code < 500:
@@ -57,13 +58,20 @@
response_json = response.json()
if self.config.trace_mode:
logger.debug(f"Response: {response_json}")
if "result" in response_json and response_json["result"] == "error" and "details" in response_json and "response" in response_json["details"]:
if ('trigger is not enabled' in response_json['details']['response']):
if (
"result" in response_json
and response_json["result"] == "error"
and "details" in response_json
and "response" in response_json["details"]
):
if "trigger is not enabled" in response_json["details"]["response"]:
logger.debug(f"Notifiarr Warning: {response_json['details']['response']}")
skip = True
else:
raise Failed(f"Notifiarr Error: {response_json['details']['response']}")
if (response.status_code >= 400 or ("result" in response_json and response_json["result"] == "error")) and skip is False:
if (
response.status_code >= 400 or ("result" in response_json and response_json["result"] == "error")
) and skip is False:
raise Failed(f"({response.status_code} [{response.reason}]) {response_json}")
except JSONDecodeError:
if response.status_code >= 400:
@@ -71,45 +79,51 @@
def start_time_hooks(self, start_time):
if self.run_start_webhooks:
dry_run = self.config.commands['dry_run']
dry_run = self.config.commands["dry_run"]
if dry_run:
start_type = "Dry-"
else:
start_type = ""
self._request(self.run_start_webhooks, {
"function": "run_start",
"title": None,
"body": f"Starting {start_type}Run",
"start_time": start_time.strftime("%Y-%m-%d %H:%M:%S"),
"dry_run": self.config.commands['dry_run']
})
self._request(
self.run_start_webhooks,
{
"function": "run_start",
"title": None,
"body": f"Starting {start_type}Run",
"start_time": start_time.strftime("%Y-%m-%d %H:%M:%S"),
"dry_run": self.config.commands["dry_run"],
},
)
def end_time_hooks(self, start_time, end_time, run_time, next_run, stats, body):
if self.run_end_webhooks:
self._request(self.run_end_webhooks, {
"function": "run_end",
"title": None,
"body": body,
"start_time": start_time.strftime("%Y-%m-%d %H:%M:%S"),
"end_time": end_time.strftime("%Y-%m-%d %H:%M:%S"),
"next_run": next_run.strftime("%Y-%m-%d %H:%M:%S") if next_run is not None else next_run,
"run_time": run_time,
"torrents_added": stats["added"],
"torrents_deleted": stats["deleted"],
"torrents_deleted_and_contents_count": stats["deleted_contents"],
"torrents_resumed": stats["resumed"],
"torrents_rechecked": stats["rechecked"],
"torrents_categorized": stats["categorized"],
"torrents_tagged": stats["tagged"],
"remove_unregistered": stats["rem_unreg"],
"torrents_tagged_tracker_error": stats["tagged_tracker_error"],
"torrents_untagged_tracker_error": stats["untagged_tracker_error"],
"orphaned_files_found": stats["orphaned"],
"torrents_tagged_no_hardlinks": stats["tagged_noHL"],
"torrents_untagged_no_hardlinks": stats["untagged_noHL"],
"files_deleted_from_recyclebin": stats["recycle_emptied"],
"files_deleted_from_orphaned": stats["orphaned_emptied"]
})
self._request(
self.run_end_webhooks,
{
"function": "run_end",
"title": None,
"body": body,
"start_time": start_time.strftime("%Y-%m-%d %H:%M:%S"),
"end_time": end_time.strftime("%Y-%m-%d %H:%M:%S"),
"next_run": next_run.strftime("%Y-%m-%d %H:%M:%S") if next_run is not None else next_run,
"run_time": run_time,
"torrents_added": stats["added"],
"torrents_deleted": stats["deleted"],
"torrents_deleted_and_contents_count": stats["deleted_contents"],
"torrents_resumed": stats["resumed"],
"torrents_rechecked": stats["rechecked"],
"torrents_categorized": stats["categorized"],
"torrents_tagged": stats["tagged"],
"remove_unregistered": stats["rem_unreg"],
"torrents_tagged_tracker_error": stats["tagged_tracker_error"],
"torrents_untagged_tracker_error": stats["untagged_tracker_error"],
"orphaned_files_found": stats["orphaned"],
"torrents_tagged_no_hardlinks": stats["tagged_noHL"],
"torrents_untagged_no_hardlinks": stats["untagged_noHL"],
"files_deleted_from_recyclebin": stats["recycle_emptied"],
"files_deleted_from_orphaned": stats["orphaned_emptied"],
},
)
def error_hooks(self, text, function_error=None, critical=True):
if self.error_webhooks:
@ -119,7 +133,7 @@ class Webhooks:
"title": f"{function_error} Error",
"body": str(text),
"critical": critical,
"type": type
"type": type,
}
if function_error:
json["function_error"] = function_error

319
qbit_manage.py Normal file → Executable file
View file

@ -1,7 +1,11 @@
#!/usr/bin/python3
import argparse, os, sys, time, glob
from datetime import datetime, timedelta
import argparse
import glob
import os
import sys
import time
from datetime import datetime
from datetime import timedelta
try:
import schedule
@ -12,38 +16,148 @@ except ModuleNotFoundError:
if sys.version_info[0] != 3 or sys.version_info[1] < 6:
print("Version Error: Version: %s.%s.%s incompatible please use Python 3.6+" % (sys.version_info[0], sys.version_info[1], sys.version_info[2]))
print(
"Version Error: Version: %s.%s.%s incompatible please use Python 3.6+"
% (sys.version_info[0], sys.version_info[1], sys.version_info[2])
)
sys.exit(0)
parser = argparse.ArgumentParser('qBittorrent Manager.', description='A mix of scripts combined for managing qBittorrent.')
parser = argparse.ArgumentParser("qBittorrent Manager.", description="A mix of scripts combined for managing qBittorrent.")
parser.add_argument("-db", "--debug", dest="debug", help=argparse.SUPPRESS, action="store_true", default=False)
parser.add_argument("-tr", "--trace", dest="trace", help=argparse.SUPPRESS, action="store_true", default=False)
parser.add_argument('-r', '--run', dest='run', action='store_true', default=False, help='Run without the scheduler. Script will exit after completion.')
parser.add_argument('-sch', '--schedule', dest='min', default='1440', type=str, help='Schedule to run every x minutes. (Default set to 1440 (1 day))')
parser.add_argument('-sd', '--startup-delay', dest='startupDelay', default='0', type=str, help='Set delay in seconds on the first run of a schedule (Default set to 0)')
parser.add_argument('-c', '--config-file', dest='configfiles', action='store', default='config.yml', type=str,
help='This is used if you want to use a different name for your config.yml or if you want to load multiple config files using *. Example: tv.yml or config*.yml')
parser.add_argument('-lf', '--log-file', dest='logfile', action='store', default='qbit_manage.log', type=str,
help='This is used if you want to use a different name for your log file. Example: tv.log',)
parser.add_argument('-cs', '--cross-seed', dest='cross_seed', action="store_true", default=False,
help='Use this after running cross-seed script to add torrents from the cross-seed output folder to qBittorrent')
parser.add_argument('-re', '--recheck', dest='recheck', action="store_true", default=False, help='Recheck paused torrents sorted by lowest size. Resume if Completed.')
parser.add_argument('-cu', '--cat-update', dest='cat_update', action="store_true", default=False, help='Use this if you would like to update your categories.')
parser.add_argument('-tu', '--tag-update', dest='tag_update', action="store_true", default=False,
help='Use this if you would like to update your tags and/or set seed goals/limit upload speed by tag. (Only adds tags to untagged torrents)')
parser.add_argument('-ru', '--rem-unregistered', dest='rem_unregistered', action="store_true", default=False, help='Use this if you would like to remove unregistered torrents.')
parser.add_argument('-tte', '--tag-tracker-error', dest='tag_tracker_error', action="store_true", default=False, help='Use this if you would like to tag torrents that do not have a working tracker.')
parser.add_argument('-ro', '--rem-orphaned', dest='rem_orphaned', action="store_true", default=False, help='Use this if you would like to remove unregistered torrents.')
parser.add_argument('-tnhl', '--tag-nohardlinks', dest='tag_nohardlinks', action="store_true", default=False,
help='Use this to tag any torrents that do not have any hard links associated with any of the files. \
This is useful for those that use Sonarr/Radarr which hard link your media files with the torrents for seeding. \
When files get upgraded they no longer become linked with your media therefore will be tagged with a new tag noHL. \
You can then safely delete/remove these torrents to free up any extra space that is not being used by your media folder.')
parser.add_argument('-sc', '--skip-cleanup', dest='skip_cleanup', action="store_true", default=False, help='Use this to skip cleaning up Reycle Bin/Orphaned directory.')
parser.add_argument('-dr', '--dry-run', dest='dry_run', action="store_true", default=False,
help='If you would like to see what is gonna happen but not actually move/delete or tag/categorize anything.')
parser.add_argument('-ll', '--log-level', dest='log_level', action="store", default='INFO', type=str, help='Change your log level.')
parser.add_argument("-d", "--divider", dest="divider", help="Character that divides the sections (Default: '=')", default="=", type=str)
parser.add_argument(
"-r",
"--run",
dest="run",
action="store_true",
default=False,
help="Run without the scheduler. Script will exit after completion.",
)
parser.add_argument(
"-sch",
"--schedule",
dest="min",
default="1440",
type=str,
help="Schedule to run every x minutes. (Default set to 1440 (1 day))",
)
parser.add_argument(
"-sd",
"--startup-delay",
dest="startupDelay",
default="0",
type=str,
help="Set delay in seconds on the first run of a schedule (Default set to 0)",
)
parser.add_argument(
"-c",
"--config-file",
dest="configfiles",
action="store",
default="config.yml",
type=str,
help="This is used if you want to use a different name for your config.yml or if you want to load multiple"
"config files using *. Example: tv.yml or config*.yml",
)
parser.add_argument(
"-lf",
"--log-file",
dest="logfile",
action="store",
default="qbit_manage.log",
type=str,
help="This is used if you want to use a different name for your log file. Example: tv.log",
)
parser.add_argument(
"-cs",
"--cross-seed",
dest="cross_seed",
action="store_true",
default=False,
help="Use this after running cross-seed script to add torrents from the cross-seed output folder to qBittorrent",
)
parser.add_argument(
"-re",
"--recheck",
dest="recheck",
action="store_true",
default=False,
help="Recheck paused torrents sorted by lowest size. Resume if Completed.",
)
parser.add_argument(
"-cu",
"--cat-update",
dest="cat_update",
action="store_true",
default=False,
help="Use this if you would like to update your categories.",
)
parser.add_argument(
"-tu",
"--tag-update",
dest="tag_update",
action="store_true",
default=False,
help="Use this if you would like to update your tags and/or set seed goals/limit upload speed by tag."
" (Only adds tags to untagged torrents)",
)
parser.add_argument(
"-ru",
"--rem-unregistered",
dest="rem_unregistered",
action="store_true",
default=False,
help="Use this if you would like to remove unregistered torrents.",
)
parser.add_argument(
"-tte",
"--tag-tracker-error",
dest="tag_tracker_error",
action="store_true",
default=False,
help="Use this if you would like to tag torrents that do not have a working tracker.",
)
parser.add_argument(
"-ro",
"--rem-orphaned",
dest="rem_orphaned",
action="store_true",
default=False,
help="Use this if you would like to remove unregistered torrents.",
)
parser.add_argument(
"-tnhl",
"--tag-nohardlinks",
dest="tag_nohardlinks",
action="store_true",
default=False,
help="Use this to tag any torrents that do not have any hard links associated with any of the files. "
"This is useful for those that use Sonarr/Radarr which hard link your media files with the torrents for seeding. "
"When files get upgraded they no longer become linked with your media therefore will be tagged with a new tag noHL. "
"You can then safely delete/remove these torrents to free up any extra space that is not being used by your media folder.",
)
parser.add_argument(
"-sc",
"--skip-cleanup",
dest="skip_cleanup",
action="store_true",
default=False,
help="Use this to skip cleaning up Reycle Bin/Orphaned directory.",
)
parser.add_argument(
"-dr",
"--dry-run",
dest="dry_run",
action="store_true",
default=False,
help="If you would like to see what is gonna happen but not actually move/delete or tag/categorize anything.",
)
parser.add_argument(
"-ll", "--log-level", dest="log_level", action="store", default="INFO", type=str, help="Change your log level."
)
parser.add_argument(
"-d", "--divider", dest="divider", help="Character that divides the sections (Default: '=')", default="=", type=str
)
parser.add_argument("-w", "--width", dest="width", help="Screen Width (Default: 100)", default=100, type=int)
args = parser.parse_args()
@ -93,18 +207,19 @@ screen_width = get_arg("QBT_WIDTH", args.width, arg_int=True)
debug = get_arg("QBT_DEBUG", args.debug, arg_bool=True)
trace = get_arg("QBT_TRACE", args.trace, arg_bool=True)
if debug or trace: log_level = 'DEBUG'
if debug or trace:
log_level = "DEBUG"
stats = {}
args = {}
if os.path.isdir('/config') and glob.glob(os.path.join('/config', config_files)):
default_dir = '/config'
if os.path.isdir("/config") and glob.glob(os.path.join("/config", config_files)):
default_dir = "/config"
else:
default_dir = os.path.join(os.path.dirname(os.path.abspath(__file__)), "config")
if '*' not in config_files:
if "*" not in config_files:
config_files = [config_files]
else:
glob_configs = glob.glob(os.path.join(default_dir, config_files))
@ -116,26 +231,26 @@ else:
for v in [
'run',
'sch',
'startupDelay',
'config_files',
'log_file',
'cross_seed',
'recheck',
'cat_update',
'tag_update',
'rem_unregistered',
'tag_tracker_error',
'rem_orphaned',
'tag_nohardlinks',
'skip_cleanup',
'dry_run',
'log_level',
'divider',
'screen_width',
'debug',
'trace'
"run",
"sch",
"startupDelay",
"config_files",
"log_file",
"cross_seed",
"recheck",
"cat_update",
"tag_update",
"rem_unregistered",
"tag_tracker_error",
"rem_orphaned",
"tag_nohardlinks",
"skip_cleanup",
"dry_run",
"log_level",
"divider",
"screen_width",
"debug",
"trace",
]:
args[v] = eval(v)
@ -158,8 +273,9 @@ except ValueError:
sys.exit(0)
logger = MyLogger('qBit Manage', log_file, log_level, default_dir, screen_width, divider[0], False, debug or trace)
logger = MyLogger("qBit Manage", log_file, log_level, default_dir, screen_width, divider[0], False, debug or trace)
from modules import util
util.logger = logger
from modules.config import Config
from modules.util import GracefulKiller
@ -204,8 +320,8 @@ def start():
stats_summary = []
logger.separator("Starting Run")
cfg = None
body = ''
run_time = ''
body = ""
run_time = ""
end_time = None
next_run = None
global stats
@ -224,30 +340,33 @@ def start():
"tagged_tracker_error": 0,
"untagged_tracker_error": 0,
"tagged_noHL": 0,
"untagged_noHL": 0
"untagged_noHL": 0,
}
def FinishedRun():
nonlocal end_time, start_time, stats_summary, run_time, next_run, body
end_time = datetime.now()
run_time = str(end_time - start_time).split('.')[0]
run_time = str(end_time - start_time).split(".")[0]
_, nr = calc_next_run(sch, True)
next_run_str = nr['next_run_str']
next_run = nr['next_run']
body = logger.separator(f"Finished Run\n{os.linesep.join(stats_summary) if len(stats_summary)>0 else ''}\nRun Time: {run_time}\n{next_run_str if len(next_run_str)>0 else ''}"
.replace('\n\n', '\n').rstrip())[0]
next_run_str = nr["next_run_str"]
next_run = nr["next_run"]
body = logger.separator(
f"Finished Run\n{os.linesep.join(stats_summary) if len(stats_summary)>0 else ''}"
f"\nRun Time: {run_time}\n{next_run_str if len(next_run_str)>0 else ''}".replace("\n\n", "\n").rstrip()
)[0]
return next_run, body
try:
cfg = Config(default_dir, args)
except Exception as e:
if 'Qbittorrent Error' in e.args[0]:
logger.print_line(e, 'CRITICAL')
logger.print_line('Exiting scheduled Run.', 'CRITICAL')
if "Qbittorrent Error" in e.args[0]:
logger.print_line(e, "CRITICAL")
logger.print_line("Exiting scheduled Run.", "CRITICAL")
FinishedRun()
return None
else:
logger.stacktrace()
logger.print_line(e, 'CRITICAL')
logger.print_line(e, "CRITICAL")
if cfg:
# Set Category
@ -260,7 +379,7 @@ def start():
# Remove Unregistered Torrents
num_deleted, num_deleted_contents, num_tagged, num_untagged = cfg.qbt.rem_unregistered()
stats["rem_unreg"] += (num_deleted + num_deleted_contents)
stats["rem_unreg"] += num_deleted + num_deleted_contents
stats["deleted"] += num_deleted
stats["deleted_contents"] += num_deleted_contents
stats["tagged_tracker_error"] += num_tagged
@ -297,21 +416,36 @@ def start():
orphaned_emptied = cfg.cleanup_dirs("Orphaned Data")
stats["orphaned_emptied"] += orphaned_emptied
if stats["categorized"] > 0: stats_summary.append(f"Total Torrents Categorized: {stats['categorized']}")
if stats["tagged"] > 0: stats_summary.append(f"Total Torrents Tagged: {stats['tagged']}")
if stats["rem_unreg"] > 0: stats_summary.append(f"Total Unregistered Torrents Removed: {stats['rem_unreg']}")
if stats["tagged_tracker_error"] > 0: stats_summary.append(f"Total {cfg.settings['tracker_error_tag']} Torrents Tagged: {stats['tagged_tracker_error']}")
if stats["untagged_tracker_error"] > 0: stats_summary.append(f"Total {cfg.settings['tracker_error_tag']} Torrents untagged: {stats['untagged_tracker_error']}")
if stats["added"] > 0: stats_summary.append(f"Total Torrents Added: {stats['added']}")
if stats["resumed"] > 0: stats_summary.append(f"Total Torrents Resumed: {stats['resumed']}")
if stats["rechecked"] > 0: stats_summary.append(f"Total Torrents Rechecked: {stats['rechecked']}")
if stats["deleted"] > 0: stats_summary.append(f"Total Torrents Deleted: {stats['deleted']}")
if stats["deleted_contents"] > 0: stats_summary.append(f"Total Torrents + Contents Deleted : {stats['deleted_contents']}")
if stats["orphaned"] > 0: stats_summary.append(f"Total Orphaned Files: {stats['orphaned']}")
if stats["tagged_noHL"] > 0: stats_summary.append(f"Total noHL Torrents Tagged: {stats['tagged_noHL']}")
if stats["untagged_noHL"] > 0: stats_summary.append(f"Total noHL Torrents untagged: {stats['untagged_noHL']}")
if stats["recycle_emptied"] > 0: stats_summary.append(f"Total Files Deleted from Recycle Bin: {stats['recycle_emptied']}")
if stats["orphaned_emptied"] > 0: stats_summary.append(f"Total Files Deleted from Orphaned Data: {stats['orphaned_emptied']}")
if stats["categorized"] > 0:
stats_summary.append(f"Total Torrents Categorized: {stats['categorized']}")
if stats["tagged"] > 0:
stats_summary.append(f"Total Torrents Tagged: {stats['tagged']}")
if stats["rem_unreg"] > 0:
stats_summary.append(f"Total Unregistered Torrents Removed: {stats['rem_unreg']}")
if stats["tagged_tracker_error"] > 0:
stats_summary.append(f"Total {cfg.settings['tracker_error_tag']} Torrents Tagged: {stats['tagged_tracker_error']}")
if stats["untagged_tracker_error"] > 0:
stats_summary.append(f"Total {cfg.settings['tracker_error_tag']} Torrents untagged: {stats['untagged_tracker_error']}")
if stats["added"] > 0:
stats_summary.append(f"Total Torrents Added: {stats['added']}")
if stats["resumed"] > 0:
stats_summary.append(f"Total Torrents Resumed: {stats['resumed']}")
if stats["rechecked"] > 0:
stats_summary.append(f"Total Torrents Rechecked: {stats['rechecked']}")
if stats["deleted"] > 0:
stats_summary.append(f"Total Torrents Deleted: {stats['deleted']}")
if stats["deleted_contents"] > 0:
stats_summary.append(f"Total Torrents + Contents Deleted : {stats['deleted_contents']}")
if stats["orphaned"] > 0:
stats_summary.append(f"Total Orphaned Files: {stats['orphaned']}")
if stats["tagged_noHL"] > 0:
stats_summary.append(f"Total noHL Torrents Tagged: {stats['tagged_noHL']}")
if stats["untagged_noHL"] > 0:
stats_summary.append(f"Total noHL Torrents untagged: {stats['untagged_noHL']}")
if stats["recycle_emptied"] > 0:
stats_summary.append(f"Total Files Deleted from Recycle Bin: {stats['recycle_emptied']}")
if stats["orphaned_emptied"] > 0:
stats_summary.append(f"Total Files Deleted from Orphaned Data: {stats['orphaned_emptied']}")
FinishedRun()
if cfg:
@ -330,14 +464,14 @@ def end():
def calc_next_run(sch, print=False):
current = datetime.now().strftime("%H:%M")
seconds = sch*60
seconds = sch * 60
time_to_run = datetime.now() + timedelta(minutes=sch)
time_to_run_str = time_to_run.strftime("%H:%M")
new_seconds = (datetime.strptime(time_to_run_str, "%H:%M") - datetime.strptime(current, "%H:%M")).total_seconds()
time_str = ''
time_str = ""
next_run = {}
if run is False:
next_run['next_run'] = time_to_run
next_run["next_run"] = time_to_run
if new_seconds < 0:
new_seconds += 86400
if (seconds is None or new_seconds < seconds) and new_seconds > 0:
@ -347,14 +481,15 @@ def calc_next_run(sch, print=False):
minutes = int((seconds % 3600) // 60)
time_str = f"{hours} Hour{'s' if hours > 1 else ''}{' and ' if minutes > 1 else ''}" if hours > 0 else ""
time_str += f"{minutes} Minute{'s' if minutes > 1 else ''}" if minutes > 0 else ""
if print: next_run['next_run_str'] = (f"Current Time: {current} | {time_str} until the next run at {time_to_run_str}")
if print:
next_run["next_run_str"] = f"Current Time: {current} | {time_str} until the next run at {time_to_run_str}"
else:
next_run['next_run'] = None
next_run['next_run_str'] = ''
next_run["next_run"] = None
next_run["next_run_str"] = ""
return time_str, next_run
if __name__ == '__main__':
if __name__ == "__main__":
killer = GracefulKiller()
logger.add_main_handler()
logger.separator()
@ -368,7 +503,7 @@ if __name__ == '__main__':
logger.info_center(" |_| |______| |___/ ") # noqa: W605
logger.info(f" Version: {version}")
logger.separator(loglevel='DEBUG')
logger.separator(loglevel="DEBUG")
logger.debug(f" --run (QBT_RUN): {run}")
logger.debug(f" --schedule (QBT_SCHEDULE): {sch}")
logger.debug(f" --startup-delay (QBT_STARTUP_DELAY): {startupDelay}")

10
requirements.txt Normal file → Executable file
View file

@ -1,6 +1,8 @@
ruamel.yaml==0.17.21
qbittorrent-api==2022.10.39
schedule==1.1.0
retrying==1.3.3
alive_progress==2.4.1
flake8==5.0.4
pre-commit==2.20.0
qbittorrent-api==2022.10.39
requests==2.28.1
retrying==1.3.3
ruamel.yaml==0.17.21
schedule==1.1.0

47
scripts/delete_torrents_on_low_disk_space.py Normal file → Executable file
View file

@ -3,25 +3,29 @@ You can set a min torrent age and share ratio for a torrent to be deleted.
You can also allow incomplete torrents to be deleted.
Torrents will be deleted starting with the ones with the most seeds; only torrents with a single hard link will be deleted.
"""
import os
import qbittorrentapi
from datetime import datetime, timedelta
import shutil
import time
from datetime import datetime
from datetime import timedelta
import qbittorrentapi
"""===Config==="""
# qBittorrent WebUi Login
qbt_login = {"host": "localhost", "port": 8080, "username": "???", "password": "???"}
PATH = "M:" # Path of drive to monitor
MIN_FREE_SPACE = 10 # In GB. Min free space on drive.
MIN_FREE_USAGE = 0 # In decimal percentage, 0 to 1. Min % free space on drive.
MIN_TORRENT_SHARE_RATIO = 0 # In decimal percentage, 0 to inf. Min seeding ratio of torrent to delete.
MIN_TORRENT_AGE = 30 # In days, min age of torrent to delete. Uses seeding time.
ALLOW_INCOMPLETE_TORRENT_DELETIONS = False # Also delete torrents that haven't finished downloading. MIN_TORRENT_AGE now based on time torrent was added.
PREFER_PRIVATE_TORRENTS = True # Will delete public torrents before private ones regardless of seed difference. See is_torrent_public().
PATH = "M:" # Path of drive to monitor
MIN_FREE_SPACE = 10 # In GB. Min free space on drive.
MIN_FREE_USAGE = 0 # In decimal percentage, 0 to 1. Min % free space on drive.
MIN_TORRENT_SHARE_RATIO = 0 # In decimal percentage, 0 to inf. Min seeding ratio of torrent to delete.
MIN_TORRENT_AGE = 30 # In days, min age of torrent to delete. Uses seeding time.
ALLOW_INCOMPLETE_TORRENT_DELETIONS = (
False # Also delete torrents that haven't finished downloading. MIN_TORRENT_AGE now based on time torrent was added.
)
PREFER_PRIVATE_TORRENTS = (
True # Will delete public torrents before private ones regardless of seed difference. See is_torrent_public().
)
"""===End Config==="""
# Services
@ -32,15 +36,18 @@ def quit_program(code=0) -> None:
"""Quits program with info"""
print("Exiting...")
import sys
sys.exit(code)
def setup_services(qbt=False) -> None:
"""Setup required services"""
global qbt_client
if qbt:
qbt_client = qbittorrentapi.Client(host=qbt_login["host"], port=qbt_login["port"], username=qbt_login["username"], password=qbt_login["password"])
qbt_client = qbittorrentapi.Client(
host=qbt_login["host"], port=qbt_login["port"], username=qbt_login["username"], password=qbt_login["password"]
)
try:
qbt_client.auth_log_in()
print("Succesfully connected to qBittorrent!")
@ -164,11 +171,15 @@ def main() -> None:
if not is_storage_full():
print(f"Free space now above threshold, {len(deleted_torrents)} torrents were deleted!")
else: # No more torrents to delete but still low on space
print(f"WARNING... Free space still below threshold after deleting all {len(deleted_torrents)} eligible torrents! Either:")
print(f"--- Torrent ages are below threshold of '{MIN_TORRENT_AGE} days'\n"
f"--- Torrent seed ratios are below threshold of '{MIN_TORRENT_SHARE_RATIO}'\n"
f"--- Torrents have multiple hard links\n"
f"--- No torrents exists!")
print(
f"WARNING... Free space still below threshold after deleting all {len(deleted_torrents)} eligible torrents! Either:"
)
print(
f"--- Torrent ages are below threshold of '{MIN_TORRENT_AGE} days'\n"
f"--- Torrent seed ratios are below threshold of '{MIN_TORRENT_SHARE_RATIO}'\n"
f"--- Torrents have multiple hard links\n"
f"--- No torrents exists!"
)
quit_program(0)
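is_storage_full is called above but its body falls outside the hunk; a minimal sketch of such a check against the two thresholds from the config block, using shutil.disk_usage (illustrative, not the script's exact logic):

import shutil

def is_storage_full(path=PATH):
    usage = shutil.disk_usage(path)
    free_gb = usage.free / 1024**3
    free_ratio = usage.free / usage.total
    # Full when free space dips below either the absolute or the relative floor.
    return free_gb < MIN_FREE_SPACE or free_ratio < MIN_FREE_USAGE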

29
scripts/mover.py Normal file → Executable file
View file

@ -1,7 +1,11 @@
#!/usr/bin/env python3
# This standalone script is used to pause torrents older than last x days, run mover (in Unraid) and start torrents again once completed
import os, sys, time
from datetime import datetime, timedelta
# This standalone script is used to pause torrents older than last x days,
# run mover (in Unraid) and start torrents again once completed
import os
import sys
import time
from datetime import datetime
from datetime import timedelta
# --DEFINE VARIABLES--#
@ -10,7 +14,7 @@ from datetime import datetime, timedelta
# days_to will be the upper limit of how far you want to pause torrents to
days_from = 0
days_to = 2
qbt_host = 'qbittorrent:8080'
qbt_host = "qbittorrent:8080"
qbt_user = None
qbt_pass = None
# --DEFINE VARIABLES--#
@ -19,7 +23,7 @@ qbt_pass = None
try:
from qbittorrentapi import Client, LoginFailed, APIConnectionError
except ModuleNotFoundError:
print("Requirements Error: qbittorrent-api not installed. Please install using the command \"pip install qbittorrent-api\"")
print('Requirements Error: qbittorrent-api not installed. Please install using the command "pip install qbittorrent-api"')
sys.exit(0)
current = datetime.now()
@ -27,7 +31,8 @@ timeoffset_from = (current - timedelta(days=days_from)).timestamp()
timeoffset_to = (current - timedelta(days=days_to)).timestamp()
if days_from > days_to:
raise("Config Error: days_from must be set lower than days_to")
raise ("Config Error: days_from must be set lower than days_to")
def stop_start_torrents(torrent_list, pause=True):
for torrent in torrent_list:
@ -43,16 +48,16 @@ def stop_start_torrents(torrent_list, pause=True):
break
if __name__ == '__main__':
if __name__ == "__main__":
try:
client = Client(host=qbt_host, username=qbt_user, password=qbt_pass)
except LoginFailed:
raise("Qbittorrent Error: Failed to login. Invalid username/password.")
raise ("Qbittorrent Error: Failed to login. Invalid username/password.")
except APIConnectionError:
raise("Qbittorrent Error: Unable to connect to the client.")
raise ("Qbittorrent Error: Unable to connect to the client.")
except Exception:
raise("Qbittorrent Error: Unable to connect to the client.")
torrent_list = client.torrents.info(sort='added_on', reverse=True)
raise ("Qbittorrent Error: Unable to connect to the client.")
torrent_list = client.torrents.info(sort="added_on", reverse=True)
# Pause Torrents
print(f"Pausing torrents from {days_from} - {days_to} days ago")
@ -60,7 +65,7 @@ if __name__ == '__main__':
time.sleep(10)
# Start mover
print("Starting Mover")
os.system('/usr/local/sbin/mover.old start')
os.system("/usr/local/sbin/mover.old start")
# Start Torrents
print(f"Resuming paused torrents from {days_from} - {days_to} days ago")
stop_start_torrents(torrent_list, False)
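stop_start_torrents is truncated in the hunk above; a plausible reconstruction of the window filter, assuming the list arrives sorted newest-first as in the torrents.info call and that torrent objects expose added_on, pause() and resume() (as qbittorrent-api torrents do):

def stop_start_torrents(torrent_list, pause=True):
    for torrent in torrent_list:
        if torrent.added_on < timeoffset_to:
            break  # list is newest-first, so everything past here is older than the window
        if torrent.added_on <= timeoffset_from:
            if pause:
                torrent.pause()
            else:
                torrent.resume()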

46
setup.py Executable file
View file

@ -0,0 +1,46 @@
import os
from setuptools import find_packages
from setuptools import setup
# User-friendly description from README.md
current_directory = os.path.dirname(os.path.abspath(__file__))
try:
with open(os.path.join(current_directory, "README.md"), encoding="utf-8") as f:
long_description = f.read()
except Exception:
long_description = ""
try:
with open(os.path.join(current_directory, "VERSION"), encoding="utf-8") as f:
version_no = f.read()
except Exception:
version_no = ""
setup(
# Name of the package
name="qbit_manage",
# Packages to include into the distribution
packages=find_packages("."),
# Version is read from the VERSION file; it follows https://semver.org
version=version_no,
# Choose a license from here: https://help.github.com/articles/licensing-a-repository
# For example: MIT
license="MIT",
# Short description of your library
description="This tool will help manage tedious tasks in qBittorrent and automate them. "
"Tag, categorize, remove Orphaned data, remove unregistered torrents and much much more.",
# Long description of your library
long_description=long_description,
long_description_content_type="text/markdown",
# Your name
author="bobokun",
# Your email
author_email="",
# Either the link to your github or to your website
url="https://github.com/StuffAnThings",
# Link from which the project can be downloaded
download_url="https://github.com/StuffAnThings/qbit_manage",
)

12
tox.ini Executable file
View file

@ -0,0 +1,12 @@
[tox]
envlist = py39,py310,py311,pre-commit
skip_missing_interpreters = true
[testenv]
deps = -r{toxinidir}/requirements.txt
commands =
pre-commit install
[testenv:pre-commit]
skip_install = true
deps = pre-commit
commands = pre-commit run --all-files --show-diff-on-failure