Merge pull request #253 from StuffAnThings/develop

3.5.0
This commit is contained in:
bobokun 2023-04-03 21:58:07 -04:00 committed by GitHub
commit ac3da267f9
No known key found for this signature in database
GPG key ID: 4AEE18F83AFDEB23
19 changed files with 503 additions and 321 deletions

View file

@ -1,7 +1,7 @@
name: Bug Report
description: Please do not use bug reports for support issues.
title: '[Bug]: '
labels: ['status:not-yet-viewed', 'bug']
labels: ['bug']
assignees: 'bobokun'
body:

View file

@ -1,7 +1,7 @@
name: Feature Request
description: Suggest a new feature for Qbit Manage
title: '[FR]: '
labels: ['status:not-yet-viewed', 'feature request']
labels: ['feature request']
assignees: 'bobokun'
body:

View file

@ -1,7 +1,7 @@
name: 'Docs Request for an Update or Improvement'
description: A request to update or improve Qbit Manage documentation
title: '[Docs]: '
labels: ['status:not-yet-viewed', 'documentation']
labels: ['documentation']
assignees: 'bobokun'
body:

View file

@ -19,7 +19,7 @@ jobs:
# will not occur.
- name: Dependabot metadata
id: dependabot-metadata
uses: dependabot/fetch-metadata@v1.1.1
uses: dependabot/fetch-metadata@v1.3.6
with:
github-token: "${{ secrets.GITHUB_TOKEN }}"
# Here the PR gets approved.

View file

@ -1,7 +1,15 @@
# Requirements Updated
- Bump qbittorrent-api from 2022.11.42 to 2023.2.43
- Adds check for minimum Python requirement of 3.8.1+ (Fixes #221)
- Updates qbittorrent-api to 2023.3.44
# New Features
- Adds support for custom noHL tag (closes #210)
# Bug Fixes
- Fixes NoHL torrents being removed when criteria not met (#205)
- Fixes wrapped JSON Decode error from requests (#245) - Thanks @USA-RedDragon
- Code Refactor / lint - Thanks @bakerboy448
- Adds docstrings - Thanks @bakerboy448
- Fixes #229 - Thanks @bakerboy448
- Fixes #218 - Thanks @bakerboy448
**Full Changelog**: https://github.com/StuffAnThings/qbit_manage/compare/v3.4.3...v3.4.4
**Full Changelog**: https://github.com/StuffAnThings/qbit_manage/compare/v3.4.4...v3.5.0

View file

@ -23,7 +23,7 @@ This is a program used to manage your qBittorrent instance such as:
## Getting Started
Check out the [wiki](https://github.com/StuffAnThings/qbit_manage/wiki) for installation help
1. Install qbit_manage either by installing Python3 on the localhost and following the [Local Installation](https://github.com/StuffAnThings/qbit_manage/wiki/Local-Installations) Guide or by installing Docker and following the [Docker Installation](https://github.com/StuffAnThings/qbit_manage/wiki/Docker-Installation) Guide or the [unRAID Installation](https://github.com/StuffAnThings/qbit_manage/wiki/Unraid-Installation) Guide.<br>
1. Install qbit_manage either by installing Python 3.8.1+ on the localhost and following the [Local Installation](https://github.com/StuffAnThings/qbit_manage/wiki/Local-Installations) Guide or by installing Docker and following the [Docker Installation](https://github.com/StuffAnThings/qbit_manage/wiki/Docker-Installation) Guide or the [unRAID Installation](https://github.com/StuffAnThings/qbit_manage/wiki/Unraid-Installation) Guide.<br>
2. Once installed, you have to [set up your Configuration](https://github.com/StuffAnThings/qbit_manage/wiki/Config-Setup) by creating a [Configuration File](https://github.com/StuffAnThings/qbit_manage/blob/master/config/config.yml.sample) filled with all your values to connect to your qBittorrent instance.
3. Please refer to the list of [Commands](https://github.com/StuffAnThings/qbit_manage/wiki/Commands) that can be used with this tool.
## Usage

View file

@ -1 +1 @@
3.4.4
3.5.0

View file

@ -25,6 +25,7 @@ qbt:
settings:
force_auto_tmm: False # Will force qBittorrent to enable Automatic Torrent Management for each torrent.
tracker_error_tag: issue # Will set the tag of any torrents that do not have a working tracker.
nohardlinks_tag: noHL # Will set the tag of any torrents with no hardlinks.
ignoreTags_OnUpdate: # When running the tag-update function, torrent tags will still be updated even if the torrent already has one or more of the tags defined here. Otherwise, torrents that already have tags will not be re-tagged.
- noHL
- issue
@ -146,14 +147,19 @@ nohardlinks:
- MaM
# <OPTIONAL> cleanup var: WARNING!! Setting this to true will remove and delete the contents of any torrents that have a noHL tag and meet share limits
cleanup: false
# <OPTIONAL> max_ratio var: Will set the torrent Maximum share ratio until torrent is stopped from seeding/uploading
# <OPTIONAL> max_ratio var: Will set the torrent Maximum share ratio until torrent is stopped from seeding/uploading.
# Delete this key from a category's config to use the tracker's configured max_ratio. Will default to -1 if not specified for the category or tracker.
# Uses the larger value of the noHL Category or Tracker specific setting.
max_ratio: 4.0
# <OPTIONAL> max seeding time var: Will set the torrent Maximum seeding time (min) until torrent is stopped from seeding
# <OPTIONAL> max seeding time var: Will set the torrent Maximum seeding time (min) until torrent is stopped from seeding.
# Delete this key from a category's config to use the tracker's configured max_seeding_time. Will default to -1 if not specified for the category or tracker.
# Uses the larger value of the noHL Category or Tracker specific setting.
max_seeding_time: 86400
# <OPTIONAL> Limit Upload Speed var: Will limit the upload speed in KiB/s (kibibytes/second) (`-1` : No Limit)
limit_upload_speed:
# <OPTIONAL> min seeding time var: Will prevent torrent deletion by the cleanup variable if the torrent has not yet reached the minimum seeding time (min).
# Delete this key from a category's config to use the tracker's configured min_seeding_time. Will default to 0 if not specified for the category or tracker.
# Uses the larger value of the noHL Category or Tracker specific setting.
min_seeding_time: 43200
# <OPTIONAL> resume_torrent_after_untagging_noHL var: If a torrent was previously tagged as NoHL and now has hardlinks, this variable will resume your torrent after changing share limits
resume_torrent_after_untagging_noHL: false
@ -165,14 +171,20 @@ nohardlinks:
- BroadcasTheNet
# <OPTIONAL> cleanup var: WARNING!! Setting this to true will remove and delete the contents of any torrents that have a noHL tag and meet share limits
cleanup: false
# <OPTIONAL> max_ratio var: Will set the torrent Maximum share ratio until torrent is stopped from seeding/uploading
# <OPTIONAL> max_ratio var: Will set the torrent Maximum share ratio until torrent is stopped from seeding/uploading.
# Delete this key from a category's config to use the tracker's configured max_ratio. Will default to -1 if not specified for the category or tracker.
# Uses the larger value of the noHL Category or Tracker specific setting.
max_ratio: 4.0
# <OPTIONAL> max seeding time var: Will set the torrent Maximum seeding time (min) until torrent is stopped from seeding
# <OPTIONAL> max seeding time var: Will set the torrent Maximum seeding time (min) until torrent is stopped from seeding.
# Delete this key from a category's config to use the tracker's configured max_seeding_time. Will default to -1 if not specified for the category or tracker.
# Uses the larger value of the noHL Category or Tracker specific setting.
max_seeding_time: 86400
# <OPTIONAL> Limit Upload Speed var: Will limit the upload speed in KiB/s (kibibytes/second) (`-1` : No Limit)
limit_upload_speed:
# <OPTIONAL> min seeding time var: Will prevent torrent deletion by the cleanup variable if the torrent has not yet reached the minimum seeding time (min).
# min_seeding_time: # Not specified for this category; tracker's value will be used. Will default to 0 if not specified for the category or tracker.
# Delete this key from a category's config to use the tracker's configured min_seeding_time. Will default to 0 if not specified for the category or tracker.
# Uses the larger value of the noHL Category or Tracker specific setting.
min_seeding_time: 43200
# <OPTIONAL> resume_torrent_after_untagging_noHL var: If a torrent was previously tagged as NoHL and now has hardlinks, this variable will resume your torrent after changing share limits
resume_torrent_after_untagging_noHL: false
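The three share-limit comments above all follow the same resolution rule; a minimal sketch of that rule, with a helper name and fallback handling that are illustrative rather than taken from qbit_manage itself:

```python
def resolve_share_limit(category_value, tracker_value, default):
    """Resolve an effective noHL share limit.

    A missing (None) value falls back to the other source; when both the
    category and the tracker define a value, the larger one wins. If
    neither defines one, the stated default applies (-1 for max_ratio and
    max_seeding_time, 0 for min_seeding_time).
    """
    values = [v for v in (category_value, tracker_value) if v is not None]
    return max(values) if values else default

# Category sets 4.0, tracker sets 3.0 -> the larger value, 4.0, is used
print(resolve_share_limit(4.0, 3.0, -1))    # 4.0
# Neither source sets max_seeding_time -> default -1 (unlimited)
print(resolve_share_limit(None, None, -1))  # -1
```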

View file

@ -1,3 +1,4 @@
"""Apprise notification class"""
from modules import util
from modules.util import Failed
@ -5,6 +6,8 @@ logger = util.logger
class Apprise:
"""Apprise notification class"""
def __init__(self, config, params):
self.config = config
self.api_url = params["api_url"]

View file

@ -1,13 +1,16 @@
"""Module for BeyondHD (BHD) tracker."""
from json import JSONDecodeError
from modules import util
from modules.util import Failed
logger = util.logger
base_url = "https://beyond-hd.me/api/"
BASE_URL = "https://beyond-hd.me/api/"
class BeyondHD:
"""BeyondHD (BHD) tracker class."""
def __init__(self, config, params):
self.config = config
self.apikey = params["apikey"]
@ -16,7 +19,8 @@ class BeyondHD:
self.search(json)
def search(self, json, path="torrents/"):
url = f"{base_url}{path}{self.apikey}"
"""Search BHD."""
url = f"{BASE_URL}{path}{self.apikey}"
json["action"] = "search"
logger.trace(url)
logger.trace(f"JSON: {json}")
@ -24,11 +28,11 @@ class BeyondHD:
response = self.config.post(url, json=json, headers={"User-Agent": "Chrome"})
logger.trace(response)
response_json = response.json()
except JSONDecodeError as e:
except JSONDecodeError as err:
if response.status_code >= 400:
raise Failed(e)
elif "Expecting value" in e:
logger.debug(e)
raise Failed(err) from err
elif "Expecting value" in err:
logger.debug(err)
if response.status_code >= 400:
logger.debug(f"Response: {response_json}")
raise Failed(f"({response.status_code} [{response.reason}]) {response_json}")
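The widened try/except above is the fix for wrapped JSONDecodeError responses from requests (#245); a self-contained sketch of the same guard, using a stub object in place of a real `requests.Response`:

```python
from json import JSONDecodeError, loads


class StubResponse:
    """Illustrative stand-in for requests.Response."""

    def __init__(self, text, status_code, reason="OK"):
        self.text = text
        self.status_code = status_code
        self.reason = reason

    def json(self):
        return loads(self.text)


def parse_json(response):
    """Return the decoded body, or raise with the HTTP status on failure."""
    try:
        return response.json()
    except JSONDecodeError as err:
        if response.status_code >= 400:
            # `from err` keeps the decode error in the traceback chain
            raise RuntimeError(f"({response.status_code} [{response.reason}])") from err
        return None  # 2xx body that is not JSON


print(parse_json(StubResponse('{"ok": true}', 200)))  # {'ok': True}
```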

View file

@ -1,3 +1,4 @@
"""Config class for qBittorrent-Manage"""
import os
import re
import stat
@ -33,6 +34,8 @@ COMMANDS = [
class Config:
"""Config class for qBittorrent-Manage"""
def __init__(self, default_dir, args):
logger.info("Locating config...")
self.args = args
@ -143,8 +146,13 @@ class Config:
"tracker_error_tag": self.util.check_for_attribute(
self.data, "tracker_error_tag", parent="settings", default="issue"
),
"nohardlinks_tag": self.util.check_for_attribute(self.data, "nohardlinks_tag", parent="settings", default="noHL"),
}
default_ignore_tags = ["noHL", self.settings["tracker_error_tag"], "cross-seed"]
self.tracker_error_tag = self.settings["tracker_error_tag"]
self.nohardlinks_tag = self.settings["nohardlinks_tag"]
default_ignore_tags = [self.nohardlinks_tag, self.tracker_error_tag, "cross-seed"]
self.settings["ignoreTags_OnUpdate"] = self.util.check_for_attribute(
self.data, "ignoreTags_OnUpdate", parent="settings", default=default_ignore_tags, var_type="list"
)
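The change above makes the default ignore list track the user's configured tags instead of a hard-coded "noHL"; a small sketch of that derivation (the dict-based settings shape is illustrative, not the Config class's actual API):

```python
def default_ignore_tags(settings):
    """Build the ignoreTags_OnUpdate default from the configured tags."""
    return [
        settings.get("nohardlinks_tag", "noHL"),
        settings.get("tracker_error_tag", "issue"),
        "cross-seed",
    ]


# With stock settings the default is unchanged from earlier releases
print(default_ignore_tags({}))  # ['noHL', 'issue', 'cross-seed']
# A custom noHL tag now flows into the ignore list automatically
print(default_ignore_tags({"nohardlinks_tag": "no_hardlinks"})[0])  # no_hardlinks
```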
@ -161,7 +169,7 @@ class Config:
"cleanup_dirs": None,
}
self.webhooks = {
self.webhooks_factory = {
"error": self.util.check_for_attribute(self.data, "error", parent="webhooks", var_type="list", default_is_none=True),
"run_start": self.util.check_for_attribute(
self.data, "run_start", parent="webhooks", var_type="list", default_is_none=True
@ -178,12 +186,12 @@ class Config:
self.cat_change = self.data["cat_change"] if "cat_change" in self.data else {}
self.AppriseFactory = None
self.apprise_factory = None
if "apprise" in self.data:
if self.data["apprise"] is not None:
logger.info("Connecting to Apprise...")
try:
self.AppriseFactory = Apprise(
self.apprise_factory = Apprise(
self,
{
"api_url": self.util.check_for_attribute(
@ -194,16 +202,16 @@ class Config:
),
},
)
except Failed as e:
logger.error(e)
logger.info(f"Apprise Connection {'Failed' if self.AppriseFactory is None else 'Successful'}")
except Failed as err:
logger.error(err)
logger.info(f"Apprise Connection {'Failed' if self.apprise_factory is None else 'Successful'}")
self.NotifiarrFactory = None
self.notifiarr_factory = None
if "notifiarr" in self.data:
if self.data["notifiarr"] is not None:
logger.info("Connecting to Notifiarr...")
try:
self.NotifiarrFactory = Notifiarr(
self.notifiarr_factory = Notifiarr(
self,
{
"apikey": self.util.check_for_attribute(self.data, "apikey", parent="notifiarr", throw=True),
@ -212,29 +220,31 @@ class Config:
),
},
)
except Failed as e:
logger.error(e)
logger.info(f"Notifiarr Connection {'Failed' if self.NotifiarrFactory is None else 'Successful'}")
except Failed as err:
logger.error(err)
logger.info(f"Notifiarr Connection {'Failed' if self.notifiarr_factory is None else 'Successful'}")
self.Webhooks = Webhooks(self, self.webhooks, notifiarr=self.NotifiarrFactory, apprise=self.AppriseFactory)
self.webhooks_factory = Webhooks(
self, self.webhooks_factory, notifiarr=self.notifiarr_factory, apprise=self.apprise_factory
)
try:
self.Webhooks.start_time_hooks(self.start_time)
except Failed as e:
self.webhooks_factory.start_time_hooks(self.start_time)
except Failed as err:
logger.stacktrace()
logger.error(f"Webhooks Error: {e}")
logger.error(f"Webhooks Error: {err}")
self.BeyondHD = None
self.beyond_hd = None
if "bhd" in self.data:
if self.data["bhd"] is not None:
logger.info("Connecting to BHD API...")
try:
self.BeyondHD = BeyondHD(
self.beyond_hd = BeyondHD(
self, {"apikey": self.util.check_for_attribute(self.data, "apikey", parent="bhd", throw=True)}
)
except Failed as e:
logger.error(e)
self.notify(e, "BHD")
logger.info(f"BHD Connection {'Failed' if self.BeyondHD is None else 'Successful'}")
except Failed as err:
logger.error(err)
self.notify(err, "BHD")
logger.info(f"BHD Connection {'Failed' if self.beyond_hd is None else 'Successful'}")
# nohardlinks
self.nohardlinks = None
@ -336,15 +346,15 @@ class Config:
save=False,
)
else:
e = f"Config Error: Category {cat} is defined under nohardlinks attribute "
err = f"Config Error: Category {cat} is defined under nohardlinks attribute "
"but is not defined in the cat attribute."
self.notify(e, "Config")
raise Failed(e)
self.notify(err, "Config")
raise Failed(err)
else:
if self.commands["tag_nohardlinks"]:
e = "Config Error: nohardlinks attribute max_ratio not found"
self.notify(e, "Config")
raise Failed(e)
err = "Config Error: nohardlinks attribute max_ratio not found"
self.notify(err, "Config")
raise Failed(err)
# Add RecycleBin
self.recyclebin = {}
@ -416,9 +426,9 @@ class Config:
if self.recyclebin["enabled"] and self.recyclebin["save_torrents"]:
self.torrents_dir = self.util.check_for_attribute(self.data, "torrents_dir", parent="directory", var_type="path")
if not any(File.endswith(".torrent") for File in os.listdir(self.torrents_dir)):
e = f"Config Error: The location {self.torrents_dir} does not contain any .torrents"
self.notify(e, "Config")
raise Failed(e)
err = f"Config Error: The location {self.torrents_dir} does not contain any .torrents"
self.notify(err, "Config")
raise Failed(err)
else:
self.torrents_dir = self.util.check_for_attribute(
self.data, "torrents_dir", parent="directory", default_is_none=True
@ -728,9 +738,9 @@ class Config:
for s in save_path
]
location_path_list = [location_path]
for dir in cleaned_save_path:
if os.path.exists(dir):
location_path_list.append(dir)
for folder in cleaned_save_path:
if os.path.exists(folder):
location_path_list.append(folder)
else:
e = f"No categories defined. Checking {location} directory {location_path}."
self.notify(e, f"Empty {location}", False)
@ -796,25 +806,25 @@ class Config:
def send_notifications(self, attr):
try:
function = attr["function"]
config_webhooks = self.Webhooks.function_webhooks
config_webhooks = self.webhooks_factory.function_webhooks
config_function = None
for key in config_webhooks:
if key in function:
config_function = key
break
if config_function:
self.Webhooks.function_hooks([config_webhooks[config_function]], attr)
self.webhooks_factory.function_hooks([config_webhooks[config_function]], attr)
except Failed as e:
logger.stacktrace()
logger.error(f"Webhooks Error: {e}")
logger.error(f"webhooks_factory Error: {e}")
def notify(self, text, function=None, critical=True):
for error in util.get_list(text, split=False):
try:
self.Webhooks.error_hooks(error, function_error=function, critical=critical)
self.webhooks_factory.error_hooks(error, function_error=function, critical=critical)
except Failed as e:
logger.stacktrace()
logger.error(f"Webhooks Error: {e}")
logger.error(f"webhooks_factory Error: {e}")
def get_json(self, url, json=None, headers=None, params=None):
return self.get(url, json=json, headers=headers, params=params).json()

View file

@ -1,3 +1,4 @@
"""Logging module"""
import io
import logging
import os
@ -20,6 +21,7 @@ TRACE = 5
def fmt_filter(record):
"""Filter log message"""
record.levelname = f"[{record.levelname}]"
record.filename = f"[{record.filename}:{record.lineno}]"
return True
@ -29,7 +31,10 @@ _srcfile = os.path.normcase(fmt_filter.__code__.co_filename)
class MyLogger:
"""Logger class"""
def __init__(self, logger_name, log_file, log_level, default_dir, screen_width, separating_character, ignore_ghost):
"""Initialize logger"""
self.logger_name = logger_name
self.default_dir = default_dir
self.screen_width = screen_width
@ -60,9 +65,11 @@ class MyLogger:
self._logger.addHandler(cmd_handler)
def clear_errors(self):
"""Clear saved errors"""
self.saved_errors = []
def _get_handler(self, log_file, count=3):
"""Get handler for log file"""
max_bytes = 1024 * 1024 * 2
_handler = RotatingFileHandler(log_file, delay=True, mode="w", maxBytes=max_bytes, backupCount=count, encoding="utf-8")
self._formatter(_handler)
@ -71,20 +78,24 @@ class MyLogger:
return _handler
def _formatter(self, handler, border=True):
"""Format log message"""
text = f"| %(message)-{self.screen_width - 2}s |" if border else f"%(message)-{self.screen_width - 2}s"
if isinstance(handler, RotatingFileHandler):
text = f"[%(asctime)s] %(filename)-27s %(levelname)-10s {text}"
handler.setFormatter(logging.Formatter(text))
def add_main_handler(self):
"""Add main handler to logger"""
self.main_handler = self._get_handler(self.main_log, count=9)
self.main_handler.addFilter(fmt_filter)
self._logger.addHandler(self.main_handler)
def remove_main_handler(self):
"""Remove main handler from logger"""
self._logger.removeHandler(self.main_handler)
def add_config_handler(self, config_key):
"""Add config handler to logger"""
if config_key in self.config_handlers:
self._logger.addHandler(self.config_handlers[config_key])
else:
@ -92,10 +103,12 @@ class MyLogger:
self._logger.addHandler(self.config_handlers[config_key])
def remove_config_handler(self, config_key):
"""Remove config handler from logger"""
if config_key in self.config_handlers:
self._logger.removeHandler(self.config_handlers[config_key])
def _centered(self, text, sep=" ", side_space=True, left=False):
"""Center text"""
if len(text) > self.screen_width - 2:
return text
space = self.screen_width - len(text) - 2
@ -108,6 +121,7 @@ class MyLogger:
return final_text
def separator(self, text=None, space=True, border=True, side_space=True, left=False, loglevel="INFO"):
"""Print separator"""
sep = " " if space else self.separating_character
for handler in self._logger.handlers:
self._formatter(handler, border=False)
@ -116,8 +130,8 @@ class MyLogger:
self.print_line(border_text, loglevel)
if text:
text_list = text.split("\n")
for t in text_list:
self.print_line(f"|{sep}{self._centered(t, sep=sep, side_space=side_space, left=left)}{sep}|", loglevel)
for txt in text_list:
self.print_line(f"|{sep}{self._centered(txt, sep=sep, side_space=side_space, left=left)}{sep}|", loglevel)
if border:
self.print_line(border_text, loglevel)
for handler in self._logger.handlers:
@ -125,50 +139,61 @@ class MyLogger:
return [text]
def print_line(self, msg, loglevel="INFO", *args, **kwargs):
"""Print line"""
loglvl = getattr(logging, loglevel.upper())
if self._logger.isEnabledFor(loglvl):
self._log(loglvl, str(msg), args, **kwargs)
return [str(msg)]
def trace(self, msg, *args, **kwargs):
"""Print trace"""
if self._logger.isEnabledFor(TRACE):
self._log(TRACE, str(msg), args, **kwargs)
def debug(self, msg, *args, **kwargs):
"""Print debug"""
if self._logger.isEnabledFor(DEBUG):
self._log(DEBUG, str(msg), args, **kwargs)
def info_center(self, msg, *args, **kwargs):
"""Print info centered"""
self.info(self._centered(str(msg)), *args, **kwargs)
def info(self, msg, *args, **kwargs):
"""Print info"""
if self._logger.isEnabledFor(INFO):
self._log(INFO, str(msg), args, **kwargs)
def dryrun(self, msg, *args, **kwargs):
"""Print dryrun"""
if self._logger.isEnabledFor(DRYRUN):
self._log(DRYRUN, str(msg), args, **kwargs)
def warning(self, msg, *args, **kwargs):
"""Print warning"""
if self._logger.isEnabledFor(WARNING):
self._log(WARNING, str(msg), args, **kwargs)
def error(self, msg, *args, **kwargs):
"""Print error"""
if self.save_errors:
self.saved_errors.append(msg)
if self._logger.isEnabledFor(ERROR):
self._log(ERROR, str(msg), args, **kwargs)
def critical(self, msg, *args, **kwargs):
"""Print critical"""
if self.save_errors:
self.saved_errors.append(msg)
if self._logger.isEnabledFor(CRITICAL):
self._log(CRITICAL, str(msg), args, **kwargs)
def stacktrace(self):
"""Print stacktrace"""
self.debug(traceback.format_exc())
def _space(self, display_title):
"""Add spaces to display title"""
display_title = str(display_title)
space_length = self.spacing - len(display_title)
if space_length > 0:
@ -176,6 +201,7 @@ class MyLogger:
return display_title
def ghost(self, text):
"""Print ghost"""
if not self.ignore_ghost:
try:
final_text = f"| {text}"
@ -186,15 +212,18 @@ class MyLogger:
self.spacing = len(text) + 2
def exorcise(self):
"""Exorcise ghost"""
if not self.ignore_ghost:
print(self._space(" "), end="\r")
self.spacing = 0
def secret(self, text):
"""Add secret"""
if str(text) not in self.secrets and str(text):
self.secrets.append(str(text))
def insert_space(self, display_title, space_length=0):
"""Insert space"""
display_title = str(display_title)
if space_length == 0:
space_length = self.spacing - len(display_title)
@ -203,6 +232,7 @@ class MyLogger:
return display_title
def _log(self, level, msg, args, exc_info=None, extra=None, stack_info=False, stacklevel=1):
"""Log"""
if self.spacing > 0:
self.exorcise()
if "\n" in msg:
@ -226,43 +256,44 @@ class MyLogger:
try:
if not _srcfile:
raise ValueError
fn, lno, func, sinfo = self.findCaller(stack_info, stacklevel)
fn, lno, func, sinfo = self.find_caller(stack_info, stacklevel)
except ValueError:
fn, lno, func, sinfo = "(unknown file)", 0, "(unknown function)", None
func, lno, func, sinfo = "(unknown file)", 0, "(unknown function)", None
if exc_info:
if isinstance(exc_info, BaseException):
exc_info = (type(exc_info), exc_info, exc_info.__traceback__)
elif not isinstance(exc_info, tuple):
exc_info = sys.exc_info()
record = self._logger.makeRecord(self._logger.name, level, fn, lno, msg, args, exc_info, func, extra, sinfo)
record = self._logger.makeRecord(self._logger.name, level, fn, lno, msg, args, exc_info, func, extra, sinfo)
self._logger.handle(record)
def findCaller(self, stack_info=False, stacklevel=1):
f = logging.currentframe()
if f is not None:
f = f.f_back
orig_f = f
while f and stacklevel > 1:
f = f.f_back
def find_caller(self, stack_info=False, stacklevel=1):
"""Find caller"""
frm = logging.currentframe()
if frm is not None:
frm = frm.f_back
orig_f = frm
while frm and stacklevel > 1:
frm = frm.f_back
stacklevel -= 1
if not f:
f = orig_f
rv = "(unknown file)", 0, "(unknown function)", None
while hasattr(f, "f_code"):
co = f.f_code
filename = os.path.normcase(co.co_filename)
if not frm:
frm = orig_f
rvf = "(unknown file)", 0, "(unknown function)", None
while hasattr(frm, "f_code"):
code = frm.f_code
filename = os.path.normcase(code.co_filename)
if filename == _srcfile:
f = f.f_back
frm = frm.f_back
continue
sinfo = None
if stack_info:
sio = io.StringIO()
sio.write("Stack (most recent call last):\n")
traceback.print_stack(f, file=sio)
traceback.print_stack(frm, file=sio)
sinfo = sio.getvalue()
if sinfo[-1] == "\n":
sinfo = sinfo[:-1]
sio.close()
rv = (co.co_filename, f.f_lineno, co.co_name, sinfo)
rvf = (code.co_filename, frm.f_lineno, code.co_name, sinfo)
break
return rv
return rvf

View file

@ -5,17 +5,22 @@ from modules.util import Failed
logger = util.logger
base_url = "https://notifiarr.com/api/v1/"
class Notifiarr:
"""Notifiarr API"""
BASE_URL = "https://notifiarr.com/api"
API_VERSION = "v1"
def __init__(self, config, params):
"""Initialize Notifiarr API"""
self.config = config
self.apikey = params["apikey"]
self.header = {"X-API-Key": self.apikey}
self.instance = params["instance"]
self.url = f"{self.BASE_URL}/{self.API_VERSION}/"
logger.secret(self.apikey)
response = self.config.get(f"{base_url}user/qbitManage/", headers=self.header, params={"fetch": "settings"})
response = self.config.get(f"{self.url}user/qbitManage/", headers=self.header, params={"fetch": "settings"})
response_json = None
try:
response_json = response.json()
@ -29,5 +34,6 @@ class Notifiarr:
raise Failed("Notifiarr Error: Invalid apikey")
def notification(self, json):
"""Send notification to Notifiarr"""
params = {"qbit_client": self.config.data["qbt"]["host"], "instance": self.instance}
return self.config.get(f"{base_url}notification/qbitManage/", json=json, headers=self.header, params=params)
return self.config.get(f"{self.url}notification/qbitManage/", json=json, headers=self.header, params=params)

View file

@ -1,3 +1,4 @@
"""Qbittorrent Module"""
import os
import sys
from collections import Counter
@ -19,6 +20,13 @@ logger = util.logger
class Qbt:
"""
Qbittorrent Class
"""
SUPPORTED_VERSION = Version.latest_supported_app_version()
MIN_SUPPORTED_VERSION = "v4.3.0"
def __init__(self, config, params):
self.config = config
self.host = params["host"]
@ -27,49 +35,46 @@ class Qbt:
logger.secret(self.username)
logger.secret(self.password)
logger.debug(f"Host: {self.host}, Username: {self.username}, Password: {self.password}")
e = ""
ex = ""
try:
self.client = Client(host=self.host, username=self.username, password=self.password, VERIFY_WEBUI_CERTIFICATE=False)
self.client.auth_log_in()
SUPPORTED_VERSION = Version.latest_supported_app_version()
CURRENT_VERSION = self.client.app.version
MIN_SUPPORTED_VERSION = "v4.3.0"
logger.debug(f"qBittorrent: {self.client.app.version}")
self.current_version = self.client.app.version
logger.debug(f"qBittorrent: {self.current_version}")
logger.debug(f"qBittorrent Web API: {self.client.app.web_api_version}")
logger.debug(f"qbit_manage support versions: {MIN_SUPPORTED_VERSION} - {SUPPORTED_VERSION}")
if CURRENT_VERSION < MIN_SUPPORTED_VERSION:
e = (
f"Qbittorrent Error: qbit_manage is only comaptible with {MIN_SUPPORTED_VERSION} or higher. "
f"You are currently on {CURRENT_VERSION}."
logger.debug(f"qbit_manage supported versions: {self.MIN_SUPPORTED_VERSION} - {self.SUPPORTED_VERSION}")
if self.current_version < self.MIN_SUPPORTED_VERSION:
ex = (
f"Qbittorrent Error: qbit_manage is only compatible with {self.MIN_SUPPORTED_VERSION} or higher. "
f"You are currently on {self.current_version}."
+ "\n"
+ f"Please upgrade to your Qbittorrent version to {MIN_SUPPORTED_VERSION} or higher to use qbit_manage."
+ f"Please upgrade to your Qbittorrent version to {self.MIN_SUPPORTED_VERSION} or higher to use qbit_manage."
)
elif not Version.is_app_version_supported(CURRENT_VERSION):
e = (
f"Qbittorrent Error: qbit_manage is only comaptible with {SUPPORTED_VERSION} or lower. "
f"You are currently on {CURRENT_VERSION}."
elif not Version.is_app_version_supported(self.current_version):
ex = (
f"Qbittorrent Error: qbit_manage is only compatible with {self.SUPPORTED_VERSION} or lower. "
f"You are currently on {self.current_version}."
+ "\n"
+ f"Please downgrade to your Qbittorrent version to {SUPPORTED_VERSION} to use qbit_manage."
+ f"Please downgrade to your Qbittorrent version to {self.SUPPORTED_VERSION} to use qbit_manage."
)
if e:
self.config.notify(e, "Qbittorrent")
logger.print_line(e, "CRITICAL")
if ex:
self.config.notify(ex, "Qbittorrent")
logger.print_line(ex, "CRITICAL")
sys.exit(0)
else:
logger.info("Qbt Connection Successful")
except LoginFailed:
e = "Qbittorrent Error: Failed to login. Invalid username/password."
self.config.notify(e, "Qbittorrent")
raise Failed(e)
except APIConnectionError:
e = "Qbittorrent Error: Unable to connect to the client."
self.config.notify(e, "Qbittorrent")
raise Failed(e)
except Exception:
e = "Qbittorrent Error: Unable to connect to the client."
self.config.notify(e, "Qbittorrent")
raise Failed(e)
except LoginFailed as exc:
ex = "Qbittorrent Error: Failed to login. Invalid username/password."
self.config.notify(ex, "Qbittorrent")
raise Failed(ex) from exc
except APIConnectionError as exc:
ex = "Qbittorrent Error: Unable to connect to the client."
self.config.notify(ex, "Qbittorrent")
raise Failed(ex) from exc
except Exception as exc:
ex = "Qbittorrent Error: Unable to connect to the client."
self.config.notify(ex, "Qbittorrent")
raise Failed(ex) from exc
logger.separator("Getting Torrent List", space=False, border=False)
self.torrent_list = self.get_torrents({"sort": "added_on"})
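The block above gates startup on a supported qBittorrent version window; a sketch of that comparison using numeric tuples rather than the library's version strings (the function names and messages here are illustrative):

```python
def parse_version(ver):
    """'v4.3.0' -> (4, 3, 0), so comparisons are numeric, not lexical."""
    return tuple(int(part) for part in ver.lstrip("v").split("."))


def version_error(current, minimum, maximum):
    """Return an error message when current is outside [minimum, maximum], else ''."""
    cur = parse_version(current)
    if cur < parse_version(minimum):
        return f"only compatible with {minimum} or higher; currently on {current}"
    if cur > parse_version(maximum):
        return f"only compatible with {maximum} or lower; currently on {current}"
    return ""


print(version_error("v4.2.9", "v4.3.0", "v4.5.2"))         # too old -> message
print(version_error("v4.4.1", "v4.3.0", "v4.5.2") == "")   # True
```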
@ -130,9 +135,9 @@ class Qbt:
save_path = torrent.save_path
category = torrent.category
torrent_trackers = torrent.trackers
except Exception as e:
self.config.notify(e, "Get Torrent Info", False)
logger.warning(e)
except Exception as ex:
self.config.notify(ex, "Get Torrent Info", False)
logger.warning(ex)
if torrent_name in torrentdict:
t_obj_list.append(torrent)
t_count = torrentdict[torrent_name]["count"] + 1
@ -147,10 +152,10 @@ class Qbt:
status_list = []
is_complete = torrent_is_complete
first_hash = torrent_hash
for x in torrent_trackers:
if x.url.startswith("http"):
status = x.status
msg = x.msg.upper()
for trk in torrent_trackers:
if trk.url.startswith("http"):
status = trk.status
msg = trk.msg.upper()
exception = [
"DOWN",
"DOWN.",
@ -160,11 +165,11 @@ class Qbt:
"BAD GATEWAY",
"TRACKER UNAVAILABLE",
]
if x.status == 2:
if trk.status == 2:
working_tracker = True
break
# Add any potential unregistered torrents to a list
if x.status == 4 and not list_in_text(msg, exception):
if trk.status == 4 and not list_in_text(msg, exception):
issue["potential"] = True
issue["msg"] = msg
issue["status"] = status
@ -207,9 +212,11 @@ class Qbt:
self.torrentinfo, self.torrentissue, self.torrentvalid = get_torrent_info(self.torrent_list)
def get_torrents(self, params):
"""Get torrents from qBittorrent"""
return self.client.torrents.info(**params)
def category(self):
"""Update category for torrents"""
num_cat = 0
def update_cat(new_cat, cat_change):
@ -222,11 +229,11 @@ class Qbt:
if torrent.auto_tmm is False and self.config.settings["force_auto_tmm"]:
torrent.set_auto_management(True)
except Conflict409Error:
e = logger.print_line(
ex = logger.print_line(
f'Existing category "{new_cat}" not found for save path {torrent.save_path}, category will be created.',
self.config.loglevel,
)
self.config.notify(e, "Update Category", False)
self.config.notify(ex, "Update Category", False)
self.client.torrent_categories.create_category(name=new_cat, save_path=torrent.save_path)
torrent.set_category(category=new_cat)
body = []
@ -274,13 +281,14 @@ class Qbt:
return num_cat
def tags(self):
"""Update tags for torrents"""
num_tags = 0
ignore_tags = self.config.settings["ignoreTags_OnUpdate"]
if self.config.commands["tag_update"]:
logger.separator("Updating Tags", space=False, border=False)
for torrent in self.torrent_list:
check_tags = util.get_list(torrent.tags)
if torrent.tags == "" or (len([x for x in check_tags if x not in ignore_tags]) == 0):
if torrent.tags == "" or (len([trk for trk in check_tags if trk not in ignore_tags]) == 0):
tracker = self.config.get_tags(torrent.trackers)
if tracker["tag"]:
num_tags += len(tracker["tag"])
@ -328,6 +336,7 @@ class Qbt:
def set_tags_and_limits(
self, torrent, max_ratio, max_seeding_time, limit_upload_speed=None, tags=None, restore=False, do_print=True
):
"""Set tags and limits for a torrent"""
body = []
if limit_upload_speed:
if limit_upload_speed != -1:
@ -356,7 +365,6 @@ class Qbt:
body += logger.print_line(msg, self.config.loglevel)
else:
body.append(msg)
elif max_seeding_time != torrent.max_seeding_time and (not max_ratio or max_ratio < 0):
msg = logger.insert_space(f"Share Limit: Max Seed Time = {max_seeding_time} min", 4)
if do_print:
@ -388,11 +396,14 @@ class Qbt:
return body
def has_reached_seed_limit(self, torrent, max_ratio, max_seeding_time, min_seeding_time, resume_torrent, tracker):
"""Check if torrent has reached seed limit"""
body = ""
def _has_reached_min_seeding_time_limit():
print_log = []
if torrent.seeding_time >= min_seeding_time * 60:
if "MinSeedTimeNotReached" in torrent.tags:
torrent.remove_tags(tags="MinSeedTimeNotReached")
return True
else:
print_log += logger.print_line(logger.insert_space(f"Torrent Name: {torrent.name}", 3), self.config.loglevel)
@ -450,51 +461,53 @@ class Qbt:
return False
def tag_nohardlinks(self):
num_tags = 0 # counter for the number of torrents that has no hard links
del_tor = 0 # counter for the number of torrents that has no hard links and \
"""Tag torrents with no hardlinks"""
num_tags = 0 # counter for the number of torrents that has no hardlinks
del_tor = 0 # counter for the number of torrents that has no hardlinks and \
# meets the criteria for ratio limit/seed limit for deletion
del_tor_cont = 0 # counter for the number of torrents that has no hard links and \
del_tor_cont = 0 # counter for the number of torrents that has no hardlinks and \
# meets the criteria for ratio limit/seed limit for deletion including contents
num_untag = 0 # counter for number of torrents that previously had no hard links but now have hard links
num_untag = 0 # counter for number of torrents that previously had no hardlinks but now have hardlinks
def add_tag_noHL(add_tag=True):
nonlocal num_tags, torrent, tracker, nohardlinks, category
def add_tag_no_hl(add_tag=True):
"""Add tag nohardlinks_tag to torrents with no hardlinks"""
nonlocal num_tags, torrent, tracker, nohardlinks, category, max_ratio, max_seeding_time
body = []
body.append(logger.insert_space(f"Torrent Name: {torrent.name}", 3))
if add_tag:
body.append(logger.insert_space("Added Tag: noHL", 6))
body.append(logger.insert_space(f"Added Tag: {self.config.nohardlinks_tag}", 6))
title = "Tagging Torrents with No Hardlinks"
else:
title = "Changing Share Ratio of Torrents with No Hardlinks"
body.append(logger.insert_space(f'Tracker: {tracker["url"]}', 8))
body_tags_and_limits = self.set_tags_and_limits(
torrent,
nohardlinks[category]["max_ratio"],
nohardlinks[category]["max_seeding_time"],
max_ratio,
max_seeding_time,
nohardlinks[category]["limit_upload_speed"],
tags="noHL",
tags=self.config.nohardlinks_tag,
do_print=False,
)
if body_tags_and_limits or add_tag:
num_tags += 1
# Resume torrent if it was paused now that the share limit has changed
if torrent.state == "pausedUP" and nohardlinks[category]["resume_torrent_after_untagging_noHL"]:
if torrent.state_enum.is_complete and nohardlinks[category]["resume_torrent_after_untagging_noHL"]:
if not self.config.dry_run:
torrent.resume()
body.extend(body_tags_and_limits)
for b in body:
logger.print_line(b, self.config.loglevel)
for rcd in body:
logger.print_line(rcd, self.config.loglevel)
attr = {
"function": "tag_nohardlinks",
"title": title,
"body": "\n".join(body),
"torrent_name": torrent.name,
"torrent_category": torrent.category,
"torrent_tag": "noHL",
"torrent_tag": self.config.nohardlinks_tag,
"torrent_tracker": tracker["url"],
"notifiarr_indexer": tracker["notifiarr"],
"torrent_max_ratio": nohardlinks[category]["max_ratio"],
"torrent_max_seeding_time": nohardlinks[category]["max_seeding_time"],
"torrent_max_ratio": max_ratio,
"torrent_max_seeding_time": max_seeding_time,
"torrent_limit_upload_speed": nohardlinks[category]["limit_upload_speed"],
}
self.config.send_notifications(attr)
@ -508,13 +521,13 @@ class Qbt:
for category in nohardlinks:
torrent_list = self.get_torrents({"category": category, "status_filter": "completed"})
if len(torrent_list) == 0:
e = (
ex = (
"No torrents found in the category ("
+ category
+ ") defined under nohardlinks attribute in the config. "
+ "Please check if this matches with any category in qbittorrent and has 1 or more torrents."
)
logger.warning(e)
logger.warning(ex)
continue
for torrent in torrent_list:
tracker = self.config.get_tags(torrent.trackers)
@ -523,26 +536,57 @@ class Qbt:
# Skip to the next torrent if we find any torrents that are in the exclude tag
continue
else:
# Checks for any hard links and not already tagged
# Checks for any hardlinks and not already tagged
# Cleans up previously tagged nohardlinks_tag torrents that no longer have hardlinks
if has_nohardlinks:
# Will only tag new torrents that don't have noHL tag
if "noHL" not in torrent.tags:
add_tag_noHL(add_tag=True)
# Cleans up previously tagged noHL torrents
# Determine min_seeding_time. noHL > Tracker w/ default 0
min_seeding_time = 0
tracker = self.config.get_tags(torrent.trackers)
if nohardlinks[category]["min_seeding_time"]:
min_seeding_time = nohardlinks[category]["min_seeding_time"]
elif tracker["min_seeding_time"]:
# Determine min_seeding_time.
# If only tracker setting is set, use tracker's min_seeding_time
# If only nohardlinks category setting is set, use nohardlinks category's min_seeding_time
# If both tracker and nohardlinks category settings are set, use the larger of the two
# If neither is set, use 0 (no limit)
min_seeding_time = 0
if (
tracker["min_seeding_time"]
and tracker["min_seeding_time"] >= nohardlinks[category]["min_seeding_time"]
):
min_seeding_time = tracker["min_seeding_time"]
elif nohardlinks[category]["min_seeding_time"]:
min_seeding_time = nohardlinks[category]["min_seeding_time"]
# Determine max_ratio.
# If only tracker setting is set, use tracker's max_ratio
# If only nohardlinks category setting is set, use nohardlinks category's max_ratio
# If both tracker and nohardlinks category settings are set, use the larger of the two
# If neither is set, use -1 (no limit)
max_ratio = -1
if tracker["max_ratio"] and tracker["max_ratio"] >= nohardlinks[category]["max_ratio"]:
max_ratio = tracker["max_ratio"]
elif nohardlinks[category]["max_ratio"]:
max_ratio = nohardlinks[category]["max_ratio"]
# Determine max_seeding_time.
# If only tracker setting is set, use tracker's max_seeding_time
# If only nohardlinks category setting is set, use nohardlinks category's max_seeding_time
# If both tracker and nohardlinks category settings are set, use the larger of the two
# If neither is set, use -1 (no limit)
max_seeding_time = -1
if (
tracker["max_seeding_time"]
and tracker["max_seeding_time"] >= nohardlinks[category]["max_seeding_time"]
):
max_seeding_time = tracker["max_seeding_time"]
elif nohardlinks[category]["max_seeding_time"]:
max_seeding_time = nohardlinks[category]["max_seeding_time"]
# Will only tag new torrents that don't have nohardlinks_tag tag
if self.config.nohardlinks_tag not in torrent.tags:
add_tag_no_hl(add_tag=True)
# Deletes torrent with data if cleanup is set to true and meets the ratio/seeding requirements
if nohardlinks[category]["cleanup"] and len(nohardlinks[category]) > 0:
tor_reach_seed_limit = self.has_reached_seed_limit(
torrent,
nohardlinks[category]["max_ratio"],
nohardlinks[category]["max_seeding_time"],
max_ratio,
max_seeding_time,
min_seeding_time,
nohardlinks[category]["resume_torrent_after_untagging_noHL"],
tracker["url"],
@ -557,19 +601,24 @@ class Qbt:
else:
# Updates torrent to see if "MinSeedTimeNotReached" tag has been added
torrent = self.get_torrents({"torrent_hashes": [torrent.hash]}).data[0]
# Checks to see if previously noHL share limits have changed.
add_tag_noHL(add_tag=False)
# Checks to see if previous noHL tagged torrents now have hard links.
if not (has_nohardlinks) and ("noHL" in torrent.tags):
# Checks to see if previously nohardlinks_tag share limits have changed.
add_tag_no_hl(add_tag=False)
# Checks to see if previous nohardlinks_tag tagged torrents now have hardlinks.
if not (has_nohardlinks) and (self.config.nohardlinks_tag in torrent.tags):
num_untag += 1
body = []
body += logger.print_line(
f"Previous Tagged noHL Torrent Name: {torrent.name} has hard links found now.", self.config.loglevel
f"Previous Tagged {self.config.nohardlinks_tag} "
f"Torrent Name: {torrent.name} has hardlinks found now.",
self.config.loglevel,
)
body += logger.print_line(
logger.insert_space(f"Removed Tag: {self.config.nohardlinks_tag}", 6), self.config.loglevel
)
body += logger.print_line(logger.insert_space("Removed Tag: noHL", 6), self.config.loglevel)
body += logger.print_line(logger.insert_space(f'Tracker: {tracker["url"]}', 8), self.config.loglevel)
body += logger.print_line(
f"{'Not Reverting' if self.config.dry_run else 'Reverting'} share limits.", self.config.loglevel
f"{'Not Reverting' if self.config.dry_run else 'Reverting'} to tracker or Global share limits.",
self.config.loglevel,
)
restore_max_ratio = tracker["max_ratio"]
restore_max_seeding_time = tracker["max_seeding_time"]
@ -581,21 +630,21 @@ class Qbt:
if restore_limit_upload_speed is None:
restore_limit_upload_speed = -1
if not self.config.dry_run:
torrent.remove_tags(tags="noHL")
torrent.remove_tags(tags=self.config.nohardlinks_tag)
body.extend(
self.set_tags_and_limits(
torrent, restore_max_ratio, restore_max_seeding_time, restore_limit_upload_speed, restore=True
)
)
if torrent.state == "pausedUP" and nohardlinks[category]["resume_torrent_after_untagging_noHL"]:
if torrent.state_enum.is_complete and nohardlinks[category]["resume_torrent_after_untagging_noHL"]:
torrent.resume()
attr = {
"function": "untag_nohardlinks",
"title": "Untagging Previous Torrents that now have Hard Links",
"title": "Untagging Previous Torrents that now have hardlinks",
"body": "\n".join(body),
"torrent_name": torrent.name,
"torrent_category": torrent.category,
"torrent_tag": "noHL",
"torrent_tag": self.config.nohardlinks_tag,
"torrent_tracker": tracker["url"],
"notifiarr_indexer": tracker["notifiarr"],
"torrent_max_ratio": restore_max_ratio,
@ -609,7 +658,7 @@ class Qbt:
for torrent in torrent_list:
t_name = torrent.name
t_hash = torrent.hash
if t_hash in tdel_dict.keys() and "noHL" in torrent.tags:
if t_hash in tdel_dict and self.config.nohardlinks_tag in torrent.tags:
t_count = self.torrentinfo[t_name]["count"]
t_msg = self.torrentinfo[t_name]["msg"]
t_status = self.torrentinfo[t_name]["status"]
@ -623,7 +672,7 @@ class Qbt:
)
body += logger.print_line(tdel_dict[t_hash]["body"], self.config.loglevel)
body += logger.print_line(
logger.insert_space("Cleanup: True [No hard links found and meets Share Limits.]", 8),
logger.insert_space("Cleanup: True [No hardlinks found and meets Share Limits.]", 8),
self.config.loglevel,
)
attr = {
@ -672,10 +721,11 @@ class Qbt:
self.config.loglevel,
)
else:
logger.print_line("No torrents to tag with no hard links.", self.config.loglevel)
logger.print_line("No torrents to tag with no hardlinks.", self.config.loglevel)
if num_untag >= 1:
logger.print_line(
f"{'Did not delete' if self.config.dry_run else 'Deleted'} noHL tags / share limits for {num_untag} "
f"{'Did not delete' if self.config.dry_run else 'Deleted'} "
f"{self.config.nohardlinks_tag} tags / share limits for {num_untag} "
f".torrent{'s.' if num_untag > 1 else '.'}",
self.config.loglevel,
)
@ -694,12 +744,13 @@ class Qbt:
return num_tags, num_untag, del_tor, del_tor_cont
def rem_unregistered(self):
"""Remove torrents with unregistered trackers."""
del_tor = 0
del_tor_cont = 0
num_tor_error = 0
num_untag = 0
tor_error_summary = ""
tag_error = self.config.settings["tracker_error_tag"]
tag_error = self.config.tracker_error_tag
cfg_rem_unregistered = self.config.commands["rem_unregistered"]
cfg_tag_error = self.config.commands["tag_tracker_error"]
@ -829,39 +880,39 @@ class Qbt:
t_status = self.torrentinfo[t_name]["status"]
check_tags = util.get_list(torrent.tags)
try:
for x in torrent.trackers:
if x.url.startswith("http"):
tracker = self.config.get_tags([x])
msg_up = x.msg.upper()
msg = x.msg
for trk in torrent.trackers:
if trk.url.startswith("http"):
tracker = self.config.get_tags([trk])
msg_up = trk.msg.upper()
msg = trk.msg
# Tag any error torrents
if cfg_tag_error:
if x.status == 4 and tag_error not in check_tags:
if trk.status == 4 and tag_error not in check_tags:
tag_tracker_error()
if cfg_rem_unregistered:
# Tag any error torrents that are not unregistered
if not list_in_text(msg_up, unreg_msgs) and x.status == 4 and tag_error not in check_tags:
if not list_in_text(msg_up, unreg_msgs) and trk.status == 4 and tag_error not in check_tags:
# Check for unregistered torrents using BHD API if the tracker is BHD
if (
"tracker.beyond-hd.me" in tracker["url"]
and self.config.BeyondHD is not None
and self.config.beyond_hd is not None
and not list_in_text(msg_up, ignore_msgs)
):
json = {"info_hash": torrent.hash}
response = self.config.BeyondHD.search(json)
response = self.config.beyond_hd.search(json)
if response["total_results"] == 0:
del_unregistered()
break
tag_tracker_error()
if list_in_text(msg_up, unreg_msgs) and not list_in_text(msg_up, ignore_msgs) and x.status == 4:
if list_in_text(msg_up, unreg_msgs) and not list_in_text(msg_up, ignore_msgs) and trk.status == 4:
del_unregistered()
break
except NotFound404Error:
continue
except Exception as e:
except Exception as ex:
logger.stacktrace()
self.config.notify(e, "Remove Unregistered Torrents", False)
logger.error(f"Unknown Error: {e}")
self.config.notify(ex, "Remove Unregistered Torrents", False)
logger.error(f"Remove Unregistered Torrents Error: {ex}")
if cfg_rem_unregistered:
if del_tor >= 1 or del_tor_cont >= 1:
if del_tor >= 1:
@ -894,8 +945,8 @@ class Qbt:
logger.print_line(tor_error_summary.rstrip(), self.config.loglevel)
return del_tor, del_tor_cont, num_tor_error, num_untag
# Function used to move any torrents from the cross seed directory to the correct save directory
def cross_seed(self):
"""Move torrents from cross seed directory to correct save directory."""
added = 0 # Keep track of total torrents tagged
tagged = 0 # Track # of torrents tagged that are not cross-seeded
if self.config.commands["cross_seed"]:
@ -909,11 +960,11 @@ class Qbt:
dir_cs_out = os.path.join(dir_cs, "qbit_manage_added")
os.makedirs(dir_cs_out, exist_ok=True)
for file in cs_files:
t_name = file.split("]", 2)[2].split(".torrent")[0]
tr_name = file.split("]", 2)[2].split(".torrent")[0]
t_tracker = file.split("]", 2)[1][1:]
# Substring Key match in dictionary (used because t_name might not match exactly with torrentdict key)
# Returned the dictionary of filtered item
torrentdict_file = dict(filter(lambda item: t_name in item[0], self.torrentinfo.items()))
torrentdict_file = dict(filter(lambda item: tr_name in item[0], self.torrentinfo.items()))
if torrentdict_file:
# Get the exact torrent match name from torrentdict
t_name = next(iter(torrentdict_file))
@ -989,10 +1040,10 @@ class Qbt:
torrent.add_tags(tags="cross-seed")
numcategory = Counter(categories)
for c in numcategory:
if numcategory[c] > 0:
for cat in numcategory:
if numcategory[cat] > 0:
logger.print_line(
f"{numcategory[c]} {c} cross-seed .torrents {'not added' if self.config.dry_run else 'added'}.",
f"{numcategory[cat]} {cat} cross-seed .torrents {'not added' if self.config.dry_run else 'added'}.",
self.config.loglevel,
)
if added > 0:
@ -1006,8 +1057,8 @@ class Qbt:
)
return added, tagged
# Function used to recheck paused torrents sorted by size and resume torrents that are completed
def recheck(self):
"""Function used to recheck paused torrents sorted by size and resume torrents that are completed"""
resumed = 0
rechecked = 0
if self.config.commands["recheck"]:
@ -1109,6 +1160,7 @@ class Qbt:
return resumed, rechecked
def rem_orphaned(self):
"""Remove orphaned files from remote directory"""
orphaned = 0
if self.config.commands["rem_orphaned"]:
logger.separator("Checking for Orphaned Files", space=False, border=False)
@ -1140,7 +1192,7 @@ class Qbt:
for torrent in torrent_list:
for file in torrent.files:
fullpath = os.path.join(torrent.save_path, file.name)
# Replace fullpath with \\ if qbm is runnig in docker (linux) but qbt is on windows
# Replace fullpath with \\ if qbm is running in docker (linux) but qbt is on windows
fullpath = fullpath.replace(r"/", "\\") if ":\\" in fullpath else fullpath
torrent_files.append(fullpath)
@ -1194,6 +1246,7 @@ class Qbt:
return orphaned
def tor_delete_recycle(self, torrent, info):
"""Move torrent to recycle bin"""
if self.config.recyclebin["enabled"]:
tor_files = []
try:
@ -1228,15 +1281,15 @@ class Qbt:
else:
logger.info(f"Adding {info['torrent_tracker']} to existing {os.path.basename(torrent_json_file)}")
dot_torrent_files = []
for File in os.listdir(self.config.torrents_dir):
if File.startswith(info_hash):
dot_torrent_files.append(File)
for file in os.listdir(self.config.torrents_dir):
if file.startswith(info_hash):
dot_torrent_files.append(file)
try:
util.copy_files(os.path.join(self.config.torrents_dir, File), os.path.join(torrent_path, File))
except Exception as e:
util.copy_files(os.path.join(self.config.torrents_dir, file), os.path.join(torrent_path, file))
except Exception as ex:
logger.stacktrace()
self.config.notify(e, "Deleting Torrent", False)
logger.warning(f"RecycleBin Warning: {e}")
self.config.notify(ex, "Deleting Torrent", False)
logger.warning(f"RecycleBin Warning: {ex}")
if "tracker_torrent_files" in torrent_json:
tracker_torrent_files = torrent_json["tracker_torrent_files"]
else:
@ -1279,12 +1332,12 @@ class Qbt:
dest = os.path.join(recycle_path, file.replace(self.config.remote_dir, ""))
# Move files and change date modified
try:
toDelete = util.move_files(src, dest, True)
to_delete = util.move_files(src, dest, True)
except FileNotFoundError:
e = logger.print_line(f"RecycleBin Warning - FileNotFound: No such file or directory: {src} ", "WARNING")
self.config.notify(e, "Deleting Torrent", False)
ex = logger.print_line(f"RecycleBin Warning - FileNotFound: No such file or directory: {src} ", "WARNING")
self.config.notify(ex, "Deleting Torrent", False)
# Delete torrent and files
torrent.delete(delete_files=toDelete)
torrent.delete(delete_files=to_delete)
# Remove any empty directories
util.remove_empty_directories(save_path, "**/*")
else:
@ -1,3 +1,4 @@
""" Utility functions for qBit Manage. """
import json
import logging
import os
@ -12,6 +13,7 @@ logger = logging.getLogger("qBit Manage")
def get_list(data, lower=False, split=True, int_list=False):
"""Return a list from a string or list."""
if data is None:
return None
elif isinstance(data, list):
@ -32,6 +34,8 @@ def get_list(data, lower=False, split=True, int_list=False):
class check:
"""Check for attributes in config."""
def __init__(self, config):
self.config = config
@ -52,6 +56,7 @@ class check:
save=True,
make_dirs=False,
):
"""Check for attribute in config."""
endline = ""
if parent is not None:
if subparent is not None:
@ -188,10 +193,13 @@ class check:
class Failed(Exception):
"""Exception raised for errors in the input."""
pass
def list_in_text(text, search_list, match_all=False):
"""Check if a list of strings is in a string"""
if isinstance(search_list, list):
search_list = set(search_list)
contains = {x for x in search_list if " " in x}
@ -205,77 +213,80 @@ def list_in_text(text, search_list, match_all=False):
return False
# truncate the value of the torrent url to remove sensitive information
def trunc_val(s, d, n=3):
def trunc_val(stg, delm, num=3):
"""Truncate the value of the torrent url to remove sensitive information"""
try:
x = d.join(s.split(d, n)[:n])
val = delm.join(stg.split(delm, num)[:num])
except IndexError:
x = None
return x
val = None
return val
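Concretely, the renamed `trunc_val` keeps only the first `num` delimiter-separated segments, which drops the passkey segment of an announce URL (the URL below is illustrative, not from the source):

```python
def trunc_val(stg, delm, num=3):
    """Truncate the value of the torrent url to remove sensitive information"""
    try:
        val = delm.join(stg.split(delm, num)[:num])
    except IndexError:
        val = None
    return val

# Everything after the scheme and host is discarded.
url = "https://tracker.example.org/announce/supersecretpasskey"
print(trunc_val(url, "/"))  # https://tracker.example.org
```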
# Move files from source to destination, mod variable is to change the date modified of the file being moved
def move_files(src, dest, mod=False):
"""Move files from source to destination, mod variable is to change the date modified of the file being moved"""
dest_path = os.path.dirname(dest)
toDelete = False
to_delete = False
if os.path.isdir(dest_path) is False:
os.makedirs(dest_path)
try:
if mod is True:
modTime = time.time()
os.utime(src, (modTime, modTime))
mod_time = time.time()
os.utime(src, (mod_time, mod_time))
shutil.move(src, dest)
except PermissionError as p:
logger.warning(f"{p} : Copying files instead.")
except PermissionError as perm:
logger.warning(f"{perm} : Copying files instead.")
shutil.copyfile(src, dest)
toDelete = True
except FileNotFoundError as f:
logger.warning(f"{f} : source: {src} -> destination: {dest}")
except Exception as e:
to_delete = True
except FileNotFoundError as file:
logger.warning(f"{file} : source: {src} -> destination: {dest}")
except Exception as ex:
logger.stacktrace()
logger.error(e)
return toDelete
logger.error(ex)
return to_delete
# Copy Files from source to destination
def copy_files(src, dest):
"""Copy files from source to destination"""
dest_path = os.path.dirname(dest)
if os.path.isdir(dest_path) is False:
os.makedirs(dest_path)
try:
shutil.copyfile(src, dest)
except Exception as e:
except Exception as ex:
logger.stacktrace()
logger.error(e)
logger.error(ex)
# Remove any empty directories after moving files
def remove_empty_directories(pathlib_root_dir, pattern):
"""Remove empty directories recursively."""
pathlib_root_dir = Path(pathlib_root_dir)
# list all directories recursively and sort them by path,
# longest first
L = sorted(
longest = sorted(
pathlib_root_dir.glob(pattern),
key=lambda p: len(str(p)),
reverse=True,
)
for pdir in L:
for pdir in longest:
try:
pdir.rmdir() # remove directory if empty
except OSError:
continue # catch and continue if non-empty
# will check if there are any hard links if it passes a file or folder
# If a folder is passed, it will take the largest file in that folder and only check for hardlinks
# of the remaining files where the file is greater size a percentage of the largest file
# This fixes the bug in #192
def nohardlink(file, notify):
check = True
"""
Check if there are any hard links
Will check if there are any hard links if it passes a file or folder
If a folder is passed, it will take the largest file in that folder and only check for hardlinks
of the remaining files where the file is greater size a percentage of the largest file
This fixes the bug in #192
"""
check_for_hl = True
if os.path.isfile(file):
logger.trace(f"Checking file: {file}")
if os.stat(file).st_nlink > 1:
check = False
check_for_hl = False
else:
sorted_files = sorted(Path(file).rglob("*"), key=lambda x: os.stat(x).st_size, reverse=True)
logger.trace(f"Folder: {file}")
@ -292,36 +303,40 @@ def nohardlink(file, notify):
largest_file_size = os.stat(sorted_files[0]).st_size
logger.trace(f"Largest file: {sorted_files[0]}")
logger.trace(f"Largest file size: {largest_file_size}")
for x in sorted_files:
file_size = os.stat(x).st_size
file_no_hardlinks = os.stat(x).st_nlink
for files in sorted_files:
file_size = os.stat(files).st_size
file_no_hardlinks = os.stat(files).st_nlink
logger.trace(f"Checking file: {file}")
logger.trace(f"Checking file size: {file_size}")
logger.trace(f"Checking no of hard links: {file_no_hardlinks}")
if file_no_hardlinks > 1 and file_size >= (largest_file_size * threshold):
check = False
return check
check_for_hl = False
return check_for_hl
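The `st_nlink` check that `nohardlink` relies on can be seen in isolation: a freshly written file has exactly one directory entry, and `os.link()` adds a second name for the same inode, pushing the count above 1 (POSIX-style filesystems assumed):

```python
import os
import tempfile

with tempfile.TemporaryDirectory() as tmp:
    original = os.path.join(tmp, "episode.mkv")
    with open(original, "wb") as fh:
        fh.write(b"data")
    # One name on disk: no hardlinks yet.
    print(os.stat(original).st_nlink)  # 1
    os.link(original, os.path.join(tmp, "seeding-copy.mkv"))
    # A second name for the same inode: the torrent data is hardlinked.
    print(os.stat(original).st_nlink)  # 2
```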
# Load json file if exists
def load_json(file):
"""Load json file if exists"""
if os.path.isfile(file):
f = open(file)
data = json.load(f)
f.close()
file = open(file)
data = json.load(file)
file.close()
else:
data = {}
return data
# Save json file overwrite if exists
def save_json(torrent_json, dest):
with open(dest, "w", encoding="utf-8") as f:
json.dump(torrent_json, f, ensure_ascii=False, indent=4)
"""Save json file to destination"""
with open(dest, "w", encoding="utf-8") as file:
json.dump(torrent_json, file, ensure_ascii=False, indent=4)
# Gracefully kill script when docker stops
class GracefulKiller:
"""
Class to catch SIGTERM and SIGINT signals.
Gracefully kill script when docker stops.
"""
kill_now = False
def __init__(self):
@ -329,10 +344,12 @@ class GracefulKiller:
signal.signal(signal.SIGTERM, self.exit_gracefully)
def exit_gracefully(self, *args):
"""Set kill_now to True to exit gracefully."""
self.kill_now = True
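Since the hunk cuts off the constructor, here is a self-contained sketch of the full class and the flag it flips when Docker sends SIGTERM (the `SIGINT` registration is assumed from the visible line above; POSIX only):

```python
import os
import signal

class GracefulKiller:
    """Catch SIGTERM and SIGINT and flip a flag the main loop can poll."""
    kill_now = False

    def __init__(self):
        signal.signal(signal.SIGINT, self.exit_gracefully)
        signal.signal(signal.SIGTERM, self.exit_gracefully)

    def exit_gracefully(self, *args):
        """Set kill_now to True to exit gracefully."""
        self.kill_now = True

killer = GracefulKiller()
os.kill(os.getpid(), signal.SIGTERM)  # simulate `docker stop`
print(killer.kill_now)  # True
```

A long-running loop would then check `while not killer.kill_now:` instead of dying mid-operation.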
def human_readable_size(size, decimal_places=3):
"""Convert bytes to human readable size"""
for unit in ["B", "KiB", "MiB", "GiB", "TiB"]:
if size < 1024.0:
break
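The hunk boundary cuts the function off before its return; a runnable sketch (the final format line is an assumption based on the loop and the `decimal_places` parameter) behaves like this:

```python
def human_readable_size(size, decimal_places=3):
    """Convert bytes to human readable size"""
    for unit in ["B", "KiB", "MiB", "GiB", "TiB"]:
        if size < 1024.0:
            break
        size /= 1024.0
    # Assumed closing line: format with the unit the loop stopped on.
    return f"{size:.{decimal_places}f} {unit}"

print(human_readable_size(1536))            # 1.500 KiB
print(human_readable_size(3 * 1024**3, 1))  # 3.0 GiB
```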
@ -341,6 +358,8 @@ def human_readable_size(size, decimal_places=3):
class YAML:
"""Class to load and save yaml files"""
def __init__(self, path=None, input_data=None, check_empty=False, create=False):
self.path = path
self.input_data = input_data
@ -355,19 +374,20 @@ class YAML:
pass
self.data = {}
else:
with open(self.path, encoding="utf-8") as fp:
self.data = self.yaml.load(fp)
except ruamel.yaml.error.YAMLError as e:
e = str(e).replace("\n", "\n ")
raise Failed(f"YAML Error: {e}")
except Exception as e:
raise Failed(f"YAML Error: {e}")
with open(self.path, encoding="utf-8") as filepath:
self.data = self.yaml.load(filepath)
except ruamel.yaml.error.YAMLError as yerr:
err = str(yerr).replace("\n", "\n ")
raise Failed(f"YAML Error: {err}") from yerr
except Exception as yerr:
raise Failed(f"YAML Error: {yerr}") from yerr
if not self.data or not isinstance(self.data, dict):
if check_empty:
raise Failed("YAML Error: File is empty")
self.data = {}
def save(self):
"""Save yaml file"""
if self.path:
with open(self.path, "w") as fp:
self.yaml.dump(self.data, fp)
with open(self.path, "w") as filepath:
self.yaml.dump(self.data, filepath)


@ -1,5 +1,9 @@
"""Class to handle webhooks."""
import time
from json import JSONDecodeError
from requests.exceptions import JSONDecodeError as requestsJSONDecodeError
from modules import util
from modules.util import Failed
@ -7,7 +11,10 @@ logger = util.logger
class Webhooks:
"""Class to handle webhooks."""
def __init__(self, config, system_webhooks, notifiarr=None, apprise=None):
"""Initialize the class."""
self.config = config
self.error_webhooks = system_webhooks["error"] if "error" in system_webhooks else []
self.run_start_webhooks = system_webhooks["run_start"] if "run_start" in system_webhooks else []
@ -22,7 +29,36 @@ class Webhooks:
self.notifiarr = notifiarr
self.apprise = apprise
def request_and_check(self, webhook, json):
"""
Send a webhook request and check for errors.
Retry up to 6 times if the response is a 500+ error.
"""
retry_count = 0
retry_attempts = 6
request_delay = 2
for retry_count in range(retry_attempts):
if webhook == "notifiarr":
response = self.notifiarr.notification(json=json)
else:
webhook_post = webhook
if webhook == "apprise":
json["urls"] = self.apprise.notify_url
webhook_post = f"{self.apprise.api_url}/notify"
response = self.config.post(webhook_post, json=json)
if response.status_code < 500:
return response
logger.debug(f"({response.status_code} [{response.reason}]) Retrying in {request_delay} seconds.")
time.sleep(request_delay)
logger.debug(f"(Retry {retry_count + 1} of {retry_attempts}.")
retry_count += 1
logger.warning(f"({response.status_code} [{response.reason}]) after {retry_attempts} attempts.")
def _request(self, webhooks, json):
"""
Send a webhook request via request_and_check.
Check for errors and log them.
"""
logger.trace("")
logger.trace(f"JSON: {json}")
for webhook in list(set(webhooks)):
@ -30,26 +66,10 @@ class Webhooks:
logger.trace(f"Webhook: {webhook}")
if webhook is None:
break
elif webhook == "notifiarr":
if self.notifiarr is None:
break
else:
for x in range(6):
response = self.notifiarr.notification(json=json)
if response.status_code < 500:
break
elif webhook == "apprise":
if self.apprise is None:
logger.warning("Webhook attribute set to apprise but apprise attribute is not configured.")
break
else:
json["urls"] = self.apprise.notify_url
for x in range(6):
response = self.config.post(f"{self.apprise.api_url}/notify", json=json)
if response.status_code < 500:
break
else:
response = self.config.post(webhook, json=json)
elif (webhook == "notifiarr" and self.notifiarr is None) or (webhook == "apprise" and self.apprise is None):
logger.warning(f"Webhook attribute set to {webhook} but {webhook} attribute is not configured.")
break
response = self.request_and_check(webhook, json)
if response:
skip = False
try:
@ -62,7 +82,7 @@ class Webhooks:
and "response" in response_json["details"]
):
if "trigger is not enabled" in response_json["details"]["response"]:
logger.debug(f"Notifiarr Warning: {response_json['details']['response']}")
logger.info(f"Notifiarr Warning: {response_json['details']['response']}")
skip = True
else:
raise Failed(f"Notifiarr Error: {response_json['details']['response']}")
@ -70,11 +90,12 @@ class Webhooks:
response.status_code >= 400 or ("result" in response_json and response_json["result"] == "error")
) and skip is False:
raise Failed(f"({response.status_code} [{response.reason}]) {response_json}")
except JSONDecodeError:
except (JSONDecodeError, requestsJSONDecodeError) as exc:
if response.status_code >= 400:
raise Failed(f"({response.status_code} [{response.reason}])")
raise Failed(f"({response.status_code} [{response.reason}])") from exc
def start_time_hooks(self, start_time):
"""Send a webhook to notify that the run has started."""
if self.run_start_webhooks:
dry_run = self.config.commands["dry_run"]
if dry_run:
@ -93,6 +114,7 @@ class Webhooks:
)
def end_time_hooks(self, start_time, end_time, run_time, next_run, stats, body):
"""Send a webhook to notify that the run has ended."""
if self.run_end_webhooks:
self._request(
self.run_end_webhooks,
@ -123,19 +145,21 @@ class Webhooks:
)
def error_hooks(self, text, function_error=None, critical=True):
"""Send a webhook to notify that an error has occurred."""
if self.error_webhooks:
type = "failure" if critical is True else "warning"
err_type = "failure" if critical is True else "warning"
json = {
"function": "run_error",
"title": f"{function_error} Error",
"body": str(text),
"critical": critical,
"type": type,
"type": err_type,
}
if function_error:
json["function_error"] = function_error
self._request(self.error_webhooks, json)
def function_hooks(self, webhook, json):
"""Send a webhook to notify that a function has completed."""
if self.function_webhooks:
self._request(webhook, json)


@ -1,4 +1,5 @@
#!/usr/bin/python3
"""qBittorrent Manager."""
import argparse
import glob
import os
@ -14,11 +15,14 @@ except ModuleNotFoundError:
print("Requirements Error: Requirements are not installed")
sys.exit(0)
REQUIRED_VERSION = (3, 8, 1)
REQUIRED_VERSION_STR = ".".join(str(x) for x in REQUIRED_VERSION)
current_version = sys.version_info
if sys.version_info[0] != 3 or sys.version_info[1] < 6:
if current_version < (REQUIRED_VERSION):
print(
"Version Error: Version: %s.%s.%s incompatible please use Python 3.6+"
% (sys.version_info[0], sys.version_info[1], sys.version_info[2])
"Version Error: Version: %s.%s.%s incompatible with qbit_manage please use Python %s+"
% (current_version[0], current_version[1], current_version[2], REQUIRED_VERSION_STR)
)
sys.exit(0)
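The new gate works because `sys.version_info` compares element-wise against a plain tuple, so one comparison replaces the old separate major/minor checks:

```python
import sys

REQUIRED_VERSION = (3, 8, 1)
REQUIRED_VERSION_STR = ".".join(str(x) for x in REQUIRED_VERSION)

# Tuple comparison is lexicographic: (3, 8, 0) fails, (3, 10, 2) passes.
assert (3, 8, 0) < REQUIRED_VERSION
assert (3, 10, 2) > REQUIRED_VERSION
print(REQUIRED_VERSION_STR)  # 3.8.1
```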
@ -142,7 +146,7 @@ parser.add_argument(
dest="skip_cleanup",
action="store_true",
default=False,
help="Use this to skip cleaning up Reycle Bin/Orphaned directory.",
help="Use this to skip cleaning up Recycle Bin/Orphaned directory.",
)
parser.add_argument(
"-dr",
@ -163,6 +167,7 @@ args = parser.parse_args()
def get_arg(env_str, default, arg_bool=False, arg_int=False):
"""Get argument from environment variable or command line argument."""
env_vars = [env_str] if not isinstance(env_str, list) else env_str
final_value = None
for env_var in env_vars:
@ -284,11 +289,12 @@ from modules.util import GracefulKiller # noqa
from modules.util import Failed # noqa
def my_except_hook(exctype, value, tb):
def my_except_hook(exctype, value, tbi):
"""Handle uncaught exceptions"""
if issubclass(exctype, KeyboardInterrupt):
sys.__excepthook__(exctype, value, tb)
sys.__excepthook__(exctype, value, tbi)
else:
logger.critical("Uncaught Exception", exc_info=(exctype, value, tb))
logger.critical("Uncaught Exception", exc_info=(exctype, value, tbi))
sys.excepthook = my_except_hook
@ -303,6 +309,7 @@ with open(os.path.join(os.path.dirname(os.path.abspath(__file__)), "VERSION")) a
def start_loop():
"""Start the main loop"""
if len(config_files) == 1:
args["config_file"] = config_files[0]
start()
@@ -316,6 +323,7 @@ def start_loop():
def start():
"""Start the run"""
start_time = datetime.now()
args["time"] = start_time.strftime("%H:%M")
args["time_obj"] = start_time
@@ -345,13 +353,14 @@ def start():
"untagged_noHL": 0,
}
def FinishedRun():
def finished_run():
"""Handle the end of a run"""
nonlocal end_time, start_time, stats_summary, run_time, next_run, body
end_time = datetime.now()
run_time = str(end_time - start_time).split(".")[0]
_, nr = calc_next_run(sch, True)
next_run_str = nr["next_run_str"]
next_run = nr["next_run"]
run_time = str(end_time - start_time).split(".", maxsplit=1)[0]
_, nxt_run = calc_next_run(sch, True)
next_run_str = nxt_run["next_run_str"]
next_run = nxt_run["next_run"]
body = logger.separator(
f"Finished Run\n{os.linesep.join(stats_summary) if len(stats_summary)>0 else ''}"
f"\nRun Time: {run_time}\n{next_run_str if len(next_run_str)>0 else ''}".replace("\n\n", "\n").rstrip()
@@ -360,15 +369,15 @@ def start():
try:
cfg = Config(default_dir, args)
except Exception as e:
if "Qbittorrent Error" in e.args[0]:
logger.print_line(e, "CRITICAL")
except Exception as ex:
if "Qbittorrent Error" in ex.args[0]:
logger.print_line(ex, "CRITICAL")
logger.print_line("Exiting scheduled Run.", "CRITICAL")
FinishedRun()
finished_run()
return None
else:
logger.stacktrace()
logger.print_line(e, "CRITICAL")
logger.print_line(ex, "CRITICAL")
if cfg:
# Set Category
@@ -425,9 +434,9 @@ def start():
if stats["rem_unreg"] > 0:
stats_summary.append(f"Total Unregistered Torrents Removed: {stats['rem_unreg']}")
if stats["tagged_tracker_error"] > 0:
stats_summary.append(f"Total {cfg.settings['tracker_error_tag']} Torrents Tagged: {stats['tagged_tracker_error']}")
stats_summary.append(f"Total {cfg.tracker_error_tag} Torrents Tagged: {stats['tagged_tracker_error']}")
if stats["untagged_tracker_error"] > 0:
stats_summary.append(f"Total {cfg.settings['tracker_error_tag']} Torrents untagged: {stats['untagged_tracker_error']}")
stats_summary.append(f"Total {cfg.tracker_error_tag} Torrents untagged: {stats['untagged_tracker_error']}")
if stats["added"] > 0:
stats_summary.append(f"Total Torrents Added: {stats['added']}")
if stats["resumed"] > 0:
@@ -441,36 +450,38 @@ def start():
if stats["orphaned"] > 0:
stats_summary.append(f"Total Orphaned Files: {stats['orphaned']}")
if stats["tagged_noHL"] > 0:
stats_summary.append(f"Total noHL Torrents Tagged: {stats['tagged_noHL']}")
stats_summary.append(f"Total {cfg.nohardlinks_tag} Torrents Tagged: {stats['tagged_noHL']}")
if stats["untagged_noHL"] > 0:
stats_summary.append(f"Total noHL Torrents untagged: {stats['untagged_noHL']}")
stats_summary.append(f"Total {cfg.nohardlinks_tag} Torrents untagged: {stats['untagged_noHL']}")
if stats["recycle_emptied"] > 0:
stats_summary.append(f"Total Files Deleted from Recycle Bin: {stats['recycle_emptied']}")
if stats["orphaned_emptied"] > 0:
stats_summary.append(f"Total Files Deleted from Orphaned Data: {stats['orphaned_emptied']}")
FinishedRun()
finished_run()
if cfg:
try:
cfg.Webhooks.end_time_hooks(start_time, end_time, run_time, next_run, stats, body)
except Failed as e:
cfg.webhooks_factory.end_time_hooks(start_time, end_time, run_time, next_run, stats, body)
except Failed as err:
logger.stacktrace()
logger.error(f"Webhooks Error: {e}")
logger.error(f"Webhooks Error: {err}")
def end():
"""Ends the program"""
logger.info("Exiting Qbit_manage")
logger.remove_main_handler()
sys.exit(0)
def calc_next_run(sch, print=False):
def calc_next_run(schd, write_out=False):
"""Calculates the next run time based on the schedule"""
current = datetime.now().strftime("%H:%M")
seconds = sch * 60
time_to_run = datetime.now() + timedelta(minutes=sch)
seconds = schd * 60
time_to_run = datetime.now() + timedelta(minutes=schd)
time_to_run_str = time_to_run.strftime("%H:%M")
new_seconds = (datetime.strptime(time_to_run_str, "%H:%M") - datetime.strptime(current, "%H:%M")).total_seconds()
time_str = ""
time_until = ""
next_run = {}
if run is False:
next_run["next_run"] = time_to_run
@@ -481,14 +492,14 @@ def calc_next_run(sch, print=False):
if seconds is not None:
hours = int(seconds // 3600)
minutes = int((seconds % 3600) // 60)
time_str = f"{hours} Hour{'s' if hours > 1 else ''}{' and ' if minutes > 1 else ''}" if hours > 0 else ""
time_str += f"{minutes} Minute{'s' if minutes > 1 else ''}" if minutes > 0 else ""
if print:
next_run["next_run_str"] = f"Current Time: {current} | {time_str} until the next run at {time_to_run_str}"
time_until = f"{hours} Hour{'s' if hours > 1 else ''}{' and ' if minutes > 1 else ''}" if hours > 0 else ""
time_until += f"{minutes} Minute{'s' if minutes > 1 else ''}" if minutes > 0 else ""
if write_out:
next_run["next_run_str"] = f"Current Time: {current} | {time_until} until the next run at {time_to_run_str}"
else:
next_run["next_run"] = None
next_run["next_run_str"] = ""
return time_str, next_run
return time_until, next_run
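Renaming `sch`, `print`, and `time_str` to `schd`, `write_out`, and `time_until` removes the builtin/name shadowing flagged by lint; the arithmetic is unchanged. The hour/minute formatting can be extracted into a testable sketch (this helper is illustrative and keeps the commit's pluralization logic, including its quirk of omitting " and " when minutes equals 1):

```python
def format_time_until(seconds):
    """Render a duration in seconds as e.g. '1 Hour and 30 Minutes'."""
    hours = int(seconds // 3600)
    minutes = int((seconds % 3600) // 60)
    time_until = f"{hours} Hour{'s' if hours > 1 else ''}{' and ' if minutes > 1 else ''}" if hours > 0 else ""
    time_until += f"{minutes} Minute{'s' if minutes > 1 else ''}" if minutes > 0 else ""
    return time_until
```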
if __name__ == "__main__":

View file

@@ -1,6 +1,6 @@
flake8==6.0.0
pre-commit==3.1.1
qbittorrent-api==2023.2.43
pre-commit==3.2.2
qbittorrent-api==2023.3.44
requests==2.28.2
retrying==1.3.4
ruamel.yaml==0.17.21

View file

@@ -49,7 +49,7 @@ def setup_services(qbt=False):
)
try:
qbt_client.auth_log_in()
print("Succesfully connected to qBittorrent!")
print("Successfully connected to qBittorrent!")
except:
print("Error: Could not log into qBittorrent. Please verify login details are correct and Web Ui is available.")
quit_program(1)