Merge branch 'develop' into pre-commit-ci-update-config

bobokun 2023-06-12 21:20:31 -04:00 committed by GitHub
commit 9b81f61b31
21 changed files with 226 additions and 122 deletions


@ -45,7 +45,7 @@ repos:
hooks:
- id: black
language_version: python3
args: [--line-length, '130']
args: [--line-length, '130', --preview]
- repo: https://github.com/PyCQA/flake8
rev: 6.0.0
hooks:


@ -1,23 +1,26 @@
# Requirements Updated
- Updates ruamel.yaml to 0.17.31
- Updates qbittorrent-api to 2023.5.48
- Separates out dev requirements into requirements-dev.txt
# Breaking Changes
- `tag_nohardlinks` only updates/removes the `noHL` tag. **It no longer modifies or cleans up share_limits.**
- `tag_update` only adds tracker tags to torrents. **It no longer modifies or cleans up share_limits.**
- Please remove any references to share_limits from the tracker/nohardlinks sections of your configuration
- Migration guide can be followed here: [V4 Migration Guide](https://github.com/StuffAnThings/qbit_manage/wiki/v4-Migration-Guide)
- Webhook payloads changed (See [webhooks](https://github.com/StuffAnThings/qbit_manage/wiki/Config-Setup#webhooks) for updated payload)
- qbittorrent-api updated to 2023.6.49
# New Features
- Adds new command `share_limits`, `--share-limits`, `QBT_SHARE_LIMITS=True` to update share limits based on tags/categories specified per group (Closes #88, Closes #306, Closes #259, Closes #308, Closes #137)
- See [Config Setup - share_limits](https://github.com/StuffAnThings/qbit_manage/wiki/Config-Setup#share_limits) for more details; a minimal sketch of a parsed group follows this list
- Adds new command `skip_qb_version_check`, `--skip-qb-version-check`, `QBT_SKIP_QB_VERSION_CHECK` to bypass the qBittorrent compatibility check (unsupported - Thanks to @ftc2 #307)
- Webhook notifications are now grouped when a function updates more than 10 torrents.
- Adds new webhooks for `share_limits`
- Adds rate limit to webhook notifications (1 msg/sec)
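
To make the grouping concrete, here is a minimal, hedged sketch of what one parsed share_limits group could look like and how a torrent's tags could select the highest-priority matching group. The filter key name `include_any_tags` is an illustrative assumption (the real schema is in the Config Setup wiki); the limit keys (`priority`, `max_ratio`, `max_seeding_time`, `limit_upload_speed`, `add_group_to_tag`) mirror the `group_config` fields used in the share_limits module below.

```python
# Hedged sketch only -- not qbm's actual parser or schema.
share_limits = {
    "noHL": {"priority": 1, "include_any_tags": ["noHL"], "max_ratio": 5.0,
             "max_seeding_time": 10080, "limit_upload_speed": -1, "add_group_to_tag": True},
    "default": {"priority": 999, "include_any_tags": [], "max_ratio": -1,
                "max_seeding_time": -1, "limit_upload_speed": -1, "add_group_to_tag": True},
}

def pick_group(torrent_tags: set[str]) -> str:
    """Walk groups by ascending priority and return the first one whose tag filter matches."""
    for name, group in sorted(share_limits.items(), key=lambda kv: kv[1]["priority"]):
        wanted = set(group["include_any_tags"])
        if not wanted or wanted & torrent_tags:  # empty filter acts as a catch-all
            return name
    return "default"

print(pick_group({"noHL", "TorrentLeech"}))  # -> "noHL"
```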
# Bug Fixes
- Fixes #302
- Fixes #317
- cross-seed now moves a torrent to an error folder if it fails to inject the torrent
- Allows defining multiple announce URLs for one tracker (#328 Thanks to @buthed010203 for the PR)
**Full Changelog**: https://github.com/StuffAnThings/qbit_manage/compare/v3.6.4...v4.0.0
# Bug Fixes
- Fixes #329 (Updates missing share_limits tag even when share_limits are satisfied)
- Fixes #327 (Zero value share limits not being applied correctly)
# Enhancements
- The `share_limits_suffix_tag` is now applied as a prefix tag and includes the priority of the group. This makes it easier to see share limit groups together in qBittorrent, ordered by priority.
- The `share_limits_suffix_tag` key is now `share_limits_tag`
- No config changes are required; qbm will automatically convert the previous `share_limits_suffix_tag` key to `share_limits_tag`
- Changes the default value of `share_limits_tag` to `~share_limit`. The `~` sorts the `share_limits_tag` to the bottom of the qBittorrent WebUI for less clutter
Example based on config.sample:
| old tag (v4.0.0) | new tag (v4.0.1) |
| ----------- | ----------- |
| noHL.share_limit | ~share_limit_1.noHL |
| cross-seed.share_limit | ~share_limit_2.cross-seed |
| PTP.share_limit | ~share_limit_3.PTP |
| default.share_limit | ~share_limit_999.default |
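
For reference, a short sketch of how the new prefix tag is assembled, mirroring the f-string used in the share_limits module (`f"{share_limits_tag}_{priority}.{group_name}"`); the group names and priorities below are the config.sample values from the table above:

```python
share_limits_tag = "~share_limit"  # new default from this release
for group_name, priority in [("noHL", 1), ("cross-seed", 2), ("PTP", 3), ("default", 999)]:
    print(f"{share_limits_tag}_{priority}.{group_name}")
# ~share_limit_1.noHL, ~share_limit_2.cross-seed, ~share_limit_3.PTP, ~share_limit_999.default
```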
**Full Changelog**: https://github.com/StuffAnThings/qbit_manage/compare/v4.0.0...v4.0.1


@ -6,7 +6,11 @@ venv: requirements.txt setup.py tox.ini
.PHONY: test
test:
tox
tox -e tests
.PHONY: pre-commit
pre-commit:
tox -e pre-commit
.PHONY: clean
clean:
@ -14,3 +18,7 @@ clean:
find -name '__pycache__' -delete
rm -rf .tox
rm -rf venv
.PHONY: install-hooks
install-hooks:
tox -e install-hooks


@ -10,13 +10,14 @@
This is a program used to manage your qBittorrent instance such as:
* Tag torrents based on tracker URL and set seed goals/limit upload speed by tag (only tag torrents that have no tags)
* Tag torrents based on tracker URLs
* Update categories based on save directory
* Remove unregistered torrents (delete data & torrent if it is not being cross-seeded, otherwise it will just remove the torrent)
* Automatically add [cross-seed](https://github.com/mmgoodnow/cross-seed) torrents in paused state. **\*Note: cross-seed now allows for torrent injections directly to qBit, making this feature obsolete.\***
* Recheck paused torrents sorted by lowest size and resume if completed
* Remove orphaned files from your root directory that are not referenced by qBittorrent
* Tag any torrents that have no hard links outside the root folder and allows optional cleanup to delete these torrents and contents based on maximum ratio and/or time seeded
* Tag any torrents that have no hard links outside the root folder
* Apply share limits based on groups filtered by tags/categories and allows optional cleanup to delete these torrents and contents based on maximum ratio and/or time seeded
* RecycleBin function to move files into a RecycleBin folder instead of deleting the data directly when deleting a torrent
* Built-in scheduler to run the script every x minutes. (Can use `--run` command to run without the scheduler)
* Webhook notifications with [Notifiarr](https://notifiarr.com/) and [Apprise API](https://github.com/caronc/apprise-api) integration


@ -1 +1 @@
4.0.0
4.0.1-develop6


@ -28,7 +28,7 @@ settings:
force_auto_tmm: False # Will force qBittorrent to enable Automatic Torrent Management for each torrent.
tracker_error_tag: issue # Will set the tag of any torrents that do not have a working tracker.
nohardlinks_tag: noHL # Will set the tag of any torrents with no hardlinks.
share_limits_suffix_tag: share_limit # Will add this suffix to the grouping separated by '.' to the tag of any torrents with share limits.
share_limits_tag: ~share_limit # Will add this tag when applying share limits to provide an easy way to filter torrents by share limit group/priority for each torrent
ignoreTags_OnUpdate: # When running tag-update function, it will update torrent tags for a given torrent even if the torrent has at least one or more of the tags defined here. Otherwise torrents will not be tagged if tags exist.
- noHL
- issue
@ -67,7 +67,7 @@ cat_change:
tracker:
# Mandatory
# Tag Parameters
# <Tracker URL Keyword>: # <MANDATORY> This is the keyword in the tracker url
# <Tracker URL Keyword>: # <MANDATORY> This is the keyword in the tracker url. You can define multiple tracker urls by separating them with the `|` delimiter
# <MANDATORY> Set tag name. Can be a list of tags or a single tag
# tag: <Tag Name>
# <OPTIONAL> Set this to the notifiarr react name. This is used to add indexer reactions to the notifications sent by Notifiarr
@ -107,13 +107,10 @@ tracker:
privatehd:
tag: PrivateHD
notifiarr:
tleechreload:
tag: TorrentLeech
notifiarr: torrentleech
torrentdb:
tag: TorrentDB
notifiarr: torrentdb
torrentleech:
torrentleech|tleechreload:
tag: TorrentLeech
notifiarr: torrentleech
tv-vault:


@ -100,7 +100,12 @@ class Config:
if "cat_change" in self.data:
self.data["cat_change"] = self.data.pop("cat_change")
if "tracker" in self.data:
self.data["tracker"] = self.data.pop("tracker")
trackers = self.data.pop("tracker")
self.data["tracker"] = {}
# Splits tracker urls at pipes, useful for trackers with multiple announce urls
for tracker_urls, data in trackers.items():
for tracker_url in tracker_urls.split("|"):
self.data["tracker"][tracker_url.strip()] = data
else:
self.data["tracker"] = {}
if "nohardlinks" in self.data:
@ -146,6 +151,11 @@ class Config:
self.loglevel = "DRYRUN" if self.dry_run else "INFO"
self.session = requests.Session()
share_limits_tag = self.data["settings"].get("share_limits_suffix_tag", "~share_limit")
# Convert previous share_limits_suffix_tag to new default share_limits_tag
if share_limits_tag == "share_limit":
share_limits_tag = "~share_limit"
self.settings = {
"force_auto_tmm": self.util.check_for_attribute(
self.data, "force_auto_tmm", parent="settings", var_type="bool", default=False
@ -154,19 +164,22 @@ class Config:
self.data, "tracker_error_tag", parent="settings", default="issue"
),
"nohardlinks_tag": self.util.check_for_attribute(self.data, "nohardlinks_tag", parent="settings", default="noHL"),
"share_limits_suffix_tag": self.util.check_for_attribute(
self.data, "share_limits_suffix_tag", parent="settings", default="share_limit"
"share_limits_tag": self.util.check_for_attribute(
self.data, "share_limits_tag", parent="settings", default=share_limits_tag
),
}
self.tracker_error_tag = self.settings["tracker_error_tag"]
self.nohardlinks_tag = self.settings["nohardlinks_tag"]
self.share_limits_suffix_tag = "." + self.settings["share_limits_suffix_tag"]
self.share_limits_tag = self.settings["share_limits_tag"]
default_ignore_tags = [self.nohardlinks_tag, self.tracker_error_tag, "cross-seed"]
self.settings["ignoreTags_OnUpdate"] = self.util.check_for_attribute(
self.data, "ignoreTags_OnUpdate", parent="settings", default=default_ignore_tags, var_type="list"
)
"Migrate settings from v4.0.0 to v4.0.1 and beyond. Convert 'share_limits_suffix_tag' to 'share_limits_tag'"
if "share_limits_suffix_tag" in self.data["settings"]:
self.util.overwrite_attributes(self.settings, "settings")
default_function = {
"cross_seed": None,
@ -303,7 +316,7 @@ class Config:
else:
priority = max(priorities) + 1
logger.warning(
f"Priority not defined for the grouping '{key}' in share_limits. " f"Setting priority to {priority}"
f"Priority not defined for the grouping '{key}' in share_limits. Setting priority to {priority}"
)
value["priority"] = self.util.check_for_attribute(
self.data,
@ -538,14 +551,18 @@ class Config:
)
if self.commands["rem_orphaned"]:
exclude_orphaned = f"**{os.sep}{os.path.basename(self.orphaned_dir.rstrip(os.sep))}{os.sep}*"
self.orphaned["exclude_patterns"].append(exclude_orphaned) if exclude_orphaned not in self.orphaned[
"exclude_patterns"
] else self.orphaned["exclude_patterns"]
(
self.orphaned["exclude_patterns"].append(exclude_orphaned)
if exclude_orphaned not in self.orphaned["exclude_patterns"]
else self.orphaned["exclude_patterns"]
)
if self.recyclebin["enabled"]:
exclude_recycle = f"**{os.sep}{os.path.basename(self.recycle_dir.rstrip(os.sep))}{os.sep}*"
self.orphaned["exclude_patterns"].append(exclude_recycle) if exclude_recycle not in self.orphaned[
"exclude_patterns"
] else self.orphaned["exclude_patterns"]
(
self.orphaned["exclude_patterns"].append(exclude_recycle)
if exclude_recycle not in self.orphaned["exclude_patterns"]
else self.orphaned["exclude_patterns"]
)
# Connect to Qbittorrent
self.qbt = None
@ -635,8 +652,10 @@ class Config:
if empty_after_x_days <= days:
num_del += 1
body += logger.print_line(
f"{'Did not delete' if self.dry_run else 'Deleted'} "
f"{filename} from {folder} (Last modified {round(days)} days ago).",
(
f"{'Did not delete' if self.dry_run else 'Deleted'} "
f"{filename} from {folder} (Last modified {round(days)} days ago)."
),
self.loglevel,
)
files += [str(filename)]
@ -649,8 +668,10 @@ class Config:
for path in location_path_list:
util.remove_empty_directories(path, "**/*")
body += logger.print_line(
f"{'Did not delete' if self.dry_run else 'Deleted'} {num_del} files "
f"({util.human_readable_size(size_bytes)}) from the {location}.",
(
f"{'Did not delete' if self.dry_run else 'Deleted'} {num_del} files "
f"({util.human_readable_size(size_bytes)}) from the {location}."
),
self.loglevel,
)
attr = {


@ -31,18 +31,21 @@ class CrossSeed:
dir_cs = self.config.cross_seed_dir
dir_cs_out = os.path.join(dir_cs, "qbit_manage_added")
os.makedirs(dir_cs_out, exist_ok=True)
dir_cs_err = os.path.join(dir_cs, "qbit_manage_error")
os.makedirs(dir_cs_err, exist_ok=True)
for file in cs_files:
tr_name = file.split("]", 2)[2].split(".torrent")[0]
t_tracker = file.split("]", 2)[1][1:]
# Substring Key match in dictionary (used because t_name might not match exactly with self.qbt.torrentinfo key)
# Returns the dictionary of filtered items
torrentdict_file = dict(filter(lambda item: tr_name in item[0], self.qbt.torrentinfo.items()))
src = os.path.join(dir_cs, file)
dir_cs_out = os.path.join(dir_cs, "qbit_manage_added", file)
dir_cs_err = os.path.join(dir_cs, "qbit_manage_error", file)
if torrentdict_file:
# Get the exact torrent match name from self.qbt.torrentinfo
t_name = next(iter(torrentdict_file))
dest = os.path.join(self.qbt.torrentinfo[t_name]["save_path"], "")
src = os.path.join(dir_cs, file)
dir_cs_out = os.path.join(dir_cs, "qbit_manage_added", file)
category = self.qbt.torrentinfo[t_name].get("Category", self.qbt.get_category(dest))
# Only add cross-seed torrent if original torrent is complete
if self.qbt.torrentinfo[t_name]["is_complete"]:
@ -98,6 +101,7 @@ class CrossSeed:
logger.print_line(error, self.config.loglevel)
else:
logger.print_line(error, "WARNING")
util.move_files(src, dir_cs_err)
self.config.notify(error, "cross-seed", False)
self.config.webhooks_factory.notify(self.torrents_updated, self.notify_attr, group_by="category")


@ -64,8 +64,10 @@ class ReCheck:
)
logger.debug(
logger.insert_space(
f"-- Seeding Time vs Max Seed Time: {timedelta(seconds=torrent.seeding_time)} < "
f"{timedelta(minutes=torrent.max_seeding_time)}",
(
f"-- Seeding Time vs Max Seed Time: {timedelta(seconds=torrent.seeding_time)} < "
f"{timedelta(minutes=torrent.max_seeding_time)}"
),
4,
)
)
@ -85,7 +87,7 @@ class ReCheck:
):
self.stats_resumed += 1
body = logger.print_line(
f"{'Not Resuming' if self.config.dry_run else 'Resuming'} [{tracker['tag']}] - " f"{t_name}",
f"{'Not Resuming' if self.config.dry_run else 'Resuming'} [{tracker['tag']}] - {t_name}",
self.config.loglevel,
)
attr = {

View file

@ -67,8 +67,10 @@ class RemoveOrphaned:
logger.print_line(f"{num_orphaned} Orphaned files found", self.config.loglevel)
body += logger.print_line("\n".join(orphaned_files), self.config.loglevel)
body += logger.print_line(
f"{'Not moving' if self.config.dry_run else 'Moving'} {num_orphaned} Orphaned files "
f"to {self.orphaned_dir.replace(self.remote_dir,self.root_dir)}",
(
f"{'Not moving' if self.config.dry_run else 'Moving'} {num_orphaned} Orphaned files "
f"to {self.orphaned_dir.replace(self.remote_dir,self.root_dir)}"
),
self.config.loglevel,
)


@ -145,22 +145,28 @@ class RemoveUnregistered:
if self.stats_deleted >= 1 or self.stats_deleted_contents >= 1:
if self.stats_deleted >= 1:
logger.print_line(
f"{'Did not delete' if self.config.dry_run else 'Deleted'} {self.stats_deleted} "
f".torrent{'s' if self.stats_deleted > 1 else ''} but not content files.",
(
f"{'Did not delete' if self.config.dry_run else 'Deleted'} {self.stats_deleted} "
f".torrent{'s' if self.stats_deleted > 1 else ''} but not content files."
),
self.config.loglevel,
)
if self.stats_deleted_contents >= 1:
logger.print_line(
f"{'Did not delete' if self.config.dry_run else 'Deleted'} {self.stats_deleted_contents} "
f".torrent{'s' if self.stats_deleted_contents > 1 else ''} AND content files.",
(
f"{'Did not delete' if self.config.dry_run else 'Deleted'} {self.stats_deleted_contents} "
f".torrent{'s' if self.stats_deleted_contents > 1 else ''} AND content files."
),
self.config.loglevel,
)
else:
logger.print_line("No unregistered torrents found.", self.config.loglevel)
if self.stats_untagged >= 1:
logger.print_line(
f"{'Did not delete' if self.config.dry_run else 'Deleted'} {self.tag_error} tags for {self.stats_untagged} "
f".torrent{'s.' if self.stats_untagged > 1 else '.'}",
(
f"{'Did not delete' if self.config.dry_run else 'Deleted'} {self.tag_error} tags for {self.stats_untagged} "
f".torrent{'s.' if self.stats_untagged > 1 else '.'}"
),
self.config.loglevel,
)
if self.stats_tagged >= 1:


@ -24,10 +24,11 @@ class ShareLimits:
self.share_limits_config = qbit_manager.config.share_limits # configuration of share limits
self.torrents_updated = [] # list of torrents that have been updated
self.torrent_hash_checked = [] # list of torrent hashes that have been checked for share limits
self.share_limits_suffix_tag = qbit_manager.config.share_limits_suffix_tag # suffix tag for share limits
self.share_limits_tag = qbit_manager.config.share_limits_tag # tag for share limits
self.group_tag = None # tag for the share limit group
self.update_share_limits()
self.delete_share_limits_suffix_tag()
def update_share_limits(self):
"""Updates share limits for torrents based on grouping"""
@ -166,6 +167,9 @@ class ShareLimits:
for torrent in torrents:
t_name = torrent.name
t_hash = torrent.hash
self.group_tag = (
f"{self.share_limits_tag}_{group_config['priority']}.{group_name}" if group_config["add_group_to_tag"] else None
)
tracker = self.qbt.get_tags(torrent.trackers)
check_max_ratio = group_config["max_ratio"] != torrent.max_ratio
check_max_seeding_time = group_config["max_seeding_time"] != torrent.max_seeding_time
@ -175,6 +179,7 @@ class ShareLimits:
group_config["limit_upload_speed"] = -1
check_limit_upload_speed = group_config["limit_upload_speed"] != torrent_upload_limit
hash_not_prev_checked = t_hash not in self.torrent_hash_checked
share_limits_not_yet_tagged = True if self.group_tag and self.group_tag not in torrent.tags else False
logger.trace(f"Torrent: {t_name} [Hash: {t_hash}]")
logger.trace(f"Torrent Category: {torrent.category}")
logger.trace(f"Torrent Tags: {torrent.tags}")
@ -192,9 +197,11 @@ class ShareLimits:
)
logger.trace(f"check_limit_upload_speed: {check_limit_upload_speed}")
logger.trace(f"hash_not_prev_checked: {hash_not_prev_checked}")
if (check_max_ratio or check_max_seeding_time or check_limit_upload_speed) and hash_not_prev_checked:
logger.trace(f"share_limits_not_yet_tagged: {share_limits_not_yet_tagged}")
if (
check_max_ratio or check_max_seeding_time or check_limit_upload_speed or share_limits_not_yet_tagged
) and hash_not_prev_checked:
if "MinSeedTimeNotReached" not in torrent.tags:
self.group_tag = f"{group_name}{self.share_limits_suffix_tag}" if group_config["add_group_to_tag"] else None
logger.print_line(logger.insert_space(f"Torrent Name: {t_name}", 3), self.config.loglevel)
logger.print_line(logger.insert_space(f'Tracker: {tracker["url"]}', 8), self.config.loglevel)
if self.group_tag:
@ -224,10 +231,11 @@ class ShareLimits:
def tag_and_update_share_limits_for_torrent(self, torrent, group_config):
"""Removes previous share limits tag, updates tag and share limits for a torrent, and resumes the torrent"""
# Remove previous share_limits tag
tags = util.get_list(torrent.tags)
for tag in tags:
if self.share_limits_suffix_tag in tag:
torrent.remove_tags(tag)
if not self.config.dry_run:
tags = util.get_list(torrent.tags)
for tag in tags:
if self.share_limits_tag in tag:
torrent.remove_tags(tag)
# Will tag the torrent with the group name if add_group_to_tag is True and set the share limits
self.set_tags_and_limits(
@ -293,39 +301,37 @@ class ShareLimits:
return False
return True
def set_tags_and_limits(
self, torrent, max_ratio, max_seeding_time, limit_upload_speed=None, tags=None, restore=False, do_print=True
):
def set_tags_and_limits(self, torrent, max_ratio, max_seeding_time, limit_upload_speed=None, tags=None, do_print=True):
"""Set tags and limits for a torrent"""
body = []
if limit_upload_speed:
if limit_upload_speed is not None:
if limit_upload_speed != -1:
msg = logger.insert_space(f"Limit UL Speed: {limit_upload_speed} kB/s", 1)
if do_print:
body += logger.print_line(msg, self.config.loglevel)
else:
body.append(msg)
if max_ratio or max_seeding_time:
if (max_ratio == -2 and max_seeding_time == -2) and not restore:
if max_ratio is not None or max_seeding_time is not None:
if max_ratio == -2 and max_seeding_time == -2:
msg = logger.insert_space("Share Limit: Use Global Share Limit", 4)
if do_print:
body += logger.print_line(msg, self.config.loglevel)
else:
body.append(msg)
elif (max_ratio == -1 and max_seeding_time == -1) and not restore:
elif max_ratio == -1 and max_seeding_time == -1:
msg = logger.insert_space("Share Limit: Set No Share Limit", 4)
if do_print:
body += logger.print_line(msg, self.config.loglevel)
else:
body.append(msg)
else:
if max_ratio != torrent.max_ratio and (not max_seeding_time or max_seeding_time < 0):
if max_ratio != torrent.max_ratio and (max_seeding_time is None or max_seeding_time < 0):
msg = logger.insert_space(f"Share Limit: Max Ratio = {max_ratio}", 4)
if do_print:
body += logger.print_line(msg, self.config.loglevel)
else:
body.append(msg)
elif max_seeding_time != torrent.max_seeding_time and (not max_ratio or max_ratio < 0):
elif max_seeding_time != torrent.max_seeding_time and (max_ratio is None or max_ratio < 0):
msg = logger.insert_space(f"Share Limit: Max Seed Time = {max_seeding_time} min", 4)
if do_print:
body += logger.print_line(msg, self.config.loglevel)
@ -341,14 +347,14 @@ class ShareLimits:
if not self.config.dry_run:
if tags and tags not in torrent.tags:
torrent.add_tags(tags)
if limit_upload_speed:
if limit_upload_speed is not None:
if limit_upload_speed == -1:
torrent.set_upload_limit(-1)
else:
torrent.set_upload_limit(limit_upload_speed * 1024)
if not max_ratio:
if max_ratio is not None:
max_ratio = torrent.max_ratio
if not max_seeding_time:
if max_seeding_time is not None:
max_seeding_time = torrent.max_seeding_time
if "MinSeedTimeNotReached" in torrent.tags:
return []
@ -363,7 +369,8 @@ class ShareLimits:
print_log = []
if torrent.seeding_time >= min_seeding_time * 60:
if "MinSeedTimeNotReached" in torrent.tags:
torrent.remove_tags(tags="MinSeedTimeNotReached")
if not self.config.dry_run:
torrent.remove_tags(tags="MinSeedTimeNotReached")
return True
else:
if "MinSeedTimeNotReached" not in torrent.tags:
@ -371,8 +378,11 @@ class ShareLimits:
print_log += logger.print_line(logger.insert_space(f"Tracker: {tracker}", 8), self.config.loglevel)
print_log += logger.print_line(
logger.insert_space(
f"Min seed time not met: {timedelta(seconds=torrent.seeding_time)} <= "
f"{timedelta(minutes=min_seeding_time)}. Removing Share Limits so qBittorrent can continue seeding.",
(
f"Min seed time not met: {timedelta(seconds=torrent.seeding_time)} <="
f" {timedelta(minutes=min_seeding_time)}. Removing Share Limits so qBittorrent can continue"
" seeding."
),
8,
),
self.config.loglevel,
@ -390,7 +400,7 @@ class ShareLimits:
def _has_reached_seeding_time_limit():
nonlocal body
seeding_time_limit = None
if not max_seeding_time:
if max_seeding_time is None:
return False
if max_seeding_time >= 0:
seeding_time_limit = max_seeding_time
@ -401,14 +411,16 @@ class ShareLimits:
if seeding_time_limit:
if (torrent.seeding_time >= seeding_time_limit * 60) and _has_reached_min_seeding_time_limit():
body += logger.insert_space(
f"Seeding Time vs Max Seed Time: {timedelta(seconds=torrent.seeding_time)} >= "
f"{timedelta(minutes=seeding_time_limit)}",
(
f"Seeding Time vs Max Seed Time: {timedelta(seconds=torrent.seeding_time)} >= "
f"{timedelta(minutes=seeding_time_limit)}"
),
8,
)
return True
return False
if max_ratio:
if max_ratio is not None:
if max_ratio >= 0:
if torrent.ratio >= max_ratio and _has_reached_min_seeding_time_limit():
body += logger.insert_space(f"Ratio vs Max Ratio: {torrent.ratio:.2f} >= {max_ratio:.2f}", 8)
@ -422,3 +434,11 @@ class ShareLimits:
if _has_reached_seeding_time_limit():
return body
return False
def delete_share_limits_suffix_tag(self):
""" "Delete Share Limits Suffix Tag from version 4.0.0"""
tags = self.client.torrent_tags.tags
old_share_limits_tag = self.share_limits_tag[1:] if self.share_limits_tag.startswith("~") else self.share_limits_tag
for tag in tags:
if tag.endswith(f".{old_share_limits_tag}"):
self.client.torrent_tags.delete_tags(tag)
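
A hedged, standalone version of this v4.0.0 -> v4.0.1 tag cleanup, runnable outside qbm; the host and credentials are placeholders, and the qbittorrent-api calls (`torrent_tags.tags`, `torrent_tags.delete_tags`) are the same ones used by `delete_share_limits_suffix_tag()` above:

```python
import qbittorrentapi

# Placeholder connection details -- substitute your own.
client = qbittorrentapi.Client(host="localhost:8080", username="admin", password="adminadmin")
old_suffix = ".share_limit"  # v4.0.0 tags looked like "noHL.share_limit"
stale = [tag for tag in client.torrent_tags.tags if tag.endswith(old_suffix)]
if stale:
    client.torrent_tags.delete_tags(tags=stale)
```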


@ -55,14 +55,13 @@ class TagNoHardLinks:
def check_previous_nohardlinks_tagged_torrents(self, has_nohardlinks, torrent, tracker, category):
"""
Checks for any previous torrents that were tagged with the nohardlinks tag and have since had hardlinks added.
If any are found, the nohardlinks tag is removed from the torrent and the tracker or global share limits are restored.
If the torrent is complete and the option to resume after untagging is enabled, the torrent is resumed.
If any are found, the nohardlinks tag is removed.
"""
if not (has_nohardlinks) and (self.nohardlinks_tag in torrent.tags):
self.stats_untagged += 1
body = []
body += logger.print_line(
f"Previous Tagged {self.nohardlinks_tag} " f"Torrent Name: {torrent.name} has hardlinks found now.",
f"Previous Tagged {self.nohardlinks_tag} Torrent Name: {torrent.name} has hardlinks found now.",
self.config.loglevel,
)
body += logger.print_line(logger.insert_space(f"Removed Tag: {self.nohardlinks_tag}", 6), self.config.loglevel)
@ -121,16 +120,20 @@ class TagNoHardLinks:
self.check_previous_nohardlinks_tagged_torrents(has_nohardlinks, torrent, tracker, category)
if self.stats_tagged >= 1:
logger.print_line(
f"{'Did not Tag' if self.config.dry_run else 'Added Tag'} for {self.stats_tagged} "
f".torrent{'s.' if self.stats_tagged > 1 else '.'}",
(
f"{'Did not Tag' if self.config.dry_run else 'Added Tag'} for {self.stats_tagged} "
f".torrent{'s.' if self.stats_tagged > 1 else '.'}"
),
self.config.loglevel,
)
else:
logger.print_line("No torrents to tag with no hardlinks.", self.config.loglevel)
if self.stats_untagged >= 1:
logger.print_line(
f"{'Did not delete' if self.config.dry_run else 'Deleted'} "
f"{self.nohardlinks_tag} tags for {self.stats_untagged} "
f".torrent{'s.' if self.stats_untagged > 1 else '.'}",
(
f"{'Did not delete' if self.config.dry_run else 'Deleted'} "
f"{self.nohardlinks_tag} tags for {self.stats_untagged} "
f".torrent{'s.' if self.stats_untagged > 1 else '.'}"
),
self.config.loglevel,
)


@ -9,7 +9,7 @@ class Tags:
self.config = qbit_manager.config
self.client = qbit_manager.client
self.stats = 0
self.share_limits_suffix_tag = qbit_manager.config.share_limits_suffix_tag # suffix tag for share limits
self.share_limits_tag = qbit_manager.config.share_limits_tag # tag for share limits
self.torrents_updated = [] # List of torrents updated
self.notify_attr = [] # List of single torrent attributes to send to notifiarr
@ -21,7 +21,7 @@ class Tags:
ignore_tags = self.config.settings["ignoreTags_OnUpdate"]
logger.separator("Updating Tags", space=False, border=False)
for torrent in self.qbt.torrent_list:
check_tags = [tag for tag in util.get_list(torrent.tags) if self.share_limits_suffix_tag not in tag]
check_tags = [tag for tag in util.get_list(torrent.tags) if self.share_limits_tag not in tag]
if torrent.tags == "" or (len([trk for trk in check_tags if trk not in ignore_tags]) == 0):
tracker = self.qbt.get_tags(torrent.trackers)


@ -136,6 +136,13 @@ class check:
def __init__(self, config):
self.config = config
def overwrite_attributes(self, data, attribute):
"""Overwrite attributes in config."""
yaml = YAML(self.config.config_path)
if data is not None and attribute in yaml.data:
yaml.data[attribute] = data
yaml.save()
def check_for_attribute(
self,
data,


@ -171,8 +171,10 @@ class Webhooks:
def notify(self, torrents_updated=[], payload={}, group_by=""):
if len(torrents_updated) > GROUP_NOTIFICATION_LIMIT:
logger.trace(
f"Number of torrents updated > {GROUP_NOTIFICATION_LIMIT}, grouping notifications"
f"{f' by {group_by}' if group_by else ''}",
(
f"Number of torrents updated > {GROUP_NOTIFICATION_LIMIT}, grouping notifications"
f"{f' by {group_by}' if group_by else ''}"
),
)
if group_by == "category":
group_attr = group_notifications_by_key(payload, "torrent_category")
@ -189,10 +191,14 @@ class Webhooks:
attr = {
"function": group_attr[group]["function"],
"title": f"{group_attr[group]['title']} for {group}",
"body": group_attr[group]["body"]
if only_one_torrent_updated
else f"Updated {num_torrents_updated} "
f"{'torrent' if only_one_torrent_updated else 'torrents'} with {group_by} '{group}'",
"body": (
group_attr[group]["body"]
if only_one_torrent_updated
else (
f"Updated {num_torrents_updated} "
f"{'torrent' if only_one_torrent_updated else 'torrents'} with {group_by} '{group}'"
)
),
"torrents": group_attr[group]["torrents"],
}
if group_by == "category":


@ -61,8 +61,10 @@ parser.add_argument(
action="store",
default="config.yml",
type=str,
help="This is used if you want to use a different name for your config.yml or if you want to load multiple"
"config files using *. Example: tv.yml or config*.yml",
help=(
"This is used if you want to use a different name for your config.yml or if you want to load multiple"
"config files using *. Example: tv.yml or config*.yml"
),
)
parser.add_argument(
"-lf",
@ -103,8 +105,10 @@ parser.add_argument(
dest="tag_update",
action="store_true",
default=False,
help="Use this if you would like to update your tags and/or set seed goals/limit upload speed by tag."
" (Only adds tags to untagged torrents)",
help=(
"Use this if you would like to update your tags and/or set seed goals/limit upload speed by tag."
" (Only adds tags to untagged torrents)"
),
)
parser.add_argument(
"-ru",
@ -136,10 +140,12 @@ parser.add_argument(
dest="tag_nohardlinks",
action="store_true",
default=False,
help="Use this to tag any torrents that do not have any hard links associated with any of the files. "
"This is useful for those that use Sonarr/Radarr which hard link your media files with the torrents for seeding. "
"When files get upgraded they no longer become linked with your media therefore will be tagged with a new tag noHL. "
"You can then safely delete/remove these torrents to free up any extra space that is not being used by your media folder.",
help=(
"Use this to tag any torrents that do not have any hard links associated with any of the files. "
"This is useful for those that use Sonarr/Radarr which hard link your media files with the torrents for seeding. "
"When files get upgraded they no longer become linked with your media therefore will be tagged with a new tag noHL. "
"You can then safely delete/remove these torrents to free up any extra space that is not being used by your media folder."
),
)
parser.add_argument(
"-sl",
@ -147,9 +153,11 @@ parser.add_argument(
dest="share_limits",
action="store_true",
default=False,
help="Use this to help apply and manage your torrent share limits based on your tags/categories."
"This can apply a max ratio, seed time limits to your torrents or limit your torrent upload speed as well."
"Share limits are applied in the order of priority specified.",
help=(
"Use this to help apply and manage your torrent share limits based on your tags/categories."
"This can apply a max ratio, seed time limits to your torrents or limit your torrent upload speed as well."
"Share limits are applied in the order of priority specified."
),
)
parser.add_argument(
"-sc",
@ -422,7 +430,9 @@ def start():
next_run = nxt_run["next_run"]
body = logger.separator(
f"Finished Run\n{os.linesep.join(stats_summary) if len(stats_summary)>0 else ''}"
f"\nRun Time: {run_time}\n{next_run_str if len(next_run_str)>0 else ''}".replace("\n\n", "\n").rstrip()
f"\nRun Time: {run_time}\n{next_run_str if len(next_run_str)>0 else ''}".replace(
"\n\n", "\n"
).rstrip()
)[0]
return next_run, body


@ -1,6 +1,6 @@
bencodepy==0.9.5
GitPython==3.1.31
qbittorrent-api==2023.5.48
qbittorrent-api==2023.6.49
requests==2.31.0
retrying==1.3.4
ruamel.yaml==0.17.31


@ -184,8 +184,8 @@ def main():
print(
f"--- Torrent ages are below threshold of '{MIN_TORRENT_AGE} days'\n"
f"--- Torrent seed ratios are below threshold of '{MIN_TORRENT_SHARE_RATIO}'\n"
f"--- Torrents have multiple hard links\n"
f"--- No torrents exists!"
"--- Torrents have multiple hard links\n"
"--- No torrents exists!"
)
quit_program(0)


@ -30,8 +30,10 @@ setup(
# repository. For example: MIT
license="MIT",
# Short description of your library
description="This tool will help manage tedious tasks in qBittorrent and automate them. "
"Tag, categorize, remove Orphaned data, remove unregistered torrents and much much more.",
description=(
"This tool will help manage tedious tasks in qBittorrent and automate them. "
"Tag, categorize, remove Orphaned data, remove unregistered torrents and much much more."
),
# Long description of your library
long_description=long_description,
long_description_content_type="text/markdown",

tox.ini

@ -5,16 +5,28 @@ tox_pip_extensions_ext_pip_custom_platform = true
tox_pip_extensions_ext_venv_update = true
[testenv]
deps = -r{toxinidir}/requirements.txt
deps =
-r{toxinidir}/requirements.txt
-r{toxinidir}/requirements-dev.txt
passenv = HOME SSH_AUTH_SOCK USER
commands =
pre-commit install -f --install-hooks
pre-commit run --all-files --show-diff-on-failure
[testenv:venv]
envdir = venv
commands =
[testenv:install-hooks]
deps = pre-commit
commands = pre-commit install -f --install-hooks
[testenv:pre-commit]
deps = pre-commit
commands = pre-commit run --all-files
[testenv:tests]
commands =
pre-commit install -f --install-hooks
pre-commit run --all-files
[flake8]
max-line-length = 130