# Requirements Updated
- qbittorrent-api==2025.7.0
- fastapi==0.116.1


# New Features
- **Uncategorized Category**: Allow multiple paths for Uncategorized
category and add error handling (Thanks to @cat-of-wisdom #849)
- **Config Auto Backup and Cleanup**: implement automatic backup
rotation (30 most recent backups per config) and cleanup
- **Web UI**: add base URL support for reverse proxy deployments (Fixes
#871)
- **Share Limits**: add option to preserve upload speed limits when
minimums unmet (New config option
`reset_upload_speed_on_unmet_minimums`) (Fixes #835, #791)
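The config auto backup rotation (keep the 30 most recent backups, delete the rest) can be illustrated with a minimal sketch; `prune_backups` and the `.bak` suffix are illustrative names, not qbit_manage's actual implementation:

```python
from pathlib import Path

def prune_backups(backup_dir, keep=30):
    """Keep only the `keep` most recently modified backup files; delete the rest."""
    # Sort newest-first by modification time, then drop everything past `keep`.
    backups = sorted(Path(backup_dir).glob("*.bak"),
                     key=lambda p: p.stat().st_mtime, reverse=True)
    for old in backups[keep:]:
        old.unlink()
    return len(backups[:keep])  # number of backups retained
```

Running this after each config save keeps the backup directory bounded per config.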

# Improvements
- Optimize webUI form rendering
- Better centralized error handling for qBittorrent API operations
- **Web UI**: add editable group names to share limit modal

# Bug Fixes
- Fix bug in remove orphaned so that it notifies when there are 0 orphaned files
- Fixes [Bug]: Cannot run on Python 3.9.18 #864
- fix(qbit): add error handling for qBittorrent API operations

**Full Changelog**:
https://github.com/StuffAnThings/qbit_manage/compare/v4.5.0...v4.5.1

---------

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
Co-authored-by: github-actions[bot] <41898282+github-actions[bot]@users.noreply.github.com>
Co-authored-by: cat-of-wisdom <217637421+cat-of-wisdom@users.noreply.github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
bobokun 2025-07-19 08:59:41 -04:00 committed by GitHub
parent 3fa5fcee3b
commit ca4819bc0b
GPG key ID: B5690EEEBB952194
51 changed files with 1602 additions and 501 deletions


@@ -9,6 +9,8 @@ updates:
directory: "/"
schedule:
interval: "daily"
time: "07:00"
timezone: "America/Toronto"
target-branch: "develop"
assignees:
- "bobokun"
@@ -21,7 +23,9 @@ updates:
- package-ecosystem: github-actions
directory: '/'
schedule:
interval: daily
interval: "daily"
time: "07:00"
timezone: "America/Toronto"
assignees:
- "bobokun"
target-branch: "develop"


@@ -25,7 +25,7 @@ repos:
exclude: ^.github/
- repo: https://github.com/astral-sh/ruff-pre-commit
# Ruff version.
rev: v0.11.13
rev: v0.12.3
hooks:
# Run the linter.
- id: ruff


@@ -1,17 +1,22 @@
# Requirements Updated
- fastapi==0.116.0
- retrying==1.4.0
- uvicorn==0.35.0
- qbittorrent-api==2025.7.0
- fastapi==0.116.1
# New Features
- **Web UI**: Introduced a new Web UI for configuring and managing qBit Manage.
- Visual Configuration Editor for YAML files.
- Command Execution directly from the UI.
- Undo/Redo History for changes.
- Theme Support (light/dark mode).
- Responsive Design for desktop and mobile.
- Real-time YAML Preview.
- Pass skip qBittorrent check as optional parameter to the API (Adds #860)
- **Uncategorized Category**: Allow multiple paths for Uncategorized category and add error handling (Thanks to @cat-of-wisdom #849)
- **Config Auto Backup and Cleanup**: implement automatic backup rotation (30 most recent backups per config) and cleanup
- **Web UI**: add base URL support for reverse proxy deployments (Fixes #871)
- **Share Limits**: add option to preserve upload speed limits when minimums unmet (New config option `reset_upload_speed_on_unmet_minimums`) (Fixes #835, #791)
# Improvements
- Optimize webUI form rendering
- Better centralized error handling for qBittorrent API operations
- **Web UI**: add editable group names to share limit modal
**Full Changelog**: https://github.com/StuffAnThings/qbit_manage/compare/v4.4.0...v4.5.0
# Bug Fixes
- Fix bug in remove orphaned to notify when there are 0 orphaned files
- Fixes [Bug]: Cannot run on Python 3.9.18 #864
- fix(qbit): add error handling for qBittorrent API operations
**Full Changelog**: https://github.com/StuffAnThings/qbit_manage/compare/v4.5.0...v4.5.1


@@ -45,5 +45,8 @@ COPY . /app
WORKDIR /app
VOLUME /config
# Expose port 8080
EXPOSE 8080
ENTRYPOINT ["/sbin/tini", "-s", "--"]
CMD ["python3", "qbit_manage.py"]


@@ -4,7 +4,7 @@
"qbitapi": "2025.5.0"
},
"develop": {
"qbit": "v5.1.0",
"qbitapi": "2025.5.0"
"qbit": "v5.1.2",
"qbitapi": "2025.7.0"
}
}


@@ -1 +1 @@
4.5.0
4.5.1


@@ -221,6 +221,8 @@ share_limits:
limit_upload_speed: 0
# <OPTIONAL> Enable Group Upload Speed <bool>: Upload speed limits are applied at the group level. This will take limit_upload_speed defined and divide it equally among the number of torrents in the group.
enable_group_upload_speed: false
# <OPTIONAL> reset_upload_speed_on_unmet_minimums <bool>: If true (default), upload speed limits will be reset to unlimited when minimum conditions are not met. Set to false to preserve upload speed limits.
reset_upload_speed_on_unmet_minimums: true
# <OPTIONAL> cleanup <bool>: WARNING!! Setting this as true Will remove and delete contents of any torrents that satisfies the share limits (max time OR max ratio)
cleanup: false
# <OPTIONAL> resume_torrent_after_change <bool>: This variable will resume your torrent after changing share limits. Default is true


@@ -2,6 +2,9 @@
| **Shell Command** | **Docker Environment Variable** | **Config Command** | **Description** | **Default Value** |
|:---------------------------------------:|:-------------------------------:|:---------------------:|------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:-----------------:|
| `-ws` or `--web-server` | QBT_WEB_SERVER | N/A | Start the webUI server to handle command requests via HTTP API. | False |
| `-p` or `--port` | QBT_PORT | N/A | Port number for the web server (default: 8080). | 8080 |
| `-b` or `--base-url` | QBT_BASE_URL | N/A | Base URL path for the web UI (e.g., '/qbit-manage'). Default is empty (root). | "" |
| `-r` or`--run` | QBT_RUN | N/A | Run without the scheduler. Script will exit after completion. | False |
| `-sch` or `--schedule` | QBT_SCHEDULE | N/A | Schedule to run every x minutes or choose customize schedule via [cron](https://crontab.guru/examples.html). (Default set to 1440 (1 day)) | 1440 |
| `-sd` or `--startup-delay` | QBT_STARTUP_DELAY | N/A | Set delay in seconds on the first run of a schedule (Default set to 0) | 0 |


@@ -177,6 +177,7 @@ Control how torrent share limits are set depending on the priority of your group
| `min_last_active` | Will prevent torrent deletion by cleanup variable if torrent has been active within the last x minutes. If the torrent has been active within the last x minutes, it will change the share limits back to no limits and resume the torrent to continue seeding. See Some examples of [valid time expressions](https://github.com/onegreyonewhite/pytimeparse2?tab=readme-ov-file#pytimeparse2-time-expression-parser) 32m, 2h32m, 3d2h32m, 1w3d2h32m | 0 | str | <center></center> |
| `limit_upload_speed` | Will limit the upload speed KiB/s (KiloBytes/second) (`-1` : No Limit) | -1 | int | <center></center> |
| `enable_group_upload_speed` | Upload speed limits are applied at the group level. This will take `limit_upload_speed` defined and divide it equally among the number of torrents in the group. | False | bool | <center></center> |
| `reset_upload_speed_on_unmet_minimums` | Controls whether upload speed limits are reset when minimum conditions are not met. When `true` (default), upload speed limits will be reset to unlimited if minimum seeding time, number of seeds, or last active time conditions are not satisfied. When `false`, existing upload speed limits will be preserved for bandwidth management purposes. | True | bool | <center></center> |
| `resume_torrent_after_change` | Will resume your torrent after changing share limits. | True | bool | <center></center> |
| `add_group_to_tag` | This adds your grouping as a tag with a prefix defined in settings (share_limits_tag). Example: A grouping named noHL with a priority of 1 will have a tag set to `~share_limit_1.noHL` (if using the default prefix). | True | bool | <center></center> |
| `min_num_seeds` | Will prevent torrent deletion by cleanup variable if the number of seeds is less than the value set here (depending on the tracker, you may or may not be included). If the torrent has less number of seeds than the min_num_seeds, the share limits will be changed back to no limits and resume the torrent to continue seeding. | 0 | int | <center></center> |
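For example, a group that caps upload speed but keeps that cap even while minimum seeding conditions are unmet might look like this (the group name and values are illustrative):

```yaml
share_limits:
  noHL:
    priority: 1
    min_seeding_time: 2h32m
    limit_upload_speed: 1024                      # KiB/s
    reset_upload_speed_on_unmet_minimums: false   # keep the 1024 KiB/s cap while minimums are unmet
```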


@@ -57,9 +57,15 @@ services:
- QBT_SHARE_LIMITS=false
- QBT_SKIP_CLEANUP=false
- QBT_DRY_RUN=false
- QBT_STARTUP_DELAY=0
- QBT_SKIP_QB_VERSION_CHECK=false
- QBT_DEBUG=false
- QBT_TRACE=false
# Logging Configuration
- QBT_LOG_LEVEL=INFO
- QBT_LOG_SIZE=10
- QBT_LOG_COUNT=5
- QBT_DIVIDER==
- QBT_WIDTH=100
restart: on-failure:2
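To expose the web UI behind a reverse proxy, the same compose file could add the documented web-server variables (the `/qbit-manage` path is just the example from the docs):

```yaml
    environment:
      - QBT_WEB_SERVER=true
      - QBT_PORT=8080
      - QBT_BASE_URL=/qbit-manage
```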


@@ -1,4 +1,4 @@
# qBit Manage Web UI (Develop)
# qBit Manage Web UI
## Overview
The qBit Manage Web UI provides a modern interface for configuring and managing qBit Manage. It offers real-time editing of YAML configuration files through an intuitive visual interface, eliminating the need for manual file editing.


@@ -139,6 +139,9 @@ class Config:
logger.debug(f" --width (QBT_WIDTH): {self.args['screen_width']}")
logger.debug(f" --debug (QBT_DEBUG): {self.args['debug']}")
logger.debug(f" --trace (QBT_TRACE): {self.args['trace']}")
logger.debug(f" --web-server (QBT_WEB_SERVER): {self.args['web_server']}")
logger.debug(f" --port (QBT_PORT): {self.args['port']}")
logger.debug(f" --base-url (QBT_BASE_URL): {self.args['base_url']}")
# Log run commands (which may come from config or env)
logger.separator(command_source, space=False, border=False, loglevel="DEBUG")
@@ -745,6 +748,16 @@ class Config:
)
self.notify(err, "Config")
raise Failed(err)
self.share_limits[group]["reset_upload_speed_on_unmet_minimums"] = self.util.check_for_attribute(
self.data,
"reset_upload_speed_on_unmet_minimums",
parent="share_limits",
subparent=group,
var_type="bool",
default=True,
do_print=False,
save=False,
)
self.share_limits[group]["torrents"] = []
if (
self.share_limits[group]["min_seeding_time"] > 0


@@ -1,6 +1,7 @@
from qbittorrentapi import Conflict409Error
import time
from modules import util
from modules.qbit_error_handler import handle_qbit_api_errors
logger = util.logger
@@ -22,6 +23,7 @@ class Category:
def category(self):
"""Update category for torrents that don't have any category defined and returns total number categories updated"""
start_time = time.time()
logger.separator("Updating Categories", space=False, border=False)
torrent_list_filter = {"status_filter": self.status_filter}
if self.hashes:
@@ -57,6 +59,10 @@ class Category:
else:
logger.print_line("No new torrents to categorize.", self.config.loglevel)
end_time = time.time()
duration = end_time - start_time
logger.debug(f"Category command completed in {duration:.2f} seconds")
def get_tracker_cat(self, torrent):
tracker = self.qbt.get_tags(self.qbt.get_tracker_urls(torrent.trackers))
return [tracker["cat"]] if tracker["cat"] else None
@@ -67,18 +73,28 @@ class Category:
t_name = torrent.name
old_cat = torrent.category
if not self.config.dry_run:
try:
torrent.set_category(category=new_cat)
if torrent.auto_tmm is False and self.config.settings["force_auto_tmm"]:
torrent.set_auto_management(True)
except Conflict409Error:
ex = logger.print_line(
f'Existing category "{new_cat}" not found for save path {torrent.save_path}, category will be created.',
self.config.loglevel,
)
self.config.notify(ex, "Update Category", False)
self.client.torrent_categories.create_category(name=new_cat, save_path=torrent.save_path)
torrent.set_category(category=new_cat)
@handle_qbit_api_errors(context="set_category", retry_attempts=2)
def set_category_with_creation():
try:
torrent.set_category(category=new_cat)
if torrent.auto_tmm is False and self.config.settings["force_auto_tmm"]:
torrent.set_auto_management(True)
except Exception as e:
# Check if it's a category creation issue
if "not found" in str(e).lower() or "409" in str(e):
ex = logger.print_line(
f'Existing category "{new_cat}" not found for save path '
f"{torrent.save_path}, category will be created.",
self.config.loglevel,
)
self.config.notify(ex, "Update Category", False)
self.client.torrent_categories.create_category(name=new_cat, save_path=torrent.save_path)
torrent.set_category(category=new_cat)
else:
raise
set_category_with_creation()
body = []
body += logger.print_line(logger.insert_space(f"Torrent Name: {t_name}", 3), self.config.loglevel)
if cat_change:
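The `@handle_qbit_api_errors(...)` usage above boils down to a retry-then-reraise wrapper. A standalone sketch of that shape (names like `retryable` are illustrative, not the project's actual API):

```python
import functools
import time

def retryable(context="", attempts=2, delay=0.0):
    """Retry the wrapped call on exception, re-raising after the last attempt."""
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            last_error = None
            for attempt in range(attempts + 1):
                try:
                    return func(*args, **kwargs)
                except Exception as e:  # a fuller version would match specific API errors
                    last_error = e
                    if attempt < attempts:
                        time.sleep(delay)  # back off before retrying
            raise last_error  # context would feed log messages in a fuller version
        return wrapper
    return decorator

calls = []

@retryable(context="set_category", attempts=2)
def flaky():
    calls.append(1)
    if len(calls) < 2:
        raise ConnectionError("transient")
    return "ok"

print(flaky())  # succeeds on the second attempt
```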


@@ -1,3 +1,4 @@
import time
from datetime import timedelta
from modules import util
@@ -26,6 +27,7 @@ class ReCheck:
def recheck(self):
"""Function used to recheck paused torrents sorted by size and resume torrents that are completed"""
start_time = time.time()
if self.config.commands["recheck"]:
logger.separator("Rechecking Paused Torrents", space=False, border=False)
# sort by size and paused
@@ -136,3 +138,7 @@ class ReCheck:
self.notify_attr_recheck.append(attr)
if not self.config.dry_run:
torrent.recheck()
end_time = time.time()
duration = end_time - start_time
logger.debug(f"Recheck command completed in {duration:.2f} seconds")


@@ -1,4 +1,5 @@
import os
import time
from concurrent.futures import ThreadPoolExecutor
from fnmatch import fnmatch
@@ -18,37 +19,39 @@ class RemoveOrphaned:
self.root_dir = qbit_manager.config.root_dir
self.orphaned_dir = qbit_manager.config.orphaned_dir
max_workers = max(os.cpu_count() - 1, 1)
self.executor = ThreadPoolExecutor(max_workers=max_workers)
self.rem_orphaned()
self.executor.shutdown()
max_workers = max(os.cpu_count() * 2, 4) # Increased workers for I/O bound operations
with ThreadPoolExecutor(max_workers=max_workers) as executor:
self.executor = executor
self.rem_orphaned()
def rem_orphaned(self):
"""Remove orphaned files from remote directory"""
start_time = time.time()
self.stats = 0
logger.separator("Checking for Orphaned Files", space=False, border=False)
torrent_files = []
orphaned_files = []
excluded_orphan_files = set()
exclude_patterns = []
root_files = self.executor.submit(util.get_root_files, self.root_dir, self.remote_dir, self.orphaned_dir)
# Get an updated list of torrents
# Get torrents and files in parallel
logger.print_line("Locating orphan files", self.config.loglevel)
torrent_list = self.qbt.get_torrents({"sort": "added_on"})
torrent_files.extend(
[
fullpath
for fullpathlist in self.executor.map(self.get_full_path_of_torrent_files, torrent_list)
for fullpath in fullpathlist
]
)
root_files_set = set(root_files.result())
torrent_files_set = set(torrent_files)
orphaned_files = root_files_set - torrent_files_set
# Parallel fetch torrents and root files
torrent_list_future = self.executor.submit(self.qbt.get_torrents, {"sort": "added_on"})
root_files_future = self.executor.submit(util.get_root_files, self.root_dir, self.remote_dir, self.orphaned_dir)
# Process torrent files in parallel
torrent_list = torrent_list_future.result()
# Use generator expression to reduce memory usage
torrent_files = set()
for fullpath_list in self.executor.map(self.get_full_path_of_torrent_files, torrent_list):
torrent_files.update(fullpath_list)
# Get root files
root_files = set(root_files_future.result())
# Find orphaned files efficiently
orphaned_files = root_files - torrent_files
# Process exclude patterns efficiently
if self.config.orphaned["exclude_patterns"]:
logger.print_line("Processing orphan exclude patterns")
exclude_patterns = [
@@ -56,16 +59,18 @@ class RemoveOrphaned:
for exclude_pattern in self.config.orphaned["exclude_patterns"]
]
for file in orphaned_files:
for exclude_pattern in exclude_patterns:
if fnmatch(file, exclude_pattern):
excluded_orphan_files.add(file)
# Use set comprehension for efficient filtering
excluded_files = {file for file in orphaned_files if any(fnmatch(file, pattern) for pattern in exclude_patterns)}
orphaned_files -= excluded_files
orphaned_files = orphaned_files - excluded_orphan_files
# Early return if no orphaned files
if not orphaned_files:
logger.print_line("No Orphaned Files found.", self.config.loglevel)
return
# Check the threshold before deleting orphaned files
# Check threshold
max_orphaned_files_to_delete = self.config.orphaned.get("max_orphaned_files_to_delete")
if len(orphaned_files) and len(orphaned_files) > max_orphaned_files_to_delete and max_orphaned_files_to_delete != -1:
if len(orphaned_files) > max_orphaned_files_to_delete and max_orphaned_files_to_delete != -1:
e = (
f"Too many orphaned files detected ({len(orphaned_files)}). "
f"Max Threshold for deletion is set to {max_orphaned_files_to_delete}. "
@@ -75,72 +80,101 @@ class RemoveOrphaned:
logger.info(f"Orphaned files detected: {orphaned_files}")
logger.warning(e)
return
elif orphaned_files:
orphaned_files = sorted(orphaned_files)
os.makedirs(self.orphaned_dir, exist_ok=True)
body = []
num_orphaned = len(orphaned_files)
logger.print_line(f"{num_orphaned} Orphaned files found", self.config.loglevel)
body += logger.print_line("\n".join(orphaned_files), self.config.loglevel)
if self.config.orphaned["empty_after_x_days"] == 0:
body += logger.print_line(
f"{'Not Deleting' if self.config.dry_run else 'Deleting'} {num_orphaned} Orphaned files",
self.config.loglevel,
)
else:
body += logger.print_line(
f"{'Not moving' if self.config.dry_run else 'Moving'} {num_orphaned} Orphaned files "
f"to {self.orphaned_dir.replace(self.remote_dir, self.root_dir)}",
self.config.loglevel,
)
attr = {
"function": "rem_orphaned",
"title": f"Removing {num_orphaned} Orphaned Files",
"body": "\n".join(body),
"orphaned_files": list(orphaned_files),
"orphaned_directory": self.orphaned_dir.replace(self.remote_dir, self.root_dir),
"total_orphaned_files": num_orphaned,
}
self.config.send_notifications(attr)
# Delete empty directories after moving orphan files
if not self.config.dry_run:
orphaned_parent_path = set(self.executor.map(self.handle_orphaned_files, orphaned_files))
# Process orphaned files
orphaned_files = sorted(orphaned_files)
os.makedirs(self.orphaned_dir, exist_ok=True)
body = []
num_orphaned = len(orphaned_files)
logger.print_line(f"{num_orphaned} Orphaned files found", self.config.loglevel)
body += logger.print_line("\n".join(orphaned_files), self.config.loglevel)
if self.config.orphaned["empty_after_x_days"] == 0:
body += logger.print_line(
f"{'Not Deleting' if self.config.dry_run else 'Deleting'} {num_orphaned} Orphaned files",
self.config.loglevel,
)
else:
body += logger.print_line(
f"{'Not moving' if self.config.dry_run else 'Moving'} {num_orphaned} Orphaned files "
f"to {self.orphaned_dir.replace(self.remote_dir, self.root_dir)}",
self.config.loglevel,
)
attr = {
"function": "rem_orphaned",
"title": f"Removing {num_orphaned} Orphaned Files",
"body": "\n".join(body),
"orphaned_files": list(orphaned_files),
"orphaned_directory": self.orphaned_dir.replace(self.remote_dir, self.root_dir),
"total_orphaned_files": num_orphaned,
}
self.config.send_notifications(attr)
# Batch process orphaned files
if not self.config.dry_run:
orphaned_parent_paths = set()
# Process files in batches to reduce I/O overhead
batch_size = 100
for i in range(0, len(orphaned_files), batch_size):
batch = orphaned_files[i : i + batch_size]
batch_results = self.executor.map(self.handle_orphaned_files, batch)
orphaned_parent_paths.update(batch_results)
# Remove empty directories
if orphaned_parent_paths:
logger.print_line("Removing newly empty directories", self.config.loglevel)
exclude_patterns = [
exclude_pattern.replace(self.remote_dir, self.root_dir)
for exclude_pattern in self.config.orphaned.get("exclude_patterns", [])
]
# Process directories in parallel
self.executor.map(
lambda directory: util.remove_empty_directories(
directory, self.qbt.get_category_save_paths(), exclude_patterns
),
orphaned_parent_path,
orphaned_parent_paths,
)
else:
logger.print_line("No Orphaned Files found.", self.config.loglevel)
end_time = time.time()
duration = end_time - start_time
logger.debug(f"Remove orphaned command completed in {duration:.2f} seconds")
def handle_orphaned_files(self, file):
"""Handle orphaned file with improved error handling and batching"""
src = file.replace(self.root_dir, self.remote_dir)
dest = os.path.join(self.orphaned_dir, file.replace(self.root_dir, ""))
orphaned_parent_path = os.path.dirname(file).replace(self.root_dir, self.remote_dir)
"""Delete orphaned files directly if empty_after_x_days is set to 0"""
if self.config.orphaned["empty_after_x_days"] == 0:
try:
try:
if self.config.orphaned["empty_after_x_days"] == 0:
util.delete_files(src)
except Exception:
logger.error(f"Error deleting orphaned file: {file}")
else:
util.move_files(src, dest, True)
else: # Move orphaned files to orphaned directory
util.move_files(src, dest, True)
except Exception as e:
logger.error(f"Error processing orphaned file {file}: {e}")
if self.config.orphaned["empty_after_x_days"] == 0:
# Fallback to move if delete fails
util.move_files(src, dest, True)
return orphaned_parent_path
def get_full_path_of_torrent_files(self, torrent):
torrent_files = map(lambda dict: dict.name, torrent.files)
"""Get full paths for torrent files with improved path handling"""
save_path = torrent.save_path
fullpath_torrent_files = []
for file in torrent_files:
fullpath = os.path.join(save_path, file)
# Replace fullpath with \\ if qbm is running in docker (linux) but qbt is on windows
fullpath = fullpath.replace(r"/", "\\") if ":\\" in fullpath else fullpath
fullpath_torrent_files.append(fullpath)
# Use list comprehension for better performance
fullpath_torrent_files = [
os.path.join(save_path, file.name).replace(r"/", "\\")
if ":\\" in os.path.join(save_path, file.name)
else os.path.join(save_path, file.name)
for file in torrent.files
]
return fullpath_torrent_files
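The batch processing introduced above can be sketched generically; `process_in_batches` is a hypothetical helper, not a function from the module:

```python
from concurrent.futures import ThreadPoolExecutor

def process_in_batches(items, process, batch_size=100, max_workers=4):
    """Map `process` over `items` in fixed-size batches, collecting unique results."""
    results = set()
    with ThreadPoolExecutor(max_workers=max_workers) as executor:
        for i in range(0, len(items), batch_size):
            # Submitting one batch at a time bounds in-flight work and I/O pressure.
            batch = items[i:i + batch_size]
            results.update(executor.map(process, batch))
    return results
```

Applied to orphan handling, `process` would move or delete one file and return its parent directory, so `results` ends up as the set of directories to check for emptiness.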


@@ -1,7 +1,9 @@
from qbittorrentapi import NotFound404Error
import time
from qbittorrentapi import TrackerStatus
from modules import util
from modules.qbit_error_handler import handle_qbit_api_errors
from modules.util import TorrentMessages
from modules.util import list_in_text
@@ -109,9 +111,11 @@ class RemoveUnregistered:
self.t_msg = self.qbt.torrentinfo[self.t_name]["msg"]
self.t_status = self.qbt.torrentinfo[self.t_name]["status"]
check_tags = util.get_list(torrent.tags)
try:
@handle_qbit_api_errors(context="process_torrent_issues", retry_attempts=1)
def process_single_torrent():
if self.filter_completed and not torrent.state_enum.is_complete:
continue
return
tracker_working = False
for trk in torrent.trackers:
if (
@@ -120,7 +124,7 @@
):
tracker_working = True
if tracker_working:
continue
return
tracker = self.qbt.get_tags(self.qbt.get_tracker_urls([trk]))
msg_up = trk.msg.upper()
msg = trk.msg
@@ -143,8 +147,9 @@
# Tag any error torrents
if self.cfg_tag_error and self.tag_error not in check_tags:
self.tag_tracker_error(msg, tracker, torrent)
except NotFound404Error:
continue
try:
process_single_torrent()
except Exception as ex:
logger.stacktrace()
self.config.notify(ex, "Remove Unregistered Torrents", False)
@@ -152,6 +157,7 @@
def rem_unregistered(self):
"""Remove torrents with unregistered trackers."""
start_time = time.time()
self.remove_previous_errors()
self.process_torrent_issues()
@@ -189,6 +195,10 @@
)
logger.print_line(self.tor_error_summary.rstrip(), self.config.loglevel)
end_time = time.time()
duration = end_time - start_time
logger.debug(f"Remove unregistered command completed in {duration:.2f} seconds")
def tag_tracker_error(self, msg, tracker, torrent):
"""Tags any trackers with errors"""
tor_error = ""


@@ -47,6 +47,7 @@ class ShareLimits:
def update_share_limits(self):
"""Updates share limits for torrents based on grouping"""
start_time = time()
logger.separator("Updating Share Limits based on priority", space=False, border=False)
if self.config.dry_run:
logger.warning("Share Limits will not be applied with dry_run and could be inaccurate unless manually adding tags.")
@@ -89,6 +90,10 @@
if group_config["cleanup"] and len(self.tdel_dict) > 0:
self.cleanup_torrents_for_group(group_name, group_config["priority"])
end_time = time()
duration = end_time - start_time
logger.debug(f"Share limits command completed in {duration:.2f} seconds")
def cleanup_torrents_for_group(self, group_name, priority):
"""Deletes torrents that have reached the ratio/seed limit"""
logger.separator(
@@ -297,6 +302,7 @@
min_last_active=group_config["min_last_active"],
resume_torrent=group_config["resume_torrent_after_change"],
tracker=tracker["url"],
reset_upload_speed_on_unmet_minimums=group_config["reset_upload_speed_on_unmet_minimums"],
)
# Get updated torrent after checking if the torrent has reached seed limits
torrent = self.qbt.get_torrents({"torrent_hashes": t_hash})[0]
@@ -471,6 +477,7 @@
min_last_active,
resume_torrent,
tracker,
reset_upload_speed_on_unmet_minimums,
):
"""Check if torrent has reached seed limit"""
body = ""
@@ -508,7 +515,8 @@
torrent.add_tags(self.min_seeding_time_tag)
torrent_tags += f", {self.min_seeding_time_tag}"
torrent.set_share_limits(ratio_limit=-1, seeding_time_limit=-1, inactive_seeding_time_limit=-1)
torrent.set_upload_limit(-1)
if reset_upload_speed_on_unmet_minimums:
torrent.set_upload_limit(-1)
if resume_torrent:
torrent.resume()
return False
@@ -541,7 +549,8 @@
torrent.add_tags(self.min_num_seeds_tag)
torrent_tags += f", {self.min_num_seeds_tag}"
torrent.set_share_limits(ratio_limit=-1, seeding_time_limit=-1, inactive_seeding_time_limit=-1)
torrent.set_upload_limit(-1)
if reset_upload_speed_on_unmet_minimums:
torrent.set_upload_limit(-1)
if resume_torrent:
torrent.resume()
return True
@@ -576,7 +585,8 @@
torrent.add_tags(self.last_active_tag)
torrent_tags += f", {self.last_active_tag}"
torrent.set_share_limits(ratio_limit=-1, seeding_time_limit=-1, inactive_seeding_time_limit=-1)
torrent.set_upload_limit(-1)
if reset_upload_speed_on_unmet_minimums:
torrent.set_upload_limit(-1)
if resume_torrent:
torrent.resume()
return False


@@ -1,3 +1,5 @@
import time
from modules import util
logger = util.logger
@@ -112,6 +114,7 @@ class TagNoHardLinks:
def tag_nohardlinks(self):
"""Tag torrents with no hardlinks"""
start_time = time.time()
logger.separator("Tagging Torrents with No Hardlinks", space=False, border=False)
nohardlinks = self.nohardlinks
check_hardlinks = util.CheckHardLinks(self.config)
@@ -161,3 +164,7 @@
f".torrent{'s.' if self.stats_untagged > 1 else '.'}",
self.config.loglevel,
)
end_time = time.time()
duration = end_time - start_time
logger.debug(f"Tag nohardlinks command completed in {duration:.2f} seconds")


@@ -1,3 +1,5 @@
import time
from modules import util
logger = util.logger
@@ -22,6 +24,7 @@ class Tags:
def tags(self):
"""Update tags for torrents"""
start_time = time.time()
logger.separator("Updating Tags", space=False, border=False)
torrent_list = self.qbt.torrent_list
if self.hashes:
@@ -86,3 +89,7 @@
)
else:
logger.print_line("No new torrents to tag.", self.config.loglevel)
end_time = time.time()
duration = end_time - start_time
logger.debug(f"Tags command completed in {duration:.2f} seconds")


@@ -0,0 +1,276 @@
"""
Centralized error handling for qBittorrent API exceptions
"""
import functools
import time
from typing import Any
from typing import Callable
from qbittorrentapi import APIConnectionError
from qbittorrentapi import APIError
from qbittorrentapi import Conflict409Error
from qbittorrentapi import Forbidden403Error
from qbittorrentapi import HTTP4XXError
from qbittorrentapi import HTTP5XXError
from qbittorrentapi import HTTPError
from qbittorrentapi import InternalServerError500Error
from qbittorrentapi import InvalidRequest400Error
from qbittorrentapi import LoginFailed
from qbittorrentapi import MissingRequiredParameters400Error
from qbittorrentapi import NotFound404Error
from qbittorrentapi import TorrentFileError
from qbittorrentapi import TorrentFileNotFoundError
from qbittorrentapi import TorrentFilePermissionError
from qbittorrentapi import Unauthorized401Error
from qbittorrentapi import UnsupportedMediaType415Error
from qbittorrentapi import UnsupportedQbittorrentVersion
from modules import util
logger = util.logger
class QbitAPIErrorHandler:
"""Centralized handler for qBittorrent API errors"""
def __init__(self, config=None):
self.config = config
self.retry_attempts = 3
self.retry_delay = 5 # seconds
def handle_api_error(self, error: Exception, context: str = "") -> bool:
"""
Handle qBittorrent API errors with appropriate logging and notifications
Args:
error: The exception that occurred
context: Additional context about where the error occurred
Returns:
bool: True if the error was handled gracefully, False if it should be re-raised
"""
error_msg = f"qBittorrent API Error{f' in {context}' if context else ''}: {str(error)}"
if isinstance(error, Forbidden403Error):
logger.error(f"{error_msg} - Access forbidden. Check qBittorrent permissions and authentication.")
if self.config:
self.config.notify(f"qBittorrent access forbidden: {str(error)}", "API Error", False)
return True
elif isinstance(error, LoginFailed):
logger.error(f"{error_msg} - Login failed. Check qBittorrent credentials.")
if self.config:
self.config.notify(f"qBittorrent login failed: {str(error)}", "Authentication Error", False)
return True
elif isinstance(error, APIConnectionError):
logger.error(f"{error_msg} - Connection failed. Check qBittorrent server status.")
if self.config:
self.config.notify(f"qBittorrent connection failed: {str(error)}", "Connection Error", False)
return True
elif isinstance(error, NotFound404Error):
logger.warning(f"{error_msg} - Resource not found. This may be expected behavior.")
if self.config:
self.config.notify(f"qBittorrent resource not found: {str(error)}", "API Warning", False)
return True
elif isinstance(error, Conflict409Error):
logger.warning(f"{error_msg} - Resource conflict. This may be expected behavior.")
if self.config:
self.config.notify(f"qBittorrent resource conflict: {str(error)}", "API Warning", False)
return True
elif isinstance(error, TorrentFileNotFoundError):
logger.error(f"{error_msg} - Torrent file not found.")
if self.config:
self.config.notify(f"Torrent file not found: {str(error)}", "File Error", False)
return True
elif isinstance(error, TorrentFilePermissionError):
logger.error(f"{error_msg} - Permission denied for torrent file.")
if self.config:
self.config.notify(f"Torrent file permission denied: {str(error)}", "Permission Error", False)
return True
elif isinstance(error, TorrentFileError):
logger.error(f"{error_msg} - Torrent file error.")
if self.config:
self.config.notify(f"Torrent file error: {str(error)}", "File Error", False)
return True
elif isinstance(error, (MissingRequiredParameters400Error, InvalidRequest400Error)):
logger.error(f"{error_msg} - Invalid request parameters.")
if self.config:
self.config.notify(f"Invalid qBittorrent request parameters: {str(error)}", "Request Error", False)
return True
elif isinstance(error, Unauthorized401Error):
logger.error(f"{error_msg} - Unauthorized access. Check authentication.")
if self.config:
self.config.notify(f"qBittorrent unauthorized access: {str(error)}", "Authentication Error", False)
return True
elif isinstance(error, UnsupportedMediaType415Error):
logger.error(f"{error_msg} - Unsupported media type (invalid torrent file/URL).")
if self.config:
self.config.notify(f"Unsupported media type: {str(error)}", "Media Error", False)
return True
elif isinstance(error, InternalServerError500Error):
logger.error(f"{error_msg} - qBittorrent internal server error.")
if self.config:
self.config.notify(f"qBittorrent server error: {str(error)}", "Server Error", False)
return True
elif isinstance(error, UnsupportedQbittorrentVersion):
logger.error(f"{error_msg} - Unsupported qBittorrent version.")
if self.config:
self.config.notify(f"Unsupported qBittorrent version: {str(error)}", "Version Error", False)
return True
elif isinstance(error, (HTTPError, HTTP4XXError, HTTP5XXError)):
# Catch any other HTTP errors we might have missed
logger.error(f"{error_msg} - HTTP error (status: {getattr(error, 'http_status_code', 'unknown')}).")
if self.config:
self.config.notify(f"qBittorrent HTTP error: {str(error)}", "HTTP Error", False)
return True
elif isinstance(error, APIError):
# Catch any other API errors we might have missed
logger.error(f"{error_msg} - General API error.")
if self.config:
self.config.notify(f"qBittorrent API error: {str(error)}", "API Error", False)
return True
else:
# Unknown qBittorrent API error
logger.error(f"Unknown qBittorrent API error{f' in {context}' if context else ''}: {str(error)}")
if self.config:
self.config.notify(f"Unknown qBittorrent error: {str(error)}", "Unknown Error", False)
return False
def handle_qbit_api_errors(context: str = "", retry_attempts: int = 3, retry_delay: int = 5):
"""
Decorator to handle qBittorrent API errors with retry logic
Args:
context: Description of where the error occurred
retry_attempts: Number of retry attempts for recoverable errors
retry_delay: Delay between retry attempts in seconds
"""
def decorator(func: Callable) -> Callable:
@functools.wraps(func)
def wrapper(*args, **kwargs) -> Any:
# Try to get config from args/kwargs for notifications
config = None
if args and hasattr(args[0], "config"):
config = args[0].config
elif "config" in kwargs:
config = kwargs["config"]
error_handler = QbitAPIErrorHandler(config)
for attempt in range(retry_attempts + 1):
try:
return func(*args, **kwargs)
except (
APIConnectionError,
Forbidden403Error,
LoginFailed,
NotFound404Error,
Conflict409Error,
TorrentFileError,
TorrentFileNotFoundError,
TorrentFilePermissionError,
UnsupportedQbittorrentVersion,
MissingRequiredParameters400Error,
InvalidRequest400Error,
Unauthorized401Error,
UnsupportedMediaType415Error,
InternalServerError500Error,
HTTPError,
HTTP4XXError,
HTTP5XXError,
APIError,
) as e:
# Handle the error
handled = error_handler.handle_api_error(e, context)
if not handled:
# Re-raise if not handled
raise
# For certain errors, don't retry
if isinstance(
e,
(
UnsupportedQbittorrentVersion,
MissingRequiredParameters400Error,
InvalidRequest400Error,
NotFound404Error,
Conflict409Error,
UnsupportedMediaType415Error,
TorrentFileNotFoundError,
TorrentFilePermissionError,
),
):
logger.info(f"Skipping operation due to {type(e).__name__}")
return None
# Retry for connection/auth errors
if attempt < retry_attempts:
logger.info(f"Retrying in {retry_delay} seconds... (attempt {attempt + 1}/{retry_attempts})")
time.sleep(retry_delay)
else:
logger.error(f"Max retry attempts ({retry_attempts}) exceeded for {context}")
if config:
config.notify(f"Max retry attempts exceeded for {context}: {str(e)}", "Retry Error", False)
return None
except Exception as e:
# Non-qBittorrent API error, let it propagate
logger.error(f"Non-API error in {context}: {str(e)}")
if config:
config.notify(f"Non-API error in {context}: {str(e)}", "System Error", False)
raise
return None
return wrapper
return decorator
def safe_execute_with_qbit_error_handling(func: Callable, context: str = "", *args, **kwargs) -> Any:
"""
Safely execute a function with qBittorrent API error handling
Args:
func: Function to execute
context: Description of the operation
*args, **kwargs: Arguments to pass to the function
Returns:
Function result or None if an error occurred
"""
try:
# Apply the decorator dynamically
wrapped_func = handle_qbit_api_errors(context)(func)
return wrapped_func(*args, **kwargs)
except Exception as e:
logger.error(f"Unexpected error in {context}: {str(e)}")
logger.stacktrace()
# Try to get config from args for notification
config = None
if args and hasattr(args[0], "config"):
config = args[0].config
elif "config" in kwargs:
config = kwargs["config"]
if config:
config.notify(f"Unexpected error in {context}: {str(e)}", "System Error", False)
return None
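The retry decorator above can be illustrated with a standalone sketch. This is not qbit_manage's actual API: `RecoverableError` and `flaky` are hypothetical stand-ins for qbittorrent-api's exception classes and a wrapped operation, showing only the retry-then-give-up control flow.

```python
import functools
import time

# Hypothetical stand-in for qbittorrent-api's recoverable exception types
class RecoverableError(Exception):
    pass

def with_retries(retry_attempts=3, retry_delay=0):
    """Minimal sketch of the retry pattern used by handle_qbit_api_errors."""
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            for attempt in range(retry_attempts + 1):
                try:
                    return func(*args, **kwargs)
                except RecoverableError:
                    if attempt < retry_attempts:
                        time.sleep(retry_delay)  # back off before retrying
                    else:
                        return None  # give up after the final attempt
        return wrapper
    return decorator

calls = {"n": 0}

@with_retries(retry_attempts=3, retry_delay=0)
def flaky():
    # Fails twice, then succeeds on the third call
    calls["n"] += 1
    if calls["n"] < 3:
        raise RecoverableError("transient failure")
    return "ok"

print(flaky())  # succeeds on the third attempt
```

As in the real decorator, a caller cannot distinguish "operation returned nothing" from "all retries exhausted"; that is why `execute_qbit_commands` below checks each result against `None` before updating stats.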


@ -8,11 +8,12 @@ from functools import cache
from qbittorrentapi import APIConnectionError
from qbittorrentapi import Client
from qbittorrentapi import LoginFailed
from qbittorrentapi import NotFound404Error
from qbittorrentapi import TrackerStatus
from qbittorrentapi import Version
from ruamel.yaml import CommentedSeq
from modules import util
from modules.qbit_error_handler import handle_qbit_api_errors
from modules.util import Failed
from modules.util import TorrentMessages
from modules.util import list_in_text
@ -402,8 +403,17 @@ class Qbt:
if "cat" in self.config.data and self.config.data["cat"] is not None:
cat_path = self.config.data["cat"]
for cat, save_path in cat_path.items():
try:
if cat == "Uncategorized" and isinstance(save_path, CommentedSeq):
if any(os.path.join(p, "") == path or fnmatch(path, p) for p in save_path):
category.append(cat)
elif os.path.join(save_path, "") == path or fnmatch(path, save_path):
category.append(cat)
except TypeError:
e = f"Invalid configuration for category {cat}. Check your config.yml file."
self.config.notify(e, "Category", True)
logger.print_line(e, "CRITICAL")
sys.exit(1)
if not category:
default_cat = path.split(os.sep)[-2]
@ -435,14 +445,20 @@ class Qbt:
logger.debug(f"Torrent {torrent.name} has already been removed from torrent files.")
tor_files = []
@handle_qbit_api_errors(context="tor_delete_recycle_get_files", retry_attempts=1)
def get_torrent_files():
info_hash = torrent.hash
save_path = torrent.save_path.replace(self.config.root_dir, self.config.remote_dir)
# Define torrent files/folders
for file in torrent.files:
tor_files.append(os.path.join(save_path, file.name))
return info_hash, save_path
result = get_torrent_files()
if result is None: # Error occurred and was handled
return
info_hash, save_path = result
if self.config.recyclebin["enabled"]:
if self.config.recyclebin["split_by_category"]:


@ -659,43 +659,66 @@ def copy_files(src, dest):
def remove_empty_directories(pathlib_root_dir, excluded_paths=None, exclude_patterns=[]):
"""Remove empty directories recursively, optimized version."""
"""Remove empty directories recursively with optimized performance."""
pathlib_root_dir = Path(pathlib_root_dir)
# Early return for non-existent paths
if not pathlib_root_dir.exists():
return
# Optimize excluded paths handling
excluded_paths_set = set()
if excluded_paths is not None:
excluded_paths_set = {Path(p).resolve() for p in excluded_paths}
# Pre-compile exclude patterns for better performance
compiled_patterns = []
for pattern in exclude_patterns:
# Convert to regex for faster matching
import re
regex_pattern = fnmatch.translate(pattern)
compiled_patterns.append(re.compile(regex_pattern))
# Cache directory checks to avoid redundant operations
directories_to_check = []
# Collect all directories in single pass
for root, dirs, files in os.walk(pathlib_root_dir, topdown=False):
root_path = Path(root).resolve()
# Skip excluded paths efficiently
if excluded_paths_set and root_path in excluded_paths_set:
continue
# Check exclude patterns efficiently
if compiled_patterns:
root_str = str(root_path) + os.sep
if any(pattern.match(root_str) for pattern in compiled_patterns):
continue
# Only add directories that might be empty (no files)
if not files:
directories_to_check.append(root_path)
# Remove empty directories in batch
removed_dirs = set()
for dir_path in directories_to_check:
try:
os.rmdir(dir_path)
removed_dirs.add(dir_path)
except PermissionError as perm:
logger.warning(f"{perm} : Unable to delete folder {root} as it has permission issues. Skipping...")
pass
logger.warning(f"{perm} : Unable to delete folder {dir_path} as it has permission issues. Skipping...")
except OSError:
# Directory not empty - expected
pass
# Attempt root directory removal if it's now empty
if not excluded_paths_set or pathlib_root_dir.resolve() not in excluded_paths_set:
try:
pathlib_root_dir.rmdir()
except PermissionError as perm:
logger.warning(f"{perm} : Unable to delete folder {root} as it has permission issues. Skipping...")
pass
logger.warning(f"{perm} : Unable to delete root folder {pathlib_root_dir} as it has permission issues. Skipping...")
except OSError:
pass
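The pattern precompilation above can be sketched in isolation. The directory names and patterns here are hypothetical; the point is that shell-style exclude globs are translated once to regexes via `fnmatch.translate`, then matched against each walked directory with a trailing separator appended, mirroring the `os.path.join(root, "")` behavior of the earlier code.

```python
import fnmatch
import os
import re

# Hypothetical exclude patterns, translated once up front
exclude_patterns = ["*/downloads/incomplete*", "*/.Recycle.Bin*"]
compiled = [re.compile(fnmatch.translate(p)) for p in exclude_patterns]

def is_excluded(dir_path):
    # Append a separator so patterns anchored on directories behave
    # the same way as fnmatch against os.path.join(root, "")
    candidate = dir_path.rstrip(os.sep) + os.sep
    return any(rx.match(candidate) for rx in compiled)

print(is_excluded("/data/downloads/incomplete"))  # True
print(is_excluded("/data/media/movies"))          # False
```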
@ -826,15 +849,41 @@ class CheckHardLinks:
def get_root_files(root_dir, remote_dir, exclude_dir=None):
"""Get all files in root directory with optimized path handling and filtering."""
if not root_dir or not os.path.exists(root_dir):
return []
# Pre-calculate path transformations
is_same_path = remote_dir == root_dir
local_exclude_dir = None
if exclude_dir and not is_same_path:
local_exclude_dir = exclude_dir.replace(remote_dir, root_dir)
# Use list comprehension with pre-filtered results
root_files = []
# Optimize path replacement
if is_same_path:
# Fast path when paths are the same
for path, subdirs, files in os.walk(root_dir):
if local_exclude_dir and local_exclude_dir in path:
continue
root_files.extend(os.path.join(path, name) for name in files)
else:
# Path replacement needed
path_replacements = {}
for path, subdirs, files in os.walk(remote_dir):
if local_exclude_dir and local_exclude_dir in path:
continue
# Cache path replacement
if path not in path_replacements:
path_replacements[path] = path.replace(remote_dir, root_dir)
replaced_path = path_replacements[path]
root_files.extend(os.path.join(replaced_path, name) for name in files)
return root_files
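The remote-to-local rewrite in `get_root_files` can be sketched on its own. The directories below are hypothetical; the sketch shows paths walked under a remote mount being rewritten to the local root before the contained files are collected, as the path-replacement branch above does.

```python
import os

# Hypothetical remote mount and its local equivalent
remote_dir = "/remote/downloads"
root_dir = "/mnt/downloads"

def map_paths(walked):
    """walked: iterable of (dirpath, filenames) pairs, as os.walk would yield."""
    out = []
    for path, files in walked:
        # Rewrite the remote prefix to the local root, then join filenames
        local = path.replace(remote_dir, root_dir)
        out.extend(os.path.join(local, name) for name in files)
    return out

print(map_paths([("/remote/downloads/tv", ["e1.mkv", "e2.mkv"])]))
# ['/mnt/downloads/tv/e1.mkv', '/mnt/downloads/tv/e2.mkv']
```

`str.replace` is safe here because `remote_dir` is a prefix of every walked path; a directory whose name merely contains the remote prefix elsewhere would need a stricter prefix check.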
@ -1048,6 +1097,7 @@ def execute_qbit_commands(qbit_manager, commands, stats, hashes=None):
from modules.core.share_limits import ShareLimits
from modules.core.tag_nohardlinks import TagNoHardLinks
from modules.core.tags import Tags
from modules.qbit_error_handler import safe_execute_with_qbit_error_handling
# Initialize executed_commands list
if "executed_commands" not in stats:
@ -1056,99 +1106,132 @@ def execute_qbit_commands(qbit_manager, commands, stats, hashes=None):
# Set Category
if commands.get("cat_update"):
if hashes is not None:
result = safe_execute_with_qbit_error_handling(
lambda: Category(qbit_manager, hashes).stats, "Category Update (with hashes)"
)
else:
result = safe_execute_with_qbit_error_handling(lambda: Category(qbit_manager).stats, "Category Update")
if "categorized" not in stats:
stats["categorized"] = 0
stats["categorized"] += result
stats["executed_commands"].append("cat_update")
if result is not None:
if "categorized" not in stats:
stats["categorized"] = 0
stats["categorized"] += result
stats["executed_commands"].append("cat_update")
else:
logger.warning("Category Update operation skipped due to API errors")
# Set Tags
if commands.get("tag_update"):
if hashes is not None:
result = safe_execute_with_qbit_error_handling(lambda: Tags(qbit_manager, hashes).stats, "Tags Update (with hashes)")
else:
result = safe_execute_with_qbit_error_handling(lambda: Tags(qbit_manager).stats, "Tags Update")
stats["tagged"] += result
stats["executed_commands"].append("tag_update")
if result is not None:
stats["tagged"] += result
stats["executed_commands"].append("tag_update")
else:
logger.warning("Tags Update operation skipped due to API errors")
# Remove Unregistered Torrents and tag errors
if commands.get("rem_unregistered") or commands.get("tag_tracker_error"):
if hashes is not None:
rem_unreg = safe_execute_with_qbit_error_handling(
lambda: RemoveUnregistered(qbit_manager, hashes), "Remove Unregistered Torrents (with hashes)"
)
else:
rem_unreg = safe_execute_with_qbit_error_handling(
lambda: RemoveUnregistered(qbit_manager), "Remove Unregistered Torrents"
)
if rem_unreg is not None:
# Initialize stats if they don't exist
for key in ["rem_unreg", "deleted", "deleted_contents", "tagged_tracker_error", "untagged_tracker_error"]:
if key not in stats:
stats[key] = 0
stats["rem_unreg"] += rem_unreg.stats_deleted + rem_unreg.stats_deleted_contents
stats["deleted"] += rem_unreg.stats_deleted
stats["deleted_contents"] += rem_unreg.stats_deleted_contents
stats["tagged_tracker_error"] += rem_unreg.stats_tagged
stats["untagged_tracker_error"] += rem_unreg.stats_untagged
stats["tagged"] += rem_unreg.stats_tagged
stats["executed_commands"].extend([cmd for cmd in ["rem_unregistered", "tag_tracker_error"] if commands.get(cmd)])
stats["rem_unreg"] += rem_unreg.stats_deleted + rem_unreg.stats_deleted_contents
stats["deleted"] += rem_unreg.stats_deleted
stats["deleted_contents"] += rem_unreg.stats_deleted_contents
stats["tagged_tracker_error"] += rem_unreg.stats_tagged
stats["untagged_tracker_error"] += rem_unreg.stats_untagged
stats["tagged"] += rem_unreg.stats_tagged
stats["executed_commands"].extend([cmd for cmd in ["rem_unregistered", "tag_tracker_error"] if commands.get(cmd)])
else:
logger.warning("Remove Unregistered Torrents operation skipped due to API errors")
# Recheck Torrents
if commands.get("recheck"):
if hashes is not None:
recheck = safe_execute_with_qbit_error_handling(
lambda: ReCheck(qbit_manager, hashes), "Recheck Torrents (with hashes)"
)
else:
recheck = safe_execute_with_qbit_error_handling(lambda: ReCheck(qbit_manager), "Recheck Torrents")
if "rechecked" not in stats:
stats["rechecked"] = 0
if "resumed" not in stats:
stats["resumed"] = 0
stats["rechecked"] += recheck.stats_rechecked
stats["resumed"] += recheck.stats_resumed
stats["executed_commands"].append("recheck")
if recheck is not None:
if "rechecked" not in stats:
stats["rechecked"] = 0
if "resumed" not in stats:
stats["resumed"] = 0
stats["rechecked"] += recheck.stats_rechecked
stats["resumed"] += recheck.stats_resumed
stats["executed_commands"].append("recheck")
else:
logger.warning("Recheck Torrents operation skipped due to API errors")
# Remove Orphaned Files
if commands.get("rem_orphaned"):
result = safe_execute_with_qbit_error_handling(lambda: RemoveOrphaned(qbit_manager).stats, "Remove Orphaned Files")
if "orphaned" not in stats:
stats["orphaned"] = 0
stats["orphaned"] += result
stats["executed_commands"].append("rem_orphaned")
if result is not None:
if "orphaned" not in stats:
stats["orphaned"] = 0
stats["orphaned"] += result
stats["executed_commands"].append("rem_orphaned")
else:
logger.warning("Remove Orphaned Files operation skipped due to API errors")
# Tag NoHardLinks
if commands.get("tag_nohardlinks"):
if hashes is not None:
no_hardlinks = safe_execute_with_qbit_error_handling(
lambda: TagNoHardLinks(qbit_manager, hashes), "Tag NoHardLinks (with hashes)"
)
else:
no_hardlinks = safe_execute_with_qbit_error_handling(lambda: TagNoHardLinks(qbit_manager), "Tag NoHardLinks")
if "tagged_noHL" not in stats:
stats["tagged_noHL"] = 0
if "untagged_noHL" not in stats:
stats["untagged_noHL"] = 0
stats["tagged"] += no_hardlinks.stats_tagged
stats["tagged_noHL"] += no_hardlinks.stats_tagged
stats["untagged_noHL"] += no_hardlinks.stats_untagged
stats["executed_commands"].append("tag_nohardlinks")
if no_hardlinks is not None:
if "tagged_noHL" not in stats:
stats["tagged_noHL"] = 0
if "untagged_noHL" not in stats:
stats["untagged_noHL"] = 0
stats["tagged"] += no_hardlinks.stats_tagged
stats["tagged_noHL"] += no_hardlinks.stats_tagged
stats["untagged_noHL"] += no_hardlinks.stats_untagged
stats["executed_commands"].append("tag_nohardlinks")
else:
logger.warning("Tag NoHardLinks operation skipped due to API errors")
# Set Share Limits
if commands.get("share_limits"):
if hashes is not None:
share_limits = safe_execute_with_qbit_error_handling(
lambda: ShareLimits(qbit_manager, hashes), "Share Limits (with hashes)"
)
else:
share_limits = safe_execute_with_qbit_error_handling(lambda: ShareLimits(qbit_manager), "Share Limits")
if "updated_share_limits" not in stats:
stats["updated_share_limits"] = 0
if "cleaned_share_limits" not in stats:
stats["cleaned_share_limits"] = 0
stats["tagged"] += share_limits.stats_tagged
stats["updated_share_limits"] += share_limits.stats_tagged
stats["deleted"] += share_limits.stats_deleted
stats["deleted_contents"] += share_limits.stats_deleted_contents
stats["cleaned_share_limits"] += share_limits.stats_deleted + share_limits.stats_deleted_contents
stats["executed_commands"].append("share_limits")
if share_limits is not None:
if "updated_share_limits" not in stats:
stats["updated_share_limits"] = 0
if "cleaned_share_limits" not in stats:
stats["cleaned_share_limits"] = 0
stats["tagged"] += share_limits.stats_tagged
stats["updated_share_limits"] += share_limits.stats_tagged
stats["deleted"] += share_limits.stats_deleted
stats["deleted_contents"] += share_limits.stats_deleted_contents
stats["cleaned_share_limits"] += share_limits.stats_deleted + share_limits.stats_deleted_contents
stats["executed_commands"].append("share_limits")
else:
logger.warning("Share Limits operation skipped due to API errors")


@ -7,7 +7,9 @@ import glob
import json
import logging
import os
import re
import shutil
from contextlib import asynccontextmanager
from dataclasses import dataclass
from dataclasses import field
from datetime import datetime
@ -15,6 +17,7 @@ from multiprocessing import Queue
from multiprocessing.sharedctypes import Synchronized
from pathlib import Path
from typing import Any
from typing import Optional
import ruamel.yaml
from fastapi import FastAPI
@ -42,7 +45,7 @@ class CommandRequest(BaseModel):
dry_run: bool = False
skip_cleanup: bool = False
skip_qb_version_check: bool = False
log_level: Optional[str] = None # noqa: UP045
class ConfigRequest(BaseModel):
@ -85,7 +88,7 @@ class HealthCheckResponse(BaseModel):
application: dict = {} # web_api_responsive, can_accept_requests, queue_size, etc.
directories: dict = {} # config/logs directory status and activity info
issues: list[str] = []
error: Optional[str] = None # noqa: UP045
async def process_queue_periodically(web_api: WebAPI) -> None:
@ -135,7 +138,7 @@ class WebAPI:
)
)
args: dict = field(default_factory=dict)
app: FastAPI = field(default=None)
is_running: Synchronized[bool] = field(default=None)
is_running_lock: object = field(default=None) # multiprocessing.Lock
web_api_queue: Queue = field(default=None)
@ -143,6 +146,31 @@ class WebAPI:
def __post_init__(self) -> None:
"""Initialize routes and events."""
# Initialize FastAPI app with root_path if base_url is provided
base_url = self.args.get("base_url", "")
if base_url and not base_url.startswith("/"):
base_url = "/" + base_url
# Create lifespan context manager for startup/shutdown events
@asynccontextmanager
async def lifespan(app: FastAPI):
"""Handle application startup and shutdown events."""
# Startup: Start background task for queue processing
app.state.web_api = self
app.state.background_task = asyncio.create_task(process_queue_periodically(self))
yield
# Shutdown: Clean up background task
if hasattr(app.state, "background_task"):
app.state.background_task.cancel()
try:
await app.state.background_task
except asyncio.CancelledError:
pass
# Create app with lifespan context manager
app = FastAPI(lifespan=lifespan)
object.__setattr__(self, "app", app)
# Initialize paths during startup
object.__setattr__(self, "config_path", Path(self.default_dir))
object.__setattr__(self, "logs_path", Path(self.default_dir) / "logs")
@ -174,37 +202,52 @@ class WebAPI:
self.app.get("/api/log_files")(self.list_log_files)
self.app.get("/api/version")(self.get_version)
self.app.get("/api/health")(self.health_check)
self.app.get("/api/get_base_url")(self.get_base_url)
# Mount static files for web UI
web_ui_dir = Path(__file__).parent.parent / "web-ui"
if web_ui_dir.exists():
self.app.mount("/static", StaticFiles(directory=str(web_ui_dir)), name="static")
# If base URL is configured, also mount static files at the base URL path
if base_url:
self.app.mount(base_url + "/static", StaticFiles(directory=str(web_ui_dir)), name="base_static")
# Root route to serve web UI
@self.app.get("/")
async def serve_index():
# If base URL is configured, redirect to the base URL path
if base_url:
from fastapi.responses import RedirectResponse
return RedirectResponse(url=base_url + "/", status_code=302)
# Otherwise, serve the web UI normally
web_ui_path = Path(__file__).parent.parent / "web-ui" / "index.html"
if web_ui_path.exists():
return FileResponse(str(web_ui_path))
raise HTTPException(status_code=404, detail="Web UI not found")
# If base URL is configured, also handle the base URL path
if base_url:
# Store reference to self in app state for access in event handlers
self.app.state.web_api = self
@self.app.get(base_url + "/")
async def serve_base_url_index():
web_ui_path = Path(__file__).parent.parent / "web-ui" / "index.html"
if web_ui_path.exists():
return FileResponse(str(web_ui_path))
raise HTTPException(status_code=404, detail="Web UI not found")
@self.app.on_event("startup")
async def startup_event():
"""Start background task for queue processing."""
self.app.state.background_task = asyncio.create_task(process_queue_periodically(self.app.state.web_api))
# Catch-all route for SPA routing (must be last)
@self.app.get("/{full_path:path}")
async def catch_all(full_path: str):
# For any non-API route that doesn't start with static/, serve the index.html (SPA routing)
if not full_path.startswith("api/") and not full_path.startswith("static/"):
web_ui_path = Path(__file__).parent.parent / "web-ui" / "index.html"
if web_ui_path.exists():
return FileResponse(str(web_ui_path))
raise HTTPException(status_code=404, detail="Not found")
@self.app.on_event("shutdown")
async def shutdown_event():
"""Clean up background task."""
if hasattr(self.app.state, "background_task"):
self.app.state.background_task.cancel()
try:
await self.app.state.background_task
except asyncio.CancelledError:
pass
# Note: Lifespan events are now handled in the lifespan context manager above
async def execute_for_config(self, args: dict, hashes: list[str]) -> dict:
"""Execute commands for a specific config file."""
@ -390,6 +433,10 @@ class WebAPI:
"issues": ["Health check failed"],
}
async def get_base_url(self) -> dict:
"""Get the configured base URL for the web UI."""
return {"baseUrl": self.args.get("base_url", "")}
async def _execute_command(self, request: CommandRequest) -> dict:
"""Execute the actual command implementation."""
try:
@ -703,6 +750,37 @@ class WebAPI:
shutil.copy2(config_path, backup_file_path)
logger.info(f"Created backup: {backup_file_path}")
await self._cleanup_backups(config_path)
async def _cleanup_backups(self, config_path: Path):
"""Clean up old backups for a configuration file, keeping the last 30."""
try:
if not self.backup_path.exists():
return
config_stem = config_path.stem
config_suffix = config_path.suffix.lstrip(".")
# Regex to precisely match backups for THIS config file.
# Format: {stem}_{YYYYMMDD}_{HHMMSS}.{suffix}
backup_re = re.compile(f"^{re.escape(config_stem)}_(\\d{{8}}_\\d{{6}})\\.{re.escape(config_suffix)}$")
config_backups = [f for f in self.backup_path.iterdir() if f.is_file() and backup_re.match(f.name)]
# sort by name descending, which works for YYYYMMDD_HHMMSS format
sorted_backups = sorted(config_backups, key=lambda p: p.name, reverse=True)
num_to_keep = 30
if len(sorted_backups) > num_to_keep:
files_to_delete = sorted_backups[num_to_keep:]
logger.info(f"Cleaning up {len(files_to_delete)} old backups for '{config_path.name}'...")
for f in files_to_delete:
try:
f.unlink()
logger.debug(f"Deleted old backup: {f.name}")
except OSError as e:
logger.warning(f"Could not delete old backup {f.name}: {e}")
except Exception as e:
logger.error(f"An unexpected error occurred during backup cleanup: {e}")
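The backup-name matching in `_cleanup_backups` can be demonstrated with a small sketch. The filenames below are hypothetical; the regex is the one built above, anchored on a specific config's stem and suffix so that backups of `config2.yml` or files without a `YYYYMMDD_HHMMSS` timestamp never match.

```python
import re

# A hypothetical config file "config.yml" split into stem and suffix
config_stem, config_suffix = "config", "yml"
backup_re = re.compile(
    rf"^{re.escape(config_stem)}_(\d{{8}}_\d{{6}})\.{re.escape(config_suffix)}$"
)

names = [
    "config_20250719_085900.yml",   # matches: stem + timestamp + suffix
    "config_backup.yml",            # no timestamp -> skipped
    "config2_20250719_085900.yml",  # different stem -> skipped
]
matches = [n for n in names if backup_re.match(n)]
print(matches)  # ['config_20250719_085900.yml']
```

Because the timestamp is zero-padded `YYYYMMDD_HHMMSS`, plain lexicographic sorting of matching names orders backups chronologically, which is why the cleanup code can sort by `p.name` instead of parsing dates.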
async def run_command(self, request: CommandRequest) -> dict:
"""Handle incoming command requests."""
@ -760,7 +838,7 @@ class WebAPI:
logger.error(f"Error in run_command during execution: {str(e)}")
raise HTTPException(status_code=500, detail=str(e))
async def get_logs(self, limit: Optional[int] = None, log_filename: Optional[str] = None) -> dict[str, Any]: # noqa: UP045
"""Get recent logs from the log file."""
if not self.logs_path.exists():
logger.warning(f"Log directory not found: {self.logs_path}")
@ -838,9 +916,11 @@ class WebAPI:
return {"backups": []}
# Find backup files for this config
config_stem = Path(filename).stem
config_suffix = Path(filename).suffix.lstrip(".")
# Regex to precisely match backups for THIS config file.
backup_re = re.compile(f"^{re.escape(config_stem)}_(\\d{{8}}_\\d{{6}})\\.{re.escape(config_suffix)}$")
backup_files = [f for f in self.backup_path.iterdir() if f.is_file() and backup_re.match(f.name)]
backups = []
for backup_file in sorted(backup_files, key=lambda x: x.stat().st_mtime, reverse=True):


@ -19,11 +19,11 @@ authors = [
dependencies = [
"bencodepy==0.9.5",
"croniter==6.0.0",
"fastapi==0.116.1",
"GitPython==3.1.44",
"humanize==4.12.3",
"pytimeparse2==1.7.1",
"qbittorrent-api==2025.7.0",
"requests==2.32.4",
"retrying==1.4.0",
"ruamel.yaml==0.18.14",
@ -38,7 +38,7 @@ Repository = "https://github.com/StuffAnThings/qbit_manage"
[project.optional-dependencies]
dev = [
"pre-commit==4.2.0",
"ruff==0.12.4",
]
[tool.ruff]


@ -46,7 +46,7 @@ parser.add_argument(
dest="web_server",
action="store_true",
default=False,
help="Start the webUI server to handle command requests via HTTP API.",
)
parser.add_argument(
"-p",
@ -56,6 +56,14 @@ parser.add_argument(
default=8080,
help="Port number for the web server (default: 8080).",
)
parser.add_argument(
"-b",
"--base-url",
dest="base_url",
type=str,
default="",
help="Base URL path for the web UI (e.g., '/qbit-manage'). Default is empty (root).",
)
parser.add_argument("-db", "--debug", dest="debug", help=argparse.SUPPRESS, action="store_true", default=False)
parser.add_argument("-tr", "--trace", dest="trace", help=argparse.SUPPRESS, action="store_true", default=False)
parser.add_argument(
@ -283,6 +291,7 @@ env_version = get_arg("BRANCH_NAME", "master")
is_docker = get_arg("QBM_DOCKER", False, arg_bool=True)
web_server = get_arg("QBT_WEB_SERVER", args.web_server, arg_bool=True)
port = get_arg("QBT_PORT", args.port, arg_int=True)
base_url = get_arg("QBT_BASE_URL", args.base_url)
run = get_arg("QBT_RUN", args.run, arg_bool=True)
sch = get_arg("QBT_SCHEDULE", args.schedule)
startupDelay = get_arg("QBT_STARTUP_DELAY", args.startupDelay)
@ -347,6 +356,9 @@ for v in [
"screen_width",
"debug",
"trace",
"web_server",
"port",
"base_url",
]:
args[v] = eval(v)
@ -492,8 +504,14 @@ def start():
return None
if qbit_manager:
# Execute qBittorrent commands using shared function with error handling
try:
execute_qbit_commands(qbit_manager, cfg.commands, stats, hashes=None)
except Exception as ex:
logger.error(f"Error executing qBittorrent commands: {str(ex)}")
logger.stacktrace()
if cfg:
cfg.notify(f"qBittorrent command execution failed: {str(ex)}", "Execution Error", False)
# Empty RecycleBin
stats["recycle_emptied"] += cfg.cleanup_dirs("Recycle Bin")
@ -629,7 +647,11 @@ if __name__ == "__main__":
if web_server:
logger.separator("Starting Web Server")
logger.info(f"Web API server running on http://0.0.0.0:{port}")
if base_url:
logger.info(f"Access the WebUI at http://localhost:{port}/{base_url.lstrip('/')}")
logger.info(f"Root path http://localhost:{port}/ will redirect to the WebUI")
else:
logger.info(f"Access the WebUI at http://localhost:{port}")
# Create a copy of args to pass to the web server process
process_args = args.copy()
@ -670,21 +692,57 @@ if __name__ == "__main__":
start_loop(True)
# Update next_scheduled_run_info_shared in the main loop
last_skip_log_time = 0 # Track when we last logged a skip message
while not killer.kill_now:
current_time = datetime.now()
next_run_time = schedule.next_run() # Call the function to get the datetime object
# Calculate time until next run (avoid redundant calculation)
if next_run_time:
delta_seconds = max(0, (next_run_time - current_time).total_seconds())
else:
delta_seconds = float("inf") # No scheduled runs
# Update shared dictionary
next_run_info = calc_next_run(next_run_time)
next_scheduled_run_info_shared.update(next_run_info)
if web_server:
if is_running.value:
# Only log if the next run would occur within the next sleep interval
# and we haven't logged recently (rate limiting)
if delta_seconds <= 60 and (current_time.timestamp() - last_skip_log_time) >= 300: # 5 min rate limit
time_until_str = f"{int(delta_seconds)}s" if delta_seconds < 60 else f"{int(delta_seconds / 60)}m"
logger.info(
f"Scheduled run skipped: Web API is currently processing a request. "
f"Next run in {time_until_str} at {next_run_time.strftime('%I:%M %p')}"
)
last_skip_log_time = current_time.timestamp()
else:
try:
schedule.run_pending()
except Exception as ex:
logger.error(f"Error during scheduled run: {str(ex)}")
logger.stacktrace()
# Continue running instead of crashing
else:
try:
schedule.run_pending()
except Exception as ex:
logger.error(f"Error during scheduled run: {str(ex)}")
logger.stacktrace()
# Continue running instead of crashing
logger.trace(f" Pending Jobs: {schedule.get_jobs()}")
# Dynamic sleep: sleep less if next run is soon, but at least 10 seconds
if delta_seconds < 120: # If next run is within 2 minutes
sleep_time = max(10, min(30, delta_seconds / 2)) # Sleep 10-30 seconds
else:
sleep_time = 60 # Normal 60-second interval
logger.trace(f" Sleeping for {sleep_time:.0f} seconds (next run in {delta_seconds:.0f}s)")
time.sleep(sleep_time)
if web_process:
web_process.terminate()
web_process.join()
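The dynamic-sleep logic at the end of the loop above can be sketched in isolation. `compute_sleep_time` is an illustrative helper name (the project computes this inline); `delta_seconds` is the time until the next scheduled run:

```python
def compute_sleep_time(delta_seconds: float) -> float:
    """Sleep less when the next run is near, but never less than 10 seconds."""
    if delta_seconds < 120:  # next run within 2 minutes
        return max(10, min(30, delta_seconds / 2))  # clamp to the 10-30s window
    return 60.0  # normal polling interval


# e.g. compute_sleep_time(300) -> 60.0, compute_sleep_time(10) -> 10
```

Note that `float("inf")` (no scheduled runs) falls through to the normal 60-second interval, so the loop never busy-waits.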

uv.lock generated

@ -38,11 +38,11 @@ sdist = { url = "https://files.pythonhosted.org/packages/d8/72/e2ee9f8a93c92af1b
[[package]]
name = "certifi"
version = "2025.7.9"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/de/8a/c729b6b60c66a38f590c4e774decc4b2ec7b0576be8f1aa984a53ffa812a/certifi-2025.7.9.tar.gz", hash = "sha256:c1d2ec05395148ee10cf672ffc28cd37ea0ab0d99f9cc74c43e588cbd111b079", size = 160386, upload-time = "2025-07-09T02:13:58.874Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/66/f3/80a3f974c8b535d394ff960a11ac20368e06b736da395b551a49ce950cce/certifi-2025.7.9-py3-none-any.whl", hash = "sha256:d842783a14f8fdd646895ac26f719a061408834473cfc10203f6a575beb15d39", size = 159230, upload-time = "2025-07-09T02:13:57.007Z" },
]
[[package]]
@ -203,16 +203,16 @@ wheels = [
[[package]]
name = "fastapi"
version = "0.116.1"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "pydantic" },
{ name = "starlette" },
{ name = "typing-extensions" },
]
sdist = { url = "https://files.pythonhosted.org/packages/78/d7/6c8b3bfe33eeffa208183ec037fee0cce9f7f024089ab1c5d12ef04bd27c/fastapi-0.116.1.tar.gz", hash = "sha256:ed52cbf946abfd70c5a0dccb24673f0670deeb517a88b3544d03c2a6bf283143", size = 296485, upload-time = "2025-07-11T16:22:32.057Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/e5/47/d63c60f59a59467fda0f93f46335c9d18526d7071f025cb5b89d5353ea42/fastapi-0.116.1-py3-none-any.whl", hash = "sha256:c46ac7c312df840f0c9e220f7964bada936781bc4e2e6eb71f1c4d7553786565", size = 95631, upload-time = "2025-07-11T16:22:30.485Z" },
]
[[package]]
@ -329,7 +329,7 @@ wheels = [
[[package]]
name = "pydantic"
version = "2.11.7"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "annotated-types" },
@ -337,9 +337,9 @@ dependencies = [
{ name = "typing-extensions" },
{ name = "typing-inspection" },
]
sdist = { url = "https://files.pythonhosted.org/packages/00/dd/4325abf92c39ba8623b5af936ddb36ffcfe0beae70405d456ab1fb2f5b8c/pydantic-2.11.7.tar.gz", hash = "sha256:d989c3c6cb79469287b1569f7447a17848c998458d49ebe294e975b9baf0f0db", size = 788350, upload-time = "2025-06-14T08:33:17.137Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/6a/c0/ec2b1c8712ca690e5d61979dee872603e92b8a32f94cc1b72d53beab008a/pydantic-2.11.7-py3-none-any.whl", hash = "sha256:dde5df002701f6de26248661f6835bbe296a47bf73990135c7d07ce741b9623b", size = 444782, upload-time = "2025-06-14T08:33:14.905Z" },
]
[[package]]
@ -562,16 +562,16 @@ dev = [
requires-dist = [
{ name = "bencodepy", specifier = "==0.9.5" },
{ name = "croniter", specifier = "==6.0.0" },
{ name = "fastapi", specifier = "==0.116.1" },
{ name = "gitpython", specifier = "==3.1.44" },
{ name = "humanize", specifier = "==4.12.3" },
{ name = "pre-commit", marker = "extra == 'dev'", specifier = "==4.2.0" },
{ name = "pytimeparse2", specifier = "==1.7.1" },
{ name = "qbittorrent-api", specifier = "==2025.7.0" },
{ name = "requests", specifier = "==2.32.4" },
{ name = "retrying", specifier = "==1.4.0" },
{ name = "ruamel-yaml", specifier = "==0.18.14" },
{ name = "ruff", marker = "extra == 'dev'", specifier = "==0.12.3" },
{ name = "schedule", specifier = "==1.2.2" },
{ name = "uvicorn", specifier = "==0.35.0" },
]
@ -579,16 +579,16 @@ provides-extras = ["dev"]
[[package]]
name = "qbittorrent-api"
version = "2025.7.0"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "packaging" },
{ name = "requests" },
{ name = "urllib3" },
]
sdist = { url = "https://files.pythonhosted.org/packages/7b/a2/e6e303b70f2db5a2cc2997a611b424b24a7715e1ef360b0fdaf8007ef8ab/qbittorrent_api-2025.7.0.tar.gz", hash = "sha256:f462f2817559ccaa4c6cdc12694a3153efacd50fa3285a425401ab30abc0de76", size = 1321951, upload-time = "2025-07-12T18:45:33.754Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/e2/34/f9cb6f8ce8dbcd15d839d81f55846d33cee312f2f153d30dfbc7118380b4/qbittorrent_api-2025.7.0-py3-none-any.whl", hash = "sha256:6cd79af4757ec71a64e5216cbaa50bf2c436efea0d32b9a79585e0ec21e85ea9", size = 66041, upload-time = "2025-07-12T18:45:31.636Z" },
]
[[package]]
@ -682,27 +682,27 @@ wheels = [
[[package]]
name = "ruff"
version = "0.12.3"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/c3/2a/43955b530c49684d3c38fcda18c43caf91e99204c2a065552528e0552d4f/ruff-0.12.3.tar.gz", hash = "sha256:f1b5a4b6668fd7b7ea3697d8d98857390b40c1320a63a178eee6be0899ea2d77", size = 4459341, upload-time = "2025-07-11T13:21:16.086Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/e2/fd/b44c5115539de0d598d75232a1cc7201430b6891808df111b8b0506aae43/ruff-0.12.3-py3-none-linux_armv6l.whl", hash = "sha256:47552138f7206454eaf0c4fe827e546e9ddac62c2a3d2585ca54d29a890137a2", size = 10430499, upload-time = "2025-07-11T13:20:26.321Z" },
{ url = "https://files.pythonhosted.org/packages/43/c5/9eba4f337970d7f639a37077be067e4ec80a2ad359e4cc6c5b56805cbc66/ruff-0.12.3-py3-none-macosx_10_12_x86_64.whl", hash = "sha256:0a9153b000c6fe169bb307f5bd1b691221c4286c133407b8827c406a55282041", size = 11213413, upload-time = "2025-07-11T13:20:30.017Z" },
{ url = "https://files.pythonhosted.org/packages/e2/2c/fac3016236cf1fe0bdc8e5de4f24c76ce53c6dd9b5f350d902549b7719b2/ruff-0.12.3-py3-none-macosx_11_0_arm64.whl", hash = "sha256:fa6b24600cf3b750e48ddb6057e901dd5b9aa426e316addb2a1af185a7509882", size = 10586941, upload-time = "2025-07-11T13:20:33.046Z" },
{ url = "https://files.pythonhosted.org/packages/c5/0f/41fec224e9dfa49a139f0b402ad6f5d53696ba1800e0f77b279d55210ca9/ruff-0.12.3-py3-none-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:e2506961bf6ead54887ba3562604d69cb430f59b42133d36976421bc8bd45901", size = 10783001, upload-time = "2025-07-11T13:20:35.534Z" },
{ url = "https://files.pythonhosted.org/packages/0d/ca/dd64a9ce56d9ed6cad109606ac014860b1c217c883e93bf61536400ba107/ruff-0.12.3-py3-none-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:c4faaff1f90cea9d3033cbbcdf1acf5d7fb11d8180758feb31337391691f3df0", size = 10269641, upload-time = "2025-07-11T13:20:38.459Z" },
{ url = "https://files.pythonhosted.org/packages/63/5c/2be545034c6bd5ce5bb740ced3e7014d7916f4c445974be11d2a406d5088/ruff-0.12.3-py3-none-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:40dced4a79d7c264389de1c59467d5d5cefd79e7e06d1dfa2c75497b5269a5a6", size = 11875059, upload-time = "2025-07-11T13:20:41.517Z" },
{ url = "https://files.pythonhosted.org/packages/8e/d4/a74ef1e801ceb5855e9527dae105eaff136afcb9cc4d2056d44feb0e4792/ruff-0.12.3-py3-none-manylinux_2_17_ppc64.manylinux2014_ppc64.whl", hash = "sha256:0262d50ba2767ed0fe212aa7e62112a1dcbfd46b858c5bf7bbd11f326998bafc", size = 12658890, upload-time = "2025-07-11T13:20:44.442Z" },
{ url = "https://files.pythonhosted.org/packages/13/c8/1057916416de02e6d7c9bcd550868a49b72df94e3cca0aeb77457dcd9644/ruff-0.12.3-py3-none-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:12371aec33e1a3758597c5c631bae9a5286f3c963bdfb4d17acdd2d395406687", size = 12232008, upload-time = "2025-07-11T13:20:47.374Z" },
{ url = "https://files.pythonhosted.org/packages/f5/59/4f7c130cc25220392051fadfe15f63ed70001487eca21d1796db46cbcc04/ruff-0.12.3-py3-none-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:560f13b6baa49785665276c963edc363f8ad4b4fc910a883e2625bdb14a83a9e", size = 11499096, upload-time = "2025-07-11T13:20:50.348Z" },
{ url = "https://files.pythonhosted.org/packages/d4/01/a0ad24a5d2ed6be03a312e30d32d4e3904bfdbc1cdbe63c47be9d0e82c79/ruff-0.12.3-py3-none-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:023040a3499f6f974ae9091bcdd0385dd9e9eb4942f231c23c57708147b06311", size = 11688307, upload-time = "2025-07-11T13:20:52.945Z" },
{ url = "https://files.pythonhosted.org/packages/93/72/08f9e826085b1f57c9a0226e48acb27643ff19b61516a34c6cab9d6ff3fa/ruff-0.12.3-py3-none-musllinux_1_2_aarch64.whl", hash = "sha256:883d844967bffff5ab28bba1a4d246c1a1b2933f48cb9840f3fdc5111c603b07", size = 10661020, upload-time = "2025-07-11T13:20:55.799Z" },
{ url = "https://files.pythonhosted.org/packages/80/a0/68da1250d12893466c78e54b4a0ff381370a33d848804bb51279367fc688/ruff-0.12.3-py3-none-musllinux_1_2_armv7l.whl", hash = "sha256:2120d3aa855ff385e0e562fdee14d564c9675edbe41625c87eeab744a7830d12", size = 10246300, upload-time = "2025-07-11T13:20:58.222Z" },
{ url = "https://files.pythonhosted.org/packages/6a/22/5f0093d556403e04b6fd0984fc0fb32fbb6f6ce116828fd54306a946f444/ruff-0.12.3-py3-none-musllinux_1_2_i686.whl", hash = "sha256:6b16647cbb470eaf4750d27dddc6ebf7758b918887b56d39e9c22cce2049082b", size = 11263119, upload-time = "2025-07-11T13:21:01.503Z" },
{ url = "https://files.pythonhosted.org/packages/92/c9/f4c0b69bdaffb9968ba40dd5fa7df354ae0c73d01f988601d8fac0c639b1/ruff-0.12.3-py3-none-musllinux_1_2_x86_64.whl", hash = "sha256:e1417051edb436230023575b149e8ff843a324557fe0a265863b7602df86722f", size = 11746990, upload-time = "2025-07-11T13:21:04.524Z" },
{ url = "https://files.pythonhosted.org/packages/fe/84/7cc7bd73924ee6be4724be0db5414a4a2ed82d06b30827342315a1be9e9c/ruff-0.12.3-py3-none-win32.whl", hash = "sha256:dfd45e6e926deb6409d0616078a666ebce93e55e07f0fb0228d4b2608b2c248d", size = 10589263, upload-time = "2025-07-11T13:21:07.148Z" },
{ url = "https://files.pythonhosted.org/packages/07/87/c070f5f027bd81f3efee7d14cb4d84067ecf67a3a8efb43aadfc72aa79a6/ruff-0.12.3-py3-none-win_amd64.whl", hash = "sha256:a946cf1e7ba3209bdef039eb97647f1c77f6f540e5845ec9c114d3af8df873e7", size = 11695072, upload-time = "2025-07-11T13:21:11.004Z" },
{ url = "https://files.pythonhosted.org/packages/e0/30/f3eaf6563c637b6e66238ed6535f6775480db973c836336e4122161986fc/ruff-0.12.3-py3-none-win_arm64.whl", hash = "sha256:5f9c7c9c8f84c2d7f27e93674d27136fbf489720251544c4da7fb3d742e011b1", size = 10805855, upload-time = "2025-07-11T13:21:13.547Z" },
]
[[package]]
@ -743,24 +743,24 @@ wheels = [
[[package]]
name = "starlette"
version = "0.47.1"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "anyio" },
{ name = "typing-extensions", marker = "python_full_version < '3.13'" },
]
sdist = { url = "https://files.pythonhosted.org/packages/0a/69/662169fdb92fb96ec3eaee218cf540a629d629c86d7993d9651226a6789b/starlette-0.47.1.tar.gz", hash = "sha256:aef012dd2b6be325ffa16698f9dc533614fb1cebd593a906b90dc1025529a79b", size = 2583072, upload-time = "2025-06-21T04:03:17.337Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/82/95/38ef0cd7fa11eaba6a99b3c4f5ac948d8bc6ff199aabd327a29cc000840c/starlette-0.47.1-py3-none-any.whl", hash = "sha256:5e11c9f5c7c3f24959edbf2dffdc01bba860228acf657129467d8a7468591527", size = 72747, upload-time = "2025-06-21T04:03:15.705Z" },
]
[[package]]
name = "typing-extensions"
version = "4.14.1"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/98/5a/da40306b885cc8c09109dc2e1abd358d5684b1425678151cdaed4731c822/typing_extensions-4.14.1.tar.gz", hash = "sha256:38b39f4aeeab64884ce9f74c94263ef78f3c22467c8724005483154c26648d36", size = 107673, upload-time = "2025-07-04T13:28:34.16Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/b5/00/d631e67a838026495268c2f6884f3711a15a9a2a96cd244fdaea53b823fb/typing_extensions-4.14.1-py3-none-any.whl", hash = "sha256:d1e1e3b58374dc93031d6eda2420a48ea44a36c2b4766a4fdeb3710755731d76", size = 43906, upload-time = "2025-07-04T13:28:32.743Z" },
]
[[package]]
@ -777,11 +777,11 @@ wheels = [
[[package]]
name = "urllib3"
version = "2.5.0"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/15/22/9ee70a2574a4f4599c47dd506532914ce044817c7752a79b6a51286319bc/urllib3-2.5.0.tar.gz", hash = "sha256:3fc47733c7e419d4bc3f6b3dc2b4f890bb743906a30d56ba4a5bfa4bbff92760", size = 393185, upload-time = "2025-06-18T14:07:41.644Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/a7/c2/fe1e52489ae3122415c51f387e221dd0773709bad6c6cdaa599e8a2c5185/urllib3-2.5.0-py3-none-any.whl", hash = "sha256:e6b01673c0fa6a13e374b50871808eb3bf7046c4b125b216f6bf1cc604cff0dc", size = 129795, upload-time = "2025-06-18T14:07:40.39Z" },
]
[[package]]


@ -4,6 +4,10 @@
flex-direction: column;
gap: var(--spacing-md);
}
.array-field .add-array-item {
margin-top: var(--spacing-sm);
margin-bottom: var(--spacing-md);
}
.array-item {
display: flex;


@ -81,9 +81,9 @@
.form-select:focus,
.form-textarea:focus {
outline: none;
border-color: var(--input-focus-border);
box-shadow: 0 0 0 3px var(--input-focus-ring);
background-color: var(--input-bg); /* Keep same bg on focus */
}
.form-input:disabled,
@ -184,6 +184,7 @@
.password-input-group .form-input {
flex: 1;
z-index: 1;
}
.password-input-group .password-toggle {
@ -199,6 +200,7 @@
display: flex;
align-items: center;
justify-content: center;
z-index: 2;
}
.password-input-group .password-toggle:hover {
@ -214,3 +216,60 @@
pointer-events: none;
fill: currentColor;
}
/* Hide browser's built-in password visibility toggle */
.hide-password-toggle::-webkit-textfield-decoration-container,
.hide-password-toggle::-webkit-credentials-auto-fill-button,
.hide-password-toggle::-webkit-strong-password-auto-fill-button {
display: none !important;
}
.hide-password-toggle::-ms-reveal,
.hide-password-toggle::-ms-clear {
display: none !important;
}
/* Notifications section optimizations */
.field-section {
margin-bottom: 1.5rem;
}
.webhook-sections-container {
border-top: 1px solid var(--border-color);
padding-top: 1rem;
margin-top: 1rem;
}
.function-webhooks-lazy {
margin-top: 1rem;
border: 1px dashed var(--border-color);
border-radius: var(--border-radius);
background: var(--background-secondary);
}
.lazy-load-placeholder {
padding: 1rem;
text-align: center;
cursor: pointer;
color: var(--text-secondary);
transition: all var(--transition-fast);
user-select: none;
}
.lazy-load-placeholder:hover {
background: var(--background-hover);
color: var(--text-primary);
}
.lazy-content.hidden {
display: none;
}
/* Reduce DOM complexity for dynamic select fields */
.dynamic-select-text-group {
contain: layout style;
}
.dynamic-text-input {
will-change: display;
}


@ -332,6 +332,77 @@
border-radius: var(--border-radius-xl) var(--border-radius-xl) 0 0;
}
.share-limit-modal .modal-header-content {
display: flex;
align-items: center;
justify-content: space-between;
gap: var(--spacing-lg);
}
.share-limit-modal .group-name-section {
flex: 1;
min-width: 0;
}
.share-limit-modal .group-name-label {
display: block;
font-size: var(--font-size-sm);
font-weight: 600;
color: var(--text-secondary);
margin-bottom: var(--spacing-xs);
text-transform: uppercase;
letter-spacing: 0.5px;
}
.share-limit-modal .group-name-input-wrapper {
position: relative;
display: flex;
align-items: center;
}
.share-limit-modal .group-name-input {
flex: 1;
font-size: var(--font-size-xl);
font-weight: 600;
color: var(--text-primary);
background: transparent;
border: 2px solid transparent;
border-radius: var(--border-radius);
padding: var(--spacing-sm) var(--spacing-md);
padding-right: 40px;
transition: all var(--transition-fast);
min-width: 0;
}
.share-limit-modal .group-name-input:hover {
background: var(--bg-primary);
border-color: var(--border-color);
}
.share-limit-modal .group-name-input:focus {
outline: none;
background: var(--bg-primary);
border-color: var(--primary);
box-shadow: 0 0 0 3px rgba(var(--primary-rgb), 0.1);
}
.share-limit-modal .group-name-edit-icon {
position: absolute;
right: var(--spacing-sm);
color: var(--text-muted);
pointer-events: none;
transition: color var(--transition-fast);
}
.share-limit-modal .group-name-input:hover + .group-name-edit-icon,
.share-limit-modal .group-name-input:focus + .group-name-edit-icon {
color: var(--primary);
}
.share-limit-modal .group-name-edit-icon .material-icons {
font-size: 18px;
}
.share-limit-modal .modal-header h3 {
color: var(--text-primary);
font-size: var(--font-size-xl);


@ -642,7 +642,7 @@ body {
.checkmark {
width: 1.25rem;
height: 1.25rem;
border: 2px solid var(--input-border);
border-radius: var(--border-radius-sm);
background-color: var(--bg-primary);
transition: all var(--transition-fast);


@ -119,7 +119,7 @@
--info-hover: #22d3ee;
--info-light: #164e63;
/* Background Colors - Dark theme */
--bg-primary: #0f172a;
--bg-secondary: #1e293b;
--bg-tertiary: #334155;


@ -4,15 +4,15 @@
<meta charset="UTF-8">
<meta name="viewport" content="width=device-width, initial-scale=1.0">
<title>qBit Manage - Configuration Manager</title>
<link rel="stylesheet" href="static/css/main.css">
<link rel="stylesheet" href="static/css/components.css">
<link rel="stylesheet" href="static/css/themes.css">
<link rel="stylesheet" href="static/css/responsive.css">
<link rel="apple-touch-icon" sizes="180x180" href="static/img/apple-touch-icon.png">
<link rel="icon" type="image/png" sizes="32x32" href="static/img/favicon-32x32.png">
<link rel="icon" type="image/png" sizes="16x16" href="static/img/favicon-16x16.png">
<link rel="manifest" href="static/img/site.webmanifest">
<link rel="shortcut icon" href="static/img/favicon.ico">
<!-- Material Icons -->
<link href="https://fonts.googleapis.com/icon?family=Material+Icons" rel="stylesheet">
</head>
@ -25,7 +25,7 @@
<button class="sidebar-toggle" id="sidebar-toggle" aria-label="Toggle sidebar">
<span class="material-icons">menu_open</span>
</button>
<img src="static/img/qbm_logo.png" alt="qBit Manage" class="logo">
<h1 class="app-title">qBit Manage</h1>
</div>
@ -275,7 +275,7 @@
<div id="toast-container" class="toast-container"></div>
<!-- Scripts -->
<script src="static/js/components/header.js"></script>
<script type="module" src="static/js/app.js"></script>
</body>
</html>


@ -13,6 +13,13 @@ class API {
this.supportsBackups = true; // Assume backups are supported until proven otherwise
}
/**
* Set the base URL for API requests
*/
setBaseUrl(baseUrl) {
this.baseUrl = baseUrl;
}
/**
* Make HTTP request with error handling
*/
@ -59,14 +66,18 @@ class API {
* GET request
*/
async get(endpoint, params = {}) {
// Build query string if params exist
const queryParams = new URLSearchParams();
Object.keys(params).forEach(key => {
if (params[key] !== undefined && params[key] !== null) {
queryParams.append(key, params[key]);
}
});
const queryString = queryParams.toString();
const fullEndpoint = endpoint + (queryString ? `?${queryString}` : '');
return this.request(fullEndpoint, { method: 'GET' });
}
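The rewritten `get` method filters out `null`/`undefined` parameters before encoding them with `URLSearchParams`. The same filter-then-encode pattern can be sketched in Python with the standard library (`build_endpoint` is an illustrative name, not part of the project):

```python
from urllib.parse import urlencode


def build_endpoint(endpoint: str, params: dict) -> str:
    """Append only non-None params as a query string (mirrors API.get above)."""
    filtered = {k: v for k, v in params.items() if v is not None}
    query = urlencode(filtered)
    return f"{endpoint}?{query}" if query else endpoint
```

Building the query string by hand (rather than via `new URL(...)`) keeps the endpoint relative, which is what makes the base-URL prefix for reverse proxies work.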
/**


@ -41,6 +41,9 @@ class QbitManageApp {
// Initialize modal system
initModal();
// Fetch app configuration including base URL
await this.fetchAppConfig();
// Initialize components
this.initComponents();
@ -80,6 +83,32 @@ class QbitManageApp {
}
}
async fetchAppConfig() {
try {
// Construct the API URL based on current location
const currentPath = window.location.pathname;
let basePath = currentPath.endsWith('/') ? currentPath.slice(0, -1) : currentPath;
if (basePath === '') basePath = '';
const configUrl = basePath + '/api/get_base_url';
const response = await fetch(configUrl);
if (!response.ok) {
throw new Error(`HTTP error! status: ${response.status}`);
}
const config = await response.json();
// Update the API instance with the base URL
if (config.baseUrl) {
const baseUrl = config.baseUrl.startsWith('/') ? config.baseUrl : '/' + config.baseUrl;
this.api.setBaseUrl(baseUrl);
}
} catch (error) {
console.error('Failed to fetch app configuration:', error);
// Continue with default configuration
}
}
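`fetchAppConfig` normalizes the configured base URL so it always carries exactly one leading slash before handing it to `api.setBaseUrl`. A minimal Python sketch of that normalization (`normalize_base_url` is a hypothetical helper, not in the codebase):

```python
def normalize_base_url(base_url: str) -> str:
    """Ensure a configured base URL starts with a single leading slash."""
    if not base_url:
        return ""
    return base_url if base_url.startswith("/") else "/" + base_url
```

This is what lets users configure either `qbm` or `/qbm` for a reverse-proxy prefix and get the same result.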
async fetchVersion() {
try {
const response = await this.api.getVersion();
@ -277,6 +306,13 @@ class QbitManageApp {
// Clear existing options
configSelect.innerHTML = '';
// Check if response is valid and has configs
if (!response || !response.configs) {
console.error('Invalid response from listConfigs:', response);
configSelect.innerHTML = '<option value="">Error loading configurations</option>';
return;
}
if (response.configs.length === 0) {
configSelect.innerHTML = '<option value="">No configurations found</option>';
return;


@ -120,6 +120,14 @@ class CommandPanel {
data-command="rem_unregistered">
🗑 Remove Unregistered
</button>
<button type="button" class="btn btn-outline quick-action-btn"
data-command="tag_tracker_error">
Tag Tracker Errors
</button>
<button type="button" class="btn btn-outline quick-action-btn"
data-command="tag_nohardlinks">
🔗 Tag No Hard Links
</button>
<button type="button" class="btn btn-outline quick-action-btn"
data-command="share_limits">
Apply Share Limits

View file

@ -87,6 +87,13 @@ class ConfigForm {
this.currentData._originalNohardlinksFormat = 'object';
}
// Ensure "Uncategorized" category exists for 'cat' section
if (sectionName === 'cat') {
if (!this.currentData.hasOwnProperty('Uncategorized')) {
this.currentData['Uncategorized'] = [];
}
}
await this.renderSection();
// Removed this.validateSection() from here to prevent premature validation display
}
@ -123,10 +130,30 @@ class ConfigForm {
}
}
// Initialize lazy loading for notifications section
if (this.currentSection === 'notifications') {
this.initializeLazyLoading();
}
// Populate category dropdowns after rendering is complete
await this.populateCategoryDropdowns();
}
initializeLazyLoading() {
const lazyContainers = this.container.querySelectorAll('.function-webhooks-lazy');
lazyContainers.forEach(container => {
const placeholder = container.querySelector('.lazy-load-placeholder');
const content = container.querySelector('.lazy-content');
if (placeholder && content) {
placeholder.addEventListener('click', () => {
placeholder.style.display = 'none';
content.classList.remove('hidden');
});
}
});
}
bindEvents() {
if (!this.container) return;
@ -281,15 +308,13 @@ class ConfigForm {
} else if (e.target.type === 'number') {
value = e.target.value ? parseFloat(e.target.value) : null;
} else if (e.target.classList.contains('array-item-input')) {
// Extract base fieldName (e.g., "trackerKey::propName") and index from name (e.g., "trackerKey::propName[index]")
const fullFieldName = e.target.name;
const match = fullFieldName.match(/(.*)\[(\d+)\]$/);
if (match) {
const baseFieldName = match[1]; // e.g., "animebytes.tv::tag"
const index = match[2]; // e.g., "0"
const baseFieldName = match[1];
const index = match[2];
this.updateArrayValue(baseFieldName, index, e.target.value);
} else {
// Fallback if name doesn't match expected array format (shouldn't happen if form-renderer is correct)
this.updateArrayValue(fieldName, e.target.dataset.index, e.target.value);
}
return;
@ -350,37 +375,29 @@ class ConfigForm {
}
updateArrayValue(fieldName, index, value) {
// Check if this is a complex object field (e.g., "trackerKey::propName")
const isComplexObjectField = fieldName.includes('::');
let currentArray;
if (isComplexObjectField) {
const [entryKey, propName] = fieldName.split('::');
if (!this.currentData[entryKey]) {
this.currentData[entryKey] = {};
}
currentArray = this.currentData[entryKey][propName] || [];
currentArray[parseInt(index)] = value;
this.currentData[entryKey][propName] = currentArray;
} else {
// Handle complex object fields (containing ::) directly
if (fieldName.includes('::')) {
const [entryKey, propName] = fieldName.split('::');
if (!this.currentData[entryKey]) {
this.currentData[entryKey] = {};
// Handle Uncategorized array
if (fieldName === 'Uncategorized') {
currentArray = this.currentData[fieldName] || [];
if (!Array.isArray(currentArray)) {
currentArray = [currentArray];
}
currentArray = this.currentData[entryKey][propName] || [];
currentArray[parseInt(index)] = value;
this.currentData[entryKey][propName] = currentArray;
} else {
currentArray = getNestedValue(this.currentData, fieldName) || [];
currentArray[parseInt(index)] = value;
setNestedValue(this.currentData, fieldName, currentArray);
}
currentArray[parseInt(index)] = value;
setNestedValue(this.currentData, fieldName, currentArray);
}
this.onDataChange(this.currentData);
this._dispatchDirtyEvent();
@ -458,21 +475,23 @@ class ConfigForm {
const arrayField = this.container.querySelector(`[data-field="${fieldName}"] .array-items`);
let currentArray;
// Handle complex object fields (containing ::) directly
if (fieldName.includes('::')) {
const [entryKey, propName] = fieldName.split('::');
if (!this.currentData[entryKey]) {
this.currentData[entryKey] = {};
}
currentArray = this.currentData[entryKey][propName] || [];
const newIndex = currentArray.length;
currentArray.push('');
this.currentData[entryKey][propName] = currentArray;
} else {
currentArray = getNestedValue(this.currentData, fieldName) || [];
const newIndex = currentArray.length;
if (fieldName === 'Uncategorized') {
currentArray = this.currentData[fieldName] || [];
if (!Array.isArray(currentArray)) {
currentArray = [currentArray];
}
} else {
currentArray = getNestedValue(this.currentData, fieldName) || [];
}
currentArray.push('');
setNestedValue(this.currentData, fieldName, currentArray);
}
@ -528,18 +547,21 @@ class ConfigForm {
const fieldName = item.querySelector('.array-item-input').dataset.field;
const index = parseInt(item.dataset.index);
// Handle complex object fields (containing ::) directly
if (fieldName.includes('::')) {
const [entryKey, propName] = fieldName.split('::');
if (!this.currentData[entryKey]) {
this.currentData[entryKey] = {};
if (this.currentData[entryKey] && this.currentData[entryKey][propName]) {
this.currentData[entryKey][propName].splice(index, 1);
}
const currentArray = this.currentData[entryKey][propName] || [];
currentArray.splice(index, 1);
this.currentData[entryKey][propName] = currentArray;
} else {
const currentArray = getNestedValue(this.currentData, fieldName) || [];
let currentArray;
if (fieldName === 'Uncategorized') {
currentArray = this.currentData[fieldName] || [];
if (!Array.isArray(currentArray)) {
currentArray = [currentArray];
}
} else {
currentArray = getNestedValue(this.currentData, fieldName) || [];
}
currentArray.splice(index, 1);
setNestedValue(this.currentData, fieldName, currentArray);
}
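The `Uncategorized` branches above repeat one normalization pattern: the stored value may be a single string (legacy) or an array of paths (new multi-path support), so it is coerced to an array before indexing. A hedged sketch of that pattern as a standalone helper (hypothetical name):

```javascript
// Coerce a possibly-scalar config value into an array so splice/push
// and index assignment are always safe.
function toArray(value) {
  if (value === undefined || value === null) return [];
  return Array.isArray(value) ? value : [value];
}

console.log(toArray('/data/torrents')); // ["/data/torrents"]
console.log(toArray(['/a', '/b']));     // ["/a", "/b"]
console.log(toArray(undefined));        // []
```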
@ -738,9 +760,13 @@ class ConfigForm {
// Handle flat string values (like categories)
if (sectionConfig.flatStringValues) {
const defaultSchema = sectionConfig.additionalProperties || sectionConfig.patternProperties[".*"];
const defaultValue = defaultSchema?.default || '';
this.currentData[newKey] = defaultValue;
if (newKey === 'Uncategorized') {
this.currentData[newKey] = [];
} else {
const defaultSchema = sectionConfig.additionalProperties || sectionConfig.patternProperties[".*"];
const defaultValue = defaultSchema?.default || '';
this.currentData[newKey] = defaultValue;
}
} else {
// Initialize with default values based on schema for object entries
const newEntry = {};
@ -985,35 +1011,43 @@ class ConfigForm {
return;
}
const updateValue = () => {
if (selectElement.value === 'webhook') {
textInput.style.display = 'block';
textInput.required = true;
hiddenInput.value = textInput.value;
} else {
textInput.style.display = 'none';
textInput.required = false;
textInput.value = '';
hiddenInput.value = selectElement.value;
}
};
// Update the display and values
updateValue();
// Add input listener to text field if not already added
if (!textInput.hasAttribute('data-listener-added')) {
textInput.addEventListener('input', () => {
// Performance optimization: use requestAnimationFrame for DOM updates
requestAnimationFrame(() => {
const updateValue = () => {
if (selectElement.value === 'webhook') {
textInput.style.display = 'block';
textInput.required = true;
hiddenInput.value = textInput.value;
} else {
textInput.style.display = 'none';
textInput.required = false;
textInput.value = '';
hiddenInput.value = selectElement.value;
}
});
textInput.setAttribute('data-listener-added', 'true');
}
};
// Trigger form change event for the hidden input
const changeEvent = new Event('input', { bubbles: true });
hiddenInput.dispatchEvent(changeEvent);
// Update the display and values
updateValue();
// Add input listener to text field if not already added
if (!textInput.hasAttribute('data-listener-added')) {
// Debounce text input to reduce excessive updates
let debounceTimer;
textInput.addEventListener('input', () => {
clearTimeout(debounceTimer);
debounceTimer = setTimeout(() => {
if (selectElement.value === 'webhook') {
hiddenInput.value = textInput.value;
}
}, 150); // 150ms debounce
});
textInput.setAttribute('data-listener-added', 'true');
}
// Trigger form change event for the hidden input
const changeEvent = new Event('input', { bubbles: true });
hiddenInput.dispatchEvent(changeEvent);
});
}
resetSection() {

View file

@ -13,7 +13,7 @@ class LogViewer {
this.autoRefreshInterval = parseInt(localStorage.getItem('qbm-log-refresh-interval') || '0'); // Default to 0 (no auto-refresh)
this.autoRefreshTimer = null;
this.currentLogFile = localStorage.getItem('qbm-selected-log-file') || 'qbit_manage.log'; // Default log file
this.currentLogLimit = parseInt(localStorage.getItem('qbm-log-limit') || '0'); // Default to 0 (all logs)
this.currentLogLimit = parseInt(localStorage.getItem('qbm-log-limit') || '50'); // Default to 50 lines
this.api = new API();
this.logs = [];
@ -161,7 +161,8 @@ class LogViewer {
});
if (logLimitSelect) {
logLimitSelect.value = this.currentLogLimit; // Set initial value
// Always set the dropdown value to match currentLogLimit
logLimitSelect.value = this.currentLogLimit;
logLimitSelect.addEventListener('change', (e) => {
this.currentLogLimit = parseInt(e.target.value);
localStorage.setItem('qbm-log-limit', this.currentLogLimit);

View file

@ -387,8 +387,8 @@ export class ShareLimitsComponent {
<div class="floating-label-group">
<input type="text" id="group-name-input" class="form-input"
placeholder=" " autofocus maxlength="50"
pattern="[a-zA-Z0-9_\\-]+"
title="Only letters, numbers, underscores, and hyphens are allowed">
pattern="[a-zA-Z0-9_\\-\\s]+"
title="Only letters, numbers, spaces, underscores, and hyphens are allowed">
<label for="group-name-input" class="floating-label">Enter a unique group name</label>
</div>
<div class="form-help">
@ -408,8 +408,8 @@ export class ShareLimitsComponent {
const value = input ? input.value.trim() : '';
// Validate input
if (value && !/^[a-zA-Z0-9_-]+$/.test(value)) {
showToast('Group name can only contain letters, numbers, underscores, and hyphens', 'error');
if (value && !/^[a-zA-Z0-9\s_-]+$/.test(value)) {
showToast('Group name can only contain letters, numbers, spaces, underscores, and hyphens', 'error');
resolve(null);
return;
}
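The widened pattern in this hunk adds `\s` to the allowed character class so group names may contain spaces. A quick illustration of the new acceptance behavior (note that `\s` matches any whitespace, not only the space character):

```javascript
// Updated share-limit group name validation pattern from the diff above.
const GROUP_NAME_RE = /^[a-zA-Z0-9\s_-]+$/;

console.log(GROUP_NAME_RE.test('My Group'));  // true (space now allowed)
console.log(GROUP_NAME_RE.test('group_1'));   // true
console.log(GROUP_NAME_RE.test('bad/name'));  // false (slash rejected)
```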
@ -426,7 +426,7 @@ export class ShareLimitsComponent {
if (input) {
input.addEventListener('input', (e) => {
const value = e.target.value;
const isValid = /^[a-zA-Z0-9_-]*$/.test(value);
const isValid = /^[a-zA-Z0-9\s_-]*$/.test(value);
if (!isValid && value) {
e.target.style.borderColor = 'var(--error)';
@ -504,10 +504,23 @@ export class ShareLimitsComponent {
<div class="modal-overlay share-limit-modal" id="${modalId}">
<div class="modal">
<div class="modal-header">
<h3>Edit Share Limit Group: ${key}</h3>
<button type="button" class="btn btn-icon modal-close-btn btn-close-icon">
<svg class="icon" viewBox="0 0 24 24"><path d="M19 6.41L17.59 5 12 10.59 6.41 5 5 6.41 10.59 12 5 17.59 6.41 19 12 13.41 17.59 19 19 17.59 13.41 12z"/></svg>
</button>
<div class="modal-header-content">
<div class="group-name-section">
<label for="group-name-edit" class="group-name-label">Group Name:</label>
<div class="group-name-input-wrapper">
<input type="text" id="group-name-edit" class="group-name-input"
value="${key}" maxlength="50"
pattern="[a-zA-Z0-9_\\-\\s]+"
title="Only letters, numbers, spaces, underscores, and hyphens are allowed">
<span class="group-name-edit-icon">
<span class="material-icons">edit</span>
</span>
</div>
</div>
<button type="button" class="btn btn-icon modal-close-btn btn-close-icon">
<svg class="icon" viewBox="0 0 24 24"><path d="M19 6.41L17.59 5 12 10.59 6.41 5 5 6.41 10.59 12 5 17.59 6.41 19 12 13.41 17.59 19 19 17.59 13.41 12z"/></svg>
</button>
</div>
</div>
<div class="modal-content">
${modalContent}
@ -547,6 +560,7 @@ export class ShareLimitsComponent {
const closeBtn = modalElement.querySelector('.modal-close-btn');
const cancelBtn = modalElement.querySelector('.modal-cancel-btn');
const saveBtn = modalElement.querySelector('.modal-save-btn');
const groupNameInput = modalElement.querySelector('#group-name-edit');
const closeModal = () => {
modal.classList.add('hidden');
@ -557,6 +571,22 @@ export class ShareLimitsComponent {
}, 300);
};
// Add real-time validation for group name input
if (groupNameInput) {
groupNameInput.addEventListener('input', (e) => {
const value = e.target.value;
const isValid = /^[a-zA-Z0-9\s_-]*$/.test(value);
if (!isValid && value) {
e.target.style.borderColor = 'var(--error)';
e.target.style.boxShadow = '0 0 0 3px rgba(239, 68, 68, 0.1)';
} else {
e.target.style.borderColor = '';
e.target.style.boxShadow = '';
}
});
}
// Ensure buttons exist before adding event listeners
if (closeBtn) {
closeBtn.addEventListener('click', closeModal);
@ -575,6 +605,26 @@ export class ShareLimitsComponent {
saveBtn.addEventListener('click', () => {
const formData = this.collectFormData(modalElement);
// Get the new group name
const newGroupName = groupNameInput ? groupNameInput.value.trim() : key;
// Validate group name
if (!newGroupName) {
showToast('Group name cannot be empty', 'error');
return;
}
if (!/^[a-zA-Z0-9\s_-]+$/.test(newGroupName)) {
showToast('Group name can only contain letters, numbers, spaces, underscores, and hyphens', 'error');
return;
}
// Check if group name already exists (only if it's different from current)
if (newGroupName !== key && this.data[newGroupName]) {
showToast('A group with this name already exists', 'error');
return;
}
// Validate priority uniqueness
const newPriority = formData.priority;
const priorityError = this.validatePriorityUniqueness(newPriority, key);
@ -617,14 +667,24 @@ export class ShareLimitsComponent {
});
}
this.data[key] = cleanedData;
// Handle group name change
if (newGroupName !== key) {
// Remove old group and add new one
delete this.data[key];
this.data[newGroupName] = cleanedData;
} else {
// Update existing group
this.data[key] = cleanedData;
}
this.onDataChange(this.data);
closeModal();
// Refresh display after modal is closed to avoid timing issues
setTimeout(() => {
this.refreshDisplay();
showToast(`Share limit group "${key}" updated`, 'success');
const actionText = newGroupName !== key ? 'renamed and updated' : 'updated';
showToast(`Share limit group "${newGroupName}" ${actionText}`, 'success');
}, 350); // Wait for modal close animation to complete
});
}
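The rename path above moves the cleaned group data to the new key and deletes the old one. A minimal sketch of that step under the same assumptions (function name is illustrative; the real handler also validates and refreshes the display):

```javascript
// Move a group's data to a new key, or update in place if unchanged.
// Design note: deleting and re-adding moves the renamed group to the end
// of the object's insertion order.
function renameGroup(data, oldKey, newKey, cleanedData) {
  if (newKey !== oldKey) {
    delete data[oldKey];
    data[newKey] = cleanedData;
  } else {
    data[oldKey] = cleanedData;
  }
  return data;
}

const groups = { noHL: { max_ratio: 2 }, seed: { max_ratio: -1 } };
renameGroup(groups, 'noHL', 'no hardlinks', { max_ratio: 2 });
console.log(Object.keys(groups)); // ["seed", "no hardlinks"]
```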
@ -944,7 +1004,7 @@ export class ShareLimitsComponent {
},
{
title: 'Upload Speed Limits',
fields: ['limit_upload_speed', 'enable_group_upload_speed']
fields: ['limit_upload_speed', 'enable_group_upload_speed', 'reset_upload_speed_on_unmet_minimums']
},
{
title: 'Tag Filters',
@ -1156,6 +1216,7 @@ export class ShareLimitsComponent {
'exclude_any_tags': '<span class="material-icons">remove_circle</span>',
'min_num_seeds': '<span class="material-icons">group</span>',
'enable_group_upload_speed': '<span class="material-icons">speed</span>',
'reset_upload_speed_on_unmet_minimums': '<span class="material-icons">refresh</span>',
'resume_torrent_after_change': '<span class="material-icons">play_arrow</span>',
'add_group_to_tag': '<span class="material-icons">add_circle</span>',
'max_last_active': '<span class="material-icons">access_time</span>',

View file

@ -1,23 +1,23 @@
export const catSchema = {
title: 'Categories Configuration',
description: 'Configure torrent categories and their rules',
title: 'Categories',
description: 'Define categories and their associated save paths. All save paths in qBittorrent must be defined here. You can use `*` as a wildcard for subdirectories.',
type: 'complex-object',
keyLabel: 'Category Name',
keyDescription: 'category name',
keyDescription: 'Name of the category as it appears in qBittorrent.',
// Special handling for flat string values (category: path format)
flatStringValues: true,
patternProperties: {
".*": {
type: 'string',
label: 'Save Path',
description: 'Directory path for this category',
description: 'The absolute path where torrents in this category should be saved.',
default: ''
}
},
additionalProperties: {
type: 'string',
label: 'Save Path',
description: 'Directory path for this category',
description: 'The absolute path where torrents in this category should be saved.',
default: ''
}
};
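The schema above describes a flat category-to-save-path map (`flatStringValues`); with the new feature in this release, the `Uncategorized` key may instead hold multiple paths. A hypothetical example of data conforming to this shape (paths are placeholders):

```javascript
// Example data matching catSchema: regular categories map to one save path
// string; "Uncategorized" may be an array of paths. A trailing "*" acts as
// a wildcard for subdirectories, per the schema description.
const catData = {
  Movies: '/data/torrents/movies',
  TV: '/data/torrents/tv/*',
  Uncategorized: ['/data/torrents', '/data/incoming'],
};

console.log(typeof catData.Movies);                // "string"
console.log(Array.isArray(catData.Uncategorized)); // true
```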

View file

@ -1,6 +1,6 @@
export const catChangeSchema = {
title: 'Category Change Configuration',
description: 'Moves all the torrents from one category to another category if the torrents are marked as complete.',
title: 'Category Changes',
description: 'Move torrents from one category to another after they are marked as complete. Be cautious, as this can cause data to be moved if "Default Torrent Management Mode" is set to automatic in qBittorrent.',
type: 'dynamic-key-value-list',
useCategoryDropdown: true, // Flag to indicate this should use category dropdown for keys
fields: [

View file

@ -1,61 +1,61 @@
export const commandsSchema = {
title: 'Commands Configuration',
description: 'Configure which commands to run',
title: 'Commands',
description: 'Enable or disable specific commands to be executed during a run. These settings can be overridden by command-line arguments or environment variables.',
fields: [
{
name: 'recheck',
type: 'boolean',
label: 'Recheck Torrents',
description: 'Force recheck of all torrents',
description: 'Recheck paused torrents, sorted by lowest size. Resumes the torrent if it has completed.',
default: false
},
{
name: 'cat_update',
type: 'boolean',
label: 'Update Categories',
description: 'Update torrent categories based on rules',
description: 'Update torrent categories based on specified rules and move torrents between categories.',
default: false
},
{
name: 'tag_update',
type: 'boolean',
label: 'Update Tags',
description: 'Update torrent tags based on rules',
description: 'Update torrent tags, set seed goals, and limit upload speed by tag.',
default: false
},
{
name: 'rem_unregistered',
type: 'boolean',
label: 'Remove Unregistered',
description: 'Remove torrents that are unregistered with tracker',
description: 'Remove torrents that are unregistered with the tracker. Deletes data if not cross-seeded.',
default: false
},
{
name: 'rem_orphaned',
type: 'boolean',
label: 'Remove Orphaned',
description: 'Remove orphaned files from filesystem',
description: 'Scan for and remove orphaned files from your root directory that are not referenced by any torrents.',
default: false
},
{
name: 'tag_tracker_error',
type: 'boolean',
label: 'Tag Tracker Errors',
description: 'Tag torrents with tracker errors',
description: 'Tag torrents that have a non-working tracker.',
default: false
},
{
name: 'tag_nohardlinks',
type: 'boolean',
label: 'Tag No Hard Links',
description: 'Tag torrents without hard links',
description: 'Tag torrents that do not have any hard links, useful for managing files from Sonarr/Radarr.',
default: false
},
{
name: 'share_limits',
type: 'boolean',
label: 'Apply Share Limits',
description: 'Apply share ratio and time limits',
description: 'Apply share limits to torrents based on priority and grouping criteria.',
default: false
},
{
@ -66,21 +66,21 @@ export const commandsSchema = {
name: 'skip_cleanup',
type: 'boolean',
label: 'Skip Cleanup',
description: 'Skip cleanup operations',
description: 'Skip emptying the Recycle Bin and Orphaned directories.',
default: false
},
{
name: 'dry_run',
type: 'boolean',
label: 'Dry Run',
description: 'Simulate command execution without making actual changes.',
description: 'Simulate a run without making any actual changes to files, tags, or categories.',
default: true
},
{
name: 'skip_qb_version_check',
type: 'boolean',
label: 'Skip qBittorrent Version Check',
description: 'Skip the qBittorrent version compatibility check.',
description: 'Bypass the qBittorrent/libtorrent version compatibility check. Use at your own risk.',
default: false
}
]

View file

@ -1,40 +1,40 @@
export const directorySchema = {
title: 'Directory Configuration',
description: 'Configure directory paths for different operations',
title: 'Directory Paths',
description: 'Configure directory paths for various operations. Proper configuration is crucial for features like orphaned file detection, no-hardlinks tagging, and the recycle bin.',
fields: [
{
name: 'root_dir',
type: 'text',
label: 'Root Directory',
description: 'Root downloads directory used to check for orphaned files, noHL, and remove unregistered.',
description: 'The primary download directory qBittorrent uses. This path is essential for checking for orphaned files, no-hardlinks, and unregistered torrents.',
placeholder: '/path/to/torrents'
},
{
name: 'remote_dir',
type: 'text',
label: 'Remote Directory',
description: 'Path of docker host mapping of root_dir, this must be set if you\'re running qbit_manage locally (not required if running qbit_manage in a container) and qBittorrent/cross_seed is in a docker.',
description: 'If running qbit_manage locally and qBittorrent is in Docker, this should be the host path that maps to `root_dir` inside the container. Not required if qbit_manage is also in a container.',
placeholder: '/mnt/remote'
},
{
name: 'recycle_bin',
type: 'text',
label: 'Recycle Bin Directory',
description: 'Path of the RecycleBin folder. Default location is set to `remote_dir/.RecycleBin`.',
description: 'The path to the recycle bin folder. If not specified, it defaults to `.RecycleBin` inside your `root_dir`.',
placeholder: '/path/to/recycle-bin'
},
{
name: 'torrents_dir',
type: 'text',
label: 'Torrents Directory',
description: 'Path of the your qbittorrent torrents directory. Required for `save_torrents` attribute in recyclebin',
description: 'The path to your qBittorrent `BT_backup` directory. This is required to use the `save_torrents` feature in the recycle bin.',
placeholder: '/path/to/torrent-files'
},
{
name: 'orphaned_dir',
type: 'text',
label: 'Orphaned Files Directory',
description: 'Path of the Orphaned Directory folder. Default location is set to `remote_dir/orphaned_data`.',
description: 'The path to the orphaned files directory. If not specified, it defaults to `orphaned_data` inside your `root_dir`.',
placeholder: '/path/to/orphaned'
}
]

View file

@ -1,9 +1,9 @@
export const nohardlinksSchema = {
title: 'No Hardlinks Configuration',
description: 'Configure settings for checking and tagging torrents without hardlinks.',
title: 'No Hardlinks',
description: 'Configure settings for tagging torrents that are not hardlinked. This is useful for identifying files that can be safely deleted after being processed by applications like Sonarr or Radarr.',
type: 'complex-object',
keyLabel: 'Category',
keyDescription: 'Category name to check for no hardlinks',
keyDescription: 'Category to check for torrents without hardlinks.',
useCategoryDropdown: true, // Flag to indicate this should use category dropdown
patternProperties: {
".*": { // Matches any category name
@ -18,7 +18,7 @@ export const nohardlinksSchema = {
ignore_root_dir: {
type: 'boolean',
label: 'Ignore Root Directory',
description: 'Ignore any hardlinks detected in the same root_dir (Default True).',
description: 'If true, ignore hardlinks found within the same root directory.',
default: true
}
},
@ -37,7 +37,7 @@ export const nohardlinksSchema = {
ignore_root_dir: {
type: 'boolean',
label: 'Ignore Root Directory',
description: 'Ignore any hardlinks detected in the same root_dir (Default True).',
description: 'If true, ignore hardlinks found within the same root directory.',
default: true
}
},

View file

@ -1,6 +1,6 @@
export const notificationsSchema = {
title: 'Notifications',
description: 'Configure Apprise, Notifiarr, and Webhook notifications.',
description: 'Configure notifications for various events using Apprise, Notifiarr, or custom webhooks.',
type: 'multi-root-object',
fields: [
{
@ -11,14 +11,14 @@ export const notificationsSchema = {
name: 'apprise.api_url',
type: 'text',
label: 'Apprise API Endpoint URL',
description: 'Mandatory to fill out the url of your apprise API endpoint. Leave empty to disable.',
description: 'The URL of your Apprise API endpoint (e.g., http://apprise-api:8000). Leave empty to disable.',
placeholder: 'http://apprise-api:8000'
},
{
name: 'apprise.notify_url',
type: 'text',
label: 'Notification Services URL',
description: 'Mandatory to fill out the notification url/urls based on the notification services provided by apprise.',
description: 'The notification URL(s) for your desired services, as supported by Apprise.',
placeholder: 'discord://webhook_id/webhook_token'
},
{
@ -29,14 +29,14 @@ export const notificationsSchema = {
name: 'notifiarr.apikey',
type: 'password',
label: 'API Key',
description: 'Mandatory to fill out API Key. Leave empty to disable.',
description: 'Your Notifiarr API key. Leave empty to disable.',
placeholder: 'Your Notifiarr API Key'
},
{
name: 'notifiarr.instance',
type: 'text',
label: 'Instance',
description: 'Optional unique value used to identify your instance. (could be your username on notifiarr for example)',
description: '(Optional) A unique identifier for this qbit_manage instance in Notifiarr.',
placeholder: 'my-instance'
},
{
@ -67,21 +67,21 @@ export const notificationsSchema = {
name: 'webhooks.error',
type: 'dynamic_select_text',
label: 'Error Webhook',
description: 'Webhook for error notifications. Can be "apprise", "notifiarr", or a custom URL.',
description: 'Webhook for error notifications. Can be set to "apprise", "notifiarr", or a custom URL.',
options: ['apprise', 'notifiarr', 'webhook']
},
{
name: 'webhooks.run_start',
type: 'dynamic_select_text',
label: 'Run Start Webhook',
description: 'Webhook for run start notifications. Can be "apprise", "notifiarr", or a custom URL.',
description: 'Webhook for run start notifications. Can be set to "apprise", "notifiarr", or a custom URL.',
options: ['apprise', 'notifiarr', 'webhook']
},
{
name: 'webhooks.run_end',
type: 'dynamic_select_text',
label: 'Run End Webhook',
description: 'Webhook for run end notifications. Can be "apprise", "notifiarr", or a custom URL.',
description: 'Webhook for run end notifications. Can be set to "apprise", "notifiarr", or a custom URL.',
options: ['apprise', 'notifiarr', 'webhook']
},
{
@ -92,63 +92,63 @@ export const notificationsSchema = {
name: 'webhooks.function.recheck',
type: 'dynamic_select_text',
label: 'Recheck Webhook',
description: 'Webhook for recheck notifications. Can be "apprise", "notifiarr", or a custom URL.',
description: 'Webhook for recheck notifications. Can be set to "apprise", "notifiarr", or a custom URL.',
options: ['apprise', 'notifiarr', 'webhook']
},
{
name: 'webhooks.function.cat_update',
type: 'dynamic_select_text',
label: 'Category Update Webhook',
description: 'Webhook for category update notifications. Can be "apprise", "notifiarr", or a custom URL.',
description: 'Webhook for category update notifications. Can be set to "apprise", "notifiarr", or a custom URL.',
options: ['apprise', 'notifiarr', 'webhook']
},
{
name: 'webhooks.function.tag_update',
type: 'dynamic_select_text',
label: 'Tag Update Webhook',
description: 'Webhook for tag update notifications. Can be "apprise", "notifiarr", or a custom URL.',
description: 'Webhook for tag update notifications. Can be set to "apprise", "notifiarr", or a custom URL.',
options: ['apprise', 'notifiarr', 'webhook']
},
{
name: 'webhooks.function.rem_unregistered',
type: 'dynamic_select_text',
label: 'Remove Unregistered Webhook',
description: 'Webhook for remove unregistered notifications. Can be "apprise", "notifiarr", or a custom URL.',
description: 'Webhook for remove unregistered notifications. Can be set to "apprise", "notifiarr", or a custom URL.',
options: ['apprise', 'notifiarr', 'webhook']
},
{
name: 'webhooks.function.tag_tracker_error',
type: 'dynamic_select_text',
label: 'Tag Tracker Error Webhook',
description: 'Webhook for tag tracker error notifications. Can be "apprise", "notifiarr", or a custom URL.',
description: 'Webhook for tag tracker error notifications. Can be set to "apprise", "notifiarr", or a custom URL.',
options: ['apprise', 'notifiarr', 'webhook']
},
{
name: 'webhooks.function.rem_orphaned',
type: 'dynamic_select_text',
label: 'Remove Orphaned Webhook',
description: 'Webhook for remove orphaned notifications. Can be "apprise", "notifiarr", or a custom URL.',
description: 'Webhook for remove orphaned notifications. Can be set to "apprise", "notifiarr", or a custom URL.',
options: ['apprise', 'notifiarr', 'webhook']
},
{
name: 'webhooks.function.tag_nohardlinks',
type: 'dynamic_select_text',
label: 'Tag No Hardlinks Webhook',
description: 'Webhook for tag no hardlinks notifications. Can be "apprise", "notifiarr", or a custom URL.',
description: 'Webhook for tag no hardlinks notifications. Can be set to "apprise", "notifiarr", or a custom URL.',
options: ['apprise', 'notifiarr', 'webhook']
},
{
name: 'webhooks.function.share_limits',
type: 'dynamic_select_text',
label: 'Share Limits Webhook',
description: 'Webhook for share limits notifications. Can be "apprise", "notifiarr", or a custom URL.',
description: 'Webhook for share limits notifications. Can be set to "apprise", "notifiarr", or a custom URL.',
options: ['apprise', 'notifiarr', 'webhook']
},
{
name: 'webhooks.function.cleanup_dirs',
type: 'dynamic_select_text',
label: 'Cleanup Directories Webhook',
description: 'Webhook for cleanup directories notifications. Can be "apprise", "notifiarr", or a custom URL.',
description: 'Webhook for cleanup directories notifications. Can be set to "apprise", "notifiarr", or a custom URL.',
options: ['apprise', 'notifiarr', 'webhook']
}
]

View file

@ -1,26 +1,26 @@
export const orphanedSchema = {
title: 'Orphaned Files Configuration',
description: 'Configure settings for managing orphaned files.',
title: 'Orphaned Files',
description: 'Configure settings for managing orphaned files, which are files in your root directory not associated with any torrent.',
fields: [
{
name: 'empty_after_x_days',
type: 'number',
label: 'Empty After X Days',
description: 'Will delete Orphaned data contents if the files have been in the Orphaned data for more than x days. (0 for immediate deletion, empty for never)',
description: 'Delete orphaned files after they have been in the orphaned directory for this many days. Set to 0 for immediate deletion, or leave empty to never delete.',
min: 0
},
{
name: 'exclude_patterns',
type: 'array',
label: 'Exclude Patterns',
description: 'List of patterns to exclude certain files from orphaned.',
description: 'A list of glob patterns to exclude files from being considered orphaned (e.g., "**/.DS_Store").',
items: { type: 'text' }
},
{
name: 'max_orphaned_files_to_delete',
type: 'number',
label: 'Max Orphaned Files to Delete',
description: 'Set your desired threshold for the maximum number of orphaned files qbm will delete in a single run. (-1 to disable safeguards)',
description: 'The maximum number of orphaned files to delete in a single run. This is a safeguard to prevent accidental mass deletions. Set to -1 to disable.',
default: 50,
min: -1
}

View file

@ -1,12 +1,12 @@
export const qbtSchema = {
title: 'qBittorrent Connection',
description: 'Configure connection to qBittorrent client',
description: 'Configure the connection to your qBittorrent client.',
fields: [
{
name: 'host',
type: 'text',
label: 'Host',
description: 'qBittorrent host address (e.g., localhost:8080)',
description: 'The IP address and port of your qBittorrent WebUI.',
required: true,
placeholder: 'localhost:8080 or qbittorrent:8080'
},
@ -14,7 +14,7 @@ export const qbtSchema = {
name: 'user',
type: 'text',
label: 'Username',
description: 'qBittorrent WebUI username',
description: 'The username for your qBittorrent WebUI.',
required: false,
placeholder: 'admin'
},
@ -22,7 +22,7 @@ export const qbtSchema = {
name: 'pass',
type: 'password',
label: 'Password',
description: 'qBittorrent WebUI password',
description: 'The password for your qBittorrent WebUI.',
required: false,
placeholder: 'Enter password'
}

View file

@ -1,6 +1,6 @@
export const recyclebinSchema = {
title: 'Recycle Bin Configuration',
description: 'Recycle Bin method of deletion will move files into the recycle bin instead of directly deleting them.',
title: 'Recycle Bin',
description: 'Configure the recycle bin to move deleted files to a temporary location instead of permanently deleting them. This provides a safety net for accidental deletions.',
fields: [
{
name: 'enabled',
@ -14,21 +14,21 @@ export const recyclebinSchema = {
name: 'empty_after_x_days',
type: 'number',
label: 'Empty After X Days',
description: 'Will delete Recycle Bin contents if the files have been in the Recycle Bin for more than x days. (0 for immediate deletion, empty for never)',
description: 'Delete files from the recycle bin after this many days. Set to 0 for immediate deletion, or leave empty to never delete.',
min: 0
},
{
name: 'save_torrents',
type: 'boolean',
label: 'Save Torrents',
description: 'This will save a copy of your .torrent and .fastresume file in the recycle bin before deleting it from qbittorrent.',
description: 'Save a copy of the .torrent and .fastresume files in the recycle bin. Requires `torrents_dir` to be set in the Directory configuration.',
default: false
},
{
name: 'split_by_category',
type: 'boolean',
label: 'Split by Category',
description: 'This will split the recycle bin folder by the save path defined in the `cat` attribute.',
description: 'Organize the recycle bin by creating subdirectories based on the torrent\'s category save path.',
default: false
}
]


@ -1,12 +1,12 @@
export const settingsSchema = {
title: 'General Settings',
description: 'Configure general application settings',
description: 'Configure general application settings and default behaviors.',
fields: [
{
name: 'force_auto_tmm',
type: 'boolean',
label: 'Force Auto TMM',
description: 'Will force qBittorrent to enable Automatic Torrent Management for each torrent.',
description: 'Force qBittorrent to enable Automatic Torrent Management (ATM) for each torrent.',
default: false
},
{
@ -20,105 +20,105 @@ export const settingsSchema = {
name: 'tracker_error_tag',
type: 'text',
label: 'Tracker Error Tag',
description: 'Define the tag of any torrents that do not have a working tracker. (Used in --tag-tracker-error)',
description: 'The tag to apply to torrents that have a tracker error. Used by the `tag_tracker_error` command.',
default: 'issue'
},
{
name: 'nohardlinks_tag',
type: 'text',
label: 'No Hard Links Tag',
description: 'Define the tag of any torrents that don\'t have hardlinks (Used in --tag-nohardlinks)',
description: 'The tag to apply to torrents that do not have any hardlinks. Used by the `tag_nohardlinks` command.',
default: 'noHL'
},
{
name: 'stalled_tag',
type: 'text',
label: 'Stalled Tag',
description: 'Will set the tag of any torrents stalled downloading.',
description: 'The tag to apply to torrents that are stalled during download.',
default: 'stalledDL'
},
{
name: 'share_limits_tag',
type: 'text',
label: 'Share Limits Tag',
description: 'Will add this tag when applying share limits to provide an easy way to filter torrents by share limit group/priority for each torrent',
description: 'The prefix for tags created by share limit groups. For example, a group named "group1" with priority 1 would get the tag "~share_limit_1.group1".',
default: '~share_limit'
},
{
name: 'share_limits_min_seeding_time_tag',
type: 'text',
label: 'Min Seeding Time Tag',
description: 'Will add this tag when applying share limits to torrents that have not yet reached the minimum seeding time (Used in --share-limits)',
description: 'The tag to apply to torrents that have not met their minimum seeding time requirement in a share limit group.',
default: 'MinSeedTimeNotReached'
},
{
name: 'share_limits_min_num_seeds_tag',
type: 'text',
label: 'Min Num Seeds Tag',
description: 'Will add this tag when applying share limits to torrents that have not yet reached the minimum number of seeds (Used in --share-limits)',
description: 'The tag to apply to torrents that have not met their minimum number of seeds requirement in a share limit group.',
default: 'MinSeedsNotMet'
},
{
name: 'share_limits_last_active_tag',
type: 'text',
label: 'Last Active Tag',
description: 'Will add this tag when applying share limits to torrents that have not yet reached the last active limit (Used in --share-limits)',
description: 'The tag to apply to torrents that have not met their last active time requirement in a share limit group.',
default: 'LastActiveLimitNotReached'
},
{
name: 'cat_filter_completed',
type: 'boolean',
label: 'Category Filter Completed',
description: 'When running --cat-update function, it will filter for completed torrents only.',
description: 'If true, the `cat_update` command will only process completed torrents.',
default: true
},
{
name: 'share_limits_filter_completed',
type: 'boolean',
label: 'Share Limits Filter Completed',
description: 'When running --share-limits function, it will filter for completed torrents only.',
description: 'If true, the `share_limits` command will only process completed torrents.',
default: true
},
{
name: 'tag_nohardlinks_filter_completed',
type: 'boolean',
label: 'Tag No Hardlinks Filter Completed',
description: 'When running --tag-nohardlinks function, it will filter for completed torrents only.',
description: 'If true, the `tag_nohardlinks` command will only process completed torrents.',
default: true
},
{
name: 'rem_unregistered_filter_completed',
type: 'boolean',
label: 'Remove Unregistered Filter Completed',
description: 'Filters for completed torrents only when running rem_unregistered command',
description: 'If true, the `rem_unregistered` command will only process completed torrents.',
default: false
},
{
name: 'cat_update_all',
type: 'boolean',
label: 'Category Update All',
description: 'When running --cat-update function, it will check and update all torrents categories, otherwise it will only update uncategorized torrents.',
description: 'If true, `cat_update` will update all torrents; otherwise, it will only update uncategorized torrents.',
default: true
},
{
name: 'disable_qbt_default_share_limits',
type: 'boolean',
label: 'Disable qBittorrent Default Share Limits',
description: 'When running --share-limits function, it allows QBM to handle share limits by disabling qBittorrents default Share limits.',
description: 'If true, qBittorrent\'s default share limits will be disabled, allowing qbit_manage to handle them exclusively.',
default: true
},
{
name: 'tag_stalled_torrents',
type: 'boolean',
label: 'Tag Stalled Torrents',
description: 'Tags any downloading torrents that are stalled with the user defined `stalledDL` tag when running the tag_update command',
description: 'If true, the `tag_update` command will tag stalled downloading torrents with the `stalled_tag`.',
default: true
},
{
name: 'rem_unregistered_ignore_list',
type: 'array',
label: 'Remove Unregistered Ignore List',
description: 'Ignores a list of words found in the status of the tracker when running rem_unregistered command and will not remove the torrent if matched',
description: 'A list of keywords. If any of these are found in a tracker\'s status message, the torrent will not be removed by the `rem_unregistered` command.',
items: { type: 'text' }
}
]


@ -1,6 +1,6 @@
export const shareLimitsSchema = {
title: 'Share Limits Configuration',
description: 'Control how torrent share limits are set depending on the priority of your grouping.',
description: 'Define prioritized groups to manage share limits for your torrents. Each torrent is matched to the highest-priority group that meets the filter criteria.',
type: 'share-limits-config',
fields: [
{
@ -12,38 +12,38 @@ export const shareLimitsSchema = {
priority: {
type: 'number',
label: 'Priority',
description: 'The lower the number the higher the priority.',
description: 'The priority of the group. Lower numbers have higher priority.',
required: true,
step: 0.1
},
include_all_tags: {
type: 'array',
label: 'Include All Tags',
description: 'Filter the group based on one or more tags. All tags defined here must be present in the torrent.',
description: 'Torrents must have all of these tags to be included in this group.',
items: { type: 'text' }
},
include_any_tags: {
type: 'array',
label: 'Include Any Tags',
description: 'Filter the group based on one or more tags. Any tags defined here must be present in the torrent.',
description: 'Torrents must have at least one of these tags to be included in this group.',
items: { type: 'text' }
},
exclude_all_tags: {
type: 'array',
label: 'Exclude All Tags',
description: 'Filter the group based on one or more tags. All tags defined here must be present in the torrent for it to be excluded.',
description: 'Torrents that have all of these tags will be excluded from this group.',
items: { type: 'text' }
},
exclude_any_tags: {
type: 'array',
label: 'Exclude Any Tags',
description: 'Filter the group based on one or more tags. Any tags defined here must be present in the torrent for it to be excluded.',
description: 'Torrents that have at least one of these tags will be excluded from this group.',
items: { type: 'text' }
},
categories: {
type: 'array',
label: 'Categories',
description: 'Filter by including one or more categories.',
description: 'Torrents must be in one of these categories to be included in this group.',
items: {
type: 'text',
useCategoryDropdown: true // Flag to indicate array items should use category dropdown
@ -52,75 +52,81 @@ export const shareLimitsSchema = {
cleanup: {
type: 'boolean',
label: 'Cleanup',
description: 'Setting this as true will remove and delete contents of any torrents that satisfies the share limits.',
description: 'If true, torrents that meet their share limits will be removed and their contents deleted.',
default: false
},
max_ratio: {
type: 'number',
label: 'Maximum Share Ratio',
description: 'Will set the torrent Maximum share ratio until torrent is stopped from seeding/uploading. (-2: Global Limit, -1: No Limit)',
description: 'The maximum share ratio before a torrent is paused. Use -2 for the global limit and -1 for no limit.',
default: -1,
step: 0.1
},
max_seeding_time: {
type: 'text',
label: 'Maximum Seeding Time',
description: 'Will set the torrent Maximum seeding time until torrent is stopped from seeding/uploading. (-2: Global Limit, -1: No Limit) (e.g., 32m, 2h32m, 3d2h32m, 1w3d2h32m)',
description: 'The maximum seeding time before a torrent is paused. Use -2 for the global limit and -1 for no limit. (e.g., "30d", "1w 4d 2h").',
default: '-1'
},
max_last_active: {
type: 'text',
label: 'Maximum Last Active',
description: 'Will delete the torrent if cleanup variable is set and if torrent has been inactive longer than x minutes. (e.g., 32m, 2h32m, 3d2h32m, 1w3d2h32m)',
description: 'If cleanup is enabled, delete torrents that have been inactive for this duration. Use -1 for no limit. (e.g., "30d", "1w 4d 2h").',
default: '-1'
},
min_seeding_time: {
type: 'text',
label: 'Minimum Seeding Time',
description: 'Will prevent torrent deletion by the cleanup variable if the torrent has not yet reached this minimum seeding time. (e.g., 32m, 2h32m, 3d2h32m, 1w3d2h32m)',
description: 'Prevents cleanup from deleting a torrent until it has been seeding for at least this long. (e.g., "30d", "1w 4d 2h").',
default: '0'
},
min_last_active: {
type: 'text',
label: 'Minimum Last Active',
description: 'Will prevent torrent deletion by cleanup variable if torrent has been active within the last x minutes. (e.g., 32m, 2h32m, 3d2h32m, 1w3d2h32m)',
description: 'Prevents cleanup from deleting a torrent if it has been active within this duration. (e.g., "30d", "1w 4d 2h").',
default: '0'
},
limit_upload_speed: {
type: 'number',
label: 'Limit Upload Speed (KiB/s)',
description: 'Will limit the upload speed KiB/s (KiloBytes/second) (-1 : No Limit)',
description: 'The upload speed limit in KiB/s. Use -1 for no limit.',
default: -1
},
enable_group_upload_speed: {
type: 'boolean',
label: 'Enable Group Upload Speed',
description: 'Upload speed limits are applied at the group level. This will take `limit_upload_speed` defined and divide it equally among the number of torrents in the group.',
description: 'If true, the `limit_upload_speed` will be divided equally among all torrents in this group.',
default: false
},
resume_torrent_after_change: {
type: 'boolean',
label: 'Resume Torrent After Change',
description: 'Will resume your torrent after changing share limits.',
description: 'If true, the torrent will be resumed after its share limits are changed.',
default: true
},
add_group_to_tag: {
type: 'boolean',
label: 'Add Group to Tag',
description: 'This adds your grouping as a tag with a prefix defined in settings (share_limits_tag).',
description: 'If true, a tag representing the group and its priority will be added to the torrent.',
default: true
},
min_num_seeds: {
type: 'number',
label: 'Minimum Number of Seeds',
description: 'Will prevent torrent deletion by cleanup variable if the number of seeds is less than the value set here.',
description: 'Prevents cleanup from deleting a torrent if it has fewer than this many seeds.',
default: 0
},
custom_tag: {
type: 'text',
label: 'Custom Tag',
description: 'Apply a custom tag name for this particular group. (WARNING: This tag MUST be unique)',
description: 'Apply a unique custom tag for this group. This tag will be used to identify and manage the share limits for these torrents.',
default: ''
},
reset_upload_speed_on_unmet_minimums: {
type: 'boolean',
label: 'Reset Upload Speed on Unmet Minimums',
description: 'If true, upload speed limits will be reset to unlimited when minimum conditions (seeding time, number of seeds, last active time) are not met. If false, existing upload speed limits will be preserved.',
default: true
}
}
}
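Each share-limit group field above carries a `default`, including the new `reset_upload_speed_on_unmet_minimums` flag. A form layer can derive an initial group object by folding those defaults; a minimal sketch (the trimmed field map here is illustrative, not the full schema from the file):

```javascript
// Trimmed, illustrative subset of the share-limit group fields above —
// not the full schema literal.
const groupFields = {
  priority: { type: 'number', label: 'Priority', required: true },
  max_ratio: { type: 'number', label: 'Maximum Share Ratio', default: -1 },
  max_seeding_time: { type: 'text', label: 'Maximum Seeding Time', default: '-1' },
  cleanup: { type: 'boolean', label: 'Cleanup', default: false },
  reset_upload_speed_on_unmet_minimums: { type: 'boolean', default: true }
};

// Fold schema defaults into an initial group object; fields without a
// `default` (like the required `priority`) are left for the user to supply.
function defaultGroup(fields) {
  const group = {};
  for (const [name, def] of Object.entries(fields)) {
    if ('default' in def) group[name] = def.default;
  }
  return group;
}
```

Note the `'default' in def` check rather than a truthiness test, so falsy defaults such as `cleanup: false` still survive.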


@ -1,6 +1,6 @@
export const trackerSchema = {
title: 'Tracker Configuration',
description: 'Configure tracker-specific settings and rules',
title: 'Tracker',
description: 'Configure tags and categories based on tracker URLs. Use a keyword from the tracker URL to define rules. The `other` key is a special keyword for trackers that do not match any other entry.',
type: 'complex-object',
patternProperties: {
"^(?!other$).*$": { // Matches any key except 'other'
@ -8,20 +8,20 @@ export const trackerSchema = {
properties: {
tag: {
label: 'Tag(s)',
description: 'The tracker tag or additional list of tags defined',
description: 'The tag or tags to apply to torrents from this tracker.',
type: 'array',
items: { type: 'string' }
},
cat: {
type: 'string',
label: 'Category',
description: 'Set the category based on tracker URL. This category option takes priority over the category defined in cat',
description: 'Set a category for torrents from this tracker. This will override any category set by the `cat` section.',
useCategoryDropdown: true // Flag to indicate this field should use category dropdown
},
notifiarr: {
type: 'string',
label: 'Notifiarr React Name',
description: 'Set this to the notifiarr react name. This is used to add indexer reactions to the notifications sent by Notifiarr',
description: 'The Notifiarr "React Name" for this tracker, used for indexer-specific reactions in notifications.',
}
},
required: ['tag'],
@ -32,7 +32,7 @@ export const trackerSchema = {
properties: {
tag: {
label: 'Tag(s)',
description: 'The tracker tag or additional list of tags defined for "other" trackers',
description: 'The tag or tags to apply to torrents from any tracker not explicitly defined elsewhere.',
type: 'array',
items: { type: 'string' }
}
@ -46,20 +46,20 @@ export const trackerSchema = {
properties: {
tag: {
label: 'Tag(s)',
description: 'The tracker tag or additional list of tags defined',
description: 'The tag or tags to apply to torrents from this tracker.',
type: 'array',
items: { type: 'string' }
},
cat: {
type: 'string',
label: 'Category',
description: 'Set the category based on tracker URL. This category option takes priority over the category defined in cat',
description: 'Set a category for torrents from this tracker. This will override any category set by the `cat` section.',
useCategoryDropdown: true // Flag to indicate this field should use category dropdown
},
notifiarr: {
type: 'string',
label: 'Notifiarr React Name',
description: 'Set this to the notifiarr react name. This is used to add indexer reactions to the notifications sent by Notifiarr',
description: 'The Notifiarr "React Name" for this tracker, used for indexer-specific reactions in notifications.',
}
},
required: ['tag'],
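The tracker schema above keys entries with the negative-lookahead pattern `^(?!other$).*$`, reserving the exact key `other` for its own reduced shape. The effect can be checked in isolation (the tracker keyword below is a hypothetical example):

```javascript
// Same pattern as the trackerSchema patternProperties key above.
const trackerKeyPattern = /^(?!other$).*$/;

// An ordinary tracker keyword matches the pattern...
trackerKeyPattern.test('animebytes'); // hypothetical keyword
// ...while the reserved 'other' key fails the lookahead and falls
// through to the dedicated 'other' schema. Keys that merely start
// with 'other' (e.g. 'others') are not reserved, because the
// lookahead anchors on the full key with `$`.
```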


@ -7,6 +7,50 @@ import { getNestedValue } from './utils.js';
import { CLOSE_ICON_SVG } from './icons.js';
import { generateCategoryDropdownHTML } from './categories.js';
/**
* Generates attributes to prevent password manager detection and autofill
* Only applied to specific fields that cause performance issues
* @param {string} inputType - The type of input ('password', 'text', etc.)
* @returns {string} HTML attributes string
*/
function getPasswordManagerPreventionAttributes(inputType = 'text') {
const attributes = [
'autocomplete="off"',
'data-lpignore="true"',
'data-form-type="other"',
'data-1p-ignore="true"',
'data-bwignore="true"',
'spellcheck="false"'
];
// For password fields, use more specific autocomplete value
if (inputType === 'password') {
attributes[0] = 'autocomplete="new-password"';
}
return attributes.join(' ');
}
/**
* Determines if a field should have password manager prevention attributes
* Only applies to the specific fields that cause performance issues
* @param {string} fieldName - The field name
* @param {object} field - The field definition
* @returns {boolean} Whether to apply prevention attributes
*/
function shouldPreventPasswordManager(fieldName, field) {
if (!fieldName) return false;
// Only these exact fields need password manager prevention
const targetFields = [
'notifiarr.apikey', // Notifiarr API key (password field)
'user', // qBittorrent username (text field)
'pass' // qBittorrent password (password field)
];
return targetFields.includes(fieldName);
}
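The helper above emits a fixed attribute string, swapping only the `autocomplete` value for password inputs. Re-declared here so the sketch runs on its own:

```javascript
// Self-contained copy of the helper above, for demonstration.
function getPasswordManagerPreventionAttributes(inputType = 'text') {
  const attributes = [
    'autocomplete="off"',
    'data-lpignore="true"',      // LastPass
    'data-form-type="other"',
    'data-1p-ignore="true"',     // 1Password
    'data-bwignore="true"',      // Bitwarden
    'spellcheck="false"'
  ];
  // Password fields get a stricter autocomplete value; browsers treat
  // "new-password" as a stronger opt-out than "off".
  if (inputType === 'password') {
    attributes[0] = 'autocomplete="new-password"';
  }
  return attributes.join(' ');
}
```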
/**
* Generates the HTML for a given section.
* @param {object} config - The configuration object for the section.
@ -35,7 +79,12 @@ export function generateSectionHTML(config, data) {
html += generateComplexObjectHTML(config, data);
} else if (config.type === 'multi-root-object') {
// For multi-root-object, render fields directly without nesting under section name
html += generateFieldsHTML(config.fields, data);
// Performance optimization for notifications section
if (config.title === 'Notifications') {
html += generateNotificationsFieldsHTML(config.fields, data);
} else {
html += generateFieldsHTML(config.fields, data);
}
} else {
html += generateFieldsHTML(config.fields, data);
}
@ -77,6 +126,82 @@ function generateFieldsHTML(fields, data, prefix = '') {
}).join('');
}
/**
* Optimized field generation for notifications section to reduce DOM complexity
* @param {Array<object>} fields - An array of field definitions.
* @param {object} data - The current data for the section.
* @returns {string} The HTML string for the fields.
*/
function generateNotificationsFieldsHTML(fields, data) {
// Group fields by section for better performance
const sections = {
apprise: [],
notifiarr: [],
applyToAll: [],
webhooks: [],
functionWebhooks: []
};
fields.forEach(field => {
if (field.type === 'section_header') {
if (field.label.includes('Apprise')) sections.apprise.push(field);
else if (field.label.includes('Notifiarr')) sections.notifiarr.push(field);
else if (field.label.includes('Apply to All')) sections.applyToAll.push(field);
else if (field.label.includes('Webhooks Configuration')) sections.webhooks.push(field);
else if (field.label.includes('Function Specific')) sections.functionWebhooks.push(field);
} else {
if (field.name?.startsWith('apprise.')) sections.apprise.push(field);
else if (field.name?.startsWith('notifiarr.')) sections.notifiarr.push(field);
else if (field.name === 'apply_to_all_value' || field.action === 'apply-to-all') sections.applyToAll.push(field);
else if (field.name?.startsWith('webhooks.function.')) sections.functionWebhooks.push(field);
else if (field.name?.startsWith('webhooks.')) sections.webhooks.push(field);
}
});
// Render sections with lazy loading containers
let html = '';
// Render critical sections first (Apprise, Notifiarr)
html += renderFieldSection(sections.apprise, data, 'apprise-section');
html += renderFieldSection(sections.notifiarr, data, 'notifiarr-section');
html += renderFieldSection(sections.applyToAll, data, 'apply-all-section');
// Render webhook sections with lazy loading
html += `<div class="webhook-sections-container">`;
html += renderFieldSection(sections.webhooks, data, 'webhooks-section');
html += `<div class="function-webhooks-lazy" data-section="function-webhooks">`;
html += `<div class="lazy-load-placeholder">Click to load Function Specific Webhooks...</div>`;
html += `<div class="lazy-content hidden">`;
html += renderFieldSection(sections.functionWebhooks, data, 'function-webhooks-section');
html += `</div></div></div>`;
return html;
}
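In the bucketing above, the order of the prefix checks matters: `webhooks.function.` must be tested before the broader `webhooks.` prefix, or every function-specific webhook would land in the generic webhooks bucket. A reduced sketch of that decision (field names invented for illustration):

```javascript
// Reduced sketch of the name-prefix bucketing above.
function bucketField(name) {
  if (name.startsWith('apprise.')) return 'apprise';
  if (name.startsWith('notifiarr.')) return 'notifiarr';
  // The more specific prefix must come before the generic one.
  if (name.startsWith('webhooks.function.')) return 'functionWebhooks';
  if (name.startsWith('webhooks.')) return 'webhooks';
  return 'other';
}
```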
/**
* Renders a section of fields with performance optimizations
*/
function renderFieldSection(fields, data, sectionId) {
if (!fields.length) return '';
return `<div class="field-section" id="${sectionId}">` +
fields.map(field => {
if (field.type === 'section_header') {
return generateFieldHTML(field, null, null);
}
let fieldName, value;
if (field.name) {
fieldName = field.name;
value = getNestedValue(data, fieldName) ?? field.default ?? '';
} else {
fieldName = null;
value = null;
}
return generateFieldHTML(field, value, fieldName);
}).join('') +
`</div>`;
}
/**
* Generates HTML for a single field.
* @param {object} field - The field definition.
@ -143,8 +268,9 @@ export function generateFieldHTML(field, value, fieldName) {
</label>
<div class="password-input-group">
<input type="password" id="${fieldId}" name="${fieldName}"
class="form-input" value="${value}"
class="form-input ${shouldPreventPasswordManager(fieldName, field) ? 'hide-password-toggle' : ''}" value="${value}"
${field.placeholder ? `placeholder="${field.placeholder}"` : ''}
${shouldPreventPasswordManager(fieldName, field) ? getPasswordManagerPreventionAttributes('password') : ''}
${isRequired}>
<button type="button" class="btn btn-icon password-toggle"
data-target="${fieldId}">
@ -247,6 +373,7 @@ export function generateFieldHTML(field, value, fieldName) {
<input type="text" id="${fieldId}" name="${fieldName}"
class="form-input" value="${value}"
${field.placeholder ? `placeholder="${field.placeholder}"` : ''}
${shouldPreventPasswordManager(fieldName, field) ? getPasswordManagerPreventionAttributes('text') : ''}
${isRequired}>
`;
}
@ -459,12 +586,57 @@ function generateComplexObjectEntryHTML(entryKey, entryValue, config) {
const isOther = entryKey === 'other';
// Handle flat string values (like categories: "category_name": "/path/to/save")
if (config.flatStringValues && typeof entryValue === 'string') {
if (config.flatStringValues && (typeof entryValue === 'string' || Array.isArray(entryValue))) {
const keyLabel = config.keyLabel || 'Key';
const valueSchema = config.patternProperties?.[".*"] || config.additionalProperties;
const valueLabel = valueSchema?.label || 'Value';
const valueDescription = valueSchema?.description || '';
let valueInputHTML;
if (entryKey === 'Uncategorized') {
const arrayValue = Array.isArray(entryValue) ? entryValue : (entryValue ? [entryValue] : []);
const fieldId = `field-${entryKey.replace(/\./g, '-')}`;
let itemsHTML = '';
arrayValue.forEach((item, index) => {
itemsHTML += `
<div class="array-item" data-index="${index}">
<label for="${fieldId}-item-${index}" class="form-label sr-only">Item ${index + 1}</label>
<div class="array-item-input-group">
<input type="text" class="form-input array-item-input"
id="${fieldId}-item-${index}"
value="${item}" data-field="${entryKey}" data-index="${index}"
name="${entryKey}[${index}]">
<button type="button" class="btn btn-icon btn-close-icon remove-array-item">
${CLOSE_ICON_SVG}
</button>
</div>
</div>
`;
});
valueInputHTML = `
<div class="array-field" data-field="${entryKey}">
<div class="array-items">
${itemsHTML}
</div>
<button type="button" class="btn btn-secondary add-array-item"
data-field="${entryKey}">
Add Path
</button>
</div>
${valueDescription ? `<div class="form-help">${valueDescription}</div>` : ''}
`;
} else if (typeof entryValue === 'string') {
valueInputHTML = `
<input type="text" class="form-input" name="${entryKey}::value" value="${entryValue}">
${valueDescription ? `<div class="form-help">${valueDescription}</div>` : ''}
<div class="field-validation"></div>
`;
} else {
valueInputHTML = `<div class="alert alert-error">Invalid value for ${entryKey}. Expected a string (multiple paths are only supported for the Uncategorized category).</div>`;
}
let html = `
<div class="complex-object-item complex-object-entry-card" data-key="${entryKey}">
<div class="complex-object-item-content">
@ -479,12 +651,10 @@ function generateComplexObjectEntryHTML(entryKey, entryValue, config) {
</div>
<div class="category-inputs-row">
<div class="form-group category-name-group">
<input type="text" class="form-input complex-object-key" value="${entryKey}" data-original-key="${entryKey}" ${isOther ? 'readonly' : ''}>
<input type="text" class="form-input complex-object-key" value="${entryKey}" data-original-key="${entryKey}" ${isOther || entryKey === 'Uncategorized' ? 'readonly' : ''}>
</div>
<div class="form-group category-path-group">
<input type="text" class="form-input" name="${entryKey}::value" value="${entryValue}">
${valueDescription ? `<div class="form-help">${valueDescription}</div>` : ''}
<div class="field-validation"></div>
${valueInputHTML}
</div>
</div>
</div>
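The Uncategorized branch above normalizes the stored value, which may be a legacy single path (string) or the new multi-path form (array). The same ternary, extracted as a standalone sketch (the helper name is invented for illustration):

```javascript
// Mirrors the normalization used for the Uncategorized entry above:
// accept a legacy string, an array of paths, or an empty value.
function normalizeUncategorizedPaths(value) {
  return Array.isArray(value) ? value : (value ? [value] : []);
}
```

This keeps the rendering code working on a uniform array regardless of which shape the config file used.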