# Requirements Updated
- "humanize==4.13.0"
- "ruff==0.12.11"

# Breaking Changes
- **DEPRECATE `QBT_CONFIG` / `--config-file` OPTION**
- `QBT_CONFIG` / `--config-file` is deprecated; please switch over to **`QBT_CONFIG_DIR` / `--config-dir`** (see the migration sketch below).
- The `QBT_CONFIG` / `--config-file` option still works for now but is considered legacy and will be removed in a future release.
- **Note**: All `.yml`/`.yaml` files in the `QBT_CONFIG_DIR` path are treated as valid configuration files and loaded. Please **remove** any old/unused configurations that you don't want loaded before pointing qbit_manage at this directory.
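A minimal migration sketch (the paths shown are examples, not project defaults):

```bash
# Legacy (deprecated): point at a single config file or pattern
qbit-manage --config-file /path/to/config/config.yml    # or: QBT_CONFIG=config.yml

# New (preferred): point at the directory; every *.yml/*.yaml inside it is loaded
qbit-manage --config-dir /path/to/config                 # or: QBT_CONFIG_DIR=/path/to/config
```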

# Improvements
- Add Docker support for the `PUID`/`PGID` environment variables (see the Docker example after this list)
- Dockerfile now copies the latest `config.yml.sample` into the config folder
- Add `QBT_HOST` / `--host` option to specify the webUI host address (#929, thanks to @QuixThe2nd)
- WebUI: Quick action settings now persist
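A hedged `docker run` sketch of the new Docker behaviour (the `latest` tag and the `1000`/`1000` IDs are placeholders): the entrypoint copies the bundled `config.yml.sample` into `/config` and drops privileges to the given user/group before launching qbit_manage.

```bash
docker run -d \
  --name qbit_manage \
  -e PUID=1000 \
  -e PGID=1000 \
  -v /path/to/your/config:/config \
  bobokun/qbit_manage:latest
```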

# Bug Fixes
- WebUI: Fix the loading spinner so it is properly centered

**Full Changelog**:
https://github.com/StuffAnThings/qbit_manage/compare/v4.5.5...v4.6.0

---------

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: github-actions[bot] <41898282+github-actions[bot]@users.noreply.github.com>
Co-authored-by: Fabricio Silva <hi@fabricio.dev>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
Co-authored-by: Parsa Yazdani <parsa@yazdani.au>
Co-authored-by: Actionbot <actions@github.com>
bobokun 2025-08-30 14:54:13 -04:00 committed by GitHub
parent 156291723f
commit 5a4ddf0112
36 changed files with 891 additions and 329 deletions


@ -1,47 +0,0 @@
name: Run Pre-Commit
on:
pull_request:
branches:
- develop # Adjust as needed to only run on branches containing 'develop'
push:
branches:
- develop # Adjust as needed
jobs:
pre-commit:
runs-on: ubuntu-latest
steps:
- name: Checkout code
uses: actions/checkout@v5
- name: Set up Python
uses: actions/setup-python@v5
with:
python-version: '3.9'
- name: Install uv
run: |
curl -LsSf https://astral.sh/uv/install.sh | sh
echo "$HOME/.local/bin" >> $GITHUB_PATH
- name: Install dependencies
run: |
uv venv .venv
source .venv/bin/activate
uv pip install pre-commit
- name: Run pre-commit version check
run: |
source .venv/bin/activate
pre-commit run increase-version --all-files
ruff:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v5
- uses: astral-sh/ruff-action@v3
with:
token: ${{ secrets.GITHUB_TOKEN }}
command: 'ruff check'


@ -3,6 +3,7 @@ name: Docker Develop Release
on:
push:
branches: [ develop ]
workflow_dispatch:
concurrency:
group: ${{ github.workflow }}-${{ github.ref }}


@ -74,3 +74,11 @@ jobs:
fi
echo "Successfully updated develop branch to $NEW_VERSION"
- name: Trigger develop workflow
if: success()
run: |
echo "Triggering develop workflow..."
gh workflow run develop.yml --ref develop
env:
GH_TOKEN: ${{ secrets.PAT || secrets.GITHUB_TOKEN }}


@ -28,7 +28,7 @@ repos:
exclude: ^.github/
- repo: https://github.com/astral-sh/ruff-pre-commit
# Ruff version.
rev: v0.12.9
rev: v0.12.10
hooks:
# Run the linter.
- id: ruff-check


@ -1,13 +1,20 @@
# Requirements Updated
- "humanize==4.13.0"
- "ruff==0.12.11"
# Breaking Changes
- **DEPRECATE `QBT_CONFIG` / `--config-file` OPTION**
- No longer supporting `QBT_CONFIG` / `--config-file`. Instead please switch over to **`QBT_CONFIG_DIR` / `--config-dir`**.
- `QBT_CONFIG` / `--config-file` option will still work for now but is now considered legacy and will be removed in a future release.
- **Note**: All yml/yaml files will be treated as valid configuration files and loaded in the `QBT_CONFIG_DIR` path. Please ensure you **remove** any old/unused configurations that you don't want to be loaded prior to using this path.
# Improvements
- **ci(docker)**: add OCI labels and build metadata to Docker images
- **Web UI**: Show an "Update available" badge next to the version and a toast notification when a newer version is detected
- **Web UI**: Add integrated docs with collapsible sections
- **ci(build)**: Publish to PyPI
- **Category**: Allow category changes regardless of the "Category Update All" status (Fixes #913)
- Adds docker support for PUID/PGID environment variables
- Dockerfile copies the latest `config.yml.sample` in the config folder
- Add `QBT_HOST` / `--host` option to specify webUI host address (#929 Thanks to @QuixThe2nd)
- WebUI: Quick action settings persist now
# Bug Fixes
- Fixes container hanging when using run command with QBT_RUN flag (Fixes #911)
- Fixes bug on interval scheduler not displaying the correct next run time
- Fix bug on webAPI requests not being queued correctly when called during a scheduled run
- WebUI: Fix loading spinner to be centered in the webUI
**Full Changelog**: https://github.com/StuffAnThings/qbit_manage/compare/v4.5.4...v4.5.5
**Full Changelog**: https://github.com/StuffAnThings/qbit_manage/compare/v4.5.5...v4.6.0


@ -3,6 +3,7 @@ FROM python:3.13-alpine as builder
ARG BRANCH_NAME=master
ENV BRANCH_NAME=${BRANCH_NAME}
ENV QBM_DOCKER=True
# Install build-time dependencies only
RUN apk add --no-cache \
@ -48,6 +49,7 @@ LABEL org.opencontainers.image.base.name="python:3.13-alpine"
ENV TINI_VERSION=v0.19.0
# Runtime dependencies (smaller than build stage)
RUN apk add --no-cache \
tzdata \
@ -55,17 +57,20 @@ RUN apk add --no-cache \
curl \
jq \
tini \
su-exec \
&& rm -rf /var/cache/apk/*
# Copy installed packages and scripts from builder
COPY --from=builder /usr/local/lib/python3.13/site-packages/ /usr/local/lib/python3.13/site-packages/
COPY --from=builder /app /app
COPY . /app
COPY entrypoint.sh /app/entrypoint.sh
WORKDIR /app
RUN chmod +x /app/entrypoint.sh
VOLUME /config
# Expose port 8080
EXPOSE 8080
ENTRYPOINT ["/sbin/tini", "-s", "--"]
ENTRYPOINT ["/sbin/tini", "-s", "/app/entrypoint.sh"]
CMD ["python3", "qbit_manage.py"]


@ -280,23 +280,31 @@ reinstall: uninstall install
.PHONY: prep-release
prep-release:
@echo "Preparing release..."
@# Step 1: Strip '-develop*' suffix from VERSION
@# Step 1: Update uv lock and sync dependencies
@echo "Updating uv lock and syncing dependencies..."
@uv lock --upgrade
@uv sync
@echo "✓ Dependencies updated"
@# Step 2: Strip '-develop*' suffix from VERSION
@current_version=$$(cat VERSION); \
clean_version=$$(echo $$current_version | sed 's/-develop.*$$//'); \
echo "$$clean_version" > VERSION; \
echo "✓ VERSION updated to $$clean_version"
@# Step 2: Check Tauri Rust project builds
@# Step 3: Check Tauri Rust project builds
@echo "Running cargo check in desktop/tauri/src-tauri..."
@cd desktop/tauri/src-tauri && cargo check
@# Step 3: Prepare CHANGELOG skeleton and bump Full Changelog link
@# Step 4: Prepare CHANGELOG skeleton and bump Full Changelog link
@new_version=$$(cat VERSION); \
major=$$(echo "$$new_version" | cut -d. -f1); \
minor=$$(echo "$$new_version" | cut -d. -f2); \
patch=$$(echo "$$new_version" | cut -d. -f3); \
prev_patch=$$((patch - 1)); \
prev_version="$$major.$$minor.$$prev_patch"; \
updated_deps=$$(git diff master..HEAD -- pyproject.toml | grep '^+' | grep '==' | sed 's/^+//' | sed 's/^ *//' | sed 's/,$$//' | sed 's/^/- /'); \
echo "# Requirements Updated" > CHANGELOG; \
echo "" >> CHANGELOG; \
if [ -n "$$updated_deps" ]; then \
echo "$$updated_deps" >> CHANGELOG; \
fi; \
echo "" >> CHANGELOG; \
echo "# New Features" >> CHANGELOG; \
echo "" >> CHANGELOG; \


@ -1,9 +1,10 @@
# <img src="icons/qbm_logo.png" width="75"> qBit Manage
# <img src="https://github.com/StuffAnThings/qbit_manage/blob/master/icons/qbm_logo.png?raw=true" width="75"> qBit Manage
[![GitHub release (latest by date)](https://img.shields.io/github/v/release/StuffAnThings/qbit_manage?style=plastic)](https://github.com/StuffAnThings/qbit_manage/releases)
[![GitHub commits since latest release (by SemVer)](https://img.shields.io/github/commits-since/StuffAnThings/qbit_manage/latest/develop?label=Commits%20in%20Develop&style=plastic)](https://github.com/StuffAnThings/qbit_manage/tree/develop)
[![Docker Image Version (latest semver)](https://img.shields.io/docker/v/bobokun/qbit_manage?label=docker&sort=semver&style=plastic)](https://hub.docker.com/r/bobokun/qbit_manage)
![Github Workflow Status](https://img.shields.io/github/actions/workflow/status/StuffAnThings/qbit_manage/version.yml?style=plastic)
[![PyPi (latest semver)](https://img.shields.io/pypi/v/qbit-manage?label=PyPI&sort=semver&style=plastic)](https://pypi.org/project/qbit-manage)
[![Github Workflow Status](https://img.shields.io/github/actions/workflow/status/StuffAnThings/qbit_manage/version.yml?style=plastic)](https://github.com/StuffAnThings/qbit_manage/actions/workflows/version.yml)
[![pre-commit.ci status](https://results.pre-commit.ci/badge/github/StuffAnThings/qbit_manage/master.svg)](https://results.pre-commit.ci/latest/github/StuffAnThings/qbit_manage/master)
[![Ghcr packages](https://img.shields.io/badge/ghcr.io-packages?style=plastic&label=packages)](https://ghcr.io/StuffAnThings/qbit_manage)
[![Docker Pulls](https://img.shields.io/docker/pulls/bobokun/qbit_manage?style=plastic)](https://hub.docker.com/r/bobokun/qbit_manage)


@ -1 +1 @@
4.5.5
4.6.0


@ -3003,7 +3003,7 @@ dependencies = [
[[package]]
name = "qbit-manage-desktop"
version = "4.5.5-develop10"
version = "4.6.0"
dependencies = [
"glib 0.20.12",
"libc",


@ -43,7 +43,7 @@ license = "MIT"
name = "qbit-manage-desktop"
repository = ""
rust-version = "1.70"
version = "4.5.5"
version = "4.6.0"
[target."cfg(unix)".dependencies]
glib = "0.20.0"


@ -68,5 +68,5 @@
},
"identifier": "com.qbitmanage.desktop",
"productName": "qBit Manage",
"version": "4.5.5"
"version": "4.6.0"
}


@ -1,14 +1,15 @@
# qbit_manage Run Commands
| **Shell Command** | **Docker Environment Variable** | **Config Command** | **Description** | **Default Value** |
|:---------------------------------------:|:-------------------------------:|:---------------------:|------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:-----------------:|
|:----------------------------------------------------------------------------------:|:-------------------------------:|:---------------------:|------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:----------------------------------------------------------------------------------:|
| `-ws` or `--web-server` | QBT_WEB_SERVER | N/A | Start the webUI server to handle command requests via HTTP API. | False |
| `-H` or `--host` | QBT_HOST | N/A | Hostname for the web server (default: 0.0.0.0). | 0.0.0.0 |
| `-p` or `--port` | QBT_PORT | N/A | Port number for the web server (default: 8080). | 8080 |
| `-b` or `--base-url` | QBT_BASE_URL | N/A | Base URL path for the web UI (e.g., '/qbit-manage'). Default is empty (root). | "" |
| `-r` or`--run` | QBT_RUN | N/A | Run without the scheduler. Script will exit after completion. | False |
| `-sch` or `--schedule` | QBT_SCHEDULE | N/A | Schedule to run every x minutes or choose customize schedule via [cron](https://crontab.guru/examples.html). (Default set to 1440 (1 day)) | 1440 |
| `-sd` or `--startup-delay` | QBT_STARTUP_DELAY | N/A | Set delay in seconds on the first run of a schedule (Default set to 0) | 0 |
| `-c CONFIG` or `--config-file CONFIG` | QBT_CONFIG | N/A | Override the default config file location. By default, qbit_manage looks for `config.yml` in platform-specific directories (see [Config-Setup](Config-Setup) for details). Use this to specify a custom path or filename. `Example: tv.yml`. Supports wildcards to use multiple configs. `Example: config-*.yml` | Platform-specific |
| `-cd CONFIG_DIR` or `--config-dir CONFIG_DIR` | QBT_CONFIG_DIR | N/A | Override the default config directory location. By default, qbit_manage looks for `config.yml` in platform-specific directories (see [Config-Setup](Config-Setup) for details). Use this to specify a custom directory path. `Example: /path/to/config/dir`. | Platform-specific |
| `-lf LOGFILE,` or `--log-file LOGFILE,` | QBT_LOGFILE | N/A | This is used if you want to use a different name for your log file. `Example: tv.log` | activity.log |
| `-re` or `--recheck` | QBT_RECHECK | recheck | Recheck paused torrents sorted by lowest size. Resume if Completed. | False |
| `-cu` or `--cat-update` | QBT_CAT_UPDATE | cat_update | Use this option to update your categories or switch between them. The category function takes the save path of the torrent and assigns the corresponding category to it based on that path. | False |
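A hedged example combining several of the options documented above (all values are illustrative):

```bash
# Start the web server on all interfaces, load every config in /config,
# log to qbit_manage.log, and run on the default daily schedule.
qbit-manage --web-server \
  --host 0.0.0.0 \
  --port 8080 \
  --config-dir /config \
  --log-file qbit_manage.log \
  --schedule 1440
```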


@ -13,9 +13,9 @@ The script looks for the configuration file in different locations depending on
- **Linux/Unix**: `~/.config/qbit-manage/config.yml` (or `$XDG_CONFIG_HOME/qbit-manage/config.yml` if XDG_CONFIG_HOME is set)
### Docker Installation
- `/app/config.yml` (inside the container)
- `/config/config.yml` (inside the container)
You can override the default location by using the `--config-file` or `-c` command line option to specify a custom path.
You can override the default location by using the `--config-dir` or `-cd` command line option to specify a custom directory.
A template Configuration File can be found in the repo [config/config.yml.sample](https://github.com/StuffAnThings/qbit_manage/blob/master/config/config.yml.sample).


@ -43,8 +43,8 @@ services:
# Scheduler Configuration
- QBT_RUN=false
- QBT_SCHEDULE=1440
- QBT_CONFIG=config.yml
- QBT_LOGFILE=activity.log
- QBT_CONFIG_DIR=/config
- QBT_LOGFILE=qbit_manage.log
# Command Flags
- QBT_RECHECK=false


@ -29,31 +29,33 @@ This wiki should tell you everything you need to know about the script to get it
## Table of Contents
* [Home](Home)
* [Installation](Installation)
* [unRAID Installation](Unraid-Installation)
* [Local Installation](Local-Installations)
* [NIX Installation](Nix-Installation)
* [Docker Installation](Docker-Installation)
* [V4 Migration Guide](v4-Migration-Guide)
* [Config Setup](Config-Setup)
* [Sample Config File](Config-Setup#config-file)
* [List of variables](Config-Setup#list-of-variables)
* [commands](Config-Setup#commands)
* [qbt](Config-Setup#qbt)
* [settings](Config-Setup#settings)
* [directory](Config-Setup#directory)
* [cat](Config-Setup#cat)
* [cat_changes](Config-Setup#cat_changes)
* [tracker](Config-Setup#tracker)
* [nohardlinks](Config-Setup#nohardlinks)
* [share_limits](Config-Setup#share_limits)
* [recyclebin](Config-Setup#recyclebin)
* [orphaned](Config-Setup#orphaned)
* [apprise](Config-Setup#apprise)
* [notifiarr](Config-Setup#notifiarr)
* [webhooks](Config-Setup#webhooks)
* [Commands](Commands)
* [Standalone Scripts](Standalone-Scripts)
* [Web API](Web-API)
* [Web UI](Web-UI)
- [Home](Home)
- [Installation](Installation)
- [Desktop App](Installation#desktop-app-installation)
- [Standalone Binary Installation](Installation#standalone-binary-installation)
- [Python/Source Installation](Installation#pythonsource-installation)
- [Docker Installation](Docker-Installation)
- [unRAID Installation](Unraid-Installation)
- [Config Setup](Config-Setup)
- [Sample Config File](Config-Setup#config-file)
- [List of variables](Config-Setup#list-of-variables)
- [commands](Config-Setup#commands)
- [qbt](Config-Setup#qbt)
- [settings](Config-Setup#settings)
- [directory](Config-Setup#directory)
- [cat](Config-Setup#cat)
- [cat_change](Config-Setup#cat_change)
- [tracker](Config-Setup#tracker)
- [nohardlinks](Config-Setup#nohardlinks)
- [share_limits](Config-Setup#share_limits)
- [recyclebin](Config-Setup#recyclebin)
- [orphaned](Config-Setup#orphaned)
- [apprise](Config-Setup#apprise)
- [notifiarr](Config-Setup#notifiarr)
- [webhooks](Config-Setup#webhooks)
- [Commands](Commands)
- [Web API](Web-API)
- [Web UI](Web-UI)
- Extras
- [Standalone Scripts](Standalone-Scripts)
- [V4 Migration Guide](v4-Migration-Guide)


@ -206,7 +206,7 @@ uv pip install -e . --upgrade
- **Host Mount**: Typically mounted from `/path/to/your/config:/config`
### Custom Location
You can override the default location using the `--config-file` or `-c` command line option:
You can override the default location using the `--config-dir` or `-cd` command line option:
```bash
qbit-manage --config-file /path/to/your/config.yml
qbit-manage --config-dir /path/to/your/config/directory
```


@ -81,7 +81,7 @@ http://[UNRAID-IP]:8080
#!/bin/bash
echo "Running qBitTorrent Management"
python3 /mnt/user/data/scripts/qbit/qbit_manage.py \
--config-file /mnt/user/data/scripts/qbit/config.yml \
--config-dir /mnt/user/data/scripts/qbit/ \
--log-file /mnt/user/data/scripts/qbit/activity.log \
--run
echo "qBitTorrent Management Completed"


@ -2,14 +2,14 @@
## Overview
qBit Manage provides a REST API that allows you to trigger commands via HTTP requests. The API server runs on port 8080 by default and can be configured using the `--port` option or `QBT_PORT` environment variable.
qBit Manage provides a REST API that allows you to trigger commands via HTTP requests. The API server runs at 8080, listening to all hostnames, by default and can be configured using the `--host` and `--port` options or `QBT_HOST` and `QBT_PORT` environment variables.
## Running the Web Server
### Command Line
```bash
python qbit_manage.py --web-server --port 8080
python qbit_manage.py --web-server --host 0.0.0.0 --port 8080
```
### Docker
@ -22,6 +22,7 @@ services:
container_name: qbit_manage
environment:
- QBT_WEB_SERVER=true # Enable web server
- QBT_HOST=0.0.0.0 # Set web server host
- QBT_PORT=8080 # Set web server port
ports:
- "8080:8080" # Map container port to host

entrypoint.sh (new file, 180 lines)

@ -0,0 +1,180 @@
#!/bin/bash
set -euo pipefail # Exit on error, undefined vars, pipe failures
# Configuration
readonly SOURCE_FILE="/app/config/config.yml.sample"
readonly DEST_DIR="${QBT_CONFIG_DIR:-/config}"
readonly DEST_FILE="${DEST_DIR}/config.yml.sample"
# Logging function for consistent output
log() {
echo "[$(date '+%Y-%m-%d %H:%M:%S')] $*" >&2
}
# Validate numeric environment variables
validate_numeric_env() {
local var_name="$1"
local var_value="$2"
if [[ -n "$var_value" ]] && ! [[ "$var_value" =~ ^[0-9]+$ ]]; then
log "Warning: $var_name must be numeric. Got $var_name='$var_value' - ignoring PUID/PGID and running as root"
return 1
fi
return 0
}
# Validate and set PUID/PGID
validate_user_group_ids() {
local puid_valid=0
local pgid_valid=0
if ! validate_numeric_env "PUID" "${PUID:-}"; then
puid_valid=1
fi
if ! validate_numeric_env "PGID" "${PGID:-}"; then
pgid_valid=1
fi
# If either is invalid, clear both
if [[ $puid_valid -eq 1 ]] || [[ $pgid_valid -eq 1 ]]; then
PUID=""
PGID=""
return 1
fi
return 0
}
# Safely copy file with atomic operation and error handling
safe_copy() {
local src="$1"
local dest="$2"
local temp_file="${dest}.tmp"
# Validate source file exists and is readable
if [[ ! -f "$src" ]] || [[ ! -r "$src" ]]; then
log "Error: Source file '$src' does not exist or is not readable"
return 1
fi
# Create parent directory if it doesn't exist
local dest_dir
dest_dir="$(dirname "$dest")"
if [[ ! -d "$dest_dir" ]]; then
mkdir -p "$dest_dir" || {
log "Error: Could not create destination directory '$dest_dir'"
return 1
}
fi
# Atomic copy operation
if cp "$src" "$temp_file" && mv "$temp_file" "$dest"; then
log "Successfully copied $src to $dest"
return 0
else
# Clean up temp file on failure
[[ -f "$temp_file" ]] && rm -f "$temp_file"
log "Error: Failed to copy $src to $dest"
return 1
fi
}
# Optimized permission fixing with better performance
fix_permissions() {
local path="$1"
# Skip if PUID or PGID are not set
if [[ -z "${PUID:-}" ]] || [[ -z "${PGID:-}" ]]; then
log "Skipping permission fix for $path - PUID or PGID not set"
return 0
fi
# Check if we're running as root
if [[ "$(id -u)" != "0" ]]; then
log "Skipping permission fix for $path - not running as root"
return 0
fi
local needs_fix=0
if [[ -d "$path" ]]; then
# Check if any files in directory need ownership change
if find "$path" -xdev \( -not -user "$PUID" -o -not -group "$PGID" \) -print -quit 2>/dev/null | grep -q .; then
needs_fix=1
fi
elif [[ -e "$path" ]]; then
# Check if file needs ownership change
if [[ "$(stat -c '%u:%g' "$path" 2>/dev/null || echo "0:0")" != "$PUID:$PGID" ]]; then
needs_fix=1
fi
else
log "Warning: Path '$path' does not exist, skipping permission fix"
return 0
fi
if [[ $needs_fix -eq 1 ]]; then
if chown -R "$PUID:$PGID" "$path" 2>/dev/null; then
local type_msg="file"
[[ -d "$path" ]] && type_msg="directory"
log "Corrected ownership of $type_msg $path to $PUID:$PGID"
return 0
else
log "Warning: Could not change ownership of $path"
return 1
fi
fi
return 0
}
# Execute command with appropriate privilege level
execute_command() {
local current_uid
current_uid="$(id -u)"
if [[ "$current_uid" = "0" ]]; then
if [[ -n "${PUID:-}" ]] && [[ -n "${PGID:-}" ]]; then
log "Changing privileges to PUID:PGID = $PUID:$PGID"
exec /sbin/su-exec "${PUID}:${PGID}" "$@" || {
log "Warning: Could not drop privileges to ${PUID}:${PGID}, continuing as root"
exec "$@"
}
else
log "PUID/PGID not set, running as root"
exec "$@"
fi
else
log "Already running as non-root user (UID: $current_uid), executing command"
exec "$@"
fi
}
# Main execution
main() {
# Validate user/group IDs
validate_user_group_ids
# Handle config file setup
if [[ -d "$DEST_DIR" ]]; then
if [[ -f "$SOURCE_FILE" ]] && [[ -s "$SOURCE_FILE" ]]; then
# Check if destination needs updating
if [[ ! -f "$DEST_FILE" ]] || ! cmp -s "$SOURCE_FILE" "$DEST_FILE" 2>/dev/null; then
if safe_copy "$SOURCE_FILE" "$DEST_FILE"; then
# Fix permissions if running as root and IDs are set
if [[ "$(id -u)" = "0" ]] && [[ -n "${PUID:-}" ]] && [[ -n "${PGID:-}" ]]; then
fix_permissions "$DEST_FILE"
fi
fi
fi
elif [[ ! -f "$SOURCE_FILE" ]]; then
log "Warning: Source file $SOURCE_FILE does not exist, skipping config setup"
fi
fi
# Execute the main command
execute_command "$@"
}
# Run main function with all arguments
main "$@"


@ -130,7 +130,11 @@ class Config:
logger.debug(f" --run (QBT_RUN): {self.args['run']}")
logger.debug(f" --schedule (QBT_SCHEDULE): {self.args['sch']}")
logger.debug(f" --startup-delay (QBT_STARTUP_DELAY): {self.args['startupDelay']}")
logger.debug(f" --config-file (QBT_CONFIG): {self.args['config_files']}")
logger.debug(f" --config-dir (QBT_CONFIG_DIR): {self.args['config_dir_args']}")
if self.args["config_dir_args"] is None:
logger.debug(f" --config-file (QBT_CONFIG): {self.args['config_files']} (legacy)")
else:
logger.debug(f" Configs found from QBT_CONFIG_DIR: {self.args['config_files']}")
logger.debug(f" --log-file (QBT_LOGFILE): {self.args['log_file']}")
logger.debug(f" --log-level (QBT_LOG_LEVEL): {self.args['log_level']}")
logger.debug(f" --log-size (QBT_LOG_SIZE): {self.args['log_size']}")
@ -142,6 +146,7 @@ class Config:
logger.debug(f" --web-server (QBT_WEB_SERVER): {self.args['web_server']}")
logger.debug(f" --port (QBT_PORT): {self.args['port']}")
logger.debug(f" --base-url (QBT_BASE_URL): {self.args['base_url']}")
logger.debug(f" --host (QBT_HOST): {self.args['host']}")
# Log run commands (which may come from config or env)
logger.separator(command_source, space=False, border=False, loglevel="DEBUG")


@ -85,7 +85,6 @@ class Qbt:
except Exception as exc:
self.config.notify(exc, "Qbittorrent")
raise Failed(exc)
logger.separator("Getting Torrent List", space=False, border=False)
self.torrent_list = self.get_torrents({"sort": "added_on"})
self.torrentfiles = {} # a map of torrent files to track cross-seeds


@ -151,6 +151,21 @@ def format_stats_summary(stats: dict, config) -> list[str]:
return stats_output
def in_docker():
# Docker 1.13+ puts this file inside containers
if os.path.exists("/.dockerenv"):
return True
# Fallback: check cgroup info
try:
with open("/proc/1/cgroup") as f:
return any("docker" in line or "kubepods" in line or "containerd" in line or "lxc" in line for line in f)
except FileNotFoundError:
pass
return False
# Global variables for get_arg function
test_value = None
static_envs = []
@ -228,25 +243,31 @@ def _platform_config_base() -> Path:
return base / "qbit-manage"
def get_default_config_dir(config_hint: str = None) -> str:
def get_default_config_dir(config_hint: str = None, config_dir: str = None) -> str:
"""
Determine the default persistent config directory, leveraging a provided config path/pattern first.
Resolution order:
1) If config_hint is an absolute path or contains a directory component, use its parent directory
2) Otherwise, if config_hint is a name/pattern (e.g. 'config.yml'), search common bases for:
- A direct match to that filename/pattern
- OR a persisted scheduler file 'schedule.yml' (so we don't lose an existing schedule when config.yml is absent)
Common bases (in order):
- /config (container volume)
- repository ./config
- user OS config directory
Return the first base containing either.
3) Fallback to legacy-ish behavior:
- /config if it contains any *.yml / *.yaml
- otherwise user OS config directory
1) If config_dir is provided, use it directly (takes precedence over config_hint)
2) If config_hint is an absolute path or contains a directory component, use its parent directory
3) Otherwise, if config_hint is a name/pattern (e.g. 'config.yml'), search common bases for:
- A direct match to that filename/pattern
- OR a persisted scheduler file 'schedule.yml' (so we don't lose an existing schedule when config.yml is absent)
Common bases (in order):
- /config (container volume)
- repository ./config
- user OS config directory
Return the first base containing either.
4) Fallback to legacy-ish behavior:
- /config if it contains any *.yml.sample / *.yaml.sample
- otherwise user OS config directory
"""
# 1) If a direct path is provided, prefer its parent directory
# 1) If config_dir is provided, use it directly (takes precedence)
if config_dir:
p = Path(config_dir).expanduser()
return str(p.resolve())
# 2) If a direct path is provided, prefer its parent directory
if config_hint:
primary = str(config_hint).split(",")[0].strip() # take first if comma-separated
if primary:
@ -274,8 +295,9 @@ def get_default_config_dir(config_hint: str = None) -> str:
pass
# 3) Fallbacks
has_yaml_sample = glob.glob(os.path.join("/config", "*.yml.sample")) or glob.glob(os.path.join("/config", "*.yaml.sample"))
has_yaml = glob.glob(os.path.join("/config", "*.yml")) or glob.glob(os.path.join("/config", "*.yaml"))
if os.path.isdir("/config") and has_yaml:
if os.path.isdir("/config") and (has_yaml_sample or has_yaml):
return "/config"
return str(_platform_config_base())
@ -285,7 +307,7 @@ def ensure_config_dir_initialized(config_dir) -> str:
Ensure the config directory exists and is initialized:
- Creates the config directory
- Creates logs/ and .backups/ subdirectories
- Seeds a default config.yml from bundled config/config.yml.sample if no *.yml/*.yaml present
- Creates an empty config.yml if no *.yml/*.yaml present
Returns the absolute config directory as a string.
"""
p = Path(config_dir).expanduser().resolve()
@ -295,14 +317,12 @@ def ensure_config_dir_initialized(config_dir) -> str:
has_yaml = any(p.glob("*.yml")) or any(p.glob("*.yaml"))
if not has_yaml:
sample = runtime_path("config", "config.yml.sample")
if sample.exists():
dest = p / "config.yml"
try:
shutil.copyfile(sample, dest)
except Exception:
# Non-fatal; if copy fails, user can create a config manually
pass
dest = p / "config.yml"
try:
dest.touch() # Create empty file
except Exception:
# Non-fatal; if creation fails, user can create a config manually
pass
return str(p)
@ -1456,12 +1476,14 @@ class EnvStr(str):
return super().__repr__()
def get_matching_config_files(config_pattern: str, default_dir: str) -> list:
def get_matching_config_files(config_pattern: str, default_dir: str, use_config_dir_mode: bool = False) -> list:
"""Get list of config files matching a pattern.
Args:
config_pattern (str): Config file pattern (e.g. "config.yml" or "config*.yml")
default_dir (str): Default directory to look for configs
use_config_dir_mode (bool): If True, use new config-dir approach (find all .yml/.yaml files)
If False, use legacy config-file approach (pattern matching)
Returns:
list: List of matching config file names
@ -1475,16 +1497,39 @@ def get_matching_config_files(config_pattern: str, default_dir: str) -> list:
else:
search_dir = default_dir
# Handle single file vs pattern
if "*" not in config_pattern:
return [config_pattern]
else:
glob_configs = glob.glob(os.path.join(search_dir, config_pattern))
if glob_configs:
# Return just the filenames without paths
return [os.path.split(x)[-1] for x in glob_configs]
if use_config_dir_mode:
# New --config-dir approach: find all .yml and .yaml files, excluding reserved files
config_files = []
for pattern in ["*.yml", "*.yaml"]:
glob_configs = glob.glob(os.path.join(search_dir, pattern))
for config_file in glob_configs:
filename = os.path.basename(config_file)
# Exclude reserved files
if filename != "schedule.yml":
config_files.append(filename)
if config_files:
# Return just the filenames without paths, sorted for consistency
return sorted(config_files)
else:
raise Failed(f"Config Error: Unable to find any config files in the pattern '{config_pattern}'")
raise Failed(f"Config Error: Unable to find any config files in '{search_dir}'")
else:
# Legacy --config-file approach: pattern matching
# Handle single file vs pattern
if "*" not in config_pattern:
# For single file, check if it exists
if os.path.exists(os.path.join(search_dir, config_pattern)):
return [config_pattern]
else:
raise Failed(f"Config Error: Unable to find config file '{config_pattern}' in '{search_dir}'")
else:
# For patterns, use glob matching
glob_configs = glob.glob(os.path.join(search_dir, config_pattern))
if glob_configs:
# Return just the filenames without paths
return [os.path.basename(x) for x in glob_configs]
else:
raise Failed(f"Config Error: Unable to find any config files in the pattern '{config_pattern}' in '{search_dir}'")
def execute_qbit_commands(qbit_manager, commands, stats, hashes=None):
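To illustrate the config-dir discovery implemented above (a sketch; the file names are examples): with `--config-dir`, every `*.yml`/`*.yaml` file in the directory is loaded in sorted order, except the reserved `schedule.yml`.

```bash
ls /config
# config.yml  movies.yml  schedule.yml  tv.yaml

qbit-manage --config-dir /config --run
# Loads config.yml, movies.yml and tv.yaml; schedule.yml is skipped as a reserved file.
```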


@ -5,14 +5,17 @@ from __future__ import annotations
import asyncio
import json
import logging
import math
import os
import re
import shutil
import tempfile
import uuid
from contextlib import asynccontextmanager
from dataclasses import dataclass
from dataclasses import field
from datetime import datetime
from datetime import timedelta
from multiprocessing import Queue
from multiprocessing.sharedctypes import Synchronized
from pathlib import Path
@ -27,12 +30,18 @@ from fastapi import HTTPException
from fastapi import Request
from fastapi.middleware.cors import CORSMiddleware
from fastapi.responses import FileResponse
from fastapi.responses import PlainTextResponse
from fastapi.responses import RedirectResponse
from fastapi.staticfiles import StaticFiles
from humanize import precisedelta
from pydantic import BaseModel
from modules import util
from modules.config import Config
from modules.scheduler import Scheduler
from modules.util import YAML
from modules.util import EnvStr
from modules.util import execute_qbit_commands
from modules.util import format_stats_summary
from modules.util import get_matching_config_files
@ -85,6 +94,7 @@ class ValidationResponse(BaseModel):
valid: bool
errors: list[str] = []
warnings: list[str] = []
config_modified: bool = False
class HealthCheckResponse(BaseModel):
@ -276,8 +286,6 @@ class WebAPI:
async def serve_index():
# If base URL is configured, redirect to the base URL path
if base_url:
from fastapi.responses import RedirectResponse
return RedirectResponse(url=base_url + "/", status_code=302)
# Otherwise, serve the web UI normally
@ -341,7 +349,6 @@ class WebAPI:
if qbit_manager:
# Execute qBittorrent commands using shared function
from modules.util import execute_qbit_commands
execute_qbit_commands(qbit_manager, args, stats, hashes=hashes)
@ -764,44 +771,114 @@ class WebAPI:
raise HTTPException(status_code=500, detail=str(e))
async def validate_config(self, filename: str, request: ConfigRequest) -> ValidationResponse:
"""Validate a configuration."""
"""Validate a configuration using a temporary file, but persist changes if defaults are added."""
try:
errors = []
warnings = []
config_modified = False
# Create temporary config for validation
# Get the actual config file path
config_path = self.config_path / filename
if not config_path.exists():
raise HTTPException(status_code=404, detail=f"Config file '{filename}' not found")
# Load original config
original_yaml = None
try:
original_yaml = YAML(str(config_path))
except Exception as e:
logger.error(f"Error reading original config: {str(e)}")
raise HTTPException(status_code=500, detail=f"Failed to read original config: {str(e)}")
# Create temporary config file for validation
temp_config_path = None
try:
# Create a temporary file in the same directory as the config
temp_fd, temp_path = tempfile.mkstemp(suffix=".yml", dir=str(config_path.parent))
temp_config_path = Path(temp_path)
# Convert !ENV strings back to EnvStr objects before saving
processed_data = self._restore_env_objects(request.data)
# Write to temporary file for validation
temp_yaml = YAML(str(temp_config_path))
temp_yaml.data = processed_data
temp_yaml.save_preserving_format(processed_data)
# Close the file descriptor
os.close(temp_fd)
except Exception as e:
logger.error(f"Error creating temporary config: {str(e)}")
raise HTTPException(status_code=500, detail=f"Failed to create temporary config: {str(e)}")
# Create validation args using the temporary file
now = datetime.now()
temp_args = self.args.copy()
temp_args["config_file"] = filename
temp_args["config_file"] = temp_config_path.name # Use temp file name
temp_args["_from_web_api"] = True
temp_args["time"] = now.strftime("%H:%M")
temp_args["time_obj"] = now
temp_args["run"] = True
# Write temporary config file for validation
temp_config_path = self.config_path / f".temp_{filename}"
try:
self._write_yaml_config(temp_config_path, request.data)
logger.separator("Configuration Validation Check", space=False, border=False)
# Try to load config using existing validation logic
try:
Config(self.default_dir, temp_args)
except Exception as e:
errors.append(str(e))
logger.separator("Configuration Validation Failed", space=False, border=False)
valid = len(errors) == 0
if valid:
logger.separator("Configuration Valid", space=False, border=False)
# Check if temp config was modified during validation
try:
# Reload the temp config to see if it was modified
modified_temp_yaml = YAML(str(temp_config_path))
modified_temp_data = modified_temp_yaml.data.copy() if modified_temp_yaml.data else {}
# Compare the data structures
if processed_data != modified_temp_data:
config_modified = True
logger.info("Configuration was modified during validation (defaults added)")
# If config was modified, copy the changes to the original file
try:
original_yaml.data = modified_temp_data
original_yaml.save_preserving_format(modified_temp_data)
logger.info("Successfully applied validation changes to original config")
except Exception as copy_error:
logger.error(f"Failed to copy changes to original config: {str(copy_error)}")
# Don't fail the validation if we can't copy changes
except Exception as e:
logger.warning(f"Error checking if config was modified: {str(e)}")
except Exception as e:
logger.error(f"Validation failed: {str(e)}")
raise
finally:
# Clean up temporary file
if temp_config_path.exists():
temp_config_path.unlink()
try:
if temp_config_path and temp_config_path.exists():
temp_config_path.unlink()
logger.debug(f"Cleaned up temporary config file: {temp_config_path}")
except Exception as cleanup_error:
logger.warning(f"Failed to clean up temporary config file: {str(cleanup_error)}")
return ValidationResponse(valid=len(errors) == 0, errors=errors, warnings=warnings)
# Create response with modification info
response_data = {"valid": valid, "errors": errors, "warnings": warnings, "config_modified": config_modified}
logger.info(f"Validation response: {response_data}")
return ValidationResponse(**response_data)
except Exception as e:
logger.error(f"Error validating config '{filename}': {str(e)}")
raise HTTPException(status_code=500, detail=str(e))
def _write_yaml_config(self, config_path: Path, data: dict[str, Any]):
"""Write configuration data to YAML file while preserving formatting and comments."""
from modules.util import YAML
try:
logger.trace(f"Attempting to write config to: {config_path}")
@ -809,15 +886,18 @@ class WebAPI:
logger.trace(f"Data to write: {data}")
# Convert !ENV strings back to EnvStr objects
processed_data = self._restore_env_objects(data)
# Use the custom YAML class with format preservation
if config_path.exists():
# Load existing file to preserve formatting
yaml_writer = YAML(path=str(config_path))
yaml_writer.save_preserving_format(data)
yaml_writer.save_preserving_format(processed_data)
else:
# Create new file with standard formatting
yaml_writer = YAML(input_data="")
yaml_writer.data = data
yaml_writer.data = processed_data
yaml_writer.path = str(config_path)
yaml_writer.save()
@ -1021,8 +1101,6 @@ class WebAPI:
with open(docs_path, encoding="utf-8") as f:
content = f.read()
from fastapi.responses import PlainTextResponse
return PlainTextResponse(content=content, media_type="text/markdown")
except HTTPException:
@ -1169,7 +1247,6 @@ class WebAPI:
def _preserve_env_syntax(self, data):
"""Convert EnvStr objects back to !ENV syntax for frontend display"""
from modules.util import EnvStr
if isinstance(data, EnvStr):
# Return the original !ENV syntax
@ -1186,9 +1263,6 @@ class WebAPI:
def _restore_env_objects(self, data):
"""Convert !ENV syntax back to EnvStr objects for proper YAML serialization."""
import os
from modules.util import EnvStr
if isinstance(data, str) and data.startswith("!ENV "):
env_var = data[5:] # Remove "!ENV " prefix
@ -1203,7 +1277,6 @@ class WebAPI:
def _log_env_str_values(self, data, path):
"""Helper method to log EnvStr values for debugging"""
from modules.util import EnvStr
if isinstance(data, dict):
for key, value in data.items():
@ -1224,7 +1297,6 @@ class WebAPI:
"""Get complete scheduler status including schedule configuration and persistence information."""
try:
# Always create a fresh scheduler instance to get current state
from modules.scheduler import Scheduler
fresh_scheduler = Scheduler(self.default_dir, suppress_logging=True, read_only=True)
@ -1274,8 +1346,6 @@ class WebAPI:
async def update_schedule(self, request: Request) -> dict:
"""Update and persist schedule configuration with diagnostic instrumentation."""
try:
from modules.scheduler import Scheduler
correlation_id = uuid.uuid4().hex[:12]
client_host = "n/a"
if getattr(request, "client", None):
@ -1380,8 +1450,6 @@ class WebAPI:
Toggle persistent schedule enable/disable (non-destructive) with diagnostics.
"""
try:
from modules.scheduler import Scheduler
correlation_id = uuid.uuid4().hex[:12]
scheduler = Scheduler(self.default_dir, suppress_logging=True, read_only=True)
file_exists_before = scheduler.schedule_file.exists()
@ -1448,11 +1516,6 @@ class WebAPI:
def _update_next_run_info(self, next_run: datetime):
"""Update the shared next run info dictionary."""
try:
import math
from datetime import timedelta
from humanize import precisedelta
current_time = datetime.now()
current = current_time.strftime("%I:%M %p")
time_to_run_str = next_run.strftime("%Y-%m-%d %I:%M %p")


@ -21,7 +21,7 @@ dependencies = [
"croniter==6.0.0",
"fastapi==0.116.1",
"GitPython==3.1.45",
"humanize==4.12.3",
"humanize==4.13.0",
"pytimeparse2==1.7.1",
"qbittorrent-api==2025.7.0",
"requests==2.32.5",
@ -40,7 +40,7 @@ Repository = "https://github.com/StuffAnThings/qbit_manage"
[project.optional-dependencies]
dev = [
"pre-commit==4.3.0",
"ruff==0.12.10",
"ruff==0.12.11",
]
[tool.ruff]


@ -24,6 +24,7 @@ from modules.util import format_stats_summary
from modules.util import get_arg
from modules.util import get_default_config_dir
from modules.util import get_matching_config_files
from modules.util import in_docker
try:
from croniter import croniter
@ -55,6 +56,14 @@ parser.add_argument(
help="Start the webUI server to handle command requests via HTTP API. "
"Default: enabled on desktop (non-Docker) runs; disabled in Docker.",
)
parser.add_argument(
"-H",
"--host",
dest="host",
type=str,
default="0.0.0.0",
help="Hostname for the web server (default: 0.0.0.0).",
)
parser.add_argument(
"-p",
"--port",
@ -107,9 +116,17 @@ parser.add_argument(
action="store",
default="config.yml",
type=str,
help=argparse.SUPPRESS,
)
parser.add_argument(
"-cd",
"--config-dir",
dest="config_dir",
action="store",
default=None,
type=str,
help=(
"This is used if you want to use a different name for your config.yml or if you want to load multiple"
"config files using *. Example: tv.yml or config*.yml"
"This is used to specify the configuration directory. It will treat all YAML files in this directory as valid configs."
),
)
parser.add_argument(
@ -253,17 +270,19 @@ except ImportError:
git_branch = None
env_version = get_arg("BRANCH_NAME", "master")
is_docker = get_arg("QBM_DOCKER", False, arg_bool=True)
is_docker = get_arg("QBM_DOCKER", False, arg_bool=True) or in_docker()
web_server = get_arg("QBT_WEB_SERVER", args.web_server, arg_bool=True)
# Auto-enable web server by default on non-Docker if not explicitly set via env/flag
if web_server is None and not is_docker:
web_server = True
host = get_arg("QBT_HOST", args.host, arg_int=True)
port = get_arg("QBT_PORT", args.port, arg_int=True)
base_url = get_arg("QBT_BASE_URL", args.base_url)
run = get_arg("QBT_RUN", args.run, arg_bool=True)
sch = get_arg("QBT_SCHEDULE", args.schedule)
startupDelay = get_arg("QBT_STARTUP_DELAY", args.startupDelay)
config_files = get_arg("QBT_CONFIG", args.configfiles)
config_dir = get_arg("QBT_CONFIG_DIR", args.config_dir)
log_file = get_arg("QBT_LOGFILE", args.logfile)
recheck = get_arg("QBT_RECHECK", args.recheck, arg_bool=True)
cat_update = get_arg("QBT_CAT_UPDATE", args.cat_update, arg_bool=True)
@ -293,10 +312,13 @@ stats = {}
args = {}
scheduler = None # Global scheduler instance
default_dir = ensure_config_dir_initialized(get_default_config_dir(config_files))
default_dir = ensure_config_dir_initialized(get_default_config_dir(config_files, config_dir))
args["config_dir"] = default_dir
args["config_dir_args"] = config_dir
config_files = get_matching_config_files(config_files, default_dir)
# Use config_dir_mode if --config-dir was provided, otherwise use legacy mode
use_config_dir_mode = config_dir is not None and config_files
config_files = get_matching_config_files(config_files, default_dir, use_config_dir_mode)
for v in [
@ -324,6 +346,7 @@ for v in [
"debug",
"trace",
"web_server",
"host",
"port",
"base_url",
]:
@ -597,6 +620,7 @@ def end():
# Define the web server target at module level (required for Windows spawn/frozen PyInstaller)
def run_web_server(
host,
port,
process_args,
is_running,
@ -628,7 +652,7 @@ def run_web_server(
# Configure and run uvicorn
import uvicorn as _uvicorn
config = _uvicorn.Config(app, host="0.0.0.0", port=port, log_level="info", access_log=False)
config = _uvicorn.Config(app, host=host, port=port, log_level="info", access_log=False)
server = _uvicorn.Server(config)
server.run()
except KeyboardInterrupt:
@ -655,7 +679,8 @@ def print_logo(logger):
logger.info_center(r" \__, |_.__/|_|\__| |_| |_| |_|\__,_|_| |_|\__,_|\__, |\___|") # noqa: W605
logger.info_center(" | | ______ __/ | ") # noqa: W605
logger.info_center(" |_| |______| |___/ ") # noqa: W605
system_ver = "Docker" if is_docker else f"Python {platform.python_version()}"
python_ver = f"Python {platform.python_version()}"
system_ver = f"Docker: {python_ver}" if is_docker else python_ver
logger.info(f" Version: {version[0]} ({system_ver}){f' (Git: {git_branch})' if git_branch else ''}")
latest_version = util.current_version(version, branch=branch)
new_version = (
@ -736,12 +761,12 @@ def main():
logger.debug(f"Unknown CLI arguments ignored: {_unknown_cli}")
logger.separator("Starting Web Server")
logger.info(f"Web API server running on http://0.0.0.0:{port}")
logger.info(f"Web API server running on http://{host}:{port}")
if base_url:
logger.info(f"Access the WebUI at http://localhost:{port}/{base_url.lstrip('/')}")
logger.info(f"Root path http://localhost:{port}/ will redirect to the WebUI")
logger.info(f"Access the WebUI at http://{host}:{port}/{base_url.lstrip('/')}")
logger.info(f"Root path http://{host}:{port}/ will redirect to the WebUI")
else:
logger.info(f"Access the WebUI at http://localhost:{port}")
logger.info(f"Access the WebUI at http://{host}:{port}")
# Create a copy of args to pass to the web server process
process_args = args.copy()
@ -750,6 +775,7 @@ def main():
web_process = multiprocessing.Process(
target=run_web_server,
args=(
host,
port,
process_args,
is_running,
@ -766,7 +792,7 @@ def main():
is_desktop_app = os.getenv("QBT_DESKTOP_APP", "").lower() == "true"
if not is_docker and not is_desktop_app:
try:
ui_url = f"http://127.0.0.1:{port}"
ui_url = f"http://{'127.0.0.1' if host == '0.0.0.0' else host}:{port}"
if base_url:
ui_url = f"{ui_url}/{base_url.lstrip('/')}"
threading.Thread(target=_open_browser_when_ready, args=(ui_url, logger), daemon=True).start()


@ -9,55 +9,18 @@ fi
# CI: For pull_request events, check if the PR itself changes VERSION.
# If not, run the develop version updater. This avoids relying on staged files.
if [[ "$IN_CI" == "true" ]]; then
BASE_REF="${GITHUB_BASE_REF:-}"
# If BASE_REF not provided (e.g., pre-commit.ci), infer remote default branch
if [[ -z "$BASE_REF" ]]; then
DEFAULT_BASE="$(git symbolic-ref -q --short refs/remotes/origin/HEAD 2>/dev/null | sed 's#^origin/##')"
if [[ -z "$DEFAULT_BASE" ]]; then
DEFAULT_BASE="$(git remote show origin 2>/dev/null | sed -n 's/.*HEAD branch: //p' | head -n1)"
fi
BASE_REF="$DEFAULT_BASE"
fi
# Resolve a usable base ref
CANDIDATES=()
if [[ -n "$BASE_REF" ]]; then
CANDIDATES+=("refs/remotes/origin/$BASE_REF")
CANDIDATES+=("refs/heads/$BASE_REF")
fi
BASE_RESOLVED=""
for ref in "${CANDIDATES[@]}"; do
if git rev-parse --verify -q "$ref" >/dev/null; then
BASE_RESOLVED="$ref"
break
fi
done
# Attempt to fetch the remote-tracking base if missing (handles shallow clones)
if [[ -z "$BASE_RESOLVED" && -n "$BASE_REF" ]]; then
git fetch --no-tags --depth=100 origin "refs/heads/$BASE_REF:refs/remotes/origin/$BASE_REF" >/dev/null 2>&1 || true
if git rev-parse --verify -q "refs/remotes/origin/$BASE_REF" >/dev/null; then
BASE_RESOLVED="refs/remotes/origin/$BASE_REF"
elif git rev-parse --verify -q "refs/heads/$BASE_REF" >/dev/null; then
BASE_RESOLVED="refs/heads/$BASE_REF"
fi
fi
if [[ -z "$BASE_RESOLVED" ]]; then
echo "Warning: Could not resolve PR base ref for '$BASE_REF'."
echo "Hint: ensure the base ref is available (e.g., full fetch)."
echo "Skipping version update because PR base could not be resolved."
# Check if VERSION file contains "develop"
if ! grep -q "develop" VERSION; then
echo "VERSION file does not contain 'develop'. Skipping version update."
exit 0
fi
# If diff is quiet, there were no changes to VERSION between base and head.
if git diff --quiet "$BASE_RESOLVED...HEAD" -- VERSION; then
echo "No VERSION bump detected in PR range ($BASE_RESOLVED...HEAD). Updating develop version."
# Check if VERSION differs from develop branch
if git diff --quiet origin/develop...HEAD -- VERSION 2>/dev/null; then
echo "VERSION file is the same as in develop branch. User didn't bump version, so updating develop version."
source "$(dirname "$0")/update_develop_version.sh"
else
echo "PR includes a VERSION change. Skipping version update."
echo "VERSION file differs from develop branch. User already bumped version. Skipping update."
fi
exit 0
fi
@ -70,15 +33,30 @@ if [[ "$IN_CI" != "true" && -z $(git diff --cached --name-only) ]]; then
exit 0
fi
# Check if the VERSION file is staged for modification
if git diff --cached --name-only | grep -q "VERSION"; then
echo "The VERSION file is already modified. Skipping version update."
exit 0
elif git diff --name-only | grep -q "VERSION"; then
echo "The VERSION file has unstaged changes. Please stage them before committing."
exit 0
elif ! git show --name-only HEAD | grep -q "VERSION"; then
# For local development, check if VERSION contains "develop"
if [[ "$IN_CI" != "true" ]]; then
# Check if VERSION file contains "develop"
if ! grep -q "develop" VERSION; then
echo "VERSION file does not contain 'develop'. Skipping version update."
exit 0
# Check if the VERSION file is staged for modification
elif git diff --cached --name-only | grep -q "VERSION"; then
echo "The VERSION file is already modified. Skipping version update."
exit 0
elif git diff --name-only | grep -q "VERSION"; then
echo "The VERSION file has unstaged changes. Please stage them before committing."
exit 0
fi
fi
# Check if we should run version update
if ! git diff --quiet origin/develop -- VERSION 2>/dev/null; then
# VERSION differs from develop branch, so we should update it
source "$(dirname "$0")/update_develop_version.sh"
elif [[ -n "$(git diff --cached --name-only)" ]] && ! git diff --cached --name-only | grep -q "VERSION"; then
# There are staged changes but VERSION is not among them
source "$(dirname "$0")/update_develop_version.sh"
elif ! git show --name-only HEAD | grep -q "VERSION"; then
# VERSION doesn't exist in HEAD (new file)
source "$(dirname "$0")/update_develop_version.sh"
fi

uv.lock (generated)

@ -261,11 +261,11 @@ wheels = [
[[package]]
name = "humanize"
version = "4.12.3"
version = "4.13.0"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/22/d1/bbc4d251187a43f69844f7fd8941426549bbe4723e8ff0a7441796b0789f/humanize-4.12.3.tar.gz", hash = "sha256:8430be3a615106fdfceb0b2c1b41c4c98c6b0fc5cc59663a5539b111dd325fb0", size = 80514, upload-time = "2025-04-30T11:51:07.98Z" }
sdist = { url = "https://files.pythonhosted.org/packages/98/1d/3062fcc89ee05a715c0b9bfe6490c00c576314f27ffee3a704122c6fd259/humanize-4.13.0.tar.gz", hash = "sha256:78f79e68f76f0b04d711c4e55d32bebef5be387148862cb1ef83d2b58e7935a0", size = 81884, upload-time = "2025-08-25T09:39:20.04Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/a0/1e/62a2ec3104394a2975a2629eec89276ede9dbe717092f6966fcf963e1bf0/humanize-4.12.3-py3-none-any.whl", hash = "sha256:2cbf6370af06568fa6d2da77c86edb7886f3160ecd19ee1ffef07979efc597f6", size = 128487, upload-time = "2025-04-30T11:51:06.468Z" },
{ url = "https://files.pythonhosted.org/packages/1e/c7/316e7ca04d26695ef0635dc81683d628350810eb8e9b2299fc08ba49f366/humanize-4.13.0-py3-none-any.whl", hash = "sha256:b810820b31891813b1673e8fec7f1ed3312061eab2f26e3fa192c393d11ed25f", size = 128869, upload-time = "2025-08-25T09:39:18.54Z" },
]
[[package]]
@ -306,11 +306,11 @@ wheels = [
[[package]]
name = "platformdirs"
version = "4.3.8"
version = "4.4.0"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/fe/8b/3c73abc9c759ecd3f1f7ceff6685840859e8070c4d947c93fae71f6a0bf2/platformdirs-4.3.8.tar.gz", hash = "sha256:3d512d96e16bcb959a814c9f348431070822a6496326a4be0911c40b5a74c2bc", size = 21362, upload-time = "2025-05-07T22:47:42.121Z" }
sdist = { url = "https://files.pythonhosted.org/packages/23/e8/21db9c9987b0e728855bd57bff6984f67952bea55d6f75e055c46b5383e8/platformdirs-4.4.0.tar.gz", hash = "sha256:ca753cf4d81dc309bc67b0ea38fd15dc97bc30ce419a7f58d13eb3bf14c4febf", size = 21634, upload-time = "2025-08-26T14:32:04.268Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/fe/39/979e8e21520d4e47a0bbe349e2713c0aac6f3d853d0e5b34d76206c439aa/platformdirs-4.3.8-py3-none-any.whl", hash = "sha256:ff7059bb7eb1179e2685604f4aaf157cfd9535242bd23742eadc3c13542139b4", size = 18567, upload-time = "2025-05-07T22:47:40.376Z" },
{ url = "https://files.pythonhosted.org/packages/40/4b/2028861e724d3bd36227adfa20d3fd24c3fc6d52032f4a93c133be5d17ce/platformdirs-4.4.0-py3-none-any.whl", hash = "sha256:abd01743f24e5287cd7a5db3752faf1a2d65353f38ec26d98e25a6db65958c85", size = 18654, upload-time = "2025-08-26T14:32:02.735Z" },
]
[[package]]
@ -565,14 +565,14 @@ requires-dist = [
{ name = "croniter", specifier = "==6.0.0" },
{ name = "fastapi", specifier = "==0.116.1" },
{ name = "gitpython", specifier = "==3.1.45" },
{ name = "humanize", specifier = "==4.12.3" },
{ name = "humanize", specifier = "==4.13.0" },
{ name = "pre-commit", marker = "extra == 'dev'", specifier = "==4.3.0" },
{ name = "pytimeparse2", specifier = "==1.7.1" },
{ name = "qbittorrent-api", specifier = "==2025.7.0" },
{ name = "requests", specifier = "==2.32.4" },
{ name = "requests", specifier = "==2.32.5" },
{ name = "retrying", specifier = "==1.4.2" },
{ name = "ruamel-yaml", specifier = "==0.18.14" },
{ name = "ruff", marker = "extra == 'dev'", specifier = "==0.12.9" },
{ name = "ruamel-yaml", specifier = "==0.18.15" },
{ name = "ruff", marker = "extra == 'dev'", specifier = "==0.12.11" },
{ name = "uvicorn", specifier = "==0.35.0" },
]
provides-extras = ["dev"]
@ -593,7 +593,7 @@ wheels = [
[[package]]
name = "requests"
version = "2.32.4"
version = "2.32.5"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "certifi" },
@ -601,9 +601,9 @@ dependencies = [
{ name = "idna" },
{ name = "urllib3" },
]
sdist = { url = "https://files.pythonhosted.org/packages/e1/0a/929373653770d8a0d7ea76c37de6e41f11eb07559b103b1c02cafb3f7cf8/requests-2.32.4.tar.gz", hash = "sha256:27d0316682c8a29834d3264820024b62a36942083d52caf2f14c0591336d3422", size = 135258, upload-time = "2025-06-09T16:43:07.34Z" }
sdist = { url = "https://files.pythonhosted.org/packages/c9/74/b3ff8e6c8446842c3f5c837e9c3dfcfe2018ea6ecef224c710c85ef728f4/requests-2.32.5.tar.gz", hash = "sha256:dbba0bac56e100853db0ea71b82b4dfd5fe2bf6d3754a8893c3af500cec7d7cf", size = 134517, upload-time = "2025-08-18T20:46:02.573Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/7c/e4/56027c4a6b4ae70ca9de302488c5ca95ad4a39e190093d6c1a8ace08341b/requests-2.32.4-py3-none-any.whl", hash = "sha256:27babd3cda2a6d50b30443204ee89830707d396671944c998b5975b031ac2b2c", size = 64847, upload-time = "2025-06-09T16:43:05.728Z" },
{ url = "https://files.pythonhosted.org/packages/1e/db/4254e3eabe8020b458f1a747140d32277ec7a271daf1d235b70dc0b4e6e3/requests-2.32.5-py3-none-any.whl", hash = "sha256:2462f94637a34fd532264295e186976db0f5d453d1cdd31473c85a6a161affb6", size = 64738, upload-time = "2025-08-18T20:46:00.542Z" },
]
[[package]]
@ -617,14 +617,14 @@ wheels = [
[[package]]
name = "ruamel-yaml"
version = "0.18.14"
version = "0.18.15"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "ruamel-yaml-clib", marker = "python_full_version < '3.14' and platform_python_implementation == 'CPython'" },
]
sdist = { url = "https://files.pythonhosted.org/packages/39/87/6da0df742a4684263261c253f00edd5829e6aca970fff69e75028cccc547/ruamel.yaml-0.18.14.tar.gz", hash = "sha256:7227b76aaec364df15936730efbf7d72b30c0b79b1d578bbb8e3dcb2d81f52b7", size = 145511, upload-time = "2025-06-09T08:51:09.828Z" }
sdist = { url = "https://files.pythonhosted.org/packages/3e/db/f3950f5e5031b618aae9f423a39bf81a55c148aecd15a34527898e752cf4/ruamel.yaml-0.18.15.tar.gz", hash = "sha256:dbfca74b018c4c3fba0b9cc9ee33e53c371194a9000e694995e620490fd40700", size = 146865, upload-time = "2025-08-19T11:15:10.694Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/af/6d/6fe4805235e193aad4aaf979160dd1f3c487c57d48b810c816e6e842171b/ruamel.yaml-0.18.14-py3-none-any.whl", hash = "sha256:710ff198bb53da66718c7db27eec4fbcc9aa6ca7204e4c1df2f282b6fe5eb6b2", size = 118570, upload-time = "2025-06-09T08:51:06.348Z" },
{ url = "https://files.pythonhosted.org/packages/d1/e5/f2a0621f1781b76a38194acae72f01e37b1941470407345b6e8653ad7640/ruamel.yaml-0.18.15-py3-none-any.whl", hash = "sha256:148f6488d698b7a5eded5ea793a025308b25eca97208181b6a026037f391f701", size = 119702, upload-time = "2025-08-19T11:15:07.696Z" },
]
[[package]]
@ -682,28 +682,28 @@ wheels = [
[[package]]
name = "ruff"
version = "0.12.9"
version = "0.12.11"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/4a/45/2e403fa7007816b5fbb324cb4f8ed3c7402a927a0a0cb2b6279879a8bfdc/ruff-0.12.9.tar.gz", hash = "sha256:fbd94b2e3c623f659962934e52c2bea6fc6da11f667a427a368adaf3af2c866a", size = 5254702, upload-time = "2025-08-14T16:08:55.2Z" }
sdist = { url = "https://files.pythonhosted.org/packages/de/55/16ab6a7d88d93001e1ae4c34cbdcfb376652d761799459ff27c1dc20f6fa/ruff-0.12.11.tar.gz", hash = "sha256:c6b09ae8426a65bbee5425b9d0b82796dbb07cb1af045743c79bfb163001165d", size = 5347103, upload-time = "2025-08-28T13:59:08.87Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/ad/20/53bf098537adb7b6a97d98fcdebf6e916fcd11b2e21d15f8c171507909cc/ruff-0.12.9-py3-none-linux_armv6l.whl", hash = "sha256:fcebc6c79fcae3f220d05585229463621f5dbf24d79fdc4936d9302e177cfa3e", size = 11759705, upload-time = "2025-08-14T16:08:12.968Z" },
{ url = "https://files.pythonhosted.org/packages/20/4d/c764ee423002aac1ec66b9d541285dd29d2c0640a8086c87de59ebbe80d5/ruff-0.12.9-py3-none-macosx_10_12_x86_64.whl", hash = "sha256:aed9d15f8c5755c0e74467731a007fcad41f19bcce41cd75f768bbd687f8535f", size = 12527042, upload-time = "2025-08-14T16:08:16.54Z" },
{ url = "https://files.pythonhosted.org/packages/8b/45/cfcdf6d3eb5fc78a5b419e7e616d6ccba0013dc5b180522920af2897e1be/ruff-0.12.9-py3-none-macosx_11_0_arm64.whl", hash = "sha256:5b15ea354c6ff0d7423814ba6d44be2807644d0c05e9ed60caca87e963e93f70", size = 11724457, upload-time = "2025-08-14T16:08:18.686Z" },
{ url = "https://files.pythonhosted.org/packages/72/e6/44615c754b55662200c48bebb02196dbb14111b6e266ab071b7e7297b4ec/ruff-0.12.9-py3-none-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:d596c2d0393c2502eaabfef723bd74ca35348a8dac4267d18a94910087807c53", size = 11949446, upload-time = "2025-08-14T16:08:21.059Z" },
{ url = "https://files.pythonhosted.org/packages/fd/d1/9b7d46625d617c7df520d40d5ac6cdcdf20cbccb88fad4b5ecd476a6bb8d/ruff-0.12.9-py3-none-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:1b15599931a1a7a03c388b9c5df1bfa62be7ede6eb7ef753b272381f39c3d0ff", size = 11566350, upload-time = "2025-08-14T16:08:23.433Z" },
{ url = "https://files.pythonhosted.org/packages/59/20/b73132f66f2856bc29d2d263c6ca457f8476b0bbbe064dac3ac3337a270f/ruff-0.12.9-py3-none-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:3d02faa2977fb6f3f32ddb7828e212b7dd499c59eb896ae6c03ea5c303575756", size = 13270430, upload-time = "2025-08-14T16:08:25.837Z" },
{ url = "https://files.pythonhosted.org/packages/a2/21/eaf3806f0a3d4c6be0a69d435646fba775b65f3f2097d54898b0fd4bb12e/ruff-0.12.9-py3-none-manylinux_2_17_ppc64.manylinux2014_ppc64.whl", hash = "sha256:17d5b6b0b3a25259b69ebcba87908496e6830e03acfb929ef9fd4c58675fa2ea", size = 14264717, upload-time = "2025-08-14T16:08:27.907Z" },
{ url = "https://files.pythonhosted.org/packages/d2/82/1d0c53bd37dcb582b2c521d352fbf4876b1e28bc0d8894344198f6c9950d/ruff-0.12.9-py3-none-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:72db7521860e246adbb43f6ef464dd2a532ef2ef1f5dd0d470455b8d9f1773e0", size = 13684331, upload-time = "2025-08-14T16:08:30.352Z" },
{ url = "https://files.pythonhosted.org/packages/3b/2f/1c5cf6d8f656306d42a686f1e207f71d7cebdcbe7b2aa18e4e8a0cb74da3/ruff-0.12.9-py3-none-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:a03242c1522b4e0885af63320ad754d53983c9599157ee33e77d748363c561ce", size = 12739151, upload-time = "2025-08-14T16:08:32.55Z" },
{ url = "https://files.pythonhosted.org/packages/47/09/25033198bff89b24d734e6479e39b1968e4c992e82262d61cdccaf11afb9/ruff-0.12.9-py3-none-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9fc83e4e9751e6c13b5046d7162f205d0a7bac5840183c5beebf824b08a27340", size = 12954992, upload-time = "2025-08-14T16:08:34.816Z" },
{ url = "https://files.pythonhosted.org/packages/52/8e/d0dbf2f9dca66c2d7131feefc386523404014968cd6d22f057763935ab32/ruff-0.12.9-py3-none-manylinux_2_31_riscv64.whl", hash = "sha256:881465ed56ba4dd26a691954650de6ad389a2d1fdb130fe51ff18a25639fe4bb", size = 12899569, upload-time = "2025-08-14T16:08:36.852Z" },
{ url = "https://files.pythonhosted.org/packages/a0/bd/b614d7c08515b1428ed4d3f1d4e3d687deffb2479703b90237682586fa66/ruff-0.12.9-py3-none-musllinux_1_2_aarch64.whl", hash = "sha256:43f07a3ccfc62cdb4d3a3348bf0588358a66da756aa113e071b8ca8c3b9826af", size = 11751983, upload-time = "2025-08-14T16:08:39.314Z" },
{ url = "https://files.pythonhosted.org/packages/58/d6/383e9f818a2441b1a0ed898d7875f11273f10882f997388b2b51cb2ae8b5/ruff-0.12.9-py3-none-musllinux_1_2_armv7l.whl", hash = "sha256:07adb221c54b6bba24387911e5734357f042e5669fa5718920ee728aba3cbadc", size = 11538635, upload-time = "2025-08-14T16:08:41.297Z" },
{ url = "https://files.pythonhosted.org/packages/20/9c/56f869d314edaa9fc1f491706d1d8a47747b9d714130368fbd69ce9024e9/ruff-0.12.9-py3-none-musllinux_1_2_i686.whl", hash = "sha256:f5cd34fabfdea3933ab85d72359f118035882a01bff15bd1d2b15261d85d5f66", size = 12534346, upload-time = "2025-08-14T16:08:43.39Z" },
{ url = "https://files.pythonhosted.org/packages/bd/4b/d8b95c6795a6c93b439bc913ee7a94fda42bb30a79285d47b80074003ee7/ruff-0.12.9-py3-none-musllinux_1_2_x86_64.whl", hash = "sha256:f6be1d2ca0686c54564da8e7ee9e25f93bdd6868263805f8c0b8fc6a449db6d7", size = 13017021, upload-time = "2025-08-14T16:08:45.889Z" },
{ url = "https://files.pythonhosted.org/packages/c7/c1/5f9a839a697ce1acd7af44836f7c2181cdae5accd17a5cb85fcbd694075e/ruff-0.12.9-py3-none-win32.whl", hash = "sha256:cc7a37bd2509974379d0115cc5608a1a4a6c4bff1b452ea69db83c8855d53f93", size = 11734785, upload-time = "2025-08-14T16:08:48.062Z" },
{ url = "https://files.pythonhosted.org/packages/fa/66/cdddc2d1d9a9f677520b7cfc490d234336f523d4b429c1298de359a3be08/ruff-0.12.9-py3-none-win_amd64.whl", hash = "sha256:6fb15b1977309741d7d098c8a3cb7a30bc112760a00fb6efb7abc85f00ba5908", size = 12840654, upload-time = "2025-08-14T16:08:50.158Z" },
{ url = "https://files.pythonhosted.org/packages/ac/fd/669816bc6b5b93b9586f3c1d87cd6bc05028470b3ecfebb5938252c47a35/ruff-0.12.9-py3-none-win_arm64.whl", hash = "sha256:63c8c819739d86b96d500cce885956a1a48ab056bbcbc61b747ad494b2485089", size = 11949623, upload-time = "2025-08-14T16:08:52.233Z" },
{ url = "https://files.pythonhosted.org/packages/d6/a2/3b3573e474de39a7a475f3fbaf36a25600bfeb238e1a90392799163b64a0/ruff-0.12.11-py3-none-linux_armv6l.whl", hash = "sha256:93fce71e1cac3a8bf9200e63a38ac5c078f3b6baebffb74ba5274fb2ab276065", size = 11979885, upload-time = "2025-08-28T13:58:26.654Z" },
{ url = "https://files.pythonhosted.org/packages/76/e4/235ad6d1785a2012d3ded2350fd9bc5c5af8c6f56820e696b0118dfe7d24/ruff-0.12.11-py3-none-macosx_10_12_x86_64.whl", hash = "sha256:b8e33ac7b28c772440afa80cebb972ffd823621ded90404f29e5ab6d1e2d4b93", size = 12742364, upload-time = "2025-08-28T13:58:30.256Z" },
{ url = "https://files.pythonhosted.org/packages/2c/0d/15b72c5fe6b1e402a543aa9d8960e0a7e19dfb079f5b0b424db48b7febab/ruff-0.12.11-py3-none-macosx_11_0_arm64.whl", hash = "sha256:d69fb9d4937aa19adb2e9f058bc4fbfe986c2040acb1a4a9747734834eaa0bfd", size = 11920111, upload-time = "2025-08-28T13:58:33.677Z" },
{ url = "https://files.pythonhosted.org/packages/3e/c0/f66339d7893798ad3e17fa5a1e587d6fd9806f7c1c062b63f8b09dda6702/ruff-0.12.11-py3-none-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:411954eca8464595077a93e580e2918d0a01a19317af0a72132283e28ae21bee", size = 12160060, upload-time = "2025-08-28T13:58:35.74Z" },
{ url = "https://files.pythonhosted.org/packages/03/69/9870368326db26f20c946205fb2d0008988aea552dbaec35fbacbb46efaa/ruff-0.12.11-py3-none-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:6a2c0a2e1a450f387bf2c6237c727dd22191ae8c00e448e0672d624b2bbd7fb0", size = 11799848, upload-time = "2025-08-28T13:58:38.051Z" },
{ url = "https://files.pythonhosted.org/packages/25/8c/dd2c7f990e9b3a8a55eee09d4e675027d31727ce33cdb29eab32d025bdc9/ruff-0.12.11-py3-none-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:8ca4c3a7f937725fd2413c0e884b5248a19369ab9bdd850b5781348ba283f644", size = 13536288, upload-time = "2025-08-28T13:58:40.046Z" },
{ url = "https://files.pythonhosted.org/packages/7a/30/d5496fa09aba59b5e01ea76775a4c8897b13055884f56f1c35a4194c2297/ruff-0.12.11-py3-none-manylinux_2_17_ppc64.manylinux2014_ppc64.whl", hash = "sha256:4d1df0098124006f6a66ecf3581a7f7e754c4df7644b2e6704cd7ca80ff95211", size = 14490633, upload-time = "2025-08-28T13:58:42.285Z" },
{ url = "https://files.pythonhosted.org/packages/9b/2f/81f998180ad53445d403c386549d6946d0748e536d58fce5b5e173511183/ruff-0.12.11-py3-none-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:5a8dd5f230efc99a24ace3b77e3555d3fbc0343aeed3fc84c8d89e75ab2ff793", size = 13888430, upload-time = "2025-08-28T13:58:44.641Z" },
{ url = "https://files.pythonhosted.org/packages/87/71/23a0d1d5892a377478c61dbbcffe82a3476b050f38b5162171942a029ef3/ruff-0.12.11-py3-none-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:4dc75533039d0ed04cd33fb8ca9ac9620b99672fe7ff1533b6402206901c34ee", size = 12913133, upload-time = "2025-08-28T13:58:47.039Z" },
{ url = "https://files.pythonhosted.org/packages/80/22/3c6cef96627f89b344c933781ed38329bfb87737aa438f15da95907cbfd5/ruff-0.12.11-py3-none-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:4fc58f9266d62c6eccc75261a665f26b4ef64840887fc6cbc552ce5b29f96cc8", size = 13169082, upload-time = "2025-08-28T13:58:49.157Z" },
{ url = "https://files.pythonhosted.org/packages/05/b5/68b3ff96160d8b49e8dd10785ff3186be18fd650d356036a3770386e6c7f/ruff-0.12.11-py3-none-manylinux_2_31_riscv64.whl", hash = "sha256:5a0113bd6eafd545146440225fe60b4e9489f59eb5f5f107acd715ba5f0b3d2f", size = 13139490, upload-time = "2025-08-28T13:58:51.593Z" },
{ url = "https://files.pythonhosted.org/packages/59/b9/050a3278ecd558f74f7ee016fbdf10591d50119df8d5f5da45a22c6afafc/ruff-0.12.11-py3-none-musllinux_1_2_aarch64.whl", hash = "sha256:0d737b4059d66295c3ea5720e6efc152623bb83fde5444209b69cd33a53e2000", size = 11958928, upload-time = "2025-08-28T13:58:53.943Z" },
{ url = "https://files.pythonhosted.org/packages/f9/bc/93be37347db854806904a43b0493af8d6873472dfb4b4b8cbb27786eb651/ruff-0.12.11-py3-none-musllinux_1_2_armv7l.whl", hash = "sha256:916fc5defee32dbc1fc1650b576a8fed68f5e8256e2180d4d9855aea43d6aab2", size = 11764513, upload-time = "2025-08-28T13:58:55.976Z" },
{ url = "https://files.pythonhosted.org/packages/7a/a1/1471751e2015a81fd8e166cd311456c11df74c7e8769d4aabfbc7584c7ac/ruff-0.12.11-py3-none-musllinux_1_2_i686.whl", hash = "sha256:c984f07d7adb42d3ded5be894fb4007f30f82c87559438b4879fe7aa08c62b39", size = 12745154, upload-time = "2025-08-28T13:58:58.16Z" },
{ url = "https://files.pythonhosted.org/packages/68/ab/2542b14890d0f4872dd81b7b2a6aed3ac1786fae1ce9b17e11e6df9e31e3/ruff-0.12.11-py3-none-musllinux_1_2_x86_64.whl", hash = "sha256:e07fbb89f2e9249f219d88331c833860489b49cdf4b032b8e4432e9b13e8a4b9", size = 13227653, upload-time = "2025-08-28T13:59:00.276Z" },
{ url = "https://files.pythonhosted.org/packages/22/16/2fbfc61047dbfd009c58a28369a693a1484ad15441723be1cd7fe69bb679/ruff-0.12.11-py3-none-win32.whl", hash = "sha256:c792e8f597c9c756e9bcd4d87cf407a00b60af77078c96f7b6366ea2ce9ba9d3", size = 11944270, upload-time = "2025-08-28T13:59:02.347Z" },
{ url = "https://files.pythonhosted.org/packages/08/a5/34276984705bfe069cd383101c45077ee029c3fe3b28225bf67aa35f0647/ruff-0.12.11-py3-none-win_amd64.whl", hash = "sha256:a3283325960307915b6deb3576b96919ee89432ebd9c48771ca12ee8afe4a0fd", size = 13046600, upload-time = "2025-08-28T13:59:04.751Z" },
{ url = "https://files.pythonhosted.org/packages/84/a8/001d4a7c2b37623a3fd7463208267fb906df40ff31db496157549cfd6e72/ruff-0.12.11-py3-none-win_arm64.whl", hash = "sha256:bae4d6e6a2676f8fb0f98b74594a048bae1b944aab17e9f5d504062303c6dbea", size = 12135290, upload-time = "2025-08-28T13:59:06.933Z" },
]
[[package]]
@ -735,24 +735,24 @@ wheels = [
[[package]]
name = "starlette"
version = "0.47.2"
version = "0.47.3"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "anyio" },
{ name = "typing-extensions", marker = "python_full_version < '3.13'" },
]
sdist = { url = "https://files.pythonhosted.org/packages/04/57/d062573f391d062710d4088fa1369428c38d51460ab6fedff920efef932e/starlette-0.47.2.tar.gz", hash = "sha256:6ae9aa5db235e4846decc1e7b79c4f346adf41e9777aebeb49dfd09bbd7023d8", size = 2583948, upload-time = "2025-07-20T17:31:58.522Z" }
sdist = { url = "https://files.pythonhosted.org/packages/15/b9/cc3017f9a9c9b6e27c5106cc10cc7904653c3eec0729793aec10479dd669/starlette-0.47.3.tar.gz", hash = "sha256:6bc94f839cc176c4858894f1f8908f0ab79dfec1a6b8402f6da9be26ebea52e9", size = 2584144, upload-time = "2025-08-24T13:36:42.122Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/f7/1f/b876b1f83aef204198a42dc101613fefccb32258e5428b5f9259677864b4/starlette-0.47.2-py3-none-any.whl", hash = "sha256:c5847e96134e5c5371ee9fac6fdf1a67336d5815e09eb2a01fdb57a351ef915b", size = 72984, upload-time = "2025-07-20T17:31:56.738Z" },
{ url = "https://files.pythonhosted.org/packages/ce/fd/901cfa59aaa5b30a99e16876f11abe38b59a1a2c51ffb3d7142bb6089069/starlette-0.47.3-py3-none-any.whl", hash = "sha256:89c0778ca62a76b826101e7c709e70680a1699ca7da6b44d38eb0a7e61fe4b51", size = 72991, upload-time = "2025-08-24T13:36:40.887Z" },
]
[[package]]
name = "typing-extensions"
version = "4.14.1"
version = "4.15.0"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/98/5a/da40306b885cc8c09109dc2e1abd358d5684b1425678151cdaed4731c822/typing_extensions-4.14.1.tar.gz", hash = "sha256:38b39f4aeeab64884ce9f74c94263ef78f3c22467c8724005483154c26648d36", size = 107673, upload-time = "2025-07-04T13:28:34.16Z" }
sdist = { url = "https://files.pythonhosted.org/packages/72/94/1a15dd82efb362ac84269196e94cf00f187f7ed21c242792a923cdb1c61f/typing_extensions-4.15.0.tar.gz", hash = "sha256:0cea48d173cc12fa28ecabc3b837ea3cf6f38c6d1136f85cbaaf598984861466", size = 109391, upload-time = "2025-08-25T13:49:26.313Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/b5/00/d631e67a838026495268c2f6884f3711a15a9a2a96cd244fdaea53b823fb/typing_extensions-4.14.1-py3-none-any.whl", hash = "sha256:d1e1e3b58374dc93031d6eda2420a48ea44a36c2b4766a4fdeb3710755731d76", size = 43906, upload-time = "2025-07-04T13:28:32.743Z" },
{ url = "https://files.pythonhosted.org/packages/18/67/36e9267722cc04a6b9f15c7f3441c2363321a3ea07da7ae0c0707beb2a9c/typing_extensions-4.15.0-py3-none-any.whl", hash = "sha256:f0fa19c6845758ab08074a0cfa8b7aecb71c999ca73d62883bc25cc018c4e548", size = 44614, upload-time = "2025-08-25T13:49:24.86Z" },
]
[[package]]

View file

@ -337,18 +337,7 @@
gap: var(--spacing-sm);
}
.loading-spinner {
width: 14px;
height: 14px;
border: 2px solid transparent;
border-top: 2px solid currentColor;
border-radius: 50%;
animation: spin 1s linear infinite;
}
@keyframes spin {
to { transform: rotate(360deg); }
}
/* Loading spinner now uses unified system from main.css */
/* Preset Buttons */
.preset-buttons {

View file

@ -252,7 +252,114 @@ body {
transition: all var(--transition-fast);
}
/* Animation for loading states */
/* Pulsing spin animation (rotate with a slight scale) */
@keyframes modern-spin {
0% {
transform: rotate(0deg) scale(1);
opacity: 1;
}
50% {
transform: rotate(180deg) scale(1.1);
opacity: 0.7;
}
100% {
transform: rotate(360deg) scale(1);
opacity: 1;
}
}
/* Unified Spinner System */
@keyframes smooth-rotate {
0% { transform: rotate(0deg); }
100% { transform: rotate(360deg); }
}
/* Base spinner class - can be used anywhere */
.spinner {
position: relative;
border-radius: 50%;
animation: smooth-rotate 1s linear infinite;
flex-shrink: 0;
border: 2px solid transparent;
border-top: 2px solid var(--primary-color);
}
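/* Usage: combine with a size class and, optionally, a context class,
   e.g. <span class="spinner spinner-sm spinner-button"></span> inside buttons
   or <div class="spinner spinner-lg"></div> in the loading overlay */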
/* Size variants */
.spinner-sm {
width: 16px !important;
height: 16px !important;
}
.spinner-md {
width: 24px !important;
height: 24px !important;
}
.spinner-lg {
width: 48px !important;
height: 48px !important;
}
/* Context-specific spinner styles */
.spinner.spinner-inline {
display: inline-block;
vertical-align: middle;
margin-right: var(--spacing-xs);
}
.spinner.spinner-button {
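  /* currentColor keeps the spinner arc the same color as the button's text */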
border-top-color: currentColor;
}
/* Loading overlay specific styles */
.loading-overlay {
position: fixed;
top: 0;
left: 0;
width: 100vw;
height: 100vh;
background-color: rgba(0, 0, 0, 0.6);
display: flex;
justify-content: center;
align-items: center;
z-index: var(--z-modal-backdrop);
backdrop-filter: blur(2px);
-webkit-backdrop-filter: blur(2px);
}
.loading-spinner {
display: flex;
flex-direction: column;
align-items: center;
justify-content: center;
text-align: center;
color: var(--text-inverse);
padding: var(--spacing-xl);
border-radius: var(--border-radius-lg);
background-color: rgba(255, 255, 255, 0.05);
backdrop-filter: blur(8px);
-webkit-backdrop-filter: blur(8px);
border: 1px solid rgba(255, 255, 255, 0.1);
box-shadow: 0 8px 32px rgba(0, 0, 0, 0.3);
}
/* Dark mode: lighter text for better readability */
[data-theme="dark"] .loading-spinner {
color: var(--text-secondary);
}
/* Auto dark mode support */
@media (prefers-color-scheme: dark) {
:root:not([data-theme]) .loading-spinner {
color: var(--text-secondary);
}
}
.loading-spinner .spinner {
margin-bottom: var(--spacing-md);
}
/* Legacy support - keep for backward compatibility */
@keyframes rotate {
from {
transform: rotate(0deg);
@ -693,37 +800,20 @@ body {
border: 0;
}
.loading-overlay {
position: absolute;
position: fixed;
top: 0;
left: 0;
width: 100%;
height: 100%;
background-color: rgba(0, 0, 0, 0.5);
width: 100vw;
height: 100vh;
background-color: rgba(0, 0, 0, 0.6);
display: flex;
justify-content: center;
align-items: center;
z-index: var(--z-modal-backdrop); /* Use a high z-index to ensure it's on top */
z-index: var(--z-modal-backdrop);
backdrop-filter: blur(2px);
-webkit-backdrop-filter: blur(2px);
}
.loading-spinner {
text-align: center;
color: var(--text-inverse); /* Use text-inverse for better contrast on dark overlay */
}
.spinner {
border: 4px solid rgba(255, 255, 255, 0.3);
border-top: 4px solid var(--primary-color);
border-radius: 50%;
width: 40px;
height: 40px;
animation: spin 1s linear infinite;
margin: 0 auto 10px auto;
}
@keyframes spin {
0% { transform: rotate(0deg); }
100% { transform: rotate(360deg); }
}
/* Password Input Group */
.password-input-group {

View file

@ -225,8 +225,6 @@ class QbitManageApp {
this.backupConfig();
});
// Theme toggle is handled by ThemeManager
// Undo button
const undoBtn = get('undo-btn');
if (undoBtn) {
@ -442,7 +440,15 @@ class QbitManageApp {
}
try {
const processedData = this.configForm._postprocessDataForSave(this.currentSection, this.configForm.currentData);
// For commands section, collect all form values since commands should override env vars
let dataToProcess;
if (this.currentSection === 'commands') {
dataToProcess = this.configForm.collectAllFormValues(this.currentSection);
} else {
dataToProcess = this.configForm.currentData;
}
const processedData = this.configForm._postprocessDataForSave(this.currentSection, dataToProcess);
const isMultiRoot = this.configForm.schemas[this.currentSection]?.type === 'multi-root-object';
const dataToSave = isMultiRoot ? processedData : { [this.currentSection]: processedData };
@ -629,7 +635,34 @@ class QbitManageApp {
hideLoading();
if (response.valid) {
showToast('Configuration is valid', 'success');
// Check if config was modified during validation
if (response.config_modified) {
showToast('Configuration validated successfully! Default values have been added.', 'success');
// Reload the configuration data from the server to reflect changes
try {
const configResponse = await this.api.getConfig(this.currentConfig);
if (configResponse && configResponse.data) {
// Update the app's config data
this.configData = configResponse.data;
// Reload the current section to reflect changes
if (this.configForm && this.currentSection) {
await this.configForm.loadSection(this.currentSection, this.configData[this.currentSection] || {});
}
// Update YAML preview if it's open
if (this.yamlPreviewVisible) {
this.updateYamlPreview();
}
}
} catch (reloadError) {
console.error('Error reloading config after main validation:', reloadError);
showToast('Configuration validated but failed to reload updated data.', 'warning');
}
} else {
showToast('Configuration is valid', 'success');
}
} else {
// Pass the errors array directly instead of converting to string
this.showValidationModal('Configuration Validation Failed', response.errors, response.warnings);

View file

@ -133,7 +133,7 @@ class SchedulerControl {
<button type="submit" class="btn btn-primary" id="update-schedule-btn" disabled aria-describedby="update-help">
<span class="btn-text">Save Schedule</span>
<span class="btn-loading" style="display: none;" aria-hidden="true">
<span class="loading-spinner" aria-hidden="true"></span>
<span class="spinner spinner-sm spinner-button" aria-hidden="true"></span>
Saving...
</span>
</button>

View file

@ -141,6 +141,9 @@ class CommandPanel {
</div>
`;
// Load saved quick action values after rendering
this.loadQuickActionValues();
}
bindEvents() {
@ -175,6 +178,9 @@ class CommandPanel {
this.showRunCommandsModal();
}
});
// Bind quick action input change events for persistence
this.bindQuickActionPersistence();
}
async executeQuickCommand(command) {
@ -475,6 +481,74 @@ class CommandPanel {
this.show();
}
}
// Load saved quick action values from localStorage
loadQuickActionValues() {
if (!this.drawerContainer) return;
// Get saved values, defaulting dry run to true if not previously saved
const savedDryRunValue = localStorage.getItem('qbm-quick-dry-run');
const savedDryRun = savedDryRunValue !== null ? savedDryRunValue === 'true' : true; // Default to true
const savedSkipCleanup = localStorage.getItem('qbm-quick-skip-cleanup') === 'true';
const savedSkipQbVersionCheck = localStorage.getItem('qbm-quick-skip-qb-version-check') === 'true';
const savedLogLevel = localStorage.getItem('qbm-quick-log-level') || '';
const dryRunCheckbox = this.drawerContainer.querySelector('#dry-run-checkbox');
const skipCleanupCheckbox = this.drawerContainer.querySelector('#quick-skip-cleanup-checkbox');
const skipQbVersionCheckCheckbox = this.drawerContainer.querySelector('#quick-skip-qb-version-check-checkbox');
const logLevelSelect = this.drawerContainer.querySelector('#quick-log-level-select');
if (dryRunCheckbox) dryRunCheckbox.checked = savedDryRun;
if (skipCleanupCheckbox) skipCleanupCheckbox.checked = savedSkipCleanup;
if (skipQbVersionCheckCheckbox) skipQbVersionCheckCheckbox.checked = savedSkipQbVersionCheck;
if (logLevelSelect) logLevelSelect.value = savedLogLevel;
// Persist the default dry-run value when nothing was previously saved
if (savedDryRunValue === null) {
localStorage.setItem('qbm-quick-dry-run', 'true');
}
}
// Save quick action values to localStorage
saveQuickActionValues() {
if (!this.drawerContainer) return;
const dryRunCheckbox = this.drawerContainer.querySelector('#dry-run-checkbox');
const skipCleanupCheckbox = this.drawerContainer.querySelector('#quick-skip-cleanup-checkbox');
const skipQbVersionCheckCheckbox = this.drawerContainer.querySelector('#quick-skip-qb-version-check-checkbox');
const logLevelSelect = this.drawerContainer.querySelector('#quick-log-level-select');
const dryRun = dryRunCheckbox ? dryRunCheckbox.checked : false;
const skipCleanup = skipCleanupCheckbox ? skipCleanupCheckbox.checked : false;
const skipQbVersionCheck = skipQbVersionCheckCheckbox ? skipQbVersionCheckCheckbox.checked : false;
const logLevel = logLevelSelect ? logLevelSelect.value : '';
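// localStorage stores strings only; these booleans are coerced to 'true'/'false' here
// and read back with an explicit === 'true' check in loadQuickActionValues()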
localStorage.setItem('qbm-quick-dry-run', dryRun);
localStorage.setItem('qbm-quick-skip-cleanup', skipCleanup);
localStorage.setItem('qbm-quick-skip-qb-version-check', skipQbVersionCheck);
localStorage.setItem('qbm-quick-log-level', logLevel);
}
// Bind event listeners for quick action persistence
bindQuickActionPersistence() {
if (!this.drawerContainer) return;
// Bind checkbox change events
const checkboxes = this.drawerContainer.querySelectorAll('#dry-run-checkbox, #quick-skip-cleanup-checkbox, #quick-skip-qb-version-check-checkbox');
checkboxes.forEach(checkbox => {
checkbox.addEventListener('change', () => {
this.saveQuickActionValues();
});
});
// Bind select change event
const logLevelSelect = this.drawerContainer.querySelector('#quick-log-level-select');
if (logLevelSelect) {
logLevelSelect.addEventListener('change', () => {
this.saveQuickActionValues();
});
}
}
}
export { CommandPanel };

View file

@ -1302,6 +1302,51 @@ class ConfigForm {
* @param {object} obj - The object to clean up.
* @returns {object} The cleaned up object.
*/
/**
 * Collects all current form values for a section, not just the dirty ones.
 * Used for sections like commands, where every value should be saved.
 * @param {string} sectionName - The schema section to collect values from.
 * @returns {object} Map of field names to their current form values.
 */
collectAllFormValues(sectionName) {
const sectionConfig = this.schemas[sectionName];
if (!sectionConfig || !sectionConfig.fields) {
return {};
}
const allValues = {};
// Iterate through all fields in the schema
sectionConfig.fields.forEach(field => {
if (field.name && field.type !== 'documentation' && field.type !== 'section_header') {
// Find the corresponding form input
const input = this.container.querySelector(`[name="${field.name}"]`);
if (input) {
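// Normalize the raw value by control type: checkbox -> boolean, number -> float (null when empty), otherwise string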
let value;
if (input.type === 'checkbox') {
value = input.checked;
} else if (input.type === 'number') {
value = input.value ? parseFloat(input.value) : null;
} else {
value = input.value;
}
// Handle default values for boolean fields
if (field.type === 'boolean' && value === null) {
value = field.default || false;
}
allValues[field.name] = value;
} else if (field.default !== undefined) {
// Use default value if input not found
allValues[field.name] = field.default;
}
}
});
return allValues;
}
cleanupEmptyValues(obj) {
if (obj === null || obj === undefined) {
return null;
@ -1346,7 +1391,7 @@ class ConfigForm {
return null;
}
validateSection() {
async validateSection() {
const sectionConfig = this.schemas[this.currentSection];
this.validationState = { valid: true, errors: [], warnings: [] };
@ -1372,7 +1417,55 @@ class ConfigForm {
this.updateValidationDisplay();
if (this.validationState.errors.length === 0) {
showToast('Section is valid!', 'success');
// Perform backend validation which may add default values
try {
const response = await this.api.validateConfig(this.currentSection, this.currentData);
if (response.valid) {
// Check if config was modified during validation
if (response.config_modified) {
showToast('Configuration validated successfully! Default values have been added.', 'success');
// Reload the configuration data from the server to reflect changes
try {
const configResponse = await this.api.getConfig(this.currentSection);
if (configResponse && configResponse.data) {
// Update current data with the modified config
this.currentData = this._preprocessComplexObjectData(this.currentSection, configResponse.data);
// Store initial data only once per section
if (!this.initialSectionData[this.currentSection]) {
this.initialSectionData[this.currentSection] = JSON.parse(JSON.stringify(this.currentData));
}
// Always reset to initial data when loading a section
this.originalData = JSON.parse(JSON.stringify(this.initialSectionData[this.currentSection]));
// Re-render the section with updated data
await this.renderSection();
// Notify parent component of data change
this.onDataChange(this.currentData);
}
} catch (reloadError) {
console.error('Error reloading config after validation:', reloadError);
showToast('Configuration validated but failed to reload updated data.', 'warning');
}
} else {
showToast('Configuration validated successfully!', 'success');
}
} else {
// Handle validation errors from backend
this.validationState.valid = false;
this.validationState.errors = response.errors || [];
this.validationState.warnings = response.warnings || [];
this.onValidationChange(this.validationState);
this.updateValidationDisplay();
}
} catch (error) {
console.error('Error during backend validation:', error);
showToast('Failed to validate configuration with backend.', 'error');
}
}
}

View file

@ -93,7 +93,7 @@ export class ShareLimitsComponent {
addButton.disabled = loading;
const uiTexts = shareLimitsSchema.ui?.texts || {};
addButton.innerHTML = loading
? `<span class="loading-spinner"></span> ${uiTexts.addGroupLoadingText || 'Creating...'}`
? `<span class="spinner spinner-sm spinner-button"></span> ${uiTexts.addGroupLoadingText || 'Creating...'}`
: (uiTexts.addGroupButtonText || 'Add New Group');
}
}

View file

@ -65,7 +65,7 @@ export function showLoading(container, message = 'Loading...') {
loadingOverlay.className = 'loading-overlay'; // Add a class for styling
loadingOverlay.innerHTML = `
<div class="loading-spinner">
<div class="spinner"></div>
<div class="spinner spinner-lg"></div>
<p>${message}</p>
</div>
`;