26 Commits

Author  SHA1  Message  Date
Javanaut  2593c95b5c  Release v0.2.5  2026-04-12 19:58:30 +02:00
Javanaut  8a8c43ecdf  v0.2.5  2026-04-12 19:57:46 +02:00
Javanaut  6170ac641c  ff  2026-04-12 19:35:03 +02:00
Javanaut  497c0e500b  ff  2026-04-12 19:34:51 +02:00
Javanaut  008c643272  change disposition order for sidecar files  2026-04-12 19:31:49 +02:00
Javanaut  c302b30e63  ff  2026-04-12 19:19:08 +02:00
Javanaut  7926407534  ff  2026-04-12 19:09:26 +02:00
Javanaut  0894ac2fab  ff  2026-04-12 18:50:41 +02:00
Javanaut  353759b983  ff  2026-04-12 18:47:54 +02:00
Javanaut  454f5f0656  ff  2026-04-12 18:46:54 +02:00
Javanaut  0e51d6337f  ff  2026-04-12 18:35:13 +02:00
Javanaut  a24b6dedaa  ff  2026-04-12 18:26:39 +02:00
Javanaut  8361fc536b  ff  2026-04-12 17:53:56 +02:00
Javanaut  4d4272e5e8  ff  2026-04-12 17:47:06 +02:00
Javanaut  559869ca68  iteration1  2026-04-12 17:12:32 +02:00
Javanaut  0e4fae538b  prep season shift  2026-04-12 16:52:12 +02:00
Javanaut  12509cd4e2  Release v0.2.4  2026-04-12 12:28:37 +02:00
Javanaut  2595bfe4f4  prep 0.2.4  2026-04-12 12:28:23 +02:00
Javanaut  3df11be5e9  upd .gitignore  2026-04-12 12:24:19 +02:00
Javanaut  fc9d94aeee  prep 0.2.4  2026-04-12 12:21:26 +02:00
Javanaut  111df11199  ff  2026-04-12 12:20:01 +02:00
Javanaut  f0d4c36bc3  Adds release script and bumps 0.2.4  2026-04-12 12:12:41 +02:00
Javanaut  ef0d6e9274  Extd rename/unmux to pad with zeroes  2026-04-12 11:44:32 +02:00
Javanaut  d05b01cfb2  Adds rename command  2026-04-12 10:38:36 +02:00
Javanaut  72c735c3ee  ffn  2026-04-09 01:06:32 +02:00
Javanaut  381a62046b  nightly  2026-04-09 01:04:47 +02:00
49 changed files with 2691 additions and 1411 deletions

.gitignore

@@ -21,3 +21,4 @@ venv/
*.mkv
*.webm
ffmpeg2pass-0.log
*.sup

AGENTS.md

@@ -1,376 +0,0 @@
# AGENTS.md
This file is the entry point for agent guidance in this repository.
It is intentionally generic and reusable across projects. Keep this file focused on non-project-specific constraints, working style, and the structure used to link more detailed guidance.
# Purpose
- Provide a small default rule set for agents working in this repository.
- Keep the base guidance modular and easy to extend.
- Separate reusable agent behavior from project-specific requirements.
# Comment Syntax
- A segment wrapped in `<!--` and `-->` is a comment and must be ignored by agents.
- Use HTML comments for optional guidance that should stay inactive until enabled.
- To enable an optional segment, remove the surrounding `<!--` and `-->` markers.
# Core Principles
- Prefer the simplest solution that satisfies the current goal.
- Keep guidance lightweight: only add detail when it meaningfully improves outcomes.
- Reuse modular guideline files instead of expanding this file indefinitely.
- Treat project-specific documents as the source of truth for project behavior.
- When guidance conflicts, use the most specific applicable document.
# Rule Terms
- A `rule` is the general term for any constraint, requirement, definition, or similar guidance item.
- A `rule set` comprises all rules inside one file that share the same rule set ID.
- Any rule inside a rule set shall use an ID following the schema `RULESET-0001`, `RULESET-0002`, and so on.
- Rules without a rule set ID are also valid, but they are not addressable by rule ID.
# Scope Of This File
This file should contain:
- Generic agent behavior and constraints.
- Rules that are reusable across multiple projects.
- Links to optional guideline modules.
- Links to project-specific requirements.
- Commented optional templates for released-product documentation and agent-output locations.
This file should not contain:
- Project business requirements.
- Project architecture decisions.
- Stack-specific implementation details unless they are universally applicable.
- Task-specific runbooks that belong in dedicated modules.
# Default Agent Behavior
- Read the relevant context before making changes.
- Prefer small, understandable edits over broad refactors.
- Preserve existing patterns unless there is a clear reason to change them.
- Document assumptions when context is missing.
- Ignore HTML comment segments.
- If a more specific enabled guideline exists for the current task, follow it.
# Guideline Structure
Use the following structure for reusable guidance files and project-specific documentation as needed:
```text
/
|-- AGENTS.md
|-- guidance/
| |-- stacks/
| |-- conventions/
| `-- workflows/
|-- prompts/
`-- requirements/
Optional files and directories
|-- SCRATCHPAD.md
|-- docs/
| |-- readme.md
| |-- installation.md
| `-- history.md
|-- process/
| |-- log.md
| `-- coding-handbook.md
```
# Optional Reusable Modules
Add files under `guidance/` only when they are needed.
# Optional Scratchpad
- `SCRATCHPAD.md` is an optional repo-root scratchpad for temporary
information aimed at the next iteration.
- Developers may create or delete `SCRATCHPAD.md` at any time.
- Developers may refer to `SCRATCHPAD.md` as `scratchpad` when giving agents a
source or target for information.
- Agents may read, update, create, or remove the scratchpad when the task
explicitly calls for it.
- Treat the scratchpad as low-formality working context rather than canonical
project truth.
- Use the scratchpad for short-lived notes, open questions, sketches, and
temporary decisions that should be resolved away.
- Move durable outcomes into `requirements/`, `guidance/`, code, tests, or
another long-lived location.
- If `SCRATCHPAD.md` is absent, agents should continue normally.
# Optional Rule Sets
- Optional rule sets may be stored in `guidance/optional/` or in `guidance/{section}/optional/`.
- Optional rule sets are inactive by default and shall only be applied when a prompt explicitly requests them, for example by phrases such as `Apply rules for lean interface iteration in the following steps.` or `Apply LII rules.`
- An optional rule set may be requested by its descriptive name, by its rule set ID, or by another equally clear explicit reference.
- Agents shall never infer or auto-enable optional rule sets from general intent alone.
- If an optional rule or rule set cannot be identified and addressed clearly, agents shall stop and ask before proceeding.
# Prepared Orders
- An `order` is a prepared prompt for one isolated operation rather than a general workflow or standing rule set.
- Orders shall be stored under `prompts/`.
- Order files shall use the naming schema `ORDER-0001-<slug>.md`, `ORDER-0002-<slug>.md`, and so on.
- The canonical order identifier is the `ORDER-0001` style prefix. The trailing slug is descriptive only.
- Recommended internal order file structure is: prompt ID, prompt name, purpose, trigger examples, scope, operation, and expected output.
- Orders shall only be executed when they are explicitly requested by a prompt such as `Execute ORDER-0007.` or `Execute ORDER 7.`
- Agents may accept an unambiguous short numeric reference such as `ORDER 7` as an alias for `ORDER-0007`.
- If an order cannot be identified uniquely and clearly, agents shall stop and ask before proceeding.
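For illustration only, a hypothetical order file such as `prompts/ORDER-0007-example-cleanup.md` (the number and slug are invented) could follow the recommended structure like this:
```text
ORDER-0007
Name: Example Cleanup
Purpose: one-sentence statement of the isolated operation this order performs.
Trigger examples: `Execute ORDER-0007.` / `Execute ORDER 7.`
Scope: files and areas the operation may touch.
Operation: concrete steps to perform.
Expected output: what to report or produce when the operation is done.
```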
# Toolstack Guides
Location:
```text
guidance/stacks/
```
Examples:
- `guidance/stacks/python.md`
- `guidance/stacks/typescript.md`
- `guidance/stacks/docker.md`
- `guidance/stacks/terraform.md`
Use for:
- Language or framework expectations.
- Tooling and environment conventions.
- Build, test, and runtime guidance tied to a specific stack.
# Coding Conventions
Location:
```text
guidance/conventions/
```
Examples:
- `guidance/conventions/naming.md`
- `guidance/conventions/testing.md`
- `guidance/conventions/review.md`
Use for:
- Naming and structure conventions.
- Testing expectations.
- Code review and quality rules.
# Recurring Workflows
Location:
```text
guidance/workflows/
```
Examples:
- `guidance/workflows/feature-delivery.md`
- `guidance/workflows/bugfix.md`
- `guidance/workflows/release.md`
- `guidance/workflows/incident-response.md`
Use for:
- Repeatable task flows.
- Checklists for common delivery work.
- Operational or maintenance procedures.
<!-- Enable this optional section by removing the outer HTML comment markers from this segment
when you want agents to create, update, and consult released-product
documentation in `docs/`.
# Released Product Documentation
Released-product documentation should live outside the generic sections above.
Recommended location:
```text
docs/
```
Examples:
- `docs/readme.md`
- `docs/installation.md`
- `docs/history.md`
Agent rules for docs output:
- Keep content compact but comprehensive.
- Write for end users, operators, or other consumers of the released product.
- Prefer shipped behavior, supported workflows, and stable terminology over
internal implementation detail.
- Keep documentation synchronized with released behavior.
- Update release history when user-visible changes are shipped.
Recommended topics:
- Product overview and intended use.
- Installation, configuration, and upgrade guidance.
- Usage patterns, operational instructions, and support boundaries.
- Compatibility notes, migration notes, and release history.
- Troubleshooting and common pitfalls when relevant. -->
<!-- Enable this optional section by removing the outer HTML comment markers from this
segment when you want agents to produce and consult workflow output in `process/`.
# Agent Output In `process/`
The `process/` directory is primarily for agent output created during
delivery, maintenance, and review work.
Recommended location:
```text
process/
```
Agent rules for process output:
- Use `process/` for agent-produced artifacts rather than released-product
documentation.
- Keep entries concise, traceable, and tied to resulting changes.
- Treat `process/` as workflow output, not as the primary source of product
truth.
- Prefer summaries and rationale over raw transcript dumps unless a workflow
explicitly requires full prompt history.
# Agent Change Log
Location:
```text
process/log.md
```
Use for:
- Capturing prompts given to agents.
- Recording concise explanations of the resulting changes made by agents.
- Preserving task-by-task rationale, decisions, and implementation notes.
# Coding Handbook
Location:
```text
process/coding-handbook.md
```
Use for:
- A tutorial-style handbook that explains the programming components used in
the project.
- Compact but comprehensive technical onboarding material for future
contributors.
- Written explanations that connect code structure, concepts, and
implementation patterns. -->
# Project-Specific Requirements
Project-specific material should live outside the generic sections above.
Recommended location:
```text
requirements/
```
Examples:
- `requirements/project.md`
- `requirements/architecture.md`
- `requirements/decisions.md`
- `requirements/domain.md`
Use for:
- Product and business requirements.
- Project goals and constraints.
- Architecture and design decisions.
- Domain knowledge that is specific to this repository.
# Agent-Level Variables
When present, `requirements/identifiers.yml` is an optional project-specific
input that defines agent-level variables for use inside `requirements/` and
`guidance/`.
Variable schema:
- Use `@{VARIABLE_NAME}` for agent-level variables.
- Prefer uppercase snake case names such as `@{PROJECT_ID}` or `@{VENDOR_ID}`.
- Do not treat `${...}` as an agent-level variable form; that syntax may appear
in Bash or other code and should not be interpreted as agent metadata.
Scope:
- The effective scope of `requirements/identifiers.yml` is limited to
`requirements/` and `guidance/`.
- Definitions from `requirements/identifiers.yml` must not leak into product code.
Defaults:
- Default `@{VENDOR_ID}` is `osgw`.
- Default `@{PROJECT_ID}` is the current repository directory name.
Resolution rules:
- Treat `requirements/identifiers.yml` as optional; when it is absent, agents
may still resolve the defaults defined above.
- If a variable is used in `requirements/` or `guidance/` and it is not
defined in `requirements/identifiers.yml` and does not have a default in this
file, agents may stop and report the undefined variable.
- Prefer updating duplicated identifier values in `requirements/` and
`guidance/` to use the variable schema when that improves consistency.
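A minimal hypothetical `requirements/identifiers.yml` that simply restates the defaults above could look like this (the key naming is illustrative, not a required format):
```text
# requirements/identifiers.yml (hypothetical example)
VENDOR_ID: osgw
PROJECT_ID: ffx
```
Guidance and requirements text would then reference these values as `@{VENDOR_ID}` and `@{PROJECT_ID}`.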
# Precedence
Some precedence levels may be absent because optional levels can remain inside
HTML comments. The smaller numeric index wins.
Apply guidance in this order:
1. Direct user or task instructions.
2. Project-specific documents in `requirements/`.
<!-- 3. Released-product documentation in `docs/` when shipped behavior or
user-facing expectations are relevant. -->
4. Relevant modular guides in `guidance/stacks/`, `guidance/conventions/`, or `guidance/workflows/`.
<!-- 5. Agent output in `process/` when prior prompts, rationale, or
implementation notes are relevant. -->
6. This `AGENTS.md`.
# Maintenance
- Keep this file short and stable.
- Move detail into dedicated modules when a section becomes too specific or too long.
- Add new guideline files only when they solve a recurring need.
- Remove outdated references when the repository structure changes.
# Current Status
This repository defines the base `AGENTS.md` structure plus project-specific
requirements and modular guidance.
Future project work can add:
- Reusable modules under `guidance/`
- Project-specific documentation under `requirements/`
- Optional temporary iteration context in `SCRATCHPAD.md`
- Optional released-product documentation under `docs/` by uncommenting its segment
- Optional agent output under `process/` by uncommenting its segment
- Cross-references from this file once those documents exist


@@ -99,6 +99,28 @@ TMDB-backed metadata enrichment requires `TMDB_API_KEY` to be set in the environ
## Version History
### 0.2.5
- show-level quality and notes fields
- pattern-over-show-over-default season-shift resolution with dynamic DB migration loading
- migration prompt now reports the upgrade path and creates an in-place DB backup before applying schema changes
- `upgrade --branch <name>` now fetches remote-only branches before switching
- `unmux` now applies season shifting to subtitle output filenames
- `convert` now keeps DB-defined target subtitle dispositions authoritative over sidecar filename disposition flags when a pattern definition exists
- focused modern tests added around migrations, unmux, upgrade, and subtitle-disposition import precedence
### 0.2.4
- lightweight CLI commands now stay import-light via lazy runtime loading
- setup/config templating moved to `assets/ffx.json.j2`
- aligned two-step local setup wrappers: `ffx setup` and `ffx configure_workstation`
- combined `ffprobe` payload reuse in `FileProperties`
- configurable crop-detect sampling plus per-process crop result caching
- single-query controller accessors and conditional DB schema bootstrap
- shared screen bootstrap/controller wiring for large detail screens
- configurable default season/episode digit lengths
- digit-aware `rename` and padded `unmux` filename markers
### 0.2.3
- PyPI packaging


@@ -1,89 +0,0 @@
# Scratchpad
## Goal
- Capture a compact, project-wide list of optimization candidates after a broad scan of the current FFX codebase, tooling, and requirements.
## Settled
- The biggest near-term wins are in startup cost, repeated subprocess work, repeated database query patterns, and general repo hygiene.
- This list is intentionally optimization-oriented rather than bug-oriented. Some items below also improve correctness or maintainability, but they were selected because they can reduce runtime cost, operator friction, or iteration overhead.
- A first modern integration slice now exists under [`tests/integration/subtrack_mapping`](/home/osgw/.local/src/codex/ffx/tests/integration/subtrack_mapping). Remaining test-suite cleanup is now mostly about migrating and shrinking the legacy harness surface under [`tests/legacy`](/home/osgw/.local/src/codex/ffx/tests/legacy).
- The CLI root now lazy-loads heavy runtime dependencies so lightweight commands such as `version`, `help`, `setup`, `configure_workstation`, and `upgrade` stay import-light.
- Shared CLI defaults for container/output tokens now live outside [`src/ffx/ffx_controller.py`](/home/osgw/.local/src/codex/ffx/src/ffx/ffx_controller.py), and a focused unit test locks in the lazy-import contract.
- `FileProperties` now uses one cached `ffprobe -show_format -show_streams -of json` call per source file, and the combined payload was confirmed against the Dragonball asset to satisfy both previous probe call sites fully.
- Crop detection now uses configurable sampling windows plus per-process caching keyed by source file and sampling range, and the `cropdetect` CLI command now calls the real `FileProperties.findCropArguments()` path.
- Database startup now bootstraps schema only when required tables are actually missing, while version enforcement still runs on ordinary DB-backed context creation.
- Helper filename and rich-text utilities now use compiled raw regexes plus translate-based filename filtering, with unit coverage for TMDB suffix rewriting and Rich color stripping.
- Process resource limiting now has explicit disabled/default states in the CLI and requirements, and combined CPU-plus-niceness wrapping now executes as `cpulimit -- nice -n ... <command>` instead of a less explicit prefix chain.
- FFX logger setup now reuses named handlers, and fallback logger access no longer mutates handlers in ordinary constructors and helpers.
- The process wrapper now uses `subprocess.run(...)` with centralized command formatting plus stable timeout and missing-command error mapping.
- Active ORM controllers now use single-query accessors instead of paired `count()` plus `first()` lookups.
- Pattern matching now uses cached compiled regexes plus explicit duplicate-match errors, and pattern creation flows no longer persist zero-track patterns.
- The two-step local setup flow now has aligned CLI wrappers for both phases: `ffx setup` for bundle prep and `ffx configure_workstation` for workstation prep, while the shell scripts remain the bootstrap entrypoints before the bundle exists.
- The large detail screens now share one screen-bootstrap helper for context, metadata-filter extraction, and controller wiring, and show-pattern loading now goes through `PatternController` instead of a screen-local session query.
## Focused Snapshot
- Highest-leverage application optimizations:
- Decide whether placeholder help/settings screens should ship or disappear.
- Trim dead helpers and other dormant surface that still looks active.
- Highest-leverage repo and workflow optimizations:
- Continue migrating the oversized legacy test/combinator surface into focused modern tests so it is easier to run, debug, and extend.
## Optimization Candidates
1. Placeholder UI surfaces should either ship or disappear
- [`src/ffx/help_screen.py`](/home/osgw/.local/src/codex/ffx/src/ffx/help_screen.py) and [`src/ffx/settings_screen.py`](/home/osgw/.local/src/codex/ffx/src/ffx/settings_screen.py) are placeholders.
- Optimization:
- Either remove them from the active UI surface or complete them.
- Avoid paying ongoing maintenance cost for unfinished navigation targets.
- Expected value:
- Leaner interface.
- Lower UX ambiguity.
2. Several helper functions are unfinished or dead-weight
- [`src/ffx/helper.py`](/home/osgw/.local/src/codex/ffx/src/ffx/helper.py) contains `permutateList(...): pass`.
- There are many combinator and conversion placeholders across tests and migrations.
- Optimization:
- Remove dead code, finish it, or isolate it behind a clearly dormant area.
- Avoid carrying stubbed utility surface that looks reusable but is not.
- Expected value:
- Smaller mental model.
- Less time spent re-evaluating inactive paths.
3. Test suite shape is expensive to understand and likely expensive to run
- The project still carries a large legacy matrix of combinator files under [`tests/legacy`](/home/osgw/.local/src/codex/ffx/tests/legacy), several placeholder `pass` implementations, and at least one suspicious filename with an embedded space: [`tests/legacy/disposition_combinator_2_3 .py`](/home/osgw/.local/src/codex/ffx/tests/legacy/disposition_combinator_2_3 .py).
- A first focused replacement slice now exists in [`tests/integration/subtrack_mapping/test_cli_bundle.py`](/home/osgw/.local/src/codex/ffx/tests/integration/subtrack_mapping/test_cli_bundle.py), so the remaining work is migration and consolidation rather than creating the modern test shape from scratch.
- Optimization:
- Continue replacing broad combinator matrices with focused parametrized integration and unit tests.
- Retire the bespoke legacy discovery and runner path once equivalent coverage exists.
- Normalize file naming and test discovery conventions.
- Expected value:
- Faster contributor onboarding.
- Easier CI adoption later.
## Open
- Should optimization work focus first on operator-perceived latency, internal maintainability, or correctness-risk cleanup that also has performance upside?
- Is the long-term supported model still “local Linux workstation plus Textual UI,” or should optimization decisions bias toward a more scriptable/headless CLI?
## Gaps Right Now
- No explicit prioritization owner or milestone for the optimization backlog.
- No benchmark or timing harness exists for startup, probe, DB, or conversion orchestration overhead.
- Repo hygiene is still mixed: generated artifacts and some clearly unfinished files remain in the tree.
- The legacy TMDB-backed `Scenario 4` path is currently blocked by a pattern/track regression: `Patterns must define at least one track before they can be stored.` This surfaced while rerunning TMDB-dependent checks after the zero-track pattern hardening.
## Next
1. Triage the list into quick wins, medium refactors, and long-horizon cleanup.
2. Tackle the cheapest remaining product-surface cleanup first:
- placeholder UI surfaces and dead helper cleanup.
3. Continue replacing oversized legacy test matrices with focused modern integration and unit coverage.
4. Triage the legacy `Scenario 4` pattern/track failure and decide whether to fix the harness, adapt it to the zero-track guard, or retire that path during the ongoing test-suite migration.
## Delete When
- Delete this scratchpad once the optimization backlog is either converted into issues/work items or distilled into durable project guidance.

assets/ffx.json.j2

@@ -0,0 +1,36 @@
{
"databasePath": {{ database_path_json }},
"logDirectory": {{ log_directory_json }},
"subtitlesDirectory": {{ subtitles_directory_json }},
"defaultIndexSeasonDigits": {{ default_index_season_digits }},
"defaultIndexEpisodeDigits": {{ default_index_episode_digits }},
"defaultIndicatorSeasonDigits": {{ default_indicator_season_digits }},
"defaultIndicatorEpisodeDigits": {{ default_indicator_episode_digits }},
"metadata": {
"signature": {
"RECODED_WITH": "FFX"
},
"remove": [
"VERSION-eng",
"creation_time",
"NAME"
],
"streams": {
"remove": [
"BPS",
"NUMBER_OF_FRAMES",
"NUMBER_OF_BYTES",
"_STATISTICS_WRITING_APP",
"_STATISTICS_WRITING_DATE_UTC",
"_STATISTICS_TAGS",
"BPS-eng",
"DURATION-eng",
"NUMBER_OF_FRAMES-eng",
"NUMBER_OF_BYTES-eng",
"_STATISTICS_WRITING_APP-eng",
"_STATISTICS_WRITING_DATE_UTC-eng",
"_STATISTICS_TAGS-eng"
]
}
}
}
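
The `_json` suffixes above suggest those values arrive already JSON-encoded. As a minimal sketch of how such a template could be rendered, assuming `jinja2` and hypothetical context values (this is not the actual FFX setup code):
```python
import json
from pathlib import Path

from jinja2 import Template


def render_ffx_config(template_path: Path, output_path: Path) -> None:
    """Render an ffx.json.j2-style template into a config file (illustrative only)."""
    home = Path.home()
    context = {
        # *_json values are assumed to be pre-encoded with json.dumps so the
        # template can splice them in without extra quoting.
        "database_path_json": json.dumps(str(home / ".local/var/ffx/ffx.db")),
        "log_directory_json": json.dumps(str(home / ".local/var/log")),
        "subtitles_directory_json": json.dumps(str(home / ".local/var/ffx/subtitles")),
        "default_index_season_digits": 2,
        "default_index_episode_digits": 2,
        "default_indicator_season_digits": 2,
        "default_indicator_episode_digits": 2,
    }
    rendered = Template(template_path.read_text()).render(**context)
    json.loads(rendered)  # sanity check: the rendered text must be valid JSON
    output_path.write_text(rendered)
```
The path and digit values shown are placeholders; the real defaults come from the setup flow.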


@@ -1,28 +0,0 @@
# Lean Interface Iteration
Rule set name: `lean-interface-iteration`
Rule set ID: `LII`
Status: optional, prompt-activated only
Trigger examples:
- `Apply the lean-interface-iteration rules.`
- `Apply LII rules.`
LII-0001: Apply this rule set only when it is explicitly requested in the prompt.
LII-0002: The target of work under this rule set is the iterated product state for the addressed iteration only.
LII-0003: Optimize the addressed interface toward the leanest and least complex model that still satisfies the iteration order.
LII-0004: Backward compatibility, legacy aliases, and compatibility shims are not required unless the prompt explicitly asks to preserve them.
LII-0005: Prefer one authoritative interface over multiple overlapping parameters, flags, or naming variants.
LII-0006: Remove or avoid transitional interface layers when they are not required by the addressed iteration order.
LII-0007: Update affected tests, guidance, requirements, and documentation so they describe the simplified interface model rather than a mixed legacy-and-new model.
LII-0008: Never change behavior, interfaces, or surrounding areas that are not addressed by the current iteration order.


@@ -1,56 +0,0 @@
# Preparation Script Design
Rule set name: `preparation-script-design`
Rule set ID: `PSD`
Status: optional, prompt-activated only
Trigger examples:
- `Apply the preparation-script-design rules.`
- `Apply PSD rules.`
PSD-0001: Apply this rule set only when it is explicitly requested in the prompt.
PSD-0002: Use this rule set for scripts whose purpose is to prepare, verify, or expose a local development or automation environment rather than to perform product runtime behavior.
PSD-0003: Keep a preparation script focused on environment readiness, dependency installation, local helper exposure, and clear verification output; do not mix unrelated product logic into the script.
PSD-0004: Design the script to be idempotent so repeated runs converge on the same prepared state without unnecessary reinstallation or destructive side effects.
PSD-0005: Provide a verification-only mode such as `--check` that reports readiness without installing, modifying, or creating dependencies.
PSD-0006: Separate component checks from installation steps so the script can report what is missing before or after attempted remediation.
PSD-0007: Group required capabilities into clear purpose-oriented sections such as support toolchains, local package bundles, generated environment helpers, or other relevant readiness areas instead of presenting one undifferentiated dependency list.
PSD-0008: Prefer explicit per-component check helpers over opaque one-shot checks so failures remain traceable and easy to extend.
PSD-0009: Generate or update environment helper files only when they provide a stable, reusable way to expose repo-local or workspace-local tools, paths, or environment variables.
PSD-0010: Generated environment helper files shall be safe to source multiple times and should avoid duplicating path entries or clobbering unrelated user environment state.
PSD-0011: When a preparation flow seeds optional user-owned files such as config templates, do so non-destructively by creating them only when absent unless the prompt explicitly requests overwrite behavior.
PSD-0012: Report status in a concise scan-friendly line format of the shape `[status] Label: detail`, where the label names the checked component and the detail string stays short and specific.
PSD-0013: Prefer a small canonical status vocabulary in those report lines, with `ok` for satisfied checks, `warn` for non-blocking gaps, and a failure status such as `failed` for blocking or unsuccessful states.
PSD-0014: When a preparation script uses terminal colors in its status output, apply a consistent severity mapping so `ok` is green, `warn` is yellow, and all other status levels are red.
PSD-0015: In bracketed status markers such as `[ok]` or `[warn]`, keep the square brackets uncolored and apply the severity color only to the inner status text.
PSD-0016: Colorized status output shall degrade safely in non-terminal or non-color contexts so the script remains readable and automation-friendly without ANSI support.
PSD-0017: End with an explicit readiness conclusion that distinguishes between successful preparation, incomplete prerequisites, and failed installation attempts.
PSD-0018: Installation logic should use the narrowest supported platform-specific package-manager actions necessary for the declared scope and should fail clearly when no supported installation path is available.
PSD-0019: Treat repo-local helper tooling and local package installation boundaries explicitly rather than assuming global installs, especially when the prepared environment is intended to be reproducible.
PSD-0020: Keep the script suitable for both interactive local developer use and non-interactive automation checks by avoiding prompts during normal execution unless the prompt explicitly requires interactivity.
PSD-0021: When a script depends on generated helper files or adjacent validation helpers, update those supporting files only as needed to keep the preparation flow coherent and usable.
PSD-0022: Verify shell syntax after changes and, when feasible, run a dry readiness check so the resulting preparation flow is validated rather than only written.
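As an illustration of the status-line and color rules in PSD-0012 through PSD-0016: the rule set targets preparation scripts in general, but a minimal Python sketch of the same conventions (a hypothetical helper, not taken from any existing script) could be:
```python
import sys

COLORS = {"ok": "\033[32m", "warn": "\033[33m"}  # ok is green, warn is yellow
RED, RESET = "\033[31m", "\033[0m"               # every other status is red


def report(status: str, label: str, detail: str) -> None:
    """Print one `[status] Label: detail` line, coloring only the inner status text."""
    if sys.stdout.isatty():
        inner = f"{COLORS.get(status, RED)}{status}{RESET}"
    else:
        inner = status  # degrade safely when ANSI colors are unavailable
    print(f"[{inner}] {label}: {detail}")


report("ok", "ffmpeg", "found on PATH")
report("warn", "cpulimit", "missing; CPU limiting stays disabled")
report("failed", "Python venv", "could not create the local virtualenv")
```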


@@ -1,7 +1,7 @@
[project]
name = "ffx"
description = "FFX recoding and metadata managing tool"
version = "0.2.3"
version = "0.2.5"
license = {file = "LICENSE.md"}
dependencies = [
"requests",


@@ -1,97 +0,0 @@
# Architecture
## Architecture Goals
- Keep the tool small, local, and easy to reason about.
- Separate media inspection, stored normalization rules, and conversion execution clearly enough that users can inspect and adjust behavior.
- Favor explicit local state and deterministic rule application over opaque automation.
- Make external runtime dependencies and platform assumptions visible.
## System Context
- Primary actors:
- Local operator running the CLI.
- Local operator using the Textual TUI to inspect files and maintain rules.
- External systems:
- `ffprobe` for media introspection.
- `ffmpeg` for conversion and extraction.
- TMDB API for optional show and episode metadata.
- Local filesystem for source media, generated outputs, subtitles, logs, config, and database files.
- Data entering the system:
- Media container and stream metadata from source files.
- Regex patterns and per-show normalization rules entered in the TUI.
- Optional config values from `~/.local/etc/ffx.json`.
- Optional TMDB identifiers and CLI overrides.
- Optional external subtitle files.
- Data leaving the system:
- Normalized output media files.
- Extracted stream files from unmux operations.
- SQLite rows representing shows, patterns, tracks, tags, shifted seasons, and properties.
- Local log output and console messages.
## High-Level Building Blocks
- Frontend, CLI, API, or worker:
- A Click-based CLI in [`src/ffx/cli.py`](/home/osgw/.local/src/codex/ffx/src/ffx/cli.py), exposed as the `ffx` command and via `python -m ffx`, including lightweight maintenance wrappers for bundle setup, workstation preparation, and upgrade tasks.
- A Textual terminal UI rooted in [`src/ffx/ffx_app.py`](/home/osgw/.local/src/codex/ffx/src/ffx/ffx_app.py) with screens for shows, patterns, file inspection, tracks, tags, and shifted seasons.
- Core business logic:
- Descriptor objects model media files, shows, and tracks.
- Controllers encapsulate CRUD operations and workflow orchestration for shows, patterns, tags, tracks, season shifts, configuration, and conversion.
- `MediaDescriptorChangeSet` computes differences between a file and its stored target schema to drive metadata and disposition updates.
- File inspection caches combined `ffprobe` data and crop-detection results per source and sampling window within one process to avoid repeated subprocess work (see the caching sketch after this list).
- Storage:
- SQLite via SQLAlchemy ORM, with schema rooted in shows, patterns, tracks, media tags, track tags, shifted seasons, and generic properties.
- A configuration JSON file supplies optional path, metadata-filtering, and filename-template settings.
- Integration adapters:
- Process execution wrapper for `ffmpeg`, `ffprobe`, `nice`, and `cpulimit`, with explicit disabled states for niceness and CPU limiting, support for both absolute `cpulimit` values and machine-wide percent input, and a combined `cpulimit -- nice -n ... <command>` execution shape when both limits are configured.
- HTTP adapter for TMDB via `requests`.
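The per-process probe and crop-detection caching mentioned above can be pictured with a minimal sketch; the function names and exact ffmpeg arguments are illustrative, not the `FileProperties` implementation:
```python
import functools
import json
import subprocess


@functools.lru_cache(maxsize=None)
def probe(source: str) -> dict:
    """Run the combined ffprobe call once per source file within this process."""
    result = subprocess.run(
        ["ffprobe", "-show_format", "-show_streams", "-of", "json", source],
        capture_output=True, text=True, check=True,
    )
    return json.loads(result.stdout)


@functools.lru_cache(maxsize=None)
def detect_crop(source: str, seek_seconds: int = 60, duration_seconds: int = 180) -> str:
    """Reuse crop-detection output for the same source file and sampling window."""
    result = subprocess.run(
        ["ffmpeg", "-ss", str(seek_seconds), "-i", source, "-t", str(duration_seconds),
         "-vf", "cropdetect", "-f", "null", "-"],
        capture_output=True, text=True, check=True,
    )
    return result.stderr  # cropdetect reports suggested crop values on stderr
```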
## Data And Interface Notes
- Key entities or records:
- `Show`: canonical TV show metadata plus digit-formatting rules for generated filenames.
- `Pattern`: regex rule tying filenames to one show and one target media schema.
- `Track` and `TrackTag`: persisted target stream records, codec, dispositions, audio layout, and stream-level tags. Detailed source-to-target mapping rules live in `requirements/subtrack_mapping.md`.
- `MediaTag`: persisted container-level metadata for a pattern.
- `ShiftedSeason`: mapping from source numbering ranges to adjusted season and episode numbers.
- `Property`: internal key-value storage currently used for database versioning.
- External interfaces:
- CLI commands for conversion, inspection, extraction, and crop detection.
- TUI workflows for rule authoring and rule maintenance.
- Environment variable `TMDB_API_KEY` for TMDB access.
- Config keys `databasePath`, `logDirectory`, and `outputFilenameTemplate`, plus optional metadata-filter rules.
- Validation rules:
- Only supported media-file extensions are accepted for conversion.
- Stored database version must match the runtime-required version.
- A normalized descriptor may have at most one default and one forced stream per relevant track type.
- Shifted-season ranges are intended not to overlap for the same show and season.
- TMDB lookups require a show ID and season and episode numbers.
- Error-handling approach:
- User-facing operational failures are raised as `click.ClickException` or warnings.
- Ambiguous default and forced stream states trigger prompts unless `--no-prompt` is set, in which case the command fails fast.
- External-process failures and invalid media are surfaced through logs and command errors rather than retries, except for TMDB rate-limit retries.
## Deployment And Operations
- Runtime environment:
- Local Python environment with the package installed and `ffmpeg`, `ffprobe`, `nice`, and `cpulimit` available on `PATH`.
- Deployment shape:
- Single-process command execution on demand; no daemon, queue, or network service of its own.
- Secrets and configuration handling:
- TMDB secret is read from `TMDB_API_KEY`.
- User config is read from `~/.local/etc/ffx.json`.
- Database path may also be overridden per command via `--database-file`.
- Logging and monitoring approach:
- File and console logging configured per invocation.
- Default log file path is `~/.local/var/log/ffx.log`.
- No dedicated monitoring integration is present.
## Open Technical Questions
- Question: Should Linux-specific assumptions such as `/dev/null`, `nice`, `cpulimit`, and `~/.local` remain part of the supported-platform contract?
- Risk: Portability and operational behavior are underspecified for non-Linux environments.
- Next decision needed: Either document Linux-like systems as the official support boundary or refactor the process and path handling for broader portability.
- Question: Should placeholder TUI surfaces such as settings and help become part of the required product surface or stay explicitly out of scope?
- Risk: The UI appears broader than the actually finished feature set.
- Next decision needed: Either remove or complete placeholder screens and update requirements accordingly.


@@ -1,68 +0,0 @@
# Pattern Management
This file defines the behavioral contract for managing shows, patterns, and
pattern-backed filename matching.
Primary source: actual tool code in `src/ffx/`.
Secondary source: operator intent captured in task discussion.
## Scope
- The show, pattern, and track hierarchy stored in SQLite.
- The role of a pattern as a reusable normalization definition for related media files.
- Filename-driven assignment of a scanned media file to one show through one matching pattern.
- Duplicate-match handling when more than one pattern matches the same filename.
## Terms
- `show`: logical series identity such as one TV show entry in the database.
- `pattern`: regex-backed normalization definition attached to one show.
- `track`: one persisted target-track definition attached to one pattern.
- `scanned media file`: one source file currently being inspected or converted.
- `duplicate pattern match`: a filename state where more than one stored pattern matches the same scanned media file.
- `pattern-backed target schema`: the combination of one pattern's stored media tags and stored track definitions.
## Rules
- `PATTERN_MANAGEMENT-0001`: The domain model shall treat a show as the parent entity for patterns that describe distinct release families or normalization schemas for that show. A show may temporarily exist without patterns during editing or initial TUI creation.
- `PATTERN_MANAGEMENT-0002`: Each persisted pattern shall belong to exactly one show.
- `PATTERN_MANAGEMENT-0003`: The domain model shall treat a pattern as the reusable normalization definition for a series of media files expected to share the same internal track layout and materially similar stream and container metadata.
- `PATTERN_MANAGEMENT-0004`: Each persisted track definition shall belong to exactly one pattern.
- `PATTERN_MANAGEMENT-0005`: A pattern may also carry pattern-level media tags. The pattern's media tags plus its track definitions together form the pattern-backed target schema.
- `PATTERN_MANAGEMENT-0006`: A scanned media file shall resolve to at most one pattern and therefore at most one show.
- `PATTERN_MANAGEMENT-0007`: If no pattern matches a filename, the file shall remain unmatched rather than being assigned implicitly.
- `PATTERN_MANAGEMENT-0008`: If more than one pattern matches the same filename, the system shall raise a duplicate pattern match error instead of silently selecting one.
- `PATTERN_MANAGEMENT-0009`: Duplicate-match detection shall apply regardless of whether the competing patterns belong to the same show or to different shows.
- `PATTERN_MANAGEMENT-0010`: Exact duplicate pattern definitions for the same show should not create multiple persisted pattern rows.
- `PATTERN_MANAGEMENT-0011`: A persisted pattern shall define one or more tracks. Creating or retaining a zero-track pattern in the database is invalid managed state and shall be prohibited.
- `PATTERN_MANAGEMENT-0012`: A show may exist without patterns as an intermediate editing state, for example when a user creates the show first in the TUI and adds patterns later.
- `PATTERN_MANAGEMENT-0013`: Operator-facing pattern management should expose the owning show, regex pattern, stored track set, and stored media-tag set so a user can reason about matching and normalization behavior.
- `PATTERN_MANAGEMENT-0014`: Matching semantics shall be deterministic and documented. Implicit "last matching pattern wins" behavior is not acceptable released behavior.
## Acceptance
- A filename that matches exactly one pattern yields one matched pattern and one show identity.
- A filename that matches no pattern yields no matched pattern and an unmatched state.
- A filename that matches more than one pattern yields an explicit duplicate-match error.
- A pattern-backed target schema can be reconstructed from one pattern's stored media tags and stored track definitions.
- A show may be stored before any patterns are attached to it.
- A pattern cannot be stored or retained as a valid managed pattern unless at least one track is defined for it.
- Pattern-backed conversion never proceeds with two competing matching patterns for the same input filename.
## Current Code Fit
- `src/ffx/model/show.py` implements a one-to-many `Show -> Pattern` relationship.
- `src/ffx/model/pattern.py` implements `Pattern.show_id`, a one-to-many `Pattern -> Track` relationship, a one-to-many `Pattern -> MediaTag` relationship, and a unique `(show_id, pattern)` constraint for freshly created databases.
- `src/ffx/model/track.py` implements `Track.pattern_id`, so each persisted track belongs to one pattern.
- `src/ffx/model/pattern.py` reconstructs a pattern-backed target schema through `Pattern.getMediaDescriptor(...)`, combining stored media tags and stored tracks.
- `src/ffx/file_properties.py` assumes a scanned file resolves to at most one pattern, because it stores only one `self.__pattern` and derives one `show_id` from it.
- `src/ffx/pattern_controller.py` prevents exact duplicate `(show_id, pattern)` definitions during create and update flows, and it refreshes cached compiled regexes when stored pattern expressions change.
- `src/ffx/pattern_controller.py` now complies with duplicate-match safety. `matchFilename(...)` scans deterministically, returns exactly one match, returns `{}` for no match, and raises an explicit duplicate-pattern-match error when more than one pattern matches the same filename. A behavioral sketch follows this list.
- The current persistence layer already aligns with the intended empty-show workflow because a show can exist without patterns.
- New pattern creation and schema replacement flows now require at least one track, and `TrackController.deleteTrack(...)` prevents deleting the last persisted track from a pattern.
- Trackless legacy rows can still exist in preexisting databases, but matching now rejects them explicitly instead of letting them participate silently.
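To make the duplicate-match contract concrete, here is a minimal behavioral sketch; the return shape and scan order are illustrative and do not mirror `PatternController` internals:
```python
import re


class DuplicatePatternMatchError(Exception):
    """Raised when more than one stored pattern matches the same filename."""


def match_filename(filename: str, patterns: dict[int, str]) -> dict:
    """Return one match, `{}` for no match, and raise explicitly on duplicates."""
    # Deterministic scan order: patterns are visited sorted by their ID.
    matches = [
        pattern_id
        for pattern_id, expression in sorted(patterns.items())
        if re.search(expression, filename)
    ]
    if not matches:
        return {}
    if len(matches) > 1:
        raise DuplicatePatternMatchError(
            f"{filename!r} is matched by patterns {matches}; refusing to pick one silently"
        )
    return {"pattern_id": matches[0]}
```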
## Risks
- The intended "release family" meaning of a pattern is a domain assumption, not something the code verifies automatically across all files matching that pattern.
- Preexisting databases created before the newer validation rules may still contain invalid rows, so upgrade and cleanup paths should continue to treat explicit validation failures as recoverable operator signals.


@@ -1,116 +0,0 @@
## Purpose And Scope
- Project name: FFX
- User problem: TV episode files from mixed sources arrive with inconsistent codecs, stream metadata, subtitle layouts, season and episode numbering, and output filenames, which makes them awkward to archive and use in media-player applications.
- Target users: Individual operators curating a local TV media library on a workstation, especially users willing to define normalization rules per show.
- Success outcome: A user can inspect source files, define reusable show and pattern rules, and produce output files whose streams, metadata, and filenames follow a predictable schema for web playback and library import.
- Out of scope:
- Multi-user or hosted service workflows.
- General movie-library management.
- Distributed transcoding or remote job orchestration.
- Broad media-server administration beyond file preparation.
## Required Product
- Deliverable type: Installable Python command-line application with a Textual terminal UI for inspection and rule editing.
- Core capabilities:
- Maintain an SQLite-backed database of shows, filename-matching patterns, per-pattern stream layouts and metadata tags, and optional season-shift rules.
- Inspect existing media files through `ffprobe` and compare discovered stream metadata with stored normalization rules.
- Convert media files through `ffmpeg` into a normalized output layout, including video recoding, audio transcoding to Opus, metadata cleanup and rewrite, and controlled disposition flags.
- Build output filenames from detected or configured show, season, and episode information, optionally enriched from TMDB and a configurable Jinja-style filename template.
- Support auxiliary file operations such as subtitle import, unmuxing, crop detection, and rename-only runs.
- Supported environments:
- Local execution on a Python-capable workstation.
- Best-supported on Linux-like systems because the implementation assumes `~/.local`, `/dev/null`, `nice`, and `cpulimit`.
- Requires `ffmpeg`, `ffprobe`, and `cpulimit` on `PATH`.
- Operational owner: The local user running the tool and maintaining its config, database, and external tooling.
## Suggested User Stories
- As a library maintainer, I want to define show-specific matching rules once so that future source files can be normalized automatically.
- As an operator, I want to inspect a file before conversion so that I can compare its actual streams and tags against the stored target schema.
- As a user preparing web-playback files, I want to recode video and audio with a small set of predictable options so that results are compatible and consistently named.
- As a user dealing with nonstandard releases, I want CLI overrides for language, title, stream order, default and forced tracks, and season and episode data so that one-off fixes do not require database edits first.
- As a user importing anime or other shifted numbering schemes, I want season and episode offsets per show so that generated filenames align with TMDB and media-library expectations.
## Functional Requirements
- The system shall provide a CLI entrypoint named `ffx` with commands for `convert`, `inspect`, `shows`, `unmux`, `cropdetect`, `setup`, `configure_workstation`, `upgrade`, `version`, and `help`.
- The system shall support a two-step local installation and preparation flow:
- `tools/setup.sh` is the bootstrap entrypoint for the first step and shall own bundle virtualenv creation, package installation, shell alias exposure, and optional Python test-package installation.
- `tools/configure_workstation.sh` is the bootstrap entrypoint for the second step and shall own workstation dependency checks and installation plus local config and directory seeding.
- After the bundle is installed, `ffx setup` and `ffx configure_workstation` shall remain aligned wrapper entrypoints for those same two steps.
- The CLI command `ffx setup` shall act as a wrapper for the first-step bundle-preparation flow in `tools/setup.sh`.
- The CLI command `ffx configure_workstation` shall act as a wrapper for the second-step preparation flow in `tools/configure_workstation.sh`.
- The system shall persist reusable normalization rules in SQLite for:
- shows and show formatting digits,
- regex-based filename patterns,
- per-pattern media tags,
- per-pattern stream definitions,
- shifted-season mappings,
- internal database version properties.
- Detailed show, pattern, and duplicate-match management rules live in `requirements/pattern_management.md`.
- The system shall inspect source media using `ffprobe` and derive a structured description of container metadata and streams.
- The system shall optionally open a Textual UI to browse shows, inspect files, and create, edit, or delete shows, patterns, stream definitions, tags, and shifted-season rules.
- The system shall match filenames against stored regex patterns to decide whether an input file should inherit a target stream and metadata schema.
- The system shall convert supported input files (`mkv`, `mp4`, `avi`, `flv`, `webm`) with `ffmpeg`, supporting at least:
- VP9, AV1, and H.264 video encoding,
- Opus audio encoding with bitrate selection based on channel layout,
- metadata and disposition rewriting,
- optional crop detection and crop application,
- optional deinterlacing and denoising,
- optional subtitle import from external files,
- rename-only move mode.
- The system shall support optional TMDB lookups to resolve show names, years, and episode titles when a show ID, season, and episode are available.
- The system shall generate output filenames from show metadata, season and episode indices, and episode names using the configured filename template.
- The system shall allow CLI overrides for stream languages, stream titles, default and forced tracks, stream order, TMDB show and episode data, output directory, label prefix, and processing resource limits.
- Processing resource limit rules:
- `--nice` shall accept niceness values from `-20` through `19`; omitting the option shall disable niceness adjustment.
- `--cpu` shall accept either a positive absolute `cpulimit` value such as `200`, or a percentage suffixed with `%` such as `25%` to represent a share of present CPUs; omitting the option or using `0` shall disable CPU limiting.
- When both limits are configured, the process wrapper shall execute the target command through `cpulimit` around a `nice -n ...` invocation so both limits apply to the launched media command (see the sketch after this requirements list).
- The system shall support extracting streams into separate files via `unmux` and reporting suggested crop parameters via `cropdetect`.
- Crop detection shall use a configurable sampling window, defaulting to a 60-second seek and a 180-second analysis duration, and repeated crop-detection requests for the same source plus sampling window shall reuse cached results within one process.
- The system shall handle invalid input and system failures gracefully by logging warnings or raising `click` errors for missing files, invalid media, missing TMDB credentials, incompatible database versions, and ambiguous track dispositions when prompting is disabled.
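A minimal sketch of the combined wrapping described in the resource-limit rules above follows; it assumes `cpulimit`'s `-l` limit flag and is not the actual FFX process wrapper:
```python
import os
import shlex


def build_limited_command(command: list[str], nice: int | None, cpu: str | None) -> list[str]:
    """Wrap a media command with optional nice and cpulimit prefixes (illustrative only)."""
    # A trailing '%' means a share of all present CPUs; any other value is used as
    # an absolute cpulimit figure. Omitting the option or passing 0 disables limiting.
    limit = 0
    if cpu:
        if cpu.endswith("%"):
            limit = round(float(cpu[:-1]) / 100 * (os.cpu_count() or 1) * 100)
        else:
            limit = int(cpu)

    wrapped = list(command)
    if nice is not None:
        wrapped = ["nice", "-n", str(nice)] + wrapped
    if limit > 0:
        # cpulimit runs the wrapped command after '--', so both limits apply to the
        # launched media process: cpulimit -l <value> -- nice -n <n> <command>
        wrapped = ["cpulimit", "-l", str(limit), "--"] + wrapped
    return wrapped


# On a four-CPU machine, '25%' resolves to an absolute limit of 100 in this sketch.
print(shlex.join(build_limited_command(["ffmpeg", "-i", "in.mkv", "out.mkv"], nice=10, cpu="25%")))
```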
## Quality Requirements
- The system should stay understandable as a small local tool: controllers, descriptors, models, and screens should remain separate enough for contributors to trace a workflow end to end.
- The system should produce predictable output for the same database rules, CLI overrides, and source files.
- The system should preserve a lightweight operational footprint: local SQLite state, local log file, no mandatory background services.
- The system should be testable through modern automatically discovered tests and through remaining legacy harness coverage during migration.
- The system should expose enough logging to diagnose failed probes, failed conversions, and rule mismatches without requiring a debugger.
## Constraints And Assumptions
- Technology constraints:
- Python package built with setuptools.
- Primary libraries: `click`, `textual`, `sqlalchemy`, `jinja2`, `requests`.
- Conversion and inspection rely on external executables rather than pure-Python media libraries.
- Hosting or infrastructure constraints:
- Intended for local execution, not server deployment.
- Stores default state in `~/.local/etc/ffx.json`, `~/.local/var/ffx/ffx.db`, and `~/.local/var/log/ffx.log`.
- Timeline constraints:
- The current implemented scope reflects a compact alpha release stream up to version `0.2.3`.
- Team capacity assumptions:
- Maintained as a small codebase where simple patterns and direct controller logic are preferred over framework-heavy abstractions.
- Third-party dependencies:
- `ffmpeg`, `ffprobe`, and `cpulimit`.
- TMDB API access through `TMDB_API_KEY` for metadata enrichment.
- Installation assumptions:
- The Python-side bundle install step and optional Python test extras are managed by `tools/setup.sh`, with `ffx setup` as the aligned wrapper after bootstrap.
- The workstation-preparation step is managed separately by `tools/configure_workstation.sh` or `ffx configure_workstation`.
## Acceptance Scope
- First release boundary:
- Local installation through `pip`.
- Working SQLite-backed rule storage.
- Functional CLI conversion and inspection workflows.
- Textual CRUD flows for shows, patterns, tags, tracks, and shifted seasons.
- TMDB-assisted filename generation, subtitle import, season shifting, database versioning, and configurable output filename templating.
- Excluded follow-up ideas:
- Completing placeholder screens such as settings and help.
- Hardening platform portability beyond Linux-like systems.
- Broader media types, richer release packaging, and production-grade background processing.
- Demonstration scenario:
- Inspect a TV episode file, define or update the matching show and pattern in the TUI, then run `ffx convert` so the result uses the stored stream schema, optional TMDB episode naming, and a normalized output filename.


@@ -1,74 +0,0 @@
# Subtrack Mapping
This file defines the behavioral contract for mapping input subtracks to output
subtracks during conversion.
Primary source: actual tool code in `src/ffx/`.
Secondary source: `tests/legacy/`, used only to clarify intent and reveal gaps.
## Scope
- Ensuring each target subtrack is created from the corresponding source-subtrack information, including stream-level metadata.
- Mapping input streams to output streams during conversion.
- Using persisted pattern-track definitions from the database as the target schema.
- Allowing omission and reordering of retained tracks.
- Keeping stream-level metadata attached to the correct source-derived logical track after remapping.
- Normalizing target output into ordered track groups: video, audio, subtitle, then special types such as fonts or images.
## Terms
- `source_index`: identity of the originating input stream from ffprobe or an imported source descriptor.
- `index`: final output-track order across all retained tracks.
- `sub_index`: per-type position within the retained tracks of one type, for example audio stream `0` or subtitle stream `1`.
- `target schema`: stored or constructed output-track definition that decides which tracks are kept, omitted, reordered, and rewritten.
- `separate source file`: additional file bound to one target track slot whose media payload replaces the regular source payload for that slot.
## Rules
- `SUBTRACK_MAPPING-0001`: The system shall represent source-stream identity separately from output order. `source_index`, `index`, and `sub_index` are distinct concepts and shall not be collapsed into one field.
- `SUBTRACK_MAPPING-0002`: The system shall derive `source_index` for probed tracks from the original ffprobe stream index and preserve that identity through conversion planning.
- `SUBTRACK_MAPPING-0003`: Pattern-backed track definitions stored in the database shall persist both target output order and originating source-stream identity.
- `SUBTRACK_MAPPING-0004`: When a filename matches a pattern, the pattern target schema shall be the source of truth for which source tracks are retained, which are omitted, and in what order retained tracks appear in the output.
- `SUBTRACK_MAPPING-0005`: A target track may refer only to an existing source track of the same type. Conversion shall fail fast when a target track refers to a nonexistent source stream or a source stream of a different type.
- `SUBTRACK_MAPPING-0006`: The ffmpeg mapping phase shall be generated from target output order while resolving each retained output track back to its originating source stream via `source_index`.
- `SUBTRACK_MAPPING-0007`: Reordering and omission shall preserve logical track identity. Stream-level metadata, titles, languages, and disposition decisions shall stay attached to the correct source-derived logical track after mapping.
- `SUBTRACK_MAPPING-0008`: The system shall support one-off CLI stream-order overrides without requiring prior database edits.
- `SUBTRACK_MAPPING-0009`: Operator-facing inspection and editing surfaces shall expose enough source-versus-target information to let a user reason about subtrack mapping decisions.
- `SUBTRACK_MAPPING-0010`: Test coverage for subtrack mapping shall assert source-derived identity, omission, and output order explicitly. Final track counts or final type sequences alone are insufficient proof of correct mapping.
- `SUBTRACK_MAPPING-0011`: Retained target tracks shall appear in ordered groups: video track or tracks first, then audio tracks, then subtitle tracks, then special types such as fonts or images. Within each group, the target schema shall define the order.
- `SUBTRACK_MAPPING-0012`: Track omission is valid when required by output compatibility, when needed to normalize source tracks into the required target group order and schema, or when explicitly requested by database rules or CLI options.
- `SUBTRACK_MAPPING-0013`: If source tracks do not already comply with the required target group order, conversion shall reorder retained tracks to match the target ordering contract without losing source-track identity or stream-level metadata lineage.
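A minimal planning sketch of these mapping rules follows; the data shapes are hypothetical and only illustrate grouped ordering, omission, and preserved `source_index` identity:
```python
from dataclasses import dataclass

# Required output group order: video, audio, subtitle, then special types.
GROUP_ORDER = {"video": 0, "audio": 1, "subtitle": 2, "special": 3}


@dataclass
class TargetTrack:
    source_index: int  # identity of the originating input stream
    track_type: str    # "video", "audio", "subtitle", or "special"


def plan_output_order(source_types: dict[int, str], targets: list[TargetTrack]) -> list[dict]:
    """Return retained tracks with final index/sub_index, each traced to its source stream."""
    # Group retained targets by type while keeping the schema-defined order inside each group.
    ordered = sorted(enumerate(targets), key=lambda item: (GROUP_ORDER[item[1].track_type], item[0]))
    sub_counters: dict[str, int] = {}
    plan = []
    for index, (_, track) in enumerate(ordered):
        # Fail fast on references to missing or type-mismatched source streams.
        if source_types.get(track.source_index) != track.track_type:
            raise ValueError(f"target refers to invalid source stream {track.source_index}")
        sub_index = sub_counters.get(track.track_type, 0)
        sub_counters[track.track_type] = sub_index + 1
        plan.append({"index": index, "sub_index": sub_index, "source_index": track.source_index})
    return plan
```
Source streams not referenced by any target are simply omitted from the returned plan.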
## Separate Additional Source Files
- `SUBTRACK_MAPPING-0014`: A separate source file may substitute the media payload of one target subtrack without changing that target track's intended output position.
- `SUBTRACK_MAPPING-0015`: When a separate source file is used, the target track shall remain bound to the corresponding logical source track for mapping, validation, and metadata lineage.
- `SUBTRACK_MAPPING-0016`: Metadata for a substituted target track shall be merged from the regular source track and the separate source file when available.
- `SUBTRACK_MAPPING-0017`: If the separate source file provides a metadata field that is also present on the regular source track, the separate source file value shall win in the target output.
- `SUBTRACK_MAPPING-0018`: If a metadata field is absent from the separate source file, the system shall fall back to the corresponding metadata from the regular source track or target schema rewrite rules.
## Acceptance
- Given a source media descriptor and a pattern-backed target schema, the planned output tracks can be listed in final output order and each retained track can still be traced to one originating source stream.
- Planned output order follows grouped target order: video, audio, subtitle, then special types.
- Tracks not referenced by the target schema are omitted from output mapping.
- Tracks may also be omitted when they are incompatible with the chosen output format or explicitly excluded by database or CLI rules.
- Two retained target tracks never originate from the same source stream unless duplication is implemented explicitly as a separate feature.
- If target-track metadata is rewritten after reordering, it is written onto the correct source-derived logical track rather than the track that merely occupies the same final output position.
- Invalid target-to-source references fail deterministically before the conversion job is launched.
- If a separate source file substitutes one target track, that track keeps its target slot and ordering while metadata is merged with separate-file values taking precedence when both sides provide the same field.
- A test proving subtrack mapping must assert at least one of: exact `source_index` to output-order mapping, omission of named source tracks, or preservation of per-track metadata after reorder.
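As a concrete illustration of the last acceptance item, the test below asserts the exact `source_index` to output-order mapping, a named omission, and per-track metadata after reorder. The `planned` structure is hard-coded here for illustration; a real test would obtain it from the conversion planner or from probing the produced file.

```python
def test_mapping_is_proven_not_counted():
    # Hypothetical planned output for a source with streams 0..3 where stream 3 is dropped.
    planned = [
        {"output_index": 0, "source_index": 0, "type": "video"},
        {"output_index": 1, "source_index": 2, "type": "audio", "language": "jpn"},
        {"output_index": 2, "source_index": 1, "type": "subtitle", "language": "eng"},
    ]
    # Exact source_index -> output-order mapping, not just a final track count.
    assert [track["source_index"] for track in planned] == [0, 2, 1]
    # Named omission: source stream 3 must not be mapped.
    assert all(track["source_index"] != 3 for track in planned)
    # Metadata stays attached to the source-derived logical track after reorder.
    assert planned[1]["language"] == "jpn"
```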
## Test Notes
- `tests/legacy/scenario.py` names pattern behavior as `Filter/Reorder Tracks`.
- `tests/legacy/scenario_4.py` is the strongest end-to-end signal because it runs DB-backed conversion and reapplies source indices before assertion.
- `tests/legacy/track_tag_combinator_2_0.py` and `tests/legacy/track_tag_combinator_3_4.py` sort result tracks by `source_index` before checking tags, which matches the intended identity model.
- Legacy permutation combinators define permutations but their assertion functions are stubs.
- Some legacy scenarios produce `AP` and `SP` selectors but do not execute them.
## Risks
- `src/ffx/media_descriptor.py` contains an explicit `rearrangeTrackDescriptors()` path whose current implementation appears defective and under-tested.
- Separate-source-file metadata precedence is only partly expressed in current implementation paths and should be covered directly in the rewritten test suite.
- Production code expresses the mapping contract more clearly than the legacy harness, so a rewrite should add direct logic-level tests for mapping and reorder planning.

View File

@@ -1,144 +0,0 @@
# Test Rewrite
This file captures the structure executed by `tests/legacy_runner.py` today and
defines the target shape for a complete rewrite.
Detailed product rules for source-to-target subtrack mapping live in
`requirements/subtrack_mapping.md`. This file describes only how tests cover
that area.
## Interpreter Requirement
- Agents shall run Python-side test commands with `~/.local/share/ffx.venv/bin/python`.
- This applies to the legacy harness, `unittest`, `pytest`, helper scripts, and `python -m ffx ...` test invocations.
- Agents shall not silently substitute `python`, `python3`, or another interpreter for Python-side test work.
- If `~/.local/share/ffx.venv/bin/python` is missing or not executable, agents shall stop and report the missing venv instead of continuing with Python-side test execution.
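A minimal guard that an agent-side wrapper could use to enforce the interpreter rule above; `resolve_test_interpreter` is an assumption for illustration and is not part of the FFX codebase.

```python
import os
import sys

FFX_TEST_PYTHON = os.path.expanduser("~/.local/share/ffx.venv/bin/python")

def resolve_test_interpreter() -> str:
    """Return the pinned venv interpreter or abort with a clear report."""
    if not (os.path.isfile(FFX_TEST_PYTHON) and os.access(FFX_TEST_PYTHON, os.X_OK)):
        sys.exit(f"Required test interpreter missing or not executable: {FFX_TEST_PYTHON}")
    return FFX_TEST_PYTHON
```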
## Shell Environment Requirement
- Agents shall source `~/.bashrc` from an interactive Bash shell before running TMDB-dependent test commands or TMDB-dependent `python -m ffx ...` test invocations.
- Agents shall not source `~/.bashrc.d/interactive/77_tmdb.sh` directly for normal test work; `~/.bashrc` is the required entry point.
- In automation this means agents shall use an interactive Bash invocation such as `bash -ic 'source ~/.bashrc && ...'`, because a non-interactive `bash -lc` returns from `~/.bashrc` before the interactive fragments are loaded.
- If sourcing `~/.bashrc` still does not provide required shell environment such as `TMDB_API_KEY`, agents shall stop and report the missing environment instead of continuing with TMDB-dependent test execution.
## Current Harness
- Entrypoint: `~/.local/share/ffx.venv/bin/python tests/legacy_runner.py run`
- Runner style: custom Click CLI, not `pytest` or `unittest`
- Commands:
- `run`: discover scenario files, instantiate each scenario, run yielded jobs
- `dupe`: helper command that creates duplicate media fixtures; not part of the test run
- Filters: `--scenario`, `--variant`, `--limit`
- Shared context:
- builds one mutable dict for the whole run
- installs loggers and writes `ffx_test_report.log`
- creates `ConfigurationController` eagerly
- tracks only passed and failed counters
- Discovery:
- scenario files: `tests/legacy/scenario_*.py`
- combinators: `glob + importlib + inspect` by filename convention (sketched after this list)
- ordering: implicit glob order, no explicit sorting
- Skip behavior:
- Scenario 4 is skipped when `TMDB_API_KEY` is missing
- only `TMDB_API_KEY_NOT_PRESENT_EXCEPTION` is caught at scenario construction time
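For orientation, the glob-plus-reflection discovery described above amounts to roughly the following; the file paths are real, but the `Scenario` class-name convention shown here is an assumption and the actual runner code differs in detail.

```python
import glob
import importlib.util
import inspect
import os

def discover_scenarios(test_directory="tests/legacy"):
    """Load scenario_*.py files and collect the classes defined in them."""
    scenarios = []
    for path in sorted(glob.glob(os.path.join(test_directory, "scenario_*.py"))):
        module_name = os.path.splitext(os.path.basename(path))[0]
        spec = importlib.util.spec_from_file_location(module_name, path)
        module = importlib.util.module_from_spec(spec)
        spec.loader.exec_module(module)
        for _, obj in inspect.getmembers(module, inspect.isclass):
            if obj.__module__ == module_name and obj.__name__.startswith("Scenario"):
                scenarios.append(obj)
    return scenarios
```

The real runner relies on implicit glob order instead of the explicit `sorted()` used here, which is one of the determinism gaps called out in the rewrite target.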
## Current Scenarios
- `1`: `tests/legacy/scenario_1.py`
- focus: basename generation without pattern lookup or TMDB
- inputs per job: `1`
- jobs: `140`
- expected failures: `0`
- execution: build one synthetic source file, run `~/.local/share/ffx.venv/bin/python -m ffx convert`, assert filename selectors only
- selectors executed: `B`, `L`, `I`
- selectors defined but not executed: `S`, `R`
- `2`: `tests/legacy/scenario_2.py`
- focus: conversion matrix over media layouts, dispositions, tags, and permutations
- inputs per job: `1`
- jobs: `8193`
- expected failures: `3267`
- execution: build one synthetic source file, run `~/.local/share/ffx.venv/bin/python -m ffx convert`, probe result with `FileProperties`, assert track layout and selected audio and subtitle metadata
- selectors executed: `M`, `AD`, `AT`, `SD`, `ST`
- selectors defined but not executed: `MT`, `AP`, `SP`, `J`
- `4`: `tests/legacy/scenario_4.py`
- focus: pattern-driven batch conversion with SQLite state and live TMDB naming
- inputs per job: `6`
- jobs: `768`
- expected failures: `336`
- execution: build six synthetic preset files, recreate temp SQLite DB, insert show and pattern, run one batch convert command via `~/.local/share/ffx.venv/bin/python`, query TMDB during assertions
- selectors executed: `M`, `AD`, `AT`, `SD`, `ST`
- selectors defined but not executed: `MT`, `AP`, `SP`, `J`
- notes:
- uses `MediaCombinator6` only
- issues live HTTP requests through `TmdbController` with no request cache
## Current Combinator Families
- scenario files discovered: `3`
- basename combinators discovered: `2`
- media combinators discovered: `8`
- media tag combinators discovered: `3`
- disposition combinator 2 variants: `4`
- disposition combinator 3 variants: `5`
- track tag combinator 2 variants: `4`
- track tag combinator 3 variants: `5`
- indicator variants: `7`
- label variants: `2`
- show variants: `3`
- release variants: `3`
- permutation 2 variants: `2`
- permutation 3 variants: `3`
## Current Totals
- full run without TMDB: `8333`
- full run with TMDB: `9101`
- Scenario 4 generated source files: `4608`
- Scenario 4 live TMDB episode queries: `4608`
## Current Behavior Areas
- output basename rules for label, season and episode indicator, show name, and release suffix combinations
- track layout normalization across the eight media combinator shapes from `VA` through `VAASSS`
- two-track and three-track disposition edge cases, including intentional failure cases
- two-track and three-track track-tag preservation checks, including checks that sort results by source identity
- container-level media tag handling
- pattern-backed conversion against a temporary SQLite database
- TMDB-assisted episode naming for batch conversion
## Structural Findings
- The suite is process-heavy: most jobs run `ffmpeg` to generate a fixture and then spawn the FFX CLI as a subprocess.
- The suite is integration-first and has almost no isolated unit-level coverage for pure logic.
- The base `Combinator` class is a placeholder and is not the real abstraction boundary used by the suite.
- Many combinator methods are placeholders: there are `25` `pass` statements across the current test modules.
- Several assertion families are never executed because scenario selector dispatch is incomplete.
- Scenario comments mention a Scenario 3, but no `scenario_3.py` exists.
- `tests/legacy/_basename_combinator_1.py` is effectively orphaned because discovery only matches `basename_combinator_*.py`.
- `tests/legacy/disposition_combinator_2_3 .py` contains an embedded space in the filename and is still part of discovery.
- Expected failures are validated only as subprocess return-code matches, not as specific error types or messages.
- The current suite depends on `ffmpeg`, `ffprobe`, SQLite, the local Python environment, and for Scenario 4 a live TMDB API key plus network access.
## Rewrite Target
- Replace the custom Click harness with a standard test runner, preferably `pytest`.
- Split the suite into explicit layers: unit, integration, and optional external-system tests.
- Keep unit tests as the default path and make them runnable without `ffmpeg`, `ffprobe`, TMDB, or a user config directory.
- Model discovery explicitly in code instead of relying on glob-plus-reflection naming conventions.
- Convert the current Cartesian-product combinators into readable parametrized cases grouped by behavior area (see the sketch after this list).
- Preserve the current behavior areas, but represent them with targeted cases instead of thousands of opaque variant IDs.
- Make every assertion family explicit and executable; there must be no selector that is produced but never consumed.
- Replace live TMDB access with fixtures or mocks in normal runs; any live-contract test must be opt-in.
- Replace ad hoc subprocess return-code checks with assertions on typed exceptions, stderr content, or structured outputs.
- Provide small reusable media fixtures or fixture builders so only a narrow integration slice needs `ffmpeg`-generated media.
- Make database tests self-contained and fast through temporary databases and direct controller-level assertions.
- Make ordering, naming, and selection deterministic so a contributor can predict exactly what will run.
- Expose a small smoke suite for quick local runs and CI, plus a separately marked slower integration suite.
- Prefer domain-oriented test modules over combinator-family modules: basename, pattern matching, metadata rewrite, track ordering, TMDB naming, CLI smoke, and failure handling.
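One possible shape for such a parametrized case, replacing thousands of opaque variant IDs with a few named variants; the stand-in reorder logic and the case names below are assumptions for illustration, not the production planner.

```python
import pytest

GROUP_ORDER_CASES = [
    pytest.param(["video", "audio", "subtitle"], ["video", "audio", "subtitle"],
                 id="already-ordered"),
    pytest.param(["audio", "video", "subtitle"], ["video", "audio", "subtitle"],
                 id="audio-before-video"),
    pytest.param(["video", "subtitle", "audio", "audio"], ["video", "audio", "audio", "subtitle"],
                 id="two-audio-tracks"),
]

@pytest.mark.parametrize("source_types, expected_output_types", GROUP_ORDER_CASES)
def test_retained_tracks_follow_group_order(source_types, expected_output_types):
    type_order = {"video": 0, "audio": 1, "subtitle": 2}
    # Stand-in for the real reorder planner: a stable sort by type group.
    planned = sorted(source_types, key=lambda track_type: type_order[track_type])
    assert planned == expected_output_types
```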
## Rewrite Acceptance
- A default local test run finishes quickly and without network access.
- A contributor can identify which behavior a failing test covers without decoding variant strings like `VAASSS-A:D10-S:T001`.
- All current intended failure behaviors remain covered, but each one is asserted directly and readably.
- The rewritten suite can be adopted by CI without requiring live TMDB credentials.

View File

@@ -33,7 +33,7 @@ if TYPE_CHECKING:
from ffx.media_descriptor import MediaDescriptor
from ffx.track_descriptor import TrackDescriptor
LIGHTWEIGHT_COMMANDS = {None, 'version', 'help', 'setup', 'configure_workstation', 'upgrade'}
LIGHTWEIGHT_COMMANDS = {None, 'version', 'help', 'setup', 'configure_workstation', 'upgrade', 'rename'}
CPU_OPTION_HELP = (
"Limit CPU for started processes. Use an absolute cpulimit value such as 200 "
+ "(about 2 cores), or use a percentage such as 25% for a share of present cores. "
@@ -185,6 +185,67 @@ def resolveUnmuxOutputDirectory(context, outputDirectory, subtitlesOnly, label):
return os.path.join(configuredSubtitlesBaseDirectory, resolvedLabel), True
def resolveIndicatorDigitLengths(context=None, showDescriptor=None):
from ffx.show_descriptor import ShowDescriptor
defaultDigitLengths = ShowDescriptor.getDefaultDigitLengths(context)
if showDescriptor is None:
return (
defaultDigitLengths[ShowDescriptor.INDICATOR_SEASON_DIGITS_KEY],
defaultDigitLengths[ShowDescriptor.INDICATOR_EPISODE_DIGITS_KEY],
)
return (
int(showDescriptor.getIndicatorSeasonDigits()),
int(showDescriptor.getIndicatorEpisodeDigits()),
)
def buildRenameTargetFilename(
sourcePath,
prefix,
seasonOverride=None,
suffix='',
indicatorSeasonDigits=None,
indicatorEpisodeDigits=None,
):
from ffx.file_properties import FileProperties
from ffx.show_descriptor import ShowDescriptor
sourceFilename = os.path.basename(sourcePath)
seasonEpisodeValues = FileProperties.extractSeasonEpisodeValues(sourceFilename)
if seasonEpisodeValues is None:
return None
sourceSeason, sourceEpisode = seasonEpisodeValues
resolvedSeason = int(seasonOverride) if seasonOverride is not None else (
int(sourceSeason) if sourceSeason is not None else 1
)
resolvedIndicatorSeasonDigits = (
int(indicatorSeasonDigits)
if indicatorSeasonDigits is not None
else ShowDescriptor.DEFAULT_INDICATOR_SEASON_DIGITS
)
resolvedIndicatorEpisodeDigits = (
int(indicatorEpisodeDigits)
if indicatorEpisodeDigits is not None
else ShowDescriptor.DEFAULT_INDICATOR_EPISODE_DIGITS
)
_sourceBasename, sourceExtension = os.path.splitext(sourceFilename)
targetFilenameTokens = [
str(prefix).strip(),
f"s{resolvedSeason:0{resolvedIndicatorSeasonDigits}d}e{int(sourceEpisode):0{resolvedIndicatorEpisodeDigits}d}",
]
resolvedSuffix = str(suffix).strip()
if resolvedSuffix:
targetFilenameTokens.append(resolvedSuffix)
return f"{'_'.join(targetFilenameTokens)}{sourceExtension}"
@click.group()
@click.pass_context
@@ -242,7 +303,7 @@ def version():
def help():
click.echo(f"ffx {VERSION}\n")
click.echo("Maintenance commands: setup, configure_workstation, upgrade")
click.echo("Media commands: shows, inspect, convert, unmux, cropdetect")
click.echo("Media commands: shows, inspect, convert, rename, unmux, cropdetect")
click.echo("Use 'ffx --help' or 'ffx <command> --help' for full command help.")
@@ -375,10 +436,14 @@ def upgrade(ctx, branch):
commandSequences.append(['git', 'reset', '--hard', 'HEAD'])
if branch:
commandSequences.append(['git', 'checkout', branch])
commandSequences += [
['git', 'fetch', 'origin', branch],
['git', 'checkout', '-B', branch, 'FETCH_HEAD'],
]
else:
commandSequences.append(['git', 'pull'])
commandSequences += [
['git', 'pull'],
[bundlePipPath, 'install', '--upgrade', 'pip', 'setuptools', 'wheel'],
[bundlePipPath, 'install', '--editable', '.'],
]
@@ -408,6 +473,62 @@ def inspect(ctx, filename):
app.run()
@ffx.command()
@click.pass_context
@click.argument('paths', nargs=-1)
@click.option('--prefix', type=str, required=True, help='Required target filename prefix')
@click.option('--season', type=int, default=None, help='Override target season index')
@click.option('--suffix', type=str, default='', help='Optional target filename suffix')
@click.option('--dry-run', is_flag=True, default=False, help='Only print planned renames')
def rename(ctx, paths, prefix, season, suffix, dry_run):
"""Rename matching episode files in place."""
from ffx.configuration_controller import ConfigurationController
resolvedPrefix = str(prefix).strip()
resolvedSuffix = str(suffix).strip()
effectiveDryRun = bool(ctx.obj.get('dry_run', False) or dry_run)
renameContext = {
'config': ctx.obj.get('config') or ConfigurationController(),
}
indicatorSeasonDigits, indicatorEpisodeDigits = resolveIndicatorDigitLengths(renameContext)
if not resolvedPrefix:
raise click.ClickException("Rename prefix must not be empty.")
processedCount = 0
for sourcePath in paths:
if not os.path.isfile(sourcePath):
continue
targetFilename = buildRenameTargetFilename(
sourcePath,
resolvedPrefix,
seasonOverride=season,
suffix=resolvedSuffix,
indicatorSeasonDigits=indicatorSeasonDigits,
indicatorEpisodeDigits=indicatorEpisodeDigits,
)
if targetFilename is None:
continue
sourceFilename = os.path.basename(sourcePath)
targetPath = os.path.join(os.path.dirname(sourcePath), targetFilename)
click.echo(f"{sourceFilename} -> {targetFilename}")
processedCount += 1
if effectiveDryRun or os.path.abspath(sourcePath) == os.path.abspath(targetPath):
continue
if os.path.exists(targetPath):
raise click.ClickException(f"Target file already exists: {targetPath}")
shutil.move(sourcePath, targetPath)
if processedCount == 0:
click.echo("No matching files found.")
def getUnmuxSequence(trackDescriptor: TrackDescriptor, sourcePath, targetPrefix, targetDirectory = ''):
# executable and input file
@@ -468,6 +589,7 @@ def unmux(ctx,
cpu):
from ffx.file_properties import FileProperties
from ffx.process import executeProcess
from ffx.shifted_season_controller import ShiftedSeasonController
from ffx.track_disposition import TrackDisposition
from ffx.track_type import TrackType
@@ -488,6 +610,8 @@ def unmux(ctx,
if create_output_directory and existingSourcePaths and not ctx.obj.get('dry_run', False):
os.makedirs(output_directory, exist_ok=True)
shiftedSeasonController = ShiftedSeasonController(ctx.obj)
for sourcePath in existingSourcePaths:
fp = FileProperties(ctx.obj, sourcePath)
@@ -495,13 +619,29 @@ def unmux(ctx,
try:
sourceMediaDescriptor = fp.getMediaDescriptor()
currentPattern = fp.getPattern()
currentShowDescriptor = (
currentPattern.getShowDescriptor(ctx.obj) if currentPattern is not None else None
)
indicatorSeasonDigits, indicatorEpisodeDigits = resolveIndicatorDigitLengths(
ctx.obj,
currentShowDescriptor,
)
season = fp.getSeason()
episode = fp.getEpisode()
season, episode = shiftedSeasonController.shiftSeason(
fp.getShowId(),
season=fp.getSeason(),
episode=fp.getEpisode(),
patternId=currentPattern.getId() if currentPattern is not None else None,
)
#TODO: Adapt recognition for all formats
targetLabel = label if label else fp.getFileBasename()
targetIndicator = f"_S{season}E{episode}" if label and season != -1 and episode != -1 else ''
targetIndicator = (
f"_S{season:0{indicatorSeasonDigits}d}E{episode:0{indicatorEpisodeDigits}d}"
if label and season != -1 and episode != -1
else ''
)
if label and not targetIndicator:
ctx.obj['logger'].warning(f"Skipping file {fp.getFilename()}: Label set but no indicator recognized")
@@ -837,6 +977,7 @@ def convert(ctx,
from ffx.filter.quality_filter import QualityFilter
from ffx.helper import filterFilename, getEpisodeFileBasename, substituteTmdbFilename
from ffx.shifted_season_controller import ShiftedSeasonController
from ffx.show_controller import ShowController
from ffx.show_descriptor import ShowDescriptor
from ffx.tmdb_controller import TmdbController
from ffx.track_codec import TrackCodec
@@ -1020,6 +1161,7 @@ def convert(ctx,
ctx.obj['logger'].info(f"\nRunning {len(existingSourcePaths) * len(chainYield)} jobs")
jobIndex = 0
showController = ShowController(context)
for sourcePath in existingSourcePaths:
@@ -1052,7 +1194,7 @@ def convert(ctx,
ssc = ShiftedSeasonController(context)
showId = mediaFileProperties.getShowId()
matchedShowId = mediaFileProperties.getShowId()
#HINT: -1 if not set
if 'tmdb' in cliOverrides.keys() and 'season' in cliOverrides['tmdb']:
@@ -1134,7 +1276,8 @@ def convert(ctx,
targetMediaDescriptor.importSubtitles(context['subtitle_directory'],
context['subtitle_prefix'],
showSeason,
showEpisode)
showEpisode,
preserve_dispositions=True)
# ctx.obj['logger'].debug(f"tmd subindices: {[t.getIndex() for t in targetMediaDescriptor.getAllTrackDescriptors()]} {[t.getSubIndex() for t in targetMediaDescriptor.getAllTrackDescriptors()]} {[t.getDispositionFlag(TrackDisposition.DEFAULT) for t in targetMediaDescriptor.getAllTrackDescriptors()]}")
ctx.obj['logger'].debug(f"tmd subindices: {[t.getIndex() for t in targetMediaDescriptor.getTrackDescriptors()]} {[t.getSubIndex() for t in targetMediaDescriptor.getTrackDescriptors()]} {[t.getDispositionFlag(TrackDisposition.DEFAULT) for t in targetMediaDescriptor.getTrackDescriptors()]}")
@@ -1149,26 +1292,59 @@ def convert(ctx,
fc = FfxController(context, targetMediaDescriptor, sourceMediaDescriptor)
indexSeasonDigits = currentShowDescriptor.getIndexSeasonDigits() if not currentPattern is None else ShowDescriptor.DEFAULT_INDEX_SEASON_DIGITS
indexEpisodeDigits = currentShowDescriptor.getIndexEpisodeDigits() if not currentPattern is None else ShowDescriptor.DEFAULT_INDEX_EPISODE_DIGITS
indicatorSeasonDigits = currentShowDescriptor.getIndicatorSeasonDigits() if not currentPattern is None else ShowDescriptor.DEFAULT_INDICATOR_SEASON_DIGITS
indicatorEpisodeDigits = currentShowDescriptor.getIndicatorEpisodeDigits() if not currentPattern is None else ShowDescriptor.DEFAULT_INDICATOR_EPISODE_DIGITS
qualityShowId = (
cliOverrides['tmdb']['show']
if 'tmdb' in cliOverrides.keys() and 'show' in cliOverrides['tmdb']
else matchedShowId
)
if currentShowDescriptor is None and qualityShowId != -1:
currentShowDescriptor = showController.getShowDescriptor(qualityShowId)
# Shift season and episode if defined for this show
if ('tmdb' not in cliOverrides.keys() and showId != -1
and showSeason != -1 and showEpisode != -1):
shiftedShowSeason, shiftedShowEpisode = ssc.shiftSeason(showId,
season=showSeason,
episode=showEpisode)
defaultDigitLengths = ShowDescriptor.getDefaultDigitLengths(context)
indexSeasonDigits = currentShowDescriptor.getIndexSeasonDigits() if not currentPattern is None else defaultDigitLengths[ShowDescriptor.INDEX_SEASON_DIGITS_KEY]
indexEpisodeDigits = currentShowDescriptor.getIndexEpisodeDigits() if not currentPattern is None else defaultDigitLengths[ShowDescriptor.INDEX_EPISODE_DIGITS_KEY]
indicatorSeasonDigits = currentShowDescriptor.getIndicatorSeasonDigits() if not currentPattern is None else defaultDigitLengths[ShowDescriptor.INDICATOR_SEASON_DIGITS_KEY]
indicatorEpisodeDigits = currentShowDescriptor.getIndicatorEpisodeDigits() if not currentPattern is None else defaultDigitLengths[ShowDescriptor.INDICATOR_EPISODE_DIGITS_KEY]
showIdForShift = (
cliOverrides['tmdb']['show']
if 'tmdb' in cliOverrides.keys() and 'show' in cliOverrides['tmdb']
else matchedShowId
)
patternIdForShift = currentPattern.getId() if currentPattern is not None else None
hasExplicitTargetSeasonOrEpisode = (
'tmdb' in cliOverrides.keys()
and (
'season' in cliOverrides['tmdb']
or 'episode' in cliOverrides['tmdb']
)
)
# Shift season and episode if defined for the matched pattern or show
if (
not hasExplicitTargetSeasonOrEpisode
and showSeason != -1
and showEpisode != -1
):
shiftedShowSeason, shiftedShowEpisode = ssc.shiftSeason(
showIdForShift,
season=showSeason,
episode=showEpisode,
patternId=patternIdForShift,
)
else:
shiftedShowSeason = showSeason
shiftedShowEpisode = showEpisode
# Assemble the target filename depending on whether TMDB lookup is enabled
#HINT: -1 if not set
showId = cliOverrides['tmdb']['show'] if 'tmdb' in cliOverrides.keys() and 'show' in cliOverrides['tmdb'] else (-1 if currentShowDescriptor is None else currentShowDescriptor.getId())
showId = (
cliOverrides['tmdb']['show']
if 'tmdb' in cliOverrides.keys() and 'show' in cliOverrides['tmdb']
else (-1 if currentShowDescriptor is None else currentShowDescriptor.getId())
)
if context['use_tmdb'] and showId != -1 and shiftedShowSeason != -1 and shiftedShowEpisode != -1:
@@ -1254,7 +1430,8 @@ def convert(ctx,
targetFormat,
chainIteration,
cropArguments,
currentPattern)
currentPattern,
currentShowDescriptor)

View File

@@ -1,5 +1,12 @@
import os, json
from .constants import (
DEFAULT_SHOW_INDEX_EPISODE_DIGITS,
DEFAULT_SHOW_INDEX_SEASON_DIGITS,
DEFAULT_SHOW_INDICATOR_EPISODE_DIGITS,
DEFAULT_SHOW_INDICATOR_SEASON_DIGITS,
)
class ConfigurationController():
CONFIG_FILENAME = 'ffx.json'
@@ -10,6 +17,10 @@ class ConfigurationController():
LOG_DIRECTORY_CONFIG_KEY = 'logDirectory'
SUBTITLES_DIRECTORY_CONFIG_KEY = 'subtitlesDirectory'
OUTPUT_FILENAME_TEMPLATE_KEY = 'outputFilenameTemplate'
DEFAULT_INDEX_SEASON_DIGITS_CONFIG_KEY = 'defaultIndexSeasonDigits'
DEFAULT_INDEX_EPISODE_DIGITS_CONFIG_KEY = 'defaultIndexEpisodeDigits'
DEFAULT_INDICATOR_SEASON_DIGITS_CONFIG_KEY = 'defaultIndicatorSeasonDigits'
DEFAULT_INDICATOR_EPISODE_DIGITS_CONFIG_KEY = 'defaultIndicatorEpisodeDigits'
def __init__(self):
@@ -57,6 +68,42 @@ class ConfigurationController():
)
return os.path.expanduser(str(subtitlesDirectory)) if subtitlesDirectory else ''
@classmethod
def getConfiguredIntegerValue(cls, configurationData: dict, configKey: str, defaultValue: int) -> int:
configuredValue = configurationData.get(configKey, defaultValue)
try:
return int(configuredValue)
except (TypeError, ValueError):
return int(defaultValue)
def getDefaultIndexSeasonDigits(self):
return ConfigurationController.getConfiguredIntegerValue(
self.__configurationData,
ConfigurationController.DEFAULT_INDEX_SEASON_DIGITS_CONFIG_KEY,
DEFAULT_SHOW_INDEX_SEASON_DIGITS,
)
def getDefaultIndexEpisodeDigits(self):
return ConfigurationController.getConfiguredIntegerValue(
self.__configurationData,
ConfigurationController.DEFAULT_INDEX_EPISODE_DIGITS_CONFIG_KEY,
DEFAULT_SHOW_INDEX_EPISODE_DIGITS,
)
def getDefaultIndicatorSeasonDigits(self):
return ConfigurationController.getConfiguredIntegerValue(
self.__configurationData,
ConfigurationController.DEFAULT_INDICATOR_SEASON_DIGITS_CONFIG_KEY,
DEFAULT_SHOW_INDICATOR_SEASON_DIGITS,
)
def getDefaultIndicatorEpisodeDigits(self):
return ConfigurationController.getConfiguredIntegerValue(
self.__configurationData,
ConfigurationController.DEFAULT_INDICATOR_EPISODE_DIGITS_CONFIG_KEY,
DEFAULT_SHOW_INDICATOR_EPISODE_DIGITS,
)
def getData(self):
return self.__configurationData

View File

@@ -1,5 +1,5 @@
VERSION='0.2.3'
DATABASE_VERSION = 2
VERSION='0.2.5'
DATABASE_VERSION = 3
DEFAULT_QUALITY = 32
DEFAULT_AV1_PRESET = 5
@@ -22,4 +22,9 @@ DEFAULT_CROPDETECT_DURATION_SECONDS = 180
DEFAULT_cut_start = 60
DEFAULT_cut_length = 180
DEFAULT_SHOW_INDEX_SEASON_DIGITS = 2
DEFAULT_SHOW_INDEX_EPISODE_DIGITS = 2
DEFAULT_SHOW_INDICATOR_SEASON_DIGITS = 2
DEFAULT_SHOW_INDICATOR_EPISODE_DIGITS = 2
DEFAULT_OUTPUT_FILENAME_TEMPLATE = '{{ ffx_show_name }} - {{ ffx_index }}{{ ffx_index_separator }}{{ ffx_episode_name }}{{ ffx_indicator_separator }}{{ ffx_indicator }}'

View File

@@ -1,6 +1,6 @@
import os, click
import os, shutil, click
from sqlalchemy import create_engine, inspect
from sqlalchemy import create_engine, inspect, text
from sqlalchemy.orm import sessionmaker
# Import the full model package so SQLAlchemy registers every mapped class
@@ -9,6 +9,11 @@ import ffx.model
from ffx.model.show import Base
from ffx.model.property import Property
from ffx.model.migration import (
DatabaseVersionException,
getMigrationPlan,
migrateDatabase,
)
from ffx.constants import DATABASE_VERSION
@@ -16,10 +21,6 @@ from ffx.constants import DATABASE_VERSION
DATABASE_VERSION_KEY = 'database_version'
EXPECTED_TABLE_NAMES = set(Base.metadata.tables.keys())
class DatabaseVersionException(Exception):
def __init__(self, errorMessage):
super().__init__(errorMessage)
def databaseContext(databasePath: str = ''):
databaseContext = {}
@@ -33,7 +34,13 @@ def databaseContext(databasePath: str = ''):
if not os.path.exists(ffxVarDir):
os.makedirs(ffxVarDir)
databasePath = os.path.join(ffxVarDir, 'ffx.db')
else:
databasePath = os.path.expanduser(databasePath)
if databasePath != ':memory:':
databasePath = os.path.abspath(databasePath)
databaseContext['path'] = databasePath
databaseContext['url'] = f"sqlite:///{databasePath}"
databaseContext['engine'] = create_engine(databaseContext['url'])
databaseContext['session'] = sessionmaker(bind=databaseContext['engine'])
@@ -68,14 +75,113 @@ def bootstrapDatabaseIfNeeded(databaseContext):
Base.metadata.create_all(databaseContext['engine'])
def ensureDatabaseVersion(databaseContext):
currentDatabaseVersion = getDatabaseVersion(databaseContext)
if currentDatabaseVersion:
if currentDatabaseVersion != DATABASE_VERSION:
raise DatabaseVersionException(f"Current database version ({currentDatabaseVersion}) does not match required ({DATABASE_VERSION})")
else:
if not currentDatabaseVersion:
setDatabaseVersion(databaseContext, DATABASE_VERSION)
return
if currentDatabaseVersion > DATABASE_VERSION:
raise DatabaseVersionException(
f"Current database version ({currentDatabaseVersion}) does not match required ({DATABASE_VERSION})"
)
if currentDatabaseVersion < DATABASE_VERSION:
promptForDatabaseMigration(databaseContext, currentDatabaseVersion, DATABASE_VERSION)
migrateDatabase(databaseContext, currentDatabaseVersion, DATABASE_VERSION, setDatabaseVersion)
currentDatabaseVersion = getDatabaseVersion(databaseContext)
if currentDatabaseVersion != DATABASE_VERSION:
raise DatabaseVersionException(
f"Current database version ({currentDatabaseVersion}) does not match required ({DATABASE_VERSION})"
)
ensureCurrentSchemaCompatibility(databaseContext)
def ensureCurrentSchemaCompatibility(databaseContext):
engine = databaseContext['engine']
inspector = inspect(engine)
showColumns = {
column['name']
for column in inspector.get_columns('shows')
}
alterStatements = []
if 'quality' not in showColumns:
alterStatements.append("ALTER TABLE shows ADD COLUMN quality INTEGER DEFAULT 0")
if 'notes' not in showColumns:
alterStatements.append("ALTER TABLE shows ADD COLUMN notes TEXT DEFAULT ''")
if not alterStatements:
return
with engine.begin() as connection:
for alterStatement in alterStatements:
connection.execute(text(alterStatement))
def promptForDatabaseMigration(databaseContext, currentDatabaseVersion: int, targetDatabaseVersion: int):
migrationPlan = getMigrationPlan(currentDatabaseVersion, targetDatabaseVersion)
click.echo("Database migration required.")
click.echo(f"Current version: {currentDatabaseVersion}")
click.echo(f"Target version: {targetDatabaseVersion}")
click.echo("Steps required:")
missingSteps = []
for migrationStep in migrationPlan:
moduleStatus = "present" if migrationStep.modulePresent else "missing"
click.echo(
f" {migrationStep.versionFrom} -> {migrationStep.versionTo}: "
+ f"{migrationStep.moduleName} [{moduleStatus}]"
)
if not migrationStep.modulePresent:
missingSteps.append(migrationStep)
if missingSteps:
firstMissingStep = missingSteps[0]
raise DatabaseVersionException(
f"No migration path from database version "
+ f"{firstMissingStep.versionFrom} to {firstMissingStep.versionTo}"
)
if not click.confirm(
"Create a backup and continue with database migration?",
default=True,
):
raise click.ClickException("Database migration aborted by user.")
backupPath = backupDatabaseBeforeMigration(
databaseContext,
currentDatabaseVersion,
targetDatabaseVersion,
)
click.echo(f"Database backup created: {backupPath}")
def backupDatabaseBeforeMigration(databaseContext, currentDatabaseVersion: int, targetDatabaseVersion: int) -> str:
databasePath = databaseContext.get('path', '')
if not databasePath or databasePath == ':memory:':
raise click.ClickException("Database migration backup requires a file-backed SQLite database.")
if not os.path.isfile(databasePath):
raise click.ClickException(f"Database file not found for backup: {databasePath}")
backupPath = f"{databasePath}.v{currentDatabaseVersion}-to-v{targetDatabaseVersion}.bak"
backupIndex = 1
while os.path.exists(backupPath):
backupPath = (
f"{databasePath}.v{currentDatabaseVersion}-to-v{targetDatabaseVersion}.{backupIndex}.bak"
)
backupIndex += 1
databaseContext['engine'].dispose()
shutil.copy2(databasePath, backupPath)
return backupPath
def getDatabaseVersion(databaseContext):

View File

@@ -245,7 +245,8 @@ class FfxController():
targetFormat: str = '',
chainIteration: list = [],
cropArguments: dict = {},
currentPattern: Pattern = None):
currentPattern: Pattern = None,
currentShowDescriptor = None):
# quality: int = DEFAULT_QUALITY,
# preset: int = DEFAULT_AV1_PRESET):
@@ -262,9 +263,11 @@ class FfxController():
if qualityFilters and (quality := qualityFilters[0]['parameters']['quality']):
self.__logger.info(f"Setting quality {quality} from command line parameter")
self.__logger.info(f"Setting quality {quality} from command line")
elif currentPattern is not None and (quality := currentPattern.quality):
self.__logger.info(f"Setting quality {quality} from pattern default")
self.__logger.info(f"Setting quality {quality} from pattern")
elif currentShowDescriptor is not None and (quality := currentShowDescriptor.getQuality()):
self.__logger.info(f"Setting quality {quality} from show")
else:
quality = (QualityFilter.DEFAULT_H264_QUALITY
if (videoEncoder == VideoEncoder.H264)

View File

@@ -30,6 +30,18 @@ class FileProperties():
DEFAULT_INDEX_DIGITS = 3
@classmethod
def extractSeasonEpisodeValues(cls, sourceText: str) -> tuple[int | None, int] | None:
seasonEpisodeMatch = re.search(cls.SEASON_EPISODE_INDICATOR_MATCH, str(sourceText))
if seasonEpisodeMatch is not None:
return int(seasonEpisodeMatch.group(1)), int(seasonEpisodeMatch.group(2))
episodeMatch = re.search(cls.EPISODE_INDICATOR_MATCH, str(sourceText))
if episodeMatch is not None:
return None, int(episodeMatch.group(1))
return None
def __init__(self, context, sourcePath):
self.context = context
@@ -65,26 +77,19 @@ class FileProperties():
databaseMatchedGroups = matchResult['match'].groups()
self.__logger.debug(f"FileProperties.__init__(): Matched groups: {databaseMatchedGroups}")
seIndicator = databaseMatchedGroups[0]
se_match = re.search(FileProperties.SEASON_EPISODE_INDICATOR_MATCH, seIndicator)
e_match = re.search(FileProperties.EPISODE_INDICATOR_MATCH, seIndicator)
indicatorSource = databaseMatchedGroups[0]
else:
self.__logger.debug(f"FileProperties.__init__(): Checking file name for indicator {self.__sourceFilename}")
indicatorSource = self.__sourceFilename
se_match = re.search(FileProperties.SEASON_EPISODE_INDICATOR_MATCH, self.__sourceFilename)
e_match = re.search(FileProperties.EPISODE_INDICATOR_MATCH, self.__sourceFilename)
if se_match is not None:
self.__season = int(se_match.group(1))
self.__episode = int(se_match.group(2))
elif e_match is not None:
self.__season = -1
self.__episode = int(e_match.group(1))
else:
seasonEpisodeValues = self.extractSeasonEpisodeValues(indicatorSource)
if seasonEpisodeValues is None:
self.__season = -1
self.__episode = -1
else:
sourceSeason, sourceEpisode = seasonEpisodeValues
self.__season = -1 if sourceSeason is None else int(sourceSeason)
self.__episode = int(sourceEpisode)
self.__ffprobeData = None

View File

@@ -4,6 +4,7 @@ from jinja2 import Environment, Undefined
from .constants import DEFAULT_OUTPUT_FILENAME_TEMPLATE
from .configuration_controller import ConfigurationController
from .logging_utils import get_ffx_logger
from .show_descriptor import ShowDescriptor
class EmptyStringUndefined(Undefined):
@@ -164,10 +165,10 @@ def getEpisodeFileBasename(showName,
episodeName,
season,
episode,
indexSeasonDigits = 2,
indexEpisodeDigits = 2,
indicatorSeasonDigits = 2,
indicatorEpisodeDigits = 2,
indexSeasonDigits = None,
indexEpisodeDigits = None,
indicatorSeasonDigits = None,
indicatorEpisodeDigits = None,
context = None):
"""
One Piece:
@@ -199,6 +200,16 @@ def getEpisodeFileBasename(showName,
configData = cc.getData() if cc is not None else {}
outputFilenameTemplate = configData.get(ConfigurationController.OUTPUT_FILENAME_TEMPLATE_KEY,
DEFAULT_OUTPUT_FILENAME_TEMPLATE)
defaultDigitLengths = ShowDescriptor.getDefaultDigitLengths(context)
if indexSeasonDigits is None:
indexSeasonDigits = defaultDigitLengths[ShowDescriptor.INDEX_SEASON_DIGITS_KEY]
if indexEpisodeDigits is None:
indexEpisodeDigits = defaultDigitLengths[ShowDescriptor.INDEX_EPISODE_DIGITS_KEY]
if indicatorSeasonDigits is None:
indicatorSeasonDigits = defaultDigitLengths[ShowDescriptor.INDICATOR_SEASON_DIGITS_KEY]
if indicatorEpisodeDigits is None:
indicatorEpisodeDigits = defaultDigitLengths[ShowDescriptor.INDICATOR_EPISODE_DIGITS_KEY]
if context is not None and 'logger' in context.keys():
logger = context['logger']

View File

@@ -500,7 +500,14 @@ class MediaDescriptor:
return subtitleFileDescriptors
def importSubtitles(self, searchDirectory, prefix, season: int = -1, episode: int = -1):
def importSubtitles(
self,
searchDirectory,
prefix,
season: int = -1,
episode: int = -1,
preserve_dispositions: bool = False,
):
# click.echo(f"Season: {season} Episode: {episode}")
self.__logger.debug(f"importSubtitles(): Season: {season} Episode: {episode}")
@@ -543,7 +550,7 @@ class MediaDescriptor:
# Prefer metadata coming from the external single-track source when
# it is provided explicitly by the filename contract.
matchingTrack.getTags()["language"] = msfd["language"]
if msfd["disposition_set"]:
if msfd["disposition_set"] and not preserve_dispositions:
matchingTrack.setDispositionSet(msfd["disposition_set"])

View File

@@ -559,6 +559,7 @@ class MediaDetailsScreen(Screen):
try:
kwargs = {}
kwargs[ShowDescriptor.CONTEXT_KEY] = self.context
kwargs[ShowDescriptor.ID_KEY] = int(selected_row_data[0])
kwargs[ShowDescriptor.NAME_KEY] = str(selected_row_data[1])
kwargs[ShowDescriptor.YEAR_KEY] = int(selected_row_data[2])

View File

@@ -1,47 +0,0 @@
import os, sys, importlib, inspect, glob, re
from ffx.configuration_controller import ConfigurationController
from ffx.database import databaseContext
from sqlalchemy import Engine
from sqlalchemy.orm import sessionmaker
class Conversion():
def __init__(self):
self._context = {}
self._context['config'] = ConfigurationController()
self._context['database'] = databaseContext(databasePath=self._context['config'].getDatabaseFilePath())
self.__databaseSession: sessionmaker = self._context['database']['session']
self.__databaseEngine: Engine = self._context['database']['engine']
@staticmethod
def list():
basePath = os.path.dirname(__file__)
filenamePattern = re.compile("conversion_([0-9]+)_([0-9]+)\\.py")
filenameList = [os.path.basename(fp) for fp in glob.glob(f"{ basePath }/*.py") if fp != __file__]
versionTupleList = [(fm.group(1), fm.group(2)) for fn in filenameList if (fm := filenamePattern.search(fn))]
return versionTupleList
@staticmethod
def getClassReference(versionFrom, versionTo):
importlib.import_module(f"ffx.model.conversions.conversion_{ versionFrom }_{ versionTo }")
for name, obj in inspect.getmembers(sys.modules[f"ffx.model.conversions.conversion_{ versionFrom }_{ versionTo }"]):
#HINT: Excluding DispositionCombination as it seems to be included by import (?)
if inspect.isclass(obj) and name != 'Conversion' and name.startswith('Conversion'):
return obj
@staticmethod
def getAllClassReferences():
return [Conversion.getClassReference(verFrom, verTo) for verFrom, verTo in Conversion.list()]

View File

@@ -1,17 +0,0 @@
import os, sys, importlib, inspect, glob, re
from .conversion import Conversion
class Conversion_2_3(Conversion):
def __init__(self):
super().__init__()
def applyConversion(self):
s = self.__databaseSession()
e = self.__databaseEngine
with e.connect() as c:
c.execute("ALTER TABLE user ADD COLUMN email VARCHAR(255)")

View File

@@ -1,7 +0,0 @@
import os, sys, importlib, inspect, glob, re
from .conversion import Conversion
class Conversion_3_4(Conversion):
pass

View File

@@ -0,0 +1,82 @@
from __future__ import annotations
from dataclasses import dataclass
import importlib
import importlib.util
class DatabaseVersionException(Exception):
def __init__(self, errorMessage):
super().__init__(errorMessage)
@dataclass(frozen=True)
class MigrationStep:
versionFrom: int
versionTo: int
moduleName: str
modulePresent: bool
def getMigrationStepModuleName(versionFrom: int, versionTo: int) -> str:
return f"ffx.model.migration.step_{int(versionFrom)}_{int(versionTo)}"
def migrationStepModuleExists(versionFrom: int, versionTo: int) -> bool:
moduleName = getMigrationStepModuleName(versionFrom, versionTo)
try:
return importlib.util.find_spec(moduleName) is not None
except ModuleNotFoundError:
return False
def getMigrationPlan(currentVersion: int, targetVersion: int) -> list[MigrationStep]:
version = int(currentVersion)
target = int(targetVersion)
migrationPlan = []
while version < target:
nextVersion = version + 1
migrationPlan.append(
MigrationStep(
versionFrom=version,
versionTo=nextVersion,
moduleName=getMigrationStepModuleName(version, nextVersion),
modulePresent=migrationStepModuleExists(version, nextVersion),
)
)
version = nextVersion
return migrationPlan
def loadMigrationStep(versionFrom: int, versionTo: int):
moduleName = getMigrationStepModuleName(versionFrom, versionTo)
try:
module = importlib.import_module(moduleName)
except ModuleNotFoundError as ex:
if ex.name == moduleName:
raise DatabaseVersionException(
f"No migration path from database version {versionFrom} to {versionTo}"
) from ex
raise
migrationStep = getattr(module, "applyMigration", None)
if migrationStep is None:
raise DatabaseVersionException(
f"Migration module {moduleName} does not define applyMigration()"
)
return migrationStep
def migrateDatabase(databaseContext, currentVersion: int, targetVersion: int, setDatabaseVersion):
for migrationStepInfo in getMigrationPlan(currentVersion, targetVersion):
migrationStep = loadMigrationStep(
migrationStepInfo.versionFrom,
migrationStepInfo.versionTo,
)
migrationStep(databaseContext)
setDatabaseVersion(databaseContext, migrationStepInfo.versionTo)

View File

@@ -0,0 +1,84 @@
from sqlalchemy import inspect, text
def applyMigration(databaseContext):
engine = databaseContext['engine']
inspector = inspect(engine)
shiftedSeasonColumns = {
column['name']
for column in inspector.get_columns('shifted_seasons')
}
showColumns = {
column['name']
for column in inspector.get_columns('shows')
}
with engine.begin() as connection:
if 'pattern_id' not in shiftedSeasonColumns:
connection.execute(text("PRAGMA foreign_keys=OFF"))
connection.execute(
text(
"""
CREATE TABLE shifted_seasons_v3 (
id INTEGER PRIMARY KEY,
show_id INTEGER,
pattern_id INTEGER,
original_season INTEGER,
first_episode INTEGER DEFAULT -1,
last_episode INTEGER DEFAULT -1,
season_offset INTEGER DEFAULT 0,
episode_offset INTEGER DEFAULT 0,
FOREIGN KEY(show_id) REFERENCES shows(id) ON DELETE CASCADE,
FOREIGN KEY(pattern_id) REFERENCES patterns(id) ON DELETE CASCADE,
CHECK (
(show_id IS NOT NULL AND pattern_id IS NULL)
OR (show_id IS NULL AND pattern_id IS NOT NULL)
)
)
"""
)
)
connection.execute(
text(
"""
INSERT INTO shifted_seasons_v3 (
id,
show_id,
pattern_id,
original_season,
first_episode,
last_episode,
season_offset,
episode_offset
)
SELECT
id,
show_id,
NULL,
original_season,
first_episode,
last_episode,
season_offset,
episode_offset
FROM shifted_seasons
"""
)
)
connection.execute(text("DROP TABLE shifted_seasons"))
connection.execute(text("ALTER TABLE shifted_seasons_v3 RENAME TO shifted_seasons"))
connection.execute(
text("CREATE INDEX ix_shifted_seasons_show_id ON shifted_seasons(show_id)")
)
connection.execute(
text("CREATE INDEX ix_shifted_seasons_pattern_id ON shifted_seasons(pattern_id)")
)
connection.execute(text("PRAGMA foreign_keys=ON"))
if 'quality' not in showColumns:
connection.execute(
text("ALTER TABLE shows ADD COLUMN quality INTEGER DEFAULT 0")
)
if 'notes' not in showColumns:
connection.execute(
text("ALTER TABLE shows ADD COLUMN notes TEXT DEFAULT ''")
)

View File

@@ -35,6 +35,7 @@ class Pattern(Base):
tracks = relationship('Track', back_populates='pattern', cascade="all, delete", lazy='joined')
media_tags = relationship('MediaTag', back_populates='pattern', cascade="all, delete", lazy='joined')
shifted_seasons = relationship('ShiftedSeason', back_populates='pattern', cascade="all, delete", lazy='joined')
quality = Column(Integer, default=0)

View File

@@ -1,6 +1,6 @@
import click
from sqlalchemy import Column, Integer, ForeignKey
from sqlalchemy import CheckConstraint, Column, ForeignKey, Index, Integer
from sqlalchemy.orm import relationship
from .show import Base, Show
@@ -9,6 +9,14 @@ from .show import Base, Show
class ShiftedSeason(Base):
__tablename__ = 'shifted_seasons'
__table_args__ = (
CheckConstraint(
"(show_id IS NOT NULL AND pattern_id IS NULL) OR (show_id IS NULL AND pattern_id IS NOT NULL)",
name="ck_shifted_seasons_single_owner",
),
Index("ix_shifted_seasons_show_id", "show_id"),
Index("ix_shifted_seasons_pattern_id", "pattern_id"),
)
# v1.x
id = Column(Integer, primary_key=True)
@@ -19,9 +27,12 @@ class ShiftedSeason(Base):
# pattern: Mapped[str] = mapped_column(String, nullable=False)
# v1.x
show_id = Column(Integer, ForeignKey('shows.id', ondelete="CASCADE"))
show_id = Column(Integer, ForeignKey('shows.id', ondelete="CASCADE"), nullable=True)
show = relationship(Show, back_populates='shifted_seasons', lazy='joined')
pattern_id = Column(Integer, ForeignKey('patterns.id', ondelete="CASCADE"), nullable=True)
pattern = relationship('Pattern', back_populates='shifted_seasons', lazy='joined')
# v2.0
# show_id: Mapped[int] = mapped_column(ForeignKey("shows.id", ondelete="CASCADE"))
# show: Mapped["Show"] = relationship(back_populates="patterns")
@@ -39,6 +50,12 @@ class ShiftedSeason(Base):
def getId(self):
return self.id
def getShowId(self):
return self.show_id
def getPatternId(self):
return self.pattern_id
def getOriginalSeason(self):
return self.original_season
@@ -61,6 +78,8 @@ class ShiftedSeason(Base):
shiftedSeasonObj = {}
shiftedSeasonObj['show_id'] = self.getShowId()
shiftedSeasonObj['pattern_id'] = self.getPatternId()
shiftedSeasonObj['original_season'] = self.getOriginalSeason()
shiftedSeasonObj['first_episode'] = self.getFirstEpisode()
shiftedSeasonObj['last_episode'] = self.getLastEpisode()
@@ -68,4 +87,3 @@ class ShiftedSeason(Base):
shiftedSeasonObj['episode_offset'] = self.getEpisodeOffset()
return shiftedSeasonObj

View File

@@ -1,5 +1,5 @@
# from typing import List
from sqlalchemy import create_engine, Column, Integer, String, ForeignKey
from sqlalchemy import create_engine, Column, Integer, String, Text, ForeignKey
from sqlalchemy.orm import relationship, declarative_base, sessionmaker
from ffx.show_descriptor import ShowDescriptor
@@ -45,6 +45,8 @@ class Show(Base):
index_episode_digits = Column(Integer, default=ShowDescriptor.DEFAULT_INDEX_EPISODE_DIGITS)
indicator_season_digits = Column(Integer, default=ShowDescriptor.DEFAULT_INDICATOR_SEASON_DIGITS)
indicator_episode_digits = Column(Integer, default=ShowDescriptor.DEFAULT_INDICATOR_EPISODE_DIGITS)
quality = Column(Integer, default=0)
notes = Column(Text, default='')
def getDescriptor(self, context):
@@ -58,5 +60,7 @@ class Show(Base):
kwargs[ShowDescriptor.INDEX_EPISODE_DIGITS_KEY] = int(self.index_episode_digits)
kwargs[ShowDescriptor.INDICATOR_SEASON_DIGITS_KEY] = int(self.indicator_season_digits)
kwargs[ShowDescriptor.INDICATOR_EPISODE_DIGITS_KEY] = int(self.indicator_episode_digits)
kwargs[ShowDescriptor.QUALITY_KEY] = int(self.quality or 0)
kwargs[ShowDescriptor.NOTES_KEY] = str(self.notes or '')
return ShowDescriptor(**kwargs)

View File

@@ -9,6 +9,8 @@ from ffx.model.pattern import Pattern
from .track_details_screen import TrackDetailsScreen
from .track_delete_screen import TrackDeleteScreen
from .shifted_season_delete_screen import ShiftedSeasonDeleteScreen
from .shifted_season_details_screen import ShiftedSeasonDetailsScreen
from .tag_details_screen import TagDetailsScreen
from .tag_delete_screen import TagDeleteScreen
@@ -24,6 +26,7 @@ from textual.widgets._data_table import CellDoesNotExist
from ffx.file_properties import FileProperties
from ffx.iso_language import IsoLanguage
from ffx.audio_layout import AudioLayout
from ffx.model.shifted_season import ShiftedSeason
from ffx.helper import formatRichColor, removeRichColor
@@ -34,8 +37,8 @@ class PatternDetailsScreen(Screen):
CSS = """
Grid {
grid-size: 7 17;
grid-rows: 2 2 2 2 2 2 6 2 2 8 2 2 8 2 2 2 2;
grid-size: 7 20;
grid-rows: 2 2 2 2 2 2 6 2 2 8 2 2 8 2 2 8 2 2 2 2;
grid-columns: 25 25 25 25 25 25 25;
height: 100%;
width: 100%;
@@ -115,11 +118,13 @@ class PatternDetailsScreen(Screen):
show=True,
track=True,
tag=True,
shifted_season=True,
)
self.__pc = controllers['pattern']
self.__sc = controllers['show']
self.__tc = controllers['track']
self.__tac = controllers['tag']
self.__ssc = controllers['shifted_season']
self.__pattern : Pattern = self.__pc.getPattern(patternId) if patternId is not None else None
self.__showDescriptor = self.__sc.getShowDescriptor(showId) if showId is not None else None
@@ -258,6 +263,72 @@ class PatternDetailsScreen(Screen):
row = (formatRichColor(tagKey, textColor), formatRichColor(tagValue, textColor))
self.tagsTable.add_row(*map(str, row))
def updateShiftedSeasons(self):
self.shiftedSeasonsTable.clear()
if self.__pattern is None:
return
shiftedSeason: ShiftedSeason
for shiftedSeason in self.__ssc.getShiftedSeasonSiblings(patternId=self.__pattern.getId()):
shiftedSeasonObj = shiftedSeason.getObj()
firstEpisode = shiftedSeasonObj['first_episode']
firstEpisodeStr = str(firstEpisode) if firstEpisode != -1 else ''
lastEpisode = shiftedSeasonObj['last_episode']
lastEpisodeStr = str(lastEpisode) if lastEpisode != -1 else ''
row = (
shiftedSeasonObj['original_season'],
firstEpisodeStr,
lastEpisodeStr,
shiftedSeasonObj['season_offset'],
shiftedSeasonObj['episode_offset'],
)
self.shiftedSeasonsTable.add_row(*map(str, row))
def getSelectedShiftedSeasonObjFromInput(self):
shiftedSeasonObj = {}
try:
row_key, col_key = self.shiftedSeasonsTable.coordinate_to_cell_key(
self.shiftedSeasonsTable.cursor_coordinate
)
if row_key is not None:
selected_row_data = self.shiftedSeasonsTable.get_row(row_key)
def parse_int_or_default(value: str, default: int) -> int:
try:
return int(value)
except (TypeError, ValueError):
return default
shiftedSeasonObj['original_season'] = int(selected_row_data[0])
shiftedSeasonObj['first_episode'] = parse_int_or_default(selected_row_data[1], -1)
shiftedSeasonObj['last_episode'] = parse_int_or_default(selected_row_data[2], -1)
shiftedSeasonObj['season_offset'] = parse_int_or_default(selected_row_data[3], 0)
shiftedSeasonObj['episode_offset'] = parse_int_or_default(selected_row_data[4], 0)
if self.__pattern is not None:
shiftedSeasonId = self.__ssc.findShiftedSeason(
patternId=self.__pattern.getId(),
originalSeason=shiftedSeasonObj['original_season'],
firstEpisode=shiftedSeasonObj['first_episode'],
lastEpisode=shiftedSeasonObj['last_episode'],
)
if shiftedSeasonId is not None:
shiftedSeasonObj['id'] = shiftedSeasonId
except CellDoesNotExist:
pass
return shiftedSeasonObj
def on_mount(self):
@@ -276,6 +347,7 @@ class PatternDetailsScreen(Screen):
self.updateTags()
self.updateTracks()
self.updateShiftedSeasons()
def compose(self):
@@ -304,6 +376,16 @@ class PatternDetailsScreen(Screen):
self.tracksTable.cursor_type = 'row'
self.shiftedSeasonsTable = DataTable(classes="seven")
self.column_key_original_season = self.shiftedSeasonsTable.add_column("Source Season", width=18)
self.column_key_first_episode = self.shiftedSeasonsTable.add_column("First Episode", width=18)
self.column_key_last_episode = self.shiftedSeasonsTable.add_column("Last Episode", width=18)
self.column_key_season_offset = self.shiftedSeasonsTable.add_column("Season Offset", width=18)
self.column_key_episode_offset = self.shiftedSeasonsTable.add_column("Episode Offset", width=18)
self.shiftedSeasonsTable.cursor_type = 'row'
yield Header()
@@ -345,6 +427,27 @@ class PatternDetailsScreen(Screen):
yield Static(" ", classes="seven")
# 9
yield Static("Shifted Seasons")
if self.__pattern is not None:
yield Button("Add", id="button_add_shifted_season")
yield Button("Edit", id="button_edit_shifted_season")
yield Button("Delete", id="button_delete_shifted_season")
else:
yield Static(" ")
yield Static(" ")
yield Static(" ")
yield Static(" ")
yield Static(" ")
yield Static(" ")
# 10
yield self.shiftedSeasonsTable
# 11
yield Static(" ", classes="seven")
# 12
yield Static("Media Tags")
yield Button("Add", id="button_add_tag")
yield Button("Edit", id="button_edit_tag")
@@ -354,13 +457,13 @@ class PatternDetailsScreen(Screen):
yield Static(" ")
yield Static(" ")
# 10
# 13
yield self.tagsTable
# 11
# 14
yield Static(" ", classes="seven")
# 12
# 15
yield Static("Streams")
yield Button("Add", id="button_add_track")
yield Button("Edit", id="button_edit_track")
@@ -370,21 +473,21 @@ class PatternDetailsScreen(Screen):
yield Button("Up", id="button_track_up")
yield Button("Down", id="button_track_down")
# 13
# 16
yield self.tracksTable
# 14
# 17
yield Static(" ", classes="seven")
# 15
# 18
yield Static(" ", classes="seven")
# 16
# 19
yield Button("Save", id="save_button")
yield Button("Cancel", id="cancel_button")
yield Static(" ", classes="five")
# 17
# 20
yield Static(" ", classes="seven")
yield Footer()
@@ -486,6 +589,35 @@ class PatternDetailsScreen(Screen):
if event.button.id == "cancel_button":
self.app.pop_screen()
if event.button.id == "button_add_shifted_season":
if self.__pattern is not None:
self.app.push_screen(
ShiftedSeasonDetailsScreen(patternId=self.__pattern.getId()),
self.handle_update_shifted_season,
)
if event.button.id == "button_edit_shifted_season":
selectedShiftedSeasonObj = self.getSelectedShiftedSeasonObjFromInput()
if 'id' in selectedShiftedSeasonObj.keys():
self.app.push_screen(
ShiftedSeasonDetailsScreen(
patternId=self.__pattern.getId(),
shiftedSeasonId=selectedShiftedSeasonObj['id'],
),
self.handle_update_shifted_season,
)
if event.button.id == "button_delete_shifted_season":
selectedShiftedSeasonObj = self.getSelectedShiftedSeasonObjFromInput()
if 'id' in selectedShiftedSeasonObj.keys():
self.app.push_screen(
ShiftedSeasonDeleteScreen(
patternId=self.__pattern.getId(),
shiftedSeasonId=selectedShiftedSeasonObj['id'],
),
self.handle_delete_shifted_season,
)
numTracks = len(self.getCurrentTrackDescriptors())
@@ -654,3 +786,9 @@ class PatternDetailsScreen(Screen):
self.updateTags()
else:
raise click.ClickException('tag delete failed')
def handle_update_shifted_season(self, screenResult):
self.updateShiftedSeasons()
def handle_delete_shifted_season(self, screenResult):
self.updateShiftedSeasons()

View File

@@ -6,225 +6,433 @@ from ffx.model.shifted_season import ShiftedSeason
class EpisodeOrderException(Exception):
pass
class RangeOverlapException(Exception):
pass
class ShiftedSeasonController():
class ShiftedSeasonOwnerException(Exception):
pass
class ShiftedSeasonController:
def __init__(self, context):
self.context = context
self.Session = self.context['database']['session'] # convenience
self.Session = self.context['database']['session'] # convenience
def checkShiftedSeason(self, showId: int, shiftedSeasonObj: dict, shiftedSeasonId: int = 0):
def _resolve_owner(self, showId=None, patternId=None):
hasShow = showId is not None
hasPattern = patternId is not None
if hasShow == hasPattern:
raise ShiftedSeasonOwnerException(
"ShiftedSeason rules require exactly one owner: either showId or patternId."
)
if hasShow:
if type(showId) is not int:
raise ValueError(
"ShiftedSeasonController: Argument showId is required to be of type int"
)
return {
'show_id': int(showId),
'pattern_id': None,
'label': f"show #{int(showId)}",
}
if type(patternId) is not int:
raise ValueError(
"ShiftedSeasonController: Argument patternId is required to be of type int"
)
return {
'show_id': None,
'pattern_id': int(patternId),
'label': f"pattern #{int(patternId)}",
}
def _apply_owner_filter(self, query, owner):
if owner['pattern_id'] is not None:
return query.filter(ShiftedSeason.pattern_id == owner['pattern_id'])
return query.filter(ShiftedSeason.show_id == owner['show_id'])
def _normalize_shifted_season_fields(self, shiftedSeasonObj: dict):
if type(shiftedSeasonObj) is not dict:
raise ValueError(
"ShiftedSeasonController: Argument shiftedSeasonObj is required to be of type dict"
)
fields = {
'original_season': int(shiftedSeasonObj['original_season']),
'first_episode': int(shiftedSeasonObj['first_episode']),
'last_episode': int(shiftedSeasonObj['last_episode']),
'season_offset': int(shiftedSeasonObj['season_offset']),
'episode_offset': int(shiftedSeasonObj['episode_offset']),
}
firstEpisode = fields['first_episode']
lastEpisode = fields['last_episode']
if firstEpisode != -1 and lastEpisode != -1 and lastEpisode < firstEpisode:
raise EpisodeOrderException(
"ShiftedSeason last_episode must be greater than or equal to first_episode."
)
return fields
def _ranges_overlap(self, firstEpisodeA, lastEpisodeA, firstEpisodeB, lastEpisodeB):
startA = float('-inf') if int(firstEpisodeA) == -1 else int(firstEpisodeA)
endA = float('inf') if int(lastEpisodeA) == -1 else int(lastEpisodeA)
startB = float('-inf') if int(firstEpisodeB) == -1 else int(firstEpisodeB)
endB = float('inf') if int(lastEpisodeB) == -1 else int(lastEpisodeB)
return startA <= endB and startB <= endA
def _ordered_query(self, session, owner):
q = self._apply_owner_filter(session.query(ShiftedSeason), owner)
return q.order_by(
ShiftedSeason.original_season.asc(),
ShiftedSeason.first_episode.asc(),
ShiftedSeason.last_episode.asc(),
ShiftedSeason.id.asc(),
)
def _find_matching_rule(self, session, owner, season: int, episode: int):
for shiftedSeasonEntry in self._ordered_query(session, owner).all():
if (
season == shiftedSeasonEntry.getOriginalSeason()
and (
shiftedSeasonEntry.getFirstEpisode() == -1
or episode >= shiftedSeasonEntry.getFirstEpisode()
)
and (
shiftedSeasonEntry.getLastEpisode() == -1
or episode <= shiftedSeasonEntry.getLastEpisode()
)
):
return shiftedSeasonEntry
return None
def checkShiftedSeason(
self,
showId: int | None = None,
shiftedSeasonObj: dict | None = None,
shiftedSeasonId: int = 0,
patternId: int | None = None,
):
"""
Check if for a particula season
shiftedSeasonId
Check whether a shifted-season rule is valid within one owner scope.
"""
session = None
try:
s = self.Session()
owner = self._resolve_owner(showId=showId, patternId=patternId)
fields = self._normalize_shifted_season_fields(shiftedSeasonObj)
session = self.Session()
q = self._ordered_query(session, owner)
if shiftedSeasonId:
q = q.filter(ShiftedSeason.id != int(shiftedSeasonId))
siblingShiftedSeason: ShiftedSeason
for siblingShiftedSeason in q.all():
if fields['original_season'] != siblingShiftedSeason.getOriginalSeason():
continue
if self._ranges_overlap(
fields['first_episode'],
fields['last_episode'],
siblingShiftedSeason.getFirstEpisode(),
siblingShiftedSeason.getLastEpisode(),
):
return False
return True
except (EpisodeOrderException, ShiftedSeasonOwnerException) as ex:
raise click.ClickException(str(ex))
except Exception as ex:
raise click.ClickException(f"ShiftedSeasonController.addShiftedSeason(): {repr(ex)}")
raise click.ClickException(
f"ShiftedSeasonController.checkShiftedSeason(): {repr(ex)}"
)
finally:
if session is not None:
session.close()
def addShiftedSeason(
self,
showId: int | None = None,
shiftedSeasonObj: dict | None = None,
patternId: int | None = None,
):
session = None
try:
owner = self._resolve_owner(showId=showId, patternId=patternId)
fields = self._normalize_shifted_season_fields(shiftedSeasonObj)
if not self.checkShiftedSeason(
showId=owner['show_id'],
patternId=owner['pattern_id'],
shiftedSeasonObj=fields,
):
raise RangeOverlapException(
f"ShiftedSeason rule overlaps with an existing rule for {owner['label']}."
)
session = self.Session()
shiftedSeason = ShiftedSeason(
show_id=owner['show_id'],
pattern_id=owner['pattern_id'],
original_season=fields['original_season'],
first_episode=fields['first_episode'],
last_episode=fields['last_episode'],
season_offset=fields['season_offset'],
episode_offset=fields['episode_offset'],
)
session.add(shiftedSeason)
session.commit()
return shiftedSeason.getId()
except (EpisodeOrderException, RangeOverlapException, ShiftedSeasonOwnerException) as ex:
raise click.ClickException(str(ex))
except Exception as ex:
raise click.ClickException(f"ShiftedSeasonController.addShiftedSeason(): {repr(ex)}")
raise click.ClickException(
f"ShiftedSeasonController.addShiftedSeason(): {repr(ex)}"
)
finally:
if session is not None:
session.close()
def updateShiftedSeason(self, shiftedSeasonId: int, shiftedSeasonObj: dict):
if type(shiftedSeasonId) is not int:
raise ValueError(f"ShiftedSeasonController.updateShiftedSeason(): Argument shiftedSeasonId is required to be of type int")
if type(shiftedSeasonObj) is not dict:
raise ValueError(f"ShiftedSeasonController.updateShiftedSeason(): Argument shiftedSeasonObj is required to be of type dict")
raise ValueError(
"ShiftedSeasonController.updateShiftedSeason(): Argument shiftedSeasonId is required to be of type int"
)
session = None
try:
fields = self._normalize_shifted_season_fields(shiftedSeasonObj)
session = self.Session()
shiftedSeason = (
session.query(ShiftedSeason)
.filter(ShiftedSeason.id == int(shiftedSeasonId))
.first()
)
if shiftedSeason is None:
return False
owner = self._resolve_owner(
showId=shiftedSeason.getShowId(),
patternId=shiftedSeason.getPatternId(),
)
if not self.checkShiftedSeason(
showId=owner['show_id'],
patternId=owner['pattern_id'],
shiftedSeasonObj=fields,
shiftedSeasonId=shiftedSeasonId,
):
raise RangeOverlapException(
f"ShiftedSeason rule overlaps with an existing rule for {owner['label']}."
)
shiftedSeason.original_season = fields['original_season']
shiftedSeason.first_episode = fields['first_episode']
shiftedSeason.last_episode = fields['last_episode']
shiftedSeason.season_offset = fields['season_offset']
shiftedSeason.episode_offset = fields['episode_offset']
session.commit()
return True
except (EpisodeOrderException, RangeOverlapException, ShiftedSeasonOwnerException) as ex:
raise click.ClickException(str(ex))
except Exception as ex:
raise click.ClickException(f"ShiftedSeasonController.updateShiftedSeason(): {repr(ex)}")
raise click.ClickException(
f"ShiftedSeasonController.updateShiftedSeason(): {repr(ex)}"
)
finally:
if session is not None:
session.close()
def findShiftedSeason(
self,
showId: int | None = None,
originalSeason: int | None = None,
firstEpisode: int | None = None,
lastEpisode: int | None = None,
patternId: int | None = None,
):
if type(originalSeason) is not int:
raise ValueError(f"ShiftedSeasonController.findShiftedSeason(): Argument originalSeason is required to be of type int")
raise ValueError(
"ShiftedSeasonController.findShiftedSeason(): Argument originalSeason is required to be of type int"
)
if type(firstEpisode) is not int:
raise ValueError(f"ShiftedSeasonController.findShiftedSeason(): Argument firstEpisode is required to be of type int")
raise ValueError(
"ShiftedSeasonController.findShiftedSeason(): Argument firstEpisode is required to be of type int"
)
if type(lastEpisode) is not int:
raise ValueError(f"ShiftedSeasonController.findShiftedSeason(): Argument lastEpisode is required to be of type int")
raise ValueError(
"ShiftedSeasonController.findShiftedSeason(): Argument lastEpisode is required to be of type int"
)
session = None
try:
owner = self._resolve_owner(showId=showId, patternId=patternId)
session = self.Session()
shiftedSeason = (
self._apply_owner_filter(session.query(ShiftedSeason), owner)
.filter(
ShiftedSeason.original_season == int(originalSeason),
ShiftedSeason.first_episode == int(firstEpisode),
ShiftedSeason.last_episode == int(lastEpisode),
)
.first()
)
return shiftedSeason.getId() if shiftedSeason is not None else None
except ShiftedSeasonOwnerException as ex:
raise click.ClickException(str(ex))
except Exception as ex:
raise click.ClickException(f"PatternController.findShiftedSeason(): {repr(ex)}")
raise click.ClickException(
f"ShiftedSeasonController.findShiftedSeason(): {repr(ex)}"
)
finally:
if session is not None:
session.close()
def getShiftedSeasonSiblings(
self,
showId: int | None = None,
patternId: int | None = None,
):
session = None
try:
owner = self._resolve_owner(showId=showId, patternId=patternId)
session = self.Session()
return self._ordered_query(session, owner).all()
except ShiftedSeasonOwnerException as ex:
raise click.ClickException(str(ex))
except Exception as ex:
raise click.ClickException(f"PatternController.getShiftedSeasonSiblings(): {repr(ex)}")
raise click.ClickException(
f"ShiftedSeasonController.getShiftedSeasonSiblings(): {repr(ex)}"
)
finally:
if session is not None:
session.close()
def getShiftedSeason(self, shiftedSeasonId: int):
if type(shiftedSeasonId) is not int:
raise ValueError(f"ShiftedSeasonController.getShiftedSeason(): Argument shiftedSeasonId is required to be of type int")
raise ValueError(
"ShiftedSeasonController.getShiftedSeason(): Argument shiftedSeasonId is required to be of type int"
)
session = None
try:
session = self.Session()
return (
session.query(ShiftedSeason)
.filter(ShiftedSeason.id == int(shiftedSeasonId))
.first()
)
except Exception as ex:
raise click.ClickException(f"ShiftedSeasonController.getShiftedSeason(): {repr(ex)}")
raise click.ClickException(
f"ShiftedSeasonController.getShiftedSeason(): {repr(ex)}"
)
finally:
if session is not None:
session.close()
def deleteShiftedSeason(self, shiftedSeasonId):
if type(shiftedSeasonId) is not int:
raise ValueError(f"ShiftedSeasonController.deleteShiftedSeason(): Argument shiftedSeasonId is required to be of type int")
raise ValueError(
"ShiftedSeasonController.deleteShiftedSeason(): Argument shiftedSeasonId is required to be of type int"
)
session = None
try:
session = self.Session()
shiftedSeason = (
session.query(ShiftedSeason)
.filter(ShiftedSeason.id == int(shiftedSeasonId))
.first()
)
if shiftedSeason is not None:
# Delete via the instance instead of query.delete(); see https://stackoverflow.com/a/19245058
session.delete(shiftedSeason)
session.commit()
return True
return False
except Exception as ex:
raise click.ClickException(f"ShiftedSeasonController.deleteShiftedSeason(): {repr(ex)}")
raise click.ClickException(
f"ShiftedSeasonController.deleteShiftedSeason(): {repr(ex)}"
)
finally:
if session is not None:
session.close()
def shiftSeason(self, showId, season, episode, patternId=None):
if season == -1 or episode == -1:
return season, episode
session = None
try:
session = self.Session()
activeShift = None
if patternId is not None:
activeShift = self._find_matching_rule(
session,
self._resolve_owner(patternId=patternId),
season=int(season),
episode=int(episode),
)
if activeShift is None and showId is not None and showId != -1:
activeShift = self._find_matching_rule(
session,
self._resolve_owner(showId=showId),
season=int(season),
episode=int(episode),
)
self.context['logger'].info(f"Shifting season: {season} episode: {episode} "
+f"-> season: {shiftedSeason} episode: {shiftedEpisode}")
if activeShift is None:
shiftedSeason = season
shiftedEpisode = episode
sourceLabel = "default"
else:
shiftedSeason = season + activeShift.getSeasonOffset()
shiftedEpisode = episode + activeShift.getEpisodeOffset()
sourceLabel = (
"pattern"
if activeShift.getPatternId() is not None
else "show"
)
self.context['logger'].info(
f"Setting season shift {season}/{episode} -> {shiftedSeason}/{shiftedEpisode} from {sourceLabel}"
)
return shiftedSeason, shiftedEpisode
except ShiftedSeasonOwnerException as ex:
raise click.ClickException(str(ex))
except Exception as ex:
raise click.ClickException(
f"ShiftedSeasonController.shiftSeason(): {repr(ex)}"
)
finally:
if session is not None:
session.close()

View File

@@ -43,7 +43,7 @@ class ShiftedSeasonDeleteScreen(Screen):
}
"""
def __init__(self, showId = None, shiftedSeasonId = None):
def __init__(self, showId = None, patternId = None, shiftedSeasonId = None):
super().__init__()
self.context = self.app.getContext()
@@ -52,6 +52,7 @@ class ShiftedSeasonDeleteScreen(Screen):
self.__ssc = ShiftedSeasonController(context = self.context)
self._showId = showId
self._patternId = patternId
self.__shiftedSeasonId = shiftedSeasonId
@@ -59,7 +60,12 @@ class ShiftedSeasonDeleteScreen(Screen):
shiftedSeason: ShiftedSeason = self.__ssc.getShiftedSeason(self.__shiftedSeasonId)
self.query_one("#static_show_id", Static).update(str(self._showId))
ownerLabel = (
f"pattern #{self._patternId}"
if self._patternId is not None
else f"show #{self._showId}"
)
self.query_one("#static_owner", Static).update(ownerLabel)
self.query_one("#static_original_season", Static).update(str(shiftedSeason.getOriginalSeason()))
self.query_one("#static_first_episode", Static).update(str(shiftedSeason.getFirstEpisode()))
self.query_one("#static_last_episode", Static).update(str(shiftedSeason.getLastEpisode()))
@@ -77,12 +83,12 @@ class ShiftedSeasonDeleteScreen(Screen):
yield Static(" ", classes="two")
yield Static("from show")
yield Static(" ", id="static_show_id")
yield Static("from")
yield Static(" ", id="static_owner")
yield Static(" ", classes="two")
yield Static("Original season")
yield Static("Source season")
yield Static(" ", id="static_original_season")
yield Static("First episode")
@@ -122,4 +128,3 @@ class ShiftedSeasonDeleteScreen(Screen):
if event.button.id == "cancel_button":
self.app.pop_screen()

View File

@@ -81,7 +81,7 @@ class ShiftedSeasonDetailsScreen(Screen):
}
"""
def __init__(self, showId = None, shiftedSeasonId = None):
def __init__(self, showId = None, patternId = None, shiftedSeasonId = None):
super().__init__()
self.context = self.app.getContext()
@@ -90,8 +90,14 @@ class ShiftedSeasonDetailsScreen(Screen):
self.__ssc = ShiftedSeasonController(context = self.context)
self.__showId = showId
self.__patternId = patternId
self.__shiftedSeasonId = shiftedSeasonId
def _owner_kwargs(self):
if self.__patternId is not None:
return {'patternId': self.__patternId}
return {'showId': self.__showId}
def on_mount(self):
if self.__shiftedSeasonId is not None:
@@ -126,7 +132,7 @@ class ShiftedSeasonDetailsScreen(Screen):
yield Static(" ", classes="three")
# 3
yield Static("Original season")
yield Static("Source season")
yield Input(id="input_original_season", classes="two")
# 4
@@ -203,8 +209,11 @@ class ShiftedSeasonDetailsScreen(Screen):
if self.__shiftedSeasonId is not None:
if self.__ssc.checkShiftedSeason(self.__showId, shiftedSeasonObj,
shiftedSeasonId = self.__shiftedSeasonId):
if self.__ssc.checkShiftedSeason(
shiftedSeasonObj=shiftedSeasonObj,
shiftedSeasonId=self.__shiftedSeasonId,
**self._owner_kwargs(),
):
if self.__ssc.updateShiftedSeason(self.__shiftedSeasonId, shiftedSeasonObj):
self.dismiss((self.__shiftedSeasonId, shiftedSeasonObj))
else:
@@ -212,8 +221,14 @@ class ShiftedSeasonDetailsScreen(Screen):
self.app.pop_screen()
else:
if self.__ssc.checkShiftedSeason(self.__showId, shiftedSeasonObj):
self.__shiftedSeasonId = self.__ssc.addShiftedSeason(self.__showId, shiftedSeasonObj)
if self.__ssc.checkShiftedSeason(
shiftedSeasonObj=shiftedSeasonObj,
**self._owner_kwargs(),
):
self.__shiftedSeasonId = self.__ssc.addShiftedSeason(
shiftedSeasonObj=shiftedSeasonObj,
**self._owner_kwargs(),
)
self.dismiss((self.__shiftedSeasonId, shiftedSeasonObj))

View File

@@ -62,7 +62,9 @@ class ShowController():
index_season_digits = showDescriptor.getIndexSeasonDigits(),
index_episode_digits = showDescriptor.getIndexEpisodeDigits(),
indicator_season_digits = showDescriptor.getIndicatorSeasonDigits(),
indicator_episode_digits = showDescriptor.getIndicatorEpisodeDigits())
indicator_episode_digits = showDescriptor.getIndicatorEpisodeDigits(),
quality = showDescriptor.getQuality(),
notes = showDescriptor.getNotes())
s.add(show)
s.commit()
@@ -88,6 +90,12 @@ class ShowController():
if currentShow.indicator_episode_digits != int(showDescriptor.getIndicatorEpisodeDigits()):
currentShow.indicator_episode_digits = int(showDescriptor.getIndicatorEpisodeDigits())
changed = True
if int(currentShow.quality or 0) != int(showDescriptor.getQuality()):
currentShow.quality = int(showDescriptor.getQuality())
changed = True
if str(currentShow.notes or '') != str(showDescriptor.getNotes()):
currentShow.notes = str(showDescriptor.getNotes())
changed = True
if changed:
s.commit()

View File

@@ -1,3 +1,10 @@
from .configuration_controller import ConfigurationController
from .constants import (
DEFAULT_SHOW_INDEX_EPISODE_DIGITS,
DEFAULT_SHOW_INDEX_SEASON_DIGITS,
DEFAULT_SHOW_INDICATOR_EPISODE_DIGITS,
DEFAULT_SHOW_INDICATOR_SEASON_DIGITS,
)
from .logging_utils import get_ffx_logger
@@ -14,11 +21,45 @@ class ShowDescriptor():
INDEX_EPISODE_DIGITS_KEY = 'index_episode_digits'
INDICATOR_SEASON_DIGITS_KEY = 'indicator_season_digits'
INDICATOR_EPISODE_DIGITS_KEY = 'indicator_episode_digits'
QUALITY_KEY = 'quality'
NOTES_KEY = 'notes'
DEFAULT_INDEX_SEASON_DIGITS = 2
DEFAULT_INDEX_EPISODE_DIGITS = 2
DEFAULT_INDICATOR_SEASON_DIGITS = 2
DEFAULT_INDICATOR_EPISODE_DIGITS = 2
DEFAULT_INDEX_SEASON_DIGITS = DEFAULT_SHOW_INDEX_SEASON_DIGITS
DEFAULT_INDEX_EPISODE_DIGITS = DEFAULT_SHOW_INDEX_EPISODE_DIGITS
DEFAULT_INDICATOR_SEASON_DIGITS = DEFAULT_SHOW_INDICATOR_SEASON_DIGITS
DEFAULT_INDICATOR_EPISODE_DIGITS = DEFAULT_SHOW_INDICATOR_EPISODE_DIGITS
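# Effective digit lengths: values from the configuration file (when a config
# controller is available in the context) override the class-level defaults.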
@classmethod
def getDefaultDigitLengths(cls, context: dict | None = None) -> dict[str, int]:
configurationData = {}
if context is not None:
configController = context.get('config')
if configController is not None and hasattr(configController, 'getData'):
configurationData = configController.getData()
return {
cls.INDEX_SEASON_DIGITS_KEY: ConfigurationController.getConfiguredIntegerValue(
configurationData,
ConfigurationController.DEFAULT_INDEX_SEASON_DIGITS_CONFIG_KEY,
cls.DEFAULT_INDEX_SEASON_DIGITS,
),
cls.INDEX_EPISODE_DIGITS_KEY: ConfigurationController.getConfiguredIntegerValue(
configurationData,
ConfigurationController.DEFAULT_INDEX_EPISODE_DIGITS_CONFIG_KEY,
cls.DEFAULT_INDEX_EPISODE_DIGITS,
),
cls.INDICATOR_SEASON_DIGITS_KEY: ConfigurationController.getConfiguredIntegerValue(
configurationData,
ConfigurationController.DEFAULT_INDICATOR_SEASON_DIGITS_CONFIG_KEY,
cls.DEFAULT_INDICATOR_SEASON_DIGITS,
),
cls.INDICATOR_EPISODE_DIGITS_KEY: ConfigurationController.getConfiguredIntegerValue(
configurationData,
ConfigurationController.DEFAULT_INDICATOR_EPISODE_DIGITS_CONFIG_KEY,
cls.DEFAULT_INDICATOR_EPISODE_DIGITS,
),
}
def __init__(self, **kwargs):
@@ -53,36 +94,51 @@ class ShowDescriptor():
raise TypeError(f"ShowDescriptor.__init__(): Argument {ShowDescriptor.YEAR_KEY} is required to be of type int")
self.__showYear = kwargs[ShowDescriptor.YEAR_KEY]
else:
self.__showYear = -1
self.__showYear = -1
defaultDigitLengths = self.getDefaultDigitLengths(self.__context)
if ShowDescriptor.INDEX_SEASON_DIGITS_KEY in kwargs.keys():
if type(kwargs[ShowDescriptor.INDEX_SEASON_DIGITS_KEY]) is not int:
raise TypeError(f"ShowDescriptor.__init__(): Argument {ShowDescriptor.INDEX_SEASON_DIGITS_KEY} is required to be of type int")
self.__indexSeasonDigits = kwargs[ShowDescriptor.INDEX_SEASON_DIGITS_KEY]
else:
self.__indexSeasonDigits = ShowDescriptor.DEFAULT_INDEX_SEASON_DIGITS
self.__indexSeasonDigits = defaultDigitLengths[ShowDescriptor.INDEX_SEASON_DIGITS_KEY]
if ShowDescriptor.INDEX_EPISODE_DIGITS_KEY in kwargs.keys():
if type(kwargs[ShowDescriptor.INDEX_EPISODE_DIGITS_KEY]) is not int:
raise TypeError(f"ShowDescriptor.__init__(): Argument {ShowDescriptor.INDEX_EPISODE_DIGITS_KEY} is required to be of type int")
self.__indexEpisodeDigits = kwargs[ShowDescriptor.INDEX_EPISODE_DIGITS_KEY]
else:
self.__indexEpisodeDigits = ShowDescriptor.DEFAULT_INDEX_EPISODE_DIGITS
self.__indexEpisodeDigits = defaultDigitLengths[ShowDescriptor.INDEX_EPISODE_DIGITS_KEY]
if ShowDescriptor.INDICATOR_SEASON_DIGITS_KEY in kwargs.keys():
if type(kwargs[ShowDescriptor.INDICATOR_SEASON_DIGITS_KEY]) is not int:
raise TypeError(f"ShowDescriptor.__init__(): Argument {ShowDescriptor.INDICATOR_SEASON_DIGITS_KEY} is required to be of type int")
self.__indicatorSeasonDigits = kwargs[ShowDescriptor.INDICATOR_SEASON_DIGITS_KEY]
else:
self.__indicatorSeasonDigits = ShowDescriptor.DEFAULT_INDICATOR_SEASON_DIGITS
self.__indicatorSeasonDigits = defaultDigitLengths[ShowDescriptor.INDICATOR_SEASON_DIGITS_KEY]
if ShowDescriptor.INDICATOR_EPISODE_DIGITS_KEY in kwargs.keys():
if type(kwargs[ShowDescriptor.INDICATOR_EPISODE_DIGITS_KEY]) is not int:
raise TypeError(f"ShowDescriptor.__init__(): Argument {ShowDescriptor.INDICATOR_EPISODE_DIGITS_KEY} is required to be of type int")
self.__indicatorEpisodeDigits = kwargs[ShowDescriptor.INDICATOR_EPISODE_DIGITS_KEY]
else:
self.__indicatorEpisodeDigits = ShowDescriptor.DEFAULT_INDICATOR_EPISODE_DIGITS
self.__indicatorEpisodeDigits = defaultDigitLengths[ShowDescriptor.INDICATOR_EPISODE_DIGITS_KEY]
if ShowDescriptor.QUALITY_KEY in kwargs.keys():
if type(kwargs[ShowDescriptor.QUALITY_KEY]) is not int:
raise TypeError(f"ShowDescriptor.__init__(): Argument {ShowDescriptor.QUALITY_KEY} is required to be of type int")
self.__quality = kwargs[ShowDescriptor.QUALITY_KEY]
else:
self.__quality = 0
if ShowDescriptor.NOTES_KEY in kwargs.keys():
if type(kwargs[ShowDescriptor.NOTES_KEY]) is not str:
raise TypeError(f"ShowDescriptor.__init__(): Argument {ShowDescriptor.NOTES_KEY} is required to be of type str")
self.__notes = kwargs[ShowDescriptor.NOTES_KEY]
else:
self.__notes = ''
def getId(self):
@@ -100,6 +156,10 @@ class ShowDescriptor():
return self.__indicatorSeasonDigits
def getIndicatorEpisodeDigits(self):
return self.__indicatorEpisodeDigits
def getQuality(self):
return self.__quality
def getNotes(self):
return self.__notes
def getFilenamePrefix(self):
return f"{self.__showName} ({str(self.__showYear)})"

View File

@@ -1,7 +1,7 @@
import click
from textual.screen import Screen
from textual.widgets import Header, Footer, Static, Button, DataTable, Input
from textual.widgets import Header, Footer, Static, Button, DataTable, Input, TextArea
from textual.containers import Grid
from textual.widgets._data_table import CellDoesNotExist
@@ -25,8 +25,8 @@ class ShowDetailsScreen(Screen):
CSS = """
Grid {
grid-size: 5 16;
grid-rows: 2 2 2 2 2 2 2 2 2 2 2 9 2 9 2 2;
grid-size: 5 18;
grid-rows: 2 2 2 2 2 2 6 2 2 2 2 2 2 9 2 9 2 2;
grid-columns: 30 30 30 30 30;
height: 100%;
width: 100%;
@@ -77,6 +77,10 @@ class ShowDetailsScreen(Screen):
height: 100%;
border: solid green;
}
.note_box {
min-height: 6;
}
"""
BINDINGS = [
@@ -150,6 +154,10 @@ class ShowDetailsScreen(Screen):
self.query_one("#index_episode_digits_input", Input).value = str(self.__showDescriptor.getIndexEpisodeDigits())
self.query_one("#indicator_season_digits_input", Input).value = str(self.__showDescriptor.getIndicatorSeasonDigits())
self.query_one("#indicator_episode_digits_input", Input).value = str(self.__showDescriptor.getIndicatorEpisodeDigits())
if self.__showDescriptor.getQuality():
self.query_one("#quality_input", Input).value = str(self.__showDescriptor.getQuality())
if self.__showDescriptor.getNotes():
self.query_one("#notes_textarea", TextArea).text = str(self.__showDescriptor.getNotes())
#raise click.ClickException(f"show_id {showId}")
@@ -160,11 +168,20 @@ class ShowDetailsScreen(Screen):
self.updateShiftedSeasons()
else:
defaultDigitLengths = ShowDescriptor.getDefaultDigitLengths(self.context)
self.query_one("#index_season_digits_input", Input).value = "2"
self.query_one("#index_episode_digits_input", Input).value = "2"
self.query_one("#indicator_season_digits_input", Input).value = "2"
self.query_one("#indicator_episode_digits_input", Input).value = "2"
self.query_one("#index_season_digits_input", Input).value = str(
defaultDigitLengths[ShowDescriptor.INDEX_SEASON_DIGITS_KEY]
)
self.query_one("#index_episode_digits_input", Input).value = str(
defaultDigitLengths[ShowDescriptor.INDEX_EPISODE_DIGITS_KEY]
)
self.query_one("#indicator_season_digits_input", Input).value = str(
defaultDigitLengths[ShowDescriptor.INDICATOR_SEASON_DIGITS_KEY]
)
self.query_one("#indicator_episode_digits_input", Input).value = str(
defaultDigitLengths[ShowDescriptor.INDICATOR_EPISODE_DIGITS_KEY]
)
def getSelectedPatternDescriptor(self):
@@ -202,11 +219,17 @@ class ShowDetailsScreen(Screen):
if row_key is not None:
selected_row_data = self.shiftedSeasonsTable.get_row(row_key)
def parse_int_or_default(value: str, default: int) -> int:
try:
return int(value)
except (TypeError, ValueError):
return default
shiftedSeasonObj['original_season'] = int(selected_row_data[0])
shiftedSeasonObj['first_episode'] = int(selected_row_data[1]) if selected_row_data[1].isnumeric() else -1
shiftedSeasonObj['last_episode'] = int(selected_row_data[2]) if selected_row_data[2].isnumeric() else -1
shiftedSeasonObj['season_offset'] = int(selected_row_data[3]) if selected_row_data[3].isnumeric() else 0
shiftedSeasonObj['episode_offset'] = int(selected_row_data[4]) if selected_row_data[4].isnumeric() else 0
shiftedSeasonObj['first_episode'] = parse_int_or_default(selected_row_data[1], -1)
shiftedSeasonObj['last_episode'] = parse_int_or_default(selected_row_data[2], -1)
shiftedSeasonObj['season_offset'] = parse_int_or_default(selected_row_data[3], 0)
shiftedSeasonObj['episode_offset'] = parse_int_or_default(selected_row_data[4], 0)
if self.__showDescriptor is not None:
@@ -299,7 +322,7 @@ class ShowDetailsScreen(Screen):
self.shiftedSeasonsTable = DataTable(classes="five")
self.column_key_original_season = self.shiftedSeasonsTable.add_column("Original Season", width=30)
self.column_key_original_season = self.shiftedSeasonsTable.add_column("Source Season", width=30)
self.column_key_first_episode = self.shiftedSeasonsTable.add_column("First Episode", width=30)
self.column_key_last_episode = self.shiftedSeasonsTable.add_column("Last Episode", width=30)
self.column_key_season_offset = self.shiftedSeasonsTable.add_column("Season Offset", width=30)
@@ -333,28 +356,36 @@ class ShowDetailsScreen(Screen):
yield Input(type="integer", id="year_input", classes="four")
#5
yield Static(" ", classes="five")
yield Static("Quality")
yield Input(type="integer", id="quality_input", classes="four")
#6
yield Static("Notes")
yield Static(" ", classes="four")
#7
yield TextArea(id="notes_textarea", classes="five note_box")
#8
yield Static("Index Season Digits")
yield Input(type="integer", id="index_season_digits_input", classes="four")
#7
#9
yield Static("Index Episode Digits")
yield Input(type="integer", id="index_episode_digits_input", classes="four")
#8
#10
yield Static("Indicator Season Digits")
yield Input(type="integer", id="indicator_season_digits_input", classes="four")
#9
#11
yield Static("Indicator Edisode Digits")
yield Input(type="integer", id="indicator_episode_digits_input", classes="four")
# 10
# 12
yield Static(" ", classes="five")
# 11
# 13
yield Static("Shifted seasons", classes="two")
if self.__showDescriptor is not None:
@@ -366,18 +397,18 @@ class ShowDetailsScreen(Screen):
yield Static(" ")
yield Static(" ")
# 12
# 14
yield self.shiftedSeasonsTable
# 13
# 15
yield Static("File patterns", classes="five")
# 14
# 16
yield self.patternTable
# 15
# 17
yield Static(" ", classes="five")
# 16
# 18
yield Button("Save", id="save_button")
yield Button("Cancel", id="cancel_button")
@@ -387,7 +418,7 @@ class ShowDetailsScreen(Screen):
def getShowDescriptorFromInput(self) -> ShowDescriptor:
kwargs = {}
kwargs = {ShowDescriptor.CONTEXT_KEY: self.context}
try:
if self.__showDescriptor:
@@ -423,6 +454,11 @@ class ShowDetailsScreen(Screen):
kwargs[ShowDescriptor.INDICATOR_EPISODE_DIGITS_KEY] = int(self.query_one("#indicator_episode_digits_input", Input).value)
except ValueError:
pass
try:
kwargs[ShowDescriptor.QUALITY_KEY] = int(self.query_one("#quality_input", Input).value)
except ValueError:
pass
kwargs[ShowDescriptor.NOTES_KEY] = str(self.query_one("#notes_textarea", TextArea).text)
return ShowDescriptor(**kwargs)

View File

@@ -8,8 +8,19 @@ import sys
import tempfile
import unittest
from tests.support.ffx_bundle import SourceTrackSpec, create_source_fixture
from tests.support.ffx_bundle import (
SourceTrackSpec,
build_controller_context,
create_source_fixture,
dispose_controller_context,
)
from ffx.pattern_controller import PatternController
from ffx.show_controller import ShowController
from ffx.show_descriptor import ShowDescriptor
from ffx.shifted_season_controller import ShiftedSeasonController
from ffx.track_codec import TrackCodec
from ffx.track_descriptor import TrackDescriptor
from ffx.track_type import TrackType
try:
@@ -66,6 +77,64 @@ class UnmuxCliTests(unittest.TestCase):
f"STDERR:\n{completed.stderr}"
)
def seed_matching_show(self, pattern_expression: str, *, indicator_season_digits: int, indicator_episode_digits: int) -> None:
context = build_controller_context(self.database_path)
try:
ShowController(context).updateShow(
ShowDescriptor(
id=1,
name="Unmux Test Show",
year=2000,
indicator_season_digits=indicator_season_digits,
indicator_episode_digits=indicator_episode_digits,
)
)
PatternController(context).savePatternSchema(
{
"show_id": 1,
"pattern": pattern_expression,
"quality": 0,
"notes": "",
},
trackDescriptors=[
TrackDescriptor(
index=0,
source_index=0,
track_type=TrackType.VIDEO,
codec_name=TrackCodec.H264,
tags={},
disposition_set=set(),
)
],
)
finally:
dispose_controller_context(context)
def add_show_shift(
self,
*,
show_id: int,
original_season: int,
first_episode: int,
last_episode: int,
season_offset: int,
episode_offset: int,
) -> None:
context = build_controller_context(self.database_path)
try:
ShiftedSeasonController(context).addShiftedSeason(
showId=show_id,
shiftedSeasonObj={
"original_season": original_season,
"first_episode": first_episode,
"last_episode": last_episode,
"season_offset": season_offset,
"episode_offset": episode_offset,
},
)
finally:
dispose_controller_context(context)
def test_subtitles_only_without_output_directory_uses_configured_base_plus_label(self):
self.write_config(
{
@@ -101,6 +170,134 @@ class UnmuxCliTests(unittest.TestCase):
expected_directory = self.home_dir / ".local" / "var" / "sync" / "subtitles" / "dball"
self.assertTrue(expected_directory.is_dir(), expected_directory)
def test_unmux_uses_configured_indicator_digits_in_output_filenames(self):
self.write_config(
{
"defaultIndicatorSeasonDigits": 3,
"defaultIndicatorEpisodeDigits": 4,
}
)
source_filename = "unmux_s01e01.mkv"
output_directory = self.workdir / "unmux-output"
output_directory.mkdir()
source_path = create_source_fixture(
self.workdir,
source_filename,
[
SourceTrackSpec(TrackType.VIDEO, identity="video-0"),
],
)
completed = run_ffx_unmux(
self.workdir,
self.home_dir,
self.database_path,
"--label",
"dball",
"--output-directory",
str(output_directory),
str(source_path),
)
self.assertCompleted(completed)
output_filenames = sorted(path.name for path in output_directory.iterdir())
self.assertEqual(1, len(output_filenames), output_filenames)
self.assertTrue(
output_filenames[0].startswith("dball_S001E0001_"),
output_filenames,
)
def test_unmux_prefers_matched_show_indicator_digits_over_config_defaults(self):
self.write_config(
{
"defaultIndicatorSeasonDigits": 4,
"defaultIndicatorEpisodeDigits": 4,
}
)
self.seed_matching_show(
r"^unmux_([sS][0-9]+[eE][0-9]+)\.mkv$",
indicator_season_digits=1,
indicator_episode_digits=3,
)
source_filename = "unmux_s01e01.mkv"
output_directory = self.workdir / "unmux-output"
output_directory.mkdir()
source_path = create_source_fixture(
self.workdir,
source_filename,
[
SourceTrackSpec(TrackType.VIDEO, identity="video-0"),
],
)
completed = run_ffx_unmux(
self.workdir,
self.home_dir,
self.database_path,
"--label",
"dball",
"--output-directory",
str(output_directory),
str(source_path),
)
self.assertCompleted(completed)
output_filenames = sorted(path.name for path in output_directory.iterdir())
self.assertEqual(1, len(output_filenames), output_filenames)
self.assertTrue(
output_filenames[0].startswith("dball_S1E001_"),
output_filenames,
)
def test_unmux_applies_shifted_season_mapping_to_output_filenames(self):
self.seed_matching_show(
r"^unmux_([sS][0-9]+[eE][0-9]+)\.mkv$",
indicator_season_digits=2,
indicator_episode_digits=2,
)
self.add_show_shift(
show_id=1,
original_season=1,
first_episode=1,
last_episode=99,
season_offset=1,
episode_offset=-88,
)
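# With season_offset=1 and episode_offset=-88, the source file s01e89 is expected
# to be written out as S02E01 in the unmuxed filenames.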
source_filename = "unmux_s01e89.mkv"
output_directory = self.workdir / "unmux-output"
output_directory.mkdir()
source_path = create_source_fixture(
self.workdir,
source_filename,
[
SourceTrackSpec(TrackType.VIDEO, identity="video-0"),
SourceTrackSpec(
TrackType.SUBTITLE,
identity="subtitle-1",
language="eng",
subtitle_lines=("subtitle payload",),
),
],
)
completed = run_ffx_unmux(
self.workdir,
self.home_dir,
self.database_path,
"--label",
"dball",
"--output-directory",
str(output_directory),
"--subtitles-only",
str(source_path),
)
self.assertCompleted(completed)
self.assertIn(
"Unmuxing stream 1 into file dball_S02E01_1_eng",
completed.stderr,
)
if __name__ == "__main__":
unittest.main()

View File

@@ -0,0 +1,137 @@
from __future__ import annotations
import json
import os
from pathlib import Path
import sys
import tempfile
import unittest
from click.testing import CliRunner
SRC_ROOT = Path(__file__).resolve().parents[2] / "src"
if str(SRC_ROOT) not in sys.path:
sys.path.insert(0, str(SRC_ROOT))
from ffx import cli # noqa: E402
class RenameCliTests(unittest.TestCase):
def setUp(self):
self.tempdir = tempfile.TemporaryDirectory()
self.workspace = Path(self.tempdir.name)
self.home_dir = self.workspace / "home"
self.home_dir.mkdir()
def tearDown(self):
self.tempdir.cleanup()
def write_source(self, filename: str, payload: bytes = b"episode") -> Path:
source_path = self.workspace / filename
source_path.write_bytes(payload)
return source_path
def write_config(self, data: dict) -> None:
config_dir = self.home_dir / ".local" / "etc"
config_dir.mkdir(parents=True, exist_ok=True)
(config_dir / "ffx.json").write_text(json.dumps(data), encoding="utf-8")
def invoke_rename(self, *args: str):
runner = CliRunner()
result = runner.invoke(
cli.ffx,
["rename", *args],
env={**os.environ, "HOME": str(self.home_dir)},
)
self.assertEqual(0, result.exit_code, result.output)
return result
def test_rename_moves_matching_file_in_place(self):
source_path = self.write_source("demo_S02E03.mkv", b"season-episode")
result = self.invoke_rename("--prefix", "dball", str(source_path))
target_path = self.workspace / "dball_s02e03.mkv"
self.assertIn("demo_S02E03.mkv -> dball_s02e03.mkv", result.output)
self.assertFalse(source_path.exists())
self.assertTrue(target_path.exists())
self.assertEqual(b"season-episode", target_path.read_bytes())
def test_rename_uses_default_season_and_suffix_for_episode_only_match(self):
source_path = self.write_source("demo_E07.mp4", b"episode-only")
result = self.invoke_rename(
"--prefix",
"dball",
"--suffix",
"bonus",
str(source_path),
)
target_path = self.workspace / "dball_s01e07_bonus.mp4"
self.assertIn("demo_E07.mp4 -> dball_s01e07_bonus.mp4", result.output)
self.assertFalse(source_path.exists())
self.assertTrue(target_path.exists())
self.assertEqual(b"episode-only", target_path.read_bytes())
def test_rename_cli_season_overrides_source_season(self):
source_path = self.write_source("demo_s02e07.webm")
result = self.invoke_rename(
"--prefix",
"dball",
"--season",
"5",
str(source_path),
)
target_path = self.workspace / "dball_s05e07.webm"
self.assertIn("demo_s02e07.webm -> dball_s05e07.webm", result.output)
self.assertFalse(source_path.exists())
self.assertTrue(target_path.exists())
def test_rename_dry_run_prints_mapping_without_moving(self):
source_path = self.write_source("demo_E07.mkv")
result = self.invoke_rename(
"--dry-run",
"--prefix",
"dball",
str(source_path),
)
target_path = self.workspace / "dball_s01e07.mkv"
self.assertIn("demo_E07.mkv -> dball_s01e07.mkv", result.output)
self.assertTrue(source_path.exists())
self.assertFalse(target_path.exists())
def test_rename_uses_configured_indicator_digit_lengths(self):
self.write_config(
{
"defaultIndicatorSeasonDigits": 3,
"defaultIndicatorEpisodeDigits": 4,
}
)
source_path = self.write_source("demo_E07.mkv")
result = self.invoke_rename("--prefix", "dball", str(source_path))
target_path = self.workspace / "dball_s001e0007.mkv"
self.assertIn("demo_E07.mkv -> dball_s001e0007.mkv", result.output)
self.assertFalse(source_path.exists())
self.assertTrue(target_path.exists())
def test_rename_skips_non_matching_filenames(self):
source_path = self.write_source("demo_finale.mkv")
result = self.invoke_rename("--prefix", "dball", str(source_path))
self.assertIn("No matching files found.", result.output)
self.assertTrue(source_path.exists())
if __name__ == "__main__":
unittest.main()

View File

@@ -57,7 +57,7 @@ class UpgradeCommandTests(unittest.TestCase):
self.assertTrue(subprocess_calls[0][1]["capture_output"])
self.assertTrue(subprocess_calls[0][1]["text"])
def test_upgrade_resets_before_checkout_and_pull_when_user_confirms(self):
def test_upgrade_resets_then_fetches_and_checks_out_requested_branch_when_user_confirms(self):
runner = CliRunner()
repo_path = "/tmp/ffx-repo"
pip_path = "/tmp/ffx-venv/bin/pip"
@@ -85,8 +85,8 @@ class UpgradeCommandTests(unittest.TestCase):
[
['git', 'status', '--porcelain', '--untracked-files=no'],
['git', 'reset', '--hard', 'HEAD'],
['git', 'checkout', 'main'],
['git', 'pull'],
['git', 'fetch', 'origin', 'main'],
['git', 'checkout', '-B', 'main', 'FETCH_HEAD'],
[pip_path, 'install', '--upgrade', 'pip', 'setuptools', 'wheel'],
[pip_path, 'install', '--editable', '.'],
],
@@ -95,6 +95,39 @@ class UpgradeCommandTests(unittest.TestCase):
for args, kwargs in subprocess_calls[1:]:
self.assertEqual(repo_path, kwargs["cwd"], args)
def test_upgrade_pulls_current_branch_when_no_branch_is_requested(self):
runner = CliRunner()
repo_path = "/tmp/ffx-repo"
pip_path = "/tmp/ffx-venv/bin/pip"
subprocess_calls = []
def fake_run(args, **kwargs):
subprocess_calls.append((args, kwargs))
if args == ['git', 'status', '--porcelain', '--untracked-files=no']:
return self.make_completed(args, stdout="")
return self.make_completed(args)
with (
patch.object(cli, "getBundleRepoPath", return_value=repo_path),
patch.object(cli, "getBundlePipPath", return_value=pip_path),
patch.object(cli.os.path, "isdir", return_value=True),
patch.object(cli.os.path, "isfile", return_value=True),
patch.object(cli.subprocess, "run", side_effect=fake_run),
):
result = runner.invoke(cli.ffx, ["upgrade"])
self.assertEqual(0, result.exit_code, result.output)
self.assertEqual(
[
['git', 'status', '--porcelain', '--untracked-files=no'],
['git', 'pull'],
[pip_path, 'install', '--upgrade', 'pip', 'setuptools', 'wheel'],
[pip_path, 'install', '--editable', '.'],
],
[call[0] for call in subprocess_calls],
)
if __name__ == "__main__":
unittest.main()

View File

@@ -0,0 +1,150 @@
from __future__ import annotations
import json
import os
from pathlib import Path
import stat
import subprocess
import sys
import tempfile
import textwrap
import unittest
REPO_ROOT = Path(__file__).resolve().parents[2]
SCRIPT_PATH = REPO_ROOT / "tools" / "configure_workstation.sh"
BUNDLE_PYTHON = Path.home() / ".local" / "share" / "ffx.venv" / "bin" / "python"
class ConfigureWorkstationScriptTests(unittest.TestCase):
def setUp(self):
self.tempdir = tempfile.TemporaryDirectory()
self.home_dir = Path(self.tempdir.name) / "home"
self.home_dir.mkdir()
self.stub_bin_dir = Path(self.tempdir.name) / "bin"
self.stub_bin_dir.mkdir()
for command_name in ("git", "python3", "ffmpeg", "ffprobe", "cpulimit"):
self.write_stub_command(command_name)
def tearDown(self):
self.tempdir.cleanup()
def write_stub_command(self, name: str, body: str = "") -> None:
script_path = self.stub_bin_dir / name
script_path.write_text(
"#!/usr/bin/env bash\n"
+ body
+ "\n",
encoding="utf-8",
)
script_path.chmod(script_path.stat().st_mode | stat.S_IXUSR)
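# run_script executes tools/configure_workstation.sh against the temporary HOME,
# with the stub commands above placed first on PATH; it is skipped when the bundle
# Python interpreter is not installed on the machine running the tests.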
def run_script(self, **env_overrides: str) -> subprocess.CompletedProcess[str]:
if not BUNDLE_PYTHON.is_file():
self.skipTest(f"Missing bundle Python at {BUNDLE_PYTHON}")
env = {
**os.environ,
"HOME": str(self.home_dir),
"PATH": f"{self.stub_bin_dir}:{os.environ.get('PATH', '')}",
"FFX_PYTHON": str(BUNDLE_PYTHON),
**env_overrides,
}
return subprocess.run(
["bash", str(SCRIPT_PATH)],
capture_output=True,
cwd=REPO_ROOT,
env=env,
text=True,
)
def test_script_seeds_default_config_from_template(self):
completed = self.run_script()
self.assertEqual(
0,
completed.returncode,
f"STDOUT:\n{completed.stdout}\nSTDERR:\n{completed.stderr}",
)
config_path = self.home_dir / ".local" / "etc" / "ffx.json"
self.assertTrue(config_path.exists())
config_data = json.loads(config_path.read_text(encoding="utf-8"))
self.assertEqual(
{
"databasePath": str(self.home_dir / ".local" / "var" / "ffx" / "ffx.db"),
"logDirectory": str(self.home_dir / ".local" / "var" / "log"),
"subtitlesDirectory": str(
self.home_dir / ".local" / "var" / "sync" / "subtitles"
),
"defaultIndexSeasonDigits": 2,
"defaultIndexEpisodeDigits": 2,
"defaultIndicatorSeasonDigits": 2,
"defaultIndicatorEpisodeDigits": 2,
"metadata": {
"signature": {"RECODED_WITH": "FFX"},
"remove": [
"VERSION-eng",
"creation_time",
"NAME",
],
"streams": {
"remove": [
"BPS",
"NUMBER_OF_FRAMES",
"NUMBER_OF_BYTES",
"_STATISTICS_WRITING_APP",
"_STATISTICS_WRITING_DATE_UTC",
"_STATISTICS_TAGS",
"BPS-eng",
"DURATION-eng",
"NUMBER_OF_FRAMES-eng",
"NUMBER_OF_BYTES-eng",
"_STATISTICS_WRITING_APP-eng",
"_STATISTICS_WRITING_DATE_UTC-eng",
"_STATISTICS_TAGS-eng",
]
},
},
},
config_data,
)
def test_script_honors_custom_template_override(self):
custom_template_path = Path(self.tempdir.name) / "custom-config.j2"
custom_template_path.write_text(
textwrap.dedent(
"""
{
"databasePath": {{ database_path_json }},
"marker": "from-template",
"subtitlesDirectory": {{ subtitles_directory_json }}
}
"""
).lstrip(),
encoding="utf-8",
)
completed = self.run_script(FFX_CONFIG_TEMPLATE=str(custom_template_path))
self.assertEqual(
0,
completed.returncode,
f"STDOUT:\n{completed.stdout}\nSTDERR:\n{completed.stderr}",
)
config_path = self.home_dir / ".local" / "etc" / "ffx.json"
config_data = json.loads(config_path.read_text(encoding="utf-8"))
self.assertEqual("from-template", config_data["marker"])
self.assertEqual(
str(self.home_dir / ".local" / "var" / "ffx" / "ffx.db"),
config_data["databasePath"],
)
if __name__ == "__main__":
unittest.main()

View File

@@ -1,11 +1,14 @@
from __future__ import annotations
from pathlib import Path
import sqlite3
import sys
import tempfile
import unittest
from unittest.mock import patch
import click
SRC_ROOT = Path(__file__).resolve().parents[2] / "src"
@@ -15,8 +18,18 @@ if str(SRC_ROOT) not in sys.path:
from ffx.constants import DATABASE_VERSION # noqa: E402
from ffx.database import DATABASE_VERSION_KEY, databaseContext, getDatabaseVersion # noqa: E402
from ffx.model.shifted_season import ShiftedSeason # noqa: E402
from ffx.model.property import Property # noqa: E402
from ffx.model.show import Show # noqa: E402
from ffx.model.show import Base # noqa: E402
from ffx.show_controller import ShowController # noqa: E402
from ffx.show_descriptor import ShowDescriptor # noqa: E402
from ffx.shifted_season_controller import ShiftedSeasonController # noqa: E402
class StaticConfig:
def getData(self):
return {}
class DatabaseContextTests(unittest.TestCase):
@@ -27,6 +40,115 @@ class DatabaseContextTests(unittest.TestCase):
def tearDown(self):
self.tempdir.cleanup()
def create_demo_show_with_shift(self):
database_context = databaseContext(str(self.database_path))
context = {
"database": database_context,
"config": StaticConfig(),
"logger": object(),
}
try:
ShowController(context).updateShow(
ShowDescriptor(id=1, name="Demo", year=2000)
)
shifted_season_id = ShiftedSeasonController(context).addShiftedSeason(
showId=1,
shiftedSeasonObj={
"original_season": 1,
"first_episode": 1,
"last_episode": 10,
"season_offset": 1,
"episode_offset": -10,
},
)
finally:
database_context["engine"].dispose()
return shifted_season_id
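# The helpers below strip the newer columns from the freshly created schema so the
# tests can simulate an older (v2) database layout or a schema that needs repair.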
def rewrite_shows_table_without_show_fields(self, cursor):
cursor.execute("ALTER TABLE shows RENAME TO shows_current")
cursor.execute(
"""
CREATE TABLE shows (
id INTEGER PRIMARY KEY,
name VARCHAR,
year INTEGER,
index_season_digits INTEGER,
index_episode_digits INTEGER,
indicator_season_digits INTEGER,
indicator_episode_digits INTEGER
)
"""
)
cursor.execute(
"""
INSERT INTO shows (
id,
name,
year,
index_season_digits,
index_episode_digits,
indicator_season_digits,
indicator_episode_digits
)
SELECT
id,
name,
year,
index_season_digits,
index_episode_digits,
indicator_season_digits,
indicator_episode_digits
FROM shows_current
"""
)
cursor.execute("DROP TABLE shows_current")
def rewrite_shifted_seasons_table_without_pattern_owner(self, cursor):
cursor.execute("DROP INDEX IF EXISTS ix_shifted_seasons_show_id")
cursor.execute("DROP INDEX IF EXISTS ix_shifted_seasons_pattern_id")
cursor.execute(
"ALTER TABLE shifted_seasons RENAME TO shifted_seasons_current"
)
cursor.execute(
"""
CREATE TABLE shifted_seasons (
id INTEGER PRIMARY KEY,
show_id INTEGER,
original_season INTEGER,
first_episode INTEGER DEFAULT -1,
last_episode INTEGER DEFAULT -1,
season_offset INTEGER DEFAULT 0,
episode_offset INTEGER DEFAULT 0,
FOREIGN KEY(show_id) REFERENCES shows(id) ON DELETE CASCADE
)
"""
)
cursor.execute(
"""
INSERT INTO shifted_seasons (
id,
show_id,
original_season,
first_episode,
last_episode,
season_offset,
episode_offset
)
SELECT
id,
show_id,
original_season,
first_episode,
last_episode,
season_offset,
episode_offset
FROM shifted_seasons_current
"""
)
cursor.execute("DROP TABLE shifted_seasons_current")
def test_database_context_bootstraps_new_database_with_current_version(self):
with patch("ffx.database.Base.metadata.create_all", wraps=Base.metadata.create_all) as mocked_create_all:
context = databaseContext(str(self.database_path))
@@ -78,6 +200,127 @@ class DatabaseContextTests(unittest.TestCase):
mocked_create_all.assert_not_called()
def test_database_context_migrates_v2_shifted_seasons_schema_to_v3(self):
shifted_season_id = self.create_demo_show_with_shift()
connection = sqlite3.connect(self.database_path)
try:
cursor = connection.cursor()
cursor.execute("PRAGMA foreign_keys=OFF")
self.rewrite_shifted_seasons_table_without_pattern_owner(cursor)
self.rewrite_shows_table_without_show_fields(cursor)
cursor.execute(
"UPDATE properties SET value = '2' WHERE key = ?",
(DATABASE_VERSION_KEY,),
)
connection.commit()
finally:
connection.close()
with patch("ffx.database.click.confirm", return_value=True) as mocked_confirm, patch(
"ffx.database.click.echo"
) as mocked_echo:
reopened_context = databaseContext(str(self.database_path))
try:
self.assertEqual(DATABASE_VERSION, getDatabaseVersion(reopened_context))
mocked_confirm.assert_called_once()
backup_path = Path(f"{self.database_path}.v2-to-v{DATABASE_VERSION}.bak")
self.assertTrue(backup_path.exists())
Session = reopened_context["session"]
session = Session()
try:
migrated_shifted_season = (
session.query(ShiftedSeason)
.filter(ShiftedSeason.id == shifted_season_id)
.first()
)
self.assertIsNotNone(migrated_shifted_season)
self.assertEqual(1, migrated_shifted_season.getShowId())
self.assertIsNone(migrated_shifted_season.getPatternId())
self.assertEqual(1, migrated_shifted_season.getOriginalSeason())
self.assertEqual(1, migrated_shifted_season.getFirstEpisode())
self.assertEqual(10, migrated_shifted_season.getLastEpisode())
migrated_show = session.query(Show).filter(Show.id == 1).first()
self.assertIsNotNone(migrated_show)
self.assertEqual(0, int(migrated_show.quality or 0))
self.assertEqual('', str(migrated_show.notes or ''))
finally:
session.close()
finally:
reopened_context["engine"].dispose()
echoedLines = [call.args[0] for call in mocked_echo.call_args_list]
self.assertIn("Database migration required.", echoedLines)
self.assertIn("Current version: 2", echoedLines)
self.assertIn(f"Target version: {DATABASE_VERSION}", echoedLines)
self.assertIn(
" 2 -> 3: ffx.model.migration.step_2_3 [present]",
echoedLines,
)
def test_database_context_aborts_migration_when_confirmation_is_declined(self):
context = databaseContext(str(self.database_path))
try:
Session = context["session"]
session = Session()
try:
version_row = (
session.query(Property)
.filter(Property.key == DATABASE_VERSION_KEY)
.first()
)
version_row.value = "2"
session.commit()
finally:
session.close()
finally:
context["engine"].dispose()
with patch("ffx.database.click.confirm", return_value=False), patch(
"ffx.database.click.echo"
):
with self.assertRaises(click.ClickException) as raisedContext:
databaseContext(str(self.database_path))
self.assertEqual("Database migration aborted by user.", str(raisedContext.exception))
self.assertFalse(Path(f"{self.database_path}.v2-to-v{DATABASE_VERSION}.bak").exists())
def test_database_context_repairs_current_show_schema_without_version_bump(self):
self.create_demo_show_with_shift()
connection = sqlite3.connect(self.database_path)
try:
cursor = connection.cursor()
cursor.execute("PRAGMA foreign_keys=OFF")
self.rewrite_shows_table_without_show_fields(cursor)
connection.commit()
finally:
connection.close()
with patch("ffx.database.click.confirm") as mocked_confirm, patch(
"ffx.database.click.echo"
) as mocked_echo:
reopened_context = databaseContext(str(self.database_path))
try:
self.assertEqual(DATABASE_VERSION, getDatabaseVersion(reopened_context))
Session = reopened_context["session"]
session = Session()
try:
repaired_show = session.query(Show).filter(Show.id == 1).first()
self.assertIsNotNone(repaired_show)
self.assertEqual(0, int(repaired_show.quality or 0))
self.assertEqual('', str(repaired_show.notes or ''))
finally:
session.close()
finally:
reopened_context["engine"].dispose()
mocked_confirm.assert_not_called()
mocked_echo.assert_not_called()
if __name__ == "__main__":
unittest.main()

View File

@@ -4,6 +4,7 @@ from pathlib import Path
import sys
import unittest
from unittest.mock import patch
from types import SimpleNamespace
SRC_ROOT = Path(__file__).resolve().parents[2] / "src"
@@ -15,6 +16,7 @@ if str(SRC_ROOT) not in sys.path:
from ffx.ffx_controller import FfxController # noqa: E402
from ffx.logging_utils import get_ffx_logger # noqa: E402
from ffx.media_descriptor import MediaDescriptor # noqa: E402
from ffx.show_descriptor import ShowDescriptor # noqa: E402
from ffx.track_codec import TrackCodec # noqa: E402
from ffx.track_descriptor import TrackDescriptor # noqa: E402
from ffx.track_type import TrackType # noqa: E402
@@ -134,6 +136,62 @@ class FfxControllerTests(unittest.TestCase):
self.assertIn("ENCODING_QUALITY=29", commands[0])
self.assertIn("ENCODING_PRESET=7", commands[0])
def test_run_job_uses_show_quality_when_pattern_quality_is_unset(self):
context = self.make_context(VideoEncoder.H264)
target_descriptor, source_descriptor = self.make_media_descriptors()
controller = FfxController(context, target_descriptor, source_descriptor)
commands = []
show_descriptor = ShowDescriptor(id=1, name="Show", year=2024, quality=23)
pattern = SimpleNamespace(quality=0)
with (
patch.object(
controller,
"executeCommandSequence",
side_effect=lambda command: commands.append(command) or ("", "", 0),
),
patch.object(context["logger"], "info") as mocked_info,
):
controller.runJob(
"input.mkv",
"output.mkv",
chainIteration=[],
currentPattern=pattern,
currentShowDescriptor=show_descriptor,
)
self.assertEqual(1, len(commands))
self.assertIn("ENCODING_QUALITY=23", commands[0])
mocked_info.assert_any_call("Setting quality 23 from show")
def test_run_job_prefers_pattern_quality_over_show_quality(self):
context = self.make_context(VideoEncoder.H264)
target_descriptor, source_descriptor = self.make_media_descriptors()
controller = FfxController(context, target_descriptor, source_descriptor)
commands = []
show_descriptor = ShowDescriptor(id=1, name="Show", year=2024, quality=23)
pattern = SimpleNamespace(quality=19)
with (
patch.object(
controller,
"executeCommandSequence",
side_effect=lambda command: commands.append(command) or ("", "", 0),
),
patch.object(context["logger"], "info") as mocked_info,
):
controller.runJob(
"input.mkv",
"output.mkv",
chainIteration=[],
currentPattern=pattern,
currentShowDescriptor=show_descriptor,
)
self.assertEqual(1, len(commands))
self.assertIn("ENCODING_QUALITY=19", commands[0])
mocked_info.assert_any_call("Setting quality 19 from pattern")
if __name__ == "__main__":
unittest.main()

View File

@@ -0,0 +1,79 @@
from __future__ import annotations
from pathlib import Path
import sys
import tempfile
import unittest
SRC_ROOT = Path(__file__).resolve().parents[2] / "src"
if str(SRC_ROOT) not in sys.path:
sys.path.insert(0, str(SRC_ROOT))
from ffx.logging_utils import get_ffx_logger # noqa: E402
from ffx.media_descriptor import MediaDescriptor # noqa: E402
from ffx.track_descriptor import TrackDescriptor # noqa: E402
from ffx.track_disposition import TrackDisposition # noqa: E402
from ffx.track_type import TrackType # noqa: E402
class MediaDescriptorImportSubtitlesTests(unittest.TestCase):
def make_descriptor(self) -> MediaDescriptor:
return MediaDescriptor(
context={"logger": get_ffx_logger()},
track_descriptors=[
TrackDescriptor(
index=3,
source_index=3,
sub_index=0,
track_type=TrackType.SUBTITLE,
tags={"language": "eng", "title": "DB Subtitle"},
disposition_set={TrackDisposition.DEFAULT},
)
],
)
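# Sidecar filenames appear to follow a <label>_S..E.._<track index>_<language>_<disposition>
# pattern, so "dball_S01E01_3_deu_FOR.vtt" maps to track index 3, language "deu",
# and the FORCED disposition unless preserve_dispositions is requested.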
def test_import_subtitles_preserves_target_dispositions_when_requested(self):
descriptor = self.make_descriptor()
with tempfile.TemporaryDirectory() as tmpdir:
sidecar_path = Path(tmpdir) / "dball_S01E01_3_deu_FOR.vtt"
sidecar_path.write_text("WEBVTT\n\n", encoding="utf-8")
descriptor.importSubtitles(
tmpdir,
"dball",
season=1,
episode=1,
preserve_dispositions=True,
)
track = descriptor.getSubtitleTracks()[0]
self.assertEqual(str(sidecar_path), track.getExternalSourceFilePath())
self.assertEqual("deu", track.getTags()["language"])
self.assertEqual({TrackDisposition.DEFAULT}, track.getDispositionSet())
def test_import_subtitles_uses_sidecar_dispositions_by_default(self):
descriptor = self.make_descriptor()
with tempfile.TemporaryDirectory() as tmpdir:
sidecar_path = Path(tmpdir) / "dball_S01E01_3_deu_FOR.vtt"
sidecar_path.write_text("WEBVTT\n\n", encoding="utf-8")
descriptor.importSubtitles(
tmpdir,
"dball",
season=1,
episode=1,
)
track = descriptor.getSubtitleTracks()[0]
self.assertEqual(str(sidecar_path), track.getExternalSourceFilePath())
self.assertEqual("deu", track.getTags()["language"])
self.assertEqual({TrackDisposition.FORCED}, track.getDispositionSet())
if __name__ == "__main__":
unittest.main()

View File

@@ -0,0 +1,47 @@
from __future__ import annotations
from pathlib import Path
import sys
import unittest
SRC_ROOT = Path(__file__).resolve().parents[2] / "src"
if str(SRC_ROOT) not in sys.path:
sys.path.insert(0, str(SRC_ROOT))
from ffx.model.migration import ( # noqa: E402
DatabaseVersionException,
getMigrationPlan,
loadMigrationStep,
migrateDatabase,
)
class MigrationTests(unittest.TestCase):
def test_get_migration_plan_lists_known_step_with_module_presence(self):
migrationPlan = getMigrationPlan(2, 3)
self.assertEqual(1, len(migrationPlan))
self.assertEqual(2, migrationPlan[0].versionFrom)
self.assertEqual(3, migrationPlan[0].versionTo)
self.assertEqual("ffx.model.migration.step_2_3", migrationPlan[0].moduleName)
self.assertTrue(migrationPlan[0].modulePresent)
def test_load_migration_step_returns_known_step(self):
migrationStep = loadMigrationStep(2, 3)
self.assertTrue(callable(migrationStep))
def test_migrate_database_raises_when_step_module_is_missing(self):
updatedVersions = []
with self.assertRaises(DatabaseVersionException):
migrateDatabase({}, 1, 2, lambda context, version: updatedVersions.append(version))
self.assertEqual([], updatedVersions)
if __name__ == "__main__":
unittest.main()

View File

@@ -0,0 +1,208 @@
from __future__ import annotations
import logging
from pathlib import Path
import sys
import tempfile
import unittest
from unittest.mock import patch
SRC_ROOT = Path(__file__).resolve().parents[2] / "src"
if str(SRC_ROOT) not in sys.path:
sys.path.insert(0, str(SRC_ROOT))
from ffx.database import databaseContext # noqa: E402
from ffx.model.pattern import Pattern # noqa: E402
from ffx.model.track import Track # noqa: E402
from ffx.show_controller import ShowController # noqa: E402
from ffx.show_descriptor import ShowDescriptor # noqa: E402
from ffx.shifted_season_controller import ShiftedSeasonController # noqa: E402
from ffx.track_type import TrackType # noqa: E402
class StaticConfig:
def __init__(self, data: dict | None = None):
self._data = data or {}
def getData(self):
return self._data
def make_logger(name: str) -> logging.Logger:
logger = logging.getLogger(name)
logger.handlers = []
logger.setLevel(logging.DEBUG)
logger.propagate = False
logger.addHandler(logging.NullHandler())
return logger
def make_context(database_path: Path) -> dict:
return {
"logger": make_logger(f"ffx-test-shifted-{database_path.stem}"),
"config": StaticConfig(),
"database": databaseContext(str(database_path)),
}
class ShiftedSeasonControllerTests(unittest.TestCase):
def setUp(self):
self.tempdir = tempfile.TemporaryDirectory()
self.database_path = Path(self.tempdir.name) / "shifted-season-test.db"
self.context = make_context(self.database_path)
self.show_controller = ShowController(self.context)
self.shifted_season_controller = ShiftedSeasonController(self.context)
def tearDown(self):
self.context["database"]["engine"].dispose()
self.tempdir.cleanup()
def add_show(self, show_id: int, name: str = "Demo Show"):
self.show_controller.updateShow(
ShowDescriptor(id=show_id, name=name, year=2000 + show_id)
)
def add_pattern(self, show_id: int, expression: str) -> int:
self.add_show(show_id)
Session = self.context["database"]["session"]
session = Session()
try:
pattern = Pattern(show_id=show_id, pattern=expression)
session.add(pattern)
session.flush()
session.add(
Track(
pattern_id=pattern.getId(),
track_type=TrackType.VIDEO.index(),
codec_name="h264",
index=0,
source_index=0,
disposition_flags=0,
audio_layout=0,
)
)
session.commit()
return pattern.getId()
finally:
session.close()
def test_shift_season_uses_show_mapping_when_no_pattern_mapping_exists(self):
pattern_id = self.add_pattern(1, r"^demo_(s[0-9]+e[0-9]+)\.mkv$")
self.shifted_season_controller.addShiftedSeason(
showId=1,
shiftedSeasonObj={
"original_season": 1,
"first_episode": 1,
"last_episode": 10,
"season_offset": 2,
"episode_offset": 5,
},
)
with patch.object(self.context["logger"], "info") as mocked_info:
shifted_season, shifted_episode = self.shifted_season_controller.shiftSeason(
showId=1,
patternId=pattern_id,
season=1,
episode=3,
)
self.assertEqual((3, 8), (shifted_season, shifted_episode))
mocked_info.assert_called_once_with(
"Setting season shift 1/3 -> 3/8 from show"
)
def test_shift_season_prefers_pattern_mapping_over_show_mapping(self):
pattern_id = self.add_pattern(1, r"^demo_(s[0-9]+e[0-9]+)\.mkv$")
self.shifted_season_controller.addShiftedSeason(
showId=1,
shiftedSeasonObj={
"original_season": 1,
"first_episode": 1,
"last_episode": 10,
"season_offset": 2,
"episode_offset": 5,
},
)
self.shifted_season_controller.addShiftedSeason(
patternId=pattern_id,
shiftedSeasonObj={
"original_season": 1,
"first_episode": 1,
"last_episode": 10,
"season_offset": 1,
"episode_offset": -2,
},
)
with patch.object(self.context["logger"], "info") as mocked_info:
shifted_season, shifted_episode = self.shifted_season_controller.shiftSeason(
showId=1,
patternId=pattern_id,
season=1,
episode=3,
)
self.assertEqual((2, 1), (shifted_season, shifted_episode))
mocked_info.assert_called_once_with(
"Setting season shift 1/3 -> 2/1 from pattern"
)
def test_shift_season_pattern_zero_offsets_override_show_mapping_to_identity(self):
pattern_id = self.add_pattern(1, r"^demo_(s[0-9]+e[0-9]+)\.mkv$")
self.shifted_season_controller.addShiftedSeason(
showId=1,
shiftedSeasonObj={
"original_season": 1,
"first_episode": 1,
"last_episode": 10,
"season_offset": 2,
"episode_offset": 5,
},
)
self.shifted_season_controller.addShiftedSeason(
patternId=pattern_id,
shiftedSeasonObj={
"original_season": 1,
"first_episode": 1,
"last_episode": 10,
"season_offset": 0,
"episode_offset": 0,
},
)
with patch.object(self.context["logger"], "info") as mocked_info:
shifted_season, shifted_episode = self.shifted_season_controller.shiftSeason(
showId=1,
patternId=pattern_id,
season=1,
episode=3,
)
self.assertEqual((1, 3), (shifted_season, shifted_episode))
mocked_info.assert_called_once_with(
"Setting season shift 1/3 -> 1/3 from pattern"
)
def test_shift_season_falls_back_to_identity_when_no_rule_matches(self):
pattern_id = self.add_pattern(1, r"^demo_(s[0-9]+e[0-9]+)\.mkv$")
with patch.object(self.context["logger"], "info") as mocked_info:
shifted_season, shifted_episode = self.shifted_season_controller.shiftSeason(
showId=1,
patternId=pattern_id,
season=4,
episode=20,
)
self.assertEqual((4, 20), (shifted_season, shifted_episode))
mocked_info.assert_called_once_with(
"Setting season shift 4/20 -> 4/20 from default"
)
if __name__ == "__main__":
unittest.main()
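
Taken together, the four tests encode a simple resolution order: a pattern-level shifted-season mapping wins over a show-level one (even when its offsets are zero), and with no matching rule the season/episode pass through unchanged. Here is a standalone sketch of that precedence, assuming a rule applies when the season equals original_season and the episode falls inside [first_episode, last_episode]; ShiftRule and shift are illustrative names, not the controller's API.

from dataclasses import dataclass


@dataclass
class ShiftRule:
    original_season: int
    first_episode: int
    last_episode: int
    season_offset: int
    episode_offset: int

    def matches(self, season: int, episode: int) -> bool:
        return (
            season == self.original_season
            and self.first_episode <= episode <= self.last_episode
        )


def shift(season: int, episode: int,
          pattern_rules: list[ShiftRule],
          show_rules: list[ShiftRule]) -> tuple[int, int, str]:
    # Pattern rules take precedence, then show rules, then the identity default.
    for source, rules in (("pattern", pattern_rules), ("show", show_rules)):
        for rule in rules:
            if rule.matches(season, episode):
                return season + rule.season_offset, episode + rule.episode_offset, source
    return season, episode, "default"


print(shift(1, 3, [], [ShiftRule(1, 1, 10, 2, 5)]))                            # (3, 8, 'show')
print(shift(1, 3, [ShiftRule(1, 1, 10, 1, -2)], [ShiftRule(1, 1, 10, 2, 5)]))  # (2, 1, 'pattern')
print(shift(4, 20, [], []))                                                    # (4, 20, 'default')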

View File

@@ -0,0 +1,111 @@
from __future__ import annotations
import logging
from pathlib import Path
import sys
import unittest
SRC_ROOT = Path(__file__).resolve().parents[2] / "src"
if str(SRC_ROOT) not in sys.path:
sys.path.insert(0, str(SRC_ROOT))
from ffx.constants import (  # noqa: E402
    DEFAULT_SHOW_INDEX_EPISODE_DIGITS,
    DEFAULT_SHOW_INDEX_SEASON_DIGITS,
    DEFAULT_SHOW_INDICATOR_EPISODE_DIGITS,
    DEFAULT_SHOW_INDICATOR_SEASON_DIGITS,
)
from ffx.helper import getEpisodeFileBasename  # noqa: E402
from ffx.show_descriptor import ShowDescriptor  # noqa: E402
class StaticConfig:
def __init__(self, data: dict | None = None):
self._data = data or {}
def getData(self):
return self._data
class ShowDescriptorDefaultTests(unittest.TestCase):
def make_context(self, config_data: dict | None = None) -> dict:
logger = logging.getLogger("ffx-test-show-descriptor-defaults")
logger.handlers = []
logger.addHandler(logging.NullHandler())
return {"config": StaticConfig(config_data), "logger": logger}
def test_show_descriptor_uses_config_defaults_when_context_is_present(self):
descriptor = ShowDescriptor(
context=self.make_context(
{
"defaultIndexSeasonDigits": "1",
"defaultIndexEpisodeDigits": "3",
"defaultIndicatorSeasonDigits": "3",
"defaultIndicatorEpisodeDigits": "4",
}
),
id=1,
name="Configured Show",
year=2024,
)
self.assertEqual(1, descriptor.getIndexSeasonDigits())
self.assertEqual(3, descriptor.getIndexEpisodeDigits())
self.assertEqual(3, descriptor.getIndicatorSeasonDigits())
self.assertEqual(4, descriptor.getIndicatorEpisodeDigits())
self.assertEqual(0, descriptor.getQuality())
self.assertEqual("", descriptor.getNotes())
def test_show_descriptor_without_context_uses_shared_constants(self):
descriptor = ShowDescriptor(id=1, name="Default Show", year=2024)
self.assertEqual(DEFAULT_SHOW_INDEX_SEASON_DIGITS, descriptor.getIndexSeasonDigits())
self.assertEqual(DEFAULT_SHOW_INDEX_EPISODE_DIGITS, descriptor.getIndexEpisodeDigits())
self.assertEqual(
DEFAULT_SHOW_INDICATOR_SEASON_DIGITS,
descriptor.getIndicatorSeasonDigits(),
)
self.assertEqual(
DEFAULT_SHOW_INDICATOR_EPISODE_DIGITS,
descriptor.getIndicatorEpisodeDigits(),
)
self.assertEqual(0, descriptor.getQuality())
self.assertEqual("", descriptor.getNotes())
def test_show_descriptor_preserves_explicit_quality(self):
descriptor = ShowDescriptor(id=1, name="Quality Show", year=2024, quality=23)
self.assertEqual(23, descriptor.getQuality())
def test_show_descriptor_preserves_explicit_notes(self):
descriptor = ShowDescriptor(id=1, name="Notes Show", year=2024, notes="show notes")
self.assertEqual("show notes", descriptor.getNotes())
def test_episode_basename_uses_configured_digit_defaults_when_omitted(self):
basename = getEpisodeFileBasename(
"Configured Show",
"Episode Name",
2,
7,
context=self.make_context(
{
"defaultIndexSeasonDigits": 1,
"defaultIndexEpisodeDigits": 3,
"defaultIndicatorSeasonDigits": 3,
"defaultIndicatorEpisodeDigits": 4,
}
),
)
self.assertEqual(
"Configured Show - 2007 Episode Name - S002E0007",
basename,
)
if __name__ == "__main__":
unittest.main()
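
The last test fixes the basename layout: the season/episode index ("2" plus "007" giving 2007) sits between show name and episode name, and the SxxExxxx-style indicator is appended, each padded to the configured digit widths. A standalone sketch that reproduces only this expected value follows; episode_basename is an illustrative name, and the real getEpisodeFileBasename may assemble the string differently.

def episode_basename(show: str, episode_name: str, season: int, episode: int,
                     index_season_digits: int = 1, index_episode_digits: int = 3,
                     indicator_season_digits: int = 3, indicator_episode_digits: int = 4) -> str:
    # Defaults mirror the config values used in the test above.
    index = f"{season:0{index_season_digits}d}{episode:0{index_episode_digits}d}"
    indicator = (
        f"S{season:0{indicator_season_digits}d}"
        f"E{episode:0{indicator_episode_digits}d}"
    )
    return f"{show} - {index} {episode_name} - {indicator}"


assert episode_basename("Configured Show", "Episode Name", 2, 7) == (
    "Configured Show - 2007 Episode Name - S002E0007"
)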

View File

@@ -2,12 +2,15 @@
set -u
ROOT_DIR="$(cd -- "$(dirname -- "${BASH_SOURCE[0]}")/.." && pwd)"
CONFIG_DIR="${FFX_CONFIG_DIR:-${HOME}/.local/etc}"
CONFIG_FILE="${FFX_CONFIG_FILE:-${CONFIG_DIR}/ffx.json}"
VAR_DIR="${FFX_VAR_DIR:-${HOME}/.local/var/ffx}"
LOG_DIR="${FFX_LOG_DIR:-${HOME}/.local/var/log}"
DATABASE_FILE="${FFX_DATABASE_FILE:-${VAR_DIR}/ffx.db}"
SUBTITLES_BASE_DIR="${FFX_SUBTITLES_BASE_DIR:-${HOME}/.local/var/sync/subtitles}"
FFX_PYTHON="${FFX_PYTHON:-${HOME}/.local/share/ffx.venv/bin/python}"
CONFIG_TEMPLATE_FILE="${FFX_CONFIG_TEMPLATE:-${ROOT_DIR}/assets/ffx.json.j2}"
CHECK_ONLY=0
WITH_TESTS=0
@@ -49,6 +52,8 @@ Environment overrides:
FFX_LOG_DIR Override the default log directory.
FFX_DATABASE_FILE Override the database path written into a newly seeded config.
FFX_SUBTITLES_BASE_DIR Override the default subtitles base directory written into a newly seeded config.
FFX_PYTHON Override the bundle venv Python used to render the seeded config.
FFX_CONFIG_TEMPLATE Override the Jinja2 template path used to seed the config.
Notes:
- tools/setup.sh is the first installation step and owns bundle venv setup.
@@ -316,6 +321,93 @@ install_system_requirements() {
return 0
}
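
# render_default_config renders the Jinja2 template at ${CONFIG_TEMPLATE_FILE}
# (assets/ffx.json.j2 unless FFX_CONFIG_TEMPLATE overrides it) with the bundle
# venv Python, writes to a temporary file beside the target path, and only
# moves the result into place once rendering has succeeded.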
render_default_config() {
local output_path="$1"
local temporary_output_path=""
if [ ! -x "${FFX_PYTHON}" ]; then
printf 'Missing bundle Python interpreter at %s.\n' "${FFX_PYTHON}" >&2
INSTALL_FAILURES=$((INSTALL_FAILURES + 1))
return 1
fi
if [ ! -f "${CONFIG_TEMPLATE_FILE}" ]; then
printf 'Missing FFX config template at %s.\n' "${CONFIG_TEMPLATE_FILE}" >&2
INSTALL_FAILURES=$((INSTALL_FAILURES + 1))
return 1
fi
if ! temporary_output_path="$(mktemp "${output_path}.tmp.XXXXXX")"; then
printf 'Failed to create a temporary config file next to %s.\n' "${output_path}" >&2
INSTALL_FAILURES=$((INSTALL_FAILURES + 1))
return 1
fi
if ! FFX_CONFIG_TEMPLATE_FILE="${CONFIG_TEMPLATE_FILE}" \
FFX_REPO_ROOT="${ROOT_DIR}" \
FFX_DATABASE_PATH="${DATABASE_FILE}" \
FFX_LOG_DIRECTORY="${LOG_DIR}" \
FFX_SUBTITLES_DIRECTORY="${SUBTITLES_BASE_DIR}" \
"${FFX_PYTHON}" - >"${temporary_output_path}" <<'PY'
from __future__ import annotations
import json
import os
import sys
from pathlib import Path
from jinja2 import Environment, FileSystemLoader, StrictUndefined
repo_root = Path(os.environ["FFX_REPO_ROOT"])
src_root = repo_root / "src"
if str(src_root) not in sys.path:
sys.path.insert(0, str(src_root))
from ffx.constants import (
DEFAULT_SHOW_INDEX_EPISODE_DIGITS,
DEFAULT_SHOW_INDEX_SEASON_DIGITS,
DEFAULT_SHOW_INDICATOR_EPISODE_DIGITS,
DEFAULT_SHOW_INDICATOR_SEASON_DIGITS,
)
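# Reuse the shared digit defaults so the seeded config stays in step with the
# fallbacks ShowDescriptor applies when no config value is present.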
template_path = Path(os.environ["FFX_CONFIG_TEMPLATE_FILE"])
environment = Environment(
loader=FileSystemLoader(str(template_path.parent)),
undefined=StrictUndefined,
autoescape=False,
keep_trailing_newline=True,
)
template = environment.get_template(template_path.name)
sys.stdout.write(
template.render(
database_path_json=json.dumps(os.environ["FFX_DATABASE_PATH"]),
log_directory_json=json.dumps(os.environ["FFX_LOG_DIRECTORY"]),
subtitles_directory_json=json.dumps(os.environ["FFX_SUBTITLES_DIRECTORY"]),
default_index_season_digits=DEFAULT_SHOW_INDEX_SEASON_DIGITS,
default_index_episode_digits=DEFAULT_SHOW_INDEX_EPISODE_DIGITS,
default_indicator_season_digits=DEFAULT_SHOW_INDICATOR_SEASON_DIGITS,
default_indicator_episode_digits=DEFAULT_SHOW_INDICATOR_EPISODE_DIGITS,
)
)
PY
then
rm -f "${temporary_output_path}"
printf 'Failed to render ffx config from template %s.\n' "${CONFIG_TEMPLATE_FILE}" >&2
INSTALL_FAILURES=$((INSTALL_FAILURES + 1))
return 1
fi
if ! mv "${temporary_output_path}" "${output_path}"; then
rm -f "${temporary_output_path}"
printf 'Failed to move rendered ffx config into place at %s.\n' "${output_path}" >&2
INSTALL_FAILURES=$((INSTALL_FAILURES + 1))
return 1
fi
return 0
}
seed_default_config() {
if [ "${CHECK_ONLY}" -eq 1 ]; then
return 0
@@ -365,43 +457,7 @@ seed_default_config() {
if [ ! -f "${CONFIG_FILE}" ]; then
printf 'Seeding ffx config at %s...\n' "${CONFIG_FILE}"
if ! cat >"${CONFIG_FILE}" <<EOF
{
"databasePath": "${DATABASE_FILE}",
"logDirectory": "${LOG_DIR}",
"subtitlesDirectory": "${SUBTITLES_BASE_DIR}",
"metadata": {
"signature": {
"RECODED_WITH": "FFX"
},
"remove": [
"VERSION-eng",
"creation_time",
"NAME"
],
"streams": {
"remove": [
"BPS",
"NUMBER_OF_FRAMES",
"NUMBER_OF_BYTES",
"_STATISTICS_WRITING_APP",
"_STATISTICS_WRITING_DATE_UTC",
"_STATISTICS_TAGS",
"BPS-eng",
"DURATION-eng",
"NUMBER_OF_FRAMES-eng",
"NUMBER_OF_BYTES-eng",
"_STATISTICS_WRITING_APP-eng",
"_STATISTICS_WRITING_DATE_UTC-eng",
"_STATISTICS_TAGS-eng"
]
}
}
}
EOF
then
printf 'Failed to write ffx config at %s.\n' "${CONFIG_FILE}" >&2
INSTALL_FAILURES=$((INSTALL_FAILURES + 1))
if ! render_default_config "${CONFIG_FILE}"; then
return 1
fi
created_any=1