Compare commits
20 Commits
4365e083dc
...
season-shi
| Author | SHA1 | Date |
|---|---|---|
| | 008c643272 | |
| | c302b30e63 | |
| | 7926407534 | |
| | 0894ac2fab | |
| | 353759b983 | |
| | 454f5f0656 | |
| | 0e51d6337f | |
| | a24b6dedaa | |
| | 8361fc536b | |
| | 4d4272e5e8 | |
| | 559869ca68 | |
| | 0e4fae538b | |
| | 2595bfe4f4 | |
| | fc9d94aeee | |
| | 111df11199 | |
| | f0d4c36bc3 | |
| | ef0d6e9274 | |
| | d05b01cfb2 | |
| | 9dc08d48e9 | |
| | 20bdfc0dd7 | |
12
README.md
@@ -99,6 +99,18 @@ TMDB-backed metadata enrichment requires `TMDB_API_KEY` to be set in the environ
## Version History

### 0.2.4

- lightweight CLI commands now stay import-light via lazy runtime loading
- setup/config templating moved to `assets/ffx.json.j2`
- aligned two-step local setup wrappers: `ffx setup` and `ffx configure_workstation`
- combined `ffprobe` payload reuse in `FileProperties`
- configurable crop-detect sampling plus per-process crop result caching
- single-query controller accessors and conditional DB schema bootstrap
- shared screen bootstrap/controller wiring for large detail screens
- configurable default season/episode digit lengths
- digit-aware `rename` and padded `unmux` filename markers

### 0.2.3

- PyPI packaging
SCRATCHPAD.md
@@ -9,19 +9,12 @@
- The biggest near-term wins are in startup cost, repeated subprocess work, repeated database query patterns, and general repo hygiene.
- This list is intentionally optimization-oriented rather than bug-oriented. Some items below also improve correctness or maintainability, but they were selected because they can reduce runtime cost, operator friction, or iteration overhead.
- A first modern integration slice now exists under [`tests/integration/subtrack_mapping`](/home/osgw/.local/src/codex/ffx/tests/integration/subtrack_mapping). Remaining test-suite cleanup is now mostly about migrating and shrinking the legacy harness surface under [`tests/legacy`](/home/osgw/.local/src/codex/ffx/tests/legacy).
- The CLI root now lazy-loads heavy runtime dependencies so lightweight commands such as `version`, `help`, `setup`, `configure_workstation`, and `upgrade` stay import-light.
- Shared CLI defaults for container/output tokens now live outside [`src/ffx/ffx_controller.py`](/home/osgw/.local/src/codex/ffx/src/ffx/ffx_controller.py), and a focused unit test locks in the lazy-import contract.
- `FileProperties` now uses one cached `ffprobe -show_format -show_streams -of json` call per source file, and the combined payload was confirmed against the Dragonball asset to satisfy both previous probe call sites.
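The combined-probe idea can be sketched like this; the function names are illustrative, though the `ffprobe` flags themselves are the real ones named above.

```python
import json
import subprocess
from functools import lru_cache

def build_probe_command(source_path):
    """One combined ffprobe invocation replacing separate format/stream probes."""
    return [
        "ffprobe", "-v", "error",
        "-show_format", "-show_streams",
        "-of", "json",
        source_path,
    ]

@lru_cache(maxsize=None)
def probe(source_path):
    """Run ffprobe at most once per source path in this process; cache the payload."""
    raw = subprocess.run(
        build_probe_command(source_path),
        capture_output=True, check=True, text=True,
    ).stdout
    return json.loads(raw)
```

Both former call sites (format metadata and stream listing) can then read from the single cached payload.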
- Crop detection now uses configurable sampling windows plus per-process caching keyed by source file and sampling range, and the `cropdetect` CLI command now calls the real `FileProperties.findCropArguments()` path.
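A minimal sketch of that cache shape, with assumed names and the 60/180-second defaults mentioned in the requirements; `detector` stands in for the real ffmpeg `cropdetect` invocation.

```python
# Per-process cache of crop-detect results, keyed by source file plus the
# configurable sampling window (seek offset and analysis duration).
_CROP_CACHE = {}

def find_crop_arguments(source_path, seek_seconds=60, duration_seconds=180,
                        detector=None):
    """Return cached crop arguments for one (source, sampling window) key."""
    key = (source_path, seek_seconds, duration_seconds)
    if key not in _CROP_CACHE:
        # 'detector' is a stand-in for running ffmpeg's cropdetect filter.
        _CROP_CACHE[key] = detector(source_path, seek_seconds, duration_seconds)
    return _CROP_CACHE[key]
```

Changing either sampling parameter produces a different key, so a widened analysis window is re-detected rather than served stale.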
- Database startup now bootstraps schema only when required tables are actually missing, while version enforcement still runs on ordinary DB-backed context creation.
- Helper filename and rich-text utilities now use compiled raw regexes plus translate-based filename filtering, with unit coverage for TMDB suffix rewriting and Rich color stripping.
- Process resource limiting now has explicit disabled/default states in the CLI and requirements, and combined CPU-plus-niceness wrapping now executes as `cpulimit -- nice -n ... <command>` instead of a less explicit prefix chain.
- FFX logger setup now reuses named handlers, and fallback logger access no longer mutates handlers in ordinary constructors and helpers.
- The process wrapper now uses `subprocess.run(...)` with centralized command formatting plus stable timeout and missing-command error mapping.
- Active ORM controllers now use single-query accessors instead of paired `count()` plus `first()` lookups.
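The single-query accessor change can be illustrated with a small sketch; the accessor name and the `LookupError` are illustrative, not the project's actual API.

```python
# Instead of query.count() followed by query.first() (two database round
# trips), fetch once and branch on None.
def get_one_or_raise(query):
    """query stands in for an ORM query already filtered to the target row."""
    row = query.first()          # single round trip
    if row is None:
        raise LookupError("no matching row")
    return row
```

The `count()`-then-`first()` form is also racy under concurrent writes, so collapsing it into one fetch improves correctness as well as cost.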
- Pattern matching now uses cached compiled regexes plus explicit duplicate-match errors, and pattern creation flows no longer persist zero-track patterns.
- The two-step local setup flow now has aligned CLI wrappers for both phases: `ffx setup` for bundle prep and `ffx configure_workstation` for workstation prep, while the shell scripts remain the bootstrap entrypoints before the bundle exists.
- The large detail screens now share one screen-bootstrap helper for context, metadata-filter extraction, and controller wiring, and show-pattern loading now goes through `PatternController` instead of a screen-local session query.

## Focused Snapshot
@@ -84,6 +77,52 @@
3. Continue replacing oversized legacy test matrices with focused modern integration and unit coverage.
4. Triage the legacy `Scenario 4` pattern/track failure and decide whether to fix the harness, adapt it to the zero-track guard, or retire that path during the ongoing test-suite migration.

## Shifted Season Status (2026-04-12)

- Current assessment:
  - The shifted-season subsystem is present end to end and looks feature-complete in shape, but it is not yet hardened.
  - The storage, TUI CRUD surface, and CLI/TMDB filename application path all exist, so this is no longer a stubbed or half-started area.
  - The main gap is correctness and direct verification rather than missing surface area.

- Implemented surface confirmed:
  - Requirements still treat shifted seasons as part of the accepted product surface in [`requirements/project.md`](/home/osgw/.local/src/codex/ffx/requirements/project.md) and [`requirements/architecture.md`](/home/osgw/.local/src/codex/ffx/requirements/architecture.md).
  - Persistence exists via [`src/ffx/model/shifted_season.py`](/home/osgw/.local/src/codex/ffx/src/ffx/model/shifted_season.py) plus the `Show.shifted_seasons` relationship in [`src/ffx/model/show.py`](/home/osgw/.local/src/codex/ffx/src/ffx/model/show.py).
  - CRUD logic exists in [`src/ffx/shifted_season_controller.py`](/home/osgw/.local/src/codex/ffx/src/ffx/shifted_season_controller.py).
  - Textual add/edit/delete flows are wired through [`src/ffx/shifted_season_details_screen.py`](/home/osgw/.local/src/codex/ffx/src/ffx/shifted_season_details_screen.py), [`src/ffx/shifted_season_delete_screen.py`](/home/osgw/.local/src/codex/ffx/src/ffx/shifted_season_delete_screen.py), and the show details table in [`src/ffx/show_details_screen.py`](/home/osgw/.local/src/codex/ffx/src/ffx/show_details_screen.py).
  - CLI conversion applies season shifts before TMDB lookup and output suffix generation in [`src/ffx/cli.py`](/home/osgw/.local/src/codex/ffx/src/ffx/cli.py).

- Verified current behavior:
  - `~/.local/share/ffx.venv/bin/python -m unittest discover -s tests/unit -p 'test_*.py'` passed on 2026-04-12: `75` tests in `0.795s`.
  - That run emitted `ResourceWarning` messages for unclosed SQLite connections, so the suite is green but not perfectly clean.
  - There is almost no direct shifted-season coverage in the modern tests:
    - [`tests/unit/test_cli_rename_only.py`](/home/osgw/.local/src/codex/ffx/tests/unit/test_cli_rename_only.py) stubs `ShiftedSeasonController` rather than exercising it.
    - [`tests/unit/test_screen_support.py`](/home/osgw/.local/src/codex/ffx/tests/unit/test_screen_support.py) only verifies controller bootstrap wiring.
  - Net effect: the subsystem is integrated, but its core rules are effectively untested by the current modern suite.
- Reproduced correctness gaps:
  - Overlap validation is broken in [`src/ffx/shifted_season_controller.py:41`](/home/osgw/.local/src/codex/ffx/src/ffx/shifted_season_controller.py:41) because `getOriginalSeason` is compared as a method object instead of being called.
  - Reproduction on 2026-04-12 with a temp SQLite DB:
    - Added `S1 E1-E10`.
    - `checkShiftedSeason(...)` incorrectly returned `True` for overlapping `S1 E5-E15`.
    - `addShiftedSeason(...)` then stored the overlapping row successfully.
  - `updateShiftedSeason(...)` in [`src/ffx/shifted_season_controller.py:93`](/home/osgw/.local/src/codex/ffx/src/ffx/shifted_season_controller.py:93) does not enforce episode ordering, so an invalid range like `first_episode=20`, `last_episode=10` was accepted in the same reproduction.
  - Because [`src/ffx/shifted_season_controller.py:213`](/home/osgw/.local/src/codex/ffx/src/ffx/shifted_season_controller.py:213) returns the first matching sibling and [`src/ffx/shifted_season_controller.py:163`](/home/osgw/.local/src/codex/ffx/src/ffx/shifted_season_controller.py:163) applies no explicit ordering, overlapping rows would also make runtime shifting ambiguous.
- Progress summary:
  - Good progress:
    - The subsystem exists across requirements, schema, UI, and conversion flow.
    - It appears fully integrated into the show-editing workflow rather than parked as dead code.
  - Incomplete progress:
    - Validation logic is not trustworthy yet.
    - Modern tests do not currently protect the subsystem's real behavior.
    - User-facing error feedback in the shifted-season screens still has placeholder `#TODO: Meldung` branches.

- Recommended next slice:
  1. Add direct controller tests for overlap rejection, episode-order validation, and `shiftSeason(...)` selection behavior.
  2. Fix `checkShiftedSeason(...)` and add the same range/order validation to `updateShiftedSeason(...)`.
  3. Make sibling selection deterministic or enforce non-overlap strongly enough that ordering no longer matters in practice.
  4. Add at least one focused integration test that proves a stored shifted season changes TMDB lookup and/or generated filename numbering during conversion.

## Delete When

- Delete this scratchpad once the optimization backlog is either converted into issues/work items or distilled into durable project guidance.
36
assets/ffx.json.j2
Normal file
@@ -0,0 +1,36 @@
{
    "databasePath": {{ database_path_json }},
    "logDirectory": {{ log_directory_json }},
    "subtitlesDirectory": {{ subtitles_directory_json }},
    "defaultIndexSeasonDigits": {{ default_index_season_digits }},
    "defaultIndexEpisodeDigits": {{ default_index_episode_digits }},
    "defaultIndicatorSeasonDigits": {{ default_indicator_season_digits }},
    "defaultIndicatorEpisodeDigits": {{ default_indicator_episode_digits }},
    "metadata": {
        "signature": {
            "RECODED_WITH": "FFX"
        },
        "remove": [
            "VERSION-eng",
            "creation_time",
            "NAME"
        ],
        "streams": {
            "remove": [
                "BPS",
                "NUMBER_OF_FRAMES",
                "NUMBER_OF_BYTES",
                "_STATISTICS_WRITING_APP",
                "_STATISTICS_WRITING_DATE_UTC",
                "_STATISTICS_TAGS",
                "BPS-eng",
                "DURATION-eng",
                "NUMBER_OF_FRAMES-eng",
                "NUMBER_OF_BYTES-eng",
                "_STATISTICS_WRITING_APP-eng",
                "_STATISTICS_WRITING_DATE_UTC-eng",
                "_STATISTICS_TAGS-eng"
            ]
        }
    }
}
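The template above must render to valid JSON, with string-valued variables pre-encoded (hence the `_json` suffix on the path variables). The project presumably renders it with Jinja2; the sketch below uses a dependency-free stand-in renderer that only substitutes simple `{{ name }}` placeholders, just to demonstrate the contract.

```python
import json
import re

def render_template(template_text, context):
    """Naive {{ name }} substitution; a stand-in for a real Jinja2 render."""
    return re.sub(
        r"\{\{\s*(\w+)\s*\}\}",
        lambda m: str(context[m.group(1)]),
        template_text,
    )

template = ('{"databasePath": {{ database_path_json }}, '
            '"defaultIndexSeasonDigits": {{ default_index_season_digits }}}')
rendered = render_template(template, {
    # json.dumps supplies the surrounding quotes and escaping for strings.
    "database_path_json": json.dumps("~/.local/var/ffx/ffx.db"),
    "default_index_season_digits": 2,
})
config = json.loads(rendered)  # the rendered output must parse as JSON
```

Feeding the rendered text back through `json.loads` is a cheap setup-time sanity check that a template edit has not broken the config format.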
pyproject.toml
@@ -1,7 +1,7 @@
[project]
name = "ffx"
description = "FFX recoding and metadata managing tool"
-version = "0.2.3"
+version = "0.2.4"
license = {file = "LICENSE.md"}
dependencies = [
    "requests",
requirements/architecture.md
@@ -41,6 +41,7 @@
- File inspection caches combined `ffprobe` data and crop-detection results per source and sampling window within one process to avoid repeated subprocess work.
- Storage:
  - SQLite via SQLAlchemy ORM, with schema rooted in shows, patterns, tracks, media tags, track tags, shifted seasons, and generic properties.
  - Ordered schema migrations are loaded dynamically from per-version-step modules under [`src/ffx/model/migration/`](/home/osgw/.local/src/codex/ffx/src/ffx/model/migration/).
  - A configuration JSON file supplies optional path, metadata-filtering, and filename-template settings.
- Integration adapters:
  - Process execution wrapper for `ffmpeg`, `ffprobe`, `nice`, and `cpulimit`, with explicit disabled states for niceness and CPU limiting, support for both absolute `cpulimit` values and machine-wide percent input, and a combined `cpulimit -- nice -n ... <command>` execution shape when both limits are configured.
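The combined execution shape described above can be sketched as a small argv builder. The `-l` flag for `cpulimit` and the `-n` flag for `nice` are the standard ones, but the real wrapper's option handling (including percent-of-CPUs conversion) is assumed, not shown.

```python
def wrap_command(command, cpu_limit=None, niceness=None):
    """Wrap a media command so cpulimit encloses nice, which encloses the command."""
    wrapped = list(command)
    if niceness is not None:
        # Inner layer: adjust scheduling priority of the launched process.
        wrapped = ["nice", "-n", str(niceness)] + wrapped
    if cpu_limit is not None:
        # Outer layer: cap CPU usage; '--' separates cpulimit's own options
        # from the command it launches.
        wrapped = ["cpulimit", "-l", str(cpu_limit), "--"] + wrapped
    return wrapped
```

Ordering matters: with `cpulimit` outermost, both the niceness and the CPU cap apply to the actual `ffmpeg`/`ffprobe` process rather than only to the wrapper.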
@@ -49,11 +50,11 @@
## Data And Interface Notes

- Key entities or records:
-  - `Show`: canonical TV show metadata plus digit-formatting rules for generated filenames.
+  - `Show`: canonical TV show metadata plus digit-formatting rules, optional show-level notes, and an optional show-level encoding-quality fallback.
  - `Pattern`: regex rule tying filenames to one show and one target media schema.
  - `Track` and `TrackTag`: persisted target stream records, codec, dispositions, audio layout, and stream-level tags. Detailed source-to-target mapping rules live in `requirements/subtrack_mapping.md`.
  - `MediaTag`: persisted container-level metadata for a pattern.
-  - `ShiftedSeason`: mapping from source numbering ranges to adjusted season and episode numbers.
+  - `ShiftedSeason`: mapping from source numbering ranges to adjusted season and episode numbers, owned either by a show as fallback or by a pattern as override.
  - `Property`: internal key-value storage currently used for database versioning.
- External interfaces:
  - CLI commands for conversion, inspection, extraction, and crop detection.
@@ -62,9 +63,9 @@
- Config keys `databasePath`, `logDirectory`, and `outputFilenameTemplate`, plus optional metadata-filter rules.
- Validation rules:
  - Only supported media-file extensions are accepted for conversion.
-  - Stored database version must match the runtime-required version.
+  - Stored database version must either match the runtime-required version already or have a supported sequential migration path to it.
  - A normalized descriptor may have at most one default and one forced stream per relevant track type.
-  - Shifted-season ranges are intended not to overlap for the same show and season.
+  - Shifted-season ranges are intended not to overlap within the same owner scope and season, and runtime resolution prefers pattern-owned matches over show-owned matches.
  - TMDB lookups require a show ID and season and episode numbers.
- Error-handling approach:
  - User-facing operational failures are raised as `click.ClickException` or warnings.
requirements/project.md
@@ -18,7 +18,7 @@
- Inspect existing media files through `ffprobe` and compare discovered stream metadata with stored normalization rules.
- Convert media files through `ffmpeg` into a normalized output layout, including video recoding, audio transcoding to Opus, metadata cleanup and rewrite, and controlled disposition flags.
- Build output filenames from detected or configured show, season, and episode information, optionally enriched from TMDB and a configurable Jinja-style filename template.
-- Support auxiliary file operations such as subtitle import, unmuxing, crop detection, and rename-only runs.
+- Support auxiliary file operations such as subtitle import, unmuxing, crop detection, rename-only conversion runs, and direct in-place episode renaming.
- Supported environments:
  - Local execution on a Python-capable workstation.
  - Best-supported on Linux-like systems because the implementation assumes `~/.local`, `/dev/null`, `nice`, and `cpulimit`.
@@ -31,11 +31,11 @@
- As an operator, I want to inspect a file before conversion so that I can compare its actual streams and tags against the stored target schema.
- As a user preparing web-playback files, I want to recode video and audio with a small set of predictable options so that results are compatible and consistently named.
- As a user dealing with nonstandard releases, I want CLI overrides for language, title, stream order, default and forced tracks, and season and episode data so that one-off fixes do not require database edits first.
-- As a user importing anime or other shifted numbering schemes, I want season and episode offsets per show so that generated filenames align with TMDB and media-library expectations.
+- As a user importing anime or other shifted numbering schemes, I want season and episode offsets at the show level with optional pattern-specific overrides so that generated filenames align with TMDB and media-library expectations.

## Functional Requirements

-- The system shall provide a CLI entrypoint named `ffx` with commands for `convert`, `inspect`, `shows`, `unmux`, `cropdetect`, `setup`, `configure_workstation`, `upgrade`, `version`, and `help`.
+- The system shall provide a CLI entrypoint named `ffx` with commands for `convert`, `inspect`, `shows`, `rename`, `unmux`, `cropdetect`, `setup`, `configure_workstation`, `upgrade`, `version`, and `help`.
- The system shall support a two-step local installation and preparation flow:
  - `tools/setup.sh` is the bootstrap entrypoint for the first step and shall own bundle virtualenv creation, package installation, shell alias exposure, and optional Python test-package installation.
  - `tools/configure_workstation.sh` is the bootstrap entrypoint for the second step and shall own workstation dependency checks and installation plus local config and directory seeding.
@@ -44,11 +44,16 @@
- The CLI command `ffx configure_workstation` shall act as a wrapper for the second-step preparation flow in `tools/configure_workstation.sh`.
- The system shall persist reusable normalization rules in SQLite for:
  - shows and show formatting digits,
+  - optional show-level notes,
+  - optional show-level quality defaults,
  - regex-based filename patterns,
  - per-pattern media tags,
  - per-pattern stream definitions,
-  - shifted-season mappings,
+  - show-level and pattern-level shifted-season mappings,
  - internal database version properties.
- The system shall apply supported ordered database migrations automatically when opening an older local database file and shall fail fast when no supported path exists.
- Before applying a required database migration, the system shall show the current version, target version, required sequential steps, and whether each corresponding migration module is present, then require user confirmation.
- Before applying a confirmed file-backed database migration, the system shall create an in-place backup copy whose filename includes the covered version range.
- Detailed show, pattern, and duplicate-match management rules live in `requirements/pattern_management.md`.
- The system shall inspect source media using `ffprobe` and derive a structured description of container metadata and streams.
- The system shall optionally open a Textual UI to browse shows, inspect files, and create, edit, or delete shows, patterns, stream definitions, tags, and shifted-season rules.
@@ -60,15 +65,18 @@
  - optional crop detection and crop application,
  - optional deinterlacing and denoising,
  - optional subtitle import from external files,
-  - rename-only copy mode.
+  - rename-only move mode.
- The system shall support optional TMDB lookups to resolve show names, years, and episode titles when a show ID, season, and episode are available.
- The system shall generate output filenames from show metadata, season and episode indices, and episode names using the configured filename template.
- The system shall allow CLI overrides for stream languages, stream titles, default and forced tracks, stream order, TMDB show and episode data, output directory, label prefix, and processing resource limits.
- The system shall resolve encoding quality by precedence `CLI override -> pattern -> show -> encoder default` and shall report the chosen value and source.
- The system shall resolve season shifting by precedence `pattern -> show -> identity default` and shall report the chosen mapping and source.
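Both precedence requirements above share the same shape: walk the layers in order, take the first configured value, and report which layer supplied it. A minimal sketch for the quality chain (function name and default of `23` are illustrative, not the project's actual API):

```python
def resolve_quality(cli_value=None, pattern_value=None, show_value=None,
                    encoder_default=23):
    """First configured layer wins; return (value, source) for reporting."""
    for source, value in (
        ("cli", cli_value),
        ("pattern", pattern_value),
        ("show", show_value),
    ):
        if value is not None:
            return value, source
    return encoder_default, "encoder default"
```

Returning the source alongside the value is what makes the required "report the chosen value and source" behavior cheap to implement, and the season-shifting chain (`pattern -> show -> identity default`) follows the same walk with different layers.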
- Processing resource limit rules:
|
||||
- `--nice` shall accept niceness values from `-20` through `19`; omitting the option shall disable niceness adjustment.
|
||||
- `--cpu` shall accept either a positive absolute `cpulimit` value such as `200`, or a percentage suffixed with `%` such as `25%` to represent a share of present CPUs; omitting the option or using `0` shall disable CPU limiting.
|
||||
- When both limits are configured, the process wrapper shall execute the target command through `cpulimit` around a `nice -n ...` invocation so both limits apply to the launched media command.
|
||||
- The system shall support extracting streams into separate files via `unmux` and reporting suggested crop parameters via `cropdetect`.
|
||||
- The system shall support in-place episode renaming via `rename`, requiring a `--prefix`, accepting optional `--season` and `--suffix` overrides, preserving the source extension, and supporting dry-run output without moving files.
|
||||
- Crop detection shall use a configurable sampling window, defaulting to a 60-second seek and a 180-second analysis duration, and repeated crop-detection requests for the same source plus sampling window shall reuse cached results within one process.
|
||||
- The system shall handle invalid input and system failures gracefully by logging warnings or raising `click` errors for missing files, invalid media, missing TMDB credentials, incompatible database versions, and ambiguous track dispositions when prompting is disabled.
|
||||
|
||||
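The digit-aware numbering behind `rename` and the generated filename tokens can be sketched in one line of formatting. The two-digit defaults here are illustrative only; the actual defaults come from the configurable `defaultIndex*Digits` settings.

```python
def format_episode_token(season, episode, season_digits=2, episode_digits=2):
    """Build a zero-padded SxxEyy token with configurable pad widths."""
    return f"S{season:0{season_digits}d}E{episode:0{episode_digits}d}"
```

Values wider than the configured pad are never truncated by `format`, so a three-digit episode simply produces a longer token.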
@@ -90,7 +98,7 @@
- Intended for local execution, not server deployment.
- Stores default state in `~/.local/etc/ffx.json`, `~/.local/var/ffx/ffx.db`, and `~/.local/var/log/ffx.log`.
- Timeline constraints:
-  - The current implemented scope reflects a compact alpha release stream up to version `0.2.3`.
+  - The current implemented scope reflects a compact alpha release stream up to version `0.2.4`.
- Team capacity assumptions:
  - Maintained as a small codebase where simple patterns and direct controller logic are preferred over framework-heavy abstractions.
- Third-party dependencies:
177
requirements/shifted_seasons_handling.md
Normal file
@@ -0,0 +1,177 @@
# Shifted Seasons Handling

This file defines the behavioral contract for mapping source season and episode numbering to target season and episode numbering through stored shifted-season rules.

Primary sources:
- `requirements/project.md`
- `requirements/architecture.md`
- actual tool code in `src/ffx/`

Secondary source:
- `SCRATCHPAD.md`, used only to clarify current hardening gaps and not as the primary contract source.

## Scope

- Persisting shifted-season rules in SQLite.
- Allowing shifted-season rules to be attached either to a show or to a specific pattern.
- Selecting at most one active shifted-season rule for one concrete source season and episode tuple.
- Applying additive season and episode offsets to produce target numbering.
- Using shifted target numbering during `convert` for TMDB episode lookup and generated season and episode filename tokens.
- Managing show-level default mappings and pattern-level override mappings from the Textual editing workflows.

## Out Of Scope

- General filename parsing rules for detecting season and episode values.
- Standalone `rename` command behavior, which currently uses explicit rename inputs rather than stored shifted-season rules.
- Stream or track mapping behavior unrelated to season and episode numbering.

## Terms

- `shifted-season rule`: one persisted row describing how one source-numbering range maps to target numbering through additive offsets.
- `show-level shifted-season rule`: a rule attached directly to a show and used as the fallback mapping layer for that show.
- `pattern-level shifted-season rule`: a rule attached directly to a pattern and used as the override mapping layer for that pattern.
- `source numbering`: the season and episode values detected from the current source file or supplied as source-side conversion inputs before shifting.
- `target numbering`: the season and episode values after one active shifted-season rule has been applied.
- `original season`: the source-domain season number a shifted-season rule is eligible to match.
- `episode range`: the optional source-domain episode interval covered by one shifted-season rule.
- `open bound`: an unbounded start or end of the episode range. Current storage uses `-1` as the internal sentinel for an open bound.
- `active shifted-season rule`: the single rule selected for one concrete input after precedence resolution.
- `identity mapping`: the default `1:1` outcome where source numbering is used unchanged.
## Rules

- `SHIFTED_SEASONS_HANDLING-0001`: The domain model shall allow a shifted-season rule to be owned by exactly one of:
  - one show
  - one pattern
- `SHIFTED_SEASONS_HANDLING-0002`: A single shifted-season rule shall not belong to both a show and a pattern at the same time.
- `SHIFTED_SEASONS_HANDLING-0003`: A shifted-season rule shall carry these fields: `original_season`, `first_episode`, `last_episode`, `season_offset`, and `episode_offset`.
- `SHIFTED_SEASONS_HANDLING-0004`: `season_offset` and `episode_offset` shall be additive signed integers applied to matched source numbering to produce target numbering.
- `SHIFTED_SEASONS_HANDLING-0005`: A shifted-season rule shall match a source tuple only when:
  - the source season equals `original_season`
  - the source episode is greater than or equal to `first_episode` when the lower bound is closed
  - the source episode is less than or equal to `last_episode` when the upper bound is closed
- `SHIFTED_SEASONS_HANDLING-0006`: An open lower or upper episode bound shall represent an unbounded side of the covered source episode range.
- `SHIFTED_SEASONS_HANDLING-0007`: If one shifted-season rule matches, target numbering shall be:
  - `target season = source season + season_offset`
  - `target episode = source episode + episode_offset`
- `SHIFTED_SEASONS_HANDLING-0008`: If no shifted-season rule matches, source numbering shall pass through unchanged.
- `SHIFTED_SEASONS_HANDLING-0009`: Shifted-season handling shall operate in a source-to-target numbering model. Stored rules map detected source numbering to the target numbering used by conversion-facing metadata and output naming.
- `SHIFTED_SEASONS_HANDLING-0010`: Pattern matching identifies the owning show and optionally a more specific owning pattern. Resolution of the active shifted-season rule shall use this precedence order:
  - matching pattern-level rule
  - matching show-level rule
  - identity mapping
- `SHIFTED_SEASONS_HANDLING-0011`: At most one shifted-season rule may be active for one concrete source season and episode tuple. Shifted-season rules shall never stack or compose.
- `SHIFTED_SEASONS_HANDLING-0012`: Within one owner scope, shifted-season rules shall not overlap in their effective episode coverage for the same `original_season`.
- `SHIFTED_SEASONS_HANDLING-0013`: If a shifted-season rule uses two closed episode bounds, `last_episode` shall be greater than or equal to `first_episode`.
- `SHIFTED_SEASONS_HANDLING-0014`: Shifted-season rule evaluation shall be deterministic. Released behavior shall not depend on arbitrary database row order when invalid overlapping rules exist.
- `SHIFTED_SEASONS_HANDLING-0015`: A pattern-level rule is permitted to map to zero offsets. Such a rule is a valid explicit override that beats show-level fallback and produces identity mapping for its covered source range.
- `SHIFTED_SEASONS_HANDLING-0016`: During `convert`, when show, season, and episode values are available and stored shifting is active, the shifted target numbering shall drive:
  - TMDB episode lookup
  - season and episode filename tokens such as `S01E02`
  - generated episode basenames that include season and episode numbering
- `SHIFTED_SEASONS_HANDLING-0017`: When conversion is supplied explicit target-domain season or episode values for TMDB naming, the system shall not apply stored shifting on top of those already-targeted values.
- `SHIFTED_SEASONS_HANDLING-0018`: Operator-facing editing shall expose shifted-season rule management in both of these places:
  - show editing for show-level default mappings
  - pattern editing for pattern-level override mappings
- `SHIFTED_SEASONS_HANDLING-0019`: User-facing shifted-season editing should present open episode bounds as a natural empty-state input rather than forcing operators to type the internal sentinel directly.
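Rules 0005 through 0010 can be captured in a compact reference sketch. This is not the project's implementation (the real rules are ORM rows, not dicts), but it demonstrates the matching semantics, the `-1` open-bound sentinel, the additive offsets, and the pattern-over-show precedence with identity fallback.

```python
OPEN_BOUND = -1  # internal sentinel for an open episode bound (see Terms)

def rule_matches(rule, season, episode):
    """0005/0006: season must match; each closed bound constrains the episode."""
    if season != rule["original_season"]:
        return False
    if rule["first_episode"] != OPEN_BOUND and episode < rule["first_episode"]:
        return False
    if rule["last_episode"] != OPEN_BOUND and episode > rule["last_episode"]:
        return False
    return True

def apply_shift(season, episode, pattern_rules, show_rules):
    """0007/0008/0010: one active rule, pattern over show, identity fallback."""
    for rules in (pattern_rules, show_rules):
        for rule in rules:
            if rule_matches(rule, season, episode):
                return (season + rule["season_offset"],
                        episode + rule["episode_offset"])
    return season, episode  # identity mapping
```

Note that the first match within a scope wins, which is only deterministic (rule 0014) when the non-overlap invariant of rule 0012 actually holds.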
## Acceptance

- A show can exist with zero or more show-level shifted-season rules.
- A pattern can exist with zero or more pattern-level shifted-season rules.
- A shifted-season rule is stored against exactly one owner scope.
- A source tuple matching a pattern-level rule yields target numbering from that rule even when a matching show-level rule also exists.
- A source tuple matching no pattern-level rule but matching a show-level rule yields target numbering from the show-level rule.
- A source tuple matching neither scope yields identity mapping.
- A pattern-level zero-offset rule can explicitly override a nonzero show-level rule for the same covered source range.
- Two shifted-season rules for the same owner scope and original season cannot both be valid if they cover overlapping episode ranges.
- During `convert`, shifted numbering is what TMDB episode lookup and generated season and episode tokens see when stored shifting is active.
- The TUI can display and maintain shifted-season rules from both the show and pattern editing flows.
## Current Code Fit
|
||||
|
||||
- `src/ffx/model/show.py` and `src/ffx/model/pattern.py` now both expose
|
||||
shifted-season relationships, and `src/ffx/model/shifted_season.py` stores
|
||||
each rule against exactly one owner scope through `show_id` or `pattern_id`.
|
||||
- `src/ffx/shifted_season_controller.py` now resolves mappings with
|
||||
pattern-over-show precedence and applies at most one active rule for a source
|
||||
tuple.
|
||||
- `src/ffx/show_details_screen.py`,
|
||||
`src/ffx/shifted_season_details_screen.py`, and
|
||||
`src/ffx/shifted_season_delete_screen.py` provide reusable shifted-season
|
||||
editing dialogs, and `src/ffx/pattern_details_screen.py` now exposes the
|
||||
pattern-level override flow.
|
||||
- `src/ffx/cli.py` now resolves shifted numbering during `convert` from:
|
||||
pattern-level match, then show-level match, then identity mapping.
|
||||
- `src/ffx/database.py` now migrates version-2 databases to version 3 by
|
||||
preserving existing show-level rows and extending the schema for pattern-level
|
||||
ownership.
|
||||
|
||||
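The version-2-to-3 migration described above amounts to a small SQLite schema extension. This is an illustrative sketch, not the real ffx migration module; the table name `shifted_seasons` is an assumption. Existing show-owned rows keep their `show_id`, and a nullable `pattern_id` column is added for pattern-level ownership.

```python
import sqlite3

# Illustrative v2 -> v3 migration sketch (table name is assumed): keep
# show-owned rows and add a nullable pattern_id column for pattern ownership.
def migrateShiftedSeasonsV2ToV3(connection: sqlite3.Connection) -> None:
    columns = {row[1] for row in connection.execute("PRAGMA table_info(shifted_seasons)")}
    if 'pattern_id' not in columns:
        connection.execute("ALTER TABLE shifted_seasons ADD COLUMN pattern_id INTEGER")
    connection.commit()

connection = sqlite3.connect(':memory:')
connection.execute("CREATE TABLE shifted_seasons (id INTEGER PRIMARY KEY, show_id INTEGER)")
connection.execute("INSERT INTO shifted_seasons (show_id) VALUES (7)")
migrateShiftedSeasonsV2ToV3(connection)
rows = connection.execute("SELECT show_id, pattern_id FROM shifted_seasons").fetchall()
# the pre-existing show-level row survives with pattern_id NULL
```

Because `ADD COLUMN` without a `DEFAULT` leaves existing rows at `NULL`, the show-level fallback layer is preserved without rewriting any rows.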
## Risks

- The current CLI groups `--show`, `--season`, and `--episode` under one
  override bucket used for TMDB-related behavior. Source-domain versus
  target-domain semantics of each override must stay documented clearly so
  stored shifting is neither skipped nor double-applied unexpectedly.
- Existing version-2 databases only contain show-owned shifted-season rows, so a
  version-3 migration must preserve those rows as the show-level fallback layer.
- Current modern automated test coverage for shifted-season behavior is light,
  so precedence, migration, and convert-time numbering behavior need focused
  tests.
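The first risk has a concrete shape in the `convert` changes: stored shifting is bypassed whenever the operator supplies an explicit target season or episode, while a bare `--show` override does not disable it. A minimal standalone sketch of that guard, with `cliOverrides` mirroring the override-bucket shape used in `src/ffx/cli.py`:

```python
# Sketch of the convert-time guard: explicit --season/--episode overrides
# suppress stored shifting; a --show override alone does not.
def shouldApplyStoredShift(cliOverrides: dict, showSeason: int, showEpisode: int) -> bool:
    hasExplicitTargetSeasonOrEpisode = (
        'tmdb' in cliOverrides
        and ('season' in cliOverrides['tmdb'] or 'episode' in cliOverrides['tmdb'])
    )
    return (not hasExplicitTargetSeasonOrEpisode
            and showSeason != -1          # -1 means "not set"
            and showEpisode != -1)
```

Keeping this predicate in one place is what prevents shifting from being skipped or double-applied when override semantics evolve.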
227 src/ffx/cli.py
@@ -33,7 +33,7 @@ if TYPE_CHECKING:
     from ffx.media_descriptor import MediaDescriptor
     from ffx.track_descriptor import TrackDescriptor

-LIGHTWEIGHT_COMMANDS = {None, 'version', 'help', 'setup', 'configure_workstation', 'upgrade'}
+LIGHTWEIGHT_COMMANDS = {None, 'version', 'help', 'setup', 'configure_workstation', 'upgrade', 'rename'}
 CPU_OPTION_HELP = (
     "Limit CPU for started processes. Use an absolute cpulimit value such as 200 "
     + "(about 2 cores), or use a percentage such as 25% for a share of present cores. "
@@ -185,6 +185,67 @@ def resolveUnmuxOutputDirectory(context, outputDirectory, subtitlesOnly, label):
     return os.path.join(configuredSubtitlesBaseDirectory, resolvedLabel), True


+def resolveIndicatorDigitLengths(context=None, showDescriptor=None):
+    from ffx.show_descriptor import ShowDescriptor
+
+    defaultDigitLengths = ShowDescriptor.getDefaultDigitLengths(context)
+    if showDescriptor is None:
+        return (
+            defaultDigitLengths[ShowDescriptor.INDICATOR_SEASON_DIGITS_KEY],
+            defaultDigitLengths[ShowDescriptor.INDICATOR_EPISODE_DIGITS_KEY],
+        )
+
+    return (
+        int(showDescriptor.getIndicatorSeasonDigits()),
+        int(showDescriptor.getIndicatorEpisodeDigits()),
+    )
+
+
+def buildRenameTargetFilename(
+    sourcePath,
+    prefix,
+    seasonOverride=None,
+    suffix='',
+    indicatorSeasonDigits=None,
+    indicatorEpisodeDigits=None,
+):
+    from ffx.file_properties import FileProperties
+    from ffx.show_descriptor import ShowDescriptor
+
+    sourceFilename = os.path.basename(sourcePath)
+    seasonEpisodeValues = FileProperties.extractSeasonEpisodeValues(sourceFilename)
+    if seasonEpisodeValues is None:
+        return None
+
+    sourceSeason, sourceEpisode = seasonEpisodeValues
+    resolvedSeason = int(seasonOverride) if seasonOverride is not None else (
+        int(sourceSeason) if sourceSeason is not None else 1
+    )
+    resolvedIndicatorSeasonDigits = (
+        int(indicatorSeasonDigits)
+        if indicatorSeasonDigits is not None
+        else ShowDescriptor.DEFAULT_INDICATOR_SEASON_DIGITS
+    )
+    resolvedIndicatorEpisodeDigits = (
+        int(indicatorEpisodeDigits)
+        if indicatorEpisodeDigits is not None
+        else ShowDescriptor.DEFAULT_INDICATOR_EPISODE_DIGITS
+    )
+
+    _sourceBasename, sourceExtension = os.path.splitext(sourceFilename)
+
+    targetFilenameTokens = [
+        str(prefix).strip(),
+        f"s{resolvedSeason:0{resolvedIndicatorSeasonDigits}d}e{int(sourceEpisode):0{resolvedIndicatorEpisodeDigits}d}",
+    ]
+
+    resolvedSuffix = str(suffix).strip()
+    if resolvedSuffix:
+        targetFilenameTokens.append(resolvedSuffix)
+
+    return f"{'_'.join(targetFilenameTokens)}{sourceExtension}"
+
+

 @click.group()
 @click.pass_context
@@ -242,7 +303,7 @@ def version():
 def help():
     click.echo(f"ffx {VERSION}\n")
     click.echo("Maintenance commands: setup, configure_workstation, upgrade")
-    click.echo("Media commands: shows, inspect, convert, unmux, cropdetect")
+    click.echo("Media commands: shows, inspect, convert, rename, unmux, cropdetect")
     click.echo("Use 'ffx --help' or 'ffx <command> --help' for full command help.")

@@ -375,10 +436,14 @@ def upgrade(ctx, branch):
     commandSequences.append(['git', 'reset', '--hard', 'HEAD'])

     if branch:
-        commandSequences.append(['git', 'checkout', branch])
+        commandSequences += [
+            ['git', 'fetch', 'origin', branch],
+            ['git', 'checkout', '-B', branch, 'FETCH_HEAD'],
+        ]
+    else:
+        commandSequences.append(['git', 'pull'])

     commandSequences += [
-        ['git', 'pull'],
         [bundlePipPath, 'install', '--upgrade', 'pip', 'setuptools', 'wheel'],
         [bundlePipPath, 'install', '--editable', '.'],
     ]
@@ -408,6 +473,62 @@ def inspect(ctx, filename):
     app.run()


+@ffx.command()
+@click.pass_context
+@click.argument('paths', nargs=-1)
+@click.option('--prefix', type=str, required=True, help='Required target filename prefix')
+@click.option('--season', type=int, default=None, help='Override target season index')
+@click.option('--suffix', type=str, default='', help='Optional target filename suffix')
+@click.option('--dry-run', is_flag=True, default=False, help='Only print planned renames')
+def rename(ctx, paths, prefix, season, suffix, dry_run):
+    """Rename matching episode files in place."""
+    from ffx.configuration_controller import ConfigurationController
+
+    resolvedPrefix = str(prefix).strip()
+    resolvedSuffix = str(suffix).strip()
+    effectiveDryRun = bool(ctx.obj.get('dry_run', False) or dry_run)
+    renameContext = {
+        'config': ctx.obj.get('config') or ConfigurationController(),
+    }
+    indicatorSeasonDigits, indicatorEpisodeDigits = resolveIndicatorDigitLengths(renameContext)
+
+    if not resolvedPrefix:
+        raise click.ClickException("Rename prefix must not be empty.")
+
+    processedCount = 0
+
+    for sourcePath in paths:
+        if not os.path.isfile(sourcePath):
+            continue
+
+        targetFilename = buildRenameTargetFilename(
+            sourcePath,
+            resolvedPrefix,
+            seasonOverride=season,
+            suffix=resolvedSuffix,
+            indicatorSeasonDigits=indicatorSeasonDigits,
+            indicatorEpisodeDigits=indicatorEpisodeDigits,
+        )
+        if targetFilename is None:
+            continue
+
+        sourceFilename = os.path.basename(sourcePath)
+        targetPath = os.path.join(os.path.dirname(sourcePath), targetFilename)
+        click.echo(f"{sourceFilename} -> {targetFilename}")
+        processedCount += 1
+
+        if effectiveDryRun or os.path.abspath(sourcePath) == os.path.abspath(targetPath):
+            continue
+
+        if os.path.exists(targetPath):
+            raise click.ClickException(f"Target file already exists: {targetPath}")
+
+        shutil.move(sourcePath, targetPath)
+
+    if processedCount == 0:
+        click.echo("No matching files found.")
+
+
 def getUnmuxSequence(trackDescriptor: TrackDescriptor, sourcePath, targetPrefix, targetDirectory = ''):

     # executable and input file
@@ -468,6 +589,7 @@ def unmux(ctx,
           cpu):
     from ffx.file_properties import FileProperties
     from ffx.process import executeProcess
+    from ffx.shifted_season_controller import ShiftedSeasonController
     from ffx.track_disposition import TrackDisposition
     from ffx.track_type import TrackType

@@ -488,6 +610,8 @@ def unmux(ctx,
     if create_output_directory and existingSourcePaths and not ctx.obj.get('dry_run', False):
         os.makedirs(output_directory, exist_ok=True)

+    shiftedSeasonController = ShiftedSeasonController(ctx.obj)
+
     for sourcePath in existingSourcePaths:

         fp = FileProperties(ctx.obj, sourcePath)
@@ -495,13 +619,29 @@ def unmux(ctx,

         try:
             sourceMediaDescriptor = fp.getMediaDescriptor()
+            currentPattern = fp.getPattern()
+            currentShowDescriptor = (
+                currentPattern.getShowDescriptor(ctx.obj) if currentPattern is not None else None
+            )
+            indicatorSeasonDigits, indicatorEpisodeDigits = resolveIndicatorDigitLengths(
+                ctx.obj,
+                currentShowDescriptor,
+            )

-            season = fp.getSeason()
-            episode = fp.getEpisode()
+            season, episode = shiftedSeasonController.shiftSeason(
+                fp.getShowId(),
+                season=fp.getSeason(),
+                episode=fp.getEpisode(),
+                patternId=currentPattern.getId() if currentPattern is not None else None,
+            )

             #TODO: Adapt recognition for all formats
             targetLabel = label if label else fp.getFileBasename()
-            targetIndicator = f"_S{season}E{episode}" if label and season != -1 and episode != -1 else ''
+            targetIndicator = (
+                f"_S{season:0{indicatorSeasonDigits}d}E{episode:0{indicatorEpisodeDigits}d}"
+                if label and season != -1 and episode != -1
+                else ''
+            )

             if label and not targetIndicator:
                 ctx.obj['logger'].warning(f"Skipping file {fp.getFilename()}: Label set but no indicator recognized")
@@ -764,7 +904,7 @@ def checkUniqueDispositions(context, mediaDescriptor: MediaDescriptor):
     help=CPU_OPTION_HELP,
 )

-@click.option('--rename-only', is_flag=True, default=False, help='Only renaming, no recoding')
+@click.option('--rename-only', is_flag=True, default=False, help='Only renaming and moving, no recoding')

 def convert(ctx,
             paths,
@@ -837,6 +977,7 @@ def convert(ctx,
     from ffx.filter.quality_filter import QualityFilter
     from ffx.helper import filterFilename, getEpisodeFileBasename, substituteTmdbFilename
+    from ffx.shifted_season_controller import ShiftedSeasonController
     from ffx.show_controller import ShowController
     from ffx.show_descriptor import ShowDescriptor
     from ffx.tmdb_controller import TmdbController
     from ffx.track_codec import TrackCodec
@@ -1020,6 +1161,7 @@ def convert(ctx,
     ctx.obj['logger'].info(f"\nRunning {len(existingSourcePaths) * len(chainYield)} jobs")

     jobIndex = 0
+    showController = ShowController(context)

     for sourcePath in existingSourcePaths:

@@ -1051,8 +1193,8 @@ def convert(ctx,


         ssc = ShiftedSeasonController(context)

-        showId = mediaFileProperties.getShowId()
+        matchedShowId = mediaFileProperties.getShowId()

         #HINT: -1 if not set
         if 'tmdb' in cliOverrides.keys() and 'season' in cliOverrides['tmdb']:
@@ -1134,7 +1276,8 @@ def convert(ctx,
         targetMediaDescriptor.importSubtitles(context['subtitle_directory'],
                                               context['subtitle_prefix'],
                                               showSeason,
-                                              showEpisode)
+                                              showEpisode,
+                                              preserve_dispositions=True)

-        # ctx.obj['logger'].debug(f"tmd subindices: {[t.getIndex() for t in targetMediaDescriptor.getAllTrackDescriptors()]} {[t.getSubIndex() for t in targetMediaDescriptor.getAllTrackDescriptors()]} {[t.getDispositionFlag(TrackDisposition.DEFAULT) for t in targetMediaDescriptor.getAllTrackDescriptors()]}")
+        ctx.obj['logger'].debug(f"tmd subindices: {[t.getIndex() for t in targetMediaDescriptor.getTrackDescriptors()]} {[t.getSubIndex() for t in targetMediaDescriptor.getTrackDescriptors()]} {[t.getDispositionFlag(TrackDisposition.DEFAULT) for t in targetMediaDescriptor.getTrackDescriptors()]}")
@@ -1149,26 +1292,59 @@ def convert(ctx,

         fc = FfxController(context, targetMediaDescriptor, sourceMediaDescriptor)


-        indexSeasonDigits = currentShowDescriptor.getIndexSeasonDigits() if not currentPattern is None else ShowDescriptor.DEFAULT_INDEX_SEASON_DIGITS
-        indexEpisodeDigits = currentShowDescriptor.getIndexEpisodeDigits() if not currentPattern is None else ShowDescriptor.DEFAULT_INDEX_EPISODE_DIGITS
-        indicatorSeasonDigits = currentShowDescriptor.getIndicatorSeasonDigits() if not currentPattern is None else ShowDescriptor.DEFAULT_INDICATOR_SEASON_DIGITS
-        indicatorEpisodeDigits = currentShowDescriptor.getIndicatorEpisodeDigits() if not currentPattern is None else ShowDescriptor.DEFAULT_INDICATOR_EPISODE_DIGITS
-
-
-        # Shift season and episode if defined for this show
-        if ('tmdb' not in cliOverrides.keys() and showId != -1
-            and showSeason != -1 and showEpisode != -1):
-            shiftedShowSeason, shiftedShowEpisode = ssc.shiftSeason(showId,
-                                                                    season=showSeason,
-                                                                    episode=showEpisode)
+        qualityShowId = (
+            cliOverrides['tmdb']['show']
+            if 'tmdb' in cliOverrides.keys() and 'show' in cliOverrides['tmdb']
+            else matchedShowId
+        )
+        if currentShowDescriptor is None and qualityShowId != -1:
+            currentShowDescriptor = showController.getShowDescriptor(qualityShowId)
+
+        defaultDigitLengths = ShowDescriptor.getDefaultDigitLengths(context)
+        indexSeasonDigits = currentShowDescriptor.getIndexSeasonDigits() if not currentPattern is None else defaultDigitLengths[ShowDescriptor.INDEX_SEASON_DIGITS_KEY]
+        indexEpisodeDigits = currentShowDescriptor.getIndexEpisodeDigits() if not currentPattern is None else defaultDigitLengths[ShowDescriptor.INDEX_EPISODE_DIGITS_KEY]
+        indicatorSeasonDigits = currentShowDescriptor.getIndicatorSeasonDigits() if not currentPattern is None else defaultDigitLengths[ShowDescriptor.INDICATOR_SEASON_DIGITS_KEY]
+        indicatorEpisodeDigits = currentShowDescriptor.getIndicatorEpisodeDigits() if not currentPattern is None else defaultDigitLengths[ShowDescriptor.INDICATOR_EPISODE_DIGITS_KEY]
+
+
+        showIdForShift = (
+            cliOverrides['tmdb']['show']
+            if 'tmdb' in cliOverrides.keys() and 'show' in cliOverrides['tmdb']
+            else matchedShowId
+        )
+        patternIdForShift = currentPattern.getId() if currentPattern is not None else None
+        hasExplicitTargetSeasonOrEpisode = (
+            'tmdb' in cliOverrides.keys()
+            and (
+                'season' in cliOverrides['tmdb']
+                or 'episode' in cliOverrides['tmdb']
+            )
+        )
+
+        # Shift season and episode if defined for the matched pattern or show
+        if (
+            not hasExplicitTargetSeasonOrEpisode
+            and showSeason != -1
+            and showEpisode != -1
+        ):
+            shiftedShowSeason, shiftedShowEpisode = ssc.shiftSeason(
+                showIdForShift,
+                season=showSeason,
+                episode=showEpisode,
+                patternId=patternIdForShift,
+            )
         else:
             shiftedShowSeason = showSeason
             shiftedShowEpisode = showEpisode

         # Assemble target filename depending on whether TMDB lookup is enabled
         #HINT: -1 if not set
-        showId = cliOverrides['tmdb']['show'] if 'tmdb' in cliOverrides.keys() and 'show' in cliOverrides['tmdb'] else (-1 if currentShowDescriptor is None else currentShowDescriptor.getId())
+        showId = (
+            cliOverrides['tmdb']['show']
+            if 'tmdb' in cliOverrides.keys() and 'show' in cliOverrides['tmdb']
+            else (-1 if currentShowDescriptor is None else currentShowDescriptor.getId())
+        )

         if context['use_tmdb'] and showId != -1 and shiftedShowSeason != -1 and shiftedShowEpisode != -1:
@@ -1247,14 +1423,15 @@ def convert(ctx,


         if rename_only:
-            shutil.copyfile(sourcePath, targetPath)
+            shutil.move(sourcePath, targetPath)
         else:
             fc.runJob(sourcePath,
                       targetPath,
                       targetFormat,
                       chainIteration,
                       cropArguments,
-                      currentPattern)
+                      currentPattern,
+                      currentShowDescriptor)



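The `rename` and `unmux` changes above both build zero-padded indicator tokens from configurable digit lengths. As a standalone sketch of just that formatting (function name is made up for the example; defaults of 2 digits are assumed, matching `DEFAULT_SHOW_INDICATOR_*_DIGITS`):

```python
# Digit-aware indicator formatting: season and episode are zero-padded to
# configurable widths (defaults of 2 assumed here).
def buildIndicator(season: int, episode: int,
                   seasonDigits: int = 2, episodeDigits: int = 2) -> str:
    return f"S{season:0{seasonDigits}d}E{episode:0{episodeDigits}d}"
```

So `buildIndicator(1, 5)` gives `S01E05`, while a 3-digit episode width gives `S01E005` — the padding only grows the token, values wider than the configured digits are never truncated.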
src/ffx/configuration_controller.py
@@ -1,5 +1,12 @@
 import os, json

+from .constants import (
+    DEFAULT_SHOW_INDEX_EPISODE_DIGITS,
+    DEFAULT_SHOW_INDEX_SEASON_DIGITS,
+    DEFAULT_SHOW_INDICATOR_EPISODE_DIGITS,
+    DEFAULT_SHOW_INDICATOR_SEASON_DIGITS,
+)
+
 class ConfigurationController():

     CONFIG_FILENAME = 'ffx.json'
@@ -10,6 +17,10 @@ class ConfigurationController():
     LOG_DIRECTORY_CONFIG_KEY = 'logDirectory'
     SUBTITLES_DIRECTORY_CONFIG_KEY = 'subtitlesDirectory'
     OUTPUT_FILENAME_TEMPLATE_KEY = 'outputFilenameTemplate'
+    DEFAULT_INDEX_SEASON_DIGITS_CONFIG_KEY = 'defaultIndexSeasonDigits'
+    DEFAULT_INDEX_EPISODE_DIGITS_CONFIG_KEY = 'defaultIndexEpisodeDigits'
+    DEFAULT_INDICATOR_SEASON_DIGITS_CONFIG_KEY = 'defaultIndicatorSeasonDigits'
+    DEFAULT_INDICATOR_EPISODE_DIGITS_CONFIG_KEY = 'defaultIndicatorEpisodeDigits'


     def __init__(self):
@@ -57,6 +68,42 @@ class ConfigurationController():
         )
         return os.path.expanduser(str(subtitlesDirectory)) if subtitlesDirectory else ''

+    @classmethod
+    def getConfiguredIntegerValue(cls, configurationData: dict, configKey: str, defaultValue: int) -> int:
+        configuredValue = configurationData.get(configKey, defaultValue)
+        try:
+            return int(configuredValue)
+        except (TypeError, ValueError):
+            return int(defaultValue)
+
+    def getDefaultIndexSeasonDigits(self):
+        return ConfigurationController.getConfiguredIntegerValue(
+            self.__configurationData,
+            ConfigurationController.DEFAULT_INDEX_SEASON_DIGITS_CONFIG_KEY,
+            DEFAULT_SHOW_INDEX_SEASON_DIGITS,
+        )
+
+    def getDefaultIndexEpisodeDigits(self):
+        return ConfigurationController.getConfiguredIntegerValue(
+            self.__configurationData,
+            ConfigurationController.DEFAULT_INDEX_EPISODE_DIGITS_CONFIG_KEY,
+            DEFAULT_SHOW_INDEX_EPISODE_DIGITS,
+        )
+
+    def getDefaultIndicatorSeasonDigits(self):
+        return ConfigurationController.getConfiguredIntegerValue(
+            self.__configurationData,
+            ConfigurationController.DEFAULT_INDICATOR_SEASON_DIGITS_CONFIG_KEY,
+            DEFAULT_SHOW_INDICATOR_SEASON_DIGITS,
+        )
+
+    def getDefaultIndicatorEpisodeDigits(self):
+        return ConfigurationController.getConfiguredIntegerValue(
+            self.__configurationData,
+            ConfigurationController.DEFAULT_INDICATOR_EPISODE_DIGITS_CONFIG_KEY,
+            DEFAULT_SHOW_INDICATOR_EPISODE_DIGITS,
+        )
+
     def getData(self):
         return self.__configurationData

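The `getConfiguredIntegerValue` helper added above makes all four digit-length settings tolerant of malformed config values. Extracted as a standalone function (same logic as the classmethod in the diff, minus the class wrapper), its behavior is:

```python
# Integer config lookup with a safe fallback: missing keys and values that
# cannot be coerced to int both fall back to the supplied default.
def getConfiguredIntegerValue(configurationData: dict, configKey: str, defaultValue: int) -> int:
    configuredValue = configurationData.get(configKey, defaultValue)
    try:
        return int(configuredValue)
    except (TypeError, ValueError):
        return int(defaultValue)
```

A string like `"3"` in `ffx.json` is accepted, while a non-numeric value such as `"wide"` silently falls back to the default instead of crashing startup.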
src/ffx/constants.py
@@ -1,5 +1,5 @@
-VERSION='0.2.3'
-DATABASE_VERSION = 2
+VERSION='0.2.4'
+DATABASE_VERSION = 3

 DEFAULT_QUALITY = 32
 DEFAULT_AV1_PRESET = 5
@@ -22,4 +22,9 @@ DEFAULT_CROPDETECT_DURATION_SECONDS = 180
 DEFAULT_cut_start = 60
 DEFAULT_cut_length = 180

+DEFAULT_SHOW_INDEX_SEASON_DIGITS = 2
+DEFAULT_SHOW_INDEX_EPISODE_DIGITS = 2
+DEFAULT_SHOW_INDICATOR_SEASON_DIGITS = 2
+DEFAULT_SHOW_INDICATOR_EPISODE_DIGITS = 2
+
 DEFAULT_OUTPUT_FILENAME_TEMPLATE = '{{ ffx_show_name }} - {{ ffx_index }}{{ ffx_index_separator }}{{ ffx_episode_name }}{{ ffx_indicator_separator }}{{ ffx_indicator }}'
src/ffx/database.py
@@ -1,6 +1,6 @@
-import os, click
+import os, shutil, click

-from sqlalchemy import create_engine, inspect
+from sqlalchemy import create_engine, inspect, text
 from sqlalchemy.orm import sessionmaker

 # Import the full model package so SQLAlchemy registers every mapped class
@@ -9,6 +9,11 @@ import ffx.model
 from ffx.model.show import Base

 from ffx.model.property import Property
+from ffx.model.migration import (
+    DatabaseVersionException,
+    getMigrationPlan,
+    migrateDatabase,
+)

 from ffx.constants import DATABASE_VERSION
@@ -16,10 +21,6 @@ from ffx.constants import DATABASE_VERSION
 DATABASE_VERSION_KEY = 'database_version'
 EXPECTED_TABLE_NAMES = set(Base.metadata.tables.keys())

-class DatabaseVersionException(Exception):
-    def __init__(self, errorMessage):
-        super().__init__(errorMessage)
-
 def databaseContext(databasePath: str = ''):

     databaseContext = {}
@@ -33,7 +34,13 @@ def databaseContext(databasePath: str = ''):
         if not os.path.exists(ffxVarDir):
             os.makedirs(ffxVarDir)
         databasePath = os.path.join(ffxVarDir, 'ffx.db')
+    else:
+        databasePath = os.path.expanduser(databasePath)
+
+    if databasePath != ':memory:':
+        databasePath = os.path.abspath(databasePath)

+    databaseContext['path'] = databasePath
     databaseContext['url'] = f"sqlite:///{databasePath}"
     databaseContext['engine'] = create_engine(databaseContext['url'])
     databaseContext['session'] = sessionmaker(bind=databaseContext['engine'])
@@ -68,14 +75,113 @@ def bootstrapDatabaseIfNeeded(databaseContext):

     Base.metadata.create_all(databaseContext['engine'])


 def ensureDatabaseVersion(databaseContext):

     currentDatabaseVersion = getDatabaseVersion(databaseContext)
-    if currentDatabaseVersion:
-        if currentDatabaseVersion != DATABASE_VERSION:
-            raise DatabaseVersionException(f"Current database version ({currentDatabaseVersion}) does not match required ({DATABASE_VERSION})")
-    else:
+    if not currentDatabaseVersion:
         setDatabaseVersion(databaseContext, DATABASE_VERSION)
+        return
+
+    if currentDatabaseVersion > DATABASE_VERSION:
+        raise DatabaseVersionException(
+            f"Current database version ({currentDatabaseVersion}) does not match required ({DATABASE_VERSION})"
+        )
+
+    if currentDatabaseVersion < DATABASE_VERSION:
+        promptForDatabaseMigration(databaseContext, currentDatabaseVersion, DATABASE_VERSION)
+        migrateDatabase(databaseContext, currentDatabaseVersion, DATABASE_VERSION, setDatabaseVersion)
+        currentDatabaseVersion = getDatabaseVersion(databaseContext)
+
+        if currentDatabaseVersion != DATABASE_VERSION:
+            raise DatabaseVersionException(
+                f"Current database version ({currentDatabaseVersion}) does not match required ({DATABASE_VERSION})"
+            )
+
+    ensureCurrentSchemaCompatibility(databaseContext)
+
+
+def ensureCurrentSchemaCompatibility(databaseContext):
+    engine = databaseContext['engine']
+    inspector = inspect(engine)
+    showColumns = {
+        column['name']
+        for column in inspector.get_columns('shows')
+    }
+
+    alterStatements = []
+    if 'quality' not in showColumns:
+        alterStatements.append("ALTER TABLE shows ADD COLUMN quality INTEGER DEFAULT 0")
+    if 'notes' not in showColumns:
+        alterStatements.append("ALTER TABLE shows ADD COLUMN notes TEXT DEFAULT ''")
+
+    if not alterStatements:
+        return
+
+    with engine.begin() as connection:
+        for alterStatement in alterStatements:
+            connection.execute(text(alterStatement))
+
+
+def promptForDatabaseMigration(databaseContext, currentDatabaseVersion: int, targetDatabaseVersion: int):
+    migrationPlan = getMigrationPlan(currentDatabaseVersion, targetDatabaseVersion)
+
+    click.echo("Database migration required.")
+    click.echo(f"Current version: {currentDatabaseVersion}")
+    click.echo(f"Target version: {targetDatabaseVersion}")
+    click.echo("Steps required:")
+
+    missingSteps = []
+    for migrationStep in migrationPlan:
+        moduleStatus = "present" if migrationStep.modulePresent else "missing"
+        click.echo(
+            f" {migrationStep.versionFrom} -> {migrationStep.versionTo}: "
+            + f"{migrationStep.moduleName} [{moduleStatus}]"
+        )
+        if not migrationStep.modulePresent:
+            missingSteps.append(migrationStep)
+
+    if missingSteps:
+        firstMissingStep = missingSteps[0]
+        raise DatabaseVersionException(
+            f"No migration path from database version "
+            + f"{firstMissingStep.versionFrom} to {firstMissingStep.versionTo}"
+        )
+
+    if not click.confirm(
+        "Create a backup and continue with database migration?",
+        default=True,
+    ):
+        raise click.ClickException("Database migration aborted by user.")
+
+    backupPath = backupDatabaseBeforeMigration(
+        databaseContext,
+        currentDatabaseVersion,
+        targetDatabaseVersion,
+    )
+    click.echo(f"Database backup created: {backupPath}")
+
+
+def backupDatabaseBeforeMigration(databaseContext, currentDatabaseVersion: int, targetDatabaseVersion: int) -> str:
+    databasePath = databaseContext.get('path', '')
+    if not databasePath or databasePath == ':memory:':
+        raise click.ClickException("Database migration backup requires a file-backed SQLite database.")
+
+    if not os.path.isfile(databasePath):
+        raise click.ClickException(f"Database file not found for backup: {databasePath}")
+
+    backupPath = f"{databasePath}.v{currentDatabaseVersion}-to-v{targetDatabaseVersion}.bak"
+    backupIndex = 1
+    while os.path.exists(backupPath):
+        backupPath = (
+            f"{databasePath}.v{currentDatabaseVersion}-to-v{targetDatabaseVersion}.{backupIndex}.bak"
+        )
+        backupIndex += 1
+
+    databaseContext['engine'].dispose()
+    shutil.copy2(databasePath, backupPath)
+
+    return backupPath


 def getDatabaseVersion(databaseContext):

src/ffx/ffx_controller.py
@@ -245,7 +245,8 @@ class FfxController():
                targetFormat: str = '',
                chainIteration: list = [],
                cropArguments: dict = {},
-               currentPattern: Pattern = None):
+               currentPattern: Pattern = None,
+               currentShowDescriptor = None):
        # quality: int = DEFAULT_QUALITY,
        # preset: int = DEFAULT_AV1_PRESET):
@@ -262,9 +263,11 @@ class FfxController():


         if qualityFilters and (quality := qualityFilters[0]['parameters']['quality']):
-            self.__logger.info(f"Setting quality {quality} from command line parameter")
+            self.__logger.info(f"Setting quality {quality} from command line")
         elif currentPattern is not None and (quality := currentPattern.quality):
-            self.__logger.info(f"Setting quality {quality} from pattern default")
+            self.__logger.info(f"Setting quality {quality} from pattern")
+        elif currentShowDescriptor is not None and (quality := currentShowDescriptor.getQuality()):
+            self.__logger.info(f"Setting quality {quality} from show")
         else:
             quality = (QualityFilter.DEFAULT_H264_QUALITY
                        if (videoEncoder == VideoEncoder.H264)
src/ffx/file_properties.py
@@ -30,6 +30,18 @@ class FileProperties():

     DEFAULT_INDEX_DIGITS = 3

+    @classmethod
+    def extractSeasonEpisodeValues(cls, sourceText: str) -> tuple[int | None, int] | None:
+        seasonEpisodeMatch = re.search(cls.SEASON_EPISODE_INDICATOR_MATCH, str(sourceText))
+        if seasonEpisodeMatch is not None:
+            return int(seasonEpisodeMatch.group(1)), int(seasonEpisodeMatch.group(2))
+
+        episodeMatch = re.search(cls.EPISODE_INDICATOR_MATCH, str(sourceText))
+        if episodeMatch is not None:
+            return None, int(episodeMatch.group(1))
+
+        return None
+
     def __init__(self, context, sourcePath):

         self.context = context
@@ -65,26 +77,19 @@ class FileProperties():
             databaseMatchedGroups = matchResult['match'].groups()
             self.__logger.debug(f"FileProperties.__init__(): Matched groups: {databaseMatchedGroups}")

-            seIndicator = databaseMatchedGroups[0]
-
-            se_match = re.search(FileProperties.SEASON_EPISODE_INDICATOR_MATCH, seIndicator)
-            e_match = re.search(FileProperties.EPISODE_INDICATOR_MATCH, seIndicator)
-
+            indicatorSource = databaseMatchedGroups[0]
         else:
             self.__logger.debug(f"FileProperties.__init__(): Checking file name for indicator {self.__sourceFilename}")
+            indicatorSource = self.__sourceFilename

-            se_match = re.search(FileProperties.SEASON_EPISODE_INDICATOR_MATCH, self.__sourceFilename)
-            e_match = re.search(FileProperties.EPISODE_INDICATOR_MATCH, self.__sourceFilename)
-
-        if se_match is not None:
-            self.__season = int(se_match.group(1))
-            self.__episode = int(se_match.group(2))
-        elif e_match is not None:
-            self.__season = -1
-            self.__episode = int(e_match.group(1))
-        else:
+        seasonEpisodeValues = self.extractSeasonEpisodeValues(indicatorSource)
+        if seasonEpisodeValues is None:
             self.__season = -1
             self.__episode = -1
+        else:
+            sourceSeason, sourceEpisode = seasonEpisodeValues
+            self.__season = -1 if sourceSeason is None else int(sourceSeason)
+            self.__episode = int(sourceEpisode)

         self.__ffprobeData = None

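The `extractSeasonEpisodeValues` classmethod above returns `(season, episode)`, `(None, episode)` for episode-only markers, or `None`. A self-contained sketch of that contract follows; the two regex patterns here are assumptions for illustration, not the actual `SEASON_EPISODE_INDICATOR_MATCH` / `EPISODE_INDICATOR_MATCH` constants from `FileProperties`.

```python
import re

# Illustrative season/episode extraction; the real ffx patterns may differ.
SEASON_EPISODE_INDICATOR_MATCH = r'[Ss](\d+)[Ee](\d+)'  # assumed pattern
EPISODE_INDICATOR_MATCH = r'[Ee](\d+)'                  # assumed pattern

def extractSeasonEpisodeValues(sourceText: str):
    seasonEpisodeMatch = re.search(SEASON_EPISODE_INDICATOR_MATCH, str(sourceText))
    if seasonEpisodeMatch is not None:
        return int(seasonEpisodeMatch.group(1)), int(seasonEpisodeMatch.group(2))

    episodeMatch = re.search(EPISODE_INDICATOR_MATCH, str(sourceText))
    if episodeMatch is not None:
        return None, int(episodeMatch.group(1))

    return None
```

The three-way return value is what lets the constructor map "no season found" to the `-1` sentinel while still keeping the episode number.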
src/ffx/helper.py
@@ -4,6 +4,7 @@ from jinja2 import Environment, Undefined
 from .constants import DEFAULT_OUTPUT_FILENAME_TEMPLATE
 from .configuration_controller import ConfigurationController
 from .logging_utils import get_ffx_logger
+from .show_descriptor import ShowDescriptor


 class EmptyStringUndefined(Undefined):
@@ -164,10 +165,10 @@ def getEpisodeFileBasename(showName,
                            episodeName,
                            season,
                            episode,
-                           indexSeasonDigits = 2,
-                           indexEpisodeDigits = 2,
-                           indicatorSeasonDigits = 2,
-                           indicatorEpisodeDigits = 2,
+                           indexSeasonDigits = None,
+                           indexEpisodeDigits = None,
+                           indicatorSeasonDigits = None,
+                           indicatorEpisodeDigits = None,
                            context = None):
     """
     One Piece:
@@ -199,6 +200,16 @@ def getEpisodeFileBasename(showName,
     configData = cc.getData() if cc is not None else {}
     outputFilenameTemplate = configData.get(ConfigurationController.OUTPUT_FILENAME_TEMPLATE_KEY,
                                             DEFAULT_OUTPUT_FILENAME_TEMPLATE)
+    defaultDigitLengths = ShowDescriptor.getDefaultDigitLengths(context)
+
+    if indexSeasonDigits is None:
+        indexSeasonDigits = defaultDigitLengths[ShowDescriptor.INDEX_SEASON_DIGITS_KEY]
+    if indexEpisodeDigits is None:
+        indexEpisodeDigits = defaultDigitLengths[ShowDescriptor.INDEX_EPISODE_DIGITS_KEY]
+    if indicatorSeasonDigits is None:
+        indicatorSeasonDigits = defaultDigitLengths[ShowDescriptor.INDICATOR_SEASON_DIGITS_KEY]
+    if indicatorEpisodeDigits is None:
+        indicatorEpisodeDigits = defaultDigitLengths[ShowDescriptor.INDICATOR_EPISODE_DIGITS_KEY]

     if context is not None and 'logger' in context.keys():
         logger = context['logger']
@@ -1,85 +1,196 @@
from enum import Enum
import difflib


class IsoLanguage(Enum):

    AFRIKAANS = {"name": "Afrikaans", "iso639_1": "af", "iso639_2": ["afr"]}
    ALBANIAN = {"name": "Albanian", "iso639_1": "sq", "iso639_2": ["alb"]}
    ARABIC = {"name": "Arabic", "iso639_1": "ar", "iso639_2": ["ara"]}
    ARMENIAN = {"name": "Armenian", "iso639_1": "hy", "iso639_2": ["arm"]}
    AZERBAIJANI = {"name": "Azerbaijani", "iso639_1": "az", "iso639_2": ["aze"]}
    BASQUE = {"name": "Basque", "iso639_1": "eu", "iso639_2": ["baq"]}
    BELARUSIAN = {"name": "Belarusian", "iso639_1": "be", "iso639_2": ["bel"]}
    BOKMAL = {"name": "Bokmål", "iso639_1": "nb", "iso639_2": ["nob"]}  # Norwegian Bokmål
    BULGARIAN = {"name": "Bulgarian", "iso639_1": "bg", "iso639_2": ["bul"]}
    CATALAN = {"name": "Catalan", "iso639_1": "ca", "iso639_2": ["cat"]}
    CHINESE = {"name": "Chinese", "iso639_1": "zh", "iso639_2": ["zho", "chi"]}
    CROATIAN = {"name": "Croatian", "iso639_1": "hr", "iso639_2": ["hrv"]}
    CZECH = {"name": "Czech", "iso639_1": "cs", "iso639_2": ["cze"]}
    DANISH = {"name": "Danish", "iso639_1": "da", "iso639_2": ["dan"]}
    DUTCH = {"name": "Dutch", "iso639_1": "nl", "iso639_2": ["nld", "dut"]}
    ENGLISH = {"name": "English", "iso639_1": "en", "iso639_2": ["eng"]}
    ESTONIAN = {"name": "Estonian", "iso639_1": "et", "iso639_2": ["est"]}
    FILIPINO = {"name": "Filipino", "iso639_1": "tl", "iso639_2": ["fil"]}  # Tagalog
    FINNISH = {"name": "Finnish", "iso639_1": "fi", "iso639_2": ["fin"]}
    FRENCH = {"name": "French", "iso639_1": "fr", "iso639_2": ["fra", "fre"]}
    GALICIAN = {"name": "Galician", "iso639_1": "gl", "iso639_2": ["glg"]}
    GEORGIAN = {"name": "Georgian", "iso639_1": "ka", "iso639_2": ["geo"]}
    GERMAN = {"name": "German", "iso639_1": "de", "iso639_2": ["deu", "ger"]}
    GREEK = {"name": "Greek", "iso639_1": "el", "iso639_2": ["gre"]}
    HEBREW = {"name": "Hebrew", "iso639_1": "he", "iso639_2": ["heb"]}
    HINDI = {"name": "Hindi", "iso639_1": "hi", "iso639_2": ["hin"]}
    HUNGARIAN = {"name": "Hungarian", "iso639_1": "hu", "iso639_2": ["hun"]}
    ICELANDIC = {"name": "Icelandic", "iso639_1": "is", "iso639_2": ["ice"]}
    INDONESIAN = {"name": "Indonesian", "iso639_1": "id", "iso639_2": ["ind"]}
    IRISH = {"name": "Irish", "iso639_1": "ga", "iso639_2": ["gle"]}
    ITALIAN = {"name": "Italian", "iso639_1": "it", "iso639_2": ["ita"]}
    JAPANESE = {"name": "Japanese", "iso639_1": "ja", "iso639_2": ["jpn"]}
    KANNADA = {"name": "Kannada", "iso639_1": "kn", "iso639_2": ["kan"]}
    KAZAKH = {"name": "Kazakh", "iso639_1": "kk", "iso639_2": ["kaz"]}
    KOREAN = {"name": "Korean", "iso639_1": "ko", "iso639_2": ["kor"]}
    LATIN = {"name": "Latin", "iso639_1": "la", "iso639_2": ["lat"]}
    LATVIAN = {"name": "Latvian", "iso639_1": "lv", "iso639_2": ["lav"]}
    LITHUANIAN = {"name": "Lithuanian", "iso639_1": "lt", "iso639_2": ["lit"]}
    MACEDONIAN = {"name": "Macedonian", "iso639_1": "mk", "iso639_2": ["mac"]}
    MALAY = {"name": "Malay", "iso639_1": "ms", "iso639_2": ["may"]}
    MALAYALAM = {"name": "Malayalam", "iso639_1": "ml", "iso639_2": ["mal"]}
    MALTESE = {"name": "Maltese", "iso639_1": "mt", "iso639_2": ["mlt"]}
    NORWEGIAN = {"name": "Norwegian", "iso639_1": "no", "iso639_2": ["nor"]}
    PERSIAN = {"name": "Persian", "iso639_1": "fa", "iso639_2": ["per"]}
    POLISH = {"name": "Polish", "iso639_1": "pl", "iso639_2": ["pol"]}
    PORTUGUESE = {"name": "Portuguese", "iso639_1": "pt", "iso639_2": ["por"]}
    ROMANIAN = {"name": "Romanian", "iso639_1": "ro", "iso639_2": ["rum"]}
    RUSSIAN = {"name": "Russian", "iso639_1": "ru", "iso639_2": ["rus"]}
    NORTHERN_SAMI = {"name": "Northern Sami", "iso639_1": "se", "iso639_2": ["sme"]}
    SAMOAN = {"name": "Samoan", "iso639_1": "sm", "iso639_2": ["smo"]}
    SANGO = {"name": "Sango", "iso639_1": "sg", "iso639_2": ["sag"]}
    SANSKRIT = {"name": "Sanskrit", "iso639_1": "sa", "iso639_2": ["san"]}
    SARDINIAN = {"name": "Sardinian", "iso639_1": "sc", "iso639_2": ["srd"]}
    SERBIAN = {"name": "Serbian", "iso639_1": "sr", "iso639_2": ["srp"]}
    SHONA = {"name": "Shona", "iso639_1": "sn", "iso639_2": ["sna"]}
    SINDHI = {"name": "Sindhi", "iso639_1": "sd", "iso639_2": ["snd"]}
    SINHALA = {"name": "Sinhala", "iso639_1": "si", "iso639_2": ["sin"]}
    SLOVAK = {"name": "Slovak", "iso639_1": "sk", "iso639_2": ["slk"]}
    SLOVENIAN = {"name": "Slovenian", "iso639_1": "sl", "iso639_2": ["slv"]}
    SOMALI = {"name": "Somali", "iso639_1": "so", "iso639_2": ["som"]}
    SOUTHERN_SOTHO = {"name": "Southern Sotho", "iso639_1": "st", "iso639_2": ["sot"]}
    SPANISH = {"name": "Spanish", "iso639_1": "es", "iso639_2": ["spa"]}
    SUNDANESE = {"name": "Sundanese", "iso639_1": "su", "iso639_2": ["sun"]}
    SWAHILI = {"name": "Swahili", "iso639_1": "sw", "iso639_2": ["swa"]}
    SWATI = {"name": "Swati", "iso639_1": "ss", "iso639_2": ["ssw"]}
    SWEDISH = {"name": "Swedish", "iso639_1": "sv", "iso639_2": ["swe"]}
    TAGALOG = {"name": "Tagalog", "iso639_1": "tl", "iso639_2": ["tgl"]}
    TAMIL = {"name": "Tamil", "iso639_1": "ta", "iso639_2": ["tam"]}
    TELUGU = {"name": "Telugu", "iso639_1": "te", "iso639_2": ["tel"]}
    THAI = {"name": "Thai", "iso639_1": "th", "iso639_2": ["tha"]}
    TURKISH = {"name": "Turkish", "iso639_1": "tr", "iso639_2": ["tur"]}
    UKRAINIAN = {"name": "Ukrainian", "iso639_1": "uk", "iso639_2": ["ukr"]}
    URDU = {"name": "Urdu", "iso639_1": "ur", "iso639_2": ["urd"]}
    VIETNAMESE = {"name": "Vietnamese", "iso639_1": "vi", "iso639_2": ["vie"]}
    WELSH = {"name": "Welsh", "iso639_1": "cy", "iso639_2": ["wel"]}
    ABKHAZIAN = {"name": "Abkhazian", "iso639_1": "ab", "iso639_2": ["abk"]}
    AFAR = {"name": "Afar", "iso639_1": "aa", "iso639_2": ["aar"]}
    AFRIKAANS = {"name": "Afrikaans", "iso639_1": "af", "iso639_2": ["afr"]}
    AKAN = {"name": "Akan", "iso639_1": "ak", "iso639_2": ["aka"]}
    ALBANIAN = {"name": "Albanian", "iso639_1": "sq", "iso639_2": ["sqi", "alb"]}
    AMHARIC = {"name": "Amharic", "iso639_1": "am", "iso639_2": ["amh"]}
    ARABIC = {"name": "Arabic", "iso639_1": "ar", "iso639_2": ["ara"]}
    ARAGONESE = {"name": "Aragonese", "iso639_1": "an", "iso639_2": ["arg"]}
    ARMENIAN = {"name": "Armenian", "iso639_1": "hy", "iso639_2": ["hye", "arm"]}
    ASSAMESE = {"name": "Assamese", "iso639_1": "as", "iso639_2": ["asm"]}
    AVARIC = {"name": "Avaric", "iso639_1": "av", "iso639_2": ["ava"]}
    AVESTAN = {"name": "Avestan", "iso639_1": "ae", "iso639_2": ["ave"]}
    AYMARA = {"name": "Aymara", "iso639_1": "ay", "iso639_2": ["aym"]}
    AZERBAIJANI = {"name": "Azerbaijani", "iso639_1": "az", "iso639_2": ["aze"]}
    BAMBARA = {"name": "Bambara", "iso639_1": "bm", "iso639_2": ["bam"]}
    BASHKIR = {"name": "Bashkir", "iso639_1": "ba", "iso639_2": ["bak"]}
    BASQUE = {"name": "Basque", "iso639_1": "eu", "iso639_2": ["eus", "baq"]}
    BELARUSIAN = {"name": "Belarusian", "iso639_1": "be", "iso639_2": ["bel"]}
    BENGALI = {"name": "Bengali", "iso639_1": "bn", "iso639_2": ["ben"]}
    BISLAMA = {"name": "Bislama", "iso639_1": "bi", "iso639_2": ["bis"]}
    BOKMAL = {"name": "Bokmål", "iso639_1": "nb", "iso639_2": ["nob"]}
    BOSNIAN = {"name": "Bosnian", "iso639_1": "bs", "iso639_2": ["bos"]}
    BRETON = {"name": "Breton", "iso639_1": "br", "iso639_2": ["bre"]}
    BULGARIAN = {"name": "Bulgarian", "iso639_1": "bg", "iso639_2": ["bul"]}
    BURMESE = {"name": "Burmese", "iso639_1": "my", "iso639_2": ["mya", "bur"]}
    CATALAN = {"name": "Catalan", "iso639_1": "ca", "iso639_2": ["cat"]}
    CHAMORRO = {"name": "Chamorro", "iso639_1": "ch", "iso639_2": ["cha"]}
    CHECHEN = {"name": "Chechen", "iso639_1": "ce", "iso639_2": ["che"]}
    CHICHEWA = {"name": "Chichewa", "iso639_1": "ny", "iso639_2": ["nya"]}
    CHINESE = {"name": "Chinese", "iso639_1": "zh", "iso639_2": ["zho", "chi"]}
    CHURCH_SLAVIC = {"name": "Church Slavic", "iso639_1": "cu", "iso639_2": ["chu"]}
    CHUVASH = {"name": "Chuvash", "iso639_1": "cv", "iso639_2": ["chv"]}
    CORNISH = {"name": "Cornish", "iso639_1": "kw", "iso639_2": ["cor"]}
    CORSICAN = {"name": "Corsican", "iso639_1": "co", "iso639_2": ["cos"]}
    CREE = {"name": "Cree", "iso639_1": "cr", "iso639_2": ["cre"]}
    CROATIAN = {"name": "Croatian", "iso639_1": "hr", "iso639_2": ["hrv"]}
    CZECH = {"name": "Czech", "iso639_1": "cs", "iso639_2": ["ces", "cze"]}
    DANISH = {"name": "Danish", "iso639_1": "da", "iso639_2": ["dan"]}
    DIVEHI = {"name": "Divehi", "iso639_1": "dv", "iso639_2": ["div"]}
    DUTCH = {"name": "Dutch", "iso639_1": "nl", "iso639_2": ["nld", "dut"]}
    DZONGKHA = {"name": "Dzongkha", "iso639_1": "dz", "iso639_2": ["dzo"]}
    ENGLISH = {"name": "English", "iso639_1": "en", "iso639_2": ["eng"]}
    ESPERANTO = {"name": "Esperanto", "iso639_1": "eo", "iso639_2": ["epo"]}
    ESTONIAN = {"name": "Estonian", "iso639_1": "et", "iso639_2": ["est"]}
    EWE = {"name": "Ewe", "iso639_1": "ee", "iso639_2": ["ewe"]}
    FAROESE = {"name": "Faroese", "iso639_1": "fo", "iso639_2": ["fao"]}
    FIJIAN = {"name": "Fijian", "iso639_1": "fj", "iso639_2": ["fij"]}
    FINNISH = {"name": "Finnish", "iso639_1": "fi", "iso639_2": ["fin"]}
    FRENCH = {"name": "French", "iso639_1": "fr", "iso639_2": ["fra", "fre"]}
    FULAH = {"name": "Fulah", "iso639_1": "ff", "iso639_2": ["ful"]}
    GALICIAN = {"name": "Galician", "iso639_1": "gl", "iso639_2": ["glg"]}
    GANDA = {"name": "Ganda", "iso639_1": "lg", "iso639_2": ["lug"]}
    GEORGIAN = {"name": "Georgian", "iso639_1": "ka", "iso639_2": ["kat", "geo"]}
    GERMAN = {"name": "German", "iso639_1": "de", "iso639_2": ["deu", "ger"]}
    GREEK = {"name": "Greek", "iso639_1": "el", "iso639_2": ["ell", "gre"]}
    GUARANI = {"name": "Guarani", "iso639_1": "gn", "iso639_2": ["grn"]}
    GUJARATI = {"name": "Gujarati", "iso639_1": "gu", "iso639_2": ["guj"]}
    HAITIAN = {"name": "Haitian", "iso639_1": "ht", "iso639_2": ["hat"]}
    HAUSA = {"name": "Hausa", "iso639_1": "ha", "iso639_2": ["hau"]}
    HEBREW = {"name": "Hebrew", "iso639_1": "he", "iso639_2": ["heb"]}
    HERERO = {"name": "Herero", "iso639_1": "hz", "iso639_2": ["her"]}
    HINDI = {"name": "Hindi", "iso639_1": "hi", "iso639_2": ["hin"]}
    HIRI_MOTU = {"name": "Hiri Motu", "iso639_1": "ho", "iso639_2": ["hmo"]}
    HUNGARIAN = {"name": "Hungarian", "iso639_1": "hu", "iso639_2": ["hun"]}
    ICELANDIC = {"name": "Icelandic", "iso639_1": "is", "iso639_2": ["isl", "ice"]}
    IDO = {"name": "Ido", "iso639_1": "io", "iso639_2": ["ido"]}
    IGBO = {"name": "Igbo", "iso639_1": "ig", "iso639_2": ["ibo"]}
    INDONESIAN = {"name": "Indonesian", "iso639_1": "id", "iso639_2": ["ind"]}
    INTERLINGUA = {"name": "Interlingua", "iso639_1": "ia", "iso639_2": ["ina"]}
    INTERLINGUE = {"name": "Interlingue", "iso639_1": "ie", "iso639_2": ["ile"]}
    INUKTITUT = {"name": "Inuktitut", "iso639_1": "iu", "iso639_2": ["iku"]}
    INUPIAQ = {"name": "Inupiaq", "iso639_1": "ik", "iso639_2": ["ipk"]}
    IRISH = {"name": "Irish", "iso639_1": "ga", "iso639_2": ["gle"]}
    ITALIAN = {"name": "Italian", "iso639_1": "it", "iso639_2": ["ita"]}
    JAPANESE = {"name": "Japanese", "iso639_1": "ja", "iso639_2": ["jpn"]}
    JAVANESE = {"name": "Javanese", "iso639_1": "jv", "iso639_2": ["jav"]}
    KALAALLISUT = {"name": "Kalaallisut", "iso639_1": "kl", "iso639_2": ["kal"]}
    KANNADA = {"name": "Kannada", "iso639_1": "kn", "iso639_2": ["kan"]}
    KANURI = {"name": "Kanuri", "iso639_1": "kr", "iso639_2": ["kau"]}
    KASHMIRI = {"name": "Kashmiri", "iso639_1": "ks", "iso639_2": ["kas"]}
    KAZAKH = {"name": "Kazakh", "iso639_1": "kk", "iso639_2": ["kaz"]}
    KHMER = {"name": "Khmer", "iso639_1": "km", "iso639_2": ["khm"]}
    KIKUYU = {"name": "Kikuyu", "iso639_1": "ki", "iso639_2": ["kik"]}
    KINYARWANDA = {"name": "Kinyarwanda", "iso639_1": "rw", "iso639_2": ["kin"]}
    KIRGHIZ = {"name": "Kirghiz", "iso639_1": "ky", "iso639_2": ["kir"]}
    KOMI = {"name": "Komi", "iso639_1": "kv", "iso639_2": ["kom"]}
    KONGO = {"name": "Kongo", "iso639_1": "kg", "iso639_2": ["kon"]}
    KOREAN = {"name": "Korean", "iso639_1": "ko", "iso639_2": ["kor"]}
    KUANYAMA = {"name": "Kuanyama", "iso639_1": "kj", "iso639_2": ["kua"]}
    KURDISH = {"name": "Kurdish", "iso639_1": "ku", "iso639_2": ["kur"]}
    LAO = {"name": "Lao", "iso639_1": "lo", "iso639_2": ["lao"]}
    LATIN = {"name": "Latin", "iso639_1": "la", "iso639_2": ["lat"]}
    LATVIAN = {"name": "Latvian", "iso639_1": "lv", "iso639_2": ["lav"]}
    LIMBURGAN = {"name": "Limburgan", "iso639_1": "li", "iso639_2": ["lim"]}
    LINGALA = {"name": "Lingala", "iso639_1": "ln", "iso639_2": ["lin"]}
    LITHUANIAN = {"name": "Lithuanian", "iso639_1": "lt", "iso639_2": ["lit"]}
    LUBA_KATANGA = {"name": "Luba-Katanga", "iso639_1": "lu", "iso639_2": ["lub"]}
    LUXEMBOURGISH = {"name": "Luxembourgish", "iso639_1": "lb", "iso639_2": ["ltz"]}
    MACEDONIAN = {"name": "Macedonian", "iso639_1": "mk", "iso639_2": ["mkd", "mac"]}
    MALAGASY = {"name": "Malagasy", "iso639_1": "mg", "iso639_2": ["mlg"]}
    MALAY = {"name": "Malay", "iso639_1": "ms", "iso639_2": ["msa", "may"]}
    MALAYALAM = {"name": "Malayalam", "iso639_1": "ml", "iso639_2": ["mal"]}
    MALTESE = {"name": "Maltese", "iso639_1": "mt", "iso639_2": ["mlt"]}
    MANX = {"name": "Manx", "iso639_1": "gv", "iso639_2": ["glv"]}
    MAORI = {"name": "Maori", "iso639_1": "mi", "iso639_2": ["mri", "mao"]}
    MARATHI = {"name": "Marathi", "iso639_1": "mr", "iso639_2": ["mar"]}
    MARSHALLESE = {"name": "Marshallese", "iso639_1": "mh", "iso639_2": ["mah"]}
    MONGOLIAN = {"name": "Mongolian", "iso639_1": "mn", "iso639_2": ["mon"]}
    NAURU = {"name": "Nauru", "iso639_1": "na", "iso639_2": ["nau"]}
    NAVAJO = {"name": "Navajo", "iso639_1": "nv", "iso639_2": ["nav"]}
    NDONGA = {"name": "Ndonga", "iso639_1": "ng", "iso639_2": ["ndo"]}
    NEPALI = {"name": "Nepali", "iso639_1": "ne", "iso639_2": ["nep"]}
    NORTH_NDEBELE = {"name": "North Ndebele", "iso639_1": "nd", "iso639_2": ["nde"]}
    NORTHERN_SAMI = {"name": "Northern Sami", "iso639_1": "se", "iso639_2": ["sme"]}
    NORWEGIAN = {"name": "Norwegian", "iso639_1": "no", "iso639_2": ["nor"]}
    NORWEGIAN_NYNORSK = {"name": "Nynorsk", "iso639_1": "nn", "iso639_2": ["nno"]}
    OCCITAN = {"name": "Occitan", "iso639_1": "oc", "iso639_2": ["oci"]}
    OJIBWA = {"name": "Ojibwa", "iso639_1": "oj", "iso639_2": ["oji"]}
    ORIYA = {"name": "Oriya", "iso639_1": "or", "iso639_2": ["ori"]}
    OROMO = {"name": "Oromo", "iso639_1": "om", "iso639_2": ["orm"]}
    OSSETIAN = {"name": "Ossetian", "iso639_1": "os", "iso639_2": ["oss"]}
    PALI = {"name": "Pali", "iso639_1": "pi", "iso639_2": ["pli"]}
    PANJABI = {"name": "Panjabi", "iso639_1": "pa", "iso639_2": ["pan"]}
    PERSIAN = {"name": "Persian", "iso639_1": "fa", "iso639_2": ["fas", "per"]}
    POLISH = {"name": "Polish", "iso639_1": "pl", "iso639_2": ["pol"]}
    PORTUGUESE = {"name": "Portuguese", "iso639_1": "pt", "iso639_2": ["por"]}
    PUSHTO = {"name": "Pushto", "iso639_1": "ps", "iso639_2": ["pus"]}
    QUECHUA = {"name": "Quechua", "iso639_1": "qu", "iso639_2": ["que"]}
    ROMANIAN = {"name": "Romanian", "iso639_1": "ro", "iso639_2": ["ron", "rum"]}
    ROMANSH = {"name": "Romansh", "iso639_1": "rm", "iso639_2": ["roh"]}
    RUNDI = {"name": "Rundi", "iso639_1": "rn", "iso639_2": ["run"]}
    RUSSIAN = {"name": "Russian", "iso639_1": "ru", "iso639_2": ["rus"]}
    SAMOAN = {"name": "Samoan", "iso639_1": "sm", "iso639_2": ["smo"]}
    SANGO = {"name": "Sango", "iso639_1": "sg", "iso639_2": ["sag"]}
    SANSKRIT = {"name": "Sanskrit", "iso639_1": "sa", "iso639_2": ["san"]}
    SARDINIAN = {"name": "Sardinian", "iso639_1": "sc", "iso639_2": ["srd"]}
    SCOTTISH_GAELIC = {"name": "Scottish Gaelic", "iso639_1": "gd", "iso639_2": ["gla"]}
    SERBIAN = {"name": "Serbian", "iso639_1": "sr", "iso639_2": ["srp"]}
    SHONA = {"name": "Shona", "iso639_1": "sn", "iso639_2": ["sna"]}
    SICHUAN_YI = {"name": "Sichuan Yi", "iso639_1": "ii", "iso639_2": ["iii"]}
    SINDHI = {"name": "Sindhi", "iso639_1": "sd", "iso639_2": ["snd"]}
    SINHALA = {"name": "Sinhala", "iso639_1": "si", "iso639_2": ["sin"]}
    SLOVAK = {"name": "Slovak", "iso639_1": "sk", "iso639_2": ["slk", "slo"]}
    SLOVENIAN = {"name": "Slovenian", "iso639_1": "sl", "iso639_2": ["slv"]}
    SOMALI = {"name": "Somali", "iso639_1": "so", "iso639_2": ["som"]}
    SOUTH_NDEBELE = {"name": "South Ndebele", "iso639_1": "nr", "iso639_2": ["nbl"]}
    SOUTHERN_SOTHO = {"name": "Southern Sotho", "iso639_1": "st", "iso639_2": ["sot"]}
    SPANISH = {"name": "Spanish", "iso639_1": "es", "iso639_2": ["spa"]}
    SUNDANESE = {"name": "Sundanese", "iso639_1": "su", "iso639_2": ["sun"]}
    SWAHILI = {"name": "Swahili", "iso639_1": "sw", "iso639_2": ["swa"]}
    SWATI = {"name": "Swati", "iso639_1": "ss", "iso639_2": ["ssw"]}
    SWEDISH = {"name": "Swedish", "iso639_1": "sv", "iso639_2": ["swe"]}
    TAGALOG = {"name": "Tagalog", "iso639_1": "tl", "iso639_2": ["tgl"]}
    TAHITIAN = {"name": "Tahitian", "iso639_1": "ty", "iso639_2": ["tah"]}
    TAJIK = {"name": "Tajik", "iso639_1": "tg", "iso639_2": ["tgk"]}
    TAMIL = {"name": "Tamil", "iso639_1": "ta", "iso639_2": ["tam"]}
    TATAR = {"name": "Tatar", "iso639_1": "tt", "iso639_2": ["tat"]}
    TELUGU = {"name": "Telugu", "iso639_1": "te", "iso639_2": ["tel"]}
    THAI = {"name": "Thai", "iso639_1": "th", "iso639_2": ["tha"]}
    TIBETAN = {"name": "Tibetan", "iso639_1": "bo", "iso639_2": ["bod", "tib"]}
    TIGRINYA = {"name": "Tigrinya", "iso639_1": "ti", "iso639_2": ["tir"]}
    TONGA = {"name": "Tonga", "iso639_1": "to", "iso639_2": ["ton"]}
    TSONGA = {"name": "Tsonga", "iso639_1": "ts", "iso639_2": ["tso"]}
    TSWANA = {"name": "Tswana", "iso639_1": "tn", "iso639_2": ["tsn"]}
    TURKISH = {"name": "Turkish", "iso639_1": "tr", "iso639_2": ["tur"]}
    TURKMEN = {"name": "Turkmen", "iso639_1": "tk", "iso639_2": ["tuk"]}
    TWI = {"name": "Twi", "iso639_1": "tw", "iso639_2": ["twi"]}
    UIGHUR = {"name": "Uighur", "iso639_1": "ug", "iso639_2": ["uig"]}
    UKRAINIAN = {"name": "Ukrainian", "iso639_1": "uk", "iso639_2": ["ukr"]}
    URDU = {"name": "Urdu", "iso639_1": "ur", "iso639_2": ["urd"]}
    UZBEK = {"name": "Uzbek", "iso639_1": "uz", "iso639_2": ["uzb"]}
    VENDA = {"name": "Venda", "iso639_1": "ve", "iso639_2": ["ven"]}
    VIETNAMESE = {"name": "Vietnamese", "iso639_1": "vi", "iso639_2": ["vie"]}
    VOLAPUK = {"name": "Volapük", "iso639_1": "vo", "iso639_2": ["vol"]}
    WALLOON = {"name": "Walloon", "iso639_1": "wa", "iso639_2": ["wln"]}
    WELSH = {"name": "Welsh", "iso639_1": "cy", "iso639_2": ["cym", "wel"]}
    WESTERN_FRISIAN = {"name": "Western Frisian", "iso639_1": "fy", "iso639_2": ["fry"]}
    WOLOF = {"name": "Wolof", "iso639_1": "wo", "iso639_2": ["wol"]}
    XHOSA = {"name": "Xhosa", "iso639_1": "xh", "iso639_2": ["xho"]}
    YIDDISH = {"name": "Yiddish", "iso639_1": "yi", "iso639_2": ["yid"]}
    YORUBA = {"name": "Yoruba", "iso639_1": "yo", "iso639_2": ["yor"]}
    ZHUANG = {"name": "Zhuang", "iso639_1": "za", "iso639_2": ["zha"]}
    ZULU = {"name": "Zulu", "iso639_1": "zu", "iso639_2": ["zul"]}

    UNDEFINED = {"name": "undefined", "iso639_1": "xx", "iso639_2": ["und"]}
    FILIPINO = {"name": "Filipino", "iso639_1": "tl", "iso639_2": ["fil"]}

    UNDEFINED = {"name": "undefined", "iso639_1": "xx", "iso639_2": ["und"]}


    @staticmethod
@@ -88,24 +199,22 @@ class IsoLanguage(Enum):
        closestMatches = difflib.get_close_matches(label, [l.value["name"] for l in IsoLanguage], n=1)

        if closestMatches:
            foundLangs = [l for l in IsoLanguage if l.value['name'] == closestMatches[0]]
            foundLangs = [l for l in IsoLanguage if l.value["name"] == closestMatches[0]]
            return foundLangs[0] if foundLangs else IsoLanguage.UNDEFINED
        else:
            return IsoLanguage.UNDEFINED

    @staticmethod
    def findThreeLetter(theeLetter : str):
        foundLangs = [l for l in IsoLanguage if str(theeLetter) in l.value['iso639_2']]
        foundLangs = [l for l in IsoLanguage if str(theeLetter) in l.value["iso639_2"]]
        return foundLangs[0] if foundLangs else IsoLanguage.UNDEFINED


    def label(self):
        return str(self.value['name'])
        return str(self.value["name"])

    def twoLetter(self):
        return str(self.value['iso639_1'])
        return str(self.value["iso639_1"])

    def threeLetter(self):
        return str(self.value['iso639_2'][0])


        return str(self.value["iso639_2"][0])
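The lookup helpers in the diff above can be exercised with a trimmed stand-in enum. `MiniLang` below is hypothetical; the value shape, the `difflib` fuzzy label match, and the membership test over the `iso639_2` list mirror the code in the diff, but only three entries are kept so the sketch stays self-contained:

```python
from enum import Enum
import difflib


class MiniLang(Enum):
    # Trimmed stand-in for IsoLanguage: same value shape, three entries.
    ENGLISH = {"name": "English", "iso639_1": "en", "iso639_2": ["eng"]}
    GERMAN = {"name": "German", "iso639_1": "de", "iso639_2": ["deu", "ger"]}
    UNDEFINED = {"name": "undefined", "iso639_1": "xx", "iso639_2": ["und"]}

    @staticmethod
    def findLabel(label: str) -> "MiniLang":
        # Fuzzy-match the human-readable name, as the diff does with difflib.
        names = [l.value["name"] for l in MiniLang]
        closest = difflib.get_close_matches(label, names, n=1)
        if not closest:
            return MiniLang.UNDEFINED
        found = [l for l in MiniLang if l.value["name"] == closest[0]]
        return found[0] if found else MiniLang.UNDEFINED

    @staticmethod
    def findThreeLetter(code: str) -> "MiniLang":
        # Because both ISO 639-2/T and 639-2/B spellings are listed,
        # "deu" and "ger" resolve to the same member.
        found = [l for l in MiniLang if str(code) in l.value["iso639_2"]]
        return found[0] if found else MiniLang.UNDEFINED

    def threeLetter(self) -> str:
        # First listed code wins, matching threeLetter() in the diff.
        return str(self.value["iso639_2"][0])
```

Note that listing both 639-2 spellings in the list while returning only the first from `threeLetter()` is what lets the change set normalize `"ger"` to `"deu"`.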
@@ -500,7 +500,14 @@ class MediaDescriptor:
        return subtitleFileDescriptors


    def importSubtitles(self, searchDirectory, prefix, season: int = -1, episode: int = -1):
    def importSubtitles(
        self,
        searchDirectory,
        prefix,
        season: int = -1,
        episode: int = -1,
        preserve_dispositions: bool = False,
    ):

        # click.echo(f"Season: {season} Episode: {episode}")
        self.__logger.debug(f"importSubtitles(): Season: {season} Episode: {episode}")
@@ -543,7 +550,7 @@ class MediaDescriptor:
            # Prefer metadata coming from the external single-track source when
            # it is provided explicitly by the filename contract.
            matchingTrack.getTags()["language"] = msfd["language"]
            if msfd["disposition_set"]:
            if msfd["disposition_set"] and not preserve_dispositions:
                matchingTrack.setDispositionSet(msfd["disposition_set"])

@@ -1,5 +1,6 @@
import click

from ffx.iso_language import IsoLanguage
from ffx.media_descriptor import MediaDescriptor
from ffx.track_descriptor import TrackDescriptor

@@ -117,7 +118,11 @@ class MediaDescriptorChangeSet():
                          sourceTrackDescriptor: TrackDescriptor = None):

        sourceTrackTags = sourceTrackDescriptor.getTags() if sourceTrackDescriptor is not None else {}
        targetTrackTags = targetTrackDescriptor.getTags() if targetTrackDescriptor is not None else {}
        targetTrackTags = (
            self.normalizeTrackTags(targetTrackDescriptor.getTags())
            if targetTrackDescriptor is not None
            else {}
        )

        trackCompareResult = {}

@@ -142,6 +147,25 @@ class MediaDescriptorChangeSet():

        return trackCompareResult

    def normalizeTrackTagValue(self, tagKey, tagValue):
        if tagKey != "language":
            return tagValue

        if isinstance(tagValue, IsoLanguage):
            return tagValue.threeLetter()

        trackLanguage = IsoLanguage.findThreeLetter(str(tagValue))
        if trackLanguage != IsoLanguage.UNDEFINED:
            return trackLanguage.threeLetter()

        return tagValue

    def normalizeTrackTags(self, trackTags: dict):
        return {
            tagKey: self.normalizeTrackTagValue(tagKey, tagValue)
            for tagKey, tagValue in trackTags.items()
        }
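The normalization logic added above can be sketched without the `ffx` package: only the `"language"` tag is rewritten, known spellings collapse to a canonical three-letter code, and anything unrecognized passes through unchanged. The `KNOWN_CODES` mapping below is a hypothetical stand-in for the `IsoLanguage.findThreeLetter()` lookup:

```python
# Hypothetical stand-in for the IsoLanguage lookup: maps recognized
# spellings to their canonical ISO 639-2/T code.
KNOWN_CODES = {"eng": "eng", "en": "eng", "ger": "deu", "deu": "deu"}


def normalize_track_tag_value(tag_key: str, tag_value):
    if tag_key != "language":
        return tag_value  # non-language tags pass through untouched
    # Unknown language values also pass through, mirroring the diff's
    # fall-through return when findThreeLetter() yields UNDEFINED.
    return KNOWN_CODES.get(str(tag_value), tag_value)


def normalize_track_tags(track_tags: dict) -> dict:
    return {k: normalize_track_tag_value(k, v) for k, v in track_tags.items()}
```

Normalizing on both the comparison path and the `-metadata` token paths keeps `ffmpeg` from seeing a spurious `ger` vs `deu` difference as a tag change.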

    def generateDispositionTokens(self):
        """
@@ -243,7 +267,7 @@ class MediaDescriptorChangeSet():
        addedTracks: dict = self.__changeSetObj[MediaDescriptorChangeSet.TRACKS_KEY][DIFF_ADDED_KEY]
        trackDescriptor: TrackDescriptor
        for trackDescriptor in addedTracks.values():
            for tagKey, tagValue in trackDescriptor.getTags().items():
            for tagKey, tagValue in self.normalizeTrackTags(trackDescriptor.getTags()).items():
                if not tagKey in self.__removeTrackKeys:
                    metadataTokens += [f"-metadata:s:{trackDescriptor.getType().indicator()}"
                                       + f":{trackDescriptor.getSubIndex()}",
@@ -267,7 +291,7 @@ class MediaDescriptorChangeSet():

            trackDescriptor = self.__targetTrackDescriptorsByIndex[trackIndex]

            for tagKey, tagValue in outputTrackTags.items():
            for tagKey, tagValue in self.normalizeTrackTags(outputTrackTags).items():
                metadataTokens += [f"-metadata:s:{trackDescriptor.getType().indicator()}"
                                   + f":{trackDescriptor.getSubIndex()}",
                                   f"{tagKey}={tagValue}"]
@@ -285,7 +309,7 @@ class MediaDescriptorChangeSet():
            }
            | unchangedTrackTags
        )
        for tagKey, tagValue in preservedTrackTags.items():
        for tagKey, tagValue in self.normalizeTrackTags(preservedTrackTags).items():
            metadataTokens += [f"-metadata:s:{trackDescriptor.getType().indicator()}"
                               + f":{trackDescriptor.getSubIndex()}",
                               f"{tagKey}={tagValue}"]

@@ -559,6 +559,7 @@ class MediaDetailsScreen(Screen):
        try:
            kwargs = {}

            kwargs[ShowDescriptor.CONTEXT_KEY] = self.context
            kwargs[ShowDescriptor.ID_KEY] = int(selected_row_data[0])
            kwargs[ShowDescriptor.NAME_KEY] = str(selected_row_data[1])
            kwargs[ShowDescriptor.YEAR_KEY] = int(selected_row_data[2])

@@ -1,47 +0,0 @@
import os, sys, importlib, inspect, glob, re

from ffx.configuration_controller import ConfigurationController
from ffx.database import databaseContext

from sqlalchemy import Engine
from sqlalchemy.orm import sessionmaker


class Conversion():

    def __init__(self):

        self._context = {}
        self._context['config'] = ConfigurationController()

        self._context['database'] = databaseContext(databasePath=self._context['config'].getDatabaseFilePath())

        self.__databaseSession: sessionmaker = self._context['database']['session']
        self.__databaseEngine: Engine = self._context['database']['engine']


    @staticmethod
    def list():

        basePath = os.path.dirname(__file__)

        filenamePattern = re.compile("conversion_([0-9]+)_([0-9]+)\\.py")

        filenameList = [os.path.basename(fp) for fp in glob.glob(f"{ basePath }/*.py") if fp != __file__]

        versionTupleList = [(fm.group(1), fm.group(2)) for fn in filenameList if (fm := filenamePattern.search(fn))]

        return versionTupleList


    @staticmethod
    def getClassReference(versionFrom, versionTo):
        importlib.import_module(f"ffx.model.conversions.conversion_{ versionFrom }_{ versionTo }")
        for name, obj in inspect.getmembers(sys.modules[f"ffx.model.conversions.conversion_{ versionFrom }_{ versionTo }"]):
            #HINT: Excluding DispositionCombination as it seems to be included by import (?)
            if inspect.isclass(obj) and name != 'Conversion' and name.startswith('Conversion'):
                return obj

    @staticmethod
    def getAllClassReferences():
        return [Conversion.getClassReference(verFrom, verTo) for verFrom, verTo in Conversion.list()]
@@ -1,17 +0,0 @@
import os, sys, importlib, inspect, glob, re

from .conversion import Conversion


class Conversion_2_3(Conversion):

    def __init__(self):
        super().__init__()

    def applyConversion(self):

        s = self.__databaseSession()
        e = self.__databaseEngine

        with e.connect() as c:
            c.execute("ALTER TABLE user ADD COLUMN email VARCHAR(255)")
@@ -1,7 +0,0 @@
import os, sys, importlib, inspect, glob, re

from .conversion import Conversion


class Conversion_3_4(Conversion):
    pass
src/ffx/model/migration/__init__.py (new file, 82 lines)
@@ -0,0 +1,82 @@
from __future__ import annotations

from dataclasses import dataclass
import importlib
import importlib.util


class DatabaseVersionException(Exception):
    def __init__(self, errorMessage):
        super().__init__(errorMessage)


@dataclass(frozen=True)
class MigrationStep:
    versionFrom: int
    versionTo: int
    moduleName: str
    modulePresent: bool


def getMigrationStepModuleName(versionFrom: int, versionTo: int) -> str:
    return f"ffx.model.migration.step_{int(versionFrom)}_{int(versionTo)}"


def migrationStepModuleExists(versionFrom: int, versionTo: int) -> bool:
    moduleName = getMigrationStepModuleName(versionFrom, versionTo)

    try:
        return importlib.util.find_spec(moduleName) is not None
    except ModuleNotFoundError:
        return False


def getMigrationPlan(currentVersion: int, targetVersion: int) -> list[MigrationStep]:
    version = int(currentVersion)
    target = int(targetVersion)
    migrationPlan = []

    while version < target:
        nextVersion = version + 1
        migrationPlan.append(
            MigrationStep(
                versionFrom=version,
                versionTo=nextVersion,
                moduleName=getMigrationStepModuleName(version, nextVersion),
                modulePresent=migrationStepModuleExists(version, nextVersion),
            )
        )
        version = nextVersion

    return migrationPlan


def loadMigrationStep(versionFrom: int, versionTo: int):
    moduleName = getMigrationStepModuleName(versionFrom, versionTo)

    try:
        module = importlib.import_module(moduleName)
    except ModuleNotFoundError as ex:
        if ex.name == moduleName:
            raise DatabaseVersionException(
                f"No migration path from database version {versionFrom} to {versionTo}"
            ) from ex
        raise

    migrationStep = getattr(module, "applyMigration", None)
    if migrationStep is None:
        raise DatabaseVersionException(
            f"Migration module {moduleName} does not define applyMigration()"
        )

    return migrationStep


def migrateDatabase(databaseContext, currentVersion: int, targetVersion: int, setDatabaseVersion):
    for migrationStepInfo in getMigrationPlan(currentVersion, targetVersion):
        migrationStep = loadMigrationStep(
            migrationStepInfo.versionFrom,
            migrationStepInfo.versionTo,
        )
        migrationStep(databaseContext)
        setDatabaseVersion(databaseContext, migrationStepInfo.versionTo)
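The planning part of the module above reduces to walking the schema version one step at a time, so a migration from 2 to 5 is always the chain 2→3, 3→4, 4→5. A minimal self-contained sketch of that behavior (the `Step` dataclass here stands in for `MigrationStep`, without the module-resolution fields):

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class Step:
    # Stand-in for MigrationStep, keeping only the version pair.
    version_from: int
    version_to: int


def plan(current: int, target: int) -> list[Step]:
    # Walk one schema version at a time, like getMigrationPlan() above.
    return [Step(v, v + 1) for v in range(int(current), int(target))]
```

Stepping by single versions means each `step_N_M.py` module only ever has to know about two adjacent schemas, at the cost of running every intermediate step on old databases.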
src/ffx/model/migration/step_2_3.py (new file, 84 lines)
@@ -0,0 +1,84 @@
from sqlalchemy import inspect, text


def applyMigration(databaseContext):
    engine = databaseContext['engine']
    inspector = inspect(engine)
    shiftedSeasonColumns = {
        column['name']
        for column in inspector.get_columns('shifted_seasons')
    }
    showColumns = {
        column['name']
        for column in inspector.get_columns('shows')
    }

    with engine.begin() as connection:
        if 'pattern_id' not in shiftedSeasonColumns:
            connection.execute(text("PRAGMA foreign_keys=OFF"))
            connection.execute(
                text(
                    """
                    CREATE TABLE shifted_seasons_v3 (
                        id INTEGER PRIMARY KEY,
                        show_id INTEGER,
                        pattern_id INTEGER,
                        original_season INTEGER,
                        first_episode INTEGER DEFAULT -1,
                        last_episode INTEGER DEFAULT -1,
                        season_offset INTEGER DEFAULT 0,
                        episode_offset INTEGER DEFAULT 0,
                        FOREIGN KEY(show_id) REFERENCES shows(id) ON DELETE CASCADE,
                        FOREIGN KEY(pattern_id) REFERENCES patterns(id) ON DELETE CASCADE,
                        CHECK (
                            (show_id IS NOT NULL AND pattern_id IS NULL)
                            OR (show_id IS NULL AND pattern_id IS NOT NULL)
                        )
                    )
                    """
                )
            )
            connection.execute(
                text(
                    """
                    INSERT INTO shifted_seasons_v3 (
                        id,
                        show_id,
                        pattern_id,
                        original_season,
                        first_episode,
                        last_episode,
                        season_offset,
                        episode_offset
                    )
                    SELECT
                        id,
                        show_id,
                        NULL,
                        original_season,
                        first_episode,
                        last_episode,
                        season_offset,
                        episode_offset
                    FROM shifted_seasons
                    """
                )
            )
            connection.execute(text("DROP TABLE shifted_seasons"))
            connection.execute(text("ALTER TABLE shifted_seasons_v3 RENAME TO shifted_seasons"))
            connection.execute(
                text("CREATE INDEX ix_shifted_seasons_show_id ON shifted_seasons(show_id)")
            )
            connection.execute(
                text("CREATE INDEX ix_shifted_seasons_pattern_id ON shifted_seasons(pattern_id)")
            )
            connection.execute(text("PRAGMA foreign_keys=ON"))

        if 'quality' not in showColumns:
            connection.execute(
                text("ALTER TABLE shows ADD COLUMN quality INTEGER DEFAULT 0")
            )
        if 'notes' not in showColumns:
            connection.execute(
                text("ALTER TABLE shows ADD COLUMN notes TEXT DEFAULT ''")
            )
|
||||
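SQLite cannot add foreign keys or CHECK constraints via `ALTER TABLE`, which is why this migration rebuilds `shifted_seasons` through a shadow table instead of altering it in place. A minimal, self-contained sketch of the same rebuild pattern, using plain `sqlite3` and hypothetical table names (`items`/`items_v2`) rather than the project's SQLAlchemy setup:

```python
import sqlite3

# In-memory DB standing in for the ffx database; 'items' plays the role
# of the pre-migration shifted_seasons table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE items (id INTEGER PRIMARY KEY, show_id INTEGER)")
conn.execute("INSERT INTO items (id, show_id) VALUES (1, 42)")

# Rebuild: create the new shape, copy rows, drop the old table, rename --
# the same OFF/copy/rename/ON sequence step_2_3.py performs.
conn.execute("PRAGMA foreign_keys=OFF")
conn.execute("""
    CREATE TABLE items_v2 (
        id INTEGER PRIMARY KEY,
        show_id INTEGER,
        pattern_id INTEGER,
        CHECK ((show_id IS NOT NULL AND pattern_id IS NULL)
            OR (show_id IS NULL AND pattern_id IS NOT NULL))
    )
""")
conn.execute(
    "INSERT INTO items_v2 (id, show_id, pattern_id) "
    "SELECT id, show_id, NULL FROM items"
)
conn.execute("DROP TABLE items")
conn.execute("ALTER TABLE items_v2 RENAME TO items")
conn.execute("PRAGMA foreign_keys=ON")

rows = conn.execute("SELECT id, show_id, pattern_id FROM items").fetchall()
```

The copy fills the new `pattern_id` column with `NULL`, so every migrated row satisfies the single-owner CHECK.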
@@ -35,6 +35,7 @@ class Pattern(Base):
    tracks = relationship('Track', back_populates='pattern', cascade="all, delete", lazy='joined')
    media_tags = relationship('MediaTag', back_populates='pattern', cascade="all, delete", lazy='joined')
    shifted_seasons = relationship('ShiftedSeason', back_populates='pattern', cascade="all, delete", lazy='joined')

    quality = Column(Integer, default=0)
@@ -1,6 +1,6 @@
import click

from sqlalchemy import Column, Integer, ForeignKey
from sqlalchemy import CheckConstraint, Column, ForeignKey, Index, Integer
from sqlalchemy.orm import relationship

from .show import Base, Show
@@ -9,6 +9,14 @@ from .show import Base, Show
class ShiftedSeason(Base):

    __tablename__ = 'shifted_seasons'
    __table_args__ = (
        CheckConstraint(
            "(show_id IS NOT NULL AND pattern_id IS NULL) OR (show_id IS NULL AND pattern_id IS NOT NULL)",
            name="ck_shifted_seasons_single_owner",
        ),
        Index("ix_shifted_seasons_show_id", "show_id"),
        Index("ix_shifted_seasons_pattern_id", "pattern_id"),
    )

    # v1.x
    id = Column(Integer, primary_key=True)
@@ -19,9 +27,12 @@ class ShiftedSeason(Base):
    # pattern: Mapped[str] = mapped_column(String, nullable=False)

    # v1.x
    show_id = Column(Integer, ForeignKey('shows.id', ondelete="CASCADE"))
    show_id = Column(Integer, ForeignKey('shows.id', ondelete="CASCADE"), nullable=True)
    show = relationship(Show, back_populates='shifted_seasons', lazy='joined')

    pattern_id = Column(Integer, ForeignKey('patterns.id', ondelete="CASCADE"), nullable=True)
    pattern = relationship('Pattern', back_populates='shifted_seasons', lazy='joined')

    # v2.0
    # show_id: Mapped[int] = mapped_column(ForeignKey("shows.id", ondelete="CASCADE"))
    # show: Mapped["Show"] = relationship(back_populates="patterns")
@@ -39,6 +50,12 @@ class ShiftedSeason(Base):
    def getId(self):
        return self.id

    def getShowId(self):
        return self.show_id

    def getPatternId(self):
        return self.pattern_id

    def getOriginalSeason(self):
        return self.original_season
@@ -61,6 +78,8 @@ class ShiftedSeason(Base):

        shiftedSeasonObj = {}

        shiftedSeasonObj['show_id'] = self.getShowId()
        shiftedSeasonObj['pattern_id'] = self.getPatternId()
        shiftedSeasonObj['original_season'] = self.getOriginalSeason()
        shiftedSeasonObj['first_episode'] = self.getFirstEpisode()
        shiftedSeasonObj['last_episode'] = self.getLastEpisode()
@@ -68,4 +87,3 @@ class ShiftedSeason(Base):
        shiftedSeasonObj['episode_offset'] = self.getEpisodeOffset()

        return shiftedSeasonObj
@@ -1,5 +1,5 @@
# from typing import List
from sqlalchemy import create_engine, Column, Integer, String, ForeignKey
from sqlalchemy import create_engine, Column, Integer, String, Text, ForeignKey
from sqlalchemy.orm import relationship, declarative_base, sessionmaker

from ffx.show_descriptor import ShowDescriptor
@@ -45,6 +45,8 @@ class Show(Base):
    index_episode_digits = Column(Integer, default=ShowDescriptor.DEFAULT_INDEX_EPISODE_DIGITS)
    indicator_season_digits = Column(Integer, default=ShowDescriptor.DEFAULT_INDICATOR_SEASON_DIGITS)
    indicator_episode_digits = Column(Integer, default=ShowDescriptor.DEFAULT_INDICATOR_EPISODE_DIGITS)
    quality = Column(Integer, default=0)
    notes = Column(Text, default='')


    def getDescriptor(self, context):
@@ -58,5 +60,7 @@ class Show(Base):
        kwargs[ShowDescriptor.INDEX_EPISODE_DIGITS_KEY] = int(self.index_episode_digits)
        kwargs[ShowDescriptor.INDICATOR_SEASON_DIGITS_KEY] = int(self.indicator_season_digits)
        kwargs[ShowDescriptor.INDICATOR_EPISODE_DIGITS_KEY] = int(self.indicator_episode_digits)
        kwargs[ShowDescriptor.QUALITY_KEY] = int(self.quality or 0)
        kwargs[ShowDescriptor.NOTES_KEY] = str(self.notes or '')

        return ShowDescriptor(**kwargs)
@@ -9,6 +9,8 @@ from ffx.model.pattern import Pattern

from .track_details_screen import TrackDetailsScreen
from .track_delete_screen import TrackDeleteScreen
from .shifted_season_delete_screen import ShiftedSeasonDeleteScreen
from .shifted_season_details_screen import ShiftedSeasonDetailsScreen

from .tag_details_screen import TagDetailsScreen
from .tag_delete_screen import TagDeleteScreen
@@ -24,6 +26,7 @@ from textual.widgets._data_table import CellDoesNotExist
from ffx.file_properties import FileProperties
from ffx.iso_language import IsoLanguage
from ffx.audio_layout import AudioLayout
from ffx.model.shifted_season import ShiftedSeason

from ffx.helper import formatRichColor, removeRichColor
@@ -34,8 +37,8 @@ class PatternDetailsScreen(Screen):
    CSS = """

    Grid {
        grid-size: 7 17;
        grid-rows: 2 2 2 2 2 2 6 2 2 8 2 2 8 2 2 2 2;
        grid-size: 7 20;
        grid-rows: 2 2 2 2 2 2 6 2 2 8 2 2 8 2 2 8 2 2 2 2;
        grid-columns: 25 25 25 25 25 25 25;
        height: 100%;
        width: 100%;
@@ -115,11 +118,13 @@ class PatternDetailsScreen(Screen):
            show=True,
            track=True,
            tag=True,
            shifted_season=True,
        )
        self.__pc = controllers['pattern']
        self.__sc = controllers['show']
        self.__tc = controllers['track']
        self.__tac = controllers['tag']
        self.__ssc = controllers['shifted_season']

        self.__pattern : Pattern = self.__pc.getPattern(patternId) if patternId is not None else None
        self.__showDescriptor = self.__sc.getShowDescriptor(showId) if showId is not None else None
@@ -258,6 +263,72 @@ class PatternDetailsScreen(Screen):
            row = (formatRichColor(tagKey, textColor), formatRichColor(tagValue, textColor))
            self.tagsTable.add_row(*map(str, row))

    def updateShiftedSeasons(self):

        self.shiftedSeasonsTable.clear()

        if self.__pattern is None:
            return

        shiftedSeason: ShiftedSeason
        for shiftedSeason in self.__ssc.getShiftedSeasonSiblings(patternId=self.__pattern.getId()):
            shiftedSeasonObj = shiftedSeason.getObj()

            firstEpisode = shiftedSeasonObj['first_episode']
            firstEpisodeStr = str(firstEpisode) if firstEpisode != -1 else ''

            lastEpisode = shiftedSeasonObj['last_episode']
            lastEpisodeStr = str(lastEpisode) if lastEpisode != -1 else ''

            row = (
                shiftedSeasonObj['original_season'],
                firstEpisodeStr,
                lastEpisodeStr,
                shiftedSeasonObj['season_offset'],
                shiftedSeasonObj['episode_offset'],
            )

            self.shiftedSeasonsTable.add_row(*map(str, row))

    def getSelectedShiftedSeasonObjFromInput(self):

        shiftedSeasonObj = {}

        try:
            row_key, col_key = self.shiftedSeasonsTable.coordinate_to_cell_key(
                self.shiftedSeasonsTable.cursor_coordinate
            )

            if row_key is not None:
                selected_row_data = self.shiftedSeasonsTable.get_row(row_key)

                def parse_int_or_default(value: str, default: int) -> int:
                    try:
                        return int(value)
                    except (TypeError, ValueError):
                        return default

                shiftedSeasonObj['original_season'] = int(selected_row_data[0])
                shiftedSeasonObj['first_episode'] = parse_int_or_default(selected_row_data[1], -1)
                shiftedSeasonObj['last_episode'] = parse_int_or_default(selected_row_data[2], -1)
                shiftedSeasonObj['season_offset'] = parse_int_or_default(selected_row_data[3], 0)
                shiftedSeasonObj['episode_offset'] = parse_int_or_default(selected_row_data[4], 0)

                if self.__pattern is not None:
                    shiftedSeasonId = self.__ssc.findShiftedSeason(
                        patternId=self.__pattern.getId(),
                        originalSeason=shiftedSeasonObj['original_season'],
                        firstEpisode=shiftedSeasonObj['first_episode'],
                        lastEpisode=shiftedSeasonObj['last_episode'],
                    )
                    if shiftedSeasonId is not None:
                        shiftedSeasonObj['id'] = shiftedSeasonId

        except CellDoesNotExist:
            pass

        return shiftedSeasonObj

    def on_mount(self):
@@ -276,6 +347,7 @@ class PatternDetailsScreen(Screen):

        self.updateTags()
        self.updateTracks()
        self.updateShiftedSeasons()

    def compose(self):
@@ -304,6 +376,16 @@ class PatternDetailsScreen(Screen):

        self.tracksTable.cursor_type = 'row'

        self.shiftedSeasonsTable = DataTable(classes="seven")

        self.column_key_original_season = self.shiftedSeasonsTable.add_column("Source Season", width=18)
        self.column_key_first_episode = self.shiftedSeasonsTable.add_column("First Episode", width=18)
        self.column_key_last_episode = self.shiftedSeasonsTable.add_column("Last Episode", width=18)
        self.column_key_season_offset = self.shiftedSeasonsTable.add_column("Season Offset", width=18)
        self.column_key_episode_offset = self.shiftedSeasonsTable.add_column("Episode Offset", width=18)

        self.shiftedSeasonsTable.cursor_type = 'row'

        yield Header()
@@ -345,6 +427,27 @@ class PatternDetailsScreen(Screen):
        yield Static(" ", classes="seven")

        # 9
        yield Static("Shifted Seasons")
        if self.__pattern is not None:
            yield Button("Add", id="button_add_shifted_season")
            yield Button("Edit", id="button_edit_shifted_season")
            yield Button("Delete", id="button_delete_shifted_season")
        else:
            yield Static(" ")
            yield Static(" ")
            yield Static(" ")

        yield Static(" ")
        yield Static(" ")
        yield Static(" ")

        # 10
        yield self.shiftedSeasonsTable

        # 11
        yield Static(" ", classes="seven")

        # 12
        yield Static("Media Tags")
        yield Button("Add", id="button_add_tag")
        yield Button("Edit", id="button_edit_tag")
@@ -354,13 +457,13 @@ class PatternDetailsScreen(Screen):
        yield Static(" ")
        yield Static(" ")

        # 10
        # 13
        yield self.tagsTable

        # 11
        # 14
        yield Static(" ", classes="seven")

        # 12
        # 15
        yield Static("Streams")
        yield Button("Add", id="button_add_track")
        yield Button("Edit", id="button_edit_track")
@@ -370,21 +473,21 @@ class PatternDetailsScreen(Screen):
        yield Button("Up", id="button_track_up")
        yield Button("Down", id="button_track_down")

        # 13
        # 16
        yield self.tracksTable

        # 14
        # 17
        yield Static(" ", classes="seven")

        # 15
        # 18
        yield Static(" ", classes="seven")

        # 16
        # 19
        yield Button("Save", id="save_button")
        yield Button("Cancel", id="cancel_button")
        yield Static(" ", classes="five")

        # 17
        # 20
        yield Static(" ", classes="seven")

        yield Footer()
@@ -486,6 +589,35 @@ class PatternDetailsScreen(Screen):
        if event.button.id == "cancel_button":
            self.app.pop_screen()

        if event.button.id == "button_add_shifted_season":
            if self.__pattern is not None:
                self.app.push_screen(
                    ShiftedSeasonDetailsScreen(patternId=self.__pattern.getId()),
                    self.handle_update_shifted_season,
                )

        if event.button.id == "button_edit_shifted_season":
            selectedShiftedSeasonObj = self.getSelectedShiftedSeasonObjFromInput()
            if 'id' in selectedShiftedSeasonObj.keys():
                self.app.push_screen(
                    ShiftedSeasonDetailsScreen(
                        patternId=self.__pattern.getId(),
                        shiftedSeasonId=selectedShiftedSeasonObj['id'],
                    ),
                    self.handle_update_shifted_season,
                )

        if event.button.id == "button_delete_shifted_season":
            selectedShiftedSeasonObj = self.getSelectedShiftedSeasonObjFromInput()
            if 'id' in selectedShiftedSeasonObj.keys():
                self.app.push_screen(
                    ShiftedSeasonDeleteScreen(
                        patternId=self.__pattern.getId(),
                        shiftedSeasonId=selectedShiftedSeasonObj['id'],
                    ),
                    self.handle_delete_shifted_season,
                )

        numTracks = len(self.getCurrentTrackDescriptors())

@@ -654,3 +786,9 @@ class PatternDetailsScreen(Screen):
            self.updateTags()
        else:
            raise click.ClickException('tag delete failed')

    def handle_update_shifted_season(self, screenResult):
        self.updateShiftedSeasons()

    def handle_delete_shifted_season(self, screenResult):
        self.updateShiftedSeasons()
@@ -6,225 +6,433 @@ from ffx.model.shifted_season import ShiftedSeason
class EpisodeOrderException(Exception):
    pass


class RangeOverlapException(Exception):
    pass


class ShiftedSeasonController():


class ShiftedSeasonOwnerException(Exception):
    pass


class ShiftedSeasonController:

    def __init__(self, context):

        self.context = context
        self.Session = self.context['database']['session'] # convenience
        self.Session = self.context['database']['session']  # convenience
    def checkShiftedSeason(self, showId: int, shiftedSeasonObj: dict, shiftedSeasonId: int = 0):
    def _resolve_owner(self, showId=None, patternId=None):
        hasShow = showId is not None
        hasPattern = patternId is not None

        if hasShow == hasPattern:
            raise ShiftedSeasonOwnerException(
                "ShiftedSeason rules require exactly one owner: either showId or patternId."
            )

        if hasShow:
            if type(showId) is not int:
                raise ValueError(
                    "ShiftedSeasonController: Argument showId is required to be of type int"
                )
            return {
                'show_id': int(showId),
                'pattern_id': None,
                'label': f"show #{int(showId)}",
            }

        if type(patternId) is not int:
            raise ValueError(
                "ShiftedSeasonController: Argument patternId is required to be of type int"
            )
        return {
            'show_id': None,
            'pattern_id': int(patternId),
            'label': f"pattern #{int(patternId)}",
        }

    def _apply_owner_filter(self, query, owner):
        if owner['pattern_id'] is not None:
            return query.filter(ShiftedSeason.pattern_id == owner['pattern_id'])
        return query.filter(ShiftedSeason.show_id == owner['show_id'])

    def _normalize_shifted_season_fields(self, shiftedSeasonObj: dict):
        if type(shiftedSeasonObj) is not dict:
            raise ValueError(
                "ShiftedSeasonController: Argument shiftedSeasonObj is required to be of type dict"
            )

        fields = {
            'original_season': int(shiftedSeasonObj['original_season']),
            'first_episode': int(shiftedSeasonObj['first_episode']),
            'last_episode': int(shiftedSeasonObj['last_episode']),
            'season_offset': int(shiftedSeasonObj['season_offset']),
            'episode_offset': int(shiftedSeasonObj['episode_offset']),
        }

        firstEpisode = fields['first_episode']
        lastEpisode = fields['last_episode']
        if firstEpisode != -1 and lastEpisode != -1 and lastEpisode < firstEpisode:
            raise EpisodeOrderException(
                "ShiftedSeason last_episode must be greater than or equal to first_episode."
            )

        return fields

    def _ranges_overlap(self, firstEpisodeA, lastEpisodeA, firstEpisodeB, lastEpisodeB):
        startA = float('-inf') if int(firstEpisodeA) == -1 else int(firstEpisodeA)
        endA = float('inf') if int(lastEpisodeA) == -1 else int(lastEpisodeA)
        startB = float('-inf') if int(firstEpisodeB) == -1 else int(firstEpisodeB)
        endB = float('inf') if int(lastEpisodeB) == -1 else int(lastEpisodeB)
        return startA <= endB and startB <= endA

    def _ordered_query(self, session, owner):
        q = self._apply_owner_filter(session.query(ShiftedSeason), owner)
        return q.order_by(
            ShiftedSeason.original_season.asc(),
            ShiftedSeason.first_episode.asc(),
            ShiftedSeason.last_episode.asc(),
            ShiftedSeason.id.asc(),
        )

    def _find_matching_rule(self, session, owner, season: int, episode: int):
        for shiftedSeasonEntry in self._ordered_query(session, owner).all():
            if (
                season == shiftedSeasonEntry.getOriginalSeason()
                and (
                    shiftedSeasonEntry.getFirstEpisode() == -1
                    or episode >= shiftedSeasonEntry.getFirstEpisode()
                )
                and (
                    shiftedSeasonEntry.getLastEpisode() == -1
                    or episode <= shiftedSeasonEntry.getLastEpisode()
                )
            ):
                return shiftedSeasonEntry
        return None
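`_ranges_overlap` treats `-1` as an open endpoint on either side, so two rules collide when their (possibly unbounded) episode intervals intersect. A standalone restatement of that check, useful for experimenting with the sentinel semantics outside the controller:

```python
def ranges_overlap(first_a, last_a, first_b, last_b):
    # -1 is the sentinel for "unbounded" on that side of the range
    start_a = float('-inf') if first_a == -1 else first_a
    end_a = float('inf') if last_a == -1 else last_a
    start_b = float('-inf') if first_b == -1 else first_b
    end_b = float('inf') if last_b == -1 else last_b
    # classic closed-interval intersection test
    return start_a <= end_b and start_b <= end_a

checks = (
    ranges_overlap(1, 12, 13, 24),   # disjoint ranges -> False
    ranges_overlap(1, 12, 12, 24),   # episode 12 is in both -> True
    ranges_overlap(-1, -1, 5, 9),    # fully open range overlaps anything -> True
)
```

Because endpoints are inclusive, adjacent rules must not share a boundary episode; this matches the controller rejecting `first_episode == last_episode` collisions between siblings.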
    def checkShiftedSeason(
        self,
        showId: int | None = None,
        shiftedSeasonObj: dict | None = None,
        shiftedSeasonId: int = 0,
        patternId: int | None = None,
    ):
        """
        Check if for a particula season

        shiftedSeasonId
        Check whether a shifted-season rule is valid within one owner scope.
        """

        session = None
        try:
            s = self.Session()
            owner = self._resolve_owner(showId=showId, patternId=patternId)
            fields = self._normalize_shifted_season_fields(shiftedSeasonObj)
            session = self.Session()

            originalSeason = shiftedSeasonObj['original_season']
            firstEpisode = int(shiftedSeasonObj['first_episode'])
            lastEpisode = int(shiftedSeasonObj['last_episode'])

            q = s.query(ShiftedSeason).filter(ShiftedSeason.show_id == int(showId))
            q = self._ordered_query(session, owner)
            if shiftedSeasonId:
                q = q.filter(ShiftedSeason.id != int(shiftedSeasonId))

            siblingShiftedSeason: ShiftedSeason
            for siblingShiftedSeason in q.all():

                siblingOriginalSeason = siblingShiftedSeason.getOriginalSeason
                siblingFirstEpisode = siblingShiftedSeason.getFirstEpisode()
                siblingLastEpisode = siblingShiftedSeason.getLastEpisode()

                if (originalSeason == siblingOriginalSeason
                    and lastEpisode >= siblingFirstEpisode
                    and siblingLastEpisode >= firstEpisode):
                if fields['original_season'] != siblingShiftedSeason.getOriginalSeason():
                    continue

                if self._ranges_overlap(
                    fields['first_episode'],
                    fields['last_episode'],
                    siblingShiftedSeason.getFirstEpisode(),
                    siblingShiftedSeason.getLastEpisode(),
                ):
                    return False

            return True

        except (EpisodeOrderException, ShiftedSeasonOwnerException) as ex:
            raise click.ClickException(str(ex))
        except Exception as ex:
            raise click.ClickException(f"ShiftedSeasonController.addShiftedSeason(): {repr(ex)}")
            raise click.ClickException(
                f"ShiftedSeasonController.checkShiftedSeason(): {repr(ex)}"
            )
        finally:
            s.close()
            if session is not None:
                session.close()
    def addShiftedSeason(
        self,
        showId: int | None = None,
        shiftedSeasonObj: dict | None = None,
        patternId: int | None = None,
    ):

    def addShiftedSeason(self, showId: int, shiftedSeasonObj: dict):

        if type(showId) is not int:
            raise ValueError(f"ShiftedSeasonController.addShiftedSeason(): Argument showId is required to be of type int")

        if type(shiftedSeasonObj) is not dict:
            raise ValueError(f"ShiftedSeasonController.addShiftedSeason(): Argument shiftedSeasonObj is required to be of type dict")

        session = None
        try:
            s = self.Session()
            owner = self._resolve_owner(showId=showId, patternId=patternId)
            fields = self._normalize_shifted_season_fields(shiftedSeasonObj)

            firstEpisode = int(shiftedSeasonObj['first_episode'])
            lastEpisode = int(shiftedSeasonObj['last_episode'])
            if not self.checkShiftedSeason(
                showId=owner['show_id'],
                patternId=owner['pattern_id'],
                shiftedSeasonObj=fields,
            ):
                raise RangeOverlapException(
                    f"ShiftedSeason rule overlaps with an existing rule for {owner['label']}."
                )

            if lastEpisode < firstEpisode:
                raise EpisodeOrderException()

            q = s.query(ShiftedSeason).filter(ShiftedSeason.show_id == int(showId))

            shiftedSeason = ShiftedSeason(show_id = int(showId),
                original_season = int(shiftedSeasonObj['original_season']),
                first_episode = firstEpisode,
                last_episode = lastEpisode,
                season_offset = int(shiftedSeasonObj['season_offset']),
                episode_offset = int(shiftedSeasonObj['episode_offset']))
            s.add(shiftedSeason)
            s.commit()
            session = self.Session()
            shiftedSeason = ShiftedSeason(
                show_id=owner['show_id'],
                pattern_id=owner['pattern_id'],
                original_season=fields['original_season'],
                first_episode=fields['first_episode'],
                last_episode=fields['last_episode'],
                season_offset=fields['season_offset'],
                episode_offset=fields['episode_offset'],
            )
            session.add(shiftedSeason)
            session.commit()
            return shiftedSeason.getId()

        except (EpisodeOrderException, RangeOverlapException, ShiftedSeasonOwnerException) as ex:
            raise click.ClickException(str(ex))
        except Exception as ex:
            raise click.ClickException(f"ShiftedSeasonController.addShiftedSeason(): {repr(ex)}")
            raise click.ClickException(
                f"ShiftedSeasonController.addShiftedSeason(): {repr(ex)}"
            )
        finally:
            s.close()

            if session is not None:
                session.close()
    def updateShiftedSeason(self, shiftedSeasonId: int, shiftedSeasonObj: dict):

        if type(shiftedSeasonId) is not int:
            raise ValueError(f"ShiftedSeasonController.updateShiftedSeason(): Argument shiftedSeasonId is required to be of type int")

        if type(shiftedSeasonObj) is not dict:
            raise ValueError(f"ShiftedSeasonController.updateShiftedSeason(): Argument shiftedSeasonObj is required to be of type dict")
            raise ValueError(
                "ShiftedSeasonController.updateShiftedSeason(): Argument shiftedSeasonId is required to be of type int"
            )

        session = None
        try:
            s = self.Session()
            fields = self._normalize_shifted_season_fields(shiftedSeasonObj)
            session = self.Session()

            shiftedSeason = s.query(ShiftedSeason).filter(ShiftedSeason.id == int(shiftedSeasonId)).first()
            shiftedSeason = (
                session.query(ShiftedSeason)
                .filter(ShiftedSeason.id == int(shiftedSeasonId))
                .first()
            )

            if shiftedSeason is not None:

                shiftedSeason.original_season = int(shiftedSeasonObj['original_season'])
                shiftedSeason.first_episode = int(shiftedSeasonObj['first_episode'])
                shiftedSeason.last_episode = int(shiftedSeasonObj['last_episode'])
                shiftedSeason.season_offset = int(shiftedSeasonObj['season_offset'])
                shiftedSeason.episode_offset = int(shiftedSeasonObj['episode_offset'])

                s.commit()
                return True

            else:
            if shiftedSeason is None:
                return False

            owner = self._resolve_owner(
                showId=shiftedSeason.getShowId(),
                patternId=shiftedSeason.getPatternId(),
            )
            if not self.checkShiftedSeason(
                showId=owner['show_id'],
                patternId=owner['pattern_id'],
                shiftedSeasonObj=fields,
                shiftedSeasonId=shiftedSeasonId,
            ):
                raise RangeOverlapException(
                    f"ShiftedSeason rule overlaps with an existing rule for {owner['label']}."
                )

            shiftedSeason.original_season = fields['original_season']
            shiftedSeason.first_episode = fields['first_episode']
            shiftedSeason.last_episode = fields['last_episode']
            shiftedSeason.season_offset = fields['season_offset']
            shiftedSeason.episode_offset = fields['episode_offset']

            session.commit()
            return True

        except (EpisodeOrderException, RangeOverlapException, ShiftedSeasonOwnerException) as ex:
            raise click.ClickException(str(ex))
        except Exception as ex:
            raise click.ClickException(f"ShiftedSeasonController.updateShiftedSeason(): {repr(ex)}")
            raise click.ClickException(
                f"ShiftedSeasonController.updateShiftedSeason(): {repr(ex)}"
            )
        finally:
            s.close()
            if session is not None:
                session.close()
    def findShiftedSeason(self, showId: int, originalSeason: int, firstEpisode: int, lastEpisode: int):

        if type(showId) is not int:
            raise ValueError(f"ShiftedSeasonController.findShiftedSeason(): Argument shiftedSeasonId is required to be of type int")
    def findShiftedSeason(
        self,
        showId: int | None = None,
        originalSeason: int | None = None,
        firstEpisode: int | None = None,
        lastEpisode: int | None = None,
        patternId: int | None = None,
    ):

        if type(originalSeason) is not int:
            raise ValueError(f"ShiftedSeasonController.findShiftedSeason(): Argument originalSeason is required to be of type int")
            raise ValueError(
                "ShiftedSeasonController.findShiftedSeason(): Argument originalSeason is required to be of type int"
            )

        if type(firstEpisode) is not int:
            raise ValueError(f"ShiftedSeasonController.findShiftedSeason(): Argument firstEpisode is required to be of type int")
            raise ValueError(
                "ShiftedSeasonController.findShiftedSeason(): Argument firstEpisode is required to be of type int"
            )

        if type(lastEpisode) is not int:
            raise ValueError(f"ShiftedSeasonController.findShiftedSeason(): Argument lastEpisode is required to be of type int")
            raise ValueError(
                "ShiftedSeasonController.findShiftedSeason(): Argument lastEpisode is required to be of type int"
            )

        session = None
        try:
            s = self.Session()
            shiftedSeason = s.query(ShiftedSeason).filter(
                ShiftedSeason.show_id == int(showId),
                ShiftedSeason.original_season == int(originalSeason),
                ShiftedSeason.first_episode == int(firstEpisode),
                ShiftedSeason.last_episode == int(lastEpisode),
            ).first()
            owner = self._resolve_owner(showId=showId, patternId=patternId)
            session = self.Session()
            shiftedSeason = (
                self._apply_owner_filter(session.query(ShiftedSeason), owner)
                .filter(
                    ShiftedSeason.original_season == int(originalSeason),
                    ShiftedSeason.first_episode == int(firstEpisode),
                    ShiftedSeason.last_episode == int(lastEpisode),
                )
                .first()
            )

            return shiftedSeason.getId() if shiftedSeason is not None else None

        except ShiftedSeasonOwnerException as ex:
            raise click.ClickException(str(ex))
        except Exception as ex:
            raise click.ClickException(f"PatternController.findShiftedSeason(): {repr(ex)}")
            raise click.ClickException(
                f"ShiftedSeasonController.findShiftedSeason(): {repr(ex)}"
            )
        finally:
            s.close()
            if session is not None:
                session.close()
    def getShiftedSeasonSiblings(self, showId: int):

        if type(showId) is not int:
            raise ValueError(f"ShiftedSeasonController.getShiftedSeasonSiblings(): Argument shiftedSeasonId is required to be of type int")
    def getShiftedSeasonSiblings(
        self,
        showId: int | None = None,
        patternId: int | None = None,
    ):
        session = None

        try:
            s = self.Session()
            q = s.query(ShiftedSeason).filter(ShiftedSeason.show_id == int(showId))

            return q.all()
            owner = self._resolve_owner(showId=showId, patternId=patternId)
            session = self.Session()
            return self._ordered_query(session, owner).all()

        except ShiftedSeasonOwnerException as ex:
            raise click.ClickException(str(ex))
        except Exception as ex:
            raise click.ClickException(f"PatternController.getShiftedSeasonSiblings(): {repr(ex)}")
            raise click.ClickException(
                f"ShiftedSeasonController.getShiftedSeasonSiblings(): {repr(ex)}"
            )
        finally:
            s.close()

            if session is not None:
                session.close()
    def getShiftedSeason(self, shiftedSeasonId: int):

        if type(shiftedSeasonId) is not int:
            raise ValueError(f"ShiftedSeasonController.getShiftedSeason(): Argument shiftedSeasonId is required to be of type int")
            raise ValueError(
                "ShiftedSeasonController.getShiftedSeason(): Argument shiftedSeasonId is required to be of type int"
            )

        session = None
        try:
            s = self.Session()
            return s.query(ShiftedSeason).filter(ShiftedSeason.id == int(shiftedSeasonId)).first()
            session = self.Session()
            return (
                session.query(ShiftedSeason)
                .filter(ShiftedSeason.id == int(shiftedSeasonId))
                .first()
            )

        except Exception as ex:
            raise click.ClickException(f"ShiftedSeasonController.getShiftedSeason(): {repr(ex)}")
            raise click.ClickException(
                f"ShiftedSeasonController.getShiftedSeason(): {repr(ex)}"
            )
        finally:
            s.close()

            if session is not None:
                session.close()
     def deleteShiftedSeason(self, shiftedSeasonId):
 
         if type(shiftedSeasonId) is not int:
-            raise ValueError(f"ShiftedSeasonController.deleteShiftedSeason(): Argument shiftedSeasonId is required to be of type int")
+            raise ValueError(
+                "ShiftedSeasonController.deleteShiftedSeason(): Argument shiftedSeasonId is required to be of type int"
+            )
 
+        session = None
         try:
-            s = self.Session()
-            shiftedSeason = s.query(ShiftedSeason).filter(ShiftedSeason.id == int(shiftedSeasonId)).first()
+            session = self.Session()
+            shiftedSeason = (
+                session.query(ShiftedSeason)
+                .filter(ShiftedSeason.id == int(shiftedSeasonId))
+                .first()
+            )
 
             if shiftedSeason is not None:
 
                 #DAFUQ: https://stackoverflow.com/a/19245058
                 # q.delete()
-                s.delete(shiftedSeason)
-
-                s.commit()
+                session.delete(shiftedSeason)
+                session.commit()
                 return True
             return False
 
         except Exception as ex:
-            raise click.ClickException(f"ShiftedSeasonController.deleteShiftedSeason(): {repr(ex)}")
+            raise click.ClickException(
+                f"ShiftedSeasonController.deleteShiftedSeason(): {repr(ex)}"
+            )
         finally:
-            s.close()
+            if session is not None:
+                session.close()
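The session handling change in these two methods guards against a subtle failure mode: with the old `finally: s.close()`, an exception raised by `self.Session()` itself left `s` unbound, so the `finally` clause masked the real error with an `UnboundLocalError`. A minimal, self-contained sketch of the corrected pattern (`fetch_row` and `FakeSession` are illustrative names, not from the codebase):

```python
def fetch_row(session_factory, row_id):
    # Bind the name before the try block so the finally clause can
    # always run safely, even when session_factory() itself raises.
    session = None
    try:
        session = session_factory()
        return session.get(row_id)
    finally:
        if session is not None:
            session.close()


class FakeSession:
    """Stand-in for a SQLAlchemy session; illustrative only."""

    def __init__(self):
        self.closed = False

    def get(self, row_id):
        return row_id * 2

    def close(self):
        self.closed = True
```

With this shape, a failing factory propagates its own exception instead of an `UnboundLocalError` from the cleanup path, and a successful call is still guaranteed to close its session.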
-    def shiftSeason(self, showId, season, episode):
+    def shiftSeason(self, showId, season, episode, patternId=None):
 
         if season == -1 or episode == -1:
             return season, episode
 
-        shiftedSeasonEntry: ShiftedSeason
-        for shiftedSeasonEntry in self.getShiftedSeasonSiblings(showId):
-
-            if (season == shiftedSeasonEntry.getOriginalSeason()
-                    and (shiftedSeasonEntry.getFirstEpisode() == -1 or episode >= shiftedSeasonEntry.getFirstEpisode())
-                    and (shiftedSeasonEntry.getLastEpisode() == -1 or episode <= shiftedSeasonEntry.getLastEpisode())):
-
-                shiftedSeason = season + shiftedSeasonEntry.getSeasonOffset()
-                shiftedEpisode = episode + shiftedSeasonEntry.getEpisodeOffset()
-
-                self.context['logger'].info(f"Shifting season: {season} episode: {episode} "
-                    +f"-> season: {shiftedSeason} episode: {shiftedEpisode}")
-
-                return shiftedSeason, shiftedEpisode
-
-        return season, episode
+        session = None
+        try:
+            session = self.Session()
+            activeShift = None
+
+            if patternId is not None:
+                activeShift = self._find_matching_rule(
+                    session,
+                    self._resolve_owner(patternId=patternId),
+                    season=int(season),
+                    episode=int(episode),
+                )
+
+            if activeShift is None and showId is not None and showId != -1:
+                activeShift = self._find_matching_rule(
+                    session,
+                    self._resolve_owner(showId=showId),
+                    season=int(season),
+                    episode=int(episode),
+                )
+
+            if activeShift is None:
+                shiftedSeason = season
+                shiftedEpisode = episode
+                sourceLabel = "default"
+            else:
+                shiftedSeason = season + activeShift.getSeasonOffset()
+                shiftedEpisode = episode + activeShift.getEpisodeOffset()
+                sourceLabel = (
+                    "pattern"
+                    if activeShift.getPatternId() is not None
+                    else "show"
+                )
+
+            self.context['logger'].info(
+                f"Setting season shift {season}/{episode} -> {shiftedSeason}/{shiftedEpisode} from {sourceLabel}"
+            )
+
+            return shiftedSeason, shiftedEpisode
+
+        except ShiftedSeasonOwnerException as ex:
+            raise click.ClickException(str(ex))
+        except Exception as ex:
+            raise click.ClickException(
+                f"ShiftedSeasonController.shiftSeason(): {repr(ex)}"
+            )
+        finally:
+            if session is not None:
+                session.close()
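Both the old sibling loop and the new rule lookup share the same matching semantics: a rule applies when the source season equals `original_season` and the episode falls inside the optional `first_episode`/`last_episode` bounds, where `-1` means unbounded. That contract can be stated as a small pure function; `apply_shift` is illustrative, using the same dict keys as the `shiftedSeasonObj` payloads elsewhere in this diff:

```python
def apply_shift(season, episode, rule):
    """Return (season, episode) shifted by rule if it matches, else unchanged."""
    matches = (
        season == rule["original_season"]
        and (rule["first_episode"] == -1 or episode >= rule["first_episode"])
        and (rule["last_episode"] == -1 or episode <= rule["last_episode"])
    )
    if not matches:
        return season, episode
    return season + rule["season_offset"], episode + rule["episode_offset"]


# The shift seeded in the unmux tests: S01E01..S01E99 maps into season 2.
rule = {
    "original_season": 1,
    "first_episode": 1,
    "last_episode": 99,
    "season_offset": 1,
    "episode_offset": -88,
}
```

Under this rule, S01E89 maps to S02E01, while episodes outside the bounds (or other seasons) pass through unchanged.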
@@ -43,7 +43,7 @@ class ShiftedSeasonDeleteScreen(Screen):
     }
     """
 
-    def __init__(self, showId = None, shiftedSeasonId = None):
+    def __init__(self, showId = None, patternId = None, shiftedSeasonId = None):
         super().__init__()
 
         self.context = self.app.getContext()
@@ -52,6 +52,7 @@ class ShiftedSeasonDeleteScreen(Screen):
         self.__ssc = ShiftedSeasonController(context = self.context)
 
         self._showId = showId
+        self._patternId = patternId
         self.__shiftedSeasonId = shiftedSeasonId
 
 
@@ -59,7 +60,12 @@ class ShiftedSeasonDeleteScreen(Screen):
 
         shiftedSeason: ShiftedSeason = self.__ssc.getShiftedSeason(self.__shiftedSeasonId)
 
-        self.query_one("#static_show_id", Static).update(str(self._showId))
+        ownerLabel = (
+            f"pattern #{self._patternId}"
+            if self._patternId is not None
+            else f"show #{self._showId}"
+        )
+        self.query_one("#static_owner", Static).update(ownerLabel)
         self.query_one("#static_original_season", Static).update(str(shiftedSeason.getOriginalSeason()))
         self.query_one("#static_first_episode", Static).update(str(shiftedSeason.getFirstEpisode()))
         self.query_one("#static_last_episode", Static).update(str(shiftedSeason.getLastEpisode()))
@@ -77,12 +83,12 @@ class ShiftedSeasonDeleteScreen(Screen):
 
         yield Static(" ", classes="two")
 
-        yield Static("from show")
-        yield Static(" ", id="static_show_id")
+        yield Static("from")
+        yield Static(" ", id="static_owner")
 
         yield Static(" ", classes="two")
 
-        yield Static("Original season")
+        yield Static("Source season")
         yield Static(" ", id="static_original_season")
 
         yield Static("First episode")
@@ -122,4 +128,3 @@ class ShiftedSeasonDeleteScreen(Screen):
 
         if event.button.id == "cancel_button":
             self.app.pop_screen()
@@ -81,7 +81,7 @@ class ShiftedSeasonDetailsScreen(Screen):
     }
     """
 
-    def __init__(self, showId = None, shiftedSeasonId = None):
+    def __init__(self, showId = None, patternId = None, shiftedSeasonId = None):
         super().__init__()
 
         self.context = self.app.getContext()
@@ -90,8 +90,14 @@ class ShiftedSeasonDetailsScreen(Screen):
         self.__ssc = ShiftedSeasonController(context = self.context)
 
         self.__showId = showId
+        self.__patternId = patternId
         self.__shiftedSeasonId = shiftedSeasonId
 
+    def _owner_kwargs(self):
+        if self.__patternId is not None:
+            return {'patternId': self.__patternId}
+        return {'showId': self.__showId}
+
     def on_mount(self):
 
         if self.__shiftedSeasonId is not None:
@@ -126,7 +132,7 @@ class ShiftedSeasonDetailsScreen(Screen):
         yield Static(" ", classes="three")
 
         # 3
-        yield Static("Original season")
+        yield Static("Source season")
         yield Input(id="input_original_season", classes="two")
 
         # 4
@@ -203,8 +209,11 @@ class ShiftedSeasonDetailsScreen(Screen):
 
         if self.__shiftedSeasonId is not None:
 
-            if self.__ssc.checkShiftedSeason(self.__showId, shiftedSeasonObj,
-                    shiftedSeasonId = self.__shiftedSeasonId):
+            if self.__ssc.checkShiftedSeason(
+                shiftedSeasonObj=shiftedSeasonObj,
+                shiftedSeasonId=self.__shiftedSeasonId,
+                **self._owner_kwargs(),
+            ):
                 if self.__ssc.updateShiftedSeason(self.__shiftedSeasonId, shiftedSeasonObj):
                     self.dismiss((self.__shiftedSeasonId, shiftedSeasonObj))
                 else:
@@ -212,8 +221,14 @@ class ShiftedSeasonDetailsScreen(Screen):
             self.app.pop_screen()
 
         else:
-            if self.__ssc.checkShiftedSeason(self.__showId, shiftedSeasonObj):
-                self.__shiftedSeasonId = self.__ssc.addShiftedSeason(self.__showId, shiftedSeasonObj)
+            if self.__ssc.checkShiftedSeason(
+                shiftedSeasonObj=shiftedSeasonObj,
+                **self._owner_kwargs(),
+            ):
+                self.__shiftedSeasonId = self.__ssc.addShiftedSeason(
+                    shiftedSeasonObj=shiftedSeasonObj,
+                    **self._owner_kwargs(),
+                )
                 self.dismiss((self.__shiftedSeasonId, shiftedSeasonObj))
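The `_owner_kwargs()` helper lets a single call site serve either owner kind (a show or a pattern) instead of branching at every controller call. A reduced sketch of the dispatch; both function names here are illustrative:

```python
def owner_kwargs(show_id=None, pattern_id=None):
    # A pattern owner takes precedence when both are supplied.
    if pattern_id is not None:
        return {"patternId": pattern_id}
    return {"showId": show_id}


def describe_owner(showId=None, patternId=None, **_ignored):
    # Mimics a controller method that accepts either owner keyword.
    if patternId is not None:
        return ("pattern", patternId)
    return ("show", showId)
```

Unpacking the dict with `**owner_kwargs(...)` keeps the controller signatures keyword-only at the call site, which is what makes the shared detail/delete screens reusable for both owners.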
@@ -62,7 +62,9 @@ class ShowController():
                 index_season_digits = showDescriptor.getIndexSeasonDigits(),
                 index_episode_digits = showDescriptor.getIndexEpisodeDigits(),
                 indicator_season_digits = showDescriptor.getIndicatorSeasonDigits(),
-                indicator_episode_digits = showDescriptor.getIndicatorEpisodeDigits())
+                indicator_episode_digits = showDescriptor.getIndicatorEpisodeDigits(),
+                quality = showDescriptor.getQuality(),
+                notes = showDescriptor.getNotes())
 
             s.add(show)
             s.commit()
@@ -88,6 +90,12 @@ class ShowController():
             if currentShow.indicator_episode_digits != int(showDescriptor.getIndicatorEpisodeDigits()):
                 currentShow.indicator_episode_digits = int(showDescriptor.getIndicatorEpisodeDigits())
                 changed = True
+            if int(currentShow.quality or 0) != int(showDescriptor.getQuality()):
+                currentShow.quality = int(showDescriptor.getQuality())
+                changed = True
+            if str(currentShow.notes or '') != str(showDescriptor.getNotes()):
+                currentShow.notes = str(showDescriptor.getNotes())
+                changed = True
 
             if changed:
                 s.commit()
@@ -1,3 +1,10 @@
+from .configuration_controller import ConfigurationController
+from .constants import (
+    DEFAULT_SHOW_INDEX_EPISODE_DIGITS,
+    DEFAULT_SHOW_INDEX_SEASON_DIGITS,
+    DEFAULT_SHOW_INDICATOR_EPISODE_DIGITS,
+    DEFAULT_SHOW_INDICATOR_SEASON_DIGITS,
+)
 from .logging_utils import get_ffx_logger
 
 
@@ -14,11 +21,45 @@ class ShowDescriptor():
     INDEX_EPISODE_DIGITS_KEY = 'index_episode_digits'
     INDICATOR_SEASON_DIGITS_KEY = 'indicator_season_digits'
     INDICATOR_EPISODE_DIGITS_KEY = 'indicator_episode_digits'
+    QUALITY_KEY = 'quality'
+    NOTES_KEY = 'notes'
 
-    DEFAULT_INDEX_SEASON_DIGITS = 2
-    DEFAULT_INDEX_EPISODE_DIGITS = 2
-    DEFAULT_INDICATOR_SEASON_DIGITS = 2
-    DEFAULT_INDICATOR_EPISODE_DIGITS = 2
+    DEFAULT_INDEX_SEASON_DIGITS = DEFAULT_SHOW_INDEX_SEASON_DIGITS
+    DEFAULT_INDEX_EPISODE_DIGITS = DEFAULT_SHOW_INDEX_EPISODE_DIGITS
+    DEFAULT_INDICATOR_SEASON_DIGITS = DEFAULT_SHOW_INDICATOR_SEASON_DIGITS
+    DEFAULT_INDICATOR_EPISODE_DIGITS = DEFAULT_SHOW_INDICATOR_EPISODE_DIGITS
+
+    @classmethod
+    def getDefaultDigitLengths(cls, context: dict | None = None) -> dict[str, int]:
+        configurationData = {}
+
+        if context is not None:
+            configController = context.get('config')
+            if configController is not None and hasattr(configController, 'getData'):
+                configurationData = configController.getData()
+
+        return {
+            cls.INDEX_SEASON_DIGITS_KEY: ConfigurationController.getConfiguredIntegerValue(
+                configurationData,
+                ConfigurationController.DEFAULT_INDEX_SEASON_DIGITS_CONFIG_KEY,
+                cls.DEFAULT_INDEX_SEASON_DIGITS,
+            ),
+            cls.INDEX_EPISODE_DIGITS_KEY: ConfigurationController.getConfiguredIntegerValue(
+                configurationData,
+                ConfigurationController.DEFAULT_INDEX_EPISODE_DIGITS_CONFIG_KEY,
+                cls.DEFAULT_INDEX_EPISODE_DIGITS,
+            ),
+            cls.INDICATOR_SEASON_DIGITS_KEY: ConfigurationController.getConfiguredIntegerValue(
+                configurationData,
+                ConfigurationController.DEFAULT_INDICATOR_SEASON_DIGITS_CONFIG_KEY,
+                cls.DEFAULT_INDICATOR_SEASON_DIGITS,
+            ),
+            cls.INDICATOR_EPISODE_DIGITS_KEY: ConfigurationController.getConfiguredIntegerValue(
+                configurationData,
+                ConfigurationController.DEFAULT_INDICATOR_EPISODE_DIGITS_CONFIG_KEY,
+                cls.DEFAULT_INDICATOR_EPISODE_DIGITS,
+            ),
+        }
 
 
     def __init__(self, **kwargs):
@@ -53,36 +94,51 @@ class ShowDescriptor():
                 raise TypeError(f"ShowDescriptor.__init__(): Argument {ShowDescriptor.YEAR_KEY} is required to be of type int")
             self.__showYear = kwargs[ShowDescriptor.YEAR_KEY]
         else:
             self.__showYear = -1
 
+        defaultDigitLengths = self.getDefaultDigitLengths(self.__context)
+
         if ShowDescriptor.INDEX_SEASON_DIGITS_KEY in kwargs.keys():
             if type(kwargs[ShowDescriptor.INDEX_SEASON_DIGITS_KEY]) is not int:
                 raise TypeError(f"ShowDescriptor.__init__(): Argument {ShowDescriptor.INDEX_SEASON_DIGITS_KEY} is required to be of type int")
             self.__indexSeasonDigits = kwargs[ShowDescriptor.INDEX_SEASON_DIGITS_KEY]
         else:
-            self.__indexSeasonDigits = ShowDescriptor.DEFAULT_INDEX_SEASON_DIGITS
+            self.__indexSeasonDigits = defaultDigitLengths[ShowDescriptor.INDEX_SEASON_DIGITS_KEY]
 
         if ShowDescriptor.INDEX_EPISODE_DIGITS_KEY in kwargs.keys():
             if type(kwargs[ShowDescriptor.INDEX_EPISODE_DIGITS_KEY]) is not int:
                 raise TypeError(f"ShowDescriptor.__init__(): Argument {ShowDescriptor.INDEX_EPISODE_DIGITS_KEY} is required to be of type int")
             self.__indexEpisodeDigits = kwargs[ShowDescriptor.INDEX_EPISODE_DIGITS_KEY]
         else:
-            self.__indexEpisodeDigits = ShowDescriptor.DEFAULT_INDEX_EPISODE_DIGITS
+            self.__indexEpisodeDigits = defaultDigitLengths[ShowDescriptor.INDEX_EPISODE_DIGITS_KEY]
 
         if ShowDescriptor.INDICATOR_SEASON_DIGITS_KEY in kwargs.keys():
             if type(kwargs[ShowDescriptor.INDICATOR_SEASON_DIGITS_KEY]) is not int:
                 raise TypeError(f"ShowDescriptor.__init__(): Argument {ShowDescriptor.INDICATOR_SEASON_DIGITS_KEY} is required to be of type int")
             self.__indicatorSeasonDigits = kwargs[ShowDescriptor.INDICATOR_SEASON_DIGITS_KEY]
         else:
-            self.__indicatorSeasonDigits = ShowDescriptor.DEFAULT_INDICATOR_SEASON_DIGITS
+            self.__indicatorSeasonDigits = defaultDigitLengths[ShowDescriptor.INDICATOR_SEASON_DIGITS_KEY]
 
         if ShowDescriptor.INDICATOR_EPISODE_DIGITS_KEY in kwargs.keys():
             if type(kwargs[ShowDescriptor.INDICATOR_EPISODE_DIGITS_KEY]) is not int:
                 raise TypeError(f"ShowDescriptor.__init__(): Argument {ShowDescriptor.INDICATOR_EPISODE_DIGITS_KEY} is required to be of type int")
             self.__indicatorEpisodeDigits = kwargs[ShowDescriptor.INDICATOR_EPISODE_DIGITS_KEY]
         else:
-            self.__indicatorEpisodeDigits = ShowDescriptor.DEFAULT_INDICATOR_EPISODE_DIGITS
+            self.__indicatorEpisodeDigits = defaultDigitLengths[ShowDescriptor.INDICATOR_EPISODE_DIGITS_KEY]
 
+        if ShowDescriptor.QUALITY_KEY in kwargs.keys():
+            if type(kwargs[ShowDescriptor.QUALITY_KEY]) is not int:
+                raise TypeError(f"ShowDescriptor.__init__(): Argument {ShowDescriptor.QUALITY_KEY} is required to be of type int")
+            self.__quality = kwargs[ShowDescriptor.QUALITY_KEY]
+        else:
+            self.__quality = 0
+
+        if ShowDescriptor.NOTES_KEY in kwargs.keys():
+            if type(kwargs[ShowDescriptor.NOTES_KEY]) is not str:
+                raise TypeError(f"ShowDescriptor.__init__(): Argument {ShowDescriptor.NOTES_KEY} is required to be of type str")
+            self.__notes = kwargs[ShowDescriptor.NOTES_KEY]
+        else:
+            self.__notes = ''
 
 
     def getId(self):
@@ -100,6 +156,10 @@ class ShowDescriptor():
         return self.__indicatorSeasonDigits
     def getIndicatorEpisodeDigits(self):
         return self.__indicatorEpisodeDigits
+    def getQuality(self):
+        return self.__quality
+    def getNotes(self):
+        return self.__notes
 
     def getFilenamePrefix(self):
         return f"{self.__showName} ({str(self.__showYear)})"
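`getDefaultDigitLengths()` resolves each digit length from configuration with a hard-coded fallback. The body of `ConfigurationController.getConfiguredIntegerValue` is not shown in this diff, but its contract is presumably "return the configured value coerced to int, else the default"; a sketch of that contract (the snake_case name marks this as an illustrative stand-in):

```python
def get_configured_integer_value(config_data, key, default):
    # Fall back to the hard-coded default when the key is absent
    # or the stored value cannot be parsed as an integer.
    try:
        return int(config_data[key])
    except (KeyError, TypeError, ValueError):
        return default
```

This keeps a malformed or missing `ffx.json` entry from breaking descriptor construction: bad values simply degrade to the compiled-in defaults.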
@@ -1,7 +1,7 @@
 import click
 
 from textual.screen import Screen
-from textual.widgets import Header, Footer, Static, Button, DataTable, Input
+from textual.widgets import Header, Footer, Static, Button, DataTable, Input, TextArea
 from textual.containers import Grid
 from textual.widgets._data_table import CellDoesNotExist
 
@@ -25,8 +25,8 @@ class ShowDetailsScreen(Screen):
     CSS = """
 
     Grid {
-        grid-size: 5 16;
-        grid-rows: 2 2 2 2 2 2 2 2 2 2 2 9 2 9 2 2;
+        grid-size: 5 18;
+        grid-rows: 2 2 2 2 2 2 6 2 2 2 2 2 2 9 2 9 2 2;
         grid-columns: 30 30 30 30 30;
         height: 100%;
         width: 100%;
@@ -77,6 +77,10 @@ class ShowDetailsScreen(Screen):
         height: 100%;
         border: solid green;
     }
+
+    .note_box {
+        min-height: 6;
+    }
     """
 
     BINDINGS = [
@@ -150,6 +154,10 @@ class ShowDetailsScreen(Screen):
             self.query_one("#index_episode_digits_input", Input).value = str(self.__showDescriptor.getIndexEpisodeDigits())
             self.query_one("#indicator_season_digits_input", Input).value = str(self.__showDescriptor.getIndicatorSeasonDigits())
             self.query_one("#indicator_episode_digits_input", Input).value = str(self.__showDescriptor.getIndicatorEpisodeDigits())
+            if self.__showDescriptor.getQuality():
+                self.query_one("#quality_input", Input).value = str(self.__showDescriptor.getQuality())
+            if self.__showDescriptor.getNotes():
+                self.query_one("#notes_textarea", TextArea).text = str(self.__showDescriptor.getNotes())
 
 
             #raise click.ClickException(f"show_id {showId}")
@@ -160,11 +168,20 @@ class ShowDetailsScreen(Screen):
             self.updateShiftedSeasons()
 
         else:
 
-            self.query_one("#index_season_digits_input", Input).value = "2"
-            self.query_one("#index_episode_digits_input", Input).value = "2"
-            self.query_one("#indicator_season_digits_input", Input).value = "2"
-            self.query_one("#indicator_episode_digits_input", Input).value = "2"
+            defaultDigitLengths = ShowDescriptor.getDefaultDigitLengths(self.context)
+
+            self.query_one("#index_season_digits_input", Input).value = str(
+                defaultDigitLengths[ShowDescriptor.INDEX_SEASON_DIGITS_KEY]
+            )
+            self.query_one("#index_episode_digits_input", Input).value = str(
+                defaultDigitLengths[ShowDescriptor.INDEX_EPISODE_DIGITS_KEY]
+            )
+            self.query_one("#indicator_season_digits_input", Input).value = str(
+                defaultDigitLengths[ShowDescriptor.INDICATOR_SEASON_DIGITS_KEY]
+            )
+            self.query_one("#indicator_episode_digits_input", Input).value = str(
+                defaultDigitLengths[ShowDescriptor.INDICATOR_EPISODE_DIGITS_KEY]
+            )
 
 
     def getSelectedPatternDescriptor(self):
@@ -202,11 +219,17 @@ class ShowDetailsScreen(Screen):
         if row_key is not None:
             selected_row_data = self.shiftedSeasonsTable.get_row(row_key)
 
+            def parse_int_or_default(value: str, default: int) -> int:
+                try:
+                    return int(value)
+                except (TypeError, ValueError):
+                    return default
+
             shiftedSeasonObj['original_season'] = int(selected_row_data[0])
-            shiftedSeasonObj['first_episode'] = int(selected_row_data[1]) if selected_row_data[1].isnumeric() else -1
-            shiftedSeasonObj['last_episode'] = int(selected_row_data[2]) if selected_row_data[2].isnumeric() else -1
-            shiftedSeasonObj['season_offset'] = int(selected_row_data[3]) if selected_row_data[3].isnumeric() else 0
-            shiftedSeasonObj['episode_offset'] = int(selected_row_data[4]) if selected_row_data[4].isnumeric() else 0
+            shiftedSeasonObj['first_episode'] = parse_int_or_default(selected_row_data[1], -1)
+            shiftedSeasonObj['last_episode'] = parse_int_or_default(selected_row_data[2], -1)
+            shiftedSeasonObj['season_offset'] = parse_int_or_default(selected_row_data[3], 0)
+            shiftedSeasonObj['episode_offset'] = parse_int_or_default(selected_row_data[4], 0)
 
 
         if self.__showDescriptor is not None:
@@ -299,7 +322,7 @@ class ShowDetailsScreen(Screen):
 
         self.shiftedSeasonsTable = DataTable(classes="five")
 
-        self.column_key_original_season = self.shiftedSeasonsTable.add_column("Original Season", width=30)
+        self.column_key_original_season = self.shiftedSeasonsTable.add_column("Source Season", width=30)
         self.column_key_first_episode = self.shiftedSeasonsTable.add_column("First Episode", width=30)
         self.column_key_last_episode = self.shiftedSeasonsTable.add_column("Last Episode", width=30)
         self.column_key_season_offset = self.shiftedSeasonsTable.add_column("Season Offset", width=30)
@@ -333,28 +356,36 @@ class ShowDetailsScreen(Screen):
         yield Input(type="integer", id="year_input", classes="four")
 
         #5
-        yield Static(" ", classes="five")
+        yield Static("Quality")
+        yield Input(type="integer", id="quality_input", classes="four")
+
+        #6
+        yield Static("Notes")
+        yield Static(" ", classes="four")
+
+        #7
+        yield TextArea(id="notes_textarea", classes="five note_box")
+
+        #8
         yield Static("Index Season Digits")
         yield Input(type="integer", id="index_season_digits_input", classes="four")
 
-        #7
+        #9
         yield Static("Index Episode Digits")
         yield Input(type="integer", id="index_episode_digits_input", classes="four")
 
-        #8
+        #10
         yield Static("Indicator Season Digits")
         yield Input(type="integer", id="indicator_season_digits_input", classes="four")
 
-        #9
+        #11
         yield Static("Indicator Episode Digits")
         yield Input(type="integer", id="indicator_episode_digits_input", classes="four")
 
-        # 10
+        # 12
         yield Static(" ", classes="five")
 
-        # 11
+        # 13
         yield Static("Shifted seasons", classes="two")
 
         if self.__showDescriptor is not None:
@@ -366,18 +397,18 @@ class ShowDetailsScreen(Screen):
             yield Static(" ")
             yield Static(" ")
 
-        # 12
+        # 14
         yield self.shiftedSeasonsTable
 
-        # 13
+        # 15
         yield Static("File patterns", classes="five")
-        # 14
+        # 16
         yield self.patternTable
 
-        # 15
+        # 17
         yield Static(" ", classes="five")
 
-        # 16
+        # 18
         yield Button("Save", id="save_button")
         yield Button("Cancel", id="cancel_button")
 
@@ -387,7 +418,7 @@ class ShowDetailsScreen(Screen):
 
     def getShowDescriptorFromInput(self) -> ShowDescriptor:
 
-        kwargs = {}
+        kwargs = {ShowDescriptor.CONTEXT_KEY: self.context}
 
         try:
             if self.__showDescriptor:
@@ -423,6 +454,11 @@ class ShowDetailsScreen(Screen):
             kwargs[ShowDescriptor.INDICATOR_EPISODE_DIGITS_KEY] = int(self.query_one("#indicator_episode_digits_input", Input).value)
         except ValueError:
             pass
+        try:
+            kwargs[ShowDescriptor.QUALITY_KEY] = int(self.query_one("#quality_input", Input).value)
+        except ValueError:
+            pass
+        kwargs[ShowDescriptor.NOTES_KEY] = str(self.query_one("#notes_textarea", TextArea).text)
 
         return ShowDescriptor(**kwargs)
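The `parse_int_or_default` helper introduced above fixes a real parsing gap, not just style: `str.isnumeric()` is `False` for any string containing a sign, so the old guards silently replaced negative offsets such as "-88" with the default. A standalone copy of the helper for illustration:

```python
def parse_int_or_default(value, default):
    # int() handles optional sign and surrounding whitespace;
    # TypeError covers None, ValueError covers non-numeric strings.
    try:
        return int(value)
    except (TypeError, ValueError):
        return default
```

This matters for shifted seasons in particular, where negative `episode_offset` values (like the `-88` used by the test suite) are the common case.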
BIN tests/assets/dball_S01E89_3_und.sup (new file; binary file not shown)
BIN tests/assets/dball_S01E89_4_und.sup (new file; binary file not shown)
@@ -8,8 +8,19 @@
 import sys
 import tempfile
 import unittest
 
-from tests.support.ffx_bundle import SourceTrackSpec, create_source_fixture
+from tests.support.ffx_bundle import (
+    SourceTrackSpec,
+    build_controller_context,
+    create_source_fixture,
+    dispose_controller_context,
+)
 
+from ffx.pattern_controller import PatternController
+from ffx.show_controller import ShowController
+from ffx.show_descriptor import ShowDescriptor
+from ffx.shifted_season_controller import ShiftedSeasonController
 from ffx.track_codec import TrackCodec
 from ffx.track_descriptor import TrackDescriptor
 from ffx.track_type import TrackType
 
 try:
@@ -66,6 +77,64 @@ class UnmuxCliTests(unittest.TestCase):
             f"STDERR:\n{completed.stderr}"
         )
 
+    def seed_matching_show(self, pattern_expression: str, *, indicator_season_digits: int, indicator_episode_digits: int) -> None:
+        context = build_controller_context(self.database_path)
+        try:
+            ShowController(context).updateShow(
+                ShowDescriptor(
+                    id=1,
+                    name="Unmux Test Show",
+                    year=2000,
+                    indicator_season_digits=indicator_season_digits,
+                    indicator_episode_digits=indicator_episode_digits,
+                )
+            )
+            PatternController(context).savePatternSchema(
+                {
+                    "show_id": 1,
+                    "pattern": pattern_expression,
+                    "quality": 0,
+                    "notes": "",
+                },
+                trackDescriptors=[
+                    TrackDescriptor(
+                        index=0,
+                        source_index=0,
+                        track_type=TrackType.VIDEO,
+                        codec_name=TrackCodec.H264,
+                        tags={},
+                        disposition_set=set(),
+                    )
+                ],
+            )
+        finally:
+            dispose_controller_context(context)
+
+    def add_show_shift(
+        self,
+        *,
+        show_id: int,
+        original_season: int,
+        first_episode: int,
+        last_episode: int,
+        season_offset: int,
+        episode_offset: int,
+    ) -> None:
+        context = build_controller_context(self.database_path)
+        try:
+            ShiftedSeasonController(context).addShiftedSeason(
+                showId=show_id,
+                shiftedSeasonObj={
+                    "original_season": original_season,
+                    "first_episode": first_episode,
+                    "last_episode": last_episode,
+                    "season_offset": season_offset,
+                    "episode_offset": episode_offset,
+                },
+            )
+        finally:
+            dispose_controller_context(context)
+
     def test_subtitles_only_without_output_directory_uses_configured_base_plus_label(self):
         self.write_config(
             {
@@ -101,6 +170,134 @@ class UnmuxCliTests(unittest.TestCase):
         expected_directory = self.home_dir / ".local" / "var" / "sync" / "subtitles" / "dball"
         self.assertTrue(expected_directory.is_dir(), expected_directory)
 
+    def test_unmux_uses_configured_indicator_digits_in_output_filenames(self):
+        self.write_config(
+            {
+                "defaultIndicatorSeasonDigits": 3,
+                "defaultIndicatorEpisodeDigits": 4,
+            }
+        )
+        source_filename = "unmux_s01e01.mkv"
+        output_directory = self.workdir / "unmux-output"
+        output_directory.mkdir()
+        source_path = create_source_fixture(
+            self.workdir,
+            source_filename,
+            [
+                SourceTrackSpec(TrackType.VIDEO, identity="video-0"),
+            ],
+        )
+
+        completed = run_ffx_unmux(
+            self.workdir,
+            self.home_dir,
+            self.database_path,
+            "--label",
+            "dball",
+            "--output-directory",
+            str(output_directory),
+            str(source_path),
+        )
+        self.assertCompleted(completed)
+
+        output_filenames = sorted(path.name for path in output_directory.iterdir())
+        self.assertEqual(1, len(output_filenames), output_filenames)
+        self.assertTrue(
+            output_filenames[0].startswith("dball_S001E0001_"),
+            output_filenames,
+        )
+
+    def test_unmux_prefers_matched_show_indicator_digits_over_config_defaults(self):
+        self.write_config(
+            {
+                "defaultIndicatorSeasonDigits": 4,
+                "defaultIndicatorEpisodeDigits": 4,
+            }
+        )
+        self.seed_matching_show(
+            r"^unmux_([sS][0-9]+[eE][0-9]+)\.mkv$",
+            indicator_season_digits=1,
+            indicator_episode_digits=3,
+        )
+        source_filename = "unmux_s01e01.mkv"
+        output_directory = self.workdir / "unmux-output"
+        output_directory.mkdir()
+        source_path = create_source_fixture(
+            self.workdir,
+            source_filename,
+            [
+                SourceTrackSpec(TrackType.VIDEO, identity="video-0"),
+            ],
+        )
+
+        completed = run_ffx_unmux(
+            self.workdir,
+            self.home_dir,
+            self.database_path,
+            "--label",
+            "dball",
+            "--output-directory",
+            str(output_directory),
+            str(source_path),
+        )
+        self.assertCompleted(completed)
+
+        output_filenames = sorted(path.name for path in output_directory.iterdir())
+        self.assertEqual(1, len(output_filenames), output_filenames)
+        self.assertTrue(
+            output_filenames[0].startswith("dball_S1E001_"),
+            output_filenames,
+        )
+
+    def test_unmux_applies_shifted_season_mapping_to_output_filenames(self):
+        self.seed_matching_show(
+            r"^unmux_([sS][0-9]+[eE][0-9]+)\.mkv$",
+            indicator_season_digits=2,
+            indicator_episode_digits=2,
+        )
+        self.add_show_shift(
+            show_id=1,
+            original_season=1,
+            first_episode=1,
+            last_episode=99,
+            season_offset=1,
+            episode_offset=-88,
+        )
+        source_filename = "unmux_s01e89.mkv"
+        output_directory = self.workdir / "unmux-output"
+        output_directory.mkdir()
+        source_path = create_source_fixture(
+            self.workdir,
+            source_filename,
+            [
+                SourceTrackSpec(TrackType.VIDEO, identity="video-0"),
+                SourceTrackSpec(
+                    TrackType.SUBTITLE,
+                    identity="subtitle-1",
+                    language="eng",
+                    subtitle_lines=("subtitle payload",),
+                ),
+            ],
+        )
+
+        completed = run_ffx_unmux(
+            self.workdir,
+            self.home_dir,
+            self.database_path,
+            "--label",
+            "dball",
+            "--output-directory",
+            str(output_directory),
+            "--subtitles-only",
+            str(source_path),
+        )
+        self.assertCompleted(completed)
+
+        self.assertIn(
+            "Unmuxing stream 1 into file dball_S02E01_1_eng",
+            completed.stderr,
+        )
 
 
 if __name__ == "__main__":
     unittest.main()
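The filenames these tests expect (`dball_S001E0001_…`, `dball_S1E001_…`, `dball_S02E01_…`) follow directly from zero-padding the season and episode numbers to the configured digit lengths. A sketch of such a marker formatter; the helper name is hypothetical, the actual implementation lives in the unmux/rename code paths:

```python
def format_indicator(season, episode, season_digits=2, episode_digits=2):
    # Zero-pad each component to its configured width using
    # Python's nested format-spec syntax.
    return f"S{season:0{season_digits}d}E{episode:0{episode_digits}d}"
```

For example, the config-driven test (`defaultIndicatorSeasonDigits: 3`, `defaultIndicatorEpisodeDigits: 4`) pads S01E01 to `S001E0001`, while the matched show with widths 1 and 3 produces `S1E001`.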
tests/unit/test_cli_rename.py (new file, 137 lines)
@@ -0,0 +1,137 @@
+from __future__ import annotations
+
+import json
+import os
+from pathlib import Path
+import sys
+import tempfile
+import unittest
+
+from click.testing import CliRunner
+
+
+SRC_ROOT = Path(__file__).resolve().parents[2] / "src"
+
+if str(SRC_ROOT) not in sys.path:
+    sys.path.insert(0, str(SRC_ROOT))
+
+
+from ffx import cli  # noqa: E402
+
+
+class RenameCliTests(unittest.TestCase):
+    def setUp(self):
+        self.tempdir = tempfile.TemporaryDirectory()
+        self.workspace = Path(self.tempdir.name)
+        self.home_dir = self.workspace / "home"
+        self.home_dir.mkdir()
+
+    def tearDown(self):
+        self.tempdir.cleanup()
+
+    def write_source(self, filename: str, payload: bytes = b"episode") -> Path:
+        source_path = self.workspace / filename
+        source_path.write_bytes(payload)
+        return source_path
+
+    def write_config(self, data: dict) -> None:
+        config_dir = self.home_dir / ".local" / "etc"
+        config_dir.mkdir(parents=True, exist_ok=True)
+        (config_dir / "ffx.json").write_text(json.dumps(data), encoding="utf-8")
+
+    def invoke_rename(self, *args: str):
+        runner = CliRunner()
+        result = runner.invoke(
+            cli.ffx,
+            ["rename", *args],
+            env={**os.environ, "HOME": str(self.home_dir)},
+        )
+        self.assertEqual(0, result.exit_code, result.output)
+        return result
+
+    def test_rename_moves_matching_file_in_place(self):
+        source_path = self.write_source("demo_S02E03.mkv", b"season-episode")
+
+        result = self.invoke_rename("--prefix", "dball", str(source_path))
+
+        target_path = self.workspace / "dball_s02e03.mkv"
+        self.assertIn("demo_S02E03.mkv -> dball_s02e03.mkv", result.output)
+        self.assertFalse(source_path.exists())
+        self.assertTrue(target_path.exists())
+        self.assertEqual(b"season-episode", target_path.read_bytes())
+
+    def test_rename_uses_default_season_and_suffix_for_episode_only_match(self):
+        source_path = self.write_source("demo_E07.mp4", b"episode-only")
+
+        result = self.invoke_rename(
+            "--prefix",
+            "dball",
+            "--suffix",
+            "bonus",
+            str(source_path),
+        )
+
+        target_path = self.workspace / "dball_s01e07_bonus.mp4"
+        self.assertIn("demo_E07.mp4 -> dball_s01e07_bonus.mp4", result.output)
+        self.assertFalse(source_path.exists())
+        self.assertTrue(target_path.exists())
+        self.assertEqual(b"episode-only", target_path.read_bytes())
+
+    def test_rename_cli_season_overrides_source_season(self):
+        source_path = self.write_source("demo_s02e07.webm")
+
+        result = self.invoke_rename(
+            "--prefix",
+            "dball",
+            "--season",
+            "5",
+            str(source_path),
+        )
+
+        target_path = self.workspace / "dball_s05e07.webm"
+        self.assertIn("demo_s02e07.webm -> dball_s05e07.webm", result.output)
+        self.assertFalse(source_path.exists())
+        self.assertTrue(target_path.exists())
+
+    def test_rename_dry_run_prints_mapping_without_moving(self):
+        source_path = self.write_source("demo_E07.mkv")
+
+        result = self.invoke_rename(
+            "--dry-run",
+            "--prefix",
+            "dball",
+            str(source_path),
+        )
+
+        target_path = self.workspace / "dball_s01e07.mkv"
+        self.assertIn("demo_E07.mkv -> dball_s01e07.mkv", result.output)
+        self.assertTrue(source_path.exists())
+        self.assertFalse(target_path.exists())
+
+    def test_rename_uses_configured_indicator_digit_lengths(self):
+        self.write_config(
+            {
+                "defaultIndicatorSeasonDigits": 3,
+                "defaultIndicatorEpisodeDigits": 4,
+            }
+        )
+        source_path = self.write_source("demo_E07.mkv")
+
+        result = self.invoke_rename("--prefix", "dball", str(source_path))
+
+        target_path = self.workspace / "dball_s001e0007.mkv"
+        self.assertIn("demo_E07.mkv -> dball_s001e0007.mkv", result.output)
+        self.assertFalse(source_path.exists())
+        self.assertTrue(target_path.exists())
+
+    def test_rename_skips_non_matching_filenames(self):
+        source_path = self.write_source("demo_finale.mkv")
+
+        result = self.invoke_rename("--prefix", "dball", str(source_path))
+
+        self.assertIn("No matching files found.", result.output)
+        self.assertTrue(source_path.exists())
+
+
+if __name__ == "__main__":
+    unittest.main()
125 tests/unit/test_cli_rename_only.py Normal file
@@ -0,0 +1,125 @@
from __future__ import annotations

import os
from pathlib import Path
import sys
import tempfile
import unittest
from unittest.mock import patch

from click.testing import CliRunner


SRC_ROOT = Path(__file__).resolve().parents[2] / "src"

if str(SRC_ROOT) not in sys.path:
    sys.path.insert(0, str(SRC_ROOT))


from ffx import cli  # noqa: E402


class _FakeMediaDescriptor:
    def getVideoTracks(self):
        return []

    def getAudioTracks(self):
        return []

    def getSubtitleTracks(self):
        return []

    def getAttachmentTracks(self):
        return []


class _FakeFileProperties:
    def __init__(self, context, source_path):
        self.source_path = source_path

    def getShowId(self):
        return -1

    def getSeason(self):
        return -1

    def getEpisode(self):
        return -1

    def getMediaDescriptor(self):
        return _FakeMediaDescriptor()

    def getPattern(self):
        return None


class _FakeShiftedSeasonController:
    def __init__(self, context):
        self.context = context

    def shiftSeason(self, show_id, season, episode):
        return season, episode


class _FakeFfxController:
    def __init__(self, *args, **kwargs):
        pass

    def runJob(self, *args, **kwargs):
        raise AssertionError("runJob should not be called for --rename-only")


class RenameOnlyCliTests(unittest.TestCase):
    def setUp(self):
        self.tempdir = tempfile.TemporaryDirectory()
        self.home_dir = Path(self.tempdir.name) / "home"
        self.home_dir.mkdir()
        self.database_path = Path(self.tempdir.name) / "test.db"
        self.source_dir = Path(self.tempdir.name) / "source"
        self.source_dir.mkdir()
        self.output_dir = Path(self.tempdir.name) / "output"
        self.output_dir.mkdir()
        self.source_path = self.source_dir / "episode.mkv"
        self.source_bytes = b"rename-only-source"
        self.source_path.write_bytes(self.source_bytes)

    def tearDown(self):
        self.tempdir.cleanup()

    def test_rename_only_moves_source_file_into_output_directory(self):
        runner = CliRunner()

        with (
            patch("ffx.file_properties.FileProperties", _FakeFileProperties),
            patch("ffx.ffx_controller.FfxController", _FakeFfxController),
            patch(
                "ffx.shifted_season_controller.ShiftedSeasonController",
                _FakeShiftedSeasonController,
            ),
        ):
            result = runner.invoke(
                cli.ffx,
                [
                    "--database-file",
                    str(self.database_path),
                    "convert",
                    "--no-tmdb",
                    "--no-pattern",
                    "--rename-only",
                    "--output-directory",
                    str(self.output_dir),
                    str(self.source_path),
                ],
                env={**os.environ, "HOME": str(self.home_dir)},
            )

        self.assertEqual(0, result.exit_code, result.output)

        target_path = self.output_dir / "out_episode.mkv"
        self.assertFalse(self.source_path.exists())
        self.assertTrue(target_path.exists())
        self.assertEqual(self.source_bytes, target_path.read_bytes())


if __name__ == "__main__":
    unittest.main()
@@ -57,7 +57,7 @@ class UpgradeCommandTests(unittest.TestCase):
        self.assertTrue(subprocess_calls[0][1]["capture_output"])
        self.assertTrue(subprocess_calls[0][1]["text"])

-    def test_upgrade_resets_before_checkout_and_pull_when_user_confirms(self):
+    def test_upgrade_resets_then_fetches_and_checks_out_requested_branch_when_user_confirms(self):
        runner = CliRunner()
        repo_path = "/tmp/ffx-repo"
        pip_path = "/tmp/ffx-venv/bin/pip"
@@ -85,8 +85,8 @@ class UpgradeCommandTests(unittest.TestCase):
            [
                ['git', 'status', '--porcelain', '--untracked-files=no'],
                ['git', 'reset', '--hard', 'HEAD'],
-                ['git', 'checkout', 'main'],
-                ['git', 'pull'],
+                ['git', 'fetch', 'origin', 'main'],
+                ['git', 'checkout', '-B', 'main', 'FETCH_HEAD'],
                [pip_path, 'install', '--upgrade', 'pip', 'setuptools', 'wheel'],
                [pip_path, 'install', '--editable', '.'],
            ],
@@ -95,6 +95,39 @@ class UpgradeCommandTests(unittest.TestCase):
        for args, kwargs in subprocess_calls[1:]:
            self.assertEqual(repo_path, kwargs["cwd"], args)

    def test_upgrade_pulls_current_branch_when_no_branch_is_requested(self):
        runner = CliRunner()
        repo_path = "/tmp/ffx-repo"
        pip_path = "/tmp/ffx-venv/bin/pip"

        subprocess_calls = []

        def fake_run(args, **kwargs):
            subprocess_calls.append((args, kwargs))
            if args == ['git', 'status', '--porcelain', '--untracked-files=no']:
                return self.make_completed(args, stdout="")
            return self.make_completed(args)

        with (
            patch.object(cli, "getBundleRepoPath", return_value=repo_path),
            patch.object(cli, "getBundlePipPath", return_value=pip_path),
            patch.object(cli.os.path, "isdir", return_value=True),
            patch.object(cli.os.path, "isfile", return_value=True),
            patch.object(cli.subprocess, "run", side_effect=fake_run),
        ):
            result = runner.invoke(cli.ffx, ["upgrade"])

        self.assertEqual(0, result.exit_code, result.output)
        self.assertEqual(
            [
                ['git', 'status', '--porcelain', '--untracked-files=no'],
                ['git', 'pull'],
                [pip_path, 'install', '--upgrade', 'pip', 'setuptools', 'wheel'],
                [pip_path, 'install', '--editable', '.'],
            ],
            [call[0] for call in subprocess_calls],
        )


if __name__ == "__main__":
    unittest.main()
150 tests/unit/test_configure_workstation_script.py Normal file
@@ -0,0 +1,150 @@
from __future__ import annotations

import json
import os
from pathlib import Path
import stat
import subprocess
import sys
import tempfile
import textwrap
import unittest


REPO_ROOT = Path(__file__).resolve().parents[2]
SCRIPT_PATH = REPO_ROOT / "tools" / "configure_workstation.sh"
BUNDLE_PYTHON = Path.home() / ".local" / "share" / "ffx.venv" / "bin" / "python"


class ConfigureWorkstationScriptTests(unittest.TestCase):
    def setUp(self):
        self.tempdir = tempfile.TemporaryDirectory()
        self.home_dir = Path(self.tempdir.name) / "home"
        self.home_dir.mkdir()
        self.stub_bin_dir = Path(self.tempdir.name) / "bin"
        self.stub_bin_dir.mkdir()

        for command_name in ("git", "python3", "ffmpeg", "ffprobe", "cpulimit"):
            self.write_stub_command(command_name)

    def tearDown(self):
        self.tempdir.cleanup()

    def write_stub_command(self, name: str, body: str = "") -> None:
        script_path = self.stub_bin_dir / name
        script_path.write_text(
            "#!/usr/bin/env bash\n"
            + body
            + "\n",
            encoding="utf-8",
        )
        script_path.chmod(script_path.stat().st_mode | stat.S_IXUSR)

    def run_script(self, **env_overrides: str) -> subprocess.CompletedProcess[str]:
        if not BUNDLE_PYTHON.is_file():
            self.skipTest(f"Missing bundle Python at {BUNDLE_PYTHON}")

        env = {
            **os.environ,
            "HOME": str(self.home_dir),
            "PATH": f"{self.stub_bin_dir}:{os.environ.get('PATH', '')}",
            "FFX_PYTHON": str(BUNDLE_PYTHON),
            **env_overrides,
        }

        return subprocess.run(
            ["bash", str(SCRIPT_PATH)],
            capture_output=True,
            cwd=REPO_ROOT,
            env=env,
            text=True,
        )

    def test_script_seeds_default_config_from_template(self):
        completed = self.run_script()

        self.assertEqual(
            0,
            completed.returncode,
            f"STDOUT:\n{completed.stdout}\nSTDERR:\n{completed.stderr}",
        )

        config_path = self.home_dir / ".local" / "etc" / "ffx.json"
        self.assertTrue(config_path.exists())

        config_data = json.loads(config_path.read_text(encoding="utf-8"))
        self.assertEqual(
            {
                "databasePath": str(self.home_dir / ".local" / "var" / "ffx" / "ffx.db"),
                "logDirectory": str(self.home_dir / ".local" / "var" / "log"),
                "subtitlesDirectory": str(
                    self.home_dir / ".local" / "var" / "sync" / "subtitles"
                ),
                "defaultIndexSeasonDigits": 2,
                "defaultIndexEpisodeDigits": 2,
                "defaultIndicatorSeasonDigits": 2,
                "defaultIndicatorEpisodeDigits": 2,
                "metadata": {
                    "signature": {"RECODED_WITH": "FFX"},
                    "remove": [
                        "VERSION-eng",
                        "creation_time",
                        "NAME",
                    ],
                    "streams": {
                        "remove": [
                            "BPS",
                            "NUMBER_OF_FRAMES",
                            "NUMBER_OF_BYTES",
                            "_STATISTICS_WRITING_APP",
                            "_STATISTICS_WRITING_DATE_UTC",
                            "_STATISTICS_TAGS",
                            "BPS-eng",
                            "DURATION-eng",
                            "NUMBER_OF_FRAMES-eng",
                            "NUMBER_OF_BYTES-eng",
                            "_STATISTICS_WRITING_APP-eng",
                            "_STATISTICS_WRITING_DATE_UTC-eng",
                            "_STATISTICS_TAGS-eng",
                        ]
                    },
                },
            },
            config_data,
        )

    def test_script_honors_custom_template_override(self):
        custom_template_path = Path(self.tempdir.name) / "custom-config.j2"
        custom_template_path.write_text(
            textwrap.dedent(
                """
                {
                    "databasePath": {{ database_path_json }},
                    "marker": "from-template",
                    "subtitlesDirectory": {{ subtitles_directory_json }}
                }
                """
            ).lstrip(),
            encoding="utf-8",
        )

        completed = self.run_script(FFX_CONFIG_TEMPLATE=str(custom_template_path))

        self.assertEqual(
            0,
            completed.returncode,
            f"STDOUT:\n{completed.stdout}\nSTDERR:\n{completed.stderr}",
        )

        config_path = self.home_dir / ".local" / "etc" / "ffx.json"
        config_data = json.loads(config_path.read_text(encoding="utf-8"))

        self.assertEqual("from-template", config_data["marker"])
        self.assertEqual(
            str(self.home_dir / ".local" / "var" / "ffx" / "ffx.db"),
            config_data["databasePath"],
        )


if __name__ == "__main__":
    unittest.main()
@@ -1,11 +1,14 @@
from __future__ import annotations

from pathlib import Path
import sqlite3
import sys
import tempfile
import unittest
from unittest.mock import patch

import click


SRC_ROOT = Path(__file__).resolve().parents[2] / "src"

@@ -15,8 +18,18 @@ if str(SRC_ROOT) not in sys.path:

from ffx.constants import DATABASE_VERSION  # noqa: E402
from ffx.database import DATABASE_VERSION_KEY, databaseContext, getDatabaseVersion  # noqa: E402
from ffx.model.shifted_season import ShiftedSeason  # noqa: E402
from ffx.model.property import Property  # noqa: E402
from ffx.model.show import Show  # noqa: E402
from ffx.model.show import Base  # noqa: E402
from ffx.show_controller import ShowController  # noqa: E402
from ffx.show_descriptor import ShowDescriptor  # noqa: E402
from ffx.shifted_season_controller import ShiftedSeasonController  # noqa: E402


class StaticConfig:
    def getData(self):
        return {}


class DatabaseContextTests(unittest.TestCase):
@@ -27,6 +40,115 @@ class DatabaseContextTests(unittest.TestCase):
    def tearDown(self):
        self.tempdir.cleanup()

    def create_demo_show_with_shift(self):
        database_context = databaseContext(str(self.database_path))
        context = {
            "database": database_context,
            "config": StaticConfig(),
            "logger": object(),
        }
        try:
            ShowController(context).updateShow(
                ShowDescriptor(id=1, name="Demo", year=2000)
            )
            shifted_season_id = ShiftedSeasonController(context).addShiftedSeason(
                showId=1,
                shiftedSeasonObj={
                    "original_season": 1,
                    "first_episode": 1,
                    "last_episode": 10,
                    "season_offset": 1,
                    "episode_offset": -10,
                },
            )
        finally:
            database_context["engine"].dispose()

        return shifted_season_id

    def rewrite_shows_table_without_show_fields(self, cursor):
        cursor.execute("ALTER TABLE shows RENAME TO shows_current")
        cursor.execute(
            """
            CREATE TABLE shows (
                id INTEGER PRIMARY KEY,
                name VARCHAR,
                year INTEGER,
                index_season_digits INTEGER,
                index_episode_digits INTEGER,
                indicator_season_digits INTEGER,
                indicator_episode_digits INTEGER
            )
            """
        )
        cursor.execute(
            """
            INSERT INTO shows (
                id,
                name,
                year,
                index_season_digits,
                index_episode_digits,
                indicator_season_digits,
                indicator_episode_digits
            )
            SELECT
                id,
                name,
                year,
                index_season_digits,
                index_episode_digits,
                indicator_season_digits,
                indicator_episode_digits
            FROM shows_current
            """
        )
        cursor.execute("DROP TABLE shows_current")

    def rewrite_shifted_seasons_table_without_pattern_owner(self, cursor):
        cursor.execute("DROP INDEX IF EXISTS ix_shifted_seasons_show_id")
        cursor.execute("DROP INDEX IF EXISTS ix_shifted_seasons_pattern_id")
        cursor.execute(
            "ALTER TABLE shifted_seasons RENAME TO shifted_seasons_current"
        )
        cursor.execute(
            """
            CREATE TABLE shifted_seasons (
                id INTEGER PRIMARY KEY,
                show_id INTEGER,
                original_season INTEGER,
                first_episode INTEGER DEFAULT -1,
                last_episode INTEGER DEFAULT -1,
                season_offset INTEGER DEFAULT 0,
                episode_offset INTEGER DEFAULT 0,
                FOREIGN KEY(show_id) REFERENCES shows(id) ON DELETE CASCADE
            )
            """
        )
        cursor.execute(
            """
            INSERT INTO shifted_seasons (
                id,
                show_id,
                original_season,
                first_episode,
                last_episode,
                season_offset,
                episode_offset
            )
            SELECT
                id,
                show_id,
                original_season,
                first_episode,
                last_episode,
                season_offset,
                episode_offset
            FROM shifted_seasons_current
            """
        )
        cursor.execute("DROP TABLE shifted_seasons_current")

    def test_database_context_bootstraps_new_database_with_current_version(self):
        with patch("ffx.database.Base.metadata.create_all", wraps=Base.metadata.create_all) as mocked_create_all:
            context = databaseContext(str(self.database_path))
@@ -78,6 +200,127 @@ class DatabaseContextTests(unittest.TestCase):

        mocked_create_all.assert_not_called()

    def test_database_context_migrates_v2_shifted_seasons_schema_to_v3(self):
        shifted_season_id = self.create_demo_show_with_shift()

        connection = sqlite3.connect(self.database_path)
        try:
            cursor = connection.cursor()
            cursor.execute("PRAGMA foreign_keys=OFF")
            self.rewrite_shifted_seasons_table_without_pattern_owner(cursor)
            self.rewrite_shows_table_without_show_fields(cursor)
            cursor.execute(
                "UPDATE properties SET value = '2' WHERE key = ?",
                (DATABASE_VERSION_KEY,),
            )
            connection.commit()
        finally:
            connection.close()

        with patch("ffx.database.click.confirm", return_value=True) as mocked_confirm, patch(
            "ffx.database.click.echo"
        ) as mocked_echo:
            reopened_context = databaseContext(str(self.database_path))
            try:
                self.assertEqual(DATABASE_VERSION, getDatabaseVersion(reopened_context))
                mocked_confirm.assert_called_once()

                backup_path = Path(f"{self.database_path}.v2-to-v{DATABASE_VERSION}.bak")
                self.assertTrue(backup_path.exists())

                Session = reopened_context["session"]
                session = Session()
                try:
                    migrated_shifted_season = (
                        session.query(ShiftedSeason)
                        .filter(ShiftedSeason.id == shifted_season_id)
                        .first()
                    )
                    self.assertIsNotNone(migrated_shifted_season)
                    self.assertEqual(1, migrated_shifted_season.getShowId())
                    self.assertIsNone(migrated_shifted_season.getPatternId())
                    self.assertEqual(1, migrated_shifted_season.getOriginalSeason())
                    self.assertEqual(1, migrated_shifted_season.getFirstEpisode())
                    self.assertEqual(10, migrated_shifted_season.getLastEpisode())
                    migrated_show = session.query(Show).filter(Show.id == 1).first()
                    self.assertIsNotNone(migrated_show)
                    self.assertEqual(0, int(migrated_show.quality or 0))
                    self.assertEqual('', str(migrated_show.notes or ''))
                finally:
                    session.close()
            finally:
                reopened_context["engine"].dispose()

        echoedLines = [call.args[0] for call in mocked_echo.call_args_list]
        self.assertIn("Database migration required.", echoedLines)
        self.assertIn("Current version: 2", echoedLines)
        self.assertIn(f"Target version: {DATABASE_VERSION}", echoedLines)
        self.assertIn(
            " 2 -> 3: ffx.model.migration.step_2_3 [present]",
            echoedLines,
        )

    def test_database_context_aborts_migration_when_confirmation_is_declined(self):
        context = databaseContext(str(self.database_path))
        try:
            Session = context["session"]
            session = Session()
            try:
                version_row = (
                    session.query(Property)
                    .filter(Property.key == DATABASE_VERSION_KEY)
                    .first()
                )
                version_row.value = "2"
                session.commit()
            finally:
                session.close()
        finally:
            context["engine"].dispose()

        with patch("ffx.database.click.confirm", return_value=False), patch(
            "ffx.database.click.echo"
        ):
            with self.assertRaises(click.ClickException) as raisedContext:
                databaseContext(str(self.database_path))

        self.assertEqual("Database migration aborted by user.", str(raisedContext.exception))
        self.assertFalse(Path(f"{self.database_path}.v2-to-v{DATABASE_VERSION}.bak").exists())

    def test_database_context_repairs_current_show_schema_without_version_bump(self):
        self.create_demo_show_with_shift()

        connection = sqlite3.connect(self.database_path)
        try:
            cursor = connection.cursor()
            cursor.execute("PRAGMA foreign_keys=OFF")
            self.rewrite_shows_table_without_show_fields(cursor)
            connection.commit()
        finally:
            connection.close()

        with patch("ffx.database.click.confirm") as mocked_confirm, patch(
            "ffx.database.click.echo"
        ) as mocked_echo:
            reopened_context = databaseContext(str(self.database_path))
            try:
                self.assertEqual(DATABASE_VERSION, getDatabaseVersion(reopened_context))

                Session = reopened_context["session"]
                session = Session()
                try:
                    repaired_show = session.query(Show).filter(Show.id == 1).first()
                    self.assertIsNotNone(repaired_show)
                    self.assertEqual(0, int(repaired_show.quality or 0))
                    self.assertEqual('', str(repaired_show.notes or ''))
                finally:
                    session.close()
            finally:
                reopened_context["engine"].dispose()

        mocked_confirm.assert_not_called()
        mocked_echo.assert_not_called()


if __name__ == "__main__":
    unittest.main()
@@ -4,6 +4,7 @@ from pathlib import Path
import sys
import unittest
from unittest.mock import patch
from types import SimpleNamespace


SRC_ROOT = Path(__file__).resolve().parents[2] / "src"
@@ -15,6 +16,7 @@ if str(SRC_ROOT) not in sys.path:

from ffx.ffx_controller import FfxController  # noqa: E402
from ffx.logging_utils import get_ffx_logger  # noqa: E402
from ffx.media_descriptor import MediaDescriptor  # noqa: E402
from ffx.show_descriptor import ShowDescriptor  # noqa: E402
from ffx.track_codec import TrackCodec  # noqa: E402
from ffx.track_descriptor import TrackDescriptor  # noqa: E402
from ffx.track_type import TrackType  # noqa: E402
@@ -134,6 +136,62 @@ class FfxControllerTests(unittest.TestCase):
        self.assertIn("ENCODING_QUALITY=29", commands[0])
        self.assertIn("ENCODING_PRESET=7", commands[0])

    def test_run_job_uses_show_quality_when_pattern_quality_is_unset(self):
        context = self.make_context(VideoEncoder.H264)
        target_descriptor, source_descriptor = self.make_media_descriptors()
        controller = FfxController(context, target_descriptor, source_descriptor)
        commands = []
        show_descriptor = ShowDescriptor(id=1, name="Show", year=2024, quality=23)
        pattern = SimpleNamespace(quality=0)

        with (
            patch.object(
                controller,
                "executeCommandSequence",
                side_effect=lambda command: commands.append(command) or ("", "", 0),
            ),
            patch.object(context["logger"], "info") as mocked_info,
        ):
            controller.runJob(
                "input.mkv",
                "output.mkv",
                chainIteration=[],
                currentPattern=pattern,
                currentShowDescriptor=show_descriptor,
            )

        self.assertEqual(1, len(commands))
        self.assertIn("ENCODING_QUALITY=23", commands[0])
        mocked_info.assert_any_call("Setting quality 23 from show")

    def test_run_job_prefers_pattern_quality_over_show_quality(self):
        context = self.make_context(VideoEncoder.H264)
        target_descriptor, source_descriptor = self.make_media_descriptors()
        controller = FfxController(context, target_descriptor, source_descriptor)
        commands = []
        show_descriptor = ShowDescriptor(id=1, name="Show", year=2024, quality=23)
        pattern = SimpleNamespace(quality=19)

        with (
            patch.object(
                controller,
                "executeCommandSequence",
                side_effect=lambda command: commands.append(command) or ("", "", 0),
            ),
            patch.object(context["logger"], "info") as mocked_info,
        ):
            controller.runJob(
                "input.mkv",
                "output.mkv",
                chainIteration=[],
                currentPattern=pattern,
                currentShowDescriptor=show_descriptor,
            )

        self.assertEqual(1, len(commands))
        self.assertIn("ENCODING_QUALITY=19", commands[0])
        mocked_info.assert_any_call("Setting quality 19 from pattern")


if __name__ == "__main__":
    unittest.main()
41 tests/unit/test_iso_language.py Normal file
@@ -0,0 +1,41 @@
from __future__ import annotations

from pathlib import Path
import sys
import unittest


SRC_ROOT = Path(__file__).resolve().parents[2] / "src"

if str(SRC_ROOT) not in sys.path:
    sys.path.insert(0, str(SRC_ROOT))


from ffx.iso_language import IsoLanguage  # noqa: E402


class IsoLanguageTests(unittest.TestCase):
    def test_language_constant_set_covers_iso_639_1_plus_filipino_alias(self):
        languages = [language for language in IsoLanguage if language is not IsoLanguage.UNDEFINED]

        self.assertEqual(184, len(languages))
        self.assertEqual(183, len({language.twoLetter() for language in languages}))

    def test_primary_three_letter_code_is_returned_first(self):
        self.assertEqual("sqi", IsoLanguage.ALBANIAN.threeLetter())
        self.assertEqual("deu", IsoLanguage.GERMAN.threeLetter())
        self.assertEqual("cym", IsoLanguage.WELSH.threeLetter())

    def test_secondary_three_letter_codes_still_resolve_to_the_same_language(self):
        self.assertIs(IsoLanguage.ALBANIAN, IsoLanguage.findThreeLetter("alb"))
        self.assertIs(IsoLanguage.GERMAN, IsoLanguage.findThreeLetter("ger"))
        self.assertIs(IsoLanguage.WELSH, IsoLanguage.findThreeLetter("wel"))

    def test_newly_added_languages_and_media_aliases_resolve(self):
        self.assertIs(IsoLanguage.ASSAMESE, IsoLanguage.find("Assamese"))
        self.assertIs(IsoLanguage.YORUBA, IsoLanguage.findThreeLetter("yor"))
        self.assertIs(IsoLanguage.FILIPINO, IsoLanguage.findThreeLetter("fil"))


if __name__ == "__main__":
    unittest.main()
@@ -27,6 +27,64 @@ class StaticConfig:


class MediaDescriptorChangeSetTests(unittest.TestCase):
    def test_non_primary_source_language_code_is_normalized_in_changed_track_metadata(self):
        context = {
            "logger": get_ffx_logger(),
            "config": StaticConfig({}),
        }

        source_track = TrackDescriptor(
            index=0,
            source_index=0,
            sub_index=0,
            track_type=TrackType.AUDIO,
            tags={"language": "ger", "title": "German Main"},
        )
        target_track = TrackDescriptor(
            index=0,
            source_index=0,
            sub_index=0,
            track_type=TrackType.AUDIO,
            tags={"language": "ger", "title": "German Main"},
        )

        change_set = MediaDescriptorChangeSet(
            context,
            MediaDescriptor(track_descriptors=[target_track]),
            MediaDescriptor(track_descriptors=[source_track]),
        )

        metadata_tokens = change_set.generateMetadataTokens()

        self.assertIn("-metadata:s:a:0", metadata_tokens)
        self.assertIn("language=deu", metadata_tokens)
        self.assertNotIn("language=ger", metadata_tokens)

    def test_target_only_track_language_metadata_uses_primary_code(self):
        context = {
            "logger": get_ffx_logger(),
            "config": StaticConfig({}),
        }

        target_track = TrackDescriptor(
            index=0,
            source_index=0,
            sub_index=0,
            track_type=TrackType.AUDIO,
            tags={"language": "ger", "title": "German Main"},
        )

        change_set = MediaDescriptorChangeSet(
            context,
            MediaDescriptor(track_descriptors=[target_track]),
        )

        metadata_tokens = change_set.generateMetadataTokens()

        self.assertIn("-metadata:s:a:0", metadata_tokens)
        self.assertIn("language=deu", metadata_tokens)
        self.assertNotIn("language=ger", metadata_tokens)

    def test_external_subtitle_preserves_source_only_tags_except_removed_keys(self):
        context = {
            "logger": get_ffx_logger(),
@@ -79,6 +137,40 @@ class MediaDescriptorChangeSetTests(unittest.TestCase):
        self.assertNotIn("BPS=remove-me", metadata_tokens)
        self.assertIn("BPS=", metadata_tokens)

    def test_external_subtitle_normalizes_preserved_source_language_metadata(self):
        context = {
            "logger": get_ffx_logger(),
            "config": StaticConfig({}),
        }

        source_track = TrackDescriptor(
            index=0,
            source_index=0,
            sub_index=0,
            track_type=TrackType.SUBTITLE,
            tags={"language": "ger", "title": "German Subtitle"},
        )
        target_track = TrackDescriptor(
            index=0,
            source_index=0,
            sub_index=0,
            track_type=TrackType.SUBTITLE,
            tags={},
            external_source_file="/tmp/external-subtitle.vtt",
        )

        change_set = MediaDescriptorChangeSet(
            context,
            MediaDescriptor(track_descriptors=[target_track]),
            MediaDescriptor(track_descriptors=[source_track]),
        )

        metadata_tokens = change_set.generateMetadataTokens()

        self.assertIn("-metadata:s:s:0", metadata_tokens)
        self.assertIn("language=deu", metadata_tokens)
        self.assertNotIn("language=ger", metadata_tokens)

    def test_target_only_tracks_still_emit_remove_tokens_for_configured_stream_keys(self):
        context = {
            "logger": get_ffx_logger(),
79 tests/unit/test_media_descriptor_import_subtitles.py Normal file
@@ -0,0 +1,79 @@
from __future__ import annotations
|
||||
|
||||
from pathlib import Path
|
||||
import sys
|
||||
import tempfile
|
||||
import unittest
|
||||
|
||||
|
||||
SRC_ROOT = Path(__file__).resolve().parents[2] / "src"
|
||||
|
||||
if str(SRC_ROOT) not in sys.path:
|
||||
sys.path.insert(0, str(SRC_ROOT))
|
||||
|
||||
|
||||
from ffx.logging_utils import get_ffx_logger # noqa: E402
|
||||
from ffx.media_descriptor import MediaDescriptor # noqa: E402
|
||||
from ffx.track_descriptor import TrackDescriptor # noqa: E402
|
||||
from ffx.track_disposition import TrackDisposition # noqa: E402
|
||||
from ffx.track_type import TrackType # noqa: E402
|
||||
|
||||
|
||||
class MediaDescriptorImportSubtitlesTests(unittest.TestCase):
|
||||
def make_descriptor(self) -> MediaDescriptor:
|
||||
return MediaDescriptor(
|
||||
context={"logger": get_ffx_logger()},
|
||||
track_descriptors=[
|
||||
TrackDescriptor(
|
||||
index=3,
|
||||
source_index=3,
|
||||
sub_index=0,
|
||||
track_type=TrackType.SUBTITLE,
|
||||
tags={"language": "eng", "title": "DB Subtitle"},
|
||||
disposition_set={TrackDisposition.DEFAULT},
|
||||
)
|
||||
],
|
||||
)
|
||||
|
||||
def test_import_subtitles_preserves_target_dispositions_when_requested(self):
|
||||
descriptor = self.make_descriptor()
|
||||
|
||||
with tempfile.TemporaryDirectory() as tmpdir:
|
||||
sidecar_path = Path(tmpdir) / "dball_S01E01_3_deu_FOR.vtt"
|
||||
sidecar_path.write_text("WEBVTT\n\n", encoding="utf-8")
|
||||
|
||||
descriptor.importSubtitles(
|
||||
tmpdir,
|
||||
"dball",
|
||||
season=1,
|
||||
episode=1,
|
||||
preserve_dispositions=True,
|
||||
)
|
||||
|
||||
track = descriptor.getSubtitleTracks()[0]
|
||||
self.assertEqual(str(sidecar_path), track.getExternalSourceFilePath())
|
||||
self.assertEqual("deu", track.getTags()["language"])
|
||||
self.assertEqual({TrackDisposition.DEFAULT}, track.getDispositionSet())
|
||||
|
||||
def test_import_subtitles_uses_sidecar_dispositions_by_default(self):
|
||||
descriptor = self.make_descriptor()
|
||||
|
||||
with tempfile.TemporaryDirectory() as tmpdir:
|
||||
sidecar_path = Path(tmpdir) / "dball_S01E01_3_deu_FOR.vtt"
|
||||
sidecar_path.write_text("WEBVTT\n\n", encoding="utf-8")
|
||||
|
||||
descriptor.importSubtitles(
|
||||
tmpdir,
|
||||
"dball",
|
||||
season=1,
|
||||
episode=1,
|
||||
)
|
||||
|
||||
track = descriptor.getSubtitleTracks()[0]
|
||||
self.assertEqual(str(sidecar_path), track.getExternalSourceFilePath())
|
||||
self.assertEqual("deu", track.getTags()["language"])
|
||||
self.assertEqual({TrackDisposition.FORCED}, track.getDispositionSet())
|
||||
|
||||
|
||||
if __name__ == "__main__":
|
||||
unittest.main()
|
||||
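For orientation, the sidecar naming convention these tests exercise (`dball_S01E01_3_deu_FOR.vtt`: show slug, season/episode indicator, target track index, ISO language code, optional disposition marker) can be sketched as a standalone parser. The field layout and the `FOR` → forced mapping are inferred from the assertions above; the parser itself, including the marker table, is a hypothetical reimplementation, not the ffx code.

```python
import re

# Assumed sidecar layout, inferred from the test fixtures:
# <show>_S<season>E<episode>_<track index>_<language>[_<disposition>].<ext>
SIDECAR_RE = re.compile(
    r"^(?P<show>[^_]+)_S(?P<season>\d+)E(?P<episode>\d+)"
    r"_(?P<index>\d+)_(?P<language>[a-z]{3})(?:_(?P<disposition>[A-Z]+))?\.\w+$"
)

# Hypothetical marker table; only FOR -> forced is pinned by the tests.
DISPOSITION_MARKERS = {"FOR": "forced", "DEF": "default"}


def parse_sidecar_name(filename: str) -> dict:
    match = SIDECAR_RE.match(filename)
    if match is None:
        raise ValueError(f"not a sidecar subtitle name: {filename!r}")
    fields = match.groupdict()
    return {
        "show": fields["show"],
        "season": int(fields["season"]),
        "episode": int(fields["episode"]),
        "index": int(fields["index"]),
        "language": fields["language"],
        "disposition": DISPOSITION_MARKERS.get(fields["disposition"] or ""),
    }
```

With this sketch, `parse_sidecar_name("dball_S01E01_3_deu_FOR.vtt")` yields track index 3, language `deu`, and a forced disposition, matching what the second test asserts by default.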
47  tests/unit/test_migration.py  Normal file
@@ -0,0 +1,47 @@
from __future__ import annotations

from pathlib import Path
import sys
import unittest


SRC_ROOT = Path(__file__).resolve().parents[2] / "src"

if str(SRC_ROOT) not in sys.path:
    sys.path.insert(0, str(SRC_ROOT))


from ffx.model.migration import (  # noqa: E402
    DatabaseVersionException,
    getMigrationPlan,
    loadMigrationStep,
    migrateDatabase,
)


class MigrationTests(unittest.TestCase):
    def test_get_migration_plan_lists_known_step_with_module_presence(self):
        migrationPlan = getMigrationPlan(2, 3)

        self.assertEqual(1, len(migrationPlan))
        self.assertEqual(2, migrationPlan[0].versionFrom)
        self.assertEqual(3, migrationPlan[0].versionTo)
        self.assertEqual("ffx.model.migration.step_2_3", migrationPlan[0].moduleName)
        self.assertTrue(migrationPlan[0].modulePresent)

    def test_load_migration_step_returns_known_step(self):
        migrationStep = loadMigrationStep(2, 3)

        self.assertTrue(callable(migrationStep))

    def test_migrate_database_raises_when_step_module_is_missing(self):
        updatedVersions = []

        with self.assertRaises(DatabaseVersionException):
            migrateDatabase({}, 1, 2, lambda context, version: updatedVersions.append(version))

        self.assertEqual([], updatedVersions)


if __name__ == "__main__":
    unittest.main()
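The plan shape asserted above (ordered steps carrying `versionFrom`, `versionTo`, `moduleName`, and `modulePresent`) can be sketched independently of ffx. The `ffx.model.migration.step_<from>_<to>` naming scheme is taken directly from the test; the step walk itself is an assumed reimplementation for illustration only.

```python
from dataclasses import dataclass
from importlib import util


@dataclass
class MigrationStep:
    versionFrom: int
    versionTo: int
    moduleName: str
    modulePresent: bool


def get_migration_plan(version_from: int, version_to: int) -> list[MigrationStep]:
    # Walk one schema version at a time; each step maps to a module named
    # ffx.model.migration.step_<from>_<to>, as the tests assert.
    plan = []
    for version in range(version_from, version_to):
        module_name = f"ffx.model.migration.step_{version}_{version + 1}"
        try:
            present = util.find_spec(module_name) is not None
        except ModuleNotFoundError:
            present = False
        plan.append(MigrationStep(version, version + 1, module_name, present))
    return plan
```

A plan for a multi-version jump, e.g. `get_migration_plan(2, 4)`, then lists the steps `(2, 3)` and `(3, 4)` in order, which is what lets `migrateDatabase` fail fast when any step module is missing.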
208  tests/unit/test_shifted_season_controller.py  Normal file
@@ -0,0 +1,208 @@
from __future__ import annotations

import logging
from pathlib import Path
import sys
import tempfile
import unittest
from unittest.mock import patch


SRC_ROOT = Path(__file__).resolve().parents[2] / "src"

if str(SRC_ROOT) not in sys.path:
    sys.path.insert(0, str(SRC_ROOT))


from ffx.database import databaseContext  # noqa: E402
from ffx.model.pattern import Pattern  # noqa: E402
from ffx.model.track import Track  # noqa: E402
from ffx.show_controller import ShowController  # noqa: E402
from ffx.show_descriptor import ShowDescriptor  # noqa: E402
from ffx.shifted_season_controller import ShiftedSeasonController  # noqa: E402
from ffx.track_type import TrackType  # noqa: E402


class StaticConfig:
    def __init__(self, data: dict | None = None):
        self._data = data or {}

    def getData(self):
        return self._data


def make_logger(name: str) -> logging.Logger:
    logger = logging.getLogger(name)
    logger.handlers = []
    logger.setLevel(logging.DEBUG)
    logger.propagate = False
    logger.addHandler(logging.NullHandler())
    return logger


def make_context(database_path: Path) -> dict:
    return {
        "logger": make_logger(f"ffx-test-shifted-{database_path.stem}"),
        "config": StaticConfig(),
        "database": databaseContext(str(database_path)),
    }


class ShiftedSeasonControllerTests(unittest.TestCase):
    def setUp(self):
        self.tempdir = tempfile.TemporaryDirectory()
        self.database_path = Path(self.tempdir.name) / "shifted-season-test.db"
        self.context = make_context(self.database_path)
        self.show_controller = ShowController(self.context)
        self.shifted_season_controller = ShiftedSeasonController(self.context)

    def tearDown(self):
        self.context["database"]["engine"].dispose()
        self.tempdir.cleanup()

    def add_show(self, show_id: int, name: str = "Demo Show"):
        self.show_controller.updateShow(
            ShowDescriptor(id=show_id, name=name, year=2000 + show_id)
        )

    def add_pattern(self, show_id: int, expression: str) -> int:
        self.add_show(show_id)
        Session = self.context["database"]["session"]
        session = Session()
        try:
            pattern = Pattern(show_id=show_id, pattern=expression)
            session.add(pattern)
            session.flush()
            session.add(
                Track(
                    pattern_id=pattern.getId(),
                    track_type=TrackType.VIDEO.index(),
                    codec_name="h264",
                    index=0,
                    source_index=0,
                    disposition_flags=0,
                    audio_layout=0,
                )
            )
            session.commit()
            return pattern.getId()
        finally:
            session.close()

    def test_shift_season_uses_show_mapping_when_no_pattern_mapping_exists(self):
        pattern_id = self.add_pattern(1, r"^demo_(s[0-9]+e[0-9]+)\.mkv$")
        self.shifted_season_controller.addShiftedSeason(
            showId=1,
            shiftedSeasonObj={
                "original_season": 1,
                "first_episode": 1,
                "last_episode": 10,
                "season_offset": 2,
                "episode_offset": 5,
            },
        )

        with patch.object(self.context["logger"], "info") as mocked_info:
            shifted_season, shifted_episode = self.shifted_season_controller.shiftSeason(
                showId=1,
                patternId=pattern_id,
                season=1,
                episode=3,
            )

        self.assertEqual((3, 8), (shifted_season, shifted_episode))
        mocked_info.assert_called_once_with(
            "Setting season shift 1/3 -> 3/8 from show"
        )

    def test_shift_season_prefers_pattern_mapping_over_show_mapping(self):
        pattern_id = self.add_pattern(1, r"^demo_(s[0-9]+e[0-9]+)\.mkv$")
        self.shifted_season_controller.addShiftedSeason(
            showId=1,
            shiftedSeasonObj={
                "original_season": 1,
                "first_episode": 1,
                "last_episode": 10,
                "season_offset": 2,
                "episode_offset": 5,
            },
        )
        self.shifted_season_controller.addShiftedSeason(
            patternId=pattern_id,
            shiftedSeasonObj={
                "original_season": 1,
                "first_episode": 1,
                "last_episode": 10,
                "season_offset": 1,
                "episode_offset": -2,
            },
        )

        with patch.object(self.context["logger"], "info") as mocked_info:
            shifted_season, shifted_episode = self.shifted_season_controller.shiftSeason(
                showId=1,
                patternId=pattern_id,
                season=1,
                episode=3,
            )

        self.assertEqual((2, 1), (shifted_season, shifted_episode))
        mocked_info.assert_called_once_with(
            "Setting season shift 1/3 -> 2/1 from pattern"
        )

    def test_shift_season_pattern_zero_offsets_override_show_mapping_to_identity(self):
        pattern_id = self.add_pattern(1, r"^demo_(s[0-9]+e[0-9]+)\.mkv$")
        self.shifted_season_controller.addShiftedSeason(
            showId=1,
            shiftedSeasonObj={
                "original_season": 1,
                "first_episode": 1,
                "last_episode": 10,
                "season_offset": 2,
                "episode_offset": 5,
            },
        )
        self.shifted_season_controller.addShiftedSeason(
            patternId=pattern_id,
            shiftedSeasonObj={
                "original_season": 1,
                "first_episode": 1,
                "last_episode": 10,
                "season_offset": 0,
                "episode_offset": 0,
            },
        )

        with patch.object(self.context["logger"], "info") as mocked_info:
            shifted_season, shifted_episode = self.shifted_season_controller.shiftSeason(
                showId=1,
                patternId=pattern_id,
                season=1,
                episode=3,
            )

        self.assertEqual((1, 3), (shifted_season, shifted_episode))
        mocked_info.assert_called_once_with(
            "Setting season shift 1/3 -> 1/3 from pattern"
        )

    def test_shift_season_falls_back_to_identity_when_no_rule_matches(self):
        pattern_id = self.add_pattern(1, r"^demo_(s[0-9]+e[0-9]+)\.mkv$")

        with patch.object(self.context["logger"], "info") as mocked_info:
            shifted_season, shifted_episode = self.shifted_season_controller.shiftSeason(
                showId=1,
                patternId=pattern_id,
                season=4,
                episode=20,
            )

        self.assertEqual((4, 20), (shifted_season, shifted_episode))
        mocked_info.assert_called_once_with(
            "Setting season shift 4/20 -> 4/20 from default"
        )


if __name__ == "__main__":
    unittest.main()
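The precedence these tests pin down (a pattern-level mapping wins over a show-level mapping, and with no matching rule the input passes through unchanged) reduces to simple offset arithmetic on any rule whose `original_season` matches and whose `[first_episode, last_episode]` range contains the episode. A minimal standalone sketch of that rule, assuming dict-shaped rules like the test fixtures:

```python
def shift_season(season, episode, show_rules, pattern_rules):
    # Hypothetical reimplementation of the precedence asserted by the tests:
    # pattern rules beat show rules; no match means identity.
    def find_rule(rules):
        for rule in rules:
            if (
                rule["original_season"] == season
                and rule["first_episode"] <= episode <= rule["last_episode"]
            ):
                return rule
        return None

    rule = find_rule(pattern_rules) or find_rule(show_rules)
    if rule is None:
        return season, episode
    return season + rule["season_offset"], episode + rule["episode_offset"]
```

Note that a pattern rule with zero offsets still counts as a match, which is how the zero-offset test forces identity even though a non-trivial show rule exists.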
111  tests/unit/test_show_descriptor_defaults.py  Normal file
@@ -0,0 +1,111 @@
from __future__ import annotations

import logging
from pathlib import Path
import sys
import unittest


SRC_ROOT = Path(__file__).resolve().parents[2] / "src"

if str(SRC_ROOT) not in sys.path:
    sys.path.insert(0, str(SRC_ROOT))


from ffx.constants import (
    DEFAULT_SHOW_INDEX_EPISODE_DIGITS,
    DEFAULT_SHOW_INDEX_SEASON_DIGITS,
    DEFAULT_SHOW_INDICATOR_EPISODE_DIGITS,
    DEFAULT_SHOW_INDICATOR_SEASON_DIGITS,
)
from ffx.helper import getEpisodeFileBasename
from ffx.show_descriptor import ShowDescriptor


class StaticConfig:
    def __init__(self, data: dict | None = None):
        self._data = data or {}

    def getData(self):
        return self._data


class ShowDescriptorDefaultTests(unittest.TestCase):
    def make_context(self, config_data: dict | None = None) -> dict:
        logger = logging.getLogger("ffx-test-show-descriptor-defaults")
        logger.handlers = []
        logger.addHandler(logging.NullHandler())
        return {"config": StaticConfig(config_data), "logger": logger}

    def test_show_descriptor_uses_config_defaults_when_context_is_present(self):
        descriptor = ShowDescriptor(
            context=self.make_context(
                {
                    "defaultIndexSeasonDigits": "1",
                    "defaultIndexEpisodeDigits": "3",
                    "defaultIndicatorSeasonDigits": "3",
                    "defaultIndicatorEpisodeDigits": "4",
                }
            ),
            id=1,
            name="Configured Show",
            year=2024,
        )

        self.assertEqual(1, descriptor.getIndexSeasonDigits())
        self.assertEqual(3, descriptor.getIndexEpisodeDigits())
        self.assertEqual(3, descriptor.getIndicatorSeasonDigits())
        self.assertEqual(4, descriptor.getIndicatorEpisodeDigits())
        self.assertEqual(0, descriptor.getQuality())
        self.assertEqual("", descriptor.getNotes())

    def test_show_descriptor_without_context_uses_shared_constants(self):
        descriptor = ShowDescriptor(id=1, name="Default Show", year=2024)

        self.assertEqual(DEFAULT_SHOW_INDEX_SEASON_DIGITS, descriptor.getIndexSeasonDigits())
        self.assertEqual(DEFAULT_SHOW_INDEX_EPISODE_DIGITS, descriptor.getIndexEpisodeDigits())
        self.assertEqual(
            DEFAULT_SHOW_INDICATOR_SEASON_DIGITS,
            descriptor.getIndicatorSeasonDigits(),
        )
        self.assertEqual(
            DEFAULT_SHOW_INDICATOR_EPISODE_DIGITS,
            descriptor.getIndicatorEpisodeDigits(),
        )
        self.assertEqual(0, descriptor.getQuality())
        self.assertEqual("", descriptor.getNotes())

    def test_show_descriptor_preserves_explicit_quality(self):
        descriptor = ShowDescriptor(id=1, name="Quality Show", year=2024, quality=23)

        self.assertEqual(23, descriptor.getQuality())

    def test_show_descriptor_preserves_explicit_notes(self):
        descriptor = ShowDescriptor(id=1, name="Notes Show", year=2024, notes="show notes")

        self.assertEqual("show notes", descriptor.getNotes())

    def test_episode_basename_uses_configured_digit_defaults_when_omitted(self):
        basename = getEpisodeFileBasename(
            "Configured Show",
            "Episode Name",
            2,
            7,
            context=self.make_context(
                {
                    "defaultIndexSeasonDigits": 1,
                    "defaultIndexEpisodeDigits": 3,
                    "defaultIndicatorSeasonDigits": 3,
                    "defaultIndicatorEpisodeDigits": 4,
                }
            ),
        )

        self.assertEqual(
            "Configured Show - 2007 Episode Name - S002E0007",
            basename,
        )


if __name__ == "__main__":
    unittest.main()
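The expected basename `"Configured Show - 2007 Episode Name - S002E0007"` makes the padding scheme concrete: the index part is the zero-padded season immediately followed by the zero-padded episode (1 and 3 digits give `2` + `007`), and the indicator is the familiar `SxxEyy` marker with its own widths (3 and 4 digits give `S002E0007`). A self-contained sketch of that formatting, with the 2-digit defaults being an assumption rather than the real `ffx.constants` values:

```python
def episode_file_basename(
    show_name: str,
    episode_name: str,
    season: int,
    episode: int,
    index_season_digits: int = 2,     # assumed default width
    index_episode_digits: int = 2,    # assumed default width
    indicator_season_digits: int = 2, # assumed default width
    indicator_episode_digits: int = 2,  # assumed default width
) -> str:
    # Index: zero-padded season immediately followed by zero-padded episode.
    index = f"{season:0{index_season_digits}d}{episode:0{index_episode_digits}d}"
    # Indicator: SxxEyy marker with configurable digit widths.
    indicator = (
        f"S{season:0{indicator_season_digits}d}"
        f"E{episode:0{indicator_episode_digits}d}"
    )
    return f"{show_name} - {index} {episode_name} - {indicator}"
```

With the digit widths from the test config (1, 3, 3, 4), season 2 / episode 7 reproduces the asserted string exactly.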
@@ -2,12 +2,15 @@

set -u

ROOT_DIR="$(cd -- "$(dirname -- "${BASH_SOURCE[0]}")/.." && pwd)"
CONFIG_DIR="${FFX_CONFIG_DIR:-${HOME}/.local/etc}"
CONFIG_FILE="${FFX_CONFIG_FILE:-${CONFIG_DIR}/ffx.json}"
VAR_DIR="${FFX_VAR_DIR:-${HOME}/.local/var/ffx}"
LOG_DIR="${FFX_LOG_DIR:-${HOME}/.local/var/log}"
DATABASE_FILE="${FFX_DATABASE_FILE:-${VAR_DIR}/ffx.db}"
SUBTITLES_BASE_DIR="${FFX_SUBTITLES_BASE_DIR:-${HOME}/.local/var/sync/subtitles}"
FFX_PYTHON="${FFX_PYTHON:-${HOME}/.local/share/ffx.venv/bin/python}"
CONFIG_TEMPLATE_FILE="${FFX_CONFIG_TEMPLATE:-${ROOT_DIR}/assets/ffx.json.j2}"

CHECK_ONLY=0
WITH_TESTS=0
@@ -49,6 +52,8 @@ Environment overrides:
  FFX_LOG_DIR             Override the default log directory.
  FFX_DATABASE_FILE       Override the database path written into a newly seeded config.
  FFX_SUBTITLES_BASE_DIR  Override the default subtitles base directory written into a newly seeded config.
  FFX_PYTHON              Override the bundle venv Python used to render the seeded config.
  FFX_CONFIG_TEMPLATE     Override the Jinja2 template path used to seed the config.

Notes:
  - tools/setup.sh is the first installation step and owns bundle venv setup.
@@ -316,6 +321,93 @@ install_system_requirements() {
    return 0
}

render_default_config() {
    local output_path="$1"
    local temporary_output_path=""

    if [ ! -x "${FFX_PYTHON}" ]; then
        printf 'Missing bundle Python interpreter at %s.\n' "${FFX_PYTHON}" >&2
        INSTALL_FAILURES=$((INSTALL_FAILURES + 1))
        return 1
    fi

    if [ ! -f "${CONFIG_TEMPLATE_FILE}" ]; then
        printf 'Missing FFX config template at %s.\n' "${CONFIG_TEMPLATE_FILE}" >&2
        INSTALL_FAILURES=$((INSTALL_FAILURES + 1))
        return 1
    fi

    if ! temporary_output_path="$(mktemp "${output_path}.tmp.XXXXXX")"; then
        printf 'Failed to create a temporary config file next to %s.\n' "${output_path}" >&2
        INSTALL_FAILURES=$((INSTALL_FAILURES + 1))
        return 1
    fi

    if ! FFX_CONFIG_TEMPLATE_FILE="${CONFIG_TEMPLATE_FILE}" \
        FFX_REPO_ROOT="${ROOT_DIR}" \
        FFX_DATABASE_PATH="${DATABASE_FILE}" \
        FFX_LOG_DIRECTORY="${LOG_DIR}" \
        FFX_SUBTITLES_DIRECTORY="${SUBTITLES_BASE_DIR}" \
        "${FFX_PYTHON}" - >"${temporary_output_path}" <<'PY'
from __future__ import annotations

import json
import os
import sys
from pathlib import Path

from jinja2 import Environment, FileSystemLoader, StrictUndefined

repo_root = Path(os.environ["FFX_REPO_ROOT"])
src_root = repo_root / "src"
if str(src_root) not in sys.path:
    sys.path.insert(0, str(src_root))

from ffx.constants import (
    DEFAULT_SHOW_INDEX_EPISODE_DIGITS,
    DEFAULT_SHOW_INDEX_SEASON_DIGITS,
    DEFAULT_SHOW_INDICATOR_EPISODE_DIGITS,
    DEFAULT_SHOW_INDICATOR_SEASON_DIGITS,
)

template_path = Path(os.environ["FFX_CONFIG_TEMPLATE_FILE"])
environment = Environment(
    loader=FileSystemLoader(str(template_path.parent)),
    undefined=StrictUndefined,
    autoescape=False,
    keep_trailing_newline=True,
)
template = environment.get_template(template_path.name)

sys.stdout.write(
    template.render(
        database_path_json=json.dumps(os.environ["FFX_DATABASE_PATH"]),
        log_directory_json=json.dumps(os.environ["FFX_LOG_DIRECTORY"]),
        subtitles_directory_json=json.dumps(os.environ["FFX_SUBTITLES_DIRECTORY"]),
        default_index_season_digits=DEFAULT_SHOW_INDEX_SEASON_DIGITS,
        default_index_episode_digits=DEFAULT_SHOW_INDEX_EPISODE_DIGITS,
        default_indicator_season_digits=DEFAULT_SHOW_INDICATOR_SEASON_DIGITS,
        default_indicator_episode_digits=DEFAULT_SHOW_INDICATOR_EPISODE_DIGITS,
    )
)
PY
    then
        rm -f "${temporary_output_path}"
        printf 'Failed to render ffx config from template %s.\n' "${CONFIG_TEMPLATE_FILE}" >&2
        INSTALL_FAILURES=$((INSTALL_FAILURES + 1))
        return 1
    fi

    if ! mv "${temporary_output_path}" "${output_path}"; then
        rm -f "${temporary_output_path}"
        printf 'Failed to move rendered ffx config into place at %s.\n' "${output_path}" >&2
        INSTALL_FAILURES=$((INSTALL_FAILURES + 1))
        return 1
    fi

    return 0
}

seed_default_config() {
    if [ "${CHECK_ONLY}" -eq 1 ]; then
        return 0
@@ -365,43 +457,7 @@ seed_default_config() {

    if [ ! -f "${CONFIG_FILE}" ]; then
        printf 'Seeding ffx config at %s...\n' "${CONFIG_FILE}"
        if ! cat >"${CONFIG_FILE}" <<EOF
{
    "databasePath": "${DATABASE_FILE}",
    "logDirectory": "${LOG_DIR}",
    "subtitlesDirectory": "${SUBTITLES_BASE_DIR}",
    "metadata": {
        "signature": {
            "RECODED_WITH": "FFX"
        },
        "remove": [
            "VERSION-eng",
            "creation_time",
            "NAME"
        ],
        "streams": {
            "remove": [
                "BPS",
                "NUMBER_OF_FRAMES",
                "NUMBER_OF_BYTES",
                "_STATISTICS_WRITING_APP",
                "_STATISTICS_WRITING_DATE_UTC",
                "_STATISTICS_TAGS",
                "BPS-eng",
                "DURATION-eng",
                "NUMBER_OF_FRAMES-eng",
                "NUMBER_OF_BYTES-eng",
                "_STATISTICS_WRITING_APP-eng",
                "_STATISTICS_WRITING_DATE_UTC-eng",
                "_STATISTICS_TAGS-eng"
            ]
        }
    }
}
EOF
        then
            printf 'Failed to write ffx config at %s.\n' "${CONFIG_FILE}" >&2
            INSTALL_FAILURES=$((INSTALL_FAILURES + 1))
        if ! render_default_config "${CONFIG_FILE}"; then
            return 1
        fi
        created_any=1
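The hunk above replaces the inline heredoc with the `render_default_config` call, so a seeded `ffx.json` now comes from `assets/ffx.json.j2`. Given the variables passed to `template.render(...)`, a rendered config plausibly has a shape like the following sketch. The paths are illustrative, the digit values are placeholders for the real `ffx.constants` defaults, and the key names are taken from the old heredoc and the digit-default tests rather than from the template itself:

```json
{
    "databasePath": "/home/user/.local/var/ffx/ffx.db",
    "logDirectory": "/home/user/.local/var/log",
    "subtitlesDirectory": "/home/user/.local/var/sync/subtitles",
    "defaultIndexSeasonDigits": 2,
    "defaultIndexEpisodeDigits": 2,
    "defaultIndicatorSeasonDigits": 2,
    "defaultIndicatorEpisodeDigits": 2
}
```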
386  tools/merge_dev_into_main.sh  Executable file
@@ -0,0 +1,386 @@
#!/usr/bin/env bash

set -euo pipefail

DEV_BRANCH="dev"
MAIN_BRANCH="main"
ORIGIN_REMOTE="origin"
DEFAULT_AGENT_DEVELOPMENT_PATHS=(
    "AGENTS.md"
    "SCRATCHPAD.md"
    "guidance"
    "requirements"
    "prompts"
    "process"
    "tools/merge_dev_into_main.sh"
)
AGENT_DEVELOPMENT_PATHS=("${DEFAULT_AGENT_DEVELOPMENT_PATHS[@]}")

CURRENT_BRANCH="${DEV_BRANCH}"
ASSUME_YES=0
DRY_RUN=0
SKIP_TESTS=0

usage() {
    cat <<EOF
Usage: $(basename "$0") [--yes] [--dry-run] [--skip-tests] [--help]

Merge the local ${DEV_BRANCH} branch into ${MAIN_BRANCH}, remove agent-development files
from ${MAIN_BRANCH}, auto-resolve merge conflicts limited to those cleanup paths,
create a release merge commit and tag, push to ${ORIGIN_REMOTE}/${MAIN_BRANCH}, and
switch back to ${DEV_BRANCH}.

Options:
  --yes         Skip the interactive confirmation prompt.
  --dry-run     Print the validated release plan without changing git state.
  --skip-tests  Skip the default pre-release test gate (./tools/test.sh).
  --help        Show this help text.

Environment overrides:
  FFX_RELEASE_CLEAN_PATHS  Colon-separated path list to remove from ${MAIN_BRANCH}
                           after merging ${DEV_BRANCH}. Defaults to:
                           ${DEFAULT_AGENT_DEVELOPMENT_PATHS[*]}
EOF
}

fail() {
    printf '%s\n' "$*" >&2
    exit 1
}

cleanup() {
    local exit_code="$1"

    trap - EXIT

    if git rev-parse -q --verify MERGE_HEAD >/dev/null 2>&1; then
        printf 'Merge is incomplete; aborting merge on %s...\n' "${CURRENT_BRANCH}" >&2
        git merge --abort >/dev/null 2>&1 || true
    fi

    if [ "${CURRENT_BRANCH}" != "${DEV_BRANCH}" ]; then
        printf 'Switching back to %s...\n' "${DEV_BRANCH}" >&2
        git switch "${DEV_BRANCH}" >/dev/null 2>&1 || true
        CURRENT_BRANCH="${DEV_BRANCH}"
    fi

    exit "${exit_code}"
}

load_cleanup_paths() {
    if [ -n "${FFX_RELEASE_CLEAN_PATHS:-}" ]; then
        IFS=':' read -r -a AGENT_DEVELOPMENT_PATHS <<< "${FFX_RELEASE_CLEAN_PATHS}"
    fi

    if [ "${#AGENT_DEVELOPMENT_PATHS[@]}" -eq 0 ]; then
        fail "Release cleanup path list is empty."
    fi
}

path_is_cleanup_target() {
    local candidate_path="$1"
    local cleanup_path=""

    for cleanup_path in "${AGENT_DEVELOPMENT_PATHS[@]}"; do
        case "${candidate_path}" in
            "${cleanup_path}"|"${cleanup_path}"/*)
                return 0
                ;;
        esac
    done

    return 1
}

auto_resolve_cleanup_conflicts() {
    local unmerged_paths=()
    local non_cleanup_conflicts=()
    local remaining_conflicts=()
    local conflicted_path=""

    mapfile -t unmerged_paths < <(git diff --name-only --diff-filter=U)
    if [ "${#unmerged_paths[@]}" -eq 0 ]; then
        return 1
    fi

    for conflicted_path in "${unmerged_paths[@]}"; do
        if ! path_is_cleanup_target "${conflicted_path}"; then
            non_cleanup_conflicts+=("${conflicted_path}")
        fi
    done

    if [ "${#non_cleanup_conflicts[@]}" -ne 0 ]; then
        printf 'Merge produced non-cleanup conflicts:\n' >&2
        for conflicted_path in "${non_cleanup_conflicts[@]}"; do
            printf ' - %s\n' "${conflicted_path}" >&2
        done
        return 1
    fi

    printf 'Auto-resolving merge conflicts for release-cleanup paths:\n'
    for conflicted_path in "${unmerged_paths[@]}"; do
        printf ' - %s\n' "${conflicted_path}"
    done

    git rm -r -f --ignore-unmatch "${AGENT_DEVELOPMENT_PATHS[@]}" >/dev/null

    mapfile -t remaining_conflicts < <(git diff --name-only --diff-filter=U)
    if [ "${#remaining_conflicts[@]}" -ne 0 ]; then
        printf 'Cleanup conflict auto-resolution left unresolved paths:\n' >&2
        for conflicted_path in "${remaining_conflicts[@]}"; do
            printf ' - %s\n' "${conflicted_path}" >&2
        done
        return 1
    fi

    return 0
}

require_repo_state() {
    if ! git rev-parse --show-toplevel >/dev/null 2>&1; then
        fail "This helper must be run inside a git repository."
    fi

    if ! git show-ref --verify --quiet "refs/heads/${DEV_BRANCH}"; then
        fail "Local branch '${DEV_BRANCH}' does not exist."
    fi

    if ! git show-ref --verify --quiet "refs/heads/${MAIN_BRANCH}"; then
        fail "Local branch '${MAIN_BRANCH}' does not exist."
    fi

    if ! git remote get-url "${ORIGIN_REMOTE}" >/dev/null 2>&1; then
        fail "Remote '${ORIGIN_REMOTE}' is not configured."
    fi
}

require_dev_checkout() {
    CURRENT_BRANCH="$(git rev-parse --abbrev-ref HEAD)"
    if [ "${CURRENT_BRANCH}" != "${DEV_BRANCH}" ]; then
        fail "Current branch is '${CURRENT_BRANCH}', but '${DEV_BRANCH}' is required."
    fi
}

require_clean_worktree() {
    if [ -n "$(git status --porcelain)" ]; then
        fail "Local '${DEV_BRANCH}' branch is dirty. Commit, stash, or clean changes first."
    fi
}

fetch_remote_state() {
    printf 'Fetching %s branch and tag state...\n' "${ORIGIN_REMOTE}"
    git fetch "${ORIGIN_REMOTE}" "${DEV_BRANCH}" "${MAIN_BRANCH}" --tags >/dev/null
}

require_branch_matches_remote() {
    local branch="$1"
    local local_sha=""
    local remote_sha=""

    if ! git show-ref --verify --quiet "refs/remotes/${ORIGIN_REMOTE}/${branch}"; then
        fail "Remote branch '${ORIGIN_REMOTE}/${branch}' does not exist."
    fi

    local_sha="$(git rev-parse "refs/heads/${branch}")"
    remote_sha="$(git rev-parse "refs/remotes/${ORIGIN_REMOTE}/${branch}")"

    if [ "${local_sha}" != "${remote_sha}" ]; then
        fail "Local branch '${branch}' is not up to date with '${ORIGIN_REMOTE}/${branch}'. Pull, rebase, or push first."
    fi
}

resolve_release_version() {
    local version_from_pyproject=""
    local version_from_constants=""

    version_from_pyproject="$(
        sed -n 's/^version = "\(.*\)"$/\1/p' pyproject.toml | head -n 1
    )"
    version_from_constants="$(
        sed -n "s/^VERSION='\(.*\)'$/\1/p" src/ffx/constants.py | head -n 1
    )"

    if [ -z "${version_from_pyproject}" ]; then
        fail "Could not resolve release version from pyproject.toml."
    fi

    if [ -z "${version_from_constants}" ]; then
        fail "Could not resolve release version from src/ffx/constants.py."
    fi

    if [ "${version_from_pyproject}" != "${version_from_constants}" ]; then
        fail "Version mismatch: pyproject.toml=${version_from_pyproject}, src/ffx/constants.py=${version_from_constants}."
    fi

    printf '%s\n' "${version_from_pyproject}"
}

require_release_tag_available() {
    local release_version="$1"
    local release_tag="v${release_version}"

    if git rev-parse -q --verify "refs/tags/${release_tag}" >/dev/null 2>&1; then
        fail "Tag '${release_tag}' already exists."
    fi

    if git rev-parse -q --verify "refs/tags/${release_version}" >/dev/null 2>&1; then
        fail "Bare tag '${release_version}' already exists; refusing to create ambiguous release tags."
    fi
}

run_pre_release_tests() {
    if [ "${SKIP_TESTS}" -eq 1 ]; then
        printf 'Skipping pre-release tests.\n'
        return 0
    fi

    if [ ! -x "./tools/test.sh" ]; then
        fail "Missing executable test runner at ./tools/test.sh."
    fi

    printf 'Running pre-release tests via ./tools/test.sh...\n'
    ./tools/test.sh
}

print_release_plan() {
    local release_version="$1"
    local release_tag="v${release_version}"
    local release_commit_message="Release ${release_tag}"

    printf 'Dry run only. Planned steps:\n'
    printf '1. Ensure current branch is %s and the worktree is clean.\n' "${DEV_BRANCH}"
    printf '2. Fetch %s and verify local %s and %s exactly match %s/%s and %s/%s.\n' \
        "${ORIGIN_REMOTE}" \
        "${DEV_BRANCH}" \
        "${MAIN_BRANCH}" \
        "${ORIGIN_REMOTE}" \
        "${DEV_BRANCH}" \
        "${ORIGIN_REMOTE}" \
        "${MAIN_BRANCH}"
    if [ "${SKIP_TESTS}" -eq 1 ]; then
        printf '3. Skip the pre-release test gate.\n'
    else
        printf '3. Run ./tools/test.sh as the pre-release test gate.\n'
    fi
    printf '4. Switch to %s and merge %s with --no-ff --no-commit.\n' "${MAIN_BRANCH}" "${DEV_BRANCH}"
    printf '5. Auto-resolve merge conflicts limited to release-cleanup paths and remove them from %s:\n' "${MAIN_BRANCH}"
    local cleanup_path=""
    for cleanup_path in "${AGENT_DEVELOPMENT_PATHS[@]}"; do
        printf ' - %s\n' "${cleanup_path}"
    done
    printf '6. Create merge commit: %s\n' "${release_commit_message}"
    printf '7. Create annotated tag: %s\n' "${release_tag}"
    printf '8. Push %s to %s/%s with --follow-tags.\n' "${MAIN_BRANCH}" "${ORIGIN_REMOTE}" "${MAIN_BRANCH}"
    printf '9. Switch back to %s.\n' "${DEV_BRANCH}"
}

trap 'cleanup $?' EXIT

while [ "$#" -gt 0 ]; do
    case "$1" in
        --yes)
            ASSUME_YES=1
            ;;
        --dry-run)
            DRY_RUN=1
            ;;
        --skip-tests)
            SKIP_TESTS=1
            ;;
        --help|-h)
            usage
            exit 0
            ;;
        *)
            usage >&2
            fail "Unknown option: $1"
            ;;
    esac
    shift
done

load_cleanup_paths
require_repo_state
require_dev_checkout
require_clean_worktree
fetch_remote_state
require_branch_matches_remote "${DEV_BRANCH}"
require_branch_matches_remote "${MAIN_BRANCH}"

RELEASE_VERSION="$(resolve_release_version)"
RELEASE_TAG="v${RELEASE_VERSION}"
RELEASE_COMMIT_MESSAGE="Release ${RELEASE_TAG}"
require_release_tag_available "${RELEASE_VERSION}"

printf 'This will merge %s into %s, remove agent-development files on %s,\n' "${DEV_BRANCH}" "${MAIN_BRANCH}" "${MAIN_BRANCH}"
printf 'auto-resolve cleanup-path conflicts, run the pre-release gate%s, create %s,\n' \
    "$([ "${SKIP_TESTS}" -eq 1 ] && printf ' (skipped)' || printf '')" \
    "${RELEASE_TAG}"
printf 'push to %s/%s, and switch back to %s.\n' \
    "${ORIGIN_REMOTE}" \
    "${MAIN_BRANCH}" \
    "${DEV_BRANCH}"

if [ "${ASSUME_YES}" -ne 1 ]; then
    printf 'Are you sure? [y/N] '
    read -r confirmation
    case "${confirmation}" in
        y|Y|yes|YES)
            ;;
        *)
            fail "Aborted by user."
            ;;
    esac
fi

if [ "${DRY_RUN}" -eq 1 ]; then
    print_release_plan "${RELEASE_VERSION}"
    exit 0
fi

run_pre_release_tests
require_clean_worktree
fetch_remote_state
require_branch_matches_remote "${DEV_BRANCH}"
require_branch_matches_remote "${MAIN_BRANCH}"
require_release_tag_available "${RELEASE_VERSION}"

git switch "${MAIN_BRANCH}" >/dev/null
CURRENT_BRANCH="${MAIN_BRANCH}"

printf 'Merging %s into %s...\n' "${DEV_BRANCH}" "${MAIN_BRANCH}"
if ! git merge --no-ff --no-commit "${DEV_BRANCH}"; then
    if ! auto_resolve_cleanup_conflicts; then
        fail "Merge from '${DEV_BRANCH}' into '${MAIN_BRANCH}' failed."
    fi
fi

if ! git rev-parse -q --verify MERGE_HEAD >/dev/null 2>&1; then
    fail "'${MAIN_BRANCH}' is already up to date with '${DEV_BRANCH}'. Nothing to merge."
fi

printf 'Removing agent-development files from %s...\n' "${MAIN_BRANCH}"
git rm -r -f --ignore-unmatch "${AGENT_DEVELOPMENT_PATHS[@]}" >/dev/null

if git diff --cached --quiet; then
    fail "No staged changes are present after merging '${DEV_BRANCH}' into '${MAIN_BRANCH}'."
fi

printf 'Creating release merge commit: %s\n' "${RELEASE_COMMIT_MESSAGE}"
git commit -m "${RELEASE_COMMIT_MESSAGE}"

printf 'Creating annotated tag: %s\n' "${RELEASE_TAG}"
git tag -a "${RELEASE_TAG}" -m "FFX ${RELEASE_VERSION}"

printf 'Pushing %s and annotated tags to %s...\n' "${MAIN_BRANCH}" "${ORIGIN_REMOTE}"
git push "${ORIGIN_REMOTE}" "${MAIN_BRANCH}" --follow-tags

printf 'Switching back to %s...\n' "${DEV_BRANCH}"
git switch "${DEV_BRANCH}" >/dev/null
CURRENT_BRANCH="${DEV_BRANCH}"

printf 'Release merge complete: %s pushed to %s/%s and tagged as %s.\n' \
    "${RELEASE_COMMIT_MESSAGE}" \
    "${ORIGIN_REMOTE}" \
    "${MAIN_BRANCH}" \
    "${RELEASE_TAG}"
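The `FFX_RELEASE_CLEAN_PATHS` override is split on `:` with `IFS=':' read -r -a` in `load_cleanup_paths`. A minimal standalone sketch of that parsing (the example value is illustrative):

```shell
#!/usr/bin/env bash
set -euo pipefail

# Split a colon-separated override into an array, mirroring load_cleanup_paths.
FFX_RELEASE_CLEAN_PATHS="AGENTS.md:guidance:tools/merge_dev_into_main.sh"
IFS=':' read -r -a cleanup_paths <<< "${FFX_RELEASE_CLEAN_PATHS}"

# One path per line, as the script's conflict reports print them.
printf ' - %s\n' "${cleanup_paths[@]}"
```

The here-string keeps the current shell as the reader, so `cleanup_paths` stays visible to the rest of the script, which is why the original uses `<<<` rather than a pipe.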