Optimize database queries
@@ -11,13 +11,13 @@
 - A first modern integration slice now exists under [`tests/integration/subtrack_mapping`](/home/osgw/.local/src/codex/ffx/tests/integration/subtrack_mapping). Remaining test-suite cleanup is now mostly about migrating and shrinking the legacy harness surface under [`tests/legacy`](/home/osgw/.local/src/codex/ffx/tests/legacy).
 - FFX logger setup now reuses named handlers, and fallback logger access no longer mutates handlers in ordinary constructors and helpers.
 - The process wrapper now uses `subprocess.run(...)` with centralized command formatting plus stable timeout and missing-command error mapping.
+- Active ORM controllers now use single-query accessors instead of paired `count()` plus `first()` lookups.

 ## Focused Snapshot

 - Highest-leverage application optimizations:
 - Lazy-load CLI command dependencies so lightweight commands do not import most of the app.
 - Collapse repeated `ffprobe` calls into a single probe result per source file.
-- Replace `query.count()` plus `first()` patterns with single-query ORM accessors.
 - Cache or precompile filename pattern regexes instead of scanning every pattern for every file.

 - Highest-leverage repo and workflow optimizations:
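The `subprocess.run(...)` wrapper mentioned above can be sketched roughly as follows; the helper name and error type are illustrative, not the actual FFX API:

```python
import subprocess

def run_command(argv, timeout=None):
    """Run an external command, mapping missing-binary and timeout
    failures to stable, user-facing errors."""
    try:
        result = subprocess.run(
            argv, capture_output=True, text=True, timeout=timeout
        )
        return result.returncode, result.stdout, result.stderr
    except FileNotFoundError:
        # The executable itself is missing from PATH.
        raise RuntimeError(f"command not found: {argv[0]}")
    except subprocess.TimeoutExpired:
        raise RuntimeError(f"command timed out after {timeout}s: {' '.join(argv)}")
```

Centralizing this mapping means every caller sees the same error shape regardless of which binary failed.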
@@ -35,16 +35,7 @@
 - Faster startup for scripting and tooling commands.
 - Less coupling between maintenance commands and the runtime stack.

-2. Repeated database queries via `count()` plus `first()`
-- Controllers such as [`src/ffx/show_controller.py`](/home/osgw/.local/src/codex/ffx/src/ffx/show_controller.py), [`src/ffx/pattern_controller.py`](/home/osgw/.local/src/codex/ffx/src/ffx/pattern_controller.py), and [`src/ffx/database.py`](/home/osgw/.local/src/codex/ffx/src/ffx/database.py) often do `q.count()` and then `q.first()`.
-- Optimization:
-- Replace with `first()`, `one_or_none()`, or existence checks that do not issue two queries.
-- Standardize this across all controllers.
-- Expected value:
-- Lower SQLite query volume.
-- Simpler controller code.
-
-3. Filename pattern matching scales linearly across all patterns
+2. Filename pattern matching scales linearly across all patterns
 - [`src/ffx/pattern_controller.py`](/home/osgw/.local/src/codex/ffx/src/ffx/pattern_controller.py) loads every pattern and runs `re.search` against each filename on every lookup.
 - Optimization:
 - Cache compiled regexes in process memory.
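The single-query accessor pattern this commit rolls out can be sketched with a throwaway SQLAlchemy model (not the real FFX schema); `first()` already returns the row or `None`, so the paired `count()` round trip is unnecessary:

```python
from sqlalchemy import Column, Integer, String, create_engine
from sqlalchemy.orm import declarative_base, sessionmaker

Base = declarative_base()

class Show(Base):
    __tablename__ = 'shows'
    id = Column(Integer, primary_key=True)
    name = Column(String)

engine = create_engine('sqlite://')
Base.metadata.create_all(engine)
Session = sessionmaker(bind=engine)

def get_show(session, show_id):
    # One round trip: first() returns the row or None,
    # so no separate count() query is needed.
    return session.query(Show).filter(Show.id == show_id).first()

s = Session()
s.add(Show(id=1, name='Example'))
s.commit()
```

`one_or_none()` is the stricter variant when duplicates would indicate a bug.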
@@ -54,7 +45,7 @@
 - Faster per-file setup when many patterns exist.
 - More predictable matching behavior.

-4. Media probing does two separate `ffprobe` subprocesses per file
+3. Media probing does two separate `ffprobe` subprocesses per file
 - [`src/ffx/file_properties.py`](/home/osgw/.local/src/codex/ffx/src/ffx/file_properties.py) calls `ffprobe` once for format data and once for stream data.
 - Optimization:
 - Use one probe call that requests both format and streams.
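The single-call probe can be shaped like this; the helper names are a sketch, and `ffprobe` is only invoked in the second function:

```python
import json
import subprocess

def build_probe_command(path):
    # One ffprobe invocation returning both sections as JSON,
    # instead of separate -show_format and -show_streams runs.
    return [
        "ffprobe", "-v", "error",
        "-show_format", "-show_streams",
        "-of", "json", path,
    ]

def probe(path):
    out = subprocess.run(build_probe_command(path),
                         capture_output=True, text=True, check=True).stdout
    data = json.loads(out)
    return data.get("format", {}), data.get("streams", [])
```

One subprocess per file also makes the format/stream views consistent with each other, since both come from the same read.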
@@ -63,7 +54,7 @@
 - Less subprocess overhead.
 - Faster inspect and convert flows.

-5. Crop detection is always a full extra ffmpeg scan
+4. Crop detection is always a full extra ffmpeg scan
 - [`src/ffx/file_properties.py`](/home/osgw/.local/src/codex/ffx/src/ffx/file_properties.py) runs a dedicated `ffmpeg -vf cropdetect` pass for each file when crop detection is requested.
 - Optimization:
 - Cache crop results for repeated runs on the same source.
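Caching crop results for an unchanged source can be as small as keying on path, size, and mtime; this is a sketch with an injected `detect` callable standing in for the real cropdetect pass:

```python
import os
import tempfile

_crop_cache = {}
calls = []

def detect(path):
    # Stand-in for the expensive `ffmpeg -vf cropdetect` scan.
    calls.append(path)
    return "crop=1920:800:0:140"

def cached_crop(path, detect):
    """Return the cached crop value for an unchanged file,
    calling detect(path) only on a cache miss."""
    st = os.stat(path)
    key = (path, st.st_size, int(st.st_mtime))
    if key not in _crop_cache:
        _crop_cache[key] = detect(path)
    return _crop_cache[key]

with tempfile.NamedTemporaryFile(delete=False) as f:
    f.write(b"fake media bytes")
    sample = f.name

first = cached_crop(sample, detect)
second = cached_crop(sample, detect)
```

Persisting the cache to disk (keyed the same way) would extend the win across separate runs.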
@@ -71,7 +62,7 @@
 - Expected value:
 - Lower latency on repeated experimentation.

-6. Tooling overlap and naming drift
+5. Tooling overlap and naming drift
 - There are still overlapping workstation-setup entrypoints across [`tools/configure_workstation.sh`](/home/osgw/.local/src/codex/ffx/tools/configure_workstation.sh), [`tools/setup.sh`](/home/osgw/.local/src/codex/ffx/tools/setup.sh), and newer CLI maintenance commands.
 - Optimization:
 - Decide which scripts remain canonical.
@@ -81,7 +72,7 @@
 - Less operator confusion.
 - Fewer duplicated procedures to maintain.

-7. Placeholder UI surfaces should either ship or disappear
+6. Placeholder UI surfaces should either ship or disappear
 - [`src/ffx/help_screen.py`](/home/osgw/.local/src/codex/ffx/src/ffx/help_screen.py) and [`src/ffx/settings_screen.py`](/home/osgw/.local/src/codex/ffx/src/ffx/settings_screen.py) are placeholders.
 - Optimization:
 - Either remove them from the active UI surface or complete them.
@@ -90,7 +81,7 @@
 - Leaner interface.
 - Lower UX ambiguity.

-8. Large Textual screens repeat configuration and controller loading
+7. Large Textual screens repeat configuration and controller loading
 - Screens such as [`src/ffx/media_details_screen.py`](/home/osgw/.local/src/codex/ffx/src/ffx/media_details_screen.py), [`src/ffx/pattern_details_screen.py`](/home/osgw/.local/src/codex/ffx/src/ffx/pattern_details_screen.py), and [`src/ffx/show_details_screen.py`](/home/osgw/.local/src/codex/ffx/src/ffx/show_details_screen.py) repeat setup patterns and local metadata filtering extraction.
 - Optimization:
 - Extract a shared screen base or helper for common config/controller/bootstrap logic.
@@ -99,7 +90,7 @@
 - Lower maintenance overhead.
 - Easier UI iteration.

-9. Several helper functions are unfinished or dead-weight
+8. Several helper functions are unfinished or dead-weight
 - [`src/ffx/helper.py`](/home/osgw/.local/src/codex/ffx/src/ffx/helper.py) contains `permutateList(...): pass`.
 - There are many combinator and conversion placeholders across tests and migrations.
 - Optimization:
@@ -109,7 +100,7 @@
 - Smaller mental model.
 - Less time spent re-evaluating inactive paths.

-10. Test suite shape is expensive to understand and likely expensive to run
+9. Test suite shape is expensive to understand and likely expensive to run
 - The project still carries a large legacy matrix of combinator files under [`tests/legacy`](/home/osgw/.local/src/codex/ffx/tests/legacy), several placeholder `pass` implementations, and at least one suspicious filename with an embedded space: [`tests/legacy/disposition_combinator_2_3 .py`](/home/osgw/.local/src/codex/ffx/tests/legacy/disposition_combinator_2_3 .py).
 - A first focused replacement slice now exists in [`tests/integration/subtrack_mapping/test_cli_bundle.py`](/home/osgw/.local/src/codex/ffx/tests/integration/subtrack_mapping/test_cli_bundle.py), so the remaining work is migration and consolidation rather than creating the modern test shape from scratch.
 - Optimization:
@@ -120,7 +111,7 @@
 - Faster contributor onboarding.
 - Easier CI adoption later.

-11. Process resource limiting semantics could be clearer
+10. Process resource limiting semantics could be clearer
 - [`src/ffx/process.py`](/home/osgw/.local/src/codex/ffx/src/ffx/process.py) prepends `nice` and `cpulimit` directly when values are set.
 - Optimization:
 - Validate and document effective behavior for combined `nice` + `cpulimit`.
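The prefixing behavior to validate looks roughly like this sketch; the helper name and the choice of wrapping `cpulimit` around the `nice` invocation are assumptions to be checked against the real `process.py`:

```python
def wrap_with_limits(argv, niceness=None, cpu_percent=None):
    """Prefix a command with nice/cpulimit when limits are set.

    cpulimit wraps the whole nice invocation here; whether that
    combination behaves as intended is exactly what the doc item
    asks to verify and document.
    """
    cmd = list(argv)
    if niceness is not None:
        cmd = ["nice", "-n", str(niceness)] + cmd
    if cpu_percent is not None:
        cmd = ["cpulimit", "-l", str(cpu_percent), "--"] + cmd
    return cmd
```

Making the nesting order explicit in one helper keeps the semantics testable instead of scattered across call sites.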
@@ -129,7 +120,7 @@
 - Fewer surprises in production-like runs.
 - Easier support for user-reported performance behavior.

-12. Import-time dependency coupling makes maintenance commands brittle
+11. Import-time dependency coupling makes maintenance commands brittle
 - Even after recent CLI maintenance additions, the top-level CLI module still imports most application modules before Click dispatch.
 - Optimization:
 - Push imports for ORM, Textual, TMDB, ffmpeg helpers, and descriptors behind the commands that actually need them.
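The lazy-import pattern is just moving imports into the command bodies; this sketch uses the stdlib `wave` module as a stand-in for a heavy optional dependency:

```python
import sys

def convert_command(path):
    # Heavy media deps are imported only when this command runs,
    # so lightweight commands never pay for (or break on) them.
    import wave  # stand-in for ORM/Textual/TMDB/ffmpeg helpers
    with wave.open(path) as f:
        return f.getnframes()

def version_command():
    # A lightweight command: triggers no heavy imports.
    return "ffx 0.0"

banner = version_command()
heavy_loaded = "wave" in sys.modules
```

With Click, the same idea applies inside each `@cli.command()` function body.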
@@ -137,7 +128,7 @@
 - Maintenance commands such as setup and upgrade stay usable when optional runtime dependencies are broken.
 - Better separation between media runtime code and maintenance tooling.

-13. Regex and string utility cleanup
+12. Regex and string utility cleanup
 - [`src/ffx/helper.py`](/home/osgw/.local/src/codex/ffx/src/ffx/helper.py) still emits a `SyntaxWarning` for `RICH_COLOR_PATTERN`.
 - Optimization:
 - Convert regex literals to raw strings where appropriate.
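The fix is the raw-string form of the pattern literal; the pattern below is illustrative, not the actual `RICH_COLOR_PATTERN`:

```python
import re

# A plain "\[" works only because Python passes unknown escapes
# through — with a SyntaxWarning on recent versions. The raw-string
# form says what it means and emits no warning.
RICH_COLOR_PATTERN = re.compile(r"\[(\w+)\]")

def strip_rich_colors(text):
    return RICH_COLOR_PATTERN.sub("", text)
```

Since invalid escape sequences are slated to become errors in future Python, this is correctness work, not just noise reduction.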
@@ -146,7 +137,7 @@
 - Cleaner runtime output.
 - Less warning noise during dry-run maintenance commands.

-14. Database startup always runs schema creation and version checks
+13. Database startup always runs schema creation and version checks
 - [`src/ffx/database.py`](/home/osgw/.local/src/codex/ffx/src/ffx/database.py) runs `Base.metadata.create_all(...)` and version checks every time a DB-backed context is created.
 - Optimization:
 - Measure startup cost and consider separating bootstrapping from ordinary command execution.
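One way to separate bootstrapping from ordinary startup is to gate DDL on a cheap existence check; this sketch uses raw `sqlite3` and a hypothetical `property` table rather than the real SQLAlchemy metadata:

```python
import sqlite3

def ensure_schema(conn):
    """Run DDL only when the schema is actually missing, instead of
    unconditionally on every DB-backed context creation."""
    row = conn.execute(
        "SELECT 1 FROM sqlite_master WHERE type='table' AND name='property'"
    ).fetchone()
    if row is None:
        conn.execute("CREATE TABLE property (key TEXT PRIMARY KEY, value TEXT)")
    return row is None  # True only when bootstrap work was done

conn = sqlite3.connect(":memory:")
created_first = ensure_schema(conn)
created_second = ensure_schema(conn)
```

The same gating idea applies to `create_all(...)` via SQLAlchemy's inspector, though `create_all` is itself conditional per table; the measurable cost is the repeated reflection and version checks.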
@@ -171,7 +162,6 @@
 1. Triage the list into quick wins, medium refactors, and long-horizon cleanup.
 2. Tackle the cheapest high-impact items first:
 - regex raw-string warning cleanup,
-- `count()` plus `first()` query cleanup,
 - single-call `ffprobe` refactor.
 3. Decide whether maintenance/tooling command imports should be split from media-runtime imports before adding more CLI maintenance surface.

@@ -70,9 +70,9 @@ def getDatabaseVersion(databaseContext):

 Session = databaseContext['session']
 s = Session()
-q = s.query(Property).filter(Property.key == DATABASE_VERSION_KEY)
+versionProperty = s.query(Property).filter(Property.key == DATABASE_VERSION_KEY).first()

-return int(q.first().value) if q.count() else 0
+return int(versionProperty.value) if versionProperty is not None else 0

 except Exception as ex:
 raise click.ClickException(f"getDatabaseVersion(): {repr(ex)}")
@@ -25,10 +25,9 @@ class MediaController():
 pid = int(patternId)

 s = self.Session()
-q = s.query(Pattern).filter(Pattern.id == pid)
+pattern = s.query(Pattern).filter(Pattern.id == pid).first()

-if q.count():
-pattern = q.first
+if pattern is not None:

 for mediaTagKey, mediaTagValue in mediaDescriptor.getTags():
 self.__tac.updateMediaTag(pid, mediaTagKey, mediaTagValue)
@@ -19,10 +19,12 @@ class PatternController():
 try:

 s = self.Session()
-q = s.query(Pattern).filter(Pattern.show_id == int(patternObj['show_id']),
-Pattern.pattern == str(patternObj['pattern']))
+pattern = s.query(Pattern).filter(
+Pattern.show_id == int(patternObj['show_id']),
+Pattern.pattern == str(patternObj['pattern']),
+).first()

-if not q.count():
+if pattern is None:
 pattern = Pattern(show_id = int(patternObj['show_id']),
 pattern = str(patternObj['pattern']))
 s.add(pattern)
@@ -41,11 +43,9 @@ class PatternController():

 try:
 s = self.Session()
-q = s.query(Pattern).filter(Pattern.id == int(patternId))
+pattern = s.query(Pattern).filter(Pattern.id == int(patternId)).first()

-if q.count():
-
-pattern: Pattern = q.first()
+if pattern is not None:

 pattern.show_id = int(patternObj['show_id'])
 pattern.pattern = str(patternObj['pattern'])
@@ -69,10 +69,12 @@ class PatternController():

 try:
 s = self.Session()
-q = s.query(Pattern).filter(Pattern.show_id == int(patternObj['show_id']), Pattern.pattern == str(patternObj['pattern']))
+pattern = s.query(Pattern).filter(
+Pattern.show_id == int(patternObj['show_id']),
+Pattern.pattern == str(patternObj['pattern']),
+).first()

-if q.count():
-pattern = q.first()
+if pattern is not None:
 return int(pattern.id)
 else:
 return None
@@ -90,9 +92,7 @@ class PatternController():

 try:
 s = self.Session()
-q = s.query(Pattern).filter(Pattern.id == int(patternId))
-
-return q.first() if q.count() else None
+return s.query(Pattern).filter(Pattern.id == int(patternId)).first()

 except Exception as ex:
 raise click.ClickException(f"PatternController.getPattern(): {repr(ex)}")
@@ -103,13 +103,12 @@ class PatternController():
 def deletePattern(self, patternId):
 try:
 s = self.Session()
-q = s.query(Pattern).filter(Pattern.id == int(patternId))
+pattern = s.query(Pattern).filter(Pattern.id == int(patternId)).first()

-if q.count():
+if pattern is not None:

 #DAFUQ: https://stackoverflow.com/a/19245058
 # q.delete()
-pattern = q.first()
 s.delete(pattern)

 s.commit()
@@ -101,11 +101,9 @@ class ShiftedSeasonController():
 try:
 s = self.Session()

-q = s.query(ShiftedSeason).filter(ShiftedSeason.id == int(shiftedSeasonId))
+shiftedSeason = s.query(ShiftedSeason).filter(ShiftedSeason.id == int(shiftedSeasonId)).first()

-if q.count():
-
-shiftedSeason = q.first()
+if shiftedSeason is not None:

 shiftedSeason.original_season = int(shiftedSeasonObj['original_season'])
 shiftedSeason.first_episode = int(shiftedSeasonObj['first_episode'])
@@ -141,12 +139,14 @@ class ShiftedSeasonController():

 try:
 s = self.Session()
-q = s.query(ShiftedSeason).filter(ShiftedSeason.show_id == int(showId),
+shiftedSeason = s.query(ShiftedSeason).filter(
+ShiftedSeason.show_id == int(showId),
 ShiftedSeason.original_season == int(originalSeason),
 ShiftedSeason.first_episode == int(firstEpisode),
-ShiftedSeason.last_episode == int(lastEpisode))
+ShiftedSeason.last_episode == int(lastEpisode),
+).first()

-return q.first().getId() if q.count() else None
+return shiftedSeason.getId() if shiftedSeason is not None else None

 except Exception as ex:
 raise click.ClickException(f"PatternController.findShiftedSeason(): {repr(ex)}")
@@ -177,9 +177,7 @@ class ShiftedSeasonController():

 try:
 s = self.Session()
-q = s.query(ShiftedSeason).filter(ShiftedSeason.id == int(shiftedSeasonId))
-
-return q.first() if q.count() else None
+return s.query(ShiftedSeason).filter(ShiftedSeason.id == int(shiftedSeasonId)).first()

 except Exception as ex:
 raise click.ClickException(f"ShiftedSeasonController.getShiftedSeason(): {repr(ex)}")
@@ -194,13 +192,12 @@ class ShiftedSeasonController():

 try:
 s = self.Session()
-q = s.query(ShiftedSeason).filter(ShiftedSeason.id == int(shiftedSeasonId))
+shiftedSeason = s.query(ShiftedSeason).filter(ShiftedSeason.id == int(shiftedSeasonId)).first()

-if q.count():
+if shiftedSeason is not None:

 #DAFUQ: https://stackoverflow.com/a/19245058
 # q.delete()
-shiftedSeason = q.first()
 s.delete(shiftedSeason)

 s.commit()
@@ -16,10 +16,9 @@ class ShowController():

 try:
 s = self.Session()
-q = s.query(Show).filter(Show.id == showId)
+show = s.query(Show).filter(Show.id == showId).first()

-if q.count():
-show: Show = q.first()
+if show is not None:
 return show.getDescriptor(self.context)

 except Exception as ex:
@@ -31,9 +30,7 @@ class ShowController():

 try:
 s = self.Session()
-q = s.query(Show).filter(Show.id == showId)
-
-return q.first() if q.count() else None
+return s.query(Show).filter(Show.id == showId).first()

 except Exception as ex:
 raise click.ClickException(f"ShowController.getShow(): {repr(ex)}")
@@ -44,12 +41,7 @@ class ShowController():

 try:
 s = self.Session()
-q = s.query(Show)
-
-if q.count():
-return q.all()
-else:
-return []
+return s.query(Show).all()

 except Exception as ex:
 raise click.ClickException(f"ShowController.getAllShows(): {repr(ex)}")
@@ -61,9 +53,9 @@ class ShowController():

 try:
 s = self.Session()
-q = s.query(Show).filter(Show.id == showDescriptor.getId())
+currentShow = s.query(Show).filter(Show.id == showDescriptor.getId()).first()

-if not q.count():
+if currentShow is None:
 show = Show(id = int(showDescriptor.getId()),
 name = str(showDescriptor.getName()),
 year = int(showDescriptor.getYear()),
@@ -76,9 +68,6 @@ class ShowController():
 s.commit()
 return True
 else:
-
-currentShow = q.first()
-
 changed = False
 if currentShow.name != str(showDescriptor.getName()):
 currentShow.name = str(showDescriptor.getName())
@@ -113,14 +102,12 @@ class ShowController():
 def deleteShow(self, show_id):
 try:
 s = self.Session()
-q = s.query(Show).filter(Show.id == int(show_id))
+show = s.query(Show).filter(Show.id == int(show_id)).first()

-if q.count():
+if show is not None:

 #DAFUQ: https://stackoverflow.com/a/19245058
 # q.delete()
-show = q.first()
 s.delete(show)

 s.commit()
@@ -67,10 +67,11 @@ class TagController():
 try:
 s = self.Session()

-q = s.query(MediaTag).filter(MediaTag.pattern_id == int(patternId),
-MediaTag.key == str(tagKey))
-if q.count():
-tag = q.first()
+tag = s.query(MediaTag).filter(
+MediaTag.pattern_id == int(patternId),
+MediaTag.key == str(tagKey),
+).first()
+if tag is not None:
 s.delete(tag)
 s.commit()
 return True
@@ -107,12 +108,8 @@ class TagController():
 try:
 s = self.Session()

-q = s.query(MediaTag).filter(MediaTag.pattern_id == int(patternId))
-
-if q.count():
-return {t.key:t.value for t in q.all()}
-else:
-return {}
+tags = s.query(MediaTag).filter(MediaTag.pattern_id == int(patternId)).all()
+return {t.key:t.value for t in tags}

 except Exception as ex:
 raise click.ClickException(f"TagController.findAllMediaTags(): {repr(ex)}")
@@ -125,12 +122,8 @@ class TagController():
 try:
 s = self.Session()

-q = s.query(TrackTag).filter(TrackTag.track_id == int(trackId))
-
-if q.count():
-return {t.key:t.value for t in q.all()}
-else:
-return {}
+tags = s.query(TrackTag).filter(TrackTag.track_id == int(trackId)).all()
+return {t.key:t.value for t in tags}

 except Exception as ex:
 raise click.ClickException(f"TagController.findAllTracks(): {repr(ex)}")
@@ -142,12 +135,7 @@ class TagController():

 try:
 s = self.Session()
-q = s.query(Track).filter(MediaTag.track_id == int(trackId), MediaTag.key == str(trackKey))
-
-if q.count():
-return q.first()
-else:
-return None
+return s.query(Track).filter(MediaTag.track_id == int(trackId), MediaTag.key == str(trackKey)).first()

 except Exception as ex:
 raise click.ClickException(f"TagController.findMediaTag(): {repr(ex)}")
@@ -158,12 +146,10 @@ class TagController():

 try:
 s = self.Session()
-q = s.query(TrackTag).filter(TrackTag.track_id == int(trackId), TrackTag.key == str(tagKey))
-
-if q.count():
-return q.first()
-else:
-return None
+return s.query(TrackTag).filter(
+TrackTag.track_id == int(trackId),
+TrackTag.key == str(tagKey),
+).first()

 except Exception as ex:
 raise click.ClickException(f"TagController.findTrackTag(): {repr(ex)}")
@@ -175,11 +161,9 @@ class TagController():
 def deleteMediaTag(self, tagId) -> bool:
 try:
 s = self.Session()
-q = s.query(MediaTag).filter(MediaTag.id == int(tagId))
+tag = s.query(MediaTag).filter(MediaTag.id == int(tagId)).first()

-if q.count():
-
-tag = q.first()
+if tag is not None:

 s.delete(tag)
@@ -201,11 +185,9 @@ class TagController():

 try:
 s = self.Session()
-q = s.query(TrackTag).filter(TrackTag.id == int(tagId))
+tag = s.query(TrackTag).filter(TrackTag.id == int(tagId)).first()

-if q.count():
-
-tag = q.first()
+if tag is not None:

 s.delete(tag)
@@ -75,11 +75,9 @@ class TrackController():

 try:
 s = self.Session()
-q = s.query(Track).filter(Track.id == int(trackId))
+track = s.query(Track).filter(Track.id == int(trackId)).first()

-if q.count():
-
-track : Track = q.first()
+if track is not None:

 track.index = int(trackDescriptor.getIndex())
@@ -193,12 +191,10 @@ class TrackController():

 try:
 s = self.Session()
-q = s.query(Track).filter(Track.pattern_id == int(patternId), Track.index == int(index))
-
-if q.count():
-return q.first()
-else:
-return None
+return s.query(Track).filter(
+Track.pattern_id == int(patternId),
+Track.index == int(index),
+).first()

 except Exception as ex:
 raise click.ClickException(f"TrackController.getTrack(): {repr(ex)}")
@@ -218,11 +214,9 @@ class TrackController():

 try:
 s = self.Session()
-q = s.query(Track).filter(Track.pattern_id == patternId, Track.index == index)
+track = s.query(Track).filter(Track.pattern_id == patternId, Track.index == index).first()

-if q.count():
-
-track : Track = q.first()
+if track is not None:

 if state:
 track.setDisposition(disposition)
@@ -244,10 +238,10 @@ class TrackController():
 try:
 s = self.Session()

-q = s.query(Track).filter(Track.id == int(trackId))
+track = s.query(Track).filter(Track.id == int(trackId)).first()

-if q.count():
-patternId = int(q.first().pattern_id)
+if track is not None:
+patternId = int(track.pattern_id)

 q_siblings = s.query(Track).filter(Track.pattern_id == patternId).order_by(Track.index)