Adds requirements, streamlines CLI helper procedures
1 .gitignore vendored
@@ -12,3 +12,4 @@ bin/conversiontest.py
 build/
 dist/
 *.egg-info/
+.codex
376 AGENTS.md Normal file
@@ -0,0 +1,376 @@
# AGENTS.md

This file is the entry point for agent guidance in this repository.

It is intentionally generic and reusable across projects. Keep this file focused on non-project-specific constraints, working style, and the structure used to link more detailed guidance.

# Purpose

- Provide a small default rule set for agents working in this repository.
- Keep the base guidance modular and easy to extend.
- Separate reusable agent behavior from project-specific requirements.

# Comment Syntax

- A segment wrapped in `<!--` and `-->` is a comment and must be ignored by agents.
- Use HTML comments for optional guidance that should stay inactive until enabled.
- To enable an optional segment, remove the surrounding `<!--` and `-->` markers.
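
The comment mechanics above can be sketched as a small filter. The helper name is hypothetical; the rules only require that agents ignore commented segments, not that any code exists:

```python
import re

# Remove HTML comment segments so inactive guidance is never applied.
# Hypothetical helper; AGENTS.md only requires that agents ignore the segments.
def strip_inactive_segments(text: str) -> str:
    return re.sub(r"<!--.*?-->", "", text, flags=re.DOTALL)

doc = "Active rule.\n<!-- # Optional Section\ninactive rule -->\nAnother active rule."
print(strip_inactive_segments(doc))
```

Enabling an optional segment is then exactly the inverse operation: deleting the surrounding markers so the text survives this filter.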

# Core Principles

- Prefer the simplest solution that satisfies the current goal.
- Keep guidance lightweight: only add detail when it meaningfully improves outcomes.
- Reuse modular guideline files instead of expanding this file indefinitely.
- Treat project-specific documents as the source of truth for project behavior.
- When guidance conflicts, use the most specific applicable document.

# Rule Terms

- A `rule` is the general term for any constraint, requirement, definition, or similar guidance item.
- A `rule set` comprises all rules inside one file that share the same rule set ID.
- Any rule inside a rule set shall use an ID following the schema `RULESET-0001`, `RULESET-0002`, and so on.
- Rules without a rule set ID are also valid, but they are not addressable by rule ID.
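
One illustrative reading of the ID schema above, as a parser; the pattern and function name are assumptions, not part of the rules:

```python
import re

# A rule ID follows the schema <RULE SET ID>-<4 digits>, e.g. RULESET-0001 or LII-0003.
RULE_ID = re.compile(r"^([A-Z][A-Z0-9]*)-(\d{4})$")

def parse_rule_id(rule_id: str):
    """Return (rule set ID, rule number) for a well-formed rule ID, else None."""
    match = RULE_ID.match(rule_id)
    if match is None:
        return None
    return match.group(1), int(match.group(2))

print(parse_rule_id("LII-0003"))   # → ('LII', 3)
print(parse_rule_id("lii-3"))      # → None
```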

# Scope Of This File

This file should contain:

- Generic agent behavior and constraints.
- Rules that are reusable across multiple projects.
- Links to optional guideline modules.
- Links to project-specific requirements.
- Commented optional templates for released-product documentation and agent-output locations.

This file should not contain:

- Project business requirements.
- Project architecture decisions.
- Stack-specific implementation details unless they are universally applicable.
- Task-specific runbooks that belong in dedicated modules.

# Default Agent Behavior

- Read the relevant context before making changes.
- Prefer small, understandable edits over broad refactors.
- Preserve existing patterns unless there is a clear reason to change them.
- Document assumptions when context is missing.
- Ignore HTML comment segments.
- If a more specific enabled guideline exists for the current task, follow it.

# Guideline Structure

Use the following structure for reusable guidance files and project-specific documentation as needed:

```text
/
|-- AGENTS.md
|-- guidance/
|   |-- stacks/
|   |-- conventions/
|   `-- workflows/
|-- prompts/
`-- requirements/

Optional files and directories

|-- SCRATCHPAD.md
|-- docs/
|   |-- readme.md
|   |-- installation.md
|   `-- history.md
|-- process/
|   |-- log.md
|   `-- coding-handbook.md
```

# Optional Reusable Modules

Add files under `guidance/` only when they are needed.

# Optional Scratchpad

- `SCRATCHPAD.md` is an optional repo-root scratchpad for temporary information aimed at the next iteration.
- Developers may create or delete `SCRATCHPAD.md` at any time.
- Developers may refer to `SCRATCHPAD.md` as `scratchpad` when giving agents a source or target for information.
- Agents may read, update, create, or remove the scratchpad when the task explicitly calls for it.
- Treat the scratchpad as low-formality working context rather than canonical project truth.
- Use the scratchpad for short-lived notes, open questions, sketches, and temporary decisions that should be resolved away.
- Move durable outcomes into `requirements/`, `guidance/`, code, tests, or another long-lived location.
- If `SCRATCHPAD.md` is absent, agents should continue normally.

# Optional Rule Sets

- Optional rule sets may be stored in `guidance/optional/` or in `guidance/{section}/optional/`.
- Optional rule sets are inactive by default and shall only be applied when a prompt explicitly requests them, for example with phrases such as `Apply rules for lean interface iteration in the following steps.` or `Apply LII rules.`
- An optional rule set may be requested by its descriptive name, by its rule set ID, or by another equally clear explicit reference.
- Agents shall never infer or auto-enable optional rule sets from general intent alone.
- If an optional rule or rule set cannot be identified and addressed clearly, agents shall stop and ask before proceeding.

# Prepared Orders

- An `order` is a prepared prompt for one isolated operation rather than a general workflow or standing rule set.
- Orders shall be stored under `prompts/`.
- Order files shall use the naming schema `ORDER-0001-<slug>.md`, `ORDER-0002-<slug>.md`, and so on.
- The canonical order identifier is the `ORDER-0001`-style prefix; the trailing slug is descriptive only.
- The recommended internal structure of an order file is: prompt ID, prompt name, purpose, trigger examples, scope, operation, and expected output.
- Orders shall only be executed when they are explicitly requested by a prompt such as `Execute ORDER-0007.` or `Execute ORDER 7.`
- Agents may accept an unambiguous short numeric reference such as `ORDER 7` as an alias for `ORDER-0007`.
- If an order cannot be identified uniquely and clearly, agents shall stop and ask before proceeding.
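
The alias handling above can be sketched as a normalizer. The function name is hypothetical; only the `ORDER-0001` schema and the short numeric alias come from the rules:

```python
import re

# Normalize explicit order references such as `ORDER 7` or `ORDER-0007` to the
# canonical `ORDER-0007` form.
def canonical_order_id(reference: str):
    match = re.fullmatch(r"ORDER[- ](\d{1,4})", reference.strip())
    if match is None:
        return None  # Ambiguous or malformed: the agent should stop and ask.
    return f"ORDER-{int(match.group(1)):04d}"

print(canonical_order_id("ORDER 7"))      # → ORDER-0007
print(canonical_order_id("ORDER-0007"))   # → ORDER-0007
```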

# Toolstack Guides

Location:

```text
guidance/stacks/
```

Examples:

- `guidance/stacks/python.md`
- `guidance/stacks/typescript.md`
- `guidance/stacks/docker.md`
- `guidance/stacks/terraform.md`

Use for:

- Language or framework expectations.
- Tooling and environment conventions.
- Build, test, and runtime guidance tied to a specific stack.

# Coding Conventions

Location:

```text
guidance/conventions/
```

Examples:

- `guidance/conventions/naming.md`
- `guidance/conventions/testing.md`
- `guidance/conventions/review.md`

Use for:

- Naming and structure conventions.
- Testing expectations.
- Code review and quality rules.

# Recurring Workflows

Location:

```text
guidance/workflows/
```

Examples:

- `guidance/workflows/feature-delivery.md`
- `guidance/workflows/bugfix.md`
- `guidance/workflows/release.md`
- `guidance/workflows/incident-response.md`

Use for:

- Repeatable task flows.
- Checklists for common delivery work.
- Operational or maintenance procedures.

<!-- Enable this optional section by removing the outer HTML comment markers from this segment
when you want agents to create, update, and consult released-product documentation in `docs/`.

# Released Product Documentation

Released-product documentation should live outside the generic sections above.

Recommended location:

```text
docs/
```

Examples:

- `docs/readme.md`
- `docs/installation.md`
- `docs/history.md`

Agent rules for docs output:

- Keep content compact but comprehensive.
- Write for end users, operators, or other consumers of the released product.
- Prefer shipped behavior, supported workflows, and stable terminology over internal implementation detail.
- Keep documentation synchronized with released behavior.
- Update release history when user-visible changes are shipped.

Recommended topics:

- Product overview and intended use.
- Installation, configuration, and upgrade guidance.
- Usage patterns, operational instructions, and support boundaries.
- Compatibility notes, migration notes, and release history.
- Troubleshooting and common pitfalls when relevant. -->

<!-- Enable this optional section by removing the outer HTML comment markers from this
segment when you want agents to produce and consult workflow output in `process/`.

# Agent Output In `process/`

The `process/` directory is primarily for agent output created during delivery, maintenance, and review work.

Recommended location:

```text
process/
```

Agent rules for process output:

- Use `process/` for agent-produced artifacts rather than released-product documentation.
- Keep entries concise, traceable, and tied to resulting changes.
- Treat `process/` as workflow output, not as the primary source of product truth.
- Prefer summaries and rationale over raw transcript dumps unless a workflow explicitly requires full prompt history.

# Agent Change Log

Location:

```text
process/log.md
```

Use for:

- Capturing prompts given to agents.
- Recording concise explanations of the resulting changes made by agents.
- Preserving task-by-task rationale, decisions, and implementation notes.

# Coding Handbook

Location:

```text
process/coding-handbook.md
```

Use for:

- A tutorial-style handbook that explains the programming components used in the project.
- Compact but comprehensive technical onboarding material for future contributors.
- Written explanations that connect code structure, concepts, and implementation patterns. -->

# Project-Specific Requirements

Project-specific material should live outside the generic sections above.

Recommended location:

```text
requirements/
```

Examples:

- `requirements/project.md`
- `requirements/architecture.md`
- `requirements/decisions.md`
- `requirements/domain.md`

Use for:

- Product and business requirements.
- Project goals and constraints.
- Architecture and design decisions.
- Domain knowledge that is specific to this repository.

# Agent-Level Variables

When present, `requirements/identifiers.yml` is an optional project-specific input that defines agent-level variables for use inside `requirements/` and `guidance/`.

Variable schema:

- Use `@{VARIABLE_NAME}` for agent-level variables.
- Prefer uppercase snake case names such as `@{PROJECT_ID}` or `@{VENDOR_ID}`.
- Do not treat `${...}` as an agent-level variable form; that syntax may appear in Bash or other code and should not be interpreted as agent metadata.

Scope:

- The effective scope of `requirements/identifiers.yml` is limited to `requirements/` and `guidance/`.
- Definitions from `requirements/identifiers.yml` must not leak into product code.

Defaults:

- Default `@{VENDOR_ID}` is `osgw`.
- Default `@{PROJECT_ID}` is the current repository directory name.

Resolution rules:

- Treat `requirements/identifiers.yml` as optional; when it is absent, agents may still resolve the defaults defined above.
- If a variable used in `requirements/` or `guidance/` is neither defined in `requirements/identifiers.yml` nor given a default in this file, agents may stop and report the undefined variable.
- Prefer updating duplicated identifier values in `requirements/` and `guidance/` to use the variable schema when that improves consistency.
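
The resolution rules above can be sketched as a substitution pass. This is illustrative only: resolution is actually performed by agents reading the files, and the helper names here are assumptions:

```python
import re

# Resolve `@{NAME}` agent-level variables against identifiers.yml values plus the
# documented defaults, while leaving `${...}` shell syntax untouched.
DEFAULTS = {"VENDOR_ID": "osgw"}  # PROJECT_ID would default to the repo directory name.

def resolve_variables(text: str, defined: dict) -> str:
    values = {**DEFAULTS, **defined}
    def substitute(match):
        name = match.group(1)
        if name not in values:
            # Mirrors "agents may stop and report the undefined variable".
            raise KeyError(f"undefined agent-level variable: {name}")
        return values[name]
    return re.sub(r"@\{([A-Z0-9_]+)\}", substitute, text)

print(resolve_variables("id: @{VENDOR_ID}-@{PROJECT_ID}; keep ${HOME}", {"PROJECT_ID": "ffx"}))
# → id: osgw-ffx; keep ${HOME}
```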

# Precedence

Some precedence levels may be absent because optional levels can remain inside HTML comments. The smaller numeric index wins.

Apply guidance in this order:

1. Direct user or task instructions.
2. Project-specific documents in `requirements/`.
<!-- 3. Released-product documentation in `docs/` when shipped behavior or user-facing expectations are relevant. -->
4. Relevant modular guides in `guidance/stacks/`, `guidance/conventions/`, or `guidance/workflows/`.
<!-- 5. Agent output in `process/` when prior prompts, rationale, or implementation notes are relevant. -->
6. This `AGENTS.md`.

# Maintenance

- Keep this file short and stable.
- Move detail into dedicated modules when a section becomes too specific or too long.
- Add new guideline files only when they solve a recurring need.
- Remove outdated references when the repository structure changes.

# Current Status

This repository defines the base `AGENTS.md` structure plus project-specific requirements and modular guidance.

Future project work can add:

- Reusable modules under `guidance/`
- Project-specific documentation under `requirements/`
- Optional temporary iteration context in `SCRATCHPAD.md`
- Optional released-product documentation under `docs/` by uncommenting its segment
- Optional agent output under `process/` by uncommenting its segment
- Cross-references from this file once those documents exist
62 SCRATCHPAD.md Normal file
@@ -0,0 +1,62 @@

<!--

# Scratchpad

Temporary information holder for the next iteration. Developers may create or delete this file at any time. Anything durable should move into code, tests, or canonical docs, and then this file should disappear.

## Goal

Use this section for the current slice of work. It should explain which work the scratchpad is helping move forward right now.

## Settled

Use this for decisions that are stable enough to guide the next steps, but still temporary enough to live in the scratchpad for now.

## Focused Snapshot

Use an extra section like this only when one slice needs its own compact summary. This is useful when a specific API, boundary, or model was recently recreated and should be captured clearly.

## Open

Use this for unresolved questions, design choices, and risks that still need a decision.

## Sketches

Use this for rough candidate structures, names, or shapes. Keep it explicit that these are sketches, not committed architecture.

## Gaps Right Now

Use this for concrete missing pieces in the current repo state. This section should describe what is absent or incomplete, not broad future ambitions.

## Next

Use this for the immediate sequence of work. It should be short, ordered, and biased toward the next deliverable rather than a long roadmap.

## Delete When

Use this to define when the scratchpad should disappear. That keeps it clearly temporary and helps prevent it from turning into shadow documentation.

## Suggested Style

- Prefer short bullets over long prose.
- Keep facts, questions, and rough sketches in separate sections.
- Add custom sections only when they help the next iteration move faster.
- Move durable outcomes out of the scratchpad once they stop being temporary.

-->
28 guidance/workflow/optional/lean-interface-iteration.md Normal file
@@ -0,0 +1,28 @@

# Lean Interface Iteration

Rule set name: `lean-interface-iteration`

Rule set ID: `LII`

Status: optional, prompt-activated only

Trigger examples:

- `Apply the lean-interface-iteration rules.`
- `Apply LII rules.`

LII-0001: Apply this rule set only when it is explicitly requested in the prompt.

LII-0002: The target of work under this rule set is the iterated product state for the addressed iteration only.

LII-0003: Optimize the addressed interface toward the leanest and least complex model that still satisfies the iteration order.

LII-0004: Backward compatibility, legacy aliases, and compatibility shims are not required unless the prompt explicitly asks to preserve them.

LII-0005: Prefer one authoritative interface over multiple overlapping parameters, flags, or naming variants.

LII-0006: Remove or avoid transitional interface layers when they are not required by the addressed iteration order.

LII-0007: Update affected tests, guidance, requirements, and documentation so they describe the simplified interface model rather than a mixed legacy-and-new model.

LII-0008: Never change behavior, interfaces, or surrounding areas that are not addressed by the current iteration order.
56 guidance/workflow/optional/preparation-script-design.md Normal file
@@ -0,0 +1,56 @@

# Preparation Script Design

Rule set name: `preparation-script-design`

Rule set ID: `PSD`

Status: optional, prompt-activated only

Trigger examples:

- `Apply the preparation-script-design rules.`
- `Apply PSD rules.`

PSD-0001: Apply this rule set only when it is explicitly requested in the prompt.

PSD-0002: Use this rule set for scripts whose purpose is to prepare, verify, or expose a local development or automation environment rather than to perform product runtime behavior.

PSD-0003: Keep a preparation script focused on environment readiness, dependency installation, local helper exposure, and clear verification output; do not mix unrelated product logic into the script.

PSD-0004: Design the script to be idempotent so repeated runs converge on the same prepared state without unnecessary reinstallation or destructive side effects.

PSD-0005: Provide a verification-only mode such as `--check` that reports readiness without installing, modifying, or creating dependencies.

PSD-0006: Separate component checks from installation steps so the script can report what is missing before or after attempted remediation.

PSD-0007: Group required capabilities into clear purpose-oriented sections such as support toolchains, local package bundles, generated environment helpers, or other relevant readiness areas instead of presenting one undifferentiated dependency list.

PSD-0008: Prefer explicit per-component check helpers over opaque one-shot checks so failures remain traceable and easy to extend.

PSD-0009: Generate or update environment helper files only when they provide a stable, reusable way to expose repo-local or workspace-local tools, paths, or environment variables.

PSD-0010: Generated environment helper files shall be safe to source multiple times and should avoid duplicating path entries or clobbering unrelated user environment state.

PSD-0011: When a preparation flow seeds optional user-owned files such as config templates, do so non-destructively by creating them only when absent, unless the prompt explicitly requests overwrite behavior.

PSD-0012: Report status in a concise scan-friendly line format of the shape `[status] Label: detail`, where the label names the checked component and the detail string stays short and specific.

PSD-0013: Prefer a small canonical status vocabulary in those report lines, with `ok` for satisfied checks, `warn` for non-blocking gaps, and a failure status such as `failed` for blocking or unsuccessful states.

PSD-0014: When a preparation script uses terminal colors in its status output, apply a consistent severity mapping so `ok` is green, `warn` is yellow, and all other status levels are red.

PSD-0015: In bracketed status markers such as `[ok]` or `[warn]`, keep the square brackets uncolored and apply the severity color only to the inner status text.

PSD-0016: Colorized status output shall degrade safely in non-terminal or non-color contexts so the script remains readable and automation-friendly without ANSI support.

PSD-0017: End with an explicit readiness conclusion that distinguishes between successful preparation, incomplete prerequisites, and failed installation attempts.

PSD-0018: Installation logic should use the narrowest supported platform-specific package-manager actions necessary for the declared scope and should fail clearly when no supported installation path is available.

PSD-0019: Treat repo-local helper tooling and local package installation boundaries explicitly rather than assuming global installs, especially when the prepared environment is intended to be reproducible.

PSD-0020: Keep the script suitable for both interactive local developer use and non-interactive automation checks by avoiding prompts during normal execution unless the prompt explicitly requires interactivity.

PSD-0021: When a script depends on generated helper files or adjacent validation helpers, update those supporting files only as needed to keep the preparation flow coherent and usable.

PSD-0022: Verify shell syntax after changes and, when feasible, run a dry readiness check so the resulting preparation flow is validated rather than only written.
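
The status-line rules (PSD-0012 through PSD-0016) can be sketched as a small formatter. The severity mapping and uncolored brackets follow the rules above; the function name, exact ANSI codes, and TTY detection are assumptions:

```python
import sys

# `[status] Label: detail` report lines with severity coloring.
COLORS = {"ok": "\033[32m", "warn": "\033[33m"}  # green, yellow; others fall through to red
RED, RESET = "\033[31m", "\033[0m"

def status_line(status: str, label: str, detail: str, use_color=None) -> str:
    if use_color is None:
        use_color = sys.stdout.isatty()  # degrade safely outside a terminal (PSD-0016)
    if use_color:
        color = COLORS.get(status, RED)  # ok=green, warn=yellow, everything else red (PSD-0014)
        status = f"{color}{status}{RESET}"  # brackets stay uncolored (PSD-0015)
    return f"[{status}] {label}: {detail}"

print(status_line("ok", "ffmpeg", "found on PATH", use_color=False))
print(status_line("warn", "config", "template missing", use_color=False))
```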
97 requirements/architecture.md Normal file
@@ -0,0 +1,97 @@

# Architecture

## Architecture Goals

- Keep the tool small, local, and easy to reason about.
- Separate media inspection, stored normalization rules, and conversion execution clearly enough that users can inspect and adjust behavior.
- Favor explicit local state and deterministic rule application over opaque automation.
- Make external runtime dependencies and platform assumptions visible.

## System Context

- Primary actors:
  - Local operator running the CLI.
  - Local operator using the Textual TUI to inspect files and maintain rules.
- External systems:
  - `ffprobe` for media introspection.
  - `ffmpeg` for conversion and extraction.
  - TMDB API for optional show and episode metadata.
  - Local filesystem for source media, generated outputs, subtitles, logs, config, and database files.
- Data entering the system:
  - Media container and stream metadata from source files.
  - Regex patterns and per-show normalization rules entered in the TUI.
  - Optional config values from `~/.local/etc/ffx.json`.
  - Optional TMDB identifiers and CLI overrides.
  - Optional external subtitle files.
- Data leaving the system:
  - Normalized output media files.
  - Extracted stream files from unmux operations.
  - SQLite rows representing shows, patterns, tracks, tags, shifted seasons, and properties.
  - Local log output and console messages.
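
The `ffprobe` introspection step can be sketched as follows. The flags used are standard ffprobe options for JSON output; how FFX itself wraps the process is not shown in this document, so treat the helper names as assumptions:

```python
import json
import subprocess

def probe(path: str) -> dict:
    """Ask ffprobe for container and stream metadata as JSON."""
    command = [
        "ffprobe", "-v", "quiet",
        "-print_format", "json",
        "-show_streams", "-show_format",
        path,
    ]
    result = subprocess.run(command, capture_output=True, text=True, check=True)
    return json.loads(result.stdout)

def codecs(probe_data: dict) -> list:
    """Per-stream codec names from ffprobe output, e.g. ['h264', 'opus']."""
    return [stream.get("codec_name", "?") for stream in probe_data.get("streams", [])]
```

Comparing such discovered stream metadata against the stored target schema is what drives the normalization described below.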

## High-Level Building Blocks

- Frontend, CLI, API, or worker:
  - A Click-based CLI in [`src/ffx/ffx.py`](/home/osgw/.local/src/codex/ffx/src/ffx/ffx.py).
  - A Textual terminal UI rooted in [`src/ffx/ffx_app.py`](/home/osgw/.local/src/codex/ffx/src/ffx/ffx_app.py) with screens for shows, patterns, file inspection, tracks, tags, and shifted seasons.
- Core business logic:
  - Descriptor objects model media files, shows, and tracks.
  - Controllers encapsulate CRUD operations and workflow orchestration for shows, patterns, tags, tracks, season shifts, configuration, and conversion.
  - `MediaDescriptorChangeSet` computes differences between a file and its stored target schema to drive metadata and disposition updates.
- Storage:
  - SQLite via SQLAlchemy ORM, with schema rooted in shows, patterns, tracks, media tags, track tags, shifted seasons, and generic properties.
  - A configuration JSON file supplies optional path, metadata-filtering, and filename-template settings.
- Integration adapters:
  - Process execution wrapper for `ffmpeg`, `ffprobe`, `nice`, and `cpulimit`.
  - HTTP adapter for TMDB via `requests`.

## Data And Interface Notes

- Key entities or records:
  - `Show`: canonical TV show metadata plus digit-formatting rules for generated filenames.
  - `Pattern`: regex rule tying filenames to one show and one target media schema.
  - `Track` and `TrackTag`: persisted target stream layout, codec, dispositions, audio layout, and stream-level tags.
  - `MediaTag`: persisted container-level metadata for a pattern.
  - `ShiftedSeason`: mapping from source numbering ranges to adjusted season and episode numbers.
  - `Property`: internal key-value storage currently used for database versioning.
- External interfaces:
  - CLI commands for conversion, inspection, extraction, and crop detection.
  - TUI workflows for rule authoring and rule maintenance.
  - Environment variable `TMDB_API_KEY` for TMDB access.
  - Config keys `databasePath`, `logDirectory`, and `outputFilenameTemplate`, plus optional metadata-filter rules.
- Validation rules:
  - Only supported media-file extensions are accepted for conversion.
  - The stored database version must match the runtime-required version.
  - A normalized descriptor may have at most one default and one forced stream per relevant track type.
  - Stored target tracks must refer to valid source tracks of matching types.
  - Shifted-season ranges are intended not to overlap for the same show and season.
  - TMDB lookups require a show ID and season and episode numbers.
- Error-handling approach:
  - User-facing operational failures are raised as `click.ClickException` or warnings.
  - Ambiguous default and forced stream states trigger prompts unless `--no-prompt` is set, in which case the command fails fast.
  - External-process failures and invalid media are surfaced through logs and command errors rather than retries, except for TMDB rate-limit retries.

## Deployment And Operations

- Runtime environment:
  - Local Python environment with the package installed and `ffmpeg`, `ffprobe`, `nice`, and `cpulimit` available on `PATH`.
- Deployment shape:
  - Single-process command execution on demand; no daemon, queue, or network service of its own.
- Secrets and configuration handling:
  - TMDB secret is read from `TMDB_API_KEY`.
  - User config is read from `~/.local/etc/ffx.json`.
  - Database path may also be overridden per command via `--database-file`.
- Logging and monitoring approach:
  - File and console logging configured per invocation.
  - Default log file path is `~/.local/var/log/ffx.log`.
  - No dedicated monitoring integration is present.

## Open Technical Questions

- Question: Should Linux-specific assumptions such as `/dev/null`, `nice`, `cpulimit`, and `~/.local` remain part of the supported-platform contract?
  - Risk: Portability and operational behavior are underspecified for non-Linux environments.
  - Next decision needed: Either document Linux-like systems as the official support boundary or refactor the process and path handling for broader portability.

- Question: Should placeholder TUI surfaces such as settings and help become part of the required product surface or stay explicitly out of scope?
  - Risk: The UI appears broader than the actually finished feature set.
  - Next decision needed: Either remove or complete placeholder screens and update requirements accordingly.
101
requirements/project.md
Normal file
@@ -0,0 +1,101 @@
## Purpose And Scope

- Project name: FFX
- User problem: TV episode files from mixed sources arrive with inconsistent codecs, stream metadata, subtitle layouts, season and episode numbering, and output filenames, which makes them awkward to archive and use in media-player applications.
- Target users: Individual operators curating a local TV media library on a workstation, especially users willing to define normalization rules per show.
- Success outcome: A user can inspect source files, define reusable show and pattern rules, and produce output files whose streams, metadata, and filenames follow a predictable schema for web playback and library import.
- Out of scope:
  - Multi-user or hosted service workflows.
  - General movie-library management.
  - Distributed transcoding or remote job orchestration.
  - Broad media-server administration beyond file preparation.

## Required Product

- Deliverable type: Installable Python command-line application with a Textual terminal UI for inspection and rule editing.
- Core capabilities:
  - Maintain an SQLite-backed database of shows, filename-matching patterns, per-pattern stream layouts and metadata tags, and optional season-shift rules.
  - Inspect existing media files through `ffprobe` and compare discovered stream metadata with stored normalization rules.
  - Convert media files through `ffmpeg` into a normalized output layout, including video recoding, audio transcoding to Opus, metadata cleanup and rewrite, and controlled disposition flags.
  - Build output filenames from detected or configured show, season, and episode information, optionally enriched from TMDB and a configurable Jinja-style filename template.
  - Support auxiliary file operations such as subtitle import, unmuxing, crop detection, and rename-only runs.
- Supported environments:
  - Local execution on a Python-capable workstation.
  - Best supported on Linux-like systems because the implementation assumes `~/.local`, `/dev/null`, `nice`, and `cpulimit`.
  - Requires `ffmpeg`, `ffprobe`, and `cpulimit` on `PATH`.
- Operational owner: The local user running the tool and maintaining its config, database, and external tooling.

## Suggested User Stories

- As a library maintainer, I want to define show-specific matching rules once so that future source files can be normalized automatically.
- As an operator, I want to inspect a file before conversion so that I can compare its actual streams and tags against the stored target schema.
- As a user preparing web-playback files, I want to recode video and audio with a small set of predictable options so that results are compatible and consistently named.
- As a user dealing with nonstandard releases, I want CLI overrides for language, title, stream order, default and forced tracks, and season and episode data so that one-off fixes do not require database edits first.
- As a user importing anime or other shifted numbering schemes, I want season and episode offsets per show so that generated filenames align with TMDB and media-library expectations.
## Functional Requirements

- The system shall provide a CLI entrypoint named `ffx` with commands for `convert`, `inspect`, `shows`, `unmux`, `cropdetect`, `version`, and `help`.
- The system shall persist reusable normalization rules in SQLite for:
  - shows and show formatting digits,
  - regex-based filename patterns,
  - per-pattern media tags,
  - per-pattern stream definitions,
  - shifted-season mappings,
  - internal database version properties.
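The rule storage above can be sketched with a dependency-free stdlib `sqlite3` schema. FFX itself maps these tables through SQLAlchemy models, and the table and column names here are illustrative assumptions, not the project's actual schema:

```python
# Hypothetical sketch of the SQLite rule storage; names are illustrative.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript(
    """
    CREATE TABLE shows (
        id INTEGER PRIMARY KEY,
        name TEXT NOT NULL,
        digits INTEGER NOT NULL DEFAULT 2   -- show formatting digits
    );
    CREATE TABLE patterns (
        id INTEGER PRIMARY KEY,
        show_id INTEGER NOT NULL REFERENCES shows(id),
        regex TEXT NOT NULL                 -- filename-matching pattern
    );
    CREATE TABLE shifted_seasons (
        show_id INTEGER NOT NULL REFERENCES shows(id),
        source_season INTEGER NOT NULL,
        target_season INTEGER NOT NULL
    );
    CREATE TABLE properties (
        key TEXT PRIMARY KEY,               -- e.g. internal database version
        value TEXT NOT NULL
    );
    """
)
conn.execute("INSERT INTO shows (name, digits) VALUES (?, ?)", ("Example Show", 2))
conn.execute(
    "INSERT INTO patterns (show_id, regex) VALUES (?, ?)",
    (1, r"Example\.S(\d+)E(\d+)"),
)
row = conn.execute("SELECT regex FROM patterns WHERE show_id = 1").fetchone()
print(row[0])
```

The point of the sketch is only the shape: per-show formatting digits, per-pattern regexes, season shifts, and a key-value properties table for versioning.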
- The system shall inspect source media using `ffprobe` and derive a structured description of container metadata and streams.
- The system shall optionally open a Textual UI to browse shows, inspect files, and create, edit, or delete shows, patterns, stream definitions, tags, and shifted-season rules.
- The system shall match filenames against stored regex patterns to decide whether an input file should inherit a target stream and metadata schema.
- The system shall convert supported input files (`mkv`, `mp4`, `avi`, `flv`, `webm`) with `ffmpeg`, supporting at least:
  - VP9, AV1, and H.264 video encoding,
  - Opus audio encoding with bitrate selection based on channel layout,
  - metadata and disposition rewriting,
  - optional crop detection and crop application,
  - optional deinterlacing and denoising,
  - optional subtitle import from external files,
  - rename-only copy mode.
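The channel-dependent Opus bitrate and `ffmpeg` argument assembly can be sketched as follows. The bitrate mapping and flag layout here are assumptions for illustration, not FFX's actual encoder settings:

```python
# Illustrative sketch: pick an Opus bitrate from the channel layout and build
# an ffmpeg command for a VP9 + Opus conversion. Values are assumptions.
def opus_bitrate(channels: int) -> str:
    # Stereo gets a modest rate; surround layouts get more headroom.
    return {1: "64k", 2: "128k", 6: "256k", 8: "320k"}.get(channels, "128k")

def build_ffmpeg_args(src: str, dst: str, channels: int) -> list[str]:
    return [
        "ffmpeg", "-i", src,
        "-c:v", "libvpx-vp9", "-crf", "30", "-b:v", "0",   # constant-quality VP9
        "-c:a", "libopus", "-b:a", opus_bitrate(channels),  # Opus audio
        dst,
    ]

args = build_ffmpeg_args("in.mkv", "out.mkv", 6)
print(" ".join(args))
```

A real run would pass the list to `subprocess.run`, matching how the CLI wrappers in this commit shell out to external tools.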
- The system shall support optional TMDB lookups to resolve show names, years, and episode titles when a show ID, season, and episode are available.
- The system shall generate output filenames from show metadata, season and episode indices, and episode names using the configured filename template.
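The filename schema can be sketched like this; stdlib `str.format` stands in for the actual Jinja template, and the template string and field names are illustrative assumptions, not FFX's configured defaults:

```python
# Sketch of template-driven filename generation; names are hypothetical.
FILENAME_TEMPLATE = "{show} - S{season:02d}E{episode:02d} - {title}.mkv"

def build_filename(show: str, season: int, episode: int, title: str) -> str:
    # Zero-padded season/episode indices keep library sorting predictable.
    return FILENAME_TEMPLATE.format(
        show=show, season=season, episode=episode, title=title
    )

print(build_filename("Example Show", 1, 5, "Pilot"))
# A roughly equivalent Jinja-style template would be:
# "{{ show }} - S{{ '%02d'|format(season) }}E{{ '%02d'|format(episode) }} - {{ title }}.mkv"
```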
- The system shall allow CLI overrides for stream languages, stream titles, default and forced tracks, stream order, TMDB show and episode data, output directory, label prefix, and processing resource limits.
- The system shall support extracting streams into separate files via `unmux` and reporting suggested crop parameters via `cropdetect`.
- The system shall handle invalid input and system failures gracefully by logging warnings or raising `click` errors for missing files, invalid media, missing TMDB credentials, incompatible database versions, and ambiguous track dispositions when prompting is disabled.
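The failure contract above can be sketched as a minimal `click` command: operational errors surface as `click.ClickException`, and disabling prompts turns an ambiguous state into a fast failure. The command and option names here are illustrative, not FFX's real signatures:

```python
# Hypothetical sketch of the click error-handling contract described above.
import os
import click

@click.command()
@click.argument("filename")
@click.option("--no-prompt", is_flag=True, default=False,
              help="Fail instead of prompting on ambiguous track dispositions.")
def convert(filename, no_prompt):
    if not os.path.isfile(filename):
        # Missing input is a user-facing operational failure, not a traceback.
        raise click.ClickException(f"Input file not found: {filename}")
    if no_prompt:
        # With prompting disabled, an ambiguous default track fails fast.
        raise click.ClickException("Ambiguous default track and --no-prompt set")
    click.echo(f"Converting {filename}")
```

`ClickException` renders as `Error: <message>` and exits nonzero, which is why the requirements distinguish it from warnings that merely log.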

## Quality Requirements

- The system should stay understandable as a small local tool: controllers, descriptors, models, and screens should remain separate enough for contributors to trace a workflow end to end.
- The system should produce predictable output for the same database rules, CLI overrides, and source files.
- The system should preserve a lightweight operational footprint: local SQLite state, local log file, no mandatory background services.
- The system should be testable through the existing combinatorial CLI-oriented test harness and through isolated logic in descriptors and controllers.
- The system should expose enough logging to diagnose failed probes, failed conversions, and rule mismatches without requiring a debugger.

## Constraints And Assumptions

- Technology constraints:
  - Python package built with setuptools.
  - Primary libraries: `click`, `textual`, `sqlalchemy`, `jinja2`, `requests`.
  - Conversion and inspection rely on external executables rather than pure-Python media libraries.
- Hosting or infrastructure constraints:
  - Intended for local execution, not server deployment.
  - Stores default state in `~/.local/etc/ffx.json`, `~/.local/var/ffx/ffx.db`, and `~/.local/var/log/ffx.log`.
- Timeline constraints:
  - The current implemented scope reflects a compact alpha release stream up to version `0.2.3`.
- Team capacity assumptions:
  - Maintained as a small codebase where simple patterns and direct controller logic are preferred over framework-heavy abstractions.
- Third-party dependencies:
  - `ffmpeg`, `ffprobe`, and `cpulimit`.
  - TMDB API access through `TMDB_API_KEY` for metadata enrichment.

## Acceptance Scope

- First release boundary:
  - Local installation through `pip`.
  - Working SQLite-backed rule storage.
  - Functional CLI conversion and inspection workflows.
  - Textual CRUD flows for shows, patterns, tags, tracks, and shifted seasons.
  - TMDB-assisted filename generation, subtitle import, season shifting, database versioning, and configurable output filename templating.
- Excluded follow-up ideas:
  - Completing placeholder screens such as settings and help.
  - Hardening platform portability beyond Linux-like systems.
  - Broader media types, richer release packaging, and production-grade background processing.
- Demonstration scenario:
  - Inspect a TV episode file, define or update the matching show and pattern in the TUI, then run `ffx convert` so the result uses the stored stream schema, optional TMDB episode naming, and a normalized output filename.
@@ -1,6 +1,6 @@
#! /usr/bin/python3

import os, click, time, logging, shutil
import os, click, time, logging, shutil, subprocess

from ffx.configuration_controller import ConfigurationController

@@ -49,6 +49,11 @@ def ffx(ctx, database_file, verbose, dry_run):
    ctx.obj = {}

    if ctx.invoked_subcommand in ('setup_dependencies', 'upgrade'):
        ctx.obj['dry_run'] = dry_run
        ctx.obj['verbosity'] = verbose
        return

    ctx.obj['config'] = ConfigurationController()
    ctx.obj['database'] = databaseContext(databasePath=database_file

@@ -97,6 +102,82 @@ def help():
    click.echo(f"Usage: ffx [input file] [output file] [vp9|av1] [q=[nn[,nn,...]]] [p=nn] [a=nnn[k]] [ac3=nnn[k]] [dts=nnn[k]] [crop]")

def getRepoRootPath():
    currentFilePath = os.path.abspath(__file__)
    return os.path.dirname(os.path.dirname(os.path.dirname(currentFilePath)))


def getPrepareScriptPath():
    return os.path.join(getRepoRootPath(), 'tools', 'prepare.sh')


def getBundleVenvDirectory():
    return os.path.join(os.path.expanduser('~'), '.local', 'share', 'ffx.venv')


def getBundlePipPath():
    return os.path.join(getBundleVenvDirectory(), 'bin', 'pip')


def getBundleRepoPath():
    return getRepoRootPath()


@ffx.command(name='setup_dependencies')
@click.pass_context
@click.option('--check', is_flag=True, default=False, help='Only verify dependency readiness')
@click.argument('prepare_args', nargs=-1, type=click.UNPROCESSED)
def setup_dependencies(ctx, check, prepare_args):
    prepareScriptPath = getPrepareScriptPath()

    if not os.path.isfile(prepareScriptPath):
        raise click.ClickException(f"Preparation script not found at {prepareScriptPath}")

    commandSequence = ['bash', prepareScriptPath]

    if check:
        commandSequence.append('--check')

    commandSequence += list(prepare_args)

    if ctx.obj.get('dry_run', False):
        click.echo(' '.join(commandSequence))
        return

    completed = subprocess.run(commandSequence)
    ctx.exit(completed.returncode)


@ffx.command(name='upgrade')
@click.pass_context
@click.argument('branch', required=False, default='main')
def upgrade(ctx, branch):
    bundleRepoPath = getBundleRepoPath()
    bundlePipPath = getBundlePipPath()

    if not os.path.isdir(bundleRepoPath):
        raise click.ClickException(f"Bundle repository not found at {bundleRepoPath}")

    if not os.path.isfile(bundlePipPath):
        raise click.ClickException(f"Bundle pip not found at {bundlePipPath}")

    commandSequences = [
        ['git', 'checkout', branch],
        ['git', 'pull'],
        [bundlePipPath, 'install', '--editable', '.'],
    ]

    if ctx.obj.get('dry_run', False):
        for commandSequence in commandSequences:
            click.echo(f"(cd {bundleRepoPath} && {' '.join(commandSequence)})")
        return

    for commandSequence in commandSequences:
        completed = subprocess.run(commandSequence, cwd=bundleRepoPath)
        if completed.returncode != 0:
            ctx.exit(completed.returncode)


@ffx.command()
@click.pass_context
@click.argument('filename', nargs=1)
@@ -11,6 +11,7 @@
        update_cache: true
        name:
          - python3-virtualenv
          - cpulimit
          - ffmpeg
          - git
          - screen
@@ -21,6 +22,7 @@
      ansible.builtin.pacman:
        update_cache: true
        name:
          - cpulimit
          - ffmpeg
          - git
          - screen
444
tools/prepare.sh
Executable file
@@ -0,0 +1,444 @@
#!/usr/bin/env bash

set -u

SCRIPT_DIR="$(cd -- "$(dirname -- "${BASH_SOURCE[0]}")" && pwd)"

CONFIG_DIR="${FFX_CONFIG_DIR:-${HOME}/.local/etc}"
CONFIG_FILE="${FFX_CONFIG_FILE:-${CONFIG_DIR}/ffx.json}"
VAR_DIR="${FFX_VAR_DIR:-${HOME}/.local/var/ffx}"
LOG_DIR="${FFX_LOG_DIR:-${HOME}/.local/var/log}"
DATABASE_FILE="${FFX_DATABASE_FILE:-${VAR_DIR}/ffx.db}"

CHECK_ONLY=0

MUTATIONS=0
INSTALL_FAILURES=0
READINESS_FAILURES=0

MISSING_REQUIRED_SYSTEM=()
MISSING_OPTIONAL_SYSTEM=()

COLOR_RESET=""
COLOR_GREEN=""
COLOR_YELLOW=""
COLOR_RED=""

if [ -t 1 ]; then
    COLOR_RESET="$(printf '\033[0m')"
    COLOR_GREEN="$(printf '\033[32m')"
    COLOR_YELLOW="$(printf '\033[33m')"
    COLOR_RED="$(printf '\033[31m')"
fi

usage() {
    cat <<EOF
Usage: $(basename "$0") [--check] [--help]

Prepare the local FFX development environment for this repository.

Options:
  --check    Report readiness only. Do not create, install, or modify.
  --help     Show this help text.

Environment overrides:
  FFX_CONFIG_DIR     Override the parent directory for the seeded ffx.json file.
  FFX_CONFIG_FILE    Override the seeded config file path directly.
  FFX_VAR_DIR        Override the default data directory.
  FFX_LOG_DIR        Override the default log directory.
  FFX_DATABASE_FILE  Override the database path written into a newly seeded config.
EOF
}

status_ok() {
    printf '%sok%s' "${COLOR_GREEN}" "${COLOR_RESET}"
}

status_warn() {
    printf '%swarn%s' "${COLOR_YELLOW}" "${COLOR_RESET}"
}

status_fail() {
    printf '%sfailed%s' "${COLOR_RED}" "${COLOR_RESET}"
}

report_component() {
    local level="$1"
    local label="$2"
    local detail="$3"
    local rendered_status=""

    case "${level}" in
        ok)
            rendered_status="$(status_ok)"
            ;;
        warn)
            rendered_status="$(status_warn)"
            ;;
        *)
            rendered_status="$(status_fail)"
            ;;
    esac

    printf '[%s] %s%s\n' "${rendered_status}" "${label}" "${detail:+: $detail}"
}

command_exists() {
    command -v "$1" >/dev/null 2>&1
}

check_command_component() {
    command_exists "$2"
}

check_tmdb_key() {
    [ -n "${TMDB_API_KEY:-}" ]
}

check_seeded_dir() {
    [ -d "$1" ]
}

check_seeded_file() {
    [ -f "$1" ]
}

component_detail() {
    case "$1" in
        git|python3|ffmpeg|ffprobe|cpulimit)
            command -v "$1" || printf "command '%s' not found" "$1"
            ;;
        tmdb-key)
            if check_tmdb_key; then
                printf 'TMDB_API_KEY is set'
            else
                printf 'TMDB_API_KEY is unset; TMDB-backed flows will be skipped or fail'
            fi
            ;;
        config-dir)
            if check_seeded_dir "${CONFIG_DIR}"; then
                printf '%s' "${CONFIG_DIR}"
            else
                printf 'missing; prep can create it'
            fi
            ;;
        var-dir)
            if check_seeded_dir "${VAR_DIR}"; then
                printf '%s' "${VAR_DIR}"
            else
                printf 'missing; prep can create it'
            fi
            ;;
        log-dir)
            if check_seeded_dir "${LOG_DIR}"; then
                printf '%s' "${LOG_DIR}"
            else
                printf 'missing; prep can create it'
            fi
            ;;
        ffx-config)
            if check_seeded_file "${CONFIG_FILE}"; then
                printf '%s' "${CONFIG_FILE}"
            else
                printf 'missing; prep can seed a default non-destructively'
            fi
            ;;
    esac
}

report_toolchain_component() {
    local label="$1"
    local command_name="$2"
    local required="$3"

    if check_command_component "${label}" "${command_name}" "${required}"; then
        report_component ok "${label}" "$(component_detail "${command_name}")"
    else
        if [ "${required}" = "required" ]; then
            report_component failed "${label}" "$(component_detail "${command_name}")"
            MISSING_REQUIRED_SYSTEM+=("${command_name}")
            READINESS_FAILURES=$((READINESS_FAILURES + 1))
        else
            report_component warn "${label}" "$(component_detail "${command_name}")"
            MISSING_OPTIONAL_SYSTEM+=("${command_name}")
        fi
    fi
}

report_tmdb_component() {
    if check_tmdb_key; then
        report_component ok "TMDB API key" "$(component_detail tmdb-key)"
    else
        report_component warn "TMDB API key" "$(component_detail tmdb-key)"
    fi
}

report_seeded_component() {
    local label="$1"
    local key="$2"
    local required="$3"
    local ok=1

    case "${key}" in
        config-dir)
            check_seeded_dir "${CONFIG_DIR}" || ok=0
            ;;
        var-dir)
            check_seeded_dir "${VAR_DIR}" || ok=0
            ;;
        log-dir)
            check_seeded_dir "${LOG_DIR}" || ok=0
            ;;
        ffx-config)
            check_seeded_file "${CONFIG_FILE}" || ok=0
            ;;
    esac

    if [ "${ok}" -eq 1 ]; then
        report_component ok "${label}" "$(component_detail "${key}")"
    else
        if [ "${required}" = "required" ]; then
            report_component failed "${label}" "$(component_detail "${key}")"
            READINESS_FAILURES=$((READINESS_FAILURES + 1))
        else
            report_component warn "${label}" "$(component_detail "${key}")"
        fi
    fi
}

print_dependency_status() {
    READINESS_FAILURES=0
    MISSING_REQUIRED_SYSTEM=()
    MISSING_OPTIONAL_SYSTEM=()

    echo "Dependency status:"
    report_toolchain_component "git" "git" "required"
    report_toolchain_component "python3" "python3" "required"
    report_toolchain_component "ffmpeg" "ffmpeg" "required"
    report_toolchain_component "ffprobe" "ffprobe" "required"
    report_toolchain_component "cpulimit" "cpulimit" "required"
    report_tmdb_component
}

print_seeded_file_status() {
    echo "Seeded local files:"
    report_seeded_component "Config dir" "config-dir" "optional"
    report_seeded_component "Var dir" "var-dir" "optional"
    report_seeded_component "Log dir" "log-dir" "optional"
    report_seeded_component "ffx config" "ffx-config" "optional"
}

detect_package_manager() {
    if command_exists apt-get; then
        printf 'apt-get\n'
        return 0
    fi
    if command_exists pacman; then
        printf 'pacman\n'
        return 0
    fi
    return 1
}

run_root_command() {
    if [ "${EUID}" -eq 0 ]; then
        "$@"
    elif command_exists sudo; then
        sudo "$@"
    else
        return 1
    fi
}

install_system_requirements() {
    local package_manager

    if ! package_manager="$(detect_package_manager)"; then
        printf 'No supported package manager found for automatic preparation.\n' >&2
        INSTALL_FAILURES=$((INSTALL_FAILURES + 1))
        return 1
    fi

    case "${package_manager}" in
        apt-get)
            printf 'Installing missing system dependencies via apt-get...\n'
            if ! run_root_command apt-get update; then
                printf 'apt-get update failed.\n' >&2
                INSTALL_FAILURES=$((INSTALL_FAILURES + 1))
                return 1
            fi
            if ! run_root_command apt-get install -y git python3 ffmpeg cpulimit; then
                printf 'apt-get install failed.\n' >&2
                INSTALL_FAILURES=$((INSTALL_FAILURES + 1))
                return 1
            fi
            ;;
        pacman)
            printf 'Installing missing system dependencies via pacman...\n'
            if ! run_root_command pacman -Sy --noconfirm git python ffmpeg cpulimit; then
                printf 'pacman install failed.\n' >&2
                INSTALL_FAILURES=$((INSTALL_FAILURES + 1))
                return 1
            fi
            ;;
    esac

    MUTATIONS=$((MUTATIONS + 1))
    return 0
}

seed_default_config() {
    if [ "${CHECK_ONLY}" -eq 1 ]; then
        return 0
    fi

    local created_any=0

    if [ ! -d "${CONFIG_DIR}" ]; then
        printf 'Creating config dir at %s...\n' "${CONFIG_DIR}"
        if ! mkdir -p "${CONFIG_DIR}"; then
            printf 'Failed to create config dir at %s.\n' "${CONFIG_DIR}" >&2
            INSTALL_FAILURES=$((INSTALL_FAILURES + 1))
            return 1
        fi
        created_any=1
    fi

    if [ ! -d "${VAR_DIR}" ]; then
        printf 'Creating var dir at %s...\n' "${VAR_DIR}"
        if ! mkdir -p "${VAR_DIR}"; then
            printf 'Failed to create var dir at %s.\n' "${VAR_DIR}" >&2
            INSTALL_FAILURES=$((INSTALL_FAILURES + 1))
            return 1
        fi
        created_any=1
    fi

    if [ ! -d "${LOG_DIR}" ]; then
        printf 'Creating log dir at %s...\n' "${LOG_DIR}"
        if ! mkdir -p "${LOG_DIR}"; then
            printf 'Failed to create log dir at %s.\n' "${LOG_DIR}" >&2
            INSTALL_FAILURES=$((INSTALL_FAILURES + 1))
            return 1
        fi
        created_any=1
    fi

    if [ ! -f "${CONFIG_FILE}" ]; then
        printf 'Seeding ffx config at %s...\n' "${CONFIG_FILE}"
        if ! cat >"${CONFIG_FILE}" <<EOF
{
    "databasePath": "${DATABASE_FILE}",
    "logDirectory": "${LOG_DIR}",
    "metadata": {
        "signature": {
            "RECODED_WITH": "FFX"
        },
        "remove": [
            "VERSION-eng",
            "creation_time",
            "NAME"
        ],
        "streams": {
            "remove": [
                "BPS",
                "NUMBER_OF_FRAMES",
                "NUMBER_OF_BYTES",
                "_STATISTICS_WRITING_APP",
                "_STATISTICS_WRITING_DATE_UTC",
                "_STATISTICS_TAGS",
                "BPS-eng",
                "DURATION-eng",
                "NUMBER_OF_FRAMES-eng",
                "NUMBER_OF_BYTES-eng",
                "_STATISTICS_WRITING_APP-eng",
                "_STATISTICS_WRITING_DATE_UTC-eng",
                "_STATISTICS_TAGS-eng"
            ]
        }
    }
}
EOF
        then
            printf 'Failed to write ffx config at %s.\n' "${CONFIG_FILE}" >&2
            INSTALL_FAILURES=$((INSTALL_FAILURES + 1))
            return 1
        fi
        created_any=1
    fi

    if [ "${created_any}" -eq 1 ]; then
        MUTATIONS=$((MUTATIONS + 1))
    fi

    return 0
}

parse_args() {
    while [ "$#" -gt 0 ]; do
        case "$1" in
            --check)
                CHECK_ONLY=1
                ;;
            --help|-h)
                usage
                exit 0
                ;;
            *)
                printf 'Unknown option: %s\n\n' "$1" >&2
                usage >&2
                exit 2
                ;;
        esac
        shift
    done
}

main() {
    parse_args "$@"

    print_dependency_status

    if [ "${CHECK_ONLY}" -eq 0 ] && [ "${#MISSING_REQUIRED_SYSTEM[@]}" -gt 0 ]; then
        install_system_requirements

        echo
        print_dependency_status
    fi

    echo
    print_seeded_file_status

    if [ "${CHECK_ONLY}" -eq 0 ]; then
        seed_default_config
        echo
        print_seeded_file_status
    fi

    echo
    if [ "${INSTALL_FAILURES}" -gt 0 ]; then
        echo "One or more install steps failed; see the status checks above." >&2
        return 1
    fi

    if [ "${READINESS_FAILURES}" -gt 0 ]; then
        if [ "${CHECK_ONLY}" -eq 1 ]; then
            echo "Required system prerequisites are incomplete." >&2
        else
            echo "Required components are still missing after preparation." >&2
        fi
        return 1
    fi

    if [ "${CHECK_ONLY}" -eq 1 ]; then
        echo "The FFX preparation environment is ready."
    elif [ "${MUTATIONS}" -gt 0 ]; then
        echo "The FFX preparation environment is ready."
    else
        echo "The FFX preparation environment is already prepared."
    fi

    return 0
}

main "$@"
350
tools/setup.sh
Executable file
@@ -0,0 +1,350 @@
#!/usr/bin/env bash

set -u

ROOT_DIR="$(cd -- "$(dirname -- "${BASH_SOURCE[0]}")/.." && pwd)"
VENV_DIR="${HOME}/.local/share/ffx.venv"
VENV_BIN_DIR="${VENV_DIR}/bin"
VENV_PYTHON="${VENV_BIN_DIR}/python"
VENV_PIP="${VENV_BIN_DIR}/pip"
VENV_FFX="${VENV_BIN_DIR}/ffx"
BASHRC_FILE="${HOME}/.bashrc"
ALIAS_BLOCK_BEGIN="# >>> ffx alias >>>"
ALIAS_BLOCK_END="# <<< ffx alias <<<"
ALIAS_LINE="alias ffx=\"${VENV_FFX}\""

CHECK_ONLY=0
READINESS_FAILURES=0
INSTALL_FAILURES=0

COLOR_RESET=""
COLOR_GREEN=""
COLOR_YELLOW=""
COLOR_RED=""

if [ -t 1 ]; then
    COLOR_RESET="$(printf '\033[0m')"
    COLOR_GREEN="$(printf '\033[32m')"
    COLOR_YELLOW="$(printf '\033[33m')"
    COLOR_RED="$(printf '\033[31m')"
fi

usage() {
    cat <<EOF
Usage: $(basename "$0") [--check] [--help]

Prepare the persistent FFX bundle virtualenv at:
  ${VENV_DIR}

Actions:
  - create or reuse ${VENV_DIR}
  - install this repository into the venv with pip --editable
  - ensure ${BASHRC_FILE} exposes alias ffx -> ${VENV_FFX}

Options:
  --check    Report readiness only. Do not create or modify anything.
  --help     Show this help text.
EOF
}

status_ok() {
    printf '%sok%s' "${COLOR_GREEN}" "${COLOR_RESET}"
}

status_warn() {
    printf '%swarn%s' "${COLOR_YELLOW}" "${COLOR_RESET}"
}

status_fail() {
    printf '%sfailed%s' "${COLOR_RED}" "${COLOR_RESET}"
}

report_component() {
    local level="$1"
    local label="$2"
    local detail="$3"
    local rendered_status=""

    case "${level}" in
        ok)
            rendered_status="$(status_ok)"
            ;;
        warn)
            rendered_status="$(status_warn)"
            ;;
        *)
            rendered_status="$(status_fail)"
            ;;
    esac

    printf '[%s] %s%s\n' "${rendered_status}" "${label}" "${detail:+: $detail}"
}

command_exists() {
    command -v "$1" >/dev/null 2>&1
}

check_python3() {
    command_exists python3
}

check_venv_dir() {
    [ -x "${VENV_PYTHON}" ]
}

check_venv_pip() {
    check_venv_dir && "${VENV_PIP}" --version >/dev/null 2>&1
}

check_venv_ffx() {
    [ -x "${VENV_FFX}" ]
}

check_bashrc_file() {
    [ -f "${BASHRC_FILE}" ]
}

check_bashrc_alias() {
    check_bashrc_file && grep -Fqx "${ALIAS_LINE}" "${BASHRC_FILE}"
}

detail_python3() {
    command -v python3 || printf "command 'python3' not found"
}

detail_venv_dir() {
    if check_venv_dir; then
        printf '%s' "${VENV_DIR}"
    else
        printf 'missing %s' "${VENV_DIR}"
    fi
}

detail_venv_pip() {
    if check_venv_pip; then
        "${VENV_PIP}" --version
    else
        printf 'missing pip in %s' "${VENV_DIR}"
    fi
}

detail_venv_ffx() {
    if check_venv_ffx; then
        printf '%s' "${VENV_FFX}"
    else
        printf 'missing %s' "${VENV_FFX}"
    fi
}

detail_bashrc_file() {
    if check_bashrc_file; then
        printf '%s' "${BASHRC_FILE}"
    else
        printf 'missing %s; prep can create it' "${BASHRC_FILE}"
    fi
}

detail_bashrc_alias() {
    if check_bashrc_alias; then
        printf '%s' "${ALIAS_LINE}"
    else
        printf 'missing alias line for %s' "${VENV_FFX}"
    fi
}

print_status_report() {
    READINESS_FAILURES=0

    echo "Dependency status:"
    if check_python3; then
        report_component ok "python3" "$(detail_python3)"
    else
        report_component failed "python3" "$(detail_python3)"
        READINESS_FAILURES=$((READINESS_FAILURES + 1))
    fi

    echo
    echo "Bundle venv status:"
    if check_venv_dir; then
        report_component ok "bundle virtualenv" "$(detail_venv_dir)"
    else
        report_component failed "bundle virtualenv" "$(detail_venv_dir)"
        READINESS_FAILURES=$((READINESS_FAILURES + 1))
    fi

    if check_venv_pip; then
        report_component ok "bundle pip" "$(detail_venv_pip)"
    else
        report_component failed "bundle pip" "$(detail_venv_pip)"
        READINESS_FAILURES=$((READINESS_FAILURES + 1))
    fi

    if check_venv_ffx; then
        report_component ok "bundle ffx" "$(detail_venv_ffx)"
    else
        report_component failed "bundle ffx" "$(detail_venv_ffx)"
        READINESS_FAILURES=$((READINESS_FAILURES + 1))
    fi

    echo
    echo "Shell exposure status:"
    if check_bashrc_file; then
        report_component ok ".bashrc" "$(detail_bashrc_file)"
    else
        report_component warn ".bashrc" "$(detail_bashrc_file)"
|
||||||
|
fi
|
||||||
|
|
||||||
|
if check_bashrc_alias; then
|
||||||
|
report_component ok "ffx alias" "$(detail_bashrc_alias)"
|
||||||
|
else
|
||||||
|
report_component failed "ffx alias" "$(detail_bashrc_alias)"
|
||||||
|
READINESS_FAILURES=$((READINESS_FAILURES + 1))
|
||||||
|
fi
|
||||||
|
}
|
||||||
|
|
||||||
|
ensure_bundle_venv() {
|
||||||
|
mkdir -p "${HOME}/.local/share"
|
||||||
|
|
||||||
|
if ! check_venv_dir; then
|
||||||
|
printf 'Creating bundle virtualenv at %s...\n' "${VENV_DIR}"
|
||||||
|
if ! python3 -m venv "${VENV_DIR}"; then
|
||||||
|
printf 'Failed to create virtualenv at %s.\n' "${VENV_DIR}" >&2
|
||||||
|
INSTALL_FAILURES=$((INSTALL_FAILURES + 1))
|
||||||
|
return 1
|
||||||
|
fi
|
||||||
|
fi
|
||||||
|
|
||||||
|
if ! check_venv_pip; then
|
||||||
|
printf 'Missing pip in %s.\n' "${VENV_DIR}" >&2
|
||||||
|
INSTALL_FAILURES=$((INSTALL_FAILURES + 1))
|
||||||
|
return 1
|
||||||
|
fi
|
||||||
|
|
||||||
|
printf 'Installing FFX package into %s...\n' "${VENV_DIR}"
|
||||||
|
if ! "${VENV_PIP}" install --editable "${ROOT_DIR}"; then
|
||||||
|
printf 'Failed to install FFX package into %s.\n' "${VENV_DIR}" >&2
|
||||||
|
INSTALL_FAILURES=$((INSTALL_FAILURES + 1))
|
||||||
|
return 1
|
||||||
|
fi
|
||||||
|
|
||||||
|
return 0
|
||||||
|
}
|
||||||
|
|
||||||
|
write_alias_block() {
|
||||||
|
local bashrc_dir
|
||||||
|
bashrc_dir="$(dirname "${BASHRC_FILE}")"
|
||||||
|
mkdir -p "${bashrc_dir}"
|
||||||
|
touch "${BASHRC_FILE}"
|
||||||
|
|
||||||
|
if grep -Fq "${ALIAS_BLOCK_BEGIN}" "${BASHRC_FILE}" || grep -Fq "${ALIAS_BLOCK_END}" "${BASHRC_FILE}"; then
|
||||||
|
if ! python3 - "${BASHRC_FILE}" "${ALIAS_BLOCK_BEGIN}" "${ALIAS_BLOCK_END}" "${ALIAS_LINE}" <<'PY'
|
||||||
|
import pathlib
|
||||||
|
import sys
|
||||||
|
|
||||||
|
path = pathlib.Path(sys.argv[1])
|
||||||
|
begin = sys.argv[2]
|
||||||
|
end = sys.argv[3]
|
||||||
|
alias_line = sys.argv[4]
|
||||||
|
|
||||||
|
content = path.read_text()
|
||||||
|
block = f"{begin}\n{alias_line}\n{end}\n"
|
||||||
|
|
||||||
|
start = content.find(begin)
|
||||||
|
stop = content.find(end)
|
||||||
|
|
||||||
|
if start != -1 and stop != -1 and stop >= start:
|
||||||
|
stop += len(end)
|
||||||
|
if stop < len(content) and content[stop] == "\n":
|
||||||
|
stop += 1
|
||||||
|
content = content[:start] + block + content[stop:]
|
||||||
|
else:
|
||||||
|
if content and not content.endswith("\n"):
|
||||||
|
content += "\n"
|
||||||
|
content += block
|
||||||
|
|
||||||
|
path.write_text(content)
|
||||||
|
PY
|
||||||
|
then
|
||||||
|
printf 'Failed to update managed alias block in %s.\n' "${BASHRC_FILE}" >&2
|
||||||
|
INSTALL_FAILURES=$((INSTALL_FAILURES + 1))
|
||||||
|
return 1
|
||||||
|
fi
|
||||||
|
elif check_bashrc_alias; then
|
||||||
|
:
|
||||||
|
else
|
||||||
|
{
|
||||||
|
if [ -s "${BASHRC_FILE}" ] && [ "$(tail -c 1 "${BASHRC_FILE}" 2>/dev/null || true)" != "" ]; then
|
||||||
|
printf '\n'
|
||||||
|
fi
|
||||||
|
printf '%s\n' "${ALIAS_BLOCK_BEGIN}"
|
||||||
|
printf '%s\n' "${ALIAS_LINE}"
|
||||||
|
printf '%s\n' "${ALIAS_BLOCK_END}"
|
||||||
|
} >>"${BASHRC_FILE}" || {
|
||||||
|
printf 'Failed to append alias block to %s.\n' "${BASHRC_FILE}" >&2
|
||||||
|
INSTALL_FAILURES=$((INSTALL_FAILURES + 1))
|
||||||
|
return 1
|
||||||
|
}
|
||||||
|
fi
|
||||||
|
|
||||||
|
return 0
|
||||||
|
}
|
||||||
|
|
||||||
|
ensure_bashrc_alias() {
|
||||||
|
printf 'Ensuring ffx alias in %s...\n' "${BASHRC_FILE}"
|
||||||
|
write_alias_block
|
||||||
|
}
|
||||||
|
|
||||||
|
parse_args() {
|
||||||
|
while [ "$#" -gt 0 ]; do
|
||||||
|
case "$1" in
|
||||||
|
--check)
|
||||||
|
CHECK_ONLY=1
|
||||||
|
;;
|
||||||
|
--help|-h)
|
||||||
|
usage
|
||||||
|
exit 0
|
||||||
|
;;
|
||||||
|
*)
|
||||||
|
printf 'Unknown option: %s\n\n' "$1" >&2
|
||||||
|
usage >&2
|
||||||
|
exit 2
|
||||||
|
;;
|
||||||
|
esac
|
||||||
|
shift
|
||||||
|
done
|
||||||
|
}
|
||||||
|
|
||||||
|
main() {
|
||||||
|
parse_args "$@"
|
||||||
|
|
||||||
|
print_status_report
|
||||||
|
|
||||||
|
if [ "${CHECK_ONLY}" -eq 0 ]; then
|
||||||
|
if ! check_python3; then
|
||||||
|
printf '\npython3 is required before the bundle venv can be prepared.\n' >&2
|
||||||
|
exit 1
|
||||||
|
fi
|
||||||
|
|
||||||
|
echo
|
||||||
|
ensure_bundle_venv
|
||||||
|
ensure_bashrc_alias
|
||||||
|
|
||||||
|
echo
|
||||||
|
print_status_report
|
||||||
|
fi
|
||||||
|
|
||||||
|
echo
|
||||||
|
if [ "${INSTALL_FAILURES}" -gt 0 ]; then
|
||||||
|
echo "One or more bundle preparation steps failed; see the status checks above." >&2
|
||||||
|
exit 1
|
||||||
|
fi
|
||||||
|
|
||||||
|
if [ "${READINESS_FAILURES}" -gt 0 ]; then
|
||||||
|
echo "The FFX bundle virtualenv and/or alias setup is incomplete." >&2
|
||||||
|
exit 1
|
||||||
|
fi
|
||||||
|
|
||||||
|
echo "The FFX bundle virtualenv is ready."
|
||||||
|
}
|
||||||
|
|
||||||
|
main "$@"
|
||||||