# Pipeline Configuration
A typed, versionable configuration system powers the tracking pipeline. It is implemented with Pydantic models and exposed to all pipeline stages through the per-frame `context["config"]` entry. This eliminates fragile nested-dict lookups and centralizes validation, defaults, and documentation of pipeline parameters.
- Code: `src/s6/schema/pipeline_config.py`
- Access: `cfg: PipelineConfig = context["config"]`
- Formats: YAML (`.yaml`/`.yml`) and JSON (`.json`)
- Defaults: loaded from `configs/pipeline.config.yaml` / `.yml` / `.json` (YAML preferred)
## How It Loads
- CLI flag: `s6 track --config <path.{json|yaml|yml}>`
- If omitted, the context generator loads the first existing default in this order:
  1. `configs/pipeline.config.yaml`
  2. `configs/pipeline.config.yml`
  3. `configs/pipeline.config.json`
- Parsing uses Pydantic (`parse_obj`/`parse_file`), so types and ranges are validated at load time.
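The resolution order is roughly equivalent to the following sketch (illustrative only: `load_pipeline_config` and `_DEFAULT_PATHS` are hypothetical names; the real logic lives in the context generator):

```python
from pathlib import Path
from typing import Optional

from s6.schema import PipelineConfig

_DEFAULT_PATHS = [
    "configs/pipeline.config.yaml",
    "configs/pipeline.config.yml",
    "configs/pipeline.config.json",
]

def load_pipeline_config(path: Optional[str] = None) -> PipelineConfig:
    candidates = [path] if path else _DEFAULT_PATHS
    for p in candidates:
        if not Path(p).exists():
            continue
        if p.endswith((".yaml", ".yml")):
            import yaml  # PyYAML; see Troubleshooting below
            return PipelineConfig.parse_obj(yaml.safe_load(Path(p).read_text()))
        return PipelineConfig.parse_file(p)  # JSON
    return PipelineConfig()  # no file found: fall back to schema defaults
```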
Example (YAML):

```yaml
solver:
  instrument_length: 0.135  # meters
tracking:
  enable_prediction: true
  search_radius_px: 160
  trajectory_maxlen: 30
```
Example (JSON):

```json
{
  "solver": { "instrument_length": 0.135 },
  "tracking": {
    "enable_prediction": false,
    "search_radius_px": 150,
    "trajectory_maxlen": 20
  }
}
```
## Schema Overview
`PipelineConfig` is a container composed of sub-configs. The primary fields and defaults are listed below; a sketch of the container shape follows the list.
- `tracking`
  - `enable_prediction: bool = True`
  - `search_radius_px: int = 150` (>= 1)
  - `trajectory_maxlen: int = 20` (>= 1)
- `solver`
  - `instrument_length: float = 0.135` (> 0.0)
- `detection`
  - `components_area_thresholds: (int, int) = (600, 5000)`
  - `fallback_margin_frac: float = 0.05` (0.0–0.5)
- `refine`
  - `zoom_factor: float = 4.5` (> 0.0)
  - `patch_size: (int, int) = (128, 128)`
- `boundary`
  - `smoothing_window: int = 3` (>= 1)
  - `max_radius_change_frac: float = 0.05` (>= 0.0)
  - `max_center_change_frac: float = 0.05` (>= 0.0)
  - `default_center: (float, float) = (640.0, 480.0)`
  - `default_radius: float = 600.0` (> 0.0)
- `tip`
  - `boundary_margin_px: int = 0` (>= 0)
  - `tracking_box_radius_px: int = 80` (>= 1)
  - `tracking_box_next_radius_px: int = 100` (>= 1)
  - `refine_area_thresholds: (int, int) = (20, 1200)`
  - `suppression_radius_px: int = 30` (>= 0)
- `export`
  - `preview_size: (int, int) = (320, 240)`
  - `preview_format: str = ".jpg"`
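Assembled, the container follows a simple composition pattern. A minimal sketch (field names and defaults come from the list above; only two sub-configs are spelled out here, and the actual class layout in `src/s6/schema/pipeline_config.py` may differ):

```python
from pydantic import BaseModel, Field

class TrackingConfig(BaseModel):
    enable_prediction: bool = True
    search_radius_px: int = Field(default=150, ge=1)
    trajectory_maxlen: int = Field(default=20, ge=1)

class SolverConfig(BaseModel):
    instrument_length: float = Field(default=0.135, gt=0.0)

# ... detection, refine, boundary, tip, and export follow the same pattern ...

class PipelineConfig(BaseModel):
    tracking: TrackingConfig = TrackingConfig()
    solver: SolverConfig = SolverConfig()
    # detection, refine, boundary, tip, and export attach the same way
```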
Validators coerce tuple pairs and sizes (e.g., lists parsed from YAML/JSON become tuples) and reject bad values early at load time.
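For example, a coercion validator for a tuple field might look like this (a minimal sketch assuming Pydantic v1, consistent with the `parse_obj`/`parse_file` API above; the actual validators live in `src/s6/schema/pipeline_config.py`):

```python
from typing import Tuple
from pydantic import BaseModel, Field, validator

class ExportConfig(BaseModel):
    preview_size: Tuple[int, int] = Field(default=(320, 240))
    preview_format: str = ".jpg"

    @validator("preview_size", pre=True)
    def _coerce_pair(cls, v):
        # YAML/JSON deliver pairs as lists; coerce to a 2-tuple, reject bad shapes.
        pair = tuple(v)
        if len(pair) != 2:
            raise ValueError("preview_size must be a (width, height) pair")
        return pair
```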
## Using Config in Stages
Access the typed config from the frame context and use attributes directly:
```python
from s6.schema import PipelineConfig

# inside a pipeline stage
def tips_search(context, cam_l, cam_r, cam_b):
    cfg: PipelineConfig = context["config"]
    margin = int(cfg.tip.boundary_margin_px)
    box_r = int(cfg.tip.tracking_box_radius_px)
    # ... use values in algorithms and drawings ...
```
The context generator attaches the same cached PipelineConfig instance to each yielded frame, so lookups are cheap and consistent across stages.
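A minimal sketch of that caching pattern (illustrative names; `load_pipeline_config` is the hypothetical loader sketched earlier, not a confirmed API):

```python
from functools import lru_cache
from typing import Optional

from s6.schema import PipelineConfig

@lru_cache(maxsize=1)
def get_config(path: Optional[str] = None) -> PipelineConfig:
    # Parse once; every later call returns the same instance.
    return load_pipeline_config(path)

def generate_contexts(frames):
    cfg = get_config()
    for frame in frames:
        # The same object is attached to every yielded frame context.
        yield {"config": cfg, "frame": frame}
```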
## Extending the Schema (add a new parameter)
Suppose you want to add a new algorithm parameter, `min_keypoint_conf`, to the tip detection logic.
Add a field to the appropriate sub-config in `src/s6/schema/pipeline_config.py`:
```python
from typing import Tuple
from pydantic import BaseModel, Field

class TipConfig(BaseModel):
    boundary_margin_px: int = Field(default=0, ge=0)
    tracking_box_radius_px: int = Field(default=80, ge=1)
    tracking_box_next_radius_px: int = Field(default=100, ge=1)
    refine_area_thresholds: Tuple[int, int] = Field(default=(20, 1200))
    suppression_radius_px: int = Field(default=30, ge=0)
    min_keypoint_conf: float = Field(default=0.25, ge=0.0, le=1.0)  # new field
```
Pick a sensible default to preserve backward compatibility, and add `ge`/`le` bounds or a custom validator if more complex validation is needed.
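If a simple bound isn't enough, a custom validator in the same class can enforce relationships between values (hypothetical example, trimmed to one field):

```python
from typing import Tuple
from pydantic import BaseModel, Field, validator

class TipConfig(BaseModel):
    refine_area_thresholds: Tuple[int, int] = Field(default=(20, 1200))

    @validator("refine_area_thresholds")
    def _min_lt_max(cls, v):
        # Enforce an ordering constraint that ge/le alone cannot express.
        lo, hi = v
        if lo >= hi:
            raise ValueError("refine_area_thresholds must be (min, max) with min < max")
        return v
```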
Use the new field in your pipeline code:
```python
from s6.schema import PipelineConfig

@pipeline_stage(require=["B.image"], produce=["B.tip_point"])
def tips_search(context, cam_l, cam_r, cam_b):
    cfg: PipelineConfig = context["config"]
    conf_threshold = cfg.tip.min_keypoint_conf
    # pass conf_threshold into your detector/refiner, e.g.:
    points, heatmap = kdetv2(cropped, min_conf=conf_threshold)
```
Expose the parameter in your config files (YAML/JSON):
```yaml
# configs/pipeline.config.yaml
tip:
  min_keypoint_conf: 0.35
```
Validate by loading the pipeline with your config:
```bash
s6 track --config configs/pipeline.config.yaml
```
If parsing fails, Pydantic will raise a clear error pointing to the offending field.
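You can also trigger the same validation from Python; for instance (assuming Pydantic v1), an out-of-range value fails loudly and names the field:

```python
from pydantic import ValidationError
from s6.schema import PipelineConfig

try:
    # 1.5 violates the le=1.0 bound on the new field.
    PipelineConfig.parse_obj({"tip": {"min_keypoint_conf": 1.5}})
except ValidationError as err:
    print(err)  # error message includes "tip -> min_keypoint_conf"
```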
## Design Tips
- Prefer adding to an existing sub-config for cohesion (e.g., `tip`, `tracking`).
- Always keep a default value; old configs must continue to load.
- Use typed tuples (`(w, h)` or `(min, max)`) when pairs matter; write validators to coerce lists to tuples.
- Keep names descriptive and unit-annotated in docstrings or `Field(description=...)`.
## Changelog Highlights (recent)
- feat(schema): add `PipelineConfig` model for pipeline configuration.
- feat(pipeline): add tracking configuration to pipeline settings and use `context["config"]` in stages.
- feat(config): add YAML support to pipeline configuration; prefer YAML over JSON when both exist.
- refactor(config): integrate `PipelineConfig` usage for parameter management across stages.
These changes make parameter management explicit, validated, and easy to extend.
## Troubleshooting
- YAML files require PyYAML; if YAML support isn't available, the JSON fallback is used.
- Missing or invalid fields trigger Pydantic errors at load time; check the reported line and field name.
- Ensure the `--config` path is correct; otherwise defaults are loaded from `configs/`.