s6.app.data.filter

Interactively and programmatically filter entries in StructuredDatasets.

class s6.app.data.filter.QueryResult(dataset_idx: int, dataset_label: str, matched_indices: List[int])

Bases: object

Matched source indices for one dataset.

dataset_idx: int
dataset_label: str
matched_indices: List[int]
class s6.app.data.filter.ConfigPreviewDataset(dataset: AugmentedKeypointDataset, valid_indices: List[int], incompatible_indices_by_dataset: Dict[int, List[int]], skipped_count: int = 0)

Bases: object

Valid raw COG mappings selected for manual filter preview.

dataset: AugmentedKeypointDataset
valid_indices: List[int]
incompatible_indices_by_dataset: Dict[int, List[int]]
skipped_count: int = 0
property inner

Return the underlying StructuredDatasetTorch wrapper.

sample_ref_at(idx: int)
load_raw_mapping(idx: int) → RawKeypointMapping
property incompatible_entry_count: int
s6.app.data.filter.normalize_datakey(datakey: str) → str

Normalize config-style dotted datakeys into slash-delimited UI keys.
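The docstring implies a dotted-to-slash key conversion; a minimal sketch, assuming the function simply rewrites separators (the real helper may handle escapes or mixed separators differently):

```python
def normalize_datakey(datakey: str) -> str:
    # Sketch: convert a dotted config key ("a.b.c") into the
    # slash-delimited form ("a/b/c") used by the UI. Behavior on
    # mixed separators is an assumption.
    return "/".join(part for part in datakey.replace("/", ".").split(".") if part)

print(normalize_datakey("points.left_eye"))  # points/left_eye
```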

s6.app.data.filter.get_by_datakey(data, datakey)

Retrieve a nested value using a slash- or dot-separated datakey path.

s6.app.data.filter.get_optional_by_datakey(data, datakey)

Retrieve a nested value when present, otherwise return None.
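A sketch of how the strict and optional lookups likely relate, assuming plain nested dicts and that the optional variant swallows missing-key errors (both assumptions, not taken from the source):

```python
def get_by_datakey(data, datakey):
    # Sketch: walk nested mappings along a slash- or dot-separated path,
    # raising if any segment is missing.
    node = data
    for part in datakey.replace(".", "/").split("/"):
        node = node[part]
    return node

def get_optional_by_datakey(data, datakey):
    # Sketch: same walk, but return None when any segment is absent.
    try:
        return get_by_datakey(data, datakey)
    except (KeyError, TypeError, IndexError):
        return None

sample = {"meta": {"camera": {"id": 3}}}
print(get_by_datakey(sample, "meta/camera/id"))         # 3
print(get_optional_by_datakey(sample, "meta.missing"))  # None
```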

s6.app.data.filter.ensure_vector2d(point) → Vector2D

Convert tuple/list/dict-like points into Vector2D.
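A sketch of the coercion the docstring describes. `Vector2D` here is a stand-in NamedTuple; the real type and the exact set of accepted inputs are assumptions:

```python
from typing import NamedTuple

class Vector2D(NamedTuple):  # stand-in for the real Vector2D type
    x: float
    y: float

def ensure_vector2d(point) -> Vector2D:
    # Sketch: accept Vector2D, (x, y) tuples/lists, or {"x": ..., "y": ...}
    # dict-like points and coerce them to Vector2D.
    if isinstance(point, Vector2D):
        return point
    if isinstance(point, dict):
        return Vector2D(float(point["x"]), float(point["y"]))
    x, y = point
    return Vector2D(float(x), float(y))

print(ensure_vector2d({"x": 1, "y": 2}))  # Vector2D(x=1.0, y=2.0)
```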

s6.app.data.filter.normalize_point_key_group(point_keys: str | Sequence[str]) → List[str]

Normalize one logical point-key group into slash-delimited datakeys.

s6.app.data.filter.resolve_key_specs(image_keys: Sequence[str] | None, point_keys: Sequence[str | Sequence[str]] | None, mask_keys: Sequence[str] | None = None) → List[Tuple[str, List[str], str | None]]

Validate and normalize image, point, and optional mask datakey selections.

s6.app.data.filter.resolve_config_selection(config_path: Path) → Tuple[List[Path], List[Tuple[str, List[str], str | None]]]

Resolve dataset directories and key specs from a training data config.

s6.app.data.filter.parse_args(argv=None)

Parse CLI arguments for the dataset filter.

s6.app.data.filter.compile_query(query: str) → Callable[[Any], bool]

Compile a safe dataset-entry query into a predicate.
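One way such a query compiler could work, sketched here by compiling the expression once and evaluating it per entry with builtins disabled; the real implementation likely validates the query AST more strictly, and the entry-to-namespace mapping is an assumption:

```python
def compile_query(query: str):
    # Sketch: compile the query expression once, then evaluate it per
    # entry with __builtins__ disabled and the entry's fields as the
    # only names in scope.
    code = compile(query, "<query>", "eval")

    def predicate(entry) -> bool:
        namespace = dict(entry) if isinstance(entry, dict) else vars(entry)
        return bool(eval(code, {"__builtins__": {}}, namespace))

    return predicate

matches = compile_query("score > 0.5 and label == 'cat'")
print(matches({"score": 0.9, "label": "cat"}))  # True
```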

s6.app.data.filter.query_dataset_entries(datasets: Sequence[StructuredDataset], dataset_labels: Sequence[str], query: str) → List[QueryResult]

Return matched indices for each dataset.

s6.app.data.filter.print_query_results(results: Sequence[QueryResult], lengths: Sequence[int]) → int

Print query match counts and return total matches.

s6.app.data.filter.confirm_delete(total_matches: int) → bool

Ask the user to confirm query deletion.

s6.app.data.filter.delete_query_matches(datasets: Sequence[StructuredDataset], results: Sequence[QueryResult]) → int

Delete matched entries in descending index order.
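The docstring's "descending index order" is the key detail: deleting from the highest index down means earlier deletions never shift the positions of indices still to be removed. A sketch over a plain list (a stand-in for the real StructuredDataset deletion API):

```python
def delete_matches(entries: list, matched_indices: list) -> int:
    # Sketch: delete in descending index order so each deletion leaves
    # the remaining (smaller) indices valid.
    for idx in sorted(matched_indices, reverse=True):
        del entries[idx]
    return len(matched_indices)

items = ["a", "b", "c", "d"]
deleted = delete_matches(items, [1, 3])
print(items)  # ['a', 'c']
```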

s6.app.data.filter.run_query_mode(args) → None

Run command-line query matching and optional deletion.

s6.app.data.filter.ensure_bgr_image(image)

Return a BGR view of image suitable for overlay drawing.

s6.app.data.filter.overlay_mask(image_bgr: ndarray, mask: ndarray) → ndarray

Blend a segmentation mask onto image_bgr.
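A minimal NumPy-only sketch of mask blending; the overlay color and alpha are assumptions, and the real helper may use a different blend:

```python
import numpy as np

def overlay_mask(image_bgr: np.ndarray, mask: np.ndarray, alpha: float = 0.5) -> np.ndarray:
    # Sketch: tint masked pixels toward a fixed overlay color
    # (green in BGR here; the actual color is an assumption).
    out = image_bgr.astype(np.float32)
    color = np.array([0, 255, 0], dtype=np.float32)
    sel = mask.astype(bool)
    out[sel] = (1 - alpha) * out[sel] + alpha * color
    return out.astype(np.uint8)
```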

s6.app.data.filter.build_zoom_view(image_bgr, points: Sequence[Vector2D])

Build a zoomed crop centered on the given points and return zoom metadata.

s6.app.data.filter.render_panel(image, points, image_key, point_keys, mask=None, mask_key: str | None = None)

Render one image/point/mask trio as a single annotated panel.

s6.app.data.filter.pad_image(image, target_height, target_width)

Pad image with black pixels to the requested size.
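A sketch of zero-padding to a target size, assuming top-left anchoring (the real helper may center the image instead):

```python
import numpy as np

def pad_image(image: np.ndarray, target_height: int, target_width: int) -> np.ndarray:
    # Sketch: pad with black (zero) pixels on the bottom/right edges
    # until the image reaches the requested size.
    pad_h = max(0, target_height - image.shape[0])
    pad_w = max(0, target_width - image.shape[1])
    pad = [(0, pad_h), (0, pad_w)] + [(0, 0)] * (image.ndim - 2)
    return np.pad(image, pad, mode="constant", constant_values=0)

img = np.ones((2, 3, 3), dtype=np.uint8)
print(pad_image(img, 4, 4).shape)  # (4, 4, 3)
```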

s6.app.data.filter.compose_panels(panels)

Compose panels into a compact grid.

s6.app.data.filter.add_header(canvas, idx, total, removed_count, dataset_idx=0, dataset_total=1, dataset_label='', source_count=1) → ndarray

Add filter navigation metadata above a rendered sample canvas.

s6.app.data.filter.render_sample(sample, key_specs, idx, total, removed_count, dataset_idx=0, dataset_total=1, dataset_label='')

Render the current dataset entry across all selected sources.

s6.app.data.filter.render_raw_mapping(mapping: RawKeypointMapping, idx: int, total: int, removed_count: int, dataset_total: int) → ndarray

Render one raw COG training mapping for manual filtering.

s6.app.data.filter.validate_dataset_dirs(dataset_dirs: Sequence[Path]) → None
s6.app.data.filter.next_dataset_index(lengths: Sequence[int], current: int, direction: int) → int
s6.app.data.filter.filter_config_from_path(config_path: Path) → Config

Load a COG config for deterministic raw preview filtering.

s6.app.data.filter.build_config_preview_dataset(config_path: Path) → ConfigPreviewDataset

Build the raw mapped preview dataset for a COG training config.

s6.app.data.filter.print_incompatible_results(preview_dataset: ConfigPreviewDataset) → int

Print incompatible source-entry counts from a config scan.

s6.app.data.filter.delete_incompatible_entries(preview_dataset: ConfigPreviewDataset) → int

Delete config-incompatible source records in descending index order.

s6.app.data.filter.run_config_incompatible_cleanup(args) → None

Delete source entries that cannot satisfy a COG data mapping config.

s6.app.data.filter.next_config_preview_dataset_index(preview_dataset: ConfigPreviewDataset, current: int, direction: int) → int

Return the next mapped-preview index from another source dataset.

s6.app.data.filter.run_config_preview_mode(args) → None

Open the manual filter viewer over raw COG data mappings.

s6.app.data.filter.main()

Open an interactive viewer to browse and prune dataset entries.