sense-core documentation

uplink — Push mock telemetry into a visualizer

Connects to a visualizer /ws/uplink endpoint and continuously publishes a synthetic telemetry payload whose vectors vary around the origin.

The payload matches the visualizer’s default binding paths:

  • export.solver.B.tippoint3d_c

  • export.solver.B.markerpoint3d_c

  • export.flags.model_visible
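
As an illustration, a snapshot covering these binding paths could be assembled as below. Only the three dotted paths are documented; the exact nesting, value layout, and sample numbers here are assumptions for the sketch.

```python
import json

# Hypothetical snapshot shape: the three documented binding paths,
# expressed as nested JSON. Coordinate values are placeholder floats.
snapshot = {
    "export": {
        "solver": {
            "B": {
                "tippoint3d_c": [0.01, -0.02, 0.03],
                "markerpoint3d_c": [0.03, -0.02, 0.03],
            }
        },
        "flags": {"model_visible": True},
    }
}

# The uplink sends raw JSON text frames over the WebSocket.
payload = json.dumps(snapshot)
```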

Usage

# Publish forever to the local visualizer backend
s6 uplink

# Target a different visualizer and slow the stream down
s6 uplink --uplink ws://127.0.0.1:5173/ws/uplink --hz 10

# Send a finite burst for testing
s6 uplink --count 50 --radius 0.05 --marker-offset 0.02

How it works

  • Opens a client WebSocket connection to --uplink.

  • Generates a deterministic sinusoidal vector3d around (0, 0, 0) in solver space.

  • Uses that vector as the mock tip point and applies --marker-offset on X for the marker point.

  • Sends raw JSON snapshots so the visualizer Node uplink hub can broadcast them directly to browser clients.
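
The point-generation step can be sketched as a small pure function. This is a hypothetical helper, not the actual implementation: the real generator's phase and frequency choices are not documented, but the shape (a sinusoid of amplitude --radius around the origin, with the marker shifted by --marker-offset on X) follows the bullets above.

```python
import math

def mock_points(t: float, radius: float = 0.05, marker_offset: float = 0.02):
    """Deterministic sinusoidal tip/marker vectors around (0, 0, 0).

    Hypothetical sketch of the mock-telemetry generator; frequencies
    and phases are illustrative assumptions.
    """
    # Tip point traces a sinusoid of amplitude `radius` in solver space.
    tip = (
        radius * math.sin(t),
        radius * math.sin(2.0 * t),
        radius * math.cos(t),
    )
    # Marker point is the tip shifted by --marker-offset along X.
    marker = (tip[0] + marker_offset, tip[1], tip[2])
    return tip, marker
```

Each (tip, marker) pair would then be serialized into a JSON snapshot and written to the --uplink WebSocket at the configured --hz rate.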

Copyright © 2025, vicky