Motion input for Apple platforms. No extra hardware required.
Turn the iPhone everyone already owns into a motion controller. Facial expressions, head movement, body tracking — captured via ARKit, delivered as a clean Swift API.
ARKit face & body tracking at 60fps. 52 blendshapes, head rotation, eye tracking. 91 body joints with multi-player support.
Zero-config Bonjour discovery. WebSocket streaming over local WiFi. Sub-50ms latency. Port 5070.
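Under the hood, "zero-config" typically means the tracker advertises itself over Bonjour while listening for WebSocket connections. A minimal sketch of that idea using Apple's Network framework, assuming a `_propel._tcp` service type (the actual type is not documented here):

```swift
import Foundation
import Network

// Sketch: advertise a WebSocket listener on port 5070 over Bonjour so
// receivers can find it without configuration. The service name and
// "_propel._tcp" type are assumptions for illustration, not SDK API.
func startAdvertising() throws -> NWListener {
    let params = NWParameters.tcp
    let ws = NWProtocolWebSocket.Options()
    params.defaultProtocolStack.applicationProtocols.insert(ws, at: 0)

    let listener = try NWListener(using: params, on: 5070)
    listener.service = NWListener.Service(name: "Propel Tracker", type: "_propel._tcp")
    listener.newConnectionHandler = { connection in
        connection.start(queue: .main)   // stream tracking frames over this socket
    }
    listener.start(queue: .main)
    return listener
}
```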
23 semantic gestures with cooldowns. Multi-player tracking. Three-layer API — raw data, gestures, or player events.
Raw tracking data for custom logic. Semantic gestures for quick integration. Player events for multiplayer. Use what you need.
The API is identical whether tracking runs locally (ARKit on-device) or remotely (iPhone over WiFi). Your code doesn't change.
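One way to picture why call sites stay identical: both the on-device ARKit tracker and the WiFi receiver can sit behind a single frame-source abstraction. A hypothetical sketch (the `FrameSource` protocol and `TrackingFrame` type are illustrative, not the SDK's actual types):

```swift
import Foundation

// Illustrative types standing in for the SDK's frame model.
struct TrackingFrame {
    var jawOpen: Float
    var headPitch: Float
}

// Both local and remote tracking can conform to one protocol,
// so game code never cares where frames come from.
protocol FrameSource {
    var onFrame: ((TrackingFrame) -> Void)? { get set }
    func start()
}

struct LocalARKitSource: FrameSource {
    var onFrame: ((TrackingFrame) -> Void)?
    func start() { /* run an ARSession on-device, forward frames to onFrame */ }
}

struct RemoteWiFiSource: FrameSource {
    var onFrame: ((TrackingFrame) -> Void)?
    func start() { /* decode WebSocket frames from the phone, forward to onFrame */ }
}
```

A call site depends only on the protocol, so swapping `LocalARKitSource` for `RemoteWiFiSource` requires no code changes.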
```swift
import PropelReceiver

let propel = PropelReceiver.shared

// Layer 1 — Raw data
propel.onFrame = { frame in
    let jawOpen = frame.blendshapes.jawOpen
    let pitch = frame.headRotation.pitch
}

// Layer 2 — Semantic gestures
propel.onGesture = { event in
    switch event.gesture {
    case .smile: score += 1
    case .jump: player.jump()
    case .headNod: confirmAction()
    default: break
    }
}

// Layer 3 — Multiplayer
propel.onPlayerJoined = { player in
    print("\(player.id) joined")
}

propel.startDiscovery()
```
All gestures include configurable thresholds, cooldowns, and temporal pattern detection.
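The threshold-plus-cooldown idea can be sketched in a few lines: a gesture fires when its signal crosses a threshold, then is suppressed for a cooldown window so one sustained smile doesn't fire sixty times a second. The names and default values below are illustrative, not the SDK's API:

```swift
import Foundation

// Sketch of a per-gesture gate: fire on threshold crossing,
// then suppress re-firing for a cooldown window.
struct GestureGate {
    var threshold: Float          // e.g. blendshape value that counts as "smiling"
    var cooldown: TimeInterval    // minimum seconds between firings
    var lastFired: Date? = nil

    mutating func process(value: Float, at now: Date) -> Bool {
        guard value >= threshold else { return false }
        if let last = lastFired, now.timeIntervalSince(last) < cooldown {
            return false   // still inside the cooldown window
        }
        lastFired = now
        return true
    }
}

// Usage: feed it a blendshape value each frame.
var smile = GestureGate(threshold: 0.6, cooldown: 0.8)
// if smile.process(value: frame.blendshapes.mouthSmileLeft, at: Date()) { score += 1 }
```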
The tracking layer has zero knowledge of what the receiver does with data. Pairing is automatic via Bonjour.
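On the receiver side, automatic pairing amounts to browsing for the tracker's Bonjour service and connecting to whatever appears. A sketch with the Network framework, again assuming a `_propel._tcp` service type:

```swift
import Network

// Sketch: browse for an advertised tracker and open a connection to the
// first result. The "_propel._tcp" type is an assumption, not SDK API.
func startDiscovery(onFound: @escaping (NWConnection) -> Void) -> NWBrowser {
    let browser = NWBrowser(for: .bonjour(type: "_propel._tcp", domain: nil), using: .tcp)
    browser.browseResultsChangedHandler = { results, _ in
        guard let result = results.first else { return }
        let connection = NWConnection(to: result.endpoint, using: .tcp)
        connection.start(queue: .main)   // hand off to the WebSocket frame decoder
        onFound(connection)
    }
    browser.start(queue: .main)
    return browser
}
```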
```swift
// Swift Package Manager
dependencies: [
    .package(
        url: "https://github.com/propelsdk/propel-sdk.git",
        from: "0.2.0"
    )
]
```
| Module | Platforms | Purpose |
|---|---|---|
| PropelCore | iOS, tvOS, macOS | Shared models, gestures, protocol |
| PropelTracker | iOS | ARKit capture, WebSocket streaming |
| PropelReceiver | iOS, tvOS, macOS | Gesture engine, player management |