Oct 2025 – Dec 2025 · GitHub
MultiCam is a small protocol and set of client apps for recording video on a bunch of heterogeneous devices at the same time, and having the resulting files line up.
The goal of MultiCam is to press one button on a laptop and get back N recordings that share a common timeline to within a frame.
There are three pieces:
- **Discovery:** every device advertises itself on the local network as an mDNS service, `_multicam._tcp.local.`, so the controller can find it.
- **Commands:** `START_RECORDING`, `STOP_RECORDING`, `DEVICE_STATUS`, `GET_VIDEO`, `LIST_FILES`, `UPLOAD_TO_CLOUD`.
- **Scheduled capture:** `START_RECORDING` carries a shared future timestamp at which every device begins recording.
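As a sketch of what one of these commands might look like on the wire (the JSON shape and field names here are assumptions for illustration, not the actual protocol), a `START_RECORDING` message could carry an absolute start time in the shared clock rather than a relative delay:

```python
import json


def make_start_recording(start_at: float, session_id: str) -> bytes:
    """Encode a hypothetical START_RECORDING command as JSON.

    start_at is an absolute wall-clock time (Unix seconds) in the
    shared NTP-disciplined timebase; every device begins capture at
    that instant rather than on message receipt, so network latency
    differences between devices don't matter.
    """
    msg = {
        "type": "START_RECORDING",
        "session_id": session_id,
        "start_at": start_at,  # absolute start time, not a delay
    }
    return json.dumps(msg).encode("utf-8")


def parse_command(payload: bytes) -> dict:
    """Decode a command message back into a dict."""
    return json.loads(payload.decode("utf-8"))
```

A controller would schedule `start_at` a second or two in the future, leaving room for the slowest device to receive the message and spin up its capture pipeline.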
All devices share an NTP-disciplined clock, so if each one honors the scheduled start timestamp precisely, their first frames land on the same instant. Stopping works the same way. Each recorded file is then tagged with the exact timestamp of its first frame, so downstream tooling can align clips without any audio or visual fiducials at all.
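With first-frame timestamps in hand, alignment reduces to arithmetic: take the latest-starting clip as the common origin and drop each other clip's leading frames. A minimal sketch of that calculation (assuming a constant frame rate per clip, which the project's actual tooling may not):

```python
def frame_offsets(first_frame_ts: dict[str, float], fps: float) -> dict[str, int]:
    """Given each clip's first-frame timestamp (seconds, shared clock),
    return how many leading frames to drop from each clip so that all
    clips start on the same common instant.

    The latest starter defines t=0; every earlier clip is trimmed.
    """
    origin = max(first_frame_ts.values())
    return {
        name: round((origin - ts) * fps)  # seconds of lead-in -> frames
        for name, ts in first_frame_ts.items()
    }
```

For example, at 30 fps a clip that started 33 ms before the latest one would lose one leading frame: `frame_offsets({"iphone": 10.000, "pi": 10.033}, 30.0)` returns `{"iphone": 1, "pi": 0}`.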
The fun part is that every platform fights you differently. iOS gives you AVCaptureSession with precise presentation timestamps and is happy to cooperate. Android’s camera stack is a zoo of vendor quirks and the scheduling story is much fuzzier. Quest runs Android under the hood but with its own passthrough camera pipeline. And the Pi is a Linux box running libcamera, which is lovely and predictable.
This is a rough experiment, not a complete characterization. Small sample sizes, a single test environment, and only a handful of device models. Treat the numbers as ballpark.
| Device | Observed alignment error |
| --- | --- |
| iPhone (iOS) | < 100 ms |
| Raspberry Pi (libcamera) | < 100 ms |
| Meta Quest | < 500 ms |
| Android phones | < 500 ms |
Currently working on dedicated hardware to push synchronization below 10 µs.