private SIL/HIL
Fine-tune ADAS. Never upload footage.
First browser-native fused-LoRA trainer for perception models. Drag your fleet video in; the file never leaves your machine. Training runs in WebGPU on your own silicon, and the output is a 4 MB .flora adapter you can email, version, and ship.
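Why an adapter stays small enough to email: a rank-r LoRA adapter stores only two low-rank factor matrices per adapted layer. A back-of-the-envelope sketch (layer counts, dimensions, and the payload layout are illustrative assumptions, not the real .flora format):

```javascript
// Hypothetical .flora payload: per adapted layer, low-rank factors
// A (rank x inDim) and B (outDim x rank), stored as float32.
// All sizes below are illustrative assumptions, not the real format.
function adapterBytes(layers, { inDim, outDim, rank }) {
  const floatsPerLayer = rank * inDim + outDim * rank; // A + B
  return layers * floatsPerLayer * 4; // 4 bytes per float32
}

// e.g. 24 adapted layers of a small perception backbone at rank 8
const bytes = adapterBytes(24, { inDim: 768, outDim: 768, rank: 8 });
const mb = bytes / (1024 * 1024); // ≈ 1.1 MB of raw weights
```

Even with headers, metadata, and a few more layers, the file stays in the single-digit-megabyte range, which is why it can be versioned and shipped like a config file.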
Try it — it stays local
research preview · frame extraction real · training step is a simulated LoRA/Adam loop · real perception backbone next pass
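The "simulated LoRA/Adam loop" can be pictured as ordinary Adam updates applied to a toy low-rank factor. This is an illustrative stand-in for the preview's simulated step, not the app's actual trainer; hyperparameters are the common Adam defaults:

```javascript
// Minimal in-place Adam step over a flat parameter array.
// state carries the first/second moment estimates and the step counter.
function adamStep(w, grad, state, { lr = 1e-3, b1 = 0.9, b2 = 0.999, eps = 1e-8 } = {}) {
  state.t += 1;
  for (let i = 0; i < w.length; i++) {
    state.m[i] = b1 * state.m[i] + (1 - b1) * grad[i];
    state.v[i] = b2 * state.v[i] + (1 - b2) * grad[i] * grad[i];
    const mHat = state.m[i] / (1 - Math.pow(b1, state.t)); // bias-corrected
    const vHat = state.v[i] / (1 - Math.pow(b2, state.t));
    w[i] -= (lr * mHat) / (Math.sqrt(vHat) + eps);
  }
}

// simulated loop: drive a toy LoRA factor toward the minimum of sum(w^2)
const w = [0.5, -0.3];
const state = { t: 0, m: [0, 0], v: [0, 0] };
for (let step = 0; step < 100; step++) {
  const grad = w.map(x => 2 * x); // analytic gradient of sum(w^2)
  adamStep(w, grad, state);
}
```

With these defaults each coordinate moves roughly `lr` per step toward zero, so 100 steps visibly shrink the loss without overshooting.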
drop a video file
or click to pick · stays on your machine
no file handy? synthetic clips:
highway
urban
night
drawn procedurally in-browser · nothing downloads
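Frame extraction from a dropped clip reduces to a sampling schedule: pick the timestamps to seek to, then draw each frame to a canvas. A sketch of the schedule part, which runs anywhere (the app's actual sampling rate is unknown):

```javascript
// Which timestamps to seek to when extracting `fps` frames per second
// from a clip of `durationSec` seconds. Illustrative, not the app's code.
function frameTimestamps(durationSec, fps) {
  const n = Math.floor(durationSec * fps);
  return Array.from({ length: n }, (_, i) => i / fps);
}

const stamps = frameTimestamps(2, 4); // [0, 0.25, 0.5, ..., 1.75]
```

In the browser, each timestamp would be assigned to `video.currentTime`, and after the `seeked` event fires the frame is captured with `canvas.getContext("2d").drawImage(video, 0, 0)`.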
bytes transmitted off-device
0
Run detection (loads YOLOS-tiny · ~25 MB · one-time)
your adapter — keep classes that matter
min confidence
0.35
Train adapter on these preferences
show raw (bypass adapter)
Download .flora
Load .flora
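The "keep classes that matter / min confidence" behavior above can be sketched as a post-filter over raw detections. The detection shape and class names here are assumptions for illustration, not the adapter's real mechanism:

```javascript
// Hypothetical adapter post-filter: drop detections whose class was not
// kept, or whose score falls below the committed confidence threshold.
function applyAdapter(detections, { keepClasses, minConfidence }) {
  return detections.filter(
    d => keepClasses.has(d.label) && d.score >= minConfidence
  );
}

const raw = [
  { label: "car", score: 0.91 },
  { label: "person", score: 0.42 },
  { label: "car", score: 0.20 },
];
const kept = applyAdapter(raw, { keepClasses: new Set(["car"]), minConfidence: 0.35 });
// kept: only the 0.91 car survives
```

"show raw (bypass adapter)" then simply renders `raw` instead of `kept`.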
✓ adapter trained
committed —
raw → kept — → —
classes —
threshold —
adapter id —
⚠ settings changed since last train — retrain to commit
verify locality: open DevTools → Network. Drop a file. Count outbound requests. Expected: zero.
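Beyond eyeballing the Network tab, the same check can be automated with the Performance API by counting resource entries that leave the page's origin. A sketch (the page may not actually do this; shown with mock entries so the filter logic runs anywhere):

```javascript
// Count resource fetches whose URL does not start with the given origin.
// In a browser, pass performance.getEntriesByType("resource") as `entries`.
function countOffOrigin(entries, origin) {
  return entries.filter(e => !e.name.startsWith(origin)).length;
}

// mock entries standing in for a real PerformanceResourceTiming list;
// the origin below is a hypothetical placeholder
const mockEntries = [
  { name: "https://local.app/bundle.js" },
  { name: "https://local.app/worker.js" },
];
const outbound = countOffOrigin(mockEntries, "https://local.app");
// expected: 0 outbound requests after a file drop
```

Note the one-time YOLOS-tiny download is a legitimate same-purpose fetch before any file is dropped; the zero-request expectation applies to the drop itself.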