Physical AI · All-weather perception

We see
what cameras can't.

A radar-camera foundation model built from the physics up. It learns without labels and gets better with every mile driven.

4 US patents pending · Published at CVPR 2023–24 · London, UK · 2026

Vision has a weather problem.

Self-driving cannot deliver in rain, fog, snow, or glare. Adverse weather blocks 20–40% of the addressable market and creates compounding regulatory liability.

  • 3.2M Tesla vehicles under expanded NHTSA probe — March 2026.
  • 30% of driving hours blind to vision-only stacks in northern latitudes.
  • The UK Automated Vehicles Act 2024 places liability for adverse-weather incidents on the AV provider.

A foundation model for radar-camera fusion.

We integrate with any 4D imaging radar to deliver perception in conditions that blind vision-only and LiDAR systems — and we learn from every mile, without labels.

All-weather

Near-zero disengagements where vision-only fails and LiDAR struggles.

Sensor-agnostic

Runs on any 4D imaging radar. No hardware lock-in.

Self-supervised

Every mile becomes training data. The model improves at near-zero marginal cost.

Software-only

Deploys as IP license on existing silicon. No new sensors, no manufacturing risk.

Active across automotive and defence.

  • UK Government — CAM Pathfinder · Active since June 2025
  • NVIDIA Inception — DRIVE platform · GPU compute
  • PNNL / Battelle — Letter of intent · pathway to DoD
  • EnSilica — Hardware-optimised radar SoC deployment

Built by founders out of Nokia Bell Labs and GM, with an advisory board drawn from Ford and EPFL. Published at CVPR 2023–24. 4 US patents pending.