Technology

Building Humanoids
For Real Homes

We deploy humanoid robots today using human-in-the-loop telepresence. Every interaction trains the next generation of physical AI.

The Technical Thesis

Full autonomy in unstructured home environments is years away – for everyone. The manipulation, reasoning, and safety requirements are unsolved problems. Many humanoid companies show visions of home robots, but build applications for commercial spaces.

We took a different path: deploy to homes now with human operators, generate real-world data at scale, and use that data to progressively automate. This isn’t a workaround – it’s the fastest route to physical AI that works.

Every teleoperation session creates training episodes. Every home deployment creates massive data variability.

More deployments → More data → Better autonomy → More deployments
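
To make the flywheel concrete, here is a minimal sketch of what a recorded teleoperation episode could look like once it becomes training data. The schema, field names, and task labels are illustrative assumptions for the example, not our actual data format.

# Illustrative sketch only: an assumed schema for one teleoperation
# training episode, not the production recording format.
from dataclasses import dataclass, field
from typing import List
import numpy as np


@dataclass
class Step:
    """One synchronized timestep of a teleoperation session."""
    timestamp: float             # seconds since episode start
    rgb: np.ndarray              # stereo fisheye frame pair, e.g. (2, H, W, 3)
    joint_positions: np.ndarray  # measured robot joint angles
    operator_action: np.ndarray  # commanded end-effector pose + gripper state


@dataclass
class Episode:
    """A full session, tagged with the context that drives data variability."""
    robot_id: str
    home_id: str                 # which deployment the data came from
    task_label: str              # e.g. "pour_tea" (hypothetical label)
    steps: List[Step] = field(default_factory=list)

    def to_training_pairs(self):
        """Yield (observation, action) pairs for imitation learning."""
        for step in self.steps:
            yield (step.rgb, step.joint_positions), step.operator_action
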
Our Journey

First Home Test in 2023

We’ve built 3 generations of hardware since.

Hardware

Robody — The Hardware

Designed for homes, proven in the field

Height: 1.65 m
Weight: 60 kg
Arms: 6 DoF, 1.5 kg payload each
Neck: 3 DoF
End-effector: 2-finger gripper (fin-ray-based)
Actuation: QDDs + Dynamixel integrated actuators
Mobility: Differential drive, stable platform
Battery: 6 h runtime, self-docking, inductive charging
Sensors: 4K fisheye RGB, mm-wave radar, stereo mic
Compute: Nvidia Jetson Orin + Orin Nano
Middleware: ROS2

Safety Isn’t a Spec — It’s the Starting Point

The robot is intentionally weak: 1.5 kg per arm won’t win any lifting competitions, but it’s enough to pour tea, carry a plate, or hand someone their medication—and gentle enough that physical contact with a person is never a risk.

In real apartments, you’re always navigating around something, so the robot inevitably bumps into tabletops, doorframes, cabinets. That’s why the whole body is wrapped in a soft skin that absorbs impacts, fully enclosed with no exposed joints or mechanisms where a curious hand could get pinched.

And because Robody will live in someone’s home for years, it should feel like it belongs there. It wears clothes—families dress it how they like. And it has a face. Not a screen, not a blinking light, but a face that moves, expresses, and makes eye contact.

Telepresence

You Can Be in Two Places at Once

Robody’s telepresence system is built around embodiment, not remote control. The operator experiences a true first-person perspective: the VR headset displays stereoscopic video streamed directly from Robody’s fisheye camera array.

Every hand motion is mapped one-to-one onto the robot’s end-effector. The result is a control loop that feels less like “operating a machine” and more like inhabiting a body.
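
As an illustration of what one-to-one mapping can mean in practice, here is a minimal ROS2 (rclpy) sketch that relays a tracked operator hand pose straight to an end-effector target. The topic names, frame id, and identity mapping are assumptions for the example; the real pipeline adds calibration, filtering, workspace limits, and safety checks.

# Minimal ROS2 sketch (rclpy): forward a tracked VR hand pose as an
# end-effector pose target. Topic names and the identity mapping are
# illustrative assumptions, not the production telepresence stack.
import rclpy
from rclpy.node import Node
from geometry_msgs.msg import PoseStamped


class HandToEndEffector(Node):
    def __init__(self):
        super().__init__('hand_to_end_effector')
        # Assumed topics: a VR bridge publishes the operator's right hand pose,
        # and a downstream controller consumes end-effector pose targets.
        self.sub = self.create_subscription(
            PoseStamped, '/operator/right_hand_pose', self.on_hand_pose, 10)
        self.pub = self.create_publisher(
            PoseStamped, '/robody/right_arm/ee_target', 10)

    def on_hand_pose(self, msg: PoseStamped) -> None:
        target = PoseStamped()
        target.header.stamp = self.get_clock().now().to_msg()
        target.header.frame_id = 'base_link'   # assumed robot base frame
        target.pose = msg.pose                  # one-to-one position/orientation
        self.pub.publish(target)


def main():
    rclpy.init()
    node = HandToEndEffector()
    rclpy.spin(node)
    rclpy.shutdown()


if __name__ == '__main__':
    main()
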

Performance

Typical glass-to-glass latency stays under 200 ms on standard residential internet.
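
One way to sanity-check that number in software is to carry each frame’s capture timestamp through the pipeline and compare it with the moment the frame is presented in the headset. This only approximates true glass-to-glass latency, which is measured externally (for example by filming a blinking LED and the headset display together), and it assumes the robot and operator clocks are synchronized. The monitor below is a hypothetical sketch, not our measurement tooling.

# Hypothetical sketch: approximate capture-to-display latency by comparing
# the timestamp embedded with each frame against the local display time.
# Assumes synchronized clocks (e.g. NTP/PTP) on both ends.
import time
from collections import deque


class LatencyMonitor:
    def __init__(self, window: int = 300):
        self.samples = deque(maxlen=window)   # sliding window of recent frames

    def on_frame_displayed(self, capture_time_s: float) -> float:
        """Record one frame's capture-to-display delay and return it in ms."""
        delay_ms = (time.time() - capture_time_s) * 1000.0
        self.samples.append(delay_ms)
        return delay_ms

    def p95_ms(self) -> float:
        """95th-percentile delay over the recent window."""
        ordered = sorted(self.samples)
        return ordered[int(0.95 * (len(ordered) - 1))] if ordered else 0.0
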

Global Reach

Teleoperation from China

Live control of our Robody in Munich from Shenzhen, more than 9,000 km away.

Everyday Tasks

Robody in the Kitchen

Preparing meals in a real home kitchen environment.

Roadmap

What We’re Building Next

1
Action transformers for bi-manual mobile manipulation in homes
2
Sample-efficient Learning from Demonstration at scale
3
Safe physical interaction with vulnerable populations
4
Network-adaptive shared autonomy
5
Ultra-low-latency telepresence over unreliable networks
6
Fleet intelligence

Want to Work With Us?

Real-world robotics deployment, embodied AI with immediate impact, full-stack robotics challenges, and a team that’s been doing this for a decade.

Apply at apply@devanthro.com