Robots Now See With Signals — MIT’s mmNorm Makes It Possible

  • ritambhara516
  • Jul 5
  • 4 min read


What if your robot could find a tool inside a cluttered drawer — without even opening it? Or identify a broken item inside a sealed box without tearing it apart? MIT researchers have turned that into reality with a groundbreaking new imaging technology that gives machines the power to “see” hidden objects using signals similar to Wi-Fi.


The system is called mmNorm, and it can reconstruct the shape of items that are completely out of sight — even if they’re wrapped, buried, or behind thin walls. By using millimeter wave (mmWave) signals, which can pass through materials like cardboard and plastic, the system captures the reflections bouncing off hidden objects and recreates them in 3D.


Think of it like X-ray vision — but safer, faster, and using wireless signals instead of radiation. Unlike traditional radar techniques that just detect object location, mmNorm estimates the direction of the object’s surface — a property called the surface normal. This extra detail allows it to reconstruct curves, edges, and fine shapes with surprising accuracy.


The research team, led by Professor Fadel Adib at MIT, attached a mmWave radar to a robotic arm. As the arm moves around a hidden item, it continuously sends out signals and picks up their reflections from different angles.


Each radar antenna receives signals of varying strength, depending on how the object’s surface is positioned. Antennas facing the surface directly receive strong reflections, while those at an angle get weaker ones.


Each signal acts like a “vote” for what direction the surface is facing. The system’s algorithm combines these votes to estimate the object’s shape at each point in space.
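The voting idea can be sketched in a few lines of Python. This is an illustrative toy, not MIT's actual algorithm: it assumes a simple specular model in which each antenna's reflection is strongest when that antenna directly faces the surface, so each antenna "votes" for its own viewing direction, weighted by the strength it measured:

```python
import numpy as np

def estimate_normal(antenna_dirs, strengths):
    """Estimate a surface normal from antenna 'votes'.

    antenna_dirs: (N, 3) unit vectors pointing from the surface point
                  toward each antenna.
    strengths:    (N,) measured reflection strengths (stronger means the
                  antenna is more nearly face-on to the surface).

    Each antenna votes for its own viewing direction, weighted by the
    reflection strength it received; the weighted average is a crude
    estimate of the surface normal at that point.
    """
    votes = antenna_dirs * strengths[:, None]   # weight each direction by its vote
    normal = votes.sum(axis=0)                  # combine all votes
    return normal / np.linalg.norm(normal)      # re-normalise to a unit vector

# Toy example: three antennas, with the strongest reflection seen by the
# antenna looking straight down the z-axis, so the estimated normal
# should point mostly along +z.
dirs = np.array([[0.0, 0.0, 1.0],
                 [0.6, 0.0, 0.8],
                 [0.0, 0.6, 0.8]])
strengths = np.array([1.0, 0.4, 0.4])
normal = estimate_normal(dirs, strengths)
```

Even weak reflections contribute here: a low strength just means a small weight, not a discarded measurement.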

To create the final 3D reconstruction, mmNorm borrows techniques from computer graphics, selecting the surface that best fits all the data collected. The result is a clear, detailed model — all without opening a box or using a camera.
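A toy stand-in for that selection step, under the same illustrative assumptions as above (a simple specular model, which is our hypothetical simplification rather than the paper's method): among candidate surface orientations at a point, pick the one whose predicted reflections best match what the antennas actually measured.

```python
import numpy as np

def surface_likelihood(candidate_normal, antenna_dirs, strengths):
    """Score one candidate orientation against the measured reflections.

    Simple specular model: an antenna's predicted reflection strength grows
    with how directly it faces the surface (dot product with the normal,
    clipped at zero for antennas on the far side).
    """
    predicted = np.clip(antenna_dirs @ candidate_normal, 0.0, None)
    return float(predicted @ strengths)  # higher = better agreement

def pick_best_surface(candidates, antenna_dirs, strengths):
    """Return the candidate orientation most consistent with the data."""
    scores = [surface_likelihood(c, antenna_dirs, strengths) for c in candidates]
    return candidates[int(np.argmax(scores))]

# Toy data: reflections are strongest for the antenna looking straight
# down the z-axis, so a +z-facing surface should score best.
antenna_dirs = np.array([[0.0, 0.0, 1.0],
                         [0.6, 0.0, 0.8],
                         [0.0, 0.6, 0.8]])
strengths = np.array([1.0, 0.4, 0.4])
candidates = np.array([[0.0, 0.0, 1.0],   # facing the radar
                       [1.0, 0.0, 0.0]])  # facing sideways
best = pick_best_surface(candidates, antenna_dirs, strengths)
```

Repeating this test at every point in space, and keeping the best-scoring surface overall, is the spirit of the "best fit to all the data" step described above.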


In real-world testing, mmNorm achieved 96% accuracy in reconstructing objects — a huge improvement over older methods, which topped out at 78%. The system also reduced shape reconstruction error by about 40%.


The team tested more than 60 objects of various sizes, shapes, and materials. The system worked for metal, wood, glass, plastic, and rubber — and even objects made from a combination of materials.


mmNorm also succeeded in reconstructing multiple items placed together in a single box — like a fork, knife, and spoon — accurately distinguishing between each one.

The only significant limitation? It doesn't work well when scanning through thick or solid metal walls, which completely block mmWave signals.


What makes mmNorm especially promising is that it requires no extra bandwidth. That means it could be used in real-world environments without overloading communication systems — a big win for efficiency and practical deployment.


The potential uses are wide-ranging. In a warehouse, robots could identify and handle delicate or damaged items before they’re shipped. In homes or assisted living centers, a robot helper could locate items inside cabinets or drawers without making a mess.


The technology also has applications in augmented reality (AR). With mmNorm, a factory worker wearing AR glasses could “see” tools inside a sealed container or view hidden wiring behind a wall.


Security and defense could benefit too. mmNorm could improve airport luggage scanners, offering sharper 3D reconstructions, or help military units spot hidden objects during recon missions.


The project was developed by MIT’s Signal Kinetics Group and presented at the Annual International Conference on Mobile Systems, Applications and Services. In addition to Adib, the team included Laura Dodds (lead author), Tara Boroushaki, and Kaichen Zhou.

One key insight the team had was around specularity — the way surfaces reflect mmWave signals like a mirror. Traditional radar systems ignore this, but mmNorm uses it to determine how surfaces are angled, improving detail in the 3D models.


The researchers also fine-tuned the system to work even when signals were weak or scattered. Because every antenna contributes a “vote,” even low-quality reflections help shape the final result.


Looking ahead, the team wants to make mmNorm even more powerful. Goals include boosting resolution, improving performance on low-reflection surfaces, and enabling scanning through thicker obstructions.


“This is a major step forward in how we think about wireless signals,” said Dodds. “We’re no longer just detecting the presence of objects — we’re understanding their shape, texture, and how to interact with them.”


The work was supported by the National Science Foundation, the MIT Media Lab, and Microsoft. With technologies like mmNorm, the line between digital and physical perception is quickly fading — and robots are getting a clearer picture of the world around them, one signal at a time.
