Researchers enhance peripheral vision in AI models
By enabling models to see the world more like humans do, the work could help improve driver safety and shed light on human behavior.
Autonomous helicopters made by Rotor Technologies, a startup led by MIT PhDs, take the human out of risky commercial missions.
The system could improve image quality in video streaming or help autonomous vehicles identify road hazards in real time.
Jonathan How and his team at the Aerospace Controls Laboratory develop planning algorithms that allow autonomous vehicles to navigate dynamic environments without colliding.
Researchers develop a machine-learning technique that can efficiently learn to control a robot, leading to better performance with less data.
Luca Carlone and Jonathan How of MIT LIDS discuss how future robots might perceive and interact with their environment.
A new AI-based approach for controlling autonomous robots satisfies the often-conflicting goals of safety and stability.
Cindy Alejandra Heredia’s journey from Laredo, Texas, took her to leading the MIT autonomous vehicle team and to an MBA from MIT Sloan.
It’s more important than ever for artificial intelligence to estimate how accurately it is explaining data.
A new computer vision system turns any shiny object into a camera of sorts, enabling an observer to see around corners or beyond obstructions.