Teaching robots to map large environments
A new approach developed at MIT could help a search-and-rescue robot navigate an unpredictable environment by rapidly generating an accurate map of its surroundings.
New tool from MIT CSAIL creates realistic virtual kitchens and living rooms where simulated robots can interact with models of real-world objects, scaling up training data for robot foundation models.
At the inaugural MIT Generative AI Impact Consortium Symposium, researchers and business leaders discussed potential advancements centered on this powerful technology.
Neural Jacobian Fields, developed by MIT CSAIL researchers, can learn to control any robot from a single camera, without any other sensors.
An AI pipeline developed by CSAIL researchers enables unique hydrodynamic designs for bodyboard-sized vehicles that glide underwater and could help scientists gather marine data.
Developed to analyze new semiconductors, the system could streamline the development of more powerful solar panels.
MIT CSAIL researchers combined GenAI and a physics simulation engine to refine robot designs. The result: a machine that out-jumped a robot designed by humans.
Presentations targeted high-impact intersections of AI and other areas, such as health care, business, and education.
The system automatically learns to adapt to unknown disturbances such as gusting winds.
MAD Fellow Alexander Htet Kyaw connects humans, machines, and the physical world using AI and augmented reality.