Drones navigate unseen environments with liquid neural networks
MIT researchers demonstrate a new advance in autonomous drone navigation, using brain-inspired liquid neural networks that excel in out-of-distribution scenarios.
Experts convene to peek under the hood of AI-generated code, language, and images, as well as the technology’s capabilities, limitations, and future impact.
“DribbleBot” can maneuver a soccer ball on landscapes such as sand, gravel, mud, and snow, using reinforcement learning to adapt to varying ball dynamics.
MIT researchers trained logic-aware language models to reduce harmful stereotypes such as gender and racial biases.
Robots developed at MIT’s Computer Science and Artificial Intelligence Laboratory can self-assemble into various structures, with applications including inspection.
CSAIL system uses a patient’s ECG signal to estimate potential for cardiovascular death.
MIT CSAIL system can learn to see by touching and feel by seeing, suggesting a future where robots can more easily grasp and recognize objects.
Gripper device inspired by “origami magic ball” can grasp wide array of delicate and heavy objects.