Augmenting citizen science with computer vision for fish monitoring
MIT Sea Grant works with the Woodwell Climate Research Center and other collaborators to demonstrate a deep learning-based system for fish monitoring.
By moving their hands and fingers, users can direct a robot to play piano or shoot a basketball, or they can manipulate objects in a virtual environment.
A new hybrid system could help robots navigate in changing environments or increase the efficiency of multirobot assembly teams.
A new approach could help users know whether to trust a model’s predictions in safety-critical applications like health care and autonomous driving.
Torralba’s research focuses on computer vision, machine learning, and human visual perception.
The AI-powered tool could inform the design of better sensors and cameras for robots or autonomous vehicles.
The approach could apply to more complex tissues and organs, helping researchers to identify early signs of disease.
A new approach developed at MIT could help a search-and-rescue robot navigate an unpredictable environment by rapidly generating an accurate map of its surroundings.
MIT PhD student and CSAIL researcher Justin Kay describes his work combining AI and computer vision systems to monitor the ecosystems that support our planet.
By visualizing Escher-like optical illusions in 2.5 dimensions, the “Meschers” tool could help scientists understand physics-defying shapes and spark new designs.