AI learns how vision and sound are connected, without human intervention
This new machine-learning model can match corresponding audio and visual data, which could someday help robots interact in the real world.
Researchers are developing algorithms to predict failures when automation meets the real world in areas like air traffic scheduling or autonomous vehicles.
Sendhil Mullainathan brings a lifetime of unique perspectives to research in behavioral economics and machine learning.
“IntersectionZoo,” a benchmarking tool, uses a real-world traffic problem to test progress in deep reinforcement learning algorithms.
New type of “state-space model” leverages principles of harmonic oscillators.
Using diagrams to represent interactions in multipart systems can provide a faster way to design software improvements.
Researchers have created a unifying framework that can help scientists combine existing ideas to improve AI models or create new ones.
A new technique automatically guides an LLM toward outputs that adhere to the rules of whatever programming language or other format is being used.
By eliminating redundant computations, a new data-driven method can streamline processes like scheduling trains, routing delivery drivers, or assigning airline crews.
A new method from the MIT-IBM Watson AI Lab helps large language models steer their own responses toward safer, more ethical, value-aligned outputs.