Adam Zewe | MIT News

A faster, better way to train general-purpose robots

Inspired by large language models, researchers develop a training technique that pools diverse data to teach robots new skills.

Making it easier to verify an AI model’s responses

By allowing users to clearly see the data referenced by a large language model, this tool speeds manual validation and helps them spot AI errors.

Modeling relationships to solve complex problems efficiently

Associate Professor Julian Shun develops high-performance algorithms and frameworks for large-scale graph processing.

AI simulation gives people a glimpse of their potential future self

By enabling users to chat with an older version of themselves, Future You is aimed at reducing anxiety and guiding young people to make better choices.

New security protocol shields data from attackers during cloud-based computation

The technique leverages quantum properties of light to guarantee security while preserving the accuracy of a deep-learning model.

3 Questions: Should we label AI systems like we do prescription drugs?

Researchers argue that in health care settings, “responsible use” labels could ensure AI systems are deployed appropriately.

Study: AI could lead to inconsistent outcomes in home surveillance

Researchers find large language models make inconsistent decisions about whether to call the police when analyzing surveillance videos.

Study: Transparency is often lacking in datasets used to train large language models

Researchers developed an easy-to-use tool that enables an AI practitioner to find data that suits the purpose of their model, which could improve accuracy and reduce bias.

3 Questions: How to prove humanity online

AI agents could soon become indistinguishable from humans online. Could “personhood credentials” protect people against digital imposters?

MIT researchers use large language models to flag problems in complex systems

The approach can detect anomalies in data recorded over time, without the need for any training.