Human-computer interaction

Explained: Generative AI’s environmental impact

Rapid development and deployment of powerful generative AI models come with environmental consequences, including increased electricity demand and water consumption.

Teaching AI to communicate sounds like humans do

Inspired by the mechanics of the human vocal tract, a new AI model can produce and understand vocal imitations of everyday sounds. The method could help build new sonic interfaces for entertainment and education.

MIT welcomes Frida Polli as its next visiting innovation scholar

The neuroscientist turned entrepreneur will be hosted by the MIT Schwarzman College of Computing and focus on advancing the intersection of behavioral science and AI across MIT.

Study reveals AI chatbots can detect race, but racial bias reduces response empathy

Researchers at MIT, NYU, and UCLA develop an approach to help evaluate whether large language models like GPT-4 are equitable enough to be clinically viable for mental health support.

Researchers reduce bias in AI models while preserving or improving accuracy

A new technique identifies and removes the training examples that contribute most to a machine-learning model’s failures.

Enabling AI to explain its predictions in plain language

Using LLMs to convert machine-learning explanations into readable narratives could help users make better decisions about when to trust a model.

Daniela Rus wins John Scott Award

MIT CSAIL director and EECS professor named a co-recipient of the honor for her robotics research, which has expanded our understanding of what a robot can be.

Citation tool offers a new approach to trustworthy AI-generated content

Researchers develop “ContextCite,” an innovative method to track AI’s source attribution and detect potential misinformation.

A model of virtuosity

Acclaimed keyboardist Jordan Rudess’s collaboration with the MIT Media Lab culminates in live improvisation between an AI “jam_bot” and the artist.

Can robots learn from machine dreams?

MIT CSAIL researchers used AI-generated images to train a robot dog in parkour, without real-world data. Their LucidSim system demonstrates generative AI’s potential for creating robotics training data.