Research

Collaborating to advance research and innovation on essential chips for AI

Agreement between MIT Microsystems Technology Laboratories and GlobalFoundries aims to deliver power efficiencies for data centers and ultra-low power consumption for intelligent devices at the edge.

An ancient RNA-guided system could simplify delivery of gene editing therapies

The programmable proteins are compact, modular, and can be directed to modify DNA in human cells.

AI system predicts protein fragments that can bind to or inhibit a target

FragFold, developed by MIT Biology researchers, is a computational method with potential for impact on biological research and therapeutic applications.

Like human brains, large language models reason about diverse data in a general way

A new study shows LLMs represent different data types based on their underlying meaning and reason about data in their dominant language.

AI model deciphers the code in proteins that tells them where to go

Whitehead Institute and CSAIL researchers created a machine-learning model to predict and generate protein localization, with implications for understanding and remedying disease.

Can deep learning transform heart failure prevention?

A deep neural network called CHAIS may soon replace invasive procedures like catheterization as the new gold standard for monitoring heart health.

Validation technique could help scientists make more accurate forecasts

MIT researchers developed a new approach for assessing predictions with a spatial dimension, like forecasting weather or mapping air pollution.

Streamlining data collection for improved salmon population management

Assistant Professor Sara Beery is using automation to improve monitoring of migrating salmon in the Pacific Northwest.

Introducing the MIT Generative AI Impact Consortium

The consortium will bring researchers and industry together to focus on impact.

User-friendly system can help developers build more efficient simulations and AI models

By automatically generating code that leverages two types of data redundancy, the system saves bandwidth, memory, and computation.