Envisioning a future where health care tech leaves some behind
The winning essay of the Envisioning the Future of Computing Prize puts health care disparities at the forefront.
SketchAgent, a drawing system developed by MIT CSAIL researchers, sketches concepts stroke by stroke, teaching language models to express ideas visually on their own and to collaborate with humans.
PhD student Sarah Alnegheimish wants to make machine learning systems accessible.
Words like “no” and “not” can cause a popular class of AI models to fail unexpectedly in high-stakes settings, such as medical diagnosis.
Ukrainian students and collaborators provide high-quality translations of MIT OpenCourseWare educational resources.
MAD Fellow Alexander Htet Kyaw connects humans, machines, and the physical world using AI and augmented reality.
TactStyle, a system developed by CSAIL researchers, uses image prompts to replicate both the visual appearance and tactile properties of 3D models.
The MIT Festival of Learning sparked discussions on better integrating a sense of purpose and social responsibility into hands-on education.
“InteRecon” enables users to capture items in a mobile app and reconstruct their interactive features in mixed reality. The tool could assist in education, medical environments, museums, and more.
Professor of media technology honored for research in human-computer interaction that is considered both fundamental and influential.
The Tactile Vega-Lite system, developed at MIT CSAIL, streamlines the tactile chart design process, which could help educators efficiently create these graphics and aid designers in making precise changes.
The “Xstrings” method enables users to produce cable-driven objects, automating the assembly of bionic robots, sculptures, and dynamic fashion designs.
The system uses reconfigurable electromechanical building blocks to create structural electronics.
New research could allow a person to correct a robot’s actions in real time, using the kind of feedback they’d give another human.
The consortium will bring researchers and industry together to focus on impact.