Researchers at MIT's Fluid Interfaces group introduce InExChange and EmbER, mixed-reality systems that use haptic feedback and biometric sensors to exchange interoceptive signals such as breath and heart rate. The platforms are designed to foster neural synchronisation and embodied empathy, and reportedly improved collaborative problem-solving performance by over 20%.
Key points
- InExChange uses mixed-reality haptic actuators to share breathing signals, boosting reasoning performance by 24%.
- EmbER transmits heart rate variability and galvanic skin response via wearable actuators, enhancing empathy measures by 18%.
- Real-time biometric and EEG monitoring detects neural synchronisation, enabling the AI to reinforce optimal group-learning conditions.
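The source does not describe how the group quantifies neural synchronisation; a common proxy in hyperscanning research is the phase-locking value (PLV) between two participants' EEG channels. The sketch below is an illustrative assumption, not the researchers' pipeline: the function name, sampling rate, and synthetic signals are all hypothetical.

```python
import numpy as np
from scipy.signal import hilbert

def phase_locking_value(x, y):
    """Phase synchrony between two signals, in [0, 1]; 1 = perfectly phase-locked."""
    phase_x = np.angle(hilbert(x))  # instantaneous phase via the analytic signal
    phase_y = np.angle(hilbert(y))
    return np.abs(np.mean(np.exp(1j * (phase_x - phase_y))))

# Synthetic demo: two 10 Hz "alpha-band" traces standing in for two participants.
fs = 256                      # assumed sampling rate (Hz)
t = np.arange(0, 4, 1 / fs)
rng = np.random.default_rng(0)
a = np.sin(2 * np.pi * 10 * t) + 0.1 * rng.standard_normal(t.size)
b = np.sin(2 * np.pi * 10 * t + 0.5) + 0.1 * rng.standard_normal(t.size)   # constant phase offset
c = np.sin(2 * np.pi * 10 * t + 0.1 * np.cumsum(rng.standard_normal(t.size)))  # drifting phase

print(phase_locking_value(a, b))  # high: phases stay locked
print(phase_locking_value(a, c))  # lower: phase difference drifts
```

In a real-time system of this kind, a score like this computed over sliding windows could serve as the signal the AI uses to detect when a group is "in sync".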
Why it matters: By shifting from isolated content delivery to embodied social interaction, these AI systems redefine collaborative learning, offering scalable pathways to enhance empathy, problem-solving, and collective intelligence beyond conventional e-learning tools.
Q&A
- What is haptic feedback? Touch-based feedback, such as vibration or pressure, delivered by actuators; here, wearables render a partner's breathing or heartbeat on the skin.
- What does interoception mean? The perception of internal bodily signals such as breath and heartbeat, the signals these systems exchange between people.
- How does neural synchronisation aid learning? When group members' brain activity aligns, the researchers report better joint attention and collaboration; the systems detect this alignment via EEG and reinforce it.
- What metrics show the system's effectiveness? A 24% gain in reasoning performance (InExChange), an 18% gain in empathy measures (EmbER), and EEG-measured neural synchronisation.