Meta Partners With CMU to Develop Wearable Sensing Tech to Make Computer-Based Tasks, Gaming Accessible to More People

Carnegie Mellon University (CMU) and Meta have announced a joint initiative to enhance accessibility in digital environments through wearable sensing technology.

This collaboration focuses on using Meta's surface electromyography (sEMG) wristband prototype to enable individuals with varying motor abilities to engage in computer-based tasks and immersive gaming experiences.


Meta's sEMG Technology

Meta's sEMG technology involves placing sensors on the wrist to detect electrical signals generated by muscle movements. These signals are then translated into commands that control digital devices, providing an alternative to traditional input methods like keyboards or joysticks.
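To illustrate the general idea of turning muscle signals into input commands, here is a minimal, hypothetical Python sketch: it windows a simulated wrist-signal stream, computes a root-mean-square amplitude for each window, and emits a "click" whenever the amplitude crosses a threshold. The sampling rate, threshold, feature choice, and command names are assumptions made for illustration and do not describe Meta's actual sEMG wristband or its software.

import numpy as np

# Conceptual sketch only: maps windows of simulated wrist-sensor samples to a
# discrete command. All numeric values and command names are illustrative
# assumptions, not details of Meta's sEMG system.

SAMPLE_RATE_HZ = 1000          # assumed sampling rate for this sketch
WINDOW_SIZE = 200              # 200 ms analysis window
CLICK_THRESHOLD = 0.15         # assumed activation threshold (arbitrary units)

def rms(window: np.ndarray) -> float:
    """Root-mean-square amplitude of one window of samples."""
    return float(np.sqrt(np.mean(np.square(window))))

def decode_command(window: np.ndarray) -> str:
    """Map a window of muscle-signal samples to a simple UI command."""
    return "click" if rms(window) > CLICK_THRESHOLD else "idle"

# Simulated stream: mostly low-level noise with one burst of muscle activity.
rng = np.random.default_rng(0)
signal = rng.normal(0.0, 0.05, 5 * WINDOW_SIZE)
signal[2 * WINDOW_SIZE:3 * WINDOW_SIZE] += rng.normal(0.0, 0.4, WINDOW_SIZE)

for start in range(0, len(signal), WINDOW_SIZE):
    window = signal[start:start + WINDOW_SIZE]
    print(f"t={start / SAMPLE_RATE_HZ:.1f}s -> {decode_command(window)}")

In this toy version, only the window containing the simulated burst of muscle activity produces a "click"; a real system would use far richer signal features and learned decoding rather than a single threshold.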

Meta has previously demonstrated the potential of its technology to revolutionize user interfaces, and ongoing research aims to verify its effectiveness across diverse user demographics.

Douglas Weber, a professor in CMU's Department of Mechanical Engineering and the Neuroscience Institute, has conducted fundamental research revealing that individuals with complete hand paralysis can generate muscle signals in their forearms.

Though insufficient for physical movement, these signals show promise in facilitating interactions with computers and digital devices for people with spinal cord injuries.

Weber's team intends to build upon these findings in collaboration with Meta, investigating how sEMG technology can be integrated into daily computing and mixed-reality interactions.

By using muscle signals as inputs, the project seeks to enhance digital accessibility for individuals with physical disabilities, potentially transforming their engagement with technology.

Muscle-Generated Signals

This research explores the feasibility of using muscle-generated signals in place of physical movements. If successful, this approach could significantly broaden the accessibility of computers and digital devices for individuals with physical limitations.

Approved by CMU's Institutional Review Board, the project has participants play adaptive mini-games to gauge how well they can use the technology. Tailored mixed-reality environments and activities will then be developed to accommodate individual capabilities and preferences, fostering an inclusive digital environment.

Within virtual settings, individuals with varying degrees of physical mobility can take part in virtual activities using signals from their own motor systems. Researchers aim to build simulated environments on mixed-reality platforms where users interact with virtual elements and other users, regardless of their motor capabilities.

The collaboration between CMU and Meta underscores a shared commitment to advancing accessible human-computer interaction technologies. By integrating Meta's sEMG technology into research and practical applications, both entities aim to empower individuals with disabilities to participate fully in digital and virtual realms.

"In the digital world, people with full or limited physical ability can be empowered to act virtually, using signals from their motor system," Rumaldo said in a press statement. "In the case of mixed reality technology, we are creating simulated environments where users interact with objects and other users, regardless of motor abilities."
