AI brain interface lets users move robot arm with pure thought

Source: interestingengineering
Author: @IntEngineering
Published: 9/1/2025
Researchers at the University of California, Los Angeles (UCLA) have developed a new wearable, noninvasive brain-computer interface (BCI) system that uses artificial intelligence (AI) to help individuals with physical disabilities control robotic arms or computer cursors through thought. Unlike previous BCI devices that required invasive neurosurgery, this system combines an electroencephalography (EEG) cap with a camera-based AI platform to decode brain signals and interpret user intent in real time. The AI acts as a “co-pilot,” enhancing the user’s control by guiding actions such as moving objects, thereby offering a safer and more practical alternative for people with paralysis or neurological disorders.
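The article does not specify the control law behind the AI "co-pilot," but shared-control systems of this kind are often described as blending the user's decoded command with an AI-suggested command. The sketch below illustrates that general idea in Python; the function name, the linear blending rule, and the example vectors are all illustrative assumptions, not details from the UCLA study.

```python
import numpy as np

def copilot_blend(user_cmd, ai_cmd, alpha=0.5):
    """Blend a user's EEG-decoded intent vector with an AI co-pilot's
    suggested command (e.g., a velocity toward an inferred target).

    alpha=0.0 -> full user control; alpha=1.0 -> full AI control.
    Illustrative only: the actual UCLA control scheme is not
    described in the article.
    """
    user_cmd = np.asarray(user_cmd, dtype=float)
    ai_cmd = np.asarray(ai_cmd, dtype=float)
    return (1.0 - alpha) * user_cmd + alpha * ai_cmd

# Hypothetical example: a noisy decoded cursor velocity is nudged
# toward the direction the AI believes leads to the intended target.
user = [0.8, 0.1]                      # decoded from EEG (assumed)
ai = [0.5, 0.5]                        # inferred by the camera/AI stage (assumed)
blended = copilot_blend(user, ai, alpha=0.4)
```

A tunable `alpha` is a common design choice in shared autonomy: it lets the system hand more authority to the AI when the user's signal is noisy, while preserving the user's sense of direct control.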
In trials involving four participants—including one paralyzed individual—the AI-assisted system enabled faster and more accurate task completion, such as moving a cursor to targets and manipulating blocks with a robotic arm. Notably, the paralyzed participant was able to complete a robotic-arm “pick-and-place” task in about six and a half minutes.
Tags
robotics, brain-computer-interface, artificial-intelligence, assistive-technology, wearable-technology, neural-engineering, robotic-arm-control