Here’s a behind-the-scenes look at Atlas™, our non-invasive brain-computer interface, driving a wheelchair simply by thinking about the direction you want to move.
A future where people interact with machines and devices using their thoughts is closer than you think (and no, you don’t need implants to do any of this).
At @SynaptrixAI, we are building high-performance non-invasive BCIs that translate neural activity measured from the scalp into real-time control signals. Using advanced signal processing, new classes of deep learning models, and large-scale neural datasets, we are able to separate true cortical intent from noise and artifacts, enabling stable decoding of motor and other neural signals.
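To make the idea concrete, here is a minimal sketch of what "scalp signals in, intent out" can look like. It is not our actual pipeline: the sampling rate, channel count, band edges, intent labels, and the linear read-out (standing in for the deep models mentioned above) are all illustrative placeholders.

```python
# Illustrative sketch only (not SynaptrixAI's Atlas pipeline): decode a
# motor-intent class from a short window of scalp EEG using band-power
# features and a linear read-out. All constants are hypothetical.
import numpy as np
from scipy.signal import butter, filtfilt

FS = 250          # sampling rate in Hz (assumed)
CHANNELS = 8      # number of scalp electrodes (assumed)
INTENTS = ["left", "right", "forward", "rest"]  # hypothetical classes

def bandpass(window, low=8.0, high=30.0, fs=FS, order=4):
    """Keep the mu/beta band (8-30 Hz), where motor-related rhythms live."""
    b, a = butter(order, [low / (fs / 2), high / (fs / 2)], btype="band")
    return filtfilt(b, a, window, axis=-1)

def band_power_features(window):
    """Log band power per channel: a classic, compact EEG feature."""
    filtered = bandpass(window)
    return np.log(np.mean(filtered ** 2, axis=-1) + 1e-12)

def decode_intent(window, weights, bias):
    """Linear read-out from features to an intent class (a stand-in for the
    deep learning models described in the post)."""
    scores = band_power_features(window) @ weights + bias
    return INTENTS[int(np.argmax(scores))]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    window = rng.standard_normal((CHANNELS, FS))         # 1 s of fake EEG
    weights = rng.standard_normal((CHANNELS, len(INTENTS)))
    bias = np.zeros(len(INTENTS))
    print(decode_intent(window, weights, bias))
```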
In these early demonstrations, users are able to control a cursor and navigate interfaces using only their brain activity. The same underlying system can be extended to wheelchair navigation, assistive communication, prosthetics, and other programmable brain-driven applications.
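The wheelchair demo then comes down to mapping each decoded intent to a drive command. The sketch below shows one plausible way to do that; the DriveCommand type, the velocity values, and the confidence threshold are hypothetical and not part of any real API.

```python
# Illustrative sketch: map a decoded intent to a wheelchair drive command.
# Velocities and the confidence gate are hypothetical placeholders.
from dataclasses import dataclass

@dataclass
class DriveCommand:
    linear: float   # forward speed, m/s
    angular: float  # turn rate, rad/s

INTENT_TO_COMMAND = {
    "forward": DriveCommand(0.4, 0.0),
    "left":    DriveCommand(0.0, 0.6),
    "right":   DriveCommand(0.0, -0.6),
    "rest":    DriveCommand(0.0, 0.0),
}

def step(decoded_intent: str, confidence: float, threshold: float = 0.7) -> DriveCommand:
    """Act only on confident decodes; otherwise stop, a common safety
    choice for assistive devices."""
    if confidence < threshold:
        return DriveCommand(0.0, 0.0)
    return INTENT_TO_COMMAND.get(decoded_intent, DriveCommand(0.0, 0.0))

print(step("left", confidence=0.85))    # turns left
print(step("forward", confidence=0.4))  # low confidence: safe stop
```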
This is still early work, but the trajectory is clear. As models improve and datasets scale, non-invasive neural interfaces will become dramatically more capable, accessible, and deployable in the real world.
We’re excited to share the early progress!