New graphene sensors improve brain-machine interface

The development of a state-of-the-art graphene sensor has enabled an interface that can precisely control a robot using thought alone. The advance has implications not only for health care, but also for a number of other industries.

Brain-machine interfaces (BMIs) allow a person to control a device using their brain waves. As hands-free and voiceless interfaces, BMIs have great potential for use in robotics, bionic prostheses, and self-driving cars.

A BMI usually consists of three modules: an external sensory stimulus, a sensory interface, and a neural-signal processing block. Of these three, the sensory interface is critical because it detects the electrical activity generated by the outermost layer of the brain, the cerebral cortex, which is responsible for higher-level processes, including motor function.
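For readers who think in code, here is a minimal, purely illustrative sketch of that three-module structure in Python. The class and function names are assumptions made for illustration and do not come from the UTS system.

```python
from dataclasses import dataclass
from typing import Callable, Sequence

@dataclass
class BrainMachineInterface:
    """Toy model of the three BMI modules described above."""
    present_stimulus: Callable[[], None]         # external sensory stimulus (e.g. flickering AR squares)
    read_sensors: Callable[[], Sequence[float]]  # sensory interface: raw cortical signal from the electrodes
    decode: Callable[[Sequence[float]], str]     # neural-signal processing block: signal -> command

    def next_command(self) -> str:
        self.present_stimulus()
        return self.decode(self.read_sensors())

# Example wiring with stand-in components.
bmi = BrainMachineInterface(
    present_stimulus=lambda: None,
    read_sensors=lambda: [0.0] * 512,
    decode=lambda samples: "stop",
)
print(bmi.next_command())  # prints "stop"
```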

But it is the visual cortex, the part of the cerebral cortex that receives and processes information sent by the eyes, that is key to a BMI based on visual stimuli. The visual cortex sits at the very back of the brain, in the occipital lobe.

Brain waves are recorded using implantable or wearable sensors such as electroencephalography (EEG) electrodes. The problem with using EEG electrodes and other non-invasive biosensors on the back of the head is that this area is usually covered in hair.

Wet sensors rely on a conductive gel applied to the scalp and hair, but the gel can allow the sensors to shift when the person moves. Alternatively, dry sensors can be used, but they have their own problems: they are less conductive than wet sensors and, given the rounded shape of the head, they can struggle to maintain proper contact.

Researchers at the University of Technology Sydney (UTS) have addressed these problems by developing a dry biosensor containing graphene, a one-atom-thick layer of carbon atoms arranged in a hexagonal lattice that is 1,000 times thinner than a human hair and 200 times stronger than steel.

Graphene is an optimal material for creating dry biosensors, given its thinness and high electrical conductivity. It is also resistant to corrosion and sweat, making it ideal for use on the head.

The researchers found that combining graphene with silicon results in a more reliable dry sensor. The graphene layer on the sensors they developed is less than a nanometer thick.

“By using an advanced graphene material combined with silicon, we were able to overcome the challenges of corrosion, durability, and skin contact resistance to develop wearable dry sensors,” said Francesca Iacopi, co-author of the study.

The researchers experimented with different sensor patterns, including squares, hexagons, bars, and dots, and found that sensors with a hexagonal pattern produced the lowest skin-contact impedance. They then tested their new sensor in a BMI.

The hexagonal-patterned sensors are placed on the scalp at the back of the head to detect brain waves from the visual cortex, while the user wears an augmented reality (AR) lens that displays white squares. When the user concentrates on a particular square, their brain produces signals that are picked up by the biosensor, and a decoder then translates those signals into a command.
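The article does not describe how the decoder works, but visual-stimulus BMIs of this kind are commonly built on frequency-tagged flicker, where each on-screen target flickers at its own rate and the strongest frequency in the occipital EEG reveals which target the user is watching. The sketch below is a simple, hedged illustration of that idea; the flicker frequencies and command names are assumptions, and real decoders typically use more robust statistical methods.

```python
import numpy as np

# Hypothetical flicker frequencies (Hz) assigned to each white square; the
# actual stimulus design of the UTS system is not given in the article.
SQUARE_FREQUENCIES = {"forward": 7.0, "left": 9.0, "right": 11.0, "stop": 13.0}

def decode_command(eeg: np.ndarray, fs: float) -> str:
    """Return the command whose flicker frequency shows the strongest
    response in an occipital EEG segment (simple FFT-peak approach)."""
    spectrum = np.abs(np.fft.rfft(eeg))
    freqs = np.fft.rfftfreq(eeg.size, d=1.0 / fs)

    def band_power(f: float) -> float:
        # Sum spectral magnitude in a narrow band around the target frequency.
        mask = (freqs > f - 0.5) & (freqs < f + 0.5)
        return float(spectrum[mask].sum())

    return max(SQUARE_FREQUENCIES, key=lambda cmd: band_power(SQUARE_FREQUENCIES[cmd]))

# Demo: two seconds of synthetic "EEG" at 256 Hz containing a 9 Hz component.
fs = 256.0
t = np.arange(0, 2.0, 1.0 / fs)
eeg = 0.5 * np.sin(2 * np.pi * 9.0 * t) + 0.1 * np.random.randn(t.size)
print(decode_command(eeg, fs))  # expected: "left"
```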

[Image: Augmented reality visor worn by the user, with graphene sensors attached to the back of the scalp. Credit: University of Technology Sydney]

[Image: The augmented reality (AR) interface allows the user to issue commands simply by concentrating on a specific white block. Credit: University of Technology Sydney]

“Our technology can issue at least nine commands in two seconds,” said Chin-Teng Lin, co-author of the study. “This means we have nine different kinds of commands, and the operator can select one of those nine within that time period.”

Australian Army soldiers have field-tested the graphene BMI sensor, using it to control a four-legged robotic dog. The device allowed hands-free control of the robot with up to 94% accuracy.

“The hands-free, voice-free technology works outside the lab, anytime, anywhere,” Iacopi said. “It makes interfaces such as consoles, keyboards, touchscreens, and hand-gesture recognition redundant.”

However, the researchers do not consider this the final iteration of their design. Further research and testing are needed to balance the total available graphene area, the ability to accommodate hair, and the ability to maintain sensor contact with the scalp.

Still, it is a promising step toward a technology that could greatly benefit people with disabilities who use a wheelchair or prosthesis, and that could find wider applications in advanced manufacturing, defense, and aerospace.

The study was published in the journal ACS Applied Nano Materials.

Source: University of Technology Sydney.
