Modified virtual reality tech can measure brain activity

A team of researchers from The University of Texas at Austin has modified a commercial virtual reality headset to measure brain activity, making it possible to study how people react to hints, stressors, and other stimuli.

By integrating a noninvasive electroencephalogram (EEG) sensor into a Meta VR headset, the research team has developed a comfortable, wearable device suited to long-term use. The sensor captures the brain’s electrical signals while the wearer interacts with immersive virtual environments.

The innovation has a wide range of potential applications, from helping people with anxiety to measuring the attention and mental stress of pilots in flight simulators. It could also let people experience the world through a robot’s eyes.

Nanshu Lu, a professor in the Cockrell School of Engineering’s Department of Aerospace Engineering and Engineering Mechanics who led the research, emphasized the heightened immersion of virtual reality and noted that the team’s technology yields better measurements of how the brain responds within such environments.

Devices that combine VR and EEG sensors already exist commercially, but the researchers note that they are expensive and uncomfortable, which limits how long users can wear them and what they can be used for.

To address these challenges, the team designed soft, conductive, spongy electrodes that avoid the shortcomings of traditional electrodes. The modified headset integrates these electrodes into the top strap and forehead pad, connects them via a flexible circuit with conductive traces similar to electronic tattoos, and attaches an EEG recording device to the rear of the headset.

The technology ties into a larger research initiative at UT Austin building a robot delivery network, which will also serve as an extensive study of human-robot interaction. The EEG-equipped VR headsets will let observers experience events from a robot’s perspective while the sensor measures the cognitive load of that observation over long periods.

To validate the VR EEG headset, the researchers developed a driving simulation game in collaboration with José del R. Millán, an expert in brain-machine interfaces. In the game, users respond to turn commands by pressing a button while the EEG records their brain activity, which is used to gauge their attention level.
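The article does not say how attention is derived from the recorded signals. As a purely illustrative sketch, assuming Python with NumPy and SciPy, the snippet below computes a beta/theta band-power ratio (a common but debated attentiveness proxy) on a synthetic single-channel EEG trace and matches simulated turn commands to button presses to estimate reaction times. The sampling rate, frequency bands, and every function name here are hypothetical and do not represent the team’s published method.

```python
# Hypothetical sketch: estimating attention from one EEG channel and button
# presses during a simulated driving task. Sampling rate, bands, and the
# beta/theta ratio as an "attention" proxy are illustrative assumptions only.

import numpy as np
from scipy.signal import welch

FS = 250  # assumed EEG sampling rate in Hz


def band_power(signal, fs, fmin, fmax):
    """Average power spectral density of `signal` within [fmin, fmax] Hz."""
    freqs, psd = welch(signal, fs=fs, nperseg=fs * 2)
    mask = (freqs >= fmin) & (freqs <= fmax)
    return psd[mask].mean()


def attention_index(eeg_window, fs=FS):
    """Crude attention proxy: beta-band power divided by theta-band power.
    Higher values are often read as more alert; the metric is contested and
    used here only for illustration."""
    theta = band_power(eeg_window, fs, 4.0, 8.0)
    beta = band_power(eeg_window, fs, 13.0, 30.0)
    return beta / theta


def reaction_times(command_times, press_times, max_lag=2.0):
    """Match each turn command to the first button press within `max_lag` seconds."""
    rts = []
    for t_cmd in command_times:
        hits = press_times[(press_times >= t_cmd) & (press_times <= t_cmd + max_lag)]
        rts.append(hits[0] - t_cmd if hits.size else np.nan)
    return np.array(rts)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # 10 s of synthetic single-channel "EEG": noise plus a 20 Hz (beta) component.
    t = np.arange(0, 10, 1 / FS)
    eeg = rng.normal(0, 1, t.size) + 0.5 * np.sin(2 * np.pi * 20 * t)

    # Synthetic task events: turn commands every 2 s, presses ~0.6 s later.
    commands = np.arange(1.0, 9.0, 2.0)
    presses = commands + rng.normal(0.6, 0.1, commands.size)

    print("attention index:", round(attention_index(eeg), 2))
    print("mean reaction time (s):", round(np.nanmean(reaction_times(commands, presses)), 2))
```

In a real setup, the EEG window would come from the headset’s recording device and the event timestamps from the driving game’s log rather than synthetic data.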

The researchers have initiated preliminary patent procedures for their EEG technology and are open to collaborations with VR companies to integrate their innovation directly into VR headsets.

The research team includes experts from the Departments of Electrical and Computer Engineering, Aerospace Engineering and Engineering Mechanics, Mechanical Engineering, and Biomedical Engineering, as well as from Artue Associates Inc. in South Korea.
