Enhancing human-system synergy through neuro-physiological computing
Start Date
21-2-2025 4:00 PM
End Date
21-2-2025 4:20 PM
Description
As automated and intelligent systems are increasingly integrated into work environments, they hold immense potential to enhance productivity and decision-making. However, these systems often fail to account for the nuanced and dynamic nature of human behavior, resulting in challenges such as misaligned human-machine collaboration and human-machine conflicts. Without the ability to sense and adapt to users' intentions and cognitive and emotional states, these systems risk undermining performance, safety, and well-being. To ensure effective human-system interaction, it is crucial for these systems to become more aware of and responsive to the humans they are designed to support.
This talk explores how neuro-physiological computing technologies can bridge this gap by enabling real-time human state sensing. By leveraging tools such as electrocardiography (ECG), electrodermal activity (EDA), eye tracking, electroencephalography (EEG), and functional near-infrared spectroscopy (fNIRS), we can capture rich physiological and behavioral data to infer user states such as attention, workload, intention, and even team dynamics. These insights empower intelligent systems to dynamically adapt to users' needs, potentially reducing errors, resolving conflicts, and fostering better collaboration.
This talk will showcase advancements in neuro-physiological computing for recognizing user intention, sensory conflicts, and teamwork states through case studies and applications. Highlights include an EEG-based system that detects visual-vestibular conflicts to assist pilots with spatial disorientation, and an fNIRS-based solution that monitors neural synchrony in teamwork for adaptive training in aviation and healthcare. These examples demonstrate how integrating human sensing technologies into intelligent systems can enhance performance, safety, and well-being in an automated world.
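As a rough illustration of the neural-synchrony idea mentioned above, the sketch below scores coordination between two simulated fNIRS channels (one per team member) with a Pearson correlation. This is a deliberately simplified stand-in: the signals are synthetic, and published inter-brain synchrony work typically uses more robust measures such as wavelet transform coherence rather than raw correlation.

```python
import numpy as np

def synchrony(sig_a, sig_b):
    """Pearson correlation between two channel time series -- a simple
    stand-in for inter-brain synchrony measures used in hyperscanning."""
    a = np.asarray(sig_a, dtype=float)
    b = np.asarray(sig_b, dtype=float)
    return float(np.corrcoef(a, b)[0, 1])

# Synthetic example: two team members whose hemodynamic signals share
# a slow task-related component plus independent measurement noise.
rng = np.random.default_rng(0)
t = np.linspace(0, 60, 600)              # 60 s sampled at 10 Hz
shared = np.sin(2 * np.pi * 0.1 * t)     # shared task-related activity
member_a = shared + 0.3 * rng.standard_normal(t.size)
member_b = shared + 0.3 * rng.standard_normal(t.size)
print(synchrony(member_a, member_b))     # well above 0 for coordinated signals
```

An adaptive training system could track such a synchrony score over time and trigger feedback or adjust task difficulty when team coordination drops.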
Speaker
Prof XU Jie
Associate Professor, Department of Psychology, Lingnan University
Jie (Jay) Xu, Ph.D., is an accomplished expert in human factors engineering and human-computer interaction. He currently serves as an Associate Professor in the Department of Psychology at Lingnan University and an Adjunct Associate Professor in the Department of Anesthesiology at Vanderbilt University. Dr. Xu's research focuses on human-AI interaction, with an emphasis on exploring how humans and AI agents understand and adapt to each other and designing intelligent systems that promote performance, safety, and well-being. His recent work includes investigating social perception and interactive behavior in human-AI interaction, designing transparent AI systems, and developing algorithms for AI agents to recognize human behavioral intentions and team dynamics. His expertise extends to user experience design for consumer products and human-system integration for complex systems such as healthcare and aviation.
Dr. Xu has published over 40 scholarly articles in academic journals and conferences. He has served as the principal investigator for eight extramurally funded research projects supported by prominent organizations such as the National Natural Science Foundation of China, Aeronautical Science Fund of China, and Huawei Technologies. Additionally, he serves as a committee member of the Committee on Engineering Psychology, Chinese Psychological Society, and a member of the editorial board of the journal Ergonomics.
Document Type
Presentation
Recommended Citation
Xu, J. J. (2025, February 21). Enhancing human-system synergy through neuro-physiological computing. Presentation at the International Conference and Workshop on Health and Well-being in the Digital Era, Lingnan University, Hong Kong.