EMPATHIC COMPUTING
Creating Shared Understanding
Mark Billinghurst
mark.billinghurst@unisa.edu.au
August 8th, 2025
The Perfect Construction Project …
In Reality..
Teleconferencing Tools Today
Limitations with Current Technology
• Lack of spatial cues
  • Person blends with background
• Poor communication cues
  • Limited gaze, gesture, non-verbal communication
• Introduction of artificial seams between physical/digital
  • Separation of task/communication space
CSCW Matrix (Johansen, 1988) – collaboration classified by same/different time and same/different place
Key Problem
How can you create shared
understanding between people in
different places or at different times?
OPTION 1
Put yourself in another person’s body
Smart Glass for Remote Collaboration
• Camera + Processing + AR Display + Connectivity
• First-person ego-vision collaboration
AR View / Remote Expert View
Empathy Glasses
• Combines eye-tracking, a display, and facial expression sensing
• Implicit cues – eye gaze, facial expression
Hardware: Pupil Labs eye tracker + Epson BT-200 + AffectiveWear
Masai, K., Sugimoto, M., Kunze, K., & Billinghurst, M. (2016, May). Empathy Glasses. In Proceedings of the
34th Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. ACM.
Remote Collaboration
• Eye gaze pointer and remote pointing (see the overlay sketch below)
• Face expression display
• Implicit cues for remote collaboration
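As a rough illustration of how an eye-gaze pointer can be overlaid on the first-person video streamed to the remote expert, here is a minimal sketch, assuming normalized gaze coordinates from the eye tracker and an OpenCV-style BGR frame; the function name and capture call are hypothetical, not the Empathy Glasses implementation.

```python
import cv2  # OpenCV, used here only to draw the overlay

def draw_gaze_pointer(frame, gaze_norm, color=(0, 255, 0)):
    """Overlay a gaze pointer on a BGR video frame before streaming it.

    frame     : numpy array (H, W, 3) from the head-worn camera
    gaze_norm : (x, y) gaze point normalized to [0, 1] by the eye tracker
    """
    h, w = frame.shape[:2]
    x, y = int(gaze_norm[0] * w), int(gaze_norm[1] * h)
    cv2.circle(frame, (x, y), 12, color, 3)  # 12 px radius, 3 px line width
    return frame

# frame = capture_headworn_frame()               # hypothetical capture call
# frame = draw_gaze_pointer(frame, (0.42, 0.55))  # then stream to the remote expert
```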
Shared Sphere – 360 Video Sharing
• Host user shares a live 360 video with the guest user (see the direction-to-pixel sketch below)
Lee, G. A., Teo, T., Kim, S., & Billinghurst, M. (2017). Mixed reality collaboration through sharing a
live panorama. In SIGGRAPH Asia 2017 Mobile Graphics & Interactive Applications (pp. 1-4).
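To make the sharing mechanism concrete, the sketch below maps a 3D view direction to pixel coordinates in an equirectangular 360 frame, which is one way a guest's view or gaze direction could be indicated on the host's live panorama. The coordinate convention (y-up, z-forward), frame size, and function name are assumptions, not the Shared Sphere implementation.

```python
import math

def direction_to_equirect(d, width, height):
    """Map a unit view direction to pixel coordinates in an
    equirectangular (360 video) frame.

    d      : (x, y, z) unit vector, assumed y-up / z-forward convention
    width  : frame width in pixels
    height : frame height in pixels
    """
    x, y, z = d
    lon = math.atan2(x, z)                      # longitude in [-pi, pi]
    lat = math.asin(max(-1.0, min(1.0, y)))     # latitude in [-pi/2, pi/2]
    u = (lon / (2 * math.pi) + 0.5) * width
    v = (0.5 - lat / math.pi) * height
    return int(u) % width, int(v)

# Example: mark where the guest is looking on the host's 4K 360 frame
# u, v = direction_to_equirect((0.0, 0.0, 1.0), 3840, 1920)  # forward -> frame centre
```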
3D Live Scene Capture
• Use a cluster of RGBD sensors
• Fuse depth data into a single 3D point cloud (see the back-projection sketch below)
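A minimal sketch of the fusion step, assuming each RGBD sensor has known pinhole intrinsics and a calibrated sensor-to-world pose: each depth image is back-projected to 3D and transformed into a common frame before the clouds are concatenated. The helper name and per-sensor data layout are illustrative; the actual capture pipeline in the cited work is not shown here.

```python
import numpy as np

def depth_to_world_points(depth, fx, fy, cx, cy, cam_to_world):
    """Back-project a depth image (metres) to 3D points in the world frame.

    depth          : (H, W) array of depth values
    fx, fy, cx, cy : pinhole intrinsics of this RGBD sensor
    cam_to_world   : (4, 4) extrinsic pose of the sensor
    """
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    pts_cam = np.stack([x, y, z, np.ones_like(z)], axis=-1).reshape(-1, 4)
    pts_cam = pts_cam[z.reshape(-1) > 0]          # drop invalid (zero-depth) pixels
    return (cam_to_world @ pts_cam.T).T[:, :3]

# Fuse clouds from a cluster of calibrated sensors into one point cloud
# fused = np.concatenate([depth_to_world_points(d, *K, T)
#                         for d, K, T in sensor_frames])   # hypothetical per-sensor data
```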
Live 3D Scene Capture and Sharing
Scene Reconstruction / Remote Expert / Local Worker
Bai, H., …& Billinghurst, M. (2020). A user study on mixed reality remote collaboration with eye gaze
and hand gesture sharing. In Proceedings of the 2020 CHI conference (pp. 1-13).
AR View / Remote Expert View
View Sharing Evolution
• Increased immersion
• Improved scene understanding
• Better collaboration
2D → 360° → 3D
OPTION 2
Add additional communication cues
Remote Communication with Avatars
• Using AR/VR to share communication cues
• Gaze, gesture, head pose, body position (see the cue-message sketch below)
• Sharing same environment
• Virtual copy of real world
• Collaboration between AR/VR
• VR user appears in AR user’s space
Piumsomboon, T., Dey, A., Ens, B., Lee, G., & Billinghurst, M. (2019). The effects of sharing awareness cues
in collaborative mixed reality. Frontiers in Robotics and AI, 6, 5.
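As an illustration of what such shared awareness cues might look like on the wire, here is a small, hypothetical per-frame message structure; the field names and JSON encoding are assumptions for exposition, not the format used in the cited study.

```python
import json
from dataclasses import dataclass, asdict

@dataclass
class AwarenessCueUpdate:
    """One per-frame awareness-cue message shared between AR and VR users."""
    user_id: str
    timestamp_ms: int
    head_position: tuple      # (x, y, z) in the shared world frame
    head_rotation: tuple      # quaternion (x, y, z, w)
    gaze_direction: tuple     # unit vector from the eye tracker
    fov_degrees: float        # field-of-view frustum cue
    hand_joints: list         # flattened joint positions from the hand tracker

update = AwarenessCueUpdate(
    user_id="vr_expert",
    timestamp_ms=1234567,
    head_position=(0.1, 1.6, -0.4),
    head_rotation=(0.0, 0.0, 0.0, 1.0),
    gaze_direction=(0.0, -0.1, 0.99),
    fov_degrees=90.0,
    hand_joints=[],
)
payload = json.dumps(asdict(update))   # serialise for the network layer
```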
Sharing: Virtual Communication Cues
Sharing Virtual Communication Cues
• AR/VR displays
• Gesture input (Leap Motion)
• Room scale tracking
• Conditions
• Baseline, FoV, Head-gaze, Eye-gaze
Results
• Predictions
• Eye/Head pointing better than no cues
• Eye/head pointing could reduce need for pointing
• Results
• No difference in task completion time
• Head-gaze/eye-gaze gave a greater mutual gaze rate
• Head-gaze had greater ease of use than baseline
• All cues provide higher co-presence than baseline
• Pointing gestures reduced in cue conditions
• But
• No difference between head-gaze and eye-gaze
Sharing: Communication Cues
• What happens when you can’t see your colleague/agent?
Piumsomboon, T…. & Billinghurst, M. (2018). Mini-me: An adaptive avatar for mixed reality remote
collaboration. In Proceedings of the 2018 CHI conference (pp. 1-13).
Collaborating / Collaborator out of View
Mini-Me Communication Cues in MR
• When the collaborator is out of sight, a Mini-Me avatar appears
• Miniature avatar in the real world
• Mini-Me points to shared objects and shows communication cues
• Redirected gaze, gestures
Results from User Evaluation
• Collaboration between user in AR, expert in VR
• HoloLens, HTC Vive
• Two tasks
• Asymmetric, symmetric collaboration
• Significant performance improvement
• 20% faster with Mini-Me
• Social Presence
• Higher sense of Presence
• Users preferred Mini-Me
• People felt the task was easier to complete
• 60-75% preference
“I feel like I am talking to my partner”
Multi-Scale Collaboration
• Changing the user’s virtual body scale
On the Shoulder of the Giant..
Piumsomboon, T., … & Billinghurst, M. (2019). On the shoulder of the giant: A multi-scale mixed reality collaboration with 360 video sharing and tangible interaction. In Proceedings of the 2019 CHI conference (pp. 1-17).
OPTION 3
Record and play back actions at a different time
User could move along the Reality-Virtuality Continuum
Time Travellers - Motivation
• Factory scenario: an expert worker's process spans the store room and the workbench
Cho, H., Yuan, B., Hart, J. D., Chang, Z., Cao, J., Chang, E., & Billinghurst, M. (2023, October). Time Travellers: An Asynchronous Cross Reality Collaborative System. In 2023 IEEE International Symposium on Mixed and Augmented Reality Adjunct (ISMAR-Adjunct) (pp. 848-853). IEEE.
Design Goals
• Recording the user's actions (AR/VR) with MR playback
  • MR headset (Magic Leap 2), real object tracking
• WIM (World in Miniature) and VR view manipulation
• Visual annotation (AR mode and VR mode)
• Seamless transition: AR → VR and VR → AR
  • Avatar, virtual replica
Time Travellers Overview
• Step 1: Recording an expert's standard process
  • MR headset (Magic Leap 2), real object tracking
  • Recording data: 3D workspace, avatar and object spatial data
• Step 2: Reviewing the recorded process through the hybrid cross-reality playback system
  • Visual annotation, avatar interaction, timeline manipulation (1st and 2nd users' views, AR mode / VR mode)
• A cross-reality asynchronous collaborative system (see the data sketch below)
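A rough sketch of how such an asynchronous recording could be structured and scrubbed during playback: each frame stores timestamped avatar and tracked-object poses plus annotations, and the reviewer's timeline slider retrieves the matching frame. The class and field names are assumptions for illustration, not the Time Travellers implementation.

```python
from dataclasses import dataclass, field
from bisect import bisect_left

@dataclass
class RecordedFrame:
    """Spatial data captured at one moment of the expert's demonstration."""
    t: float                  # seconds since recording started
    avatar_pose: tuple        # head position + rotation of the 1st user
    object_poses: dict        # tracked real objects: name -> (position, rotation)
    annotations: list         # visual annotations visible at this time

@dataclass
class Recording:
    frames: list = field(default_factory=list)

    def add(self, frame: RecordedFrame):
        self.frames.append(frame)           # assumes frames arrive in time order

    def at_time(self, t: float) -> RecordedFrame:
        """Timeline manipulation: return the first frame at or after time t,
        clamped to the last frame."""
        if not self.frames:
            raise ValueError("empty recording")
        times = [f.t for f in self.frames]
        i = min(bisect_left(times, t), len(self.frames) - 1)
        return self.frames[i]

# Playback loop sketch: the 2nd user scrubs the timeline in AR or VR mode
# frame = recording.at_time(slider_value)
# render_avatar(frame.avatar_pose); render_objects(frame.object_poses)
```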
EMPATHIC COMPUTING
Modern Communication Technology Trends
1. Improved Content Capture
• Move from sharing faces to sharing places
2. Increased Network Bandwidth
• Sharing natural communication cues
3. Implicit Understanding
• Recognizing behaviour and emotion
Empathic Computing = Natural Collaboration + Experience Capture + Implicit Understanding
“Empathy is Seeing with the Eyes of another, Listening with the Ears of another, and Feeling with the Heart of another.”
Alfred Adler
Empathic Computing Research Focus
Can we develop systems that allow
us to share what we are seeing,
hearing and feeling with others?
Sharing Experiences
Sharing Heart Rate in VR
• HTC Vive HMD
• Heart rate sensor
• Empatica E4
Dey, A., Piumsomboon, T., Lee, Y., & Billinghurst, M. (2017). Effects of
sharing physiological states of players in a collaborative virtual reality
gameplay. In Proceedings of CHI 2017 (pp. 4045-4056).
VR Environments
• Butterfly World: calm scene, collect butterflies
• Zombie Attack: scary scene, fighting zombies
Experiment Design
• Key Question
• What is the impact of sharing heart rate feedback?
• Two Independent Variables
• Game Experience (Zombies vs butterflies)
• Heart Rate Feedback (On/Off)
• Measures
• Heart rate (player)
• PANAS Scale (Emotion)
• Inclusion of Other in the Self (IOS) scale (Connection)
Results
• Significant difference in Heart Rate
• Sharing HR improves positive affect (PANAS)
• Sharing HR created a subjective connection between collaborators
[Charts: Heart Rate Data / Likert Data]
Technology Trends
• Advanced displays
• Wide FOV, high resolution, lightweight
• Real time space capture
• 3D scanning, stitching, Gaussian splats
• Natural gesture interaction
• Hand tracking, pose recognition
• Robust eye-tracking
• Gaze points, focus depth
• Emotion sensing/sharing
• Physiological sensing, emotion mapping
Galea: Multiple Physiological Sensors into HMD
• Incorporates a range of sensors on the HMD faceplate and over the head
• EMG – muscle movement
• EOG – Eye movement
• EEG – Brain activity
• EDA, PPG – Heart rate
• Measure physiological cues (see the PPG sketch below)
• Brain activity
• Heart rate
• Eye gaze
• Show user state
• Cognitive load
• Attention
Showing Cognitive Load in Collaboration
Sasikumar, P., ... & Billinghurst, M. (2024). A user study on sharing physiological cues in VR assembly tasks. In 2024 IEEE Conference on Virtual Reality and 3D User Interfaces (VR) (pp. 765-773). IEEE.
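As one small example of turning raw physiological data into a cue that can be shared, the sketch below estimates heart rate from a PPG waveform by simple peak detection; the sampling rate and thresholds are assumed values, and the pipeline in the cited study may differ.

```python
import numpy as np
from scipy.signal import find_peaks

def heart_rate_bpm(ppg, fs):
    """Estimate heart rate (beats per minute) from a PPG signal.

    ppg : 1D array of photoplethysmogram samples
    fs  : sampling rate in Hz (assumed, e.g. 64 Hz for a wrist sensor)
    """
    ppg = ppg - np.mean(ppg)                       # remove DC offset
    # Require at least ~0.3 s between peaks (caps the estimate at ~200 bpm)
    peaks, _ = find_peaks(ppg, distance=int(0.3 * fs),
                          prominence=np.std(ppg) * 0.5)
    if len(peaks) < 2:
        return None
    intervals = np.diff(peaks) / fs                # inter-beat intervals in seconds
    return 60.0 / np.mean(intervals)

# Example over a 10 s window of samples from the headset's PPG sensor
# bpm = heart_rate_bpm(ppg_window, fs=64)
```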
Demo
Empathic Shared MR Experiences: NeuralDrum
• Using brain synchronicity to increase connection
• Collaborative VR drumming experience
• Measure brain activity using 3 EEG electrodes
• Use PLV (phase-locking value) to calculate synchronization (see the sketch below)
• More synchronization increases graphics effects/immersion
Pai, Y. S., Hajika, R., Gupta, K., Sasikumar, P., & Billinghurst, M. (2020). NeuralDrum: Perceiving Brain
Synchronicity in XR Drumming. In SIGGRAPH Asia 2020 Technical Communications (pp. 1-4).
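For reference, the phase-locking value (PLV) between two EEG channels can be computed from their instantaneous phases; the minimal sketch below band-passes one window of each signal and applies the Hilbert transform. The sampling rate, frequency band, and window length are assumptions, not the NeuralDrum parameters.

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def plv(x, y, fs, band=(8.0, 13.0)):
    """Phase-locking value between two EEG channels over one window.

    x, y : 1D arrays of equal length (one window of EEG samples)
    fs   : sampling rate in Hz
    band : frequency band of interest (alpha band assumed here)
    """
    b, a = butter(4, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="band")
    phase_x = np.angle(hilbert(filtfilt(b, a, x)))
    phase_y = np.angle(hilbert(filtfilt(b, a, y)))
    return np.abs(np.mean(np.exp(1j * (phase_x - phase_y))))

# A PLV near 1 means the two players' signals are strongly phase-locked;
# that value could then drive the shared graphics effects.
# sync = plv(eeg_player1, eeg_player2, fs=250)
```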
Set Up
• HTC Vive HMD
• OpenBCI
• 3 EEG electrodes
Results
"It’s quite interesting, I actually felt like my
body was exchanged with my partner."
Poor Player / Good Player
• Advanced displays
• Real time space capture
• Natural gesture interaction
• Robust eye-tracking
• Emotion sensing/sharing
→ Empathic Tele-Existence
Empathic Tele-Existence
• Move from Observer to Participant
• Explicit to Implicit communication
• Experiential collaboration – doing together
CONCLUSIONS
Conclusions
• Empathic Computing
• Creating systems that increase understanding
• Key Aspects of Empathic Computing
• Scene capture
• Emotion recognition
• Sharing communication cues
• Many Opportunities for Research
• Sharing emotion, cognitive cues
• Empathic MR Agents, Adding Touch
• Ethics, societal considerations
• Etc..
Looking for Collaborators …
• World Class: #2 in World for AR, #19 for VR
• Largest AR/VR Research Centre in Australia
• 140 people in 8 research groups
• Multidisciplinary mix of art, design, engineering, psychology..
• Student opportunities, Virtual Intern Program, MS, PhD
• Joint research grants, Discovery, CRC, COE, Linkage, etc
Empathic Computing Journal
• Looking for submissions
• Any topic relevant to Empathic Computing
• Open Access, currently free to publish
Submit intent at https://forms.gle/XXHkWh5UVQazbuTx7
www.empathiccomputing.org
@marknb00
mark.billinghurst@unisa.edu.au
