Embedded Sensing & Computational Behaviour Science
Advancing embedded sensing and AI to understand human behaviour
Prof. Dr. Daniel Roggen
Sensor Technology Research Centre, University of Sussex, UK
Computational Behaviour Analytics
Activities
Location
Cognition, emotions
Social interactions
«Context»: "You went to the supermarket and enjoyed a coffee with Lisa"
Supporting behaviour change
Stairs? Lift?
Activity recognition pipeline
Smartphone-based transportation / locomotion recognition
Smartphone recognition of transportation & locomotion
Eight classes: Still, Walking, Running, Cycling, Car, Bus, Train, Subway
SHL dataset collection process (www.shl-dataset.org): Concept → Ethical approvals → HW & SW → Recruitment → Annotation++ → Quality control → Daily protocol → HW & SW checklist → Synchronisation → Curation → QA → Release
Timeline (Nov ’16 – Dec ’17): project start (Nov ’16) → setup → data collection → curation → dataset ready (Dec ’17); intermediate milestones Feb ’17, Apr ’17, Aug ’17
• Smartphone app
• Ethical approvals
• Concept
• Smartphone app
• Annotation pipeline
• Website
• Recruitment
• Daily protocol
• Quality control
• Dataset analysis
• Baseline performance
• Machine learning
• Dataset curation
• Dataset analysis
• Management
• Computer vision
• Multimodal fusion
• Mobile implementation
Contributors: F.J. Ordonez Morales, M. Ciliberto, H. Gjoreski, L. Wang, S. Richoz, D. Roggen
Transportation & Locomotion Recognition with the SHL dataset
www.shl-dataset.org
[1] Gjoreski et al., The University of Sussex-Huawei locomotion and transportation dataset for multimodal analytics with mobile devices, IEEE Access, 2018
• 2812 hours, 17562 km
• 3 users
• 8 primary annotations
• 23 secondary annotations
• 11 sensor modalities
• 30 data channels
• Smart assistance
• Mobility recognition
• City-scale optimisation
• Well-being
• Road condition
• Traffic condition
• Novel localisation
• Multimodal sensor fusion
Applications
Activity recognition pipeline
• 5 s / frame
• 3.95 million frames
[1] Roggen et al., The adARC pattern analysis architecture for adaptive human activity recognition systems, J. Ambient Intelligence and Humanized Computing 4(2), 2013
[2] Roggen et al., Opportunistic human activity and context recognition, IEEE Computer 46(2), 2013
[3] L. Wang et al., Enabling Reproducible Research in Sensor-Based Transportation Mode Recognition with the Sussex-Huawei Dataset. IEEE Access, 2019
• Feature extraction: 2724 features/frame
• Feature selection: 147 features/frame
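As an illustration of the frame-based pipeline above (5 s frames, per-frame features, then feature selection), here is a minimal Python sketch. It is not the SHL baseline code of [3]: the sampling rate, the feature set and the function names are assumptions for illustration only.

```python
import numpy as np
from scipy import stats

FRAME_S = 5      # 5 s frames, as in the pipeline above
FS_HZ = 100      # assumed sampling rate (illustrative)

def frames(x, fs=FS_HZ, frame_s=FRAME_S):
    """Split a 1-D sensor channel into non-overlapping frames."""
    n = fs * frame_s
    usable = (len(x) // n) * n
    return x[:usable].reshape(-1, n)

def frame_features(f):
    """A few illustrative time- and frequency-domain features per frame."""
    spec = np.abs(np.fft.rfft(f))
    return np.array([
        f.mean(), f.std(), stats.skew(f), stats.kurtosis(f),
        np.sqrt(np.mean(f ** 2)),   # RMS
        float(spec.argmax()),       # dominant frequency bin
        float((spec ** 2).sum()),   # spectral energy
    ])

# Accelerometer magnitude (stand-in data) -> per-frame feature matrix
acc_mag = np.random.randn(60 * FS_HZ)
X = np.array([frame_features(f) for f in frames(acc_mag)])
print(X.shape)   # (n_frames, n_features)
```

In the full pipeline the 2724 features per frame are then reduced to 147 by a feature-selection step; the selector used in [3] is not reproduced in this sketch.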
Classes: 1 - Still; 2 - Walk; 3 - Run; 4 - Bike; 5 - Car; 6 - Bus; 7 - Train; 8 - Subway
[1] Wang et al., Enabling Reproducible Research in Sensor-Based Transportation Mode Recognition with the Sussex-Huawei Dataset. IEEE Access, 2019
[2] Roggen et al., The adARC pattern analysis architecture for adaptive human activity recognition systems, J. Ambient Intelligence and Humanized Computing 4(2), 2013
[3] Roggen et al., Opportunistic human activity and context recognition, IEEE Computer 46(2), 2013
[4] Gjoreski et al., Unsupervised Online Activity Discovery Using Temporal Behaviour Assumption, ISWC, 2017
Activity recognition pipeline
• Automation
• Robustness: user-independent, placement-independent, …
• Power-performance trade-offs
• Enhanced pipelines: adaptive [2], opportunistic [3], lifelong learning [4]
SHL Recognition Challenge
2018 – Singapore
• 21 teams
2019 – London
• 15 teams
[1] Wang et al., Summary of the Sussex-Huawei locomotion-transportation recognition challenge, Adjunct Proc. of Ubicomp, 2018
[2] Wang et al., Summary of the Sussex-Huawei locomotion-transportation recognition challenge 2019, Adjunct Proc. of Ubicomp, 2019
Gesture recognition & sensor technologies
Wearable head scratch detection
Embedded Sensing Platforms
[1] Roggen et al., BlueSense - Designing an Extensible Platform for Wearable Motion Sensing, Sensor Research and IoT Applications, Proc. EWSN, 2018
BlueSense2 (AVR 1284p, 30 mm × 30 mm): 9DoF motion, 5 ppm RTC, fuel gauge, SDHC, Bluetooth, USB, expansion
BlueSense4 (ARM Cortex-M4 STM32L4, 30 mm × 30 mm): MPU, microphone, fuel gauge, EEPROM, SDHC, Bluetooth, USB, DFU, power, expansion
• 1 kHz motion
• Microphone
• Multimodal
• Logging
• Streaming
• Extensible
(Data channels: motion, ADC, sound)
BlueSense2
[1] Roggen et al., BlueSense - Designing an Extensible Platform for Wearable Motion Sensing, Sensor Research and IoT Applications, Proc. EWSN, 2018
• RTC for synchronous recordings
• True hardware off
• RTC wakeup
• Built-in power sense
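Because every BlueSense node carries its own accurate RTC, recordings from several nodes can be aligned offline onto one common time base. The sketch below shows the general idea in Python; the function and the stand-in logs are assumptions for illustration, not the BlueSense toolchain.

```python
import numpy as np

def align_to_common_clock(t_a, x_a, t_b, x_b, fs_out=100.0):
    """Resample two independently logged streams onto one shared,
    RTC-derived time grid so their samples can be compared directly.
    t_* are RTC timestamps in seconds, x_* the sensor values."""
    t0 = max(t_a[0], t_b[0])
    t1 = min(t_a[-1], t_b[-1])
    grid = np.arange(t0, t1, 1.0 / fs_out)
    return grid, np.interp(grid, t_a, x_a), np.interp(grid, t_b, x_b)

# Two fake logs whose clocks differ by a small, RTC-bounded offset
t_a = np.arange(0.0, 10.0, 0.01)
t_b = np.arange(0.003, 10.0, 0.01)
grid, a, b = align_to_common_clock(t_a, np.sin(t_a), t_b, np.sin(t_b))
```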
BlueSense ADC extension: stretch sensor
P. Lugoda et al., Ecofriendly carbon black and coconut oil filled elastomers for strain measurement, in preparation for Advanced Materials, 2020
EPS: Electric Potential Sensing
• Remote electric potential sensing
– Small voltages: e.g. non-contact ECG [2]
– 50/60Hz field
• Capacitively coupled
• Available as an IC [1]
[1] Plessey Semiconductors. PS25254/55 EPIC Ultra High Impedance ECG Sensor. Issue 1
[2] Prance et al., Remote detection of human electrophysiological signals using the electric potential sensor, Applied Physics Letters, 2008
[1] Roggen et al., Electric field phase sensing for wearable orientation and localisation applications, ISWC, 2016
Fingerprinting for localization · Relative orientation sensing
E-field communication
(Figure: transmitter Tx applies a voltage Vs; the EPS receiver Rx recovers the received symbol; traces A and B; low, medium and high feedback conditions.)
IMU + EPS vs. IMU alone
[1] Pour Yazdan et al., Wearable electric potential sensing: a new modality sensing hair touch and restless leg movement, Proc. Ubicomp, 2016
[2] Jocys et al., Multimodal fusion of IMUs and EPS body worn sensors for scratch recognition, Proc. Pervasive Health, 2019
Signals: EPS, xyz coordinates
Single IMU: −15 pp; +8 pp with multimodal fusion
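As a rough illustration of why the multimodal result above can beat a single IMU, the sketch below concatenates per-window IMU and EPS features before classification (feature-level fusion). It is a generic sketch, not the fusion method of [2]; the window size, features and classifier are arbitrary choices.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def window_features(x):
    """Simple per-window statistics for one sensor channel."""
    return [x.mean(), x.std(), np.abs(np.diff(x)).mean()]

def fuse(imu_win, eps_win):
    """Feature-level fusion: concatenate IMU and EPS descriptors."""
    feats = [window_features(imu_win[:, c]) for c in range(imu_win.shape[1])]
    feats.append(window_features(eps_win))
    return np.concatenate(feats)

# Toy windows: 200 IMU samples x 3 axes, 200 EPS samples each
rng = np.random.default_rng(0)
X = np.array([fuse(rng.standard_normal((200, 3)),
                   rng.standard_normal(200)) for _ in range(50)])
y = rng.integers(0, 2, size=50)            # scratch / no-scratch labels
clf = RandomForestClassifier(n_estimators=50).fit(X, y)
```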
Embedded AI
Embedded pattern recognition with Warping LCSS: a motif T(j) is matched against the streaming signal S(i), yielding a matching score whose peaks indicate occurrences of the motif.
[1] Roggen et al., Limited-Memory Warping LCSS for Real-Time Low-Power Pattern Recognition in Wireless Nodes, Proc. EWSN, 2015
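For intuition, the following Python sketch computes a warping-LCSS-style matching score of a motif against a streaming signal, keeping only one column of the score matrix per incoming sample. It is a simplification of LM-WLCSS [1], not the exact algorithm; the reward, penalty and matching tolerance eps are arbitrary.

```python
import numpy as np

def wlcss_score(template, stream, eps=1.0, reward=8, penalty=1):
    """Streaming warping-LCSS-style matching score (simplified sketch).
    Keeps one column of the score matrix per incoming sample, so memory
    is O(len(template)) -- the idea behind the limited-memory variant,
    though constants and details here are illustrative."""
    m = len(template)
    prev = np.zeros(m + 1)                 # scores for the previous sample
    best = []
    for s in stream:
        cur = np.zeros(m + 1)
        for j in range(1, m + 1):
            d = abs(s - template[j - 1])
            if d <= eps:                   # samples "match"
                cur[j] = prev[j - 1] + reward
            else:                          # warping with a cost
                cur[j] = max(prev[j - 1], prev[j], cur[j - 1]) - penalty * d
        prev = cur
        best.append(cur[m])                # score of a full-template match
    return np.array(best)                  # peaks suggest motif occurrences

scores = wlcss_score(np.array([0, 1, 2, 1, 0], float), np.random.randn(500))
```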
Drink recognition demo
Sports skill assessment [1,2]
Serve type 1 Serve type 2
[1] Roggen et al., Limited-Memory Warping LCSS for Real-Time Low-Power Pattern Recognition in Wireless Nodes, Proc. EWSN, 2015
[2] Ponce Cuspinera et al., Beach Volleyball serve type recognition, Proc. ISWC, 2016
[3] Ciliberto et al., Complex Human Gestures Encoding from Wearable Inertial Sensors for Activity Recognition, Proc. EWSN, 2018
LM-WLCSS training
• Evolutionary optimisation [1]
• CUDA acceleration [2]
[1] Ciliberto et al., WLCSSLearn: Learning Algorithm for Template Matching-based Gesture Recognition Systems, Proc. Joint 8th Int. Conf. on Informatics, Electronics & Vision (ICIEV) and 3rd Int. Conf. on Imaging, Vision & Pattern Recognition (icIVPR), 2019
[2] Ciliberto et al., WLCSSCuda: A CUDA Accelerated Template Matching Method for Gesture Recognition, Proc. ISWC, 2019
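WLCSSLearn [1] learns templates and matching parameters with an evolutionary algorithm. The sketch below conveys the idea by evolving only a small parameter vector (eps, reward, penalty, decision threshold) against labelled windows; it is not the WLCSSLearn algorithm, and the fitness function and search settings are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

def fitness(params, template, windows, labels, score_fn):
    """F1-like score of thresholded detection with the given parameters."""
    eps, reward, penalty, thr = params
    preds = [score_fn(template, w, eps, reward, penalty)[-1] > thr
             for w in windows]
    tp = sum(p and l for p, l in zip(preds, labels))
    fp = sum(p and not l for p, l in zip(preds, labels))
    fn = sum((not p) and l for p, l in zip(preds, labels))
    return 2 * tp / max(2 * tp + fp + fn, 1)

def evolve(template, windows, labels, score_fn, pop=20, gens=30):
    """Tiny (mu + lambda)-style evolutionary search over the parameters."""
    P = rng.uniform([0.1, 1, 0.1, 0], [2.0, 16, 4.0, 200], size=(pop, 4))
    for _ in range(gens):
        fit = np.array([fitness(p, template, windows, labels, score_fn)
                        for p in P])
        parents = P[np.argsort(fit)[-pop // 2:]]          # keep the best half
        children = parents + rng.normal(0, 0.1, parents.shape)
        P = np.vstack([parents, children])
    return P[np.argmax([fitness(p, template, windows, labels, score_fn)
                        for p in P])]

# e.g. best = evolve(template, windows, labels, wlcss_score) with the
# matching-score sketch shown earlier
```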
Embedded pattern recognition with Warping LCSS
• High-speed: 67 (AVR) / 140 (M4) motifs at 8 mW / 10 mW @ 8 MHz
• Low-power: single gesture spotter (AVR) at 135 µW
• Suitable for silicon implementation
– Integer representation
– Operations: add, shift, compare
• Initial VHDL / FPGA implementation
[1] Roggen et al., Limited-Memory Warping LCSS for Real-Time Low-Power Pattern Recognition in Wireless Nodes, Proc. EWSN, 2015
[2] Ciliberto et al., WLCSSLearn: Learning Algorithm for Template Matching-based Gesture Recognition Systems, Proc. Joint 8th Int. Conf. on Informatics, Electronics & Vision (ICIEV) and 3rd Int. Conf. on Imaging, Vision & Pattern Recognition (icIVPR), 2019
[3] Ciliberto et al., WLCSSCuda: A CUDA Accelerated Template Matching Method for Gesture Recognition, Proc. ISWC, 2019
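The silicon-friendliness claim comes from the fact that one score update needs only integer additions, shifts and comparisons. Below is a hedged Python rendering of one such update; the constants are illustrative, not those of [1].

```python
def wlcss_update_int(prev, cur_jm1, sample, tmpl_j, eps=64, reward=8, shift=3):
    """One integer-only score update: add, shift and compare only.
    prev = (M[j-1, i-1], M[j, i-1]); cur_jm1 = M[j-1, i].
    All quantities are integers, e.g. raw 16-bit sensor counts."""
    diag, up = prev
    d = sample - tmpl_j
    if d < 0:
        d = -d                          # |s - t| without a multiply
    if d <= eps:
        return diag + reward            # match: add a fixed reward
    best = diag
    if up > best:
        best = up
    if cur_jm1 > best:
        best = cur_jm1
    return best - (d >> shift)          # penalty as a shift, no multiply
```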
Activities of daily living: is deep learning useful?
• Open / close drawer 2
• Open / close drawer 3
• Clean table
• Drink from cup
• Toggle light switch
• Open / close door 1
• Open / close door 2
• Open / close fridge
• Open / close dishwasher
• Open / close drawer 1
17 gestures
Activities of daily living: the OPPORTUNITY dataset
Roggen et al., Collecting complex activity datasets in highly rich networked sensor environments, INSS 2010
Is Deep Learning Useful for Activity Recognition?
• Open / close door 1
• Open / close door 2
• Open / close fridge
• Open / close dishwasher
• Open / close drawer 1
• Open / close drawer 2
• Open / close drawer 3
• Clean table
• Drink from cup
• Toggle light switch
[1] Ordonez Morales et al., Deep LSTM recurrent neural networks for multimodal wearable activity recognition, Sensors, 2016
[2] Chavarriaga et al., The Opportunity challenge: A benchmark database for on-body sensor-based activity recognition, Pattern Recognition Letters, 2013
DeepConvLSTM [1]: 0.86 F1 score, +9 pp over competing approaches [2]
Yes, Deep Learning is Useful for Activity Recognition
Architecture [1] · Architecture optimisation
[1] Ordonez Morales et al., Deep LSTM recurrent neural networks for multimodal wearable activity recognition, Sensors, 2016
[2] Ordonez Morales et al., Deep convolutional feature transfer across mobile activity recognition domains, sensor modalities and locations, ISWC, 2016
• Kernel reuse [2]: 17% training time reduction
• Generic kernels for HAR
• Power-performance trade-offs (weight quantization / LUT / ablation / TPU)
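For reference, a DeepConvLSTM-style model stacks convolutions over the time axis of a multichannel sensor frame and feeds the resulting sequence into recurrent layers. The PyTorch sketch below follows that structure; layer counts and sizes are indicative of [1] but not guaranteed to match, and the 113 channels / 18 classes assume the OPPORTUNITY setting (17 gestures plus a null class).

```python
import torch
import torch.nn as nn

class DeepConvLSTMSketch(nn.Module):
    """Conv layers over the time axis of multichannel sensor frames,
    followed by recurrent layers and a per-class output. Layer sizes
    follow the spirit of DeepConvLSTM [1] but are illustrative."""
    def __init__(self, n_channels=113, n_classes=18,
                 n_filters=64, kernel=5, lstm_units=128):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv1d(n_channels, n_filters, kernel), nn.ReLU(),
            nn.Conv1d(n_filters, n_filters, kernel), nn.ReLU(),
            nn.Conv1d(n_filters, n_filters, kernel), nn.ReLU(),
            nn.Conv1d(n_filters, n_filters, kernel), nn.ReLU(),
        )
        self.lstm = nn.LSTM(n_filters, lstm_units, num_layers=2,
                            batch_first=True)
        self.out = nn.Linear(lstm_units, n_classes)

    def forward(self, x):                  # x: (batch, time, channels)
        h = self.conv(x.transpose(1, 2))   # convolve along the time axis
        h, _ = self.lstm(h.transpose(1, 2))
        return self.out(h[:, -1])          # classify from the last time step

logits = DeepConvLSTMSketch()(torch.randn(8, 24, 113))   # 24-sample frames
```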
Research perspectives
• Sensor technologies
• Datasets
– Ecological validity
– Automation
• Tools for non-experts
• “Human-like” perception
– Lifelong learning & adaptivity
– Cognitive architectures
• Embedded intelligence
– Trade-offs: power, performance, latency, comfort, …
– Hardware & software partitioning / co-design
– Platforms
FP7 FET OPPORTUNITY (2009-2012)
UK EPSRC LifeLearn (2016-2017)
• Ethics in CBS: EU ICT-48-2020, 12 M€, 2020-2023, 53 partners (accepted 12.03.2020)
Team
Dr Francisco Javier Ordoñez Morales, Dr Luis Ponce Cuspinera, Dr Hristijan Gjoreski, Dr Lin Wang, Dr Arash Pour Yazdan, Mathias Ciliberto, Sebastien Richoz, Zygimantas Jocys, Lukas Gunthermann, Lloyd Pellat, Sakura Uetsuji, Charles Tempelman
