
E-Gesture: A Collaborative Architecture for Energy-efficient Gesture Recognition with Hand-worn Sensor and Mobile Devices

Taiwoo Park1, Jinwon Lee1, Inseok Hwang1, Chungkuk Yoo1, Lama Nachman2, Junehwa Song1

KAIST1, Intel Corporation2

ACM SenSys 2011

Hand Gesture-Based Mobile Interaction

[Overview figure: applications on the smartphone use a mobile gesture interaction framework that connects to hand-worn sensors equipped with motion sensor(s).]

Design Challenges

• Providing energy-efficient gesture processing

• Accurately detecting (segmenting) and classifying hand gestures under mobility noises

[Figure: battery lifetime: hand-worn sensor (250mAh): 20 hrs; smartphone: 24 hrs / 17 hrs. Over 90% false detections and only 70% classification accuracy.]


E-Gesture Architecture: Wristwatch Device + Smartphone

① Sensing → ② Detection (Segmentation) → ③ Classification → ④ Notification

Why ‘collaborative’?

• Collaboration between devices: detection on the wristwatch, classification on the smartphone

• Collaboration between sensors: the accelerometer triggers the gyroscope for energy efficiency, and the gyroscope adapts the accelerometer to mobility changes
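The device-wise split can be pictured as a minimal sketch (class and method names below are illustrative assumptions, not the paper's API): the wristwatch runs sensing and detection and ships only detected segments over the radio, while the smartphone runs classification and notifies applications.

```python
# Hypothetical sketch of the E-Gesture device-wise pipeline split.
# All names (WristwatchNode, Smartphone, radio, ...) are illustrative.

class WristwatchNode:
    """Sensing + detection (segmentation) on the hand-worn device."""
    def __init__(self, detector, radio):
        self.detector = detector    # e.g. the closed-loop accel/gyro detector
        self.radio = radio          # low-power link toward the smartphone

    def on_sensor_sample(self, accel, gyro):
        segment = self.detector.feed(accel, gyro)
        if segment is not None:
            # Only completed gesture segments cross the radio,
            # not the raw 40 Hz sensor stream.
            self.radio.send(segment)


class Smartphone:
    """Classification + notification on the mobile device."""
    def __init__(self, classifier, apps):
        self.classifier = classifier  # HMM-based gesture classifier
        self.apps = apps

    def on_segment(self, segment):
        label = self.classifier.classify(segment)
        if label != "garbage":
            for app in self.apps:
                app.on_gesture(label)
```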

Approaches

• Investigated characteristics of Accel and Gyro
  – Accelerometer: mobility-sensitive, energy-efficient
  – Gyroscope: mobility-robust, energy-hungry

• Designed an energy-efficient, mobility-robust gesture detection architecture
  – Triggering the gyroscope by analyzing the accelerometer signal
  – Adjusting the accelerometer sensitivity (threshold) by gyroscope validation

• Suggested two gesture classification architectures considering users’ mobility (based on HMMs)


Energy-Performance Tradeoff of Accel and Gyro

              Energy Consumption   Mobility Robustness   Segmentation Accuracy
Accel-based   Low                  Poor                  Passable
Gyro-based    High (9x accel)      Good                  Good

Example of Mobility Noises

[Figure: Accel (RMS of force) and Gyro (RMS of rotation) waveforms of a real mobile gesture, ‘Touching ear’, while standing still, walking, and running; mobility noise grows as the user's movement intensifies.]

Accel threshold should fit to mobility

• A lower fixed threshold → false positives under high mobility
• A higher fixed threshold → false negatives under low mobility

Optimal threshold for Accel and Gyro (minimizes FPs without incurring FNs):

              Mobility Situation
              RIDE     STAND    WALK     RUN
Accel-based   0.15G    0.15G    0.2G     0.35G
Gyro-based    25 degree/sec (all situations)

[Figure: false positives per mobility situation with the optimal threshold.]
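A minimal sketch of the threshold test these values parameterize (the RMS window length and exact signal conditioning, e.g. gravity removal, are assumptions not given on the slide): the accel-based detector compares a short-window RMS against a mobility-dependent threshold, while the gyro-based detector uses a single fixed rotation threshold.

```python
import math

# Mobility-dependent accel thresholds from the slide (in G);
# the gyro threshold is fixed at 25 deg/sec across all situations.
ACCEL_THRESHOLD_G = {"RIDE": 0.15, "STAND": 0.15, "WALK": 0.2, "RUN": 0.35}
GYRO_THRESHOLD_DPS = 25.0

def rms(window):
    """Root mean square of a short window of samples."""
    return math.sqrt(sum(x * x for x in window) / len(window))

def accel_detects_motion(accel_window_g, situation):
    """Accel detector: sensitive to mobility, hence the per-situation threshold."""
    return rms(accel_window_g) > ACCEL_THRESHOLD_G[situation]

def gyro_detects_motion(gyro_window_dps):
    """Gyro detector: robust to mobility, so one threshold is enough."""
    return rms(gyro_window_dps) > GYRO_THRESHOLD_DPS
```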

However, the gyroscope is energy-hungry

[Figure: sensor-side energy profile (Atmega128L, CC2420, Accel and Gyro); the gyroscope consumes far more than the accelerometer.]

Accel and Gyro are complementary in terms of performance and energy.

Approaches

• Investigated characteristics of Accel and Gyro
  – Accelerometer: mobility-sensitive, energy-efficient
  – Gyroscope: mobility-robust, energy-hungry

• Designed an energy-efficient, mobility-robust gesture detection architecture
  – Triggering the gyroscope by analyzing the accelerometer signal
  – Adjusting the accelerometer threshold by gyroscope validation

• Suggested two gesture classification architectures considering users’ mobility (based on HMMs)

Closed-loop Collaborative Segmentation

• Gyro-based detector alone: accurate, but high energy

• Open-loop detector (accel-based detector triggers the gyro-based detector): accurate, but still high energy, because mobility noise triggers the gyroscope frequently

• Closed-loop collaborative detector (accel-based detector triggers the gyro-based detector, and the gyro's validation result feeds back to adapt the accel threshold): accurate, low energy

Performance-preserving, energy-saving collaborative sensor fusion
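A minimal sketch of the closed-loop idea, with assumed control details (window handling, how long the gyro stays on, and the exact adaptation policy are not specified on the slides): the cheap accel check gates the energy-hungry gyro, and gyro validation results drive the accel threshold between its low- and high-mobility settings.

```python
class ClosedLoopDetector:
    """Illustrative closed-loop accel/gyro collaboration.
    Threshold values come from the earlier slide; the adaptation
    step sizes and policy are assumptions."""

    ACCEL_MIN_G, ACCEL_MAX_G = 0.15, 0.35   # low/high-mobility settings
    GYRO_THRESHOLD_DPS = 25.0

    def __init__(self):
        self.accel_threshold = self.ACCEL_MIN_G
        self.gyro_on = False

    def feed(self, accel_rms, gyro_rms=None):
        # 1. The cheap accelerometer check runs all the time.
        if not self.gyro_on:
            if accel_rms > self.accel_threshold:
                self.gyro_on = True        # trigger the energy-hungry gyroscope
            return None

        # 2. The gyroscope validates the candidate motion.
        if gyro_rms is None:
            return None                    # still waiting for gyro samples
        self.gyro_on = False
        if gyro_rms > self.GYRO_THRESHOLD_DPS:
            self._feedback(false_trigger=False)
            return "gesture_segment"       # hand over for classification
        self._feedback(false_trigger=True) # accel fired on mobility noise
        return None

    def _feedback(self, false_trigger):
        # Feedback loop (illustrative policy): false triggers push the accel
        # threshold toward the high-mobility setting; otherwise it relaxes
        # back toward the low-mobility setting.
        if false_trigger:
            self.accel_threshold = min(self.ACCEL_MAX_G, self.accel_threshold + 0.05)
        else:
            self.accel_threshold = max(self.ACCEL_MIN_G, self.accel_threshold - 0.01)
```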

Approaches

• Investigated characteristics of Accel and Gyro
  – Accelerometer: mobility-sensitive, energy-efficient
  – Gyroscope: mobility-robust, energy-hungry

• Designed an energy-efficient, mobility-robust gesture detection architecture
  – Triggering the gyroscope by analyzing the accelerometer signal
  – Adjusting the accelerometer sensitivity (threshold) by gyroscope validation

• Suggested two gesture classification models considering users’ mobility (based on HMMs)

Basic HMM

• Trained with samples collected in a stationary setting
• Classification accuracy drops in mobile situations

[Figure: a candidate gesture sample is converted into raw, delta, and integral features (18 features) and evaluated against 8-state left-right HMMs (one model per gesture plus a garbage model); the model probabilities decide Gesture A, Gesture B, …, or Garbage.]

Design alternatives:
1) Adapt the models to mobility changes at run-time
2) Train several different models for a predefined set of mobility situations
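The classification step can be sketched as below. The paper ports HTK; here hmmlearn stands in purely for illustration, and the left-right topology constraint (which would require fixing the transition-matrix structure) is omitted. The 18 features presumably follow from the 6 sensor axes times {raw, delta, integral}.

```python
import numpy as np
from hmmlearn.hmm import GaussianHMM   # illustrative stand-in for HTK

def extract_features(segment):
    """segment: (T, 6) array of accel(x,y,z) + gyro(x,y,z) samples.
    Returns (T, 18): raw, delta, and cumulative-sum (integral) features."""
    delta = np.vstack([np.zeros((1, 6)), np.diff(segment, axis=0)])
    integral = np.cumsum(segment, axis=0)
    return np.hstack([segment, delta, integral])

def train_models(training_sets):
    """training_sets: {label: [segment, ...]}, where labels include 'garbage'.
    Trains one 8-state HMM per gesture plus one garbage model."""
    models = {}
    for label, segments in training_sets.items():
        feats = [extract_features(s) for s in segments]
        X, lengths = np.vstack(feats), [len(f) for f in feats]
        models[label] = GaussianHMM(n_components=8).fit(X, lengths)
    return models

def classify(models, segment):
    """Evaluate every model on the candidate segment; highest likelihood wins."""
    feats = extract_features(segment)
    return max(models, key=lambda label: models[label].score(feats))
```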

Adaptive HMM

• Updates the models with the user's gesture samples
  – Negative update scheme of uWave [PerCom09]
  – MLLR (Maximum Likelihood Linear Regression) adaptation

• On-line adaptation with the user's gesture samples

[Figure: same pipeline as the Basic HMM (raw, delta, integral: 18 features; gesture and garbage models); when a classification turns out to be wrong, the models are adapted on-line.]
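A toy sketch of the adaptation step, reusing extract_features from the previous sketch. Real MLLR estimates regression-class affine transforms of the Gaussian means to maximize the likelihood of the adaptation data; the version below merely shifts all state means by a global offset, so it only illustrates the direction of the idea, not the paper's method.

```python
import numpy as np

def adapt_model_means(model, adaptation_segments, learning_rate=0.1):
    """Greatly simplified stand-in for MLLR adaptation.
    `model` is assumed to expose a (n_states, n_features) `means_` array,
    as hmmlearn's GaussianHMM does."""
    feats = np.vstack([extract_features(s) for s in adaptation_segments])
    # Global offset between the user's recent samples and the model's means.
    offset = feats.mean(axis=0) - model.means_.mean(axis=0)
    model.means_ = model.means_ + learning_rate * offset
    return model
```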

Multi-Situation HMM

• Train models separately for representative mobility situations
  – e.g. riding a car, standing, walking, running

• Classify by evaluating all models

[Figure: a candidate gesture sample (raw, delta, integral: 18 features) is evaluated against one full set of gesture + garbage models trained for RIDE, another trained for STAND, another for WALK, and so on; the probabilities across all sets decide Gesture A, Gesture B, …, or Garbage.]

Number of models = Number of situations × Number of gestures
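Evaluating all models can be sketched by reusing train_models, extract_features, and the per-model scoring from the Basic HMM sketch above; the only change is the extra loop over situations, which is what multiplies both training and evaluation cost.

```python
def train_multi_situation(training_sets_by_situation):
    """training_sets_by_situation: {situation: {label: [segment, ...]}}.
    One full model set per mobility situation, so the total model count is
    number of situations x number of (gesture + garbage) labels."""
    return {situation: train_models(sets)
            for situation, sets in training_sets_by_situation.items()}

def classify_multi_situation(model_sets, segment):
    """Score the sample against every model in every situation's set and
    return the gesture label of the overall best-scoring model."""
    feats = extract_features(segment)
    best_label, best_score = None, float("-inf")
    for models in model_sets.values():
        for label, model in models.items():
            score = model.score(feats)
            if score > best_score:
                best_label, best_score = label, score
    return best_label
```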

Brief Comparison between Classification Architectures

                               Basic          Adaptive       Multi-Situation
Adaptation cost (user burden)  none           large          none
Training cost                  # of gestures  # of gestures  # of gestures × # of mobile situations
Evaluation cost (processing)   # of gestures  # of gestures  # of gestures × # of mobile situations


IMPLEMENTATION

• Sensor node
  – Atmega128L MCU
  – CC2420 Zigbee radio
  – Sensors
    • 3-axis accelerometer (ADXL335)
    • 3-axis gyroscope (3× XV-3500CB)
    • 40Hz sensing
  – Vibration motor

• Smartphones
  – Nokia N96, Google Nexus One
  – Bluetooth radio

• Bridge node to convert Zigbee ↔ Bluetooth

Implementation

[Photos: Google Nexus One, sensor node, Nokia N96, Bluetooth headset.]

Implementation

• Ported HTK to mobile OSs
  – Google Android 2.1 with NDK rev. 3
  – Nokia S60 3.2.0 with the Open C++ library

• Application interface
  – Helps application developers easily define and develop gesture UIs
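The slides do not show the application interface itself; the sketch below is a hypothetical illustration of what a gesture-UI registration API could look like, using gesture names (rightflick, earpose) and applications (music player, phone call manager) mentioned elsewhere in the deck.

```python
# Hypothetical gesture-UI interface; all names here are invented for illustration.

class GestureInterface:
    def __init__(self):
        self._handlers = {}

    def on_gesture(self, name):
        """Decorator registering a handler for a named gesture."""
        def register(handler):
            self._handlers[name] = handler
            return handler
        return register

    def dispatch(self, name):
        """Called by the classification pipeline when a gesture is recognized."""
        handler = self._handlers.get(name)
        if handler is not None:
            handler()

gestures = GestureInterface()

@gestures.on_gesture("rightflick")
def next_track():
    print("music player: skip to next track")

@gestures.on_gesture("earpose")
def answer_call():
    print("phone call manager: answer incoming call")
```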

Sample Applications

• Swan Boat [Ubicomp09][MM09][ACE09]
  – Collaborative boat-racing exertion game
  – Utilizes hand gestures as additional game input
    • Punching together, flapping together

• Mobile Music Player, Phone Call Manager
  – Featuring eyes-free, touch-free controls
  – Users can control the application by hand gestures


EVALUATION

• Gesture data workload
• Threshold (sensitivity) adaptation
• Energy efficiency
• Classification accuracy

Gesture Data Workload

• 4 representative mobility situations
  – Riding a car, standing, walking, running

• 8 intuitive gestures

• Data collection
  – 4 situations × 8 gestures × 30 samples × 7 participants = 6720 gesture samples in total
  – Also collected non-gestures to generate test workloads

• Workload configuration (for energy efficiency)
  – Ratio of gestures: 10% of total time
  – Mobility mixture: 75% stationary (50% STAND, 25% RIDE), 25% mobile (12.5% WALK, 12.5% RUN)

[Figure: the 8 gestures: Earpose, Laydown, Throw, Draw, Leftflick, Rightflick, Leftknob, Rightknob.]

Threshold Adaptation of the Closed-loop Detector

Q: Does the closed-loop collaborative detector adapt the accelerometer threshold well?

[Figure: the adapted threshold is higher in higher mobility and lower in lower mobility.]

Performance of the Closed-loop Detector

Q: How much does the closed-loop detector suppress the false positives and false negatives of the accel-based detector?

[Figure: false positives and false negatives from the accel detector compared with the closed-loop detector.]

Next two questions:

• How much energy does E-Gesture save?
  – Sensor
  – Smartphone

• Do the suggested classification models provide sufficient accuracy for user interaction?

Sensor-side Energy Savings from the Closed-loop Architecture

Energy consumption:
• Transmit raw sensing data: 46mW
• Transmit only detected gestures (no sensor control): 39mW (↓15%)
• Transmit only detected gestures (closed-loop detection): 19mW (↓59%)

Lifetime on the 250mAh Li-ion battery:
• Transmit raw sensing data: 20 hrs
• Transmit only detected gestures (no sensor control): 23.7 hrs (1.2x)
• Transmit only detected gestures (closed-loop detection): 48.7 hrs (2.4x)

59% less energy consumption, 2.4x longer lifetime
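The lifetimes follow from capacity and power draw; the sketch below reproduces the slide's numbers, assuming a nominal 3.7V Li-ion cell voltage (not stated on the slide).

```python
# Convert battery capacity (mAh) and average power draw (mW) into lifetime (h).
NOMINAL_VOLTAGE_V = 3.7   # assumed Li-ion nominal voltage

def lifetime_hours(capacity_mah, power_mw):
    energy_mwh = capacity_mah * NOMINAL_VOLTAGE_V
    return energy_mwh / power_mw

for label, power_mw in [("raw streaming", 46),
                        ("gesture-only, no sensor control", 39),
                        ("gesture-only, closed-loop", 19)]:
    print(f"sensor, 250mAh, {label}: {lifetime_hours(250, power_mw):.1f} h")
# -> roughly 20.1 h, 23.7 h, and 48.7 h, matching the slide.
# The same conversion also reproduces the smartphone figures on the next
# slide: 1400mAh at 122mW -> ~42 h, and at 70mW -> ~74 h.
```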

Mobile-side Energy Savings from Sensor-side Gesture Detection

Energy consumption (1400mAh Li-ion battery, 3G/WiFi on):
• All processing on mobile: 122mW → 42.1 hrs
• Sensor-side gesture detection: 70mW (↓43%) → 74 hrs (1.8x)

Gesture Classification Performance

[Figure: classification precision of the evaluated HMM architectures; the Adaptive HMM incurs adaptation cost, and the Multi-Situation HMM requires 4x the gesture models.]

Conclusion

• A mobile gestural interaction platform

• A collaborative gesture interaction architecture
  – Collaborative sensor-mobile gesture data processing
    • 1.8x longer battery lifetime for the smartphone
  – Closed-loop collaborative sensor-side segmentation
    • 2.4x longer battery lifetime for the hand-worn sensor, while preserving the gyroscope's detection performance

• Experiments under different representative mobile situations
  – Up to 94.6% classification accuracy during mobile usage, by mobility-aware classification architecture design

Thanks! Questions?

Conclusion

Achievements:
• 2.4x longer battery lifetime for the hand-worn sensor, while preserving the gyroscope's detection performance, by sensor-wise collaboration
• 1.8x longer battery lifetime for the smartphone, by device-wise collaboration
• Up to 94.6% classification accuracy during mobile usage, by mobility-aware classification architecture design

Future Work:
• Opportunities for further energy savings
  – Early filtering on sensor devices
• Improvement of the classification architecture

Accuracy breakdown by mobility situation


• We carefully chose a gyroscope with very small startup latency
  – <30ms: XV-3500CB (Epson), our choice
    • Drops only one sample → same accuracy as with complete samples
  – >50ms: ADXRS620 (Analog Devices), IDG-500 (InvenSense)
    • Drops two or more samples → 0.4%+ accuracy loss
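The sample-drop counts follow from the 40Hz sensing rate (25ms per sample); the exact number also depends on when the trigger lands relative to the sampling clock, so the sketch below is only approximate.

```python
import math

SAMPLING_RATE_HZ = 40                          # sensing rate from the implementation slide
SAMPLE_PERIOD_MS = 1000 / SAMPLING_RATE_HZ     # 25 ms per sample

def samples_dropped(startup_latency_ms):
    """Approximate samples missed while the gyro spins up after being triggered."""
    return math.ceil(startup_latency_ms / SAMPLE_PERIOD_MS)

print(samples_dropped(25))   # ~1 sample for an XV-3500CB-class latency (<30 ms)
print(samples_dropped(50))   # 2+ samples for ADXRS620 / IDG-500-class latencies
```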

Accel-based trigger alone is not effective:

• Mobility noises trigger the gyroscope very frequently
• It is hard to use a higher threshold because of false negatives

[Figure: false negatives with a high threshold and false positives with a low threshold for the accel-based segmentor; the accel triggers the gyro.]

If we take the bridge node into account: 51 hrs (1.2x)

Accel’s sensitivity should fit to mobility

• A higher fixed sensitivity → false detections under higher mobility (precision ↓)
• A lower fixed sensitivity → missed gestures under lower mobility (recall ↓)

Optimal threshold for Accel and Gyro (minimizing false detections without missing gestures):

              Mobility Situation
              RIDE     STAND    WALK     RUN
Accel-based   0.15G    0.15G    0.2G     0.35G
Gyro-based    25 degree/sec (all situations)

Closed-loop Collaborative Segmentation

• Gyro-based detector alone: accurate, high energy

• Open-loop detector (accel-based detector triggers the gyro-based detector): with a fixed accel threshold, either accurate but high energy, or insensitive but low energy

• Closed-loop collaborative detector (accel-based detector triggers the gyro-based detector, with feedback adapting the accel threshold): accurate, low energy