
ADL Assistant: An Aware Remote Caregiving System

Tom Keating, Eugene Research Institute

Ray Keating, Pontimax Technologies, Inc.

Remote Caregiving Support

• Integrating home sensor networks, smart software, and web-based tools to provide data for consumers and caregivers

• Applications:

- Providing long-distance behavioral support

- Helping avoid victimization

- Monitoring physical health and communicating information to health care providers

- Monitoring environmental changes related to health and safety

Test Bed Example of Remote Monitoring in Practice

Main Components

Picture Planner™ activity planning and prompting application

X10-based home sensor network

Home automation software

Consumer and caregiver monitoring computers

Intelligent activity recognition software

Webcams or IP cameras

Apartment Layout

7 PIR motion sensors

1 magnetic reed sensor

CM15 powerline interface

Remote Desktop View

Web Portal for Caregiver

(Floor-plan labels: Kitchen, Bedroom, Bath, Shower, Living Room, Dining Room, Porch, Front Door)

Summaries            M     T     W     Th    F     S     Su
Showers              1     1     0     1     0     1     0
Sleep hours          6.5   12    7     8     8.5   8.5   6
Nighttime door       0     0     0     0     0     0     0
Nighttime bath use   2     1     1     1     2     3     1
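Summaries like these can be derived directly from the time-stamped, location-tagged sensor log. A minimal Python sketch (the log format, field layout, and sample data are hypothetical):

```python
from collections import defaultdict
from datetime import datetime

# Each log entry: (ISO timestamp, domain area). Sample data is made up.
log = [
    ("2009-03-02T07:10:00", "SHOWER"),   # a Monday
    ("2009-03-02T07:10:30", "SHOWER"),
    ("2009-03-03T02:15:00", "BATH"),     # a Tuesday, nighttime
]

def weekly_summary(log, area):
    """Count sensor firings per weekday for one domain area."""
    counts = defaultdict(int)
    for stamp, where in log:
        if where == area:
            counts[datetime.fromisoformat(stamp).strftime("%a")] += 1
    return dict(counts)
```

Counts of raw firings are then rolled up into the higher-level figures (showers, sleep hours) by the inferencing described later in the deck.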

Benefits

• Improved self-management for activity and task completion

• Feasible, universally cost-effective remote support technology

• Increased caregiver effectiveness

• Increased caregiver peace of mind

• Some functionality increasingly available “off the shelf”

Smart Prompting Prototype: Planned Activity vs. Actual Behavior

Conceptual Model for Behavioral Inferencing in Residential Settings

Behavioral Inferencing for Activities of Daily Living

The “Sensible”

Spatio-Temporal Way

Objective 1

Provide near real-time oversight of the consumer in their activities of daily living by:

• Inferring simple-to-complex ADL of interest from motion sensor data

• Selectively generating multimodal, immediate advisories (email, SMS text) to the caregiver

Objective 2

• Provide positive consumer ADL direction through:

• Use of inferred events, activities & behaviors to drive generation of multimedia audio & video prompts

• Communicating prompts to the consumer at the residence domain area

Objective 3

• Use the least obtrusive, least expensive (!) means possible to achieve Objectives 1 & 2

Working Back From Objective 3

• Passive infrared (PIR) sensors can detect motion

• They are inconspicuous

• They are inexpensive ($15 - $30)

• They can be easily installed at the consumer’s residence

What’s a PIR Sensor?

• They are area sensors

• They fire on motion in their field of view (FOV)

• They are commonly used to turn on a light when a room is entered

Can they meet the challenges posed by objectives 1 & 2?

The Challenge is one of BOTH Sense & Sensibility

• “Sensing”
– Can the ADL be captured by motion sensors?

• “Sensibility”
– Do ADL have characteristic patterns in the form of spatio-temporal motion occurrences?
– What would these ADL characteristics be based on?

Can the ADL be Captured by Motion Sensors?

• To answer the question…
– Let’s take a look at how ADL motion data generation and collection occurs using PIR-type motion sensors

Capturing the ADL Motion Data

• Coverage
– Consumer residence “domain” divided into “domain areas”:
- Shower, bath, bedroom, living room, dining area, kitchen, front door, porch
– One PIR sensor per domain area

• Sensing capabilities
– Installed PIR sensors fire once every ten seconds for motions in the FOV
– Motions continuously detected for all domain areas

Capturing the ADL Motion Data, Cont.

• Real-time data collection
– Consumer site unit receives transmitted sensor motion data
– Time-stamps and location-tags the motion data
– Continuously uploads to the database server
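The collection path can be sketched as: tag each firing with a timestamp and location, then queue it for upload. A hypothetical Python illustration (the function and field names are illustrative, not the actual system’s):

```python
import time
from collections import deque

# In-memory stand-in for the consumer site unit's upload queue.
upload_queue = deque()

def on_sensor_fire(domain_area, clock=time.time):
    """Time-stamp and location-tag an incoming PIR firing, then queue
    it for continuous upload to the database server."""
    event = {"t": clock(), "area": domain_area}
    upload_queue.append(event)
    return event

# Simulate one firing with a fixed clock so the output is deterministic.
evt = on_sensor_fire("KITCHEN", clock=lambda: 1000.0)
```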

OK, it seems that movements at locations (spatial) can be detected by motion sensors and chronologically (temporal) recorded in a database

- How are these spatio-temporal motion patterns recognized as a particular ADL?

Characterizing ADL with Spatio-Temporal Fact Patterns

• Empirical studies indicate that many ADL can be characterized as:
– A certain number of motions
– Within a time period
– At a particular location

Clearly, location (and maybe time) context is paramount!

Example: Taking a daily shower

• If more than X motions occur within Y duration at the SHOWER domain area, then the consumer has taken their daily shower!
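The X-motions-in-Y-duration rule can be sketched as a sliding window over the firing log. A minimal Python illustration (the log format and thresholds are hypothetical):

```python
def motions_exceed(events, x_motions, y_minutes, area):
    """True if more than x_motions firings fall within some y_minutes
    window at the given domain area; events are (epoch_seconds, area)."""
    times = sorted(t for t, a in events if a == area)
    window, start = y_minutes * 60, 0
    for end in range(len(times)):
        # Shrink the window until it spans at most y_minutes.
        while times[end] - times[start] > window:
            start += 1
        if end - start + 1 > x_motions:
            return True
    return False

# A sensor firing every 10 s for six minutes at the shower:
shower_log = [(i * 10, "SHOWER") for i in range(36)]
```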

How to Define the Spatio-Temporal Fact Patterns?

• Might…
– Caregiver experience…
– Mathematical analysis…
– Behavioral research…
…be used?

How to Define the Spatio-Temporal Fact Patterns, Cont. 2

• Example:
– The caregiver knows from experience (“a posteriori”) that the consumer generally takes five- to ten-minute showers.
– The motion sensor detects movement at most once every ten seconds, so in a five- to ten-minute period there should be a pattern of motion detections in the range of 50 to 100 occurrences, assuming constant movement in the shower.

Let’s use some data analysis to add some assurance to our caregiver’s experience…

How to Define the Spatio-Temporal Fact Patterns, Cont. 3

• Use time-location-based clustering analysis:
- Collect the data over some sufficient period (say, a week)
- Aggregate the sensor data occurrences for each domain area by time period, using the caregiver’s a posteriori observation

• Results: the basic spatio-temporal “shower taken” pattern was confirmed, albeit at a lower occurrence rate of 30 to 60 occurrences over a five- to ten-minute duration.
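The aggregation step can be sketched as a simple time-bucket count. A hypothetical Python illustration (bucket size, log layout, and sample data are assumptions):

```python
from collections import Counter
from datetime import datetime

def bucket_counts(log, area, bucket_minutes=10):
    """Aggregate firings for one domain area into fixed time buckets;
    the per-bucket counts expose the characteristic occurrence rate."""
    buckets = Counter()
    for stamp, where in log:
        if where != area:
            continue
        t = datetime.fromisoformat(stamp)
        buckets[(t.date(), t.hour, t.minute // bucket_minutes)] += 1
    return buckets

# Hypothetical evening shower: one firing every 10 s for one minute.
log = [(f"2009-03-02T20:30:{s:02d}", "SHOWER") for s in range(0, 60, 10)]
counts = bucket_counts(log, "SHOWER")
```

Repeating this over a week of data and inspecting the non-empty buckets is what confirms (or adjusts) the caregiver’s estimated occurrence range.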

How Do We Apply Our ADL Fact Patterns to the Data?

• Three requirements:
– Describe the spatio-temporal fact patterns
– Store them for access from the database
– Use them to infer ADL occurrences

Use Spatio-Temporal Predicate Expressions to Specify the ADL Patterns

• General form:

– @Pattern Meta Function [tag-value argument set]:
• Spatio-temporal context (tag-value context reference:value set)

– Predicate operator (=, >=, <=, !=)

– !Pattern Meta Criterion [criterion argument(s)]:
• Spatio-temporal context (tag-value context reference:value set)

• Showering_Activity:

@OCCURRENCES[SCANFREQ='10', DURATION='10']: CONTEXT(LOCATION='SHOWER', TIMESCOPE='2000-2200')
>= !OCCURRENCES[CRITERION_OCCURRENCES, CRITERION_DURATION]: CONTEXT(LOCATION='SHOWER', TIMEFRAME='0000-2400')
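A simplified evaluator for this kind of expression might look as follows; the criterion store, key names, and event encoding here are assumptions, not the system’s actual schema:

```python
# Stored criteria, e.g. produced by the earlier clustering step.
criteria = {("SHOWER", "OCCURRENCES"): 30}

def eval_occurrences(events, location, timescope):
    """Count firings at `location` whose HHMM time falls inside
    timescope ('HHMM-HHMM'), then compare to the stored criterion."""
    lo, hi = (int(s) for s in timescope.split("-"))
    observed = sum(1 for hhmm, area in events
                   if area == location and lo <= hhmm < hi)
    return observed >= criteria[(location, "OCCURRENCES")], observed

# 31 firings at 20:05 inside the 20:00-22:00 timescope.
events = [(2005, "SHOWER")] * 31
ok, n = eval_occurrences(events, "SHOWER", "2000-2200")
```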

CAUTION:

• Many ADL patterns can’t be reliably inferred from just a single motion pattern

• Can the level of “sensibility” be raised?
– Sure; just specify additional inferencing patterns to form a predicate inferencing chain:
• If fact pattern-1, then if fact pattern-N… the ADL of interest occurred

Spatio-Temporal Fact Patterns

• For “shower taken,” for example, how can it be assured that the consumer didn’t just step in and out of the shower? Simple: define a spatio-temporal fact pattern to establish that the consumer was continuously in the shower…

• CONTINUOUS_PRESENCE-SHOWER:

@DURATION[SPAN='CONTINUOUS']: CONTEXT(LOCATION='SHOWER', TENSE='CURRENT')
>= !DURATION[ACTUAL_DURATION]: CONTEXT(LOCATION='SHOWER', TEMPORAL='CURRENT')
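The continuous-presence test reduces to checking that no gap between consecutive firings exceeds a threshold. A hedged Python sketch (the 20-second threshold is an assumption):

```python
def continuously_present(times, max_gap=20.0):
    """True if consecutive firings never pause longer than max_gap
    seconds; since the PIR fires at most once every 10 s, a longer gap
    suggests the consumer left the sensor's field of view."""
    times = sorted(times)
    return all(b - a < max_gap for a, b in zip(times, times[1:]))
```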

Inference Chaining of ADL Predicate Patterns

• A “shower taken” is now defined as:

Shower_Taken = Showering_Activity AND Continuous_Presence_Shower

• Chained inferencing predicate description:
– If the sensor data pattern described by Showering_Activity occurred AND at that same time CONTINUOUS_PRESENCE-SHOWER was true, then it is very likely that a shower was taken by the consumer!
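Chaining can be sketched as an AND over component predicates; the predicates below are hypothetical stand-ins for the two fact patterns:

```python
def chain(*fact_patterns):
    """AND-chain inference fact patterns: the ADL is inferred only if
    every pattern holds over the same sensor data."""
    return lambda events: all(p(events) for p in fact_patterns)

# Hypothetical component predicates over a list of firing times.
showering_activity = lambda ts: len(ts) > 30
continuous_presence = lambda ts: all(b - a < 20 for a, b in zip(ts, ts[1:]))

shower_taken = chain(showering_activity, continuous_presence)
```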

Behavioral Plans

• Inferencing hierarchies of ADL concepts can be defined to form a “Plan”

• The ADL concepts provided are:
– Plans: made up of behaviors
– Behaviors: made up of activity patterns
– Activity patterns: made up of activities
– Activities: made up of events
– Events: made up of inference fact patterns

• Currently, goal times can be specified for each concept entity.

• Soon, goal expressions consisting of inference fact patterns will be specifiable for all behavior plan concepts.
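The five-level hierarchy might be modeled with nested containers like these (class and field names are illustrative only, not the system’s actual schema):

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class Event:
    name: str
    fact_patterns: List[str]        # spatio-temporal predicate expressions
    goal_time: Optional[str] = None

@dataclass
class Activity:
    name: str
    events: List[Event] = field(default_factory=list)

@dataclass
class ActivityPattern:
    name: str
    activities: List[Activity] = field(default_factory=list)

@dataclass
class Behavior:
    name: str
    activity_patterns: List[ActivityPattern] = field(default_factory=list)

@dataclass
class Plan:
    name: str
    behaviors: List[Behavior] = field(default_factory=list)

# A tiny hypothetical plan built from the shower example.
morning = Plan("Morning routine", [
    Behavior("Hygiene", [ActivityPattern("Bathing", [
        Activity("Shower", [Event("SHOWER_TAKEN", ["Showering_Activity"])])
    ])])
])
```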

General and Predicate Events

• Can be defined as stand-alone entities

• Consist of inference fact patterns

• Can be “scored” or not; scored events have goal times and have non-occurrence as well as occurrence inferencing performed

• SHOWER_TAKEN is a general event

• Location context maintenance is accomplished by a set of general events

• Predicate events can be defined to be inferenced upon occurrence of a specified triggering event

Consequent Actions

• Can be specified for any “inferencing entity”:
– Behavior plan concepts: inferred events, activities, etc.
– General and predicate events

• Performed on the occurrence of the inferred entity

• Can be specified to be performed on positive occurrence, negative occurrence, or all occurrences

• Currently: email event occurrence reports, cell phone (SMS) texting, and text-to-speech prompt generation

• Can be specified as “external action” consequent actions
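Consequent-action dispatch can be sketched as a handler registry keyed by action type; the registry, stand-in senders, and occurrence flags below are assumptions for illustration:

```python
# Record of "sent" notifications, standing in for real email/SMS/TTS.
sent = []

ACTIONS = {
    "email": lambda msg: sent.append(("email", msg)),
    "sms":   lambda msg: sent.append(("sms", msg)),
    "tts":   lambda msg: sent.append(("tts", msg)),
}

def on_inference(entity, occurred, actions):
    """Run the consequent actions registered for an inferred entity,
    honoring whether each fires on positive or negative occurrence."""
    for kind, when in actions:
        if when == "any" or (when == "positive") == occurred:
            ACTIONS[kind](f"{entity}: {'occurred' if occurred else 'missed'}")

# A missed shower triggers the negative-occurrence SMS and the always-on email.
on_inference("SHOWER_TAKEN", False, [("sms", "negative"), ("email", "any")])
```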

The Inferencing Agent uses the Pattern Meta Specification to do the Inferencing

• Retrieves the stored behavior plans, general & predicate events, and inference fact pattern meta specifications

• Generates the necessary database queries, inferencing rules, and fact patterns

• Performs the inferencing per the fact pattern’s datescope and timescope, at the specified scan frequency

• Accomplishes any specified consequent actions
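The agent’s scan cycle might be sketched as a loop over fetch, evaluate, act; the hooks below are placeholders for the database queries, generated rules, and consequent actions, not the actual implementation:

```python
import time

def run_agent(fetch_events, patterns, consequents, scans=1, scan_freq=0.0):
    """Each scan: retrieve sensor events, evaluate every fact pattern,
    and run consequent actions for those that inferred true."""
    fired = []
    for _ in range(scans):
        events = fetch_events()
        for name, predicate in patterns.items():
            if predicate(events):
                consequents(name)
                fired.append(name)
        time.sleep(scan_freq)   # scan frequency specified per pattern set
    return fired

# One scan against a stubbed event source and a single toy pattern.
notified = []
fired = run_agent(
    fetch_events=lambda: [0, 10, 20],
    patterns={"MOTION_SEEN": lambda ev: len(ev) >= 3},
    consequents=notified.append,
)
```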

Questions?

[email protected]

[email protected]