Pervasive Sensing in Healthcare: From Observing and Collecting to Seeing and Understanding

Steven R. Rick
Dept. of Computer Science and Engineering, UC San Diego
La Jolla, CA, USA
[email protected]

Nadir Weibel
Dept. of Computer Science and Engineering, UC San Diego
La Jolla, CA, USA
[email protected]

Vishwajith Ramesh
Dept. of Bioengineering, UC San Diego
La Jolla, CA, USA
[email protected]

Danilo Gasques Rodrigues
Dept. of Computer Science and Engineering, UC San Diego
La Jolla, CA, USA
[email protected]

Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected].
CHI 2017, May 6-11, 2017, Denver, CO, USA. – WISH Symposium
Copyright is held by the owner/author(s). Publication rights licensed to ACM.
ACM ISBN 978-1-4503-4655-9/17/05 ...$15.00.

Abstract
From analyzing complex socio-technical systems to evaluating novel interactions, increasingly pervasive sensing technologies provide researchers with new ways to observe the world. This paradigm shift is enabling the capture of richer and more diverse data, combining elements from in-depth study of activity and behavior with modern sensors, and providing the means to accelerate sense-making of complex behavioral data. At the same time, novel multimodal signal processing and machine learning techniques are equipping us with 'super powers' that enable understanding of these data in real-time, opening up new opportunities for embracing the concept of 'Data Science in the Wild'. In this paper we present what this transition means in the context of Health and Healthcare, focusing on how it leads to the 'UbiScope', a ubiquitous computing microscope for detecting particular health conditions in real-time, promoting reflection on care, and guiding medical practices. Just as the microscope supported key scientific advances, the UbiScope will act as a proxy for understanding and supporting human activity and inform specific interventions in the years ahead.

Author Keywords
Sensors, Pervasive Health, UbiScope, Stroke-Kinect, Augmented Reality, Surgery, Ubiquitous Computing, Data Science

ACM Classification Keywords
H.5.m [Information interfaces and presentation (e.g., HCI)]: Miscellaneous

Introduction
Throughout history, the advancement of science and technology has granted humanity new lenses through which to better understand the world. The telescope granted access to the macro-scale world: the huge and distant cosmic bodies. The microscope granted access to the micro-scale world: the tiny and close cells and bacteria responsible for illness and disease. While each of these micro and macro lenses has evolved and advanced, it was not until recently that a third lens was introduced at the level of human beings. Pervasive and ubiquitous means for sensing and computing, enabled by new compact and mobile sensors that can see beyond the range of our own eyes, can nowadays detect imperceptible motion, signals, or patterns that a human alone cannot. But observing and sensing are not enough on their own. Incredible advances in machine learning, multimodal signal processing, and mobile computing are producing the 'perfect storm' of conditions that allow us to go beyond sensing, introducing new opportunities to see the world and generate real understanding in ways that were not possible before.

It is important to note that the microscope was not enough to directly drive healthcare outcomes; it required considerable efforts in translational research to generate important findings in medicine. Similarly, we must be thoughtful as we design pervasive sensing technology to integrate with ubiquitous computing systems that seek to support and positively influence patient outcomes, improve the quality of healthcare delivery, or reduce adverse events. To support this move from simply observing and collecting data, to actually seeing through this lens and generating better understanding of health and healthcare, we introduce a new conceptual framework that we call UbiScope, a ubiquitous computing microscope for real-time seeing and understanding of situated interactions.

The UbiScope is instantiated as a combination of three main components: (1) data collection with unobtrusive sensors and off-line analysis, (2) machine learning and modeling to generate lightweight classifiers, and (3) the implementation of these focused detectors for real-time applications. The application of this conceptual framework in healthcare provides a means for leveraging ubiquitous computing and sensing as a microscope that enables real-time characterization of particular behaviors, diseases, or disabilities.
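As a rough illustration of these three components, the sketch below pairs an offline training stage that produces a deliberately lightweight classifier (here, a single decision threshold) with a real-time detector applied over a sliding window of streaming sensor values. All function names, features, and thresholds are hypothetical placeholders, not our deployed implementation.

```python
from statistics import mean

# (1) Offline: summarize labeled feature values collected in the field.
# Each sample is (feature_value, label); features are illustrative.
def train_threshold_classifier(samples):
    positives = [x for x, label in samples if label == 1]
    negatives = [x for x, label in samples if label == 0]
    # (2) Modeling: a single decision threshold halfway between the
    # class means stands in for a real lightweight classifier.
    return (mean(positives) + mean(negatives)) / 2.0

# (3) Real-time: apply the focused detector to a sliding window
# over a live sensor stream, yielding one decision per sample.
def detect(stream, threshold, window=5):
    buffer = []
    for value in stream:
        buffer.append(value)
        if len(buffer) > window:
            buffer.pop(0)
        yield mean(buffer) > threshold  # True = condition detected
```

The windowed average smooths momentary sensor noise so that a single spurious reading does not trigger a detection.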

In the remainder of this paper we lay out existing work that informs the introduction of the UbiScope, and we introduce three examples that instantiate this idea in specific healthcare settings. We then discuss the reach of this approach, concluding with next steps and a future outlook.

Related Work
A rapidly growing body of work surrounds the idea of implementing advanced computing and sensing in order to improve human understanding in healthcare. This has often materialized in the use of offline machine learning for disease detection, such as Mazilu and colleagues' work on classifying Parkinson's Disease-related gait symptoms via smartphone sensors [7]. Another example is the work of Suk, Lee, and Shen, who performed multimodal feature extraction to model and diagnose Alzheimer's Disease and Mild Cognitive Impairment [9]. Work has also been pursued in terms of quantification, rather than detection, of disease progression. One such example is the work of Kontschieder and colleagues evaluating patients with Multiple Sclerosis with depth-sensing devices [5]. Moving beyond specific diseases to assess the overall environment, we find our own

work focused on the deployment of a range of multimodal sensors into outpatient clinics to capture and characterize physician-patient-computer interaction [10]. All of this work, while incredibly informative, relied upon techniques that were computationally too expensive to easily permit adaptation into real-time evaluation systems.

More ubiquitous computing methodologies are being utilized to promote change and learning within the delivery of healthcare. One example in this setting is Maiden and colleagues' deployment of tracking technologies to improve medical staff awareness of how they were delivering care to patients with dementia [6]. This research leveraged a more people-centered approach, raising awareness of current practice through self-reflection. As pointed out by Elf, Fröst, Lindahl, and Wijk [1], by considering current practices within the context of the environment where the work naturally occurs, new models of care or new technologies that seek to improve care are more readily designed, deployed, and adopted.

In the research leading up to our concept of the UbiScope, we integrated knowledge from preexisting work focused on sensing within the health and healthcare domains. By combining those lessons with our own situated data collection and analysis, we found means to augment and empower patients, clinicians, and medical practices while respecting their work environment. This approach allowed us to effectively shift data science into the real world in an ecologically valid way.

Algorithms and procedures that allowed multimodal and multi-dimensional data collected in the field to be analyzed through post-processing have now been rebuilt to be deployed in the field and work in real-time. Optimization methods for producing faster signal processing and more lightweight classifiers enable real-time on-line analysis, and ultimately support the design and development of data-driven interventions informed in real-time by pervasive sensing.
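One minimal example of this kind of rebuilding is replacing batch statistics recomputed over an entire recording with an incremental update costing O(1) per sample, e.g. Welford's online algorithm for running mean and variance. The sketch below is illustrative of the technique, not drawn from our actual pipeline:

```python
class RunningStats:
    """Welford's online algorithm: each new sample updates the mean
    and variance in constant time, instead of re-processing the whole
    recording, which makes the statistic usable in real-time."""

    def __init__(self):
        self.n = 0
        self.mean = 0.0
        self.m2 = 0.0  # sum of squared deviations from the mean

    def update(self, x):
        self.n += 1
        delta = x - self.mean
        self.mean += delta / self.n
        self.m2 += delta * (x - self.mean)

    @property
    def variance(self):
        # Population variance; returns 0.0 before any samples arrive.
        return self.m2 / self.n if self.n else 0.0
```

Because the update never revisits old samples, the same code runs unchanged over a live sensor stream of arbitrary length.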

Data Science in the Wild: the UbiScope
With an emphasis on the natural augmentation of health and healthcare, it is essential to understand the needs of modern-day health practitioners as they go about their work in their natural environments. A growing literature [4], as well as our own experiences, points to time as one of the key factors that drive many decisions, especially in critical settings. With both costs and patient outcomes strongly related to time taken, care teams must move quickly while maintaining high precision. This temporal demand, however, can occasionally lead to subtle clues being missed that are highly relevant to the detection of specific conditions.

In order to resolve these concerns, we propose a new framework that combines methodologies for sensing and sense-making from Computational Ethnography [11] with interactive and interventional systems co-designed alongside the medical care team members who will use them. In this section we discuss how these factors lead to the UbiScope, and its core concepts of detection, reflection, and guidance, through three distinct tracks of work: (1) neurological disease classification, (2) surgical assessment, and (3) surgical training. All of these areas of work emphasize different forms of real-time and near-time interactive feedback, informed by expert inquiry and in-situ observation, and realized through the use of ubiquitous sensing and computing devices.

Neurological Disease Classification
In the area of acute neurological disease diagnosis and treatment, speed and accuracy are both incredibly important.

Figure 1: Top-left: body tracking data collected from a patient during a neurological assessment, visualized in the multimodal analysis tool ChronoViz [2]. One of the patient's arms was abnormally asymmetric during analysis, indicating a higher likelihood of stroke. Bottom-right: envisioned characterization of problematic moments in time, across modalities.

This is especially true in the case of stroke. The drugs and procedures used to treat stroke often carry very high risks of internal bleeding (especially in the brain). Therefore, a diagnosis of stroke must be made by highly trained and specialized neurologists before treatment can be administered. Unfortunately, experts are frequently unavailable, and current solutions rely heavily on support staff to triage and prioritize patient evaluations. This often leads to stroke not being treated in time: only 5% of acute strokes are treated with the standard medication (rt-PA), primarily due to the narrow therapeutic time window for intervention [3].

To help solve this problem, we are studying how to computationally characterize stroke features by observing patients' multimodal behaviors. We are currently collecting behavioral data from patients while they are assessed by neurologists [8] (see Fig. 1). As patients run through a battery of speech-based, motor-based, and coordination-based tasks and tests to assess their neurological condition, we isolate features indicative of positive and negative stroke diagnoses. We aim to provide tools that help augment doctors' diagnostic abilities.
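As a hypothetical sketch of such a feature, the snippet below computes a simple left/right arm asymmetry from body-tracking joint positions, a plausible proxy for one-sided motor weakness. The joint names, coordinate convention (y up, meters), and threshold are assumptions for illustration, not the actual Stroke-Kinect feature set:

```python
# Hypothetical skeleton frame: joint name -> (x, y, z) in meters, y up.
def arm_asymmetry(frame):
    """Difference in vertical wrist displacement relative to each
    shoulder: a large gap suggests one arm is drifting or drooping."""
    left = frame["wrist_left"][1] - frame["shoulder_left"][1]
    right = frame["wrist_right"][1] - frame["shoulder_right"][1]
    return abs(left - right)

def flag_asymmetry(frames, threshold=0.15):
    """Return indices of frames whose asymmetry exceeds an
    illustrative threshold (0.15 m), for review by a clinician."""
    return [i for i, f in enumerate(frames) if arm_asymmetry(f) > threshold]
```

A real detector would combine many such features across modalities; this single geometric cue only shows the shape of the computation.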

The application of the UbiScope conceptual framework in this setting will guide us towards deploying a system that can autonomously aid in triage through detection. If a neurologist is not immediately available, the patient can first engage with our tools, which will interactively guide them through a basic battery of tasks previously found to elicit the strongest signals for stroke diagnosis. This in turn will provide just-in-time support to clinical staff through reports that highlight the positive and negative signals detected (see the envisioned chart in Fig. 1, bottom-right).

Figure 2: Data from the tracking of two laparoscopic surgeons working together with an abdominal training simulator, played back in ChronoViz. The bottom-right panel shows problematic ergonomic moments (in green) automatically recognized by the system.

Surgical Assessment in Laparoscopic Operating Rooms
Moving away from explicit disease detection, we also see the UbiScope framework providing support directly to clinicians and medical personnel. Our research in this setting focuses on the assessment of ergonomics and human factors during laparoscopic surgical procedures. Our approach successfully combines body-tracking sensors with standards such as ANSI Z-365 in order to generate an ergonomic 'score' at each moment in time. This will direct analysis to moments when a surgeon's ergonomics were particularly poor (see Fig. 2 for an example).
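A heavily simplified sketch of such per-moment scoring is shown below: an elbow angle is computed from three tracked joints and mapped to a coarse penalty. The joint names and angle thresholds are illustrative placeholders, not the actual ANSI Z-365-based computation:

```python
import math

def angle_deg(a, b, c):
    """Angle at joint b formed by points a-b-c, in degrees."""
    v1 = [a[i] - b[i] for i in range(3)]
    v2 = [c[i] - b[i] for i in range(3)]
    dot = sum(x * y for x, y in zip(v1, v2))
    n1 = math.sqrt(sum(x * x for x in v1))
    n2 = math.sqrt(sum(x * x for x in v2))
    return math.degrees(math.acos(dot / (n1 * n2)))

def ergonomic_score(frame):
    """Toy per-frame score: penalize elbow flexion outside a neutral
    working range. Thresholds are invented for illustration."""
    elbow = angle_deg(frame["shoulder"], frame["elbow"], frame["wrist"])
    if 60 <= elbow <= 100:
        return 0  # near-neutral posture
    if 40 <= elbow <= 120:
        return 1  # mildly strained
    return 2      # flag this moment for review
```

Scoring each frame independently is what lets the same computation run both in post-hoc playback (as in Fig. 2) and over a live stream.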

This in-situ data collection and post-evaluation is very informative about the shortcomings that already exist in the practice of surgery. While directing attention to the practices that contribute to long-term injury is important, post-capture analysis is limited in its ability to directly influence change.

By extending this work through the UbiScope's focus on reflection, the assessment algorithms can be optimized to run in real-time. This then makes it possible to design real-time or near-time interventions on top of these scores. We are currently experimenting with the idea of ambient displays or other passive environmental cues as interventional tools that can raise awareness in real-time without increasing cognitive burden or distracting from the task at hand.

Surgical Simulation and Augmented Reality
Beyond assessment in the operating room, it is important to also focus on teaching the required skills through simulation. Embedding teaching in high-fidelity simulations is very effective, but requires specialized facilities and expensive equipment. For this reason, our work is looking at the augmentation of surgical training through pervasive sensors that can track students' actions and overlay informative visualizations onto low-cost mannequins.

Figure 3: CT scan image integrated within the body of a mannequin during simulation training with medical students.

Specifically, we make use of new Augmented Reality devices such as the Microsoft HoloLens to provide a more realistic means for interaction within the simulated world. Information and visualizations are provided at the locations where they are most relevant (e.g., we show CT scans inside the body of a patient, or views from an ultrasound probe projected longitudinally within the patient's body), rather than on remote displays and screens, which require effort to mentally transform the information from the digital realm to the physical (see Fig. 3).
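Anchoring a visualization inside a tracked mannequin amounts to mapping points from the mannequin's local coordinate frame (e.g. the position of a CT slice) into world space using its tracked pose. The sketch below shows this with a plain 4x4 homogeneous transform; the pose parameterization (a yaw rotation plus translation) is a simplification for illustration, not our actual tracking pipeline:

```python
import math

def pose_matrix(yaw_deg, tx, ty, tz):
    """4x4 homogeneous transform: rotation about the vertical axis
    plus translation, standing in for a tracked mannequin pose."""
    c = math.cos(math.radians(yaw_deg))
    s = math.sin(math.radians(yaw_deg))
    return [[c, 0, s, tx],
            [0, 1, 0, ty],
            [-s, 0, c, tz],
            [0, 0, 0, 1]]

def to_world(pose, point):
    """Map a mannequin-local point (e.g. a CT slice position) into
    world space for rendering by the headset."""
    p = [point[0], point[1], point[2], 1.0]
    return tuple(sum(pose[r][c] * p[c] for c in range(4)) for r in range(3))
```

Because the pose is re-estimated every frame, the overlaid imagery stays registered to the physical mannequin even as the trainee or the mannequin moves.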

Experiences such as augmented teaching in surgery, informed by both pervasive sensing and augmentation of the environment, drive the UbiScope's focus on just-in-time feedback and guidance. Whether occurring while skills are taught in simulation, or during actual patient cases, leveraging sensors to track activity and human interactions will support the design and deployment of means for visually augmenting various medical procedures. This will support interactive visual guides, including feedback mechanisms when erroneous behaviors occur.

Discussion and Conclusion
In this paper we presented the UbiScope, a conceptual framework that shifts the power of sensing and data collection towards seeing and understanding, exploiting the concepts of detection, reflection, and guidance. The three examples outlined in the paper demonstrate the transformative potential of applying data science in the wild, and how this enables us to move toward interactive and interventional applications of sensing within the healthcare environment.

While health presents an environment for high impact, it is also very likely that advanced systems supported via the UbiScope conceptual framework could contribute significantly to other areas of work, such as behavioral science, disease prevention, child development, and linguistics.

All in all, we believe that the UbiScope represents the start of a new era in which it is increasingly possible to exploit advances in machine learning, signal processing, and mobile computing to support new ways to see and understand situated interactions in real-time. The application of Data Science in the Wild to healthcare, as outlined in this paper, makes it evident that the impact of the UbiScope in the years ahead will be exciting and important.

References
[1] Elf, M., Fröst, P., Lindahl, G., and Wijk, H. Shared decision making in designing new healthcare environments. BMC Health Serv Res 15, 1 (2015), 114.

[2] Fouse, A., Weibel, N., Hutchins, E., and Hollan, J. D. ChronoViz: A System for Supporting Navigation of Time-coded Data. In CHI '11 EA (2011), 299–304.

[3] Go, A., Mozaffarian, D., Roger, V., American Heart Assoc. and Stroke Statistics Subcommittee, et al. Heart disease and stroke statistics. Circulation 127, 1 (2013).

[4] Han, Y., Carcillo, J., Venkataraman, S., Clark, R., Watson, R., Nguyen, T., Bayir, H., and Orr, R. Unexpected increased mortality after implementation of a commercially sold computerized physician order entry system. Pediatrics 116, 6 (2005), 1506–1512.

[5] Kontschieder, P., Dorn, J. F., Morrison, C., Corish, R., Zikic, D., Sellen, A., D'Souza, M., Kamm, C. P., Burggraaff, J., Tewarie, P., et al. Quantifying progression of multiple sclerosis via classification of depth videos. In Med Image Comput Comput Assist Interv, Springer (2014), 429–437.

[6] Maiden, N., D'Souza, S., Jones, S., Müller, L., Pannese, L., Pitts, K., Prilla, M., Pudney, K., Rose, M., Turner, I., et al. Computing technologies for reflective, creative care of people with dementia. Communications of the ACM 56, 11 (2013), 60–67.

[7] Mazilu, S., Hardegger, M., Zhu, Z., Roggen, D., Tröster, G., Plotnik, M., and Hausdorff, J. M. Online detection of freezing of gait with smartphones and machine learning techniques. In PervasiveHealth 2012, IEEE (2012), 123–130.

[8] Ramesh, V., Rick, S., Meyer, B., Cauwenberghs, G., and Weibel, N. A neurobehavioral evaluation system using 3D depth tracking & computer vision: The case of Stroke-Kinect. In Neuroscience (Posters) (2016).

[9] Suk, H.-I., Lee, S.-W., Shen, D., Alzheimer's Disease Neuroimaging Initiative, et al. Hierarchical feature representation and multimodal fusion with deep learning for AD/MCI diagnosis. NeuroImage 101 (2014), 569–582.

[10] Weibel, N., Rick, S., Emmenegger, C., Ashfaq, S., Calvitti, A., and Agha, Z. Lab-in-a-box: semi-automatic tracking of activity in the medical office. Personal and Ubiquitous Computing 19, 2 (2015), 317–334.

[11] Zheng, K., Hanauer, D. A., Weibel, N., and Agha, Z. Computational ethnography: Automated and unobtrusive means for collecting data in situ for human-computer interaction evaluation studies. In Cognitive Informatics for Biomedicine. Springer, 2015, 111–140.