Four papers co-authored by researchers at the Center of Excellence for Mobile Sensor Data-to-Knowledge (MD2K) were presented last month at the International Joint Conference on Pervasive and Ubiquitous Computing (UbiComp 2017), held in Maui, Hawaii.
Each of the papers dealt with a different aspect of mobile health (mHealth) research: user engagement, visual analytics, eating detection, and fatigue detection using computational eyeglasses. Now in its fourth year of NIH-funded research, MD2K has had more than 200 papers and articles accepted for publication.
The authors of “eWrapper: Operationalizing Engagement Strategies in mHealth” (citation below) examined the problem of attrition among mHealth app users and proposed a model, eWrapper, that integrates engagement strategies and dynamically adapts them to the characteristics and responses of individual users. The eWrapper is an “ambient display for a mobile device that enables the implementation and adaptation of a wide variety of engagement strategies grounded in social psychology, marketing, cognitive psychology, behavioral economics and human-computer interaction.” The work supports the creation of just-in-time adaptive interventions (JITAIs) that seek to increase and maintain engagement in mobile health.
The researchers designed the strategies to promote self-monitoring among overweight and obese adults using a weight-management mobile app. Future research will investigate how the framework can be designed to increase engagement in other mHealth settings.
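The paper itself does not publish an implementation, but the idea of strategies that dynamically adapt to users' responses can be illustrated with a minimal sketch. Everything below, including the strategy names, the epsilon-greedy rule and the simulated user responses, is an assumption made for illustration, not the authors' method:

```python
import random

# Hypothetical illustration (not the eWrapper implementation): an
# epsilon-greedy selector that adapts which engagement strategy an mHealth
# app delivers, based on whether past prompts led the user to self-monitor.
STRATEGIES = ["reminder", "goal_feedback", "social_comparison", "reward_points"]

class StrategySelector:
    def __init__(self, strategies, epsilon=0.1):
        self.epsilon = epsilon                      # exploration rate
        self.counts = {s: 0 for s in strategies}    # times each strategy was used
        self.values = {s: 0.0 for s in strategies}  # running mean engagement

    def choose(self):
        # Occasionally explore a random strategy; otherwise exploit the best.
        if random.random() < self.epsilon:
            return random.choice(list(self.counts))
        return max(self.values, key=self.values.get)

    def update(self, strategy, engaged):
        # engaged: 1 if the user self-monitored after the prompt, else 0.
        self.counts[strategy] += 1
        n = self.counts[strategy]
        self.values[strategy] += (engaged - self.values[strategy]) / n

selector = StrategySelector(STRATEGIES)
for _ in range(100):                      # simulated decision points
    s = selector.choose()
    engaged = random.random() < 0.3       # stand-in for a real user response
    selector.update(s, engaged)
print(max(selector.values, key=selector.values.get))
```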
A visual analytics system called Discovery Dashboard (citation below) allows researchers to explore large volumes of time-series data gathered in mobile health field studies. The system was demonstrated at the conference.
“Discovery Dashboard offers interactive exploration tools and a data mining motif discovery algorithm to help researchers formulate hypotheses, discover trends and patterns, and ultimately gain a deeper understanding of their data,” the authors stated. “Discovery Dashboard emphasizes user freedom and flexibility during the data exploration process and enables researchers to do things previously challenging or impossible to do – in the web browser and in real time.”
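The article does not spell out the dashboard's motif discovery algorithm, so the following is only a toy brute-force sketch of the general technique: slide a window over the series and find the pair of z-normalized windows that match most closely. The window length, the z-normalization and the planted sine pattern are all illustrative assumptions:

```python
import numpy as np

# Toy brute-force motif discovery (illustrative only): find the pair of
# z-normalized sliding windows in a time series that are closest in
# Euclidean distance, skipping trivially overlapping matches.
def znorm(w):
    sd = w.std()
    return (w - w.mean()) / sd if sd > 0 else w - w.mean()

def find_motif(series, m):
    """Return (i, j, dist): start indices of the best-matching window pair."""
    windows = [znorm(series[i:i + m]) for i in range(len(series) - m + 1)]
    best = (None, None, np.inf)
    for i in range(len(windows)):
        for j in range(i + m, len(windows)):   # j >= i + m avoids overlap
            d = np.linalg.norm(windows[i] - windows[j])
            if d < best[2]:
                best = (i, j, d)
    return best

rng = np.random.default_rng(0)
x = rng.normal(size=500)
pattern = np.sin(np.linspace(0, 2 * np.pi, 50))
x[100:150] += pattern                      # plant the same motif twice
x[300:350] += pattern
print(find_motif(x, 50))                   # expect starts near 100 and 300
```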
The researchers are working to deploy Discovery Dashboard so that health researchers and practitioners can evaluate the system in the field, and they plan to add Ecological Momentary Assessment (EMA) data to the system. Analyzing EMA data alongside sensor data could reveal insights that would not be apparent if each type of data were analyzed separately.
Two other papers focused on the detection of behaviors. “EarBit: Using Wearable Sensors to Detect Eating Episodes in Unconstrained Environments” (citation below) examined the practice of food journaling to monitor food intake and proposed EarBit, a wearable system that detects eating moments using a sensor system consisting of two inertial measurement units, a proximity sensor and a microphone.
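EarBit's actual classification pipeline is not reproduced in this article; the sketch below only illustrates the general shape of such a system, windowing synchronized sensor streams, computing per-window features and flagging chewing-like windows. The feature choices and thresholds are invented for illustration:

```python
import numpy as np

# Hypothetical sketch of the general approach, not EarBit's published
# pipeline: slice synchronized sensor streams into fixed windows, compute
# simple per-window features, and flag windows whose features look like
# chewing.
def detect_chewing(imu, proximity, fs=50, win_s=3.0):
    """Return one True/False chewing label per non-overlapping window."""
    win = int(fs * win_s)
    labels = []
    for start in range(0, len(imu) - win + 1, win):
        imu_energy = np.var(imu[start:start + win])        # jaw-motion energy
        prox_mean = np.mean(proximity[start:start + win])  # ear-deformation proxy
        # These thresholds are invented for illustration; a real system
        # would train a classifier on labeled recordings instead.
        labels.append(bool(imu_energy > 0.5 and prox_mean > 0.3))
    return labels
```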
EarBit was deployed first in a simulated home environment and then outside the lab, achieving accuracy above 90% in both settings. Using video recordings to establish ground truth, the researchers found that it correctly recognized all but one recorded eating episode.
“EarBit brings us one step closer to automatically monitoring food intake, which can ultimately aid in preventing and controlling many diet-related diseases,” the researchers said.
The second paper focused on the use of computational eyeglasses. “iLid: Low-Power Sensing of Fatigue and Drowsiness Measures on a Computational Eyeglass” (citation below) presented a wearable sensor system that monitors eye closures and blink patterns to detect fatigue. Blinking and eye-closure patterns have long been regarded as accurate indicators of fatigue, but wearable systems have been hampered by the high power consumption of the necessary camera and by an inability to adapt to changing environmental conditions.
With the iLid system, the researchers reduced power consumption by sampling and processing small subsets of an image in real time. From this data they extracted blink and eyelid features such as blink duration, blink rate and eyelid-closure patterns. They tested the technique under a variety of lighting conditions, when the user was experiencing eyeglass slippage and when the user was mobile. To further validate it, they compared the technique against an eyeglasses-based electro-oculography (EOG) system, the state of the art for monitoring the eye.
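The published pipeline is more sophisticated than can be shown here, but the idea of extracting blink features from small image subsets can be sketched. In the toy version below, the single pixel column, the edge heuristic and the closed-lid threshold are all assumptions for illustration, not iLid's method:

```python
import numpy as np

# Illustrative sketch only: rather than processing full camera frames,
# sample one narrow pixel column over the eye, estimate the eyelid edge in
# each frame from the strongest intensity change, and derive blink rate and
# duration from runs of "closed" frames.
def eyelid_position(column):
    # Index of the strongest vertical intensity change approximates the edge.
    return int(np.argmax(np.abs(np.diff(column.astype(float)))))

def blink_stats(columns, fps=30, closed_row=20):
    # Assumes row 0 is the top of the strip, so a descended (closed) lid
    # pushes the detected edge below `closed_row`; both values are invented.
    positions = np.array([eyelid_position(c) for c in columns])
    closed = positions > closed_row
    blinks, durations, run = 0, [], 0
    for frame_closed in closed:
        if frame_closed:
            run += 1
        elif run:
            blinks += 1
            durations.append(run / fps)          # duration of each blink, seconds
            run = 0
    minutes = len(columns) / fps / 60
    return blinks / minutes, durations           # blinks per minute, durations
```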
They found that iLid detected blinks with better than 95% accuracy and eyelid closures with 97.5% accuracy, both indoors and outdoors. The technique also held up across lighting changes, shifts of the eyeglasses and user mobility, and the system ran in real time on a low-power platform.
“Blink and eye closure features are the cornerstone of fatigue and drowsiness detectors that can operate in the real world,” the researchers said. “We believe this work paves the way for a regular spectacles form-factor device that has the built-in ability to monitor cognitive state in real time.” They hope to conduct long-term user studies in more natural environments to uncover more eye-related features at different levels of fatigue during the day.
UbiComp brings together top researchers and practitioners in the research and development of pervasive, wireless, embedded, wearable and mobile technologies. It is presented by the Association for Computing Machinery (ACM), the world’s largest educational and scientific computing society. Founded in 1947, ACM works to promote high standards and recognize technical excellence. It has more than 100,000 members, about half of whom live outside the United States.
Citations
Bedri A, Li R, Haynes M, Kosaraju RP, Grover I, Prioleau T, Beh MY, Goel M, Starner T and Abowd G (2017), "EarBit: Using Wearable Sensors to Detect Eating Episodes in Unconstrained Environments," Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies, Vol. 1(3), pp. 37:1-37:20. New York, NY, USA: ACM.
Fang D, Hohman F, Polack P, Sarker H, Kahng M, Sharmin M, al'Absi M and Chau DH (2017), "mHealth Visual Discovery Dashboard," In Proceedings of the 2017 ACM International Joint Conference on Pervasive and Ubiquitous Computing and Proceedings of the 2017 ACM International Symposium on Wearable Computers, pp. 237-240. New York, NY, USA: ACM.
Rostaminia S, Mayberry A, Ganesan D, Marlin B and Gummeson J (2017), "iLid: Low-Power Sensing of Fatigue and Drowsiness Measures on a Computational Eyeglass," Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies, Vol. 1(2), pp. 23:1-23:26. New York, NY, USA: ACM.
Wagner III B, Liu E, Shaw SD, Iakovlev G, Zhou L, Harrington C, Abowd G, Yoon C, Kumar S, Murphy S, Spring B and Nahum-Shani I (2017), "eWrapper: Operationalizing Engagement Strategies in mHealth," In Proceedings of the 2017 ACM International Joint Conference on Pervasive and Ubiquitous Computing and Proceedings of the 2017 ACM International Symposium on Wearable Computers, pp. 790-798. New York, NY, USA: ACM.