A “smart home” prototype may help people with dementia dress themselves through automated assistance, enabling them to maintain independence and dignity and providing their caregivers with a much-needed respite.
Dressing is one of the most common and stressful activities for both people with dementia and their caregivers, because of the complexity of the task and the loss of privacy it entails. Research shows that adult children find it particularly challenging to help dress a parent, especially one of the opposite gender. Smart home concepts are now being applied for the benefit of this population: researchers have developed DRESS, an intelligent dressing system that integrates automated tracking and recognition with guided assistance, with the goal of helping a person with dementia get dressed without a caregiver in the room.
The prototype uses a combination of sensors and image recognition to track progress during dressing, relying on barcodes on the clothing to identify the type, location and orientation of each garment. A five-drawer dresser is topped with a tablet, camera and motion sensor, and is organized with one piece of clothing per drawer, in an order that follows the individual's dressing preferences. A skin conductance sensor worn as a bracelet monitors the person's stress levels and related frustration.
The DRESS system is activated and monitored by the caregiver through an app. The person with dementia receives an audio prompt, recorded in the caregiver's voice, to open the top drawer, which lights up at the same time. The barcodes on the clothing in the drawers are detected by the camera. If an item of clothing is put on correctly, the DRESS system prompts the person to move to the next step; if it detects an error or a lack of activity, audio prompts offer correction and encouragement. If it detects ongoing problems or an increase in stress levels, the system can alert the caregiver that help is needed.
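The decision logic described above can be sketched as a small function. This is a minimal illustration of the behavior the article describes, not the actual DRESS implementation; the function name, inputs, action labels, and stress threshold are all assumptions made for the example.

```python
from enum import Enum


class Action(Enum):
    """Possible system responses at any point in a dressing step (names assumed)."""
    PROMPT_NEXT_STEP = "prompt_next_step"        # item put on correctly
    PLAY_CORRECTIVE_PROMPT = "play_corrective"   # error or inactivity detected
    ALERT_CAREGIVER = "alert_caregiver"          # ongoing issues / rising stress
    WAIT = "wait"                                # dressing still in progress


# Hypothetical normalized skin-conductance level above which the
# bracelet sensor would be treated as indicating distress.
STRESS_THRESHOLD = 0.8


def next_action(item_on_correctly: bool,
                activity_detected: bool,
                stress_level: float) -> Action:
    """Decide the system's response for the current dressing step.

    Mirrors the article's description: correct dressing advances to the
    next step, an error or lack of activity triggers a corrective audio
    prompt, and elevated stress alerts the caregiver.
    """
    if stress_level >= STRESS_THRESHOLD:
        return Action.ALERT_CAREGIVER
    if item_on_correctly:
        return Action.PROMPT_NEXT_STEP
    if not activity_detected:
        return Action.PLAY_CORRECTIVE_PROMPT
    return Action.WAIT  # person is still actively dressing; keep monitoring
```

In a real system this check would run continuously against the camera and sensor streams; the sketch only captures the branching the article describes.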
To test the ability of the DRESS prototype to accurately detect proper dressing, 11 healthy participants simulated common dressing scenarios, ranging from normal dressing to putting a shirt on inside out or backwards, or dressing only partially, typical issues that challenge a person with dementia.
Using the combination of sensors and software, the system demonstrated that it could detect clothing orientation and position and infer a person's current state of dressing. In the initial phases of donning either shirts or pants, the DRESS prototype correctly detected participants' clothing in 384 of 388 cases. However, it could not consistently identify when a participant had finished putting on an item of clothing, missing these final cues in 10 of 22 cases for shirts and 5 of 22 cases for pants.
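Expressed as percentages, the counts reported above work out as follows (the counts are from the article; the helper function is mine):

```python
def success_rate(successes: int, trials: int) -> float:
    """Return the success rate as a percentage, rounded to one decimal."""
    return round(100 * successes / trials, 1)


# Initial-phase clothing detection across shirts and pants.
initial_detection = success_rate(384, 388)       # ~99%

# Final "item fully on" cues: 10 of 22 missed for shirts, 5 of 22 for pants.
shirt_completion = success_rate(22 - 10, 22)     # ~55%
pants_completion = success_rate(22 - 5, 22)      # ~77%
```

The gap between near-perfect initial detection and the much lower completion-cue rates is what motivated the improvements described next.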
Improvements implemented on the basis of these findings included enlarging the barcodes, minimizing the folding of garments so that barcodes are not obscured, and optimizing the positioning of participants relative to the DRESS prototype.
Scientists from New York University, MGH Institute of Health Professions and Arizona State University participated in this research, which is published in JMIR Medical Informatics.