An older woman, still living independently in a senior care community, contracts a urinary tract infection. She ignores it for a few days, because she’s had them before and maybe it will go away. She takes more trips to the bathroom. The infection worsens and creates side effects including dizziness. She falls, breaks her hip, and eventually ends up with a trip to the emergency room, months of rehabilitation, a mountain of medical bills, and diminished quality of life.
That hypothetical scenario exemplifies the disastrous health events that a team of Washington State University researchers is trying to predict and prevent with a combination of sensors and machine learning algorithms, trained by experienced clinicians.
“We literally are trying to train an algorithm to detect and think like a nurse would, because that’s what’s needed to prevent health events,” says Roschelle “Shelly” Fritz, assistant professor in the WSU College of Nursing.
In the case of the UTI, motion sensors (visible in the image above) would detect the woman’s more frequent trips to the bathroom. A machine learning algorithm would be trained to recognize this departure from her normal routine and send an alert to a nurse or care provider assigned to the woman, prompting a check-in. The nurse could then diagnose the urinary tract infection and prescribe antibiotics to treat it before the symptoms worsen and trigger the catastrophic fall.
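In spirit, that kind of routine-deviation check can be sketched in a few lines. This is an illustrative toy, not the WSU team’s actual code; the function name, threshold, and counts are all hypothetical:

```python
from statistics import mean, stdev

def bathroom_visit_alert(daily_counts, today_count, z_threshold=2.0):
    """Flag a departure from an individual's baseline bathroom-visit frequency.

    daily_counts: per-day visit counts from recent history (the baseline).
    today_count: today's sensor-derived visit count.
    Returns True when today deviates enough to warrant a nurse check-in.
    """
    baseline_mean = mean(daily_counts)
    baseline_sd = stdev(daily_counts) or 1.0  # guard against a perfectly flat baseline
    z_score = (today_count - baseline_mean) / baseline_sd
    return z_score >= z_threshold

# A week of typical counts, then a day with many more trips:
history = [5, 6, 5, 7, 6, 5, 6]
print(bathroom_visit_alert(history, 12))  # an unusually high count triggers an alert
```

A real system would learn each resident’s baseline per activity rather than rely on a fixed z-score cutoff, but the underlying question is the same: is today meaningfully different from this person’s normal?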
Just such a system is being developed through a pilot project at a Touchmark retirement community in Spokane, WA. It’s not a new idea—other researchers and companies have been testing similar kinds of sensors and software—but what stands out is the tighter link between clinical expertise and the artificial intelligence techniques being used. The multidisciplinary human and machine intelligence underpinning the WSU system has been years in the making, boosted by a recent federal grant and supported by Touchmark’s charitable foundation for its enormous promise in reduced healthcare costs and improved quality of life.
“One of the big impacts of this is going to be from a human perspective, improving people’s quality of life, because it’s going to extend their health and their ability to stay in their homes longer,” Fritz says.
[Editor’s note: Fritz will talk about her research at Xconomy’s upcoming Healthcare + A.I. Northwest event on Nov. 9 in Seattle (agenda and registration). See more of Xconomy’s ongoing series on A.I. in healthcare here.]
Discerning Daily Life
It starts with the sensors. WSU computer science professor Diane Cook and her team at the Center for Advanced Studies in Adaptive Systems have built scores of “smart environments” around the world over the last several years—outfitting homes with sensors that gather raw data when people move across a room, open a door, manipulate an object, or turn on an appliance. The smart environments include Zigbee low-power wireless data networks to gather the raw sensor data, which is time-stamped and identified, then stored in a relational database.
The next task was to teach algorithms to discern patterns from the sensor data—the unique sequence of sensor triggers over specific time intervals—that correspond to activities of daily life, such as resting, grooming, cooking, using the bathroom, and dozens more.
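Framed as code, the recognition task maps a window of time-stamped sensor triggers to an activity label. The sketch below is a deliberately naive stand-in (the sensor names and activity signatures are invented); the real system learns these patterns from annotated data rather than hard-coding them:

```python
from collections import Counter

# Hypothetical mapping from dominant sensor locations to activities.
SIGNATURES = {
    "kitchen_motion": "cooking",
    "bathroom_motion": "using the bathroom",
    "bed_pressure": "resting",
}

def label_window(events):
    """Label a window of sensor events with the most plausible activity.

    events: list of (timestamp, sensor_id) tuples from the home's network.
    Returns the activity tied to the most frequently triggered sensor,
    or "unknown" when no known sensor dominates the window.
    """
    counts = Counter(sensor for _, sensor in events)
    if not counts:
        return "unknown"
    top_sensor, _ = counts.most_common(1)[0]
    return SIGNATURES.get(top_sensor, "unknown")

window = [(0, "kitchen_motion"), (5, "kitchen_motion"), (9, "bathroom_motion")]
print(label_window(window))  # prints: cooking
```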
It was a painstaking process that began with students carrying out simple scripts in the sensor-equipped environments. The scripts became more complex. Errors were injected. Multiple activities were woven together to better mimic what the system would encounter in practice. Soon, the system was ready to be deployed in homes with real people going about their daily lives, telling the researchers what they were doing. A team of annotators labeled the incoming data, creating a “ground truth” to match what was happening in the home with what the sensors observed.
“That took a while to get robust, and even now I would say it’s not perfect,” Cook says. The system can recognize more than 40 activities of daily living with 98 percent accuracy.
Performance of the system degrades when there is more than one person living in the home. Pets, cats in particular, can confuse the system. “They tend to teleport in the space, creating aberrations,” Cook says.
But even cats have their habits, and the software adjusts to each individual’s patterns. In one home, for example, the cat would run up and down the stairs after the residents went to bed, she says. “We do activity and pattern discovery in each person’s home,” Cook says, noting that beyond the clinically relevant activities common to everyone, the system can spot the specific movement patterns associated with an individual’s hobbies, which can signal well-being.
‘Clinician in the Loop’
With that foundation in place, the researchers are now focusing on training the system to detect patterns that indicate a present or looming health event, such as the worsening of a chronic condition or increasing likelihood of a fall. They’re also continuing work, with WSU psychology professor Maureen Schmitter-Edgecombe, on ways the system can help those with declining cognitive function, such as with automated reminders to help keep track of daily tasks.
They’ve outfitted five residences at the Touchmark continuing care retirement community in Spokane with sensors. The people living in the homes are older adults with two or more chronic conditions, picked specifically for the study because they were now, or soon would be, in active decline. The researchers won a $1.8 million grant from the National Institute of Nursing Research earlier this year to continue the pilot study for another five years.
Like training the algorithm to recognize activities of daily living, teaching it to spot and even predict anomalies that signal a change in health is a labor-intensive process that demands deep medical expertise.
“If it’s not done in a way that can be clinically validated, it will be totally ignored by the clinical world and our time will be wasted,” Cook says.
Moreover, for it to be useful in a real-world setting, the system must keep false positives to a minimum. “We need to supervise the anomaly detection process, through feedback from the clinical team, and find anomalies that are clinically relevant in the data,” Cook says.
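One minimal way to picture that supervision loop (a sketch with invented labels, not the project’s pipeline) is a filter that lets clinician feedback veto anomaly types that turn out to be noise:

```python
def clinically_filtered_alerts(anomalies, clinician_feedback):
    """Keep only anomalies a clinician has judged clinically relevant.

    anomalies: list of anomaly labels flagged by the detector.
    clinician_feedback: dict mapping labels to True (relevant) / False (noise).
    Unreviewed anomalies pass through so the clinician can triage them.
    """
    return [a for a in anomalies if clinician_feedback.get(a, True)]

flags = ["frequent_bathroom_trips", "cat_on_stairs", "missed_breakfast"]
feedback = {"cat_on_stairs": False}  # the cat's nightly run isn't a health event
print(clinically_filtered_alerts(flags, feedback))  # the cat anomaly is dropped; the other two remain
```

Over time, that feedback would feed back into training so the detector stops flagging the irrelevant patterns in the first place, which is what keeps false positives low enough for clinical use.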
That’s where clinicians like Shelly Fritz come in. Fritz spent 25 years as a nurse in emergency rooms, public health clinics, and hospital administration before pursuing her PhD at WSU, where she encountered Cook’s ongoing smart-home research.
Her work involves cross-referencing the sensor data—before the activities have been labeled by the machine learning algorithms—with quantitative and qualitative health information gathered through weekly telehealth visits and monthly in-person nursing assessments with residents of the sensor-equipped homes.
If a patient tells Fritz she had a bad day last week—her restless leg syndrome acted up, for example—Fritz can retrospectively analyze the sensor data to look for “any movement or motion that relates to what they’ve described on that particular day,” she says.
Fritz has caught several falls in the sensor data, correlated with patient reports. She asks patients to describe what they were doing before and after the fall, and where it occurred.
“We need that context to be able to train context-aware algorithms,” she says. She refers to her role in this developing system as “a clinician in the loop.”
(Machine learning algorithms trained with an assist by humans—either through data labeling or output correction—are often described as having a “human in the loop.”)
Healthcare providers, under pressure to make data-driven care decisions, also need a system that allows them to quickly consume the information and alerts. So, a second focus of the grant is to develop data visualizations that will be useful to doctors, nurses, and other care staff.
“How do you take millions of data points and aggregate them into a visual analytic that a clinician can look at in less than 60 seconds … and trust that what I’m seeing is evidence-based and might be useful in my decision-making process?” Fritz says.
Healthcare ‘Force Multiplier’
While much work remains to train and refine the system to detect health events, senior care providers such as Touchmark, which develops and operates retirement communities in nine states and one Canadian province, are eager to see it come to fruition.
Touchmark vice president Bret Cope says the WSU system could be a “force multiplier” for nursing and care staff—not a replacement for them—as the senior care industry confronts the “silver tsunami” of ageing baby boomers, alongside an ongoing shortage of housing units and healthcare providers.
Cope also chairs the nonprofit Touchmark Foundation, which provides grants and scholarships to support nursing education and research, including at WSU. “The supply is nowhere near keeping up with current needs,” Cope says of staffing levels.
Cope says the smart home sensor technology is easy enough to install in the company’s senior residences, which range from independent homes and cottages attached to senior living communities, to assisted-living apartments, nursing homes, and memory care services. (There’s a huge opportunity to build retirement homes with this technology in the coming years. Cope, citing data from the American Seniors Housing Association, says the U.S. will need more than 3 million senior housing units by 2040, of which 2 million still need to be built.)
“The A.I. engineering is where the genius is,” Cope says.
A smart home system could not only help care staff intervene earlier to stave off emergencies, but also help them triage patient interactions and assessments, so a nurse with 100 patients in her care could know who needs to be seen first, based on measured changes in their patterns of daily living.
Cope imagines a call center full of nurses reviewing incoming sensor data and responding proactively to individuals who are deviating from their patterns.
“It extends the reach of true healthcare using technology, because the cost of healthcare goes down dramatically when you don’t have an emergency,” he says.
Fritz, whose PhD dissertation focused on adoption of technology in senior care settings, says many older adults are willing to accept some loss of privacy—motion sensors, but not microphones and cameras—if it allows them to age in place and extend their independence.
Cope says demand is growing for wearable sensors and services such as Life Alert—“Help! I’ve fallen and I can’t get up!”—that provide a level of assurance and real-time monitoring of an ageing loved one’s well-being.
Smart sensor networks, which can fade into the background of a home, are “in many ways more subtle,” he says.
Business Use Cases
As smart sensors and microphone-equipped speakers backed by machine learning algorithms proliferate, startups, established healthcare companies, and technology giants are pursuing applications across healthcare.
One example is Atlas5D, a Cambridge, MA-based startup making software to run sensor-equipped devices that help track the activity of people suffering from chronic diseases. Bigger players including Intel, GE, IBM Watson, and Honeywell have their own longstanding initiatives.
Right now, smart home systems such as the one the WSU researchers are developing would be too expensive for most individuals to purchase on their own, Cook says.
But if they can be shown to prevent health events such as falls—which cost Medicare $31 billion in 2015—she sees a role for insurance companies to subsidize their costs. “In return, they will not be called on as much for claims due to individuals not getting the anticipatory help or medical care because there wasn’t a fast response,” Cook says.
Cope at Touchmark says retirement communities are ideal for the technology, particularly if it can be shown to extend people’s comfort and keep them in their residences longer, reducing costs to refill units when health calamities force older adults to move to more intensive care settings.
The savings are related to improving quality of life, Cope says. “That’s the early sweet spot as a business application.”