
Always-on Health Care, Always-on Data

Opinion

Xconomy San Francisco — 

Thanks to major efforts to modernize traditional medicine over the last few decades, many more key stakeholders across health care are better connected. These new connections—among patients, physicians, hospitals, research organizations, and more—will continue to help drive efficiencies, improve collaboration, and provide a clearer path to optimal care.

Technological advancements will help us better understand the massive amounts of patient data now being collected every day. They will support the next scientific breakthroughs and lead to new patient-centered care models. The goal, of course, is better, more personalized care at lower cost. And at the center of this new world order are data and real-time data analytics.

Unprecedented amounts of data are streaming in from a host of sources, from clinical research to electronic health records to consumers tracking their vital signs with wearable devices. All of this data must be stored, protected, and analyzed, and it must be available at all times.

Imagine a patient with early-stage Parkinson’s describing his or her symptoms to an intake nurse or physician. Even with a daily medical journal, it can be hard for patients to recall symptoms such as hand tremors, describe how they felt, and report when they occurred. Mobile apps and wearable devices can help monitor physical data, measure disease progression, and offer a clearer path to better care. The data can also offer new insights into the disease and its treatments.
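
To make this concrete, the sketch below shows the kind of signal processing a wearable app might run on raw accelerometer data. It is a minimal illustration, not any vendor's actual method: the 50 Hz sampling rate and the synthetic input are assumptions, and the 4-6 Hz band is simply the range typically associated with Parkinsonian rest tremor.

```python
# Minimal sketch: estimating hand-tremor frequency from wearable
# accelerometer samples. The sampling rate and synthetic input are
# assumptions for illustration; 4-6 Hz is the band typically
# associated with Parkinsonian rest tremor.
import numpy as np

SAMPLE_RATE_HZ = 50          # assumed wearable sampling rate
TREMOR_BAND = (4.0, 6.0)     # typical rest-tremor frequency band

def dominant_tremor_frequency(samples: np.ndarray) -> float | None:
    """Return the strongest frequency inside the tremor band, or None."""
    centered = samples - samples.mean()          # drop the gravity/DC offset
    spectrum = np.abs(np.fft.rfft(centered))     # magnitude spectrum
    freqs = np.fft.rfftfreq(len(centered), d=1.0 / SAMPLE_RATE_HZ)
    in_band = (freqs >= TREMOR_BAND[0]) & (freqs <= TREMOR_BAND[1])
    if not in_band.any():
        return None
    peak = np.argmax(spectrum[in_band])
    return float(freqs[in_band][peak])

# Example: 10 seconds of synthetic data containing a 5 Hz tremor.
t = np.arange(0, 10, 1.0 / SAMPLE_RATE_HZ)
accel = 0.3 * np.sin(2 * np.pi * 5.0 * t) + 0.05 * np.random.randn(len(t))
print(dominant_tremor_frequency(accel))  # prints ~5.0
```

Logged continuously, even a simple summary statistic like this gives clinicians an objective record to compare against the patient's own recollection.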

Big Data, Big Challenges
This drive toward faster, more personalized care enhanced by large data sets brings new challenges. These challenges can be broken into four key areas. Some are widely recognized, while others only rise to the forefront once you’re waist-deep in this data deluge.

1) Fragmentation and Data Movement
2) Compliance and Security
3) Scale
4) Variety

Fragmentation and Data Movement
Health data sets reside in multiple systems. Some are built on modern data platforms such as NoSQL databases and Hadoop, while others reside in legacy systems that are twenty years old. For example, the typical Medicare patient in the early 2000s saw an average of two primary care physicians and five specialists across four different practice settings. Just think about the number of systems in use across these varied environments.

All this information may allow practitioners to make personalized treatment decisions more quickly, but only if all these different data sets can be analyzed together. Organizations therefore need an effective way to aggregate data from these disparate sources.
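
As a minimal illustration of what that aggregation involves, the sketch below normalizes records from two hypothetical sources, a legacy EMR export and a wearable feed, into one shared schema keyed by patient ID. All of the field names and payload shapes here are invented for the example, not any vendor's actual format.

```python
# Minimal sketch: normalizing patient records from two hypothetical
# source systems into one schema so they can be analyzed together.
# Field names and payload shapes are illustrative assumptions.
from collections import defaultdict

def from_legacy(row: dict) -> dict:
    # Hypothetical legacy EMR export: flat rows, abbreviated columns.
    return {"patient_id": row["PAT_ID"],
            "systolic_bp": int(row["SYS_BP"]),
            "source": "legacy_emr"}

def from_wearable(event: dict) -> dict:
    # Hypothetical wearable feed: nested JSON-style payloads.
    return {"patient_id": event["user"]["id"],
            "systolic_bp": event["vitals"]["systolic"],
            "source": "wearable"}

def aggregate(legacy_rows, wearable_events):
    """Group normalized observations by patient for joint analysis."""
    by_patient = defaultdict(list)
    for row in legacy_rows:
        rec = from_legacy(row)
        by_patient[rec["patient_id"]].append(rec)
    for event in wearable_events:
        rec = from_wearable(event)
        by_patient[rec["patient_id"]].append(rec)
    return by_patient
```

The hard part in practice is agreeing on that shared schema and on a common patient identifier; once both exist, combining sources is mechanical.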

Data transfer from one system to another brings its own set of challenges. Companies must consider infrastructure issues and ensure that they have suitable network bandwidth. They also have to prevent thousands of data requests from overloading production systems.
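
One common safeguard is to throttle bulk reads so they never exceed a rate the production system can absorb. Below is a minimal token-bucket sketch; the rates shown and the fetch_record() call are hypothetical placeholders, not a reference to any particular system.

```python
# Minimal sketch: a token-bucket throttle for bulk extraction jobs,
# so copying data off a production system cannot overload it.
import time

class TokenBucket:
    def __init__(self, rate_per_sec: float, burst: int):
        self.rate = rate_per_sec      # tokens replenished per second
        self.capacity = burst         # maximum burst size
        self.tokens = float(burst)
        self.last = time.monotonic()

    def acquire(self) -> None:
        """Block until one request token is available."""
        while True:
            now = time.monotonic()
            self.tokens = min(self.capacity,
                              self.tokens + (now - self.last) * self.rate)
            self.last = now
            if self.tokens >= 1:
                self.tokens -= 1
                return
            time.sleep((1 - self.tokens) / self.rate)

# Usage: cap a bulk export at 100 reads/second with bursts of 20.
bucket = TokenBucket(rate_per_sec=100, burst=20)
# for record_id in ids_to_copy:           # hypothetical ID list
#     bucket.acquire()
#     fetch_record(record_id)             # hypothetical production read
```

A read replica or change-data-capture feed removes the load from production entirely, but a throttle like this is the simplest first line of defense.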

Compliance and Security
The recent spate of data breaches in healthcare and other industries shines a light on the number of ways that confidential data can be captured. Healthcare data is more vulnerable now due to the combination of large data sets, data movement, and multiple access points, including smartphones, tablets, and computers.

While we won’t go through the litany of steps that companies need to take to ensure that data is appropriately secured, companies should consider at least the following:

1) Which data elements, while not covered by HIPAA regulations, still need to be protected as they move across data systems
2) Whether to encrypt and mask data both at rest and in motion
3) How to apply masking consistently, so that data from multiple sources can still be analyzed together (as sketched below)
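
On the third point, one widely used approach is deterministic, keyed masking: the same identifier always yields the same token, so masked data sets from different systems can still be joined. The sketch below uses HMAC-SHA256 for this; the key value and the truncation length are illustrative assumptions, and a real deployment would pull the key from a managed secret store.

```python
# Minimal sketch: deterministic, keyed masking of an identifier.
# The same input always yields the same token wherever the key is
# shared, so masked data sets can still be joined for analysis.
# The hard-coded key is a placeholder; manage it as a real secret.
import hmac
import hashlib

MASKING_KEY = b"replace-with-a-managed-secret"

def mask_identifier(value: str) -> str:
    """HMAC-SHA256 pseudonym: stable across systems and not
    reversible without the key."""
    digest = hmac.new(MASKING_KEY, value.encode("utf-8"), hashlib.sha256)
    return digest.hexdigest()[:16]   # truncated for readability

# The same ID masks identically in every pipeline that shares the key:
print(mask_identifier("patient-00042"))
print(mask_identifier("patient-00042"))  # same token both times
```

Unlike a plain hash, the keyed construction resists dictionary attacks on low-entropy identifiers, which is why it is preferred for pseudonymizing patient IDs.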

Scale
Data sets these days are often measured in petabytes. Both vendors and healthcare companies share the goal of “unbounded scalability.” In a recent article, Penn Medicine said that it has increased its power to analyze data by linking many commodity computers into a petabyte-scale cluster. Penn can now improve its diagnosis of particular diseases while cutting the time to produce an analysis by 24 hours.

Scale is relevant from a data processing perspective, which is why platforms such as Hadoop, Cassandra, and others have become popular. But scale is also important from a data management perspective. In other words, companies not only need to ensure rapid data processing; they also need to ensure that data sets of any size are always available to everyone who needs them. This is true whether the data set is a few megabytes from a Runkeeper app or petabytes of clinical trial data.

Variety
Data sets are not only becoming larger in scale but are also increasingly varied, ranging from highly structured clinical trial data to unstructured electronic health records. Companies need to make sure that their diverse data sets are available in native form, not caged within proprietary formats that interfere with optimal storage and retrieval.

Prepare Today. Lead Tomorrow.
According to the World Health Organization, worldwide healthcare spending grew 2.6 percent last year and is expected to continue growing at an average rate of 5.3 percent per year for the next four years. In the U.S. alone, national healthcare spending is expected to surpass $3.2 trillion this year. Improved management of all these new health-related data sets will help streamline care, dramatically reduce inefficiencies, and, in the end, reduce overall spending.

Technology bellwethers like Apple and Google are attacking health care challenges with initiatives such as ResearchKit and Calico. This signals a rapid convergence of technology, data, and health care.

The time is now. Preparing today supports a tomorrow where data will continue to shape care and improve patient outcomes. Patterns uncovered by data analysis will improve diagnosis and treatment, but only if that data is accurate and readily available to those who need it.

Data will help fuel the next generation of personalized medicine and care. It will inform us in ways that were unimaginable not long ago, and certainly before many legacy systems were built and implemented. The challenges are great, but the opportunities are endless.

——————————————————————————————————

Nitin Donde is the Founder and CEO of Talena, Inc., a provider of software designed to support “always on” Big Data applications through backup, recovery, and other data management capabilities. Follow @Talena_Inc
