Verily’s presentation at last week’s Health 2.0 conference took an odd-but-fun tack, comparing the data received from human bodies to the billions of data points generated by an average late-model automobile in normal operation. We generate a lot less (ten orders of magnitude less, according to Verily Chief Technology Officer Brian Otis), but Verily wants to maximize the output by wiring us to multiple sensors and using the data in a predictive health model. This Editor predicts some of the Verily devices will be non-starters (the sensor contact lens developed with Alcon), but others, like the Dexcom partnership to develop a smaller, cheaper continuous blood glucose monitor and Liftware, the tremor-canceling silverware company Google acquired in 2014, appear promising. Key to predictive health is the Study Watch, a wearable that collects a great deal of data yet is easy to wear over the long term. Mobihealthnews
But what to do with All That Data? Where this differs from a car is that a car’s operational data goes into feedback loops that tune engine performance and provide long-term monitoring of the electrical system, braking, and more. (When the sensors go south or the battery’s low, watch out!) It’s not clear from the talk where this overwhelming amount of generated healthcare data goes and how it becomes useful to a person or a doctor. This has its own feedback loop this Editor dubbed a few years ago the Five Big Questions (FBQs): who pays, how much, who’s looking at the data, who’s actioning it, and how the data is integrated into patient records. That’s not answered, but presumably these technologies will incorporate machine learning and AI to Crunch That Data into bite-sized parts.
Which leads us back to Verily’s parent, Alphabet a/k/a Google. All that data flowing into Verily devices could be monitored by Google and fed into other Google programs such as its search engine and AdWords. Another privacy problem?
Perhaps health systems are arriving at the realization that they have to crunch the data, not avoid it. For the first time, this Editor has observed that a CMIO of a small health system in Illinois and Sanford Health’s executive director of analytics are actually welcoming patient data and research. Startups in this area, such as PreventScripts, labor on that “last mile” of clinical decision support: preventative medicine. EHRs are also in on the act. Epic launched Share Everywhere, through which patients can grant access to their data and clinicians can send updates into the patient portal (MyChart). What’s needed, CMIO Goel admits, is software that combines natural language processing and algorithms to track by disease and specialty–once again, machine learning. Healthcare IT News
North Somerset Council (west of Bristol in the UK’s mid-southwest) provides care for more than 2,800 people. Its budget for adult social care this year is £65.3 million. Yet even with this large budget, the trend is not its friend, according to Hayley Verrico, the council’s assistant director of adult support and safeguarding. In addition to the demand created by more older people and the ‘old-old’ growing frailer, there are special needs children who enter adult social care. The priority is to enable them to stay at home. Will this increased demand be met by technology? Ms. Verrico believes so, giving examples such as telecare and assistive technology for PERS, automatic tap (water) shutoffs, and door/wander sensors. The paradox is that carers also need to be trained in the meaningful monitoring and support-management side of home care, transitional care, and encouraging the person to be more independent in activity, versus the traditional hands-on side of direct care.
This story is a chirping canary in the mine in the UK, EU, and US. The US situation is in a way worse. Not only is the US not set up for community-wide maintenance of adults at home, but most direct care workers are paid in the bottom quarter of US hourly wages, with few perceived opportunities for advancement. Beyond monitoring, how do we handle the next meaningful step–telehealth and RPM? North Somerset Times
9to5Mac, the tip sheet for all things Apple, tracked down a patent granted to Apple (via Patently Apple) for computing health measurements using the iPhone. According to Apple in the patent, “electrical measurements may be used to measure heart function, compute an electrocardiogram, compute a galvanic skin response that may be indicative of emotional state and/or other physiological condition, and/or compute other health data such as body fat, or blood pressure.” The phone would use the front-facing camera, light sensor, and proximity sensor to emit light that would be reflected back to the sensors. Additional sensors mounted in the same area would generate further health measurements such as body fat and EKG, the latter already measured by the Kardia Mobile/AliveCor attachment. The camera and light sensor alone, based on the patent and the article, would measure oxygen saturation, pulse rate, perfusion index, and a photoplethysmogram (which can monitor breathing rate and detect circulatory conditions like hypovolemia). Another demonstration of Apple’s keen interest in the health field, but what features will show up on real phones and apps–and when?
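The patent does not disclose its signal processing, but the underlying principle of pulse detection from reflected light (photoplethysmography) is simple to sketch. Below is a minimal, hypothetical Python illustration, not Apple’s method: it recovers a pulse rate from a noisy reflected-light signal by finding the dominant frequency within a plausible heart-rate band.

```python
import numpy as np

def estimate_pulse_rate(light_signal, sample_rate_hz):
    """Estimate pulse rate (BPM) from a reflected-light intensity signal.

    Blood volume pulses change how much light the skin absorbs, so the
    reflected signal (a photoplethysmogram) oscillates at the heart rate.
    We take the strongest spectral peak within a plausible heart-rate band.
    """
    signal = np.asarray(light_signal, dtype=float)
    signal = signal - signal.mean()              # drop the DC component
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / sample_rate_hz)
    band = (freqs >= 0.7) & (freqs <= 3.5)       # 42-210 BPM
    dominant_hz = freqs[band][np.argmax(spectrum[band])]
    return dominant_hz * 60.0

# Synthetic demo: a 1.2 Hz (72 BPM) pulse buried in sensor noise
rng = np.random.default_rng(0)
t = np.arange(0, 10, 1 / 30)                     # 10 s sampled at 30 Hz
ppg = np.sin(2 * np.pi * 1.2 * t) + 0.3 * rng.standard_normal(len(t))
print(round(estimate_pulse_rate(ppg, 30)))       # ~72
```

A real implementation would add motion-artifact rejection and quality checks; the point is only that a camera-adjacent light sensor plus basic spectral analysis gets you surprisingly far.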
Can you ever be too rich or too thin?
The latter seems to be achievable when it comes to skin patches, which can monitor key vital signs like skin temperature, take electroencephalography (EEG) and electromyography (EMG, for muscles) readings, and measure hydration. The graphene used in this sensor, developed at the University of Texas at Austin, is 0.3 nm thick, in a polymer 463 nm thick. Unlike Stanford’s stretchy sensor we profiled in November, this doesn’t stretch, but it is so thin as to be highly unobtrusive. It is made by growing single-layer graphene on a copper sheet, which is then coated with a stretchy support polymer. The copper is etched off and the polymer-graphene placed on temporary tattoo paper. The wearer doesn’t sense it because, as the researchers put it, it is compliant with the nooks and crannies of human skin–and it doesn’t look obnoxious. It can be placed on the chest, the arm, or other locations as needed. Testing indicated good-quality signals and, in fact, detected EKG signals not registering on a conventional monitor. Presented at IEEE’s International Electron Devices Meeting (IEDM). IEEE Spectrum
This year, on the 10th Anniversary of Telehealth and Telecare Aware, we have invited industry leaders nominated by our readers to reflect on the past ten years and, if they wish, to speculate about the next ten. Here is the first article, with a UK focus, by Dr Kevin Doughty.
Many of us are frustrated at how little progress there has been in the deployment and acceptability of telecare during the past decade. Yet, despite warnings that an ageing population was about to bankrupt the NHS (and health insurance schemes elsewhere in the world), and that access to social care for older people was being withdrawn at such a rate that it could only be afforded by the wealthiest in society, our health and social care systems have just about survived.
But this can’t go on, and in England over the past 12 months: (more…)
Here is a tech-savvy person lamenting (ranting?) in VentureBeat that there is no one place to put all the health data he needs: weight, PHR (personal health record), spin class and aerobic training data. Apple Health/Apple HealthKit? Only the weight, via a Withings scale, maps to it, and you have to scroll past oodles of data categories, such as your molybdenum levels, to get to more vital things like weight and heart rate. Fitbit lasted three months in his life before being tossed in a drawer. What took center stage at International CES were more devices dumping more data that doesn’t map into a central database. He acidly notes that Apple HealthKit is free because it is worthless. Is there something broken here that we in telehealth need to deal with, quickly? My health data is killing me (figuratively). Hat tip to Tom Greene, posting in The King’s Fund LinkedIn group for the Digital Health and Care Congress, this year 16-17 June. A reminder–the call for papers closes 13 Feb!
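The complaint above is at bottom a data-integration problem: each device exports its own schema and nothing shares a timeline. As a rough sketch (the record shapes and source names here are invented for illustration and match no real device API), the unglamorous first step is normalizing each silo’s records into one common form:

```python
from datetime import datetime

# Hypothetical raw readings from three siloed sources; field names are
# invented for illustration, not drawn from any vendor's actual export.
scale_data = [{"ts": "2015-01-10T07:00", "kg": 82.5}]
tracker_data = [{"time": "2015-01-10T07:05", "resting_hr": 61}]
spin_class = [{"started": "2015-01-10T18:00", "avg_hr": 142, "minutes": 45}]

def normalize(ts, metric, value, unit, source):
    """Map a source's idiosyncratic record into one common shape."""
    return {"timestamp": datetime.fromisoformat(ts),
            "metric": metric, "value": value, "unit": unit, "source": source}

timeline = (
    [normalize(r["ts"], "weight", r["kg"], "kg", "scale") for r in scale_data]
    + [normalize(r["time"], "heart_rate", r["resting_hr"], "bpm", "tracker")
       for r in tracker_data]
    + [normalize(r["started"], "heart_rate", r["avg_hr"], "bpm", "spin_class")
       for r in spin_class]
)
timeline.sort(key=lambda rec: rec["timestamp"])  # one unified timeline
for rec in timeline:
    print(rec["timestamp"], rec["metric"], rec["value"], rec["unit"])
```

The hard part, of course, is not the merge but getting every vendor to agree on the metrics and units in the first place, which is exactly the gap the writer is complaining about.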
A neuroscience research team at the Medical College of Georgia (MCG) in Augusta has developed a way to analyze video of a person to measure heart and breathing rates. Using any single-channel video camera, including a web or cell phone cam, in daylight, low light, or even at night using near-infrared images, they have developed algorithms that track how the body moves slightly from the way light reflected off it is recorded. This can determine those vital signs with fair clinical accuracy, with false positives only 3 percent of the time and false negatives less than 1 percent. If productized and scaled, this could vastly assist telemedicine consults, especially at a distance, and facilitate hands-free in-person examinations. Research published in PLOS ONE with a summary in Healthline News. Philips in July started marketing a ‘Vital Signs Camera’ app for $1.99 in the iTunes Store that also measures heart and breathing rates, but not to clinical quality.
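The published algorithms are far more sophisticated, but the core idea, that tiny periodic body motions modulate the light the camera records, can be sketched in a few lines. This is a toy illustration on synthetic frames, not the MCG team’s method:

```python
import numpy as np

def breathing_rate_from_frames(frames, fps):
    """Estimate breaths per minute from a stack of grayscale video frames.

    Subtle body motion changes how light reflects into the camera, so the
    per-frame mean intensity oscillates at the breathing rate. We take the
    strongest spectral component in the respiratory band (0.1-0.7 Hz).
    """
    brightness = frames.reshape(len(frames), -1).mean(axis=1)
    brightness = brightness - brightness.mean()   # drop the DC component
    spectrum = np.abs(np.fft.rfft(brightness))
    freqs = np.fft.rfftfreq(len(brightness), d=1.0 / fps)
    band = (freqs >= 0.1) & (freqs <= 0.7)        # 6-42 breaths/min
    return freqs[band][np.argmax(spectrum[band])] * 60.0

# Synthetic demo: 60 s of tiny 8x8 frames at 10 fps, with a 0.25 Hz
# (15 breaths/min) sway in overall brightness plus pixel noise
rng = np.random.default_rng(1)
t = np.arange(0, 60, 1 / 10)
frames = (100 + 2 * np.sin(2 * np.pi * 0.25 * t))[:, None, None] \
    + rng.normal(0, 1, (len(t), 8, 8))
print(breathing_rate_from_frames(frames, 10))     # ~15 breaths/min
```

Averaging over all pixels is what makes the noise tolerable: each frame’s mean is far steadier than any single pixel, which is one reason even a cheap single-channel camera can work at a distance.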