A monitoring future without smartwatches, pendants, or readings relayed through your tablet? An MIT professor has developed a box, about the size of a Wi-Fi router, that can monitor a person’s vital signs throughout the house. Like a Wi-Fi router, the device emits a low-power wireless radio signal; it then measures how those signals reflect off the bodies in the residence. A neural network analyzes the tiny changes in the reflected electromagnetic signals to extract physiological data such as breathing, heart rate, posture, and gait, and can track the person as they move from room to room, even through walls. The device has also been tested on sleep patterns, including sleep stages, which means it could replace the awkward and artificial electrode setup usual for lab sleep testing.
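The core signal-processing idea (periodic chest motion modulating a reflected signal) can be sketched in a few lines. This is a toy illustration only, not MIT's pipeline, which relies on trained neural networks rather than simple spectral peak-picking:

```python
import numpy as np

def breathing_rate_bpm(reflection_amplitude, sample_rate_hz):
    """Estimate breathing rate from a reflected-signal amplitude series.

    Chest motion during breathing modulates the reflected RF amplitude
    at roughly 0.1-0.5 Hz; we pick the dominant peak in that band.
    """
    x = reflection_amplitude - np.mean(reflection_amplitude)
    spectrum = np.abs(np.fft.rfft(x))
    freqs = np.fft.rfftfreq(len(x), d=1.0 / sample_rate_hz)
    band = (freqs >= 0.1) & (freqs <= 0.5)   # plausible breathing band
    peak_hz = freqs[band][np.argmax(spectrum[band])]
    return peak_hz * 60.0  # breaths per minute

# Simulated 60 s of reflections: 15 breaths/min (0.25 Hz) plus noise
fs = 10.0
t = np.arange(0, 60, 1 / fs)
rng = np.random.default_rng(0)
signal = np.sin(2 * np.pi * 0.25 * t) + 0.2 * rng.standard_normal(t.size)
print(breathing_rate_bpm(signal, fs))  # ≈ 15
```

The real system's harder problems, separating multiple people, tracking through walls, and recovering the far weaker heartbeat signal, are exactly where the machine learning comes in.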
Dina Katabi, an MIT professor of electrical engineering and computer science, built this box in her lab. So far it has been tested in over 200 homes around the US, tracking baselines for healthy people and for those with Parkinson’s, Alzheimer’s, depression, and pulmonary diseases. In one Parkinson’s case, data gathered by the device over eight weeks in a patient’s home showed that his gait improved around 5 or 6 am, right around the time he took his medication. Data is encrypted, and Professor Katabi has stated that the setup process requires a user to complete a series of specific movements before they can be tracked. She has also cofounded a startup, Emerald Innovations, to commercialize the technology. If it proves workable beyond the test stage, it has the potential to revolutionize remote patient monitoring. Engadget, MIT Technology Review
OpenEMR, an open-source patient record system used in UK hospitals and others worldwide, has dozens of security flaws in its software, according to Project Insecurity, a London-based “tight-knit computer research organization which focuses primarily on educating the masses on the topics of information security” (per their corporate description on LinkedIn). Their report found vulnerabilities including: “a portal authentication bypass, multiple instances of SQL injection, multiple instances of remote code execution, unauthenticated information disclosure, unrestricted file upload, CSRFs including a CSRF to RCE proof of concept, and unauthenticated administrative actions.” OpenEMR has stated that it has now supplied patches to fix the vulnerabilities listed in the report. However, these flaws potentially put millions of patient records at risk for some time.
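To see why SQL injection tops lists like this, here is a minimal illustration (not OpenEMR's actual code) of the vulnerable pattern and its standard fix, using an in-memory SQLite table standing in for a patient database:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE patients (id INTEGER, name TEXT)")
conn.execute("INSERT INTO patients VALUES (1, 'Alice')")

user_input = "x' OR '1'='1"  # attacker-controlled value

# Vulnerable: string concatenation lets the input rewrite the query,
# so the OR clause matches every row in the table
vulnerable = conn.execute(
    "SELECT * FROM patients WHERE name = '" + user_input + "'"
).fetchall()
print(len(vulnerable))  # 1 -- every patient record leaked

# Fixed: a parameterized query treats the input as a literal value
safe = conn.execute(
    "SELECT * FROM patients WHERE name = ?", (user_input,)
).fetchall()
print(len(safe))  # 0 -- nobody is literally named "x' OR '1'='1"
```

The same principle, never splicing untrusted input into a query string, is what the OpenEMR patches for the SQL injection findings would need to enforce throughout the codebase.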
OpenEMR’s decentralized model has some drawbacks when it comes to security. By its own account, OpenEMR does not know how many organizations are affected, as registration for the open-source software is voluntary. Patches and security fixes are announced to the registration list, OpenEMR’s online forum and social accounts, the open-emr.org community, and OpenEMR vendors. While no data has been publicly exposed, the Project Insecurity report revealed the system’s risk to the healthcare organizations which use it. Also DigitalHealth and Project Insecurity on Twitter.
McAfee has confirmed another vulnerability: that vital signs reporting into a central monitoring station can be altered in real time. They tested a circa-2004 bedside monitor/central monitoring system reportedly still in use. The system monitored heartbeat, oxygen level, and blood pressure; used both wired and wireless networking over TCP/IP; and appeared to store patient information. The central monitoring station ran Windows XP Embedded, which presented one set of flaws, but far more accessible to a breach was the communication from the devices to the central monitoring system. In short, “the attacker simply has to send replacement data to the central station while appearing as the patient monitor.” The article demonstrates that vital signs can be altered by the time they reach the central monitoring station, creating the potential for misdiagnosis, unnecessary testing, and unneeded medication. The McAfee article lays out How to Mess With Vital Signs, Believably.
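The impersonation works because the protocol carries no proof of origin. A standard mitigation (which the 2004-era system evidently lacks, and which is sketched here as illustration rather than a description of any vendor's fix) is to authenticate each packet with a shared-key MAC:

```python
import hmac
import hashlib
import os

SHARED_KEY = os.urandom(32)  # provisioned to both monitor and central station

def sign_packet(payload: bytes) -> bytes:
    """Monitor side: append an HMAC tag so the station can verify origin."""
    tag = hmac.new(SHARED_KEY, payload, hashlib.sha256).digest()
    return payload + tag

def verify_packet(packet: bytes) -> bytes:
    """Station side: reject any packet whose tag doesn't match."""
    payload, tag = packet[:-32], packet[-32:]
    expected = hmac.new(SHARED_KEY, payload, hashlib.sha256).digest()
    if not hmac.compare_digest(tag, expected):
        raise ValueError("spoofed or corrupted packet")
    return payload

genuine = sign_packet(b"HR=72,SpO2=98")
print(verify_packet(genuine))  # b'HR=72,SpO2=98'

# An attacker without the key cannot forge a valid tag
forged = b"HR=30,SpO2=80" + os.urandom(32)
try:
    verify_packet(forged)
except ValueError as err:
    print(err)  # spoofed or corrupted packet
```

Without something like this (or transport-level authentication such as TLS), "appearing as the patient monitor" is as simple as sending packets from the monitor's address.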
Verily‘s visit to last week’s Health 2.0 conference took an odd-but-fun tack, comparing the data our bodies produce to the billions of data points generated by an average late-model automobile in normal operation. We generate a lot less (a ten-orders-of-magnitude difference, according to Verily Chief Technology Officer Brian Otis), but Verily wants to maximize the output by wiring us to multiple sensors and using the data in a predictive health model. Some of the Verily devices this Editor predicts will be non-starters (the sensor contact lens developed with Alcon), but others, like the Dexcom partnership to develop a smaller, cheaper continuous blood glucose monitor and Liftware, the tremor-canceling silverware company Google acquired in 2014, appear promising. Key to predictive health is the Study Watch, a wearable that collects a lot of data yet is easy to wear over long periods. Mobihealthnews
But what to do with All That Data? Where this differs from a car is that a car’s operational data goes into feedback loops that tune engine performance and provide long-term monitoring of the electrical system, braking, and more. (When the sensors go south or the battery’s low, watch out!) It’s not clear from the talk where this overwhelming amount of healthcare data goes and how it becomes useful to a person or a doctor. This has its own feedback loop, which this Editor dubbed a few years ago the Five Big Questions (FBQs): who pays, how much, who’s looking at the data, who’s actioning it, and how the data is integrated into patient records. That’s not answered, but presumably these technologies will incorporate machine learning and AI to Crunch That Data into bite-sized parts.
Which leads us back to Verily’s parent, Alphabet a/k/a Google. All that data flowing into Verily devices could be monitored by Google and fed into other Google programs like its search engine and AdWords. Another privacy problem?
Perhaps health systems are arriving at the realization that they have to crunch the data, not avoid it. For the first time, this Editor has observed that the CMIO of a small health system in Illinois and Sanford Health‘s executive director of analytics are actually welcoming patient data and research. Startups in this area such as PreventScripts labor on that “last mile” of clinical decision support: preventative medicine. EHRs are also in on the act. Epic launched Share Everywhere, where patients can grant access to their data and clinicians can send updates into the patient portal (MyChart). What’s needed, CMIO Goel admits, is software that combines natural language processing and algorithms to track by disease and specialty: once again, machine learning. Healthcare IT News
North Somerset Council (west of Bristol in the UK’s mid-southwest) provides care for more than 2,800 people. Its budget for adult social care this year is £65.3 million. Yet even with this large budget, the trend is not its friend, according to Hayley Verrico, the council’s assistant director of adult support and safeguarding. In addition to the demand created by more older people and the ‘old-old’ growing frailer, there are special-needs children who enter adult social care. The priority is to enable them to stay at home. Will this increased demand be met by technology? Ms. Verrico believes so, giving examples such as telecare and assistive technology for PERS, automatic tap (water) shutoffs, and door/wander sensors. The paradox is that carers also need to be trained in the meaningful monitoring and support-management side of home care and transitional care, encouraging the person to be more independent in activity, versus the traditional hands-on side of direct care.
This story is a chirping canary in the mine for the UK, EU, and the US. The US situation is in some ways worse: not only are we not set up for community-wide maintenance of adults at home, but most direct care workers are paid in the bottom quarter of US hourly wages with few perceived opportunities for advancement. Beyond monitoring, how do we handle the next meaningful step, telehealth and RPM? North Somerset Times
9to5Mac, the tip sheet for all things Apple, tracked down a patent granted to Apple (via Patently Apple) for computing health measurements using the iPhone. According to the patent, “electrical measurements may be used to measure heart function, compute an electrocardiogram, compute a galvanic skin response that may be indicative of emotional state and/or other physiological condition, and/or compute other health data such as body fat, or blood pressure.” It would use the front-facing camera, light sensor, and proximity sensor to emit light that would be reflected back to the sensors. Additional sensors mounted in the same area would generate further health measurements such as body fat and EKG, the latter already measured by the Kardia Mobile/AliveCor attachment. The camera and light sensor alone, based on the patent and the article, would measure oxygen saturation, pulse rate, perfusion index, and a photoplethysmogram (which can monitor breathing rate and detect circulatory conditions like hypovolemia). Another demonstration of Apple’s keen interest in the health field, but what features will show up on real phones and apps, and when?
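Oxygen saturation from emitted and reflected light is usually computed with the textbook "ratio of ratios" from pulse oximetry. The sketch below uses that classic approximation with illustrative calibration constants; it is not Apple's method, and real oximeters calibrate the curve empirically per device:

```python
import numpy as np

def spo2_estimate(red, infrared):
    """Classic ratio-of-ratios pulse-oximetry estimate.

    AC = pulsatile component (std) and DC = baseline (mean) of each
    optical channel; SpO2 ~= 110 - 25*R is a common empirical curve.
    """
    r = (np.std(red) / np.mean(red)) / (np.std(infrared) / np.mean(infrared))
    return 110.0 - 25.0 * r

# Simulated photodiode traces: a heartbeat ripple riding on a DC baseline
t = np.linspace(0, 10, 1000)
pulse = np.sin(2 * np.pi * 1.2 * t)   # ~72 bpm heartbeat
red = 1.0 + 0.02 * pulse              # weak pulsatile red signal
infrared = 1.0 + 0.05 * pulse         # stronger pulsatile IR signal
print(spo2_estimate(red, infrared))   # ≈ 100 for this toy R of 0.4
```

Pulse rate and the photoplethysmogram itself fall out of the same traces: the ripple's frequency is the heart rate, and its waveform is the PPG.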
Can you ever be too rich or too thin?
The latter seems to be achievable when it comes to skin patches that can monitor key vital signs like skin temperature, take electroencephalography (EEG) and electromyography (EMG, for muscles) readings, and measure hydration. The graphene used in this sensor, developed at the University of Texas at Austin, is 0.3 nm thick, embedded in a polymer 463 nm thick. Unlike Stanford’s stretchy sensor we profiled in November, this doesn’t stretch, but it is so thin as to be highly unobtrusive. It is made by growing single-layer graphene on a copper sheet, which is then coated with a stretchy support polymer. The copper is etched off and the polymer-graphene film placed on temporary tattoo paper. The wearer doesn’t sense it because, as the researchers put it, it is compliant with the nooks and crannies of human skin, and it doesn’t look obnoxious. It can be placed on the chest, the arm, or other locations as needed. Testing indicated good-quality signals; in fact, it detected EKG signals not registering on a conventional monitor. Presented at IEEE’s International Electron Devices Meeting (IEDM). IEEE Spectrum
This year, on the 10th Anniversary of Telehealth and Telecare Aware, we have invited industry leaders nominated by our readers to reflect on the past ten years and, if they wish, to speculate about the next ten. Here is the first article, with a UK focus, by Dr Kevin Doughty.
Many of us are frustrated at how little progress there has been in the deployment and acceptability of telecare during the past decade. Yet, despite warnings that an ageing population was about to bankrupt the NHS (and health insurance schemes elsewhere in the world), and that access to social care for older people was being withdrawn at such a rate that it could only be afforded by the wealthiest in society, our health and social care systems have just about survived.
But this can’t go on, and in England over the past 12 months: (more…)
Here is a tech-savvy person lamenting (ranting?) in VentureBeat that there’s no one place to put all of the health data he needs: weight, PHR (personal health record), his spin class and aerobic training data. Apple Health/Apple HealthKit? Only the weight, via a Withings scale, maps to it, and you have to scroll past oodles of data categories, such as your molybdenum levels, to get to more vital things like weight and heart rate. A Fitbit lasted three months in his life before being tossed in a drawer. What took center stage at International CES were more devices dumping more data that doesn’t map into a central database. He acidly notes that Apple HealthKit is free because it is worthless. Is there something broken here that we in telehealth need to deal with, quickly? My health data is killing me (figuratively). Hat tip to Tom Greene, posting in The King’s Fund LinkedIn group. Digital Health and Care Congress is this year 16-17 June; a reminder that the call for papers closes 13 Feb!
A neuroscience research team at the Medical College of Georgia (MCG) in Augusta has developed a way to analyze video of a person to measure their heart and breathing rates. Using any single-channel video camera, including a web or cell phone cam, in daylight, low light, or even at night via near-infrared images, they have developed algorithms that track how the body moves slightly by the way light reflects off it and is recorded. This approaches fair clinical accuracy against physical measurements, with false positives only 3 percent of the time and false negatives less than 1 percent. If productized to work with systems at scale, this could vastly assist telemedicine consults, especially at distance, and facilitate hands-free in-person examinations. Research published in PLOS ONE with a summary in Healthline News. Philips in July started marketing a ‘Vital Signs Camera’ app for $1.99 in the iTunes Store that also measures heart and breathing rates, but not to clinical quality.
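A toy version of the underlying idea (subtle, periodic changes in reflected light across video frames carrying a cardiac signature) can be sketched as follows. This is a simplification for illustration; the MCG algorithms are more sophisticated and work on real motion, not a synthetic brightness ripple:

```python
import numpy as np

def heart_rate_bpm(frames, fps):
    """Estimate heart rate from subtle frame-to-frame brightness changes.

    frames: array of shape (n_frames, height, width), grayscale.
    The pulse slightly alters reflected light each beat; its frequency
    shows up as the dominant spectral peak in the 0.7-3.0 Hz band.
    """
    trace = frames.reshape(len(frames), -1).mean(axis=1)
    trace = trace - trace.mean()
    spectrum = np.abs(np.fft.rfft(trace))
    freqs = np.fft.rfftfreq(len(trace), d=1.0 / fps)
    band = (freqs >= 0.7) & (freqs <= 3.0)  # plausible cardiac band
    return freqs[band][np.argmax(spectrum[band])] * 60.0

# Simulated 20 s of 30 fps video: a 1.2 Hz (72 bpm) ripple plus pixel noise
fps = 30
t = np.arange(0, 20, 1 / fps)
rng = np.random.default_rng(1)
frames = (128
          + 0.5 * np.sin(2 * np.pi * 1.2 * t)[:, None, None]
          + rng.normal(0, 2, (t.size, 8, 8)))
print(heart_rate_bpm(frames, fps))  # ≈ 72
```

Averaging over many pixels is what makes the tiny per-pixel signal recoverable; the reported low false-positive and false-negative rates suggest the team's method separates that signal from ambient light changes and gross motion far more robustly than this sketch.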