Weekend ‘Must Read’: Are Big Tech/Big Pharma’s health tech promises nothing but a dangerous fraud?

If it sounds too good to be true, it isn’t. And watch your wallet. Those 14 words summarize Leeza Osipenko’s theme in this article. It may seem to our Readers that Editor Donna is going for clicks with that headline, but not really. Dr. Osipenko’s term is ‘snake oil’: a quaint, vintage term for the deceptive marketing of completely ineffective remedies, redolent of 19th Century hucksters and ‘The Music Man’. Its real meaning is fraud.

The promise is that Big Data, using Big Analytics, Big Machine Learning, and Big AI, will be a panacea for All That Ails Healthcare. It will save the entire system and the patient money, revolutionize medical decision making, save doctors time, increase accuracy, and in general save us from ourselves. Oh yes, and we do need saving, because our Big Tech and Big Health betters tell us so!

Major points from Dr. Osipenko’s Project Syndicate article, which is short but provocative, follow. Bonus content is available via a link to a London School of Economics panel discussion podcast (39 min.):

  • Source data is flawed. It is subject to error, subjective clinical decision-making, lack of structure and standardization, and general GIGO (garbage in, garbage out).
  • However, Big Data is sold to health care systems and the general public as if none of these potentially dangerous limitations existed.
  • Where are the long-range studies that can objectively compare and test the quality and outcomes of using this data? Nowhere to be found yet. It’s as if we are in 1900, with no Pure Food and Drug Act and no FDA or FTC to provide oversight.
  • It is sold into health systems as beneficial and completely harmless. Have we already forgotten the scandal of Ascension Health, the largest non-profit health system in the US, and Google Health simply proceeding off their BAA as if they had consent to use identified data from practices and patients, and HIPAA didn’t exist? 10 million healthcare records were breached, and HHS brought it to a screeching halt.
    • Our TTA article of 14 Nov 19 goes into why Google was so overeager to move this project forward, fast, and break a few things like rules.
  • We as individuals have no transparency into these systems. We don’t know what they know about us, or if it is correct. And if it isn’t, how can we correct it?
  • “Algorithmic diagnostic and decision models sometimes return results that doctors themselves do not understand”–great if you are being diagnosed.
  • Big Data demands a high level of math literacy. Most decision makers are not data geeks. And those of us who work with numbers are often baffled by results, only to find later that the calcs are el wrongo–this Editor speaks from personal experience with simple CMS data sets.
  • In order to be valuable, AI and machine learning demand access to potentially sensitive data. What’s the tradeoff? Where’s the consent?

Implicit in the article is the question: cui bono?

  • Google and its social media rivals want data on us to monetize–in other words, sell stuff to us. Better health and outcomes are just a nice side benefit for them.
  • China. Our Readers may also recall from our April 2019 article that China is building the world’s largest medical database, free of those pesky Western democracy privacy restrictions, and using AI/machine learning to create a massive set of diagnostic tools. They aren’t going to stop at China and, as recent developments around intellectual property theft and programmed back doors indicate, will go to great lengths to obtain Western data. Tencent and Fosun are playing by Chinese rules.

In conclusion:

At the end of the day, improving health care through big data and AI will likely take much more trial and error than techno-optimists realize. If conducted transparently and publicly, big-data projects can teach us how to create high-quality data sets prospectively, thereby increasing algorithmic solutions’ chances of success. By the same token, the algorithms themselves should be made available at least to regulators and the organizations subscribing to the service, if not to the public.

and

Having been massively overhyped, big-data health-care solutions are being rushed to market without meaningful regulation, transparency, standardization, accountability, or robust validation practices. Patients deserve health systems and providers that will protect them, rather than using them as mere sources of data for profit-driven experiments.

Hat tip to Steve Hards.

Google’s ‘Project Nightingale’–a de facto breach of 10 million health records, off a bridge too far?

Breaking News. Has this finally blown the lid off Google’s quest for data on everyone? This week’s uncovering, whistleblowing, and general backlash over Google’s agreement with Ascension Health, the largest non-profit health system in the US and the largest Catholic health system on Planet Earth, revealed by the Wall Street Journal (paywalled), has put a bright light exactly where Google (and Apple, Facebook, and Amazon) do not want it.

Why do these giants want your health data? It’s all about where it can be used and sold. For instance, it can be used in research studies. It can be sold for use in EHR integration. But their services and predictive data are ‘where it’s at’. With enough accumulated data on both your health records and personal life (e.g. not enough exercise, food consumption), their AI and machine learning modeling can predict your health progression (or deterioration), along with probable diagnoses, outcomes, treatment options, and your cost curve. Advertising clicks and merchandising products (baby monitors, PERS, exercise equipment) are only the beginning–health systems and insurers are the main chance. In a worst-case and misuse scenario, the data modeling can make you look like a liability to an employer or an insurer, making you unemployable and expensive to insure, or uninsurable, in a private insurance system.
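To make the mechanics concrete: the kind of risk modeling described above, stripped to its bare bones, is a weighted combination of health-record and lifestyle features squashed into a 0-to-1 score. The sketch below is a deliberately toy illustration, not any company’s actual model; the feature names and weights are entirely made up for demonstration.

```python
# Toy sketch of a logistic risk score: hand-picked weights on
# hypothetical health-record and lifestyle features. Real systems
# learn weights from millions of records; the shape is the same.
import math

def risk_score(features, weights, bias=-2.0):
    """Combine weighted features into a 0..1 'risk' via the logistic function."""
    z = bias + sum(weights[k] * v for k, v in features.items())
    return 1.0 / (1.0 + math.exp(-z))

# Entirely hypothetical feature values and weights, for illustration only.
weights = {"age_over_50": 1.2, "low_exercise": 0.8, "prior_diagnosis": 1.5}
patient = {"age_over_50": 1, "low_exercise": 1, "prior_diagnosis": 0}
print(round(risk_score(patient, weights), 3))  # → 0.5
```

The unsettling part is not the arithmetic, which is trivial, but who holds the weights, what data feeds the features, and who sees the score.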

In Google’s latest, their Project Nightingale business associate agreement (BAA) with Ascension Health, permissible under HIPAA, apparently allowed Google in the initial phase to access at least 10 million identified health records, transmitted to Google without patient or physician consent or knowledge, including patient names, dates of birth, lab results, diagnoses, and hospital records. This transfer and the Google agreement were announced by Ascension on 11 November. Ultimately, 50 million records from Ascension facilities in 21 states are planned to be transferred. According to a whistleblower on the project quoted in The Guardian, there are real concerns about the individuals handling identified data, the depth of the records, how the data is being handled, and how Google will use it. Ascension doesn’t seem to share that concern, stating that their goal is to “optimize the health and wellness of individuals and communities, and deliver a comprehensive portfolio of digital capabilities that enhance the experience of Ascension consumers, patients and clinical providers across the continuum of care”–a bit of word salad that leads right to Google’s Cloud and G Suite capabilities.

This was enough to kick off an inquiry by Health and Human Services (HHS). A spokesperson confirmed to Healthcare Dive that HHS’ Office for Civil Rights is opening an investigation into ‘Project Nightingale.’ The agency “would like to learn more information about this mass collection of individuals’ medical records with respect to the implications for patient privacy under HIPAA,” OCR Director Roger Severino said in an emailed statement.

Project Nightingale cannot help but aggravate existing antitrust concerns in Congress and among state attorneys general about these companies and their safeguards on privacy. Examples are the pushback around Google’s $2.1 bn acquisition of Fitbit, which one observer dubbed ‘extraordinary’ given Fitbit’s recent business challenges, and its acquisition of data analytics company Looker. DOJ’s antitrust division has been looking into how Google’s personalized advertising transactions work, and increasingly there are calls from both ends of the US political spectrum to ‘break them up.’ Yahoo News

Google and Ascension Health may very well be the ‘bridge too far’ that curbs the relentless and largely hidden appetite for personal information by Google, Amazon, Apple, and Facebook that is making their own consumers very, very nervous. Transparency, which seems to be a theme in many of these articles, isn’t a solution. Scrutiny, oversight with teeth, and restrictions are.

Also STAT News, The Verge on Google’s real ambitions in healthcare, and a tart take on Google’s recent lack of success with acquisitions in ZDNet, ‘Why everything Google touches turns to garbage’. Healthcare IT News tries to be reassuring, but the devil may be in Google’s tools not being compliant with HIPAA standards. Further down in that article, Readers will see that HIPAA permits a BAA to cover access to the PHI of the covered entity (Ascension) only so that the business associate can carry out the covered entity’s healthcare functions, not for the business associate’s (Google’s) independent use or purposes.

What’s up with Amazon in healthcare? Follow the money. (updated)

Updated. Amazon is the Scary Monster of the healthcare space, a veritable Godzilla unleashed in Tokyo, if one listens to the many rumors, placed and otherwise, picked up in mainstream media and then seized on by our healthcare compatriots.

According to CNBC’s breathless reporting, they have set up a skunk works HQ’d in Seattle. When they posted job listings, they were under keyword “a1.492” or as “The Amazon Grand Challenge a.k.a. ‘Special Projects’ team.” In late July, these ads, for people like a UX Design Manager and a machine learning director with experience in healthcare IT and analytics plus a knowledge of electronic medical records, were deleted. Amazon has separate initiatives on selling pharmaceuticals and on building health applications compatible with Echo/Alexa and other smart home tech. Both have come up in the context of the CVS-Aetna merger: buying up state pharmacy licenses cannot be kept secret (see the end of our 8 Dec article), and efforts to extend Alexa and Echo’s capabilities aren’t particularly secret either.

A quick look on Crunchbase at Bezos Expeditions, Amazon supremo Jeff Bezos’ personal fund, reveals several healthcare investments, such as GRAIL (cancer), Unity Biotechnology (aging), Rethink Robotics, and Juno Therapeutics (cancer). Not exactly things that are easy to sell on Amazon.

Last week, Amazon reportedly hired Dr. Martin Levine, who ran integrated primary care provider Iora Health’s Seattle-based clinics, according to CNBC and Becker’s. Amazon met with Iora, Kaiser, and the now-defunct Qliance about a year ago on innovative healthcare models. More breathless reporting: they are hiring a “HIPAA compliance lead.”

What does this all mean? It may be more–or less–than the speculation suggests. Here are some options this Editor believes plausible:

  • Alexa and Echo are data collectors as well as assistants, gathering information that has monetary value to healthcare providers and pharma. To this Editor, this is the most likely and soonest option: the monetization of this data and the delivery of third-party services as well as monitoring.
  • Amazon now employs a lot of people. It is large enough to create its own self-funded health system. It’s already had major problems in the UK, Italy, and even in the US with healthcare and working conditions in its warehouses. Whole Foods’ non-union workers are ripe for unionization since the acquisition (and also if, as rumored, robots and automation start replacing people).
  • A self-funded health system may also be plausible to sell (more…)