If it sounds too good to be true, it isn’t. And watch your wallet. In 14 words, this summarizes Leeza Osipenko’s theme for this article. It may seem to our Readers that Editor Donna wrote that headline for clicks, but not really. Dr. Osipenko’s term is ‘snake oil’. It’s a quaint, vintage term for the deceptive marketing of completely ineffective remedies, redolent of 19th-century hucksters and ‘The Music Man’. Its real meaning is fraud.
The promise is that Big Data, using Big Analytics, Big Machine Learning, and Big AI, will be a panacea for All That Ails Healthcare. It will save the entire system and the patient money, revolutionize medical decision making, save doctors time, increase accuracy, and in general save us from ourselves. Oh yes, and we do need saving, because our Big Tech and Big Health betters tell us so!
Major points in Dr. Osipenko’s Project Syndicate article, which is short but provocative. Bonus content is available via a link to a London School of Economics panel discussion podcast (39 min.):
- Source data is flawed. It’s subject to error, subjective clinical decision-making, lack of structure and standardization, and general GIGO (garbage in, garbage out).
- However, Big Data is sold to health care systems and the general public as if none of these potentially dangerous limitations existed.
- Where are the long-range studies that can objectively compare and test the quality and outcomes of using this data? Nowhere to be found yet. It’s as if we are in 1900, with no Pure Food and Drug Act, no FDA, and no FTC to provide oversight.
- It is sold into health systems as beneficial and completely harmless. Have we already forgotten the scandal of Ascension Health, the largest non-profit health system in the US, and Google Health proceeding under their business associate agreement (BAA) as if they had consent to identified data from practices and patients, and as if HIPAA didn’t exist? 10 million healthcare records were breached before HHS brought it to a screeching halt.
- Our TTA article of 14 Nov 19 goes into why Google was so overeager to move this project forward fast and break a few things, like rules.
- We as individuals have no transparency into these systems. We don’t know what they know about us, or if it is correct. And if it isn’t, how can we correct it?
- “Algorithmic diagnostic and decision models sometimes return results that doctors themselves do not understand”–great if you are being diagnosed.
- Big Data demands a high level of math literacy. Most decision makers are not data geeks. And those of us who work with numbers are often baffled by results and later find the calcs are el wrongo–this Editor speaks from personal experience on simple CMS data sets.
- To be valuable, AI and machine learning demand access to potentially sensitive data. What’s the tradeoff? Where’s the consent?
Implicit in the article is the question: cui bono? Who benefits?
- Google and its social media rivals want data on us to monetize–in other words, sell stuff to us. Better health and outcomes are just a nice side benefit for them.
- China. Our Readers may also recall from our April 2019 article that China is building the world’s largest medical database, free of those pesky Western democracy privacy restrictions, and using AI/machine learning to create a massive set of diagnostic tools. They aren’t going to stop at China; as recent developments around intellectual property theft and programmed back doors show, they will go to great lengths to secure Western data. Tencent and Fosun are playing by Chinese rules.
At the end of the day, improving health care through big data and AI will likely take much more trial and error than techno-optimists realize. If conducted transparently and publicly, big-data projects can teach us how to create high-quality data sets prospectively, thereby increasing algorithmic solutions’ chances of success. By the same token, the algorithms themselves should be made available at least to regulators and the organizations subscribing to the service, if not to the public.
Having been massively overhyped, big-data health-care solutions are being rushed to market without meaningful regulation, transparency, standardization, accountability, or robust validation practices. Patients deserve health systems and providers that will protect them, rather than using them as mere sources of data for profit-driven experiments.
Hat tip to Steve Hards.