An impressive article written by a young doctor poses the problem of social sharing: the data we don’t know we’re generating, and how that data is processed, for example by tracking programs that predict and analyze our behavior. One example he gives is the Samaritans’ (a non-profit social services group in the UK with a mission to prevent suicide) app, which monitored the tweets of people you follow and alerted you to worsening mood changes so that you could intervene. Some felt it was beneficial, but most saw the potential for misuse or cyberstalking, and it was pulled. The other, rather chilling example was how a PHR could pick up EHR patient evaluation notes not meant to be seen by the patient: data insecurity with devastating consequences. Read the article for what UK family GPs are being asked to do by the Government. When data gets creepy: the secrets we don’t realise we’re giving away (Guardian). Hat tip to reader Mike Clark.
John Boden
“The rather chilling example of how a PHR could pick up EHR patient notes data not meant to be seen by the patient” made an alarm bell go off in my head.
Just what type of data should appropriately be withheld from the patient? Whose data is it? Maybe doctors think patients are too stupid to have access to their own data. There is nothing chilling about a patient learning everything about their own health.
Donna Cusano
Hi John, read further in the article. It’s the kind of information that is best discussed with the patient, not read cold in a PHR. It’s the same with genomic data: the 23andMe ‘just the facts’ approach can be devastating and misused, versus an empowering one (getting counseling to understand the real risks). As we do more with patient data, including decision support, things will go into the record that require interpretation. It’s something to consider.