Often the finger of blame is pointed at unverified information when tackling bad healthcare choices online. This presentation explores the faults of verified sources, and how behavioural science and better choice architecture can offer solutions.

Presentation Outline.

With 20 billion health-related Google searches each month, information seeking online is thriving. Ask yourself whether you have ever Googled a symptom, disease, medication or surgical procedure, and I am sure you will recall at least one occasion. Yet because healthcare decisions are among the most sensitive decisions we will make in our lives, public opinion on using Google in this way is tepid. Healthcare professionals, too, often advise patients against turning to Google before discussing the subject with a trained physician.

Verified Vs Unverified Information

If you listen to the majority of advice and commentary, you will be led to believe that inaccurate, unverified information dominates the digital landscape: that unverified content is responsible for poor healthcare decisions, with the seeker having to filter and navigate the bad to find the good. This is simply not true. Verified content is just as responsible for the problems associated with seeking healthcare information online.

The problems are threefold: cyberchondria, poor therapy choice and delay of therapy. This presentation explores where these problems stem from and explains how content design and ‘choice architecture’ provide solutions to ‘The Hazards and Fragilities of Dr Google’.

Slide Notes:

Slides 2–8.

Why I was Compelled to Influence Healthcare Communication Online.

Slides 9–11.

How Homeland and a storyline about pacemaker hacking caused concern in the pacemaker community. Online communication at its worst.

Slides 12–23.

Calming the concerns of pacemaker patients with a blog post and carefully chosen words. My introduction to behavioural science.

Slide 24.

A holistic model of cardiovascular health showing the possibly pro-arrhythmic nature of increased anxiety caused by poor online communication.

Slide 25.

The move from content that informs to content that influences.

Slides 26–29.

Google is the largest information resource the world has ever known. We use Google when we seek information to inform our choices. There are 20 billion health queries a month. Did you mention your findings to anyone? What was their reaction? Healthcare choices need to be right: does Dr Google create a safe environment in which to make informed decisions?

Slides 30–31.

The hazards of Dr Google: cyberchondria, therapy delay, poor therapy choice.

Zuccon G, et al. (Queensland University of Technology). “‘Dr. Google’ doesn’t know best: Search engine self-diagnosis and ‘cyberchondria’.” ScienceDaily, 6 May 2015. Cyberchondria is estimated to cost the NHS more than £420 million a year.

Slides 32–34.

Current advice focuses almost exclusively on avoiding unverified information. This is lazy advice, often offered by those with little experience of online communication.

Slides 35–42.

Avoiding unverified medical information online will protect you from sensationalism, unqualified information and content created through online disinhibition. Examples are given.

Slides 43–45.

Reasons why unverified information is not the main culprit behind the problems with Dr Google: as Google improves, cyberchondria shows no corresponding decline, and we already discriminate between sources and check their credibility. Evidence is given.

Slides 46–55.

An example of verified information causing increased health anxiety, converted into a real-life conversation between a patient and Dr Google. Patients will think they are the ‘1 in a million’ patient. A quote from a professor confirms this.

Slides 56–57.

Neglect of probability: we are wired to respond to the magnitude of an event and not to consider its probability.

In a classic experiment in 1972, participants were divided into two groups. Members of group 1 were told they would receive a small electric shock. Members of group 2 were told there was a 50% probability that they would receive a small electric shock. After this information was provided, researchers measured physical anxiety (heart rate, nervousness, sweating) shortly before starting. The result: absolutely no difference in the anxiety levels of the two groups. Puzzling. Next, researchers announced a series of reductions in the probability of group 2 being shocked, from 50% down to 20%, 10% and finally 5%. There was still no difference in the anxiety level of group 2 compared to group 1. When researchers announced an increase in the strength of the current, anxiety in both groups increased equally. The anxiety level in group 2 only went down when the probability dropped to 0%.

Slides 58–62.

What are cognitive biases? The by-products of heuristics: the tendency of humans to make irrational or illogical decisions because of our ‘fast’-thinking System 1 brain. A heuristic is a mental shortcut that allows people to solve problems and make judgements quickly and efficiently. These rule-of-thumb strategies shorten decision-making time and allow people to function without constantly stopping to think about their next course of action.

Slides 63–77.

How we are able to help others with choice architecture. Choice architecture = design in line with behavioural science in order to achieve a desired outcome. Examples of choice architecture that enable better outcomes for people are shown, along with the idea of libertarian paternalism.

Slides 78–79.

How the hazards and fragilities of Dr Google really come down to human error and cognitive bias. By this logic, the route to better healthcare choices online is through choice architecture and behavioural science.

Slides 80–102.

Examples and suggestions of better choice architecture online, including the removal of unverified suggestions on Google when health is involved, a cyberchondria algorithm, and nicer graphical representation of statistics.
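As a rough sketch of what ‘nicer graphical representation of statistics’ could look like in practice, the code below converts a raw probability into a natural-frequency statement and a text-based icon array, formats that risk-communication research suggests counter probability neglect. The function names, the population size and the array layout are my own illustrative assumptions, not taken from the presentation.

```python
# Hypothetical sketch: present a risk as a natural frequency and an
# icon array instead of a bare percentage or a "1 in a million" phrase.
# All names and defaults here are illustrative assumptions.

def natural_frequency(probability: float, population: int = 1000) -> str:
    """Express a probability as 'N out of <population> people'."""
    affected = round(probability * population)
    return f"{affected} out of {population} people are affected"

def icon_array(probability: float, rows: int = 5, cols: int = 20) -> str:
    """Render a text icon array: '#' = affected, '.' = unaffected."""
    total = rows * cols
    affected = round(probability * total)
    icons = "#" * affected + "." * (total - affected)
    return "\n".join(icons[r * cols:(r + 1) * cols] for r in range(rows))

if __name__ == "__main__":
    risk = 0.02  # a 2% complication risk, chosen only for illustration
    print(natural_frequency(risk))  # "20 out of 1000 people are affected"
    print(icon_array(risk))         # 2 '#' icons in a 5 x 20 grid of '.'
```

The point of the sketch is the framing, not the code: showing 2 filled icons in a field of 100 makes the 98 unaffected outcomes visible, whereas “2%” leaves the reader free to imagine only the bad outcome.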

How to improve information retention with the middle bias and with the disfluency of ugly fonts. Metaphors tether new information to old knowledge while adjusting its application.

Increasing the number of organ donors with personalisation, the affect heuristic, social conformity and ugly fonts.

Slides 103–106.

Wrap up and recap.