Harding Center for Risk Literacy
Current Website found at www.mpib-berlin.mpg.de/en/research/



The Harding Center for Risk Literacy helps people in their struggle to understand and assess the risks facing them.
Content is from the site's 2011-2012 archived pages, offering a glimpse of the information this site offered its readership. Additional content and images are from other sources.

The current website for the Harding Center for Risk Literacy is found at: https://www.harding-center.mpg.de/en or www.mpib-berlin.mpg.de/en/research/harding-center

 

"Our aim is to study how people behave in risk situations. We believe that our work can contribute towards the ideal of a society that knows how to calculate risks and live with them."

Gerd Gigerenzer

Should I have a flu vaccination or not? Is it safer to travel by car or by plane? Can early-detection screening tests for cancer prolong my life? Questions like these are the research focus of our team of six scientists led by Gerd Gigerenzer, director of the Center.

Our goal is to help people in their struggle to understand and assess the risks facing them. Our primary focus is on health and medicine as well as on educating people from childhood onwards to understand statistics. By conducting studies, experiments, and surveys, we investigate people's problems with understanding numbers and find solutions to these. We also offer special training seminars for physicians and journalists, who particularly need to know how to interpret and communicate risks to their patients and the general public.



Gerd Gigerenzer / Director

Gerd Gigerenzer is director of the Max Planck Institute for Human Development in Berlin and author of numerous books, including Gut Feelings and Calculated Risks: How to Know When Numbers Deceive You (titled Reckoning with Risk in the UK), both of which were named Science Book of the Year in Germany (2007 and 2002).



David Harding

The Harding Center for Risk Literacy is named after David Harding, who created an endowment for the Center. Harding, global investment manager and director of Winton Capital, became aware of Gerd Gigerenzer's work after Reckoning with Risk was named Science Book of the Year by the Royal Society. The meeting with Professor Gigerenzer inspired him to establish the Winton Professorship of the Public Understanding of Risk at the University of Cambridge in the UK.

WHO IS DAVID HARDING? 

David Harding is the director of Winton Capital, one of the 10 largest investment companies in the UK. Harding has long been interested in Gerd Gigerenzer's research and has already made a name in the field by endowing a chair in risk communication at the University of Cambridge.

HOW DOES A LONDON GLOBAL INVESTMENT MANAGER HAPPEN TO DONATE 2.2 MILLION EUROS TO AN INSTITUTE IN BERLIN?

Having read Gigerenzer’s book Reckoning with Risk (US: Calculated Risks), which was nominated for the Royal Society’s “Science Book Prize,” Harding was excited enough to give a copy of the book to each of his employees. After a public lecture by Gigerenzer in London, attended by Harding and his employees, the two men had dinner together – and the idea of a research center for risk literacy was born. The idea became reality thanks to Harding’s generous donation of 2.2 million euros, which enabled the foundation of the Harding Center for Risk Literacy.

WHAT IS THE MAX PLANCK SOCIETY'S RELATIONSHIP TO THE CENTER?

In return for Harding’s donation, the Max Planck Society provides the facilities and staff support for the Center. This alliance ensures both truly independent research and that a broad public is informed about the research findings. The Harding Center for Risk Literacy officially opened in spring 2009.

 

The Max Planck Institute for Human Development, founded in 1963, is an interdisciplinary research institution dedicated to the study of human development. The Institute is part of the Max Planck Society for the Advancement of Science, an independent, non-profit research organisation.

 

Harding Center for Risk Literacy

In April of 2009, the Harding Center for Risk Literacy was founded at the Max Planck Institute for Human Development, Berlin. The center envisions a society of informed citizens who are competent enough to deal with the risks of a modern technological world.

How do I make decisions in our modern, technological world? Should I have a vaccination or not? Is it safer to travel by car or by plane? Are early-detection screening tests for cancer useful, or can they cause harm?

Questions like these are the research focus of a team of six scientists led by Professor Gerd Gigerenzer, director of the Harding Center for Risk Literacy. The researchers will conduct studies and experiments and carry out surveys in the general population. Their findings shall aid people in assessing risks competently and correctly. Furthermore, the Center will offer special further-training seminars for physicians as well as for journalists.

 

What You Should Know

Our RISK QUIZ: Are you risk literate?

Check your knowledge about risks and uncertainties of everyday life in 8 short questions!

Fact Boxes

 

Medical questions often have no black-and-white answers. For this reason, transparent information is crucial – as is the courage to make informed decisions for oneself. We have prepared fact boxes with unbiased and easy-to-understand information about different subjects:

 

>> about early detection of breast cancer by mammography screening

>> about early detection of prostate cancer by PSA screening and digital-rectal examination

>> about breast cancer prevention for high risk women with Nolvadex (Tamoxifen)

 

Important Questions for Life in an Uncertain World


When it comes to results of medical tests and treatments, people have a need for certainty. Often, however, our most important decisions must be made under considerable uncertainty. We have identified a set of questions that help you face uncertainty and facilitate your understanding of risks across different situations.

 

Technical Terms

It is not always easy to evaluate (health) risks. In a glossary, we have summarized and explained technical terms and criteria that facilitate risk comprehension.

 



 

Bad Statistic of the Month

Berlin psychologist Gerd Gigerenzer, economist Thomas Bauer from Bochum, and statistician Walter Krämer from Dortmund founded the “Bad Statistic of the Month” (Unstatistik des Monats) in 2012. Every month, they question recently published statistics and their interpretations. The campaign aims to help people deal with data and facts more reasonably, interpret numerical representations of reality correctly, and describe an increasingly complex world more adequately. Further information on the background of this initiative can be found at www.unstatistik.de.

 

Poverty is not inequality


The statistic for the month of October is 15.8% and comes from the Federal Statistical Office in Wiesbaden: "15.8% of the population was at risk of poverty in 2010," reported the official statistics on 17 October 2012. The number is correct, but not its interpretation.
Photo (c) Uta Herbert / pixelio.de

GM corn kills


A French team of scientists recently reported that rats fed GM maize over a longer period of time suffered serious health problems, above all a higher rate of cancer. But does that mean that GM corn causes cancer?
Photo (c) W.R. Wagner / pixelio.de

Obesity makes you dumb and depressed


Every now and again, the German media report relationships between eating habits, depression, intelligence, body weight and school attendance. Early this month, the “Apotheken-Umschau” claimed that excessive consumption of fast food leads to depression. But what's the real story behind such ostensibly causal relationships?
Picture (c) Sigrid Rossmann / pixelio.de

Bad Statistic of the Month: No need to fear cholera


How true are the horror stories predicting the spread of cholera in Germany? Our “Bad Statistic of the Month” takes a closer look at a medical study that found a positive correlation between the warming of the Baltic Sea and the occurrence of cholera bacteria.
Picture (c) Ronald Taylor

 

VCD “City Check” Gives Wrong Statistics for Road Safety


In their “City Check 2012,” the German traffic association Verkehrsclub Deutschland (VCD) ranks 76 German cities by their traffic safety for children and young people. To do so, the rates of change in traffic accidents over the last five years are averaged arithmetically. This approach, however, is misleading and scientifically unacceptable.
Picture (c) Rainer Sturm / pixelio.de
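The pitfall of arithmetically averaging rates of change can be seen with a toy calculation (the numbers below are purely illustrative, not the VCD's actual accident data):

```python
# Why arithmetically averaging yearly rates of change misleads.
# Illustrative numbers only -- not the VCD's actual accident figures.

# Suppose a city's accident count changes by +50% one year and -50% the next.
changes = [0.50, -0.50]

# The arithmetic mean of the yearly changes suggests "no change on average":
arithmetic_mean = sum(changes) / len(changes)
print(arithmetic_mean)  # 0.0

# But the actual accident count after both years is 25% lower:
count = 100.0
for c in changes:
    count *= 1 + c
print(count)  # 75.0

# The geometric mean of the growth factors reflects the true average change:
factors = [1 + c for c in changes]
geometric_mean = (factors[0] * factors[1]) ** (1 / len(factors))
print(round(geometric_mean, 4))  # 0.866, i.e. about -13.4% per year
```

Averaging growth factors geometrically, rather than averaging percentage changes arithmetically, is the standard way to summarize multi-year trends.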

The Lost Girls of Gorleben


Alleged deficits of female births around German nuclear plants have stirred the media. The male-to-female birth ratio of 109 to 100 near the interim nuclear waste storage facility in Gorleben, in particular, is said to be clearly skewed. A working group led by epidemiologist Hagen Scherb claims that radiation from these nuclear facilities is to blame.
Picture (c) Viktor Mildenberger / pixelio.de

Chocolate Makes You Thin


This is what several German newspapers and magazines wrote about a study from the University of California, San Diego, in which scientists found a negative correlation between the frequency of chocolate consumption and body mass index (BMI).
Picture (c) Manfred Walker / pixelio.de

Wage Differences Between Women and Men?


The “Unstatistic of the Month” in March is 23%. That is the average wage difference between women and men. However, this number says nothing about whether women and men are actually treated unequally, because it compares “apples and oranges” with one another.
Picture (c) Thommy Weiss / pixelio.de

Punctual Trains?


The “Unstatistic of the Month” in February is the punctuality figures of Deutsche Bahn AG, the German railroad company. In an effort to become more transparent, the company published its actual punctuality for January as 96.5% on its website.
Picture (c) Michael Bührke / pixelio.de

Misleading statistics in chicken industry


 

The “Unstatistic of the Month” in January is 96.4%. According to the Ministry of Environment of North Rhine-Westphalia, this is the share of chickens that were treated with antibiotics.
Picture (c) Viktor Schwabenland / pixelio.de

 

Poles are more diligent than Germans


Picture (c) Stephanie Hofschlaeger

The statistic of the month of December is the statement "Poles are more diligent than Germans," voiced by Federal President Joachim Gauck at a meeting with the heads of state of Italy and Poland on 19 November 2012 in Naples. But is this statement merely a compliment or a meaningful statistic?

 


 

The Ruhr area as a poorhouse

Following the "Poverty Report 2011" presented by the Paritätischer Wohlfahrtsverband in December, many German media identified the Ruhr area as the new poorhouse of the republic. For example, the at-risk-of-poverty rate in Dortmund increased from 18.6 to 23 percent between 2005 and 2010, compared with 14.5 percent in Germany as a whole. Nationwide, too, the economic boom was said to have bypassed the poor.

 



 

Press/Online (international)

 

"With Prostate Cancer, Is It Better Not To Know?" / npr.com, 30.5.2012

A federal task force has recommended against routine use of the PSA test, a common method of screening for prostate cancer. The panel concluded that the potential for harm outweighs any benefit, as many men undergo procedures that are unnecessary and can lead to serious side effects.

Guests

Richard Knox, health and science correspondent, NPR
Hal Arkes, professor, Ohio State University
Dr. Mary McNaughton-Collins, physician, Massachusetts General Hospital

 


"With PSA Testing, The Power Of Anecdote Often Trumps Statistics" / npr.com, 28.5.2012

by RICHARD KNOX

Millions of men and their doctors are trying to understand a federal task force's recommendation against routine use of a prostate cancer test called the PSA.

The guidance, which came out last week, raises basic questions about how to interpret medical evidence. And what role expert panels should play in how doctors practice.

About 70 percent of men over 50 have gotten a PSA blood test. Some are convinced it was a lifesaver.

Tom Fouts of Florida is one of them. He and his doctor had been watching his PSA (prostate-specific antigen) creep up for almost two years. Fouts was losing sleep over it, wondering if it meant a silent killer was incubating in his prostate gland.

 
Finally, he decided to act. After three painful biopsies, doctors discovered a moderate-grade cancer and Fouts had surgery to remove it.

Today he's fine. "I'm a firm believer the PSA test has saved my life," he says. And he doesn't think much of the U.S. Preventive Services Task Force, the government-appointed expert panel that advised against routine PSA testing after analyzing reams of statistics.

"My theory on statistics," Fouts says, "is anybody can look at the same stats and come up with their own opinion. Government does it; each political party does it. Whatever you want it to come up to read, you can fine-tune it and make it come up to that."

Hal Arkes, a psychology professor at Ohio State University, says Fouts' way of thinking is nearly universal. The power of the anecdote almost always overwhelms statistical analysis, he says.

"Statistics are dry and they're boring and they're hard to understand," Arkes tells Shots. "They don't have the impact of someone standing in front of you telling their heart-rending story. I think this is common to just about everybody."

Arkes says anecdotal thinking "contributes to the widespread gross over-estimation of the benefits of PSA screening." He suggests people do a mental exercise to understand what the numbers are saying about PSA:

Imagine an auditorium filled with 1,000 men who had PSA screening tests and another auditorium with 1,000 men who didn't. That represents the kind of studies the federal task force was relying on.

"Take a look at the men in the two auditoriums, the men in the screened and the men in the not-screened auditorium," Arkes says. "There's just as many men who died of prostate cancer in each auditorium, which leads us to think in the aggregate it didn't do any good."

Arkes breaks it down in the journal Psychological Science.

In each auditorium, there would be eight men who died of prostate cancer. But among the thousand who got PSA tests, there would also be 20 men who were treated for prostate cancers that would never have grown and caused symptoms. And five of these needlessly treated men would have lifelong complications, such as impotence and incontinence.
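The auditorium comparison can be written out as a small tally, using the per-1,000 figures Arkes cites (a sketch of the natural-frequency framing, not the study's own code):

```python
# Natural-frequency summary of the PSA trial numbers quoted above:
# 8 prostate-cancer deaths per 1,000 men in each group; among the screened,
# 20 needlessly treated and 5 with lifelong complications.

N = 1000  # men per "auditorium"

screened = {"deaths": 8, "needlessly_treated": 20, "complications": 5}
unscreened = {"deaths": 8, "needlessly_treated": 0, "complications": 0}

# Absolute risk reduction in prostate-cancer deaths from screening:
arr = (unscreened["deaths"] - screened["deaths"]) / N
print(f"Deaths avoided per 1,000 screened: {arr * N:.0f}")  # 0

# Harms per 1,000 screened:
print(f"Needlessly treated: {screened['needlessly_treated']}")  # 20
print(f"Lifelong complications: {screened['complications']}")   # 5
```

Laying out benefits and harms side by side per 1,000 people is exactly the transparent format the Harding Center's fact boxes use.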

Dr. Ian Thompson of the University of Texas Health Science Center at San Antonio says Arkes "is exactly correct – but only according to the current clinical trials of PSA screening."

"If you leave them the way they are, that article is smack-on correct," says Thompson, a urologist. "But the trials had problems."

Thompson says the chief issue is that men in the best study to date, from Europe, have been followed for a maximum of 13 years — and that's not long enough.

"When you analyze those trials very early, what you pick up on are the harms of testing," Thompson says. "And it really takes many, many years to see the benefits."

He doesn't know how much longer the European men would need to be watched, but thinks it would eventually become clear that PSA testing saved many more from a prostate cancer death. He doesn't think the Preventive Services Task Force should have taken a stand against testing at this time.

Dr. Michael Barry, head of the Informed Medical Decisions Foundation in Boston, thinks Thompson has a point.

"I'm reluctant myself to make a decision for someone else about PSA screening," Barry says. "And as a result, I'm also reluctant for expert panels to take that position of telling men what to do here."

Barry says his way out of the controversy is to take it "one man at a time." That is, doctors need to lay out the evidence as clearly as they can, which he says indicates there's very little, if any, benefit to PSA testing.

"I think many men won't want the test in that circumstance," he says. "But some will, and I'm comfortable with that."

Up to now, Barry says, those discussions haven't been happening nearly enough.


"Why People Stick with Cancer Screening, Even When It Causes Harm" / TIME, 25.5.2012

"Analyzing Those Widespread Feelings Of ‘Hands Off My PSA Test’" / CommonHealth, 23.5.2012

"A Prostate Screening Picture Worth A Thousand Words" / CommonHealth, 24.5.2012


"Uproar Over Prostate-Cancer Screenings Explained" / ScienceDaily, 22.5.2012

ScienceDaily (May 22, 2012) —

The uproar that began last year when the U.S. Preventive Services Task Force stated that doctors should no longer offer regular prostate-cancer tests to healthy men continued this week when the task force released their final report. Overall, they stuck to their guns, stating that a blood test commonly used to screen for prostate cancer, the PSA test, causes more harm than good -- it leads men to receive unnecessary, and sometimes even dangerous, treatments.
But many people simply don't believe that the test is ineffective. Even faced with overwhelming evidence, such as a ten-year study of around 250,000 men that showed the test didn't save lives, many activists and medical professionals are clamoring for men to continue receiving their annual PSA test. Why the disconnect?

In an article published in Psychological Science, a publication of the Association for Psychological Science, researchers Hal R. Arkes, of Ohio State University, and Wolfgang Gaissmaier, from the Max Planck Institute for Human Development in Berlin, Germany, picked apart laypeople's reactions to the report and examined the reasons why people are so reluctant to give up the PSA test.

"Many folks who had a PSA test and think that it saved their life are infuriated that the Task Force seems to be so negative about the test," said Arkes.

They suggest several factors that may have contributed to the public's condemnation of the report. Many studies have shown that anecdotes have power over a person's perceptions of medical treatments. For example, a person can be shown statistics that Treatment A works less frequently than Treatment B, but if they read anecdotes (such as comments on a website) by other patients who had success with Treatment B, they'll be more likely to pick Treatment B. The source of the anecdotes matters too. If a friend, a close relative, or any trusted source received successful treatment, they would be more likely to recommend that treatment to others, even if there was evidence showing the treatment only works for a minority of people.

Arkes and Gaissmaier also propose that the public may have recoiled against the task force's recommendations so fiercely because they weren't able to properly evaluate the data in the report. Confusion over the use of control groups may have led people in the general public to weigh the data differently than medical professionals did.

"How to change this is the million-dollar question," said Arkes. "Pictorial displays are far easier to comprehend than statistics. The two figures in our article depict the situation more clearly than text and numbers can do. I think data displayed in this manner can help change people's view of the PSA test because we compare the relative outcomes of being tested and not being tested. Without that comparison, it is tough for the public to appreciate the relative pluses and minuses of the PSA test versus not having the PSA test."

Men will be able to continue to request the PSA test, and it will be covered by health insurance for the foreseeable future. But psychological science suggests that unless people are convinced to choose statistics over anecdotes, confusion surrounding the test's effectiveness will linger.


"Psychological Science Explains Uproar over Prostate-Cancer Screening" / Psychological Science, 22.5.2012


"Many Primary-Care Docs Don’t Understand Cancer-Screening Stats: Study" / The Wall Street Journal, 5.3.2012

By Katherine Hobson

As the debate about screening for some types of cancer — think prostate and breast cancer — has gotten more heated over the past several years, doctors and patients are increasingly being called upon to make tough decisions about whether and how to screen.

That kind of informed decision is hard to make, though, if some doctors are confused by the statistics behind cancer screening — as is suggested by a study just published in the Annals of Internal Medicine.

In 2010 and 2011, researchers surveyed the reactions of 412 primary-care physicians to the effect of two hypothetical screening tests. (The type of cancer wasn’t identified to participants, but the data were derived from prostate-cancer screening research.)

One test was described in terms of its impact on five-year survival — boosting it to 99% from 68%. Some 82% of the surveyed doctors said that this test “saves lives from cancer,” and 69% said they would “definitely recommend” the test.

Another test was described in terms of its mortality benefit: lowering cancer mortality from 2 to 1.6 deaths per 1,000 people. But only 60% of doctors said that this test “saves lives from cancer,” and just 23% said they’d definitely recommend it.

Trouble is, an extramural committee of the National Cancer Institute found that reduced mortality — the stat used to describe the second test — is the “only statistic that reliably proves that a screening test saves lives,” the study says. And five-year survival cannot actually prove that a screening test reduces cancer deaths.

Here’s one reason why, as described by the study authors:

Imagine a group of patients in whom cancer was diagnosed because of symptoms at age 67 years, all of whom die at age 70 years. Each patient survives only 3 years, so the 5-year survival for the group is 0%. Now imagine that the same group undergoes screening … Suppose that with screening, cancer is diagnosed in all patients at age 60 years, but they nevertheless die at age 70 years. In this scenario, each patient survives 10 years, so the 5-year survival for the group is 100%. Yet despite this dramatic improvement in survival (from 0% to 100%), nothing has changed about how many people die or when.

In addition, screening healthy people will turn up more cases of cancer, some of which aren’t destined to cause harm to health. But those less serious cases will make the five-year survival statistics look rosier, even though screening isn’t actually reducing the number of cancer deaths, the authors say.
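The lead-time effect in the quoted scenario can be checked with a few lines of arithmetic (a sketch of the Annals example's logic, not the study's actual code):

```python
# Lead-time bias, following the Annals example quoted above: diagnosis moves
# from age 67 to age 60, death still occurs at age 70, yet 5-year survival
# jumps from 0% to 100% with no change in mortality.

def five_year_survival(age_at_diagnosis, age_at_death):
    """Fraction of patients still alive 5 years after diagnosis."""
    return 1.0 if age_at_death - age_at_diagnosis >= 5 else 0.0

# Every patient dies at age 70 in both scenarios.
without_screening = five_year_survival(67, 70)  # diagnosed via symptoms
with_screening = five_year_survival(60, 70)     # diagnosed earlier by screening

print(f"5-year survival without screening: {without_screening:.0%}")  # 0%
print(f"5-year survival with screening:    {with_screening:.0%}")     # 100%

# Mortality is identical either way: nobody lives a day longer.
print(with_screening != without_screening)  # survival "improved" ...
```

This is why the authors argue that only reduced mortality, not improved five-year survival, can show that a screening test saves lives.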

Why did doctors get it wrong? For one thing, the authors say, “survival” has different, more significant, implications in a randomized trial of a treatment rather than a screening study. The big difference in the numbers in the two scenarios — 68%/99% vs. 1.6/2 — also probably contributed to the confusion, they write.

Physicians’ responses also indicated that they hadn’t considered the possibility of overdiagnosis — identifying cases of cancer that weren’t ever going to progress — when weighing information about the hypothetical tests, the authors write.

The extent of overdiagnosis isn’t known for many cancers, and even when there are estimates — as with prostate and breast cancer — screening is “nonetheless strongly promoted,” explains Odette Wegwarth, a study author and senior research scientist at the Max Planck Institute for Human Development in Berlin, via email. That may have more to do with the U.S. health system than physicians’ ignorance, namely that doctors “can be easily sued for doing too little,” she says.

“There is a saying, ‘No one is ever sued for overtreatment,’” she tells us.

 


 

"Most U.S. doctors baffled by cancer screening stats" / Reuters, 5.3.2012

"Risk comm guru Gigerenzer argues that absolute risk communication is a moral issue" / Gary Schwitzer's HealthNewsReview Blog, 21.12.2010

"Decision-making: Risk school" / nature.com, 28.10.2009

 


9/11: New findings on indirect harms of terrorism

In the wake of the terrorist attacks of September 11, 2001, many Americans started driving more due to a fear of flying – and lost their lives in traffic accidents. But why did this happen more frequently in some states than in others? And why didn't Spanish driving habits change in the same way following the 2004 train bombings in Madrid? Wolfgang Gaissmaier and Gerd Gigerenzer from the Harding Center for Risk Literacy at the Max Planck Institute for Human Development in Berlin present new findings on this topic in the journal Psychological Science.

As we all know, the terrorist attacks of September 11, 2001 changed the world: The feeling of vulnerability led to the so-called "war on terror." New laws were passed and surveillance intensified to reduce the risk of direct damage resulting from terrorism. But terrorist attacks also cause indirect damage. This comes about through people's thoughts and fears in reaction to such attacks. In the case of 9/11, it was primarily severe losses in the aviation and tourism industries. Earlier studies showed that, following the terrorist attacks, more people chose to drive rather than fly, feeling it was safer. The result was not just a greater risk of traffic congestion: in the twelve months following September 11, 2001, there were an estimated 1,600 more accident-related deaths on American roads than would have been expected statistically.

But why would such an increase in traffic and, with it, also in traffic deaths, be observed only in some states and not in others? And why was no increase in driving and in traffic accidents seen following the likewise devastating train bombings in Madrid in 2004? Psychologists Gaissmaier and Gigerenzer from the Max Planck Institute for Human Development in Berlin and the Harding Center for Risk Literacy based there present new analyses, which will soon be published in the journal Psychological Science.

In the analyses, they show that car traffic increased particularly in the New York vicinity. The main attacks were focused on the World Trade Center located there. These images, and thus also the fear, were presumably particularly present for people who lived in the surrounding area; other studies also support this assumption. However, the authors further identify a second, even stronger factor that could explain why the traffic volume increased sharply even in some states far away from New York, especially in the Midwest: there, the infrastructure was simply very well suited to replace flying with driving. The streets were very well developed in relation to the number of inhabitants, and many cars were registered.

"Our study findings support the assumption that the fear created by terrorist attacks can cause potentially risky behaviour. But they also make it clear that fear alone is not enough to understand where indirect damage can occur in the wake fatal events like those of 9/11", says Wolfgang Gaissmaier. "To predict where the indirect damage of terrorist attacks can have particularly fatal consequences, and to possibly curb a secondary, psychological attack, we must pay very close attention to the general conditions that first make it possible for risky, fear-induced behaviours to express themselves – such as the respective infrastructure."

That could also explain why there were fewer Spanish train travellers following the train bombings in Madrid on March 11, 2004, but without any corresponding increase in car travel. Spain simply has a less pronounced car-driving culture, and Gaissmaier and Gigerenzer also express this in numbers: in 2001 in the US, there were around 800 cars registered per 1,000 inhabitants, while in 2004 in Spain, this figure was just around 600.

Source: Gaissmaier, W. & Gigerenzer, G. (in press). 9/11, Act II: A Fine-grained Analysis of Regional Variations in Traffic Fatalities in the Aftermath of the Terrorist Attacks. Psychological Science.

 


Harding-Center.com