
Believe in data

Kauffman’s Nathan Madden evaluates the crisis that pre-dates our current crises – the crisis of what or whom to believe.


“There are three kinds of lies: lies, damn lies, and statistics.”

This saying (oft misattributed to Missouri’s own Mark Twain) tells a truth about how people mislead others. We are continuously bombarded with new datapoints deployed on behalf of one argument or another. More insidiously, data and the algorithms driven by data can be used to provide a misleading view of the world – and we might not even realize it is happening.

Right now, we’re facing a public health crisis, the movement to save Black lives from police brutality, an economic downturn, and – pre-dating all of those – a crisis of what or whom to believe. As an evaluator who works with numbers, surveys, interviews, and other forms of data on a daily basis, I’m not insulated from this third crisis either. With so much more information coming to us every day, it raises the question: “What data and information should I even trust anymore?”

First, let me just say that it’s OK to stop and let yourself ask that question – it’s your analytical mind kicking in and, properly channeled, that can be a good thing. In response to questions about data credibility, evaluators have discovered some ways to be more effective in communicating findings from data to help overcome biases so that information can reach and sway even the staunchest critics. Sometimes, it’s not that easy, though.

Resist the algorithm

There are few timelier examples of how data are purposefully misused to misdirect than the times we are in now. Our current technological reality allows for sophisticated algorithms, infiltration by “bots” and those looking to spread misinformation, and political campaigns with enough funds to execute precision targeting based on the data and access social media platforms provide.

For example, Dr. Safiya Noble’s work on how algorithms perpetuate racial tropes and stereotypes comes to mind. To see an example of this, do as she suggests, and as a thought experiment, do a basic search on Google for “Black girls.” Needless to say, the autofill and subsequent search results are enraging, saddening, and sickening – given all the utterly racist garbage that piles up in front of legitimate resources.

And, almost as soon as we all finished seeing – with our own eyes – George Floyd have his life snuffed out by a Minneapolis police officer, people began pointing out that more White people are killed by police than Black people. I’m sure you saw that trotted out in your Twitter feed or “pop up” in your Google searches, but it’s a deliberate half-truth: the fact is that Black people in the U.S. are killed by police at the highest rate of any group – well over double the rate of White people. This is how the truth is used to contain a lie.
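The count-versus-rate distinction at the heart of that half-truth can be made concrete with a toy calculation. The groups and numbers below are invented purely for illustration (they are not real statistics); the point is only that raw counts and per-capita rates can point in opposite directions when group populations differ in size.

```python
# Hypothetical illustration: a larger group can have more deaths in total
# while a smaller group suffers a far higher per-capita rate.
populations = {"Group A": 200_000_000, "Group B": 40_000_000}
deaths = {"Group A": 500, "Group B": 250}  # Group A: more deaths in absolute terms

# Convert raw counts to deaths per million residents of each group.
rates_per_million = {
    group: deaths[group] / populations[group] * 1_000_000
    for group in populations
}

# Group A: 500 / 200M * 1M = 2.5 per million
# Group B: 250 / 40M  * 1M = 6.25 per million
print(rates_per_million)
```

Quoting only the counts ("Group A has twice as many deaths") is technically true, yet the rate for Group B is two and a half times higher. Whenever a claim compares groups of very different sizes, asking "per how many people?" is the quickest defense.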


It has become so widely accepted that data are there to be manipulated that researchers sometimes face consequences for not bending data to fit a point of view. Take the reports of Rebekah Jones, a public health data researcher in Florida who was fired in May after she said she refused to manipulate data on the COVID-19 pandemic so that politicians in the state could say it was safe to reopen. Instead of using hard data to tell the truth of what was happening, this appears to be an instance of a blatant willingness to suppress and ignore the truth at the cost of people’s health and lives.

How do we sort it all out?

If I’m being honest, there’s not an easy answer to that question.

But I think it starts with the people who produce and evaluate data (see the steps below), and it includes putting value on data that are transparent, produced by diverse research teams, and designed for learning, not confirming. It means upping our ability to recognize credible data and understanding that the information we’re served is not sorted by accuracy or by accident.

While we might not be able to thoroughly examine the legitimacy of any and every piece of data we see, if we want to resist the temptation of the algorithm, what we can do is recognize our own bias and our willingness to accept data that validates it.

Believe in people


If we’re to ever see our way out of the crises of health, economy, and truth that we find ourselves in, we must start doing a radical act: we need to believe the data. We need to reaffirm our ability to trust in what other, credible people who devote their lives to researching those issues are finding – and that does not mean the latest Twitter poll.

Believing in data means believing in people.

Not just the researchers producing quantitative data, but also those who surface what is qualitative. These are the stories often lost to the news cycle, buried by the algorithm, forgotten as individualism erodes interdependence and the comfort of ignorance wins out over the discomfort of awareness. We need the stories that data tell because each piece of data is not a number, but a person. Each unnecessary death of a Black person in police custody. Each child without Wi-Fi. Each business exit – and start.

We are well-served by considering lived experience as data and listening to those with extensive knowledge of the subject at hand – like medical experts, teachers, nurses, and even the millions of people exercising their Constitutional right to assemble and speak freely about police brutality. Right now, especially, we need effective data to make decisions and to learn lessons to rebuild better.

The bully pulpit is powerful – especially with an algorithm behind it – but quality data makes for informed narratives that we can use to engage in civil discussions about how best to address an issue. That’s the “content” that aids inclusive, effective policy decisions, evolves systems, and transforms people’s lives.

We need to believe the people who have devoted their entire lives to listening, compiling stories and information for the rest of us to know and learn from, and the people who passionately advocate for the truth that those collective stories carry.

I know I want to believe.

Steps to continue making data better

Be inclusive, adaptive, and responsive.

Too often, researchers and evaluators come in with a pre-set agenda of the questions they want answered. However, those may not be the right questions, or your approach to collecting data might not be appropriate. As a straight, cis-gender, White male, I have blind spots on the right questions to ask or the best approach to showing an impact, which could undercut the findings from the data that were collected. The Annie E. Casey Foundation, in partnership with WestEd, helped define some excellent questions to ask prior to an evaluation and data collection. The bottom line is, if you’re collecting data and drawing conclusions without deeply engaging a variety of diverse perspectives, you will immediately undermine what you find, even if you act with the best intentions.

Always humanize a datapoint.

I get it. “… One in 10 people don’t do this; 90% of people do the other thing; $43 billion was spent on X, Y, or Z …” At some point, they’re just numbers on a page or a screen. Given the volume of information we consume every day, it gets harder and harder for those numbers or datapoints to have meaning. They become an abstraction, even if the truth at their core is powerful. I know during the COVID-19 crisis, I’ve been anxiously and analytically checking the number of people who succumbed to the disease. However, when I see a story on a transit worker or loving grandfather who died from the disease, it becomes an entirely different feeling; I understand from a deep emotional standpoint the gravity of the loss we face – and will continue to face – as a country.

We all have a story and it should be told. Added together, they become living history and a monument to those of us who are different but, fundamentally, the same. One of the best resources for inclusive storytelling in evaluation can be found at Racial Equity Tools. Always make sure that you tell the entirety of a story – the macro-level context that quantitative data gives us and the micro-level emotion and experience that qualitative data provides as complement.

Make sense of the data with the people.

A key principle of evaluation – and data collection more broadly – is to have the people who contributed to your study help you make sense of what you found. It sounds intuitive, but too often it’s overlooked. That means, if you run a survey of 100 people, for example, you should invite those same people into a facilitated session to talk it over. Not only do you get much-needed and helpful context put around the analysis you conducted, but participants begin to see just how their data are used in a transparent and open way. People see themselves in the percentages and graphs; they develop a connection to the process of data collection and analysis. These are all things necessary to building trust in what’s behind a number. FSG has some excellent resources on facilitated learning to help guide you in this process. Don’t worry – many of these facilitation techniques are social distancing-friendly.

This piece is part of the Foundation’s “Uncommon Voices” series, which features viewpoints from those working hard on issues that reduce racial inequity and support economic stability, mobility, and prosperity.