Last week, in Ditching the Dissonance II, we discussed rabid, info-seeking wolves and thoughtful, information-foraging caribou, considered how confirmation bias can influence our information seeking habits, and how bad information causes breakdowns in communication. Today, we will look at information seeking strategies that will help you get in touch with your inner evidence-based-information-seeking caribou.
As you read last week’s blog, did you consider which category of information seeker you fall into: wolf or caribou? Do you do diligent research and consider even the information that doesn’t support your hypothesis? Or do you seek only the evidence that confirms your views…and are you so in denial that you are a poor researcher that you decided to just disregard all the information in the blog that supported this conclusion?
“I really want to be an evidence-based-information-seeking caribou!” I hear you say, “But how???”
If you found yourself in the latter category and weren’t in denial about it, you may have been greatly disappointed with this state of affairs, but never fear—caribou status is within your reach.
The first step is to think about how you typically seek out information. Determine where your weak spots are. Are you a lazy researcher who takes frequent shortcuts to avoid sifting through large amounts of information? Are you afraid to find information that will force you to reevaluate your opinions? Have you ever set foot inside a library? Have you ever found yourself using Boolean operators in a bibliographic database search? You may be thinking at this point: Who are you and why are you asking me complicated questions that I don’t want to answer??? But you need to answer these questions before you can achieve caribou status. Consider where you are as a researcher and where you can improve your skill set if you want to be a better information seeker.
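If Boolean operators are new to you, here is a toy sketch of the idea. This is not any real database’s API (the article titles and the `search` function are made up for illustration); it just shows how AND and NOT narrow a search the way they do in a bibliographic database:

```python
# Toy illustration (not a real database API): how Boolean operators
# narrow a search over a small set of article titles.
articles = [
    "Confirmation bias in online information seeking",
    "Foraging behavior of caribou",
    "Information foraging on the Web",
    "Wolf pack hunting strategies",
]

def search(terms_all=(), terms_not=()):
    """Return titles containing every AND term and none of the NOT terms."""
    results = []
    for title in articles:
        t = title.lower()
        if all(term in t for term in terms_all) and not any(
            term in t for term in terms_not
        ):
            results.append(title)
    return results

# Roughly: information AND foraging NOT wolf
print(search(terms_all=("information", "foraging"), terms_not=("wolf",)))
```

The point is precision: each AND term shrinks the result set, and each NOT term filters out an unwanted topic, which is exactly how you tame an overwhelming pile of search results.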
The second step is to determine your information needs. This may mean admitting to yourself that there is a gap in knowledge or understanding of an idea, concept, or phenomenon that you are looking to fill (i.e., admit that you don’t know everything). This will likely require examining your own motives. Determine what you need to know, and why, and look for the best answer. In other words, don’t Google what you are convinced is the only possible answer to the question or seek out only the information that confirms your stance on an issue. That is confirmation bias at work. The information Google gives you will likely confirm your opinion (no matter how far-fetched), but it may be grossly inaccurate. Recall those instructions that your English teacher gave you when you had to write an expository essay in high school: start with a good question and work toward an answer based on reliable sources.
The third step: choose your sources wisely. The information need will determine the sources you use. Though they are easy to access, Google, Wikipedia, and the like are not the best sources for many types of information. However, I will be the first to admit that I use them. When I wake up at 2 in the morning desperately trying to remember who directed the movie Paris, Texas (it was Wim Wenders!!!), Google provides a quick resolution to this dilemma. Google and Wikipedia can, at times, be a good starting place for finding resources (those “References” sections can be useful). But if I want to understand the conceptual underpinnings of string theory, I’m not going to rely on a Wikipedia article or the random blog posts churned up by a Google search—I’m going to look for information compiled by experts in the field.
For some subjects, you need an expert. Not everyone on the Internet is an expert, but anyone on the Internet can claim to be one. Depending on your information needs, you may need to seek out information in peer-reviewed journals. Don’t be afraid to wade into scholarly waters if your information need necessitates it. Even if your information need is basic enough that you can trust Google to answer it, you should still apply certain criteria to your search results: check the credentials of the content creator, review their other work (if available), note the date of the material, and look for retractions.
One of the most important but often overlooked steps in the information seeking process is evaluating your results; if you’ve been following the previous steps, this one should be easier. To evaluate your results for validity and relevance, ask the following questions: Do they help you answer (or at least work toward an answer to) your question? Do they meet your information needs? Are they the best results you could get under the circumstances? Is the information they provide useful, reliable, up-to-date, and unbiased? If you don’t feel comfortable with the answers to these questions, it may be time to move on to the secret fifth step: repeat the process until you are satisfied with your results.
How you interpret the information you find on an information foraging expedition, and how you use it, is up to you. But, at the very least, always try to be open to new information you find along the way, not just the stuff that confirms a previously held opinion. While this is helpful advice for any information seeker, it is vital for anyone in a position of authority in the public or private sector, or in any position where the information you disseminate will have a far reach, because of the amount of influence you wield. Any bad information you provide will travel far. Learn to recognize confirmation bias and avoid it before it leads you to spread bad information. Continually hone your information seeking skills and improve your information seeking strategy. Learn the ways of the diligent, evidence-based-information-seeking caribou!
Back in mid-March, I wrote a blog post called “Ditching the Dissonance.” In it, I discussed how quickly bad information gets around and how we become complicit in spreading it across the Internet via social media. I also discussed cognitive dissonance, the theory that we “seek consistency in our beliefs and attitudes in any situation where two cognitions are inconsistent.” Today I want to consider how confirmation bias can influence our information seeking habits and how it can be detrimental to rational debate.
Foraging in the vast information fields of the Interwebs
Here’s a thought-provoking analogy from an old Library Quarterly article, cited in a paper presented by the Faculty of Information Studies of the University of Toronto:
Just as animals evolve different methods of gathering and hunting food or prey in order to increase their intake of nutrition, humans also adopt different strategies of seeking information in order to increase their intake of knowledge. Foraging for information on the Web and foraging for food share common features: both resources tend to be unevenly distributed in the environment, uncertainty and risk characterize resource procurement, and all foragers are limited by time and opportunity costs as they choose to exploit one resource over another (Sandstrom 1994).
Librarians are great foragers and can help us find the best information to meet our needs but not everyone has librarian-level information foraging skills. Some of us are poor information foragers because we lack the skills OR we have the skills but aren’t always willing to utilize them. Sometimes we are short on time and need to take shortcuts on our information seeking journey. Sometimes we can’t be bothered to sift through the huge amount of information available to us so we choose the top results that Google returns. Sometimes we just want the information that will help us make the point we are trying to make in the heat of a debate and we don’t want the information that doesn’t support it. This is where confirmation bias comes in.
Confirmation bias, as defined by Science Daily, is “a phenomenon wherein decision makers have been shown to actively seek out and assign more weight to evidence that confirms their hypothesis, and ignore or underweigh evidence that could disconfirm their hypothesis.”
Even on a good day, confirmation bias can be a bad thing. When you are angry, it can be a disaster.
You mad, bro?
There’s a quote from one of my favorite novels, Pride and Prejudice by Jane Austen: “Angry people are not always wise.” While this is usually true, never is it truer than when we apply it to our information seeking behaviors. Have you ever gotten into a debate on Facebook or Reddit, or observed other people engaged in debate, and watched the debate deteriorate to the level of name-calling and—even worse—the posting of links to hastily Googled, poorly sourced “articles” (as spurious blog posts are sometimes called)? I have seen debates like this end friendships.
I’ve always thought that the most amazing thing about Facebook, and the Internet in general, is its power to unite people who are separated by distance—people who haven’t seen each other for decades, relatives, etc.—or people who wouldn’t have met in real life but find each other on the Internet through shared interests and experiences. Yet, just as easily, it can tear these relationships apart and create enemies of people we’ve never even met because of the anonymity it provides. We can say things behind a computer screen that we’d never say to another human’s face. This is dangerous stuff.
When you are angry, you are not a good decision-maker, nor are you a good information-seeker. You think you are, in the heat of the moment, but you aren’t. You are the worst information seeker you can possibly be when you are angry because you are looking for only the information that confirms your opinion, are incapable of viewing information in an unbiased way, and do not take the time to filter bad information. At that point, you are not a peaceful caribou, foraging serenely in a field of bibliographic database search results, selecting only the best, most reliable information to meet your information needs; you are a rabid wolf, stalking any shred of information, reliable or not, to validate your righteous anger! You may even attack the peaceful caribou who tries to get you to consider evidence-based information compiled by experts! Hyperbole and analogies aside, you DO NOT want to be the rabid wolf in this scenario. Rabid info-seeking wolves often succumb to confirmation bias.
Real leadership requires that we rise above our own biases. How we seek information affects how we interpret and present information and, if you are in a position of leadership, it is particularly important to maintain a high level of integrity when collecting and disseminating information. As a leader, you want to be fearless like a wolf; when it comes to foraging for information, you want to resemble the thoughtful caribou. Next week, we’ll talk about information seeking strategies and how to be an evidence-based-information-seeking caribou.
Have you seen this video? It has been making the rounds and, recently, SGR featured it in our weekly 10 in 10 Update. I’m sure most of you reading this, especially those of you who are regular Facebook users, have witnessed the phenomenon outlined in this video—from viral videos to cat memes, information can spread like a communicable disease. Unfortunately, misinformation can spread just as rapidly.
While funny memes and Buzzfeed articles are more entertaining than harmful, misinformation can be extremely damaging. Sometimes the people spreading these “mental sneezes,” and the ones catching them, don’t realize this. One of the great challenges we face in the Information Age is how to distinguish factual, evidence-based information from misinformation. If you are a frequent Facebooker or Googler, this can be a daunting task due to the speed at which misinformation travels and the degree to which search engines may be integrating our personal biases and habits into their filtering.
A while back, I read a book called The Filter Bubble. This book discusses the inner workings of internet filters and reveals the ways in which they can potentially promote bias and threaten rational discourse. In the book, the searches of two women are compared. Both Googled the term “BP” (as in, the oil company). The results Google returned were very different for each. One got results that were mostly about the BP oil spill, the other got mostly investment information about the company. These women had similar backgrounds and similar political beliefs. Google was primed to retrieve results based on their previous search habits. Considering these factors, you may suspect that people with drastically different political views and backgrounds would retrieve drastically different results when conducting identical searches.
What this suggests is that those great algorithms that help Google and Facebook determine what kinds of food and clothing we like (based on our previous habits) and target ads accordingly, are also determining the seemingly reliable information that is served up to us every time we do a search. In other words, Google is in your head and it knows what you want to hear.
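To make the mechanism concrete, here is a toy sketch of personalized ranking. This is emphatically not Google’s actual algorithm (the `rank` function and the sample click histories are invented for illustration); it only shows how ordering results by similarity to a user’s past clicks can send two people down different paths from the same query:

```python
# Toy sketch (hypothetical, not any real search engine's algorithm):
# ranking identical results by a user's click history creates a "filter bubble".
def rank(results, click_history):
    # Score each result by how many words it shares with previously clicked titles.
    past_words = {word for title in click_history for word in title.lower().split()}
    return sorted(results, key=lambda r: -len(past_words & set(r.lower().split())))

# Two users issue the same query ("BP") and get the same candidate results...
results = ["BP oil spill cleanup criticized", "BP stock hits 52-week high"]

# ...but their click histories differ.
investor_history = ["stock market update", "52-week high for oil majors"]
activist_history = ["oil spill damages gulf coast", "cleanup efforts criticized"]

print(rank(results, investor_history)[0])  # investment news ranks first
print(rank(results, activist_history)[0])  # spill news ranks first
```

The same candidate pages, reordered per user, yield two very different first impressions of “BP”—which is essentially the divergence described in The Filter Bubble.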
Sometimes this is a good thing. We want the information we want to be delivered as quickly as possible, right? The problem is, of course, it isn’t always reliable information. And this can be bad because, even if it isn’t reliable, we are still willing to accept it as reliable if it confirms our previously held notions. We tend to seek out information that makes us feel validated. And then we tend to post that information on Facebook or Reddit and argue about it for hours—even days—on end because even the arguing reinforces our belief in the opinion-confirming information.
Social psychologist Leon Festinger developed the theory of cognitive dissonance to explain our tendency to seek out information that confirms and reinforces our previously held assumptions. The theory contends that we “seek consistency in our beliefs and attitudes in any situation where two cognitions are inconsistent” (read more here). When we are confronted with information that conflicts with our beliefs and attitudes, we experience cognitive dissonance, a sort of “brain noise” that results from trying to hold two or more inconsistent beliefs at the same time. There are several ways we try to silence this noise:
- We change our beliefs, behaviors, and/or opinions (but not usually).
- We try to justify the belief, behavior, and/or opinion by altering the conflicting information so that it is a better fit with what we already think and feel.
- We try to justify beliefs, behaviors, and/or opinions by adding new ideas that make us feel better about what we already think and feel.
- We decide to ignore any new conflicting information that would call our beliefs, behaviors, and/or opinions into question.
As we learned from Randy’s blog on Monday, bad decisions are often made in groups. Herd mentality can ensure that bad decisions have a bigger impact, that mistakes are more difficult to correct, and that misinformation spreads faster. In order to make better decisions, we need to be well-informed; we need to be receptive to new information, even when it conflicts with our opinions and beliefs, and learn to spot harmful misinformation so that we don’t spread it. Next week, we’ll discuss some practices that can help us do this.