Author: Cami Ryan, Social Sciences Lead, Regulatory Scientific Affairs, Bayer Crop Science

Picture it. You are a respected, established advocate for agriculture and food production. You might even have a well-recognized brand. You come across a post on Facebook or Twitter that contains information that is:

  1. Inaccurate (not evidence- or science-based)
  2. Inflammatory
  3. Biased
  4. Not credibly sourced
  5. One, any or all of the above

What do you do?

Those of us who have been around the ‘debunking’ block a time or two have developed ways to identify ‘sketchy’ information and to mitigate or debunk it. We leverage a vast network of credible experts such as Jayson Lusk (agricultural economist), Lynn Sosnoskie (weed scientist) and the Cornell Alliance for Science. I’ve even directed family members and friends to sites like Snopes or Hoaxslayer when I’ve come across things like “lemons will cure your cancer!” or “gargling with salt water will cure COVID-19!”

But here’s another scenario. What if the source of the misinformation is another science- or evidence-based advocate or expert just like you?

None of us is immune to misinformation, no matter who we are or what our level of expertise is. I’ve slipped up many times over the years. My cognitive traps include satire (I’m very literal), misreading intent and failing to check background or context. I don’t think I’m alone in this. We can all get tripped up by misplaced ideology and even carelessness. And let’s face it, we will get called out on it. Twitter never forgets, and it rarely forgives.

But let’s step back a bit. Are you dealing with misinformation, or is it disinformation? There is a difference.

Misinformation refers to inaccurate or incomplete information. Misinformation can mislead (Fallis 2009) through:

  • An honest mistake
  • Negligence
  • Unconscious bias

Disinformation is qualitatively different, though. Disinformation is defined in the literature as “a product of a carefully planned and technically sophisticated deceit process” (Fallis 2009). Disinformation comes with intended or expected outcomes, anything from attracting “likes” and “shares” to the outright banning of targeted products or technologies. Disinformation is a product with a market (Ryan et al. 2020).

The main difference between misinformation and disinformation is INTENT. But the two are inextricably intertwined. Purposeful distribution of disinformation can lead to the spread of misinformation. Misinformation can, in turn, inform disinformation through deliberate strategies that leverage the gaps in our understanding of products, technologies or ideas.

As humans, we are motivated and incentivized by a variety of competing factors. Branding is a big one. If someone’s primary objective in posting, sharing or tweeting is branding or brand-building, it is easy to fall into the trap of favoring and sharing more inflammatory information. Inflammatory information drives attention, but it can also be purposefully sensationalized and misleading.

Mis/disinformation carries real costs: delayed or shelved innovations (just think of Golden Rice or virus-resistant cassava) and diminished scientific integrity, from science by press conference to predatory journals (Nature has covered these topics here, here and here). This is less about scientific literacy and more about encouraging information literacy, media literacy and critical thinking. The stakes keep rising: by 2025, an estimated 4.1 billion people will be using social media.

No matter our level of expertise, we are social creatures, herd animals, incentivized by competing factors and prone to biased thinking. That includes those of us who identify as knowledgeable experts. If we expect consistency in how others create, interpret and share information, we need to do our best to inoculate ourselves against misinformation and to continuously demonstrate critical thinking ourselves. It is our responsibility not only to keep engaging around these complex topics but also to take inventory of the incentives we respond to and to understand the kinds of information that motivate our beliefs. In fact, those of us who are experts and advocates have the greatest responsibility to stay attuned to these spaces and to be accountable when we fail or slip up.

The Agriculture Advocate’s Checklist:

  • Understand and accept that, as experts, we can’t possibly know everything.
  • Avoid snap judgments about mis/disinformation posts. Experts and advocates need to be ‘slow thinkers’ (à la Daniel Kahneman).
  • Constantly leverage your networks to ensure accuracy: check context, timing, dates and intent or motivations (theirs and yours). Remember, mis- and disinformation are constantly moving targets; it’s tough to keep up.
  • Share less and check more.
  • Think about the collective good. Does your post/response add value to the conversation?
  • Engaging in productive dialogue about agriculture means putting the relationship first. It’s a conversation, not a conversion.

Finally, INTENT is something that we always need to consider when engaging in productive conversations about complex topics like GMOs, health, vaccines or pesticides. Specific issues will come and go, but agriculture and science will always be problematized through disinformation. Fundamentally, we need to fight our urges to “win” today’s conversation or gather attention with today’s post and instead emphasize building transparency and trust for the long run. Understanding the landscape of mis/disinformation — how it is created and shared and the behaviors and biases that drive it (including our own) — is important in ensuring that societies benefit from the best that science has to offer.

For more information from Cami Ryan on this topic, see her Twitter thread, which highlights a variety of sources/citations on the topic, here. Her recent journal article, “Monetizing disinformation in the attention economy: The case of genetically modified organisms (GMOs),” in the European Management Journal can be accessed here.

Further Suggested Reading

Caulfield, T. (2020). The COVID-19 ‘infodemic’: Debunking works, if it’s done right. The Star. June 19.

Cook, J., & Lewandowsky, S. (2012). The Debunking Handbook. St. Lucia, Australia: University of Queensland.

Potochnik, A. (2020). Awareness of Our Biases Is Essential to Good Science. Scientific American. August 9.

Ryan, C. (2020). The Bad Stuff is Easier to Believe. LinkedIn blog post, Cami D Ryan.

Ryan, C. D., et al. (2020). Monetizing disinformation in the attention economy: The case of genetically modified organisms (GMOs). European Management Journal, 38(1), 7-18.

Schurenberg, E. (2013). Daniel Kahneman: Thinking Fast vs. Thinking Slow. Interview on Inc. Magazine.
