Fake News and Filter Bubbles

My first big assignment in 2017 (aside from successfully pulling off our annual conference) was to create a training plan for a 3-hour session for librarians at Brooklyn Public Library on web literacy. I was grateful for the input of BPL’s Director of Customer Experience on shaping the plan for the day, and thrilled as always to work with one of my web literacy partners in crime to make the training a success.

Sometimes I like to think of these types of engagement as recipes. This one was 1 cup of Mozilla’s Web Literacy Pilot Project, 1 cup of the reading I’d done over the winter holiday on algorithms and bias, 1/4 cup of the agonizing I’d seen around the web re: the polarization of our country, and a tablespoon of my own deep reflections on how our current information environment impacts library workers.

Learning Outcomes For Participants

  • Understand how unconscious bias and filter failure contributed to the rise of fake news
  • Evaluate media found on the web for validity and accuracy & teach others to do the same
  • Provide reasons why it is important to remain vigilant in seeking meaningful sources
  • Work with peers to develop a coordinated strategy for discussing these issues with our communities

This session operated as part train-the-trainer and part tools workshop. Librarians do this work every single day, and their experiences are central to ensuring that our society has the resources it needs.

Lesson Plan Outline

I. Introduction
Facilitators will introduce themselves, and share the overview for the course and agenda for the day.

II. Warm-up
Facilitators will lead a warm-up activity demonstrating the challenges in determining the veracity of online information.

III. Our unconscious biases
Truth is, we humans have always had the potential for inhabiting vastly different realities. In this section, facilitators will briefly review the characteristics of unconscious bias and share how, as social beings, we take comfort in those who look and think like we do.

IV. Algorithms and Facebook feeds
In this section, we will discuss what an algorithm actually is and how algorithms can be used to shape one’s experience of the web. Case studies include Facebook’s News Feed and Google’s search results. Participants will demystify algorithms in a small-group exercise.
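One way to demystify feed algorithms is to show that, at heart, many are just scoring functions. The sketch below is a hypothetical teaching simplification (not Facebook’s or Google’s actual code): it ranks posts by how much they overlap with topics a user already engages with, which naturally pushes familiar content to the top.

```python
# Toy illustration of an engagement-based feed-ranking algorithm.
# This is a hypothetical simplification for teaching purposes,
# not any real platform's implementation.

def rank_feed(posts, user_interests):
    """Score each post by overlap with the user's past interests,
    then return posts sorted from highest to lowest score."""
    def score(post):
        # A post earns one point per topic the user already engages with.
        return sum(1 for topic in post["topics"] if topic in user_interests)
    return sorted(posts, key=score, reverse=True)

posts = [
    {"title": "Local election results", "topics": {"politics", "local"}},
    {"title": "Cat video compilation", "topics": {"cats", "humor"}},
    {"title": "Partisan opinion piece", "topics": {"politics", "opinion"}},
]

# A user who mostly clicks on politics sees political posts first,
# reinforcing what they already read.
feed = rank_feed(posts, user_interests={"politics"})
print([p["title"] for p in feed])
```

Even this ten-line toy makes the filter-bubble mechanism concrete: the cat video never disappears, it just always loses the ranking, so the user never sees it without scrolling.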

V. Filter bubbles & fake news
Now that our participants have seen the comfort we take in information from our own communities, and the role technology plays in surfacing content that confirms pre-existing biases, we’ll discuss the rise of fake news: who writes it, and why.

VI. Evaluating sources
Library workers are well-versed in evaluating online information. In small groups, participants will discuss the attributes they look for in a trusted source. We will reconvene as a large group to build a list of these attributes.

VII. What can we do about it?
Facilitators will lead a discussion on the challenges and opportunities in helping library users assess information in the pursuit of truth. As a group, we will brainstorm methods of bringing the public into shared spheres of meaning. Participants may use this time to write down their ideas for addressing these issues in their own work.

VIII. Closing
We’ll close by reviewing the material covered and gathering any last-minute questions.

Learning Outcomes For Your Humble Facilitator(s)

Predictably, we learned a ton in the first session, and I’m very grateful we were able to practice our adjustments for the second go ‘round. In particular, I found that I needed to rework the portion on algorithms with a few highly structured slides before moving into the activity. We also added more scaffolding, by way of written reflection, to the discussion on how our participants might bring these ideas to their work.

As a whole, this was an amazing opportunity to work with an institution I’ve long admired. And to be able to do so while working alongside the best in the business was truly a career highlight.