Outstanding Academic Titles 2023: Technology, Data, and Algorithms
Posted on May 17, 2023 in Blog Posts
The popularization of ChatGPT has pushed conversations about AI literacy and algorithmic literacy to the fore, but these literacies are important in other contexts affecting researchers and students. In the chapter “Algorithmic Literacy as Inclusive Pedagogy,” recently published in Exploring Inclusive & Equitable Pedagogies: Creating Space for All Learners (ACRL, 2023) and republished here under a CC BY-NC-SA 4.0 license, Melanie Sellar argues that the centrality of search engines like Google in students’ research processes makes it necessary for librarians to help students understand how algorithms shape what information is presented, in potentially very biased ways. She makes a powerful case for this form of literacy:
When we ignore these academic and everyday research experiences they bring into college and lament their reliance on search tools like Google, we invalidate their experiences, elevate one privileged academic notion of research, and miss an opportunity to connect with them. Even worse, we allow the biases (racial, gender, political, geographical, to name a few) in algorithms to go uncritiqued and undetected, inadvertently sending the message that these tools are neutral and do not merit examination.
Sellar provides numerous ways of integrating inclusive algorithmic literacy skills into academic library instruction and supplies a truly excellent list of resources at the end of her chapter you’ll want to explore on your own.
Early in my career as an instruction librarian, I sensed something was missing in my workshops. It seemed I was doing everything right. I followed best practices for designing and delivering engaging sessions and made sure to tightly align my instruction with course assignments. Then I came to a realization: we were prioritizing subscription library databases and neglecting student use of Google. If we did talk about Google (and similar tools), we quickly dismissed them as inferior, characterized their information as unreliable, and positioned library databases as the best and only way to succeed on course assignments.
At my next institution, where I helped to directly shape the library’s information literacy curriculum, we opened up discussions around Google in first-year composition courses and created a research narrative assignment to elicit student thinking. By centering student voice, we became more certain of the need to include search engines in first-year instruction. When we listened to what students wrote in their narratives, it was evident that they felt their experiences were marginalized, even looked down upon, as articulated by one student:
I eventually gave up on my search for scholarly articles. Using good ol’ Google, I typed in the exact phrase I used when searching in the databases. And if those librarians say otherwise about my proficiency in research and quote citing, I have other papers that I would be happy to show to prove them wrong.
This reluctance to engage with nonacademic sources was reflective of the time. Project Information Literacy surfaced the culture of research assignments in a 2010 study, finding that faculty were most likely to point students to traditional library sources in their assignment directions.1 Search engines, websites, Wikipedia, and blogs were the least likely to be mentioned as possible, permissible sources.2 Within this culture it seemed difficult for librarians to find room in workshops to advocate for and lead discussions around how to use nontraditional sources. Nevertheless, many librarians began to initiate those discussions and see some traction. For example, Nelson and Jacobs describe their faculty-librarian partnership leading a semester-long Wikipedia project in an undergraduate history course.3
As the decade wore on, our students and faculty came to use and rely upon an ever-increasing set of information tools, including YouTube, Twitter, Facebook, Instagram, and Snapchat. Each platform wrestles enormous sets of data into ordered results lists. When Netflix launched a one-million-dollar prize for the first entrant who could improve its recommendation algorithm’s predictions by 10 percent, it was an acknowledgement that tailoring results to be more precise and appealing keeps users happy and subscribing.4 People generally like algorithms, value the shortcuts they offer, and place a high degree of trust in the algorithmically driven tools used in their daily lives. In its 2020 study, Project Information Literacy mused that “the lack of trust in traditional authority figures meant trust was placed in Google as the arbiter of truth, sometimes to a ridiculous extent.”5 Algorithms hook us and keep us using.
The increase in these platforms also ushered in growing concern about the invisible influence they wield over how we interpret the world and create new knowledge. At the 2021 ACRL conference, librarian Ian O’Hara observed that “search results have the power to structure knowledge and create an alternative material reality. Google search is an inflection point for an epistemic crisis.”6 These algorithms operate entirely behind closed, proprietary walls; they are not easily open to critique; and they change constantly. A Google employee and former search engineer shared that key intellectual property like search engine algorithms is not open to all employees; instead, Google has an internal hierarchy of access levels that involve nondisclosure agreements and other clearances.7 What we know of how these algorithms operate is entirely based upon what limited information the companies share and what we can observe about the algorithms through use.
We are now seeing academics spanning the fields of philosophy, ethics, education, and social sciences begin to study the societal impacts of algorithms, and a new field of critical algorithm studies being forged.8 We are asking questions about what the public and our students know and do not know about how algorithms operate.9 In librarianship we too are awakening to the role we can play in facilitating student awareness and critical discussion of algorithms. We may give particular credit to Safiya Noble’s keynote “Searching for Girls: Identity for Sale in the Age of Google” at the 2015 American Library Association conference and her 2018 book Algorithms of Oppression for igniting our professional interest and demonstrating how algorithmic profiling can cause social harm.10 The Framework for Information Literacy for Higher Education released in 2016 also gives us space to take on these overarching information literacy issues.11 Many of our colleagues have started doing important work in piloting lessons and curricula, some of which is highlighted in this chapter. Teaching algorithmic literacy, I argue next, is an inclusive teaching practice that we should adopt as a profession.
The inclusive education movement has its roots in K–12 special education over twenty years ago. Parents and teachers began to reject the segregation of children with disabilities from the mainstream classroom; instead, they called for inclusive spaces that could support all differently abled students.12 This idea broadened its focus over time to advocate for the full inclusion of students from all cultural, ethnic, linguistic, racial, and academic backgrounds. Teachers were encouraged to embrace and build upon the prior knowledge and experiences of their students, not to disregard or diminish their lived experiences and identities. Fundamentally the goal was to value the unique experiences of each student and avoid their marginalization within the classroom.
This call for inclusive education has now captured the wide interest of academics, including information literacy practitioners. There is no authoritative guidebook on how to enact inclusivity; rather, we are building that guidebook through our teaching itself and then sharing our practices in publications such as this one, following the established approach of the K–12 field.13
As our profession begins to develop a repertoire of inclusive teaching approaches, I propose that we include support of algorithmic literacy as one focus for how we enact and embrace inclusive education. Algorithmic literacy, a relatively new concept, refers to an understanding of what algorithms are, how they are used, and how they can positively and negatively impact individuals and groups. Koenig goes so far as to suggest that “the next phase in technological literacy is to incorporate the role of algorithms.”14 Faculty across academia believe that algorithmic literacy should be intentionally cultivated in college, but rarely discuss it, preferring instead that “someone else took the lead.”15
As information literacy practitioners, we could take this lead. By facilitating student awareness and critical discussion of algorithms, we can enact these inclusive teaching goals in our teaching contexts:
Our students enter our classrooms as complex humans with varied backgrounds, opinions, and ideas. They also bring positive and negative research experiences using varied tools outside of the library. When we ignore these academic and everyday research experiences they bring into college and lament their reliance on search tools like Google, we invalidate their experiences, elevate one privileged academic notion of research, and miss an opportunity to connect with them. Even worse, we allow the biases (racial, gender, political, geographical, to name a few) in algorithms to go uncritiqued and undetected, inadvertently sending the message that these tools are neutral and do not merit examination.
To bring this proposal into greater clarity, it’s helpful to look at what shape this work could take in a few contexts. I have sketched three lesson areas that could be incorporated into a lower division course as one or two embedded sessions. Within each area I have highlighted some inclusive practices, informed partly by the Association of College and University Educators’ “Inclusive Teaching for Equitable Learning” curriculum crosswalk.16 The areas also suggest a range of activities that librarians could adapt in their own instruction. Where activities are inspired by materials in the resources section, I have indicated so.
Inclusion rationale: Search engines like Google are central to how students search for information.
While many students have some basic understanding of how those search algorithms function, that is not uniformly true and, more importantly, it is usually unaccompanied by a critical perspective.17 Given that students (and the public at large) normally consult just the first page of search results, they cede a great deal of decision-making power to Google without being given the space and skills needed to ask about the impact of Google’s decisions on topic narratives.
Learning goals: Students typically begin (and often end) their research using Google. This lesson area acknowledges this student preference and supports them in becoming more critical users of it. Goals include being able to articulate some understanding of how Google’s ranking algorithm works, to identify some limitations of relying on top ranked results, and to identify more sophisticated strategies for or alternatives to searching Google.
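For librarians who want a concrete anchor for the goal of “articulating some understanding of how Google’s ranking algorithm works,” the link-analysis idea behind PageRank (which Google’s founders described in 1998) can be demonstrated in a few lines. The four-page “web” below is invented purely for illustration, and real search ranking combines hundreds of signals beyond links; this is a classroom sketch, not Google’s actual algorithm.

```python
# A minimal sketch of PageRank-style link analysis: a page is ranked
# highly when many (highly ranked) pages link to it. The graph below
# is a hypothetical four-page web, not real data.

def pagerank(links, damping=0.85, iterations=50):
    """Iteratively estimate each page's rank from its inbound links."""
    pages = list(links)
    rank = {p: 1 / len(pages) for p in pages}
    for _ in range(iterations):
        new_rank = {}
        for p in pages:
            # A page receives a share of the rank of every page linking to it,
            # split evenly across that page's outbound links.
            inbound = sum(rank[q] / len(links[q]) for q in pages if p in links[q])
            new_rank[p] = (1 - damping) / len(pages) + damping * inbound
        rank = new_rank
    return rank

# "hub" is linked to by every other page, so it ends up ranked highest.
web = {
    "hub": ["a", "b"],
    "a":   ["hub"],
    "b":   ["hub"],
    "c":   ["hub", "a"],
}
ranks = pagerank(web)
print(max(ranks, key=ranks.get))  # the most-linked-to page rises to the top
```

Even this toy version makes the critical point tangible: what rises to the top is a product of who links to whom, a structure that reflects existing power and visibility rather than neutral “relevance.”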
Inclusion-oriented activities: The workshop content itself is inclusive because of its focus on students’ authentic experiences, on lifelong research skills, and on developing alternative ways of searching to bring in more diverse perspectives. I often kick off this kind of session with an assessment as a way to foreground student voice and experience; it could take the form of asking students to reflect on a question like “How does Google work?” using an anonymous tool such as Jamboard. Videos or textual materials on Google’s ranking algorithm could be explored in class or offered beforehand via a module in the learning management system. Students can work in pairs or small groups to discuss and refine their understanding, using a provided worksheet around a common search to guide their exploration. For example, they can be prompted to note the nature of the top publishers, proportion and placement of ads, observed top level domains, and geographic locations of results. Most importantly, students should be asked to reflect on why they think the top results are structured in this way, what the potential implications are not just for their own information seeking but also for their community or society at large, and what they might be missing. This not only gives them space to comment descriptively on what they are seeing, but also supports them in developing more critical perspectives. It also gives instructors an important window into student thinking, allowing us to adjust our instruction and identify new areas of potential support.
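For a class comfortable with a little code (or for the instructor preparing discussion prompts), the worksheet observations above can be tallied programmatically. The result list and ad flags below are hypothetical, standing in for the URLs a student group would actually collect from their common search.

```python
# A hypothetical worksheet aid: given URLs collected from the first page
# of results, tally top-level domains and count sponsored slots.
from collections import Counter
from urllib.parse import urlparse

# (url, marked_as_ad) pairs -- invented sample data for illustration.
results = [
    ("https://www.example.com/article", False),
    ("https://shop.example.com/buy-now", True),   # labeled "Ad" on the page
    ("https://news.example.org/story", False),
    ("https://www.example.gov/report", False),
    ("https://www.example.com/listicle", False),
]

tlds = Counter(urlparse(url).netloc.rsplit(".", 1)[-1] for url, _ in results)
ads = sum(1 for _, is_ad in results if is_ad)

print(f"TLD breakdown: {dict(tlds)}")
print(f"Sponsored results: {ads} of {len(results)}")
```

The tallies are only a starting point; the reflective questions about why the results are structured this way, and who benefits, remain the heart of the activity.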
Extensions: There are many variations and extensions to this lesson area. Students could pick a different search platform (of their choice or a common one) to learn about and deconstruct, such as YouTube or Twitter, and then comment on the implications of their findings for their own search practices. There are also a number of materials in this chapter’s list of resources that provide excellent extension activities, including Berg’s “Googling Google: Search Engines as Market Actors,” Masunaga’s “Evaluating Online Sources with Lateral Reading” and “Exploring Google Scholar with a Summer Bridge Program,” and Caffrey Gardner’s “Analyzing Search Engines: What Narrative Is Told through the Algorithm.”18 An interesting lesson centered on the representation of authority in Instagram is offered by Andrews, Pho, and Roh.19
Inclusion rationale: The issue of greatest concern and study in the last five years related to algorithms is that of bias. Scholars and advocates have been exposing and critiquing the range of harmful ways that algorithms reinforce bias and systems of privilege relating to race, gender, class, and ethnicity.20 When bias in information tools is left unacknowledged, librarians risk exacerbating this problem and inadvertently creating learning environments open to misrepresentation of diverse groups.
Learning goals: Lessons in this area seek to surface biases in search tools and expose them to critique. Students will explore how information production is not neutral but rather can reinforce stereotypes and traditional power structures. They should connect that knowledge not only to their college research assignments, but also, more importantly, to their larger communities and society.
Inclusion-oriented activities: The workshop content itself is inclusive because of its focus on students’ authentic and everyday search experiences, with an emphasis on surfacing and critically analyzing the biases encountered in those tools. Moving the discussion into how to seek and elevate alternative perspectives and narratives helps center the voices of diverse communities and helps inculcate lifelong dispositions needed to seek out and advocate not only for those perspectives but also for better tools. I have adapted Davis’s lesson “Bias in Your Search Results” into a first-year composition course with great student engagement.21 Students work in small groups to read articles about bias in a tool, source type, or system and answer questions to share with the larger class. If students were surveyed beforehand about the preferred tools they use, the articles could also be customized to align with those preferences. To help create a safe space for dialogue around biases, establishing some discussion guidelines beforehand will be important.
Extensions: This lesson area can be varied and extended in many ways. Caffrey Gardner’s “Analyzing Search Engines” workshop provides a good activity centered on bias in Google images.22 The University of California, Merced, similarly has Google images and Google auto-complete exercises in its LibGuide “Algorithm Bias and Gaming.”23 Schubert, Wiley, and Young have an adaptable and interactive LibWizard tutorial.24 The University of Louisville’s lesson on algorithms at play in academic disciplines is a useful activity for introducing and identifying issues of bias in those realms.25
Inclusion rationale: Digital platforms are widely used as sources for news information, as documented by a 2021 Pew Research report.26 When I asked first-year composition students where they got their news from, the majority indeed told me YouTube, Twitter, Instagram, Apple News, Snapchat, and TikTok. One student noted, “I get literally all my news from Twitter.”27 While librarians have done good work in recent years supporting the critical evaluation of news sources, we have yet to widely extend our pedagogy to embrace examining how algorithms serve up those news sources into the platforms we use.
Learning goals: We know that students already get news from an array of different sources. This lesson area builds upon that foundation, guiding them to reflect upon the platforms they are using, highlighting the role of algorithms in those platforms, and then identifying strategies for adding more diverse news outlets to their following.
Inclusion-oriented activities: The workshop content itself is inclusive because of its focus on students’ authentic experiences and on incorporating additional news search strategies to bring in more diverse voices. As a diagnostic assessment, I foreground student voice by asking them to share where they get their news from using an anonymous tool like Jamboard. Another helpful activity to kick off the lesson would be to ask students what they know about news platforms and what they’d like to learn. Using the learning management system, I typically ask students to watch videos on social media news platforms and engage with simulations like TheirTube or “Can You Spot the Deceptive Facebook Post?”28 Students can be grouped in think-share or small discussion forum groups to share thoughts as they explore the materials.
Extensions: With more faculty collaboration, you could ask students to track a news issue on social media over time, and then write an essay discussing how platforms differed in their portrayal and coverage of it. Students could also be introduced to AllSides and Pew’s “Political Typology Quiz” as a way to explore different political ideologies in news sources and implications for how that shapes their perceptions of stories.29
When we can’t secure the space for an entire session devoted to algorithmic literacy, we could choose one learning outcome, one small activity, or one formative assessment. As Carolyn Caffrey Gardner shared, “We always joke in my library that no one’s really ever asked for a one-shot session on algorithmic bias . . . so it’s really about finding that space whether it’s one-on-one or in those one-shots where you are given more freedom.”30 If aiming for an entire lesson feels challenging, try to find spaces in an existing lesson where you might draw student attention to algorithms. For example, Caffrey Gardner often adds a question like “How do you think this search algorithm is ordering results?” in a database session.31 That can begin to open up bigger conversations with faculty. In line with Caffrey Gardner’s approach, Lloyd suggests that we incorporate and model a mindset of questioning results and automated decisions in our regular instruction.32 For those wanting to imagine what algorithmic literacy might look like at a multicourse or programmatic level, consult Koenig and Caffrey Gardner for learning outcome approaches.33
It’s clear that algorithmically driven platforms underpin everything our students do in their lives. They are ubiquitous, and their influence is growing. By adopting algorithmic literacy as an inclusive pedagogy, we can assert that we will honor, be attentive to, and discuss the information tools that they use every day in their lives, not just the scholarly ones used for an assignment immediately before them. As Heidi Jacobs wrote in her call to include broader and more critical approaches to information literacy: “When we limit the kinds of questions we ask our students and ask ourselves about information. . . , we limit the ways in which we can be informed, critical, and engaged.”34 Let’s acknowledge and address the algorithms at play in our students’ lives and, by doing so, push our information literacy work to be far more critical and inclusive.
CORA is an open educational resource for librarians, faculty, and other educators hosted by Loyola Marymount University. Search it for lesson plans on algorithms, algorithmic bias, and so on. Some are profiled below.
“Analyzing Search Engines: What Narrative Is Told through the Algorithm,” Caffrey Gardner
Learning outcomes: Students will identify advertisements within a list of search results, discuss the role advertising plays in how search results are ordered, and describe how search results are impacted by human biases in their ranking algorithms.
“Bias in Your Search Results,” Davis
Learning outcome: Students will be able to recognize that search tools and systems reflect power structures of race, gender, sexuality, class, and so on.
“Evaluating Online Sources with Lateral Reading,” Masunaga
Learning outcomes: Students will use lateral reading to identify potential biases or controversies associated with an organization publishing online sources using resources found on Google. They will consider if they would recommend the sources they evaluate to the community at large.
“Exploring Algorithmic Bias with a Summer Bridge Program,” Acosta
Learning outcomes: Students will discuss the effects of algorithm bias in order to articulate how some individuals or groups of individuals may be misrepresented or systematically marginalized in search engine results. They will also develop an attitude of informed skepticism in order to critically evaluate Google search results.
“Exploring Google Scholar with a Summer Bridge Program,” Masunaga
Learning outcomes: Students will be able to search Google Scholar in order to find scholarly and discipline-specific sources for their information needs. Students will understand Google Scholar’s limitations and biases in order to critically evaluate their search results.
“Googling Google: Search Engines as Market Actors,” Berg
Learning outcomes: Students will articulate clearly how algorithms such as PageRank influence information-seeking behavior and search results; explain Google’s data security and privacy issues; and create searches that show critical thinking and awareness of how Google works.
“What’s behind a Web Search? Bias and Algorithms,” Schubert, Wiley, and Young
Learning outcomes: Students will define a broader context for algorithms, analyze Google results for algorithmic bias, and identify actions for countering algorithmic bias. The lesson includes an excellent LibWizard tutorial.
“Algorithm Bias and Gaming,” University of California, Merced
Learning outcomes: Define algorithmic bias. Recognize how algorithms may perpetuate bias or misrepresent certain people or groups. Understand how algorithms may be altered or gamed for certain purposes. Brainstorm strategies to minimize algorithm bias.
“Algorithmic Literacy,” University of Louisville
Learning outcomes: Explore the impact of unseen algorithms on the online content we see and the ramifications on privacy, discrimination, and political polarization.
“Media Literacy,” University of North Carolina at Charlotte
This LibGuide provides links to various resources and digital learning objects which library instructors could use in teaching algorithmic literacy.
“Muscat Scholars 2020: Critical Information Literacy Activity,” University of San Francisco
This Google Doc describes a summer workshop for first-generation students. The focus is on critically exploring Instagram influencers for authority and authenticity.
Introduction to College Research, Butler, Sargent, and Smith
The chapter “Age of Algorithms” explores algorithms and their pervasiveness, identifies key concerns around algorithmic bias, and discusses the psychological and sociological effects of algorithms.
Humans R Social Media, Daly
Of particular relevance is chapter 4: “Algorithms: Invisible, Irreversible, and Infinite.”
Digital Survival Skills: My Media Environment, Krouse and Lee
There are activities that could be adapted to lower division undergraduate students.
Big Data and Society
Journal of Media Literacy Education
Critical algorithm studies
Critical data studies
Masked by Trust: Bias in Library Discovery (Reidsma)
The Age of Surveillance Capitalism (Zuboff)
Acosta, Elisa. “Exploring Algorithmic Bias with a Summer Bridge Program.” CORA (Community of Online Research Assignments), October 28, 2018. https://www.projectcora.org/assignment/exploring-algorithmic-bias-summer-bridge-program.
AllSides home page. Accessed September 2021. https://www.allsides.com/.
Andrews, Nicola, Annie Pho, and Charlotte Roh. “Muscat Scholars 2020: Critical Information Literacy Activity.” Google Docs. Accessed September 2021. https://docs.google.com/document/d/1D0_QT9BWbkI7LSjEzoeIUYhMcfe9Hiwtg7DrALqXZzM/edit.
Association of College and Research Libraries. Framework for Information Literacy for Higher Education. Chicago: Association of College and Research Libraries, 2016. https://www.ala.org/acrl/standards/ilframework.
Association of College and University Educators. “Inclusive Teaching for Equitable Learning.” Curriculum crosswalk, 2020. https://acue.org/?acue_courses=inclusive-teaching-for-equitable-learning.
Berg, Jacob. “Googling Google: Search Engines as Market Actors.” CORA (Community of Online Research Assignments), November 4, 2016. https://www.projectcora.org/assignment/googling-google-search-engines-market-actors.
Butler, Walter D., Aloha Sargent, and Kelsey Smith. Introduction to College Research, edited by Cynthia M. Cohen. Pressbooks, 2021. https://introtocollegeresearch.pressbooks.com/.
Caffrey Gardner, Carolyn. “Analyzing Search Engines: What Narrative Is Told through the Algorithm.” CORA (Community of Online Research Assignments), December 10, 2018. https://www.projectcora.org/assignment/analyzing-search-engines-what-narrative-told-through-algorithm.
———. “Teaching about Algorithms.” Interview by Amanda Piekart and Jessica Kiebler. The Librarian’s Guide to Teaching. Podcast audio, February 9, 2021. https://librariansguidetoteaching.weebly.com/episodes/episode-33-teaching-about-algorithms.
———. “Teaching Algorithmic Bias in a Credit-Bearing Course.” International Information and Library Review 51, no. 4 (2019): 321–27. https://doi.org/10.1080/10572317.2019.1669937.
Daly, Diana, ed. Humans R Social Media, 3rd ed. Tucson: University of Arizona Center for University Education Scholarship, 2021. https://opentextbooks.library.arizona.edu/hrsm/.
Davis, Lindsay. “Bias in Your Search Results.” CORA (Community of Online Research Assignments), July 12, 2019. https://www.projectcora.org/assignment/bias-your-search-results.
Detmering, Robert, Amber Willenborg, and Terri Holtze. University of Louisville Libraries. “Algorithmic Literacy.” Last updated October 7, 2020. https://library.louisville.edu/citizen-literacy/algorithmic.
Florian, Lani, and Kristine Black-Hawkins. “Exploring Inclusive Pedagogy.” British Educational Research Journal 37, no. 5 (2011): 813–28. https://doi.org/10.1080/01411926.2010.501096.
Gillespie, Tarleton, and Nick Seaver. “Critical Algorithm Studies: A Reading List.” Social Media Collective. Last updated December 15, 2016. https://socialmediacollective.org/reading-lists/critical-algorithm-studies/.
Hallinan, Blake, and Ted Striphas. “Recommended for You: The Netflix Prize and the Production of Algorithmic Culture.” New Media and Society 18, no. 1 (2016): 117–37. https://doi.org/10.1177/1461444814538646.
Ham, Chris D. “Why Is This First? Understanding and Analyzing Internet Search Results.” Journal of Educational Research and Practice 9, no. 1 (2019): 400–12. https://doi.org/10.5590/JERAP.2019.09.1.28.
Head, Alison, and Michael Eisenberg. “Assigning Inquiry: How Handouts for Research Assignments Guide Today’s College Students.” Progress report. Project Information Literacy Research Institute, July 12, 2010. https://projectinfolit.org/pubs/research-handouts-study/pil_research-handouts_2010-07-13.pdf.
Head, Alison, Barbara Fister, and Margy MacMillan. Information Literacy in the Age of Algorithms: Student Experiences with News and Information, and the Need for Change. Santa Rosa, CA: Project Information Literacy Research Institute, January 15, 2020. https://projectinfolit.org/pubs/algorithm-study/pil_algorithm-study_2020-01-15.pdf.
J. Murrey Atkins Library. “Media Literacy.” Research guide. University of North Carolina at Charlotte. Last updated October 27, 2021. https://guides.library.uncc.edu/c.php?g=995102&p=7709547.
Jacobs, Heidi L. M. “Pedagogies of Possibility within the Disciplines.” Communications in Information Literacy 8, no. 2 (2014): 192–207. https://doi.org/10.15760/comminfolit.2014.8.2.166.
Kihara, Tomo. TheirTube home page. Accessed September 2021. https://www.their.tube.
Koenig, Abby. “The Algorithms Know Me and I Know Them: Using Student Journals to Uncover Algorithmic Literacy Awareness.” Computers and Composition 58 (December 2020): 102611. https://doi.org/10.1016/j.compcom.2020.102611.
Lloyd, Annemaree. “Chasing Frankenstein’s Monster: Information Literacy in the Black Box Society.” Journal of Documentation 75, no. 6 (2019): 1475–85. https://doi.org/10.1108/JD-02-2019-0035.
Masunaga, Jennifer. “Evaluating Online Sources with Lateral Reading.” CORA (Community of Online Research Assignments), February 15, 2021. https://www.projectcora.org/assignment/evaluating-online-sources-lateral-reading.
———. “Exploring Google Scholar with a Summer Bridge Program.” CORA (Community of Online Research Assignments), November 7, 2018. https://www.projectcora.org/assignment/exploring-google-scholar-summer-bridge-program.
Nelson, Robert L., and Heidi L. M. Jacobs. “History, Play, and the Public: Wikipedia in the University Classroom.” History Teacher 50, no. 4 (August 2017): 483–500. https://www.jstor.org/stable/44507270.
Noble, Safiya Umoja. Algorithms of Oppression: How Search Engines Reinforce Racism. New York: New York University Press, 2018.
O’Hara, Ian. “If . . . Then . . . Else: Algorithmic Systems and Epistemic Crises.” In Ascending into an Open Future: Proceedings of the ACRL 2021 Virtual Conference, April 13–16, 2021, edited by Dawn M. Mueller, 98–106. Chicago: Association of College and Research Libraries, 2021. https://www.ala.org/acrl/sites/ala.org.acrl/files/content/conferences/confsandpreconfs/2021/IfThenElse.pdf.
Peterson, J. Michael. “Inclusive Schooling.” In Family and Society, edited by Michael Shally-Jensen, 1040–48. Vol. 3 of Encyclopedia of Contemporary American Social Issues. Santa Barbara, CA: ABC-CLIO, 2011. Gale.
Pew Research Center. “Political Typology Quiz.” November 9, 2021. https://www.pewresearch.org/politics/quiz/political-typology/.
Reidsma, Matthew. Masked by Trust: Bias in Library Discovery. Sacramento, CA: Litwin Books, 2019.
Schubert, Carolyn, Malia Wiley, and Alyssa Young. “What’s Behind a Web Search? Bias and Algorithms.” CORA (Community of Online Research Assignments), May 6, 2021. https://www.projectcora.org/assignment/what%E2%80%99s-behind-web-search-bias-and-algorithms.
Spratt, Jennifer, and Lani Florian. “Inclusive Pedagogy: From Learning to Action; Supporting Each Individual in the Context of ‘Everybody.’” Teaching and Teacher Education 49 (July 2015): 89–96. https://doi.org/10.1016/j.tate.2015.03.006.
University of California, Merced, Library. “Algorithm Bias and Gaming.” LibGuide. Last updated October 29, 2020. https://libguides.ucmerced.edu/algorithmic-bias.
Walker, Mason, and Katerina Eva Matsa. “News Consumption across Social Media in 2021.” Pew Research Center, September 20, 2021. https://www.pewresearch.org/journalism/2021/09/20/news-consumption-across-social-media-in-2021/.
Zuboff, Shoshana. The Age of Surveillance Capitalism. New York: PublicAffairs, 2019.
Melanie Sellar is head of instruction and assessment at University Library, Santa Clara University. In this role she develops, enhances, and implements the Library’s instruction and assessment programs and priorities. She has twelve years of accrued experience specializing in the public services of academic libraries, with scholarship interests spanning inclusive and critical pedagogy, algorithmic literacy, and instructional design. Connect with Melanie on LinkedIn.
📖 View Exploring Inclusive & Equitable Pedagogies: Creating Space for All Learners (ACRL, 2023)