Lessons for Privacy Literacy Instruction from Pew Research’s New Survey

Can privacy literacy fight against the feeling of futility?

Last week, Pew Research Center released an 80-page report on Americans’ attitudes and beliefs about data privacy, tech regulation, and AI, among other topics. The survey, conducted May 15–21, 2023, among 5,101 American adults, offers some insights that might inform libraries’ data and privacy literacy programs. As with most Pew Research reports, the findings show strong opinions but also strong contradictions (a fair summation of the American mind). A single survey can only tell us so much, and it would be easy to overstate its significance. Still, we can draw out a few trends and findings worth considering for literacy instruction.

Distrust and fear dominate attitudes toward privacy 

Those surveyed showed deep concerns about how both the government and companies use their data. Most believe that they have little to no control over how the government (79% of respondents) or companies (73%) use their data. Distrust toward the government has a slight partisan edge, with self-identified Republicans expressing a greater degree of skepticism. Perhaps owing to the headline antics of Elon Musk, public trust in the leaders of social media companies is also very low. The report finds that 77% have little to no trust that these leaders will take responsibility for data misuse, and 71% have little to no trust that the government will hold these leaders accountable. 

I’d say yikes, but it gets worse. The more significant problem is that 61% of people “feel skeptical that anything they do will make much difference” when it comes to privacy. To use one example, the report points out that very few people read the “User Agreements” before clicking agree. One way to look at the situation is that whether you read the terms or not, you still have to press either agree or disagree, and the only way to use an app or service is to agree. Does it make a difference whether you read the User Agreement and agree or don’t read the User Agreement and agree anyway? Clearly, people are aware that their agency has limits.


Even so, people trust themselves… but, well, maybe they shouldn’t? 

One interesting contradiction that emerges in this report is that 78% of those surveyed “trust themselves to make the right decisions about their personal information,” but their practices don’t always align with this belief. Returning to an earlier example, about 6 in 10 people don’t bother reading User Agreements. Only about half of people purposefully create a strong password to secure their accounts. This isn’t to say that most people completely ignore their privacy (I’ll get to the positive signs later). But this false confidence is a dangerous attitude given what else we know. 

For instance, despite this (very American) ethic of self-reliance, only 53% of respondents are “very or somewhat confident” that they know what to do if someone hacks or steals their online information—a glaring problem. The report also found that 34% of respondents experienced fraudulent charges on their bank cards, had their email or social media accounts stolen, or had someone open a line of credit or apply for a loan in their name in the past year. Recently, Vox reported that zoomers—notwithstanding the popular image of them as digital natives—fall for online scams at a greater rate than boomers do. One could reasonably argue that this particular finding isn’t altogether surprising, because young people likely use the internet more than older people and have higher rates of exposure to scams. But the more general point worth taking from this study is that we shouldn’t presume good privacy judgment from young people. 

Again, I want to resist the temptation to blame individuals for systemic problems. However, I think the false confidence with which people treat their online privacy is a concerning feature of our digital literacy landscape that we need to contend with. 

People do care about their privacy 

Though a spirit of futility pervades this report, I think readers should take comfort that people, by and large, agree on the importance of digital privacy. Only 29% of respondents said that privacy “is not that big of a deal to them.” And despite the hyper-partisanship in large sectors of American life and politics, the Pew report shows strong bipartisan support for greater regulation (72%). Indeed, bipartisan bills regulating Big Tech are already underway.

Some people do take good steps toward securing their data and privacy, though for most of these practices they remain a minority. The report finds that 68% change their social media privacy settings, 49% have stopped using a device, website, or app due to privacy concerns, 44% rely on search engines or web browsers that don’t track them, and 36% use software that encrypts their messages and private communication. The upshot is that people are inconsistent in how cautious they are with their privacy and data. Through teaching and training, we have the opportunity to replace that false confidence with actual confidence. 

The takeaways for privacy literacy instruction 

All librarians who deal with privacy or data literacy instruction must contend first and foremost with the widespread view that steps to secure one’s privacy and data are futile. On the one hand, we must acknowledge that, absent regulation, a robust form of privacy is unattainable, and we cannot place sole responsibility for privacy onto individuals. On the other hand, we know that something as simple as changing your settings or choosing a strong password can, in fact, make a practical difference to a user’s or patron’s digital security and prevent a host of problems. Data and privacy literacy instruction has to not only draw this distinction but also clarify the stakes, reassuring students that their decisions do in fact matter. We need to be honest about the limitations of what we can do, but maintain that what little agency we have is nonetheless important. 
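To make the strong-password point concrete in a workshop or demo, instructors might show how little it takes to generate one programmatically. The sketch below is my own illustrative example, not something drawn from the Pew report; it uses Python’s standard-library secrets module, and the 16-character length is an arbitrary assumption.

    # Illustrative workshop demo: generate a strong random password.
    # The length and character set below are assumptions for this example,
    # not recommendations from the Pew report.
    import secrets
    import string

    def generate_password(length: int = 16) -> str:
        """Return a random password of letters, digits, and punctuation."""
        alphabet = string.ascii_letters + string.digits + string.punctuation
        return "".join(secrets.choice(alphabet) for _ in range(length))

    print(generate_password())  # prints a 16-character random string

Even a small demo like this can help shift the conversation from “nothing I do matters” to “here is one concrete thing I control.”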

Another practical takeaway from this report follows from the fact that only about half of people feel confident they know what to do after a data breach or hack. Education may not limit the number of scams out there, but it can equip students with the resources to address problems. This needs to happen at two scales: students need to know how to respond to breaches of their personal accounts (e.g., bank accounts) and of their institutional accounts (e.g., university emails). Understandably, we emphasize “prevention” in IT trainings—how to identify phishing scams, for instance—but we also need to outline the path forward for when problems inevitably arise.

A final suggestion is to teach students about the laws and regulations that do exist. The Pew Research report shows that 72% of respondents have little to no understanding of current protections. Though it may feel like we don’t have any regulation or protection, we do have some. As creators and consumers on the internet, students need to have a better understanding of the regulatory environment already in place. 


🔥 Sign up for LibTech Insights (LTI) new post notifications and updates.

❗️ LTI is looking for new contributors! Interested in writing for us? Send your topic idea to Daniel P.