Safety vs. Privacy

Exploring Public Sentiment on the Balance Between Privacy and Safety in the Modern Era

Written By: Rob Gabriele | Updated September 28, 2023

Key Findings

  • More than half of all Americans believe it’s reasonable to trade privacy for safety, but only 1 in 5 feel significantly safer due to modern surveillance technology.
  • Most people feel their personal data is largely out of their control: 86% of Americans do not feel they have control over how data about them is used, and 84% do not feel able to control who can gather and use data about them.
  • 4 in 5 Americans are at least somewhat concerned about government surveillance, and almost 9 in 10 believe surveillance will be taken too far or already has been.

The Rise of Modern Surveillance

Artificial intelligence is everywhere and is continually evolving. Most people interact with artificial intelligence every day, as it enables computers, robots, and machines to mimic the capabilities of the human mind. Modern surveillance is part of this remarkable advance in technology. Privacy remains a concern, as personal data collection is necessary for AI systems to operate and improve how they function.

From facial recognition and police robots to chatbots and AI that can generate images, the use of AI has skyrocketed in the past few years. We see AI being used in communities across the country and around the world by everyone from governments and law enforcement to individuals looking for a more efficient way of doing things.

As the balance between safety and privacy seems to be at a tipping point in our society, we wanted to put our finger on the pulse of Americans to see how they feel about surveillance. Where do most people stand on the issue of security versus privacy?

Security and Privacy Concerns

As heightened security measures become the new norm, we felt compelled to examine how they’re affecting Americans. Are people willing to trade their privacy for increased safety? Are they concerned about the amount of surveillance they’re under? Here’s a look at their responses.

Our recent survey also took gender and political party affiliation into account when studying how people feel about safety and privacy concerns. It turns out that men (30%) were almost twice as likely as women (16%) to value privacy over safety. An additional study about safety confirms that women are more likely than men to feel unsafe, and they’re also more likely to use technology to stay safe.

Interestingly, 3 out of 4 Democrats agree that trading privacy for safety is reasonable. Independents were the most likely to value privacy over safety. Republicans were the most likely to say surveillance technology made them feel safer; however, they were also more likely to have a high level of concern about its usage.

Trust and Control Issues

We next took a look at five different types of surveillance to see if their usage made people feel safer or more concerned. We also measured trust levels people felt toward the various users of surveillance technology and the top reasons why people worry about its usage.

An overwhelming majority of people believed large institutions will take or have already taken surveillance measures too far. When it comes to political parties, 92% of Republicans believed surveillance has been or would be taken too far, followed by 88% of independents and 86% of Democrats. About 1 in 4 Americans felt surveillance had already been taken too far, a view held by 37% of independents, 26% of Republicans, and 23% of Democrats.

Republicans were much less trusting of the federal government than state and local governments. On the other hand, Democrats were more trusting of the federal government and had less confidence in state and local governments. Our research also revealed that women (57%) were more likely than men (47%) to trust all levels of government with surveillance technology.

Technologically Advanced Law Enforcement

As leaders seek to arm their officers with new capabilities to protect and serve, the use of advanced technology has been on the rise in police departments.

Technologically advanced law enforcement is the use of computer programs and modern devices to police people. Some of the most commonly used surveillance technologies include the following:

  • Artificial intelligence
  • Automatic license plate recognition
  • Biometrics
  • Drones
  • Enhanced body cameras
  • Facial recognition software
  • Robots
  • Gunshot detection systems
  • Smarter police cruisers
  • Thermal imaging

According to a recent study, policymakers, interest groups, and scholars have encouraged the implementation of tech in policing for the past 40 years with the aim of improving efficiency and effectiveness. There’s a strong push for the adoption of new information technology that involves gathering and storing large volumes of data. Some technologies, like CCTV surveillance cameras, have been shown to help reduce crime. The efficacy of other methods, however, is still in question.

Facial recognition software is a powerful technology that compares faces to one another to help identify a person. Law enforcement wants to use this advanced tech to match suspects to photos in databases. Though this software boasts high accuracy among most groups of people, previous studies have indicated that facial recognition exhibits racial bias.
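
To give a concrete sense of what “matching suspects to photos in databases” involves, here is a minimal, hypothetical sketch of one-to-many matching over face embeddings. It assumes an upstream model has already converted each face image into a numeric vector; the names, placeholder embeddings, and the 0.8 similarity threshold are illustrative assumptions, not a description of any particular vendor’s software.

```python
# Minimal sketch of 1:N face matching over embeddings (illustrative only).
# Assumes some upstream model has already turned face images into vectors.
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity between two face embeddings (1.0 = identical direction)."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def search_database(probe: np.ndarray, database: dict[str, np.ndarray],
                    threshold: float = 0.8) -> list[tuple[str, float]]:
    """Return candidate identities above the threshold, best match first."""
    scores = [(name, cosine_similarity(probe, emb)) for name, emb in database.items()]
    return sorted([s for s in scores if s[1] >= threshold],
                  key=lambda s: s[1], reverse=True)

# Placeholder 4-dimensional embeddings; real systems use hundreds of dimensions
# and far larger databases.
rng = np.random.default_rng(0)
database = {"person_a": rng.normal(size=4), "person_b": rng.normal(size=4)}
probe = database["person_a"] + rng.normal(scale=0.05, size=4)  # a noisy photo of person_a
print(search_database(probe, database))
```

The threshold is where accuracy trade-offs surface: set it too low and the system returns spurious candidates, which is one way a misidentified person could end up under suspicion.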

Consistent inaccuracies have been noted in matching the faces of young women ages 18 to 30 with dark complexions. If facial recognition is adopted nationwide, the problem remains that the technology is imperfect and could lead to the false arrest or incarceration of an innocent, misidentified person. Because this controversial technology falls short, some cities have banned their police departments from using it.

Conclusion

Americans are nearly split on whether they want more surveillance technology in their communities. Forty-five percent of the people we questioned said yes, while the other 55% either didn’t want it or weren’t sure. Women were more likely than men to want modern surveillance technology in their communities. Almost half of the group (48%) believes increased surveillance will have a positive impact, while 25% fear a negative impact, and 27% think it wouldn’t make a difference. Most people are also unconvinced that more advanced surveillance would result in a more positive relationship between the community and law enforcement.

The use of body cameras and microphones is what most people want to see increased in law enforcement, followed by more traffic and surveillance cameras. Women, non-white Americans, and Democrats are the most likely to desire more surveillance tech, while men, white Americans, Republicans, and Independents value privacy more. Across all groups, the least desired technologies were artificial intelligence judges and robot police.

Some people believe advanced technology in policing will help reduce police brutality and crime in their communities; however, many remain unconvinced of the efficacy of these new methods. The reported lack of interest and trust in the most advanced artificial intelligence suggests that people aren’t yet ready for autonomous policing tech. They also aren’t convinced that those who handle their personal data and wield advanced technologies can be trusted to do so in a way that doesn’t violate privacy rights. Researchers agree that further evaluation is needed on the impact of artificial intelligence in policing to truly determine whether it’s worth sacrificing privacy for increased security.

Our Data

For this study, we surveyed 1,009 respondents using the Amazon MTurk platform. Among those respondents, 465 were female, 538 were male, and six identified as nonbinary. Our respondents ranged in age from 18 to 79 with an average age of 39.

In order to help gather accurate responses, all survey respondents were required to identify and correctly answer a decoyed attention-check question. In some cases, questions and answers have been rephrased or paraphrased for brevity or clarity. These data rely on self-reporting, and potential issues with self-reported data include, but are not limited to, telescoping, selective memory, and attribution errors.

Are you a journalist or researcher writing about this topic? Contact us and we'll connect you with an expert who can provide insights and data to support your work.
