IEEE Digital Privacy
What Is Digital Privacy and Its Importance?

Digital privacy, a subset of the broader concept of privacy, focuses on the proper handling and usage of sensitive data—specifically personal information, communication, and conduct—that are generated and transmitted within digital environments. In essence, it denotes the rights and expectations of individuals to keep personal information confidential and secure in the digital realm.

The importance of digital privacy is profoundly evident in today's data-driven world. Individuals utilize digital platforms for various tasks, generating substantial amounts of personal data that could convey intimate insights about their lives if misused—whether it's sensitive financial information or personal health records. Therefore, digital privacy is crucial as it maintains a boundary to protect users from unwanted intrusions and manipulations of data, preserving human dignity and individual autonomy.

Equally deserving of attention is the role of digital privacy in ensuring a healthy democratic society. It allows the freedom of thought and expression, promoting diversity of ideas and opinions while negating manipulative influences. Within the business sphere, digital privacy practices foster customer trust and build corporate reputation, which are indispensable elements for growth and success in a competitive marketplace.

Finally, digital privacy is pivotal in averting potential data breaches. With cybercriminal activity on the rise, protecting an individual's digital privacy is not just desirable but a vital necessity: the threats posed by malicious hackers and cybercriminals make the protection of personal data a matter of utmost importance.

Introduction to Digital Privacy

Protecting Privacy in the Digital Age

Digital privacy in the modern era is a complex amalgamation of various elements, including data privacy and individual privacy. It entails protecting personal information that a user shares with other entities—be it other individuals, companies, or public bodies—across digital platforms. It encompasses safeguarding one's digital identity, ensuring confidentiality and security of communications and transactions, and maintaining control over user-generated data.

The advent of the digital age has revolutionized our understanding and expectations of privacy. Initially viewed primarily in terms of rights to solitude, privacy has now evolved to emphasize control over personal data. Digital privacy in this context symbolizes the ability of users to own, manage, and control their data, with a focus on how these data are collected, processed, stored, and shared.

Key components of digital privacy can be broadly categorized into three main areas—individual privacy, information privacy, and communication privacy. Individual privacy centers on the protection of personal information identifiable to an individual such as health records, financial information, or social security numbers. Information privacy involves safeguarding of data collected digitally, ensuring it is collected and processed ethically and lawfully. Communication privacy pertains to the confidentiality and security of digital communications, preventing unauthorized access and interception.

Digital privacy holds significant implications for both individuals and businesses. On an individual scale, digital privacy safeguards personal information from theft, abuse, and unwanted exposure, thereby maintaining personal security and mitigating risks associated with identity theft and online harassment. In contrast, businesses dealing with customer data need to uphold rigorous data privacy standards to keep consumer trust intact and avoid legal repercussions arising from data breaches.


Importance of Digital Privacy for Individuals

Digital privacy serves as a shield that safeguards personal information from undue exposure and misuse. Consider the myriad of information individuals provide online—from social media profiles to online banking transactions. Without stringent digital privacy controls, these data could be exploited, resulting in severe consequences such as financial loss, identity theft, and personal harm. Hence, the role of digital privacy in protecting users against these risks cannot be overstated.

Neglecting digital privacy could expose individuals to a plethora of risks. These include phishing scams, ransomware attacks, and cyberstalking—all of which could lead to significant personal, financial, and psychological harm. On an even graver scale, the lack of digital privacy could facilitate large-scale data breaches, in which the sensitive personal data of millions could be hijacked, sold, or exploited.

Thankfully, individuals can take several steps to actively protect their digital privacy. For starters, using strong, unique passwords and activating two-step verification where possible puts up the first line of defense (see the sketch below). Additionally, one should be cautious about the information shared online and ensure that privacy settings on social platforms are adjusted to limit data exposure. Regular software and firmware updates are also crucial, as they often include security patches that fix vulnerabilities cybercriminals might exploit. Lastly, the use of secure networks and reliable cybersecurity software can significantly contribute to increased digital privacy.
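
To make the password advice concrete, here is a minimal sketch using Python's standard-library secrets module. The generate_password helper and the example site names are illustrative assumptions, not a recommendation of any particular tool; a real setup would pair this with a password manager and two-step verification.

```python
# Minimal sketch: generating strong, unique passwords with the standard
# library. The helper and site names below are hypothetical.
import secrets
import string

def generate_password(length: int = 20) -> str:
    """Return a random password drawn from letters, digits, and punctuation."""
    alphabet = string.ascii_letters + string.digits + string.punctuation
    return "".join(secrets.choice(alphabet) for _ in range(length))

if __name__ == "__main__":
    # One unique password per site; a password manager would store these.
    for site in ("bank.example", "mail.example"):
        print(site, generate_password())
```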

Social media platforms can both advance and undermine internet privacy. On the one hand, they can promote digital privacy by providing privacy controls that limit who can view user profiles and posts. On the other, they can also hinder privacy through vast data collection practices and data sharing with advertisers and third parties.

Encryption plays a pivotal role in strengthening digital privacy. It involves encoding information such that only authorized parties can access it. By leveraging encryption technologies, users can ensure that even if their data is intercepted, it remains unreadable and thus, safe. Encryption is extensively used in protecting sensitive data transmission and storage, including in email services, messaging apps, and cloud storage, thereby enhancing individual digital privacy.
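
As a minimal sketch of the idea, the snippet below uses the Fernet recipe from Python's third-party cryptography package for symmetric encryption. The message and the in-memory key handling are illustrative assumptions; real services add key management and, typically, asymmetric key exchange on top.

```python
# Minimal symmetric-encryption sketch using the "cryptography" package
# (pip install cryptography). Key handling here is deliberately simplified.
from cryptography.fernet import Fernet

key = Fernet.generate_key()       # secret key held only by authorized parties
cipher = Fernet(key)

token = cipher.encrypt(b"my private message")  # unreadable if intercepted
plaintext = cipher.decrypt(token)              # recoverable only with the key

assert plaintext == b"my private message"
print(token)  # ciphertext bytes, safe to transmit or store
```

This is why intercepted traffic from an encrypted messaging app or cloud store yields only ciphertext: without the key, the data is unreadable.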

Cybersecurity and Digital Privacy

Cybersecurity and digital privacy, while distinct, are inextricably linked aspects of the digital landscape. Cybersecurity primarily focuses on protecting the integrity and confidentiality of data and systems from cyber threats such as malware and hacking. Digital privacy, on the other hand, is about safeguarding personal information from unlawful data collection and ensuring user control over personal data. In essence, cybersecurity represents the measures taken to secure data and systems, while digital privacy deals with how personal information is collected, used, and shared. Therefore, effective cybersecurity is crucial for ensuring digital privacy.

Cybersecurity threats that impede digital privacy are abundant and perpetually evolving. These include viruses, ransomware, and phishing attacks, which could expose sensitive personal data. More sophisticated threats like man-in-the-middle attacks and Distributed Denial-of-Service (DDoS) attacks could disrupt systems and services, potentially leading to data breaches.

Businesses typically adopt a layered approach to bolster their cybersecurity posture and enhance digital privacy. This involves the deployment of numerous security measures at various levels, such as firewalls, Intrusion Detection Systems (IDS), antivirus software, and secure authentication systems. Additionally, cybersecurity awareness training for employees and adopting best practices such as the principle of least privilege and regular systems audits can help counter privacy-eroding cyber threats. A minimal sketch of least privilege follows.
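
The sketch below illustrates the deny-by-default logic behind the principle of least privilege; the role and permission names are hypothetical.

```python
# Minimal sketch of least privilege: each role gets only the permissions it
# needs, and anything not explicitly granted is denied. Names are hypothetical.
ROLE_PERMISSIONS = {
    "analyst": {"read:reports"},
    "admin": {"read:reports", "write:reports", "manage:users"},
}

def is_allowed(role: str, permission: str) -> bool:
    """Deny by default: unknown roles and ungranted permissions return False."""
    return permission in ROLE_PERMISSIONS.get(role, set())

assert is_allowed("analyst", "read:reports")
assert not is_allowed("analyst", "manage:users")  # analysts cannot manage users
```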

The absence of cybersecurity measures portends grave repercussions for digital privacy. Unprotected digital systems offer a favorable environment for cybercriminals to launch attacks, leading to unauthorized access to sensitive information. Even worse, the lack of sufficient cybersecurity measures could facilitate large-scale data breaches, with the potential to compromise the privacy of individuals at massive scale.

Government Surveillance and Digital Privacy

The nexus between government surveillance and digital privacy is a complex and often contentious issue. While surveillance can be justified on grounds of maintaining national security or preserving public safety, it invariably poses challenges to digital privacy. From indiscriminate metadata collection to CCTV monitoring and data requests from tech companies, government surveillance strategies can lead to substantial encroachments on citizens’ digital privacy.

The ethics of government surveillance in the digital space often centers around the balancing act between maintaining national security and preserving individual privacy. Unquestionably, state authorities have a duty to secure the nation from threats, both internal and external. However, the presence of broad and pervasive surveillance measures could stifle individual liberties, not least the right to privacy. Hence, the crux of the ethical debate lies in finding legitimate, proportionate, and justifiable surveillance methods that respect individuals' privacy rights.

Striking a balance between privacy concerns and national security needs is indeed complex. On the one hand, citizens expect their governments to ensure their safety. Conversely, they also wish to keep personal aspects of their lives confidential. Privacy-enhancing technologies like encryption, anonymizing tools, and secure communication channels can help individuals protect their digital privacy. It is also essential for democratic discussion about the appropriate extent of surveillance, leading to legal protective mechanisms that respect both national security needs and privacy concerns.

Several legal frameworks exist worldwide to regulate government access to digital data. These include data protection laws like the EU's General Data Protection Regulation (GDPR), sector-specific regulations such as the Health Insurance Portability and Accountability Act (HIPAA) in the U.S. for health information, and surveillance laws that stipulate under which conditions governments can access personal data. While these frameworks provide a degree of protection for individuals' digital privacy, challenges remain, such as jurisdictional discrepancies in these laws and the capabilities of governments to bypass legal constraints.

Emerging Technologies and Digital Privacy

Emerging technologies like Artificial Intelligence (AI) significantly impact digital privacy. AI's capabilities for data processing and analysis can enable more efficient service delivery and insights generation. However, these same capabilities can also be used to analyze personal data for profiling and decision-making without meaningful human oversight, potentially invading individual privacy without consent. As such, AI can be both a force for good and a potential detriment to digital privacy, calling for robust privacy policies to govern its use.

Data analytics, another powerful development, can shape digital privacy practices by providing insights into users' behaviors, enabling better personalization of services. However, extensive data harvesting and analysis can easily infringe individual privacy rights if not carefully managed. Thus, data minimization and anonymization techniques need to be integrated alongside data analytics to balance efficiency gains against privacy implications.

In the face of the evolving interplay between digital privacy and technology, individuals can adapt by staying abreast of technological changes and their implications for privacy. This involves understanding the privacy policies of digital services, making use of privacy tools and settings, and regularly updating and patching systems to address technical vulnerabilities.

The integration of the Internet of Things (IoT) devices into daily life presents serious digital privacy risks. IoT devices collect, process, and transmit vast amounts of data, some of it highly personal, posing potential privacy threats if this data is misused or inadequately protected. Thus, users need to pay careful attention to the security settings and data handling practices of such devices.

Finally, advancements like facial recognition technologies pose significant challenges to digital privacy norms. While offering transformative potential for security and access control, they bring up unsettling questions: Will individuals lose anonymity in public spaces? How can misuse for surveillance be prevented? This accentuates the necessity of legal and ethical guidelines to govern the use of such powerful tools in an era of heightened privacy concerns.

Digital privacy, undeniably an indispensable aspect of personal online identities, has significant individual and societal implications. From data protection laws to cybersecurity measures, various elements influence digital privacy. As humanity continues to evolve in the digital era, grappling with the rapid progression of technologies like AI and IoT, our understanding and management of digital privacy will inevitably require continual reassessment and adaptation. Individuals, businesses, and governments share a collective responsibility to dialogue, innovate, and legislate for a balanced digital society that honors both the immense potential of the digital age and the timeless value of privacy.

Interested in joining IEEE Digital Privacy? IEEE Digital Privacy is an IEEE-wide effort dedicated to championing the digital privacy needs of individuals. This initiative strives to bring the voice of technologists to the digital privacy discussion and solutions, taking a holistic approach to privacy that also includes economic, legal, and social perspectives. Join the IEEE Digital Privacy Community to stay involved with the initiative's program activities and connect with others in the field.


Privacy is power

Don’t just give away your privacy to the likes of Google and Facebook – protect it, or you disempower us all.

by Carissa Véliz

Imagine having a master key for your life. A key or password that gives access to the front door to your home, your bedroom, your diary, your computer, your phone, your car, your safe deposit, your health records. Would you go around making copies of that key and giving them out to strangers? Probably not the wisest idea – it would be only a matter of time before someone abused it, right? So why are you willing to give up your personal data to pretty much anyone who asks for it?

Privacy is the key that unlocks the aspects of yourself that are most intimate and personal, that make you most you, and most vulnerable. Your naked body. Your sexual history and fantasies. Your past, present and possible future diseases. Your fears, your losses, your failures. The worst thing you have ever done, said, and thought. Your inadequacies, your mistakes, your traumas. The moment in which you have felt most ashamed. That family relation you wish you didn’t have. Your most drunken night.

When you give that key, your privacy, to someone who loves you, it will allow you to enjoy closeness, and they will use it to benefit you. Part of what it means to be close to someone is sharing what makes you vulnerable, giving them the power to hurt you, and trusting that person never to take advantage of the privileged position granted by intimacy. People who love you might use your date of birth to organise a surprise birthday party for you; they’ll make a note of your tastes to find you the perfect gift; they’ll take into account your darkest fears to keep you safe from the things that scare you. Not everyone will use access to your personal life in your interest, however. Fraudsters might use your date of birth to impersonate you while they commit a crime; companies might use your tastes to lure you into a bad deal; enemies might use your darkest fears to threaten and extort you. People who don’t have your best interest at heart will exploit your data to further their own agenda. Privacy matters because the lack of it gives others power over you.

You might think you have nothing to hide, nothing to fear. You are wrong – unless you are an exhibitionist with masochistic desires of suffering identity theft, discrimination, joblessness, public humiliation and totalitarianism, among other misfortunes. You have plenty to hide, plenty to fear, and the fact that you don’t go around publishing your passwords or giving copies of your home keys to strangers attests to that.

You might think your privacy is safe because you are a nobody – nothing special, interesting or important to see here. Don’t shortchange yourself. If you weren’t that important, businesses and governments wouldn’t be going to so much trouble to spy on you.

You have your attention, your presence of mind – everyone is fighting for it. They want to know more about you so they can know how best to distract you, even if that means luring you away from quality time with your loved ones or basic human needs such as sleep. You have money, even if it is not a lot – companies want you to spend your money on them. Hackers are eager to get hold of sensitive information or images so they can blackmail you. Insurance companies want your money too, as long as you are not too much of a risk, and they need your data to assess that. You can probably work; businesses want to know everything about whom they are hiring – including whether you might be someone who will want to fight for your rights. You have a body – public and private institutions would love to know more about it, perhaps experiment with it, and learn more about other bodies like yours. You have an identity – criminals can use it to commit crimes in your name and leave you to pay the bill. You have personal connections. You are a node in a network. You are someone’s offspring, someone’s neighbour, someone’s teacher or lawyer or barber. Through you, they can get to other people. That’s why apps ask you for access to your contacts. You have a voice – all sorts of agents would like to use you as their mouthpiece on social media and beyond. You have a vote – foreign and national forces want you to vote for the candidate that will defend their interests.

As you can see, you are a very important person. You are a source of power.

By now, most people are aware that their data is worth money. But your data is not valuable only because it can be sold. Facebook does not technically sell your data, for instance. Nor does Google. They sell the power to influence you. They sell the power to show you ads, and the power to predict your behaviour. Google and Facebook are not really in the business of data – they are in the business of power. Even more than monetary gain, personal data bestows power on those who collect and analyse it, and that is what makes it so coveted.

There are two aspects to power. The first aspect is what the German philosopher Rainer Forst in 2014 defined as ‘the capacity of A to motivate B to think or do something that B would otherwise not have thought or done’. The means through which the powerful enact their influence are varied. They include motivational speeches, recommendations, ideological descriptions of the world, seduction and credible threats. Forst argues that brute force or violence is not an exercise of power, for subjected people don’t ‘do’ anything; rather, something is done to them. But clearly brute force is an instance of power. It is counterintuitive to think of someone as powerless who is subjecting us through violence. Think of an army dominating a population, or a thug strangling you. In Economy and Society (1978), the German political economist Max Weber describes this second aspect of power as the ability for people and institutions to ‘carry out [their] own will despite resistance’.

In short, then, powerful people and institutions make us act and think in ways in which we would not act and think were it not for their influence. If they fail to influence us into acting and thinking in the way that they want us to, powerful people and institutions can exercise force upon us – they can do unto us what we will not do ourselves.

There are different types of power: economic, political and so on. But power can be thought of as being like energy: it can take many different forms, and these can change. A wealthy company can often use its money to influence politics through lobbying, for instance, or to shape public opinion through paying for ads.

Power over others’ privacy is the quintessential kind of power in the digital age

That tech giants such as Facebook and Google are powerful is hardly news. But exploring the relationship between privacy and power can help us to better understand how institutions amass, wield and transform power in the digital age, which in turn can give us tools and ideas to resist the kind of domination that survives on violations of the right to privacy. However, to grasp how institutions accumulate and exercise power in the digital age, first we have to look at the relationship between power, knowledge and privacy.

There is a tight connection between knowledge and power. At the very least, knowledge is an instrument of power. The French philosopher Michel Foucault goes even further, and argues that knowledge in itself is a form of power. There is power in knowing. By protecting our privacy, we prevent others from being empowered with knowledge about us that can be used against our interests.

The more that someone knows about us, the more they can anticipate our every move, as well as influence us. One of the most important contributions of Foucault to our understanding of power is the insight that power does not only act upon human beings – it constructs human subjects (even so, we can still resist power and construct ourselves). Power generates certain mentalities, it transforms sensitivities, it brings about ways of being in the world. In that vein, the British political theorist Steven Lukes argues in his book Power (1974) that power can bring about a system that produces wants in people that work against their own interests. People’s desires can themselves be a result of power, and the more invisible the means of power, the more powerful they are. Examples of power shaping preferences today include when tech uses research about how dopamine works to make you addicted to an app, or when you are shown political ads based on personal information that makes a business think you are a particular kind of person (a ‘persuadable’, as the data-research company Cambridge Analytica put it, or someone who might be nudged into not voting, for instance).

The power that comes about as a result of knowing personal details about someone is a very particular kind of power. Like economic power and political power, privacy power is a distinct type of power, but it also allows those who hold it the possibility of transforming it into economic, political and other kinds of power. Power over others’ privacy is the quintessential kind of power in the digital age.

Two years after it was founded and despite its popularity, Google still hadn’t developed a sustainable business model. In that sense, it was just another unprofitable internet startup. Then, in 2000, Google launched AdWords, thereby starting the data economy. Now called Google Ads, it exploited the data produced by Google’s interactions with its users to sell ads. In less than four years, the company achieved a 3,590 per cent increase in revenue.

That same year, the Federal Trade Commission had recommended to US Congress that online privacy be regulated. However, after the attacks of 11 September 2001 on the Twin Towers in New York, concern about security took precedence over privacy, and plans for regulation were dropped. The digital economy was able to take off and reach the magnitude it enjoys today because governments had an interest in having access to people’s data in order to control them. From the outset, digital surveillance has been sustained through a joint effort between private and public institutions.

The mass collection and analysis of personal data has empowered governments and prying companies. Governments now know more about their citizens than ever before. The Stasi (the security service of the German Democratic Republic), for instance, managed to have files only on about a third of the population, even if it aspired to have complete information on all citizens. Intelligence agencies today hold much more information on all of the population. To take just one important example, a significant proportion of people volunteer private information in social networks. As the US filmmaker Laura Poitras put it in an interview with The Washington Post in 2014: ‘Facebook is a gift to intelligence agencies.’ Among other possibilities, that kind of information gives governments the ability to anticipate protests, and even pre-emptively arrest people who plan to take part. Having the power to know about organised resistance before it happens, and being able to squash it in time, is a tyrant’s dream.

Tech companies’ power is constituted, on the one hand, by having exclusive control of data and, on the other, by the ability to anticipate our every move, which in turn gives them opportunities to influence our behaviour, and sell that influence to others. Companies that earn most of their revenues through advertising have used our data as a moat – a competitive advantage that has made it impossible for alternative businesses to challenge tech titans. Google’s search engine, for example, is as good as it is partly because its algorithm has much more data to learn from than any of its competitors. In addition to keeping the company safe from competitors and allowing it to train its algorithm better, our data also allows tech companies to predict and influence our behaviour. With the amount of data it has access to, Google can know what keeps you up at night, what you desire the most, what you are planning to do next. It then whispers this information to other busybodies who want to target you for ads.

Tech wants you to think that the innovations it brings into the market are inevitable

Companies might also share your data with ‘data brokers’ who will create a file on you based on everything they know about you (or, rather, everything they think they know), and then sell it to pretty much whoever is willing to buy it – insurers, governments, prospective employers, even fraudsters.

Data vultures are incredibly savvy at using both the aspects of power discussed above: they make us give up our data, more or less voluntarily, and they also snatch it away from us, even when we try to resist. Loyalty cards are an example of power making us do certain things that we would otherwise not do. When you are offered a discount for loyalty at your local supermarket, what you are being offered is for that company to conduct surveillance on you, and then influence your behaviour through nudges (discounts that will encourage you to buy certain products). An example of power doing things to us that we don’t want it to do is when Google records your location on your Android smartphone, even when you tell it not to.

Both types of power can also be seen at work at a more general level in the digital age. Tech constantly seduces us into doing things we would not otherwise do, from getting lost down a rabbit hole of videos on YouTube, to playing mindless games, or checking our phone hundreds of times a day. The digital age has brought about new ways of being in the world that don’t always make our lives better. Less visibly, the data economy has also succeeded in normalising certain ways of thinking. Tech companies want you to think that, if you have done nothing wrong, you have no reason to object to their holding your data. They also want you to think that treating your data as a commodity is necessary for digital tech, and that digital tech is progress – even when it might sometimes look worryingly similar to social or political regress. More importantly, tech wants you to think that the innovations it brings into the market are inevitable. That’s what progress looks like, and progress cannot be stopped.

That narrative is complacent and misleading. As the Danish economic geographer Bent Flyvbjerg points out in Rationality and Power (1998), power produces the knowledge, narratives and rationality that are conducive to building the reality it wants. But technology that perpetuates sexist and racist trends and worsens inequality is not progress. Inventions are far from unavoidable. Treating data as a commodity is a way for companies to earn money, and has nothing to do with building good products. Hoarding data is a way of accumulating power. Instead of focusing only on their bottom line, tech companies can and should do better to design the online world in a way that contributes to people’s wellbeing. And we have many reasons to object to institutions collecting and using our data in the way that they do.

Among those reasons is institutions not respecting our autonomy, our right to self-govern. Here is where the harder side of power plays a role. The digital age thus far has been characterised by institutions doing whatever they want with our data, unscrupulously bypassing our consent whenever they think they can get away with it. In the offline world, that kind of behaviour would be called matter-of-factly ‘theft’ or ‘coercion’. That it is not called this in the online world is yet another testament to tech’s power over narratives.

It’s not all bad news, though. Yes, institutions in the digital age have hoarded privacy power, but we can reclaim the data that sustains it, and we can limit their collecting new data. Foucault argued that, even if power constructs human subjects, we have the possibility to resist power and construct ourselves. The power of big tech looks and feels very solid. But tech’s house of cards is partly built on lies and theft. The data economy can be disrupted. The tech powers that be are nothing without our data. A small piece of regulation, a bit of resistance from citizens, a few businesses starting to offer privacy as a competitive advantage, and it can all evaporate.

No one is more conscious of their vulnerability than tech companies themselves. That is why they are trying to convince us that they do care about privacy after all (despite what their lawyers say in court). That is why they spend millions of dollars on lobbying. If they were so certain about the value of their products for the good of users and society, they would not need to lobby so hard. Tech companies have abused their power, and it is time to resist them.

In the digital age, resistance inspired by the abuse of power has been dubbed a techlash. Abuses of power remind us that power needs to be curtailed for it to be a positive influence in society. Even if you happen to be a tech enthusiast, even if you think that there is nothing wrong with what tech companies and governments are doing with our data, you should still want power to be limited, because you never know who will be in power next. Your new prime minister might be more authoritarian than the old one; the next CEO of the next big tech company might not be as benevolent as those we’ve seen thus far. Tech companies have helped totalitarian regimes in the past, and there is no clear distinction between government and corporate surveillance. Businesses share data with governments, and public institutions share data with companies.

When you expose your privacy, you put us all at risk

Do not give in to the data economy without at least some resistance. Refraining from using tech altogether is unrealistic for most people, but there is much more you can do short of that. Respect other people’s privacy. Don’t expose ordinary citizens online. Don’t film or photograph people without their consent, and certainly don’t share such images online. Try to limit the data you surrender to institutions that don’t have a claim to it. Imagine someone asks for your number in a bar and won’t take a ‘No, thank you’ for an answer. If that person were to continue to harass you for your number, what would you do? Perhaps you would be tempted to give them a fake number. That is the essence of obfuscation, as outlined by the media scholars Finn Brunton and Helen Nissenbaum in the 2015 book of that name. If a clothing company asks for your name to sell you clothes, give them a different name – say, Dr Private Information, so that they get the message. Don’t give these institutions evidence they can use to claim that we are consenting to our data being taken away from us. Make it clear that your consent is not being given freely.

When downloading apps and buying products, choose products that are better for privacy. Use privacy extensions on your browsers. Turn your phone’s wi-fi, Bluetooth and location services off when you don’t need them. Use the legal tools at your disposal to ask companies for the data they have on you, and ask them to delete that data. Change your settings to protect your privacy. Refrain from using one of those DNA home testing kits – they are not worth it. Forget about ‘smart’ doorbells that violate your privacy and that of others. Write to your representatives sharing your concerns about privacy. Tweet about it. Take opportunities as they come along to inform business, governments and other people that you care about privacy, that what they are doing is not okay.

Don’t make the mistake of thinking you are safe from privacy harms, maybe because you are young, male, white, heterosexual and healthy. You might think that your data can work only for you, and never against you, if you’ve been lucky so far. But you might not be as healthy as you think you are, and you will not be young forever. The democracy you are taking for granted might morph into an authoritarian regime that might not favour the likes of you.

Furthermore, privacy is not only about you. Privacy is both personal and collective. When you expose your privacy, you put us all at risk. Privacy power is necessary for democracy – for people to vote according to their beliefs and without undue pressure, for citizens to protest anonymously without fear of repercussions, for individuals to have freedom to associate, speak their minds, read what they are curious about. If we are going to live in a democracy, the bulk of power needs to be with the people. If most of the power lies with companies, we will have a plutocracy. If most of the power lies with the state, we will have some kind of authoritarianism. Democracy is not a given. It is something we have to fight for every day. And if we stop building the conditions in which it thrives, democracy will be no more. Privacy is important because it gives power to the people. Protect it.


Slouching Toward ‘Accept All Cookies’

When everything we do online is data to be harvested, resignation is easy. But there’s a better way to think about digital privacy.


We are all shedding data like skin cells. Almost everything we do with, or simply in proximity to, a connected device generates some small bit of information—about who we are, about the device we’re using and the other devices nearby, about what we did and when and how and for how long. Sometimes doing nothing at all—merely lingering on a webpage—is recorded as a relevant piece of information. Sometimes simply walking past a Wi-Fi router is a data point to be captured and processed. Sometimes the connected device isn’t a phone or a computer, as such; sometimes it’s a traffic light or a toaster or a toilet. If it is our phone, and we have location services enabled—which many people do, so that they can get delivery and Find My Friends and benefit from the convenience of turn-by-turn directions—our precise location data are being constantly collected and transmitted. We pick up our devices and command them to open the world for us, which they do quite well. But they produce a secondary output too—all those tiny flecks of dead skin floating around us.

Our data are everywhere because our data are useful. Mostly to make people money: When someone opens up their phone’s browser and clicks on a link—to use the most basic example—a whole hidden economy whirs into gear. Tracking pixels and cookies capture their information and feed it to different marketers and companies, which aggregate it with information gleaned from other people and other sites and use it to categorize us into “interest segments.” The more data gathered, the easier it is to predict who we are, what we like, where we live, whom we might vote for, how much money we might have, what we might like to buy with it. Once our information has been collected, it ricochets around a labyrinthine ad-tech ecosystem made up of thousands of companies that offer to make sense of, and serve hyper-targeted ads based on, it.
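
For a sense of the mechanics, here is a minimal sketch of the server side of a tracking pixel, written with Flask. The endpoint path, the page query parameter, and the print-based logging are hypothetical simplifications of what real trackers do.

```python
# Minimal tracking-pixel sketch (hypothetical endpoint and field names).
import base64
from flask import Flask, Response, request

app = Flask(__name__)

# A 1x1 transparent GIF: the classic "tracking pixel" payload.
PIXEL = base64.b64decode(b"R0lGODlhAQABAIAAAAAAAP///yH5BAEAAAAALAAAAAABAAEAAAIBRAA7")

@app.route("/pixel.gif")
def pixel():
    # Fetching the image is itself the data point being collected.
    print(
        "hit:",
        request.remote_addr,                # visitor's IP address
        request.headers.get("User-Agent"),  # browser/device fingerprint input
        request.args.get("page"),           # which page embedded the pixel
    )
    return Response(PIXEL, mimetype="image/gif")

if __name__ == "__main__":
    app.run(port=8080)
```

Any page that embeds an img tag pointing at an endpoint like this makes each visitor's browser send the request automatically, handing over its IP address, user agent, and the embedding page without a single click.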

Our privacy is what the internet eats to live. Participating in some part or another of the ad-tech industry is how most every website and app we use makes money. But ad targeting isn’t the only thing our data are good for. Health-care companies and wearables makers want our medical history and biometric data—when and how we sleep; our respiratory rate, heart rate, steps, mile times; even our sexual habits—to feed us insights via their products. Cameras and sensors, on street corners and on freeways, in schools and in offices, scan faces and license plates in order to make us safer or identify traffic patterns. Monitoring software tracks students taking tests and logs the keystrokes of corporate employees. Even if not all of our information goes toward selling ads, it goes somewhere. It is collected, bought, sold, copied, logged, archived, aggregated, exploited, leaked to reporters, scrutinized by intelligence analysts, stolen by hackers, subjected to any number of hypothetical actions—good and bad, but mostly unknowable. The only certainty is that once our information is out there, we’re not getting it back.

It’s scary and concerning, but mostly it’s overwhelming. In modern life, data are omnipresent. And yet, it is impossible to zoom out and see the entire picture, the full patchwork quilt of our information ecosystem. The philosopher Timothy Morton has a term for elements of our world that behave this way: A hyperobject is a concept so big and complex that it can’t be adequately described. Both our data and the way they are being compromised are hyperobjects.

Climate change is one too: If somebody asks you what the state of climate change is, simply responding that “it is bad” is accurate, but a wild oversimplification. As with climate change, we can all too easily look at the state of our digital privacy, feel absolutely buried in bad news, and become a privacy doomer, wallowing in the realization that we are giving our most intimate information to the largest and most powerful companies on Earth and have been for decades. Just as easy is reading this essay and choosing nihilism, resigning yourself to being the victim of surveillance, so much so that you don’t take precautions.

These are meager options, even if they can feel like the only ones available. Digital privacy isn’t some binary problem we can think of as purely solvable. It is the base condition and the broader context of our connected lives. It is dynamic, meaning that it is a negotiation between ourselves and the world around us. It is something to be protected and preserved, and in a perfect world, we ought to be able to guard or shed it as we see fit. But in this world, the balance of power is tilted out of our reach. Imagine you’re in a new city. You’re downloading an app to buy a ticket for a train that’s fast approaching. Time is of the essence. You hurriedly scroll through a terms-of-service agreement and, without reading, click “Accept.” You’ve technically entered a contractual agreement. Now consider that in such a moment, you might as well be sitting at a conference table. On one side is a team of high-priced corporate lawyers, working diligently to shield their deep-pocketed clients from liability while getting what they need from you. On the other side is you, a person in a train station trying to download an app. Not a fair fight.

So one way to think of privacy is as a series of choices. If you’d like a service to offer you turn-by-turn directions, you choose to give it your location. If you’d like a shopping website to remember what’s in your cart, you choose to allow cookies. But companies have gotten good at exploiting these choices and, in many cases, obscuring the true nature of them. Clicking “Agree” on an app’s terms of service might mean, in the eyes of an exploitative company, that the app will not only take the information you’re giving up but will sell it to, or share it with, other companies.

Understanding that we give these companies an inch and they take a mile is crucial to demystifying their most common defense: the privacy paradox. That term was first coined in 2001 by an HP researcher named Barry Brown who was trying to explain why early internet users seemed concerned about data collection but were “also willing to lose that privacy for very little gain” in the form of supermarket loyalty-rewards programs. People must not actually care so much about their privacy, the argument goes, because they happily use the tools and services that siphon off their personal data. Maybe you’ve even convinced yourself of this after almost two decades of devoted Facebooking and Googling.

But the privacy paradox is a facile framework for a complex issue. Daniel J. Solove, a professor at George Washington University Law School, argues the paradox does not exist, in part because “managing one’s privacy is a vast, complex, and never-ending project that does not scale.” In a world where we are constantly shedding data and thousands of companies are dedicated to collecting it, “people can’t learn enough about privacy risks to make informed decisions,” he wrote in a 2020 article. And so resignedly and haphazardly managing our personal privacy is all we can do from day to day. We have no alternative.

But that doesn’t mean we don’t care. Even if we don’t place a high value on our personal data privacy, we might have strong concerns about the implications of organizations surveilling us and profiting off the collection of our information. “The value of privacy isn’t based on one’s particular choice in a particular context; privacy’s value involves the right to have choices and protections,” Solove argues. “People can value having the choice even if they choose to trade away their personal data; and people can value others having the right to make the choice for themselves.”

This notion is fundamental to another way to think of privacy: as a civil right. That’s what the scholar Danielle Keats Citron argues in her book The Fight for Privacy. Privacy is freedom, and freedom is necessary for humans to thrive. But protecting that right is difficult, because privacy-related harm is diffuse and can come in many different forms: At its most extreme, it can be physical (violence and doxxing), reputational (the release of embarrassing or incorrect information), or psychological (the emotional distress that comes along with having your intimate information taken from you). But, according to work by Solove and Citron, proving harm that goes beyond concrete economic loss is difficult in legal terms.

Citron argues in her book that we need a new social compact, one that includes civic education about privacy and why it is important. Simply understanding our right to privacy won’t vaporize overly permissive, opt-out data collection. It won’t completely correct the balance of power. But it will begin to give us a language for what is at stake when a new company or service demands our information with few safeguards. This education is not just for children but for everyone: executives, tech employees, lawmakers. It is a way to make the fight a bit fairer.

And how should we think about our data—all that digital dandruff? Scale is part of the problem here: Giving up an individual piece of location data may not feel all that meaningful, but having all of your movements tracked might constitute a violation. Context is also important: A piece of private sexual-health data may be a guarded secret of life-and-death import to the person it originated from; to a health-care conglomerate, that data point may be worth a fraction of a fraction of a cent. But when data are sliced and categorized and placed into profiles and buckets, their value increases. In 2020, Facebook made 97.9 percent of its revenue—nearly $85 billion—off of targeted ads, pinpointed by such data collection. Data, in the aggregate, is an asset class, one that powers innovative technologies and inflates bottom lines.
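
As a toy sketch of that aggregation effect, the snippet below joins individually trivial records into a marketable profile; the users, signals, and segment names are all hypothetical.

```python
# Toy sketch: scattered data points become an "interest segment" once joined.
# All users, signals, and segment names are hypothetical.
from collections import defaultdict

events = [
    {"user": "u123", "signal": "visited", "value": "running-shoes product page"},
    {"user": "u123", "signal": "location", "value": "Boulder, CO"},
    {"user": "u123", "signal": "search", "value": "marathon training plan"},
]

profiles = defaultdict(list)
for event in events:
    profiles[event["user"]].append((event["signal"], event["value"]))

# Three separate flecks of data now support one targeting decision.
user = "u123"
if any("running" in v or "marathon" in v for _, v in profiles[user]):
    print(user, "-> segment: endurance athletes; serve local running-gear ads")
```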

In a 2019 essay, the technologist Can Duruk discussed an analogy that, he admits, is a bit cliché: Data is the new oil. Extracting it is dirty, and storing it is dangerous. “We are barely recognizing the negative externalities of decades of oil production and consumption now, and it took us almost destroying the planet,” he writes. “We should do a better job for data.”

It’s interesting to imagine a society that would force companies to treat data as an oil-like commodity, something valuable, rather than digital ephemera in inexhaustible supply—where not only would the environmental toll of leaks and spills be remedied but victims could attempt to hold liable those trusted with storage. Maybe we’d demand a sort of supply-chain transparency to trace the flow of the product around the world. Maybe we’d find a way to quantify the externality.

Digital privacy’s climate-change analogy is not perfect, but when it comes to calls to action, the parallel is helpful. No single law or innovation could adequately reshape the world we’ve spent decades building. Quick fixes or sweeping legislative changes may very well have unintended consequences. We cannot totally reverse what we’ve put into motion. But there is always a reason to push for a better future. Last year, the environmentalist, author, and activist Bill McKibben wrote about a climate question he hears frequently: How bad is it? He is unsparing in his assessment but never overly alarmist. “Despair is not an option yet,” he writes. “At least if it’s that kind of despair that leads to inaction. But desperation is an option—indeed, it’s required. We have to move hard and fast.”

When reckoning with a subject as complex and fundamental as our digital privacy, metaphor is appealing—I’ve certainly reached for it throughout this essay. Our information is oil: a pollutant, a liability, a thing that powers the world. It’s skin cells: floating all around us. It’s a hyperobject: impossible to understand in its entirety all at once. If our data are what the internet feeds off of, maybe each piece—every datum, every bit of information from every tiny thing we do—is a calorie: incredibly powerful in the aggregate but invisible and incomprehensible to the naked eye, a sort of hypo-object.

We keep grasping for these metaphors because all are helpful, but none is quite sufficient. The internet as we know it is a glorious, awful, intricate, sprawling series of networks that needs our information in order to function. We cannot go back to a time before this was true—before turn-by-turn directions and eerily well-targeted ads, before we carried little data-collection machines in our pockets all day—and nor would all of us want to. But we can demand much more from the reckless stewards of our information. That starts with understanding what exactly has been taken from us. The fight for our privacy isn’t just about knowing what is collected and where it goes—it is about reimagining what we’re required to sacrifice for our conveniences and for a greater economic system. It is an acknowledgment of the trade-offs of living in a connected world, but one focused on what humans need to flourish. What is at stake is nothing less than our basic right to move through the world on our terms, to define and share ourselves as we desire.


How Americans View Data Privacy

1. Views of data privacy risks, personal data and digital privacy laws


Online privacy is complex, encompassing debates over law enforcement’s data access, government regulation and what information companies can collect. This chapter examines Americans’ perspectives on these issues and highlights how views vary across different groups, particularly by education and age. 

When managing their privacy online, most Americans say they trust themselves to make the right decisions about their personal information (78%), and a majority are skeptical that anything they do will make a difference (61%).

[Chart: Most trust themselves to make the right decisions about their personal information online, but a majority are also skeptical anything they do will make a difference.]

Far fewer mention being overwhelmed by figuring out what they need to do (37%) or say privacy is not that big of a deal to them (29%).

Another 21% are confident that those with access to their personal information will do what is right.

Education differences

  • 81% of those with at least some college experience say they trust themselves to make the right decisions about their personal information online, compared with 72% of those with a high school diploma or less.
  • 67% of those with at least some college are skeptical that anything they do to manage their online privacy will make a difference, compared with half of those with a high school diploma or less formal education.

On the other hand, those with a high school education or less are more likely than those with some college experience or more to say that privacy isn’t that big of a deal to them and that they are confident that those who have access to their personal information will do the right thing.

About 4 in 10 Americans are very worried about their information being sold or stolen, but this varies by race and ethnicity

The survey also explores the concerns people have about data collection and security – specifically, how they feel about three scenarios around companies, law enforcement and identity theft.

Roughly four-in-ten Americans say they are very worried about companies selling their information to others without them knowing (42%) or people stealing their identity or personal information (38%). Fewer are apprehensive about law enforcement monitoring what they do online (15%).

Racial and ethnic differences

However, some of these shares are higher among Hispanic, Black or Asian adults:

  • Roughly half of Hispanic, Black or Asian adults are very worried about people stealing their identity or personal information, compared with a third of White adults.
  • About one-in-five of each group are very worried about law enforcement monitoring their online activity; 10% of White adults say this.

Americans are largely concerned about, and feel little control or understanding of, how companies and the government collect and use data about them

A majority of Americans say they are concerned, lack control and have a limited understanding of how the data collected about them is used. This is true whether it’s the government or companies using their data. Similar sentiments were expressed in 2019, when we last asked about this.

Concern is high: 81% say they feel very or somewhat concerned with how companies use the data they collect about them. Fully 71% say the same regarding the government’s use of data.

People don’t feel in control: Roughly three-quarters or more feel they have very little or no control over the data collected about them by companies (73%) or the government (79%).

Understanding is low: Americans also say they don’t understand what these actors are doing with the data collected about them. Majorities say they have very little or no understanding of this, whether by the government (77%) or companies (67%).

Americans are now less knowledgeable than before about how companies are using their personal data. The share who say they don’t understand this has risen from 59% in 2019 to 67% in 2023.

They have also grown more concerned about how the government uses the data it collects about them, with the share expressing concern up from 64% to 71% over this same period.

While these sentiments have not changed significantly since 2019 among Democrats and those who lean toward the Democratic Party, Republicans and GOP leaners have grown more wary of government data collection. Today, 77% of Republicans say they are concerned about how the government uses data it collects about them, up from 63% four years earlier.

Growing shares say they don’t understand data privacy laws

Americans are less knowledgeable about data privacy laws today than in the past.

Today, 72% of Americans say they have little to no understanding about the laws and regulations that are currently in place to protect their data privacy. This is up from 63% in 2019.

By comparison, the share who say they understand some or a great deal about these laws decreased from 37% in 2019 to 27% in 2023.

Broad partisan support for more regulation of how consumer data is used

Overall, 72% say there should be more government regulation of what companies can do with their customers’ personal information. Just 7% say there should be less regulation. Another 18% say it should stay about the same.

Views by political affiliation

There is broad partisan support for greater involvement by the government in regulating consumer data. 

Majorities of both Democrats and Republicans say there should be more government regulation of how companies treat users’ personal information (78% and 68%, respectively).

These findings are largely on par with a 2019 Center survey that showed strong support for increased regulations across parties.


Majorities of Americans say they have little to no trust that leaders of social media companies will publicly admit mistakes regarding consumer data being misused or compromised (77%), that these leaders will not sell users’ personal data to others without their consent (76%), and that leaders would be held accountable by the government if they were to misuse or compromise users’ personal data (71%).

This includes notable shares who have no trust at all in those who are running social media sites. For example, 46% say they have no trust at all in executives of social media companies to not sell users’ data without their consent.

About 9 in 10 Americans are concerned that social media sites and apps know kids’ personal information

Most Americans say they are concerned about social media sites knowing personal information about children (89%), advertisers using data about what children do online to target ads to them (85%) and online games tracking what children are doing on them (84%).

Concern is widespread, with no statistically significant differences between those with and without children.

Majority of Americans say parents and technology companies should have a great deal of responsibility for protecting children’s online privacy

Another key question is who should be responsible for the actual protection of kids’ online privacy.

Fully 85% say parents bear a great deal of responsibility for protecting children’s online privacy. Roughly six-in-ten say the same about technology companies, and an even smaller share believe the government should have a great deal of responsibility. 

The survey also measured how acceptable Americans think it is for law enforcement to use surveillance tools during criminal investigations.

Older adults are more likely than younger adults to support law enforcement tracking locations, breaking into people’s phones during an investigation

Roughly three-quarters of Americans say it’s very or somewhat acceptable for law enforcement to obtain footage from cameras people install at their residences during a criminal investigation or use information from cellphone towers to track where someone is.

By comparison, smaller shares – though still slight majorities – say it is acceptable to break the passcode on a user’s phone (54%) or require third parties to turn over users’ private chats, messages or calls (55%) during a criminal investigation. [2]

About one-in-ten Americans say they aren’t sure how they feel about law enforcement doing each of these things.

Age differences

Older adults are much more likely than younger adults to say it’s at least somewhat acceptable for law enforcement to take each of these actions in criminal investigations. 

For example, 88% of those 65 and older say it’s acceptable for law enforcement to obtain footage from cameras people install at their residences, compared with 57% of those ages 18 to 29.

In the case of a criminal investigation:

  • White adults are more likely than Hispanic and Black adults to think it’s acceptable for law enforcement to use information from cellphone towers to track people’s locations and to break the passcode on a user’s phone to get access to its contents.
  • White and Hispanic adults are more likely than Black adults to say it’s acceptable to require third parties to turn over users’ private chats, messages or calls.

Majority of Americans say it’s unacceptable to use AI to determine public assistance eligibility, but views are mixed for smart speakers analyzing voices

Artificial intelligence (AI) can be used to collect and analyze people’s personal information. Some Americans are wary of companies using AI in this way.

Fully 55% of adults say using computer programs to determine who should be eligible for public assistance is unacceptable. Roughly a quarter say it’s an acceptable use of AI.

Roughly half (48%) think it is unacceptable for social media companies to analyze what people do on their sites to deliver personalized content. Still, 41% are supportive of this.

Views are mixed when it comes to smart speakers analyzing people’s voices to learn who is speaking. Statistically equal shares say it’s unacceptable and acceptable (44% and 42%, respectively).

And some Americans – ranging from 10% to 17% – are uncertain about whether these uses are acceptable or not.

  • 49% of adults 50 and older say it’s unacceptable for a smart speaker to analyze people’s voices to learn to recognize who’s speaking. This share drops to four-in-ten among adults under 50.
  • Similarly, 56% of those 50 and older say social media companies analyzing what people do on their sites to deliver personalized content is unacceptable. But 41% of those under 50 say the same.
  • There are no differences between those under 50 and those 50 and older over whether computer programs should be used to determine eligibility for public assistance.

Most Americans who have heard of AI don’t trust companies to use it responsibly and say it will lead to unease and unintended uses

In addition to understanding people’s comfort level with certain uses of AI, the survey also measured the public’s attitudes toward companies that are utilizing AI in their products.

Among those who have heard of AI:

  • 70% say they have little to no trust in companies to make responsible decisions about how they use AI in their products.
  • Roughly eight-in-ten say the information will be used in ways people are not comfortable with or that were not originally intended.
  • Views are more mixed regarding the potential that using AI to analyze personal details could make life easier. A majority of those who have heard of AI say this will happen (62%). Regarding differences by age, adults under 50 are more optimistic than those 50 and older (70% vs. 54%). 
  • 87% of those with a college degree or higher say companies will use AI to analyze personal details in ways people would not be comfortable with. Some 82% of those with some college experience and 74% with a high school diploma or less say the same.
  • 88% of those with a bachelor’s degree or more say companies will use this information in ways that were not originally intended. This share drops to 80% among those with some college experience and 71% among those with a high school diploma or less.
  • About three-quarters of those with a college degree or more (74%) say this information will be used in ways that could make people’s lives easier. But this share drops to 60% among those with some college experience and 52% among those with a high school diploma or less.
[1] This survey includes a total sample size of 364 Asian adults. The sample primarily includes English-speaking Asian adults and, therefore, may not be representative of the overall Asian adult population. Despite this limitation, it is important to report the views of Asian adults on the topics in this study. As always, Asian adults’ responses are incorporated into the general population figures throughout this report. Asian adults are shown as a separate group when the question was asked of the full sample. Because of the relatively small sample size and a reduction in precision due to weighting, results are not shown separately for Asian adults for questions that were only asked of a random half of respondents (Form 1/Form 2).

[2] Half of respondents were asked the questions above, and the other half received the same questions with the added context of it being a “criminal investigation where public safety is at risk.” Differences in response were largely modest. See Appendix A for these findings.


© 2024 Pew Research Center

The New York Times

Opinion | The Privacy Project

APRIL 10, 2019

The New York Times is launching an ongoing examination of privacy. We’ll dig into the ideas, history and future of how our information navigates the digital ecosystem and what’s at stake.


Companies and governments are gaining new powers to follow people across the internet and around the world, and even to peer into their genomes. The benefits of such advances have been apparent for years; the costs — in anonymity, even autonomy — are now becoming clearer. The boundaries of privacy are in dispute, and its future is in doubt. Citizens, politicians and business leaders are asking if societies are making the wisest tradeoffs. The Times is embarking on this monthslong project to explore the technology and where it’s taking us, and to convene debate about how it can best help realize human potential.

Does Privacy Matter?

What do they know? How do they know it? What should be done about this? What can I do?


Stuart A. Thompson and Charlie Warzel

One Nation, Tracked

Dec. 19, 2019

Timothy Libert

This Article Is Spying On You

Sept. 18, 2019

Agnes Callard

The Real Cost of Tweeting About My Kids

Nov. 11, 2019

Bianca Vivion Brooks

I Used to Fear Being a Nobody. Then I Left Social Media.

Oct. 1, 2019

From the Newsroom

A Surveillance Net Blankets China’s Cities, Giving Police Vast Powers

Dec. 17, 2019

Mark Bowden

The Worm That Nearly Ate the Internet

June 29, 2019

James Orenstein

I’m a Judge. Here’s How Surveillance Is Challenging Our Legal System.

June 13, 2019

Sahil Chinoy

We Built an ‘Unbelievable’ (but Legal) Facial Recognition Machine

April 16, 2019

Bill Hanvey

Your Car Knows When You Gain Weight

May 20, 2019

As Cameras Track Detroit’s Residents, a Debate Ensues Over Racial Bias

July 8, 2019

Chris Hughes

It’s Time to Break Up Facebook

May 9, 2019

The Editorial Board

Total Surveillance Is Not What America Signed Up For

Dec. 21, 2019

Glenn S. Gerstell

N.S.A. Official: We Need to Prepare for the Future of War

Sept. 13, 2019

Charlie Warzel

Amazon Wants to Surveil Your Dog

Oct. 10, 2019

15 Ways Facebook, Google, Apple and Amazon Are in Government Cross Hairs

Sept. 6, 2019

Patrick Berlinquette

I Used Google Ads for Social Engineering. It Worked.

Aug. 4, 2019

Bekah Wells

The Trauma of Revenge Porn

May 6, 2019

Jack Poulson

I Used to Work for Google. I Am a Conscientious Objector.

April 23, 2019

Roseanna Sommers and Vanessa K. Bohns

Would You Let the Police Search Your Phone?

April 30, 2019

Now Some Families Are Hiring Coaches to Help Them Raise Phone-Free Children

July 6, 2019

Illustrations by Max Guther


Digital technologies: tensions in privacy and data

  • Original Empirical Research
  • Open access
  • Published: 05 March 2022
  • Volume 50, pages 1299–1323 (2022)


Sara Quach (ORCID: orcid.org/0000-0002-0976-5179), Park Thaichon, Kelly D. Martin, Scott Weaven & Robert W. Palmatier


Driven by data proliferation, digital technologies have transformed the marketing landscape. In parallel, significant privacy concerns have shaken consumer–firm relationships, prompting changes in both regulatory interventions and people’s own privacy-protective behaviors. With a comprehensive analysis of digital technologies and data strategy informed by structuration theory and privacy literature, the authors consider privacy tensions as the product of firm–consumer interactions, facilitated by digital technologies. This perspective in turn implies distinct consumer, regulatory, and firm responses related to data protection. By consolidating various perspectives, the authors propose three tenets and seven propositions, supported by interview insights from senior managers and consumer informants, that create a foundation for understanding the digital technology implications for firm performance in contexts marked by growing privacy worries and legal ramifications. On the basis of this conceptual framework, they also propose a data strategy typology across two main strategic functions of digital technologies: data monetization and data sharing. The result is four distinct types of firms, which engage in disparate behaviors in the broader ecosystem pertaining to privacy issues. This article also provides directions for research, according to a synthesis of findings from both academic and practical perspectives.


Modern marketing practice requires the use of digital technologies, and the customer data they generate, to create value (Quach et al., 2020). Yet such reliance prompts increasing privacy concerns about firms’ data behaviors and actions among both consumers and regulators. Consumers thus take action to protect their data; for example, people who switch service providers frequently cite privacy worries as a key reason (Cisco, 2020). However, many consumer respondents to a recent Australian survey (58%) admitted they do not understand what firms do with the data they collect, and 49% feel unable to protect their data due to a lack of knowledge or time, as well as the complexity of the processes involved (OAIC, 2020). Stronger regulations at global, national, and state levels (e.g., Australian Privacy Act, General Data Protection Regulation [GDPR], California Privacy Rights Act [CPRA]) may help consumers, but they are costly for firms to comply with (e.g., up to US$55 billion for CPRA, according to estimates by the California state attorney general’s office) and also establish strict penalties for noncompliance (e.g., 10–20 million euros or 2%–4% of global firm revenues for specific GDPR infringements). Thus, privacy concerns create tensions among consumers, firms, and regulators, and effective privacy protection likely requires cooperation among these interconnected groups.

Extensive research details consumers’ privacy concerns (for a comprehensive review, see Okazaki et al., 2020) and regulatory interventions of varying effectiveness (Jia et al., 2021), as well as the consequences for firms’ performance (e.g., Martin et al., 2017). However, we still lack a systematic, integrative, research-based view of privacy tensions across all three involved entities, specifically in relation to digital technologies and the unique customer data they generate (Pomfret et al., 2020). That is, existing research effectively outlines privacy tensions from consumers’ and firms’ perspectives (Bornschein et al., 2020) but without addressing the complex, interrelated positions of firms, consumers, and regulators simultaneously (Martin & Palmatier, 2020). Research into internal privacy mechanisms such as privacy paradoxes (Kaaniche et al., 2020) or dyadic views of privacy between consumers and firms (Rasoulian et al., 2017) or between firms and regulators (Johnson et al., 2020) cannot establish a triadic view of the privacy tensions created by digital technologies that link all these groups.

Therefore, to develop new marketing insights into digital technologies and privacy, we explicitly consider this firm–consumer–regulatory intersection and work to disentangle the data strategies and embedded technologies that firms use to create mutual value for themselves and their customers. With a comprehensive review of digital technologies, we examine four categories: (1) data capturing; (2) data aggregation, processing, and storage; (3) data modeling and programming; and (4) data visualization and interaction design. Each category can enable data monetization and sharing in distinct ways and with unique implications for consumers’ (information, communication, and individual) privacy outcomes. Accordingly, we investigate the consumer implications of firms’ digital technology use, with a particular focus on their privacy responses. As consumers gain knowledge about digital technologies, they may be more likely to adopt a proactive strategy and take preemptive, protective measures when interacting with firms. Finally, we examine how various regulatory interventions enter into these consumer–firm interactions, by exploring both proactive and reactive regulatory enforcement mechanisms. In pursuing these three research objectives, we establish an integrated framework with relevant implications for consumers, firms, and regulators.

We augment the analyses with case studies (i.e., Apple, Facebook, and BMW) and interview data, gathered from senior managers and consumer informants, which enhance the external validity of the integrated digital strategy framework. In particular, we use informants’ insights to understand people’s growing privacy concerns and the legal ramifications linked to digital technology strategies. Because our findings extend knowledge by blending the perspectives of firms, consumers, and regulators, they also provide meaningful research directions and actionable insights for academics and practitioners. Accordingly, we offer suggestions for research, reflecting the synthesis of the academic and practical perspectives that inform our findings.

This research contributes to marketing theory by applying a structuration theoretical approach to a marketing data privacy context. Structuration theory (Giddens, 1984) overcomes some limitations of prior systems theories that overemphasize the role of either structure or action in social processes and interactions; its theoretical insights instead reflect their interplay. Therefore, it can help us explain how data privacy regulatory frameworks impose structure on consumer–firm–policymaker interactions, then predict reactive and proactive responses by each key actor. The presence (absence) of a regulatory framework provides rules and norms that can mitigate (exacerbate) privacy tensions. In addition to relying on effective regulations for data protection, consumers exhibit other privacy protection behaviors and demands, which then intensify the pressure on firms to respond to privacy tensions.

The findings of this study also help inform marketing practice by delineating firm responses that can offset consumer privacy risks. For example, in some contexts, firm responses to consumer privacy risks are stipulated by a well-defined regulatory mandate, though even in this case, they may be subject to multiple, conflicting regulations (Lavelle, 2019). In unregulated settings, firms must self-police to meet privacy expectations, despite a lack of insights into how to mitigate the threats and risks of privacy failures (e.g., data breaches, data misuse scandals). Another option would be to exceed regulatory stipulations and use privacy as a source of competitive advantage (Palmatier & Martin, 2019), in which case firms need specialized knowledge of how to infuse privacy proactively into all their structures and processes. Noting these options, we provide practical advice for how firms can adopt a reactive stance and respond to privacy mandates on an as-needed basis or else become more proactive by exhibiting privacy-by-design, zero-party data collection, or ecosystem innovation, among other approaches.

In the next section, we begin with a description of the consumer privacy tensions that emerge from firms’ digital technology uses in four areas: data capture; data aggregation, processing, and storage; data modeling and programming; and data visualization and interaction design. We then review these conceptualizations from a structuration theory perspective, from which we derive some suggested proactive and reactive responses for regulators, firms, and consumers. Using this discussion as a foundation for our integrative framework, we offer three thematic tenets and seven research propositions, which can inform a comprehensive firm data strategy typology, as well as an extensive research agenda.

Firms’ digital technology use and consumer privacy tensions

Digital technologies allow firms to access vast amounts of data, which they might leverage to increase their profitability (i.e., data monetization) or improve the performance of their broader business networks (i.e., data sharing). Specifically, data monetization means the firm exploits data for their direct or indirect economic benefits. These practices might include applying data analytics–based insights to develop new products and services for the customers whom the data represent (i.e., data wrapping). For example, Coca-Cola collects data to improve customer service and performance, such as the development of a Cherry Sprite flavor, based on data collected from self-service vending machines and social monitoring empowered by AI-driven image recognition technology. Data monetization also involves harnessing insights to create value-added features for other clients (i.e., extended data wrapping). Facebook, for example, makes money by providing data analytics features to advertisers based on user data on its social network platform. Finally, a direct approach to data monetization is for firms simply to sell their data to other firms (Najjar & Kettinger, 2013). Comscore is a digital analytics organization that provides marketing data and information to advertisers, media and marketing agencies, publishers, and other firms, selling these data to more than 3200 clients in 75 countries.

Data sharing instead refers to resource exchanges in which firms provide data they have gathered to various network partners (e.g., suppliers, distributors, horizontal partners with complementary offerings), to facilitate collaboration in the broader ecosystem (Sydow & Windeler, 1998). For instance, Coca-Cola shares information with third parties such as hosting firms, IT service providers, or consultants that support its service provision. Coca-Cola’s EU website (https://www.coca-cola.eu/privacy-notice/) lists 18 third parties with which it shares data. PayPal, by contrast, lists 600 such parties. Other tech firms such as Apple work with complex networks of suppliers and application developers that constantly exchange information to develop better products and services. In 2018, a New York Times investigation revealed that Facebook shared data with more than 150 companies. Such data-based collaborations improve the performance of its entire digital ecosystem. Thus, data monetization increases firm profitability more directly, whereas data sharing improves profitability via network performance.

Levels of data sharing and data monetization vary across firms (see Web Appendix 1). For example, data harvesters are mostly firms in non-technical industries that engage in very limited data sharing and data monetization. Few harvesters engage in data wrapping, which would demand significant investments in digital technologies. Many of them are small firms with low to moderate levels of digital competence, though others are huge firms that recognize their own data insights are more valuable than any data they might purchase from outside sources (e.g., Coca-Cola, adidas, McDonald’s). Data patrons (e.g., Apple, PayPal) often possess moderate to high levels of digital technology and invest in sharing data across networks of partners, such as suppliers and distributors, to improve the overall functioning of the ecosystem. Even if they share data extensively, they also impose strict limits on how those data can be used and whether they may be monetized. On the other hand, data informants’ business models rely on extensive data monetization and include data brokers, app developers, and content creators (e.g., Comscore, Weather Bug, OnAudience). With vast digital technologies, they generally engage in little sharing but monetize data through extended data wrapping (e.g., game development services) or sales of information or analytics (e.g., data brokering services). Data experts (e.g., Facebook, Google) engage in high levels of both data sharing and data monetization. Due to their significant digital technology resources, they own a lot of data and also control most of the data flows in the digital ecosystem. They predominantly perform extended data wrapping to attract new customers. That is, data experts offer their customers’ data to other clients, such as advertisers, that use the insights to reach their own target customers.
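To make the typology concrete, the following minimal Python sketch (our illustration, not the authors’ model) encodes how the two strategic dimensions jointly determine a firm’s type; treating each dimension as a binary low/high split is a simplifying assumption, since the article describes them as continua.

```python
# A minimal sketch of the two-dimensional data strategy typology described
# above. The "low"/"high" split on each dimension is a simplifying assumption.
def classify_firm(data_sharing: str, data_monetization: str) -> str:
    return {
        ("low", "low"): "data harvester",    # e.g., many non-tech firms
        ("high", "low"): "data patron",      # shares widely, limits monetization
        ("low", "high"): "data informant",   # monetizes via brokering/wrapping
        ("high", "high"): "data expert",     # controls most ecosystem data flows
    }[(data_sharing, data_monetization)]

print(classify_firm("high", "high"))  # 'data expert'
```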

Data sharing and monetization practices generally involve a diverse portfolio of digital technologies, each of which can create benefits but also trigger privacy tensions, as we describe next and summarize in Table 1.

Privacy tensions

Digital technologies offer data monetization and sharing benefits to firms but have concomitant costs for consumers, especially with respect to privacy. Westin (1967) defines privacy as a person’s right “to decide what information about himself should be communicated to others and under what condition” (p. 10), whereas Altman (1975) regards it as “the selective control of access to the self” through social interactions and personal space (p. 24). Adopting these definitional premises of autonomy, access, and control, we conceive of three types of consumer privacy: information, communication, and individual (see also Hung & Wong, 2009). The simultaneous consideration of all three types offers an expansion of extant marketing studies of privacy that tend to focus solely on information privacy (Bornschein et al., 2020). In detail, information privacy refers to a consumer’s right to control the access to, use, and dissemination of her or his personal data (Westin, 1967). Thus, people may decide for themselves when, how, and to what extent their information will be known by others. Communication privacy protects personal messages or interactions from eavesdropping, scanning, or interception. People generally prefer to keep their interpersonal communications confidential and safe from third-party surveillance, which would not be possible if conversations with friends were recorded by social media and messaging apps or their in-person discussions were captured by smart devices equipped with integrated microphones. Finally, individual privacy means being left alone, without disruption (Westin, 1967). Threats to individual privacy involve personal space intrusions, emotional manipulation, and physical interference, including spam emails and retargeting practices. Such violations are on the rise, due to the presence of IoT and smart home devices installed in consumers’ personal, physical spaces. In turn, firms’ data strategies, enabled by digital technologies, have implications for each type of consumer privacy.

Extensive research details consumers’ privacy concerns (e.g., Okazaki et al., 2020), as well as some of the consequences for firm performance or regulatory interventions. However, we still lack a systematic understanding of how privacy issues arise from firms’ data strategies and their uses of various digital technologies to support such strategies. To articulate the critical tensions between firms’ technology uses for data sharing and data monetization purposes and consumers’ privacy risks, we combine the three forms of privacy with the data sharing and data monetization strategies related to four digital technology classifications: (1) data capturing; (2) data aggregation, processing, and storage; (3) data modeling and programming; and (4) data visualization and interaction design (Table 1). It would be impossible to discuss all technologies; rather, we attend specifically to six broad groups of emerging technologies: SMAC (social media, mobile, analytics, cloud), digital surveillance, robotics, AI, IoT, and mixed (virtual, augmented) realities (VR, AR). Each of these is characterized by consumer–marketer interactions and is central to firms’ data monetization and sharing strategies (Poels, 2019). Digital technologies such as blockchain, digital fabrication (e.g., 3D printing), 5G, and quantum computing are beyond the scope of this study, because they mainly support operations and digital infrastructure functions.

Data capture privacy tensions

Data capture technologies, including various sources and methods of data extraction, fuel data sharing and data monetization practices. In this respect, instead of technologies that collect transactional data, such as point-of-sale systems, we focus on social media, geospatial, biometrics, and web tracking technologies. To facilitate data sharing, the data gathered via these technologies can be shared readily with business partners and networks, such as between manufacturers and suppliers or across subsidiaries (e.g., WhatsApp shares phone numbers, device specifications, and usage data with other Facebook [recently rebranded to Meta] companies). The data collected from social media, geospatial, biometrics, and web tracking technologies can also be monetized in various ways. With user-generated social media content, location insights from geospatial technologies, biometric data, and web tracking technologies such as cookies, firms can improve marketing and business performance by developing market segmentation and (re)targeting strategies, by crafting personalized content, products, and experiences, and by building and strengthening customer relationships (de Oliveira Santini et al., 2020). They also can conduct data wrapping, for example, through customization and optimization practices such as facial recognition and medical alerts (e.g., Apple Watch). Firms also can apply extended data wrapping or sell data to other entities. Facebook, as noted, sells in-depth insights and analytics based on its users’ personal data (Appel et al., 2020), and Twitter sells third-party subscriptions to its API that allow other firms to explore users’ behaviors.

These practices threaten information privacy because consumers lose control over who has access to their personal information and communicative exchanges (e.g., a tweet or a review on a public Facebook page). Geospatial data enable firms to identify customers’ positions; by monitoring consumers’ digital footprints, companies also can follow them across different platforms, raising concerns about individual privacy. Soft biometric data, about moods or emotions, raise security and ethical concerns, because they reflect personal feelings that can be manipulated for commercial purposes, which would represent individual privacy violations. Each user’s information might also include details about other users, due to the networked nature of social media. If a user tags a friend on a public Facebook post, their conversations get exposed, which violates both friends’ communication privacy if firms review and exploit these exchanges.
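To illustrate the tracking mechanism that underlies these tensions, the following minimal Python sketch (all names and URLs hypothetical) simulates how a third-party cookie lets a single tracker stitch visits to unrelated sites into one behavioral profile:

```python
# A hypothetical simulation of third-party cookie tracking: the tracker's
# pixel is embedded on many publisher sites, and a shared cookie ID links
# one browser's visits across all of them.
import uuid
from typing import Optional

tracker_profiles: dict = {}  # cookie ID -> list of pages visited

def serve_tracking_pixel(cookie_id: Optional[str], page_url: str) -> str:
    """Simulates the tracker's pixel endpoint embedded on publisher pages."""
    if cookie_id is None:
        cookie_id = str(uuid.uuid4())  # first visit: assign a persistent ID
    tracker_profiles.setdefault(cookie_id, []).append(page_url)
    return cookie_id  # in practice, returned via a Set-Cookie header

# The same browser visiting two unrelated sites is linked into one profile.
cid = serve_tracking_pixel(None, "https://news.example/politics")
serve_tracking_pixel(cid, "https://shop.example/maternity-wear")
print(tracker_profiles[cid])  # a cross-site browsing history tied to one ID
```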

Data aggregation, processing, and storing privacy tensions

Firms often combine data sets from multiple novel sources, which allows them to effectively share and monetize such data. Key technologies for data aggregation, processing, and storage are IoT, big data, and cloud computing, with capacities to process and manage massive amounts of information (Kobusińska et al., 2018). The convergence of IoT, big data, and cloud computing is central to data sharing, as it enables firms to share applications and analytics with multiple parties in real time and at reduced technology costs. Data can be shared via IoT-enabled devices in machine-to-machine communications. Insights and analytics based on big data can be exchanged with partners, whereas cloud technologies offer a cost-effective information storage cyber-infrastructure that is broadly available across time and space and accessible by multiple users simultaneously (Alsmadi & Prybutok, 2018). Data aggregation, processing, and storage technologies empower data monetization practices by establishing novel insights about customers from IoT-enabled devices and big data, facilitated by cloud technologies, which can inform consumer profiling, behavior prediction, and targeting efforts. In turn, these efforts can optimize marketing and business performance, supply chain management, and (extended) data wrapping (i.e., development of analytical functions). Accordingly, these technologies have been widely adopted by many businesses, such as Netflix (Izrailevsky et al., 2016) and Woolworths (Crozier, 2019), to improve their performance and profitability.

Both data sharing and monetization practices in this domain can result in significant privacy tensions. Data collected from IoT devices, such as CCTV cameras that track people using facial recognition technology and wearable devices that gather real-time information about users’ medical conditions or physical activity, are very sensitive and highly personal. A comprehensive personal picture created through data aggregation and algorithmic profiling using big data analytics increases information privacy concerns, because it can reveal identifiable attributes such as sexual orientation, religious and political views, and personality (Kshetri, 2014). Moreover, when their behavior can be predicted more accurately, consumers become more susceptible to marketing efforts. For example, gambling companies might pinpoint addicts and entice them with free bets (Cox, 2017). Less purposefully, cloud services rely on virtual storage, and such remote processing can compromise system security (Alsmadi & Prybutok, 2018), especially at the transition moment when firms shift internal applications and data to the cloud, which risks information exposure to fourth parties, including unethical actors that seek to steal consumers’ personal data (Yun et al., 2019). The sheer volume of information, historical and real-time, that links connected consumers, especially those proximal to one another through IoT devices, heightens security risks involving stolen identities, personal violations, and intellectual property losses (Kshetri, 2014). These practices together threaten communication privacy and individual privacy because they are intrusive, invisible, and extraordinarily difficult to control.
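The re-identification risk from aggregation can be illustrated with a short pandas sketch; the data sets and column names are entirely hypothetical, but the pattern is the one described above: two collections that look innocuous in isolation become a named, sensitive profile after a single join.

```python
# A minimal sketch of data aggregation risk: joining two data sets on a
# shared identifier turns "de-identified" sensor data into a named profile.
import pandas as pd

fitness_tracker = pd.DataFrame({
    "device_id": ["d1", "d2"],
    "resting_heart_rate": [58, 92],
    "home_gps_cluster": ["area_12", "area_47"],
})
loyalty_card = pd.DataFrame({
    "device_id": ["d1", "d2"],            # shared key that enables the join
    "name": ["A. Singh", "B. Jones"],
    "recent_purchases": ["protein bars", "nicotine patches"],
})

profile = fitness_tracker.merge(loyalty_card, on="device_id")
print(profile)  # health signals, home area, and purchases now tied to a name
```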

Data modeling and programming privacy tensions

Automation enabled by data modeling and programming technologies plays a key role in data sharing and data monetization. Considering our focus on privacy tensions, we discuss AI/machine learning and service robots as relevant amalgamations of engineering and computer science that produce intelligent automation, capable of learning and adaptation (Xiao & Kumar, 2019). These technologies facilitate data sharing, as AI generally enables automated sharing of real-time data, and embodied AIs such as robots can exchange information in physical interactions. Moreover, AI-based systems enable data monetization by improving marketing and operational performance (e.g., personalized recommendations, smart content, programmatic media buys, chatbots, and predictive modeling) (Davenport et al., 2020). Modern robots, such as the humanoid, programmable Pepper (Musa, 2020), can understand verbal instructions, interpret human emotions, and exhibit social intelligence to improve customer experiences and optimize performance. AI and service robots also enable data wrapping/extended wrapping by automating tasks and services; in addition, their data analytics–based features can adapt automatically to the real-time, physical environment.

However, optimizing machine learning requires enormous amounts of data, collected from consumer interactions, often without consumers’ knowledge. In general, AI might extract sensitive information, such as people’s political opinions, sexual orientation, and medical conditions, from less sensitive information (Davenport et al., 2020), then manipulate users through predictive analytics or create deceptions such as deepfakes (Kietzmann et al., 2020), which threaten information privacy. Robots equipped with computer vision and machine learning both see and sense the environment, implying greater penetration into consumers’ private, physical, and emotional spaces and threats to individual and communication privacy.
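The inference risk described here can be sketched in a few lines of scikit-learn; the data are synthetic and the feature and label choices are ours, purely for illustration of how a sensitive attribute can be predicted from "non-sensitive" behavior:

```python
# A minimal sketch of sensitive-attribute inference: a model trained on
# synthetic page-visit counts learns to predict a sensitive label.
# All data here are fabricated for illustration.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
# Features: weekly visit counts to [sports, pharmacy, parenting] sites.
X_train = rng.poisson(lam=[3, 1, 1], size=(200, 3))
# Hypothetical sensitive label (e.g., a medical condition), correlated
# with pharmacy visits in this synthetic setup.
y_train = (X_train[:, 1] + rng.normal(0, 0.5, 200) > 2).astype(int)

model = LogisticRegression().fit(X_train, y_train)
# A new visitor's sensitive status is now scored from browsing alone.
print(model.predict_proba([[2, 5, 0]]))  # sensitive class gets high probability
```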

Data visualization and interaction design privacy tensions

Finally, data sharing and monetization activities rely on data visualization and interaction design technologies, as each enables the connected realities known as the “metaverse,” predicted to become an important part of the digital future (Kim, 2021). Data can be visualized through display technologies, such as mixed, augmented (AR), and virtual (VR) realities, which deliver realistic virtual experiences involving synthetic worlds in which users become immersed through interactions and sensory stimulation (Roesner et al., 2014). In terms of data sharing, these technologies allow immersive data presentations and experiences, especially data storytelling, that can be shared virtually, visually, and seamlessly among different groups of users (customers). In addition, these technologies enable firms to monetize data because they enhance customers’ interactive experiences (Hilken et al., 2017); they allow marketers to build increasingly intimate customer relationships, as in the examples of Sephora’s virtual product try-on or Facebook’s social VR platform Horizon (Appel et al., 2020), thereby improving marketing and operational performance. Both VR and AR technologies offer great potential for data wrapping/extended wrapping by realistically depicting analytics-based features.

The privacy tensions created are similar to those created by the IoT. Notably, alternate realities require sophisticated input from cameras, GPS, and microphones to enable the simultaneous functioning of various applications (Roesner et al., 2014). Blending mixed reality also requires sensitive information, such as personal communications, images captured by cameras, and movements captured by sensors, posing a risk to information and communication privacy. Some of the latest privacy concerns involve bystanders in social AR in public spaces, because the data of passers-by, such as their faces or behaviors, can be captured by AR devices without their realization (Nijholt, 2021). The processed data then could be transferred to other applications for display or rendering, such that personal information is exposed to an unknown system that might access and manipulate the data without users’ consent. “Clickjacking” tricks people into clicking on sensitive features by using transparent or deceptive interfaces, which then allows the illegitimate actor to extract their data (Roesner et al., 2014). Finally, an extensive range of sensitive sensors can capture rich information, as when visual data that produce spatial mapping information also validate spatial elements, such as exteriors or physical articles. Such exposures of physical space threaten individual privacy.

A structuration approach to digital technology–privacy tensions

Data monetizing and data sharing, achieved through firms’ use of digital technologies, can exacerbate technology–privacy tensions among consumers, regulators, and firms. Underpinned by structuration theory, we advance a framework for understanding their unique approaches to managing such tensions in Table 2.

As noted previously, structuration theory highlights the interaction of structure and action (agency) rather than remaining limited, as some previous social theories had been, to the exclusive role of just structure or action (Giddens, 1984). It thus advances a structural duality account, involving the mutual interdependence and recursivity of actions and structures. Structures, which represent both the context and the outcomes of social practices (Luo, 2006), include rules, laws, social norms, roles, and resources (e.g., digital technology), such that they might constrain or enable (group and individual) agents’ behavior (Jones & Karsten, 2008). Structuration theory also predicts the production and reproduction of a social system through interactions by actors bound by the structure. These actors rely on rules and resources to define meaningful action (reactive approach) and also might extend or transform rules and resources (proactive approach) through their actions (Sydow & Windeler, 1998). Firms and consumers inherently belong to social systems that establish structures, such as regulatory frameworks or strongly held social norms about privacy. Privacy tensions also stem from social practices that evoke responses from consumers and firms. Therefore, even as consumers and firms are influenced by regulatory frameworks and privacy norms, their actions inform and shape those regulatory frameworks and norms. This iterative, dynamic interplay establishes the rules that govern subsequent interactions, forming and refining policies and constraints (Park et al., 2018).

For analytical purposes, Giddens (1984) characterizes structure according to three dimensions: signification (meaning), legitimation (norms), and domination (power). Interactions then consist of three corresponding characteristic forms: communication, (exercise of) power, and (application of) sanctions. Separate modalities connect structure and action. In practice, these elements often are interconnected and function simultaneously (Giddens, 1984). Considering the novelty of this structuration theory application to privacy topics, as well as the complexity of our proposed model, which involves interplays of institutions (regulators), groups (firms), and individuals (consumers), we focus here on the duality of structure and social practices in an effort to clarify privacy tensions among firms, consumers, and regulators, rather than test the original analytical dimensions of structuration theory.

When considering digital technologies and privacy tensions, the structure–actor relationship also might be described according to the service-dominant logic (SDL), which indicates that actors do not function in isolation but are part of wider networks (Vargo & Lusch, 2016). A firm ecosystem comprises a web of strategic networks, in which actors are connected and exchange resources to cocreate value, within the constraints of relevant institutions or institutional arrangements (regulatory authorities, frameworks) (Roggeveen et al., 2012). Firms operate within ecosystems and continuously interact with other entities such as supply chain partners. Because data constitute a type of currency in the digital economy, they represent important elements in any firm’s value chain and the broader marketing ecosystem. By integrating structuration theory with the SDL, we can derive a framework of relationships among actors (regulators, consumers, firms) and relevant structures or institutions (Vargo & Lusch, 2016). This blended perspective implies that the actors exist and interact within a system of relationships (Giddens, 1984; Roggeveen et al., 2012). Accordingly, we can explain the regulatory framework associated with privacy (i.e., structure) and predict both reactive and proactive responses by consumers and firms (i.e., actors). As we noted previously, the presence (or absence) of a regulatory framework implies rules or norms that in turn affect privacy tensions. Beyond relying on effective data protection, though, consumers engage in further protective behaviors and demand data protection, forcing firms to respond to the privacy tensions.

Data privacy regulation

According to structuration theory, structures such as regulatory frameworks (i.e., rules) can both constrain and enable consumer and firm actions in the digital landscape, exacerbating or offsetting privacy tensions. Privacy regulatory frameworks or policies seek to provide fairness, trust, and accountability in consumer–firm data exchanges. Similar to other consumer-focused public policies, major privacy frameworks attempt to improve overall societal well-being and protect people’s rights, in balance with countervailing societal goals such as firm profitability and economic prosperity (Davis et al., 2021; Kopalle & Lehmann, 2021). Digital technologies have evolved significantly, smoothing processes that allow firms to monetize and share customer data while simultaneously adding complexity to consumer-side privacy protection. Therefore, it is critical for regulators to address privacy tensions that arise from digital technology use.

The three broad classes of privacy risks created by firms’ data monetization and sharing strategies are addressed to varying degrees by global data protection laws such as the GDPR, Australian Privacy Act, and CPRA, each of which attempts to limit the collection, use, storage, and transmission of personal information. Although data privacy regulations differ from country to country, the GDPR has become a global standard (Rustad & Koenig, 2019). New privacy laws tend to reflect its foundations (Bennett, 2018), and the global nature of business implies that many international firms must comply with its rules. Most U.S. state-based and global data protection frameworks share three common principles as their foundation (Helberger et al., 2020), which also align with structuration theory themes. First, consumers are both content receivers and data producers, making consent and ownership critical. Second, transparency is paramount to balance power discrepancies between consumers and firms. Third, data move throughout marketing ecosystems and across multiple parties, making access and control of data streams, and consumer education about data collection, uses, and potential consequences, critical.

Data protection laws also tend to involve two main enforcement methods, related to firms’ privacy policies and managerial practices, which might be categorized as more reactive or more proactive. Reactive conditions imply minimal changes and less impact on existing firm structures and performance; proactive conditions require more expansive changes. In relation to a firm’s privacy policy, a reactive requirement might stipulate its availability and visibility on the firm’s website. For example, CPRA requires a privacy policy hyperlink on the firm’s home page that is noticeable and clearly identifiable (e.g., larger font, different design than the surrounding text). A proactive version might require firms to disclose consumers’ rights and information access, use, and storage rules as important elements of their privacy policy. Through either enforcement mechanism, the regulatory goal is that consumers learn easily about the data security, data control, and governance measures enacted by the firm.

In terms of managerial practices, a reactive approach would mandate notice of data breaches. Most data protection laws also set penalties for noncompliance; under the GDPR, firms convicted of privacy violations face fines of up to 20 million euros or 4% of their global revenue. A proactive version might require firms to obtain consumer consent for information collection and usage. For example, websites often use a pop-up window that details the different types of cookies used for tracking and the parties with which data may be shared. Consumers may review this information, then opt out or request that the firm delete their information or stop sharing it with third parties. Both the GDPR and CPRA enforce these consumer protections. Other regulations address firm profiling practices, facilitated by AI, to prevent harmful consumer alienation or exclusion practices. However, such laws differ in notable ways. For example, under the GDPR, firms must conduct a regular privacy impact assessment, which is not required by CPRA.
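As a sketch of what consent-gated processing might look like in code (a hypothetical schema, not a reference implementation of any statute), every use of personal data is checked against the purposes the consumer actually granted, and withdrawals take immediate effect:

```python
# A minimal, hypothetical sketch of consent-gated data processing in the
# spirit of the GDPR/CPRA-style requirements described above.
from dataclasses import dataclass, field

@dataclass
class ConsentRecord:
    user_id: str
    granted_purposes: set = field(default_factory=set)  # e.g., {"analytics"}

    def allows(self, purpose: str) -> bool:
        return purpose in self.granted_purposes

    def opt_out(self, purpose: str) -> None:
        self.granted_purposes.discard(purpose)  # withdrawal of consent

def share_with_third_party(record: ConsentRecord, data: dict) -> None:
    if not record.allows("third_party_sharing"):
        raise PermissionError("no consent recorded for third-party sharing")
    # ...transmit data only after the check passes...

consent = ConsentRecord("u1", {"analytics"})
consent.opt_out("analytics")        # consumer withdraws consent
print(consent.allows("analytics"))  # False: processing must stop
```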

Consumer privacy protection behavior

Structuration theory suggests that as consumers grow more aware of various privacy tensions during interactions with firms, their sense of worry or fear might evoke protective actions (Walker, 2016). The level of fear or worry depends on the nature of the rules and resources available in their relationships with firms. Assessments of relationship structures likely refer to the severity of the privacy risks, their perceived likelihood, and felt vulnerability or agency to cope with privacy risks (Lwin et al., 2007). For example, if consumers perceive greater privacy risks, due to the nature of the data being collected or an increased breach likelihood in a firm relationship, they become more likely to engage in privacy-protective behaviors, manifested as future responses to the structures and resources available within that relationship.

Some privacy-protecting strategies increase consumers’ control over personal information (e.g., decreasing disclosures, minimizing their digital footprint) or establish requirements for explicit permission for uses of their personal data (information access and use) (Walker, 2016). Thus, we again can identify reactive and proactive protection strategies. With a proactive strategy, consumers preemptively address privacy threats; with a reactive strategy, they act as explicitly advised by a firm or in response to an immediate threat. Therefore, we propose a two-dimensional categorization of consumer privacy protection behavior that spans reactive/proactive and information control/permission control dimensions and produces four groups (see Table 2): (1) reactive information strategy, (2) proactive information strategy, (3) reactive permission strategy, and (4) proactive permission strategy.

Reactive information strategy

By correcting their digital footprint in response to privacy tensions, consumers can manage immediate privacy threats. For example, they might self-censor or filter content after it has been published, by deleting content from blog entries or Facebook posts, “untagging” themselves in photos or posts, “unfriending” contacts, or requesting that a firm or social media platform remove their information. Consumers also might avoid disclosure by intentionally refusing to provide certain elements of information in response to initial requests (Martin & Murphy, 2017) or else falsify the information they do provide, such as using a fake name, address, date of birth, or profile picture. This strategy reduces their digital footprint by removing or altering content that previously has been available.

Proactive information strategy

Rather than managing content that already has been published, a proactive information strategy uses restraint as a protective mechanism that defines consumers’ ongoing practices of withholding information (Lwin et al., 2007). Consumers reduce the amount of personal content shared, minimize digital interactions, and limit activities such as online check-ins, which can reveal personal information. They also might use encrypted communications such as Pretty Good Privacy software, S/MIME standards (Kaaniche et al., 2020), or anonymous remailers to reduce data availability. Some people seek non-digital alternatives for their communications, information search, and purchases (Martin & Palmatier, 2020). Because this strategy restricts information prior to sharing, it limits content and sociability. It also generally involves more effort, complexity, and inconvenience for consumers than a reactive information strategy, because it demands continuous monitoring of the digital footprint.
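As a simple illustration of the encryption tactic, the sketch below uses symmetric Fernet encryption from the third-party cryptography package as a stand-in for the public-key PGP/S/MIME tools named above; the message and setup are hypothetical:

```python
# A minimal sketch of encrypted communication as consumer self-protection.
# Fernet (symmetric) stands in for PGP/S/MIME, which use public-key schemes.
# Requires the third-party package: pip install cryptography
from cryptography.fernet import Fernet

key = Fernet.generate_key()  # secret kept by the communicating parties
cipher = Fernet(key)

token = cipher.encrypt(b"meet at 6pm")  # what a platform or firm could see
print(token)                  # opaque ciphertext only
print(cipher.decrypt(token))  # b'meet at 6pm' for the key holder
```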

Reactive permission strategy

In a reactive permission strategy, consumers limit access to their personal information when service providers ask for it or when they respond to an immediate threat, such as a data breach, that makes the risk salient. Consumers generally might agree to provide access to their information, but with a reactive strategy, they engage in a withdrawal tactic to remove themselves from risky situations, such as deleting apps that ask for access to their location, rejecting or removing cookies from their computers, and blocking advertisements (Yap et al., 2012). Fortification of identification efforts might include changing passwords after data breaches or threats. Consumers also can minimize risk by communication termination, or opting out of firm communications to avoid intrusion and prevent third-party information access.

Proactive permission strategy

Among consumers who are more aware of privacy tensions and knowledgeable about digital privacy technologies, we note more sophisticated efforts to protect personal information (Martin et al., 2017). With screening, they monitor their own digital activities by verifying firms’ privacy policies and securing transactions (e.g., using HTTPS protocols). Restriction involves limiting information access by adjusting privacy settings, such as turning off location-based access or changing cookie settings. Identity masking is another popular strategy to prevent tracking, using a security feature that stops a browser from storing cookies and the search history. Even more sophisticated tools include virtual private networks and The Onion Router, which work through encryption and create networks of virtual tunnels designed to anonymize internet communications (Kaaniche et al., 2020). Finally, if they adopt security consolidation, consumers install privacy-enhancing technologies, such as blockers and firewalls for third-party trackers, along with internet security programs (Zarouali et al., 2020). These strategies offer strong protection but also require substantial technological savvy that not all consumers possess.
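The screening tactic can be expressed programmatically; the following minimal sketch (hypothetical function name and URLs) refuses to submit personal data to any endpoint not served over HTTPS:

```python
# A minimal sketch of "screening": verify that a transaction endpoint
# encrypts data in transit before submitting anything personal.
from urllib.parse import urlparse

def safe_to_submit(url: str) -> bool:
    """Only transact with endpoints served over HTTPS."""
    return urlparse(url).scheme == "https"

print(safe_to_submit("https://shop.example/checkout"))  # True
print(safe_to_submit("http://shop.example/checkout"))   # False: screened out
```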

Firm privacy responses

According to structuration theory, augmented by the SDL, firms as actors operate in broader systems that affect their behaviors (Vargo & Lusch, 2016). Firms are influenced by structure (e.g., regulations) and by their relationships with other actors (e.g., consumers) (Park et al., 2018). In response to regulatory and consumer actions, firms might comply with privacy rules (reactive response) or go beyond them to engage in privacy innovation (proactive response), which potentially shapes new structures (Luo, 2006).

Reactive response (privacy compliance)

Structuration theory (e.g., Luo, 2006; Park et al., 2018) suggests the presence of some structurally embedded constraints on actors. Due to increased scrutiny of data practices, firms are expected to comply with what is sometimes a patchwork of local, national, and international privacy regulations. A reactive response corresponds to the minimum expectation for a company, namely, to follow existing, immediate structures in the regulatory framework. With this local approach to privacy regulation, firms aim only to meet specific, local privacy rules. This type of response is common among small, local businesses, but it also might be adopted by big corporations to take advantage of variances in legal systems across specific markets.

Furthermore, this approach is in line with a privacy process that treats privacy as a feature: privacy constitutes added value, generally included as an afterthought in product and service development efforts. The main goal underlying this approach is to stay within legal boundaries and general expectations related to privacy. For example, by strengthening their cybersecurity, companies can address consumers' reactive information strategies, minimizing negative events such as data breaches that threaten to trigger consumers' falsification, avoidance, withdrawal, or communication termination actions. In addition, these firms likely focus on technologies that enable them to adhere to regulations. When the GDPR came into force and required firms to ensure consumers' right to be forgotten, firms faced technological challenges; in response, they committed to developing automated, standardized procedures for the removal, transfer, or recovery of data upon consumers' request, which also might dissuade consumers from adopting self-censorship behaviors.
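As a minimal sketch of what such an automated erasure procedure might look like, the function below honors a right-to-be-forgotten request against a toy SQLite store, deleting the customer's personal records while keeping a minimal tombstone (an internal identifier and timestamp only) to demonstrate compliance. All table and column names are hypothetical; a production workflow would also cover backups, logs, and third-party processors.

```python
import sqlite3
from datetime import datetime, timezone

def handle_erasure_request(conn: sqlite3.Connection, customer_id: int) -> None:
    """Delete one customer's personal data and log that the request was met."""
    cur = conn.cursor()
    # Remove personal data from the hypothetical customer-facing tables.
    cur.execute("DELETE FROM customers WHERE id = ?", (customer_id,))
    cur.execute("DELETE FROM purchase_history WHERE customer_id = ?", (customer_id,))
    # Record a tombstone with no personal details, so the firm can later
    # prove the erasure happened and when.
    cur.execute(
        "INSERT INTO erasure_log (customer_id, erased_at) VALUES (?, ?)",
        (customer_id, datetime.now(timezone.utc).isoformat()),
    )
    conn.commit()

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.executescript("""
        CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT, email TEXT);
        CREATE TABLE purchase_history (customer_id INTEGER, item TEXT);
        CREATE TABLE erasure_log (customer_id INTEGER, erased_at TEXT);
        INSERT INTO customers VALUES (1, 'Ada', 'ada@example.org');
        INSERT INTO purchase_history VALUES (1, 'kitchen mixer');
    """)
    handle_erasure_request(conn, 1)
    print(conn.execute("SELECT COUNT(*) FROM customers").fetchone()[0])  # -> 0
```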

Proactive response (privacy innovation)

In a volatile business environment marked by constantly changing structural parameters, structuration theory suggests that firms can influence structural forces. For example, Xerox, Cisco, Nokia, and Motorola persistently and efficaciously convinced the Chinese government to update and require all firms to conform with a new set of industry technical standards, thereby changing industry norms as a key structural parameter (Luo, 2006 ). Privacy innovations are new or enhanced firm privacy management practices designed to benefit consumers, appease the government, or otherwise appeal to relevant stakeholders. They arise when firms actively integrate compliance as a business pillar and attempt to address privacy regulations collectively, through a universal approach to privacy . Instead of dealing with each law and policy separately, firms identify key compliance issues across regulatory frameworks and adopt a streamlined, uniform strategic plan that can guide all aspects of their behavior, as well as current and future standards.

Furthermore, privacy innovation encompasses a privacy by design paradigm , which embeds privacy in all business processes, products, and services, from their initial development to their final consumption and disposition stages (Bu et al., 2020 ). Privacy by design stresses proactive, user-centric, and user-friendly protection, and it requires substantial investments and changes. For example, data collection strategies would aim to gather zero-party data, which refer to consumers’ voluntary provision of their information, are completely consent-based, and can be collected from polls, quizzes, or website widgets (Martin & Palmatier, 2020 ). By engaging in data discovery, categorization, and flow mapping, innovative firms might minimize their information collection and only collect what they actually need. Privacy might be integrated into customer-facing applications too, such as automatic timed logouts, notifications for unrecognized access, and setting security and privacy as default options. Finally, privacy innovation encompasses accountable business practices that require firms to involve their partners in data governance to ensure end-to-end security, such as by auditing third parties that manage data on a firm’s behalf (Merrick & Ryan, 2019 ). Pursuing privacy innovation can address the proactive privacy responses of even highly skeptical consumers and instill trust, by creating a safe ecosystem, so it should minimize restraint and restriction behavior. In this sense, privacy innovation offers an effective way to address both proactive consumer responses and regulations. However, it also tends to be costly and requires both long-term commitments and extensive transformations of the business structure and practices.
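A small configuration sketch can make the privacy-by-default element of this paradigm concrete: every privacy-relevant option starts in its most protective state, and anything more permissive requires an explicit, recorded consent action. The settings object and field names below are hypothetical, not drawn from any particular firm's system.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class PrivacySettings:
    # Privacy by default: the most protective values ship as the defaults,
    # so a consumer who does nothing remains fully protected.
    location_tracking: bool = False
    third_party_sharing: bool = False
    personalized_ads: bool = False
    session_timeout_minutes: int = 15  # automatic timed logout
    consent_log: list = field(default_factory=list)

    def grant(self, setting: str) -> None:
        """Enable a data use only via explicit consent, and record the act."""
        if not isinstance(getattr(self, setting, None), bool):
            raise ValueError(f"Not a consentable setting: {setting}")
        setattr(self, setting, True)
        self.consent_log.append((setting, datetime.now(timezone.utc).isoformat()))

settings = PrivacySettings()        # protective out of the box
settings.grant("personalized_ads")  # opting in is explicit and auditable
print(settings.personalized_ads, settings.consent_log)
```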

In summary, structuration theory holds that a privacy-related structure must include regulations that require firms to provide notice and gain consent from consumers to collect, parse, and store their data. Such regulations greatly enhance consumer-initiated strategies to address technology–privacy tensions. Consumers' behaviors also depend on their resources, such as knowledge and self-efficacy (Walker, 2016): in general, reactive strategies require less expertise, and proactive ones demand greater technological savvy. Yet firms remain bound by the structure and can employ either a reactive response that treats privacy as a compliance issue or a proactive response that views it as a core business value. These trade-offs and tensions characterize regulatory–consumer–firm interactions, and we rely on them to propose an integrated framework to inform theory, practice, and policy.

Integrated framework of the structuration of privacy

The preceding review offers key insights and implications for firms, consumers, and regulators. Informed by structuration theory and augmented by elements of the SDL, we draw from these insights to develop an integrated framework (Fig. 1), in which privacy and its preservation emerge from interactions across structures (i.e., digital technologies as resources and data privacy regulations as rules) and actors (i.e., firms' and consumers' actions). On this basis, we propose a series of tenets related to themes of (1) data monetization and firm performance, (2) data sharing and firm performance, and (3) firms' data privacy actions. We also introduce associated propositions. This synthesis of extant literature reveals practical insights that clarify the future of digital technologies in contexts marked by changing consumer behaviors and regulatory parameters. Depth interviews and case studies (Table 3) provide additional conceptual scaffolding for the proposed tenets and propositions, in support of our framework.

Fig. 1 Integrated framework of privacy structuration

To verify our propositions relative to firms’ and consumers’ experiences with digital technologies and data exchanges, we conducted in-depth interviews with ten senior managers in various industries, with 4 to 31 years of experience in their respective areas. We also interviewed five consumer informants from 27 to 41 years of age who are heavy users of digital technologies (see Web Appendix 2 for informant profiles). We identified participants from our contacts in a research cluster. Interviews were conducted either face-to-face or via a video conference platform, and they were recorded and transcribed. The interview protocol includes 18 questions related to digital technologies, data collection and use, and privacy issues (see Web Appendix 3 ).

Tenet 1: Data monetization and firm performance

We propose that digital technologies function as resources that enable firms' data monetization and data sharing strategies. Using digital technologies such as big data, IoT, and AI in the ways previously described, firms can convert data and analytics into value for their customers and increase their profitability (Najjar & Kettinger, 2013). For example, a recent estimate values Facebook users' personal information at $35.2 billion, or 63% of Facebook's revenues (Shapiro, 2019). The interviewed senior managers broadly agreed that data analytics boost firms' performance. Internal data monetization practices can enhance firm performance, because the data collected from digital platforms represent consumer insights that firms can use to tailor solutions to consumers' preferences and to make better business decisions (Bleier et al., 2020). As the head of product marketing in an electronics firm noted: "By using data we can offer the right product, right value at the right touchpoint to the end-user [using the] right approach." An informant who performs customer analytics in the banking and finance sector also provided an example of data wrapping practices, whereby the organization packaged its products with data insights as value-added features:

[Some of the data] that we capture [from individual customers] can be used to provide insights to our B2B customers…. With what we have today we could provide insights into their business based on their data to help them grow their business.

Similarly, external monetization, such as selling data to clients for marketing and targeting purposes, offers significant economic benefits for sellers. These three approaches (internal monetization, data wrapping, and external monetization) are not mutually exclusive; firms can use more than one to generate revenue. Such data monetization practices increase the profitability of a firm and thereby enhance its performance.

Privacy tensions can stem from consumer–firm interactions through digital technologies (Park et al., 2018 ). Drawing from the notion in structuration theory that structure can both shape and be shaped by social practices, we note that the inherent privacy tensions of data monetizing practices provoke consumer and regulatory privacy responses, which have direct implications for firm performance. Data monetization thus may lead to privacy tensions and open firms to legal challenges, especially as data privacy regulations grow stronger. After the Cambridge Analytica scandal, heated debates about consumers’ privacy rights arose, and policymakers sought to increase the stringency of data regulations, such that Facebook’s CEO was called to testify before Congress and the company was fined US$5 billion by the U.S. Federal Trade Commission for deceiving users about their ability to control the privacy of their personal information (Lapowsky, 2019 ). In addition, trust in Facebook plunged by 66% (Weisbaum, 2018 ) and customers, including influential figures such as Elon Musk, joined the #DeleteFacebook movement in response. As this example shows, monetizing data may spark consumers’ privacy protection behaviors, which can jeopardize firms’ relationships with them. Even requests for data or perceptions that firms profit from consumer data can trigger reactive and proactive privacy protection behaviors, such as information falsification or outright refusal (Table 2 ). One consumer informant recalled an experience that felt like “an invasion, like I visited a website once because we got a new kitchen and now I get ads constantly for kitchen stuff. And it’s like, I might need that, but I don’t want you to know that I need it, but I want to find it myself.” A senior manager, head of digital marketing for an apparel firm, echoed this sentiment by acknowledging that “society is a lot more worried about data.” The inherent privacy tensions of data monetization can increase regulatory scrutiny, damage customer–firm relationships, and spark consumer privacy protection behaviors. We propose the following tenet and propositions:

Tenet 1 (Data Monetization Trade-Off)

Enabled by digital technologies, data monetization creates a trade-off between firm profitability and privacy tensions (information, communication, and individual privacy). As privacy tensions result from consumer–firm interactions, they lead to changes in both regulatory and consumer responses.

Proposition 1

Data monetization positively influences firm performance through profitability.

Proposition 2

Data monetization negatively influences firm performance through increased privacy tensions (information, communication, and individual privacy), which trigger consumer data protection behaviors and privacy regulations.

Tenet 2: Data sharing and firm performance

From the integration of structuration theory and the SDL, we determine that digital technologies enable data sharing among actors within a business network, so multiple parties can access the data anytime and from anywhere, which increases efficiency, in line with the prediction that value is co-created by multiple actors in an ecosystem (Vargo & Lusch, 2016). Data sharing also strengthens relationships among supply chain partners and fuels network effectiveness, which refers to the "viability and acceptability of inter-organizational practices and outcomes" (Sydow & Windeler, 1998, p. 273). Firms might collectively improve their performance by complementing their data with others' information, thus generating second-party data (Schneider et al., 2017). Manufacturers gather market analytics from distributors for new product design, demand forecasts, and the development of marketing strategies. Take the automobile industry as an example: data sharing enables carmakers such as BMW and their suppliers to pinpoint production bottlenecks or parts shortages, then formulate effective responses to potential supply chain problems and boost the performance of all firms involved (BMW, 2021). A chief financial officer of a manufacturing firm affirms the value of data sharing:

Definitely, you know, for us as a supplier when we receive our clients’ market data, that’s entirely valuable for us. [Data sharing] is a critical part of making sure that we do the best job that we can.

Yet similar to data monetization, the multiple-actor, collaborative nature of data sharing can result in privacy tensions. The more data a firm shares, the more control it must surrender to other parties, creating vast uncertainty. Therefore, data sharing may jeopardize consumer information privacy and trigger both consumer and regulatory responses. These responses may imply performance losses for the focal firm, especially if requisite security measures are missing (Schneider et al., 2017 ). Considering the interactions between structure and social practices of data sharing, we offer the following tenet and propositions:

Tenet 2 (Data Sharing Trade-Off)

Enabled by digital technologies, data sharing creates a trade-off between network effectiveness and privacy tensions (information, communication, and individual privacy).

Proposition 3

Data sharing positively influences firm performance through network effectiveness.

Proposition 4

Data sharing negatively influences firm performance through increased privacy tensions (information, communication, and individual privacy), which trigger consumer data protection behaviors and data privacy regulations.

Tenet 3: Firm privacy responses

The dynamic interplay of structures and actors, again, guides our theorizing. Privacy tensions that arise from data monetization and data sharing (i.e., social practices) tend to alert policymakers, who often respond by strengthening regulatory frameworks. This change in structure (i.e., data regulation), together with consumers' objections, expressed as privacy-protective behaviors, requires firms to develop data privacy actions to reduce privacy tensions. Firms that aspire to address privacy tensions proactively must devote resources and create processes for updating their digital technologies, which is a particular challenge for small firms: "I think we would like to focus more on innovation. But we're probably not that big a company at the moment that we can put a lot of resources towards it" (head of digital marketing, apparel).

A common centerpiece of privacy legislative frameworks is an emphasis on consumer consent. Adhering to such requirements may reduce the collection and use of data and limit the number of parties with which data can be shared, which can diminish firm performance due to the constraints on personalization, customization, targeting, and prediction efforts. In addition, privacy laws might restrict firms' ability to sell data and analytics and increase legal expenses, causing a significant disadvantage for businesses that are unable to collect data on their own. However, the effects of data privacy regulation can be mitigated by proactive privacy responses, in which technology innovations are leveraged to achieve data-driven benefits despite regulatory limitations. For example, Airbnb and Uber personalize predictive models without compromising consumer privacy by using smart pricing algorithms based on anonymized, aggregated, or market-oriented (event- or object-based) data, rather than personally identifiable information (Greene et al., 2019); we sketch this aggregation approach in the example that follows the quotation below. Whether they enhance privacy practices or technological processes, privacy innovations allow firms to reap critical data benefits, meet or exceed legal and regulatory obligations, accommodate consumer expectations, reduce privacy tensions, and, ultimately, mitigate the impacts of consumer responses and regulation stringency. Even though Apple engages in data monetization for marketing, operational efficiency, and data wrapping practices, its privacy initiatives (e.g., App Tracking Transparency, Privacy Nutrition Labels) have boosted customers' loyalty to the brand even higher, such that 92.6% of Apple users claim they would never switch to an Android (O'Flaherty, 2021). The initiatives also alter industry norms and impose greater pressure on competing firms, such that Google is reportedly weighing a similar anti-tracking feature for Android devices (Statt, 2021). As the chief executive officer of a telecommunication service provider explained:

You certainly got to comply, ticking that box, but I do think that if you were going that extra mile and looking at ways [of] being innovative, then you’re going to be servicing your customers even better.
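To illustrate the kind of anonymized, aggregated modeling attributed to Airbnb and Uber above, the sketch below (with invented data and column names) aggregates event-level bookings to the market level, drops the user identifier entirely, and only then fits a simple price model, so no personally identifiable information reaches the estimator.

```python
import pandas as pd
from sklearn.linear_model import LinearRegression

# Hypothetical event-level records: user_id exists at collection time
# but is removed before any modeling takes place.
events = pd.DataFrame({
    "user_id": [101, 102, 103, 104, 105, 106],
    "market":  ["A", "A", "A", "B", "B", "B"],
    "demand":  [80, 95, 110, 40, 55, 60],
    "price":   [120, 130, 145, 70, 80, 85],
})

# Aggregate to market level: the model sees only event-/object-based
# features (average demand per market), never personal information.
market_level = (
    events.drop(columns="user_id")
          .groupby("market", as_index=False)
          .mean()
)

model = LinearRegression().fit(market_level[["demand"]], market_level["price"])
new_market = pd.DataFrame({"demand": [100]})
print("Predicted price at demand 100:", model.predict(new_market)[0])
```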

We offer the following tenet and propositions:

Tenet 3 (Firm Privacy Responses)

Enabled by digital technologies, firms develop privacy responses to address regulatory and consumer privacy responses.

Proposition 5

Data privacy regulation can both positively and negatively influence firm privacy responses.

Proposition 6

Consumer privacy protection behaviors positively influence firm privacy responses.

Proposition 7

Compared with reactive privacy responses, proactive firm privacy responses reduce consumer privacy protection behaviors by mitigating the negative effects of data monetization and data sharing strategies on privacy tensions.

Implications for firms with different data strategies

Our novel framework explains that digital technologies empower firms' data monetization and data sharing strategies while also creating privacy tensions. Although most firms seek vast customer data and employ various means to leverage them, not every firm requires the same amount of data or has the necessary digital technology capabilities (see Web Appendix 1). The effect of data privacy regulation on data sharing and monetization thus should vary across firms that rely on data to varying degrees. The more firms rely on data monetization and data sharing, the more pronounced the effects of regulatory changes and consumer data protection behaviors become. For such companies, data privacy responses have heightened implications; a proactive response might mitigate the restrictions. We apply our framework to derive implications for firms with various data strategies, as depicted in Web Appendix 1, in which we assign firms to one of four groups according to their levels of both data monetization and data sharing practices.

Data harvester

Data harvesters engage in limited, internal data monetization and data sharing practices, leaving them less exposed to consumer privacy behaviors and regulations. Many of them seek to “harvest” their own data to create customer value. The head of digital marketing of an apparel firm shared:

We don’t have as much access to that sort of [third-party] data but yet we do have first-party data, which will be those [data] of those people within our leads database in our customer base.

However, if restrictions were imposed on their internal consumer data collection and use, they would have to rely on third-party data providers or brokers. Therefore, data harvesters likely comply closely with government regulations and tend to adopt reactive privacy strategies to protect consumer privacy.

Data patron

Data patrons’ monetization approaches resemble those of data harvesters, such as developing customer intelligence for better internal marketing efforts. However, data patrons such as Apple, Microsoft, and PayPal are more likely to undertake data wrapping to create customer value, because they have greater digital technology capabilities. In addition, they are cautious about the many partners involved in their business networks, which can create significant risks for data sharing practices. Echoing this view, the head of customer analytics of a banking and finance firm stated:

We do have some partnerships that we share [our data with], but again it depends on our terms and conditions on what we share and what it is related to, especially if it has to do with the customer experience and if it is in the right interest of our customer.

These firms are moderately affected by consumer privacy behaviors and regulatory frameworks. Innovative patrons can navigate sophisticated regulatory frameworks and consumer privacy protection behaviors; for example, Apple’s App Tracking Transparency promotes its image as a responsible tech firm.

Data informant

Data are the lifeblood of data informants’ business models, as described by the president of a software development firm: “We don’t produce any physical goods. We operate only in the informational space. That means data is everything.” To maximize profits, some data informants use questionable methods to identify people’s interests in sensitive topics. These firms face substantial scrutiny; some regulators suggest they should be listed in public registries and should allow consumers to request clarifications about data ownership. Their size and scope also make these firms prime targets for cyberattacks (Bleier et al., 2020). Finally, their privacy innovation tends to be low, because few incentives (or punishments) limit their data exploitation.

Data expert

Data experts are active members of the digital technology ecosystem, engaging in very high levels of both data sharing and data monetization with a wide range of third parties. Other firms rely on data experts for advertising insights and customer analytics, so in turn, data experts have significant power and influence over the nature and amount of data collected. Only a few firms (e.g., Google, Facebook, Twitter) fit this description, and each of them is subject to ongoing regulatory scrutiny. In general, stronger regulations, competitive maneuvers (e.g., Apple’s iOS 14 privacy updates affecting Facebook, Twitter, and others), and increasing consumer criticism may threaten their business model. Accordingly, data experts may need to adopt more extensive, transparent, and proactive privacy practices to address these challenges.

Recommendations for policymakers

Privacy regulations attempt to empower consumers by ensuring their ability to share, capably monitor, and protect their personal data. Regulations also provide guardrails that constrain firms’ interactions with consumers by mandating responsible data use and fair exchange. On the basis of our integrative framework and typology of data strategies, which establish a comprehensive view of the privacy tensions linked to emerging digital technologies, we offer policymakers several recommendations for effectively drafting, implementing, and monitoring such regulations.

Addressing digital technology evolution

Digital technologies evolve rapidly, and privacy regulations must account for that rapid evolution. Although future-proofing privacy regulations is untenable, regulatory parameters that govern fundamental data exchanges, rather than specific technological techniques for gathering or processing data, can protect consumers more broadly, even as technologies change. In addition, such regulations would prevent firms from applying technologically advanced workarounds to subvert the restrictions. Both the GDPR and CPRA are designed to be technology neutral, governing the data exchanged between a customer and a firm rather than the technology through which the data are exchanged. Nevertheless, emerging digital applications such as deep fakes, the rampant spread of misinformation, and the growth of advertising ecosystems can challenge even the most technologically broad regulatory mechanisms. It is thus imperative that regulatory frameworks adequately protect consumer data, regardless of technological advances, with legal and protective parameters drafted with a technology-neutral approach that avoids regulatory obsolescence and prevents innovative subversion by firms.

Beyond imposing constraints, though, we also recommend that regulators work closely with firms to learn how the regulations they propose are likely to play out in practice. For example, novel technologies can make the enforcement of various privacy regulatory dimensions more or less effective; requiring customer consent is a cornerstone of the GDPR framework, but its operationalization in practice has led to increased consumer annoyance with the required pop-ups (Fazzini, 2019). Monitoring efforts also should go beyond identifying violations or demanding strict, high-level compliance. In summary, even if technology advances too quickly to be subject to specific regulation, it strongly influences the implementation and effectiveness of regulation in practice, such that it can support or hinder intended regulatory purposes; policymakers therefore need to maintain an up-to-date, clear understanding of recent technology developments.

Appreciating variability across firms

The GDPR may have had the unintended consequence of empowering the big technology companies (Facebook, Google) that it originally sought to constrain (Lomas, 2020). Large companies with abundant resources (financial, legal, personnel) are better poised to accommodate vast regulatory changes, including privacy regulatory mandates. In particular, firms that already house vast troves of customer data can easily reduce their reliance on external or third-party data providers. Their in-house data capabilities enable them to conform with regulatory parameters, even if their data use might seem ethically questionable. We recommend that regulators examine firms’ specific data sharing and data monetization practices closely, with particular monitoring efforts focused on firms with extensive engagement in data monetization, such as data informants and data experts. This targeted approach to privacy regulation avoids some of the weaknesses of a “one-size-fits-all” approach. For example, many data informants are developers that offer free apps in exchange for customer data, and their external monetization practices are largely unknown to consumers. By applying the proposed data strategy typology, policymakers can detect areas of data concentration with the potential for misuse, such as among data experts and data patrons. Such considerations also could help limit the dominance of major tech firms. Regulators might further apply the typology to understand the adverse effects of privacy regulation, such as the burden that data portability requirements place on data harvesters, which might harm small businesses or start-ups. If they cannot acquire large troves of data on their own, these smaller competitors must rely on larger actors to obtain data-based insights.

Promoting proactive regulatory enforcement

To the extent possible, regulatory frameworks should impose proactive enforcement of both privacy policy requirements and managerial practices, including privacy by design and privacy by default principles. Monitoring a firm’s data protection behavior can indicate the effectiveness of the regulatory framework, beyond just capturing violation occurrences or noncompliance. Even with comprehensive data protection regulations in force, multinational corporations appear to adopt reactive privacy strategies for the most part, while continuing to engage in behaviors and practices that put them and their customers at risk of data breaches (Norwegian Consumer Council, 2020). By recognizing and rewarding proactive firm responses (privacy innovation), perhaps in collaboration with industry bodies or aspirational firms, regulators also could encourage the reproduction of best practices.

Finally, transnational cooperation among enforcement authorities is necessary to deal with the many multinational firms and online businesses whose operations transcend national borders. Harmonization efforts proposed by the United Nations provide a potentially useful platform; its Personal Data Protection and Privacy Principles can help member organizations navigate the diverse patchwork of regulatory coverage, given their business scope and reach, by developing a coherent set of widely applicable rules that embody strong, proactive standards. Effective enforcement of privacy regulation and true protection of consumer privacy can be realized only if globally cooperative mechanisms are in place.

Conclusion and research directions

Adapting to ever-changing business environments and developing long-term relationships with key stakeholders require extensive investments in digital technologies. Enabled by digital technologies, modern firms have access to massive amounts of data, which they use to pursue various advantages. This research provides new insights into the role of digital technologies by adopting a multidimensional approach and synthesizing current research and practical insights from the perspectives of firms, consumers, and regulators, each of which is critical to developing an integrated, comprehensive framework.

We begin by identifying the pressing tensions between firms’ data monetization and sharing practices enabled by digital technologies, as well as their implications for consumer privacy. By leveraging structuration theory, infused with elements of the SDL, we delineate responses to these tensions exhibited by regulators, consumers, and firms. The intersection of the firm, consumer, and regulatory perspectives produces an integrated framework of three tenets and seven corresponding propositions with the potential to advance understanding of privacy as a central product of the interactions across structure (i.e., digital technologies and regulatory frameworks) and social practices (i.e., firms’ and consumers’ actions). We overlay these predictions with a typology of firm strategies for data monetization and data sharing to highlight how firms’ privacy responses are influenced by resources (i.e., digital technology), rules (i.e., regulations), and other actors’ actions (i.e., consumers’ privacy protection behaviors).

To stimulate further research, we propose an agenda across three broad areas: firm data and privacy strategies, regulatory impacts on firms and ecosystems, and consumer responses to digital technologies. Each area comprises multiple research questions and avenues for marketing researchers and practitioners, which we summarize in Table 4.

Firm data and privacy strategies

Data strategy

Our proposed framework acknowledges the reality of growing data monetization and data sharing practices as focal strategies that determine firm performance. Data are the new currency in the digital era, and firms that can create unique, sought-after data, business intelligence, and responsible data-generating technologies will wield increasing power. To maximize the value of data, firms should pursue balanced, responsible data monetization and data sharing. Such efforts would benefit from an optimal data valuation model designed to maximize value and minimize risk through responsible data monetization and sharing. Firms should carefully consider their dynamic inventory of information assets, the features of their data that are central to realizing their potential, and metrics for assessing the value of data and the returns on their investments (Deloitte, 2020).

Data sharing poses unique challenges, especially across platforms and organizations. A key question is how to improve data sharing through interoperability on secure, permission-based platforms. In addition, changing work cultures, such as the increased use of home networks, personal and shared computers, and access to office systems from external work venues, increase threats of cyberattacks and data breaches. Such developments represent new obstacles to data sharing, and they offer a fruitful research area.

Privacy innovation

Regardless of firm size or data strategy type, data privacy innovations extend beyond mere compliance and can lead to competitive advantages. The benefits of data privacy innovations across all firm data strategy types require further investigation. In particular, data privacy innovation should gain momentum as a positive catalyst for firm and ecosystem performance. We need research that details effective implementations of privacy innovation practices and their long-term effects.

Research also is needed to identify effective data privacy innovations that might enhance the outcomes of data sharing and data monetization. In particular, research might determine the sustainable levels of investment in privacy innovation required for firms of different sizes, business models, and data strategies (i.e., data harvesters, data informants, data patrons, and data experts). A key question is whether the benefits of privacy innovation justify the costs of its adoption and implementation. For example, state-of-the-art security schemes and privacy-preserving technologies (e.g., blockchain-based approaches) still suffer disadvantages regarding scalability and data storage limitations (Jin et al., 2019), and they remain costly to implement, despite the data protection improvements they provide.

Moreover, privacy innovation might create ripple effects, due to the interconnected nature of firms in an ecosystem. For example, privacy changes among data experts and informants might influence data harvesters, who rely on the services of those data experts and informants. Therefore, a potential research consideration might be the ripple effects of privacy innovation, including both their potential to inspire ecosystem-wide change and the simultaneous strain they put on the relationships between firms and other actors.

Regulatory effects on firms and ecosystems

Effect of privacy regulation on firms

We anticipate that as regulatory frameworks and consumer responses to privacy issues evolve, firms will face more restrictions on their strategies and practices. Continued research should examine the risks firms face if they fail to adhere to reactive and proactive privacy requirements. The outcomes might be subject to contextual factors, because regulation stringency varies across countries and industries. Privacy regulations restrict firm actions related to data monetization and sharing, but as we have outlined, structuration theory also predicts that actors’ behaviors shape the structures. Therefore, it would be interesting to identify the extent to which firm responses to regulatory frameworks can change the regulations themselves, which in turn might trigger a different set of firm actions. Uncovering nuanced effects according to firm size, industry, and other firmographics is an important direction to inform regulatory efforts. We thus call for investigations of how the typology of data strategies and firmographic variables moderate the effect of privacy regulations on firm performance.

Effect of privacy regulations on firm networks

Privacy regulations may be increasingly necessary, but they also have the potential to restrict the benefits that both firms and consumers receive from data monetization and data sharing. That is, greater limits may be warranted, but more stringent regulations can create unintended obstacles to the performance of a firm’s network and ecosystem. Research should continue to address the impact of regulation stringency on network effectiveness. Finally, different, fragmented data access regimes exist in various sectors, such as utilities, automotive, finance, and digital content/services (Graef & van den Boom, 2020). It is therefore relevant to test the interactions of sector-specific regimes to predict broader network performance.

Consumer responses to digital technologies

Privacy attitudes

As digital technologies penetrate consumers’ lives further and become increasingly powerful means for firms’ data collection and use, fresh privacy concerns emerge. Research is needed to understand how new digital technologies such as artificial general intelligence (i.e., AI on a par with human intelligence) heighten consumers’ information, individual, and communication privacy risks. Research thus might investigate consumers’ perceptions of firms’ data strategies (i.e., data harvester, informant, patron, and expert), actual data privacy actions, and regulation stringency, which could help construct long-term, technology-proof outcomes for all parties involved. Notably, all our informants acknowledge data skeptics, who proactively protect their privacy by using encrypted platforms or refusing all personalized services, but they believe most consumers are willing to accept some loss of privacy in exchange for the right personalized experience. A critical question is how to identify consumer segments and make data trade-offs more acceptable to them.

Privacy protection behavior

Research on privacy in marketing has predominantly focused on privacy concerns, leaving a dearth of evidence about actual privacy behaviors (Pomfret et al., 2020 ). Consumers employ various forms of protection from privacy risks, whether reactive or proactive. Understanding when these responses are triggered and the role of privacy-enhancing technologies in evoking them would help firms devise more effective strategies. Additional research could explore how consumer privacy responses translate into financial impacts for firms.

As our explication of digital technologies highlights, data and their applications continue to transform the marketing landscape. The novel framework we propose can help clarify digital technologies and the future of data for firms, consumers, and regulators. But much work remains. We hope this research offers some compelling directions for further inquiry, along with new insights into the future of digital technologies.

Alsmadi, D., & Prybutok, V. (2018). Sharing and storage behavior via cloud computing: Security and privacy in research and practice. Computers in Human Behavior, 85 , 218–226.


Altman, I. (1975). Privacy: Definitions and properties. In I. Altman (Ed.), The environment and social behavior: Privacy, personal space, territory, crowding . Brooks/Cole Publishing Company.


Appel, G., Grewal, L., Hadi, R., & Stephen, A. T. (2020). The future of social media in marketing. Journal of the Academy of Marketing Science, 48 (1), 79–95.

Apple (2021). Apple Customer Privacy Policy , Apple , viewed 20/05/2021, from https://www.apple.com/legal/privacy/

Bennett, C. J. (2018). The European general data protection regulation: An instrument for the globalization of privacy standards? Information Polity, 23 , 239–246.

Bleier, A., Goldfarb, A., & Tucker, C. (2020). Consumer privacy and the future of data-based innovation and marketing. International Journal of Research in Marketing, 37 (3), 466–480.

BMW (2021). BMW Group Data Ecosystem. BMW , viewed 20/05/2021, from https://www.bmwgroup.com/en/innovation/technologies-and-mobility/bmw-group-data-ecosystem.html

Bornschein, R., Schmidt, L., & Maier, E. (2020). The effect of consumers’ perceived power and risk in digital information privacy: The example of cookie notices. Journal of Public Policy & Marketing, 39 (2), 135–154.

Bu, F., Wang, N., Jiang, B., & Liang, H. (2020). “Privacy by design” implementation: Information system engineers’ perspective. International Journal of Information Management, 53 , 102124.

Cisco (2020). Protecting data privacy to maintain digital trust: The importance of protecting data privacy during the pandemic and beyond, Cisco , viewed 13/10/2020, from https://www.cisco.com/c/dam/en_us/about/doing_business/trust-center/docs/cybersecurity-series-2020-cps.pdf

Cox, K. (2017). Gambling services use big data to target recovering gamblers, low-income families, Consumerist , viewed 13/10/2020, from https://consumerist.com/2017/08/31/gambling-services-use-big-data-to-target-recovering-gamblers-low-income-families/

Crozier, R. (2019). How Woolworths uses Google to power its massive analytics uplift, itnews , viewed 13/10/2020, from https://www.itnews.com.au/news/woolworths-uses-google-to-power-massive-data-analytics-uplift-523639

Davenport, T., Guha, A., Grewal, D., & Bressgott, T. (2020). How artificial intelligence will change the future of marketing. Journal of the Academy of Marketing Science, 48 (1), 24–42.

Davis, B., Grewal, D., & Hamilton, S. (2021). The future of marketing analytics and public policy. Journal of Public Policy & Marketing, 40 (4), 447–452.

de Oliveira Santini, F., Ladeira, W. J., Pinto, D. C., Herter, M. M., Sampaio, C. H., & Babin, B. (2020). Customer engagement in social media: A framework and meta-analysis. Journal of the Academy of Marketing Science, 48 (6), 1211–1228.

Deloitte (2020). Data valuation: Understanding the value of your data assets, Deloitte, viewed 10/05/21, from https://www2.deloitte.com/content/dam/Deloitte/global/Documents/Finance/Valuation-Data-Digital.pdf

Fazzini, K. (2019). Europe’s sweeping privacy rule was supposed to change the internet, but so far it’s mostly created frustration for users, companies, and regulators. CNBC , viewed 22/10/2021, from https://www.cnbc.com/2019/05/04/gdpr-has-frustrated-users-and-regulators.html .

FTC (2019). FTC imposes $5 billion penalty and sweeping new privacy restrictions on Facebook, Federal Trade Commission , viewed 20/08/2020, from https://www.ftc.gov/news-events/press-releases/2019/07/ftc-imposes-5-billion-penalty-sweeping-new-privacy-restrictions

Giddens, A. (1984). The constitution of society: Outline of the theory of structuration . University of California Press.

Graef, I., & van den Boom, J. (2020). Spill-overs in data governance: Uncovering the uneasy relationship between the GDPR’s right to data portability and EU sector-specific data access regimes. Journal of European Consumer and Market Law, 9 (1).

Greene, T., Shmueli, G., Ray, S., & Fell, J. (2019). Adjusting to the GDPR: The impact on data scientists and behavioral researchers. Big Data, 7 (3), 140–162.

Helberger, N., Huh, J., Milne, G., Strycharz, J., & Sundaram, H. (2020). Macro and exogenous factors in computational advertising: Key issues and new research directions. Journal of Advertising, 49 (4), 1–17.

Hilken, T., de Ruyter, K., Chylinski, M., Mahr, D., & Keeling, D. (2017). Augmenting the eye of the beholder: Exploring the strategic potential of augmented reality to enhance online service experiences. Journal of the Academy of Marketing Science, 45 (6), 884–905.

Hung, H., & Wong, Y. H. (2009). Information transparency and digital privacy protection: Are they mutually exclusive in the provision of e-services? Journal of Services Marketing, 23 (3), 154–164.

Ioannou, A., Tussyadiah, I., & Lu, Y. (2020). Privacy concerns and disclosure of biometric and behavioral data for travel. International Journal of Information Management, 54 , 102122.

Izrailevsky, Y., Vlaovic, S., & Meshenberg, R. (2016). Completing the Netflix Cloud Migration, Netflix , viewed 13/10/2020, from https://about.netflix.com/en/news/completing-the-netflix-cloud-migration

Jia, J., Jin, G. Z., & Wagman, L. (2021). The short-run effects of the general data protection regulation on technology venture investment. Marketing Science, 40 (4), 661–684.

Jin, H., Luo, Y., Li, P., & Mathew, J. (2019). A review of secure and privacy-preserving medical data sharing. IEEE Access, 7 , 61656–61669.

Johnson, G. A., Shriver, S. K., & Goldberg, S. G. (2020). Privacy and market concentration: Intended and unintended consequences of the GDPR. United States Federal Trade Commission, viewed 20/10/2021, from https://www.ftc.gov/system/files/documents/public_events/1548288/privacycon-2020-garrett_johnson.pdf

Jones, M. R., & Karsten, H. (2008). Giddens's structuration theory and information systems research. MIS Quarterly, 32 , 127–157.

Kaaniche, N., Laurent, M., & Belguith, S. (2020). Privacy enhancing technologies for solving the privacy-personalization paradox: Taxonomy and survey. Journal of Network and Computer Applications, 171 , 102807.

Kamboj, S., Sarmah, B., Gupta, S., & Dwivedi, Y. (2018). Examining branding co-creation in brand communities on social media: Applying the paradigm of stimulus-organism-response. International Journal of Information Management, 39 , 169–185.

Kietzmann, J., Lee, L. W., McCarthy, I. P., & Kietzmann, T. C. (2020). Deepfakes: Trick or treat? Business Horizons, 63 (2), 135–146.

Kim, J. (2021). Advertising in the Metaverse: Research agenda. Journal of Interactive Advertising, 21 , 1–4.

Kobusińska, A., Leung, C., Hsu, C. H., Raghavendra, S., & Chang, V. (2018). Emerging trends, issues and challenges in internet of things, big data and cloud computing. Future Generation Computer Systems, 87, 416–419.

Kopalle, P. K., & Lehmann, D. R. (2021). Big data, marketing analytics, and public policy: Implications for health care. Journal of Public Policy & Marketing, 40(4), 453–456.

Kshetri, N. (2014). Big data’s impact on privacy, security and consumer welfare. Telecommunications Policy, 38 (11), 1134–1145.

Kwok, A. O., & Koh, S. G. (2020, April). Neural network insights of blockchain technology in manufacturing improvement. In 2020 IEEE 7th international conference on industrial engineering and applications (ICIEA) (pp. 932-936). IEEE.

Lapowsky, I. (2019). How Cambridge Analytica Sparked the Great Privacy Awakening, Wired , viewed 20/08/2020, from https://www.wired.com/story/cambridge-analytica-facebook-privacy-awakening/

Lavelle, J. (2019), Gartner survey shows accelerating privacy regulation returns as the top emerging risk worrying organizations in 1Q19. Gartner , viewed 20/10/2021, from https://www.gartner.com/en/newsroom/press-releases/2019-04-11-gartner-survey-shows-accelerating-privacy-regulation-returns-as-the-top-emerging-risk-worrying-organizations-in-1q19

Leswing, K. (2021). Apple’s privacy change is poised to increase the power of its app store. CNBC , viewed 20/05/2021, from https://www.cnbc.com/2021/03/19/apples-privacy-change-could-increase-the-power-of-its-app-store.html

Lomas, N. (2020). GDPR enforcement must level up to catch big tech, report warns. TechCrunch , viewed 22/10/2021, from https://techcrunch.com/2020/11/26/gdpr-enforcement-must-level-up-to-catch-big-tech-report-warns/

Luo, Y. (2006). Political behavior, social responsibility, and perceived corruption: A structuration perspective. Journal of International Business Studies, 37 (6), 747–766.

Lwin, M., Wirtz, J., & Williams, J. (2007). Consumer online privacy concerns and responses: A power-responsibility equilibrium perspective. Journal of the Academy of Marketing Science, 35 (1), 572–585.

Martin, K. D., & Murphy, P. E. (2017). The role of data privacy in marketing. Journal of the Academy of Marketing Science, 45 (2), 135–155.

Martin, K. D., & Palmatier, R. W. (2020). Data privacy in retail: Navigating tensions and directing future research. Journal of Retailing, 96 (4), 449–457.

Martin, K. D., Borah, A., & Palmatier, R. W. (2017). Data privacy: Effects on customer and firm performance. Journal of Marketing, 81 (1), 36–58.

McStay, A. (2020). Emotional AI, soft biometrics and the surveillance of emotional life: An unusual consensus on privacy. Big Data & Society, 7 (1), 2053951720904386.

Mende, M., Scott, M. L., van Doorn, J., Grewal, D., & Shanks, I. (2019). Service robots rising: How humanoid robots influence service experiences and elicit compensatory consumer responses. Journal of Marketing Research, 56 (4), 535–556.

Merrick, R., & Ryan, S. (2019). Data privacy governance in the age of GDPR. Risk Management, 66 (3), 38–43.

Musa, J. (2020). Retail, robots, and COVID-19: Trends and how can robots play a role in safe shopping. Soft Bank robotics , viewed 13/10/2020, from https://www.softbankrobotics.com/emea/en/blog/news-trends/retail-robots-and-covid-19-trends-and-how-can-robots-play-role-safe-shopping

Najjar, M. S., & Kettinger, W. J. (2013). Data monetization: Lessons from a retailer's journey. MIS Quarterly Executive, 12 (4), 213–225.

Nica, G. (2020). BMW joins forces with German companies to create auto data alliance, BMWBlog , viewed 20/08/2020, from https://www.bmwblog.com/2020/12/03/bmw-joins-forces-with-german-companies-to-create-auto-data-alliance/

Nijholt, A. (2021). Experiencing social augmented reality in public spaces. In Adjunct proceedings of the 2021 ACM international joint conference on pervasive and ubiquitous computing and proceedings of the 2021 ACM international symposium on wearable computers (pp. 570-574).

Norwegian Consumer Council (2020). Out of control: How consumers are exploited by the online advertising industry, 177–183, viewed 20/09/2021, from https://www.forbrukerradet.no/out-of-control/

OAIC. (2020). Australian Community Attitudes to Privacy Survey 2020. Office of the Australian Information Commissioner, viewed 30/05/2021, from https://www.oaic.gov.au/__data/assets/pdf_file/0015/2373/australian-community-attitudes-to-privacy-survey-2020.pdf

O'Flaherty, K. (2021). iPhone Users’ Favorite iOS 14.5 Feature Is A Rip-Roaring Success. Forbes , viewed 20/05/2021, from https://www.forbes.com/sites/kateoflahertyuk/2021/05/14/iphone-users-favorite-ios-145-feature-is-a-rip-roaring-success/?sh=3f9e327e6e76

Okazaki, S., Eisend, M., Plangger, K., de Ruyter, K., & Grewal, D. (2020). Understanding the strategic consequences of customer privacy concerns: A meta-analytic review. Journal of Retailing, 96 (4), 458–473.

Palmatier, R. W., & Martin, K. D. (2019). Understanding and valuing customer data. In The intelligent marketer’s guide to data privacy (pp. 133–151). Palgrave Macmillan.

Park, Y. J., Chung, J. E., & Shin, D. H. (2018). The structuration of digital ecosystem, privacy, and big data intelligence. American Behavioral Scientist, 62 (10), 1319–1337.

Patterson, D. (2020). Facebook data privacy scandal: A cheat sheet. TechRepublic , viewed 20/10/2021, from https://www.techrepublic.com/article/facebook-data-privacy-scandal-a-cheat-sheet/

Poels, G. (2019). Enterprise modelling of digital innovation in strategies, services and processes. In International conference on business process management (pp. 721–732). Springer.

Pomfret, L., Previte, J., & Coote, L. (2020). Beyond concern: Socio-demographic and attitudinal influences on privacy and disclosure choices. Journal of Marketing Management, 36 (5–6), 519–549.

Quach, S., Thaichon, P., Lee, J. Y., Weaven, S., & Palmatier, R. W. (2020). Toward a theory of outside-in marketing: Past, present, and future. Industrial Marketing Management. https://doi.org/10.1016/j.indmarman.2019.10.016

Rasoulian, S., Grégoire, Y., Legoux, R., & Sénécal, S. (2017). Service crisis recovery and firm performance: Insights from information breach announcements. Journal of the Academy of Marketing Science, 45 (6), 789–806.

Roesner, F., Kohno, T., & Molnar, D. (2014). Security and privacy for augmented reality systems. Communications of the ACM, 57 (4), 88–96.

Roggeveen, A. L., Tsiros, M., & Grewal, D. (2012). Understanding the co-creation effect: When does collaborating with customers provide a lift to service recovery? Journal of the Academy of Marketing Science, 40 (6), 771–790.

Rustad, M. L., & Koenig, T. H. (2019). Towards global data privacy standard. Florida Law Review, 71 (2), 365–454.

Sabillon, R., Cano, J., Cavaller Reyes, V., & Serra Ruiz, J. (2016). Cybercrime and cybercriminals: A comprehensive study. International Journal of Computer Networks and Communications Security, 4(6), 165–176.

Schneider, M. J., Jagpal, S., Gupta, S., Li, S., & Yu, Y. (2017). Protecting customer privacy when marketing with second-party data. International Journal of Research in Marketing, 34 (3), 593–603.

Shapiro, R (2019). What Your Data Is Really Worth to Facebook. Washington Monthly , viewed 20/08/2020, from https://washingtonmonthly.com/magazine/july-august-2019/what-your-data-is-really-worth-to-facebook/

Statt, S. (2021). Google is weighing an anti-tracking feature for Android, following Apple’s lead. The Verge, from https://www.theverge.com/2021/2/4/22266823/google-anti-tracking-feature-android-privacy-apple-ios-app-tracking-transparency

Sun, Y., Wang, N., Shen, X. L., & Zhang, J. X. (2015). Location information disclosure in location-based social network services: Privacy calculus, benefit structure, and gender differences. Computers in Human Behavior, 52 , 278–292.

Sydow, J., & Windeler, A. (1998). Organizing and evaluating interfirm networks: A structurationist perspective on network processes and effectiveness. Organization Science, 9 (3), 265–284.

Vargo, S. L., & Lusch, R. F. (2016). Institutions and axioms: An extension and update of service-dominant logic. Journal of the Academy of Marketing Science, 44 (1), 5–23.

Walker, K. L. (2016). Surrendering information through the looking glass: Transparency, trust, and protection. Journal of Public Policy & Marketing, 35 (1), 144–158.

Weisbaum, H. (2018). Trust in Facebook has dropped by 66 percent since the Cambridge Analytica scandal, NBC news , viewed 20/08/2020, from https://www.nbcnews.com/business/consumer/trust-facebook-has-dropped-51-percent-cambridge-analytica-scandal-n867011

Westin, A. F. (1967). Privacy and freedom . Atheneum.

Wilkie, W. (2020) Hyperdrive BMW and SAP Join Forces to Build German Auto Data Alliance, Bloomberg, viewed 20/08/2020, from https://www.bloomberg.com/news/articles/2020-12-01/bmw-and-sap-join-forces-to-build-german-auto-data-alliance

Wong, J. (2018). Elon Musk joins #DeleteFacebook effort as Tesla and SpaceX pages vanish, The Guardian , viewed 20/08/2020, from https://www.theguardian.com/technology/2018/mar/23/elon-musk-delete-facebook-spacex-tesla-mark-zuckerberg

Xiao, L., & Kumar, V. (2019). Robotics for customer service: A useful complement or an ultimate substitute? Journal of Service Research, 24(1), 9–29.

Yap, J. E., Beverland, M. B., & Bove, L. L. (2012). Doing privacy: Consumers’ search for sovereignty through privacy management practices. Research in Consumer Behavior, 14 , 171–190.

Yun, H., Lee, G., & Kim, D. J. (2019). A chronological review of empirical research on personal information privacy concerns: An analysis of contexts and research constructs. Information & Management, 56 (4), 570–601.

Zarouali, B., Ponnet, K., Walrave, M., & Poels, K. (2017). “Do you like cookies?” Adolescents' skeptical processing of retargeted Facebook-ads and the moderating role of privacy concern and a textual debriefing. Computers in Human Behavior, 69 , 157–165.

Zarouali, B., Poels, K., Ponnet, K., & Walrave, M. (2020). The influence of a descriptive norm label on adolescents’ persuasion knowledge and privacy-protective behavior on social networking sites. Communication Monographs , 1–21.

Zubcsek, P. P., Katona, Z., & Sarvary, M. (2017). Predicting mobile advertising response using consumer colocation networks. Journal of Marketing, 81 (4), 109–126.


Open Access funding enabled and organized by CAUL and its Member Institutions.

Author information

Authors and Affiliations

Department of Marketing, Griffith Business School, Griffith University, Gold Coast campus, Southport, Queensland, 4222, Australia

Sara Quach, Park Thaichon & Scott Weaven

College of Business, Colorado State University, Fort Collins, CO, 80523-1201, USA

Kelly D. Martin

Foster School of Business, University of Washington, Box: 353226, Seattle, WA, 98195-3226, USA

Robert W. Palmatier


Corresponding author

Correspondence to Sara Quach .

Ethics declarations

Conflict of interest

The authors declare that they have no conflict of interest.

Additional information

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Dhruv Grewal served as Guest Editor for this article.

Supplementary Information

(DOCX 104 kb)

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/ .


About this article

Quach, S., Thaichon, P., Martin, K.D. et al. Digital technologies: tensions in privacy and data. J. of the Acad. Mark. Sci. 50, 1299–1323 (2022). https://doi.org/10.1007/s11747-022-00845-y


Received: 31 December 2020

Accepted: 25 January 2022

Published: 05 March 2022

Issue Date: November 2022


  • Digital technology
  • Data monetization
  • Data sharing
  • Social media
  • Artificial intelligence
  • Internet of things
  • Structuration theory
  • Privacy regulation

The New Rules of Data Privacy

  • Hossein Rahnama
  • Alex “Sandy” Pentland


Navigating privacy protection, new regulation, and consumer revolt.

After two decades of data management being a wild west, consumer mistrust, government action, and competition for customers are bringing in a new era. Firms that generate any value from personal data will need to change the way they acquire it, share it, protect it, and profit from it. They should follow three basic rules: 1) consistently cultivate trust with customers, explaining in common-sense terms how their data is being used and what’s in it for them; 2) focus on extracting insight, not personally identifiable information; and 3) CIOs and CDOs should work together to facilitate the flow of insights, with a common objective of acquiring maximum insight from consented data for the customer’s benefit.

The data harvested from our personal devices, along with our trail of electronic transactions and data from other sources, now provides the foundation for some of the world’s largest companies. Personal data is also the wellspring for millions of small businesses and countless startups, which turn it into customer insights, market predictions, and personalized digital services. For the past two decades, the commercial use of personal data has grown in wild-west fashion. But now, because of consumer mistrust, government action, and competition for customers, those days are quickly coming to an end.


  • Hossein Rahnama is Associate Professor with the Creative School at Ryerson University in Toronto and a Visiting Professor with the MIT Media Lab in Cambridge, Massachusetts. A recognized computer scientist known for his work in context-aware computing, Hossein is the founder and CEO of Flybits, a technology firm that helps companies synthesize digital customer experiences from enterprise data assets.
  • Alex “Sandy” Pentland is the Toshiba Professor of Media Arts and Sciences with the Media Lab, Sloan School of Management, and College of Computing at MIT. Sandy directs MIT’s Connection Science and Human Dynamics research laboratories, advises the OECD and UN, and co-led the World Economic Forum personal data initiatives.


Digital privacy comes at a price. Here's how to protect it


Sharing personal data is an unavoidable part of digital connectivity. Image: rawpixel.com

Robert Muggah


  • Personal data, exchanged for frictionless convenience, is being compromised, stolen and leaked with disturbing regularity.
  • People do not consciously put a price on their online privacy and most are unaware of how much data they are voluntarily sharing.
  • We need a more coordinated approach across the public and private sector to tackle cyber security and enhance data protection.

The world is more wired than ever. Digital networks connect everything from office computers and bank accounts to baby monitors and pacemakers. Connectivity is blurring the lines between what is public and private. Privacies usually taken for granted – from web searches to heartbeats – are being steadily exploited in exchange for frictionless convenience. Meanwhile, personal data is being compromised, stolen and leaked with disturbing regularity. Promises made by cyber security companies of enhanced data privacy and protection ring hollow.


Most people do not consciously put a price on their online privacy. But what if they did? A 2020 survey of Argentinians, Brazilians, Colombians, Mexicans, Germans and US citizens did precisely this. The Technology Policy Institute, a think tank, asked respondents how much a company would have to pay them each month to disclose various types of personal data. While the exact amounts varied across countries and categories – with Germans charging the most and US residents the least – the average came out to a surprisingly affordable $10, or $120 a year.

Yet most people are still unaware of just how much data they are voluntarily sharing, much less what is being syphoned from them involuntarily. But this is starting to change. The explosion of cyber attacks, especially ransomware, now makes the headlines. US companies are paying 400% more in ransom payouts in 2021 compared to 2019. The average cost of a disclosed ransomware attack is a staggering $1.8 million, with companies forced to pay up or have millions of private records scattered across the internet. Predictably, cybersecurity insurance premiums are spiralling upward.

The pros and cons of a digitizing world

One reason people share information is that it's an unavoidable part of joining the information superhighway. Today, there are over 4.6 billion active internet users, with billions more about to plug in. Social media platforms and search engines enlist billions of users a day who voluntarily part with their private information with the expectation it will “optimize” their experience.

All this digital onboarding has a dark side, including widening the exposure of governments, companies and citizens to an array of digital harms. There are signs that intrusive data harvesting and constant data theft are triggering a techlash. Sensing the shift in public mood, some tech companies are rolling out new safeguards and reaping the benefits of surging demand for privacy.

Try as tech companies might to quell it, the popular push-back against surveillance capitalism is gathering pace. More and more people believe that their data is less secure than ever before. A 2019 survey of 24 countries found that 80% of respondents were concerned about online privacy, with one in four saying they did not trust the internet.


Most Americans believe it is impossible to go through the day without having personal data harvested by governments or companies. Many are convinced that their online and offline lives are being tracked and monitored and that there is little they can do about it, which may help explain why they are so willing to part with it.

Mistrust of governments and companies also comes down to personal experience. The increase in cyber attacks and ransomware is undermining the binding glue of the internet: trust. According to one study, over 86% of all online consumers in 2020 were victims of some form of online fraud or data breach.

The relentless collection and reselling of personal data by private companies is hardly helping. Fewer people than ever believe they can safely and securely navigate online. This can lead to what researchers call “privacy self-defence” – withholding personal information, giving false biographical details or removing information from mailing lists altogether.

Building a more private and anonymous online experience

Forward-looking governments and companies are beginning to recognize that privacy has a price and some are developing solutions to protect it. They are responding to public calls to develop more stringent legislation, regulation and compliance to improve data protection and security. In democratic countries, at least, there is growing intolerance for intrusive harvesting and use of personal data, as the pushback against COVID-19-related contact tracing shows.

In most parts of the world, including more authoritarian corners, people value their anonymity and object to abuses of their privacy. More and more consumer groups, think tanks and universities are illuminating what governments and companies are doing with their data and how this contradicts data protection laws.

Data Protection and Privacy Legislation Worldwide.

In a digitally-dependent world, securing data is more important than ever. A growing number of governments and companies recognize the importance of measuring and quantifying their data privacy and protection risks, as evidenced by the European Union’s General Data Protection Regulation (GDPR), Brazil’s Marco Civil and the California Consumer Privacy Act (CCPA), among others. Doing so can help avoid costly breaches, maintain a positive reputation and ensure compliance with basic laws and norms. Citizens too are starting to question whether the loss of privacy is worth the temporary convenience afforded by newly connected devices.

A privacy mindset is essential. One way to help mitigate exposure is through digital distancing. This includes using virtual private networks with no-log policies as well as Tor – free and open-source software that enables anonymous communication – in order to conceal user location and use from intrusive network surveillance. Encrypted emails are also essential, especially from providers and platforms that can neither read nor track user content. Regulating app permissions, installing ad blockers and avoiding social media altogether are well-known strategies to bolster online privacy and reduce one's digital footprint.
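To make the Tor leg of that advice concrete, here is a minimal sketch of routing web requests through a locally running Tor client, so that the destination site sees a Tor exit node's address rather than yours. It assumes Tor is already running on its default SOCKS5 port (9050) and that the Python requests library is installed with SOCKS support (pip install requests[socks]); the IP echo service is just one public example, not part of Tor.

```python
# Route HTTP(S) requests through a local Tor client via its SOCKS5 proxy.
# Assumes Tor is listening on 127.0.0.1:9050 (its default).
import requests

TOR_PROXY = "socks5h://127.0.0.1:9050"  # socks5h also resolves DNS inside Tor

session = requests.Session()
session.proxies = {"http": TOR_PROXY, "https": TOR_PROXY}

# Compare the address a public echo service sees with and without Tor.
direct_ip = requests.get("https://api.ipify.org").text
tor_ip = session.get("https://api.ipify.org").text
print(f"Direct:  {direct_ip}")
print(f"Via Tor: {tor_ip}")  # should differ while Tor is running
```

Note that this conceals only where traffic appears to originate; it does nothing about what is sent, which is why the article pairs it with encrypted email and permission hygiene.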

Preparedness is vital in a world of cascading digital threats. More than ever, governments, companies and international organizations – not just individuals – need to design in digital defences while also managing their digital presence. Installing cyber security software is only the start. Privacy amplification and managed attribution technologies can help reinforce and strengthen data protection. At a time of persistent and omnipresent online surveillance and digital malfeasance, data security needs to be built at both the enterprise and the user levels. Minimizing exposure and maximizing privacy is a core value proposition.


License and Republishing

World Economic Forum articles may be republished in accordance with the Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International Public License, and in accordance with our Terms of Use.

The views expressed in this article are those of the author alone and not the World Economic Forum.




Introduction: A Twenty-First Century Framework for Digital Privacy

By Jeffrey Rosen[1], president and CEO of the National Constitution Center, a nonpartisan nonprofit organization devoted to educating the public about the U.S. Constitution. Rosen is also professor of law at The George Washington University Law School and a contributing editor of The Atlantic.


At the beginning of the twenty-first century, breathtaking changes in technology pose stark challenges to privacy and security.  Private companies collect massive amounts of consumer data—information that often contains intimate details about our personal lives, from the people that we email to the apps that we use to the websites that we visit.  Law enforcement often seeks to access this information when investigating suspected criminals and terrorists, and the data itself is highly mobile, with companies able to send it across borders at the tap of a button.  At the same time, hackers threaten our personal data through cyberattacks, and sophisticated users shield the contents of their data from law enforcement through the use of cutting-edge encryption tools.

Concerns span the ideological spectrum.  Libertarian conservatives and civil-libertarian liberals are united in their suspicion of state surveillance, while conservatives suspect regulation of the consumer sphere and progressives are concerned about concentrated corporate power. At the same time, judges and legislators have been remarkably slow to respond to the new threats cyberspace poses.  There have been no great Supreme Court cases on data privacy since the 1980s; the statutory framework for regulating consumer privacy is incomplete and uncertain; and few judges or legislators have been willing to tackle the crucial challenge of translating the Constitution and key privacy laws in light of new technologies.

Advances in technology raise numerous important (and difficult) legal questions:

  • How can we strike the right balance between security and privacy in the digital age?
  • How might we translate Fourth Amendment doctrine in light of technological advances and changing consumer expectations of privacy?
  • What constitutional and statutory protections should there be for data stored in the Cloud, and under what circumstances and with what constraints should the government get access to it?
  • Does the government have to tell consumers when it searches their email accounts or accesses their data?
  • And whose law should govern access to data in our borderless world—a world where data is often stored on servers in other countries and can be transferred across borders at the snap of a finger?

The National Constitution Center, with the support of Microsoft, has assembled leading scholars and thought leaders to publish a series of five white papers, entitled A Twenty-First Century Framework for Digital Privacy. We’ve asked these contributors to reflect on the challenges that new technologies pose to existing constitutional doctrine and statutory law and to propose solutions—doctrinal, legislative, and constitutional—that translate the Constitution and federal law in light of new technologies. The overarching question we asked contributors to address is how best to balance privacy concerns against the need for security in the digital age. These contributors represent diverse points of view and experiences and their papers reflect the Constitution Center’s commitment to presenting the best arguments on all sides of the constitutional issues at the center of American life.

In Digital Divergence, David Kris examines advances in technology, and he challenges the view that balancing privacy and security is a zero-sum game. Instead, he argues that new technologies threaten both privacy and security. While privacy faces familiar threats such as mass data collection and government surveillance, Kris argues that we shouldn’t ignore the risks posed to security in the digital age. Far from a “golden age” of government surveillance, Kris sees a digital world where advances in technology have made it easier for sophisticated criminals and terrorists to carry out illegal acts and more difficult for government agents to conduct effective surveillance—due, in part, to the massive volume of data, data’s mobility, and the use of cutting-edge encryption tools. In short, more and more data is generated—most of it useless to law enforcement, but that data is still susceptible to large-scale cyberattacks. Kris frames this problem as one of “digital divergence”—namely, that the advance of digital technology has harmed both privacy and security—and examines constitutional and statutory changes that might follow from this trend.

In Administering the Fourth Amendment in the Digital Age, Jim Harper critiques the limited privacy protections provided by current Fourth Amendment doctrine and calls on courts to adopt a new approach—one guided by Justice Pierce Butler’s forgotten dissent in Olmstead v. United States. Harper rejects an approach that focuses on society’s “reasonable expectations of privacy” and calls on courts to adopt one that protects privacy at least as much as the Founding generation did—protections that hew closely to the Fourth Amendment’s text and recognize data, information, and communications as a key form of property. In particular, he urges courts to administer the Fourth Amendment methodically, by (1) determining whether there has been a “seizure,” defined as an invasion of a property right, or a “search,” defined as acting with a “purpose of finding something”; (2) analyzing whether the government agent seized or searched something protected by the Fourth Amendment—namely, a person, house, paper, or effect; and, finally, (3) asking if the seizure or search was reasonable—an inquiry that should focus on the reasonableness of the government’s actions rather than the reasonableness of the defendant’s privacy preferences. Harper argues that this approach would place judges back in the familiar position of applying the law to the facts of a specific case.

In Policing and The Cloud, Christopher Slobogin offers his own approach to balancing privacy and security in the digital age—an approach that focuses on context and proportionality. He argues that given the personal nature of the information stored in the Cloud, law enforcement shouldn’t be able to access it at will. Instead, Slobogin advocates for an approach that is sensitive to the context of the specific government action or request. While a warrant may not be appropriate in all circumstances, a mere subpoena may not be sufficient, either. For instance, when the government requests non-public information about a specific person, courts should weigh the level of intrusion involved in the request against the level of suspicion. As the level of intrusion and the amount of data requested increases, so should the level of justification—up to and including, perhaps, a warrant. Slobogin’s goal is to construct rules that will allow the government to harness the Cloud’s investigative potential, while also limiting the opportunities for government abuses.

In Secret Government Searches and Digital Civil Liberties, Neil Richards tackles the issue of what he describes as “secret government searches”—namely, examples of government surveillance that remain a secret to the search target. These can be physical or digital, carried out with a warrant or without, and unknown to everyone but the government or facilitated by a private company that is prohibited from notifying the target. Richards places these secret searches in historical, technological, and constitutional context and argues that they are unprecedented, historically and technologically, and inconsistent with key constitutional values, including freedom of thought, freedom of expression, and freedom from unreasonable searches and seizures.

Finally, in Whose Law Governs in a Borderless World?: Law Enforcement Access to Data Across Borders, Jennifer Daskal explores the challenges posed by the mobility of data. Today, legal rules covering government access to data treat location as king. And yet, data can move across borders and around the world instantly, can be held in multiple places at once, and can be accessed remotely from across the world. Daskal argues that a better rule would shift the focus away from data location and consider a variety of other factors, including target location and nationality, the location of the provider, and the strength of the government’s interest. For Daskal, these factors better reflect the interests at stake in cross-border data disputes, including privacy, security, and sovereignty.

[1] Tom Donnelly, Senior Fellow for Constitutional Studies at the National Constitution Center, contributed to this article.


https://www.nist.gov/blogs/taking-measure/why-security-and-privacy-matter-digital-world

Taking Measure

Just a Standard Blog

Why Security and Privacy Matter in a Digital World


One cannot pick up a newspaper, watch TV, listen to the radio, or scan the news on the internet without some direct or veiled reference to the lack of information security or intrusions into personal privacy. Many intrusions into government and private-sector systems have exposed sensitive mission, business and personal information. Every day it seems that more and more systems are breached and more and more personal information is made available either on the web or, worse, the dark web. Given this backdrop, it is often easy to get lost in the details of cybersecurity and privacy and the seemingly endless discussions about cyber attacks, system breaches, frameworks, requirements, controls, assessments, continuous monitoring and risk management and forget why security and personal privacy matter in an increasingly digital world.

We are witnessing and taking part in the greatest information technology revolution in the history of mankind as our society undergoes the transition from a largely paper-based world to a fully digital world. As part of that transformation, we continue to push computers closer to the edge. The “edge” today is the burgeoning and already vast world of the “Internet of Things,” or IoT. This new world consists of an incredibly diverse set of familiar everyday technologies, including dishwashers, refrigerators, cameras, DVRs, medical devices, satellites, automobiles, televisions, traffic lights, drones, baby monitors, building fire/security systems, smartphones and tablets. It also includes technologies that are perhaps less familiar to the average person but absolutely vital to maintaining and safeguarding the familiar world in which they live: advanced military weapons systems; industrial and process control systems that support power plants and the nationwide electric grid, manufacturing plants and water distribution plants; emergency response systems; banking and financial systems; and transportation systems—in short, our most critical infrastructure. Yes, we have fully embraced this emerging technology and pushed computers, software and devices everywhere to the edge of this new world. And as those technologies, both familiar and critical, become increasingly integrated with IoT, so does information, all kinds of information, including intellectual property and your personal information.

It goes without saying that innovations in information technology and IoT will continue to make us more productive, help us solve difficult and challenging problems, entertain us, allow us to communicate with virtually anyone in the world instantaneously, and provide all kinds of additional, and previously unimaginable, benefits. For instance, who wouldn’t want an app that tells you the optimal time to go to the restroom during the movie you’re about to see at your local theater? These new technologies are not only compelling, but also intoxicating and addicting—leaving us with a huge blind spot that puts us at great risk of losing our property, our privacy, our security and, in some cases, our lives.

We have built an incredibly complex information technology infrastructure consisting of billions of lines of code, hardware platforms with integrated circuits on computer chips, and millions of applications on every type of computing platform from smart watches to mainframes. And right in the middle of all that complexity, your information is being routinely processed, stored and transmitted through global networks of connected systems. From a security and privacy perspective, we are not only concerned about the confidentiality, integrity and availability of the data contained in the systems embedded deep in the nation’s critical infrastructure, but also of our personal information.

Recognizing the importance of both security and privacy safeguards for systems, organizations and individuals, NIST recently initiated several groundbreaking projects to bring these concepts closer together—to facilitate the development of stronger, more robust security and privacy programs and provide a unified approach for protecting all types of information, including personal information. The first installment in this new approach occurred with the release of NIST Special Publication 800-53, Revision 5, which provided, for the first time in the standards community, a consolidated catalog of security and privacy controls—standing side by side with the broad-based safeguards needed to protect systems and personal privacy.

Today, NIST is announcing the second installment of the unified approach to privacy and security by releasing a discussion draft of NIST Special Publication 800-37, Revision 2. This publication responds to the President’s Executive Order on Strengthening the Cybersecurity of Federal Networks and Critical Infrastructure and the Office of Management and Budget’s Memorandum M-17-25 (implementation guidance for the Executive Order) to develop the next-generation Risk Management Framework (RMF 2.0) for systems, organizations and individuals. RMF 2.0 provides a disciplined, structured and repeatable process for organizations to select, implement, assess and continuously monitor security and privacy controls.
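The select, implement, assess, and monitor cycle lends itself to a small illustration. The sketch below is a hypothetical simplification for intuition only: the control IDs imitate SP 800-53 naming, but the catalog, data structures, and workflow are invented here and are not NIST's specification.

```python
# Illustrative sketch of an RMF-style control lifecycle:
# select -> implement -> assess -> monitor. The control IDs imitate
# SP 800-53 naming; the catalog and workflow are hypothetical.
from dataclasses import dataclass, field

@dataclass
class Control:
    control_id: str        # e.g., "AC-2"
    title: str
    implemented: bool = False
    assessment_passed: bool = False

@dataclass
class SystemPlan:
    name: str
    controls: list[Control] = field(default_factory=list)

    def select(self, catalog: list[Control], ids: set[str]) -> None:
        """Select a tailored baseline out of a larger control catalog."""
        self.controls = [c for c in catalog if c.control_id in ids]

    def open_items(self) -> list[str]:
        """Controls not yet implemented or failing assessment: the
        input to continuous monitoring."""
        return [c.control_id for c in self.controls
                if not (c.implemented and c.assessment_passed)]

catalog = [
    Control("AC-2", "Account Management"),
    Control("IR-4", "Incident Handling"),
    Control("PT-2", "Authority to Process Personal Information"),
]
plan = SystemPlan("payroll-system")
plan.select(catalog, {"AC-2", "PT-2"})     # select a tailored baseline
plan.controls[0].implemented = True        # implement AC-2
plan.controls[0].assessment_passed = True  # assess AC-2
print(plan.open_items())                   # monitor: ['PT-2'] still open
```

A real RMF implementation tracks far more state (tailoring rationale, assessment evidence, plans of action and milestones); the sketch only shows why the cycle is iterative rather than one-shot.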

NIST Special Publication 800-37, Revision 2, empowers customers to take charge of their protection needs and provide security and privacy solutions to support organizational missions and business objectives. It includes a new organizational preparation step, instituted to achieve more timely, effective, efficient and cost-effective risk management processes. The organizational preparation step incorporates concepts from the Cybersecurity Framework to facilitate better communication between senior leaders and executives at the enterprise and mission/business process levels and system owners—conveying acceptable limits regarding the implementation of security and privacy controls within the established organizational risk tolerance. The enterprise-wide preparation also facilitates the identification of common controls and the development of organization-wide tailored security and privacy control baselines. This significantly reduces the workload on individual system owners, provides more customized security and privacy solutions, and lowers the overall cost of system development and protection.

And finally, RMF 2.0 helps organizations reduce the complexity of their IT infrastructure by consolidating, standardizing and optimizing systems, applications and services through the application of enterprise architecture concepts and models. Such complexity reduction is critical to identifying, prioritizing and focusing organizational resources on high-value assets that require increased levels of protection—taking steps commensurate with risk such as moving assets to cloud-based systems or shared services, systems and applications.

The transformation to consolidated security and privacy guidelines will help organizations strengthen their foundational security and privacy programs, achieve greater efficiencies in control implementation, promote greater collaboration of security and privacy professionals, and provide an appropriate level of security and privacy protection for systems and individuals.

About the author


Ron Ross is a computer scientist and Fellow at the National Institute of Standards and Technology. He specializes in cybersecurity, risk management, and systems security engineering.  Ron is a retired Army officer who, when not defending cyberspace, follows his passion for NASCAR and takes care of his adopted rescue dog, Sophie.

Related posts

Photos of Marc Levitan and Long Phan are part of a collage of tornado images labeled: Tornado Resiliency Building Code Research

NIST Research Is Setting the Standard to Help Buildings Withstand Tornadoes

A researcher wearing safety glasses reaches into a box of circuitry and other equipment, which emits a green glow.

Demystifying Quantum: It’s Here, There and Everywhere

Zach Grey poses outdoors with wind turbines in the background.

Riding the Wind: How Applied Geometry and Artificial Intelligence Can Help Us Win the Renewable Energy Race

Good afternoon Mr. Ross, I just want to let you know that I do admire your leadership at NIST with such an incredible publications like the SP-800's and others to keep our beautiful country safe. I did work before supporting and improving the ICD503 and your publications were read and exercise by me in order to do my job. I want to thank you for giving me opportunity to continue reading every day on your new development publications on Cyber Security and Information Assurance that are my passion. Have a wonderful day.

Best Regards Carlos G. Salinas

Thank you for your kind remarks, Mr. Salinas. They are very much appreciated. It is an honor and a privilege to be able to serve our public and private sector customers by providing standards, guidelines, and best practices to help them build robust security and privacy programs.

I only just now received the link to the draft SP 800-37. In my opinion, NIST did a great job on RMF already. Unfortunately, I am familiar with a segment of government that immediately assumes it must have its own variations of anything and everything. This "organization" made a mess of RMF from the start, seemingly only wanting to make it as painless as possible. They failed in that by the way. If I had to pick one overriding issue that I would change If I could, it would be the apparent universality of the term "organization" used in so many controls absent a consistent understanding of who or what part of a large organization is being addressed. When an assessment procedure tells me "organizations" are automatically compliant because <insertAgencyNameHere> has defined the <widget> for me, and this control part is not identified as a tier 1 or common offering, several veins of logic are now varicose. The very next control or part may speak of "organization" as if it is the CCP or the ISO without regard for what precedes or follows. My assumption is that many people worked on controls independently and never came to agreement on a standard definition of "organization."

Beautiful blog author.Thank you for sharing.Keep it up.Good wishes for your work.

Beautiful blog post author.Thank you.

Excellent post & thank you so much for sharing

Thank you for your post.

Add new comment

  • No HTML tags allowed.
  • Web page addresses and email addresses turn into links automatically.
  • Lines and paragraphs break automatically.

Image CAPTCHA

  • Follow us on Facebook
  • Follow us on Twitter
  • Criminal Justice
  • Environment
  • Politics & Government
  • Race & Gender

Expert Commentary

Data security: Research on privacy in the digital age

Research on consumer attitudes toward digital privacy and the practices of tech companies that shape data collection and use policies.


This work is licensed under a Creative Commons Attribution-NoDerivatives 4.0 International License.

by Chloe Reichel, The Journalist’s Resource, April 12, 2018


On your smartphone, you’re not much more than a data machine, generating reams of valuable information that tech companies can mine for insights, sell to advertisers and use to optimize their products.

The Cambridge Analytica scandal, in which a third-party Facebook app harvested data from far more people than the 270,000 users who initially consented to its terms of service and supplied it for use in political campaigns (including Donald Trump’s 2016 bid for the presidency), highlights anew the vulnerability of consumer data in this digital age.

But it’s easy to forget these risks to personal privacy and security while tapping out messages to friends or scrolling endlessly through the web. The distraction machines at our fingertips ask for access and we give it up quickly, hastily agreeing to unread privacy policies and terms of service in exchange for a fresh jolt of content.

Studies highlight this “digital privacy paradox,” in which people express concerns over their privacy but then act in ways that undermine these beliefs, for example, offering up personal data for a small incentive. This review features research on this topic — consumer attitudes toward digital privacy — as well as studies of the supply side — that is, research on the practices of app developers and other tech companies that shape data collection and use policies.

“Artificial Intelligence and Consumer Privacy” Jin, Ginger Zhe. National Bureau of Economic Research working paper, 2018. DOI: 10.3386/w24253.

Summary: This paper looks at the risks big data poses to consumer privacy. The author describes the causes and consequences of data breaches and the ways in which technological tools can be used for data misuse. She then explores the interaction between privacy risks and the U.S. market. For example, the author highlights the “self-conflicting” views consumers hold about their privacy, citing literature in which consumers give away personal data for small incentives despite attitudes that might indicate otherwise. On the supply side, similar paradoxes exist — for example, despite an awareness of cyber risks, firms “tend to deploy new technology… before adopting security measures to protect them.” The author discusses how market forces might motivate firms to strengthen privacy settings in response to consumer concerns, but also mentions how market mechanisms can have the opposite effect, using the example of password policies and consumers’ demand for convenience (in the form of weaker password requirements). The author then describes how artificial intelligence might be used to mitigate data security and privacy risks. Lastly, she provides an overview of U.S. policy on consumer privacy and data security and describes future challenges in the field.

“The Digital Privacy Paradox: Small Money, Small Costs, Small Talk” Athey, Susan; Catalini, Christian; Tucker, Catherine. National Bureau of Economic Research working paper, 2017. DOI: 10.3386/w23488.

Abstract: “‘Notice and Choice’ has been a mainstay of policies designed to safeguard consumer privacy. This paper investigates distortions in consumer behavior when faced with notice and choice which may limit the ability of consumers to safeguard their privacy using field experiment data from the MIT digital currency experiment. There are three findings. First, the effect small incentives have on disclosure may explain the privacy paradox: Whereas people say they care about privacy, they are willing to relinquish private data quite easily when incentivized to do so. Second, small navigation costs have a tangible effect on how privacy-protective consumers’ choices are, often in sharp contrast with individual stated preferences about privacy. Third, the introduction of irrelevant, but reassuring information about privacy protection makes consumers less likely to avoid surveillance, regardless of their stated preferences towards privacy.”

“Mobile Applications and Access to Private Data: The Supply Side of the Android Ecosystem” Kesler, Reinhold; Kummer, Michael E.; Schulte, Patrick. SSRN Electronic Journal, 2017. DOI: 10.2139/ssrn.3106571.

Summary: This paper looks at strategies mobile app developers use to collect data, which apps are most likely to practice intrusive data collection, and what factors predict problematic personal data usage. By examining the variations in data collection strategies of different apps created by the same developers over a period of four years, the researchers uncover three trends. 1) With time and experience, developers adopt more intrusive data collection tactics. 2) Apps with intrusive data collection strategies most commonly target adolescents. 3) Apps that request “critical and atypical permissions” (i.e., access to various data sources) are linked with an increased risk of problematic data practices later on.
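To make the notion of flagging “critical and atypical permissions” concrete, here is a toy sketch of the idea. The Android permission strings are real identifiers, but the risk list, threshold, and scoring rule are hypothetical illustrations, not the method used in the paper.

```python
# Toy screen for apps requesting multiple "critical" permissions.
# The permission strings are real Android identifiers; the risk list
# and the threshold rule are hypothetical, not the paper's method.
CRITICAL_PERMISSIONS = {
    "android.permission.READ_CONTACTS",
    "android.permission.ACCESS_FINE_LOCATION",
    "android.permission.RECORD_AUDIO",
    "android.permission.READ_SMS",
}

def flag_app(name: str, requested: set[str], threshold: int = 2) -> bool:
    """Flag an app whose requested permissions include at least
    `threshold` critical ones, a crude proxy for intrusive collection."""
    hits = requested & CRITICAL_PERMISSIONS
    flagged = len(hits) >= threshold
    if flagged:
        print(f"{name}: flagged for {sorted(hits)}")
    return flagged

flag_app("flashlight", {"android.permission.CAMERA"})               # not flagged
flag_app("quiz-game", {"android.permission.READ_CONTACTS",
                       "android.permission.ACCESS_FINE_LOCATION"})  # flagged
```

In the paper's setting the analogous signal is observed at scale, across apps and over four years; the sketch only shows the shape of the screening logic.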

“Consumer Privacy Choice in Online Advertising: Who Opts Out and at What Cost to Industry?” Johnson, Garrett A.; Shriver, Scott; Du, Shaoyin. SSRN Electronic Journal, 2017. DOI: 10.2139/ssrn.3020503.

Abstract: “We study consumer privacy choice in the context of online display advertising, where advertisers track consumers’ browsing to improve ad targeting. In 2010, the American ad industry implemented a self-regulation mechanism that overlaid ‘AdChoices’ icons on ads, which consumers could click to opt out of online behavioral advertising. We examine the real-world uptake of AdChoices using transaction data from an ad exchange. Though consumers express strong privacy concerns in surveys, we find that only 0.23 percent of American ad impressions arise from users who opted out of online behavioral advertising. We also find that opt-out user ads fetch 59.2 percent less revenue on the exchange than do comparable ads for users who allow behavioral targeting. These findings are broadly consistent with evidence from the European Union and Canada, where industry subsequently implemented the AdChoices program. We calculate an upper bound for the industry’s value of the average opt-out user’s browsing information to be $8 per capita annually in the US. We find that opt-out users tend to be more technologically sophisticated, though opt-out rates are higher in American states with lower income. These results inform the privacy policy discussion by illuminating the real-world consequences of an opt-out privacy mechanism.”

“The Economics of Privacy” Acquisti, Alessandro; Taylor, Curtis R.; Wagman, Liad. Journal of Economic Literature, 2016. DOI: 10.2139/ssrn.2580411.

Abstract: “This article summarizes and draws connections among diverse streams of theoretical and empirical research on the economics of privacy. We focus on the economic value and consequences of protecting and disclosing personal information, and on consumers’ understanding and decisions regarding the trade-offs associated with the privacy and the sharing of personal data. We highlight how the economic analysis of privacy evolved over time, as advancements in information technology raised increasingly nuanced and complex issues associated with the protection and sharing of personal information. We find and highlight three themes that connect diverse insights from the literature. First, characterizing a single unifying economic theory of privacy is hard, because privacy issues of economic relevance arise in widely diverse contexts. Second, there are theoretical and empirical situations where the protection of privacy can both enhance, and detract from, individual and societal welfare. Third, in digital economies, consumers’ ability to make informed decisions about their privacy is severely hindered, because consumers are often in a position of imperfect or asymmetric information regarding when their data is collected, for what purposes, and with what consequences. We conclude the article by highlighting some of the ongoing issues in the privacy debate of interest to economists.”




Digital Media and the Issue of Privacy Critical Essay


Analysis of the Readings

The readings examine digital media and its impact on the issue of privacy. The article by Mills (2015) focuses on old conflicts that persist in the new media. Just as in the past, top celebrities and politicians try to hide their personal lives from the public, while the media are keen to expose them in the name of public interest.

In their quest to expose these celebrities, the author argues, journalists sometimes end up publishing unverified information just to stay relevant, which may be harmful. The article written by Roessler and Mokrosinska (2015) discusses the necessary balance that should exist between privacy and publicity.

Issues relevant to the public should be publicized, and those that involve the privacy of an individual should remain private. The third article, by Margulis (2011), focuses on three theories of privacy: Westin’s Theory, Altman’s Theory, and Petronio’s Communication Privacy Management (CPM) Theory.

Westin’s Theory

This theory holds that people always make deliberate attempts to limit their accessibility to others as a way of protecting their privacy. When they know that a piece of information about them may be harmful, they make an effort to ensure that it remains hidden from others. Westin also argued that privacy involves a voluntary withdrawal of an individual from the public.

Altman’s Theory

According to this theory, people always regulate how open or closed they are to different people under different external environmental conditions. For instance, if people achieve success that society admires, they embrace publicity. The same people, however, would prefer scandalous information to remain private. People exhibit different behavioral mechanisms to regulate privacy based on the expected outcome.

Petronio’s Theory

This theory defines privacy boundaries that range from total openness to complete secrecy. Privacy has rules that should be observed, especially by the modern digital media. Before making a piece of information public, one must ensure that it is truthful and relevant to the public. The publicity must be made without any intent to harm an individual or group.

Comparison of the Theories

From the analysis of the three theories, it is clear that all three share the principle that privacy is an ingredient of communication that should be respected. They all agree that some pieces of information are better kept secret because of the consequences they may have for an individual, a group of people, or the public if they are publicized.

How the Theories Differ

Westin’s Theory differs from the other two theories in terms of the custodian of information. It argues that by staying away from the public, one is able to keep certain information from reaching the public. The other two theories do not agree with this argument. Petronio’s CPM Theory also addresses the relationship between social media and privacy comprehensively, while the other two theories do not explain that relationship in such detail.

Application of the Theories to Digital Media

It may not be easy to apply these theories to unregulated social media, where individuals may fail to follow the law when publishing pieces of information. For example, an individual may publish wrong information about a person or group with the intent to cause harm.

On the other hand, a government may prohibit media from publishing true information that may be of harm to the political leaders. This is very common in the Middle East where media freedom is infringed upon by state organs. As Mills (2015) says, these theories are very relevant when they relate to mass media that is regulated in a democratic environment.

Interesting Items in the Readings

These theories give a wide focus on the issue of privacy and how it can be upheld in a society that thrives on social media. For instance, it is interesting how Petronio’s CPM Theory defines how one can promote privacy in social media. Sometimes people lie to shift the public’s attention from a truth that may be more harmful.

Comparing and Contrasting Major Themes

The three readings are in agreement that some pieces of information are best kept secret as a way of protecting individuals, groups, or states. They agree that the freedom brought about by social media should not be abused by infringing on the rights of others with malicious intentions. However, these readings differ when it comes to how privacy should be promoted in the digital media.

Perception towards Digital Media Privacy

I believe that the privacy of individuals or groups should be respected as long as it does not affect the public and operates within the law. Social media has promoted the invasion of privacy because of the ease with which information can be published by individuals. This may be dangerous in the digital world. For instance, when a person publishes classified government data relating to security, it may jeopardize the functionality of security organs within the country.

Roessler, B., & Mokrosinska, D. (Eds.). (2015). Social Dimensions of Privacy: Interdisciplinary Perspectives. Cambridge: Cambridge University Press.

Margulis, S. (2011). Three Theories of Privacy: An Overview. Berlin: Springer.

Mills, J. (2015). Privacy in the New Media Age. Gainesville: University Press of Florida.


IvyPanda. (2019, June 25). Digital Media and the Issue of Privacy. https://ivypanda.com/essays/digital-media-privacy/

"Digital Media and the Issue of Privacy." IvyPanda , 25 June 2019, ivypanda.com/essays/digital-media-privacy/.

IvyPanda . (2019) 'Digital Media and the Issue of Privacy'. 25 June.

IvyPanda . 2019. "Digital Media and the Issue of Privacy." June 25, 2019. https://ivypanda.com/essays/digital-media-privacy/.

1. IvyPanda . "Digital Media and the Issue of Privacy." June 25, 2019. https://ivypanda.com/essays/digital-media-privacy/.

Bibliography

IvyPanda . "Digital Media and the Issue of Privacy." June 25, 2019. https://ivypanda.com/essays/digital-media-privacy/.

Data Privacy and Technology

Explore the risks and rewards of data privacy and collection.

Explore legal and ethical implications of one’s personal data, the risks and rewards of data collection and surveillance, and the need for policy, advocacy, and privacy monitoring, in this Harvard Online course.

Harvard John A. Paulson School of Engineering and Applied Sciences

What You'll Learn

Why is data privacy important? Where is the line between the benefits of gathering information for business growth and personal privacy? How can you balance security and surveillance in practice? Should the data gathered about your customers be used to personalize the ads they see and the prices they pay? 

Technology isn’t value-neutral. As a decision-maker in your personal and professional life, you may find yourself weighing the benefits and risks of using new and emerging technologies to collect personal data.  

As new digital technologies are introduced, they present an ever-evolving set of online data protection and privacy challenges for businesses and consumers. Data Privacy and Technology will help you think critically about the trade-offs and threats presented by today’s digital landscape.

Through real-life examples with industry experts, policy makers, and privacy researchers, you’ll gain privacy and data protection training and:

  • Examine legal and ethical implications of collecting personal data.
  • Understand who’s responsible for protecting personal data.
  • Comprehend why antitrust and privacy laws are unable to keep pace with the rapid change in technology. 

Throughout the modules in this course you will be introduced to concepts on data privacy and ethics, and have the opportunity to:

  • Explore the positive and negative impact of technology on privacy.
  • Understand the concepts behind privacy and ethics in information technology.
  • Uncover the risks and rewards of surveillance.
  • Examine the future of data privacy ethics, collection, and usage.
  • Be introduced to the technology of personal data collection.

Privacy is a complex and multifaceted concept. This course aims to help you become an effective leader in business task forces, privacy-forward communities, and data-sharing practices. 

By the end of the course, you will be ready to contribute to your organization as it grapples with the interaction between privacy and big data.  

Ready to enroll in this course and earn your data privacy certification?

It's time to learn how to balance the utility of a dataset with the privacy of the individuals.  

The course is part of the Harvard on Digital Learning Path and will be delivered via HBS Online’s course platform. Learners will be immersed in real-world examples from experts at industry-leading organizations. By the end of the course, participants will be able to:

  • Engage in dialogue and decision making about data collection and security in your workplace.
  • Explain the attempts to legally define privacy and the ongoing conflict of laws and norms within technology advancements.
  • Identify approaches to collecting, using, and selling data, including data privacy policies—and the impact on consumer protections.
  • Analyze challenges related to the anonymization of data and the tensions between privacy and utility.
  • Examine the price of personal data and the trade-offs between privacy and other values.
  • Recognize and prepare for the impacts of emerging technologies on the future of privacy, protection, and the law.
  • Think critically about privacy issues from multiple angles, exploring policy, cultural, and societal impacts.

Your Instructors

Michael D. Smith  is the John H. Finley, Jr. Professor of Engineering and Applied Sciences and a Distinguished Service Professor at Harvard University. He spent 11 years as the Edgerley Family Dean of the Faculty of Arts and Sciences, leading Harvard’s oldest and largest school. Smith was actively involved in the launch of edX, and served on its board from 2012 to 2018. Earlier in his career, he spent time in industry building a range of computing hardware for Honeywell Information Systems and, in 2001, co-founded the data security company Liquid Machines, which was acquired in 2010 by Check Point Software Technologies. While at Harvard, he received a prestigious National Science Foundation Young Investigator Award and the Alpha Iota Prize for Excellence in Teaching.

Jim Waldo  is the Gordon McKay Professor of the Practice of Computer Science in the School of Engineering and Applied Sciences at Harvard, where he teaches courses in distributed systems and privacy; the Chief Technology Officer for the School of Engineering and Applied Sciences; and a Professor of Policy teaching on topics of technology and policy at the Harvard Kennedy School. Waldo was a Distinguished Engineer with Sun Microsystems Laboratories, where he investigated next-generation, large-scale distributed systems, and got his start in distributed systems at Apollo Computer.

Real World Case Studies

Affiliations are listed for identification purposes only.

Andy Yen

Hear from a CEO on creating a privacy-forward business and learn more about the cost of privacy protection.

Latanya Sweeney

Rebecca Skloot

Available discounts and benefits for groups and individuals.

Experience Harvard Online by utilizing our wide variety of discount programs for individuals and groups.

Past Participant Discounts

Learners who have enrolled in at least one qualifying Harvard Online program hosted on the HBS Online platform are eligible to receive a 30% discount on this course, regardless of completion or certificate status in the first purchased program. Past Participant Discounts are automatically applied to the Program Fee at the time of payment. Learn more here.

Learners who have earned a verified certificate for a HarvardX course hosted on the edX platform are eligible to receive a 30% discount on this course using a discount code. Discounts are not available after you've submitted payment, so if you think you are eligible for a discount on a registration, please check your email for a code or contact us.

Nonprofit, Government, Military, and Education Discounts

For this course we offer a 30% discount for learners who work in the nonprofit, government, military, or education fields. 

Eligibility is determined by a prospective learner’s email address, ending in .org, .gov, .mil, or .edu. Interested learners can apply below for the discount and, if eligible, will receive a promo code to enter when completing payment information to enroll in a Harvard Online program. Click here to apply for these discounts.

Gather your team to experience Data Privacy and Technology and other Harvard Online courses to enjoy the benefits of learning together: 

  • Single invoicing for groups of 10 or more
  • Tiered discounts and pricing available with up to 50% off
  • Growth reports on your team's progress
  • Flexible course and partnership plans 

Learn more and enroll your team!

Who Will Benefit

IT, Programming, & Marketing Professionals

Utilize best practices in collecting and using personal data.

Managers & Decision-Makers

Set clear strategies and policies about consumer data, website security, and technology best practices.

Advocates & Policy Makers

Explore current policies and regulations to define data education and protection.

Learner Experience

On Data Privacy and Technology

"The course was informative on both current and future data privacy and technological innovation trends—the need for data privacy without inhibiting innovation. The team and instructors prompt critical thinking while broadening the understanding of data privacy beyond the frontiers. At the end of the course, I concluded that there was a need for a mass cultural shift towards ethical use of technology."

Joanita Nagaba Co-founder, ANJ Data Management Solutions Africa Ltd.

"I love the way the course is structured with real-world examples and the critical thinking sessions. It forces us to reflect upon what is happening around us. People who have an interest in cybersecurity, as well as those that would like to gain more general knowledge, would greatly benefit from this course."

Anand Narayan Account Executive, Lenovo Canada

"I would highly recommend the course to a colleague as it was a very interesting course. The way the course is designed helped me to interact with my peers and learn from them, expending my perspective about the subjects being discussed. I also feel that I learned more about privacy and data protection in ways that I would never have thought about it if it wasn't for the real-world cases."

Camilla de Carvalho Corporate Paralegal and Governance Specialist, Concert Properties Ltd.

"I recommended the course to a colleague. I feel that the course provides a perspective into real-life privacy concerns. This perspective changes, in my context, the way I view privacy-related legislation. I think the course is especially relevant for anyone who is venturing into a career path centred around data privacy and technology. I also appreciate the interplay between privacy and technology because I got to see just how crucial it is to guard against the innovation of technology that is non-compliant with data privacy laws."

Giscard Kotelo Privacy Consultant, Deloitte

"I have truly enjoyed this course and I never thought that I would learn this much from an online course. The critical thinking has been awesome and really improved how I approach privacy issues at work and in my private life. The mix of short lectures, cases, discussions, and having to author responses as well as review and grade others (and having your responses graded) was excellent. I have already recommended this course to my colleagues."

Sofia Sjöö Principal Data Privacy Specialist, LL.M

Course Syllabus

Data privacy is a complex and multifaceted concept. This course aims to help you become a better decision maker around data-sharing practices. You will explore data privacy ethics and principles, the risks of data collection and surveillance, and the need for data privacy policy, advocacy, and monitoring.

Learning requirements: In order to earn a Certificate of Completion from Harvard Online and Harvard Business School Online, participants must thoughtfully complete all 5 modules, including satisfactory completion of the associated assignments, by stated deadlines.

Download Full Syllabus

  • Study the Stop LAPD Spying Coalition case
  • Expand your definition of privacy to acknowledge multiple definitions shaped by history, culture, and personal experience
  • Reframe privacy as a thorny and actionable issue, where technology is not value-neutral
  • Interpret the long conflict between privacy norms and technological advancement through historical and ethical lenses
  • Study the California Consumer Privacy Act (CCPA) and Henrietta Lacks and the Story of HeLa Cells cases
  • Identify key differences among laws and regulations governing privacy and understand the implications of these differences
  • Understand the harms associated with data collection and data usage, and the rise of ethics in human subjects research
  • Gain insights into how data can be used in unpredictable ways and consider how these usages can shift notions of privacy
  • Study the Data Re-identification, EdX Data for Education Research, and U.S. Census cases
  • Comprehend the implementation challenges of anonymizing a data set
  • Understand some common technical measures of data privacy, specifically k-anonymity and differential privacy (see the sketch following this list)
  • Explain the difference between anonymized and de-identified data
  • Recognize the difference between being anonymous and having your personal data in a de-identified dataset
  • Study the Algorithmic Bias in Facebook Ads, Proton, and Extremism Online cases
  • Recognize how algorithms differentiate based on otherwise protected attributes and why de-identification of the individual is not enough to protect people from harm
  • Compare the economic value of privacy at the individual and group level, and how some companies build their business model around privacy
  • Understand how personalizing each of our views of the world leads to information silos
  • Study Deepfakes and Birth and Death cases
  • Characterize deep fakes and the unique risks to privacy that they present
  • Predict future privacy issues and the harms and benefits that might occur
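The syllabus above names k-anonymity and differential privacy as common technical measures of data privacy. The following is a minimal illustrative sketch, not course material: a k-anonymity check over a set of quasi-identifiers, and a differentially private count using the Laplace mechanism. The record fields and the epsilon value are hypothetical.

```python
import random
from collections import Counter

def k_anonymity(records, quasi_identifiers):
    """Size of the smallest group of records sharing the same
    quasi-identifier values; the dataset is k-anonymous iff this is >= k."""
    groups = Counter(tuple(r[q] for q in quasi_identifiers) for r in records)
    return min(groups.values())

def dp_count(true_count, epsilon):
    """Differentially private count via the Laplace mechanism: a counting
    query has sensitivity 1, so add Laplace(1/epsilon) noise. The difference
    of two exponentials with mean 1/epsilon is Laplace-distributed."""
    scale = 1.0 / epsilon
    return true_count + random.expovariate(1 / scale) - random.expovariate(1 / scale)

# Toy records with hypothetical quasi-identifiers (zip, age_band).
records = [
    {"zip": "02138", "age_band": "30-40", "diagnosis": "flu"},
    {"zip": "02138", "age_band": "30-40", "diagnosis": "asthma"},
    {"zip": "02139", "age_band": "20-30", "diagnosis": "flu"},
]

print(k_anonymity(records, ["zip", "age_band"]))  # 1 -> not even 2-anonymous
print(dp_count(sum(r["diagnosis"] == "flu" for r in records), epsilon=0.5))
```

Here k = 1 because one (zip, age_band) group contains a single record; generalizing or suppressing quasi-identifiers raises k at the cost of accuracy, which is exactly the privacy-utility tension the course highlights.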

Earn Your Certificate

Enroll today in Harvard Online's Data Privacy and Technology course.

Still Have Questions?

Are there discounts available for this course? Are there prerequisites? What are the learning requirements? How do I list my certificate on my resume? Learn the answers to these and more in our FAQs.

Data Privacy and Technology Certificate Sample

Explore and connect to our courses via articles, webinars, and more.

Anonymity, De-Identification, and the Accuracy of Data

Professors Michael D. Smith and Jim Waldo discuss a quick primer on different kinds of anonymization/de-identification by looking at three different regulations.

Keeping Your Data Secure

Watch a webinar about the data privacy trade-offs and challenges presented by today’s ever-changing role of technology.

Jim and Mike on the Potential and Limitations of ChatGPT

Leaders in the fields of computer science and data privacy, Michael D. Smith and Jim Waldo, answer questions about ChatGPT and offer their thoughts on the risks and rewards that accompany widespread generative AI technology use.

View More Posts

Related Courses

Digital Health

Digital technologies and big data offer tremendous opportunities to improve health care.

Big Data for Social Good

Using real-world data and policy interventions as applications, this course will teach core concepts in economics and statistics and equip you to tackle some of the most pressing social challenges of our time.

Data Science for Business

Designed for managers, this course provides a hands-on approach for demystifying the data science ecosystem and making you a more conscientious consumer of information.


Digital Privacy

Identity Theft Protection Services

Identity Theft Protection

Best Identity Theft Protection Services

Popular Topics

10 Ways to Prevent Identity Theft

What Is Credit Monitoring?

How to Report Identity Theft

Identity Guard Identity Theft Protection

IdentityForce Identity Theft Protection

Learn More About the Best Identity Theft Protection Services of 2024 »

Best Free VPNs

What Is a VPN?

How To Set Up a VPN

How Does a VPN Work?

NordVPN vs. ExpressVPN

Learn More About the Best VPNs of 2024 »

Best Antivirus Software of 2024

Best Antivirus Software for Mac

Cheapest Antivirus Software

Cheapest Antivirus Software for Mac

How to Buy Antivirus Software

How Does Antivirus Software Work?

Learn More About the Best Antivirus Software of 2024 »

Password Managers

Best Password Managers of 2024

Are Password Managers Safe?

1Password vs. LastPass

Dashlane vs. LastPass

Dashlane vs. 1Password

Learn More About the Best Password Managers of 2024 »

Best Credit Monitoring Services of 2024

State of Digital Privacy in the U.S.

Cloud Security

What Is Cloud Security?

What Is Wi-Fi?

What Is Spyware?

What Is Advanced Encryption Standard?

What Is a Brute-Force Attack?

What Is a Digital Signature?

Tokenization vs. Encryption

What Is a Computer Exploit?

Surveys and Studies

Digital Privacy Survey Report 2024

Most, Least At-Risk States for ID Crimes

U.S. News & World Report ID Theft Survey


Why Should You Trust Us?

At U.S. News & World Report, we rank the Best Hospitals, Best Colleges, and Best Cars to guide readers through some of life’s most complicated decisions. Our 360 Reviews team draws on this same unbiased approach to rate products that protect your digital privacy. The team doesn't keep samples, gifts, or loans of products or services we review. In addition, we maintain a separate business team that has no influence over our methodology or recommendations.

Microsoft Azure Blog

Category: AI + Machine Learning • 11 min read

From code to production: New ways Azure helps you build transformational AI experiences

By Jessica Hawk Corporate Vice President, Data, AI, and Digital Applications, Product Marketing 

What was once a distant promise is now manifesting—and not only through the type of apps that are possible, but how you can build them. With Azure, we’re meeting you where you are today—and paving the way to where you’re going. So let’s jump right into some of what you’ll learn over the next few days. Welcome to Build 2024!

Unleashing innovation: The new era of compute powering Azure AI solutions

By Omar Khan General Manager, Azure Product Marketing

AI + Machine Learning , Announcements , Azure AI , Azure AI Studio , Azure OpenAI Service , Events

Published May 21, 2024 • 5 min read

New models added to the Phi-3 family, available on Microsoft Azure

By Misha Bilenko Corporate Vice President, Microsoft GenAI

At Microsoft Build 2024, we are excited to add new models to the Phi-3 family of small, open models developed by Microsoft.

AI + Machine Learning , Announcements , Azure AI Content Safety , Azure AI Studio , Azure OpenAI Service , Partners

Published May 13, 2024 • 2 min read

Introducing GPT-4o: OpenAI’s new flagship multimodal model now in preview on Azure

By Eric Boyd Corporate Vice President, Azure AI Platform, Microsoft

Microsoft is thrilled to announce the launch of GPT-4o, OpenAI’s new flagship model on Azure AI. This groundbreaking multimodal model integrates text, vision, and audio capabilities, setting a new standard for generative and conversational AI experiences.
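Since this announcement concerns a model served through Azure OpenAI Service, a minimal sketch of how a GPT-4o deployment is typically called with the openai Python SDK (v1.x) may be useful; the endpoint, key, API version, and deployment name below are hypothetical placeholders, not values from this post.

```python
# Minimal sketch: calling a GPT-4o deployment on Azure OpenAI Service.
# Endpoint, key, api_version, and deployment name are assumptions.
import os
from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],  # e.g. https://<resource>.openai.azure.com
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version="2024-02-01",  # assumed API version; check your resource
)

response = client.chat.completions.create(
    model="my-gpt-4o-deployment",  # the *deployment* name, not the model id
    messages=[{"role": "user", "content": "Summarize why data privacy matters."}],
)
print(response.choices[0].message.content)
```

Note that Azure OpenAI routes requests by deployment name rather than raw model id, which is why the placeholder above is a deployment you would create in your own Azure resource.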

AI + Machine Learning , Announcements , Azure AI , Azure Cosmos DB , Azure Kubernetes Service (AKS) , Azure Migrate , Azure Web PubSub , Compute , Industry trends

Published May 6, 2024 • 5 min read

Harnessing the power of intelligent apps through modernization

By Mike Hulme GM, Azure Digital Applications Marketing

81% of organizations believe AI will give them a competitive edge. Applications are where AI comes to life. Intelligent applications, powered by AI and machine learning (ML) algorithms, are pivotal to enhancing performance and stimulating growth. Thus, innovating with intelligent apps is crucial for businesses looking to gain competitive advantage and accelerate growth in this era of AI.

AI + Machine Learning , Announcements , Azure AI , Azure AI Search , Azure App Service , Azure Cosmos DB , Azure Database for PostgreSQL , Azure Databricks , Azure DevOps , Azure Health Data Services , Azure Machine Learning , Azure Managed Applications , Azure SQL Database , Customer stories , DevOps , Events , Microsoft Azure portal , Microsoft Copilot for Azure , Microsoft Defender for Cloud , Migration , SQL Server on Azure Virtual Machines

Published May 2, 2024 • 11 min read

What’s new in Azure Data, AI, and Digital Applications: Harness the power of intelligent apps

Sharing insights on technology transformation along with important updates and resources about the data, AI, and digital application solutions that make Microsoft Azure the platform for the era of AI.

Hybrid + Multicloud , Thought leadership

Published May 2, 2024 • 4 min read

Cloud Cultures, Part 8: Recapturing the entrepreneurial spirit in the American Rust Belt

By Corey Sanders Corporate Vice President, Microsoft Cloud for Industry

Excited to explore this industrious spirit and a cloud culture closer to home, we ventured to the Northeastern and Midwestern states—the famed Rust Belt—to learn how entrepreneurial adaptability is energizing both people and businesses in the area. 

Latest posts

Analytics , Announcements , Azure Kubernetes Service (AKS) , Azure Monitor , Compute , Containers

Published June 5, 2024 • 4 min read

Announcing Advanced Container Networking Services for your Azure Kubernetes Service clusters

By Deepak Bansal Corporate Vice President and Technical Fellow, Microsoft Azure , and Chandan Aggarwal Partner Group Engineering Manager, Microsoft Azure

Microsoft’s Azure Container Networking team is excited to announce a new offering called Advanced Container Networking Services. It’s a suite of services built on top of existing networking solutions for Azure Kubernetes Services (AKS) to address complex challenges around observability, security, and compliance.

AI + Machine Learning , Announcements , Azure Database for PostgreSQL , Azure Machine Learning , Azure OpenAI Service , Events , Migration

Published June 5, 2024 • 5 min read

Raise the bar on AI-powered app development with Azure Database for PostgreSQL

By Ramnik Gulati Sr. Director, Product Marketing of Microsoft Operational Databases

By harnessing the might of PostgreSQL in the cloud—with all the scalability and convenience you expect—comes Microsoft Azure Database for PostgreSQL. This fully managed service takes the hassle out of managing your PostgreSQL instances, allowing you to focus on what really matters: building amazing, AI-powered applications.

AI + Machine Learning , Azure AI , Azure AI Services , Azure OpenAI Service , Cloud Services , Partners

Published June 4, 2024 • 10 min read

Unlock AI innovation with new joint capabilities from Microsoft and SAP

By Silvio Bessa General Manager, SAP Business Unit

Learn more about the transformative synergy of the Microsoft Cloud and RISE with SAP for business.

AI + Machine Learning , Announcements , Azure VMware Solution , Migration , Partners

Published May 30, 2024 • 3 min read

Microsoft and Broadcom to support license portability for VMware Cloud Foundation on Azure VMware Solution

By Brett Tanzer Vice President, Product Management

Microsoft and Broadcom are expanding our partnership with plans to support VMware Cloud Foundation subscriptions on Azure VMware Solution. Customers that own or purchase licenses for VMware Cloud Foundation will be able to use those licenses on Azure VMware Solution, as well as their own datacenters, giving them flexibility to meet changing business needs.

Announcements , Azure Bastion , Security

Published May 30, 2024 • 4 min read

Enhance your security capabilities with Azure Bastion Premium

By Aaron Tsang Product Manager, Microsoft

Microsoft Azure Bastion Premium, now in public preview, will provide advanced recording, monitoring, and auditing capabilities for customers handling highly sensitive workloads.

AI + Machine Learning , Azure AI , Azure AI Content Safety , Azure AI Search , Azure AI Studio , Azure Cosmos DB , Azure Kubernetes Service (AKS) , Azure OpenAI Service , Events

Published May 30, 2024 • 5 min read

Celebrating customers’ journeys to AI innovation at Microsoft Build 2024

By Victoria Sykes Product Marketing Manager, Azure AI, Microsoft

From enhancing productivity and creativity to revolutionizing customer interactions with custom copilots, our customers demonstrate the transformative power of generative AI and truly brought Build 2024 to life. So, how’d they do it?

AI + Machine Learning , Industry trends , Thought leadership

Published May 29, 2024 • 4 min read

IT trends show customers need computing power to take advantage of AI

In a recent study, Microsoft surveyed over 2,000 IT professionals across ten countries on their tech readiness for and adoption of AI as well as their concerns and challenges along the way.

AI + Machine Learning , Announcements , Azure Maps , Integration

Azure Maps: Reimagining location services with cloud and AI innovation

By Nick Lee Corporate Vice President, Microsoft Maps and Local

Today, we’re announcing the unification of our enterprise maps offerings under Microsoft Azure Maps. This enables our customers to accelerate innovation by leveraging other Microsoft Azure cloud services while retaining many familiar features from Bing Maps for Enterprise.



COMMENTS

  1. Privacy in the Digital Age

    Unfortunately, this has also brought about various challenges that must be addressed. Generally, information is a vital treasure in itself, and the more one has, the better. Having valuable, intellectual, economic, and social information creates enormous opportunities and advantages for any individual.

  2. What Is Digital Privacy and Its Importance?

    Digital privacy, a subset of the broader concept of privacy, focuses on the proper handling and usage of sensitive data—specifically personal information, communication, and conduct—that are generated and transmitted within digital environments. In essence, it denotes the rights and expectations of individuals to keep personal information ...

  3. Privacy matters because it empowers us all

    Yes, institutions in the digital age have hoarded privacy power, but we can reclaim the data that sustains it, and we can limit their collecting new data. Foucault argued that, even if power constructs human subjects, we have the possibility to resist power and construct ourselves.

  4. The Battle for Digital Privacy Is Reshaping the Internet

    Now that system, which ballooned into a $350 billion digital ad industry, is being dismantled. Driven by online privacy fears, Apple and Google have started revamping the rules around online data ...

  5. What Digital Privacy Is Worth

    But there's a better way to think about digital privacy. ... In a 2019 essay, the technologist Can Duruk discussed an analogy that, he admits, is a bit cliché: Data is the new oil.

  6. Views of data privacy risks, personal data and digital privacy laws in

    1. Views of data privacy risks, personal data and digital privacy laws. Online privacy is complex, encompassing debates over law enforcement's data access, government regulation and what information companies can collect. This chapter examines Americans' perspectives on these issues and highlights how views vary across different groups ...

  7. The right to privacy in the digital age (PDF)

    With the advent of the digital age, the right to privacy and freedom of expression have become interdependent, as is for example demonstrated by the chilling effect that privacy violations can have on media freedom: monitoring of online activity, data retention and big data, Artificial Intelligence-powered ...

  8. Opinion

    The New York Times is launching an ongoing examination of privacy. We'll dig into the ideas, history and future of how our information navigates the digital ecosystem and what's at stake.

  9. Full article: Online Privacy Breaches, Offline Consequences

    Over 30 years ago, Mason (1986) voiced ethical concerns over the protection of informational privacy, or "the ability of the individual to personally control information about one's self" (Stone et al., 1983), calling it one of the four ethical issues of the information age. Since the 1980s, scholars have remained concerned about informational privacy, especially given ...

  10. Privacy in the digital age: comparing and contrasting ...

    This paper takes as a starting point a recent development in privacy-debates: the emphasis on social and institutional environments in the definition and the defence of privacy. Recognizing the merits of this approach I supplement it in two respects. First, an analysis of the relation between privacy and autonomy teaches that in the digital age more than ever individual autonomy is threatened ...

  11. Digital technologies: tensions in privacy and data

    Driven by data proliferation, digital technologies have transformed the marketing landscape. In parallel, significant privacy concerns have shaken consumer-firm relationships, prompting changes in both regulatory interventions and people's own privacy-protective behaviors. With a comprehensive analysis of digital technologies and data strategy informed by structuration theory and privacy ...

  12. The New Rules of Data Privacy

    Firms that generate any value from personal data will need to change the way they acquire it, share it, protect it, and profit from it. They should follow three basic rules: 1) consistently ...

  13. The Importance of Internet Privacy: [Essay Example], 1017 words

    Ultimately, the importance of internet privacy extends beyond individual convenience; it is a fundamental right that underpins our freedom, security, and individuality in the digital age. By recognizing the significance of internet privacy and taking meaningful steps to protect it, we can ensure that the digital landscape ...

  14. Digital privacy comes at a price. Here's how to protect it

    Most people do not consciously put a price on their online privacy. But what if they did? A 2020 survey of Argentinians, Brazilians, Colombians, Mexicans, Germans and US citizens did precisely this. The Technology Policy Institute, a think tank, asked respondents how much a company would have to pay them each month to disclose various types of personal data.

  15. Introduction: A Twenty-First Century Framework for Digital Privacy

    In Digital Divergence, David Kris examines advances in technology, and he challenges the view that balancing privacy and security is a zero-sum game. Instead, he argues that new technologies threaten both privacy and security. While privacy faces familiar threats such as mass data collection and government surveillance, Kris argues that we ...

  16. The Right to Privacy: Personal Freedom in the Digital Age: [Essay Example]

    The right to privacy is the right to be left alone, to keep one's personal information and life choices free from unwanted intrusion or surveillance. It encompasses the right to control one's personal data, maintain confidentiality in communications, and make autonomous decisions about one's body and lifestyle.

  17. Why Security and Privacy Matter in a Digital World

    Given this backdrop, it is often easy to get lost in the details of cybersecurity and privacy and the seemingly endless discussions about cyber attacks, system breaches, frameworks, requirements, controls, assessments, continuous monitoring and risk management and forget why security and personal privacy matter in an increasingly digital world.

  18. The Effects of Privacy and Data Breaches on Consumers' Online Self

    Five major streams of research inform our work in this paper: (1) technology adoption model (TAM), (2) consumer privacy paradox, (3) service failure, (4) protection motivation theory (PMT), and (5) trust. First, digital life has become such an integral part of consumers' existence that it is hard to separate the two.

  19. Data security: Research on privacy in the digital age

    National Bureau of Economic Research working paper, 2018. DOI: 10.3386/w24253. Summary: This paper looks at the risks big data poses to consumer privacy. The author describes the causes and consequences of data breaches and the ways in which technological tools can be used for data misuse. She then explores the interaction between privacy risks ...

  20. The Right to Privacy in a Digital Age ...

    Law professor and privacy law expert, Daniel J. Solove, confirmed that current privacy laws are not sufficient for digital privacy, which he called a "privacy self-management model," where users are informed of their legal rights and consent to data collection without knowing what it really entails (Weckerle, 2013, pg. 252).

  21. Managing privacy in the digital economy

    Based on the review of previous research, an ontology of digital privacy is proposed (Fig. 1), considering the psychological, economic, and technical aspects of privacy issues in the digital economy. Digital privacy is defined as the selective psychological and technical control of access to the digital self in the form of online profiles, personal data, and digital assets.

  22. Defending Privacy: a Pillar of Autonomy and Democracy

    The advent of digital technologies has revolutionized data collection, storage, and analysis, presenting unprecedented challenges to privacy. Corporations collect vast amounts of personal data, often without explicit consent, for purposes ranging from targeted advertising to behavioral analysis.

  23. Social Media Users' Legal Consciousness About Privacy

    In thinking about privacy, two emerging phenomena are of particular interest: on the one hand, technological architectures of social media push the boundaries of disclosure—both voluntary and involuntary—accompanied by privacy policy in the terms and conditions (T&C) of use. In response, the question of informed consent has entered European law, to counterbalance a perceived disparity in ...

  24. Digital Media and the Issue of Privacy Critical Essay

    The readings focus on digital media and its impact on the issue of privacy. The article by Mills (2015) focuses on the old conflicts that are witnessed in the new media. Just like in the past, the top celebrities and politicians try to hide their personal lives from the public. However, media is keen to expose them due to public interest.

  25. Data Privacy and Technology

    Jim Waldo is the Gordon McKay Professor of the Practice of Computer Science in the School of Engineering and Applied Sciences at Harvard, where he teaches courses in distributed systems and privacy; the Chief Technology Officer for the School of Engineering and Applied Sciences; and a Professor of Policy teaching on topics of technology and policy at the Harvard Kennedy School.

  26. Right to Privacy and Data Protection Under Indian Legal Regime

    Privacy has emerged as a basic human right across the globe, and in India too it has been recognized as a Fundamental Right under Article 21 of the Indian Constitution.

  27. Why we still care about Kafka

    It is a line — touching, life-affirming — inconceivable in the entirety of Kafka's oeuvre, because Kafka held being human in contempt. This human rhythm he could only read about, never hear ...

  28. Digital Privacy Reviews & Information

    At U.S. News & World Report, we rank the Best Hospitals, Best Colleges, and Best Cars to guide readers through some of life's most complicated decisions. Our 360 Reviews team draws on this same ...

  29. Microsoft Azure Blog

    By Jessica Hawk Corporate Vice President, Data, AI, and Digital Applications, Product Marketing. Sharing insights on technology transformation along with important updates and resources about the data, AI, and digital application solutions that make Microsoft Azure the platform for the era of AI. Hybrid + Multicloud, Thought leadership.