
Why Should We Care About the Privacy Gap?
We say we care about privacy, but our actions tell a different story. This gap between what we value and what we do raises serious questions about our choices online. Why does this disconnect exist, and why should we care?
TL;DR: When we have a privacy gap:

- We're living in contradiction with our own values
- We split our identity between what we believe and what we do
- We let others define who we are through "psychometric assessment"
- We lose the chance to experiment and grow without judgment
Privacy as a Public Health Issue
My quiz focuses on privacy as it relates to health behaviors, but I would be remiss to ignore how privacy in general has significant implications for public health. For example, a lack of online privacy has led to financial discrimination by lenders. Lenddo, an online lender that operated in countries like the Philippines, Colombia, and Mexico, made loan decisions based on applicants' social media history. As part of the process, loan applicants had to link their Facebook accounts during the application. If an applicant was friends with someone who had previously defaulted on a Lenddo loan, their own chances of securing a loan took a hit. In 2017, Lenddo merged with the Entrepreneurial Finance Lab (EFL); the combined company now sells its credit-assessment technology to more than 50 banks globally. On its website, the company advertises "psychometric assessment" as "the only truly universal credit information."
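To make the mechanism concrete, here is a minimal sketch in Python of how a social-graph screening rule like the one described above could work. It is a hypothetical illustration, not Lenddo's actual algorithm: the base score, penalty, and approval threshold are all invented for the example.

```python
# Hypothetical sketch of social-graph credit screening.
# NOT Lenddo's real algorithm; all numbers are assumptions for illustration.

BASE_SCORE = 600
FRIEND_DEFAULT_PENALTY = 40   # assumed penalty per friend who defaulted
APPROVAL_THRESHOLD = 580      # assumed minimum score for approval

def social_graph_score(applicant, friends, defaulted):
    """Start from a base score and subtract a penalty for every friend
    in the applicant's social graph who previously defaulted."""
    defaulted_friends = [f for f in friends.get(applicant, []) if f in defaulted]
    return BASE_SCORE - FRIEND_DEFAULT_PENALTY * len(defaulted_friends)

def is_approved(applicant, friends, defaulted):
    return social_graph_score(applicant, friends, defaulted) >= APPROVAL_THRESHOLD

# Example: Bob is judged partly on his friends' repayment history.
friends = {"bob": ["alice", "carol"], "dana": ["erin"]}
defaulted = {"carol"}  # Carol defaulted on a past loan

print(is_approved("bob", friends, defaulted))   # False: penalized for Carol's default
print(is_approved("dana", friends, defaulted))  # True: no defaulted friends
```

Notice that nothing in this calculation reflects the applicant's own behavior; Bob's outcome is determined entirely by whom he happens to know.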
Yet I would like to pose this question: is a person's online behavior—often driven by polarizing social media algorithms and the gamification of communication—a true representation of their trustworthiness? If we cannot know, is it ethical to limit an individual's access to financial capital, given the deleterious public health outcomes of bad credit?
I would argue not. Thus, to avoid having our autonomy limited, I believe we must take action to protect our privacy and close the gap.
For example, think about how knowing you're being watched changes your behavior. Studies show people conform more, take fewer risks, and express less controversial opinions when they know they're under observation. That's not freedom.
Privacy gives us room to figure out who we are without constant audience judgment. It's the space where you can search for information about a health concern without it following you through ads. Where you can message a friend without algorithms analyzing your conversation.
Without this space, we become performers rather than authentic beings. We lose the ability to try on different identities and to speak freely without fear of future employers or algorithms judging us. Here are more negative outcomes of privacy erosion:
- Economic harm: Data breaches lead to identity theft, costing the average victim 200+ hours to resolve
- Discrimination: Algorithms using your data make decisions about loans, jobs, and insurance
- Manipulation: Your digital profile determines what news you see and which political messages target you
- Loss of opportunity: Bad data in hiring systems can cost you jobs without you ever knowing why
- Mental health effects: Constant online surveillance creates anxiety and self-censorship
- Relationship damage: Privacy invasions can erode trust between people
Utility and Happiness
In addition, think about utility—the greatest good for the greatest number. When we click "accept all cookies" despite valuing privacy, we're actually working against our own happiness.
It's a simple trade-off that millions make daily:

- We gain: immediate access to a service
- We lose: control over personal information that can be used against us for years

The small conveniences we gain might not make up for what we lose (such as having a loan denied based on online data). In addition, we face the anxiety of knowing our data could be exposed in the next breach, the discomfort of seeing ads for products we only ever discussed out loud, and the realization that our health insurer may know what we've been searching about medical symptoms.
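To put the utilitarian framing into rough numbers, here is a back-of-envelope sketch in Python. Every value in it is an assumption invented for illustration; the point is only that a small immediate gain can be dwarfed by a small probability of large future losses, compounded over the years the data stays in circulation.

```python
# Back-of-envelope expected-utility comparison for "accept all cookies".
# Every number below is an assumption for illustration, not measured data.

immediate_gain = 1.0          # assumed utility of instant access to the service

p_breach = 0.05               # assumed yearly chance the data is exposed
breach_cost = 50.0            # assumed utility lost to a breach (time, money, stress)
p_discrimination = 0.01       # assumed yearly chance the data affects a loan or job
discrimination_cost = 200.0   # assumed utility lost to such a decision
years_retained = 10           # assumed years the data stays in circulation

expected_loss = years_retained * (
    p_breach * breach_cost + p_discrimination * discrimination_cost
)
net_utility = immediate_gain - expected_loss

print(f"Expected loss over {years_retained} years: {expected_loss:.1f}")  # 45.0
print(f"Net utility of clicking 'accept all': {net_utility:.1f}")         # -44.0
```

Under these made-up numbers, the daily click is a losing trade. The real probabilities are unknowable in advance, which is precisely part of the problem.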
True utility would mean enjoying technology's benefits without the constant background worry about who's watching and profiling. When we fail to protect our privacy despite valuing it, we're choosing a path that ultimately makes us less happy and less autonomous.
Dignity and Autonomy
A deontologist might say privacy matters because we're not just tools to be used by major corporations as a means to some other end (in this case, profit). We have inherent worth.
When tech companies hide privacy controls behind layers of menus while making the "share everything" option a single convenient button, they're not treating us with respect. They're using our cognitive biases against us to extract data at enormous profit. This is unsurprising, as personal data has reportedly surpassed oil as the world's most valuable commodity; the Big Data analytics market alone is now worth over $307.5 billion.
Consider social media platforms that track your browsing across the entire internet, building a profile so detailed it can predict your actions better than your friends can, or apps that demand access to your microphone when they have no reasonable need for it. These systems aren't designed to serve us; they're designed to extract value from us. They manipulate rather than empower.
Our dignity takes a hit when we can't make real choices about our information. Autonomy requires meaningful options, not illusions of choice. When the deck is stacked against privacy through dark patterns and manipulation, we're not acting freely.
What would genuine respect for our autonomy look like? I believe it would consist of privacy controls being as prominent as "accept" buttons and data collection that's opt-in rather than opt-out. Until we have these, our autonomy remains compromised.
How Companies Exploit Our Thinking
We want to protect our privacy, but why is it so hard? It's by design.
Common tactics used to get your data:
- Tiny "decline" buttons vs. huge, colorful "accept" buttons
- Privacy settings buried 5+ clicks deep in account menus
- Time-pressure: "Only 24 hours left to claim your account!"
- Guilt-inducing language: "No thanks, I don't want better service"
- Lengthy privacy policies
- Defaults set to maximum data sharing
- "Free" services with a hidden cost in data collection
For example, Facebook's "privacy checkup" guided users through a few simple settings while leaving dozens untouched. These aren't accidents or oversights. Companies employ behavioral psychologists to study exactly how to wear down your resistance and create decision fatigue. They know we'll take the path of least resistance, since almost no one will read a 20,000-word terms of service (at a typical reading speed, well over an hour of reading).
The most troubling part? This manipulation happens most to those who can least afford to lose their privacy: the elderly, the less tech-savvy, those with less time to fight the system. It's a system rigged to produce exactly this outcome.
Difficulty Finding a Balance
Aristotle's concept of the "golden mean" applies perfectly to privacy choices. He taught that virtue lies in finding a balance between extremes: not too much, not too little. Privacy works the same way. Too much privacy isolates us and cuts us off from digital benefits, while too little leaves us vulnerable to exploitation. A balanced approach requires thoughtful choices about what to share and when. The challenge is that digital systems are designed to push us toward oversharing; companies make privacy protection deliberately cumbersome while making data collection the easy default.
Most of us lean toward oversharing because the digital ecosystem pushes us that way. Finding the golden mean requires swimming against a powerful current. The "convenient" option is almost never the privacy-protecting one. For example, look at fitness trackers that share your health data with insurers or smart TVs that watch what you're watching. The middle ground gets harder to find when companies hide how much they're taking.
True balance would mean having the benefits of digital life without constant surveillance. It would mean services that respect boundaries rather than constantly push to eliminate them.
The Social Side of Privacy
Privacy affects everyone around you. When you use an email service that scans your contacts or submit your DNA to a genealogy company, you're making privacy choices for others too. When enough people give up privacy:
- Data collection becomes the unquestioned default
- Companies get bolder with how they use our information
- Government surveillance seems more normal and acceptable
- Privacy becomes harder for everyone to maintain
- The cost of opting out grows (try functioning in modern society without a smartphone)

Imagine making rules for society without knowing your place in it—what philosophers call the "veil of ignorance." Would you create today's privacy-invading systems if you didn't know whether you'd be powerful or vulnerable in that society? Probably not. You'd want protections that work for everyone.
The social contract should protect all of us. But right now, it's failing. Each privacy surrender makes it harder for others to protect themselves. Your choices matter not just for you, but for everyone.
