Is privacy a universal human right or a modern Western invention? We are exploring why some cultures guard their data like a state secret while others practically broadcast their medical prescriptions in crowded waiting rooms.
It is a fascinating paradox, Corn. On one hand, you have the rise of quantum-resistant encryption and global privacy laws like the General Data Protection Regulation. On the other, you have wildly different cultural norms that make those technical tools feel like overkill in some places and a baseline necessity in others. My name is Herman Poppleberry, and today we’re diving into the deep end of the privacy pool.
And I am Corn. Today’s prompt from Daniel is about the cultural and philosophical dimensions of privacy. He’s looking at the contrast between those high-security privacy tools like Proton Mail and PGP encryption versus the more informal, almost communal approach to personal information you see in places like Israel.
It’s a great prompt because it touches on the very core of how we organize society. By the way, today’s episode is powered by Google Gemini 3 Flash.
I hope the AI isn’t eavesdropping on our thoughts about privacy. That would be a bit on the nose, wouldn’t it? But seriously, Herman, Daniel brings up a point that hits home. He mentioned how in some clinics, you might hear a nurse shout out a prescription for something personal like Lexapro in front of everyone. In a place like Germany or Switzerland, that would probably trigger a national crisis or at least a very sternly worded letter from a data protection officer.
The cultural divide is massive. In the West, specifically in Germanic and Nordic cultures, privacy is often treated as an extension of individual autonomy. If you don’t have a private sphere, you aren’t truly a free individual. But as Daniel noted, in Israel or even parts of the Mediterranean and Middle East, the boundaries are much more porous. There is a "we are all in this together" mentality that makes the "me" less of a walled fortress.
It’s almost like a "social transparency" tax you pay for living in a tight-knit community. If everyone knows your business, everyone is also there to help when things go sideways. But does that mean they don't value the option of being private, or is the social cost of secrecy just too high in those cultures?
I think it’s a bit of both. In a high-trust, high-density society, trying to keep a secret can actually be seen as a sign of antisocial behavior. If you’re building a wall, people ask, "What are you hiding from us?" Whereas in London or New York, the wall is the default. If you don't have a wall, people think you're crazy.
So, let’s frame the big question here. Are we actually hardwired for privacy? Is it something in our DNA, or is it just a fancy social construct we dreamt up once we got enough money to afford separate bedrooms?
That is the million-dollar question. There is a popular school of thought in anthropology that suggests privacy is a modern luxury. The argument goes that hunter-gatherer societies lived in such close proximity—sleeping in the same huts, hunting together, eating together—that the concept of a "private life" simply didn't exist. You couldn't hide anything even if you wanted to.
I can see that. If you're living in a tent with ten other people, trying to keep a secret is probably a full-time job you don't have time for because you're too busy not getting eaten by a lion. But does that mean they didn't value it, or just that they couldn't achieve it?
I side with the evolutionary psychologists on this one, and I think Daniel is right to be skeptical of the "modern construct" theory. Even in those high-density communal living situations, humans developed incredibly sophisticated social "cloaking" mechanisms. Think about how people use coded language and taboos, or even the simple act of turning one's back.
Right, like the "ignore the person crying in the corner" rule. We have internal walls even when the external ones are missing. It’s like a psychological curtain we pull shut when the physical one isn't available.
Primatologists have observed similar behaviors in chimpanzees and bonobos. There is a clear sense of territoriality, not just over physical space, but over information. Animals will hide food or sneak away to mate to avoid the gaze of a dominant alpha. That is a rudimentary form of privacy. It’s about managing social signaling to ensure your own survival and reproductive success. If everyone knows everything you’re doing, you lose your competitive edge.
So your "nerdy expert" take is that privacy is an evolutionary survival strategy, not just a middle-class obsession with fences?
I would argue it is innate. We have a fundamental biological drive to control how we are perceived by the group. The tools we use to do that change—from a thick bush in the savannah to PGP encryption in a digital inbox—but the drive remains the same. We want to choose what we reveal because information is power. If a rival knows your weakness, they can exploit it. If the group knows your surplus, they might demand it. Privacy is the buffer that allows the individual to negotiate their terms with the collective.
It’s funny you mention the bush in the savannah, because now we have companies like Proton and Mullvad basically selling us digital bushes. Daniel mentioned Mike, a Proton user, and the fact that these companies often cluster in places like Switzerland or Sweden. Why there? Is it just the chocolate and the meatballs, or is there something deeper in the soil?
It goes back to that cultural variance we touched on. Switzerland is the perfect example. They have a centuries-old tradition of neutrality and discretion. It’s baked into their national identity. The Swiss banking secrecy laws didn't happen by accident; they were a reflection of a culture that believes what is mine is mine, and the state has no business poking around in it unless there’s a very good reason.
But wait, didn't Switzerland change some of those banking laws recently under international pressure? Does that mean their "privacy culture" is eroding, or is it just moving from the bank vault to the data center?
That’s an astute observation. While the banking sector faced pressure to increase transparency for tax purposes, the legal framework for personal data remains incredibly robust. Article 13 of the Swiss Federal Constitution literally guarantees the right to privacy. When Proton Mail sets up shop in Geneva, they aren't just getting a scenic view; they are getting a legal shield that requires a Swiss court order—one of the hardest to get in the world—before any data can be touched.
And then you have Germany, where the history of the Stasi and the German Democratic Republic turned privacy from a cultural preference into a survival necessity. When you’ve lived through a regime where your neighbor is reporting your dinner conversations to the secret police, you tend to get pretty intense about end-to-end encryption.
That historical trauma is a huge driver. It’s why German privacy laws are among the most stringent in the world. They view data protection as a fundamental human right because they’ve seen what happens when that right is stripped away. It’s not just about hiding "bad" things; it's about protecting the dignity of the individual against the overreach of the collective or the state. In Germany, the concept of "Informational Self-Determination" is a constitutional principle. It means you decide what happens to your data, not the company that collected it.
It’s a sharp contrast to the Israeli example Daniel gave. If the culture is informal and people "stop caring" about their prescriptions being announced, does that mean they’ve lost that "innate" drive for privacy, or just that they’ve redefined what counts as sensitive?
I think it’s a redefinition. In high-context cultures, the group identity is so strong that the "shame" or "secrecy" around things like health might be lower because the social support network is higher. If everyone knows you’re on Lexapro, maybe they’ll be more patient with you. In a hyper-individualistic Western city, if someone knows you’re on Lexapro, you might worry it affects your job promotion. The stakes change based on how much you rely on the group versus the system.
That’s a really sharp point. Privacy is a shield, but you only need a shield if you feel like people are throwing spears. If you feel like you're among friends, you might leave the shield at home. But let's talk about the digital shields. Daniel mentioned PGP and Proton Mail. For the folks who aren't familiar with the guts of it, PGP stands for Pretty Good Privacy, which has to be the most modest name for a cryptosystem ever.
It really is. It was created by Phil Zimmermann in 1991, and the name is a nod to Ralph's Pretty Good Grocery, a fictional store from the radio show "A Prairie Home Companion." But the tech is anything but modest. It uses a combination of symmetric-key and public-key encryption, what cryptographers call a hybrid scheme.
Before you go full "Professor Poppleberry" on us, give me the "why it matters" version. Why is Mike using this instead of just a regular old email provider?
Because with regular email, your provider—let’s say it’s a big tech giant—can technically read your messages. They need to scan them for spam, for ads, or just to keep them on their servers. PGP, and by extension Proton Mail, uses what’s called "Zero Access Encryption." When Mike sends an email, it’s encrypted on his device using the recipient’s public key. Only the recipient has the private key to unlock it. Proton, the company sitting in the middle, just sees a jumble of random characters. They couldn't read Mike's secrets even if a government agent walked into their Geneva office with a warrant.
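For the code-curious, here is roughly what that hybrid scheme looks like. This is a minimal sketch using Python's cryptography library, showing the general shape of PGP-style encryption rather than Proton's actual code; the key sizes, names, and message are made up for illustration.

```python
# Sketch of PGP-style hybrid encryption with the `cryptography` library
# (pip install cryptography). Real PGP adds key signing, compression,
# and armoring; this shows only the core idea: a fast one-time symmetric
# key encrypts the message, and the recipient's public key wraps that key.
import os
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

# The recipient ("Mike") generates a keypair once; only the public half is shared.
private_key = rsa.generate_private_key(public_exponent=65537, key_size=3072)
public_key = private_key.public_key()

def encrypt(message: bytes, recipient_public_key):
    session_key = AESGCM.generate_key(bit_length=256)  # one-time symmetric key
    nonce = os.urandom(12)
    ciphertext = AESGCM(session_key).encrypt(nonce, message, None)
    # Only the holder of the matching private key can recover the session key.
    wrapped_key = recipient_public_key.encrypt(
        session_key,
        padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                     algorithm=hashes.SHA256(), label=None),
    )
    return wrapped_key, nonce, ciphertext

def decrypt(wrapped_key, nonce, ciphertext, recipient_private_key):
    session_key = recipient_private_key.decrypt(
        wrapped_key,
        padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                     algorithm=hashes.SHA256(), label=None),
    )
    return AESGCM(session_key).decrypt(nonce, ciphertext, None)

wrapped, nonce, blob = encrypt(b"meet me at the usual place", public_key)
assert decrypt(wrapped, nonce, blob, private_key) == b"meet me at the usual place"
```

The design point: symmetric encryption is fast enough for arbitrarily large messages, while the public-key step solves the problem of getting that one-time key to the recipient without anyone in the middle seeing it.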
It’s the digital equivalent of a wax seal on a letter, but the wax is made of math that would take a supercomputer a billion years to crack. But how does that work if Mike is emailing someone who isn't using Proton? Doesn't the seal break once it hits a Gmail inbox?
That’s the "walled garden" problem. If Mike sends an unencrypted email to a Gmail user, it travels across the open web like a postcard. To solve this, Proton lets Mike send "Password-Protected Emails." He sends a link, and the recipient enters a pre-shared password to decrypt the message right in their own browser. It’s a bit more friction, but it keeps the "wax seal" intact all the way to the destination.
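The underlying pattern there is an old one: derive the encryption key from the shared password itself. Here is a minimal sketch with Python's cryptography library, an illustration of the idea rather than Proton's implementation; the password and message are invented.

```python
# Password-based encryption: stretch a pre-shared password into a key
# with PBKDF2, then encrypt with it. Requires `cryptography`.
import os, base64
from cryptography.fernet import Fernet
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.kdf.pbkdf2 import PBKDF2HMAC

def _derive_key(password: str, salt: bytes) -> bytes:
    kdf = PBKDF2HMAC(algorithm=hashes.SHA256(), length=32,
                     salt=salt, iterations=600_000)  # slow on purpose
    return base64.urlsafe_b64encode(kdf.derive(password.encode()))

def seal(message: bytes, password: str):
    salt = os.urandom(16)  # random salt stored alongside the ciphertext
    return salt, Fernet(_derive_key(password, salt)).encrypt(message)

def open_sealed(salt: bytes, token: bytes, password: str) -> bytes:
    return Fernet(_derive_key(password, salt)).decrypt(token)

salt, token = seal(b"your test results are in", "hummus-falafel-42")
assert open_sealed(salt, token, "hummus-falafel-42") == b"your test results are in"
```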
And Daniel mentioned something called Proton Lumo, which is their new AI integration. This is where it gets really technical and cool. Usually, AI and privacy are enemies. To give you an AI summary of your emails, the AI has to "see" the text. But Proton is trying to do this in a way where the decryption happens in a volatile, secure memory environment. The message is decrypted, the LLM processes it, the response is generated, and then the original data is wiped from that temporary space. It never touches a permanent log or a training set.
It’s a massive engineering challenge. Most AI models "learn" from the data they process. Proton has to ensure that Lumo is "forgetful" by design. It’s like a witness who is allowed to look at a crime scene to help the police, but then has their memory wiped immediately afterward so they can't tell anyone else what they saw.
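We are speculating about Lumo's internals here, but the design contract is easy to sketch. Here is a toy Python version of that "forgetful" pattern, purely illustrative: the decrypt function and model are hypothetical placeholders, and Python cannot truly guarantee memory is scrubbed, so a real system would enforce this at a lower level.

```python
# Toy illustration of "forgetful by design" processing, not Proton's
# actual Lumo architecture. The invariant we want: plaintext exists only
# inside this function, is never logged or persisted, and is overwritten
# before control returns to the caller.
def summarize_ephemerally(encrypted_blob: bytes, decrypt, model) -> str:
    # `decrypt` and `model` are hypothetical callables standing in for
    # the real decryption routine and the language model.
    plaintext = bytearray(decrypt(encrypted_blob))  # mutable buffer we can wipe
    try:
        return model(bytes(plaintext))  # inference only; nothing is trained on
    finally:
        for i in range(len(plaintext)):  # best-effort zeroing of the buffer
            plaintext[i] = 0
        # Caveat: CPython may have made hidden copies along the way, so
        # production systems do this in locked, non-swappable native memory.
```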
It’s like a "Mission Impossible" message that self-destructs after the AI reads it. But doesn't this bring us back to Daniel's second big question? The "bad actor" dilemma. If we make this kind of "unbreakable" encryption available to everyone—including Mike, but also including actual villains—is that a net positive for society?
This is the core of the crypto-wars. Governments hate "dark" spaces. They want a "backdoor" for the "good guys" to catch the "bad guys." But as we’ve discussed before, and as Daniel noted, history shows there is no such thing as a safe backdoor. A backdoor is just a vulnerability that hasn't been found by the wrong person yet. If you build a key that opens every door in the city and give it to the police, eventually a thief is going to steal that key, or a corrupt cop is going to sell it.
It’s the "golden key" fallacy. You can't have a lock that only honest people can open. Math doesn't care about your moral compass. If the math is weak for the FBI, it's weak for a hacker in a basement in another country.
And that is why there is a massive societal interest in making these tools widely available. We have to protect the ninety-nine percent of legitimate users, journalists, activists, and ordinary citizens like Mike, even if it means a few bad actors use the same tools. The alternative is a world of total surveillance, which, as we’ve seen in historical examples, inevitably leads to the suppression of dissent and the erosion of freedom. Think about whistleblowers like Edward Snowden. Without encryption, his revelations about mass surveillance would never have reached the public.
It’s interesting that Daniel mentioned Mullvad VPN too. They are famous for being almost suspiciously transparent. You don't even give them an email address to sign up. You just generate a random account number and pay with cash or crypto if you want. It’s a very "no questions asked" Swedish approach to privacy.
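The signup really is just a number. Here is a little Python sketch of what identity-free account generation could look like; this is hypothetical, not Mullvad's actual code.

```python
# Hypothetical sketch of identity-free signup: the account "name" is a
# random numeric token. No email, no password, nothing tying it to a person.
import secrets

def new_account_number(digits: int = 16) -> str:
    """Return a cryptographically random numeric token, grouped for readability."""
    raw = "".join(str(secrets.randbelow(10)) for _ in range(digits))
    return " ".join(raw[i:i + 4] for i in range(0, digits, 4))

print(new_account_number())  # e.g. "8274 1093 5521 0488"
```

Because the token comes from a cryptographic randomness source, it can double as the account's only credential, and the provider learns nothing about who holds it.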
Mullvad is the gold standard for transparency. Most VPNs are owned by giant, opaque holding companies often based in jurisdictions with zero oversight. Mullvad publishes their audits, they show exactly who owns them, and they’ve even been raided by the police before. When the Swedish police showed up to seize their servers in 2023, Mullvad basically said, "Go ahead, but we don't have any logs for you to take." The police left empty-handed because the system was designed from day one to be "blind" to user activity.
That raid is a wild story. Imagine the police walking into a server room with a warrant and the tech guys just shrugging and saying, "We literally don't know who our customers are." That takes a lot of guts and a lot of very careful engineering.
It’s building privacy into the architecture rather than just promising it in a privacy policy that nobody reads anyway. But let’s circle back to the culture. Why is Mike, who lives in a world of informal privacy in Israel, choosing to use these high-end Swiss and Swedish tools? Is he a "privacy prepper," or is he just ahead of the curve?
I think he’s reacting to the "Radical Transparency Paradox." We live in an era where everything is recorded, indexed, and searchable. In a small village or a traditional community, if you "stop caring" about people knowing your business, it’s because you know those people. You have a social contract with them. But on the internet, "everyone" isn't your neighbor. "Everyone" is an algorithm, a data broker, or a malicious actor ten thousand miles away.
Right. I don’t mind if the nurse knows I’m on Lexapro, but I definitely mind if a life insurance algorithm finds that out and triples my premiums without telling me why. Or if a future employer does a deep dive into my medical history before they even offer me an interview.
That’s the "Visibility Trap." We’ve transitioned from "social privacy" among humans to "data privacy" against machines. The informal norms that work in a physical clinic in Jerusalem don’t translate to the digital world. In the digital world, if you don't have a wall, you're not just "informal"—you're a product to be sold. The machine doesn't have empathy; it only has data points.
So, for our listeners who are hearing this and thinking, "Maybe I should be more like Mike," what are the actual takeaways here? If privacy is both a biological drive and a cultural choice, how do we navigate it in 2026?
First, you have to realize that where your tools are built matters. There is a reason Proton is in Switzerland and Mullvad is in Sweden. These countries have legal frameworks that prioritize the individual over the state or the corporation. If you use a privacy tool based in a country with weak data protection laws, the best encryption in the world won't save you from a subpoena that forces the company to log your data moving forward.
So, step one: Check the "jurisdiction" of your digital home. If it’s a "Five Eyes" country—like the US, UK, Canada, Australia, or New Zealand—maybe don't keep your most sensitive stuff there?
It’s a consideration for sure. Those countries have deep intelligence-sharing agreements that can bypass local laws. Step two is embracing "Zero Trust." Don't just trust a company because they have a pretty website and say they "value your privacy." Look for tools that use end-to-end encryption by default. PGP might sound old school, but it’s still one of the most robust ways to ensure that only the intended recipient can read your words. Proton makes it easy by hiding the complexity, but the underlying math is what provides the security.
And what about the "bad actor" thing? Should we feel guilty for using tools that could be used by villains?
Absolutely not. You should feel empowered. Privacy is a public good, like clean water or a paved road. Yes, a criminal can drive a getaway car on a paved road, but we don't tear up the asphalt because of it. We need those roads for the economy, for emergency services, and for our daily lives. Quantum-resistant encryption is the "paved road" of the future. As quantum computers mature, they are expected to break the public-key algorithms, like RSA, that secure most of today's internet. If we don't make quantum-resistant tools widely available now, our entire financial and personal infrastructure will be exposed when that day arrives.
It’s a proactive defense. We're building the dikes before the flood hits. I like that. It’s not about being "sneaky"; it's about being "secure." It’s the difference between locking your front door and living in an underground bunker. One is just common sense.
One thing that fascinates me about the cultural side is how these norms might shift as we get more "augmented." If we all eventually have some form of neural interface or constant AR recording, does the concept of "private thought" become the next frontier?
Oh man, don't give the tech bros any ideas. "Proton Mind—end-to-end encryption for your shower thoughts." I can see the ad now. But seriously, if we have brain-computer interfaces, is there any privacy left?
That is the ultimate battleground. We are already seeing research into "brain-reading" AI that can reconstruct images you are looking at just from fMRI data. If we don't establish the cultural and legal "right to a private mind" now, we might find ourselves in a world where even our silent protests are visible to the state. It sounds like sci-fi, but the foundations are being laid today.
We joke, but we’re already seeing "privacy-preserving AI." The fact that we can even discuss things like "Federated Learning," where an AI learns from your data without ever actually seeing your data, shows that we are finding technical solutions to cultural problems. We are trying to have our cake and eat it too—the benefits of a connected, smart world without the "Visibility Trap" of total surveillance.
It’s a delicate balance. Federated learning is brilliant because the data stays on your phone; only the model updates, the "mathematical insights," are sent to the central server. It’s like a group of students studying for a test in separate rooms, and then only sharing their notes, not their personal diaries.
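To make the "notes, not diaries" picture concrete, here is a minimal federated-averaging sketch in plain NumPy. It is a toy linear-regression setup we invented for illustration, not any production system; real deployments layer secure aggregation and differential privacy on top.

```python
# Toy federated averaging: each client takes a gradient step on its own
# private data, and the server only ever sees the averaged weights.
import numpy as np

def local_update(weights, X, y, lr=0.1):
    """One gradient step of linear regression on a client's private data."""
    grad = 2 * X.T @ (X @ weights - y) / len(y)
    return weights - lr * grad

rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0])

clients = []
for _ in range(3):
    X = rng.normal(size=(50, 2))                     # stays on the "phone"
    y = X @ true_w + rng.normal(scale=0.1, size=50)  # private labels
    clients.append((X, y))

weights = np.zeros(2)
for _ in range(100):
    updates = [local_update(weights, X, y) for X, y in clients]
    weights = np.mean(updates, axis=0)  # server aggregates, never sees raw data

print(weights)  # converges toward [2.0, -1.0]
```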
It feels like a constant arms race between our desire to be part of the group and our need to be our own person. Daniel’s example of the clinic in Israel is a reminder that humans are resilient. We adapt to our surroundings. If the walls are thin, we lower our voices or we stop caring about the noise. But in the digital world, the walls aren't just thin—they’re often made of glass and have microphones built-in.
And that is why we have to be intentional. We have to choose our tools based on the reality of the environment. If you’re in a "high-noise" digital environment, you need "high-silence" tools.
I love the idea of "high-silence" tools. That’s a great way to think about it. It’s not about hiding; it’s about creating a space where you can actually hear yourself think without an algorithm whispering in your ear.
To Daniel’s point about being "hardwired," I think the most human thing we can do is demand that space. Whether you're a hunter-gatherer finding a quiet spot by the river or Mike using PGP in 2026, you're asserting your right to be more than just a data point. You're asserting your personhood.
That’s a deep thought for a donkey, Herman. I’m impressed. Usually, you’re just worried about whether the encryption algorithm is "elegant" enough.
Hey, elegance is a form of truth! But you’re right, the human element is what makes it matter. If we didn't have souls, we wouldn't need privacy. We’d just be open-source code walking around.
So, for everyone listening, maybe take a page out of Mike's book. Audit your footprint. Check where your data "lives." Is it in a jurisdiction that respects you, or is it in a digital waiting room where someone is shouting your business to the crowd? Think about the "analog" version of your digital life. If you wouldn't say it in a crowded elevator, why are you sending it in an unencrypted email?
And don't be afraid of the "weird" tools. Mullvad, Proton, PGP—they might seem "nerdy," but they are the fences of the twenty-first century. And as the old saying goes, good fences make good neighbors—even if those neighbors are on the other side of the planet.
Or even if those neighbors are LLMs like the one writing this script. We see you, Gemini!
It really is a fascinating time to be alive. We have more power to protect our privacy than at any point in history, but we also face more threats to it. The "cultural leader" societies like Switzerland and Germany are giving us the blueprints, but it’s up to us to actually build the house.
I think we’ve covered a lot of ground today. From primate territoriality to Swiss bank vaults to the "analog hole" of an Israeli doctor's office. It all points back to one thing: privacy isn't just a setting on your phone. It's a way of relating to the world.
I couldn't have said it better myself. It’s about agency. The power to choose.
What most people don't realize is that once you lose that power, it's incredibly hard to get back. You can't "un-ring" the bell of a data breach or a leaked secret. You can only build better bells for next time. And hopefully, those bells are encrypted.
And on that note, I think it’s time to wrap this one up. We’ve given the listeners plenty to chew on.
Just don't shout their prescriptions while they're chewing. That would be rude.
Noted. No shouting in the "My Weird Prompts" clinic.
Thanks as always to our producer, Hilbert Flumingtop, for keeping the gears turning behind the scenes.
And a big thanks to Modal for providing the GPU credits that power this show. Without them, we’d just be two brothers talking to a wall.
A wall with very thin privacy, I might add. This has been My Weird Prompts. If you’re digging the show, head over to myweirdprompts dot com to find our RSS feed and all the ways to subscribe.
We’ll catch you in the next one. Stay secure, everyone.
And keep it weird. Goodbye!