Episode #555

Before They Can Click: The Ethics of Sharenting

Explore the ethical and technical landmines of sharing children's photos online, from metadata leaks to the rise of AI-generated deepfakes.

Episode Details

Duration: 25:24
Pipeline: V4
TTS Engine: LLM

AI-Generated Content: This podcast is created using AI personas. Please verify any important information independently.

In the latest episode, hosts Herman Poppleberry and Corn tackle one of the most pervasive yet overlooked ethical dilemmas of the modern era: the digital footprints created for children by their parents. This phenomenon, often referred to as "sharenting," has evolved from a simple way to stay connected with family into what Herman describes as a "massive, unplanned social experiment." As the duo explores the landscape of 2026, they reveal that the stakes of sharing a "cute photo" have never been higher.

The Scale of the Digital Footprint

Herman opens the discussion with a staggering statistic: by the time the average child reaches the age of five, they may already have up to 1,500 photos of themselves online. In a world where 75% of parents share their children’s lives on social media—often including real names—the amount of data available to third parties is unprecedented.

Corn points out a fundamental shift in how we document childhood. Unlike previous generations, whose embarrassing moments were confined to physical photo albums on a shelf, today’s children are born into a global, searchable, and permanent record. These photos are not just memories; they are data points used to build predictive models of an individual’s life before they are old enough to even understand what the internet is.

Hidden Dangers: Metadata and Digital Kidnapping

A significant portion of the conversation focuses on the technical "landmines" hidden within digital files. Herman explains the risks of Exchangeable Image File Format (EXIF) data. Every photo taken on a smartphone contains metadata—GPS coordinates, timestamps, and device information. For a parent posting from home, this effectively publishes their home address and their child’s daily routine to anyone capable of reading the file.
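To make the GPS risk concrete: EXIF stores latitude and longitude as degrees-minutes-seconds rational values, which convert trivially into a map-ready decimal position. The sketch below (with invented coordinate values, not taken from any real photo) shows how little work that conversion takes.

```python
from fractions import Fraction

def dms_to_decimal(degrees, minutes, seconds, ref):
    """Convert EXIF-style GPS components plus a hemisphere
    reference ('N'/'S'/'E'/'W') to signed decimal degrees."""
    value = float(degrees) + float(minutes) / 60 + float(seconds) / 3600
    return -value if ref in ("S", "W") else value

# EXIF stores each component as a rational, e.g. (40, 1) -> 40.
# These example values are made up for illustration.
lat = dms_to_decimal(Fraction(40, 1), Fraction(26, 1), Fraction(4614, 100), "N")
lon = dms_to_decimal(Fraction(79, 1), Fraction(58, 1), Fraction(5616, 100), "W")
print(round(lat, 6), round(lon, 6))  # prints 40.44615 -79.982267
```

Six decimal places of a degree is roughly ten centimetres of ground distance, which is why an unscrubbed backyard photo amounts to publishing a street address.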

The discussion then turns to the disturbing trend of "digital kidnapping," in which strangers scrape photos of children from public accounts and repost them as if the children were their own. Some parents respond by covering their children’s faces with emojis or blurring. Herman notes that this is genuinely effective against basic facial recognition and scraping, though it remains only a partial shield against the broader machinery of data collection.

The AI Frontier and Deepfakes

Perhaps the most sobering part of the episode involves the role of artificial intelligence. Herman cites a 2026 report from UNICEF and INTERPOL, which found that over 1.2 million children had their images manipulated into sexually explicit deepfakes in just one year. The ease with which generative AI can learn a child’s likeness from a few high-resolution photos has turned public social media profiles into "fuel" for malicious actors.

Beyond malicious intent, there is the issue of corporate exploitation. Every photo uploaded to a major platform is ingested into proprietary AI models. These models learn biometric patterns, voiceprints, and even "gait patterns"—the way a person walks. Corn and Herman emphasize that once this data is integrated into the weights and biases of a neural network, it is virtually impossible to "delete."

Legal Thresholds and Social Friction

The hosts also examine the legal landscape, specifically the updates to the Children's Online Privacy Protection Act (COPPA). As of April 2026, federal regulations have expanded the definition of personal information to include biometrics. However, Herman argues that legal thresholds like the age of thirteen are arbitrary. The real issue is the lack of "affirmative consent." A toddler cannot consent to a permanent biometric profile, yet they are forced to live with the consequences of their parents' posts for the rest of their lives.

This creates a "collective action problem" at social events like birthday parties or school graduations. Even if one parent is diligent about privacy, twenty other parents may be uploading photos to public stories. Corn suggests that we are entering an era that requires a shift in social etiquette—where parents must set firm boundaries with friends, relatives, and schools to protect their children’s digital autonomy.

Practical Steps for Privacy

Despite the grim technical realities, Herman and Corn offer practical advice for parents who want to share their joy without compromising their children’s safety:

  1. Move Away from Public Platforms: Use encrypted messaging apps like Signal or WhatsApp for family updates.
  2. Dedicated Sharing Services: Utilize private photo-sharing platforms that do not sell data or use images for AI training.
  3. Audit Media Releases: Herman urges parents to read the fine print on school and sports registration forms, noting that parents can often opt out of public media use.
  4. Educate Relatives: Technical coaching for grandparents is essential to ensure they understand why a private group chat is safer than a public Facebook wall.
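The hosts mention apps that scrub metadata automatically before upload. As a rough illustration of what such a tool does under the hood, the sketch below drops APP1 segments (where EXIF and XMP metadata live) from a JPEG at the byte level, leaving the compressed image data untouched. This is a minimal sketch, not a hardened scrubber; real tools also handle other metadata segments and formats such as PNG and HEIC.

```python
import struct

def strip_exif(jpeg_bytes: bytes) -> bytes:
    """Return a copy of a JPEG byte stream with APP1 (EXIF/XMP)
    segments removed; pixel data is copied through unmodified."""
    if jpeg_bytes[:2] != b"\xff\xd8":
        raise ValueError("not a JPEG stream")
    out = bytearray(b"\xff\xd8")  # keep the SOI marker
    i = 2
    while i + 1 < len(jpeg_bytes):
        if jpeg_bytes[i] != 0xFF:
            out += jpeg_bytes[i:]  # unexpected data: copy through
            break
        marker = jpeg_bytes[i + 1]
        if marker == 0x01 or 0xD0 <= marker <= 0xD9:
            out += jpeg_bytes[i:i + 2]  # standalone marker, no length field
            i += 2
            continue
        (length,) = struct.unpack(">H", jpeg_bytes[i + 2:i + 4])
        if marker != 0xE1:  # drop APP1, keep every other segment
            out += jpeg_bytes[i:i + 2 + length]
        i += 2 + length
        if marker == 0xDA:  # SOS: entropy-coded image data follows
            out += jpeg_bytes[i:]
            break
    return bytes(out)

# A tiny hand-built "JPEG": SOI + one APP1 (EXIF) segment + EOI.
fake = b"\xff\xd8" + b"\xff\xe1" + struct.pack(">H", 8) + b"Exif\x00\x00" + b"\xff\xd9"
assert strip_exif(fake) == b"\xff\xd8\xff\xd9"
```

Note that stripping metadata on your own device before upload is the only reliable point of control: once the original file reaches a platform, its EXIF has already been read.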

Conclusion: The Right to be Forgotten

As the episode concludes, Herman and Corn reflect on the "right to be forgotten." While Europe has made strides in allowing individuals to request the removal of personal links, applying this to photos posted by parents remains a legal and ethical battlefield.

The takeaway from the discussion is clear: privacy is no longer just about hiding; it is about protecting the future autonomy of the next generation. By being "technically literate" and socially proactive, parents can ensure that their children’s digital identities are theirs to build—not a legacy they are forced to inherit.


Episode #555: Before They Can Click: The Ethics of Sharenting

Corn
It is amazing how much of our lives are documented online these days, but there is one specific area that feels increasingly fraught with ethical and technical landmines. I am talking about the digital footprints we create for people who have no say in the matter. Our children.
Herman
Herman Poppleberry at your service. And you are right, Corn. This is a topic that has evolved so rapidly over the last decade. It feels like we are in the middle of a massive, unplanned social experiment. Our housemate Daniel sent us a really thoughtful prompt about this today, focusing on the ethics and safety of sharing photos of children online. He is asking about everything from privacy settings to the impact of artificial intelligence.
Corn
It is a great prompt because it hits that intersection of parenting, technology, and fundamental rights. Daniel mentioned he used to have a YouTube channel and understands that human instinct to share, but now that he is thinking about the next generation, the stakes feel different.
Herman
They are vastly different. When we were kids, our embarrassing photos were trapped in physical albums on a shelf. To see them, someone had to actually come to our house. Today, a single photo posted to a social media account can be indexed, scraped, and distributed globally in milliseconds. It is a permanent record.
Corn
I want to start with the concept of sharenting. It is a term that has been around for a while now, but the implications keep getting deeper. We are essentially creating a digital identity for children before they even have the motor skills to hold a phone. Herman, what does the research say about the scale of this in twenty twenty-six?
Herman
It is staggering. The classic benchmark is that by the time a child reaches the age of five, they may already have up to one thousand five hundred photos of themselves online. But recent data from twenty twenty-five shows that over seventy-five percent of parents share their children's lives on social media, and roughly eighty percent of those posts include the child's real name. That is a massive amount of data points helping build a predictive model of who that person is, where they live, and what their life looks like. And remember, these are often high-resolution images with rich metadata.
Corn
Right, and that metadata is where a lot of the hidden danger lies. Most people just see a cute photo of a kid at a park, but the file itself contains so much more information than just pixels.
Herman
Exactly. The exchangeable image file format, or EXIF, metadata can include the exact G-P-S coordinates of where the photo was taken, the time and date, and even the specific device used. If a parent posts a photo from their backyard every day, they are effectively publishing their home address and the child's daily schedule to anyone who knows how to look at that data. Even if you have privacy settings on, that data is still being processed by the platforms themselves for their own internal profiles.
Corn
And that brings up one of Daniel's specific questions. Is there a specific age when it becomes more acceptable to share these images? We often talk about thirteen as the magic number because of the Children's Online Privacy Protection Act, or COPPA, but that is more of a legal threshold for data collection than an ethical one for privacy.
Herman
Thirteen is a bit of an arbitrary line drawn by regulators, though it is worth noting that the Federal Trade Commission just finalized major updates to COPPA that become enforceable this April, twenty twenty-six. These new rules expand the definition of personal information to include biometrics like voiceprints and gait patterns. But from a developmental perspective, thirteen is often when children start to develop a sense of their own digital identity. The real issue is the lack of affirmative consent. A toddler cannot understand the concept of a permanent digital record. By the time they are old enough to care, their face is already in dozens of databases.
Corn
I have seen some parents take a middle-ground approach where they only share photos where the child's face is obscured, or they use an emoji to cover it. Does that actually do anything from a safety or technical perspective, or is it just a symbolic gesture?
Herman
It is actually quite effective against basic facial recognition and scraping. If the biometric data of the face is obscured, it makes it much harder for automated systems to link that photo to a specific individual's identity graph. It also prevents what we call digital kidnapping, which is a bizarre and disturbing trend where strangers take photos of children from the internet and repost them as if they were their own children. Obscuring the face makes the photo less valuable for those kinds of bad actors.
Corn
That is such a strange and dark corner of the internet. But let's look at the other side of Daniel's question. What about third parties? You might be the most private parent in the world, but your child goes to a birthday party or a school event, and suddenly twenty other parents are snapping photos and uploading them to public Instagram stories. How do you manage that without becoming the neighborhood pariah?
Herman
That is the social friction point. It is a classic collective action problem. You can control your own behavior, but you cannot easily control the behavior of a crowd. We are seeing some radical shifts, though. Australia recently banned social media for children under sixteen, which has sparked a global conversation about whether we should be more restrictive. Many schools now have media release forms where you can opt out, but that does nothing to stop another parent from posting a group shot of the kindergarten graduation.
Corn
I think it requires a shift in social etiquette. We are starting to see some parents make an announcement at the beginning of parties, just a quick mention like, hey, we are keeping our kids off social media, so please do not post any photos with our son in them. It feels awkward at first, but it sets a boundary.
Herman
It does, but we also have to recognize that privacy is a sliding scale. For some families, the risk is higher than others. If you have a high-profile job or if there are custody issues involved, that privacy becomes a physical safety requirement. But even for the average family, the long-term risk is the erosion of the child's future privacy. We are giving away their right to be forgotten before they even know they have it.
Corn
Let's talk about the A-I aspect, because that is where things have changed the most in the last couple of years. Daniel asked how new technologies like A-I impact a child's digital safety. We are moving past just facial recognition into the realm of generative A-I and deepfakes.
Herman
This is the part that really keeps me up at night, Corn. A report released just this month by UNICEF and INTERPOL found that over one point two million children have had their images manipulated into sexually explicit deepfakes in the past year alone. The more high-quality photos of a child that exist online, the easier it is for an A-I model to learn their likeness. In twenty twenty-six, the technology is so accessible that you do not need a supercomputer to do this. You can do it on a mid-range laptop.
Corn
And it is not just about malicious deepfakes. It is also about training data. Every photo uploaded to a major social platform is essentially fuel for their proprietary A-I models. These models are learning what children look like at various stages of development. We are essentially donating our children's likenesses to enrich these massive corporations.
Herman
Exactly. And once that data is ingested into a model, it is almost impossible to remove. You can delete the original photo, but the weights and biases of the neural network have already been influenced by it. This is why some privacy advocates are pushing for a total moratorium on sharing identifiable photos of minors. They argue that we are creating a permanent biometric profile that will follow them for the rest of their lives.
Corn
It feels like we are at a point where the convenience and social validation of sharing a photo are being weighed against a very abstract, long-term risk. Most parents are not thinking about A-I training data when they post a photo of their kid's first tooth. They just want their friends to see it.
Herman
And that is a very human impulse. We should not demonize parents for wanting to share their joy. But we do need to be more technically literate about what happens after we hit that post button. For example, even if your account is private, your followers can still take a screenshot. They can download the image. Once it leaves your device, you have lost control of it.
Corn
So what are the actual recommendations? If someone wants to be responsible but still wants to stay connected with family, what are the best practices?
Herman
The gold standard is to move away from public or semi-public social media platforms for family photos. Use encrypted messaging apps like Signal or WhatsApp for sharing with close relatives. Or better yet, use a dedicated, private photo-sharing service that does not sell your data or use it for A-I training. There are several platforms now that prioritize privacy and give you full control over who can see and download the images.
Corn
I have also seen people use shared albums in cloud services where they can revoke access at any time. That seems like a good middle ground. But what about the older relatives? Daniel mentioned that for some people, Facebook is the only platform they know how to use. How do you handle the grandmother who just wants to show off her grandkids to her friends?
Herman
That is a tough conversation. It requires a bit of technical coaching. You can help them set up their privacy settings so that only their actual friends can see their posts, rather than the public. But you also have to be firm about the boundaries. You might have to say, Grandma, we love that you want to share these, but please only send them in our private group chat. It is about protecting the kids, not about excluding the grandparents.
Corn
It is also worth mentioning that some countries are starting to take this more seriously from a legal perspective. In parts of Europe, there have been cases where children have sued their parents for sharing photos of them without consent once they reached adulthood. We are also seeing the TAKE IT DOWN Act here in the States, which is designed to help remove non-consensual imagery of minors.
Herman
I agree. The concept of digital consent is going to be a major legal battlefield in the next decade. We are already seeing the emergence of the right to be forgotten laws, which allow individuals to request that search engines remove links to personal information. But applying that to photos posted by a third party, like a parent, is much more complicated.
Corn
Let's dig deeper into the third-party situation Daniel mentioned. Schools and sports teams. Often, when you sign those registration papers, there is a tiny box at the bottom that gives them permission to use your child's image for anything they want. I think most parents just sign it without thinking.
Herman
They do. And that is a huge mistake. You should always read the fine print on those media releases. In many cases, you can cross out the sections you do not agree with or attach an addendum that limits the use of the photos to internal school communications only. Most organizations will respect that if you are proactive about it. But if you do not say anything, they will assume they have carte blanche.
Corn
What about the guests at a celebration? Daniel brought up the idea of making an announcement. Do you think we will reach a point where no-phone zones are the norm for children's parties?
Herman
We are already seeing it at weddings and high-end events. It would not surprise me if it becomes more common for children's birthdays too. Some people even have a designated photographer who takes the photos and then shares a curated, private link with the guests later. That way, the parents maintain control over which images are distributed.
Corn
That seems like a very elegant solution, although maybe a bit much for a casual playdate. But I think the core idea is intentionality. We have been in this mode of default sharing for so long that we have forgotten how to be private.
Herman
Exactly. We need to move back toward privacy by default. Instead of asking why should I not post this, we should be asking why should I post this? Is the benefit to me or my child worth the potential long-term risk?
Corn
There is also the psychological impact on the child to consider. If a child grows up knowing that every milestone and every mistake is being broadcast to an audience, how does that affect their sense of self? Are they living their life for themselves, or for the camera?
Herman
That is a profound question. There is a lot of emerging research on the performative nature of childhood in the age of social media. When a parent is constantly framing their child's life for an external audience, it can disrupt the child's ability to develop an internal sense of privacy and autonomy. They begin to see themselves as a character in a story being told by their parents.
Corn
It is a form of surveillance, even if it is done with love. The child is always being watched, always being documented. That has to have some effect on their development.
Herman
It definitely does. And it makes it much harder for them to establish their own boundaries later in life. If their parents did not respect their privacy, why should they expect anyone else to? We are modeling behavior for them every time we pull out our phones.
Corn
I want to go back to the technical side for a moment. You mentioned facial recognition and A-I. Are there any tools available for parents who have already posted a lot of photos and now want to scrub them? How hard is it to delete your child's digital footprint?
Herman
It is incredibly difficult to do it completely, but you can certainly reduce it. The first step is to go through your old posts and either delete them or change the privacy settings to only me. There are also services that can help you scan the web for mentions of your name or your child's name and request removals. But for images, it is much harder because they are not always indexed by name.
Corn
And then there is the Wayback Machine and other archival sites. Once something is out there, it is often mirrored on dozens of other sites that you might not even know exist.
Herman
Right. This is why the best strategy is prevention. But for those who are already deep into it, I would recommend using tools like the N-C-M-E-C's Take It Down service if you are dealing with sensitive imagery, or using services that help you monitor for your child's face appearing in new places. It is a bit of a cat-and-mouse game, though.
Corn
It feels like we are entering an era where privacy is going to be a luxury good. It will take time, effort, and technical knowledge to keep your child's life private.
Herman
It already is, Corn. And that is the unfortunate reality. The platforms are designed to make sharing as easy as possible because that is how they make money. Privacy is intentionally made difficult. It is buried under layers of menus and confusing legal language.
Corn
So, if we were to summarize the guidelines for someone like Daniel, or any parent listening, what would the top three be?
Herman
Number one, scrub your metadata. If you are going to share, make sure you are not inadvertently sharing your location or your child's schedule. There are apps that can do this automatically before you upload. Number two, favor private, encrypted channels over public social media. If you want the grandparents to see the photo, send it to them directly. Number three, have the hard conversations with third parties early. Do not wait for a photo to be posted to set your boundaries with schools, friends, and family.
Corn
I would add a fourth one, which is to involve the child in the process as soon as they are old enough to understand. Ask them, hey, is it okay if I send this photo to Grandma? Even if they are only four or five, it starts the habit of asking for consent and showing them that their opinion on their own image matters.
Herman
That is a great point. It builds that foundation of digital agency. And honestly, it might surprise you. Sometimes kids will say no because they do not like how they look or they were having a bad day. Respecting that no is a powerful way to show them you value their privacy.
Corn
It also makes them more likely to respect the privacy of others as they get older. We are training the next generation of internet users right now. If we want a more private and respectful internet, we have to start with how we treat our own children's data.
Herman
Absolutely. And we have to be realistic about the fact that we cannot achieve one hundred percent privacy. We live in a connected world. But we can certainly be more intentional. We can move the needle from total exposure to a more balanced, protective approach.
Corn
I think the A-I threat is the one that is going to force this issue into the mainstream. When people start seeing deepfakes of children being used for scams or worse, the casual sharing of photos is going to become much less socially acceptable.
Herman
I think you are right. We are seeing a shift in the zeitgeist. A few years ago, it was considered weird not to share photos of your kids. Now, it is increasingly seen as a savvy, protective move. The parents who are keeping their kids off the grid are the ones who are thinking ten steps ahead.
Corn
It is like that old saying about the best time to plant a tree. The best time to start protecting your child's privacy was the day they were born. The second best time is today.
Herman
Exactly. You cannot change what you did in the past, but you can change your behavior going forward. You can have those conversations, you can change those settings, and you can start being a more conscious gatekeeper of your child's digital identity.
Corn
It is a lot to think about, and it can feel overwhelming. But I think the key is not to let the perfect be the enemy of the good. Any step you take to increase your child's privacy is a win.
Herman
Well said. And it is something we all need to be talking about more. This should not be a private struggle for parents. It should be a broader social conversation about the rights of children in a digital age.
Corn
Definitely. And speaking of conversations, if you have been enjoying the show and finding these deep dives helpful, we would really appreciate it if you could leave us a quick review on your podcast app. It genuinely helps other people find us and join the discussion.
Herman
It really does. We love seeing the community grow and hearing your perspectives on these topics.
Corn
We should also mention that there are some great resources out there for parents who want to dive deeper into the technical side of this. Organizations like the Electronic Frontier Foundation have guides on digital privacy for families that go into a lot more detail than we can cover here.
Herman
Yes, the E-F-F is a fantastic resource. They have been at the forefront of these issues for decades. I would also recommend looking into the work of researchers like Stacey Steinberg, who has written extensively on the legal and ethical implications of sharenting. Her work is really the gold standard for understanding this phenomenon.
Corn
It is interesting how this ties back to some of our earlier episodes too. Remember when we talked about the future of facial recognition in public spaces back in episode four hundred and twelve? This is the private version of that same struggle.
Herman
It is. It is the same technology, just applied in a different context. In the public sphere, we are worried about the government or corporations tracking us. In the private sphere, we are essentially doing the tracking for them. We are building the databases that they will use later.
Corn
That is a sobering thought. We are the voluntary contributors to the surveillance state when we post these photos.
Herman
In many ways, yes. That might sound dramatic, but from a data perspective, it is accurate. We are providing the high-quality, labeled data that these systems need to become more effective.
Corn
So, the takeaway is to be a bit more of a friction point. Do not make it so easy for the machines to know everything about our kids.
Herman
Exactly. Be the friction. Be the intentional gatekeeper. Your child will thank you for it in twenty years.
Corn
I think that is a perfect place to wrap this up. Daniel, thank you for such a timely and important prompt. It is something that affects almost everyone, whether they have kids or not, because we are all part of this digital ecosystem.
Herman
Definitely. It is a shared responsibility.
Corn
Well, this has been My Weird Prompts. You can find all five hundred and fifty-five episodes, including this one, at myweirdprompts.com. We have an R-S-S feed there for subscribers and a contact form if you want to send us a prompt of your own.
Herman
And of course, we are available on Spotify and all the major podcast platforms. Thanks for listening and for being part of the conversation.
Corn
We will be back soon with another deep dive into the weird and wonderful prompts you send our way. Until then, keep asking the hard questions and stay curious.
Herman
Goodbye everyone.
Corn
Bye.

This episode was generated with AI assistance. Hosts Herman and Corn are AI personalities.
