
Pornhub's Intentionality
- Many people are unaware of how deliberate Pornhub's actions were, or of the consequences the company now faces.
- Pornhub's conduct stemmed not from ignorance but from intentional decisions about content depicting exploited children and rape victims.
- Evidence, including accidentally released documents, shows years of ignored requests to remove videos of unconscious, raped, or underage individuals.
Main takeaways from the episode
Pornhub became a global crime scene due to lax upload rules
- Pornhub allowed almost anyone to upload videos with nothing more than an email address — no ID or consent checks — which led to widespread child sexual abuse material, rape videos, revenge porn, and stolen content being hosted and monetized.
Trafficking Hub movement exposed the problem and forced action
- Laila launched the #TraffickingHub campaign after discovering how easy it was to upload to the site and after high-profile investigations and survivor stories surfaced; the campaign's petitions, media coverage, and litigation pressured payment processors and led to massive content takedowns.
Systemic corporate failures and intentional choices
- Internal documents and civil discovery show that MindGeek/Pornhub executives made deliberate policy choices (e.g., minimal moderation, refusing keyword removals, not reporting CSAM) that prioritized profit over safety, undermining the company's Section 230 defenses.
Moderation was inadequate and exploitative
- Moderation teams were tiny (e.g., 10 people per shift), moderators were pressured to click through hundreds to thousands of videos per shift, and policy required multiple user flags before a video was even reviewed, enabling illegal content to remain online and monetized.
Consequences for survivors are severe and long-lasting
- Victims suffer the “immortalization” of their trauma: videos are continually re-uploaded with download buttons enabled, and survivors report bullying, homelessness, addiction, and high rates of suicidal ideation. Civil suits have followed and may yield large damages.
Effective levers: payment processors and legal pressure
- Credit card companies (Visa/Mastercard) and payment processors proved to be the “Achilles’ heel”: cutting off financial services forced Pornhub to remove roughly 91% of its unverified content. Civil discovery and lawsuits exposed incriminating internal evidence.
Proposed scalable solutions: age & consent verification + financial gating
- Two practical, scalable fixes: mandatory third-party age, ID, and consent verification for every person in every uploaded video (biometric matching, government ID, and liveness checks), and financial institutions refusing to do business with non-compliant sites. A minimal sketch of the upload-gating logic follows.
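To make the verification proposal concrete, here is a minimal, hypothetical sketch of what an upload-time gate could look like. Everything in it is illustrative: the Performer fields and the four check functions (id_is_valid_and_adult, selfie_is_live, selfie_matches_id, consent_covers_upload) are assumed names standing in for calls to an accredited third-party verification provider, not any real platform's API.

```python
from dataclasses import dataclass

@dataclass
class Performer:
    """One person appearing in an uploaded video (illustrative fields only)."""
    id_document: bytes     # scanned government ID
    live_selfie: bytes     # fresh capture used for biometric and liveness checks
    consent_record: bytes  # signed consent covering this specific upload

# Each check below stands in for a call to an accredited third-party
# verification provider. The stubs fail closed: nothing is approved
# until a real provider is wired in.
def id_is_valid_and_adult(doc: bytes) -> bool:
    return False  # provider validates the document and confirms the holder is 18+

def selfie_is_live(selfie: bytes) -> bool:
    return False  # liveness detection: a real person, not a photo or a deepfake

def selfie_matches_id(selfie: bytes, doc: bytes) -> bool:
    return False  # biometric match between the live capture and the ID photo

def consent_covers_upload(record: bytes) -> bool:
    return False  # a signed consent record exists for this video on this platform

def upload_allowed(performers: list[Performer]) -> bool:
    """Fail-closed gate: every person in the video must pass every check."""
    if not performers:
        return False  # no verified performers means no upload
    return all(
        id_is_valid_and_adult(p.id_document)
        and selfie_is_live(p.live_selfie)
        and selfie_matches_id(p.live_selfie, p.id_document)
        and consent_covers_upload(p.consent_record)
        for p in performers
    )
```

The design point is that the gate fails closed: absent positive proof of age, identity, and consent for every person in the video, the upload is rejected, inverting the flag-after-publication model described in the moderation takeaway above.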
Risks from emerging tech (AI deepfakes) amplify harm
- AI-generated non-consensual porn (deepfakes) and synthetic CSAM create new threats; laws like the U.S. Take It Down Act criminalize uploading non-consensual AI-generated content, and upload-time verification systems can help block deepfakes before they are published.
Broader cultural and prevention work is needed
- Beyond platform fixes, prevention includes parental tools, on-device technology that blocks the filming of nudes on children’s devices, better sex education, and public awareness of the permanence and harms of sharing sexual images.
(source summary for episode context: 0:00:00–1:17:19)