Digital Freedom Under Attack
Payment processors and governments are increasingly censoring legal content across the internet.
Recent Censorship Issues
EU Digital Services Act Enforcement Accelerates
The European Union has begun aggressive enforcement of the Digital Services Act with significant recent developments:
- Platform Investigations: Formal proceedings opened against 6 major platforms
- Content Scanning Mandates: AI-based pre-screening now required for all uploads
- Fines Issued: €250M in penalties for "insufficient" content moderation
- Global Impact: Platforms implementing EU standards worldwide to avoid compliance costs
Project 2025: Blueprint for Digital Authoritarianism
Project 2025 outlines the most comprehensive censorship plan in US history. Unlike other threats that require new laws, this 900-page blueprint details how to use existing government powers to control online speech:
- Transform the FCC into a content regulator with power to define "harmful" speech
- Eliminate Section 230 protections, forcing platforms to pre-censor all content
- Create government "morality boards" to determine appropriate content
- Target LGBTQ+, reproductive health, and climate information as "harmful"
KOSA Censorship Bill Could Return Anytime
The Kids Online Safety Act disguises internet censorship as child protection. Despite failing in 2024, KOSA retains bipartisan support and could be reintroduced at any time; recent developments suggest reintroduction is likely in Q4 2025.
- Force platforms to censor legal speech to avoid lawsuits
- Target mental health resources and LGBTQ+ content as "harmful"
- Create impossible standards based on unproven claims about online harm
- Silence support communities discussing depression, eating disorders, and more
- New threat: State-level versions being introduced in 12 states
Steam & Itch.io Under Escalating Pressure
Major payment processors, including Visa and Mastercard, have intensified pressure on gaming platforms. Recent actions include:
- Steam: Forced removal of 200+ visual novels in Q2 2025
- Itch.io: Payment processing suspended for 48 hours in July 2025
- DLsite: New regional restrictions implemented globally
- Developer Impact: 15,000+ creators affected, $50M+ revenue lost
- Genre Targeting: Entire visual novel and dating sim categories at risk
State-Level Age Verification Laws Spreading Rapidly
Following Louisiana's lead, 18 states have now passed or are implementing age verification laws that effectively censor adult content through invasive ID requirements:
- Utah: Strictest enforcement; platforms are blocking the entire state
- Texas: $10,000 daily fines for non-compliance, effective Sept 2025
- Florida: Expanded definition includes "suggestive" content
- Platform Response: Mass geoblocking rather than compliance
- VPN Usage: 400% increase in affected states
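Mechanically, the "mass geoblocking" response above is a simple lookup: the platform resolves each visitor's IP address to a region and refuses to serve restricted content there. A minimal sketch of that logic (the state list, IP table, and function names are hypothetical; real platforms query a GeoIP service such as MaxMind rather than a hardcoded table):

```python
# Hypothetical sketch of state-level geoblocking. All data below is
# illustrative: the IPs are reserved documentation addresses, and the
# restricted-state list is an example, not a statement of current law.

RESTRICTED_STATES = {"UT", "TX", "FL"}  # example states with age-verification laws

# Stand-in for a GeoIP database lookup; a real service resolves an IP
# address to a country/region code.
IP_TO_REGION = {
    "203.0.113.10": "UT",
    "198.51.100.7": "CA",
}

def handle_request(ip: str) -> str:
    """Return 'blocked' for restricted regions, 'ok' otherwise."""
    region = IP_TO_REGION.get(ip)   # unknown IPs fall through to 'ok'
    if region in RESTRICTED_STATES:
        return "blocked"            # serve a block page instead of the content
    return "ok"
```

Because the block is keyed entirely to the apparent origin of the request, routing traffic through a VPN endpoint in another state defeats it, which is consistent with the VPN usage spike noted above.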
UK Online Safety Act Now Law
The UK's Online Safety Act is now in force, requiring:
- Age verification for adult content
- Removal of "legal but harmful" content
- Algorithm transparency requirements
- Heavy fines for non-compliance
What's At Stake
Digital Games & Media
Products that could be banned or restricted:
- Adult Visual Novels: Entire genre at risk
- Mature Indie Games: Horror, psychological themes
- Dating Sims: Romance and relationship games
- Art Games: Experimental and artistic content
- Modding Communities: User-generated content
Platform Restrictions
How platforms are being forced to change:
- Content Removal: Mass deletion of legal content
- Creator Bans: Accounts terminated without warning
- Payment Blocks: Unable to receive payments
- Geographic Restrictions: Content blocked by region
- Age Gates: Invasive verification requirements
Creator Economy Impact
How censorship affects digital creators:
- Income Loss: Millions in lost revenue
- Artistic Freedom: Self-censorship to survive
- Platform Migration: Forced to less stable platforms
- Community Fragmentation: Audiences scattered
- Innovation Stifled: Fear of creating new content
Historical Timeline
📊 8 Years of Escalating Censorship
From FOSTA-SESTA in 2018 to current 2025 threats, explore the comprehensive timeline of digital censorship events affecting millions of creators and users worldwide.
Legislative Threats to Digital Freedom
KOSA (Kids Online Safety Act)
The Kids Online Safety Act creates a censorship regime disguised as child protection. Despite claims it won't censor content, KOSA's core mechanism—a "duty of care" requirement—will force platforms to suppress lawful speech to avoid lawsuits.
How KOSA Creates Censorship:
- Duty of Care Trap: Platforms must "prevent and mitigate" vague harms like depression, anxiety, eating disorders, and "compulsive usage"
- Legal Liability: Government agencies can sue platforms that don't remove content someone claims contributed to these harms
- Over-Censorship Incentive: When in doubt, platforms will delete content rather than risk expensive lawsuits
- Subjective Standards: No clear definition of what constitutes "compulsive usage" or harmful content
Who Gets Silenced:
- Mental Health Communities: Support groups discussing depression, eating disorders, or self-harm recovery
- LGBTQ+ Youth Resources: Content about identity, relationships, or community support
- Educational Content: Medical information, harm reduction, or crisis intervention resources
- All Users: KOSA's censorship affects everyone, not just minors—platforms can't easily age-gate content
Why "Viewpoint Protection" Fails:
- Platform Liability: Legal risk attaches to platforms, not users—they must censor to stay safe
- Content vs. Viewpoint: Platforms will remove entire topics (like eating disorder recovery) regardless of viewpoint
- Subjective Enforcement: Different administrators will interpret "harmful" content differently
⚠️ No Scientific Basis
KOSA relies on unproven claims about "compulsive usage" and social media harm. There's no accepted clinical definition of online addiction, and no scientific consensus that platforms cause mental health disorders.
KOSA could be reintroduced at any time. Contact your representatives now to oppose this censorship bill disguised as child protection.
Project 2025 (US)
Project 2025 is a comprehensive conservative policy blueprint that poses significant threats to digital freedom and free expression. Key censorship mechanisms include:
Censorship Through Government Control:
- FCC Weaponization: Transform the FCC into a content regulator, giving government power to define "harmful" speech
- Section 230 Elimination: Remove platform immunity, forcing platforms to pre-censor all content to avoid lawsuits
- Morality Police: Create government bodies to determine what content is "appropriate" for Americans
- Educational Censorship: Control curriculum and restrict access to "unapproved" educational materials
Targeting Specific Communities:
- LGBTQ+ Content: Classify LGBTQ+ educational materials as "pornographic" to justify censorship
- Reproductive Health: Restrict access to information about reproductive health and contraception
- Civil Rights Materials: Target diversity, equity, and inclusion content as "divisive"
- Scientific Information: Censor climate change and public health information that contradicts political positions
🚨 Immediate Threat
Unlike KOSA, Project 2025 doesn't require new legislation—it's a roadmap for using existing government powers to control online speech. Many proposals could be implemented through executive action.
Project 2025 represents the most comprehensive threat to digital freedom in American history, with detailed plans to control online speech through existing government agencies.
UK Online Safety Act
Now in force, with powers to:
- Remove "legal but harmful" content
- Mandate age verification systems
- Fine platforms up to £18M or 10% of global revenue, whichever is greater
- Block websites in the UK
- Require content scanning
Other Legislative Threats
RESTRICT Act (US)
Gives government power to ban apps and services deemed "security threats"
EU Digital Services Act
Requires content moderation at scale with heavy penalties
Age Verification Laws
Multiple US states requiring ID checks for adult content
Section 230 Reform
Threatens platform immunity for user-generated content
Understanding KOSA's Censorship Mechanism
1. The Legal Trap
KOSA requires platforms to "exercise reasonable care" to prevent harm to minors. This creates legal liability for any content that someone later claims contributed to harm—even if the content was completely legal and helpful.
2. Impossible Standards
How does a platform prevent "compulsive usage" or "anxiety" from social media? The terms are so vague that no platform can know what is safe to host. The only safe choice becomes removing anything that might be risky.
3. Everyone Gets Censored
Platforms can't easily determine user age or restrict content only for minors. When content is flagged as potentially harmful to kids, the easiest solution is to remove it for everyone—including adults who need that information.
Real-World Censorship Examples Under KOSA
Mental Health Forums
Posts saying "here's how I got through depression" could be removed because discussing mental health might "trigger anxiety" in some users.
Body Positivity
Messages like "love your body" could be censored for potentially triggering eating disorders, even when promoting healthy self-image.
Educational Content
Medical information about eating disorders or substance abuse could be removed because discussing these topics might be seen as harmful.
LGBTQ+ Resources
State attorneys general could target LGBTQ+ content as "harmful" to minors, forcing platforms to remove identity-affirming resources and community support.
Take Action Now
Contact Representatives
Make your voice heard. Tell your representatives to oppose censorship legislation.
Support Creators
Follow and support creators on alternative platforms that respect creative freedom.
Spread the Word
Share information about digital censorship on social media to raise awareness.
Stay Informed
Bookmark this site and check back regularly for updates about new censorship threats and victories for digital freedom.