This isn't really accurate. Age verification is not mandatory for all accounts. You will be able to join a Discord with your friends, chat, and do voice without age verification.
Here's the exact list of what's restricted if you don't verify:
> Content Filters: Discord users will need to be age-assured as adults in order to unblur sensitive content or turn off the setting.
> Age-gated Spaces: Only users who are age-assured as adults will be able to access age-restricted channels, servers, and app commands.
> Message Request Inbox: Direct messages from people a user may not know are routed to a separate inbox by default, and access to modify this setting is limited to age-assured adult users.
> Friend Request Alerts: People will receive warning prompts for friend requests from users they may not know.
> Stage Restrictions: Only age-assured adults may speak on stage in servers.
Taken from the announcement https://discord.com/press-releases/discord-launches-teen-by-...
So the claim that Discord is making ID verification "mandatory" or that you need it for gaming chats is untrue.
For children - this mandate also still makes the decision on behalf of the parents that a child must submit a scan of their face to a third party. Moving to Persona for age verification involves verification data being sent off the user's phone - in direct contradiction to Discord's initial promise of keeping facial scan data solely on the phone. And we've been given no reason to trust that third parties will delete the data, or that they won't use it for an improper purpose such as deriving information from the ID or facial scan unrelated to the sole purpose of verifying that an individual is an adult.
While we're at it - is there any legitimate reason why Discord is associating a person's actual or estimated age with their account as opposed to storing a value that states if they are or are not an adult? That sort of granularity seems unrelated to the stated purpose.
The vendor is called k-ID: https://www.k-id.com/
Their website includes a lot of technical detail about their system, including that the granularity their API typically returns is just "ADULT" or not (per jurisdiction). [1]
They also document all their international compliance standings including their assurances that data really doesn't leave user devices.
At a glance, k-ID seems a much better option than its competitor Persona. Though arguably, yes, most of their options are still "AI garbage" under the hood. But that's what countries want right now.
[0] https://support.discord.com/hc/en-us/articles/30326565624343...
[1] https://docs.k-id.com/get-started/quickstart-guides/age-veri...
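For illustration, a client consuming that kind of coarse signal only ever branches on one value. The field names and values below are hypothetical, not k-ID's actual API shape; the point is that "adult or not" is all the granularity the gating logic needs:

```python
# Hypothetical response shape for a coarse age-assurance check.
# Real field names will differ; what matters is the granularity:
# adult or not, per jurisdiction, with no actual age or identity attached.
def gate_features(assurance_response: dict) -> dict:
    status = assurance_response.get("ageStatus")  # e.g. "ADULT" or "NON_ADULT"
    is_adult = status == "ADULT"
    return {
        "unblur_sensitive_content": is_adult,
        "age_gated_spaces": is_adult,
        "modify_dm_inbox_setting": is_adult,
        "speak_on_stage": is_adult,
    }

print(gate_features({"ageStatus": "ADULT", "jurisdiction": "GB"}))
```

Nothing here needs a birthdate or an estimated age, which is the point of the earlier question about granularity.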
If they were age-gating children to make a safe space, that would be one thing, but they are instead making an adults only area where sexual content and flirting is allowed. For many people this is a bonus, because now like in a bar, you always know the person you’re taking to is an adult.
2. If they don't about-face, there's a lot about the implementation that remains to be seen.
Personally, I use Discord for things that should be completely unaffected by this. I will not verify my age; if there are surprises, I'll leave. If the communities I'm a part of decide to move, I'll support them and move even if I don't run into surprises.
There is absolutely no way we should support giving identifying information to a U.S. company given what's going on right now. The trust is no longer there. If you verify your identity, anything you say on Discord could be used against you if you ever pass through American borders.
At some point every social media service is going to have to do this. Discord got a ton of bad press by being the first to say it out loud (and ask for feedback about it!) and give a timeline for worldwide roll out (early May). Discord isn't going to be the last to announce it. I think we all expect Meta (Facebook, Instagram, WhatsApp) to roll out whatever their solution will be under the cover of darkness (and dark patterns): it will just show up one day with no easy opt-out, and be worse than what Discord is currently planning.
> There is absolutely no way we should support giving identifying information to a U.S. company given what's going on right now.
For what little it is worth, so far Discord has been using a third-party vendor called k-ID [2] for handling this information. k-ID looks like it is fully remote and has a globally distributed workforce, including many in the US, but its press kit and job postings imply it is a company registered and based out of the Republic of Singapore.
[0] https://europeannewsroom.com/to-ban-or-not-to-ban-eu-countri...
[1] https://en.wikipedia.org/wiki/Social_media_age_verification_...
You are correct. For now. But why would they stop there?
Supposedly this is to protect teens. If that's true, why would they continue letting teens chat with anonymous users? What if they get tricked into sharing sensitive images or video of themselves? Surely we need to know everyone's ID to ensure teens aren't unwittingly chatting with a known predator. It's for their safety. But for now that's a bridge too far. For now.
And why should we believe this even has anything to do with protecting teens? That's valuable data. Discord says they're not holding onto it... for now. But Discord is offering quite a lot to users for free. Why let such an obvious revenue source go unmonetized? They're doing this now because they're going public soon. Investors want an ROI and this action is sure to invite some competition. The people leaving want an alternative, so a competitor could get a foothold. Discord needs to stay ahead. And the users Discord keeps after this stunt are going to be the most resilient to leaving - the most exploitable. Surely they wouldn't care if the policy changes in the future.
The sky isn't falling. But the frog is boiling.
All so that we can post online about how Google is invading our privacy?
Pretty much an AI detecting vulgarity and blocking it. Actual racist vulgarity gets through, while things like 'here with my gock' and 'troll it' are what I've seen blocked.
So, yes it is a requirement, and yes, they are censoring people and things, and requiring others to have an ID to see the messages as well.
So 'Not mandatory for all accounts' is technically true, but I mean.. you get it, hopefully.
> You will be able to join a Discord with your friends, chat, and do voice without age verification.
No, building a community is a goal for many; this just isn't acceptable.
> So the claim that Discord is making ID verification "mandatory" or that you need it for gaming chats is untrue.
Again, not mandatory but creates more issues than it solves.
I know not everyone is so open but in the lgbt space most people are.
I've been noticing people in this space react to this news in a very worrisome manner, either by downplaying the need for nsfw in their lives (ironically, hours after discussing a clearly-nsfw matter!), or even worse: by equating all nsfw with "porn", giving them carte blanche to judge others who want the option for nsfw talk as "being in it just for sex".
It's been shocking for me to see this phenomenon unfold in real time. This overwhelming "sanitizing" force that bulldozes through any nuance regarding the nature of being an adult in shared online adult spaces. It's especially rough for marginalized or minority communities, who oftentimes don't even have IRL spaces to talk about adult subject matters.
> >Content Filters:
Sound like something people might not want tied to real-world identities.
> >Age-gated Spaces:
So, #politics in my local instance.
1. So they can use it against you later if they want to (eg. blackmail, spying, etc.)
2. So they can start shutting off access to content that those in power don't like
Calling it now: Reddit is next.
1. A way for politicians and the state to tie porn habits to US citizens and use that information against them in the future: blackmail to coerce future politicians, business leaders, and the wealthy into doing what those in power want.
2. A way for conservatives to tighten the noose around non-chaste materials and begin to purge them from the internet. And if that works, that's hardly the last thing that will go. Next will be LGBT content, women's rights content, atheist content, pro-labor content, and more. (Or if you're on the other side of the political spectrum, consider that the powers could be used to remove Christian content, 2nd Amendment content, etc. It doesn't really matter what is being removed, just that the mechanisms are in place and that powers can put a lid on the populace.)
We aren't screaming loudly enough.
Do not try to sugarcoat this with a pedantic quibble.
This is far worse.
It's a first step down a path the Big Brother state wants.
Yell.
Scream.
Protest.
This topic really brings out the crazy conspiracy theories.
No, politicians are not using Discord age verification to track constituents' porn habits and blackmail them with it later.
> For the majority of adult users, we will be able to confirm your age group using information we already have. We use age prediction to determine, with high confidence, when a user is an adult. This allows many adults to access age-appropriate features without completing an explicit age check.
> Facial scans never leave your device. Discord and our vendor partners never receive it. IDs are used to get your age only and then deleted. Discord only receives your age — that’s it. Your identity is never associated with your account.
> We leverage an advanced machine learning model developed at Discord to predict whether a user falls into a particular age group based on patterns of user behavior and several other signals associated with their account on Discord. We only use these signals to assign users to an age group when our confidence level is high; when it isn't, users go through our standard age assurance flow to confirm their age. We do not use your message content in the age estimation model.
I work with corporate privacy all of the time, and there is actually something really interesting going on here. We're basically never allowed to claim legal compliance using heuristics or predictive models. Like, never ever. They demand a paper trail on everything, and telling our legal team that we are going to leave it to an algorithm on a user device would make them foam at the mouth.
They are basically trusting a piece of software to look at your face or ID in the same way that, like, a server at a restaurant would check before serving you alcohol.
I am curious to see if this kind of software compliance in the long run is even allowable by regulators.
Part 3, Chapter 2, Section 12(4) specifies that user-to-user service providers are required to use either age verification or age estimation (or both!) to prevent children from accessing content that is harmful to children. Section 12(6) goes on to state that "the age verification or age estimation must be of such a kind, and used in such a way, that it is highly effective at correctly determining whether or not a particular user is a child."
Part 12, Section 230(4) rules out self-declaration of age as being a form of age verification/estimation.
So I suppose it'll come down to whether or not Ofcom deems Discord's age estimation as "highly effective".
[Part 3, Chapter 2, Section 12(4)]: https://www.legislation.gov.uk/ukpga/2023/50/part/3/chapter/...
This is unrelated, but something I find interesting is that Category 1 user-to-user services (of which Discord is one, as per The Online Safety Act 2023 (Category 1, Category 2A and Category 2B Threshold Conditions) Regulations 2025) are required by Part 4, Chapter 1, Section 64(1) to "offer all adult users of the service the option to verify their identity (if identity verification is not required for access to the service).".
They have devised a system so lackluster and unverifiable that they can claim they are following the letter of the law without having to turn over anything remotely useful for actually verifying or tracking people's identities.
Hmmm. I feel like self-hosting is the FASTEST way to lose your anonymity. Your self hosted service is MUCH more easily tied to your identity than some third party like discord.
Just imagine you set up a self-hosted forum where you want to discuss something you want to keep private, but the government is very interested and wants to know who you are talking to.
Well, now they know any IP address connecting to your forum is a person of interest. They don't need to decrypt anything to know you are talking to each other.
By using something unique, you are going to make yourself uniquely identifiable.
Also, services like TOR exist. Both on the hosting and user side.
It really bothered me that so many important projects were relying on a proprietary chat technology instead of using mailinglists or IRC which were more decentralized and under the control of the local admin.
I would like to get back to a situation in which you can participate in group chats for open source projects without these being hosted on closed platforms, but if this results in major open source projects shifting from discord to telegram or whatsapp, then nothing will have been learned.
SimpleX seems trustworthy enough, with thoughtful design decisions, even if it fails my "forced tor" requirement. I haven't spent the time to dive into Session's architecture, but it's on my to-do list, currently the marketing copy makes it look like the best choice.
Same with the other account.
An attempt to remove an "optional" and totally not mandatory phone number results in immediate account block "for my own security" and a request for the number.
You might be an outlier.
I thought age verification was only required to access "adult" content?
> Content Filters: Discord users will need to be age-assured as adults in order to unblur sensitive content or turn off the setting.
> Age-gated Spaces: Only users who are age-assured as adults will be able to access age-restricted channels, servers, and app commands.
> Message Request Inbox: Direct messages from people a user may not know are routed to a separate inbox by default, and access to modify this setting is limited to age-assured adult users.
> Friend Request Alerts: People will receive warning prompts for friend requests from users they may not know.
> Stage Restrictions: Only age-assured adults may speak on stage in servers.
Does this mean that in panel-like settings where 100s of users are listening to a speaker, in order to ask or contribute in voice you need to be verified?
https://soatok.blog/2025/07/24/against-the-censorship-of-adu...
I’ll be building a new platform on these two technologies and using Zoom or something else like Jitsi on the side for video/audio sharing.
It’s time to accept the loss of “features” and go back to something simpler but also something that can still be here in 38 years — like IRC has been.
I guess I have a hard time understanding these calls to switch to a platform that has even fewer features than the unverified Discord accounts. The blog post is incorrect in claiming that verification will be mandatory. It will only be necessary to access certain features and content. For simple IRC-style chats or even for voice chats with gaming friends, no verification is required.
The average Discord user, or even the 98th percentile user, isn’t going to be looking to switch to a platform that isn’t a replacement for the features they use. They’re just going to not verify their accounts and move on.
Communities aren’t about the “platform features”; they’re about the environment, as for-profit CEO after CEO fails to recognize, time after time.
I think it's the writing on the wall that's important here, mate. This is only the first step.
Things like image embeds, "markdown lite" formatting, and cross-device synchronization are now considered table stakes. There are always going to be some EFnet-type grognards who resist progress because reasons, but they should be ignored.
IRCv3 and Ergo support some of what's needed already (and in a backwards-compatible way!) but client support just isn't there yet, particularly on mobile.
One other feature that's absolutely considered table stakes now is persistent server-side history, with the ability to edit and delete messages. Modern chat services are less like IRC, and more like a web forum with live updates.
(Yes, you can poorly emulate server-side history on IRC with a bouncer. That's not enough, and it's a pain for users to set up.)
Coming from a former heavy IRC user who's not going back except for nostalgia trips.
I'm sure I'll be fine.
Their DPO ignored a PII leak I discovered and reported last year. Their DPO mail address just creates a Zendesk ticket; I was able to see that the ticket was locked and marked "solved" with no response a few days later.
So, I brought it to the Dutch DPA, who were very responsive, and on the same day as their "final update" email, my nearly decade-old Discord account was suddenly "suspended" hours later. The PII leak, which by then had been ongoing for over a year before my discovery, was suddenly stopped the same day. Funny how that works.
It took 5+ months for Discord's DPO and informal disputes team to finally get back to me after I informed them of the retaliation, with irrelevant copy-and-paste templates giving me walkthrough guides on how to file a "trust and safety" ticket.
When filing a ticket with "trust and safety" under appeal categories I get an automated "please appeal your ban through the app! I am now closing this ticket" response and my ticket's locked once again. And of course, appealing through the app gives me a generic system error.
Watching as things play out, I understand why people try to target discord et al. with their complaints about the loss of anonymity. Being a tiny minority they have no hope to influence their governments because the opposite position is widely popular.
Therefore, they try to convince commercial entities to disregard these laws as much as possible. This is particularly useful for that niche since fighting legislation cannot in itself be done anonymously. Therefore, they attempt to transform a very nonymous (haha) entity to do the fighting on their behalf. If the attempt fails, no harm befalls them.
I think it's a doomed endeavour. To get users on discord, it has to be portrayed to parents as a safe and legal service. The days of underground BBSes are gone. Now, if your brand gets associated with anything negative you're toast. And realistically the anonymous users are kind of useless as a whole. They won't pay, so they're practically just a drag on your platform. Losing them risks not very much.
Overall, a fight with a foregone conclusion. If you want anonymity you have to use other tools and be aware that simply using those tools marks you out as someone who desires anonymity.
1) Addictive design of many social networks (doom scrolling et al.) 2) Privacy & age verification
On 1) most parents would support a legal limit on digital media use by age. But it's not a realistic requirement. Next best thing is to outlaw social media that results in addictive scrolling behaviour. Treating it the same way as smoking is not ideal, but no better solutions have been proposed. Many people on HN wouldn't mind if FB, TikTok and Insta were treated the same way as cocaine. I.e. only available for a lot of money to people who are happy to break the law.
On 2) there are ways to implement technical solutions that would let the government provide a privacy-conscious service allowing businesses to check whether someone is 16+ or 18+ without collecting any other information. These services can be gamed. But that's not the point. A 14 year old could become addicted to cocaine and we wouldn't usually blame the policy for it. The problem is the government tries to solve problem 1) now, while the solution for 2) is still being discussed.
Again, a law that limits social media use for under-16-year-olds is necessary. But so is a toolset that would enable a plausible age check and limit the desire of FANG (and their Chinese competitors) to target minors.
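The privacy-conscious check described above could, in the simplest case, be a signed attestation: a government service issues a token asserting only "16+" or "18+", and a business verifies the signature without learning anything else. A minimal sketch under assumed names; a real deployment would use asymmetric signatures or zero-knowledge proofs rather than a shared HMAC secret:

```python
import base64
import hashlib
import hmac
import json

# Hypothetical shared secret for the sketch only; a real system would
# publish a verification key rather than share a signing secret.
SECRET = b"demo-shared-secret"

def issue_attestation(claim: str) -> str:
    """Government side: sign a bare age claim like '18+' (no identity attached)."""
    payload = json.dumps({"claim": claim}).encode()
    sig = hmac.new(SECRET, payload, hashlib.sha256).hexdigest()
    return base64.b64encode(payload).decode() + "." + sig

def verify_attestation(token: str, required: str) -> bool:
    """Business side: check the signature, then check the claim itself."""
    encoded, sig = token.rsplit(".", 1)
    payload = base64.b64decode(encoded)
    expected = hmac.new(SECRET, payload, hashlib.sha256).hexdigest()
    if not hmac.compare_digest(sig, expected):
        return False  # forged or tampered token
    return json.loads(payload)["claim"] == required

token = issue_attestation("18+")
print(verify_attestation(token, "18+"))  # a valid token reveals only the claim
```

The business never sees a birthdate, name, or ID scan, only a yes/no answer about the one claim it asked for.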
There are those that will stay on Discord because the benefits of the first three outweigh the degradation of privacy. Then there are those that will leave because the first three aren't important enough to outweigh the privacy loss. There will be all sorts of people in between.
HN has a rather amplified showing of folks who won't trust anything unless it's completely decentralized using E2EE clients verifiably compiled from source that they've personally audited running on hardware made from self-mined rare metals. The reality is that there is a spectrum of folks out there, all with different preferences, and while some folks will leave (in this case) Discord, others will remain because that's where the folks they want to chat/game/voice with are.
Back when I played games one friend in our group was banned from LoL arbitrarily so the whole group switched to Dota 2.
Honestly, all of these are documented probabilities at this point. SNS owners can make very decent predictions about what will happen if they introduce a certain kind of friction. Also, it’s not 2005 anymore; people are used to uploading their IDs everywhere. I mentioned it before as well: if you’ve used any large app, chances are you’ve uploaded your ID (Airbnb, Tinder, etc.)
I feel like it has always been on this path to capture more and more of your data and personally link it to who you are.
It will reduce attacks on and abuse of people, because those are usually founded on anonymity (no fear of repercussions, etc.)
I don't mind having a platform where everyone is at least somehow verified. yes, sure, you can bypass it and it is not 100% foolproof but what ever is? It raises the barrier for abuse and that's a good thing IMHO
Welcoming Discord users amidst the challenge of Age Verification
This is a lie, this only affects you if you want to view porn/nsfw channels on discord. I'm in the UK happily using it without age verification.
Edit: it does look a little too corporate for me though with the 'book a demo' and the focus on my 'mission'. Doesn't really give hanging out with friends vibes. Just saying.
this is a big problem - if individuals switch to something else, they will lose access to popular Discord communities.
not sure what solution there is for this, as it's unrealistic that all communities would switch to the same alternative (if at all)
docker run --name ircd -p 6667:6667 inspircd/inspircd-docker
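Once that server is up, the client side of IRC is small too; a sketch of the registration handshake a client sends over the raw TCP socket per RFC 1459/2812 (the nickname, realname, and channel here are placeholders):

```python
# Minimal IRC registration/join messages per RFC 1459/2812, as a client
# would send them to the local inspircd container started above.
def registration_lines(nick: str, realname: str, channel: str) -> list[str]:
    return [
        f"NICK {nick}\r\n",                   # pick a nickname
        f"USER {nick} 0 * :{realname}\r\n",   # register the connection
        f"JOIN {channel}\r\n",                # join a channel once welcomed (001)
    ]

for line in registration_lines("alice", "Alice Example", "#general"):
    print(line, end="")
```

In practice a client waits for the server's 001 welcome before sending JOIN, but the whole protocol stays this legible, which is a big part of why it has lasted.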
In the case of an online-based ID check, even with nice looking privacy terms, there is no guarantee that your ID won't be stored forever and/or re-analyzed many times cross-checking with other services, and worse leaked.
What's really more distressing is that it got this far before people figured out the game--maybe we should be reflecting on that part, the gullibility and the enabling of those people by those who knew better.
And not just that event: Parents are roasting Roblox for kids getting groomed, but after the relationship is initiated, the groomers always immediately move the convo to Discord.
Imagine what will happen post-IPO.
Did they forget it's proprietary, and from the same person that made OpenFeint, which also had a privacy lawsuit?