Anthropic has begun rolling out identity verification on Claude "for just a few use cases." The company didn't list those use cases in its announcement, but we've asked it for details and will update this post when we hear back. Anthropic says you might see a verification prompt upon "accessing certain capabilities," asking you to verify your identity. You would have to present a valid, physical government-issued photo ID. You'd also have to take a selfie with your phone or computer camera, which the system will compare against the ID you present.
The news, as you'd expect, wasn't well-received. Many users are questioning the necessity of identity verification to use an AI chatbot, especially if Anthropic already has their credit cards on file as paying subscribers. People are also criticizing Anthropic's decision to use Persona Identities, which also provides age verification services for OpenAI and Roblox. One of Persona's major investors is venture firm Founders Fund, which was co-founded by Peter Thiel, who is also the co-founder and chairman of surveillance company Palantir.
Palantir's customers are mostly federal agencies and government offices, including the FBI, the CIA and US Immigration and Customs Enforcement. Most criticisms of the company center on the services it provides these customers, as they're primarily used to expand government surveillance through its facial recognition and AI technologies.
In its announcement, Anthropic said that Persona will be the one handling your IDs and selfies, and that it will not copy and store those images. It also said that Persona is "contractually restricted" in how it can use your data and that all data passing through its process is "encrypted in transit and at rest." Anthropic emphasized that it will not use your identity data to train its models and that it will not share your data with anyone else.
Update April 16, 2026, 11:35AM ET: Reached for comment, an Anthropic spokesperson told Engadget that "this applies to a small number of cases where we see activity that indicates potentially fraudulent or abusive behavior, which violates our usage policy."