New Delhi: Anthropic has begun rolling out identity verification checks for certain users and use cases on its Claude AI platform, marking a shift in how access to advanced AI tools is managed. The move comes as companies face growing pressure to track usage and prevent misuse of powerful systems.
The company said the rollout is part of routine safety and compliance checks. Users may see verification prompts when trying to access certain features or during periodic platform reviews.
Anthropic has framed the move as a step towards responsible AI deployment. The company wrote, “Being responsible with powerful technology starts with knowing who is using it.”
The verification process is not universal at this stage. It applies to specific use cases, particularly those involving higher-capability access or elevated risk. The aim is to ensure that users comply with platform policies and legal requirements.
Users asked to verify their identity will need a government-issued photo ID and access to a camera for a live selfie. The process typically takes a few minutes to complete.
Accepted documents include passports, driving licences, and national ID cards. Anthropic has clarified that photocopies, digital IDs, and non-government documents will not be accepted.
Anthropic has partnered with Persona to handle identity verification. According to the company, verification data is processed securely and is not stored directly on its systems.
The company said, “We only use your verification data to confirm who you are and not for any other purposes.”
It added that identity data is not used to train AI models and is protected through encryption during transfer and storage.
Verification may fail if documents are unclear or expired, or if technical issues occur. Users can retry the process or contact support for further checks.
In some cases, accounts may face restrictions or bans based on policy violations or other compliance issues. The company has provided an appeal mechanism for users who believe their accounts were wrongly affected.