The Promise
Pika Labs launched PikaMe (pika.me) in early 2026 as an "AI Self" platform. The pitch: upload a selfie, record your voice, answer some personality questions, and Pika creates a persistent AI version of you that can text, post on social media, join video calls, and act autonomously across 16+ platforms.
When users reasonably ask whether their face and voice data will be used to train AI models, Pika's FAQ page offers a clear, comforting answer:
"No. Your inputs are used to create your experience on the platform only. We won't use your likeness or inputs to train other people's AI Selves or any general-purpose models."
pika.me → FAQ → Will my personal data be used to train AI models?

The FAQ also states users own all generated IP. The marketing copy calls the process "giving birth" to your AI Self. It's friendly, playful, disarming.
Then you read the Terms of Service.
What the Terms Actually Say
Pika's Terms of Service, hosted at pika.art/terms-of-service and governing both pika.art and pika.me, contain a license grant that directly contradicts the marketing:
Users grant Pika a "non-exclusive, irrevocable, perpetual, worldwide, royalty-free, fully paid, transferable, sublicensable right and license to use any Inputs and Outputs made available by users or otherwise generated in connection with their use of the Service."
pika.art/terms-of-service

Every amplifying adjective in the legal dictionary is present. Here's what each means for your face and voice data:
- Perpetual + Irrevocable: Even if you delete your account, Pika retains the right to use your biometric data forever. You cannot revoke this grant.
- Transferable: If Pika is acquired, your face/voice data rights transfer to the acquiring company.
- Sublicensable: Pika can grant these rights to third parties — partners, advertisers, anyone — without notifying you.
- Worldwide + Royalty-free: They can use your likeness anywhere, for any purpose, without paying you.
But the Terms go further. There is an explicit training clause:
"Inputs, Outputs, and user interactions with the Service may be used by Pika to train, enhance, evolve and improve its machine learning models and artificial models, algorithms and related technology, products and services (including for labeling, classification, content moderation and model training purposes)."
pika.art/terms-of-service

Side by side
"We won't use your likeness or inputs to train other people's AI Selves or any general-purpose models."
"Inputs...may be used by Pika to train, enhance, evolve and improve its machine learning models...for model training purposes."
In any legal dispute, the Terms of Service control. The FAQ is marketing copy. It is not a binding legal document. If Pika trains a general-purpose model on your face data tomorrow, you have no recourse based on the FAQ — because the ToS you agreed to explicitly permits it.
The ToS timeline: written for video, applied to your face
This is where it gets worse. A Wayback Machine analysis of Pika's Terms of Service reveals that these terms were originally written for a video generation tool and then silently expanded to cover biometric collection.
Inputs defined as: "text prompts, directions, images, videos, or other content." No mention of likeness, voice samples, AI Self, biometric data, facial geometry, or digital twins.
Wayback Machine → pika.art/terms-of-service (Nov 2024)

Inputs redefined as: "text prompts, directions, images, videos, voice samples, likeness data, documentation, or other material." The term "AI Self" appears 40 times. "Likeness" appears 10 times.
pika.art/terms-of-service (current)

The timeline tells the story:
- Nov 2023 – Nov 2024: ToS covers video generation only. "Inputs" means text and images.
- Dec 2, 2024: Privacy Policy quietly adds "biometric data, such as voice data and scans of faces" — but with no mention of AI Self, pika.me, or likeness. Users are not notified.
- Feb 11, 2026: ToS is massively rewritten to add AI Self language — 9 days before public launch. The perpetual, irrevocable license now applies to "voice samples" and "likeness data."
- ~Feb 20, 2026: PikaMe launches publicly. Users who signed up for video generation are silently opted in under the expanded terms.
- Mar 15, 2026: Privacy Policy finally updated for AI Self — 3 weeks after launch. For those 3 weeks, users were uploading face and voice data under a privacy policy that didn't describe the product they were using.
One more detail: the word "biometric" appears zero times in the Terms of Service. Not in the original version. Not in the current version. The only document that acknowledges biometric collection is the Privacy Policy — which lives on a different domain and was updated weeks after users started uploading their faces.
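The zero-count claim is easy to verify against saved copies of each document. A minimal sketch (the excerpt below is a toy stand-in, not the actual ToS text):

```python
import re

def term_counts(text: str, terms: list[str]) -> dict[str, int]:
    """Count case-insensitive, whole-phrase occurrences of each term."""
    return {
        t: len(re.findall(re.escape(t), text, flags=re.IGNORECASE))
        for t in terms
    }

# Toy excerpt standing in for a saved ToS snapshot.
snapshot = (
    "Users grant a license over voice samples and likeness data. "
    "Your AI Self is generated from likeness data."
)
print(term_counts(snapshot, ["biometric", "AI Self", "likeness"]))
# {'biometric': 0, 'AI Self': 1, 'likeness': 2}
```

Run the same counts over the live ToS and the Wayback snapshots and you can reproduce the figures above yourself.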
This is precisely the pattern the FTC flagged in February 2024: companies that "adopt more permissive data practices" and "only inform consumers of this change through a surreptitious, retroactive amendment to its terms of service."
The Biometric Data Problem
PikaMe doesn't just collect photos. The onboarding process requires three categories of biometric data:
- Facial geometry — via selfie upload, used to generate your AI Self's visual likeness
- Voiceprint — via voice recording, used for voice cloning and speech synthesis
- Behavioral biometrics — via personality questionnaire and ongoing interactions, used to model your personality
Pika's Privacy Policy explicitly acknowledges this:
"Biometric information, such as scans of facial features or voice data, where applicable, when you use features that allow you to create content using such inputs."
pika.art/privacy-policy

This is significant because biometric data occupies a special legal category in most privacy frameworks. It is, by definition, data that is uniquely and permanently tied to your physical identity. You can change a password. You cannot change your face.
What's missing from the Privacy Policy
- No data retention period. How long are face scans and voice recordings kept? Not stated.
- No deletion mechanism. The FAQ says "your data gets deleted" when you delete your account, but the ToS license is perpetual and irrevocable — meaning Pika may retain the right to use data already collected.
- No CCPA rights section. Despite being headquartered in Palo Alto, California, the privacy policy contains no "right to delete," "right to access," or "do not sell" provisions required under California law.
- No training opt-out. There is no mechanism to consent to the service while opting out of model training.
BIPA, GDPR, and the Legal Void
PikaMe's legal documents do not mention BIPA (Illinois Biometric Information Privacy Act), GDPR (EU General Data Protection Regulation), or CCPA/CPRA (California Consumer Privacy Act). For a service collecting facial geometry and voiceprints from users worldwide, this is a remarkable omission.
BIPA exposure
Illinois BIPA (740 ILCS 14) is the strictest biometric privacy law in the United States. It requires:
- Written informed consent before collecting biometric identifiers — a specific, separate consent, not generic terms-of-service acceptance
- A publicly available retention and destruction schedule
- A prohibition on selling, leasing, or profiting from biometric data
Pika's current posture fails all three. The consent mechanism is a generic browsewrap ToS. There is no published retention schedule. The broad sublicensable/transferable license and third-party sharing language could constitute "profiting from" biometric data.
Statutory damages under BIPA: $1,000 per negligent violation, $5,000 per intentional violation.
Clearview AI settled BIPA claims for $51.75 million. YouTube/Google paid $6 million over face recognition. Lensa AI (Prisma Labs) was sued over its Magic Avatars feature — the same category of data collection PikaMe performs.
In 2025 alone, 107+ BIPA class actions were filed in Illinois. Pika is collecting the same data categories as these defendants, at scale, with arguably weaker compliance.
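To put the statutory numbers in perspective, a back-of-envelope calculation. The user counts below are hypothetical, and actual exposure depends on how a court counts violations; this only shows how fast per-violation damages compound:

```python
# Statutory damages under 740 ILCS 14/20.
NEGLIGENT = 1_000    # per negligent violation
INTENTIONAL = 5_000  # per intentional or reckless violation

def bipa_exposure(illinois_users: int, identifiers_per_user: int,
                  per_violation: int) -> int:
    """Rough statutory exposure if each identifier collected without
    valid consent counts as one violation."""
    return illinois_users * identifiers_per_user * per_violation

# Hypothetical: 50,000 Illinois users, 2 BIPA identifiers each
# (face geometry + voiceprint), negligent standard.
print(f"${bipa_exposure(50_000, 2, NEGLIGENT):,}")  # $100,000,000
```

Even modest Illinois adoption puts hypothetical exposure in Clearview-settlement territory.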
The FTC's Everalbum consent order is the closest precedent to PikaMe's situation. Everalbum's help articles told users facial recognition was consent-only; in reality, the company applied it to all users' photos and trained models on the data.
The FTC required Everalbum to: (1) obtain express consent, (2) delete all biometric data collected without consent, and (3) destroy the AI models and algorithms trained on deceptively obtained data — a first-of-its-kind "algorithmic destruction" remedy.
In February 2024, the FTC explicitly warned that it "may be unfair or deceptive for a company to adopt more permissive data practices — and to only inform consumers of this change through a surreptitious, retroactive amendment to its terms of service." PikaMe's FAQ-vs-ToS contradiction fits this pattern exactly.
GDPR gaps
Under GDPR, biometric data is a "special category" requiring explicit consent under Article 9. Pika's Privacy Policy does not mention GDPR, does not describe a lawful basis for processing biometric data, does not enumerate data subject rights, and does not describe international data transfer safeguards. Any EU user uploading their face to PikaMe is doing so under a legal framework that does not acknowledge GDPR exists.
Technical Deep Dive: What They Collect
PikaMe is not a photo filter app. It is a persistent, multi-modal AI agent platform that collects, processes, and deploys biometric data at a depth well beyond prior consumer AI products.
The onboarding pipeline
The ToS define user inputs as: "text prompts, directions, images, videos, voice samples, likeness data, documentation, or other material submitted by the user to generate their AI Self." The onboarding flow extracts:
1. Selfie Upload
→ Facial geometry extraction
→ Visual likeness model generation
2. Voice Recording
→ Voiceprint extraction
→ Speech synthesis / voice cloning
3. Personality Questionnaire
→ Behavioral trait mapping
→ Communication style modeling
4. Ongoing Interactions (continuous)
→ Persistent memory across sessions
→ Behavioral refinement from corrections
→ Cross-platform interaction data (16+ platforms)
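What this pipeline accumulates can be made concrete with a hypothetical record schema — the field names below are illustrative assumptions, not Pika's actual data model:

```python
from dataclasses import dataclass, field

@dataclass
class AISelfProfile:
    """Hypothetical sketch of the per-user record the pipeline implies."""
    user_id: str
    facial_geometry: bytes                  # from selfie upload
    voiceprint: bytes                       # from voice recording
    personality_traits: dict[str, float]    # from questionnaire
    interaction_log: list[str] = field(default_factory=list)      # grows forever
    connected_platforms: list[str] = field(default_factory=list)  # 16+ possible

profile = AISelfProfile("u123", b"...", b"...", {"openness": 0.8})
profile.connected_platforms += ["slack", "zoom"]  # each adds exposure
```

The point of the sketch: unlike a one-shot filter, every field here is designed to persist and accrete across sessions and platforms.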
PikaStream 1.0: Real-time video deepfakes
The most recent expansion is PikaStream 1.0, which enables your AI Self to join Google Meet calls as a live talking-head avatar:
- Training data: ~10 million pre-training clips, ~2 million supervised clips
- Performance: 24 fps, ~1.5 seconds latency on a single NVIDIA H100
- Identity consistency: Modifiable mid-session without restarting
- Memory: Persistent conversational context across sessions
This is not an avatar overlay. It is a real-time generative video model producing convincing talking-head output for live business meetings.
Cross-platform data exposure
The AI Self connects to Slack, Telegram, WhatsApp, Discord, iMessage, Signal, Google Chat, X, Instagram, LinkedIn, YouTube, Notion, GitHub, Dropbox, Figma, and Zoom. Your biometric data is exposed not just to Pika's infrastructure but to every connected platform's data practices.
"You are responsible for the activity and behavior of your AI Self, so make sure you are following the regulations, terms and conditions of the respective platforms you connect them to."
Translation: if your AI Self violates another platform's terms, that's your problem.
The legal documents live on a different domain
pika.me has no privacy policy or terms of service pages. Every legal URL — /privacy, /terms, /terms-of-service — returns 404. The legal documents governing your biometric data live exclusively on pika.art, a completely different domain for Pika's video generation product.
The two domains don't even share infrastructure: pika.me uses Vercel with Let's Encrypt SSL, while pika.art uses Cloudflare with Google Trust Services SSL. They are operationally separate products united only by a single set of legal documents hosted on one of them. The privacy contact listed? support@pika.art — not even the domain you gave your face to.
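The 404 claim is easy to re-check. A minimal sketch with the HTTP fetcher injected so the offline stub below reproduces the observed behavior; swap in a real client (e.g. urllib) to test live:

```python
from typing import Callable

LEGAL_PATHS = ["/privacy", "/terms", "/terms-of-service"]

def missing_legal_pages(domain: str,
                        fetch_status: Callable[[str], int]) -> list[str]:
    """Return the legal URLs on `domain` that do not resolve (non-200)."""
    return [f"https://{domain}{p}" for p in LEGAL_PATHS
            if fetch_status(f"https://{domain}{p}") != 200]

# Offline stub reproducing the behavior observed at time of writing.
observed = {f"https://pika.me{p}": 404 for p in LEGAL_PATHS}
print(missing_legal_pages("pika.me", lambda url: observed.get(url, 200)))
# all three legal URLs come back missing
```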
Apple privacy labels vs. reality
Pika's App Store listing categorizes its data collection as "Photos or Videos" and "Audio Data." Notably absent: any mention of biometric data, face data, or voiceprint data. Their own privacy policy explicitly acknowledges collecting "scans of facial features or voice data," but Apple's nutrition labels bury this under generic categories. A user checking the App Store before downloading would have no idea their facial geometry is being extracted and processed.
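The gap between the two disclosures amounts to a set difference. A sketch using the categories as this article describes them — not an authoritative parse of either document:

```python
# Declared on the App Store nutrition label (per this article).
app_store_labels = {"photos or videos", "audio data"}

# Acknowledged in the privacy policy text (per this article).
privacy_policy_categories = {
    "photos or videos", "audio data",
    "scans of facial features",  # biometric, per the policy's own wording
    "voice data",                # biometric, per the policy's own wording
}

undisclosed_on_label = privacy_policy_categories - app_store_labels
print(sorted(undisclosed_on_label))
# ['scans of facial features', 'voice data']
```

The two biometric categories are exactly the ones a prospective user would most want to see before downloading.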
Mandatory arbitration for biometric disputes
The Terms of Service include mandatory binding arbitration and a class action waiver. If your biometric data is mishandled, you cannot sue Pika in court or join a class action — you must resolve disputes individually through arbitration. This is particularly significant because BIPA's enforcement power comes largely from class action litigation. The arbitration clause effectively neutralizes the strongest legal mechanism consumers have.
How Pika Compares
| Feature | Pika AI Self | Industry Norm |
|---|---|---|
| License scope | Perpetual, irrevocable, sublicensable, transferable | Limited to operating the service |
| Training opt-out | None | Available |
| Biometric consent | Implicit browsewrap | Separate written consent |
| Retention period | Not specified | Disclosed |
| Deletion rights | Not addressed | Available on request |
| BIPA/GDPR | Absent | Present |
| Marketing vs legal | Contradictory | Consistent |
FaceApp and Lensa AI comparison
Both faced privacy backlash for collecting facial data. PikaMe's collection is substantially broader:
- FaceApp/Lensa collected photos for one-time transformations. PikaMe collects face + voice + personality and builds a persistent agent.
- FaceApp/Lensa data was (claimed to be) ephemeral. PikaMe data is persistent by design.
- FaceApp/Lensa operated within their own apps. PikaMe deploys your likeness across 16+ third-party platforms.
- Lensa was sued under BIPA. Pika collects the same data categories plus voice biometrics, with no visible BIPA consent mechanism.
The Deepfake-in-a-Meeting Problem
PikaStream 1.0 creates a risk category that prior AI avatar products did not: convincing, real-time video deepfakes that participate in live business meetings.
Pika's own description: "It is your AI Self, responding in real time with naturally expressive facial movements, emotionally appropriate reactions, and persistent memory of who they're talking to."
The Terms prohibit using someone else's likeness without permission. Enforcement relies on AI moderation and user reports. But the infrastructure itself — a real-time generative video model, deployable via API — creates a novel attack surface for social engineering, identity fraud, and impersonation at consumer scale.
The PikaStream video meeting skill is open-sourced on GitHub and available via API key. Any application can put a synthetic face on a live call.
What You Should Do
If you've already used PikaMe:
- Read the actual Terms of Service at pika.art/terms-of-service, not the FAQ.
- Understand what you've agreed to: a perpetual, irrevocable license over your face and voice data that survives account deletion.
- If you're in Illinois: you may have BIPA rights that Pika is not currently honoring.
- If you're in the EU: you may have GDPR rights to deletion that are not addressed in Pika's Privacy Policy.
- Consider the cross-platform exposure: every platform your AI Self connects to is another entity with access to your likeness data.
If you haven't used PikaMe yet: read the Terms before you upload your face. The marketing copy and the legal terms tell very different stories.
PikaMe collects three categories of biometric data — face, voice, and personality — under a Terms of Service that was originally written for video generation and silently expanded to cover biometric collection 9 days before launch. The perpetual, irrevocable, sublicensable license that users agreed to for video clips now applies to their facial geometry and voiceprints. The word "biometric" appears zero times in the ToS. BIPA and CCPA are never mentioned. The marketing FAQ directly contradicts the binding legal terms.
The FTC forced Everalbum to destroy AI models trained on deceptively obtained face data in a closely analogous fact pattern. Pika is collecting the same data, at greater scale, with the same type of misrepresentation — and a mandatory arbitration clause designed to prevent the class actions that enforce BIPA.
Sources
Every claim links to a primary source. Verify everything yourself.
- Pika Terms of Service
- Pika Privacy Policy
- PikaMe Landing Page & FAQ
- Pika Acceptable Use Policy
- Pika iOS App Store Listing
- Pika Skills (GitHub)
- Illinois BIPA (740 ILCS 14)
- Clearview AI $51.75M BIPA Settlement
- Lensa AI BIPA Class Action
- FTC v. Everalbum — Algorithm Destruction Order (2021)
- FTC: AI Companies Quietly Changing ToS Could Be Deceptive (2024)
- Pika iOS App Store — Privacy Nutrition Labels
- Wayback Machine — Pika ToS (Nov 2024, pre-AI Self)
- Wayback Machine — Pika Privacy Policy (Dec 2024, first biometric mention)