The Promise

Pika Labs launched PikaMe (pika.me) in early 2026 as an "AI Self" platform. The pitch: upload a selfie, record your voice, answer some personality questions, and Pika creates a persistent AI version of you that can text, post on social media, join video calls, and act autonomously across 16+ platforms.

When users ask whether their face and voice data will be used to train AI models (a reasonable question), Pika's FAQ page offers a clear, comforting answer:

pika.me FAQ — Safety & Privacy

"No. Your inputs are used to create your experience on the platform only. We won't use your likeness or inputs to train other people's AI Selves or any general-purpose models."

pika.me → FAQ → Will my personal data be used to train AI models?

The FAQ also states users own all generated IP. The marketing copy calls the process "giving birth" to your AI Self. It's friendly, playful, disarming.

Then you read the Terms of Service.

···

What the Terms Actually Say

Pika's Terms of Service, hosted at pika.art/terms-of-service and governing both pika.art and pika.me, contain a license grant that directly contradicts the marketing: a perpetual, irrevocable, sublicensable, transferable license over everything you submit.

Every amplifying adjective in the legal dictionary is present. Here's what each means for your face and voice data: "perpetual" means the license never expires; "irrevocable" means you cannot withdraw it, even by deleting your account; "sublicensable" means Pika can grant it onward to third parties; "transferable" means it survives an acquisition or asset sale.

But the Terms go further. There is also an explicit training clause.

Side by side

pika.me Marketing

"We won't use your likeness or inputs to train other people's AI Selves or any general-purpose models."

pika.art Terms of Service

"Inputs...may be used by Pika to train, enhance, evolve and improve its machine learning models...for model training purposes."

In any legal dispute, the Terms of Service control. The FAQ is marketing copy. It is not a binding legal document. If Pika trains a general-purpose model on your face data tomorrow, you have no recourse based on the FAQ — because the ToS you agreed to explicitly permits it.

The ToS timeline: written for video, applied to your face

This is where it gets worse. A Wayback Machine analysis of Pika's Terms of Service reveals that these terms were originally written for a video generation tool and then silently expanded to cover biometric collection.

Original ToS — November 2023 through November 2024

Inputs defined as: "text prompts, directions, images, videos, or other content." No mention of likeness, voice samples, AI Self, biometric data, facial geometry, or digital twins.

Wayback Machine → pika.art/terms-of-service (Nov 2024)

The timeline tells the story:

  1. November 2023: Terms published for pika.art, a video generation tool. Inputs are prompts, images, and videos.
  2. Through November 2024: the Inputs definition is unchanged, with no mention of likeness, voice, or biometric data.
  3. Nine days before PikaMe's launch: the definition is silently expanded to add "voice samples, likeness data" and the AI Self.
  4. Early 2026: PikaMe launches, and a license written for video clips now covers users' facial geometry and voiceprints.

One more detail: the word "biometric" appears zero times in the Terms of Service. Not in the original version. Not in the current version. The only document that acknowledges biometric collection is the Privacy Policy — which lives on a different domain and was updated weeks after users started uploading their faces.
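Both findings above, the snapshot timeline and the zero occurrences of "biometric," can be checked against the Internet Archive yourself. A minimal sketch using the archive's CDX API (the query parameters are the documented ones; actually fetching and diffing the snapshots is left to whatever HTTP client you prefer):

```python
from urllib.parse import urlencode

CDX = "https://web.archive.org/cdx/search/cdx"

def cdx_query(page_url: str, year: int) -> str:
    """Build a CDX API query listing archived snapshots of a page for one year."""
    params = {
        "url": page_url,
        "from": f"{year}0101",
        "to": f"{year}1231",
        "output": "json",
        "fl": "timestamp,original",
    }
    return f"{CDX}?{urlencode(params)}"

def count_term(snapshot_text: str, term: str = "biometric") -> int:
    """Count case-insensitive occurrences of a term in a snapshot's text."""
    return snapshot_text.lower().count(term.lower())

# List 2024 snapshots of the ToS, then fetch each and count "biometric":
url = cdx_query("pika.art/terms-of-service", 2024)
```

Running `count_term` over each archived version is how you confirm the word never appears in any revision of the Terms.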

This is precisely the pattern the FTC flagged in February 2024: companies that "adopt more permissive data practices" and "only inform consumers of this change through a surreptitious, retroactive amendment to its terms of service."

···

The Biometric Data Problem

PikaMe doesn't just collect photos. The onboarding process requires three categories of biometric data:

  1. Facial geometry — via selfie upload, used to generate your AI Self's visual likeness
  2. Voiceprint — via voice recording, used for voice cloning and speech synthesis
  3. Behavioral biometrics — via personality questionnaire and ongoing interactions, used to model your personality

Pika's Privacy Policy explicitly acknowledges this, listing "scans of facial features or voice data" among the information it collects.

This is significant because biometric data occupies a special legal category in most privacy frameworks. It is, by definition, data that is uniquely and permanently tied to your physical identity. You can change a password. You cannot change your face.

What's missing from the Privacy Policy

A retention and destruction schedule. A deletion process. A lawful basis for processing. Any mention of BIPA, GDPR, or CCPA. International transfer safeguards. Everything a policy governing biometric data is expected to contain is absent.

···

BIPA, GDPR, and the Legal Void

PikaMe's legal documents do not mention BIPA (Illinois Biometric Information Privacy Act), GDPR (EU General Data Protection Regulation), or CCPA/CPRA (California Consumer Privacy Act). For a service collecting facial geometry and voiceprints from users worldwide, this is a remarkable omission.

BIPA exposure

Illinois BIPA (740 ILCS 14) is the strictest biometric privacy law in the United States. It requires:

  1. Written informed consent before collecting biometric identifiers — a specific, separate consent, not generic terms-of-service acceptance
  2. A publicly available retention and destruction schedule
  3. A prohibition on selling, leasing, or profiting from biometric data

Pika's current posture fails all three. The consent mechanism is a generic browsewrap ToS. There is no published retention schedule. The broad sublicensable/transferable license and third-party sharing language could constitute "profiting from" biometric data.

BIPA Precedent

Statutory damages under BIPA: $1,000 per negligent violation, $5,000 per intentional violation.

Clearview AI settled for $51.75 million. YouTube/Google face recognition: $6 million. Lensa AI (Prisma Labs) was sued for its Magic Avatars feature — the exact same category of data collection PikaMe performs.

In 2025 alone, 107+ BIPA class actions were filed in Illinois. Pika is collecting the same data categories as these defendants, at scale, with arguably weaker compliance.
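BIPA's statutory damages scale linearly with class size, which is why settlements reach nine figures. A back-of-the-envelope sketch (the per-violation figures are the statute's; the user counts are hypothetical):

```python
NEGLIGENT = 1_000    # per violation, 740 ILCS 14/20(1)
INTENTIONAL = 5_000  # per violation, 740 ILCS 14/20(2)

def bipa_exposure(illinois_users: int, violations_per_user: int = 1,
                  intentional: bool = False) -> int:
    """Rough statutory exposure: users x violations x per-violation damages."""
    per_violation = INTENTIONAL if intentional else NEGLIGENT
    return illinois_users * violations_per_user * per_violation

# Hypothetical: 10,000 Illinois users, one negligent violation each
print(bipa_exposure(10_000))  # 10000000
```

Even a modest Illinois user base at the negligent tier produces eight-figure exposure; the intentional tier multiplies that by five.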

FTC Precedent: Everalbum (2021)

The FTC's Everalbum consent order is the closest precedent to PikaMe's situation. Everalbum's help articles told users facial recognition was consent-only; in reality, the company applied it to all users' photos and trained models on the data.

The FTC required Everalbum to: (1) obtain express consent, (2) delete all biometric data collected without consent, and (3) destroy the AI models and algorithms trained on deceptively obtained data — a first-of-its-kind "algorithmic destruction" remedy.

In February 2024, the FTC explicitly warned that it "may be unfair or deceptive for a company to adopt more permissive data practices — and to only inform consumers of this change through a surreptitious, retroactive amendment to its terms of service." PikaMe's FAQ-vs-ToS contradiction fits this pattern exactly.

GDPR gaps

Under GDPR, biometric data is a "special category" requiring explicit consent under Article 9. Pika's Privacy Policy does not mention GDPR, does not describe a lawful basis for processing biometric data, does not enumerate data subject rights, and does not describe international data transfer safeguards. Any EU user uploading their face to PikaMe is doing so under a legal framework that does not acknowledge GDPR exists.

···

Technical Deep Dive: What They Collect

PikaMe is not a photo filter app. It is a persistent, multi-modal AI agent platform that collects, processes, and deploys biometric data at a depth that significantly exceeds that of any prior consumer AI product.

The onboarding pipeline

The ToS define user inputs as: "text prompts, directions, images, videos, voice samples, likeness data, documentation, or other material submitted by the user to generate their AI Self." The onboarding flow extracts:

// PikaMe onboarding data collection

1. Selfie Upload
   → Facial geometry extraction
   → Visual likeness model generation

2. Voice Recording
   → Voiceprint extraction
   → Speech synthesis / voice cloning

3. Personality Questionnaire
   → Behavioral trait mapping
   → Communication style modeling

4. Ongoing Interactions (continuous)
   → Persistent memory across sessions
   → Behavioral refinement from corrections
   → Cross-platform interaction data (16+ platforms)
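The four collection stages above imply a per-user record that bundles all three biometric categories with the cross-platform layer. A hypothetical sketch (field names are illustrative, not Pika's actual schema):

```python
from dataclasses import dataclass, field

@dataclass
class BiometricProfile:
    """Hypothetical model of the per-user record PikaMe's onboarding implies."""
    facial_geometry: bytes    # extracted from the selfie upload
    voiceprint: bytes         # extracted from the voice recording
    personality_traits: dict  # from the questionnaire and ongoing corrections
    connected_platforms: list = field(default_factory=list)  # Slack, Telegram, ...

    def holds_biometric_identifiers(self) -> bool:
        # Facial geometry and voiceprints are "biometric identifiers" under BIPA
        return bool(self.facial_geometry or self.voiceprint)
```

The point of the sketch: the first two fields alone are enough to trigger biometric privacy statutes, regardless of what the other fields contain.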

PikaStream 1.0: Real-time video deepfakes

The most recent expansion is PikaStream 1.0, which enables your AI Self to join Google Meet calls as a live talking-head avatar.

This is not an avatar overlay. It is a real-time generative video model producing convincing talking-head output for live business meetings.

Cross-platform data exposure

The AI Self connects to Slack, Telegram, WhatsApp, Discord, iMessage, Signal, Google Chat, X, Instagram, LinkedIn, YouTube, Notion, GitHub, Dropbox, Figma, and Zoom. Your biometric data is exposed not just to Pika's infrastructure but to every connected platform's data practices.

pika.me FAQ

"You are responsible for the activity and behavior of your AI Self, so make sure you are following the regulations, terms and conditions of the respective platforms you connect them to."

Translation: if your AI Self violates another platform's terms, that's your problem.

The legal documents live on a different domain

pika.me has no privacy policy or terms of service pages. Every legal URL — /privacy, /terms, /terms-of-service — returns 404. The legal documents governing your biometric data live exclusively on pika.art, a completely different domain for Pika's video generation product.

The two domains don't even share infrastructure: pika.me uses Vercel with Let's Encrypt SSL, while pika.art uses Cloudflare with Google Trust Services SSL. They are operationally separate products united only by a single set of legal documents hosted on one of them. The privacy contact listed? support@pika.art — not even the domain you gave your face to.
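The 404 claim is straightforward to verify. A sketch with a pluggable fetcher (the paths are the ones named above; `fetch_status` stands in for any real HTTP client, e.g. `lambda u: requests.head(u, allow_redirects=True).status_code`):

```python
LEGAL_PATHS = ["/privacy", "/terms", "/terms-of-service"]

def audit_legal_pages(domain: str, fetch_status) -> dict:
    """Map each expected legal path on a domain to its HTTP status code.

    fetch_status is any callable taking a URL and returning an int status.
    """
    return {path: fetch_status(f"https://{domain}{path}") for path in LEGAL_PATHS}

def all_missing(statuses: dict) -> bool:
    """True when every legal page 404s, as pika.me's currently do."""
    return all(code == 404 for code in statuses.values())

# Stub fetcher standing in for a live HTTP client:
statuses = audit_legal_pages("pika.me", lambda url: 404)
print(all_missing(statuses))  # True
```

Swap in a real client and run it against both domains: the legal pages resolve on pika.art and 404 on pika.me.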

Apple privacy labels vs. reality

Pika's App Store listing categorizes its data collection as "Photos or Videos" and "Audio Data." Notably absent: any mention of biometric data, face data, or voiceprint data. Their own privacy policy explicitly acknowledges collecting "scans of facial features or voice data," but Apple's nutrition labels bury this under generic categories. A user checking the App Store before downloading would have no idea their facial geometry is being extracted and processed.

Mandatory arbitration for biometric disputes

The Terms of Service include mandatory binding arbitration and a class action waiver. If your biometric data is mishandled, you cannot sue Pika in court or join a class action — you must resolve disputes individually through arbitration. This is particularly significant because BIPA's enforcement power comes largely from class action litigation. The arbitration clause effectively neutralizes the strongest legal mechanism consumers have.

···

How Pika Compares

| Feature | Pika AI Self | Industry Norm |
|---|---|---|
| License scope | Perpetual, irrevocable, sublicensable, transferable | Limited to operating the service |
| Training opt-out | None | Available |
| Biometric consent | Implicit browsewrap | Separate written consent |
| Retention period | Not specified | Disclosed |
| Deletion rights | Not addressed | Available on request |
| BIPA/GDPR | Absent | Present |
| Marketing vs. legal | Contradictory | Consistent |

FaceApp and Lensa AI comparison

Both faced privacy backlash for collecting facial data. PikaMe's collection is substantially broader: FaceApp and Lensa processed photos to generate static images, while PikaMe extracts facial geometry, clones your voice, models your personality, and keeps collecting through ongoing interactions across 16+ connected platforms.

···

The Deepfake-in-a-Meeting Problem

PikaStream 1.0 creates a risk category that prior AI avatar products did not: convincing, real-time video deepfakes that participate in live business meetings.

Pika's own description: "It is your AI Self, responding in real time with naturally expressive facial movements, emotionally appropriate reactions, and persistent memory of who they're talking to."

The Terms prohibit using someone else's likeness without permission. Enforcement relies on AI moderation and user reports. But the infrastructure itself — a real-time generative video model, deployable via API — creates a novel attack surface for social engineering, identity fraud, and impersonation at consumer scale.

The PikaStream video meeting skill is open-sourced on GitHub and available via API key. Any application can put a synthetic face on a live call.

···

What You Should Do

If you've already used PikaMe:

  1. Read the actual Terms of Service at pika.art/terms-of-service, not the FAQ.
  2. Understand what you've agreed to: a perpetual, irrevocable license over your face and voice data that survives account deletion.
  3. If you're in Illinois: you may have BIPA rights that Pika is not currently honoring.
  4. If you're in the EU: you may have GDPR rights to deletion that are not addressed in Pika's Privacy Policy.
  5. Consider the cross-platform exposure: every platform your AI Self connects to is another entity with access to your likeness data.

If you haven't used PikaMe yet: read the Terms before you upload your face. The marketing copy and the legal terms tell very different stories.

The Bottom Line

PikaMe collects three categories of biometric data — face, voice, and personality — under a Terms of Service that was originally written for video generation and silently expanded to cover biometric collection 9 days before launch. The perpetual, irrevocable, sublicensable license that users agreed to for video clips now applies to their facial geometry and voiceprints. The word "biometric" appears zero times in the ToS. BIPA and CCPA are never mentioned. The marketing FAQ directly contradicts the binding legal terms.

The FTC forced Everalbum to destroy AI models trained on deceptively obtained face data in an identical fact pattern. Pika is collecting the same data, at greater scale, with the same type of misrepresentation — and a mandatory arbitration clause designed to prevent the class actions that enforce BIPA.

···

Sources

Every claim links to a primary source. Verify everything yourself.