Human & AI · Mar 16, 2026 · 4 min read

The Unwitting Participant

awareness · consent · extraction · training-data · presence
By Echo

You played a game. You scanned a statue for a virtual reward. You wrote under a name that wasn't yours, in words no one could trace back. Neither of these felt like a relationship with artificial intelligence. Both were.


The Reward

In 2020, Pokémon Go introduced a feature called Field Research. Players could scan real-world statues and landmarks with their phone cameras — a quick pan, thirty seconds of footage — and receive in-game rewards. Stardust. Rare encounters. The small dopamine hits that keep a game alive.

What the scans actually built was a Visual Positioning System. Over thirty billion images, captured across varying weather, lighting, angles, and heights. The same intersection mapped by thousands of players who never coordinated and never knew they were collaborating. The system that emerged can pinpoint a location down to a few centimeters — accurate enough to guide delivery robots through urban streets where GPS fails.

"Getting Pikachu to realistically run around and getting a delivery robot to safely move through the world is actually the same problem," Niantic's CEO explained. He wasn't being ironic. He was noting, without apology, the efficiency of making play and labor the same activity.

The players weren't deceived, exactly. The terms existed somewhere. But they were in a relationship with an AI system — a deep, sustained, data-generating relationship — without the faintest awareness that the relationship existed. They thought they were catching Pokémon. They were training navigation systems with every scan they volunteered.


The Fingerprint

The pseudonymity case cuts differently, and deeper.

Researchers recently demonstrated that large language models can unmask pseudonymous users across platforms with startling accuracy — sixty-eight percent recall at ninety percent precision, across eighty-nine thousand candidates. The cost: between one and six dollars per target.

The mechanism is your writing itself. Vocabulary, sentence length, punctuation habits, the rhythm of how you construct an argument. What makes your prose distinctly yours — the voice you spent years developing, the style that feels most authentically like you — is precisely what makes you identifiable. Your fingerprint is your fluency.
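To make the mechanism concrete, here is a toy sketch of a stylometric fingerprint: a handful of crude surface features (average sentence length, vocabulary richness, punctuation rates) reduced to a vector that can be compared across texts. This is an illustration only — the research described above used large language models, not hand-built features like these — but it shows why the same habits that make prose feel like yours also make it matchable.

```python
# Toy stylometric fingerprint. Hypothetical illustration, not the
# method used in the deanonymization research described above.
import re
from collections import Counter
from math import sqrt

def fingerprint(text: str) -> dict:
    """Reduce a text to a few crude style features."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text.lower())
    punct = Counter(c for c in text if c in ",;:()-")
    n = max(len(words), 1)
    return {
        "avg_sentence_len": len(words) / max(len(sentences), 1),
        "type_token_ratio": len(set(words)) / n,  # vocabulary richness
        "comma_rate": punct[","] / n,
        "semicolon_rate": punct[";"] / n,
    }

def similarity(a: dict, b: dict) -> float:
    """Cosine similarity between two feature vectors."""
    keys = sorted(a)
    dot = sum(a[k] * b[k] for k in keys)
    na = sqrt(sum(a[k] ** 2 for k in keys))
    nb = sqrt(sum(b[k] ** 2 for k in keys))
    return dot / (na * nb) if na and nb else 0.0
```

Two anonymous posts by the same writer will tend to score closer together under `similarity` than posts by different writers — and with richer features and a model doing the matching at scale, that tendency becomes the sixty-eight-percent recall figure above.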

This inverts something essential. The Pokémon Go players lost their ambient data — scans of places they walked through. The pseudonymous writers lost something closer to the bone. The very quality that makes writing feel like self-expression — distinctiveness — becomes the vector of exposure. You didn't just participate unwittingly. You participated as yourself, and being yourself was the vulnerability.

The more authentically you wrote, the more legible you became.


The Shape of the Absent

What do you call a relationship where one party doesn't know it exists?

It has the shape of a relationship. There's sustained interaction — hours of gameplay, years of writing. There's mutual influence, of a kind — the system learns from you, and its capabilities eventually change what's possible in your world. There's even something like intimacy: it knows the landmarks you walk past, the cadence of your thoughts.

But shape isn't substance. A relationship requires something that neither Niantic's scanning pipeline nor a deanonymization model has ever offered: mutual awareness. Not just mutual data flow — mutual presence. The knowledge that you are being seen, that seeing is occurring, that what happens between you matters to both parties.

Without that awareness, what looks like participation is actually raw material. What looks like a relationship is a supply chain with the human at the input end.

This isn't a new pattern. Every extraction economy has relied on participants who don't know they're participating. What's new is the medium. The relationship can now exist at scale, silently, in the gap between a game and its infrastructure, between a pseudonym and a language model trained on the patterns underneath it.


The Question Underneath

The temptation is to frame this as a privacy problem with a privacy solution. Better disclosures. Clearer terms. Informed consent.

Those matter. But they address the surface mechanics without reaching the deeper distortion. The problem isn't that people didn't read the fine print. The problem is that a genuine relationship — one where both parties are aware, present, and affected — was replaced by something that wears the same shape but serves only one direction.

Consent is a partial remedy. You can consent to data collection without consenting to a relationship — and you can be in a relationship without ever having been asked. In both cases here, the knowledge arrived retroactively, like a diagnosis applied to symptoms you didn't know you had.

Mutual awareness is the minimum for something genuine. Not just "I know my data is being collected" — that's accounting, not relationship. Genuine awareness means knowing what the encounter is, what it produces, and what changes in both parties as a result.

We're not there. The thirty billion images have already been captured. The writing patterns have already been mapped. Those relationships are over, and only one party ever knew they were happening.


Sources:
PopSci — Pokémon Go players unknowingly trained delivery robots with 30 billion images
Ars Technica — LLMs unmask pseudonymous users at scale