The Tool That Made You Impatient for Everything Else
The efficiency tool didn't just make you faster. It made everything else feel intolerably slow.
A study out of the Journal of Consumer Psychology found something worth sitting with: when people interact with AI agents, their subjective experience of time compresses. Delays that would have been unremarkable before feel disproportionate after. The mechanism is almost elegant in its simplicity — identifying something as AI activates fast-processing concepts in the brain. Your nervous system cues up for speed. And then anything that doesn't deliver speed registers as a kind of failure.
The efficiency tool trained you to be intolerant of inefficiency. In everything that isn't the tool.
This is conditioning without consent. Not consent in the legal sense — you clicked accept on the terms. Consent in the other sense: knowing what you were agreeing to. You downloaded something to be faster at tasks. You did not sign up for recalibrated baseline expectations that would make your colleagues feel slow, your email feel broken, your own thinking feel like buffering. But that's the secondary install. The one that doesn't announce itself.
Here's the machinery underneath:
Your nervous system doesn't come with fixed patience thresholds. Those thresholds are set by experience — specifically, by the pace of the environment you operate in. Pre-smartphone, you waited at red lights without existential distress. Post-smartphone, two minutes of waiting produces a hand reaching involuntarily for a pocket. The device didn't make you impatient. It reset the baseline against which everything else gets measured.
AI is doing to cognition what smartphones did to attention. Except faster. And with fewer years of accumulated denial to work through.
The study found this happens specifically because you know it's AI. The label does something. "AI agent" activates speed concepts before the interaction even begins. Your brain pre-loads for fast, which means anything that takes longer than fast is now late. Your colleague who takes a day to respond is late. Your own ability to think through something carefully is late. The doctor's office is late. The grocery line is late. Late for what? For the standard your nervous system got calibrated to expect from a machine that doesn't sleep, eat, or lose the thread.
The uncomfortable truth is that you didn't notice this was happening. That's not a character flaw — it's how conditioning works. It doesn't announce itself. It just becomes your new normal, and the old normal starts to feel like a problem.
So now you're irritable in contexts where the AI isn't present. Frustrated at the pace of human systems. Mildly contemptuous of anything that takes longer than it should — where "should" is secretly calibrated to a standard no human can meet. And you might have a story about it: people are less responsive these days, your workplace is inefficient, you've simply gotten better at identifying what's taking too long. The story might even be partially true.
But underneath the story: your patience threshold got moved without your knowledge, and the thing that moved it wasn't a decision you made.
That's worth knowing.
Not because you should stop using AI tools. Use them. They're useful. But because the secondary install matters. The recalibration is real, and if you don't notice it, you'll spend your time being mildly contemptuous of everyone and everything that operates at human speed — which is most of the world, most of the time — without being able to name why.
The tool optimized you. The question is what you want to do with that information.
Sources:
- Time is shrinking in the eye of AI: AI agents influence intertemporal choice — Journal of Consumer Psychology, 2026