It all started with Amazon labeling my mother a Nazi, and now algorithms are starting to rule our lives.
We are creating algorithms that run our lives, determining what we see, what we do, and even what we think. We used to be in control, but can we stay that way?
This is episode 1 of our Managed Minds series, in which we’ll explore the power algorithms are assuming over our lives (and how we are gleefully inviting their control).
In this episode, we start with the escalation and introduce the concepts of context collapse and affective amplification.
We talk about how:
Amazon uses algorithms to recommend products. The stakes? You see some weird suggestions you’d never buy. Annoying, sure. Harmless, mostly. This is where most people’s understanding of algorithmic decision-making begins and ends — a quirky recommendation engine that occasionally gets it wrong. But this is the shallow end of the pool.
A hiring platform uses them to screen resumes. The stakes go up fast. You don’t get the interview. You don’t get the job you’re qualified for. And you never find out why — because the system rejected you before a human ever saw your application. No one called you. No one read your cover letter. A pattern decided you weren’t a fit, and that was that.
A credit system uses algorithms to assign you a score. Now the stakes are financial. You pay a higher interest rate. You get rejected for a mortgage. You can’t rent an apartment. Not because of something you did, but because of a pattern the system built around people who share your data profile. You’re being judged by a composite sketch of someone who isn’t you.
An immigration system uses algorithms to flag travelers for additional screening. You missed your flight. You get detained. You end up somewhere you never expected to be — and the system that sent you there can’t explain why it chose you. It just did. Pattern matched. Flag raised.
A criminal court uses algorithms to determine sentences. You serve a longer sentence than you otherwise would, or go to prison when you otherwise wouldn’t have at all, based on a score that no one can explain to you. Not the judge. Not your lawyer. Not the people who built the system. A number was generated, and it shaped the rest of your life.
(Coming soon: in a couple of episodes, we’ll discuss predictive policing.)
Notice the escalation. We went from a bad product recommendation to a prison sentence in five steps. And in every single case, the same thing is true: you can’t argue back.
There’s no appeals process for an algorithm’s assumptions. There’s no cross-examination of a model’s training data. There’s no moment where you get to stand up and say, wait — that’s not who I am. You’re not seeing the full picture.
The context has already been collapsed. The pattern has already been matched. The identity has already been assigned.
This is the conversation we need to be having — not whether AI is impressive (it is), but whether we’ve built any real mechanism for the people affected by these systems to be heard.
---
🎧 Listen to the full episode for the complete discussion, including what accountability might actually look like when machines make decisions about people.
Key topics covered:
00:00 Introduction
01:52 Discussion about comments
11:17 Hitler, my mother, and how Amazon put them together
13:58 The three mechanisms for control
16:20 Amazon as the test case: what it can measure and what it ignores
21:47 When algorithms take your freedom
22:58 Why “you can’t argue back” is the defining problem of automated systems
27:02 Persuasion is no longer between humans
Resources and further reading:
Weapons of Math Destruction by Cathy O’Neil
Automating Inequality by Virginia Eubanks
God Human Animal Machine by Meghan O’Gieblyn
What’s your experience? Have you ever been on the wrong end of an algorithmic decision — denied something, flagged, scored — without understanding why? I’d love to hear your story in the comments.