New — 2026
The Problem
Ranked by urgency and real-world impact.
In hiring, healthcare, credit, policing, education, and media, AI systems now make or shape outcomes before humans ever see the case.
If you do not know when AI is deciding, you cannot protect yourself.
AI entered daily life through helpful tools, not public debate. By the time people noticed consequences, the systems were already embedded.
Defaults became decisions without permission.
When harm occurs, no single person is accountable. Vendors blame clients. Clients blame tools. Tools blame data.
If no one owns the decision, no one fixes the damage.
Many AI systems cannot explain their outputs, even to their creators. People receive outcomes without reasons.
A decision you cannot question is not accountability.
AI systems optimize for metrics like engagement, efficiency, or risk reduction. Those goals shape outcomes, whether intended or not.
What you optimize for becomes what you value.
AI systems now influence attention, mood, belief, and behavior at scale. This hits hardest for children and teens.
Minds are not test environments.
AI systems are trained on vast amounts of human work without permission or compensation.
Innovation that erases its sources is not sustainable.
AI systems are deployed faster than laws, norms, or oversight can respond. This is true at every level.
Fixing things after scale is often too late.
AI tutors, filters, recommendation systems, and therapy tools increasingly interact with children without long-term study.
Children cannot consent to hidden experiments.
What we accept today becomes tomorrow's normal. Most AI systems shape behavior quietly, without public choice.
The most dangerous decisions are the ones no one notices.
AI sounds certain. That is not the same as being right. Knowing the difference is the first skill.
The Book
Before AI Decides shows how decision-making drifted out of human hands, and where it can still be reclaimed.
It does not ask you to master AI. It asks you to notice when judgment is being replaced.
How the book is built
Part I
The problem — how decisions left human hands without anyone noticing
Part II
The Nine Pillars — a framework for keeping judgment where it belongs
Part III
Vigilant human attention is essential. People are already being overridden by systems shaping decisions that matter.
AI now confronts us with autonomy at scale, without visibility or reversal. That may be a point of no return.
The book ends with both futures written out.