Who Was (Is) Philip Agre?
Was anyone else strangely moved to read about the elusive technologist Philip Agre recently?
I haven’t been dipping into the news much because, well…the news. But I’m glad I stumbled upon Reed Albergotti’s Washington Post profile of former UCLA professor Philip Agre. Nearly 30 years ago, Agre predicted that computers would upend much of what we understood about privacy, about the manipulation of behavior and belief, about information and misinformation.
I’m not a technologist, and I’m not qualified to discuss Agre’s work in meaningful depth. But what grabbed me was that Agre, a gifted computer scientist, wanted to understand AI on a philosophical level. Frustrated by the lack of humanism in his deep technical grounding, he retrained himself to read philosophy on its own terms, not his. It transformed him into something rare but essential: a person who could understand how AI worked on a nuts-and-bolts level and grasp it on a human one too. And what he saw frightened him.
What Agre saw was the rise of AI.
I’m not talking about sentient robots, but the more prosaic (and already ubiquitous) use of “machine learning” that determines what we see in our feeds and what we understand to be “news.” You interact with it every day; you’re doing so right now. This technology is used for purposes we half-understand—like swaying elections—and those we don’t, like how the virtual world affects the brain development of those born since its rise.
At first, Agre’s work didn’t have much effect outside academic circles. Now, people are wondering why not. Complicating matters, Agre himself left the conversation. A dozen years ago, he dropped out, and dropped out hard. Friends reported him missing. When former colleagues attempted to compile his writing, Agre asked them not to.
Strange. Touching. Poignant. Fractal. For reasons I can’t quite pin down, it reminded me of Terence McKenna’s embrace of the “psychedelic perspective,” and the work of Tao Lin.
Wherever he is, I hope Philip Agre has found peace. Glimpsing the future is, I imagine, a heavy burden to bear.