About Tomorrow’s Mess
In the family I was raised in – as I’m confident was the case for many readers – you clean up after yourself. If you’re staying with family or friends, you don’t leave a mess for your hosts to clean up; you strip the bed linens when you leave and you don’t leave your dirty towels on the bathroom floor. It’s not just good manners; you hope for the same treatment when folks visit your own home. That reciprocity is core to how human hospitality has worked for generations: the Old Testament, classical Greek literature, the Qur’an, and the Mahabharata all contain admonitions about what makes a good guest and a good host that are still familiar to us today.
And yet after a decade in tech policy, I have repeatedly found myself returning to a simple question: how did “move fast and break things” become the mantra of the American tech elite? How did it become the default that we permit tech elites to “disrupt” the basic functioning of society – not just of media and politics, but also at kitchen tables and on playgrounds – and bear no burden or cost for fixing what they break? How has it come to pass that the men who insist that rules don’t apply to them are not outcasts but the richest men in the history of the planet?
Section 230, the weaponization of the First Amendment, the influence of money in politics, and the power of the tech lobby in DC and state capitals are all part of the story – but not the whole story. The source of “move fast and break things” is not the mere pursuit of profit; it is cultural and ideological. Even nonprofit tech accountability organizations struggle with accountability themselves, and by now we are all familiar with stories of techno-solutionists who claim fidelity to serving humanity while causing great harm.
The observation that Silicon Valley has a unique culture is not a new one: from the 1995 essay “The Californian Ideology” by Richard Barbrook and Andy Cameron to the six seasons of the HBO show Silicon Valley, the sense that something is – to the rest of Americans – “off” about Bay Area technologists is a cold take. But here at Tomorrow’s Mess, I will dig deeper, examining how developments in emerging technology and technology’s influence on politics can be better understood by unpacking the ideologies of the tech elite, and what those ideologies predict about what is coming next.
This isn't another newsletter deflating the hype around the latest generative AI developments or cataloguing the newest outrages coming from the tech elite and the harms their products cause — there are already great sources on that, from Marcus on AI to Blood in the Machine to AI Snake Oil to After Babel. Tomorrow’s Mess is instead about the cultural DNA of technology—how the values and blind spots of those building tomorrow's tools shape all our futures. Fundamentally, this is a newsletter about not just technology but about technologists, politics, and culture; and about unintended consequences, humility, hubris, incentives, and intentions.
Join me as I pick through the shards of broken things to understand why they broke, seek how we might fix things that are not yet beyond repair, and look around the corner at what might be broken next.
About Me
I’m a lawyer by training and a former Amazon lobbyist, and I’ve served two governors (a Democrat and a Republican). I’ve been working in tech policy for a little over a decade, but I came to it in an unorthodox way.
One day in 2009, in a dilapidated village classroom in the former Soviet republic of Moldova, not long after a failed Twitter revolution in that country, a group of my students (I taught 12th grade there as a Peace Corps Volunteer) asked me to visit a website on my laptop: the homepage of the Russian social media site Odnoklassniki (“Classmates”). My students had never heard of Facebook; Instagram was still a year away, with Snap and TikTok even further into the future. But they were desperate to know what I thought of their Odnoklassniki profiles and wanted me to sign up for one myself.
This classroom moment marked my first real understanding of technology’s power as a social architect. What looked like teenage social media use was actually a case study in how digital design shapes human behavior at scale. Odnoklassniki’s design – essentially “Hot or Not” with DMs – created an incentive for girls to post increasingly sexualized content for boys to rate and rank; boys could buy extra votes with mobile phone minutes, and thus stand out to the girls whose attention they wanted. Soon, students who had once counted the hours until summer Saturday nights at the village discotheque showed up only to vamp for photos and go home.
Odnoklassniki wasn’t broken. It was working exactly as designed. And the impact was not confined to the classroom and the village disco. Moldova had long struggled with the trafficking of young women for sexual exploitation, and Odnoklassniki’s design proved to be the perfect tool for traffickers to identify and groom prospective victims. It later became a key vector for Russian disinformation ahead of the most consequential elections in the country’s short independent history.
That formative experience in Moldova revealed something I couldn't unsee: technology isn’t necessarily neutral, and the consequences of technologists’ design choices aren't confined to the digital realm. The years since have taken me through roles on all sides of the tech policy divide—from Amazon’s policy team to state government, from international development to civil society. Along the way, I've witnessed firsthand how the values embedded in technology design ripple outward, reshaping politics, communities, and individual lives, while those who build these systems rarely confront the messes they create.
At Tomorrow’s Mess, I’ll be unpacking the ideologies driving Silicon Valley’s most powerful figures and examining who pays the price when “disruption” becomes a virtue. Drawing on thinkers like Postman and McLuhan, I’ll translate tech’s deliberately obscure language into clear insights about what’s coming next.
