The Ugliest Thing San Francisco Ever Built: Lessons for AI Infrastructure

Eryk Salvaggio / Jan 23, 2025

Eryk Salvaggio is a 2025 Tech Policy Press reporting fellow.

WASHINGTON, DC - JANUARY 21, 2025: OpenAI CEO Sam Altman (center), US President Donald Trump (left), Oracle Chairman Larry Ellison (first right), and SoftBank CEO Masayoshi Son (second right) speak during a news conference announcing an investment in AI infrastructure. (Photo by Andrew Harnik/Getty Images)

A day after his inauguration, President Donald Trump was joined in the White House by OpenAI, Oracle, and SoftBank executives to announce a $100 billion joint venture called Stargate, a private sector investment in AI data infrastructure. Stargate, in Trump’s own words, aims to “build the physical and virtual infrastructure to power the next generation of advancements in AI. And this will include the construction of colossal data centers, very, very massive structures. I was in the real estate business. These buildings, these are big, beautiful buildings.”

Ten Stargate data centers are underway in Texas, with more under consideration. The White House announcement, paired with Trump’s immediate revocation of former President Joe Biden’s Executive Order on “Safe, Secure, and Trustworthy Development and Use of Artificial Intelligence,” signals a new alignment between Silicon Valley and the federal government. But how do we evaluate this new industrial policy? What are we building, and what consequences will it have?

Fever dreams of the future

Among the proudest achievements in Boston and San Francisco are the roads that aren’t there. In Boston, the Big Dig was a bloated infrastructure project to correct the overreach of the highway system into the heart of the city. A similar spirit permeates the present-day Embarcadero, where city life once sat in the shadow of California State Route 480, a freeway since declared "the ugliest thing San Francisco ever built."

Prioritizing the mobility of some over others, the original freeways decimated lower-income and especially Black neighborhoods in cities across America. Both disasters rose from a collective technological fever dream about a radical transformation of society set in motion by the automobile. In “The Death and Life of Great American Cities,” Jane Jacobs reflected on the havoc they left behind. The overvaluation of a specific form of physical infrastructure, designed for a specific technology, at the expense of the interactions that create the social infrastructure of a city, had leveled neighborhoods and communities. At the same time, it constrained viable alternatives, such as public transit or walkable cities.

Jacobs came to mind while reading “AI in America: OpenAI’s Economic Blueprint,” a policy document released on January 13, just in time for the start of the second Trump administration. Built on the metaphor of the highway and the driver, much of the report praises the “common sense” laws and infrastructure America built around the automobile.

As usual with AI, the metaphor is more about a feeling than a way of actually understanding the technology. OpenAI’s point isn’t that there are lessons to learn from highway infrastructure; no link to that precedent appears in the document. The goal is to signal speed and scale, and to argue that third-party developers and the public are the ones who drive AI, while OpenAI merely builds the roads. The pitch to policymakers is that demanding accountability from the road builders would only hold back the speed of artificial intelligence.

Like the overzealous investment into arterial highways, I fear we are once again poised to value technological infrastructure over our shared social infrastructure.

The cover pages of two recent documents published by OpenAI.

Infrastructure in, infrastructure out

There are two ways to look at AI’s infrastructure. One is the infrastructure that supports AI, and the other is the infrastructure AI promises to one day deliver. We might think of these as support infrastructure (what it needs) and emergent infrastructure (what it promises).

Support infrastructure is the sprawling system the AI industry depends on to survive. There are overlapping physical infrastructures: the data centers for computation and the corresponding water and energy necessary to supply them. There are social infrastructures, too, including a dependency on unfettered digital surveillance of our online lives, arranged in such a way as to provide a continuous stream of new training data for the machines to analyze and emulate.

The other way to see AI is as emergent infrastructure: the industry’s vision of the world it wants to build. Tech leaders envision an ecosystem of developers, applications, and new technologies that rely on large language models to produce promised efficiencies and cost reductions. How those efficiencies would actually be secured goes unexplained.

In September 2024, still under the Biden administration, OpenAI published “Infrastructure is Destiny,” a pitch for its 5-gigawatt plants in 12 states. It claimed the data centers would directly create 4,000 jobs per state, while “worker spending” would generate an additional 14,000 construction jobs and around 30,000 additional jobs per state. In sum, it claimed $5-$7 million in GDP growth per state. It’s unclear whether AI infrastructure promises anything unique over other forms of civic infrastructure, such as a hospital or a museum. Productivity increases, assuming they materialize, are a poor civic investment if the end result is steep unemployment or lower incomes at the national level that trickle down into your community.

Notably, no clear case beyond job numbers was made in either report to explain how AI would generate economic benefits. The Blueprint paper notes that states should use AI “to identify ways to solve people’s daily hard problems in areas like education and healthcare” by “supporting government workers and developer communities... to identify ways to improve the lives of their taxpayers.”

What we see here is a clear vision for transforming social infrastructure, and a sense of urgency that this transformation is needed. What we do not see is an explanation of what benefits may come from the transformation.

Unusual powers

OpenAI’s Blueprint requests powers that would be tremendous and unusual for any American industry. It asks, for example, for access to US intelligence concerning foreign rivals, a bid to secure the US lead in AI technology over China and others. It asks the government to provide classified data centers for evaluating the security of AI models. OpenAI proposes working with the government to develop model evaluations but seeks assurances that its participation would be purely voluntary and would supersede any requirements from individual state governments.

Though OpenAI’s blueprint suggests it will be supported by energy sources that are not yet commercially viable (fusion and next-generation fission), the Trump administration will likely fund energy sources that are. Trump’s Stargate announcement suggests these data centers will be allowed to produce their own energy. Given the industry's recent interest in nuclear power, it would be unsurprising to see data centers doubling as independent nuclear facilities.

By rescinding the Biden administration’s executive order, Trump shredded the requirement that AI companies disclose new risks to the US government ahead of deployment. Without it, we are developing not merely data centers but nuclear infrastructure that prioritizes corporate partners over other infrastructural needs. Given the secrecy that already surrounds data centers, policymakers need to be clear about the boundaries they set for this independence.

AI’s emergent infrastructure

The case for the automobile was relatively straightforward: it was a faster horse. In comparison, OpenAI promises only this: “With AI, our children will be able to do things we can’t, and eventually everyone’s lives can be better than anyone’s life is now.” The details are left to the reader. But we have seen glimpses of the emergent social infrastructure the industry envisions.

OpenAI calls for a “nationwide education strategy” focused on AI literacy, designed to bolster AI at the local, community level “in partnership with American companies to help our current workforce and students become AI-ready.” To this end, the K-12 system has already been enlisted as a partner in the AI industry’s bid to reimagine classrooms. Centering a utilitarian AI literacy in classrooms, rather than critical reasoning, is a way of preparing a generation to treat the technology as inevitable.

Recently, OpenAI partnered with Common Sense Media to create an AI instructional program for educators. Most of its lessons boiled down to: “It’s up to you to figure out how to use this in your classroom.”

The pattern is clear. From local governments to classrooms, OpenAI wants someone else to figure out how to connect its product to positive, economically beneficial uses. The AI industry has blueprints but no roadmap. If local governments, K-12 classrooms, and other present-day social infrastructure are meant to serve as research labs for these tech companies, then the public must have more say and more insight into the use of these assets. That arrangement also suggests that investments of this scale, and deregulation at this stage, are premature.

Lost highway

Seen only as a collection of concrete and light poles, a highway is not racist. But the highway is a system within a system, and its path through a city or town can trace the boundaries set by prejudice. These paths were carved into cities based on decisions that prioritized categories of people through a lens of economic value. Once completed, highways were often inaccessible to the very lower-income communities they had segregated.

Likewise, when biases based on gender, race, and other social traits are baked into LLMs, they perpetuate social and economic inaccessibility in ways indistinguishable from racism or sexism. The word choices predicted by a large language model are paths, too, directed by probabilities. Whatever progress has been made in minimizing the bias of these models has been contingent on pressure from policymakers and regulators to make it a priority.

In a world where the president promises to punish government employees who seek diversity, equity, and inclusion in their work, we risk building AI that inscribes these biases not only into “free expression,” but also into the navigation of services and information. Building a massive American infrastructure project on top of a speculative technology with well-documented social biases is potentially devastating. We need only look at the long-term effects of urban renewal, or at photos of the now-demolished elevated highway that once passed by San Francisco’s Ferry Building. Infrastructure we do not need, yet assembled in haste, carries implications and costs we do not want.

If we are meant to take AI infrastructure of either kind very seriously, it might behoove us to remember how the writer Ursula Franklin laid it out in The Real World of Technology, more than three decades ago: “Technology is not the sum of the artifacts, of the wheels and gears, of the rails and electronic transmitters. Technology is a system. It entails far more than its individual material components. Technology involves organization, procedures, symbols, new words, equations, and, most of all, a mindset.”

OpenAI’s blueprint reminds me of the off-ramps I used to see on I-93 heading into Boston from Maine. Scattered across the elevated highway were unfinished paths and exits veering off into nowhere, remnants of a hastily assembled plan that bisected homes and neighborhoods. Good infrastructure requires public input for a reason. It needs to center the concerns and necessities of society rather than asking society to find a reason to speed.
