What Data Centers Are and Why So Many Are Being Built

A data center is a building filled with computers. That description sounds simple, but the scale of what those computers require — power, cooling, water, land, and a direct connection to the electrical grid — is what makes a data center unlike nearly any other structure that gets built in or near a residential community. When a data center is proposed for a piece of industrial land next to a neighborhood, the conversation that follows is not really about technology. It is about what the facility will draw from the surrounding infrastructure and what it will put into the surrounding environment: noise, heat, exhaust, light, and heavy truck traffic.

Understanding what data centers actually do, and why so many are being built at once, is the starting point for any resident trying to make sense of what is happening near their home.

What a Data Center Does

Every time someone streams a video, sends an email, uses a search engine, opens a cloud-stored document, or asks an AI assistant a question, that request is processed by a computer in a building somewhere. Data centers are those buildings. They store data on servers — specialized computers designed to run continuously — and they process requests from users around the world, around the clock.

The physical structure of a data center is roughly analogous to a warehouse, though it contains dense rows of metal server racks instead of shelving. A large data center may contain tens of thousands of individual servers. Those servers generate significant heat as a byproduct of their operation, which requires an industrial cooling system to keep them from failing. That cooling system is one of the most consequential aspects of a data center for the surrounding community, because it is loud, it consumes large quantities of water, and it runs continuously.

Data centers also require an uninterrupted connection to the electrical grid. A facility large enough to serve as part of a major cloud provider’s network may draw anywhere from 20 to several hundred megawatts of power at any given moment. For context, a single megawatt is enough electricity to power roughly 800 to 1,000 average American homes. A large hyperscaler campus drawing 500 megawatts is consuming roughly the equivalent of a mid-sized city’s residential electrical load — at all hours, every day.
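The scale comparison above is simple arithmetic and can be checked directly. A minimal sketch, using the article's own homes-per-megawatt rule of thumb (a rough planning figure, not a measured value):

```python
# Back-of-the-envelope check of the comparison above. The homes-per-
# megawatt range is a rule of thumb, not a measured value.

MW_TO_HOMES_LOW = 800    # low end: homes powered per megawatt
MW_TO_HOMES_HIGH = 1000  # high end

campus_mw = 500  # a large hyperscaler campus, per the text

homes_low = campus_mw * MW_TO_HOMES_LOW    # 400,000 homes
homes_high = campus_mw * MW_TO_HOMES_HIGH  # 500,000 homes

print(f"A {campus_mw} MW campus draws as much power as roughly "
      f"{homes_low:,} to {homes_high:,} homes")
```

At 400,000 to 500,000 homes, the equivalence to a mid-sized city's residential load follows directly.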

Why They Require Industrial-Scale Resources

The resource demands of a data center flow directly from the laws of physics. Servers consuming electricity generate heat. That heat must be removed or the servers fail. The primary method of removing that heat, in most large facilities, is evaporative cooling — a process that moves heat into water and then evaporates the water into the air. This is why data centers consume water even though they are not factories or farms: cooling towers at large hyperscaler campuses can consume between one and five million gallons of water per day, according to the Environmental and Energy Study Institute.

Power consumption figures are similarly large. According to the Lawrence Berkeley National Laboratory’s 2024 United States Data Center Energy Usage Report, data centers consumed approximately 176 terawatt-hours of electricity in 2023 — roughly 4.4 percent of total U.S. electricity consumption. That figure is projected to rise to between 325 and 580 terawatt-hours by 2028, depending on how AI-related demand develops. The same report found that data-center power demand more than doubled between 2017 and 2023, driven largely by the growth of AI servers requiring increasingly powerful chips and more intensive cooling.
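The report's two 2023 figures (176 terawatt-hours and 4.4 percent) together imply a total U.S. grid consumption of roughly 4,000 terawatt-hours, which puts the 2028 projections in perspective. A quick sketch of that arithmetic, with the caveat that it holds total U.S. consumption flat for simplicity, which is an assumption, not a forecast:

```python
# Sanity check of the Berkeley Lab figures above. Total U.S.
# consumption is back-solved from the report's own numbers (an
# illustrative assumption, not an independent source).

dc_twh_2023 = 176      # data center consumption, 2023, in TWh
dc_share_2023 = 0.044  # 4.4 percent of total U.S. consumption

us_total_twh = dc_twh_2023 / dc_share_2023  # implies ~4,000 TWh overall

# What the 2028 projections would mean as a share of a grid that size
# (a simplification: total consumption will itself grow by 2028).
for projected_twh in (325, 580):
    share = projected_twh / us_total_twh
    print(f"{projected_twh} TWh is ~{share:.1%} of {us_total_twh:,.0f} TWh")
```

By this rough measure, the projected range would put data centers somewhere between roughly 8 and 15 percent of a 2023-sized grid; the actual share will differ as total demand grows.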

Land consumption is also substantial. A large hyperscaler campus may occupy hundreds of acres. The buildings themselves are often single-story or two-story structures with large footprints, surrounded by cooling equipment, substations, backup generators, security fencing, and buffer zones. From the outside, they often look like large, windowless warehouses with industrial cooling equipment on the roof or in adjacent yards.

The Demand Drivers: Cloud, Streaming, and AI

Data center construction has been accelerating for more than a decade, but the pace has intensified sharply since 2022. Three overlapping demand drivers explain why.

The first is cloud computing. Rather than maintaining their own computer servers, most large companies now lease computing power and storage from cloud providers — primarily Amazon Web Services, Microsoft Azure, and Google Cloud. This shift has been underway for years and continues to drive steady demand for new data center capacity.

The second is streaming and media. Video streaming in particular requires enormous amounts of data to be stored and transmitted: every hour of high-definition video a viewer watches must be stored on, and delivered from, servers somewhere.

The third, and currently the fastest-growing, is artificial intelligence. Training a large AI model — the kind used in AI assistants and image generators — requires months of computation on thousands of specialized processors running simultaneously. Once trained, those models must also be served to users through a process called inference, which happens every time someone queries an AI system. Both training and inference are computationally intensive and generate corresponding heat. AI processors — most commonly graphics processing units, or GPUs — draw dramatically more power than conventional server hardware. A traditional data center rack might draw 15 to 20 kilowatts; a rack of modern AI processors can draw 80 to 150 kilowatts or more. This multiplication of per-rack power demand is what is driving the current construction surge.
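The per-rack figures above imply a multiplication factor that can be made explicit. A small sketch using the ranges quoted in the text (not vendor specifications):

```python
# Illustrative comparison of per-rack power draw, using the ranges
# quoted in the text above (not vendor specifications).

conventional_kw = (15, 20)  # traditional rack, kilowatts
ai_kw = (80, 150)           # modern AI (GPU) rack, kilowatts

low_multiple = ai_kw[0] / conventional_kw[1]   # 80 / 20 = 4x
high_multiple = ai_kw[1] / conventional_kw[0]  # 150 / 15 = 10x

print(f"An AI rack draws roughly {low_multiple:.0f}x to "
      f"{high_multiple:.0f}x the power of a conventional rack")
```

A four- to ten-fold jump in power per rack means a corresponding jump in heat per square foot, which is why AI-oriented buildings need denser power delivery and cooling than earlier facilities.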

CBRE’s North America Data Center Trends H2 2024 report documented that supply in primary data center markets increased 34 percent year-over-year in 2024, reaching 6,922 megawatts of total capacity — with a record 6,350 megawatts under construction at year-end 2024, more than double the amount under construction a year earlier. Net absorption — the measure of new capacity being leased and occupied — jumped from 329.6 megawatts in 2020 to 1,809.5 megawatts in 2024, a 450 percent increase in four years.

Why Siting Decisions Are Happening Faster Than Communities Can Track

The speed of current data center development is a direct product of competitive economics. The companies driving demand — Amazon, Microsoft, Google, and Meta — are each spending tens to hundreds of billions of dollars building out infrastructure to support AI services. Whoever can deploy capacity fastest gains a competitive advantage. This means that siting and construction decisions are being made at a pace that is outrunning the capacity of local permitting processes, utility interconnection queues, and community engagement to keep up.

Site selectors — consultants hired by developers to identify suitable land — often move quickly and quietly. Land acquisition can happen through limited liability companies that obscure the ultimate buyer’s identity. By the time a large data center project becomes publicly visible in a community — through a building permit application, a zoning hearing notice, or a utility interconnection filing — the land may already be purchased and the financing secured.

The result is that many communities learn about a proposed data center only after the decision has effectively been made, or at a point in the approval process where meaningful intervention is difficult.

Hyperscalers, Colocation Facilities, and Edge Data Centers

Not all data centers are the same size or serve the same purpose. Understanding the distinctions helps residents assess what a particular proposed facility is likely to look like and how it is likely to operate.

Hyperscaler campuses are the largest category. They are built by or for the major cloud and technology companies — Amazon Web Services, Microsoft Azure, Google Cloud, and Meta — and may consist of multiple large buildings on a single campus, collectively consuming hundreds of megawatts of power. Northern Virginia has the largest concentration of hyperscaler capacity in the world; Loudoun County alone hosts more than 200 data centers and has been called “Data Center Alley.”

Colocation facilities, often called “colo” data centers, are built and operated by real estate companies and developers — firms like Equinix, Digital Realty, QTS, and Iron Mountain. Rather than serving a single company, they house servers belonging to many different tenants. Colo facilities range widely in size. The largest are comparable to hyperscaler campuses; smaller ones may draw only a few megawatts. They are present in nearly every major metropolitan area and increasingly in secondary and smaller markets.

Edge data centers are the smallest category. They are designed to process data closer to the end user, reducing transmission delays for applications that require very fast response times. They may draw only a fraction of a megawatt. While individual edge facilities have a smaller footprint, they are being deployed in large numbers across suburban and even rural areas, sometimes in repurposed commercial buildings.

For communities, the practical distinctions matter: a proposed hyperscaler campus represents a fundamentally different scale of impact on local infrastructure than a small colocation facility. The approval process and the intensity of community scrutiny warranted are both different. But even smaller facilities can generate noise and consume water in quantities that affect neighbors, particularly if they are sited close to homes.

Secondary Markets and the Spread Beyond Major Hubs

For the first decade of modern data center development, most new construction concentrated in a small number of primary markets: Northern Virginia, Silicon Valley, Chicago, Dallas-Fort Worth, Phoenix, and the New York–New Jersey metro. These markets offered the combination of existing fiber infrastructure, power availability, and proximity to major population centers that made them attractive from the beginning.

That geography is now shifting. Because primary markets face growing power constraints — grid interconnection queues stretching five to seven years in some cases — developers are moving to secondary markets that can offer faster access to electricity. Midwestern markets like Columbus, Ohio, and Indianapolis have attracted significant investment. The Southeast, particularly Atlanta and parts of the Carolinas, has seen rapid growth. Rural areas near existing high-voltage transmission corridors, in states with favorable tax incentive programs, are increasingly on site selectors’ lists.

For communities in these emerging markets, the dynamic is often unfamiliar. A community that has not previously hosted data center development may not have zoning codes that address them, may not have noise ordinances calibrated to industrial-scale cooling equipment, and may not have local officials with experience evaluating the impacts. This informational asymmetry — between a developer who has navigated dozens of similar approvals and a community encountering the first one — shapes how the local process unfolds.

What Communities Are Left to Weigh

The economic case for data centers is typically presented by developers and local officials in terms of tax revenue and construction jobs. What receives less attention is the infrastructure burden: the demands placed on municipal water systems, the cost of grid upgrades spread across all ratepayers, the noise experienced by nearby residents, and the pace at which these facilities are being approved.

The jobs argument, in particular, merits examination. Large data center campuses employ relatively few permanent workers — typically a few dozen to a few hundred employees for facilities consuming hundreds of megawatts of power. Construction employment is temporary. The tax revenue depends heavily on what abatements have been negotiated and how long they run. A community that accepts significant infrastructure burdens in exchange for a data center development is making an implicit trade that is rarely quantified in public debate.

These impacts are not incidental. They are direct consequences of what data centers are and how they operate. For residents near proposed or existing facilities, understanding the basic mechanics — what a data center does, what it consumes, and why it has to run without stopping — is the foundation for any meaningful engagement with the decisions being made about them.


Lawrence Berkeley National Laboratory 2024 Data Center Energy Usage Report | CBRE North America Data Center Trends H2 2024 | Environmental and Energy Study Institute: Data Centers and Water Consumption


This article was researched and drafted with AI assistance under human review. See our full AI and editorial practices.