What are Data Centers and How Do They Work?
Data centers are the physical infrastructure that makes digital life possible, yet most people have never seen one up close, and few understand how they function. As artificial intelligence (AI) development expands, more people are curious about what’s inside a data center.
A Concrete Fortress
Data centers are essentially large concrete warehouses housing thousands of computer servers working in tandem. Traditional data centers have one or two stories divided into spacious rooms, while newer ones are built taller. A single data center can serve one company or be shared among multiple clients.
Servers are housed in standardized 48-centimeter (19-inch) racks, essentially metal cabinets lined up in rows. Large data centers can accommodate tens of thousands of servers operating simultaneously, generating substantial heat and consuming significant energy for both computing and cooling.
Proximity Matters
Having a data center near end users improves speed, which is crucial for activities like e-commerce and gaming. Ashburn, Virginia, about 50 kilometers from Washington, D.C., and home to the world’s highest concentration of data centers, exemplifies this.
However, building in densely populated areas is more expensive and sometimes faces local resistance. Consequently, companies increasingly opt for rural locations where land is cheaper and regulatory hurdles fewer.
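For a rough sense of how much distance alone adds to response times, the short sketch below estimates the minimum round-trip delay over optical fiber. The distances and the propagation speed (roughly two-thirds the speed of light) are illustrative assumptions; real-world latency also includes routing, queuing, and server processing time.

```python
# Rough, illustrative estimate of round-trip propagation delay over fiber.
# Assumes signals travel at ~200,000 km/s in optical fiber (about two-thirds
# the speed of light); real latency adds routing, queuing, and server time.

SPEED_IN_FIBER_KM_PER_MS = 200.0  # ~200,000 km/s, expressed in km per millisecond

def round_trip_ms(distance_km: float) -> float:
    """Minimum round-trip time in milliseconds for a given one-way distance."""
    return 2 * distance_km / SPEED_IN_FIBER_KM_PER_MS

# Hypothetical distances: a nearby facility vs. ones much farther away.
for km in (50, 500, 4000):
    print(f"{km:>5} km -> at least {round_trip_ms(km):.1f} ms round trip")
```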
Managing Heat
Inside these bunker-like buildings, a single server rack generates as much heat as several running kitchen ovens. Cooling accounts for approximately 40% of a data center’s total energy consumption.
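As a back-of-the-envelope illustration of that comparison, the sketch below uses assumed power figures (the rack, oven, and facility numbers are hypothetical, not measured values) to show how a rack’s heat output stacks up against kitchen ovens and how a 40% cooling share translates into daily consumption.

```python
# Back-of-the-envelope check on the "several kitchen ovens" comparison.
# The power figures below are illustrative assumptions, not measured values:
# a modern server rack might draw on the order of 10 kW, a household oven
# a few kW, and essentially all of that electricity ends up as heat.

RACK_POWER_KW = 10.0   # assumed power draw of one server rack
OVEN_POWER_KW = 2.5    # assumed power draw of one kitchen oven
COOLING_SHARE = 0.40   # cooling as a fraction of total energy use (from the text)

equivalent_ovens = RACK_POWER_KW / OVEN_POWER_KW
print(f"One rack gives off heat comparable to ~{equivalent_ovens:.0f} kitchen ovens.")

# For every kilowatt-hour the facility consumes, roughly 0.4 kWh goes to cooling.
daily_facility_mwh = 100.0  # hypothetical total daily consumption
print(f"Of {daily_facility_mwh:.0f} MWh consumed per day, "
      f"~{COOLING_SHARE * daily_facility_mwh:.0f} MWh would go to cooling.")
```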
Advanced processors, such as the graphics processing units (GPUs) used for AI, can reach temperatures over 90°C, threatening performance and risking permanent damage during prolonged operation. They are also heavier than lower-performance chips.
Traditional installations use computer room air conditioning, exhausting heat through ceiling-mounted ducts, but this approach isn’t suitable for GPUs, which are primarily water-cooled.
Modern buildings are beginning to implement “free cooling,” which uses outdoor air when temperatures permit, along with various water-based options: liquid cooling systems that pump refrigerant directly to components, or evaporative cooling that functions much like human sweating.
Currently, substantial water quantities are needed for direct and indirect data center cooling. In 2014, U.S. data centers used 21.2 billion liters of water; this figure rose to an estimated 66 billion liters in 2023.
Energy Consumption
Electricity supply is vital for data centers. Tech giants racing to develop AI have spent tens of millions in a matter of months building structures suited to GPUs.
Operators rely on existing power grids but are increasingly securing their own energy sources, both for supply security and to limit rate hikes for users.
Some install solar panels or gas turbines, and many anticipate the arrival of small modular reactors (SMRs), a nuclear energy technology still in development.
Most data centers must operate 24/7, with critical systems backed up in case of power outages. This can be achieved through large battery banks or diesel generators.
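To make the role of those backups concrete, here is a minimal sizing sketch with assumed figures (the load, battery capacity, and generator start time are hypothetical): batteries typically only need to carry the facility until the generators come online.

```python
# Minimal backup-power sketch with assumed (hypothetical) figures.
# Batteries usually only need to bridge the gap until diesel generators start.

FACILITY_LOAD_KW = 5_000.0       # assumed critical IT and cooling load
BATTERY_CAPACITY_KWH = 1_000.0   # assumed usable battery-bank capacity
GENERATOR_START_MIN = 2.0        # assumed time for generators to come online

bridge_min = BATTERY_CAPACITY_KWH / FACILITY_LOAD_KW * 60
verdict = "sufficient" if bridge_min > GENERATOR_START_MIN else "undersized"
print(f"Batteries can carry the load for ~{bridge_min:.0f} minutes; "
      f"generators need {GENERATOR_START_MIN:.0f} -> {verdict}.")
```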
The most advanced facilities guarantee electricity supply 99.995% of the time.
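That availability figure sounds abstract until it is converted into downtime. The quick calculation below shows what 99.995% works out to over a year.

```python
# Converting a 99.995% availability guarantee into allowed downtime per year.
AVAILABILITY = 0.99995
MINUTES_PER_YEAR = 365 * 24 * 60

downtime_min = (1 - AVAILABILITY) * MINUTES_PER_YEAR
print(f"99.995% uptime allows at most ~{downtime_min:.0f} minutes of downtime per year.")
```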
Key Questions and Answers
- What are data centers? Data centers are large facilities housing thousands of computer servers that make digital life possible.
- Why are data centers important for AI? Data centers provide the necessary computational power and storage for AI applications to function.
- How do data centers manage heat? Data centers use various cooling methods, including air conditioning, water cooling, and free cooling, to maintain optimal server temperatures.
- Why is electricity crucial for data centers? Electricity powers the servers and cooling systems in data centers, making a reliable power supply essential.
- How do data centers ensure continuous operation? Data centers employ backup power systems, such as batteries or diesel generators, to maintain operations during outages.