Anatomy of a Data Center

Traditional and social media are thick with reports and predictions of the remarkable increase in size, power consumption and significance of data centers. Not only technology companies but real estate and energy developers, investment funds, lenders, and professionals of all stripes are in or determined to enter this sector. Our inboxes are full—it’s data center this, data center that.

But what exactly is a data center? What infrastructure, technology and human resources come together to create and sustain one of these localized points of computation? By understanding their components, we can glean some understanding of the business, public policy and (our focus) legal issues that arise before and during their operation.

In this article, we cite key characteristics of a reference Blackacre Data Center, with occasional glances at other (real) structures that offer variations on themes. Blackacre is a composite of several centers we have encountered in our law practice. These facilities differ widely in size, location and functions, so your mileage will vary.

INFRASTRUCTURE RESOURCES
Blackacre Data Center is built on a 14-acre site consisting of two contiguous legal parcels that were formerly developed as Class C light industrial warehouses in the early 1970s, located on the outskirts of a major United States technology-center city. The property was acquired and developed by a real estate developer specializing in industrial projects, including data centers, and the fee owner is a special-purpose entity subsidiary of an investment fund. The acquisition and construction were funded by equity investment (aided by tax credits and location incentives) and third-party debt financing. The improvements were built under a design-build contract with an engineering and construction company joint venture. The fee estate is leased to an operating company affiliated with the developer, which then leases interior space to technology companies.

The Blackacre site was selected on the basis of many factors, including real estate cost and availability; telecommunications connectivity; availability of reliable, sustainable power; resilience to natural disasters; ambient temperatures or water supply suitable for cooling systems; taxes and incentives; and human factors to attract employee and contractor workforces. (The ANSI/TIA-942 standard includes an annex listing data center site selection factors.)

To entitle Blackacre, the developer underwent a streamlined planning process with the city, since the project site is located within an existing industrial park within the city’s designated “Industrial Technology and Innovation Corridor.” The zoning permits a wide range of uses, including office, business park, industrial, research and development, manufacturing, and information and technology-based uses. Fortunately for Blackacre, and unlike many other projects, the site is bordered by undeveloped land and other industrial and commercial uses, remote from residential districts, and located near a state highway.

Data center businesses often employ one of two approaches—either an enterprise model in which the servers, storage and network switches are controlled by a tenant technology company, or a co-location model in which information technology (IT) equipment and utilities supporting that equipment are leased to individual businesses. Blackacre follows the co-location model.

The outside perimeter is an imposing 10-foot-tall security fence, with gated site access staffed 24/7 and the first of many closed-circuit television (CCTV) cameras. The grounds include parking for 73 cars and a dozen bicycles; outside electricity stepped down at an onsite 75 MVA (megavolt-ampere), 25,000-square-foot substation; a 25,000-square-foot utility switching station; and telecommunications outlets (both satellite dishes and dedicated fiber optic lines). The substation has two transformer lines, allowing one to be taken out of operation without interruption of service.

There are two data center structures located on the campus: a three-story main building (350,000 gross square feet) and a one-story auxiliary building housing storage media (15,000 gross square feet). There are also a backup generator yard; a 3,000-square-foot security building; and infrastructure improvements such as fire detection and suppression devices, access driveways, stormwater facilities, and water storage tanks.

On the entry floor of each data center building, there is a lobby to verify access and monitor entry and exit by authorized individuals. Security measures include biometrics and badge access into and around the facility (including the equipment areas), key access to specific racks and servers, logs for employee and visitor/vendor access, physical escorts for all non-employees, video surveillance, and 24/7 on-site security personnel. Cybersecurity is maintained through firewalls, virus detection software, encryption, disaster recovery and breach response plans, backup and recovery systems, and regular tests and audits. Blackacre security personnel are mindful of industry security standards such as ISO/IEC 27001 and of data privacy regulations and best practices.

Authorized entrance from the lobby leads to an operations center with monitors of all the data center activities. From that center authorized personnel can next enter the computer rooms, where the servers, storage, network switches and other equipment are located, operated and maintained. When you enter a computer room, besides the hum and whir of storage devices, you will hear the whoosh of cool air being fed and hot air being withdrawn in separate currents called “cold aisles” and “hot aisles.”

Blackacre has visions of greater and greater sustainability. Over the longer term, zero-emission power is to be supplied for the data center from advanced energy systems; its owners are in negotiations with developers of small modular reactor (SMR) nuclear fission, natural gas with carbon capture, advanced geothermal, sodium-based storage from renewables, and even nuclear fusion, to be located at an adjacent undeveloped project site. This power may be delivered physically to the site or procured virtually through a power purchase agreement.

INFORMATION TECHNOLOGY RESOURCES
The heart of Blackacre Data Center is its information technology, which falls into three broad categories: servers, storage devices, and network devices.

Servers are the computers that perform the customers’ desired functions with respect to the data that flow into and out of the data center. Server operation entails the retrieval, storage, and processing of data supplied by the technology company or generated by its user base in engaging with internal and customer-facing application software. These servers have a range of internal memory, processing power, and other specifications, requiring customized support by Blackacre staff. Servers can be self-contained computers installed in racks, or slimmed-down “blades” that are more dependent on Blackacre support and utilities.

Ancillary to the servers are large storage devices, both hard drive and solid state. They store data in block increments and offer capacities of many terabytes. (Lawyers should start getting used to the next thousand-power prefixes: peta, exa, zetta and yotta.) Lastly, a data center relies on network infrastructure. Switches transfer data between nodes on a network; routers transfer data to and from other networks; and firewalls and load balancers complement their functions.
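For a sense of that scale, the short Python sketch below spells out the decimal prefixes; it is purely illustrative and does not describe any actual Blackacre capacity.

    # Illustrative only: decimal (SI) storage prefixes, not any actual system's capacity
    PREFIXES = {
        "tera": 10**12,
        "peta": 10**15,
        "exa": 10**18,
        "zetta": 10**21,
        "yotta": 10**24,
    }

    for name, factor in PREFIXES.items():
        print(f"1 {name}byte = {factor:,} bytes")

A petabyte, in other words, is a thousand terabytes, and each step up the list multiplies the capacity by another thousand.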

Blackacre also has monitoring computers, system software, and application software. The various IT components and technology are partly proprietary to the operating company and partly licensed to it and its customers. With virtualization and greater use of cloud resources, the lines separating server, storage, network and other devices are blurring, and the Blackacre configurations will change with the times to keep up with the larger enterprise “hyperscale” data centers.

POWER AND COOLING RESOURCES
Data centers are famously voracious consumers of electricity, not only for powering the computers but also for cooling them and maintaining the rest of the integrated facility. A metric called Power Usage Effectiveness (PUE) compares the overall power needs of the center with the power needs of the IT equipment alone. Within that energy budget, there is a target power usage of so many kilowatts per rack or per blade within the facility. The processing power of individual chips and the energy density of servers are increasing at a torrid pace, raising Blackacre’s energy needs rack by rack and upgrade by upgrade.
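For readers who want the arithmetic, PUE is total facility power divided by IT equipment power, so a value of 1.0 would mean every watt reaches the computers. The Python sketch below uses hypothetical load figures chosen only to illustrate the ratio; they are not Blackacre’s actual numbers.

    # Power Usage Effectiveness (PUE) = total facility power / IT equipment power
    # The load figures below are hypothetical, chosen only to illustrate the calculation.
    it_load_kw = 8_000           # assumed draw of servers, storage and network gear
    total_facility_kw = 11_200   # assumed IT load plus cooling, UPS losses, lighting, etc.

    pue = total_facility_kw / it_load_kw
    print(f"PUE = {pue:.2f}")    # prints 1.40: a 40% overhead beyond the IT equipment itself

Lower is better; the closer a facility drives its PUE toward 1.0, the less of its power bill goes to anything other than computation.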

The absolute requirement for a data center is guaranteed power at “five nines” (99.999%) or even higher levels of assurance. The offsite utility power line comes into a transfer switch shared with the generator circuit, so the generators can step in within several seconds of a grid outage. That is not good enough on its own, though, so an Uninterruptible Power System (UPS) is also connected. The UPS may be a long-life (over 12 years of useful life) lead-acid or lithium-ion battery system or a continuously spinning flywheel. (See APC White Paper No. 92 for their respective merits.) The UPS can keep the critical load in steady operation until the generator comes on or the grid source resumes.
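The arithmetic behind those “nines” is worth a moment. At 99.999% availability, only about five minutes of downtime are tolerated in an entire year, as the illustrative Python sketch below shows.

    # Permitted downtime per year at common availability levels ("nines")
    minutes_per_year = 365.25 * 24 * 60  # about 525,960 minutes

    for label, availability in [("three nines", 0.999), ("four nines", 0.9999), ("five nines", 0.99999)]:
        downtime_minutes = minutes_per_year * (1 - availability)
        print(f"{label} ({availability:.3%}): roughly {downtime_minutes:.1f} minutes of downtime per year")

Three nines allows nearly nine hours of outage per year; five nines allows barely more than five minutes, which is why the transfer switch, generators and UPS are layered on top of the utility feed.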

The cooling requirements of Blackacre are daunting. Even with built-in fans, the IT equipment would overheat quickly were it not for the complex circulatory systems in the main building. The American Society of Heating, Refrigerating and Air-Conditioning Engineers (ASHRAE) recommends that data center IT room temperatures and humidity be kept in a controlled range. While some larger centers have chilled water systems with Computer Room Air Handlers (CRAHs), others use a compressor-based system with Computer Room Air Conditioning (CRAC) units, in which the familiar refrigerant cycle of evaporation, compression, condensation and expansion of a low-boiling-point fluid removes heat from the circulating air.

For Blackacre, by contrast, Data Hall Air Handling Units (DAHUs) are installed in dedicated galleries. Outside air is drawn from each building’s perimeter. The DAHUs use a “flooded room” design, meaning that no ductwork or raised flooring systems direct the cooling air to the IT racks’ air intakes. Instead, all DAHUs in each mechanical gallery discharge cooling air into the adjacent common supply air header. The potable water demand for the entire site is relatively modest, with half dedicated to evaporative cooling and the balance to landscaping and domestic needs.

Continued expansion of the data center sector to accommodate AI and other demands will depend on ever more efficient designs and operations of server components, innovative power systems, and cooling systems that minimize energy losses. Pillsbury advised on contracts for Stanford University’s award-winning central energy plant using heat exchangers to capture excess heat from data centers to warm dormitories. At locations more remote than Blackacre, centers may be able to take advantage of geothermal temperature gradients for both power and cooling.

HUMAN RESOURCES
Employee and contractor workforces are a critical part of Blackacre’s operation. The campus currently employs 52 local residents on a full-time basis, with monitoring, security and maintenance shifts staffed around the clock. Representatives of the industrial real estate company, the customer base, and equipment suppliers visit frequently to upgrade the technology assets and security systems. Maintenance contractors are also engaged on a periodic basis, all with the objective of uninterrupted performance at the highest industry levels.

Personnel (sometimes even working with their attorneys on Pillsbury’s data centers team!) also maintain compliance with laws, contracts, permits and industry practices. Relevant standard-setting organizations include ASHRAE, the Uptime Institute, the Telecommunications Industry Association (TIA, whose TIA-942 family provides comprehensive data center specifications), and the Building Industry Consulting Service International (BICSI). Available sustainability certifications include LEED, Energy Star, and the Building Research Establishment Environmental Assessment Methodology (BREEAM).

CONCLUSION
Everyone talks about data centers, but few can fully describe one. Indeed, they sometimes even look like “black boxes”; architects do not typically splurge on windows and aesthetic touches. The need for security limits visibility into how they are built and operated. There are many variations on Blackacre, and this article is no substitute for learning about any given facility’s features. We nonetheless find it useful to have a reference point to compare with our new projects in this rapidly expanding field.

