This is the age of Big Data. Given the flood of information, it is also the age of the big data center. Most of us don’t think much about these out-of-sight, out-of-mind hives of humming computers, but they are central to the modern wired economy, and they eat up energy in proportion to their importance: an annual consumption equivalent to the output of 35 large coal-fired power plants.
Data centers are everywhere, some of them massive (think Google, Amazon and the other giants of the cloud), some of them small-scale facilities that serve retailers and home businesses. Many do business with the federal government, and the government operates many more, which brings us to the nut of our story.
By the federal government’s own estimation, set out in a policy memorandum released on August 1 by Federal CIO Tony Scott, there are entirely too many of these data centers operating entirely too inefficiently, many running at well under 50 percent of capacity. As long ago as 2010, the Obama administration was considering ways to reduce their number, both to make the federal footprint more manageable and to cut energy consumption, to say nothing of easing the way toward comprehensive security plans, which are possible only when the government knows just how many of these facilities are out there, and where.
To this end, in 2014, President Obama signed into law the Federal Information Technology Acquisition Reform Act (FITARA), which requires federal agencies to undertake a comprehensive data center inventory, develop performance metrics, and provide an annual report detailing costs and savings measures. As of August 1, a new policy measure, the Data Center Optimization Initiative (DCOI), supplants the similar Federal Data Center Consolidation Initiative of 2010, requiring federal agencies to “develop and report on data center strategies to consolidate inefficient infrastructure, optimize existing facilities, improve security posture, achieve cost savings, and transition to more efficient infrastructure.”
As of April 2017, under the terms of the DCOI, federal agencies will not be able to budget for the construction of new data centers or “significantly” increase the size of existing ones. Instead, agencies, whether or not they need more space, will be required to analyze their data needs and consider alternatives such as pooling resources with other agencies, using cloud services, or working with a third party; the last two options would seem to offer opportunities to data entrepreneurs in the private sector.
The General Services Administration (GSA) and Office of Management and Budget (OMB) will develop standards for interagency cooperation, and GSA will build a shared services marketplace that includes outside service providers. All these efforts are meant to work toward a set of larger goals: optimizing existing data centers, closing about half of them, and arriving at inarguably significant savings. The targets are to pare data center spending by at least 25 percent of the 2016 figure by the end of FY 2018 and, in the longer run, to cut in half the $5.4 billion spent in FY 2014, a savings of $2.7 billion.
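For readers who want to check the math, here is a minimal sketch of that arithmetic in Python. The figures are those reported above; the FY 2016 baseline (fy2016_spend_b) is a hypothetical placeholder, since that number is not quoted in this article.

```python
# Illustrative arithmetic only, using the figures reported in this article.
fy2014_spend_b = 5.4   # federal data center spending in FY 2014, $ billions
long_run_cut = 0.5     # the stated long-run goal: take those costs down by half

savings_target_b = fy2014_spend_b * long_run_cut
print(f"Long-run savings target: ${savings_target_b:.1f} billion")  # $2.7 billion

# Near-term DCOI target: spend at least 25 percent less than the FY 2016 figure
# by the end of FY 2018. fy2016_spend_b is a hypothetical input; the article
# does not report the FY 2016 baseline.
def fy2018_spend_ceiling(fy2016_spend_b: float, cut: float = 0.25) -> float:
    """Maximum allowable annual spend by end of FY 2018 under the DCOI cut."""
    return fy2016_spend_b * (1 - cut)
```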
All of this is a tall order, one that asks agencies across the government to fall in line in a relatively short time, certainly shorter than most reform initiatives are given. To top it off, the effort must be transparent: beginning this year, OMB will publish progress reports detailing planned and realized data center closings, cost savings, and other metrics. Updated information on the DCOI and FITARA will be made available at the OMB website “Management and Oversight of Federal Information Technology,” and we will report on developments as they become known.