Thursday, July 10, 2008

$4.5 Billion Annual Data Center Power Bills

Data centers in the United States consume more than 60 billion kWh of energy each year, at an annual cost of $4.5 billion, according to the Environmental Protection Agency. Energy consumption has doubled since 2000. Much of that spending is for cooling systems to deal with all the heat produced by the power-gobbling servers that are the muscle inside any data center.
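As a quick sanity check on those figures, dividing the bill by the consumption gives the implied average electricity rate. A minimal sketch in Python, using only the EPA numbers cited above:

    # Implied average electricity rate from the EPA figures above.
    annual_kwh = 60e9        # more than 60 billion kWh per year
    annual_cost_usd = 4.5e9  # $4.5 billion annual bill

    rate = annual_cost_usd / annual_kwh
    print(f"Implied average rate: ${rate:.3f}/kWh")  # about $0.075/kWh

That works out to roughly 7.5 cents per kWh across the whole industry.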

Studies by many of today's largest corporations agree that new IT equipment must cut power consumption by 10 percent to 20 percent. That requirement, combined with the inability of data centers to keep scaling operations on current technology, virtually assures new generations of power-efficient servers.

Some data centers already are finding that the key constraint on further growth in hosting capacity is the inability to get any more power from the grid at their current locations.

Much the same can be said for end-user devices, especially mobiles. Broadband mobile applications consume more power, which means bigger or better batteries. And since device size is at a premium these days, that effectively means better batteries.
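To make the battery constraint concrete, here is a minimal sketch of the runtime math; the capacity and draw figures are illustrative assumptions, not measurements:

    # Rough handset runtime: battery energy divided by average draw.
    battery_wh = 5.2         # e.g., a 1400 mAh cell at 3.7 V (assumed)
    idle_draw_w = 0.3        # assumed average draw, radio mostly idle
    broadband_draw_w = 1.2   # assumed average draw, broadband radio active

    print(f"Idle runtime:      {battery_wh / idle_draw_w:.1f} hours")
    print(f"Broadband runtime: {battery_wh / broadband_draw_w:.1f} hours")

With those assumptions, heavy broadband use cuts runtime from about 17 hours to just over four, which is why better batteries, not merely bigger ones, are the pressure point.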

Of course, the problem is that processors and memory advance at much faster rates than battery technology does.
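A rough illustration of that gap, assuming processor performance doubles about every two years (the Moore's law cadence) while battery energy density improves on the order of 7 percent a year; both rates are assumptions for the sketch:

    # Compound growth over a decade: processors versus batteries.
    years = 10
    cpu_gain = 2 ** (years / 2)     # doubling every two years (assumed)
    battery_gain = 1.07 ** years    # ~7% per year energy density (assumed)

    print(f"Processor performance: ~{cpu_gain:.0f}x over {years} years")
    print(f"Battery capacity:      ~{battery_gain:.1f}x over {years} years")

On those assumptions, processors pull ahead by roughly a factor of 16 over ten years, so the power budget, not the silicon, sets the limit.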
