
This blog post was originally published by Bitfury. It is reprinted here with the permission of Bitfury.

Discussions about the future of artificial intelligence (AI) often focus on the incredible value that can be gleaned from data. The idea that “data is the new oil” was first attributed to mathematician Clive Humby in 2006, followed by complementary predictions from The Economist, the World Economic Forum, Cisco, Hacker Noon, and several others.

The metaphor holds up quite well, if data is viewed as a commodity. Data is indeed valuable, and it is quickly becoming available at a scale comparable to resources like oil. Similar to oil, data must be refined to be useful.

What is missing from these discussions, however, is a frank accounting of the environmental impact of this “commodity.” Just like oil, data harbors a powerful, destructive threat to our environment. There is no climate contingency plan for the impending “age of supercomputing,” and one is desperately needed. To create one, we first need to understand the scale of the problem and the factors behind its insatiable growth.

A Tsunami of Data

The advent of the internet and its worldwide connectivity gave us the “age of information.” That connectivity brought us into a world of cell phones, social media and smart devices. It also brought us an absolute tsunami of data. We are inundated with information at a scale we can barely recognize, much less utilize. By 2020 (in just a few months), humankind will be producing an estimated 44 zettabytes of data, and each person on earth will be creating an estimated 1.7 megabytes of data per second. This production will only accelerate, driven by the 125 billion Internet of Things devices expected to be connected by 2030.
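
To put 44 zettabytes in perspective, here is a quick unit conversion. This is a rough sketch: the decimal (SI) byte prefixes and the roughly 7.8 billion 2020 world-population figure are assumptions, not figures from the original text.

```python
# Rough per-person share of a 44-zettabyte global datasphere.
# Assumptions: decimal SI prefixes, ~7.8 billion people in 2020.
ZETTABYTE = 10**21   # bytes
TERABYTE = 10**12    # bytes

total_bytes = 44 * ZETTABYTE
world_population = 7.8e9

per_person_tb = total_bytes / world_population / TERABYTE
print(f"roughly {per_person_tb:.1f} TB of data per person")
```

That works out to several terabytes for every person on earth, which helps explain why so little of it has ever been analyzed.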

To date, only about 2 percent of this data has been analyzed. Companies are now turning to AI and high-performance computing to dive into the other 98 percent.

Insatiable Demand for AI

Raw data alone, even in massive supply, is not useful. As a result, companies are investing heavily in AI and high-performance computing to mine insights from this data. This is unquestionably valuable, both for the companies’ bottom line as well as for our own quality of life. The positive impacts are too numerous to list, but the “refinement” of data could provide us with medical breakthroughs, hyper-precise weather predictions, and personalized learning plans, just to name a few.

However, this increase in computing power is not without negative effects. Last June, researchers at the University of Massachusetts Amherst published a paper revealing the astonishing carbon footprint of neural network processing (a type of AI computing). They found that training a large neural network to recognize patterns in data can emit more than 626,000 pounds of carbon dioxide equivalent. This is nearly five times the lifetime emissions of the average American car (including the manufacturing of the car itself). And these networks are doubling in size about every 18 months (as more data becomes available), meaning their computational requirements are growing as well.
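
As a rough sanity check on those figures, the arithmetic below converts the training-run emissions to metric tonnes, back-derives the implied car-lifetime baseline from the stated five-to-one ratio, and projects the 18-month doubling over five years. The pound-to-kilogram factor is standard; everything else simply restates the article's own numbers.

```python
# Back-of-the-envelope arithmetic for the emissions figures above.
LB_TO_KG = 0.4536

training_lb = 626_000                   # CO2e for one large training run, in pounds
training_tonnes = training_lb * LB_TO_KG / 1000

# "Nearly five times" a car's lifetime emissions implies this baseline:
car_lifetime_lb = training_lb / 5

# Doubling every 18 months means growth over n years of 2**(12 * n / 18):
growth_5yr = 2 ** (12 * 5 / 18)

print(f"one training run is about {training_tonnes:.0f} tonnes CO2e")
print(f"implied car lifetime is about {car_lifetime_lb:,.0f} lb CO2e")
print(f"network size after 5 years grows about {growth_5yr:.1f}x")
```

At that doubling rate, network sizes grow by roughly an order of magnitude every five years, which is why efficiency gains alone are unlikely to keep pace.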

As a community, we should act responsibly now to ensure “the age of supercomputing” is environmentally sustainable. It will not be easy: current computing technologies are not well suited to the demands of AI, and the slowing of Moore’s Law will hinder quick advances.

However, there are plenty of ways we can make improvements immediately.

First on the list is to rethink our datacenters. The cooling systems in most datacenters are horribly inefficient and cannot adequately cool high-performance computers or support new technologies like 5G. Recent advancements in datacenter design, such as immersion cooling, are far better suited to this task and are easily paired with renewable energy grids.

Another step we can take is to use superior hardware. Soon we will have computing chips that require very little power (known as ultra-low-voltage chips), as well as new computing architectures, such as massive parallelization, that will significantly reduce the power needed to run AI applications.

These innovations can significantly lessen our carbon footprint and will be followed soon by the availability of sustainable AI computing products (which Bitfury AI is working on now).

The good news is that it is still early in the age of supercomputing. We can chart a course now towards a more sustainable future, perhaps one where data is less like oil and more like renewable energy.
