Sometimes I like to think about long-term problems. Since the Australian wildfires have put the issue of climate change on my mind, I’m wondering whether our visions for great things in technology and gaming, such as cloud gaming or the Metaverse, will make the problem worse.
At the moment, cloud gaming is probably not producing enough pollution to be on anybody’s list of the top contributors to climate change. But if the dreams of cloud gaming companies come true, then we’ll need to start worrying about their contribution to the problem.
And it’s great to have games that raise awareness about the issue of climate change — The Climate Trail, Jupiter & Mars, and Eco — but there is a small irony here: if those games become really popular, then they too will contribute to climate change.
Data centers melting the polar ice caps?
In the big picture, data centers and the tech gadgetry that connects to them are a concern. SoftBank and Arm predict that the internet of things — everyday devices that are smart and connected — could reach more than a trillion units by 2035. To date, Arm’s customers have shipped 150 billion processors.
Those things are going to connect to data centers, over 5G networks or other internet connections. A Bloomberg story said that power efficiency gains in data centers have bottomed out, according to the Uptime Institute.
“Even with efficiency gains, data center electricity demand is voracious and growing; that growth has a number of implications for the power grid and for power utilities,” the Bloomberg story said.
Add to that the problem of the slowing of Moore’s Law, the 1965 prediction by Intel chairman emeritus Gordon Moore that the number of transistors on a chip would double every couple of years. That was a proxy for continuous electronics progress, meaning that as long as Moore’s Law continued, computing would become more efficient, doing more computations for the same cost or less power.
Intel, one of the world’s biggest chip makers, has struggled with its transition to the next doubling. That has raised concern that the law that held up for the past 55 years is coming to its end as we approach the limits of the laws of physics. That slowdown comes at a bad time as the demand for data centers rises.
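The scale of what Moore’s Law delivered over those 55 years is worth making concrete. A minimal sketch of the compounding, using the article’s “doubling every couple of years” framing (the two-year period and the 55-year span are the only inputs; the function name is mine):

```python
# Moore's Law as described above: transistor counts double roughly
# every two years. This projects the implied growth factor.

def transistor_growth(years, doubling_period=2.0):
    """Growth factor after `years` of steady doubling."""
    return 2 ** (years / doubling_period)

# 55 years of doubling every two years compounds to a
# roughly 190-million-fold increase in transistor counts.
print(f"{transistor_growth(55):.2e}x")
```

That compounding is why even a modest slowdown in the doubling period matters so much: the efficiency gains that data centers have relied on stop arriving on schedule.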
Where cloud gaming makes this worse
Some things about cloud gaming concern me. Some wags have figured out that streaming high-end games consumes something like 100MB a second. Based on that, if you play a game like Red Dead Redemption 2 (with more than 100 hours of gameplay), then that one game played across a month could exceed your monthly data cap from a cable provider. That’s a lot of computing usage, and it will put pressure on data centers.
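The arithmetic behind that claim is straightforward. A back-of-the-envelope sketch using the figures above — the ~100MB-a-second stream rate and 100 hours of gameplay come from the text; the 1TB monthly cap is my assumption for a typical U.S. cable plan:

```python
# Back-of-the-envelope data usage for streaming one long game.
STREAM_MB_PER_SEC = 100   # rough streaming rate cited above
GAMEPLAY_HOURS = 100      # e.g. Red Dead Redemption 2
MONTHLY_CAP_TB = 1.0      # assumed typical cable data cap

# MB/s * seconds/hour * hours, converted to terabytes
total_tb = STREAM_MB_PER_SEC * 3600 * GAMEPLAY_HOURS / 1_000_000
print(f"Total streamed: {total_tb:.0f} TB vs a {MONTHLY_CAP_TB} TB cap")
# → Total streamed: 36 TB vs a 1.0 TB cap
```

At that rate the stream blows past a typical cap many times over, so even if the true per-second figure is an order of magnitude lower, the conclusion holds.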
If we’re going to live in a series of connected virtual worlds, which are like being inside video games like in Ready Player One or The Matrix, I can’t imagine that’s going to make this problem of electricity consumption any better.
What are the answers?
I have been asking chip industry executives about this question.
Arm’s CEO Simon Segars told me that “Moore’s law aside, I think that the deployment of IoT and the AI processing of data can do a lot to help with some of the issues of climate change. We’ve been a believer, and publicly outspoken, on the role that technology can play in addressing all of the U.N. global goals, whether it’s to do with climate change, quality of water, education, whatever. If you look across the global goals, technology can help with all of them. There’s a lot of inefficiency. This thermostat cranking out freezing cold air when we’re all not enjoying it — a wall switch would help here. But there’s really a lot of inefficiency in the world. There’s low-hanging fruit here that doesn’t take much to address. We have the technologies we need for that now.”
Most of the responses I have received from executives such as Arm’s Drew Henry, AMD’s Mark Papermaster, and others fall into this kind of category. Sure, the internet of things will consume material and energy resources, but it will make us more efficient. But I don’t see anyone really making nuanced arguments about how to architect the data centers and the internet of things in the right way.
A study in Nature estimated that data centers consume about 0.3% of the world’s electricity but are on their way to becoming a far bigger slice of the pie. It also raised concern about the rise of cryptocurrencies such as Bitcoin, whose blockchain structure — verifying transactions through the coordination of a lot of computers — is a real energy hog.
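To put that percentage in absolute terms, here is a rough sketch. The 0.3% share is the figure cited above; the ~25,000 TWh/year global electricity total is my assumption (roughly right for the late 2010s), not from the article:

```python
# Rough absolute scale of the data-center share cited above.
WORLD_TWH_PER_YEAR = 25_000   # assumed global electricity consumption
DATA_CENTER_SHARE = 0.003     # 0.3%, per the Nature estimate

dc_twh = WORLD_TWH_PER_YEAR * DATA_CENTER_SHARE
print(f"Data centers: ~{dc_twh:.0f} TWh/year")
# → Data centers: ~75 TWh/year
```

Tens of terawatt-hours a year is already the output of dozens of large power plants, which is why even a modest growth in that slice of the pie matters.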
Nvidia acknowledges that cloud gaming data centers have an impact on the environment, and it is thinking about ways to make its efforts carbon neutral. But it doesn’t have a solution yet. Meanwhile, Microsoft and Google have committed to making their data centers carbon neutral or negative. That means employing alternative energy sources such as solar and other clever ideas.
Offsetting the demand for electricity
Norman Liang, a game industry observer, noted that cloud gaming could end up saving money if it means that players will buy less hardware in the future. If you can stream great games with high-end graphics and play them on any piece of hardware, even old machines, then you don’t have to buy as many consoles or PCs.
Old game consoles and PCs are big sources of electronic waste, as owners have no secondary or long-term use for outdated technologies. In that way, companies that spend more on capabilities in the cloud could offset spending by consumers at the edge.
There is also a lot of “dark fiber” throughout the world — fiber-optic cables that were laid but remain under-utilized for transporting data. By tapping this resource, the world’s networks could become more efficient without incurring more expense or power consumption.
“I’m an optimist by nature. What opportunity do we have, when we have more data available to us than we’ve ever had, and more computation than we’ve ever had?” said Papermaster, the chief technology officer at Advanced Micro Devices. “You look at what was announced by the Department of Energy with the Frontier supercomputer, where we’re partnering with HPE Cray to deliver 1.5 exaflops of computing in 2021. If you marry that with, as you said, this massive amount of data we’ve never had before, what analytics can we run that we never have before that can improve society? What can we do, based on that kind of analysis, that can inform us on how to contain climate change?”
Let’s hope that the optimists are right, but I’d sure love to see real studies on this issue and how to change our direction if that’s necessary.