JOE KAVA

Vice President, Google Data Centers

Published in Issue 5 | December 2020

As Google’s VP of Data Centers, Joe Kava heads engineering, construction management and delivery, critical facility operations, environmental health and safety, sustainability, energy, and location strategy for Google’s global fleet of data centers. He joined Google in 2008 from RagingWire Data Centers (now owned by NTT), where he was COO, responsible for design, construction, facility operations, managed services, and business planning and development.

Prior to joining the data center industry, Kava spent 17 years in the semiconductor industry, working at LSI Logic and Applied Materials in various technical and executive roles. He holds four US patents for his work on reactive ion and plasma etch technology at Applied Materials, and he has a Bachelor of Science in Materials Engineering from California Polytechnic State University, San Luis Obispo.

Kava’s passion for educating the global data center industry earned him the 2019 iMasons Sustainability Champion Award. During his tenure at Google, he has been a staunch advocate for building sustainable data centers and implementing Google’s cloud computing strategy. Beyond Google, Kava wants others to benefit from the company’s pioneering work on efficiency best practices.

Google owns and operates data centers globally. What does it take to maintain their security, efficiency and sustainability?

The foundation of the philosophy guiding our design choices is that our data centers are designed to be efficient and sustainable from the outset. Our design teams look for innovative ways to improve the electrical and cooling systems, and we often work with the natural environment surrounding our data centers to minimize our impact on it. We are extremely resourceful when it comes to cooling: some examples include reclaimed storm water in South Carolina, wastewater from a treatment plant in Georgia, canal water in Belgium, and seawater from the Gulf of Finland. Our efforts to reduce our impact have been substantial.

As we continue to monitor our impact and how it can evolve, we are working to accelerate the transition to renewable energy. We are already at the point where our data centers consume about half the energy of a typical data center, while delivering more than seven times the compute power we did just five years ago. We’re not stopping, and we’ll keep optimizing both ends of the scale: energy efficiency and compute efficiency.

How do your cloud customers apply your focus on sustainability when thinking about their own carbon footprint?

A Google Cloud customer automatically receives our green guarantees. Because they are consuming data and compute services from a company running on renewable energy, they don’t even need to think about how to put our sustainability commitments to work for their own benefit.

Yet customers have their own roadmaps and objectives, too. One of our big cloud customers, Etsy, already runs on Google Cloud, the greenest cloud on Earth. Etsy also has its own social, economic and ecological objectives, such as using 100 percent renewable energy by 2020 and cutting its overall energy consumption 25 percent by 2025. Our work with Etsy gives its sellers the best way to connect with their buyers, which naturally supports Etsy’s own objectives for revenue and social impact.

How has AI changed the way Google processes data and runs data centers? What are some of the machine learning (ML) technologies deployed? How do these technologies come into play when processing data?

Well, these days many big companies want to use ML and AI to optimize their operations, with systems that we pioneered. Several years ago, in partnership with DeepMind (now an Alphabet-owned AI unit headquartered in London), we developed an algorithm that lets an AI system operate part of our cooling systems autonomously and safely. The results were promising.

In the areas where we rolled out the AI system, we saw an estimated 40 percent savings in the overhead energy used for cooling. We also looked at the biggest limitations to driving renewable energy onto the grid: the sun doesn’t shine reliably and the wind doesn’t blow constantly. For planning purposes, grid operators need to know ahead of time how much demand the grid will have and, subsequently, how much generation capacity they’ll need. With those needs in mind, we created another algorithm with DeepMind that accurately predicts the output of a wind farm up to 36 hours in advance of the actual power generation. That foresight and predictability gives grid operators hundreds of megawatts of wind energy they can confidently put on the grid. When running our data centers, we also use AI and ML to decide things like when to time refactoring, tech refreshes and new capacity deployments, and how to balance those deployments more efficiently.
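To make the forecasting idea concrete, here is a minimal, hypothetical sketch of a 36-hour-ahead wind output predictor of the general kind described above. It is not Google’s or DeepMind’s actual model: the features (forecast wind speed and direction, hour of day), the synthetic data and the gradient-boosted regressor are all illustrative assumptions.

```python
# Hypothetical sketch: predict a wind farm's output 36 hours ahead from
# weather-forecast features. Illustrative only -- not Google/DeepMind's model.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Assumed training data: one row per hour, with features taken from a
# weather forecast issued 36 hours before the target hour.
n_hours = 5_000
forecast_wind_speed = rng.uniform(0, 25, n_hours)   # m/s
forecast_wind_dir = rng.uniform(0, 360, n_hours)    # degrees
hour_of_day = rng.integers(0, 24, n_hours)

# Toy "actual" output (MW): a crude power curve plus noise.
actual_output_mw = np.clip(12 * (forecast_wind_speed - 3), 0, 200) \
    + rng.normal(0, 10, n_hours)

X = np.column_stack([forecast_wind_speed, forecast_wind_dir, hour_of_day])
y = actual_output_mw

X_train, X_test, y_train, y_test = train_test_split(X, y, shuffle=False)

model = GradientBoostingRegressor().fit(X_train, y_train)
print("R^2 on held-out hours:", round(model.score(X_test, y_test), 3))

# A grid operator could use predictions like
# model.predict(latest_forecast_features) to plan how much wind generation
# to commit to the grid 36 hours ahead of actual delivery.
```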

Tell us more about Google’s commitment to operate all its data centers and campuses on carbon-free energy 24/7 by 2030.

Google recently entered its third decade, both of its existence and of tackling climate change, and this is our most ambitious objective yet. We first achieved carbon neutrality in 2007, and a decade later we began matching 100 percent of our consumption with renewable energy purchases for the entire enterprise: data centers, offices and everything worldwide. Google was the first large corporate purchaser to accomplish this, and we’ve been matching annually ever since. The problem now is those parts of the world where public and energy policies don’t allow us to bring renewable energy projects onto the grid. So, technically, on a global basis we do not consume 100 percent renewable energy on an hour-by-hour basis. Over the course of each year we buy enough megawatt hours to match our total load, but even then we haven’t matched it exactly hour by hour.

Our new goal is to match our consumption with 100 percent carbon-free energy 24/7 by 2030. Achieving this will take a two-fold plan: first, getting various carbon-free energy projects to work together; and second, developing new technologies, such as storage, that allow renewable energy to be banked when it’s abundant so it can be used when there’s a shortage. We’ll also need certain policy advancements to make headway bringing more renewables onto some grids around the world. We have made remarkable progress toward being carbon-free, but there is still a lot more work to be done.

Thanks to the renewable energy power purchase agreements (PPAs) we’ve signed, our data centers worldwide now match about 65 percent of their consumption on an hour-by-hour basis, and some sites match even more. Consider Finland the model example: since 2017, our data center there has matched about 97 percent of its energy usage with renewable energy on an hour-by-hour basis, made possible by a grid with a high share of renewable energy. It has definitely helped that Nordic policies allow power to be transferred from a renewable energy project in Sweden to Finland.

That flexibility allows us to rapidly increase the renewable energy matching percentage at that site, so we’ve made several deals in Finland and Sweden. In the US, by contrast, the grid in Iowa has only about 18 percent carbon-free energy on average; there we have signed a lot of wind PPAs, which brought our matching in the US up to 74 percent, and in the Netherlands, with solar PPAs added into the mix, up to 69 percent.
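As a worked illustration of why annual matching and hour-by-hour matching differ, here is a minimal sketch under assumed data. The hourly load and carbon-free supply series, and the simple definition of matching (carbon-free energy counts only up to that hour’s load), are illustrative assumptions, not Google’s reporting methodology.

```python
# Hypothetical sketch: compute an hour-by-hour carbon-free energy (CFE)
# matching percentage for one site, and compare it with annual matching.
# Data and definitions are illustrative assumptions only.
import numpy as np

hours = 24 * 365
rng = np.random.default_rng(1)

# Assumed hourly series (MWh): a fairly flat data center load, and a
# carbon-free supply (PPAs plus the grid's carbon-free share) that varies.
load_mwh = rng.uniform(90, 110, hours)
carbon_free_supply_mwh = rng.uniform(0, 180, hours)

# In each hour, only supply up to that hour's load counts toward matching;
# surplus in a windy hour cannot offset a shortfall in a calm one.
matched_mwh = np.minimum(carbon_free_supply_mwh, load_mwh)

hourly_cfe_percent = 100 * matched_mwh.sum() / load_mwh.sum()
annual_match_percent = 100 * min(carbon_free_supply_mwh.sum(),
                                 load_mwh.sum()) / load_mwh.sum()

print(f"Hour-by-hour CFE matching: {hourly_cfe_percent:.0f}%")
print(f"Annual (volumetric) matching: {annual_match_percent:.0f}%")
```

The gap between the two printed figures is the gap Kava describes: buying enough megawatt hours over a year is not the same as running on carbon-free energy every hour.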