Exclusive Interview with Joe Kava, Vice President, Data Centers

As Google’s VP of Data Centers, Joe Kava heads engineering, construction management and delivery, critical facility operations, environmental health and safety, sustainability, energy, and location strategy for the company’s global fleet of data centers.

He joined Google in 2008 from RagingWire Data Centers (now owned by NTT), where as COO he was responsible for design, construction, facility operations, managed services, and business planning and development.

Prior to joining the data center industry, Kava spent 17 years in the semiconductor industry, working at LSI Logic and Applied Materials in various technical and executive roles. He holds four US patents for his work in reactive ion and plasma etch technology during his time at Applied Materials, and he has a Bachelor of Science degree in Materials Engineering from California Polytechnic State University, San Luis Obispo.

Kava’s passion for educating the global data center industry earned him the 2019 iMasons Sustainability Champion Award. During his tenure at Google, Kava has been a staunch advocate for building sustainable data centers and implementing Google’s cloud computing strategy. Beyond Google, Kava wants others to benefit from the company’s pioneering efforts to create efficient best practices.

Google owns and operates data centers globally. What does it take to maintain their security, efficiency and sustainability?

From the outset, our data centers are designed to be efficient and sustainable; that philosophy is the foundation of all our design choices. Our design teams look for innovative ways to improve the electrical and cooling systems, and we often take advantage of the natural environment surrounding our data centers to minimize environmental impact. We are extremely resourceful when it comes to cooling: we use reclaimed storm water in South Carolina, wastewater from a treatment plant in Georgia, canal water in Belgium, and sea water from the Gulf of Finland at our site in Finland. These efforts have gone a long way toward reducing our impact.

As we continue to assess our impact and how it can evolve, we are working to accelerate the transition to renewable energy. Our data centers already consume about half the energy of a typical data center, while delivering more than seven times the compute power we did just five years ago. We’re not stopping; we’ll keep optimizing both ends of the scale: energy and compute efficiency.

These colorful lockers play a key role in the disk erase process at Saint Ghislain, Belgium data center

How do your cloud customers apply your focus on sustainability when thinking about their own carbon footprint?

A Google Cloud customer automatically receives our green guarantees. Because they consume data and compute services from a company running on renewable energy, they benefit from our sustainability commitments without having to think about it.

Yet customers have their own roadmaps and objectives, too. One of our big cloud customers, Etsy, already runs on Google Cloud, the greenest cloud on Earth. Etsy also has its own social, economic, and ecological objectives, such as using 100 percent renewable energy by 2020 and reducing its overall energy consumption 25 percent by 2025. Our work with Etsy gives their sellers the best way to connect with their buyers, which naturally supports Etsy’s own objectives for revenue and social impact.

How has AI changed the way Google processes data and runs data centers? What are some of the machine learning (ML) technologies deployed? How do these technologies come into play when processing data?

Well, these days many big companies want to use ML and AI to optimize their operations, with systems that we pioneered. Several years ago, in partnership with DeepMind (now an Alphabet-owned AI unit headquartered in London), we developed an algorithm that enables an AI system to operate part of our cooling systems autonomously and safely. The results were promising.

The heat exchangers play a critical part in keeping the Eemshaven, Netherlands data center cool

In the areas where we rolled out the AI system, we saw an estimated 40 percent savings in overhead energy usage for cooling. We also considered the biggest limitations to driving renewable energy on the grid: the sun doesn’t always shine and the wind doesn’t blow constantly. For planning purposes, grid operators need to know ahead of time how much demand the grid will have and, subsequently, how much generation capacity they’ll need. Looking at those needs, we created another algorithm with DeepMind that accurately predicts the output of a wind farm up to 36 hours in advance of the actual power generation. That foresight and predictability gives grid operators hundreds of megawatts of wind energy they can confidently put on the grid. When running our data centers, we also use AI and ML to decide when to time refactoring, tech refreshes, and new capacity deployments, and how to balance those deployments more efficiently.
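At its core, the wind-forecasting idea Kava describes is a regression problem: learn a mapping from weather forecasts to a farm’s power output, then apply it to forecasts up to 36 hours out. The toy sketch below illustrates only that shape of the task; the data, the cubic power curve, and the least-squares fit are all invented for illustration, and DeepMind’s actual system (a neural network trained on historical turbine and weather data) is far more sophisticated.

```python
import random

# Illustrative only: a made-up power curve standing in for real turbine data.
RATED_CAPACITY_MW = 100.0

def synthetic_power(speed_mps):
    # Toy assumption: output grows with the cube of wind speed, capped
    # at the farm's rated capacity. Real power curves are more complex.
    return min(RATED_CAPACITY_MW, 0.05 * speed_mps ** 3)

# "Historical" pairs of (forecast wind speed, observed output) with noise,
# kept below the capacity cap so the cubic fit stays well behaved.
random.seed(0)
history = [(s, synthetic_power(s) + random.gauss(0, 1))
           for s in (random.uniform(3, 12) for _ in range(200))]

# Fit output ~ a * speed**3 + b by ordinary least squares on x = speed**3.
xs = [s ** 3 for s, _ in history]
ys = [p for _, p in history]
n = len(xs)
mean_x = sum(xs) / n
mean_y = sum(ys) / n
a = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
     / sum((x - mean_x) ** 2 for x in xs))
b = mean_y - a * mean_x

def predict(speed_forecast_mps):
    """Predicted farm output (MW) for a wind-speed forecast up to 36 h out."""
    return min(RATED_CAPACITY_MW, a * speed_forecast_mps ** 3 + b)

print(f"{predict(10.0):.1f} MW")  # roughly 50 MW for a 10 m/s forecast
```

A grid operator would feed such a model's day-ahead predictions into capacity planning; the business value comes from the forecast horizon, not the model's complexity.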
