Nvidia’s groundbreaking H200 GPU is setting new standards, delivering up to double the performance of its predecessor, the H100. This performance leap, however, forces designers to grapple with considerably increased energy demand.
The data center industry, which collectively accounts for nearly 4% of global energy consumption, currently relies heavily on air cooling. Forward-thinking alternatives such as evaporative cooling can cut energy usage by as much as 20%. Yet this energy-conscious approach brings a challenge of its own: substantial water consumption.
Anticipating a greener future, data center design is pivoting toward liquid-cooled server technology. Liquid-cooled AI servers, capable of billions of calculations per second, could mark a monumental leap forward in sustainable computing.
In our latest webinar, Dave Martinez of Sandia National Labs and Steve Harrington of Chilldyne join us to unravel these technological advancements, navigate environmental considerations, and explore collaborative solutions shaping a sustainable future at the intersection of AI and data centers.
Dave Martinez | Sandia National Labs
David Martinez is the Engineering Program Project Lead at Sandia National Labs’ Corporate Computing Facilities (CCF). David has an in-depth understanding of HVAC controls, IT, and facility hardware implementations, and is widely viewed as a DOE resource for both air- and liquid-cooled data center deployments.
Dr. Steve Harrington | Chilldyne
In 2011, Dr. Harrington founded Chilldyne to bring his expertise in engineering, fluid dynamics, and electronics cooling to the data center. His personal goal is to reduce the energy consumption and carbon impact of data centers globally by deploying liquid cooling to as many servers as possible.