TechTalk Daily
NVIDIA is working on several big, worldwide problems. These projects include efforts to mitigate climate change, improve and expand renewable energy, address current and future pandemics with faster cures, speed up critical drug discovery, and help scientists understand the universe around us.
These efforts require massive computing resources, combining HPC (high-performance computing), AI, and supercomputers, both to fully understand the problems involved and to come up with viable ways to mitigate or eliminate them.
Let’s talk about how HPC and supercomputing are changing and how NVIDIA’s efforts are helping to create a better world. This is all drawn from the ISC22 keynote by Ian Buck, VP and GM of NVIDIA’s Accelerated Computing group (you can watch the keynote here).
Buck started out with an overview of the critical workloads being placed on these massively powered, current-generation supercomputers. They break down into five areas: edge computing focused on microscopy image processing; transformer models; simulation at ever-increasing scales; digital twins (like Earth-2, focused on combating climate change); and quantum computing, which, when it fully matures, will disruptively revolutionize much of the computing environment (particularly massive data-set analysis, security, and communications).
These efforts and the related advancements and technologies are dramatically changing what computer science comprises, and they ensure that the computing world of tomorrow will be very different from the computing world of today.
Let’s break down a couple of these highlighted current workloads.
One of the largest efforts NVIDIA is involved in is Earth-2, a global simulation of the Earth at relatively high resolution, so that weather events can be better anticipated and the related loss of life significantly reduced. This resolution started at kilometer scale and has been refined over time. Simulation has also been used to model what the COVID-19 spike protein does and how to mitigate the related damage. If you know a virus well, you can craft a cure for it, and supercomputers have been invaluable in helping craft the cures that have emerged.
FourCastNet, a cooperative, collaborative effort, has proven better able to measure global-scale air movement (atmospheric rivers) critical to anticipating future weather events. Other efforts look at more efficient ways to extract and produce energy using atomic energy (one such effort achieved a 1,000% speed improvement), at modeling plasma physics inside a reactor, and (as Siemens has done) at better planning and implementing predictive maintenance to reduce costs while simultaneously increasing reliability.
These efforts are focused on analyzing the world around us. Light, gravity, magnetic, and other sensors produce a massive amount of data that is simply too large to store. HPC at the edge, and the resulting model enhancements, turn that data into concentrated information that can more accurately identify problems and yield cost-effective, targeted solutions to mitigate them. A lot of this work is done with images that range from world scale down to microscopic scale. Supercomputers can reduce an image to its critical elements, then use that information to expand it back out again into something far easier to interpret. One of the interesting demonstrations looked at cells to reconstruct biology in real time. Applied to cancer, this could be critical both to better understanding how cancer works and to finding less dangerous, more effective ways to treat it. Researchers can not only see these cells interact at a level never seen before, but they can also interact with the cancerous material to determine causes and effects.
Read the full article at techspective >
– Rob Enderle, The Enderle Group