HPC, or high-performance computing, has become a commonly uttered buzzword in pitch meetings and technical presentations, but what exactly do we mean by that? Dell has many server platforms marketed toward HPC workloads, and they vary significantly in their architecture and componentry. So, what exactly is HPC, why should we care, and how does Dell Technologies address that market? Let’s dig in.
The concept of HPC, commonly referred to as supercomputing, is remarkably old and pre-dates what we even think of as a computer. As early as the late 1920s, there existed very large, custom-built tabulating machines designed to perform arithmetic at a scale and speed not possible by other means. At its core, this remains the goal today, although the scope has expanded to many new scientific fields and the scale has grown by many orders of magnitude.
Many would probably consider the dawn of supercomputing to be the early 1960s, when the likes of Control Data Corporation and IBM were jockeying for the title of fastest computer based on a performance standard that survives to this day: FLOPS, or floating-point operations per second. Eventually, Seymour Cray broke off and formed his own company, developing the venerable and widely known Cray series of room-filling, power-sucking computational behemoths.
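As a quick illustration of the FLOPS metric, a machine's theoretical peak is commonly estimated as cores × clock rate × floating-point operations per core per cycle. The sketch below uses hypothetical numbers, not the specs of any machine mentioned here:

```python
# Back-of-the-envelope peak FLOPS estimate:
# cores x clock rate (Hz) x FLOPs per core per cycle.

def peak_flops(cores, clock_hz, flops_per_cycle):
    """Theoretical peak floating-point operations per second."""
    return cores * clock_hz * flops_per_cycle

# e.g. a hypothetical 64-core CPU at 2.0 GHz doing 16 FLOPs/cycle/core:
peak = peak_flops(64, 2.0e9, 16)  # 2.048e12, i.e. ~2 TFLOPS
```

Real-world sustained performance (what the TOP500 list measures) is always some fraction of this theoretical peak.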
The Cray-2 liquid-cooled supercomputer
Fast-forwarding through the next few decades, we went from 4 processors on the Cray-2 to over 7.5 million processing cores in today’s champion, Supercomputer Fugaku, in Japan. The architectural trend has generally shifted from single monolithic systems to distributed compute clusters. Meanwhile, input has gone from stacks of paper punch cards to petabytes of data, and the range of applications of this technology has exploded. Today the term HPC encompasses a broad range of use cases and industry verticals, so let’s take a look at a few.
Big Data Analytics:
Big Data and analytics get a lot of attention today because of their growing importance to competitive businesses. Retailers like Walmart discovered trends in consumer behavior many years ago that gave them a huge competitive advantage: customers who purchased a certain product would also frequently purchase another product. With this simple insight, derived from terabytes upon terabytes of processed point-of-sale data, they could relocate one item in a store to be near the other, thereby improving the shopping experience and driving more sales. For better or worse, in the age of the internet and GPS-equipped smartphones, there is an exponentially growing amount of data that can provide extremely valuable insights to those who invest in its collection, processing, and interpretation.
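The kind of co-purchase analysis described above can be sketched in miniature. This is a toy illustration with made-up baskets, not any retailer's actual system: count how often pairs of items appear together in point-of-sale transactions, then shelve the most frequent pair together.

```python
# Toy market-basket analysis: count co-purchased item pairs.
from collections import Counter
from itertools import combinations

# Hypothetical point-of-sale transactions (one set of items per basket).
transactions = [
    {"beer", "diapers", "chips"},
    {"beer", "diapers"},
    {"milk", "bread"},
    {"beer", "diapers", "bread"},
]

pair_counts = Counter()
for basket in transactions:
    # Sort so ("beer", "diapers") and ("diapers", "beer") count as one pair.
    for pair in combinations(sorted(basket), 2):
        pair_counts[pair] += 1

# The most frequently co-purchased pair suggests items to place together.
top_pair, count = pair_counts.most_common(1)[0]
```

At retail scale this same counting problem spans billions of transactions, which is exactly where distributed HPC clusters earn their keep.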
Genomics:
Many would probably agree that the Human Genome Project is one of the greatest achievements of the past few decades. It has provided an unprecedented level of knowledge about human genetics and has forever changed the field of medicine. Without the ability to compare trillions of sequences per hour via the 800 interconnected Compaq Alpha-based computer systems, this 13-year endeavor would have taken far longer or been impossible altogether.
Weather and Climate:
The information provided by the real-time analysis of meteorological data from thousands of sensors has saved many lives with early and accurate predictions of severe weather phenomena and undoubtedly saved many a picnic from being ruined by an unexpected rainstorm. At the same time, the processing of hundreds of years of weather records has enabled the creation of models that greatly aid in better understanding climate change.
Artificial Intelligence:
AI is one of the newer and more frequently talked about applications of HPC. A computer may need only a few operations to solve a highly complex mathematical problem, yet vast numbers of operations to correctly identify a picture of a hot dog.
AI/ML has the potential to give us the best of both worlds, machine speed paired with human-like perception, but it comes at a very steep cost in computational resources. The potential is limitless in the field of food identification, not to mention e-commerce, voice and facial recognition, education, self-driving cars, agriculture, and finance. Comparing virtual personal assistants such as Siri or Alexa today to their earlier iterations really highlights how far the field has advanced.
So how does it all work?
At a very basic level, there are several steps in the process. Once the problem is defined, one must ingest and store all the relevant data, which could take any number of forms: images, videos, databases, or sensor output. To provide any value, the data must then be cleaned and processed into a usable, meaningful format, as well as sorted and simplified to the greatest extent possible. Finally, the data must be analyzed and models built from it to provide actionable insights. If all this sounds like a lot of work, that’s because it is, and HPC/supercomputing is a huge part of what makes it all possible.
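The steps above can be sketched as a tiny pipeline. This is a minimal, hypothetical example using made-up sensor readings; real HPC pipelines distribute each stage across many nodes, but the flow is the same: ingest, clean, analyze.

```python
# Minimal ingest -> clean -> analyze pipeline sketch.
from statistics import mean

def ingest(raw_lines):
    """Parse raw sensor output, one 'sensor_id,reading' pair per line."""
    records = []
    for line in raw_lines:
        sensor_id, _, reading = line.partition(",")
        records.append((sensor_id.strip(), reading.strip()))
    return records

def clean(records):
    """Drop malformed readings and convert the rest to floats."""
    cleaned = []
    for sensor_id, reading in records:
        try:
            cleaned.append((sensor_id, float(reading)))
        except ValueError:
            continue  # discard unusable data
    return cleaned

def analyze(cleaned):
    """Aggregate readings per sensor into a simple summary 'model'."""
    by_sensor = {}
    for sensor_id, value in cleaned:
        by_sensor.setdefault(sensor_id, []).append(value)
    return {sensor: mean(values) for sensor, values in by_sensor.items()}

raw = ["t1, 20.5", "t1, 21.5", "t2, 18.0", "t2, bad_value"]
model = analyze(clean(ingest(raw)))
# model == {"t1": 21.0, "t2": 18.0}
```

The "model" here is just a per-sensor average; in practice this last stage is where the heavy statistical or machine-learning work, and most of the compute, lives.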
Where does Dell Technologies fit in?
Dell has invested heavily in the HPC market, from the top tier to the entry level, with a number of products available in various configurations to fit different needs and budgets. The HPC5 in Italy, for example, is the 9th most powerful supercomputer in the world (https://www.top500.org/lists/top500/2021/11/) and is built on the Dell C4140 platform running NVIDIA Tesla V100 GPUs and Mellanox HDR InfiniBand interconnects. Options include the exclusively air-cooled XE8545, which offers exceptional computing power and density; the more traditional 2U R750xa, which supports up to four double-width GPUs or six single-width GPUs as well as liquid cooling options; and the DSS8440, with its high-speed PCIe fabric and extensive local storage.
Based on the chart below (https://www.top500.org), the trajectory of HPC and supercomputing is steadily upward and shows no signs of slowing down. In the past 30 years, our collective processing power has increased by roughly seven orders of magnitude. The appetite for HPC technology is enormous, and the market has responded accordingly. Dell Technologies has leaned in heavily, with over $600M invested in AI/ML as of July 2019, and that investment should pay off considerably as the world continues to become more and more data driven.