Benefits of Utilizing HPC in an Enterprise Data Center

I recently attended the 2013 Gartner Data Center Conference in Las Vegas. As enterprises have started to get a handle on gathering and storing their data, they are now grappling with what to do with it.

Many sessions discussed whether to stay with a traditional data center, consolidate legacy systems into a private cloud, or move workloads to a public cloud to shift from a CapEx accounting model to an OpEx one.

Enterprises now need that information analyzed quickly, and they want outcomes based on simulations of the variables in their systems. This problem is not new to research organizations, universities, and governments; they have been using high performance computing for years. Many companies face the same challenge: analyzing big data from many different sources in real time to make business decisions. This is especially true in finance, retail, and manufacturing.

In some cases the data sets are so large that it would be faster to physically ship the storage by courier than to transfer the data over the network. This forces the enterprise to think about moving the computational workload close to the data. All of these processes and logistics require a robust workflow that can span local and remote data centers, including private and public clouds. When data sets are smaller but computational power is still needed, many organizations are turning to the cloud to augment their resources. It is interesting to me that these problems sit squarely in the wheelhouse of supercomputing.
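
To see why, consider a rough back-of-the-envelope sketch in Python. The 1 Gbps link, 70% transfer efficiency, and 24-hour courier window are all illustrative assumptions, not figures from any particular environment:

    # Back-of-the-envelope: is it faster to move the data over the network,
    # or to ship the drives and move the computation to the data instead?
    # Every number here is an illustrative assumption, not a measurement.

    def network_transfer_hours(dataset_tb, link_gbps, efficiency=0.7):
        """Hours to move a data set over a link running at the given efficiency."""
        bits = dataset_tb * 8e12                  # terabytes -> bits
        effective_bps = link_gbps * 1e9 * efficiency
        return bits / effective_bps / 3600.0

    COURIER_HOURS = 24.0                          # assumed overnight drive shipment

    for tb in (10, 100, 1000):
        net = network_transfer_hours(tb, link_gbps=1.0)
        choice = ("ship the drives (move compute to the data)"
                  if net > COURIER_HOURS else "transfer over the network")
        print(f"{tb:5d} TB over 1 Gbps: {net:9.1f} h -> {choice}")

Under these assumptions, even a 10 TB data set takes roughly 32 hours to transfer, longer than overnight shipping, and the gap only widens as the data grows.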

I recently read online that an insurance company was spinning up HPC capacity in a public cloud for a weekly 60-hour job to generate quotes, so it could return quicker responses to customers. Another article profiled a large online company that uses HPC and data analytics for fraud detection, cutting a two-week analysis workload down to near real time; this saved the company millions of dollars in a year.

It is a very exciting time for the enterprise as cloud, big data, and high performance computing technologies converge. An IDC research study found that “97% of companies that had adopted supercomputing said they could no longer compete or survive without it.”

At Adaptive Computing we are building the foundation for this new age of enterprise computing, including scheduling, resource optimization, and workload placement that brings compute near the vast amounts of data. HPC is no longer just a research, university, and government technology; it is becoming commonplace in the commercial enterprise. HPC is now needed to manage the enterprise’s big data.
