Adaptive Computing Integrates Intel HPC Distribution for Apache Hadoop Software to Manage Big Data

We’re proud to announce that we’re integrating our Moab/TORQUE workload management software with the Intel® HPC Distribution for Apache Hadoop software, which combines the Intel® Distribution for Apache Hadoop software with the Intel® Enterprise Edition of Lustre software.

“The solution allows customers to leverage both their HPC and big data investments in a single platform, as opposed to operating them in siloed environments,” said Michael Jackson, president and co-founder of Adaptive Computing. “The convergence between big data and HPC environments will only grow stronger as organizations demand data processing models capable of extracting the results required to make data-driven decisions.”

The Intel HPC Distribution for Apache Hadoop software is an open-source software platform for big data processing and storage, built from the hardware up to deliver industry-leading performance, multi-layered security, and enterprise-grade manageability. The solution addresses the growing adoption of big data analytics and HPC systems in enterprises as well as research institutions.

Enterprises and research labs are looking for a flexible software platform that allows big data analytics applications based on Apache Hadoop to access data located on HPC storage systems. Just as importantly, organizations expect the same performance and manageability from Hadoop workloads that they get from orchestrating HPC workloads today.

The integration marks a milestone in the big data ecosystem by enabling Hadoop and HPC workloads to run together on the same infrastructure, faster and more easily than in isolated environments. The combined technologies from Intel and Adaptive Computing improve both job launch speeds and Hadoop processing speeds, increasing the efficiency and performance of big data workloads in HPC environments.
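In practice, running Hadoop under an HPC workload manager means submitting Hadoop jobs through the same scheduler that handles conventional HPC jobs. As a rough illustration only, the sketch below shows what a TORQUE job script for a Hadoop job could look like; the node counts, walltime, paths, and jar name are hypothetical examples, not details from the announced integration.

```shell
#!/bin/bash
# Illustrative TORQUE job script -- submit with: qsub hadoop-job.sh
# Moab can then schedule this alongside traditional HPC jobs.
# All values below (resources, paths, jar name) are assumptions for illustration.
#PBS -N hadoop-wordcount
#PBS -l nodes=4:ppn=8
#PBS -l walltime=01:00:00

# Run a standard Hadoop example against data on shared storage
# (e.g. a Lustre mount visible to all compute nodes).
hadoop jar /opt/hadoop/hadoop-examples.jar wordcount \
    /mnt/lustre/input /mnt/lustre/output
```

Because the job goes through the scheduler rather than a separate Hadoop cluster, the same policies, priorities, and accounting apply to both workload types.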
