Adaptive Computing’s Big Data Predictions for 2014

We know we need to address the dynamically evolving data center of today and tomorrow. That’s why we’ve researched and compiled our top predictions on the future of computing and big data analytics. Adaptive Computing’s big data predictions encompass several emerging trends—including the collision of cloud computing, HPC, and big data—that we expect to accelerate the ways enterprises extract insights from their data.

Adaptive Computing solidified its predictions based on findings from a hands-on survey conducted at Supercomputing, HP Discover, and the Gartner Data Center Conference in December 2013. More than 400 data center managers, administrators, and users participated, drawn from vertical markets including education, oil and gas, financial and insurance services, government, manufacturing, technology software, and telecommunications.

The speed, accuracy, and cost with which enterprises can run big data analytics are the new competitive battleground. We expect this pressure for faster results to shape computing in the years ahead.

Take a look at our predictions for big data in 2014:

  • Enterprises will combine computing resources for a better big data solution. According to our survey, 91% of organizations believe big data, HPC, and cloud should converge in some combination. As the collision of cloud computing, HPC, and big data intensifies, we predict that organizations will gain a competitive advantage by investing in software capable of scheduling and optimizing data center resources, increasing utilization by orchestrating compute jobs across multiple computing platforms simultaneously (see the sketch after this list).
  • More organizations will turn to HPC as a big data solution. According to our survey, 44% of organizations use HPC as a big data solution. As HPC hardware costs continue to decrease, we forecast that HPC will become an attainable big data solution for more organizations, including midmarket enterprise companies.
  • The big data analysis process will become more automated. According to our survey, the majority of organizations (84%) analyze big data through a manual process. A manual approach is time-consuming and typically leaves computing environments underutilized and siloed, which explains why 90% of respondents said they would be more satisfied with a better analysis process or workflow. To run simulations and data analysis more effectively, we predict that more organizations will automate the workflow, minimizing costly and error-prone manual work.
  • The volume and complexity of big data workflows will begin to impact businesses on a larger scale. Our survey found that 72% of organizations believe workflow impacts their business. This stems from the complexity of setting up different types of data sets and databases, along with the corresponding application each job needs. Running compute- and data-intensive big data workflows with no automation tends to cause logjams and delay results. We foresee greater emphasis on automating workflows to eliminate logjams and help extract key information from big data, accelerating insights for the business.
  • More efficient big data analytics will increase revenue streams. The Gartner document, “User Survey Analysis: Driving Efficiency and Reducing Cost Is King When It Comes to Decision Making for New Technology Solutions,” found that “mobility, big data and analytics were rated as being of greater importance to an organization’s strategy than social.” Gartner deemed that this finding “aligns well with the data received from a recent vendor survey conducted by Gartner in which 2,015 providers expect analytics to account for three times the revenue stream of social.” We predict that big data analytics will drive greater revenue by increasing efficiency, reducing internal costs, and enabling new business models.
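To make the first prediction concrete, here is a minimal, purely illustrative Python sketch of the unified-scheduling idea: a single scheduler placing jobs on whichever platform (HPC, cloud, or big data cluster) currently has free capacity. Every name in it is hypothetical, and a production scheduler such as Moab also weighs policies, priorities, and data locality; this sketch shows only the utilization idea.

```python
# Purely illustrative sketch (hypothetical names, not Adaptive Computing's
# code): one scheduler placing jobs on whichever platform currently has
# the most free capacity, so no silo sits idle while another is overloaded.
from dataclasses import dataclass


@dataclass
class Platform:
    name: str
    total_slots: int
    used_slots: int = 0

    def free_slots(self) -> int:
        return self.total_slots - self.used_slots


@dataclass
class Job:
    job_id: str
    slots_needed: int


def schedule(jobs: list[Job], platforms: list[Platform]) -> dict[str, str]:
    """Greedily place each job on the platform with the most free slots."""
    placement: dict[str, str] = {}
    for job in jobs:
        best = max(platforms, key=lambda p: p.free_slots())
        if best.free_slots() >= job.slots_needed:
            best.used_slots += job.slots_needed
            placement[job.job_id] = best.name
        else:
            placement[job.job_id] = "queued"  # wait until capacity frees up
    return placement


print(schedule(
    [Job("sim-1", 48), Job("etl-1", 8), Job("train-1", 24)],
    [Platform("hpc", 64), Platform("cloud", 32), Platform("hadoop", 16)],
))
# -> {'sim-1': 'hpc', 'etl-1': 'cloud', 'train-1': 'cloud'}
```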

As Rob Clyde, CEO of Adaptive Computing, puts it: “Big data requires a streamlined approach to a complex data analysis and simulation process that can manage all resources across multiple computing platforms.”

That’s why we’ve engineered our Moab software to broker resources across HPC, cloud, and big data platforms, optimizing data center resources by efficiently scheduling compute jobs. Moab is uniquely positioned to enable the enterprise to better leverage big data for game-changing, data-driven decisions.
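For readers who have not used Moab, the short Python sketch below shows roughly what scripted job submission looks like from the user’s side. It is a hedged illustration, not official Adaptive Computing code: msub is Moab’s job submission command, but the resource flags follow generic PBS-style syntax, the script name is hypothetical, and exact options vary by installation.

```python
# Hedged sketch of workflow automation on top of Moab. msub is Moab's
# job submission command, but the resource flags shown follow generic
# PBS-style syntax and the script name is hypothetical; exact options
# depend on the site's installation.
import subprocess


def submit_job(script: str, nodes: int, walltime: str) -> str:
    """Submit a batch script via msub and return the job ID it prints."""
    result = subprocess.run(
        ["msub", "-l", f"nodes={nodes},walltime={walltime}", script],
        capture_output=True, text=True, check=True,
    )
    return result.stdout.strip()


job_id = submit_job("analyze_big_data.sh", nodes=4, walltime="01:00:00")
print(f"Submitted Moab job {job_id}")
```

Automating submission this way, rather than launching each analysis by hand, is the kind of workflow change the predictions above describe: it keeps the queue full, raises utilization, and removes a manual step that is easy to get wrong.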
