2012-06-26 – Optimize Your HPC Environment for Big Data

“Big Data” is changing how business is done. The creation and availability of large amounts of data enable a wide variety of new applications and analytics. By harnessing these massive data sets, organizations can create competitive advantages and market opportunities more quickly. High Performance Computing systems, which historically focused on floating-point performance, now have to handle ever-larger input data sets. To manage these Big Data workloads optimally, HPC systems must take a different approach than they have in the past.

Adaptive Computing, its customer DigitalGlobe, and Intersect360 Research come together in this highly anticipated webinar to discuss Big Data in High Performance Computing. DigitalGlobe has an insatiable appetite for processing Big Data quickly. The company will discuss its applications, what’s driving its success, and how accelerating its HPC environment is driving its productivity. Intersect360 Research, which recently conducted a research study on Big Data, will bring an industry-wide perspective to this timely discussion.

About DigitalGlobe
DigitalGlobe is a leading global provider of commercial high-resolution earth imagery products and services. Sourced from its own advanced satellite constellation, its imagery solutions support a wide variety of uses within defense and intelligence, civil agencies, mapping and analysis, environmental monitoring, oil and gas exploration, infrastructure management, Internet portals, and navigation technology.

About Intersect360 Research
Intersect360 Research is a market intelligence, research, and consulting advisory practice focused on suppliers, users, and policymakers across the High Performance Computing ecosystem. Intersect360 Research relies on both user-based and supplier-based research to form a complete perspective of HPC market dynamics, trends, and usage models.
