In the News

How Big Data Analytics is Aiding Search for Flight 370


A crowdsourcing website is being used to enable the public to join in the hunt for the missing aircraft. As the hours and days go by following the sudden and mysterious disappearance of Malaysia Airlines Flight 370 somewhere in Southeast Asia, more people and organizations are joining the search party. And they are using every tool at their […]

Tech Innovation Convergence: When 1 + 1 Equals Three


Call me a nerd in my deepest soul, but one of the things that excites me most about technology is when two or more concepts get pushed together to create something potentially even more amazing. We’ve witnessed it multiple times in enterprise IT’s past. These resulting capabilities are usually so logical and so liberating that we […]

Big Workflow: The Future of Big Data Computing – Scientific Computing


How can organizations embrace — instead of brace for — the rapidly intensifying collision of public and private clouds, HPC environments and Big Data? The current go-to solution for many organizations is to run these technology assets in siloed, specialized environments. This approach falls short, however, typically taxing one datacenter area while others remain underutilized, functioning […]

Why Big Data needs Cloud – Wired


Ask a hundred pundits, and you’ll get a hundred definitions of big data. Some suggest a specific size (“anything over 50 TB is big data”); others like to talk about the 3 Vs (volume, velocity, variety) or the 4 Vs (3 Vs + veracity). But I think the simplest definition is best: Big data is […]

Adaptive Computing Spans The DigitalGlobe – Enterprise Tech


Given a choice between developing their own cluster and data management software or buying it off the shelf from third parties, most large enterprises would prefer to buy rather than build. This is not always possible, particularly at companies engaged in innovative businesses outside the norm. Such has been the case with satellite imagery provider […]

Cracking the Silos of Custom Workflows – HPCwire


In high performance computing, the time-honored concept of creating tailored workflows to address complex requirements is nothing new. However, with the advent of new tools to analyze and process data—not to mention store, sort and manage it—traditional ways of thinking about HPC workflows are falling by the wayside in favor of new approaches that might […]

Helping data centers cope with big data workloads – ITworld


February 25, 2014, 2:38 PM — Big data applications, on the other hand, tend to suck up massive amounts of compute load. They also tend to feature spikes of activity: they start and end at a particular point in time. “Big data is really changing the way data centers are operating and some of the needs […]

Helping Data Centers Cope With Big Data Workloads – CIO


CIO — The demands of big data applications can put a lot of strain on a data center. Traditional IT seeks to operate in a steady state, with maximum uptime and continuous equilibrium. After all, most applications tend to have a fairly light compute load—they operate inside a virtual machine and use just some of […]

Adaptive Computing’s formula: Workflow management + Big Data = Big Workflow – ZDNet


Adaptive Computing CEO Rob Clyde stopped by to discuss how the company has used its Moab high performance computing (HPC) software to manage Big Data workloads and his company’s attempt to declare a new industry catchphrase, “Big Workflow.” Adaptive Computing has long offered products that help companies deploy HPC workloads. The […]

Slidecast: How Big Workflow Delivers Business Intelligence – insideHPC


In this slidecast, Rob Clyde from Adaptive Computing describes Big Workflow — the convergence of Cloud, Big Data, and HPC in enterprise computing. “The explosion of big data, coupled with the collisions of HPC and cloud, is driving the evolution of big data analytics,” said Rob Clyde, CEO of Adaptive Computing. “A Big Workflow approach […]