Blog Posts

Reviewing The Gauntlet – The Impact of New Standards for Developing Code


Previously, I introduced The Gauntlet, explaining some of our new standards for developing code. Now that we have been through our first release using The Gauntlet, it seems appropriate to review the changes and discuss how they have affected our development. For those who haven’t read the prior piece, essentially we have started following […]

Adaptive Computing: INTERNal Impressions

Holly Vizino - Adaptive Computing

Big data, HPC, high throughput, data staging, and command lines are just a few phrases I hear at work every day. To many, these terms are elementary. However, to me, they seem confusing and complex. ‘Then why do you work at Adaptive Computing?’ you might ask. I’m the Marketing Intern. A senior at Geneva College […]

Broken Processes and Continuous Delivery

Continuous Delivery

There is a term for a process in which a human performs a computer’s job (a simple, repetitive task): broken. As a general rule, we avoid broken processes, and we are therefore making great strides in automating almost all of our software development processes. This is another step in a series on DevOps tools and processes. I feel […]

Intelligent Power Management Control


Our only concern used to be making HPC systems bigger and discovering ways to achieve peak performance. Unfortunately, the fruits of those exciting times created a challenge: how do we afford to power these ever-growing massive machines, limit their carbon footprint, stay within imposed power caps, comply with government regulations, and maintain performance? One way […]

HPC’s After School Special

Trev Harmon on the stage at ISC's Disruptive Technologies panel

Between ISC and HP-CAST (held Friday and Saturday this year), I had several speaking opportunities, wherein I mostly talked about our release of Moab 8.0, and the many new features and improvements it provides to our customers. However, participating as one of the panelists on Disruptive Technologies, I had the opportunity to discuss where I […]

A Big Data Analysis Paradox

Big Data analysis

There is a nuance about Big Data analysis. It’s really about small data. While this may seem confusing and counter to the whole Big Data “movement,” small data is the product of Big Data analysis. This is not a new concept, nor is it unfamiliar to people who have been doing data analysis for any […]
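The paradox the teaser describes can be shown with a minimal sketch (the data and names here are illustrative, not from the post): an analysis job consumes a large volume of raw records and emits only a small summary — the "small data" that people actually use.

```python
from collections import Counter

# Pretend this is a huge stream of raw event records ("Big Data").
events = ["login", "error", "login", "purchase", "error", "login"] * 1000

# The analysis step: aggregate thousands of rows...
summary = Counter(events)

# ...into a handful of numbers -- the "small data" a person acts on.
print(summary.most_common(2))  # [('login', 3000), ('error', 2000)]
```

The output is a two-element list, regardless of how large the input stream grows — which is exactly the point the post makes.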

TORQUE Protocols 101


One definition of protocol put forth by Merriam-Webster is “a system of rules that explain the correct conduct and procedures to be followed in formal situations.” When it comes to network communications, I think that definition pretty much covers it. Sure, there are some technical elements needed, but overall that is all there is […]

Moab and TORQUE: Divide and Conquer

Bull Dutch National Supercomputer

Just like the area of an ever-increasing rectangle, the challenge of scheduling larger and larger HPC systems becomes, in the worst case, a problem of evaluating an ever-increasing number of jobs against an ever-increasing number of nodes to find the best fit. Partitioning the cluster helps with this problem in some cases, but when you […]
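The rectangle analogy can be made concrete with a hypothetical sketch (this is not Moab’s actual algorithm, just an illustration of the worst case): a naive best-fit search evaluates every job against every node, so cost grows as jobs × nodes, and partitioning shrinks each rectangle.

```python
def best_fit_evaluations(jobs, nodes):
    """Count job-vs-node evaluations for a naive best-fit search."""
    evaluations = 0
    for _ in jobs:
        for _ in nodes:
            evaluations += 1  # one feasibility/fit check per pair
    return evaluations

jobs = list(range(1000))
nodes = list(range(500))
print(best_fit_evaluations(jobs, nodes))  # 500000 checks

# Partitioning the cluster shrinks each rectangle: two partitions of
# 250 nodes, each seeing half the jobs, cost 2 * (500 * 250) checks.
print(2 * best_fit_evaluations(jobs[:500], nodes[:250]))  # 250000 checks
```

The halved total illustrates why partitioning helps in some cases — and why, as the post goes on to note, it is not a complete answer.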

Don’t Fix My Software Bugs


Every piece of software has a few broken bits. In some cases broken behavior is harmless enough that we can work around it. In other cases those broken bits warrant action and a Google search is initiated, an email is sent or a trouble-ticket is created. The worst bugs are the kind we build into […]

Adaptive Computing’s DevOps Tool: DOCKER


Previously here on the Adaptive Computing blog, I have explored some of the value we have discovered as a company by implementing various DevOps practices. I talked directly about Builder, an internal tool created by the DevOps group, and some of the direct time savings we have seen within the development, quality, and professional services groups. In […]