Is All Computing Now “Cloud Computing”?

I still remember the first time I saw a web browser. I was a grad student in the early ’90s, and one of my colleagues told me he wanted to show me something cool. He pulled up Mosaic 1.0 and showed me my first “web page”—a brochure-like “welcome” page from CERN, as I recall.

Remember when browsing the web looked like this? Mosaic 1.0. Image credit: NCSA.

I find it interesting that Mosaic was gifted to the world by NCSA—the same NCSA that I visited earlier this week—home to the biggest supercomputing clusters in academia and an institution that continually pushes the envelope of high-performance computing. An innovation from NCSA moved rapidly into the mainstream, with the advent of Netscape, then the browser wars of the late ’90s, the heyday of internet search engines, the dot-com bubble, and so forth.

On the day that I saw my first web page, only a tiny fraction of “computing” had anything to do with the web or Internet. The specialized subclass of computing where network-centrism was critical needed a qualifying adjective. There was general computing, and then there was “networked computing” or “internet-connected computing.”

Today, almost any commodity computer, in any form factor, has a TCP/IP stack; a computer that can’t connect to the worldwide fabric is enough of an anomaly that we talk about “disconnected” or “offline” mode.

There was probably a time when computers with removable media were novel enough to deserve special description. We’ve now lived with floppy drives, optical drives, and USB drives for long enough that we simply assume them: if you have a computer, you probably have some interface for removable media.

I sense that we’re nearing this same tipping point with “cloud” computing. At one point, infrastructure hosted in remote, internet-accessed datacenters was unusual. Software that took advantage of massive scale, that was savvy about latency and transmission hiccups, or that persisted data somewhere offsite was the exception.

Not so much, nowadays.

Eventually, I predict that software will assume that local storage, local RAM, local CPU, local anything except raw display is just another form of cache. Like Mosaic, this mindset was introduced by high-end compute consumers, but it’s sweeping downmarket. The rise of app stores shows how pervasive and comfortable the assumption has become. And the fact that even specialized computing environments are beginning to adopt it is another clue. Consider that the supercomputing community where NCSA lives is now talking about cloudbursting…
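To make the “local resources are just a cache” mindset concrete, here is a minimal sketch of my own (not anything described in the post): a read-through cache where local disk holds only copies, and a hypothetical fetch_from_remote function stands in for whatever cloud store is the real source of truth.

```python
import hashlib
import os

CACHE_DIR = "/tmp/local-cache"  # local storage used purely as a cache


def fetch_from_remote(key: str) -> bytes:
    """Hypothetical stand-in for a call to a remote object store."""
    return f"remote contents for {key}".encode()


def read(key: str) -> bytes:
    """Read-through cache: local disk is just a faster copy of remote truth."""
    os.makedirs(CACHE_DIR, exist_ok=True)
    path = os.path.join(CACHE_DIR, hashlib.sha256(key.encode()).hexdigest())
    if os.path.exists(path):            # cache hit: serve from local disk
        with open(path, "rb") as f:
            return f.read()
    data = fetch_from_remote(key)       # cache miss: go to the cloud
    with open(path, "wb") as f:         # populate the local cache for next time
        f.write(data)
    return data


if __name__ == "__main__":
    print(read("welcome-page"))  # first call fetches from the "remote" store
    print(read("welcome-page"))  # second call is served from local disk
```

The point of the pattern is that nothing breaks if the local copy disappears; the cache can always be repopulated from the remote side.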

We don’t bother to say “color TV” anymore; we just talk about “TV.”

You know an idea has arrived when it’s just assumed.

The day when “computing” automatically implies cloud savvy seems almost upon us.

  • http://www.heroix.com/ Susan Bilder

    We’re not quite there yet – there are still large sections of the world where high-speed internet is not available, and that is the basic prerequisite for a pervasive cloud architecture. And there will always be some organizations whose data-protection needs require that their servers be kept off the net. But – for me at least – computing has gone from programming FORTRAN on a PDP-11 to checking multiple email accounts on my phone – so I do agree on the direction you see things taking.