What’s a “Cloud”?

[deprecated]

This section is another time capsule: largely outdated (these days I work fully on the consumer side), but still a nice bit of nostalgia for no one to read!

[/deprecated]

People ask me this question a lot lately. That’s what happens when marketing is a bit too effective. The term has even bled into popular culture, so most folks figure “well, this seems important,” but there is often no wood behind the messaging arrow. Quite the opposite: these days it is frequently co-opted as a “cool” catch-all phrase. Have a lukewarm product line or service offering? Hey, rebrand it “cloud”! Cheap and easy, right?

It’s not only in the consumer space that there is confusion, however. Even IT professionals (and the vendors who serve them) struggle both with the real definition of this nebulous thing and with what its implications really are. Views range from pure cynicism (“it’s all BS marketing!!!”) to hyper-optimism (“this is the only way computing will be done”). For the record, the latter camp has the better point, but let’s not skip ahead.

So what is “cloud”? Well, a true understanding requires a (brief) history lesson. Way back when, in a time when old guys were young, a time called the 70s, “computers” meant big giant beasts called mainframes, which were fabulously expensive and occupied massive amounts of room. People didn’t own computers, they used them. Literally. IBM had a leasing model for the machines, and only huge corporations, universities, or governments could afford them. Regular folks, be they travel agents, scientists, stock brokers, or even students, bought processing time and storage on these machines by the hour. The model was called “time sharing” (IBM’s implementation was the Time Sharing Option, or TSO) and it was good (ish).

As the years rolled by, technology companies labored continually to shrink the technology, and an emerging generation of young upstart tech radicals dared to envision a day when computing was democratized (literally: this was a very deep and complex movement in those days). The massive potential represented by computing belonged in the hands of all people, not just wealthy institutions that could afford the cost of entry. These folks turned their passion into action, and in places like the legendary “Homebrew Computer Club” of Silicon Valley, the “kit” computer phenomenon was born. It’s important not to take this one lightly. This really was a true revolution, and it changed the course of human history. Exaggeration? Homebrew club members included Steve Jobs and Steve Wozniak in their eager, young, and visionary forms (and a young Bill Gates famously aimed his “Open Letter to Hobbyists” squarely at this crowd).

As amazing and exciting as this movement was, it wasn’t accessible. These folks were far beyond the current “system builder” skill set. We’re not talking about screwing a motherboard into a case like a Lego set. Anyone who has seen an original Altair kit understands exactly what I’m saying. This was an engineers-only club. The price of membership included the ability to read a schematic, operate a soldering iron, and understand electronic circuit design. While computing was certainly getting personal, this really wasn’t “personal computing,” any more than the libertine intellectual discussion groups of 19th century Europe were truly having a measurable impact on the plight of the common worker. A far more radical change was going to be needed for this “revolution” to become THE Revolution. Enter Apple.

While there is no doubt that there were earlier machines, like Olivetti’s Programma 101, and that Commodore with the PET was innovating right at the same time, Apple came out of the gate with what would ultimately remain their approach right up through the present day. They weren’t the first, but they got it right. With the introduction of the Apple ][, personal computing exploded. By the early 90s personal computers were everywhere and business was being run on them. The mainframe was heading towards retirement and entirely new possibilities were being discovered. Behind all of this, unbeknownst to most people, the true “killer app” of personal computing was emerging from its awkward adolescence into its world-conquering adulthood. The true future of personal computing would be the Internet.

Machines were connected to each other first privately inside universities and offices (much like the mainframe), and later between organizations through private, and expensive, high-speed connections. “Networking” continued to grow and proliferate through the early 90s while the granddaddy of all networks, the Internet, continued to bake. In 1989 Tim Berners-Lee proposed the World Wide Web, inventing HTML and HTTP in the process. By 1992 the Web had been born from his efforts. By 2000 it ruled the world. Consider this: in 25 years computing had gone from something completely abstract, interactions with a giant hidden machine performed by experts on monochrome terminals, to something universal. Tens of millions of people across the world connected to each other through massively powerful (relatively) and inexpensive (relatively) machines that could fit in a backpack. So who cares, and what the heck does this (overly long) history lesson have to do with “clouds”? Just a bit more patience, we’re almost there!

The late 90s were heady times for technology. The fabled “dot com boom” was at its peak. Amazon sprang from the mind of Jeff Bezos during this period, eBay from the mind of Pierre Omidyar, and in Mountain View two scrappy guys by the name of Larry Page and Sergey Brin were creating Google. For all of the negative press about the “dot com bust” that ultimately came about because of irrational investing, there is no doubt that this was the second golden age; Homebrew 2.0. While it’s certainly true that over-exuberance (and greed) in this period led to some really irrational investing, the creation of a bubble, and an equity-destroying collapse, it is not the case that every idea which went bust during this time was a bad one. If you sort through the rubble of the “dot com bust,” you’ll find two very interesting things. The first is the birth of the term “Application Service Provider” and the second, a bit more abstract but much more significant, is the basic concept of “software as a service.” The two concepts are very much linked, and in the two of them we find what is really the essence of “cloud,” so let’s take a deeper look.

“Software as a service” sort of begs the question: what is software? Most people today probably understand that hardware is the computer itself and software is the programs you run on the computer. But is that BASIC program you typed in “software”? How about your Microsoft Office DVD? Or that macro you use to calculate mortgages? The answer is yes. To all of the above. Super Mario Bros 2 on the Nintendo? Yes. What about Angry Birds or any other iPhone “app”? Absolutely. OK, OK, but what about Amazon? That can’t be “software”! Ah, now we’re getting somewhere.

Any program running on a computer is “software,” but how you access those programs is a different matter entirely. For a very long time “software” was something most people didn’t directly see. Going back to that mainframe world, you interacted with the “software” while you were sitting at a terminal performing a task. Then the PC came along and “software” became a tangible thing. You bought it, held it, copied it (illegally), and traded it. You “installed” it when you wanted to use it and “uninstalled” it when you were done. And an increasing number of regular folks even created it (every early personal computer came with a BASIC interpreter, and every modern one has a host of massively powerful programming languages built right in).

In the early 90s Scott McNealy, then head of Sun Microsystems, made some (at the time) insane predictions: that “the network is the computer” and that applications would again be ephemeral things not really tied to any single computer. People laughed and snark ensued. It took a while, but he was vindicated. McNealy was absolutely correct, and it was in those late days of the “dot com boom” that we saw the beginning of it. Amazon absolutely is an application. It’s an application that allows you to buy and sell things. It also happens to run in a web browser. As a matter of fact, increasingly the web browser is the way people interact with technology. The internet, or more specifically the web, is the killer app of personal computing. The network is the computer. No one expresses this more directly than Google with ChromeOS. The entire OS is a browser.
Really, the modern browser is an OS, capable of being a full application platform, and the Application Service Provider model of the late dot com boom days was the first glimpse of this. It was the first instance of classic application designs meeting the web, creating the concept of “software” (classic applications like a word processor) being delivered “as a service” (the way a modern “web app” is) by a service provider whose business it is to deliver those experiences. Early attempts at this “ASP model” included packaging up popular software like Microsoft Office into a web-browser-consumable form. It was too much, too soon, and the industry wasn’t ready for it. That said, the model spurred innovation that many didn’t see. Microsoft slowly started to turn over rocks in the space. Hosted Exchange was born guerrilla-style in the field, then formally adopted by the product team, then evolved into a true service and today… “Cloud.” At the same time, “Software as a Service” businesses were being born on the web. Salesforce.com is the first massive success story of a deeply legacy enterprise application (CRM: Customer Relationship Management software used by sales teams to keep track of contacts, opportunities, and sales performance) delivered in a purely “web 2.0” model. A service that was as easy to use as Amazon or eBay, but delivering really traditional enterprise app capabilities.
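To make the “software as a service” idea concrete, here’s a minimal sketch of a classic application capability, that mortgage calculator from a few paragraphs back, delivered not as a program you install but as a service you reach through a browser. It’s written in Python using only the standard library; the port, parameter names, and defaults are purely illustrative, not any real vendor’s product:

    import json
    from http.server import BaseHTTPRequestHandler, HTTPServer
    from urllib.parse import urlparse, parse_qs

    class MortgageService(BaseHTTPRequestHandler):
        # The "application" lives entirely on the server; the user
        # installs nothing and interacts through any web browser.
        def do_GET(self):
            query = parse_qs(urlparse(self.path).query)
            principal = float(query.get("principal", ["250000"])[0])
            annual_rate = float(query.get("rate", ["0.05"])[0])
            years = int(query.get("years", ["30"])[0])

            # Standard fixed-rate monthly payment formula.
            r = annual_rate / 12
            n = years * 12
            payment = principal * r / (1 - (1 + r) ** -n)

            body = json.dumps({"monthly_payment": round(payment, 2)}).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.end_headers()
            self.wfile.write(body)

    # Point a browser at http://localhost:8000/?principal=300000&rate=0.045&years=30
    HTTPServer(("localhost", 8000), MortgageService).serve_forever()

Change the code on the server and every user is instantly on the new version, with nothing to buy, hold, install, or uninstall. That single property is the heart of what made the ASP and SaaS models so disruptive.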

We’ve traveled a long and winding road to get to this point, but it is now time to answer the initial question: what is “cloud”? Put in the simplest terms, “cloud” is computing on the internet. For consumers, that’s really the right definition. If your experience is either tied to, or enhanced by, an internet-delivered service, that’s cloud. Facebook, GMail, Instagram, Spotify, DropBox… these are all examples of “cloud”: computing capabilities delivered to you by a provider through the internet. Whether that means a “hard drive on the web” (DropBox), or “an infinite jukebox of music” (Spotify), or something as fundamental as basic email but completely operated by a provider and accessed through a browser, it all counts as “cloud.”

For consumers, it doesn’t need to get any more complicated than this. The “cloud” label being glued onto a product or service basically means that some internet dependency will be introduced and, in exchange for this tethering to a network connection, the experience should in some way be enhanced. A great example would be the various hard drive manufacturers’ recently rebranded “cloud storage” offerings. These are the regular old external hard drives we’ve used for ages, but enhanced with a user interface that lets you access and configure the device in a web browser, and extended out to an internet-hosted service in some meaningful way (either for backing the device up, or for accessing it while on the road).

So this makes sense for consumers; it’s really a rebranding and product evolution exercise. But what about for IT professionals, and for the developers and architects who are building these services? What are the true implications of this technology shift? Not surprisingly, the answer here is much more complex. Let’s take a look.
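Before we do, one last concrete illustration of the consumer case. Strip away the branding and a “hard drive on the web” is just your files sitting behind an internet-reachable URL. Here’s a minimal sketch in Python (standard library only); the endpoint storage.example.com and the file path are hypothetical, not any real provider’s API:

    import urllib.request

    # Hypothetical internet-hosted "drive". Real products (DropBox, a NAS
    # vendor's service) each have their own APIs, but the shape is the same.
    ENDPOINT = "https://storage.example.com/files/vacation-photos.zip"

    def backup(local_path):
        # Push a local file up to the hosted service.
        with open(local_path, "rb") as f:
            req = urllib.request.Request(ENDPOINT, data=f.read(), method="PUT")
            urllib.request.urlopen(req)

    def restore(local_path):
        # Pull the same file back down from anywhere with a connection.
        with urllib.request.urlopen(ENDPOINT) as resp, open(local_path, "wb") as f:
            f.write(resp.read())

Same file, reachable from the den or from the road, with the provider worrying about the disks. That’s the whole consumer “cloud” promise in a dozen lines.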
