Cloud computing used to be a fringe concept. Its predecessors - such as time sharing systems and thin clients - go back decades, but it wasn't until the launch of Amazon Web Services (AWS) and Microsoft Azure in the late 2000s that it really took off. It's now so thoroughly in the mainstream that a core consumer-facing Apple service - iCloud, which is backed by AWS and Azure assets - has "cloud" right in its name.
On the business side, adoption of comparable cloud solutions built on the Infrastructure-, Platform- and Software-as-a-Service models has rapidly eaten into the share of IT spend going toward traditional hardware and software:
- The 2018 Spiceworks State of IT survey found that cloud services would account for 21 percent of the typical IT budget in the coming year, rivaling the 26 percent earmarked for software and the 31 percent for hardware.
- Technologist Robert Cringely estimated that by the end of the 2010s, all forms of cloud computing combined could consume a majority of the roughly $1 trillion in annual IT-related expenditures in the U.S.
- IT research firm Gartner made similar projections. It also predicted 18 percent year-over-year expansion for the public cloud services market in 2017, with a year-end value of almost $250 billion.
If these trends hold, cloud will reach what Cringely called a tipping point. But what does that mean for IT professionals and their departments?
How the cloud computing tipping point compares to its predecessors
We can think of corporate IT as having evolved through seven key eras, much as life on Earth has changed over time. These eras are:
1. BATCH COMPUTING
From the late 19th century through the mid 20th century, what we would now call computing was performed with punch cards and, eventually, programming languages such as FORTRAN. This type of computing was noninteractive, but it was well suited to processing large amounts of data stored on cards or tape.
2. TIME-SHARING
Introduced and refined in the 1950s and 1960s, time-sharing was a sort of proto-cloud. It allowed multiple users to concurrently access a centralized CPU that had sufficient resources to serve all of them simultaneously.
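To make the idea concrete, here is a minimal, purely illustrative Python sketch (not drawn from any historical system) of round-robin time-slicing, in which one central processor alternates among several users' jobs so that each user appears to be served continuously:

```python
from collections import deque

# Each "user" submits a job made up of small units of work.
# A simple scheduler gives every job a short time slice in turn,
# so all users share one CPU and each sees steady progress.
jobs = deque([
    ("alice", 3),  # (user, remaining work units)
    ("bob", 2),
    ("carol", 4),
])

while jobs:
    user, remaining = jobs.popleft()
    print(f"CPU slice -> {user} (remaining work: {remaining})")
    remaining -= 1                      # run one time slice for this user
    if remaining > 0:
        jobs.append((user, remaining))  # requeue until the job finishes
```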
3. HOME/PERSONAL COMPUTERS
PCs were originally driven by command-line interfaces, which could execute many tasks efficiently given the right commands but were not especially user-friendly. Their evolution eventually ushered in a fourth major era, characterized by...
4. GRAPHICAL USER INTERFACES
A graphical user interface (GUI) presents an easily navigable layer on top of the operating system. When you click or tap an icon to open an app or perform an action, you're using a GUI. All major operating systems (OSes), such as Microsoft Windows and Apple macOS, have GUIs.
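As a minimal illustration of that layer (a generic sketch, not tied to any particular OS mentioned above), the following Python/Tkinter snippet wires a clickable button to an action, which is essentially what a GUI does on top of the underlying system:

```python
import tkinter as tk

def open_app():
    # Stand-in for whatever the icon or button would actually launch.
    label.config(text="App opened!")

root = tk.Tk()
root.title("Tiny GUI sketch")

# The button is the graphical layer; the function behind it is the action.
button = tk.Button(root, text="Open app", command=open_app)
button.pack(padx=20, pady=10)

label = tk.Label(root, text="Click the button")
label.pack(padx=20, pady=10)

root.mainloop()
```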
5. INTERNET PROTOCOL (IP) NETWORKS
The rise of the World Wide Web in the 1990s did for the internet what GUIs did for PC OSes: it made the technology simpler to use. The web and IP initiated a shift in power away from native, on-machine applications and toward websites.
6. MOBILE COMPUTING
Beginning with the original iPhone in 2007, PC-grade OSes were downsized to run on IP-enabled mobile hardware such as cellphones and tablets. These devices are now the primary way many people access the internet, and their raw computing power is often comparable to that of a laptop.
"Cloud has become deeply interwoven with today's OSes, applications and websites."
7. CLOUD COMPUTING
Cloud has become deeply interwoven with today's OSes, applications and websites, to the extent that many of them are essentially interfaces for accessing cloud resources behind the scenes. Many widely used apps, from Instagram to Office 365, have at some point relied on AWS or Azure.
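To illustrate how an app can be little more than an interface to cloud resources, here is a minimal sketch using AWS's boto3 Python SDK (an assumption for illustration only; it presumes boto3 is installed and AWS credentials are already configured) that lists the S3 storage buckets in an account:

```python
import boto3

# A few lines of client code are all that stand between the user and
# the storage running in AWS data centers behind the scenes.
s3 = boto3.client("s3")  # assumes credentials are configured locally

response = s3.list_buckets()
for bucket in response["Buckets"]:
    print(bucket["Name"])
```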
Cloud turned out to be a "next big thing" that actually panned out, unlike duds such as early virtual reality headsets or 1980s-era artificial intelligence. For IT professionals, cloud expertise will be a central skill in almost any context.