In the early 90s, Microsoft was riding a wave of growth that would propel it to become one of the most valuable companies of all time. At the core of this growth was the Windows operating system. To understand the core value of Windows and explain its stunning success, consider it to have a top and a bottom half. The bottom half – the Device Driver Kit (DDK) – allowed, and required, hardware vendors to expose their products to Windows, and to Windows applications, in a consistent way. The top half – the Software Development Kit (SDK) and Application Programming Interfaces (APIs), also known as Win32 – enabled Windows applications to run on a wide range of hardware without adaptation. Combined with Windows’ global language support (localization) and single worldwide binary, Windows became the glue that enabled an explosion of applications, around the globe, to run on an explosion of hardware. Both Win32 applications and hardware became cheaper as volumes grew exponentially, and Windows to this day is a very inexpensive product relative to the value it provides.
This model, while enabling the industry to grow rapidly by decreasing the cost and increasing the value of PCs and applications, was not without conflict. Hardware manufacturers frequently wanted to expose unique functionality, and Microsoft restricted this via a certification process, arguing that unique functionality would not be understood by all applications, and so would break some of them – fragmenting the platform. Over time, Microsoft asserted that a majority of Windows blue screens were actually caused by bugs in “third party” (other companies’) code, justifying even more control over it.
As a concrete example, the touchpad on PCs understands click, pinch, and drag. Despite the usability of many more gestures on MacBooks, Windows cannot easily add more, as the operating system abstracts mouse and touch input into a narrow set of gestures that applications expect.
There were several implications of these design and business decisions. Installations of Windows are measured in the billions; without this model, it’s doubtful the world would have experienced the computing revolution that ultimately touched the lives of nearly everyone on the planet. But it also slowed the rate of innovation. Innovation and backwards compatibility, as Apple has demonstrated many times by breaking the latter, are at odds. Eventually this, combined with growing hostility toward the degree of control Microsoft exerted on the industry, led to other companies winning on the internet and the phone.
Local Area Networking, or the ability for computers to exchange data over office-building distances at high speed, started to become popular in the mid-80s. As Microsoft and Novell adapted LANs to Windows, their approach was to virtualize remote systems (servers) into models that already existed in Windows, in order not to break applications. Remote servers appeared to be locally attached disk drives and printers, and for the first time could be used by multiple computers at the same time (sharing).
At the same time, the Unix approach to networking supported virtualization, but also supported simple application-to-application connections made via the sockets API. Unix adopted the TCP/IP protocol, which worked on local area networks but also on networks that spanned the globe. The ability for an application to connect to another application without the operating system understanding and supporting the application produced a wave of connected-application innovation that continues to this day.
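The application-to-application model the sockets API enabled can still be sketched in a few lines today. The following is a toy Python loopback echo, not the original C sockets or Winsock API: one process listens, another connects, and the operating system just moves bytes without understanding what the two applications are saying to each other.

```python
import socket
import threading

def serve_once(listener: socket.socket) -> None:
    # Accept one connection and echo back whatever the client sends.
    conn, _addr = listener.accept()
    with conn:
        data = conn.recv(1024)
        conn.sendall(b"echo: " + data)  # the "protocol" is purely application-defined

# Listen on the loopback interface, letting the OS pick a free port.
listener = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
listener.bind(("127.0.0.1", 0))
listener.listen(1)
port = listener.getsockname()[1]

t = threading.Thread(target=serve_once, args=(listener,))
t.start()

# The client connects by address alone; the OS needs no knowledge
# of the application on either end of the connection.
with socket.create_connection(("127.0.0.1", port)) as client:
    client.sendall(b"hello")
    reply = client.recv(1024)

t.join()
listener.close()
print(reply.decode())  # echo: hello
```

The key point is in the middle: neither `bind` nor `connect` cares what the applications do with the bytes, which is exactly what let new kinds of connected applications appear without operating system support.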
Windows 95 was not the first version of Windows to support the sockets API, but it was the first commercially successful one. It is unlikely that the internet as it exists today would have grown as fast as it did without the Winsock API in Windows 95. It is unclear if Microsoft really understood the implications of this decision at the time – although in reality TCP/IP and sockets would likely have eventually been adopted globally, regardless of what Microsoft did. This was the beginning of a series of innovations that would break the grip that Windows had on the application ecosystem.
The most disruptive of the TCP/IP and sockets based applications was the browser and web server, which started to catch on in the mid-90s. The very first browsers were not much more than document directories and viewers, but they laid some important groundwork. Much of what we would come to think of as “the application” in a browser environment was downloaded immediately, from the remote server, upon loading a page. From the customer’s perspective, this was a zero-install application, and from the publisher’s perspective, the code of the application could be updated, across the globe, in a single action. This value proposition remains to this day, and separates browser applications from mobile, Windows and MacOS applications.
Two other browser features were transformative. The first was a global “application directory”: the Uniform Resource Locator (URL). The URL was a readable identifier for a site/application that had one part (www.zintel.net) that was centrally administered, and another part (http://www.zintel.net/Public/Underwater/Underwater.html) completely under the control of an organization. Unlike existing application directories, this enabled massive growth without collisions, or any central clearing house.
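The two-part split is easy to see programmatically. A quick Python sketch, using the example URL above: the hostname is the DNS-administered part, and everything after it is whatever the site owner wants it to be.

```python
from urllib.parse import urlsplit

# The example URL from the text: one centrally administered part
# (the host name, allocated via DNS) and one part entirely under
# the site owner's control (the path).
url = "http://www.zintel.net/Public/Underwater/Underwater.html"
parts = urlsplit(url)

print(parts.hostname)  # www.zintel.net
print(parts.path)      # /Public/Underwater/Underwater.html
```

Because only the hostname needs global coordination, any organization can mint unlimited non-colliding identifiers under its own name with no clearing house at all.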
Finally, hyperlinks enabled a new kind of application that referenced parts of other applications, without coordination (or even agreement, as current events in Australia are highlighting). A web of applications. The page/click/link model, which many people assumed to be an overly simple first version, has proven to be useful to this very day.
Microsoft had several responses to the explosion of growth on the internet. The company initially cancelled its competing investment, based on the doomed X.25 standard, and invested in a browser called Internet Explorer. Later, the company would “internet-enable” Windows and Office in a way that would come to be known as software+services.
But at that point Microsoft had grown accustomed to defining what would be successful, and what wouldn’t, simply as a side effect of the volumes of Windows being shipped. It is very difficult to displace a product once it has reached global scale. The company took two swings at making the browser standards effectively Windows standards, in a strategy known at the time as embrace and extend.
The first was ActiveX, introduced in 1996. This was a repurposing of a technology called COM/OLE II, which existed on the Windows desktop, and to some extent in Win32 applications, into Internet Explorer. This allowed browser applications to access functionality in the underlying Windows operating system. There is no doubt that this created customer value, as the subsequent popularity of Shockwave Flash for games and video demonstrated, but it had the perhaps not entirely coincidental side effect of making web pages that ran ActiveX incompatible with all other operating systems, and it put Microsoft in control of a piece of Internet technology.
ActiveX was eventually discontinued, primarily because of real and perceived security flaws in ActiveX itself and, even more so, in the most popular ActiveX plug-in, Flash.
By the late 90s, Microsoft’s control of the computing platform – not unlike Google’s control of web search, and Apple’s control of applications today – was starting to be understood externally from a technical/strategy perspective, and actively opposed. In 1998 the DOJ under Clinton filed an antitrust suit against Microsoft, egged on by Sun Microsystems, Netscape and other competitors. The actual premise of the suit – that Microsoft held an unfair advantage because it could include its own browser with Windows, and that customers wanted choice in operating systems on PCs – was quite weak. Unlike Apple today, which simply decides which application vendors will be allowed to compete and which will be driven to bankruptcy, it was never difficult to load an application onto a Windows PC. And using Internet distribution, achieving scale and dominance was quite possible, as Google’s Chrome has demonstrated. It turns out that customers did want choice in operating systems, in the form of iPhone, iPad, Android, MacOS and Linux, but they didn’t want an alternative to Windows that would simply not run Windows apps. The market took care of the problem the DOJ thought it was solving.
The suit did distract Microsoft for many years, and slowed innovation at the company. Likely this was the goal all along, at least in the minds of competitors informing the government.
I’ll only briefly mention Microsoft’s second attempt to control browser standards, Windows Presentation Foundation. Had this succeeded, it would have forced every web application developer to choose (sorry, enabled them with the choice) between standards that ran across all browsers and a Microsoft standard that ran only on Windows. It was arguably a better technology than HTML at the time, but it failed.
Somewhat ironically, the one important standard that is still running the web today, JavaScript, was fully supported by Microsoft browsers from the beginning, and Microsoft only recently gave up on its own browser engine and adopted Google’s Chromium. “Control” of the browser was a losing strategy from day one.
The first “cloud” companies were web hosts, like GoDaddy is today. They ran Microsoft’s Internet Information Server (IIS) on Windows Server, or open-source Apache on Linux, in large datacenters, and leased “instances” to customers. It is quite difficult to manage 100 servers exposed to the internet in a datacenter, but not incrementally more difficult to manage 1,000, or 10,000. Customers, who initially thought of themselves more as publishers than as application developers, found hosted solutions compelling. This remains true today.
Over time, as web and browser technology became more powerful, more and more applications moved to the web, primarily because of no-touch install, global single-gesture updates, telemetry, and operating system independence.
In the mid-90s, the important large-scale off-the-shelf business applications were from SAP, Siebel and Oracle. Most of these systems were installed and run on customers’ hardware, and were complex and expensive to operate and upgrade. In 1999, Salesforce was founded on the idea of delivering enterprise-grade applications entirely as a browser application plus a “backend” component hosted by Salesforce itself. The model was referred to at the time as an [internet] service, or Software as a Service (SaaS), and proved to be highly disruptive. It was much easier for customers to try before buying, to deploy, and to manage instances, and for Salesforce to update and manage the backend. Over time, Salesforce enabled third parties to extend its application, even running in its datacenter. Applications tend to command more margin (https://mikezintel.wordpress.com/2019/07/17/cloud-economics) than raw infrastructure, and applications with a 3rd-party ecosystem even more so. Salesforce continues today as a cloud provider of enterprise applications, with sales of $14B in 2019.
Amazon launched Amazon Web Services (AWS), specifically the Elastic Compute Cloud (EC2) and Simple Storage Service (S3), in 2006. Unlike (cloud) web hosts, and (cloud) hosted applications, AWS offered customers instances of entire (virtualized) computers to do with as they wished. At the end of 2020, AWS reported $13B in profit on $45B in annual revenue. This model eventually became known as Infrastructure as a Service (IaaS).
As all of this was happening, a new model of application emerged, enabled by large numbers of inexpensive servers hosted in the cloud. These applications ran on many computers, often tens of thousands, and now hundreds of thousands or more, in order to perform jobs like internet search. Google took the lead on search, with profit of $34B on revenue of $160B in 2019. Unlike existing licensing models, Google’s, and later Facebook’s, revenue source was selling advertisements.
Consumer distributed systems evolved through many models – messaging, cloud drives, photo browsing, blogging, file sharing – but Facebook emerged as the winner. Initially Facebook solved the “permissions problem” (how do I control who can see my stuff?) with a simple gesture – become my online “friend”. Unlike MySpace, it was closed by default (the world can’t see my stuff), and it had an innovative way to keep drawing you back into the site via customized “feeds”. As the application evolved, Facebook integrated instant messaging, persistent threaded messaging hung off individual topics, persistent email-style messaging, photo and video storage and promotion, and web links into a highly immersive experience. Facebook’s profit in 2020 was $29B on revenue of $86B.
Both Google and Facebook pushed the envelope of assumed privacy by actively reading customer data and aggregating it to sell to advertisers. Microsoft was an early developer of consumer distributed systems, with SkyDrive, Messenger, and Hotmail, and at the peak had very impressive adoption numbers. The company was not comfortable changing the privacy model customers had assumed with the personal computer, which likely led to poor revenue in this space, but it increased enterprise customers’ trust in the company.
Microsoft launched an AWS competitor, Azure, in 2010. Microsoft does not disclose Azure revenue.
Microsoft and Apple also pursued the model of software+(cloud) services. The iTunes, iCloud, Office 365 and Windows experiences are now backed by often very large cloud services that provide storage, sharing, email hosting, music fingerprinting, messaging, and many other capabilities.
Despite Apple proving with the iPhone that ordinary web browsing was possible on a mobile phone, and despite the fact that plenty of it is done today, mobile applications+services are the dominant application model on phones, as apps can break out of the page-and-click model, work offline, and leverage the unique input models of the phone.
Today, nearly all new systems are browser applications against a cloud backend, plus mobile apps against possibly the same cloud backend. Windows and Office are not new systems, but service-enabling them has allowed them to provide new value in a cloud world. SAP and other enterprise application vendors are working to move their backend systems from customer-installed to vendor-hosted, and all have provided browser applications even against their existing pre-cloud backends.
So what is the cloud? And who is winning, in the cloud?
Microsoft, Oracle and IBM seem to want to be cloud vendors, in part chasing new growth streams, but also to be, and be seen as, current and innovative. I’m sure someone knows what IBM and Oracle are doing in the cloud, but I don’t. Microsoft has both an IaaS offering (Azure) and a software+service offering with Office 365. It’s hard to gauge, or even define, how much of their 2020 profit of $44B on revenue of $143B is “cloud” revenue, as their “Intelligent Cloud” segment includes “Server products and cloud services, including Microsoft SQL Server, Windows Server, Visual Studio, System Center, and related CALs, Microsoft Azure, and GitHub”, according to their financial statements. Sometimes they use the term “Commercial Cloud” in the press (but not in financial statements), and it seems to include Office 365, the largest value proposition of which is the Win32 Office applications themselves.
Salesforce is a cloud vendor.
Other than Android, all of Google’s systems run in the cloud. And they have an IaaS offering, which they refer to as Google Cloud. They do not generally talk about the bulk of their revenue and income, which comes from ads, as cloud revenue.
Other than a few mobile apps, all of Facebook’s systems run in the cloud. They do not generally talk about being a cloud vendor.
Netflix is a large software+services company that does not refer to itself as a cloud vendor.
Cloud doesn’t really mean anything at all.
Mike.