In the last blog post we touched on the impact that a simple little thing like a common application interface, the browser, had on software development, and how it was a fundamental step in enabling Software as a Service (SaaS). This week we’ll look at how the internet was the foundational step. In writing this post, I thank Wikipedia for providing the background on the ARPANET, the predecessor to the internet.
Up until the 1960s, telecommunications essentially consisted of linking sections of lines to create a dedicated communications channel. While this helped ensure fidelity once connected, it created several problems:
- As the number of computers increases, the number of connections grows quadratically
- Connected computers need to have many expensive communications channels
- There is a lot of expensive bandwidth sitting unused
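The first drawback is easy to make concrete: fully connecting n sites with dedicated point-to-point lines takes n(n-1)/2 links. A minimal sketch of that arithmetic (the site counts are arbitrary examples):

```python
def dedicated_links(n: int) -> int:
    """Number of point-to-point lines needed to fully connect n sites."""
    return n * (n - 1) // 2

for n in (5, 10, 50, 100):
    print(n, "sites:", dedicated_links(n), "links")
# 5 sites need 10 links; 100 sites would need 4,950.
```

Doubling the number of sites roughly quadruples the number of dedicated lines, which is what made the circuit-switched model so expensive to scale.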
Some visionaries had the idea of breaking data into small packets and putting routing information on each packet. This would allow a lot more information to be moved across the network, as that unused bandwidth could be filled with packets. Companies would only need a connection to the network rather than buying or leasing dedicated communications “pipes” between their offices or to partner firms. Packets could be routed across any number of routes between companies rather than having to follow one pre-defined path. So the internet is really a flexible highway plus a common set of definitions for how systems will connect with, and communicate on, those highways.
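The packet idea can be sketched in a few lines. This is a toy illustration, not any real protocol: the field names, the 4-byte packet size, and the office addresses are all made up for the example.

```python
import random

def packetize(message: bytes, src: str, dst: str, size: int = 4) -> list[dict]:
    """Split a message into fixed-size chunks, each tagged with routing
    information (source, destination, sequence number)."""
    return [
        {"src": src, "dst": dst, "seq": i, "payload": message[i * size:(i + 1) * size]}
        for i in range((len(message) + size - 1) // size)
    ]

def reassemble(packets: list[dict]) -> bytes:
    """Packets may travel different routes and arrive out of order;
    the sequence numbers let the receiver reorder them."""
    return b"".join(p["payload"] for p in sorted(packets, key=lambda p: p["seq"]))

pkts = packetize(b"HELLO, WORLD", "office-a", "office-b")
random.shuffle(pkts)  # simulate packets taking different routes
assert reassemble(pkts) == b"HELLO, WORLD"
```

Because each packet carries its own addressing, the network can route around failures and interleave traffic from many companies on the same lines, which is exactly what fills the idle bandwidth of the circuit-switched world.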
The internet and ubiquitous networking as we know it today didn’t start commercially until the early 1980s, as companies were still mainframe centric. Application programs were still very much operating system restricted, and the idea of “cross-platform” programs didn’t start taking off until the advent of the relational database (more on this in a future post) and the spread of engineering and departmental systems running different versions of UNIX. Updates between programs were still primarily nightly batch updates. The pace of change was comparatively slow, as was the delivery of software.
When packaged application companies started to grow, after the introduction of the IBM compatible PC and distributed UNIX machines, software distribution was still primarily through tapes for mainframes and disks or diskettes for everything else. The CD allowed significantly more information to be easily packaged on a portable medium, and also allowed companies to more easily install and re-install software in their enterprises. Vendors had not really thought about making copies of software available for download yet; downloads would require access to a widely available communications framework like the internet, which most customers did not yet have.
As the internet was finally coming into its own in the 90s, due, I believe, to the ongoing decrease in transmission costs, the increase in bandwidth, and ongoing pressure for connectivity between companies and their supply chains, the economics of software distribution shifted. Companies found shipping CDs more cumbersome than allowing downloads from their website (it’s a lot easier to ensure customers have the latest version if it’s available on your website than to ask them to read the release level on a CD).
In parallel with the shift to downloading software there was a shift to transactional connectivity between supplier systems. Having systems update each other in real time rather than through nightly or weekly batch updates changes how quickly the business can react, as well as users’ expectations of information accuracy. Users continued to raise their expectations of how they could access information within their own companies as updated by supplier firms, as well as how they could access systems at other companies. The idea of interconnected systems also reduces resistance to the idea of ubiquitous computing (SaaS): why do I as a user only have access to applications in my enterprise?
The internet is not just a common information movement platform for companies. It is also a common set of standards that make it easier to establish connectivity, security, and the data elements shared between systems. Taken together, these have kept driving down the costs of specialized skills, time to implement connectivity, equipment, and bandwidth.
If you’re a start-up, lowering the costs of infrastructure lowers your costs of bringing a solution to market. The internet became one of those business and technical model differentiators for a new class of company and application.
So now you had a common interface to lower the cost of application development and make it easier for users to access information. You had a communications layer that made it easier for your end customer to access your application. The only major technology hurdle left was how to securely store data from multiple users in one location. It turned out that problem had recently been solved by the relational database.
Next post we’ll share some perspective on the state of Software Asset Management, from the IBSMA SAM Summit in Chicago, which we had the fortune to attend this past week. In future posts, we’ll look at a couple of other critical factors that made SaaS come to pass: the relational database, a frustrated sales customer base, built in speed of change associated with packaging applications, and the visionary genius that saw how these elements could be combined for a new type of business model.