The Dawn of SaaS: A short history of how we got to the cloud - and how it changed business forever

Team Workiro
August 21, 2024
2 min read

It all began with a big bang, or at least the planning of one. The first customer for the lumbering machines that could first be called “computers” was the US military, in the second half of World War 2 - one of their notable early applications was aiding the development of the atom bomb. The second customer, almost immediately after the war ended, was business, as the scientists and researchers saw the opportunity to parlay their expertise into devices that could compete with punch-card machines.

In the early 1950s one of those scientists, Grace Hopper - a maths PhD who would go on to retire as a rear admiral, and one of the many women who served as the first computer programmers - conceived of the foundational building block of software. Bored of repeatedly hand-configuring the machine to complete common tasks, she catalogued those tasks and assigned a symbolic code to each, which a program she called a “compiler” could then translate into instructions for the machine. Hopper cheerfully attributed this to “laziness” on her part - she wanted to “return to being a mathematician” - but the solution was quietly revolutionary, and became the foundation of computer software as we know it.
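
To make Hopper’s idea concrete, here’s a toy sketch in modern Python - emphatically not her actual A-0 compiler, and the symbolic codes below are our own invention - showing the same principle: common routines are catalogued once, given short codes, and a small program expands those codes into the underlying steps.

    # A toy illustration of symbolic codes standing in for catalogued routines.
    # A sketch of the principle only - not Hopper's A-0 system.
    SUBROUTINES = {
        "ADD": lambda stack: stack.append(stack.pop() + stack.pop()),
        "MUL": lambda stack: stack.append(stack.pop() * stack.pop()),
        "PRINT": lambda stack: print(stack[-1]),
    }

    def run(program: str) -> None:
        """Execute a program written as symbolic codes and numeric data."""
        stack = []
        for token in program.split():
            if token in SUBROUTINES:
                SUBROUTINES[token](stack)   # the code expands into its routine
            else:
                stack.append(float(token))  # anything else is treated as data

    run("2 3 ADD 4 MUL PRINT")  # (2 + 3) * 4 -> prints 20.0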

Hopper was also among the first to discover that business leaders wouldn’t learn coding. She advised those who thought otherwise to “make the first attempt to teach those symbols to vice-presidents or a colonel or admiral. I assure you that I tried it”, and instead strove to create a human-readable programming language, called FLOW-MATIC, which enabled human-ish language to direct the computing machinery underneath. This was seized upon by a consortium of industry and government seeking standards for the burgeoning computing industry, and led to COBOL - which arrived in 1959 as the first business-focused programming language, and swiftly became ubiquitous at the behest of the US Department of Defense, which required all its suppliers to use it.

This focus on making things easier, simpler and standardised has driven software development ever since, as uptake accelerated through the 1960s. As mainframe computers moved from valves and switches to transistors and printed circuit boards, and businesses became increasingly heavy users of them, the need for flexibility came to the fore. Businesses had to buy different machines for different departments - accountancy used a different system to engineering, and the two were incompatible - until IBM pioneered a modular mainframe platform that could support multiple business applications: System/360, announced in 1964 and shipping from 1965. Central systems hosted software programs which were accessed by “dumb” terminals - a model which would return.

Concurrently, as processors and memory gradually became faster, data handling changed too. In 1970 a paper by E.F. Codd, a scientist employed by IBM, proposed moving from rigidly structured databases to a new “relational” model - and that’s where the very first seeds of NetSuite were planted. IBM was slow to act on Codd’s work, but a programmer named Larry Ellison read Codd’s research and saw it was the future: in 1977 he co-founded his own company, and in 1979 it launched its own database product. In 1983 the company changed its name to Oracle (after a project Ellison had previously worked on for the CIA).
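
The relational idea itself is easy to sketch with modern tools. Here’s a minimal illustration using Python’s built-in sqlite3 module (the tables and names below are ours, purely for illustration): data lives in flat tables of rows and columns, and relationships are expressed through shared key values rather than fixed record hierarchies, so tables can be recombined on demand.

    # A minimal sketch of Codd's relational model, via Python's built-in sqlite3.
    # Table and column names are illustrative, not from any historical system.
    import sqlite3

    con = sqlite3.connect(":memory:")
    con.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT)")
    con.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL)")
    con.execute("INSERT INTO customers VALUES (1, 'Acme Ltd')")
    con.execute("INSERT INTO orders VALUES (100, 1, 250.0)")

    # A join recombines the two tables on demand through the shared key.
    query = ("SELECT c.name, o.total FROM customers c "
             "JOIN orders o ON o.customer_id = c.id")
    for name, total in con.execute(query):
        print(name, total)  # Acme Ltd 250.0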

Oracle Database was a success, as were similar business products from IBM and SAP, and businesses the world over became reliant on heavy-duty mainframe systems made by companies like IBM, Xerox and Hewlett-Packard. But the world outside the office was advancing rapidly too: the Altair 8800, widely considered the first personal computer, arrived in 1975. Computers quickly became popular beyond the machine room, both for personal use and for businesses that couldn’t or wouldn’t opt for heavy mainframes. IBM, home of such mainframes, created the PC as we know it in 1981 to meet this demand - but built it as an open platform, enabling other manufacturers to produce their own clones (nearly all running a version of the operating system created for the IBM PC by an opportunistic startup called Microsoft).

While this was happening, the foundation of what would become the internet was being laid inside US universities, and it expanded out as computing became ubiquitous. Online services became available in the mid-80s, and towards the end of 1989 Tim Berners-Lee conceived what would become the World Wide Web. In 1994 Netscape released its Navigator web browser, soon adding support for secure connections; that same year Jeff Bezos moved to Seattle to start an online retailer which he ended up calling Amazon. Web-based businesses boomed, and in 1996 PC manufacturer Compaq decided that the future lay in “cloud computing-based applications” that would enable consumers to store files online.

The roaring expansion of the early web did not last. Breakneck growth and unchecked spending turned into catastrophic falls as interest rates rose in the year 2000, and many upstart companies failed in the resulting dot-com crash. But while the bubble burst, the foundation remained, and over at Oracle Larry Ellison saw the future coming again. Oracle was flying high, but it was a slow-moving behemoth, and the world was changing faster than it could keep up. In 1998, with dot-com share prices soaring, he made a phone call to a former colleague to pitch a new sort of business service for the internet age. We’ll find out more about that conversation in our next article.

Read our previous article on our Road to SuiteWorld journey here.



Author:
Team Workiro
Follow Team Workiro for actionable work tips, how they apply to real-life scenarios, and a deeper dive into our supercharged enterprise content management system, which seamlessly integrates with NetSuite.