The Processing Pendulum | From Servers to PCs to Cloud & Beyond
For this month’s blog post I wanted to talk about the transitions that have taken place over the years in where we do our processing. In my years working in technology, I have noticed a cyclical trend in where our processing power comes from. Like a pendulum swinging back and forth, we shift between doing all our processing on a central server and doing it on personal devices.
The Big Server
Back when I first started working in technology, just shortly after having to walk 10 miles in the snow to get to school, all your processing was done on a big server, typically local to your location. These were the days of mainframes, mini mainframes, and minicomputers. Connected to these large and powerful (for the time) servers were ‘dumb’ terminals, or green screens. These terminals were not much more than a keyboard connected to a display, with very little processing power of their own.
I worked in a data center that was mostly HP-centric during this time. We had an HP3000 980-100 server running a software application for mail order and catalog businesses. Yes, before eCommerce was a thing, we had to review catalogs and place orders over the phone for stuff. All the phone operators taking orders used terminals to enter the orders and take payment. All the processing for all the users of this system was done on the server; the terminals only displayed the rudimentary user interface and the results of the processing. All the users in the company used terminals.
Accounting, shipping, inventory, and HR all used them to update information held on the server. No data was stored locally, and all the processing was done on this one big server as well. Large batch jobs, typically accounting tasks, would run on the server, sometimes taking hours to complete, and output the results as large printed reports.
These servers had their limitations. They could only process so much data in a given time and could only support a finite number of concurrent users. During peak periods of activity, like Black Friday, the servers would slow and response time on the terminals would increase. This limited the company’s ability to take orders quickly.
Over time the servers became more powerful and faster. HP released servers with multiple processors, like the 3000 980-200, which the company I worked for later acquired. The processors got faster. Memory got faster and less expensive. Connections between peripherals got faster (like optically connected disc drives). Everything on the server side was scaled to meet the demand for processing power to support larger numbers of users and process larger datasets.
In an effort to scale servers beyond what a single server could support, companies developed ways to have multiple servers supporting the same user base. The ability to mirror data between two servers and split the user base allowed companies to scale to multiple servers instead of waiting for processing power to increase on a single server.
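The idea of mirroring data and splitting a user base across two servers can be sketched in a few lines. This is a hypothetical modern illustration, not how the HP-era systems actually did it: each user is assigned to one of a mirrored pair of servers by a stable hash of their ID, so the load divides roughly in half while both servers hold the same data.

```python
# Minimal sketch of splitting a user base across a mirrored pair of
# servers. Server names and the routing scheme are illustrative only.
import hashlib

SERVERS = ["server-a", "server-b"]  # hypothetical mirrored pair


def route(user_id: str) -> str:
    """Pick a server for a user via a stable hash of their ID.

    The same user always lands on the same server, and users spread
    roughly evenly across the pair.
    """
    digest = hashlib.sha256(user_id.encode("utf-8")).digest()
    return SERVERS[digest[0] % len(SERVERS)]
```

Because the hash is deterministic, a given operator's terminal session always talks to the same server, while the overall user base is shared between the two.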
The PC And Workstation – The Pendulum Swings
When I first started working in technology, it was all about the big, all-powerful server. Then came the PC and workstation. With the introduction of the PC and the production of smaller, more powerful workstations, such as those from Sun Microsystems, the pendulum of processing started to swing away from the big server and toward more distributed computing. It started with terminal emulation, where a PC could be used to emulate a terminal connected to a large server. However, the PC could also be used for other tasks where the processing and storage were done locally.
The problem was that servers and connectivity speeds became a bottleneck to performance. Servers had their limitations. To scale beyond what a server could do, companies started to look to offload work from the server and push it out to these new smart desktops.
Over time, the desktops became more and more powerful. Moore’s Law was in full effect. PCs and workstations had more processing power than some of the early computers that required an entire room to house them. Simple tasks like word processing (yes, this used to require a server) could now be offloaded to the desktop. Spreadsheet applications like MS Excel and Lotus 1-2-3 (remember that?) could be used for many of the financial reporting tasks that used to require hours to run on the server and even more time to print.
During this time desktop applications progressed rapidly, offering more functionality with each release. New applications enabled functions that were typically the domain of the big server. Applications like Crystal Reports could be used on these new, powerful, smart desktops to let users create their own reports. Users no longer needed to rely on an application developer to write a new program for each report. Applications enabled users to take the output from the server, like those big accounting reports, and view it on their desktop rather than printing it. Users could even slice the data in these reports further using the processing power of their desktop.
All of these developments moved processing tasks from the big server to the user’s personal device. Offloading the server and pushing tasks out toward distributed computing was a big trend for many years. During this time the server was still the place to store large datasets, so the speed of the connection between the server and the desktop became a critical bottleneck. Desktops also ran multiple operating systems, which meant software developers often had to write applications to support more than one OS.
The Cloud – The Pendulum Swings Again
Then came the cloud.
The big servers from the likes of IBM, HP, and Sun slowly faded. Or they mutated into a bunch of smaller servers clustered together to provide even more processing power. Instead of having one big server with a few processors, data centers started to use many smaller servers with fewer processors each, utilizing software that enabled these servers to act as one. The cloud became the new big server, with cloud service providers utilizing large numbers of servers and storage devices to provide virtually unlimited processing power and storage space.
Now, cloud-based applications that use the desktop’s browser to provide a user interface are all the rage. We no longer require a powerful desktop to run applications to process our data. A Chromebook with a web browser is all that is required. Applications like Google Docs, Sheets, and Slides are offloading processing that used to be done on the desktop back to the big server (the cloud).
I realize that maybe not all the processing for these apps is done in the cloud, but you get my point: we are starting to move back in that direction. It is interesting that our new default is not to build an application that runs on the desktop. Almost everything we develop these days is a web application that only requires a browser. The big benefit is that it does not really matter what OS the desktop is running. Web applications run through the browser, so we don’t need to write separate programs for each OS.
Will the Pendulum Swing Again?
It seems that the progress of the desktop has stagnated somewhat. Not so long ago you had to get a new desktop every couple of years to keep up with the latest hardware benefits. PCs got faster by leaps and bounds every year. If your PC was a couple of years old, it may as well have been a dinosaur. I am writing this post on a PC that was manufactured in 2014, and it is totally sufficient for all the applications I run. I would not expect to have to upgrade my PC for something more powerful any time soon. I am using MS Word, which is installed on my PC, to write this, but our company CRM, Development Management, Accounting, and HR applications are all cloud-based web applications with nothing installed locally.
The mobile device space seems to be moving at light speed, however. Processing power on phones and tablets is progressing very rapidly, like the PC days of yore. Faster phones and tablets are released every year, and we are starting to see tablets replace the traditional desktop or laptop. Most of these devices use applications that are installed locally and use the local processor to do much of the work. However, even in the mobile space, web applications are replacing locally installed apps. Progressive web applications even enable developers to avoid writing a separate application for each mobile OS.
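To give a feel for how a progressive web application sidesteps per-OS development: the app declares itself installable through a small manifest file, and any modern browser on any mobile OS can read it. The names and values below are purely illustrative, not from a real product.

```json
{
  "name": "Example Order Entry",
  "short_name": "Orders",
  "start_url": "/",
  "display": "standalone",
  "background_color": "#ffffff",
  "theme_color": "#0066cc",
  "icons": [
    { "src": "/icons/app-192.png", "sizes": "192x192", "type": "image/png" }
  ]
}
```

One manifest, served alongside the web application itself, replaces separate native builds for each mobile platform.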
It almost seems like processing is currently split between the big server and the personal device. Maybe the pendulum is finding its center. They tend to stop swinging after a while, don’t they?