Dumb terminals were relatively inexpensive hardware devices connected directly to mainframe computers; through arcane commands, they would query the mainframe and display the results. The terminals had neither processing power nor data storage capabilities of their own; they simply allowed users to access the data on the mainframe computer. The mainframes were maintained by professionals in Data Processing departments (the predecessors of today’s IT departments). ISL wasn’t large enough to afford a mainframe computer of its own, so our dumb terminals were connected (via things like RS-232 and leased lines) to Florida Informanagement Services, an Orlando company that provided data processing services for smaller savings and loan institutions.
Suddenly, personal computers began appearing on people’s desks! These early machines, made by Apple, IBM, and the so-called “clone makers” (Eagle, Compaq, Franklin, et al.), differed from dumb terminals in that they had their own processing power and data storage capability and, initially, were not connected to anything at all. With these PCs, users had the entire computer at their disposal to do with as they pleased, but if they wanted to share their work with anyone, it had to be done via floppy disks or printed copies. But, for the first time, small businesses could actually afford real computers of their own.
The next step, of course, was networking. By 1988 I had left ISL and started a new small business. There were four of us at the office, each with a PC-compatible computer on his desk. One magical day a couple of guys came out to our office, physically connected the computers to each other, and installed Novell NetWare software on each of them. Our computers were now able to communicate with each other, and we could install multi-user software that would allow all of us access to the data. Small office servers could be put into place to make files and other services available to all users. Amazing stuff.
Although online services such as America Online (later known as AOL) and CompuServe had been available since the late ’80s, and I had actually been an early CompuServe user, in the early ’90s the Internet began its inexorable march toward commercial ubiquity in earnest, beginning with the emergence of email as a business communication tool, and later with the availability of the World Wide Web. Computers suddenly needed not only to be connected to each other at the office, but also to the Internet. This connection was initially achieved via dial-up modem at per-minute rates, later at flat monthly rates, and, finally, via today’s broadband technologies: DSL, cable, and fiber.
Once computers and networks were permanently connected to the Internet, what we today call “cloud computing” became possible. In fact, I was an early believer in this concept, and in 2000 I moved my business’s accounting data from our office server to an online service known then as NetLedger (later known as NetSuite), where it remained until I dissolved the company just a few months ago. To me, NetLedger was the harbinger of the future architecture of computing, and for the next few years I focused on using only technologies where the data was cloud-based and accessible from any computer, regardless of operating system. So I changed my email protocol of choice from POP to IMAP, and later just accessed it through the web itself. I moved all of my local files, including photos, to Dropbox, made extensive use of Google Docs (now Google Drive), moved my music to Amazon’s Cloud Drive, and used other services such as Evernote, Mint, and Gliffy. I stopped making presentations in PowerPoint and Keynote, and instead used the online service SlideRocket. By around 2006, basically the only software I used on any of my computers was the browser! I made a conscious effort to live in the cloud, but my children and all of their friends naturally gravitated to the same concept without even knowing it, since from the beginning of their computing experience all of their “stuff” was online, on services such as MySpace, Facebook, Twitter, and Instagram.
The way I see it, the architecture of computing has come full circle during my 28-year career. I started out with a dumb terminal connected to a far-off server. Then I got my own computer, with its own processing power and storage, but isolated from other computers. Then my computer got networked and was able to communicate with nearby computers. Finally, my computer became part of the network of networks that is the Internet, and this allowed it to access all sorts of servers all over the world; servers which, in my particular case and that of a growing number of users, store and process all of my data. So my computers, albeit exponentially more powerful than all those that came before them, are basically acting as... well, dumb terminals!
Obviously there is a huge difference between a monochrome terminal connected to a limited, single-purpose server running software that only specially trained users could operate, and computers of all shapes and sizes (desktops, laptops, smartphones, and tablets), with amazing high-resolution, full-color displays, having access to literally billions of resources that users can take full advantage of with little or no training. But in their overarching principle they are the same: devices used to connect to professionally maintained servers, where the data and processing power reside. The main difference is that, although the servers are still maintained by professionals, unlike in the terminal days of yore we users now have the power to access whichever servers we want, via web browsers or apps, and have the choice of whether to keep our data on our own devices, in the cloud, or both. As an example, I could have written this essay on my computer using any of a myriad of word-processing apps, and set up my own web server to publish it, thus using the storage and processing power of my own computer. Instead, I used various computers to write, edit, and proofread the essay on Google Docs, and then published it on Squarespace. The file never touched any of the computers I used to create it!
We’ve come back to the future, but now we have choice. The best of all worlds. Or at least that’s what we think today. I wonder what we will think in another 28 years!