Friday, September 18, 2009

Multitasking and time slicing

The evolution of computer hardware and software over the last five decades of my life is reflected in a set of personal stories I carry around with me, stories that have shaped my life and career. As I write this passage and edit its content, I have to restrain my desire to expand into related topics, because the market's pendulum swings have such broad implications. On one side of the pendulum is the unending drive for cost reduction, capacity and speed. On the other is the continual push for convenience, application and innovation. This is not a history lesson, though. Diving into the past as a virtual archeologist gives me the opportunity to reveal lessons I did not notice at the time. Time flew by so fast as I worked hard over those decades, often without stopping to think about or recognize how important certain events and their ripples would become in retrospect. So, I hope to give them some attention now.

In the early 60's, computer makers strove to overcome many challenges. The largest and most pressing, to me, was removing the capacity limitations of early computer designs, in which all of a machine's memory and execution resources were devoted to a single use. This began the pendulum swing we still see today. At first, computers were large sorting and counting machines, programmed to do simple tasks over and over again, and they were faster than humans at calculation and sorting. As new applications were dreamed up, the hardware and software were pushed to new limits. Like many innovative new industries and products, the computer industry never contemplated that the applications and uses of computing would expand beyond a large appliance for big government and industry.

The first-generation systems evolved from scientific and experimental computers like the ENIAC and the Atanasoff–Berry Computer (ABC). They had limited capacity, yet showed the promise of digital computing all the same. Programming in machine language was very limiting: these first-generation computers had no high-level languages, no virtual memory, and no disk storage. Programming relied on banks of binary or hexadecimal switches, and a program, once loaded, usually took over the machine until it ended. The running program occupied the entire fixed memory, performing its instructions in sequence. There was no time slicing or multitasking in the early days.

Today, with huge amounts of computer memory to spare and billions of bytes even in our cell phones, I think of how much energy I devoted to managing variables, reuse, and the artifacts of pointers just a few decades ago, before memory became so cheap and so abundant. Our focus was on efficiency, and we optimized what we programmed to fit in very small memory spaces. Magnetic core memory was one of those inventions that radically changed the economics, moving computers from university laboratories and government (military) installations to commercial viability, and eventually into the home, the car, and many of the devices we use today.

Memory limitations and the growing expectation to do more with computing power drove the industry to create a solution. Fixed partitioning of memory evolved on single-task computers, and as tape and disk storage came on the scene, page swapping and multitasking followed.
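To make the idea concrete, here is a toy sketch in Python, purely my own illustration and not period software (the partition sizes and jobs are invented): memory is carved into fixed partitions, each job is loaded into the first partition big enough to hold it, and a supervisor gives each resident job a short time slice in round-robin fashion.

    # Illustrative sketch only: fixed memory partitions plus round-robin time slicing.
    # Partition sizes and job details are made up for the example.
    from collections import deque

    PARTITIONS = [8, 8, 16, 32]  # fixed partition sizes in KB, set when the system starts

    class Job:
        def __init__(self, name, size_kb, work_units):
            self.name, self.size_kb, self.work_units = name, size_kb, work_units

    def load(jobs):
        """Place each job in the first free partition big enough to hold it."""
        free = list(PARTITIONS)
        resident = []
        for job in jobs:
            for i, size in enumerate(free):
                if size is not None and job.size_kb <= size:
                    free[i] = None          # partition is now occupied
                    resident.append(job)
                    break
        return resident

    def run(resident, quantum=1):
        """Give each resident job one time slice, round-robin, until all finish."""
        queue = deque(resident)
        while queue:
            job = queue.popleft()
            job.work_units -= quantum       # the job runs for one quantum
            print(f"{job.name} ran for one slice; {max(job.work_units, 0)} units left")
            if job.work_units > 0:
                queue.append(job)           # not done: back to the end of the line

    run(load([Job("payroll", 7, 3), Job("sort", 14, 2), Job("report", 30, 4)]))

The point of the sketch is simply that fairness falls out of the structure: no job can hold the machine, because the supervisor, not the program, decides when a turn ends.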

Multitasking and time slicing did not come from today's generation sitting in college classrooms switching between devices and tools.

Program efficiency, reuse, and memory optimization evolved to contain the cost of hardware while serving more needs in a fair and economical way. Thus we see the swing between cost and added convenience.

Just as Dad and Mother were preoccupied by the aftershocks of the Great Depression, we see ripples of behavior shaped by major world events. They focused on resource use around the house the way I focused on using limited memory, split into compartments, to automate general tasks. Leftovers were always saved and used; they lived without for decades. My dad still shops with coupons at 87. He buys no-name generic goods and generally does not associate brands with value, except Wheaties. And he conserves everything, saving paper bags and plastic cups alike, much like many of his generation. He does not need to go Green. He is Green. He doesn't leave lights on when he leaves his house. He always wanted to carpool, a way of making the car and the energy it expends benefit the most people. He was one of the first to focus on gas mileage and car reliability when everyone else seemed so focused on look and feel, the convenience of material possessions, and the waste they often create. Finally, as I look back, I see how my parents managed their use of resources while I was growing up in a world of abundance, something I rebelled against then and can relate to now, from a different vantage point, as the Great Collapse of our era leaves its mark on our kids. What we are going through will be with us for some time. Thus the movement toward Green, better resource use, and a greater emphasis on the balance between cost and convenience will be with us for a long time.

The earliest work on core memory was done by An Wang and Way-Dong Woo, two Shanghai-born American physicists, both working at Harvard University's Computation Laboratory in 1951. Looking back, Harvard was not interested in promoting inventions created in its labs, so Wang was able to patent the system on his own while Woo took ill. At many universities funded by public grants today, who owns that IP would be in question. Dr. Wang's patent was not granted until 1955, the year I was born, and by that time core memory was already in commercial use. This started a long series of lawsuits, which eventually ended when IBM paid An Wang several million dollars to buy the patent outright. Wang used the money to build Wang Laboratories, which he had co-founded with Dr. Ge-Yao Chu, a schoolmate from China.

Not many of you will recall, or were around for, the heyday of the 70's, when Wang Laboratories, Texas Instruments, Prime Computer, Basic Four (also known as MAI), Datapoint, Hewlett-Packard, and Digital Equipment competed for the minds of system managers and programmers comfortable with the mainframe world. Downsizing was not the term we used then. But these third-generation computers evolved from the transistor and the core memory that Dr. An Wang, the founder of Wang Laboratories, had introduced two decades prior.

Later in my career, I had a chance to meet An Wang in Lowell, Massachusetts, when the company sponsored a gathering of its value-added resellers. It was around 1985, I recall, when my young company was developing an administrative system called ABT Campus on the Wang 2200, a small minicomputer that could support up to sixteen workstations and be clustered to support up to 240. The Wang 2200 used fixed memory partitions of up to 64K, and you could configure up to sixteen of them. This meant that an entire application had to run and operate within a 64K space, or call out to other partitions that held global routines and variables. Paging was born to preserve a partition's variables and manage its memory use. We would chain logical steps and chunks of code in sequence to perform complex work across memory spaces.
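Here is a rough sketch of that chaining discipline, written in Python purely for illustration (the 2200 itself was programmed in Wang BASIC, and the step names and shared-state dictionary are my own stand-ins): each step is a chunk of code small enough to fit one partition, a shared dictionary plays the role of the global variables a partition exposed, and control passes from chunk to chunk in sequence.

    # Illustrative sketch of chaining program steps across fixed partitions.
    # Step names and the shared-state dict are invented, not Wang 2200 syntax.

    def read_input(shared):
        shared["records"] = [3, 1, 2]       # pretend we read records from tape
        return "sort_step"                  # name the next chunk to load

    def sort_step(shared):
        shared["records"].sort()            # this chunk only sorts; nothing else fits
        return "print_step"

    def print_step(shared):
        print("sorted:", shared["records"])
        return None                         # end of the chain

    STEPS = {"read_input": read_input, "sort_step": sort_step, "print_step": print_step}

    def chain(first):
        shared = {}                          # the "global partition": survives between steps
        step = first
        while step is not None:
            step = STEPS[step](shared)       # load one chunk, run it, learn the next

    chain("read_input")

Only one chunk is ever "resident" at a time; everything a later chunk needs must be deliberately saved to the shared state, which is exactly the discipline a 64K partition imposed.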

Thinking back to the 80's, working within the limits of the Wang 2200 was one of my most intimate experiences with a computer system, and it gave me the insight to design computer system architectures that balance the drivers of cost and application. Programming in small spaces was an art form, much different from today, where virtual memory allows for so much more. One had to be very concise within static partitions limited by the amount of memory in the machine. It also forced me to think in modular spaces, or, as my wife would say, compartments. Compare that with today's abundance of memory, speed, and power: we tend to lose the motivation to optimize how we achieve the ends we seek, since the virtual world hides the impact of variable costs while the physical world struggles to find a new equilibrium. Thus new business models evolve, delivering software and computing value (free disk storage, backup, etc.) from Google to Skype, in contrast to the old legacy world of computers, which relied on radically different means to deliver on expectations.

I think that exposure to memory limits explains why, today, I process things in compartments, and how I multitask. I am sure others do this as well. It is why many who know me recognize that I time slice among my sixteen partitions like the old Wang 2200; they just have to wait until I get around to them. So, as we work on SaaS and SOAP, I find those memory limits an obscure but recurring influence on my life's work. Queues and messaging are just another form of time slicing a computer's power and its memory resources. Finally, this is all relevant today, since compartments running very small code segments are, in my view, where we are headed back to as we break down the complexity of legacy systems and bridge our steps. By doing so, we will regain a more efficient use of computing resources while seeing improvements in access, convenience, and application expansion. Multitasking and time slicing evolved out of necessity, to maximize computing power and its utility. We take them for granted today, don't you think?
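As a closing illustration of that point about queues, here is a minimal Python sketch using the standard library's queue module (the sender names are invented): a single worker slices its attention across messages from many senders, handling one per turn, much as a partitioned machine sliced time across jobs.

    # Illustrative: a message queue is time slicing by another name.
    import queue

    inbox = queue.Queue()
    for sender in ("billing", "admissions", "registrar"):
        inbox.put((sender, f"request from {sender}"))  # producers enqueue work

    while not inbox.empty():
        sender, message = inbox.get()       # the worker takes one message per "slice"
        print(f"handling {sender}: {message}")
        inbox.task_done()

Each sender simply waits its turn, which is all my sixteen partitions ever asked of anyone.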


 
