Thursday, March 5, 2009

Working in tight spaces since 2001

Growing up in the sixties, through social turbulence, the war on communism, and the call for the Great Society, I recall the impact of the movie 2001: A Space Odyssey when I saw it in 1968. Not many movies captured my imagination or affected my life the way that one did. Since then, there have been many movies with a computer at the center of the plot. From The Matrix and Independence Day to WarGames, they showed that computers were anything but boring. The power of technology has always been portrayed in movies as a force humankind can harness, but not without risk.

The conversations between Dave Bowman and HAL the computer were the seeds of dreams for many. Watching the movie was like a surreal journey through space and time. The movie took three hours to watch, but its impact has been felt over my lifetime. The implications of life out there, of our future, and of how computers would evolve and become an integral part of society by the year 2001 were mesmerizing and discomforting. It launched conversation after conversation. At thirteen, I wondered how computers might impact my life, and our society.

The first-generation systems evolved from scientific and experimental computers like the ENIAC and the Atanasoff–Berry Computer (ABC). They had limited capacity, yet showed the promise of digital computing all the same. They were nowhere close to the power of the HAL 9000, but companies and governments invested in them. Funding for the space race and landing a man on the moon was in the news almost every day. And in the late 60s, it was not that big a stretch to think that by the year 2000 we would have computers talking, understanding, and directing many of the tasks humans desired.

Well, if we could land a man on the moon, how hard would it be to have a computer talk and understand English? I believed that in my lifetime we would have robots to perform menial work, or to labor in space on things too risky for humans. We would have communication systems translating languages I could not understand. And we would have the ability to connect wirelessly with video transmissions, here on Earth and in space travel.

Yet, given where we were in the sixties and early seventies with the development of computer systems, operating systems, and software languages, it was clear it would take a whole lot of small steps to get there. The HAL 9000 was a generational computer dreamed up for the movie. Given that we were working on second-generation computers at the time, like the IBM and Burroughs mainframes, coding line by line in primitive languages, the vision of 2001 seemed more out of reach and unrealistic the more I wrapped my head around it. There were just too many steps to think about, and too many unknowns.

Programming in machine language was very limiting. The first-generation computers did not have high-level languages like BASIC, C, Java, COBOL, or FORTRAN. All programming tasks relied on machine language, which placed instructions in sequential form, usually on punched cards or paper tape. Even in the movie 2001, HAL's programming and memory banks were massive, taking up rows and rows of space in the ship. Much like the computers of the 60s and 70s, memory and CPUs were not yet on silicon chips; in the first generation, even transistors and core memory were not yet in practical use. Yet Dave Bowman had to work in a tight space to alter or adjust HAL, much like programmers had to cope with the limitations of the computers of that day.

The transition from machine language to symbolic assembly languages changed the landscape. COBOL and FORTRAN were developed with compilers that would generate the assembly code for you; machine instructions were abstracted away. Yet we still danced with hex dumps and linkages back to machine instructions to decipher how to correct a program's logic.
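For readers who never had the pleasure, a hex dump is simply raw memory or file contents printed byte by byte in hexadecimal. As a hedged illustration (in C, rather than the assembly or COBOL of the era), here is a minimal sketch of the kind of dump we squinted at to trace a program's state:

    #include <stdio.h>

    /* Print a region of memory as a classic hex dump:
       offset, up to sixteen bytes in hex, then printable ASCII. */
    static void hex_dump(const void *data, size_t len)
    {
        const unsigned char *bytes = data;
        for (size_t i = 0; i < len; i += 16) {
            printf("%08zx  ", i);
            for (size_t j = 0; j < 16; j++) {
                if (i + j < len)
                    printf("%02x ", bytes[i + j]);
                else
                    printf("   ");          /* pad the last row */
            }
            printf(" |");
            for (size_t j = 0; j < 16 && i + j < len; j++) {
                unsigned char c = bytes[i + j];
                putchar(c >= 32 && c < 127 ? c : '.');
            }
            printf("|\n");
        }
    }

    int main(void)
    {
        const char msg[] = "Open the pod bay doors, HAL.";
        hex_dump(msg, sizeof msg);   /* includes the trailing NUL */
        return 0;
    }

Reading pages of output like that, and mapping offsets back to instructions by hand, was how you debugged before symbolic debuggers were commonplace.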

The second-generation mainframe vendors were Burroughs, Control Data, GE, Honeywell, IBM, NCR, RCA, and Univac, otherwise known as "IBM and the Seven Dwarfs." After GE's and RCA's computer divisions were absorbed by Honeywell and Univac, respectively, the mainframe makers were known as "IBM and the BUNCH."

Today, with huge amounts of memory to spare and virtual memory spaces to slobber over, I think of how much energy I devoted to managing variables, reuse, and the artifacts of pointers just a few decades ago, before memory became so cheap. Many stories popping into my head distract me from what I want to express in this pass. Magnetic core memory was one of those inventions that radically changed the economics, moving computers from university laboratories to commercial viability.
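To give a flavor of that discipline, here is a small, hypothetical sketch in C (the names and sizes are mine, invented for illustration) of the fixed-pool style of allocation we leaned on when every byte had to be accounted for and reused:

    #include <stdio.h>

    /* A fixed arena: every "allocation" carves from one static
       buffer, and pool_reset() reclaims the whole thing at once
       so the next pass of the program can reuse the same bytes. */
    #define POOL_SIZE 4096
    static unsigned char pool[POOL_SIZE];
    static size_t pool_used = 0;

    static void *pool_alloc(size_t n)
    {
        if (pool_used + n > POOL_SIZE)
            return NULL;              /* out of space: the caller must cope */
        void *p = &pool[pool_used];
        pool_used += n;
        return p;
    }

    static void pool_reset(void)
    {
        pool_used = 0;                /* every pointer into the pool is now stale */
    }

    int main(void)
    {
        char *line = pool_alloc(80);  /* one 80-column card image */
        if (line != NULL) {
            snprintf(line, 80, "record %d", 1);
            puts(line);
        }
        pool_reset();                 /* recycle the arena for the next pass */
        return 0;
    }

The point is less the code than the habit of mind: you knew exactly where every byte lived, and you planned its reuse before you wrote a line.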

The earliest work on core memory was by Dr. An Wang and Dr. Way-Dong Woo, two Shanghai-born American physicists. Both were working at Harvard University's Computation Laboratory in 1951. Looking back, Harvard was not interested in promoting inventions created in its labs. Instead, Wang was able to patent the system on his own while Woo took ill. A lesson for all universities.

Dr. Wang's patent was not granted until 1955, the year I was born, and by that time core memory was already in commercial use. This started a long series of lawsuits, which eventually ended when IBM paid Wang several million dollars to buy the patent outright. Wang used the money to build Wang Laboratories, which he co-founded with Dr. Ge-Yao Chu, a schoolmate from China.

Not many of you will recall the heyday of the 70s, when Wang Laboratories, Texas Instruments, Prime Computer, Basic Four (also known as MAI), Datapoint, Hewlett-Packard, and Digital Equipment competed for the minds of system managers and programmers comfortable with the mainframe world. Downsizing was not the term we used. But these third-generation computers evolved through the use of the transistor and the core memory Dr. An Wang had introduced two decades prior.

Later in my career, I had a chance to meet Dr. An Wang in Lowell, Massachusetts, when Wang Labs sponsored a gathering of their value-added resellers. It was around 1985, I recall, when my young company was developing an administrative system called ABT Campus on the Wang 2200, a small minicomputer that could support up to sixteen workstations and be clustered up to 240. The Wang 2200 used fixed memory partitions of up to 64K, which meant that the entire application had to run within that space, or call other partitions that held global routines and variables.
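To make that constraint concrete, here is a hypothetical sketch in C (the 2200 itself was programmed in Wang's own BASIC; this C and its module names are invented for illustration) of the shape that discipline took: each module had to fit its fixed partition, and shared state lived in a common area every partition could reach:

    #include <stdio.h>
    #include <string.h>

    /* Illustrative only: a fixed-partition layout in miniature.
       Shared state lives in one common block, standing in for the
       global routines and variables a partition could call on. */
    struct common_block {
        char student_name[64];
        int  credits;
    };
    static struct common_block common;

    /* Each module is written to fit its fixed partition; code or
       data that outgrows the budget has to move to another one. */
    static void registration_module(void)     /* "partition 1" */
    {
        strcpy(common.student_name, "A. Student");
        common.credits = 12;
    }

    static void billing_module(void)           /* "partition 2" */
    {
        printf("%s is billed for %d credits\n",
               common.student_name, common.credits);
    }

    int main(void)
    {
        registration_module();
        billing_module();
        return 0;
    }

On the real machine the budget was enforced by the hardware, not by discipline alone: 64K was the wall, and you designed the application as cooperating pieces from the start.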

It was exciting to meet the inventor of core memory and the founder of Wang Laboratories. At the time, Wang was a $3B empire selling throughout the world. Dr. Wang was addressing 2200 enthusiasts, mostly resellers who had developed interesting applications linked to Wang's word processing software. My company was one of those resellers, and that is how I came to be invited to Lowell, Massachusetts.

Another thing I recall is that Dr. An Wang led the team that developed the SIMM, the means by which memory makers today layer chips on small pluggable boards placed on motherboards. This saved real estate on the motherboard and allowed for higher-capacity systems. Dr. Wang was a very inventive person, not bound by the logic of delegation, it seemed. He was hands-on. He shook everyone's hand in the small meeting. And I think Dr. Wang stayed that way throughout his entire career, until his death.

Compared with what we work with today in programming and computer systems, whether for Windows, the internet, or other platforms, most of the old limitations on memory utilization and reuse are now a distant memory. If Stanley Kubrick had made 2001 today, he would have shown HAL the size of an iPod, and the interface would have been telepathic, not verbal.

Yet the exercise of drawing on my memories of movies like 2001, and of recalling the history I experienced and witnessed since 1968, has helped me understand the importance of optimization and balance. We have limitations, and we should manage our resources wisely no matter how much space we have.
