Thursday, March 5, 2009

Working in tight spaces since 2001

Growing up in the sixties, through social turbulence, the war on communism and the call for the Great Society, I recall the impact of the movie 2001: A Space Odyssey when I saw it in 1968. Not many movies captured my imagination or had an impact on my life the way that movie did. Since then, there have been many movies with a computer at the center of the plot. From The Matrix and Independence Day to WarGames, they showed that computers were not boring. The power of technology has always been portrayed in movies as a force humankind can harness, but there are risks.

The conversations between Dave Bowman and HAL the computer were the seeds of dreams for many. Watching the movie was like a surreal journey through space and time. The movie took nearly three hours to watch, but its impact has been felt over my lifetime. The implications of life out there, our future, and how computers would evolve into an integral part of society by the year 2001 were mesmerizing and discomforting. It launched conversation after conversation about the implications. At thirteen, I wondered how computers might impact my life. And our society.

The first generation systems evolved from scientific and experimental computers like the ENIAC and the Atanasoff–Berry Computer (ABC). They had limited capacity, yet showed the promise of digital computing all the same. They were nowhere close to the power of the HAL 9000, but companies and governments invested in them. Funding for the space race and landing a man on the moon was in the news almost every day. And in the late 60’s, it was not that big a stretch to think that by the year 2000 we would have computers talking, understanding, and directing many of the tasks humans would desire.
 

Well, if we could land a man on the moon, how hard would it be to have a computer talk and understand English? I believed that in my lifetime we would have robots to perform menial work or to do things in space too risky for humans. We would have communication systems translating languages I could not understand. And we would have the ability to connect wirelessly with video transmissions here on earth and in space travel.

Yet, given where we were in the sixties and early seventies with the development of computer systems, operating systems and software languages, I realized it would take a whole lot of small steps to get there. The HAL 9000 was a generational computer dreamed up for the movie. Given that we were working on second generation computers at the time, like the IBM and Burroughs mainframes, coding line by line in primitive languages, the vision of 2001 seemed more out of reach and unrealistic the more I tried to wrap my head around it. There were just too many steps to think about, and too many unknowns.

Programming in machine language was very limiting. The first generation computers did not have high level languages like BASIC, C, Java, COBOL or FORTRAN. All programming relied on machine languages, which placed instructions in sequential form, usually on punched cards or paper tape. Even in the movie 2001, HAL’s programming and memory banks were massive, taking up rows and rows of space in the ship. Much like the computers of the 60’s and 70’s, memory and CPUs were not yet built on silicon chips; machines of that era still relied on discrete transistors and magnetic core memory. Yet Dave Bowman had to work in a tight space to alter or adjust HAL, much like programmers had to cope with the limitations of the computers of that day.
 

The transition from machine language to symbolic assembly languages changed the landscape. COBOL and FORTRAN were developed with compilers that would generate the assembly code for you, and machine instructions were abstracted away. Yet we still danced with hex dumps and linkage maps back to machine instructions to decipher how to correct a program’s logic.

The second generation mainframe vendors were Burroughs, Control Data, GE, Honeywell, IBM, NCR, RCA and Univac, otherwise known as "IBM and the Seven Dwarfs." After GE’s and RCA’s computer divisions were absorbed by Honeywell and Univac respectively, the mainframe makers were known as "IBM and the BUNCH."

Today, with huge amounts of memory to spare and virtual memory spaces to slobber over, I think of how much energy I devoted to managing variables, reuse and the artifacts of pointers just a few decades ago – before memory became so cheap. Many stories popping into my head distract me from what I want to express in this pass. Magnetic core memory was one of those inventions that radically changed the economics, moving computers from university laboratories to commercial viability.

The earliest work on core memory was done by Dr. An Wang and Dr. Way-Dong Woo, two Shanghai-born American physicists, both working at Harvard University's Computation Laboratory in 1951. Looking back, Harvard was not interested in promoting inventions created in its labs. Instead, Wang was able to patent the system on his own while Woo took ill. A lesson for all universities.
 

Dr. Wang's patent was not granted until 1955 – the year I was born, and by that time core memory was already in commercial use. This started a long series of lawsuits, which eventually ended when IBM paid Wang several million dollars to buy the patent outright. Wang used the money to grow Wang Laboratories, which he had co-founded with Dr. Ge-Yao Chu, a schoolmate from China.

Not many of you will recall the heyday of the 70’s, when Wang Laboratories, Texas Instruments, Prime Computer, Basic Four (also known as MAI), Datapoint, Hewlett-Packard and Digital Equipment competed for the minds of system managers and programmers comfortable with the mainframe world. Downsizing was not the term we used. But these third generation computers evolved through the use of the transistor and the core memory introduced by Dr. An Wang, the founder of Wang Laboratories, two decades prior.

Later in my career, I had a chance to meet Dr. An Wang in Lowell, Massachusetts, when Wang Labs sponsored a gathering of their value added resellers. It was around 1985, I recall, when my young company was developing an administrative system called ABT Campus on the Wang 2200, a small mini-computer that could support up to sixteen workstations and be clustered up to 240. The Wang 2200 used fixed memory partitions of up to 64K, which meant that the entire application had to run within that space, or call other partitions that held global routines and variables.

It was exciting to meet the inventor of core memory and the founder of Wang Laboratories. At the time, Wang was a $3B empire selling throughout the world. Dr. Wang was addressing 2200 enthusiasts – mostly resellers who developed interesting applications linked to Wang’s word processing software. My company was one of the resellers, and that is how I was invited to Lowell, Massachusetts.

Another thing I recall is that Dr. An Wang led the team that also developed the computer SIMM, the way memory makers today layer chips on small pluggable boards placed on motherboards. This saved real estate on the motherboards and allowed for higher capacity systems. Dr. Wang was a very inventive person, not bound by the logic of delegation it seemed. He was hands on. He shook everyone’s hand in the small meeting. And I think Dr. Wang stayed that way throughout his entire career, until his death.
 

Compared to what we work with today in programming and computer systems, whether for Windows, the Internet or other platforms, most of the limitations placed on memory utilization and reuse are now forgotten in the distant past. If Stanley Kubrick had made 2001 today, he would have shown HAL the size of an iPod, and the interface would have been telepathic, not verbal.
 

Yet the exercise of drawing on my memories of movies like 2001, and recalling the history of what I experienced and witnessed since 1968, has helped me understand the importance of optimization and balance. We have limitations, and we should manage our resources wisely no matter how much space we have.

Wednesday, March 4, 2009

Where would I be if I had not learned WATFOR?

The artifacts of software are explicitly linked to the evolution of computers, or CPUs. If there is a CPU (Central Processing Unit), then there must be software instructions governing the CPU. The instructions compiled or assembled into executable tasks are what we generally call software, as opposed to the physical components of the machine, called hardware. There are various layers of software and hardware that make up a computer system. The interaction of hardware components is also controlled by software, built on specialized instructions to manage input, processing, storage and output.

Whether it is a CPU used to control the flow of air in a building or a generalized CPU used for desktop computing, software is by far a very expansive path to study. To uncover and resurrect past software (or ruins) and its contribution to the evolution and innovation of the computer industry, in a few short years, is challenging enough, given the way our minds forget things and toss out what we no longer need or use. Computers live lives shorter than dog years. In a few short years, a computer once thought of as the greatest and best is on the trash heap, forgotten. Because of this, much of the history and many of the artifacts of software are left in rubble in the backs of our minds, lost unless we bring the stories out and share them.

Without CPUs and hardware, software would have no purpose and would not exist. As I explore my own personal stories of growing up with computers over the last five decades, and those of my friends, it resonates how fortunate I have been to witness, and be a part of, the explosion of innovation – driven by economics, or as some would say, sheer capitalism. The drive to create improved value over previous works of art propels the computer industry and much of technology exploration, in the hope that each new innovation brings us purpose and usefulness, which in turn yields value.

I carry around with me, like impressions of art scattered on the wall of my life, a diverse set of stories that brings me to the present and propels me into the future. The stories have varying colors, mediums and textures that bring to light the impact of what I was seeing, feeling and absorbing through the years. Some are deep rooted and moved me to take a new direction. Some are rather boring to bring up. Some are quite interesting and filled with lessons learned. Many are humorous, since we cannot take ourselves too seriously in a world that is always topsy-turvy.
 

Almost every day, thoughts come to mind relating to my personal history and the evolution of modern computers and software over the last half century. This has led me to write down some of my stories and, in part, to share them on Software Ruins as a means of collecting the artifacts of the virtual world that are not with us any longer. My son will never need to prepare paper tape or punched cards to read a program into a computer. My daughter will never have to play with a floppy disk. Yet both will utilize the descendants of the computers and software that leveraged those devices in the migration we call time.

This is not meant as a history lesson – so don’t worry. But, looking back, and diving into the past, offers the opportunity to reveal and share with retrospect and insight. Given what we have witnessed and experienced, one can only imagine what our future holds. Most of the time, we are not thinking about events relative to their impact on us or our community or our world as they occur or soon after. It often takes years to reflect and ponder the subtle implications. I guess that is one of the benefits of looking backward and relating it to what transpired.
 

My father worked for GE before I could remember much. He moved around a bit between the commercial and space divisions as they focused on different initiatives in New York and Pennsylvania. In the 50’s, he was part of a team inventing magnetic tape drive assemblies. They were external storage devices that radically changed how second generation mainframes evolved past the physical memory limits of their ancestors of just a decade earlier. I guess I inherited my father’s engineering and inventive side. My mother was the artistic and nurturing spirit who always told me I could do anything I put my mind to. She was the one who gave me the comfort to think outside the box.

My first introduction to computers came when I took my freshman course in FORTRAN. It was a required course in 1973. The class was in one of those huge lecture halls where 300 or so students sat in the auditorium and listened to the professor talk about the history of computers and how relevant it was to our day. My class involved writing simple FORTRAN programs to solve mathematical problems using punched cards. It was not one of the coolest courses. I struggled and persevered through it. It was only one semester anyway.

Looking back, I can now see further insights into how the computer industry was evolving. It is not savage. But, there is a form of cannibalization and efforts orchestrated to dominate competitors through any means possible, including using the legal system to fight patents and inventiveness.
 

The evolution and drive to address the obstacles of commercializing computers begins and ends with adoption. As a consumer back in the 70’s, I saw no relevance in what I was learning about a technology that was still evolving. It was early in the adoption cycle, where the masses sit back and wait until others, more adventurous, explore the unknown. Computer memory and capacity limitations played out over decades – not one semester. Operating systems and hardware evolved incrementally to serve commercial markets and overcome their obstacles. There was no big boom in which computers were suddenly everywhere. It was a long stretch over time that patterned Darwin’s theory of evolution.

FORTRAN was an early language developed by IBM. It was designed for numerical computation, whereas COBOL was designed for character based data; both were alternatives to machine level programming on the IBM mainframes. From what I understand, IBM funded the University of Waterloo to develop WATFOR in 1965 for the IBM 7040. By the early 70’s, Drexel University was using the WATFOR compiler as the basis of its Introduction to Computers course. The WATFOR compiler was popular on college and university campuses.
 

The University of Waterloo employees, led by Ian McPhee, spun out to form a compiler company in the early 80’s. They developed other products, including WATCOM APL, BASIC, COBOL, PASCAL, FORTRAN and WATCOM SQL, on a range of hardware platforms. It was an impressive array of work for a small company. They moved down to micros and embedded applications with their WATCOM Assembler. The Commodore SuperPET used WATCOM languages as the basis of its software. And, in the late 80’s, WATCOM introduced a C compiler for the IBM PC, competing with Borland and Microsoft.
 

C was originally developed at Bell Telephone Laboratories in 1972, alongside UNIX, to write system software independent of the underlying hardware. The C language incorporated some of the syntax and semantics of other languages, including ALGOL, FORTRAN and COBOL. As a compiler and language, C was designed for portability, with features supported by general purpose CPUs and with modularization, structure and code reuse. Before I got into C, I learned ALGOL first on the Burroughs 5500 and fell in love with writing software. I soon moved to PL/1 and then C as I worked on Prime computers running UNIX, which is when I realized my calling was in developing and architecting software systems.
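
To give a feel for the modular, portable style C encouraged, here is a minimal sketch of my own (not code from that era, and not tied to any particular compiler): a small reusable routine separated from the program that calls it. The same source should compile unchanged with just about any standard C compiler on any general purpose CPU.

    #include <stdio.h>

    /* A reusable routine: compute the average of an array of readings.
       Nothing here depends on a particular CPU or operating system. */
    static double average(const double *values, int count)
    {
        double sum = 0.0;
        for (int i = 0; i < count; i++)
            sum += values[i];
        return (count > 0) ? sum / count : 0.0;
    }

    int main(void)
    {
        double readings[] = { 72.5, 68.0, 71.2, 69.8 };
        int n = (int)(sizeof(readings) / sizeof(readings[0]));

        /* The caller reuses the routine without knowing its internals. */
        printf("Average reading: %.2f\n", average(readings, n));
        return 0;
    }

That combination of structure, portability and reuse in a handful of lines is what made the language feel like such a step up from what came before.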

I look back on my programming start with WATFOR. How ironic that things seem to cycle back to where one starts. Fast forward twenty years. It was 1993, and my software company was launching a second generation campus administrative system called ABT PowerCAMPUS to replace the first generation ABT Campus product line that ran on the Wang 2200, Digital VAX and PC LAN networks.
 

We chose Sybase SQL Server and PowerSoft’s PowerBuilder as our primary development platform. This was a ground up project. We were starting from scratch. We also wanted to support MS SQL Server and Novell SQL in the early stage of our development. Our attempt was to build a portable front-end and back-end, but that ran into difficulty when dialect differences overwhelmed us. So our primary focus was on Sybase and PowerSoft. Our goal was to develop a stable product by 1995, and MS SQL Server was not well regarded at the time. We played around with version 4.2 and had to wait until version 4.9 before it became more stable and provided the functions we needed, including stored procedures.
 

Now, let’s digress a bit. Microsoft bought the Windows based SQL Server code from Sybase in 1993 and dissolved the revenue sharing partnership with Sybase that had started in 1988. The split created two competing code bases on the same NT platform, with a common ancestor in Transact-SQL, but it gave Sybase a small war chest to acquire companies and diversify. Both were competing for the growing client/server database market against IBM, Oracle and Informix, which were still on UNIX or mainframes.

While we were in the midst of starting our PowerCAMPUS project, WATCOM International was acquired by PowerSoft in 1994. Then, a short year later, Sybase acquired PowerSoft in 1995.

Remember, Sybase was the author and developer of SQL Server, as I mentioned earlier. Its popularity was mostly on UNIX. Yet a growing base of corporate PowerSoft developers was loyally aligned with Sybase SQL Server on NT, because the dialect variations and stability issues I mentioned earlier kept everyone from adopting Microsoft’s competing version. Product maturity has a way of helping one retain a client base. Lesson learned.

As Sybase absorbed the PowerSoft and WATCOM technologies, they incorporated the C compiler into the PowerSoft PowerBuilder toolkit, and WATCOM SQL was renamed SQL Anywhere, a desktop, single user database. Soon we were compiling PowerBuilder objects into generated C code and executables, replacing the interpreted BASIC-like code. Performance improved. And we were utilizing the standalone SQL Anywhere to develop and QA our product.

So, from the start in 1973, I learned programming using WATFOR, a simple FORTRAN compiler. Twenty years later, I was reintroduced to WATCOM technology that had evolved with the industry, at the same time I was trying to navigate the nuances of who was doing what to whom and to pick which vendors and products to use in my efforts to create value for my niche market, given how technology companies follow the laws of Darwin.



Innovate or perish.

Tuesday, March 3, 2009

Being first to market, ARCnet

I’m going to date myself. How many of you out there remember ARCnet? In 1976, I was attending Drexel University. As an early geek, I lived in the computer center on the basement floor of the student center, rarely attending class. My days were filled with experimenting with everything from dumb terminals connected to early mainframes, minis and micro computers to teletype systems with paper tape or punched cards. Drexel gave me wonderful exposure to technologies and early devices such as the IBM 5120, the Tektronix plotters and the Datapoint Databus CPUs, to name a few. Drexel also had a Burroughs 5500 mainframe and an RJE terminal to UNI-COLL, a shared non-profit at 34th and Market Streets in Philadelphia that hosted an IBM 360 at first, and then an IBM 370 mainframe, for some thirty higher education institutions. It was part of the Science Center back in the day of timesharing. I will cover some stories on UNI-COLL another day. Today, I will reflect on ARCnet.

As a student, I got into everything. Nick Demaio, the Drexel Computer Center IT manager, grew fond of my skills and one day gave me the job of looking into the Datapoint Databus CPUs employed in the Admissions department. I recall a group of CPUs connected together that looked like hybrid office machines – part printer, part terminal, part CPU. They were used by administrative operators in Admissions to follow up on inquiries and applicants. In other words, they were early word processors, designed to support letter management and the data files storing the names and addresses of everyone who contacted the University. The applications were developed by Michael McCabe, who was a few years ahead of me. The COBOL-like language was compiled, and the routines were split across the multiple CPUs. Multiple CPUs? Everything I had been exposed to before centered around a single CPU, or a system built with multiple CPUs but managed by a shared operating system like MVS on the mainframe.

What intrigued me was the connection between the “micro” computers. It was a coax cable on the outside, connected to the back of the chassis, far more advanced than the punched card and paper tape readers we had in the computer center. The applications were developed and managed with no release management, no check-in and check-out, and no thought about architecture. The programming to support the Datapoint applications also had to incorporate updates of the operating system and device driver updates for the plug-in expansion boards. Each board included a set of disks. And the system software had to be loaded in ROM and made part of the boot sequence. Part of the responsibilities of the computer center was to provide ongoing maintenance of the Datapoint system as new releases were received from Datapoint on 8-inch floppy disks.

That was where I came in. Nick Demaio asked me to update the system one day. That is an oversimplification. I had to learn how to install a set of patches and to recompile the COBOL-like programs Mike McCabe had developed. There is an art to developing new solutions, just like there is an art to figuring out things someone else created. In the process, I consumed the manuals and technical release papers on ARCnet and began dreaming of all sorts of applications that could connect computer to computer. There was little documentation on the actual application Drexel used, maybe twenty or so pages written into ASCII text files. But I figured it out.

Reverse engineering was a term coined years after I worked on these Datapoint systems. Back then, the data files were localized on one CPU as it served the others. Multiuser support had to be handled with file and record locking. ARCnet introduced new challenges, in that use of the tables was no longer under the session control of a single CPU. On top of that, ARCnet had no central CPU. The hub of computers shared the external backbone over the 2.5 Mbit/s communication channel offered by the coax cable. Datapoint’s incarnation of ARCnet was more like peer to peer networking. Do you recall NetBIOS? ARCnet was the forerunner of what Microsoft fashioned a decade later and pushed for a long time; we still have NetBIOS over TCP/IP installed on most networks today. Some of you may be familiar with that. Most of you don’t recall ARCnet because it was the first real venture into local area networks (LANs), before Novell picked it up and supported it as a topology in the late 70’s. It was before Banyan Vines. And it was before Ethernet and Token Ring came on the scene in the early 80’s as competitive threats.
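
For those who never had to think about shared data files, here is a rough modern analogue of the record locking those applications relied on. This is a minimal POSIX C sketch of my own, not the Datapoint mechanism, and the file name and record size are hypothetical:

    #include <fcntl.h>
    #include <stdio.h>
    #include <unistd.h>

    /* Lock one fixed-length record before updating it, so that two
       workstations sharing the same data file cannot collide. */
    static int update_record(int fd, long record_no, long record_len)
    {
        struct flock lk = {0};
        lk.l_type   = F_WRLCK;                       /* exclusive write lock */
        lk.l_whence = SEEK_SET;
        lk.l_start  = (off_t)record_no * record_len; /* offset of the record */
        lk.l_len    = record_len;                    /* lock only this record */

        if (fcntl(fd, F_SETLKW, &lk) == -1)          /* wait for the lock */
            return -1;

        /* ... read, modify and rewrite the record here ... */

        lk.l_type = F_UNLCK;                         /* release the lock */
        return fcntl(fd, F_SETLK, &lk);
    }

    int main(void)
    {
        int fd = open("applicants.dat", O_RDWR);     /* hypothetical data file */
        if (fd == -1) { perror("open"); return 1; }
        if (update_record(fd, 42, 256) == -1) perror("lock");
        close(fd);
        return 0;
    }

The idea is the same one we wrestled with then: lock the smallest unit you can, make the update, and release it quickly so the other workstations are not left waiting.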

Token Ring from IBM took the same path as ARCnet. I worked on that too. IBM controlled the technology and did not want other vendors to tell them what to do, so I would call it a pretty closed system. As a result, Ethernet, which was pushed by Xerox and Digital, took off, because it eventually ran over twisted copper pair and was open for anyone to support, from Apple to Zenith. What limited ARCnet, Token Ring and the other network topologies of that day? Most likely, the reason none of us use them today was the reluctance of Datapoint to support anything other than RG-62 coax. The engineers thought it was the best way to send signals without interference. From a business perspective, they also thought they had the market cornered, and after going public in the early 80’s with tons of capital, they faltered when their senior management and the SEC clashed over revenue recognition, throwing the company into a tailspin it never recovered from.

Meanwhile, IBM and others who resisted open standards and market forces believed Token Ring was their best way to connect computers and devices even with their bulky cables. All their hardware offered Token Ring adapters and none of it connected with Ethernet or ARCnet. That is another story.

The lesson learned from ARCnet and IBM, at least from a marketing perspective, is that sticking to your guns and trying to control the evolution of a technology often loses out to an open market approach. By opening Ethernet as a platform that commerce could expand on, we all benefited, and we now have the Internet, home computers and a world connected.

If you want to read more about ARCnet, check out
http://en.wikipedia.org/wiki/ARCNET

Sunday, March 1, 2009

The Archaeology of Software

Software can be thought of as the building blocks of a virtual universe – an integral and organic part of computer systems, complementing the evolution of hardware.

Much like the early inventions of mankind, software has evolved rapidly through generations of discovery, use and decline. The logic and architecture of programming amount to training a computer through steps of instructions. The interactions are controlled; input, storage, processing and output commands are componentized. Generally, we think of software as technical and abstract. Yet programs are languages and forms of communication all the same - between human and machine, machine and human, or machine and machine. Software is similar to virtual life, reflecting species and genetics shaped by Darwin's theory of evolution.

My work is not an exhaustive list or a complete compendium. My observations and my gathering of the relics of software companies, the remains from my 'digs', are like the study of archaeology, though compressed into a few short years that felt like centuries. What remains of software is often hidden under the surface, away from our sight. Decisions long ago forgotten reveal how software, much like early civilizations, moved around feeding on the landscape, impacted by climate, density, food and predators outside its control.

Software is a medium of communication, much like writing on papyrus or paper or punched card or computer screen. And, it deserves to be preserved or understood like the artifacts uncovered in a dig. Where does it fit? How did it impact the day? Who were the users? How long did it survive? What is the ancestral tree?

So this blog is my venture to do just that: to build a virtual museum of sorts, where I can scavenge for the remains and attempt to describe software's history and its place, or at least some of it.