Friday, September 18, 2009

Multitasking and time splicing

The evolution of computer hardware and software over the last five decades of my life is reflected in a set of personal stories I carry around with me - stories that have shaped my life and career. As I write this passage and edit its content, I have to contain my desire to expand into related topics, because the market's pendulum swings have such broad implications. On the left side of the pendulum is the unending drive for cost reduction, capacity and speed. On the right side is the continual push for convenience, application and innovation. This is not a history lesson though. Diving into the past as a virtual archeologist gives me the opportunity to reveal lessons that I did not notice at the time. Time flew by so fast as I worked hard over those decades, often not stopping to think about or recognize how important events and their ripples would become in retrospect. So, I hope to give some attention to them now.

In the early 60's, computer makers strove to overcome many challenges. The largest and most pressing to me was removing the capacity limitations of early computer designs, where all the resources of memory and execution were devoted to a single use. This began the pendulum swing we still see today. At first, computers were large sorting and counting machines. They were programmed to do simple tasks over and over again. They were faster than humans at calculation and sorting tasks. As new applications were dreamed up, the hardware and software were pushed to new limits. Like many innovative new industries and products, the computer industry never contemplated the expansion of applications and uses of computing beyond a large appliance for big government and industry.

The first generation systems evolved from scientific and experimental computers like the ENIAC and the Atanasoff–Berry Computer (ABC). They had limited capacity, yet showed the promise of digital computing all the same. Programming in machine language was very limiting. These first generation computers did not have high level languages, virtual memory or disk space. Programming relied on binary or hexadecimal switches, and a program, once loaded, usually took over the machine until it ended, occupying the entire fixed memory and performing its instructions in sequence. There was no time splicing or multitasking in the early days.

Today, with huge amounts of computer memory to spare and billions of bytes of memory even on our cell phones, I think of how much energy I devoted to managing variables, reuse and the artifacts of pointers just a few decades ago – before memory became so cheap and so abundant. Our focus was on efficiency, and we optimized what we programmed to fit in very small memory spaces. Magnetic core memory was one of those inventions that radically changed the economics, moving computers from university laboratories and government (military) use to commercial viability, and into the home, the car and now many of the devices we use.

Memory limitations and the growing expectation to do more with computer power drove the industry to create a solution. Single-task computers evolved into machines with fixed memory partitions, and as tape and disk storage came on the scene, page swapping and multitasking became possible.
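
To make the idea concrete, here is a minimal sketch of round-robin time splicing, written in modern Python rather than anything from that era; the job names and step counts are invented for illustration. Each job sits in its own fixed partition and gets a short slice of the machine before the next one runs.

    # A toy round-robin scheduler: each "job" lives in its own fixed partition
    # and gets a short slice of the machine before the next one runs.
    from collections import deque

    class Job:
        def __init__(self, name, steps):
            self.name = name
            self.steps = steps          # instruction "steps" left to run

        def run(self, quantum):
            done = min(quantum, self.steps)
            self.steps -= done
            print(f"{self.name}: ran {done} steps, {self.steps} remaining")
            return self.steps == 0      # True when the job is finished

    def schedule(jobs, quantum=3):
        ready = deque(jobs)             # one queue entry per occupied partition
        while ready:
            job = ready.popleft()
            if not job.run(quantum):
                ready.append(job)       # not done; wait for the next time slice

    schedule([Job("payroll", 7), Job("inventory", 4), Job("report", 10)])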

Multitasking and time splicing did not come from today's generation sitting in college classrooms switching between devices and tools.

Program efficiency, reuse, and memory optimization evolved to contain the cost of hardware while serving more needs in a fair and economical way. Thus we see the swing between cost and added convenience.

Just as Dad and Mother were preoccupied by the aftershock of the Great Depression, we see ripples of behavior shaped by major events in the world. They focused on resource use around the house the way I focused on using limited memory, split into compartments, to automate general tasks. Leftovers were always saved and used. They lived without for decades in the aftershock of the Great Depression. My dad still shops with coupons at 87. He buys no-name generic stuff and generally does not associate brands with value, except Wheaties. And, he conserves everything, saving everything from paper bags to plastic cups, much like many of his generation. He does not need to go Green. He is Green. He doesn't leave lights on when he leaves his house. He always wanted to carpool, a way of getting the most benefit from the car and the energy it expends. He was one of the first to focus on gas mileage and car reliability when everyone else seemed so focused on look and feel, the convenience of material possessions and the waste they often create. Finally, as I look back, I see how my parents managed their use of resources while I was growing up in a world of abundance that I rebelled against, something I can now relate to from a different vantage point as the Great Collapse of our era leaves its mark on our kids. What we are going through in our era will be with us for some time. Thus, the movement toward Green, better resource use and a greater emphasis on the balance between cost and convenience will be with us a long time.

The earliest work on core memory was by An Wang and Way-Dong Woo, two Shanghai-born American physicists. Both were working at Harvard University's Computation Laboratory in 1951. Looking back, Harvard was not interested in promoting inventions created in its labs. Instead, Wang was able to patent the system on his own while Woo took ill. Who would own the IP today would be in question at many universities funded by public grants. Dr. Wang's patent was not granted until 1955 – the year I was born, and by that time core memory was already in commercial use. This started a long series of lawsuits, which eventually ended when IBM paid An Wang several million dollars to buy the patent outright. Wang used the money to fund Wang Laboratories, which he co-founded with Dr. Ge-Yao Chu, a schoolmate from China.

Not many of you will recall or were around in the heyday of the 70's, when Wang Laboratories, Texas Instruments, Prime Computer, Basic Four (also known as MAI), Datapoint, Hewlett-Packard, and Digital Equipment competed for the minds of system managers and programmers comfortable with the mainframe world. Downsizing was not the term we used. But these third generation computers evolved with the use of the transistor and the core memory introduced by Dr. An Wang, the founder of Wang Laboratories, two decades prior.

Later in my career, I had a chance to meet An Wang in Lowell, Massachusetts, when the company sponsored a gathering of their value added resellers. It was around 1985, I recall, when my young company was developing an administrative system called ABT Campus on the Wang 2200, a small mini-computer that could support up to sixteen workstations and be clustered up to 240. The Wang 2200 used fixed memory partitions of up to 64K, and you could configure up to sixteen of them. This meant that the entire application one would write had to run and operate within a 64K space, or call other partitions that held global routines and variables. Paging was born to save the variables and memory state of a partition. We would link logical steps and chunks of code in a sequence to perform complex steps across memory spaces.
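
For readers who never touched a machine like this, here is a rough sketch of the chaining idea in modern Python, not Wang BASIC; the segment names, the shared globals and the data are all invented for illustration. Each "segment" stands in for a chunk of code small enough to fit a partition, and it hands control to the next segment while a shared structure plays the role of the global partition.

    # A sketch of chaining small program segments that each "fit" a tiny
    # partition while sharing a few globals held elsewhere. All names and
    # data are invented; the real work was done in the Wang's own BASIC.
    GLOBALS = {"student_count": 0, "term": "FALL"}   # stands in for the global partition

    def segment_load(g):
        g["student_count"] = 3          # pretend we read records from disk
        return "segment_validate"       # name of the next segment to chain to

    def segment_validate(g):
        assert g["student_count"] >= 0
        return "segment_report"

    def segment_report(g):
        print(f'{g["student_count"]} students enrolled for {g["term"]}')
        return None                     # end of the chain

    SEGMENTS = {f.__name__: f for f in (segment_load, segment_validate, segment_report)}

    step = "segment_load"
    while step:                          # only one small segment is "loaded" at a time
        step = SEGMENTS[step](GLOBALS)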

Thinking back to the 80's, working within the limits of the Wang 2200 was one of my most intimate experiences with a computer system, and it gave me the insight to better develop computer system architecture that balances the drivers of cost and application. Programming in small spaces was an art form, much different than today, where virtual spaces allow for so much more. One had to be very concise with static partitions limited by the amount of memory in a computer. It also forced me to think in modular spaces, or what my wife would call compartments. With today's abundance of memory, speed and power, we tend to lose the motivation to optimize how we achieve the ends we seek, since the virtual world hides the impact of variable costs while the physical world struggles to find a new equilibrium. Thus new business models evolve, from Google to Skype, delivering software and computing value (free disk storage, backup, etc.), in contrast to the old legacy world of computers and its radically different means of delivering on expectations.

I think that exposure to memory limits explains why, today, I process things in compartments and how I multitask. I am sure others do this as well. And it is why many who know me recognize that I time splice like the old Wang 2200 across sixteen partitions. They just have to wait until I get around to them. So, as we work on SaaS and SOAP, I find those memory limits an obscure but recurring influence on my life's work. Queues and messaging are just another form of time splicing the power of the computer and the use of memory resources. Finally, this is all relevant today, since compartments running very small code segments are where we are headed back to, in my view, as we break down the complexity of legacy systems and bridge our steps. By doing so, we will regain a more efficient use of computing resources while we see an improvement in access, convenience and application expansion. Multitasking and time splicing evolved out of necessity to maximize computer power and its utility. We take it for granted today, don't you think?


 

Saturday, August 15, 2009

Software Age - How old is your system? Do you have a clunker?

In our age of energy conservation, our push to replace clunkers is a good example of how policy and advocacy have aligned - at least in the auto industry. I drive a 2001 Dodge Durango which gets about 12 miles to the gallon. It is a big truck. I like it. But, it is not the most fuel efficient auto on the road nor do I have all the industry gadgets my friends have in their new hybrids.

Obviously, if I replace it with a new hybrid, I would get 40 or more miles per gallon. And, I would benefit from newer technology such as a built-in GPS, alert systems for parking and backup monitors. I would also have a dashboard system to keep me focused on fuel efficiency. Cool stuff. I could keep my Dodge running on 87 octane fuel for another ten years - assuming gas stations continue to sell it.

It would not cost me as much as a full replacement to just keep my Durango and maintain it. I do have another 20,000 miles on my powertrain warranty. Yet, I will eventually conclude that the utility of keeping the Durango is outweighed by the call to upgrade, given the draw of new benefits and the cost of not having them.

As computer systems have evolved from standalone applications to the cloud, software architecture has evolved significantly. How systems should be developed and deployed to serve the evolving expectations of the higher education market, or for that matter the Internet world, is a matter of contrasts.

As we look back over the decades of software development, we have seen systems age and new products emerge to replace them. It is usually the competitive landscape that drives improvements in the market by the threat of replacement. Yet, obsolescence has been buffered by software and system maintenance agreements. We live in an incremental world.

Part of the challenge for software developers and implementers is deciding when to re-engineer or re-design the basic framework of applications and create a new generation. Do clients receive the new generation replacing the old? Or do they need to re-license? The age of ERP and huge monolithic systems is giving way to a more component-based architecture. As this movement continues, it gives us (the software industry in general) the opportunity to decide how best to replace application frameworks and designs so they can address the changes and innovations desired by a new user base.

It is great to see a product generation live ten, twenty or more years and pay back the investment. But hidden beneath the surface of aging systems are the software architecture and assumptions that drove the development and design in the first place. As such, many older software products are stuck in a time capsule and are protected by their authors in the marketplace with software maintenance agreements and restrictions on integration.

Vendors derive a great deal of revenue from the services to band-aid integration and to service the complexity. They embed duplicate code to support validation and work flows outside their original designs. The lack of code reuse, and the duplication of code to handle the ever-growing expectations of linking to the Internet world, taxes the IT investment so much that the general user community is exhausted and has all but given up hope that computer automation can make their jobs more relevant by embedding links that enable networking, collaboration and teamwork.

Part of the challenge we face today connecting stakeholder systems can be directly linked to the unintended consequences of systems and tools not designed for the 24x7 online web service world. Many legacy products deployed and utilized by higher education institutions were initially standalone or offline from students and faculty. If you date the software's origin, you can pretty much predict the behavior of the author or vendor or implementer.

Vendors and authors can claim success by pointing to a large install base. They can also demonstrate functional value, for what a product was designed to perform under the controlled constraints of a demonstration. Yet, many of these same systems were built with a single monolithic layer where the business logic cannot be separated from the user interface, integration and work flow.

Thus, code is duplicated throughout the products to handle multiple functions, because there is no business logic layer supporting reuse. This drives the costs higher for the vendor and the client. This makes integration and interoperability expectations near impossible to address in the 21st century online world.

Application software systems written in COBOL or BASIC, for instance, two or three decades ago were designed for the batch era and lack connections to online web forms. Some have been transformed with new front-ends, but the basic simplicity of their software architecture still remains. This explains the protective and often proprietary resistance of vendors and authors trying to hide or control the limitations under the surface.

Vendors and authors generally behave the same when they are sitting on aged systems. It may be good to have antiques in your house, but that is not the best approach in today’s online competitive world. Students and faculty have high expectations for online systems. To utilize antiquated software systems built for a bygone era is just foolish since the interfaces will hamper the value, intent and use.

Many vendor/client relationships restrict users and partners from interacting with software products directly through alternative interfaces called APIs (application program interfaces), often because such interfaces cannot even be developed. The logic layer is not callable because it was never segregated in the first place. Data interchange directly between products of different eras is compromised by proprietary interests – in other words, by the protection of the installed base. Application interface standards are not supported, and thus integration takes a back seat to supporting functional perspectives limited by roles and use.

In more detail, software has business rules governing how data is validated as part of complex transactions, from inserting to updating the data managed by an application. Information generated by these systems is also constrained when other applications are not allowed access through query or request. When the validation code rests in the user interface, because that is where many vendors and authors put it, integration frameworks can't call it to process alternative forms of entry. Therefore, the ability to support new import and export requests using web services today is severely limited by legacy architecture. Batch interface choices can be deployed, but that requires the duplication of code and increases support costs.
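
As a minimal sketch of the contrast, assume a hypothetical enrollment record with invented field names and rules (this is Python for illustration, not any vendor's code): when the validation rule lives in its own callable layer, both a screen handler and a web-service import can reuse it; when it is buried in the user interface, the second caller has to duplicate it.

    # A validation rule kept in its own business-logic layer, so any caller
    # (a screen, a batch import, a web service) can reuse it instead of
    # duplicating the rule. Field names and rules are hypothetical.
    def validate_enrollment(record):
        errors = []
        if not record.get("student_id"):
            errors.append("student_id is required")
        if record.get("credits", 0) <= 0:
            errors.append("credits must be positive")
        return errors

    def save_from_screen(form_fields):
        errors = validate_enrollment(form_fields)      # the screen calls the rule
        return {"ok": not errors, "errors": errors}

    def save_from_web_service(payload):
        errors = validate_enrollment(payload)          # reused, not duplicated
        return {"status": 400 if errors else 200, "errors": errors}

    print(save_from_screen({"student_id": "A100", "credits": 3}))
    print(save_from_web_service({"credits": 0}))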

Trying to fix this is near impossible while staying in place. Software vendors, developers, and implementers rely on software maintenance to support their products, but also siphon off the revenue to fund new development at a slow pace. Cannibalizing and shifting products from one generation to the next is very challenging. Moving the installed base forward is also challenging. These areas of friction deter the advancement of systems overall and slow down the adoption cycle of systems that could address new expectations but can't get, or are not provided, interfaces to the legacy or older products.

Thus, we have seen legacy products retain their own technology stacks for a long time and get away with not addressing the integration expectations of the market, because the service revenue generated clouded their vision. Component add-on products follow the same path. Vendors with nice tools developed in the 80's, 90's and after the turn of the century have dramatically different architectures, which relates back to how easily they can be implemented and fit into a Student System or ERP.

Having to re-write any component would require front-end investment and the hope of recouping that investment from the target market – which may be too small. That is pretty risky, since installed-base behavior works against massive replacement across higher education or any market like it. Thus, we are in a quandary. What do we do?

Well, for one, let's call on all software authors and implementers to begin to realize that holding on to aging systems is only going to make things worse for business models dependent on software maintenance. It does not matter whether the software is home grown, open source or commercial. The challenge remains. Just because software is virtual and not a physical artifact does not mean it should not be seen as an asset that depreciates and needs to be replaced once its life cycle is impacted by architectural alternatives.

If a system ages beyond five to seven years in the Internet universe, it will mean obsolescence, and the installed base will limit re-engineering.

Take a look at the business model supporting the application or system. It has a built-in incentive to avoid re-writing and re-architecture. Maybe it would be better to focus on a subscription model with annual, predictable revenues that times out. The time out, like an office lease, must be extended or it is time to move. In the extension, would you not want to re-do the carpets, paint the walls, and maybe correct the office layout, given the changes in your organization? I think so, and that is something we should consider.

Hardware does not have the same challenge, since we have a built-in pattern of replacing desktop and server hardware over a life cycle of three to four years. With virtualization, we even buffer the hardware impact by using an abstraction layer to soften the implications. The mean time between failures, or duty cycle, forces us to replace. But in software, we don't have such limitations imposed by physical stress and use. We just accept that we can't do things and live with the inefficiencies.

So, in conclusion, just as Detroit extended the life of its products in the market by building better cars, it lost sight of the need to motivate replacement and to tackle a saturated market full of cars that were good enough. This is so similar to our dilemma in the software industry in general and higher education specifically. The Academy is weighed down by aging systems and applications (clunkers that need to be replaced). But replaced with what is the next question. Given differentiation and product choice, I don't see enough value in replacing the present architectures, since the community of vendors and software implementers have little in their new products that would drive greater value. This will change. It has to.

If you are a software author, developer or implementer, now is the time to heed the lesson of fifty years of software development and life cycle management. We all expect a new generation of software systems designed as components sharing the cloud. These new systems will adopt loosely coupled integration technologies to provide plug and play value for users who consume their services. This requires the adoption of common specifications for how interfaces must work in a community of applications coming from different authors or vendors. And, this gives us all the responsibility to follow these specifications, help stimulate them and push to get others to do the same. These same specifications would also help drive new innovations that will eventually create new market opportunities for developers and authors to satisfy - which is good for everyone. In the end, that will produce a better market for all innovators and for those dying to get much better value for the dollars they spend on technology.

Thursday, March 5, 2009

Working in tight spaces since 2001

Growing up in the sixties, through social turbulence, the war on communism and the call for the Great Society, I recall the impact of the movie 2001: A Space Odyssey when I saw it in 1968. Not many movies captured my imagination or had an impact on my life the way that movie did. Since then, there have been many movies with a computer at the center of the plot. From The Matrix and Independence Day to WarGames, they showed that computers were not boring. The power of technology has always been portrayed in movies as a force humankind can harness, but there are risks.

The conversations between Dave Bowman and HAL the computer were the seeds of dreams for many. Watching the movie was like a surreal journey through space and time. The movie took three hours to watch, but its impact has been felt over my lifetime. The implications of life out there, our future, and how computers would evolve and be an integral part of society by the year 2001 were mesmerizing and discomforting. It launched conversation after conversation about the implications. At thirteen, I wondered how computers might impact my life. And our society.

The first generation systems evolved from scientific and experimental computers like the ENIAC and the Atanasoff–Berry Computer (ABC). They had limited capacity, yet showed the promise of digital computing all the same. They were nowhere close to the power of the HAL 9000. Companies and governments invested in them. Funding for the space race and landing a man on the moon was in the news almost every day. And, in the late 60's, it was not that big of a stretch to think that by the year 2000 we would have computers talking, understanding, and directing many tasks humans would desire.
 

Well, if we could land a man on the moon, how hard would it be to have a computer talk and understand English? I believed that in my lifetime we would have robots to perform menial work or operate in space, doing things too risky for humans. We would have communication systems translating languages I could not understand. And we would have the ability to connect wirelessly with video transmissions here on earth and in space travel.

Yet, given where we were in the sixties and early seventies with the development of computer systems, operating systems and software languages, it was clear it would take a whole lot of small steps to get there. The HAL 9000 was a generational computer dreamed up for the movie. Given that we were working on second generation computers at the time, like the IBM and Burroughs mainframes, coding line by line using primitive languages, the idea of getting to the world 2001 imagined seemed out of reach and unrealistic the more I wrapped my head around the vision. There were just too many steps and unknowns for me to think about.

Programming in machine language was very limiting. The first generation computers did not have high level languages like BASIC, C, Java, COBOL or FORTRAN. All programming tasks relied on machine languages, which placed instructions in sequential form, usually on punched cards or paper tape. Even in the movie 2001, HAL's programming and memory banks were massive; they took up rows and rows of space in the ship, much like the computers of the 60's and 70's, whose memory and CPUs were not yet reduced to silicon chips and whose earliest ancestors predated practical transistors and core memory. Yet Dave Bowman had to work in a tight space to alter or adjust HAL, much like programmers had to cope with the limitations of the computers of that day.
 

The transition from machine language to symbolic assembly languages changed the landscape. COBOL and FORTRAN were developed with compilers that would create the assembly code for you. Machine instructions were abstracted. Yet, we still danced with Hex Dumps and linkages back to machine instructions to decipher how to correct a program’s logic.

The second generation mainframe vendors were Burroughs, Control Data, GE, Honeywell, IBM, NCR, RCA and Univac, otherwise known as "IBM and the Seven Dwarfs." After GE's and RCA's computer divisions were absorbed by Honeywell and Univac respectively, the mainframers were known as "IBM and the BUNCH."

Today, with huge amounts of memory to spare and virtual memory spaces to slobber over, I think of how much energy I devoted to managing variables, reuse and the artifacts of pointers just a few decades ago – before memory became so cheap. Many stories popping into my head distract me from what I want to express in this pass. Magnetic core memory was one of those inventions that radically changed the economics, moving computers from university laboratories to commercial viability.

The earliest work on core memory was by Dr. An Wang and Dr. Way-Dong Woo, two Shanghai-born American physicists. Both were working at Harvard University's Computation Laboratory in 1951. Looking back, Harvard was not interested in promoting inventions created in its labs. Instead, Wang was able to patent the system on his own while Woo took ill. A lesson for all universities.
 

Dr. Wang's patent was not granted until 1955 – the year I was born, and by that time core memory was already in commercial use. This started a long series of lawsuits, which eventually ended when IBM paid Wang several million dollars to buy the patent outright. Wang used the money to fund Wang Laboratories, which he co-founded with Dr. Ge-Yao Chu, a schoolmate from China.

Not many of you will recall the heyday of the 70's, when Wang Laboratories, Texas Instruments, Prime Computer, Basic Four (also known as MAI), Datapoint, Hewlett-Packard, and Digital Equipment competed for the minds of system managers and programmers comfortable with the mainframe world. Downsizing was not the term we used. But these third generation computers evolved with the use of the transistor and the core memory introduced by Dr. An Wang, the founder of Wang Laboratories, two decades prior.

Later in my career, I had a chance to meet Dr. An Wang in Lowell, Massachusetts, when Wang Labs sponsored a gathering of their value added resellers. It was around 1985, I recall, when my young company was developing an administrative system called ABT Campus on the Wang 2200, a small mini-computer that could support up to sixteen workstations and be clustered up to 240. The Wang 2200 used fixed memory partitions of up to 64K, which meant that the entire application had to run within that space or call other partitions that held global routines and variables.

It was exciting to meet the inventor of core memory and the founder of Wang Laboratories. At the time, Wang was a $3B empire selling throughout the world. Dr. Wang was addressing 2200 enthusiasts – mostly resellers who developed interesting applications linked to Wang's word processing software. My company was one of those resellers, and that is how I was invited to Lowell, Massachusetts.

Another thing I recall is that Dr. An Wang led the team that also developed the computer SIMM, the means by which memory makers today layer chips on small pluggable boards placed on the motherboard. This saved real estate on the motherboards and allowed for higher capacity systems. Dr. Wang was a very inventive person, not bound by the logic of delegation, it seemed. He was hands on. He shook everyone's hand in that small meeting. And I think Dr. Wang stayed that way throughout his entire career until his death.
 

Compared to what we work with today in programming and computer systems, whether for Windows or the Internet or for other platforms, most of the limitations placed on memory utilization and reuse are now forgotten in the distant past. If Stanley Kubrick had made the movie 2001 today, he would have shown HAL the size of an iPod, and the interface would have been telepathic, not verbal.
 

Yet the exercise of drawing on my memories of movies like 2001 and recalling what I experienced and witnessed since 1968 has helped me understand the importance of optimization and balance. We have limitations, and we should manage our resources wisely no matter how much space we have.

Wednesday, March 4, 2009

Where would I be if I had not learned WATFOR?

The artifacts of software are explicitly linked to the evolution of computers, or CPUs. If there is a CPU (Central Processing Unit), then there must be software instructions governing the CPU. The instructions, compiled or assembled into executable tasks, are what we generally call software, in contrast to the physical components of the CPU, called hardware. There are various layers of software and hardware that make up a computer system. The interaction of hardware components is also controlled by software, building on specialized instructions to manage input, processing, storage and output.

Whether it is a CPU utilized to control the flow of air in a building or a generalized CPU used for desktop computing, software is by far a very expansive path to study. To uncover and resurrect past software (or ruins) and its contribution to the evolution and innovation of the computer industry - in a few short years - is challenging enough, given the nature of our minds forgetting things and tossing out what we no longer need or use. Computers live lives shorter than dog years. In a few short years, a computer once thought of as the greatest and best is on the trash heap, forgotten. Because of this, much of the history and many of the artifacts of software are left in rubble in the back of our minds, lost unless we bring the stories out and share them.

Without the CPU and hardware, software would have no purpose and would not exist. As I explore my own personal stories growing up with computers over the last five decades, and those of my friends, it resonates how fortunate I have been to witness, and be a part of, the explosion of innovation – driven by economics, or what some would call sheer capitalism. The drive to create improved value over previous works of art powers the computer industry and much of technology exploration, in the hope that each new innovation brings us purpose and usefulness, which in turn derives value.

I carry around with me - like impressions of art scattered on the wall of my life - a diverse set of stories that brings me to the present and propels me to the future. The stories have varying colors, mediums and textures that bring to light the impact of what I was seeing, feeling and absorbing through the years. Some are deep rooted and moved me to take a new direction. Some are really boring to bring up. Some are quite interesting and filled with lessons learned. Many are humorous, since we cannot take ourselves so seriously in a world that is always topsy-turvy.
 

Almost every day, thoughts come to mind relating to my personal history and the evolution of modern computers and software over the last half century. This has led me to write down some of my stories and, in part, share them on Software Ruins as a means to collect the artifacts of the virtual world that are not with us any longer. My son will never need to prepare paper tape or punched cards to read into a computer. My daughter will never have to play with a floppy disk. Yet both will utilize the descendants of the computers and software that leveraged those devices in the migration we call time.

This is not meant as a history lesson – so don’t worry. But, looking back, and diving into the past, offers the opportunity to reveal and share with retrospect and insight. Given what we have witnessed and experienced, one can only imagine what our future holds. Most of the time, we are not thinking about events relative to their impact on us or our community or our world as they occur or soon after. It often takes years to reflect and ponder the subtle implications. I guess that is one of the benefits of looking backward and relating it to what transpired.
 

My father worked for GE before I could remember much. He moved around a bit between the commercial and space divisions as they focused on different initiatives in New York and Pennsylvania. In the 50's, he was part of a team inventing magnetic tape drive assemblies. They were external storage devices that radically changed how second generation mainframes evolved past the physical memory limits of their ancestors just a decade earlier. I guess I inherited my father's engineering and inventive side. My mother was the artistic and nurturing spirit who always told me I could do anything I put my mind to. She was the one who gave me the comfort to think outside the box.

My first introduction to computers came when I took my freshman course in FORTRAN. It was a required course in 1973. The class was in one of those huge lecture halls where 300 or so students sat in the auditorium and we listened to the professor talk about the history of computers and how relevant it was to our day. My class involved writing simple FORTRAN programs to solve mathematical problems using punched cards. It was not one of the coolest courses. I struggled and persevered through it. It was only one semester anyway.

Looking back, I can now see further insights into how the computer industry was evolving. It is not savage. But, there is a form of cannibalization and efforts orchestrated to dominate competitors through any means possible, including using the legal system to fight patents and inventiveness.
 

The evolution and drive to address the obstacles of commercializing computers begins and ends with adoption. As a consumer back in the 70's, I saw no relevance in what I was learning about a technology that was still evolving. It was early in the adoption cycle, where the masses sit back and wait until others, more adventurous, explore the unknown. Computer memory and capacity limitations played out over decades - not one semester. Software, operating systems and hardware evolved incrementally to serve commercial markets and overcome their obstacles. There was no big boom after which computers were suddenly everywhere. It was a long stretch of time that mirrored Darwin's theory of evolution.

FORTRAN was an early language developed by IBM. It was designed for numerical computation, compared with COBOL, which was designed for character-based data, both as alternatives to machine-level programming on IBM mainframes. From what I understand, IBM funded the University of Waterloo to develop WATFOR in 1965 for the IBM 7040. By the early 70's, Drexel University was using the WATFOR compiler as the basis of its Introduction to Computers course. The WATFOR compiler was popular on college and university campuses.
 

The employees of the University of Waterloo, led by Ian McPhee, spun out to form a compiler company in the early 80's. They developed other products, including WATCOM APL, BASIC, COBOL, PASCAL, FORTRAN and WATCOM SQL, on a range of hardware platforms. It was an impressive array of work for a small company. They moved down to micros and embedded applications with their WATCOM assembler. The Commodore SuperPET used WATCOM as the basis of its software. And, in the late 80's, WATCOM introduced a C compiler for the IBM PC, competing with Borland and Microsoft.
 

C was originally developed on UNIX at Bell Telephone Laboratories in 1972 to write system software independent of the underlying hardware. The C language incorporated some of the syntax and semantics of other languages, including ALGOL, FORTRAN and COBOL. As a compiler and language, C was designed for portability, with features supported by general purpose CPUs, along with modularization, structure, and code re-use. Before I got into C, I learned ALGOL first on the Burroughs 5500 and fell in love with writing software. I soon moved to PL/1 and then C as I worked on Prime Computers running UNIX. That is when I realized my calling was in developing and architecting software systems.

I look back on my programming start with WATFOR. How ironic things seem to cycle back to where one starts. Fast forward twenty years. It was 1993, and my software company was launching a second generation campus administrative system called ABT PowerCAMPUS to replace the first generation ABT CAMPUS product line that was running on Wang 2200, Digital Vax and PC LAN Networks.
 

We chose to utilize Sybase SQL Server and PowerSoft's PowerBuilder as our primary development platform. This was a ground-up project. We were starting from scratch. We also wanted to support MS SQL Server and Novell SQL in the early stage of our development. Our attempt was to build a portable front-end and back-end, but that ran into difficulty when dialect differences overwhelmed us. So, our primary focus was on Sybase and PowerSoft. Our goal was to develop a stable product by 1995, and MS SQL Server was not well regarded at the time. We played around with version 4.2 and had to wait until version 4.9 before it became more stable and provided the functions we needed, including stored procedures.
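
To give a feel for what I mean by dialect differences, here is a tiny, hypothetical illustration in Python (the table and column names are made up, and this is not our product code): even a simple "first ten rows" request needs different SQL text depending on the back end, and those differences multiply across an entire application.

    # Hypothetical queries only; table and column names are invented.
    # The same request written two ways, one per SQL dialect.
    QUERIES = {
        "transact_sql": "SELECT TOP 10 last_name FROM students ORDER BY last_name",
        "limit_style":  "SELECT last_name FROM students ORDER BY last_name LIMIT 10",
    }

    def first_ten_students(dialect):
        if dialect not in QUERIES:
            raise ValueError(f"no query written for dialect {dialect!r}")
        return QUERIES[dialect]

    print(first_ten_students("transact_sql"))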
 

Now, let's step back a bit. Microsoft bought the Windows-based SQL Server code in 1993 from Sybase and dissolved the revenue-sharing partnership with Sybase that had started in 1988. The split created two competing code bases on the same NT platform, with a common ancestor in Transact-SQL, but gave Sybase a small war chest to acquire companies and diversify. Both were competing for the growing client/server database market against IBM, Oracle and Informix, still on UNIX or mainframes.

While in the midst of starting our PowerCAMPUS project, WATCOM International was acquired in 1994 by PowerSoft. Then, a short year later, Sybase acquired PowerSoft in 1995.

Remember, Sybase was the author and developer of SQL Server, as I mentioned earlier. Its popularity was mostly on UNIX. Yet a growing base of corporate PowerSoft developers remained loyally aligned with Sybase SQL Server on NT, because the dialect variations and stability issues I mentioned earlier kept everyone away from adopting Microsoft's competing version. Product maturity has a way of helping one retain a client base. Lesson learned.

As Sybase absorbed the PowerSoft and WATCOM technologies, they incorporated the C compiler into the PowerSoft PowerBuilder toolkit, and WATCOM SQL was renamed SQL Anywhere, a desktop, single-user database. Soon, we were compiling PowerBuilder objects into generated C code and executables, replacing the interpreted BASIC-like code. Performance improved. And we were utilizing the standalone SQL Anywhere to develop and QA our product.

So, from the start in 1973, I learned programming using WATFOR - a simple FORTRAN compiler. And twenty years later, I was reintroduced to WATCOM technology that had evolved with the industry, at the same time I was trying to navigate the nuances of who was doing what to whom and to pick which vendors and products to utilize in my efforts to create value for my niche market, given the nature of how technology companies follow the laws of Darwin.



Innovate or perish.

Tuesday, March 3, 2009

Being first to market, ARCnet

I'm going to date myself. How many of you out there remember ARCnet? In 1976, I was attending Drexel University. As an early geek, I lived in the computer center on the basement floor of the student center, rarely attending class. My days were filled with experimenting with everything from dumb terminals connected to early mainframes, minis and micro computers to teletype systems with paper tape or punched cards. Drexel gave me wonderful exposure to technologies and early devices such as the IBM 5120, the Tektronix plotters and the Datapoint Databus CPUs, to name a few. Drexel also had a Burroughs 5500 mainframe and an RJE terminal to UNI-COLL, a shared non-profit that hosted an IBM 360 at first and then an IBM 370 mainframe for some thirty higher education institutions at 34th and Market Street in Philadelphia. It was part of the Science Center back in the day of timesharing. I will cover some stories on UNI-COLL another day. This day, I will reflect on ARCnet.

As a student, I got into everything. Nick Demaio, the Drexel Computer Center IT manager, grew fond of my skills and one day gave me the job of looking into the Datapoint Databus CPUs employed in the Admissions department. I recall a group of CPUs connected together; they looked like hybrid office machines - part printer, part terminal, part CPU. They were used by administrative operators in Admissions to follow up on inquiries and applicants. In other words, they were early word processors, designed to support text letter management and the data files storing all the names and addresses of people who contacted the University. The applications were developed by Michael McCabe, who was a few years ahead of me. The COBOL-like language was used with a compiler, and the routines developed were split across the multiple CPUs. Multiple CPUs? Everything I had been exposed to before centered around a single CPU, or a system built with multiple CPUs but managed by a shared operating system like MVS on the mainframe.

What intrigued me was the connection between the "micro" computers. It was a coax cable on the outside, connected to the back of the chassis, far more advanced than the punched card readers and paper tape readers we had in the computer center. The applications were developed and managed with no release management, check-in and check-out, or any thought about architecture. The programming to support the Datapoint applications also had to incorporate updates of the OS (operating system) and device driver updates for the plug and play expansion boards. Each board included a set of disks. And the system software had to be loaded in ROM and made part of the boot sequence. Part of the responsibilities of the computer center was to provide ongoing maintenance of the Datapoint system as new releases were received on 8 inch floppy disks from Datapoint.

That was where I came in. Nick Demaio asked me to update the system one day. That is an oversimplification. I had to learn how to install a set of patches and to re-compile the COBOL-like programs Mike McCabe developed. There is an art to developing new solutions, just as there is an art to figuring out things someone else created. In the process, I consumed the manuals and technical release papers on ARCnet and began dreaming of all sorts of applications that could connect computers together. There was little documentation on the actual application Drexel used - maybe twenty or so pages written into ASCII text files. But I figured it out.

Reverse engineering was a term coined years after I was working on these Datapoint systems. Back then, the data files were localized on one CPU as it served the others. Multiuser support had to be handled with file and record locking. ARCnet introduced new challenges, in that use of the tables was no longer under the session control of a single CPU. On top of that, ARCnet had no central CPU. The hub of computers shared an external backbone over the 2.5 Mbit/s communication channel offered by the coax cable. Datapoint's incarnation of ARCnet was more like peer-to-peer networking. Do you recall NetBIOS? ARCnet was the forerunner of what Microsoft fashioned a decade later and pushed for a long time. We still have NetBIOS installed over TCP/IP on most networks today. Some of you may be familiar with that. Most of you don't recall ARCnet because it was the first real venture into local area networking (LANs) in the late 70's, before Novell picked it up and supported it as a topology. It was before Banyan Vines. And it was before Ethernet and Token Ring came on the scene in the early 80's as competitive threats.
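
As a rough illustration of why locking mattered, here is a small sketch in modern Python, not Databus; the file names, the record format and the retry policy are invented. One workstation at a time gets to update the shared record, and everyone else waits.

    # A crude lock-file approach: one writer at a time per shared record.
    # Real systems of that era relied on the OS's own file and record locks;
    # this only illustrates why some form of locking was needed.
    import json, os, time

    def with_record_lock(lock_path, update_fn, retries=50, wait=0.1):
        for _ in range(retries):
            try:
                fd = os.open(lock_path, os.O_CREAT | os.O_EXCL | os.O_WRONLY)
            except FileExistsError:
                time.sleep(wait)            # another workstation holds the record
                continue
            try:
                return update_fn()
            finally:
                os.close(fd)
                os.remove(lock_path)        # release so the next terminal can write
        raise TimeoutError("record is locked by another user")

    def bump_applicant_count(path="applicants.json"):
        data = json.load(open(path)) if os.path.exists(path) else {"count": 0}
        data["count"] += 1
        with open(path, "w") as f:
            json.dump(data, f)
        return data["count"]

    print(with_record_lock("applicants.lock", bump_applicant_count))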

Token Ring from IBM took the same path as ARCnet. I worked on that too. IBM controlled the technology and did not want other vendors to tell them what to do, so I would call it a pretty closed system. As a result, Ethernet, which was pushed by Xerox and Digital, took off, because it eventually ran over twisted copper pairs and was open for all to support, from Apple to Zenith. What limited ARCnet, Token Ring and the other network topologies of that day? Most likely, the reason none of us use them today was the reluctance of Datapoint to support anything other than RG-62 coax. The engineers thought it was the best way to send signals without interference. From a business perspective, they also thought they had the market cornered, and, flush with capital in the early 80's, they faltered when their senior management and the SEC clashed over revenue recognition, throwing the company into a tailspin it never recovered from.

Meanwhile, IBM and others who resisted open standards and market forces believed Token Ring was their best way to connect computers and devices even with their bulky cables. All their hardware offered Token Ring adapters and none of it connected with Ethernet or ARCnet. That is another story.

The lesson learned from ARCnet and IBM, at least from a marketing perspective, is that rather than sticking to your guns and trying to control the evolution of a technology, one is often better served by adopting an open market perspective. By opening up Ethernet as a platform commerce could expand on, we all benefited, and we now have the Internet, home computers and a connected world.

If you want to read more about ARCnet, check out
http://en.wikipedia.org/wiki/ARCNET

Sunday, March 1, 2009

The Archaeology of Software

Software can be thought of as the building blocks of a virtual universe – an integral and organic part of computer systems, complementing the evolution of hardware.

Much like the early inventions of mankind, software has evolved rapidly through generations of discovery, use and decline. The logic and architecture of programming amount to training a computer with steps of instructions. The interactions are controlling: input, storage, processing and output commands are componentized. Generally, we think of software as technical and abstract. Yet software is a language and a form of communication all the same - between human and machine, machine and human, or machine and machine. Software is similar to virtual life, reflecting species and genetics shaped by Darwin's theory of evolution.

My work is not an exclusive list or complete compendium. My observations, and the relics of software companies and remains gathered from my 'digs', are like the study of archaeology, though compressed into a few short years that have felt like centuries. What remains of software is often hidden under the surface, away from our sight. Decisions long ago forgotten reveal how software, much like early civilizations, moved around feeding on the landscape, impacted by climate, density, food and predators outside its control.

Software is a medium of communication, much like writing on papyrus or paper or punched card or computer screen. And, it deserves to be preserved or understood like the artifacts uncovered in a dig. Where does it fit? How did it impact the day? Who were the users? How long did it survive? What is the ancestral tree?

So, this blog is my venture to do just that: to develop a virtual museum of sorts where I can scavenge for the remains and attempt to describe software's history and placement, or at least some of it.