Saturday, August 15, 2009

Software Age - How old is your system? Do you have a clunker?

In our age of energy conservation, the push to replace clunkers is a good example of how policy and advocacy have aligned - at least in the auto industry. I drive a 2001 Dodge Durango, which gets about 12 miles to the gallon. It is a big truck. I like it. But it is not the most fuel-efficient auto on the road, nor do I have all the industry gadgets my friends have in their new hybrids.

Obviously, if I replaced it with a new hybrid, I would get 40 or more miles per gallon. And I would benefit from newer technology such as built-in GPS, parking alert systems and backup monitors. I would also have a dashboard system to keep me focused on fuel efficiency. Cool stuff. Or I could keep my Dodge running on 87 octane fuel for another ten years - assuming gas stations continue to sell it.

Keeping my Durango and maintaining it would not cost as much as a full replacement. I do have another 20,000 miles on my powertrain warranty. Yet I will eventually conclude that the utility of keeping the Durango is outweighed by the call to upgrade, given the draw of new benefits and the cost of not having them.

As computer systems have evolved from standalone applications to the cloud, software architecture has evolved significantly. How systems should be developed and deployed to serve the evolving expectations of the higher education market - or, for that matter, the Internet world - is a matter of contrasts.

As we look back over the decades of software development, we have seen systems age and new products emerge to replace them. It is usually the competitive landscape that drives improvements in the market by the threat of replacement. Yet, obsolescence has been buffered by software and system maintenance agreements. We live in an incremental world.

Part of the challenge for software developers and implementers is deciding when to re-engineer or re-design the basic framework of their applications and create a new generation. Do clients receive the new generation as a replacement for the old, or do they need to re-license? The age of ERP and huge monolithic systems is giving way to a more component-based architecture. As this movement continues, it gives us (the software industry in general) the opportunity to consider how best to replace application frameworks and designs so they can address the changes and innovations desired by a new user base.

It is great to see a product generation live ten, twenty or more years and pay back the investment. But hidden beneath the surface of aging systems are the software architecture and the assumptions that drove the development and design in the first place. As such, many older software products are stuck in a time capsule and are protected by their authors in the marketplace with software maintenance agreements and restrictions on integration.

Vendors derive a great deal of revenue from the services to band-aid integration and to service the complexity. They embed duplicate code to support validation and workflows outside their original designs. The lack of code reuse and the duplication of code to handle the ever-growing expectations of linking to the Internet world tax the IT investment so much that the general user community is exhausted and has all but given up hope that computer automation can make their jobs more relevant, with embedded links that enable networking, collaboration and teamwork.

Part of the challenge we face today in connecting stakeholder systems can be directly linked to the inadvertent and unintended consequences of systems and tools not designed for the 24x7 online web-service world. Many legacy products deployed and utilized by higher education institutions were initially standalone or offline from students and faculty. If you date the software's origin, you can pretty much predict the behavior of the author, vendor or implementer.

Vendors and authors can claim success by pointing to a large installed base. They can also demonstrate functional value for what a product was designed to perform under the controlled constraints of a demonstration. Yet many of these same systems were built as a single monolithic layer where the business logic cannot be separated from the user interface, integration and workflow.

Thus, code is duplicated throughout the products to handle multiple functions, because there is no business logic layer supporting reuse. This drives costs higher for both the vendor and the client, and it makes integration and interoperability expectations nearly impossible to address in the 21st-century online world.
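To make that concrete, here is a minimal, purely hypothetical sketch of the problem in Python. The screen names and the nine-character student ID rule are invented for illustration; the point is that the same rule gets re-typed into every screen because there is no shared layer to hold it.

```python
# Hypothetical monolithic design: each screen handler carries its own copy of
# the validation rule because no shared business-logic layer exists.

def admissions_screen_save(record):
    # Rule copy #1, hard-wired into the admissions screen.
    if not record.get("student_id") or len(record["student_id"]) != 9:
        raise ValueError("Invalid student ID")
    print("Saved from admissions screen:", record)

def registrar_screen_save(record):
    # Rule copy #2, re-typed into a different screen. The copies drift apart
    # over time, and nothing outside the user interface can call either one.
    if not record.get("student_id") or len(record["student_id"]) != 9:
        raise ValueError("Invalid student ID")
    print("Saved from registrar screen:", record)
```

Every new screen, report or interface adds yet another copy, which is exactly the duplication and rising cost described above.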

Application software systems written in COBOL or BASIC, for instance, two or three decades ago were designed for the batch era and lack connections to online web forms. Some have been transformed with new front-ends, but the basic simplicity of their software architecture remains. This explains the protective and often proprietary resistance of vendors and authors trying to hide or control the limitations under the surface.

Vendors and authors generally behave the same way when they are sitting on aged systems. It may be good to have antiques in your house, but that is not the best approach in today’s online competitive world. Students and faculty have high expectations for online systems. To utilize antiquated software systems built for a bygone era is just foolish, since the interfaces will hamper the value, intent and use.

Many vendor/client relationships restrict users and partners from interacting with software products directly through alternative interfaces called APIs (application program interfaces), often because those APIs simply cannot be built. The logic layer is not callable because it was never segregated in the first place. Data interchange directly between products of different eras is compromised by proprietary interests; in other words, by the need to protect the installed base. Application interface standards are not supported, and thus integration takes a back seat to supporting functional perspectives limited by roles and use.

In more detail, software has business rules governing how data is validated as part of complex transactions, from inserting to updating the data managed by an application. Information generated by these systems is also constrained when other applications are not allowed access through query or request. When the validation code rests in the user interface, because that is where many vendors and authors put it, integration frameworks cannot call it to process alternative forms of entry. Therefore, the ability to support new import and export requests using web services today is severely limited by legacy architecture. Batch interfaces can be deployed, but that requires duplicating code and increases support costs.
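Picking up the hypothetical sketch above, the contrast looks something like this: the rule is moved into one reusable function, so the web form, a web-service import and a batch load can all call the same code instead of duplicating it. Again, the function names and the rule itself are invented for illustration.

```python
# Hypothetical layered design: the validation rule lives in one business-logic
# function, and every entry path (screen, web service, batch) calls it.

def validate_student_id(student_id):
    """Single home for the rule; shared by every caller below."""
    if not student_id or len(student_id) != 9:
        raise ValueError("Invalid student ID")

def web_form_save(record):
    # The user interface calls the shared rule instead of embedding a copy.
    validate_student_id(record.get("student_id", ""))
    print("Saved from web form:", record)

def web_service_import(records):
    # A partner system can push records through the same rule the UI uses,
    # so no second copy of the validation is needed for integration.
    accepted = 0
    for record in records:
        validate_student_id(record.get("student_id", ""))
        accepted += 1
    return accepted
```

Once the rule is callable from outside the user interface, an import or export API stops being a near-impossible request and becomes a thin wrapper around logic that already exists.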

Trying to fix this is nearly impossible while staying in place. Software vendors, developers and implementers rely on software maintenance to support their products, but they also siphon off that revenue to fund new development at a slow pace. Cannibalizing and shifting products from one generation to the next is very challenging. Moving the installed base forward is also challenging. These areas of friction deter the advancement of systems overall and slow down the adoption cycle for systems that could address new expectations but cannot get, or are not provided, interfaces to the legacy or older products.

Thus, we have seen legacy products retain their own technology stacks for a long time and get away with not addressing the integration expectations of the market, because the service revenue they generate clouds their vision. Component add-on products follow the same path. Vendors with nice tools developed in the '80s, the '90s and after the turn of the century have dramatically different architectures, which relate back to how easy they are to implement and fit into a Student System or ERP.

Having to re-write any component would require front-end investment and the hope of recouping that investment from the target market, which may be too small. That is pretty risky, as the behavior of the installed base works against massive replacement across higher education or any market like it. Thus, we are in a quandary. What do we do?

Well, for one, let's call on all software authors and implementers to realize that holding on to aging systems is only going to make things worse for business models dependent on software maintenance. It does not matter whether the software is home-grown, open source or commercial. The challenge remains. Just because software is virtual and not a physical artifact does not mean it should not be seen as an asset that depreciates and needs to be replaced once its life cycle is overtaken by architectural alternatives.

If a system ages beyond five to seven years in the Internet universe, obsolescence follows, and the installed base will limit re-engineering.

Take a look at the business model supporting the application or system. It has a built-in incentive to avoid re-writing and re-architecture. Maybe it would be better to focus on a subscription model with annual, predictable revenues that times out. The time-out, like an office lease, must be extended or it is time to move. In the extension, would you not want to re-do the carpets, paint the walls, and maybe correct the office layout, given the changes in your organization? I think so, and that is something we should consider.

Hardware does not have the same challenge, since we have a built-in pattern of replacing desktops and servers over a life cycle of three to four years. With virtualization, we even buffer the hardware impact by using the abstraction layer to shield applications from the change. The mean time between failures, or the duty cycle, forces us to replace. But in software, we don't have such limitations imposed by physical stress and use. We just accept that we can't do things and live with the inefficiencies.

So, in conclusion: just as Detroit extended the life of its products by building better cars, it lost sight of the need to motivate replacement and to tackle a saturated market full of cars that were good enough. This is very similar to our dilemma in the software industry in general and in higher education specifically. The Academy is weighed down by aging systems and applications - clunkers that need to be replaced. But replaced with what? That is the next question. Given the current differentiation and product choice, I don't see enough value in replacing the present architectures, since the community of vendors and software implementers has little in its new products that would drive greater value. This will change. It has to.

If you are a software author, developer or implementer, now is the time to heed the lesson of fifty years of software development and life cycle management. We all expect a new generation of software systems designed as components sharing the cloud. These new systems will adopt loosely coupled integration technologies to provide plug-and-play value for the users who consume their services. This requires the adoption of common specifications for how interfaces must work in a community of applications coming from different authors or vendors. And this gives us all the responsibility to follow these specifications, to help stimulate them, and to push others to do the same. These same specifications would also help drive new innovations that will eventually create new market opportunities for developers and authors to satisfy - which is good for everyone and which, in the end, will build a better market for all innovators and for those dying to get much better value for the dollars they spend on technology.
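As a closing illustration, here is a small, hypothetical sketch of what a "common specification" can look like in code: a contract that components from different authors or vendors agree to implement, so the consumer stays loosely coupled to any one provider. The interface name and methods are invented for the example and do not refer to any actual standard.

```python
# Hypothetical shared interface: any vendor's component that implements this
# contract can be swapped in without rewriting the systems that consume it.
from abc import ABC, abstractmethod

class RosterProvider(ABC):
    """The agreed-upon contract for supplying course roster data."""

    @abstractmethod
    def get_roster(self, course_id):
        """Return a list of student IDs enrolled in the course."""

    @abstractmethod
    def add_enrollment(self, course_id, student_id):
        """Enroll a student in the course; return True on success."""

class VendorARoster(RosterProvider):
    # One vendor's implementation. A portal or LMS only ever depends on the
    # RosterProvider contract, which keeps the coupling loose.
    def get_roster(self, course_id):
        return ["000000001", "000000002"]

    def add_enrollment(self, course_id, student_id):
        return True
```

The value comes less from the code than from the agreement: once several authors honor the same contract, plug and play stops being a slogan and starts being a purchasing criterion.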