Winning success from Babbage's failure


The year 1991 saw the successful conclusion of one of the most remarkable enterprises in the history of invention. A team of engineers at the London Science Museum managed to complete the construction of a cogwheel computer that had been designed, but never built, 170 years earlier by the Victorian scientist and mathematician, Charles Babbage (1791-1871).

During his lifetime, Babbage insisted that his cogwheel brain, which he named the Difference Engine, would be an ultra-reliable cogwheel calculator for printing mathematical tables devoid of any risk of human error. His efforts to build it, however, met largely with ridicule, and Babbage died a disappointed man in 1871.

Why couldn't Babbage build a Difference Engine in his lifetime? Today, with the benefit of hindsight, we know that the biggest problem that stymied him was the lack of a precision metal industry. It wasn't that the overall design of his machine was faulty, or that cogwheels can't be computer components (they can, though there's no denying that electronic components do the job a whole lot better). The successful modern initiative to build a Difference Engine deliberately used components no more precisely engineered than Babbage himself could have produced back in the nineteenth century. However, the 4,000 components - most of them cogwheels that needed to be as alike as possible - were manufactured for the modern build using modern industrial processes, which ensured consistently high-quality components made to a standard pattern. Therein lies the difference between the small-scale craftsman's approach to building precision components that Babbage was obliged to adopt, and the modern industrial method used by the Science Museum's engineers.

Today, despite his ultimate failure, Charles Babbage is regarded as the father of computing. His reputation continues to grow in proportion to the importance of the computer in our high-tech age. It was somehow highly appropriate that in the year 2000, Microsoft's former technology director, Nathan Myhrvold, took personal delivery of a full-scale modern-built Difference Engine - only the second such machine ever built. Myhrvold had sponsored its construction so that it could become the ultimate conversation piece in his home.

In the arcane world of cogwheel computing, the 170-year battle between the crafts approach and the industrial approach to making components for Babbage's Engines was decisively won by the industrial approach. Today, in the enormously important and influential business of computer software development - a business which, in a very real sense, has made the modern world possible - the craftsman-like way of doing things has long prevailed. But its day may be coming to an end, to be replaced by an industrial way of designing software that is faster, less risky, less expensive and more efficient.

There is widespread agreement throughout the computer software business today that software development as it is currently practised is in a bad way. As Jack Greenfield, an influential software architect at Microsoft, points out:

Software development... is slow, expensive and error-prone, often yielding products with large numbers of defects, causing serious problems of usability, reliability, performance, security and other qualities of service.

Greenfield goes on to quote research from the Standish Group, a leading computer industry research house. According to this research, businesses in the United States spend about $250 billion annually on software development, spread across about 175,000 projects.

The research also indicates that only about 16 percent of these projects finish on schedule and within budget, while another 31 percent are - on average - cancelled each year, mainly due to quality problems, with overall losses of about $81 billion. A further 53 percent exceed their budget by an astonishing average of 189 percent, incurring a total loss of about $59 billion. The same research suggests that even those projects which do get completed deliver an average of only about 42 percent of the originally planned features.

These figures are disconcerting, to put it mildly. And certainly, anyone with practical experience today of being involved in software development, whether as a software developer or as a client, is likely to testify to the sheer difficulty and stress of the convoluted progress - or lack of it - which these projects undergo.

As well as the problems unearthed objectively by the Standish Group research and capable of corroboration by extensive anecdotal evidence, there is also the brutal fact that software development is extremely labour-intensive. Indeed, Jack Greenfield suggests that software development is consuming far more human capital than we expect of a modern industry.

The whole point of an industry is that it brings a concerted and methodical approach to the task of creating something, or of putting raw materials through a particular process to achieve a final result. A key definition of industry in the Shorter Oxford English Dictionary is "systematic work or labour".

But in so many cases, software development is not systematic. Instead, it is a strangely haphazard matter, involving methods, approaches and outputs that have far more in common with the brilliant but ill-equipped genius Babbage painstakingly struggling with a few hired craftsmen to manufacture precisely-made cogwheels than with anything resembling a modern industry. No wonder the track record of modern software development is so, well, unsuccessful.

What about people who might say: all right, anybody can criticise the modern software development business, but the fact of the matter is that enough projects have been completed, and have delivered enough of the required functionality, for the world we live in to be an enormously different place from what it was even just twenty years ago, let alone in Babbage's time?

This is, admittedly, a fair point, and certainly there is no denying that despite the shortcomings of the software development business, the products of software development obviously do provide significant value to the organisations that commission them, and the users who benefit from them.

Even as I write, software is being used around the world to control, monitor and make safe literally millions of processes that not even such a far-sighted technological prophet as Babbage could have come close to foreseeing. Drilling taking place ten miles below the ground, civilian and military aircraft flying ten miles or more above the ground, and all the myriad intricate visible and invisible processes that make the modern world what it is, are all enormously dependent on the power, precision and reliability of computer software.

But does this mean that we should be complacent about the effectiveness of modern software development techniques? Certainly not. Instead, all it means is that despite the ever-present practical, logistical and financial problems relating to software development today, computer software remains so highly valued that, by and large, the organisations paying for software development projects are willing to suffer large risks and losses in order to obtain the benefits which software development can offer them. Commenting on the lamentable situation in which modern software development finds itself, Microsoft's Jack Greenfield observes:

While this state of affairs is obviously not optimal... it does not seem to be forcing any significant changes in software development methods and practices industry-wide.

This is undeniable. Ten or even fifteen years ago, the risk factors attaching to software development were much the same as they are today. Indeed, I remember that back in the early 1990s the figure of 16 percent of problem-free software development projects was bandied about much as it is nowadays.

Can things ever get better?

The general sense of dissatisfaction with the way software is usually developed has been felt keenly since at least the mid-1980s. From time to time there have been false dawns of new ways of doing things; new ways that were supposed to constitute revolutionary improvements in how software was designed. However, in most cases the false dawns barely survived to lunchtime.

Object-oriented programming (OOP), for example, was supposed to introduce a new way of managing software development by basing programs around objects - bundles of features associated with a particular application - that could in principle be re-used as and when required to facilitate rapid software development. But whilst OOP is widely used, it didn't fulfil its wider promise; it was more a question of "oops!", because developers didn't find the objects as helpful as their original authors had hoped. Most developers concluded that the objects needed so much modification and customisation that they were not worth reusing, so they wound up building their software from scratch instead - albeit using OOP techniques.
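The reuse idea behind OOP, and why it so often disappointed, can be sketched in a few lines. The class names and figures here are invented purely for illustration; the point is that the "reusable" object ends up almost entirely overridden:

```python
# Illustrative sketch only: an "object" as a reusable bundle of
# data and behaviour. All names and numbers are invented.

class Invoice:
    """Generic invoice object, intended to be reused across projects."""
    def __init__(self, amount):
        self.amount = amount

    def total(self):
        return self.amount  # generic behaviour: no tax, no discounts


class RetailInvoice(Invoice):
    """In practice a project overrides so much that little is reused."""
    VAT = 0.20  # hypothetical local tax rule the generic object knew nothing about

    def total(self):
        # The one substantive method is replaced wholesale.
        return round(self.amount * (1 + self.VAT), 2)


print(Invoice(100).total())        # the generic behaviour: 100
print(RetailInvoice(100).total())  # the customised behaviour: 120.0
```

Once every project's subclass looks like `RetailInvoice` - discarding the parent's behaviour rather than extending it - the economic case for the shared object evaporates, which is roughly what the article describes.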

More progress has been achieved in recent years from new kinds of tools which, rather than seeking to revolutionise software development, aim instead to make developers more productive. A particularly useful tool here has been application development frameworks (ADFs). These are openly available software packages that allow the software for a particular application to be rapidly customised to the precise requirements of the organisation in question. But even ADFs are really just a set of generic building blocks, designed to be used across a wide variety of applications.

Today, the reason why software development is such a laborious, potentially risky and fraught matter is that the way in which software is designed means that opportunities to re-use code (i.e. pre-written programs) are much rarer than one might imagine. Software development mainly takes place using generic programming languages that render every project essentially a bespoke one. There has been little infrastructure, and there have been few markets, to encourage developers to supply potentially re-usable programming components. But maybe re-use of pre-packaged functionality wasn't the way to do it after all, and a different perspective was required.

That different perspective appears to be a concept known as the domain-specific language (DSL). A DSL is a special kind of language designed to be used for a particular application or solution. It is not a new concept to computer scientists but it is one that is now entering the mainstream with the launch of tools that support this type of language.

As one might imagine, a DSL will be all the more useful the more tightly focused the particular application or solution it caters for, because this increases the likelihood that another programmer, building a solution to a similar problem, will be able to use the same features of the language to specify what he or she wants the computer to do.

In fact some DSLs have relatively wide domain remits (e.g. retail banking) while others have narrow remits (e.g. the operation of a retail banking ATM network). Incidentally, Microsoft has recently developed a special kind of DSL whose particular domain is helping developers to design other DSLs! In many ways this is just the kind of catalyst that will see DSLs enter the mainstream, as it will now be possible for organisations (or their suppliers) to define languages aimed specifically at solving problems within their own business domain. I doubt even Babbage would have thought of that.
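To make the idea concrete, here is a minimal sketch of what an internal DSL might look like, written in Python and borrowing the ATM-network domain mentioned above. Every name and rule here is invented for illustration; the point is that the "program" reads in the vocabulary of the domain rather than in generic code:

```python
# Hypothetical internal DSL for ATM cash-withdrawal rules.
# All class names, methods and limits are invented for this sketch.

class WithdrawalPolicy:
    def __init__(self):
        self.rules = []

    def limit_per_day(self, amount):
        self.rules.append(lambda w, taken: taken + w <= amount)
        return self  # returning self gives the DSL its fluent, sentence-like style

    def notes_of(self, denomination):
        self.rules.append(lambda w, taken: w % denomination == 0)
        return self

    def allows(self, withdrawal, already_taken_today=0):
        return all(rule(withdrawal, already_taken_today) for rule in self.rules)


# A domain expert's "sentence": daily limit 500, dispensing notes of 20.
policy = WithdrawalPolicy().limit_per_day(500).notes_of(20)

print(policy.allows(100))                           # True
print(policy.allows(30))                            # False: not a multiple of 20
print(policy.allows(100, already_taken_today=450))  # False: breaches daily limit
```

Another programmer tackling a similar ATM problem could reuse the same vocabulary - `limit_per_day`, `notes_of` - to state a different policy, which is exactly the kind of reuse the article says generic languages fail to deliver.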

DSLs are at present in their infancy, but already they are creating the possibility of the development of a software factory. What is a software factory? A software factory is not some new form of business or a special type of organisation. In essence, it is a set of tools - a development environment, if you will - that is optimised for building solutions in a particular business domain. Hence, a software factory could be used by a traditional software supplier (assuming it adopts the new tools and techniques) or by an organisation's own internal IT department developing applications for that organisation.

A software factory provides solutions not through using pre-written code or following best practice for a particular type of solution (although these can be a part of it) but through the use of a programming language (a DSL) that is inherently designed to write computerised solutions to particular types of problems.
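One way to picture the factory idea is that the domain description a DSL captures becomes the raw material, and the factory's tooling turns it into a working artefact. The following is a toy sketch under that assumption - the spec format, names and generated class are all invented, and real factory tooling would be far richer:

```python
# Toy sketch of one "factory" step: a declarative domain description
# (the kind a DSL would capture) is turned into a working code artefact.
# The spec format and all names are invented for illustration.

SPEC = {
    "entity": "Account",
    "fields": ["owner", "balance"],
}

def generate_class(spec):
    """Generate Python source for a simple record class from the spec."""
    args = ", ".join(spec["fields"])
    assigns = "\n".join(f"        self.{f} = {f}" for f in spec["fields"])
    return (
        f"class {spec['entity']}:\n"
        f"    def __init__(self, {args}):\n"
        f"{assigns}\n"
    )

source = generate_class(SPEC)
namespace = {}
exec(source, namespace)  # the "factory" builds the artefact from the description
account = namespace["Account"]("Ada", 100)
print(account.owner, account.balance)  # Ada 100
```

The developer's effort goes into the one-line description, not into hand-writing each class - the same shift of labour from craft to tooling that the Whitworth analogy below describes for mechanical components.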

Software factories are most emphatically not about selling software packages that provide a one-size-fits-all solution to a particular problem. Instead, software factories acknowledge that everyone may want a different type of product, but - to employ a useful analogy - a factory optimised to build motorcars cannot build houses, or at least cannot do so very effectively. While still allowing individual solutions to be provided, the software factory would offer reliable ways to develop applications rapidly, relatively inexpensively and with lower risk than is seen with generic programming languages and tools. One can draw a direct comparison here with the introduction, in the mid-nineteenth century, of standardised and precisely-milled mechanical components by the great mechanical engineer Joseph Whitworth, whose components and machine tools won international renown for their repeatable accuracy. The impact was felt not only through Whitworth's supply of these items, but because his gospel of precision encouraged everyone to think about solving particular problems using tools built to common standards.

The ultimate aim for software factories would be to remove much of the small-scale crafts element from routine software development and confine such craftsmanship to the really sharp end of the most demanding, leading-edge software development initiatives. This would allow software development to become less expensive, lower risk and far more reliable, while still giving organisations all the scope they need to establish and maintain a competitive advantage from their software. A software factory, in other words, would offer the best of all possible worlds.

Software factories are not here yet. But they are coming, and they will be the new Engine that will truly make the Difference in how software is developed.




Alan Woodward is Chief Technology Officer at the business and information technology consultancy Charteris plc.
