The worm in the machine
Why are bugs associated with software rather than with any other technology? Dilip
Ranade ponders the question
We live in an increasingly complex world, but the convenience offered by technology
is offset by the increasing difficulty of making these systems work correctly.
Take credit cards. They reduce the risk of carrying cash, thus making shopping
convenient. However, an estimated US $1 billion is stolen electronically from
credit card users every year in the US.
Aircraft and cars make travel quick and convenient, but a huge number of people
die in car and aircraft accidents every year.
IMPACT OF BUGS
It would be nice if people could build systems that just worked right every time,
but it is not easy. Consequently, many of the systems that are built do not
work properly: they have bugs. Some bugs are merely annoying,
but sometimes their repercussions are catastrophic.
Here are some interesting examples.
An Iraqi Scud missile got past the Patriot missile defence system due to a computer
error; 28 American soldiers died.
An oil and gas producing offshore platform (57,000 tons, $700 million) collapsed
because of a weakness in undersea concrete pillars that were under-designed
due to a bug in the design software.
A European political party lost an election in 2002 because of a rounding error
in a spreadsheet.
The European Space Agency's $1 billion Ariane-5 rocket was destroyed right
after blast-off due to a bug in the on-board guidance computer.
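Two of the failures above were numerical bugs of well-documented kinds: the Patriot system's clock drifted because one tenth of a second has no exact binary representation, and Ariane-5's guidance code converted a 64-bit floating-point value into a 16-bit integer that overflowed. A minimal Python sketch of both failure patterns (illustrative only; the real systems ran on different hardware and in different languages):

```python
import struct

# 1. Accumulated rounding error, as in the Patriot failure:
#    0.1 has no exact binary representation, so repeatedly adding
#    it drifts away from the true total.
tenths = 0.0
for _ in range(360000):          # ten hours of 0.1-second clock ticks
    tenths += 0.1
print(tenths)                    # close to, but not exactly, 36000.0
print(tenths == 36000.0)         # False: a measurable drift has built up

# 2. Unchecked narrowing conversion, as in the Ariane-5 failure:
#    a value too large for a 16-bit signed integer wraps around.
def to_int16(x: float) -> int:
    """Truncate to int and reinterpret the low 16 bits as signed."""
    return struct.unpack('<h', struct.pack('<H', int(x) & 0xFFFF))[0]

print(to_int16(32767.0))         # fits: 32767
print(to_int16(40000.0))         # overflows and wraps to -25536
```

In the Patriot case the drift was harmless over short runs and only became fatal after the system had been up for many hours, which is why it slipped through testing.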
BUGS ARE EVERYWHERE
Bugs today have come to be associated with computers, but bugs, or product defects,
have probably existed since the Stone Age. A product generally goes through
the following phases: requirements specification, design, implementation, testing,
maintenance, and end-of-life.
Bugs manifest themselves at every phase. However, many products are complicated
by embedded software, which is in almost everything nowadays, from spaceships
to toasters. The result is an increase in subtle and unpredictable bugs in many
of our products.
WHY SOFTWARE IS OFTEN BAD
Are we, then, doomed to live an increasingly frustrating and dangerous life because
of bugs cropping up everywhere? I offer some comments on why software is often bad.
The inmates are running the asylum. Users build up a
mental model of a software application as they learn to use it. The model need
not match how things actually happen under the hood. If their model
helps them predict how the application behaves they are comfortable, otherwise
they get frustrated. The problem is that the geeks who build the software really
think differently from ordinary users. They tend to build software which is
very natural to use: for geeks.
Unsafe at any speed. Would you buy a car if the manufacturer
had you sign an agreement like the one below?
This car is provided "as is". The manufacturer
disclaims all warranties, including warranties of merchantability, fitness for
a particular purpose, title and non-infringement with respect to the car and
the accompanying documentation. You assume responsibility for selecting the
car to achieve your intended results, and for the installation of, use of, and
results obtained from the car. The manufacturer makes no warranty that the car
will be error-free or free from interruptions or other failures, or that the
car will meet your requirements.
Well, this is the kind of thing you agree to in the EULA (End User Licence Agreement)
of a software package. It took a Ralph Nader to make cars safe; that is
yet to happen in the software industry.
The Red Queen Effect. "[It] takes all the running
you can do to keep in the same place," the Red Queen says to Alice in Through
the Looking-Glass. It takes effort (that is, money) to craft a good quality
product. But after selling a software package this year, where will next year's
revenue come from? Either get customers to pay for just a bug-fixed version,
or add new features that the customers must buy. The cycle then perpetuates,
because the new version with the great features happens to contain new code, and
new code brings new bugs.
Moore's Law vs. Parkinson's Law. Computer
hardware doubles in power every three years. On the other hand, Parkinson's
Law appears to hold in the software arena: "software bloats to consume
all computer resources". Personal computers have become a hundred times
faster since their introduction. Why then does it still take the same time to
boot them or to start up a program?
I used to suspect a conspiracy between hardware and software manufacturers,
but now I believe this is related to the Red Queen Effect. Software, in its
Darwinian struggle for existence, keeps evolving: fancier graphics, more
formatting options, animation, although most users of version 11 only care
about the same basic functions that were present even in version 2. The bells
and whistles eat up computer resources, slowing down the computer program.
Today, application programmers are given powerful machines so that they can
be more productive (software development tools also suffer from bloat, so it
is a real need). They then produce software which has acceptable response times.
But ordinary users with older PCs find a noticeable slowdown when they upgrade
their software. This is why enlightened software development managers should
conduct usability studies with older hardware before releasing their product.
LACK OF LIABILITY
Software, unlike cars or DVD players, is an intangible, copyrighted product.
Just as with books or movies, the software publisher gives no guarantee that
you will be pleased with it, notwithstanding claims in advertisements. Hence
the EULA, which absolves the software maker from any liability arising from
use of the product.
Now, if software were used only for non-critical processing, this would be a
workable arrangement. Unfortunately, some computer software directly initiates
potentially dangerous actions. The Therac-25, a radiation therapy machine,
gave six patients massive radiation overdoses over a two-year period; three of
them died. The cause was software bugs and buggy bug fixes
coupled with bad safety practices.
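The published analyses of the Therac-25 accidents attribute the overdoses partly to race conditions: the operator could edit treatment settings while the machine was still acting on values it had already read. The sketch below uses hypothetical class and method names, and simulates the unlucky interleaving step by step so the check-then-act bug shows deterministically:

```python
# A minimal check-then-act race of the kind implicated in the Therac-25
# accidents. The "concurrent" operator and beam tasks are simulated as
# explicit sequential steps; all names here are illustrative.

class Machine:
    def __init__(self):
        self.mode = "xray"              # operator's requested mode
        self.shield_in_place = True     # beam attenuator position

    def operator_edit(self, new_mode):
        # The operator task changes the mode at any time...
        self.mode = new_mode

    def fire(self, checked_mode):
        # ...but the beam task acts on a mode it checked earlier.
        if checked_mode == "electron":
            self.shield_in_place = False   # shield withdrawn for electron mode
        power = "high-power beam" if self.mode == "xray" else "low-power beam"
        return power, self.shield_in_place

m = Machine()
m.operator_edit("electron")
checked = m.mode                 # beam task reads "electron" here...
m.operator_edit("xray")          # ...operator switches back before firing
beam, shielded = m.fire(checked)
print(beam, shielded)            # high-power beam fired, shield withdrawn
```

The fix is the usual one for check-then-act bugs: re-read the shared state atomically at the moment of use, or lock out edits while an action based on the old state is pending.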
What is the moral and legal liability of the software developers in such cases?
Where do we draw the line between a bug slipping through and inexcusable negligence?
Today, the EULA protects the negligent. Will it take the equivalent of a 9/11
caused by a software bug to redefine the lines? Until stricter laws are passed,
I feel every vendor whose products use software should institute programmes for software
quality, similar to successful environment- and safety-consciousness programmes
run in other industries. It even makes good economic sense.
THE END OF INNOCENCE
Another type of bug is gaining prominence these days: security flaws. Defects
introduced in design or implementation deliver software that may work in an
ordinary environment, but is susceptible to attack by people who deliberately
exploit flaws in the system.
The Red Queen Effect is at work here too. A rational approach would design and
develop protocols and their implementations that are inherently secure. The
same goes for operating systems and applications. However, complex untested
protocols and software with hundreds of security flaws are the norm. Like walking
the dog every morning and evening, it has become an unavoidable chore to keep
all my machines updated with the latest patches.
Is there a method to end this madness? I once dreamt that all
the computer users in the world arose in revolt and switched to hardened Open
Source software.

Dilip Ranade is a Senior Principal Security Engineer at Symantec.