Motivation of software professionals

  • Thread starter: Stefan Kiryazov
Basically no one knows how to build 100% bug-free anything.
Witness Toyota.  Globally, in fact, you can probably do better
with software than with most other things.  And I've never
worked on a project where there have been liability exclusions
(which probably aren't legal anyway).


Software from Ebenezer Enterprises is free. I think only
an idiot would attempt to sue us for a problem they find
in the software. I think the same thing goes for Boost.
I don't think they've ever been sued for defects.


Brian Wood
http://webEbenezer.net
(651) 251-9384
 
Reasonable, justified or necessary, I don't know. But it's a
fact of life. If you deliver software, and it fails, you're
liable for it. Most of the time, the liability is spelled out
in the contract, but that still doesn't exclude any legal
guarantees the buyer may have.

Ahh, there was some context lost in quoting. I was asking why such terms
should be necessary to call something "engineering". Obviously, there
are markets and/or projects where they are a practical reality, and others
where they aren't.

Since most of my work is in the free software world, I see a whole lot of
software distributed without any kind of warranty whatsoever.

-s
 
Well, yeah, it does. Unless you believe that most software developers
and their employers are going to spend the extra time and money to do
all these things out of the goodness of their hearts. We already know
that the market will not force software quality to improve - it hasn't
happened yet.

Not in general, but:
1. Some people will do these things because they believe it is important
to do their work to the best of their ability.
2. Some people with contracts and such in place will still produce shoddy
code.

In short, while there's certainly a correlation, liability and certification
are neither necessary nor sufficient to produce reliable software. Insisting
on them strikes me as missing the point; they're a proxy value at best for
the actual goal.

-s
 
Hi,

Can any locksmith or
burglar alarm maker guarantee that a building will withstand all attacks
for 12 months?  _That_ is the equivalent of withstanding all virus
attacks for 12 months - and it's on a far simpler system.

Maybe not the locksmith themselves, but there are insurance companies
which calculate how high the risk is, and they take on that liability.

For locks, cars, even airplanes, insurance companies do that all the
time. But there are only a few cases where this is done for software.
 
Seebs said:
Not in general, but:
1. Some people will do these things because they believe it is important
to do their work to the best of their ability.
2. Some people with contracts and such in place will still produce shoddy
code.

In short, while there's certainly a correlation, liability and certification
are neither necessary nor sufficient to produce reliable software. Insisting
on them strikes me as missing the point; they're a proxy value at best for
the actual goal.

-s

I agree with your specific statements. It's impossible not to - I see
evidence often enough of software developers who have no relevant formal
education, have no certifications, and have only OJT, who nonetheless
have applied themselves in their own time to study their field. These
are the people who read the formal language and API specifications, own
copies of books on concurrency, software testing, and software design,
regularly read the better programming web sites, and subscribe to good
publications. In some cases they even follow programming NGs on Usenet. :-)

Generally speaking the majority of programmers who do these things do
have some relevant formal education, and often certifications, but since
currently a CS degree rarely prepares a person to be a good software
developer it's not an important factor: what counts is the other stuff I
mentioned.

For the sake of argument, the fraction of such software developers may be 1
in 10. I doubt it's more, because in my 25 years or so in the field,
spread out over nearly 35 years, I've encountered hundreds of developers
in many different employment settings as co-workers and colleagues, and
I've seen no evidence that it's higher than that.

Here's the thing: the main reason for having the codes of conduct,
required education and certifications, professional associations,
regulated professional development, legal guarantees and so forth, is to
give the consumer (employer of software developers, purchaser of
computer software, etc.) a reasonable degree of assurance that they
are getting a known, reliable quantity, whether that be an employee or a
product. All that stuff is there to help regulate the market in
software professionals and software itself, so that the consumer is not
flying in the dark when it comes to choosing services and products.

It's not a be-all and end-all by any means. It simply raises the bar,
just as it does in existing professions. It's defining the minimums, and
currently we truly have none.

Such a system also protects the true software development professionals.
As it is employers have a tough time selecting good people, and
consumers have very little idea of what they are getting. Shoddy
programmers can last a long time (and God knows they do), and while
they're at it they damage or destroy the credibility of people who are
really trying to do a proper job. Employers become accustomed to having
mediocre employees, and customers get used to shabby software. When an
employer gets an employee who really knows what they are doing they are
lauded as superstars - well, they're not, they're just doing their job.
And when customers get a reliable piece of software it gets 5 stars just
for being good software, which is also pretty sad.

The system should not be so hit and miss. And it does not have to be.

You're quite right in a narrow sense. If you are talking about the 10%
of developers who try, and the 10% of software shops that get it, and
the 10% of software that is truly quality stuff, they don't need the
extra push or the regulations. But the other 90% do.

AHS
 
Brian said:
Software from Ebenezer Enterprises is free. I think only
an idiot would attempt to sue us for a problem they find
in the software. I think the same thing goes for Boost.
I don't think they've ever been sued for defects.


Brian Wood
http://webEbenezer.net
(651) 251-9384

Free (as in beer) software brings up an interesting set of arguments. If
I understand your point, it is: if a product is free, how can one
possibly sue its maker for flaws in it? Correct me if I'm wrong.

I have my own thoughts on this topic but I simply want to make sure what
we're discussing.

AHS
 
For me, a non-pro, it's like solving a crossword puzzle. You're
trying but can't quite figure it out. But when you finally do it's a
huge rush and you can't wait for more.

OMG I'm addicted to programming -- my wife was right.

Chrizs
 
Software management is not so stupid.

sometimes it is. Sometimes quality is seen as being too expensive. The
cost of a pissed off customer isn't always factored in.

If adequate procedures were
available that could ensure bug-free software, at reasonable cost and
time, then they would have been adopted.

they are available, they do have reasonable cost (in many fields) and
they have been adopted.

Telecommunication systems mostly invisibly work. There's a good reason
for this.

Except in a few areas
customers would soon shy away from 'no warranty including the implied
warranty of suitability for any particular purpose' products.

we've pretty well brainwashed the consumer into accepting this as
reasonable.

The fact is that many many formal methods are in existence. Some of
them might work, to some extent, and in some circumstances. But none
have really proved themselves when it comes to the acid test of
developing real software for non-trivial projects.

talk to the telecommunications people, talk to the military, talk to
avionics and space, talk to automotive (ok, bad example!).
 
Why?

Do you have any evidence to suggest that this kind of liability is actually
reasonable, justified, and/or necessary?

I am not an expert at law, so I cannot reason about justification or
necessity. However, I do recall quite a few "mishaps" and software
bugs that cost both money and lives.
Let's see:
a) Mariner I;
b) 1982, an F-117 crashed, can't recall if the pilot made it;
c) NIST has estimated that software bugs cost the US economy $59 billion annually;
d) 1997, a radar software malfunction led to a Korean jet crash and 225 deaths;
e) 1995, a flight-management system presented conflicting information to the pilots of an American Airlines jet, who got lost and crashed into a mountain, killing 159 people;
f) the crash of the Mars Polar Lander; etc.
Common sense tells me that certain people bear responsibility for those accidents.

How can anybody ignore this? Do more people have to die for us to
start educating software engineers about responsibility, liability,
consequences? Right now, CS students learn that an error in their
program is easily solved by adding carefully placed printf()'s or
running inside a debugger, and that the worst consequence if the TA
discovers a bug in their project solution is maybe 1/10 lesson
credits.

I was exposed to the same mentality, but it's totally ****ed up.

I agree.  I just don't think that rants about liability or certification
are going to do anything.  Neither of those has a thing to do with learning
to write more reliable software.

So what? We already know how to write more reliable software, it's
just that we don't care.
 
Here's the thing: the main reason for having the codes of conduct,
required education and certifications, professional associations,
regulated professional development, legal guarantees and so forth, is to
give the consumer (employer of software developers, purchaser of
computer software, etc.) a reasonable degree of assurance that they
are getting a known, reliable quantity, whether that be an employee or a
product. All that stuff is there to help regulate the market in
software professionals and software itself, so that the consumer is not
flying in the dark when it comes to choosing services and products.

I don't believe that they can do this, because the key measures of quality
are sufficiently orthogonal to anything we know how to test that it simply
doesn't work out that way. I'd almost certainly fail any coherently-specified
certification, because my memory's like a sieve, so any check as to whether
I know even basic things will discover that, well, no, I don't.

It's very hard to develop a test you can administer for "knows how to look
things up" or "knows whether or not he knows something".

Furthermore, I very much dislike the notion of the software industry becoming
afflicted with the kind of thing that many engineering professions, or the
legal profession, have, where certification becomes a way for groups doing
certification to make a ton of money preventing people from doing work unless
they pay the group a ton of money.

Right now, any old person can try to put software up on the internet. I would
not want to see that change, but I do not have any confidence that a
"professional organization" would be willing or able to refrain from
changing that over time. It is the nature of such organizations (like any
other human institution) to seek ever-broader power and influence.

Such a system also protects the true software development professionals.

Some, but it may protect other people from getting the experience they would
need to *become* true software development professionals.

Had I needed a certification to start doing stuff, I would never have made
it.

You're quite right in a narrow sense. If you are talking about the 10%
of developers who try, and the 10% of software shops that get it, and
the 10% of software that is truly quality stuff, they don't need the
extra push or the regulations. But the other 90% do.

I think the harm they'd do in the boundary cases is pretty severe, though,
and essentially irreparable, while we've found somewhat survivable ways for
employers who care about quality to obtain it if they want it.

-s
 
Michael said:
How can anybody ignore this? Do more people have to die for us to
start educating software engineers about responsibility, liability,
consequences? Right now, CS students learn that an error in their
program is easily solved by adding carefully placed printf()'s or
running inside a debugger, and that the worst consequence if the TA
discovers a bug in their project solution is maybe 1/10 lesson
credits.

You say that like the developers were at fault. I cannot tell you exactly how
many times I've seen management overrule developers who wanted to make things
right, but it's been the overwhelming majority of cases. I recall a manager in
1982 refusing to let a team fix the Y2K bug in the project. Many good
developers have grown resigned to the policies and have given up pushing for
quality. Many more use stealth quality - they simply don't tell management
they're doing things in an unauthorized way that's better than the official
process. Only rarely in the last thirty years have I encountered
management alignment with known best practices.
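
For anyone who hasn't seen one up close, here's a minimal, made-up
illustration of the kind of defect we keep calling "the Y2K bug" - a
two-digit year, which was a perfectly sensible space saving in 1982:

    #include <cstdio>

    // Years stored as two digits, the common space-saving convention of the era.
    int age_in_years(int birth_yy, int current_yy) {
        return current_yy - birth_yy;   // fine as long as both years are 19xx
    }

    int main() {
        std::printf("%d\n", age_in_years(50, 99));  // born 1950, in 1999: prints 49
        std::printf("%d\n", age_in_years(50, 0));   // born 1950, in 2000: prints -50
        return 0;
    }

The fix is trivial in isolation; the expense was hunting down every place a
two-digit year had leaked into data files, interfaces and reports.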

Nearly all projects I've worked on involved many programmers, dozens even.
Parts are written independently of each other, often over a period of years.
Often each part tests perfectly in isolation and only reveals bugs emergently
under production conditions.

Many of those projects had large test teams. Products have passed all the
tests, yet still failed to meet spec in production.

Sometimes the provided test environment differed significantly from the
production environment.
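
A contrived sketch of the kind of emergent failure I mean (the functions and
the CRLF detail are invented for illustration): two parts, each correct
against its own tests, that still fail when composed under production input.

    #include <cassert>
    #include <string>

    // Part A, written by one team: strips a trailing newline.
    // Its unit tests feed it "42\n" and pass.
    std::string trim_newline(std::string s) {
        if (!s.empty() && s.back() == '\n') s.pop_back();
        return s;
    }

    // Part B, written years later by another team: builds a lookup key.
    // Its unit tests feed it "42" and pass.
    std::string make_key(const std::string& id) {
        return "user:" + id;
    }

    int main() {
        // Production feeds arrive with Windows line endings: "42\r\n".
        std::string key = make_key(trim_newline("42\r\n"));
        // Each part did exactly what its spec and tests asked of it, yet the
        // composed result still carries the '\r' and downstream lookups fail.
        assert(key == "user:42");   // only fires under production-like input
        return 0;
    }

No test owned by either team would ever see that input; only production puts
the two together.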

Before you make the developer liable, you'd better darn well be certain the
developer is actually the one at fault.
 
Lew said:

Andy said:
In 1982 the manager may well have been right to stop them wasting their
time fixing a problem that wasn't going to be a problem for another 18
years or so. The software was probably out of use long before that.

Sure, that's why so many programs had to be re-written in 1999.

Where do you get your conclusions?
 
[...]
They are. That's why independent contractors have liability
insurance.

In that case they're *not* liable for their unreliable code.

It depends.  For large projects, it's unlikely that the
It depends. For large projects, it's unlikely that the
contractor did the work alone, and it's the prime contractor,
who gave him the job, who's liable. But I've also worked on
smaller projects, where I was responsible, and liable, for all
of the software in the project.
 
Free (as in beer) software brings up an interesting set of arguments. If
I understand your point, it is: if a product is free, how can one
possibly sue its maker for flaws in it? Correct me if I'm wrong.

I have my own thoughts on this topic but I simply want to make sure what
we're discussing.

Imagine driving by a house and seeing a car in front with
this sign -- "Free car." It is your responsibility to
check out the car. If I were interested in that car, I'd
talk to the giver of the car, check out the car for
myself (is it stolen?) and then either drive it carefully
to a mechanic or have a mechanic come to the car. After
that I'd be the only one that rides in the car for a
month or two to be more certain that it is in fact a safe
car. As long as the giver reveals any known problems
about the car to me, I don't think there's any basis for
suing him if the car is later found to have a serious problem.


Brian Wood
http://webEbenezer.net
(651) 251-9384
 
Where do you get your conclusions that there was much software out there
that was worth re-writing eighteen years ahead of time? Remember to
allow for compound interest on the money invested on that development...

I'm using tens of thousands of lines of code right now that are over twenty
years old. It's called "OS X", and it contains a large hunk of code that
was written either at Berkeley in the 80s or at NeXT in the 80s.

We're still using classes with names like "NSString", where "NS" stands for
NeXTStep. You know, the company that folded something like fifteen years
ago, and wrote this stuff originally prior to 1990?

Heck, I have code *I personally wrote* 19 years ago and which I still use.
It was last touched in any way in 1998, so far as I can tell. It's been
untouched ever since, *because it works correctly*.

And honestly, here's the big thing:

In reality, I do not think that writing things correctly costs that much
more. Because, see, it pays off in debugging time. The rule of thumb they
use at $dayjob is that in the progression from development to testing and
then to users, the cost of fixing a bug goes up by about a factor of 10
at each level. That seems plausible to me. So if I spend an extra day
on a project fixing a couple of bugs, those bugs would probably cost about
10 days total people-time if caught in testing, and 100 if caught by
our customers. And 1000+ if caught by their customers.
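
Spelled out as a toy calculation (the factor of 10 per stage is our rule of
thumb, not a measured constant):

    #include <cstdio>

    int main() {
        // One person-day to fix a bug in development, escalating ~10x for
        // each later stage it survives into.
        const char* stage[] = { "development", "testing", "our customers", "their customers" };
        double person_days = 1.0;
        for (int i = 0; i < 4; ++i) {
            std::printf("caught in %-15s ~%g person-days\n", stage[i], person_days);
            person_days *= 10.0;
        }
        return 0;
    }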

This is why I have successfully proposed "this is too broken to fix, let's
build a new one from scratch" on at least one occasion, and gotten it done.

-s
 
MarkusSchaber said:
Hi,



Maybe not the locksmith themselves, but there are insurance companies
which calculate how high the risk is, and they take on that liability.

For locks, cars, even airplanes, insurance companies do that all the
time. But there are only a few cases where this is done for
software.

Is it? What about the software that controls the locks, cars, and
airplanes?


Bo Persson
 
Andy said:
Pretty well everything I saw back in 1982 was out of use by 1999. How
much software do you know that made the transition?

Pretty much everything I saw back in 1982 is in production to this day, never
mind 1999.

Pretty much everything that had Y2K issues in 1999 was in production since the
1980s or earlier. By the 90s, more software was written without that bug.

Again, why do you think Y2K was such an issue, if affected software had gone
out of production by then?

Let's see.. Operating systems. The PC world was... umm.. CP/M 80? Maybe
MS-Dos 1.0? And by 1999 I was working on drivers for Windows 2000.
That's at least two, maybe three depending how you count it, ground-up
re-writes of the OS.

PCs were not relevant in 1982. PCs largely didn't have Y2K issues; it was
mainly a mainframe issue.

With that almost all the PC apps had gone from 8-bit versions in 64kb of
RAM to 16-bit DOS to Windows 3.1 16-bit with non-preemptive multitasking
and finally to a 32-bit app with multi-threading and pre-emptive
multitasking running in hundreds of megs.

Largely irrelevant to the discussion of Y2K issues, which were a mainframe
issue for the most part.

PCs were not in common use in 1982.

OK, so how about embedded stuff? That dot-matrix printer became a
laserjet. The terminal concentrator lost its RS232 ports, gained a
proprietary LAN, then lost that and got ethernet. And finally
evaporated in a cloud of client-server computing smoke.

Not relevant to the discussion of Y2K issues.

I'm not so up on the mainframe world - but I'll be surprised if the
change from dumb terminals to PC clients didn't have a pretty major
effect on the software down the back.

This was mainframe stuff. Most PC software didn't have Y2K bugs, and there
weren't PCs in common use in 1982.

PCs have had negligible effect on mainframe applications, other than to
provide new ways of feeding them.

Where do you get your conclusions that there was much software out there
that was worth re-writing eighteen years ahead of time? Remember to
allow for compound interest on the money invested on that development...

Software development costs are inversely proportional to the fourth power of
the time allotted. That's way beyond the inflation rate.
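
To put a number on "way beyond the inflation rate": taking that fourth-power
rule at face value, effort ~ 1/t^4, so delivering in half the time costs
2^4 = 16 times as much, while deferring the work only has to beat a few
percent of compound interest per year.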

Y2K repair costs were inflated by the failure to deal with them early, not
reduced.

The point of my example wasn't that Y2K should have been handled earlier, but
that the presence of the bug was not due to developer fault but management
decision, a point you ignored.
 
Andy said:
Pretty well everything I saw back in 1982 was out of use by 1999. How
much software do you know that made the transition?

OK, so how about embedded stuff? That dot-matrix printer became a
laserjet. The terminal concentrator lost its RS232 ports, gained a
proprietary LAN, then lost that and got ethernet. And finally
evaporated in a cloud of client-server computing smoke.

I know there is software flying around today that is running on Z80
processors (well, the military variant of them) and the plan in the late
90s was for it to continue for another 20 years (I don't know the
details, but a customer signed off on some form of ongoing support
contract). Admittedly the software I used was not doing date processing
(apart from the test rigs, which used the date on printouts, which I
tested to "destruction" which turned out to be 2028).

So yes, software from the 80s is still in active use today in the
embedded world and planned to be in use for a long time to come.

I'm not so up on the mainframe world - but I'll be surprised if the
change from dumb terminals to PC clients didn't have a pretty major
effect on the software down the back.

Where do you get your conclusions that there was much software out there
that was worth re-writing eighteen years ahead of time? Remember to
allow for compound interest on the money invested on that development...

Remember to allow for the fact that software does continue to be used
for a *long* time in some industries.
 