Vyacheslav Lanovets said:
Hello, Bob!
You wrote on Tue, 8 Mar 2005 10:07:26 -0700:
BG> Having said that, there does eventually come a time and place where it
BG> becomes compelling to rewrite any code base
VL> If some algorithm's implementation in C was fine 10 years ago, is still
VL> fine now, and will be fine in 10 years, why should it be reimplemented
VL> in VB.NET?
Nah, not in VB.NET, in C#.
Seriously, though, when I say "complete rewrite", I don't mean that so
literally that every single line of code will be rewritten. For example,
come the year 2020, let us suppose that Microsoft has decided that C# is a
thing of the past and we should use D# or E Flat or whatever they come up
with for Visual Studio 2020 and the Quantum.NET Framework running under
Windows HyperServer 2020 for AMD 256 bit processors. Now if I did a rewrite
of a project originally conceived way back in 2005, and I had a general
purpose routine in the original C# code that does matrix addition or token
parsing or something of a very basic nature, and I could call the old code
without a significant penalty, I would in most cases leave that code
completely alone. It's usually not individual routines that are the problem
so much as higher level things like where they are in the object hierarchy
and how objects are composited at runtime and the contracts they have with
each other.
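To make the point concrete, here's a minimal sketch (in C, since that's the language Vyacheslav mentioned) of the sort of general-purpose routine I mean -- the names and layout are just illustrative, not from any actual project. Once a routine like this is written correctly, there's simply no reason for a rewrite to touch it:

```c
#include <stddef.h>

/* Add two m-by-n matrices stored flat in row-major order: c = a + b.
   Code like this was fine a decade ago and will be fine a decade hence. */
void matrix_add(size_t m, size_t n,
                const double *a, const double *b, double *c)
{
    for (size_t i = 0; i < m * n; i++)
        c[i] = a[i] + b[i];
}
```

The churn in a rewrite happens around code like this, not inside it.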
VL> While I understand why first projects should be completely rewritten,
VL> second ones can usually live without it.
It's true that all things being equal, each subsequent system is better,
stabler, and longer-lasting than the previous one. But organizational,
budgetary, technological and political realities tend to conspire against
this. In my experience over more than two decades, taken as a whole, every
system is built with some compromises, some bad assumptions, and some lack
of anticipating what direction things are going to evolve in, no matter how
much talent, care and foresight is applied. Such systems acquire what I
call "system lint". These are regrettable artifacts that start out as
insignificant annoyances and slowly evolve into roadblocks. They will never
really go away without extensive revision or rewriting. Of course you have
to be careful not to introduce new problems in the same process!
Then, too, projects sometimes push the limits of current technology
platforms. I'm put in mind of a classic ASP project that evolved into a
painful mass of nested #includes and Response.Write statements, and that
could be implemented so much more cleanly and maintained so much more cheaply
in ASP.NET. But doing so would require an extensive rewrite.
I'm also currently working on a project that is itself a complete rewrite of
a FoxPro 2.6 application from 1991 or so; we're re-doing it in VB.NET and
SQL Server. Fox 2.x is pre-OOP, so porting it to FoxPro 9.0 would have been
quite a project in itself. And there is always the fear that the prophecies
about the demise of Fox, repeated decade after decade, will finally become
self-fulfilling one day soon. So we decided it was no more work to just move
it over to .NET. You can barely get the Fox 2.x
runtime to function under Win32, and the UI is sort of Win 3.1 in appearance
and rather atypical in the details of its behavior even aside from visuals.
The software is sold commercially to a vertical market. It's an
embarrassment to the client, and that's why they want to bring it up to
date. That, and have a code base that will take them through the next
couple of decades (or more if possible).
As we've reworked this application we have many times asked the client, why
does feature X work this way? Isn't that a little inconsistent? Well yes,
but we just didn't think of that at the time and it's been a problem ever
since -- by all means change it. What about feature Y, isn't Z a better
way? Sure, sure, use Z. We just couldn't do Z back in those days. It
wasn't available / didn't work / was too hard. So along the way, the
product is getting considerably better from a usability standpoint, yet with
minimal retraining issues for the user. These are all "WOW, this is a big
improvement / much more intuitive" kind of things.
VL> And the third point: after several revisions during the _normal_
VL> development process, the software becomes nearly perfect. I mean, it's
VL> hard to get it right on the first try, particularly in a new language
VL> and on a new platform.
If an application is feature complete and stable and won't change much, yes.
But most applications evolve constantly as business needs change. Even when
they don't, technology changes can be a problem. For example, a text-based,
80 column by 24 line user interface written in COBOL may be perfectly
serviceable, but it's also a very atypical environment for users, it may not
integrate well with other technologies, and it may be expensive and
difficult to find or develop COBOL talent when it does need to be tweaked.
VL> So when you say that revisions only make the product worse, that implies
VL> something is wrong with the development process.
But I'm not saying that revisions only make a product worse -- especially
not individual revisions, taken one at a time. I'm only saying that the
overall design drifts out of sync with current realities and becomes
increasingly expensive to maintain.
And please remember, I'm talking about time frames much longer than most
people contemplate in their IT planning. I'm talking about 10, 15, 20 year
old systems. Maybe in rare cases, 5 year old systems. I've been
architecting systems for 22 years; it tends to make me take a longer view of
things.
Best,
--Bob