kneejerkreaction said:
IMHO, languages are becoming irrelevant. The days are past when a new
'better' language was invented every couple of years. For instance, in
.NET any supported language can be used, and developers are free to
create additional .NET-supported languages. The executables are
essentially identical. So the issue becomes one of standardization: what
skills will be easiest to find in the marketplace? If a new language is
invented, it will not be because it is better, but more likely so a company
can eliminate a competitor's language from the marketplace.
That seems to be a rather... cynical opinion. The runtime a language compiles
to has *no* bearing on the creation of new languages. There are dozens (if
not hundreds) of languages that run on the Java VM and experiment with
being better: some are Java extensions that are eventually pulled
into the Java core, and others are entirely new languages designed to
create something their authors feel is, or could be, better. A few
new languages have shown up for the CLR as well, Microsoft Research's
C-omega for example, along with third-party languages like Boo and
Nemerle.
Better languages come for the sake of better languages, not for the sake of
better runtimes. Microsoft, Sun, and Borland may not be producing new
languages any time soon, as C#, VB.NET, and Java still have a lot of room for
improvement, but they aren't standing idly by and claiming the language is
irrelevant. Each one will get better, and there *will* be a new Ruby some
day.
.NET simply provides a unified type system, a way for languages to
interoperate. On its own, the framework is a useless mass of bytes. Someone
has to be able to produce executables using that framework, and language
evolution and creation is often the best way to improve how that person
works. The very fact that developers are free to create new languages is
not an argument for irrelevance; it runs counter to your original point.
The language is not irrelevant by any means.