.NET SUCKS --- READ FOLLOWING. MICROSOFT IS A SUCKY CO

  • Thread starter: consumer62000
Did you know that there are two types of GC in .NET?

I am sure you did not, so please do some research on how GC works before
blaming a super-successful company for a bad product. Since you do not
take the pains to understand, let me explain what the above means.

A company becomes rich by selling its products, and if Microsoft is
the richest or amongst the richest companies in the world, it must
surely be because it has sold millions of its products to millions of
people around the world.

You cannot possibly believe, even in your wildest dreams, that you could
be more intelligent than a million (possibly a billion) others, or do you?
Let me tell you one thing though: a million or a billion people cannot be
stupid enough to buy a stupid product.


with regards,


J.V.Ravichandran
- http://www.geocities.com/jvravichandran
- http://www.411asp.net/func/search?qry=Ravichandran+J.V.&cob=aspnetpro
- http://www.southasianoutlook.com
- http://www.MSDNAA.Net
- http://www.csharphelp.com
- http://www.poetry.com/Publications/display.asp?ID=P3966388&BN=999&PN=2
- Or, just search on "J.V.Ravichandran" at http://www.Google.com
 
After applying the tweaks I have mentioned to your code, the result is

Populating...
Filtering...
Collecting...
00:00:01.8125000

You can further optimize your code by not collecting all generations,
as a bare GC.Collect() does, but instead collecting only the generation
occupied by the static object hook, and by commenting out the
GC.Collect() call in the Prepare method:

// Collect only up to the generation the static object currently occupies.
int i = GC.GetGeneration(hook);
GC.Collect(i);

A collection promotes a surviving object to the next generation, as you
must know, which may cause the object to remain in memory for far too
long; and since collections happen less often in the higher generations,
the object may again stay in memory longer than necessary. So, by calling
GC.Collect, the object(s) are pushed up the generations, and by
commenting out the line, I get a lower figure than the original (your
code) result, which is

C:\Prgs>jon
Populating.
Filtering..
Collecting.
00:00:02

Mine is a 1GB PIV box as well.
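The promotion behaviour described above is easy to observe directly. A
minimal standalone sketch (separate from the benchmark code; the output
assumes the standard workstation GC):

using System;

class GenDemo
{
    static void Main()
    {
        object o = new object();
        Console.WriteLine(GC.GetGeneration(o)); // 0: freshly allocated
        GC.Collect();                           // o survives a collection...
        Console.WriteLine(GC.GetGeneration(o)); // 1: ...and is promoted
        GC.Collect();
        Console.WriteLine(GC.GetGeneration(o)); // 2: promoted again
    }
}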

with regards,


J.V.Ravichandran
 
Yeah, I kinda think that I too am dense, and it is not because of the
stylish cigar that dangles from my lips with a lazy curl of smoke
threatening to increase the density of the air around my head!!

with regards,


J.V.Ravichandran
 
gets JIT compiled to native code when you start the application

No way. It gets stubbed. Imagine the performance implications of compiling
the entire application at application boot. It's a competitive advantage to
boot quickly. Lots of ISVs with appropriately demanding customers would
"spew" if this were not the case. Microsoft would never employ a systems
architect who thought that was a good idea. They're not stupid.

And if Microsoft had supplied a JIT compiler that precompiled an entire
application at application boot, the JIT compilers supplied with the
platform would not be the "market" defaults they are today. Microsoft would
find itself in the embarrassing position of not possessing the controlling
market share for JIT compilers of its "own" flagship application platform
technology.

They own the Internet in terms of browser mind share. This is a company that
understands the concept of "owning the important layers". No way would they
allow third parties to control the layer that converts between the code
container and the engine that runs it.

If you controlled the compiler you could effectively "compile away" from the
underlying platform. The scenario is a bit far-fetched, but no way would the
Microsoft "Corporation" (remembering that there are a lot of business grads
who work for Microsoft) allow it to happen through such a simple
oversight. Even a member of the "Marketing Team" could tell you why you
should only compile method by method.

That's why it's called "Just In Time" and not "Just In Case".
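
For what it's worth, the method-by-method compilation is easy to observe:
the first call to a method pays the JIT cost, and later calls run the
cached native code. A rough sketch (timings are machine-dependent):

using System;
using System.Diagnostics;
using System.Runtime.CompilerServices;

class JitDemo
{
    [MethodImpl(MethodImplOptions.NoInlining)]
    static long Work()
    {
        long sum = 0;
        for (int i = 0; i < 1000; i++) sum += i;
        return sum;
    }

    static void Main()
    {
        Stopwatch sw = Stopwatch.StartNew();
        Work();                       // first call: method gets JIT compiled
        Console.WriteLine("first call:  {0} ticks", sw.ElapsedTicks);

        sw = Stopwatch.StartNew();
        Work();                       // second call: runs existing native code
        Console.WriteLine("second call: {0} ticks", sw.ElapsedTicks);
    }
}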
 
No, sorry, you've just petered off into unsubstantiated waffle there.
Tyrant and Damien have clearly refuted your earlier claim and you "still"
don't want to hear it. If you can't keep up, then go with a slower-moving
technology.

Your position is basically that Microsoft is moving too fast for you. It's
actually a backhanded compliment to Microsoft that a company of its size can
move so quickly that some learned developers simply can't keep up.

I suppose the next snipe from the "slow down" brigade is that Microsoft is
not going too fast but is rather "erratic". That seemed to be what you were
trying to insinuate by listing those various technologies as though you were
confessing Microsoft's sins.

But again this is a complete oversight. Each of the technologies you listed
represents a progression from the previous one. That's called evolution.

That's called a commitment to progression, growth, adaptation and innovation.
How could you possibly have trouble selling that message to your employers?
 
You can further optimize your code by not collecting all generations,
as a bare GC.Collect() does, but instead collecting only the generation

Mine is a 1GB PIV box as well.

But optimising it to make the garbage collection take *less* time goes
against the whole point of the exercise - the aim is to get the system
into the nastiest possible state, to find out how long garbage
collection could take. There's no reason to believe that things
couldn't end up in generation 2 in a real system, or that the GC won't
need to collect generation 2.

We're trying to find the maximum we can provoke here.
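
In that spirit, a minimal sketch of one way to provoke a long collection
(the sizes are arbitrary; scale them up to make the collector suffer more):

using System;
using System.Collections.Generic;
using System.Diagnostics;

class GcStress
{
    static void Main()
    {
        // Keep a large graph of small objects reachable, so the collector
        // has many live references to trace.
        List<int[]> survivors = new List<int[]>();
        for (int i = 0; i < 1000000; i++)
            survivors.Add(new int[10]);

        // Two full collections push the survivors into generation 2.
        GC.Collect();
        GC.Collect();

        // Time one more full collection over the crowded gen-2 heap.
        Stopwatch sw = Stopwatch.StartNew();
        GC.Collect();
        sw.Stop();
        Console.WriteLine("Full collection took {0}", sw.Elapsed);

        GC.KeepAlive(survivors); // keep the graph live through the timing
    }
}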
 
Jon Skeet said:
Are you suggesting it's impossible to come up with a situation which
takes over a second to garbage collect? I suspect I could do that,
given a bit of time.

Anyone else up for a "be cruel to the garbage collector" contest?

--

No, I am not. See my first post above. I just found the statement about one
second very funny. Modern real-time systems handle tens, hundreds, sometimes
even thousands of events/messages per second.

In some versions of WebLogic running on early versions of the JVM, GC took
up to 15 minutes. It caused all connections to be broken.
Now the JVM is faster, and .NET is fast also. But a real-time system is not
just a fast-working system. A real-time system must guarantee some exact
level of service. Maybe it will be only 5 events per second, but never fewer
than 5 per second. So neither .NET- nor JVM-based systems are true RT systems.

And thinking that unmanaged C++ is a panacea for RT systems seems funny to
me also. Don't get me wrong. My first OS was RSX-11, and one of my first
programming languages was PDP-11 assembly language, so later I had no
problems with C. But, as I stated before, memory management in RT systems is
a much more complicated problem, and it cannot be resolved just by using
unmanaged C++ or VB6, or Delphi, whatever. "What happens if GC begins?" he
asked. And what happens if a swap starts? And what happens if some network
delay occurs? Unmanaged C++ will not help if the architect does not use
his/her head properly.
 
Vadim Indrikov said:
No, I am not. See my first post above. I just found the statement about one
second very funny. Modern real-time systems handle tens, hundreds, sometimes
even thousands of events/messages per second.
Sure.

In some versions of WebLogic running on early versions of the JVM, GC took
up to 15 minutes. It caused all connections to be broken.
Now the JVM is faster, and .NET is fast also. But a real-time system is not
just a fast-working system. A real-time system must guarantee some exact
level of service. Maybe it will be only 5 events per second, but never fewer
than 5 per second. So neither .NET- nor JVM-based systems are true RT systems.

Absolutely (well, maybe not absolutely - I know there's a real-time
Java project, but I don't know what its status is). I never claimed
that they were. My point in suggesting this was to show that all those
who were claiming it was laughable for a GC to take a whole second were
wrong - it's far from inconceivable for a GC to take a second.
And thinking that unmanaged C++ is a panacea for RT systems seems funny to
me also. Don't get me wrong. My first OS was RSX-11, and one of my first
programming languages was PDP-11 assembly language, so later I had no
problems with C. But, as I stated before, memory management in RT systems is
a much more complicated problem, and it cannot be resolved just by using
unmanaged C++ or VB6, or Delphi, whatever. "What happens if GC begins?" he
asked. And what happens if a swap starts? And what happens if some network
delay occurs? Unmanaged C++ will not help if the architect does not use
his/her head properly.

Indeed. It just struck me that a simple way of absolutely proving that
.NET won't necessarily cope is by showing a situation where the GC took
over a second. It's far from the only proof and the only reason - but
you only need one to make a counterexample :)
 
Yes, Jon, I got your point from the very beginning, and I really enjoy the
contest you started. What it really proves is that a real professional
can achieve any result - both good and bad, depending on his/her goal ;)

My example about 15 minutes is true also. Many people tried to defer GC, so
they simply worked with a heap of up to 2GB.

And what about .NET (and Java) - their primary goal was to simplify the
development of a wide class of applications, but not all kinds of
application. When I started to work with Java, I was permanently scared - it
looked like C++, not Smalltalk etc., but no free, no delete. It was a big
stress. But everybody knows what happens after calling free() without a
corresponding malloc(). Of course, there is an overhead for GC, but this is
the price.

However, I hope we are both talking about the same thing - technology never
delivers you from using your head. So if somebody picks the wrong technology,
or uses it in the wrong way, that is his/her problem, not the technology's
problem. In that case it's better to read (books or whatever) than to write
(here). Sometimes it helps.
 
May I suggest that you sell your .NET compilers on eBay and just don't use
the product. Quit whining and use another product...
 
But optimising it to make the garbage collection take *less* time goes
against the whole point of the exercise - the aim is to get the system
into the nastiest possible state,

maybe he was making a joke ;-)
or he is another one of these people who don't read what they respond to ;-)
 
First of all, it is not my job to market or sell Microsoft technologies to my
employer, especially the technologies I believe to be inferior to previous
ones. And no, Microsoft is not moving too fast for me; in my opinion it is
moving backward. I had no problem learning and using any of the "deprecated"
Microsoft technologies in the last 15 years. Nor do I have a problem using
.NET; from a technological perspective, it is not brain surgery/rocket
science. Same as I do not have a problem moving to Linux, except that it is
not a mature enough system. I belong to the diminishing population of
programmers who believe that programs should run better (faster, more
reliable, more secure) as technology progresses, not vice versa, and that we
do not need a supercomputer to run a word processor. I also believe that
simplicity is a better approach in programming, and anybody who has
implemented a custom .NET channel should understand that it is a new
"DLL hell" in the making.
Microsoft's technological approach was always "erratic"; that is almost a
plus, not a problem. It is only since 2001 that Microsoft actively deprecates
technologies, for no other reason but marketing. And that was the whole
point. .NET will eventually go away like every other innovation from
Microsoft; the systems we are developing today will probably outlive it; the
only question is: will they run on Windows? As for "...petered off into
unsubstantiated waffle there...", if I understand correctly you are accusing
me of just crying for no reason. Well, there is probably some truth in that,
but also consider this: I have 2 workstations under my desk, one is WinXP
Pro, the second one is Linux. I really, really would like to work on Windows
only, but cannot make a case for it, because Linux really performs better in
some areas (TCP) and Microsoft is not doing anything about it (I guess all
resources are spent on .NET), and there is a danger of Microsoft dropping
support for the technology we are using (and please let's not go into Mono).
And I am really sorry to say this, but "commitment to progression, growth,
adaptation and innovation" - programmers do not talk like this, marketing
people do.
 
Hi
First of all, it is not my job to market or sell Microsoft technologies to my
employer, especially the technologies I believe to be inferior to previous
ones.

Then why state this?

"It is becoming more and more difficult to convince employers to stay with
Microsoft technologies because of such decisions. "
Same as I do not have a problem moving to Linux, except that it is not a
mature enough system.

So you do have a problem with moving to Linux? I don't understand why you
hold up Linux as though it were some kind of "contender" and then tell us
all it's no good because it's incomplete. If you're trying to convince us
that the .NET sky is falling, then you'll need to supply evidence that
supports the claim.

If you like the pain of making love to skeletons, then you'll no doubt enjoy
developing within a Linux environment.

I belong to the diminishing population of programmers who believe
that programs should run better (faster, more reliable, more secure) as
technology progresses,

This could be construed as quite an arrogant kind of thing to say. Are
you suggesting that you are a member of a small elite group that knows
better, and that only you and they know what's what? I'd like to know who
besides yourself endorses that claim.
not vice versa, and that we do not need a supercomputer
to run a word processor.

This is just exaggeration, not a programming argument. If you were more
specific, then people like Tyrant and Damien could offer a specific and
simple remedy, as demonstrated above. When you make general claims like
this, it sounds as if you don't want a solution and are happier with the
supposed problem.
should understand that it is a new "DLL hell" in the making

Q: Is it an improvement? A: Most definitely.

Can you supply a general example of DLL hell with .NET that was NOT a
problem with previous deployment scenarios? Have you asked for the answer to
this scenario in this newsgroup and not received an answer? I can show you
many cases in which .NET has been of enormous benefit in managing DLL hell.

Anyone arguing that an incremental fix cannot be a value-adding fix
doesn't know much about the concept of value.

And so, in your opinion, what "specifically" is the problem with .NET
deployment scenarios? It's not enough to say "DLL hell". What specific
deployment scenario has .NET made impossible and/or worse? Have you
confirmed that the limit of what you know is the full and final fact?
point. .NET will eventually go away like every other innovation from
Microsoft, the systems we are developing today will probably outlive it,
the only question is: will they run on Windows?

Yes. This is the general consensus about almost every form of technology
currently in circulation today. It's called the Information Age. Everything
about everything will "eventually go away" at a steadily increasing rate. So
what's your point?

You generalise way too much to make your argument at all convincing. All
you've said there is that Microsoft doesn't have perfect visibility into the
technological future. To which I can only reply "well, duh".
And I am really sorry to say this, but "commitment to progression, growth,
adaptation and innovation" - programmers do not talk like this, marketing
people do.

For most of us, the functional decomposition of an enterprise's human capital
is for the most part a theoretical exercise. It's not a good thing to only be
able to speak "programmer".

Given that the average programmer's shepherd is the systems architect, are
you suggesting you would follow a design plan that wasn't committed to some
form of "progression, growth, adaptation or innovation"?

What's left to form the basis of any kind of "team spirit"? What's your
purpose? What would you talk to a customer about if you ever met one? The
number of floating-point ops per second?

Sorry, AAO, but your position is based on nothing but sentiment.

Ian
 
You guys are great! Some troll posts an unthinking MS bash, and you geeks (I
use the term with the highest respect!) turn it into a programming contest.
Keep it up!
 
Never underestimate the power of mass stupidity. When you think people
can't be stupid in vast numbers, just look back on the 2004 elections.
 
Dear Misled,

You are really lost in thinking that C++ is faster than .NET in all ways.
But as it turns out, many of the performance tests comparing .NET with C++
show that it's only marginally slower, or even faster in some areas.
Take a look at this article for details:

http://www.microsoft.com/indonesia/msdn/systemmessperf.aspx

There are major performance hits when using .NET 1.1 native serialization.
.NET 2.0 has improved serialization a hundredfold, and if you use custom
serialization, as shown in the article above, you can obtain speeds almost
twice that of C++...
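
For reference, "custom serialization" in .NET usually means implementing
ISerializable so that only the fields you choose are written. A minimal
sketch (Point is a hypothetical type, not taken from the cited article):

using System;
using System.Runtime.Serialization;

[Serializable]
class Point : ISerializable
{
    public int X, Y;

    public Point(int x, int y) { X = x; Y = y; }

    // Called during serialization: write exactly what is needed.
    public void GetObjectData(SerializationInfo info, StreamingContext context)
    {
        info.AddValue("x", X);
        info.AddValue("y", Y);
    }

    // Special constructor called during deserialization.
    protected Point(SerializationInfo info, StreamingContext context)
    {
        X = info.GetInt32("x");
        Y = info.GetInt32("y");
    }
}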

Usually the performance issues you mention are related to poor coding
practices. You should always wrap disposable objects in a using block, e.g.
"using (MyResource res = new MyResource()) { ... }" (note that using
requires the type to implement IDisposable; plain System.Object does not).
There are a ton of whitepapers on scalability. Microsoft actually released
a 1,000-page paper on scalability design patterns with .NET.

You also mention that you're wrapping unmanaged code in .NET... Resource
allocation in unmanaged code is allowed to go much higher than in managed
code. You have to be sure that you clean up unmanaged resources in the
unmanaged code; the GC will not do this for you.
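
To make that concrete, here is a minimal sketch of the Dispose pattern the
advice above points at (NativeBuffer is a hypothetical example type): the GC
tracks the managed wrapper, but the native memory behind it must be released
explicitly.

using System;
using System.Runtime.InteropServices;

class NativeBuffer : IDisposable
{
    IntPtr buffer;

    public NativeBuffer(int bytes)
    {
        buffer = Marshal.AllocHGlobal(bytes); // unmanaged allocation
    }

    public void Dispose()
    {
        if (buffer != IntPtr.Zero)
        {
            Marshal.FreeHGlobal(buffer);      // deterministic release
            buffer = IntPtr.Zero;
        }
        GC.SuppressFinalize(this);
    }

    ~NativeBuffer() { Dispose(); }            // safety net if Dispose is missed
}

class Demo
{
    static void Main()
    {
        // "using" guarantees Dispose runs even if an exception is thrown.
        using (NativeBuffer buf = new NativeBuffer(1024))
        {
            // ... work with the unmanaged memory ...
        }
    }
}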

The payoff with .NET comes in so many ways, from manageability, to
scalability, to deployment, all the way up to usability. Performance-wise,
I have seen a lot of benchmarks that show it to be comparable to C++ and C.

Good luck!
 
I think that Windows itself does not support true 'real-time' operations
(this is the domain of so-called real-time OSes).

Even if you write code in pure C, it is not guaranteed to perform an
operation exactly at a specified time, e.g. if other high-priority processes
are running or if you access some hardware resources.
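
A quick way to see this from managed code (a sketch; the numbers depend
entirely on the machine and the OS timer resolution): ask for a fixed delay
and measure what you actually get, even with the process priority raised.

using System;
using System.Diagnostics;
using System.Threading;

class TimerJitter
{
    static void Main()
    {
        // High priority helps, but a general-purpose OS still gives no
        // hard bound on wake-up latency; a real-time OS would.
        Process.GetCurrentProcess().PriorityClass = ProcessPriorityClass.High;

        for (int i = 0; i < 5; i++)
        {
            Stopwatch sw = Stopwatch.StartNew();
            Thread.Sleep(10); // request a 10 ms delay
            Console.WriteLine("asked for 10 ms, got {0:F2} ms",
                              sw.Elapsed.TotalMilliseconds);
        }
    }
}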
 
Poor planning and design make for sucky software, no matter what language you
use. Maybe you should try reviewing your system design documentation before
blaming .NET.
 
Tyrant Mikey said:
Never underestimate the power of mass stupidity.

You ought to know...
When you think people can't be stupid in vast numbers,
just look back on the 2004 elections.

Next you will claim Visual Studio is part of the "vast right-wing conspiracy."
You know a thread has reached its limits when it devolves into politics.
 
Hey, I'm a huge FAN of Visual Studio. But my point remains valid. Large
numbers of people doing the same thing does not necessarily make that thing
right. History is replete with such examples.
 