Peteroid
I created a managed C++ application at work on a Dell computer running at
around 3 GHz. It runs fine, and looking at CPU usage (via the Windows Task
Manager), it doesn't even show up (the meter stays near zero, indicating
it's not using much CPU time).
Then I took it home and tried running it on my computer with an AMD Athlon
64-bit processor running at 2 GHz. It's slow as molasses, and it uses 100%
of the CPU!
Why the big difference, and what's going on here? Both (I believe) have the
most recent version of .NET Framework, so why is this happening?
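In case it helps pin things down, here's a minimal C++/CLI sketch (assuming the app is built with /clr) that I could run on both machines to confirm which CLR version each one actually loads and how much CPU time the process really consumes; Environment::Version and Process::TotalProcessorTime are standard .NET calls, everything else is just illustrative:

    using namespace System;
    using namespace System::Diagnostics;

    int main()
    {
        // Runtime version the app is actually executing on
        // (check that it matches on both PCs)
        Console::WriteLine("CLR version: {0}", Environment::Version);

        // Total processor time charged to this process so far --
        // a more precise number than eyeballing Task Manager
        Process^ self = Process::GetCurrentProcess();
        Console::WriteLine("CPU time used so far: {0}", self->TotalProcessorTime);

        return 0;
    }

Running that on both boxes would at least rule out a mismatched Framework version before looking at anything else.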