Stephan Steiner
Hi
My application uses several threads at once: the GUI thread, a timer, a
thread to receive packets (which is blocked most of the time, waiting on a
ManualResetEvent), and a thread to process received packets. Packet
processing is quite complex, so I put received packets in a queue and have
another thread drain the queue; receiving and processing in the same thread
would lead to a lot of dropped packets because the receive buffer wouldn't
be cleared quickly enough.
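For illustration, here is a minimal sketch of the receive/process split
described above, in C# (the application appears to be .NET-based, given the
ManualResetEvent). The class and member names (PacketPump,
ReadFromReceiveBuffer, ProcessPacket, and the AutoResetEvent used to wake
the drain thread) are my own placeholders, not the poster's actual code:

using System;
using System.Collections;
using System.Threading;

class PacketPump
{
    // Signaled by the network layer when data arrives; the receiver
    // thread blocks here most of the time, as described above.
    readonly ManualResetEvent dataReady = new ManualResetEvent(false);
    // Signals the processing thread that the queue is non-empty.
    readonly AutoResetEvent workAvailable = new AutoResetEvent(false);
    readonly Queue queue = new Queue();   // non-generic, CF 1.0 friendly

    void ReceiveLoop()
    {
        while (true)
        {
            dataReady.WaitOne();           // sleep until packets arrive
            dataReady.Reset();
            byte[] packet = ReadFromReceiveBuffer();
            lock (queue.SyncRoot)
                queue.Enqueue(packet);     // cheap hand-off, no processing here
            workAvailable.Set();
        }
    }

    void ProcessLoop()
    {
        while (true)
        {
            byte[] packet = null;
            lock (queue.SyncRoot)
                if (queue.Count > 0)
                    packet = (byte[])queue.Dequeue();
            if (packet != null)
                ProcessPacket(packet);     // the expensive part
            else
                workAvailable.WaitOne();   // queue drained: go back to sleep
        }
    }

    byte[] ReadFromReceiveBuffer() { return new byte[0]; }  // hypothetical
    void ProcessPacket(byte[] p) { }                        // hypothetical
}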
The thread that processes received packets runs at below-normal priority so
that it doesn't disturb the timer and reception threads (it's important that
those get the CPU time they need, which isn't much: during the reception
phase the CPU load, measured via XCPUScalar, stays below 25%). However, if I
have a lot of incoming packets, the receiver thread will eventually go back
to sleep (waiting for the ManualResetEvent to be set again) while the
processing thread is still working. And that's where the weirdness starts.
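(Again purely illustrative: the priority assignment might look like the
following, reusing the hypothetical ProcessLoop from the sketch above.
Thread.Priority is the standard way on desktop .NET; I'm not certain the
Compact Framework of that era exposes it, in which case a P/Invoke to the
native SetThreadPriority would be needed instead.)

Thread processing = new Thread(new ThreadStart(ProcessLoop));
processing.Priority = ThreadPriority.BelowNormal; // keep receiver and timer responsive
processing.Start();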
The application seems to choose a CPU load at random. During the
packet-processing phase the CPU load can be 0% (even though it's still
processing), I get every value between 0% and 41%, and sometimes very high
values (87-97%). Needless to say, when the CPU load is high, processing
finishes a lot quicker. What confuses me is that a thread, regardless of its
priority, is supposed to use up whatever resources are available. If there
are several idle threads (I don't touch the PDA during the processing phase,
so there should be no GUI redraws) and an idle-priority thread with a lot of
work to do, the latter should still end up eating all the CPU time, should
it not? At least that's how it works on the PC: I have the same application
(with minimal changes) running there, where it behaves as I think it should.
The low-priority processing thread effectively uses up the available CPU
time until it has processed every packet there is to process.
So I'm wondering whether there's any fundamental difference between thread
priorities on Windows XP and Windows CE.NET (Windows Mobile 2003), and why
my application doesn't use the same amount of CPU time when I run the exact
same test multiple times.
Stephan