Jemar
In the following Win32 program:
///////////////////////////////////////////////////////////////////////////////
....
static int i=0;
ThreadFunction( ... )
{
while( true )
{
i++;
Sleep(1); // 1 ms delay
}
}
main()
{
....
StartThread( ..., &ThreadFunction, ... );
Sleep( 10000 );
printf( "i=%d\n", i );
getchar();
....
}
///////////////////////////////////////////////////////////////////////////////
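For reference, here is a self-contained, compilable version of the sketch
above (this assumes the StartThread placeholder stands for a plain
CreateThread call, and makes the counter volatile so the final read in main
is not optimized away):
///////////////////////////////////////////////////////////////////////////////
#include <windows.h>
#include <stdio.h>

static volatile int i = 0;

static DWORD WINAPI ThreadFunction( LPVOID param )
{
    (void)param;
    while( TRUE )
    {
        i++;
        Sleep( 1 );          // intended 1 ms delay
    }
    return 0;
}

int main( void )
{
    CreateThread( NULL, 0, ThreadFunction, NULL, 0, NULL );
    Sleep( 10000 );          // let the counter run for 10 seconds
    printf( "i=%d\n", i );
    getchar();
    return 0;
}
///////////////////////////////////////////////////////////////////////////////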
The result is invariably i=1000, which indicates that the increment
occurred only every 10 milliseconds (10000 ms / 1000 increments) instead of
the desired 1 ms.
Now, if I change the program to start a multimedia timer, the program
reacts according to expectation, i.e. the Sleep(1) is released after 1 ms
and the result becomes 'i=10000'.
///////////////////////////////////////////////////////////////////////////////
....
static int i=0;
TimerFunction( ... ) // mmtimer callback; intentionally does nothing
{
}
ThreadFunction( ... )
{
while( true )
{
i++;
Sleep(1); // 1 ms delay
}
}
main()
{
....
timeSetEvent( 100, 0, &TimerFunction, 0, TIME_PERIODIC ... );
StartThread( ..., &ThreadFunction, ... )
Sleep( 10000 );
printf( "i=%d\n", i );
getchar();
....
}
///////////////////////////////////////////////////////////////////////////////
Also, if I make the multimedia timer stop after 5 seconds, the result
will be 'i=5000'.
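To be concrete, 'stop after 5 seconds' means something along these lines (a
sketch; it assumes the periodic timer is cancelled with timeKillEvent and
the thread is started as in the first listing):
///////////////////////////////////////////////////////////////////////////////
#include <windows.h>
#include <mmsystem.h>        // link with winmm.lib
#include <stdio.h>

static volatile int i = 0;

// mmtimer callback; its only job is to exist
static void CALLBACK TimerFunction( UINT id, UINT msg, DWORD_PTR user,
                                    DWORD_PTR dw1, DWORD_PTR dw2 )
{
}

static DWORD WINAPI ThreadFunction( LPVOID param )
{
    (void)param;
    while( TRUE )
    {
        i++;
        Sleep( 1 );          // intended 1 ms delay
    }
    return 0;
}

int main( void )
{
    MMRESULT timerId = timeSetEvent( 100, 0, TimerFunction, 0, TIME_PERIODIC );

    CreateThread( NULL, 0, ThreadFunction, NULL, 0, NULL );

    Sleep( 5000 );            // mmtimer alive for the first 5 seconds...
    timeKillEvent( timerId ); // ...then cancelled
    Sleep( 5000 );            // remaining 5 seconds without the mmtimer

    printf( "i=%d\n", i );    // comes out around 5000
    getchar();
    return 0;
}
///////////////////////////////////////////////////////////////////////////////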
I have come to the conclusion that for as long as the mmtimer is
enabled the scheduling priority/resolution of the whole process is
being boosted in some way.
While having an mmtimer makes the whole thing work, I would prefer to
understand exactly what is going on so I can come up with a more
direct/explicit solution to the problem.
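For instance, if the explanation really is the system timer resolution,
then presumably an untested sketch like the one below, which requests 1 ms
resolution directly with timeBeginPeriod/timeEndPeriod instead of creating
a dummy periodic timer, would have the same effect:
///////////////////////////////////////////////////////////////////////////////
#include <windows.h>
#include <mmsystem.h>        // link with winmm.lib
#include <stdio.h>

static volatile int i = 0;

static DWORD WINAPI ThreadFunction( LPVOID param )
{
    (void)param;
    while( TRUE )
    {
        i++;
        Sleep( 1 );          // intended 1 ms delay
    }
    return 0;
}

int main( void )
{
    timeBeginPeriod( 1 );    // ask for 1 ms system timer resolution

    CreateThread( NULL, 0, ThreadFunction, NULL, 0, NULL );
    Sleep( 10000 );
    printf( "i=%d\n", i );   // should be close to 10000 if the theory holds

    timeEndPeriod( 1 );      // every timeBeginPeriod must be matched by timeEndPeriod
    getchar();
    return 0;
}
///////////////////////////////////////////////////////////////////////////////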
Thanks.