Lynne
I have a database that logs information based on a timer.
The event runs every 120 seconds.
It is not time-critical, but it is usually pretty accurate.
Sometimes, though, it just slows down - and the 120 seconds
grow to e.g. 500 seconds. I can see this because my data
is timestamped.
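To give an idea of what I mean, here is a rough sketch of the
kind of check that shows the gaps. The SQLite file and the
table/column names are just placeholders, not my actual schema:

# Rough sketch only: "logger.db", "event_log" and "ts" are
# placeholder names, and the timestamps are assumed to be
# stored in ISO format.
import sqlite3
from datetime import datetime

conn = sqlite3.connect("logger.db")
rows = conn.execute("SELECT ts FROM event_log ORDER BY ts").fetchall()
times = [datetime.fromisoformat(r[0]) for r in rows]

for prev, cur in zip(times, times[1:]):
    gap = (cur - prev).total_seconds()
    if gap > 180:   # anything well beyond the expected 120 s
        print(f"{prev} -> {cur}: {gap:.0f} s")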
Now my theory is that if the PC heats up, then at some point
the CPU slows down to protect itself. Is that possible?
The theory is based on the PC standing in a window - and when
the sun is on it, the PC appears to slow down. Looking at the
BIOS settings there is a "75 C" setting for the CPU
temperature, so the theory seems at least plausible.
Any comments? Or other ideas on why the PC suddenly slows
down? (It is not just my database - it is everything on the
PC - and even if I shut down the database, everything else
is still slow.)
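One way I thought of testing the theory (a rough sketch only;
the log file name and the size of the workload are arbitrary):
once a minute, time a fixed chunk of pure-CPU work and record
it next to the wall-clock time. If the same work suddenly takes
several times longer during the slow periods, the CPU really is
running slower, which would fit thermal throttling; if it stays
flat, the delay is coming from somewhere else.

import time
from datetime import datetime

def fixed_work(n=2_000_000):
    """Burn a fixed amount of CPU and return how long it took (seconds)."""
    start = time.perf_counter()
    total = 0
    for i in range(n):
        total += i * i
    return time.perf_counter() - start

# "cpu_probe.log" is an arbitrary file name for this sketch.
with open("cpu_probe.log", "a") as log:
    while True:
        elapsed = fixed_work()
        log.write(f"{datetime.now().isoformat()}\t{elapsed:.3f}\n")
        log.flush()
        time.sleep(60)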