Joe said:
I have a problem with an application which runs smoothly, but once every
few seconds it shudders. Having used the .NET Stopwatch class, I now know
that a certain loop generally runs very quickly, but sometimes it takes
ten times as long. The loop is not trivial, so the easiest way for me to
find the reason for the delay would be to profile it in such a way that I
can see the time each call to it takes, not just the accumulated time.
I'm interested in your problem. I've modified the code you posted to try
to reproduce the spikes you saw, but I can't.
I can think of two obvious possible culprits:
* GC pauses
- I don't know how much garbage you're generating in your application,
but depending on its size and nature (e.g. if you're not recycling large
objects that live on the large object heap), you may be hitting expensive
gen2 collections too often. What percentage of time in GC are you seeing
in existing profilers, and how does that compare to your total 'long
pauses' as a fraction of total run time?
* Badly behaved drivers
- It's possible that you've got a badly implemented network or sound
card driver (or others) being activated by background tasks. Does the
same problem occur on other machines with different hardware?
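One way to check the GC theory directly is to snapshot GC.CollectionCount
for each generation around the timed section, so that any slow iteration
can be matched against a collection. Here's a sketch (the names and the
dummy workload are mine; substitute your real loop):

```csharp
using System;
using System.Diagnostics;

class GcProbe
{
    // Runs 'work' and reports elapsed ms plus how many collections of
    // each generation happened while it ran.
    public static void Probe(Action work, out double ms,
                             out int gen0, out int gen1, out int gen2)
    {
        int g0 = GC.CollectionCount(0);
        int g1 = GC.CollectionCount(1);
        int g2 = GC.CollectionCount(2);
        Stopwatch sw = Stopwatch.StartNew();
        work();
        sw.Stop();
        ms = sw.ElapsedTicks / (double) Stopwatch.Frequency * 1000;
        gen0 = GC.CollectionCount(0) - g0;
        gen1 = GC.CollectionCount(1) - g1;
        gen2 = GC.CollectionCount(2) - g2;
    }

    static void Main()
    {
        for (int iter = 0; iter < 20; iter++)
        {
            double ms;
            int gen0, gen1, gen2;
            Probe(() =>
            {
                // Dummy workload; replace with the loop that spikes.
                double y = 0;
                for (int i = 0; i < 1000000; i++)
                    y = Math.Sqrt(i);
            }, out ms, out gen0, out gen1, out gen2);
            // A slow iteration with a gen2 increment points at the GC.
            Console.WriteLine("{0:F3}ms gen0+{1} gen1+{2} gen2+{3}",
                              ms, gen0, gen1, gen2);
        }
    }
}
```

If the 10x iterations line up with gen2 increments, you have your
culprit; if the counts stay flat across a spike, look outside the CLR.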
Here's the code (C# 3.0) I was working with, trying to reproduce your
data at http://www.greenleaf.dk/time.jpg. I tweaked the inner loop to run
in about 16.6 ms on my machine.
---8<---
using System;
using System.Collections.Generic;
using System.Diagnostics;
using System.IO;
using System.Linq;

class App
{
    // Mean squared deviation from the average (population variance).
    static double Variance(IEnumerable<double> values)
    {
        double average = values.Average();
        double result = 0;
        foreach (double value in values)
        {
            double diff = average - value;
            result += diff * diff;
        }
        return result / values.Count();
    }

    // Standard deviation is the square root of the variance.
    static double StandardDeviation(IEnumerable<double> values)
    {
        return Math.Sqrt(Variance(values));
    }

    static void Main()
    {
        Stopwatch sw = new Stopwatch();
        double[] times = new double[2000];
        int c = 0;
        while (true)
        {
            sw.Reset();
            sw.Start();
            // Busy-work; y exists only so the loop isn't optimized away.
            double y = 0;
            for (int i = 0; i < 50000000; i++)
                y = Math.Sin(Math.Cos(Math.Sqrt(i)));
            sw.Stop();
            times[c++] = sw.ElapsedTicks
                / (double) Stopwatch.Frequency * 1000;
            if (c == times.Length)
            {
                using (StreamWriter w = File.CreateText("times.txt"))
                {
                    foreach (var time in times)
                        w.WriteLine(time);
                    w.WriteLine();
                    var average = times.Average();
                    w.WriteLine("Average: {0:F3}ms", average);
                    w.WriteLine("Variance: {0:F5}ms^2", Variance(times));
                    w.WriteLine("StdDev: {0:F5}ms",
                        StandardDeviation(times));
                    foreach (var result in
                        from index in Enumerable.Range(0, times.Length)
                        let time = times[index]
                        let diff = (time - average) / average
                        where Math.Abs(diff) > 0.02 // 2%
                        select new { Index = index, Time = time,
                                     Difference = diff })
                    {
                        w.WriteLine(
                            "Perturbed: {0:F3}ms at {1}, " +
                            "difference {2:F1}%",
                            result.Time, result.Index,
                            result.Difference * 100);
                    }
                }
                break;
            }
        }
    }
}
--->8---
Compiled using:
$ csc -optimize -debug- app.cs
I'm running it in a Cygwin Bash shell, using the nice command to elevate
its priority:
$ nice -n -20 ./app.exe
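An alternative to nice is to raise the priority from inside the process
itself, using the standard System.Diagnostics and System.Threading APIs.
A sketch (High is usually enough; RealTime/Highest can starve the rest of
the machine, and none of this stops GC pauses, it only reduces
scheduler noise):

```csharp
using System;
using System.Diagnostics;
using System.Threading;

class PriorityDemo
{
    static void Main()
    {
        // Raise the scheduling priority of the whole process...
        Process.GetCurrentProcess().PriorityClass =
            ProcessPriorityClass.High;
        // ...and of the thread doing the timing.
        Thread.CurrentThread.Priority = ThreadPriority.Highest;

        Console.WriteLine("Process: {0}, thread: {1}",
            Process.GetCurrentProcess().PriorityClass,
            Thread.CurrentThread.Priority);
    }
}
```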
I get these tail results from the output, 'times.txt':
---8<---
Average: 16.666ms
Variance: 0.00218ms^2
StdDev: 0.04666ms
Perturbed: 17.013ms at 379, difference 2.1%
Perturbed: 17.055ms at 439, difference 2.3%
Perturbed: 17.105ms at 499, difference 2.6%
Perturbed: 17.033ms at 559, difference 2.2%
Perturbed: 17.216ms at 838, difference 3.3%
Perturbed: 17.031ms at 979, difference 2.2%
Perturbed: 17.003ms at 1399, difference 2.0%
--->8---
These differences are minor, nowhere near the 10x inflation you're
seeing.
-- Barry