AV-Comparatives testing


badgolferman

Proactive http://www.av-comparatives.org/seiten/ergebnisse_2005_11.php
On Demand http://www.av-comparatives.org/seiten/ergebnisse_2005_08.php

Can someone please help me understand how the ratings in the proactive
chart differ from those in the on-demand one? The first chart shows
numbers ranging from 4% to 62%. I find it hard to believe the
differences between products are that great. According to the chart,
AVG detected 4% of the sample malware and NOD32 detected 62%.

My only explanation is that the On-Demand chart covers identified
malware and the Proactive chart covers unidentified malware. Is this
correct?
 
badgolferman said:
My only explanation is that the On-Demand chart covers identified
malware and the Proactive chart covers unidentified malware. Is this
correct?

Sort of... The retrospective tests measure the heuristic capability of
the AV engines by ensuring the signature databases are out of date.
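
To make that concrete, here is a minimal Python sketch of the idea (the
freeze date, samples, and detection flags below are hypothetical, not
AV-Comparatives' actual data or harness):

    from datetime import date

    # Hypothetical setup: definitions were frozen on this date, and every
    # test sample first appeared AFTER the freeze, so no scanner can have
    # an exact signature for it -- only heuristics can catch it.
    FREEZE_DATE = date(2005, 8, 1)

    samples = [
        # (sample_id, first_seen, alerted_by_frozen_scanner)
        ("worm-a", date(2005, 9, 3), True),
        ("troj-b", date(2005, 9, 20), False),
        ("virus-c", date(2005, 10, 11), True),
    ]

    # Keep only samples that appeared after the signature freeze.
    new_samples = [s for s in samples if s[1] > FREEZE_DATE]

    detected = sum(1 for s in new_samples if s[2])
    rate = 100.0 * detected / len(new_samples)
    print(f"proactive detection rate: {rate:.0f}%")  # 67% here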
 
badgolferman said:
My only explanation is that the On-Demand chart covers identified
malware and the Proactive chart covers unidentified malware. Is this
correct?


They describe their charts on their web site. It's been a while since I
read it, but I recall that the proactive chart rates how old versions of
products do against NEW threats. That is, they take an old version and
see how it fares against today's pests, which shows how well the product
works against unknown pests. As you can see, most anti-virus software
doesn't fare well at all without the signature updates that identify new
threats. BitDefender and NOD32 were best, but the chart shows that you
really need to keep up with the updates.

Unfortunately, AV-Comparatives tests only a small sample of all
anti-virus software.
 
badgolferman said:
My only explanation is that the On-Demand chart covers identified
malware and the Proactive chart covers unidentified malware. Is this
correct?

Mainstream AV scanners use a careful balance of heuristic and "exact"
identification. Heuristic detection is proactive in that it strives to
recognize new and "unknown" malware. Exact ID is reactive, meaning that
malware samples have already been analyzed and definition files created
and released.

Proactive tests attempt to compare the heuristic capabilities of
scanners ... their ability to alert on new and "unknown" malware. The
tests can be done in different ways. One way is to use a test collection
consisting only of very recently discovered malware samples. Each
scanner is then handicapped with an old, outdated definition set so that
it has no exact-ID capability for the samples in the test collection. If
a scanner alerts, presumably it is due only to its heuristic (proactive)
capabilities.
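
As a toy illustration of that split (nothing like a real engine; the
fingerprint set and the heuristic rule below are invented for the
example):

    import hashlib

    # Hypothetical "frozen" definition set: fingerprints of malware known
    # at freeze time. A sample discovered after the freeze won't be here.
    FROZEN_DEFS = {
        "placeholder-hash-of-an-old-worm": "Worm/Oldie",
    }

    def exact_id(data: bytes):
        """Reactive detection: look the file's hash up in the definitions."""
        return FROZEN_DEFS.get(hashlib.sha256(data).hexdigest())

    def heuristic(data: bytes) -> bool:
        """Proactive detection: a crude rule of thumb. Real heuristics
        are far more elaborate (code emulation, behavior analysis, etc.)."""
        suspicious = [b"CreateRemoteThread", b"WriteProcessMemory"]
        return sum(data.count(p) for p in suspicious) >= 2

    def scan(data: bytes) -> str:
        name = exact_id(data)
        if name is not None:
            return "known malware: " + name
        if heuristic(data):
            return "suspicious (heuristic alert)"
        return "clean"

    # A new sample has no fingerprint in the frozen definitions, so only
    # the heuristic can fire -- which is what the proactive test measures.
    new_sample = b"...CreateRemoteThread...WriteProcessMemory..."
    print(scan(new_sample))  # suspicious (heuristic alert)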

Proactive testing should be coupled with good false-positive
(false-alarm) tests, since scanners that rely too heavily on heuristics
tend to produce more false alarms.
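
A scoring sketch would compute the two numbers side by side (the results
below are hypothetical; a scanner tuned too aggressively would score
well on the first line and badly on the second):

    # Hypothetical scan results: (is_actually_malware, scanner_alerted)
    results = [
        (True, True), (True, True), (True, False),     # malware samples
        (False, False), (False, True), (False, False), # known-clean files
    ]

    malware = [alerted for is_mal, alerted in results if is_mal]
    clean = [alerted for is_mal, alerted in results if not is_mal]

    print(f"detection rate:   {100.0 * sum(malware) / len(malware):.0f}%")  # 67%
    print(f"false-alarm rate: {100.0 * sum(clean) / len(clean):.0f}%")      # 33%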

Art
http://home.epix.net/~artnpeg
 