VB100 test results:
Antivir - 12 passes, 14 fails (failed the 2 most recent tests)
http://www.virusbtn.com/vb100/archives/products.xml?drweb.xml
Avast! - 7 passes, 17 fails (passed the 2 most recent tests)
http://www.virusbtn.com/vb100/archives/products.xml?avist.xml
AVG - 2 passes, 19 fails (passed the last test, several fails before that)
http://www.virusbtn.com/vb100/archives/products.xml?avg.xml
Antivirus products often fail the VB100 for no other reason than
differing philosophies between the tester and the product developer. A
failure to pass the VB100 doesn't necessarily mean that the product is
incapable of detecting the test samples. A different test protocol
with the same test samples would produce an entirely different result.
I recall a recent test result at Uni Hamburg where, if you looked at
only the results for ITW (In The Wild) viruses, AVG and other products
scored 100%. Almost all AV products do well in this category. They
perform the basic function of an anti _VIRUS_ very well: they all
detect _KNOWN VIRUSES_ which are currently in active circulation. That
being the case, making judgements about a product on the basis of just
a VB100 pass/fail on ITW viruses is absurd and foolish. The pass/fail
idea itself is actually childish.
Antivirus products nowadays do far more than detect ITW viruses. The
best ones detect virus droppers (which are Trojans), and they test
well against a "zoo" (a large collection of viruses, including old
viruses no longer deemed to be ITW). They now perform better in the
Trojan detection category than Trojan-specific scanners. They have a
variety of other capabilities and features as well, which require
extensive and thorough testing.
Since antivirus products are thus becoming anti _MALWARE_ products (covering
a wide variety of malicious code), I tend to use a different yardstick. The
VB100 has become irrelevant IMO. And by an anti-malware yardstick,
AVG is practically at the bottom of the heap. So is the highly touted
NOD32.
Art
http://www.epix.net/~artnpeg