Best antivirus

  • Thread starter: Tass
Beauregard T. Shagnasty said:
Are you comfortable using a seven-year-old scanning engine?

I wouldn't be. But I'm not.

Symantec virus definition packages frequently include corresponding
updates to the scan engine.

I verified this several years ago, in a discussion with kurt wismer,
using Process Explorer.
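The Process Explorer check amounts to recording the scan-engine file's version before and after a definitions update and seeing whether it changed. A minimal sketch of that comparison; the version strings below are hypothetical placeholders, not real Symantec engine versions:

```python
def parse_version(v):
    """Turn a dotted version string like '20081.3.0.41' into a comparable tuple."""
    return tuple(int(part) for part in v.split("."))

def engine_changed(before, after):
    """Return True if the scan-engine version differs after a definitions update."""
    return parse_version(before) != parse_version(after)

# Hypothetical versions read off the engine DLL (e.g. via Process Explorer's
# DLL view, or the file's Properties > Details tab) before and after an update:
print(engine_changed("20081.3.0.41", "20081.3.0.41"))  # engine untouched -> False
print(engine_changed("20081.3.0.41", "20091.2.0.24"))  # engine bumped    -> True
```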
 
OK, I have a Norton 2005 disc; would this be better to use than the
current free versions of AVG 8 or Avira AntiVir?
 
It's my impression that NOD32 is (or was) the best AV product out there
because it was "hooked" into a system in such a way as to intercept all
data going to your browser. Other AV products detected malware by
scanning the web-content that got cached by your browser.

I don't know if that has changed (and they all do what NOD32 does) at
this point.

I don't put much stock in AV products anyways (but maybe I would if XP
were my main OS, instead of Win-98).

But yes, your NAV 2005 disk should be fine - assuming it will install
and activate itself. I have no idea if, for example, it tries to
contact symantec and get the go-ahead to activate itself.

But if it installs ok, then you should have no problems performing an
update with it. If that doesn't work, google for "intelligent updater"
and download the current update package.

But be aware that NAV 2003 and newer became known for code-bloat. Many
people did not have good things to say about the performance hit caused
by those versions.
 
Yeah, I know about it being "bloatware" and difficult to uninstall,
but those aren't concerns. I've been using AVG since '06 when Norton
expired. Even when I bought a new computer in '07 I just stuck with
AVG. Since I would be using it for the first time on this computer I
have no doubt it will install and update. I'm just wondering if it is
any "better" than the freeware that I mentioned.
 
OK, I went ahead and installed it...thanks for the heads-up about
"intelligent updater"! It was updating, but seemed to get stuck after a
while. Even after a reboot, it said that it was "updating" in the
background, but still said the defs were from '05. The "intelligent
updater" fully updated it to '09 in about ten minutes.
 
OK, I have a Norton 2005 disc; would this be better to use than the
current free versions of AVG 8 or Avira AntiVir?

IMHO, no. Norton is not Symantec and Symantec is not Norton. Yes -
Norton products /are/ sold by Symantec.

Some here say that AVG 8 is a resource hog. Some have given up AVG for
AntiVir and some will say that you can't do much better than AntiVir.

These statements usually bring about some lively debates. You also need
to assess the remainder of your antimalware measures.

Pete
 
My malware/spyware/adware protection is handled by Spy Sweeper.

Of course I have the free versions of the other products
(Malwarebytes, a-squared, Ad-Aware, SpywareBlaster (prevention only),
Spyware Doctor), but all they find are tracking cookies. Been using SS
since '05 and IMO there is nothing better. Its realtime protection
is unmatched by any other product on the market. It is easily worth
the $20.00 a yr. (after playing their game of lowering the price from
$30.00 if you don't purchase right after the trial is over), for me
anyway, because I download a lot of torrents.
 
VanguardLH said:
kurt wismer wrote: [snip]
no, complain about sri.com not being able to design an anti-malware test
that comes anywhere near being reasonable...

virustotal is for testing malware, not anti-malware... even the people
who provide the service say anti-malware tests designed this way are
bogus...

They are upfront with the first statement declaring that their results
are from virustotal.com. They show rankings based on THOSE results. Use
their list for what you want. It's not like they're hiding how they came
up with those results.

it's not about hiding things it's about not understanding things... most
people don't understand that the way virustotal uses scanners is such
that their results are not representative of what a user of the full
anti-virus product would see...

telling people your results come from virustotal and admitting that
those virustotal results are inaccurate (thus making your results highly
questionable) are two entirely different things...
 
kurt said:
[snip]

Which means the results are for on-demand scanning, which is only a
portion of what anti-malware products provide (i.e., they also have
their real-time protections). Okay, but no one ever quantitatively
measures the real-time protections. av-comparatives.org, VB100, and
other such testing of AV products are all based on on-demand scans. I
did find one test that actually had the virtual machine infected and
then used the AV product to see if it caught the infection, which produced
different rankings than the on-demand type of tests, but it was an
informal test and users generally can't find similar test methods being
employed. In general, all such testing is similar, is similarly
incomplete, and thus similarly misleading, but only if the user doesn't
bother finding out just what testing was done.

Name a test that doesn't do the same. We AV users are all ears as we
keep waiting for an all-inclusive test methodology that accurately ranks
the AV products based on all their features to detect and eradicate
malware. Have YOU found one yet? The results from virustotal are
hardly misleading. They're just typical of the limitations found in all
testing of the same type, which is all that users have available to
determine which AV product is better, but only as an indication and not
as an absolute measure.

I think we're back to Bear Bottoms' argument, with his conclusion that no
AV tests are accurate and/or all tests are misleading. So which test do
YOU think is wholly accurate in ranking the AV products? Come on,
reveal it to us, you can do it ... or maybe you can't.
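For what it's worth, a ranking derived from virustotal results amounts to little more than counting, per scanner, how many submitted samples it flagged. A toy illustration of why that is only an indication (the scanners and detection matrix below are entirely made up):

```python
# Made-up detection matrix: sample -> set of scanners that flagged it.
detections = {
    "sample1": {"ScannerA", "ScannerB"},
    "sample2": {"ScannerA"},
    "sample3": {"ScannerA", "ScannerB", "ScannerC"},
    "sample4": {"ScannerB"},
}

scanners = ["ScannerA", "ScannerB", "ScannerC"]

# Rank scanners by how many of the submitted samples each one flagged.
# Note what this ignores: real-time protection, engine versions, settings,
# and timeouts -- exactly the limitations discussed in this thread.
counts = {s: sum(s in flagged for flagged in detections.values()) for s in scanners}
ranking = sorted(scanners, key=lambda s: counts[s], reverse=True)

print(counts)
print(ranking)
```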
 
VanguardLH said:
[snip]

Which means the results are for on-demand scanning, which is only a
portion of what anti-malware products provide (i.e., they also have
their real-time protections).

no, i'm afraid it goes deeper than just being limited to on-demand
scanning...

i direct your attention to a blog post i made on the subject a little
while ago:
http://anti-virus-rants.blogspot.com/2009/01/virustotal-usage-fail.html

the summary is:
- they aren't using the desktop engine but a command-line engine which
can and in some cases does differ significantly from the desktop equivalent
- they're using settings they can't reveal due to NDAs
- the versions of the products aren't necessarily all up to date or even
all equally out-dated
- they may halt engines that take too long for the service virustotal is
providing (and their performance requirements are different than those a
customer would have)
- they're using apples along-side oranges (different products meant for
different purposes) which you should never compare (for i hope obvious
reasons)

[snip]
Name a test that doesn't do the same.

not too long ago av-test.org produced a test that included run-time
detection capabilities... here's a link describing the test and how the
proactive portion of it included actually running the samples:
http://www.virusbtn.com/news/2008/09_02

i gather from the most recent av-comparatives retrospective test that
they plan on implementing similar tests of dynamic detection sometime
this year...
 
kurt said:
- they aren't using the desktop engine but a command-line engine which
can and in some cases does differ significantly from the desktop equivalent

I don't recall using an AV product in which the UI app was the engine.
The engine is separate from the UI. In fact, the UI may be unloaded or
even crash but it doesn't take out the engine (the kernel-mode file
system filter).
- they're using settings they can't reveal due to NDAs

Alas, there isn't much info at VirusTotal regarding their setup for each
AV product. For other tests, usually they announce that settings were
at "highest" (which doesn't often match the install-time defaults in
typical user installs).
- the versions of the products aren't necessarily all up to date or even
all equally out-dated

But this subthread started due to Rick's comment that VirusTotal is
using an old version of NOD32 so there must be some indication to the
user as to which version was in use during the summary period for that
test result report. However, I don't see SRI listing the version in
their summary report. I took a very quick peek at VirusTotal's site and
didn't see versions mentioned there, either. The only time that I see
the product version listed is when I submit a file to them and then look
at the scan report.

With a summary report that spans a period of time, it is possible the
version of the product has changed. That's why I submitted a request to
SRI that they either show coverage by product over several of their
summary reports (since from a single snapshot alone it is hard to gauge how
well a product has fared over time), or keep an archive of their old summary
reports so users could copy them into a spreadsheet to see the
effectiveness of a particular product over repeated snapshots.
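The spreadsheet exercise described above can be sketched in a few lines: average each product's coverage across several summary-report snapshots instead of judging from one. The product names and detection rates here are made up purely for illustration:

```python
from statistics import mean

# Hypothetical detection rates (percent) from several summary-report snapshots.
snapshots = [
    {"ProductA": 92.0, "ProductB": 88.5, "ProductC": 95.0},
    {"ProductA": 90.5, "ProductB": 91.0, "ProductC": 94.0},
    {"ProductA": 93.0, "ProductB": 89.0, "ProductC": 96.5},
]

# Average coverage per product across snapshots -- a crude view of how a
# product fares over time, rather than in a single snapshot.
averages = {
    product: mean(s[product] for s in snapshots)
    for product in snapshots[0]
}

for product, avg in sorted(averages.items(), key=lambda kv: -kv[1]):
    print(f"{product}: {avg:.1f}%")
```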
- they may halt engines that take too long for the service virustotal is
providing (and their performance requirements are different than those a
customer would have)

Not sure how scans that time out for a particular test could be included in a
summary report that includes multiple tests. But then users might not
want to use a product, even with high coverage, that takes a really long
time to detect a pest.
- they're using apples along-side oranges (different products meant for
different purposes) which you should never compare (for i hope obvious
reasons)

Yet each product included in the scan *claims* to also detect viruses.
There are few exactly identical products. If they were identical, we
wouldn't need any of these "results" summaries since every product would
fare the same as another because they were the same. When you shake
flour through a sifter, you're looking for an overall granularity of
powder, not that it is an absolutely perfect consistency. They throw the
suspect at their sifters, one for each product, and see what falls
through. Like PGP, the test is pretty good. Not very good, or
extremely good, or perfectly good, but good enough to provide some gauge
of effectiveness.
not too long ago av-test.org produced a test that included run-time
detection capabilities... here's a link describing the test and how the
proactive portion of it included actually running the samples:
http://www.virusbtn.com/news/2008/09_02

I've never trusted av-test.org. They get commissioned (paid) by AV
vendors to "test" that vendor's product but are given guidelines as to the
test scenarios and sometimes even as to which sample of pests they
are to test against. I'm not convinced they qualify as an *independent*
testing agency. I haven't seen one AV vendor who commissioned
av-test.org to test their product where that product didn't come out
shining like a white knight of security products.

The only free and publicly available "comparison" they offer on their
web site is how often the various AV products provide updates. Oh gee,
golly, big deal.
i gather from the most recent av-comparatives retrospective test that
they plan on implementing similar tests of dynamic detection sometime
this year...

Alas, I remember reading a blog or article from them where they mention
that costs are getting prohibitive to do this testing for free. So they
may go the way of VirusBulletin and others that charge for testing an AV
vendor's product. That means:

- Some vendors won't submit their product for testing, or they will be
selective as to who tests their product, choosing the tester that
presents them with the best image.

- Vendors can pay to have their product tested but they can also request
the results not be published. So if they did really poorly, you don't
get to see it.

- A bias can creep into the tester's methodology regarding products for
which they get paid to test, especially if repeatedly paid in subsequent
tests, versus those that only occasionally pay to have their product
tested.
 
VanguardLH said:
But this subthread started due to Rick's comment that VirusTotal is
using an old version of NOD32 so there must be some indication to the
user as to which version was in use during the summary period for that
test result report. However, I don't see SRI listing the version in
their summary report. I took a very quick peek at VirusTotal's site
and didn't see versions mentioned there, either. The only time that I
see the product version listed is when I submit a file to them and
then look at the scan report.


The only reason I knew which version of NOD32 they were using was because
it was one of two (AhnLab was the other) of the 30+ products listed that
actually showed their version numbers. The fact that they were using the
current version of AhnLab and NOD32v2 (v4 was just released to the public)
shows that the test in question cannot be reliably used as a "comparative"
between AV engines. Not even in a broad sense.

I've never trusted av-test.org. They get commissioned (paid) by AV
vendors to "test" that vendor's product but are given guidelines as to the
test scenarios and sometimes even as to which sample of pests
they are to test against. I'm not convinced they qualify as an
*independent* testing agency. I haven't seen one AV vendor who
commissioned av-test.org to test their product where that product
didn't come out shining like a white knight of security products.


I take it then that ClamAV, CA AV (VET), DrWeb, VBA32 and Rising did not
pay to have their products tested by av-test.org in the test Kurt
referenced? After all, their results were not what I would have called
"shining like a white knight".

Alas, I remember reading a blog or article from them where they
mention that costs are getting prohibitive to do this testing for
free. So they may go the way of VirusBulletin and others that charge
for testing an AV vendor's product. That means:

- Some vendors won't submit their product for testing, or they will be
selective as to who tests their product, choosing the tester that
presents them with the best image.

- Vendors can pay to have their product tested but they can also
request the results not be published. So if they did really poorly,
you don't get to see it.

- A bias can creep into the tester's methodology regarding products
for which they get paid to test, especially if repeatedly paid in
subsequent tests, versus those that only occasionally pay to have
their product tested.


It's possible. Then again, there is also the argument against "for free" in
that you "get what you pay for". For what it's worth, IMHO this is an
argument that will never end. There is no single, definitive test out there
that I am aware of, and there will probably never be one. Even if the
testing parameters did cover a lot of what has been discussed in this
thread, there will continue to be an argument, ad nauseam, over how to rank
the importance of such things as false positives, load on system
resources, speed, etc. All subjective parameters that will vary from person
to person.

The only reason I originally poked my nose in on this thread was that
SRI.com was mentioned as if it were a definitive ranking of AV products
when it is not. IMHO av-comparatives is a much better source for such
rankings, but even it has its limitations.
 
Rick said:
IMHO av-comparatives is a much better source for such
rankings, but even it has its limitations.

While their normal testing is useful, I always find it interesting to
wait until they do the retrospective test. This uses the same version
engine and same database of signatures that were used in the normal test
but against pests that didn't exist at the time. The retro test is
performed months later using the same engine and signatures to see how
they fare against new and unknown pests. So you need to use BOTH their
normal and retro tests to get an indication of a product's coverage.
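One simple way to read the two tests together is to blend the normal (on-demand, known samples) score with the retrospective (previously unseen samples) score. The weighting below is arbitrary, an illustration rather than an established metric, and the rates are made up:

```python
def coverage_indicator(normal_rate, retro_rate, retro_weight=0.5):
    """Blend the on-demand detection rate (known samples) with the
    retrospective rate (samples unknown at signature time).

    Both rates are percentages; retro_weight says how much to value
    detection of previously unseen pests. The 50/50 default is an
    arbitrary choice for illustration only.
    """
    return (1 - retro_weight) * normal_rate + retro_weight * retro_rate

# A product with near-perfect signatures but weak heuristics...
print(coverage_indicator(99.0, 40.0))  # 69.5
# ...versus one with slightly weaker signatures but strong proactive detection.
print(coverage_indicator(95.0, 70.0))  # 82.5
```

The point of the exercise is visible in the numbers: the product with the flashier normal-test score can still come out behind once unknown-pest detection is weighed in.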
 
VanguardLH said:
I don't recall using an AV product in which the UI app was the engine.
The engine is separate from the UI. In fact, the UI may be unloaded or
even crash but it doesn't take out the engine (the kernel-mode file
system filter).

you're being disingenuous... nobody said the engine was in the UI app,
rather, what was said was that the command-line tool that they are
provided with does not necessarily use the same engine as the desktop
tool that customers would normally use...
Alas, there isn't much info at VirusTotal regarding their setup for each
AV product. For other tests, usually they announce that settings were
at "highest" (which doesn't often match the install-time defaults in
typical user installs).

sigh - you're thinking about this all wrong... the virustotal people are
*not* doing tests, therefore there's no reason for them to provide
details about their setup... they'd only be lending credibility to the
wrong-headed practice of interpreting their results as though it was
valid testing methodology - which they have openly criticized on their
own company blog...
But this subthread started due to Rick's comment that VirusTotal is
using an old version of NOD32 so there must be some indication to the
user as to which version was in use during the summary period for that
test result report. However, I don't see SRI listing the version in
their summary report. I took a very quick peek at VirusTotal's site and
didn't see versions mentioned there, either. The only time that I see
the product version listed is when I submit a file to them and then look
at the scan report.

With a summary report that spans a period of time, it is possible the
version of the product has changed. That's why I submitted a request to
SRI that they either show coverage by product over several of their
summary reports (since from a single snapshot alone it is hard to gauge how
well a product has fared over time), or keep an archive of their old summary
reports so users could copy them into a spreadsheet to see the
effectiveness of a particular product over repeated snapshots.

well, you're free to spend your time as you see fit, obviously, but i
wouldn't waste my time trying to get SRI to report what versions
virustotal were using when they shouldn't be using virustotal for that
task in the first place...
Not sure how scans that time out for a particular test could be included in a
summary report that includes multiple tests. But then users might not
want to use a product, even with high coverage, that takes a really long
time to detect a pest.

be that as it may, the timeouts that av companies build into their own
engines for the user's benefits are not the same as the timeouts that
hispasec (the virustotal folks) built into their system... the demands
on their system are considerably different than the demands on the
desktop...
Yet each product included in the scan *claims* to also detect viruses.
There are few exactly identical products. If they were identical, we
wouldn't need any of these "results" summaries since every product would
fare the same as another because they were the same. When you shake
flour through a sifter, you're looking for an overall granularity of
powder, not that it is an absolutely perfect consistency. They throw the
suspect at their sifters, one for each product, and see what falls
through. Like PGP, the test is pretty good. Not very good, or
extremely good, or perfectly good, but good enough to provide some gauge
of effectiveness.

what does that have to do with the fact that different products have
their heuristics tuned for different situations (desktop vs gateway)??
that's what i was alluding to when i said they were using apples
along-side oranges... yes they all claim to detect malware, but some are
intended for user input and others are not and that makes a big
difference in how the non-signature-based technologies are tuned...
I've never trusted av-test.org.

be that as it may, they run tests that include behavioural detection...
They get commissioned (paid) by AV
vendors to "test" that vendor's product but are given guidelines as to the
test scenarios and sometimes even as to which sample of pests they
are to test against. I'm not convinced they qualify as an *independent*
testing agency. I haven't seen one AV vendor who commissioned
av-test.org to test their product where that product didn't come out
shining like a white knight of security products.

judging by a paper andreas clementi (of av-comparatives) published a
while back about which testing organizations you can trust, he didn't
seem to think there was much wrong with the financials of av-test.org...

since av-comparatives.org is notoriously unfriendly to direct links to
their content, you're looking for a paper released back in april 2007
about anti-virus testing websites...
The only free and publicly available "comparison" they offer on their
web site is how often the various AV products provide updates. Oh gee,
golly, big deal.

which would be why they provide links to other publications that publish
their tests...
Alas, I remember reading a blog or article from them where they mention
that costs are getting prohibitive to do this testing for free. So they
may go the way of VirusBulletin and others that charge for testing an AV
vendor's product. That means:

- Some vendors won't submit their product for testing, or they will be
selective as to who tests their product, choosing the tester that
presents them with the best image.

some vendors have pulled out anyways - generally because they get bad
results...
- Vendors can pay to have their product tested but they can also request
the results not be published. So if they did really poorly, you don't
get to see it.

i really think your tinfoil hat needs refitting...
- A bias can creep into the tester's methodology regarding products for
which they get paid to test, especially if repeatedly paid in subsequent
tests, versus those that only occasionally pay to have their product
tested.

consider the possibility that there are byproducts of testing (other
than the report itself) that testing organizations can sell to av
vendors such that the test can remain independent and the testing
organization can still get paid...
 
kurt said:
you're being disingenuous... nobody said the engine was in the UI app,
rather, what was said was that the command-line tool that they are
provided with does not necessarily use the same engine as the desktop
tool that customers would normally use...

Programmers reuse code. They don't reinvent the wheel. They reuse the
same signature database. They reuse the same heuristics algorithms (if
any can be applied against static files, that is). The engine or driver
itself has no UI. Any windowed or console interface to it will call
methods available within the same modules used by the engine.
be that as it may, the timeouts that av companies build into their own
engines for the user's benefits are not the same as the timeouts that
hispasec (the virustotal folks) built into their system... the demands
on their system are considerably different than the demands on the
desktop...

So, in other words, we have no information on which to base a judgment as
to what causes the timeouts at virustotal.com for some products, or can
even tell if those timeouts are typical of a particular product or are
pseudo-random across all products. That a product times out as used
by virustotal.com cannot be used in determining whether or not a product
is adequate for use on a particular workstation. So why bring it up?
be that as it may, they run tests that include behavioural detection...

Too bad they don't then go publish them so we users can actually see
them.
which would be why they provide links to other publications that publish
their tests...

Actually I've never seen those official reports from av-test.org
published by the companies that commissioned av-test.org to do the
test. All you get is the overview that the commissioner composed, in
which they cite the av-test.org report. This comes back to who owns
the publication. It would be the entity that paid for its creation.
That might be why av-test.org can't provide a list of their reports. It
also means we won't ever see them except after their modification or
filtering to align with the commissioner's intent.
i really think your tinfoil hat needs refitting...

Time to take off the rose-colored eyeglasses and see the real world.

Now we start the battle of words. No thanks.
 
VanguardLH said:
Programmers reuse code. They don't reinvent the wheel. They reuse the
same signature database. They reuse the same heuristics algorithms (if
any can be applied against static files, that is). The engine or driver
itself has no UI. Any windowed or console interface to it will call
methods available within the same modules used by the engine.

i'm not going to debate the way things *ought* to be - there are
documented cases of command-line av tools having significantly different
capabilities than the desktop av product of the same vendor... mcafee's
stinger, for one... trend's sysclean is another...

furthermore, this point isn't me pulling things out of my arse, i'm
relaying a point that the hispasec/virustotal folks themselves have made...
So, in other words, we have no information on which to base judgment as
to what causes the timeouts at virustotal.com for some products or can
even tell if those timeouts are typical of a particular product or are
pseudo-random across all products. That a product times out as used
by virustotal.com cannot be used in determining whether or not a product
is adequate for use on a particular workstation. So why bring it up?

because it calls into question the results of tests based on virustotal...
Too bad they don't then go publish them so we users can actually see
them.

they let magazines publish them instead...
Actually I've never seen those official reports from av-test.org
published by the companies that commissioned av-test.org to do the
test. All you get is the overview that the commissioner composed, in
which they cite the av-test.org report. This comes back to who owns
the publication. It would be the entity that paid for its creation.
That might be why av-test.org can't provide a list of their reports. It
also means we won't ever see them except after their modification or
filtering to align with the commissioner's intent.

except of course that i provided a link already to republishing of
av-test results at vb (who most certainly did not commission any test
from them)...
Time to take off the rose-colored eyeglasses and see the real world.

Now we start the battle of words. No thanks.

i'm sorry, i should never have expressed it quite that way... that was
wrong of me...

i do, however, feel you are putting too much weight in conspiracy
theories... it's easy to believe the worst in people (heck, i just did
it in my previous post), and in this context the worst to believe when
you discover that an av testing organization gets some or all of its
revenue from av vendors is that the vendors are paying for cooked
results, that the test is not independent, that there is inherent bias
because of the financial relationship...

but that assumes the worst - there are, as a matter of fact, a variety
of other business models that allow publicly released av tests to remain
independent while the organization still derives a revenue stream from
av vendors... one hypothetical example is the following: let's say that
a truly impartial independent test has been performed by a testing
organization, and let's further say that that organization protects its
brand and symbols as intellectual property - in order for any vendor to
mention the test in their marketing (which they'll want to do if they
perform well) they'll have to pay to license the brand IP of the test
organization... in this scenario the money is paid not for the test
results but for the rights to reference the test at all...

if we're always going to believe the worst, we also have to throw the
vb100 out the window because of the relationship between virus bulletin
and sophos... if, on the other hand, we keep an open mind about virus
bulletin, we really ought to afford av-comparatives and av-test the same
benefit of the doubt...
 
I read through the entire thread; can someone please list links to the
reliable sites where these programs can be downloaded?

Which one(s)?
I am afraid I may get some fake ones if I just click on any of them.

Thank you very much.

dcarch

Please don't top post.

Pete
 
I am sorry, I am not very good at this kind of thing.

Do you recommend the following?
If so, where are reliable places to get them?

Thanks again

dcarch
I use the following, all free for the downloading:

Anti-virus:

Avast! <http://www.avast.com/eng/avast_4_home.html>. It runs in the
background, scanning everything I download. I especially like the boot-time
scan.

Anti-malware:

Spybot Search & Destroy, <http://www.spybot.info/en/index.html>
Superantispyware (SAS), <http://www.superantispyware.com/>
Malware Bytes (MBAM), <http://www.malwarebytes.org/>
a-squared <http://www.emsisoft.com/en/software/free/>

These are "on demand" scanners with overlapping coverage. I update and scan
my system with one or more of these about once a week.

HTH...
 