Virus Guy
My observation is that M$ (with their various virtual machines running
different patch levels) seems to be treating the "virtual computer" as
a black box (or boxes), subjecting them to whatever web sites they
think are risky, and observing the behavior of the black box as best
they can (i.e. - registry changes? appearance of new files in
unexpected locations? memory violations or buffer overruns?
escalation of user privileges?).
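
To make it concrete, the kind of black-box "snapshot and diff" I'm
picturing might look roughly like the Python sketch below. This is
just my own illustration of the idea, not anything M$ has published;
the watched paths and the visit_url_in_vm() helper are placeholders.

```python
import hashlib
import os

# Rough sketch of black-box "snapshot and diff" detection.
# The directories below are placeholders - a real honeypot VM would
# also watch the registry, running processes, autostart entries, etc.
WATCHED_DIRS = [r"C:\Windows\System32", r"C:\Documents and Settings"]

def snapshot(dirs):
    """Hash every file under the watched dirs so we can diff later."""
    state = {}
    for root_dir in dirs:
        for root, _subdirs, files in os.walk(root_dir):
            for name in files:
                path = os.path.join(root, name)
                try:
                    with open(path, "rb") as f:
                        state[path] = hashlib.md5(f.read()).hexdigest()
                except OSError:
                    pass  # locked/system files - skip them
    return state

def diff(before, after):
    """Report files that appeared or changed after visiting the URL."""
    new = [p for p in after if p not in before]
    changed = [p for p in after if p in before and after[p] != before[p]]
    return new, changed

# Usage (pseudo): snapshot, drive the browser at the suspect URL inside
# the throwaway VM, snapshot again, then diff.
# before = snapshot(WATCHED_DIRS)
# visit_url_in_vm("http://suspect.example")   # hypothetical helper
# after = snapshot(WATCHED_DIRS)
# print(diff(before, after))
```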
This seems to be a brute-force approach to the detection of mal-pages
or mal-content, as opposed to a more careful (and necessarily
automatable) analysis of the URL content itself.
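
By "analysis of the content" I mean something more like the toy
heuristics below - again, just my own sketch, nothing like how a real
scanning engine is built, and the patterns are made-up examples.

```python
import re

# Toy content-analysis heuristics, purely to show what "analyzing the
# URL data itself" might look like, as opposed to watching a VM's
# behavior after the fact.
SUSPICIOUS_PATTERNS = [
    (re.compile(r"unescape\s*\(\s*['\"]%u", re.I), "shellcode-style unescape"),
    (re.compile(r"<iframe[^>]+width\s*=\s*[\"']?0", re.I), "zero-width iframe"),
    (re.compile(r"eval\s*\(\s*unescape", re.I), "eval of unescaped data"),
]

def score_page(html):
    """Return a list of heuristic hits for the given page source."""
    hits = []
    for pattern, label in SUSPICIOUS_PATTERNS:
        if pattern.search(html):
            hits.append(label)
    return hits

# Example:
# print(score_page('<iframe src="http://bad.example" width=0 height=0>'))
```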
But what will come of their efforts? Will they roll it into their own
web search engine and tout it as a "safe" one that will do its best to
keep users away from bad URLs?
Can they automate the detection of bad web content - and build that
into IE? Will they do so in IE7 - a version of IE that will probably
not be compatible (by intent) with anything less than XP?