Microsoft hunting down exploits

  • Thread starter: P. Thompson
My observation is that M$ (with their various virtual machines running
different patch levels) seems to be treating the "virtual computer" as
a black box (or boxes): subjecting it to whatever web sites they think
are risky and observing the behavior of the black box as best they can
(e.g. registry changes, new files appearing in unexpected locations,
memory violations or buffer overruns, escalation of user privileges).
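
To make that concrete, here's a rough sketch of what that kind of
before/after state diffing could look like. This is not Microsoft's
actual code - the watched paths, the browser command, and the
visit_url helper are all placeholder assumptions, and a real monitor
would also watch the registry, processes, and memory, which this
sketch skips:

    # Black-box state diffing: snapshot the filesystem, expose the
    # machine to a suspect URL, snapshot again, report what changed.
    import hashlib
    import subprocess
    from pathlib import Path

    WATCHED = [Path("C:/Windows/System32"), Path("C:/Users")]  # assumption

    def snapshot(dirs):
        """Map every file under the watched dirs to a content hash."""
        state = {}
        for root in dirs:
            for path in root.rglob("*"):
                if path.is_file():
                    try:
                        state[str(path)] = hashlib.sha256(
                            path.read_bytes()).hexdigest()
                    except OSError:
                        pass  # locked or vanished file: skip, don't crash
        return state

    def visit_url(url, timeout=60):
        """Drive a browser at the URL (hypothetical command line)."""
        try:
            subprocess.run(["iexplore.exe", url], timeout=timeout,
                           check=False)
        except subprocess.TimeoutExpired:
            pass  # a hung browser is itself a signal worth logging

    before = snapshot(WATCHED)
    visit_url("http://example.test/suspect-page")
    after = snapshot(WATCHED)
    created = sorted(p for p in after if p not in before)
    modified = sorted(p for p in after
                      if p in before and after[p] != before[p])
    # Files appearing outside the browser's own cache are the red flag.
    print("new files:", created)
    print("modified files:", modified)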

This seems to be a brute-force approach to the detection of mal-pages
or mal-content, as opposed to a more careful analysis of the content
behind the URL itself - which would have to be automatable to work at
this scale.
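
By contrast, a content-based check would look something like this
naive sketch: fetch the page and grep the markup for patterns seen in
known exploit pages. The pattern list here is purely illustrative -
and that is the weakness, since it can only catch what someone already
thought to list:

    # Naive static analysis of URL content, for contrast with the
    # behavioral approach above. Fetch the page, flag bad-looking
    # patterns. The list is illustrative, not a real signature set.
    import re
    import urllib.request

    SUSPICIOUS = [
        re.compile(rb"unescape\s*\(\s*['\"]%u", re.I),    # shellcode-style string
        re.compile(rb"<iframe[^>]+width=['\"]?0", re.I),  # hidden iframe
        re.compile(rb"ADODB\.Stream", re.I),              # classic IE drive-by object
    ]

    def scan(url):
        body = urllib.request.urlopen(url, timeout=30).read()
        return [p.pattern for p in SUSPICIOUS if p.search(body)]

    hits = scan("http://example.test/suspect-page")
    print("suspicious patterns:", hits or "none - which proves nothing")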

But what will come of their efforts? Will they roll it into their own
web search engines and tout their search engine as a "safe" one that
will do its best to keep users away from bad URLs?

Can they automate the detection of bad web content and build that
into IE? Will they do so in IE7 - a version of IE that, probably by
design, will not run on anything older than XP?
 
P. Thompson said:
Now where did I say it was going to run on customers' computers?

if it's only going to run on microsoft's computers, who gives a flying
fig how big it is? microsoft can fill their own computers with crap if
they want to...
I was expounding on virus guy's marveling at their technique, which in
the paper they coyly describe as "fairly expensive".

i don't think virus guy was marveling in earnest...
 
Virus said:
My observation is that M$ (with their various virtual machines running
different patch levels) seems to be treating the "virtual computer" as
a black box (or boxes): subjecting it to whatever web sites they think
are risky and observing the behavior of the black box as best they can
(e.g. registry changes, new files appearing in unexpected locations,
memory violations or buffer overruns, escalation of user privileges).

seems to be treating them as a black box? that's exactly what the paper
claims they're doing - it says so in just about as many words...
This seems to be a brute-force approach to the detection of mal-pages
or mal-content, as opposed to a more careful analysis of the content
behind the URL itself - which would have to be automatable to work at
this scale.

it is not automatable... just as the analysis of the content of
executable data is not automatable...
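
(a toy version of the classic diagonalization argument behind that
claim: assume someone ships a perfect is_malicious() decider, and any
program can call it on itself and do the opposite of what the decider
predicts - so no such decider can exist. the decider stub below is
obviously just a stand-in:)

    # toy diagonalization: assume a perfect decider is_malicious()
    # exists (the stub below stands in for it). this program defeats
    # it by misbehaving exactly when the decider calls it clean.
    import sys

    def is_malicious(path):
        return False  # stand-in for the hypothetical perfect decider

    if is_malicious(sys.argv[0]):
        print("decider said malicious, so: do nothing harmful")
    else:
        print("decider said clean, so: do the bad thing")  # contradiction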
 