"cquirke (MVP Win9x)" <[email protected]> wrote
Sorry this is long - some nuances may be lost if I oversnip the quotes
Always the best approach! I agree with that principle whole-heartedly.
No, the DoS effect is no more manageable by other means just because
risk management (patch) isn't available
That's the same logic as "if I have zero RAM my system would be really
fast, because zero RAM needs no swap and there'd be no swapping to HD"
Or "feature A is weak on problem X, but that's OK because it's better
at handling problem Y" {e.g. NTFS, bad hardware, rollback}
IKWYM, but in the absence of a surgical fix, you have to be able to
damage-control through bulkheads. If that is not possible, then you
have a bad design on your hands... any functionalities that can't be
hidden from the world or bulkheaded off have to be 100% COAB
bullet-proof, i.e. held to a standard beyond today's "patch it later"
approach. Better to design in such a way that no code ever has to be
that good.
That's a difficult choice sometimes. Many (hardly all, but many) of the
MS-originated vulnerabilities are because conveying great capacity also
creates great exploits. The classic, of course, is the case of Office macro
viruses. These wouldn't have arisen except that MS provided in Office a macro
language that is pretty much able to run the entire operating system.
That's not the point of failure. The point of failure was giving
"data" any auto-executing programming rights whatsoever.
Exploit opportunities fall into three categories:
1) Those relying entirely on SE (Social Engineering)
2) Those that leverage bad software design
3) Those that leverage bad software coding
What is "bad software design"? Anything that facilitates attack not
dependent on SE - i.e. the definition is circular.
What is "bad software coding"? Coding that creates attack
opportunities unintended by design.
Vendors can disclaim responsibility for (1), but not (2) or (3). If
MS wishes to reduce their responsibilities (and hell, I would - a
massive legal takedown is the only real threat MS has) then they'd do
well to design prudently to avoid (2), knowing that (3) is a problem
that will not go away completely.
IMO, risk management should go beyond patching, as patching only
addresses (3) but does nothing for (2). Often risk-managing (2)
modifies or removes the need for (3), and in the cases we were
discussing - where an exploiter predates the fix - it's all you have.
For example, being able to run scripts within cookies as "local HD
zone" is described by MS as (3). I would cite allowing scripts within
cookies at all as a case of (2).
For example, being able to automate certain controls from within
scripts embedded in email "message text" is described by MS as (3). I
would cite allowing scripts to auto-run in "message text" as (2).
For example, Office macro and VBA malware may extend due to particular
coding holes that are documented as (3). I would consider the notion
of assigning any sort of programming rights to "data files" as (2).
no comfortable solution to the trade-off, since "give the user less power"
usually isn't a good solution. Users want more capacity, and blast Microsoft
for withholding control and flexibility -- but when these are given, a new
door is opened for exploitation.
It's not about "power". Power to whom? The user, or whoever has the
skills to attack the user? How many users ever write scripts, or ever
see scripts other than malware?
It's about risk expectations. A user may choose to read a data file
or email message, but doesn't expect to have conferred programming
rights to this material. In a sense, you could file this as "bad UI".
The industry has evolved a good layered UI, and much of that credit
goes to MS. Features of this UI are...
- common tasks are easy to do
- dangerous tasks prompt for confirmation
- you can find obscure tasks by clicking through menus
- you can shortcut common tasks you know via shortcut keys
IOW, when you look at something, your intuition as to how to do it is
usually on track - and so is your intuition as to what will happen
when you go ahead. When there may be unforeseen consequences, you are
prompted, e.g. "Do you really want to wipe all files off C:?"
This is known as WYSIWYG, and it fails *badly* when it comes to risk.
Most computer concepts are expressed as metaphors that are already
familiar to the user, e.g. "desktop", "folder" and so on - everything
they already know about these concepts applies to the computer
equivalent, and if the analogy is accurate, there's no further
understanding required. That's what makes computers "easy".
When it comes to security (or even basic safety), NT brings a model
that assumes you are a professional system administrator. After all,
the "home" market is just the same as "corporate" but thinner, right?
So everything the user already knows about security - which revolves
around physical access - is badly mismatched to what they need to
know. In effect, they have to pretend to be a corporation with a boss
or administrator, and everyone else as dumb users who can't be trusted
with sharp objects. There are more specific flaws in the way MS has
delivered user account rights etc. but that's another story.
I suspect MS goes further in appeasing "business partners" who want to
manipulate user systems from web sites etc. than most consumers would
consider healthy or appropriate. There's also much pandering to
corporate business needs within what is ostensibly the OS they are not
supposed to be using (XP Home), and finally there's the fear of DRM,
i.e. that your computer will act against you to preserve the rights of
not only MS (WPA) but assorted media pimp cronies.
What this does is FUD up the waters, so that a significant number of
users will always avoid patches for fear of intrusive side-effects.
That's why infosphere infectability will remain so high.
I've never considered disabling scripts a valid solution. I rely on their
execution too often.
As you say, one takes a choice - so far I've chosen to avoid scripting
and use .bat instead (one extension, only two interpreters, no risk of
.bat being embedded in "data" files such as HTML).
I'll use scripting when I figure it out (heh heh) and when I need to
do things that .bat can't do (that will be soon). However I'd either
toggle scripting so it can be turned off when I'm done, or I would
"privatize" it if possible via non-standard extensions and engine
names, etc. so that arbitrary scripts would fail.
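For what it's worth, there is a documented registry toggle that gives
roughly that "off when I'm done" behaviour - a sketch only, assuming
the standard Windows Script Host "Enabled" setting applies to the WSH
version installed on your system (verify before relying on it):

```
; wsh-off.reg - merge to disable Windows Script Host system-wide,
; so arbitrary .vbs/.js files refuse to run until re-enabled.
; (Sketch: "Enabled" is a documented WSH setting, but check it
; against your own WSH version first.)
Windows Registry Editor Version 5.00

[HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows Script Host\Settings]
"Enabled"=dword:00000000
```

Setting the value back to dword:00000001 (or deleting it) restores
scripting, so you get the toggle without renaming engines or
extensions.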
In many enterprise situations, this is a main way that IT
management has to deliver forced patches, repairs, and
configurations on machines enterprise-wide.
Sure - but these are the dudes who are supposed to be using XP Pro,
right? I can understand MS being too timid to push the point when
Win95 came out, as the success of Win95 wasn't assured and besides,
the NT of the day still had the Win3.yuk UI (having the corporate
market remain standardised on the old UI would have been Bad).
But now, the time to subject the home market to risks associated with
the corporate model just so businesses can save a buck is past. The
impact of consumer PCs on the infosphere is non-trivial in a world of
broadband; a more appropriate and intuitive security model is needed
for that market, even if that means business has to pay for Pro.
That would be a win-win for everyone:
- home users get a model they already understand
- business saves on the impact of malware'd home systems
Instead of Home being resented as an artificially stunted Pro, it
becomes genuinely better value for that market.
And what business spends up on Pro, it saves through reduced malware
impact - the bulk of attacks comes from infected consumer PCs on
broadband, and is avoided once those PCs are structurally immune to
infection.
Arm-wrestling always invites a stronger arm to enter the contest.
Exactly. Better to lock out the bad guys than let everyone in and
rely on the bouncers to spot and throw out troublemakers.
Users know that "home" means "a physical place where safety can be
assumed". They know it may be safe to speak to strangers through a
locked door, but less so to invite them in. They lock up the PC when
away so that unauthorised ppl can't access it - and there is NO-ONE
outside the house whose rights to the PC exceeds theirs.
Use that common sense as the basis for XP Home's security model - chop
out all the corporate stuff that allows a notional "administrator"
anywhere in the world to override the user's rights - held at bay only
by a password system that users can't be bothered with and which is in
any case quite porous (password cracks, leakage out of user zones).
Most users see the Internet as a place to consume, i.e. they choose
to visit sites, read stuff and so on as if they were watching a movie
or reading a book. No-one expects the characters on the screen to
jump out and shoot you, or an arm to reach out of a book and stab you
in the chest - why should web sites be allowed to program the user's
PC as a matter of course?
I'll leave your further education on this to Walter. <vbg>
No, please, anything but that!
Certainly, my experience with a Swen infection on NTFS was *exactly*
that - I had to blindly rely on tools running head-to-head with
active Swen to kill it, all the while being unable to be sure other
undetected malware wasn't also running (Swen had killed the av, so
who knows what else may have come in).
If the user asks "is the PC clean now?" I'll have to shrug and say
"it's either clean, or there is active malware that is successfully
hiding from av". The latter statement is always true of course, but
how much more so when you know your attempts to get your av airborne
and detecting malware are done while any active malware is already
airborne and can be dodging or dropping bombs on the runway.
NTFS is *exactly* like a hi-rise with no fire escape.
Yes! This is scary stuff. I've watched AV programs be deleted off of a
system (files vanishing in Explorer) faster than they could be installed.
Yep. This is all entirely predictable from basic theory (even if it's
my own theory - with no formal training I have no idea as to how it
Venns with what is taught as computer science, but to me it's just
common sense). I'd rather have been wrong on this, but if it wasn't
now, it would have been later IMO.
It's getting to where the only recovery is going to be frequent backups
(preferably by imaging), and wipe-then-restore as a recovery. (Funny how
history loops around like that.)
That's a disaster in a number of ways - even if it works (hint:
what's the half-life of an unpatched system during a worm war?) it's
an unacceptable result, especially for an OS touted as "better".
There will always be collateral damage and productivity impact.
Somewhere in another post (this thread or another) I covered the
problems inherent in backup - negative timelines, scope boundaries.
Right now, NT on NTFS is a very precarious situation where malware is
concerned, and I still consider it unfit for general home use.
------------------------------------ ---- --- -- - - - -
Malware coders are the Wild Weasels
of Microsoft Quality Assurance