Why would anyone WILLINGLY give malware any permission to do anything?
Most wouldn't, but the "I_AM_A_VIRUS.EXE" PoC showed that there are
indeed folks who will "open" such things. Why? Perhaps "I don't
believe it" reverse-SE, disgruntled users on work PCs, etc.
The problem is that quite often the consequences of doing things are
not obvious (or even visible), and Windows was always written to
assume good intentions, as in "scripts are usually safe".
You guys are priceless in your endless blind defense of Microsoft
decisions!
Jeez, you are so blind you can't tell when we're attacking dumb-ass
Microsoft design decisions. Do you even read what you reply to?
The FACT is Microsoft ADMITS it had no choice but to leave
the door wide open to accept any installer request for access to
anything. Any reasonably clever hacker can therefore write code to
pretend his malware is an installer of a "trusted" application, and
such an attack will do whatever it wants.
That's not the main problem.
When you install sware, you know you are giving it traction to not
only run code, but... well, to install software, DUH.
However, the same consequences could arise when you:
- visit a web site
- read "message text"
- open an MS Office "document"
- simply connect to the Internet (RPC etc.)
That's bad design, when content can pretend to offer the low risk of
"reading data" but actually execute the higher risk of running code.
This is before you factor in code insanity, i.e. that code written to
safely view data may in fact run it as raw code due to unchecked
buffers or whatever. The take-home lesson there is that all content
handling can be dangerous and therefore should be avoided until the
user has initiated that process. That lesson has not been taken home.
It's taken years for MS to slowly retreat from the excesses of IE 4's
"all the world's a web page" model, MS Office's auto-running of
macros in "data" files, and Outbreak taking orders to spam.
If you include Windows in that statement you are entirely correct.
Any program. Yes, if you source Windows in the form of a tainted
download, or counterfeit CD, then what you're installing may be a
little more than just Windows alone ;-)
Windows is the biggest threat to your PC's security because of how it
was originally written and nothing to date changes that.
Current Windows is based on NT, and NT was written to be a network
chew-toy. It was intended that some big-boss system administrator
would be able to fiddle with PCs through the network, overriding any
wishes the user might have had on the subject.
When that design is chucked into broadband consumerland, guess what
happens? Anything that can spoof "sysadmin" status has all that
lovely remote admin access to play with.
XP was the first NT to be mass-sold into consumerland. It was also
the first version of Windows to be open to pure network worms that
attack within minutes of connecting to the Internet, without running
any apps at all - and there were two major outbreaks of that (Lovesan
et al through RPC, Sasser et al through LSASS).
And now in Vista, we find the RPC service cannot be set not to restart
the whole damn PC whenever it falls on its ass. Where's the logic in
that? The only logic I can see is that corporate sysadmins want
access to the system at all times, even if the user kills RPC and thus
potentially blocks remote access. And because the same basic code
base is used across all Vistas, us home users have to have the same
"solution" for this as crafted for corporate needs.
MS still doesn't "get" that consumer needs differ so much from
pro-IT's that you cannot simply use the same design as-is for both.
It's not enough to rip out the geekiest bits, dab on a coat of
"easy to use" paint, and call that "Home".
Windows has accumulated patches on top of patches over the course of
20-plus years. Just for kicks, it would be damn interesting to see all
the source code, don't you think?
IMO, this isn't where the problems come from. If anything, I'd expect
*NIX to have even deeper and more tortuous legacy roots. Only Apple
have slashed and burned compatibility, mainly when changing
processors, and I'm not sure how relevant that is, either.
In fact, I'd say the greatest risks in a new Windows come not from
legacy carry-over, but from new 1.0 feature sets added for the first
time.
Why is Windows so weak when it comes to security? Well, Mr. Gates
himself made a poor decision. When Windows was first being developed,
the Internet (main threat) was unknown to most. Microsoft originally
ignored the Internet. Gates is on record saying the Internet was a
passing fad that Microsoft wasn't interested in.
If you're going to initiate a discussion topic, you need to be a bit
more specific. For example, when you say "Windows", where are you
joining the evolutionary path - Windows 1.0, Windows 3.0 or 3.1,
Windows 95? You'd expect *NIX to have the strongest Internet
heritage, given that it was invented by a telecommunications
enterprise with communications as a major goal.
In fact, I'd say the version of Windows that had the best by-design
safety would have been Windows 95. This predated web browsers that
ran active content, HTML email clients that autoran scripts, HTML and
scripting embraced as internal technologies, deep integration of the
web browser, RPC and other remote-facing "services", ActiveX opening
up DDE/OLE to Internet access, etc.
Only after he realized that was a huge miscalculation did Microsoft
start trying to patch the huge number of security holes hackers were
starting to exploit in Windows itself. The stupid policy of turning
everything on (like file sharing) made Windows easy prey to port
scanners, as did the laughable early attempts with Microsoft's early
browsers and ActiveX.
The sequence was a bit different.
Even before Windows shipped with networking capabilities, viruses were
a clear and present danger with diskette swapping and BBS downloads as
the vectors. Destructive payloads were more common than today.
Then malware simply used by-design opportunities that Microsoft handed
out on a plate - MS Office macro viruses, scripts that used Outbreak's
by-design functionality, and HTML scripts within email "message text".
Quite late in the Win9x era, we saw a move to the discovery and use of
exploitable code defects. The first spectacular examples were Code Red
and later SQL Slammer (Sapphire), which swept through servers like
wildfire. Still, at this point, Win9x users were not at risk
unless they'd installed something that dropped a SQL engine on the PC.
When XP waved RPC, LSASS etc. at the Internet, mass exploits of
defects in these services followed fairly swiftly. From that moment
on, the search has been for exploitable code defects, rather than
simply using by-design opportunities that are beginning to wane.
The problem is, no matter how much Windows gets patched, it still
wasn't designed as a secure OS. Microsoft had plenty of time to fix
this oversight by rewriting Windows from scratch.
I make a distinction between "security" and "safety".
When you need some folks or contexts to use risky functionalities and
others not, then you need "security" to mediate access to these
things. But when you do NOT need any folks or contexts to have access
to risky things, then you simply need to rip these out altogether.
A piss-weak strategy is to rely on "security" to act as a zero-pass
band-aid instead of building in "safety". Would you feel safer if
nuclear weapons were never invented, or if anyone could pick up a
phone and command a strike, blocked by the 100%-foolproof security of
needing an impossible-to-guess 3072-character string?
NT was indeed designed as a secure OS, unlike Win9x - from the user
accounts and domain logon down to NTFS, it's designed to secure access
to everything - but, alas, it also opens everything to remote access,
"protected" by this security. And XP has suffered far more
devastating mass drive-by attacks than Win9x as a result.
On writing the OS from scratch, I remember it was claimed in the NT 4
era that the whole code base had been re-written to root out all
unchecked buffers. Er... right. As long as folks write in C, we will
prolly have unchecked buffers and similar exploitable defects, and
this underlying factor prolly applies equally to *NIX and MacOS.
They chickened out, fearful they would lose too many customers if
Windows suddenly became more secure but nobody's hardware or software
worked anymore with this new beefed-up Windows.
Interesting you mention that - as they have indeed come closer to
doing just that with XP SP2 and even more so with Vista.
...you would think Vista would be more secure, but all Microsoft did
was put a band-aid on Windows called UAC, which is badly flawed
Actually, UAC is the temporary tip of a far larger iceberg of safer
re-design. It is there to bridge between today's apps and the safer
(or "more secure", if you prefer) native design of Vista.
UAC isn't going to be developed further; it's more likely to fall away
as development embraces the new Vista practices. What happened to
Share.exe between Win95 and Win98 is what will happen to UAC... in a
few years' time, apps that throw up UAC prompts today will not run.
Hopefully, Vista64 will be that more secure platform - with DEP,
signed drivers etc. as the norm. It's the only clean-slate
opportunity MS is likely to get in the next 5-10 years, so I hope they
don't squander it by allowing today's practices to continue.
I'm not against the concept of UAC, I'm simply surprised Microsoft did
such a crappy job with it, considering it's taken them over 5 years to
push Vista out the door. What have they been doing all this time?
Prolly similar to what they did when Win95 was in (protracted) beta.
In both cases, the current OS had core reasons why it HAD to be
redeveloped. Win3.yuk was dying every few hours because the 64k
global heaps were being overrun by modern multitasking needs. XP
is being shot to pieces because most of its security depends on
limited account rights, and no-one developing consumer software has
given a damn about writing for use with less than admin rights.
In both cases, MS responded by building a relatively clean-slate OS
designed to implement a new software standard, with concessions added
so that current software will still work.
In 1995, the new standard was 32-bit code, as supported by the
minority NT OS of the time. In 2006, the "new" standard was pretty
much the same one they advocated for XP, i.e. develop code so that it
can run in limited user accounts, sign your drivers, etc.
The original Win95 moved everything from 16-bit to 32-bit heaps, thus
killing the resource heap crisis once and for all. At the API
level, they hid this detail, so that existing sware would still
work... then they discovered many apps broke API rules and wrote
directly to the heaps, and thus would crash with the new OS. So they
moved some items back to the old legacy 16-bit heaps, and I suspect
the extended public beta period was mainly needed to test which items
had to be moved and which could stay in the 32-bit heaps.
The original Vista was prolly written to run properly-developed
programs, with UAC as a tide-over for everything else. In its earlier
forms, UAC was even less tolerable than it is today. The extended
beta may have been required to polish it up, and if late changes were
still being made, it may explain why so many vendors are still not
Vista-ready today (e.g. HP printer drivers, QuickBooks, etc.)
IOW, simply developing for Vista from 2004 doesn't ensure you'll be
Vista-ready in 2006, if the OS changes late in the beta process so
that your development work is invalidated.
That's what 2007 smells like, to me.
I think MS's approach is sound, because the pain of today's sware and
UAC will fade with time. If the new platform we move to was deeply
compromised for the benefit of today's legacyware, then we'd carry
that pain forward for the next 5+ years.
As it was, the need to compromise Win9x for Win16 heap-fiddlers had a
crippling effect on Win9x in the long term. Let's hope we aren't in
for the same thing with Vista.
--------------- ---- --- -- - - - -
Saws are too hard to use.
Be easier to use!