Microsoft Zero Day security holes being exploited

  • Thread starter: imhotep
imhotep said:
Surely the criticality merits promptness. Does it not?

Contrarily, surely it is the scope of disruption to installed base,
or potential thereof, that merits thoroughness and correctness.
Does it not?

See. Yet another game of trade-offs.
Asking if you knew of an ETA? Sorry, but I thought you actually might know.

If I knew, I could not say; that is true of anyone who might know.
No trapping this time....

I would call you a liar, were it not so obvious you did not understand
that "follow-up" did mean follow-up, as set on your post, which again
points only to the ie.sec NG.
 
On Sun, 24 Sep 2006 02:45:01 -0700, Ian
Think we'll only achieve secure computing when C is dropped in favour of a
better language. The list of buffer-overflow exploits in every single major
software-package gets monotonous.

Yes, that makes a lot of sense.

As C tends to be used across all platforms (UNIX, Linux, MacOS,
Microsoft) it's unsurprising that all of these platforms share the
same sort of exploits and code repairs.

It's a lot like the Y2k problem, i.e. trade-offs made in the interests
of performance on the slower hardware of the time.

We now waste processing power on eye candy (animated menus, Glass in
Vista, etc.) and underfootware (indexing, thumbnailers, .PF and SR
management). Plus, the risks from bad code often mean that the real
code is preceded by several layers of "is this safe?" content testing,
from av to code that checks for "shapes" that will exploit the real
code to full-blown emulation of the real code in an attempt to
"sandbox" it. One's tempted to say it would be more efficient to
simply use a less "powerful" language that doesn't use pointers and
data-driven buffer operations instead.
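
To make that concrete, here's a rough sketch (function and variable
names invented, not taken from any real product) of the classic
copy-without-a-length-check that turns up on every platform C runs on:

#include <string.h>

/* Sketch only: a fixed-size stack buffer filled by a data-driven copy */
void handle_request(const char *input)
{
    char buf[64];

    strcpy(buf, input);   /* no bounds check: anything longer than 63
                             bytes runs off the end of buf and can
                             overwrite the return address - the same
                             failure on Windows, UNIX, Linux or MacOS */

    /* ... use buf ... */
}

All the scanning, shape-checking and sandboxing described above exists
to catch what that one missing length check lets through.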

The challenge is that much of the code logic only works because the
coder never has to know how big the material is, relative to the
buffer space. That's unsafe, but fixing it is more than just a
matter of changing the source code language. Wherever there's a
dynamic buffer that's having ad-hoc material added to it, it may be
difficult to track remaining buffer size and incoming data size.

One way is to truncate data to fit, but that can create a different
sort of exploitability, where the truncated data can match the wrong
(malicious) object.
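
A rough sketch of that bookkeeping (names invented): the caller tracks
how much of the buffer is used, and the append either refuses the data
or truncates it - each choice with its own cost.

#include <string.h>

/* Sketch: append a piece into a fixed buffer while tracking what's used.
 * Refusing oversize data is safe but breaks "make everything work";
 * truncating always "works" but can turn one name into another, shorter
 * (possibly malicious) one - the mismatch risk mentioned above.        */
int append_field(char *buf, size_t bufsize, size_t *used, const char *piece)
{
    size_t len = strlen(piece);

    if (*used + len + 1 > bufsize)
        return -1;                        /* refuse: doesn't fit */

    memcpy(buf + *used, piece, len + 1);  /* copy including the NUL */
    *used += len;
    return 0;
}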


Pointers are another story... it's hard to re-write code that relies
on pointers to work in some other way. Object Oriented Programming
may mix code and data within multiple small objects, thus creating
opportunities for code to overwrite (or point into) data areas that
would have been avoided if the older code segment, data segment model
was used instead.
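
A toy illustration of that layout risk (the struct and function names
are made up): a code pointer stored right next to the data it serves.

#include <string.h>

struct widget {
    char  name[32];
    void (*on_click)(void);   /* code pointer living next to the data */
};

void set_name(struct widget *w, const char *src)
{
    strcpy(w->name, src);     /* overrun the 32-byte name and the copy
                                 walks straight into on_click, so the
                                 next "click" jumps wherever the data
                                 said to - code and data intermixed   */
}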


Finally, what really blows the doors off safety is the irresistible
urge to "make everything work". It's so cheap in terms of programming
effort to massively extend functionality by passing logic outwards
into additional content and syntax parsers, such as programming
languages that include some sort of SystemAPI( ) calls - but this is
where the original intention of the code is trumped by unintended
possibilities that can be exploited.
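
A small sketch of the pattern (the helper name and field are invented):
a content handler that hands part of the content to a general-purpose
system call, so the document author, not the program author, decides
what runs.

#include <stdio.h>
#include <stdlib.h>

void open_attachment(const char *name_from_document)
{
    char cmd[256];

    /* the "SystemAPI()" escape hatch: cheap to write, huge to exploit */
    snprintf(cmd, sizeof cmd, "viewer \"%s\"", name_from_document);
    system(cmd);   /* a name like  x\" & evil.exe & \"  breaks out of
                      the intended syntax and runs the attacker's code */
}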

This is very similar to the Internet, where any IP that can't be found
locally is sought through the next in a chain of gateways. We love
the Internet, though we wish coders wouldn't treat it as one big
network just because it's made out of networking. We have less love
for sprawling extended interpreters that allow any content to be
activated from any kind of visible wrapper.


------------ ----- --- -- - - - -
Drugs are usually safe. Inject? (Y/n)
 
<cough>

See http://cquirke.mvps.org/9x/mimehole.htm, Google( BadTrans B )

That's a very old bug, long fixed unless you "just" re-installed any
OS predating XP, as every such OS uses an IE that is both exploitable
and too "old" to be patched other than by upgrading the whole of IE.

If you read up on that bug, you'd see how the nature of exploits and
code bugs has changed.

The MIME hole was a design safety failure, not a code defect - IOW, it
"worked as designed" but the design was brain-dead.

There are still design failures in Windows, and until these are proven
to be exploitable, they won't be patched because "it's working the way
we expected it to". Most exploits that are being patched today are
genuine code defects, and may be harder to exploit.

Then again, the modern malware industry is optimised to overcome any
"an attacker would have to..." mitigations. Once an exploit shape is
found, the source code becomes rapidly available, and malware coders
then drop it straight into attack vehicles that are ready to roll;
either full-fledged multi-function bots, or simple stubs that can pull
down the "real" malware. If these malware haven't been released
before, av won't "know" them at the signature level.


Malware can always out-turn patching. The attacks are smaller than
the patches and can drown out the patching process by sheer volume,
even before you consider DDoSing the smaller number of patching sources
or poisoning the patch pool via fake patching sources.

The other reason malware will always win the race is that the required
software standards are far lower. A piece of malware only has to work
on some PCs, and it doesn't matter if it trashes the others. But a patch
has to work on all systems, and not cause new problems on any of them.

If you insist on butting heads with malware on a level playing field,
you will always lose. Better to tilt the playing field so that the
user at the keyboard has the ability to trump all code and remote
access - but MS's fixation on centrally-managed IT and DRM undermines
this and rots the top of the Trust Stack.

See http://cquirke.blogspot.com/2006/08/trust-stack.html
Then how do you explain the record-breaking time to patch Microsoft's DRM
hole? Three days to patch? Please explain (no propaganda necessary).

Well, it could be that the nature of the hole was trivial to fix -
e.g. simply changing some static "secret" info that had been harvested
and used by the attackers. I suspect this is the case, given how
quickly the fix has been circumvented by the attackers.

We have a very small sample from which to draw conclusions. Sure, we
have a lot of defects that allow user interests to be attacked, and we
have a smaller number where users were left hanging out to dry while
patching caught up with ITW exploits. But we have a sample of 1
prompt DRM fix, and it may just happen to have been an easy one to
fix; maybe the next one (or even the continuation of this one) will
take a far longer time to fix. If so, don't expect to read about it!

So... are you saying that all fixes should be held back the same
amount of time, even if they are ready earlier, so that MS can be seen
to act more promptly on the issues we'd most like to see fixed first?


BTW, the post I'm replying to has a "bug" of its own; the sole
newsgroup set for replies is not found to exist on my news server.
Maybe it exists on other news servers, who knows? Here, it's 100%
broken and buggy. Should I wave this around as "proof" that the
poster I'm replying to is trying to hide refutations to his post?


------------ ----- --- -- - - - -
Drugs are usually safe. Inject? (Y/n)
 
Smitty said:
I have to agree with Imhotep.

I have been thoroughly p****ed off this week as a result of a virus which
somehow evaded the countless security systems I have in place. In
retrospect, the 'vulnerability' is simply MS stupidity. Imagine allowing
WinLogon to load arbitrary DLLs into its address space simply by adding
entries into the registry.

WinLogon is supposed to be my first line of defense against security
issues.

All operating systems do that. They are designed to launch code at boot
time by reading registry values, text files, etc. Because those registry
values are protected from unauthorized access by permissions, someone would
have to already own your system to modify those values, wouldn't they?
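
As a rough sketch of that point (Win32 API, run as an ordinary user;
illustration only, not a statement of how the exploit actually worked):
ask the Winlogon key for write access and see what comes back.

/* link with advapi32 */
#include <windows.h>
#include <stdio.h>

int main(void)
{
    HKEY hKey;
    LONG rc = RegOpenKeyExA(HKEY_LOCAL_MACHINE,
        "SOFTWARE\\Microsoft\\Windows NT\\CurrentVersion\\Winlogon",
        0, KEY_SET_VALUE, &hKey);          /* ask for write access */

    if (rc == ERROR_ACCESS_DENIED)
        printf("No write access - an attacker needs admin rights first.\n");
    else if (rc == ERROR_SUCCESS) {
        printf("Write access granted - this account could plant an entry.\n");
        RegCloseKey(hKey);
    } else
        printf("RegOpenKeyExA failed: %ld\n", rc);

    return 0;
}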
 
(e-mail address removed) says...
Your thinking is flawed - most OS vendors don't release patches quickly.
Most of them come out with a workaround until they can get their patches
out after testing. Follow the HP-UX group and see how long they take,
follow the MAC groups and see how long they take....

See http://cquirke.blogspot.com on "how well" other vendors patch.

MS are better at this than most, and may have one of the strongest
understandings of the need to patch. I'd want to see that
understanding pervade system design, but that's another story.

It beats me why, as consumers, we ever accepted the patching model in
the first place. If there had been more litigation impact when code
was found to be defective, back before it became such a regular
occurrence, the industry may have changed.

OTOH, if you look at other vendors' products with similar exposure,
you see exactly the same process of defect, exploit, fix. Think
Apple's recent WiFi blues, Trend av's update that crashed servers due
to flawed archive decompression, exploits of a popular Linux MP3
player, monthly Firefox revisions through most of Firefox 1.0.x, Sun's
mishandling of Java updates, the disastrous exploitation of Black Ice
firewall... then some vendors can't reach even beta stability after
they ship; hello, Sony's horrendous Connect Player.

The real surprise is how well MS has coped with an inherently losing
game up until now. Folks who really should know better have come to
expect all defects to be patched before exploits start ITW, and no
patches to ever bring in new defects or problems.

But the cracks are beginning to show.


------------ ----- --- -- - - - -
Drugs are usually safe. Inject? (Y/n)
 
"Ian" <[email protected]> wrote in message
And note, I don't regard C as inherently unsafe - it is just that it requires
programmer discipline.

Humans are just system components, along with everything else - and as
such, they have notoriously high error rates.

When designing languages, that should be taken into account.

With C, it wasn't - the mindset was that programmers are smart enough
not to need training wheels, and the beauty of C was that it stayed
out of your way so you had full control (and full responsibility).

And we can see how well human programmers have filled those shoes...
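
A one-character illustration of that error rate (nothing real, just a
sketch): the language itself never checks the bounds, so the slip
compiles and runs.

#include <stdio.h>

int main(void)
{
    char buf[8];
    int i;

    for (i = 0; i <= 8; i++)   /* "<=" where "<" was meant: 9 writes   */
        buf[i] = 'A';          /* into an 8-byte buffer; C itself does
                                  no bounds check, so the training
                                  wheels the coder skipped never exist */

    printf("%.8s\n", buf);
    return 0;
}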


------------ ----- --- -- - - - -
Drugs are usually safe. Inject? (Y/n)
 
Stephen Howe said:
Why can't you prune the conversation to what is relevant?
Too difficult for you?
Must you quote everything?

Stephen Howe

Point taken.

I also sometimes get lost in the "what is most relevant?"
when considering the fractured view today's search engines
yield with currently primitive (poorly thread-xref'd) hits.
 
Roger said:
Contrarily, surely it is the scope of disruption to installed base,
or potential thereof, that merits thoroughness and correctness.
Does it not?

And how does one predict what tomorrow brings? A crystal ball? Surely one
cannot. This is why it is much better to rate the security hole based on
its criticality rather than its popularity... CRITICALITY DOES NOT CHANGE,
POPULARITY DOES!!!
See. Yet another game of trade-offs.

I do not see a trade-off here. Honestly, I do see mistakes in how some
people try to evaluate security holes, thus making things worse...
 
Dan said:
Possibly, but Microsoft is not the big evil corporation that users
associate it to be. Microsoft does have some problems that are common
in a big company, but they do try. For example, they gave away a free
security CD that has been very helpful on countless 98SE machines that I
service.

Honestly it is not that people, like me, view Microsoft as evil in the real
sense of the word. This is not the case.

Microsoft has drifted away from the golden rule. What do I mean by that
statement? Microsoft has used their market share as a stick to force people
into doing things Microsoft's way instead of making solutions that their
customers want. This is bad. There is no reason that Microsoft could not
completely integrate with Apple, Linux or the BSDs.

Any time a company starts to play games with its users instead of
listening to them, that is cause for alarm.

As an example, I recently bought a new car. My car has a really nice
navigation system that can interface with my GSM phone. Now, what if my
car's manufacturer tried to force me into buying only *their* phone? By
doing this, they can supply the cheapest phone they can find yet charge me
a fortune for it. Even worse, suppose their phone needs an expensive
upgrade every year! This is the sort of thing that Microsoft does every day.
That is why people like me (I am an X-Windows user going back to DOS 2.1)
have become disenchanted with them and their games. I want to design
systems that benefit my company, NOT Microsoft's wallet (or any other
company's wallet). I want options as to what systems comprise my company's
infrastructure. I do not want artificial limitations. I want options!

Microsoft intentionally tries to take away options because they truly do
fear competition. Which is a shame.

Again, you do not have to agree with me, but at least try to understand my
point.

Imhotep
 
Roger said:
Sorry. I guess I cannot cure your blind spots.

Man, you are the absolute best at *not* answering something that debunks
your arguments. Too bad you can't disguise it more....

Im
 
Dan said:
Leythos said:
Leythos wrote:

[snipped most, as I agree with Roger]
Please, take the conspiracy theorist motivated part of this
discussion to alt dot something.

This thread should be about the present risks, workarounds, and
degrees of exposure in the wild - that is, keep to YOUR subject.
I don't think I've seen this stated better (all that you said, not just
what I kept) in the thousands of posts I've read this weekend.

Sure. However, you can not deny that it would be nice to have a patch
out in days instead of months....we know they can do it, they have in
the past...

I think you misunderstand regression testing and proper QA methods. If I
want to patch a program that does not interact with any other programs,
then I only need to test that program. If I want to patch an interface,
something that interacts with many programs and services, it means that
I have to regression test all interconnected parts.

MS has no reason to lag in pushing out patches or fixes, they do it as
quickly as possible with the least risk they can manage to end-users.

Nice point and even then you get users with tons of posts to the
Microsoft update newsgroup about why the download did not work properly
and folks who suddenly say they hate Microsoft because they can't get
the patch to work right. Sure, Microsoft is not perfect but I feel they
do a darn good job supporting their user base.

I beg to differ. For the amount of money and resources Microsoft has, they
clearly could do much better. Why is it that patching is only risky on
Windows? Why is it that other platforms (some totally free) never have
patching problems? Why is it that the time from security hole discovery to
patch is a couple of days for Linux (which is free) but 30+ days for
Microsoft, when Microsoft clearly has billions of dollars in the bank and
rich resources????

Still think they do a "darn good job"?

Imhotep
 
Leythos said:
Your thinking is flawed - most OS vendors don't release patches quickly.
Most of them come out with a workaround until they can get their patches
out after testing. Follow the HP-UX group and see how long they take,
follow the MAC groups and see how long they take....

You are flawed. Research Linux patch times... from time of discovery to
patch release.

Im
 
cquirke said:
On Sun, 24 Sep 2006 02:45:01 -0700, Ian


Yes, that makes a lot of sense.

Totally disagree. In fact I could not disagree more.
As C tends to be used across all platforms (UNIX, Linux, MacOS,
Microsoft) it's unsurprising that all of these platforms share the
same sort of exploits and code repairs.

How? Please specify. Buffer overflows? All low-level languages can be
improperly programmed through bad programming technique, not just C.


The real problem here is not the language. The real problem is that many
software companies push release dates over quality.

When I first graduated from college it was quality first, marketing second.
Sadly, this has inverted. It is now marketing first, quality second.

The second problem is that when I first started out, the senior programmers
were well respected and considered a prized resource within the
organization. Also sadly, a lot of companies have outsourced these people
to third parties whose people have no direct pride in, or ties to, the
organization.

Is there any question why quality has gone away?


Lastly, C/C++ are low-level languages and as such place few restrictions
on the programmers who use them. This is how it should be with low-level
languages, since they are often the languages that kernels and other complex
programs are written in.

Middle-layer and high-layer languages take away some of the "dangerous
things" within the language, BUT AT A COST. These languages add restrictions
at the cost of flexibility. They are designed to address the most common
programming needs, say 80%. However, should you need to program in the other
20%, they become clumsy if not impossible to use.

Removing the so-called "dangerous things" from a programming language does
not make a better programmer if they did not understand the fundamentals in
the first place. All it does is make a poor programmer look better. If you
do not understand shared memory, semaphores, IPC, pointers, memory
management, etc., you are not a programmer. You are a glorified scripter.
Not that there is anything wrong with being a scripter.... If you do not
understand a simple statement like "***ptr = &object" you should not be
programming...
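
For what it's worth, that statement only compiles with enough levels of
indirection behind it; a toy sketch (not from any real program) of what
"***ptr = &object" actually does:

#include <stdio.h>

int main(void)
{
    int object = 42;
    int *p = NULL;
    int **pp = &p;
    int ***ppp = &pp;
    int ****ptr = &ppp;

    ***ptr = &object;   /* follow three levels down from ptr and store
                           the address of object into p                */

    printf("%d %d\n", *p, ****ptr);   /* both print 42 */
    return 0;
}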

So, the comment that removing low-layer languages will make security better
is just plain bogus. Buffer overflows are poor programming!!! If you want
to make more secure and better quality programs, make quality a priority
again instead of marketing!!!


Imhotep
 
Stephen Howe said:
You're right in one sense. What I don't understand is, with MS's trustworthy
programming initiative, why haven't they visited all Windows APIs and
proofed them by now? MS's approach seems reactive, not proactive.

And note, I don't regard C as inherently unsafe - it is just that it requires
programmer discipline.

Stephen Howe


....and good technique. Well, I guess there is one thing we agree on...go
figure.

Imhotep
 
cquirke said:
Humans are just system components, along with everything else - and as
such, they have notoriously high error rates.

When designing languages, that should be taken into account.

With C, it wasn't - the mindset was that programmers are smart enough
not to need training wheels, and the beauty of C was that it stayed
out of your way so you had full control (and full responsibility).

And we can see how well human programmers have filled those shoes...

Seems this is more of a Microsoft problem than anyone else's. Maybe it is
not the language's fault after all!

....and the problem with high-level languages is that they put too many
restrictions on you. Higher-layer languages were not designed (nor was any
language) to hide the programmer's ineptness!

Languages, and the programs that result, will ONLY be as good as the
programmer is...

Imhotep
 
I always thought "NT" stood for "Not Tested"...


<snip>

It is interesting that the NT (New Technology) source code was
originally nicknamed the "Not There" source code since it did not have a
true maintenance operating system like 9x had. Chris Quirke, MVP
can post more information on this because he knows about it extensively.
9x had DOS, which was really nice because you could get down and dirty
and solve many problems with commands, and it overcame the limitations of
fixing things that are inherent in a GUI (Graphical User Interface). I
researched and read about this in a book about Microsoft's early
history. The actual base of 9x has a more secure and solid foundation
than NT.

Check this out for further information:





Well, you people get the idea. All the garbage about XP being so
secure is just plain foolishness; if people would just remove the
blinders from their eyes and see the truth, then we would be getting
somewhere. BTW, I tri-boot with 98SE, XP Pro. and am testing Windows
Vista Ultimate 32-bit with the glass "Aero" interface enabled.
 
imhotep said:
And how does one predict what tomorrow brings? A crystal ball? Surely one
cannot. This is why it is much better to rate the security hole based on
its criticality rather than its popularity... CRITICALITY DOES NOT CHANGE,
POPULARITY DOES!!!

Who said anything at all about popularity?
The scale of potential impacts/disruptions is simply a feel obtained
from the dependency tree size, etc., all as previously outlined but
apparently not comprehended by yourself.
I do not see a trade-off here. Honestly, I do see mistakes in how some
people try to evaluate security holes, thus making things worse...

That then explains some of your blind spots.
 
You are flawed. Research Linux patch times... from time of discovery to
patch release.

Check it again; if you actually look, not all patches are quick and not
all patches are without problems.
 