System running like crap

  • Thread starter: Guest
Every machine performs better with it off, so getting a better machine
won't make any difference.

You are totally missing the point. The machine is there to make YOU perform
faster, not the other way around. Indexing makes finding and accessing
files and programs much faster. The time you save is yours, not the
computer's. To do this, your machine has to do extra work, but it does this
work during idle periods when neither you nor it have anything better to do.

Look, if you want to take extra seconds to find stuff so that your computer
saves a few milliseconds of CPU time during idle periods, knock yourself
out. But I think you have your priorities exactly backwards.

Ken
 
You are totally missing the point. The machine is there to make YOU perform
faster, not the other way around. Indexing makes finding and accessing
files and programs much faster. The time you save is yours, not the
computer's. To do this, your machine has to do extra work, but it does this
work during idle periods when neither you nor it have anything better to do.

Those of us who know where we save things may not want the extra
overhead of the OS treating us like morons.


There's a *reason* folks came up with the original filespec model of
nested directories and location-unique file names.

This ensures that the file you are working on is in fact the one you
assumed you are working on - i.e. that every file can be uniquely and
unambiguously identified.

You'd think with today's malware spoofing etc. the need for this would
be obvious, but instead we have things dumbed down for the benefit of
those who can't be bothered to build the most basic skills: "I saved
something, but I don't know what I called it or where I saved it".

Several things get in the way of uniquely identifying files:
- some files are hidden
- risk-relevant parts of the file name (.ext) are hidden
- full path is not shown (somewhat fixed in Vista)
- true file paths are "prettied" as namespace objects
- system ignores name risk cues ("open on content")
- system applies poor type discipline (e.g. code in .pif)
- some contexts flatten the directory space
- files within archives are shown as if loose files
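
To illustrate the kind of ambiguity that list creates, here's a rough
Python sketch (the file names are invented for the example) of how two
quite different files collapse into one display name once extensions
are hidden:

  from pathlib import Path

  # Hypothetical folder contents: one document, and one executable
  # that has been named to impersonate it.
  entries = ["Report.doc", "Report.exe"]

  # What a "hide known extensions" view shows the user:
  print([Path(name).stem for name in entries])   # ['Report', 'Report']

  # What a full-filespec view shows:
  print(entries)                                 # ['Report.doc', 'Report.exe']

Same folder, but only the second listing lets you uniquely identify
what you are about to "open".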


But there's another reason I don't want the OS blundering about,
groping arbitrary files at random; code exploitability.

By now, we should know that we cannot assume code will do only what it
was written to do. We have countless cases where specially-malformed
content can get to run as raw code, no matter how safely it was
intended to be handled; think GDIPlus, WMF, icon extraction, RPC,
LSASS, ASN.1, Word, PowerPoint, Witty, Lovesan, Sasser, and the need
to leave the system open so we can receive "code of the day" that is
*limited* to once a month for convenience.

With that in mind, do you *really* want a background service groping
the contents of files you didn't know were there, and had expressed no
intention to "open"?

I predict this will be the Vista generation's biggest risk


--------------- ---- --- -- - - - -
Saws are too hard to use.
Be easier to use!
 
cquirke (MVP Windows shell/user) said:
On Sat, 17 Feb 2007 13:51:40 -0600, "Ken Gardner"

With that in mind, do you *really* want a background service groping
the contents of files you didn't know were there, and had expressed no
intention to "open"?

I predict this will be the Vista generation's biggest risk

Well to be perfectly straight about this - XP did the same don't forget and
then there are the scary things like Google Desktop Search.

So, indexing has been an issue for a while. It was always smarter to turn it
off, if nothing else, just for the fact that no matter if you have a brand
new top of the line machine or one that is just barely able to make the
grade in Vista or something in between, it *IS* a noticeably faster machine
without indexing going on. Now add to this the fact that there is no perfect
OS nor is the power from the wall socket always going to be OK. If your
machine crashes while indexing, what then? That is one reason I don't use
hibernate. While it is relatively easy to fix for someone who has an
interest in computers and some small Windows experience, it is just a pain
in the backside to have to do that when simply turning it off means you
don't have to face it.

So I am pedantic about wanting my machine to be as crash proof as possible
and run the fastest it can, you say? I say "isn't that what everyone wants?"
 
Well to be perfectly straight about this - XP did the same don't forget and
then there are the scary things like Google Desktop Search.
So, indexing has been an issue for a while. It was always smarter to turn it
off, if nothing else, just for the fact that it *IS* a noticeably faster machine
without indexing going on.

Indexing has been with us for a while, though one could avoid some of
the metadata handling by avoiding NTFS.

It's more pervasive in Vista, which doesn't run well on FAT32
(surprisingly, it runs, but several things break, such as SR).

As NTFS is pushed past being the default to being required, the OS can
start to use some of its most powerful features on the assumption that
they are there. So we see larger collections of small files in a flat
directory space, making use of NTFS's more efficient directory
indexing, and we see increased use of non-key identification fields,
as leveraged by tags and metadata searching.

These are good things, from a performance perspective. Whereas FATxx
would require cluster chains to be groped or a separate metadata store
to be maintained in a loose file, NTFS makes it almost as efficient to
search on non-key (non-unique) fields as it is to look up the name as
the unique key field.

That should make searching without index a lot cheaper, as well as
making it cheaper to maintain search indexes.
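
If you want a feel for why a maintained index makes those non-key
lookups cheap, here's a toy Python sketch of the idea (nothing to do
with how NTFS actually lays its indexes out on disk):

  # A handful of files with a non-key "tags" field.
  files = {
      "report.doc":  {"work", "2007"},
      "holiday.jpg": {"photos"},
      "budget.xls":  {"work"},
  }

  # Without an index: every query scans every record.
  def linear_search(tag):
      return [name for name, tags in files.items() if tag in tags]

  # With a maintained index: one lookup per query, at the cost of
  # keeping the index up to date whenever files change.
  tag_index = {}
  for name, tags in files.items():
      for tag in tags:
          tag_index.setdefault(tag, set()).add(name)

  print(linear_search("work"))          # scans everything
  print(tag_index.get("work", set()))   # direct lookup

The trade-off is the same at any scale: faster non-key lookups in
exchange for background work to keep the index current.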


MS has always been in love with content indexing, even before "Google
envy" set in, and many of us users have always hated it. Remember MS
Office's wretched Fast Find; how this was rammed in by default, and
became progressively harder to get rid of?

It passed through migrating the engine to a less-obviously-named
"Microsoft Office" startup item, to tapping into NT's indexer so you
lost the ability to exclude it during MS Office install.
So I am pedantic about wanting my machine to be as crash proof as possible
and run the fastest it can, you say? I say "isn't that what everyone wants?"

I can relate to that.

On the risk aspects of Vista vs. XP; yes, indexing has been in XP, as
have handlers that grope content when folders are displayed. Vista
accelerates the risk in that devs are encouraged to use these things
more than they already do.


The obverse of "all non-trivial code has bugs" is "if you want
bug-free code, keep it trivial".

DOS-era directory entries are trivial and pretty safe. The data held
in them is closed, and is extracted as bounded in size. There are no
3rd-party extensions, no variable-length fields, no parsing that
expands content or chains to other parsers.
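
For a sense of just how bounded that is: a classic short-name FAT
directory entry is a fixed 32 bytes, and you can unpack the whole
thing with no variable-length parsing at all. A rough Python sketch,
using the field layout from the published FAT specification:

  import struct

  # 8.3 name (11 bytes), attr, NTRes, CrtTimeTenth, then 7 x 16-bit
  # time/date/cluster fields and a 32-bit size: always 32 bytes.
  FAT_DIRENT = struct.Struct("<11s3B7HL")
  assert FAT_DIRENT.size == 32

  def parse_dirent(raw32):
      (name, attr, _ntres, _tenth,
       _ctime, _cdate, _adate, clus_hi,
       _wtime, _wdate, clus_lo, size) = FAT_DIRENT.unpack(raw32)
      return {
          "name": name[:8].decode("ascii", "replace").rstrip(),
          "ext":  name[8:].decode("ascii", "replace").rstrip(),
          "attr": attr,
          "first_cluster": (clus_hi << 16) | clus_lo,
          "size": size,
      }

  # Made-up entry: README.TXT, 1234 bytes, starting at cluster 5.
  raw = FAT_DIRENT.pack(b"README  TXT", 0x20, 0, 0, 0, 0, 0, 0, 0, 0, 5, 1234)
  print(parse_dirent(raw))

Nothing in that entry chains off to other parsers or expands into
anything bigger than itself.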

NTFS metadata is not trivial enough to be considered safe, IMO;
3rd-party code can integrate as content handling or indexing filters,
which means the nature and interpretation of metadata is as unbounded
as "opening" the contents of the file itself.


Metadata is a half-way house between DOS's trivial and rapid-access
directory entry information, and the performance-impacting grind of
accessing each file's cluster chain.

That means it may be "fast enough" to scratch around in the metadata
whenever files are listed in a folder view, hovered over for a
ToolTip, selected so that a "details" panel is populated, as well as
an explicit Rt-click, properties. It also makes it fast "enough" to
build indexes in the background, groping the metadata to do so.

The trouble here is that the OS is now operating ahead of the user's
intent - and whenever that happens, it is an opportunity for malware
to go beyond SE'ing the user, to bypassing the user entirely.

So I'd want:

- a safe list view that does NOT dig into metadata when
listing files etc. unless user explicitly rt-clicks, Properties

- exclusion of subtrees from indexing, as well as index-free
modes of operation

We have the second in Vista. To be clear on the first: it is not
enough for a List view to be fast because it does not display the
results of groping the metadata; it must be safe by not groping
the metadata in the first place.
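
To be concrete about what "not groping" means at the API level, here's
a minimal Python sketch of a listing that reads only what the
directory entry itself carries - name, size, timestamps - and never
opens or parses any file content (the folder path is just an example):

  import os

  def safe_list(folder):
      # os.scandir returns directory-entry data; e.stat() here does
      # not open the file or touch its contents.
      with os.scandir(folder) as entries:
          for e in entries:
              info = e.stat(follow_symlinks=False)
              print(f"{e.name:40} {info.st_size:>12} {int(info.st_mtime)}")

  safe_list(r"C:\Windows")

A "rich" view does more than this per item; a safe one should not.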

What would I do with these?
- default safe behaviours for mOS boot
- default safe behaviours for Safe Mode
- default safe behaviours for new shell folder templates

We have content templates in Vista already, i.e. some folders handled
as "for pictures", others "for music" etc.

We also have shell folders such as Documents, Music etc. and now, very
welcome, a Downloads location that marks an awareness that arbitrary stuff
downloaded from the 'net shouldn't be treated as "data".

My existing practice in Win9x and XP is to create a subtree that is
for all incoming material, workspaces, desktops etc. which I consider
to be hi-risk. This is separated from data spaces, so that data
backups do not include this risky material.

I'd like to define such a subtree as a new shell folder, and map my
choice of behaviours to it, such as no indexing, no content groping
when "listing" files, safe List view, always show file name
extensions, do not "open" based on hidden content type, etc.
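
As a rough sketch of what mapping behaviours to "places" could look
like (all names and flags here are invented for illustration, not
anything Vista actually exposes):

  from pathlib import Path

  # Hypothetical per-subtree policy map.
  POLICIES = {
      Path(r"D:\Data"): {
          "index": True,  "show_extensions": True,
          "open_on_content": True,  "include_in_backup": True,
      },
      Path(r"D:\Incoming"): {   # downloads, workspaces, desktops etc.
          "index": False, "show_extensions": True,
          "open_on_content": False, "include_in_backup": False,
      },
  }

  def backup_roots():
      # Only the subtrees whose policy says so get swept into data backups.
      return [root for root, policy in POLICIES.items()
              if policy["include_in_backup"]]

  print(backup_roots())   # the risky subtree never lands in a data backup

That's the whole idea of the separated subtree in one table: risky
material gets its own rules and stays out of the data backups.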

Now I just have to wait for MS to catch up ;-)


--------------- ---- --- -- - - - -
Saws are too hard to use.
Be easier to use!
 
cquirke (MVP Windows shell/user) said:
Indexing has been with us for a while, though one could avoid some of
the metadata handling by avoiding NTFS.

.....and as most people went from Win 98 to XP, that is why most wouldn't have
known about indexing until they went to XP, no doubt.
It's more pervasive in Vista, which doesn't run well on FAT32
(surprisingly, it runs, but several things break, such as SR).

I can understand why they support FAT32 but why would someone really want to
use it?
MS has always been in love with content indexing, even before "Google
envy" set in, and many of us users have always hated it. Remember MS
Office's wretched Fast Find; how this was rammed in by default, and
became progressively harder to get rid of?

I do remember that and I always disabled it. In fact I always got rid of the
damned preload of Office on machines that didn't use Office most of the time,
too.
It passed through migrating the engine to a less-obviously-named
"Microsoft Office" startup item, to tapping into NT's indexer so you
lost the ability to exclude it during MS Office install.

Later version? I found, back when I bought Word 2000, that it wasn't there as
an identifiable item, but then my Word 2000 came with Works Suite at the
time.
I can relate to that.

On the risk aspects of Vista vs. XP; yes, indexing has been in XP, as
have handlers that grope content when folders are displayed. Vista
accelerates the risk in that devs are encouraged to use these things
more than they already do.

Leading to the inevitable "doesn't work" syndrome when a future OS comes out
that doesn't use that method any longer.
The obverse of "all non-trivial code has bugs" is "if you want
bug-free code, keep it trivial".

IOW K.I.S.S., of course. Yes, I know that one well. A major BBS program of
the late '80s had authors wanting to check every file as the user downloaded
it, and they were realising what that would mean for enforced time limits. I
asked the obvious - why not just inspect the file for a virus as it is
uploaded? I got blank stares and then forehead slapping. I always went for
keeping it as simple as possible. I only turned on Aero last night for the
first time, just out of curiosity. What's with that? Can't find the big deal
in Aero!
The trouble here is that the OS is now operating ahead of the user's
intent - and whenever that happens, it is an opportunity for malware
to go beyond SE'ing the user, to bypassing the user entirely.

It depends on what you mean by "operating ahead of intent". Eg, the software
HAS to assume, by default, certain things. Most people know about viruses,
but a heck of a lot don't do a thing about them, so should the OS come with
one implanted that can only be turned off by the installation of a 3rd party
one? MS may well be criticised for doing so, but when you consider that this
may mean the spread of such stuff that damages businesses has a better
chance of being slowed, then I think yes, it should be. Even a basic free one
like AVG, which isn't as "automatic" as those such as McAfee at auto updating,
auto handling viruses etc., is better than nothing at all. I don't think there
will ever be the impregnable computer unless it is one that is never turned
on.

So, we need anti virus programs. They operate ahead of intent and I have
to tell you that doing so is a good thing in many cases. My elderly father
in law says he is so happy that his AV prog is automatic, as he doesn't have
to worry about stuffing things up. I have had to get rid of viruses from his
machine before, when he was told by his old AV prog that the file he was
trying to open was a virus and he clicked to ignore the warning before
actually taking it in properly, but by then it was far too late. Now he
doesn't worry, as it is all stopped before he can click on it, emails with a
virus attachment are stripped before he even sees them, and so on. THAT is a
sign of where working ahead of the user's intent is a good thing.
My existing practice in Win9x and XP is to create a subtree that is
for all incoming material, workspaces, desktops etc. which I consider
to be hi-risk. This is separated from data spaces, so that data
backups do not include this risky material.

I understand why you do that but I work in the industry fixing those
problems for businesses and home users, thus I have to operate my machinery
as normally as a user would, as far as I will allow, in order to be as open as they
are to problems, in the hope that should there be one, I get it before they
do and know how to fix it. With that in mind, I have an AV prog that is auto
in every respect, I run the usual progs for spyware and keep the machine
tuned constantly, but in the end the three big rules for serious computer
users who value their data are simply:

1) Backup.
2) Backup.
3) After 1 and 2, Backup.

I used to use Norton Ghost and found, when the C drive died, that two weeks of
backups didn't work on the other drive as they were corrupt. Fortunately the
one from 3 weeks ago was OK, but I lost 3 weeks of data. I don't use Ghost any
longer.
Now I just have to wait for MS to catch up ;-)

Not hanging by the thumbs, I hope! :)
 
Those of us who know where we save things may not want the extra
overhead of the OS treating us like morons.

How is making it easier and faster for us to find our stuff "treating us
like morons?" I love typing in the name of a file I'm looking for and
finding it in a second or less. I do that much faster than opening Windows
Explorer and then clicking through a nest of folders until I find it. In
fact, it takes me less time to type in the keyword than it does to move my
mouse to "Documents" and clicking it -- assuming that what I'm looking for
is in my documents folder.

And yes, indexing requires more overhead, but the machine does it when idle
and/or on low priority when neither it nor the user has anything better to
do anyway. Again, I'm puzzled by this line of thinking that says that we
exist to make life easier for the machine instead of the other way around.

[...]
You'd think with today's malware spoofing etc. the need for this would
be obvious, but instead we have things dumbed down for the benefit of
those who can't be bothered to build the most basic skills: "I saved
something, but I don't know what I called it or where I saved it".

The issue is not remembering where you saved the file. The issue is how
quickly you can find and use it. For me, finding my stuff is much quicker
with the new Vista search features than the old way of mouse-clicking my way
through Windows explorer (or trying to remember custom keyboard shortcuts,
as I still do on my XP machine).
Several things get in the way of uniquely identifying files:
- some files are hidden
- risk-relevant parts of the file name (.ext) are hidden
- full path is not shown (somewhat fixed in Vista)
- true file paths are "prettied" as namespace objects
- system ignores name risk cues ("open on content")
- system applies poor type discipline (e.g. code in .pif)
- some contexts flatten the directory space
- files within archives are shown as if loose files

None of this is an issue with me.
But there's another reason I don't want the OS blundering about,
groping arbitrary files at random; code exploitability.

Nor is this.
By now, we should know that we cannot assume code will do only what it
was written to do. We have countless cases where specially-malformed
content can get to run as raw code, no matter how safely it was
intended to be handled; think GDIPlus, WMF, icon extraction, RPC,
LSASS, ASN.1, Word, PowerPoint, Witty, Lovesan, Sasser, and the need
to leave the system open so we can receive "code of the day" that is
*limited* to once a month for convenience.

I prefer to cross these types of bridges if and when I ever get there (which
I never do). I'm not going to sacrifice actual everyday speed and efficiency
because of hypothetical situations that may never happen. And I suspect
that Microsoft made the same decision when designing Vista. It is the
right decision for the vast majority of computer users.
With that in mind, do you *really* want a background service groping
the contents of files you didn't know were there, and had expressed no
intention to "open"?

You work around these types of problems with tags or keywords. The more
specific your tags or keywords, the quicker you can find your stuff.

[...]

Ken
 
Ken Gardner said:
How is making it easier and faster for us to find our stuff "treating us
like morons?" I love typing in the name of a file I'm looking for and
finding it in a second or less.


Yep, and 99% of the time most people using it wouldn't use that search
feature, but 100% of the time indexing NOTICEABLY slows absolutely ANY
machine down.
 
I can understand why they support FAT32 but why would someone
really want to use it?

FATxx is far easier for data recovery:
- it's well documented
- hand-editing tools are available
- user-controlled file system repair tools are available
- all file system structure is at "front" in predictable location
- avoids rich and possibly exploitable feature set, e.g. ADS
- doesn't change with new OS versions or SPs
- compatible with older OSs etc.

FATxx is less efficient where large single files (a hard upper limit
that becomes a real problem in an age of DVD images) and large numbers
of files in a single directory are involved. But when it comes to what is
IMO the most crucial job of a file system - do not lose data - I'd feel
safer staying off NTFS. So just as I'd not want to fly with an aerobatic
specialist who is prone to epileptic fits every now and then, I'd rather
use a file system that doesn't eat data automatically and can be fixed manually.

A 2G FAT16 is very survivable for small data files, each of which may fit within
one of the large clusters, so that only a dir entry pointer is needed to recover it.
You can peel the whole thing off for "offline" recovery.

When working on larger FATxx volumes, you can backup the "front" of
the file system (from boot record to end of 2nd FAT), edit this, copy
files off with that particular pair of FATs in effect, undo the edits,
etc. Try that with NTFS; the relevant chaining info etc. are
splattered all over the volume, so even if you could get your head
around visualizing everything in raw hex, you would not be able to
save and restore the chaining info between copy-offs.
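
For what it's worth, peeling off that "front" is simple enough to
script. A rough Python sketch, using the boot-sector field offsets
from the published FAT layout - run it against an image file, not a
live disk (the file names are just the example's assumptions):

  import struct

  VOLUME = r"fat_volume.img"   # raw image of the FATxx volume

  with open(VOLUME, "rb") as vol:
      boot = vol.read(512)
      bytes_per_sec, = struct.unpack_from("<H", boot, 11)
      reserved_secs, = struct.unpack_from("<H", boot, 14)
      num_fats       = boot[16]
      fat_sz16,      = struct.unpack_from("<H", boot, 22)
      fat_sz32,      = struct.unpack_from("<I", boot, 36)
      fat_secs = fat_sz16 if fat_sz16 else fat_sz32   # FAT12/16 vs FAT32

      # Boot record + reserved area + both FATs = the editable "front".
      front_bytes = (reserved_secs + num_fats * fat_secs) * bytes_per_sec
      vol.seek(0)
      with open("fat_front.bin", "wb") as out:
          out.write(vol.read(front_bytes))

  print("saved", front_bytes, "bytes of file system structure")

Restoring is the same copy in reverse, which is exactly the
backup-edit-copy-undo cycle described above.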
I do remember that and I always disabled it.

Yep. At least Outlook 2000 and later stopped journaling Office data
files by default; that was also a pain.
Leading to the inevitable "doesn't work" syndrome when a future OS comes out
that doesn't use that method any longer.

I don't think that's more likely to be an issue than with any other APIs.
It depends on what you mean by "operating ahead of intent". Eg, the software
HAS to assume, by default, certain things.

Not many.

For example, until I write to a non-OS disk, I don't want that disk
written to (maybe it's a sick HD I'm trying to recover data from?)

For example, there may be files present that I have NO intention to
ever "open" or use; I may in fact select them because I want to delete
them without Recycle Bin recoverability. So until I "open" a file, or
take an interest in its Properties by rt-clicking it etc., I do not
want the OS groping anything beyond the legacy "DOS" dir entry.
So, we need anti virus programs.

Sure, but that has nothing to do with any of the above. Having an av,
resident or otherwise, does NOT make it safe for the OS to grope files
that I have shown no intention to grope or "open".
They operate ahead of intent and I have to tell you that doing so is a
good thing in many cases.

Yes, av does operate ahead of user intent, tho generally only when
files are selected or listed (I turn off scheduled "whole system"
scans). And av is indeed an exploitable surface; what helps is that
there is no single dominant or always-present av to exploit. That's a
strong reason for MS not to bundle an av as part of the OS.
THAT is where working ahead of the user's intent is a good thing.

I'd agree, tho circumstances may require you to disable this in a
hurry (i.e. if a malware comes out that does exploit the av).

But to take the risk of triggering malware, on the off-chance I may be
too dumb to know where I saved my stuff? I don't think so.

If the OS had evolved to an awareness of risky material, separating
this out into a "suspect" subtree instead of murking it in with user
data, then it would be meaningful to enable such behaviors in some
"places" and have it off in others. Vista has only just begun to show
the glimmerings of such clue (e.g. Downloads as a new shell folder).
I understand why you do that but I work in the industry fixing those
problems for businesses and home users

Me2. I set PCs up with the above logic when I build them, and on
first contact with "unwashed" PCs. I do NOT leave MS duhfaults like
1G web caches, IE dumping in "My Docs", file name extensions hidden,
admin shares waving around etc. in place.

Learn once, learn right.
I have an AV prog that is auto in every respect

I have one av resident, though not patched into the email access (esp.
on dialup, where this can cause silent loss of outgoing mail when the
email app auto-disconnects before the av sends the stuff out). I like
av and scanners that update themselves, but I don't like more than one
that auto-scans files on contact, and I don't like whole-system scans.

I do like automatic overnight scans of the "suspect" subtree, using
multiple av in series, though I seldom implement this except for
certain hi-risk installations.

I chase suspected active malware formally (i.e. "scan from orbit"
when the infected installation is NOT running, using Bart etc.)
three big rules for serious computer
users who value their data are simply:

1) Backup.
2) Backup.
3) After 1 and 2, Backup.

Nah. Backup's useful, but it doesn't solve all problems, and as a
general solution, "just backup" is as uselessly simplistic as "just
re-install Windows". I'm not saying "don't bother to backup:, but I
am saying backups will not make other maintenance redundant.

To be effective (or even relevant), backup has to scope in all wanted
content and changes and scope out all unwanted content and changes.
Unless you've very carefully designed your data locations, separating
out infectable and incoming material, you can't be sure your most
recent backup will be malware-free.

Even if you break integrations by wiping and rebuilding the system
before restoring the "data" backup, you may be open to
self-integration via exploitation of internal surfaces... which is
where we came in, fretting about indexers and "rich" listings.
I used to use Norton Ghost and found, when the C drive died, that two weeks of
backups didn't work on the other drive as they were corrupt. Fortunately the
one from 3 weeks ago was OK, but I lost 3 weeks of data. I don't use Ghost any
longer.

I automate a daily, 5-day-deep backup for small key data, which is
highly survivable, especially when combined with matrix cross-backup
storage in a serverless network environment.
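
For anyone curious, that kind of rotation needs very little code. A
minimal Python sketch, assuming a small key-data tree (the paths and
the 5-slot depth are just the example's assumptions):

  import datetime
  import shutil
  from pathlib import Path

  SOURCE = Path(r"D:\Data\Key")    # small, precious key data
  DEST   = Path(r"E:\Backups")
  DEPTH  = 5                       # days of history to keep

  def daily_backup():
      # Each calendar day maps onto one of DEPTH slots, so the oldest
      # copy is overwritten once the rotation comes around again.
      slot = datetime.date.today().toordinal() % DEPTH
      target = DEST / f"keydata-slot{slot}"
      if target.exists():
          shutil.rmtree(target)
      shutil.copytree(SOURCE, target)

  daily_backup()

Schedule that once a day and you have the 5-day-deep copy; the
cross-backup to other machines is the same thing pointed at a network
share.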

Even so, the need for data recovery does not go away, and with folks
generating huge collections of precious photos, fully-automated data
backup is becoming even more of a challenge.


--------------- ---- --- -- - - - -
Saws are too hard to use.
Be easier to use!
 
"cquirke (MVP Windows shell/user)" wrote:

As long as the machine doesn't keep me waiting or take risks I want to
avoid, I'm happy. I don't like Start Menu and folders that dribble
down the screen slowly because the OS is scratching around in each
file, and I don't like silly "look I'm a menu" animations that keep me
waiting, yes even for a fraction of a second, while I'm trying to
work. I don't like race-condition risks that arise when the UI is a
few mouse clicks or keystrokes behind what I'm doing.

I don't like "rich" folder displays that force me to scroll everytime
there are more than a dozen items present. Scrolling is to human/UI
performance what paging to disk is to system performance.

So the performance aspect is: If I can "feel" the PC getting laggy
because of overhead that is answering questions I do not ask, I want
those "services" dead.

But performance is not the only aspect.

Who's looking for them? I know where they are :-)

Yeah, I heard that before... the system's often got that wrong in the
past (low point being WinME's wretched cabbing of SR gunk, and don't
even mention FastFind and every second useless PoS app that wants to
load itself on boot on the off-chance you'll want to use it)

Having said that, Vista may be better at this than past schemes; so
far, it's looking good. I just hope the indexes don't blow out into
huge lumps that gunk up the engine room (i.e. small C: for the OS) or
traipse all over the disk to access other volumes all the time.
How is making it easier and faster for us to find our stuff "treating us
like morons?" I love typing in the name of a file I'm looking for and
finding it in a second or less. I do that much faster than opening Windows
Explorer and then clicking through a nest of folders until I find it.

The risk is that you may "find" the wrong thing. If you navigate to
D:\Data\SomePlace and "open" the one and only Filename.doc there, you
know exactly what you've got. If you let the system find Filename.pif
in E:\Suspect\Downloads instead, you could have a problem.

Having said that, they've done it quite well in Vista; Windows
programs sort above name-alike files, and until there's only one
choice, Vista won't guess. But it won't show you paths unless you ask
to see them somehow, and it still hides file name extensions by duhfault.

At least it no longer finds files within archives; tho you can enable
that, I wouldn't, especially if you keep autobackup .ZIPs of your data
lying around - you may edit the backup copy that will get FIFO'd off.

Try this, tho...
- Orb (Start), in Start search type Read
- notice two "Read Me" files appear
- notice how you can't see file name .ext or path
- now press Enter while still typing (i.e. you go "Read<enter>")
- Notepad comes up with ReadMe
- which ReadMe are you editing?

Notice that this is AFTER Windows Explorer has been set NOT to hide
file name extensions - i.e. the list of found items does NOT show file
name extensions even when the system has been set to do so.

There's a malware ITW that does this:
- finds your .DOC files
- copies itself as Somename.exe, with a Word icon
- sets the matching Somename.doc to hidden attributes
- sets Windows Explorer to hide file name extensions
- actively prevents changing that setting

By duuuuhfault, Windows will just show one "Somename" file, and that
will be the malware .EXE; when you "open" that, it goes resident and
then chains into opening the hidden data file in Word; you won't
notice the difference.
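
If you wanted to hunt for that pattern yourself, it's easy enough to
sketch: look for a visible .exe sitting next to a hidden .doc of the
same base name. A rough, Windows-only Python sketch (the scan root is
just an example):

  import os
  import stat
  from pathlib import Path

  SCAN_ROOT = Path(r"D:\Data")

  def is_hidden(path):
      # Windows-only: check the Hidden attribute on the file.
      return bool(os.stat(path).st_file_attributes
                  & stat.FILE_ATTRIBUTE_HIDDEN)

  for folder, _dirs, names in os.walk(SCAN_ROOT):
      docs = {Path(n).stem.lower() for n in names
              if n.lower().endswith(".doc")}
      for name in names:
          p = Path(folder) / name
          if p.suffix.lower() == ".exe" and p.stem.lower() in docs:
              twin = p.with_suffix(".doc")
              if twin.exists() and is_hidden(twin):
                  print("suspicious companion pair:", p, "and", twin)

No substitute for an av, of course, but it shows how mechanical the
spoof is - and how much the hidden-extensions duhfault helps it.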

Being easy to use is not enough - things have to be easy to use
safely, and you can't dumb down what you need to see in order to
operate safely. It's useless claiming that file name extensions can
be replaced with icons when the most dangerous types of files (.pif,
.exe etc.) can set their own icons to spoof whatever they like.
In fact, it takes me less time to type in the keyword than it does to move
my mouse to "Documents" and clicking it -- assuming that what I'm looking
for is in my documents folder.

See above. If you don't know what you've found, you are no longer in
control of what you're doing.
The issue is not remembering where you saved the file. The issue is how
quickly you can find and use it. For me, finding my stuff is much quicker
with the new Vista search features than the old way of mouse-clicking my way
through Windows explorer (or trying to remember custom keyboard shortcuts,
as I still do on my XP machine).

See above. Convenience is nice, but not the only criterion.

I see that hovering over a "found" match will show you path, though it
still doesn't show you file name extension. But that means letting go
of the keyboard and waving the mouse around, which is a tedious extra
step compared to typing what you think is enough of the file name and
pressing Enter. Guess what one will usually do?

There may be a bug in the way Vista does this search-and-match stuff.
Let me test something... hmm, interesting; I create a file called
"Read Me.txt" and I see it *does* show the file name .ext for that
file, but not the others! Strange... one "Read Me" is a shortcut to a
"Read Me.txt" in NTI CD sware, the other is also a shortcut to a "Read
Me.txt". So one bug is that shortcuts are not shown with shortcut
arrows, nor do they show the file name .ext of their target.

Now let's try creating a file called "Read M.txt" instead of "Read
Me.txt" and see if Vista still "opens" the one at the top of the list
when I press Enter... yes, it does. When I did an earlier search for
something else, it came up with a Search window instead.

Let's try "Read Me.exe"... no, still opens top of the list on Enter.

So yes, there does appear to be a risk of namealike companions being
"opened" instead of the file you thought you'd found.
I prefer to cross these types of bridges if and when I ever get there (which
I never do). I'm not going to sacrifice actual everyday speed and efficiency
because of hypothetical situations that may never happen.

Malware infects PCs every day - 95% of spam is carried by botnets.
And I suspect that Microsoft made the same decision when designing Vista.
It is the right decision for the vast majority of computer users.

The majority of users rate malware as a major concern. "Easy to use"
is more trouble than it's worth if you end up doing things you did not
intend to do, and those things have real adverse consequences.
You work around these types of problems with tags or keywords. The more
specific your tags or keywords, the quicker you can find your stuff.

So now I have to name the same file multiple times to find it? Like
malware can't tag as well? Mhh... it's nice-to-have, but I still
prefer to know exactly what file I'm "opening" at all times.

I already have a familiar, existing system that does that.


--------------- ---- --- -- - - - -
Saws are too hard to use.
Be easier to use!
 