Not for an HDD. Even a few milliseconds is a long time -for the system-.
When you consider track seek time, rotational latency, settle time etc.
for each fragment that the drive has to pick up sequentially, it can
have a substantial impact on performance. But as I mentioned, the degree
of fragmentation of the files is key. Apart from the optical drives, the
hard drive is the slowest component of the PC because of its mechanical
operation, so if it runs even slower due to heavy fragmentation, that's
not good.
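As a very rough back-of-the-envelope sketch (assumed figures for a typical
7200 RPM desktop drive, not measurements: ~9 ms average seek, ~4.2 ms
average rotational latency, ~60 MB/s sustained transfer), the fragment
count starts to dominate quickly:

    # Rough estimate of HDD read time for a file split into N fragments.
    # All figures are assumptions for illustration, not benchmarks.
    SEEK_MS = 9.0              # average seek time
    ROT_MS = 4.2               # average rotational latency at 7200 RPM
    TRANSFER_MB_PER_S = 60.0   # sustained sequential transfer rate

    def read_time_ms(file_mb, fragments):
        transfer_ms = file_mb / TRANSFER_MB_PER_S * 1000.0
        return fragments * (SEEK_MS + ROT_MS) + transfer_ms

    for frags in (1, 10, 100, 1000):
        print(frags, "fragments:", round(read_time_ms(40.0, frags)), "ms")
    # 1 fragment -> ~680 ms; 1000 fragments -> roughly 14 s for the same 40 MB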
Well, it has to do that anyway; even if the whole file is unfragmented it
still has to find the file. When a file is written to a fragmented disk I
would imagine it puts it in the places which are quickest to access
(seems sensible), so I doubt the access overhead would be that much.
Not really. I keep adding and deleting programs quite often, and with
the size of today's files fragmentation can build up quickly. And
fragmentation affects all files, not just 'programs'...modify any file and
it may get fragmented or cause free-space fragmentation if it is flanked
by other files.
But it's not a great overhead all things considered.
Er....'the loading into memory' is what is affected by fragmentation.
As is writing to the drive. Once it's in RAM, it shouldn't matter unless
you are *gasp* paging it to the HDD and the page file itself is
fragmented.
I don't use a page file anymore. I think it is better to ensure you never
need a page file by not overloading your system.
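If you want to sanity-check whether a system really can get by without a
page file, something like this (a sketch, assuming the third-party psutil
package is installed) shows how close you are running to physical RAM and
whether the existing page file is actually being hit:

    import psutil

    vm = psutil.virtual_memory()    # physical RAM usage
    sm = psutil.swap_memory()       # page file / swap usage

    print("RAM used:", round(vm.percent), "% of", round(vm.total / 2**30, 1), "GB")
    print("Page file used:", round(sm.percent), "% of", round(sm.total / 2**30, 1), "GB")
    # If the page file percentage stays near zero under your heaviest
    # workload, the page file is mostly idle anyway.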
Not necessarily. In NTFS, it gets put into the first bits of free
space available, which might or might not be fragmented free space.
However it's likely to be on the same track or nearest track to the read
head, so not too much work. Next time you use that file the read head is
also likely to be in a similar position, unless of course you have defragged,
in which case it will likely be in some random position on the disk.
Not at all. Defragmenters consolidate files and directories.
And that is what I am saying could be the cause of the problem.
A file will have been moved from what was a convenient place to access into
a different place based upon directory structures. Initially the required
files might have been written on the same track; now they will be pretty
much scattered randomly all over the drive.
Actually, they can. At least most of the new ones offer sequencing and
placement options based on a number of file attributes.
I don't think that will be helpful.
Once the files are defragmented, the head can pick them up sequentially,
so no wear and tear. A defragmented drive with well-consolidated free
space suffers less fragmentation during future file writes.
Whilst the files themselves may be defragmented, a set of files used
as a functional group is likely scattered all over the drive.
It's a bit like an untidy desk: it may look untidy, but things tend to
be grouped together automatically by usage. Everything for a particular
function will tend to be grouped together by last usage, which is likely
the most convenient grouping for its next usage. When you tidy up that
desk you destroy that 'natural grouping'. Things become grouped by other
things unrelated to their most likely usage.
And the auto defraggers don't go to work 24x7; as I said, only when
necessary, and using the barest minimum of resources. Usually, they
would run for a few minutes a day at the most.
Better than the head going crazy *each time* it has to pick up a
fragmented file.
I don't think that would happen; the fragments would be initially
written to the most convenient space and hence be in a convenient space
when it comes to reading them again.
The first time I ever defragged my computer, to speed up the start-up time,
I timed it to see 'how much faster' it was. If anything it appeared to be
slower!!! (honest!!). I have not really bothered much with it after that;
it seems to make little or no difference. The six hours or so of constant
disk activity didn't really endear me to the idea either!!
You are right, that's quite a departure from the norm. It has never
been the case in my experience. Usually, manual defragmentation ought to
go as follows: [defragmentation of files] -> [boot-time defrag to defrag
the MFT, paging file etc.] -> [final file defrag]. Once this is done,
you are all set.
Well, whatever the case, I don't find fragmentation an issue for me.
My disk does not go crazy in general, and if it does I am pretty sure it is
nothing to do with fragmented files. More likely to do with excessive
paging; my view is once it starts trying to use your hard drive as RAM you
may as well give up, the difference in access times is colossal.
Not a waste of time at all, since it is completely automatic in nature.
And it is useful for those who use their systems heavily. I game, use
Photoshop, and my PC is my main entertainment device in my room, so
defragging definitely helps me.
Well, in my experience it makes no noticeable difference; I did it several
times on my old system and it seemed exactly the same, if not worse. Even
if you defrag individual files you will often be working with hundreds of
small files anyway, which is the same as one file in a hundred fragments.
Defragging may well put these 100 files in less convenient places than
those which they were initially in, so it's swings and roundabouts. I
certainly have no intention whatsoever of defragging any of my drives at
the moment. I think it would more likely make things worse than better,
and as it is fine at the moment it is not a risk I am prepared to take.
As for AV scans, if your AV setup is good in the first place, no
viruses will get through the net; but fragmentation is an inherent trait
(er, 'feature', thanks Microsoft!) of the FAT and NTFS file systems.
Others such as ext3 don't suffer as much from this.
Anything Microsoft produces is rubbish; it takes 2 seconds to pop up my
volume control, from RAM. No amount of defragging will make a silk
purse out of a cow's ear. Enough said.
If you say there is no drawback or benefit from disabling the paging
file apart from the relative lack of HDD activity, then it does not seem
to be necessary to take the risk. Maybe I can try it out on my office PC
which is er..'expendable' and ironically contains no important data.
Well, I was a little worried at first. "Will it crash?" I thought, but it
has been fine for about a week now, and considerably quieter I would say.
Certainly no noisier.
That slowdown could have been due to a number of reasons including
fragmentation or a fragmented paging file, background processes/programs
accessing the disk etc.
Actually, I've never had any problems with the paging file being
enabled since it sits inside its own little partition on the outer edge
of the platter. In fact, I can't remember the last time my system BSODed
or hard crashed. It's always been running smoothly since I first built it
2 years ago with an A64/1GB RAM as the starting point. I upgraded the
system to Intel only recently.
Never had a BSOD on mine yet; it seemed to have locked up a couple
of times, but generally I just reboot pretty quickly rather than wait to
see if it 'sorts itself out' and then have to reboot anyway. Better to
reboot in a couple of minutes than wait 5 hoping it will cure itself!!
You do have a point, that RAM is always much faster than the HDD, but
it still has to get the poker files from the HDD to the RAM, and that's
where the bottleneck comes in. I doubt paging has much to do with it.
No it can't really.
Actually, another poker site puts all the poker hand histories into one big
file, or several big files, and I think this is a much better approach:
much less disk activity with, say, one 40 MB file than with 40,000 1 KB
files, and I do mean much less, I would say at least 50 times faster.
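That tallies with the overhead argument: every small file costs at least
one directory lookup, an open, a seek and a close, while one big file pays
that once and then reads mostly sequentially. A minimal sketch of the two
approaches (hypothetical paths, just to illustrate the shape of the code):

    import glob, time

    def read_many_small(pattern="hands/*.txt"):
        # one open / seek / read / close per hand-history file (~40,000 of them)
        parts = []
        for name in glob.glob(pattern):
            with open(name) as f:
                parts.append(f.read())
        return "".join(parts)

    def read_one_big(path="hands_all.txt"):
        # the same data appended into one ~40 MB file: a single open,
        # then a mostly sequential read
        with open(path) as f:
            return f.read()

    for fn in (read_many_small, read_one_big):
        start = time.time()
        fn()
        print(fn.__name__, round(time.time() - start, 2), "s")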
It was a bit of a pain modifying the program though, especially as I was
not 100% sure of the structure of the history files initially. I am now,
so the second program is structured better. I think I would also be better
off bunging other sites' files into one big file too. Mind you, the
statistics it gathers on a player are not of much use; they don't tell you
what cards he holds, and it is easier to guess from how he plays his
current hand than from statistics on how he played his previous hands. So
counterproductive in a way, but it sharpened up my programming skills.