
Think about Defragmentation!!!


enoskype


Hi Everybody,

 

Here is my two pennies' worth of defragmentation experience.

(I have been doing that since the beginning. :))

 

I have attached two images: one annotated with a bit of what is what, and the other a plain GUI (Graphical User Interface) comparison of four defragmenters (I have 11 of them on the same notebook), including ISD (IObit SmartDefrag Beta 4.02). The underlying images are the same.

 

I have to point out that two of them are paid ones.

 

I moved my defragmented, encrypted 1 GB (gigabyte) virtual-drive file to the innermost clusters of the disk with one of the defragmenters, and then excluded it in all three of the other defragmenters (except ISD, of course).

 

I first defragmented my disk offline (at boot time) with one of them, and then defragmented it again with the same defragmenter, taking the file access dates into consideration. (This defragmenter also puts the boot files at the beginning of the disk.)

 

Then, with the first defragmenter I mentioned above, I placed the directories close to the MFT (Master File Table).

 

There you go... you have a disk with almost ideally placed and defragmented files. (This is for a general-purpose, everyday PC, of course. For a database server, old data should go to the innermost clusters of the disk.)

 

Contrary to some of my friends in the Forum, if you have enough RAM (Random Access Memory) and CPU (Central Processing Unit) power (by my standards, 1 GB for XP and 2 GB for Vista is enough), I believe in neither stopping any programs in the Taskbar, nor defragmenting in Safe Mode without startup programs. That is what I did when I defragmented the disk you see in the image, and I was connected to the internet over wireless at the time of defragmentation.

 

The moment you start a program, fragmentation occurs.

In my humble opinion, we should defrag the disk in the state in which we are as ready as possible to start working on the PC.

 

I will not go into the details of the Layout.ini file, but it may be useful to mention that every three days Windows places files according to it, and some defragmenters have an option to respect it.

 

I will also not go into the details of the MFT and the metadata (data about data) files, but it is good to know that they are the most important system files; keeping them contiguous, i.e. defragmented, has a positive effect on speed.

The other point is to have an enlargement area for the MFT immediately following it, to make sure the MFT does not become fragmented when it grows in the future. By default, in XP, this zone is 12.5% of the disk space. It is empty space, and Windows will use it for normal operations if and only if there is no other empty space left on the disk.

You can enlarge the default size of the MFT enlargement area if you wish. (In my opinion, the default is more than enough.)
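
If you want to check or change that by script rather than with a GUI tool, the MFT zone size lives in the registry. A hedged sketch for XP (NtfsMftZoneReservation takes 1-4, where 1 is the 12.5% default and each step adds another 12.5%; the value 2 below is only an example, a reboot is needed, and the query simply errors if the value has never been set, meaning the default is in effect):

reg query "HKLM\SYSTEM\CurrentControlSet\Control\FileSystem" /v NtfsMftZoneReservation

reg add "HKLM\SYSTEM\CurrentControlSet\Control\FileSystem" /v NtfsMftZoneReservation /t REG_DWORD /d 2 /f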

 

The page file (swap file) [virtual memory, a substitute for extra RAM] and, if you have enabled hibernation on your PC, the hibernation file should both be contiguous and close to each other.

 

Because of zoned-bit recording [a method of optimizing a hard drive (at the factory) by placing more sectors on the outer tracks than on the inner tracks; standard practice for all modern hard drives] and the constant angular velocity, the outermost tracks of the disk are the fastest, and when Windows starts, it starts reading from the outer tracks of the disk.

Therefore, the boot files should be there, on the outermost tracks (the beginning) of the disk.

Unlike in FAT (File Allocation Table) and FAT32, in NTFS (New Technology File System) the directories should be close to the MFT.

 

------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

 

Although I know the logic behind all the other three defragmenters, unfortunately I do not know the logic behind the "Defrag and Optimize" function of ISD.

In the GUI of ISD, some clusters are shown as empty although they are not, and the system files are not shown correctly.

The one thing ISD has in its GUI that the other three do not: after defragmentation, the defragmented files are shown in a different color. (This is not visible in the ISD image, because only an analysis was done.)

 

------------------------------------------------------------------------------------------------------------------------------

 

Below is my way of approaching defragmentation on a PC. (Maintenance, in fact.) A command-prompt sketch of the scriptable steps follows the list.

 

The sequence may differ from PC to PC.

 

- Take an Image of the disk.

 

- Clean the PC of viruses, junk, spyware, malware, ActiveX remnants, and useless registry entries. (All sectors.) [AWC (Advanced WindowsCare) and others]

 

- Immunize it. (AWC & others).

 

- Run => cmd => chkdsk /R, to be executed at boot time. (Check Disk.)

 

- Windows Update and other updates.

 

- SFC purge, and SFC scan. (System File Checker.)

 

- Clean again. (AWC and others.) If you wish, you can clean up the Windows Update setup files with "Dial-a-fix" and the Windows Update uninstallers with "CCleaner". If you are using MS Outlook Express, you can delete "Deleted Items.dbx" manually; it is usually at "C:\Documents and Settings\User Name\Local Settings\Application Data\Identities\{User ID}\Microsoft\Outlook Express".

 

- After making sure everything is OK, take an image of the disk.

 

- Clear all Windows Restore Points.

 

- Create a Windows Restore Point.

 

- Make sure that your page file is at least 50% larger than the size of your RAM.

 

- If you are going to use hibernation on your computer, this is a good time to enable it in Power Settings.

 

- Run a "defragment only" strategy defrag.

 

- Offline defrag (boot-time defrag; defragmentation of the system files).

 

- Optimized defrag (defragmentation according to your needs and preferences): recency, consolidated free space, folder/file name, high performance (outer clusters), archive (inner clusters), respect Layout.ini, inclusion, exclusion, smart replacement, etc.

 

- Take a clean Disk Image.

 

- Ideally, the storage area of the image on the backup disk should not be touched by the Windows OS.
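
For reference, here is a minimal command-prompt sketch of the steps above that are plain Windows commands (XP assumed; C: is just an example, and the cleaning, immunizing, and imaging steps still need their own tools):

rem Schedule a thorough surface check for the next restart (answer Y when asked)
chkdsk C: /R

rem Purge the System File Checker cache, then verify all protected system files
sfc /purgecache
sfc /scannow

rem Enable hibernation from the command line, if you plan to use it
powercfg /hibernate on

rem Plain "defragment only" pass with the built-in defragmenter
defrag C: -v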

 

------------------------------------------------------------------------------------------------------------------------------

 

 

I thought that if anybody is interested in digging deeper into this, and if I know the answer, or somebody else knows it, we could share our knowledge here.

 

Cheers.

 

Note:

The attached zip files contain the same images in BMP format, each around 5 MB after extraction.

They are included in case anybody is interested in seeing the images at a better resolution.

Please also zoom to full screen to view all of them at their best.

 

A printable PDF version of this thread can be found as an attachment here:

http://forums.iobit.com/showpost.php?p=31475&postcount=9

4Defragmenters.zip

4Defragmentersdescription.zip


Hi Enoskype!

This is very intriguing. To me, it seems you've covered all the bases. The question on my mind now is how long it will take before ISD comes up to that level (showing the metadata and system files). I do believe it does a great job, but I would like to see those file types shown sometime in the future. :] I'm one who loves knowing everything I can about my computer's current state.

 

It would also be nice to allow a boot-time defrag for the paging files and such in the future. Currently, I have "pagedfrg.exe" set to run every time my computer starts to cover that area. I am using an always-on 120 GB external hard drive as a paging-file extension, with a 2 GB paging file on it as well as the one on my laptop. I'm not sure whether pagedfrg can access and defrag that drive or not... but it's another idea for ISD. =]

 

 

PS: Thanks very much for your detailed documentation! I appreciate that!


Thanks for the appreciation ClearD989,

 

I agree with you for ISD.

 

For Page File defragmentation:

First, if you have just a bit more than enough RAM and you have left Windows to manage the paging file, Windows will set the paging file to 1.5 times the size of the RAM. (The hibernation file, by default, will be the same size as the RAM.)

The second choice is to set the paging file yourself; make it equal to or greater than 1.5 times the RAM (my preference is 1.5-2 times when RAM is 1 GB in XP).

THEN,

there is little chance that the paging file will become fragmented again once it has been defragmented.

RESULT:

I would not set pagedfrg.exe to run every time your computer starts; instead, check from time to time and run it only if it is needed.

 

You can check whether your external drive is accessed and/or defragged by pagedfrg by setting a very small page file on your external drive and no page file on the main drive.

Windows will use the page file on the external drive, and after you put some files there, Windows will have to create more space for the paging file in another location on the external drive. Then you can try to defrag that fragmented page file.
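
If you want to set that test up by script rather than through System Properties, the page-file list lives in a single registry value. A hedged sketch (E: is a hypothetical external drive; the two numbers are initial and maximum size in MB, and a reboot is required for the change to take effect):

reg add "HKLM\SYSTEM\CurrentControlSet\Control\Session Manager\Memory Management" /v PagingFiles /t REG_MULTI_SZ /d "E:\pagefile.sys 16 16" /f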

 

It may not work for some external connection types (USB, network cable, etc.) and some types of external drives, though.

 

Let's hear your experiences.

 

 

Hi solbjerg,

 

Thank you for the attached file and the rest of it.:-D :-D :-D

I have opened the abbreviations.

 

Cheers.


Hi enoskype,

You have some great ideas here, and seeing the comparisons of the different programs gives a different perspective on how each one works. I also agree that an offline (boot) defrag would be a very important addition to ISD!

I am curious about this statement:

"With contrary to some of my friends in the Forum, if you have enough RAM (Random Access Memory) and CPU (Central Processing Unit), I am a believer of neither stopping any programs in the Taskbar, nor defragmenting in the Safe Mode without Startup programs."

I have always been under the impression that stopping the startup programs, or defragging in Safe Mode, allows those programs to be defragged, whereas they can't be defragged properly while in use. Could you explain this further?

I would also like to hear anyone else's thoughts on this (and any of the other ideas you bring up on defrag!)

Thanks, samr.


Hi enoskype!

 

This is just because I have a suspicion that some of the problems we read about here in the forum are caused by too little free space and/or almost-full disks.

 

I just want to add that sufficient free space is an important factor in the speed of the computer; it is also essential to leave enough space for the defragmentation to be able to run.

Furthermore, an almost-full disk will fragment much more than a disk that is only half full or less. You will not notice much of a speed increase on a lightly filled disk after defragmentation, but the hard drive may last longer, because the heads do not have to travel as much as they do when the disk is fragmented.

 

Thank you!

 

 


Hi sunny!

This is undoubtedly true, but it depends to a great extent on how much deleting, installing, and general use has happened between defragmentations.

By the way, have you noticed whether you have fewer programs in your startup list?

 

Cheers

 

 

IObit defrag seems to improve greatly after it has been run about 5-6 times. It used to leave stuff fragmented, but it no longer does, and it runs quicker.

solbjerg

 

No difference in startup programs; I am always updating/installing/deleting programs and running various utilities to establish which ones perform well.

 

There are quite a few good free defrag programs about now. IObit is not the fastest to defrag, but I feel its optimization is much better, which I consider more important than the overall speed of the defrag.


Sorry for being late to respond to your posts guys.

 

Hi samr,

 

Thank you for bringing that point.

 

I would not want to stop the programs, because as soon as I start them again they will be fragmented all over the disk.

In fact, since they are in RAM, they can be defragmented while they sit in the background. That is why I said "if you have enough RAM".

Example: my notebook, whose as-is situation the images in my post above show, has 154,000-plus files, 14,000-plus directories, 28 startup programs, and 21 icons in the taskbar, on a 1.6 GHz Pentium M Centrino with 1 GB RAM running XP Pro SP2. (It is no different on my Vista Ultimate.)

The system files cannot be defragged while they are in use (well, not all of them).

When running in Safe Mode, some of the system files and drivers are not running, so they can be defragged at that stage, giving the impression that more has been defragged. (This is true, but they may be fragmented again at the next boot.)

In my opinion, the best optimization is done when the PC is in its regular state, i.e. it has started and settled itself, cleaned and ready for a new task in a steady condition.

If you defrag in Safe Mode, by the time you restart, the disk is already fragmented again to some extent.

I may be wrong, but I would like to hear from anybody who can get a nearly 100% defragmented disk right after a normal start following a Safe Mode defragmentation.

 

 

Hi solbjerg,

 

I do agree with you about the availability of sufficient empty space on the disk to be defragmented. A very good point to bring up.

 

Another point may be as follows.

 

-----------------

Supposing you have a second partition or disk, such as D:,

I made the following comment in one of the threads, and I agree with the suggestion, so ISD may yet be the first defragmenter to do this:

 

 

If you move enough files to your drive D:, you can defragment your "fuller" drive C:.

After defragmentation, if you have enough contiguous empty space on drive C:, the files will be placed contiguously when you move them back from drive D: to drive C:.

 

One of the forum members suggested that this function could be put in ISD.

 

Would the defrag run better on a "fuller" disk if you could write files to a second disk? My hard drive, like many, has a "C" and a "D" drive on one physical drive. If you were able to move some files out and back, would it help in defragging?

 

JerryW

The defrag software could use the available space on "D" (maybe with permission) so as to do a better defrag and end up with a better product.

 

JerryW

-----------------------------------------------
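
Until a defragmenter does this automatically, the trick can be done by hand. A hedged command-prompt sketch (the folder names are made up for illustration; xcopy is used so subfolders, hidden files, and attributes survive the round trip):

rem Park a bulky folder on D:, keeping subfolders, hidden files and attributes
xcopy "C:\Big Archive" "D:\Parked\Big Archive\" /E /H /K
rd /s /q "C:\Big Archive"

rem Defragment the now-emptier C:
defrag C: -v

rem Move the folder back; with enough contiguous free space it lands contiguously
xcopy "D:\Parked\Big Archive" "C:\Big Archive\" /E /H /K
rd /s /q "D:\Parked\Big Archive"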

 

 

Cheers to you all!!!

 

 

Happy New Year.


Strategic File Placement

 

Hi again,

 

Here is a pretentious claim by a defragmentation expert; I thought it might be interesting to bring it to your attention for discussion.

 

Hard drive file usage follows Pareto's Law (the 80/20 rule): 80% of the time, you use only 20% of the files on your computer.

The other 80% of the files, which you use only 20% of the time, compete for the space where you get the highest drive performance, slowing your hard drive down... significantly!

 

That doesn't mean that you remove those rarely used files.

What it does mean is that you need to relocate those rarely used files to the slower section of your hard drive, the inner tracks. In other words, get them out of the way.

 

Then take the remaining 20% (the files that you use the most) and place them on the outer tracks of your drive, where performance is 180% to 240% of that of the inner tracks.

 

You then consolidate them closely together (i.e., maximize seek confinement). The result is that the data you actually access (booting the computer, loading programs, frequently used data files) sits on the area of your disk that is about 200% faster, and because it is all close together, you achieve track-to-track and near-instantaneous seeks, with seek times 100% to 1000% quicker than the drive's quoted seek times.

This is called strategic file placement and the performance gains are remarkable.


Interesting enoskype!

I did a little mental calculation

 

A standard CD is 6 cm in radius

The inner radius where no data is stored is 2 cm

This means that you have an area of (π*6²) – (π*2²) = approximately 100 cm² to fill with data.

This again means that roughly the outermost 1.5 cm of the radius contains half the data (solving π·(6² − r²) = 50 cm² for r gives r ≈ 4.5 cm).

The difference in speed between accessing the outermost data and the innermost data is approximately the ratio of the mean radii of the two halves, 5.25 to 3.25, which makes the outer circle roughly 60% faster than the inner circle.
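
Spelling that arithmetic out (a worked restatement of the mental calculation above):

\[
\pi\,(6^2 - 2^2) \approx 100\ \mathrm{cm}^2, \qquad
\pi\,(6^2 - r^2) = 50\ \mathrm{cm}^2 \;\Rightarrow\; r \approx 4.5\ \mathrm{cm},
\]
\[
\text{speed ratio} \approx \frac{(6 + 4.5)/2}{(4.5 + 2)/2} = \frac{5.25}{3.25} \approx 1.6 .
\]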

 

How much of the disk in a hard drive is inaccessible I do not know, but if only the inner 1.5 cm of the radius is unused, the difference in reading speed can be approximately 4 times (the ratio of outer to inner radius, 6 to 1.5); in other words, up to 400% higher in the outer region!

 

Cheers

 

 


Hi Enoskype,

So this should be the goal of a well-optimized disk after defragmentation? Would there also be a goal of leaving space near the outer disc for programs that use temp files, or would those be placed more toward the slower inner section?

samr.


Hi solbjerg,

 

I am not sure about the usable area of a hard disk; I think it may also differ between a desktop PC and a laptop. But your calculation is not far off, and the writer is talking about a 100-240% performance difference between the innermost and outermost tracks.

Very good approach my friend.

 

 

Hi samr,

 

Very good points.

 

In my opinion, this could be one of the methods of defragmentation optimization.

Coming to your placement of temp files: although one defrag program maker prefers to leave a space near the outer disc for temp files, this is not my cup of tea.

You would have to be doing things very regularly and cleaning your PC very regularly.

 

I have given my point of view about that in the post which I quote below.

 

Hi adchia,

 

I am afraid I don't agree with you.

The beginning of the disk (the first clusters) should belong to the boot files, according to me, to Microsoft, and to most of the well-known defragmenter programmers.

$Boot (clusters 0-2), NTLDR (2-60), NTDETECT.COM (60-72), ntoskrnl.exe (72-605), bootvid.dll (605-608), system (626-4338), and similar files are all there.

 

Yes, this part is the outermost track and also the fastest part of the disk, thanks to zoned-bit recording (a method of optimizing a hard drive at the factory).

 

But, with all due respect to Jeroen, I would rather have the temp files nearer to the most recently used files, where they are closer to the middle of the drive and nearer to the directories, the MFT, the page file, the metadata, and the hibernation file.

 

I have to point out that nearly all defrag program makers have their own theories about the best placement of files. This is an ongoing discussion.

 

Anyway, Layout.ini will change all that every three days if you don't deliberately stop Windows from doing so.

 

Cheers.

 

 

I would like to remind you that all of this is about an everyday, general-purpose PC.

 

Cheers to you all.


Hi enoskype!

 

I have now found that the radius of the disk is 7 cm and that the inner inaccessible part is somewhere between 1.5 and 2 cm, which makes the calculation somewhat different: storage room > 140 cm² per disk, and a difference in speed between roughly 300% and almost 500%. A lot of factors influence that - every disk has different sector zones - but I do not know what governs their behaviour.

I think the most common disks operate at speeds around 7,200 revolutions per minute, but I read that some operate at 15,000.

By the way, the access time for data is measured in milliseconds.

A hard disk can contain several platters - absolutely fantastic that the read head can pick up data at those speeds, practically without error (about 50 mph (80 km/h) at the outer rim).

Very interesting subject my friend!

 

Cheers

 


(Command window: "defrag C: -b".) Did you know that?

 

Hello guys,

 

Here is how you manually execute what Windows does every three days.

 

Click "Start" button=>click "Run"=>type defrag C: -b=>click OK

or,

Click "Start" button=>click "Run"=>type cmd=click OK=>command window=>type defrag C: -b to command prompt=>hit Enter

 

 

On a Windows XP-based computer, the Layout.ini file is in the %windir%\Prefetch\ folder. The Layout.ini file lists files and directories so that the Windows operating system can reference these files and directories when the operating system starts or when an application starts.

 

The Windows operating system puts these files together on the disk to reduce seek time and to improve performance. The Layout.ini file is updated one time every three days to make the future startup of the Windows operating system and the startup of certain applications more efficient.
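
If you are curious about what ends up in that list, Layout.ini is a plain text file and can simply be opened read-only, for example:

notepad %windir%\Prefetch\Layout.ini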

 

 

Uninstalling and installing programs takes its toll on the old hard drive, and consequently major fragmentation occurs.

 

Apparently, to improve the speed of your starting applications, Windows XP continually monitors the files that are used when the computer starts and when you start applications. It then creates an index that lists segments of frequently used programs and the order in which they are loaded. This prefetching process improves performance by allowing the operating system to grab program files quickly.

 

Windows XP runs this about every 3 days, but if you want to optimize manually, all you need to do is open a command window and type defrag c: -b.

 

This will do a very fast defrag of your starting applications and put them on the outer part of your hard drive, making it feel as though you have just completed a full re-installation of Windows XP!

 

For cases where you can't run a full defrag (it can indeed be slow, especially if it hasn't been done in a while), just being able to move the most critical applications around is very handy.

 

I knew that the switches below,

 

[-a] Analyze only,

[-f] Force defragmentation even if free space is low,

[-v] Verbose output (detailed, comprehensive),

[-?] Display help text.

 

worked with the defrag command, but I didn't know that there existed a

 

[-b] switch
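
So, on XP, a quick combination might be to analyze verbosely first and then run the boot-files pass (C: is just an example):

defrag C: -a -v

defrag C: -b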

 

Enjoy it!!!

 

Cheers.


Hi enoskype!!

A very good tip!!!! I'll elect it "tip of the year"!

Cheers and thank you!

solbjerg

 


fsutil dirty query C:

 

solbjerg has reminded me of the command fsutil dirty query.

 

The fsutil dirty query and fsutil dirty set commands are related to defragmentation, as follows.

 

fsutil dirty query checks whether a volume's dirty bit is set; fsutil dirty set sets it. When a volume's dirty bit is set, autochk automatically checks the volume for errors the next time the computer is restarted.

 

If a volume's dirty bit is set, this indicates that the file system may be in an inconsistent state. The dirty bit can be set because the volume is online and has outstanding changes, because changes were made to the volume and the computer shutdown before the changes were committed to disk, or because corruption was detected on the volume. If the dirty bit is set when the computer restarts, chkdsk runs to verify the consistency of the volume.

 

Every time you start a computer running in Windows XP, Autochk.exe is called by the Kernel to scan all volumes to check if the volume dirty bit is set. If the dirty bit is set, autochk performs an immediate chkdsk /f on that volume. Chkdsk /f verifies file system integrity and attempts to fix any problems with the volume.

 

You cannot defragment volumes that the file system has marked as dirty, which indicates possible corruption. You must run chkdsk on a dirty volume before you can defragment it. You can determine if a volume is dirty by using the fsutil dirty query command.

 

When Windows or a defrag program refuses to defragment because of bad sectors, first use the query parameter to confirm that the disk is dirty; setting the dirty bit with the set parameter then forces chkdsk to run at the next restart, and once the volume has been checked, the defragmenter is allowed to run.

 

Usage is as below:

 

1)

fsutil dirty query C:

 

Response will be either "Volume C: is dirty" or "Volume C: is not dirty"

 

2)

fsutil dirty set C:

 

This sets the dirty bit on drive C:; autochk will then check the volume at the next restart.
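
Putting the pieces together, here is a minimal batch sketch (assuming XP's fsutil, chkdsk, and defrag; the exact wording of the fsutil output can vary slightly, which is why the test looks for "not dirty"):

@echo off
rem Look for "not dirty" in the query output; if absent, the volume is dirty
fsutil dirty query C: | find /i "not dirty" >nul
if errorlevel 1 (
    echo Volume C: is dirty - scheduling chkdsk for the next restart...
    echo Y | chkdsk C: /F
) else (
    echo Volume C: is clean - safe to defragment.
    defrag C:
)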

 

Thank you solbjerg and cheers.


MFT (Master File Table) defragmentation in Vista

 

Here is another one, from Microsoft TechNet, which is not known to many users:

 

One of the features of the Windows Vista hard disk defragmentation utility is:

 

Master File Table (MFT) defragmentation

If the MFT is spread into multiple fragments, the defrag engine can combine the MFT fragments during defragmentation.
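
If you prefer the command line, Vista's built-in defrag.exe drives the same engine. A hedged example (as far as I know, -w requests a full defragmentation and -v a verbose report):

defrag C: -v -w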

 

Cheers.


Hi ClearD989,

 

I am positive that IObit is going to put a similar feature into IObit SmartDefrag in the future.

 

I would like to add something to your opinion in this post and in this post in General Discussion of ISD section.

Although I agree with you that different defragmenters use different "best placement of the files" algorithms, the "defragmentation only" algorithm follows the same concept in all of them: all files are made contiguous, wherever they happen to sit. (No free-space defragmentation or optimization.)

Most probably, your percentage difference came from system-file fragmentation and from the excess size allocated to the System Restore files.

 

Cheers.


Hi enoskype,

 

That is entirely correct! I guess I was thinking about the different algorithms for defragmenting and optimizing. Would you happen to know which strategy is utilized by SmartDefrag in its Optimize setting?

 

Hi sunny,

 

I also use that program on occasion, and thanks to enoskype (if I recall), I no longer run it every time my computer starts, lol. It's a handy little tool to have!

