
Think about Defragmentation!!!



Contig & Power Defragmenter GUI


Hi sunny staines,


Thanks for Page Defrag from Sysinternals. ClearD989 mentioned it in post #2 of this thread and in this post.

As I mentioned somewhere else, Sysinternals was bought by Microsoft in July 2006.


It is a very useful program, as are the other tools from Sysinternals.


The mother of defrag programs, "Contig.exe", also comes from them.

You can download Contig version 1.55 (100 KB) from here.

Put it in the folder where your command prompt opens, and run it from the command prompt window.
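For reference, Contig's documented switches (-v verbose, -a analyze only, -s recurse into subdirectories) can be wrapped in a small helper. This is only an illustrative sketch; actually running the command requires Windows with contig.exe on the PATH:

```python
import subprocess  # only needed if you actually run the command on Windows

def contig_cmd(path, analyze_only=False, recurse=False):
    """Build a Contig v1.55 command line.

    Switches (from Contig's own usage text): -v verbose output,
    -a analyze fragmentation only, -s recurse into subdirectories.
    """
    cmd = ["contig", "-v"]
    if analyze_only:
        cmd.append("-a")
    if recurse:
        cmd.append("-s")
    cmd.append(path)
    return cmd

# On Windows, from the folder holding contig.exe, you could then run:
# subprocess.run(contig_cmd(r"C:\Data\bigfile.iso"), check=True)
```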


To make it easier to use, there is a relatively small utility (483 KB) called Power Defragmenter, which gives Contig.exe a GUI (graphical user interface).

You can download PowerDefragmenter.zip from here.


Just extract the zip file to a directory of your choice, and put Contig.exe in the same directory.


Run "Power Defragmenter GUI.exe" from the same folder where Contig.exe is, and you can defrag a single file, a single folder, multiple files and folders (up to 4 at a time), or the whole disk.


Enjoy it!!!



Hi ClearD989,


As I mentioned in post #1 of this thread, unfortunately I don't know the logic of Optimize in the recent ISD. My guess is that, in the older versions, it did defragmentation and free space optimization.


About Page Defrag, thank you, you recall correctly from post #4 of this thread. :smile:



Link to comment
Share on other sites


Hi enoskype!


Wow... I downloaded Contig and the GUI and ran it, and it was quite impressive! It found a single file (nearly a GB large) that was in over 1,200 fragments and (magically) was missed by my other defragmenters! Isn't that a little strange? It returned it to one fragment, though, and brought me to 0% fragmentation. I must say I really like this program, it's very powerful. Thanks for posting it! =]


  • 3 weeks later...

Disable/Enable "Optimize hard disk when idle" in XP.


Hi again,


You can enable or disable "Optimize hard disk when idle" function of Windows from within "Tweak UI", which is a Microsoft product, part of "Microsoft PowerToys for Windows XP".


Tweak UI 2.10 is downloadable from here.


If you wish to disable Windows' "rearrange files on the hard disk when the computer is not in use to improve performance" function, open Tweak UI 2.10, click the General link, scroll under Settings, uncheck the "Optimize hard disk when idle" checkbox, and click the "OK" button.


There may be times when you want to disable this:


1) You are using a third-party disk defragmenter.

2) You are running a laptop on battery, and defragmenting your hard disk can use up power.
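The setting Tweak UI flips is widely reported to be the EnableAutoLayout value under the OptimalLayout registry key; treat that key name as an assumption (it is not stated in this thread), and back up your registry before importing anything. A sketch that generates the corresponding .reg file:

```python
def autolayout_reg(enable):
    """Build .reg file text toggling XP's idle-time boot layout optimization.

    The OptimalLayout\\EnableAutoLayout value is the setting commonly
    reported to back Tweak UI's "Optimize hard disk when idle" checkbox;
    the key name is an assumption, not confirmed by this thread.
    """
    lines = [
        "Windows Registry Editor Version 5.00",
        "",
        r"[HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows"
        r"\CurrentVersion\OptimalLayout]",
        '"EnableAutoLayout"=dword:%08x' % (1 if enable else 0),
        "",
    ]
    return "\r\n".join(lines)
```

Save the output as a .reg file and double-click it on XP to apply; a reboot may be needed before the change takes effect.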





  • 2 months later...



BootVis is a Microsoft application that checks how long a Windows XP PC takes to boot and then optimizes the boot process; it is no longer available for download from Microsoft.


Microsoft states: "BootVis.exe is a performance tracing and visualization tool that Microsoft designed to help PC system designers and software developers identify performance issues for boot/resume timing while developing new PC products or supporting software. The boot optimization routines invoked by Bootvis.exe are built into Windows XP. These routines run automatically at pre-determined times as part of the normal operation of the operating system."


This reminds me of the Command window usage "defrag C: -b" and of Tweak UI 2.10.


BootVis defines boot and resume times as the time from when the power switch is pressed to the time at which the user is able to start a program from a desktop shortcut. The application measures time taken during Windows XP's boot or resume period. BootVis can also defragment the files accessed during boot, via the "Optimize System" option in the program, to improve startup performance. (However, a similar optimization is already done in the background by Windows XP every three days.)


Windows XP monitors the boot process over a number of boots. BootVis is used primarily to diagnose slow boots by identifying drivers that take a long time to load (which can indicate a poorly written driver). It can also monitor a single boot process and then take that information (exactly the same information that Windows XP gathers over a period of system boots) and perform the optimizations to driver loading. Another option that BootVis allows you to "perform now" is the partial defrag of the layout.ini files.


You can download BootVis (bootvis.msi) from here or here.


It is interesting to read this comment.


Please also have a look at Clayton's post.




  • 1 month later...

Vista defrag improvements over XP defrag.


One of the improvements in Vista's defragmentation is that, unlike Windows XP, Windows Vista does not require 15% free space (unless you use the -w parameter from the command line). If there is any free space at all, Defrag will attempt to defragment the volume.


Although it needs confirmation, Vista SP1's defragmentation also appears to be improved over the original Vista defragmentation, decreasing the amount of time it takes to defragment a volume and providing better defragmentation. :roll:




EDIT: It is confirmed, please have a look at post #46 of this thread.


  • 4 weeks later...

Optimize Defrag Algorithm


Hello guys,


For the first time in the history of IObit's defragmentation tools, the company has announced the likely algorithm of the defrag engine (Smart Defrag).


Please find below the quote about "Smart Defrag" in AWC3 Beta 2.7.1:


Enable Smart Defrag

Intelligently organizes drive data for maximum program performance and long-lasting data contiguity. Smart Defrag is an advanced, patent-pending technology of IObit. Files are organized by their creation and modification dates. The theory is that files that have not changed recently are less likely to change in the future. These older files are grouped together so that once Smart Defrag has placed them, future defragmentation passes are less likely to move them again. Thus, future defragmentation will work much better and keep the hard disk healthy as well.
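The quoted idea -- group files by how recently they changed, so stable files can be packed together and left alone by later passes -- can be sketched in a few lines. This is my own illustrative model of that theory, not IObit's actual engine:

```python
import time

def group_by_age(files, now=None, stale_days=30):
    """Split (name, mtime) pairs into 'stable' and 'active' groups.

    Files unchanged for stale_days are assumed unlikely to change again;
    a defragmenter could pack them together so future passes rarely have
    to move them. Illustrative only -- not IObit's code, and the 30-day
    threshold is an arbitrary assumption.
    """
    now = time.time() if now is None else now
    cutoff = now - stale_days * 86400
    stable = [name for name, mtime in files if mtime <= cutoff]
    active = [name for name, mtime in files if mtime > cutoff]
    return stable, active
```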




  • 1 month later...

ISD Beta 6.10 Defrag & Optimize


Hi again everybody,


Before the official version is released, I just wanted to show graphically how ISD Beta 6.10 changes the location of files on a disk that was defragmented by "PerfectDisk v8.0" using "SMARTPlacement" with "Aggressive free space consolidation", when you run Defrag & Optimize just afterwards.


The images are self-explanatory, and you can compare them by using the colours and definitions in the Legend section on the left side of the images.

(More info about the legend of files in the images in post #1 of this thread)



The yellow, excluded files and folders at the bottom are:

-C:\....\....\EncryptedDrive (1GB)

-C:\RETURNIL (Virtual drive) (5GB)








, and


The yellow, excluded files in the middle are:

-C:\pagefile.sys (Page File)

-C:\hiberfil.sys (Hibernation File)





We will certainly need an Exclude Files function, together with Offline Defrag, in future releases of ISD. :idea:






ISD Beta 0.6.1 is now at Official 1.0


Even though it is now officially released onto the Web market, this newest instalment should work in exactly the same manner. 8)


Therefore, the above post by Enoskype is still totally valid, unless proven otherwise in the near future. :idea:


Thank you all for your undivided understanding over this huge change in stance of this 100% freeware utility program, which I am sure took everybody totally by surprise.



Some files shouldn't be defragged or optimized.


Read this thread from another defragmentation program as to why certain files can't be moved. Microsoft's defragger won't cause problems, since it knows that certain files can't be moved or very serious problems can occur. Thus, other defragger programs have to be made smart enough to account for these and skip defragging or moving (optimizing) them; otherwise, your computer may not boot up, as happened to this person. Thus, perfection in defragging should be constrained.




Use "Normal Sense" (Common sense isn't common anymore)


Similar to spyware programs, there are a ton of defragmenting programs. The question is always the same: which ones work best? Paid or free? In some cases the answer can only be found through extensive testing, while in other cases the answer is staring us right in the face.

Consider a huge punch bowl filled with 500,000 M&M's. There are 100,000 RED M&M's, each numbered 1 through 100,000, and of course they are all mixed up (fragmented) in the bowl. Your task? Find all 100,000 RED M&M's and lay them out on the table (your disk) in order from #1 to #100,000. A standard, run-of-the-mill defragmenting program will indeed sort out the RED M&M's, but not in order. This is good, because all the RED M&M's are now in one section of your disk and the track head has much less seeking to do to find them, but your track head STILL has to sort through all 100,000 RED M&M's to find the numbers it is looking for, because they are not in order. Keep in mind that there are still 400,000 M&M's of different colors that have to be dealt with!

Enter "optimization". The better programs do it in different ways, but the attempt is the same: to sort out all the RED M&M's and ALSO put them in order by number. The program that claims to do this best is Ultimate Defrag (both free and paid); however, I had many questions about UD, and the company refuses to answer them unless I pay for the full version. Then there is O&O Defrag, and PerfectDisk, and Diskeeper, and many, many others.

To date, in my experience, the most efficient defragging programs are the FREE ones: the free version of UltimateDefrag, JkDefrag, and IObit's SmartDefrag. I will tell you my favorite feature in a moment, but I can tell you that if you just want to keep your HOME system clean and as fast as possible, there is no need for an "always on" program that constantly defrags in the background. That's too much bloat and too many resources slowing you down.

You want a small-footprint, efficiently coded program that both defrags AND optimizes when you ask it to, or when you SCHEDULE it to. SmartDefrag does this very well. UD does it also, but the nicest feature of both programs is that they use the Windows Task Scheduler, which is built into Windows anyway. My favorite feature: I can program SmartDefrag to do its thing and go to bed. It will defrag and optimize and then shut down my system, so there is no need for me to wait around until it's done.

For "boot optimization", try defragmenting in SAFE mode. Some defragmenting programs will do this, and it is basically the same thing as boot optimization. By the way, if your page file gets messed up (fragmented), there are two good ways to handle that. One is to uninstall your page file, reboot, and then reinstall it; the other, which is my favorite, is a small program created and coded by Mark Russinovich, aptly called "PageDefrag", now part of Sysinternals, which was bought out by a company you may have heard of: Microsoft! PageDefrag is a free program that does exactly what it says it will do! Strange, in this day and age, to find a program that is coded extremely well and does exactly what it is supposed to do.

So, I would not spend any money on a defragmenting program, or an antivirus program, or a firewall. I've been doing this soooo long (too long) that my main concerns now are with system stability and a company's willingness to help should the need arise. The company Disktrix, which makes UltimateDefrag, fails miserably in the area of tech support unless you buy their program, and they tell you that up front. IObit is a totally different ball game with their willingness to help and answer questions. With a normal home computer, I would use only SmartDefrag, program it on a schedule, and get on with life without worrying about defragmentation! Don't eat the M&M's.


  • 2 weeks later...

Can SmartDefrag do the same?


Why not make SmartDefrag do an offline defrag of the system files, paging file, registry hives etc – everything that PageDefrag does?


Besides optimising the paging file, it needs to be placed in the optimal location on the disk. Some people say it's faster placed exactly in the MIDDLE of the disk, and others say it's faster placed at the BEGINNING of the disk; unfortunately I don't have the page sources to prove what I'm saying. :evil: Also, can the system files be placed at the beginning or another area of the drive to massively increase performance?


If SmartDefrag is to match its commercial rivals, it needs to perform an offline defrag of ALL the system files that it can't touch when Windows is running.


As for the paging file fragmenting, this can be avoided by setting both the initial and maximum size to 3x the amount of installed RAM (up to 4096 MB in Windows XP; I'm not sure if that applies to 64-bit versions too). However, you WILL need to alter these amounts if you change the amount of installed RAM in the future. After changing these values and rebooting, you will only need to defragment the paging file once, unless you change the settings again later.
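The sizing rule above (initial = maximum = 3x RAM, capped at 4096 MB) is easy to compute; here is a minimal sketch of it, assuming the 4096 MB cap the poster describes for 32-bit XP:

```python
def pagefile_size_mb(ram_mb, cap_mb=4096):
    """Fixed pagefile size per the rule in this post: 3x installed RAM,
    capped at 4096 MB (the limit the poster cites for Windows XP).
    Set both the initial and maximum size to this value so the file
    never grows or shrinks, and thus never fragments."""
    return min(3 * ram_mb, cap_mb)
```

For example, 512 MB of RAM gives a 1536 MB pagefile, while 2 GB of RAM hits the 4096 MB cap.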

Also, placing a second paging file on the FIRST partition of a DIFFERENT internal hard disk will increase performance, as well as having the first copy on the boot partition (source).


  • 3 weeks later...

Defrag program ratings? CNET, PCMag, etc.; also freeware for CD/DVD software


There are no consistent comparisons on any site of the efficacy, efficiency, or benefits of the widely available defragmenters.

PC Mag lists Executive Software's product, but neglects to list the other available ones such as IObit, DEFRAGLATER, etc.

What other independent group analysis of such utilities exists?

CNET is wide-ranging but perhaps incomplete.

Thank you for your suggestions, as I have too many duplicate programs without certainty of which will give the most effective outcome.

The same goes for copy and CD/DVD programs, in particular for freeware.




Defragmenters' Analysis Times.



Please find below some defragmenters' analysis times for disk defragmentation.

Each analysis was done under the same conditions.


OS________:XP sp3 Pro

Processor___:Intel Centrino M 1600MHz

RAM_______:1024 MB

Free RAM___:~460 MB

Disk_______:SEAGATE 7500RPM ST980825A


File System_________: NTFS

Total Disk Size______: 74.53 GB

Used Disk Space_____: 30.85 GB

Free Disk Space_____: 43.68 GB


Security software turned on at the time of analysis:

-McAfee VirusScan Plus V.9.0 (SystemGuards Protection Off)

-ZoneAlarm Free 7.0.483.000

-SS&D TeaTimer

-Windows Defender 1.1.1593.0 Def:1.45.633.0



O&O Defrag 2000 Free V3.5_____: 370 seconds

Smart Defrag Beta 1.02________: 109 seconds

PerfectDisk8 Build 67__________: 105 seconds

JK Defrag 3.36.02_____________: 35 seconds

UltimateDefrag________________: 23 seconds

Diskeeper Lite 7.0.418.0_______: 18 seconds

Piriform Defraggler 1.03.094____: 13 seconds





How to defragment the USN Journal?


I've currently seen absolutely no solution (not even in commercial applications, offline defragmenters, or defragmenters running from Linux) for defragmenting the USN Journal on the NTFS partition containing the Windows installation. The number of fragments is ever growing, and now this is the only fragmented part that I can't optimize (this is true on Windows XP as well as on Vista).


It seems that the USN Journal is ONLY handled by the NTFS filesystem driver, and this driver should internally run a background task to recollect all those fragments.


I've tried to disable the System Snapshots and to clean them up with CleanMgr (including after a failsafe boot to minimize the number of running processes and services, and booting from a BartPE live CD containing Windows XP, or a Linux kernel); however, this just collects some fragments, never most of them. For this reason, the USN journal now has more than 4000 fragments spread all over the disk, and this certainly explains why my disks have become so slow.


Do I have to completely reformat the NTFS partition and then restore the files from a backup?


This heavy fragmentation came after an installation of Service Pack 2 (which had failed completely the first time I tried it, due to a bug in the installer of one of its storage drivers, requiring a preinstallation of another driver from the PC manufacturer to get past this bug): I had to restore the system using the Windows built-in recovery from an archive, and since then, my PC has been dramatically slow.


Note: I don't want to reinstall everything from scratch: I'd lose several licences. If someone can tell me how to back up the installed partition, or rebuild it from a backup without also restoring the fragmentation of the USN journal, I would be very pleased.


Is there a Microsoft article somewhere that explains how to troubleshoot the USN journal?


Final note: CHKDSK does not detect any error in the USN journal. And I don't use the NT File Replication Service (NTFRS): should I enable it to perform the transfer of the partition?


Microsoft's PageDefrag CANNOT defragment the USN journal. I've tried to play with the defragmentation API, and visibly, you cannot move its clusters around. The USN journal is normally allocated ONCE, after the NTFS volume is created, the first time journaling is enabled on it. It normally never grows after that; the area is fixed on disk even though the journal is used as a cyclic file (that's why it is marked as a "sparse" file; however, it should contain two fragments at most, unless the total USN journal size is grown).


I've not found any API to truncate the used part of the USN journal, and you can't even move the unused allocated part of the journal as long as it is active (and there's no way to make it completely inactive once it has been created on a volume, because the NTFS driver keeps it constantly open, even when journaling gets disabled; this effectively puts a permanent exclusive system lock on it, including when booting in safe mode or from another partition, and there's visibly no way to mount an inactive partition without the NTFS driver also opening this journal, even if it keeps it idle when journaling is disabled on the volume).


If you look at a detailed map of the volume, you'll see these areas as unmovable (for example with JkDefrag, another free, but open-source, defragmenter: they appear in black).


So I see only one way to defragment the USN journal:

- disabling the journaling

- rebooting from another partition, in a system that DOES NOT use the Windows NTFS driver (this means using Linux or some DOS extender)

- in this system, emulating the NTFS driver and managing the NTFS structure completely (without using the Windows defragmentation API).


Note that Microsoft's Sysinternals PageDefrag uses the Windows kernel and its NTFS driver, as well as the Windows defragmentation API, but can defragment special files like the paging file because it does so at boot time, when those special files are not yet needed and still closed. Due to its design, it cannot defragment the USN journal, which must still be active for the NT defragmentation API to work reliably.



My suggested workaround:

- delete the USN journal from the NTFS volume: open a command prompt and run "fsutil usn deletejournal /D C:" <press ENTER>

- recreate the USN journal immediately: run "fsutil usn createjournal m=1024 a=128 C:" <press ENTER>

(change "C:" according to the drive letter of the NTFS volume where you need to defragment the USN journal)
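The two fsutil invocations above can be collected into a tiny helper. The m= and a= argument strings are copied verbatim from the post (as I understand fsutil's parsing, they are byte counts); running these requires an elevated command prompt on Windows:

```python
def usn_journal_cmds(drive="C:"):
    """Command lines from this post to delete and then recreate the USN
    journal on the given drive.

    m= (maximum size) and a= (allocation delta) values are copied
    verbatim from the post above; fsutil interprets them as byte counts.
    Run from an elevated command prompt on Windows only.
    """
    delete = ["fsutil", "usn", "deletejournal", "/D", drive]
    create = ["fsutil", "usn", "createjournal", "m=1024", "a=128", drive]
    return delete, create
```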

Caveat: filesystem replication will stop working and you'll need to restart the volume replication from scratch (incremental updates are lost; it will take time and lots of I/O to resynchronize the replication completely if you use NTFRS on a server, and you'll need admin authorization on the server to re-enable replication). This is not a problem if you are running a standalone desktop environment, but don't do it on a server relying on replication.

After this operation, make sure you create a new System Snapshot using "rstrui.exe" (in Vista, open the System control panel to do it; in XP, use the shortcut under "System Tools" in the "Accessories" menu), then clean up the old snapshots that are no longer usable: run "CleanMgr" and, in the advanced panel, click the button to keep only the last snapshot just created.

Before doing all that, make sure your system is fully functional, because you won't be able to revert it to a past snapshot.


Hi tester,

Read this from the above mentioned thread:



What the article does not say is where SmartDefrag gets the creation/modification dates. Apparently it takes them by looking at the file attributes in directories; however, this method is not reliable, and maintaining those dates within directories is dramatically slow.


The NTFS filesystem has another reliable (and much faster) way to track the dates of change, as demonstrated in an MSDN article: you can use the "last USN" field that is present in ALL entries of the MFT.


Every file or directory on NT has an associated entry in the MFT, except when they are "resident", i.e. when they are stored in the MFT entry of their parent container: this occurs when a stream or attribute is small enough to fit within the 1KB record in the MFT describing a file or directory.


Normally, for all files, the NTFS filename attribute is always resident, as well as the DOS 8.3 name, (most of the time) the fragment location map for its content (except when the file is too fragmented: the file location map may require its own MFT record to store the whole list of fragments), most named streams (such as the named stream added to files downloaded from the Internet to mark that they may be unsafe: these streams have only a few bytes of content when present), and the "standard" file attributes (compatible with DOS).


The file content may also be resident (not allocated separately) if it can fit in the file's MFT record, as well as the directory entries of a directory that only contains very few files (if more files get added to the directory, the resident content will be moved out of the MFT record to an external file).


But in all cases: all MFT records contain the value of the last USN assigned specifically to a version of a file; when a file or directory changes, and if journaling is enabled, the last USN entry gets updated to track the change. (Microsoft indicates that the last USN contains a timestamp, this may change in some future to use another method for linking a version to a file and timestamp, however this is still a usable timestamp in NTFS 5.0; apparently this has still not changed in Windows 7 Beta, and it is very unlikely that it will change for the next 6 years with Windows 8 or a major W7 Service Pack changing radically the way NTFS works...).



Final note: It seems that the installation of MSN Live 8 and other Microsoft "Live" products enables the USN journal by default; however, it does not specify appropriate parameters to keep it in a stable condition, so the USN journal can grow dramatically. Yet it has absolutely NO use unless your system is connected to a server that archives a live replication of your filesystem (for its restoration). On a standalone PC, or if you don't have a Windows server with FRS enabled, you should disable this unneeded USN journal, or just create it with a minimum size (which will still allow system snapshots to work reliably). If your USN journal is too large, you'll immediately see that your volume gets fragmented at lightning speed and that you need much more frequent defragmentations to maintain its performance!


On a notebook, I really suggest you disable the USN journal completely, just delete it with this simple command from a command line window:

"fsutil usn deletejournal /D C:" <without the quotes, then press ENTER>


Repeat this command for your other NTFS partitions outside C:.


Verdy p,


Very informative posts, thank you. I will copy them to the Think about Defragmentation thread in the Lounge section. (HERE)


FYI, PerfectDisk 8 does defragment the USN Journal on an NTFS partition with its Offline defrag, when the Master File Table, Metadata, Hibernation file, and Paging File checkboxes are checked in Drive Properties. I have done that many times. (XP Pro SP3) (I never allowed its fragmentation to grow to 4000 fragments, though.)


Please find the related images attached.


- 54 fragments of the USN Journal highlighted by UltimateDefrag.

- An offline defrag performed by PerfectDisk 8 when there were 54 fragments (53 excess + 1) of the USN journal.

- The contiguous USN Journal (~10.51 MB) highlighted by UltimateDefrag just after the offline defrag by PerfectDisk.

- The PerfectDisk report showing 0 (ZERO) fragmentation of the system files after the offline defrag.


I hope this helps.



Defragmentation commands in Vista:


From the link detailer provided, I gleaned this information:


Defragmentation commands in Vista:


defrag c: -a -v

-a = analyze only (report fragmentation without defragmenting)

-v = verbose (detailed report)


defrag c: -v -r

-r = regard fragments larger than 64 MB as not fragmented (the default Vista setting)


defrag c: -v -w

-w = do a full defragmentation regardless of the 64 MB threshold.

The report will still not count fragments over 64 MB as fragmented.





  • 5 weeks later...

Defrag USB Flash Memory


To defrag USB flash memory, copy its contents to your hard disk, delete everything on the flash drive, and then copy the contents back from the hard disk. All files will then be defragmented and contiguous. There is no need for an optimized defrag.
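The copy-off / wipe / copy-back procedure can be modelled directly; this sketch works on any directory tree, and the staging copy on the hard disk doubles as a backup until the final step finishes. Treat it as an illustration of the steps above, not a polished tool:

```python
import os
import shutil
import tempfile

def defrag_by_rewrite(drive):
    """Sketch of the copy-off / wipe / copy-back trick from the post.

    The staging copy is kept until everything has been written back,
    so an interruption never leaves you without a full copy of the data.
    """
    staging = tempfile.mkdtemp(prefix="usb_backup_")
    # 1. Copy everything from the flash drive to the hard disk.
    for name in os.listdir(drive):
        src, dst = os.path.join(drive, name), os.path.join(staging, name)
        shutil.copytree(src, dst) if os.path.isdir(src) else shutil.copy2(src, dst)
    # 2. Delete everything on the flash drive.
    for name in os.listdir(drive):
        path = os.path.join(drive, name)
        shutil.rmtree(path) if os.path.isdir(path) else os.remove(path)
    # 3. Copy it back; the freshly emptied filesystem lays each file
    #    out contiguously as it is written.
    for name in os.listdir(staging):
        src, dst = os.path.join(staging, name), os.path.join(drive, name)
        shutil.copytree(src, dst) if os.path.isdir(src) else shutil.copy2(src, dst)
    # 4. Remove the staging copy only after the write-back succeeded.
    shutil.rmtree(staging)
```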


It is not advisable to defrag flash memory as often as you do other drives, as there is a usable-life issue concerning the number of write and delete cycles.




  • 2 weeks later...

There is a free version and a paid version of UltimateDefrag. The paid version allows you to allocate files for "High Performance" and files for "Archiving". The high-performance files are placed on the outer tracks of your HDD (fastest) and the archived files are placed on the inner tracks. I have tested UltimateDefrag (both free and paid) for a few months now and have come to a conclusion: if you have extreme amounts of data, such as on a server, then perhaps UltimateDefrag would be good, but for a normal home PC, UltimateDefrag is way overkill. Even Disktrix, the developer of UD, states that most users would benefit most from the "Auto" feature of the program.

I can tell you that it is possible to spend days and days trying to configure UD to determine which files should be high performance and which should be archived. For instance, you can allocate all DLL files for high performance and all ZIP files for archiving. The bottom line is that we are talking millisecond speed improvements for a home PC. It has been my experience that a good defragment-and-optimize program is all a home PC needs.

Let me put it this way: with UltimateDefrag, I scheduled a simple defragment every day, a defragment by "Consolidation" every other day, and a defrag by "File and Folder Name" once a week. I see NO speed difference after doing all this with UD compared with simply running Smart Defrag's "Defrag and Optimize", and trust me on this: Smart Defrag is MUCH easier to use. So I now use only Smart Defrag, and occasionally the free version I found of Paragon Partition Manager. Paragon has a boot-time defrag and also an MFT defrag option, which very, very few programs offer. In the event my page file gets split up, I use the free PageDefrag developed by Mark Russinovich of Sysinternals (which is now part of Microsoft). It works perfectly.

Anyway, if you are like me, go ahead and try UltimateDefrag, but you must use the paid version to get the benefits you are asking for. Disktrix will NOT answer any tech questions about their free version. Eventually, I think you will find that a good, easy-to-use program like Smart Defrag works just as well. For years I used Diskeeper and Vopt, then O&O and PerfectDisk. I've used Defraggler and the free Auslogics program, and a myriad of others. As I stated in another post, my favorite feature of Smart Defrag is a personal one: if I want, I can start the defrag and optimize running and go to bed! Smart Defrag will shut down my system when it is finished. This is what I have started doing: after a long day of PC use, every night I set Smart Defrag to do its thing and then shut down my system. When I awake the next morning, I am good to go for another day. Sometimes SIMPLE is BEST!


  • 1 month later...

Think about Defragmentation!!!


Hi Everybody,


Now I have 2 questions.



The Reserved Zone for MFT enlargement exists so that the MFT (Master File Table) does not get fragmented in the future, so the two should be adjacent.


Why does Deep Optimize in SD (Smart Defrag) place some files between the MFT and the Reserved Zone for MFT in Vista, when they are already adjacent before the defragmentation?



Unlike in FAT (File Allocation Table) and FAT32, in NTFS (New Technology File System), directories should be close to the MFT.


Why does SD put directories at the beginning of the drive, away from the MFT, on an NTFS system with Vista, when they are already adjacent before the defragmentation?


I would really like to have answers from an expert!!!


This is not a quiz; I don't know the answers.


Thanks in advance.





Considering that Vista uses a lot of memory, I think it is a bad idea not to create a permanent paging file, which should live in the middle of the boot partition (if you have only one physical drive), or on the first partition of a second physical drive. If you don't create it, Windows will create one for you, but it will be extremely fragmented. Set it to at least the recommended size displayed, in one step.

Don't use the Windows interface to change the paging file size unless you first remove it and reboot, then recreate it at the desired size only after defragmenting the drive.


I've also seen absolutely no benefit in putting the paging file on multiple disks. Windows is more efficient with a single paging file.


Disabling completely the shadow copies is not a very good idea these days. It's proven that those copies can really help in case of crashes, to avoid losing data (because Windows can recover from a crash using those copies).

However, it's a good idea to reduce the space reserved for those shadow copies (because the default maximum size is certainly too big). This is possible in the advanced system properties.
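If you prefer the command line over the advanced system properties dialog, the standard vssadmin tool (run from an elevated prompt) can cap the shadow copy storage; the 10GB limit below is only an example value:

```shell
rem Show current shadow copy storage usage and limits per volume
vssadmin list shadowstorage

rem Cap the shadow copy storage for drive C: at 10 GB (example value)
vssadmin resize shadowstorage /for=C: /on=C: /maxsize=10GB
```

Shrinking the limit may silently delete the oldest shadow copies to fit under the new cap.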


However, there is other disk space that no existing tool can defragment: the USN journal, whose growth is completely unpredictable. The only way to defragment it is an offline backup (you can boot from a CD-ROM image, then run a full system backup from there; you'll have to completely reformat the NTFS volume before restoring the data from the backup).


I've looked everywhere on the Net, and there's apparently no documented way to defragment this USN journal (I had a system where it was extremely heavily fragmented).


What is documented is a command-line tool (fsutil.exe) in the Windows system directory that you can use to delete the USN journal, but it is immediately recreated: after testing it, I realized it was recreated and grew in exactly the same place as before, so the effect was only temporary and did not last more than 2 days. In fact, the USN journal is filled automatically by every file move made through the Windows defragmentation API.
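For reference, the fsutil subcommands involved look like this; the m= (maximum size) and a= (allocation delta) values are example byte counts, check `fsutil usn /?` on your Windows version:

```shell
rem Inspect the current USN journal (ID, allocation delta, maximum size)
fsutil usn queryjournal C:

rem Delete the journal; /d also notifies indexing services of the deletion
fsutil usn deletejournal /d C:

rem Recreate it with an explicit maximum size and allocation delta (bytes)
fsutil usn createjournal m=33554432 a=8388608 C:
```

As noted above, NTFS will grow the journal again on its own, so this only resets it temporarily.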


I cannot understand why the Windows NTFS driver does not allocate the space for the USN journal in one operation and in a single fragment; all it does is recreate it with a ridiculously small initial size and then grow it incrementally with a strange formula that leaves a lot of free gaps on disk. Given that it is supposed to be a cyclic file, it would be better if it were created in the third of the free space that Windows uses for allocating new files, and handled as a sparse file, so that the oldest (and smallest) initial fragments could be released and later replaced cyclically by larger, newer fragments if more space is needed in the USN journal between two daily system snapshots. The maximum size should be tracked, and with the help of daily system snapshots and monitoring, the USN journal would reach its optimum size (and the older, smaller fragments could be eliminated). Unfortunately, this USN journal is a space hog on NTFS under Vista (it behaved much better in XP and Server 2003).


I can't understand why Vista needs to write so much information to this journal. (In fact I can see a reason: there are too many event log files in Vista, and Vista writes too much to them, while most of the events are monitored by absolutely nobody. In XP and Server 2003 there were only 4 event logs, which was enough and much faster, and required far fewer NTFS filespace management operations, so there was much less activity recorded in the USN journal.)


There's a way to improve the situation anyway: open the Vista Event Viewer, where you can disable almost all the event logs in the "Applications and Services Logs\Microsoft\Windows" category (but not those in the main "Windows Logs" category). Just keep the standard "System" and "Security" event logs; you can also clear their messages from time to time to recover their space, after looking through them for past errors, or before rebooting to diagnose problems more easily. If you are not in a networked domain environment, you may also disable the Security event log, but keep the System event log, as it is really needed.
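On Vista and later this can also be scripted with the built-in wevtutil tool instead of clicking through the Event Viewer; a minimal sketch, where the long log name is just an example (the exact names under Applications and Services Logs vary by system, use `wevtutil el` to see yours):

```shell
rem List every registered event log (there are hundreds on Vista)
wevtutil el

rem Disable one of the verbose Applications and Services logs (example name)
wevtutil sl Microsoft-Windows-DriverFrameworks-UserMode/Operational /e:false

rem Clear the standard System log after reviewing it for past errors
wevtutil cl System
```

Run these from an elevated prompt; disabling a log only stops new entries, it does not delete the existing .evtx file.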


You can also recover the fragmented space of system files (i.e. all the hidden files stored in "C:\System Volume Information", which include system snapshots) by using the System control panel: create two snapshots in succession, then use Cleanmgr to drop all the older snapshots. The files currently open in the active system snapshot can't be defragmented while they are in use. This is visibly independent of the shadow copies, which are also written and tracked in the active system snapshot files.


Note that I've not seen any system cleanup tool (including the most powerful ones, like RegCure, which I think is still better than IObit's ASC) clean up the system event logs: this is extremely tedious to do manually in the Event Viewer, due to the number of categories and its almost unusable user interface, where you constantly need to click everywhere in a confusing GUI layout across the various dialogs...


Hi verdy_p,


I agree with most of it, but not with the conclusion that "no existing tool can defragment the USN Journal".


I replied to you about that in another thread a while ago. Please have a look at this post there, or this post in this thread here.


PerfectDisk defragments the USN Journal, as can clearly be seen in the images attached there.


I am going to copy your post to the Think about Defragmentation!!! thread, as it is very informative.


Thank you.


How are "frequently used" files determined?


That question still seems not to have been answered by IObit, and it applies to the most recent version (1.11).


As many of us (if not all) should know, NtfsDisableLastAccessUpdate=1 disables last-access updates, which according to all tuning and tweaking tools (including Advanced SystemCare) improves performance.
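For what it's worth, the current state of that setting (the registry value lives under HKLM\SYSTEM\CurrentControlSet\Control\FileSystem) can be checked and changed with fsutil from an elevated prompt:

```shell
rem Query whether last-access updates are disabled (1 = disabled)
fsutil behavior query disablelastaccess

rem Disable last-access updates; takes effect after a reboot
fsutil behavior set disablelastaccess 1
```

So anyone who has run a tweaking tool can quickly verify whether their last-access timestamps are still being maintained.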


Also, some antivirus programs are reported to update that last-access value.


So if Smart Defrag relies on the last-access timestamp while NtfsDisableLastAccessUpdate is set, it is neither reliable nor usable for determining "frequently used" files.


Please, IObit, give us a deep and complete explanation of how Smart Defrag determines the "frequently used" files.


