IObit Forum

Defrag Free Space?


ragous

Recommended Posts

Hi ragous,

 

Free space defragmentation is the result of file/folder defragmentation.

 

It is the by-product of defragmenting files and folders without leaving any free space amongst them.

 

It is not good for optimized defragmentation, because if there are any changes in the files/folders, they will become fragmented, and the next time you defragment, they will all have to change location on the disk.

 

I believe this will cause faster degradation of a hard drive.

 

Reading the Think about Defragmentation!!! thread may well give you more information about defragmentation.

 

Cheers.


free space/empty room

 

Hi enoskype :-) :-) :-)

4096 bytes in each allocation unit (often called 4 kB).

14,633,198 total allocation units currently on my C: drive.

8,189,453 allocation units currently available on my C: drive (what is often called free space) (empty room?)
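Those chkdsk-style figures translate into familiar sizes with a little arithmetic. A quick sketch (the numbers are from the post above; the helper name is my own):

```python
# Converting the allocation-unit figures above into familiar sizes.
# The numbers are from the post; the helper name is my own.

BYTES_PER_UNIT = 4096          # 4 KiB per allocation unit (cluster)
TOTAL_UNITS = 14_633_198       # total allocation units on C:
FREE_UNITS = 8_189_453         # allocation units currently available

def units_to_gib(units, unit_size=BYTES_PER_UNIT):
    """Convert allocation units to GiB (1 GiB = 1024**3 bytes)."""
    return units * unit_size / 1024**3

print(f"Drive size: {units_to_gib(TOTAL_UNITS):.1f} GiB")   # ~55.8 GiB
print(f"Free space: {units_to_gib(FREE_UNITS):.1f} GiB")    # ~31.2 GiB
```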

Empty space in the Universe usually holds around 1 atom per cubic meter.

When cleaning a room it is easy to remove the furniture and sweep it for dirt and dust. (Usually called clean-up - followed by a chkdsk)

Tidying up the table and pushing in the chairs around it (defragmenting) will create some walk around space.

If you wash the room down you will probably remove a couple of microns (1/1000 mm each) of dust or tar, but will perhaps leave a micron of water instead. Painting it will probably make the room smaller by >10 microns in every direction.

Optimizing free/empty space depends on what you want to achieve - I mean it may not be optimal to have all the empty space located in the right corner of the round disk :-) :-) :-)

Cheers

solbjerg

 


I've been reading about this question, and I've come to the conclusion that 'defrag of free-space' is mainly just sales hype,

to make their software sound better, and it would slow the defrag down.

After all, they say that there is not much fragmented free-space if a normal defrag is run regularly.

 

I tried an earlier version of Defraggler and found that it was 'Very Slow',

and I determined that it was defragging files that really did not need defragging, which slowed it even more.

I think these are the same system files that SD marks as 'Unmovable'?

 

I've not tried the latest version though.

I think it was Defraggler v1.2 that I tested, before deciding to use Smart Defrag.

Their website says the new version is faster, but I would have to see it to believe it.


I would say that SD does 'defrag free-space';

all defragmentation programs do, actually, since they combine fragmented data into available free space.

There are some small free-space gaps left after each defrag, but most gaps are filled during normal usage and subsequent defrag passes.

Each time the defrag program is run, more of those gaps are filled.

 

Free space is empty space, not read/write data.



I am not an expert on defragmentation, but that's how I see it after reading this:

Over time, though, as Windows writes, modifies and deletes files on your hard drive, the free space on the drive will exist in various bits and pieces.

 

Why is this a problem? Fragmented free space leads to further fragmentation of your files. When Windows writes a new file to your hard drive, it may have to break it up if it can't find a large enough chunk of free space for it to fit in.

I guess they mean that Defraggler groups together all the free (unused) space which is "in various bits and pieces", so there is "a large enough chunk of free space for it (a new file) to fit in", but I don't know how they would achieve this.

 

I guess this would be similar to the IObit Smart Defrag Optimize option, just expressed differently (in reverse).

From my understanding, and put simply, IObit Smart Defrag Fast Optimize groups the files (used space) together, and Fully Optimize groups them at the outer edge (fastest part) of the disk.

If this is correct, then the free (unused) space would naturally be grouped together and would therefore create "a large enough chunk of free space for it (a new file) to fit in".
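That "large enough chunk" argument can be illustrated with a toy cluster bitmap. This is only a simplified sketch of the idea (not how NTFS, Defraggler, or Smart Defrag actually allocate), and the function names are my own:

```python
# Toy sketch (not how NTFS or any real defragmenter works): a bitmap of
# clusters where 1 = used and 0 = free. When free space exists "in
# various bits and pieces", a new file larger than the biggest
# contiguous free run has to be split into fragments.

def free_runs(bitmap):
    """Return (start, length) for each contiguous run of free clusters."""
    runs, start = [], None
    for i, used in enumerate(bitmap + [1]):   # sentinel flushes the last run
        if not used and start is None:
            start = i
        elif used and start is not None:
            runs.append((start, i - start))
            start = None
    return runs

def fragments_needed(bitmap, size):
    """Pieces a file of `size` clusters splits into, filling largest runs first."""
    pieces, remaining = 0, size
    for _, length in sorted(free_runs(bitmap), key=lambda r: -r[1]):
        if remaining <= 0:
            break
        pieces += 1
        remaining -= length
    return pieces

scattered    = [1, 0, 0, 1, 0, 0, 0, 1, 0]   # 6 free clusters, largest run 3
consolidated = [1, 1, 1, 0, 0, 0, 0, 0, 0]   # same 6 free clusters, one run

print(fragments_needed(scattered, 5))      # 2 (file must be split)
print(fragments_needed(consolidated, 5))   # 1 (fits contiguously)
```

Both bitmaps have the same amount of free space; only the consolidated one can take a 5-cluster file in one piece, which is exactly the point the Defraggler quote makes.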

 

But since I am very happy with IObit Smart Defrag, I have never seen the need to use Defraggler, so I can't give you any specific details on the differences between them.

 

All the best, woz of oz


Hi ragous

I think the problem is that no one really understands what you mean by defragging free space.

There is nothing to defrag in free space (that is why it is free).

If you mean where the so-called free blocks are positioned, I think most people would call that optimizing.

This is a discussion topic amongst defragmentation specialists: what is the best strategy for optimizing the placement of files and the durability of the free space on the hard disk?

This depends on what you want to achieve. Many, for example, think that a few free blocks between the different programs is the best way to go. This is probably true in some setups, while in others it does not yield the desired results.

The whole discussion is also influenced by the defrag window, which usually shows about 1000 small blocks that are meant to depict your hard disk. If your hard disk is a 1 TB (terabyte) drive, each small square (block) in such a display represents about 1 GB, so the colours in the window really just tell you whether that particular square holds a fragment that belongs to a program/file located somewhere else.
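A quick back-of-the-envelope check of that square-size arithmetic (the function name is my own, for illustration):

```python
# Back-of-the-envelope check: how much data one coloured square of a
# ~1000-square defrag map represents on a 1 TB drive.

def bytes_per_square(drive_bytes, squares=1000):
    """Bytes represented by one square of a defrag map with `squares` cells."""
    return drive_bytes // squares

TB = 10**12
per_square = bytes_per_square(1 * TB)
print(per_square)           # 1000000000 -> 1 GB per square
print(per_square // 4096)   # 244140 four-KiB clusters hidden in one square
```

So a single "fragmented" square can hide hundreds of thousands of clusters, which is why the map tells you very little about actual file placement.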

Comparisons between the defragmentation strategies of the different programs should be left to the experts, I think.

SmartDefrag does use optimization if that setting is chosen; what Defraggler does, or what it calls it, I do not know.

SmartDefrag has served me well for several years :-)

(Don't forget cleaning of the hard disk, and a check disk once in a while.)

Cheers

solbjerg

 

 

My question isn't answered yet xD

  • 2 weeks later...

Defrag Free Space?

 

Defrag should not have decreased your free space, but if it found and marked bad sectors, that would reduce your free space.

 

Did you run Disk Cleanup first? It finds and displays various files and folders that you may no longer need, like the Recycle Bin, temporary files, temporary internet files, etc., and you then choose the ones you want removed.


  • 1 month later...

It is not good for optimized defragmentation, because if there are any changes in the files/folders, they will become fragmented, and the next time you defragment, they will all have to change location on the disk.

Not consolidating free space does not prevent this, as it would require too many variables to line up:

- Enough free space has to be adjacent to the file being re-written

- Windows must never receive another read or write request on that volume, and the HDD head mustn't park/zero-land

- VSS/System protection must be disabled for the volume containing the file in question (otherwise, it locks the existing file into system volume information and creates a new file)

 

Typically, none of these three conditions will be met. Additionally, this logic is more concerned with write performance, when read performance is more relevant. Consolidating files towards the beginning of the drive (rather than leaving them all over the place) will improve read performance (in some cases, dramatically).
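As a rough illustration of why scattered placement hurts read performance on a spinning disk: every extra fragment costs roughly one seek plus rotational latency. All the timing numbers below are assumed, typical-looking values, not measurements:

```python
# Rough sketch (illustrative assumed numbers, not a benchmark): reading a
# file in one contiguous run vs. in scattered fragments on a spinning
# disk, where each fragment costs about one seek plus rotational latency.

SEEK_MS = 9.0        # assumed average seek time
ROTATE_MS = 4.2      # assumed average rotational latency (~7200 rpm)
TRANSFER_MB_S = 100  # assumed sustained transfer rate

def read_time_ms(size_mb, fragments):
    """Estimated time to read a file of size_mb split into `fragments` pieces."""
    overhead = fragments * (SEEK_MS + ROTATE_MS)
    transfer = size_mb / TRANSFER_MB_S * 1000
    return overhead + transfer

print(f"{read_time_ms(100, 1):.0f} ms")    # contiguous 100 MB file
print(f"{read_time_ms(100, 200):.0f} ms")  # same file in 200 fragments
```

Under these assumptions, the heavily fragmented read takes roughly 3.5x as long, which is the kind of gap RoloX2 is alluding to.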

 

Having said all of that and having only started using SD yesterday (Diskeeper trial ran out and, sorry Condusiv, but Diskeeper is far too expensive to install on five machines in the home), I cannot say whether SD can be improved with respect to consolidating files as I've not enough time to make a mess of file placement yet. I can say that SD consolidates the green blocks (frequently used files).


Hi RoloX2, welcome to IObit Forum! :grin:

 

The below quote is only part of the concept of IObit's Smart Defrag and should not be taken as a general comment.

It is not good for optimized defragmentation, because if there are any changes in the files/folders, they will become fragmented, and the next time you defragment, they will all have to change location on the disk.

Certainly, the below quote about the algorithm of the Smart Defrag defrag engine should be taken into consideration.

 

Enable Smart Defrag

Intelligently organizes drive data for maximum program performance and long-lasting data contiguity. Smart Defrag is an advanced, patent-pending technology from IObit. Files are organized by their creation and modification dates. The theory is that files that have not changed recently are less likely to change in the future. These older files are grouped together so that once Smart Defrag has placed them, future defragmentation passes are less likely to move them again. Thus, future defragmentation will work much better and keep the hard disk healthy as well.
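The grouping-by-age idea in that description can be sketched as follows. This is my own toy illustration, not IObit's actual algorithm, and the 90-day cutoff is an arbitrary assumption:

```python
# Toy illustration of grouping files by modification age, so that old,
# stable files can be laid out together and rarely need moving again.
# This is NOT IObit's algorithm; the 90-day cutoff is an assumption.

def stable_first(files, now, stable_after_days=90):
    """files: iterable of (path, mtime) pairs.
    Returns (stable, volatile): paths untouched for the cutoff window
    first, recently modified paths second."""
    cutoff = now - stable_after_days * 86400
    stable = [p for p, m in files if m <= cutoff]
    volatile = [p for p, m in files if m > cutoff]
    return stable, volatile

# Example: an old DLL lands in the "stable" group, a fresh log file
# in the "volatile" group.
now = 1_700_000_000
files = [("old.dll", now - 400 * 86400), ("new.log", now - 86400)]
print(stable_first(files, now))   # (['old.dll'], ['new.log'])
```

The payoff of this kind of partitioning is exactly what the quote claims: a defragmenter that places the stable group once can leave it alone on later passes.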

 

The intention of the quote you mention is to point out that defragmentation of free space is not the only factor that should be taken into account.

 

Having said all that, I do agree with your 3 points. :smile:

 

Please find my thoughts about defragmentation in Think about Defragmentation!!! thread. :mrgreen:

A lot of water under the bridge since the beginning of XP era anyway.

 

Cheers.


A lot of water under the bridge since the beginning of XP era anyway.

 

Can you elaborate on that part?

 

-+-

So far, I like what I'm seeing with Smart Defrag. It does seem to strike a balance between moving/packing frequently used files to the beginning (main thing I'm after) and it does seem to consolidate some of the rarely used files without spending days moving everything around (which would be excessive and prohibitive).

 

My point earlier was that packing everything together would neither prevent nor add to fragmentation. Additionally, it's difficult and time-consuming to try to maintain that configuration (been there, done that, with custom defrag scripting).



Originally Posted by enoskype

A lot of water under the bridge since the beginning of XP era anyway.

Can you elaborate on that part?

-+-

Hi RoloX2,

 

As you can see from my screenshots in the aforementioned thread in my previous post, in the beginning of the XP era "the best defragmentation" meant that all the files should be contiguous, without any space in between, and that the folders and certain system files should be placed in certain locations on the HD. All the well-known defraggers did nearly the same thing, call it Diskeeper, PerfectDisk, or any other utility program using contig.exe from Sysinternals (not MS at that time).

At that time SSDs were not used, and smart placement of files was not used either. Even Zoned-Bit Recording was not taken into consideration. The USN journal files of today's OSes are defragmentable online, but they were not in XP's time, and so on.

 

It would really be too lengthy, and the Smart Defrag section of the forum is not the place to go deeper into it, but that's what I meant with the quote you have given.

 

BTW, if the best defragmentation algorithm were known, there wouldn't be many defragmenters. :lol: It differs for users with different habits, different usage logic, different hardware, etc. anyway.

 

All the best and cheers.


Archived

This topic is now archived and is closed to further replies.
