Disk Fragmentation Test

In computing, file system fragmentation, sometimes called file system aging, is the inability of a file system to lay out related data sequentially (contiguously). This increases disk head movement, or seeks, which are known to hinder throughput. File system fragmentation is projected to become more problematic with newer hardware due to the increasing disparity between the sequential access speed and the rotational latency (and, to a lesser extent, the seek time) of consumer-grade hard disks, on which file systems are usually placed. Fragmentation is therefore an important problem in recent file system research and design.

The correction to existing fragmentation is to reorganize files and free space back into contiguous areas, a process called defragmentation. Defragmentation physically reorganizes the contents of the disk so that the pieces of each file are stored close together and in order (contiguously). It also attempts to create larger regions of free space using compaction, to impede the return of fragmentation. Some defragmenters also try to keep the smaller files within a single directory together, as they are often accessed in sequence.

This test determines the extent of fragmentation on every disk partition/volume on a Windows host. This analysis is essential as it enables administrators to proactively decide whether it is time to carry out disk defragmentation, and on which disk volumes.

This test is disabled by default. To enable the test, go to the enable / disable tests page using the menu sequence: Agents -> Tests -> Enable/Disable, pick the desired Component type, set Performance as the Test type, choose the test from the DISABLED TESTS list, and click the << button to move the test to the ENABLED TESTS list. Finally, click the Update button.

Target of the test : A Windows host

Agent deploying the test : An internal agent

Outputs of the test : One set of results for every disk volume on the monitored host

Configurable parameters for the test
Parameter Description

Test Period

How often should the test be executed.

Host

The host for which the test is to be configured.

Measurements made by the test
Measurement Description Measurement Unit Interpretation

Total fragmentation:

Indicates the percentage of this volume that has been fragmented.

Percent

Ideally, this value should be low. A high value is indicative of a highly fragmented volume, which can multiply data access times and cause inefficient usage of the storage space. Such situations necessitate defragmentation, which can make reading from and writing to the disk considerably faster.

Preemptive techniques attempt to keep fragmentation to a minimum at the time data is being written to the disk. The simplest is appending data to an existing fragment in place where possible, instead of allocating new blocks for a new fragment.
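The append-in-place idea can be pictured with a toy block bitmap. The sketch below is purely illustrative (the bitmap model and the `append_block` helper are assumptions for the example, not any real file system's API): when a file grows, the block immediately after its last extent is preferred, so no new fragment is created unless that block is already taken.

```python
def append_block(bitmap, extents):
    """Allocate one more block for a file whose extents are (start, length)
    pairs over a toy bitmap (True = block in use). Prefer the block right
    after the file's last extent so the extent grows in place."""
    if extents:
        start, length = extents[-1]
        nxt = start + length
        if nxt < len(bitmap) and not bitmap[nxt]:
            bitmap[nxt] = True
            extents[-1] = (start, length + 1)  # grown in place: no new fragment
            return extents
    # Fall back to the first free block anywhere: this starts a new fragment.
    for i, used in enumerate(bitmap):
        if not used:
            bitmap[i] = True
            extents.append((i, 1))
            return extents
    raise OSError("disk full")
```

For example, a file at block 0 on the bitmap `[used, free, free, used, free]` grows twice in place, and only the third append (blocked by the used block 3) creates a second fragment.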

Many of today's file systems attempt to preallocate longer chunks, or chunks from different free space fragments, called extents, to files that are actively appended to. This largely avoids file fragmentation when several files are concurrently being appended to, thus preventing them from becoming excessively intertwined.
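A minimal sketch of that preallocation idea, again over a toy bitmap (the `preallocate` helper is hypothetical, not a real file system call): each actively growing file reserves a contiguous run of blocks up front, so two files appended to concurrently land in separate extents instead of interleaving block by block.

```python
def preallocate(bitmap, n):
    """Reserve the first free run of n contiguous blocks in a toy bitmap
    (True = in use); return the run's starting block index."""
    run = 0
    for i, used in enumerate(bitmap):
        run = 0 if used else run + 1
        if run == n:
            start = i - n + 1
            for j in range(start, i + 1):
                bitmap[j] = True  # the whole extent is reserved at once
            return start
    raise OSError("no contiguous run of %d free blocks" % n)
```

On an empty 16-block disk, two files that each preallocate an 8-block extent receive the runs starting at blocks 0 and 8: their subsequent appends stay inside their own extents.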

Retroactive techniques attempt to reduce fragmentation, or the negative effects of fragmentation, after it has occurred. Many file systems provide defragmentation tools, which attempt to reorder fragments of files, and sometimes also decrease their scattering (i.e. improve their contiguity, or locality of reference) by keeping either smaller files in directories, or directory trees, or even file sequences close to each other on the disk.
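The reordering a defragmenter performs can be illustrated with a toy block map. The sketch below (all names hypothetical) only computes the target layout, compacting each file's blocks into one contiguous run and collecting the free space at the end; a real defragmenter moves data incrementally and safely in place.

```python
def defragment(blocks):
    """'blocks' is a toy disk: a list mapping block index -> file id, or
    None for free space. Return a new layout with each file's blocks
    contiguous and all free space compacted to the end."""
    order = []
    for b in blocks:
        if b is not None and b not in order:
            order.append(b)  # keep files in first-appearance order
    out = []
    for f in order:
        out.extend([f] * blocks.count(f))  # one contiguous run per file
    out.extend([None] * (len(blocks) - len(out)))  # free space at the end
    return out
```

For example, the interleaved layout `a _ b a _ b` compacts to `a a b b _ _`.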

Average free space size:

Indicates the average size of the free space extents on this volume.

MB

Free space fragmentation means that the empty space on a disk is broken into scattered parts rather than being collected in one big empty space. This type of fragmentation occurs when there are several unused areas of the file system where new files or metadata can be written. Unwanted free space fragmentation is generally caused by deletion or truncation of files, but file systems may also intentionally insert fragments ("bubbles") of free space in order to facilitate extending nearby files.

Fragmented free space should ideally be low. A high value for these measures could therefore slow down file creation and extension. Even an occasional spike would hence warrant defragmentation.

Free space fragmentation:

Indicates the percentage of free space on this volume that is fragmented.

Percent

Free space count:

Indicates the number of free space extents on this volume.

Number

Largest free space size:

Indicates the size of the largest free space extent on this volume.

MB
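Taken together, the free-space measures above (count, average size, largest extent, and fragmentation percentage) can be derived from a scan of the volume's allocation state. The sketch below uses a toy bitmap; the names and, in particular, the fragmentation formula (share of free space lying outside the largest free extent) are illustrative assumptions, not the test's documented implementation.

```python
def free_space_stats(bitmap):
    """Scan a toy allocation bitmap (True = used) and report free-space
    measures: extent count, average and largest extent size (in blocks),
    and an illustrative fragmentation percentage."""
    extents, run = [], 0
    for used in bitmap:
        if used:
            if run:
                extents.append(run)
            run = 0
        else:
            run += 1
    if run:
        extents.append(run)  # trailing free run
    if not extents:
        return {"count": 0, "average": 0.0, "largest": 0, "fragmentation_pct": 0.0}
    total, largest = sum(extents), max(extents)
    return {
        "count": len(extents),
        "average": total / len(extents),
        "largest": largest,
        # Assumed formula: free space outside the largest extent.
        "fragmentation_pct": 100.0 * (total - largest) / total,
    }
```

For example, a bitmap with free runs of 2, 3, and 1 blocks yields a count of 3, an average of 2.0, a largest extent of 3, and 50% of the free space outside the largest extent.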


File fragmentation:

Indicates the percentage of files that are fragmented on this volume.

Percent

Sometimes when you install a program or create a data file, the file ends up chopped into chunks and stored in multiple locations on the disk - this is called file fragmentation. A high value of this measure indicates that a large share of the files on the volume are not stored sequentially. This makes data retrieval difficult and time-consuming. Defragmentation is the way to resolve such a situation.
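This measure reports fragmented files as a fraction of all files. A minimal sketch of that calculation, assuming a hypothetical per-file extent map (a file with more than one extent counts as fragmented):

```python
def file_fragmentation_pct(files):
    """'files' maps file name -> list of (start, length) extents.
    Return the percentage of files that have more than one extent."""
    if not files:
        return 0.0
    fragmented = sum(1 for ext in files.values() if len(ext) > 1)
    return 100.0 * fragmented / len(files)
```

With four files of which two are split across multiple extents, the measure reports 50 percent.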