You should read the precursor to this post here if you want to make sense of what I am going to write. Anyway, I hacked on the previous program some more. Being a person who always craves some kind of visual representation, I decided to print the blocks of the file to the screen, using a symbol for each block. This was fine for small files, but to really see fragmentation happening you have to look at files a few hundred MB in size, and for such huge files printing one symbol per block clutters the screen. So I made the program a little more intelligent: it now prints the blocks using the following key:
@ represents 10000 contiguous blocks on the disk
# represents 1000 contiguous blocks on the disk
$ represents 100 contiguous blocks on the disk
& represents 10 contiguous blocks on the disk
* represents 1 block on the disk
So now I print both the blocks allocated to the file and the blocks that do not belong to the file but fall in between them. The blocks that belong to the file are printed in blue, while the ones that don't are printed in red.
I went looking for a highly fragmented file on my 1 TB external HD and found an ISO image of about 1.2 GB that was fragmented in approximately 200 places (thanks to NTFS on the external HD). I ran the program on this file once, then copied it to my laptop's HD (which uses ext3, BTW) and ran the program again. Here are the results; the excessive fragmentation is clearly visible in the first case:
Tomorrow I am going to try to create an image of the block map for a file; I will try using the gd library for this. More results tomorrow :). BTW, you can get the program from here.