disk defragger in linux?

Using applications, configuring, problems
Message
Author
User avatar
crabbypup
Posts: 91
Joined: Sun 20 Jan 2008, 21:49
Location: toronto canada

disk defragger in linux?

#1 Post by crabbypup »

okay. i have a machine that has a combined 370gb capacity... all in fat32. this is because i dual boot xp and puppy 4.00.

recently i have discovered that my disks need to be defragged, but the normal utilities all crash my windows box when i try to run them. luckily it hasn't caused me to lose any data yet. all my hard drives are 70% - 80% full. i have a 320gb drive, a 30gb drive and a 20gb drive.

i would really like to defrag my disks with linux considering the problems i have had in windows. i have researched and come up fairly empty so far, so if anyone could point me to a program that would be great. thanks!
my avatar is what happens when you run windoze.

User avatar
mikeb
Posts: 11297
Joined: Thu 23 Nov 2006, 13:56

#2 Post by mikeb »

Hmm why not try a different approach...
I now have shared storage as ext2 which can be read transparently from windows using the ext2 driver from
http://www.fs-driver.org/

Just keep a small partition for windows, which won't need defragging very often, and your shared storage will benefit from the better space allocation of ext2.

mike

ps.. the standard defragger in windows shouldn't be crashing or causing crashes... something's not right there. plus I never found a linux fat32 defragger either

code_m
Posts: 65
Joined: Wed 02 Jul 2008, 19:11

#3 Post by code_m »

One of the main design goals of the ext2 and ext3 file-systems was to avoid the need to defragment your drives. You have to admit, if you defragged your hard drives as often as you really needed to, it would get really annoying.

I have read it explained like this: FAT16 and FAT32 both pack files as close together as possible, leaving lots of free space at the end of the drive (or partition). For example, say you have an old text file: if you write enough to it, it grows past the writable space around it, so the file gets moved to a bigger spot on the drive, leaving a hole in the middle.

ext2 and ext3, on the other hand, space files out across the drive with a lot of room between them. New files are obviously written into these spaces. This does present some problems, however: if your drive is extremely full, there may not be a big enough space left for a large file to be written.

Hope that all made sense!

bill
Posts: 490
Joined: Wed 28 May 2008, 15:32

#4 Post by bill »

I haven't a clue about a Linux defragger, but Crabbypup's avatar absolutely gave me a great laugh. :lol: cheers

otropogo

Re: disk defragger in linux?

#5 Post by otropogo »

crabbypup wrote:...

i would really like to defrag my disks with linux considering the problems i have had in windows. i have researched and come up fairly empty so far if anyone could point me to a program that would be great. thanks!
I use Diskeeper Lite to defrag my Vfat32 files, and it mostly works well, although I've had a couple of scary episodes (the detailed display showing the fragmented segments growing instead of shrinking). However, I haven't lost a disk yet using it.

I understand there are better freeware defraggers for XP. I don't know whether they work on vfat32 partitions or only NTFS.

Suggest you do a Google usenet news search on the subject in the newsgroup alt.comp.freeware, or go to their pricelessware sites,

pricelesswarehome.com IIRC is one.

Bruce B

Re: disk defragger in linux?

#6 Post by Bruce B »

crabbypup wrote: i would really like to defrag my disks with linux considering the problems i have had in windows. i have researched and come up fairly empty so far if anyone could point me to a program that would be great. thanks!
Crabbypup,

Just because Windows file systems get fragmented, and cause you problems, doesn't mean you will have the same problem with the superior operating system and its file systems.

After you run e2fsck on your partition, which you should do periodically, you will get a fragmentation report. Likely you will find the fragmentation percentage is low. An exception would be a partition used to store 4gb and 8gb movie isos which have been deleted and replaced. But even there a high percentage is a bit misleading, because you only have a dozen or so files; although they may not be 100% contiguous, they aren't divided into small parts. It's just that the percentage figures tend to run high.
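If you want to see what that report looks like without touching a real partition, you can experiment on a throwaway ext2 image in an ordinary file. This is just a sketch: it assumes the e2fsprogs tools (mkfs.ext2, e2fsck) are installed, and the image name and size are made up.

```shell
# Make a small ext2 filesystem inside an ordinary file (no root needed),
# then force a full check. The closing summary line is the fragmentation
# report: "N/M files (X.X% non-contiguous), .../1024 blocks".
dd if=/dev/zero of=demo.img bs=1024 count=1024 2>/dev/null
mkfs.ext2 -F -q demo.img        # -F: target is a file, not a block device
e2fsck -f -n demo.img           # -f: force a check, -n: open read-only
rm -f demo.img
```

On a freshly made image the percentage is of course near zero; run the same `e2fsck -f -n` against an unmounted real partition (as root) to get the report Bruce describes.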

If I want to defrag a partition, I copy the files to another partition, format or delete the files on the original partition, then copy the files back.

Bruce

User avatar
crabbypup
Posts: 91
Joined: Sun 20 Jan 2008, 21:49
Location: toronto canada

#7 Post by crabbypup »

thanks mikeb! i will definitely do that. i suppose i will have to get an external disk drive (300gb+), so i am going to get a 500gb usb drive for $80 - $100 and use that as my main tool for copying. i shall switch my machine over to ext2 as soon as i get that other drive.

@bruceb my computer is 400% more stable under linux than it is under windoze. if i didn't need it for paint.net and some games my brother has on it, i would obliterate windows altogether.
my avatar is what happens when you run windoze.

User avatar
mikeb
Posts: 11297
Joined: Thu 23 Nov 2006, 13:56

#8 Post by mikeb »

Well I would suggest making a small ext2 partition and installing the driver to get familiar and see if it works ok for you (it should, but i am cautious).
I've heard of peeps running (not booting) windows from an ext2 partition.
have fun

mike

User avatar
cb88
Posts: 1165
Joined: Mon 29 Jan 2007, 03:12
Location: USA
Contact:

#9 Post by cb88 »

hahha somebody finally used that avatar! I had it for a couple of weeks, months back, and got asked repeatedly to change it LOL

Heh, if you just bought SSDs you wouldn't have to care about defragging, since they have zero seek time.
Taking Puppy Linux to the limit of perfection. meanwhile try "puppy pfix=duct_tape" kernel parem eater.
X86: Sager NP6110 3630QM 16GB ram, Tyan Thunder 2 2x 300Mhz
Sun: SS2 , LX , SS5 , SS10 , SS20 ,Ultra 1, Ultra 10 , T2000
Mac: Platinum Plus, SE/30

User avatar
SirDuncan
Posts: 829
Joined: Sat 09 Dec 2006, 20:35
Location: Ohio, USA
Contact:

#10 Post by SirDuncan »

crabbypup wrote:i suppose i will have to get an external disk drive (300gb+) so i am going to get a 500gb usb drive for $80 - $100 and use that as my main tool for copying.
I've been looking at this for $56 http://www.ecost.com/detail.aspx?edp=40683037. 500gb USB/eSATA. I'm not familiar with the company, though.
Be brave that God may help thee, speak the truth even if it leads to death, and safeguard the helpless. - A knight's oath

amigo
Posts: 2629
Joined: Mon 02 Apr 2007, 06:52

#11 Post by amigo »

Here's a 'braindead' defragger written by an ex-kernel developer:

Code: Select all

#!/bin/bash
# defrag v0.06 by Con Kolivas <kernel@kolivas.org>
# Braindead fs-agnostic defrag to rewrite files in order largest to smallest
# Run this in the directory you want all the files and subdirectories to be
# reordered. It will only affect one partition.

trap 'abort' 1 2 15

renice 19 $$ > /dev/null

abort()
{
	echo -e "\nAborting"
	rm -f tmpfile dirlist
	exit 1
}

fail()
{
	echo -e "\nFailed"
	abort
}

declare -i filesize=0
declare -i numfiles=0

#The maximum size of a file we can easily cache in ram
declare -i maxsize=`awk '/MemTotal/ {print $2}' /proc/meminfo`
(( maxsize-= `awk '/Mapped/ {print $2}' /proc/meminfo` ))
(( maxsize/= 2))

if [[ -a tmpfile || -a dirlist  ]] ; then
	echo dirlist or tmpfile exists
	exit 1
fi

# Sort in the following order:
# 1) Depth of directory
# 2) Size of directory descending
# 3) Filesize descending

echo "Creating list of files..."

#stupid script to find max directory depth
find -xdev -type d -printf "%d\n" | sort -n | uniq > dirlist

#sort directories in descending size order
cat dirlist | while read d;
do
	find -xdev -type d -mindepth $d -maxdepth $d -printf "\"%p\"\n" | \
		xargs du -bS --max-depth=0 | \
		sort -k 1,1nr -k 2 |\
		cut -f2 >> tmpfile
	if (( $? )) ; then
		fail
	fi

done

rm -f dirlist

#sort files in descending size order
cat tmpfile | while read d;
do
	find "$d" -xdev -type f -maxdepth 1 -printf "%s\t%p\n" | \
		sort -k 1,1nr | \
		cut -f2 >> dirlist
	if (( $? )) ; then
		fail
	fi
done

rm -f tmpfile

numfiles=`wc -l dirlist | awk '{print $1}'`

echo -e "$numfiles files will be reordered\n"

#copy to temp file, check the file hasn't changed and then overwrite original
cat dirlist | while read i;
do
	(( --numfiles ))
	if [[ ! -f $i ]]; then
		continue
	fi

	#We could be this paranoid but it would slow it down 1000 times
	#if [[ `lsof -f -- "$i"` ]]; then
	#	echo -e "\n File $i open! Skipping"
	#	continue
	#fi

	filesize=`find "$i" -printf "%s"`
	# read the file first to cache it in ram if possible
	if (( filesize < maxsize ))
	then
		echo -e "\r $numfiles files left                                                            \c"
		cat "$i" > /dev/null
	else
		echo -e "\r $numfiles files left - Reordering large file sized $filesize ...                \c"
	fi

	datestamp=`find "$i" -printf "%s"`
	cp -a -f "$i" tmpfile
	if (( $? )) ; then
		fail
	fi
	# check the file hasn't been altered since we copied it
	if [[ `find "$i" -printf "%s"` != $datestamp ]] ; then
		continue
	fi

	mv -f tmpfile "$i"
	if (( $? )) ; then
		fail
	fi
done

echo -e "\nSucceeded"

rm -f dirlist
That should work reasonably well on any file system including FAT. But look for it to take a very long time on large disks...

User avatar
Béèm
Posts: 11763
Joined: Wed 22 Nov 2006, 00:47
Location: Brussels IBM Thinkpad R40, 256MB, 20GB, WiFi ipw2100. Frugal Lin'N'Win

#12 Post by Béèm »

Bruce B wrote:If I want to defrag a partition, I copy the files to another partition, format or delete the files on the original partition, then copy the files back.
Is there the same kind of procedure for a .2FS file (a pup_save)?
I am starting to get segmentation faults and was thinking about fragmentation, as the file check I did said something about it.
Time savers:
Find packages in a snap and install using Puppy Package Manager (Menu).
[url=http://puppylinux.org/wikka/HomePage]Consult Wikka[/url]
Use peppyy's [url=http://wellminded.com/puppy/pupsearch.html]puppysearch[/url]

User avatar
Flash
Official Dog Handler
Posts: 13071
Joined: Wed 04 May 2005, 16:04
Location: Arizona USA

#13 Post by Flash »

I think Bruce's technique works for any combination of hard disk filesystem(s). In general, copying a fragmented file automatically reconstitutes its fragments as a single contiguous file on the disk to which it is copied.
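You can check this claim yourself with `filefrag` from e2fsprogs, which prints how many physical extents a file occupies. A sketch, with invented filenames; it needs a filesystem that supports the FIEMAP ioctl, but no root:

```shell
# Create a demo file, then compare extent counts before and after a copy.
# On a fragmented source file, the freshly written copy usually comes out
# in fewer (often just one) contiguous extents.
dd if=/dev/zero of=original bs=1M count=4 2>/dev/null
filefrag original        # e.g. "original: 1 extent found"
cp original fresh.copy   # the filesystem lays the copy down afresh
filefrag fresh.copy
rm -f original fresh.copy
```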

Bruce B

#14 Post by Bruce B »

Béèm wrote:
Bruce B wrote:If I want to defrag a partition, I copy the files to another partition, format or delete the files on the original partition, then copy the files back.
Is there the same kind of procedure for a .2FS file (a pup_save)?
I am starting to get segmentation faults and was thinking about fragmentation, as the file check I did said something about it.
Béèm,

I'll outline a general procedure, without an excess of detail, for those, such as yourself, who don't need the details. However, I doubt the segmentation faults are the result of fragmented files.

A procedure

1) defragment the pup_save file itself. if it's on an XP computer you can run its defrag utility or a third party defrag utility, then;

2) copy pup_save.2fs to newpup.2fs, then;

3) boot Puppy from cd using puppy pfix=ram, then;

4) mkfs.ext2 newpup.2fs (to clean it up), then;

5) e2fsck pup_save.2fs, to make sure it doesn't have errors; then

6) mount both .2fs files; then

7) copy pup_save.2fs contents to newpup.2fs, then;

8) rename pup_save.2fs -> pup_save.old, newpup.2fs -> pup_save.2fs
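The steps above might look something like this as shell commands. This is only a sketch: the filenames and the 8MB size are invented stand-ins, a real pup_save already exists on your drive, and the loop-mount steps (6 and 7) need root, so they are shown as comments.

```shell
# Fabricate a small pup_save.2fs just for the demo (on a real system this
# file already exists and you would skip these two lines).
dd if=/dev/zero of=pup_save.2fs bs=1M count=8 2>/dev/null
mkfs.ext2 -F -q pup_save.2fs

cp pup_save.2fs newpup.2fs      # step 2: make a same-sized copy
mkfs.ext2 -F -q newpup.2fs      # step 4: wipe it to a fresh filesystem
e2fsck -f -p pup_save.2fs       # step 5: check the source for errors

# steps 6 and 7 need root for the loop mounts:
#   mkdir -p /mnt/old /mnt/new
#   mount -o loop pup_save.2fs /mnt/old
#   mount -o loop newpup.2fs /mnt/new
#   cp -a /mnt/old/. /mnt/new/
#   umount /mnt/old /mnt/new

mv pup_save.2fs pup_save.old    # step 8: swap the names
mv newpup.2fs pup_save.2fs
```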

------------------------

Notes on big files:

A 10mb file is a big file on a 64mb pup_save.2fs
A 10mb file is a small file on a 200mb partition

If your destination or target partition will be nearly filled up, a big file might get fragmented if it happens to be one of the last files written.

If your target partition will be only, say, 2/3 full when done, probably all files will be contiguous. But even that depends on the relative size of the big file. A 500mb file on a 1024mb pup_save should probably be written first to guarantee that it will be as contiguous as possible.

Sometimes big file fragmentation doesn't matter to me. If the big file is an iso intended to be read one time for writing to a disc, its fragmentation doesn't warrant much consideration, as long as the computer can read it without buffer underruns.

-------------------------

Omitted from the procedure, and from this post, are considerations about the hundreds or even thousands of 'whiteout' files which will be copied over.

It seems this copy procedure would make a good occasion to delete many of them.
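For what it's worth, unionfs/aufs whiteout files are normally named with a '.wh.' prefix, so they are easy to list before the copy. A sketch, using a throwaway directory named 'demo' in place of the loop-mounted save file:

```shell
# The union filesystem marks a deleted file by leaving an empty '.wh.'
# file of the same name. List them first, then delete the unwanted ones.
mkdir -p demo
touch demo/.wh.oldconfig demo/.wh.cache demo/realfile
find demo -name '.wh.*'            # list the whiteouts
find demo -name '.wh.*' -delete    # ...or remove them outright
rm -rf demo
```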

Bruce

User avatar
crabbypup
Posts: 91
Joined: Sun 20 Jan 2008, 21:49
Location: toronto canada

#15 Post by crabbypup »

thanks amigo for the script!

@mikeb thanks for the warning on that. i'll take it into consideration.

@cb88 well, it would take a lot of ssd's to give me the same amount of space, and ssd's have poor write rates. they are also limited to about 100,000 writes. but thanks for the suggestion.
my avatar is what happens when you run windoze.

User avatar
Béèm
Posts: 11763
Joined: Wed 22 Nov 2006, 00:47
Location: Brussels IBM Thinkpad R40, 256MB, 20GB, WiFi ipw2100. Frugal Lin'N'Win

#16 Post by Béèm »

Bruce,
Thank you for this valuable and clear tutorial.
Bernard
Time savers:
Find packages in a snap and install using Puppy Package Manager (Menu).
[url=http://puppylinux.org/wikka/HomePage]Consult Wikka[/url]
Use peppyy's [url=http://wellminded.com/puppy/pupsearch.html]puppysearch[/url]

purple_ghost
Posts: 416
Joined: Thu 10 Nov 2005, 02:18

Options.

#17 Post by purple_ghost »

First of all: if you are going to defrag a FAT32 partition, you might want to consider running "scandisk" first. If you have a floppy drive you might use a boot disc from: http://www.bootdisk.com/

EDIT: Do NOT use the Scandisk that usually comes with a Windows install CD. It does not like Long File Names and will throw them all away, and there is no recovery program for that. You can safely use the Scandisk which comes with the programs from bootdisk.com; it is a later version.

You seemed to say that your Windows Defrag errored off? I apologize if I am saying what you already know. I have to keep reminding myself (I make the same mistake every time) that to run Defrag in Windows 98 one must start the computer in "Safe Mode". On my computer one must power up, and just as the BIOS is about to finish checking the RAM, start pressing and releasing the F5 key every two seconds. (That is the F (function) Five key.) In standard Windows, Defrag in Windows 95, 98 and ME gets interrupted by some process or other writing to the hard drive, which makes the defrag keep restarting until it errors off.

The other, really time-consuming option, if you have an extra hard drive and are used to opening the tower, moving hard drives around, setting jumpers and working with the cables: insert another hard drive into the tower as the new Master, which means changing the other drives to Slaves. Then throw a basic install of whatever Windows you have onto your new master drive. Then you can start that Windows in "SAFE MODE" and SCANDISK (make sure you use a later version of ScanDisk), then DEFRAG the now-Slave drives. When you are through, remove the temporary master drive and put everything back like you had it. It is a lot of trouble, but it depends on what kind of resources you have and what you feel comfortable doing.

FYI: the DEFRAG program for Windows ME is supposed to be a lot faster than the one that comes standard with Windows 98, and can be installed in Windows 98 to be used. I know that from reading what others have said; I have not used the Windows ME DEFRAG program in Windows 98. In using Windows 98 I have found "http://www.mdgx.com/" interesting. I have also used one of the standard Unofficial Windows 98 Service Packs listed on this site, and in one move got rid of a lot of funny Windows 98 glitches.

None of this is going to fix a Virus or a Trojan, Spyware, Root Kit that is currently on your Windows drive.

I like the idea of simply doing a disk-partition-to-disk-partition copy and back again to effectively get a defrag accomplished. However, do NOT use a Windows copy to do that, whether the copy is from Windows itself or from DOS. Because M$ intended that we not be able to proliferate (clone) Windows from one computer to another with their copy programs, you can NOT boot Windows if a (Windows) copy is used. You can use a partition-to-partition copy. I am guessing PUDD in Puppy should do it correctly, although I have not tried. I have used the Seagate install CD (comes with buying a new Seagate hard drive, or can be downloaded, and only works if one of the hard drives is a Seagate) to do a partition-to-partition copy. While using the Seagate CD takes a long, long, long time, it does work, and the resulting Windows partition can be booted from. There is supposed to be a DOS program (not from M$) named XXcopy which is reputed to do logical copying of the Windows partition correctly; I have also not used xxcopy. Perhaps Bruce can come back in here and tell us which programs can be safely used, and correct whatever I have said wrong here.

I have heard it said that one nearly always ends up re-installing Windows at least once a year. Back up before starting.

Best Wishes.
Google Search of Forum: http://wellminded.com/puppy/pupsearch.html

Bruce B

Re: Options.

#18 Post by Bruce B »

Purple_Ghost,
purple_ghost wrote:The DEFRAG program for Windows ME is supposed to be a lot faster than the one that comes standard with Windows 98, and can be installed in Windows 98 to be used. I know that from reading what others have said; I have not used the Windows ME DEFRAG program in Windows 98. In using Windows 98 I have found "http://www.mdgx.com/" interesting. I have also used one of the standard Unofficial Windows 98 Service Packs listed on this site, and in one move got rid of a lot of funny Windows 98 glitches.
The ME tools are superior, the defrag is super fast.

Run Scandisk before the defrag. I let Scandisk fix badly reported free space and delete lost clusters. I do not trust it to fix more complex errors, and I don't trust Norton Disk Doctor to fix them either. If you can locate the files in a directory, copy them to a new directory and delete the old one after you've fixed the copy. See if you can spot any obvious errors, such as files that are too short. Rename the new directory to the old name and run Scandisk again. Invalid long file names are actually orphaned aliases, but MS doesn't want to report them as such. I fix these with a disk editor.

Finding stuff at mdgx.com is the hard part. I think the package is called scanfrag, but it is worth the search. Just run the installer once you've found it.

I've never had damage from running the ME defragger.
purple_ghost wrote:I have also not used xxcopy. Perhaps Bruce can come back in here and tell us which programs can be safely used, and correct whatever I have said wrong here.
Xxcopy is very good, but there is a learning curve involved. So many switches.

I like Total Commander for drive backups because, unlike Windows Explorer, it doesn't stop on an error (like accidentally trying to copy the swap file); Total Commander just advises you and continues the copy operation.


Bruce

User avatar
crabbypup
Posts: 91
Joined: Sun 20 Jan 2008, 21:49
Location: toronto canada

#19 Post by crabbypup »

didn't anyone read the first post? i am using windows xp service pack 3. not win 98 or any of the earlier versions. i know xp is basically NT, but whatever.
i used the braindead defragger and it works amazingly well. but it took about 3 hours on my 320gb drive. that is only to be expected.
my avatar is what happens when you run windoze.

User avatar
SirDuncan
Posts: 829
Joined: Sat 09 Dec 2006, 20:35
Location: Ohio, USA
Contact:

#20 Post by SirDuncan »

crabbypup wrote:didn't anyone read the first post?
Yes.
crabbypup wrote:i am using windows xp service pack 3. not win 98 or any of the earlier versions.
I'm not entirely certain why Purple_ghost went off on a tangent about 98. I assume that he was just mentioning a problem that he had with an earlier version in hopes that it might give some insight into the current problem.
crabbypup wrote:i know xp is basically NT, but whatever.
NT? Who mentioned NT? You're not being confused by NTFS, the file system used by all versions of Windows that use the NT kernel (like XP), are you?
crabbypup wrote:i used the braindead defragger and it works amazingly well. but it took about 3 hours on my 320gb drive. that is only to be expected.
Yeah, defragging any drive much longer than about 20 gigs is a real time hog. That's why I try to avoid using large NTFS or FAT formatted partitions.
Be brave that God may help thee, speak the truth even if it leads to death, and safeguard the helpless. - A knight's oath

Post Reply