Remove the lint from your hard drive

Well, not really. But let me explain. I hoard data… a lot of data. I have a hard time deleting anything if there's the slightest chance I might want it later (downloads, docs, scripts, and so on).
I've done a pretty good job of keeping a copy of things on my home server. In fact, I usually have a few copies of things on that server, and quite often they're duplicate files. Anyway, a while ago I came across a tool called fslint.
fslint is a Linux utility that searches through your drive and helps you clean up files. In my case, I wanted to remove all the duplicate files in one of the directories on this server. It has a ton of options; the two I use most are: 1) delete duplicate files, and 2) hard-link duplicate files. I'm not going to go into the details of the differences between the two, but both free up the space used by duplicate files.
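For reference, here's roughly how I run it from the command line. Treat this as a sketch rather than gospel: the fslint command-line scripts usually live outside your normal PATH (on many distros under /usr/share/fslint/fslint), the flag names below are from memory, and ~/archive is just a made-up example directory, so check findup --help on your own system first.

```
# fslint's CLI scripts typically aren't on the PATH; add them for this session.
export PATH="$PATH:/usr/share/fslint/fslint"

# Dry run: report the duplicates that would be removed, without touching anything.
findup -t -d ~/archive

# Option 1: delete duplicates, keeping a single copy of each file.
findup -d ~/archive

# Option 2: "merge" duplicates instead, replacing extra copies with hard links
# so every path still works but the data is only stored once on disk.
findup -m ~/archive
```

Either way, do the dry run first and skim the list before letting it change anything.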
The last time I ran it on one of the directories I wanted to clean out, the disk usage of that directory dropped from 2.3GB to 578MB. That's roughly a quarter of the space I was using before!
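If you want to check the numbers on your own system, du will show the directory's disk usage before and after (again, ~/archive is just a placeholder path):

```
# Total disk usage of the directory; hard-linked copies are only counted once.
du -sh ~/archive
```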
If you have a Linux system that's starting to run out of space, fslint may be a tool that can help you!