April 1st, 2013, 09:02 PM   #483
9to5cynic
Quote:
Originally Posted by palmtree5 View Post
[HIGH]cd /path/to/files
ls > ~/list.txt[/HIGH]Use your favorite text editor to check manually for duplicates (I don't know of another way)
If they are exactly the same files with only different names, you could do something with hashing, like this:

Code:
cd /my/dir/here && md5sum * | cut -d" " -f1 | sort | uniq -u
I didn't put too much time into this, but it *should* output only the hashes that occur exactly once among the files in /my/dir/here (note the `sort` — `uniq` only compares adjacent lines, so unsorted input would miss duplicates)... comparing that count against the total number of files should give you an idea of how many duplicates you have...

or you could just drop the final `| uniq -u` and see the hashes for all files (including duplicates -again based only on hashes- )

:/
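If you'd rather see *which* files are the duplicates instead of just counting hashes, GNU `uniq` can group lines by the hash itself. A rough sketch (the directory and file names here are made up for illustration):

Code:
```shell
# Demo in a throwaway directory; a.txt/b.txt/c.txt are made-up examples.
dir=$(mktemp -d)
printf 'same'  > "$dir/a.txt"
printf 'same'  > "$dir/b.txt"
printf 'other' > "$dir/c.txt"

# Sort by hash, then print every line belonging to a duplicated group.
# -w32 tells uniq to compare only the first 32 characters (the MD5 hash),
# so the differing filenames after the hash don't break the grouping.
# -D (GNU coreutils) prints all lines of each repeated group.
(cd "$dir" && md5sum * | sort | uniq -w32 -D)
```

That should print a.txt and b.txt next to each other (same hash) and skip c.txt entirely.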