More efficient way to find & tar millions of files

Posted by Stu Thompson on Stack Overflow
Published on 2010-04-23T08:40:43Z

Filed under: bash | find

I've got a job running on my server at the command-line prompt for two days now:

find data/ -name 'filepattern-*2009*' -exec tar uf 2008.tar {} \;

It is taking forever, and then some. Yes, there are millions of files in the target directory. But just running...

find data/ -name 'filepattern-*2009*' -print > filesOfInterest.txt

...takes only two hours or so. At the rate my job is running, it won't finish for a couple of weeks. That seems unreasonable. Is there a more efficient way to do this? Maybe with a more complicated bash script?
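For what it's worth, one approach that should be far faster is to feed the matched paths into a single tar process rather than spawning tar once per file. This is a sketch, not the poster's solution, and it assumes GNU tar (for `--null` and `-T -`):

```shell
# Stream NUL-delimited paths from find into one tar invocation.
# NUL delimiters keep filenames with spaces or newlines intact.
find data/ -name 'filepattern-*2009*' -print0 \
  | tar --null -T - -cf 2008.tar
```

If filesOfInterest.txt already exists and no filename contains a newline, `tar -cf 2008.tar -T filesOfInterest.txt` reuses the list directly.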

A secondary question is: "why is my current approach so slow?"
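A plausible answer, reasoning from standard find and tar behavior rather than anything stated in the post: `-exec … \;` launches a fresh tar process for every matched file, and `tar -u` must scan the existing archive on each append to look for older copies, so the total work grows roughly with the square of the number of files. A batched variant that avoids the per-file process cost (a sketch; `-exec … {} +` is POSIX, and `-r` appends without the update scan):

```shell
# ';' would run one tar per file; '+' packs many files into each
# tar invocation, so only a handful of processes are spawned.
find data/ -name 'filepattern-*2009*' -exec tar -rf 2008.tar {} +
```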

© Stack Overflow or respective owner
