Use ‘parallel’ for easy multi-processor execution
May 26th, 2010
I just discovered the parallel utility, which is an easy way to make use of multiple processors while munging data from the shell. I especially like that I can pipe data directly in and out of it, just like the other shell utils (sed, awk, cut, sort, etc.).
From the examples on that page:
Use ImageMagick’s “convert” command to downsize many images to thumbnails:
ls *.jpg | parallel -j +0 convert -geometry 120 {} thumb_{}
“-j +0” means run as many jobs at once as there are processors in the system
Or do it recursively using find:
find . -name '*.jpg' | parallel -j +0 convert -geometry 120 {} {}_thumb.jpg
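If any of the filenames might contain spaces, a null-delimited variant should also work, assuming this is GNU parallel with its -0 (--null) option:

find . -name '*.jpg' -print0 | parallel -0 -j +0 convert -geometry 120 {} {}_thumb.jpg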
I suspect this will become an integral part of my pipelines soon.
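As a rough sketch of what I mean (assuming GNU parallel; the *.log files and the ERROR pattern are just made up for illustration), something like this should grep a pile of logs in parallel and feed the matching filenames straight into sort:

find . -name '*.log' | parallel -j +0 grep -l ERROR {} | sort

Since parallel prints each job’s standard output, the results can be piped onward just like any other stage.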
Tags: commandlinefu, linux, tools

1 Comment
June 14th, 2010 at 1:47 am
Cool! Thanx for the tip! Works fine:
parallel -j 4 pdftotext -- *.pdf