The example I use most frequently is that of image manipulation. When I download all the images I've taken on my digital camera, they're huge: 1600x1200 pixels. I'm not in the habit of making people download such giant images with their browser, so for the images that I post on the web, I resize them to 640x480, which is a reasonable size for most web sites. To do the resizing quickly, I use the ImageMagick tool "convert" to do all the work. I don't need to mouse around and click things to make it happen (as I would if I were using the Gimp). So, I generally type in my shell:
    for i in *.jpg; do convert -geometry 640x480 -quality 70 $i $i; done

This will get the job done, but it's mostly CPU intensive. There is disk access, but only for reading the original file and writing back the result; the rest is pure number crunching. Watching my CPU monitor on a dual-CPU machine is sad, because that "for" loop executes one "convert" at a time, leaving an entire CPU idle. As a result, I have to wait twice as long as I would if both CPUs were working on the task.
    (for i in *.jpg; do echo convert -geometry 640x480 -quality 70 $i $i; done) | maxcpu

To portably detect how many CPUs are available, I use the POSIX system call "sysconf". Unfortunately, Perl's POSIX module doesn't include the constant required for CPU detection (_SC_NPROCESSORS_ONLN), so I use Inline::C to call the detection function.