xargs to run processes multi-threaded

For discussions about programming, programming questions/advice, and projects that don't really have anything to do with Puppy.
sc0ttman
Posts: 2812
Joined: Wed 16 Sep 2009, 05:44
Location: UK

xargs to run processes multi-threaded

#1 Post by sc0ttman »

xargs
-P N Run up to N PROGs in parallel

With `-P`, xargs runs up to `max-procs` processes in parallel; the default is 1. If `max-procs` is 0, xargs will run as many processes as possible at a time:

Code:

  arguments_source | xargs -P max-procs command
Using the above, we can spread commands across a number of processes, making the overall job faster...
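For instance (a sketch, not from the original post -- the file names are made up): compress a batch of files with up to 4 gzip processes running at once.

```shell
# Work in a scratch directory with some made-up files to compress.
cd "$(mktemp -d)"
printf 'one\n'   > a.log
printf 'two\n'   > b.log
printf 'three\n' > c.log
# -P 4 : run up to four processes at once; -n 1 : one file per gzip call
ls *.log | xargs -P 4 -n 1 gzip
ls    # a.log.gz  b.log.gz  c.log.gz
```

With only three small files you won't notice, but on hundreds of big files the four workers keep all your cores busy.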

I haven't tried it.. But figure Pkg would be a good script to test this in ..

It could be used on slow cat, grep, cut, sort, cp, rm, tar, dpkg, commands (etc) ... or when processing large file lists..
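For the large-file-list case, a sketch (file names and pattern invented): find feeds NUL-separated names so odd filenames survive, -n hands each grep a batch, and -P keeps several greps running at once.

```shell
# Scratch directory with made-up files; only two contain the pattern.
cd "$(mktemp -d)"
printf 'hit\n'  > one.txt
printf 'miss\n' > two.txt
printf 'hit\n'  > three.txt
# -0 pairs with find's -print0; -n 16 files per grep; -P 4 parallel greps
find . -name '*.txt' -print0 | xargs -0 -P 4 -n 16 grep -l 'hit'
```

Note that with -P the output order is whatever order the workers finish in, so pipe through sort if order matters.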

I have no idea whether or not it would work if you piped it a shell function ...
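xargs execs a real program, so it can't call a shell function directly -- but in bash you can export the function and have each worker be a fresh bash that inherits it. A sketch (the function name 'shout' is made up):

```shell
#!/bin/bash
# Define a trivial function; real code would do something slower.
shout() {
    printf '%s!\n' "$1"
}
export -f shout
# Each worker bash -c sees the exported function; _ fills $0.
printf 'foo\nbar\nbaz\n' | xargs -P 2 -n 1 bash -c 'shout "$1"' _
```

With -P 2 the three lines can come back in any order; drop -P (or use -P 1) if order matters.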
[b][url=https://bit.ly/2KjtxoD]Pkg[/url], [url=https://bit.ly/2U6dzxV]mdsh[/url], [url=https://bit.ly/2G49OE8]Woofy[/url], [url=http://goo.gl/bzBU1]Akita[/url], [url=http://goo.gl/SO5ug]VLC-GTK[/url], [url=https://tiny.cc/c2hnfz]Search[/url][/b]

musher0
Posts: 14629
Joined: Mon 05 Jan 2009, 00:54
Location: Gatineau (Qc), Canada

#2 Post by musher0 »

Hi sc0ttman.

Yep, it will! E.g.

Code:

#!/bin/sh
shuf -i 1-49 -n 18 | xargs -n 6
will generate 3 sets of Lotto 1-49 numbers, 6 per line. (shuf draws the 18 numbers without repeats, so no number appears in two of the sets.)
(Not by me, BTW; I picked it up on the stackoverflow forum.)

As well, former forum member Iguleder (of librepup fame) used it in his brush-up
of the PPM a few years back, IIRC.

I have never used xargs myself, I must confess.

~~~~~~~~~~~~~~~;

Another way to run commands in parallel, and therefore faster, is to use the good old
ampersand after a command.

Time each command or function. Then, depending on the results of this timing, start
a secondary function with an ampersand at the end, at the top of your script --
BEFORE starting your main section or function -- so the two finish at about the same
time. I did this in my menu-generating and history scripts. In my menu-generating
script, it brought the execution time down from 10 seconds to a little over 1 second.
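A minimal sketch of that pattern (the function names and one-second timings are made up; in a real script each would do actual work):

```shell
#!/bin/sh
# Two jobs that each take about a second; run together they take ~1s, not 2s.
secondary_job() { sleep 1; echo "secondary done"; }
main_job()      { sleep 1; echo "main done"; }

secondary_job &   # started first, runs in the background
main_job          # foreground work happens at the same time
wait              # block until the background job has finished too
```

The wait at the end matters: without it the script can exit (or read the secondary job's results) before the background work is done.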

Also presort your lists. This I got in a note from the author(s?) of the sort utility. Deep
down the computer still compares bytes one after the other, so presorting any data you
can does speed up later processing. Once your data is presorted, you can also drop
locale-aware collation (run the tool under LC_ALL=C) so each comparison is a plain byte
compare rather than a multibyte lookup, for a portion or all of your processing. (You can
switch the locale back as needed.)
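One common way to drop that locale overhead is to run the tool under the C locale, where sort compares raw bytes (so all capitals come before lowercase) instead of doing collation lookups -- usually much faster on large inputs. A sketch:

```shell
# Byte-wise (C locale) sorting: ASCII order, capitals first.
printf 'b\nA\na\nB\n' | LC_ALL=C sort    # -> A B a b
# Locale-aware sorting for comparison; order depends on the locale,
# e.g. a A b B under many en_* locales.
printf 'b\nA\na\nB\n' | sort
```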

Finally, use the string manipulations built into bash as much as you can, instead of
going external through awk, sed or replaceit. Every external call forks a new process,
so for small edits the built-ins are usually far faster.
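For example (a sketch; the file name is made up), these POSIX parameter expansions replace a basename call, a sed suffix-strip, and a length check without forking anything:

```shell
#!/bin/sh
f="/tmp/archive.tar.gz"
echo "${f##*/}"      # like basename "$f"   -> archive.tar.gz
echo "${f%.tar.gz}"  # strip the suffix     -> /tmp/archive
echo "${#f}"         # string length        -> 19
```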

Just a few thoughts.

IHTH.
musher0
~~~~~~~~~~
"You want it darker? We kill the flame." (L. Cohen)
