Let’s say you have a couple of job steps where the second step reads the output of the first, and perhaps they aren’t running as quickly as you’d like. It would be nice to run these programs at the same time and cut the overall elapsed time, but you can’t really read the input into the second step until the first step has finished creating it. Or can you?
<disclaimer>
This is more of an experiment than something I’ve ever done in production. Think about how this technique might affect job restart, database contention, or whatever else is important in your shop.
</disclaimer>
You can feed the output of one job into the input of another whilst they are both running by using a Unix pipe, and you don’t need any new software to do it. The jobs can even be running in different LPARs in the sysplex, as long as the directory in which you define the pipe is available from each LPAR. Here’s a simple example showing the method.
The first job copies ‘SYS1.HELP(HELP)’ to a pipe in a Unix directory whose file system is shared across the sysplex:
/*JOBPARM S=SYSA
//*
//WRITPIPE EXEC PGM=IEBGENER
//SYSPRINT DD SYSOUT=*
//SYSIN DD DUMMY
//SYSUT1 DD DISP=SHR,DSN=SYS1.HELP(HELP)
//SYSUT2 DD DSNTYPE=PIPE,PATH='/u/xxxxxxx/testpipe',
// FILEDATA=BINARY,
// PATHOPTS=(OWRONLY,OCREAT,OTRUNC),
// PATHDISP=(KEEP,KEEP),
// PATHMODE=(SIRWXU,SIRWXG,SIROTH)
And the second job, running in a different LPAR, reads the pipe and copies the data to SYSOUT:
/*JOBPARM S=SYSB
//*
//READPIPE EXEC PGM=IEBGENER
//SYSPRINT DD SYSOUT=*
//SYSIN DD DUMMY
//SYSUT1 DD DSNTYPE=PIPE,PATH='/u/xxxxxxx/testpipe',
// FILEDATA=BINARY,
// PATHOPTS=(ORDONLY,OCREAT),
// PATHDISP=(DELETE,DELETE),
// PATHMODE=(SIRWXU,SIRWXG,SIROTH),
// RECFM=F,LRECL=80,BLKSIZE=80
//SYSUT2 DD SYSOUT=*
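If you want to see what those DD statements are actually creating, the pipe is just a FIFO (a named pipe) in the z/OS Unix file system, and you can prove the plumbing from the shell before wiring it into JCL. Here’s a minimal sketch, using the same placeholder path as the JCL above; the OCREAT option means the jobs will create the FIFO themselves, so the mkfifo here is only for the interactive demo:

mkfifo /u/xxxxxxx/testpipe                 # create the FIFO by hand (the jobs do this via OCREAT)
ls -l /u/xxxxxxx/testpipe                  # a 'p' in the first column confirms it is a pipe, not a regular file
cat /etc/profile > /u/xxxxxxx/testpipe &   # background writer; its open waits until a reader turns up
cat /u/xxxxxxx/testpipe                    # reader; the data flows through as the writer produces it
rm /u/xxxxxxx/testpipe                     # clean up (PATHDISP=(DELETE,DELETE) does this in the reader job)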
As long as you coordinate the names of the pipes, you are good to go.
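The reason that coordination is so forgiving is that opening a FIFO blocks until the other end shows up: in POSIX terms the writer’s open waits for a reader and the reader’s open waits for a writer, so the two jobs only have to be running at roughly the same time rather than being started in lock-step. A quick shell illustration of that blocking behaviour, using a throwaway path just for the demo:

mkfifo /tmp/demo.pipe
(sleep 5; echo "hello from the writer" > /tmp/demo.pipe) &   # the writer only shows up 5 seconds later
cat /tmp/demo.pipe       # the reader sits here, blocked, until the writer opens the pipe and sends data
rm /tmp/demo.pipe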
Pretty slick, eh?