I posted the gist of this question as a follow-on to a different thread I started in the S3 Systems Admin forum, then realized it might be more appropriate here. I hope this doesn't violate any cross-posting rules. The first step of a two-step job runs a token assigned to a Windows command file that first runs a VB script, then calls loadrpts. The second job step runs PR530. I don't want PR530 to run if the first step fails; it would be better for the job to go into recovery. When I force errors to occur, I notice that when loadrpts encounters an error, the job terminates. In this case, I removed a folder expected by the script and a folder expected by loadrpts. When the script errors, the command file continues to execute after the script exits, but when loadrpts encounters an error, it terminates the job. Here is the log output:
*** An errror occured while procssing HistCopy.vbs ***
*E* Print file folder was not found. Exiting with error code 76
D:\lawdev\law\eugene\Vbs\histcopy.vbs(235, 4) Microsoft VBScript runtime error: Path not found
(Loadrpts invoked here)
Loading from D:\lawdev\law/print/4JLSFDEV_hrisuser/AES001. . .
Cannot Open Directory: D:\lawdev\law/print/4JLSFDEV_hrisuser/AES001
Elapsed Time . . . . . .: 00:00:00
ERROR: Stopped On Exit 1.
Elapsed Time: 00:00:01
END: Job Ended: Sun Apr 14 06:55:31 2013
I can return a code from the script to the command file, trap it, and act on it, which suggests one way to accomplish what I want: if the script returns an error, invoke loadrpts (or some other utility) with an invalid directory so the step fails. That seems kludgy, though. Is there an accepted, or at least slightly more elegant, way to force an exit-causing error up from a command file?
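For reference, this is the kind of trapping I mean in the command file. A minimal sketch, assuming the script is launched with cscript and signals failure through its process exit code (WScript.Quit); the loadrpts arguments are omitted and the 76 is just borrowed from the script's own error message:

cscript //nologo D:\lawdev\law\eugene\Vbs\histcopy.vbs
if errorlevel 1 goto scriptfailed

rem Script succeeded; the real loadrpts call (arguments omitted here) goes next.
rem loadrpts ...
exit /b 0

:scriptfailed
echo *** HistCopy.vbs failed; skipping loadrpts ***
rem Is exiting non-zero here enough to stop the job and put it into recovery,
rem or does something else have to happen to force the error up to the step?
exit /b 76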