Purging WorkUnits

Robert
Veteran Member
Posts: 82

Testing is generating quite a few. Which utility is used to purge them, and possibly their log counterparts in bpm/wflog?

fred.kobos
Advanced Member
Posts: 27

Depends which one. Here are both for workunits; once moved, you could clear the directory:

perl $GENDIR/bin/batch.pl MoveWorkUnitToHistory move -outputFileName /law9p/law/bpm/reqapproval06012009 -processThruDate 06/01/2009 -processThruTime 12:20:33

perl $GENDIR/bin/batch.pl PurgeHistWorkUnit -outputFileName /law9p/law/bpm/purgeapprovals070108 -purgeBeforeDate 07/01/2008 -purgeOption Y
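If you want to run both steps together, say nightly from cron, a minimal wrapper might look like the sketch below. The batch.pl calls are exactly the two above; the output file names, the 23:59:59 cutoff, and the purge-before date are placeholders to adjust for your environment.

    #!/bin/sh
    # Sketch: move completed workunits to history, then purge old history.
    STAMP=`date +%m%d%Y`                 # tag output files with today's date
    THRU_DATE=`date +%m/%d/%Y`           # move everything completed through today

    perl $GENDIR/bin/batch.pl MoveWorkUnitToHistory move \
        -outputFileName /law9p/law/bpm/move_$STAMP \
        -processThruDate $THRU_DATE -processThruTime 23:59:59 || exit 1

    perl $GENDIR/bin/batch.pl PurgeHistWorkUnit \
        -outputFileName /law9p/law/bpm/purge_$STAMP \
        -purgeBeforeDate 07/01/2008 -purgeOption Y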
Robert
Veteran Member
Posts: 82

Fred -

Thanks for the tip.
David Williams
Veteran Member
Posts: 1127

If you have PFI then you can run the move and purge jobs as a flow.

David Williams
JudeBac
Veteran Member
Posts: 129

Purging using a PFI ProcessFlow would be a neat addition. We will be getting PFI in a week.

Thanks,

Jude

David Williams
Veteran Member
Posts: 1127

You will use the CustomActivity node and select the MoveWorkunitToHistory activity. You can use a date variable if you don't want to move recently completed Workunits. Note that only Completed or Cancel Completed Workunits will be moved; if you have WUs in error, they won't move off.

David Williams
Sam Simpson
Veteran Member
Posts: 239

Purging workunits using PFI really is a good idea, but how do you purge the actual workunit logs in the WFLOG directory? Here's what I did:
1. Create a processflow that is scheduled to run every night. This calls a perl script that moves processed workunits to history. You can set the parameters to only purge records that are older than nn days. You can also direct the log file created by the perl script into a directory that you can work on later.
2. Call a script that purges workunit logs from the wflog directory. This script (sorry, but this is unix, folks) reads the log created by the perl purge program and physically deletes the logs in the wflog directory. This script should be the last node of your processflow. The script should also create a report of which workunits have been purged and email it to you (see the sketch after this list).
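A minimal sketch of that cleanup script, assuming the purge log lists one workunit number per line and that each workunit's logs sit under a per-workunit path in the wflog directory; the log format, the wflog path, and the mailx report step are all assumptions to verify on your own system:

    #!/bin/sh
    # Read the purge program's log and remove the matching wflog entries.
    PURGELOG=$1                          # log written by the perl purge step
    WFLOG=/law9p/law/bpm/wflog           # wflog directory (placeholder path)
    REPORT=/tmp/wflog_purge_report.$$

    while read WU; do
        if [ -d "$WFLOG/$WU" ]; then
            rm -r "$WFLOG/$WU" && echo "purged $WU" >> $REPORT
        fi
    done < $PURGELOG

    # Email the report so there is a record of what was removed.
    mailx -s "WFLOG purge report" admin@example.com < $REPORT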

Not only are WU logs clogging the system, but if you notice, the logs of the active processflows themselves keep getting bigger and bigger until it is hard for you to even open them. Here's what I did:

1. Create a cobol program under the LOGAN env whose function is to read WFPROC and get its version number. Write the process names into a file in a directory (sequentially) and affix _V plus the version number. This program is scheduled to run (recdef) every first day of the month (11 PM).
2. Create a script that purges the processname logs based on the listing from above (see the sketch after this list).
3. Create a processflow scheduled to run every first of the month, right after the scheduled cobol program above, whose only function is to call the purging script.
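A sketch of that purging script, assuming the cobol program writes one entry per line in the form processname_Vversion and that each flow's log is a single file named after that entry; the listing path, log directory, and file naming are all assumptions:

    #!/bin/sh
    # Remove the oversized log for each process listed by the cobol program.
    LISTING=/law9p/law/bpm/current_versions.lst   # file written by the cobol program
    LOGDIR=/law9p/law/bpm/log                     # processflow log directory (placeholder)

    while read ENTRY; do
        F=$LOGDIR/$ENTRY.log
        if [ -f "$F" ]; then
            rm "$F"                               # purge this flow version's log
        fi
    done < $LISTING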

As for purging the history records, we have not decided yet on how to save the csv file to the archive directory, but we have decided that the Admin can do this manually every year.
Sam Simpson
Veteran Member
Posts: 239

Forgot to mention: the monthly script also creates a backup copy before purging. This is to ensure that users have a way of looking back, just in case.
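In the sketch above, that backup would replace the bare rm inside the if block; for example (the archive directory and dated suffix are placeholders):

    # Keep a dated backup copy so users can look back, then purge.
    cp "$F" /law9p/law/bpm/logbackup/`basename $F`.`date +%Y%m`
    rm "$F"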
Gary Davies
Veteran Member
Posts: 248

Keep in mind: because it is a Perl script now and not a 4GL program, you want to run it as frequently as possible, because you can get a "memory exceeded" error if you have too many workunits to delete.

John Henley
Posts: 3353

BTW, there is a Lawson KB article (5207793) on changing the properties of the purge program to (possibly) avoid the "memory exceeded" error...

Thanks for using the LawsonGuru.com forums!
John
Robert
Veteran Member
Posts: 82

Gary:
Perl is extremely efficient; it's the script's construction that causes the issue.
Thanks for all this excellent advice, gurus.

Gary Davies
Veteran Member
Posts: 248

I wasn't blaming Perl, just noting that the process changed and, as a result, a memory issue can occur: if the interval at which the process is run is too long and results in too many workunits to delete, you can get the memory exceeded error.

Thanks for the reference to the KB, John. I have found, though, that even with that tweaked, if you leave purging too long you can never get the perl script to complete, resulting in the need for either a custom 4GL program or the painful process of manually removing the workunits.

Sam Simpson
Veteran Member
Posts: 239

Yes, I was aware of the limitation of perl and was ready to use my 4GL program to do the purge. I tested the perl script purging 5 years of data and never had a problem, so I continued using the perl script. My flow has been without any glitch since day one. As for the memory usage, I think I did something to increase the size in batch.pl, but I'm not sure if that was it.