Deleting Archive folder after certain time

Utsav
New Member
Posts: 3

Hello,

I have a flow in Infor IPD that creates output files at the end. I have already created an archive path where the output files are stored. I also want to add a feature to that archive folder to delete all files after 30 days. Is this possible in IPD? Suggestions would be appreciated.

Ragu Raghavan
Veteran Member
Posts: 476
Not the most elegant solution, but this is how I handled it:
1. Assumes that files are saved with a date stamp yyyymmdd in the archive folder
2. FileAccess - List Files - for all files in the archive folder
3. DataIterator to process the list from #2
4. Javascript to parse the date from file_name and compare it to a cutoff date (today minus 30 days in your case) - see the sketch after this list
5. FileAccess - Delete the file if it satisfies condition #4
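A minimal sketch of what the JavaScript node in step 4 could look like. The names here (fileName as the current file from the DataIterator, deleteFlag as the variable checked before step 5) are assumptions for illustration, not actual node outputs:

// Assumes fileName holds the current file name from the DataIterator,
// e.g. "GL_OUTPUT_20240115.csv", and deleteFlag is a flow variable
// tested before the FileAccess delete in step 5.
var deleteFlag = 0;
var match = /(\d{8})/.exec(fileName);              // pull the yyyymmdd stamp
if (match != null) {
    var y = parseInt(match[1].substring(0, 4), 10);
    var m = parseInt(match[1].substring(4, 6), 10) - 1;   // JS months are 0-based
    var d = parseInt(match[1].substring(6, 8), 10);
    var fileDate = new Date(y, m, d);
    var cutoff = new Date();
    cutoff.setDate(cutoff.getDate() - 30);         // anything older than 30 days
    if (fileDate < cutoff) {
        deleteFlag = 1;                            // step 5 deletes when this is set
    }
}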
Utsav
New Member
Posts: 3

Thanks, Ragu!! I appreciate your comments and suggestions on this topic.

mthedford
Basic Member
Posts: 8
I have a simple IPA flow that queries a configuration table for Flag=Y and pulls in the folder path and retention days. Within the query loop, a SysCmd node runs a KSH script, passing in the folder path and the days of retention, and a MsgBuilder node captures the folders that were processed successfully and any folders that had errors.

I have another KSH script specifically for the productline WORK dir cleanup, which removes all of the old GLTDISK* and tmp[0-9]* files. It uses the same parameters: the path to each productline's work folder and the number of retention days back.

#!L:\CYGWIN64\bin\ksh
# $1 = folder path, $2 = retention days
export prtdir="$1";
echo "path is: $prtdir";
# remove GLTDISK* and tmp[0-9]* files older than $2 days
find "$prtdir" -name "GLTDISK*" -type f -mtime +$2 -exec rm {} \;
find "$prtdir" -name "tmp[0-9]*" -type f -mtime +$2 -exec rm {} \;

************
For archival, I have another flow and table to use.
This table contains at least the following: the full path to the folder (local or UNC), the full path to the archive folder (local or UNC; I use an archive subfolder in each location), the number of file-retention days (files older than XX are archived), the script to run, and a flag for whether or not to run it.
A KSH script generates the list of files that are older than the retention days. It takes 2 params: $1 is the path to the folder where the files are located, and $2 is the retention value; files older than this number of days are archived.

# $1 = folder path, $2 = retention days; list top-level files older than $2 days
export prtdir="$1";
find "$prtdir" -maxdepth 1 -type f -mtime +$2

The flow performs a SQL loop query reading the table. For each record it runs the KSH script and stores the script's output (the file list) in a variable. A FileAccess node writes the file list to a text file to be used as input for 7-Zip, a SysCmd node runs 7-Zip using that file list as the input of what to zip up, and another FileAccess node writes/appends the file list and the zip output to a log file. The flow then deletes the file-list text file, clears the variables, and loops to the next record. When done, it sends an email of the successful and error folder locations, if any.
That's it.
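One detail worth illustrating: the 7-Zip step can use 7-Zip's @listfile syntax, which adds every file named in the list file to the archive. A rough sketch of how that command string might be built (archivePath, fileListPath, and the 7z.exe location are assumptions):

// Hypothetical: build the 7-Zip command string for the SysCmd node.
// "a" adds files to the archive; @listfile supplies the files to add.
var zipCmd = '"C:\\Program Files\\7-Zip\\7z.exe" a '
           + '"' + archivePath + '\\archive.zip" '
           + '@"' + fileListPath + '"';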

Only thing to remember: if you are on Windows servers and performing these tasks on other Windows servers that are NOT Lawson servers, you will need to grant the machine account of the server running the SysCmd access to the folders on the other server where you are removing or zipping/archiving files, because SysCmd runs as the account your LSF environment actually runs under, which is usually the machine account.

Been running flawlessly for years.