LPA Version 10

Erin
Advanced Member
Posts: 24
    We are using LPA version 10, and when we are in Designer we can watch it step through every node on the page (yes, it's that slow).  When inserting records, it takes 1.5 hours to insert 300 records.

    Does anyone have any insight as to what could cause this?  Or is anyone on version 10 getting the speed of version 9?

    Thank you for any assistance.
    Woozy
    Veteran Member
    Posts: 709
      Hi Erin,

      Running anything in Designer is slow, but that is considerably slower than I would expect.  Here are some questions that may help pinpoint the answer:
      • What are you inserting records into (LTM? S3? SQL?), and what node type are you using? 
      • Can you upload and run it on the server instead? If so, turn off your logging (or at the very least Activity logging).
      • Is there anything helpful in the server logs or Designer logs? 
      • How beefy of a machine is Designer running this on? 
      • How busy is the server you are inserting to?
      • How busy is the Landmark server (if different from the destination server)?
      • How good is the network connection to your client machine (i.e. ping the Landmark and Destination servers from the Designer client)?  Is there a latency issue that could be contributing?
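      On the latency question: one quick way to check from the Designer client (besides a plain ping) is to time TCP connects to the servers. A minimal Python sketch - the host and port in the usage comment are placeholders, not actual server names:

```python
import socket
import time

def tcp_latency_ms(host, port, attempts=5):
    """Estimate round-trip latency by timing TCP connects.

    A rough stand-in for ping (useful when ICMP is blocked);
    averages several attempts to smooth out jitter.
    """
    samples = []
    for _ in range(attempts):
        start = time.perf_counter()
        with socket.create_connection((host, port), timeout=5):
            pass  # connect/teardown only; no data is sent
        samples.append((time.perf_counter() - start) * 1000.0)
    return sum(samples) / len(samples)

# Example (placeholder host/port - substitute your Landmark server):
# print(tcp_latency_ms("landmark.example.com", 443))
```

      Anything consistently in the tens of milliseconds per round trip adds up fast when a flow makes one round trip per record.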

      If you are a Managed Services client, I would encourage you to bump it to them to review.

      Kelly

      Kelly Meade
      J. R. Simplot Company
      Boise, ID
      Erin
      Advanced Member
      Posts: 24
        Woozy,
        Thanks for replying! Here are the answers to your questions.

        •What are you inserting records into (LTM? S3? SQL?), and what node type are you using?
        There are several processes we are doing.
        1. Reading in an interface flat file (located on the Lawson database server) and inserting into the staging table then importing into the tables.
        2. Selecting from the Lawson tables and inserting into remote DB2 and SQL tables.
        3. Reading in a flat file (EFT file) and writing out selected fields to a flat file.

        •Can you upload and run it on the server instead? If so, turn off your logging (or at the very least Activity logging).
        Turning off the logging does help, but we need it while doing development. Three checks/details in an EFT file took 4 minutes just to write out to another flat file.

        •Is there anything helpful in the server logs or Designer logs?
        Not that we can see (or understand).

        •How beefy of a machine is Designer running this on?
        Pretty beefy!

        •How busy is the server you are inserting to?
        It isn't in production yet so I am thinking it is not horribly busy (we are only installing GL, AR and AP for now)

        •How busy is the Landmark server (if different from the destination server)?
        I don't know.

        •How good is the network connection to your client machine (i.e. ping the Landmark and Destination servers from the Designer client)? Is there a latency issue that could be contributing?
        Interesting question. I will ping and see if there is a problem there.

        We opened an incident with Infor and they are also looking into it. They wanted logs but so far it has taken over 26 hours just to do the logs.
        Woozy
        Veteran Member
        Posts: 709
          Hmmm - I thought this was one specific flow that was having issues. It sounds like you have other "stuff" going on. Unfortunately I don't think I can help much. I think Infor is going to have to do the heavy lifting to figure out what is causing the latency. You might also work with your infrastructure team to see if they can do any troubleshooting.

          Just curious - are you an LSF customer who has just implemented IPA as a replacement for PFI, or are you a Landmark customer? From your previous answers, it sounds like you're using this for S3. Do you have Landmark/IPA installed on your LSF server, or is it on its own server? Either one of those options could have performance consequences.
          Erin
          Advanced Member
          Posts: 24
            We are using S3 with Landmark, version 10.
            John Henley
            Posts: 3353
              Erin, what version of the designer are you using? Also, do you have debug turned on and/or have breakpoints on any of the nodes?
              Thanks for using the LawsonGuru.com forums!
              John
              Erin
              Advanced Member
              Posts: 24
                Hi John,
                We are using Product Version: 10.0.2.8..5466 2012-10-11 21:41:29
                We have Infor looking into it also, but we are at the point now that if LPA doesn't start moving faster, we can't go live on the target date, since it will take several hours to bring in one GL interface.

                We are not using any breakpoints and are at WorkUnit level for logging (since we are still developing)

                P.S. Hershel Nunez from HyBridge says "Hi"
                Woozy
                Veteran Member
                Posts: 709
                  Hi Erin - I'm a bit confused by a couple of your comments, so I want to be sure we're on the same page.

                  You said that you can't upload and run the flows on the server because you need to do development.  Normally I will create "stripped down" test files to use for testing via the Designer to validate functionality, and then upload the flows to the server to do the real grunt work and test performance.  The Designer will be much, much slower than the server in all cases (as long as server Pflow logging is off) - and particularly if you are doing updates.

                  Also, you mentioned that you have debugging off, but there are two types of debugging - Designer debugging (via the menu>>Windows>>Preferences>>Lawson Process designer>>debug checkbox) and server debugging (WorkUnit, Activity, WorkUnit and Activity, etc.).  If you are running in the Designer, the server debugging setting doesn't matter, because a workunit and its related logs/activity records aren't created on the server.  If you are running locally in the Designer, you instead need to turn off the Designer debugging for the best performance, but it will still be much slower than the server.

                  When running a flow in the Designer, it is running locally on the client machine even though the Landmark server has to be running and connected.

                  You may already understand all this, but I want to be sure.

                  Kelly
                  John Henley
                  Posts: 3353
                    I definitely agree with Woozy...all good points. You shouldn't be basing your design decisions on how flows are running in the designer, and you shouldn't be running thousands of records in the designer. Once you have a valid flow developed, then you move it to the server and throw batches of records at it.

                    I'm sure my friends at Lawson, um--Infor--will wring my neck, but Process Automation isn't the right solution for every problem, particularly if you're trying to do complicated processing (i.e. with all of the DB2 database nodes you're talking about) for thousands of records. Depending on your requirements, it might be that you need to dust off your COBOL manual.
                    John
                    John Henley
                    Posts: 3353
                      Posted By Erin on 07/12/2013 04:12 PM
                       P.S. Hershel Nunez from HyBridge says "Hi"

                      Hi Hershel! (why does Hershel working with True Value *scare* me ??).
                      John
                      Woozy
                      Veteran Member
                      Posts: 709
                        Excellent points, John.  IPA performance is not good for large batch processing.  It will do it, but it won't scream like it will in COBOL, and it can suck resources like you would not believe.  We have the same issue with Spreadsheet Designer (Add-Ins for Landmark).  The applications just aren't designed for high performance mass transactions.  This made our data conversion into LTM a bit painful.

                        We have a few interface jobs (creating outbound files for external vendors or systems - like Kronos) that we run on IPA, but they run painfully slow, even on the server - and they are just pulling data out of our systems.  We are currently reviewing these to determine if we should migrate them off IPA and into something else (unfortunately, we can't use COBOL since we need to pull in LTM data). 

                        We do have a lot of update flows, but they are typically small batches that are triggered by events or daily "changes" feeds.

                        Good Luck.
                        Erin
                        Advanced Member
                        Posts: 24
                          John & Woozy,
                          We only design in the Designer and then upload to the server for testing/running (outside of flat-file-to-flat-file testing, we couldn't run the larger flows from our PCs if we tried).

                          We are building LPA daily interfaces to take transactions from our DB2 systems (manufacturing, order billing, etc.) and load them into S3.
                          We are also extracting data (like Account/Accounting Unit combos) from S3 in order to update a DB2 table, so those other systems can do a lookup to put the correct info on their invoices before we pick them up.

                          But it took 1.5 hours just to move 300 records from S3 to one DB2 table. And it takes hours to run just a few thousand records in. We were hoping it was a Landmark setting issue or something.


                          John Henley
                          Posts: 3353
                            It sounds like you're trying too many things at once and just shooting at whatever moves.

                            I think you need to step back a minute and be a little more methodical.

                            Let's start with an easy one first:

                            "it took 1.5 hours just to move 300 records from S3 to one DB2 table".

                            Is this just a simple loop to insert records?
                            John
                            Erin
                            Advanced Member
                            Posts: 24
                              Sorry to have confused you.
                              1. We are installing Lawson S3, with go-live at the end of September.
                              2. Our interface strategy between other systems and S3 is via LPA.
                              3. We have several LPAs written (as mentioned) already.
                              4. All of the LPAs run very slowly on the server, to the point that it takes hours for just a few hundred rows to be written.
                              5. I was asking if anyone else on version 10 has such slow processing - examples mentioned earlier.
                              Erin
                              Advanced Member
                              Posts: 24
                                To answer your question...yes, it is basically selecting 4 columns from GL and inserting into a DB2 table (via an assign-and-insert loop through the data).
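                                For what it's worth, a per-record assign-and-insert loop pays a network round trip (and often a commit) for every row, which is one common reason 300 inserts can crawl. As a point of comparison outside IPA, batching the inserts amortizes that cost. A minimal sketch using Python's DB-API; the table and column names are illustrative, not the actual GL schema:

```python
BATCH_SIZE = 500  # rows per executemany() call; tune to taste

def copy_gl_rows(src_cur, dst_conn):
    """Copy 4 GL columns in batches instead of one INSERT per row.

    Illustrative table/column names; any DB-API driver that uses
    '?' placeholders (e.g. ibm_db_dbi for DB2) works the same way.
    """
    src_cur.execute(
        "SELECT company, acct_unit, account, sub_account FROM GLTRANS"
    )
    dst_cur = dst_conn.cursor()
    copied = 0
    while True:
        rows = src_cur.fetchmany(BATCH_SIZE)
        if not rows:
            break
        dst_cur.executemany(
            "INSERT INTO GL_EXTRACT (company, acct_unit, account, sub_account)"
            " VALUES (?, ?, ?, ?)",
            rows,
        )
        copied += len(rows)
    dst_conn.commit()  # one commit for the run, not one per row
    return copied
```

                                If a plain batched script like this finishes in seconds while the flow takes 1.5 hours for the same 300 rows, that points at per-row overhead in the flow (or the federation/authentication layer) rather than the database itself.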
                                Lynne
                                Veteran Member
                                Posts: 122
                                  Our interface process flows have always run a long time, but it got dramatically worse when we coupled (federated) S3 and Landmark. A 2-hour flow now runs in 8 or more hours. I have rewritten the longest ones in COBOL. If you are federated, this may be part of your problem as well. We are working with Infor to go to Direct IOS; they think this will help with performance, since it currently authenticates every time it queries a record. We are not on LPA yet, but Lawson said that wouldn't help with these long-running flows. We have to go to Direct IOS.