IPA to Update Records, export to CSV, and upload to external web site

 11 Replies
HDAustin13
Advanced Member
Posts: 31

    Hello Everyone!

    I am new to IPA and am trying to build a process that runs a SQL script with multiple queries and updates, exports the results to a CSV file, and then uploads that CSV to an external website. I tried the "SQL Query" (SQLQRY) node, but it appears to handle a single query rather than multiple queries, which could make the process far more complicated, as seen in the first picture below.

    I also tried the "File" node with "Write To File" as the execute mode and an output file, but that did not get me any further. (picture 2)

    Can anyone please advise on how to run a SQL script with multiple queries/updates, export the results to a CSV file, and upload that file to an external website? Any sample process similar to this would be appreciated.

     

    Thank you very much in advance.

     

    Woozy
    Veteran Member
    Posts: 709
      Hi HDAustin13,

      I don't really understand what you're trying to do, but I do see one issue with your flow.  SQL Query Nodes will return multiple records, but you have to deal with each record within the loop like this:

      If you need to do multiple queries from different sources (that can't be done in a single SQL join), then the queries need to be nested somehow before writing to the output file.  

      I hope this helps point you in the right direction. Good Luck.

       

       

      Kelly Meade
      J. R. Simplot Company
      Boise, ID
      pbelsky
      Veteran Member
      Posts: 80

        Gosh, maybe I am just an old programmer here, but it seems to me like all the processing you want to do might be better handled by a processing language - i.e. a 4GL program that creates an output file.

        mikeP
        Veteran Member
        Posts: 151
          I'm not a SQL expert, but with SQL Server T-SQL, I think you can have a fairly large script in a single "statement" that contains different logical queries, each terminated by a "GO" command. I would expect the other major database players to have the same capability. If that's so, then you should be able to combine your multiple update queries into a single SQL Transaction node, either by hard-coding the script in the node's SQL property or by building it in a variable with an Assign node.

          Assuming I had a SQL Transaction node that performed all the updates I needed, I would probably add a SQL Query node to return the updated data, loop through its returned data set using a Message Builder node to assemble and format the CSV records, then write them to local disk on Landmark using a File Access node. Finally, I'd use a File Transfer node to move the file to its final location.
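          To make that concrete, here is a minimal sketch of what such a stacked script might look like (the table and column names are placeholders, not anything from this thread). One caveat: GO is a client-side batch separator recognized by sqlcmd and SSMS rather than a T-SQL statement, so whether the SQL Transaction node accepts it is worth testing; semicolon-separated statements may be the safer option inside the node.

              -- Hypothetical staging table and columns, for illustration only
              UPDATE dbo.InvoiceStage
                 SET Status = 'READY'
               WHERE Status = 'NEW';
              GO
              UPDATE dbo.InvoiceStage
                 SET ExportDate = GETDATE()
               WHERE Status = 'READY';
              GO
              -- Final select returns the rows that will become the CSV records
              SELECT InvoiceNumber, VendorCode, Amount
                FROM dbo.InvoiceStage
               WHERE Status = 'READY';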
          Ethan
          Advanced Member
          Posts: 28
            We are using SQL Server. In LPA, I have scripted out multiple insert statements using a File Access node, writing them to a file, and then called that script file with "sqlcmd" in a SYSCMD node (I actually write a .bat file containing the SQLCMD call in another File Access node and just run the batch file from the SYSCMD node). You can also export the output of the SQLCMD to a CSV file using the "-o" switch. If you go the route of SQLCMD CSV output, I have been adding -h-1 (suppress headers), -s"," (use a comma as the column separator), and -W (remove trailing spaces) to the end of the command to get the formatting clean. You can do any necessary data formatting during the select.

            You might be able to do what you need with two SQLCMDs, one to do the updates and one to do the selects.
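            For reference, here is a very generic sketch of the kind of batch file the SYSCMD node might run (the server name, database, file paths, and script contents are placeholder assumptions, not taken from the thread):

                @echo off
                rem Run the update script that the File Access node wrote out
                sqlcmd -S MYSERVER -d MYDATABASE -E -i "D:\ipa\work\updates.sql" -b

                rem Run the extract script and write a clean CSV:
                rem   -h-1   suppress column headers
                rem   -s","  use a comma as the column separator
                rem   -W     remove trailing spaces
                sqlcmd -S MYSERVER -d MYDATABASE -E -i "D:\ipa\work\extract.sql" -o "D:\ipa\out\extract.csv" -h-1 -s"," -W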
            Randy
            Veteran Member
            Posts: 50

              Thought I responded, but haven't seen it pop up.

              I got tired of the Data Iterator while on a project that required reading a flat file into a custom table, working the data, and then exporting a CSV to be loaded back into Lawson GLTRANSREL. The approach I developed was to use the SQLTXN node with a SQL BULK INSERT statement to load the flat file, and then use a Windows SYSCMD node to call a batch file containing MS SQL Server SQLCMD calls that run a script file and output the data. The 10-15 minute Data Iterator read of the input dropped to 5 seconds using BULK INSERT, and the 10-15 minute extract dropped to 30 seconds using the SQLCMDs. The other beauty was the ability to stack the SQLCMDs in the batch file to produce multiple output files, and to use parameters for BOTH the batch file and the SQL scripts it calls.
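              A rough sketch of the kind of BULK INSERT statement that could sit in the SQLTXN node (the staging table name, file path, and terminators are placeholders and would need to match the actual flat file layout):

                  -- Load the flat file into a custom staging table in one pass
                  BULK INSERT dbo.GLTransStage
                  FROM 'D:\ipa\in\gltrans_input.txt'
                  WITH (
                      FIELDTERMINATOR = ',',   -- delimiter used in the flat file
                      ROWTERMINATOR   = '\n',  -- one record per line
                      FIRSTROW        = 2      -- skip a header row, if one exists
                  );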

              Ethan
              Advanced Member
              Posts: 28
                Randy - We ended up needing to do something similar. We were trying to use the Data Iterator to parse, format, and rewrite a 20k-line flat file, and the fastest we could get the LPA to run was about 1.5 hours. We rewrote the PFlow to insert into a custom table, format the output in the select, and export to CSV with SQLCMD. That got the processing time down to roughly 25 seconds.
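                As an illustration of formatting the output in the select (the table and column names here are hypothetical), the idea is to build each field as clean text in SQL so SQLCMD only has to write the rows out:

                    -- Convert dates and numbers to text and trim padding in the select
                    -- so the SQLCMD CSV output needs no further cleanup.
                    SELECT
                        CONVERT(varchar(10), PostingDate, 120) AS PostingDate,   -- yyyy-mm-dd
                        RTRIM(AccountNumber)                   AS AccountNumber,
                        CONVERT(varchar(20), Amount)           AS Amount
                    FROM dbo.GLTransStage
                    ORDER BY PostingDate, AccountNumber;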
                HDAustin13
                Advanced Member
                Posts: 31
                  Thanks for all of the valuable advice.
                  Ethan, I am currently taking the same approach you described: rewriting the flow to create a custom table (a temporary table), update it, and export the output to CSV, but I am still confused about how the File Access and SQLCMD nodes actually work together. Do you have a sample flow that you can share? Thanks.
                  Ethan
                  Advanced Member
                  Posts: 28
                    HDAustin13, I put together a sample flow. What is the best way to get it to you? Or would you prefer I just screenshot it and post it on the thread?
                    Shane Jones
                    Veteran Member
                    Posts: 460
                      We have a number of processes that work with File Access nodes. If you are sharing, I'd also like to see the flow that speeds up processing. Usually we don't care if a flow is a "little" slow because it runs in the evening, but that is a significant improvement...
                      Shane Jones
                      Tools: HR, Payroll, Benefits, PFI, Smart Office, BSI, Portal and Self-Service
                      Systems: Lawson, Open Hire, Kronos, Crystal Reporting, SumTotal Learning
                      ** Teach others to fish...
                      Chris
                      Advanced Member
                      Posts: 20

                        Ethan,

                         I am doing something similar and would love to see the example flow you came up with for this - do you still have it? Would you mind sharing it with me?

                        Many thanks,
                         Chris

                        Ethan
                        Advanced Member
                        Posts: 28

                          Shane, sorry I completely missed your post a few weeks ago. Chris and Shane, a very generic sample of what we did with SQL Server and IPA is attached. Please let me know if you have any questions.

                          Attachments