Import data from CSV file to an external Oracle table

HDAustin13
Advanced Member
Posts: 31

    Hello and Happy Friday, everyone!

    I am trying to import data from a CSV file (nothing custom-delimited, just a regular CSV file) into an external Oracle DB table by using the DataIterator with Reading by File, but it takes very long, around 45 minutes for just 800 records, versus about 2 minutes to finish in SQL Developer. Can someone please advise if there is a better way to do the import besides the DataIterator? Thanks in advance.

     

    David Williams
    Veteran Member
    Posts: 1127
      So you're getting CSV data out with your SQL query and parsing it to reload into another table via a SQL update? Why are you going through the DataIterator instead of loading the entire CSV data from the first query into the new table? Are you having to parse the data in some way? If you are, then run the CSV data through a JavaScript Expression (in the Assign node) to format it in such a way that it can be inserted into your other table.
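      As a rough illustration of that suggestion (not from the original post; the variable name csvLine is hypothetical), an Assign-node JavaScript expression along these lines turns one CSV record into a quoted value list that can be dropped into an insert statement:

          var fields = csvLine.split(",");          // csvLine assumed to hold one record, e.g. "1001,Smith,John"
          var values = [];
          for (var i = 0; i < fields.length; i++) {
              // double up single quotes so each value is safe inside a SQL string literal
              values.push("'" + fields[i].replace(/'/g, "''") + "'");
          }
          var rowValues = "(" + values.join(",") + ")";   // -> "('1001','Smith','John')"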
      David Williams
      HDAustin13
      Advanced Member
      Posts: 31

        Hello David! Thanks for the quick response.

        Well, the CSV data was parsed out from the API site (another IPA flow), and I am trying to import/load that data into an Oracle table. The DataIterator was suggested by a friend, but it does not seem to work well in this case. I know there is a better way to load the entire CSV file, but I just don't know how or where to start. Do you have a sample flow of the JavaScript Expression in the Assign node that you mentioned? If so, could you please share it? TIA.

        FireGeek21
        Veteran Member
        Posts: 84
          I load about 4,000 records to a SQL table and the whole process takes about a minute. I use the SQL Transaction node:

          truncate table [schema].[tablename]
          insert into [schema].[tablename]

          Iterate through your file to load records as a comma separated dataset. Then use the SQL Transaction node above.
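          A minimal sketch of that step, assuming the file contents were already read into a string variable called fileData (the names here are assumptions, not taken from FireGeek21's actual flow):

              var lines = fileData.split(/\r?\n/);       // one entry per CSV record
              var rows = [];
              for (var i = 0; i < lines.length; i++) {
                  if (lines[i] == "") { continue; }      // skip blank lines
                  var fields = lines[i].split(",");
                  var quoted = [];
                  for (var j = 0; j < fields.length; j++) {
                      quoted.push("'" + fields[j].replace(/'/g, "''") + "'");
                  }
                  rows.push("(" + quoted.join(",") + ")");
              }
              // DataSet ends up like "values ('a','b','c'),('d','e','f')" and is the
              // comma-separated dataset that goes with the insert in the SQL Transaction node
              var DataSet = "values " + rows.join(",");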
          FireGeek21
          Veteran Member
          Posts: 84

            Looks like my variable was cut out of the insert into line.  At the end of the line (after [tablename]) there should be a space and then the variable name where the dataset is stored.

            Sorry for the confusion.
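            To make that correction concrete (DataSet is a hypothetical variable name, and the exact way the SQL Transaction node substitutes variables is not shown in this thread), the statement the node ends up running should read roughly like this:

                // conceptual only: once the variable is resolved, the final text becomes
                //   insert into [schema].[tablename] values ('1001','Smith','John'),('1002','Doe','Jane')
                var statement = "insert into [schema].[tablename] " + DataSet;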

            HDAustin13
            Advanced Member
            Posts: 31

              Would you mind sharing a sample of it or a print screen of that flow? Thank you!

              HDAustin13
              Advanced Member
              Posts: 31

                I have tried the iterator to read the file with the SQL Transaction node, and the process ran successfully without any errors. However, it just inserted the word "null" instead of the data. I think I missed something in the SetFileName part but have not been able to figure it out yet. Do you have any idea what I might be missing? Thank you.
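                One possible explanation, offered only as a guess: if the variable that is supposed to hold the file data (or the current record) never gets populated, JavaScript string concatenation turns a null value into the literal text "null", which is then what gets inserted. It is worth checking that the variable name used in the Assign node matches the one the file/iterator node actually outputs.

                    var record = null;                   // e.g. the file/iterator output variable was never set
                    var row = "('" + record + "')";      // -> "('null')" -- the word null, not your data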

                 

                David Williams
                Veteran Member
                Posts: 1127
                  Email me a copy of your flow - david@consultdavidw.com
                  David Williams
                  FireGeek21
                  Veteran Member
                  Posts: 84
                    HDAustin, Sorry I have been a tad busy. I will get you a working flow. I hope to have something by the end of the day. You will be able to see how to adapt it to what you are doing.

                    Tammy
                    FireGeek21
                    Veteran Member
                    Posts: 84

                      Here is a print screen of my trimmed-down flow. This sample uses a .csv file with EMPLOYEE, LAST_NAME, and FIRST_NAME as the fields. I removed the true database name/table and reference DatabaseName, Schema, and TableName where you need to insert your own information.

                      I have uploaded the actual flow.  Hope it helps.
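                      For anyone who cannot open the attachment, here is a rough guess at what the Assign-node expression for that three-field sample might look like (the variable names are assumptions, not taken from the uploaded flow):

                          // fileData is assumed to hold the .csv contents, e.g.:
                          //   1001,Smith,John
                          //   1002,Doe,Jane
                          var lines = fileData.split(/\r?\n/);
                          var rows = [];
                          for (var i = 0; i < lines.length; i++) {
                              if (lines[i] == "") { continue; }
                              var f = lines[i].split(",");   // f[0]=EMPLOYEE, f[1]=LAST_NAME, f[2]=FIRST_NAME
                              rows.push("('" + f[0] + "','" + f[1] + "','" + f[2] + "')");
                          }
                          // referenced by the SQL Transaction node after
                          //   insert into Schema.TableName (EMPLOYEE, LAST_NAME, FIRST_NAME)
                          var DataSet = "values " + rows.join(",");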

                      HDAustin13
                      Advanced Member
                      Posts: 31

                        Thank you, FireGeek21, for sending the sample flow. I will try it and post an update soon.

                        HDAustin13
                        Advanced Member
                        Posts: 31

                          Hello FireGeek21,

                          Again, thanks for sharing the flow. It works great for me with the small file that has just a few columns/fields.

                          However, when I try to apply the same method to a bigger file that has 10 or more columns/fields, it does not seem to work right. I created additional variables for those fields, but I think I messed up the JavaScript DataArray in the Assign node, as I was confused about how it works, as seen below. Would you mind explaining a little more about these arrays in that node? Thanks in advance.

                          [screenshots of the Assign node omitted]

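                          On the wider-file question, and again only as a sketch with assumed names: the inner loop in the earlier sketch already handles any number of fields, so there is no need for one variable per column; the only things that have to change are the column list in the insert statement and, if values can contain commas, the way each line is split.

                              var fields = line.split(",");                 // line = one record, however many columns it has
                              var quoted = [];
                              for (var j = 0; j < fields.length; j++) {     // the same loop covers 3 fields or 13
                                  quoted.push("'" + fields[j].replace(/'/g, "''") + "'");
                              }
                              var row = "(" + quoted.join(",") + ")";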