Using Oracle tools to refresh Landmark data

This_Guy
Veteran Member
Posts: 93

    We have some pretty detailed steps to export/import our LM data from Source to Target, but we've only done it three times, so we're relatively new to the process. I chatted with my DBA, and we were curious whether anyone has insight into why no one seems to be using the Oracle tools (Data Pump) for this kind of refresh, the way we currently do with LSF.

    I noticed that daexport essentially pumps each BusinessClass/table out to its own CSV file, and that daimport mostly just switches the prodline (e.g., prodlm=devlm). Other than that, it sure seems like we could take advantage of the Oracle back-end tools to improve this process.
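    Just to show what we were picturing, here's a rough Data Pump sketch (the connect strings, the DP_DIR directory object, and the PRODLM/DEVLM schema names are placeholders for our environments, and I'm assuming the prodline lives in its own schema, which may not match everyone's setup):

        # Export the Landmark prodline schema from the source database
        expdp system@SRCDB schemas=PRODLM directory=DP_DIR \
              dumpfile=prodlm.dmp logfile=prodlm_exp.log

        # Import into the target after copying the dump file to the target's DP_DIR,
        # remapping the prodline schema (the prodlm=devlm switch, in effect)
        impdp system@TGTDB directory=DP_DIR dumpfile=prodlm.dmp \
              logfile=devlm_imp.log remap_schema=PRODLM:DEVLM \
              table_exists_action=replace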

    Any insight is most welcomed! 

    Woozy
    Veteran Member
    Posts: 709
      Hi This_Guy,

      My understanding is that database tools can't be used for Landmark data refreshes because of the heavy use of system-generated UniqueID values across Landmark for primary keys and relationships. If I recall correctly, the application itself creates and tracks those UniqueIDs, so if you just push the source environment's values straight into the tables, you end up with duplicate keys and other issues.
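      Just as a made-up illustration (the table, column, and ID values below are stand-ins, not real Landmark objects), the collision looks something like this:

          -- Hypothetical table only; Landmark's real structures differ.
          CREATE TABLE some_businessclass (
              uniqueid  NUMBER PRIMARY KEY,   -- application-generated surrogate key
              descr     VARCHAR2(60)
          );

          -- A raw database copy brings over the source environment's UniqueIDs...
          INSERT INTO some_businessclass (uniqueid, descr)
              VALUES (50001, 'row copied from source');

          -- ...but the target application still thinks its next UniqueID is 50001,
          -- so the next record it creates hits a duplicate key:
          INSERT INTO some_businessclass (uniqueid, descr)
              VALUES (50001, 'row created by the application');
          -- ORA-00001: unique constraint violated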

      It's a pain-in-the-posterior, but we've always been told that the Landmark utilities are the only option. I wish that wasn't the case.

      I'd love to hear from anyone who knows differently, or who can provide more clarity.

      Kelly
      Kelly Meade
      J. R. Simplot Company
      Boise, ID
      This_Guy
      Veteran Member
      Posts: 93

        Yeah, that's what I thought. It's funny, since we have the DBA roll the LSF refresh in at a specific time, and then I come in and do the LM refresh to keep everything in sync. Maybe there's a mystery way out there??!