How to import a Data Pump dump into Oracle OCI Autonomous Database

Posted on December 9, 2022 (updated April 8, 2024)

Documentation: https://docs.oracle.com/en/database/oracle/oracle-database/21/sutil/oracle-datapump-import-utility.html#GUID-BBB3CB15-B136-423F-B1EA-BF93A935793F

More information is available on oracle-base.

Prerequisites

  • a Data Pump export (dump file) from your source database
  • a host with sqlplus installed and able to connect to the Autonomous Database (try an OCI VM instance with the Oracle Cloud Developer Linux image; it comes with sqlplus and sqlcl pre-installed)
  • an Oracle OCI Object Storage bucket (upload your database dump to this bucket; the bucket can stay private, there is no need to make it public. Once uploaded, copy the link via the Object Details button; the link is needed later)
  • an Oracle OCI user who has MANAGE rights on Buckets/Objects for this bucket
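Once the host can reach the database, a quick sanity check from sqlplus confirms that the DBMS_CLOUD package (pre-installed on Autonomous Database and used in the steps below) is visible to your user:

```sql
-- Run after connecting with sqlplus: confirms DBMS_CLOUD is visible
select object_name, object_type, status
from   all_objects
where  object_name = 'DBMS_CLOUD'
and    object_type = 'PACKAGE';
```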

———————————————————————————-

Step 1: Create an Auth Token for the OCI user

Save the token somewhere safe (you can copy it only during creation; once created, it can NOT be displayed again).

———————————————————————————-

Step 2: Use sqlplus to log in to your Autonomous Database with the ADMIN account and run

begin
   dbms_cloud.create_credential(credential_name => 'OBJECT_STORE_CRED'
                               ,username        => '${your objectuser}'
                               ,password        => '${your auth token}');
end;
/
-- to list all created credentials
SELECT credential_name, username, comments FROM all_credentials;
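If the auth token is ever rotated, the credential can be dropped and recreated under the same name; a sketch using the documented DBMS_CLOUD API:

```sql
-- Drop the old credential before recreating it with the new token
begin
   dbms_cloud.drop_credential(credential_name => 'OBJECT_STORE_CRED');
end;
/
```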

———————————————————————————-

Step 3: create .par file

On the host machine, create a .par file, e.g. my_import.par, with the following content. (You can also work without a par file, but then you need to pass all parameters on the command line at once.)

Info about the parameters:

schemas is only needed if your export was made at schema level; remap_tablespace, remap_schema, content, exclude, table_exists_action, and logfile are all optional.

directory=DATA_PUMP_DIR    
credential=OBJECT_STORE_CRED
schemas=${SOURCE_SCHEMA}
remap_tablespace=${SOURCE_TABLESPACE}:${TARGET_TABLESPACE}
remap_schema=${SOURCE_SCHEMA}:${TARGET_SCHEMA}
dumpfile=${URL To your Dump file on your object storage}
content=DATA_ONLY
EXCLUDE=CONSTRAINT,REF_CONSTRAINT,INDEX 
TABLE_EXISTS_ACTION=[SKIP | APPEND | TRUNCATE | REPLACE]
LOGFILE=my_import_log.log

TABLE_EXISTS_ACTION=[SKIP | APPEND | TRUNCATE | REPLACE]

  • SKIP (the default): existing tables are left untouched and their rows are not loaded.
  • APPEND: loads rows from the export files and leaves existing target rows unchanged.
  • TRUNCATE: deletes existing rows in the target table and then loads rows from the export.
  • REPLACE: drops the existing table in the target, then recreates and loads it from the export.
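For reference, a filled-in example of the par file (all schema, tablespace, namespace, bucket, and file names below are made up):

```
directory=DATA_PUMP_DIR
credential=OBJECT_STORE_CRED
schemas=HR
remap_schema=HR:HR_COPY
remap_tablespace=USERS:DATA
dumpfile=https://objectstorage.eu-amsterdam-1.oraclecloud.com/n/mynamespace/b/DD/o/hr_export.dmp
content=DATA_ONLY
exclude=CONSTRAINT,REF_CONSTRAINT,INDEX
table_exists_action=TRUNCATE
logfile=my_import_log.log
```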

———————————————————————————-

Step 4: execute import command

Once you have the .par file, run the following (run it as the oracle OS user, and make sure the TNS_ADMIN environment variable is exported and points to your wallet folder):

impdp userid=ADMIN/${password}@${tns_name} parfile=${path to your .par file}

Before you run this, if you are doing a DATA_ONLY import you probably need to disable triggers and constraints first:

-- inspect the current constraints and triggers
select t.*
from   user_constraints t;

select *
from   user_triggers t;

-- disable all Foreign key constraints
declare
   l_sql clob;
begin
   for rec in (select t.constraint_name
                     ,t.table_name
               
               from   user_constraints t
               where  t.constraint_type in ('R')
               and    t.status = 'ENABLED'
               and    t.table_name not in ('flyway_schema_history'))
   loop
      l_sql := q'[ALTER TABLE ]' || rec.table_name || ' disable CONSTRAINT ' || rec.constraint_name;
     -- dbms_output.put_line(l_sql);
      execute immediate l_sql;
   end loop;
   for rec in (select t.constraint_name
                     ,t.table_name
               
               from   user_constraints t
               where  t.constraint_type in ('P'
                                           ,'U')
               and    t.status = 'ENABLED'
               and    t.table_name not in ('flyway_schema_history'))
   loop
      l_sql := q'[ALTER TABLE ]' || rec.table_name || ' disable CONSTRAINT ' || rec.constraint_name;
     -- dbms_output.put_line(l_sql);
      execute immediate l_sql;
   end loop;

end;
/




-- Enable all constraints
declare
   l_sql clob;
begin
   -- first primary key and unique key
   for rec in (select t.constraint_name
                     ,t.table_name
               
               from   user_constraints t
               where  t.constraint_type in ('P'
                                           ,'U')
               and    t.status = 'DISABLED'
               and    t.table_name not in ('flyway_schema_history'))
   loop
      l_sql := q'[ALTER TABLE ]' || rec.table_name || ' enable CONSTRAINT ' || rec.constraint_name;
      --dbms_output.put_line(l_sql);
      execute immediate l_sql;
   end loop;

   for rec in (select t.constraint_name
                     ,t.table_name
               
               from   user_constraints t
               where  t.constraint_type in ('R')
               and    t.status = 'DISABLED'
               and    t.table_name not in ('flyway_schema_history'))
   loop
      l_sql := q'[ALTER TABLE ]' || rec.table_name || ' enable CONSTRAINT ' || rec.constraint_name;
      --dbms_output.put_line(l_sql);
      execute immediate l_sql;
   end loop;

end;
/

-- disable all triggers
declare
   l_sql clob;
begin
   for rec in (select t.trigger_name
               from   user_triggers t
               where  t.status = 'ENABLED')
   loop
      l_sql := 'ALTER TRIGGER ' || rec.trigger_name || ' DISABLE';
      --dbms_output.put_line(l_sql);
      execute immediate l_sql;
   end loop;
end;
/
-- enable all triggers
declare
   l_sql clob;
begin
   for rec in (select t.trigger_name
               from   user_triggers t
               where  t.status = 'DISABLED')
   loop
      l_sql := 'ALTER TRIGGER ' || rec.trigger_name || ' ENABLE';
      --dbms_output.put_line(l_sql);
      execute immediate l_sql;
   end loop;
end;
/
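While the import is running, you can watch its progress from a second sqlplus session via the standard Data Pump dictionary view (ADMIN has the required privilege):

```sql
-- Shows running and recently finished Data Pump jobs
select owner_name, job_name, operation, job_mode, state
from   dba_datapump_jobs;
```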

———————————————————————————-

Step 5: access the log file

As long as your connection holds, the import should finish with a log file even if errors occur during the import. You can find the log file by running this SQL:

   select *
   from   dbms_cloud.list_files('DATA_PUMP_DIR')
   order  by last_modified desc;

Store your log file in OCI Object Storage

Run this PL/SQL code to upload the log file to your bucket so that you can download it and view it in detail:


begin
   dbms_cloud.put_object('OBJECT_STORE_CRED'
                        ,'${URL}'
                        ,'DATA_PUMP_DIR'
                        ,'${file name you see in previous sql}');
end;
/

e.g. ${URL} = 'https://objectstorage.eu-amsterdam-1.oraclecloud.com/n/aosdfasdf/b/DD/o/${anyname}.log'
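The same API also works in the other direction: DBMS_CLOUD.GET_OBJECT copies a file from the bucket into DATA_PUMP_DIR, e.g. to fetch a dump or an older log back onto the database side (the URL below is a made-up example):

```sql
begin
   dbms_cloud.get_object(credential_name => 'OBJECT_STORE_CRED'
                        ,object_uri      => 'https://objectstorage.eu-amsterdam-1.oraclecloud.com/n/mynamespace/b/DD/o/my_import_log.log'
                        ,directory_name  => 'DATA_PUMP_DIR');
end;
/
```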

Troubleshooting

If you get ORA-28759: failure to open file:

Check the wallet location value in the sqlnet.ora file in the /opt/oracle/network/admin folder (or wherever your TNS_ADMIN points).
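For reference, the sqlnet.ora shipped in the Autonomous Database wallet zip looks roughly like this; the directory must be the folder you unzipped the wallet into (the path below is an example):

```
WALLET_LOCATION = (SOURCE = (METHOD = file) (METHOD_DATA = (DIRECTORY = "/opt/oracle/network/admin")))
SSL_SERVER_DN_MATCH = yes
```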
