I have assessed my schema with 100% success, what can I do next ?

Accepted Solutions (1)
Solution
Once you have assessed your schema with 100% success, you can export the script using EXPORT DDL. You may choose any of the following options to deploy the script and populate data:
- Execute the script in an on-premises database to create the objects. Verify the objects in the schema and, on successful verification, move data from Oracle to EDB Postgres using the EDB Migration Toolkit.
- Transport and execute the exported script in a database on an EDB Postgres Ark instance to create the objects. (EDB Postgres Ark is a DBaaS for hybrid cloud deployments of Postgres to public and private clouds on Amazon, Azure, OpenStack, and other deployment platforms.) Verify the objects in the schema and, on successful verification, move data from Oracle to EDB Postgres using the EDB Migration Toolkit. The EDB Migration Toolkit supports data migration in both online and offline mode, so you may migrate data in offline mode, transport the data files to the Ark instance, and execute them in PEM or EDB-PSQL to populate the objects.
Deployment Example
The following example uses EDB-PSQL to deploy the exported DDLs for the HR schema:
edb=# \i /exported_ddls_folder/hr_schema/ProjHR_hr.sql
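After deployment, you can verify that the objects were created before moving data, as noted in the options above; a minimal sketch using standard psql meta-commands, assuming the schema is named hr:
edb=# \dt hr.*
edb=# \dv hr.*
edb=# \ds hr.*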
Data Migration Examples in Online Mode
The following are examples of data migration in online mode:
- JDBC COPY API
./runMTK.sh -sourcedbtype oracle -targetdbtype enterprisedb -targetSchema HR -dataOnly -truncLoad HR
- -copyViaDBLinkOra option
The dblink_ora module provides Advanced Server-to-Oracle connectivity at the SQL level. dblink_ora is bundled and installed as part of the Advanced Server database installation. The -copyViaDBLinkOra option uses dblink_ora's COPY API method to transfer data between databases, which is considerably faster than the JDBC COPY method.
./runMTK.sh -sourcedbtype oracle -targetdbtype enterprisedb -targetSchema HR -copyViaDBLinkOra -dataOnly -truncLoad HR
- SQL INSERT statements
The -safeMode option transfers data using ordinary SQL INSERT statements, one per row in the table, instead of the COPY API.
./runMTK.sh -sourcedbtype oracle -targetdbtype enterprisedb -dataOnly -truncLoad -safeMode HR
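After an online migration completes, you can spot-check that the data arrived in the target database; a minimal sketch, assuming the HR schema contains an employees table (adjust the table name to your own schema):
edb=# SELECT count(*) FROM hr.employees;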
Data Migration Examples in Offline Mode
The following are examples of data migration in offline mode:
- .cpy (copy) files
To generate the .cpy (copy) files, run the following command:
./runMTK.sh -sourcedbtype oracle -targetdbtype enterprisedb -targetSchema HR -dataOnly -offlineMigration /data_folder/hr_schema/ HR
Note: Only one file per table is generated.
- SQL INSERT statements
To generate the data as SQL INSERT statements, run the following command:
./runMTK.sh -sourcedbtype oracle -targetdbtype enterprisedb -targetSchema HR -dataOnly -safeMode -offlineMigration /data_folder/hr_schema/ HR
Note: Only one file per table is generated.
- -singleDataFile option
To generate a single file that contains the data from all tables, run the following command:
./runMTK.sh -sourcedbtype oracle -targetdbtype enterprisedb -targetSchema HR -dataOnly -safeMode -singleDataFile -offlineMigration /data_folder/hr_schema/ HR
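If the target database runs on a remote host, such as the EDB Postgres Ark instance mentioned above, the generated files must be transported there before they can be loaded; a minimal sketch using scp, with a hypothetical host name and destination path:
scp /data_folder/hr_schema/*.cpy user@ark-host:/data_folder/hr_schema/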
Inserting Data into Tables in EDB Postgres
- COPY Command
The following example uses EDB-PSQL and the COPY command to load the generated .cpy data file into the hr.locations table:
edb=# COPY hr.locations FROM '/data_folder/hr_schema/mtk_hr_locations_data.cpy';
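If the data file resides on the client machine rather than on the database server, psql's \copy meta-command reads the file from the client side instead; a minimal sketch, assuming the same file path on the client:
edb=# \copy hr.locations FROM '/data_folder/hr_schema/mtk_hr_locations_data.cpy'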
- SQL File
The following example uses EDB-PSQL to execute the SQL file containing the generated INSERT statements:
edb=# \i /data_folder/hr_schema/mtk_hr_locations_data.sql
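The same file can also be executed non-interactively from the shell; a minimal sketch, assuming the edb-psql client is on the PATH and the target database is named edb (the client name and connection options may differ in your installation):
edb-psql -d edb -U enterprisedb -f /data_folder/hr_schema/mtk_hr_locations_data.sql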