
Data migration and data transformation – SQL Server to OCI Autonomous Database using ODI Web Edition

This post demonstrates how to move data from an on-premises Microsoft SQL Server database to an Oracle Autonomous Database hosted on Oracle Cloud Infrastructure (OCI) using Oracle Data Integrator (ODI) Web Edition, which is provisioned via the stack available on Oracle Cloud Marketplace.

Along with migrating the data from SQL Server 2018 to Oracle Database 19c (the source table has 164,574 rows), we also use ODI to transform the data.

The following transformations are performed while migrating the data from SQL Server to Oracle.

The source SQL Server table has a column named “TIMESTAMP” (an Oracle reserved word), which we will rename to “TIMESTAMP_NEW” as part of the data migration exercise. In addition, we will increase the size of some source SQL Server VARCHAR(200) columns to VARCHAR2(2000) on the target Autonomous Database.
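As a rough sketch of the before/after shape (the column names and data types other than TIMESTAMP/TIMESTAMP_NEW are hypothetical, since the full table definition is not shown in this post):

```sql
-- Hypothetical illustration only: apart from the TIMESTAMP column and the
-- VARCHAR(200) -> VARCHAR2(2000) widening, the column list is invented.

-- Source shape on SQL Server:
--   CREATE TABLE dbo.T_OPERATION (
--     DESCRIPTION VARCHAR(200),
--     [TIMESTAMP] DATETIME      -- bracketed because the name clashes with a keyword
--   );

-- Target shape on the Autonomous Database after the transformation:
CREATE TABLE T_OPERATION_NEW (
  DESCRIPTION   VARCHAR2(2000),  -- widened from VARCHAR(200)
  TIMESTAMP_NEW DATE             -- renamed from "TIMESTAMP"; data type assumed
);
```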

Connect to the Oracle Cloud Marketplace and select the Data Integrator: Web Edition (Versions prior to May 2022) stack

Click on Launch Stack

For the Data Transforms Repository Location, select the option to Create a new Repository in an Autonomous Database

Provide the compartment, VCN and Subnet information

Select the appropriate node shape for the VM, provide the SSH key and also make a note of the VNC password

Provide the Autonomous Database ADMIN account password and make a note of the SUPERVISOR password, as we will need it to connect to ODI Web.

The ODI repository created in the Autonomous Database contains a number of schemas. The schema prefix used in this case is ODIWEB2022, so after the installation completes we will see schemas such as ODIWEB2022_ODI_REPO, ODIWEB2022_WLS, ODIWEB2022_OPSS and ODIWEB2022_STB in the ODI repository database.

Review the information provided to create the stack and then click on Create

Verify the ODI installation by connecting to the Autonomous Database that hosts the ODI repository, as well as to the compute node VM created by the deployment of the Oracle Cloud Marketplace ODI Web Edition stack.

SQL> select distinct owner from dba_tables where owner like 'ODIWEB2022%';



[root@oracle-odi-inst-mk0p logs]# vi odiConfigure.log

Database details:
Connect Descriptor                           : (description= (retry_count=20)(retry_delay=3)(address=(protocol=tcps)(port=1522)(host=DOHADW001.adb.ap-sydney-1.oraclecloud.com))(connect_data=(service_name=g6da9c2218ad114_dohadw001_low.adb.oraclecloud.com))(security=(ssl_server_cert_dn=CN=adb.ap-sydney-1.oraclecloud.com, OU=Oracle ADB SYDNEY, O=Oracle Corporation, L=Redwood City, ST=California, C=US)))
Connected As                                 : admin
Prefix for (prefixable) Schema Owners        : ODIWEB2022
RCU Logfile                                  : /tmp/RCU2022-10-31_07-02_422714987/logs/rcu.log

Component schemas created:
Component                                    Status         Logfile

Common Infrastructure Services               Success        /tmp/RCU2022-10-31_07-02_422714987/logs/stb.log
Oracle Platform Security Services            Success        /tmp/RCU2022-10-31_07-02_422714987/logs/opss.log
Master and Work Repository                   Success        /tmp/RCU2022-10-31_07-02_422714987/logs/odi.log
Audit Services                               Success        /tmp/RCU2022-10-31_07-02_422714987/logs/iau.log
Audit Services Append                        Success        /tmp/RCU2022-10-31_07-02_422714987/logs/iau_append.log
Audit Services Viewer                        Success        /tmp/RCU2022-10-31_07-02_422714987/logs/iau_viewer.log
WebLogic Services                            Success        /tmp/RCU2022-10-31_07-02_422714987/logs/wls.log

Repository Creation Utility - Create : Operation Completed
Loading KM : LKM SQL to SQL (Built-In), Base KM : null
Loading KM : LKM SQL to Oracle (Built-In), Base KM : LKM SQL to SQL (Built-In)
Loading KM : LKM File to Oracle (Built-In), Base KM : LKM SQL to Oracle (Built-In)
Loading KM : LKM SQL to Hive (Built-In), Base KM : LKM SQL to SQL (Built-In)
Loading KM : LKM Oracle to Oracle (Built-In), Base KM : LKM SQL to Oracle (Built-In)
Loading KM : XKM SQL Extract, Base KM : null
Loading KM : IKM SQL Load, Base KM : null
Loading KM : IKM SQL Insert, Base KM : IKM SQL Load
Loading KM : IKM SQL Update, Base KM : IKM SQL Load
Loading KM : IKM SQL Merge, Base KM : IKM SQL Load
Loading KM : IKM SQL Multi-Connect Merge, Base KM : IKM SQL Merge
Loading KM : IKM No-op, Base KM : null
Loading KM : XKM SQL Join, Base KM : null
Loading KM : XKM SQL Filter, Base KM : null
Loading KM : LKM Oracle to Oracle (DB Link), Base KM : null

[root@oracle-odi-inst-mk0p logs]# netstat -an |grep 9999
tcp6       0      0 :::9999                 :::*                    LISTEN

Launch Oracle Data Integrator web edition.

Since we are connecting via a bastion using SSH port forwarding, we are not providing the hostname or IP address of the ODI host.

The ODI web console listens on port 9999, so with the SSH tunnel in place the URL points at the locally forwarded port rather than the remote host.

Provide the SUPERVISOR password.

Create a connection to the source on-premises SQL Server database.

We need to provide the user name and password for the SQL Server instance, as well as the port number on which the instance accepts connections.

Create a connection to the target Oracle OCI Autonomous Database.

Download the wallet from the Autonomous Database and provide the password for the ADMIN user.

Click on the Microsoft SQL Server connection which we have just created.

Click on Projects.

Create a new project called DEMO.

Click on Create.

As part of this data migration/data transformation exercise, we will be working with a SQL Server database table called T_OPERATION.

Click on the project DEMO and then click on Import Data Entities.

A job will be created for the above task and we can monitor the job execution and also view the results.

After the job to import data entities has completed, we edit the data entity T_OPERATION and increase the size of some of the columns from VARCHAR(200) to VARCHAR(2000).
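Although the resize is done here through the ODI data entity editor rather than SQL, the equivalent change expressed as Oracle DDL on the target side would be along these lines (MY_COLUMN is a stand-in for the actual column names, which are not listed in this post):

```sql
-- Hypothetical: MY_COLUMN stands in for each column being widened.
ALTER TABLE T_OPERATION_NEW MODIFY (MY_COLUMN VARCHAR2(2000));
```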

Click on Create Data Flow

Select the entity T_OPERATION and drag it to the panel on the right.

Click on T_OPERATION in the panel and from the Attributes menu we can see the column names of the table as well as the data types and lengths.

We can see that the columns we changed earlier now show as VARCHAR(2000) (previously VARCHAR(200)).

We are now going to add a data transform operation – click on the “+” icon in the panel.

Add a Data Entity called T_OPERATION_NEW.

The connection is to the target Oracle Autonomous Database which we had created earlier.

Click on Next and we can now edit the target table column definitions.

We are changing the source table column name from “TIMESTAMP” to “TIMESTAMP_NEW”.

Save the changes we have just made.

The transformation is now configured. Note the “Transfer” showing on the panel.

Click on the Execute button to start the Data Transfer job

Click on Start

Click on the job name link to track the progress of the job.

In under a minute we can see that 164,574 rows have been loaded into the target T_OPERATION_NEW table. ODI also creates a temporary staging table for the data transformation job, which is cleaned up afterwards.

Verify the table on the Autonomous Database. We can see the columns which we modified to VARCHAR2(2000) by the ODI data transformation job are present as well as the column which we had renamed to TIMESTAMP_NEW.
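One way to confirm the column changes from SQL, assuming we are connected as the schema owner (DEMO in this example), is to query the data dictionary:

```sql
-- List column names, types and lengths for the migrated table
SELECT column_name, data_type, data_length
FROM   user_tab_columns
WHERE  table_name = 'T_OPERATION_NEW'
ORDER  BY column_id;
```

The widened columns should report a length of 2000 here.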

Complete the data transformation exercise on the target Autonomous Database.

SQL> conn demo/******@dohadw001_high

SQL> select table_name from user_tables;


SQL> select count(*) from T_OPERATION_NEW;


SQL> alter table T_OPERATION_NEW rename column TIMESTAMP_NEW to TIMESTAMP;

Table altered.


Table renamed.
Updated on March 4, 2023
