Thursday, October 24, 2013

Migrating Stored Procedures from One DBMS to Another



Convert an Informix Schema to an Oracle Schema (or Any Other RDBMS):
SQL does not differ much from one RDBMS to another, so converting a schema between products is straightforward, especially for simple datatypes. The rest of a conversion is the real work: administration tasks such as backup, integrity, resource management, and configuration, plus the migration of applications, many of which use proprietary features such as stored procedure languages and special data types.
Standard RDBMS features, naturally, port easily. I tend to design with as few proprietary features as possible, because long-lived processes, applications, and data stores will inevitably be ported to something else over time.
Here are some scripts that translate Informix to Oracle. The same scripts can be adapted for Oracle to PostgreSQL or MySQL, Oracle to Informix, or MySQL to PostgreSQL. The basic idea: dump the SQL data definition language (DDL), rewrite it as DDL for the new RDBMS, unload the data, create the tables in the new RDBMS, and load the data into them.
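To make that concrete, here is the sort of rewrite the scripts perform on one table's DDL; the customer table is a made-up example:

Informix DDL, as dumped by dbschema:

create table customer
  (
    customer_num serial not null,
    fname char(15),
    balance money(8,2)
  );

The same DDL after conversion to Oracle:

create table customer
  (
    customer_num number not null,
    fname varchar2(15),
    balance number(8,2)
  );

Note that a serial column's auto-increment behavior does not survive the datatype swap; in Oracle it must be rebuilt with a sequence.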
########### cut here ###############
#!/bin/bash
# filename: conversion_config.bsh
# Configuration for conversion.

# List of tables to convert, can be lots or few.

TABLE_LIST='
A_table_name
B_table_name'

AWK="/usr/bin/gawk"
SCHEMA_FILE="schema.sql"
TABLE_DATA_PATH="table_data" # Directory for unloaded table data.
mkdir -p $TABLE_DATA_PATH    # -p: safe to re-run if the directory exists.
TABLE_CONTROL_FILE_PATH="control_file" # Directory for Oracle sqlldr control files.
mkdir -p $TABLE_CONTROL_FILE_PATH
LOAD_LOG_PATH="load_log" # Directory for table loading logs.
mkdir -p $LOAD_LOG_PATH

########### cut here ###############

#!/bin/bash
# filename: create_schema.bsh
# Run the Oracle schema script, then load the tables with Oracle's loader (sqlldr).
# Note: make sure Oracle environment variables are set.

# Config
. ./conversion_config.bsh
LOADER="sqlldr"
# For Informix to Oracle date formats:
export NLS_LANG='american_america.us7ascii' # Oracle juju for date format.
export NLS_DATE_FORMAT='MM/DD/YYYY' # Set Oracle to USA Informix date format.

# Create Oracle tables
sqlplus << EOF
user_name/password
@${SCHEMA_FILE}

EOF

# Load tables into Oracle
# sqlldr writes rejected records to a $table.bad file; check each load log for details.
for table in $TABLE_LIST
do
   ${LOADER} userid=user_name/password control=${TABLE_CONTROL_FILE_PATH}/$table.ctl log=${LOAD_LOG_PATH}/$table.log
done

#################### cut here ######################################

#!/bin/bash
# filename: get_schema.bsh
# Get the Informix schema and convert it to Oracle DDL, then
# unload the Informix tables to pipe-delimited ${table}.out files
# and build a sqlldr control file for each table.
# Note: make sure Informix environment variables are set.

. ./conversion_config.bsh

# Get Informix schema DDL.
get_table_schema () {
for table in ${TABLE_LIST}
do
   echo "drop table $table;"
   dbschema -d database_name -t $table
done
}

# Remove extra text from Informix schema.
remove_dbschema_header () {
grep -v 'DBSCHEMA Schema Utility'|
grep -v 'Copyright (C) Informix Software'|
grep -v 'Software Serial Number'|
grep -v '{ TABLE'|
grep -v ' }'
}

# Convert Informix datatypes to Oracle
convert_datatypes () {
${AWK} '
/ serial/ {gsub(" serial", " number")}   # Note: recreate auto-increment with an Oracle sequence.
/ money/ {gsub(" money", " number")}
/ integer/ {gsub(" integer", " number")}
/ decimal/ {gsub(" decimal", " number")}
/ smallint/ {gsub(" smallint", " number")}
/ char/ {gsub(" char", " varchar2")}
/ informix/ {gsub("\"informix\"\\.", "")} # Remove the owner qualifier from the DDL.
/ revoke all/ {next}  # Skip permission statements.
{print}'
}

# List the column names for $table (set by the calling loop).
get_table_columns () {

dbaccess database_name << EOF | grep -v '^$'
   output to pipe "cat" without headings
   select colname from syscolumns, systables
   where
   systables.tabname = "$table"
   and systables.tabid = syscolumns.tabid;
   -- order by colno; uncomment if the columns come back out of table order.

EOF
}

# Informix unload: write each table's rows, pipe-delimited, to a .out file.
unload_tables () {

for table in ${TABLE_LIST}
do
dbaccess database_name << EOF

unload to "${TABLE_DATA_PATH}/${table}.out"
select * from $table;

EOF
done
}

# Create Oracle control files.
make_control_file () {
for table in ${TABLE_LIST}
do
cat << EOF > $TABLE_CONTROL_FILE_PATH/$table.ctl

load data
infile '${TABLE_DATA_PATH}/${table}.out'
into table $table
fields terminated by "|"
EOF

  echo '(' >> $TABLE_CONTROL_FILE_PATH/$table.ctl
  # Join the column names into a comma-separated list for sqlldr.
  COLUMNS=$(get_table_columns)
  echo $COLUMNS |
  ${AWK} '{ gsub (" ",","); print}' >> $TABLE_CONTROL_FILE_PATH/$table.ctl
  echo ')' >> $TABLE_CONTROL_FILE_PATH/$table.ctl
done
}

#################
# Main

get_table_schema |
remove_dbschema_header |
convert_datatypes > $SCHEMA_FILE

unload_tables

make_control_file
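
Run get_schema.bsh first, with the Informix environment set; it writes the converted schema to schema.sql and fills the table_data and control_file directories. Then run create_schema.bsh, with the Oracle environment set, to create the tables and load the data.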


Moving data by using distributed SQL
If you need to move data across platforms with different binary page formats and page sizes, and you have expertise with distributed SQL, you can use INSERT and SELECT statements to transfer the data.
Important: Do not use INSERT and SELECT statements to move data if the database contains BLOB data types.
Prerequisites: A network connection must exist between database server instances.
To move data using INSERT and SELECT statements with fully qualified table names:
  1. Capture the complete database schema from the source database server.
  2. Alter the extent sizing and, if necessary, change table lock modes from page to row (example statements follow this list).
  3. Create and verify the schema on the target database server.
  4. Disable logging on both source and target servers where necessary.
  5. Create and run the following scripts:
    1. Separate scripts for:
      • Disabling select triggers on the source server
      • Disabling indexes, triggers, and constraints for each table on the target database server
    2. One script per table for the fully qualified INSERT and SELECT statements. For example:
INSERT INTO dbname@target:owner.table SELECT *
FROM dbname@source:owner.table
You can run the scripts in parallel, and for larger tables you can create multiple scripts that partition the table so the pieces run in parallel (see the sketch after this list).
  6. Create and run separate scripts that re-enable indexes, triggers, and constraints for each table.
  7. Run UPDATE STATISTICS on the system catalog tables and on stored procedures and functions on the target database server.
  8. Adjust the starting values of all serial columns on the target database server (example statements follow this list).
  9. Turn transaction logging back on for the source and target database servers.
  10. Return the source and target database servers to multi-user mode.
  11. Validate the data that was transferred to the target database server.
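
Here is a minimal sketch of step 5's partitioned copy scripts, assuming a hypothetical customer table with a numeric customer_num key and the dbname@server:owner names from the example above; the key ranges are made up:

########### cut here ###############
#!/bin/bash
# filename: parallel_copy.bsh
# Illustrative sketch: copy one table in key ranges so the pieces
# run in parallel. Table, column, and ranges are placeholders.
for range in "0 99999" "100000 199999" "200000 299999"
do
   set -- $range   # $1 = low key, $2 = high key.
   dbaccess dbname@source << EOF &
insert into dbname@target:owner.customer
select * from dbname@source:owner.customer
where customer_num between $1 and $2;
EOF
done
wait   # Let every range finish before re-enabling indexes.
########### cut here ###############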
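
And for steps 2, 7, and 8, the Informix statements look roughly like this; the customer table, customer_num column, and the starting value 100000 are placeholders:

-- Step 2: switch a table from page-level to row-level locking.
alter table customer lock mode (row);

-- Step 7: refresh optimizer statistics and reoptimize stored routines.
update statistics high;
update statistics for procedure;

-- Step 8: set a serial column's next value above the copied rows.
alter table customer modify (customer_num serial(100000));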