
Use Apache DdlUtils and Scriptella to migrate databases to CUBRID

The scope of this document is to provide some guidelines to migrate databases to CUBRID, using Scriptella and Apache DdlUtils, both open source tools.

The idea is to combine these tools together to get optimal results in terms of:

  • Extracting the source database schema and importing the schema into the target CUBRID database
  • Transferring the data between the databases (and applying data transformations, if needed)


DdlUtils, an Apache DB project, provides utilities for creating and working with Database Definition (DDL) files and for exporting and importing data in XML format via Ant tasks or programmatically in Java.

We will use DdlUtils to export/import the database schema.

Scriptella is an open source ETL (Extract-Transform-Load) and script execution tool, which can be used to export and import database data. In addition, Scriptella provides excellent support for data transformation, which can prove very useful when migrating a database.

In our scenario, we will use Scriptella for transferring the data between the databases. As both these tools provide support for Ant, we will use Ant scripts for the operations required.


First, you need to install the required software:

  • Download CUBRID and follow the installation instructions.
  • Install Java JDK latest release.
  • Install Apache Ant. Set the ANT_HOME environment variable and add the Ant /bin folder to your PATH.
  • Download DdlUtils JAR library for CUBRID.
    • Alternatively, you can manually compile it. For that, download and install the latest version of Apache DdlUtils. Then apply the CUBRID patch (download it from here). Finally, compile.
  • Create the CUBRID target database. This step cannot be done automatically from the tools; therefore it must be executed as part of the environment preparation.

In addition to this setup, depending on your source database, you will need to:

  • Have the specific JDBC driver available.
  • Have access to the source database. Make sure you have the appropriate user permissions.

The next step is to create a working directory, where you will create the scripts, generate the files, and so on.


We have a database (source database) that we need to migrate to CUBRID (target database). The source database has support for JDBC and is one of the supported databases for both Scriptella and Apache DdlUtils.

In order to perform the migration, we need to:

  • Step 1: Extract the schema for the objects in the source DB
  • Step 2: Create the objects in the target DB, using the previously exported definitions (Step 1)
  • Step 3: Transfer the data between the databases. In our scenario, the transfer will be done directly, without using intermediate files.
  • Step 4: Test the target DB. This is not covered in this document, but it is an important step to perform in order to make sure that the process completed successfully!

While it is theoretically possible to perform all these steps in a single script, this is unlikely in practice, because of the specific differences between the databases. After you create the schema in the target database, and before you transfer the data, you will probably want to perform some schema-specific operations:

  • Adjust some column data types to better match CUBRID
  • Perform specific CUBRID optimizations
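Since DdlUtils writes the extracted schema to a plain XML file, such adjustments can be made by editing that file before running the import. A sketch of what an adjusted excerpt might look like (the table and column names here are hypothetical examples, not part of any real schema):

```xml
<!-- Excerpt of a DdlUtils schema file (db-schema.xml); the table and
     column names are hypothetical examples. -->
<database name="sourcedb">
  <table name="Product">
    <column name="id" type="INTEGER" primaryKey="true" required="true"/>
    <column name="category" type="VARCHAR" size="50"/>
    <!-- e.g. give a column whose original type maps poorly an explicit
         VARCHAR size before importing the schema into CUBRID -->
    <column name="product_name" type="VARCHAR" size="255"/>
  </table>
</database>
```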

To summarize, we will have to prepare two Ant scripts:

  • One script to:
    • Extract the source database schema
    • Create the target CUBRID schema
  • Another script to:
    • Transfer the data to the CUBRID target database

Scripts Preparation

The first Ant script we need to prepare will perform the following operations:

  • Extract the source database schema; we will use DdlUtils.
  • Create the target CUBRID schema; we will use DdlUtils.

Let's name this Ant script file "Step1.xml".

The script will contain two targets, ExportDDL and ImportDDL:

<target name="ExportDDL">
  <taskdef name="databaseToDdl"
           classname="org.apache.ddlutils.task.DatabaseToDdlTask">
    <classpath refid="runtime-classpath"/>
  </taskdef>
  <delete file="${outputfile-schema}" verbose="true"/>
  <databaseToDdl verbosity="debug">
    <database url="<provide your source db specific info>"
              driverClassName="<provide your source db specific info>"
              username="<provide your source db specific info>"
              password="<provide your source db specific info>"/>
    <writeSchemaToFile outputFile="${outputfile-schema}"/>
  </databaseToDdl>
</target>

<target name="ImportDDL">
  <taskdef name="ddlToDatabase"
           classname="org.apache.ddlutils.task.DdlToDatabaseTask">
    <classpath refid="runtime-classpath"/>
  </taskdef>
  <ddlToDatabase schemaFile="db-schema.xml">
    <database url="<provide your target db specific info>"
              driverClassName="<provide your target db specific info>"
              username="<provide your target db specific info>"
              password="<provide your target db specific info>"/>
    <writeSchemaToDatabase dodrops="false"/>
  </ddlToDatabase>
</target>

The code listed above is only a generic example of the relevant DdlUtils part of the Ant script file. You will need to customize it to match your setup.
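The two targets also assume a surrounding <project> element that defines the runtime-classpath reference and the ${outputfile-schema} property. A minimal sketch of that wrapper (the library directory and file names are assumptions to adjust to your setup):

```xml
<!-- Hypothetical Step1.xml skeleton; paths and names are assumptions -->
<project name="MigrateSchema" default="ExportDDL" basedir=".">
  <path id="runtime-classpath">
    <!-- DdlUtils, its dependencies, and the JDBC drivers -->
    <fileset dir="../lib" includes="*.jar"/>
  </path>
  <property name="outputfile-schema" value="db-schema.xml"/>
  <!-- the ExportDDL and ImportDDL targets go here -->
</project>
```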

The second Ant script file will be named "Step2.xml".

The script will contain just one (Scriptella) target: TransferData.

The script itself will contain a reference to an external Scriptella script file, "etl.xml".

<target name="TransferData">
  <taskdef resource="antscriptella.properties">
    <classpath>
      <pathelement location="../lib/scriptella.jar"/>
    </classpath>
  </taskdef>
  <etl debug="true">
    <fileset file="etl.xml"/>
  </etl>
</target>
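As with the first script, this target assumes a surrounding <project> element; a minimal sketch (the project name is an assumption):

```xml
<!-- Hypothetical Step2.xml skeleton -->
<project name="MigrateData" default="TransferData" basedir=".">
  <!-- the TransferData target goes here -->
</project>
```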


The "etl.xml" script will contain the calls required to transfer (and transform, if needed) the data between the databases. For example, to transfer a Product table, the script could contain instructions like these:

<query connection-id="in">
  SELECT * FROM Product WHERE category='software';
  <script connection-id="out">
    INSERT INTO Product (id, category, product_name) VALUES (?id, ?{category}, ?name);
  </script>
</query>
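The "in" and "out" connection ids used in the query and script elements must be declared at the top of "etl.xml". A sketch of those declarations (the source driver, URLs, and credentials are placeholders; the CUBRID host, port, and database name shown are assumptions to replace with your own):

```xml
<!DOCTYPE etl SYSTEM "http://scriptella.javaforge.com/dtd/etl.dtd">
<etl>
  <connection id="in" driver="<provide your source JDBC driver class>"
              url="<provide your source JDBC URL>"
              user="<user>" password="<password>"/>
  <!-- CUBRID JDBC driver; host/port/database name are assumptions -->
  <connection id="out" driver="cubrid.jdbc.driver.CUBRIDDriver"
              url="jdbc:cubrid:localhost:33000:targetdb:::"
              user="dba" password=""/>
  <!-- query/script pairs like the one above follow here -->
</etl>
```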

Again, you will have to customize the scripts to match your specific environment.

We do not intend to provide the complete listings of these scripts here, nor to duplicate the documentation for DdlUtils and Scriptella. For the details you need, please see the online documentation available for both Apache DdlUtils and Scriptella.

Execute the Scripts

Now it's time to run the scripts we prepared. Make sure the JDBC drivers, as well as the DdlUtils and Scriptella libraries, are on the CLASSPATH (or that their locations are specified in the script files).

Execute the following command at the command line:

ant -f "Step1.xml" ExportDDL ImportDDL

Verify that the script execution completed without errors.

The next thing to do is to log in to CUBRID (using CUBRID Manager, for example) and:

  • Verify that the objects were created
  • Perform any adjustments required to the schema
  • Perform any optimizations regarding the objects definition

Then execute the second script:

ant -f "Step2.xml" TransferData

Verify that the script execution completed without errors. Depending on the size of the data to be imported, it might take some time before it completes.

Once all this is completed, your source database objects and data will have been migrated to CUBRID.

Remember, you will need to perform some testing to verify the migration process.

Post-migration Operations

The tools cannot help with the migration of some specific objects, like stored procedures, for example. This means that, depending on your specific case, you might need to perform some extra steps to ensure a complete database migration.

Make sure you take into account all the objects you have in your source database and re-create them in the CUBRID database as well, in order to have a complete migration.

Getting Help

If you have any questions, post them on the CUBRID Apps & Tools forum.
