
Migrating an Oracle 12c Development Database from Virtualbox to Docker – step by step guide

7/24/2023 Bartosz Jędrzejczak


At Clurgo, we are big enthusiasts of running databases as Docker containers in development and test environments. We also recognize the transformative potential of cloud solutions, particularly in long-running projects with complex data. Today, we explore the migration of a large Oracle database from VirtualBox to Docker so that it can run in the cloud. This strategy addresses some very specific challenges and results in more streamlined, efficient resource management.

The Clurgo team working for a client in the database management industry faced such a task. The client's local database, running on VirtualBox, was created from a production database with anonymized data. This setup had several drawbacks:

  • VirtualBox consumed a lot of resources during regular operation, especially CPU and RAM,
  • The database took up about 400 GB of disk storage,
  • Testing was slow as a result, with runs taking 25-40 minutes.

Solution to these problems? Run the tests in the cloud! Find out how we did it.

Choosing the best cloud technology

First of all, when looking for the most efficient way to solve such problems, you should consider solutions built on the infrastructure you already have. Since our client stored their code in Oracle's Visual Builder Studio, we looked only at options compatible with this tool. VBS is similar to GitLab and integrates easily with Oracle Cloud Infrastructure, whose build pipelines dynamically create Linux servers on which Maven, Docker, and other tools can be installed. The result of the build is visible in a merge request.
Migrating the database from VirtualBox to Docker and starting a fresh container from a prepared database image for each run seemed like the logical solution. I considered using Testcontainers, but decided against it because we have two repositories from which we need to run database migrations.
It was also clear that the database could not stay as large as the original, due to limited resources in the cloud. An additional goal was therefore to shrink the database so it contained only the necessary data. It was also worth checking whether a small database could replace the large VirtualBox one in local development.

Migrating the database from VirtualBox to Docker step by step

We migrated the database using the procedure described below:

  • Creating a clean database image on Docker

The repository at https://github.com/oracle/docker-images will help you create a Docker container. You need to download the database installer corresponding to the version of the database you want to create.
The following example shows the creation of database version 12.2.0.1.

To do this, place the installer in the folder:

{projectDirectory}/OracleDatabase/SingleInstance/dockerfiles/12.2.0.1/

To build the database image, run the script:

./buildContainerImage.sh -v 12.2.0.1 -t generated-oracle-12c -e

The script verifies the checksum of the installer to make sure the supplied installer has not been modified. Before running the script, you can adjust the database configuration files, for example to define your own user, password, or database SID.
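Once the image is built, you can start a container from it. Below is a minimal sketch, assuming the image was tagged generated-oracle-12c as above; the container name oracle-dev is our own choice, and the ORACLE_SID, ORACLE_PDB, and ORACLE_PWD variables come from the oracle/docker-images README and should match your configuration:

docker run --name oracle-dev \
  -p 1521:1521 \
  -e ORACLE_SID=dev \
  -e ORACLE_PDB=docker_dev \
  -e ORACLE_PWD=password \
  generated-oracle-12c

The first startup creates the database and can take a long time; later starts reuse the existing data files.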

  • Exporting the database structure from VirtualBox

Once you have prepared a clean database, you can migrate the existing database into it. Oracle ships additional database management tools such as SQL*Plus and Oracle Data Pump (the expdp command for exporting data and impdp for importing it). You can use these tools to export the structure and data of the old database and import them into the new one.

How to do that? First, create a directory object named test_dir that points to a folder on the server, so that the database can access the selected files, and grant read and write permissions on it. Note that the directory object is needed in both the source and the target database; the example below does this for the docker_dev container, so adjust the container name and path when running it on the VirtualBox database:

sqlplus / as sysdba << EOF
alter session set container = docker_dev;
CREATE OR REPLACE DIRECTORY test_dir AS '/opt/oracle/scripts/startup/files/';
GRANT READ, WRITE ON DIRECTORY test_dir TO system;
exit
EOF

It is good practice to first export only the metadata of the existing database – tablespaces, schemas/users, tables, indexes, constraints, etc. – without the data:

expdp system/password@dev full=Y directory=test_dir dumpfile=dmpdevfull.dmp logfile=expdpdevfull.log content=METADATA_ONLY

This command contains the user (system) and password for connecting the tool to the database with the given SID (dev). test_dir is the name of the directory object we defined earlier. The exported metadata will be placed in the dmpdevfull.dmp file, and the run logs in expdpdevfull.log. Both files are created under the path that the test_dir directory object points to. All of the above parameters should be adjusted to match the database being exported.

Copy the dmpdevfull.dmp file to a location that the clean database can access, and then configure a test_dir directory object in the new database pointing to the folder containing dmpdevfull.dmp, as sketched below.
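With Docker, copying the dump file into the container and creating the directory object could look like this (a sketch assuming the container is named oracle-dev, as in the earlier run example, and that the target folder already exists in the container):

docker cp dmpdevfull.dmp oracle-dev:/opt/oracle/scripts/startup/files/

docker exec -i oracle-dev sqlplus / as sysdba << EOF
alter session set container = docker_dev;
CREATE OR REPLACE DIRECTORY test_dir AS '/opt/oracle/scripts/startup/files/';
GRANT READ, WRITE ON DIRECTORY test_dir TO system;
exit
EOF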

To minimize the space allocated by the import tool, I recommend creating the tablespaces manually. Below is an example command:

CREATE TABLESPACE MY_TABLESPACE DATAFILE 'MYTAB01.dbf' SIZE 1M REUSE AUTOEXTEND ON NEXT 10M;

Next, it is a good idea to import the user accounts. Do it with the command:

impdp system/password@docker_dev directory=test_dir dumpfile=dmpdevfull.dmp include=user

After importing the users, you can proceed with the remaining metadata:

impdp system/password@docker_dev full=Y directory=test_dir dumpfile=dmpdevfull.dmp logfile=impdpdevfull.log TABLE_EXISTS_ACTION=REPLACE content=METADATA_ONLY transform=disable_archive_logging:Y,SEGMENT_ATTRIBUTES:N exclude=statistics,user

This way, you have prepared a database with the same structure as the one you exported from.
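After the metadata import, it is also worth checking whether the import left any invalid objects and recompiling them. A quick, optional sanity check (a sketch; utlrp.sql is Oracle's standard bulk-recompile script):

sqlplus / as sysdba << EOF
alter session set container = docker_dev;
-- count objects left invalid by the import
SELECT COUNT(*) FROM dba_objects WHERE status = 'INVALID';
-- recompile invalid objects in bulk
@?/rdbms/admin/utlrp.sql
exit
EOF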

Migrating the data

If you also need to migrate specific data, use this command to export it:

expdp system/password@dev directory=test_dir dumpfile=inst.dmp logfile=inst.log content=DATA_ONLY schemas=INST include=TABLE:\"IN \(\'INSTALL_LOG\'\)\"

The command above exports data from the INST schema, from the INSTALL_LOG table only.
Similarly, you can import the data into the new database:

impdp system/password@docker_dev directory=test_dir dumpfile=inst.dmp logfile=inst.log content=DATA_ONLY

It may turn out that you need to temporarily disable constraints before importing the data; after the import, re-enable and validate them. One way to do this is sketched below.
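There is no single global switch for this; one common approach is to generate the ALTER statements from the data dictionary. A sketch for the foreign key constraints of a single schema (INST, as in the export example); spool the output and run it:

SELECT 'ALTER TABLE ' || owner || '.' || table_name ||
       ' DISABLE CONSTRAINT ' || constraint_name || ';'
FROM dba_constraints
WHERE owner = 'INST' AND constraint_type = 'R';

To re-enable the constraints after the import, generate the same statements with ENABLE VALIDATE in place of DISABLE.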

After importing the data, you can try to reduce the size of the database container by shrinking the temporary tablespace, which grew during the data import.
Do it with this command:

ALTER DATABASE TEMPFILE '/opt/oracle/oradata/dev/docker_dev/temp01.dbf' RESIZE 10M;
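The TEMPFILE path must point to the DBF file of the temporary tablespace and match your environment (the one above comes from our setup). If you are unsure where the temp file lives, you can look it up first:

sqlplus / as sysdba << EOF
alter session set container = docker_dev;
-- list temp files with their current sizes
SELECT file_name, ROUND(bytes/1024/1024) AS size_mb FROM dba_temp_files;
exit
EOF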

From a container prepared this way, you can create an image with the command:

docker commit <CONTAINER_ID> <IMAGE_NAME>
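For example, with our assumed container name (the image tag is arbitrary):

docker commit oracle-dev generated-oracle-12c:migrated

The resulting image can then be pushed to a registry and started by the CI pipeline that runs the integration tests.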

Conclusions after database migration

A database prepared this way works well both for running integration tests in the cloud and for running the application in a local environment. The Oracle Data Pump tools have many additional parameters that let you tailor the commands to your needs.
The presented method works well for migrating a development database from VirtualBox to Docker. We do not recommend using it for a production database, as there is no guarantee that, with large volumes of data, everything will be migrated correctly.


Do you face challenges related to database migration or need a hand with choosing the right cloud solution? Contact our experts!
