Upgrading to 12.2 in the Cloud — Tales from the Battlefield

    By: Michael Gangler on Jan 17, 2019

    By Mike Gangler, team lead at Secure-24 and Oracle ACE, and Curtis Cristodero, technical team lead for Microsoft SQL Server at Secure-24 | Edited by Michelle Malcher

    The technical process of moving databases to the cloud is no different from the process of moving databases to other servers, which has been practiced for many years. As many people have said, moving to the cloud is just leasing someone else’s hardware. This article will help you understand what it takes to move to the cloud and demonstrates some common errors that may occur when moving databases to a cloud provider.

    5 Cloud Misconceptions 

    There are many incorrect presumptions surrounding cloud computing that can cause issues and anxiety during pre- and post-move activities. Below are just a few I have experienced and want to share.

    1. It is simple.

    Cloud providers will make it sound like all you have to do is press one or two buttons and the databases are installed and configured. This is the most common misconception. Moving databases and applications is NOT simple, and there are many details associated with a move to the cloud. Items such as networks, firewalls, ports, application versions, OS versions, tnsnames.ora files, and database links are just a few of the things that have to be architected, discovered, and figured out BEFORE you move to any cloud provider.

    2. It is autonomous.

    Many cloud providers offer autonomous APIs to start/stop databases or create maintenance and performance tasks. Cloud providers will claim that their platforms require no human intervention, or that many tasks can be done by non-technical staff. However, even with autonomous APIs, a technical person is required on the other side of these tasks to decide when to run them. For example, do they shut down the system if a full backup is running? When are the backups running, or are there month-end or year-end processes running? These items, plus many others, need to be considered before pressing buttons. Also, only a few autonomous APIs are currently available, so most of the work still requires a technical person to log into the machines and analyze the issues.

    3. You don’t need DBAs.

    This claim has been phrased and rephrased since Oracle version 9, and it couldn’t be further from the truth. It is only true if you have purchased premium support from your cloud provider, which includes DBAs and admins. As with the autonomous item, you will need experienced DBAs and admins to perform the many non-routine tasks associated with your application(s) and to know, for example, when to re-index your application based on import schedules. Your DBAs may not be performing mundane tasks like building databases or rebuilding indexes, but they will be required for analyzing data, analyzing application performance, and recommending database tasks based on that analysis. Please remember that the data is still yours, and your DBAs and application teams understand that data in a way the cloud providers do not.

    4. It is cheaper than on-premises computing.

    In many ways, going to the cloud will save on data center, resource, and machine costs. However, it can cost you in the long run if you don’t manage your cloud usage. With most cloud providers, you pay for putting data into the cloud, taking it out, and the amount of resources you utilize. Exceptions include Oracle Cloud and Azure, where you pay only for the data you use or take out (egress). Also, most cloud providers bill on a monthly usage basis, so there are no fixed costs unless you negotiate them yourself. Many private cloud providers offer fixed costs and only change them with customer approval. Lastly, agile development tends to increase costs unless managed by the customer: databases tend to grow significantly in agile environments, which can lead to high costs per hour. One company didn’t manage those costs and was paying anywhere from $184 to $200 per hour.

    5. Cloud computing is outage-free.

    Unfortunately, as in any data center, stuff happens, and most cloud providers have experienced outages. Please remember that data centers are run by people, and as long as people work on computers there will always be a chance of outages.

    Cloud Realities

    Once you know the misconceptions of the cloud, it’s important to understand the realities of moving databases and applications to any cloud provider.

    1. Fix your process inefficiencies.

    Although your applications and databases are moving, your processes will stay the same (with a few added steps). If your processes are inefficient, moving to the cloud will expose those inefficiencies even more. Review all your processes that touch applications and databases prior to moving to the cloud. Examples of processes to review include database and application deployments, change control, and security. These are just a few that can affect your cloud deployment and ongoing operations.

    2. Verify and check the GUI interfaces.

    Most cloud providers offer GUI interfaces to manage application and database resources. The customer will have to work with the provider to understand the limitations of the interface. You may still need to connect via SSH, Telnet, RDP, or other remote server tools. Hopefully you will have the choice of a GUI that provides all the necessary capabilities.

    3. Cloud computing and agile are a perfect fit.

    If all your processes are efficient and you have a process in place to manage resources, then agile seems to be a perfect fit for cloud computing. Agile requires quick deployment of database and machine resources. Although cloud computing checks many boxes for agile development, please consider actively monitoring cloud usage and resources, as they can grow out of control.

    4. The data is yours.

    Please remember the data is still the customer’s (your) responsibility. Depending on the subscription model, you will still have to manage your application (i.e., add users, tables, indexes, etc.). Many cloud providers offer application support services for products such as Hyperion and SAP, and even DBA services, but those typically incur extra costs.

    5. Application security is dictated by the customer.

    Even though data center and infrastructure security may be provided by the cloud provider, the customer and application team dictate the level of security required at the application and database level. Most cloud providers have good security infrastructure tools in place, but you will need to specify the application and database security based on any federal or regulatory requirements.

    Common Challenges

    Below are a few challenges I’ve observed that others should be aware of:

    • Network latency: This is the most common challenge when moving databases between sites. It has caused delays in most migrations, and it is wise to work closely with your network teams to resolve any issues prior to the cutover. Our company performs many tests prior to cutover to verify network speeds and data movement.
    • Datatype differences: When moving data from appliances like Exadata to non-Exadata systems, you will most likely have datatypes that are NOT supported on the destination databases. This may require tables or objects to be exported and re-imported during the cutover.
    • Large databases (> 15 TB): When moving large databases, you need to be creative to move them while still fitting within a downtime window. There are many options, and Mike Dietrich of Oracle has many listed on his blog.
    • Cutover times: Most companies limit the downtime allowed to perform cutovers or switchovers, regardless of the size of the databases. Due to this constraint, the options are limited; those options are presented later in this article.
    • Homegrown applications: Many applications are “homegrown” and contain obsolete code that is no longer available in Oracle 12 (e.g., wm_concat). Also, many homegrown applications don’t use primary keys. This only affects you if you are using logical replication to replicate data from your source to the cloud provider (e.g., Streams or GoldenGate).
    • Multiple tnsnames.ora files: Often the application and users have multiple tnsnames.ora files, and database names can be accessed by multiple different aliases. This can cause many applications to fail if the aliases are not found and consolidated into a common tnsnames.ora file.
    • Application dependencies: Many applications have dependencies such as hard-coded IP addresses in database links or application code. These will cause the cutover to fail when moving to a cloud provider.
    • Bugs in older versions: If you are migrating from older versions, you may run into bugs that restrict your choice of cutover or application options. For example, some older Oracle versions can’t perform network exports or imports due to a bug (fixed in later versions).
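
    As an example of remediating obsolete homegrown code, calls to the undocumented wm_concat function can usually be rewritten with the supported LISTAGG function before migrating. A minimal sketch, assuming a hypothetical emp table with deptno and ename columns:

```sql
-- Old call to the undocumented function, removed in Oracle 12c:
--   SELECT deptno, wm_concat(ename) FROM emp GROUP BY deptno;

-- Supported replacement using LISTAGG (available since 11g Release 2):
SELECT deptno,
       LISTAGG(ename, ',') WITHIN GROUP (ORDER BY ename) AS enames
FROM   emp
GROUP  BY deptno;
```

    Unlike wm_concat, LISTAGG also lets you control the ordering of the concatenated values explicitly.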

    Migration Choices

    • Export/Import: Usually used for smaller databases and when you have good network speeds. Not a good fit for high network latencies or large databases (> 500GB). You may also have issues with missing synonyms and database links, and if you move data multiple times.
    • Database Upgrade Assistant (DBUA): Upgrade the database prior to moving it, then move it after the upgrade. This option is good for smaller databases and good network speeds.
    • Transportable tablespaces: Good if applications are tied to one tablespace, but you still have to move the data. Needs good network speeds; good for applications like Hyperion that use one tablespace per application.
    • RMAN: This is the most-used method to migrate data but does require additional storage. If the database is greater than one terabyte (1TB), you will have long delays and build times. You may have to set up a standby database using RMAN weeks prior to cutover due to the time it takes to move and set up.
    • Third-party tools: There are many tools available to migrate data: GoldenGate, DBVISIT, and Streams, to name a few. Our company has used all three, and each has its place in migrations: GoldenGate for logical replication, DBVISIT for physical replication (when Enterprise Edition is not available), and Streams for a small set of tables.
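
    As a rough sketch of the RMAN option, the initial backup and standby creation might look like the following (the staging path and options shown are placeholders, not a definitive procedure):

```
# On the source: take a Level 0 incremental backup plus archived logs
RMAN> BACKUP INCREMENTAL LEVEL 0 DATABASE PLUS ARCHIVELOG;

# After copying the backup pieces to the destination host,
# build the standby directly from the staged backup
RMAN> DUPLICATE DATABASE FOR STANDBY
        BACKUP LOCATION '/backup/stage'
        NOFILENAMECHECK;
```

    From there, redo can be shipped and applied until the cutover window, which keeps the actual cutover short.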

    Upgrade and Migration Strategies

    Planning and pre-testing migrations are key to all successful migration projects. Many times, pre-testing cutovers will uncover issues you didn’t catch at first glance. Pre-testing the cutovers allows you to test the planned steps, determine timings, and surface any anomalies not previously discovered.

    Our team has moved and upgraded many databases to cloud environments, as well as to other data centers. Our most successful approach is to use RMAN and start the setup weeks prior to cutover. The following general steps apply to Enterprise Edition databases (or, for Standard Edition licenses, to tools like DBVISIT) and presume upgrading an earlier release to Oracle 12.2.

    • Take an RMAN Level 0 backup of the source database.
    • Copy the RMAN backup (L0) to the destination. Cloud providers offer many options to speed this up (including storage devices and dedicated networks).
    • Set up a standby database on the destination and start applying logs.
    • Run the pre-upgrade script to determine the changes required during the upgrade.
    • Perform test migrations prior to the official cutover to verify the steps and the time required.
    • Cut over to the cloud.
    • Apply logs and open the database in upgrade mode.
    • Upgrade the database using catctl.pl or DBUA (GUI).
    • Recompile objects, validate, and test.
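
    The pre-upgrade check and command-line upgrade steps above are typically run along these lines (a sketch only; the ORACLE_HOME_12_2 variable is a placeholder for your new 12.2 home):

```
# Run the pre-upgrade checks against the running database
$ java -jar $ORACLE_HOME_12_2/rdbms/admin/preupgrade.jar TERMINAL TEXT

# After opening the database in upgrade mode under the new home,
# run the parallel upgrade utility
$ cd $ORACLE_HOME_12_2/rdbms/admin
$ $ORACLE_HOME_12_2/perl/bin/perl catctl.pl catupgrd.sql
```

    DBUA wraps these same steps in a GUI if you prefer a guided upgrade.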

    Again, for additional strategies, follow Mike Dietrich's Oracle upgrade blog, where he offers other migration strategies with minimal downtime, including for databases greater than 100TB.

    Summary and Lessons Learned

    Below is a recap you can use as a quick guide for some of the lessons learned in this article.

    • Plan and test the cutover process: This helps with timings and discoveries.
    • Network latency: Network issues have occurred in every cloud migration project, so a good working relationship with the network team is crucial.
    • Firewall ports/SSH: Make sure all firewall ports are open for SSH and connections to the cloud from your source.
    • Save prior statistics: These will help when troubleshooting performance issues.
    • Review new database features: Determine the options available in the new database release.
    • Check parameters: Validate for outdated and obsolete parameters.
    • Check components: Validate mandatory and invalid components and remove obsolete ones.
    • Spend time: Understanding your current environment, applications, and processes takes time up front.
    • Service level agreements: Understand the cloud provider’s SLAs.
    • Create processes: Define a process for managing resources in the cloud (monthly reports, requesting databases, etc.).
    • Plan: Plan for and expect issues with the migration to the cloud, and allow time to make the migration happen.
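
    For the “save prior statistics” item, optimizer statistics can be preserved before the upgrade with the DBMS_STATS package; a minimal sketch (the schema name APPUSER and the staging table name are placeholders):

```sql
-- Create a staging table and export the schema's current optimizer statistics
BEGIN
  DBMS_STATS.CREATE_STAT_TABLE(ownname => 'APPUSER', stattab => 'STATS_PRE_122');
  DBMS_STATS.EXPORT_SCHEMA_STATS(ownname => 'APPUSER', stattab => 'STATS_PRE_122');
END;
/
```

    If post-upgrade performance regresses, DBMS_STATS.IMPORT_SCHEMA_STATS can restore the saved statistics from the same staging table.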

    Independent Oracle Users Group
    330 N. Wabash Ave., Suite 2000, Chicago, IL 60611
    phone: 312-245-1579 | email: ioug@ioug.org
