DBA Blogs

Moving to https://mattypenny.github.io/

Matt Penny - Sun, 2020-01-05 08:10

Much as I like WordPress, I’m moving all of this stuff over to: https://mattypenny.github.io/


Categories: DBA Blogs

Configure Distribution Service between two Secure GoldenGate Microservices Architectures

DBASolved - Mon, 2019-12-30 14:46

Once you configure an Oracle GoldenGate Microservices environment to be secure behind the Nginx reverse proxy, the next thing you have to do is tackle how to connect one environment to the other using the Distribution Server.  In using the Distribution Server, you will be creating what is called a Distribution Path. Distribution Paths are […]

The post Configure Distribution Service between two Secure GoldenGate Microservices Architectures appeared first on DBASolved.

Categories: DBA Blogs

Installing Nginx

DBASolved - Sat, 2019-12-28 22:03

With Oracle GoldenGate Microservices, you have the option of using a reverse proxy or not.  In reality, it is a best practice to install the recommended reverse proxy for the architecture.  The main benefit here is the security aspect of using it.  In Oracle GoldenGate Microservices, depending on the number of deployments you have per […]

The post Installing Nginx appeared first on DBASolved.

Categories: DBA Blogs

Use Conda to Generate Requirements.txt for Docker Containers

Pakistan's First Oracle Blog - Fri, 2019-12-27 00:01
pip is a standard package manager: a requirements.txt file can be generated in one environment and then installed by pip in a new environment. Conda, by contrast, replicates its own installation. pip produces a list of the packages that were installed on top of the standard library to make your package work.

Following are the steps to generate a requirements.txt file to be used inside a Dockerfile for Docker containers:

Go to your project environment: conda activate

conda list gives you the list of packages used in the environment.

conda list -e > requirements.txt saves all the info about the packages to your folder.

conda env export exports the full environment, including channels, as YAML; redirect it to a .yml file.

pip freeze is an alternative that lists pip-installed packages from inside the environment.
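One caveat worth sketching: the file produced by `conda list -e` uses `name=version=build` lines, while pip expects `name==version`, so a conda-exported file usually needs converting before `pip install -r requirements.txt` inside the container will accept it. A minimal converter (the sample package lines are assumptions, not real output from any particular environment):

```python
# Sketch: convert "conda list -e" output (name=version=build, plus "#"
# comment lines) into pip-style "name==version" requirements.

def conda_to_pip(lines):
    """Convert 'conda list -e' lines to pip requirement strings."""
    reqs = []
    for line in lines:
        line = line.strip()
        if not line or line.startswith("#"):
            continue  # skip comments and blank lines
        parts = line.split("=")
        if len(parts) >= 2:
            reqs.append(f"{parts[0]}=={parts[1]}")
    return reqs

if __name__ == "__main__":
    sample = [
        "# platform: linux-64",
        "numpy=1.21.2=py39hdbf815f_0",
        "requests=2.26.0=pyhd3eb1b0_0",
    ]
    print(conda_to_pip(sample))
```

Writing the converted list to requirements.txt lets the Dockerfile stay a plain `COPY requirements.txt .` followed by `RUN pip install -r requirements.txt`.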

Hope that helps.
Categories: DBA Blogs

Identity wallets used by Oracle GoldenGate Microservices

DBASolved - Tue, 2019-12-24 13:54

Wallets, Wallets, and more wallets! … Wallets are used for a few different things within Oracle GoldenGate Microservices; identifying what wallet is being used by a service is simple if you know where to look.   The biggest usage of wallets within Oracle GoldenGate Microservices is to help secure the communication between the Distribution Service […]

The post Identity wallets used by Oracle GoldenGate Microservices appeared first on DBASolved.

Categories: DBA Blogs

Merry Christmas and Happy New Year !! (“Heroes”)

Richard Foote - Mon, 2019-12-23 16:01
I would like to take this opportunity to wish all my readers a very Merry Christmas and a most happy, peaceful and prosperous New Year. My gift this year is not David Bowie and Bing Crosby doing their famous Christmas duet but a performance by Bowie of his classic “Heroes” as featured on the same […]
Categories: DBA Blogs

Comparing two tables of the same name in different schemas, with no primary key and different numbers of columns

Tom Kyte - Fri, 2019-12-20 08:55
We have a table 'CUSTOMER' in two different schemas. Neither has a primary key, and the column counts do not match (i.e. the schema1 table can have 97 columns while the other schema's table has 101). The column names that are present are the same in ...
Categories: DBA Blogs

PL/SQL code in packages on the DB vs. on APEX sites - how does it affect performance?

Tom Kyte - Fri, 2019-12-20 08:55
Dear Tom, when you develop in APEX you can move all your PL/SQL code into packages, or you can put it all in APEX. How does this affect performance? I know that it is better to move code into packages to make quick changes and have more control over the code. ...
Categories: DBA Blogs


Tom Kyte - Fri, 2019-12-20 08:55
Requirement: if the table has no records, they need to be inserted; if records already exist, only the END DATE column needs updating, adding a number of years based on the Term (for example, if the Term is 10 years, then 10*12 = 120 months, meaning 10 years need to be added to the END DATE col...
Categories: DBA Blogs

Merge Delete

Tom Kyte - Fri, 2019-12-20 08:55
How do I delete with a merge statement? I want to update MERGE_DELETE_TEST2 to match MERGE_DELETE_TEST1 (think ETL). I cannot get merge delete to remove the row that exists in MERGE_DELETE_TEST2 that does not exist in MERGE_DELETE_TEST1.
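For context, Oracle's MERGE can only delete rows that are matched by the source (the DELETE WHERE clause inside WHEN MATCHED), which is exactly why a row present only in the target is never removed. A sketch, assuming the two tables from the question share ID and VAL columns (those names are assumptions):

```sql
-- MERGE can delete only rows that match a source row, so a row in
-- MERGE_DELETE_TEST2 with no counterpart in MERGE_DELETE_TEST1 is
-- never touched by the DELETE clause.
MERGE INTO merge_delete_test2 t
USING merge_delete_test1 s
ON (t.id = s.id)
WHEN MATCHED THEN
  UPDATE SET t.val = s.val
  DELETE WHERE s.val IS NULL        -- deletes only matched rows flagged in the source
WHEN NOT MATCHED THEN
  INSERT (id, val) VALUES (s.id, s.val);

-- Removing target rows absent from the source needs a separate DELETE:
DELETE FROM merge_delete_test2 t
WHERE NOT EXISTS (SELECT 1 FROM merge_delete_test1 s WHERE s.id = t.id);
```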
Categories: DBA Blogs

Import Production Dump to new schema

Tom Kyte - Fri, 2019-12-20 08:55
Hi, We have a live project which has only 1 schema. We are upgrading some features and so we are going to release it as version 2.0 . My question is that the production dump which is only one schema has to be to imported to 4 different schema o...
Categories: DBA Blogs

Check Constraints and Explain Plan Filter Predicates

Tom Kyte - Fri, 2019-12-20 08:55
Why does the Oracle SQL Optimizer include a filter predicate for a Check Constraint when generating an execution plan for a SELECT statement? If the constraint is valid (according to DBA_CONSTRAINTS), then the table rows are all compliant with the co...
Categories: DBA Blogs

Dynamic filters and arriving bind variables for them.

Tom Kyte - Fri, 2019-12-20 08:55
Team, we have an application that searches using any kind of filter on any columns - something like below. The procedure returns the result set to the application, based on the WHERE clause passed as input. When running...
Categories: DBA Blogs

Error while relocating database service

Tom Kyte - Fri, 2019-12-20 08:55
Hello, Ask Tom Team. My environment: I have a database running on a 2-node RAC. I created a database service with TAF and Transaction Guard: srvctl add service -db dbprod -service dbprod1_xa -preferred dbprod1 -available dbprod2 -fail...
Categories: DBA Blogs

London March 2020: “Oracle Indexing Internals and Best Practices” and “Oracle Performance Diagnostics and Tuning” Seminars

Richard Foote - Thu, 2019-12-19 02:06
Places are filling up, but there are still some available at both of my acclaimed seminars that I’ll be running in London, UK in March 2020. The dates and registration links are as follows: 23-24 March 2020: “Oracle Indexing Internals and Best Practices” seminar – Tickets and Registration Link 25-26 March 2020: “Oracle Performance Diagnostics and […]
Categories: DBA Blogs

Datapump Import Partitioned Tables ORA-00600 qesmaGetPamR-NullCtx

Bobby Durrett's DBA Blog - Wed, 2019-12-18 10:28

I have not yet had time to build a test case and prove this out, but I wanted to document one last bug that we have found so far in our upgrade to 19c. We tried copying a bunch of partitioned tables on our source database to the new one using Datapump Import (impdp) over a database link. We got a boatload of errors like this:

ORA-00600: internal error code, arguments: [qesmaGetPamR-NullCtx], 

There are many Oracle bugs like this, but they appear to have been fixed already. For example:

Bug 12591399 – ORA-600[qesmagetpamr-nullctx] / ORA-14091 with distributed query with local partition table (Doc ID 12591399.8)

Puzzling. We ended up just exporting to disk and that has worked well, so no big deal, but I wonder if this is some sort of regression of a fixed bug.

Anyway, I am off for the rest of the year. This should be my last post unless I mess with Nethack over vacation and post something about that. I hope everyone out there has a good new year.


P.S. Created a simple partitioned table with 2 partitions and 100 rows in each one. I got the error importing over a link into 19c. It worked perfectly going to an earlier release (same source table). Parfile:

$ cat bobby_link_test.par


  PART_COL              NUMBER,
  data                  NUMBER

P.P.S. Works fine going to 18c. Going to try a different 19c database just to be sure it isn’t this one that has the problem.

P.P.P.S. Definitely a 19c bug. It fails on two different 19c databases but not on 18c. In every case the source is the same database and the same small partitioned table. Does anyone have time to file the bug report?

Categories: DBA Blogs

Oracle Database 19c Automatic Indexing – Indexed Column Reorder (What Shall We Do Now?)

Richard Foote - Tue, 2019-12-17 18:49
  I previously discussed how the default column order of an Automatic Index (in the absence of other factors) is based on the Column ID, the order in which the columns are defined in the table. But what if there are “other factors” based on new workloads and the original index column order is no […]
Categories: DBA Blogs

db_securefile PREFERRED results in ORA-60019 with small uniform extents

Bobby Durrett's DBA Blog - Tue, 2019-12-17 17:35

Last 19c upgrade issue. Working on our new 19c database, several things died off with errors like this:


ERROR at line 1:
ORA-60019: Creating initial extent of size 14 in tablespace of extent size 8
ORA-06512: at "SYS.DBMS_STATS", line 20827
ORA-06512: at "SYS.DBMS_STATS", line 20770
ORA-06512: at "SYS.DBMS_STATS", line 20765
ORA-06512: at line 1

Our tablespaces had small uniform extents and our 19c database had defaulted the parameter db_securefile to PREFERRED. We bumped our uniform extent sizes up to 1 megabyte and the problem went away. Setting db_securefile to PERMITTED also resolved the issue.
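The two workarounds above can be sketched like this (the tablespace name and datafile path are assumptions, not from the post):

```sql
-- Workaround 1: use 1M uniform extents so securefile LOB segments
-- have room for their initial extent.
CREATE TABLESPACE app_data
  DATAFILE '/u01/oradata/app_data01.dbf' SIZE 100M
  EXTENT MANAGEMENT LOCAL UNIFORM SIZE 1M;

-- Workaround 2: stop forcing securefile LOBs, allowing basicfile again.
ALTER SYSTEM SET db_securefile = PERMITTED SCOPE = BOTH;
```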

Oracle’s support site has a bunch of good information about this. This might be a relevant bug:



Categories: DBA Blogs

Datapump Import Fails on Tables With Extended Statistics

Bobby Durrett's DBA Blog - Tue, 2019-12-17 17:11

Quick post before I leave on vacation. We used Datapump to import a schema from an 11.2 HP-UX database to a 19c Linux database and got errors on a few tables like these:

ORA-39083: Object type TABLE:"MYSCHEMA"."TEST" failed to create with error:
ORA-00904: "SYS_STU0S46GP2UUQY#45F$7UBFFCM": invalid identifier

Failing sql is:

The workaround was to create the table first, empty with no indexes, constraints, etc., and then import. Today I was trying to figure out why this happened. Apparently, the table has extended statistics on the three primary key columns. I found a post by Jonathan Lewis that shows a virtual column like the one this table has with extended statistics. The error is on the Datapump import, impdp, of the table that has extended statistics. This error is similar to some documented Oracle issues such as:

DataPump Import (IMPDP) Raises The Errors ORA-39083 ORA-904 Due To Virtual Columns Dependent On A Function (Doc ID 1271176.1)

But I could not immediately find something that says that extended statistics cause a table to not be importable using Datapump impdp.

If you want to recreate the problem, try adding extended stats like this (which I derived from Jonathan Lewis’s post):

select dbms_stats.create_extended_stats(NULL,'TEST','(COL1, COL2, COL3)') name from dual;

select * from user_tab_cols where table_name='TEST';

Then export the table from 11.2 and import it into a 19c database using Datapump. Anyway, posting here for my own memory and in case others find it useful. Maybe this is a bug?
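Putting the repro together as one sketch (the table and column names follow the snippet above; the three-column primary key detail comes from the post, the rest is assumed):

```sql
-- Build a table with a three-column primary key, then add extended
-- (column-group) statistics on those columns.
CREATE TABLE test (
  col1 NUMBER,
  col2 NUMBER,
  col3 NUMBER,
  CONSTRAINT test_pk PRIMARY KEY (col1, col2, col3)
);

-- Creates a hidden virtual column with a generated SYS_STU... name;
-- that generated identifier is what impdp later fails to resolve.
SELECT dbms_stats.create_extended_stats(NULL, 'TEST', '(COL1, COL2, COL3)') name
FROM dual;

SELECT column_name, hidden_column
FROM user_tab_cols
WHERE table_name = 'TEST';
```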


Categories: DBA Blogs

Merge Always Updates Sequence Number

Bobby Durrett's DBA Blog - Tue, 2019-12-17 11:55

This is nothing new, but I wanted to throw out a quick post to document it. If you have a sequence.nextval in the insert part of a merge statement, the merge calls nextval for all the updated rows as well.

Oracle has a bug report about this from a 9.2 issue, so this is nothing new:


I created a couple of testcases if you want to try them: sequencewithmerge.zip

Oracle’s bug report says you can work around the issue by encasing the sequence.nextval call in a function so I tried it and it works.

Anyway, if you include sequence.nextval in the insert part of a merge statement, you can’t count on the sequence advancing only for the inserted rows.
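A sketch of the workaround the post describes, wrapping the nextval call in a function (sequence, function, table, and column names are assumptions):

```sql
CREATE SEQUENCE test_seq;

-- Deferring nextval behind a function call means it is evaluated only
-- for rows that actually take the INSERT branch.
CREATE OR REPLACE FUNCTION get_seq RETURN NUMBER IS
BEGIN
  RETURN test_seq.NEXTVAL;
END;
/

MERGE INTO target t
USING source s
ON (t.id = s.id)
WHEN MATCHED THEN
  UPDATE SET t.val = s.val          -- no sequence call on the update branch
WHEN NOT MATCHED THEN
  INSERT (id, seq_no, val) VALUES (s.id, get_seq, s.val);
```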


Categories: DBA Blogs

