Working with Oracle 19c database and on-premises S3 Object Storage with DBMS_CLOUD

The DBMS_CLOUD package has been available for the on-premises 19c database for over a year now, and several fixes have shipped since the initial 19.9 release, so I suggest using the latest patch where possible to avoid known issues.

I have previously shared how to configure an Oracle 19c database to use the DBMS_CLOUD packages; if you missed it, you can check it out here.

In this blog I will be using Oracle 19.15 to demonstrate how we can interact with on-premises Pure Storage FlashBlade S3 Object Storage from within an Oracle 19c database using the DBMS_CLOUD package.

Obtain Certificate

For this post I will be using a FlashBlade for S3 Object Storage, but the steps are similar for other solutions.

If your S3 server is using a self-signed certificate, you will need to obtain the root CA certificate and add it to the Oracle Wallet.

The simplest way to get a certificate is with Chrome: visit the S3 server URL, click the padlock, select Certificate, drag the root Certificate Authority certificate onto your desktop, and then transfer it to your database server.

If you don’t have access to a web browser, you can obtain the certificate with openssl s_client -showcerts -connect <host>:443, cutting and pasting everything from the BEGIN CERTIFICATE line to the END CERTIFICATE line, for example:

[oracle@z-rac1 ~]$ openssl s_client -showcerts -connect s3-fbstaines02.amazonaws.com:443
...
---
Certificate chain
 0 s:/CN=fbstaines02
   i:/CN=uklab Root Certificate Authority/ST=Staines/C=UK/emailAddress=admin@uklab.purestorage.com/O=uklab purestorage/OU=uklab
-----BEGIN CERTIFICATE-----
MIID+TCCAuGgAwIBAgIJAPpit8f8HAhTMA0GCSqGSIb3DQEBBQUAMIGiMSkwJwYD
...
KM12VLTgJ7id2xU+LA==
-----END CERTIFICATE-----
---
...
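If you want to script the extraction rather than cut-and-paste by hand, sed can carve the PEM block out of the s_client output. A minimal sketch; against a live endpoint you would pipe the real openssl s_client command, but here a captured sample of its output stands in so the example is self-contained:

```shell
# Non-interactive sketch: carve the PEM block out of 'openssl s_client' output.
# Against a live endpoint you would pipe the real command instead:
#   openssl s_client -showcerts -connect <host>:443 </dev/null 2>/dev/null | sed -n ...
# A captured sample of the output stands in here.
sample='Certificate chain
 0 s:/CN=fbstaines02
-----BEGIN CERTIFICATE-----
MIID+TCCAuGgAwIBAgIJAPpit8f8HAhTMA0GCSqGSIb3DQEBBQUAMIGiMSkwJwYD
KM12VLTgJ7id2xU+LA==
-----END CERTIFICATE-----
---'

# Keep only the lines from BEGIN CERTIFICATE to END CERTIFICATE (inclusive).
printf '%s\n' "$sample" \
  | sed -n '/-----BEGIN CERTIFICATE-----/,/-----END CERTIFICATE-----/p' > pureCA.pem
head -n 1 pureCA.pem
# prints: -----BEGIN CERTIFICATE-----
```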

Check Certificate

Before importing, check that you have the correct certificate using the orapki cert display -summary command, for example:

[oracle@z-rac1 dbc]$ orapki cert display -cert pureCA.pem -summary
Oracle PKI Tool Release 19.0.0.0.0 - Production
Version 19.4.0.0.0
Copyright (c) 2004, 2021, Oracle and/or its affiliates. All rights reserved.

Subject:        OU=uklab,O=uklab purestorage,EmailAddress=admin@uklab.purestorage.com,C=UK,ST=Staines,CN=uklab Root Certificate Authority
Issuer:         OU=uklab,O=uklab purestorage,EmailAddress=admin@uklab.purestorage.com,C=UK,ST=Staines,CN=uklab Root Certificate Authority
Valid Until:    Thu Jun 06 12:39:02 BST 2041

Or use the -complete option for full details, for example: orapki cert display -cert pureCA.pem -complete
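If orapki is not on your PATH, plain openssl can show the same summary fields (subject, issuer, expiry). A sketch; it generates a throwaway self-signed certificate purely so the display step runs anywhere, whereas in practice you would point openssl x509 at your downloaded pureCA.pem:

```shell
# Generate a disposable self-signed certificate so the display step below is
# runnable anywhere; normally you would skip this and use your real pureCA.pem.
openssl req -x509 -newkey rsa:2048 -nodes -keyout /dev/null \
  -subj '/CN=uklab Root Certificate Authority' -days 1 -out demoCA.pem 2>/dev/null

# Equivalent of 'orapki cert display -summary': subject, issuer and expiry date.
openssl x509 -in demoCA.pem -noout -subject -issuer -enddate
```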

Update Oracle Wallet

The script below creates a directory, extracts the previously downloaded certificates, creates an auto-login wallet, and imports the VeriSign, Baltimore, DigiCert and pureCA certificates.

rm -Rf /home/oracle/dbc/commonstore/wallets/ssl
mkdir -p /home/oracle/dbc/commonstore/wallets/ssl
cd /home/oracle/dbc/commonstore/wallets/ssl
tar -xvf /home/oracle/dbc/dbc_certs.tar
cp /home/oracle/dbc/pureCA.pem /home/oracle/dbc/commonstore/wallets/ssl/
orapki cert display -cert ./pureCA.pem -summary

orapki wallet create -wallet . -pwd MyPassword1 -auto_login
orapki wallet add -wallet . -trusted_cert -cert ./VeriSign1.cer -pwd MyPassword1
orapki wallet add -wallet . -trusted_cert -cert ./BaltimoreCyberTrust.cer -pwd MyPassword1
orapki wallet add -wallet . -trusted_cert -cert ./DigiCert1.cer -pwd MyPassword1
orapki wallet add -wallet . -trusted_cert -cert ./pureCA.pem -pwd MyPassword1

orapki wallet display -wallet /home/oracle/dbc/commonstore/wallets/ssl/

Show Wallet Certificates

Confirm the certificates have been installed with orapki.

[oracle@z-rac1 dbc]$ orapki wallet display -wallet /home/oracle/dbc/commonstore/wallets/ssl/
Oracle PKI Tool Release 19.0.0.0.0 - Production
Version 19.4.0.0.0
Copyright (c) 2004, 2021, Oracle and/or its affiliates. All rights reserved.

Requested Certificates: 
User Certificates:
Trusted Certificates: 
Subject:        OU=uklab,O=uklab purestorage,EMAIL=admin@uklab.purestorage.com,C=UK,ST=Staines,CN=uklab Root Certificate Authority
Subject:        CN=DigiCert Global Root CA,OU=www.digicert.com,O=DigiCert Inc,C=US
Subject:        CN=Baltimore CyberTrust Root,OU=CyberTrust,O=Baltimore,C=IE
Subject:        CN=VeriSign Class 3 Public Primary Certification Authority - G5,OU=(c) 2006 VeriSign\, Inc. - For authorized use only,OU=VeriSign Trust Network,O=VeriSign\, Inc.,C=US

Please note these steps need to be performed on all database nodes when using a RAC cluster.

Update sqlnet.ora

Update the $ORACLE_HOME/network/admin/sqlnet.ora file to reflect the wallet location, for example.

[oracle@z-rac1 dbc]$ cat $ORACLE_HOME/network/admin/sqlnet.ora
NAMES.DIRECTORY_PATH= (TNSNAMES, EZCONNECT, HOSTNAME)
SQLNET.EXPIRE_TIME=10
TCP.VALIDNODE_CHECKING=no

WALLET_LOCATION=
  (SOURCE=(METHOD=FILE)(METHOD_DATA=
  (DIRECTORY=/home/oracle/dbc/commonstore/wallets/ssl)))

Test Certificate

An easy way to confirm the SSL certificate and S3 account credentials is with the aws s3api utility.

I have previously added a profile for FlashBlade S3 details to my $HOME/.aws/credentials file, for example.

[fbstaines02]
aws_access_key_id = PSFBSAZREELMLGDDHKMIOAPHOJBKBBEIIDEJNBJDEK
aws_secret_access_key = <aws_secret_access_key>

Below you can see how I validated the configuration by listing the S3 Buckets.

[oracle@z-rac1 dbc]$ aws s3api list-buckets --query "Buckets[].Name" --endpoint-url https://s3-fbstaines02.amazonaws.com --profile fbstaines02 --ca-bundle /home/oracle/dbc/commonstore/wallets/ssl/pureCA.pem
[
    "ora-bucket"
]

We can also use the aws s3api utility to list the contents of an S3 bucket.

[oracle@z-rac1 dbc]$ aws s3api list-objects --bucket ora-bucket --query 'Contents[].{Key: Key, Size: Size}' --endpoint-url https://s3-fbstaines02.amazonaws.com --profile fbstaines02 --ca-bundle /home/oracle/dbc/commonstore/wallets/ssl/pureCA.pem
[
    {
        "Key": "sales_extended.parquet",
        "Size": 9263650
    }
]

Oracle Database

Now that we have confirmed our environment, we can move on to the Oracle 19c database.

Create Account

For testing, let’s start by creating an account, grants and directories, for example:

--drop user testuser cascade;
create user testuser identified by testuser quota unlimited on users;
grant connect, resource to testuser;

grant create credential to testuser;
grant execute on dbms_cloud to testuser;

create or replace directory tmp_files_dir as '/tmp/files';
grant read, write on directory tmp_files_dir to testuser, C##CLOUD$SERVICE;

alter session set "_oracle_script"=TRUE;
create or replace directory data_pump_dir as '/u01/app/oracle/admin/DEMO/dpdump/';
alter session set "_oracle_script"=FALSE;
grant read, write on directory data_pump_dir to testuser;
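The tmp_files_dir object above maps to /tmp/files on the database host, and that path has to exist at the OS level before DBMS_CLOUD can read from it. A sketch that creates the directory and stages a sample file for the put_object step later in this post; the two-line CSV content is made up purely for illustration:

```shell
# tmp_files_dir (created above) points at /tmp/files; make sure the path
# exists and stage a file for the put_object example later in the post.
mkdir -p /tmp/files

# Hypothetical sample content -- any CSV will do for testing.
printf 'iata,name\nLHR,Heathrow\n' > /tmp/files/airports.csv

ls -lh /tmp/files/
```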

Create Credentials

Log on to the pluggable database (PDB) as the required schema, for example:

$ sqlplus testuser/testuser@//z-rac1:1521/pdb1

Now create credentials for the S3 Object Storage, providing the AWS-style Access Key ID and Secret Access Key.

begin
  dbms_credential.drop_credential(
    credential_name => 'fb_cred');
end;
/

begin
  dbms_credential.create_credential(
    credential_name => 'fb_cred',
    username        => 'PSFBSAZREELMLGDDHKMIOAPHOJBKBBEIIDEJNBJDEK',
    password        => '<aws_secret_access_key>');
end;
/

Check the creation with the user_credentials view, for example:

set linesize 200
column credential_name format a20
column username format a50
column enabled format a10

select credential_name, username, enabled from user_credentials;

CREDENTIAL_NAME      USERNAME                                           ENABLED
-------------------- -------------------------------------------------- ----------
FB_CRED              PSFBSAZREELMLGDDHKMIOAPHOJBKBBEIIDEJNBJDEK         TRUE

Working with S3 Buckets

List Buckets

We can use DBMS_CLOUD send_request to list all the available S3 buckets.

SET SERVEROUTPUT ON
DECLARE
  resp DBMS_CLOUD_TYPES.resp;
BEGIN
  -- send request
  resp := dbms_cloud.send_request(
            credential_name => 'fb_cred',
            uri => 'https://s3-fbstaines02.amazonaws.com/', 
            method => dbms_cloud.METHOD_GET
          );

  -- Response Body in TEXT format
  dbms_output.put_line('Body: ' || '------------' || CHR(10) ||
  DBMS_CLOUD.get_response_text(resp) || CHR(10));

  -- Response Headers in JSON format
  dbms_output.put_line('Headers: ' || CHR(10) || '------------' || CHR(10) ||
  DBMS_CLOUD.get_response_headers(resp).to_clob || CHR(10));

  -- Response Status Code
  dbms_output.put_line('Status Code: ' || CHR(10) || '------------' || CHR(10) ||
  DBMS_CLOUD.get_response_status_code(resp));

END;
/

Body: ------------
<?xml version="1.0" encoding="UTF-8" standalone="yes"?><ListAllMyBucketsResult
xmlns="http://s3.amazonaws.com/doc/2006-03-01/"><Owner><ID>DefaultID</ID><DisplayName>DefaultDisplayName</DisplayName></Owner><Buckets><Bucket><Name>ora-bucket</Name><CreationDate>2022-07-14T10:45:14.
000Z</CreationDate></Bucket></Buckets></ListAllMyBucketsResult>

Headers: 
------------
{"Date":"Thu, 04 Aug 2022 10:52:09 GMT","Content-Type":"application/xml","connection":"close","content-length":"342"}

Status Code: 
------------
200

PL/SQL procedure successfully completed.

From the above we can see the HTTPS response Body, Headers, Status Code and the S3 Object Storage bucket name.
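If you only want the bucket names out of that XML body, a quick grep/sed filter is enough. A sketch, run against a captured copy of the response trimmed to the relevant elements so it is self-contained:

```shell
# A trimmed, captured copy of the ListAllMyBucketsResult body from above.
body='<ListAllMyBucketsResult><Buckets><Bucket><Name>ora-bucket</Name></Bucket></Buckets></ListAllMyBucketsResult>'

# Pull out each <Name> element, then strip the tags.
printf '%s' "$body" | grep -o '<Name>[^<]*</Name>' | sed 's/<[^>]*>//g'
# prints: ora-bucket
```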

List Objects

Using the same approach we can also list S3 Objects, for example:

SET SERVEROUTPUT ON
DECLARE
  resp DBMS_CLOUD_TYPES.resp;
BEGIN
  -- send request
  resp := dbms_cloud.send_request(
            credential_name => 'fb_cred',
            uri => 'https://s3-fbstaines02.amazonaws.com/ora-bucket/', 
            method => dbms_cloud.METHOD_GET
          );

  -- Response Body in TEXT format
  dbms_output.put_line('Body: ' || '------------' || CHR(10) ||
  DBMS_CLOUD.get_response_text(resp) || CHR(10));
  
  -- Response Headers in JSON format
  dbms_output.put_line('Headers: ' || CHR(10) || '------------' || CHR(10) ||
  DBMS_CLOUD.get_response_headers(resp).to_clob || CHR(10));
 
  -- Response Status Code
  dbms_output.put_line('Status Code: ' || CHR(10) || '------------' || CHR(10) ||
  DBMS_CLOUD.get_response_status_code(resp));
 
END;
/

Body: ------------
<?xml version="1.0" encoding="UTF-8"
standalone="yes"?><ListBucketResult
xmlns="http://s3.amazonaws.com/doc/2006-03-01/"><Name>ora-bucket</Name><Prefix><
/Prefix><Marker></Marker><MaxKeys>1000</MaxKeys><IsTruncated>false</IsTruncated>
<Contents><Key>airports.csv</Key><LastModified>2022-08-04T14:19:01.000Z</LastMod
ified><ETag>5d0064780d4819a20148eb99f45ac02a</ETag><Size>4965317</Size><Owner><I
D>DefaultID</ID><DisplayName>DefaultDisplayName</DisplayName></Owner><StorageCla
ss>STANDARD</StorageClass></Contents><Contents><Key>sales_extended.parquet</Key>
<LastModified>2022-07-22T09:16:51.000Z</LastModified><ETag>af5d8b41591ced60e59d3
ecc9f1026ad</ETag><Size>9263650</Size><Owner><ID>DefaultID</ID><DisplayName>Defa
ultDisplayName</DisplayName></Owner><StorageClass>STANDARD</StorageClass></Conte
nts></ListBucketResult>

Headers: 
------------
{"Date":"Thu, 04 Aug 2022 14:30:53
GMT","Content-Type":"application/xml","connection":"close","content-length":"816
"}

Status Code: 
------------
200

PL/SQL procedure successfully completed.
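The same grep/sed trick extends to the object listing, pairing each <Key> with its <Size>. A sketch, again against a trimmed capture of the response body above:

```shell
# A trimmed, captured copy of the ListBucketResult body from above.
body='<ListBucketResult><Contents><Key>airports.csv</Key><Size>4965317</Size></Contents><Contents><Key>sales_extended.parquet</Key><Size>9263650</Size></Contents></ListBucketResult>'

# Extract the alternating Key/Size elements, strip the tags, then pair them up.
printf '%s' "$body" \
  | grep -Eo '<(Key|Size)>[^<]*</(Key|Size)>' \
  | sed 's/<[^>]*>//g' \
  | paste - -
# prints (tab-separated):
#   airports.csv            4965317
#   sales_extended.parquet  9263650
```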

Put Object

The Oracle DBMS_CLOUD package provides the ability to upload files to an S3 bucket from an Oracle managed directory.

Below I have a file called airports.csv located in /tmp/files (the Oracle tmp_files_dir directory).

[oracle@z-rac1 ~]$ ls -lh /tmp/files/
total 4.8M
-rw-r--r--. 1 oracle oinstall 4.8M Aug  4 12:02 airports.csv

Using DBMS_CLOUD put_object we can upload the file to an S3 bucket called ora-bucket.

begin
  dbms_cloud.put_object (
    credential_name => 'fb_cred',
    object_uri      => 'https://s3-fbstaines02.amazonaws.com/ora-bucket/airports.csv',
    directory_name  => 'tmp_files_dir',
    file_name       => 'airports.csv');
end;
/

List Objects

We can list the objects stored in S3 buckets from an Oracle 19c database using DBMS_CLOUD list_objects.

set linesize 200
column object_name format a25
column checksum format a35
column created format a35
column last_modified format a35

select *
from   dbms_cloud.list_objects(
       'fb_cred',
       'https://s3-fbstaines02.amazonaws.com/ora-bucket/')
/

OBJECT_NAME                 BYTES CHECKSUM                            CREATED                             LAST_MODIFIED
----------------------- --------- ----------------------------------- ----------------------------------- -----------------------------------
airports.csv              4965317 5d0064780d4819a20148eb99f45ac02a                                        04-AUG-22 11.07.36.000000 +00:00
sales_extended.parquet    9263650 af5d8b41591ced60e59d3ecc9f1026ad                                        22-JUL-22 09.16.51.000000 +00:00

Delete Object

To delete an object from an S3 bucket we can again use the DBMS_CLOUD package, this time with delete_object.

begin
  dbms_cloud.delete_object (
    credential_name => 'fb_cred',
    object_uri      => 'https://s3-fbstaines02.amazonaws.com/ora-bucket/airports.csv'
  );
end;
/
exit
/

Summary

In this post I shared how, from an on-premises Oracle 19c database, we can upload, list and delete objects in a Pure Storage FlashBlade S3 Object Storage bucket.
