Introduction
Last year I wrote a number of posts on Big Data, sharing how we can access CSV, Parquet and Avro file formats directly from an on-premises Oracle 19c database using Oracle external tables on a local file system or an NFS share from an NFS server, e.g. FlashBlade.
Unfortunately, at that time I was unable to access Object Storage because on-premises databases were missing the required DBMS_CLOUD package, which was only available to databases within Oracle Cloud Infrastructure (OCI).
Exciting News!
Oracle has now made the DBMS_CLOUD package available to on-premises database users running Oracle 19c (19.9 and above).
In this blog I will be using 19.10, but it will also work with 19.9. The detailed installation and setup of DBMS_CLOUD is documented on MOS (Doc ID 2748362.1).
I have already blogged on how to configure an Oracle 19c database to use the DBMS_CLOUD packages; if you missed it, you can find it here.
The Oracle DBMS_CLOUD package supports access to Oracle Object Storage, Microsoft Azure Blob Storage, and Amazon Simple Storage Service (S3).
I also shared how we can configure and access an Oracle Cloud Infrastructure (OCI) Object Storage here.
In this post I will share how we can use the DBMS_CLOUD package to access an on-premises Pure Storage FlashBlade Object Store.
Oracle URI Formats
The Oracle DBMS_CLOUD package supports a number of URI formats (refer to the documentation for the full list and syntax), e.g.
- OCI: https://swiftobjectstorage.<region>.oraclecloud.com/v1/<tenant>/<bucket>/<objectname>
- AWS: https://s3-<region>.amazonaws.com/<bucket>/<objectname>
- Azure: https://<account>.blob.core.windows.net/<container>/<objectname>
I will be using the AWS URI style to access my Pure Storage FlashBlade Object Storage.
FlashBlade Certificate
For this post I have created a certificate with a Common Name (CN) of s3-fbstaines01.amazonaws.com, i.e. a region of fbstaines01. I have also created a local hosts file entry to resolve the fully qualified name to my FlashBlade IP address.
The Oracle 19c database will require an SSL certificate to access the FlashBlade via HTTPS. You can obtain the certificate using openssl s_client -showcerts -connect <FlashBlade>:443
and cut-and-pasting everything between the BEGIN CERTIFICATE and END CERTIFICATE markers (inclusive), e.g.
$ openssl s_client -showcerts -connect s3-fbstaines01.amazonaws.com:443
...
---
Certificate chain
 0 s:/C=UK/ST=Staines/L=London/O=Pure Storage/CN=s3-fbstaines01.amazonaws.com
   i:/C=UK/ST=Staines/L=London/O=Pure Storage/CN=fbstaines01.amazonaws.com
-----BEGIN CERTIFICATE-----
MIIDUjCCAjoCCQCzM0h7DCnymDANBgkqhkiG9w0BAQUFADBrMQswCQYDVQQGEwJV
...
XNcO9s9+dRe7Qb6vJyPk7UwF0hdRbLJmgb0ciMbrApab6Wib/BM=
-----END CERTIFICATE-----
...
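If you would rather not cut-and-paste by hand, the step can be scripted. Below is a small sketch (using the hypothetical host name from this post) that keeps only the first certificate in the chain, which is the one orapki needs:

```shell
# extract_first_cert: read openssl s_client output on stdin and keep only
# the first PEM block (the FlashBlade's own certificate, not the issuer's).
extract_first_cert() {
  sed -n '/-----BEGIN CERTIFICATE-----/,/-----END CERTIFICATE-----/p' \
    | sed '/-----END CERTIFICATE-----/q'
}

# Against a live FlashBlade (hypothetical host name from this post):
#   echo | openssl s_client -showcerts -connect s3-fbstaines01.amazonaws.com:443 \
#     2>/dev/null | extract_first_cert > pure.crt
```

The second sed quits after the first END CERTIFICATE line, so any issuer certificates later in the chain are dropped.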
Alternatively, simply log on to the FlashBlade, navigate to Settings, scroll down to Security and click Certificates.
Find the certificate, click the 3 vertical dots on the right-hand side, select 'Export Certificate', then click Download to save a file called pure.crt.

If you followed my previous blog you will already have a wallet; if not, you will need to create an Oracle wallet before moving on.
Add Certificate to Oracle Wallet
To add the Pure Storage FlashBlade certificate we can use orapki wallet add e.g.
$ orapki wallet add -wallet . -trusted_cert -cert /home/oracle/pure.crt -pwd <password>
Oracle PKI Tool Release 19.0.0.0.0 - Production
Version 19.4.0.0.0
Copyright (c) 2004, 2020, Oracle and/or its affiliates. All rights reserved.

Operation is successfully completed.
If you try to re-add a certificate you should see a PKI-04003 error e.g.
Could not install trusted cert at /home/oracle/pure.crt
PKI-04003: The trusted certificate is already present in the wallet.
Let’s now list our certificates using orapki.
$ orapki wallet display -wallet . -pwd <password>
Oracle PKI Tool Release 19.0.0.0.0 - Production
Version 19.4.0.0.0
Copyright (c) 2004, 2020, Oracle and/or its affiliates. All rights reserved.
Requested Certificates:
User Certificates:
Trusted Certificates:
Subject: CN=s3-fbstaines01.amazonaws.com,O=Pure Storage,L=London,ST=Staines,C=UK
Subject: CN=DigiCert Global Root CA,OU=www.digicert.com,O=DigiCert Inc,C=US
Subject: CN=Baltimore CyberTrust Root,OU=CyberTrust,O=Baltimore,C=IE
Subject: CN=VeriSign Class 3 Public Primary Certification Authority - G5,OU=(c) 2006 VeriSign\, Inc. - For authorized use only,OU=VeriSign Trust Network,O=VeriSign\, Inc.,C=US
Create Object Store Bucket
Logon to your Pure Storage FlashBlade and navigate to Storage -> Object Store, scroll down to Buckets and click the plus sign to create a new Object Store Bucket.
Select Account and enter Bucket Name.

Note: the Bucket Name must be in lowercase.
List Buckets
If you have already installed and configured the AWS s3api client, you can use it to list all buckets for an account, e.g.
$ aws s3api list-buckets --query "Buckets[].Name" --endpoint-url http://fbstaines01.uklab.purestorage.com --profile fbstaines01
[
    "bigdata",
    "oracle",
    "oracle-rman-backup"
]
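For completeness, the --profile fbstaines01 used throughout assumes an entry like the following in ~/.aws/credentials (the key values are placeholders, not real ones); the FlashBlade endpoint itself is supplied per command with --endpoint-url:

```ini
# ~/.aws/credentials (placeholder values)
[fbstaines01]
aws_access_key_id     = <User Access Key ID>
aws_secret_access_key = <Secret Access Key>
```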
Upload Object Data
Now we have created a new FlashBlade Object Store Bucket, let's upload some sample data using the AWS s3api client. For this blog I will be using three data files: a CSV, an Avro, and a Parquet file.
$ aws s3api put-object --bucket bigdata --key sales_extended.parquet --body /mnt/oracle/BigData/sales_extended.parquet --endpoint-url http://fbstaines01.uklab.purestorage.com --profile fbstaines01
{
    "ETag": "\"af5d8b41591ced60e59d3ecc9f1026ad\"",
    "VersionId": "AAAAAAAAAANgXXdkr-qfNgCsAAAABpGmAAAAAAAAAAUADAAAAAIGWwA2AAAAAACAAAAAAAAAAB4AZnNhbGVzX2V4dGVuZGVkLnBhcnF1ZXQ."
}

$ aws s3api put-object --bucket bigdata --key movie.avro --body /mnt/oracle/BigData/movie.avro --endpoint-url http://fbstaines01.uklab.purestorage.com --profile fbstaines01
{
    "ETag": "\"bb1c13cdd021303dc738b16b67a6fd89\"",
    "VersionId": "AAAAAAAAAANgXXdkr-qfNgDAAAAABpioAAAAAAAAAAEAwAAAAAIBMwA2AAAAAACAAAAAAAAAAB4AZm1vdmllLmF2cm8."
}

$ aws s3api put-object --bucket bigdata --key "On_Time_Reporting_Carrier_On_Time_Performance_(1987_present)_2019_1.csv" --body "/mnt/oracle/BigData/On_Time_Reporting_Carrier_On_Time_Performance_(1987_present)_2019_1.csv" --endpoint-url http://fbstaines01.uklab.purestorage.com --profile fbstaines01
{
    "ETag": "\"616d5cecda0308439192c0708075999d\"",
    "VersionId": "AAAAAAAAAANgXXdkr-qfNgBIAAAABqW_AAAAAAAAAAEASAAAAAHiugA2AAAAAACAAAAAAAAAAB4AZk9uX1RpbWVfUmVwb3J0aW5nX0NhcnJpZXJfT25fVGltZV9QZXJmb3JtYW5jZV8oMTk4N19wcmVzZW50KV8yMDE5XzEuY3N2"
}
List Objects
We can also use the AWS s3api client to list objects we have just uploaded.
$ aws s3api list-objects --bucket bigdata --endpoint-url http://fbstaines01.uklab.purestorage.com --profile fbstaines01
{
    "Contents": [
        {
            "Key": "On_Time_Reporting_Carrier_On_Time_Performance_(1987_present)_2019_1.csv",
            "LastModified": "2021-04-26T15:38:37.000Z",
            "ETag": "616d5cecda0308439192c0708075999d",
            "Size": 263023752,
            "StorageClass": "STANDARD",
            "Owner": {
                "DisplayName": "DefaultDisplayName",
                "ID": "DefaultID"
            }
        },
        {
            "Key": "movie.avro",
            "LastModified": "2021-04-21T09:40:58.000Z",
            "ETag": "bb1c13cdd021303dc738b16b67a6fd89",
            "Size": 331459,
            "StorageClass": "STANDARD",
            "Owner": {
                "DisplayName": "DefaultDisplayName",
                "ID": "DefaultID"
            }
        },
        {
            "Key": "sales_extended.parquet",
            "LastModified": "2021-04-21T09:46:35.000Z",
            "ETag": "af5d8b41591ced60e59d3ecc9f1026ad",
            "Size": 9263650,
            "StorageClass": "STANDARD",
            "Owner": {
                "DisplayName": "DefaultDisplayName",
                "ID": "DefaultID"
            }
        }
    ]
}
Object Storage Usage
Now that we have created a bucket and uploaded some objects, let's check the Size and Object Count using the aws s3api list-objects command, e.g.
$ aws s3api list-objects --bucket bigdata --output json --query "[sum(Contents[].Size), length(Contents[])]" --profile fbstaines01
[
    272618861,
    3
]
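As a quick sanity check of the figures above, the three object sizes from the earlier listing do indeed sum to the reported total:

```shell
# Recompute bucket usage from the object sizes in the earlier listing,
# rather than asking the array again.
python3 - <<'EOF'
sizes = [263023752, 331459, 9263650]   # CSV, Avro, Parquet from list-objects
print(sum(sizes), len(sizes))          # 272618861 3
EOF
```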
If we return to our FlashBlade and navigate to our Account and Bucket, we can also see the Size, Object Count and FlashBlade Data Reduction, a nice 4.3:1 for these three files.

Create FlashBlade Credentials
We can now log on to the same pluggable database (PDB) I previously used and create a new set of credentials for my FlashBlade account.
BEGIN
   DBMS_CLOUD.DROP_CREDENTIAL(credential_name => 'FBSTAINES01');
END;
/

BEGIN
   DBMS_CLOUD.CREATE_CREDENTIAL(
      credential_name => 'FBSTAINES01',
      username        => '<User Access Key ID>',
      password        => '<Secret Access Key>'
   );
END;
/
Test Object Storage Access
Let’s now try to access our FlashBlade Object Storage from within our Oracle 19c pluggable database, using the list_objects function to see the objects within a bucket.
SQL> select OBJECT_NAME, CHECKSUM, BYTES from dbms_cloud.list_objects('FBSTAINES01','https://s3-fbstaines01.amazonaws.com/bigdata/');

OBJECT_NAME                                                             CHECKSUM                              BYTES
----------------------------------------------------------------------- -------------------------------- ----------
On_Time_Reporting_Carrier_On_Time_Performance_(1987_present)_2019_1.csv 616d5cecda0308439192c0708075999d  263023752
movie.avro                                                              bb1c13cdd021303dc738b16b67a6fd89     331459
sales_extended.parquet                                                  af5d8b41591ced60e59d3ecc9f1026ad    9263650
The above query returned the three objects I uploaded to my FlashBlade Object Store Bucket, showing their names (Key), checksums (ETag) and sizes in bytes.
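The natural next step is to query these objects with SQL. As a sketch only (the table name is my own, and I am relying on the DBMS_CLOUD schema discovery option for Parquet rather than spelling out a column list), DBMS_CLOUD.CREATE_EXTERNAL_TABLE can map the Parquet object to an external table:

```sql
BEGIN
   DBMS_CLOUD.CREATE_EXTERNAL_TABLE(
      table_name      => 'SALES_EXTENDED_EXT',  -- hypothetical name
      credential_name => 'FBSTAINES01',
      file_uri_list   => 'https://s3-fbstaines01.amazonaws.com/bigdata/sales_extended.parquet',
      format          => '{"type":"parquet", "schema": "first"}'
   );
END;
/

SELECT COUNT(*) FROM sales_extended_ext;
```

With "schema": "first" the column definitions are derived from the Parquet file's metadata, so no column_list is required.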
Summary
In this post I shared how we can now access objects stored in a Pure Storage FlashBlade from within an Oracle 19c on-premises database.
In future posts I will explore how we can use object storage alongside block and file storage to deliver application solutions.
Hello, I’m looking for a way to make RMAN backups to local (on-premises) S3 AWS-compatible storage. This seems promising, thanks for that! What I don’t like is the fake hosts file record (“s3-fbstaines01.amazonaws.com”) because my S3 relies on DNS round robin for load balancing. Is it really necessary?
Thanks & Regards
Martin.
Hi Martin,
You can use a Pure Storage FlashBlade as an on-premises S3 RMAN backup target. I will be writing a blog on this, but in advance of that you may want to check out:
https://support.purestorage.com/Solutions/Oracle/Backup_and_Recovery/Oracle_RMAN_Backup_to_FlashBlade_Object_Storage_using_the_S3_API
Kind regards
Ron
Hi Ron, this works perfectly for me; it’s a very straightforward method. Thanks for that!
Best regards
Martin