How to access a FlashBlade Object Store from within your Oracle 19c database


Last year I wrote a number of posts on Big Data and shared how we can access CSV, Parquet and Avro file formats directly from an on-premises Oracle 19c database, using Oracle external tables on a local file system or an NFS share from an NFS server, e.g. FlashBlade.

Unfortunately, at that time I was unable to access Object Storage, as on-premises databases were missing the required DBMS_CLOUD package, which was only available to databases within Oracle Cloud Infrastructure (OCI).

Exciting News!

Oracle has now made the DBMS_CLOUD package available to on-premises database users running Oracle 19c (19.9 and above).

In this blog I will be using 19.10, but it will also work with 19.9. The detailed installation and setup of DBMS_CLOUD is documented on MOS (Doc ID 2748362.1).

I have already blogged on how to configure an Oracle 19c database to use the DBMS_CLOUD package; if you missed it, you can find it here.

The Oracle DBMS_CLOUD package supports access to Oracle Object Storage, Microsoft Azure Blob Storage, and Amazon Simple Storage Service (S3).

I also shared how we can configure and access an Oracle Cloud Infrastructure (OCI) Object Storage here.

In this post I will share how we can use the DBMS_CLOUD package to access an on-premises Pure Storage FlashBlade Object Store.

Oracle URI Formats

The Oracle DBMS_CLOUD package supports a number of URI formats (refer to the documentation for the full list and syntax).

I will be using the AWS URI style to access my Pure Storage FlashBlade Object Storage.
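To illustrate, an AWS-style URI addresses an object as https://host/bucket/objectname. The sketch below builds one from my bucket and object names; the hostname is a hypothetical placeholder, not my actual FlashBlade address:

```shell
# Build an AWS-style object URI (HOST is a hypothetical placeholder;
# the bucket and object names are the ones used later in this post)
HOST="fbstaines01.example.com"
BUCKET="bigdata"
OBJECT="movie.avro"
URI="https://${HOST}/${BUCKET}/${OBJECT}"
echo "${URI}"
# → https://fbstaines01.example.com/bigdata/movie.avro
```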

FlashBlade Certificate

For this post I have created a certificate with a Common Name (CN) matching my FlashBlade hostname (fbstaines01), and I have also added a local hosts file entry to resolve the fully qualified name to my FlashBlade IP address.
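The hosts entry takes the usual shape; the IP address and fully qualified name below are placeholders, not my actual values:

```
# /etc/hosts - map the FlashBlade data VIP to its fully qualified name
192.0.2.10   fbstaines01.example.com   fbstaines01
```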

The Oracle 19c database will require an SSL certificate to access the FlashBlade via HTTPS. You can obtain the certificate using openssl s_client -showcerts -connect <FlashBlade>:443 and cutting-and-pasting everything between BEGIN CERTIFICATE and END CERTIFICATE, e.g.

$ openssl s_client -showcerts -connect
Certificate chain
 0 s:/C=UK/ST=Staines/L=London/O=Pure Storage/
   i:/C=UK/ST=Staines/L=London/O=Pure Storage/
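Rather than cutting-and-pasting by hand, the PEM block can also be extracted with sed. A minimal sketch, assuming the s_client output has been saved to a file called chain.txt (a hypothetical filename; the certificate body below is illustrative, not real):

```shell
# Create a cut-down sample of openssl s_client output (hypothetical content)
cat > chain.txt <<'EOF'
Certificate chain
 0 s:/C=UK/ST=Staines/L=London/O=Pure Storage/
-----BEGIN CERTIFICATE-----
MIIBsampleCertificateBodyForIllustrationOnly
-----END CERTIFICATE-----
EOF
# Keep only the BEGIN/END CERTIFICATE block(s), dropping the s_client chatter
sed -n '/-----BEGIN CERTIFICATE-----/,/-----END CERTIFICATE-----/p' chain.txt > pure.crt
cat pure.crt
```

Note this keeps every certificate in the chain; for a single self-signed FlashBlade certificate that is exactly the PEM needed for pure.crt.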

Alternatively, simply log on to the FlashBlade, navigate to Settings, scroll down to Security and click Certificates.

Find the certificate, click the 3 vertical dots on the right-hand side, select ‘Export Certificate’ and then Download to save a file called pure.crt.

Export Certificate

If you followed my previous blog you will already have a wallet; if not, you will need to create an Oracle wallet before moving on.

Add Certificate to Oracle Wallet

To add the Pure Storage FlashBlade certificate we can use orapki wallet add e.g.

$ orapki wallet add -wallet . -trusted_cert -cert /home/oracle/pure.crt -pwd <password>
Oracle PKI Tool Release - Production
Copyright (c) 2004, 2020, Oracle and/or its affiliates. All rights reserved.

Operation is successfully completed.

If you try to re-add a certificate you should see a PKI-04003 error e.g.

Could not install trusted cert at /home/oracle/pure.crt
PKI-04003: The trusted certificate is already present in the wallet. 

Let’s now list our certificates using orapki.

$ orapki wallet display -wallet . -pwd <password>
Oracle PKI Tool Release - Production
Copyright (c) 2004, 2020, Oracle and/or its affiliates. All rights reserved.

Requested Certificates: 
User Certificates:
Trusted Certificates: 
Subject:,O=Pure Storage,L=London,ST=Staines,C=UK
Subject:        CN=DigiCert Global Root CA,,O=DigiCert Inc,C=US
Subject:        CN=Baltimore CyberTrust Root,OU=CyberTrust,O=Baltimore,C=IE
Subject:        CN=VeriSign Class 3 Public Primary Certification Authority - G5,OU=(c) 2006 VeriSign\, Inc. - For authorized use only,OU=VeriSign Trust Network,O=VeriSign\, Inc.,C=US

Create Object Store Bucket

Logon to your Pure Storage FlashBlade and navigate to Storage -> Object Store, scroll down to Buckets and click the plus sign to create a new Object Store Bucket.

Select Account and enter Bucket Name.

Create Bucket

Note: the Bucket Name must be in lowercase.
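Since bucket names must be lowercase (and follow S3-style naming rules), a quick pre-flight check can save a failed creation. This is a simplified sketch of the S3 naming rule, not the full specification:

```shell
# Simplified S3-style bucket name check: lowercase letters, digits,
# dots and hyphens, 3-63 characters, starting and ending with a
# letter or digit (real S3 rules have a few more restrictions)
BUCKET="bigdata"
if echo "$BUCKET" | grep -Eq '^[a-z0-9][a-z0-9.-]{1,61}[a-z0-9]$'; then
  echo "valid"
else
  echo "invalid"
fi
# → valid
```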

List Buckets

If you have already installed and configured the AWS s3api client, you can use it to list all buckets for an account, e.g.

$ aws s3api list-buckets --query "Buckets[].Name" --endpoint-url --profile fbstaines01

Upload Object Data

Now we have created a new FlashBlade Object Store Bucket, let’s upload some sample data using the AWS s3api client. For this blog I will be using 3 data files: a CSV, an Avro, and a Parquet file.

$ aws s3api put-object --bucket bigdata --key sales_extended.parquet --body /mnt/oracle/BigData/sales_extended.parquet --endpoint-url --profile fbstaines01
    "ETag": "\"af5d8b41591ced60e59d3ecc9f1026ad\"",

$ aws s3api put-object --bucket bigdata --key movie.avro --body /mnt/oracle/BigData/movie.avro --endpoint-url --profile fbstaines01
    "ETag": "\"bb1c13cdd021303dc738b16b67a6fd89\"",

$ aws s3api put-object --bucket bigdata --key "On_Time_Reporting_Carrier_On_Time_Performance_(1987_present)_2019_1.csv" --body "/mnt/oracle/BigData/On_Time_Reporting_Carrier_On_Time_Performance_(1987_present)_2019_1.csv" --endpoint-url --profile fbstaines01
    "ETag": "\"616d5cecda0308439192c0708075999d\"",

List Objects

We can also use the AWS s3api client to list objects we have just uploaded.

$ aws s3api list-objects --bucket bigdata --endpoint-url --profile fbstaines01
    "Contents": [
            "Key": "On_Time_Reporting_Carrier_On_Time_Performance_(1987_present)_2019_1.csv",
            "LastModified": "2021-04-26T15:38:37.000Z",
            "ETag": "616d5cecda0308439192c0708075999d",
            "Size": 263023752,
            "StorageClass": "STANDARD",
            "Owner": {
                "DisplayName": "DefaultDisplayName",
                "ID": "DefaultID"
            "Key": "movie.avro",
            "LastModified": "2021-04-21T09:40:58.000Z",
            "ETag": "bb1c13cdd021303dc738b16b67a6fd89",
            "Size": 331459,
            "StorageClass": "STANDARD",
            "Owner": {
                "DisplayName": "DefaultDisplayName",
                "ID": "DefaultID"
            "Key": "sales_extended.parquet",
            "LastModified": "2021-04-21T09:46:35.000Z",
            "ETag": "af5d8b41591ced60e59d3ecc9f1026ad",
            "Size": 9263650,
            "StorageClass": "STANDARD",
            "Owner": {
                "DisplayName": "DefaultDisplayName",
                "ID": "DefaultID"

Object Storage Usage

Now that we have created a bucket and uploaded some objects, let’s check the Size and Object Count using the aws s3api list-objects command, e.g.

$ aws s3api list-objects --bucket bigdata --output json --query "[sum(Contents[].Size), length(Contents[])]" --profile fbstaines01
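If the --query JMESPath syntax is not to hand, the same totals can be derived locally from saved list-objects output. A rough awk sketch (sample.json is a hypothetical saved copy that mimics the JSON shown above; real JSON parsing is better done with jq):

```shell
# Count objects and sum their sizes from saved list-objects output
# (sample.json is a hypothetical, cut-down copy of the aws s3api JSON)
cat > sample.json <<'EOF'
{ "Contents": [
    { "Key": "movie.avro", "Size": 331459 },
    { "Key": "sales_extended.parquet", "Size": 9263650 }
] }
EOF
# For every "Size": <n> occurrence, accumulate the number and a count
awk 'match($0, /"Size": [0-9]+/) {
       total += substr($0, RSTART + 8, RLENGTH - 8); n++
     }
     END { print n " objects, " total " bytes" }' sample.json
# → 2 objects, 9595109 bytes
```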

If we return to our FlashBlade and navigate to our Account and Bucket, we can also see the Size, Object Count and FlashBlade Data Reduction, a nice 4.3:1 for these 3 files.

Object Store Usage

Create FlashBlade Credentials

We can now log on to the same pluggable database (PDB) I previously used and create a new set of credentials for my FlashBlade account using DBMS_CLOUD.create_credential, first dropping any existing credential of the same name (drop_credential will raise an error if the credential does not already exist).

BEGIN
  DBMS_CLOUD.drop_credential(credential_name => 'FBSTAINES01');
  DBMS_CLOUD.create_credential(
    credential_name => 'FBSTAINES01',
    username        => '<User Access Key ID>',
    password        => '<Secret Access Key>');
END;
/

Test Object Storage Access

Let’s now try and access our FlashBlade Object Storage from within our Oracle 19c pluggable database, using the list_objects function to see objects within a bucket.

SQL> select OBJECT_NAME, CHECKSUM, BYTES from dbms_cloud.list_objects('FBSTAINES01','');
OBJECT_NAME                                                               CHECKSUM                          BYTES
------------------------------------------------------------------------  --------------------------------  ---------
On_Time_Reporting_Carrier_On_Time_Performance_(1987_present)_2019_1.csv   616d5cecda0308439192c0708075999d  263023752
movie.avro                                                                bb1c13cdd021303dc738b16b67a6fd89     331459
sales_extended.parquet                                                    af5d8b41591ced60e59d3ecc9f1026ad    9263650
The above query has returned the 3 objects I uploaded into my FlashBlade Object Store Bucket, showing their names (Key), checksum (ETag) and size in bytes.


In this post I shared how we can now access objects stored in a Pure Storage FlashBlade from within an Oracle 19c on-premises database.

In future posts I will explore how we can now use object storage with block and file storage to deliver application solutions.
