Friday, January 23, 2026

Step-by-Step Guide: OCI DevOps & Resource Manager Terraform Infrastructure Provisioning

 

Introduction

Infrastructure provisioning on Oracle Cloud Infrastructure (OCI) can be automated with Infrastructure as Code (IaC) using OCI DevOps, OCI Resource Manager, and Terraform — enabling CI/CD-driven deployments across environments.

In this blog, we’ll walk through a real-world, high-level plan for provisioning OCI infrastructure using OCI DevOps build pipelines integrated with OCI Resource Manager (Plan & Apply).

High-Level Architecture

The overall workflow looks like this:

  1. OCI DevOps Code Repository stores Terraform and pipeline artifacts

  2. OCI DevOps Build Pipeline is triggered on code changes

  3. Build Pipeline invokes OCI Resource Manager

  4. Resource Manager runs Terraform Plan and Apply

  5. Infrastructure is provisioned automatically




Step 1: Create OCI DevOps Code Repository

Start by creating a Code Repository inside your OCI DevOps Project. This repository will store:

  • build_spec.yaml

  • Terraform configuration files

Once created, clone the repository using Cloud Shell:

Authenticate using your OCI username and Auth Token.

Initially the repository is blank; the Terraform code will be pushed to it from Cloud Shell.
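The clone step from Cloud Shell looks like the sketch below. The HTTPS URL shown is illustrative only — copy the actual clone URL from the repository’s details page in the OCI Console, and substitute your own namespace and project names:

```shell
# Clone the (initially empty) DevOps code repository from Cloud Shell.
# URL format is illustrative; use the clone URL shown in the OCI Console.
git clone https://devops.scmservice.us-ashburn-1.oci.oraclecloud.com/namespaces/<namespace>/projects/<project>/repositories/devops-repository
cd devops-repository
```

When prompted, supply your OCI username and the auth token (not your console password).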



Step 2: Create OCI DevOps Build Pipeline

Next, create a Build Pipeline in OCI DevOps. This pipeline will:

  • Read Terraform artifacts

  • Trigger OCI Resource Manager operations

You don’t need to configure all stages immediately; the pipeline will be connected later using triggers.




Step 3: Prepare Repository Structure

Organize your repository with a clean structure:

devops-repository/
├── build_spec.yaml
└── terraform/
    └── resource_manager.tf

At this point, the files exist locally in Cloud Shell but have not yet been pushed to OCI DevOps.


Step 4: Upload Artifacts and Push to Repository

Add the Terraform and build specification files, then push them to the repository:

git add .
git commit -m "updated artifacts first time"
git push -u origin main   # prompts for your OCI username and auth token

Note: build_spec.yaml must be present in the root folder of the DevOps repository so that the build pipeline can read it.


This ensures the build pipeline always pulls the latest Terraform configuration.

Step 5: Upload Terraform Artifacts to Object Storage

In this setup, the OCI Resource Manager stack sources its Terraform configuration from OCI Object Storage. (Alternatively, Resource Manager can be invoked directly from the build pipeline.)

  1. Create an Object Storage bucket

  2. Upload the Terraform artifacts (resource_manager.tf files)
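Both steps can be scripted from Cloud Shell with the OCI CLI. The bucket name ORM_STACK matches the value used in Step 6; the compartment OCID is a placeholder:

```shell
# Create the bucket that will hold the Terraform configuration
oci os bucket create \
  --name ORM_STACK \
  --compartment-id <compartment_ocid>

# Upload the Terraform file into the bucket
oci os object put \
  --bucket-name ORM_STACK \
  --file terraform/resource_manager.tf \
  --name resource_manager.tf
```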



Step 6: Create OCI Resource Manager Stack (CLI)

In this scenario, the stack cannot be created from the OCI Console, so we use the OCI CLI instead:

export compartment_id=<compartment_ocid>
export config_source_bucket_name=ORM_STACK
export config_source_namespace=<namespace>
export config_source_region=us-ashburn-1
export stack_display_name=ORM-STACK
export terraform_version=1.1.x

oci resource-manager stack create-from-object-storage \
  --display-name $stack_display_name \
  --compartment-id $compartment_id \
  --config-source-bucket-name $config_source_bucket_name \
  --config-source-namespace $config_source_namespace \
  --config-source-region $config_source_region \
  --terraform-version $terraform_version

Successful output returns the Stack OCID.
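Since the Stack OCID is needed again in Step 7, it can be captured in one step using the CLI’s JMESPath --query option (same flags as above; shown here as a sketch):

```shell
# Create the stack and capture its OCID into a shell variable
STACK_OCID=$(oci resource-manager stack create-from-object-storage \
  --display-name $stack_display_name \
  --compartment-id $compartment_id \
  --config-source-bucket-name $config_source_bucket_name \
  --config-source-namespace $config_source_namespace \
  --config-source-region $config_source_region \
  --terraform-version $terraform_version \
  --query 'data.id' --raw-output)
echo $STACK_OCID
```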




Step 7: Update build_spec.yaml

Update the build_spec.yaml file to reference the Resource Manager Stack OCID. This file defines:

  • Build stages

  • Resource Manager Plan

  • Resource Manager Apply

This allows OCI DevOps to orchestrate Terraform execution automatically.
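A minimal build_spec.yaml for this flow could look like the sketch below. The stack OCID and stage names are placeholders, and the step commands assume the build runner’s OCI CLI has permission to run Resource Manager jobs — verify the exact step syntax against the OCI DevOps build specification documentation:

```yaml
version: 0.1
component: build
timeoutInSeconds: 3600
shell: bash
steps:
  - type: Command
    name: "Resource Manager Plan"
    command: |
      oci resource-manager job create-plan-job \
        --stack-id <stack_ocid> --wait-for-state SUCCEEDED
  - type: Command
    name: "Resource Manager Apply"
    command: |
      oci resource-manager job create-apply-job \
        --stack-id <stack_ocid> \
        --execution-plan-strategy AUTO_APPROVED \
        --wait-for-state SUCCEEDED
```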


Step 8: Create Build Pipeline Trigger

Create a trigger that connects:

  • Code Repository (main branch)

  • Build Pipeline

Now, every git push automatically triggers infrastructure provisioning.




Step 9: Commit and Trigger the Pipeline

Make final updates and push changes:

git add .
git commit -m "updated resource manager stack"
git push


Under Resource Manager, we can see that both the plan and apply jobs were triggered.



Note: The Terraform state file is managed internally by Resource Manager.

Thus, every commit pushed to the DevOps repository triggers the Resource Manager stack to create or update resources in OCI.

Benefits of This Approach

  • Fully automated infrastructure provisioning

  • Terraform state managed securely by OCI

  • CI/CD driven infrastructure changes

  • Repeatable, auditable deployments

  • Reduced manual errors


Conclusion

By integrating OCI DevOps, OCI Resource Manager, and Terraform, you can achieve a powerful Infrastructure as Code (IaC) pipeline on Oracle Cloud. This setup is ideal for enterprises looking to standardize cloud provisioning with governance, automation, and scalability.


Saturday, January 10, 2026

Using Oracle SQLcl MCP Server with Oracle 19c: A Step-by-Step Guide for NLP-Based Database Queries

 

Introduction

With the rapid evolution of AI, databases are no longer limited to traditional SQL-only interactions. Oracle has taken a major step forward by introducing MCP (Model Context Protocol) support in SQLcl, allowing AI tools like Claude Desktop to interact directly with Oracle databases using natural language.

In this blog, I’ll walk you through a hands-on, end-to-end setup of Oracle SQLcl MCP Server with an on-prem / OCI-hosted Oracle 19c database, and show how conversational AI can query enterprise databases securely.

This guide is ideal for Oracle DBAs, Cloud Architects, and AI-curious professionals who want to explore NLP-driven database access.


   Image source: https://blogs.oracle.com/database/introducing-mcp-server-for-oracle-database


Architecture Overview

AI Client (Claude Desktop)
⬇️ MCP Protocol
SQLcl MCP Server (Local Machine)
⬇️ JDBC
Oracle Database 19c (OCI / On-Prem)

The AI never connects to the database directly. SQLcl acts as a secure MCP bridge, translating natural language into database operations.


Prerequisites

Before starting, ensure you have:

  • Oracle Database 19c (On-Prem or OCI Compute VM)

  • Windows laptop or desktop

  • Internet access to download tools

  • Basic Oracle SQL knowledge


Step 1: Install JDK 17 (Required for SQLcl)

Oracle SQLcl requires Java 17.

  • Download JDK 17 for Windows from Oracle

  • Install using the .exe

  • Set JAVA_HOME and update PATH

Verify:

java -version

Step 2: Install Oracle SQLcl

  • Download SQLcl from Oracle

  • Unzip it to a directory (example):

    C:\AI\sqlcli

SQLcl is portable—no installer required.


Step 3: Install Claude Desktop

Claude Desktop will act as the AI MCP client.

  • Download Claude Desktop

  • Install and launch once

  • Close it before MCP configuration


Step 4: Prepare Oracle Database 19c

Verify PDBs

show pdbs;

Ensure your PDB (e.g., ORCLPDB) is in READ WRITE mode.

Listener and Network Setup

  • Ensure port 1521 is open

  • Disable firewall (lab use only):

systemctl stop firewalld
systemctl disable firewalld
  • Confirm connectivity from Windows:

Test-NetConnection <DB_PUBLIC_IP> -Port 1521

Step 5: Create SQLcl Connection

Launch SQLcl:

sql /nolog

Create and save a connection:

conn -save oracle19c_mcptest -savepwd system/password@<IP>:1521/ORCLPDB

Validate:

CONNMGR test oracle19c_mcptest

Step 6: Start SQLcl MCP Server

sql -mcp -name oracle19c_mcptest

You should see:

MCP Server started successfully

This process must remain running.


Step 7: Configure Claude Desktop for MCP

Edit Claude configuration file:

{
  "mcpServers": {
    "oracle19c": {
      "command": "C:/AI/sqlcli/sqlcl/sqlcl/bin/sql.exe",
      "args": ["-mcp", "-name", "oracle19c_mcptest"]
    }
  }
}

Restart Claude Desktop and allow MCP access when prompted.


Step 8: Follow Least Privilege (Best Practice)

Instead of SYSTEM, create an application user:

CREATE USER app_user IDENTIFIED BY password;
GRANT CREATE SESSION, CREATE TABLE TO app_user;

Create sample data:

CREATE TABLE sales_orders (...);
INSERT INTO sales_orders VALUES (...);
COMMIT;

Create a separate SQLcl MCP connection for this user.

This ensures:

  • AI only sees approved schemas

  • SYS/SYSTEM access is avoided
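As a concrete illustration of the sample-data step, the sales_orders columns and values below are hypothetical, invented purely for this demo:

```sql
-- Hypothetical demo schema for the app_user account
CREATE TABLE sales_orders (
  order_id    NUMBER PRIMARY KEY,
  customer    VARCHAR2(100),
  amount      NUMBER(10,2),
  order_date  DATE
);

INSERT INTO sales_orders VALUES (1, 'Acme Corp', 2500.00, SYSDATE);
INSERT INTO sales_orders VALUES (2, 'Globex',    1800.50, SYSDATE);
COMMIT;
```

A dedicated MCP connection for this user can then be saved from SQLcl, e.g. `conn -save app_user_mcp -savepwd app_user/password@<IP>:1521/ORCLPDB`.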


Step 9: Test NLP Queries via Claude

Now the magic ✨

Ask Claude a question in plain English, for example: “Show me all records in the sales_orders table.”



Claude:

  • Understands intent

  • Calls SQLcl MCP

  • Executes SQL

  • Returns results

No SQL typing required.



Security Considerations

✔ SQLcl connections are local-only
✔ Credentials stored in user profile
✔ Secure with OS file permissions
✔ Use separate DB users
✔ Optional: Oracle Wallet for credentials

AI never gets raw database access.


Why This Matters

This setup demonstrates:

  • Conversational AI for ad-hoc querying

  • AI + Oracle DB without exposing credentials

  • Perfect for DBAs, Support, and Architects


Final Thoughts

Oracle SQLcl MCP Server bridges the gap between enterprise databases and modern AI—securely, locally, and powerfully.

If you’re running Oracle 19c today, you can already start experimenting with conversational data access.