The Databricks CLI (Command-Line Interface) is a powerful, open-source tool that lets users interact with the Databricks platform directly from the terminal, command prompt, or automation scripts. In the context of SAP Databricks, the natively embedded, SAP-managed version of Databricks within SAP Business Data Cloud (BDC), the CLI provides the same core functionality as in standard Databricks deployments on AWS, Azure, or GCP. While SAP Databricks focuses on seamless integration with SAP data (via zero-copy Delta Sharing, semantic preservation, and tight coupling with SAP applications such as S/4HANA), the CLI itself remains the standard Databricks CLI (version 0.205+ as of 2026, with many advanced features in public preview). It allows data engineers, administrators, and developers to manage workspaces, automate workflows, and handle SAP-enriched data pipelines more efficiently outside the web UI.

What Is the Databricks CLI?

The Databricks CLI wraps the Databricks REST API, giving you command-line access to almost every resource and operation available in the platform. Key capabilities include:

- Managing compute clusters
- Handling jobs and workflows
- Syncing code, notebooks, and files (DBFS or workspace)
- Administering Unity Catalog (metastores, catalogs, shares)
- Configuring account-level settings

In SAP Databricks environments, you can use it to:

- Manage workspaces integrated with SAP BDC
- Automate ingestion and transformation of SAP data products shared via Delta Sharing
- Orchestrate jobs that combine SAP context-rich data with external sources

Installation and Setup

Installation is straightforward and identical to standard Databricks:
# On macOS (Homebrew)
brew install databricks
# Or download binary from GitHub releases
# https://github.com/databricks/cli/releases
# Then configure (OAuth or token-based)
databricks configure --profile sap-bdc
You can define multiple profiles in ~/.databrickscfg for different environments (e.g., dev SAP Databricks workspace vs. prod).
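Profiles live in a simple INI-style file. A minimal sketch of what ~/.databrickscfg might look like with two profiles; the hostnames and token are placeholders, not real values:

```ini
; ~/.databrickscfg -- one section per profile
[sap-bdc]
host = https://example-dev-workspace.cloud.databricks.com
; OAuth-based login; the alternative is a personal access token (see below)
auth_type = databricks-cli

[prod]
host = https://example-prod-workspace.cloud.databricks.com
token = dapiXXXXXXXXXXXXXXXX
```

You then select a profile per command, e.g. databricks clusters list --profile sap-bdc.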
Useful Commands for Everyday and SAP-Relevant Work

Here are some of the most practical commands, especially valuable in SAP-integrated scenarios:

Cluster Management

- List clusters: databricks clusters list
- Get cluster details: databricks clusters get <cluster-id>
- Start/stop/restart: databricks clusters start <cluster-id>
- Create from a JSON config: databricks clusters create --json-file cluster.json
{
  "cluster_name": "prod-cluster",
  "spark_version": "13.3.x-scala2.12",
  "node_type_id": "i3.xlarge",
  "autotermination_minutes": 30,
  "enable_elastic_disk": true
}
→ Benefit in SAP contexts: Quickly spin up clusters for processing large SAP extractions or ML training on business data without clicking through the UI.
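Because cluster specs are plain JSON, they are easy to generate from a script rather than hand-edit. A minimal sketch (the values mirror the example above) that writes cluster.json and prints the matching CLI call:

```python
# Sketch: emit the cluster spec shown above from code, so per-environment
# variants (dev vs. prod) can be templated instead of hand-edited.
import json

cluster = {
    "cluster_name": "prod-cluster",
    "spark_version": "13.3.x-scala2.12",
    "node_type_id": "i3.xlarge",
    "autotermination_minutes": 30,  # auto-stop idle clusters to limit cost
    "enable_elastic_disk": True,
}

with open("cluster.json", "w") as f:
    json.dump(cluster, f, indent=2)

# The CLI call that consumes the file (run separately in a shell):
print("databricks clusters create --json-file cluster.json")
```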
Jobs and Workflows

- List jobs: databricks jobs list
- Run a job immediately: databricks jobs run-now --job-id <job-id>
- List job runs: databricks jobs list-runs --job-id <job-id>
- Export a job config: databricks jobs get --job-id <job-id> --output-json > job.json
{
  "name": "daily-batch-job",
  "tasks": [
    {
      "task_key": "ingest-data",
      "job_cluster_key": "prod-cluster",
      "spark_python_task": {
        "python_file": "/Shared/ingest.py"
      }
    }
  ]
}
→ Automate nightly ETL jobs that transform SAP delta-shared tables into analytics-ready layers.
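The job definition above can likewise be generated rather than hand-written. A sketch (the function name and defaults are illustrative) that parameterizes the script path and job cluster per environment:

```python
# Sketch: build the job spec shown above in code, so the same template can
# target different scripts or job clusters per environment.
import json

def daily_batch_job(python_file: str, cluster_key: str = "prod-cluster") -> dict:
    # Mirrors the JSON above: one ingest task on a named job cluster.
    return {
        "name": "daily-batch-job",
        "tasks": [
            {
                "task_key": "ingest-data",
                "job_cluster_key": cluster_key,
                "spark_python_task": {"python_file": python_file},
            }
        ],
    }

with open("job.json", "w") as f:
    json.dump(daily_batch_job("/Shared/ingest.py"), f, indent=2)
```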
Workspace & Notebooks

- Sync a local folder to the workspace: databricks sync . /Users/your.name/project
- List notebooks: databricks workspace ls /Users/your.name
- Export a notebook: databricks workspace export /path/to/notebook --format SOURCE > notebook.py

→ Ideal for CI/CD: push transformed notebooks or Delta Live Tables pipelines from Git to the SAP Databricks workspace.
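In a CI pipeline, those commands are typically driven from a script. A dry-run sketch (paths are the placeholders used above) that composes each command as an argument list; a real pipeline would hand each list to subprocess.run(cmd, check=True):

```python
# Sketch: compose CI steps as argument lists; printing them gives a dry run,
# and subprocess.run(cmd, check=True) would execute them for real.
import shlex

steps = [
    ["databricks", "sync", ".", "/Users/your.name/project"],
    ["databricks", "workspace", "export", "/path/to/notebook", "--format", "SOURCE"],
]

for cmd in steps:
    print(shlex.join(cmd))
```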
DBFS (Databricks File System)

- Upload a file: databricks fs cp localfile.txt dbfs:/mnt/sap-data/
- Download a file: databricks fs cp dbfs:/output/results.csv .
- List a directory: databricks fs ls dbfs:/mnt/

→ Move SAP export CSVs, model artifacts, or interim results without browser-based uploads.
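When there are many files, one fs cp per file is easy to script. A dry-run sketch (the exports/ folder and its CSV are created only for the demo) that prints the upload command for each file:

```python
# Sketch: batch-generate 'databricks fs cp' commands for a folder of SAP
# export CSVs (dry run: commands are printed, not executed).
from pathlib import Path

local_dir = Path("exports")  # illustrative local folder of exports
local_dir.mkdir(exist_ok=True)
(local_dir / "sales.csv").write_text("order,amount\n1,100\n")  # demo file

for csv in sorted(local_dir.glob("*.csv")):
    print(f"databricks fs cp {csv} dbfs:/mnt/sap-data/{csv.name}")
```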
Unity Catalog (governance for SAP data)

- List catalogs: databricks catalogs list
- List shares (Delta Sharing): databricks shares list
- Create a recipient for sharing to/from SAP BDC: (via API/CLI equivalents)

→ Manage access to SAP Business Data Cloud shared data products securely.
Other Handy Ones

- Current version: databricks --version
- Help for any command group: databricks clusters --help
- Output formatting: --output JSON or --output TABLE
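JSON output is what makes the CLI composable: capture stdout, parse it, and act on it. A sketch using a hard-coded sample of what a cluster listing could look like (the real data would come from databricks clusters list --output JSON; the cluster IDs below are made up):

```python
# Sketch: act on machine-readable CLI output -- here, print a start command
# for every terminated cluster found in a (sample) cluster listing.
import json

sample_stdout = '''
[
  {"cluster_id": "0131-aaaa-example", "state": "TERMINATED"},
  {"cluster_id": "0131-bbbb-example", "state": "RUNNING"}
]
'''  # stand-in for the stdout of: databricks clusters list --output JSON

for cluster in json.loads(sample_stdout):
    if cluster["state"] == "TERMINATED":
        print(f"databricks clusters start {cluster['cluster_id']}")
```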
Key Benefits of Using the Databricks CLI (Especially with SAP Databricks)

Benefit | Description | Value in SAP Databricks Context
--- | --- | ---
Automation & Scripting | Script repetitive tasks in Bash, PowerShell, Python, etc. | Automate provisioning for SAP data pipelines or dev/test environments
CI/CD Integration | Native fit with GitHub Actions, Jenkins, Azure DevOps | Deploy notebooks/pipelines that process SAP and non-SAP data
Speed & Efficiency | No need to switch to the browser; faster for power users | Quick troubleshooting of SAP Delta Sharing ingestion jobs
Bulk & Admin Operations | Create 50+ users, assign catalogs, manage IP lists in one script | Onboard teams working with SAP-enriched Unity Catalog assets
Error Reduction | Avoid manual UI mistakes in complex setups | Consistent handling of semantically rich SAP data products
Remote & Headless | Run from servers, containers, or local machines without UI access | Automate in restricted corporate environments
Cost Control | Script cluster start/stop based on schedule/usage | Optimize spend on clusters processing large SAP datasets
- Explore the Databricks CLI Reference
- Combine the CLI with the Databricks Terraform Provider for infrastructure-as-code
- Build end-to-end pipelines using Databricks Workflows + CLI scripts
Conclusion
The Databricks CLI transforms how you interact with SAP Databricks, turning a powerful but UI-heavy platform into something scriptable, repeatable, and DevOps-friendly. Whether you're an SAP data engineer building real-time analytics on S/4HANA extracts, a platform admin governing Delta Sharing between SAP BDC and other systems, or an AI developer training models on enriched business data, the CLI saves hours of manual work every week.

If you're just starting with SAP Databricks, install the CLI early; it will quickly become your go-to for anything beyond quick exploratory work in the notebook editor.
#SAP
#SAPTechnologyblog