Databricks Session Util

A simple utility that builds Spark and MLflow session objects from a .env file

Setup

Quick Install

python -m pip install databricks_session

Build from source

Clone the repository

git clone https://github.com/Broomva/databricks_session.git

Install the package

cd databricks_session && make install

Build manually

After cloning, create a virtual environment

conda create -n databricks_session python=3.10
conda activate databricks_session

Install the requirements

pip install -r requirements.txt

Run the Python installation

python setup.py install
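Whichever method you use, you can sanity-check the installation by importing the package (this one-liner is just an illustrative check, not part of the project's tooling):

python -c "import databricks_session"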

Usage

Usage requires a .env file in the project's local folder:

touch .env

It should have a schema like this:

databricks_experiment_name=''
databricks_experiment_id=''
databricks_host=''
databricks_token=''
databricks_username=''
databricks_password=''
databricks_cluster_id=''
databricks_sql_http_path=''
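For reference, a populated file might look like the following. All values here are illustrative placeholders; substitute your own workspace details and never commit real credentials:

databricks_experiment_name='/Users/someone@example.com/my_experiment'
databricks_experiment_id='1234567890123456'
databricks_host='https://adb-1234567890123456.7.azuredatabricks.net'
databricks_token='dapiXXXXXXXXXXXXXXXXXXXXXXXX'
databricks_username='someone@example.com'
databricks_password='********'
databricks_cluster_id='0123-456789-abcdefgh'
databricks_sql_http_path='/sql/1.0/warehouses/abcdef0123456789'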
With the .env file in place, import the session classes and create the session objects:

from databricks_session import DatabricksSparkSession, DatabricksMLFlowSession

# Create a Spark session
spark = DatabricksSparkSession().get_session()

# Connect to the MLflow artifact/tracking server
mlflow_session = DatabricksMLFlowSession().get_session()
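As a quick smoke test, the returned objects can then be used like their underlying libraries. The snippet below is a minimal sketch, assuming spark is a standard pyspark.sql.SparkSession and mlflow_session exposes the standard mlflow tracking API (run names and metric names are hypothetical):

# Run a trivial query against the remote cluster
spark.sql("SELECT 1 AS ok").show()

# Log a test metric to the configured MLflow experiment
with mlflow_session.start_run(run_name="smoke_test"):
    mlflow_session.log_metric("dummy_metric", 1.0)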
