How to load the Databricks package dbutils in PySpark
Solution 1
As explained in https://docs.azuredatabricks.net/user-guide/dev-tools/db-connect.html#access-dbutils, you should initialize dbutils as below, depending on whether your code executes directly on a Databricks cluster (e.g. a Databricks notebook invoking your project's egg file) or from your IDE via databricks-connect (where spark is your SparkSession):
def get_dbutils(spark):
    try:
        # When running via databricks-connect, DBUtils can be imported
        from pyspark.dbutils import DBUtils
        dbutils = DBUtils(spark)
    except ImportError:
        # Inside a Databricks notebook, dbutils is already injected
        # into the IPython user namespace
        import IPython
        dbutils = IPython.get_ipython().user_ns["dbutils"]
    return dbutils

dbutils = get_dbutils(spark)
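Outside a Databricks environment, neither branch of the helper above is available, but the slice of the dbutils surface the question uses can be exercised locally with a stub. FakeWidgets and FakeDBUtils below are hypothetical stand-ins invented for this sketch; they are not part of pyspark or any Databricks library:

```python
# Sketch only: a local stand-in mimicking dbutils.widgets.text/get.
# FakeWidgets and FakeDBUtils are hypothetical names, not real
# Databricks classes.

class FakeWidgets:
    def __init__(self):
        self._values = {}

    def text(self, name, default_value, label=None):
        # Registers a text widget with a default value, like
        # dbutils.widgets.text(name, defaultValue, label)
        self._values.setdefault(name, default_value)

    def get(self, name):
        # Returns the widget's current value, like dbutils.widgets.get
        return self._values[name]


class FakeDBUtils:
    def __init__(self):
        self.widgets = FakeWidgets()


dbutils = FakeDBUtils()
dbutils.widgets.text('config', '', 'config')
value = dbutils.widgets.get('config')  # the default, an empty string
```

This lets unit tests of notebook-style code run without a cluster; only the calls actually used need to be stubbed.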
Solution 2
As of Databricks Runtime 3.0, the answer provided by pprasad009 above no longer works. Use the following instead:
def get_db_utils(spark):
    dbutils = None
    if spark.conf.get("spark.databricks.service.client.enabled") == "true":
        # databricks-connect client: import DBUtils directly
        from pyspark.dbutils import DBUtils
        dbutils = DBUtils(spark)
    else:
        # Databricks notebook: reuse the dbutils object injected
        # into the IPython user namespace
        import IPython
        dbutils = IPython.get_ipython().user_ns["dbutils"]
    return dbutils
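The branch taken by get_db_utils depends on a single Spark conf flag, which databricks-connect sets to "true". That dispatch can be sketched in isolation with a fake SparkSession-like object; FakeConf and FakeSpark are hypothetical stand-ins, not real pyspark classes:

```python
# Hypothetical stand-ins to illustrate the dispatch in get_db_utils.

class FakeConf:
    def __init__(self, values):
        self._values = values

    def get(self, key):
        # Like spark.conf.get, but backed by a plain dict
        return self._values.get(key)


class FakeSpark:
    def __init__(self, client_enabled):
        self.conf = FakeConf(
            {"spark.databricks.service.client.enabled": client_enabled}
        )


def dbutils_source(spark):
    # Mirrors the check in get_db_utils: databricks-connect sets the
    # flag to "true"; inside a notebook it is absent or "false".
    if spark.conf.get("spark.databricks.service.client.enabled") == "true":
        return "pyspark.dbutils"   # remote client: import DBUtils
    return "IPython user_ns"       # notebook: reuse the injected dbutils
```

For example, dbutils_source(FakeSpark("true")) selects the pyspark.dbutils import path, while dbutils_source(FakeSpark(None)) falls back to the notebook branch.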
See: https://docs.microsoft.com/en-gb/azure/databricks/dev-tools/databricks-connect#access-dbutils
Solution 3
In Scala you can use:

import com.databricks.dbutils_v1.DBUtilsHolder.dbutils

See the following link for more details on the dependency:
https://docs.databricks.com/user-guide/dev-tools/dbutils.html
Babu
Updated on July 17, 2022

Comments
-
Babu almost 2 years
I was trying to run the code below in PySpark.

dbutils.widgets.text('config', '', 'config')

It threw the following error:

Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
NameError: name 'dbutils' is not defined

Is there any way I can run it in PySpark by including the Databricks package, like an import? Your help is appreciated.
-
Babu over 5 years: Yes Ritesh, but I don't have a Databricks cluster, so I am just looking for an alternative way to import the packages.
-
Ritesh over 5 years: As per my knowledge, you have to run your code on a Databricks cluster if you wish to use dbutils. Please let me know if you find an alternative.