How to list all tables in a database using Spark SQL?

The answer to this question isn't actually Spark-specific: you just need to load information_schema.tables over JDBC.

The information schema consists of a set of views that contain information about the objects defined in the current database. The information schema is defined in the SQL standard and can therefore be expected to be portable and remain stable, unlike the system catalogs, which are specific to each RDBMS and are modelled after implementation concerns.

I'll be using MySQL for the snippet below; the instance contains an enwiki database whose tables I want to list:

# read the information_schema.tables view over JDBC,
# then keep only the rows for the database we care about
spark.read.format('jdbc') \
     .options(
         url='jdbc:mysql://localhost:3306/',  # database url (local or remote)
         dbtable='information_schema.tables',
         user='root',
         password='root',
         driver='com.mysql.jdbc.Driver') \
     .load() \
     .filter("table_schema = 'enwiki'") \
     .show()
# +-------------+------------+----------+----------+------+-------+----------+----------+--------------+-----------+---------------+------------+----------+--------------+--------------------+-----------+----------+---------------+--------+--------------+-------------+
# |TABLE_CATALOG|TABLE_SCHEMA|TABLE_NAME|TABLE_TYPE|ENGINE|VERSION|ROW_FORMAT|TABLE_ROWS|AVG_ROW_LENGTH|DATA_LENGTH|MAX_DATA_LENGTH|INDEX_LENGTH| DATA_FREE|AUTO_INCREMENT|         CREATE_TIME|UPDATE_TIME|CHECK_TIME|TABLE_COLLATION|CHECKSUM|CREATE_OPTIONS|TABLE_COMMENT|
# +-------------+------------+----------+----------+------+-------+----------+----------+--------------+-----------+---------------+------------+----------+--------------+--------------------+-----------+----------+---------------+--------+--------------+-------------+
# |          def|      enwiki|      page|BASE TABLE|InnoDB|     10|   Compact|   7155190|           115|  828375040|              0|   975601664|1965031424|      11359093|2017-01-23 08:42:...|       null|      null|         binary|    null|              |             |
# +-------------+------------+----------+----------+------+-------+----------+----------+--------------+-----------+---------------+------------+----------+--------------+--------------------+-----------+----------+---------------+--------+--------------+-------------+
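
If the instance hosts many schemas, the filter can also be pushed down to MySQL itself by passing a subquery as dbtable. This is just a sketch using the same connection options as above; the alias t is arbitrary, but the JDBC source requires one:

# let MySQL evaluate the WHERE clause; Spark only sees the matching rows
spark.read.format('jdbc') \
     .options(
         url='jdbc:mysql://localhost:3306/',
         dbtable="(SELECT table_name FROM information_schema.tables "
                 "WHERE table_schema = 'enwiki') AS t",
         user='root',
         password='root',
         driver='com.mysql.jdbc.Driver') \
     .load() \
     .show()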

Note: the same approach works in Scala and Java, subject to each language's own constraints.
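
If you want the table names back as a plain Python list rather than a DataFrame, a small helper along these lines will do. This is a sketch only: list_tables is a made-up name, and db_config is assumed to hold the url/user/password/driver options, as in the question below.

def list_tables(spark, db_config, schema):
    # db_config is assumed to contain url, user, password and driver
    df = spark.read.format('jdbc') \
        .options(dbtable='information_schema.tables', **db_config) \
        .load() \
        .filter("table_schema = '{}'".format(schema)) \
        .select('TABLE_NAME')
    return [row.TABLE_NAME for row in df.collect()]

# e.g. list_tables(spark, db_config, 'enwiki')  ->  ['page', ...]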

Comments

  • Abe almost 2 years

    I have a SparkSQL connection to an external database:

    from pyspark.sql import SparkSession
    
    spark = SparkSession \
      .builder \
      .appName("Python Spark SQL basic example") \
      .getOrCreate()
    

    If I know the name of a table, it's easy to query.

    # db_config holds the JDBC connection options (url, user, password, driver)
    users_df = spark \
      .read.format("jdbc") \
      .options(dbtable="users", **db_config) \
      .load()
    

    But is there a good way to list/discover tables?

    I want the equivalent of SHOW TABLES in MySQL, or \dt in Postgres.

    I'm using pyspark v2.1, in case that makes any difference.

  • Jacek Laskowski about 7 years
    Nitpicking: I'd change filter to use Column (not String) or even where and work with a real Dataset.
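
For reference, Jacek's suggestion would look roughly like this in pyspark, with df standing for the loaded information_schema DataFrame:

from pyspark.sql.functions import col

# same predicate as .filter("table_schema = 'enwiki'"),
# expressed with a Column instead of a SQL string
df.where(col('table_schema') == 'enwiki')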