How to connect to Google Cloud SQL from Cloud Functions?


Solution 1

New Answer:

See other answers, it's now officially supported. https://cloud.google.com/functions/docs/sql
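
For a quick picture of what the supported setup looks like, here is a minimal sketch for PostgreSQL (the database the question asks about), connecting through the Unix socket that Cloud Functions exposes. The environment variable names are placeholders of my own; see the linked docs for the exact, supported configuration.

const { Pool } = require('pg');

// INSTANCE_CONNECTION_NAME, DB_USER, DB_PASS and DB_NAME are placeholder
// environment variables for this sketch, not official names.
const pool = new Pool({
  max: 1,
  host: '/cloudsql/' + process.env.INSTANCE_CONNECTION_NAME,
  user: process.env.DB_USER,
  password: process.env.DB_PASS,
  database: process.env.DB_NAME
});

exports.helloSql = (req, res) => {
  pool.query('SELECT NOW() AS now')
    .then(result => res.send(String(result.rows[0].now)))
    .catch(err => res.status(500).send(err.message));
};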

Old Answer:

It's not currently possible. It is however a feature request on the issue tracker #36388165:

Connecting to Cloud SQL from Cloud Functions is currently not supported, as the UNIX socket does not exist (causing ENOENT) and there is no defined IP range to whitelist (causing ETIMEDOUT). One possibility is to whitelist 0.0.0.0/0 from the Cloud SQL instance but this is not recommended for security reasons.

If this is an important feature for you, I would suggest you visit the issue tracker and star the feature request to help it gain popularity.

Solution 2

I found the answer in the further discussion of #36388165.

Disclaimer: this does not seem to be officially announced, so it may change later. I have also only tested it with MySQL, but given the nature of this solution, the same approach should work with the pg module (it seems to accept a domain socket path as the host parameter).

EDIT (2017/12/07): Google seems to provide official early access, and the same method still works.
EDIT (2018/07/04): It seems someone just copied and pasted my example code and got into trouble. As Google says, you should use a connection pool to avoid leaking SQL connections (a leak causes ECONNREFUSED), so I changed the example code a bit.
EDIT (2019/04/04): In the example below, using $DBNAME for the instance name was confusing, so I modified the example.

In https://issuetracker.google.com/issues/36388165#comment44, a Googler says a Cloud Functions instance can talk to Cloud SQL through a domain socket at the special path '/cloudsql/$PROJECT_ID:$REGION:$DBNAME'.

I was actually able to connect to and operate on Cloud SQL from the Cloud Function code below.

const mysql = require('mysql');

// Create the pool once, outside the handler, so connections are reused across
// invocations instead of leaking one per request.
const pool = mysql.createPool({
    connectionLimit : 1,
    // '$PROJECT_ID:$REGION:$SPANNER_INSTANCE_NAME' is the instance connection
    // name shown on the Cloud SQL instance page.
    socketPath: '/cloudsql/' + '$PROJECT_ID:$REGION:$SPANNER_INSTANCE_NAME',
    user: '$USER',
    password: '$PASS',
    database: '$DATABASE'
});

exports.handler = function handler(req, res) {
    // Use the pool instead of creating a connection on every function call.
    pool.query('SELECT * FROM table WHERE id = ?', [req.body.id], function (e, results) {
        // build the reply here
    });
};
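
As a follow-up to the example above, here is one way the reply part of the callback might be filled in; this is my own sketch, not part of the original answer:

exports.handler = function handler(req, res) {
  pool.query('SELECT * FROM table WHERE id = ?', [req.body.id], function (e, results) {
    if (e) {
      console.error(e);
      return res.status(500).send('query failed');
    }
    // Send the matching rows back as JSON.
    res.status(200).json(results);
  });
};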

I hope this helps those who cannot wait for an official announcement from Google.

Solution 3

Find your database region and instance name on the GCP > SQL > Instances page.


Save your database credentials into the Firebase environment config by running:

$ firebase functions:config:set \
    db.user="<username>" \
    db.password="<password>" \
    db.database="<database>"

Then...

db.js

const { Pool } = require('pg');
const { config } = require('firebase-functions');

// Project ID, region and instance name together form the instance connection name.
const project = process.env.GCP_PROJECT;
const region = 'europe-west1';
const instance = 'db';

module.exports = new Pool({
  max: 1,
  // Connect over the Unix socket that Cloud Functions exposes for Cloud SQL.
  host: `/cloudsql/${project}:${region}:${instance}`,
  // user, password and database come from `firebase functions:config:set`.
  ...config().db
});

someFunction.js

const { https } = require('firebase-functions');
const db = require('./db');

module.exports = https.onRequest((req, res) =>
  db
    .query('SELECT version()')
    .then(({ rows: [{ version }] }) => {
      res.send(version);
    }));

See also https://stackoverflow.com/a/48825037/82686 (using modern JavaScript syntax via Babel)

Solution 4

There's now official documentation for this, though it is still in beta as of July 2018:

https://cloud.google.com/functions/docs/sql

Solution 5

Connecting from Google Cloud Functions to Cloud SQL using TCP and Unix domain sockets (2020)

1. Create a new project:

gcloud projects create gcf-to-sql
gcloud config set project gcf-to-sql
gcloud projects describe gcf-to-sql

2. Enable billing on your project: https://cloud.google.com/billing/docs/how-to/modify-project

3. Set the compute project-info metadata:

gcloud compute project-info describe --project gcf-to-sql
#Enable the API; you can check that google-compute-default-region and google-compute-default-zone are not yet set, then set the metadata.
gcloud compute project-info add-metadata --metadata google-compute-default-region=europe-west2,google-compute-default-zone=europe-west2-b

4. Enable the Service Networking API:

gcloud services list --available
gcloud services enable servicenetworking.googleapis.com

5. Create two Cloud SQL instances (one with an internal IP and one with a public IP) - https://cloud.google.com/sql/docs/mysql/create-instance:

6.a Cloud SQL instance with an external IP:

#Create the Cloud SQL instance
gcloud --project=gcf-to-sql beta sql instances create database-external --region=europe-west2
#Set the password for the "root@%" MySQL user:
gcloud sql users set-password root --host=% --instance database-external --password root 
#Create a user
gcloud sql users create user_name --host=% --instance=database-external  --password=user_password
#Create a database
gcloud sql databases create user_database --instance=database-external
gcloud sql databases list --instance=database-external

6.b Cloud SQL instance with an internal IP:

i. Create a private connection to Google so that the VM instances in the default VPC network can use private services access to reach Google services that support it.

gcloud compute addresses create google-managed-services-my-network --global --purpose=VPC_PEERING --prefix-length=16 --description="peering range for Google" --network=default --project=gcf-to-sql
gcloud services vpc-peerings connect --service=servicenetworking.googleapis.com --ranges=google-managed-services-my-network --network=default --project=gcf-to-sql
#Check whether the operation was successful.
gcloud services vpc-peerings operations describe --name=operations/pssn.dacc3510-ebc6-40bd-a07b-8c79c1f4fa9a
#Listing private connections
gcloud services vpc-peerings list --network=default --project=gcf-to-sql
 
ii. Create the instance:

gcloud --project=gcf-to-sql beta sql instances create database-ipinternal --network=default --no-assign-ip --region=europe-west2
#Set the password for the "root@%" MySQL user:
gcloud sql users set-password root --host=% --instance database-ipinternal --password root
#Create a user
gcloud sql users create user_name --host=% --instance=database-ipinternal  --password=user_password
#Create a database
gcloud sql databases create user_database --instance=database-ipinternal
gcloud sql databases list --instance=database-ipinternal 


gcloud sql instances list
gcloud sql instances describe database-external
gcloud sql instances describe database-ipinternal
#Remember the instances connectionName

OK, so we have two MySQL instances. We will connect from Google Cloud Functions to database-ipinternal using Serverless VPC Access and TCP, and from Google Cloud Functions to database-external using a Unix domain socket.

7. Enable the Cloud SQL Admin API:

gcloud services list --available
gcloud services enable sqladmin.googleapis.com

Note: By default, Cloud Functions does not support connecting to the Cloud SQL instance using TCP. Your code should not try to access the instance using an IP address (such as 127.0.0.1 or 172.17.0.1) unless you have configured Serverless VPC Access.

8.a Ensure the Serverless VPC Access API is enabled for your project:

gcloud services enable vpcaccess.googleapis.com

8.b Create a connector:

gcloud compute networks vpc-access connectors create serverless-connector --network default --region europe-west2 --range 10.10.0.0/28
#Verify that your connector is in the READY state before using it
gcloud compute networks vpc-access connectors describe serverless-connector --region europe-west2

9. Create a service account for your Cloud Function. Ensure that the service account has the Cloud SQL Client IAM role and, for connecting to Cloud SQL on an internal IP, also the Compute Network User role.

gcloud iam service-accounts create cloud-function-to-sql
gcloud projects add-iam-policy-binding gcf-to-sql --member serviceAccount:cloud-function-to-sql@gcf-to-sql.iam.gserviceaccount.com --role roles/cloudsql.client
gcloud projects add-iam-policy-binding gcf-to-sql --member serviceAccount:cloud-function-to-sql@gcf-to-sql.iam.gserviceaccount.com --role roles/compute.networkUser

Now that the setup is configured:

1. Connect from Google Cloud Functions to Cloud SQL using TCP and a Unix domain socket:

cd app-engine-standard/
ls
#main.py requirements.txt

cat requirements.txt
sqlalchemy
pymysql
      
cat main.py
import pymysql
from sqlalchemy import create_engine


def gcf_to_sql(request):
    # TCP: connect through the Serverless VPC Access connector to the private IP
    # of database-ipinternal (replace <INTERNAL_IP> with the IP shown by
    # gcloud sql instances describe database-ipinternal).
    engine_tcp = create_engine('mysql+pymysql://user_name:user_password@<INTERNAL_IP>:3306')
    existing_databases_tcp = engine_tcp.execute("SHOW DATABASES;")
    con_tcp = "Connecting from Google Cloud Functions to Cloud SQL using TCP: databases => " + str([d[0] for d in existing_databases_tcp]).strip('[]') + "\n"
    # Unix domain socket: connect through /cloudsql/<instance connection name>.
    engine_unix_socket = create_engine('mysql+pymysql://user_name:user_password@/user_database?unix_socket=/cloudsql/gcf-to-sql:europe-west2:database-external')
    existing_databases_unix_socket = engine_unix_socket.execute("SHOW DATABASES;")
    con_unix_socket = "Connecting from Google Cloud Functions to Cloud SQL using Unix sockets: databases => " + str([d[0] for d in existing_databases_unix_socket]).strip('[]') + "\n"
    return con_tcp + con_unix_socket
     

2. Deploy the Cloud Function:

gcloud beta functions deploy gcf_to_sql --runtime python37 --region europe-west2 --vpc-connector projects/gcf-to-sql/locations/europe-west2/connectors/serverless-connector  --trigger-http
 

3. Go to Cloud Functions, choose gcf_to_sql, Testing, TEST THE FUNCTION:

#Connecting from Google Cloud Functions to Cloud SQL using TCP: databases => 'information_schema', 'mysql', 'performance_schema', 'sys', 'user_database'
#Connecting from Google Cloud Functions to Cloud SQL using Unix sockets: databases => 'information_schema', 'mysql', 'performance_schema', 'sys', 'user_database'

SUCCESS!
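
Since the original question is about Node.js, here is a hedged sketch of the same TCP-over-VPC-connector approach from a Node.js Cloud Function. DB_PRIVATE_IP is a placeholder for the internal IP of database-ipinternal (shown by gcloud sql instances describe database-ipinternal), and the credentials match the ones created above:

const mysql = require('mysql');

// DB_PRIVATE_IP is a placeholder environment variable for the instance's
// internal IP; this only works with a Serverless VPC Access connector attached.
const pool = mysql.createPool({
  connectionLimit: 1,
  host: process.env.DB_PRIVATE_IP,
  user: 'user_name',
  password: 'user_password',
  database: 'user_database'
});

exports.gcfToSqlTcp = (req, res) => {
  pool.query('SHOW DATABASES', (err, rows) => {
    if (err) return res.status(500).send(err.message);
    res.send(rows.map(r => r.Database).join(', '));
  });
};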

Author: Quang Van

Updated on August 24, 2021

Comments

  • Quang Van, over 2 years ago:

    I am trying to use Cloud Functions for Firebase to build an API that talks with a Google Cloud SQL (PostgreSQL) instance.

    I am using an HTTP(S) trigger.

    When I white-list my desktop's IP address, I can connect to Cloud SQL with the function's Node.js code from my local machine. But when I deploy, I can't connect, and I can't figure out the host IP address of the Firebase Functions server to white-list.

    How do you talk to Google Cloud SQL from Cloud Functions for Firebase?

    Thanks!

    // Code Sample, of what's working on Localhost.
    var functions = require('firebase-functions');
    
    var pg = require('pg');
    var pgConfig = {
      user: functions.config().pg.user,
      database: functions.config().pg.database,
      password: functions.config().pg.password,
      host: functions.config().pg.host
    }

    // Create the pg client from the config (used by client.connect below).
    var client = new pg.Client(pgConfig);
    
    exports.helloSql = functions.https.onRequest((request, response) => {
      console.log('connecting...');
      try {
        client.connect(function(err) {
          if (err) throw err;
    
          console.log('connection success');
          console.log('querying...');
    
          client.query('SELECT * FROM guestbook;', function(err, result){
            if (err) throw err;
    
            console.log('querying success.');
            console.log('Results: ', result);
            console.log('Ending...');
    
            client.end(function(err){
              if (err) throw err;
              console.log('End success.');
              response.send(result);
            });
          });
    
        });
      } catch(er) {
        console.error(er.stack)
        response.status(500).send(er);
      }
    });
    
  • kospol, almost 7 years ago:
    This did work. The socket path is the same as the "instance connection name" in the properties of your instance.
  • rudolph1024, almost 7 years ago:
    Does this work with a First Generation Cloud SQL server? I noticed that the instance connection name does not have a region for my first generation server.
  • takehiro iyatomi, almost 7 years ago:
    @rudolph1024 Have you already tried it? I don't have a 1st gen Cloud SQL server, so I cannot try it myself, but if '$PROJECT_ID:$REGION:$DBNAME' means the "instance connection name", as kospol says, it may work. I would appreciate it if you could try and post the result.
  • rudolph1024, almost 7 years ago:
    @takehiroiyatomi I tried it with $PROJECT_ID:$REGION:$DBNAME and with $PROJECT_ID:$DBNAME, but neither of these worked.
  • takehiro iyatomi, almost 7 years ago:
    @rudolph1024 Thank you for reporting, but I'm sorry to hear that. Maybe this is the reason Google has not announced it yet.
  • Yanai, over 6 years ago:
    Works Perfectly
  • takehiro iyatomi, over 6 years ago:
    @rudolph1024 FYI, if you are still interested in connecting a 1st gen Cloud SQL instance from Cloud Functions, the official doc (docs.google.com/document/d/…) says it is possible with an instance connection name like "<ProjectID>:<InstanceID>".
  • Cris, over 6 years ago:
    Don't you need to whitelist IP addresses to access the Cloud SQL database?
  • takehiro iyatomi, over 6 years ago:
    @Cris I don't need to. Do you?
  • Wes Cossick, over 6 years ago:
    If you need to connect from another Google Cloud Platform project, add <YOUR_PROJECT_ID>@appspot.gserviceaccount.com to your IAM and provide the Cloud SQL Client role.
  • user4092086, over 6 years ago:
    Wondering whether it works for connections through a Firebase free account?
  • takehiro iyatomi, over 6 years ago:
    @user4092086 I don't have a Firebase project, but according to the official user guide and stackoverflow.com/a/42859932/1982282, Cloud Functions for Firebase seems to be a thin wrapper around Google Cloud Functions, and there are no caveats about it in the official guide, so it is likely to work. As in the previous case, I would appreciate it if you could try and post the result.
  • Golden mole, about 5 years ago:
    Looking at your example, what is the difference between $DBNAME and $DATABASE?
  • takehiro iyatomi, about 5 years ago:
    @Goldenmole Certainly this was confusing. $DBNAME is a Spanner instance name and $DATABASE is what other database server products (e.g. MySQL) call a database. I modified the example to clarify the difference. Thanks!
  • Golden mole, about 5 years ago:
    @takehiroiyatomi Thank you for your response. Since I am still new to GCP, could you explain where I can find the value for $SPANNER_INSTANCE_NAME and what it is? I googled "Spanner instance" but I am still not clear about what it is.
  • vovahost, about 5 years ago:
    It is currently possible to connect to Cloud SQL from Cloud Functions easily. There is an official guide as well. Check the other answers.
  • Niklas B, over 4 years ago:
    Do note that my answer is from 2017, so I don't see the need for downvoting it. I will update it to reflect that it's no longer relevant.
  • Shabirmean, over 3 years ago:
    If your Cloud Function is in Java and you are following the GCP doc linked in the answer, the following repo on the GCP GitHub can also be useful: github.com/GoogleCloudPlatform/cloud-sql-jdbc-socket-factory