Google Cloud Functions: How To Create And Deploy Your First Cloud Function

Kapil Jain
4 min read · Dec 26, 2021


If you are a Data Engineer, Big Data Engineer, Cloud Engineer, or simply interested in exploring more GCP services, here is another blog of mine, this time on Google Cloud Functions. It walks through the basics: why they are needed, what their benefits are, and how to execute your first Cloud Function workflow.


Google Cloud Functions:

Cloud Functions is the serverless computing service available on Google Cloud Platform.
Serverless is a way to run modular pieces of code at the edge. It is a scalable service and a cost-effective way to execute event-driven code.
It requires zero server maintenance.

Why Serverless Computing

Serverless computing is quite popular among developers and programmers today, and the main reason is that the cloud (or a containerized service) provides a ready-made runtime environment.

Function-as-a-Service (FaaS) lets developers write and update a piece of code on the fly, which can then be executed in response to an event.

A serverless compute service runs your code in response to events and automatically manages the underlying compute resources. Put simply, a developer can focus on the event and the business logic rather than on configuring and managing the surrounding infrastructure.
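To make this concrete, here is a minimal sketch of an HTTP-triggered Cloud Function in Python. The function name hello_http and the greeting are placeholders of my own; the only requirement from the runtime is that an HTTP function accepts a Flask request object and returns a response.

def hello_http(request):
    # 'request' is the Flask request object supplied by the Cloud Functions runtime.
    name = request.args.get('name', 'World')
    return 'Hello {}!'.format(name)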

Benefits of Cloud Functions:

  • Server provisioning and maintenance are not required.
  • Automatically scales based on the load; in effect, a fully managed service.
  • Integrated with other GCP services for monitoring, logging, and debugging capability.
  • Built-in security at the role and per function level based on the principle of least privilege.
  • Multiple languages are supported, including Node.js, Python, Go, Java, .NET, Ruby, and PHP.
  • We can use Cloud Functions to extend other GCP services with custom logic.

Trigger Cloud Function

Cloud Functions allows you to trigger your code from Google Cloud services such as Firebase, Cloud Scheduler, Cloud Pub/Sub, and Cloud Storage events, from Google Assistant, or to call it directly from any web, mobile, or backend application via HTTP.
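As a sketch of a non-HTTP trigger, here is what a background function subscribed to a Cloud Pub/Sub topic could look like. The function and variable names are placeholders; the (event, context) signature and the base64-encoded 'data' field are the standard shape of a Pub/Sub background event.

import base64

def handle_pubsub(event, context):
    # Pub/Sub delivers the message payload base64-encoded under the 'data' key.
    if 'data' in event:
        message = base64.b64decode(event['data']).decode('utf-8')
        print('Received Pub/Sub message: {}'.format(message))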

Cost of Execution

  • Billed for your function’s execution time, metered to the nearest 100 milliseconds. You pay nothing when your function is idle.
  • The first 2 million invocations per month are free; after that, you pay $0.40 per million invocations.
  • The overall cost depends on how long each execution takes, how many times the function is invoked, and how many resources are provisioned behind the scenes (a rough illustration follows below).
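As a rough illustration of the invocation charge alone (ignoring compute-time and memory charges, and assuming the $0.40-per-million rate mentioned above):

FREE_INVOCATIONS = 2_000_000      # monthly free tier
PRICE_PER_MILLION = 0.40          # USD per million invocations beyond the free tier

def invocation_cost(invocations_per_month):
    billable = max(0, invocations_per_month - FREE_INVOCATIONS)
    return billable / 1_000_000 * PRICE_PER_MILLION

print(invocation_cost(5_000_000))  # 3 million billable invocations -> $1.20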

Cloud Function Deployment Steps

  • Go to Cloud Functions in the GCP console and choose the trigger parameters that match your event requirement.
Cloud Function Creation
  • In the Source section, select the programming language and write the logic for your event trigger.
  • Here main.py holds the main Python code and requirements.txt lists all the dependencies (see the sketch after this list).
  • Click Deploy; once deployment finishes, the function is up and running and is invoked whenever the selected event fires.
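For the sample program below, a minimal requirements.txt would simply list the client libraries the code imports; the entries here mirror those imports and nothing more:

google-cloud-bigquery
pytz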

Sample Program

Create a Cloud Function that is invoked whenever a new CSV file is created in a specific Cloud Storage location; the function then reads the file's data and uploads it into BigQuery.

Cloud Storage (object storage, similar to HDFS) → (CSV file created in the bucket) → Cloud Function triggered → Data uploaded into BigQuery (Google's managed data warehouse)

Upload a GCS CSV file to BigQuery via Cloud Functions

Cloud Function code for the above workflow:

from google.cloud import bigquery
from pytz import timezone
from datetime import datetime

# Fill these in for your project before deploying.
dataset_name = ''
table_name = ''
project_id = ''
location_name = ''
base_bucket_loc = ''  # e.g. a gs:// bucket path


def hello_gcs(event, context):
    """Entry point: triggered when an object is created in the bucket."""
    file = event
    print(f"Processing file: {file['name']}.")
    file_name_inp = str(file['name'])
    file_exist(file_name_inp)


def file_exist(file_name_inp):
    # Only load the file whose name matches today's date, e.g. "12-26-2021.csv".
    file_inp_gen = getcurrentDate('UTC') + '.csv'
    if file_inp_gen == file_name_inp:
        upload_data(file_name_inp)


def getcurrentDate(timeZone):
    timeDate = timezone(timeZone)
    currDate = str(datetime.now(timeDate).strftime("%m-%d-%Y"))[:10]
    print('Current date is ' + currDate)
    return currDate


def upload_data(file_name_inp):
    client = bigquery.Client(project=project_id, location=location_name)
    table_ref = client.dataset(dataset_name).table(table_name)
    job_config = bigquery.LoadJobConfig()
    job_config.write_disposition = bigquery.WriteDisposition.WRITE_TRUNCATE
    job_config.skip_leading_rows = 1
    # The source format defaults to CSV, so the line below is optional.
    job_config.source_format = bigquery.SourceFormat.CSV
    uri = base_bucket_loc + "/" + file_name_inp
    load_job = client.load_table_from_uri(
        uri, table_ref, job_config=job_config
    )  # API request
    print("Starting job {}".format(load_job.job_id))
    load_job.result()  # Waits for table load to complete.
    print("Job finished.")
    destination_table = client.get_table(table_ref)
    print("Loaded {} rows.".format(destination_table.num_rows))

Code execution steps:
  • Create a bucket on Google Cloud Storage.
  • Create a BigQuery table with the required columns.
  • Create a Cloud Function with the above code and set its trigger to the bucket you just created.
  • Create a CSV file inside that bucket; the event triggers the Cloud Function and the data is uploaded into BigQuery. (A sample set of commands for these steps follows below.)
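The same steps can also be scripted from the command line. This is a hedged sketch, not an exact recipe: the bucket name, dataset, region, and runtime below are placeholders to replace with your own values.

gsutil mb -l us-central1 gs://my-demo-bucket
bq mk --dataset my_project:my_dataset
gcloud functions deploy hello_gcs \
    --runtime python39 \
    --trigger-resource my-demo-bucket \
    --trigger-event google.storage.object.finalize \
    --region us-central1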

Questions?
If you have any questions, I will be happy to answer them. Please leave a comment for follow-up.
For more updates, follow me on Medium or LinkedIn.

Hope you enjoy learning and are now ready to run your first Google Cloud Function. ❤

- Want to understand the basics of GCP data flow and create your first application? Click

Written by Kapil Jain

Technophile | Backend Developer | Cloud Architect | Big Data Development and Services | Java Enthusiast | Transaction System | Data analysis | Data Engineer ❤
