Google Cloud Functions: How to Create and Deploy Your First Cloud Function

GCP

Why Serverless Computing

Benefits of Cloud Functions:

  • Server provisioning and maintenance are not required; functions scale automatically based on load. It is a fully managed service.
  • Integrated with other GCP services for monitoring, logging, and debugging capability.
  • Built-in security at the role and per function level based on the principle of least privilege.
  • Multiple runtimes are supported, including Node.js, Python, Go, Java, .NET, Ruby, and PHP.
  • We can use Cloud Functions to extend other GCP services with custom logic.

Triggering a Cloud Function
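Cloud Functions can be invoked over HTTP or by events such as a file landing in a Cloud Storage bucket (the deployment example later in this article uses the latter). As a minimal sketch of an HTTP-triggered function, assuming the default Python runtime where the handler receives a `flask.Request` (the function name and greeting are illustrative placeholders):

```python
def hello_http(request):
    """Minimal HTTP-triggered Cloud Function (sketch).

    In the deployed runtime `request` is a flask.Request; for local
    experimentation any object exposing an `args` mapping works.
    """
    # Read an optional ?name=... query parameter, with a default.
    name = request.args.get("name", "World")
    return f"Hello, {name}!"
```

When deployed with an HTTP trigger, the function's URL can be called directly from a browser or `curl`.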

Cost of Execution

  • You are billed for your function’s execution time, metered to the nearest 100 milliseconds; you pay nothing while the function is idle.
  • The first 2 million invocations per month are free; beyond that, invocations are billed at $0.40 per million.
  • The total cost depends on how long each execution takes, how many times the function is invoked, and how much memory/CPU is provisioned for it behind the scenes.
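The invocation portion of the bill is easy to estimate from the numbers above (this sketch covers only invocation charges; compute time and networking are billed separately):

```python
FREE_INVOCATIONS = 2_000_000   # monthly free tier
PRICE_PER_MILLION = 0.40       # USD per million invocations beyond the free tier

def invocation_cost(invocations: int) -> float:
    """Estimate the invocation portion of the monthly bill in USD."""
    billable = max(0, invocations - FREE_INVOCATIONS)
    return billable / 1_000_000 * PRICE_PER_MILLION

# 5 million invocations: 3 million are billable -> 3 x $0.40
print(f"${invocation_cost(5_000_000):.2f}")  # prints $1.20
```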

Cloud Function Deployment Steps

  • Go to Cloud Functions in the GCP console and set the parameters that match your event-trigger requirement.
Cloud Function Creation
  • In the Source section, select the programming language and write the logic for your trigger.
  • Here main.py holds the main Python code and requirements.txt lists the dependency package names.
  • Click Deploy; once the function is up and running, it is invoked whenever the selected event or trigger fires.
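For the BigQuery loader example below, requirements.txt would simply list the packages imported by main.py (pin versions as needed for your project):

```text
# requirements.txt — dependencies for the GCS-to-BigQuery loader
google-cloud-bigquery
pytz
```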
Upload a GCS CSV file to BigQuery via Cloud Functions
```python
from google.cloud import bigquery
from pytz import timezone
from datetime import datetime

dataset_name = ''
table_name = ''
project_id = ''
location_name = ''
base_bucket_loc = ''

def hello_gcs(event, context):
    """Background Cloud Function triggered by a Cloud Storage event."""
    file = event
    print(f"Processing file: {file['name']}.")
    file_name_inp = str(file['name'])
    file_exist(file_name_inp)

def file_exist(file_name_inp):
    # Only load the file whose name matches today's date, e.g. 01-31-2022.csv
    file_inp_gen = getcurrentDate('UTC') + '.csv'
    if file_inp_gen == file_name_inp:
        upload_data(file_name_inp)

def getcurrentDate(timeZone):
    timeDate = timezone(timeZone)
    currDate = str(datetime.now(timeDate).strftime("%m-%d-%Y"))[:10]
    print('Current date is ' + currDate)
    return currDate

def upload_data(file_name_inp):
    client = bigquery.Client(project=project_id, location=location_name)
    table_ref = client.dataset(dataset_name).table(table_name)
    job_config = bigquery.LoadJobConfig()
    job_config.write_disposition = bigquery.WriteDisposition.WRITE_TRUNCATE
    job_config.skip_leading_rows = 1
    # The source format defaults to CSV, so the line below is optional.
    job_config.source_format = bigquery.SourceFormat.CSV
    uri = base_bucket_loc + "/" + file_name_inp
    load_job = client.load_table_from_uri(
        uri, table_ref, job_config=job_config
    )  # API request
    print("Starting job {}".format(load_job.job_id))
    load_job.result()  # Waits for the table load to complete.
    print("Job finished.")
    destination_table = client.get_table(table_ref)
    print("Loaded {} rows.".format(destination_table.num_rows))
```
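The date-matching logic can be exercised locally without deploying anything. A sketch, using the standard-library `datetime.timezone.utc` instead of pytz and a fake event payload; the actual BigQuery load is left out since it needs real GCP credentials:

```python
from datetime import datetime, timezone

def current_date_utc() -> str:
    # Same MM-DD-YYYY format the function uses to build the expected file name.
    return datetime.now(timezone.utc).strftime("%m-%d-%Y")

def should_load(file_name: str) -> bool:
    # Mirrors file_exist(): only today's file triggers the BigQuery load.
    return file_name == current_date_utc() + ".csv"

# Simulate the event payload Cloud Functions passes for a bucket upload.
event = {"name": current_date_utc() + ".csv"}
print(should_load(event["name"]))     # True: the file name matches today's date
print(should_load("01-01-1970.csv"))  # False: a stale file is skipped
```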


Technophile | Backend Developer | Cloud Architect | Big Data Development and Services | Java Enthusiast | Transaction System | Data analysis | Data Engineer ❤

Kapil Jain