In this tutorial, I'll show you how to create Google Cloud Functions using the GCP Console, the gcloud CLI, and Terraform. Let's create our first function from the UI. Go to the console and click on Cloud Functions. It will take a few seconds to enable Cloud Functions in your GCP project. When it's done, click on CREATE FUNCTION. Give it a name, for example, first-function. Then select the region. I'll choose us-central1, which is one of the
cheapest regions in Google Cloud. It's perfect for the demo. There are multiple ways to trigger this function: an HTTP trigger, Pub/Sub, and others. I'll start with the simplest one, the HTTP
trigger. You can already see the URL that we can use to trigger the function. Now, if you're building a public API, you
want to disable Authentication. On the other hand, if you're going to keep
this function private, leave Require authentication selected. We will enable authentication for the next Google function. Also, you have the choice to disable HTTPS and
allow HTTP as well. In general, you would always want to secure
all the communications between your services. Unless you have a specific requirement, keep
this option on. Under Runtime, build, connections, and security
settings, you can find additional options. We will use some of them later in the tutorial. We will create and use service accounts, and we will also create an API key in Secret Manager; I'll show you how to access it from a Cloud Function. SAVE the settings and click NEXT. If you have never used the Cloud Build API before
in this project, you may get a warning. Click Enable API to continue. Under Runtime, you can select the language
that you want to use. I'll start with Node.js, and then we will create a few Python functions. To create a function, you can write code directly in the console. You can also create a function locally and upload it as a zip archive, or use the gcloud CLI to simplify the deployment process. We will start by modifying the function in
the UI. If your function requires third-party npm modules, you can declare them in the package.json file.
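For example, a minimal package.json with a single third-party dependency might look like this (the escape-html module here is only an illustration):

    {
      "name": "first-function",
      "version": "1.0.0",
      "dependencies": {
        "escape-html": "^1.0.3"
      }
    }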
The function generated by Google checks for a message in the URL query parameters and the request payload; if neither is present, it returns Hello World. Let's deploy our first function. To invoke it, click on first-function. Then, under Trigger, you can find the URL. Copy it and use curl to call it. It should return Hello World.
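The exact URL is shown under your function's trigger; with a placeholder project id, the call looks like this:

    curl https://us-central1-<your-project-id>.cloudfunctions.net/first-function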
HTTP functions can be invoked by different kinds of identities, originating in different places. The invoker can be a developer who is testing
the function or another function or service that wants to use the function. These identities must provide an ID token
with the request to authenticate themselves. In addition, the account being used must also
have been granted the appropriate permissions. As a developer, you are also likely to need
to invoke your functions for testing purposes. To invoke a function using curl or similar
tools, you need the cloudfunctions.functions.invoke permission. Let's check if my current user has it. Go to IAM & Admin, then IAM. I have the Organization Administrator and Owner roles. Let's check if the Owner role includes this permission: select the Owner role and search for cloudfunctions.functions.invoke. Now, let's quickly create another function
and require Authentication. Let's call it secure-function. Keep all the default parameters and click
deploy. Try to invoke this function using a similar
technique. You will get an error: Your client does not have permission to get URL... To be able to invoke this function, we need to include a bearer token in the Authorization header. You need to have the gcloud CLI installed for that.
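Assuming gcloud is installed and authenticated, the call looks like this (the URL is a placeholder for your secure-function trigger URL):

    curl -H "Authorization: Bearer $(gcloud auth print-identity-token)" \
      https://us-central1-<your-project-id>.cloudfunctions.net/secure-function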
When building services that connect multiple functions, it's a good idea to ensure that each function can only send requests to a
specific subset of your other functions. For example, if you have a login function,
it should be able to access the user profiles function, but it probably shouldn't be able
to access the search function. To configure the receiving function to accept requests from a specific calling function, you need to grant the Cloud Functions Invoker (roles/cloudfunctions.invoker) role to the calling function's service account on the receiving function.
To demonstrate, let's create two service accounts: one for function-a and one for function-b. In this demo, function-b will invoke function-a. In function-a, just update the return string to Hello from Function A! To allow function-b to call function-a, we need to add a principal on function-a: paste in the service account email for function-b and select the Cloud Functions Invoker role.
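The same grant can be made from the command line (the project id is a placeholder):

    gcloud functions add-iam-policy-binding function-a \
      --region=us-central1 \
      --member="serviceAccount:function-b@<your-project-id>.iam.gserviceaccount.com" \
      --role="roles/cloudfunctions.invoker"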
Next, create another Cloud Function and call it function-b. Check Allow unauthenticated invocations, and
select the function-b service account. To be able to invoke the other function, we
need to use the GoogleAuth module to get a token. Then specify the URL of the function that
you want to invoke. Assuming you don't have any extra path arguments in the URL (for example, delete/id), set targetAudience equal to the URL. In a try-catch block, try to invoke the function and return the content to the client. If it fails, return the error. Returning the exact error may be a security issue, but for the demo, it's fine. Don't forget to update the Entry point for this function as well, to invokeFuncA. Since we're using the third-party module google-auth-library, we need to declare it in the package.json file.
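Here is a minimal sketch of function-b's index.js; the function-a URL is a placeholder, and the error handling is demo-grade only:

    const { GoogleAuth } = require('google-auth-library');

    const auth = new GoogleAuth();
    // Placeholder: use the trigger URL of your function-a.
    const url = 'https://us-central1-<your-project-id>.cloudfunctions.net/function-a';

    exports.invokeFuncA = async (req, res) => {
      try {
        // Get a client that attaches an ID token with the
        // target audience set to the function-a URL.
        const client = await auth.getIdTokenClient(url);
        const response = await client.request({ url });
        res.status(200).send(response.data);
      } catch (err) {
        // Returning the raw error is acceptable for a demo only.
        res.status(500).send(err.message);
      }
    };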
Deploy the function and copy its URL. Use curl to call function-b. It should return Hello from Function A! You can use Secret Manager to securely store
API keys, passwords, and other sensitive information. Let's create an API key in Secret Manager,
call it api-key. If you haven't used Secret Manager before,
you may need to enable the Secret Manager API. Click Create secret and give it the name api-key. For the secret value, let's use devops123. Leave the other default parameters and create
a secret. To grant our function access to this secret, we need to create a service account and add it as a principal on the secret. Name it secret-function. If you wanted this function to access all the secrets in this project, which is unlikely, you could grant project-level permissions instead. Don't forget to copy the email of that service
account. Next, add this email as a principal on the api-key secret, with the Secret Manager Secret Accessor role.
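The equivalent gcloud command would look something like this (the service account email is a placeholder):

    gcloud secrets add-iam-policy-binding api-key \
      --member="serviceAccount:secret-function@<your-project-id>.iam.gserviceaccount.com" \
      --role="roles/secretmanager.secretAccessor"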
To test, create a new function. Give it the name secret-function. Disable authentication. Under advanced settings, choose the service account that we just created. Then go to the Security tab and reference
the api-key secret. For the mount path, enter api. Our secret will be available at /api/api-key. Now, we just need to read the file with the secret and return it to the client. In the function, import the fs module. Then, in the body of the function, try to read the file. If it fails for some reason, log the error and return a 500 status code to the client. On success, return the secret.
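A minimal sketch of the function, assuming the mount path and secret name from above:

    const fs = require('fs');

    exports.readSecret = (req, res) => {
      // The secret is mounted as a file at <mount path>/<secret name>.
      fs.readFile('/api/api-key', 'utf8', (err, secret) => {
        if (err) {
          console.error(err);
          res.status(500).send('Unable to read secret');
          return;
        }
        res.status(200).send(secret);
      });
    };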
Don't forget to update the Entry point for the function; it should match the function that you export, in this case readSecret. When it's done, click Deploy. Since it's a public function, we can use curl
to invoke it. It should return the devops123 API key. You also have the option to deploy Cloud Functions from a source repository such as GitHub or Bitbucket. But first, you need to mirror your existing
repository to Google Cloud Source Repositories. To try it out, we need to create a GitHub
repository that will contain all our functions. Let's simply name it functions and make it private. You can add a README file. Clone this repository to your workstation:

    git clone git@github.com:antonputra/functions.git

Create a new folder for the function; maybe call it git-function. Now create an index.js file and define a simple function that returns Hello from Git!
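A sketch of index.js (the entry point name helloGit is my choice, not mandated):

    exports.helloGit = (req, res) => {
      res.status(200).send('Hello from Git!');
    };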
You need to create a package.json file, where you can optionally define dependencies for this function. Next, we need to mirror this repository to
Google. Go to Source Repositories and create one. Select Connect external repository. Choose the Google Cloud project you are working in and the git provider; in my case, it's GitHub. If you are doing this for the first time,
Google will ask you to authorize them to get read access to your repositories. Let me select the one that I just created
- functions. It may take a few minutes to initialize and
mirror your GitHub repository. If you change to the main branch, you should
see your function's source code. To deploy this function, you can use the GCP
console, or let's do it from the command line. Run gcloud functions deploy and specify a source that points to the git repository. The first time, you also need to specify the runtime, an HTTP trigger, and an entry point. In the source URL, replace devopsbyexample with your project id, then fill in the Cloud Source Repository id, the branch name, and the path to the function folder.
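Putting it together, the command might look like this; the project id, repo name, runtime, and entry point are placeholders to adapt:

    gcloud functions deploy git-function \
      --source https://source.developers.google.com/projects/<your-project-id>/repos/functions/moveable-aliases/main/paths/git-function \
      --runtime nodejs16 \
      --trigger-http \
      --entry-point helloGit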
Let's say you want to make a change in your function. Make the change, then commit and push to your
GitHub repository. It should immediately sync with the Cloud
repo. If it is out of sync, you can go to settings
and manually synchronize the repositories. To redeploy, just rerun the same command without the runtime and trigger flags. Continuous Integration and Deployment (CI/CD)
pipelines help to ensure that your functions work both locally and in a test environment
on Google Cloud. Let's reuse the git repository functions that
we created for the previous example. However, you can use any third-party CI/CD
services such as GitHub Actions, CircleCI, and even Jenkins. In this video, we will stick with Google-managed
services. First of all, we need to create a new build
trigger in Cloud Build. Go to Cloud Build, then Triggers, and click
Create Trigger. Name it as you want, maybe cloud-functions. Then you can choose when you want to run this
build. On push to a branch, tag, or maybe you want
to run some tests when a pull request is open. For the source, we can select a mirrored repo,
but I highly recommend connecting a new one. I had some issues integrating this cloud repo
with Cloud Build. Select GitHub and click Continue. Then you need to authenticate with GitHub to allow Google to access your repository. Then select the repo that you want to integrate
with Cloud Build, and click Connect. You can specify what branch name to use; in
my case, it's main. We're going to use the standard cloudbuild.yaml config file, so keep Autodetected. Leave the other parameters as is and click Create. If you go to the Dashboard, you'll see No Builds. Now, we need to grant some permissions to Cloud Build so it can execute our pipeline. Describe the GCP project to get the project
id and number. Allow the Cloud Build service account to act as the Cloud Functions Runtime service account. Then assign the Cloud Functions Developer role to the Cloud Build service account, which allows Cloud Build to deploy Cloud Functions.
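With placeholders for your project id and number, the setup looks roughly like this (gen-1 functions run as the App Engine default service account unless you choose another):

    gcloud projects describe <your-project-id>

    # Let Cloud Build act as the runtime service account.
    gcloud iam service-accounts add-iam-policy-binding \
      <your-project-id>@appspot.gserviceaccount.com \
      --member="serviceAccount:<project-number>@cloudbuild.gserviceaccount.com" \
      --role="roles/iam.serviceAccountUser"

    # Allow Cloud Build to deploy Cloud Functions.
    gcloud projects add-iam-policy-binding <your-project-id> \
      --member="serviceAccount:<project-number>@cloudbuild.gserviceaccount.com" \
      --role="roles/cloudfunctions.developer"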
Next, let's create a cloudbuild.yaml file and define the steps of our pipeline. The first step installs the Node.js dependencies. If you have any tests, we can run them next. Finally, deploy the function, with parameters similar to the gcloud CLI command we used before. Let's call our new function cicd-function.
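A sketch of cloudbuild.yaml, assuming the function lives in a cicd-function folder and exports helloCicd:

    steps:
      # Install Node.js dependencies (add a test step here if you have tests).
      - name: node:16
        entrypoint: npm
        args: ['install']
        dir: 'cicd-function'
      # Deploy the function.
      - name: gcr.io/google.com/cloudsdktool/cloud-sdk
        args:
          - gcloud
          - functions
          - deploy
          - cicd-function
          - --source=cicd-function
          - --runtime=nodejs16
          - --trigger-http
          - --entry-point=helloCicd
          - --region=us-central1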
Commit and push to the remote branch to trigger a build. It's possible that it will fail the first time. In that case, you need to find the issue in the build logs; it may ask you to enable the Cloud Resource Manager API. Once the build is finished, let's go to Cloud Functions and copy the
URL to trigger the function. By default, it will be a private function,
and we need to authenticate with a bearer token. If you make a change in your function and
push it to the remote, Cloud Build will redeploy the new function for you. Some of you may be familiar with Terraform. A lot of companies nowadays manage their infrastructure
as code. In this example, I'll show you how to deploy a Cloud Function using Terraform. Before we begin, you need to enable the Cloud Build API and the Cloud Functions API if you have never deployed Cloud Functions in your project. If you followed along, you don't need to do
it. For simplicity, I will put all the Terraform code in a single main.tf file. We can declare local variables so we can reuse them throughout the code: first a project id, and then a timestamp that we'll append to the name of the zip archive
with our function. To create resources in GCP, we need the Terraform google provider; specify the project id and a region. We're going to use the function that we created in the previous example. For Terraform to deploy that function, we need to create a zip archive and then upload it to a GCS bucket. You can use an existing bucket or create a new one using the google_storage_bucket resource, then upload the zip archive to the bucket with google_storage_bucket_object. To trigger a redeployment of the function when we change the source code, we also need to change the name of the object; that's what the timestamp is for. Finally, we get to the function itself. Give it a name and a runtime, and customize other parameters as needed. It's pretty much the same set of options that you
have when you deploy it from the command line. By default, it will be a private function; you can optionally make it public by allowing all users to invoke it. For convenience, we can output the function URL that we can use to trigger it.
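Putting it all together, main.tf might look roughly like this; the bucket name, source directory, runtime, and entry point are assumptions to adapt:

    locals {
      project_id = "<your-project-id>"
      timestamp  = formatdate("YYMMDDhhmmss", timestamp())
    }

    provider "google" {
      project = local.project_id
      region  = "us-central1"
    }

    # Zip the function's source code.
    data "archive_file" "source" {
      type        = "zip"
      source_dir  = "git-function"
      output_path = "/tmp/function-${local.timestamp}.zip"
    }

    resource "google_storage_bucket" "bucket" {
      name     = "${local.project_id}-functions"
      location = "US"
    }

    # The timestamp in the object name forces a redeploy on code changes.
    resource "google_storage_bucket_object" "zip" {
      name   = "source-${local.timestamp}.zip"
      bucket = google_storage_bucket.bucket.name
      source = data.archive_file.source.output_path
    }

    resource "google_cloudfunctions_function" "function" {
      name                  = "terraform-function"
      runtime               = "nodejs16"
      entry_point           = "helloGit"
      trigger_http          = true
      source_archive_bucket = google_storage_bucket.bucket.name
      source_archive_object = google_storage_bucket_object.zip.name
    }

    # Optional: make the function public.
    resource "google_cloudfunctions_function_iam_member" "invoker" {
      project        = local.project_id
      region         = "us-central1"
      cloud_function = google_cloudfunctions_function.function.name
      role           = "roles/cloudfunctions.invoker"
      member         = "allUsers"
    }

    output "function_url" {
      value = google_cloudfunctions_function.function.https_trigger_url
    }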
Keep in mind that when you don't specify a Terraform backend, it stores its state locally. This means you would have to commit the state file to git; a better approach is to use remote storage such as a GCS bucket. Let's run Terraform. First, you need to initialize it and download
all the plugins, including the google provider. Then you can run plan and apply to create the function. In a couple of minutes, it will deploy terraform-function. We can grab its URL and try to invoke it. You can also run functions locally with the Functions Framework. In your function's directory, install the Functions Framework library for your language. For Node.js, run the following command.
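    npm install @google-cloud/functions-framework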
Before running a function with the Functions Framework, you must first specify which function within your code should be run. The Node.js Functions Framework lets you specify the function's name and signature type as command-line arguments or environment variables. You can also override the default port of 8080. Now go to the terminal and run your function. It will be available at localhost:8080, and you can use curl to invoke it.
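For example, assuming your exported function is named helloWorld:

    npx functions-framework --target=helloWorld --port=8080
    curl localhost:8080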
Cloud Functions uses event-driven functions to handle events from your Cloud infrastructure. For example, Cloud Functions can be triggered
by messages published to Pub/Sub topics in the same Cloud project as the function. Before we deploy a Cloud Function, we need to create a Pub/Sub topic. You can do it from the UI, the gcloud CLI, or Terraform. Give it the name lesson-106. Next, go to the Cloud Functions section. Create a new function and call it pubsub-function. Instead of HTTP, select Cloud Pub/Sub as the trigger, and choose the lesson-106 topic that we created earlier. You have access to the same additional parameters: you can adjust memory, the timeout, update the
service account, and others. For this function, let's use Python. You have main.py to define your function and a requirements.txt to specify dependencies. We can leave the function as is; it will print the message from the Pub/Sub topic to the log.
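For reference, the generated function looks roughly like this:

    import base64

    def hello_pubsub(event, context):
        # Pub/Sub message data arrives base64-encoded.
        pubsub_message = base64.b64decode(event['data']).decode('utf-8')
        print(pubsub_message)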
Wait a couple of minutes until the function is ready, then go to Pub/Sub. Select the topic and publish a test message. Now, let's check if the function was triggered
by this message. You should be able to find the test message
in the log of the function. Cloud Functions can also respond to change
notifications emerging from Google Cloud Storage. These notifications can be configured to trigger
in response to various events inside a bucket—object creation, deletion, archiving and metadata
updates. We need a bucket. Create one and give it the name lesson-106. I'm going to skip all of the optional configuration since we just need a bucket to trigger a function. Next, create a function. Give it the name gs-function. Replace the HTTP trigger with Cloud Storage. You can select different types of events to execute your function on. I want to run it when a file is uploaded
to the bucket. Then you need to specify which bucket to watch. Let's go with Python again, though the language makes no difference here. This function simply prints the file name to the log.
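Roughly, the generated function is:

    def hello_gcs(event, context):
        # The event payload carries the Cloud Storage object metadata.
        print(f"Processing file: {event['name']}")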
I'm going to upload a thumbnail for this video to the bucket to check that the function is triggered. Now, if you go back to the function, you should
see a thumbnail-v1 message in the log. In the last example, we're going to use the
Cloud Function as a backend for API Gateway. This example covers a bare-minimum setup; in the next video, we'll take a deep dive into API Gateway. API Gateway requires that you enable the following
Google services: API Gateway API, Service Management API, and Service Control API. After you have enabled all of them, you can
create an API gateway. But first, let's create a backend function and two service accounts: the first will be backend-function and the second api-gateway. Give the function the name backend-function and select the backend-function service account. Leave all the other default parameters as is. Let's keep the default Node.js function as
well. We also need to allow the api-gateway service
account to invoke this function with the Cloud Functions Invoker role. Now go back and create the API Gateway. Let's name it my-gateway, and use the same for the id. Then we need to create an OpenAPI spec locally: specify the path hello and point it to backend-function.
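A minimal spec might look like this; the backend address is a placeholder for your backend-function trigger URL:

    swagger: '2.0'
    info:
      title: my-gateway
      version: 1.0.0
    schemes:
      - https
    produces:
      - application/json
    paths:
      /hello:
        get:
          summary: Invoke the backend function
          operationId: hello
          x-google-backend:
            address: https://us-central1-<your-project-id>.cloudfunctions.net/backend-function
          responses:
            '200':
              description: OK
              schema:
                type: string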
Under Upload API spec, select this config. After the API Gateway is created, we can use its URL and the hello path to invoke our function.