Stackdriver Logging - Google Cloud Logging Service
Check out some of the features of Stackdriver Log Library and learn to install it and integrate it with Google Cloud with this tutorial.
Running out of space for storing logs? Tired of taking backups? Having trouble with log rotation jobs and exporting log data? Unable to search for specific entries? If any of this sounds familiar, it's time to move to Google Cloud Logging, also known as Stackdriver Logging. In this article, I will give a brief overview of Stackdriver Logging with a few simple examples showing how to configure it, how to use it from your Python code, and how to integrate it with the standard Python logging library.
It is a logging service in the cloud that provides features such as centralized log storage, search and filtering, and automatic export to other Google Cloud services.
To use any of the cloud services, you need a service account which would be used by your application to make calls to Google APIs. The following section describes how to create a service account.
Service Account Creation
Go to https://console.cloud.google.com/. In the left menu, navigate to IAM & admin -> Service accounts.
If a service account does not exist already, you will see a link for creating one. Click the Create Service Account button to create one, and make sure the Logs Writer role is selected.
Also check the “Furnish a new private key” checkbox and choose JSON as the key type, which downloads the service account credentials file to your computer.
You are now done with the service account creation. Keep the credentials file safe in a secure place on your machine.
Setting up Stackdriver Logging for Python
Install the Google Cloud logging library by using the following command.
pip install --upgrade google-cloud-logging
Then set up the environment to use the created service account by exporting the GOOGLE_APPLICATION_CREDENTIALS environment variable, setting it to the path of your service key JSON file.
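For example, on Linux or macOS (the path below is a placeholder for wherever you saved the key file):

```shell
# Placeholder path - point this at the JSON key downloaded during service account creation
export GOOGLE_APPLICATION_CREDENTIALS="/path/to/your/service-account-key.json"
```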
Using the Logging Library
Google Cloud provides a client library for accessing Stackdriver. The following code snippet shows how to use it:
```python
# Imports the Google Cloud client library
from google.cloud import logging

# Instantiates a client - this is what we use to access Stackdriver Logging
logging_client = logging.Client()

# The name of the log to write to
log_name = 'user_app'

# Selects the log to write to
logger = logging_client.logger(log_name)

# The data to log
text = 'Hey, say Hello to every newbie!!'

try:
    # Writes the log entry
    logger.log_text(text)
    # Writes an entry with severity
    logger.log_text(text, severity='ERROR')
except Exception as e:
    # Handle the exception here
    pass
```
Now, run the script and your log messages are written to the Cloud logging service. To see the logs, go to the Cloud Console and navigate to Logging -> Logs.
Select “Global” in the first dropdown and then select the log name you chose in your application.
Integration with Python Standard Logging Library
As we saw in the above section, the logs are written to the Cloud logging service, but since the client library uses different syntax and methods from the standard Python logging library, migrating existing applications could become hard. But no worries! You can integrate the Cloud logging library with the Python root logger and keep using the standard library methods as they are.
Use the following code and have the cloud logger methods just like the way you use with python logger.
```python
# Imports the Google Cloud client library
import os
import google.auth
import google.cloud.logging

# Set the service account file path in the environment
os.environ["GOOGLE_APPLICATION_CREDENTIALS"] = "/path/to/your/service/account/json/key/file"
credentials, project = google.auth.default()

# Instantiates a client
logging_client = google.cloud.logging.Client(project="stackdriver-log-test", credentials=credentials)

# The setup helper attaches the Stackdriver handler to the Python root logger
logging_client.setup_logging()

# Import the Python standard logging library
import logging

try:
    # Use the standard library methods to write the log entries
    logging.info("Info message from py logger")
    logging.warning("Warning from py logger!")
    logging.error("Error from py logger!!")
    logging.critical("Critical msg from py logger. Urgent Action required!!!")
except Exception as e:
    # Handle the exception here
    pass
```
Now, you can see the messages logged through the Python root logger in the Cloud Console as well.
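Under the hood, setup_logging() simply attaches a Cloud Logging handler to Python's root logger. The mechanism can be sketched with the standard library alone; the CollectingHandler class here is hypothetical and just gathers records in memory, where the real handler would ship them to the Cloud Logging API:

```python
import logging

class CollectingHandler(logging.Handler):
    """Hypothetical stand-in for the Cloud Logging handler:
    collects formatted records instead of sending them to the API."""
    def __init__(self):
        super().__init__()
        self.records = []

    def emit(self, record):
        self.records.append(self.format(record))

# Attach the handler to the root logger, as setup_logging() does
handler = CollectingHandler()
handler.setFormatter(logging.Formatter("%(levelname)s:%(message)s"))
logging.getLogger().addHandler(handler)
logging.getLogger().setLevel(logging.INFO)

# Standard library calls now flow through the attached handler
logging.info("Info message from py logger")
logging.error("Error from py logger!!")
```

Once the handler is attached, every module that calls the standard logging functions feeds the same destination without any code changes.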
Filter the Logs
The Stackdriver web GUI provides a convenient search interface that lets you filter on text, date ranges, severity, and log names by writing SQL-like queries.
You can search using a keyword or multiple words, which works using an “OR” operator by default.
Say I want to search for messages that contain either “hey” or “hello”: just type the words separated by a space and hit Enter.
As a result, 4 messages were listed.
If you want to write advanced/custom filters, you can do that by clicking the right arrow and choosing “Convert to advanced filter.” For example, change the filter query to use the AND and NOT operators. Now you see 3 messages containing either “hey” or “hello” but not the word “world.”
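In the logging query language, such an advanced filter could look like the following (assuming the messages are plain-text payloads):

```
(textPayload:"hey" OR textPayload:"hello") AND NOT textPayload:"world"
```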
Export the Logs
Stackdriver Logging provides a feature called a sink, which automatically exports your log entries to other destinations such as BigQuery and Cloud Storage.
The step-by-step process for exporting specific log entries is as follows:
- Write a filter as shown in the above section.
- Click Create Export button on the top.
- Give a name to the exported data filter (called the sink name).
- Select Sink Service as BigQuery.
- Select the Sink Destination dataset where you want to export.
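For reference, a sink like this can also be created from the command line with the Cloud SDK; the sink name, project, and dataset below are hypothetical placeholders, and the command requires an authenticated gcloud setup:

```
gcloud logging sinks create my-user-actions-sink \
  bigquery.googleapis.com/projects/my-project/datasets/user_actions \
  --log-filter='textPayload:"action" AND textPayload:"user"'
```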
Let’s say I have some log entries as shown below.
Now, I’m going to apply a filter to export all the user actions (entries that contain both “action” and “user”).
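Assuming plain-text payloads, an advanced filter along these lines would match those entries:

```
textPayload:"action" AND textPayload:"user"
```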
Once you click the “Create Export” button, you will see a panel on the right side asking for the destination details of the exported data.
You are now done with the setup. From here on, whenever Stackdriver receives a log entry matching the specified filter, it will export it to the BigQuery dataset while also keeping a copy itself.
This was just a brief look at some of the powerful features provided by the Google Cloud logging service. With the advent of distributed systems, it makes a lot of sense to consider it for application logging. Stackdriver has other features worth familiarizing yourself with; one good resource is the Google documentation at https://cloud.google.com/logging/
Opinions expressed by DZone contributors are their own.