Cloud Logging Pricing For Cloud Admins: How To Approach It & Save Cost

Flexera's State of the Cloud Report 2022 pointed out that a significant share of cloud spending is wasted, a problem that grows more critical as cloud costs continue to rise. In the current macroeconomic conditions, companies are focused on finding ways to reduce spending. To do that effectively, we first need to understand the pricing model; then we can tackle the challenges of cost monitoring, optimization, and forecasting. One area that often gets overlooked in budgeting is observability: logging, monitoring, and tracing. This can represent a significant cost, especially if it's not optimized. Let's explore how to understand and optimize our most voluminous data source, logs, within Google Cloud.

Cloud Logging is a fully managed, real-time log solution that allows you to ingest, route, store, search, and analyze your logs to easily troubleshoot incidents using your log data. It can collect data from on-premises environments, Google Cloud, and other clouds with open source agents that support more than 150 services. Unlike traditional licensing models or self-hosted logging solutions, the Cloud Logging pricing model is simple and based on actual usage. At a high level, Cloud Logging pricing is based on ingestion, which includes storing the logs in a log bucket for a default period of 30 days.

Let’s explore the various components of Cloud Logging and address a few commonly asked questions about pricing.

Cloud Logging – Components & Purpose

To understand pricing better and be able to predict future costs, we need to understand the high-level components of Cloud Logging and where billing occurs in our system. There are three important components within Cloud Logging: the Cloud Logging API, the Log Router, and log buckets (Log Storage).

The table below outlines the high-level components, their purpose, and pricing information for Cloud Logging.

  Component                   Purpose                                          Pricing
  Cloud Logging API           Receives log entries from agents and services    No charge
  Log Router                  Routes and filters logs according to your sinks  No charge
  Log buckets (Log Storage)   Store ingested logs                              Ingestion is billed (Required bucket is free)

As indicated above, today billing in Cloud Logging occurs only for a log that is routed and ingested into a log bucket. "Ingestion" in Cloud Logging is the process of saving log data into a log bucket, not simply processing it in the Log Router. Our pricing includes 30 days of storage for all logs ingested. There are three options for log buckets:

  • Required
  • Default
  • User-defined or Custom.

Only Default and User-defined buckets are billed.

Today, our logging pricing is based on the volume of logs ingested in a chargeable log bucket—default or user-defined. All charges in Cloud Logging occur at the log bucket and all log types incur the same cost. Logs dropped using sink filters or exclusion filters are not charged by Cloud Logging, even if these logs are routed to a destination outside of Cloud Logging.

Now, we’ll address frequently asked questions about the Cloud Logging pricing model.

What Cloud Logging charges will I see on my bill?

There are two types of charges your logs can potentially incur:

  1. An ingestion charge of $0.50/GiB, which includes the default 30 days of storage. Note that the first 50 GiB ingested per project each month fall under the free tier quota. You are charged based on the volume of logs ingested into the Default and user-defined log buckets.
  2. Logs stored beyond 30 days incur a retention charge of $0.01/GiB/month for non-Required buckets. Note that this pricing is not currently enforced; charging will begin in early 2023.

For the latest pricing, see the Cloud Logging pricing page.
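Putting the two charges together, here is a minimal sketch of a monthly bill estimator. It assumes the list prices quoted above ($0.50/GiB ingestion after a 50 GiB monthly free allotment, and $0.01/GiB/month for data retained beyond the included 30 days); the function name and structure are illustrative, not an official tool:

```python
# Illustrative estimator based on the list prices quoted above.
INGESTION_PRICE_PER_GIB = 0.50        # after the free allotment
FREE_TIER_GIB = 50.0                  # free ingestion per project per month
RETENTION_PRICE_PER_GIB_MONTH = 0.01  # for data kept beyond the included 30 days

def estimate_monthly_cost(ingested_gib: float,
                          retained_gib_beyond_30_days: float = 0.0) -> float:
    """Estimate a project's monthly Cloud Logging cost in USD (sketch only)."""
    billable_ingest = max(0.0, ingested_gib - FREE_TIER_GIB)
    ingestion_cost = billable_ingest * INGESTION_PRICE_PER_GIB
    retention_cost = retained_gib_beyond_30_days * RETENTION_PRICE_PER_GIB_MONTH
    return round(ingestion_cost + retention_cost, 2)

# 200 GiB ingested this month, 500 GiB kept past the included 30 days:
print(estimate_monthly_cost(200, 500))  # (200-50)*0.50 + 500*0.01 = 80.0
```

Note that a project ingesting 50 GiB or less with default retention would incur no charge at all under this model.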

How can I reduce my bill?

Because Cloud Logging pricing is based on actual usage, you can reduce your bill by adjusting the ingestion volume or the retention period.

  1. Reduce the volume of logs ingested per log bucket by identifying and keeping (ingesting) only valuable log data for analysis.
  2. If you do not need to keep data beyond the included 30 days, reduce the retention period. Because the first 30 days of retention are included with ingestion, reducing retention to less than 30 days will have no impact on your bill.
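To see why trimming retention below 30 days yields no savings, here is a simplified model of the retention charge. It assumes the charge accrues at $0.01 per GiB per 30-day month past the included window; the proration is an illustrative assumption, not the exact billing formula:

```python
def retention_charge(gib: float, retention_days: int,
                     price_per_gib_month: float = 0.01) -> float:
    """Monthly retention charge: only days beyond the included 30 are billed.

    Simplified model: charges are prorated in 30-day months past the
    included window (an assumption for illustration).
    """
    extra_days = max(0, retention_days - 30)
    return round(gib * price_per_gib_month * (extra_days / 30), 2)

print(retention_charge(100, 30))  # 0.0 -- within the included 30 days
print(retention_charge(100, 15))  # 0.0 -- cutting below 30 days saves nothing
print(retention_charge(100, 90))  # 100 * 0.01 * 2 = 2.0
```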

Does Cloud Logging charge based on the number of queries or searches, whether from the Cloud Logging UI or client SDKs/APIs?

No, Cloud Logging does not charge for the number of queries, searches, logs read from disks during queries, or varied log types. There is a quota limit for querying logs, though, so for integrations with SIEMs or other logging tools, it’s a best practice to set up a log sink via Pub/Sub to push the logs to the downstream system.

Can I incur multiple ingestion charges?

It is possible to be charged for ingesting the same log entry into Cloud Logging log buckets multiple times. For example, if your sinks route a log entry to two log buckets, you will pay ingestion costs at two buckets. You may choose to do this to have independent retention of logs or to keep copies of logs in multiple regions for compliance reasons.
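A quick sketch of the math (ignoring the free tier for simplicity; the function is hypothetical):

```python
INGESTION_PRICE_PER_GIB = 0.50  # list ingestion price quoted above

def multi_bucket_ingestion_cost(gib: float, bucket_count: int) -> float:
    """Each chargeable bucket that receives a copy bills ingestion separately.

    Free-tier allotment is ignored here for simplicity.
    """
    return round(gib * bucket_count * INGESTION_PRICE_PER_GIB, 2)

# 10 GiB routed by sinks to two log buckets (e.g., two regions for compliance):
print(multi_bucket_ingestion_cost(10, 2))  # 10 GiB * 2 buckets * $0.50 = 10.0
```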

Are there different costs for hot and cold storage?

No, there are no differences between hot and cold storage. The beauty of Cloud Logging is that all logs are accessible throughout their lifespan. Cloud Logging is designed to scale easily and efficiently, which makes logs accessible for troubleshooting, investigating and compliance reasons whether they are seconds or years old.

How much does it cost to route logs to other destinations?

Today, Cloud Logging does not charge for centrally collecting and routing logs to other destinations such as Cloud Storage, BigQuery, and Pub/Sub. Usage rates for those destination services still apply.

Do Logs have a generation fee?

For network telemetry logs such as VPC Flow Logs, Firewall Rules Logging, and Cloud NAT logs, you might incur an additional log generation charge if the logs are not stored in Cloud Logging. If you store these logs in Cloud Logging, the network log generation charges are waived and only Cloud Logging charges apply.

How do I understand my ingestion volume in Cloud Billing?

To determine the cost per Project:

  1. In the Cloud Console, go to Billing -> select the billing account -> Reports (left pane).
  2. On the right side, under Filters -> Services, select "Cloud Logging."
  3. To drill down into the cost incurred by each log bucket, select the project in the top bar, then go to Logging -> Logs Storage in the left pane. You should now see the log volume per bucket.

Putting it all together

Now that we understand pricing for Cloud Logging, we can optimize our usage. Here are four best practices:

  • Recommendation #1: Use the Log Router to centralize your collection and get a 360-degree view of your logs, then use exclusion filters to drop noisy logs so that only valuable logs reach the log bucket. Logs dropped using sink filters or exclusion filters are not charged by Cloud Logging, even if they are routed to a destination outside of Cloud Logging.
  • Recommendation #2: Admin Activity audit logs are captured by default for all Google Cloud services at no additional cost. Leverage the audit logs in the Required bucket by identifying use cases for your organization and configuring log-based alerts on them.
  • Recommendation #3: Logs can be stored cost effectively for up to 10 years and easily accessed via Cloud Logging. Cloud Logging will begin charging customers for long term log retention starting Jan 2023. Between now and Jan 2023, determine the required lifespan of a log and set the appropriate retention period for each log bucket.
  • Recommendation #4: If you are a new customer, estimate your bills. This is a great way to compare costs with your current Cloud Logging solution. If you are an existing customer, create a budget and set up alerts on your Cloud Logging bills.

In addition to analyzing log volumes by bucket, you may want to analyze sources, projects, and more. Metrics Explorer in Cloud Monitoring can also be used to identify costs; we will discuss this in the next blog. For more information, join us in our discussion forum. As always, we welcome your feedback. Interested in using Cloud Logging to save costs in your organization? Contact us here. We are also hosting a webinar on how you can leverage Log Analytics, powered by BigQuery, in Cloud Logging at no additional cost. Register here.

By: Afrina M (Product Manager, Google Cloud)
Source: Google Cloud Blog
