AWS Forwarder v5 #1010

@@ -0,0 +1,120 @@
# Datadog Lambda Forwarder Changelog

## v5.0.0 - BREAKING CHANGES

### Overview
Version 5.0.0 of the Datadog Lambda Forwarder introduces several breaking changes that remove deprecated features and improve log filtering behavior. This release also introduces a new way to enrich logs with tags that reduces AWS Lambda-related costs (S3, KMS, and Lambda).

### New Features

#### 1. Backend Storage Tag Enrichment

**Added:**

- New `DD_ENRICH_S3_TAGS` / `DdEnrichS3Tags` parameter (default: `true`)
- New `DD_ENRICH_CLOUDWATCH_TAGS` / `DdEnrichCloudwatchTags` parameter (default: `true`)
- These instruct the Datadog backend to automatically enrich logs with resource tags **after ingestion**
- New CloudWatch tags can appear on logs; check your Datadog log index configuration to ensure a smooth transition.

**Benefits:**

- **Reduces forwarder cost** and execution time
- Provides the same tag enrichment as `DdFetchS3Tags` and `DdFetchLogGroupTags`
- Requires [Resource Collection](https://docs.datadoghq.com/integrations/amazon-web-services/#resource-collection) to be enabled in your AWS integration

**Deprecation Notice:**

- `DdFetchS3Tags` is now marked as **DEPRECATED** in favor of `DdEnrichS3Tags`
- `DdFetchLogGroupTags` is now marked as **DEPRECATED** in favor of `DdEnrichCloudwatchTags`
- `DD_FETCH_S3_TAGS` now defaults to `false` (previously `true`)

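To make the relationship between the new and deprecated flags concrete, here is a minimal sketch assuming the forwarder reads them as boolean environment variables; the `parse_bool` helper is hypothetical, and only the variable names and defaults come from this changelog:

```python
import os


def parse_bool(name, default):
    # Hypothetical helper: values are read as case-insensitive "true"/"false" strings.
    return os.environ.get(name, str(default)).strip().lower() == "true"


# New in v5: backend enrichment flags, both defaulting to true.
enrich_s3_tags = parse_bool("DD_ENRICH_S3_TAGS", True)
enrich_cloudwatch_tags = parse_bool("DD_ENRICH_CLOUDWATCH_TAGS", True)

# Deprecated: DD_FETCH_S3_TAGS now defaults to false, so the forwarder only
# makes the extra AWS API calls for tag fetching if it is explicitly re-enabled.
fetch_s3_tags = parse_bool("DD_FETCH_S3_TAGS", False)
```

With the defaults, tag enrichment happens in the Datadog backend after ingestion, so the forwarder itself does less work per invocation.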

---

### Breaking Changes

#### 1. Changed Regex Matching Behavior for Log Filtering

**What Changed:**

- `IncludeAtMatch` / `INCLUDE_AT_MATCH` and `ExcludeAtMatch` / `EXCLUDE_AT_MATCH` regex patterns now match **only against the log message** itself
- Previously, these patterns matched against the **entire JSON-formatted log**

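As an illustration of the change, the sketch below contrasts the two behaviors; the event shape and the `message` field are assumptions made for the example, not the forwarder's internal representation:

```python
import json
import re

# Hypothetical log event for illustration only.
event = {
    "message": "payment accepted",
    "ddtags": "env:prod,service:checkout",
    "host": "i-0123456789abcdef0",
}

pattern = re.compile("env:prod")

# Before v5: the pattern was applied to the whole JSON-formatted log,
# so a match on tags or metadata was enough to keep (or drop) the event.
matched_v4 = bool(pattern.search(json.dumps(event)))  # True

# From v5: the pattern is applied to the log message only,
# so patterns that relied on tags or metadata no longer match.
matched_v5 = bool(pattern.search(event["message"]))  # False
```

Patterns that only referenced the message text keep working; patterns that relied on tags, host, or other metadata must be reviewed and updated, as the migration note below states.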

**Migration Required:**

- **Review and update your filtering regex patterns**

Suggested change: "**Review and update your filtering regex patterns**" → "**Review and update filtering regex patterns**"
TCP (transport) doesn't relate to HTTP/HTTPS (application); maybe we should also clarify that logs are sent to a different intake as well?
It's a different intake than the direct TCP intake, yes, but it was an internal configuration depending on `DD_SITE` and it was not configurable.
nit: HTTP intake (endpoint) instead of transport
Updated to HTTP Protocol
Suggested change: "2. **Verify you're not using the deprecated PrivateLink variable:**" → "2. **Verify that deprecated PrivateLink variable is not used:**"
@@ -0,0 +1,48 @@
# Unless explicitly stated otherwise all files in this repository are licensed
# under the Apache License Version 2.0.
# This product includes software developed at Datadog (https://www.datadoghq.com/).
# Copyright 2021 Datadog, Inc.

import logging
import os
import re

from logs.exceptions import ScrubbingException
from logs.helpers import compileRegex

logger = logging.getLogger()
logger.setLevel(logging.getLevelName(os.environ.get("DD_LOG_LEVEL", "INFO").upper()))


# Keeps or drops a log depending on the optional include/exclude patterns
# (INCLUDE_AT_MATCH / EXCLUDE_AT_MATCH).
class DatadogMatcher(object):
    def __init__(self, include_pattern=None, exclude_pattern=None):
        self._include_regex = None
        self._exclude_regex = None

        if include_pattern is not None:
            logger.debug(f"Applying include pattern: {include_pattern}")
            self._include_regex = compileRegex("INCLUDE_AT_MATCH", include_pattern)

        if exclude_pattern is not None:
            logger.debug(f"Applying exclude pattern: {exclude_pattern}")
            self._exclude_regex = compileRegex("EXCLUDE_AT_MATCH", exclude_pattern)

    def match(self, log):
        # Returns True when the log should be forwarded, False when it should be dropped.
        try:
            if self._exclude_regex is not None and re.search(
                self._exclude_regex, str(log)
            ):
                logger.debug("Exclude pattern matched, excluding log event")
                return False

            if self._include_regex is not None and not re.search(
                self._include_regex, str(log)
            ):
                logger.debug("Include pattern did not match, excluding log event")
                return False

            return True

        except ScrubbingException:
            raise Exception("could not filter the payload")
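
For context, a hedged usage sketch of this class follows; the environment-variable lookups mirror the `INCLUDE_AT_MATCH` / `EXCLUDE_AT_MATCH` settings described in the changelog above, and the sample messages are purely illustrative (this assumes the class and its imports are available):

```python
import os

# Illustrative only: build a matcher from the forwarder's filtering settings.
matcher = DatadogMatcher(
    include_pattern=os.environ.get("INCLUDE_AT_MATCH"),
    exclude_pattern=os.environ.get("EXCLUDE_AT_MATCH"),
)

# With v5 semantics, the patterns are applied to the log message itself
# rather than to the whole JSON-formatted event.
messages = ["payment accepted", "health check OK"]
forwarded = [m for m in messages if matcher.match(m)]
```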

nit: prefer to remove "you" and "yours" and just use abstract phrasing, i.e. "a new way to enrich logs"