I've been asked to create a data collection rule with terraform that includes a data source for a custom table, and it needs a KQL query (transformKql) to format the incoming data correctly. The short version is that I don't know how to express the KQL query in code. Here is the KQL query, which works when entered directly into the Azure portal:
source | extend d=todynamic(RawData) | extend TimeCreated = extract(@"\d*[-]\d*[-]\d*\s\d*[:]\d*[:]\d*[.]\d*", 0, RawData) | extend AppPriority = tostring(split(d,"-",3)) | extend ReqId = tostring(split(d,"-",4)) | extend Prog = tostring(split(d,"-",5)) | extend Msg = tostring(split(d,"-",6))
And here is the same string with escape characters, which is how it appears in the tf file:
source | extend d=todynamic(RawData) | extend TimeCreated = extract(@\"\\d*[-]\\d*[-]\\d*\\s\\d*[:]\\d*[:]\\d*[.]\\d*\",0,RawData) | extend AppPriority = tostring(split(d,\"-\",3)) | extend ReqId = tostring(split(d,\"-\",4)) | extend Prog = tostring(split(d,\"-\",5)) | extend Msg = tostring(split(d,\"-\",6))
Any help/advice would be appreciated.
I keep getting the following error:
azurerm_monitor_data_collection_rule.dcr_name: Creating...
╷
│ Error: creating Data Collection Rule (Subscription: "11111111-2222-3333-4444-555555555555"
│ Resource Group Name: "rg_name"
│ Data Collection Rule Name: "dcr-name"): datacollectionrules.DataCollectionRulesClient#Create: Failure responding to request: StatusCode=400 -- Original Error: autorest/azure: Service returned an error. Status=400 Code="InvalidPayload" Message="Data collection rule is invalid" Details=[{"code":"InvalidTransformQuery","message":"Error occurred while compiling query in query: SemanticError:0x00000006 at 1:30 : Undefined symbol: RawData","target":"properties.dataFlows[0]"}]
I have also tried the heredoc format, but without success.
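For reference, this is a sketch of the heredoc form I tried (the indentation and the `EOT` marker are my own choices; heredoc strings in Terraform don't interpret backslash sequences, so the regex doesn't need double escaping there):

```
transform_kql = <<-EOT
  source
  | extend d = todynamic(RawData)
  | extend TimeCreated = extract(@"\d*[-]\d*[-]\d*\s\d*[:]\d*[:]\d*[.]\d*", 0, RawData)
  | extend AppPriority = tostring(split(d, "-", 3))
  | extend ReqId = tostring(split(d, "-", 4))
  | extend Prog = tostring(split(d, "-", 5))
  | extend Msg = tostring(split(d, "-", 6))
EOT
```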
After trying the KQL query in different scenarios, I found that the problem lies in the todynamic() function used in the source query. If I remove that particular function, the rule works in all of the scenarios shown below.

If RawData is not proper JSON or a compatible dynamic type, converting it to dynamic can cause conflicts. So check which data type RawData actually holds, and then convert from that type to the one you need.
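As a sketch, assuming RawData is a plain string (as it is declared in the custom stream below), the same fields can be derived from the string directly, indexing into the split array instead of going through todynamic():

```
source
| extend TimeCreated = extract(@"\d*[-]\d*[-]\d*\s\d*[:]\d*[:]\d*[.]\d*", 0, RawData)
| extend AppPriority = tostring(split(RawData, "-")[3])
| extend ReqId = tostring(split(RawData, "-")[4])
| extend Prog = tostring(split(RawData, "-")[5])
| extend Msg = tostring(split(RawData, "-")[6])
```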
Run

source | project gettype(RawData)

in the Log Analytics workspace to retrieve the type of the content, then apply the corresponding type conversion in terraform. Using the azurerm_monitor_data_collection_rule resource and the sample data collection rule from the MS docs, I have tried the following query scenarios, which work as expected with the respective type conversions.
1.
resource "azurerm_monitor_data_collection_rule" "example" {
  name                        = "jahflampul23"
  resource_group_name         = azurerm_resource_group.example.name
  location                    = azurerm_resource_group.example.location
  data_collection_endpoint_id = azurerm_monitor_data_collection_endpoint.example.id

  destinations {
    log_analytics {
      workspace_resource_id = azurerm_log_analytics_workspace.example.id
      name                  = "example-destination-log"
    }

    event_hub {
      event_hub_id = azurerm_eventhub.example.id
      name         = "example-destination-eventhub"
    }

    storage_blob {
      storage_account_id = azurerm_storage_account.example.id
      container_name     = azurerm_storage_container.example.name
      name               = "example-destination-storage"
    }

    azure_monitor_metrics {
      name = "example-destination-metrics"
    }
  }

  data_flow {
    streams      = ["Microsoft-InsightsMetrics"]
    destinations = ["example-destination-metrics"]
  }

  data_flow {
    streams      = ["Microsoft-InsightsMetrics", "Microsoft-Syslog", "Microsoft-Perf"]
    destinations = ["example-destination-log"]
  }

  data_flow {
    streams       = ["Custom-MyTableRawData"]
    destinations  = ["example-destination-log"]
    output_stream = "Microsoft-Syslog"
    transform_kql = "source | extend data = RawData"
  }

  data_sources {
    syslog {
      facility_names = ["*"]
      log_levels     = ["*"]
      name           = "example-datasource-syslog"
      streams        = ["Microsoft-Syslog"]
    }

    iis_log {
      streams         = ["Microsoft-W3CIISLog"]
      name            = "example-datasource-iis"
      log_directories = ["C:\\Logs\\W3SVC1"]
    }

    log_file {
      name          = "example-datasource-logfile"
      format        = "text"
      streams       = ["Custom-MyTableRawData"]
      file_patterns = ["C:\\JavaLogs\\*.log"]
      settings {
        text {
          record_start_timestamp_format = "ISO 8601"
        }
      }
    }

    performance_counter {
      streams                       = ["Microsoft-Perf", "Microsoft-InsightsMetrics"]
      sampling_frequency_in_seconds = 60
      counter_specifiers            = ["Processor(*)\\% Processor Time"]
      name                          = "example-datasource-perfcounter"
    }

    windows_event_log {
      streams        = ["Microsoft-WindowsEvent"]
      x_path_queries = ["*![System/Level=1]"]
      name           = "example-datasource-wineventlog"
    }

    extension {
      streams            = ["Microsoft-WindowsEvent"]
      input_data_sources = ["example-datasource-wineventlog"]
      extension_name     = "example-extension-name"
      extension_json = jsonencode({
        a = 1
        b = "hello"
      })
      name = "example-datasource-extension"
    }
  }

  stream_declaration {
    stream_name = "Custom-MyTableRawData"
    column {
      name = "TimeGenerated"
      type = "datetime"
    }
    column {
      name = "RawData"
      type = "string"
    }
    column {
      name = "Properties"
      type = "dynamic"
    }
  }

  depends_on = [
    azurerm_log_analytics_solution.example
  ]
}
2.
data_flow {
  streams       = ["Custom-MyTableRawData"]
  destinations  = ["example-destination-log"]
  output_stream = "Microsoft-Syslog"
  transform_kql = "source | extend data = tostring(AdditionalContext) | project TimeGenerated = Time, computer"
}