Logs not flowing from a Function App deployed in an ASE to its own Application Insights

Problem description

We are running into an issue where logs are not flowing from our Azure Function App to Application Insights. The Function App is deployed in an App Service Environment (ASE), and the APPLICATIONINSIGHTS_CONNECTION_STRING app setting is configured, but logs do not show up in Application Insights as expected.
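As a first sanity check, it can help to confirm that the worker process actually sees the connection string and that it parses into the expected parts (this is a sketch to run inside the function or a console session on the app; the sample key and endpoint below are illustrative, not real values):

```python
import os


def parse_ai_connection_string(raw: str) -> dict:
    """Split an Application Insights connection string into its key/value parts."""
    return dict(part.split("=", 1) for part in raw.split(";") if "=" in part)


# Empty string if the setting never reached the worker process.
conn = os.environ.get("APPLICATIONINSIGHTS_CONNECTION_STRING", "")
parts = parse_ai_connection_string(conn)
print("InstrumentationKey present:", "InstrumentationKey" in parts)
print("IngestionEndpoint:", parts.get("IngestionEndpoint"))
```

If the setting is missing or mangled here, the problem is configuration rather than networking.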

We also tried updating host.json with the logging settings below, without success.



{
  "version": "2.0",
  "logging": {
    "logLevel": {
      "default": "Information",
      "Host": "Information",
      "Function": "Information",
      "Host.Aggregator": "Information"
    },
    "applicationInsights": {
      "samplingSettings": {
        "isEnabled": true,
        "excludedTypes": "Request"
      },
      "enableDependencyTracking": false
    }
  },
  "extensionBundle": {
    "id": "Microsoft.Azure.Functions.ExtensionBundle",
    "version": "[3.*, 4.0.0)"
  }
}

The ASE VNet is configured with the default NSG outbound rules.
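Because the NSG governs outbound traffic from the ASE subnet, it is also worth confirming from inside the ASE that the ingestion endpoint named in the connection string is reachable on port 443. A minimal connectivity probe (the endpoint URL below is only an example of the shape; substitute the `IngestionEndpoint` value from your own connection string):

```python
import socket
from urllib.parse import urlparse


def can_reach(url: str, port: int = 443, timeout: float = 5.0) -> bool:
    """Attempt a plain TCP connection to the host behind `url`."""
    host = urlparse(url).hostname
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False


# Example endpoint shape; replace with your region's IngestionEndpoint.
print(can_reach("https://eastus-8.in.applicationinsights.azure.com/"))
```

If this returns False from the Function App (for example via a Kudu console), the NSG or a route table is blocking the telemetry path, even though the app itself runs fine.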

Function code - the same code works in our other (non-ASE) environments, where we can see the logs under Invocations etc.:

import logging
import os

import aiohttp
import arrow
import azure.functions as func
from shared.event_hub import EventHubProducer

VERSION = "1.3.4"
EVENT_HUB_SEND_SLEEP = 1  # In seconds
HOURS_OF_HISTORY = 2

logging.getLogger("azure.core.pipeline.policies.http_logging_policy").setLevel(logging.WARNING)
logging.getLogger("azure.eventhub").setLevel(logging.WARNING)

data_url = (DATA_URL)
device_url = (DEVICE_URL)


async def get_data() -> tuple[dict, dict]:
    """Download data from BCC API"""
    async with aiohttp.ClientSession(timeout=aiohttp.ClientTimeout(connect=10)) as session:
        async with session.get(data_url) as resp:
            result = await resp.json()
            logging.info(f"DATA: {result}")

        async with session.get(device_url) as resp:
            devices_result = await resp.json()
            logging.info(f"DEVICES RAW: {devices_result}")

        return result["records"], devices_result["records"]


def create_parsed_records(sensor: str, data_records: list[dict], device_map: dict[str, str]) -> dict | None:
    """Create parsed records for the specified device/sensor"""
    sensor_data = {item["fields"]["measured"]: item["fields"][sensor] for item in data_records}
    # logging.info(f"Sensor data: {sensor} {sensor_data}")

    metric = "precipitation" if device_map[sensor] == "Rainfall" else "level_mahd"
    # logging.info(f"METRIC: {metric}")

    readings = []

    for key, value in sensor_data.items():
        if value == "-":
            continue

        reading = {"Key": metric, "Value": {}}

        reading["Value"]["timestamp"] = arrow.get(key).int_timestamp * 1000
        reading["Value"]["value"] = float(value)
        reading["Value"]["timespan"] = 5 * 60 if metric == "precipitation" else None

        readings.append(reading)

    if readings != []:
        parsed_record = {
            "key": f"bcc-{sensor.lower()}-dev",
            "docType": "ParsedRecord",
            "deviceId": f"bcc-{sensor.lower()}-dev",
            "readings": readings,
            "traceId": None,
            "timestamp": max([item["Value"]["timestamp"] for item in readings]),
        }
        parsed_record["id"] = str(parsed_record["timestamp"])
        return parsed_record


async def main(mytimer: func.TimerRequest) -> None:
    try:
        # Get one big dump of data and device definitions
        data_records, device_records = await get_data()
        logging.info(f"Retrieved {len(device_records)} devices and {len(data_records)} historical 5-minute rows")

        # Create a map of sensor id to sensor type (Rainfall vs Stream Height)
        device_map = {item["fields"]["sensor_id"].lower(): item["fields"]["sensor_type"] for item in device_records}

        eventhub_client = EventHubProducer(
            fully_qualified_namespace=os.environ["EVENTHUB_PARSED_FQDN"], eventhub_name=os.environ["EVENTHUB_NAME"]
        )

        # Fetch a list of available sensors from the header of the last entry
        # and then iterate for each sensor through all records, only picking data from that sensor in each iteration
        # Send to event hub each time a parsed record is constructed for a sensor
        for sensor in list(data_records[0]["fields"].keys()):
            if not sensor.startswith("e"):
                continue

            parsed_record = create_parsed_records(sensor, data_records, device_map)

            if parsed_record:
                await eventhub_client.send_events([parsed_record], sleep=EVENT_HUB_SEND_SLEEP)

    except Exception as e:
        logging.exception(e)
    finally:
        # Only close the Event Hub client if it was actually created;
        # get_data() may raise before the client is constructed.
        if "eventhub_client" in locals():
            await eventhub_client.aclose()
Tags: azure function, logging, monitoring, appinsights
1 Answer

Deploy the function to a function app on a Container Apps environment using the following command:

az functionapp create --name <APP_NAME> --storage-account <STORAGE_NAME> --environment <Containerapp_Environment> --workload-profile-name "Consumption" --resource-group <resource_group_name> --functions-version 4 --runtime dotnet-isolated --image <LOGIN_SERVER>/<Image_name>:<tag> --assign-identity

I deployed a sample function to an Azure function app created with a Container Apps environment:

Dockerfile:

FROM mcr.microsoft.com/azure-functions/python:4-python3.11

ENV AzureWebJobsScriptRoot=/home/site/wwwroot \
    AzureFunctionsJobHost__Logging__Console__IsEnabled=true

COPY requirements.txt /
RUN pip install -r /requirements.txt

COPY . /home/site/wwwroot

I was able to run the function as expected:

host.json:

{
  "version": "2.0",
  "logging": {
    "applicationInsights": {
      "samplingSettings": {
        "isEnabled": true,
        "excludedTypes": "Request"
      },
      "enableLiveMetricsFilters": true
    }
  }
}

Application Insights logs:


© www.soinside.com 2019 - 2025. All rights reserved.