I have an ADF pipeline that copies JSON files from Blob Storage to an on-premises file server, and the pipeline is triggered by listening for BlobCreated events. The blobs are text files containing the list of files the pipeline should copy.
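For context, each list blob contains one relative path per line (relative to the source dataset's folder), which is the convention fileListPath expects. The file names below are made up for illustration:

processed/20241218105953_orders.json
processed/20241218105953_customers.json
processed/20241218105953_invoices.json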
When I start the pipeline by pressing Debug and supply the text file information manually, it works fine.
However, if I start it from Trigger -> Trigger now, or test a real event by uploading a txt file to blob storage, it fails with the error:
The property 'userid' in the payload cannot be null or empty.
I thought maybe I hadn't published my changes (because of that warning), but I ran our DevOps pipeline and it still didn't work afterwards. I suspect there is a problem somewhere in that area, but I don't know where it could be or how to start debugging it... I would try the Publish button, but it is disabled; maybe I have to ask an authorized person to enable it for testing.
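My understanding is that Debug runs use the version open in the authoring UI, while trigger runs use the published factory, which in our setup is deployed by the DevOps pipeline from the exported ARM template. If the default parameterization template is in effect, fields like userId get lifted into ARM template parameters, so the deployed linked service would look roughly like this sketch (the parameter name here is hypothetical; the generated names are in ARMTemplateForFactory.json):

{
  "type": "Microsoft.DataFactory/factories/linkedServices",
  "name": "[concat(parameters('factoryName'), '/LS_DFS_REDACTED')]",
  "properties": {
    "type": "FileServer",
    "typeProperties": {
      "host": "\\\\ad.redacted.com\\foo\\bar\\dev\\",
      "userId": "[parameters('LS_DFS_REDACTED_properties_typeProperties_userId')]"
    }
  }
}

If the override for that parameter is missing or empty at deployment time, the published linked service ends up with an empty userId, which would explain why only trigger runs fail.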
Where is my configuration mistake? I'm confused.
Linked service JSON:
{
"name": "LS_DFS_REDACTED",
"properties": {
"description": "some useful text here.",
"annotations": [
"KIPA Integration"
],
"type": "FileServer",
"typeProperties": {
"host": "\\\\ad.redacted.com\\foo\\bar\\dev\\",
"userId": "ASDF\\usr-dev-rw",
"password": {
"type": "AzureKeyVaultSecret",
"store": {
"referenceName": "LS_kv",
"type": "LinkedServiceReference"
},
"secretName": "usr-dev-rw"
}
},
"connectVia": {
"referenceName": "IR-Integration-DataFactory-df",
"type": "IntegrationRuntimeReference"
}
}
}
Trigger JSON:
{
"name": "Redacted - Processed filelist created",
"properties": {
"description": "Triggered when a /kipa/adf/*_processedJsonFiles.txt blob is created, which contains list of files for pipeline to copy.",
"annotations": [
"KIPA Integration"
],
"runtimeState": "Started",
"pipelines": [
{
"pipelineReference": {
"referenceName": "STA-DFS_CopyProcessedJson",
"type": "PipelineReference"
},
"parameters": {
"triggerFolderPath": "@trigger().outputs.body.folderPath",
"triggerFileName": "@trigger().outputs.body.fileName"
}
}
],
"type": "BlobEventsTrigger",
"typeProperties": {
"blobPathBeginsWith": "/kipa/blobs/adf",
"blobPathEndsWith": "_processedJsonFiles.txt",
"ignoreEmptyBlobs": true,
"scope": "/subscriptions/{redacted}/resourceGroups/{redacted}/providers/Microsoft.Storage/storageAccounts/{redacted}",
"events": [
"Microsoft.Storage.BlobCreated"
]
}
}
}
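For reference, the parameter mappings look right to me: for a blob created at kipa/adf/20241218105953_processedJsonFiles.txt, the event trigger's @trigger().outputs.body should contain roughly this (my reconstruction, not a captured payload):

{
  "folderPath": "kipa/adf",
  "fileName": "20241218105953_processedJsonFiles.txt"
}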
Example input of the copy activity (identical for the failed and the successful run, so the problem isn't in the input? Note that linked-service credentials are resolved at runtime and never appear in this payload anyway):
{
"source": {
"type": "BinarySource",
"storeSettings": {
"type": "AzureBlobStorageReadSettings",
"fileListPath": "kipa/adf/20241218105953_processedJsonFiles.txt",
"deleteFilesAfterCompletion": false
},
"formatSettings": {
"type": "BinaryReadSettings"
}
},
"sink": {
"type": "BinarySink",
"storeSettings": {
"type": "FileServerWriteSettings",
"copyBehavior": "PreserveHierarchy"
}
},
"enableStaging": false,
"skipErrorFile": {
"dataInconsistency": true
},
"validateDataConsistency": true,
"logSettings": {
"enableCopyActivityLog": true,
"copyActivityLogSettings": {
"logLevel": "Warning",
"enableReliableLogging": false
},
"logLocationSettings": {
"linkedServiceName": {
"referenceName": "LS_REDACTED_STA",
"type": "LinkedServiceReference"
},
"path": "kipa/adf/STA-DFS_CopyProcessedJson"
}
}
}
Update:
I changed from using Key Vault to a regular password, and the error has now changed to:
ErrorCode=AzureBlobCredentialMissing,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=Please provide either connectionString or sasUri or serviceEndpoint to connect to Blob.,Source=Microsoft.DataTransfer.ClientLibrary,'
Progress! I seem to be supplying all the overrides correctly in my DevOps pipeline, but I will investigate further...
The problem was indeed in the CI/CD (Azure DevOps) pipeline... I don't know exactly what it was, but I edited the YAML to match the official documentation and triple-checked my parameter overrides, and now it works... probably a silent failure somewhere.
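For anyone hitting the same thing: both errors (the empty 'userid' and AzureBlobCredentialMissing) are consistent with empty or missing ARM parameter overrides for the two linked services at deployment time. A sketch of the parameters file with the overrides filled in; the parameter names here are hypothetical and must match the names ADF generated in ARMTemplateParametersForFactory.json exactly:

{
  "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentParameters.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {
    "factoryName": { "value": "df-redacted" },
    "LS_DFS_REDACTED_properties_typeProperties_userId": { "value": "ASDF\\usr-dev-rw" },
    "LS_REDACTED_STA_connectionString": { "value": "<secret from Key Vault or a pipeline variable>" }
  }
}

Note that the generated ARMTemplateParametersForFactory.json ships empty values for secure fields, so a deployment that fails to override them still succeeds and only blows up at runtime, which matches what I saw.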