I have a container with multiple files, subfolders, and zipped subfolders; the folders contain different file types such as csv and xlsx. I want to copy all the files and folders from this container to another container.
I used the Copy activity suggested in this thread: How to copy all files and folders in a specific directory using Azure Data Factory, but it failed with the following error.
I also tried setting the concurrency to 1, but had no luck. Is there any other way to copy all the files and folders from this container to another container?
You can follow the approach below to get the expected output.
Here is the input in the source storage container:
The pipeline consists of a Get Metadata activity that retrieves all the items from the input container. Select Child items in the Field list. Then use the Get Metadata activity's childItems output as the Items of a ForEach activity:
@activity('Get Metadata1').output.childItems
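For reference, the childItems property returned by Get Metadata is an array of name/type pairs, so each @item() inside the ForEach exposes a name and a type. A typical output looks like the sketch below (the item names are illustrative, not from your container):

```json
{
    "childItems": [
        { "name": "folder1", "type": "Folder" },
        { "name": "sample.csv", "type": "File" },
        { "name": "report.xlsx", "type": "File" }
    ]
}
```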
Inside the ForEach activity, create two Copy activities: one to copy all the folders and another to copy all the files. In the folder Copy activity, you need to select a wildcard path, as shown below:
Add another Copy activity to copy the files, using a dataset parameter filename whose value is @item().name, as shown below:
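The source dataset (src1 in the pipeline JSON) needs that filename parameter wired into its file path. A minimal sketch of such a parameterized dataset, assuming an ADLS Gen2 linked service named source_ls and a file system named input (both hypothetical names you would replace with your own), might look like:

```json
{
    "name": "src1",
    "properties": {
        "linkedServiceName": {
            "referenceName": "source_ls",
            "type": "LinkedServiceReference"
        },
        "parameters": {
            "filename": { "type": "string" }
        },
        "type": "DelimitedText",
        "typeProperties": {
            "location": {
                "type": "AzureBlobFSLocation",
                "fileName": {
                    "value": "@dataset().filename",
                    "type": "Expression"
                },
                "fileSystem": "input"
            }
        }
    }
}
```

The @dataset().filename expression is what lets each ForEach iteration copy a different file through the same dataset.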
After the pipeline debug run succeeds, the folders and files are copied to the other container as shown below. Here is the pipeline JSON for your reference:
{
    "name": "pipeline3",
    "properties": {
        "activities": [
            {
                "name": "Get Metadata1",
                "type": "GetMetadata",
                "dependsOn": [],
                "policy": {
                    "timeout": "0.12:00:00",
                    "retry": 0,
                    "retryIntervalInSeconds": 30,
                    "secureOutput": false,
                    "secureInput": false
                },
                "userProperties": [],
                "typeProperties": {
                    "dataset": {
                        "referenceName": "Binary1",
                        "type": "DatasetReference"
                    },
                    "fieldList": [
                        "childItems"
                    ],
                    "storeSettings": {
                        "type": "AzureBlobFSReadSettings",
                        "recursive": true,
                        "enablePartitionDiscovery": false
                    },
                    "formatSettings": {
                        "type": "BinaryReadSettings"
                    }
                }
            },
            {
                "name": "ForEach1",
                "type": "ForEach",
                "dependsOn": [
                    {
                        "activity": "Get Metadata1",
                        "dependencyConditions": [
                            "Succeeded"
                        ]
                    }
                ],
                "userProperties": [],
                "typeProperties": {
                    "items": {
                        "value": "@activity('Get Metadata1').output.childItems",
                        "type": "Expression"
                    },
                    "isSequential": true,
                    "activities": [
                        {
                            "name": "Copy data1",
                            "type": "Copy",
                            "dependsOn": [
                                {
                                    "activity": "Copy data1_copy1",
                                    "dependencyConditions": [
                                        "Succeeded"
                                    ]
                                }
                            ],
                            "policy": {
                                "timeout": "0.12:00:00",
                                "retry": 0,
                                "retryIntervalInSeconds": 30,
                                "secureOutput": false,
                                "secureInput": false
                            },
                            "userProperties": [],
                            "typeProperties": {
                                "source": {
                                    "type": "DelimitedTextSource",
                                    "storeSettings": {
                                        "type": "AzureBlobFSReadSettings",
                                        "recursive": true,
                                        "enablePartitionDiscovery": false
                                    },
                                    "formatSettings": {
                                        "type": "DelimitedTextReadSettings"
                                    }
                                },
                                "sink": {
                                    "type": "DelimitedTextSink",
                                    "storeSettings": {
                                        "type": "AzureBlobFSWriteSettings"
                                    },
                                    "formatSettings": {
                                        "type": "DelimitedTextWriteSettings",
                                        "quoteAllText": true,
                                        "fileExtension": ".txt"
                                    }
                                },
                                "enableStaging": false,
                                "translator": {
                                    "type": "TabularTranslator",
                                    "typeConversion": true,
                                    "typeConversionSettings": {
                                        "allowDataTruncation": true,
                                        "treatBooleanAsNumber": false
                                    }
                                }
                            },
                            "inputs": [
                                {
                                    "referenceName": "src1",
                                    "type": "DatasetReference",
                                    "parameters": {
                                        "filename": {
                                            "value": "@item().name",
                                            "type": "Expression"
                                        }
                                    }
                                }
                            ],
                            "outputs": [
                                {
                                    "referenceName": "sink1",
                                    "type": "DatasetReference"
                                }
                            ]
                        },
                        {
                            "name": "Copy data1_copy1",
                            "type": "Copy",
                            "dependsOn": [],
                            "policy": {
                                "timeout": "0.12:00:00",
                                "retry": 0,
                                "retryIntervalInSeconds": 30,
                                "secureOutput": false,
                                "secureInput": false
                            },
                            "userProperties": [],
                            "typeProperties": {
                                "source": {
                                    "type": "BinarySource",
                                    "storeSettings": {
                                        "type": "AzureBlobFSReadSettings",
                                        "recursive": true,
                                        "wildcardFolderPath": "*",
                                        "wildcardFileName": "*",
                                        "deleteFilesAfterCompletion": false
                                    },
                                    "formatSettings": {
                                        "type": "BinaryReadSettings"
                                    }
                                },
                                "sink": {
                                    "type": "BinarySink",
                                    "storeSettings": {
                                        "type": "AzureBlobFSWriteSettings"
                                    }
                                },
                                "enableStaging": false
                            },
                            "inputs": [
                                {
                                    "referenceName": "Binary1",
                                    "type": "DatasetReference"
                                }
                            ],
                            "outputs": [
                                {
                                    "referenceName": "Binary2",
                                    "type": "DatasetReference"
                                }
                            ]
                        }
                    ]
                }
            }
        ],
        "annotations": []
    }
}