Converting JSON to CSV using ChoETL, values appear in one row rather than in columns

Problem description (votes: 0, answers: 1)

I am converting a JSON file into a CSV file. The JSON has several nested objects. When converting, I can pull all the values out of the JSON and into the CSV. However, all the values appear on a single row, with the same header repeated multiple times. I am using the ChoETL library.

using (var csv = new ChoCSVWriter("file1.csv").WithFirstLineHeader().WithDelimiter(","))
{
    using (var json = new ChoJSONReader("file2.json")
        .WithField("RecordID", jsonPath: "$..Events[*].RecordId")
        .WithField("RecordType", jsonPath: "$..Events[*].RecordType")
        .WithField("EventDate", jsonPath: "$..Events[*].EventDate"))
    {
        csv.Write(json);
    }
}

The result is displayed as

  • RecordID_0 RecordID_1 RecordID_2
  • 123 456 789

instead of

  • RecordID
  • 123
  • 456
  • 789

Here is the JSON file:

[
    {
        "Id": "3e399241",
        "IdLineage": [
            "sfdsfdsfs",
            "sdfdsfdsf"
        ],
        "Individuals": [
            {
                "Id": "1232112",
                "IdLineage": [
                    "fdsfsd1"
                ],
                "Events": [
                    {
                        "RecordId": "2132121321",
                        "RecordType": "SALE",
                        "EventDate": "2016-01-04T05:00:00Z"
                    },
                    {
                        "RecordId": "123213212",
                        "RecordType": "SALE",
                        "EventDate": "2012-07-16T04:00:00Z"
                    }
                ]
            },
            {
                "Id": "ssf2112",
                "IdLineage": [],
                "Events": [
                    {
                        "RecordId": "123213ds21",
                        "RecordType": "ACXIOMRECORD",
                        "EventDate": "2017-12-17T03:33:54.875Z"
                    }
                ]
            },
            {
                "Id": "asadsad",
                "IdLineage": [],
                "Events": [
                    {
                        "RecordId": "213213sa21",
                        "RecordType": "SALE",
                        "EventDate": "2018-03-09T05:00:00Z"
                    }
                ]
            }
        ]
    }
]
c# json csv choetl
1 Answer (1 vote)

Based on the sample code you posted, you are building objects from the JSON that look like this:

{
   RecordID : Array,
   RecordType: Array,
   EventDate: Array
}

This produces a CSV in the following format, which is the expected behavior:

RecordID_0, RecordID_1, RecordID_2, RecordType_0, RecordType_1, ....
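A minimal plain-.NET sketch (no ChoETL; class and variable names are assumed for illustration) of why this pivot happens: collecting every `Events[*].RecordId` match into a single record leaves each field holding an array, and a flat CSV row can only represent an array by numbering the columns:

```csharp
using System;
using System.Linq;
using System.Text.Json;

class PivotDemo
{
    static void Main()
    {
        var json = JsonDocument.Parse(
            @"{""Events"":[{""RecordId"":""123""},{""RecordId"":""456""},{""RecordId"":""789""}]}");

        // A path like "$..Events[*].RecordId" gathers ALL matches into one
        // array-valued field on a single record.
        string[] recordIds = json.RootElement.GetProperty("Events")
            .EnumerateArray()
            .Select(e => e.GetProperty("RecordId").GetString())
            .ToArray();

        // One record, one CSV row: the array is spread across suffixed
        // columns, exactly as in the question.
        Console.WriteLine(string.Join(",", recordIds.Select((_, i) => $"RecordID_{i}")));
        Console.WriteLine(string.Join(",", recordIds));
        // Output:
        // RecordID_0,RecordID_1,RecordID_2
        // 123,456,789
    }
}
```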

If you want to produce a CSV in the following format instead, you have to fix the JSONPath on each record field:

RecordID, RecordType, EventDate

Sample code:

using (var csv = new ChoCSVWriter("file1.csv").WithFirstLineHeader().WithDelimiter(","))
{
    using (var json = new ChoJSONReader("file2.json")
    .WithField("RecordID", jsonPath: "$..Events.RecordId")
    .WithField("RecordType", jsonPath: "$..Events.RecordType")
    .WithField("EventDate", jsonPath: "$..Events.EventDate"))
    {
        csv.Write(json);
    }
}

Update #1: After looking at the sample JSON, here is how you can extract the data and generate the CSV file in the expected format:

StringBuilder msg = new StringBuilder();

using (var w = new ChoCSVWriter(msg)
    .WithFirstLineHeader()
    )
{
    using (var r = new ChoJSONReader("Sample32.json")
        .WithJSONPath("$..Events[*]")
        )
    {
        w.Write(r);
    }
}
Console.WriteLine(msg.ToString());

Output #1:

RecordId,RecordType,EventDate
2132121321,SALE,1/4/2016 5:00:00 AM
123213212,SALE,7/16/2012 4:00:00 AM
123213ds21,ACXIOMRECORD,12/17/2017 3:33:54 AM
213213sa21,SALE,3/9/2018 5:00:00 AM

Update #2:

You have to use LINQ to combine the Id with the Events members. The sample below shows how:

using (var fw = new StreamWriter("Sample32.csv", true))
{
    using (var w = new ChoCSVWriter(fw)
        .WithFirstLineHeader()
        )
    {
        using (var r = new ChoJSONReader("Sample32.json")
            .WithJSONPath("$..Individuals[*]")
            )
        {
            // Flatten: pair each individual's Id with every one of its Events
            w.Write(r.SelectMany(r1 => ((dynamic[])r1.Events)
                .Select(r2 => new { r1.Id, r2.RecordId, r2.RecordType, r2.EventDate })));
        }
    }
}
Console.WriteLine(File.ReadAllText("Sample32.csv"));

Output #2:

Id,RecordId,RecordType,EventDate
1232112,2132121321,SALE,1/4/2016 5:00:00 AM
1232112,123213212,SALE,7/16/2012 4:00:00 AM
ssf2112,123213ds21,ACXIOMRECORD,12/17/2017 3:33:54 AM
asadsad,213213sa21,SALE,3/9/2018 5:00:00 AM