Apache Beam: reading Avro files from GCS in a Dataflow job


I am running a Java Dataflow job that reads Avro files, and it fails with the error below. Any help is appreciated.

Here is the code:

// Get Avro Schema
String schemaJson = getSchema(options.getAvroSchema());
Schema schema = new Schema.Parser().parse(schemaJson);

// Check schema field types before starting the Dataflow job
checkFieldTypes(schema);

// Create the Pipeline object with the options we defined above.
Pipeline pipeline = Pipeline.create(options);
String bqStr = getBQString(options);
// TableSchema ts = BigQueryAvroUtils.getTableSchema(User.SCHEMA$);
// Convert Avro To CSV
PCollection<GenericRecord> records =
    pipeline.apply(
        "Read Avro files",
        AvroIO.readGenericRecords(schema)
            .from(options.getInputFile()));

records
    .apply(
        "Convert Avro to CSV formatted data",
        ParDo.of(new ConvertAvroToCsv(schemaJson, options.getCsvDelimiter())))
    .apply(
        "Write CSV formatted data",
        TextIO.write().to(options.getOutput())
            .withSuffix(".csv"));

records.apply(
      "Write to BigQuery",
      BigQueryIO.write()
          .to(bqStr)
          .withJsonSchema(schemaJson)
          .withWriteDisposition(WRITE_APPEND)
          .withCreateDisposition(CREATE_IF_NEEDED)
          .withFormatFunction(TABLE_ROW_PARSER));
  // [END bq_write]

Here is the error I am seeing (the same exception repeats on each retry):

2020-06-01 13:14:41 ERROR MonitoringUtil$LoggingHandler:99 - 2020-06-01T07:44:39.240Z: java.lang.ClassCastException: org.apache.avro.generic.GenericData$Record cannot be cast to org.apache.avro.specific.SpecificRecord
        at com.example.AvroToCsv$1.apply(AvroToCsv.java:1)
        at org.apache.beam.sdk.io.gcp.bigquery.PrepareWrite$1.processElement(PrepareWrite.java:76)

Tags: java, google-cloud-dataflow, apache-beam-io, avroio
1 Answer

The error is in your TABLE_ROW_PARSER function. It appears to be casting the Avro GenericRecord to a SpecificRecord.

The failing line in PrepareWrite is here. That line invokes the format function you supplied to withFormatFunction. The format function must convert each input element into a JSON TableRow. For better efficiency, consider using withAvroFormatFunction instead.
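A format function that stays on the GenericRecord API might look like the sketch below. This is illustrative, not your actual TABLE_ROW_PARSER: the field loop assumes flat primitive fields, and nested records, unions, or logical types would need their own conversion logic.

```java
import com.google.api.services.bigquery.model.TableRow;
import org.apache.avro.Schema;
import org.apache.avro.generic.GenericRecord;
import org.apache.beam.sdk.transforms.SerializableFunction;

// Sketch: read fields off the GenericRecord directly instead of
// casting it to a SpecificRecord (which throws ClassCastException
// when AvroIO.readGenericRecords produced the elements).
static final SerializableFunction<GenericRecord, TableRow> TABLE_ROW_PARSER =
    (GenericRecord record) -> {
      TableRow row = new TableRow();
      for (Schema.Field field : record.getSchema().getFields()) {
        Object value = record.get(field.name());
        // Avro returns strings as Utf8 instances; stringify for BigQuery.
        row.set(field.name(), value == null ? null : value.toString());
      }
      return row;
    };
```

With this shape the BigQueryIO sink would be typed explicitly, e.g. `BigQueryIO.<GenericRecord>write().withFormatFunction(TABLE_ROW_PARSER)`, so the input element type matches the function's argument type.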
