If I have two .avsc files, the plugin generated the distinct record objects without issue up to version 1.12.0. Since that version, the Avro plugin throws a "Can't redefine" error from ParseContext.java.
First schema:
{
  "type": "record",
  "name": "MessageKey",
  "namespace": "com.example.domain",
  "doc": "Kafka message key.",
  "fields": [
    {
      "name": "id",
      "type": {
        "type": "string",
        "avro.java.string": "String"
      },
      "doc": "The ID."
    },
    {
      "name": "provenance",
      "type": {
        "name": "Provenance",
        "type": "record",
        "fields": [
          {
            "name": "orderId",
            "type": "boolean"
          }
        ]
      }
    }
  ]
}
Second schema:
{
  "type": "record",
  "name": "Programme",
  "namespace": "com.example.domain",
  "doc": "A Programme.",
  "fields": [
    {
      "name": "id",
      "type": {
        "type": "string",
        "avro.java.string": "String"
      },
      "doc": "The id of the programme."
    },
    {
      "name": "provenance",
      "type": {
        "name": "Provenance",
        "type": "record",
        "fields": [
          {
            "name": "orderId",
            "type": "boolean"
          }
        ]
      }
    }
  ]
}
While debugging I found that ParseContext throws the "Can't redefine:" error at line 219. When it encounters a schema whose name it has already seen, it checks whether the structures are identical. The already-visited object that the comparison expects is

{"name":"Provenance","type":"record","fields":[{"name":"orderId","type":"boolean"}]}

but when the same object is found a second time, it differs for some reason:

{"type":"record","name":"Provenance","namespace":"com.example.domain"}
I ran into a similar problem when upgrading to Avro 1.12.0. My workaround was to extract the common part into a dedicated file (common.avsc; I had to fiddle with it a bit to get there).
This is what I ended up with:
import java.io.File;
import java.io.IOException;

import org.apache.avro.AvroRuntimeException;
import org.apache.avro.Schema;
import org.apache.avro.io.DatumReader;
import org.apache.avro.io.DecoderFactory;
import org.apache.avro.specific.SpecificDatumReader;
import org.apache.kafka.common.errors.SerializationException;

public class KafkaDeserializer<GENERATED_MODEL> {

    private final String schemaName;
    private final Class<GENERATED_MODEL> targetClass;

    public KafkaDeserializer(String schemaName, Class<GENERATED_MODEL> targetClass) {
        this.schemaName = schemaName;
        this.targetClass = targetClass;
    }

    public GENERATED_MODEL jsonToAvro(String jsonString) throws IOException {
        Schema.Parser parser = new Schema.Parser();
        // first load the common objects
        parser.parse(new File(getClass().getClassLoader().getResource("avro/schemas/common.avsc").getFile()));
        // then load one of the "main" schemas, which reference the objects defined in common.avsc
        parser.parse(new File(getClass().getClassLoader().getResource(String.format("avro/schemas/%s.avsc", schemaName)).getFile()));
        Schema schema = parser.getTypes().get(targetClass.getName());
        try {
            DatumReader<GENERATED_MODEL> reader = new SpecificDatumReader<>(targetClass);
            return reader.read(null, DecoderFactory.get().jsonDecoder(schema, jsonString));
        } catch (IOException | AvroRuntimeException e) {
            throw new SerializationException(
                    String.format("Error deserializing json %s to Avro of schema %s", jsonString, schema),
                    e
            );
        }
    }
}
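For reference, the extracted common.avsc can be just the shared record on its own. The content below is a sketch reconstructed from the schemas in the question, not the literal file I used:

```json
{
  "type": "record",
  "name": "Provenance",
  "namespace": "com.example.domain",
  "fields": [
    { "name": "orderId", "type": "boolean" }
  ]
}
```

The "main" schemas then drop their inline definition of Provenance and reference it by its full name instead, e.g. `{"name": "provenance", "type": "com.example.domain.Provenance"}`, so the record is only ever defined once per parser.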