I hope someone can help me. I have two processes.
When the first process consumes the event thrown by the second process, the log shows me the following message:
No match found for trigger raccolta_dati in process cra_edit. Skipping consumed message CloudEventWrapDataEvent [cloudEvent=CloudEvent{id='c91d80dc-aaa9-43b4-83ae-dc6ce55f5424', source=/process/raccolta_dati, type='raccolta_dati', time=2024-01-30T17:56:57.096033500+01:00, data=JsonCloudEventData{node="string"}, extensions={kogitoproctype=BPMN, kogitoprocinstanceid=d48b1e4e-bc15-4ee7-8456-7b5a3adcc577, kogitoprocist=Active, kogitoprocversion=1.0, kogitoprocid=raccolta_dati}}]
For Kafka I use the same topic for all node events. My configuration is:
mp.messaging.incoming.kogito_incoming_stream.connector=${KOGITO.KAFKA.CONNECTOR}
mp.messaging.incoming.kogito_incoming_stream.topic=${KAFKA.KOGITO.INCOMING-STREAM.TOPIC}
mp.messaging.incoming.kogito_incoming_stream.value.deserializer=org.apache.kafka.common.serialization.StringDeserializer
mp.messaging.incoming.kogito_incoming_stream.auto.offset.reset=earliest
mp.messaging.incoming.kogito_incoming_stream.group.id=node-event-consumer
mp.messaging.outgoing.kogito_outgoing_stream.connector=${KOGITO.KAFKA.CONNECTOR}
mp.messaging.outgoing.kogito_outgoing_stream.topic=${KAFKA.KOGITO.OUTGOING-STREAM.TOPIC}
mp.messaging.outgoing.kogito_outgoing_stream.value.serializer=org.apache.kafka.common.serialization.StringSerializer
kogito_incoming_stream and kogito_outgoing_stream point to the same topic.
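For reference, a throw-away consumer along the lines of the sketch below (broker address, topic name and class name are placeholders, not part of my project) can print whatever actually lands on that shared topic, which makes it easier to compare the published CloudEvent with the trigger the first process expects:

import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.StringDeserializer;

import java.time.Duration;
import java.util.List;
import java.util.Properties;

public class DumpTopic {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");   // assumed broker address
        props.put("group.id", "debug-dump");                 // separate group so the Kogito consumers are not affected
        props.put("auto.offset.reset", "earliest");
        props.put("key.deserializer", StringDeserializer.class.getName());
        props.put("value.deserializer", StringDeserializer.class.getName());

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(List.of("kogito-node-events")); // assumed topic name
            // Print every record until the program is stopped (Ctrl+C).
            while (true) {
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(1));
                for (ConsumerRecord<String, String> record : records) {
                    System.out.println(record.value());
                }
            }
        }
    }
}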
Maybe I need to set up correlation, but I don't understand where this configuration should go.
Thanks
You need to set kogitoprocrefid in the message; it should match the ID of the process instance. See the comments section of https://blog.kie.org/2021/09/kogito-process-eventing-add-ons.html
{
  "specversion": "1.0",
  "id": "21627e26-31eb-43e7-8343-92a696fd96b1",
  "source": "",
  "type": "inbound.events",
  "time": "2024-11-11T13:25:16Z",
  "kogitoprocrefid": "6a582155-a524-4974-a9b2-9ceb7858026e",
  "correlationKey": "c1",
  "data": {
    "status": "GOOD",
    "taskName": "MyTask"
  }
}
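For completeness, here is a minimal sketch of how such an event could be published to the shared topic with a plain String Kafka producer. The broker address, topic name and class name are placeholders, and the process instance id must be replaced with the id of the instance that is actually waiting for the event:

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

import java.util.Properties;
import java.util.UUID;

public class SendNodeEvent {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");            // assumed broker address
        props.put("key.serializer", StringSerializer.class.getName());
        props.put("value.serializer", StringSerializer.class.getName());

        // kogitoprocrefid must hold the id of the waiting process instance;
        // here it is taken from the command line purely for illustration.
        String processInstanceId = args.length > 0 ? args[0] : "6a582155-a524-4974-a9b2-9ceb7858026e";

        // Build the CloudEvent JSON with the same shape as the example above.
        String event = "{"
                + "\"specversion\":\"1.0\","
                + "\"id\":\"" + UUID.randomUUID() + "\","
                + "\"source\":\"\","
                + "\"type\":\"inbound.events\","
                + "\"time\":\"2024-11-11T13:25:16Z\","
                + "\"kogitoprocrefid\":\"" + processInstanceId + "\","
                + "\"data\":{\"status\":\"GOOD\",\"taskName\":\"MyTask\"}"
                + "}";

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            // Assumed topic name; in the question both channels point to the same topic.
            producer.send(new ProducerRecord<>("kogito-node-events", event));
            producer.flush();
        }
    }
}

As far as I understand, the type field has to match the name of the message trigger defined in the consuming process (otherwise you get the "No match found for trigger ..." log and the event is skipped), while kogitoprocrefid routes the event to the specific waiting instance.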