Unable to connect Logstash to Kafka with a Compose file


I am using a Compose file to create a data pipeline between Logstash and Kafka, but the following message keeps appearing in the

logstash
container. Can someone help me?

The message:

[WARN ][org.apache.kafka.clients.NetworkClient] [Consumer clientId=logstash-0, groupId=logstash] Connection to node 2 could not be established. Broker may not be available.

My Compose file:

version: "3"
services:
    zookeeper:
        image: confluentinc/cp-zookeeper:6.2.0
        container_name: zookeeper
        ports:
            - "2181:2181"
        networks:
            - kafkanet
        environment:
            ZOOKEEPER_CLIENT_PORT: "2181"
            ZOOKEEPER_TICK_TIME: "2000"
            ZOOKEEPER_SYNC_LIMIT: "2"
        
    kafkaserver:
        image: confluentinc/cp-kafka:6.2.0
        container_name: kafka
        ports:
            - "9092:9092"
        networks:
            - kafkanet
        environment:
            KAFKA_ZOOKEEPER_CONNECT: "zookeeper:2181"
            KAFKA_ADVERTISED_LISTENERS: "PLAINTEXT://localhost:9092"
            KAFKA_BROKER_ID: "2"
            KAFKA_OFFSETS_TOPIC_REPLICATION_FACTOR: "1"
        depends_on:
            - zookeeper

    elasticsearch:
        image: docker.elastic.co/elasticsearch/elasticsearch:6.4.0
        container_name: elasticsearch
        ports:
            - 9200:9200
            - 9300:9300
        networks:
            - kafkanet
    
    kibana:
        image: docker.elastic.co/kibana/kibana:6.4.0
        container_name: kibana
        ports:
            - 5601:5601
        networks:
            - kafkanet
        depends_on: [ 'elasticsearch' ]
    
    # Logstash Docker Image
    logstash:
        image: docker.elastic.co/logstash/logstash:6.4.0
        container_name: logstash
        networks:
            - kafkanet
        depends_on: [ 'elasticsearch', 'kafkaserver' ]
        volumes:
            - './logstash/config:/usr/share/logstash/pipeline/'

networks:
    kafkanet:
        driver: bridge

./logstash/config/logstash.conf

input {
  kafka {
    bootstrap_servers => "kafkaserver:9092"
    topics => ["sit.catalogue.item","uat.catalogue.item"]
    auto_offset_reset => "earliest"
    decorate_events => true
  }
}

output {
  elasticsearch {
    hosts => ["elasticsearch:9200"]
    index => "%{[indexPrefix]}-logs-%{+YYYY.MM.dd}"
  }
}
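A side note on the output: the index pattern references a field `[indexPrefix]`, but nothing shown here sets it. Since the input enables `decorate_events => true`, Kafka metadata (topic, partition, offset) is available under `[@metadata][kafka]`, and one way to populate the prefix is to derive it from the topic name. This is a sketch only; how `indexPrefix` is actually meant to be set is not shown in the question, so the mapping below (e.g. "sit.catalogue.item" → "sit") is an assumption:

```
filter {
  mutate {
    # Copy the Kafka topic name into the indexPrefix field.
    copy => { "[@metadata][kafka][topic]" => "[indexPrefix]" }
  }
  mutate {
    # Keep only the first dot-separated segment of the topic name,
    # e.g. "sit.catalogue.item" becomes "sit".
    gsub => [ "[indexPrefix]", "\..*$", "" ]
  }
}
```

Without some filter like this, events whose `indexPrefix` field is absent would be indexed under a literal "%{[indexPrefix]}-logs-..." index name.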

1 Answer

The listener you are advertising from Kafka is incorrect. It should be the hostname that other containers on the Docker network can resolve:

kafkaserver

So instead of

KAFKA_ADVERTISED_LISTENERS: "PLAINTEXT://localhost:9092"

you need

KAFKA_ADVERTISED_LISTENERS: "PLAINTEXT://kafkaserver:9092"
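One caveat with this fix: once the broker advertises only `kafkaserver:9092`, clients running on the Docker host can no longer reach it via `localhost:9092`, because `kafkaserver` does not resolve outside the Compose network. If you need both in-network and host access, a common pattern (a sketch, not part of the original answer) is to define two listeners, one per audience:

```yaml
    kafkaserver:
        image: confluentinc/cp-kafka:6.2.0
        container_name: kafka
        ports:
            - "29092:29092"   # host-facing port
        networks:
            - kafkanet
        environment:
            KAFKA_ZOOKEEPER_CONNECT: "zookeeper:2181"
            # INTERNAL is used by containers on kafkanet, EXTERNAL by the host.
            KAFKA_LISTENERS: "INTERNAL://0.0.0.0:9092,EXTERNAL://0.0.0.0:29092"
            KAFKA_ADVERTISED_LISTENERS: "INTERNAL://kafkaserver:9092,EXTERNAL://localhost:29092"
            KAFKA_LISTENER_SECURITY_PROTOCOL_MAP: "INTERNAL:PLAINTEXT,EXTERNAL:PLAINTEXT"
            KAFKA_INTER_BROKER_LISTENER_NAME: "INTERNAL"
            KAFKA_BROKER_ID: "2"
            KAFKA_OFFSETS_TOPIC_REPLICATION_FACTOR: "1"
        depends_on:
            - zookeeper
```

With this layout, Logstash keeps using `bootstrap_servers => "kafkaserver:9092"` unchanged, while host tools connect to `localhost:29092`.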

For more details, see this blog post I wrote
