Question: Logstash with OpenSearch hangs after the pipeline starts


I hope you can help me. I need to set up Logstash to read from an AWS S3 bucket as input and send the data to OpenSearch as output. The input is a folder containing many CSV files. The problem is that when Logstash starts and reaches "Pipeline started", it hangs there forever.

Here is my configuration:

input {
  s3 {
    bucket => "my-bucket"
    region => "eu-west-1"
    prefix => "test_logstash/"
    codec  => plain {
      charset => "ISO-8859-1"
    }
  }
}


output {
  opensearch {
    hosts       => "opensearch_HOST:443"
    user        => "my-user"
    password    => "my-pwd"
    index       => "logstash-logs-%{+YYYY.MM.dd}"
    ecs_compatibility => disabled
    ssl_certificate_verification => false
  }
}
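For what it's worth, here is a sketch of how I imagine the same pipeline could look with the CSV lines actually parsed into fields. The csv filter, the `autodetect_column_names` option, and the explicit `https://` scheme on `hosts` are assumptions on my part (they are not in my original config), based on the plugin documentation:

```
input {
  s3 {
    bucket => "my-bucket"
    region => "eu-west-1"
    prefix => "test_logstash/"
    codec  => plain {
      charset => "ISO-8859-1"
    }
  }
}

filter {
  # Assumption: each CSV has a header row; autodetect_column_names
  # takes the field names from the first line of each file.
  csv {
    autodetect_column_names => true
  }
}

output {
  opensearch {
    hosts    => ["https://opensearch_HOST:443"]
    user     => "my-user"
    password => "my-pwd"
    index    => "logstash-logs-%{+YYYY.MM.dd}"
    ecs_compatibility => disabled
    ssl_certificate_verification => false
  }
}
```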

From the Logstash output it looks like it connects to both S3 and OpenSearch successfully. This is everything I get before it hangs:

Using bundled JDK: /root/logstash-7.16.2/jdk
OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
Sending Logstash logs to /root/logstash-7.16.2/logs which is now configured via log4j2.properties
[2022-12-01T17:25:24,912][INFO ][logstash.runner          ] Log4j configuration path used is: /root/logstash-7.16.2/config/log4j2.properties
[2022-12-01T17:25:24,932][INFO ][logstash.runner          ] Starting Logstash {"logstash.version"=>"7.16.2", "jruby.version"=>"jruby 9.2.20.1 (2.5.8) 2021-11-30 2a2962fbd1 OpenJDK 64-Bit Server VM 11.0.13+8 on 11.0.13+8 +indy +jit [linux-x86_64]"}
[2022-12-01T17:25:25,631][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2022-12-01T17:25:27,177][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600, :ssl_enabled=>false}
[2022-12-01T17:25:28,758][INFO ][org.reflections.Reflections] Reflections took 172 ms to scan 1 urls, producing 119 keys and 417 values
[2022-12-01T17:25:55,702][INFO ][logstash.outputs.opensearch][main] New OpenSearch output {:class=>"LogStash::Outputs::OpenSearch", :hosts=>["OPENSEARCHHOST:443"]}
[2022-12-01T17:25:55,761][WARN ][logstash.outputs.opensearch][main] ** WARNING ** Detected UNSAFE options in opensearch output configuration!
** WARNING ** You have enabled encryption but DISABLED certificate verification.
** WARNING ** To make sure your data is secure change :ssl_certificate_verification to true
[2022-12-01T17:25:56,489][INFO ][logstash.outputs.opensearch][main] OpenSearch pool URLs updated {:changes=>{:removed=>[], :added=>[OPENSEARCHHOST:443/]}}
[2022-12-01T17:25:57,364][WARN ][logstash.outputs.opensearch][main] Restored connection to OpenSearch instance {:url=>"OPENSEARCHHOST:443/"}
[2022-12-01T17:25:57,484][INFO ][logstash.outputs.opensearch][main] Cluster version determined (7.10.2) {:version=>7}
[2022-12-01T17:25:57,664][INFO ][logstash.outputs.opensearch][main] Using a default mapping template {:version=>7, :ecs_compatibility=>:disabled}
[2022-12-01T17:25:57,751][INFO ][logstash.javapipeline    ][main] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>1, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>125, "pipeline.sources"=>["/root/logstash-7.16.2/config/logstash.conf"], :thread=>"#<Thread:0x30578bb4 run>"}
[2022-12-01T17:25:59,304][INFO ][logstash.javapipeline    ][main] Pipeline Java execution initialization time {"seconds"=>1.55}
[2022-12-01T17:25:59,334][INFO ][logstash.inputs.s3       ][main] Registering {:bucket=>"MYBUCKET", :region=>"eu-west-1"}
[2022-12-01T17:26:00,564][INFO ][logstash.javapipeline    ][main] Pipeline started {"pipeline.id"=>"main"}
[2022-12-01T17:26:00,685][INFO ][logstash.agent           ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[2022-12-01T17:26:04,433][INFO ][logstash.inputs.s3       ][main][5edf7936d84baf43ecb3ae4563b20d4d19480a230ba643cd2d19dd66eae2bf29] Using default generated file for the sincedb {:filename=>"/root/logstash-7.16.2/data/plugins/inputs/s3/sincedb_9bd4bb61d9cbaeaa1526f686b91e2669"}

Can you help me? Is there something wrong here? This is my first time using Logstash. One more question: do I need to set anything up on OpenSearch beforehand so it is ready to receive the data?
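On the second question, my understanding (an assumption, not something I have verified) is that the output plugin creates the index on the first bulk write, so one way to check whether anything arrived is to look for the daily index on the OpenSearch side. The `%{+YYYY.MM.dd}` sprintf pattern resolves against the event's `@timestamp` in UTC; this small Python sketch shows which index name to look for:

```python
from datetime import datetime, timezone

def expected_index(prefix="logstash-logs-", when=None):
    """Resolve Logstash's %{+YYYY.MM.dd} index date pattern for a
    given timestamp (Logstash formats index dates in UTC)."""
    when = when or datetime.now(timezone.utc)
    return prefix + when.strftime("%Y.%m.%d")

# An event timestamped 2022-12-02 would be routed to:
print(expected_index(when=datetime(2022, 12, 2, tzinfo=timezone.utc)))
# logstash-logs-2022.12.02
```

That is the index name I would expect to see listed under `_cat/indices` if the bulk requests are actually being accepted.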

I hope someone can help. Thank you very much.

Edit: added part of the debug-level logs:

org.logstash.config.ir.compiler.ComputeStepSyntaxElement@3fdc2b9c
[2022-12-02T13:10:50,702][INFO ][logstash.javapipeline    ][main] Pipeline Java execution initialization time {"seconds"=>1.78}
[2022-12-02T13:10:50,742][INFO ][logstash.inputs.s3       ][main] Registering {:bucket=>"MY-BUCKET", :region=>"eu-west-1"}
[2022-12-02T13:10:51,986][INFO ][logstash.javapipeline    ][main] Pipeline started {"pipeline.id"=>"main"}
[2022-12-02T13:10:51,999][DEBUG][logstash.javapipeline    ] Pipeline started successfully {:pipeline_id=>"main", :thread=>"#<Thread:0x131789a run>"}
[2022-12-02T13:10:52,034][DEBUG][org.logstash.execution.PeriodicFlush][main] Pushing flush onto pipeline.
[2022-12-02T13:10:52,205][INFO ][logstash.agent           ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[2022-12-02T13:10:53,861][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ParNew"}
[2022-12-02T13:10:53,867][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ConcurrentMarkSweep"}
[2022-12-02T13:10:55,230][DEBUG][logstash.inputs.s3       ][main][28cc012d9b766ac5ffb68a68729f31668a5ed4a8116843251e4a46032fa2c210] Found key {:key=>"test_logstash/"}
[2022-12-02T13:10:55,241][DEBUG][logstash.inputs.s3       ][main][28cc012d9b766ac5ffb68a68729f31668a5ed4a8116843251e4a46032fa2c210] Ignoring {:key=>"test_logstash/"}
[2022-12-02T13:10:55,242][DEBUG][logstash.inputs.s3       ][main][28cc012d9b766ac5ffb68a68729f31668a5ed4a8116843251e4a46032fa2c210] Found key {:key=>"test_logstash/logstash_test.csv"}
[2022-12-02T13:10:55,249][INFO ][logstash.inputs.s3       ][main][28cc012d9b766ac5ffb68a68729f31668a5ed4a8116843251e4a46032fa2c210] Using the provided sincedb_path {:sincedb_path=>"/dev/null"}
[2022-12-02T13:10:55,268][DEBUG][logstash.inputs.s3       ][main][28cc012d9b766ac5ffb68a68729f31668a5ed4a8116843251e4a46032fa2c210] Added to objects[] {:key=>"test_logstash/logstash_test.csv", :length=>1}
[2022-12-02T13:10:55,302][DEBUG][logstash.inputs.s3       ][main][28cc012d9b766ac5ffb68a68729f31668a5ed4a8116843251e4a46032fa2c210] Processing {:bucket=>"MY-BUCKET", :key=>"test_logstash/logstash_test.csv"}
[2022-12-02T13:10:55,318][DEBUG][logstash.inputs.s3       ][main][28cc012d9b766ac5ffb68a68729f31668a5ed4a8116843251e4a46032fa2c210] Downloading remote file {:remote_key=>"test_logstash/logstash_test.csv", :local_filename=>"/tmp/logstash/logstash_test.csv"}
[2022-12-02T13:10:55,699][DEBUG][logstash.inputs.s3       ][main][28cc012d9b766ac5ffb68a68729f31668a5ed4a8116843251e4a46032fa2c210] Processing file {:filename=>"/tmp/logstash/logstash_test.csv"}
[2022-12-02T13:10:55,945][DEBUG][logstash.outputs.opensearch][main][7fcc73fd5d929902f33e89b90a86349660a8f4d5d9b9301e9817b24d6159030b] Sending final bulk request for batch. {:action_count=>1, :payload_size=>161, :content_length=>161, :batch_offset=>0}
[2022-12-02T13:10:57,028][DEBUG][org.logstash.execution.PeriodicFlush][main] Pushing flush onto pipeline.
[2022-12-02T13:10:58,877][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ParNew"}
[2022-12-02T13:10:58,886][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ConcurrentMarkSweep"}
[2022-12-02T13:11:02,028][DEBUG][org.logstash.execution.PeriodicFlush][main] Pushing flush onto pipeline.
[2022-12-02T13:11:03,893][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ParNew"}
[2022-12-02T13:11:03,899][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ConcurrentMarkSweep"}
[2022-12-02T13:11:07,028][DEBUG][org.logstash.execution.PeriodicFlush][main] Pushing flush onto pipeline.
[2022-12-02T13:11:08,905][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ParNew"}
[2022-12-02T13:11:08,915][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ConcurrentMarkSweep"}
[2022-12-02T13:11:12,028][DEBUG][org.logstash.execution.PeriodicFlush][main] Pushing flush onto pipeline.
[2022-12-02T13:11:13,922][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ParNew"}
[2022-12-02T13:11:13,928][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ConcurrentMarkSweep"}
[2022-12-02T13:11:17,028][DEBUG][org.logstash.execution.PeriodicFlush][main] Pushing flush onto pipeline.
[2022-12-02T13:11:18,933][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ParNew"}
[2022-12-02T13:11:18,939][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ConcurrentMarkSweep"}
[2022-12-02T13:11:22,028][DEBUG][org.logstash.execution.PeriodicFlush][main] Pushing flush onto pipeline.
[2022-12-02T13:11:23,946][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ParNew"}
[2022-12-02T13:11:23,947][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ConcurrentMarkSweep"}
[2022-12-02T13:11:27,028][DEBUG][org.logstash.execution.PeriodicFlush][main] Pushing flush onto pipeline.
[2022-12-02T13:11:28,958][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ParNew"}
[2022-12-02T13:11:28,958][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ConcurrentMarkSweep"}
[2022-12-02T13:11:32,028][DEBUG][org.logstash.execution.PeriodicFlush][main] Pushing flush onto pipeline.
[2022-12-02T13:11:33,970][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ParNew"}
[2022-12-02T13:11:33,984][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ConcurrentMarkSweep"}
[2022-12-02T13:11:37,028][DEBUG][org.logstash.execution.PeriodicFlush][main] Pushing flush onto pipeline.
[2022-12-02T13:11:38,990][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ParNew"}
[2022-12-02T13:11:38,994][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ConcurrentMarkSweep"}
[2022-12-02T13:11:42,028][DEBUG][org.logstash.execution.PeriodicFlush][main] Pushing flush onto pipeline.
[2022-12-02T13:11:44,001][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ParNew"}
[2022-12-02T13:11:44,002][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ConcurrentMarkSweep"}
[2022-12-02T13:11:47,028][DEBUG][org.logstash.execution.PeriodicFlush][main] Pushing flush onto pipeline.
[2022-12-02T13:11:49,012][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ParNew"}
[2022-12-02T13:11:49,014][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ConcurrentMarkSweep"}
[2022-12-02T13:11:52,028][DEBUG][org.logstash.execution.PeriodicFlush][main] Pushing flush onto pipeline.
[2022-12-02T13:11:53,001][DEBUG][logstash.inputs.s3       ][main][28cc012d9b766ac5ffb68a68729f31668a5ed4a8116843251e4a46032fa2c210] Found key {:key=>"test_logstash/"}
[2022-12-02T13:11:53,002][DEBUG][logstash.inputs.s3       ][main][28cc012d9b766ac5ffb68a68729f31668a5ed4a8116843251e4a46032fa2c210] Ignoring {:key=>"test_logstash/"}
[2022-12-02T13:11:53,004][DEBUG][logstash.inputs.s3       ][main][28cc012d9b766ac5ffb68a68729f31668a5ed4a8116843251e4a46032fa2c210] Found key {:key=>"test_logstash/logstash_test.csv"}
[2022-12-02T13:11:53,008][DEBUG][logstash.inputs.s3       ][main][28cc012d9b766ac5ffb68a68729f31668a5ed4a8116843251e4a46032fa2c210] Added to objects[] {:key=>"test_logstash/logstash_test.csv", :length=>1}
[2022-12-02T13:11:53,019][DEBUG][logstash.inputs.s3       ][main][28cc012d9b766ac5ffb68a68729f31668a5ed4a8116843251e4a46032fa2c210] Processing {:bucket=>"MY-BUCKET", :key=>"test_logstash/logstash_test.csv"}
[2022-12-02T13:11:53,021][DEBUG][logstash.inputs.s3       ][main][28cc012d9b766ac5ffb68a68729f31668a5ed4a8116843251e4a46032fa2c210] Downloading remote file {:remote_key=>"test_logstash/logstash_test.csv", :local_filename=>"/tmp/logstash/logstash_test.csv"}
[2022-12-02T13:11:53,242][DEBUG][logstash.inputs.s3       ][main][28cc012d9b766ac5ffb68a68729f31668a5ed4a8116843251e4a46032fa2c210] Processing file {:filename=>"/tmp/logstash/logstash_test.csv"}
[2022-12-02T13:11:53,350][DEBUG][logstash.outputs.opensearch][main][7fcc73fd5d929902f33e89b90a86349660a8f4d5d9b9301e9817b24d6159030b] Sending final bulk request for batch. {:action_count=>1, :payload_size=>161, :content_length=>161, :batch_offset=>0}
[2022-12-02T13:11:54,025][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ParNew"}
[2022-12-02T13:11:54,035][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ConcurrentMarkSweep"}
[2022-12-02T13:11:57,028][DEBUG][org.logstash.execution.PeriodicFlush][main] Pushing flush onto pipeline.

It seems to be trying to send to OpenSearch, but why does it never complete?
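One thing I am considering trying, to see whether events are actually leaving the pipeline independently of OpenSearch, is adding a stdout output alongside the existing one (a standard debugging technique; this is a sketch, not something from my current config):

```
output {
  # The existing opensearch output stays as-is; stdout just mirrors
  # every event to the console so I can confirm the pipeline is
  # producing events regardless of what the OpenSearch side does.
  stdout { codec => rubydebug }
}
```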

1 Answer

I ran into something very similar. Did you ever solve this? :)
