Dropwizard custom AppenderFactory not recognized


I am trying to implement a custom AppenderFactory for the Splunk HTTP Event Collector. I wrote a simple class as follows:

package com.example.app;

import ch.qos.logback.classic.LoggerContext;
import ch.qos.logback.classic.spi.ILoggingEvent;
import ch.qos.logback.core.Appender;
import ch.qos.logback.core.AppenderBase;
import com.fasterxml.jackson.annotation.JsonTypeName;
import io.dropwizard.logging.AbstractAppenderFactory;
import io.dropwizard.logging.async.AsyncAppenderFactory;
import io.dropwizard.logging.filter.LevelFilterFactory;
import io.dropwizard.logging.layout.LayoutFactory;

@JsonTypeName("splunk")
public class SplunkAppenderFactory extends AbstractAppenderFactory<ILoggingEvent> {

    @Override
    public Appender<ILoggingEvent> build(LoggerContext context, String applicationName, LayoutFactory<ILoggingEvent> layoutFactory, LevelFilterFactory<ILoggingEvent> levelFilterFactory, AsyncAppenderFactory<ILoggingEvent> asyncAppenderFactory) {
        System.out.println("Setting up SplunkAppenderFactory!");
        final SplunkAppender appender = new SplunkAppender();
        appender.setName("splunk-appender");
        appender.setContext(context);
        appender.start();

        return wrapAsync(appender, asyncAppenderFactory);
    }
}

class SplunkAppender extends AppenderBase<ILoggingEvent> {

    @Override
    protected void append(ILoggingEvent eventObject) {
        System.out.println("Splunk: "+ eventObject.toString());
    }
}
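
(As an aside, the build() method above never applies the threshold or logFormat values from the YAML. A minimal sketch of honoring the configured threshold, assuming the Dropwizard 1.x LevelFilterFactory and AbstractAppenderFactory#getThreshold() APIs, would be to add one line before appender.start():)

// Sketch only: inside SplunkAppenderFactory.build(), before appender.start().
// getThreshold() is inherited from AbstractAppenderFactory; the filter it builds
// drops events below the "threshold" level configured in the YAML.
appender.addFilter(levelFilterFactory.build(getThreshold()));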

Supposedly I do not have to wire anything up manually, because Dropwizard scans the classpath and wires these factories automatically. But when I run the application, I get this error:

./infrastructure/config/config.yml has an error:
  * Failed to parse configuration at: logging.appenders.[2]; Could not resolve type id 'splunk' into a subtype of [simple type, class io.dropwizard.logging.AppenderFactory]: known type ids = [AppenderFactory, console, file, syslog]
 at [Source: N/A; line: -1, column: -1] (through reference chain: com.example.app.AppConfiguration["logging"]->io.dropwizard.logging.DefaultLoggingFactory["appenders"]->java.util.ArrayList[2])

My app config is as follows:

logging:
  appenders:
      # log format: <Level> - <Time> - <Revision> - <Environment> - <Thread> - <Log Content>
      - type: console
        logFormat: "%level %d{HH:mm:ss.SSS} %mdc{revision} %mdc{environment} '%mdc{user}' %t %logger{5} - %X{code} %msg %n"
        threshold: ${CONSOLE_LOG_LEVEL:-ERROR}
      - type: file
        threshold: INFO
        logFormat: "%level %d{HH:mm:ss.SSS} %mdc{revision} %mdc{environment} '%mdc{user}' %t %logger{5} - %X{code} %msg %n"
        # The file to which current statements will be logged.
        currentLogFilename: ./logs/app.log
        # When the log file rotates, the archived log will be renamed to this and gzipped. The
        # %d is replaced with the previous day (yyyy-MM-dd). Custom rolling windows can be created
        # by passing a SimpleDateFormat-compatible format as an argument: "%d{yyyy-MM-dd-hh}".
        archivedLogFilenamePattern: ./logs/app-%d.log.gz
        # The number of archived files to keep.
        archivedFileCount: 10
        # The timezone used to format dates. HINT: USE THE DEFAULT, UTC.
        timeZone: UTC
      - type: splunk
        logFormat: "%level %d{HH:mm:ss.SSS} %mdc{revision} %mdc{environment} '%mdc{user}' %t %logger{5} - %X{code} %msg %n"
        threshold: INFO

How can I get this to work?

dropwizard
2 Answers

3 votes

You probably need to create a file named

 META-INF/services/io.dropwizard.logging.AppenderFactory

in your project's resources folder. The contents of that file should be the fully qualified name of your appender factory class (or classes):

com.example.app.SplunkAppenderFactory

The core Dropwizard project also ships this file for its default appenders:

https://github.com/dropwizard/dropwizard/blob/v1.1.0/dropwizard-logging/src/main/resources/META-INF/services/io.dropwizard.logging.AppenderFactory
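
(Once that service file is packaged on the classpath, Dropwizard's Jackson setup discovers the factory through it. A quick sanity check, sketched here assuming Dropwizard 1.x's io.dropwizard.jackson.DiscoverableSubtypeResolver API, can confirm the file is visible; the class name DiscoveryCheck is just for illustration:)

package com.example.app;

import io.dropwizard.jackson.DiscoverableSubtypeResolver;

// Sketch: prints whether SplunkAppenderFactory is picked up via the
// META-INF/services discovery mechanism that Dropwizard's ObjectMapper uses.
public class DiscoveryCheck {
    public static void main(String[] args) {
        final DiscoverableSubtypeResolver resolver = new DiscoverableSubtypeResolver();
        final boolean found = resolver.getDiscoveredSubtypes().contains(SplunkAppenderFactory.class);
        System.out.println("SplunkAppenderFactory discovered: " + found);
    }
}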


0 votes

Even after adding that configuration it still does not work. Could you help?
