I want to switch from HDFS to the s3a client. For that I need to upgrade from Hadoop 2.8.5 to at least 3.1.2, because I need the AssumedRoleCredentialProvider for AWS access. According to Table 5 in the HBase documentation, Hadoop 3.1.2 should be compatible with HBase 2.2.3.
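For context, this is roughly the s3a setup I am aiming for once the upgrade works. A minimal sketch, assuming Hadoop 3.1+ with hadoop-aws on the classpath; the role ARN and bucket name below are placeholders, not real values:

import java.net.URI;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;

public class S3aAssumedRoleExample {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Use STS assumed-role credentials for all s3a access.
        conf.set("fs.s3a.aws.credentials.provider",
            "org.apache.hadoop.fs.s3a.auth.AssumedRoleCredentialProvider");
        // Placeholder ARN of the role to assume.
        conf.set("fs.s3a.assumed.role.arn",
            "arn:aws:iam::123456789012:role/example-role");
        // Placeholder bucket; also needs the matching aws-java-sdk-bundle on the classpath.
        FileSystem fs = FileSystem.get(URI.create("s3a://example-bucket/"), conf);
        System.out.println(fs.getUri());
    }
}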
I cannot get hbase-testing-util to run with the upgraded Hadoop client: the mini cluster does not start up correctly.
Code:
utility = new HBaseTestingUtility(null);
utility.startMiniCluster();
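The full test is essentially just that; a stripped-down sketch of how it is wired up (class name, table name and column family are placeholders, JUnit 4 assumed):

import org.apache.hadoop.hbase.HBaseTestingUtility;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.util.Bytes;
import org.junit.AfterClass;
import org.junit.BeforeClass;
import org.junit.Test;

public class MiniClusterTest {
    private static HBaseTestingUtility utility;

    @BeforeClass
    public static void setUp() throws Exception {
        utility = new HBaseTestingUtility(null);
        utility.startMiniCluster(); // fails here after the Hadoop upgrade
    }

    @Test
    public void createsTable() throws Exception {
        // Placeholder table/family, just to exercise the running mini cluster.
        utility.createTable(TableName.valueOf("test"), Bytes.toBytes("cf"));
    }

    @AfterClass
    public static void tearDown() throws Exception {
        utility.shutdownMiniCluster();
    }
}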
mvn clean install
Exception:
java.lang.NoClassDefFoundError: Could not initialize class org.apache.hadoop.hdfs.server.namenode.FSDirectory
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.<init>(FSNamesystem.java:871)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.<init>(FSNamesystem.java:724)
at org.apache.hadoop.hdfs.server.namenode.NameNode.format(NameNode.java:1103)
at org.apache.hadoop.hdfs.server.namenode.NameNode.format(NameNode.java:376)
at org.apache.hadoop.hdfs.DFSTestUtil.formatNameNode(DFSTestUtil.java:233)
at org.apache.hadoop.hdfs.MiniDFSCluster.createNameNodesAndSetConf(MiniDFSCluster.java:1027)
at org.apache.hadoop.hdfs.MiniDFSCluster.initMiniDFSCluster(MiniDFSCluster.java:830)
at org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:759)
at org.apache.hadoop.hbase.HBaseTestingUtility.startMiniDFSCluster(HBaseTestingUtility.java:671)
at org.apache.hadoop.hbase.HBaseTestingUtility.startMiniDFSCluster(HBaseTestingUtility.java:643)
at org.apache.hadoop.hbase.HBaseTestingUtility.startMiniCluster(HBaseTestingUtility.java:1096)
at org.apache.hadoop.hbase.HBaseTestingUtility.startMiniCluster(HBaseTestingUtility.java:1071)
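As far as I understand it, "Could not initialize class" means the static initializer of FSDirectory already failed once, which here presumably comes from mixing Hadoop 2.x and 3.x artifacts on the test classpath (hbase-testing-util 2.2.3 still pulls in Hadoop 2.8.5 transitively). A small diagnostic sketch, not part of the original test, to see which jar the class would be loaded from without triggering its initializer:

// Locate the jar that provides FSDirectory on the current classpath.
java.net.URL location = Thread.currentThread().getContextClassLoader()
    .getResource("org/apache/hadoop/hdfs/server/namenode/FSDirectory.class");
System.out.println(location);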
mvn clean install -Dhadoop.profile=3.0 -Dhadoop-three.version=3.1.2
Exception:
java.lang.NoSuchMethodError: org.eclipse.jetty.server.session.SessionHandler.getSessionManager()Lorg/eclipse/jetty/server/SessionManager;
at org.apache.hadoop.http.HttpServer2.initializeWebServer(HttpServer2.java:569)
at org.apache.hadoop.http.HttpServer2.<init>(HttpServer2.java:550)
at org.apache.hadoop.http.HttpServer2.<init>(HttpServer2.java:117)
at org.apache.hadoop.http.HttpServer2$Builder.build(HttpServer2.java:425)
at org.apache.hadoop.hdfs.server.namenode.NameNodeHttpServer.start(NameNodeHttpServer.java:160)
at org.apache.hadoop.hdfs.server.namenode.NameNode.startHttpServer(NameNode.java:869)
at org.apache.hadoop.hdfs.server.namenode.NameNode.initialize(NameNode.java:691)
at org.apache.hadoop.hdfs.server.namenode.NameNode.<init>(NameNode.java:937)
at org.apache.hadoop.hdfs.server.namenode.NameNode.<init>(NameNode.java:910)
at org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1643)
at org.apache.hadoop.hdfs.MiniDFSCluster.createNameNode(MiniDFSCluster.java:1313)
at org.apache.hadoop.hdfs.MiniDFSCluster.configureNameService(MiniDFSCluster.java:1082)
at org.apache.hadoop.hdfs.MiniDFSCluster.createNameNodesAndSetConf(MiniDFSCluster.java:957)
at org.apache.hadoop.hdfs.MiniDFSCluster.initMiniDFSCluster(MiniDFSCluster.java:889)
at org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:805)
at org.apache.hadoop.hbase.HBaseTestingUtility.startMiniDFSCluster(HBaseTestingUtility.java:671)
at org.apache.hadoop.hbase.HBaseTestingUtility.startMiniDFSCluster(HBaseTestingUtility.java:643)
at org.apache.hadoop.hbase.HBaseTestingUtility.startMiniCluster(HBaseTestingUtility.java:1096)
at org.apache.hadoop.hbase.HBaseTestingUtility.startMiniCluster(HBaseTestingUtility.java:1071)
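This NoSuchMethodError looks like a Jetty conflict: Hadoop 3.1.2's HttpServer2 expects the Jetty 9.3.x SessionHandler API (which still has getSessionManager()), while a newer Jetty apparently wins on the test classpath where that method no longer exists. To see which Jetty artifacts actually get resolved (a diagnostic, not a fix):

mvn dependency:tree -Dincludes=org.eclipse.jetty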
I am currently running the following working setup, which breaks when 2.8.5 is replaced with 3.1.2:
<hbase.version>2.2.3</hbase.version>
<hadoop.version>2.8.5</hadoop.version>
...
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-client</artifactId>
    <version>${hadoop.version}</version>
    <exclusions>
        <exclusion>
            <groupId>commons-io</groupId>
            <artifactId>commons-io</artifactId>
        </exclusion>
    </exclusions>
</dependency>
<dependency>
    <groupId>org.apache.hbase</groupId>
    <artifactId>hbase-client</artifactId>
    <version>${hbase.version}</version>
    <exclusions>
        <exclusion>
            <groupId>org.slf4j</groupId>
            <artifactId>*</artifactId>
        </exclusion>
        <exclusion>
            <groupId>log4j</groupId>
            <artifactId>log4j</artifactId>
        </exclusion>
    </exclusions>
</dependency>
<dependency>
    <groupId>org.apache.hbase</groupId>
    <artifactId>hbase-testing-util</artifactId>
    <version>${hbase.version}</version>
    <scope>test</scope>
    <exclusions>
        <exclusion>
            <artifactId>servlet-api</artifactId>
            <groupId>javax.servlet</groupId>
        </exclusion>
        <exclusion>
            <groupId>org.glassfish</groupId>
            <artifactId>javax.el</artifactId>
        </exclusion>
    </exclusions>
</dependency>
I have already tried excluding and manually including various versions of several dependencies, but to no avail.
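A representative sketch of the kind of thing I tried, e.g. pinning Jetty to the 9.3 line that Hadoop 3.1.2 is built against (the exact artifact and version here are illustrative; this kind of pinning did not help either):

<dependencyManagement>
    <dependencies>
        <dependency>
            <groupId>org.eclipse.jetty</groupId>
            <artifactId>jetty-server</artifactId>
            <version>9.3.24.v20180605</version>
        </dependency>
    </dependencies>
</dependencyManagement>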
The solution was to clone the Apache HBase repository and compile it with the Hadoop 3 profile:
mvn versions:set -DnewVersion=2.2.3-hadoop3
mvn -Dhadoop.profile=3.0 -Dhadoop-three.version=3.1.2 -DskipTests package deploy
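With the rebuilt artifacts deployed, the project pom can then point at them, presumably along the lines of:

<hbase.version>2.2.3-hadoop3</hbase.version>
<hadoop.version>3.1.2</hadoop.version>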