Environment: vanilla Hadoop; Kerberos-enabled Hive; deploy-mode: yarn-client; the Kerberos config/keytab files are placed on every Hadoop node.
Process: I overrode the Spark JDBC source and use it to connect to Hive. Authentication is performed before connecting and succeeds, but when the executor executes connect it throws an exception:
SQLException: could not open client for any of server uri is zookeeper: null
Question: how do I solve this error? I tried setting up UserGroupInformation (the login succeeds) and setting the Spark extraJavaOptions, but it has no effect.
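For reference, the "zookeeper: null" part of the error typically appears when the HiveServer2 ZooKeeper service-discovery parameters never reach the connection attempt. A minimal sketch of what a discovery-enabled Kerberos JDBC URL looks like (the hosts, namespace, and principal below are hypothetical placeholders, not values from this setup):

```java
public class Hs2UrlSketch {
    // Build a HiveServer2 JDBC URL that uses ZooKeeper service discovery.
    // All values are caller-supplied placeholders; nothing is read from a cluster.
    static String buildUrl(String zkQuorum, String db, String zkNamespace, String principal) {
        return "jdbc:hive2://" + zkQuorum + "/" + db
                + ";serviceDiscoveryMode=zooKeeper"
                + ";zooKeeperNamespace=" + zkNamespace
                + ";principal=" + principal;
    }

    public static void main(String[] args) {
        // Hypothetical ZooKeeper quorum and Kerberos principal, for illustration only.
        String url = buildUrl("zk1:2181,zk2:2181,zk3:2181", "default",
                "hiveserver2", "hive/_HOST@EXAMPLE.COM");
        System.out.println(url);
    }
}
```

If the URL the executor actually uses is missing the serviceDiscoveryMode/zooKeeperNamespace parts (or they resolve to null there), discovery fails with an error like the one above.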
Here is the authentication code:
public static void initKerberos() {
    try {
        // Paths to the krb5.conf and keytab shipped to each node
        String configPath = "/opt/hbaseConfig/tx/" + krbConfig;
        String keytabPath = "/opt/hbaseConfig/tx/" + krbKeytab;
        System.setProperty("java.security.krb5.conf", configPath);

        // Tell Hadoop to use Kerberos authentication
        Configuration conf = new Configuration();
        conf.set("hadoop.security.authentication", "kerberos");
        UserGroupInformation.setConfiguration(conf);

        // Log in from the keytab; this only affects the current JVM
        UserGroupInformation.loginUserFromKeytab(krbUser, keytabPath);
    } catch (Exception e) {
        logger.error("Kerberos authentication failed", e);
    }
}
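Note that in yarn-client mode a keytab login like the one above runs only in the driver JVM; the executors never execute it, which matches the symptom of connect failing only on executors. A hedged sketch of how this is usually handled at submit time instead (the principal, keytab name, and jar below are placeholders, not values from this post):

```shell
# Sketch only: placeholder principal/keytab/jar names, adjust to your cluster.
# --principal/--keytab let Spark manage Kerberos tickets for the whole app,
# and extraJavaOptions points every JVM at the same krb5.conf.
spark-submit \
  --master yarn \
  --deploy-mode client \
  --principal user@EXAMPLE.COM \
  --keytab /opt/hbaseConfig/tx/user.keytab \
  --files /opt/hbaseConfig/tx/krb5.conf \
  --conf "spark.driver.extraJavaOptions=-Djava.security.krb5.conf=/opt/hbaseConfig/tx/krb5.conf" \
  --conf "spark.executor.extraJavaOptions=-Djava.security.krb5.conf=/opt/hbaseConfig/tx/krb5.conf" \
  your-app.jar
```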