I have a Hive table that points to an HBase table. I have a Spark job that creates a dataset whose schema matches the HBase table. I save this dataframe to the HBase table with the following command.
sql.write().format("org.apache.phoenix.spark")
.mode(SaveMode.Overwrite).option("table", targetTable)
.option("zkUrl", "localhost:2181:/hbase-unsecure)
.insertInto(targetTable);
When it executes, I get the error below.
java.lang.NullPointerException
    at org.apache.phoenix.hive.PhoenixStorageHandler.configureJobProperties(PhoenixStorageHandler.java:185)
    at org.apache.phoenix.hive.PhoenixStorageHandler.configureOutputJobProperties(PhoenixStorageHandler.java:130)
    at org.apache.spark.sql.hive.HiveTableUtil$.configureJobPropertiesForStorageHandler(TableReader.scala:324)
    at org.apache.spark.sql.hive.SparkHiveWriterContainer.<init>(hiveWriterContainers.scala:67)
    at org.apache.spark.sql.hive.execution.InsertIntoHiveTable.sideEffectResult$lzycompute(InsertIntoHiveTable.scala:226)
    at org.apache.spark.sql.hive.execution.InsertIntoHiveTable.sideEffectResult(InsertIntoHiveTable.scala:142)
    at org.apache.spark.sql.hive.execution.InsertIntoHiveTable.doExecute(InsertIntoHiveTable.scala:310)
    at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:115)
    at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:115)
    at org.apache.spark.sql.execution.SparkPlan$$anonfun$executeQuery$1.apply(SparkPlan.scala:136)
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
    at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:133)
    at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:114)
    at org.apache.spark.sql.execution.QueryExecution.toRdd$lzycompute(QueryExecution.scala:86)
    at org.apache.spark.sql.execution.QueryExecution.toRdd(QueryExecution.scala:86)
    at org.apache.spark.sql.DataFrameWriter.insertInto(DataFrameWriter.scala:259)
    at org.apache.spark.sql.DataFrameWriter.insertInto(DataFrameWriter.scala:239)
    at com.lti.unitrax.data.load.IncrementalHiveTableLoadUnitraxMain.fullDataLoad(IncrementalHiveTableLoadUnitraxMain.java:166)
    at com.lti.unitrax.data.load.TestDataLoad.main(TestDataLoad.java:38)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$2.run(ApplicationMaster.scala:627)
Any help is greatly appreciated.
I am using Spark 2 on an HDP cluster.
I know I am late to the game, but I came across this post and thought my answer might help someone.
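For anyone hitting the same trace: below is a minimal sketch, not a definitive fix, of how a write through the phoenix-spark connector is typically issued, using save() instead of insertInto(). insertInto() routes the write through Spark's Hive insert path (the InsertIntoHiveTable frames in the stack trace), which is where PhoenixStorageHandler throws the NPE, whereas save() hands the rows to the org.apache.phoenix.spark data source directly. The dataset name, table name, and zkUrl are taken from the question and should be adapted to your environment; the helper class and method are hypothetical.

import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SaveMode;

public class PhoenixWriteSketch {
    // Hypothetical helper; the dataset and target table name come from your own job.
    public static void writeToPhoenix(Dataset<Row> sql, String targetTable) {
        sql.write()
           .format("org.apache.phoenix.spark")                // Phoenix Spark data source
           .mode(SaveMode.Overwrite)                          // phoenix-spark writes are upserts
           .option("table", targetTable)                      // target Phoenix table
           .option("zkUrl", "localhost:2181:/hbase-unsecure") // ZooKeeper quorum and znode
           .save();                                           // bypasses the Hive insert path
    }
}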