I am following this solution to load an external table into Impala, because I get the same error if I load the data by referencing the file directly.
So, if I run:
[quickstart.cloudera:21000] > create external table Police2 (Priority string,Call_Type string,Jurisdiction string,Dispatch_Area string,Received_Date string,Received_Time int,Dispatch_Time int,Arrival_Time int,Cleared_Time int,Disposition string) row format delimited
> fields terminated by ','
> STORED as TEXTFILE
> location '/user/cloudera/rdpdata/rpd_data_all.csv' ;
I get:
Query: create external table Police2 (Priority string,Call_Type string,Jurisdiction string,Dispatch_Area string,Received_Date string,Received_Time int,Dispatch_Time int,Arrival_Time int,Cleared_Time int,Disposition string) row format delimited
fields terminated by ','
STORED as TEXTFILE
location '/user/cloudera/rdpdata/rpd_data_all.csv'
ERROR: ImpalaRuntimeException: Error making 'createTable' RPC to Hive Metastore:
CAUSED BY: MetaException: hdfs://quickstart.cloudera:8020/user/cloudera/rdpdata/rpd_data_all.csv is not a directory or unable to create one
And if I run the following command instead, nothing gets imported:
[quickstart.cloudera:21000] > create external table Police2 (Priority string,Call_Type string,Jurisdiction string,Dispatch_Area string,Received_Date string,Received_Time int,Dispatch_Time int,Arrival_Time int,Cleared_Time int,Disposition string) row format delimited
> fields terminated by ','
> location '/user/cloudera/rdpdata' ;
Query: create external table Police2 (Priority string,Call_Type string,Jurisdiction string,Dispatch_Area string,Received_Date string,Received_Time int,Dispatch_Time int,Arrival_Time int,Cleared_Time int,Disposition string) row format delimited
fields terminated by ','
location '/user/cloudera/rdpdata'
Fetched 0 row(s) in 1.01s
And the contents of the folder:
[cloudera@quickstart ~]$ hadoop fs -ls /user/cloudera/rdpdata
Found 1 items
-rwxrwxrwx 1 cloudera cloudera 75115191 2020-09-02 19:36 /user/cloudera/rdpdata/rpd_data_all.csv
And the contents of the file:
[cloudera@quickstart ~]$ hadoop fs -cat /user/cloudera/rdpdata/rpd_data_all.csv
1,EMSP,RP,RC, 03/21/2013,095454,000000,000000,101659,CANC
The LOCATION clause in an Impala CREATE TABLE statement specifies the HDFS directory (hdfs_path) where the table's data files are stored. Point it at the directory rather than at the file name, and your existing data will be picked up.
For your reference: https://impala.apache.org/docs/build/html/topics/impala_tables.html
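Applied to the table from the question, that means keeping the same columns but pointing LOCATION at the folder that contains the CSV (a sketch, using the paths shown above):

```sql
create external table Police2 (
  Priority string, Call_Type string, Jurisdiction string,
  Dispatch_Area string, Received_Date string, Received_Time int,
  Dispatch_Time int, Arrival_Time int, Cleared_Time int,
  Disposition string)
row format delimited
fields terminated by ','
stored as textfile
-- directory, not /user/cloudera/rdpdata/rpd_data_all.csv
location '/user/cloudera/rdpdata';
```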
If you have a file on the local filesystem of the node, for example:
/tmp/data/data1.csv
try this:
sudo -u hdfs hadoop fs -mkdir -p /test/stage_data/data1
hadoop fs -copyFromLocal -f /tmp/data/data1.csv /test/stage_data/data1
drop table if exists data1;
create external table data1 (
f1 string,
f2 string,
f3 string,
f4 string ) row format delimited fields terminated by ',' location '/test/stage_data/data1';
Table location: it is a folder, and Impala expects a set of CSV files inside that folder; each file will be treated as a data file of the table.
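Because the location is a folder, you can add more CSV files to it later and make them visible to Impala with a REFRESH; a sketch (the extra file name is illustrative):

```shell
# copy an additional data file into the table's folder
hadoop fs -copyFromLocal rpd_data_more.csv /user/cloudera/rdpdata/
# tell Impala to pick up the new file
impala-shell -q "REFRESH Police2"
```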