Error: Error while compiling statement: FAILED: SemanticException Unable to load data to destination
A new table was added to the ODS layer, in the same style as the existing tables:
DROP TABLE IF EXISTS ods_students_industry_level;
CREATE TABLE `ods_students_industry_level` (
`id` INT COMMENT '编号',
`first_industry` STRING COMMENT '一级行业',
`second_industry` STRING COMMENT '二级行业',
`parent_id` INT COMMENT '父级id'
) COMMENT '行业级别信息表'
PARTITIONED BY (`dt` STRING)
ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t'
LOCATION '/hive/warehouse/ods/ods_students_industry_level';
Loading data from HDFS into the table failed.
The load statement:
load data inpath '/origin_data/sqoop/db/students_firstindustry/2022-04-12' OVERWRITE into table ods_students_industry_level partition(dt='2022-04-12');
The load failed with:
Error: Error while compiling statement: FAILED: SemanticException Unable to load data to destination table. Error: The file that you are trying to load does not match the file format of the destination table
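The key point behind this error is that `LOAD DATA` only moves files into the table's directory; it never converts them. So the source files must already match the table's storage format. A quick way to confirm the mismatch from the Hive CLI (a diagnostic sketch; the table name is the one from this article):

```sql
-- LOAD DATA does not convert files, so the Sqoop-exported tab-delimited text
-- can only be loaded into a TextFile table.
DESCRIBE FORMATTED ods_students_industry_level;
-- Look at the "InputFormat:" row: OrcInputFormat means only ORC files
-- can be loaded into this table.
```

Here the table turns out to expect ORC, while the files under `/origin_data/sqoop/db/students_firstindustry/2022-04-12` are plain text.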
So the file format does not match. Check the DDL Hive actually generated for the new table:
hive> show create table ods_students_industry_level;
+----------------------------------------------------+
| createtab_stmt |
+----------------------------------------------------+
| CREATE TABLE `ods_students_industry_level`( |
| `id` int COMMENT '编号', |
| `first_industry` string COMMENT '一级行业', |
| `second_industry` string COMMENT '二级行业', |
| `parent_id` int COMMENT '父级id') |
| COMMENT '行业级别信息表' |
| PARTITIONED BY ( |
| `dt` string) |
| ROW FORMAT SERDE |
| 'org.apache.hadoop.hive.ql.io.orc.OrcSerde' |
| WITH SERDEPROPERTIES ( |
| 'field.delim'='\t', |
| 'serialization.format'='\t') |
| STORED AS INPUTFORMAT |
| 'org.apache.hadoop.hive.ql.io.orc.OrcInputFormat' |
| OUTPUTFORMAT |
| 'org.apache.hadoop.hive.ql.io.orc.OrcOutputFormat' |
| LOCATION |
| 'hdfs://hdp-01:8020/hive/warehouse/ods/ods_students_industry_level' |
| TBLPROPERTIES ( |
| 'bucketing_version'='2', |
| 'transactional'='true', |
| 'transactional_properties'='default', |
| 'transient_lastDdlTime'='1649831799') |
+----------------------------------------------------+
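Although the table was created with `ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t'`, Hive produced a transactional ORC table (`'transactional'='true'`, OrcInputFormat). On Hive 3.x distributions (e.g. HDP 3), managed tables are often created as ACID ORC tables by default; external tables are exempt. A hedged way to check the relevant settings (property names vary by distribution and version):

```sql
-- Print the settings that, on some Hive 3 distributions, make a plain
-- CREATE TABLE default to a managed transactional (ACID/ORC) table.
-- These property names are distribution-dependent assumptions:
SET hive.strict.managed.tables;
SET hive.create.as.insert.only;
SET hive.txn.manager;
-- TBLPROPERTIES ('transactional'='true') in the DDL above already confirms
-- the table was created as an ACID table.
```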
Compare with the DDL of a previously created table:
+----------------------------------------------------+
| createtab_stmt |
+----------------------------------------------------+
| CREATE EXTERNAL TABLE `ods_dh_userlevels`( |
| `id` int COMMENT '??', |
| `name` string COMMENT '???', |
| `weight` int, |
| `visirble` int, |
| `price` int COMMENT '??', |
| `create_time` string COMMENT '????', |
| `describe` string COMMENT '????', |
| `priceclassify` int, |
| `remarks` string) |
| COMMENT 'datahoop???????' |
| PARTITIONED BY ( |
| `dt` string) |
| ROW FORMAT SERDE |
| 'org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe' |
| WITH SERDEPROPERTIES ( |
| 'field.delim'='\t', |
| 'serialization.format'='\t') |
| STORED AS INPUTFORMAT |
| 'org.apache.hadoop.mapred.TextInputFormat' |
| OUTPUTFORMAT |
| 'org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat' |
| LOCATION |
| 'hdfs://hdp-01:8020/text/hive/ods/ods_dh_userlevels' |
| TBLPROPERTIES ( |
| 'bucketing_version'='2', |
| 'discover.partitions'='true', |
| 'transient_lastDdlTime'='1648629277') |
+----------------------------------------------------+
Comparing the two DDLs, the storage format has changed: the new table was created as transactional ORC, while the old table is plain text (LazySimpleSerDe / TextInputFormat).
Hive's default storage format is TextFile.
The fix is to specify the storage format explicitly for the new table by adding STORED AS TEXTFILE to the CREATE TABLE statement.
The new create-table statement:
DROP TABLE IF EXISTS ods_students_industry_level;
CREATE TABLE `ods_students_industry_level` (
`id` INT COMMENT '编号',
`first_industry` STRING COMMENT '一级行业',
`second_industry` STRING COMMENT '二级行业',
`parent_id` INT COMMENT '父级id'
) COMMENT '行业级别信息表'
PARTITIONED BY (`dt` STRING)
ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t'
STORED AS TEXTFILE
LOCATION '/hive/warehouse/ods/ods_students_industry_level';
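Note that the old `ods_dh_userlevels` table was also declared `EXTERNAL`. Since external tables are not auto-converted to transactional ORC, declaring the new ODS table external as well (a sketch mirroring the old table's pattern) is an alternative way to keep it as plain text:

```sql
-- Alternative sketch: EXTERNAL tables are left as TextFile even when the
-- cluster creates managed tables as ACID ORC by default.
CREATE EXTERNAL TABLE `ods_students_industry_level` (
  `id` INT COMMENT '编号',
  `first_industry` STRING COMMENT '一级行业',
  `second_industry` STRING COMMENT '二级行业',
  `parent_id` INT COMMENT '父级id'
) COMMENT '行业级别信息表'
PARTITIONED BY (`dt` STRING)
ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t'
STORED AS TEXTFILE
LOCATION '/hive/warehouse/ods/ods_students_industry_level';
```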
This time the load succeeds.
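If ORC storage were actually wanted for this table, `LOAD DATA` would still not work against it. A common pattern (a sketch, with a hypothetical staging table name) is to load the text files into a TextFile staging table first and let an INSERT do the conversion:

```sql
-- Hypothetical staging table: LOAD accepts the tab-delimited text here,
-- and the INSERT below rewrites the rows in the target table's ORC format.
CREATE TABLE tmp_students_industry_level (
  `id` INT, `first_industry` STRING, `second_industry` STRING, `parent_id` INT
)
ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t'
STORED AS TEXTFILE;

LOAD DATA INPATH '/origin_data/sqoop/db/students_firstindustry/2022-04-12'
OVERWRITE INTO TABLE tmp_students_industry_level;

INSERT OVERWRITE TABLE ods_students_industry_level PARTITION (dt='2022-04-12')
SELECT id, first_industry, second_industry, parent_id
FROM tmp_students_industry_level;
```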