hive-issue-inserting-records-to-partitioned-table
2024-09-18 23:49:21
Hi Sam,
Recently we upgraded our cluster from HDP 2.5.6 to HDP 2.6.4 and I am seeing a similar error. The newer version of Hive is stricter about INSERT OVERWRITE TABLE: if you delete the data directory before loading the table but do not also drop the partition, the INSERT OVERWRITE TABLE will fail.
To get around it, either:
Delete the data and drop the partition before running the INSERT OVERWRITE TABLE,
or, for an external table, don't delete the data or drop the partition at all; let the INSERT OVERWRITE TABLE replace it.
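As a rough sketch, the first workaround might look like this in HiveQL (the table, staging source, and partition values here are hypothetical placeholders, not taken from the original error; adjust them to your schema):

```sql
-- Drop the stale partition metadata first
-- (hypothetical table/partition names for illustration).
ALTER TABLE sales_fact DROP IF EXISTS PARTITION (pk_ppweek = 2487);

-- If the data directory still exists on HDFS, remove it as well, e.g.:
--   hdfs dfs -rm -r -skipTrash <path-to>/pk_ppweek=2487

-- Now the overwrite can recreate the partition cleanly.
INSERT OVERWRITE TABLE sales_fact PARTITION (pk_ppweek = 2487)
SELECT * FROM sales_staging WHERE ppweek = 2487;
```

With the second workaround, you would simply skip both the DROP PARTITION and the manual HDFS delete and let INSERT OVERWRITE TABLE replace the partition's contents on its own.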
Regards
Khaja Hussain.
Similar Error:
Caused by: java.util.concurrent.ExecutionException: org.apache.hadoop.hive.ql.metadata.HiveException: Destination directory hdfs://data_dir/pk_business=bsc/pk_data_source=pos/pk_frequency=bnw/pk_data_state=c13251_ps2111_bre000_pfc00000_spr000_pfs00000/pk_reporttype=BNN/pk_ppweek=2487 has not be cleaned up.