SparkException: Dynamic partition strict mode requires at least one static partition column

2019-08-10  juddar

Problem description:

Running the following command in the spark-shell console to insert data into a Hive partitioned table:

scala> import org.apache.spark.sql.SaveMode
scala> spark.sql("select usrid,age from w_a").map(x => (x.getString(0), x.getString(1), getNowDate)).toDF("usrid", "age", "dt").write.mode(SaveMode.Append).insertInto("w_d")

It fails with the following error:

org.apache.spark.SparkException: Dynamic partition strict mode requires at least one static partition column. To turn this off set hive.exec.dynamic.partition.mode=nonstrict 
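As the error message itself suggests, one option is to supply at least one static partition value instead of relaxing the mode. A minimal sketch, assuming `w_d` is partitioned by `dt` (the literal date below is illustrative):

```scala
// Hypothetical static-partition insert: dt is pinned to a literal value,
// so Hive's strict mode is satisfied without changing any settings.
spark.sql("insert into table w_d partition (dt = '2019-08-10') select usrid, age from w_a")
```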

The fix used here is to switch Hive's dynamic partition mode to non-strict:

sqlContext.setConf("hive.exec.dynamic.partition.mode","nonstrict");

In Spark 2.0 and later, sqlContext is wrapped inside SparkSession, so set it like this:

spark.sqlContext.setConf("hive.exec.dynamic.partition.mode","nonstrict")
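Putting it together, a full spark-shell session might look like the sketch below. It assumes `w_d` is partitioned by `dt` and that `getNowDate` is a date-formatting helper defined earlier in the session (it is not shown in the original post):

```scala
import org.apache.spark.sql.SaveMode

// Allow fully dynamic partition values, so no static partition column is required.
spark.sqlContext.setConf("hive.exec.dynamic.partition.mode", "nonstrict")

// Re-run the insert; the dt partition is now resolved dynamically per row.
spark.sql("select usrid, age from w_a")
  .map(x => (x.getString(0), x.getString(1), getNowDate)) // getNowDate: user-defined helper
  .toDF("usrid", "age", "dt")
  .write
  .mode(SaveMode.Append)
  .insertInto("w_d")
```

Note that `insertInto` matches columns by position, not by name, so the DataFrame's column order must match the Hive table's schema, with the partition column (`dt`) last.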
