Spark SQL: writing to a Hive table when column names or types do not match
Solution: read the target Hive table's column metadata from the Spark catalog and compare it, column by column, against the source data's schema before writing, so a mismatch fails fast instead of producing a corrupted write.
// Read the target Hive table's schema from the Spark catalog:
// column name, data type string, and whether it is a partition/bucket column.
val targetTableSchemaArray = spark.catalog.listColumns(dbName, tableName)
  .select("name", "dataType", "isPartition", "isBucket")
  .rdd.map(catalog => {
    val name = catalog.getAs[String]("name")
    val typeName = catalog.getAs[String]("dataType")
    val isPartition = catalog.getAs[Boolean]("isPartition")
    val isBucket = catalog.getAs[Boolean]("isBucket")
    (name, typeName, isPartition, isBucket)
  }).collect()
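The comparison below relies on a sourceSchemaMap that maps each source column name to its type string. It is not defined in the original snippet; a minimal sketch of how it could be built, assuming the data to be written lives in a DataFrame named sourceDf (a hypothetical name), might look like this:

// Hypothetical helper: build a name -> type-string map from the source DataFrame's schema.
// catalogString produces strings such as "int", "bigint", "decimal(10,2)", which is assumed
// to match the dataType format returned by spark.catalog.listColumns above.
val sourceSchemaMap: Map[String, String] = sourceDf.schema.fields
  .map(f => f.name -> f.dataType.catalogString)
  .toMap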
// For each target column, check that the source provides a column with the same
// name and type; if not, throw an exception with the details of the mismatch.
targetTableSchemaArray.foreach(x => {
  val name = x._1
  val typeName = x._2
  val sourceTypeName = sourceSchemaMap.getOrElse(name, "None")
  sourceTypeName match {
    case "None" => throw new Exception(s"Source table does not contain column ${name}")
    // Backticks compare against the existing typeName value; without them this case
    // would bind a new variable and match everything, making the last case unreachable.
    case `typeName` => println("yes")
    case _ => throw new Exception(s"Inconsistent table structure, details: spark -> ${name}:${sourceTypeName} \t hive -> ${name}:${typeName}")
  }
})
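Once the schema check passes, one way to make the actual write more robust is to reorder the source columns into the target table's column order, because DataFrameWriter.insertInto resolves columns by position rather than by name. A possible sketch, reusing targetTableSchemaArray and the assumed sourceDf, and assuming listColumns returns columns in table order with partition columns last:

import org.apache.spark.sql.functions.col

// Select the source columns in the target table's order before inserting,
// since insertInto matches columns positionally, not by name.
val orderedCols = targetTableSchemaArray.map(_._1).map(col)
sourceDf.select(orderedCols: _*)
  .write
  .mode("append")
  .insertInto(s"${dbName}.${tableName}")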