How to resolve the SparkSQL tblproperties exception when reading a Hive table
This article walks through resolving the tblproperties exception that SparkSQL raises when reading a Hive table, a problem that comes up often enough in practice to be worth documenting.
Cluster environment
spark-sql errors out when reading a Parquet-format Hive table
The same Hive Parquet table reads fine from both Hive and Impala; only spark-sql throws an error.
Exception message
com.fasterxml.jackson.core.JsonParseException: Unexpected end-of-input within/between Object entries
 at [Source: (String)"{"type":"struct","fields":[{"name":"timestamp","type":"string","nullable":true,"metadata":{"HIVE_TYPE_STRING":"string"}},{"name":"xxx","type":"string","nullable":true,"metadata":{"HIVE_TYPE_STRING":"string"}},{"name":"xxx","type":"string","nullable":true,"; line: 1, column: 513]
	at com.fasterxml.jackson.core.JsonParser._constructError(JsonParser.java:1804)
	at com.fasterxml.jackson.core.json.ReaderBasedJsonParser._skipAfterComma2(ReaderBasedJsonParser.java:2323)
	at com.fasterxml.jackson.core.json.ReaderBasedJsonParser._skipComma(ReaderBasedJsonParser.java:2293)
	at com.fasterxml.jackson.core.json.ReaderBasedJsonParser.nextToken(ReaderBasedJsonParser.java:664)
	at org.json4s.jackson.JValueDeserializer.deserialize(JValueDeserializer.scala:47)
	at org.json4s.jackson.JValueDeserializer.deserialize(JValueDeserializer.scala:39)
	at org.json4s.jackson.JValueDeserializer.deserialize(JValueDeserializer.scala:32)
	at org.json4s.jackson.JValueDeserializer.deserialize(JValueDeserializer.scala:46)
	at org.json4s.jackson.JValueDeserializer.deserialize(JValueDeserializer.scala:39)
	at com.fasterxml.jackson.databind.ObjectReader._bindAndClose(ObjectReader.java:1611)
	at com.fasterxml.jackson.databind.ObjectReader.readValue(ObjectReader.java:1219)
	at org.json4s.jackson.JsonMethods$class.parse(JsonMethods.scala:25)
	at org.json4s.jackson.JsonMethods$.parse(JsonMethods.scala:55)
	at org.apache.spark.sql.types.DataType$.fromJson(DataType.scala:127)
	at org.apache.spark.sql.hive.HiveExternalCatalog$.org$apache$spark$sql$hive$HiveExternalCatalog$$getSchemaFromTableProperties(HiveExternalCatalog.scala:1382)
	at org.apache.spark.sql.hive.HiveExternalCatalog.restoreDataSourceTable(HiveExternalCatalog.scala:845)
	at org.apache.spark.sql.hive.HiveExternalCatalog.org$apache$spark$sql$hive$HiveExternalCatalog$$restoreTableMetadata(HiveExternalCatalog.scala:765)
	at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$getTable$1.apply(HiveExternalCatalog.scala:734)
	at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$getTable$1.apply(HiveExternalCatalog.scala:734)
The tblproperties values are incomplete. Spark serializes the table schema as JSON into the table properties (the stack trace shows the failure inside getSchemaFromTableProperties, which parses that JSON), and the JSON in the error message breaks off mid-field. The suspicion is that the metastore table storing table properties truncates long values. Inspecting the TABLE_PARAMS table in the metastore database confirms it: the PARAM_VALUE column is only 256 characters long, so the stored schema JSON was cut off. Problem found.
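To see why the truncation produces exactly this "Unexpected end-of-input" class of error, here is a minimal Python sketch. It builds a schema JSON of the same shape Spark stores in the table properties (the column names here are invented for illustration) and cuts it at 256 characters, the width of the PARAM_VALUE column:

```python
import json

# A schema JSON of the shape Spark SQL stores in table properties.
# Column names are invented for illustration.
schema_json = json.dumps({
    "type": "struct",
    "fields": [
        {"name": "timestamp", "type": "string", "nullable": True,
         "metadata": {"HIVE_TYPE_STRING": "string"}},
        {"name": "col_a", "type": "string", "nullable": True,
         "metadata": {"HIVE_TYPE_STRING": "string"}},
        {"name": "col_b", "type": "string", "nullable": True,
         "metadata": {"HIVE_TYPE_STRING": "string"}},
    ],
})

# The full property value parses without trouble.
schema = json.loads(schema_json)
print(len(schema["fields"]), "fields parsed")

# Storing it in a 256-character PARAM_VALUE column cuts it off
# mid-object, so any later parse hits end-of-input early -- the
# Python analogue of Jackson's "Unexpected end-of-input" error.
truncated = schema_json[:256]
try:
    json.loads(truncated)
except json.JSONDecodeError as err:
    print("truncated schema fails to parse:", err.msg)
```

Spark itself parses the stored property string with Jackson via DataType.fromJson (visible in the stack trace above); the failure mode is the same, the JSON simply ends mid-field at the column width limit.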
Widening PARAM_VALUE to a length of 8000 resolved the problem. On a MySQL-backed metastore (an assumption here; adapt the DDL for other databases) this can be done with, for example, `ALTER TABLE TABLE_PARAMS MODIFY PARAM_VALUE VARCHAR(8000);`. Note that values already truncated in the metastore stay truncated after the column is widened, so affected tables may need their schema properties rewritten, for example by recreating the table.