I am using Spark SQL 3.0 with Scala 2.12. I can insert data into an Iceberg table and read it back through Spark successfully, but when I try to delete one bad record from the table, the log shows an exception. Issue 1444 of Apache Iceberg on GitHub says that Iceberg supports row-level deletes in the latest version, so why does my delete fail? The main Iceberg version I use is 0.10.0; the org.apache.iceberg:iceberg-hive package version is 0.9.1. Please help me! My Spark SQL code snippet is:
public static void deleteSingleDataWithoutCatalog3(){
// Spark SQL configuration
SparkConf sparkSQLConf = new SparkConf();
// 'hadoop_prod' is the name of the catalog used to access the table
sparkSQLConf.set("spark.sql.catalog.hadoop_prod", "org.apache.iceberg.spark.SparkCatalog");
sparkSQLConf.set("spark.sql.catalog.hadoop_prod.type", "hadoop");
sparkSQLConf.set("spark.sql.catalog.hadoop_prod.warehouse", "hdfs://hadoop01:9000/warehouse_path/");
sparkSQLConf.set("spark.sql.sources.partitionOverwriteMode", "dynamic");
SparkSession spark = SparkSession.builder().config(sparkSQLConf).master("local[2]").getOrCreate();
// String selectDataSQLALL = "select * from hadoop_prod.xgfying.booksSpark3 ";
String deleteSingleDataSQL = "DELETE FROM hadoop_prod.xgfying.booksSpark3 where price=33 ";
// spark.sql(deleteSingleDataSQL);
spark.table("hadoop_prod.xgfying.booksSpark3").show();
spark.sql(deleteSingleDataSQL);
spark.table("hadoop_prod.xgfying.booksSpark3").show();
}
When the code runs, the exception message is:
......
Exception in thread "main" java.lang.IllegalArgumentException: Failed to cleanly delete data files matching: ref(name="price") == 33
at org.apache.iceberg.spark.source.SparkTable.deleteWhere(SparkTable.java:168)
......
Caused by: org.apache.iceberg.exceptions.ValidationException: Cannot delete file where some, but not all, rows match filter ref(name="price") == 33: hdfs://hadoop01:9000/warehouse_path/xgfying/booksSpark3/data/title=Gone/00000-1-9070110f-35f8-4ee5-8047-cca2a1caba1f-00001.parquet
......
Posted on 2021-07-12 23:27:13
I know this is a fairly old question, but I recently ran into a similar problem and was able to resolve it by adding spark.sql.extensions to the Spark configuration:
--conf spark.sql.extensions=org.apache.iceberg.spark.extensions.IcebergSparkSessionExtensions
Without the extensions, Spark 3 plans DELETE FROM against Iceberg's metadata-only delete path (the SparkTable.deleteWhere in your stack trace), which can only drop entire data files in which every row matches the filter; a file where only some rows match produces exactly the ValidationException you are seeing. With the extensions registered (and a recent enough Iceberg runtime), the delete is instead rewritten so that the affected files are rewritten.
https://stackoverflow.com/questions/65165940
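If you configure Spark in code the way your snippet does, the same setting can be applied on the SparkConf before the session is created. Here is a minimal sketch, reusing the catalog name and warehouse path from your question (the method name is made up for illustration, and this assumes the Iceberg runtime jar on your classpath actually ships the extensions class):

import org.apache.spark.SparkConf;
import org.apache.spark.sql.SparkSession;

public static void deleteSingleDataWithExtensions() {
    SparkConf sparkSQLConf = new SparkConf();
    // Register the Iceberg SQL extensions so DELETE FROM no longer has to
    // align with whole data files (hypothetical variant of the question's method)
    sparkSQLConf.set("spark.sql.extensions", "org.apache.iceberg.spark.extensions.IcebergSparkSessionExtensions");
    sparkSQLConf.set("spark.sql.catalog.hadoop_prod", "org.apache.iceberg.spark.SparkCatalog");
    sparkSQLConf.set("spark.sql.catalog.hadoop_prod.type", "hadoop");
    sparkSQLConf.set("spark.sql.catalog.hadoop_prod.warehouse", "hdfs://hadoop01:9000/warehouse_path/");
    SparkSession spark = SparkSession.builder().config(sparkSQLConf).master("local[2]").getOrCreate();
    spark.sql("DELETE FROM hadoop_prod.xgfying.booksSpark3 where price=33");
    spark.table("hadoop_prod.xgfying.booksSpark3").show();
}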