I'm using Spark, Cassandra, and the Spark-Cassandra-Connector in a Databricks notebook. According to their documentation, rows can be deleted with 'deleteFromCassandra': https://github.com/datastax/spark-cassandra-connector/blob/master/doc/5_saving.md, https://datastax-oss.atlassian.net/browse/SPARKC-349. Here is my Python script:
from pyspark.sql.functions import col

def read_table(tableName, keyspace, columns):
    dfData = (spark
              .read
              .format("org.apache.spark.sql.cassandra")
              .options(table=tableName, keyspace=keyspace)
              .load()
              .select(*columns))
    return dfData

emails = 'abc@test.com'.split(",")
df = read_table(my_table, my_keyspace, "*").where(col("email").isin(emails))
df.rdd.deleteFromCassandra(my_keyspace, my_table)

But it fails with:
AttributeError: 'RDD' object has no attribute 'deleteFromCassandra'

I noticed that all the examples they provide are in Scala. Does that mean the 'deleteFromCassandra' function is not available in Python?
Posted on 2020-01-08 03:04:47
It's not possible with the Spark Cassandra Connector, because its Python bindings only support DataFrames. However, it should be possible with pyspark-cassandra, which is also available on the Spark Packages site via --packages anguenot:pyspark-cassandra:2.4.0. Like this:
dataFrame.rdd().deleteFromCassandra(keyspace, table)

Posted on 2020-04-29 14:01:55
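If pulling in pyspark-cassandra is not an option, one workaround (my own sketch, not part of the answers here) is to collect the matching primary keys on the driver and delete them with plain CQL through the DataStax cassandra-driver package. The helper name `build_delete_cql`, the contact point, and the key column are assumptions for illustration:

```python
# Sketch of a CQL-based delete as an alternative to rdd.deleteFromCassandra.
# Only the statement-building helper below is exercised here; the cluster
# setup and the DataFrame `df` are assumed and shown commented out.

def build_delete_cql(keyspace, table, key_column):
    """Build a parameterized CQL DELETE for a single primary-key column."""
    return f"DELETE FROM {keyspace}.{table} WHERE {key_column} = %s"

# from cassandra.cluster import Cluster        # pip install cassandra-driver
# cluster = Cluster(["127.0.0.1"])             # assumed contact point
# session = cluster.connect()
# stmt = build_delete_cql("my_keyspace", "my_table", "email")
# for row in df.select("email").collect():     # df from the question above
#     session.execute(stmt, (row["email"],))   # %s is filled by the driver
```

This trades the distributed delete for a driver-side loop, so it only makes sense when the number of matching keys is small.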
Hope this resolves it:

import com.datastax.spark.connector._

Source: https://stackoverflow.com/questions/59633625