I am currently evaluating MLeap as a solution for serving predictions from Spark models. To do so, I first implemented the linear regression example from the Spark documentation: https://spark.apache.org/docs/2.3.0/ml-classification-regression.html#linear-regression

I was able to save the model as an MLeap bundle and reuse it in another Spark context. Now I want to use this bundle in the MLeap runtime, but I am running into typing issues that prevent it from working.
The error comes from the schema definition:
val dataSchema = StructType(Seq(
  StructField("label", ScalarType.Double),
  StructField("features", ListType.Double)
)).get

The "features" field corresponds to a group of columns assembled into a vector. I have tried several things, none of which worked:
val dataSchema = StructType(Seq(
  StructField("label", ScalarType.Double),
  StructField("features", ListType.Double)
)).get

=> which gives me:

java.lang.IllegalArgumentException: Cannot cast ListType(double,true) to TensorType(double,Some(WrappedArray(10)),true)

So I tried:
val dataSchema = StructType(Seq(
  StructField("label", ScalarType.Double),
  StructField("features", TensorType.Double(10))
)).get

but it gives me:

java.lang.ClassCastException: scala.collection.immutable.$colon$colon cannot be cast to ml.combust.mleap.tensor.Tensor

Here is the full code:
val dataSchema = StructType(Seq(
  StructField("label", ScalarType.Double),
  StructField("features", TensorType.Double(10))
)).get

val data = Seq(Row(-9.490009878824548, Seq(0.4551273600657362, 0.36644694351969087, -0.38256108933468047, -0.4458430198517267, 0.33109790358914726, 0.8067445293443565, -0.2624341731773887, -0.44850386111659524, -0.07269284838169332, 0.5658035575800715)))

val bundle = (for (bundleFile <- managed(BundleFile("jar:file:/tmp/spark-lrModel.zip"))) yield {
  bundleFile.loadMleapBundle().get
}).tried.get

var model = bundle.root
val to_test = DefaultLeapFrame(dataSchema, data)
val res = model.transform(to_test).get // => this is the line that raises the exception

I am now a bit lost with this type mapping. Any ideas?
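For reference, the ClassCastException suggests the runtime received a Scala Seq where it expected an MLeap Tensor. A minimal sketch of how the row might be built to match a TensorType field, assuming MLeap's `Tensor.denseVector` helper from `ml.combust.mleap.tensor` (not verified against the exact MLeap version used here):

```scala
import ml.combust.mleap.core.types._
import ml.combust.mleap.runtime.frame.{DefaultLeapFrame, Row}
import ml.combust.mleap.tensor.Tensor

// Schema as in the question: a scalar label and a 10-dimensional tensor.
val dataSchema = StructType(Seq(
  StructField("label", ScalarType.Double),
  StructField("features", TensorType.Double(10))
)).get

// Wrap the feature values in a dense Tensor instead of passing a Seq,
// which is what triggers the ClassCastException above.
val features = Tensor.denseVector(Array(
  0.4551273600657362, 0.36644694351969087, -0.38256108933468047,
  -0.4458430198517267, 0.33109790358914726, 0.8067445293443565,
  -0.2624341731773887, -0.44850386111659524, -0.07269284838169332,
  0.5658035575800715))

val data = Seq(Row(-9.490009878824548, features))
val frame = DefaultLeapFrame(dataSchema, data)
```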
Thanks,
Stéphane
Posted 2019-06-26 16:06:05
My answer: starting from the Spark example was not a good idea, because its data is already in libsvm format, with the features already assembled into a single vector column. In that case the mapping does not seem possible. But starting from a basic example with a full pipeline (VectorAssembler + ML model), it works fine.
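A sketch of the approach described above: build a full Spark ML pipeline from raw scalar columns so that the exported bundle carries the vector-assembly step itself. The column names `x1`, `x2` and the DataFrame `trainingDf` are hypothetical placeholders; the pipeline API calls are standard Spark ML.

```scala
import org.apache.spark.ml.Pipeline
import org.apache.spark.ml.feature.VectorAssembler
import org.apache.spark.ml.regression.LinearRegression

// Assemble plain numeric columns into the "features" vector inside the
// pipeline, instead of starting from pre-assembled libsvm data.
val assembler = new VectorAssembler()
  .setInputCols(Array("x1", "x2")) // hypothetical column names
  .setOutputCol("features")

val lr = new LinearRegression()
  .setLabelCol("label")
  .setFeaturesCol("features")

val pipeline = new Pipeline().setStages(Array(assembler, lr))
val model = pipeline.fit(trainingDf) // trainingDf: hypothetical DataFrame

// With the assembler as a pipeline stage, the MLeap schema on the runtime
// side can start from scalar fields (x1, x2) rather than a tensor.
```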
https://stackoverflow.com/questions/56492260