How to convert a PythonRDD with sparse data into a dense PythonRDD
I want to scale my data with StandardScaler. I have loaded the data into a PythonRDD, and it appears to be sparse. To apply StandardScaler, it seems we should first convert it into a dense type.
from pyspark.mllib.util import MLUtils
from pyspark.mllib.feature import StandardScaler

trainData = MLUtils.loadLibSVMFile(sc, trainDataPath)
valData = MLUtils.loadLibSVMFile(sc, valDataPath)
trainLabel = trainData.map(lambda x: x.label)
trainFeatures = trainData.map(lambda x: x.features)
valLabel = valData.map(lambda x: x.label)
valFeatures = valData.map(lambda x: x.features)
scaler = StandardScaler(withMean=True, withStd=True).fit(trainFeatures)
# apply the scaler to the data. Here, trainFeatures is a sparse PythonRDD; we first convert it into a dense type
trainFeatures_scaled = scaler.transform(trainFeatures)
valFeatures_scaled = scaler.transform(valFeatures)
# merge `trainLabel` and `trainFeatures_scaled` into a new PythonRDD
trainData1 = ...
valData1 = ...
# use the scaled data, i.e., trainData1 and valData1, to train a model
...
The code above has errors. I have two questions:
- How can I convert the sparse PythonRDD trainFeatures into a dense type that can be fed to StandardScaler?
- How can I merge trainLabel and trainFeatures_scaled into a new LabeledPoint RDD for training a classifier (e.g., a random forest)?
I still can't find any documentation or references on this.
To convert to dense, map with toArray:
from pyspark.mllib.linalg import DenseVector
dense = valFeatures.map(lambda v: DenseVector(v.toArray()))
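The same conversion works for the training features before fitting the scaler. A minimal sketch, reusing the variable names from the question and following its premise that mean-centering requires dense vectors:

from pyspark.mllib.feature import StandardScaler
from pyspark.mllib.linalg import DenseVector

# Densify both feature RDDs so that withMean=True can center them
trainDense = trainFeatures.map(lambda v: DenseVector(v.toArray()))
valDense = valFeatures.map(lambda v: DenseVector(v.toArray()))

# Fit the scaler on the (dense) training features only, then transform both sets
scaler = StandardScaler(withMean=True, withStd=True).fit(trainDense)
trainFeatures_scaled = scaler.transform(trainDense)
valFeatures_scaled = scaler.transform(valDense)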
To merge, zip:
from pyspark.mllib.regression import LabeledPoint
valLabel.zip(dense).map(lambda lf: LabeledPoint(lf[0], lf[1]))
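Putting both steps together for the question's end goal, here is a sketch that rebuilds LabeledPoint RDDs from the scaled features and trains a random forest; the RandomForest.trainClassifier hyperparameters below are illustrative placeholders, not values from the question:

from pyspark.mllib.regression import LabeledPoint
from pyspark.mllib.tree import RandomForest

# Re-attach labels to the scaled features; zip keeps rows aligned because both
# RDDs were derived from the same parent RDD without any reshuffling
trainData1 = trainLabel.zip(trainFeatures_scaled).map(lambda lf: LabeledPoint(lf[0], lf[1]))
valData1 = valLabel.zip(valFeatures_scaled).map(lambda lf: LabeledPoint(lf[0], lf[1]))

# Train a random forest on the scaled training data (placeholder hyperparameters)
model = RandomForest.trainClassifier(trainData1, numClasses=2,
                                     categoricalFeaturesInfo={}, numTrees=10)

# Predict on the scaled validation features
predictions = model.predict(valData1.map(lambda lp: lp.features))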