Naming data points with labels using a scatter plot in PySpark

I have the following dataframe:

| date         | student_name | count | cluster |
|--------------|--------------|-------|---------|
| 234454333333 | A            | 50    | 2       |
| 345000004000 | B            | 100   | 4       |
| 345000004050 | C            | 95    | 4       |
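
For reference, data_pd in the plotting code below is the pandas version of this dataframe; a minimal sketch of how it could be built and collected (rows taken from the table above, everything else is just for illustration):

from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()
rows = [
    (234454333333, "A", 50, 2),
    (345000004000, "B", 100, 4),
    (345000004050, "C", 95, 4),
]
df = spark.createDataFrame(rows, ["date", "student_name", "count", "cluster"])
# matplotlib plots local data, so collect the (small) dataframe to pandas
data_pd = df.toPandas()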

Using this dataframe, I plotted a scatter chart as follows:

import matplotlib.pyplot as plt

# split the pandas dataframe by cluster id and plot each cluster in its own color
c1 = data_pd[data_pd.cluster == 0]
c2 = data_pd[data_pd.cluster == 1]
c3 = data_pd[data_pd.cluster == 2]
c4 = data_pd[data_pd.cluster == 3]
c5 = data_pd[data_pd.cluster == 4]
plt.scatter(c1.date, c1['count'], color='green')
plt.scatter(c2.date, c2['count'], color='blue')
plt.scatter(c3.date, c3['count'], color='red')
plt.scatter(c4.date, c4['count'], color='pink')
plt.scatter(c5.date, c5['count'], color='yellow')
plt.xlabel('date')
plt.ylabel('count')

I want to label each data point with the corresponding student_name value from the dataframe. How can I achieve this with pyspark?

I generated a small example just to give you an idea of how to achieve this with annotate.

import random
from pyspark.sql import SparkSession
import matplotlib.pyplot as plt


def plot_cluster(cluster, color, data_pd):
    # select the rows belonging to this cluster and scatter them in one color
    data = data_pd[data_pd.cluster == cluster]
    plt.scatter(data.date, data["count"], color=color)
    # annotate every point with its student_name
    for i, label in enumerate(data["student_name"]):
        plt.annotate(label, (data.date.iloc[i], data["count"].iloc[i]))


if __name__ == "__main__":
    spark = SparkSession.builder.getOrCreate()
    # build some random sample data in the same shape as your dataframe
    data = [
        {
            "date": 234454333333 + random.randrange(50000),
            "student_name": random.choice(["A", "B", "C"]),
            "count": random.randrange(20, 100),
            "cluster": random.randrange(5),
        }
        for _ in range(100)
    ]
    df = spark.createDataFrame(data)
    # matplotlib needs the data locally, so convert the Spark dataframe to pandas
    data_pd = df.toPandas()
    clusters = [0, 1, 2, 3, 4]
    colors = ["green", "blue", "red", "pink", "yellow"]
    for cluster, color in zip(clusters, colors):
        plot_cluster(cluster, color, data_pd)
    plt.xlabel("date")
    plt.ylabel("count")
    plt.show()
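
If the labels end up sitting right on top of the markers, annotate can also take an offset; a possible variation of the loop above (the 5-point offset is an arbitrary choice):

    for i, label in enumerate(data["student_name"]):
        # shift each label a few points away from its marker so it stays readable
        plt.annotate(
            label,
            (data.date.iloc[i], data["count"].iloc[i]),
            xytext=(5, 5),
            textcoords="offset points",
        )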

The x axis should obviously be cleaned up, but that doesn't matter here.
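
If the date column actually holds epoch timestamps in milliseconds (just a guess from the magnitude of the values), one way to make the x axis readable would be:

import pandas as pd

# convert the raw numbers to datetimes before plotting (assumes epoch milliseconds)
data_pd["date"] = pd.to_datetime(data_pd["date"], unit="ms")
plt.xticks(rotation=45)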
