Sparklyr ft_tokenizer error in rlang

When I try to use ft_tokenizer in sparklyr, I always hit an error. Running the following script, I keep getting: Error in rlang::env_get(mapping, nm, default = NULL, inherit = TRUE) : unused argument (default = NULL)

I have already tried installing the latest version of RStudio, and I have tried both Microsoft R Open and R 3.5.2.

library(sparklyr)

sc <- spark_connect(master = "local")

dataframe <- data.frame("Review" = "The pictures from online made it seem the room was big",
                        "Review_Is_Positive" = 0)
write.csv2(dataframe, file = "test.csv", row.names = FALSE)  # write.csv2 uses ";" as the separator

dataframe.spark <- spark_read_csv(sc,
                                  name = "test",
                                  path = "test.csv",
                                  overwrite = TRUE,
                                  delimiter = ";")

dataframe.spark

dataframe.spark <- ft_tokenizer(dataframe.spark, input_col = "Review", output_col = "tokens")
spark_disconnect(sc)

Can anyone help me?

I ran into the same problem. Try updating rlang from the development version:

devtools::install_github("r-lib/rlang")
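After reinstalling, restart the R session so the new build is actually loaded. The sketch below (an illustration, not part of the original answer) shows how one might verify the fix: the error occurs when sparklyr calls rlang::env_get() with a `default` argument that the installed rlang version does not accept, so checking the function's formal arguments is one way to confirm the update took effect.

```r
# install.packages("devtools")              # if devtools is not yet installed
# devtools::install_github("r-lib/rlang")   # then restart R before continuing

# Confirm which rlang version is now installed:
packageVersion("rlang")

# The "unused argument (default = NULL)" error means env_get() had no
# `default` parameter in the loaded rlang. After a successful update this
# check should return TRUE:
"default" %in% names(formals(rlang::env_get))
```

If the check still returns FALSE, the old rlang is likely still loaded in the current session; restarting R (not just re-running library()) usually resolves that.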