spark-shell error: value builder is not a member of object com.amazonaws.services.s3.model.PutObjectRequest
I'm new to EMR, Hadoop/Spark, etc. I'm trying to use spark-shell to run Scala code that uploads a file to an EMRFS S3 location, but I'm getting the errors below.
Without any imports, if I run:
val bucketName = "bucket"
val outputPath = "test.txt"
scala> val putRequest = PutObjectRequest.builder.bucket(bucketName).key(outputPath).build()
<console>:27: error: not found: value PutObjectRequest
val putRequest = PutObjectRequest.builder.bucket(bucketName).key(outputPath).build()
^
After adding the import for PutObjectRequest, I still get a different error.
scala> import com.amazonaws.services.s3.model.PutObjectRequest
import com.amazonaws.services.s3.model.PutObjectRequest
scala> val putRequest = PutObjectRequest.builder.bucket(bucketName).key(outputPath).build()
<console>:28: error: value builder is not a member of object com.amazonaws.services.s3.model.PutObjectRequest
val putRequest = PutObjectRequest.builder.bucket(bucketName).key(outputPath).build()
^
I'm not sure what I'm missing. Any help would be appreciated!
Note: the Spark version is 2.4.5.
The builder-style API you are calling belongs to AWS SDK for Java v2, whose PutObjectRequest lives under software.amazon.awssdk.services.s3.model. The class you imported is from SDK v1 (com.amazonaws.*), which has no builder method on PutObjectRequest. With SDK v1, create the PutObjectRequest object through a suitable constructor instead, and use AmazonS3ClientBuilder to create the connection to S3.
import com.amazonaws.regions.Regions
import com.amazonaws.services.s3.AmazonS3ClientBuilder
import com.amazonaws.services.s3.model.ObjectMetadata
import com.amazonaws.services.s3.model.PutObjectRequest
import java.io.File
val clientRegion = Regions.DEFAULT_REGION
val bucketName = "*** Bucket name ***"
val fileObjKeyName = "*** File object key name ***"
val fileName = "*** Path to file to upload ***"
val s3Client = AmazonS3ClientBuilder.standard.withRegion(clientRegion).build
// Upload a file as a new object with ContentType and title specified.
val request = new PutObjectRequest(bucketName, fileObjKeyName, new File(fileName))
val metadata = new ObjectMetadata()
metadata.setContentType("text/plain")
metadata.addUserMetadata("title", "someTitle")
request.setMetadata(metadata)
s3Client.putObject(request)
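Alternatively, if you would rather keep the builder-style call from the question, it works unchanged with AWS SDK for Java v2. A minimal sketch, assuming the v2 S3 artifact (software.amazon.awssdk:s3) has been added to the spark-shell classpath, e.g. via --jars; the file path and region are placeholders:

```scala
import software.amazon.awssdk.core.sync.RequestBody
import software.amazon.awssdk.regions.Region
import software.amazon.awssdk.services.s3.S3Client
import software.amazon.awssdk.services.s3.model.PutObjectRequest
import java.nio.file.Paths

// In SDK v2, both clients and request objects use the builder pattern.
val s3 = S3Client.builder.region(Region.US_EAST_1).build()

val putRequest = PutObjectRequest.builder
  .bucket("bucket")          // bucket name from the question
  .key("test.txt")           // object key from the question
  .contentType("text/plain")
  .build()

// Upload the local file as the object's body.
s3.putObject(putRequest, RequestBody.fromFile(Paths.get("/path/to/test.txt")))
```

Since the v1 and v2 SDKs live in different packages, they can coexist on the same classpath, but mixing them in one code path is best avoided.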