While trying to access data stored in Amazon S3 through Apache Spark, which internally uses the hadoop-aws jar, I got the following exception:
Exception in thread "main" java.lang.NoSuchMethodError: com.amazonaws.services.s3.transfer.TransferManagerConfiguration.setMultipartUploadThreshold(I)V
The probable reason is that the AWS Java SDK now expects a long parameter for its setMultipartUploadThreshold(long multiPartThreshold) method, while the hadoop-aws jar was compiled to call it with a parameter of type int (multiPartThreshold).
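To see why this surfaces only at runtime as a NoSuchMethodError, here is a minimal reproduction of the same linkage failure. The class and method names (Config, setThreshold, Caller) are hypothetical stand-ins for the SDK and hadoop-aws, not real code:

// --- v1/Config.java : the version the caller is compiled against ---
public class Config {
    public void setThreshold(int t) { }        // descriptor: setThreshold(I)V
}

// --- Caller.java ---
public class Caller {
    public static void main(String[] args) {
        new Config().setThreshold(5);          // call site records setThreshold(I)V
    }
}

// --- v2/Config.java : the version present on the runtime classpath ---
// public class Config {
//     public void setThreshold(long t) { }    // only descriptor setThreshold(J)V exists
// }

// Compile against v1, then run against v2:
//   javac v1/Config.java
//   javac -cp v1 Caller.java
//   javac v2/Config.java
//   java -cp v2:. Caller
// => Exception in thread "main" java.lang.NoSuchMethodError: Config.setThreshold(I)V

This mirrors the TransferManagerConfiguration failure above: the hadoop-aws bytecode asks for the int descriptor, but the SDK on the classpath only provides the long one.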
I tried both the downloaded hadoop-aws jar and the one resolved through its Maven dependency, and in both cases I hit the same exception. Although I can see private long multiPartThreshold; in the hadoop-aws GitHub repo, that change is not reflected in the downloaded jar or in the jar resolved from the Maven dependency.
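Before rebuilding anything, it can help to confirm which signature the AWS SDK on your runtime classpath actually exposes. A small reflection probe (my own ad-hoc check, not part of Hadoop or the SDK) does the trick:

import java.lang.reflect.Method;

// Lists every setMultipartUploadThreshold overload visible on the
// AWS SDK class that appears in the stack trace.
public class SignatureProbe {
    public static void main(String[] args) throws Exception {
        Class<?> cls = Class.forName(
                "com.amazonaws.services.s3.transfer.TransferManagerConfiguration");
        for (Method m : cls.getMethods()) {
            if (m.getName().equals("setMultipartUploadThreshold")) {
                // If only the long variant is printed, a hadoop-aws jar
                // compiled against the old int variant will fail to link.
                System.out.println(m);
            }
        }
    }
}

Run it with the same classpath Spark uses, so it sees the same SDK jar that wins at runtime.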
The following lines in the S3AFileSystem class account for the difference:
Build from trunk:

private long multiPartThreshold;
this.multiPartThreshold = conf.getLong("fs.s3a.multipart.threshold", 2147483647L); // line 267

Build through the Maven dependency:

private int multiPartThreshold;
multiPartThreshold = conf.getInt(MIN_MULTIPART_THRESHOLD, DEFAULT_MIN_MULTIPART_THRESHOLD); // line 249
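The field type then flows into the TransferManager setup, and that is where the descriptors diverge. The sketch below is simplified rather than the literal Hadoop source; conf is the usual Hadoop Configuration, the constants are the ones referenced above, and the variable names are mine:

// Trunk: long field, so the compiler emits setMultipartUploadThreshold(J)V,
// matching the signature in newer AWS SDKs.
TransferManagerConfiguration transferConfiguration = new TransferManagerConfiguration();
long thresholdFromTrunk = conf.getLong("fs.s3a.multipart.threshold", 2147483647L);
transferConfiguration.setMultipartUploadThreshold(thresholdFromTrunk);

// Released jar: int field, compiled against an older SDK whose method itself
// took an int, so its call site records setMultipartUploadThreshold(I)V --
// a descriptor that no longer exists in the SDK found at runtime.
int thresholdFromRelease = conf.getInt(MIN_MULTIPART_THRESHOLD, DEFAULT_MIN_MULTIPART_THRESHOLD);
transferConfiguration.setMultipartUploadThreshold(thresholdFromRelease);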