Getting data in and out of Elastic MapReduce HDFS

I've written a Hadoop program which requires a certain layout within HDFS, and afterwards I need to get the files back out of HDFS. It works on my single-node Hadoop setup and I'm eager to get it working on tens of nodes within Elastic MapReduce.

What I've been doing is something like this:

./elastic-mapreduce --create --alive
JOBID="j-XXX" # output from creation
# copy the input data from S3 into HDFS
./elastic-mapreduce -j $JOBID --ssh "hadoop fs -cp s3://bucket-id/XXX /XXX"
# run the job against the data now in HDFS
./elastic-mapreduce -j $JOBID --jar s3://bucket-id/jars/hdeploy.jar --main-class com.ranjan.HadoopMain --arg /XXX

This is asynchronous, but when the job's completed, I can do this

# copy the results from HDFS back to S3
./elastic-mapreduce -j $JOBID --ssh "hadoop fs -cp /XXX s3://bucket-id/XXX-output"
# shut the cluster down
./elastic-mapreduce -j $JOBID --terminate
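
Right now I just check the job flow state by hand before running that copy-out. A small polling script could do the waiting instead; this is only a sketch, and it assumes the elastic-mapreduce client's --list output includes PENDING/RUNNING step states (your client version's output format may differ):

# wait until no step in the job flow is still pending or running
while ./elastic-mapreduce --list -j $JOBID | grep -qE 'PENDING|RUNNING'; do
  sleep 60
done
# then copy the results out and terminate, as above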

So while this sort-of works, it's clunky and not what I'd like. Is there a cleaner way to do this?

Thanks!


You can use distcp, which will copy the files as a MapReduce job:

# download from s3
$ hadoop distcp s3://bucket/path/on/s3/ /target/path/on/hdfs/
# upload to s3
$ hadoop distcp /source/path/on/hdfs/ s3://bucket/path/on/s3/

This makes use of your entire cluster to copy to and from S3 in parallel.

(note: the trailing slashes on each path are important to copy from directory to directory)
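
In the workflow from the question, these can simply replace the hadoop fs -cp calls in the --ssh steps, so each copy is spread across the cluster instead of running through a single node. A rough sketch, reusing the placeholder bucket and paths from the question:

./elastic-mapreduce -j $JOBID --ssh "hadoop distcp s3://bucket-id/XXX/ /XXX/"
./elastic-mapreduce -j $JOBID --ssh "hadoop distcp /XXX/ s3://bucket-id/XXX-output/"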


@mat-kelcey, does the distcp command expect the files in S3 to have a minimum permission level? For some reason I have to set the permission levels of the files to "Open/Download" and "View Permissions" for "Everyone" for the files to be accessible from within the bootstrap or step scripts.
