How to scp to Amazon S3?
Warning: these answers are provided under the CC BY-SA 4.0 license. You are free to use/share them, but you must attribute them to the original authors (not me): StackOverflow.
Original URL: http://stackoverflow.com/questions/7328849/
Asked by qliq
I need to send ~2 TB of backup files to S3. I guess the most hassle-free option would be the Linux scp command (I've had difficulty with s3cmd and don't want an overkill Java/RoR solution to do so).
However, I am not sure whether that is possible: how would I use S3's private and public keys with scp, and what would my destination IP/URL/path be?
I appreciate your hints.
Accepted answer by El Yobo
You can't SCP.
The quickest way, if you don't mind spending money, is probably just to send it to them on a disk and they'll put it up there for you. See their Import/Export service.
Answered by ssmithstone
For our AWS backups we use a combination of duplicity and trickle: duplicity for the rsync-style transfer and encryption, and trickle to limit the upload speed.
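A minimal sketch of that combination (bucket name, paths, and the rate cap are placeholders; older duplicity releases use the s3+http:// URL scheme shown here, while newer ones take a boto3+s3:// URL):
export AWS_ACCESS_KEY_ID=your_access_key_id
export AWS_SECRET_ACCESS_KEY=your_secret_access_key
# trickle -s runs in standalone mode; -u caps upload bandwidth in KB/s (~1 MB/s here)
trickle -s -u 1024 duplicity /local/backup s3+http://your_bucket/backups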
Answered by robotrobot
Why don't you scp it to an EBS volume and then use s3cmd from there? As long as your EBS volume and S3 bucket are in the same region, you'll only pay inbound data charges once (from your network to the EBS volume).
I've found that once within the S3 network, s3cmd is much more reliable and the data transfer rate is far higher than going direct to S3.
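A sketch of that two-step route (the key file, host, and paths are placeholders; the EBS volume is assumed to be mounted at /mnt/ebs):
# 1. scp the backup onto an EC2 instance's EBS volume
scp -i your_key.pem backup.tar ec2-user@your-ec2-host:/mnt/ebs/
# 2. then, on the instance, push it to S3 with s3cmd
s3cmd put /mnt/ebs/backup.tar s3://your_bucket/backups/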
Answered by datasn.io
Here's just the thing for this: boto-rsync. From any Linux box, install boto-rsync and then use it to transfer /local/path/ to your_bucket/remote/path/:
boto-rsync -a your_access_key -s your_secret_key /local/path/ s3://your_bucket/remote/path/
The paths can also be files.
For an S3-compatible provider other than AWS, use --endpoint:
boto-rsync -a your_access_key -s your_secret_key --endpoint some.provider.com /local/path/ s3://your_bucket/remote/path/
Answered by Sridhar Sarnobat
As of 2015, SCP/SSH is not supported (and probably never will be, for the reasons mentioned in the other answers).
Official AWS tools for copying files to/from S3:
Command line tool (pip3 install awscli) - note that credentials need to be specified; I prefer via environment variables rather than a file: AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY.
aws s3 cp /tmp/foo/ s3://bucket/ --recursive --exclude "*" --include "*.jpg"
and an rsync-like command:
aws s3 sync . s3://mybucket
Web interface: the S3 Management Console.
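The environment-variable approach looks like this, for example (a hypothetical session; the key values are placeholders):
export AWS_ACCESS_KEY_ID=your_access_key_id
export AWS_SECRET_ACCESS_KEY=your_secret_access_key
aws s3 cp /tmp/foo/ s3://bucket/ --recursive --exclude "*" --include "*.jpg"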
Non-AWS methods
Any other solutions depend on third-party executables (e.g. botosync, jungledisk...), which can be great as long as they are supported. But third-party tools come and go as the years go by, and your scripts will have a shorter shelf life.
EDIT: Actually, AWS CLI is based on botocore:
https://github.com/boto/botocore
So botosync deserves a bit more respect as an elder statesman than I perhaps gave it.
Answered by Koustav Ray
There is an amazing tool called Dragon Disk. It even works as a sync tool, not just as a plain scp replacement.
The guide to set up Amazon S3 is provided here, and after setting it up you can either copy-paste files from your local machine to S3 or set up an automatic sync. The user interface is very similar to WinSCP or FileZilla.
Answered by GypsyCosmonaut
Here you go,
scp USER@REMOTE_IP:/FILE_PATH >(aws s3 cp - s3://BUCKET/SAVE_FILE_AS_THIS_NAME)
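This works because bash process substitution hands scp a pipe to write into, and aws s3 cp accepts - as a source to read from stdin, so the file never has to land on the local disk. An equivalent sketch that streams over plain ssh instead (same placeholder host, paths, and names):
ssh USER@REMOTE_IP "cat /FILE_PATH" | aws s3 cp - s3://BUCKET/SAVE_FILE_AS_THIS_NAME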