How to use AWS S3 CLI to dump files to stdout in BASH?
Disclaimer: this page is a translation of a popular StackOverflow question, provided under the CC BY-SA 4.0 license. You are free to use/share it, but you must do so under the same CC BY-SA license, cite the original URL, and attribute it to the original authors (not me): StackOverflow
Original question: http://stackoverflow.com/questions/28330907/
Asked by Neil C. Obremski
I'm starting a bash script which will take a path in S3 (as specified to the ls command) and dump the contents of all of the file objects to stdout. Essentially I'd like to replicate cat /path/to/files/* except for S3, e.g. s3cat '/bucket/path/to/files/*'. My first inclination looking at the options is to use the cp command to copy to a temporary file and then cat that.
Has anyone tried this or similar or is there already a command I'm not finding which does it?
Answered by quiver
dump the contents of all of the file objects to stdout.
You can accomplish this if you pass - as the destination of the aws s3 cp command. For example:

$ aws s3 cp s3://mybucket/stream.txt -
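Because the object is streamed straight to stdout, it can be piped into other tools without touching disk. For example (the object name here is hypothetical, assuming a gzip-compressed log file):

$ aws s3 cp s3://mybucket/logs.gz - | gunzip | grep ERROR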
Is what you're trying to do something like this?
#!/bin/bash
# List every key under the prefix, then stream each object and checksum it.
BUCKET=YOUR-BUCKET-NAME
for key in $(aws s3api list-objects --bucket "$BUCKET" --prefix bucket/path/to/files/ | jq -r '.Contents[].Key')
do
    echo "$key"
    aws s3 cp "s3://$BUCKET/$key" - | md5sum
done
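Combining that loop with the - destination gives a rough equivalent of the s3cat command the question asks for. A minimal sketch, assuming the bucket and prefix are passed as arguments rather than a shell glob, and that the AWS CLI and jq are installed (the script name and interface are hypothetical):

#!/bin/bash
# s3cat sketch: stream every object under a prefix to stdout.
# Usage: ./s3cat.sh BUCKET path/to/files/   (hypothetical interface)
# Note: list-objects returns at most 1000 keys per call.
BUCKET="$1"
PREFIX="$2"
aws s3api list-objects --bucket "$BUCKET" --prefix "$PREFIX" \
    | jq -r '.Contents[].Key' \
    | while read -r key; do
          aws s3 cp "s3://$BUCKET/$key" -
      done

The while read loop, unlike the for loop above, also handles keys containing spaces.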
Answered by Drew
If you are using a version of the AWS CLI that doesn't support copying to "-" you can also use /dev/stdout:
$ aws s3 cp --quiet s3://mybucket/stream.txt /dev/stdout
You also may want the --quiet flag to prevent a summary line like the following from being appended to your output:
download: s3://mybucket/stream.txt to ../../dev/stdout
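Since --quiet suppresses that summary line, the downloaded text can also be captured cleanly in a shell variable (a minimal sketch reusing the example object above):

contents=$(aws s3 cp --quiet s3://mybucket/stream.txt /dev/stdout)
echo "$contents"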
Answered by samarth
You can try using s3streamcat; it supports bzip, gzip and xz formats as well.
Install with:
sudo pip install s3streamcat
Usage:
s3streamcat s3://bucketname/dir/file_path
s3streamcat s3://bucketname/dir/file_path | more
s3streamcat s3://bucketname/dir/file_path | grep something
Answered by Neil C. Obremski
Ah ha!
https://pypi.python.org/pypi/s3cat/1.0.8
I'm writing more characters to satisfy the length requirement.
Answered by John Rotenstein
If you wish to accomplish this using BASH, you'll have to call out to an external app such as the AWS Command-Line Interface (CLI). It does not have a CAT equivalent, so you would need to copy the file locally and then CAT it.
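A minimal sketch of that copy-then-cat approach, using a temporary file that is removed afterwards (the object name is hypothetical):

tmp=$(mktemp)
aws s3 cp s3://mybucket/stream.txt "$tmp"
cat "$tmp"
rm -f "$tmp"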
Alternatively, you could use/write an app that directly calls the AWS SDK, which is available for languages such as Python, PHP, Java. By using the SDK, file contents can be retrieved in-memory and then sent to stdout.