How could I use AWS Lambda to write a file to S3 (Python)?

Note: this page is a mirror of a popular StackOverflow question, provided under the CC BY-SA 4.0 license. If you reuse or share it, you must follow the same license and attribute the original authors (not the translator). Original question: http://stackoverflow.com/questions/48945389/



Tags: python, amazon-web-services, amazon-s3, aws-lambda, serverless-framework

Asked by Rick.Wang

I have tried to use a Lambda function to write a file to S3. The test shows "succeeded", but nothing appeared in my S3 bucket. What happened? Can anyone give me some advice or solutions? Thanks a lot. Here's my code.


import json
import boto3

def lambda_handler(event, context):

    string = "dfghj"

    file_name = "hello.txt"
    lambda_path = "/tmp/" + file_name
    s3_path = "/100001/20180223/" + file_name

    with open(lambda_path, 'w+') as file:
        file.write(string)
        file.close()

    s3 = boto3.resource('s3')
    s3.meta.client.upload_file(lambda_path, 's3bucket', s3_path)

Answered by Tim B

I've had success streaming data to S3; it has to be encoded to do this:


import boto3

def lambda_handler(event, context):
    string = "dfghj"
    encoded_string = string.encode("utf-8")

    bucket_name = "s3bucket"
    file_name = "hello.txt"
    lambda_path = "/tmp/" + file_name
    s3_path = "/100001/20180223/" + file_name

    s3 = boto3.resource("s3")
    s3.Bucket(bucket_name).put_object(Key=s3_path, Body=encoded_string)
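
A quick way to check that the object actually landed is to read it back with the same resource. This is a minimal sketch, not part of the original answer, and it assumes the bucket name and key from the snippet above:

import boto3

s3 = boto3.resource("s3")
# Same bucket/key as in the snippet above; .get() returns the object, Body is a streaming handle
obj = s3.Object("s3bucket", "/100001/20180223/hello.txt")
print(obj.get()["Body"].read().decode("utf-8"))  # prints "dfghj" if the upload worked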

If the data is in a file, you can read this file and send it up:


with open(filename) as f:
    string = f.read()

encoded_string = string.encode("utf-8")
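
From there, the encoded string goes into the same put_object call as above. Alternatively, you can hand the open file object straight to Body and skip the manual read/encode; a minimal sketch, assuming filename, bucket_name and s3_path are defined as in the snippets above:

import boto3

s3 = boto3.resource("s3")
with open(filename, "rb") as f:
    # Body accepts bytes or a file-like object opened in binary mode
    s3.Bucket(bucket_name).put_object(Key=s3_path, Body=f)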

Answered by grepit

My response is very similar to Tim B's, but the most important part is:


1. Go to S3 and create the bucket you want to write to.


2. Follow the steps below, otherwise your Lambda will fail due to permission/access errors. I've copied and pasted the linked content here for you too, just in case they change the URL or move it to some other page.


a. Open the Roles page in the IAM console.


b. Choose Create role.


c. Create a role with the following properties.


- Trusted entity – AWS Lambda.
- Permissions – AWSLambdaExecute.
- Role name – lambda-s3-role.

The AWSLambdaExecute policy has the permissions that the function needs to manage objects in Amazon S3 and write logs to CloudWatch Logs.

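If you would rather script the role instead of clicking through the console, a rough boto3 equivalent of steps a–c looks like this (a sketch only; run it with credentials that are allowed to manage IAM, and note the role name mirrors the one above):

import json
import boto3

iam = boto3.client("iam")

# Trust policy: let the Lambda service assume this role (the "trusted entity" step above)
trust_policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Principal": {"Service": "lambda.amazonaws.com"},
        "Action": "sts:AssumeRole",
    }],
}

iam.create_role(
    RoleName="lambda-s3-role",
    AssumeRolePolicyDocument=json.dumps(trust_policy),
)

# Attach the managed policy mentioned above (S3 object access + CloudWatch Logs)
iam.attach_role_policy(
    RoleName="lambda-s3-role",
    PolicyArn="arn:aws:iam::aws:policy/AWSLambdaExecute",
)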

3. Copy and paste this into your Lambda Python function:

    import json, boto3, os, sys, uuid
    from urllib.parse import unquote_plus

    s3_client = boto3.client('s3')

    def lambda_handler(event, context):
        some_text = "test"
        # put the bucket name you created in step 1
        bucket_name = "my_buck_name"
        file_name = "my_test_file.csv"
        lambda_path = "/tmp/" + file_name
        s3_path = "output/" + file_name
        # create a small temp file under /tmp (the only writable path in Lambda)
        os.system('echo testing... >' + lambda_path)
        s3 = boto3.resource("s3")
        # upload the temp file under the "output/" prefix so it matches the response body below
        s3.meta.client.upload_file(lambda_path, bucket_name, s3_path)

        return {
            'statusCode': 200,
            'body': json.dumps('file is created in: ' + s3_path)
        }
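
Side note on the snippet above: the os.system('echo ...') call only exists to create a small temp file, and plain Python file I/O does the same thing without shelling out. A sketch using the same variable names as above:

    # Equivalent to the os.system('echo ...') line, writing some_text to the temp file
    with open(lambda_path, "w") as f:
        f.write(some_text + "\n")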
    