
AWS Lambda and S3 and Pandas - Load CSV into S3, trigger Lambda, load into pandas, put back in bucket?


This code runs a Lambda function triggered by an S3 PUT; the function then GETs the object and PUTs it back under another key:

    from __future__ import print_function
    import boto3
    from urllib.parse import unquote_plus  # S3 event keys arrive URL-encoded

    s3 = boto3.client('s3')

    def lambda_handler(event, context):
        # Bucket and key of the object that triggered the event
        bucket = event['Records'][0]['s3']['bucket']['name']
        key = unquote_plus(event['Records'][0]['s3']['object']['key'])
        end_path = key  # destination key: set this to wherever the copy should go
        try:
            response = s3.get_object(Bucket=bucket, Key=key)
            s3_upload_article(response['Body'].read(), bucket, end_path)
            return response['ContentType']
        except Exception as e:
            print(e)
            print('Error getting object {} from bucket {}. Make sure they exist '
                  'and your bucket is in the same region as this function.'.format(key, bucket))
            raise

    def s3_upload_article(html, bucket, end_path):
        s3.put_object(Body=html, Bucket=bucket, Key=end_path,
                      ContentType='text/html', ACL='public-read')

I broke this code out from a more complicated Lambda script I have written, but I hope it shows some of what you need to do. The PUT of the object only triggers the script; any actions that occur after the event fires are up to you to code into the script.
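For reference, the handler above indexes into the S3 event notification that Lambda passes in. A trimmed sketch of that payload (bucket and key names here are made up; real events carry more metadata):

```python
# Minimal shape of the S3 PUT event Lambda receives (fields trimmed)
sample_event = {
    'Records': [{
        's3': {
            'bucket': {'name': 'my-source-bucket'},   # placeholder name
            'object': {'key': 'uploads/report.csv'},  # placeholder key
        }
    }]
}

# Same lookups the handler performs
bucket = sample_event['Records'][0]['s3']['bucket']['name']
key = sample_event['Records'][0]['s3']['object']['key']
```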

    bucket = event['Records'][0]['s3']['bucket']['name']
    key = unquote_plus(event['Records'][0]['s3']['object']['key'])

Bucket and key in the first few lines are the bucket and key of the object that triggered the event. Everything else is up to you.