Can you upload to S3 using a stream rather than a local file?



I did find a solution to my question, which I'll post here in case anyone else is interested. You can't stream a file of unknown length to S3 in a single request, so I decided to do this in parts, as a multipart upload. There is also a package that converts a streaming write into a multipart upload for you, which is what I used: Smart Open.

import csv
import io

import smart_open

testDict = [
    {"fieldA": "8", "fieldB": None, "fieldC": "888888888888"},
    {"fieldA": "9", "fieldB": None, "fieldC": "99999999999"},
]
fieldnames = ['fieldA', 'fieldB', 'fieldC']

# Buffer one CSV row at a time in memory, then push it to the S3 stream.
# (smart_open.open replaces the deprecated smart_open.smart_open; text
# mode 'w' is used because csv writes str, not bytes.)
f = io.StringIO()
with smart_open.open('s3://dev-test/bar/foo.csv', 'w') as fout:
    writer = csv.DictWriter(f, fieldnames=fieldnames)
    writer.writeheader()
    fout.write(f.getvalue())
    for row in testDict:
        f.seek(0)
        f.truncate(0)
        writer.writerow(row)
        fout.write(f.getvalue())
f.close()


We were trying to upload file contents to S3 when they came through as an InMemoryUploadedFile object in a Django request. We ended up doing the following because we didn't want to save the file locally. Hope it helps:

@action(detail=False, methods=['post'])
def upload_document(self, request):
    document = request.data.get('image').file
    s3.upload_fileobj(document, BUCKET_NAME,
                      DESIRED_NAME_OF_FILE_IN_S3,
                      ExtraArgs={"ServerSideEncryption": "aws:kms"})
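The pattern above generalizes: `upload_fileobj` accepts any binary file-like object, so an `io.BytesIO` wrapper works just as well as Django's in-memory file. A minimal sketch, assuming `s3_client` is a boto3 S3 client; it is passed in as a parameter (and the bucket/key names are placeholders) so the helper stays easy to exercise without AWS:

```python
import io

def upload_bytes(s3_client, bucket, key, data):
    """Stream raw bytes to S3 without writing a local file.

    s3_client is assumed to be a boto3 S3 client; upload_fileobj
    reads from any seekable binary file-like object.
    """
    buf = io.BytesIO(data)  # in-memory, seekable, binary-mode buffer
    s3_client.upload_fileobj(buf, bucket, key)
```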


According to the docs, it's possible:

s3.Object('mybucket', 'hello.txt').put(Body=open('/tmp/hello.txt', 'rb'))

so we can build the content in memory with StringIO (or io.BytesIO for binary data) and pass it in the ordinary way.
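To make that concrete: `put()` accepts bytes (or a seekable binary file-like object) as `Body`, so a document built in a StringIO just needs encoding to bytes before upload. A small sketch; the bucket and key names are placeholders, and the actual `put` call is left commented since it needs live credentials:

```python
import csv
import io

def csv_as_bytes(rows, fieldnames):
    """Render rows to CSV entirely in memory and return bytes for Body=..."""
    f = io.StringIO()
    writer = csv.DictWriter(f, fieldnames=fieldnames)
    writer.writeheader()
    writer.writerows(rows)
    # put() wants bytes, so encode the str buffer before handing it over
    return f.getvalue().encode("utf-8")

body = csv_as_bytes([{"a": "1", "b": "2"}], ["a", "b"])
# Then, assuming a boto3 resource named s3:
# s3.Object('mybucket', 'hello.csv').put(Body=body)
```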

Update: the smart_open lib from @inquiring minds' answer is the better solution.