
How to upload all files of a specific type to S3 Bucket?


When a cmdlet prompts you for missing parameters like that, the values you type are taken as literal strings. When I mimicked your issue locally:

PS C:\> Write-S3Object
cmdlet Write-S3Object at command pipeline position 1
Supply values for the following parameters:
BucketName: MyTestBucketNameHere
Key: $testName
File: C:/test.txt

I ended up with a file on S3 whose key was named $testName, because variables aren't evaluated in that context. Likewise, you're getting this "The file indicated by the FilePath property does not exist!" error because there is no file in your filesystem named $f[0].fullName.
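If you want to use variables, pass them on the command line instead of typing them at the prompt; there they are expanded normally. A minimal sketch along those lines (reusing the $testName and $f names from above, which are only illustrative):

$testName = "file.txt"
$f = Get-ChildItem -Filter "*.flv"
# Passed as part of the command line, the variables are expanded before the cmdlet runs
Write-S3Object -BucketName "MyTestBucketName" -Key $testName -File $f[0].FullName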

An example to write a single file to S3:

PS C:\> Write-S3Object -BucketName "MyTestBucketName" -Key "file.txt" -File "C:/test.txt"

To write all of your files to S3:

PS C:\> (Get-ChildItem -filter "*.flv") | % { Write-S3Object -BucketName "MyTestBucketName" -File $_ -Key $_.name}

This first gets all files with the .flv extension in your current directory, and for each one (% is an alias for ForEach-Object) writes the current file (represented by $_) to MyTestBucketName with a Key set to the Name property of the file being iterated.
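If your version of the AWS Tools for PowerShell supports the folder-upload parameter set, Write-S3Object can also do this in a single call without the pipeline. A sketch, assuming the -Folder, -KeyPrefix and -SearchPattern parameters are available in your module version (the folder path and prefix here are placeholders):

# Upload every .flv file under C:\videos to MyTestBucketName with keys prefixed "videos/"
Write-S3Object -BucketName "MyTestBucketName" -Folder "C:\videos" -KeyPrefix "videos" -SearchPattern "*.flv"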


The value PublicRead for the -CannedACLName parameter is not correct; the correct value is public-read:

foreach ($f in (Get-ChildItem -filter "*.flv")) {
    Write-S3Object -BucketName bucket.example -File $f.fullName -Key $f.name -CannedACLName public-read
}

This fixed the issue for me and should fix for you as well.
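If you want to confirm the ACL was actually applied, the module's Get-S3ACL cmdlet shows the grants on an uploaded object; a quick check, using a hypothetical key name standing in for one of the files from the loop above:

# Inspect the ACL that ended up on one of the uploaded objects (the key name is just an example)
Get-S3ACL -BucketName bucket.example -Key "somefile.flv"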


For me, none of these solutions worked, so here is what I found.

When you're in PowerShell ISE (4.0), it doesn't matter how you pass the local file name; ISE resolves its location, so you can simply do the following. In my example I'm uploading all the backups in a folder named e:\Backups to a bucket in S3:

$results = Get-ChildItem -Path E:\Backups
foreach ($f in $results) {
    $filename = [System.IO.Path]::GetFileName($f)
    Write-S3Object -BucketName bi-dbs-daily-backup -File $f -Key $filename
}

If I run this in PowerShell ISE everything works fine, but if I create a .ps1 and try to run it with the Task Scheduler I get: The file indicated by the FilePath property does not exist!

It turns out that when Task Scheduler runs the .ps1 it effectively runs it from the console, and in that context the file names you're passing no longer resolve to the right paths.

So I added | % { $_.FullName } to Get-ChildItem to get the full path of each file, and now everything works fine:

$results = Get-ChildItem -Path E:\Backups | % { $_.FullName }
foreach ($f in $results) {
    $filename = [System.IO.Path]::GetFileName($f)
    Write-S3Object -BucketName bi-dbs-daily-backup -File $f -Key $filename
}
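An equivalent approach, as a sketch, is to keep the FileInfo objects that Get-ChildItem returns and use their FullName and Name properties directly, which sidesteps the string conversion entirely:

foreach ($f in Get-ChildItem -Path E:\Backups) {
    # FullName is always the absolute path, so this behaves the same from ISE, the console, or Task Scheduler
    Write-S3Object -BucketName bi-dbs-daily-backup -File $f.FullName -Key $f.Name
}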

Hope this helps someone out there!