
S3 Copy artifacts with Jenkins pipeline


I agree with your original statement: the script generation is oh so sad. It does not give you enough to go on, even when you have S3BucketPublisher selected. See my snippet below. It assumes you have already created a profile in the system configuration.

stage("publish to s3") {    step([        $class: 'S3BucketPublisher',        entries: [[            sourceFile: 'mybinaryFile',            bucket: 'GoBinaries',            selectedRegion: 'eu-west-1',            noUploadOnFailure: true,            managedArtifacts: true,            flatten: true,            showDirectlyInBrowser: true,            keepForever: true,        ]],        profileName: 'myprofile',        dontWaitForConcurrentBuildCompletion: false,     ])}


For a simpler use case, this is now supported by the Pipeline AWS Plugin:

    s3Upload(file: 'someFolder', bucket: 'my-bucket', path: '/path/to/targetFolder/')
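A minimal sketch of that step in a scripted pipeline, wrapped in the plugin's withAWS block to supply the region and stored credentials. The credentials ID 'jenkins-aws' and the bucket name are assumptions:

    node {
        stage('upload to s3') {
            // withAWS scopes the region and the Jenkins-stored AWS credentials
            // to the steps inside the block
            withAWS(region: 'eu-west-1', credentials: 'jenkins-aws') {
                // Uploads the contents of someFolder under the given key prefix
                s3Upload(file: 'someFolder', bucket: 'my-bucket', path: '/path/to/targetFolder/')
            }
        }
    }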

djsd123's example works perfectly for more advanced use cases. If you also want to add metadata tags to your objects, you can add a userMetadata array:

    profileName: 'myprofile',
    dontWaitForConcurrentBuildCompletion: false,
    userMetadata: [
        [key: 'git_branch',   value: "${env.BRANCH_NAME}"],
        [key: 'build_number', value: "${env.BUILD_NUMBER}"],
    ],
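Merged into the full step from the first answer, that looks like the following sketch. Note that userMetadata sits at the publisher level, alongside profileName, not inside entries:

    step([
        $class: 'S3BucketPublisher',
        entries: [[
            sourceFile: 'mybinaryFile',
            bucket: 'GoBinaries',
            selectedRegion: 'eu-west-1',
            noUploadOnFailure: true,
        ]],
        profileName: 'myprofile',
        dontWaitForConcurrentBuildCompletion: false,
        // Each uploaded object gets these key/value metadata pairs
        userMetadata: [
            [key: 'git_branch',   value: "${env.BRANCH_NAME}"],
            [key: 'build_number', value: "${env.BUILD_NUMBER}"],
        ],
    ])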


Adding on to @djsd123's answer: you can upload an entire folder using glob patterns:

    ...
    bucket: "bucketName/folder",
    sourceFile: 'folder/**',
    ...
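Put together, a complete entry uploading a folder might look like this sketch; the bucket, folder, and glob are placeholder values:

    step([
        $class: 'S3BucketPublisher',
        profileName: 'myprofile',
        entries: [[
            bucket: 'bucketName/folder',
            sourceFile: 'folder/**',    // glob matching every file under folder/
            selectedRegion: 'eu-west-1',
            noUploadOnFailure: true,
        ]],
        dontWaitForConcurrentBuildCompletion: false,
    ])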