Jenkins pipeline: how to upload artifacts with s3 plugin



Detailed steps:

  1. Install the Pipeline AWS Plugin. Go to Manage Jenkins -> Manage Plugins -> Available tab -> Filter by 'Pipeline AWS'. Install the plugin.

  2. Add Credentials as per your environment. Example here:

    Jenkins > Credentials > System > Global credentials (unrestricted) -> Add

    Kind = AWS Credentials, and add your AWS credentials

    Note the credential ID

  3. Then, in your Pipeline project, use something similar to this:

    node {
        stage('Upload') {
            dir('path/to/your/project/workspace') {
                pwd() // Log the current directory
                withAWS(region: 'yourS3Region', credentials: 'yourIDfromStep2') {
                    def identity = awsIdentity() // Log the AWS identity in use
                    // Upload files from working directory 'dist' in your project workspace
                    s3Upload(bucket: 'yourBucketName', workingDir: 'dist', includePathPattern: '**/*')
                }
            }
        }
    }
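The steps above use the scripted pipeline syntax. If your project uses a declarative pipeline instead, the same upload can be sketched as follows (the region, credentials ID, and bucket name are placeholders you would replace with your own values):

```groovy
pipeline {
    agent any
    stages {
        stage('Upload') {
            steps {
                // withAWS scopes the region and credentials to the enclosed steps
                withAWS(region: 'yourS3Region', credentials: 'yourIDfromStep2') {
                    // Upload everything under 'dist' in the workspace to the bucket
                    s3Upload(bucket: 'yourBucketName', workingDir: 'dist', includePathPattern: '**/*')
                }
            }
        }
    }
}
```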


The Pipeline Steps documentation on the Jenkins website shows that the Pipeline AWS Plugin provides an s3Upload step.


Try this:

s3Upload(file:'file.txt', bucket:'my-bucket', path:'path/to/target/file.txt')
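Note that s3Upload needs AWS credentials and a region in scope, so in practice the call is usually wrapped in a withAWS block. A minimal sketch (the region and credentials ID below are placeholders):

```groovy
withAWS(region: 'us-east-1', credentials: 'my-aws-credentials-id') {
    // Copy a single workspace file to an explicit key in the bucket
    s3Upload(file: 'file.txt', bucket: 'my-bucket', path: 'path/to/target/file.txt')
}
```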

I think it is easiest to point to the plugin documentation directly; you can find it here.

As you are looking for a way to upload files to S3, here are some examples.
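A couple of common variants, sketched below (the bucket names, paths, and patterns are illustrative; the parameter names follow the Pipeline AWS Plugin's s3Upload step):

```groovy
withAWS(region: 'us-east-1', credentials: 'my-aws-credentials-id') {
    // Upload a single file from the workspace to the bucket
    s3Upload(file: 'build/report.html', bucket: 'my-bucket', path: 'reports/report.html')

    // Upload only the jar files under 'target', preserving their relative paths
    s3Upload(bucket: 'my-bucket', workingDir: 'target', includePathPattern: '**/*.jar')
}
```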