
Passing outputs between tasks #1951

@cjohansen

Description

Local disk resource

Problem: I have a pipeline that includes a "build and deploy" task, like so:

- name: build-deploy
  plan:
  - get: my-repo
    trigger: true
    passed: [test]
  - task: build-artifact
    file: my-repo/ci/build.yml
  - put: ecr-repository
    params:
      build: my-repo-output
      tag: my-repo-output/version

Now I want to split this into two tasks so I can build once and then deploy (or use the artifacts in other ways) multiple times. The problem is that the output directory does not survive across tasks. I could use an S3 bucket or something like that to persist the artifact across builds, but that's a pretty big detour just to cache some files between tasks.
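For reference, here is roughly what the S3 detour looks like. The bucket name, file glob, and credential vars below are placeholders, and it assumes the official s3 resource type; the point is just how much extra machinery it takes:

resources:
- name: build-output                 # extra S3 resource whose only purpose is bridging two jobs
  type: s3
  source:
    bucket: my-artifact-bucket
    regexp: artifacts/my-repo-(.*).tgz
    access_key_id: ((aws_access_key_id))          # the additional keys mentioned below
    secret_access_key: ((aws_secret_access_key))

jobs:
- name: build
  plan:
  - get: my-repo
    trigger: true
    passed: [test]
  - task: build-artifact
    file: my-repo/ci/build.yml
  - put: build-output
    params:
      file: my-repo-output/my-repo-*.tgz          # upload the task output to S3

- name: deploy
  plan:
  - get: build-output                             # network round-trip in every consumer
    trigger: true
    passed: [build]
  # ...then unpack the artifact and put to ecr-repository as before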

It would be very convenient if there were a "local disk resource" that I could put outputs into and then get from other tasks in the pipeline. Does something like this already exist?
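To make the ask concrete, something along these lines is what I have in mind. Note that local-artifact is an imagined resource; nothing below exists today:

jobs:
- name: build
  plan:
  - get: my-repo
    trigger: true
    passed: [test]
  - task: build-artifact
    file: my-repo/ci/build.yml
  - put: local-artifact              # imagined: stash my-repo-output on local disk
    params:
      path: my-repo-output

- name: deploy
  plan:
  - get: local-artifact              # imagined: read it back without leaving the cluster
    passed: [build]
  - put: ecr-repository
    params:
      build: local-artifact
      tag: local-artifact/version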

To clarify, I don't want to use S3 for a few reasons:

  • Requires configuration of additional keys
  • Network penalty in every task that uses it
  • Retains artifacts that I don't really need
