Pipeline updates should wait until the current pipeline has finished executing #285

@timmow

Description

Currently pipelines are updated in place, which means that if a pipeline is updated while it is executing, its behaviour may diverge from what the previous version of the pipeline expected.

Example:

Deploying the first pipeline below, starting it, and then deploying the second pipeline results in the final step of the pipeline never executing in that build, which can cause unexpected inconsistencies if the jobs affect external resources. Jobs that are removed while executing also finish execution, but become invisible in the UI.

Pipeline 1


---
jobs:
  - name: foo
    plan:
    - get: state
    - task: foo
      config:
        inputs: 
          - name: state
        run:
          path: sh
          args:
          - -c
          - |
            date >> updating-while-running
            sleep 30
            echo 30
    - put: state
      params:
        file: foo/updating-while-running
  - name: bar
    plan:
    - get: state
      passed: [foo]
      trigger: true
    - task: bar
      config:
        run:
          path: sh
          args:
          - -c
          - |
            sleep 30
            echo 30
resources:
  - name: state
    type: s3
    source:
      bucket: {{state_bucket}}
      versioned_file: updating-while-running
      region_name: {{aws_region}}
      access_key_id: ACCESS-KEY
      secret_access_key: SECRET

Pipeline 2


---
jobs:
  - name: foo
    plan:
    - get: state
    - task: foo
      config:
        inputs: 
          - name: state
        run:
          path: sh
          args:
          - -c
          - |
            date >> updating-while-running
            sleep 30
            echo 30
    - put: state
      params:
        file: foo/updating-while-running
resources:
  - name: state
    type: s3
    source:
      bucket: {{state_bucket}}
      versioned_file: updating-while-running
      region_name: {{aws_region}}
      access_key_id: ACCESS-KEY
      secret_access_key: SECRET

In addition, deploying the second pipeline, starting the foo job, and then applying the first pipeline while the foo job is running will add the bar job and trigger a new build of it, even though the foo job started running under an older version of the pipeline.

Ideally, when new pipelines are uploaded using fly, they should not be applied until the current pipeline has finished executing. There are situations where the current in-place behaviour is desired, so it could be kept behind a --force flag.

It would also be nice to have a --wait option that causes fly not to return until the pipeline has actually been updated, for use when scripting pipeline updates.
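Until such flags exist, the --wait behaviour can be approximated client-side by polling for in-flight builds before applying the new configuration. A minimal Python sketch of that polling loop, where list_running_builds is a hypothetical helper standing in for a real Concourse API client (the name and signature are illustrative, not an existing fly or Concourse API):

```python
import time


def wait_until_idle(list_running_builds, pipeline, interval=5, timeout=600):
    """Block until the given pipeline has no running builds.

    list_running_builds is a callable taking a pipeline name and returning
    its in-flight builds (hypothetical -- stands in for a real API client).
    Returns True once the pipeline is idle, False if the timeout elapses.
    """
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        if not list_running_builds(pipeline):
            return True
        time.sleep(interval)
    return False


# Example with a stubbed client: the pipeline drains after two polls.
_polls = iter([["build-1"], ["build-1"], []])
assert wait_until_idle(lambda p: next(_polls), "main", interval=0) is True
```

A script updating a pipeline would call wait_until_idle first and only apply the new configuration once it returns True; note this is racy (a new build may start between the check and the update), which is why doing it server-side, as proposed above, would be preferable.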
