Bug Report
I accidentally created a pipeline in which a task had neither a file nor a config parameter, and the web container crashed because of it. I ended up debugging for hours to find out what was actually wrong. Whenever the task runs, the entire web container crashes, restarts itself, and crashes again, stuck in an endless crash loop.
As far as I could see, there was no way to remove the offending pipeline to stop the web container from crashing again, so I had to remove all the data and recreate the pipeline. That in itself is not a big problem, but if people want to keep their build logs, I can't see a way of doing that.
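For reference, a minimal pipeline along these lines triggers the crash (job and task names are made up; the only relevant part is that the task step has neither file nor config):

```yaml
# Hypothetical minimal repro: the task step below defines neither
# `file:` nor `config:`, which is the misconfiguration that crashes the web node.
jobs:
- name: broken-job
  plan:
  - task: broken-task   # no file: and no config: -> web container panics on run
```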
Actual result:
concourse-web_1 | panic: runtime error: invalid memory address or nil pointer dereference
concourse-web_1 | [signal SIGSEGV: segmentation violation code=0x1 addr=0x20 pc=0xa71f83]
concourse-web_1 |
concourse-web_1 | goroutine 3132 [running]:
concourse-web_1 | github.com/concourse/atc/exec.ValidatingConfigFetcher.FetchConfig(0x0, 0x0, 0xc422945460, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, ...)
concourse-web_1 | /tmp/build/9674af12/concourse/src/github.com/concourse/atc/exec/task_config_fetcher.go:205 +0x73
concourse-web_1 | github.com/concourse/atc/exec.(*ValidatingConfigFetcher).FetchConfig(0xc42038ccd0, 0xc422945460, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, ...)
concourse-web_1 | <autogenerated>:1 +0xb3
concourse-web_1 | github.com/concourse/atc/exec.DeprecationConfigFetcher.FetchConfig(0x10c5cda0, 0xc42038ccd0, 0x10c4fe60, 0xc420142c30, 0xc422945460, 0x0, 0x0, 0x0, 0x0, 0x0, ...)
concourse-web_1 | /tmp/build/9674af12/concourse/src/github.com/concourse/atc/exec/task_config_fetcher.go:87 +0x92
concourse-web_1 | github.com/concourse/atc/exec.(*DeprecationConfigFetcher).FetchConfig(0xc422945360, 0xc422945460, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, ...)
concourse-web_1 | <autogenerated>:1 +0xc5
concourse-web_1 | github.com/concourse/atc/exec.(*FetchConfigAction).Run(0xc4248eeb60, 0x10c66620, 0xc4206233e0, 0xc422945460, 0xc4202769c0, 0xc42296c360, 0xc4245185a0, 0xc42002d180)
concourse-web_1 | /tmp/build/9674af12/concourse/src/github.com/concourse/atc/exec/fetch_config_action.go:36 +0x76
concourse-web_1 | github.com/concourse/atc/exec.(*ActionsStep).Run(0xc4245185a0, 0xc4202769c0, 0xc42296c360, 0xf3a3c0, 0xc424518550)
concourse-web_1 | /tmp/build/9674af12/concourse/src/github.com/concourse/atc/exec/actions_step.go:61 +0xd5
concourse-web_1 | github.com/concourse/atc/exec.(*OnSuccessStep).Run(0xc424518550, 0xc4202769c0, 0xc42296c360, 0xc424518550, 0x0)
concourse-web_1 | /tmp/build/9674af12/concourse/src/github.com/concourse/atc/exec/on_success.go:44 +0x4c
concourse-web_1 | github.com/concourse/atc/exec.(*OnSuccessStep).Run(0xc4220ac0f0, 0xc4202769c0, 0xc42296c180, 0xc4220ac0f0, 0x0)
concourse-web_1 | /tmp/build/9674af12/concourse/src/github.com/concourse/atc/exec/on_success.go:57 +0x102
concourse-web_1 | github.com/concourse/atc/exec.(*OnSuccessStep).Run(0xc420143180, 0xc4202769c0, 0xc420276a20, 0x10c4de60, 0xc4225082a0)
concourse-web_1 | /tmp/build/9674af12/concourse/src/github.com/concourse/atc/exec/on_success.go:57 +0x102
concourse-web_1 | github.com/tedsuo/ifrit.(*process).run(0xc420562e80)
concourse-web_1 | /tmp/build/9674af12/concourse/src/github.com/tedsuo/ifrit/process.go:71 +0x49
concourse-web_1 | created by github.com/tedsuo/ifrit.Background
concourse-web_1 | /tmp/build/9674af12/concourse/src/github.com/tedsuo/ifrit/process.go:49 +0x117
concourse-worker_1 | {"timestamp":"1508798718.357427120","source":"worker","message":"worker.beacon.restarting","log_level":2,"data":{"error":"wait: remote command exited without exit status or exit signal","session":"4"}}
Expected result:
An error indicating that either the file or the config parameter is missing, instead of a crash.
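To illustrate the kind of check I would expect, here is a hedged sketch (not Concourse's actual code; the `TaskStep` type and field names are assumptions for illustration) of validating the step up front so a nil config source never reaches FetchConfig:

```go
package main

import (
	"errors"
	"fmt"
)

// Illustrative only: a task step as parsed from the pipeline YAML.
// Field names are assumptions, not Concourse's actual types.
type TaskStep struct {
	File   string                 // `file:` parameter
	Config map[string]interface{} // inline `config:` block
}

// validateTaskConfigSource returns an error instead of letting a nil
// config source reach FetchConfig and panic the whole web node.
func validateTaskConfigSource(step TaskStep) error {
	if step.File == "" && step.Config == nil {
		return errors.New("task step must specify either `file` or `config`")
	}
	return nil
}

func main() {
	err := validateTaskConfigSource(TaskStep{})
	fmt.Println(err)
}
```

Ideally a check like this would run at pipeline set time (fly set-pipeline), so the broken pipeline is rejected before it can ever be scheduled.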
- Concourse version: Occurs in 3.3.2 up to and including 3.5.0
- Deployment type (BOSH/Docker/binary): Docker
- Infrastructure/IaaS: Docker-compose
- Browser (if applicable): -
- Did this used to work? No