integration: run build session tests on non-experimental by thaJeztah · Pull Request #39554 · moby/moby


Merged
merged 1 commit on Jul 19, 2019

Conversation

@thaJeztah (Member) commented Jul 17, 2019

The session endpoint is no longer experimental since 01c9e70 (#37686), so we don't need to start an experimental daemon.

make TESTFLAGS='-test.run ^(TestSession|TestBuildWithSession)' test-integration   

Running /go/src/github.com/docker/docker/integration/build
INFO: Testing against a local daemon
=== RUN   TestBuildWithSession
--- PASS: TestBuildWithSession (6.50s)
PASS

...

Running /go/src/github.com/docker/docker/integration/session
INFO: Testing against a local daemon
=== RUN   TestSessionCreate
--- PASS: TestSessionCreate (0.02s)
=== RUN   TestSessionCreateWithBadUpgrade
--- PASS: TestSessionCreateWithBadUpgrade (0.03s)
PASS
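
For context, a minimal sketch of the kind of test-setup change described above, assuming the test previously started a dedicated daemon with the experimental option via the integration test helpers; the helper names and the before/after shapes shown here are assumptions for illustration, not the verbatim diff:

package build // sketch: alongside the integration/build tests, where testEnv is defined

import "testing"

// Sketch of TestBuildWithSession: before this change the test (hypothetically)
// had to create and start its own daemon with the experimental flag, e.g.
//
//     d := daemon.New(t, daemon.WithExperimental)
//     d.StartWithBusybox(t)
//     defer d.Stop(t)
//     client := d.NewClientT(t)
//
// With the session endpoint no longer experimental, the shared test daemon
// and its client are enough:
func TestBuildWithSession(t *testing.T) {
	client := testEnv.APIClient()
	_ = client // ... the rest of the test body is unchanged
}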

@thaJeztah (Member Author)

arf

21:45:41 integration/build/build_session_test.go:26:2:warning: should merge variable declaration with assignment on next line (S1021) (gosimple)
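
For reference, S1021 fires when a variable declaration is immediately followed by its assignment; the fix is to merge the two. A generic illustration of the lint, not the actual lines from build_session_test.go:

package main

import "fmt"

func main() {
	// What gosimple S1021 flags: a declaration whose only purpose is to be
	// assigned on the very next line, e.g.
	//
	//     var greeting string
	//     greeting = "hello"
	//
	// The suggested fix is a single short variable declaration:
	greeting := "hello"
	fmt.Println(greeting)
}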

@thaJeztah force-pushed the session_not_experimental branch from 13103dd to bb8771e on July 17, 2019 at 21:55
The session endpoint is no longer experimental since
01c9e70, so we don't
need to start an experimental daemon.

Signed-off-by: Sebastiaan van Stijn <github@gone.nl>
@thaJeztah force-pushed the session_not_experimental branch from bb8771e to becd29c on July 17, 2019 at 21:56
@thaJeztah (Member Author)

Tests passed https://jenkins.dockerproject.org/job/Docker-PRs/54922/console

22:07:58 === RUN   TestBuildWithSession
22:08:03 --- PASS: TestBuildWithSession (5.40s)

22:20:04 Running /go/src/github.com/docker/docker/integration/session
22:20:06 INFO: Testing against a local daemon
22:20:06 === RUN   TestSessionCreate
22:20:06 --- PASS: TestSessionCreate (0.04s)
22:20:06 === RUN   TestSessionCreateWithBadUpgrade
22:20:06 --- PASS: TestSessionCreateWithBadUpgrade (0.02s)
22:20:06 PASS

@thaJeztah (Member Author)

Experimental job failing on known flaky tests: https://jenkins.dockerproject.org/job/Docker-PRs-experimental/46021/console

22:55:19 FAIL: docker_api_swarm_test.go:292: DockerSwarmSuite.TestAPISwarmLeaderElection
23:11:12 FAIL: docker_cli_swarm_test.go:1317: DockerSwarmSuite.TestSwarmClusterRotateUnlockKey

s390x is also failing on a known flaky test:

23:16:53 FAIL: docker_api_swarm_service_test.go:96: DockerSwarmSuite.TestAPISwarmServicesMultipleAgents

@thaJeztah (Member Author) commented Jul 18, 2019


Janky https://jenkins.dockerproject.org/job/Docker-PRs/54922/console

Haven't seen that one failing for a while; it was flaky once (#24805), which got addressed by specifying a lower restart delay (#25085).
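
For reference, lowering the restart delay comes down to setting the Delay field of the service's restart policy; a minimal sketch using the docker API swarm types (the value and function name here are illustrative, not the code from #25085):

package example

import (
	"time"

	"github.com/docker/docker/api/types/swarm"
)

// specWithShortRestartDelay returns a service spec whose tasks restart after
// a short, explicit delay, so tests observe rescheduling quickly instead of
// waiting out the default restart delay.
func specWithShortRestartDelay() swarm.ServiceSpec {
	delay := 100 * time.Millisecond
	return swarm.ServiceSpec{
		TaskTemplate: swarm.TaskSpec{
			RestartPolicy: &swarm.RestartPolicy{
				Condition: swarm.RestartPolicyConditionAny,
				Delay:     &delay,
			},
		},
	}
}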

23:20:52 FAIL: docker_api_swarm_service_test.go:540: DockerSwarmSuite.TestAPISwarmServicesStateReporting
23:20:52 
23:20:52 Creating a new daemon
23:20:52 [dcab3128044dd] waiting for daemon to start
23:20:52 [dcab3128044dd] waiting for daemon to start
23:20:52 [dcab3128044dd] daemon started
23:20:52 
23:20:52 Creating a new daemon
23:20:52 [db98fdf7bb806] waiting for daemon to start
23:20:52 [db98fdf7bb806] waiting for daemon to start
23:20:52 [db98fdf7bb806] daemon started
23:20:52 
23:20:52 Creating a new daemon
23:20:52 [d961ca0651f7e] waiting for daemon to start
23:20:52 [d961ca0651f7e] waiting for daemon to start
23:20:52 [d961ca0651f7e] daemon started
23:20:52 
23:20:52 waited for 2.500192249s (out of 30s)
23:20:52 waited for 703.203711ms (out of 30s)
23:20:52 waited for 82.890831ms (out of 30s)
23:20:52 assertion failed: expression is false: containers2[i] == nil
23:20:52 [dcab3128044dd] Stopping daemon
23:20:52 [dcab3128044dd] exiting daemon
23:20:52 [dcab3128044dd] Daemon stopped
23:20:52 [db98fdf7bb806] Stopping daemon
23:20:52 [db98fdf7bb806] exiting daemon
23:20:52 [db98fdf7bb806] Daemon stopped
23:20:52 [d961ca0651f7e] Stopping daemon
23:20:52 [d961ca0651f7e] exiting daemon
23:20:52 [d961ca0651f7e] Daemon stopped

Daemon logs for that test:

d961ca0651f7e.log

time="2019-07-17T23:20:34.810169363Z" level=error msg="error in agentInit: failed to create memberlist: Could not set up network transport: failed to obtain an address: Failed to start TCP listener on \"0.0.0.0\" port 7946: listen tcp 0.0.0.0:7946: bind: address already in use"
time="2019-07-17T23:20:40.415187675Z" level=error msg="agent: session failed" backoff=100ms error="rpc error: code = Aborted desc = node must disconnect" module=node/agent node.id=op5dut8u400t4q2jewe0npacg
time="2019-07-17T23:20:40.479358934Z" level=error msg="agent: session failed" backoff=300ms error="rpc error: code = Unavailable desc = all SubConns are in TransientFailure, latest connection error: connection error: desc = \"transport: Error while dialing dial tcp 127.0.0.1:2477: connect: connection refused\"" module=node/agent node.id=op5dut8u400t4q2jewe0npacg
time="2019-07-17T23:20:40.761682725Z" level=error msg="agent: 
8000
session failed" backoff=700ms error="rpc error: code = Unavailable desc = all SubConns are in TransientFailure, latest connection error: connection error: desc = \"transport: Error while dialing dial tcp 127.0.0.1:2477: connect: connection refused\"" module=node/agent node.id=op5dut8u400t4q2jewe0npacg
time="2019-07-17T23:20:40.800009919Z" level=error msg="agent: session failed" backoff=1.5s error="rpc error: code = Unavailable desc = all SubConns are in TransientFailure, latest connection error: connection error: desc = \"transport: Error while dialing dial tcp 127.0.0.1:2477: connect: connection refused\"" module=node/agent node.id=op5dut8u400t4q2jewe0npacg
time="2019-07-17T23:20:41.107974674Z" level=error msg="agent: session failed" backoff=3.1s error="rpc error: code = Unavailable desc = all SubConns are in TransientFailure, latest connection error: connection error: desc = \"transport: Error while dialing dial tcp 127.0.0.1:2477: connect: connection refused\"" module=node/agent node.id=op5dut8u400t4q2jewe0npacg
time="2019-07-17T23:20:41.703801899Z" level=error msg="agent: session failed" backoff=6.3s error="rpc error: code = Unavailable desc = all SubConns are in TransientFailure, latest connection error: connection error: desc = \"transport: Error while dialing dial tcp 127.0.0.1:2477: connect: connection refused\"" module=node/agent node.id=op5dut8u400t4q2jewe0npacg
time="2019-07-17T23:20:43.788326745Z" level=error msg="agent: session failed" backoff=8s error="rpc error: code = Unavailable desc = all SubConns are in TransientFailure, latest connection error: connection error: desc = \"transport: Error while dialing dial tcp 127.0.0.1:2477: connect: connection refused\"" module=node/agent node.id=op5dut8u400t4q2jewe0npacg
time="2019-07-17T23:20:47.268634894Z" level=error msg="status reporter failed to report status to agent" error="context canceled" module=node/agent node.id=op5dut8u400t4q2jewe0npacg
time="2019-07-17T23:20:47.268664714Z" level=error msg="status reporter failed to report status to agent" error="context canceled" module=node/agent node.id=op5dut8u400t4q2jewe0npacg

db98fdf7bb806.log

time="2019-07-17T23:20:33.945683767Z" level=error msg="error in agentInit: failed to create memberlist: Could not set up network transport: failed to obtain an address: Failed to start TCP listener on \"0.0.0.0\" port 7946: listen tcp 0.0.0.0:7946: bind: address already in use"
time="2019-07-17T23:20:33.948391809Z" level=error msg="error in agentInit: failed to create memberlist: Could not set up network transport: failed to obtain an address: Failed to start TCP listener on \"0.0.0.0\" port 7946: listen tcp 0.0.0.0:7946: bind: address already in use"
time="2019-07-17T23:20:40.417872146Z" level=error msg="agent: session failed" backoff=100ms error=EOF module=node/agent node.id=86sy6alx2eyhhzlbmhhm6bf5a
time="2019-07-17T23:20:40.462436704Z" level=error msg="agent: session failed" backoff=300ms error="rpc error: code = Unavailable desc = all SubConns are in TransientFailure, latest connection error: connection error: desc = \"transport: Error while dialing dial tcp 127.0.0.1:2477: connect: connection refused\"" module=node/agent node.id=86sy6alx2eyhhzlbmhhm6bf5a
time="2019-07-17T23:20:40.524600236Z" level=error msg="agent: session failed" backoff=700ms error="rpc error: code = Unavailable desc = all SubConns are in TransientFailure, latest connection error: connection error: desc = \"transport: Error while dialing dial tcp 127.0.0.1:2477: connect: connection refused\"" module=node/agent node.id=86sy6alx2eyhhzlbmhhm6bf5a
time="2019-07-17T23:20:40.768087626Z" level=error msg="agent: session failed" backoff=1.5s error="rpc error: code = Unavailable desc = all SubConns are in TransientFailure, latest connection error: connection error: desc = \"transport: Error while dialing dial tcp 127.0.0.1:2477: connect: connection refused\"" module=node/agent node.id=86sy6alx2eyhhzlbmhhm6bf5a
time="2019-07-17T23:20:42.106996893Z" level=error msg="agent: session failed" backoff=3.1s error="rpc error: code = Unavailable desc = all SubConns are in TransientFailure, latest connection error: connection error: desc = \"transport: Error while dialing dial tcp 127.0.0.1:2477: connect: connection refused\"" module=node/agent node.id=86sy6alx2eyhhzlbmhhm6bf5a
time="2019-07-17T23:20:42.802584481Z" level=error msg="agent: session failed" backoff=6.3s error="rpc error: code = Unavailable desc = all SubConns are in TransientFailure, latest connection error: connection error: desc = \"transport: Error while dialing dial tcp 127.0.0.1:2477: connect: connection refused\"" module=node/agent node.id=86sy6alx2eyhhzlbmhhm6bf5a
time="2019-07-17T23:20:46.088249801Z" level=error msg="agent: session failed" backoff=8s error="rpc error: code = Unavailable desc = all SubConns are in TransientFailure, latest connection error: connection error: desc = \"transport: Error while dialing dial tcp 127.0.0.1:2477: connect: connection refused\"" module=node/agent node.id=86sy6alx2eyhhzlbmhhm6bf5a
time="2019-07-17T23:20:46.134246531Z" level=error msg="status reporter failed to report status to agent" error="context canceled" module=node/agent node.id=86sy6alx2eyhhzlbmhhm6bf5a

dcab3128044dd.log

time="2019-07-17T23:20:32.289999868Z" level=error msg="error reading the kernel parameter net.ipv4.neigh.default.gc_thresh3" error="open /proc/sys/net/ipv4/neigh/default/gc_thresh3: no such file or directory"
time="2019-07-17T23:20:32.290014076Z" level=error msg="error reading the kernel parameter net.ipv4.neigh.default.gc_thresh1" error="open /proc/sys/net/ipv4/neigh/default/gc_thresh1: no such file or directory"
time="2019-07-17T23:20:32.290026235Z" level=error msg="error reading the kernel parameter net.ipv4.neigh.default.gc_thresh2" error="open /proc/sys/net/ipv4/neigh/default/gc_thresh2: no such file or directory"
time="2019-07-17T23:20:38.518117736Z" level=error msg="fatal task error" error="task: non-zero exit (143)" module=node/agent/taskmanager node.id=676o09ls5f2kw8r3h1qgszh10 service.id=patuniobyo9ixx44cvpsrerj0 task.id=adv1u6gucfwrsdryzzs3cqf5r
time="2019-07-17T23:20:40.324850670Z" level=error msg="Leaving cluster with 1 managers left out of 2. Raft quorum will be lost."
time="2019-07-17T23:20:40.411480311Z" level=error msg="fatal task error" error="task: non-zero exit (137)" module=node/agent/taskmanager node.id=676o09ls5f2kw8r3h1qgszh10 service.id=patuniobyo9ixx44cvpsrerj0 task.id=mvvnxzkji8vgh5pddq1qqmhw8
time="2019-07-17T23:20:40.413778428Z" level=error msg="failed to remove node" error="rpc error: code = Aborted desc = dispatcher is stopped" method="(*Dispatcher).Session" node.id=86sy6alx2eyhhzlbmhhm6bf5a node.session=j6z5eu5xjnfigoj8vru4vcwpz
time="2019-07-17T23:20:40.414168582Z" level=error msg="failed to remove node" error="rpc error: code = Aborted desc = dispatcher is stopped" method="(*Dispatcher).Session" node.id=op5dut8u400t4q2jewe0npacg node.session=8rwcqerqsxequdkwa0r787qsd
time="2019-07-17T23:20:40.416726596Z" level=error msg="failed to receive changes from store watch API" error="rpc error: code = Unknown desc = context canceled"

@thaJeztah (Member Author)

Janky failure on https://jenkins.dockerproject.org/job/Docker-PRs/54926/console

08:51:34 FAIL: docker_api_swarm_test.go:362: DockerSwarmSuite.TestAPISwarmRaftQuorum

@kolyshkin (Contributor) left a comment


LGTM; this is actually a good thing if it accidentally increases flakiness in the swarm tests, as it gives us more motivation to attack those.

@thaJeztah (Member Author) commented Jul 18, 2019

Hm... Docker Hub making a whoopsie?
https://jenkins.dockerproject.org/job/Docker-PRs/54930/console

17:35:15 FAIL: docker_cli_network_unix_test.go:1787: DockerNetworkSuite.TestConntrackFlowsLeak
17:35:15 
17:35:15 Creating a new daemon
17:35:15 assertion failed: 
17:35:15 Command:  /usr/local/cli/docker run -d --name server --net testbind -p 8080:8080/udp appropriate/nc sh -c while true; do echo hello | nc -w 1 -lu 8080; done
17:35:15 ExitCode: 125
17:35:15 Error:    exit status 125
17:35:15 Stdout:   
17:35:15 Stderr:   Unable to find image 'appropriate/nc:latest' locally
17:35:15 /usr/local/cli/docker: Error response from daemon: Get https://registry-1.docker.io/v2/appropriate/nc/manifests/latest: received unexpected HTTP status: 500 Internal Server Error.
17:35:15 See '/usr/local/cli/docker run --help'.
17:35:15 
17:35:15 
17:35:15 Failures:
17:35:15 ExitCode was 125 expected 0
17:35:15 Expected no error
