CI: K8sVerifier Runs the kernel verifier against Cilium's BPF datapath: libbpf: Error in bpf_object__probe_loading():Operation not permitted(1) #20724

@joestringer

Description

Test Name

K8sVerifier Runs the kernel verifier against Cilium's BPF datapath

Failure Output

/home/jenkins/workspace/Cilium-PR-K8s-1.16-kernel-4.9/src/github.com/cilium/cilium/test/ginkgo-ext/scopes.go:527
Failed to load BPF program bpf_lxc with datapath configuration:
-DSKIP_DEBUG=1 -DENABLE_IPV4=1 -DENABLE_IPV6=1 -DENABLE_SOCKET_LB_TCP=1 -DENABLE_SOCKET_LB_UDP=1 -DENABLE_ROUTING=1 -DNO_REDIRECT=1 -DPOLICY_VERDICT_NOTIFY=1 -DALLOW_ICMP_FRAG_NEEDED=1 -DENABLE_IDENTITY_MARK=1 -DMONITOR_AGGREGATION=3 -DCT_REPORT_FLAGS=0x0002 -DENABLE_HOST_FIREWALL=1 -DENABLE_ICMP_RULE=1 -DENABLE_CUSTOM_CALLS=1 -DENABLE_IPSEC=1 -DIP_POOLS=1 -DENCAP_IFINDEX=1 -DTUNNEL_MODE=1
Expected command: kubectl exec -n default test-verifier -- env TC_PROGS="" XDP_PROGS="" CG_PROGS="" TC_PROGS="bpf_lxc" ./test/bpf/verifier-test.sh 
To succeed, but it failed:
Exitcode: 1 
Err: exit status 1
Stdout:
 	 => Loading bpf_lxc.c:from-container...
	 
Stderr:
 	 libbpf: loading ./test/bpf/../../bpf/bpf_lxc.o
...
	 libbpf: Error in bpf_object__probe_loading():Operation not permitted(1). Couldn't load trivial BPF program. Make sure your kernel supports BPF (CONFIG_BPF_SYSCALL=y) and/or that RLIMIT_MEMLOCK is set to big enough value.
	 libbpf: failed to load object './test/bpf/../../bpf/bpf_lxc.o'
	 Unable to load program
	 command terminated with exit code 1
	 

/home/jenkins/workspace/Cilium-PR-K8s-1.16-kernel-4.9/src/github.com/cilium/cilium/test/k8s/verifier.go:178
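The libbpf error above names two usual suspects: a kernel built without `CONFIG_BPF_SYSCALL`, or an `RLIMIT_MEMLOCK` that is too low (on pre-5.11 kernels such as the 4.9 kernel in this run, BPF object memory is charged against the locked-memory rlimit, so a small limit surfaces as `EPERM`). A minimal diagnostic sketch, not taken from the issue; the pod name `test-verifier` and namespace `default` come from the log above, while the kernel config path is an assumption that may differ on the CI image:

```shell
# Hedged sketch: check the two causes named in the libbpf error message.

# 1. Inspect the locked-memory limit for the current shell; a low value
#    (e.g. 64 KiB) can cause EPERM when loading BPF objects on 4.9 kernels.
ulimit -l

# 2. Raise it before loading BPF programs. Exceeding the hard limit needs
#    root or CAP_SYS_RESOURCE, so tolerate failure here.
ulimit -l unlimited 2>/dev/null || true

# 3. Confirm the kernel exposes the bpf() syscall (config path is an
#    assumption; it may live elsewhere, e.g. /proc/config.gz).
grep CONFIG_BPF_SYSCALL "/boot/config-$(uname -r)" 2>/dev/null || true
```

The same checks could be run inside the failing pod, e.g. `kubectl exec -n default test-verifier -- sh -c 'ulimit -l'`, to see the limits the verifier test actually runs under.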

Stack Trace

/home/jenkins/workspace/Cilium-PR-K8s-1.16-kernel-4.9/src/github.com/cilium/cilium/test/ginkgo-ext/scopes.go:527
Failed to load BPF program bpf_lxc with datapath configuration:
-DSKIP_DEBUG=1 -DENABLE_IPV4=1 -DENABLE_IPV6=1 -DENABLE_SOCKET_LB_TCP=1 -DENABLE_SOCKET_LB_UDP=1 -DENABLE_ROUTING=1 -DNO_REDIRECT=1 -DPOLICY_VERDICT_NOTIFY=1 -DALLOW_ICMP_FRAG_NEEDED=1 -DENABLE_IDENTITY_MARK=1 -DMONITOR_AGGREGATION=3 -DCT_REPORT_FLAGS=0x0002 -DENABLE_HOST_FIREWALL=1 -DENABLE_ICMP_RULE=1 -DENABLE_CUSTOM_CALLS=1 -DENABLE_IPSEC=1 -DIP_POOLS=1 -DENCAP_IFINDEX=1 -DTUNNEL_MODE=1
Expected command: kubectl exec -n default test-verifier -- env TC_PROGS="" XDP_PROGS="" CG_PROGS="" TC_PROGS="bpf_lxc" ./test/bpf/verifier-test.sh 
To succeed, but it failed:
Exitcode: 1 
Err: exit status 1
Stdout:
 	 => Loading bpf_lxc.c:from-container...

Standard Output

Empty

Standard Error

=== Test Finished at 2022-07-30T02:11:15Z====
===================== TEST FAILED =====================
02:11:15 STEP: Running AfterFailed block for EntireTestsuite K8sVerifier
Name:               k8s1
Roles:              master
Labels:             beta.kubernetes.io/arch=amd64
                    beta.kubernetes.io/os=linux
                    cilium.io/ci-node=k8s1
                    kubernetes.io/arch=amd64
                    kubernetes.io/hostname=k8s1
                    kubernetes.io/os=linux
                    node-role.kubernetes.io/master=
Annotations:        io.cilium.network.ipv4-cilium-host: 10.0.1.20
                    io.cilium.network.ipv4-health-ip: 10.0.1.12
                    io.cilium.network.ipv4-pod-cidr: 10.0.1.0/24
                    io.cilium.network.ipv6-cilium-host: fd02::111
                    io.cilium.network.ipv6-health-ip: fd02::133
                    io.cilium.network.ipv6-pod-cidr: fd02::100/120
                    kubeadm.alpha.kubernetes.io/cri-socket: /var/run/dockershim.sock
                    node.alpha.kubernetes.io/ttl: 0
                    volumes.kubernetes.io/controller-managed-attach-detach: true
CreationTimestamp:  Sat, 30 Jul 2022 01:26:52 +0000
Taints:             <none>
Unschedulable:      false
Conditions:
  Type                 Status  LastHeartbeatTime                 LastTransitionTime                Reason                       Message
  ----                 ------  -----------------                 ------------------                ------                       -------
  NetworkUnavailable   False   Sat, 30 Jul 2022 01:31:08 +0000   Sat, 30 Jul 2022 01:31:08 +0000   CiliumIsUp                   Cilium is running on this node
  MemoryPressure       False   Sat, 30 Jul 2022 02:11:12 +0000   Sat, 30 Jul 2022 01:26:49 +0000   KubeletHasSufficientMemory   kubelet has sufficient memory available
  DiskPressure         False   Sat, 30 Jul 2022 02:11:12 +0000   Sat, 30 Jul 2022 01:26:49 +0000   KubeletHasNoDiskPressure     kubelet has no disk pressure
  PIDPressure          False   Sat, 30 Jul 2022 02:11:12 +0000   Sat, 30 Jul 2022 01:26:49 +0000   KubeletHasSufficientPID      kubelet has sufficient PID available
  Ready                True    Sat, 30 Jul 2022 02:11:12 +0000   Sat, 30 Jul 2022 01:31:02 +0000   KubeletReady                 kubelet is posting ready status. AppArmor enabled
Addresses:
  InternalIP:  192.168.56.11
  Hostname:    k8s1
Capacity:
 cpu:                3
 ephemeral-storage:  80755192Ki
 hugepages-2Mi:      0
 memory:             8173044Ki
 pods:               110
Allocatable:
 cpu:                3
 ephemeral-storage:  74423984824
 hugepages-2Mi:      0
 memory:             8070644Ki
 pods:               110
System Info:
 Machine ID:                 db578da84fc844c3ae20dc6ec27d6c64
 System UUID:                9295A42B-41D2-C34B-97C9-E62F894EDB39
 Boot ID:                    c4a4d134-d4b3-4d3a-9ee9-ca914be79620
 Kernel Version:             4.9.270-0409270-generic
 OS Image:                   Ubuntu 20.04.4 LTS
 Operating System:           linux
 Architecture:               amd64
 Container Runtime Version:  docker://20.10.17
 Kubelet Version:            v1.16.15
 Kube-Proxy Version:         v1.16.15
PodCIDR:                     10.10.0.0/24
PodCIDRs:                    10.10.0.0/24
Non-terminated Pods:         (8 in total)
  Namespace                  Name                            CPU Requests  CPU Limits  Memory Requests  Memory Limits  AGE
  ---------                  ----                            ------------  ----------  ---------------  -------------  ---
  default                    test-verifier                   0 (0%)        0 (0%)      0 (0%)           0 (0%)         33s
  kube-system                etcd-k8s1                       0 (0%)        0 (0%)      0 (0%)           0 (0%)         43m
  kube-system                kube-apiserver-k8s1             250m (8%)     0 (0%)      0 (0%)           0 (0%)         43m
  kube-system                kube-controller-manager-k8s1    200m (6%)     0 (0%)      0 (0%)           0 (0%)         43m
  kube-system                kube-proxy-bc7vc                0 (0%)        0 (0%)      0 (0%)           0 (0%)         44m
  kube-system                kube-scheduler-k8s1             100m (3%)     0 (0%)      0 (0%)           0 (0%)         43m
  kube-system                log-gatherer-dfhcz              0 (0%)        0 (0%)      0 (0%)           0 (0%)         40m
  kube-system                registry-adder-b7f6b            0 (0%)        0 (0%)      0 (0%)           0 (0%)         41m
Allocated resources:
  (Total limits may be over 100 percent, i.e., overcommitted.)
  Resource           Requests    Limits
  --------           --------    ------
  cpu                550m (18%)  0 (0%)
  memory             0 (0%)      0 (0%)
  ephemeral-storage  0 (0%)      0 (0%)
Events:
  Type    Reason                   Age                From              Message
  ----    ------                   ----               ----              -------
  Normal  NodeHasSufficientMemory  44m (x8 over 44m)  kubelet, k8s1     Node k8s1 status is now: NodeHasSufficientMemory
  Normal  NodeHasNoDiskPressure    44m (x8 over 44m)  kubelet, k8s1     Node k8s1 status is now: NodeHasNoDiskPressure
  Normal  NodeHasSufficientPID     44m (x7 over 44m)  kubelet, k8s1     Node k8s1 status is now: NodeHasSufficientPID
  Normal  Starting                 43m                kube-proxy, k8s1  Starting kube-proxy.


Name:               k8s2
Roles:              <none>
Labels:             beta.kubernetes.io/arch=amd64
                    beta.kubernetes.io/os=linux
                    cilium.io/ci-node=k8s2
                    kubernetes.io/arch=amd64
                    kubernetes.io/hostname=k8s2
                    kubernetes.io/os=linux
Annotations:        io.cilium.network.ipv4-cilium-host: 10.0.0.177
                    io.cilium.network.ipv4-health-ip: 10.0.0.155
                    io.cilium.network.ipv4-pod-cidr: 10.0.0.0/24
                    io.cilium.network.ipv6-cilium-host: fd02::2f
                    io.cilium.network.ipv6-health-ip: fd02::b4
                    io.cilium.network.ipv6-pod-cidr: fd02::/120
                    kubeadm.alpha.kubernetes.io/cri-socket: /var/run/dockershim.sock
                    node.alpha.kubernetes.io/ttl: 0
                    volumes.kubernetes.io/controller-managed-attach-detach: true
CreationTimestamp:  Sat, 30 Jul 2022 01:30:01 +0000
Taints:             <none>
Unschedulable:      false
Conditions:
  Type                 Status  LastHeartbeatTime                 LastTransitionTime                Reason                       Message
  ----                 ------  -----------------                 ------------------                ------                       -------
  NetworkUnavailable   False   Sat, 30 Jul 2022 01:31:53 +0000   Sat, 30 Jul 2022 01:31:53 +0000   CiliumIsUp                   Cilium is running on this node
  MemoryPressure       False   Sat, 30 Jul 2022 02:11:03 +0000   Sat, 30 Jul 2022 01:30:01 +0000   KubeletHasSufficientMemory   kubelet has sufficient memory available
  DiskPressure         False   Sat, 30 Jul 2022 02:11:03 +0000   Sat, 30 Jul 2022 01:30:01 +0000   KubeletHasNoDiskPressure     kubelet has no disk pressure
  PIDPressure          False   Sat, 30 Jul 2022 02:11:03 +0000   Sat, 30 Jul 2022 01:30:01 +0000   KubeletHasSufficientPID      kubelet has sufficient PID available
  Ready                True    Sat, 30 Jul 2022 02:11:03 +0000   Sat, 30 Jul 2022 01:31:01 +0000   KubeletReady                 kubelet is posting ready status. AppArmor enabled
Addresses:
  InternalIP:  192.168.56.12
  Hostname:    k8s2
Capacity:
 cpu:                3
 ephemeral-storage:  80755192Ki
 hugepages-2Mi:      0
 memory:             8173044Ki
 pods:               110
Allocatable:
 cpu:                3
 ephemeral-storage:  74423984824
 hugepages-2Mi:      0
 memory:             8070644Ki
 pods:               110
System Info:
 Machine ID:                 db578da84fc844c3ae20dc6ec27d6c64
 System UUID:                D991E4AA-99AC-DE4C-964B-0B0D56E0B676
 Boot ID:                    318cd6a1-6519-4709-93a7-0d537429e9cf
 Kernel Version:             4.9.270-0409270-generic
 OS Image:                   Ubuntu 20.04.4 LTS
 Operating System:           linux
 Architecture:               amd64
 Container Runtime Version:  docker://20.10.17
 Kubelet Version:            v1.16.15
 Kube-Proxy Version:         v1.16.15
PodCIDR:                     10.10.1.0/24
PodCIDRs:                    10.10.1.0/24
Non-terminated Pods:         (6 in total)
  Namespace                  Name                          CPU Requests  CPU Limits  Memory Requests  Memory Limits  AGE
  ---------                  ----                          ------------  ----------  ---------------  -------------  ---
  cilium-monitoring          grafana-7fd557d749-f7zbr      0 (0%)        0 (0%)      0 (0%)           0 (0%)         40m
  cilium-monitoring          prometheus-d87f8f984-5xbwz    0 (0%)        0 (0%)      0 (0%)           0 (0%)         40m
  kube-system                coredns-8cfc78c54-fs5dl       100m (3%)     0 (0%)      70Mi (0%)        170Mi (2%)     71s
  kube-system                kube-proxy-6zw7t              0 (0%)        0 (0%)      0 (0%)           0 (0%)         41m
  kube-system                log-gatherer-pfvq6            0 (0%)        0 (0%)      0 (0%)           0 (0%)         40m
  kube-system                registry-adder-ljnss          0 (0%)        0 (0%)      0 (0%)           0 (0%)         41m
Allocated resources:
  (Total limits may be over 100 percent, i.e., overcommitted.)
  Resource           Requests   Limits
  --------           --------   ------
  cpu                100m (3%)  0 (0%)
  memory             70Mi (0%)  170Mi (2%)
  ephemeral-storage  0 (0%)     0 (0%)
Events:
  Type    Reason                   Age                From              Message
  ----    ------                   ----               ----              -------
  Normal  Starting                 41m                kubelet, k8s2     Starting kubelet.
  Normal  NodeHasSufficientMemory  41m (x2 over 41m)  kubelet, k8s2     Node k8s2 status is now: NodeHasSufficientMemory
  Normal  NodeHasNoDiskPressure    41m (x2 over 41m)  kubelet, k8s2     Node k8s2 status is now: NodeHasNoDiskPressure
  Normal  NodeHasSufficientPID     41m (x2 over 41m)  kubelet, k8s2     Node k8s2 status is now: NodeHasSufficientPID
  Normal  NodeAllocatableEnforced  41m                kubelet, k8s2     Updated Node Allocatable limit across pods
  Normal  Starting                 40m                kube-proxy, k8s2  Starting kube-proxy.
  Normal  NodeReady                40m                kubelet, k8s2     Node k8s2 status is now: NodeReady

Name:         test-verifier
Namespace:    default
Priority:     0
Node:         k8s1/192.168.56.11
Start Time:   Sat, 30 Jul 2022 02:10:42 +0000
Labels:       <none>
Annotations:  kubectl.kubernetes.io/last-applied-configuration:
                {"apiVersion":"v1","kind":"Pod","metadata":{"annotations":{},"name":"test-verifier","namespace":"default"},"spec":{"containers":[{"args":[...
Status:       Running
IP:           192.168.56.11
IPs:
  IP:  192.168.56.11
Containers:
  cilium-builder:
    Container ID:  docker://d685398cc1d6927128a236a4e32e801b950854a89f2f8569bd506743e9b94cbf
    Image:         quay.io/cilium/test-verifier:be21913942d60e366d74a74e8ee71ccae03d6d82@sha256:bae938f0b617856b411f62a9a6fac08511adc50ba3236c040ca735562573476c
    Image ID:      docker-pullable://quay.io/cilium/test-verifier@sha256:bae938f0b617856b411f62a9a6fac08511adc50ba3236c040ca735562573476c
    Port:          <none>
    Host Port:     <none>
    Command:
      sleep
    Args:
      1000h
    State:          Running
      Started:      Sat, 30 Jul 2022 02:11:00 +0000
    Ready:          True
    Restart Count:  0
    Environment:    <none>
    Mounts:
      /cilium from cilium-src (rw)
      /sys/fs/bpf from bpf-maps (rw)
      /var/run/secrets/kubernetes.io/serviceaccount from default-token-pk9w4 (ro)
Conditions:
  Type              Status
  Initialized       True 
  Ready             True 
  ContainersReady   True 
  PodScheduled      True 
Volumes:
  bpf-maps:
    Type:          HostPath (bare host directory volume)
    Path:          /sys/fs/bpf
    HostPathType:  DirectoryOrCreate
  cilium-src:
    Type:          HostPath (bare host directory volume)
    Path:          /home/vagrant/go/src/github.com/cilium/cilium
    HostPathType:  Directory
  default-token-pk9w4:
    Type:        Secret (a volume populated by a Secret)
    SecretName:  default-token-pk9w4
    Optional:    false
QoS Class:       BestEffort
Node-Selectors:  cilium.io/ci-node=k8s1
Tolerations:     node.kubernetes.io/not-ready
                 node.kubernetes.io/unreachable
Events:
  Type    Reason     Age        From               Message
  ----    ------     ----       ----               -------
  Normal  Scheduled  <unknown>  default-scheduler  Successfully assigned default/test-verifier to k8s1
  Normal  Pulling    32s        kubelet, k8s1      Pulling image "quay.io/cilium/test-verifier:be21913942d60e366d74a74e8ee71ccae03d6d82@sha256:bae938f0b617856b411f62a9a6fac08511adc50ba3236c040ca735562573476c"
  Normal  Pulled     15s        kubelet, k8s1      Successfully pulled image "quay.io/cilium/test-verifier:be21913942d60e366d74a74e8ee71ccae03d6d82@sha256:bae938f0b617856b411f62a9a6fac08511adc50ba3236c040ca735562573476c"
  Normal  Created    15s        kubelet, k8s1      Created container cilium-builder
  Normal  Started    15s        kubelet, k8s1      Started container cilium-builder

02:11:15 STEP: Collecting bpf_*.o artifacts
===================== Exiting AfterFailed =====================
02:11:15 STEP: Running AfterEach for block EntireTestsuite

[[ATTACHMENT|8c2f82b0_K8sVerifier_Runs_the_kernel_verifier_against_Cilium's_BPF_datapath.zip]]
02:11:15 STEP: Running AfterAll block for EntireTestsuite K8sVerifier

Resources

Anything else?

No response

Labels

area/CI: Continuous Integration testing issue or flake
ci/flake: This is a known failure that occurs in the tree. Please investigate me!
kind/duplicate: There is another issue which contains additional details.
