Aug 5 22:10:59.122017 kernel: Linux version 6.6.43-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 13.2.1_p20240210 p14) 13.2.1 20240210, GNU ld (Gentoo 2.41 p5) 2.41.0) #1 SMP PREEMPT_DYNAMIC Mon Aug 5 20:36:27 -00 2024
Aug 5 22:10:59.122044 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=openstack flatcar.autologin verity.usrhash=4a86c72568bc3f74d57effa5e252d5620941ef6d74241fc198859d020a6392c5
Aug 5 22:10:59.122057 kernel: BIOS-provided physical RAM map:
Aug 5 22:10:59.122064 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
Aug 5 22:10:59.122071 kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved
Aug 5 22:10:59.122079 kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved
Aug 5 22:10:59.122089 kernel: BIOS-e820: [mem 0x0000000000100000-0x000000007ffdcfff] usable
Aug 5 22:10:59.122097 kernel: BIOS-e820: [mem 0x000000007ffdd000-0x000000007fffffff] reserved
Aug 5 22:10:59.122104 kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Aug 5 22:10:59.122115 kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved
Aug 5 22:10:59.122123 kernel: NX (Execute Disable) protection: active
Aug 5 22:10:59.122130 kernel: APIC: Static calls initialized
Aug 5 22:10:59.122138 kernel: SMBIOS 2.8 present.
Aug 5 22:10:59.122146 kernel: DMI: OpenStack Foundation OpenStack Nova, BIOS 1.15.0-1 04/01/2014
Aug 5 22:10:59.122156 kernel: Hypervisor detected: KVM
Aug 5 22:10:59.122166 kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Aug 5 22:10:59.122174 kernel: kvm-clock: using sched offset of 8127351814 cycles
Aug 5 22:10:59.122183 kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Aug 5 22:10:59.122192 kernel: tsc: Detected 1996.249 MHz processor
Aug 5 22:10:59.122201 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Aug 5 22:10:59.122210 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Aug 5 22:10:59.124249 kernel: last_pfn = 0x7ffdd max_arch_pfn = 0x400000000
Aug 5 22:10:59.124260 kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 19), built from 8 variable MTRRs
Aug 5 22:10:59.124269 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT
Aug 5 22:10:59.124282 kernel: ACPI: Early table checksum verification disabled
Aug 5 22:10:59.124290 kernel: ACPI: RSDP 0x00000000000F5930 000014 (v00 BOCHS )
Aug 5 22:10:59.124299 kernel: ACPI: RSDT 0x000000007FFE1848 000030 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Aug 5 22:10:59.124307 kernel: ACPI: FACP 0x000000007FFE172C 000074 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Aug 5 22:10:59.124315 kernel: ACPI: DSDT 0x000000007FFE0040 0016EC (v01 BOCHS BXPC 00000001 BXPC 00000001)
Aug 5 22:10:59.124324 kernel: ACPI: FACS 0x000000007FFE0000 000040
Aug 5 22:10:59.124332 kernel: ACPI: APIC 0x000000007FFE17A0 000080 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Aug 5 22:10:59.124340 kernel: ACPI: WAET 0x000000007FFE1820 000028 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Aug 5 22:10:59.124349 kernel: ACPI: Reserving FACP table memory at [mem 0x7ffe172c-0x7ffe179f]
Aug 5 22:10:59.124360 kernel: ACPI: Reserving DSDT table memory at [mem 0x7ffe0040-0x7ffe172b]
Aug 5 22:10:59.124368 kernel: ACPI: Reserving FACS table memory at [mem 0x7ffe0000-0x7ffe003f]
Aug 5 22:10:59.124377 kernel: ACPI: Reserving APIC table memory at [mem 0x7ffe17a0-0x7ffe181f]
Aug 5 22:10:59.124385 kernel: ACPI: Reserving WAET table memory at [mem 0x7ffe1820-0x7ffe1847]
Aug 5 22:10:59.124393 kernel: No NUMA configuration found
Aug 5 22:10:59.124401 kernel: Faking a node at [mem 0x0000000000000000-0x000000007ffdcfff]
Aug 5 22:10:59.124410 kernel: NODE_DATA(0) allocated [mem 0x7ffd7000-0x7ffdcfff]
Aug 5 22:10:59.124422 kernel: Zone ranges:
Aug 5 22:10:59.124433 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff]
Aug 5 22:10:59.124442 kernel: DMA32 [mem 0x0000000001000000-0x000000007ffdcfff]
Aug 5 22:10:59.124451 kernel: Normal empty
Aug 5 22:10:59.124459 kernel: Movable zone start for each node
Aug 5 22:10:59.124468 kernel: Early memory node ranges
Aug 5 22:10:59.124477 kernel: node 0: [mem 0x0000000000001000-0x000000000009efff]
Aug 5 22:10:59.124488 kernel: node 0: [mem 0x0000000000100000-0x000000007ffdcfff]
Aug 5 22:10:59.124497 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000007ffdcfff]
Aug 5 22:10:59.124505 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Aug 5 22:10:59.124514 kernel: On node 0, zone DMA: 97 pages in unavailable ranges
Aug 5 22:10:59.124523 kernel: On node 0, zone DMA32: 35 pages in unavailable ranges
Aug 5 22:10:59.124531 kernel: ACPI: PM-Timer IO Port: 0x608
Aug 5 22:10:59.124540 kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Aug 5 22:10:59.124549 kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Aug 5 22:10:59.124558 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Aug 5 22:10:59.124568 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Aug 5 22:10:59.124577 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Aug 5 22:10:59.124586 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Aug 5 22:10:59.124594 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Aug 5 22:10:59.124603 kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Aug 5 22:10:59.124612 kernel: smpboot: Allowing 2 CPUs, 0 hotplug CPUs
Aug 5 22:10:59.124621 kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
Aug 5 22:10:59.124629 kernel: [mem 0x80000000-0xfeffbfff] available for PCI devices
Aug 5 22:10:59.124638 kernel: Booting paravirtualized kernel on KVM
Aug 5 22:10:59.124647 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Aug 5 22:10:59.124658 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:2 nr_cpu_ids:2 nr_node_ids:1
Aug 5 22:10:59.124667 kernel: percpu: Embedded 58 pages/cpu s196904 r8192 d32472 u1048576
Aug 5 22:10:59.124676 kernel: pcpu-alloc: s196904 r8192 d32472 u1048576 alloc=1*2097152
Aug 5 22:10:59.124685 kernel: pcpu-alloc: [0] 0 1
Aug 5 22:10:59.124693 kernel: kvm-guest: PV spinlocks disabled, no host support
Aug 5 22:10:59.124704 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=openstack flatcar.autologin verity.usrhash=4a86c72568bc3f74d57effa5e252d5620941ef6d74241fc198859d020a6392c5
Aug 5 22:10:59.124713 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space.
Aug 5 22:10:59.124724 kernel: Dentry cache hash table entries: 262144 (order: 9, 2097152 bytes, linear)
Aug 5 22:10:59.124733 kernel: Inode-cache hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Aug 5 22:10:59.124742 kernel: Fallback order for Node 0: 0
Aug 5 22:10:59.124750 kernel: Built 1 zonelists, mobility grouping on. Total pages: 515805
Aug 5 22:10:59.124759 kernel: Policy zone: DMA32
Aug 5 22:10:59.124768 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Aug 5 22:10:59.124777 kernel: Memory: 1965064K/2096620K available (12288K kernel code, 2302K rwdata, 22640K rodata, 49328K init, 2016K bss, 131296K reserved, 0K cma-reserved)
Aug 5 22:10:59.124786 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=2, Nodes=1
Aug 5 22:10:59.124795 kernel: ftrace: allocating 37659 entries in 148 pages
Aug 5 22:10:59.124806 kernel: ftrace: allocated 148 pages with 3 groups
Aug 5 22:10:59.124814 kernel: Dynamic Preempt: voluntary
Aug 5 22:10:59.124823 kernel: rcu: Preemptible hierarchical RCU implementation.
Aug 5 22:10:59.124833 kernel: rcu: RCU event tracing is enabled.
Aug 5 22:10:59.124843 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=2.
Aug 5 22:10:59.124852 kernel: Trampoline variant of Tasks RCU enabled.
Aug 5 22:10:59.124861 kernel: Rude variant of Tasks RCU enabled.
Aug 5 22:10:59.124870 kernel: Tracing variant of Tasks RCU enabled.
Aug 5 22:10:59.124879 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Aug 5 22:10:59.124887 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=2
Aug 5 22:10:59.124898 kernel: NR_IRQS: 33024, nr_irqs: 440, preallocated irqs: 16
Aug 5 22:10:59.124907 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Aug 5 22:10:59.124916 kernel: Console: colour VGA+ 80x25
Aug 5 22:10:59.124924 kernel: printk: console [tty0] enabled
Aug 5 22:10:59.124933 kernel: printk: console [ttyS0] enabled
Aug 5 22:10:59.124942 kernel: ACPI: Core revision 20230628
Aug 5 22:10:59.124951 kernel: APIC: Switch to symmetric I/O mode setup
Aug 5 22:10:59.124960 kernel: x2apic enabled
Aug 5 22:10:59.124968 kernel: APIC: Switched APIC routing to: physical x2apic
Aug 5 22:10:59.124979 kernel: ..TIMER: vector=0x30 apic1=0 pin1=2 apic2=-1 pin2=-1
Aug 5 22:10:59.124988 kernel: tsc: Marking TSC unstable due to TSCs unsynchronized
Aug 5 22:10:59.124997 kernel: Calibrating delay loop (skipped) preset value.. 3992.49 BogoMIPS (lpj=1996249)
Aug 5 22:10:59.125005 kernel: Last level iTLB entries: 4KB 0, 2MB 0, 4MB 0
Aug 5 22:10:59.125014 kernel: Last level dTLB entries: 4KB 0, 2MB 0, 4MB 0, 1GB 0
Aug 5 22:10:59.125023 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Aug 5 22:10:59.125032 kernel: Spectre V2 : Mitigation: Retpolines
Aug 5 22:10:59.125041 kernel: Spectre V2 : Spectre v2 / SpectreRSB mitigation: Filling RSB on context switch
Aug 5 22:10:59.125050 kernel: Spectre V2 : Spectre v2 / SpectreRSB : Filling RSB on VMEXIT
Aug 5 22:10:59.125062 kernel: Speculative Store Bypass: Vulnerable
Aug 5 22:10:59.125071 kernel: x86/fpu: x87 FPU will use FXSAVE
Aug 5 22:10:59.125079 kernel: Freeing SMP alternatives memory: 32K
Aug 5 22:10:59.125088 kernel: pid_max: default: 32768 minimum: 301
Aug 5 22:10:59.125097 kernel: LSM: initializing lsm=lockdown,capability,selinux,integrity
Aug 5 22:10:59.125106 kernel: SELinux: Initializing.
Aug 5 22:10:59.125114 kernel: Mount-cache hash table entries: 4096 (order: 3, 32768 bytes, linear)
Aug 5 22:10:59.125124 kernel: Mountpoint-cache hash table entries: 4096 (order: 3, 32768 bytes, linear)
Aug 5 22:10:59.125142 kernel: smpboot: CPU0: AMD Intel Core i7 9xx (Nehalem Class Core i7) (family: 0x6, model: 0x1a, stepping: 0x3)
Aug 5 22:10:59.125151 kernel: RCU Tasks: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1.
Aug 5 22:10:59.125161 kernel: RCU Tasks Rude: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1.
Aug 5 22:10:59.125172 kernel: RCU Tasks Trace: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1.
Aug 5 22:10:59.125181 kernel: Performance Events: AMD PMU driver.
Aug 5 22:10:59.125191 kernel: ... version: 0
Aug 5 22:10:59.125200 kernel: ... bit width: 48
Aug 5 22:10:59.125209 kernel: ... generic registers: 4
Aug 5 22:10:59.127234 kernel: ... value mask: 0000ffffffffffff
Aug 5 22:10:59.127262 kernel: ... max period: 00007fffffffffff
Aug 5 22:10:59.127272 kernel: ... fixed-purpose events: 0
Aug 5 22:10:59.127282 kernel: ... event mask: 000000000000000f
Aug 5 22:10:59.127291 kernel: signal: max sigframe size: 1440
Aug 5 22:10:59.127300 kernel: rcu: Hierarchical SRCU implementation.
Aug 5 22:10:59.127310 kernel: rcu: Max phase no-delay instances is 400.
Aug 5 22:10:59.127319 kernel: smp: Bringing up secondary CPUs ...
Aug 5 22:10:59.127329 kernel: smpboot: x86: Booting SMP configuration:
Aug 5 22:10:59.127338 kernel: .... node #0, CPUs: #1
Aug 5 22:10:59.127352 kernel: smp: Brought up 1 node, 2 CPUs
Aug 5 22:10:59.127361 kernel: smpboot: Max logical packages: 2
Aug 5 22:10:59.127370 kernel: smpboot: Total of 2 processors activated (7984.99 BogoMIPS)
Aug 5 22:10:59.127380 kernel: devtmpfs: initialized
Aug 5 22:10:59.127389 kernel: x86/mm: Memory block size: 128MB
Aug 5 22:10:59.127398 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Aug 5 22:10:59.127408 kernel: futex hash table entries: 512 (order: 3, 32768 bytes, linear)
Aug 5 22:10:59.127417 kernel: pinctrl core: initialized pinctrl subsystem
Aug 5 22:10:59.127426 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Aug 5 22:10:59.127438 kernel: audit: initializing netlink subsys (disabled)
Aug 5 22:10:59.127447 kernel: audit: type=2000 audit(1722895857.304:1): state=initialized audit_enabled=0 res=1
Aug 5 22:10:59.127456 kernel: thermal_sys: Registered thermal governor 'step_wise'
Aug 5 22:10:59.127466 kernel: thermal_sys: Registered thermal governor 'user_space'
Aug 5 22:10:59.127475 kernel: cpuidle: using governor menu
Aug 5 22:10:59.127484 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Aug 5 22:10:59.127494 kernel: dca service started, version 1.12.1
Aug 5 22:10:59.127503 kernel: PCI: Using configuration type 1 for base access
Aug 5 22:10:59.127513 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Aug 5 22:10:59.127524 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Aug 5 22:10:59.127534 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Aug 5 22:10:59.127543 kernel: ACPI: Added _OSI(Module Device)
Aug 5 22:10:59.127552 kernel: ACPI: Added _OSI(Processor Device)
Aug 5 22:10:59.127562 kernel: ACPI: Added _OSI(3.0 _SCP Extensions)
Aug 5 22:10:59.127571 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Aug 5 22:10:59.127580 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Aug 5 22:10:59.127590 kernel: ACPI: _OSC evaluation for CPUs failed, trying _PDC
Aug 5 22:10:59.127599 kernel: ACPI: Interpreter enabled
Aug 5 22:10:59.127610 kernel: ACPI: PM: (supports S0 S3 S5)
Aug 5 22:10:59.127619 kernel: ACPI: Using IOAPIC for interrupt routing
Aug 5 22:10:59.127629 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Aug 5 22:10:59.127638 kernel: PCI: Using E820 reservations for host bridge windows
Aug 5 22:10:59.127648 kernel: ACPI: Enabled 2 GPEs in block 00 to 0F
Aug 5 22:10:59.127657 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Aug 5 22:10:59.127833 kernel: acpi PNP0A03:00: _OSC: OS supports [ASPM ClockPM Segments MSI HPX-Type3]
Aug 5 22:10:59.127941 kernel: acpi PNP0A03:00: _OSC: not requesting OS control; OS requires [ExtendedConfig ASPM ClockPM MSI]
Aug 5 22:10:59.128041 kernel: acpi PNP0A03:00: fail to add MMCONFIG information, can't access extended configuration space under this bridge
Aug 5 22:10:59.128055 kernel: acpiphp: Slot [3] registered
Aug 5 22:10:59.128065 kernel: acpiphp: Slot [4] registered
Aug 5 22:10:59.128074 kernel: acpiphp: Slot [5] registered
Aug 5 22:10:59.128083 kernel: acpiphp: Slot [6] registered
Aug 5 22:10:59.128092 kernel: acpiphp: Slot [7] registered
Aug 5 22:10:59.128101 kernel: acpiphp: Slot [8] registered
Aug 5 22:10:59.128110 kernel: acpiphp: Slot [9] registered
Aug 5 22:10:59.128128 kernel: acpiphp: Slot [10] registered
Aug 5 22:10:59.128138 kernel: acpiphp: Slot [11] registered
Aug 5 22:10:59.128147 kernel: acpiphp: Slot [12] registered
Aug 5 22:10:59.128156 kernel: acpiphp: Slot [13] registered
Aug 5 22:10:59.128165 kernel: acpiphp: Slot [14] registered
Aug 5 22:10:59.128174 kernel: acpiphp: Slot [15] registered
Aug 5 22:10:59.128184 kernel: acpiphp: Slot [16] registered
Aug 5 22:10:59.128193 kernel: acpiphp: Slot [17] registered
Aug 5 22:10:59.128202 kernel: acpiphp: Slot [18] registered
Aug 5 22:10:59.128211 kernel: acpiphp: Slot [19] registered
Aug 5 22:10:59.128245 kernel: acpiphp: Slot [20] registered
Aug 5 22:10:59.128254 kernel: acpiphp: Slot [21] registered
Aug 5 22:10:59.128264 kernel: acpiphp: Slot [22] registered
Aug 5 22:10:59.128273 kernel: acpiphp: Slot [23] registered
Aug 5 22:10:59.128282 kernel: acpiphp: Slot [24] registered
Aug 5 22:10:59.128291 kernel: acpiphp: Slot [25] registered
Aug 5 22:10:59.128300 kernel: acpiphp: Slot [26] registered
Aug 5 22:10:59.128309 kernel: acpiphp: Slot [27] registered
Aug 5 22:10:59.128318 kernel: acpiphp: Slot [28] registered
Aug 5 22:10:59.128330 kernel: acpiphp: Slot [29] registered
Aug 5 22:10:59.128339 kernel: acpiphp: Slot [30] registered
Aug 5 22:10:59.128348 kernel: acpiphp: Slot [31] registered
Aug 5 22:10:59.128357 kernel: PCI host bridge to bus 0000:00
Aug 5 22:10:59.128464 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window]
Aug 5 22:10:59.128552 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xffff window]
Aug 5 22:10:59.128641 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Aug 5 22:10:59.128730 kernel: pci_bus 0000:00: root bus resource [mem 0x80000000-0xfebfffff window]
Aug 5 22:10:59.128817 kernel: pci_bus 0000:00: root bus resource [mem 0x100000000-0x17fffffff window]
Aug 5 22:10:59.128899 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Aug 5 22:10:59.129014 kernel: pci 0000:00:00.0: [8086:1237] type 00 class 0x060000
Aug 5 22:10:59.129117 kernel: pci 0000:00:01.0: [8086:7000] type 00 class 0x060100
Aug 5 22:10:59.129234 kernel: pci 0000:00:01.1: [8086:7010] type 00 class 0x010180
Aug 5 22:10:59.129331 kernel: pci 0000:00:01.1: reg 0x20: [io 0xc120-0xc12f]
Aug 5 22:10:59.129429 kernel: pci 0000:00:01.1: legacy IDE quirk: reg 0x10: [io 0x01f0-0x01f7]
Aug 5 22:10:59.129519 kernel: pci 0000:00:01.1: legacy IDE quirk: reg 0x14: [io 0x03f6]
Aug 5 22:10:59.129607 kernel: pci 0000:00:01.1: legacy IDE quirk: reg 0x18: [io 0x0170-0x0177]
Aug 5 22:10:59.129695 kernel: pci 0000:00:01.1: legacy IDE quirk: reg 0x1c: [io 0x0376]
Aug 5 22:10:59.129799 kernel: pci 0000:00:01.3: [8086:7113] type 00 class 0x068000
Aug 5 22:10:59.129888 kernel: pci 0000:00:01.3: quirk: [io 0x0600-0x063f] claimed by PIIX4 ACPI
Aug 5 22:10:59.129979 kernel: pci 0000:00:01.3: quirk: [io 0x0700-0x070f] claimed by PIIX4 SMB
Aug 5 22:10:59.130109 kernel: pci 0000:00:02.0: [1af4:1050] type 00 class 0x030000
Aug 5 22:10:59.130209 kernel: pci 0000:00:02.0: reg 0x10: [mem 0xfe000000-0xfe7fffff pref]
Aug 5 22:10:59.132341 kernel: pci 0000:00:02.0: reg 0x18: [mem 0xfe800000-0xfe803fff 64bit pref]
Aug 5 22:10:59.132431 kernel: pci 0000:00:02.0: reg 0x20: [mem 0xfeb90000-0xfeb90fff]
Aug 5 22:10:59.132518 kernel: pci 0000:00:02.0: reg 0x30: [mem 0xfeb80000-0xfeb8ffff pref]
Aug 5 22:10:59.132608 kernel: pci 0000:00:02.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Aug 5 22:10:59.132714 kernel: pci 0000:00:03.0: [1af4:1000] type 00 class 0x020000
Aug 5 22:10:59.132807 kernel: pci 0000:00:03.0: reg 0x10: [io 0xc080-0xc0bf]
Aug 5 22:10:59.132897 kernel: pci 0000:00:03.0: reg 0x14: [mem 0xfeb91000-0xfeb91fff]
Aug 5 22:10:59.132986 kernel: pci 0000:00:03.0: reg 0x20: [mem 0xfe804000-0xfe807fff 64bit pref]
Aug 5 22:10:59.133076 kernel: pci 0000:00:03.0: reg 0x30: [mem 0xfeb00000-0xfeb7ffff pref]
Aug 5 22:10:59.133175 kernel: pci 0000:00:04.0: [1af4:1001] type 00 class 0x010000
Aug 5 22:10:59.135330 kernel: pci 0000:00:04.0: reg 0x10: [io 0xc000-0xc07f]
Aug 5 22:10:59.135450 kernel: pci 0000:00:04.0: reg 0x14: [mem 0xfeb92000-0xfeb92fff]
Aug 5 22:10:59.135543 kernel: pci 0000:00:04.0: reg 0x20: [mem 0xfe808000-0xfe80bfff 64bit pref]
Aug 5 22:10:59.135646 kernel: pci 0000:00:05.0: [1af4:1002] type 00 class 0x00ff00
Aug 5 22:10:59.135740 kernel: pci 0000:00:05.0: reg 0x10: [io 0xc0c0-0xc0ff]
Aug 5 22:10:59.135868 kernel: pci 0000:00:05.0: reg 0x20: [mem 0xfe80c000-0xfe80ffff 64bit pref]
Aug 5 22:10:59.136017 kernel: pci 0000:00:06.0: [1af4:1005] type 00 class 0x00ff00
Aug 5 22:10:59.136159 kernel: pci 0000:00:06.0: reg 0x10: [io 0xc100-0xc11f]
Aug 5 22:10:59.136340 kernel: pci 0000:00:06.0: reg 0x20: [mem 0xfe810000-0xfe813fff 64bit pref]
Aug 5 22:10:59.136365 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Aug 5 22:10:59.136379 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Aug 5 22:10:59.136394 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Aug 5 22:10:59.136410 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Aug 5 22:10:59.136425 kernel: ACPI: PCI: Interrupt link LNKS configured for IRQ 9
Aug 5 22:10:59.136441 kernel: iommu: Default domain type: Translated
Aug 5 22:10:59.136457 kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Aug 5 22:10:59.136472 kernel: PCI: Using ACPI for IRQ routing
Aug 5 22:10:59.136493 kernel: PCI: pci_cache_line_size set to 64 bytes
Aug 5 22:10:59.136508 kernel: e820: reserve RAM buffer [mem 0x0009fc00-0x0009ffff]
Aug 5 22:10:59.136523 kernel: e820: reserve RAM buffer [mem 0x7ffdd000-0x7fffffff]
Aug 5 22:10:59.136637 kernel: pci 0000:00:02.0: vgaarb: setting as boot VGA device
Aug 5 22:10:59.136736 kernel: pci 0000:00:02.0: vgaarb: bridge control possible
Aug 5 22:10:59.136850 kernel: pci 0000:00:02.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Aug 5 22:10:59.136868 kernel: vgaarb: loaded
Aug 5 22:10:59.136878 kernel: clocksource: Switched to clocksource kvm-clock
Aug 5 22:10:59.136887 kernel: VFS: Disk quotas dquot_6.6.0
Aug 5 22:10:59.136902 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Aug 5 22:10:59.136911 kernel: pnp: PnP ACPI init
Aug 5 22:10:59.137024 kernel: pnp 00:03: [dma 2]
Aug 5 22:10:59.137040 kernel: pnp: PnP ACPI: found 5 devices
Aug 5 22:10:59.137049 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Aug 5 22:10:59.137059 kernel: NET: Registered PF_INET protocol family
Aug 5 22:10:59.137068 kernel: IP idents hash table entries: 32768 (order: 6, 262144 bytes, linear)
Aug 5 22:10:59.137078 kernel: tcp_listen_portaddr_hash hash table entries: 1024 (order: 2, 16384 bytes, linear)
Aug 5 22:10:59.137092 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Aug 5 22:10:59.137101 kernel: TCP established hash table entries: 16384 (order: 5, 131072 bytes, linear)
Aug 5 22:10:59.137111 kernel: TCP bind hash table entries: 16384 (order: 7, 524288 bytes, linear)
Aug 5 22:10:59.137120 kernel: TCP: Hash tables configured (established 16384 bind 16384)
Aug 5 22:10:59.137130 kernel: UDP hash table entries: 1024 (order: 3, 32768 bytes, linear)
Aug 5 22:10:59.137139 kernel: UDP-Lite hash table entries: 1024 (order: 3, 32768 bytes, linear)
Aug 5 22:10:59.137148 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Aug 5 22:10:59.137157 kernel: NET: Registered PF_XDP protocol family
Aug 5 22:10:59.137263 kernel: pci_bus 0000:00: resource 4 [io 0x0000-0x0cf7 window]
Aug 5 22:10:59.137353 kernel: pci_bus 0000:00: resource 5 [io 0x0d00-0xffff window]
Aug 5 22:10:59.137435 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Aug 5 22:10:59.137519 kernel: pci_bus 0000:00: resource 7 [mem 0x80000000-0xfebfffff window]
Aug 5 22:10:59.137601 kernel: pci_bus 0000:00: resource 8 [mem 0x100000000-0x17fffffff window]
Aug 5 22:10:59.137698 kernel: pci 0000:00:01.0: PIIX3: Enabling Passive Release
Aug 5 22:10:59.137793 kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers
Aug 5 22:10:59.137807 kernel: PCI: CLS 0 bytes, default 64
Aug 5 22:10:59.137816 kernel: Initialise system trusted keyrings
Aug 5 22:10:59.137829 kernel: workingset: timestamp_bits=39 max_order=19 bucket_order=0
Aug 5 22:10:59.137838 kernel: Key type asymmetric registered
Aug 5 22:10:59.137847 kernel: Asymmetric key parser 'x509' registered
Aug 5 22:10:59.137856 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 251)
Aug 5 22:10:59.137865 kernel: io scheduler mq-deadline registered
Aug 5 22:10:59.137873 kernel: io scheduler kyber registered
Aug 5 22:10:59.137882 kernel: io scheduler bfq registered
Aug 5 22:10:59.137891 kernel: ioatdma: Intel(R) QuickData Technology Driver 5.00
Aug 5 22:10:59.137901 kernel: ACPI: \_SB_.LNKB: Enabled at IRQ 10
Aug 5 22:10:59.137912 kernel: ACPI: \_SB_.LNKC: Enabled at IRQ 11
Aug 5 22:10:59.137921 kernel: ACPI: \_SB_.LNKD: Enabled at IRQ 11
Aug 5 22:10:59.137930 kernel: ACPI: \_SB_.LNKA: Enabled at IRQ 10
Aug 5 22:10:59.137939 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Aug 5 22:10:59.137948 kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
Aug 5 22:10:59.137956 kernel: random: crng init done
Aug 5 22:10:59.137965 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12
Aug 5 22:10:59.137974 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1
Aug 5 22:10:59.137982 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12
Aug 5 22:10:59.138103 kernel: rtc_cmos 00:04: RTC can wake from S4
Aug 5 22:10:59.138119 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input0
Aug 5 22:10:59.138203 kernel: rtc_cmos 00:04: registered as rtc0
Aug 5 22:10:59.138314 kernel: rtc_cmos 00:04: setting system clock to 2024-08-05T22:10:58 UTC (1722895858)
Aug 5 22:10:59.138400 kernel: rtc_cmos 00:04: alarms up to one day, y3k, 242 bytes nvram
Aug 5 22:10:59.138415 kernel: amd_pstate: the _CPC object is not present in SBIOS or ACPI disabled
Aug 5 22:10:59.138425 kernel: NET: Registered PF_INET6 protocol family
Aug 5 22:10:59.138440 kernel: Segment Routing with IPv6
Aug 5 22:10:59.138449 kernel: In-situ OAM (IOAM) with IPv6
Aug 5 22:10:59.138459 kernel: NET: Registered PF_PACKET protocol family
Aug 5 22:10:59.138468 kernel: Key type dns_resolver registered
Aug 5 22:10:59.138477 kernel: IPI shorthand broadcast: enabled
Aug 5 22:10:59.138487 kernel: sched_clock: Marking stable (1047008613, 126331592)->(1176595605, -3255400)
Aug 5 22:10:59.138496 kernel: registered taskstats version 1
Aug 5 22:10:59.138505 kernel: Loading compiled-in X.509 certificates
Aug 5 22:10:59.138515 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.6.43-flatcar: e31e857530e65c19b206dbf3ab8297cc37ac5d55'
Aug 5 22:10:59.138527 kernel: Key type .fscrypt registered
Aug 5 22:10:59.138536 kernel: Key type fscrypt-provisioning registered
Aug 5 22:10:59.138545 kernel: ima: No TPM chip found, activating TPM-bypass!
Aug 5 22:10:59.138555 kernel: ima: Allocated hash algorithm: sha1
Aug 5 22:10:59.138564 kernel: ima: No architecture policies found
Aug 5 22:10:59.138573 kernel: clk: Disabling unused clocks
Aug 5 22:10:59.138582 kernel: Freeing unused kernel image (initmem) memory: 49328K
Aug 5 22:10:59.138592 kernel: Write protecting the kernel read-only data: 36864k
Aug 5 22:10:59.138601 kernel: Freeing unused kernel image (rodata/data gap) memory: 1936K
Aug 5 22:10:59.138613 kernel: Run /init as init process
Aug 5 22:10:59.138623 kernel: with arguments:
Aug 5 22:10:59.138632 kernel: /init
Aug 5 22:10:59.138641 kernel: with environment:
Aug 5 22:10:59.138650 kernel: HOME=/
Aug 5 22:10:59.138659 kernel: TERM=linux
Aug 5 22:10:59.138668 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a
Aug 5 22:10:59.138681 systemd[1]: systemd 255 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT default-hierarchy=unified)
Aug 5 22:10:59.138696 systemd[1]: Detected virtualization kvm.
Aug 5 22:10:59.138707 systemd[1]: Detected architecture x86-64.
Aug 5 22:10:59.138717 systemd[1]: Running in initrd.
Aug 5 22:10:59.138727 systemd[1]: No hostname configured, using default hostname.
Aug 5 22:10:59.138736 systemd[1]: Hostname set to .
Aug 5 22:10:59.138747 systemd[1]: Initializing machine ID from VM UUID.
Aug 5 22:10:59.138757 systemd[1]: Queued start job for default target initrd.target.
Aug 5 22:10:59.138768 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Aug 5 22:10:59.138780 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Aug 5 22:10:59.138791 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM...
Aug 5 22:10:59.138802 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Aug 5 22:10:59.138812 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT...
Aug 5 22:10:59.138822 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A...
Aug 5 22:10:59.138834 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132...
Aug 5 22:10:59.138847 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr...
Aug 5 22:10:59.138857 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Aug 5 22:10:59.138868 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Aug 5 22:10:59.138878 systemd[1]: Reached target paths.target - Path Units.
Aug 5 22:10:59.138888 systemd[1]: Reached target slices.target - Slice Units.
Aug 5 22:10:59.138908 systemd[1]: Reached target swap.target - Swaps.
Aug 5 22:10:59.138920 systemd[1]: Reached target timers.target - Timer Units.
Aug 5 22:10:59.138933 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket.
Aug 5 22:10:59.138944 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Aug 5 22:10:59.138954 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log).
Aug 5 22:10:59.138965 systemd[1]: Listening on systemd-journald.socket - Journal Socket.
Aug 5 22:10:59.138975 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Aug 5 22:10:59.138986 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Aug 5 22:10:59.138997 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Aug 5 22:10:59.139009 systemd[1]: Reached target sockets.target - Socket Units.
Aug 5 22:10:59.139022 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup...
Aug 5 22:10:59.139032 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Aug 5 22:10:59.139043 systemd[1]: Finished network-cleanup.service - Network Cleanup.
Aug 5 22:10:59.139054 systemd[1]: Starting systemd-fsck-usr.service...
Aug 5 22:10:59.139064 systemd[1]: Starting systemd-journald.service - Journal Service...
Aug 5 22:10:59.139075 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Aug 5 22:10:59.139085 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Aug 5 22:10:59.139096 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup.
Aug 5 22:10:59.139107 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Aug 5 22:10:59.139119 systemd[1]: Finished systemd-fsck-usr.service.
Aug 5 22:10:59.139131 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
Aug 5 22:10:59.139159 systemd-journald[184]: Collecting audit messages is disabled.
Aug 5 22:10:59.139186 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Aug 5 22:10:59.139197 systemd-journald[184]: Journal started
Aug 5 22:10:59.139237 systemd-journald[184]: Runtime Journal (/run/log/journal/3a0c4425f3c947b2a6652bd55521a554) is 4.9M, max 39.3M, 34.4M free.
Aug 5 22:10:59.114709 systemd-modules-load[185]: Inserted module 'overlay'
Aug 5 22:10:59.176933 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Aug 5 22:10:59.176977 kernel: Bridge firewalling registered
Aug 5 22:10:59.176992 systemd[1]: Started systemd-journald.service - Journal Service.
Aug 5 22:10:59.158665 systemd-modules-load[185]: Inserted module 'br_netfilter'
Aug 5 22:10:59.177786 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Aug 5 22:10:59.178797 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Aug 5 22:10:59.186382 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Aug 5 22:10:59.188354 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Aug 5 22:10:59.190427 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Aug 5 22:10:59.193077 systemd[1]: Starting systemd-tmpfiles-setup.service - Create Volatile Files and Directories...
Aug 5 22:10:59.210703 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Aug 5 22:10:59.214898 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Aug 5 22:10:59.219212 systemd[1]: Finished systemd-tmpfiles-setup.service - Create Volatile Files and Directories.
Aug 5 22:10:59.221353 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Aug 5 22:10:59.225422 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook...
Aug 5 22:10:59.229383 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Aug 5 22:10:59.244879 dracut-cmdline[215]: dracut-dracut-053
Aug 5 22:10:59.248115 dracut-cmdline[215]: Using kernel command line parameters: rd.driver.pre=btrfs rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=openstack flatcar.autologin verity.usrhash=4a86c72568bc3f74d57effa5e252d5620941ef6d74241fc198859d020a6392c5
Aug 5 22:10:59.289945 systemd-resolved[217]: Positive Trust Anchors:
Aug 5 22:10:59.292309 systemd-resolved[217]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Aug 5 22:10:59.292421 systemd-resolved[217]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa corp home internal intranet lan local private test
Aug 5 22:10:59.301708 systemd-resolved[217]: Defaulting to hostname 'linux'.
Aug 5 22:10:59.303912 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Aug 5 22:10:59.304495 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Aug 5 22:10:59.315271 kernel: SCSI subsystem initialized
Aug 5 22:10:59.328279 kernel: Loading iSCSI transport class v2.0-870.
Aug 5 22:10:59.343280 kernel: iscsi: registered transport (tcp)
Aug 5 22:10:59.372268 kernel: iscsi: registered transport (qla4xxx)
Aug 5 22:10:59.372376 kernel: QLogic iSCSI HBA Driver
Aug 5 22:10:59.416025 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook.
Aug 5 22:10:59.423366 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook...
Aug 5 22:10:59.454322 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Aug 5 22:10:59.454461 kernel: device-mapper: uevent: version 1.0.3
Aug 5 22:10:59.454489 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@redhat.com
Aug 5 22:10:59.507384 kernel: raid6: sse2x4 gen() 12304 MB/s
Aug 5 22:10:59.524470 kernel: raid6: sse2x2 gen() 14603 MB/s
Aug 5 22:10:59.541433 kernel: raid6: sse2x1 gen() 9635 MB/s
Aug 5 22:10:59.541541 kernel: raid6: using algorithm sse2x2 gen() 14603 MB/s
Aug 5 22:10:59.559493 kernel: raid6: .... xor() 9336 MB/s, rmw enabled
Aug 5 22:10:59.559605 kernel: raid6: using ssse3x2 recovery algorithm
Aug 5 22:10:59.587296 kernel: xor: measuring software checksum speed
Aug 5 22:10:59.587360 kernel: prefetch64-sse : 18632 MB/sec
Aug 5 22:10:59.590696 kernel: generic_sse : 15805 MB/sec
Aug 5 22:10:59.590759 kernel: xor: using function: prefetch64-sse (18632 MB/sec)
Aug 5 22:10:59.794290 kernel: Btrfs loaded, zoned=no, fsverity=no
Aug 5 22:10:59.812452 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook.
Aug 5 22:10:59.823381 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Aug 5 22:10:59.859853 systemd-udevd[402]: Using default interface naming scheme 'v255'.
Aug 5 22:10:59.870621 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Aug 5 22:10:59.880605 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook...
Aug 5 22:10:59.918250 dracut-pre-trigger[406]: rd.md=0: removing MD RAID activation
Aug 5 22:10:59.976205 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook.
Aug 5 22:10:59.983518 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Aug 5 22:11:00.056273 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Aug 5 22:11:00.069587 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook...
Aug 5 22:11:00.119676 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook.
Aug 5 22:11:00.123528 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems.
Aug 5 22:11:00.124082 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Aug 5 22:11:00.124705 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Aug 5 22:11:00.132485 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook...
Aug 5 22:11:00.147685 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook.
Aug 5 22:11:00.154237 kernel: virtio_blk virtio2: 2/0/0 default/read/poll queues
Aug 5 22:11:00.184430 kernel: virtio_blk virtio2: [vda] 41943040 512-byte logical blocks (21.5 GB/20.0 GiB)
Aug 5 22:11:00.184565 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk.
Aug 5 22:11:00.184580 kernel: GPT:17805311 != 41943039
Aug 5 22:11:00.184593 kernel: GPT:Alternate GPT header not at the end of the disk.
Aug 5 22:11:00.184605 kernel: GPT:17805311 != 41943039
Aug 5 22:11:00.184617 kernel: GPT: Use GNU Parted to correct GPT errors.
Aug 5 22:11:00.184635 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9
Aug 5 22:11:00.163177 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
Aug 5 22:11:00.163503 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Aug 5 22:11:00.164575 systemd[1]: Stopping dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Aug 5 22:11:00.165443 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Aug 5 22:11:00.165770 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Aug 5 22:11:00.166569 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup...
Aug 5 22:11:00.179043 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Aug 5 22:11:00.234298 kernel: libata version 3.00 loaded.
Aug 5 22:11:00.254248 kernel: BTRFS: device label OEM devid 1 transid 13 /dev/vda6 scanned by (udev-worker) (451)
Aug 5 22:11:00.259309 kernel: ata_piix 0000:00:01.1: version 2.13
Aug 5 22:11:00.272357 kernel: BTRFS: device fsid d3844c60-0a2c-449a-9ee9-2a875f8d8e12 devid 1 transid 36 /dev/vda3 scanned by (udev-worker) (456)
Aug 5 22:11:00.272379 kernel: scsi host0: ata_piix
Aug 5 22:11:00.272516 kernel: scsi host1: ata_piix
Aug 5 22:11:00.272632 kernel: ata1: PATA max MWDMA2 cmd 0x1f0 ctl 0x3f6 bmdma 0xc120 irq 14
Aug 5 22:11:00.272647 kernel: ata2: PATA max MWDMA2 cmd 0x170 ctl 0x376 bmdma 0xc128 irq 15
Aug 5 22:11:00.260484 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Aug 5 22:11:00.274299 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT.
Aug 5 22:11:00.285431 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM.
Aug 5 22:11:00.292865 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM.
Aug 5 22:11:00.297979 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A.
Aug 5 22:11:00.298663 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132.
Aug 5 22:11:00.310485 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary...
Aug 5 22:11:00.315495 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Aug 5 22:11:00.322873 disk-uuid[496]: Primary Header is updated.
Aug 5 22:11:00.322873 disk-uuid[496]: Secondary Entries is updated.
Aug 5 22:11:00.322873 disk-uuid[496]: Secondary Header is updated.
Aug 5 22:11:00.334323 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9
Aug 5 22:11:00.345258 kernel: GPT:disk_guids don't match.
Aug 5 22:11:00.345353 kernel: GPT: Use GNU Parted to correct GPT errors.
Aug 5 22:11:00.345369 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9
Aug 5 22:11:00.346452 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Aug 5 22:11:01.361286 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9
Aug 5 22:11:01.364197 disk-uuid[497]: The operation has completed successfully.
Aug 5 22:11:01.444487 systemd[1]: disk-uuid.service: Deactivated successfully.
Aug 5 22:11:01.444836 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary.
Aug 5 22:11:01.479406 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr...
Aug 5 22:11:01.488066 sh[518]: Success
Aug 5 22:11:01.526265 kernel: device-mapper: verity: sha256 using implementation "sha256-ssse3"
Aug 5 22:11:01.578891 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr.
Aug 5 22:11:01.590660 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr...
Aug 5 22:11:01.591954 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr.
Aug 5 22:11:01.612797 kernel: BTRFS info (device dm-0): first mount of filesystem d3844c60-0a2c-449a-9ee9-2a875f8d8e12
Aug 5 22:11:01.612856 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm
Aug 5 22:11:01.614286 kernel: BTRFS warning (device dm-0): 'nologreplay' is deprecated, use 'rescue=nologreplay' instead
Aug 5 22:11:01.616478 kernel: BTRFS info (device dm-0): disabling log replay at mount time
Aug 5 22:11:01.616511 kernel: BTRFS info (device dm-0): using free space tree
Aug 5 22:11:01.631603 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr.
Aug 5 22:11:01.633825 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met.
Aug 5 22:11:01.641505 systemd[1]: Starting ignition-setup.service - Ignition (setup)...
Aug 5 22:11:01.645422 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline...
Aug 5 22:11:01.667510 kernel: BTRFS info (device vda6): first mount of filesystem b6695624-d538-4f05-9ddd-23ee987404c1
Aug 5 22:11:01.673593 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm
Aug 5 22:11:01.673651 kernel: BTRFS info (device vda6): using free space tree
Aug 5 22:11:01.682316 kernel: BTRFS info (device vda6): auto enabling async discard
Aug 5 22:11:01.703931 systemd[1]: mnt-oem.mount: Deactivated successfully.
Aug 5 22:11:01.707702 kernel: BTRFS info (device vda6): last unmount of filesystem b6695624-d538-4f05-9ddd-23ee987404c1
Aug 5 22:11:01.718457 systemd[1]: Finished ignition-setup.service - Ignition (setup).
Aug 5 22:11:01.724374 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)...
Aug 5 22:11:01.797606 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Aug 5 22:11:01.812421 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Aug 5 22:11:01.850367 systemd-networkd[701]: lo: Link UP
Aug 5 22:11:01.850377 systemd-networkd[701]: lo: Gained carrier
Aug 5 22:11:01.851728 systemd-networkd[701]: Enumeration completed
Aug 5 22:11:01.851839 systemd[1]: Started systemd-networkd.service - Network Configuration.
Aug 5 22:11:01.852876 systemd[1]: Reached target network.target - Network.
Aug 5 22:11:01.853342 systemd-networkd[701]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Aug 5 22:11:01.853345 systemd-networkd[701]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
Aug 5 22:11:01.854239 systemd-networkd[701]: eth0: Link UP
Aug 5 22:11:01.854243 systemd-networkd[701]: eth0: Gained carrier
Aug 5 22:11:01.854251 systemd-networkd[701]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Aug 5 22:11:01.871308 systemd-networkd[701]: eth0: DHCPv4 address 172.24.4.33/24, gateway 172.24.4.1 acquired from 172.24.4.1
Aug 5 22:11:01.881851 ignition[636]: Ignition 2.18.0
Aug 5 22:11:01.881868 ignition[636]: Stage: fetch-offline
Aug 5 22:11:01.881936 ignition[636]: no configs at "/usr/lib/ignition/base.d"
Aug 5 22:11:01.881949 ignition[636]: no config dir at "/usr/lib/ignition/base.platform.d/openstack"
Aug 5 22:11:01.884496 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline).
Aug 5 22:11:01.882146 ignition[636]: parsed url from cmdline: ""
Aug 5 22:11:01.882150 ignition[636]: no config URL provided
Aug 5 22:11:01.882157 ignition[636]: reading system config file "/usr/lib/ignition/user.ign"
Aug 5 22:11:01.882166 ignition[636]: no config at "/usr/lib/ignition/user.ign"
Aug 5 22:11:01.882171 ignition[636]: failed to fetch config: resource requires networking
Aug 5 22:11:01.882570 ignition[636]: Ignition finished successfully
Aug 5 22:11:01.888439 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)...
Aug 5 22:11:01.905291 ignition[711]: Ignition 2.18.0
Aug 5 22:11:01.905307 ignition[711]: Stage: fetch
Aug 5 22:11:01.905514 ignition[711]: no configs at "/usr/lib/ignition/base.d"
Aug 5 22:11:01.905527 ignition[711]: no config dir at "/usr/lib/ignition/base.platform.d/openstack"
Aug 5 22:11:01.905643 ignition[711]: parsed url from cmdline: ""
Aug 5 22:11:01.905648 ignition[711]: no config URL provided
Aug 5 22:11:01.905654 ignition[711]: reading system config file "/usr/lib/ignition/user.ign"
Aug 5 22:11:01.905663 ignition[711]: no config at "/usr/lib/ignition/user.ign"
Aug 5 22:11:01.905806 ignition[711]: GET http://169.254.169.254/openstack/latest/user_data: attempt #1
Aug 5 22:11:01.905904 ignition[711]: config drive ("/dev/disk/by-label/config-2") not found. Waiting...
Aug 5 22:11:01.905935 ignition[711]: config drive ("/dev/disk/by-label/CONFIG-2") not found. Waiting...
Aug 5 22:11:02.393423 ignition[711]: GET result: OK
Aug 5 22:11:02.393639 ignition[711]: parsing config with SHA512: 76292e391347126f130aa2161a706600ac321401d84fc0b60c1a7723d22299af6178fe247f059b9454255284d99e4f2909558a05ee71884f5de13dbc2210d1d7
Aug 5 22:11:02.403933 unknown[711]: fetched base config from "system"
Aug 5 22:11:02.403960 unknown[711]: fetched base config from "system"
Aug 5 22:11:02.404844 ignition[711]: fetch: fetch complete
Aug 5 22:11:02.403980 unknown[711]: fetched user config from "openstack"
Aug 5 22:11:02.404857 ignition[711]: fetch: fetch passed
Aug 5 22:11:02.409112 systemd[1]: Finished ignition-fetch.service - Ignition (fetch).
Aug 5 22:11:02.404944 ignition[711]: Ignition finished successfully
Aug 5 22:11:02.418834 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)...
Aug 5 22:11:02.455279 ignition[718]: Ignition 2.18.0
Aug 5 22:11:02.455308 ignition[718]: Stage: kargs
Aug 5 22:11:02.455730 ignition[718]: no configs at "/usr/lib/ignition/base.d"
Aug 5 22:11:02.455758 ignition[718]: no config dir at "/usr/lib/ignition/base.platform.d/openstack"
Aug 5 22:11:02.460465 systemd[1]: Finished ignition-kargs.service - Ignition (kargs).
Aug 5 22:11:02.458109 ignition[718]: kargs: kargs passed
Aug 5 22:11:02.458214 ignition[718]: Ignition finished successfully
Aug 5 22:11:02.474669 systemd[1]: Starting ignition-disks.service - Ignition (disks)...
Aug 5 22:11:02.504947 ignition[725]: Ignition 2.18.0
Aug 5 22:11:02.504975 ignition[725]: Stage: disks
Aug 5 22:11:02.505484 ignition[725]: no configs at "/usr/lib/ignition/base.d"
Aug 5 22:11:02.505512 ignition[725]: no config dir at "/usr/lib/ignition/base.platform.d/openstack"
Aug 5 22:11:02.510396 systemd[1]: Finished ignition-disks.service - Ignition (disks).
Aug 5 22:11:02.507905 ignition[725]: disks: disks passed
Aug 5 22:11:02.514152 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device.
Aug 5 22:11:02.508010 ignition[725]: Ignition finished successfully
Aug 5 22:11:02.516072 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems.
Aug 5 22:11:02.518466 systemd[1]: Reached target local-fs.target - Local File Systems.
Aug 5 22:11:02.521275 systemd[1]: Reached target sysinit.target - System Initialization.
Aug 5 22:11:02.523624 systemd[1]: Reached target basic.target - Basic System.
Aug 5 22:11:02.532570 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT...
Aug 5 22:11:02.568383 systemd-fsck[734]: ROOT: clean, 14/1628000 files, 120691/1617920 blocks
Aug 5 22:11:02.583211 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT.
Aug 5 22:11:02.591491 systemd[1]: Mounting sysroot.mount - /sysroot...
Aug 5 22:11:02.795249 kernel: EXT4-fs (vda9): mounted filesystem e865ac73-053b-4efa-9a0f-50dec3f650d9 r/w with ordered data mode. Quota mode: none.
Aug 5 22:11:02.798210 systemd[1]: Mounted sysroot.mount - /sysroot.
Aug 5 22:11:02.801010 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System.
Aug 5 22:11:02.815401 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Aug 5 22:11:02.820345 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr...
Aug 5 22:11:02.821178 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met.
Aug 5 22:11:02.823804 systemd[1]: Starting flatcar-openstack-hostname.service - Flatcar OpenStack Metadata Hostname Agent...
Aug 5 22:11:02.825486 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot).
Aug 5 22:11:02.825544 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup.
Aug 5 22:11:02.834247 kernel: BTRFS: device label OEM devid 1 transid 14 /dev/vda6 scanned by mount (742)
Aug 5 22:11:02.839528 kernel: BTRFS info (device vda6): first mount of filesystem b6695624-d538-4f05-9ddd-23ee987404c1
Aug 5 22:11:02.839603 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm
Aug 5 22:11:02.839635 kernel: BTRFS info (device vda6): using free space tree
Aug 5 22:11:02.846249 kernel: BTRFS info (device vda6): auto enabling async discard
Aug 5 22:11:02.850933 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Aug 5 22:11:02.852492 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr.
Aug 5 22:11:02.861600 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup...
Aug 5 22:11:02.979739 initrd-setup-root[770]: cut: /sysroot/etc/passwd: No such file or directory
Aug 5 22:11:02.988176 systemd-networkd[701]: eth0: Gained IPv6LL
Aug 5 22:11:02.990155 initrd-setup-root[777]: cut: /sysroot/etc/group: No such file or directory
Aug 5 22:11:02.998429 initrd-setup-root[784]: cut: /sysroot/etc/shadow: No such file or directory
Aug 5 22:11:03.011993 initrd-setup-root[791]: cut: /sysroot/etc/gshadow: No such file or directory
Aug 5 22:11:03.169964 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup.
Aug 5 22:11:03.178414 systemd[1]: Starting ignition-mount.service - Ignition (mount)...
Aug 5 22:11:03.181662 systemd[1]: Starting sysroot-boot.service - /sysroot/boot...
Aug 5 22:11:03.214098 systemd[1]: sysroot-oem.mount: Deactivated successfully.
Aug 5 22:11:03.218129 kernel: BTRFS info (device vda6): last unmount of filesystem b6695624-d538-4f05-9ddd-23ee987404c1
Aug 5 22:11:03.244387 systemd[1]: Finished sysroot-boot.service - /sysroot/boot.
Aug 5 22:11:03.277348 ignition[859]: INFO : Ignition 2.18.0
Aug 5 22:11:03.277348 ignition[859]: INFO : Stage: mount
Aug 5 22:11:03.280149 ignition[859]: INFO : no configs at "/usr/lib/ignition/base.d"
Aug 5 22:11:03.280149 ignition[859]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack"
Aug 5 22:11:03.280149 ignition[859]: INFO : mount: mount passed
Aug 5 22:11:03.280149 ignition[859]: INFO : Ignition finished successfully
Aug 5 22:11:03.281810 systemd[1]: Finished ignition-mount.service - Ignition (mount).
Aug 5 22:11:10.056845 coreos-metadata[744]: Aug 05 22:11:10.056 WARN failed to locate config-drive, using the metadata service API instead
Aug 5 22:11:10.073406 coreos-metadata[744]: Aug 05 22:11:10.073 INFO Fetching http://169.254.169.254/latest/meta-data/hostname: Attempt #1
Aug 5 22:11:10.086789 coreos-metadata[744]: Aug 05 22:11:10.086 INFO Fetch successful
Aug 5 22:11:10.088147 coreos-metadata[744]: Aug 05 22:11:10.087 INFO wrote hostname ci-3975-2-0-1-de7b5ef465.novalocal to /sysroot/etc/hostname
Aug 5 22:11:10.088729 systemd[1]: flatcar-openstack-hostname.service: Deactivated successfully.
Aug 5 22:11:10.088848 systemd[1]: Finished flatcar-openstack-hostname.service - Flatcar OpenStack Metadata Hostname Agent.
Aug 5 22:11:10.105323 systemd[1]: Starting ignition-files.service - Ignition (files)...
Aug 5 22:11:10.119699 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Aug 5 22:11:10.137416 kernel: BTRFS: device label OEM devid 1 transid 15 /dev/vda6 scanned by mount (876)
Aug 5 22:11:10.148263 kernel: BTRFS info (device vda6): first mount of filesystem b6695624-d538-4f05-9ddd-23ee987404c1
Aug 5 22:11:10.148343 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm
Aug 5 22:11:10.148372 kernel: BTRFS info (device vda6): using free space tree
Aug 5 22:11:10.158309 kernel: BTRFS info (device vda6): auto enabling async discard
Aug 5 22:11:10.163391 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Aug 5 22:11:10.208346 ignition[894]: INFO : Ignition 2.18.0
Aug 5 22:11:10.209855 ignition[894]: INFO : Stage: files
Aug 5 22:11:10.209855 ignition[894]: INFO : no configs at "/usr/lib/ignition/base.d"
Aug 5 22:11:10.209855 ignition[894]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack"
Aug 5 22:11:10.214415 ignition[894]: DEBUG : files: compiled without relabeling support, skipping
Aug 5 22:11:10.214415 ignition[894]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core"
Aug 5 22:11:10.214415 ignition[894]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core"
Aug 5 22:11:10.221062 ignition[894]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core"
Aug 5 22:11:10.224973 ignition[894]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core"
Aug 5 22:11:10.224973 ignition[894]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core"
Aug 5 22:11:10.221945 unknown[894]: wrote ssh authorized keys file for user: core
Aug 5 22:11:10.231052 ignition[894]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.13.2-linux-amd64.tar.gz"
Aug 5 22:11:10.231052 ignition[894]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.13.2-linux-amd64.tar.gz: attempt #1
Aug 5 22:11:10.899375 ignition[894]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK
Aug 5 22:11:11.215640 ignition[894]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.13.2-linux-amd64.tar.gz"
Aug 5 22:11:11.215640 ignition[894]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh"
Aug 5 22:11:11.220542 ignition[894]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh"
Aug 5 22:11:11.220542 ignition[894]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml"
Aug 5 22:11:11.220542 ignition[894]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml"
Aug 5 22:11:11.220542 ignition[894]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml"
Aug 5 22:11:11.220542 ignition[894]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml"
Aug 5 22:11:11.220542 ignition[894]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml"
Aug 5 22:11:11.220542 ignition[894]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml"
Aug 5 22:11:11.220542 ignition[894]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf"
Aug 5 22:11:11.220542 ignition[894]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf"
Aug 5 22:11:11.220542 ignition[894]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.29.2-x86-64.raw"
Aug 5 22:11:11.220542 ignition[894]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.29.2-x86-64.raw"
Aug 5 22:11:11.220542 ignition[894]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.29.2-x86-64.raw"
Aug 5 22:11:11.220542 ignition[894]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://github.com/flatcar/sysext-bakery/releases/download/latest/kubernetes-v1.29.2-x86-64.raw: attempt #1
Aug 5 22:11:11.745298 ignition[894]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK
Aug 5 22:11:13.585167 ignition[894]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.29.2-x86-64.raw"
Aug 5 22:11:13.585167 ignition[894]: INFO : files: op(b): [started] processing unit "prepare-helm.service"
Aug 5 22:11:13.592096 ignition[894]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Aug 5 22:11:13.592096 ignition[894]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Aug 5 22:11:13.592096 ignition[894]: INFO : files: op(b): [finished] processing unit "prepare-helm.service"
Aug 5 22:11:13.592096 ignition[894]: INFO : files: op(d): [started] setting preset to enabled for "prepare-helm.service"
Aug 5 22:11:13.592096 ignition[894]: INFO : files: op(d): [finished] setting preset to enabled for "prepare-helm.service"
Aug 5 22:11:13.592096 ignition[894]: INFO : files: createResultFile: createFiles: op(e): [started] writing file "/sysroot/etc/.ignition-result.json"
Aug 5 22:11:13.592096 ignition[894]: INFO : files: createResultFile: createFiles: op(e): [finished] writing file "/sysroot/etc/.ignition-result.json"
Aug 5 22:11:13.592096 ignition[894]: INFO : files: files passed
Aug 5 22:11:13.592096 ignition[894]: INFO : Ignition finished successfully
Aug 5 22:11:13.590749 systemd[1]: Finished ignition-files.service - Ignition (files).
Aug 5 22:11:13.607926 systemd[1]: Starting ignition-quench.service - Ignition (record completion)...
Aug 5 22:11:13.611043 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion...
Aug 5 22:11:13.612508 systemd[1]: ignition-quench.service: Deactivated successfully.
Aug 5 22:11:13.612629 systemd[1]: Finished ignition-quench.service - Ignition (record completion).
Aug 5 22:11:13.639016 initrd-setup-root-after-ignition[923]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Aug 5 22:11:13.642303 initrd-setup-root-after-ignition[927]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Aug 5 22:11:13.644442 initrd-setup-root-after-ignition[923]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory
Aug 5 22:11:13.643059 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion.
Aug 5 22:11:13.645601 systemd[1]: Reached target ignition-complete.target - Ignition Complete.
Aug 5 22:11:13.660505 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root...
Aug 5 22:11:13.694701 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Aug 5 22:11:13.694910 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root.
Aug 5 22:11:13.697099 systemd[1]: Reached target initrd-fs.target - Initrd File Systems.
Aug 5 22:11:13.699465 systemd[1]: Reached target initrd.target - Initrd Default Target.
Aug 5 22:11:13.701734 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met.
Aug 5 22:11:13.714543 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook...
Aug 5 22:11:13.740923 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Aug 5 22:11:13.750533 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons...
Aug 5 22:11:13.768128 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups.
Aug 5 22:11:13.768912 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes.
Aug 5 22:11:13.769763 systemd[1]: Stopped target timers.target - Timer Units.
Aug 5 22:11:13.771917 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Aug 5 22:11:13.772059 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Aug 5 22:11:13.775015 systemd[1]: Stopped target initrd.target - Initrd Default Target.
Aug 5 22:11:13.776140 systemd[1]: Stopped target basic.target - Basic System.
Aug 5 22:11:13.777845 systemd[1]: Stopped target ignition-complete.target - Ignition Complete.
Aug 5 22:11:13.780162 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup.
Aug 5 22:11:13.782086 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device.
Aug 5 22:11:13.783895 systemd[1]: Stopped target remote-fs.target - Remote File Systems.
Aug 5 22:11:13.785788 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems.
Aug 5 22:11:13.787536 systemd[1]: Stopped target sysinit.target - System Initialization.
Aug 5 22:11:13.788808 systemd[1]: Stopped target local-fs.target - Local File Systems.
Aug 5 22:11:13.789930 systemd[1]: Stopped target swap.target - Swaps.
Aug 5 22:11:13.791097 systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Aug 5 22:11:13.791276 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook.
Aug 5 22:11:13.792646 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes.
Aug 5 22:11:13.793431 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Aug 5 22:11:13.794445 systemd[1]: clevis-luks-askpass.path: Deactivated successfully.
Aug 5 22:11:13.794566 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Aug 5 22:11:13.795935 systemd[1]: dracut-initqueue.service: Deactivated successfully.
Aug 5 22:11:13.796130 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook.
Aug 5 22:11:13.798894 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully.
Aug 5 22:11:13.799033 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion.
Aug 5 22:11:13.800044 systemd[1]: ignition-files.service: Deactivated successfully.
Aug 5 22:11:13.800162 systemd[1]: Stopped ignition-files.service - Ignition (files).
Aug 5 22:11:13.807548 systemd[1]: Stopping ignition-mount.service - Ignition (mount)...
Aug 5 22:11:13.811504 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot...
Aug 5 22:11:13.812177 systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Aug 5 22:11:13.812405 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices.
Aug 5 22:11:13.814591 systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Aug 5 22:11:13.814785 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook.
Aug 5 22:11:13.825822 ignition[947]: INFO : Ignition 2.18.0
Aug 5 22:11:13.828176 ignition[947]: INFO : Stage: umount
Aug 5 22:11:13.828176 ignition[947]: INFO : no configs at "/usr/lib/ignition/base.d"
Aug 5 22:11:13.828176 ignition[947]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack"
Aug 5 22:11:13.828533 systemd[1]: initrd-cleanup.service: Deactivated successfully.
Aug 5 22:11:13.828652 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons.
Aug 5 22:11:13.833460 ignition[947]: INFO : umount: umount passed
Aug 5 22:11:13.833460 ignition[947]: INFO : Ignition finished successfully
Aug 5 22:11:13.835527 systemd[1]: ignition-mount.service: Deactivated successfully.
Aug 5 22:11:13.835651 systemd[1]: Stopped ignition-mount.service - Ignition (mount).
Aug 5 22:11:13.837791 systemd[1]: ignition-disks.service: Deactivated successfully.
Aug 5 22:11:13.837898 systemd[1]: Stopped ignition-disks.service - Ignition (disks).
Aug 5 22:11:13.838595 systemd[1]: ignition-kargs.service: Deactivated successfully.
Aug 5 22:11:13.838640 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs).
Aug 5 22:11:13.839700 systemd[1]: ignition-fetch.service: Deactivated successfully.
Aug 5 22:11:13.839742 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch).
Aug 5 22:11:13.840783 systemd[1]: Stopped target network.target - Network.
Aug 5 22:11:13.841777 systemd[1]: ignition-fetch-offline.service: Deactivated successfully.
Aug 5 22:11:13.841850 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline).
Aug 5 22:11:13.842881 systemd[1]: Stopped target paths.target - Path Units.
Aug 5 22:11:13.843836 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Aug 5 22:11:13.847278 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Aug 5 22:11:13.847869 systemd[1]: Stopped target slices.target - Slice Units.
Aug 5 22:11:13.849064 systemd[1]: Stopped target sockets.target - Socket Units.
Aug 5 22:11:13.850155 systemd[1]: iscsid.socket: Deactivated successfully.
Aug 5 22:11:13.850207 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket.
Aug 5 22:11:13.851149 systemd[1]: iscsiuio.socket: Deactivated successfully.
Aug 5 22:11:13.851186 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Aug 5 22:11:13.852344 systemd[1]: ignition-setup.service: Deactivated successfully.
Aug 5 22:11:13.852414 systemd[1]: Stopped ignition-setup.service - Ignition (setup).
Aug 5 22:11:13.853458 systemd[1]: ignition-setup-pre.service: Deactivated successfully.
Aug 5 22:11:13.853507 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup.
Aug 5 22:11:13.854704 systemd[1]: Stopping systemd-networkd.service - Network Configuration...
Aug 5 22:11:13.856025 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution...
Aug 5 22:11:13.860277 systemd-networkd[701]: eth0: DHCPv6 lease lost
Aug 5 22:11:13.863092 systemd[1]: systemd-networkd.service: Deactivated successfully.
Aug 5 22:11:13.863377 systemd[1]: Stopped systemd-networkd.service - Network Configuration.
Aug 5 22:11:13.864998 systemd[1]: systemd-resolved.service: Deactivated successfully.
Aug 5 22:11:13.865117 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution.
Aug 5 22:11:13.868994 systemd[1]: systemd-networkd.socket: Deactivated successfully.
Aug 5 22:11:13.869080 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket.
Aug 5 22:11:13.873378 systemd[1]: Stopping network-cleanup.service - Network Cleanup...
Aug 5 22:11:13.875051 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully.
Aug 5 22:11:13.875117 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Aug 5 22:11:13.877085 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Aug 5 22:11:13.877137 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables.
Aug 5 22:11:13.878335 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Aug 5 22:11:13.878382 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules.
Aug 5 22:11:13.879427 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Aug 5 22:11:13.879472 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create Volatile Files and Directories.
Aug 5 22:11:13.883317 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files...
Aug 5 22:11:13.894404 systemd[1]: systemd-udevd.service: Deactivated successfully.
Aug 5 22:11:13.894570 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files.
Aug 5 22:11:13.901031 systemd[1]: network-cleanup.service: Deactivated successfully.
Aug 5 22:11:13.901133 systemd[1]: Stopped network-cleanup.service - Network Cleanup.
Aug 5 22:11:13.902648 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Aug 5 22:11:13.902701 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket.
Aug 5 22:11:13.903989 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Aug 5 22:11:13.904022 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket.
Aug 5 22:11:13.905013 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Aug 5 22:11:13.905058 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook.
Aug 5 22:11:13.906713 systemd[1]: dracut-cmdline.service: Deactivated successfully.
Aug 5 22:11:13.906755 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook.
Aug 5 22:11:13.907912 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
Aug 5 22:11:13.907955 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Aug 5 22:11:13.924419 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database...
Aug 5 22:11:13.925031 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Aug 5 22:11:13.925092 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Aug 5 22:11:13.925689 systemd[1]: systemd-tmpfiles-setup-dev-early.service: Deactivated successfully.
Aug 5 22:11:13.925730 systemd[1]: Stopped systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Aug 5 22:11:13.926319 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Aug 5 22:11:13.926363 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes.
Aug 5 22:11:13.931100 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Aug 5 22:11:13.931159 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Aug 5 22:11:13.933344 systemd[1]: sysroot-boot.mount: Deactivated successfully.
Aug 5 22:11:13.933913 systemd[1]: sysroot-boot.service: Deactivated successfully.
Aug 5 22:11:13.934017 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot.
Aug 5 22:11:13.934720 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Aug 5 22:11:13.934797 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database.
Aug 5 22:11:13.936088 systemd[1]: Reached target initrd-switch-root.target - Switch Root.
Aug 5 22:11:13.937103 systemd[1]: initrd-setup-root.service: Deactivated successfully.
Aug 5 22:11:13.937154 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup.
Aug 5 22:11:13.943455 systemd[1]: Starting initrd-switch-root.service - Switch Root...
Aug 5 22:11:13.957122 systemd[1]: Switching root.
Aug 5 22:11:13.991412 systemd-journald[184]: Journal stopped
Aug 5 22:11:16.572503 systemd-journald[184]: Received SIGTERM from PID 1 (systemd).
Aug 5 22:11:16.572570 kernel: SELinux: policy capability network_peer_controls=1
Aug 5 22:11:16.572589 kernel: SELinux: policy capability open_perms=1
Aug 5 22:11:16.572601 kernel: SELinux: policy capability extended_socket_class=1
Aug 5 22:11:16.572620 kernel: SELinux: policy capability always_check_network=0
Aug 5 22:11:16.572632 kernel: SELinux: policy capability cgroup_seclabel=1
Aug 5 22:11:16.572643 kernel: SELinux: policy capability nnp_nosuid_transition=1
Aug 5 22:11:16.572663 kernel: SELinux: policy capability genfs_seclabel_symlinks=0
Aug 5 22:11:16.572674 kernel: SELinux: policy capability ioctl_skip_cloexec=0
Aug 5 22:11:16.572684 kernel: audit: type=1403 audit(1722895875.368:2): auid=4294967295 ses=4294967295 lsm=selinux res=1
Aug 5 22:11:16.572697 systemd[1]: Successfully loaded SELinux policy in 70.935ms.
Aug 5 22:11:16.572728 systemd[1]: Relabeled /dev, /dev/shm, /run, /sys/fs/cgroup in 19.202ms.
Aug 5 22:11:16.572741 systemd[1]: systemd 255 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT default-hierarchy=unified)
Aug 5 22:11:16.572754 systemd[1]: Detected virtualization kvm.
Aug 5 22:11:16.572767 systemd[1]: Detected architecture x86-64.
Aug 5 22:11:16.572782 systemd[1]: Detected first boot.
Aug 5 22:11:16.572794 systemd[1]: Hostname set to .
Aug 5 22:11:16.572807 systemd[1]: Initializing machine ID from VM UUID.
Aug 5 22:11:16.572820 zram_generator::config[989]: No configuration found.
Aug 5 22:11:16.572839 systemd[1]: Populated /etc with preset unit settings.
Aug 5 22:11:16.572851 systemd[1]: initrd-switch-root.service: Deactivated successfully.
Aug 5 22:11:16.572863 systemd[1]: Stopped initrd-switch-root.service - Switch Root.
Aug 5 22:11:16.572875 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Aug 5 22:11:16.572891 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config.
Aug 5 22:11:16.572904 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run.
Aug 5 22:11:16.572916 systemd[1]: Created slice system-getty.slice - Slice /system/getty.
Aug 5 22:11:16.572928 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe.
Aug 5 22:11:16.572941 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty.
Aug 5 22:11:16.572954 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit.
Aug 5 22:11:16.572967 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck.
Aug 5 22:11:16.572980 systemd[1]: Created slice user.slice - User and Session Slice.
Aug 5 22:11:16.572992 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Aug 5 22:11:16.573008 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Aug 5 22:11:16.573021 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch.
Aug 5 22:11:16.573033 systemd[1]: Set up automount boot.automount - Boot partition Automount Point.
Aug 5 22:11:16.573045 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point.
Aug 5 22:11:16.573058 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Aug 5 22:11:16.573070 systemd[1]: Expecting device dev-ttyS0.device - /dev/ttyS0...
Aug 5 22:11:16.573082 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Aug 5 22:11:16.573095 systemd[1]: Stopped target initrd-switch-root.target - Switch Root.
Aug 5 22:11:16.573110 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems.
Aug 5 22:11:16.573123 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System.
Aug 5 22:11:16.573135 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes.
Aug 5 22:11:16.573147 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Aug 5 22:11:16.573160 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Aug 5 22:11:16.573173 systemd[1]: Reached target slices.target - Slice Units.
Aug 5 22:11:16.573185 systemd[1]: Reached target swap.target - Swaps.
Aug 5 22:11:16.573200 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes.
Aug 5 22:11:16.573212 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket.
Aug 5 22:11:16.573246 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Aug 5 22:11:16.573260 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Aug 5 22:11:16.573277 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Aug 5 22:11:16.573289 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket.
Aug 5 22:11:16.573302 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System...
Aug 5 22:11:16.573314 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System...
Aug 5 22:11:16.573327 systemd[1]: Mounting media.mount - External Media Directory...
Aug 5 22:11:16.573342 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Aug 5 22:11:16.573354 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System...
Aug 5 22:11:16.573366 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System...
Aug 5 22:11:16.573378 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp...
Aug 5 22:11:16.573391 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Aug 5 22:11:16.573403 systemd[1]: Reached target machines.target - Containers.
Aug 5 22:11:16.573416 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files...
Aug 5 22:11:16.573428 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Aug 5 22:11:16.573443 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Aug 5 22:11:16.573455 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs...
Aug 5 22:11:16.573467 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Aug 5 22:11:16.573479 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Aug 5 22:11:16.573492 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Aug 5 22:11:16.573504 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse...
Aug 5 22:11:16.573516 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Aug 5 22:11:16.573528 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf).
Aug 5 22:11:16.573542 systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Aug 5 22:11:16.573557 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device.
Aug 5 22:11:16.573569 systemd[1]: systemd-fsck-usr.service: Deactivated successfully.
Aug 5 22:11:16.573581 systemd[1]: Stopped systemd-fsck-usr.service.
Aug 5 22:11:16.573594 systemd[1]: Starting systemd-journald.service - Journal Service...
Aug 5 22:11:16.573606 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Aug 5 22:11:16.573618 kernel: loop: module loaded
Aug 5 22:11:16.573630 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
Aug 5 22:11:16.573642 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems...
Aug 5 22:11:16.573654 kernel: fuse: init (API version 7.39)
Aug 5 22:11:16.573667 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Aug 5 22:11:16.573680 systemd[1]: verity-setup.service: Deactivated successfully.
Aug 5 22:11:16.573692 systemd[1]: Stopped verity-setup.service.
Aug 5 22:11:16.573705 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Aug 5 22:11:16.573717 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System.
Aug 5 22:11:16.573729 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System.
Aug 5 22:11:16.573742 systemd[1]: Mounted media.mount - External Media Directory.
Aug 5 22:11:16.573756 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System.
Aug 5 22:11:16.573768 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System.
Aug 5 22:11:16.573780 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp.
Aug 5 22:11:16.573792 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Aug 5 22:11:16.573806 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Aug 5 22:11:16.573834 systemd-journald[1070]: Collecting audit messages is disabled.
Aug 5 22:11:16.573862 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs.
Aug 5 22:11:16.573875 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Aug 5 22:11:16.573887 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Aug 5 22:11:16.573903 systemd-journald[1070]: Journal started
Aug 5 22:11:16.573930 systemd-journald[1070]: Runtime Journal (/run/log/journal/3a0c4425f3c947b2a6652bd55521a554) is 4.9M, max 39.3M, 34.4M free.
Aug 5 22:11:16.039194 systemd[1]: Queued start job for default target multi-user.target.
Aug 5 22:11:16.141856 systemd[1]: Unnecessary job was removed for dev-vda6.device - /dev/vda6.
Aug 5 22:11:16.142784 systemd[1]: systemd-journald.service: Deactivated successfully.
Aug 5 22:11:16.578304 systemd[1]: Started systemd-journald.service - Journal Service.
Aug 5 22:11:16.579684 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Aug 5 22:11:16.581291 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Aug 5 22:11:16.582138 systemd[1]: modprobe@fuse.service: Deactivated successfully.
Aug 5 22:11:16.582300 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse.
Aug 5 22:11:16.583038 systemd[1]: modprobe@loop.service: Deactivated successfully.
Aug 5 22:11:16.583170 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Aug 5 22:11:16.584598 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Aug 5 22:11:16.586445 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Aug 5 22:11:16.587724 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems.
Aug 5 22:11:16.610244 kernel: ACPI: bus type drm_connector registered
Aug 5 22:11:16.616101 systemd[1]: modprobe@drm.service: Deactivated successfully.
Aug 5 22:11:16.616333 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Aug 5 22:11:16.620999 systemd[1]: Reached target network-pre.target - Preparation for Network.
Aug 5 22:11:16.628404 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System...
Aug 5 22:11:16.636328 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System...
Aug 5 22:11:16.636980 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/).
Aug 5 22:11:16.637018 systemd[1]: Reached target local-fs.target - Local File Systems.
Aug 5 22:11:16.639788 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management (Varlink).
Aug 5 22:11:16.646442 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown...
Aug 5 22:11:16.656559 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache...
Aug 5 22:11:16.658425 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Aug 5 22:11:16.678466 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database...
Aug 5 22:11:16.686721 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage...
Aug 5 22:11:16.687447 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Aug 5 22:11:16.689176 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed...
Aug 5 22:11:16.690268 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Aug 5 22:11:16.693211 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Aug 5 22:11:16.696396 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/...
Aug 5 22:11:16.705572 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
Aug 5 22:11:16.708309 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files.
Aug 5 22:11:16.709182 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Aug 5 22:11:16.709960 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System.
Aug 5 22:11:16.717821 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System.
Aug 5 22:11:16.722514 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown.
Aug 5 22:11:16.733358 systemd[1]: Starting systemd-udev-settle.service - Wait for udev To Complete Device Initialization...
Aug 5 22:11:16.747724 systemd-journald[1070]: Time spent on flushing to /var/log/journal/3a0c4425f3c947b2a6652bd55521a554 is 51.515ms for 941 entries.
Aug 5 22:11:16.747724 systemd-journald[1070]: System Journal (/var/log/journal/3a0c4425f3c947b2a6652bd55521a554) is 8.0M, max 584.8M, 576.8M free.
Aug 5 22:11:16.877504 systemd-journald[1070]: Received client request to flush runtime journal.
Aug 5 22:11:16.877558 kernel: loop0: detected capacity change from 0 to 80568
Aug 5 22:11:16.877581 kernel: block loop0: the capability attribute has been deprecated.
Aug 5 22:11:16.777055 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed.
Aug 5 22:11:16.780965 udevadm[1127]: systemd-udev-settle.service is deprecated. Please fix lvm2-activation-early.service, lvm2-activation.service not to pull it in.
Aug 5 22:11:16.782107 systemd[1]: Reached target first-boot-complete.target - First Boot Complete.
Aug 5 22:11:16.789518 systemd[1]: Starting systemd-machine-id-commit.service - Commit a transient machine-id on disk...
Aug 5 22:11:16.807184 systemd-tmpfiles[1121]: ACLs are not supported, ignoring.
Aug 5 22:11:16.807199 systemd-tmpfiles[1121]: ACLs are not supported, ignoring.
Aug 5 22:11:16.814291 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Aug 5 22:11:16.821832 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Aug 5 22:11:16.834732 systemd[1]: Starting systemd-sysusers.service - Create System Users...
Aug 5 22:11:16.882628 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage.
Aug 5 22:11:16.914018 systemd[1]: etc-machine\x2did.mount: Deactivated successfully.
Aug 5 22:11:16.919327 systemd[1]: Finished systemd-machine-id-commit.service - Commit a transient machine-id on disk.
Aug 5 22:11:16.923288 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher
Aug 5 22:11:16.945704 systemd[1]: Finished systemd-sysusers.service - Create System Users.
Aug 5 22:11:16.956411 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Aug 5 22:11:16.963274 kernel: loop1: detected capacity change from 0 to 139904
Aug 5 22:11:16.980584 systemd-tmpfiles[1146]: ACLs are not supported, ignoring.
Aug 5 22:11:16.980606 systemd-tmpfiles[1146]: ACLs are not supported, ignoring.
Aug 5 22:11:16.989927 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Aug 5 22:11:17.047340 kernel: loop2: detected capacity change from 0 to 8
Aug 5 22:11:17.072317 kernel: loop3: detected capacity change from 0 to 211296
Aug 5 22:11:17.134267 kernel: loop4: detected capacity change from 0 to 80568
Aug 5 22:11:17.194244 kernel: loop5: detected capacity change from 0 to 139904
Aug 5 22:11:17.265255 kernel: loop6: detected capacity change from 0 to 8
Aug 5 22:11:17.270247 kernel: loop7: detected capacity change from 0 to 211296
Aug 5 22:11:17.341149 (sd-merge)[1152]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-openstack'.
Aug 5 22:11:17.342513 (sd-merge)[1152]: Merged extensions into '/usr'.
Aug 5 22:11:17.356602 systemd[1]: Reloading requested from client PID 1120 ('systemd-sysext') (unit systemd-sysext.service)...
Aug 5 22:11:17.356638 systemd[1]: Reloading...
Aug 5 22:11:17.466268 zram_generator::config[1176]: No configuration found.
Aug 5 22:11:17.677702 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Aug 5 22:11:17.748519 systemd[1]: Reloading finished in 390 ms.
Aug 5 22:11:17.775155 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database.
Aug 5 22:11:17.776452 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/.
Aug 5 22:11:17.784405 systemd[1]: Starting ensure-sysext.service...
Aug 5 22:11:17.787563 systemd[1]: Starting systemd-tmpfiles-setup.service - Create Volatile Files and Directories...
Aug 5 22:11:17.796756 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Aug 5 22:11:17.812411 systemd[1]: Reloading requested from client PID 1232 ('systemctl') (unit ensure-sysext.service)...
Aug 5 22:11:17.812424 systemd[1]: Reloading...
Aug 5 22:11:17.831592 systemd-tmpfiles[1233]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring.
Aug 5 22:11:17.832032 systemd-tmpfiles[1233]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring.
Aug 5 22:11:17.833007 systemd-tmpfiles[1233]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring.
Aug 5 22:11:17.835926 systemd-tmpfiles[1233]: ACLs are not supported, ignoring.
Aug 5 22:11:17.836003 systemd-tmpfiles[1233]: ACLs are not supported, ignoring.
Aug 5 22:11:17.842870 systemd-tmpfiles[1233]: Detected autofs mount point /boot during canonicalization of boot.
Aug 5 22:11:17.842889 systemd-tmpfiles[1233]: Skipping /boot
Aug 5 22:11:17.848002 ldconfig[1108]: /sbin/ldconfig: /lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start.
Aug 5 22:11:17.874121 systemd-tmpfiles[1233]: Detected autofs mount point /boot during canonicalization of boot.
Aug 5 22:11:17.874136 systemd-tmpfiles[1233]: Skipping /boot
Aug 5 22:11:17.883241 zram_generator::config[1256]: No configuration found.
Aug 5 22:11:17.897525 systemd-udevd[1234]: Using default interface naming scheme 'v255'.
Aug 5 22:11:18.025261 kernel: BTRFS info: devid 1 device path /dev/mapper/usr changed to /dev/dm-0 scanned by (udev-worker) (1293)
Aug 5 22:11:18.063266 kernel: BTRFS warning: duplicate device /dev/vda3 devid 1 generation 36 scanned by (udev-worker) (1311)
Aug 5 22:11:18.104943 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Aug 5 22:11:18.117282 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input2
Aug 5 22:11:18.147367 kernel: piix4_smbus 0000:00:01.3: SMBus Host Controller at 0x700, revision 0
Aug 5 22:11:18.168530 kernel: ACPI: button: Power Button [PWRF]
Aug 5 22:11:18.180257 kernel: input: ImExPS/2 Generic Explorer Mouse as /devices/platform/i8042/serio1/input/input3
Aug 5 22:11:18.226254 kernel: mousedev: PS/2 mouse device common for all mice
Aug 5 22:11:18.239802 systemd[1]: Condition check resulted in dev-ttyS0.device - /dev/ttyS0 being skipped.
Aug 5 22:11:18.240503 systemd[1]: Reloading finished in 427 ms.
Aug 5 22:11:18.248305 kernel: [drm] pci: virtio-vga detected at 0000:00:02.0
Aug 5 22:11:18.248422 kernel: virtio-pci 0000:00:02.0: vgaarb: deactivate vga console
Aug 5 22:11:18.253251 kernel: Console: switching to colour dummy device 80x25
Aug 5 22:11:18.254367 kernel: [drm] features: -virgl +edid -resource_blob -host_visible
Aug 5 22:11:18.254405 kernel: [drm] features: -context_init
Aug 5 22:11:18.258194 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Aug 5 22:11:18.259245 kernel: [drm] number of scanouts: 1
Aug 5 22:11:18.259301 kernel: [drm] number of cap sets: 0
Aug 5 22:11:18.262328 kernel: [drm] Initialized virtio_gpu 0.1.0 0 for 0000:00:02.0 on minor 0
Aug 5 22:11:18.261338 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache.
Aug 5 22:11:18.269619 kernel: fbcon: virtio_gpudrmfb (fb0) is primary device
Aug 5 22:11:18.269739 kernel: Console: switching to colour frame buffer device 128x48
Aug 5 22:11:18.268270 systemd[1]: Finished systemd-tmpfiles-setup.service - Create Volatile Files and Directories.
Aug 5 22:11:18.272780 kernel: virtio-pci 0000:00:02.0: [drm] fb0: virtio_gpudrmfb frame buffer device
Aug 5 22:11:18.309514 systemd[1]: Finished ensure-sysext.service.
Aug 5 22:11:18.318488 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM.
Aug 5 22:11:18.319532 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Aug 5 22:11:18.324371 systemd[1]: Starting audit-rules.service - Load Security Auditing Rules...
Aug 5 22:11:18.327366 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs...
Aug 5 22:11:18.327584 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Aug 5 22:11:18.332450 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Aug 5 22:11:18.335413 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Aug 5 22:11:18.336913 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Aug 5 22:11:18.341105 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Aug 5 22:11:18.341407 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Aug 5 22:11:18.346724 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM...
Aug 5 22:11:18.350370 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog...
Aug 5 22:11:18.360414 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Aug 5 22:11:18.364261 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Aug 5 22:11:18.369478 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization...
Aug 5 22:11:18.378426 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP...
Aug 5 22:11:18.384449 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Aug 5 22:11:18.390663 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Aug 5 22:11:18.392043 systemd[1]: Finished systemd-udev-settle.service - Wait for udev To Complete Device Initialization. Aug 5 22:11:18.396903 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Aug 5 22:11:18.397060 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Aug 5 22:11:18.397651 systemd[1]: modprobe@drm.service: Deactivated successfully. Aug 5 22:11:18.398314 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Aug 5 22:11:18.403148 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Aug 5 22:11:18.403909 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Aug 5 22:11:18.405643 systemd[1]: modprobe@loop.service: Deactivated successfully. Aug 5 22:11:18.405950 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Aug 5 22:11:18.421558 systemd[1]: Starting lvm2-activation-early.service - Activation of LVM2 logical volumes... Aug 5 22:11:18.422988 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Aug 5 22:11:18.423055 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Aug 5 22:11:18.432433 systemd[1]: Starting systemd-userdbd.service - User Database Manager... Aug 5 22:11:18.433359 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. Aug 5 22:11:18.447328 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP. Aug 5 22:11:18.470850 lvm[1374]: WARNING: Failed to connect to lvmetad. Falling back to device scanning. Aug 5 22:11:18.485670 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog. Aug 5 22:11:18.489290 augenrules[1384]: No rules Aug 5 22:11:18.497426 systemd[1]: Starting systemd-update-done.service - Update is Completed... 
Aug 5 22:11:18.498332 systemd[1]: Finished audit-rules.service - Load Security Auditing Rules. Aug 5 22:11:18.514301 systemd[1]: Finished lvm2-activation-early.service - Activation of LVM2 logical volumes. Aug 5 22:11:18.519027 systemd[1]: Finished systemd-update-done.service - Update is Completed. Aug 5 22:11:18.521714 systemd[1]: Started systemd-userdbd.service - User Database Manager. Aug 5 22:11:18.526497 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Aug 5 22:11:18.539408 systemd[1]: Starting lvm2-activation.service - Activation of LVM2 logical volumes... Aug 5 22:11:18.541008 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs. Aug 5 22:11:18.544192 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). Aug 5 22:11:18.562155 lvm[1398]: WARNING: Failed to connect to lvmetad. Falling back to device scanning. Aug 5 22:11:18.570665 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Aug 5 22:11:18.609581 systemd[1]: Finished lvm2-activation.service - Activation of LVM2 logical volumes. Aug 5 22:11:18.612208 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization. Aug 5 22:11:18.616612 systemd[1]: Reached target time-set.target - System Time Set. Aug 5 22:11:18.646501 systemd-networkd[1359]: lo: Link UP Aug 5 22:11:18.646515 systemd-networkd[1359]: lo: Gained carrier Aug 5 22:11:18.647803 systemd-networkd[1359]: Enumeration completed Aug 5 22:11:18.647886 systemd[1]: Started systemd-networkd.service - Network Configuration. Aug 5 22:11:18.652550 systemd-networkd[1359]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. 
Aug 5 22:11:18.652561 systemd-networkd[1359]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Aug 5 22:11:18.653563 systemd-networkd[1359]: eth0: Link UP Aug 5 22:11:18.653569 systemd-networkd[1359]: eth0: Gained carrier Aug 5 22:11:18.653582 systemd-networkd[1359]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Aug 5 22:11:18.654089 systemd-resolved[1360]: Positive Trust Anchors: Aug 5 22:11:18.654370 systemd-resolved[1360]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Aug 5 22:11:18.654416 systemd-resolved[1360]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa corp home internal intranet lan local private test Aug 5 22:11:18.659450 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured... Aug 5 22:11:18.660739 systemd-resolved[1360]: Using system hostname 'ci-3975-2-0-1-de7b5ef465.novalocal'. Aug 5 22:11:18.662810 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Aug 5 22:11:18.666897 systemd[1]: Reached target network.target - Network. Aug 5 22:11:18.668381 systemd-networkd[1359]: eth0: DHCPv4 address 172.24.4.33/24, gateway 172.24.4.1 acquired from 172.24.4.1 Aug 5 22:11:18.669473 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Aug 5 22:11:18.672175 systemd[1]: Reached target sysinit.target - System Initialization. Aug 5 22:11:18.673562 systemd-timesyncd[1361]: Network configuration changed, trying to establish connection. 
Aug 5 22:11:18.674983 systemd[1]: Started motdgen.path - Watch for update engine configuration changes. Aug 5 22:11:18.677055 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data. Aug 5 22:11:18.679868 systemd[1]: Started logrotate.timer - Daily rotation of log files. Aug 5 22:11:18.684783 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information.. Aug 5 22:11:18.686953 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories. Aug 5 22:11:18.688957 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). Aug 5 22:11:18.689098 systemd[1]: Reached target paths.target - Path Units. Aug 5 22:11:18.691093 systemd[1]: Reached target timers.target - Timer Units. Aug 5 22:11:18.696270 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket. Aug 5 22:11:18.701669 systemd[1]: Starting docker.socket - Docker Socket for the API... Aug 5 22:11:18.710352 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket. Aug 5 22:11:18.715727 systemd[1]: Listening on docker.socket - Docker Socket for the API. Aug 5 22:11:18.716467 systemd[1]: Reached target sockets.target - Socket Units. Aug 5 22:11:18.716982 systemd[1]: Reached target basic.target - Basic System. Aug 5 22:11:18.717510 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met. Aug 5 22:11:18.717544 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met. Aug 5 22:11:18.729349 systemd[1]: Starting containerd.service - containerd container runtime... Aug 5 22:11:18.732763 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent... Aug 5 22:11:18.739475 systemd[1]: Starting dbus.service - D-Bus System Message Bus... 
Aug 5 22:11:18.742903 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... Aug 5 22:11:18.753482 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... Aug 5 22:11:18.755827 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). Aug 5 22:11:18.760399 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... Aug 5 22:11:18.766302 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin... Aug 5 22:11:18.777347 jq[1416]: false Aug 5 22:11:18.777466 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... Aug 5 22:11:18.782027 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... Aug 5 22:11:18.790520 systemd[1]: Starting systemd-logind.service - User Login Management... Aug 5 22:11:18.791650 extend-filesystems[1417]: Found loop4 Aug 5 22:11:18.800001 extend-filesystems[1417]: Found loop5 Aug 5 22:11:18.800001 extend-filesystems[1417]: Found loop6 Aug 5 22:11:18.800001 extend-filesystems[1417]: Found loop7 Aug 5 22:11:18.800001 extend-filesystems[1417]: Found vda Aug 5 22:11:18.800001 extend-filesystems[1417]: Found vda1 Aug 5 22:11:18.800001 extend-filesystems[1417]: Found vda2 Aug 5 22:11:18.800001 extend-filesystems[1417]: Found vda3 Aug 5 22:11:18.800001 extend-filesystems[1417]: Found usr Aug 5 22:11:18.800001 extend-filesystems[1417]: Found vda4 Aug 5 22:11:18.800001 extend-filesystems[1417]: Found vda6 Aug 5 22:11:18.800001 extend-filesystems[1417]: Found vda7 Aug 5 22:11:18.800001 extend-filesystems[1417]: Found vda9 Aug 5 22:11:18.800001 extend-filesystems[1417]: Checking size of /dev/vda9 Aug 5 22:11:18.798964 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0). 
Aug 5 22:11:18.877871 dbus-daemon[1413]: [system] SELinux support is enabled Aug 5 22:11:18.884513 extend-filesystems[1417]: Resized partition /dev/vda9 Aug 5 22:11:18.803968 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. Aug 5 22:11:18.890787 extend-filesystems[1440]: resize2fs 1.47.0 (5-Feb-2023) Aug 5 22:11:18.898920 kernel: EXT4-fs (vda9): resizing filesystem from 1617920 to 4635643 blocks Aug 5 22:11:18.805400 systemd[1]: Starting update-engine.service - Update Engine... Aug 5 22:11:18.832367 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... Aug 5 22:11:18.900968 jq[1431]: true Aug 5 22:11:18.847599 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. Aug 5 22:11:18.901502 update_engine[1425]: I0805 22:11:18.889689 1425 main.cc:92] Flatcar Update Engine starting Aug 5 22:11:18.847811 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. Aug 5 22:11:18.861880 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. Aug 5 22:11:18.862112 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. Aug 5 22:11:18.873880 systemd[1]: motdgen.service: Deactivated successfully. Aug 5 22:11:18.875302 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. Aug 5 22:11:18.887530 systemd[1]: Started dbus.service - D-Bus System Message Bus. Aug 5 22:11:18.918249 update_engine[1425]: I0805 22:11:18.917549 1425 update_check_scheduler.cc:74] Next update check in 2m3s Aug 5 22:11:18.926835 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). 
Aug 5 22:11:18.927546 jq[1443]: true Aug 5 22:11:18.926869 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. Aug 5 22:11:18.932629 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). Aug 5 22:11:18.932664 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. Aug 5 22:11:18.941432 tar[1438]: linux-amd64/helm Aug 5 22:11:18.943452 systemd[1]: Started update-engine.service - Update Engine. Aug 5 22:11:18.953039 systemd[1]: Started locksmithd.service - Cluster reboot manager. Aug 5 22:11:18.956201 (ntainerd)[1444]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR Aug 5 22:11:18.961252 kernel: BTRFS warning: duplicate device /dev/vda3 devid 1 generation 36 scanned by (udev-worker) (1303) Aug 5 22:11:19.051449 systemd-logind[1423]: New seat seat0. Aug 5 22:11:19.067690 systemd-logind[1423]: Watching system buttons on /dev/input/event1 (Power Button) Aug 5 22:11:19.067709 systemd-logind[1423]: Watching system buttons on /dev/input/event0 (AT Translated Set 2 keyboard) Aug 5 22:11:19.068240 systemd[1]: Started systemd-logind.service - User Login Management. Aug 5 22:11:19.110255 kernel: EXT4-fs (vda9): resized filesystem to 4635643 Aug 5 22:11:19.176482 extend-filesystems[1440]: Filesystem at /dev/vda9 is mounted on /; on-line resizing required Aug 5 22:11:19.176482 extend-filesystems[1440]: old_desc_blocks = 1, new_desc_blocks = 3 Aug 5 22:11:19.176482 extend-filesystems[1440]: The filesystem on /dev/vda9 is now 4635643 (4k) blocks long. Aug 5 22:11:19.176397 systemd[1]: extend-filesystems.service: Deactivated successfully. 
Aug 5 22:11:19.218545 bash[1469]: Updated "/home/core/.ssh/authorized_keys" Aug 5 22:11:19.224892 extend-filesystems[1417]: Resized filesystem in /dev/vda9 Aug 5 22:11:19.176580 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. Aug 5 22:11:19.189917 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. Aug 5 22:11:19.209517 systemd[1]: Starting sshkeys.service... Aug 5 22:11:19.256683 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys. Aug 5 22:11:19.271653 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)... Aug 5 22:11:19.331505 locksmithd[1456]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Aug 5 22:11:19.373575 sshd_keygen[1442]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Aug 5 22:11:19.445889 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. Aug 5 22:11:19.461649 systemd[1]: Starting issuegen.service - Generate /run/issue... Aug 5 22:11:19.477447 systemd[1]: issuegen.service: Deactivated successfully. Aug 5 22:11:19.478016 systemd[1]: Finished issuegen.service - Generate /run/issue. Aug 5 22:11:19.484456 containerd[1444]: time="2024-08-05T22:11:19.484349492Z" level=info msg="starting containerd" revision=1fbfc07f8d28210e62bdbcbf7b950bac8028afbf version=v1.7.17 Aug 5 22:11:19.489693 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... Aug 5 22:11:19.519622 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. Aug 5 22:11:19.534710 systemd[1]: Started getty@tty1.service - Getty on tty1. Aug 5 22:11:19.546009 containerd[1444]: time="2024-08-05T22:11:19.544603268Z" level=info msg="loading plugin \"io.containerd.warning.v1.deprecations\"..." 
type=io.containerd.warning.v1 Aug 5 22:11:19.546009 containerd[1444]: time="2024-08-05T22:11:19.544665655Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.aufs\"..." type=io.containerd.snapshotter.v1 Aug 5 22:11:19.546210 systemd[1]: Started serial-getty@ttyS0.service - Serial Getty on ttyS0. Aug 5 22:11:19.548647 systemd[1]: Reached target getty.target - Login Prompts. Aug 5 22:11:19.554763 containerd[1444]: time="2024-08-05T22:11:19.554714131Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.aufs\"..." error="aufs is not supported (modprobe aufs failed: exit status 1 \"modprobe: FATAL: Module aufs not found in directory /lib/modules/6.6.43-flatcar\\n\"): skip plugin" type=io.containerd.snapshotter.v1 Aug 5 22:11:19.554841 containerd[1444]: time="2024-08-05T22:11:19.554765117Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.btrfs\"..." type=io.containerd.snapshotter.v1 Aug 5 22:11:19.555102 containerd[1444]: time="2024-08-05T22:11:19.555058336Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.btrfs\"..." error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" type=io.containerd.snapshotter.v1 Aug 5 22:11:19.555102 containerd[1444]: time="2024-08-05T22:11:19.555098432Z" level=info msg="loading plugin \"io.containerd.content.v1.content\"..." type=io.containerd.content.v1 Aug 5 22:11:19.555587 containerd[1444]: time="2024-08-05T22:11:19.555289009Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.blockfile\"..." type=io.containerd.snapshotter.v1 Aug 5 22:11:19.555676 containerd[1444]: time="2024-08-05T22:11:19.555642662Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.blockfile\"..." 
error="no scratch file generator: skip plugin" type=io.containerd.snapshotter.v1 Aug 5 22:11:19.555712 containerd[1444]: time="2024-08-05T22:11:19.555671797Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.native\"..." type=io.containerd.snapshotter.v1 Aug 5 22:11:19.555775 containerd[1444]: time="2024-08-05T22:11:19.555750064Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.overlayfs\"..." type=io.containerd.snapshotter.v1 Aug 5 22:11:19.555994 containerd[1444]: time="2024-08-05T22:11:19.555967051Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.devmapper\"..." type=io.containerd.snapshotter.v1 Aug 5 22:11:19.556026 containerd[1444]: time="2024-08-05T22:11:19.555993170Z" level=warning msg="failed to load plugin io.containerd.snapshotter.v1.devmapper" error="devmapper not configured" Aug 5 22:11:19.556026 containerd[1444]: time="2024-08-05T22:11:19.556006385Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.zfs\"..." type=io.containerd.snapshotter.v1 Aug 5 22:11:19.556149 containerd[1444]: time="2024-08-05T22:11:19.556121080Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.zfs\"..." error="path /var/lib/containerd/io.containerd.snapshotter.v1.zfs must be a zfs filesystem to be used with the zfs snapshotter: skip plugin" type=io.containerd.snapshotter.v1 Aug 5 22:11:19.556149 containerd[1444]: time="2024-08-05T22:11:19.556145586Z" level=info msg="loading plugin \"io.containerd.metadata.v1.bolt\"..." 
type=io.containerd.metadata.v1 Aug 5 22:11:19.556250 containerd[1444]: time="2024-08-05T22:11:19.556205107Z" level=warning msg="could not use snapshotter devmapper in metadata plugin" error="devmapper not configured" Aug 5 22:11:19.556280 containerd[1444]: time="2024-08-05T22:11:19.556247056Z" level=info msg="metadata content store policy set" policy=shared Aug 5 22:11:19.563517 containerd[1444]: time="2024-08-05T22:11:19.563472497Z" level=info msg="loading plugin \"io.containerd.differ.v1.walking\"..." type=io.containerd.differ.v1 Aug 5 22:11:19.563582 containerd[1444]: time="2024-08-05T22:11:19.563527170Z" level=info msg="loading plugin \"io.containerd.event.v1.exchange\"..." type=io.containerd.event.v1 Aug 5 22:11:19.563582 containerd[1444]: time="2024-08-05T22:11:19.563544933Z" level=info msg="loading plugin \"io.containerd.gc.v1.scheduler\"..." type=io.containerd.gc.v1 Aug 5 22:11:19.563634 containerd[1444]: time="2024-08-05T22:11:19.563592492Z" level=info msg="loading plugin \"io.containerd.lease.v1.manager\"..." type=io.containerd.lease.v1 Aug 5 22:11:19.563634 containerd[1444]: time="2024-08-05T22:11:19.563619453Z" level=info msg="loading plugin \"io.containerd.nri.v1.nri\"..." type=io.containerd.nri.v1 Aug 5 22:11:19.563695 containerd[1444]: time="2024-08-05T22:11:19.563632698Z" level=info msg="NRI interface is disabled by configuration." Aug 5 22:11:19.563724 containerd[1444]: time="2024-08-05T22:11:19.563696267Z" level=info msg="loading plugin \"io.containerd.runtime.v2.task\"..." type=io.containerd.runtime.v2 Aug 5 22:11:19.563890 containerd[1444]: time="2024-08-05T22:11:19.563864242Z" level=info msg="loading plugin \"io.containerd.runtime.v2.shim\"..." type=io.containerd.runtime.v2 Aug 5 22:11:19.563923 containerd[1444]: time="2024-08-05T22:11:19.563892234Z" level=info msg="loading plugin \"io.containerd.sandbox.store.v1.local\"..." 
type=io.containerd.sandbox.store.v1 Aug 5 22:11:19.563923 containerd[1444]: time="2024-08-05T22:11:19.563910429Z" level=info msg="loading plugin \"io.containerd.sandbox.controller.v1.local\"..." type=io.containerd.sandbox.controller.v1 Aug 5 22:11:19.563967 containerd[1444]: time="2024-08-05T22:11:19.563926619Z" level=info msg="loading plugin \"io.containerd.streaming.v1.manager\"..." type=io.containerd.streaming.v1 Aug 5 22:11:19.563967 containerd[1444]: time="2024-08-05T22:11:19.563947288Z" level=info msg="loading plugin \"io.containerd.service.v1.introspection-service\"..." type=io.containerd.service.v1 Aug 5 22:11:19.564008 containerd[1444]: time="2024-08-05T22:11:19.563967806Z" level=info msg="loading plugin \"io.containerd.service.v1.containers-service\"..." type=io.containerd.service.v1 Aug 5 22:11:19.564008 containerd[1444]: time="2024-08-05T22:11:19.563985600Z" level=info msg="loading plugin \"io.containerd.service.v1.content-service\"..." type=io.containerd.service.v1 Aug 5 22:11:19.564008 containerd[1444]: time="2024-08-05T22:11:19.564001700Z" level=info msg="loading plugin \"io.containerd.service.v1.diff-service\"..." type=io.containerd.service.v1 Aug 5 22:11:19.564068 containerd[1444]: time="2024-08-05T22:11:19.564019964Z" level=info msg="loading plugin \"io.containerd.service.v1.images-service\"..." type=io.containerd.service.v1 Aug 5 22:11:19.564068 containerd[1444]: time="2024-08-05T22:11:19.564039671Z" level=info msg="loading plugin \"io.containerd.service.v1.namespaces-service\"..." type=io.containerd.service.v1 Aug 5 22:11:19.564068 containerd[1444]: time="2024-08-05T22:11:19.564055821Z" level=info msg="loading plugin \"io.containerd.service.v1.snapshots-service\"..." type=io.containerd.service.v1 Aug 5 22:11:19.564129 containerd[1444]: time="2024-08-05T22:11:19.564071481Z" level=info msg="loading plugin \"io.containerd.runtime.v1.linux\"..." 
type=io.containerd.runtime.v1 Aug 5 22:11:19.564212 containerd[1444]: time="2024-08-05T22:11:19.564183511Z" level=info msg="loading plugin \"io.containerd.monitor.v1.cgroups\"..." type=io.containerd.monitor.v1 Aug 5 22:11:19.564673 containerd[1444]: time="2024-08-05T22:11:19.564645427Z" level=info msg="loading plugin \"io.containerd.service.v1.tasks-service\"..." type=io.containerd.service.v1 Aug 5 22:11:19.564717 containerd[1444]: time="2024-08-05T22:11:19.564683118Z" level=info msg="loading plugin \"io.containerd.grpc.v1.introspection\"..." type=io.containerd.grpc.v1 Aug 5 22:11:19.564717 containerd[1444]: time="2024-08-05T22:11:19.564700260Z" level=info msg="loading plugin \"io.containerd.transfer.v1.local\"..." type=io.containerd.transfer.v1 Aug 5 22:11:19.564768 containerd[1444]: time="2024-08-05T22:11:19.564726419Z" level=info msg="loading plugin \"io.containerd.internal.v1.restart\"..." type=io.containerd.internal.v1 Aug 5 22:11:19.564877 containerd[1444]: time="2024-08-05T22:11:19.564850181Z" level=info msg="loading plugin \"io.containerd.grpc.v1.containers\"..." type=io.containerd.grpc.v1 Aug 5 22:11:19.564908 containerd[1444]: time="2024-08-05T22:11:19.564877813Z" level=info msg="loading plugin \"io.containerd.grpc.v1.content\"..." type=io.containerd.grpc.v1 Aug 5 22:11:19.564977 containerd[1444]: time="2024-08-05T22:11:19.564952192Z" level=info msg="loading plugin \"io.containerd.grpc.v1.diff\"..." type=io.containerd.grpc.v1 Aug 5 22:11:19.565014 containerd[1444]: time="2024-08-05T22:11:19.564976338Z" level=info msg="loading plugin \"io.containerd.grpc.v1.events\"..." type=io.containerd.grpc.v1 Aug 5 22:11:19.565014 containerd[1444]: time="2024-08-05T22:11:19.564992889Z" level=info msg="loading plugin \"io.containerd.grpc.v1.healthcheck\"..." type=io.containerd.grpc.v1 Aug 5 22:11:19.565014 containerd[1444]: time="2024-08-05T22:11:19.565010652Z" level=info msg="loading plugin \"io.containerd.grpc.v1.images\"..." 
type=io.containerd.grpc.v1 Aug 5 22:11:19.565084 containerd[1444]: time="2024-08-05T22:11:19.565026923Z" level=info msg="loading plugin \"io.containerd.grpc.v1.leases\"..." type=io.containerd.grpc.v1 Aug 5 22:11:19.565084 containerd[1444]: time="2024-08-05T22:11:19.565042742Z" level=info msg="loading plugin \"io.containerd.grpc.v1.namespaces\"..." type=io.containerd.grpc.v1 Aug 5 22:11:19.565084 containerd[1444]: time="2024-08-05T22:11:19.565059574Z" level=info msg="loading plugin \"io.containerd.internal.v1.opt\"..." type=io.containerd.internal.v1 Aug 5 22:11:19.565259 containerd[1444]: time="2024-08-05T22:11:19.565233650Z" level=info msg="loading plugin \"io.containerd.grpc.v1.sandbox-controllers\"..." type=io.containerd.grpc.v1 Aug 5 22:11:19.565294 containerd[1444]: time="2024-08-05T22:11:19.565260310Z" level=info msg="loading plugin \"io.containerd.grpc.v1.sandboxes\"..." type=io.containerd.grpc.v1 Aug 5 22:11:19.565294 containerd[1444]: time="2024-08-05T22:11:19.565276952Z" level=info msg="loading plugin \"io.containerd.grpc.v1.snapshots\"..." type=io.containerd.grpc.v1 Aug 5 22:11:19.565337 containerd[1444]: time="2024-08-05T22:11:19.565292992Z" level=info msg="loading plugin \"io.containerd.grpc.v1.streaming\"..." type=io.containerd.grpc.v1 Aug 5 22:11:19.565337 containerd[1444]: time="2024-08-05T22:11:19.565309523Z" level=info msg="loading plugin \"io.containerd.grpc.v1.tasks\"..." type=io.containerd.grpc.v1 Aug 5 22:11:19.565337 containerd[1444]: time="2024-08-05T22:11:19.565326014Z" level=info msg="loading plugin \"io.containerd.grpc.v1.transfer\"..." type=io.containerd.grpc.v1 Aug 5 22:11:19.565410 containerd[1444]: time="2024-08-05T22:11:19.565346833Z" level=info msg="loading plugin \"io.containerd.grpc.v1.version\"..." type=io.containerd.grpc.v1 Aug 5 22:11:19.565410 containerd[1444]: time="2024-08-05T22:11:19.565361480Z" level=info msg="loading plugin \"io.containerd.grpc.v1.cri\"..." 
type=io.containerd.grpc.v1 Aug 5 22:11:19.565783 containerd[1444]: time="2024-08-05T22:11:19.565704273Z" level=info msg="Start cri plugin with config {PluginConfig:{ContainerdConfig:{Snapshotter:overlayfs DefaultRuntimeName:runc DefaultRuntime:{Type: Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:} UntrustedWorkloadRuntime:{Type: Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:} Runtimes:map[runc:{Type:io.containerd.runc.v2 Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[SystemdCgroup:true] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:podsandbox}] NoPivot:false DisableSnapshotAnnotations:true DiscardUnpackedLayers:false IgnoreBlockIONotEnabledErrors:false IgnoreRdtNotEnabledErrors:false} CniConfig:{NetworkPluginBinDir:/opt/cni/bin NetworkPluginConfDir:/etc/cni/net.d NetworkPluginMaxConfNum:1 NetworkPluginSetupSerially:false NetworkPluginConfTemplate: IPPreference:} Registry:{ConfigPath: Mirrors:map[] Configs:map[] Auths:map[] Headers:map[]} ImageDecryption:{KeyModel:node} DisableTCPService:true StreamServerAddress:127.0.0.1 StreamServerPort:0 StreamIdleTimeout:4h0m0s EnableSelinux:true SelinuxCategoryRange:1024 SandboxImage:registry.k8s.io/pause:3.8 StatsCollectPeriod:10 SystemdCgroup:false EnableTLSStreaming:false X509KeyPairStreaming:{TLSCertFile: TLSKeyFile:} MaxContainerLogLineSize:16384 DisableCgroup:false DisableApparmor:false RestrictOOMScoreAdj:false MaxConcurrentDownloads:3 DisableProcMount:false 
UnsetSeccompProfile: TolerateMissingHugetlbController:true DisableHugetlbController:true DeviceOwnershipFromSecurityContext:false IgnoreImageDefinedVolumes:false NetNSMountsUnderStateDir:false EnableUnprivilegedPorts:false EnableUnprivilegedICMP:false EnableCDI:false CDISpecDirs:[/etc/cdi /var/run/cdi] ImagePullProgressTimeout:5m0s DrainExecSyncIOTimeout:0s ImagePullWithSyncFs:false IgnoreDeprecationWarnings:[]} ContainerdRootDir:/var/lib/containerd ContainerdEndpoint:/run/containerd/containerd.sock RootDir:/var/lib/containerd/io.containerd.grpc.v1.cri StateDir:/run/containerd/io.containerd.grpc.v1.cri}" Aug 5 22:11:19.565916 containerd[1444]: time="2024-08-05T22:11:19.565784734Z" level=info msg="Connect containerd service" Aug 5 22:11:19.565916 containerd[1444]: time="2024-08-05T22:11:19.565814069Z" level=info msg="using legacy CRI server" Aug 5 22:11:19.565916 containerd[1444]: time="2024-08-05T22:11:19.565821272Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Aug 5 22:11:19.565916 containerd[1444]: time="2024-08-05T22:11:19.565908666Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\"" Aug 5 22:11:19.567228 containerd[1444]: time="2024-08-05T22:11:19.566818863Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Aug 5 22:11:19.567228 containerd[1444]: time="2024-08-05T22:11:19.566933488Z" level=info msg="loading plugin \"io.containerd.tracing.processor.v1.otlp\"..." type=io.containerd.tracing.processor.v1 Aug 5 22:11:19.567228 containerd[1444]: time="2024-08-05T22:11:19.566957523Z" level=info msg="skip loading plugin \"io.containerd.tracing.processor.v1.otlp\"..." 
error="skip plugin: tracing endpoint not configured" type=io.containerd.tracing.processor.v1 Aug 5 22:11:19.567228 containerd[1444]: time="2024-08-05T22:11:19.567032514Z" level=info msg="loading plugin \"io.containerd.internal.v1.tracing\"..." type=io.containerd.internal.v1 Aug 5 22:11:19.567228 containerd[1444]: time="2024-08-05T22:11:19.567053944Z" level=info msg="skip loading plugin \"io.containerd.internal.v1.tracing\"..." error="skip plugin: tracing endpoint not configured" type=io.containerd.internal.v1 Aug 5 22:11:19.567351 containerd[1444]: time="2024-08-05T22:11:19.567263497Z" level=info msg="Start subscribing containerd event" Aug 5 22:11:19.567351 containerd[1444]: time="2024-08-05T22:11:19.567309544Z" level=info msg="Start recovering state" Aug 5 22:11:19.567393 containerd[1444]: time="2024-08-05T22:11:19.567362753Z" level=info msg="Start event monitor" Aug 5 22:11:19.567393 containerd[1444]: time="2024-08-05T22:11:19.567387510Z" level=info msg="Start snapshots syncer" Aug 5 22:11:19.567434 containerd[1444]: time="2024-08-05T22:11:19.567397479Z" level=info msg="Start cni network conf syncer for default" Aug 5 22:11:19.567434 containerd[1444]: time="2024-08-05T22:11:19.567407087Z" level=info msg="Start streaming server" Aug 5 22:11:19.567715 containerd[1444]: time="2024-08-05T22:11:19.567688925Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Aug 5 22:11:19.567753 containerd[1444]: time="2024-08-05T22:11:19.567739881Z" level=info msg=serving... address=/run/containerd/containerd.sock Aug 5 22:11:19.568454 systemd[1]: Started containerd.service - containerd container runtime. Aug 5 22:11:19.570228 containerd[1444]: time="2024-08-05T22:11:19.570167965Z" level=info msg="containerd successfully booted in 0.093617s" Aug 5 22:11:19.788824 tar[1438]: linux-amd64/LICENSE Aug 5 22:11:19.789270 tar[1438]: linux-amd64/README.md Aug 5 22:11:19.803441 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. 
Aug 5 22:11:19.883720 systemd-networkd[1359]: eth0: Gained IPv6LL
Aug 5 22:11:19.885146 systemd-timesyncd[1361]: Network configuration changed, trying to establish connection.
Aug 5 22:11:19.889903 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured.
Aug 5 22:11:19.894292 systemd[1]: Reached target network-online.target - Network is Online.
Aug 5 22:11:19.908744 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Aug 5 22:11:19.917853 systemd[1]: Starting nvidia.service - NVIDIA Configure Service...
Aug 5 22:11:19.973182 systemd[1]: Finished nvidia.service - NVIDIA Configure Service.
Aug 5 22:11:21.141352 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd.
Aug 5 22:11:21.157033 systemd[1]: Started sshd@0-172.24.4.33:22-172.24.4.1:57016.service - OpenSSH per-connection server daemon (172.24.4.1:57016).
Aug 5 22:11:21.678190 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Aug 5 22:11:21.701847 (kubelet)[1531]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Aug 5 22:11:23.040398 sshd[1524]: Accepted publickey for core from 172.24.4.1 port 57016 ssh2: RSA SHA256:wJOsecOnS68Cf9bfpO6HbyavHDudnwLnl2CjsDHwoC8
Aug 5 22:11:23.043921 sshd[1524]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0)
Aug 5 22:11:23.076853 systemd-logind[1423]: New session 1 of user core.
Aug 5 22:11:23.079036 systemd[1]: Created slice user-500.slice - User Slice of UID 500.
Aug 5 22:11:23.090669 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500...
Aug 5 22:11:23.110412 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500.
Aug 5 22:11:23.122769 systemd[1]: Starting user@500.service - User Manager for UID 500...
Aug 5 22:11:23.130737 (systemd)[1540]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0)
Aug 5 22:11:23.254492 systemd[1540]: Queued start job for default target default.target.
Aug 5 22:11:23.266501 systemd[1540]: Created slice app.slice - User Application Slice.
Aug 5 22:11:23.266533 systemd[1540]: Reached target paths.target - Paths.
Aug 5 22:11:23.266548 systemd[1540]: Reached target timers.target - Timers.
Aug 5 22:11:23.270352 systemd[1540]: Starting dbus.socket - D-Bus User Message Bus Socket...
Aug 5 22:11:23.283146 systemd[1540]: Listening on dbus.socket - D-Bus User Message Bus Socket.
Aug 5 22:11:23.283329 systemd[1540]: Reached target sockets.target - Sockets.
Aug 5 22:11:23.283351 systemd[1540]: Reached target basic.target - Basic System.
Aug 5 22:11:23.283404 systemd[1540]: Reached target default.target - Main User Target.
Aug 5 22:11:23.283448 systemd[1540]: Startup finished in 143ms.
Aug 5 22:11:23.283529 systemd[1]: Started user@500.service - User Manager for UID 500.
Aug 5 22:11:23.291461 systemd[1]: Started session-1.scope - Session 1 of User core.
Aug 5 22:11:23.719028 systemd[1]: Started sshd@1-172.24.4.33:22-172.24.4.1:38374.service - OpenSSH per-connection server daemon (172.24.4.1:38374).
Aug 5 22:11:23.824273 kubelet[1531]: E0805 22:11:23.824070 1531 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Aug 5 22:11:23.828291 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Aug 5 22:11:23.828606 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Aug 5 22:11:23.829526 systemd[1]: kubelet.service: Consumed 1.967s CPU time.
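The kubelet failure above has one cause: `/var/lib/kubelet/config.yaml` does not exist yet, since that file is normally generated later by `kubeadm init` or `kubeadm join`. A minimal sketch of the existence check the kubelet is effectively failing at run.go:74 (the path is taken from the log; the helper name is made up for illustration):

```shell
# kubelet_cfg_state PATH: report whether a kubelet config file exists
# at PATH, mirroring the "no such file or directory" error above.
kubelet_cfg_state() {
    if [ -f "$1" ]; then
        echo present
    else
        echo missing
    fi
}

# The path the unit is looking for, as reported in the log:
kubelet_cfg_state /var/lib/kubelet/config.yaml
```

Until kubeadm writes that file, every restart attempt in the rest of this log exits with status 1 for the same reason.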
Aug 5 22:11:24.608378 login[1504]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0)
Aug 5 22:11:24.617861 login[1505]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0)
Aug 5 22:11:24.620663 systemd-logind[1423]: New session 2 of user core.
Aug 5 22:11:24.634733 systemd[1]: Started session-2.scope - Session 2 of User core.
Aug 5 22:11:24.643123 systemd-logind[1423]: New session 3 of user core.
Aug 5 22:11:24.656688 systemd[1]: Started session-3.scope - Session 3 of User core.
Aug 5 22:11:25.218478 sshd[1553]: Accepted publickey for core from 172.24.4.1 port 38374 ssh2: RSA SHA256:wJOsecOnS68Cf9bfpO6HbyavHDudnwLnl2CjsDHwoC8
Aug 5 22:11:25.221274 sshd[1553]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0)
Aug 5 22:11:25.230413 systemd-logind[1423]: New session 4 of user core.
Aug 5 22:11:25.244745 systemd[1]: Started session-4.scope - Session 4 of User core.
Aug 5 22:11:25.827951 coreos-metadata[1412]: Aug 05 22:11:25.827 WARN failed to locate config-drive, using the metadata service API instead
Aug 5 22:11:25.872692 sshd[1553]: pam_unix(sshd:session): session closed for user core
Aug 5 22:11:25.876805 coreos-metadata[1412]: Aug 05 22:11:25.876 INFO Fetching http://169.254.169.254/openstack/2012-08-10/meta_data.json: Attempt #1
Aug 5 22:11:25.886187 systemd[1]: sshd@1-172.24.4.33:22-172.24.4.1:38374.service: Deactivated successfully.
Aug 5 22:11:25.890478 systemd[1]: session-4.scope: Deactivated successfully.
Aug 5 22:11:25.894901 systemd-logind[1423]: Session 4 logged out. Waiting for processes to exit.
Aug 5 22:11:25.901033 systemd[1]: Started sshd@2-172.24.4.33:22-172.24.4.1:46168.service - OpenSSH per-connection server daemon (172.24.4.1:46168).
Aug 5 22:11:25.905266 systemd-logind[1423]: Removed session 4.
Aug 5 22:11:26.082042 coreos-metadata[1412]: Aug 05 22:11:26.081 INFO Fetch successful
Aug 5 22:11:26.082042 coreos-metadata[1412]: Aug 05 22:11:26.081 INFO Fetching http://169.254.169.254/latest/meta-data/hostname: Attempt #1
Aug 5 22:11:26.095429 coreos-metadata[1412]: Aug 05 22:11:26.095 INFO Fetch successful
Aug 5 22:11:26.095429 coreos-metadata[1412]: Aug 05 22:11:26.095 INFO Fetching http://169.254.169.254/latest/meta-data/instance-id: Attempt #1
Aug 5 22:11:26.109649 coreos-metadata[1412]: Aug 05 22:11:26.109 INFO Fetch successful
Aug 5 22:11:26.109804 coreos-metadata[1412]: Aug 05 22:11:26.109 INFO Fetching http://169.254.169.254/latest/meta-data/instance-type: Attempt #1
Aug 5 22:11:26.125665 coreos-metadata[1412]: Aug 05 22:11:26.125 INFO Fetch successful
Aug 5 22:11:26.125665 coreos-metadata[1412]: Aug 05 22:11:26.125 INFO Fetching http://169.254.169.254/latest/meta-data/local-ipv4: Attempt #1
Aug 5 22:11:26.143070 coreos-metadata[1412]: Aug 05 22:11:26.142 INFO Fetch successful
Aug 5 22:11:26.143070 coreos-metadata[1412]: Aug 05 22:11:26.143 INFO Fetching http://169.254.169.254/latest/meta-data/public-ipv4: Attempt #1
Aug 5 22:11:26.158659 coreos-metadata[1412]: Aug 05 22:11:26.158 INFO Fetch successful
Aug 5 22:11:26.222866 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent.
Aug 5 22:11:26.225749 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met.
Aug 5 22:11:26.387045 coreos-metadata[1484]: Aug 05 22:11:26.386 WARN failed to locate config-drive, using the metadata service API instead
Aug 5 22:11:26.430045 coreos-metadata[1484]: Aug 05 22:11:26.429 INFO Fetching http://169.254.169.254/latest/meta-data/public-keys: Attempt #1
Aug 5 22:11:26.451541 coreos-metadata[1484]: Aug 05 22:11:26.451 INFO Fetch successful
Aug 5 22:11:26.451541 coreos-metadata[1484]: Aug 05 22:11:26.451 INFO Fetching http://169.254.169.254/latest/meta-data/public-keys/0/openssh-key: Attempt #1
Aug 5 22:11:26.470288 coreos-metadata[1484]: Aug 05 22:11:26.470 INFO Fetch successful
Aug 5 22:11:26.479317 unknown[1484]: wrote ssh authorized keys file for user: core
Aug 5 22:11:26.515359 update-ssh-keys[1591]: Updated "/home/core/.ssh/authorized_keys"
Aug 5 22:11:26.518000 systemd[1]: Finished coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys).
Aug 5 22:11:26.521920 systemd[1]: Finished sshkeys.service.
Aug 5 22:11:26.524104 systemd[1]: Reached target multi-user.target - Multi-User System.
Aug 5 22:11:26.530385 systemd[1]: Startup finished in 1.285s (kernel) + 16.534s (initrd) + 11.230s (userspace) = 29.051s.
Aug 5 22:11:27.270416 sshd[1582]: Accepted publickey for core from 172.24.4.1 port 46168 ssh2: RSA SHA256:wJOsecOnS68Cf9bfpO6HbyavHDudnwLnl2CjsDHwoC8
Aug 5 22:11:27.273473 sshd[1582]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0)
Aug 5 22:11:27.284339 systemd-logind[1423]: New session 5 of user core.
Aug 5 22:11:27.296684 systemd[1]: Started session-5.scope - Session 5 of User core.
Aug 5 22:11:27.877553 sshd[1582]: pam_unix(sshd:session): session closed for user core
Aug 5 22:11:27.884921 systemd[1]: sshd@2-172.24.4.33:22-172.24.4.1:46168.service: Deactivated successfully.
Aug 5 22:11:27.888154 systemd[1]: session-5.scope: Deactivated successfully.
Aug 5 22:11:27.891986 systemd-logind[1423]: Session 5 logged out. Waiting for processes to exit.
Aug 5 22:11:27.894627 systemd-logind[1423]: Removed session 5.
Aug 5 22:11:34.079544 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1.
Aug 5 22:11:34.086646 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Aug 5 22:11:34.547676 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Aug 5 22:11:34.563943 (kubelet)[1607]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Aug 5 22:11:34.889767 kubelet[1607]: E0805 22:11:34.889423 1607 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Aug 5 22:11:34.897893 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Aug 5 22:11:34.898296 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Aug 5 22:11:37.901803 systemd[1]: Started sshd@3-172.24.4.33:22-172.24.4.1:40404.service - OpenSSH per-connection server daemon (172.24.4.1:40404).
Aug 5 22:11:39.123119 sshd[1616]: Accepted publickey for core from 172.24.4.1 port 40404 ssh2: RSA SHA256:wJOsecOnS68Cf9bfpO6HbyavHDudnwLnl2CjsDHwoC8
Aug 5 22:11:39.126399 sshd[1616]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0)
Aug 5 22:11:39.142667 systemd-logind[1423]: New session 6 of user core.
Aug 5 22:11:39.153557 systemd[1]: Started session-6.scope - Session 6 of User core.
Aug 5 22:11:40.084479 sshd[1616]: pam_unix(sshd:session): session closed for user core
Aug 5 22:11:40.098610 systemd[1]: sshd@3-172.24.4.33:22-172.24.4.1:40404.service: Deactivated successfully.
Aug 5 22:11:40.102786 systemd[1]: session-6.scope: Deactivated successfully.
Aug 5 22:11:40.107316 systemd-logind[1423]: Session 6 logged out. Waiting for processes to exit.
Aug 5 22:11:40.113796 systemd[1]: Started sshd@4-172.24.4.33:22-172.24.4.1:40410.service - OpenSSH per-connection server daemon (172.24.4.1:40410).
Aug 5 22:11:40.117500 systemd-logind[1423]: Removed session 6.
Aug 5 22:11:41.803400 sshd[1623]: Accepted publickey for core from 172.24.4.1 port 40410 ssh2: RSA SHA256:wJOsecOnS68Cf9bfpO6HbyavHDudnwLnl2CjsDHwoC8
Aug 5 22:11:41.806156 sshd[1623]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0)
Aug 5 22:11:41.817659 systemd-logind[1423]: New session 7 of user core.
Aug 5 22:11:41.825543 systemd[1]: Started session-7.scope - Session 7 of User core.
Aug 5 22:11:42.583080 sshd[1623]: pam_unix(sshd:session): session closed for user core
Aug 5 22:11:42.591756 systemd[1]: sshd@4-172.24.4.33:22-172.24.4.1:40410.service: Deactivated successfully.
Aug 5 22:11:42.592971 systemd[1]: session-7.scope: Deactivated successfully.
Aug 5 22:11:42.594508 systemd-logind[1423]: Session 7 logged out. Waiting for processes to exit.
Aug 5 22:11:42.602579 systemd[1]: Started sshd@5-172.24.4.33:22-172.24.4.1:40426.service - OpenSSH per-connection server daemon (172.24.4.1:40426).
Aug 5 22:11:42.604625 systemd-logind[1423]: Removed session 7.
Aug 5 22:11:44.627402 sshd[1630]: Accepted publickey for core from 172.24.4.1 port 40426 ssh2: RSA SHA256:wJOsecOnS68Cf9bfpO6HbyavHDudnwLnl2CjsDHwoC8
Aug 5 22:11:44.631634 sshd[1630]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0)
Aug 5 22:11:44.643640 systemd-logind[1423]: New session 8 of user core.
Aug 5 22:11:44.647579 systemd[1]: Started session-8.scope - Session 8 of User core.
Aug 5 22:11:45.079529 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2.
Aug 5 22:11:45.089676 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
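The "Scheduled restart job, restart counter is at N" entries come from systemd's Restart= handling: each kubelet failure is followed by a new start attempt roughly ten seconds later (failure at 22:11:23.8, restart scheduled at 22:11:34.0, and so on). A drop-in of the shape kubeadm typically ships would produce exactly that cadence; this fragment is an illustrative sketch, not a file captured from this host:

```ini
# Illustrative kubelet drop-in (resembles kubeadm's 10-kubeadm.conf;
# not read from this machine).
[Service]
Restart=always
RestartSec=10
EnvironmentFile=-/etc/default/kubelet
```

With Restart=always the unit keeps cycling until the missing config appears, which matches the counter climbing through the rest of this log.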
Aug 5 22:11:45.232622 sshd[1630]: pam_unix(sshd:session): session closed for user core
Aug 5 22:11:45.246608 systemd[1]: sshd@5-172.24.4.33:22-172.24.4.1:40426.service: Deactivated successfully.
Aug 5 22:11:45.253084 systemd[1]: session-8.scope: Deactivated successfully.
Aug 5 22:11:45.256500 systemd-logind[1423]: Session 8 logged out. Waiting for processes to exit.
Aug 5 22:11:45.273439 systemd[1]: Started sshd@6-172.24.4.33:22-172.24.4.1:42602.service - OpenSSH per-connection server daemon (172.24.4.1:42602).
Aug 5 22:11:45.278692 systemd-logind[1423]: Removed session 8.
Aug 5 22:11:45.482511 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Aug 5 22:11:45.484116 (kubelet)[1646]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Aug 5 22:11:45.739703 kubelet[1646]: E0805 22:11:45.739463 1646 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Aug 5 22:11:45.743427 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Aug 5 22:11:45.743603 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Aug 5 22:11:46.722021 sshd[1640]: Accepted publickey for core from 172.24.4.1 port 42602 ssh2: RSA SHA256:wJOsecOnS68Cf9bfpO6HbyavHDudnwLnl2CjsDHwoC8
Aug 5 22:11:46.724702 sshd[1640]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0)
Aug 5 22:11:46.734309 systemd-logind[1423]: New session 9 of user core.
Aug 5 22:11:46.746548 systemd[1]: Started session-9.scope - Session 9 of User core.
Aug 5 22:11:47.264643 sudo[1656]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1
Aug 5 22:11:47.265309 sudo[1656]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=500)
Aug 5 22:11:47.282736 sudo[1656]: pam_unix(sudo:session): session closed for user root
Aug 5 22:11:47.511794 sshd[1640]: pam_unix(sshd:session): session closed for user core
Aug 5 22:11:47.524650 systemd[1]: sshd@6-172.24.4.33:22-172.24.4.1:42602.service: Deactivated successfully.
Aug 5 22:11:47.527833 systemd[1]: session-9.scope: Deactivated successfully.
Aug 5 22:11:47.531634 systemd-logind[1423]: Session 9 logged out. Waiting for processes to exit.
Aug 5 22:11:47.546957 systemd[1]: Started sshd@7-172.24.4.33:22-172.24.4.1:42618.service - OpenSSH per-connection server daemon (172.24.4.1:42618).
Aug 5 22:11:47.550642 systemd-logind[1423]: Removed session 9.
Aug 5 22:11:48.908667 sshd[1661]: Accepted publickey for core from 172.24.4.1 port 42618 ssh2: RSA SHA256:wJOsecOnS68Cf9bfpO6HbyavHDudnwLnl2CjsDHwoC8
Aug 5 22:11:48.911476 sshd[1661]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0)
Aug 5 22:11:48.923857 systemd-logind[1423]: New session 10 of user core.
Aug 5 22:11:48.931585 systemd[1]: Started session-10.scope - Session 10 of User core.
Aug 5 22:11:49.516517 sudo[1665]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/bin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules
Aug 5 22:11:49.517118 sudo[1665]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=500)
Aug 5 22:11:49.525711 sudo[1665]: pam_unix(sudo:session): session closed for user root
Aug 5 22:11:49.540652 sudo[1664]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/bin/systemctl restart audit-rules
Aug 5 22:11:49.541970 sudo[1664]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=500)
Aug 5 22:11:49.570814 systemd[1]: Stopping audit-rules.service - Load Security Auditing Rules...
Aug 5 22:11:49.576761 auditctl[1668]: No rules
Aug 5 22:11:49.577441 systemd[1]: audit-rules.service: Deactivated successfully.
Aug 5 22:11:49.577905 systemd[1]: Stopped audit-rules.service - Load Security Auditing Rules.
Aug 5 22:11:49.590916 systemd[1]: Starting audit-rules.service - Load Security Auditing Rules...
Aug 5 22:11:49.655796 augenrules[1686]: No rules
Aug 5 22:11:49.656862 systemd[1]: Finished audit-rules.service - Load Security Auditing Rules.
Aug 5 22:11:49.660486 sudo[1664]: pam_unix(sudo:session): session closed for user root
Aug 5 22:11:49.830925 sshd[1661]: pam_unix(sshd:session): session closed for user core
Aug 5 22:11:49.845134 systemd[1]: sshd@7-172.24.4.33:22-172.24.4.1:42618.service: Deactivated successfully.
Aug 5 22:11:49.849550 systemd[1]: session-10.scope: Deactivated successfully.
Aug 5 22:11:49.853653 systemd-logind[1423]: Session 10 logged out. Waiting for processes to exit.
Aug 5 22:11:49.862845 systemd[1]: Started sshd@8-172.24.4.33:22-172.24.4.1:42626.service - OpenSSH per-connection server daemon (172.24.4.1:42626).
Aug 5 22:11:49.866077 systemd-logind[1423]: Removed session 10.
Aug 5 22:11:50.121581 systemd-timesyncd[1361]: Contacted time server 82.64.42.185:123 (2.flatcar.pool.ntp.org).
Aug 5 22:11:50.121677 systemd-timesyncd[1361]: Initial clock synchronization to Mon 2024-08-05 22:11:49.846938 UTC.
Aug 5 22:11:51.145196 sshd[1694]: Accepted publickey for core from 172.24.4.1 port 42626 ssh2: RSA SHA256:wJOsecOnS68Cf9bfpO6HbyavHDudnwLnl2CjsDHwoC8
Aug 5 22:11:51.148092 sshd[1694]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0)
Aug 5 22:11:51.160639 systemd-logind[1423]: New session 11 of user core.
Aug 5 22:11:51.164551 systemd[1]: Started session-11.scope - Session 11 of User core.
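The audit sequence above (remove two fragments under /etc/audit/rules.d/, then `systemctl restart audit-rules`) works because the service's start step runs augenrules, which rebuilds the active ruleset from whatever *.rules fragments remain; with both files deleted it reports "No rules". A simplified sketch of that merge step (the real augenrules also de-duplicates and writes the result to /etc/audit/audit.rules; the function name here is invented):

```shell
# merge_rules DIR: concatenate the *.rules fragments in DIR, the way
# augenrules assembles a ruleset (simplified; no de-duplication, no
# writing back to /etc/audit/audit.rules).
merge_rules() {
    dir="$1"
    if ls "$dir"/*.rules >/dev/null 2>&1; then
        cat "$dir"/*.rules
    else
        echo "No rules"
    fi
}
```

Deleting every fragment and restarting the service is therefore a supported way to empty the ruleset, which is what the log shows happening.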
Aug 5 22:11:51.508394 sudo[1697]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh
Aug 5 22:11:51.509045 sudo[1697]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=500)
Aug 5 22:11:51.829668 (dockerd)[1707]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU
Aug 5 22:11:51.829760 systemd[1]: Starting docker.service - Docker Application Container Engine...
Aug 5 22:11:52.348007 dockerd[1707]: time="2024-08-05T22:11:52.347035926Z" level=info msg="Starting up"
Aug 5 22:11:52.420921 dockerd[1707]: time="2024-08-05T22:11:52.420836309Z" level=info msg="Loading containers: start."
Aug 5 22:11:52.623345 kernel: Initializing XFRM netlink socket
Aug 5 22:11:52.792346 systemd-networkd[1359]: docker0: Link UP
Aug 5 22:11:52.810074 dockerd[1707]: time="2024-08-05T22:11:52.810020824Z" level=info msg="Loading containers: done."
Aug 5 22:11:52.935306 dockerd[1707]: time="2024-08-05T22:11:52.934722817Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2
Aug 5 22:11:52.935306 dockerd[1707]: time="2024-08-05T22:11:52.934904187Z" level=info msg="Docker daemon" commit=fca702de7f71362c8d103073c7e4a1d0a467fadd graphdriver=overlay2 version=24.0.9
Aug 5 22:11:52.935306 dockerd[1707]: time="2024-08-05T22:11:52.935018957Z" level=info msg="Daemon has completed initialization"
Aug 5 22:11:52.977381 dockerd[1707]: time="2024-08-05T22:11:52.977203892Z" level=info msg="API listen on /run/docker.sock"
Aug 5 22:11:52.977930 systemd[1]: Started docker.service - Docker Application Container Engine.
Aug 5 22:11:54.628249 containerd[1444]: time="2024-08-05T22:11:54.628031899Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.29.7\""
Aug 5 22:11:55.387844 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2856788766.mount: Deactivated successfully.
Aug 5 22:11:55.851838 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3.
Aug 5 22:11:55.857606 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Aug 5 22:11:55.953378 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Aug 5 22:11:55.958814 (kubelet)[1897]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Aug 5 22:11:56.009259 kubelet[1897]: E0805 22:11:56.008914 1897 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Aug 5 22:11:56.011836 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Aug 5 22:11:56.011962 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Aug 5 22:11:57.647801 containerd[1444]: time="2024-08-05T22:11:57.647746048Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.29.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Aug 5 22:11:57.649111 containerd[1444]: time="2024-08-05T22:11:57.649004078Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.29.7: active requests=0, bytes read=35232404"
Aug 5 22:11:57.650038 containerd[1444]: time="2024-08-05T22:11:57.649981648Z" level=info msg="ImageCreate event name:\"sha256:a2e0d7fa8464a06b07519d78f53fef101bb1bcf716a85f2ac8b397f1a0025bea\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Aug 5 22:11:57.654234 containerd[1444]: time="2024-08-05T22:11:57.654178861Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:7b104771c13b9e3537846c3f6949000785e1fbc66d07f123ebcea22c8eb918b3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Aug 5 22:11:57.655036 containerd[1444]: time="2024-08-05T22:11:57.655000344Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.29.7\" with image id \"sha256:a2e0d7fa8464a06b07519d78f53fef101bb1bcf716a85f2ac8b397f1a0025bea\", repo tag \"registry.k8s.io/kube-apiserver:v1.29.7\", repo digest \"registry.k8s.io/kube-apiserver@sha256:7b104771c13b9e3537846c3f6949000785e1fbc66d07f123ebcea22c8eb918b3\", size \"35229196\" in 3.026924069s"
Aug 5 22:11:57.655085 containerd[1444]: time="2024-08-05T22:11:57.655038273Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.29.7\" returns image reference \"sha256:a2e0d7fa8464a06b07519d78f53fef101bb1bcf716a85f2ac8b397f1a0025bea\""
Aug 5 22:11:57.676908 containerd[1444]: time="2024-08-05T22:11:57.676867136Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.29.7\""
Aug 5 22:12:00.162283 containerd[1444]: time="2024-08-05T22:12:00.162114022Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.29.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Aug 5 22:12:00.163407 containerd[1444]: time="2024-08-05T22:12:00.163356343Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.29.7: active requests=0, bytes read=32204832"
Aug 5 22:12:00.164441 containerd[1444]: time="2024-08-05T22:12:00.164374395Z" level=info msg="ImageCreate event name:\"sha256:32fe966e5c2b2a05d6b6a56a63a60e09d4c227ec1742d68f921c0b72e23537f8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Aug 5 22:12:00.168089 containerd[1444]: time="2024-08-05T22:12:00.168017770Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:e3356f078f7ce72984385d4ca5e726a8cb05ce355d6b158f41aa9b5dbaff9b19\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Aug 5 22:12:00.169437 containerd[1444]: time="2024-08-05T22:12:00.169296640Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.29.7\" with image id \"sha256:32fe966e5c2b2a05d6b6a56a63a60e09d4c227ec1742d68f921c0b72e23537f8\", repo tag \"registry.k8s.io/kube-controller-manager:v1.29.7\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:e3356f078f7ce72984385d4ca5e726a8cb05ce355d6b158f41aa9b5dbaff9b19\", size \"33754770\" in 2.492390781s"
Aug 5 22:12:00.169437 containerd[1444]: time="2024-08-05T22:12:00.169336004Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.29.7\" returns image reference \"sha256:32fe966e5c2b2a05d6b6a56a63a60e09d4c227ec1742d68f921c0b72e23537f8\""
Aug 5 22:12:00.194667 containerd[1444]: time="2024-08-05T22:12:00.194617444Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.29.7\""
Aug 5 22:12:02.373329 containerd[1444]: time="2024-08-05T22:12:02.373024373Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.29.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Aug 5 22:12:02.376656 containerd[1444]: time="2024-08-05T22:12:02.376569842Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.29.7: active requests=0, bytes read=17320811"
Aug 5 22:12:02.379601 containerd[1444]: time="2024-08-05T22:12:02.379534854Z" level=info msg="ImageCreate event name:\"sha256:9cffb486021b39220589cbd71b6537e6f9cafdede1eba315b4b0dc83e2f4fc8e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Aug 5 22:12:02.388503 containerd[1444]: time="2024-08-05T22:12:02.388331793Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:c6203fbc102cc80a7d934946b7eacb7491480a65db56db203cb3035deecaaa39\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Aug 5 22:12:02.392848 containerd[1444]: time="2024-08-05T22:12:02.391380676Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.29.7\" with image id \"sha256:9cffb486021b39220589cbd71b6537e6f9cafdede1eba315b4b0dc83e2f4fc8e\", repo tag \"registry.k8s.io/kube-scheduler:v1.29.7\", repo digest \"registry.k8s.io/kube-scheduler@sha256:c6203fbc102cc80a7d934946b7eacb7491480a65db56db203cb3035deecaaa39\", size \"18870767\" in 2.19667619s"
Aug 5 22:12:02.392848 containerd[1444]: time="2024-08-05T22:12:02.391480510Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.29.7\" returns image reference \"sha256:9cffb486021b39220589cbd71b6537e6f9cafdede1eba315b4b0dc83e2f4fc8e\""
Aug 5 22:12:02.446630 containerd[1444]: time="2024-08-05T22:12:02.446482852Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.29.7\""
Aug 5 22:12:04.064906 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2009933567.mount: Deactivated successfully.
Aug 5 22:12:04.492759 update_engine[1425]: I0805 22:12:04.491290 1425 update_attempter.cc:509] Updating boot flags...
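Each "Pulled image" record above pairs a byte count with a duration, so effective registry throughput can be read straight off the log. For instance, the kube-apiserver image (35,229,196 bytes in ~3.027 s):

```shell
# Effective pull throughput for kube-apiserver:v1.29.7, using the
# size and duration reported in the log above.
bytes=35229196
secs=3.027
awk -v b="$bytes" -v s="$secs" 'BEGIN { printf "%.1f MiB/s\n", b / s / 1048576 }'
```

That works out to roughly 11 MiB/s, in line with the controller-manager and kube-proxy pulls recorded in the same stretch of the log.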
Aug 5 22:12:05.151451 containerd[1444]: time="2024-08-05T22:12:05.150355990Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.29.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Aug 5 22:12:05.153522 containerd[1444]: time="2024-08-05T22:12:05.153462620Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.29.7: active requests=0, bytes read=28600096"
Aug 5 22:12:05.156570 containerd[1444]: time="2024-08-05T22:12:05.156510023Z" level=info msg="ImageCreate event name:\"sha256:cc8c46cf9d741d1e8a357e5899f298d2f4ac4d890a2d248026b57e130e91cd07\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Aug 5 22:12:05.161529 containerd[1444]: time="2024-08-05T22:12:05.161469316Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:4d5e787d71c41243379cbb323d2b3a920fa50825cab19d20ef3344a808d18c4e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Aug 5 22:12:05.164751 containerd[1444]: time="2024-08-05T22:12:05.163484959Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.29.7\" with image id \"sha256:cc8c46cf9d741d1e8a357e5899f298d2f4ac4d890a2d248026b57e130e91cd07\", repo tag \"registry.k8s.io/kube-proxy:v1.29.7\", repo digest \"registry.k8s.io/kube-proxy@sha256:4d5e787d71c41243379cbb323d2b3a920fa50825cab19d20ef3344a808d18c4e\", size \"28599107\" in 2.716926199s"
Aug 5 22:12:05.164751 containerd[1444]: time="2024-08-05T22:12:05.163537151Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.29.7\" returns image reference \"sha256:cc8c46cf9d741d1e8a357e5899f298d2f4ac4d890a2d248026b57e130e91cd07\""
Aug 5 22:12:05.194703 kernel: BTRFS warning: duplicate device /dev/vda3 devid 1 generation 36 scanned by (udev-worker) (1947)
Aug 5 22:12:05.263375 kernel: BTRFS warning: duplicate device /dev/vda3 devid 1 generation 36 scanned by (udev-worker) (1951)
Aug 5 22:12:05.264758 containerd[1444]: time="2024-08-05T22:12:05.264349504Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.1\""
Aug 5 22:12:05.925790 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2098542752.mount: Deactivated successfully.
Aug 5 22:12:06.101419 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 4.
Aug 5 22:12:06.108622 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Aug 5 22:12:06.268558 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Aug 5 22:12:06.269641 (kubelet)[1979]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Aug 5 22:12:06.703274 kubelet[1979]: E0805 22:12:06.703130 1979 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Aug 5 22:12:06.707877 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Aug 5 22:12:06.708333 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Aug 5 22:12:08.098497 containerd[1444]: time="2024-08-05T22:12:08.098302445Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.11.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Aug 5 22:12:08.099804 containerd[1444]: time="2024-08-05T22:12:08.099761613Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.11.1: active requests=0, bytes read=18185769"
Aug 5 22:12:08.100579 containerd[1444]: time="2024-08-05T22:12:08.100273456Z" level=info msg="ImageCreate event name:\"sha256:cbb01a7bd410dc08ba382018ab909a674fb0e48687f0c00797ed5bc34fcc6bb4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Aug 5 22:12:08.103957 containerd[1444]: time="2024-08-05T22:12:08.103572785Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:1eeb4c7316bacb1d4c8ead65571cd92dd21e27359f0d4917f1a5822a73b75db1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Aug 5 22:12:08.105186 containerd[1444]: time="2024-08-05T22:12:08.104958247Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.11.1\" with image id \"sha256:cbb01a7bd410dc08ba382018ab909a674fb0e48687f0c00797ed5bc34fcc6bb4\", repo tag \"registry.k8s.io/coredns/coredns:v1.11.1\", repo digest \"registry.k8s.io/coredns/coredns@sha256:1eeb4c7316bacb1d4c8ead65571cd92dd21e27359f0d4917f1a5822a73b75db1\", size \"18182961\" in 2.84056012s"
Aug 5 22:12:08.105186 containerd[1444]: time="2024-08-05T22:12:08.104990074Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.1\" returns image reference \"sha256:cbb01a7bd410dc08ba382018ab909a674fb0e48687f0c00797ed5bc34fcc6bb4\""
Aug 5 22:12:08.128387 containerd[1444]: time="2024-08-05T22:12:08.128351698Z" level=info msg="PullImage \"registry.k8s.io/pause:3.9\""
Aug 5 22:12:08.862114 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3356582118.mount: Deactivated successfully.
Aug 5 22:12:08.870367 containerd[1444]: time="2024-08-05T22:12:08.870274120Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Aug 5 22:12:08.872866 containerd[1444]: time="2024-08-05T22:12:08.872746343Z" level=info msg="stop pulling image registry.k8s.io/pause:3.9: active requests=0, bytes read=322298"
Aug 5 22:12:08.873652 containerd[1444]: time="2024-08-05T22:12:08.873511882Z" level=info msg="ImageCreate event name:\"sha256:e6f1816883972d4be47bd48879a08919b96afcd344132622e4d444987919323c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Aug 5 22:12:08.879138 containerd[1444]: time="2024-08-05T22:12:08.878999459Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:7031c1b283388d2c2e09b57badb803c05ebed362dc88d84b480cc47f72a21097\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Aug 5 22:12:08.882045 containerd[1444]: time="2024-08-05T22:12:08.881762175Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.9\" with image id \"sha256:e6f1816883972d4be47bd48879a08919b96afcd344132622e4d444987919323c\", repo tag \"registry.k8s.io/pause:3.9\", repo digest \"registry.k8s.io/pause@sha256:7031c1b283388d2c2e09b57badb803c05ebed362dc88d84b480cc47f72a21097\", size \"321520\" in 753.358045ms"
Aug 5 22:12:08.882045 containerd[1444]: time="2024-08-05T22:12:08.881845016Z" level=info msg="PullImage \"registry.k8s.io/pause:3.9\" returns image reference \"sha256:e6f1816883972d4be47bd48879a08919b96afcd344132622e4d444987919323c\""
Aug 5 22:12:08.931333 containerd[1444]: time="2024-08-05T22:12:08.931239237Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.10-0\""
Aug 5 22:12:09.683761 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2115379506.mount: Deactivated successfully.
Aug 5 22:12:13.282632 containerd[1444]: time="2024-08-05T22:12:13.282473060Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.10-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Aug 5 22:12:13.283976 containerd[1444]: time="2024-08-05T22:12:13.283940545Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.10-0: active requests=0, bytes read=56651633"
Aug 5 22:12:13.285497 containerd[1444]: time="2024-08-05T22:12:13.285435563Z" level=info msg="ImageCreate event name:\"sha256:a0eed15eed4498c145ef2f1883fcd300d7adbb759df73c901abd5383dda668e7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Aug 5 22:12:13.288975 containerd[1444]: time="2024-08-05T22:12:13.288909576Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:22f892d7672adc0b9c86df67792afdb8b2dc08880f49f669eaaa59c47d7908c2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Aug 5 22:12:13.290523 containerd[1444]: time="2024-08-05T22:12:13.290267255Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.10-0\" with image id \"sha256:a0eed15eed4498c145ef2f1883fcd300d7adbb759df73c901abd5383dda668e7\", repo tag \"registry.k8s.io/etcd:3.5.10-0\", repo digest \"registry.k8s.io/etcd@sha256:22f892d7672adc0b9c86df67792afdb8b2dc08880f49f669eaaa59c47d7908c2\", size \"56649232\" in 4.358978663s"
Aug 5 22:12:13.290523 containerd[1444]: time="2024-08-05T22:12:13.290302337Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.10-0\" returns image reference \"sha256:a0eed15eed4498c145ef2f1883fcd300d7adbb759df73c901abd5383dda668e7\""
Aug 5 22:12:16.852470 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 5.
Aug 5 22:12:16.863440 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Aug 5 22:12:17.286527 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Aug 5 22:12:17.294669 (kubelet)[2144]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Aug 5 22:12:17.508705 kubelet[2144]: E0805 22:12:17.508661 2144 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Aug 5 22:12:17.515598 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Aug 5 22:12:17.516204 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Aug 5 22:12:18.276040 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Aug 5 22:12:18.283721 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Aug 5 22:12:18.335110 systemd[1]: Reloading requested from client PID 2159 ('systemctl') (unit session-11.scope)...
Aug 5 22:12:18.335139 systemd[1]: Reloading...
Aug 5 22:12:18.444362 zram_generator::config[2193]: No configuration found.
Aug 5 22:12:19.011879 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Aug 5 22:12:19.095657 systemd[1]: Reloading finished in 759 ms.
Aug 5 22:12:19.140659 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM
Aug 5 22:12:19.140738 systemd[1]: kubelet.service: Failed with result 'signal'.
Aug 5 22:12:19.141061 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Aug 5 22:12:19.147480 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Aug 5 22:12:19.266745 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Aug 5 22:12:19.281485 (kubelet)[2261]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS
Aug 5 22:12:19.744447 kubelet[2261]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Aug 5 22:12:19.747242 kubelet[2261]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI.
Aug 5 22:12:19.747242 kubelet[2261]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Aug 5 22:12:19.747242 kubelet[2261]: I0805 22:12:19.745314 2261 server.go:204] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Aug 5 22:12:20.076379 kubelet[2261]: I0805 22:12:20.076299 2261 server.go:487] "Kubelet version" kubeletVersion="v1.29.2"
Aug 5 22:12:20.076715 kubelet[2261]: I0805 22:12:20.076702 2261 server.go:489] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Aug 5 22:12:20.076995 kubelet[2261]: I0805 22:12:20.076980 2261 server.go:919] "Client rotation is on, will bootstrap in background"
Aug 5 22:12:20.112042 kubelet[2261]: I0805 22:12:20.112016 2261 dynamic_cafile_content.go:157] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
Aug 5 22:12:20.114868 kubelet[2261]: E0805 22:12:20.114803 2261 certificate_manager.go:562] kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post "https://172.24.4.33:6443/apis/certificates.k8s.io/v1/certificatesigningrequests": dial tcp 172.24.4.33:6443: connect: connection refused
Aug 5 22:12:20.126571 kubelet[2261]: I0805 22:12:20.126548 2261 server.go:745] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /"
Aug 5 22:12:20.126922 kubelet[2261]: I0805 22:12:20.126909 2261 container_manager_linux.go:265] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Aug 5 22:12:20.128348 kubelet[2261]: I0805 22:12:20.128322 2261 container_manager_linux.go:270] "Creating Container Manager object based on Node Config" nodeConfig={"RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null}
Aug 5 22:12:20.129402 kubelet[2261]: I0805 22:12:20.129387 2261 topology_manager.go:138] "Creating topology manager with none policy"
Aug 5 22:12:20.129478 kubelet[2261]: I0805 22:12:20.129468 2261 container_manager_linux.go:301] "Creating device plugin manager"
Aug 5 22:12:20.133309 kubelet[2261]: I0805 22:12:20.133296 2261 state_mem.go:36] "Initialized new in-memory state store"
Aug 5 22:12:20.134570 kubelet[2261]: W0805 22:12:20.134485 2261 reflector.go:539] vendor/k8s.io/client-go/informers/factory.go:159: failed to list *v1.Node: Get "https://172.24.4.33:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-3975-2-0-1-de7b5ef465.novalocal&limit=500&resourceVersion=0": dial tcp 172.24.4.33:6443: connect: connection refused
Aug 5 22:12:20.134625 kubelet[2261]: E0805 22:12:20.134596 2261 reflector.go:147] vendor/k8s.io/client-go/informers/factory.go:159: Failed to watch *v1.Node: failed to list *v1.Node: Get "https://172.24.4.33:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-3975-2-0-1-de7b5ef465.novalocal&limit=500&resourceVersion=0": dial tcp 172.24.4.33:6443: connect: connection refused
Aug 5 22:12:20.136424 kubelet[2261]: I0805 22:12:20.136185 2261 kubelet.go:396] "Attempting to sync node with API server"
Aug 5 22:12:20.136424 kubelet[2261]: I0805 22:12:20.136231 2261 kubelet.go:301] "Adding static pod path" path="/etc/kubernetes/manifests"
Aug 5 22:12:20.136424 kubelet[2261]: I0805 22:12:20.136275 2261 kubelet.go:312] "Adding apiserver pod source"
Aug 5 22:12:20.136424 kubelet[2261]: I0805 22:12:20.136289 2261 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Aug 5 22:12:20.138815 kubelet[2261]: I0805 22:12:20.138594 2261 kuberuntime_manager.go:258] "Container runtime initialized" containerRuntime="containerd" version="v1.7.17" apiVersion="v1"
Aug 5 22:12:20.143795 kubelet[2261]: I0805 22:12:20.143682 2261 kubelet.go:809] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Aug 5 22:12:20.145733 kubelet[2261]: W0805 22:12:20.145452 2261 probe.go:268] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating.
Aug 5 22:12:20.147210 kubelet[2261]: I0805 22:12:20.146806 2261 server.go:1256] "Started kubelet"
Aug 5 22:12:20.147210 kubelet[2261]: W0805 22:12:20.147029 2261 reflector.go:539] vendor/k8s.io/client-go/informers/factory.go:159: failed to list *v1.Service: Get "https://172.24.4.33:6443/api/v1/services?limit=500&resourceVersion=0": dial tcp 172.24.4.33:6443: connect: connection refused
Aug 5 22:12:20.147210 kubelet[2261]: E0805 22:12:20.147135 2261 reflector.go:147] vendor/k8s.io/client-go/informers/factory.go:159: Failed to watch *v1.Service: failed to list *v1.Service: Get "https://172.24.4.33:6443/api/v1/services?limit=500&resourceVersion=0": dial tcp 172.24.4.33:6443: connect: connection refused
Aug 5 22:12:20.147637 kubelet[2261]: I0805 22:12:20.147600 2261 server.go:162] "Starting to listen" address="0.0.0.0" port=10250
Aug 5 22:12:20.149867 kubelet[2261]: I0805 22:12:20.149436 2261 server.go:461] "Adding debug handlers to kubelet server"
Aug 5 22:12:20.154382 kubelet[2261]: I0805 22:12:20.154365 2261 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Aug 5 22:12:20.154931 kubelet[2261]: I0805 22:12:20.154656 2261 server.go:233] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Aug 5 22:12:20.157109 kubelet[2261]: E0805 22:12:20.157049 2261 event.go:355] "Unable to write event (may retry after sleeping)" err="Post \"https://172.24.4.33:6443/api/v1/namespaces/default/events\": dial tcp 172.24.4.33:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-3975-2-0-1-de7b5ef465.novalocal.17e8f4bab1696758 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-3975-2-0-1-de7b5ef465.novalocal,UID:ci-3975-2-0-1-de7b5ef465.novalocal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-3975-2-0-1-de7b5ef465.novalocal,},FirstTimestamp:2024-08-05 22:12:20.146784088 +0000 UTC m=+0.861384461,LastTimestamp:2024-08-05 22:12:20.146784088 +0000 UTC m=+0.861384461,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-3975-2-0-1-de7b5ef465.novalocal,}"
Aug 5 22:12:20.157822 kubelet[2261]: I0805 22:12:20.157701 2261 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Aug 5 22:12:20.165472 kubelet[2261]: E0805 22:12:20.165439 2261 kubelet_node_status.go:462] "Error getting the current node from lister" err="node \"ci-3975-2-0-1-de7b5ef465.novalocal\" not found"
Aug 5 22:12:20.167244 kubelet[2261]: I0805 22:12:20.165616 2261 volume_manager.go:291] "Starting Kubelet Volume Manager"
Aug 5 22:12:20.167244 kubelet[2261]: I0805 22:12:20.165760 2261 desired_state_of_world_populator.go:151] "Desired state populator starts to run"
Aug 5 22:12:20.167244 kubelet[2261]: I0805 22:12:20.165836 2261 reconciler_new.go:29] "Reconciler: start to sync state"
Aug 5 22:12:20.167244 kubelet[2261]: W0805 22:12:20.166245 2261 reflector.go:539] vendor/k8s.io/client-go/informers/factory.go:159: failed to list *v1.CSIDriver: Get "https://172.24.4.33:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 172.24.4.33:6443: connect: connection refused
Aug 5 22:12:20.167244 kubelet[2261]: E0805 22:12:20.166287 2261 reflector.go:147] vendor/k8s.io/client-go/informers/factory.go:159: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get "https://172.24.4.33:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 172.24.4.33:6443: connect: connection refused
Aug 5 22:12:20.167244 kubelet[2261]: E0805 22:12:20.166517 2261 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://172.24.4.33:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-3975-2-0-1-de7b5ef465.novalocal?timeout=10s\": dial tcp 172.24.4.33:6443: connect: connection refused" interval="200ms"
Aug 5 22:12:20.177683 kubelet[2261]: I0805 22:12:20.177660 2261 factory.go:221] Registration of the systemd container factory successfully
Aug 5 22:12:20.177978 kubelet[2261]: I0805 22:12:20.177959 2261 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory
Aug 5 22:12:20.179367 kubelet[2261]: E0805 22:12:20.179353 2261 kubelet.go:1462] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem"
Aug 5 22:12:20.179823 kubelet[2261]: I0805 22:12:20.179809 2261 factory.go:221] Registration of the containerd container factory successfully
Aug 5 22:12:20.199434 kubelet[2261]: I0805 22:12:20.199410 2261 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4"
Aug 5 22:12:20.200627 kubelet[2261]: I0805 22:12:20.200613 2261 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6"
Aug 5 22:12:20.200716 kubelet[2261]: I0805 22:12:20.200707 2261 status_manager.go:217] "Starting to sync pod status with apiserver"
Aug 5 22:12:20.200787 kubelet[2261]: I0805 22:12:20.200778 2261 kubelet.go:2329] "Starting kubelet main sync loop"
Aug 5 22:12:20.200880 kubelet[2261]: E0805 22:12:20.200870 2261 kubelet.go:2353] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Aug 5 22:12:20.211934 kubelet[2261]: W0805 22:12:20.211834 2261 reflector.go:539] vendor/k8s.io/client-go/informers/factory.go:159: failed to list *v1.RuntimeClass: Get "https://172.24.4.33:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 172.24.4.33:6443: connect: connection refused
Aug 5 22:12:20.212039 kubelet[2261]: E0805 22:12:20.211967 2261 reflector.go:147] vendor/k8s.io/client-go/informers/factory.go:159: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get "https://172.24.4.33:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 172.24.4.33:6443: connect: connection refused
Aug 5 22:12:20.222656 kubelet[2261]: I0805 22:12:20.222620 2261 cpu_manager.go:214] "Starting CPU manager" policy="none"
Aug 5 22:12:20.222748 kubelet[2261]: I0805 22:12:20.222702 2261 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s"
Aug 5 22:12:20.222748 kubelet[2261]: I0805 22:12:20.222739 2261 state_mem.go:36] "Initialized new in-memory state store"
Aug 5 22:12:20.227468 kubelet[2261]: I0805 22:12:20.227431 2261 policy_none.go:49] "None policy: Start"
Aug 5 22:12:20.229074 kubelet[2261]: I0805 22:12:20.228729 2261 memory_manager.go:170] "Starting memorymanager" policy="None"
Aug 5 22:12:20.229074 kubelet[2261]: I0805 22:12:20.228765 2261 state_mem.go:35] "Initializing new in-memory state store"
Aug 5 22:12:20.238004 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice.
Aug 5 22:12:20.250972 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice.
Aug 5 22:12:20.256712 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice.
Aug 5 22:12:20.268315 kubelet[2261]: I0805 22:12:20.268019 2261 manager.go:479] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found"
Aug 5 22:12:20.268315 kubelet[2261]: I0805 22:12:20.268240 2261 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Aug 5 22:12:20.270675 kubelet[2261]: I0805 22:12:20.270581 2261 kubelet_node_status.go:73] "Attempting to register node" node="ci-3975-2-0-1-de7b5ef465.novalocal"
Aug 5 22:12:20.270975 kubelet[2261]: E0805 22:12:20.270876 2261 kubelet_node_status.go:96] "Unable to register node with API server" err="Post \"https://172.24.4.33:6443/api/v1/nodes\": dial tcp 172.24.4.33:6443: connect: connection refused" node="ci-3975-2-0-1-de7b5ef465.novalocal"
Aug 5 22:12:20.271999 kubelet[2261]: E0805 22:12:20.271849 2261 eviction_manager.go:282] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-3975-2-0-1-de7b5ef465.novalocal\" not found"
Aug 5 22:12:20.301486 kubelet[2261]: I0805 22:12:20.301449 2261 topology_manager.go:215] "Topology Admit Handler" podUID="bf4bbf4f9260439e9f02273c28e624cb" podNamespace="kube-system" podName="kube-apiserver-ci-3975-2-0-1-de7b5ef465.novalocal"
Aug 5 22:12:20.303668 kubelet[2261]: I0805 22:12:20.303547 2261 topology_manager.go:215] "Topology Admit Handler" podUID="2f6787fba05730ec84fee9cd42df9be9" podNamespace="kube-system" podName="kube-controller-manager-ci-3975-2-0-1-de7b5ef465.novalocal"
Aug 5 22:12:20.305503 kubelet[2261]: I0805 22:12:20.305471 2261 topology_manager.go:215] "Topology Admit Handler" podUID="13ae3c0a062b0c3f1ee1de89994c1721" podNamespace="kube-system" podName="kube-scheduler-ci-3975-2-0-1-de7b5ef465.novalocal"
Aug 5 22:12:20.314984 systemd[1]: Created slice kubepods-burstable-podbf4bbf4f9260439e9f02273c28e624cb.slice - libcontainer container kubepods-burstable-podbf4bbf4f9260439e9f02273c28e624cb.slice.
Aug 5 22:12:20.348749 systemd[1]: Created slice kubepods-burstable-pod2f6787fba05730ec84fee9cd42df9be9.slice - libcontainer container kubepods-burstable-pod2f6787fba05730ec84fee9cd42df9be9.slice.
Aug 5 22:12:20.363522 systemd[1]: Created slice kubepods-burstable-pod13ae3c0a062b0c3f1ee1de89994c1721.slice - libcontainer container kubepods-burstable-pod13ae3c0a062b0c3f1ee1de89994c1721.slice.
Aug 5 22:12:20.368341 kubelet[2261]: E0805 22:12:20.367433 2261 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://172.24.4.33:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-3975-2-0-1-de7b5ef465.novalocal?timeout=10s\": dial tcp 172.24.4.33:6443: connect: connection refused" interval="400ms"
Aug 5 22:12:20.466855 kubelet[2261]: I0805 22:12:20.466790 2261 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/bf4bbf4f9260439e9f02273c28e624cb-k8s-certs\") pod \"kube-apiserver-ci-3975-2-0-1-de7b5ef465.novalocal\" (UID: \"bf4bbf4f9260439e9f02273c28e624cb\") " pod="kube-system/kube-apiserver-ci-3975-2-0-1-de7b5ef465.novalocal"
Aug 5 22:12:20.467769 kubelet[2261]: I0805 22:12:20.467399 2261 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/2f6787fba05730ec84fee9cd42df9be9-k8s-certs\") pod \"kube-controller-manager-ci-3975-2-0-1-de7b5ef465.novalocal\" (UID: \"2f6787fba05730ec84fee9cd42df9be9\") " pod="kube-system/kube-controller-manager-ci-3975-2-0-1-de7b5ef465.novalocal"
Aug 5 22:12:20.467769 kubelet[2261]: I0805 22:12:20.467518 2261 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/13ae3c0a062b0c3f1ee1de89994c1721-kubeconfig\") pod \"kube-scheduler-ci-3975-2-0-1-de7b5ef465.novalocal\" (UID: \"13ae3c0a062b0c3f1ee1de89994c1721\") " pod="kube-system/kube-scheduler-ci-3975-2-0-1-de7b5ef465.novalocal"
Aug 5 22:12:20.467769 kubelet[2261]: I0805 22:12:20.467643 2261 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/2f6787fba05730ec84fee9cd42df9be9-kubeconfig\") pod \"kube-controller-manager-ci-3975-2-0-1-de7b5ef465.novalocal\" (UID: \"2f6787fba05730ec84fee9cd42df9be9\") " pod="kube-system/kube-controller-manager-ci-3975-2-0-1-de7b5ef465.novalocal"
Aug 5 22:12:20.467769 kubelet[2261]: I0805 22:12:20.467798 2261 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/2f6787fba05730ec84fee9cd42df9be9-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-3975-2-0-1-de7b5ef465.novalocal\" (UID: \"2f6787fba05730ec84fee9cd42df9be9\") " pod="kube-system/kube-controller-manager-ci-3975-2-0-1-de7b5ef465.novalocal"
Aug 5 22:12:20.468214 kubelet[2261]: I0805 22:12:20.467901 2261 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/bf4bbf4f9260439e9f02273c28e624cb-ca-certs\") pod \"kube-apiserver-ci-3975-2-0-1-de7b5ef465.novalocal\" (UID: \"bf4bbf4f9260439e9f02273c28e624cb\") " pod="kube-system/kube-apiserver-ci-3975-2-0-1-de7b5ef465.novalocal"
Aug 5 22:12:20.468214 kubelet[2261]: I0805 22:12:20.468032 2261 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/bf4bbf4f9260439e9f02273c28e624cb-usr-share-ca-certificates\") pod \"kube-apiserver-ci-3975-2-0-1-de7b5ef465.novalocal\" (UID: \"bf4bbf4f9260439e9f02273c28e624cb\") " pod="kube-system/kube-apiserver-ci-3975-2-0-1-de7b5ef465.novalocal"
Aug 5 22:12:20.468214 kubelet[2261]: I0805 22:12:20.468135 2261 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/2f6787fba05730ec84fee9cd42df9be9-ca-certs\") pod \"kube-controller-manager-ci-3975-2-0-1-de7b5ef465.novalocal\" (UID: \"2f6787fba05730ec84fee9cd42df9be9\") " pod="kube-system/kube-controller-manager-ci-3975-2-0-1-de7b5ef465.novalocal"
Aug 5 22:12:20.468482 kubelet[2261]: I0805 22:12:20.468245 2261 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/2f6787fba05730ec84fee9cd42df9be9-flexvolume-dir\") pod \"kube-controller-manager-ci-3975-2-0-1-de7b5ef465.novalocal\" (UID: \"2f6787fba05730ec84fee9cd42df9be9\") " pod="kube-system/kube-controller-manager-ci-3975-2-0-1-de7b5ef465.novalocal"
Aug 5 22:12:20.474825 kubelet[2261]: I0805 22:12:20.474768 2261 kubelet_node_status.go:73] "Attempting to register node" node="ci-3975-2-0-1-de7b5ef465.novalocal"
Aug 5 22:12:20.475441 kubelet[2261]: E0805 22:12:20.475394 2261 kubelet_node_status.go:96] "Unable to register node with API server" err="Post \"https://172.24.4.33:6443/api/v1/nodes\": dial tcp 172.24.4.33:6443: connect: connection refused" node="ci-3975-2-0-1-de7b5ef465.novalocal"
Aug 5 22:12:20.638905 containerd[1444]: time="2024-08-05T22:12:20.638721074Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-3975-2-0-1-de7b5ef465.novalocal,Uid:bf4bbf4f9260439e9f02273c28e624cb,Namespace:kube-system,Attempt:0,}"
Aug 5 22:12:20.671921 containerd[1444]: time="2024-08-05T22:12:20.671549253Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-3975-2-0-1-de7b5ef465.novalocal,Uid:2f6787fba05730ec84fee9cd42df9be9,Namespace:kube-system,Attempt:0,}"
Aug 5 22:12:20.675022 containerd[1444]: time="2024-08-05T22:12:20.674875981Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-3975-2-0-1-de7b5ef465.novalocal,Uid:13ae3c0a062b0c3f1ee1de89994c1721,Namespace:kube-system,Attempt:0,}"
Aug 5 22:12:20.768838 kubelet[2261]: E0805 22:12:20.768772 2261 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://172.24.4.33:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-3975-2-0-1-de7b5ef465.novalocal?timeout=10s\": dial tcp 172.24.4.33:6443: connect: connection refused" interval="800ms"
Aug 5 22:12:20.879575 kubelet[2261]: I0805 22:12:20.879443 2261 kubelet_node_status.go:73] "Attempting to register node" node="ci-3975-2-0-1-de7b5ef465.novalocal"
Aug 5 22:12:20.880204 kubelet[2261]: E0805 22:12:20.880114 2261 kubelet_node_status.go:96] "Unable to register node with API server" err="Post \"https://172.24.4.33:6443/api/v1/nodes\": dial tcp 172.24.4.33:6443: connect: connection refused" node="ci-3975-2-0-1-de7b5ef465.novalocal"
Aug 5 22:12:21.240491 kubelet[2261]: W0805 22:12:21.240392 2261 reflector.go:539] vendor/k8s.io/client-go/informers/factory.go:159: failed to list *v1.Service: Get "https://172.24.4.33:6443/api/v1/services?limit=500&resourceVersion=0": dial tcp 172.24.4.33:6443: connect: connection refused
Aug 5 22:12:21.241455 kubelet[2261]: E0805 22:12:21.240840 2261 reflector.go:147] vendor/k8s.io/client-go/informers/factory.go:159: Failed to watch *v1.Service: failed to list *v1.Service: Get "https://172.24.4.33:6443/api/v1/services?limit=500&resourceVersion=0": dial tcp 172.24.4.33:6443: connect: connection refused
Aug 5 22:12:21.245558 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3920454972.mount: Deactivated successfully.
Aug 5 22:12:21.254757 containerd[1444]: time="2024-08-05T22:12:21.254647510Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Aug 5 22:12:21.257380 containerd[1444]: time="2024-08-05T22:12:21.257091901Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=0"
Aug 5 22:12:21.258769 containerd[1444]: time="2024-08-05T22:12:21.258667362Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Aug 5 22:12:21.260993 containerd[1444]: time="2024-08-05T22:12:21.260926892Z" level=info msg="ImageCreate event name:\"sha256:4873874c08efc72e9729683a83ffbb7502ee729e9a5ac097723806ea7fa13517\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Aug 5 22:12:21.263351 containerd[1444]: time="2024-08-05T22:12:21.263091461Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=312064"
Aug 5 22:12:21.265066 containerd[1444]: time="2024-08-05T22:12:21.265000370Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=0"
Aug 5 22:12:21.265527 containerd[1444]: time="2024-08-05T22:12:21.265374560Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Aug 5 22:12:21.272850 containerd[1444]: time="2024-08-05T22:12:21.272722408Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Aug 5 22:12:21.277837 containerd[1444]: time="2024-08-05T22:12:21.277005589Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4873874c08efc72e9729683a83ffbb7502ee729e9a5ac097723806ea7fa13517\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"311286\" in 605.273001ms"
Aug 5 22:12:21.281582 containerd[1444]: time="2024-08-05T22:12:21.281353242Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4873874c08efc72e9729683a83ffbb7502ee729e9a5ac097723806ea7fa13517\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"311286\" in 606.291404ms"
Aug 5 22:12:21.294686 containerd[1444]: time="2024-08-05T22:12:21.294614012Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4873874c08efc72e9729683a83ffbb7502ee729e9a5ac097723806ea7fa13517\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"311286\" in 655.675054ms"
Aug 5 22:12:21.511848 containerd[1444]: time="2024-08-05T22:12:21.510690820Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Aug 5 22:12:21.511848 containerd[1444]: time="2024-08-05T22:12:21.510872416Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Aug 5 22:12:21.512249 containerd[1444]: time="2024-08-05T22:12:21.510938119Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Aug 5 22:12:21.513389 containerd[1444]: time="2024-08-05T22:12:21.510980562Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Aug 5 22:12:21.515255 containerd[1444]: time="2024-08-05T22:12:21.514835157Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Aug 5 22:12:21.515255 containerd[1444]: time="2024-08-05T22:12:21.514900279Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Aug 5 22:12:21.515419 containerd[1444]: time="2024-08-05T22:12:21.514926922Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Aug 5 22:12:21.515419 containerd[1444]: time="2024-08-05T22:12:21.514950271Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Aug 5 22:12:21.519036 containerd[1444]: time="2024-08-05T22:12:21.518432812Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Aug 5 22:12:21.519036 containerd[1444]: time="2024-08-05T22:12:21.518523245Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Aug 5 22:12:21.519036 containerd[1444]: time="2024-08-05T22:12:21.518555034Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Aug 5 22:12:21.519036 containerd[1444]: time="2024-08-05T22:12:21.518577342Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Aug 5 22:12:21.548256 systemd[1]: Started cri-containerd-c66dea924d415cb0ba7fb397536aff6e4f833c5b839df317380c52abf4c80ec2.scope - libcontainer container c66dea924d415cb0ba7fb397536aff6e4f833c5b839df317380c52abf4c80ec2.
Aug 5 22:12:21.553504 systemd[1]: Started cri-containerd-b0b36cd76875c89853b03c3293f6694fc5ec9591afb020c472acb727fd44cf6f.scope - libcontainer container b0b36cd76875c89853b03c3293f6694fc5ec9591afb020c472acb727fd44cf6f.
Aug 5 22:12:21.561266 systemd[1]: Started cri-containerd-2fd15d1f716b436c719428479cd692e847b8e0db650ff34c370a04050c0ef14e.scope - libcontainer container 2fd15d1f716b436c719428479cd692e847b8e0db650ff34c370a04050c0ef14e.
Aug 5 22:12:21.570304 kubelet[2261]: E0805 22:12:21.570250 2261 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://172.24.4.33:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-3975-2-0-1-de7b5ef465.novalocal?timeout=10s\": dial tcp 172.24.4.33:6443: connect: connection refused" interval="1.6s"
Aug 5 22:12:21.630274 containerd[1444]: time="2024-08-05T22:12:21.630169139Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-3975-2-0-1-de7b5ef465.novalocal,Uid:bf4bbf4f9260439e9f02273c28e624cb,Namespace:kube-system,Attempt:0,} returns sandbox id \"c66dea924d415cb0ba7fb397536aff6e4f833c5b839df317380c52abf4c80ec2\""
Aug 5 22:12:21.651059 containerd[1444]: time="2024-08-05T22:12:21.650432516Z" level=info msg="CreateContainer within sandbox \"c66dea924d415cb0ba7fb397536aff6e4f833c5b839df317380c52abf4c80ec2\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}"
Aug 5 22:12:21.663863 containerd[1444]: time="2024-08-05T22:12:21.663778653Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-3975-2-0-1-de7b5ef465.novalocal,Uid:2f6787fba05730ec84fee9cd42df9be9,Namespace:kube-system,Attempt:0,} returns sandbox id \"b0b36cd76875c89853b03c3293f6694fc5ec9591afb020c472acb727fd44cf6f\""
Aug 5 22:12:21.665337 containerd[1444]: time="2024-08-05T22:12:21.664808563Z" level=info msg="RunPodSandbox for
&PodSandboxMetadata{Name:kube-scheduler-ci-3975-2-0-1-de7b5ef465.novalocal,Uid:13ae3c0a062b0c3f1ee1de89994c1721,Namespace:kube-system,Attempt:0,} returns sandbox id \"2fd15d1f716b436c719428479cd692e847b8e0db650ff34c370a04050c0ef14e\"" Aug 5 22:12:21.670147 containerd[1444]: time="2024-08-05T22:12:21.670096862Z" level=info msg="CreateContainer within sandbox \"2fd15d1f716b436c719428479cd692e847b8e0db650ff34c370a04050c0ef14e\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Aug 5 22:12:21.673375 containerd[1444]: time="2024-08-05T22:12:21.673318475Z" level=info msg="CreateContainer within sandbox \"b0b36cd76875c89853b03c3293f6694fc5ec9591afb020c472acb727fd44cf6f\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Aug 5 22:12:21.684069 kubelet[2261]: I0805 22:12:21.684012 2261 kubelet_node_status.go:73] "Attempting to register node" node="ci-3975-2-0-1-de7b5ef465.novalocal" Aug 5 22:12:21.685007 kubelet[2261]: E0805 22:12:21.684981 2261 kubelet_node_status.go:96] "Unable to register node with API server" err="Post \"https://172.24.4.33:6443/api/v1/nodes\": dial tcp 172.24.4.33:6443: connect: connection refused" node="ci-3975-2-0-1-de7b5ef465.novalocal" Aug 5 22:12:21.692406 kubelet[2261]: W0805 22:12:21.692340 2261 reflector.go:539] vendor/k8s.io/client-go/informers/factory.go:159: failed to list *v1.Node: Get "https://172.24.4.33:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-3975-2-0-1-de7b5ef465.novalocal&limit=500&resourceVersion=0": dial tcp 172.24.4.33:6443: connect: connection refused Aug 5 22:12:21.692686 kubelet[2261]: E0805 22:12:21.692660 2261 reflector.go:147] vendor/k8s.io/client-go/informers/factory.go:159: Failed to watch *v1.Node: failed to list *v1.Node: Get "https://172.24.4.33:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-3975-2-0-1-de7b5ef465.novalocal&limit=500&resourceVersion=0": dial tcp 172.24.4.33:6443: connect: connection refused Aug 5 22:12:21.725422 kubelet[2261]: W0805 22:12:21.725312 
2261 reflector.go:539] vendor/k8s.io/client-go/informers/factory.go:159: failed to list *v1.CSIDriver: Get "https://172.24.4.33:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 172.24.4.33:6443: connect: connection refused Aug 5 22:12:21.725422 kubelet[2261]: E0805 22:12:21.725426 2261 reflector.go:147] vendor/k8s.io/client-go/informers/factory.go:159: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get "https://172.24.4.33:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 172.24.4.33:6443: connect: connection refused Aug 5 22:12:21.725842 containerd[1444]: time="2024-08-05T22:12:21.725701302Z" level=info msg="CreateContainer within sandbox \"c66dea924d415cb0ba7fb397536aff6e4f833c5b839df317380c52abf4c80ec2\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"4e71c2cd0f3b775183aec411a673a9235fbd35ff9b2c5f7f8228cfa695172a68\"" Aug 5 22:12:21.734706 containerd[1444]: time="2024-08-05T22:12:21.734663010Z" level=info msg="CreateContainer within sandbox \"b0b36cd76875c89853b03c3293f6694fc5ec9591afb020c472acb727fd44cf6f\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"0574de62090fc7df1f7afbf45ed921d105652712af4e76f6520129f708239cc5\"" Aug 5 22:12:21.735050 containerd[1444]: time="2024-08-05T22:12:21.735028748Z" level=info msg="StartContainer for \"4e71c2cd0f3b775183aec411a673a9235fbd35ff9b2c5f7f8228cfa695172a68\"" Aug 5 22:12:21.737063 containerd[1444]: time="2024-08-05T22:12:21.736677852Z" level=info msg="CreateContainer within sandbox \"2fd15d1f716b436c719428479cd692e847b8e0db650ff34c370a04050c0ef14e\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"08afb82f09c2cb08c88234e037fe2f7cf961e689e0b57b57dedb6485e8818127\"" Aug 5 22:12:21.738254 containerd[1444]: time="2024-08-05T22:12:21.737274059Z" level=info msg="StartContainer for \"0574de62090fc7df1f7afbf45ed921d105652712af4e76f6520129f708239cc5\"" Aug 5 
22:12:21.750327 containerd[1444]: time="2024-08-05T22:12:21.750233372Z" level=info msg="StartContainer for \"08afb82f09c2cb08c88234e037fe2f7cf961e689e0b57b57dedb6485e8818127\"" Aug 5 22:12:21.773429 systemd[1]: Started cri-containerd-4e71c2cd0f3b775183aec411a673a9235fbd35ff9b2c5f7f8228cfa695172a68.scope - libcontainer container 4e71c2cd0f3b775183aec411a673a9235fbd35ff9b2c5f7f8228cfa695172a68. Aug 5 22:12:21.783975 systemd[1]: Started cri-containerd-0574de62090fc7df1f7afbf45ed921d105652712af4e76f6520129f708239cc5.scope - libcontainer container 0574de62090fc7df1f7afbf45ed921d105652712af4e76f6520129f708239cc5. Aug 5 22:12:21.807405 systemd[1]: Started cri-containerd-08afb82f09c2cb08c88234e037fe2f7cf961e689e0b57b57dedb6485e8818127.scope - libcontainer container 08afb82f09c2cb08c88234e037fe2f7cf961e689e0b57b57dedb6485e8818127. Aug 5 22:12:21.811302 kubelet[2261]: W0805 22:12:21.811133 2261 reflector.go:539] vendor/k8s.io/client-go/informers/factory.go:159: failed to list *v1.RuntimeClass: Get "https://172.24.4.33:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 172.24.4.33:6443: connect: connection refused Aug 5 22:12:21.811302 kubelet[2261]: E0805 22:12:21.811277 2261 reflector.go:147] vendor/k8s.io/client-go/informers/factory.go:159: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get "https://172.24.4.33:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 172.24.4.33:6443: connect: connection refused Aug 5 22:12:21.876767 containerd[1444]: time="2024-08-05T22:12:21.876504475Z" level=info msg="StartContainer for \"4e71c2cd0f3b775183aec411a673a9235fbd35ff9b2c5f7f8228cfa695172a68\" returns successfully" Aug 5 22:12:21.876767 containerd[1444]: time="2024-08-05T22:12:21.876512705Z" level=info msg="StartContainer for \"0574de62090fc7df1f7afbf45ed921d105652712af4e76f6520129f708239cc5\" returns successfully" Aug 5 22:12:21.911915 containerd[1444]: time="2024-08-05T22:12:21.911732525Z" level=info 
msg="StartContainer for \"08afb82f09c2cb08c88234e037fe2f7cf961e689e0b57b57dedb6485e8818127\" returns successfully" Aug 5 22:12:23.287575 kubelet[2261]: I0805 22:12:23.287494 2261 kubelet_node_status.go:73] "Attempting to register node" node="ci-3975-2-0-1-de7b5ef465.novalocal" Aug 5 22:12:24.383970 kubelet[2261]: E0805 22:12:24.383932 2261 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ci-3975-2-0-1-de7b5ef465.novalocal\" not found" node="ci-3975-2-0-1-de7b5ef465.novalocal" Aug 5 22:12:24.426939 kubelet[2261]: I0805 22:12:24.426698 2261 kubelet_node_status.go:76] "Successfully registered node" node="ci-3975-2-0-1-de7b5ef465.novalocal" Aug 5 22:12:24.448917 kubelet[2261]: E0805 22:12:24.448829 2261 kubelet_node_status.go:462] "Error getting the current node from lister" err="node \"ci-3975-2-0-1-de7b5ef465.novalocal\" not found" Aug 5 22:12:24.549919 kubelet[2261]: E0805 22:12:24.549865 2261 kubelet_node_status.go:462] "Error getting the current node from lister" err="node \"ci-3975-2-0-1-de7b5ef465.novalocal\" not found" Aug 5 22:12:24.650907 kubelet[2261]: E0805 22:12:24.650657 2261 kubelet_node_status.go:462] "Error getting the current node from lister" err="node \"ci-3975-2-0-1-de7b5ef465.novalocal\" not found" Aug 5 22:12:25.141855 kubelet[2261]: I0805 22:12:25.141689 2261 apiserver.go:52] "Watching apiserver" Aug 5 22:12:25.166354 kubelet[2261]: I0805 22:12:25.166296 2261 desired_state_of_world_populator.go:159] "Finished populating initial desired state of world" Aug 5 22:12:27.457804 systemd[1]: Reloading requested from client PID 2534 ('systemctl') (unit session-11.scope)... Aug 5 22:12:27.457823 systemd[1]: Reloading... Aug 5 22:12:27.548269 zram_generator::config[2571]: No configuration found. 
Aug 5 22:12:27.716050 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Aug 5 22:12:27.828012 systemd[1]: Reloading finished in 369 ms. Aug 5 22:12:27.877872 kubelet[2261]: I0805 22:12:27.877817 2261 dynamic_cafile_content.go:171] "Shutting down controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Aug 5 22:12:27.878032 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Aug 5 22:12:27.891519 systemd[1]: kubelet.service: Deactivated successfully. Aug 5 22:12:27.891820 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Aug 5 22:12:27.891873 systemd[1]: kubelet.service: Consumed 1.112s CPU time, 109.5M memory peak, 0B memory swap peak. Aug 5 22:12:27.901553 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Aug 5 22:12:28.257516 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Aug 5 22:12:28.279294 (kubelet)[2635]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Aug 5 22:12:28.653700 kubelet[2635]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Aug 5 22:12:28.654873 kubelet[2635]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Aug 5 22:12:28.654873 kubelet[2635]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Aug 5 22:12:28.654873 kubelet[2635]: I0805 22:12:28.654370 2635 server.go:204] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Aug 5 22:12:28.664069 kubelet[2635]: I0805 22:12:28.664019 2635 server.go:487] "Kubelet version" kubeletVersion="v1.29.2" Aug 5 22:12:28.664069 kubelet[2635]: I0805 22:12:28.664068 2635 server.go:489] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Aug 5 22:12:28.664519 kubelet[2635]: I0805 22:12:28.664489 2635 server.go:919] "Client rotation is on, will bootstrap in background" Aug 5 22:12:28.670585 kubelet[2635]: I0805 22:12:28.670565 2635 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Aug 5 22:12:28.678279 kubelet[2635]: I0805 22:12:28.678256 2635 dynamic_cafile_content.go:157] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Aug 5 22:12:28.688020 kubelet[2635]: I0805 22:12:28.687617 2635 server.go:745] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Aug 5 22:12:28.688020 kubelet[2635]: I0805 22:12:28.687845 2635 container_manager_linux.go:265] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Aug 5 22:12:28.688020 kubelet[2635]: I0805 22:12:28.688021 2635 container_manager_linux.go:270] "Creating Container Manager object based on Node Config" nodeConfig={"RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null} Aug 5 22:12:28.688331 kubelet[2635]: I0805 22:12:28.688048 2635 topology_manager.go:138] "Creating topology manager with none policy" Aug 5 22:12:28.688331 kubelet[2635]: I0805 22:12:28.688060 2635 container_manager_linux.go:301] "Creating device plugin manager" Aug 5 22:12:28.688331 kubelet[2635]: I0805 
22:12:28.688092 2635 state_mem.go:36] "Initialized new in-memory state store" Aug 5 22:12:28.688331 kubelet[2635]: I0805 22:12:28.688179 2635 kubelet.go:396] "Attempting to sync node with API server" Aug 5 22:12:28.688331 kubelet[2635]: I0805 22:12:28.688196 2635 kubelet.go:301] "Adding static pod path" path="/etc/kubernetes/manifests" Aug 5 22:12:28.688637 kubelet[2635]: I0805 22:12:28.688614 2635 kubelet.go:312] "Adding apiserver pod source" Aug 5 22:12:28.688637 kubelet[2635]: I0805 22:12:28.688635 2635 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Aug 5 22:12:28.694302 kubelet[2635]: I0805 22:12:28.694280 2635 kuberuntime_manager.go:258] "Container runtime initialized" containerRuntime="containerd" version="v1.7.17" apiVersion="v1" Aug 5 22:12:28.694654 kubelet[2635]: I0805 22:12:28.694624 2635 kubelet.go:809] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Aug 5 22:12:28.695242 kubelet[2635]: I0805 22:12:28.695040 2635 server.go:1256] "Started kubelet" Aug 5 22:12:28.699882 kubelet[2635]: I0805 22:12:28.699647 2635 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Aug 5 22:12:28.710764 kubelet[2635]: I0805 22:12:28.710739 2635 server.go:162] "Starting to listen" address="0.0.0.0" port=10250 Aug 5 22:12:28.712434 kubelet[2635]: I0805 22:12:28.712409 2635 server.go:461] "Adding debug handlers to kubelet server" Aug 5 22:12:28.718361 kubelet[2635]: I0805 22:12:28.718341 2635 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Aug 5 22:12:28.718635 kubelet[2635]: I0805 22:12:28.718621 2635 server.go:233] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Aug 5 22:12:28.724146 kubelet[2635]: I0805 22:12:28.724125 2635 volume_manager.go:291] "Starting Kubelet Volume Manager" Aug 5 22:12:28.725087 kubelet[2635]: I0805 22:12:28.724362 2635 desired_state_of_world_populator.go:151] "Desired state populator 
starts to run" Aug 5 22:12:28.725087 kubelet[2635]: I0805 22:12:28.724499 2635 reconciler_new.go:29] "Reconciler: start to sync state" Aug 5 22:12:28.731493 kubelet[2635]: E0805 22:12:28.731473 2635 kubelet.go:1462] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Aug 5 22:12:28.736236 kubelet[2635]: I0805 22:12:28.736196 2635 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Aug 5 22:12:28.739281 kubelet[2635]: I0805 22:12:28.739265 2635 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Aug 5 22:12:28.739391 kubelet[2635]: I0805 22:12:28.739381 2635 status_manager.go:217] "Starting to sync pod status with apiserver" Aug 5 22:12:28.739461 kubelet[2635]: I0805 22:12:28.739453 2635 kubelet.go:2329] "Starting kubelet main sync loop" Aug 5 22:12:28.739559 kubelet[2635]: E0805 22:12:28.739549 2635 kubelet.go:2353] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Aug 5 22:12:28.741601 kubelet[2635]: I0805 22:12:28.741474 2635 factory.go:221] Registration of the systemd container factory successfully Aug 5 22:12:28.741709 kubelet[2635]: I0805 22:12:28.741693 2635 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Aug 5 22:12:28.746793 kubelet[2635]: I0805 22:12:28.746700 2635 factory.go:221] Registration of the containerd container factory successfully Aug 5 22:12:28.807263 kubelet[2635]: I0805 22:12:28.806965 2635 cpu_manager.go:214] "Starting CPU manager" policy="none" Aug 5 22:12:28.807263 kubelet[2635]: I0805 22:12:28.806986 2635 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s" Aug 5 22:12:28.807263 kubelet[2635]: I0805 22:12:28.807002 2635 state_mem.go:36] "Initialized new in-memory state store" 
Aug 5 22:12:28.807263 kubelet[2635]: I0805 22:12:28.807151 2635 state_mem.go:88] "Updated default CPUSet" cpuSet="" Aug 5 22:12:28.807263 kubelet[2635]: I0805 22:12:28.807173 2635 state_mem.go:96] "Updated CPUSet assignments" assignments={} Aug 5 22:12:28.807263 kubelet[2635]: I0805 22:12:28.807190 2635 policy_none.go:49] "None policy: Start" Aug 5 22:12:28.808540 kubelet[2635]: I0805 22:12:28.808516 2635 memory_manager.go:170] "Starting memorymanager" policy="None" Aug 5 22:12:28.808704 kubelet[2635]: I0805 22:12:28.808627 2635 state_mem.go:35] "Initializing new in-memory state store" Aug 5 22:12:28.809031 kubelet[2635]: I0805 22:12:28.809018 2635 state_mem.go:75] "Updated machine memory state" Aug 5 22:12:28.822259 kubelet[2635]: I0805 22:12:28.822208 2635 manager.go:479] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Aug 5 22:12:28.822491 kubelet[2635]: I0805 22:12:28.822465 2635 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Aug 5 22:12:28.830702 kubelet[2635]: I0805 22:12:28.830648 2635 kubelet_node_status.go:73] "Attempting to register node" node="ci-3975-2-0-1-de7b5ef465.novalocal" Aug 5 22:12:28.841177 kubelet[2635]: I0805 22:12:28.839724 2635 topology_manager.go:215] "Topology Admit Handler" podUID="bf4bbf4f9260439e9f02273c28e624cb" podNamespace="kube-system" podName="kube-apiserver-ci-3975-2-0-1-de7b5ef465.novalocal" Aug 5 22:12:28.841177 kubelet[2635]: I0805 22:12:28.839809 2635 topology_manager.go:215] "Topology Admit Handler" podUID="2f6787fba05730ec84fee9cd42df9be9" podNamespace="kube-system" podName="kube-controller-manager-ci-3975-2-0-1-de7b5ef465.novalocal" Aug 5 22:12:28.841177 kubelet[2635]: I0805 22:12:28.839850 2635 topology_manager.go:215] "Topology Admit Handler" podUID="13ae3c0a062b0c3f1ee1de89994c1721" podNamespace="kube-system" podName="kube-scheduler-ci-3975-2-0-1-de7b5ef465.novalocal" Aug 5 22:12:28.853918 kubelet[2635]: I0805 22:12:28.853884 2635 
kubelet_node_status.go:112] "Node was previously registered" node="ci-3975-2-0-1-de7b5ef465.novalocal" Aug 5 22:12:28.854052 kubelet[2635]: I0805 22:12:28.853937 2635 kubelet_node_status.go:76] "Successfully registered node" node="ci-3975-2-0-1-de7b5ef465.novalocal" Aug 5 22:12:28.860970 kubelet[2635]: W0805 22:12:28.860775 2635 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] Aug 5 22:12:28.862552 kubelet[2635]: W0805 22:12:28.862539 2635 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] Aug 5 22:12:28.863696 kubelet[2635]: W0805 22:12:28.863594 2635 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] Aug 5 22:12:29.026826 kubelet[2635]: I0805 22:12:29.026685 2635 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/bf4bbf4f9260439e9f02273c28e624cb-k8s-certs\") pod \"kube-apiserver-ci-3975-2-0-1-de7b5ef465.novalocal\" (UID: \"bf4bbf4f9260439e9f02273c28e624cb\") " pod="kube-system/kube-apiserver-ci-3975-2-0-1-de7b5ef465.novalocal" Aug 5 22:12:29.028450 kubelet[2635]: I0805 22:12:29.026872 2635 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/2f6787fba05730ec84fee9cd42df9be9-flexvolume-dir\") pod \"kube-controller-manager-ci-3975-2-0-1-de7b5ef465.novalocal\" (UID: \"2f6787fba05730ec84fee9cd42df9be9\") " pod="kube-system/kube-controller-manager-ci-3975-2-0-1-de7b5ef465.novalocal" Aug 5 22:12:29.028450 kubelet[2635]: I0805 22:12:29.026971 2635 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" 
(UniqueName: \"kubernetes.io/host-path/2f6787fba05730ec84fee9cd42df9be9-k8s-certs\") pod \"kube-controller-manager-ci-3975-2-0-1-de7b5ef465.novalocal\" (UID: \"2f6787fba05730ec84fee9cd42df9be9\") " pod="kube-system/kube-controller-manager-ci-3975-2-0-1-de7b5ef465.novalocal" Aug 5 22:12:29.028450 kubelet[2635]: I0805 22:12:29.027096 2635 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/2f6787fba05730ec84fee9cd42df9be9-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-3975-2-0-1-de7b5ef465.novalocal\" (UID: \"2f6787fba05730ec84fee9cd42df9be9\") " pod="kube-system/kube-controller-manager-ci-3975-2-0-1-de7b5ef465.novalocal" Aug 5 22:12:29.028450 kubelet[2635]: I0805 22:12:29.027188 2635 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/13ae3c0a062b0c3f1ee1de89994c1721-kubeconfig\") pod \"kube-scheduler-ci-3975-2-0-1-de7b5ef465.novalocal\" (UID: \"13ae3c0a062b0c3f1ee1de89994c1721\") " pod="kube-system/kube-scheduler-ci-3975-2-0-1-de7b5ef465.novalocal" Aug 5 22:12:29.029488 kubelet[2635]: I0805 22:12:29.029441 2635 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/bf4bbf4f9260439e9f02273c28e624cb-ca-certs\") pod \"kube-apiserver-ci-3975-2-0-1-de7b5ef465.novalocal\" (UID: \"bf4bbf4f9260439e9f02273c28e624cb\") " pod="kube-system/kube-apiserver-ci-3975-2-0-1-de7b5ef465.novalocal" Aug 5 22:12:29.029627 kubelet[2635]: I0805 22:12:29.029585 2635 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/bf4bbf4f9260439e9f02273c28e624cb-usr-share-ca-certificates\") pod \"kube-apiserver-ci-3975-2-0-1-de7b5ef465.novalocal\" (UID: 
\"bf4bbf4f9260439e9f02273c28e624cb\") " pod="kube-system/kube-apiserver-ci-3975-2-0-1-de7b5ef465.novalocal" Aug 5 22:12:29.029752 kubelet[2635]: I0805 22:12:29.029684 2635 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/2f6787fba05730ec84fee9cd42df9be9-ca-certs\") pod \"kube-controller-manager-ci-3975-2-0-1-de7b5ef465.novalocal\" (UID: \"2f6787fba05730ec84fee9cd42df9be9\") " pod="kube-system/kube-controller-manager-ci-3975-2-0-1-de7b5ef465.novalocal" Aug 5 22:12:29.029813 kubelet[2635]: I0805 22:12:29.029801 2635 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/2f6787fba05730ec84fee9cd42df9be9-kubeconfig\") pod \"kube-controller-manager-ci-3975-2-0-1-de7b5ef465.novalocal\" (UID: \"2f6787fba05730ec84fee9cd42df9be9\") " pod="kube-system/kube-controller-manager-ci-3975-2-0-1-de7b5ef465.novalocal" Aug 5 22:12:29.691032 kubelet[2635]: I0805 22:12:29.690605 2635 apiserver.go:52] "Watching apiserver" Aug 5 22:12:29.725445 kubelet[2635]: I0805 22:12:29.724913 2635 desired_state_of_world_populator.go:159] "Finished populating initial desired state of world" Aug 5 22:12:29.806055 kubelet[2635]: W0805 22:12:29.805561 2635 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] Aug 5 22:12:29.806055 kubelet[2635]: E0805 22:12:29.805696 2635 kubelet.go:1921] "Failed creating a mirror pod for" err="pods \"kube-apiserver-ci-3975-2-0-1-de7b5ef465.novalocal\" already exists" pod="kube-system/kube-apiserver-ci-3975-2-0-1-de7b5ef465.novalocal" Aug 5 22:12:29.842478 kubelet[2635]: I0805 22:12:29.842447 2635 pod_startup_latency_tracker.go:102] "Observed pod startup duration" pod="kube-system/kube-apiserver-ci-3975-2-0-1-de7b5ef465.novalocal" podStartSLOduration=1.8424035650000001 
podStartE2EDuration="1.842403565s" podCreationTimestamp="2024-08-05 22:12:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2024-08-05 22:12:29.842237935 +0000 UTC m=+1.547458353" watchObservedRunningTime="2024-08-05 22:12:29.842403565 +0000 UTC m=+1.547623963" Aug 5 22:12:29.888868 kubelet[2635]: I0805 22:12:29.888835 2635 pod_startup_latency_tracker.go:102] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ci-3975-2-0-1-de7b5ef465.novalocal" podStartSLOduration=1.888775122 podStartE2EDuration="1.888775122s" podCreationTimestamp="2024-08-05 22:12:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2024-08-05 22:12:29.855943088 +0000 UTC m=+1.561163486" watchObservedRunningTime="2024-08-05 22:12:29.888775122 +0000 UTC m=+1.593995510" Aug 5 22:12:29.906534 kubelet[2635]: I0805 22:12:29.906348 2635 pod_startup_latency_tracker.go:102] "Observed pod startup duration" pod="kube-system/kube-scheduler-ci-3975-2-0-1-de7b5ef465.novalocal" podStartSLOduration=1.9061892249999999 podStartE2EDuration="1.906189225s" podCreationTimestamp="2024-08-05 22:12:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2024-08-05 22:12:29.889529794 +0000 UTC m=+1.594750192" watchObservedRunningTime="2024-08-05 22:12:29.906189225 +0000 UTC m=+1.611409623" Aug 5 22:12:34.489021 sudo[1697]: pam_unix(sudo:session): session closed for user root Aug 5 22:12:34.703813 sshd[1694]: pam_unix(sshd:session): session closed for user core Aug 5 22:12:34.712984 systemd[1]: sshd@8-172.24.4.33:22-172.24.4.1:42626.service: Deactivated successfully. Aug 5 22:12:34.717833 systemd[1]: session-11.scope: Deactivated successfully. 
Aug 5 22:12:34.718427 systemd[1]: session-11.scope: Consumed 8.059s CPU time, 135.7M memory peak, 0B memory swap peak. Aug 5 22:12:34.719893 systemd-logind[1423]: Session 11 logged out. Waiting for processes to exit. Aug 5 22:12:34.722997 systemd-logind[1423]: Removed session 11. Aug 5 22:12:39.890906 kubelet[2635]: I0805 22:12:39.890481 2635 kuberuntime_manager.go:1529] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Aug 5 22:12:39.892801 containerd[1444]: time="2024-08-05T22:12:39.892129030Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." Aug 5 22:12:39.893151 kubelet[2635]: I0805 22:12:39.892592 2635 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Aug 5 22:12:40.919249 kubelet[2635]: I0805 22:12:40.919149 2635 topology_manager.go:215] "Topology Admit Handler" podUID="35bd2b90-acf5-4a90-9f83-99183c3a6c21" podNamespace="kube-system" podName="kube-proxy-zvxgc" Aug 5 22:12:40.941416 systemd[1]: Created slice kubepods-besteffort-pod35bd2b90_acf5_4a90_9f83_99183c3a6c21.slice - libcontainer container kubepods-besteffort-pod35bd2b90_acf5_4a90_9f83_99183c3a6c21.slice. 
Aug 5 22:12:41.007642 kubelet[2635]: I0805 22:12:41.007590 2635 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/35bd2b90-acf5-4a90-9f83-99183c3a6c21-lib-modules\") pod \"kube-proxy-zvxgc\" (UID: \"35bd2b90-acf5-4a90-9f83-99183c3a6c21\") " pod="kube-system/kube-proxy-zvxgc"
Aug 5 22:12:41.007642 kubelet[2635]: I0805 22:12:41.007649 2635 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/35bd2b90-acf5-4a90-9f83-99183c3a6c21-kube-proxy\") pod \"kube-proxy-zvxgc\" (UID: \"35bd2b90-acf5-4a90-9f83-99183c3a6c21\") " pod="kube-system/kube-proxy-zvxgc"
Aug 5 22:12:41.007828 kubelet[2635]: I0805 22:12:41.007674 2635 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/35bd2b90-acf5-4a90-9f83-99183c3a6c21-xtables-lock\") pod \"kube-proxy-zvxgc\" (UID: \"35bd2b90-acf5-4a90-9f83-99183c3a6c21\") " pod="kube-system/kube-proxy-zvxgc"
Aug 5 22:12:41.007828 kubelet[2635]: I0805 22:12:41.007702 2635 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pdhp6\" (UniqueName: \"kubernetes.io/projected/35bd2b90-acf5-4a90-9f83-99183c3a6c21-kube-api-access-pdhp6\") pod \"kube-proxy-zvxgc\" (UID: \"35bd2b90-acf5-4a90-9f83-99183c3a6c21\") " pod="kube-system/kube-proxy-zvxgc"
Aug 5 22:12:41.038995 kubelet[2635]: I0805 22:12:41.038955 2635 topology_manager.go:215] "Topology Admit Handler" podUID="64935183-0f84-40ed-8adb-1c7fc5f9f4eb" podNamespace="tigera-operator" podName="tigera-operator-76c4974c85-ktt5j"
Aug 5 22:12:41.045365 kubelet[2635]: W0805 22:12:41.045327 2635 reflector.go:539] object-"tigera-operator"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:ci-3975-2-0-1-de7b5ef465.novalocal" cannot list resource "configmaps" in API group "" in the namespace "tigera-operator": no relationship found between node 'ci-3975-2-0-1-de7b5ef465.novalocal' and this object
Aug 5 22:12:41.045365 kubelet[2635]: E0805 22:12:41.045365 2635 reflector.go:147] object-"tigera-operator"/"kube-root-ca.crt": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:ci-3975-2-0-1-de7b5ef465.novalocal" cannot list resource "configmaps" in API group "" in the namespace "tigera-operator": no relationship found between node 'ci-3975-2-0-1-de7b5ef465.novalocal' and this object
Aug 5 22:12:41.045558 kubelet[2635]: W0805 22:12:41.045408 2635 reflector.go:539] object-"tigera-operator"/"kubernetes-services-endpoint": failed to list *v1.ConfigMap: configmaps "kubernetes-services-endpoint" is forbidden: User "system:node:ci-3975-2-0-1-de7b5ef465.novalocal" cannot list resource "configmaps" in API group "" in the namespace "tigera-operator": no relationship found between node 'ci-3975-2-0-1-de7b5ef465.novalocal' and this object
Aug 5 22:12:41.045558 kubelet[2635]: E0805 22:12:41.045421 2635 reflector.go:147] object-"tigera-operator"/"kubernetes-services-endpoint": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps "kubernetes-services-endpoint" is forbidden: User "system:node:ci-3975-2-0-1-de7b5ef465.novalocal" cannot list resource "configmaps" in API group "" in the namespace "tigera-operator": no relationship found between node 'ci-3975-2-0-1-de7b5ef465.novalocal' and this object
Aug 5 22:12:41.049091 systemd[1]: Created slice kubepods-besteffort-pod64935183_0f84_40ed_8adb_1c7fc5f9f4eb.slice - libcontainer container kubepods-besteffort-pod64935183_0f84_40ed_8adb_1c7fc5f9f4eb.slice.
Aug 5 22:12:41.209324 kubelet[2635]: I0805 22:12:41.209125 2635 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qmh5z\" (UniqueName: \"kubernetes.io/projected/64935183-0f84-40ed-8adb-1c7fc5f9f4eb-kube-api-access-qmh5z\") pod \"tigera-operator-76c4974c85-ktt5j\" (UID: \"64935183-0f84-40ed-8adb-1c7fc5f9f4eb\") " pod="tigera-operator/tigera-operator-76c4974c85-ktt5j"
Aug 5 22:12:41.209324 kubelet[2635]: I0805 22:12:41.209186 2635 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/64935183-0f84-40ed-8adb-1c7fc5f9f4eb-var-lib-calico\") pod \"tigera-operator-76c4974c85-ktt5j\" (UID: \"64935183-0f84-40ed-8adb-1c7fc5f9f4eb\") " pod="tigera-operator/tigera-operator-76c4974c85-ktt5j"
Aug 5 22:12:41.270053 containerd[1444]: time="2024-08-05T22:12:41.269503269Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-zvxgc,Uid:35bd2b90-acf5-4a90-9f83-99183c3a6c21,Namespace:kube-system,Attempt:0,}"
Aug 5 22:12:41.338583 containerd[1444]: time="2024-08-05T22:12:41.337784678Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Aug 5 22:12:41.338583 containerd[1444]: time="2024-08-05T22:12:41.337928206Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Aug 5 22:12:41.340108 containerd[1444]: time="2024-08-05T22:12:41.339757511Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Aug 5 22:12:41.340108 containerd[1444]: time="2024-08-05T22:12:41.339818339Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Aug 5 22:12:41.382456 systemd[1]: Started cri-containerd-c687bbf28bc330e2bb46dac79f8434361311dfbed0ae53c83564378c432ebc8b.scope - libcontainer container c687bbf28bc330e2bb46dac79f8434361311dfbed0ae53c83564378c432ebc8b.
Aug 5 22:12:41.423424 containerd[1444]: time="2024-08-05T22:12:41.423186704Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-zvxgc,Uid:35bd2b90-acf5-4a90-9f83-99183c3a6c21,Namespace:kube-system,Attempt:0,} returns sandbox id \"c687bbf28bc330e2bb46dac79f8434361311dfbed0ae53c83564378c432ebc8b\""
Aug 5 22:12:41.426956 containerd[1444]: time="2024-08-05T22:12:41.426733037Z" level=info msg="CreateContainer within sandbox \"c687bbf28bc330e2bb46dac79f8434361311dfbed0ae53c83564378c432ebc8b\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}"
Aug 5 22:12:41.454949 containerd[1444]: time="2024-08-05T22:12:41.454898237Z" level=info msg="CreateContainer within sandbox \"c687bbf28bc330e2bb46dac79f8434361311dfbed0ae53c83564378c432ebc8b\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"245df651512c844f7c2743f128f0ad26e1b1cbc320d688a39397f7b64d3549d2\""
Aug 5 22:12:41.457797 containerd[1444]: time="2024-08-05T22:12:41.456669339Z" level=info msg="StartContainer for \"245df651512c844f7c2743f128f0ad26e1b1cbc320d688a39397f7b64d3549d2\""
Aug 5 22:12:41.500373 systemd[1]: Started cri-containerd-245df651512c844f7c2743f128f0ad26e1b1cbc320d688a39397f7b64d3549d2.scope - libcontainer container 245df651512c844f7c2743f128f0ad26e1b1cbc320d688a39397f7b64d3549d2.
Aug 5 22:12:41.543917 containerd[1444]: time="2024-08-05T22:12:41.543871153Z" level=info msg="StartContainer for \"245df651512c844f7c2743f128f0ad26e1b1cbc320d688a39397f7b64d3549d2\" returns successfully"
Aug 5 22:12:41.839943 kubelet[2635]: I0805 22:12:41.839763 2635 pod_startup_latency_tracker.go:102] "Observed pod startup duration" pod="kube-system/kube-proxy-zvxgc" podStartSLOduration=1.839361971 podStartE2EDuration="1.839361971s" podCreationTimestamp="2024-08-05 22:12:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2024-08-05 22:12:41.838970091 +0000 UTC m=+13.544190509" watchObservedRunningTime="2024-08-05 22:12:41.839361971 +0000 UTC m=+13.544582409"
Aug 5 22:12:41.956364 containerd[1444]: time="2024-08-05T22:12:41.955942677Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-76c4974c85-ktt5j,Uid:64935183-0f84-40ed-8adb-1c7fc5f9f4eb,Namespace:tigera-operator,Attempt:0,}"
Aug 5 22:12:41.984273 containerd[1444]: time="2024-08-05T22:12:41.984143085Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Aug 5 22:12:41.984273 containerd[1444]: time="2024-08-05T22:12:41.984206618Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Aug 5 22:12:41.984563 containerd[1444]: time="2024-08-05T22:12:41.984300590Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Aug 5 22:12:41.984563 containerd[1444]: time="2024-08-05T22:12:41.984345167Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Aug 5 22:12:42.009435 systemd[1]: Started cri-containerd-b08a88c5a42f2111d741749b6d43bdaf5205410071873cb6bbfa69878a100fa0.scope - libcontainer container b08a88c5a42f2111d741749b6d43bdaf5205410071873cb6bbfa69878a100fa0.
Aug 5 22:12:42.051925 containerd[1444]: time="2024-08-05T22:12:42.051878265Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-76c4974c85-ktt5j,Uid:64935183-0f84-40ed-8adb-1c7fc5f9f4eb,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"b08a88c5a42f2111d741749b6d43bdaf5205410071873cb6bbfa69878a100fa0\""
Aug 5 22:12:42.069043 containerd[1444]: time="2024-08-05T22:12:42.068709765Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.34.0\""
Aug 5 22:12:42.144533 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4222168780.mount: Deactivated successfully.
Aug 5 22:12:43.917581 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1531785418.mount: Deactivated successfully.
Aug 5 22:12:45.543509 containerd[1444]: time="2024-08-05T22:12:45.543390852Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.34.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Aug 5 22:12:45.544951 containerd[1444]: time="2024-08-05T22:12:45.544890953Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.34.0: active requests=0, bytes read=22076076"
Aug 5 22:12:45.545315 containerd[1444]: time="2024-08-05T22:12:45.545273791Z" level=info msg="ImageCreate event name:\"sha256:01249e32d0f6f7d0ad79761d634d16738f1a5792b893f202f9a417c63034411d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Aug 5 22:12:45.548397 containerd[1444]: time="2024-08-05T22:12:45.548009825Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:479ddc7ff9ab095058b96f6710bbf070abada86332e267d6e5dcc1df36ba2cc5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Aug 5 22:12:45.548901 containerd[1444]: time="2024-08-05T22:12:45.548870716Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.34.0\" with image id \"sha256:01249e32d0f6f7d0ad79761d634d16738f1a5792b893f202f9a417c63034411d\", repo tag \"quay.io/tigera/operator:v1.34.0\", repo digest \"quay.io/tigera/operator@sha256:479ddc7ff9ab095058b96f6710bbf070abada86332e267d6e5dcc1df36ba2cc5\", size \"22070263\" in 3.48007812s"
Aug 5 22:12:45.549069 containerd[1444]: time="2024-08-05T22:12:45.549050552Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.34.0\" returns image reference \"sha256:01249e32d0f6f7d0ad79761d634d16738f1a5792b893f202f9a417c63034411d\""
Aug 5 22:12:45.552533 containerd[1444]: time="2024-08-05T22:12:45.552497256Z" level=info msg="CreateContainer within sandbox \"b08a88c5a42f2111d741749b6d43bdaf5205410071873cb6bbfa69878a100fa0\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}"
Aug 5 22:12:45.575964 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4045663036.mount: Deactivated successfully.
Aug 5 22:12:45.579957 containerd[1444]: time="2024-08-05T22:12:45.579840567Z" level=info msg="CreateContainer within sandbox \"b08a88c5a42f2111d741749b6d43bdaf5205410071873cb6bbfa69878a100fa0\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"e092276a0fc2c0ca8cd3a0afea53a6546077a5d0a63f0913251742d34c0c896f\""
Aug 5 22:12:45.581682 containerd[1444]: time="2024-08-05T22:12:45.580872506Z" level=info msg="StartContainer for \"e092276a0fc2c0ca8cd3a0afea53a6546077a5d0a63f0913251742d34c0c896f\""
Aug 5 22:12:45.620981 systemd[1]: run-containerd-runc-k8s.io-e092276a0fc2c0ca8cd3a0afea53a6546077a5d0a63f0913251742d34c0c896f-runc.r6851Y.mount: Deactivated successfully.
Aug 5 22:12:45.628364 systemd[1]: Started cri-containerd-e092276a0fc2c0ca8cd3a0afea53a6546077a5d0a63f0913251742d34c0c896f.scope - libcontainer container e092276a0fc2c0ca8cd3a0afea53a6546077a5d0a63f0913251742d34c0c896f.
Aug 5 22:12:45.669745 containerd[1444]: time="2024-08-05T22:12:45.669615557Z" level=info msg="StartContainer for \"e092276a0fc2c0ca8cd3a0afea53a6546077a5d0a63f0913251742d34c0c896f\" returns successfully"
Aug 5 22:12:48.833355 kubelet[2635]: I0805 22:12:48.830614 2635 pod_startup_latency_tracker.go:102] "Observed pod startup duration" pod="tigera-operator/tigera-operator-76c4974c85-ktt5j" podStartSLOduration=4.332361418 podStartE2EDuration="7.828480174s" podCreationTimestamp="2024-08-05 22:12:41 +0000 UTC" firstStartedPulling="2024-08-05 22:12:42.053325576 +0000 UTC m=+13.758545964" lastFinishedPulling="2024-08-05 22:12:45.549444332 +0000 UTC m=+17.254664720" observedRunningTime="2024-08-05 22:12:45.857033145 +0000 UTC m=+17.562253583" watchObservedRunningTime="2024-08-05 22:12:48.828480174 +0000 UTC m=+20.533700612"
Aug 5 22:12:48.833355 kubelet[2635]: I0805 22:12:48.830921 2635 topology_manager.go:215] "Topology Admit Handler" podUID="f2ff3fd7-525a-47a9-95a1-74f7bff921c4" podNamespace="calico-system" podName="calico-typha-5859df9dd8-qnrkm"
Aug 5 22:12:48.842602 systemd[1]: Created slice kubepods-besteffort-podf2ff3fd7_525a_47a9_95a1_74f7bff921c4.slice - libcontainer container kubepods-besteffort-podf2ff3fd7_525a_47a9_95a1_74f7bff921c4.slice.
Aug 5 22:12:48.937707 kubelet[2635]: I0805 22:12:48.937667 2635 topology_manager.go:215] "Topology Admit Handler" podUID="825ecc74-9466-4fbf-bfc0-233430c643bb" podNamespace="calico-system" podName="calico-node-zs4mf"
Aug 5 22:12:48.951429 systemd[1]: Created slice kubepods-besteffort-pod825ecc74_9466_4fbf_bfc0_233430c643bb.slice - libcontainer container kubepods-besteffort-pod825ecc74_9466_4fbf_bfc0_233430c643bb.slice.
Aug 5 22:12:48.961936 kubelet[2635]: I0805 22:12:48.961899 2635 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/f2ff3fd7-525a-47a9-95a1-74f7bff921c4-typha-certs\") pod \"calico-typha-5859df9dd8-qnrkm\" (UID: \"f2ff3fd7-525a-47a9-95a1-74f7bff921c4\") " pod="calico-system/calico-typha-5859df9dd8-qnrkm"
Aug 5 22:12:48.961936 kubelet[2635]: I0805 22:12:48.961940 2635 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rj47n\" (UniqueName: \"kubernetes.io/projected/f2ff3fd7-525a-47a9-95a1-74f7bff921c4-kube-api-access-rj47n\") pod \"calico-typha-5859df9dd8-qnrkm\" (UID: \"f2ff3fd7-525a-47a9-95a1-74f7bff921c4\") " pod="calico-system/calico-typha-5859df9dd8-qnrkm"
Aug 5 22:12:48.962144 kubelet[2635]: I0805 22:12:48.961973 2635 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f2ff3fd7-525a-47a9-95a1-74f7bff921c4-tigera-ca-bundle\") pod \"calico-typha-5859df9dd8-qnrkm\" (UID: \"f2ff3fd7-525a-47a9-95a1-74f7bff921c4\") " pod="calico-system/calico-typha-5859df9dd8-qnrkm"
Aug 5 22:12:49.070742 kubelet[2635]: I0805 22:12:49.068993 2635 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/825ecc74-9466-4fbf-bfc0-233430c643bb-flexvol-driver-host\") pod \"calico-node-zs4mf\" (UID: \"825ecc74-9466-4fbf-bfc0-233430c643bb\") " pod="calico-system/calico-node-zs4mf"
Aug 5 22:12:49.072905 kubelet[2635]: I0805 22:12:49.072506 2635 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/825ecc74-9466-4fbf-bfc0-233430c643bb-var-run-calico\") pod \"calico-node-zs4mf\" (UID: \"825ecc74-9466-4fbf-bfc0-233430c643bb\") " pod="calico-system/calico-node-zs4mf"
Aug 5 22:12:49.073377 kubelet[2635]: I0805 22:12:49.072764 2635 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/825ecc74-9466-4fbf-bfc0-233430c643bb-var-lib-calico\") pod \"calico-node-zs4mf\" (UID: \"825ecc74-9466-4fbf-bfc0-233430c643bb\") " pod="calico-system/calico-node-zs4mf"
Aug 5 22:12:49.074135 kubelet[2635]: I0805 22:12:49.073950 2635 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dn8xf\" (UniqueName: \"kubernetes.io/projected/825ecc74-9466-4fbf-bfc0-233430c643bb-kube-api-access-dn8xf\") pod \"calico-node-zs4mf\" (UID: \"825ecc74-9466-4fbf-bfc0-233430c643bb\") " pod="calico-system/calico-node-zs4mf"
Aug 5 22:12:49.078313 kubelet[2635]: I0805 22:12:49.075359 2635 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/825ecc74-9466-4fbf-bfc0-233430c643bb-tigera-ca-bundle\") pod \"calico-node-zs4mf\" (UID: \"825ecc74-9466-4fbf-bfc0-233430c643bb\") " pod="calico-system/calico-node-zs4mf"
Aug 5 22:12:49.078313 kubelet[2635]: I0805 22:12:49.075440 2635 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/825ecc74-9466-4fbf-bfc0-233430c643bb-node-certs\") pod \"calico-node-zs4mf\" (UID: \"825ecc74-9466-4fbf-bfc0-233430c643bb\") " pod="calico-system/calico-node-zs4mf"
Aug 5 22:12:49.078313 kubelet[2635]: I0805 22:12:49.075503 2635 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/825ecc74-9466-4fbf-bfc0-233430c643bb-xtables-lock\") pod \"calico-node-zs4mf\" (UID: \"825ecc74-9466-4fbf-bfc0-233430c643bb\") " pod="calico-system/calico-node-zs4mf"
Aug 5 22:12:49.078313 kubelet[2635]: I0805 22:12:49.075558 2635 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/825ecc74-9466-4fbf-bfc0-233430c643bb-cni-net-dir\") pod \"calico-node-zs4mf\" (UID: \"825ecc74-9466-4fbf-bfc0-233430c643bb\") " pod="calico-system/calico-node-zs4mf"
Aug 5 22:12:49.078313 kubelet[2635]: I0805 22:12:49.075615 2635 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/825ecc74-9466-4fbf-bfc0-233430c643bb-cni-log-dir\") pod \"calico-node-zs4mf\" (UID: \"825ecc74-9466-4fbf-bfc0-233430c643bb\") " pod="calico-system/calico-node-zs4mf"
Aug 5 22:12:49.078761 kubelet[2635]: I0805 22:12:49.075668 2635 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/825ecc74-9466-4fbf-bfc0-233430c643bb-lib-modules\") pod \"calico-node-zs4mf\" (UID: \"825ecc74-9466-4fbf-bfc0-233430c643bb\") " pod="calico-system/calico-node-zs4mf"
Aug 5 22:12:49.078761 kubelet[2635]: I0805 22:12:49.075722 2635 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/825ecc74-9466-4fbf-bfc0-233430c643bb-cni-bin-dir\") pod \"calico-node-zs4mf\" (UID: \"825ecc74-9466-4fbf-bfc0-233430c643bb\") " pod="calico-system/calico-node-zs4mf"
Aug 5 22:12:49.078761 kubelet[2635]: I0805 22:12:49.075815 2635 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/825ecc74-9466-4fbf-bfc0-233430c643bb-policysync\") pod \"calico-node-zs4mf\" (UID: \"825ecc74-9466-4fbf-bfc0-233430c643bb\") " pod="calico-system/calico-node-zs4mf"
Aug 5 22:12:49.092894 kubelet[2635]: I0805 22:12:49.092573 2635 topology_manager.go:215] "Topology Admit Handler" podUID="2d1ea0ca-eeda-4305-a298-4df807e6a886" podNamespace="calico-system" podName="csi-node-driver-ck8bx"
Aug 5 22:12:49.092894 kubelet[2635]: E0805 22:12:49.092841 2635 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-ck8bx" podUID="2d1ea0ca-eeda-4305-a298-4df807e6a886"
Aug 5 22:12:49.171123 containerd[1444]: time="2024-08-05T22:12:49.170351933Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-5859df9dd8-qnrkm,Uid:f2ff3fd7-525a-47a9-95a1-74f7bff921c4,Namespace:calico-system,Attempt:0,}"
Aug 5 22:12:49.184929 kubelet[2635]: E0805 22:12:49.184521 2635 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Aug 5 22:12:49.184929 kubelet[2635]: W0805 22:12:49.184568 2635 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Aug 5 22:12:49.191248 kubelet[2635]: E0805 22:12:49.191192 2635 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Aug 5 22:12:49.192129 kubelet[2635]: E0805 22:12:49.191985 2635 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Aug 5 22:12:49.192129 kubelet[2635]: W0805 22:12:49.192003 2635 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Aug 5 22:12:49.192652 kubelet[2635]: E0805 22:12:49.192612 2635 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Aug 5 22:12:49.192985 kubelet[2635]: E0805 22:12:49.192783 2635 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Aug 5 22:12:49.192985 kubelet[2635]: W0805 22:12:49.192829 2635 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Aug 5 22:12:49.193457 kubelet[2635]: E0805 22:12:49.193441 2635 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Aug 5 22:12:49.193791 kubelet[2635]: E0805 22:12:49.193725 2635 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Aug 5 22:12:49.193791 kubelet[2635]: W0805 22:12:49.193737 2635 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Aug 5 22:12:49.193791 kubelet[2635]: E0805 22:12:49.193781 2635 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Aug 5 22:12:49.194906 kubelet[2635]: E0805 22:12:49.194763 2635 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Aug 5 22:12:49.194906 kubelet[2635]: W0805 22:12:49.194777 2635 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Aug 5 22:12:49.194906 kubelet[2635]: E0805 22:12:49.194839 2635 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Aug 5 22:12:49.195097 kubelet[2635]: E0805 22:12:49.195083 2635 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Aug 5 22:12:49.195166 kubelet[2635]: W0805 22:12:49.195149 2635 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Aug 5 22:12:49.195380 kubelet[2635]: E0805 22:12:49.195354 2635 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Aug 5 22:12:49.195619 kubelet[2635]: E0805 22:12:49.195507 2635 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Aug 5 22:12:49.195619 kubelet[2635]: W0805 22:12:49.195518 2635 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Aug 5 22:12:49.195619 kubelet[2635]: E0805 22:12:49.195566 2635 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Aug 5 22:12:49.195827 kubelet[2635]: E0805 22:12:49.195817 2635 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Aug 5 22:12:49.195969 kubelet[2635]: W0805 22:12:49.195883 2635 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Aug 5 22:12:49.195969 kubelet[2635]: E0805 22:12:49.195909 2635 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Aug 5 22:12:49.196392 kubelet[2635]: E0805 22:12:49.196380 2635 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Aug 5 22:12:49.196761 kubelet[2635]: W0805 22:12:49.196443 2635 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Aug 5 22:12:49.196761 kubelet[2635]: E0805 22:12:49.196472 2635 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Aug 5 22:12:49.197212 kubelet[2635]: E0805 22:12:49.197180 2635 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Aug 5 22:12:49.197340 kubelet[2635]: W0805 22:12:49.197301 2635 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Aug 5 22:12:49.197381 kubelet[2635]: E0805 22:12:49.197344 2635 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Aug 5 22:12:49.197568 kubelet[2635]: E0805 22:12:49.197529 2635 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Aug 5 22:12:49.197568 kubelet[2635]: W0805 22:12:49.197545 2635 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Aug 5 22:12:49.197568 kubelet[2635]: E0805 22:12:49.197559 2635 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Aug 5 22:12:49.199551 kubelet[2635]: E0805 22:12:49.199214 2635 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Aug 5 22:12:49.199551 kubelet[2635]: W0805 22:12:49.199296 2635 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Aug 5 22:12:49.199551 kubelet[2635]: E0805 22:12:49.199501 2635 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Aug 5 22:12:49.199551 kubelet[2635]: W0805 22:12:49.199511 2635 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Aug 5 22:12:49.199551 kubelet[2635]: E0805 22:12:49.199508 2635 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Aug 5 22:12:49.199731 kubelet[2635]: E0805 22:12:49.199563 2635 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Aug 5 22:12:49.200090 kubelet[2635]: E0805 22:12:49.199840 2635 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Aug 5 22:12:49.200090 kubelet[2635]: W0805 22:12:49.199856 2635 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Aug 5 22:12:49.200090 kubelet[2635]: E0805 22:12:49.200037 2635 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Aug 5 22:12:49.200206 kubelet[2635]: E0805 22:12:49.200117 2635 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Aug 5 22:12:49.200206 kubelet[2635]: W0805 22:12:49.200147 2635 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Aug 5 22:12:49.200437 kubelet[2635]: E0805 22:12:49.200341 2635 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Aug 5 22:12:49.200823 kubelet[2635]: E0805 22:12:49.200802 2635 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Aug 5 22:12:49.200823 kubelet[2635]: W0805 22:12:49.200817 2635 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Aug 5 22:12:49.200901 kubelet[2635]: E0805 22:12:49.200830 2635 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Aug 5 22:12:49.201631 kubelet[2635]: E0805 22:12:49.201605 2635 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Aug 5 22:12:49.203248 kubelet[2635]: W0805 22:12:49.201777 2635 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Aug 5 22:12:49.203248 kubelet[2635]: E0805 22:12:49.201801 2635 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Aug 5 22:12:49.203248 kubelet[2635]: E0805 22:12:49.202195 2635 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Aug 5 22:12:49.211720 kubelet[2635]: W0805 22:12:49.210914 2635 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Aug 5 22:12:49.211720 kubelet[2635]: E0805 22:12:49.210955 2635 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Aug 5 22:12:49.211720 kubelet[2635]: E0805 22:12:49.211144 2635 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Aug 5 22:12:49.211720 kubelet[2635]: W0805 22:12:49.211153 2635 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Aug 5 22:12:49.211720 kubelet[2635]: E0805 22:12:49.211165 2635 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Aug 5 22:12:49.212539 kubelet[2635]: E0805 22:12:49.212432 2635 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Aug 5 22:12:49.212539 kubelet[2635]: W0805 22:12:49.212447 2635 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Aug 5 22:12:49.212539 kubelet[2635]: E0805 22:12:49.212463 2635 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Aug 5 22:12:49.212638 kubelet[2635]: E0805 22:12:49.212612 2635 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Aug 5 22:12:49.212638 kubelet[2635]: W0805 22:12:49.212621 2635 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Aug 5 22:12:49.212638 kubelet[2635]: E0805 22:12:49.212632 2635 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Aug 5 22:12:49.213659 kubelet[2635]: E0805 22:12:49.213408 2635 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Aug 5 22:12:49.213659 kubelet[2635]: W0805 22:12:49.213424 2635 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Aug 5 22:12:49.213659 kubelet[2635]: E0805 22:12:49.213439 2635 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Aug 5 22:12:49.216354 kubelet[2635]: E0805 22:12:49.216320 2635 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Aug 5 22:12:49.216354 kubelet[2635]: W0805 22:12:49.216337 2635 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Aug 5 22:12:49.216354 kubelet[2635]: E0805 22:12:49.216350 2635 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input" Aug 5 22:12:49.222863 kubelet[2635]: E0805 22:12:49.222834 2635 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 5 22:12:49.222863 kubelet[2635]: W0805 22:12:49.222857 2635 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 5 22:12:49.223157 kubelet[2635]: E0805 22:12:49.222953 2635 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 5 22:12:49.223157 kubelet[2635]: E0805 22:12:49.223125 2635 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 5 22:12:49.223157 kubelet[2635]: W0805 22:12:49.223134 2635 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 5 22:12:49.224313 kubelet[2635]: E0805 22:12:49.224281 2635 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 5 22:12:49.224904 kubelet[2635]: E0805 22:12:49.224882 2635 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 5 22:12:49.224904 kubelet[2635]: W0805 22:12:49.224900 2635 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 5 22:12:49.225700 kubelet[2635]: E0805 22:12:49.225642 2635 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 5 22:12:49.225700 kubelet[2635]: W0805 22:12:49.225666 2635 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 5 22:12:49.225700 kubelet[2635]: E0805 22:12:49.225670 2635 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 5 22:12:49.225700 kubelet[2635]: E0805 22:12:49.225687 2635 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 5 22:12:49.226027 kubelet[2635]: E0805 22:12:49.226008 2635 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 5 22:12:49.226027 kubelet[2635]: W0805 22:12:49.226023 2635 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 5 22:12:49.226113 kubelet[2635]: E0805 22:12:49.226039 2635 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 5 22:12:49.229358 kubelet[2635]: E0805 22:12:49.229338 2635 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 5 22:12:49.229358 kubelet[2635]: W0805 22:12:49.229352 2635 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 5 22:12:49.229452 kubelet[2635]: E0805 22:12:49.229367 2635 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 5 22:12:49.229621 kubelet[2635]: E0805 22:12:49.229602 2635 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 5 22:12:49.229621 kubelet[2635]: W0805 22:12:49.229616 2635 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 5 22:12:49.229695 kubelet[2635]: E0805 22:12:49.229628 2635 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 5 22:12:49.229877 kubelet[2635]: E0805 22:12:49.229859 2635 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 5 22:12:49.229877 kubelet[2635]: W0805 22:12:49.229876 2635 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 5 22:12:49.229944 kubelet[2635]: E0805 22:12:49.229888 2635 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 5 22:12:49.243039 containerd[1444]: time="2024-08-05T22:12:49.241154536Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Aug 5 22:12:49.243039 containerd[1444]: time="2024-08-05T22:12:49.241244128Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Aug 5 22:12:49.243039 containerd[1444]: time="2024-08-05T22:12:49.241275589Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Aug 5 22:12:49.243039 containerd[1444]: time="2024-08-05T22:12:49.241295366Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Aug 5 22:12:49.256181 containerd[1444]: time="2024-08-05T22:12:49.255750789Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-zs4mf,Uid:825ecc74-9466-4fbf-bfc0-233430c643bb,Namespace:calico-system,Attempt:0,}" Aug 5 22:12:49.268416 systemd[1]: Started cri-containerd-57ba4be255acc58e96253187a5e2ea4f3b3e7eda28c3e5dd3d7b0f67986e6297.scope - libcontainer container 57ba4be255acc58e96253187a5e2ea4f3b3e7eda28c3e5dd3d7b0f67986e6297. Aug 5 22:12:49.280874 kubelet[2635]: E0805 22:12:49.280844 2635 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 5 22:12:49.281694 kubelet[2635]: W0805 22:12:49.281273 2635 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 5 22:12:49.281694 kubelet[2635]: E0805 22:12:49.281309 2635 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 5 22:12:49.281694 kubelet[2635]: I0805 22:12:49.281348 2635 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/2d1ea0ca-eeda-4305-a298-4df807e6a886-socket-dir\") pod \"csi-node-driver-ck8bx\" (UID: \"2d1ea0ca-eeda-4305-a298-4df807e6a886\") " pod="calico-system/csi-node-driver-ck8bx" Aug 5 22:12:49.282335 kubelet[2635]: E0805 22:12:49.282320 2635 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 5 22:12:49.282933 kubelet[2635]: W0805 22:12:49.282918 2635 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 5 22:12:49.283287 kubelet[2635]: E0805 22:12:49.283274 2635 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 5 22:12:49.285731 kubelet[2635]: E0805 22:12:49.285315 2635 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 5 22:12:49.285731 kubelet[2635]: W0805 22:12:49.285421 2635 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 5 22:12:49.285731 kubelet[2635]: E0805 22:12:49.285437 2635 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 5 22:12:49.288138 kubelet[2635]: E0805 22:12:49.286416 2635 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 5 22:12:49.288138 kubelet[2635]: W0805 22:12:49.286429 2635 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 5 22:12:49.288138 kubelet[2635]: E0805 22:12:49.286443 2635 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 5 22:12:49.294749 kubelet[2635]: E0805 22:12:49.294727 2635 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 5 22:12:49.294859 kubelet[2635]: W0805 22:12:49.294845 2635 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 5 22:12:49.296283 kubelet[2635]: E0805 22:12:49.296258 2635 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 5 22:12:49.296494 kubelet[2635]: E0805 22:12:49.296483 2635 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 5 22:12:49.296902 kubelet[2635]: W0805 22:12:49.296547 2635 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 5 22:12:49.296902 kubelet[2635]: E0805 22:12:49.296564 2635 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 5 22:12:49.320427 kubelet[2635]: I0805 22:12:49.320377 2635 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/2d1ea0ca-eeda-4305-a298-4df807e6a886-varrun\") pod \"csi-node-driver-ck8bx\" (UID: \"2d1ea0ca-eeda-4305-a298-4df807e6a886\") " pod="calico-system/csi-node-driver-ck8bx" Aug 5 22:12:49.322018 kubelet[2635]: E0805 22:12:49.322002 2635 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 5 22:12:49.322110 kubelet[2635]: W0805 22:12:49.322095 2635 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 5 22:12:49.322245 kubelet[2635]: E0805 22:12:49.322174 2635 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 5 22:12:49.323361 kubelet[2635]: I0805 22:12:49.323309 2635 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2d1ea0ca-eeda-4305-a298-4df807e6a886-kubelet-dir\") pod \"csi-node-driver-ck8bx\" (UID: \"2d1ea0ca-eeda-4305-a298-4df807e6a886\") " pod="calico-system/csi-node-driver-ck8bx" Aug 5 22:12:49.323716 kubelet[2635]: E0805 22:12:49.323627 2635 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 5 22:12:49.323716 kubelet[2635]: W0805 22:12:49.323640 2635 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 5 22:12:49.323716 kubelet[2635]: E0805 22:12:49.323654 2635 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 5 22:12:49.323716 kubelet[2635]: I0805 22:12:49.323679 2635 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/2d1ea0ca-eeda-4305-a298-4df807e6a886-registration-dir\") pod \"csi-node-driver-ck8bx\" (UID: \"2d1ea0ca-eeda-4305-a298-4df807e6a886\") " pod="calico-system/csi-node-driver-ck8bx" Aug 5 22:12:49.325670 kubelet[2635]: E0805 22:12:49.325643 2635 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 5 22:12:49.325670 kubelet[2635]: W0805 22:12:49.325667 2635 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 5 22:12:49.325842 kubelet[2635]: E0805 22:12:49.325701 2635 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 5 22:12:49.326687 kubelet[2635]: E0805 22:12:49.326233 2635 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 5 22:12:49.326687 kubelet[2635]: W0805 22:12:49.326248 2635 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 5 22:12:49.326687 kubelet[2635]: E0805 22:12:49.326655 2635 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 5 22:12:49.326687 kubelet[2635]: W0805 22:12:49.326664 2635 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 5 22:12:49.326976 kubelet[2635]: E0805 22:12:49.326882 2635 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 5 22:12:49.326976 kubelet[2635]: E0805 22:12:49.326917 2635 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 5 22:12:49.326976 kubelet[2635]: I0805 22:12:49.326938 2635 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nvk6g\" (UniqueName: \"kubernetes.io/projected/2d1ea0ca-eeda-4305-a298-4df807e6a886-kube-api-access-nvk6g\") pod \"csi-node-driver-ck8bx\" (UID: \"2d1ea0ca-eeda-4305-a298-4df807e6a886\") " pod="calico-system/csi-node-driver-ck8bx" Aug 5 22:12:49.327528 kubelet[2635]: E0805 22:12:49.327136 2635 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 5 22:12:49.327528 kubelet[2635]: W0805 22:12:49.327151 2635 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 5 22:12:49.327528 kubelet[2635]: E0805 22:12:49.327172 2635 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 5 22:12:49.327704 kubelet[2635]: E0805 22:12:49.327673 2635 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 5 22:12:49.327704 kubelet[2635]: W0805 22:12:49.327688 2635 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 5 22:12:49.327704 kubelet[2635]: E0805 22:12:49.327701 2635 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 5 22:12:49.328104 kubelet[2635]: E0805 22:12:49.328077 2635 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 5 22:12:49.328104 kubelet[2635]: W0805 22:12:49.328092 2635 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 5 22:12:49.328104 kubelet[2635]: E0805 22:12:49.328104 2635 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 5 22:12:49.328613 kubelet[2635]: E0805 22:12:49.328564 2635 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 5 22:12:49.328613 kubelet[2635]: W0805 22:12:49.328577 2635 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 5 22:12:49.328613 kubelet[2635]: E0805 22:12:49.328599 2635 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 5 22:12:49.328802 containerd[1444]: time="2024-08-05T22:12:49.328292340Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Aug 5 22:12:49.328802 containerd[1444]: time="2024-08-05T22:12:49.328363066Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Aug 5 22:12:49.328802 containerd[1444]: time="2024-08-05T22:12:49.328380379Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Aug 5 22:12:49.328802 containerd[1444]: time="2024-08-05T22:12:49.328392512Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Aug 5 22:12:49.353400 systemd[1]: Started cri-containerd-de33a8626cda7c4dc930033f9a51a27269999ea189861abee56825a2b75599ce.scope - libcontainer container de33a8626cda7c4dc930033f9a51a27269999ea189861abee56825a2b75599ce. Aug 5 22:12:49.379978 containerd[1444]: time="2024-08-05T22:12:49.379901985Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-5859df9dd8-qnrkm,Uid:f2ff3fd7-525a-47a9-95a1-74f7bff921c4,Namespace:calico-system,Attempt:0,} returns sandbox id \"57ba4be255acc58e96253187a5e2ea4f3b3e7eda28c3e5dd3d7b0f67986e6297\"" Aug 5 22:12:49.407863 containerd[1444]: time="2024-08-05T22:12:49.407792832Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-zs4mf,Uid:825ecc74-9466-4fbf-bfc0-233430c643bb,Namespace:calico-system,Attempt:0,} returns sandbox id \"de33a8626cda7c4dc930033f9a51a27269999ea189861abee56825a2b75599ce\"" Aug 5 22:12:49.409237 containerd[1444]: time="2024-08-05T22:12:49.408822038Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.28.0\"" Aug 5 22:12:49.438680 kubelet[2635]: E0805 22:12:49.438642 2635 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 5 22:12:49.438680 kubelet[2635]: W0805 22:12:49.438668 2635 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 5 22:12:49.438849 kubelet[2635]: E0805 22:12:49.438718 2635 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 5 22:12:49.439239 kubelet[2635]: E0805 22:12:49.438985 2635 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 5 22:12:49.439239 kubelet[2635]: W0805 22:12:49.439001 2635 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 5 22:12:49.439239 kubelet[2635]: E0805 22:12:49.439027 2635 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 5 22:12:49.439344 kubelet[2635]: E0805 22:12:49.439243 2635 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 5 22:12:49.439344 kubelet[2635]: W0805 22:12:49.439252 2635 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 5 22:12:49.439344 kubelet[2635]: E0805 22:12:49.439267 2635 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 5 22:12:49.439672 kubelet[2635]: E0805 22:12:49.439460 2635 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 5 22:12:49.439672 kubelet[2635]: W0805 22:12:49.439474 2635 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 5 22:12:49.439672 kubelet[2635]: E0805 22:12:49.439498 2635 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 5 22:12:49.439762 kubelet[2635]: E0805 22:12:49.439695 2635 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 5 22:12:49.439762 kubelet[2635]: W0805 22:12:49.439708 2635 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 5 22:12:49.439762 kubelet[2635]: E0805 22:12:49.439722 2635 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 5 22:12:49.440971 kubelet[2635]: E0805 22:12:49.439939 2635 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 5 22:12:49.440971 kubelet[2635]: W0805 22:12:49.439955 2635 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 5 22:12:49.440971 kubelet[2635]: E0805 22:12:49.439980 2635 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 5 22:12:49.440971 kubelet[2635]: E0805 22:12:49.440133 2635 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 5 22:12:49.440971 kubelet[2635]: W0805 22:12:49.440141 2635 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 5 22:12:49.440971 kubelet[2635]: E0805 22:12:49.440164 2635 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 5 22:12:49.440971 kubelet[2635]: E0805 22:12:49.440315 2635 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 5 22:12:49.440971 kubelet[2635]: W0805 22:12:49.440324 2635 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 5 22:12:49.440971 kubelet[2635]: E0805 22:12:49.440389 2635 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 5 22:12:49.440971 kubelet[2635]: E0805 22:12:49.440477 2635 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 5 22:12:49.441297 kubelet[2635]: W0805 22:12:49.440484 2635 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 5 22:12:49.441297 kubelet[2635]: E0805 22:12:49.440550 2635 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 5 22:12:49.441297 kubelet[2635]: E0805 22:12:49.440628 2635 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 5 22:12:49.441297 kubelet[2635]: W0805 22:12:49.440656 2635 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 5 22:12:49.441297 kubelet[2635]: E0805 22:12:49.440744 2635 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 5 22:12:49.441297 kubelet[2635]: E0805 22:12:49.440852 2635 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 5 22:12:49.441297 kubelet[2635]: W0805 22:12:49.440866 2635 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 5 22:12:49.441297 kubelet[2635]: E0805 22:12:49.440962 2635 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 5 22:12:49.441297 kubelet[2635]: E0805 22:12:49.441071 2635 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 5 22:12:49.441297 kubelet[2635]: W0805 22:12:49.441080 2635 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 5 22:12:49.441541 kubelet[2635]: E0805 22:12:49.441124 2635 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 5 22:12:49.441541 kubelet[2635]: E0805 22:12:49.441295 2635 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 5 22:12:49.441541 kubelet[2635]: W0805 22:12:49.441303 2635 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 5 22:12:49.441541 kubelet[2635]: E0805 22:12:49.441317 2635 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 5 22:12:49.441541 kubelet[2635]: E0805 22:12:49.441471 2635 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 5 22:12:49.441541 kubelet[2635]: W0805 22:12:49.441479 2635 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 5 22:12:49.441692 kubelet[2635]: E0805 22:12:49.441566 2635 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 5 22:12:49.442249 kubelet[2635]: E0805 22:12:49.441725 2635 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 5 22:12:49.442249 kubelet[2635]: W0805 22:12:49.441741 2635 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 5 22:12:49.442249 kubelet[2635]: E0805 22:12:49.441866 2635 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 5 22:12:49.442249 kubelet[2635]: W0805 22:12:49.441876 2635 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 5 22:12:49.442249 kubelet[2635]: E0805 22:12:49.441995 2635 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 5 22:12:49.442249 kubelet[2635]: W0805 22:12:49.442003 2635 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], 
error: executable file not found in $PATH, output: "" Aug 5 22:12:49.442249 kubelet[2635]: E0805 22:12:49.442119 2635 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 5 22:12:49.442249 kubelet[2635]: W0805 22:12:49.442127 2635 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 5 22:12:49.442249 kubelet[2635]: E0805 22:12:49.442139 2635 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 5 22:12:49.442466 kubelet[2635]: E0805 22:12:49.442288 2635 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 5 22:12:49.442466 kubelet[2635]: W0805 22:12:49.442297 2635 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 5 22:12:49.442466 kubelet[2635]: E0805 22:12:49.442309 2635 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 5 22:12:49.445715 kubelet[2635]: E0805 22:12:49.445646 2635 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 5 22:12:49.445715 kubelet[2635]: E0805 22:12:49.445669 2635 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 5 22:12:49.445715 kubelet[2635]: E0805 22:12:49.445696 2635 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 5 22:12:49.446045 kubelet[2635]: E0805 22:12:49.445895 2635 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 5 22:12:49.446045 kubelet[2635]: W0805 22:12:49.445911 2635 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 5 22:12:49.446045 kubelet[2635]: E0805 22:12:49.445939 2635 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 5 22:12:49.447832 kubelet[2635]: E0805 22:12:49.447806 2635 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 5 22:12:49.447832 kubelet[2635]: W0805 22:12:49.447822 2635 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 5 22:12:49.448049 kubelet[2635]: E0805 22:12:49.448025 2635 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 5 22:12:49.448049 kubelet[2635]: W0805 22:12:49.448040 2635 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 5 22:12:49.448107 kubelet[2635]: E0805 22:12:49.448053 2635 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from 
directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 5 22:12:49.448253 kubelet[2635]: E0805 22:12:49.448199 2635 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 5 22:12:49.448417 kubelet[2635]: E0805 22:12:49.448396 2635 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 5 22:12:49.448522 kubelet[2635]: W0805 22:12:49.448498 2635 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 5 22:12:49.448566 kubelet[2635]: E0805 22:12:49.448524 2635 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 5 22:12:49.449178 kubelet[2635]: E0805 22:12:49.449156 2635 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 5 22:12:49.449178 kubelet[2635]: W0805 22:12:49.449172 2635 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 5 22:12:49.449276 kubelet[2635]: E0805 22:12:49.449187 2635 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 5 22:12:49.450406 kubelet[2635]: E0805 22:12:49.450383 2635 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 5 22:12:49.450406 kubelet[2635]: W0805 22:12:49.450400 2635 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 5 22:12:49.450485 kubelet[2635]: E0805 22:12:49.450421 2635 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 5 22:12:49.462437 kubelet[2635]: E0805 22:12:49.462403 2635 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 5 22:12:49.462549 kubelet[2635]: W0805 22:12:49.462455 2635 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 5 22:12:49.462549 kubelet[2635]: E0805 22:12:49.462476 2635 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 5 22:12:50.741264 kubelet[2635]: E0805 22:12:50.740143 2635 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-ck8bx" podUID="2d1ea0ca-eeda-4305-a298-4df807e6a886" Aug 5 22:12:52.740607 kubelet[2635]: E0805 22:12:52.740577 2635 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-ck8bx" podUID="2d1ea0ca-eeda-4305-a298-4df807e6a886" Aug 5 22:12:52.911490 containerd[1444]: time="2024-08-05T22:12:52.911381457Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.28.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 5 22:12:52.914707 containerd[1444]: time="2024-08-05T22:12:52.914609580Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.28.0: active requests=0, bytes read=29458030" Aug 5 22:12:52.916370 containerd[1444]: time="2024-08-05T22:12:52.916287624Z" level=info msg="ImageCreate event name:\"sha256:a9372c0f51b54c589e5a16013ed3049b2a052dd6903d72603849fab2c4216fbc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 5 22:12:52.925092 containerd[1444]: time="2024-08-05T22:12:52.924997034Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:eff1501af12b7e27e2ef8f4e55d03d837bcb017aa5663e22e519059c452d51ed\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 5 22:12:52.926388 containerd[1444]: time="2024-08-05T22:12:52.926314277Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.28.0\" with image id \"sha256:a9372c0f51b54c589e5a16013ed3049b2a052dd6903d72603849fab2c4216fbc\", repo tag 
\"ghcr.io/flatcar/calico/typha:v3.28.0\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:eff1501af12b7e27e2ef8f4e55d03d837bcb017aa5663e22e519059c452d51ed\", size \"30905782\" in 3.517445168s" Aug 5 22:12:52.926565 containerd[1444]: time="2024-08-05T22:12:52.926387287Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.28.0\" returns image reference \"sha256:a9372c0f51b54c589e5a16013ed3049b2a052dd6903d72603849fab2c4216fbc\"" Aug 5 22:12:52.929475 containerd[1444]: time="2024-08-05T22:12:52.928970324Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.28.0\"" Aug 5 22:12:52.979630 containerd[1444]: time="2024-08-05T22:12:52.978865824Z" level=info msg="CreateContainer within sandbox \"57ba4be255acc58e96253187a5e2ea4f3b3e7eda28c3e5dd3d7b0f67986e6297\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Aug 5 22:12:53.023418 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1191759512.mount: Deactivated successfully. Aug 5 22:12:53.047895 containerd[1444]: time="2024-08-05T22:12:53.047830832Z" level=info msg="CreateContainer within sandbox \"57ba4be255acc58e96253187a5e2ea4f3b3e7eda28c3e5dd3d7b0f67986e6297\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"ed2469c5787d99e23ec4d33409e4b682cb81574b72e1c6862573a3f969fcd062\"" Aug 5 22:12:53.050025 containerd[1444]: time="2024-08-05T22:12:53.048579134Z" level=info msg="StartContainer for \"ed2469c5787d99e23ec4d33409e4b682cb81574b72e1c6862573a3f969fcd062\"" Aug 5 22:12:53.101440 systemd[1]: Started cri-containerd-ed2469c5787d99e23ec4d33409e4b682cb81574b72e1c6862573a3f969fcd062.scope - libcontainer container ed2469c5787d99e23ec4d33409e4b682cb81574b72e1c6862573a3f969fcd062. 
Aug 5 22:12:53.190448 containerd[1444]: time="2024-08-05T22:12:53.190395358Z" level=info msg="StartContainer for \"ed2469c5787d99e23ec4d33409e4b682cb81574b72e1c6862573a3f969fcd062\" returns successfully" Aug 5 22:12:53.895678 containerd[1444]: time="2024-08-05T22:12:53.895442992Z" level=info msg="StopContainer for \"ed2469c5787d99e23ec4d33409e4b682cb81574b72e1c6862573a3f969fcd062\" with timeout 300 (s)" Aug 5 22:12:53.902763 containerd[1444]: time="2024-08-05T22:12:53.902660175Z" level=info msg="Stop container \"ed2469c5787d99e23ec4d33409e4b682cb81574b72e1c6862573a3f969fcd062\" with signal terminated" Aug 5 22:12:53.930356 kubelet[2635]: I0805 22:12:53.930284 2635 pod_startup_latency_tracker.go:102] "Observed pod startup duration" pod="calico-system/calico-typha-5859df9dd8-qnrkm" podStartSLOduration=2.410986705 podStartE2EDuration="5.930138735s" podCreationTimestamp="2024-08-05 22:12:48 +0000 UTC" firstStartedPulling="2024-08-05 22:12:49.40837499 +0000 UTC m=+21.113595378" lastFinishedPulling="2024-08-05 22:12:52.92752698 +0000 UTC m=+24.632747408" observedRunningTime="2024-08-05 22:12:53.929486757 +0000 UTC m=+25.634707155" watchObservedRunningTime="2024-08-05 22:12:53.930138735 +0000 UTC m=+25.635359123" Aug 5 22:12:53.938793 systemd[1]: cri-containerd-ed2469c5787d99e23ec4d33409e4b682cb81574b72e1c6862573a3f969fcd062.scope: Deactivated successfully. Aug 5 22:12:53.985803 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-ed2469c5787d99e23ec4d33409e4b682cb81574b72e1c6862573a3f969fcd062-rootfs.mount: Deactivated successfully. 
Aug 5 22:12:54.314274 containerd[1444]: time="2024-08-05T22:12:54.313121470Z" level=info msg="shim disconnected" id=ed2469c5787d99e23ec4d33409e4b682cb81574b72e1c6862573a3f969fcd062 namespace=k8s.io Aug 5 22:12:54.314274 containerd[1444]: time="2024-08-05T22:12:54.313334618Z" level=warning msg="cleaning up after shim disconnected" id=ed2469c5787d99e23ec4d33409e4b682cb81574b72e1c6862573a3f969fcd062 namespace=k8s.io Aug 5 22:12:54.314274 containerd[1444]: time="2024-08-05T22:12:54.313363403Z" level=info msg="cleaning up dead shim" namespace=k8s.io Aug 5 22:12:54.366742 containerd[1444]: time="2024-08-05T22:12:54.366655647Z" level=info msg="StopContainer for \"ed2469c5787d99e23ec4d33409e4b682cb81574b72e1c6862573a3f969fcd062\" returns successfully" Aug 5 22:12:54.370252 containerd[1444]: time="2024-08-05T22:12:54.368282367Z" level=info msg="StopPodSandbox for \"57ba4be255acc58e96253187a5e2ea4f3b3e7eda28c3e5dd3d7b0f67986e6297\"" Aug 5 22:12:54.370252 containerd[1444]: time="2024-08-05T22:12:54.368375646Z" level=info msg="Container to stop \"ed2469c5787d99e23ec4d33409e4b682cb81574b72e1c6862573a3f969fcd062\" must be in running or unknown state, current state \"CONTAINER_EXITED\"" Aug 5 22:12:54.373312 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-57ba4be255acc58e96253187a5e2ea4f3b3e7eda28c3e5dd3d7b0f67986e6297-shm.mount: Deactivated successfully. Aug 5 22:12:54.388590 systemd[1]: cri-containerd-57ba4be255acc58e96253187a5e2ea4f3b3e7eda28c3e5dd3d7b0f67986e6297.scope: Deactivated successfully. Aug 5 22:12:54.430453 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-57ba4be255acc58e96253187a5e2ea4f3b3e7eda28c3e5dd3d7b0f67986e6297-rootfs.mount: Deactivated successfully. 
Aug 5 22:12:54.435523 containerd[1444]: time="2024-08-05T22:12:54.435391438Z" level=info msg="shim disconnected" id=57ba4be255acc58e96253187a5e2ea4f3b3e7eda28c3e5dd3d7b0f67986e6297 namespace=k8s.io Aug 5 22:12:54.435523 containerd[1444]: time="2024-08-05T22:12:54.435462964Z" level=warning msg="cleaning up after shim disconnected" id=57ba4be255acc58e96253187a5e2ea4f3b3e7eda28c3e5dd3d7b0f67986e6297 namespace=k8s.io Aug 5 22:12:54.435523 containerd[1444]: time="2024-08-05T22:12:54.435473605Z" level=info msg="cleaning up dead shim" namespace=k8s.io Aug 5 22:12:54.457649 containerd[1444]: time="2024-08-05T22:12:54.457262547Z" level=info msg="TearDown network for sandbox \"57ba4be255acc58e96253187a5e2ea4f3b3e7eda28c3e5dd3d7b0f67986e6297\" successfully" Aug 5 22:12:54.457649 containerd[1444]: time="2024-08-05T22:12:54.457322762Z" level=info msg="StopPodSandbox for \"57ba4be255acc58e96253187a5e2ea4f3b3e7eda28c3e5dd3d7b0f67986e6297\" returns successfully" Aug 5 22:12:54.490186 kubelet[2635]: I0805 22:12:54.490125 2635 topology_manager.go:215] "Topology Admit Handler" podUID="47121d65-83bd-4de1-9adc-490123f80534" podNamespace="calico-system" podName="calico-typha-647446587d-j8scq" Aug 5 22:12:54.490186 kubelet[2635]: E0805 22:12:54.490198 2635 cpu_manager.go:395] "RemoveStaleState: removing container" podUID="f2ff3fd7-525a-47a9-95a1-74f7bff921c4" containerName="calico-typha" Aug 5 22:12:54.490483 kubelet[2635]: I0805 22:12:54.490260 2635 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2ff3fd7-525a-47a9-95a1-74f7bff921c4" containerName="calico-typha" Aug 5 22:12:54.501062 systemd[1]: Created slice kubepods-besteffort-pod47121d65_83bd_4de1_9adc_490123f80534.slice - libcontainer container kubepods-besteffort-pod47121d65_83bd_4de1_9adc_490123f80534.slice. 
Aug 5 22:12:54.546333 kubelet[2635]: E0805 22:12:54.546294 2635 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 5 22:12:54.546333 kubelet[2635]: W0805 22:12:54.546318 2635 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 5 22:12:54.546678 kubelet[2635]: E0805 22:12:54.546650 2635 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 5 22:12:54.547828 kubelet[2635]: E0805 22:12:54.547770 2635 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 5 22:12:54.547828 kubelet[2635]: W0805 22:12:54.547821 2635 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 5 22:12:54.547921 kubelet[2635]: E0805 22:12:54.547850 2635 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 5 22:12:54.549277 kubelet[2635]: E0805 22:12:54.549204 2635 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 5 22:12:54.549277 kubelet[2635]: W0805 22:12:54.549247 2635 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 5 22:12:54.549393 kubelet[2635]: E0805 22:12:54.549320 2635 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 5 22:12:54.550641 kubelet[2635]: E0805 22:12:54.550342 2635 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 5 22:12:54.550641 kubelet[2635]: W0805 22:12:54.550359 2635 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 5 22:12:54.550641 kubelet[2635]: E0805 22:12:54.550375 2635 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 5 22:12:54.550904 kubelet[2635]: E0805 22:12:54.550879 2635 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 5 22:12:54.550955 kubelet[2635]: W0805 22:12:54.550897 2635 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 5 22:12:54.550955 kubelet[2635]: E0805 22:12:54.550924 2635 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 5 22:12:54.551190 kubelet[2635]: E0805 22:12:54.551167 2635 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 5 22:12:54.551190 kubelet[2635]: W0805 22:12:54.551181 2635 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 5 22:12:54.551335 kubelet[2635]: E0805 22:12:54.551195 2635 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 5 22:12:54.551642 kubelet[2635]: E0805 22:12:54.551618 2635 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 5 22:12:54.551642 kubelet[2635]: W0805 22:12:54.551633 2635 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 5 22:12:54.551642 kubelet[2635]: E0805 22:12:54.551646 2635 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 5 22:12:54.553605 kubelet[2635]: E0805 22:12:54.553574 2635 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 5 22:12:54.553605 kubelet[2635]: W0805 22:12:54.553595 2635 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 5 22:12:54.553605 kubelet[2635]: E0805 22:12:54.553613 2635 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 5 22:12:54.553923 kubelet[2635]: E0805 22:12:54.553898 2635 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 5 22:12:54.553923 kubelet[2635]: W0805 22:12:54.553915 2635 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 5 22:12:54.553996 kubelet[2635]: E0805 22:12:54.553929 2635 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 5 22:12:54.554171 kubelet[2635]: E0805 22:12:54.554147 2635 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 5 22:12:54.554171 kubelet[2635]: W0805 22:12:54.554163 2635 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 5 22:12:54.554324 kubelet[2635]: E0805 22:12:54.554176 2635 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 5 22:12:54.554522 kubelet[2635]: E0805 22:12:54.554498 2635 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 5 22:12:54.554522 kubelet[2635]: W0805 22:12:54.554514 2635 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 5 22:12:54.554597 kubelet[2635]: E0805 22:12:54.554538 2635 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 5 22:12:54.554788 kubelet[2635]: E0805 22:12:54.554742 2635 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 5 22:12:54.554788 kubelet[2635]: W0805 22:12:54.554781 2635 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 5 22:12:54.554858 kubelet[2635]: E0805 22:12:54.554797 2635 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 5 22:12:54.641276 kubelet[2635]: E0805 22:12:54.640914 2635 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 5 22:12:54.641276 kubelet[2635]: W0805 22:12:54.640941 2635 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 5 22:12:54.641276 kubelet[2635]: E0805 22:12:54.640965 2635 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 5 22:12:54.641276 kubelet[2635]: I0805 22:12:54.641104 2635 reconciler_common.go:172] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rj47n\" (UniqueName: \"kubernetes.io/projected/f2ff3fd7-525a-47a9-95a1-74f7bff921c4-kube-api-access-rj47n\") pod \"f2ff3fd7-525a-47a9-95a1-74f7bff921c4\" (UID: \"f2ff3fd7-525a-47a9-95a1-74f7bff921c4\") " Aug 5 22:12:54.642944 kubelet[2635]: E0805 22:12:54.642921 2635 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 5 22:12:54.642944 kubelet[2635]: W0805 22:12:54.642941 2635 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 5 22:12:54.643488 kubelet[2635]: E0805 22:12:54.642968 2635 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 5 22:12:54.643488 kubelet[2635]: I0805 22:12:54.642998 2635 reconciler_common.go:172] "operationExecutor.UnmountVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f2ff3fd7-525a-47a9-95a1-74f7bff921c4-tigera-ca-bundle\") pod \"f2ff3fd7-525a-47a9-95a1-74f7bff921c4\" (UID: \"f2ff3fd7-525a-47a9-95a1-74f7bff921c4\") " Aug 5 22:12:54.643488 kubelet[2635]: E0805 22:12:54.643384 2635 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 5 22:12:54.643488 kubelet[2635]: W0805 22:12:54.643395 2635 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 5 22:12:54.643488 kubelet[2635]: E0805 22:12:54.643410 2635 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 5 22:12:54.643488 kubelet[2635]: I0805 22:12:54.643440 2635 reconciler_common.go:172] "operationExecutor.UnmountVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/f2ff3fd7-525a-47a9-95a1-74f7bff921c4-typha-certs\") pod \"f2ff3fd7-525a-47a9-95a1-74f7bff921c4\" (UID: \"f2ff3fd7-525a-47a9-95a1-74f7bff921c4\") " Aug 5 22:12:54.644167 kubelet[2635]: E0805 22:12:54.644122 2635 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 5 22:12:54.644167 kubelet[2635]: W0805 22:12:54.644138 2635 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 5 22:12:54.644167 kubelet[2635]: E0805 22:12:54.644151 2635 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 5 22:12:54.644698 kubelet[2635]: I0805 22:12:54.644173 2635 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/47121d65-83bd-4de1-9adc-490123f80534-tigera-ca-bundle\") pod \"calico-typha-647446587d-j8scq\" (UID: \"47121d65-83bd-4de1-9adc-490123f80534\") " pod="calico-system/calico-typha-647446587d-j8scq" Aug 5 22:12:54.647264 kubelet[2635]: E0805 22:12:54.645346 2635 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 5 22:12:54.647264 kubelet[2635]: W0805 22:12:54.645364 2635 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 5 22:12:54.648487 kubelet[2635]: E0805 22:12:54.648463 2635 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 5 22:12:54.648487 kubelet[2635]: W0805 22:12:54.648481 2635 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 5 22:12:54.648578 kubelet[2635]: E0805 22:12:54.648497 2635 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 5 22:12:54.657587 kubelet[2635]: E0805 22:12:54.657497 2635 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 5 22:12:54.657835 systemd[1]: var-lib-kubelet-pods-f2ff3fd7\x2d525a\x2d47a9\x2d95a1\x2d74f7bff921c4-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2drj47n.mount: Deactivated successfully. 
Aug 5 22:12:54.659636 kubelet[2635]: W0805 22:12:54.658696 2635 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Aug 5 22:12:54.659636 kubelet[2635]: E0805 22:12:54.658740 2635 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Aug 5 22:12:54.663608 kubelet[2635]: E0805 22:12:54.663542 2635 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Aug 5 22:12:54.663608 kubelet[2635]: I0805 22:12:54.663603 2635 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/47121d65-83bd-4de1-9adc-490123f80534-typha-certs\") pod \"calico-typha-647446587d-j8scq\" (UID: \"47121d65-83bd-4de1-9adc-490123f80534\") " pod="calico-system/calico-typha-647446587d-j8scq"
Aug 5 22:12:54.664375 kubelet[2635]: I0805 22:12:54.664345 2635 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2ff3fd7-525a-47a9-95a1-74f7bff921c4-kube-api-access-rj47n" (OuterVolumeSpecName: "kube-api-access-rj47n") pod "f2ff3fd7-525a-47a9-95a1-74f7bff921c4" (UID: "f2ff3fd7-525a-47a9-95a1-74f7bff921c4"). InnerVolumeSpecName "kube-api-access-rj47n". PluginName "kubernetes.io/projected", VolumeGidValue ""
Aug 5 22:12:54.671037 kubelet[2635]: E0805 22:12:54.670324 2635 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Aug 5 22:12:54.671037 kubelet[2635]: W0805 22:12:54.670348 2635 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Aug 5 22:12:54.671037 kubelet[2635]: E0805 22:12:54.670376 2635 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Aug 5 22:12:54.671037 kubelet[2635]: I0805 22:12:54.670412 2635 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sghkp\" (UniqueName: \"kubernetes.io/projected/47121d65-83bd-4de1-9adc-490123f80534-kube-api-access-sghkp\") pod \"calico-typha-647446587d-j8scq\" (UID: \"47121d65-83bd-4de1-9adc-490123f80534\") " pod="calico-system/calico-typha-647446587d-j8scq"
Aug 5 22:12:54.671037 kubelet[2635]: I0805 22:12:54.670458 2635 reconciler_common.go:300] "Volume detached for volume \"kube-api-access-rj47n\" (UniqueName: \"kubernetes.io/projected/f2ff3fd7-525a-47a9-95a1-74f7bff921c4-kube-api-access-rj47n\") on node \"ci-3975-2-0-1-de7b5ef465.novalocal\" DevicePath \"\""
Aug 5 22:12:54.673343 kubelet[2635]: E0805 22:12:54.672582 2635 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Aug 5 22:12:54.673343 kubelet[2635]: W0805 22:12:54.672597 2635 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Aug 5 22:12:54.673343 kubelet[2635]: E0805 22:12:54.672698 2635 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Aug 5 22:12:54.673343 kubelet[2635]: E0805 22:12:54.673124 2635 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Aug 5 22:12:54.673343 kubelet[2635]: W0805 22:12:54.673133 2635 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Aug 5 22:12:54.673343 kubelet[2635]: E0805 22:12:54.673241 2635 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Aug 5 22:12:54.673747 kubelet[2635]: E0805 22:12:54.673479 2635 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Aug 5 22:12:54.673747 kubelet[2635]: W0805 22:12:54.673488 2635 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Aug 5 22:12:54.675787 kubelet[2635]: E0805 22:12:54.675246 2635 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Aug 5 22:12:54.675787 kubelet[2635]: E0805 22:12:54.675394 2635 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Aug 5 22:12:54.675787 kubelet[2635]: W0805 22:12:54.675402 2635 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Aug 5 22:12:54.675787 kubelet[2635]: E0805 22:12:54.675414 2635 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Aug 5 22:12:54.675787 kubelet[2635]: E0805 22:12:54.675565 2635 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Aug 5 22:12:54.675787 kubelet[2635]: W0805 22:12:54.675573 2635 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Aug 5 22:12:54.675787 kubelet[2635]: E0805 22:12:54.675585 2635 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Aug 5 22:12:54.675787 kubelet[2635]: I0805 22:12:54.675527 2635 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2ff3fd7-525a-47a9-95a1-74f7bff921c4-typha-certs" (OuterVolumeSpecName: "typha-certs") pod "f2ff3fd7-525a-47a9-95a1-74f7bff921c4" (UID: "f2ff3fd7-525a-47a9-95a1-74f7bff921c4"). InnerVolumeSpecName "typha-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Aug 5 22:12:54.675787 kubelet[2635]: E0805 22:12:54.675716 2635 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Aug 5 22:12:54.676078 kubelet[2635]: W0805 22:12:54.675724 2635 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Aug 5 22:12:54.676078 kubelet[2635]: E0805 22:12:54.675735 2635 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Aug 5 22:12:54.683330 kubelet[2635]: E0805 22:12:54.683260 2635 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Aug 5 22:12:54.683820 kubelet[2635]: W0805 22:12:54.683747 2635 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Aug 5 22:12:54.684318 kubelet[2635]: E0805 22:12:54.684301 2635 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Aug 5 22:12:54.686298 kubelet[2635]: I0805 22:12:54.686256 2635 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f2ff3fd7-525a-47a9-95a1-74f7bff921c4-tigera-ca-bundle" (OuterVolumeSpecName: "tigera-ca-bundle") pod "f2ff3fd7-525a-47a9-95a1-74f7bff921c4" (UID: "f2ff3fd7-525a-47a9-95a1-74f7bff921c4"). InnerVolumeSpecName "tigera-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Aug 5 22:12:54.741354 kubelet[2635]: E0805 22:12:54.741311 2635 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-ck8bx" podUID="2d1ea0ca-eeda-4305-a298-4df807e6a886"
Aug 5 22:12:54.752717 systemd[1]: Removed slice kubepods-besteffort-podf2ff3fd7_525a_47a9_95a1_74f7bff921c4.slice - libcontainer container kubepods-besteffort-podf2ff3fd7_525a_47a9_95a1_74f7bff921c4.slice.
Aug 5 22:12:54.772792 kubelet[2635]: E0805 22:12:54.772572 2635 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Aug 5 22:12:54.772792 kubelet[2635]: W0805 22:12:54.772599 2635 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Aug 5 22:12:54.772993 kubelet[2635]: E0805 22:12:54.772624 2635 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Aug 5 22:12:54.774580 kubelet[2635]: E0805 22:12:54.773178 2635 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Aug 5 22:12:54.774580 kubelet[2635]: W0805 22:12:54.773194 2635 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Aug 5 22:12:54.774580 kubelet[2635]: E0805 22:12:54.773209 2635 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Aug 5 22:12:54.774580 kubelet[2635]: E0805 22:12:54.773405 2635 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Aug 5 22:12:54.774580 kubelet[2635]: W0805 22:12:54.773414 2635 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Aug 5 22:12:54.774580 kubelet[2635]: E0805 22:12:54.773429 2635 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Aug 5 22:12:54.774580 kubelet[2635]: I0805 22:12:54.773468 2635 reconciler_common.go:300] "Volume detached for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f2ff3fd7-525a-47a9-95a1-74f7bff921c4-tigera-ca-bundle\") on node \"ci-3975-2-0-1-de7b5ef465.novalocal\" DevicePath \"\""
Aug 5 22:12:54.774580 kubelet[2635]: I0805 22:12:54.773483 2635 reconciler_common.go:300] "Volume detached for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/f2ff3fd7-525a-47a9-95a1-74f7bff921c4-typha-certs\") on node \"ci-3975-2-0-1-de7b5ef465.novalocal\" DevicePath \"\""
Aug 5 22:12:54.774580 kubelet[2635]: E0805 22:12:54.773727 2635 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Aug 5 22:12:54.774580 kubelet[2635]: W0805 22:12:54.773735 2635 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Aug 5 22:12:54.775040 kubelet[2635]: E0805 22:12:54.773748 2635 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Aug 5 22:12:54.775040 kubelet[2635]: E0805 22:12:54.773882 2635 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Aug 5 22:12:54.775040 kubelet[2635]: W0805 22:12:54.773890 2635 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Aug 5 22:12:54.775040 kubelet[2635]: E0805 22:12:54.773904 2635 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Aug 5 22:12:54.775040 kubelet[2635]: E0805 22:12:54.774030 2635 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Aug 5 22:12:54.775040 kubelet[2635]: W0805 22:12:54.774038 2635 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Aug 5 22:12:54.775040 kubelet[2635]: E0805 22:12:54.774049 2635 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Aug 5 22:12:54.775040 kubelet[2635]: E0805 22:12:54.774205 2635 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Aug 5 22:12:54.775040 kubelet[2635]: W0805 22:12:54.774214 2635 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Aug 5 22:12:54.775040 kubelet[2635]: E0805 22:12:54.774250 2635 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Aug 5 22:12:54.776675 kubelet[2635]: E0805 22:12:54.775935 2635 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Aug 5 22:12:54.776675 kubelet[2635]: W0805 22:12:54.775958 2635 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Aug 5 22:12:54.776675 kubelet[2635]: E0805 22:12:54.776005 2635 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Aug 5 22:12:54.777601 kubelet[2635]: E0805 22:12:54.777436 2635 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Aug 5 22:12:54.777601 kubelet[2635]: W0805 22:12:54.777476 2635 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Aug 5 22:12:54.777601 kubelet[2635]: E0805 22:12:54.777576 2635 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Aug 5 22:12:54.778727 kubelet[2635]: E0805 22:12:54.778324 2635 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Aug 5 22:12:54.778727 kubelet[2635]: W0805 22:12:54.778336 2635 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Aug 5 22:12:54.778727 kubelet[2635]: E0805 22:12:54.778359 2635 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Aug 5 22:12:54.779884 kubelet[2635]: E0805 22:12:54.779377 2635 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Aug 5 22:12:54.779884 kubelet[2635]: W0805 22:12:54.779390 2635 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Aug 5 22:12:54.779884 kubelet[2635]: E0805 22:12:54.779538 2635 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Aug 5 22:12:54.779884 kubelet[2635]: E0805 22:12:54.779735 2635 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Aug 5 22:12:54.779884 kubelet[2635]: W0805 22:12:54.779770 2635 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Aug 5 22:12:54.781316 kubelet[2635]: E0805 22:12:54.780714 2635 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Aug 5 22:12:54.781992 kubelet[2635]: E0805 22:12:54.781824 2635 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Aug 5 22:12:54.781992 kubelet[2635]: W0805 22:12:54.781837 2635 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Aug 5 22:12:54.781992 kubelet[2635]: E0805 22:12:54.781881 2635 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Aug 5 22:12:54.783482 kubelet[2635]: E0805 22:12:54.783064 2635 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Aug 5 22:12:54.783482 kubelet[2635]: W0805 22:12:54.783174 2635 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Aug 5 22:12:54.783482 kubelet[2635]: E0805 22:12:54.783199 2635 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Aug 5 22:12:54.785341 kubelet[2635]: E0805 22:12:54.784188 2635 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Aug 5 22:12:54.785341 kubelet[2635]: W0805 22:12:54.785252 2635 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Aug 5 22:12:54.785341 kubelet[2635]: E0805 22:12:54.785282 2635 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Aug 5 22:12:54.785730 kubelet[2635]: E0805 22:12:54.785530 2635 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Aug 5 22:12:54.785730 kubelet[2635]: W0805 22:12:54.785560 2635 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Aug 5 22:12:54.785730 kubelet[2635]: E0805 22:12:54.785582 2635 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Aug 5 22:12:54.794273 kubelet[2635]: E0805 22:12:54.794125 2635 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Aug 5 22:12:54.794273 kubelet[2635]: W0805 22:12:54.794150 2635 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Aug 5 22:12:54.795417 kubelet[2635]: E0805 22:12:54.794618 2635 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Aug 5 22:12:54.800888 kubelet[2635]: E0805 22:12:54.800601 2635 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Aug 5 22:12:54.800888 kubelet[2635]: W0805 22:12:54.800726 2635 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Aug 5 22:12:54.800888 kubelet[2635]: E0805 22:12:54.800751 2635 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Aug 5 22:12:54.806908 containerd[1444]: time="2024-08-05T22:12:54.806673874Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-647446587d-j8scq,Uid:47121d65-83bd-4de1-9adc-490123f80534,Namespace:calico-system,Attempt:0,}"
Aug 5 22:12:54.862376 containerd[1444]: time="2024-08-05T22:12:54.862094620Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Aug 5 22:12:54.862376 containerd[1444]: time="2024-08-05T22:12:54.862163000Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Aug 5 22:12:54.862376 containerd[1444]: time="2024-08-05T22:12:54.862186695Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Aug 5 22:12:54.862376 containerd[1444]: time="2024-08-05T22:12:54.862205021Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Aug 5 22:12:54.895023 systemd[1]: Started cri-containerd-99d43c3a74438fcd98eb4212a8817a2eabdac6085f9502c2f227947c928a3dfd.scope - libcontainer container 99d43c3a74438fcd98eb4212a8817a2eabdac6085f9502c2f227947c928a3dfd.
Aug 5 22:12:54.901583 kubelet[2635]: I0805 22:12:54.901087 2635 scope.go:117] "RemoveContainer" containerID="ed2469c5787d99e23ec4d33409e4b682cb81574b72e1c6862573a3f969fcd062"
Aug 5 22:12:54.904779 containerd[1444]: time="2024-08-05T22:12:54.904134573Z" level=info msg="RemoveContainer for \"ed2469c5787d99e23ec4d33409e4b682cb81574b72e1c6862573a3f969fcd062\""
Aug 5 22:12:54.920601 containerd[1444]: time="2024-08-05T22:12:54.920541802Z" level=info msg="RemoveContainer for \"ed2469c5787d99e23ec4d33409e4b682cb81574b72e1c6862573a3f969fcd062\" returns successfully"
Aug 5 22:12:54.923066 kubelet[2635]: I0805 22:12:54.921768 2635 scope.go:117] "RemoveContainer" containerID="ed2469c5787d99e23ec4d33409e4b682cb81574b72e1c6862573a3f969fcd062"
Aug 5 22:12:54.924627 containerd[1444]: time="2024-08-05T22:12:54.924574455Z" level=error msg="ContainerStatus for \"ed2469c5787d99e23ec4d33409e4b682cb81574b72e1c6862573a3f969fcd062\" failed" error="rpc error: code = NotFound desc = an error occurred when try to find container \"ed2469c5787d99e23ec4d33409e4b682cb81574b72e1c6862573a3f969fcd062\": not found"
Aug 5 22:12:54.924997 kubelet[2635]: E0805 22:12:54.924873 2635 remote_runtime.go:432] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = an error occurred when try to find container \"ed2469c5787d99e23ec4d33409e4b682cb81574b72e1c6862573a3f969fcd062\": not found" containerID="ed2469c5787d99e23ec4d33409e4b682cb81574b72e1c6862573a3f969fcd062"
Aug 5 22:12:54.924997 kubelet[2635]: I0805 22:12:54.924964 2635 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"containerd","ID":"ed2469c5787d99e23ec4d33409e4b682cb81574b72e1c6862573a3f969fcd062"} err="failed to get container status \"ed2469c5787d99e23ec4d33409e4b682cb81574b72e1c6862573a3f969fcd062\": rpc error: code = NotFound desc = an error occurred when try to find container \"ed2469c5787d99e23ec4d33409e4b682cb81574b72e1c6862573a3f969fcd062\": not found"
Aug 5 22:12:54.969670 systemd[1]: var-lib-kubelet-pods-f2ff3fd7\x2d525a\x2d47a9\x2d95a1\x2d74f7bff921c4-volume\x2dsubpaths-tigera\x2dca\x2dbundle-calico\x2dtypha-1.mount: Deactivated successfully.
Aug 5 22:12:54.969782 systemd[1]: var-lib-kubelet-pods-f2ff3fd7\x2d525a\x2d47a9\x2d95a1\x2d74f7bff921c4-volumes-kubernetes.io\x7esecret-typha\x2dcerts.mount: Deactivated successfully.
Aug 5 22:12:55.017209 containerd[1444]: time="2024-08-05T22:12:55.017068646Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-647446587d-j8scq,Uid:47121d65-83bd-4de1-9adc-490123f80534,Namespace:calico-system,Attempt:0,} returns sandbox id \"99d43c3a74438fcd98eb4212a8817a2eabdac6085f9502c2f227947c928a3dfd\""
Aug 5 22:12:55.041505 containerd[1444]: time="2024-08-05T22:12:55.041186224Z" level=info msg="CreateContainer within sandbox \"99d43c3a74438fcd98eb4212a8817a2eabdac6085f9502c2f227947c928a3dfd\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}"
Aug 5 22:12:55.056345 containerd[1444]: time="2024-08-05T22:12:55.056259053Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.28.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Aug 5 22:12:55.059305 containerd[1444]: time="2024-08-05T22:12:55.059249588Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.28.0: active requests=0, bytes read=5140568"
Aug 5 22:12:55.061291 containerd[1444]: time="2024-08-05T22:12:55.060400038Z" level=info msg="ImageCreate event name:\"sha256:587b28ecfc62e2a60919e6a39f9b25be37c77da99d8c84252716fa3a49a171b9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Aug 5 22:12:55.068024 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount193152358.mount: Deactivated successfully.
Aug 5 22:12:55.069662 containerd[1444]: time="2024-08-05T22:12:55.069331860Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:e57c9db86f1cee1ae6f41257eed1ee2f363783177809217a2045502a09cf7cee\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Aug 5 22:12:55.070850 containerd[1444]: time="2024-08-05T22:12:55.070809684Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.28.0\" with image id \"sha256:587b28ecfc62e2a60919e6a39f9b25be37c77da99d8c84252716fa3a49a171b9\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.28.0\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:e57c9db86f1cee1ae6f41257eed1ee2f363783177809217a2045502a09cf7cee\", size \"6588288\" in 2.141762313s"
Aug 5 22:12:55.071003 containerd[1444]: time="2024-08-05T22:12:55.070985009Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.28.0\" returns image reference \"sha256:587b28ecfc62e2a60919e6a39f9b25be37c77da99d8c84252716fa3a49a171b9\""
Aug 5 22:12:55.072632 containerd[1444]: time="2024-08-05T22:12:55.072595097Z" level=info msg="CreateContainer within sandbox \"99d43c3a74438fcd98eb4212a8817a2eabdac6085f9502c2f227947c928a3dfd\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"42e88ce9dd4f62a39c99fac380a0bdf63a39130a924052f61ff5f2c7f9874765\""
Aug 5 22:12:55.073264 containerd[1444]: time="2024-08-05T22:12:55.073214160Z" level=info msg="StartContainer for \"42e88ce9dd4f62a39c99fac380a0bdf63a39130a924052f61ff5f2c7f9874765\""
Aug 5 22:12:55.074473 containerd[1444]: time="2024-08-05T22:12:55.074383375Z" level=info msg="CreateContainer within sandbox \"de33a8626cda7c4dc930033f9a51a27269999ea189861abee56825a2b75599ce\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}"
Aug 5 22:12:55.115207 containerd[1444]: time="2024-08-05T22:12:55.115153620Z" level=info msg="CreateContainer within sandbox \"de33a8626cda7c4dc930033f9a51a27269999ea189861abee56825a2b75599ce\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"d2c772f98dbb352bf97516c31814ab886a8ff2758f3c3a8208401e8f312811a2\""
Aug 5 22:12:55.115530 systemd[1]: Started cri-containerd-42e88ce9dd4f62a39c99fac380a0bdf63a39130a924052f61ff5f2c7f9874765.scope - libcontainer container 42e88ce9dd4f62a39c99fac380a0bdf63a39130a924052f61ff5f2c7f9874765.
Aug 5 22:12:55.116194 containerd[1444]: time="2024-08-05T22:12:55.116144154Z" level=info msg="StartContainer for \"d2c772f98dbb352bf97516c31814ab886a8ff2758f3c3a8208401e8f312811a2\""
Aug 5 22:12:55.161392 systemd[1]: Started cri-containerd-d2c772f98dbb352bf97516c31814ab886a8ff2758f3c3a8208401e8f312811a2.scope - libcontainer container d2c772f98dbb352bf97516c31814ab886a8ff2758f3c3a8208401e8f312811a2.
Aug 5 22:12:55.197663 containerd[1444]: time="2024-08-05T22:12:55.197525202Z" level=info msg="StartContainer for \"42e88ce9dd4f62a39c99fac380a0bdf63a39130a924052f61ff5f2c7f9874765\" returns successfully"
Aug 5 22:12:55.224760 containerd[1444]: time="2024-08-05T22:12:55.224638836Z" level=info msg="StartContainer for \"d2c772f98dbb352bf97516c31814ab886a8ff2758f3c3a8208401e8f312811a2\" returns successfully"
Aug 5 22:12:55.263476 systemd[1]: cri-containerd-d2c772f98dbb352bf97516c31814ab886a8ff2758f3c3a8208401e8f312811a2.scope: Deactivated successfully.
Aug 5 22:12:55.313944 containerd[1444]: time="2024-08-05T22:12:55.313430117Z" level=info msg="shim disconnected" id=d2c772f98dbb352bf97516c31814ab886a8ff2758f3c3a8208401e8f312811a2 namespace=k8s.io
Aug 5 22:12:55.313944 containerd[1444]: time="2024-08-05T22:12:55.313933789Z" level=warning msg="cleaning up after shim disconnected" id=d2c772f98dbb352bf97516c31814ab886a8ff2758f3c3a8208401e8f312811a2 namespace=k8s.io
Aug 5 22:12:55.313944 containerd[1444]: time="2024-08-05T22:12:55.313946974Z" level=info msg="cleaning up dead shim" namespace=k8s.io
Aug 5 22:12:55.338043 containerd[1444]: time="2024-08-05T22:12:55.337972946Z" level=warning msg="cleanup warnings time=\"2024-08-05T22:12:55Z\" level=warning msg=\"failed to remove runc container\" error=\"runc did not terminate successfully: exit status 255: \" runtime=io.containerd.runc.v2\n" namespace=k8s.io
Aug 5 22:12:55.920369 containerd[1444]: time="2024-08-05T22:12:55.918686352Z" level=info msg="StopPodSandbox for \"de33a8626cda7c4dc930033f9a51a27269999ea189861abee56825a2b75599ce\""
Aug 5 22:12:55.920369 containerd[1444]: time="2024-08-05T22:12:55.918787695Z" level=info msg="Container to stop \"d2c772f98dbb352bf97516c31814ab886a8ff2758f3c3a8208401e8f312811a2\" must be in running or unknown state, current state \"CONTAINER_EXITED\""
Aug 5 22:12:55.944917 systemd[1]: cri-containerd-de33a8626cda7c4dc930033f9a51a27269999ea189861abee56825a2b75599ce.scope: Deactivated successfully.
Aug 5 22:12:55.970092 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-de33a8626cda7c4dc930033f9a51a27269999ea189861abee56825a2b75599ce-shm.mount: Deactivated successfully.
Aug 5 22:12:56.018867 kubelet[2635]: I0805 22:12:56.018730 2635 pod_startup_latency_tracker.go:102] "Observed pod startup duration" pod="calico-system/calico-typha-647446587d-j8scq" podStartSLOduration=7.018637273 podStartE2EDuration="7.018637273s" podCreationTimestamp="2024-08-05 22:12:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2024-08-05 22:12:55.954165635 +0000 UTC m=+27.659386083" watchObservedRunningTime="2024-08-05 22:12:56.018637273 +0000 UTC m=+27.723857671"
Aug 5 22:12:56.022456 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-de33a8626cda7c4dc930033f9a51a27269999ea189861abee56825a2b75599ce-rootfs.mount: Deactivated successfully.
Aug 5 22:12:56.027754 containerd[1444]: time="2024-08-05T22:12:56.027477560Z" level=info msg="shim disconnected" id=de33a8626cda7c4dc930033f9a51a27269999ea189861abee56825a2b75599ce namespace=k8s.io
Aug 5 22:12:56.027754 containerd[1444]: time="2024-08-05T22:12:56.027534619Z" level=warning msg="cleaning up after shim disconnected" id=de33a8626cda7c4dc930033f9a51a27269999ea189861abee56825a2b75599ce namespace=k8s.io
Aug 5 22:12:56.027754 containerd[1444]: time="2024-08-05T22:12:56.027544227Z" level=info msg="cleaning up dead shim" namespace=k8s.io
Aug 5 22:12:56.042648 containerd[1444]: time="2024-08-05T22:12:56.042503483Z" level=info msg="TearDown network for sandbox \"de33a8626cda7c4dc930033f9a51a27269999ea189861abee56825a2b75599ce\" successfully"
Aug 5 22:12:56.042648 containerd[1444]: time="2024-08-05T22:12:56.042537117Z" level=info msg="StopPodSandbox for \"de33a8626cda7c4dc930033f9a51a27269999ea189861abee56825a2b75599ce\" returns successfully"
Aug 5 22:12:56.206795 kubelet[2635]: I0805 22:12:56.206704 2635 reconciler_common.go:172] "operationExecutor.UnmountVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/825ecc74-9466-4fbf-bfc0-233430c643bb-tigera-ca-bundle\") pod \"825ecc74-9466-4fbf-bfc0-233430c643bb\" (UID: \"825ecc74-9466-4fbf-bfc0-233430c643bb\") "
Aug 5 22:12:56.207812 kubelet[2635]: I0805 22:12:56.207661 2635 reconciler_common.go:172] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/825ecc74-9466-4fbf-bfc0-233430c643bb-lib-modules\") pod \"825ecc74-9466-4fbf-bfc0-233430c643bb\" (UID: \"825ecc74-9466-4fbf-bfc0-233430c643bb\") "
Aug 5 22:12:56.208193 kubelet[2635]: I0805 22:12:56.207689 2635 reconciler_common.go:172] "operationExecutor.UnmountVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/825ecc74-9466-4fbf-bfc0-233430c643bb-policysync\") pod \"825ecc74-9466-4fbf-bfc0-233430c643bb\" (UID: \"825ecc74-9466-4fbf-bfc0-233430c643bb\") "
Aug 5 22:12:56.208193 kubelet[2635]: I0805 22:12:56.207971 2635 reconciler_common.go:172] "operationExecutor.UnmountVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/825ecc74-9466-4fbf-bfc0-233430c643bb-var-run-calico\") pod \"825ecc74-9466-4fbf-bfc0-233430c643bb\" (UID: \"825ecc74-9466-4fbf-bfc0-233430c643bb\") "
Aug 5 22:12:56.208193 kubelet[2635]: I0805 22:12:56.207995 2635 reconciler_common.go:172] "operationExecutor.UnmountVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/825ecc74-9466-4fbf-bfc0-233430c643bb-cni-log-dir\") pod \"825ecc74-9466-4fbf-bfc0-233430c643bb\" (UID: \"825ecc74-9466-4fbf-bfc0-233430c643bb\") "
Aug 5 22:12:56.208193 kubelet[2635]: I0805 22:12:56.207418 2635 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/825ecc74-9466-4fbf-bfc0-233430c643bb-tigera-ca-bundle" (OuterVolumeSpecName: "tigera-ca-bundle") pod "825ecc74-9466-4fbf-bfc0-233430c643bb" (UID: "825ecc74-9466-4fbf-bfc0-233430c643bb"). InnerVolumeSpecName "tigera-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Aug 5 22:12:56.208193 kubelet[2635]: I0805 22:12:56.208130 2635 reconciler_common.go:172] "operationExecutor.UnmountVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/825ecc74-9466-4fbf-bfc0-233430c643bb-flexvol-driver-host\") pod \"825ecc74-9466-4fbf-bfc0-233430c643bb\" (UID: \"825ecc74-9466-4fbf-bfc0-233430c643bb\") "
Aug 5 22:12:56.208193 kubelet[2635]: I0805 22:12:56.208164 2635 reconciler_common.go:172] "operationExecutor.UnmountVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/825ecc74-9466-4fbf-bfc0-233430c643bb-cni-bin-dir\") pod \"825ecc74-9466-4fbf-bfc0-233430c643bb\" (UID: \"825ecc74-9466-4fbf-bfc0-233430c643bb\") "
Aug 5 22:12:56.208405 kubelet[2635]: I0805 22:12:56.207769 2635 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/825ecc74-9466-4fbf-bfc0-233430c643bb-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "825ecc74-9466-4fbf-bfc0-233430c643bb" (UID: "825ecc74-9466-4fbf-bfc0-233430c643bb"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Aug 5 22:12:56.208405 kubelet[2635]: I0805 22:12:56.207795 2635 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/825ecc74-9466-4fbf-bfc0-233430c643bb-policysync" (OuterVolumeSpecName: "policysync") pod "825ecc74-9466-4fbf-bfc0-233430c643bb" (UID: "825ecc74-9466-4fbf-bfc0-233430c643bb"). InnerVolumeSpecName "policysync". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Aug 5 22:12:56.208405 kubelet[2635]: I0805 22:12:56.208362 2635 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/825ecc74-9466-4fbf-bfc0-233430c643bb-cni-log-dir" (OuterVolumeSpecName: "cni-log-dir") pod "825ecc74-9466-4fbf-bfc0-233430c643bb" (UID: "825ecc74-9466-4fbf-bfc0-233430c643bb"). InnerVolumeSpecName "cni-log-dir".
PluginName "kubernetes.io/host-path", VolumeGidValue "" Aug 5 22:12:56.208491 kubelet[2635]: I0805 22:12:56.208437 2635 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/825ecc74-9466-4fbf-bfc0-233430c643bb-var-run-calico" (OuterVolumeSpecName: "var-run-calico") pod "825ecc74-9466-4fbf-bfc0-233430c643bb" (UID: "825ecc74-9466-4fbf-bfc0-233430c643bb"). InnerVolumeSpecName "var-run-calico". PluginName "kubernetes.io/host-path", VolumeGidValue "" Aug 5 22:12:56.208912 kubelet[2635]: I0805 22:12:56.208506 2635 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/825ecc74-9466-4fbf-bfc0-233430c643bb-flexvol-driver-host" (OuterVolumeSpecName: "flexvol-driver-host") pod "825ecc74-9466-4fbf-bfc0-233430c643bb" (UID: "825ecc74-9466-4fbf-bfc0-233430c643bb"). InnerVolumeSpecName "flexvol-driver-host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Aug 5 22:12:56.208912 kubelet[2635]: I0805 22:12:56.208553 2635 reconciler_common.go:172] "operationExecutor.UnmountVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/825ecc74-9466-4fbf-bfc0-233430c643bb-var-lib-calico\") pod \"825ecc74-9466-4fbf-bfc0-233430c643bb\" (UID: \"825ecc74-9466-4fbf-bfc0-233430c643bb\") " Aug 5 22:12:56.208912 kubelet[2635]: I0805 22:12:56.208581 2635 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/825ecc74-9466-4fbf-bfc0-233430c643bb-cni-bin-dir" (OuterVolumeSpecName: "cni-bin-dir") pod "825ecc74-9466-4fbf-bfc0-233430c643bb" (UID: "825ecc74-9466-4fbf-bfc0-233430c643bb"). InnerVolumeSpecName "cni-bin-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Aug 5 22:12:56.208912 kubelet[2635]: I0805 22:12:56.208594 2635 reconciler_common.go:172] "operationExecutor.UnmountVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/825ecc74-9466-4fbf-bfc0-233430c643bb-node-certs\") pod \"825ecc74-9466-4fbf-bfc0-233430c643bb\" (UID: \"825ecc74-9466-4fbf-bfc0-233430c643bb\") " Aug 5 22:12:56.208912 kubelet[2635]: I0805 22:12:56.208744 2635 reconciler_common.go:172] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dn8xf\" (UniqueName: \"kubernetes.io/projected/825ecc74-9466-4fbf-bfc0-233430c643bb-kube-api-access-dn8xf\") pod \"825ecc74-9466-4fbf-bfc0-233430c643bb\" (UID: \"825ecc74-9466-4fbf-bfc0-233430c643bb\") " Aug 5 22:12:56.209059 kubelet[2635]: I0805 22:12:56.208818 2635 reconciler_common.go:172] "operationExecutor.UnmountVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/825ecc74-9466-4fbf-bfc0-233430c643bb-xtables-lock\") pod \"825ecc74-9466-4fbf-bfc0-233430c643bb\" (UID: \"825ecc74-9466-4fbf-bfc0-233430c643bb\") " Aug 5 22:12:56.209059 kubelet[2635]: I0805 22:12:56.208868 2635 reconciler_common.go:172] "operationExecutor.UnmountVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/825ecc74-9466-4fbf-bfc0-233430c643bb-cni-net-dir\") pod \"825ecc74-9466-4fbf-bfc0-233430c643bb\" (UID: \"825ecc74-9466-4fbf-bfc0-233430c643bb\") " Aug 5 22:12:56.209059 kubelet[2635]: I0805 22:12:56.208988 2635 reconciler_common.go:300] "Volume detached for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/825ecc74-9466-4fbf-bfc0-233430c643bb-tigera-ca-bundle\") on node \"ci-3975-2-0-1-de7b5ef465.novalocal\" DevicePath \"\"" Aug 5 22:12:56.209059 kubelet[2635]: I0805 22:12:56.209021 2635 reconciler_common.go:300] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/825ecc74-9466-4fbf-bfc0-233430c643bb-lib-modules\") on node 
\"ci-3975-2-0-1-de7b5ef465.novalocal\" DevicePath \"\"" Aug 5 22:12:56.209059 kubelet[2635]: I0805 22:12:56.209054 2635 reconciler_common.go:300] "Volume detached for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/825ecc74-9466-4fbf-bfc0-233430c643bb-policysync\") on node \"ci-3975-2-0-1-de7b5ef465.novalocal\" DevicePath \"\"" Aug 5 22:12:56.209185 kubelet[2635]: I0805 22:12:56.209084 2635 reconciler_common.go:300] "Volume detached for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/825ecc74-9466-4fbf-bfc0-233430c643bb-var-run-calico\") on node \"ci-3975-2-0-1-de7b5ef465.novalocal\" DevicePath \"\"" Aug 5 22:12:56.209185 kubelet[2635]: I0805 22:12:56.209114 2635 reconciler_common.go:300] "Volume detached for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/825ecc74-9466-4fbf-bfc0-233430c643bb-cni-log-dir\") on node \"ci-3975-2-0-1-de7b5ef465.novalocal\" DevicePath \"\"" Aug 5 22:12:56.209185 kubelet[2635]: I0805 22:12:56.209147 2635 reconciler_common.go:300] "Volume detached for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/825ecc74-9466-4fbf-bfc0-233430c643bb-flexvol-driver-host\") on node \"ci-3975-2-0-1-de7b5ef465.novalocal\" DevicePath \"\"" Aug 5 22:12:56.209185 kubelet[2635]: I0805 22:12:56.209176 2635 reconciler_common.go:300] "Volume detached for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/825ecc74-9466-4fbf-bfc0-233430c643bb-cni-bin-dir\") on node \"ci-3975-2-0-1-de7b5ef465.novalocal\" DevicePath \"\"" Aug 5 22:12:56.209790 kubelet[2635]: I0805 22:12:56.209262 2635 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/825ecc74-9466-4fbf-bfc0-233430c643bb-cni-net-dir" (OuterVolumeSpecName: "cni-net-dir") pod "825ecc74-9466-4fbf-bfc0-233430c643bb" (UID: "825ecc74-9466-4fbf-bfc0-233430c643bb"). InnerVolumeSpecName "cni-net-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Aug 5 22:12:56.209790 kubelet[2635]: I0805 22:12:56.209730 2635 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/825ecc74-9466-4fbf-bfc0-233430c643bb-var-lib-calico" (OuterVolumeSpecName: "var-lib-calico") pod "825ecc74-9466-4fbf-bfc0-233430c643bb" (UID: "825ecc74-9466-4fbf-bfc0-233430c643bb"). InnerVolumeSpecName "var-lib-calico". PluginName "kubernetes.io/host-path", VolumeGidValue "" Aug 5 22:12:56.210028 kubelet[2635]: I0805 22:12:56.209894 2635 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/825ecc74-9466-4fbf-bfc0-233430c643bb-xtables-lock" (OuterVolumeSpecName: "xtables-lock") pod "825ecc74-9466-4fbf-bfc0-233430c643bb" (UID: "825ecc74-9466-4fbf-bfc0-233430c643bb"). InnerVolumeSpecName "xtables-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Aug 5 22:12:56.214886 systemd[1]: var-lib-kubelet-pods-825ecc74\x2d9466\x2d4fbf\x2dbfc0\x2d233430c643bb-volumes-kubernetes.io\x7esecret-node\x2dcerts.mount: Deactivated successfully. Aug 5 22:12:56.215203 kubelet[2635]: I0805 22:12:56.215161 2635 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/825ecc74-9466-4fbf-bfc0-233430c643bb-node-certs" (OuterVolumeSpecName: "node-certs") pod "825ecc74-9466-4fbf-bfc0-233430c643bb" (UID: "825ecc74-9466-4fbf-bfc0-233430c643bb"). InnerVolumeSpecName "node-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Aug 5 22:12:56.218772 systemd[1]: var-lib-kubelet-pods-825ecc74\x2d9466\x2d4fbf\x2dbfc0\x2d233430c643bb-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2ddn8xf.mount: Deactivated successfully. 
Aug 5 22:12:56.219962 kubelet[2635]: I0805 22:12:56.219476 2635 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/825ecc74-9466-4fbf-bfc0-233430c643bb-kube-api-access-dn8xf" (OuterVolumeSpecName: "kube-api-access-dn8xf") pod "825ecc74-9466-4fbf-bfc0-233430c643bb" (UID: "825ecc74-9466-4fbf-bfc0-233430c643bb"). InnerVolumeSpecName "kube-api-access-dn8xf". PluginName "kubernetes.io/projected", VolumeGidValue "" Aug 5 22:12:56.310426 kubelet[2635]: I0805 22:12:56.310357 2635 reconciler_common.go:300] "Volume detached for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/825ecc74-9466-4fbf-bfc0-233430c643bb-var-lib-calico\") on node \"ci-3975-2-0-1-de7b5ef465.novalocal\" DevicePath \"\"" Aug 5 22:12:56.310426 kubelet[2635]: I0805 22:12:56.310431 2635 reconciler_common.go:300] "Volume detached for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/825ecc74-9466-4fbf-bfc0-233430c643bb-node-certs\") on node \"ci-3975-2-0-1-de7b5ef465.novalocal\" DevicePath \"\"" Aug 5 22:12:56.310710 kubelet[2635]: I0805 22:12:56.310478 2635 reconciler_common.go:300] "Volume detached for volume \"kube-api-access-dn8xf\" (UniqueName: \"kubernetes.io/projected/825ecc74-9466-4fbf-bfc0-233430c643bb-kube-api-access-dn8xf\") on node \"ci-3975-2-0-1-de7b5ef465.novalocal\" DevicePath \"\"" Aug 5 22:12:56.310710 kubelet[2635]: I0805 22:12:56.310510 2635 reconciler_common.go:300] "Volume detached for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/825ecc74-9466-4fbf-bfc0-233430c643bb-xtables-lock\") on node \"ci-3975-2-0-1-de7b5ef465.novalocal\" DevicePath \"\"" Aug 5 22:12:56.310710 kubelet[2635]: I0805 22:12:56.310542 2635 reconciler_common.go:300] "Volume detached for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/825ecc74-9466-4fbf-bfc0-233430c643bb-cni-net-dir\") on node \"ci-3975-2-0-1-de7b5ef465.novalocal\" DevicePath \"\"" Aug 5 22:12:56.743096 kubelet[2635]: E0805 22:12:56.740740 2635 
pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-ck8bx" podUID="2d1ea0ca-eeda-4305-a298-4df807e6a886" Aug 5 22:12:56.750866 kubelet[2635]: I0805 22:12:56.749125 2635 kubelet_volumes.go:161] "Cleaned up orphaned pod volumes dir" podUID="f2ff3fd7-525a-47a9-95a1-74f7bff921c4" path="/var/lib/kubelet/pods/f2ff3fd7-525a-47a9-95a1-74f7bff921c4/volumes" Aug 5 22:12:56.767607 systemd[1]: Removed slice kubepods-besteffort-pod825ecc74_9466_4fbf_bfc0_233430c643bb.slice - libcontainer container kubepods-besteffort-pod825ecc74_9466_4fbf_bfc0_233430c643bb.slice. Aug 5 22:12:56.927373 kubelet[2635]: I0805 22:12:56.926533 2635 scope.go:117] "RemoveContainer" containerID="d2c772f98dbb352bf97516c31814ab886a8ff2758f3c3a8208401e8f312811a2" Aug 5 22:12:56.932510 containerd[1444]: time="2024-08-05T22:12:56.932397305Z" level=info msg="RemoveContainer for \"d2c772f98dbb352bf97516c31814ab886a8ff2758f3c3a8208401e8f312811a2\"" Aug 5 22:12:56.944549 containerd[1444]: time="2024-08-05T22:12:56.943518850Z" level=info msg="RemoveContainer for \"d2c772f98dbb352bf97516c31814ab886a8ff2758f3c3a8208401e8f312811a2\" returns successfully" Aug 5 22:12:57.023433 kubelet[2635]: I0805 22:12:57.023280 2635 topology_manager.go:215] "Topology Admit Handler" podUID="46f9ce36-b0f8-4063-a4c6-220fb21301cb" podNamespace="calico-system" podName="calico-node-bl8mm" Aug 5 22:12:57.023433 kubelet[2635]: E0805 22:12:57.023397 2635 cpu_manager.go:395] "RemoveStaleState: removing container" podUID="825ecc74-9466-4fbf-bfc0-233430c643bb" containerName="flexvol-driver" Aug 5 22:12:57.024685 kubelet[2635]: I0805 22:12:57.023458 2635 memory_manager.go:354] "RemoveStaleState removing state" podUID="825ecc74-9466-4fbf-bfc0-233430c643bb" containerName="flexvol-driver" Aug 5 22:12:57.042299 systemd[1]: Created slice 
kubepods-besteffort-pod46f9ce36_b0f8_4063_a4c6_220fb21301cb.slice - libcontainer container kubepods-besteffort-pod46f9ce36_b0f8_4063_a4c6_220fb21301cb.slice. Aug 5 22:12:57.116410 kubelet[2635]: I0805 22:12:57.116335 2635 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/46f9ce36-b0f8-4063-a4c6-220fb21301cb-xtables-lock\") pod \"calico-node-bl8mm\" (UID: \"46f9ce36-b0f8-4063-a4c6-220fb21301cb\") " pod="calico-system/calico-node-bl8mm" Aug 5 22:12:57.116602 kubelet[2635]: I0805 22:12:57.116457 2635 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/46f9ce36-b0f8-4063-a4c6-220fb21301cb-node-certs\") pod \"calico-node-bl8mm\" (UID: \"46f9ce36-b0f8-4063-a4c6-220fb21301cb\") " pod="calico-system/calico-node-bl8mm" Aug 5 22:12:57.116602 kubelet[2635]: I0805 22:12:57.116527 2635 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/46f9ce36-b0f8-4063-a4c6-220fb21301cb-var-lib-calico\") pod \"calico-node-bl8mm\" (UID: \"46f9ce36-b0f8-4063-a4c6-220fb21301cb\") " pod="calico-system/calico-node-bl8mm" Aug 5 22:12:57.116663 kubelet[2635]: I0805 22:12:57.116608 2635 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/46f9ce36-b0f8-4063-a4c6-220fb21301cb-cni-net-dir\") pod \"calico-node-bl8mm\" (UID: \"46f9ce36-b0f8-4063-a4c6-220fb21301cb\") " pod="calico-system/calico-node-bl8mm" Aug 5 22:12:57.116696 kubelet[2635]: I0805 22:12:57.116672 2635 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/46f9ce36-b0f8-4063-a4c6-220fb21301cb-tigera-ca-bundle\") pod \"calico-node-bl8mm\" (UID: 
\"46f9ce36-b0f8-4063-a4c6-220fb21301cb\") " pod="calico-system/calico-node-bl8mm" Aug 5 22:12:57.116761 kubelet[2635]: I0805 22:12:57.116736 2635 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/46f9ce36-b0f8-4063-a4c6-220fb21301cb-lib-modules\") pod \"calico-node-bl8mm\" (UID: \"46f9ce36-b0f8-4063-a4c6-220fb21301cb\") " pod="calico-system/calico-node-bl8mm" Aug 5 22:12:57.116843 kubelet[2635]: I0805 22:12:57.116808 2635 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/46f9ce36-b0f8-4063-a4c6-220fb21301cb-cni-bin-dir\") pod \"calico-node-bl8mm\" (UID: \"46f9ce36-b0f8-4063-a4c6-220fb21301cb\") " pod="calico-system/calico-node-bl8mm" Aug 5 22:12:57.116905 kubelet[2635]: I0805 22:12:57.116880 2635 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/46f9ce36-b0f8-4063-a4c6-220fb21301cb-cni-log-dir\") pod \"calico-node-bl8mm\" (UID: \"46f9ce36-b0f8-4063-a4c6-220fb21301cb\") " pod="calico-system/calico-node-bl8mm" Aug 5 22:12:57.116977 kubelet[2635]: I0805 22:12:57.116954 2635 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bgckx\" (UniqueName: \"kubernetes.io/projected/46f9ce36-b0f8-4063-a4c6-220fb21301cb-kube-api-access-bgckx\") pod \"calico-node-bl8mm\" (UID: \"46f9ce36-b0f8-4063-a4c6-220fb21301cb\") " pod="calico-system/calico-node-bl8mm" Aug 5 22:12:57.117051 kubelet[2635]: I0805 22:12:57.117025 2635 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/46f9ce36-b0f8-4063-a4c6-220fb21301cb-var-run-calico\") pod \"calico-node-bl8mm\" (UID: \"46f9ce36-b0f8-4063-a4c6-220fb21301cb\") " 
pod="calico-system/calico-node-bl8mm" Aug 5 22:12:57.117126 kubelet[2635]: I0805 22:12:57.117100 2635 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/46f9ce36-b0f8-4063-a4c6-220fb21301cb-policysync\") pod \"calico-node-bl8mm\" (UID: \"46f9ce36-b0f8-4063-a4c6-220fb21301cb\") " pod="calico-system/calico-node-bl8mm" Aug 5 22:12:57.117200 kubelet[2635]: I0805 22:12:57.117173 2635 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/46f9ce36-b0f8-4063-a4c6-220fb21301cb-flexvol-driver-host\") pod \"calico-node-bl8mm\" (UID: \"46f9ce36-b0f8-4063-a4c6-220fb21301cb\") " pod="calico-system/calico-node-bl8mm" Aug 5 22:12:57.346107 containerd[1444]: time="2024-08-05T22:12:57.345400057Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-bl8mm,Uid:46f9ce36-b0f8-4063-a4c6-220fb21301cb,Namespace:calico-system,Attempt:0,}" Aug 5 22:12:57.376899 kubelet[2635]: I0805 22:12:57.376823 2635 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Aug 5 22:12:57.422369 containerd[1444]: time="2024-08-05T22:12:57.421121521Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Aug 5 22:12:57.424186 containerd[1444]: time="2024-08-05T22:12:57.423851112Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Aug 5 22:12:57.424186 containerd[1444]: time="2024-08-05T22:12:57.423984498Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Aug 5 22:12:57.425736 containerd[1444]: time="2024-08-05T22:12:57.425632965Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Aug 5 22:12:57.470442 systemd[1]: Started cri-containerd-19660da175825eee0634b406dd0ff91cdc281bc0da9c54da36a454f658a1a9f4.scope - libcontainer container 19660da175825eee0634b406dd0ff91cdc281bc0da9c54da36a454f658a1a9f4. Aug 5 22:12:57.503399 containerd[1444]: time="2024-08-05T22:12:57.503344959Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-bl8mm,Uid:46f9ce36-b0f8-4063-a4c6-220fb21301cb,Namespace:calico-system,Attempt:0,} returns sandbox id \"19660da175825eee0634b406dd0ff91cdc281bc0da9c54da36a454f658a1a9f4\"" Aug 5 22:12:57.508696 containerd[1444]: time="2024-08-05T22:12:57.508593039Z" level=info msg="CreateContainer within sandbox \"19660da175825eee0634b406dd0ff91cdc281bc0da9c54da36a454f658a1a9f4\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Aug 5 22:12:57.526492 containerd[1444]: time="2024-08-05T22:12:57.526426386Z" level=info msg="CreateContainer within sandbox \"19660da175825eee0634b406dd0ff91cdc281bc0da9c54da36a454f658a1a9f4\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"235933467bf35e95d25e90e5fb8557592c83f3541f40c5a3326bfcc8159b4794\"" Aug 5 22:12:57.527366 containerd[1444]: time="2024-08-05T22:12:57.527339599Z" level=info msg="StartContainer for \"235933467bf35e95d25e90e5fb8557592c83f3541f40c5a3326bfcc8159b4794\"" Aug 5 22:12:57.569374 systemd[1]: Started cri-containerd-235933467bf35e95d25e90e5fb8557592c83f3541f40c5a3326bfcc8159b4794.scope - libcontainer container 235933467bf35e95d25e90e5fb8557592c83f3541f40c5a3326bfcc8159b4794. Aug 5 22:12:57.606824 containerd[1444]: time="2024-08-05T22:12:57.606766297Z" level=info msg="StartContainer for \"235933467bf35e95d25e90e5fb8557592c83f3541f40c5a3326bfcc8159b4794\" returns successfully" Aug 5 22:12:57.619578 systemd[1]: cri-containerd-235933467bf35e95d25e90e5fb8557592c83f3541f40c5a3326bfcc8159b4794.scope: Deactivated successfully. 
Aug 5 22:12:57.691027 containerd[1444]: time="2024-08-05T22:12:57.690915423Z" level=info msg="shim disconnected" id=235933467bf35e95d25e90e5fb8557592c83f3541f40c5a3326bfcc8159b4794 namespace=k8s.io Aug 5 22:12:57.691027 containerd[1444]: time="2024-08-05T22:12:57.691013009Z" level=warning msg="cleaning up after shim disconnected" id=235933467bf35e95d25e90e5fb8557592c83f3541f40c5a3326bfcc8159b4794 namespace=k8s.io Aug 5 22:12:57.691027 containerd[1444]: time="2024-08-05T22:12:57.691025071Z" level=info msg="cleaning up dead shim" namespace=k8s.io Aug 5 22:12:57.938095 containerd[1444]: time="2024-08-05T22:12:57.937804854Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.28.0\"" Aug 5 22:12:58.224789 systemd[1]: run-containerd-runc-k8s.io-19660da175825eee0634b406dd0ff91cdc281bc0da9c54da36a454f658a1a9f4-runc.X88rUb.mount: Deactivated successfully. Aug 5 22:12:58.741080 kubelet[2635]: E0805 22:12:58.740598 2635 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-ck8bx" podUID="2d1ea0ca-eeda-4305-a298-4df807e6a886" Aug 5 22:12:58.744008 kubelet[2635]: I0805 22:12:58.743980 2635 kubelet_volumes.go:161] "Cleaned up orphaned pod volumes dir" podUID="825ecc74-9466-4fbf-bfc0-233430c643bb" path="/var/lib/kubelet/pods/825ecc74-9466-4fbf-bfc0-233430c643bb/volumes" Aug 5 22:13:00.745073 kubelet[2635]: E0805 22:13:00.743122 2635 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-ck8bx" podUID="2d1ea0ca-eeda-4305-a298-4df807e6a886" Aug 5 22:13:02.741500 kubelet[2635]: E0805 22:13:02.741428 2635 pod_workers.go:1298] "Error syncing pod, skipping" 
err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-ck8bx" podUID="2d1ea0ca-eeda-4305-a298-4df807e6a886" Aug 5 22:13:03.854762 containerd[1444]: time="2024-08-05T22:13:03.854689841Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.28.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 5 22:13:03.858593 containerd[1444]: time="2024-08-05T22:13:03.858509383Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.28.0: active requests=0, bytes read=93087850" Aug 5 22:13:03.861420 containerd[1444]: time="2024-08-05T22:13:03.861361364Z" level=info msg="ImageCreate event name:\"sha256:107014d9f4c891a0235fa80b55df22451e8804ede5b891b632c5779ca3ab07a7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 5 22:13:03.869088 containerd[1444]: time="2024-08-05T22:13:03.869004066Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:67fdc0954d3c96f9a7938fca4d5759c835b773dfb5cb513903e89d21462d886e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 5 22:13:03.871468 containerd[1444]: time="2024-08-05T22:13:03.871388116Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.28.0\" with image id \"sha256:107014d9f4c891a0235fa80b55df22451e8804ede5b891b632c5779ca3ab07a7\", repo tag \"ghcr.io/flatcar/calico/cni:v3.28.0\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:67fdc0954d3c96f9a7938fca4d5759c835b773dfb5cb513903e89d21462d886e\", size \"94535610\" in 5.9334986s" Aug 5 22:13:03.871787 containerd[1444]: time="2024-08-05T22:13:03.871742891Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.28.0\" returns image reference \"sha256:107014d9f4c891a0235fa80b55df22451e8804ede5b891b632c5779ca3ab07a7\"" Aug 5 22:13:03.881181 containerd[1444]: time="2024-08-05T22:13:03.881031828Z" level=info msg="CreateContainer within 
sandbox \"19660da175825eee0634b406dd0ff91cdc281bc0da9c54da36a454f658a1a9f4\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Aug 5 22:13:03.928267 containerd[1444]: time="2024-08-05T22:13:03.928121801Z" level=info msg="CreateContainer within sandbox \"19660da175825eee0634b406dd0ff91cdc281bc0da9c54da36a454f658a1a9f4\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"a9127a9cc156bbeefd9dbb66c24254bb59f6137f05b6d9fed2bcc55f51b03389\"" Aug 5 22:13:03.934280 containerd[1444]: time="2024-08-05T22:13:03.932384016Z" level=info msg="StartContainer for \"a9127a9cc156bbeefd9dbb66c24254bb59f6137f05b6d9fed2bcc55f51b03389\"" Aug 5 22:13:04.073402 systemd[1]: Started cri-containerd-a9127a9cc156bbeefd9dbb66c24254bb59f6137f05b6d9fed2bcc55f51b03389.scope - libcontainer container a9127a9cc156bbeefd9dbb66c24254bb59f6137f05b6d9fed2bcc55f51b03389. Aug 5 22:13:04.114489 containerd[1444]: time="2024-08-05T22:13:04.114296815Z" level=info msg="StartContainer for \"a9127a9cc156bbeefd9dbb66c24254bb59f6137f05b6d9fed2bcc55f51b03389\" returns successfully" Aug 5 22:13:04.740099 kubelet[2635]: E0805 22:13:04.740014 2635 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-ck8bx" podUID="2d1ea0ca-eeda-4305-a298-4df807e6a886" Aug 5 22:13:06.576783 systemd[1]: cri-containerd-a9127a9cc156bbeefd9dbb66c24254bb59f6137f05b6d9fed2bcc55f51b03389.scope: Deactivated successfully. Aug 5 22:13:06.622725 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-a9127a9cc156bbeefd9dbb66c24254bb59f6137f05b6d9fed2bcc55f51b03389-rootfs.mount: Deactivated successfully. 
Aug 5 22:13:06.630783 containerd[1444]: time="2024-08-05T22:13:06.630514414Z" level=info msg="shim disconnected" id=a9127a9cc156bbeefd9dbb66c24254bb59f6137f05b6d9fed2bcc55f51b03389 namespace=k8s.io
Aug 5 22:13:06.630783 containerd[1444]: time="2024-08-05T22:13:06.630594106Z" level=warning msg="cleaning up after shim disconnected" id=a9127a9cc156bbeefd9dbb66c24254bb59f6137f05b6d9fed2bcc55f51b03389 namespace=k8s.io
Aug 5 22:13:06.630783 containerd[1444]: time="2024-08-05T22:13:06.630605517Z" level=info msg="cleaning up dead shim" namespace=k8s.io
Aug 5 22:13:06.645247 kubelet[2635]: I0805 22:13:06.644689 2635 kubelet_node_status.go:497] "Fast updating node status as it just became ready"
Aug 5 22:13:06.679571 kubelet[2635]: I0805 22:13:06.678729 2635 topology_manager.go:215] "Topology Admit Handler" podUID="2932e3f8-d82d-4537-9f9b-b46369d5b6f8" podNamespace="kube-system" podName="coredns-76f75df574-lvcgp"
Aug 5 22:13:06.686269 kubelet[2635]: I0805 22:13:06.686043 2635 topology_manager.go:215] "Topology Admit Handler" podUID="4ecfd1c0-3700-4ed3-84bf-da7987083d57" podNamespace="calico-system" podName="calico-kube-controllers-58c54bc647-pqf69"
Aug 5 22:13:06.687891 kubelet[2635]: I0805 22:13:06.686904 2635 topology_manager.go:215] "Topology Admit Handler" podUID="55edd9ef-b537-4c83-ad1e-084f093bbf6f" podNamespace="kube-system" podName="coredns-76f75df574-t6q9g"
Aug 5 22:13:06.691526 kubelet[2635]: I0805 22:13:06.690373 2635 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/55edd9ef-b537-4c83-ad1e-084f093bbf6f-config-volume\") pod \"coredns-76f75df574-t6q9g\" (UID: \"55edd9ef-b537-4c83-ad1e-084f093bbf6f\") " pod="kube-system/coredns-76f75df574-t6q9g"
Aug 5 22:13:06.691526 kubelet[2635]: I0805 22:13:06.690418 2635 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ntwkd\" (UniqueName: \"kubernetes.io/projected/2932e3f8-d82d-4537-9f9b-b46369d5b6f8-kube-api-access-ntwkd\") pod \"coredns-76f75df574-lvcgp\" (UID: \"2932e3f8-d82d-4537-9f9b-b46369d5b6f8\") " pod="kube-system/coredns-76f75df574-lvcgp"
Aug 5 22:13:06.691526 kubelet[2635]: I0805 22:13:06.690559 2635 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2932e3f8-d82d-4537-9f9b-b46369d5b6f8-config-volume\") pod \"coredns-76f75df574-lvcgp\" (UID: \"2932e3f8-d82d-4537-9f9b-b46369d5b6f8\") " pod="kube-system/coredns-76f75df574-lvcgp"
Aug 5 22:13:06.691526 kubelet[2635]: I0805 22:13:06.690591 2635 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4ecfd1c0-3700-4ed3-84bf-da7987083d57-tigera-ca-bundle\") pod \"calico-kube-controllers-58c54bc647-pqf69\" (UID: \"4ecfd1c0-3700-4ed3-84bf-da7987083d57\") " pod="calico-system/calico-kube-controllers-58c54bc647-pqf69"
Aug 5 22:13:06.691526 kubelet[2635]: I0805 22:13:06.690658 2635 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q679w\" (UniqueName: \"kubernetes.io/projected/4ecfd1c0-3700-4ed3-84bf-da7987083d57-kube-api-access-q679w\") pod \"calico-kube-controllers-58c54bc647-pqf69\" (UID: \"4ecfd1c0-3700-4ed3-84bf-da7987083d57\") " pod="calico-system/calico-kube-controllers-58c54bc647-pqf69"
Aug 5 22:13:06.690892 systemd[1]: Created slice kubepods-burstable-pod2932e3f8_d82d_4537_9f9b_b46369d5b6f8.slice - libcontainer container kubepods-burstable-pod2932e3f8_d82d_4537_9f9b_b46369d5b6f8.slice.
Aug 5 22:13:06.693070 kubelet[2635]: I0805 22:13:06.690794 2635 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gdj8s\" (UniqueName: \"kubernetes.io/projected/55edd9ef-b537-4c83-ad1e-084f093bbf6f-kube-api-access-gdj8s\") pod \"coredns-76f75df574-t6q9g\" (UID: \"55edd9ef-b537-4c83-ad1e-084f093bbf6f\") " pod="kube-system/coredns-76f75df574-t6q9g"
Aug 5 22:13:06.701269 systemd[1]: Created slice kubepods-besteffort-pod4ecfd1c0_3700_4ed3_84bf_da7987083d57.slice - libcontainer container kubepods-besteffort-pod4ecfd1c0_3700_4ed3_84bf_da7987083d57.slice.
Aug 5 22:13:06.707580 systemd[1]: Created slice kubepods-burstable-pod55edd9ef_b537_4c83_ad1e_084f093bbf6f.slice - libcontainer container kubepods-burstable-pod55edd9ef_b537_4c83_ad1e_084f093bbf6f.slice.
Aug 5 22:13:06.747918 systemd[1]: Created slice kubepods-besteffort-pod2d1ea0ca_eeda_4305_a298_4df807e6a886.slice - libcontainer container kubepods-besteffort-pod2d1ea0ca_eeda_4305_a298_4df807e6a886.slice.
Aug 5 22:13:06.753198 containerd[1444]: time="2024-08-05T22:13:06.752759104Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-ck8bx,Uid:2d1ea0ca-eeda-4305-a298-4df807e6a886,Namespace:calico-system,Attempt:0,}"
Aug 5 22:13:06.954817 containerd[1444]: time="2024-08-05T22:13:06.954628224Z" level=error msg="Failed to destroy network for sandbox \"52609d5815f48c3224a2955f13cc78c5198781e782cc09339cbdbd4863996ad0\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Aug 5 22:13:06.960454 containerd[1444]: time="2024-08-05T22:13:06.960287409Z" level=error msg="encountered an error cleaning up failed sandbox \"52609d5815f48c3224a2955f13cc78c5198781e782cc09339cbdbd4863996ad0\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Aug 5 22:13:06.960454 containerd[1444]: time="2024-08-05T22:13:06.960350870Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-ck8bx,Uid:2d1ea0ca-eeda-4305-a298-4df807e6a886,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"52609d5815f48c3224a2955f13cc78c5198781e782cc09339cbdbd4863996ad0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Aug 5 22:13:06.962468 kubelet[2635]: E0805 22:13:06.960604 2635 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"52609d5815f48c3224a2955f13cc78c5198781e782cc09339cbdbd4863996ad0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Aug 5 22:13:06.962468 kubelet[2635]: E0805 22:13:06.960670 2635 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"52609d5815f48c3224a2955f13cc78c5198781e782cc09339cbdbd4863996ad0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-ck8bx"
Aug 5 22:13:06.962468 kubelet[2635]: E0805 22:13:06.960697 2635 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"52609d5815f48c3224a2955f13cc78c5198781e782cc09339cbdbd4863996ad0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-ck8bx"
Aug 5 22:13:06.962641 kubelet[2635]: E0805 22:13:06.960782 2635 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-ck8bx_calico-system(2d1ea0ca-eeda-4305-a298-4df807e6a886)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-ck8bx_calico-system(2d1ea0ca-eeda-4305-a298-4df807e6a886)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"52609d5815f48c3224a2955f13cc78c5198781e782cc09339cbdbd4863996ad0\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-ck8bx" podUID="2d1ea0ca-eeda-4305-a298-4df807e6a886"
Aug 5 22:13:06.983661 containerd[1444]: time="2024-08-05T22:13:06.983380295Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.28.0\""
Aug 5 22:13:06.984836 kubelet[2635]: I0805 22:13:06.984818 2635 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="52609d5815f48c3224a2955f13cc78c5198781e782cc09339cbdbd4863996ad0"
Aug 5 22:13:06.986779 containerd[1444]: time="2024-08-05T22:13:06.986267478Z" level=info msg="StopPodSandbox for \"52609d5815f48c3224a2955f13cc78c5198781e782cc09339cbdbd4863996ad0\""
Aug 5 22:13:06.986779 containerd[1444]: time="2024-08-05T22:13:06.986505430Z" level=info msg="Ensure that sandbox 52609d5815f48c3224a2955f13cc78c5198781e782cc09339cbdbd4863996ad0 in task-service has been cleanup successfully"
Aug 5 22:13:07.001802 containerd[1444]: time="2024-08-05T22:13:07.001718017Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-76f75df574-lvcgp,Uid:2932e3f8-d82d-4537-9f9b-b46369d5b6f8,Namespace:kube-system,Attempt:0,}"
Aug 5 22:13:07.014266 containerd[1444]: time="2024-08-05T22:13:07.014105142Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-76f75df574-t6q9g,Uid:55edd9ef-b537-4c83-ad1e-084f093bbf6f,Namespace:kube-system,Attempt:0,}"
Aug 5 22:13:07.015186 containerd[1444]: time="2024-08-05T22:13:07.014660980Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-58c54bc647-pqf69,Uid:4ecfd1c0-3700-4ed3-84bf-da7987083d57,Namespace:calico-system,Attempt:0,}"
Aug 5 22:13:07.054307 containerd[1444]: time="2024-08-05T22:13:07.054252037Z" level=error msg="StopPodSandbox for \"52609d5815f48c3224a2955f13cc78c5198781e782cc09339cbdbd4863996ad0\" failed" error="failed to destroy network for sandbox \"52609d5815f48c3224a2955f13cc78c5198781e782cc09339cbdbd4863996ad0\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Aug 5 22:13:07.058606 kubelet[2635]: E0805 22:13:07.054554 2635 remote_runtime.go:222] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"52609d5815f48c3224a2955f13cc78c5198781e782cc09339cbdbd4863996ad0\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="52609d5815f48c3224a2955f13cc78c5198781e782cc09339cbdbd4863996ad0"
Aug 5 22:13:07.058606 kubelet[2635]: E0805 22:13:07.054628 2635 kuberuntime_manager.go:1381] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"52609d5815f48c3224a2955f13cc78c5198781e782cc09339cbdbd4863996ad0"}
Aug 5 22:13:07.058606 kubelet[2635]: E0805 22:13:07.054780 2635 kuberuntime_manager.go:1081] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"2d1ea0ca-eeda-4305-a298-4df807e6a886\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"52609d5815f48c3224a2955f13cc78c5198781e782cc09339cbdbd4863996ad0\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\""
Aug 5 22:13:07.058606 kubelet[2635]: E0805 22:13:07.054958 2635 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"2d1ea0ca-eeda-4305-a298-4df807e6a886\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"52609d5815f48c3224a2955f13cc78c5198781e782cc09339cbdbd4863996ad0\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-ck8bx" podUID="2d1ea0ca-eeda-4305-a298-4df807e6a886"
Aug 5 22:13:07.181943 containerd[1444]: time="2024-08-05T22:13:07.180749416Z" level=error msg="Failed to destroy network for sandbox \"c44015cdd53b711bcd0d788d4f0c0a8bccb66bdbdff2b2e021822855fceb70cd\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Aug 5 22:13:07.182750 containerd[1444]: time="2024-08-05T22:13:07.182701568Z" level=error msg="encountered an error cleaning up failed sandbox \"c44015cdd53b711bcd0d788d4f0c0a8bccb66bdbdff2b2e021822855fceb70cd\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Aug 5 22:13:07.182839 containerd[1444]: time="2024-08-05T22:13:07.182767524Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-76f75df574-lvcgp,Uid:2932e3f8-d82d-4537-9f9b-b46369d5b6f8,Namespace:kube-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"c44015cdd53b711bcd0d788d4f0c0a8bccb66bdbdff2b2e021822855fceb70cd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Aug 5 22:13:07.183642 kubelet[2635]: E0805 22:13:07.183027 2635 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c44015cdd53b711bcd0d788d4f0c0a8bccb66bdbdff2b2e021822855fceb70cd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Aug 5 22:13:07.183642 kubelet[2635]: E0805 22:13:07.183116 2635 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c44015cdd53b711bcd0d788d4f0c0a8bccb66bdbdff2b2e021822855fceb70cd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-76f75df574-lvcgp"
Aug 5 22:13:07.183642 kubelet[2635]: E0805 22:13:07.183150 2635 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c44015cdd53b711bcd0d788d4f0c0a8bccb66bdbdff2b2e021822855fceb70cd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-76f75df574-lvcgp"
Aug 5 22:13:07.183817 kubelet[2635]: E0805 22:13:07.183247 2635 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-76f75df574-lvcgp_kube-system(2932e3f8-d82d-4537-9f9b-b46369d5b6f8)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-76f75df574-lvcgp_kube-system(2932e3f8-d82d-4537-9f9b-b46369d5b6f8)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"c44015cdd53b711bcd0d788d4f0c0a8bccb66bdbdff2b2e021822855fceb70cd\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-76f75df574-lvcgp" podUID="2932e3f8-d82d-4537-9f9b-b46369d5b6f8"
Aug 5 22:13:07.209761 containerd[1444]: time="2024-08-05T22:13:07.209521861Z" level=error msg="Failed to destroy network for sandbox \"8d2607fead9c533ba1c6f4efe579b7f9aa3d73d5750c29a6449e651cd173f193\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Aug 5 22:13:07.212025 containerd[1444]: time="2024-08-05T22:13:07.210877628Z" level=error msg="encountered an error cleaning up failed sandbox \"8d2607fead9c533ba1c6f4efe579b7f9aa3d73d5750c29a6449e651cd173f193\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Aug 5 22:13:07.212025 containerd[1444]: time="2024-08-05T22:13:07.210992186Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-76f75df574-t6q9g,Uid:55edd9ef-b537-4c83-ad1e-084f093bbf6f,Namespace:kube-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"8d2607fead9c533ba1c6f4efe579b7f9aa3d73d5750c29a6449e651cd173f193\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Aug 5 22:13:07.212722 kubelet[2635]: E0805 22:13:07.211392 2635 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8d2607fead9c533ba1c6f4efe579b7f9aa3d73d5750c29a6449e651cd173f193\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Aug 5 22:13:07.212722 kubelet[2635]: E0805 22:13:07.211455 2635 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8d2607fead9c533ba1c6f4efe579b7f9aa3d73d5750c29a6449e651cd173f193\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-76f75df574-t6q9g"
Aug 5 22:13:07.212722 kubelet[2635]: E0805 22:13:07.211481 2635 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8d2607fead9c533ba1c6f4efe579b7f9aa3d73d5750c29a6449e651cd173f193\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-76f75df574-t6q9g"
Aug 5 22:13:07.212849 kubelet[2635]: E0805 22:13:07.211540 2635 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-76f75df574-t6q9g_kube-system(55edd9ef-b537-4c83-ad1e-084f093bbf6f)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-76f75df574-t6q9g_kube-system(55edd9ef-b537-4c83-ad1e-084f093bbf6f)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"8d2607fead9c533ba1c6f4efe579b7f9aa3d73d5750c29a6449e651cd173f193\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-76f75df574-t6q9g" podUID="55edd9ef-b537-4c83-ad1e-084f093bbf6f"
Aug 5 22:13:07.213843 containerd[1444]: time="2024-08-05T22:13:07.213793644Z" level=error msg="Failed to destroy network for sandbox \"8c50acd90b9a35b996fe6fdab53bd12bbbfd7a8f245f5a4d560d009635fdaaa1\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Aug 5 22:13:07.214364 containerd[1444]: time="2024-08-05T22:13:07.214329172Z" level=error msg="encountered an error cleaning up failed sandbox \"8c50acd90b9a35b996fe6fdab53bd12bbbfd7a8f245f5a4d560d009635fdaaa1\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Aug 5 22:13:07.214480 containerd[1444]: time="2024-08-05T22:13:07.214438149Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-58c54bc647-pqf69,Uid:4ecfd1c0-3700-4ed3-84bf-da7987083d57,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"8c50acd90b9a35b996fe6fdab53bd12bbbfd7a8f245f5a4d560d009635fdaaa1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Aug 5 22:13:07.214908 kubelet[2635]: E0805 22:13:07.214841 2635 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8c50acd90b9a35b996fe6fdab53bd12bbbfd7a8f245f5a4d560d009635fdaaa1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Aug 5 22:13:07.215076 kubelet[2635]: E0805 22:13:07.214981 2635 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8c50acd90b9a35b996fe6fdab53bd12bbbfd7a8f245f5a4d560d009635fdaaa1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-58c54bc647-pqf69"
Aug 5 22:13:07.215076 kubelet[2635]: E0805 22:13:07.215015 2635 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8c50acd90b9a35b996fe6fdab53bd12bbbfd7a8f245f5a4d560d009635fdaaa1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-58c54bc647-pqf69"
Aug 5 22:13:07.215811 kubelet[2635]: E0805 22:13:07.215786 2635 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-58c54bc647-pqf69_calico-system(4ecfd1c0-3700-4ed3-84bf-da7987083d57)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-58c54bc647-pqf69_calico-system(4ecfd1c0-3700-4ed3-84bf-da7987083d57)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"8c50acd90b9a35b996fe6fdab53bd12bbbfd7a8f245f5a4d560d009635fdaaa1\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-58c54bc647-pqf69" podUID="4ecfd1c0-3700-4ed3-84bf-da7987083d57"
Aug 5 22:13:07.629136 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-52609d5815f48c3224a2955f13cc78c5198781e782cc09339cbdbd4863996ad0-shm.mount: Deactivated successfully.
Aug 5 22:13:07.991990 kubelet[2635]: I0805 22:13:07.991692 2635 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8d2607fead9c533ba1c6f4efe579b7f9aa3d73d5750c29a6449e651cd173f193"
Aug 5 22:13:08.001307 containerd[1444]: time="2024-08-05T22:13:07.997713170Z" level=info msg="StopPodSandbox for \"8d2607fead9c533ba1c6f4efe579b7f9aa3d73d5750c29a6449e651cd173f193\""
Aug 5 22:13:08.001307 containerd[1444]: time="2024-08-05T22:13:07.999884248Z" level=info msg="Ensure that sandbox 8d2607fead9c533ba1c6f4efe579b7f9aa3d73d5750c29a6449e651cd173f193 in task-service has been cleanup successfully"
Aug 5 22:13:08.001307 containerd[1444]: time="2024-08-05T22:13:08.000366645Z" level=info msg="StopPodSandbox for \"8c50acd90b9a35b996fe6fdab53bd12bbbfd7a8f245f5a4d560d009635fdaaa1\""
Aug 5 22:13:08.001307 containerd[1444]: time="2024-08-05T22:13:08.000828554Z" level=info msg="Ensure that sandbox 8c50acd90b9a35b996fe6fdab53bd12bbbfd7a8f245f5a4d560d009635fdaaa1 in task-service has been cleanup successfully"
Aug 5 22:13:08.002915 kubelet[2635]: I0805 22:13:07.998845 2635 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8c50acd90b9a35b996fe6fdab53bd12bbbfd7a8f245f5a4d560d009635fdaaa1"
Aug 5 22:13:08.010316 kubelet[2635]: I0805 22:13:08.009598 2635 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c44015cdd53b711bcd0d788d4f0c0a8bccb66bdbdff2b2e021822855fceb70cd"
Aug 5 22:13:08.017207 containerd[1444]: time="2024-08-05T22:13:08.016584257Z" level=info msg="StopPodSandbox for \"c44015cdd53b711bcd0d788d4f0c0a8bccb66bdbdff2b2e021822855fceb70cd\""
Aug 5 22:13:08.018961 containerd[1444]: time="2024-08-05T22:13:08.018775393Z" level=info msg="Ensure that sandbox c44015cdd53b711bcd0d788d4f0c0a8bccb66bdbdff2b2e021822855fceb70cd in task-service has been cleanup successfully"
Aug 5 22:13:08.093524 containerd[1444]: time="2024-08-05T22:13:08.093472730Z" level=error msg="StopPodSandbox for \"c44015cdd53b711bcd0d788d4f0c0a8bccb66bdbdff2b2e021822855fceb70cd\" failed" error="failed to destroy network for sandbox \"c44015cdd53b711bcd0d788d4f0c0a8bccb66bdbdff2b2e021822855fceb70cd\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Aug 5 22:13:08.094367 kubelet[2635]: E0805 22:13:08.093919 2635 remote_runtime.go:222] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"c44015cdd53b711bcd0d788d4f0c0a8bccb66bdbdff2b2e021822855fceb70cd\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="c44015cdd53b711bcd0d788d4f0c0a8bccb66bdbdff2b2e021822855fceb70cd"
Aug 5 22:13:08.094367 kubelet[2635]: E0805 22:13:08.093977 2635 kuberuntime_manager.go:1381] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"c44015cdd53b711bcd0d788d4f0c0a8bccb66bdbdff2b2e021822855fceb70cd"}
Aug 5 22:13:08.094367 kubelet[2635]: E0805 22:13:08.094019 2635 kuberuntime_manager.go:1081] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"2932e3f8-d82d-4537-9f9b-b46369d5b6f8\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"c44015cdd53b711bcd0d788d4f0c0a8bccb66bdbdff2b2e021822855fceb70cd\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\""
Aug 5 22:13:08.094367 kubelet[2635]: E0805 22:13:08.094054 2635 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"2932e3f8-d82d-4537-9f9b-b46369d5b6f8\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"c44015cdd53b711bcd0d788d4f0c0a8bccb66bdbdff2b2e021822855fceb70cd\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-76f75df574-lvcgp" podUID="2932e3f8-d82d-4537-9f9b-b46369d5b6f8"
Aug 5 22:13:08.094785 containerd[1444]: time="2024-08-05T22:13:08.094697328Z" level=error msg="StopPodSandbox for \"8d2607fead9c533ba1c6f4efe579b7f9aa3d73d5750c29a6449e651cd173f193\" failed" error="failed to destroy network for sandbox \"8d2607fead9c533ba1c6f4efe579b7f9aa3d73d5750c29a6449e651cd173f193\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Aug 5 22:13:08.094872 kubelet[2635]: E0805 22:13:08.094862 2635 remote_runtime.go:222] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"8d2607fead9c533ba1c6f4efe579b7f9aa3d73d5750c29a6449e651cd173f193\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="8d2607fead9c533ba1c6f4efe579b7f9aa3d73d5750c29a6449e651cd173f193"
Aug 5 22:13:08.094914 kubelet[2635]: E0805 22:13:08.094885 2635 kuberuntime_manager.go:1381] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"8d2607fead9c533ba1c6f4efe579b7f9aa3d73d5750c29a6449e651cd173f193"}
Aug 5 22:13:08.094955 kubelet[2635]: E0805 22:13:08.094919 2635 kuberuntime_manager.go:1081] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"55edd9ef-b537-4c83-ad1e-084f093bbf6f\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d2607fead9c533ba1c6f4efe579b7f9aa3d73d5750c29a6449e651cd173f193\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\""
Aug 5 22:13:08.094955 kubelet[2635]: E0805 22:13:08.094952 2635 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"55edd9ef-b537-4c83-ad1e-084f093bbf6f\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d2607fead9c533ba1c6f4efe579b7f9aa3d73d5750c29a6449e651cd173f193\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-76f75df574-t6q9g" podUID="55edd9ef-b537-4c83-ad1e-084f093bbf6f"
Aug 5 22:13:08.100814 containerd[1444]: time="2024-08-05T22:13:08.100758331Z" level=error msg="StopPodSandbox for \"8c50acd90b9a35b996fe6fdab53bd12bbbfd7a8f245f5a4d560d009635fdaaa1\" failed" error="failed to destroy network for sandbox \"8c50acd90b9a35b996fe6fdab53bd12bbbfd7a8f245f5a4d560d009635fdaaa1\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Aug 5 22:13:08.101322 kubelet[2635]: E0805 22:13:08.101024 2635 remote_runtime.go:222] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"8c50acd90b9a35b996fe6fdab53bd12bbbfd7a8f245f5a4d560d009635fdaaa1\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="8c50acd90b9a35b996fe6fdab53bd12bbbfd7a8f245f5a4d560d009635fdaaa1"
Aug 5 22:13:08.101322 kubelet[2635]: E0805 22:13:08.101080 2635 kuberuntime_manager.go:1381] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"8c50acd90b9a35b996fe6fdab53bd12bbbfd7a8f245f5a4d560d009635fdaaa1"}
Aug 5 22:13:08.101322 kubelet[2635]: E0805 22:13:08.101126 2635 kuberuntime_manager.go:1081] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"4ecfd1c0-3700-4ed3-84bf-da7987083d57\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8c50acd90b9a35b996fe6fdab53bd12bbbfd7a8f245f5a4d560d009635fdaaa1\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\""
Aug 5 22:13:08.101322 kubelet[2635]: E0805 22:13:08.101161 2635 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"4ecfd1c0-3700-4ed3-84bf-da7987083d57\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8c50acd90b9a35b996fe6fdab53bd12bbbfd7a8f245f5a4d560d009635fdaaa1\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-58c54bc647-pqf69" podUID="4ecfd1c0-3700-4ed3-84bf-da7987083d57"
Aug 5 22:13:15.301738 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2913694042.mount: Deactivated successfully.
Aug 5 22:13:15.879039 containerd[1444]: time="2024-08-05T22:13:15.872760579Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.28.0: active requests=0, bytes read=115238750"
Aug 5 22:13:15.887563 containerd[1444]: time="2024-08-05T22:13:15.887491903Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.28.0\" with image id \"sha256:4e42b6f329bc1d197d97f6d2a1289b9e9f4a9560db3a36c8cffb5e95e64e4b49\", repo tag \"ghcr.io/flatcar/calico/node:v3.28.0\", repo digest \"ghcr.io/flatcar/calico/node@sha256:95f8004836427050c9997ad0800819ced5636f6bda647b4158fc7c497910c8d0\", size \"115238612\" in 8.904042045s"
Aug 5 22:13:15.887811 containerd[1444]: time="2024-08-05T22:13:15.866621796Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.28.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Aug 5 22:13:15.888287 containerd[1444]: time="2024-08-05T22:13:15.887742489Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.28.0\" returns image reference \"sha256:4e42b6f329bc1d197d97f6d2a1289b9e9f4a9560db3a36c8cffb5e95e64e4b49\""
Aug 5 22:13:15.895499 containerd[1444]: time="2024-08-05T22:13:15.895092150Z" level=info msg="ImageCreate event name:\"sha256:4e42b6f329bc1d197d97f6d2a1289b9e9f4a9560db3a36c8cffb5e95e64e4b49\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Aug 5 22:13:15.899368 containerd[1444]: time="2024-08-05T22:13:15.898746157Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:95f8004836427050c9997ad0800819ced5636f6bda647b4158fc7c497910c8d0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Aug 5 22:13:15.953564 containerd[1444]: time="2024-08-05T22:13:15.953431815Z" level=info msg="CreateContainer within sandbox \"19660da175825eee0634b406dd0ff91cdc281bc0da9c54da36a454f658a1a9f4\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}"
Aug 5 22:13:15.982023 containerd[1444]: time="2024-08-05T22:13:15.981919551Z" level=info msg="CreateContainer within sandbox \"19660da175825eee0634b406dd0ff91cdc281bc0da9c54da36a454f658a1a9f4\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"81801b50e438d74a07c8b62aedd1cf55c8a0fde7a170d8621b6412d43c922fcd\""
Aug 5 22:13:15.984374 containerd[1444]: time="2024-08-05T22:13:15.983333837Z" level=info msg="StartContainer for \"81801b50e438d74a07c8b62aedd1cf55c8a0fde7a170d8621b6412d43c922fcd\""
Aug 5 22:13:16.024112 systemd[1]: Started cri-containerd-81801b50e438d74a07c8b62aedd1cf55c8a0fde7a170d8621b6412d43c922fcd.scope - libcontainer container 81801b50e438d74a07c8b62aedd1cf55c8a0fde7a170d8621b6412d43c922fcd.
Aug 5 22:13:16.072704 containerd[1444]: time="2024-08-05T22:13:16.072665428Z" level=info msg="StartContainer for \"81801b50e438d74a07c8b62aedd1cf55c8a0fde7a170d8621b6412d43c922fcd\" returns successfully"
Aug 5 22:13:16.175268 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information.
Aug 5 22:13:16.175869 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld. All Rights Reserved.
Aug 5 22:13:17.125960 kubelet[2635]: I0805 22:13:17.125326 2635 pod_startup_latency_tracker.go:102] "Observed pod startup duration" pod="calico-system/calico-node-bl8mm" podStartSLOduration=3.143715896 podStartE2EDuration="21.096069092s" podCreationTimestamp="2024-08-05 22:12:56 +0000 UTC" firstStartedPulling="2024-08-05 22:12:57.936282287 +0000 UTC m=+29.641502676" lastFinishedPulling="2024-08-05 22:13:15.888635434 +0000 UTC m=+47.593855872" observedRunningTime="2024-08-05 22:13:17.095135951 +0000 UTC m=+48.800356409" watchObservedRunningTime="2024-08-05 22:13:17.096069092 +0000 UTC m=+48.801289530"
Aug 5 22:13:18.001448 systemd-networkd[1359]: vxlan.calico: Link UP
Aug 5 22:13:18.001456 systemd-networkd[1359]: vxlan.calico: Gained carrier
Aug 5 22:13:18.121389 systemd[1]: run-containerd-runc-k8s.io-81801b50e438d74a07c8b62aedd1cf55c8a0fde7a170d8621b6412d43c922fcd-runc.tcMcoY.mount: Deactivated successfully.
Aug 5 22:13:19.743529 containerd[1444]: time="2024-08-05T22:13:19.741072508Z" level=info msg="StopPodSandbox for \"52609d5815f48c3224a2955f13cc78c5198781e782cc09339cbdbd4863996ad0\""
Aug 5 22:13:19.819506 systemd-networkd[1359]: vxlan.calico: Gained IPv6LL
Aug 5 22:13:20.401885 containerd[1444]: 2024-08-05 22:13:19.866 [INFO][4253] k8s.go 608: Cleaning up netns ContainerID="52609d5815f48c3224a2955f13cc78c5198781e782cc09339cbdbd4863996ad0"
Aug 5 22:13:20.401885 containerd[1444]: 2024-08-05 22:13:19.870 [INFO][4253] dataplane_linux.go 530: Deleting workload's device in netns. ContainerID="52609d5815f48c3224a2955f13cc78c5198781e782cc09339cbdbd4863996ad0" iface="eth0" netns="/var/run/netns/cni-43f5a6b4-5ade-55b4-c3a6-aa66f4ac1a26"
Aug 5 22:13:20.401885 containerd[1444]: 2024-08-05 22:13:19.870 [INFO][4253] dataplane_linux.go 541: Entered netns, deleting veth. ContainerID="52609d5815f48c3224a2955f13cc78c5198781e782cc09339cbdbd4863996ad0" iface="eth0" netns="/var/run/netns/cni-43f5a6b4-5ade-55b4-c3a6-aa66f4ac1a26"
Aug 5 22:13:20.401885 containerd[1444]: 2024-08-05 22:13:19.870 [INFO][4253] dataplane_linux.go 568: Workload's veth was already gone. Nothing to do. ContainerID="52609d5815f48c3224a2955f13cc78c5198781e782cc09339cbdbd4863996ad0" iface="eth0" netns="/var/run/netns/cni-43f5a6b4-5ade-55b4-c3a6-aa66f4ac1a26"
Aug 5 22:13:20.401885 containerd[1444]: 2024-08-05 22:13:19.871 [INFO][4253] k8s.go 615: Releasing IP address(es) ContainerID="52609d5815f48c3224a2955f13cc78c5198781e782cc09339cbdbd4863996ad0"
Aug 5 22:13:20.401885 containerd[1444]: 2024-08-05 22:13:19.871 [INFO][4253] utils.go 188: Calico CNI releasing IP address ContainerID="52609d5815f48c3224a2955f13cc78c5198781e782cc09339cbdbd4863996ad0"
Aug 5 22:13:20.401885 containerd[1444]: 2024-08-05 22:13:20.372 [INFO][4260] ipam_plugin.go 411: Releasing address using handleID ContainerID="52609d5815f48c3224a2955f13cc78c5198781e782cc09339cbdbd4863996ad0" HandleID="k8s-pod-network.52609d5815f48c3224a2955f13cc78c5198781e782cc09339cbdbd4863996ad0" Workload="ci--3975--2--0--1--de7b5ef465.novalocal-k8s-csi--node--driver--ck8bx-eth0"
Aug 5 22:13:20.401885 containerd[1444]: 2024-08-05 22:13:20.374 [INFO][4260] ipam_plugin.go 352: About to acquire host-wide IPAM lock.
Aug 5 22:13:20.401885 containerd[1444]: 2024-08-05 22:13:20.375 [INFO][4260] ipam_plugin.go 367: Acquired host-wide IPAM lock.
Aug 5 22:13:20.401885 containerd[1444]: 2024-08-05 22:13:20.394 [WARNING][4260] ipam_plugin.go 428: Asked to release address but it doesn't exist.
Ignoring ContainerID="52609d5815f48c3224a2955f13cc78c5198781e782cc09339cbdbd4863996ad0" HandleID="k8s-pod-network.52609d5815f48c3224a2955f13cc78c5198781e782cc09339cbdbd4863996ad0" Workload="ci--3975--2--0--1--de7b5ef465.novalocal-k8s-csi--node--driver--ck8bx-eth0" Aug 5 22:13:20.401885 containerd[1444]: 2024-08-05 22:13:20.394 [INFO][4260] ipam_plugin.go 439: Releasing address using workloadID ContainerID="52609d5815f48c3224a2955f13cc78c5198781e782cc09339cbdbd4863996ad0" HandleID="k8s-pod-network.52609d5815f48c3224a2955f13cc78c5198781e782cc09339cbdbd4863996ad0" Workload="ci--3975--2--0--1--de7b5ef465.novalocal-k8s-csi--node--driver--ck8bx-eth0" Aug 5 22:13:20.401885 containerd[1444]: 2024-08-05 22:13:20.396 [INFO][4260] ipam_plugin.go 373: Released host-wide IPAM lock. Aug 5 22:13:20.401885 containerd[1444]: 2024-08-05 22:13:20.397 [INFO][4253] k8s.go 621: Teardown processing complete. ContainerID="52609d5815f48c3224a2955f13cc78c5198781e782cc09339cbdbd4863996ad0" Aug 5 22:13:20.403376 containerd[1444]: time="2024-08-05T22:13:20.402767765Z" level=info msg="TearDown network for sandbox \"52609d5815f48c3224a2955f13cc78c5198781e782cc09339cbdbd4863996ad0\" successfully" Aug 5 22:13:20.403376 containerd[1444]: time="2024-08-05T22:13:20.402867404Z" level=info msg="StopPodSandbox for \"52609d5815f48c3224a2955f13cc78c5198781e782cc09339cbdbd4863996ad0\" returns successfully" Aug 5 22:13:20.408192 containerd[1444]: time="2024-08-05T22:13:20.407602315Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-ck8bx,Uid:2d1ea0ca-eeda-4305-a298-4df807e6a886,Namespace:calico-system,Attempt:1,}" Aug 5 22:13:20.409029 systemd[1]: run-netns-cni\x2d43f5a6b4\x2d5ade\x2d55b4\x2dc3a6\x2daa66f4ac1a26.mount: Deactivated successfully. 
Aug 5 22:13:20.638177 systemd-networkd[1359]: cali856eccd5c50: Link UP Aug 5 22:13:20.642628 systemd-networkd[1359]: cali856eccd5c50: Gained carrier Aug 5 22:13:20.664715 containerd[1444]: 2024-08-05 22:13:20.515 [INFO][4267] plugin.go 326: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--3975--2--0--1--de7b5ef465.novalocal-k8s-csi--node--driver--ck8bx-eth0 csi-node-driver- calico-system 2d1ea0ca-eeda-4305-a298-4df807e6a886 809 0 2024-08-05 22:12:49 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:7d7f6c786c k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:default] map[] [] [] []} {k8s ci-3975-2-0-1-de7b5ef465.novalocal csi-node-driver-ck8bx eth0 default [] [] [kns.calico-system ksa.calico-system.default] cali856eccd5c50 [] []}} ContainerID="f064d99279b0982c4ba9b97b4f5a5fe0d6c152caffbb4d5acaea5faf6f8455b9" Namespace="calico-system" Pod="csi-node-driver-ck8bx" WorkloadEndpoint="ci--3975--2--0--1--de7b5ef465.novalocal-k8s-csi--node--driver--ck8bx-" Aug 5 22:13:20.664715 containerd[1444]: 2024-08-05 22:13:20.515 [INFO][4267] k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="f064d99279b0982c4ba9b97b4f5a5fe0d6c152caffbb4d5acaea5faf6f8455b9" Namespace="calico-system" Pod="csi-node-driver-ck8bx" WorkloadEndpoint="ci--3975--2--0--1--de7b5ef465.novalocal-k8s-csi--node--driver--ck8bx-eth0" Aug 5 22:13:20.664715 containerd[1444]: 2024-08-05 22:13:20.565 [INFO][4277] ipam_plugin.go 224: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="f064d99279b0982c4ba9b97b4f5a5fe0d6c152caffbb4d5acaea5faf6f8455b9" HandleID="k8s-pod-network.f064d99279b0982c4ba9b97b4f5a5fe0d6c152caffbb4d5acaea5faf6f8455b9" Workload="ci--3975--2--0--1--de7b5ef465.novalocal-k8s-csi--node--driver--ck8bx-eth0" Aug 5 22:13:20.664715 containerd[1444]: 2024-08-05 22:13:20.584 [INFO][4277] ipam_plugin.go 264: 
Auto assigning IP ContainerID="f064d99279b0982c4ba9b97b4f5a5fe0d6c152caffbb4d5acaea5faf6f8455b9" HandleID="k8s-pod-network.f064d99279b0982c4ba9b97b4f5a5fe0d6c152caffbb4d5acaea5faf6f8455b9" Workload="ci--3975--2--0--1--de7b5ef465.novalocal-k8s-csi--node--driver--ck8bx-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000290110), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-3975-2-0-1-de7b5ef465.novalocal", "pod":"csi-node-driver-ck8bx", "timestamp":"2024-08-05 22:13:20.565321793 +0000 UTC"}, Hostname:"ci-3975-2-0-1-de7b5ef465.novalocal", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Aug 5 22:13:20.664715 containerd[1444]: 2024-08-05 22:13:20.584 [INFO][4277] ipam_plugin.go 352: About to acquire host-wide IPAM lock. Aug 5 22:13:20.664715 containerd[1444]: 2024-08-05 22:13:20.584 [INFO][4277] ipam_plugin.go 367: Acquired host-wide IPAM lock. 
Aug 5 22:13:20.664715 containerd[1444]: 2024-08-05 22:13:20.585 [INFO][4277] ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-3975-2-0-1-de7b5ef465.novalocal' Aug 5 22:13:20.664715 containerd[1444]: 2024-08-05 22:13:20.588 [INFO][4277] ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.f064d99279b0982c4ba9b97b4f5a5fe0d6c152caffbb4d5acaea5faf6f8455b9" host="ci-3975-2-0-1-de7b5ef465.novalocal" Aug 5 22:13:20.664715 containerd[1444]: 2024-08-05 22:13:20.599 [INFO][4277] ipam.go 372: Looking up existing affinities for host host="ci-3975-2-0-1-de7b5ef465.novalocal" Aug 5 22:13:20.664715 containerd[1444]: 2024-08-05 22:13:20.604 [INFO][4277] ipam.go 489: Trying affinity for 192.168.92.128/26 host="ci-3975-2-0-1-de7b5ef465.novalocal" Aug 5 22:13:20.664715 containerd[1444]: 2024-08-05 22:13:20.608 [INFO][4277] ipam.go 155: Attempting to load block cidr=192.168.92.128/26 host="ci-3975-2-0-1-de7b5ef465.novalocal" Aug 5 22:13:20.664715 containerd[1444]: 2024-08-05 22:13:20.611 [INFO][4277] ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.92.128/26 host="ci-3975-2-0-1-de7b5ef465.novalocal" Aug 5 22:13:20.664715 containerd[1444]: 2024-08-05 22:13:20.611 [INFO][4277] ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.92.128/26 handle="k8s-pod-network.f064d99279b0982c4ba9b97b4f5a5fe0d6c152caffbb4d5acaea5faf6f8455b9" host="ci-3975-2-0-1-de7b5ef465.novalocal" Aug 5 22:13:20.664715 containerd[1444]: 2024-08-05 22:13:20.613 [INFO][4277] ipam.go 1685: Creating new handle: k8s-pod-network.f064d99279b0982c4ba9b97b4f5a5fe0d6c152caffbb4d5acaea5faf6f8455b9 Aug 5 22:13:20.664715 containerd[1444]: 2024-08-05 22:13:20.617 [INFO][4277] ipam.go 1203: Writing block in order to claim IPs block=192.168.92.128/26 handle="k8s-pod-network.f064d99279b0982c4ba9b97b4f5a5fe0d6c152caffbb4d5acaea5faf6f8455b9" host="ci-3975-2-0-1-de7b5ef465.novalocal" Aug 5 22:13:20.664715 containerd[1444]: 2024-08-05 22:13:20.626 [INFO][4277] 
ipam.go 1216: Successfully claimed IPs: [192.168.92.129/26] block=192.168.92.128/26 handle="k8s-pod-network.f064d99279b0982c4ba9b97b4f5a5fe0d6c152caffbb4d5acaea5faf6f8455b9" host="ci-3975-2-0-1-de7b5ef465.novalocal" Aug 5 22:13:20.664715 containerd[1444]: 2024-08-05 22:13:20.627 [INFO][4277] ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.92.129/26] handle="k8s-pod-network.f064d99279b0982c4ba9b97b4f5a5fe0d6c152caffbb4d5acaea5faf6f8455b9" host="ci-3975-2-0-1-de7b5ef465.novalocal" Aug 5 22:13:20.664715 containerd[1444]: 2024-08-05 22:13:20.627 [INFO][4277] ipam_plugin.go 373: Released host-wide IPAM lock. Aug 5 22:13:20.664715 containerd[1444]: 2024-08-05 22:13:20.627 [INFO][4277] ipam_plugin.go 282: Calico CNI IPAM assigned addresses IPv4=[192.168.92.129/26] IPv6=[] ContainerID="f064d99279b0982c4ba9b97b4f5a5fe0d6c152caffbb4d5acaea5faf6f8455b9" HandleID="k8s-pod-network.f064d99279b0982c4ba9b97b4f5a5fe0d6c152caffbb4d5acaea5faf6f8455b9" Workload="ci--3975--2--0--1--de7b5ef465.novalocal-k8s-csi--node--driver--ck8bx-eth0" Aug 5 22:13:20.666945 containerd[1444]: 2024-08-05 22:13:20.631 [INFO][4267] k8s.go 386: Populated endpoint ContainerID="f064d99279b0982c4ba9b97b4f5a5fe0d6c152caffbb4d5acaea5faf6f8455b9" Namespace="calico-system" Pod="csi-node-driver-ck8bx" WorkloadEndpoint="ci--3975--2--0--1--de7b5ef465.novalocal-k8s-csi--node--driver--ck8bx-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--3975--2--0--1--de7b5ef465.novalocal-k8s-csi--node--driver--ck8bx-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"2d1ea0ca-eeda-4305-a298-4df807e6a886", ResourceVersion:"809", Generation:0, CreationTimestamp:time.Date(2024, time.August, 5, 22, 12, 49, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"7d7f6c786c", 
"k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"default"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-3975-2-0-1-de7b5ef465.novalocal", ContainerID:"", Pod:"csi-node-driver-ck8bx", Endpoint:"eth0", ServiceAccountName:"default", IPNetworks:[]string{"192.168.92.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.default"}, InterfaceName:"cali856eccd5c50", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Aug 5 22:13:20.666945 containerd[1444]: 2024-08-05 22:13:20.631 [INFO][4267] k8s.go 387: Calico CNI using IPs: [192.168.92.129/32] ContainerID="f064d99279b0982c4ba9b97b4f5a5fe0d6c152caffbb4d5acaea5faf6f8455b9" Namespace="calico-system" Pod="csi-node-driver-ck8bx" WorkloadEndpoint="ci--3975--2--0--1--de7b5ef465.novalocal-k8s-csi--node--driver--ck8bx-eth0" Aug 5 22:13:20.666945 containerd[1444]: 2024-08-05 22:13:20.631 [INFO][4267] dataplane_linux.go 68: Setting the host side veth name to cali856eccd5c50 ContainerID="f064d99279b0982c4ba9b97b4f5a5fe0d6c152caffbb4d5acaea5faf6f8455b9" Namespace="calico-system" Pod="csi-node-driver-ck8bx" WorkloadEndpoint="ci--3975--2--0--1--de7b5ef465.novalocal-k8s-csi--node--driver--ck8bx-eth0" Aug 5 22:13:20.666945 containerd[1444]: 2024-08-05 22:13:20.641 [INFO][4267] dataplane_linux.go 479: Disabling IPv4 forwarding ContainerID="f064d99279b0982c4ba9b97b4f5a5fe0d6c152caffbb4d5acaea5faf6f8455b9" Namespace="calico-system" Pod="csi-node-driver-ck8bx" WorkloadEndpoint="ci--3975--2--0--1--de7b5ef465.novalocal-k8s-csi--node--driver--ck8bx-eth0" Aug 5 22:13:20.666945 containerd[1444]: 2024-08-05 
22:13:20.644 [INFO][4267] k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="f064d99279b0982c4ba9b97b4f5a5fe0d6c152caffbb4d5acaea5faf6f8455b9" Namespace="calico-system" Pod="csi-node-driver-ck8bx" WorkloadEndpoint="ci--3975--2--0--1--de7b5ef465.novalocal-k8s-csi--node--driver--ck8bx-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--3975--2--0--1--de7b5ef465.novalocal-k8s-csi--node--driver--ck8bx-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"2d1ea0ca-eeda-4305-a298-4df807e6a886", ResourceVersion:"809", Generation:0, CreationTimestamp:time.Date(2024, time.August, 5, 22, 12, 49, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"7d7f6c786c", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"default"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-3975-2-0-1-de7b5ef465.novalocal", ContainerID:"f064d99279b0982c4ba9b97b4f5a5fe0d6c152caffbb4d5acaea5faf6f8455b9", Pod:"csi-node-driver-ck8bx", Endpoint:"eth0", ServiceAccountName:"default", IPNetworks:[]string{"192.168.92.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.default"}, InterfaceName:"cali856eccd5c50", MAC:"4a:27:c0:61:1a:c0", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Aug 5 22:13:20.666945 containerd[1444]: 2024-08-05 22:13:20.660 [INFO][4267] k8s.go 500: Wrote updated endpoint 
to datastore ContainerID="f064d99279b0982c4ba9b97b4f5a5fe0d6c152caffbb4d5acaea5faf6f8455b9" Namespace="calico-system" Pod="csi-node-driver-ck8bx" WorkloadEndpoint="ci--3975--2--0--1--de7b5ef465.novalocal-k8s-csi--node--driver--ck8bx-eth0" Aug 5 22:13:20.733594 containerd[1444]: time="2024-08-05T22:13:20.732972418Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Aug 5 22:13:20.733594 containerd[1444]: time="2024-08-05T22:13:20.733040598Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Aug 5 22:13:20.735196 containerd[1444]: time="2024-08-05T22:13:20.735026955Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Aug 5 22:13:20.735196 containerd[1444]: time="2024-08-05T22:13:20.735067081Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Aug 5 22:13:20.780773 systemd[1]: Started cri-containerd-f064d99279b0982c4ba9b97b4f5a5fe0d6c152caffbb4d5acaea5faf6f8455b9.scope - libcontainer container f064d99279b0982c4ba9b97b4f5a5fe0d6c152caffbb4d5acaea5faf6f8455b9. Aug 5 22:13:20.827177 containerd[1444]: time="2024-08-05T22:13:20.827130175Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-ck8bx,Uid:2d1ea0ca-eeda-4305-a298-4df807e6a886,Namespace:calico-system,Attempt:1,} returns sandbox id \"f064d99279b0982c4ba9b97b4f5a5fe0d6c152caffbb4d5acaea5faf6f8455b9\"" Aug 5 22:13:20.845841 containerd[1444]: time="2024-08-05T22:13:20.845572529Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.28.0\"" Aug 5 22:13:21.412432 systemd[1]: run-containerd-runc-k8s.io-f064d99279b0982c4ba9b97b4f5a5fe0d6c152caffbb4d5acaea5faf6f8455b9-runc.OO7rSP.mount: Deactivated successfully. 
Aug 5 22:13:21.742446 containerd[1444]: time="2024-08-05T22:13:21.742314155Z" level=info msg="StopPodSandbox for \"8c50acd90b9a35b996fe6fdab53bd12bbbfd7a8f245f5a4d560d009635fdaaa1\"" Aug 5 22:13:21.744030 containerd[1444]: time="2024-08-05T22:13:21.743195996Z" level=info msg="StopPodSandbox for \"8d2607fead9c533ba1c6f4efe579b7f9aa3d73d5750c29a6449e651cd173f193\"" Aug 5 22:13:21.915423 containerd[1444]: 2024-08-05 22:13:21.821 [INFO][4364] k8s.go 608: Cleaning up netns ContainerID="8c50acd90b9a35b996fe6fdab53bd12bbbfd7a8f245f5a4d560d009635fdaaa1" Aug 5 22:13:21.915423 containerd[1444]: 2024-08-05 22:13:21.821 [INFO][4364] dataplane_linux.go 530: Deleting workload's device in netns. ContainerID="8c50acd90b9a35b996fe6fdab53bd12bbbfd7a8f245f5a4d560d009635fdaaa1" iface="eth0" netns="/var/run/netns/cni-59728f46-16a2-6dfa-a20d-9b83722e576f" Aug 5 22:13:21.915423 containerd[1444]: 2024-08-05 22:13:21.822 [INFO][4364] dataplane_linux.go 541: Entered netns, deleting veth. ContainerID="8c50acd90b9a35b996fe6fdab53bd12bbbfd7a8f245f5a4d560d009635fdaaa1" iface="eth0" netns="/var/run/netns/cni-59728f46-16a2-6dfa-a20d-9b83722e576f" Aug 5 22:13:21.915423 containerd[1444]: 2024-08-05 22:13:21.823 [INFO][4364] dataplane_linux.go 568: Workload's veth was already gone. Nothing to do. 
ContainerID="8c50acd90b9a35b996fe6fdab53bd12bbbfd7a8f245f5a4d560d009635fdaaa1" iface="eth0" netns="/var/run/netns/cni-59728f46-16a2-6dfa-a20d-9b83722e576f" Aug 5 22:13:21.915423 containerd[1444]: 2024-08-05 22:13:21.823 [INFO][4364] k8s.go 615: Releasing IP address(es) ContainerID="8c50acd90b9a35b996fe6fdab53bd12bbbfd7a8f245f5a4d560d009635fdaaa1" Aug 5 22:13:21.915423 containerd[1444]: 2024-08-05 22:13:21.823 [INFO][4364] utils.go 188: Calico CNI releasing IP address ContainerID="8c50acd90b9a35b996fe6fdab53bd12bbbfd7a8f245f5a4d560d009635fdaaa1" Aug 5 22:13:21.915423 containerd[1444]: 2024-08-05 22:13:21.893 [INFO][4376] ipam_plugin.go 411: Releasing address using handleID ContainerID="8c50acd90b9a35b996fe6fdab53bd12bbbfd7a8f245f5a4d560d009635fdaaa1" HandleID="k8s-pod-network.8c50acd90b9a35b996fe6fdab53bd12bbbfd7a8f245f5a4d560d009635fdaaa1" Workload="ci--3975--2--0--1--de7b5ef465.novalocal-k8s-calico--kube--controllers--58c54bc647--pqf69-eth0" Aug 5 22:13:21.915423 containerd[1444]: 2024-08-05 22:13:21.893 [INFO][4376] ipam_plugin.go 352: About to acquire host-wide IPAM lock. Aug 5 22:13:21.915423 containerd[1444]: 2024-08-05 22:13:21.893 [INFO][4376] ipam_plugin.go 367: Acquired host-wide IPAM lock. Aug 5 22:13:21.915423 containerd[1444]: 2024-08-05 22:13:21.903 [WARNING][4376] ipam_plugin.go 428: Asked to release address but it doesn't exist. 
Ignoring ContainerID="8c50acd90b9a35b996fe6fdab53bd12bbbfd7a8f245f5a4d560d009635fdaaa1" HandleID="k8s-pod-network.8c50acd90b9a35b996fe6fdab53bd12bbbfd7a8f245f5a4d560d009635fdaaa1" Workload="ci--3975--2--0--1--de7b5ef465.novalocal-k8s-calico--kube--controllers--58c54bc647--pqf69-eth0" Aug 5 22:13:21.915423 containerd[1444]: 2024-08-05 22:13:21.903 [INFO][4376] ipam_plugin.go 439: Releasing address using workloadID ContainerID="8c50acd90b9a35b996fe6fdab53bd12bbbfd7a8f245f5a4d560d009635fdaaa1" HandleID="k8s-pod-network.8c50acd90b9a35b996fe6fdab53bd12bbbfd7a8f245f5a4d560d009635fdaaa1" Workload="ci--3975--2--0--1--de7b5ef465.novalocal-k8s-calico--kube--controllers--58c54bc647--pqf69-eth0" Aug 5 22:13:21.915423 containerd[1444]: 2024-08-05 22:13:21.905 [INFO][4376] ipam_plugin.go 373: Released host-wide IPAM lock. Aug 5 22:13:21.915423 containerd[1444]: 2024-08-05 22:13:21.911 [INFO][4364] k8s.go 621: Teardown processing complete. ContainerID="8c50acd90b9a35b996fe6fdab53bd12bbbfd7a8f245f5a4d560d009635fdaaa1" Aug 5 22:13:21.921903 containerd[1444]: time="2024-08-05T22:13:21.920782079Z" level=info msg="TearDown network for sandbox \"8c50acd90b9a35b996fe6fdab53bd12bbbfd7a8f245f5a4d560d009635fdaaa1\" successfully" Aug 5 22:13:21.921903 containerd[1444]: time="2024-08-05T22:13:21.920814089Z" level=info msg="StopPodSandbox for \"8c50acd90b9a35b996fe6fdab53bd12bbbfd7a8f245f5a4d560d009635fdaaa1\" returns successfully" Aug 5 22:13:21.921903 containerd[1444]: time="2024-08-05T22:13:21.921489050Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-58c54bc647-pqf69,Uid:4ecfd1c0-3700-4ed3-84bf-da7987083d57,Namespace:calico-system,Attempt:1,}" Aug 5 22:13:21.925328 systemd[1]: run-netns-cni\x2d59728f46\x2d16a2\x2d6dfa\x2da20d\x2d9b83722e576f.mount: Deactivated successfully. 
Aug 5 22:13:21.936078 containerd[1444]: 2024-08-05 22:13:21.853 [INFO][4365] k8s.go 608: Cleaning up netns ContainerID="8d2607fead9c533ba1c6f4efe579b7f9aa3d73d5750c29a6449e651cd173f193" Aug 5 22:13:21.936078 containerd[1444]: 2024-08-05 22:13:21.854 [INFO][4365] dataplane_linux.go 530: Deleting workload's device in netns. ContainerID="8d2607fead9c533ba1c6f4efe579b7f9aa3d73d5750c29a6449e651cd173f193" iface="eth0" netns="/var/run/netns/cni-76899f7d-5df4-e28e-d696-06fc7d2f1ff2" Aug 5 22:13:21.936078 containerd[1444]: 2024-08-05 22:13:21.855 [INFO][4365] dataplane_linux.go 541: Entered netns, deleting veth. ContainerID="8d2607fead9c533ba1c6f4efe579b7f9aa3d73d5750c29a6449e651cd173f193" iface="eth0" netns="/var/run/netns/cni-76899f7d-5df4-e28e-d696-06fc7d2f1ff2" Aug 5 22:13:21.936078 containerd[1444]: 2024-08-05 22:13:21.855 [INFO][4365] dataplane_linux.go 568: Workload's veth was already gone. Nothing to do. ContainerID="8d2607fead9c533ba1c6f4efe579b7f9aa3d73d5750c29a6449e651cd173f193" iface="eth0" netns="/var/run/netns/cni-76899f7d-5df4-e28e-d696-06fc7d2f1ff2" Aug 5 22:13:21.936078 containerd[1444]: 2024-08-05 22:13:21.855 [INFO][4365] k8s.go 615: Releasing IP address(es) ContainerID="8d2607fead9c533ba1c6f4efe579b7f9aa3d73d5750c29a6449e651cd173f193" Aug 5 22:13:21.936078 containerd[1444]: 2024-08-05 22:13:21.856 [INFO][4365] utils.go 188: Calico CNI releasing IP address ContainerID="8d2607fead9c533ba1c6f4efe579b7f9aa3d73d5750c29a6449e651cd173f193" Aug 5 22:13:21.936078 containerd[1444]: 2024-08-05 22:13:21.912 [INFO][4381] ipam_plugin.go 411: Releasing address using handleID ContainerID="8d2607fead9c533ba1c6f4efe579b7f9aa3d73d5750c29a6449e651cd173f193" HandleID="k8s-pod-network.8d2607fead9c533ba1c6f4efe579b7f9aa3d73d5750c29a6449e651cd173f193" Workload="ci--3975--2--0--1--de7b5ef465.novalocal-k8s-coredns--76f75df574--t6q9g-eth0" Aug 5 22:13:21.936078 containerd[1444]: 2024-08-05 22:13:21.912 [INFO][4381] ipam_plugin.go 352: About to acquire host-wide IPAM lock. 
Aug 5 22:13:21.936078 containerd[1444]: 2024-08-05 22:13:21.912 [INFO][4381] ipam_plugin.go 367: Acquired host-wide IPAM lock. Aug 5 22:13:21.936078 containerd[1444]: 2024-08-05 22:13:21.927 [WARNING][4381] ipam_plugin.go 428: Asked to release address but it doesn't exist. Ignoring ContainerID="8d2607fead9c533ba1c6f4efe579b7f9aa3d73d5750c29a6449e651cd173f193" HandleID="k8s-pod-network.8d2607fead9c533ba1c6f4efe579b7f9aa3d73d5750c29a6449e651cd173f193" Workload="ci--3975--2--0--1--de7b5ef465.novalocal-k8s-coredns--76f75df574--t6q9g-eth0" Aug 5 22:13:21.936078 containerd[1444]: 2024-08-05 22:13:21.927 [INFO][4381] ipam_plugin.go 439: Releasing address using workloadID ContainerID="8d2607fead9c533ba1c6f4efe579b7f9aa3d73d5750c29a6449e651cd173f193" HandleID="k8s-pod-network.8d2607fead9c533ba1c6f4efe579b7f9aa3d73d5750c29a6449e651cd173f193" Workload="ci--3975--2--0--1--de7b5ef465.novalocal-k8s-coredns--76f75df574--t6q9g-eth0" Aug 5 22:13:21.936078 containerd[1444]: 2024-08-05 22:13:21.929 [INFO][4381] ipam_plugin.go 373: Released host-wide IPAM lock. Aug 5 22:13:21.936078 containerd[1444]: 2024-08-05 22:13:21.934 [INFO][4365] k8s.go 621: Teardown processing complete. 
ContainerID="8d2607fead9c533ba1c6f4efe579b7f9aa3d73d5750c29a6449e651cd173f193" Aug 5 22:13:21.939507 containerd[1444]: time="2024-08-05T22:13:21.936195583Z" level=info msg="TearDown network for sandbox \"8d2607fead9c533ba1c6f4efe579b7f9aa3d73d5750c29a6449e651cd173f193\" successfully" Aug 5 22:13:21.939507 containerd[1444]: time="2024-08-05T22:13:21.936244166Z" level=info msg="StopPodSandbox for \"8d2607fead9c533ba1c6f4efe579b7f9aa3d73d5750c29a6449e651cd173f193\" returns successfully" Aug 5 22:13:21.939507 containerd[1444]: time="2024-08-05T22:13:21.937846164Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-76f75df574-t6q9g,Uid:55edd9ef-b537-4c83-ad1e-084f093bbf6f,Namespace:kube-system,Attempt:1,}" Aug 5 22:13:21.941160 systemd[1]: run-netns-cni\x2d76899f7d\x2d5df4\x2de28e\x2dd696\x2d06fc7d2f1ff2.mount: Deactivated successfully. Aug 5 22:13:22.178718 systemd-networkd[1359]: cali32125d64fa6: Link UP Aug 5 22:13:22.179417 systemd-networkd[1359]: cali32125d64fa6: Gained carrier Aug 5 22:13:22.208454 containerd[1444]: 2024-08-05 22:13:22.033 [INFO][4390] plugin.go 326: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--3975--2--0--1--de7b5ef465.novalocal-k8s-calico--kube--controllers--58c54bc647--pqf69-eth0 calico-kube-controllers-58c54bc647- calico-system 4ecfd1c0-3700-4ed3-84bf-da7987083d57 820 0 2024-08-05 22:12:50 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:58c54bc647 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ci-3975-2-0-1-de7b5ef465.novalocal calico-kube-controllers-58c54bc647-pqf69 eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali32125d64fa6 [] []}} ContainerID="35145b36206ed650826e52af4918c7250250684c9fa638a8c35efe287ddd8358" Namespace="calico-system" 
Pod="calico-kube-controllers-58c54bc647-pqf69" WorkloadEndpoint="ci--3975--2--0--1--de7b5ef465.novalocal-k8s-calico--kube--controllers--58c54bc647--pqf69-" Aug 5 22:13:22.208454 containerd[1444]: 2024-08-05 22:13:22.033 [INFO][4390] k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="35145b36206ed650826e52af4918c7250250684c9fa638a8c35efe287ddd8358" Namespace="calico-system" Pod="calico-kube-controllers-58c54bc647-pqf69" WorkloadEndpoint="ci--3975--2--0--1--de7b5ef465.novalocal-k8s-calico--kube--controllers--58c54bc647--pqf69-eth0" Aug 5 22:13:22.208454 containerd[1444]: 2024-08-05 22:13:22.093 [INFO][4411] ipam_plugin.go 224: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="35145b36206ed650826e52af4918c7250250684c9fa638a8c35efe287ddd8358" HandleID="k8s-pod-network.35145b36206ed650826e52af4918c7250250684c9fa638a8c35efe287ddd8358" Workload="ci--3975--2--0--1--de7b5ef465.novalocal-k8s-calico--kube--controllers--58c54bc647--pqf69-eth0" Aug 5 22:13:22.208454 containerd[1444]: 2024-08-05 22:13:22.119 [INFO][4411] ipam_plugin.go 264: Auto assigning IP ContainerID="35145b36206ed650826e52af4918c7250250684c9fa638a8c35efe287ddd8358" HandleID="k8s-pod-network.35145b36206ed650826e52af4918c7250250684c9fa638a8c35efe287ddd8358" Workload="ci--3975--2--0--1--de7b5ef465.novalocal-k8s-calico--kube--controllers--58c54bc647--pqf69-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000364a30), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-3975-2-0-1-de7b5ef465.novalocal", "pod":"calico-kube-controllers-58c54bc647-pqf69", "timestamp":"2024-08-05 22:13:22.093467216 +0000 UTC"}, Hostname:"ci-3975-2-0-1-de7b5ef465.novalocal", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Aug 5 22:13:22.208454 containerd[1444]: 2024-08-05 22:13:22.119 [INFO][4411] ipam_plugin.go 352: About to acquire 
host-wide IPAM lock. Aug 5 22:13:22.208454 containerd[1444]: 2024-08-05 22:13:22.120 [INFO][4411] ipam_plugin.go 367: Acquired host-wide IPAM lock. Aug 5 22:13:22.208454 containerd[1444]: 2024-08-05 22:13:22.121 [INFO][4411] ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-3975-2-0-1-de7b5ef465.novalocal' Aug 5 22:13:22.208454 containerd[1444]: 2024-08-05 22:13:22.123 [INFO][4411] ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.35145b36206ed650826e52af4918c7250250684c9fa638a8c35efe287ddd8358" host="ci-3975-2-0-1-de7b5ef465.novalocal" Aug 5 22:13:22.208454 containerd[1444]: 2024-08-05 22:13:22.128 [INFO][4411] ipam.go 372: Looking up existing affinities for host host="ci-3975-2-0-1-de7b5ef465.novalocal" Aug 5 22:13:22.208454 containerd[1444]: 2024-08-05 22:13:22.135 [INFO][4411] ipam.go 489: Trying affinity for 192.168.92.128/26 host="ci-3975-2-0-1-de7b5ef465.novalocal" Aug 5 22:13:22.208454 containerd[1444]: 2024-08-05 22:13:22.140 [INFO][4411] ipam.go 155: Attempting to load block cidr=192.168.92.128/26 host="ci-3975-2-0-1-de7b5ef465.novalocal" Aug 5 22:13:22.208454 containerd[1444]: 2024-08-05 22:13:22.144 [INFO][4411] ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.92.128/26 host="ci-3975-2-0-1-de7b5ef465.novalocal" Aug 5 22:13:22.208454 containerd[1444]: 2024-08-05 22:13:22.144 [INFO][4411] ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.92.128/26 handle="k8s-pod-network.35145b36206ed650826e52af4918c7250250684c9fa638a8c35efe287ddd8358" host="ci-3975-2-0-1-de7b5ef465.novalocal" Aug 5 22:13:22.208454 containerd[1444]: 2024-08-05 22:13:22.147 [INFO][4411] ipam.go 1685: Creating new handle: k8s-pod-network.35145b36206ed650826e52af4918c7250250684c9fa638a8c35efe287ddd8358 Aug 5 22:13:22.208454 containerd[1444]: 2024-08-05 22:13:22.154 [INFO][4411] ipam.go 1203: Writing block in order to claim IPs block=192.168.92.128/26 
handle="k8s-pod-network.35145b36206ed650826e52af4918c7250250684c9fa638a8c35efe287ddd8358" host="ci-3975-2-0-1-de7b5ef465.novalocal" Aug 5 22:13:22.208454 containerd[1444]: 2024-08-05 22:13:22.162 [INFO][4411] ipam.go 1216: Successfully claimed IPs: [192.168.92.130/26] block=192.168.92.128/26 handle="k8s-pod-network.35145b36206ed650826e52af4918c7250250684c9fa638a8c35efe287ddd8358" host="ci-3975-2-0-1-de7b5ef465.novalocal" Aug 5 22:13:22.208454 containerd[1444]: 2024-08-05 22:13:22.162 [INFO][4411] ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.92.130/26] handle="k8s-pod-network.35145b36206ed650826e52af4918c7250250684c9fa638a8c35efe287ddd8358" host="ci-3975-2-0-1-de7b5ef465.novalocal" Aug 5 22:13:22.208454 containerd[1444]: 2024-08-05 22:13:22.162 [INFO][4411] ipam_plugin.go 373: Released host-wide IPAM lock. Aug 5 22:13:22.208454 containerd[1444]: 2024-08-05 22:13:22.162 [INFO][4411] ipam_plugin.go 282: Calico CNI IPAM assigned addresses IPv4=[192.168.92.130/26] IPv6=[] ContainerID="35145b36206ed650826e52af4918c7250250684c9fa638a8c35efe287ddd8358" HandleID="k8s-pod-network.35145b36206ed650826e52af4918c7250250684c9fa638a8c35efe287ddd8358" Workload="ci--3975--2--0--1--de7b5ef465.novalocal-k8s-calico--kube--controllers--58c54bc647--pqf69-eth0" Aug 5 22:13:22.210450 containerd[1444]: 2024-08-05 22:13:22.174 [INFO][4390] k8s.go 386: Populated endpoint ContainerID="35145b36206ed650826e52af4918c7250250684c9fa638a8c35efe287ddd8358" Namespace="calico-system" Pod="calico-kube-controllers-58c54bc647-pqf69" WorkloadEndpoint="ci--3975--2--0--1--de7b5ef465.novalocal-k8s-calico--kube--controllers--58c54bc647--pqf69-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--3975--2--0--1--de7b5ef465.novalocal-k8s-calico--kube--controllers--58c54bc647--pqf69-eth0", GenerateName:"calico-kube-controllers-58c54bc647-", Namespace:"calico-system", SelfLink:"", 
UID:"4ecfd1c0-3700-4ed3-84bf-da7987083d57", ResourceVersion:"820", Generation:0, CreationTimestamp:time.Date(2024, time.August, 5, 22, 12, 50, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"58c54bc647", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-3975-2-0-1-de7b5ef465.novalocal", ContainerID:"", Pod:"calico-kube-controllers-58c54bc647-pqf69", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.92.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali32125d64fa6", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Aug 5 22:13:22.210450 containerd[1444]: 2024-08-05 22:13:22.174 [INFO][4390] k8s.go 387: Calico CNI using IPs: [192.168.92.130/32] ContainerID="35145b36206ed650826e52af4918c7250250684c9fa638a8c35efe287ddd8358" Namespace="calico-system" Pod="calico-kube-controllers-58c54bc647-pqf69" WorkloadEndpoint="ci--3975--2--0--1--de7b5ef465.novalocal-k8s-calico--kube--controllers--58c54bc647--pqf69-eth0" Aug 5 22:13:22.210450 containerd[1444]: 2024-08-05 22:13:22.175 [INFO][4390] dataplane_linux.go 68: Setting the host side veth name to cali32125d64fa6 ContainerID="35145b36206ed650826e52af4918c7250250684c9fa638a8c35efe287ddd8358" Namespace="calico-system" Pod="calico-kube-controllers-58c54bc647-pqf69" 
WorkloadEndpoint="ci--3975--2--0--1--de7b5ef465.novalocal-k8s-calico--kube--controllers--58c54bc647--pqf69-eth0" Aug 5 22:13:22.210450 containerd[1444]: 2024-08-05 22:13:22.177 [INFO][4390] dataplane_linux.go 479: Disabling IPv4 forwarding ContainerID="35145b36206ed650826e52af4918c7250250684c9fa638a8c35efe287ddd8358" Namespace="calico-system" Pod="calico-kube-controllers-58c54bc647-pqf69" WorkloadEndpoint="ci--3975--2--0--1--de7b5ef465.novalocal-k8s-calico--kube--controllers--58c54bc647--pqf69-eth0" Aug 5 22:13:22.210450 containerd[1444]: 2024-08-05 22:13:22.181 [INFO][4390] k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="35145b36206ed650826e52af4918c7250250684c9fa638a8c35efe287ddd8358" Namespace="calico-system" Pod="calico-kube-controllers-58c54bc647-pqf69" WorkloadEndpoint="ci--3975--2--0--1--de7b5ef465.novalocal-k8s-calico--kube--controllers--58c54bc647--pqf69-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--3975--2--0--1--de7b5ef465.novalocal-k8s-calico--kube--controllers--58c54bc647--pqf69-eth0", GenerateName:"calico-kube-controllers-58c54bc647-", Namespace:"calico-system", SelfLink:"", UID:"4ecfd1c0-3700-4ed3-84bf-da7987083d57", ResourceVersion:"820", Generation:0, CreationTimestamp:time.Date(2024, time.August, 5, 22, 12, 50, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"58c54bc647", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", 
Node:"ci-3975-2-0-1-de7b5ef465.novalocal", ContainerID:"35145b36206ed650826e52af4918c7250250684c9fa638a8c35efe287ddd8358", Pod:"calico-kube-controllers-58c54bc647-pqf69", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.92.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali32125d64fa6", MAC:"3a:21:a0:c4:3f:cd", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Aug 5 22:13:22.210450 containerd[1444]: 2024-08-05 22:13:22.205 [INFO][4390] k8s.go 500: Wrote updated endpoint to datastore ContainerID="35145b36206ed650826e52af4918c7250250684c9fa638a8c35efe287ddd8358" Namespace="calico-system" Pod="calico-kube-controllers-58c54bc647-pqf69" WorkloadEndpoint="ci--3975--2--0--1--de7b5ef465.novalocal-k8s-calico--kube--controllers--58c54bc647--pqf69-eth0" Aug 5 22:13:22.246432 systemd-networkd[1359]: calib1d4a50183b: Link UP Aug 5 22:13:22.248674 systemd-networkd[1359]: calib1d4a50183b: Gained carrier Aug 5 22:13:22.277356 containerd[1444]: time="2024-08-05T22:13:22.275534208Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Aug 5 22:13:22.277356 containerd[1444]: time="2024-08-05T22:13:22.275620011Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Aug 5 22:13:22.277356 containerd[1444]: time="2024-08-05T22:13:22.275651050Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Aug 5 22:13:22.277356 containerd[1444]: time="2024-08-05T22:13:22.275669965Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Aug 5 22:13:22.282742 containerd[1444]: 2024-08-05 22:13:22.073 [INFO][4399] plugin.go 326: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--3975--2--0--1--de7b5ef465.novalocal-k8s-coredns--76f75df574--t6q9g-eth0 coredns-76f75df574- kube-system 55edd9ef-b537-4c83-ad1e-084f093bbf6f 821 0 2024-08-05 22:12:40 +0000 UTC map[k8s-app:kube-dns pod-template-hash:76f75df574 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-3975-2-0-1-de7b5ef465.novalocal coredns-76f75df574-t6q9g eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] calib1d4a50183b [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] []}} ContainerID="7dd79605bdd06466eb96bf397373cefe6c84be3828c1fc68ad1499f4372d8c2a" Namespace="kube-system" Pod="coredns-76f75df574-t6q9g" WorkloadEndpoint="ci--3975--2--0--1--de7b5ef465.novalocal-k8s-coredns--76f75df574--t6q9g-" Aug 5 22:13:22.282742 containerd[1444]: 2024-08-05 22:13:22.073 [INFO][4399] k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="7dd79605bdd06466eb96bf397373cefe6c84be3828c1fc68ad1499f4372d8c2a" Namespace="kube-system" Pod="coredns-76f75df574-t6q9g" WorkloadEndpoint="ci--3975--2--0--1--de7b5ef465.novalocal-k8s-coredns--76f75df574--t6q9g-eth0" Aug 5 22:13:22.282742 containerd[1444]: 2024-08-05 22:13:22.126 [INFO][4417] ipam_plugin.go 224: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="7dd79605bdd06466eb96bf397373cefe6c84be3828c1fc68ad1499f4372d8c2a" HandleID="k8s-pod-network.7dd79605bdd06466eb96bf397373cefe6c84be3828c1fc68ad1499f4372d8c2a" Workload="ci--3975--2--0--1--de7b5ef465.novalocal-k8s-coredns--76f75df574--t6q9g-eth0" Aug 5 22:13:22.282742 containerd[1444]: 2024-08-05 22:13:22.147 [INFO][4417] ipam_plugin.go 264: Auto assigning IP ContainerID="7dd79605bdd06466eb96bf397373cefe6c84be3828c1fc68ad1499f4372d8c2a" 
HandleID="k8s-pod-network.7dd79605bdd06466eb96bf397373cefe6c84be3828c1fc68ad1499f4372d8c2a" Workload="ci--3975--2--0--1--de7b5ef465.novalocal-k8s-coredns--76f75df574--t6q9g-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000051190), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-3975-2-0-1-de7b5ef465.novalocal", "pod":"coredns-76f75df574-t6q9g", "timestamp":"2024-08-05 22:13:22.126517465 +0000 UTC"}, Hostname:"ci-3975-2-0-1-de7b5ef465.novalocal", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Aug 5 22:13:22.282742 containerd[1444]: 2024-08-05 22:13:22.148 [INFO][4417] ipam_plugin.go 352: About to acquire host-wide IPAM lock. Aug 5 22:13:22.282742 containerd[1444]: 2024-08-05 22:13:22.163 [INFO][4417] ipam_plugin.go 367: Acquired host-wide IPAM lock. Aug 5 22:13:22.282742 containerd[1444]: 2024-08-05 22:13:22.163 [INFO][4417] ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-3975-2-0-1-de7b5ef465.novalocal' Aug 5 22:13:22.282742 containerd[1444]: 2024-08-05 22:13:22.167 [INFO][4417] ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.7dd79605bdd06466eb96bf397373cefe6c84be3828c1fc68ad1499f4372d8c2a" host="ci-3975-2-0-1-de7b5ef465.novalocal" Aug 5 22:13:22.282742 containerd[1444]: 2024-08-05 22:13:22.174 [INFO][4417] ipam.go 372: Looking up existing affinities for host host="ci-3975-2-0-1-de7b5ef465.novalocal" Aug 5 22:13:22.282742 containerd[1444]: 2024-08-05 22:13:22.188 [INFO][4417] ipam.go 489: Trying affinity for 192.168.92.128/26 host="ci-3975-2-0-1-de7b5ef465.novalocal" Aug 5 22:13:22.282742 containerd[1444]: 2024-08-05 22:13:22.194 [INFO][4417] ipam.go 155: Attempting to load block cidr=192.168.92.128/26 host="ci-3975-2-0-1-de7b5ef465.novalocal" Aug 5 22:13:22.282742 containerd[1444]: 2024-08-05 22:13:22.198 [INFO][4417] ipam.go 232: 
Affinity is confirmed and block has been loaded cidr=192.168.92.128/26 host="ci-3975-2-0-1-de7b5ef465.novalocal" Aug 5 22:13:22.282742 containerd[1444]: 2024-08-05 22:13:22.198 [INFO][4417] ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.92.128/26 handle="k8s-pod-network.7dd79605bdd06466eb96bf397373cefe6c84be3828c1fc68ad1499f4372d8c2a" host="ci-3975-2-0-1-de7b5ef465.novalocal" Aug 5 22:13:22.282742 containerd[1444]: 2024-08-05 22:13:22.205 [INFO][4417] ipam.go 1685: Creating new handle: k8s-pod-network.7dd79605bdd06466eb96bf397373cefe6c84be3828c1fc68ad1499f4372d8c2a Aug 5 22:13:22.282742 containerd[1444]: 2024-08-05 22:13:22.215 [INFO][4417] ipam.go 1203: Writing block in order to claim IPs block=192.168.92.128/26 handle="k8s-pod-network.7dd79605bdd06466eb96bf397373cefe6c84be3828c1fc68ad1499f4372d8c2a" host="ci-3975-2-0-1-de7b5ef465.novalocal" Aug 5 22:13:22.282742 containerd[1444]: 2024-08-05 22:13:22.227 [INFO][4417] ipam.go 1216: Successfully claimed IPs: [192.168.92.131/26] block=192.168.92.128/26 handle="k8s-pod-network.7dd79605bdd06466eb96bf397373cefe6c84be3828c1fc68ad1499f4372d8c2a" host="ci-3975-2-0-1-de7b5ef465.novalocal" Aug 5 22:13:22.282742 containerd[1444]: 2024-08-05 22:13:22.228 [INFO][4417] ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.92.131/26] handle="k8s-pod-network.7dd79605bdd06466eb96bf397373cefe6c84be3828c1fc68ad1499f4372d8c2a" host="ci-3975-2-0-1-de7b5ef465.novalocal" Aug 5 22:13:22.282742 containerd[1444]: 2024-08-05 22:13:22.228 [INFO][4417] ipam_plugin.go 373: Released host-wide IPAM lock. 
Aug 5 22:13:22.282742 containerd[1444]: 2024-08-05 22:13:22.228 [INFO][4417] ipam_plugin.go 282: Calico CNI IPAM assigned addresses IPv4=[192.168.92.131/26] IPv6=[] ContainerID="7dd79605bdd06466eb96bf397373cefe6c84be3828c1fc68ad1499f4372d8c2a" HandleID="k8s-pod-network.7dd79605bdd06466eb96bf397373cefe6c84be3828c1fc68ad1499f4372d8c2a" Workload="ci--3975--2--0--1--de7b5ef465.novalocal-k8s-coredns--76f75df574--t6q9g-eth0" Aug 5 22:13:22.283486 containerd[1444]: 2024-08-05 22:13:22.234 [INFO][4399] k8s.go 386: Populated endpoint ContainerID="7dd79605bdd06466eb96bf397373cefe6c84be3828c1fc68ad1499f4372d8c2a" Namespace="kube-system" Pod="coredns-76f75df574-t6q9g" WorkloadEndpoint="ci--3975--2--0--1--de7b5ef465.novalocal-k8s-coredns--76f75df574--t6q9g-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--3975--2--0--1--de7b5ef465.novalocal-k8s-coredns--76f75df574--t6q9g-eth0", GenerateName:"coredns-76f75df574-", Namespace:"kube-system", SelfLink:"", UID:"55edd9ef-b537-4c83-ad1e-084f093bbf6f", ResourceVersion:"821", Generation:0, CreationTimestamp:time.Date(2024, time.August, 5, 22, 12, 40, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"76f75df574", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-3975-2-0-1-de7b5ef465.novalocal", ContainerID:"", Pod:"coredns-76f75df574-t6q9g", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.92.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", 
"ksa.kube-system.coredns"}, InterfaceName:"calib1d4a50183b", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Aug 5 22:13:22.283486 containerd[1444]: 2024-08-05 22:13:22.234 [INFO][4399] k8s.go 387: Calico CNI using IPs: [192.168.92.131/32] ContainerID="7dd79605bdd06466eb96bf397373cefe6c84be3828c1fc68ad1499f4372d8c2a" Namespace="kube-system" Pod="coredns-76f75df574-t6q9g" WorkloadEndpoint="ci--3975--2--0--1--de7b5ef465.novalocal-k8s-coredns--76f75df574--t6q9g-eth0" Aug 5 22:13:22.283486 containerd[1444]: 2024-08-05 22:13:22.234 [INFO][4399] dataplane_linux.go 68: Setting the host side veth name to calib1d4a50183b ContainerID="7dd79605bdd06466eb96bf397373cefe6c84be3828c1fc68ad1499f4372d8c2a" Namespace="kube-system" Pod="coredns-76f75df574-t6q9g" WorkloadEndpoint="ci--3975--2--0--1--de7b5ef465.novalocal-k8s-coredns--76f75df574--t6q9g-eth0" Aug 5 22:13:22.283486 containerd[1444]: 2024-08-05 22:13:22.251 [INFO][4399] dataplane_linux.go 479: Disabling IPv4 forwarding ContainerID="7dd79605bdd06466eb96bf397373cefe6c84be3828c1fc68ad1499f4372d8c2a" Namespace="kube-system" Pod="coredns-76f75df574-t6q9g" WorkloadEndpoint="ci--3975--2--0--1--de7b5ef465.novalocal-k8s-coredns--76f75df574--t6q9g-eth0" Aug 5 22:13:22.283486 containerd[1444]: 2024-08-05 22:13:22.256 [INFO][4399] k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="7dd79605bdd06466eb96bf397373cefe6c84be3828c1fc68ad1499f4372d8c2a" Namespace="kube-system" Pod="coredns-76f75df574-t6q9g" 
WorkloadEndpoint="ci--3975--2--0--1--de7b5ef465.novalocal-k8s-coredns--76f75df574--t6q9g-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--3975--2--0--1--de7b5ef465.novalocal-k8s-coredns--76f75df574--t6q9g-eth0", GenerateName:"coredns-76f75df574-", Namespace:"kube-system", SelfLink:"", UID:"55edd9ef-b537-4c83-ad1e-084f093bbf6f", ResourceVersion:"821", Generation:0, CreationTimestamp:time.Date(2024, time.August, 5, 22, 12, 40, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"76f75df574", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-3975-2-0-1-de7b5ef465.novalocal", ContainerID:"7dd79605bdd06466eb96bf397373cefe6c84be3828c1fc68ad1499f4372d8c2a", Pod:"coredns-76f75df574-t6q9g", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.92.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calib1d4a50183b", MAC:"7a:4e:3f:70:2b:f7", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Aug 5 22:13:22.283486 containerd[1444]: 
2024-08-05 22:13:22.277 [INFO][4399] k8s.go 500: Wrote updated endpoint to datastore ContainerID="7dd79605bdd06466eb96bf397373cefe6c84be3828c1fc68ad1499f4372d8c2a" Namespace="kube-system" Pod="coredns-76f75df574-t6q9g" WorkloadEndpoint="ci--3975--2--0--1--de7b5ef465.novalocal-k8s-coredns--76f75df574--t6q9g-eth0" Aug 5 22:13:22.323967 systemd[1]: Started cri-containerd-35145b36206ed650826e52af4918c7250250684c9fa638a8c35efe287ddd8358.scope - libcontainer container 35145b36206ed650826e52af4918c7250250684c9fa638a8c35efe287ddd8358. Aug 5 22:13:22.354330 containerd[1444]: time="2024-08-05T22:13:22.353068278Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Aug 5 22:13:22.354330 containerd[1444]: time="2024-08-05T22:13:22.353147057Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Aug 5 22:13:22.354330 containerd[1444]: time="2024-08-05T22:13:22.353175512Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Aug 5 22:13:22.354330 containerd[1444]: time="2024-08-05T22:13:22.353195099Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Aug 5 22:13:22.381492 systemd[1]: Started cri-containerd-7dd79605bdd06466eb96bf397373cefe6c84be3828c1fc68ad1499f4372d8c2a.scope - libcontainer container 7dd79605bdd06466eb96bf397373cefe6c84be3828c1fc68ad1499f4372d8c2a. 
Aug 5 22:13:22.442302 containerd[1444]: time="2024-08-05T22:13:22.441757213Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-58c54bc647-pqf69,Uid:4ecfd1c0-3700-4ed3-84bf-da7987083d57,Namespace:calico-system,Attempt:1,} returns sandbox id \"35145b36206ed650826e52af4918c7250250684c9fa638a8c35efe287ddd8358\"" Aug 5 22:13:22.484487 containerd[1444]: time="2024-08-05T22:13:22.484387229Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-76f75df574-t6q9g,Uid:55edd9ef-b537-4c83-ad1e-084f093bbf6f,Namespace:kube-system,Attempt:1,} returns sandbox id \"7dd79605bdd06466eb96bf397373cefe6c84be3828c1fc68ad1499f4372d8c2a\"" Aug 5 22:13:22.508417 systemd-networkd[1359]: cali856eccd5c50: Gained IPv6LL Aug 5 22:13:22.525889 containerd[1444]: time="2024-08-05T22:13:22.525786481Z" level=info msg="CreateContainer within sandbox \"7dd79605bdd06466eb96bf397373cefe6c84be3828c1fc68ad1499f4372d8c2a\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Aug 5 22:13:22.571097 update_engine[1425]: I0805 22:13:22.571027 1425 prefs.cc:52] certificate-report-to-send-update not present in /var/lib/update_engine/prefs Aug 5 22:13:22.571097 update_engine[1425]: I0805 22:13:22.571084 1425 prefs.cc:52] certificate-report-to-send-download not present in /var/lib/update_engine/prefs Aug 5 22:13:22.573540 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2550002539.mount: Deactivated successfully. Aug 5 22:13:22.576136 update_engine[1425]: I0805 22:13:22.574327 1425 prefs.cc:52] aleph-version not present in /var/lib/update_engine/prefs Aug 5 22:13:22.576136 update_engine[1425]: I0805 22:13:22.575290 1425 omaha_request_params.cc:62] Current group set to stable Aug 5 22:13:22.576136 update_engine[1425]: I0805 22:13:22.575404 1425 update_attempter.cc:499] Already updated boot flags. Skipping. Aug 5 22:13:22.576136 update_engine[1425]: I0805 22:13:22.575410 1425 update_attempter.cc:643] Scheduling an action processor start. 
Aug 5 22:13:22.576136 update_engine[1425]: I0805 22:13:22.575428 1425 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction Aug 5 22:13:22.576136 update_engine[1425]: I0805 22:13:22.575464 1425 prefs.cc:52] previous-version not present in /var/lib/update_engine/prefs Aug 5 22:13:22.576136 update_engine[1425]: I0805 22:13:22.575515 1425 omaha_request_action.cc:271] Posting an Omaha request to disabled Aug 5 22:13:22.576136 update_engine[1425]: I0805 22:13:22.575520 1425 omaha_request_action.cc:272] Request: Aug 5 22:13:22.576136 update_engine[1425]: Aug 5 22:13:22.576136 update_engine[1425]: Aug 5 22:13:22.576136 update_engine[1425]: Aug 5 22:13:22.576136 update_engine[1425]: Aug 5 22:13:22.576136 update_engine[1425]: Aug 5 22:13:22.576136 update_engine[1425]: Aug 5 22:13:22.576136 update_engine[1425]: Aug 5 22:13:22.576136 update_engine[1425]: Aug 5 22:13:22.576136 update_engine[1425]: I0805 22:13:22.575523 1425 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Aug 5 22:13:22.584123 update_engine[1425]: I0805 22:13:22.581379 1425 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Aug 5 22:13:22.584123 update_engine[1425]: I0805 22:13:22.581732 1425 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. 
Aug 5 22:13:22.593657 containerd[1444]: time="2024-08-05T22:13:22.593246946Z" level=info msg="CreateContainer within sandbox \"7dd79605bdd06466eb96bf397373cefe6c84be3828c1fc68ad1499f4372d8c2a\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"ebabb84aa672cb6546d0d25b1361bee2039f6caf439d1b08e115dd25b0986dbb\"" Aug 5 22:13:22.601568 update_engine[1425]: E0805 22:13:22.596715 1425 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Aug 5 22:13:22.601568 update_engine[1425]: I0805 22:13:22.596879 1425 libcurl_http_fetcher.cc:283] No HTTP response, retry 1 Aug 5 22:13:22.602124 containerd[1444]: time="2024-08-05T22:13:22.602070120Z" level=info msg="StartContainer for \"ebabb84aa672cb6546d0d25b1361bee2039f6caf439d1b08e115dd25b0986dbb\"" Aug 5 22:13:22.612327 locksmithd[1456]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_CHECKING_FOR_UPDATE" NewVersion=0.0.0 NewSize=0 Aug 5 22:13:22.673687 systemd[1]: Started cri-containerd-ebabb84aa672cb6546d0d25b1361bee2039f6caf439d1b08e115dd25b0986dbb.scope - libcontainer container ebabb84aa672cb6546d0d25b1361bee2039f6caf439d1b08e115dd25b0986dbb. 
Aug 5 22:13:22.739848 containerd[1444]: time="2024-08-05T22:13:22.738816253Z" level=info msg="StartContainer for \"ebabb84aa672cb6546d0d25b1361bee2039f6caf439d1b08e115dd25b0986dbb\" returns successfully" Aug 5 22:13:23.203276 kubelet[2635]: I0805 22:13:23.202476 2635 pod_startup_latency_tracker.go:102] "Observed pod startup duration" pod="kube-system/coredns-76f75df574-t6q9g" podStartSLOduration=43.202430136 podStartE2EDuration="43.202430136s" podCreationTimestamp="2024-08-05 22:12:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2024-08-05 22:13:23.199882826 +0000 UTC m=+54.905103224" watchObservedRunningTime="2024-08-05 22:13:23.202430136 +0000 UTC m=+54.907650534" Aug 5 22:13:23.221349 containerd[1444]: time="2024-08-05T22:13:23.221302594Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.28.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 5 22:13:23.223299 containerd[1444]: time="2024-08-05T22:13:23.223243494Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.28.0: active requests=0, bytes read=7641062" Aug 5 22:13:23.231326 containerd[1444]: time="2024-08-05T22:13:23.230674786Z" level=info msg="ImageCreate event name:\"sha256:1a094aeaf1521e225668c83cbf63c0ec63afbdb8c4dd7c3d2aab0ec917d103de\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 5 22:13:23.235616 containerd[1444]: time="2024-08-05T22:13:23.235582202Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:ac5f0089ad8eab325e5d16a59536f9292619adf16736b1554a439a66d543a63d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 5 22:13:23.237727 containerd[1444]: time="2024-08-05T22:13:23.237697212Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.28.0\" with image id \"sha256:1a094aeaf1521e225668c83cbf63c0ec63afbdb8c4dd7c3d2aab0ec917d103de\", repo tag \"ghcr.io/flatcar/calico/csi:v3.28.0\", repo 
digest \"ghcr.io/flatcar/calico/csi@sha256:ac5f0089ad8eab325e5d16a59536f9292619adf16736b1554a439a66d543a63d\", size \"9088822\" in 2.392084649s" Aug 5 22:13:23.237967 containerd[1444]: time="2024-08-05T22:13:23.237905477Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.28.0\" returns image reference \"sha256:1a094aeaf1521e225668c83cbf63c0ec63afbdb8c4dd7c3d2aab0ec917d103de\"" Aug 5 22:13:23.240707 containerd[1444]: time="2024-08-05T22:13:23.240457135Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.28.0\"" Aug 5 22:13:23.241891 containerd[1444]: time="2024-08-05T22:13:23.241769234Z" level=info msg="CreateContainer within sandbox \"f064d99279b0982c4ba9b97b4f5a5fe0d6c152caffbb4d5acaea5faf6f8455b9\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" Aug 5 22:13:23.290325 containerd[1444]: time="2024-08-05T22:13:23.290110030Z" level=info msg="CreateContainer within sandbox \"f064d99279b0982c4ba9b97b4f5a5fe0d6c152caffbb4d5acaea5faf6f8455b9\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"a9f121aa913ef229d25ba187f911c5d41dad33ae88562d6173d7aa6546930293\"" Aug 5 22:13:23.291493 containerd[1444]: time="2024-08-05T22:13:23.290990561Z" level=info msg="StartContainer for \"a9f121aa913ef229d25ba187f911c5d41dad33ae88562d6173d7aa6546930293\"" Aug 5 22:13:23.332443 systemd[1]: Started cri-containerd-a9f121aa913ef229d25ba187f911c5d41dad33ae88562d6173d7aa6546930293.scope - libcontainer container a9f121aa913ef229d25ba187f911c5d41dad33ae88562d6173d7aa6546930293. 
Aug 5 22:13:23.384148 containerd[1444]: time="2024-08-05T22:13:23.384102855Z" level=info msg="StartContainer for \"a9f121aa913ef229d25ba187f911c5d41dad33ae88562d6173d7aa6546930293\" returns successfully" Aug 5 22:13:23.403415 systemd-networkd[1359]: calib1d4a50183b: Gained IPv6LL Aug 5 22:13:23.404203 systemd-networkd[1359]: cali32125d64fa6: Gained IPv6LL Aug 5 22:13:23.743009 containerd[1444]: time="2024-08-05T22:13:23.742970212Z" level=info msg="StopPodSandbox for \"c44015cdd53b711bcd0d788d4f0c0a8bccb66bdbdff2b2e021822855fceb70cd\"" Aug 5 22:13:23.881769 containerd[1444]: 2024-08-05 22:13:23.825 [INFO][4633] k8s.go 608: Cleaning up netns ContainerID="c44015cdd53b711bcd0d788d4f0c0a8bccb66bdbdff2b2e021822855fceb70cd" Aug 5 22:13:23.881769 containerd[1444]: 2024-08-05 22:13:23.826 [INFO][4633] dataplane_linux.go 530: Deleting workload's device in netns. ContainerID="c44015cdd53b711bcd0d788d4f0c0a8bccb66bdbdff2b2e021822855fceb70cd" iface="eth0" netns="/var/run/netns/cni-815d8ed3-c2e0-f56f-0c39-bf2986ade1b5" Aug 5 22:13:23.881769 containerd[1444]: 2024-08-05 22:13:23.826 [INFO][4633] dataplane_linux.go 541: Entered netns, deleting veth. ContainerID="c44015cdd53b711bcd0d788d4f0c0a8bccb66bdbdff2b2e021822855fceb70cd" iface="eth0" netns="/var/run/netns/cni-815d8ed3-c2e0-f56f-0c39-bf2986ade1b5" Aug 5 22:13:23.881769 containerd[1444]: 2024-08-05 22:13:23.826 [INFO][4633] dataplane_linux.go 568: Workload's veth was already gone. Nothing to do. 
ContainerID="c44015cdd53b711bcd0d788d4f0c0a8bccb66bdbdff2b2e021822855fceb70cd" iface="eth0" netns="/var/run/netns/cni-815d8ed3-c2e0-f56f-0c39-bf2986ade1b5" Aug 5 22:13:23.881769 containerd[1444]: 2024-08-05 22:13:23.827 [INFO][4633] k8s.go 615: Releasing IP address(es) ContainerID="c44015cdd53b711bcd0d788d4f0c0a8bccb66bdbdff2b2e021822855fceb70cd" Aug 5 22:13:23.881769 containerd[1444]: 2024-08-05 22:13:23.827 [INFO][4633] utils.go 188: Calico CNI releasing IP address ContainerID="c44015cdd53b711bcd0d788d4f0c0a8bccb66bdbdff2b2e021822855fceb70cd" Aug 5 22:13:23.881769 containerd[1444]: 2024-08-05 22:13:23.866 [INFO][4639] ipam_plugin.go 411: Releasing address using handleID ContainerID="c44015cdd53b711bcd0d788d4f0c0a8bccb66bdbdff2b2e021822855fceb70cd" HandleID="k8s-pod-network.c44015cdd53b711bcd0d788d4f0c0a8bccb66bdbdff2b2e021822855fceb70cd" Workload="ci--3975--2--0--1--de7b5ef465.novalocal-k8s-coredns--76f75df574--lvcgp-eth0" Aug 5 22:13:23.881769 containerd[1444]: 2024-08-05 22:13:23.867 [INFO][4639] ipam_plugin.go 352: About to acquire host-wide IPAM lock. Aug 5 22:13:23.881769 containerd[1444]: 2024-08-05 22:13:23.867 [INFO][4639] ipam_plugin.go 367: Acquired host-wide IPAM lock. Aug 5 22:13:23.881769 containerd[1444]: 2024-08-05 22:13:23.874 [WARNING][4639] ipam_plugin.go 428: Asked to release address but it doesn't exist. 
Ignoring ContainerID="c44015cdd53b711bcd0d788d4f0c0a8bccb66bdbdff2b2e021822855fceb70cd" HandleID="k8s-pod-network.c44015cdd53b711bcd0d788d4f0c0a8bccb66bdbdff2b2e021822855fceb70cd" Workload="ci--3975--2--0--1--de7b5ef465.novalocal-k8s-coredns--76f75df574--lvcgp-eth0" Aug 5 22:13:23.881769 containerd[1444]: 2024-08-05 22:13:23.874 [INFO][4639] ipam_plugin.go 439: Releasing address using workloadID ContainerID="c44015cdd53b711bcd0d788d4f0c0a8bccb66bdbdff2b2e021822855fceb70cd" HandleID="k8s-pod-network.c44015cdd53b711bcd0d788d4f0c0a8bccb66bdbdff2b2e021822855fceb70cd" Workload="ci--3975--2--0--1--de7b5ef465.novalocal-k8s-coredns--76f75df574--lvcgp-eth0" Aug 5 22:13:23.881769 containerd[1444]: 2024-08-05 22:13:23.877 [INFO][4639] ipam_plugin.go 373: Released host-wide IPAM lock. Aug 5 22:13:23.881769 containerd[1444]: 2024-08-05 22:13:23.880 [INFO][4633] k8s.go 621: Teardown processing complete. ContainerID="c44015cdd53b711bcd0d788d4f0c0a8bccb66bdbdff2b2e021822855fceb70cd" Aug 5 22:13:23.885168 containerd[1444]: time="2024-08-05T22:13:23.885128180Z" level=info msg="TearDown network for sandbox \"c44015cdd53b711bcd0d788d4f0c0a8bccb66bdbdff2b2e021822855fceb70cd\" successfully" Aug 5 22:13:23.885168 containerd[1444]: time="2024-08-05T22:13:23.885164829Z" level=info msg="StopPodSandbox for \"c44015cdd53b711bcd0d788d4f0c0a8bccb66bdbdff2b2e021822855fceb70cd\" returns successfully" Aug 5 22:13:23.885826 containerd[1444]: time="2024-08-05T22:13:23.885793071Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-76f75df574-lvcgp,Uid:2932e3f8-d82d-4537-9f9b-b46369d5b6f8,Namespace:kube-system,Attempt:1,}" Aug 5 22:13:23.886820 systemd[1]: run-netns-cni\x2d815d8ed3\x2dc2e0\x2df56f\x2d0c39\x2dbf2986ade1b5.mount: Deactivated successfully. 
Aug 5 22:13:24.033877 systemd-networkd[1359]: cali89caf6ed1d6: Link UP Aug 5 22:13:24.035509 systemd-networkd[1359]: cali89caf6ed1d6: Gained carrier Aug 5 22:13:24.058824 containerd[1444]: 2024-08-05 22:13:23.953 [INFO][4646] plugin.go 326: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--3975--2--0--1--de7b5ef465.novalocal-k8s-coredns--76f75df574--lvcgp-eth0 coredns-76f75df574- kube-system 2932e3f8-d82d-4537-9f9b-b46369d5b6f8 844 0 2024-08-05 22:12:40 +0000 UTC map[k8s-app:kube-dns pod-template-hash:76f75df574 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-3975-2-0-1-de7b5ef465.novalocal coredns-76f75df574-lvcgp eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali89caf6ed1d6 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] []}} ContainerID="0bb4098e5d7b022e09bdb0632ca469d1d0dfa990e63dcdbe6530bad4d804a06f" Namespace="kube-system" Pod="coredns-76f75df574-lvcgp" WorkloadEndpoint="ci--3975--2--0--1--de7b5ef465.novalocal-k8s-coredns--76f75df574--lvcgp-" Aug 5 22:13:24.058824 containerd[1444]: 2024-08-05 22:13:23.953 [INFO][4646] k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="0bb4098e5d7b022e09bdb0632ca469d1d0dfa990e63dcdbe6530bad4d804a06f" Namespace="kube-system" Pod="coredns-76f75df574-lvcgp" WorkloadEndpoint="ci--3975--2--0--1--de7b5ef465.novalocal-k8s-coredns--76f75df574--lvcgp-eth0" Aug 5 22:13:24.058824 containerd[1444]: 2024-08-05 22:13:23.987 [INFO][4657] ipam_plugin.go 224: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="0bb4098e5d7b022e09bdb0632ca469d1d0dfa990e63dcdbe6530bad4d804a06f" HandleID="k8s-pod-network.0bb4098e5d7b022e09bdb0632ca469d1d0dfa990e63dcdbe6530bad4d804a06f" Workload="ci--3975--2--0--1--de7b5ef465.novalocal-k8s-coredns--76f75df574--lvcgp-eth0" Aug 5 22:13:24.058824 containerd[1444]: 2024-08-05 22:13:23.998 [INFO][4657] ipam_plugin.go 264: Auto assigning IP 
ContainerID="0bb4098e5d7b022e09bdb0632ca469d1d0dfa990e63dcdbe6530bad4d804a06f" HandleID="k8s-pod-network.0bb4098e5d7b022e09bdb0632ca469d1d0dfa990e63dcdbe6530bad4d804a06f" Workload="ci--3975--2--0--1--de7b5ef465.novalocal-k8s-coredns--76f75df574--lvcgp-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0001149c0), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-3975-2-0-1-de7b5ef465.novalocal", "pod":"coredns-76f75df574-lvcgp", "timestamp":"2024-08-05 22:13:23.98753452 +0000 UTC"}, Hostname:"ci-3975-2-0-1-de7b5ef465.novalocal", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Aug 5 22:13:24.058824 containerd[1444]: 2024-08-05 22:13:23.998 [INFO][4657] ipam_plugin.go 352: About to acquire host-wide IPAM lock. Aug 5 22:13:24.058824 containerd[1444]: 2024-08-05 22:13:23.998 [INFO][4657] ipam_plugin.go 367: Acquired host-wide IPAM lock. 
Aug 5 22:13:24.058824 containerd[1444]: 2024-08-05 22:13:23.998 [INFO][4657] ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-3975-2-0-1-de7b5ef465.novalocal' Aug 5 22:13:24.058824 containerd[1444]: 2024-08-05 22:13:24.000 [INFO][4657] ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.0bb4098e5d7b022e09bdb0632ca469d1d0dfa990e63dcdbe6530bad4d804a06f" host="ci-3975-2-0-1-de7b5ef465.novalocal" Aug 5 22:13:24.058824 containerd[1444]: 2024-08-05 22:13:24.004 [INFO][4657] ipam.go 372: Looking up existing affinities for host host="ci-3975-2-0-1-de7b5ef465.novalocal" Aug 5 22:13:24.058824 containerd[1444]: 2024-08-05 22:13:24.008 [INFO][4657] ipam.go 489: Trying affinity for 192.168.92.128/26 host="ci-3975-2-0-1-de7b5ef465.novalocal" Aug 5 22:13:24.058824 containerd[1444]: 2024-08-05 22:13:24.012 [INFO][4657] ipam.go 155: Attempting to load block cidr=192.168.92.128/26 host="ci-3975-2-0-1-de7b5ef465.novalocal" Aug 5 22:13:24.058824 containerd[1444]: 2024-08-05 22:13:24.014 [INFO][4657] ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.92.128/26 host="ci-3975-2-0-1-de7b5ef465.novalocal" Aug 5 22:13:24.058824 containerd[1444]: 2024-08-05 22:13:24.014 [INFO][4657] ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.92.128/26 handle="k8s-pod-network.0bb4098e5d7b022e09bdb0632ca469d1d0dfa990e63dcdbe6530bad4d804a06f" host="ci-3975-2-0-1-de7b5ef465.novalocal" Aug 5 22:13:24.058824 containerd[1444]: 2024-08-05 22:13:24.016 [INFO][4657] ipam.go 1685: Creating new handle: k8s-pod-network.0bb4098e5d7b022e09bdb0632ca469d1d0dfa990e63dcdbe6530bad4d804a06f Aug 5 22:13:24.058824 containerd[1444]: 2024-08-05 22:13:24.020 [INFO][4657] ipam.go 1203: Writing block in order to claim IPs block=192.168.92.128/26 handle="k8s-pod-network.0bb4098e5d7b022e09bdb0632ca469d1d0dfa990e63dcdbe6530bad4d804a06f" host="ci-3975-2-0-1-de7b5ef465.novalocal" Aug 5 22:13:24.058824 containerd[1444]: 2024-08-05 22:13:24.026 [INFO][4657] 
ipam.go 1216: Successfully claimed IPs: [192.168.92.132/26] block=192.168.92.128/26 handle="k8s-pod-network.0bb4098e5d7b022e09bdb0632ca469d1d0dfa990e63dcdbe6530bad4d804a06f" host="ci-3975-2-0-1-de7b5ef465.novalocal" Aug 5 22:13:24.058824 containerd[1444]: 2024-08-05 22:13:24.027 [INFO][4657] ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.92.132/26] handle="k8s-pod-network.0bb4098e5d7b022e09bdb0632ca469d1d0dfa990e63dcdbe6530bad4d804a06f" host="ci-3975-2-0-1-de7b5ef465.novalocal" Aug 5 22:13:24.058824 containerd[1444]: 2024-08-05 22:13:24.027 [INFO][4657] ipam_plugin.go 373: Released host-wide IPAM lock. Aug 5 22:13:24.058824 containerd[1444]: 2024-08-05 22:13:24.027 [INFO][4657] ipam_plugin.go 282: Calico CNI IPAM assigned addresses IPv4=[192.168.92.132/26] IPv6=[] ContainerID="0bb4098e5d7b022e09bdb0632ca469d1d0dfa990e63dcdbe6530bad4d804a06f" HandleID="k8s-pod-network.0bb4098e5d7b022e09bdb0632ca469d1d0dfa990e63dcdbe6530bad4d804a06f" Workload="ci--3975--2--0--1--de7b5ef465.novalocal-k8s-coredns--76f75df574--lvcgp-eth0" Aug 5 22:13:24.059581 containerd[1444]: 2024-08-05 22:13:24.029 [INFO][4646] k8s.go 386: Populated endpoint ContainerID="0bb4098e5d7b022e09bdb0632ca469d1d0dfa990e63dcdbe6530bad4d804a06f" Namespace="kube-system" Pod="coredns-76f75df574-lvcgp" WorkloadEndpoint="ci--3975--2--0--1--de7b5ef465.novalocal-k8s-coredns--76f75df574--lvcgp-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--3975--2--0--1--de7b5ef465.novalocal-k8s-coredns--76f75df574--lvcgp-eth0", GenerateName:"coredns-76f75df574-", Namespace:"kube-system", SelfLink:"", UID:"2932e3f8-d82d-4537-9f9b-b46369d5b6f8", ResourceVersion:"844", Generation:0, CreationTimestamp:time.Date(2024, time.August, 5, 22, 12, 40, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"76f75df574", 
"projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-3975-2-0-1-de7b5ef465.novalocal", ContainerID:"", Pod:"coredns-76f75df574-lvcgp", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.92.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali89caf6ed1d6", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Aug 5 22:13:24.059581 containerd[1444]: 2024-08-05 22:13:24.029 [INFO][4646] k8s.go 387: Calico CNI using IPs: [192.168.92.132/32] ContainerID="0bb4098e5d7b022e09bdb0632ca469d1d0dfa990e63dcdbe6530bad4d804a06f" Namespace="kube-system" Pod="coredns-76f75df574-lvcgp" WorkloadEndpoint="ci--3975--2--0--1--de7b5ef465.novalocal-k8s-coredns--76f75df574--lvcgp-eth0" Aug 5 22:13:24.059581 containerd[1444]: 2024-08-05 22:13:24.029 [INFO][4646] dataplane_linux.go 68: Setting the host side veth name to cali89caf6ed1d6 ContainerID="0bb4098e5d7b022e09bdb0632ca469d1d0dfa990e63dcdbe6530bad4d804a06f" Namespace="kube-system" Pod="coredns-76f75df574-lvcgp" WorkloadEndpoint="ci--3975--2--0--1--de7b5ef465.novalocal-k8s-coredns--76f75df574--lvcgp-eth0" Aug 5 22:13:24.059581 containerd[1444]: 2024-08-05 22:13:24.036 
[INFO][4646] dataplane_linux.go 479: Disabling IPv4 forwarding ContainerID="0bb4098e5d7b022e09bdb0632ca469d1d0dfa990e63dcdbe6530bad4d804a06f" Namespace="kube-system" Pod="coredns-76f75df574-lvcgp" WorkloadEndpoint="ci--3975--2--0--1--de7b5ef465.novalocal-k8s-coredns--76f75df574--lvcgp-eth0" Aug 5 22:13:24.059581 containerd[1444]: 2024-08-05 22:13:24.036 [INFO][4646] k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="0bb4098e5d7b022e09bdb0632ca469d1d0dfa990e63dcdbe6530bad4d804a06f" Namespace="kube-system" Pod="coredns-76f75df574-lvcgp" WorkloadEndpoint="ci--3975--2--0--1--de7b5ef465.novalocal-k8s-coredns--76f75df574--lvcgp-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--3975--2--0--1--de7b5ef465.novalocal-k8s-coredns--76f75df574--lvcgp-eth0", GenerateName:"coredns-76f75df574-", Namespace:"kube-system", SelfLink:"", UID:"2932e3f8-d82d-4537-9f9b-b46369d5b6f8", ResourceVersion:"844", Generation:0, CreationTimestamp:time.Date(2024, time.August, 5, 22, 12, 40, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"76f75df574", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-3975-2-0-1-de7b5ef465.novalocal", ContainerID:"0bb4098e5d7b022e09bdb0632ca469d1d0dfa990e63dcdbe6530bad4d804a06f", Pod:"coredns-76f75df574-lvcgp", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.92.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, 
InterfaceName:"cali89caf6ed1d6", MAC:"1e:fa:ef:ba:6e:a9", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Aug 5 22:13:24.059581 containerd[1444]: 2024-08-05 22:13:24.052 [INFO][4646] k8s.go 500: Wrote updated endpoint to datastore ContainerID="0bb4098e5d7b022e09bdb0632ca469d1d0dfa990e63dcdbe6530bad4d804a06f" Namespace="kube-system" Pod="coredns-76f75df574-lvcgp" WorkloadEndpoint="ci--3975--2--0--1--de7b5ef465.novalocal-k8s-coredns--76f75df574--lvcgp-eth0" Aug 5 22:13:24.123729 containerd[1444]: time="2024-08-05T22:13:24.123276919Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Aug 5 22:13:24.123729 containerd[1444]: time="2024-08-05T22:13:24.123362542Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Aug 5 22:13:24.123729 containerd[1444]: time="2024-08-05T22:13:24.123391957Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Aug 5 22:13:24.123729 containerd[1444]: time="2024-08-05T22:13:24.123412747Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Aug 5 22:13:24.178914 systemd[1]: Started cri-containerd-0bb4098e5d7b022e09bdb0632ca469d1d0dfa990e63dcdbe6530bad4d804a06f.scope - libcontainer container 0bb4098e5d7b022e09bdb0632ca469d1d0dfa990e63dcdbe6530bad4d804a06f. 
Aug 5 22:13:24.282823 containerd[1444]: time="2024-08-05T22:13:24.282777243Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-76f75df574-lvcgp,Uid:2932e3f8-d82d-4537-9f9b-b46369d5b6f8,Namespace:kube-system,Attempt:1,} returns sandbox id \"0bb4098e5d7b022e09bdb0632ca469d1d0dfa990e63dcdbe6530bad4d804a06f\"" Aug 5 22:13:24.288204 containerd[1444]: time="2024-08-05T22:13:24.288087202Z" level=info msg="CreateContainer within sandbox \"0bb4098e5d7b022e09bdb0632ca469d1d0dfa990e63dcdbe6530bad4d804a06f\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Aug 5 22:13:24.312352 containerd[1444]: time="2024-08-05T22:13:24.312300071Z" level=info msg="CreateContainer within sandbox \"0bb4098e5d7b022e09bdb0632ca469d1d0dfa990e63dcdbe6530bad4d804a06f\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"6d62c00d463c593028a982429fcaa98dcef8b67cd84c38dbe204d43e414910ed\"" Aug 5 22:13:24.314297 containerd[1444]: time="2024-08-05T22:13:24.313691730Z" level=info msg="StartContainer for \"6d62c00d463c593028a982429fcaa98dcef8b67cd84c38dbe204d43e414910ed\"" Aug 5 22:13:24.359583 systemd[1]: Started cri-containerd-6d62c00d463c593028a982429fcaa98dcef8b67cd84c38dbe204d43e414910ed.scope - libcontainer container 6d62c00d463c593028a982429fcaa98dcef8b67cd84c38dbe204d43e414910ed. 
Aug 5 22:13:24.412104 containerd[1444]: time="2024-08-05T22:13:24.412047339Z" level=info msg="StartContainer for \"6d62c00d463c593028a982429fcaa98dcef8b67cd84c38dbe204d43e414910ed\" returns successfully" Aug 5 22:13:25.226259 kubelet[2635]: I0805 22:13:25.223738 2635 pod_startup_latency_tracker.go:102] "Observed pod startup duration" pod="kube-system/coredns-76f75df574-lvcgp" podStartSLOduration=45.223661893 podStartE2EDuration="45.223661893s" podCreationTimestamp="2024-08-05 22:12:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2024-08-05 22:13:25.185248013 +0000 UTC m=+56.890468401" watchObservedRunningTime="2024-08-05 22:13:25.223661893 +0000 UTC m=+56.928882281" Aug 5 22:13:25.835808 systemd-networkd[1359]: cali89caf6ed1d6: Gained IPv6LL Aug 5 22:13:26.744101 containerd[1444]: time="2024-08-05T22:13:26.743505519Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.28.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 5 22:13:26.746315 containerd[1444]: time="2024-08-05T22:13:26.746281545Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.28.0: active requests=0, bytes read=33505793" Aug 5 22:13:26.747292 containerd[1444]: time="2024-08-05T22:13:26.747241919Z" level=info msg="ImageCreate event name:\"sha256:428d92b02253980b402b9fb18f4cb58be36dc6bcf4893e07462732cb926ea783\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 5 22:13:26.750809 containerd[1444]: time="2024-08-05T22:13:26.750139353Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:c35e88abef622483409fff52313bf764a75095197be4c5a7c7830da342654de1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 5 22:13:26.750809 containerd[1444]: time="2024-08-05T22:13:26.750698800Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.28.0\" with image id 
\"sha256:428d92b02253980b402b9fb18f4cb58be36dc6bcf4893e07462732cb926ea783\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.28.0\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:c35e88abef622483409fff52313bf764a75095197be4c5a7c7830da342654de1\", size \"34953521\" in 3.510204784s" Aug 5 22:13:26.750809 containerd[1444]: time="2024-08-05T22:13:26.750728808Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.28.0\" returns image reference \"sha256:428d92b02253980b402b9fb18f4cb58be36dc6bcf4893e07462732cb926ea783\"" Aug 5 22:13:26.751870 containerd[1444]: time="2024-08-05T22:13:26.751823916Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.28.0\"" Aug 5 22:13:26.785158 containerd[1444]: time="2024-08-05T22:13:26.785012551Z" level=info msg="CreateContainer within sandbox \"35145b36206ed650826e52af4918c7250250684c9fa638a8c35efe287ddd8358\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" Aug 5 22:13:26.816667 containerd[1444]: time="2024-08-05T22:13:26.816617994Z" level=info msg="CreateContainer within sandbox \"35145b36206ed650826e52af4918c7250250684c9fa638a8c35efe287ddd8358\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"931d26c86b35eaea57a3b41ac3248ed922aec7b95b2a1c7face54f7941b66bbb\"" Aug 5 22:13:26.817273 containerd[1444]: time="2024-08-05T22:13:26.817154618Z" level=info msg="StartContainer for \"931d26c86b35eaea57a3b41ac3248ed922aec7b95b2a1c7face54f7941b66bbb\"" Aug 5 22:13:26.856372 systemd[1]: Started cri-containerd-931d26c86b35eaea57a3b41ac3248ed922aec7b95b2a1c7face54f7941b66bbb.scope - libcontainer container 931d26c86b35eaea57a3b41ac3248ed922aec7b95b2a1c7face54f7941b66bbb. 
Aug 5 22:13:26.899931 containerd[1444]: time="2024-08-05T22:13:26.899699704Z" level=info msg="StartContainer for \"931d26c86b35eaea57a3b41ac3248ed922aec7b95b2a1c7face54f7941b66bbb\" returns successfully" Aug 5 22:13:27.158522 kubelet[2635]: I0805 22:13:27.158194 2635 pod_startup_latency_tracker.go:102] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-58c54bc647-pqf69" podStartSLOduration=32.853509333 podStartE2EDuration="37.158145209s" podCreationTimestamp="2024-08-05 22:12:50 +0000 UTC" firstStartedPulling="2024-08-05 22:13:22.446874538 +0000 UTC m=+54.152094936" lastFinishedPulling="2024-08-05 22:13:26.751510423 +0000 UTC m=+58.456730812" observedRunningTime="2024-08-05 22:13:27.157729076 +0000 UTC m=+58.862949464" watchObservedRunningTime="2024-08-05 22:13:27.158145209 +0000 UTC m=+58.863365617" Aug 5 22:13:28.919661 containerd[1444]: time="2024-08-05T22:13:28.919617732Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.28.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 5 22:13:28.922969 containerd[1444]: time="2024-08-05T22:13:28.922932893Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.28.0: active requests=0, bytes read=10147655" Aug 5 22:13:28.923576 containerd[1444]: time="2024-08-05T22:13:28.923555778Z" level=info msg="StopPodSandbox for \"52609d5815f48c3224a2955f13cc78c5198781e782cc09339cbdbd4863996ad0\"" Aug 5 22:13:28.924645 containerd[1444]: time="2024-08-05T22:13:28.924608782Z" level=info msg="ImageCreate event name:\"sha256:0f80feca743f4a84ddda4057266092db9134f9af9e20e12ea6fcfe51d7e3a020\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 5 22:13:28.930076 containerd[1444]: time="2024-08-05T22:13:28.929995940Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:b3caf3e7b3042b293728a5ab55d893798d60fec55993a9531e82997de0e534cc\" labels:{key:\"io.cri-containerd.image\" 
value:\"managed\"}" Aug 5 22:13:28.931824 containerd[1444]: time="2024-08-05T22:13:28.931757738Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.28.0\" with image id \"sha256:0f80feca743f4a84ddda4057266092db9134f9af9e20e12ea6fcfe51d7e3a020\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.28.0\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:b3caf3e7b3042b293728a5ab55d893798d60fec55993a9531e82997de0e534cc\", size \"11595367\" in 2.179797784s" Aug 5 22:13:28.931928 containerd[1444]: time="2024-08-05T22:13:28.931909066Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.28.0\" returns image reference \"sha256:0f80feca743f4a84ddda4057266092db9134f9af9e20e12ea6fcfe51d7e3a020\"" Aug 5 22:13:28.934649 containerd[1444]: time="2024-08-05T22:13:28.934608666Z" level=info msg="CreateContainer within sandbox \"f064d99279b0982c4ba9b97b4f5a5fe0d6c152caffbb4d5acaea5faf6f8455b9\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" Aug 5 22:13:28.963472 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1861039614.mount: Deactivated successfully. Aug 5 22:13:28.969180 containerd[1444]: time="2024-08-05T22:13:28.968620526Z" level=info msg="CreateContainer within sandbox \"f064d99279b0982c4ba9b97b4f5a5fe0d6c152caffbb4d5acaea5faf6f8455b9\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"df5273eedbdc6b3d7db6fe57e3dcbea2e7d8ef9b1891fd253d16f41d5a82923f\"" Aug 5 22:13:28.970463 containerd[1444]: time="2024-08-05T22:13:28.969361237Z" level=info msg="StartContainer for \"df5273eedbdc6b3d7db6fe57e3dcbea2e7d8ef9b1891fd253d16f41d5a82923f\"" Aug 5 22:13:29.019066 systemd[1]: run-containerd-runc-k8s.io-df5273eedbdc6b3d7db6fe57e3dcbea2e7d8ef9b1891fd253d16f41d5a82923f-runc.uD4Nd0.mount: Deactivated successfully. 
Aug 5 22:13:29.028485 systemd[1]: Started cri-containerd-df5273eedbdc6b3d7db6fe57e3dcbea2e7d8ef9b1891fd253d16f41d5a82923f.scope - libcontainer container df5273eedbdc6b3d7db6fe57e3dcbea2e7d8ef9b1891fd253d16f41d5a82923f. Aug 5 22:13:29.100172 containerd[1444]: 2024-08-05 22:13:29.015 [WARNING][4850] k8s.go 572: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="52609d5815f48c3224a2955f13cc78c5198781e782cc09339cbdbd4863996ad0" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--3975--2--0--1--de7b5ef465.novalocal-k8s-csi--node--driver--ck8bx-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"2d1ea0ca-eeda-4305-a298-4df807e6a886", ResourceVersion:"814", Generation:0, CreationTimestamp:time.Date(2024, time.August, 5, 22, 12, 49, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"7d7f6c786c", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"default"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-3975-2-0-1-de7b5ef465.novalocal", ContainerID:"f064d99279b0982c4ba9b97b4f5a5fe0d6c152caffbb4d5acaea5faf6f8455b9", Pod:"csi-node-driver-ck8bx", Endpoint:"eth0", ServiceAccountName:"default", IPNetworks:[]string{"192.168.92.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.default"}, InterfaceName:"cali856eccd5c50", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), 
AllowSpoofedSourcePrefixes:[]string(nil)}} Aug 5 22:13:29.100172 containerd[1444]: 2024-08-05 22:13:29.015 [INFO][4850] k8s.go 608: Cleaning up netns ContainerID="52609d5815f48c3224a2955f13cc78c5198781e782cc09339cbdbd4863996ad0" Aug 5 22:13:29.100172 containerd[1444]: 2024-08-05 22:13:29.015 [INFO][4850] dataplane_linux.go 526: CleanUpNamespace called with no netns name, ignoring. ContainerID="52609d5815f48c3224a2955f13cc78c5198781e782cc09339cbdbd4863996ad0" iface="eth0" netns="" Aug 5 22:13:29.100172 containerd[1444]: 2024-08-05 22:13:29.015 [INFO][4850] k8s.go 615: Releasing IP address(es) ContainerID="52609d5815f48c3224a2955f13cc78c5198781e782cc09339cbdbd4863996ad0" Aug 5 22:13:29.100172 containerd[1444]: 2024-08-05 22:13:29.015 [INFO][4850] utils.go 188: Calico CNI releasing IP address ContainerID="52609d5815f48c3224a2955f13cc78c5198781e782cc09339cbdbd4863996ad0" Aug 5 22:13:29.100172 containerd[1444]: 2024-08-05 22:13:29.057 [INFO][4873] ipam_plugin.go 411: Releasing address using handleID ContainerID="52609d5815f48c3224a2955f13cc78c5198781e782cc09339cbdbd4863996ad0" HandleID="k8s-pod-network.52609d5815f48c3224a2955f13cc78c5198781e782cc09339cbdbd4863996ad0" Workload="ci--3975--2--0--1--de7b5ef465.novalocal-k8s-csi--node--driver--ck8bx-eth0" Aug 5 22:13:29.100172 containerd[1444]: 2024-08-05 22:13:29.057 [INFO][4873] ipam_plugin.go 352: About to acquire host-wide IPAM lock. Aug 5 22:13:29.100172 containerd[1444]: 2024-08-05 22:13:29.057 [INFO][4873] ipam_plugin.go 367: Acquired host-wide IPAM lock. Aug 5 22:13:29.100172 containerd[1444]: 2024-08-05 22:13:29.087 [WARNING][4873] ipam_plugin.go 428: Asked to release address but it doesn't exist. 
Ignoring ContainerID="52609d5815f48c3224a2955f13cc78c5198781e782cc09339cbdbd4863996ad0" HandleID="k8s-pod-network.52609d5815f48c3224a2955f13cc78c5198781e782cc09339cbdbd4863996ad0" Workload="ci--3975--2--0--1--de7b5ef465.novalocal-k8s-csi--node--driver--ck8bx-eth0" Aug 5 22:13:29.100172 containerd[1444]: 2024-08-05 22:13:29.093 [INFO][4873] ipam_plugin.go 439: Releasing address using workloadID ContainerID="52609d5815f48c3224a2955f13cc78c5198781e782cc09339cbdbd4863996ad0" HandleID="k8s-pod-network.52609d5815f48c3224a2955f13cc78c5198781e782cc09339cbdbd4863996ad0" Workload="ci--3975--2--0--1--de7b5ef465.novalocal-k8s-csi--node--driver--ck8bx-eth0" Aug 5 22:13:29.100172 containerd[1444]: 2024-08-05 22:13:29.096 [INFO][4873] ipam_plugin.go 373: Released host-wide IPAM lock. Aug 5 22:13:29.100172 containerd[1444]: 2024-08-05 22:13:29.098 [INFO][4850] k8s.go 621: Teardown processing complete. ContainerID="52609d5815f48c3224a2955f13cc78c5198781e782cc09339cbdbd4863996ad0" Aug 5 22:13:29.100172 containerd[1444]: time="2024-08-05T22:13:29.100075274Z" level=info msg="TearDown network for sandbox \"52609d5815f48c3224a2955f13cc78c5198781e782cc09339cbdbd4863996ad0\" successfully" Aug 5 22:13:29.100172 containerd[1444]: time="2024-08-05T22:13:29.100122340Z" level=info msg="StopPodSandbox for \"52609d5815f48c3224a2955f13cc78c5198781e782cc09339cbdbd4863996ad0\" returns successfully" Aug 5 22:13:29.110916 containerd[1444]: time="2024-08-05T22:13:29.101410920Z" level=info msg="RemovePodSandbox for \"52609d5815f48c3224a2955f13cc78c5198781e782cc09339cbdbd4863996ad0\"" Aug 5 22:13:29.110916 containerd[1444]: time="2024-08-05T22:13:29.101448259Z" level=info msg="Forcibly stopping sandbox \"52609d5815f48c3224a2955f13cc78c5198781e782cc09339cbdbd4863996ad0\"" Aug 5 22:13:29.424875 containerd[1444]: time="2024-08-05T22:13:29.424809626Z" level=info msg="StartContainer for \"df5273eedbdc6b3d7db6fe57e3dcbea2e7d8ef9b1891fd253d16f41d5a82923f\" returns successfully" Aug 5 22:13:29.677073 
containerd[1444]: 2024-08-05 22:13:29.591 [WARNING][4912] k8s.go 572: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="52609d5815f48c3224a2955f13cc78c5198781e782cc09339cbdbd4863996ad0" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--3975--2--0--1--de7b5ef465.novalocal-k8s-csi--node--driver--ck8bx-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"2d1ea0ca-eeda-4305-a298-4df807e6a886", ResourceVersion:"814", Generation:0, CreationTimestamp:time.Date(2024, time.August, 5, 22, 12, 49, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"7d7f6c786c", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"default"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-3975-2-0-1-de7b5ef465.novalocal", ContainerID:"f064d99279b0982c4ba9b97b4f5a5fe0d6c152caffbb4d5acaea5faf6f8455b9", Pod:"csi-node-driver-ck8bx", Endpoint:"eth0", ServiceAccountName:"default", IPNetworks:[]string{"192.168.92.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.default"}, InterfaceName:"cali856eccd5c50", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Aug 5 22:13:29.677073 containerd[1444]: 2024-08-05 22:13:29.591 [INFO][4912] k8s.go 608: Cleaning up netns ContainerID="52609d5815f48c3224a2955f13cc78c5198781e782cc09339cbdbd4863996ad0" Aug 5 22:13:29.677073 
containerd[1444]: 2024-08-05 22:13:29.591 [INFO][4912] dataplane_linux.go 526: CleanUpNamespace called with no netns name, ignoring. ContainerID="52609d5815f48c3224a2955f13cc78c5198781e782cc09339cbdbd4863996ad0" iface="eth0" netns="" Aug 5 22:13:29.677073 containerd[1444]: 2024-08-05 22:13:29.591 [INFO][4912] k8s.go 615: Releasing IP address(es) ContainerID="52609d5815f48c3224a2955f13cc78c5198781e782cc09339cbdbd4863996ad0" Aug 5 22:13:29.677073 containerd[1444]: 2024-08-05 22:13:29.591 [INFO][4912] utils.go 188: Calico CNI releasing IP address ContainerID="52609d5815f48c3224a2955f13cc78c5198781e782cc09339cbdbd4863996ad0" Aug 5 22:13:29.677073 containerd[1444]: 2024-08-05 22:13:29.646 [INFO][4923] ipam_plugin.go 411: Releasing address using handleID ContainerID="52609d5815f48c3224a2955f13cc78c5198781e782cc09339cbdbd4863996ad0" HandleID="k8s-pod-network.52609d5815f48c3224a2955f13cc78c5198781e782cc09339cbdbd4863996ad0" Workload="ci--3975--2--0--1--de7b5ef465.novalocal-k8s-csi--node--driver--ck8bx-eth0" Aug 5 22:13:29.677073 containerd[1444]: 2024-08-05 22:13:29.646 [INFO][4923] ipam_plugin.go 352: About to acquire host-wide IPAM lock. Aug 5 22:13:29.677073 containerd[1444]: 2024-08-05 22:13:29.646 [INFO][4923] ipam_plugin.go 367: Acquired host-wide IPAM lock. Aug 5 22:13:29.677073 containerd[1444]: 2024-08-05 22:13:29.652 [WARNING][4923] ipam_plugin.go 428: Asked to release address but it doesn't exist. 
Ignoring ContainerID="52609d5815f48c3224a2955f13cc78c5198781e782cc09339cbdbd4863996ad0" HandleID="k8s-pod-network.52609d5815f48c3224a2955f13cc78c5198781e782cc09339cbdbd4863996ad0" Workload="ci--3975--2--0--1--de7b5ef465.novalocal-k8s-csi--node--driver--ck8bx-eth0" Aug 5 22:13:29.677073 containerd[1444]: 2024-08-05 22:13:29.652 [INFO][4923] ipam_plugin.go 439: Releasing address using workloadID ContainerID="52609d5815f48c3224a2955f13cc78c5198781e782cc09339cbdbd4863996ad0" HandleID="k8s-pod-network.52609d5815f48c3224a2955f13cc78c5198781e782cc09339cbdbd4863996ad0" Workload="ci--3975--2--0--1--de7b5ef465.novalocal-k8s-csi--node--driver--ck8bx-eth0" Aug 5 22:13:29.677073 containerd[1444]: 2024-08-05 22:13:29.654 [INFO][4923] ipam_plugin.go 373: Released host-wide IPAM lock. Aug 5 22:13:29.677073 containerd[1444]: 2024-08-05 22:13:29.655 [INFO][4912] k8s.go 621: Teardown processing complete. ContainerID="52609d5815f48c3224a2955f13cc78c5198781e782cc09339cbdbd4863996ad0" Aug 5 22:13:29.677073 containerd[1444]: time="2024-08-05T22:13:29.675771009Z" level=info msg="TearDown network for sandbox \"52609d5815f48c3224a2955f13cc78c5198781e782cc09339cbdbd4863996ad0\" successfully" Aug 5 22:13:29.718245 containerd[1444]: time="2024-08-05T22:13:29.716309263Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"52609d5815f48c3224a2955f13cc78c5198781e782cc09339cbdbd4863996ad0\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Aug 5 22:13:29.718381 kubelet[2635]: I0805 22:13:29.717010 2635 pod_startup_latency_tracker.go:102] "Observed pod startup duration" pod="calico-system/csi-node-driver-ck8bx" podStartSLOduration=32.625391858 podStartE2EDuration="40.716569822s" podCreationTimestamp="2024-08-05 22:12:49 +0000 UTC" firstStartedPulling="2024-08-05 22:13:20.841022147 +0000 UTC m=+52.546242545" lastFinishedPulling="2024-08-05 22:13:28.932200121 +0000 UTC m=+60.637420509" observedRunningTime="2024-08-05 22:13:29.712065569 +0000 UTC m=+61.417286017" watchObservedRunningTime="2024-08-05 22:13:29.716569822 +0000 UTC m=+61.421790260" Aug 5 22:13:29.732137 containerd[1444]: time="2024-08-05T22:13:29.731615840Z" level=info msg="RemovePodSandbox \"52609d5815f48c3224a2955f13cc78c5198781e782cc09339cbdbd4863996ad0\" returns successfully" Aug 5 22:13:29.732533 containerd[1444]: time="2024-08-05T22:13:29.732500276Z" level=info msg="StopPodSandbox for \"c44015cdd53b711bcd0d788d4f0c0a8bccb66bdbdff2b2e021822855fceb70cd\"" Aug 5 22:13:29.806829 containerd[1444]: 2024-08-05 22:13:29.772 [WARNING][4942] k8s.go 572: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="c44015cdd53b711bcd0d788d4f0c0a8bccb66bdbdff2b2e021822855fceb70cd" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--3975--2--0--1--de7b5ef465.novalocal-k8s-coredns--76f75df574--lvcgp-eth0", GenerateName:"coredns-76f75df574-", Namespace:"kube-system", SelfLink:"", UID:"2932e3f8-d82d-4537-9f9b-b46369d5b6f8", ResourceVersion:"861", Generation:0, CreationTimestamp:time.Date(2024, time.August, 5, 22, 12, 40, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"76f75df574", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-3975-2-0-1-de7b5ef465.novalocal", ContainerID:"0bb4098e5d7b022e09bdb0632ca469d1d0dfa990e63dcdbe6530bad4d804a06f", Pod:"coredns-76f75df574-lvcgp", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.92.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali89caf6ed1d6", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Aug 5 22:13:29.806829 containerd[1444]: 2024-08-05 22:13:29.773 
[INFO][4942] k8s.go 608: Cleaning up netns ContainerID="c44015cdd53b711bcd0d788d4f0c0a8bccb66bdbdff2b2e021822855fceb70cd" Aug 5 22:13:29.806829 containerd[1444]: 2024-08-05 22:13:29.773 [INFO][4942] dataplane_linux.go 526: CleanUpNamespace called with no netns name, ignoring. ContainerID="c44015cdd53b711bcd0d788d4f0c0a8bccb66bdbdff2b2e021822855fceb70cd" iface="eth0" netns="" Aug 5 22:13:29.806829 containerd[1444]: 2024-08-05 22:13:29.773 [INFO][4942] k8s.go 615: Releasing IP address(es) ContainerID="c44015cdd53b711bcd0d788d4f0c0a8bccb66bdbdff2b2e021822855fceb70cd" Aug 5 22:13:29.806829 containerd[1444]: 2024-08-05 22:13:29.773 [INFO][4942] utils.go 188: Calico CNI releasing IP address ContainerID="c44015cdd53b711bcd0d788d4f0c0a8bccb66bdbdff2b2e021822855fceb70cd" Aug 5 22:13:29.806829 containerd[1444]: 2024-08-05 22:13:29.795 [INFO][4948] ipam_plugin.go 411: Releasing address using handleID ContainerID="c44015cdd53b711bcd0d788d4f0c0a8bccb66bdbdff2b2e021822855fceb70cd" HandleID="k8s-pod-network.c44015cdd53b711bcd0d788d4f0c0a8bccb66bdbdff2b2e021822855fceb70cd" Workload="ci--3975--2--0--1--de7b5ef465.novalocal-k8s-coredns--76f75df574--lvcgp-eth0" Aug 5 22:13:29.806829 containerd[1444]: 2024-08-05 22:13:29.795 [INFO][4948] ipam_plugin.go 352: About to acquire host-wide IPAM lock. Aug 5 22:13:29.806829 containerd[1444]: 2024-08-05 22:13:29.795 [INFO][4948] ipam_plugin.go 367: Acquired host-wide IPAM lock. Aug 5 22:13:29.806829 containerd[1444]: 2024-08-05 22:13:29.802 [WARNING][4948] ipam_plugin.go 428: Asked to release address but it doesn't exist. 
Ignoring ContainerID="c44015cdd53b711bcd0d788d4f0c0a8bccb66bdbdff2b2e021822855fceb70cd" HandleID="k8s-pod-network.c44015cdd53b711bcd0d788d4f0c0a8bccb66bdbdff2b2e021822855fceb70cd" Workload="ci--3975--2--0--1--de7b5ef465.novalocal-k8s-coredns--76f75df574--lvcgp-eth0" Aug 5 22:13:29.806829 containerd[1444]: 2024-08-05 22:13:29.802 [INFO][4948] ipam_plugin.go 439: Releasing address using workloadID ContainerID="c44015cdd53b711bcd0d788d4f0c0a8bccb66bdbdff2b2e021822855fceb70cd" HandleID="k8s-pod-network.c44015cdd53b711bcd0d788d4f0c0a8bccb66bdbdff2b2e021822855fceb70cd" Workload="ci--3975--2--0--1--de7b5ef465.novalocal-k8s-coredns--76f75df574--lvcgp-eth0" Aug 5 22:13:29.806829 containerd[1444]: 2024-08-05 22:13:29.803 [INFO][4948] ipam_plugin.go 373: Released host-wide IPAM lock. Aug 5 22:13:29.806829 containerd[1444]: 2024-08-05 22:13:29.805 [INFO][4942] k8s.go 621: Teardown processing complete. ContainerID="c44015cdd53b711bcd0d788d4f0c0a8bccb66bdbdff2b2e021822855fceb70cd" Aug 5 22:13:29.808137 containerd[1444]: time="2024-08-05T22:13:29.807125952Z" level=info msg="TearDown network for sandbox \"c44015cdd53b711bcd0d788d4f0c0a8bccb66bdbdff2b2e021822855fceb70cd\" successfully" Aug 5 22:13:29.808137 containerd[1444]: time="2024-08-05T22:13:29.807154705Z" level=info msg="StopPodSandbox for \"c44015cdd53b711bcd0d788d4f0c0a8bccb66bdbdff2b2e021822855fceb70cd\" returns successfully" Aug 5 22:13:29.808137 containerd[1444]: time="2024-08-05T22:13:29.807722319Z" level=info msg="RemovePodSandbox for \"c44015cdd53b711bcd0d788d4f0c0a8bccb66bdbdff2b2e021822855fceb70cd\"" Aug 5 22:13:29.808137 containerd[1444]: time="2024-08-05T22:13:29.807763024Z" level=info msg="Forcibly stopping sandbox \"c44015cdd53b711bcd0d788d4f0c0a8bccb66bdbdff2b2e021822855fceb70cd\"" Aug 5 22:13:29.892291 containerd[1444]: 2024-08-05 22:13:29.847 [WARNING][4966] k8s.go 572: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="c44015cdd53b711bcd0d788d4f0c0a8bccb66bdbdff2b2e021822855fceb70cd" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--3975--2--0--1--de7b5ef465.novalocal-k8s-coredns--76f75df574--lvcgp-eth0", GenerateName:"coredns-76f75df574-", Namespace:"kube-system", SelfLink:"", UID:"2932e3f8-d82d-4537-9f9b-b46369d5b6f8", ResourceVersion:"861", Generation:0, CreationTimestamp:time.Date(2024, time.August, 5, 22, 12, 40, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"76f75df574", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-3975-2-0-1-de7b5ef465.novalocal", ContainerID:"0bb4098e5d7b022e09bdb0632ca469d1d0dfa990e63dcdbe6530bad4d804a06f", Pod:"coredns-76f75df574-lvcgp", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.92.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali89caf6ed1d6", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Aug 5 22:13:29.892291 containerd[1444]: 2024-08-05 22:13:29.847 
[INFO][4966] k8s.go 608: Cleaning up netns ContainerID="c44015cdd53b711bcd0d788d4f0c0a8bccb66bdbdff2b2e021822855fceb70cd" Aug 5 22:13:29.892291 containerd[1444]: 2024-08-05 22:13:29.847 [INFO][4966] dataplane_linux.go 526: CleanUpNamespace called with no netns name, ignoring. ContainerID="c44015cdd53b711bcd0d788d4f0c0a8bccb66bdbdff2b2e021822855fceb70cd" iface="eth0" netns="" Aug 5 22:13:29.892291 containerd[1444]: 2024-08-05 22:13:29.847 [INFO][4966] k8s.go 615: Releasing IP address(es) ContainerID="c44015cdd53b711bcd0d788d4f0c0a8bccb66bdbdff2b2e021822855fceb70cd" Aug 5 22:13:29.892291 containerd[1444]: 2024-08-05 22:13:29.847 [INFO][4966] utils.go 188: Calico CNI releasing IP address ContainerID="c44015cdd53b711bcd0d788d4f0c0a8bccb66bdbdff2b2e021822855fceb70cd" Aug 5 22:13:29.892291 containerd[1444]: 2024-08-05 22:13:29.876 [INFO][4972] ipam_plugin.go 411: Releasing address using handleID ContainerID="c44015cdd53b711bcd0d788d4f0c0a8bccb66bdbdff2b2e021822855fceb70cd" HandleID="k8s-pod-network.c44015cdd53b711bcd0d788d4f0c0a8bccb66bdbdff2b2e021822855fceb70cd" Workload="ci--3975--2--0--1--de7b5ef465.novalocal-k8s-coredns--76f75df574--lvcgp-eth0" Aug 5 22:13:29.892291 containerd[1444]: 2024-08-05 22:13:29.876 [INFO][4972] ipam_plugin.go 352: About to acquire host-wide IPAM lock. Aug 5 22:13:29.892291 containerd[1444]: 2024-08-05 22:13:29.876 [INFO][4972] ipam_plugin.go 367: Acquired host-wide IPAM lock. Aug 5 22:13:29.892291 containerd[1444]: 2024-08-05 22:13:29.883 [WARNING][4972] ipam_plugin.go 428: Asked to release address but it doesn't exist. 
Ignoring ContainerID="c44015cdd53b711bcd0d788d4f0c0a8bccb66bdbdff2b2e021822855fceb70cd" HandleID="k8s-pod-network.c44015cdd53b711bcd0d788d4f0c0a8bccb66bdbdff2b2e021822855fceb70cd" Workload="ci--3975--2--0--1--de7b5ef465.novalocal-k8s-coredns--76f75df574--lvcgp-eth0" Aug 5 22:13:29.892291 containerd[1444]: 2024-08-05 22:13:29.884 [INFO][4972] ipam_plugin.go 439: Releasing address using workloadID ContainerID="c44015cdd53b711bcd0d788d4f0c0a8bccb66bdbdff2b2e021822855fceb70cd" HandleID="k8s-pod-network.c44015cdd53b711bcd0d788d4f0c0a8bccb66bdbdff2b2e021822855fceb70cd" Workload="ci--3975--2--0--1--de7b5ef465.novalocal-k8s-coredns--76f75df574--lvcgp-eth0" Aug 5 22:13:29.892291 containerd[1444]: 2024-08-05 22:13:29.886 [INFO][4972] ipam_plugin.go 373: Released host-wide IPAM lock. Aug 5 22:13:29.892291 containerd[1444]: 2024-08-05 22:13:29.890 [INFO][4966] k8s.go 621: Teardown processing complete. ContainerID="c44015cdd53b711bcd0d788d4f0c0a8bccb66bdbdff2b2e021822855fceb70cd" Aug 5 22:13:29.893465 containerd[1444]: time="2024-08-05T22:13:29.892318229Z" level=info msg="TearDown network for sandbox \"c44015cdd53b711bcd0d788d4f0c0a8bccb66bdbdff2b2e021822855fceb70cd\" successfully" Aug 5 22:13:29.896892 containerd[1444]: time="2024-08-05T22:13:29.896860802Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"c44015cdd53b711bcd0d788d4f0c0a8bccb66bdbdff2b2e021822855fceb70cd\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Aug 5 22:13:29.897008 containerd[1444]: time="2024-08-05T22:13:29.896916014Z" level=info msg="RemovePodSandbox \"c44015cdd53b711bcd0d788d4f0c0a8bccb66bdbdff2b2e021822855fceb70cd\" returns successfully" Aug 5 22:13:29.897314 containerd[1444]: time="2024-08-05T22:13:29.897285382Z" level=info msg="StopPodSandbox for \"de33a8626cda7c4dc930033f9a51a27269999ea189861abee56825a2b75599ce\"" Aug 5 22:13:29.897393 containerd[1444]: time="2024-08-05T22:13:29.897368696Z" level=info msg="TearDown network for sandbox \"de33a8626cda7c4dc930033f9a51a27269999ea189861abee56825a2b75599ce\" successfully" Aug 5 22:13:29.897393 containerd[1444]: time="2024-08-05T22:13:29.897386820Z" level=info msg="StopPodSandbox for \"de33a8626cda7c4dc930033f9a51a27269999ea189861abee56825a2b75599ce\" returns successfully" Aug 5 22:13:29.899051 containerd[1444]: time="2024-08-05T22:13:29.898320135Z" level=info msg="RemovePodSandbox for \"de33a8626cda7c4dc930033f9a51a27269999ea189861abee56825a2b75599ce\"" Aug 5 22:13:29.899051 containerd[1444]: time="2024-08-05T22:13:29.898363876Z" level=info msg="Forcibly stopping sandbox \"de33a8626cda7c4dc930033f9a51a27269999ea189861abee56825a2b75599ce\"" Aug 5 22:13:29.899051 containerd[1444]: time="2024-08-05T22:13:29.898431590Z" level=info msg="TearDown network for sandbox \"de33a8626cda7c4dc930033f9a51a27269999ea189861abee56825a2b75599ce\" successfully" Aug 5 22:13:29.906073 containerd[1444]: time="2024-08-05T22:13:29.906046804Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"de33a8626cda7c4dc930033f9a51a27269999ea189861abee56825a2b75599ce\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Aug 5 22:13:29.906185 containerd[1444]: time="2024-08-05T22:13:29.906169639Z" level=info msg="RemovePodSandbox \"de33a8626cda7c4dc930033f9a51a27269999ea189861abee56825a2b75599ce\" returns successfully" Aug 5 22:13:29.906574 containerd[1444]: time="2024-08-05T22:13:29.906549698Z" level=info msg="StopPodSandbox for \"57ba4be255acc58e96253187a5e2ea4f3b3e7eda28c3e5dd3d7b0f67986e6297\"" Aug 5 22:13:29.907324 containerd[1444]: time="2024-08-05T22:13:29.906632561Z" level=info msg="TearDown network for sandbox \"57ba4be255acc58e96253187a5e2ea4f3b3e7eda28c3e5dd3d7b0f67986e6297\" successfully" Aug 5 22:13:29.907324 containerd[1444]: time="2024-08-05T22:13:29.907318993Z" level=info msg="StopPodSandbox for \"57ba4be255acc58e96253187a5e2ea4f3b3e7eda28c3e5dd3d7b0f67986e6297\" returns successfully" Aug 5 22:13:29.908095 containerd[1444]: time="2024-08-05T22:13:29.907543496Z" level=info msg="RemovePodSandbox for \"57ba4be255acc58e96253187a5e2ea4f3b3e7eda28c3e5dd3d7b0f67986e6297\"" Aug 5 22:13:29.908095 containerd[1444]: time="2024-08-05T22:13:29.907568873Z" level=info msg="Forcibly stopping sandbox \"57ba4be255acc58e96253187a5e2ea4f3b3e7eda28c3e5dd3d7b0f67986e6297\"" Aug 5 22:13:29.908095 containerd[1444]: time="2024-08-05T22:13:29.907621690Z" level=info msg="TearDown network for sandbox \"57ba4be255acc58e96253187a5e2ea4f3b3e7eda28c3e5dd3d7b0f67986e6297\" successfully" Aug 5 22:13:29.911239 containerd[1444]: time="2024-08-05T22:13:29.911196333Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"57ba4be255acc58e96253187a5e2ea4f3b3e7eda28c3e5dd3d7b0f67986e6297\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Aug 5 22:13:29.911416 containerd[1444]: time="2024-08-05T22:13:29.911335770Z" level=info msg="RemovePodSandbox \"57ba4be255acc58e96253187a5e2ea4f3b3e7eda28c3e5dd3d7b0f67986e6297\" returns successfully" Aug 5 22:13:29.911670 containerd[1444]: time="2024-08-05T22:13:29.911637204Z" level=info msg="StopPodSandbox for \"8d2607fead9c533ba1c6f4efe579b7f9aa3d73d5750c29a6449e651cd173f193\"" Aug 5 22:13:29.975308 kubelet[2635]: I0805 22:13:29.973853 2635 csi_plugin.go:99] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 Aug 5 22:13:29.977362 kubelet[2635]: I0805 22:13:29.977332 2635 csi_plugin.go:112] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock Aug 5 22:13:30.002981 containerd[1444]: 2024-08-05 22:13:29.957 [WARNING][4990] k8s.go 572: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="8d2607fead9c533ba1c6f4efe579b7f9aa3d73d5750c29a6449e651cd173f193" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--3975--2--0--1--de7b5ef465.novalocal-k8s-coredns--76f75df574--t6q9g-eth0", GenerateName:"coredns-76f75df574-", Namespace:"kube-system", SelfLink:"", UID:"55edd9ef-b537-4c83-ad1e-084f093bbf6f", ResourceVersion:"848", Generation:0, CreationTimestamp:time.Date(2024, time.August, 5, 22, 12, 40, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"76f75df574", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, 
Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-3975-2-0-1-de7b5ef465.novalocal", ContainerID:"7dd79605bdd06466eb96bf397373cefe6c84be3828c1fc68ad1499f4372d8c2a", Pod:"coredns-76f75df574-t6q9g", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.92.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calib1d4a50183b", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Aug 5 22:13:30.002981 containerd[1444]: 2024-08-05 22:13:29.958 [INFO][4990] k8s.go 608: Cleaning up netns ContainerID="8d2607fead9c533ba1c6f4efe579b7f9aa3d73d5750c29a6449e651cd173f193" Aug 5 22:13:30.002981 containerd[1444]: 2024-08-05 22:13:29.958 [INFO][4990] dataplane_linux.go 526: CleanUpNamespace called with no netns name, ignoring. 
ContainerID="8d2607fead9c533ba1c6f4efe579b7f9aa3d73d5750c29a6449e651cd173f193" iface="eth0" netns="" Aug 5 22:13:30.002981 containerd[1444]: 2024-08-05 22:13:29.958 [INFO][4990] k8s.go 615: Releasing IP address(es) ContainerID="8d2607fead9c533ba1c6f4efe579b7f9aa3d73d5750c29a6449e651cd173f193" Aug 5 22:13:30.002981 containerd[1444]: 2024-08-05 22:13:29.958 [INFO][4990] utils.go 188: Calico CNI releasing IP address ContainerID="8d2607fead9c533ba1c6f4efe579b7f9aa3d73d5750c29a6449e651cd173f193" Aug 5 22:13:30.002981 containerd[1444]: 2024-08-05 22:13:29.986 [INFO][4996] ipam_plugin.go 411: Releasing address using handleID ContainerID="8d2607fead9c533ba1c6f4efe579b7f9aa3d73d5750c29a6449e651cd173f193" HandleID="k8s-pod-network.8d2607fead9c533ba1c6f4efe579b7f9aa3d73d5750c29a6449e651cd173f193" Workload="ci--3975--2--0--1--de7b5ef465.novalocal-k8s-coredns--76f75df574--t6q9g-eth0" Aug 5 22:13:30.002981 containerd[1444]: 2024-08-05 22:13:29.986 [INFO][4996] ipam_plugin.go 352: About to acquire host-wide IPAM lock. Aug 5 22:13:30.002981 containerd[1444]: 2024-08-05 22:13:29.986 [INFO][4996] ipam_plugin.go 367: Acquired host-wide IPAM lock. Aug 5 22:13:30.002981 containerd[1444]: 2024-08-05 22:13:29.996 [WARNING][4996] ipam_plugin.go 428: Asked to release address but it doesn't exist. 
Ignoring ContainerID="8d2607fead9c533ba1c6f4efe579b7f9aa3d73d5750c29a6449e651cd173f193" HandleID="k8s-pod-network.8d2607fead9c533ba1c6f4efe579b7f9aa3d73d5750c29a6449e651cd173f193" Workload="ci--3975--2--0--1--de7b5ef465.novalocal-k8s-coredns--76f75df574--t6q9g-eth0" Aug 5 22:13:30.002981 containerd[1444]: 2024-08-05 22:13:29.996 [INFO][4996] ipam_plugin.go 439: Releasing address using workloadID ContainerID="8d2607fead9c533ba1c6f4efe579b7f9aa3d73d5750c29a6449e651cd173f193" HandleID="k8s-pod-network.8d2607fead9c533ba1c6f4efe579b7f9aa3d73d5750c29a6449e651cd173f193" Workload="ci--3975--2--0--1--de7b5ef465.novalocal-k8s-coredns--76f75df574--t6q9g-eth0" Aug 5 22:13:30.002981 containerd[1444]: 2024-08-05 22:13:29.998 [INFO][4996] ipam_plugin.go 373: Released host-wide IPAM lock. Aug 5 22:13:30.002981 containerd[1444]: 2024-08-05 22:13:30.000 [INFO][4990] k8s.go 621: Teardown processing complete. ContainerID="8d2607fead9c533ba1c6f4efe579b7f9aa3d73d5750c29a6449e651cd173f193" Aug 5 22:13:30.003761 containerd[1444]: time="2024-08-05T22:13:30.003721189Z" level=info msg="TearDown network for sandbox \"8d2607fead9c533ba1c6f4efe579b7f9aa3d73d5750c29a6449e651cd173f193\" successfully" Aug 5 22:13:30.004394 containerd[1444]: time="2024-08-05T22:13:30.003831101Z" level=info msg="StopPodSandbox for \"8d2607fead9c533ba1c6f4efe579b7f9aa3d73d5750c29a6449e651cd173f193\" returns successfully" Aug 5 22:13:30.004394 containerd[1444]: time="2024-08-05T22:13:30.004212202Z" level=info msg="RemovePodSandbox for \"8d2607fead9c533ba1c6f4efe579b7f9aa3d73d5750c29a6449e651cd173f193\"" Aug 5 22:13:30.004497 containerd[1444]: time="2024-08-05T22:13:30.004480256Z" level=info msg="Forcibly stopping sandbox \"8d2607fead9c533ba1c6f4efe579b7f9aa3d73d5750c29a6449e651cd173f193\"" Aug 5 22:13:30.089196 containerd[1444]: 2024-08-05 22:13:30.051 [WARNING][5014] k8s.go 572: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="8d2607fead9c533ba1c6f4efe579b7f9aa3d73d5750c29a6449e651cd173f193" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--3975--2--0--1--de7b5ef465.novalocal-k8s-coredns--76f75df574--t6q9g-eth0", GenerateName:"coredns-76f75df574-", Namespace:"kube-system", SelfLink:"", UID:"55edd9ef-b537-4c83-ad1e-084f093bbf6f", ResourceVersion:"848", Generation:0, CreationTimestamp:time.Date(2024, time.August, 5, 22, 12, 40, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"76f75df574", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-3975-2-0-1-de7b5ef465.novalocal", ContainerID:"7dd79605bdd06466eb96bf397373cefe6c84be3828c1fc68ad1499f4372d8c2a", Pod:"coredns-76f75df574-t6q9g", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.92.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calib1d4a50183b", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Aug 5 22:13:30.089196 containerd[1444]: 2024-08-05 22:13:30.051 
[INFO][5014] k8s.go 608: Cleaning up netns ContainerID="8d2607fead9c533ba1c6f4efe579b7f9aa3d73d5750c29a6449e651cd173f193" Aug 5 22:13:30.089196 containerd[1444]: 2024-08-05 22:13:30.051 [INFO][5014] dataplane_linux.go 526: CleanUpNamespace called with no netns name, ignoring. ContainerID="8d2607fead9c533ba1c6f4efe579b7f9aa3d73d5750c29a6449e651cd173f193" iface="eth0" netns="" Aug 5 22:13:30.089196 containerd[1444]: 2024-08-05 22:13:30.051 [INFO][5014] k8s.go 615: Releasing IP address(es) ContainerID="8d2607fead9c533ba1c6f4efe579b7f9aa3d73d5750c29a6449e651cd173f193" Aug 5 22:13:30.089196 containerd[1444]: 2024-08-05 22:13:30.051 [INFO][5014] utils.go 188: Calico CNI releasing IP address ContainerID="8d2607fead9c533ba1c6f4efe579b7f9aa3d73d5750c29a6449e651cd173f193" Aug 5 22:13:30.089196 containerd[1444]: 2024-08-05 22:13:30.074 [INFO][5020] ipam_plugin.go 411: Releasing address using handleID ContainerID="8d2607fead9c533ba1c6f4efe579b7f9aa3d73d5750c29a6449e651cd173f193" HandleID="k8s-pod-network.8d2607fead9c533ba1c6f4efe579b7f9aa3d73d5750c29a6449e651cd173f193" Workload="ci--3975--2--0--1--de7b5ef465.novalocal-k8s-coredns--76f75df574--t6q9g-eth0" Aug 5 22:13:30.089196 containerd[1444]: 2024-08-05 22:13:30.074 [INFO][5020] ipam_plugin.go 352: About to acquire host-wide IPAM lock. Aug 5 22:13:30.089196 containerd[1444]: 2024-08-05 22:13:30.074 [INFO][5020] ipam_plugin.go 367: Acquired host-wide IPAM lock. Aug 5 22:13:30.089196 containerd[1444]: 2024-08-05 22:13:30.083 [WARNING][5020] ipam_plugin.go 428: Asked to release address but it doesn't exist. 
Ignoring ContainerID="8d2607fead9c533ba1c6f4efe579b7f9aa3d73d5750c29a6449e651cd173f193" HandleID="k8s-pod-network.8d2607fead9c533ba1c6f4efe579b7f9aa3d73d5750c29a6449e651cd173f193" Workload="ci--3975--2--0--1--de7b5ef465.novalocal-k8s-coredns--76f75df574--t6q9g-eth0" Aug 5 22:13:30.089196 containerd[1444]: 2024-08-05 22:13:30.083 [INFO][5020] ipam_plugin.go 439: Releasing address using workloadID ContainerID="8d2607fead9c533ba1c6f4efe579b7f9aa3d73d5750c29a6449e651cd173f193" HandleID="k8s-pod-network.8d2607fead9c533ba1c6f4efe579b7f9aa3d73d5750c29a6449e651cd173f193" Workload="ci--3975--2--0--1--de7b5ef465.novalocal-k8s-coredns--76f75df574--t6q9g-eth0" Aug 5 22:13:30.089196 containerd[1444]: 2024-08-05 22:13:30.085 [INFO][5020] ipam_plugin.go 373: Released host-wide IPAM lock. Aug 5 22:13:30.089196 containerd[1444]: 2024-08-05 22:13:30.087 [INFO][5014] k8s.go 621: Teardown processing complete. ContainerID="8d2607fead9c533ba1c6f4efe579b7f9aa3d73d5750c29a6449e651cd173f193" Aug 5 22:13:30.089763 containerd[1444]: time="2024-08-05T22:13:30.089723504Z" level=info msg="TearDown network for sandbox \"8d2607fead9c533ba1c6f4efe579b7f9aa3d73d5750c29a6449e651cd173f193\" successfully" Aug 5 22:13:30.093882 containerd[1444]: time="2024-08-05T22:13:30.093854847Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"8d2607fead9c533ba1c6f4efe579b7f9aa3d73d5750c29a6449e651cd173f193\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Aug 5 22:13:30.094025 containerd[1444]: time="2024-08-05T22:13:30.093995967Z" level=info msg="RemovePodSandbox \"8d2607fead9c533ba1c6f4efe579b7f9aa3d73d5750c29a6449e651cd173f193\" returns successfully" Aug 5 22:13:30.094527 containerd[1444]: time="2024-08-05T22:13:30.094505455Z" level=info msg="StopPodSandbox for \"8c50acd90b9a35b996fe6fdab53bd12bbbfd7a8f245f5a4d560d009635fdaaa1\"" Aug 5 22:13:30.167597 containerd[1444]: 2024-08-05 22:13:30.133 [WARNING][5038] k8s.go 572: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="8c50acd90b9a35b996fe6fdab53bd12bbbfd7a8f245f5a4d560d009635fdaaa1" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--3975--2--0--1--de7b5ef465.novalocal-k8s-calico--kube--controllers--58c54bc647--pqf69-eth0", GenerateName:"calico-kube-controllers-58c54bc647-", Namespace:"calico-system", SelfLink:"", UID:"4ecfd1c0-3700-4ed3-84bf-da7987083d57", ResourceVersion:"883", Generation:0, CreationTimestamp:time.Date(2024, time.August, 5, 22, 12, 50, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"58c54bc647", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-3975-2-0-1-de7b5ef465.novalocal", ContainerID:"35145b36206ed650826e52af4918c7250250684c9fa638a8c35efe287ddd8358", Pod:"calico-kube-controllers-58c54bc647-pqf69", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.92.130/32"}, 
IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali32125d64fa6", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Aug 5 22:13:30.167597 containerd[1444]: 2024-08-05 22:13:30.133 [INFO][5038] k8s.go 608: Cleaning up netns ContainerID="8c50acd90b9a35b996fe6fdab53bd12bbbfd7a8f245f5a4d560d009635fdaaa1" Aug 5 22:13:30.167597 containerd[1444]: 2024-08-05 22:13:30.133 [INFO][5038] dataplane_linux.go 526: CleanUpNamespace called with no netns name, ignoring. ContainerID="8c50acd90b9a35b996fe6fdab53bd12bbbfd7a8f245f5a4d560d009635fdaaa1" iface="eth0" netns="" Aug 5 22:13:30.167597 containerd[1444]: 2024-08-05 22:13:30.133 [INFO][5038] k8s.go 615: Releasing IP address(es) ContainerID="8c50acd90b9a35b996fe6fdab53bd12bbbfd7a8f245f5a4d560d009635fdaaa1" Aug 5 22:13:30.167597 containerd[1444]: 2024-08-05 22:13:30.133 [INFO][5038] utils.go 188: Calico CNI releasing IP address ContainerID="8c50acd90b9a35b996fe6fdab53bd12bbbfd7a8f245f5a4d560d009635fdaaa1" Aug 5 22:13:30.167597 containerd[1444]: 2024-08-05 22:13:30.155 [INFO][5044] ipam_plugin.go 411: Releasing address using handleID ContainerID="8c50acd90b9a35b996fe6fdab53bd12bbbfd7a8f245f5a4d560d009635fdaaa1" HandleID="k8s-pod-network.8c50acd90b9a35b996fe6fdab53bd12bbbfd7a8f245f5a4d560d009635fdaaa1" Workload="ci--3975--2--0--1--de7b5ef465.novalocal-k8s-calico--kube--controllers--58c54bc647--pqf69-eth0" Aug 5 22:13:30.167597 containerd[1444]: 2024-08-05 22:13:30.155 [INFO][5044] ipam_plugin.go 352: About to acquire host-wide IPAM lock. Aug 5 22:13:30.167597 containerd[1444]: 2024-08-05 22:13:30.155 [INFO][5044] ipam_plugin.go 367: Acquired host-wide IPAM lock. Aug 5 22:13:30.167597 containerd[1444]: 2024-08-05 22:13:30.162 [WARNING][5044] ipam_plugin.go 428: Asked to release address but it doesn't exist. 
Ignoring ContainerID="8c50acd90b9a35b996fe6fdab53bd12bbbfd7a8f245f5a4d560d009635fdaaa1" HandleID="k8s-pod-network.8c50acd90b9a35b996fe6fdab53bd12bbbfd7a8f245f5a4d560d009635fdaaa1" Workload="ci--3975--2--0--1--de7b5ef465.novalocal-k8s-calico--kube--controllers--58c54bc647--pqf69-eth0" Aug 5 22:13:30.167597 containerd[1444]: 2024-08-05 22:13:30.162 [INFO][5044] ipam_plugin.go 439: Releasing address using workloadID ContainerID="8c50acd90b9a35b996fe6fdab53bd12bbbfd7a8f245f5a4d560d009635fdaaa1" HandleID="k8s-pod-network.8c50acd90b9a35b996fe6fdab53bd12bbbfd7a8f245f5a4d560d009635fdaaa1" Workload="ci--3975--2--0--1--de7b5ef465.novalocal-k8s-calico--kube--controllers--58c54bc647--pqf69-eth0" Aug 5 22:13:30.167597 containerd[1444]: 2024-08-05 22:13:30.164 [INFO][5044] ipam_plugin.go 373: Released host-wide IPAM lock. Aug 5 22:13:30.167597 containerd[1444]: 2024-08-05 22:13:30.166 [INFO][5038] k8s.go 621: Teardown processing complete. ContainerID="8c50acd90b9a35b996fe6fdab53bd12bbbfd7a8f245f5a4d560d009635fdaaa1" Aug 5 22:13:30.168168 containerd[1444]: time="2024-08-05T22:13:30.168088844Z" level=info msg="TearDown network for sandbox \"8c50acd90b9a35b996fe6fdab53bd12bbbfd7a8f245f5a4d560d009635fdaaa1\" successfully" Aug 5 22:13:30.168168 containerd[1444]: time="2024-08-05T22:13:30.168130742Z" level=info msg="StopPodSandbox for \"8c50acd90b9a35b996fe6fdab53bd12bbbfd7a8f245f5a4d560d009635fdaaa1\" returns successfully" Aug 5 22:13:30.169079 containerd[1444]: time="2024-08-05T22:13:30.168790526Z" level=info msg="RemovePodSandbox for \"8c50acd90b9a35b996fe6fdab53bd12bbbfd7a8f245f5a4d560d009635fdaaa1\"" Aug 5 22:13:30.169079 containerd[1444]: time="2024-08-05T22:13:30.168817125Z" level=info msg="Forcibly stopping sandbox \"8c50acd90b9a35b996fe6fdab53bd12bbbfd7a8f245f5a4d560d009635fdaaa1\"" Aug 5 22:13:30.247123 containerd[1444]: 2024-08-05 22:13:30.209 [WARNING][5062] k8s.go 572: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="8c50acd90b9a35b996fe6fdab53bd12bbbfd7a8f245f5a4d560d009635fdaaa1" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--3975--2--0--1--de7b5ef465.novalocal-k8s-calico--kube--controllers--58c54bc647--pqf69-eth0", GenerateName:"calico-kube-controllers-58c54bc647-", Namespace:"calico-system", SelfLink:"", UID:"4ecfd1c0-3700-4ed3-84bf-da7987083d57", ResourceVersion:"883", Generation:0, CreationTimestamp:time.Date(2024, time.August, 5, 22, 12, 50, 0, time.Local), DeletionTimestamp:<nil>, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"58c54bc647", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-3975-2-0-1-de7b5ef465.novalocal", ContainerID:"35145b36206ed650826e52af4918c7250250684c9fa638a8c35efe287ddd8358", Pod:"calico-kube-controllers-58c54bc647-pqf69", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.92.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali32125d64fa6", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}}
Aug 5 22:13:30.247123 containerd[1444]: 2024-08-05 22:13:30.209 [INFO][5062] k8s.go 608: Cleaning up netns ContainerID="8c50acd90b9a35b996fe6fdab53bd12bbbfd7a8f245f5a4d560d009635fdaaa1"
Aug 5 22:13:30.247123 containerd[1444]: 2024-08-05 22:13:30.210 [INFO][5062] dataplane_linux.go 526: CleanUpNamespace called with no netns name, ignoring. ContainerID="8c50acd90b9a35b996fe6fdab53bd12bbbfd7a8f245f5a4d560d009635fdaaa1" iface="eth0" netns=""
Aug 5 22:13:30.247123 containerd[1444]: 2024-08-05 22:13:30.210 [INFO][5062] k8s.go 615: Releasing IP address(es) ContainerID="8c50acd90b9a35b996fe6fdab53bd12bbbfd7a8f245f5a4d560d009635fdaaa1"
Aug 5 22:13:30.247123 containerd[1444]: 2024-08-05 22:13:30.210 [INFO][5062] utils.go 188: Calico CNI releasing IP address ContainerID="8c50acd90b9a35b996fe6fdab53bd12bbbfd7a8f245f5a4d560d009635fdaaa1"
Aug 5 22:13:30.247123 containerd[1444]: 2024-08-05 22:13:30.231 [INFO][5068] ipam_plugin.go 411: Releasing address using handleID ContainerID="8c50acd90b9a35b996fe6fdab53bd12bbbfd7a8f245f5a4d560d009635fdaaa1" HandleID="k8s-pod-network.8c50acd90b9a35b996fe6fdab53bd12bbbfd7a8f245f5a4d560d009635fdaaa1" Workload="ci--3975--2--0--1--de7b5ef465.novalocal-k8s-calico--kube--controllers--58c54bc647--pqf69-eth0"
Aug 5 22:13:30.247123 containerd[1444]: 2024-08-05 22:13:30.231 [INFO][5068] ipam_plugin.go 352: About to acquire host-wide IPAM lock.
Aug 5 22:13:30.247123 containerd[1444]: 2024-08-05 22:13:30.231 [INFO][5068] ipam_plugin.go 367: Acquired host-wide IPAM lock.
Aug 5 22:13:30.247123 containerd[1444]: 2024-08-05 22:13:30.241 [WARNING][5068] ipam_plugin.go 428: Asked to release address but it doesn't exist. Ignoring ContainerID="8c50acd90b9a35b996fe6fdab53bd12bbbfd7a8f245f5a4d560d009635fdaaa1" HandleID="k8s-pod-network.8c50acd90b9a35b996fe6fdab53bd12bbbfd7a8f245f5a4d560d009635fdaaa1" Workload="ci--3975--2--0--1--de7b5ef465.novalocal-k8s-calico--kube--controllers--58c54bc647--pqf69-eth0"
Aug 5 22:13:30.247123 containerd[1444]: 2024-08-05 22:13:30.241 [INFO][5068] ipam_plugin.go 439: Releasing address using workloadID ContainerID="8c50acd90b9a35b996fe6fdab53bd12bbbfd7a8f245f5a4d560d009635fdaaa1" HandleID="k8s-pod-network.8c50acd90b9a35b996fe6fdab53bd12bbbfd7a8f245f5a4d560d009635fdaaa1" Workload="ci--3975--2--0--1--de7b5ef465.novalocal-k8s-calico--kube--controllers--58c54bc647--pqf69-eth0"
Aug 5 22:13:30.247123 containerd[1444]: 2024-08-05 22:13:30.243 [INFO][5068] ipam_plugin.go 373: Released host-wide IPAM lock.
Aug 5 22:13:30.247123 containerd[1444]: 2024-08-05 22:13:30.245 [INFO][5062] k8s.go 621: Teardown processing complete. ContainerID="8c50acd90b9a35b996fe6fdab53bd12bbbfd7a8f245f5a4d560d009635fdaaa1"
Aug 5 22:13:30.249435 containerd[1444]: time="2024-08-05T22:13:30.247494530Z" level=info msg="TearDown network for sandbox \"8c50acd90b9a35b996fe6fdab53bd12bbbfd7a8f245f5a4d560d009635fdaaa1\" successfully"
Aug 5 22:13:30.254165 containerd[1444]: time="2024-08-05T22:13:30.253997538Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"8c50acd90b9a35b996fe6fdab53bd12bbbfd7a8f245f5a4d560d009635fdaaa1\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Aug 5 22:13:30.254165 containerd[1444]: time="2024-08-05T22:13:30.254072375Z" level=info msg="RemovePodSandbox \"8c50acd90b9a35b996fe6fdab53bd12bbbfd7a8f245f5a4d560d009635fdaaa1\" returns successfully"
Aug 5 22:13:32.486948 update_engine[1425]: I0805 22:13:32.486212 1425 libcurl_http_fetcher.cc:47] Starting/Resuming transfer
Aug 5 22:13:32.486948 update_engine[1425]: I0805 22:13:32.486629 1425 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP
Aug 5 22:13:32.488686 update_engine[1425]: I0805 22:13:32.487046 1425 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds.
Aug 5 22:13:32.497628 update_engine[1425]: E0805 22:13:32.497521 1425 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled
Aug 5 22:13:32.497628 update_engine[1425]: I0805 22:13:32.497618 1425 libcurl_http_fetcher.cc:283] No HTTP response, retry 2
Aug 5 22:13:42.489826 update_engine[1425]: I0805 22:13:42.489708 1425 libcurl_http_fetcher.cc:47] Starting/Resuming transfer
Aug 5 22:13:42.490633 update_engine[1425]: I0805 22:13:42.490137 1425 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP
Aug 5 22:13:42.490633 update_engine[1425]: I0805 22:13:42.490566 1425 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds.
Aug 5 22:13:42.500815 update_engine[1425]: E0805 22:13:42.500752 1425 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled
Aug 5 22:13:42.500917 update_engine[1425]: I0805 22:13:42.500846 1425 libcurl_http_fetcher.cc:283] No HTTP response, retry 3
Aug 5 22:13:42.852818 systemd[1]: Started sshd@9-172.24.4.33:22-172.24.4.1:37392.service - OpenSSH per-connection server daemon (172.24.4.1:37392).
Aug 5 22:13:44.122439 sshd[5112]: Accepted publickey for core from 172.24.4.1 port 37392 ssh2: RSA SHA256:wJOsecOnS68Cf9bfpO6HbyavHDudnwLnl2CjsDHwoC8
Aug 5 22:13:44.126040 sshd[5112]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0)
Aug 5 22:13:44.136506 systemd-logind[1423]: New session 12 of user core.
Aug 5 22:13:44.143523 systemd[1]: Started session-12.scope - Session 12 of User core.
Aug 5 22:13:45.047594 systemd[1]: run-containerd-runc-k8s.io-81801b50e438d74a07c8b62aedd1cf55c8a0fde7a170d8621b6412d43c922fcd-runc.qQIKx7.mount: Deactivated successfully.
Aug 5 22:13:45.097040 sshd[5112]: pam_unix(sshd:session): session closed for user core
Aug 5 22:13:45.102764 systemd[1]: sshd@9-172.24.4.33:22-172.24.4.1:37392.service: Deactivated successfully.
Aug 5 22:13:45.107822 systemd[1]: session-12.scope: Deactivated successfully.
Aug 5 22:13:45.112188 systemd-logind[1423]: Session 12 logged out. Waiting for processes to exit.
Aug 5 22:13:45.116107 systemd-logind[1423]: Removed session 12.
Aug 5 22:13:50.119850 systemd[1]: Started sshd@10-172.24.4.33:22-172.24.4.1:46940.service - OpenSSH per-connection server daemon (172.24.4.1:46940).
Aug 5 22:13:51.251899 sshd[5154]: Accepted publickey for core from 172.24.4.1 port 46940 ssh2: RSA SHA256:wJOsecOnS68Cf9bfpO6HbyavHDudnwLnl2CjsDHwoC8
Aug 5 22:13:51.253316 sshd[5154]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0)
Aug 5 22:13:51.262934 systemd-logind[1423]: New session 13 of user core.
Aug 5 22:13:51.268875 systemd[1]: Started session-13.scope - Session 13 of User core.
Aug 5 22:13:52.481726 update_engine[1425]: I0805 22:13:52.481287 1425 libcurl_http_fetcher.cc:47] Starting/Resuming transfer
Aug 5 22:13:52.481726 update_engine[1425]: I0805 22:13:52.481487 1425 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP
Aug 5 22:13:52.481726 update_engine[1425]: I0805 22:13:52.481690 1425 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds.
Aug 5 22:13:52.492021 update_engine[1425]: E0805 22:13:52.491725 1425 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled
Aug 5 22:13:52.492021 update_engine[1425]: I0805 22:13:52.491773 1425 libcurl_http_fetcher.cc:297] Transfer resulted in an error (0), 0 bytes downloaded
Aug 5 22:13:52.492021 update_engine[1425]: I0805 22:13:52.491778 1425 omaha_request_action.cc:617] Omaha request response:
Aug 5 22:13:52.492021 update_engine[1425]: E0805 22:13:52.491863 1425 omaha_request_action.cc:636] Omaha request network transfer failed.
Aug 5 22:13:52.547592 update_engine[1425]: I0805 22:13:52.547545 1425 action_processor.cc:68] ActionProcessor::ActionComplete: OmahaRequestAction action failed. Aborting processing.
Aug 5 22:13:52.547592 update_engine[1425]: I0805 22:13:52.547580 1425 action_processor.cc:73] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction
Aug 5 22:13:52.547592 update_engine[1425]: I0805 22:13:52.547585 1425 update_attempter.cc:306] Processing Done.
Aug 5 22:13:52.547592 update_engine[1425]: E0805 22:13:52.547596 1425 update_attempter.cc:619] Update failed.
Aug 5 22:13:52.547773 update_engine[1425]: I0805 22:13:52.547616 1425 utils.cc:600] Converting error code 2000 to kActionCodeOmahaErrorInHTTPResponse
Aug 5 22:13:52.547773 update_engine[1425]: I0805 22:13:52.547619 1425 payload_state.cc:97] Updating payload state for error code: 37 (kActionCodeOmahaErrorInHTTPResponse)
Aug 5 22:13:52.547773 update_engine[1425]: I0805 22:13:52.547624 1425 payload_state.cc:103] Ignoring failures until we get a valid Omaha response.
Aug 5 22:13:52.547773 update_engine[1425]: I0805 22:13:52.547706 1425 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction
Aug 5 22:13:52.547773 update_engine[1425]: I0805 22:13:52.547727 1425 omaha_request_action.cc:271] Posting an Omaha request to disabled
Aug 5 22:13:52.547773 update_engine[1425]: I0805 22:13:52.547731 1425 omaha_request_action.cc:272] Request:
Aug 5 22:13:52.547773 update_engine[1425]:
Aug 5 22:13:52.547773 update_engine[1425]:
Aug 5 22:13:52.547773 update_engine[1425]:
Aug 5 22:13:52.547773 update_engine[1425]:
Aug 5 22:13:52.547773 update_engine[1425]:
Aug 5 22:13:52.547773 update_engine[1425]:
Aug 5 22:13:52.547773 update_engine[1425]: I0805 22:13:52.547735 1425 libcurl_http_fetcher.cc:47] Starting/Resuming transfer
Aug 5 22:13:52.548182 update_engine[1425]: I0805 22:13:52.547881 1425 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP
Aug 5 22:13:52.548182 update_engine[1425]: I0805 22:13:52.548115 1425 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds.
Aug 5 22:13:52.552090 locksmithd[1456]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_REPORTING_ERROR_EVENT" NewVersion=0.0.0 NewSize=0
Aug 5 22:13:52.558319 update_engine[1425]: E0805 22:13:52.558279 1425 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled
Aug 5 22:13:52.558424 update_engine[1425]: I0805 22:13:52.558332 1425 libcurl_http_fetcher.cc:297] Transfer resulted in an error (0), 0 bytes downloaded
Aug 5 22:13:52.558424 update_engine[1425]: I0805 22:13:52.558338 1425 omaha_request_action.cc:617] Omaha request response:
Aug 5 22:13:52.558424 update_engine[1425]: I0805 22:13:52.558342 1425 action_processor.cc:65] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction
Aug 5 22:13:52.558424 update_engine[1425]: I0805 22:13:52.558346 1425 action_processor.cc:73] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction
Aug 5 22:13:52.558424 update_engine[1425]: I0805 22:13:52.558348 1425 update_attempter.cc:306] Processing Done.
Aug 5 22:13:52.558424 update_engine[1425]: I0805 22:13:52.558352 1425 update_attempter.cc:310] Error event sent.
Aug 5 22:13:52.558424 update_engine[1425]: I0805 22:13:52.558360 1425 update_check_scheduler.cc:74] Next update check in 43m7s
Aug 5 22:13:52.558712 locksmithd[1456]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_IDLE" NewVersion=0.0.0 NewSize=0
Aug 5 22:13:52.888633 sshd[5154]: pam_unix(sshd:session): session closed for user core
Aug 5 22:13:52.892391 systemd[1]: sshd@10-172.24.4.33:22-172.24.4.1:46940.service: Deactivated successfully.
Aug 5 22:13:52.894962 systemd[1]: session-13.scope: Deactivated successfully.
Aug 5 22:13:52.897159 systemd-logind[1423]: Session 13 logged out. Waiting for processes to exit.
Aug 5 22:13:52.899149 systemd-logind[1423]: Removed session 13.
Aug 5 22:13:57.915747 systemd[1]: Started sshd@11-172.24.4.33:22-172.24.4.1:46614.service - OpenSSH per-connection server daemon (172.24.4.1:46614).
Aug 5 22:13:59.583550 sshd[5169]: Accepted publickey for core from 172.24.4.1 port 46614 ssh2: RSA SHA256:wJOsecOnS68Cf9bfpO6HbyavHDudnwLnl2CjsDHwoC8
Aug 5 22:13:59.586814 sshd[5169]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0)
Aug 5 22:13:59.605835 systemd-logind[1423]: New session 14 of user core.
Aug 5 22:13:59.608582 systemd[1]: Started session-14.scope - Session 14 of User core.
Aug 5 22:14:00.491823 sshd[5169]: pam_unix(sshd:session): session closed for user core
Aug 5 22:14:00.498282 systemd[1]: Started sshd@12-172.24.4.33:22-172.24.4.1:46630.service - OpenSSH per-connection server daemon (172.24.4.1:46630).
Aug 5 22:14:00.501387 systemd[1]: sshd@11-172.24.4.33:22-172.24.4.1:46614.service: Deactivated successfully.
Aug 5 22:14:00.506576 systemd[1]: session-14.scope: Deactivated successfully.
Aug 5 22:14:00.509119 systemd-logind[1423]: Session 14 logged out. Waiting for processes to exit.
Aug 5 22:14:00.511430 systemd-logind[1423]: Removed session 14.
Aug 5 22:14:02.211810 sshd[5193]: Accepted publickey for core from 172.24.4.1 port 46630 ssh2: RSA SHA256:wJOsecOnS68Cf9bfpO6HbyavHDudnwLnl2CjsDHwoC8
Aug 5 22:14:02.215210 sshd[5193]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0)
Aug 5 22:14:02.226950 systemd-logind[1423]: New session 15 of user core.
Aug 5 22:14:02.235767 systemd[1]: Started session-15.scope - Session 15 of User core.
Aug 5 22:14:03.172586 sshd[5193]: pam_unix(sshd:session): session closed for user core
Aug 5 22:14:03.184794 systemd[1]: sshd@12-172.24.4.33:22-172.24.4.1:46630.service: Deactivated successfully.
Aug 5 22:14:03.187038 systemd[1]: session-15.scope: Deactivated successfully.
Aug 5 22:14:03.189822 systemd-logind[1423]: Session 15 logged out. Waiting for processes to exit.
Aug 5 22:14:03.197515 systemd[1]: Started sshd@13-172.24.4.33:22-172.24.4.1:46646.service - OpenSSH per-connection server daemon (172.24.4.1:46646).
Aug 5 22:14:03.199692 systemd-logind[1423]: Removed session 15.
Aug 5 22:14:04.880790 sshd[5207]: Accepted publickey for core from 172.24.4.1 port 46646 ssh2: RSA SHA256:wJOsecOnS68Cf9bfpO6HbyavHDudnwLnl2CjsDHwoC8
Aug 5 22:14:04.882693 sshd[5207]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0)
Aug 5 22:14:04.896302 systemd-logind[1423]: New session 16 of user core.
Aug 5 22:14:04.905635 systemd[1]: Started session-16.scope - Session 16 of User core.
Aug 5 22:14:05.703876 sshd[5207]: pam_unix(sshd:session): session closed for user core
Aug 5 22:14:05.719583 systemd[1]: sshd@13-172.24.4.33:22-172.24.4.1:46646.service: Deactivated successfully.
Aug 5 22:14:05.720106 systemd-logind[1423]: Session 16 logged out. Waiting for processes to exit.
Aug 5 22:14:05.724688 systemd[1]: session-16.scope: Deactivated successfully.
Aug 5 22:14:05.730582 systemd-logind[1423]: Removed session 16.
Aug 5 22:14:10.728834 systemd[1]: Started sshd@14-172.24.4.33:22-172.24.4.1:53066.service - OpenSSH per-connection server daemon (172.24.4.1:53066).
Aug 5 22:14:11.991951 sshd[5250]: Accepted publickey for core from 172.24.4.1 port 53066 ssh2: RSA SHA256:wJOsecOnS68Cf9bfpO6HbyavHDudnwLnl2CjsDHwoC8
Aug 5 22:14:11.996732 sshd[5250]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0)
Aug 5 22:14:12.010707 systemd-logind[1423]: New session 17 of user core.
Aug 5 22:14:12.017495 systemd[1]: Started session-17.scope - Session 17 of User core.
Aug 5 22:14:12.914437 sshd[5250]: pam_unix(sshd:session): session closed for user core
Aug 5 22:14:12.922576 systemd[1]: sshd@14-172.24.4.33:22-172.24.4.1:53066.service: Deactivated successfully.
Aug 5 22:14:12.928134 systemd[1]: session-17.scope: Deactivated successfully.
Aug 5 22:14:12.930794 systemd-logind[1423]: Session 17 logged out. Waiting for processes to exit.
Aug 5 22:14:12.933167 systemd-logind[1423]: Removed session 17.
Aug 5 22:14:17.937856 systemd[1]: Started sshd@15-172.24.4.33:22-172.24.4.1:34032.service - OpenSSH per-connection server daemon (172.24.4.1:34032).
Aug 5 22:14:19.199083 sshd[5288]: Accepted publickey for core from 172.24.4.1 port 34032 ssh2: RSA SHA256:wJOsecOnS68Cf9bfpO6HbyavHDudnwLnl2CjsDHwoC8
Aug 5 22:14:19.206082 sshd[5288]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0)
Aug 5 22:14:19.217204 systemd-logind[1423]: New session 18 of user core.
Aug 5 22:14:19.222579 systemd[1]: Started session-18.scope - Session 18 of User core.
Aug 5 22:14:20.054826 sshd[5288]: pam_unix(sshd:session): session closed for user core
Aug 5 22:14:20.059873 systemd[1]: sshd@15-172.24.4.33:22-172.24.4.1:34032.service: Deactivated successfully.
Aug 5 22:14:20.062062 systemd[1]: session-18.scope: Deactivated successfully.
Aug 5 22:14:20.063173 systemd-logind[1423]: Session 18 logged out. Waiting for processes to exit.
Aug 5 22:14:20.064698 systemd-logind[1423]: Removed session 18.
Aug 5 22:14:25.085211 systemd[1]: Started sshd@16-172.24.4.33:22-172.24.4.1:57616.service - OpenSSH per-connection server daemon (172.24.4.1:57616).
Aug 5 22:14:26.340852 sshd[5306]: Accepted publickey for core from 172.24.4.1 port 57616 ssh2: RSA SHA256:wJOsecOnS68Cf9bfpO6HbyavHDudnwLnl2CjsDHwoC8
Aug 5 22:14:26.343782 sshd[5306]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0)
Aug 5 22:14:26.350559 systemd-logind[1423]: New session 19 of user core.
Aug 5 22:14:26.359502 systemd[1]: Started session-19.scope - Session 19 of User core.
Aug 5 22:14:27.428373 sshd[5306]: pam_unix(sshd:session): session closed for user core
Aug 5 22:14:27.436469 systemd[1]: sshd@16-172.24.4.33:22-172.24.4.1:57616.service: Deactivated successfully.
Aug 5 22:14:27.440153 systemd[1]: session-19.scope: Deactivated successfully.
Aug 5 22:14:27.442731 systemd-logind[1423]: Session 19 logged out. Waiting for processes to exit.
Aug 5 22:14:27.452942 systemd[1]: Started sshd@17-172.24.4.33:22-172.24.4.1:57630.service - OpenSSH per-connection server daemon (172.24.4.1:57630).
Aug 5 22:14:27.459534 systemd-logind[1423]: Removed session 19.
Aug 5 22:14:29.137353 sshd[5319]: Accepted publickey for core from 172.24.4.1 port 57630 ssh2: RSA SHA256:wJOsecOnS68Cf9bfpO6HbyavHDudnwLnl2CjsDHwoC8
Aug 5 22:14:29.139585 sshd[5319]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0)
Aug 5 22:14:29.147511 systemd-logind[1423]: New session 20 of user core.
Aug 5 22:14:29.151689 systemd[1]: Started session-20.scope - Session 20 of User core.
Aug 5 22:14:30.630328 sshd[5319]: pam_unix(sshd:session): session closed for user core
Aug 5 22:14:30.648056 systemd[1]: Started sshd@18-172.24.4.33:22-172.24.4.1:57646.service - OpenSSH per-connection server daemon (172.24.4.1:57646).
Aug 5 22:14:30.651753 systemd[1]: sshd@17-172.24.4.33:22-172.24.4.1:57630.service: Deactivated successfully.
Aug 5 22:14:30.658784 systemd[1]: session-20.scope: Deactivated successfully.
Aug 5 22:14:30.664910 systemd-logind[1423]: Session 20 logged out. Waiting for processes to exit.
Aug 5 22:14:30.668086 systemd-logind[1423]: Removed session 20.
Aug 5 22:14:32.064718 sshd[5330]: Accepted publickey for core from 172.24.4.1 port 57646 ssh2: RSA SHA256:wJOsecOnS68Cf9bfpO6HbyavHDudnwLnl2CjsDHwoC8
Aug 5 22:14:32.070115 sshd[5330]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0)
Aug 5 22:14:32.081563 systemd-logind[1423]: New session 21 of user core.
Aug 5 22:14:32.089627 systemd[1]: Started session-21.scope - Session 21 of User core.
Aug 5 22:14:35.688629 sshd[5330]: pam_unix(sshd:session): session closed for user core
Aug 5 22:14:35.699357 systemd[1]: sshd@18-172.24.4.33:22-172.24.4.1:57646.service: Deactivated successfully.
Aug 5 22:14:35.703770 systemd[1]: session-21.scope: Deactivated successfully.
Aug 5 22:14:35.705265 systemd-logind[1423]: Session 21 logged out. Waiting for processes to exit.
Aug 5 22:14:35.718032 systemd[1]: Started sshd@19-172.24.4.33:22-172.24.4.1:36506.service - OpenSSH per-connection server daemon (172.24.4.1:36506).
Aug 5 22:14:35.720292 systemd-logind[1423]: Removed session 21.
Aug 5 22:14:37.055335 sshd[5375]: Accepted publickey for core from 172.24.4.1 port 36506 ssh2: RSA SHA256:wJOsecOnS68Cf9bfpO6HbyavHDudnwLnl2CjsDHwoC8
Aug 5 22:14:37.066346 sshd[5375]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0)
Aug 5 22:14:37.082513 systemd-logind[1423]: New session 22 of user core.
Aug 5 22:14:37.091627 systemd[1]: Started session-22.scope - Session 22 of User core.
Aug 5 22:14:39.241716 sshd[5375]: pam_unix(sshd:session): session closed for user core
Aug 5 22:14:39.258080 systemd[1]: sshd@19-172.24.4.33:22-172.24.4.1:36506.service: Deactivated successfully.
Aug 5 22:14:39.263591 systemd[1]: session-22.scope: Deactivated successfully.
Aug 5 22:14:39.267371 systemd-logind[1423]: Session 22 logged out. Waiting for processes to exit.
Aug 5 22:14:39.275883 systemd[1]: Started sshd@20-172.24.4.33:22-172.24.4.1:36522.service - OpenSSH per-connection server daemon (172.24.4.1:36522).
Aug 5 22:14:39.281436 systemd-logind[1423]: Removed session 22.
Aug 5 22:14:40.997876 sshd[5411]: Accepted publickey for core from 172.24.4.1 port 36522 ssh2: RSA SHA256:wJOsecOnS68Cf9bfpO6HbyavHDudnwLnl2CjsDHwoC8
Aug 5 22:14:41.002451 sshd[5411]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0)
Aug 5 22:14:41.014974 systemd-logind[1423]: New session 23 of user core.
Aug 5 22:14:41.025614 systemd[1]: Started session-23.scope - Session 23 of User core.
Aug 5 22:14:41.919556 sshd[5411]: pam_unix(sshd:session): session closed for user core
Aug 5 22:14:41.928278 systemd[1]: sshd@20-172.24.4.33:22-172.24.4.1:36522.service: Deactivated successfully.
Aug 5 22:14:41.935121 systemd[1]: session-23.scope: Deactivated successfully.
Aug 5 22:14:41.937845 systemd-logind[1423]: Session 23 logged out. Waiting for processes to exit.
Aug 5 22:14:41.941331 systemd-logind[1423]: Removed session 23.
Aug 5 22:14:43.675863 kubelet[2635]: I0805 22:14:43.675468 2635 topology_manager.go:215] "Topology Admit Handler" podUID="9cc295e3-10b1-4891-96c8-d1040d2b3032" podNamespace="calico-apiserver" podName="calico-apiserver-6849b66ff4-6xb9k"
Aug 5 22:14:43.751013 systemd[1]: Created slice kubepods-besteffort-pod9cc295e3_10b1_4891_96c8_d1040d2b3032.slice - libcontainer container kubepods-besteffort-pod9cc295e3_10b1_4891_96c8_d1040d2b3032.slice.
Aug 5 22:14:43.845452 kubelet[2635]: I0805 22:14:43.845156 2635 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/9cc295e3-10b1-4891-96c8-d1040d2b3032-calico-apiserver-certs\") pod \"calico-apiserver-6849b66ff4-6xb9k\" (UID: \"9cc295e3-10b1-4891-96c8-d1040d2b3032\") " pod="calico-apiserver/calico-apiserver-6849b66ff4-6xb9k"
Aug 5 22:14:43.848660 kubelet[2635]: I0805 22:14:43.848558 2635 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fqxcn\" (UniqueName: \"kubernetes.io/projected/9cc295e3-10b1-4891-96c8-d1040d2b3032-kube-api-access-fqxcn\") pod \"calico-apiserver-6849b66ff4-6xb9k\" (UID: \"9cc295e3-10b1-4891-96c8-d1040d2b3032\") " pod="calico-apiserver/calico-apiserver-6849b66ff4-6xb9k"
Aug 5 22:14:43.961332 kubelet[2635]: E0805 22:14:43.958947 2635 secret.go:194] Couldn't get secret calico-apiserver/calico-apiserver-certs: secret "calico-apiserver-certs" not found
Aug 5 22:14:43.975445 kubelet[2635]: E0805 22:14:43.975395 2635 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9cc295e3-10b1-4891-96c8-d1040d2b3032-calico-apiserver-certs podName:9cc295e3-10b1-4891-96c8-d1040d2b3032 nodeName:}" failed. No retries permitted until 2024-08-05 22:14:44.459031766 +0000 UTC m=+136.164252164 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "calico-apiserver-certs" (UniqueName: "kubernetes.io/secret/9cc295e3-10b1-4891-96c8-d1040d2b3032-calico-apiserver-certs") pod "calico-apiserver-6849b66ff4-6xb9k" (UID: "9cc295e3-10b1-4891-96c8-d1040d2b3032") : secret "calico-apiserver-certs" not found
Aug 5 22:14:44.656720 containerd[1444]: time="2024-08-05T22:14:44.656546301Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6849b66ff4-6xb9k,Uid:9cc295e3-10b1-4891-96c8-d1040d2b3032,Namespace:calico-apiserver,Attempt:0,}"
Aug 5 22:14:44.917002 systemd-networkd[1359]: cali10e79282f2d: Link UP
Aug 5 22:14:44.917766 systemd-networkd[1359]: cali10e79282f2d: Gained carrier
Aug 5 22:14:44.937618 containerd[1444]: 2024-08-05 22:14:44.806 [INFO][5439] plugin.go 326: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--3975--2--0--1--de7b5ef465.novalocal-k8s-calico--apiserver--6849b66ff4--6xb9k-eth0 calico-apiserver-6849b66ff4- calico-apiserver 9cc295e3-10b1-4891-96c8-d1040d2b3032 1265 0 2024-08-05 22:14:43 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:6849b66ff4 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-3975-2-0-1-de7b5ef465.novalocal calico-apiserver-6849b66ff4-6xb9k eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali10e79282f2d [] []}} ContainerID="4322f8aed5d2fa1283ea7a0c5aad0a912e604f163e2b577c2cf105b7e6e21582" Namespace="calico-apiserver" Pod="calico-apiserver-6849b66ff4-6xb9k" WorkloadEndpoint="ci--3975--2--0--1--de7b5ef465.novalocal-k8s-calico--apiserver--6849b66ff4--6xb9k-"
Aug 5 22:14:44.937618 containerd[1444]: 2024-08-05 22:14:44.807 [INFO][5439] k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="4322f8aed5d2fa1283ea7a0c5aad0a912e604f163e2b577c2cf105b7e6e21582" Namespace="calico-apiserver" Pod="calico-apiserver-6849b66ff4-6xb9k" WorkloadEndpoint="ci--3975--2--0--1--de7b5ef465.novalocal-k8s-calico--apiserver--6849b66ff4--6xb9k-eth0"
Aug 5 22:14:44.937618 containerd[1444]: 2024-08-05 22:14:44.853 [INFO][5453] ipam_plugin.go 224: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="4322f8aed5d2fa1283ea7a0c5aad0a912e604f163e2b577c2cf105b7e6e21582" HandleID="k8s-pod-network.4322f8aed5d2fa1283ea7a0c5aad0a912e604f163e2b577c2cf105b7e6e21582" Workload="ci--3975--2--0--1--de7b5ef465.novalocal-k8s-calico--apiserver--6849b66ff4--6xb9k-eth0"
Aug 5 22:14:44.937618 containerd[1444]: 2024-08-05 22:14:44.864 [INFO][5453] ipam_plugin.go 264: Auto assigning IP ContainerID="4322f8aed5d2fa1283ea7a0c5aad0a912e604f163e2b577c2cf105b7e6e21582" HandleID="k8s-pod-network.4322f8aed5d2fa1283ea7a0c5aad0a912e604f163e2b577c2cf105b7e6e21582" Workload="ci--3975--2--0--1--de7b5ef465.novalocal-k8s-calico--apiserver--6849b66ff4--6xb9k-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000378ab0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-3975-2-0-1-de7b5ef465.novalocal", "pod":"calico-apiserver-6849b66ff4-6xb9k", "timestamp":"2024-08-05 22:14:44.853104614 +0000 UTC"}, Hostname:"ci-3975-2-0-1-de7b5ef465.novalocal", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"}
Aug 5 22:14:44.937618 containerd[1444]: 2024-08-05 22:14:44.865 [INFO][5453] ipam_plugin.go 352: About to acquire host-wide IPAM lock.
Aug 5 22:14:44.937618 containerd[1444]: 2024-08-05 22:14:44.865 [INFO][5453] ipam_plugin.go 367: Acquired host-wide IPAM lock.
Aug 5 22:14:44.937618 containerd[1444]: 2024-08-05 22:14:44.865 [INFO][5453] ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-3975-2-0-1-de7b5ef465.novalocal'
Aug 5 22:14:44.937618 containerd[1444]: 2024-08-05 22:14:44.872 [INFO][5453] ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.4322f8aed5d2fa1283ea7a0c5aad0a912e604f163e2b577c2cf105b7e6e21582" host="ci-3975-2-0-1-de7b5ef465.novalocal"
Aug 5 22:14:44.937618 containerd[1444]: 2024-08-05 22:14:44.880 [INFO][5453] ipam.go 372: Looking up existing affinities for host host="ci-3975-2-0-1-de7b5ef465.novalocal"
Aug 5 22:14:44.937618 containerd[1444]: 2024-08-05 22:14:44.886 [INFO][5453] ipam.go 489: Trying affinity for 192.168.92.128/26 host="ci-3975-2-0-1-de7b5ef465.novalocal"
Aug 5 22:14:44.937618 containerd[1444]: 2024-08-05 22:14:44.890 [INFO][5453] ipam.go 155: Attempting to load block cidr=192.168.92.128/26 host="ci-3975-2-0-1-de7b5ef465.novalocal"
Aug 5 22:14:44.937618 containerd[1444]: 2024-08-05 22:14:44.893 [INFO][5453] ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.92.128/26 host="ci-3975-2-0-1-de7b5ef465.novalocal"
Aug 5 22:14:44.937618 containerd[1444]: 2024-08-05 22:14:44.893 [INFO][5453] ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.92.128/26 handle="k8s-pod-network.4322f8aed5d2fa1283ea7a0c5aad0a912e604f163e2b577c2cf105b7e6e21582" host="ci-3975-2-0-1-de7b5ef465.novalocal"
Aug 5 22:14:44.937618 containerd[1444]: 2024-08-05 22:14:44.896 [INFO][5453] ipam.go 1685: Creating new handle: k8s-pod-network.4322f8aed5d2fa1283ea7a0c5aad0a912e604f163e2b577c2cf105b7e6e21582
Aug 5 22:14:44.937618 containerd[1444]: 2024-08-05 22:14:44.900 [INFO][5453] ipam.go 1203: Writing block in order to claim IPs block=192.168.92.128/26 handle="k8s-pod-network.4322f8aed5d2fa1283ea7a0c5aad0a912e604f163e2b577c2cf105b7e6e21582" host="ci-3975-2-0-1-de7b5ef465.novalocal"
Aug 5 22:14:44.937618 containerd[1444]: 2024-08-05 22:14:44.907 [INFO][5453] ipam.go 1216: Successfully claimed IPs: [192.168.92.133/26] block=192.168.92.128/26 handle="k8s-pod-network.4322f8aed5d2fa1283ea7a0c5aad0a912e604f163e2b577c2cf105b7e6e21582" host="ci-3975-2-0-1-de7b5ef465.novalocal"
Aug 5 22:14:44.937618 containerd[1444]: 2024-08-05 22:14:44.907 [INFO][5453] ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.92.133/26] handle="k8s-pod-network.4322f8aed5d2fa1283ea7a0c5aad0a912e604f163e2b577c2cf105b7e6e21582" host="ci-3975-2-0-1-de7b5ef465.novalocal"
Aug 5 22:14:44.937618 containerd[1444]: 2024-08-05 22:14:44.907 [INFO][5453] ipam_plugin.go 373: Released host-wide IPAM lock.
Aug 5 22:14:44.937618 containerd[1444]: 2024-08-05 22:14:44.907 [INFO][5453] ipam_plugin.go 282: Calico CNI IPAM assigned addresses IPv4=[192.168.92.133/26] IPv6=[] ContainerID="4322f8aed5d2fa1283ea7a0c5aad0a912e604f163e2b577c2cf105b7e6e21582" HandleID="k8s-pod-network.4322f8aed5d2fa1283ea7a0c5aad0a912e604f163e2b577c2cf105b7e6e21582" Workload="ci--3975--2--0--1--de7b5ef465.novalocal-k8s-calico--apiserver--6849b66ff4--6xb9k-eth0"
Aug 5 22:14:44.946663 containerd[1444]: 2024-08-05 22:14:44.910 [INFO][5439] k8s.go 386: Populated endpoint ContainerID="4322f8aed5d2fa1283ea7a0c5aad0a912e604f163e2b577c2cf105b7e6e21582" Namespace="calico-apiserver" Pod="calico-apiserver-6849b66ff4-6xb9k" WorkloadEndpoint="ci--3975--2--0--1--de7b5ef465.novalocal-k8s-calico--apiserver--6849b66ff4--6xb9k-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--3975--2--0--1--de7b5ef465.novalocal-k8s-calico--apiserver--6849b66ff4--6xb9k-eth0", GenerateName:"calico-apiserver-6849b66ff4-", Namespace:"calico-apiserver", SelfLink:"", UID:"9cc295e3-10b1-4891-96c8-d1040d2b3032", ResourceVersion:"1265", Generation:0, CreationTimestamp:time.Date(2024, time.August, 5, 22, 14, 43, 0, time.Local), DeletionTimestamp:<nil>, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6849b66ff4", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-3975-2-0-1-de7b5ef465.novalocal", ContainerID:"", Pod:"calico-apiserver-6849b66ff4-6xb9k", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.92.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali10e79282f2d", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}}
Aug 5 22:14:44.946663 containerd[1444]: 2024-08-05 22:14:44.910 [INFO][5439] k8s.go 387: Calico CNI using IPs: [192.168.92.133/32] ContainerID="4322f8aed5d2fa1283ea7a0c5aad0a912e604f163e2b577c2cf105b7e6e21582" Namespace="calico-apiserver" Pod="calico-apiserver-6849b66ff4-6xb9k" WorkloadEndpoint="ci--3975--2--0--1--de7b5ef465.novalocal-k8s-calico--apiserver--6849b66ff4--6xb9k-eth0"
Aug 5 22:14:44.946663 containerd[1444]: 2024-08-05 22:14:44.910 [INFO][5439] dataplane_linux.go 68: Setting the host side veth name to cali10e79282f2d ContainerID="4322f8aed5d2fa1283ea7a0c5aad0a912e604f163e2b577c2cf105b7e6e21582" Namespace="calico-apiserver" Pod="calico-apiserver-6849b66ff4-6xb9k" WorkloadEndpoint="ci--3975--2--0--1--de7b5ef465.novalocal-k8s-calico--apiserver--6849b66ff4--6xb9k-eth0"
Aug 5 22:14:44.946663 containerd[1444]: 2024-08-05 22:14:44.918 [INFO][5439] dataplane_linux.go 479: Disabling IPv4 forwarding ContainerID="4322f8aed5d2fa1283ea7a0c5aad0a912e604f163e2b577c2cf105b7e6e21582" Namespace="calico-apiserver" Pod="calico-apiserver-6849b66ff4-6xb9k" WorkloadEndpoint="ci--3975--2--0--1--de7b5ef465.novalocal-k8s-calico--apiserver--6849b66ff4--6xb9k-eth0"
Aug 5 22:14:44.946663 containerd[1444]: 2024-08-05 22:14:44.920 [INFO][5439] k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="4322f8aed5d2fa1283ea7a0c5aad0a912e604f163e2b577c2cf105b7e6e21582" Namespace="calico-apiserver" Pod="calico-apiserver-6849b66ff4-6xb9k" WorkloadEndpoint="ci--3975--2--0--1--de7b5ef465.novalocal-k8s-calico--apiserver--6849b66ff4--6xb9k-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--3975--2--0--1--de7b5ef465.novalocal-k8s-calico--apiserver--6849b66ff4--6xb9k-eth0", GenerateName:"calico-apiserver-6849b66ff4-", Namespace:"calico-apiserver", SelfLink:"", UID:"9cc295e3-10b1-4891-96c8-d1040d2b3032", ResourceVersion:"1265", Generation:0, CreationTimestamp:time.Date(2024, time.August, 5, 22, 14, 43, 0, time.Local), DeletionTimestamp:<nil>, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6849b66ff4", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-3975-2-0-1-de7b5ef465.novalocal", ContainerID:"4322f8aed5d2fa1283ea7a0c5aad0a912e604f163e2b577c2cf105b7e6e21582", Pod:"calico-apiserver-6849b66ff4-6xb9k", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.92.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali10e79282f2d", MAC:"56:ac:5a:81:58:45", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}}
Aug 5 22:14:44.946663 containerd[1444]: 2024-08-05 22:14:44.932 [INFO][5439] k8s.go 500: Wrote updated endpoint to datastore ContainerID="4322f8aed5d2fa1283ea7a0c5aad0a912e604f163e2b577c2cf105b7e6e21582" Namespace="calico-apiserver" Pod="calico-apiserver-6849b66ff4-6xb9k" WorkloadEndpoint="ci--3975--2--0--1--de7b5ef465.novalocal-k8s-calico--apiserver--6849b66ff4--6xb9k-eth0"
Aug 5 22:14:45.054662 systemd[1]: run-containerd-runc-k8s.io-81801b50e438d74a07c8b62aedd1cf55c8a0fde7a170d8621b6412d43c922fcd-runc.fHOxxV.mount: Deactivated successfully.
Aug 5 22:14:45.276501 containerd[1444]: time="2024-08-05T22:14:45.275895058Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Aug 5 22:14:45.276501 containerd[1444]: time="2024-08-05T22:14:45.275974237Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Aug 5 22:14:45.276501 containerd[1444]: time="2024-08-05T22:14:45.276004473Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Aug 5 22:14:45.276501 containerd[1444]: time="2024-08-05T22:14:45.276024331Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Aug 5 22:14:45.330923 systemd[1]: run-containerd-runc-k8s.io-4322f8aed5d2fa1283ea7a0c5aad0a912e604f163e2b577c2cf105b7e6e21582-runc.S1LDfe.mount: Deactivated successfully.
Aug 5 22:14:45.348774 systemd[1]: Started cri-containerd-4322f8aed5d2fa1283ea7a0c5aad0a912e604f163e2b577c2cf105b7e6e21582.scope - libcontainer container 4322f8aed5d2fa1283ea7a0c5aad0a912e604f163e2b577c2cf105b7e6e21582. Aug 5 22:14:45.400565 containerd[1444]: time="2024-08-05T22:14:45.400516139Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6849b66ff4-6xb9k,Uid:9cc295e3-10b1-4891-96c8-d1040d2b3032,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"4322f8aed5d2fa1283ea7a0c5aad0a912e604f163e2b577c2cf105b7e6e21582\"" Aug 5 22:14:45.430301 containerd[1444]: time="2024-08-05T22:14:45.429728383Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.28.0\"" Aug 5 22:14:46.859860 systemd-networkd[1359]: cali10e79282f2d: Gained IPv6LL Aug 5 22:14:46.955155 systemd[1]: Started sshd@21-172.24.4.33:22-172.24.4.1:54184.service - OpenSSH per-connection server daemon (172.24.4.1:54184). Aug 5 22:14:48.644631 sshd[5541]: Accepted publickey for core from 172.24.4.1 port 54184 ssh2: RSA SHA256:wJOsecOnS68Cf9bfpO6HbyavHDudnwLnl2CjsDHwoC8 Aug 5 22:14:48.652547 sshd[5541]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Aug 5 22:14:48.664495 systemd-logind[1423]: New session 24 of user core. Aug 5 22:14:48.671396 systemd[1]: Started session-24.scope - Session 24 of User core. 
Aug 5 22:14:49.840506 containerd[1444]: time="2024-08-05T22:14:49.839159555Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.28.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 5 22:14:49.844017 containerd[1444]: time="2024-08-05T22:14:49.843583667Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.28.0: active requests=0, bytes read=40421260" Aug 5 22:14:49.850286 containerd[1444]: time="2024-08-05T22:14:49.846645622Z" level=info msg="ImageCreate event name:\"sha256:6c07591fd1cfafb48d575f75a6b9d8d3cc03bead5b684908ef5e7dd3132794d6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 5 22:14:49.859017 containerd[1444]: time="2024-08-05T22:14:49.858920658Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:e8f124312a4c41451e51bfc00b6e98929e9eb0510905f3301542719a3e8d2fec\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 5 22:14:49.863392 containerd[1444]: time="2024-08-05T22:14:49.863304292Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.28.0\" with image id \"sha256:6c07591fd1cfafb48d575f75a6b9d8d3cc03bead5b684908ef5e7dd3132794d6\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.28.0\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:e8f124312a4c41451e51bfc00b6e98929e9eb0510905f3301542719a3e8d2fec\", size \"41869036\" in 4.432443136s" Aug 5 22:14:49.863746 containerd[1444]: time="2024-08-05T22:14:49.863601964Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.28.0\" returns image reference \"sha256:6c07591fd1cfafb48d575f75a6b9d8d3cc03bead5b684908ef5e7dd3132794d6\"" Aug 5 22:14:49.873331 containerd[1444]: time="2024-08-05T22:14:49.873202302Z" level=info msg="CreateContainer within sandbox \"4322f8aed5d2fa1283ea7a0c5aad0a912e604f163e2b577c2cf105b7e6e21582\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Aug 5 22:14:49.915956 containerd[1444]: 
time="2024-08-05T22:14:49.915907670Z" level=info msg="CreateContainer within sandbox \"4322f8aed5d2fa1283ea7a0c5aad0a912e604f163e2b577c2cf105b7e6e21582\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"d4dec8d5e6a31d3516fef243b6b503d1eec2660d2666d506072a1848a470c4c3\"" Aug 5 22:14:49.919020 containerd[1444]: time="2024-08-05T22:14:49.918835705Z" level=info msg="StartContainer for \"d4dec8d5e6a31d3516fef243b6b503d1eec2660d2666d506072a1848a470c4c3\"" Aug 5 22:14:49.992678 systemd[1]: Started cri-containerd-d4dec8d5e6a31d3516fef243b6b503d1eec2660d2666d506072a1848a470c4c3.scope - libcontainer container d4dec8d5e6a31d3516fef243b6b503d1eec2660d2666d506072a1848a470c4c3. Aug 5 22:14:50.115376 containerd[1444]: time="2024-08-05T22:14:50.115270116Z" level=info msg="StartContainer for \"d4dec8d5e6a31d3516fef243b6b503d1eec2660d2666d506072a1848a470c4c3\" returns successfully" Aug 5 22:14:50.903129 sshd[5541]: pam_unix(sshd:session): session closed for user core Aug 5 22:14:50.910652 systemd-logind[1423]: Session 24 logged out. Waiting for processes to exit. Aug 5 22:14:50.912507 systemd[1]: sshd@21-172.24.4.33:22-172.24.4.1:54184.service: Deactivated successfully. Aug 5 22:14:50.917937 systemd[1]: session-24.scope: Deactivated successfully. Aug 5 22:14:50.921002 systemd-logind[1423]: Removed session 24. 
Aug 5 22:14:52.188244 kubelet[2635]: I0805 22:14:52.187082 2635 pod_startup_latency_tracker.go:102] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-6849b66ff4-6xb9k" podStartSLOduration=4.740676175 podStartE2EDuration="9.176469219s" podCreationTimestamp="2024-08-05 22:14:43 +0000 UTC" firstStartedPulling="2024-08-05 22:14:45.429257798 +0000 UTC m=+137.134478186" lastFinishedPulling="2024-08-05 22:14:49.865050792 +0000 UTC m=+141.570271230" observedRunningTime="2024-08-05 22:14:50.905146724 +0000 UTC m=+142.610367122" watchObservedRunningTime="2024-08-05 22:14:52.176469219 +0000 UTC m=+143.881689607" Aug 5 22:14:55.921955 systemd[1]: Started sshd@22-172.24.4.33:22-172.24.4.1:38948.service - OpenSSH per-connection server daemon (172.24.4.1:38948). Aug 5 22:14:57.682374 sshd[5629]: Accepted publickey for core from 172.24.4.1 port 38948 ssh2: RSA SHA256:wJOsecOnS68Cf9bfpO6HbyavHDudnwLnl2CjsDHwoC8 Aug 5 22:14:57.689002 sshd[5629]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Aug 5 22:14:57.698348 systemd-logind[1423]: New session 25 of user core. Aug 5 22:14:57.703389 systemd[1]: Started session-25.scope - Session 25 of User core. Aug 5 22:14:58.641284 sshd[5629]: pam_unix(sshd:session): session closed for user core Aug 5 22:14:58.649831 systemd[1]: sshd@22-172.24.4.33:22-172.24.4.1:38948.service: Deactivated successfully. Aug 5 22:14:58.653238 systemd[1]: session-25.scope: Deactivated successfully. Aug 5 22:14:58.655710 systemd-logind[1423]: Session 25 logged out. Waiting for processes to exit. Aug 5 22:14:58.657326 systemd-logind[1423]: Removed session 25. Aug 5 22:15:03.665909 systemd[1]: Started sshd@23-172.24.4.33:22-172.24.4.1:38950.service - OpenSSH per-connection server daemon (172.24.4.1:38950). 
Aug 5 22:15:05.026433 sshd[5647]: Accepted publickey for core from 172.24.4.1 port 38950 ssh2: RSA SHA256:wJOsecOnS68Cf9bfpO6HbyavHDudnwLnl2CjsDHwoC8 Aug 5 22:15:05.026182 sshd[5647]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Aug 5 22:15:05.035305 systemd-logind[1423]: New session 26 of user core. Aug 5 22:15:05.041537 systemd[1]: Started session-26.scope - Session 26 of User core. Aug 5 22:15:05.831458 sshd[5647]: pam_unix(sshd:session): session closed for user core Aug 5 22:15:05.837407 systemd[1]: sshd@23-172.24.4.33:22-172.24.4.1:38950.service: Deactivated successfully. Aug 5 22:15:05.841629 systemd[1]: session-26.scope: Deactivated successfully. Aug 5 22:15:05.843044 systemd-logind[1423]: Session 26 logged out. Waiting for processes to exit. Aug 5 22:15:05.844356 systemd-logind[1423]: Removed session 26. Aug 5 22:15:07.093012 systemd[1]: run-containerd-runc-k8s.io-931d26c86b35eaea57a3b41ac3248ed922aec7b95b2a1c7face54f7941b66bbb-runc.Gz8zUz.mount: Deactivated successfully. Aug 5 22:15:10.857912 systemd[1]: Started sshd@24-172.24.4.33:22-172.24.4.1:58550.service - OpenSSH per-connection server daemon (172.24.4.1:58550). Aug 5 22:15:12.237037 sshd[5682]: Accepted publickey for core from 172.24.4.1 port 58550 ssh2: RSA SHA256:wJOsecOnS68Cf9bfpO6HbyavHDudnwLnl2CjsDHwoC8 Aug 5 22:15:12.240703 sshd[5682]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Aug 5 22:15:12.254965 systemd-logind[1423]: New session 27 of user core. Aug 5 22:15:12.260563 systemd[1]: Started session-27.scope - Session 27 of User core. Aug 5 22:15:13.169496 sshd[5682]: pam_unix(sshd:session): session closed for user core Aug 5 22:15:13.175400 systemd[1]: sshd@24-172.24.4.33:22-172.24.4.1:58550.service: Deactivated successfully. Aug 5 22:15:13.180801 systemd[1]: session-27.scope: Deactivated successfully. Aug 5 22:15:13.185273 systemd-logind[1423]: Session 27 logged out. Waiting for processes to exit. 
Aug 5 22:15:13.190937 systemd-logind[1423]: Removed session 27. Aug 5 22:15:15.090346 systemd[1]: run-containerd-runc-k8s.io-81801b50e438d74a07c8b62aedd1cf55c8a0fde7a170d8621b6412d43c922fcd-runc.ahrY9X.mount: Deactivated successfully.