Mar 13 00:33:18.896117 kernel: Linux version 6.12.74-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 14.3.0 p8) 14.3.0, GNU ld (Gentoo 2.44 p4) 2.44.0) #1 SMP PREEMPT_DYNAMIC Thu Mar 12 22:08:29 -00 2026
Mar 13 00:33:18.896138 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=hetzner verity.usrhash=a2116dc4421f78fe124deb19b9ad6d70a0cb4fc0b3349854f4ce4e2904d4925d
Mar 13 00:33:18.896145 kernel: BIOS-provided physical RAM map:
Mar 13 00:33:18.896150 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009ffff] usable
Mar 13 00:33:18.896157 kernel: BIOS-e820: [mem 0x0000000000100000-0x000000007ed3efff] usable
Mar 13 00:33:18.896162 kernel: BIOS-e820: [mem 0x000000007ed3f000-0x000000007edfffff] reserved
Mar 13 00:33:18.896167 kernel: BIOS-e820: [mem 0x000000007ee00000-0x000000007f8ecfff] usable
Mar 13 00:33:18.896172 kernel: BIOS-e820: [mem 0x000000007f8ed000-0x000000007fb6cfff] reserved
Mar 13 00:33:18.896177 kernel: BIOS-e820: [mem 0x000000007fb6d000-0x000000007fb7efff] ACPI data
Mar 13 00:33:18.896181 kernel: BIOS-e820: [mem 0x000000007fb7f000-0x000000007fbfefff] ACPI NVS
Mar 13 00:33:18.896186 kernel: BIOS-e820: [mem 0x000000007fbff000-0x000000007ff7bfff] usable
Mar 13 00:33:18.896191 kernel: BIOS-e820: [mem 0x000000007ff7c000-0x000000007fffffff] reserved
Mar 13 00:33:18.896195 kernel: BIOS-e820: [mem 0x00000000e0000000-0x00000000efffffff] reserved
Mar 13 00:33:18.896202 kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Mar 13 00:33:18.896208 kernel: BIOS-e820: [mem 0x00000000ffc00000-0x00000000ffffffff] reserved
Mar 13 00:33:18.896213 kernel: BIOS-e820: [mem 0x0000000100000000-0x0000000179ffffff] usable
Mar 13 00:33:18.896218 kernel: BIOS-e820: [mem 0x000000fd00000000-0x000000ffffffffff] reserved
Mar 13 00:33:18.896222 kernel: NX (Execute Disable) protection: active
Mar 13 00:33:18.896230 kernel: APIC: Static calls initialized
Mar 13 00:33:18.896234 kernel: e820: update [mem 0x7dfae018-0x7dfb7a57] usable ==> usable
Mar 13 00:33:18.896239 kernel: e820: update [mem 0x7df72018-0x7dfad657] usable ==> usable
Mar 13 00:33:18.896244 kernel: e820: update [mem 0x7df36018-0x7df71657] usable ==> usable
Mar 13 00:33:18.896249 kernel: extended physical RAM map:
Mar 13 00:33:18.896254 kernel: reserve setup_data: [mem 0x0000000000000000-0x000000000009ffff] usable
Mar 13 00:33:18.896259 kernel: reserve setup_data: [mem 0x0000000000100000-0x000000007df36017] usable
Mar 13 00:33:18.896264 kernel: reserve setup_data: [mem 0x000000007df36018-0x000000007df71657] usable
Mar 13 00:33:18.896325 kernel: reserve setup_data: [mem 0x000000007df71658-0x000000007df72017] usable
Mar 13 00:33:18.896330 kernel: reserve setup_data: [mem 0x000000007df72018-0x000000007dfad657] usable
Mar 13 00:33:18.896335 kernel: reserve setup_data: [mem 0x000000007dfad658-0x000000007dfae017] usable
Mar 13 00:33:18.896342 kernel: reserve setup_data: [mem 0x000000007dfae018-0x000000007dfb7a57] usable
Mar 13 00:33:18.896347 kernel: reserve setup_data: [mem 0x000000007dfb7a58-0x000000007ed3efff] usable
Mar 13 00:33:18.896352 kernel: reserve setup_data: [mem 0x000000007ed3f000-0x000000007edfffff] reserved
Mar 13 00:33:18.896357 kernel: reserve setup_data: [mem 0x000000007ee00000-0x000000007f8ecfff] usable
Mar 13 00:33:18.896361 kernel: reserve setup_data: [mem 0x000000007f8ed000-0x000000007fb6cfff] reserved
Mar 13 00:33:18.896366 kernel: reserve setup_data: [mem 0x000000007fb6d000-0x000000007fb7efff] ACPI data
Mar 13 00:33:18.896371 kernel: reserve setup_data: [mem 0x000000007fb7f000-0x000000007fbfefff] ACPI NVS
Mar 13 00:33:18.896376 kernel: reserve setup_data: [mem 0x000000007fbff000-0x000000007ff7bfff] usable
Mar 13 00:33:18.896381 kernel: reserve setup_data: [mem 0x000000007ff7c000-0x000000007fffffff] reserved
Mar 13 00:33:18.896386 kernel: reserve setup_data: [mem 0x00000000e0000000-0x00000000efffffff] reserved
Mar 13 00:33:18.896391 kernel: reserve setup_data: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Mar 13 00:33:18.896401 kernel: reserve setup_data: [mem 0x00000000ffc00000-0x00000000ffffffff] reserved
Mar 13 00:33:18.896406 kernel: reserve setup_data: [mem 0x0000000100000000-0x0000000179ffffff] usable
Mar 13 00:33:18.896411 kernel: reserve setup_data: [mem 0x000000fd00000000-0x000000ffffffffff] reserved
Mar 13 00:33:18.896416 kernel: efi: EFI v2.7 by Ubuntu distribution of EDK II
Mar 13 00:33:18.896421 kernel: efi: SMBIOS=0x7f988000 SMBIOS 3.0=0x7f986000 ACPI=0x7fb7e000 ACPI 2.0=0x7fb7e014 MEMATTR=0x7e845198 RNG=0x7fb73018
Mar 13 00:33:18.896429 kernel: random: crng init done
Mar 13 00:33:18.896434 kernel: efi: Remove mem136: MMIO range=[0xffc00000-0xffffffff] (4MB) from e820 map
Mar 13 00:33:18.896439 kernel: e820: remove [mem 0xffc00000-0xffffffff] reserved
Mar 13 00:33:18.896444 kernel: secureboot: Secure boot disabled
Mar 13 00:33:18.896449 kernel: SMBIOS 3.0.0 present.
Mar 13 00:33:18.896454 kernel: DMI: Hetzner vServer/Standard PC (Q35 + ICH9, 2009), BIOS 20171111 11/11/2017
Mar 13 00:33:18.896459 kernel: DMI: Memory slots populated: 1/1
Mar 13 00:33:18.896464 kernel: Hypervisor detected: KVM
Mar 13 00:33:18.896469 kernel: last_pfn = 0x7ff7c max_arch_pfn = 0x10000000000
Mar 13 00:33:18.896474 kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Mar 13 00:33:18.896479 kernel: kvm-clock: using sched offset of 13686828681 cycles
Mar 13 00:33:18.896487 kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Mar 13 00:33:18.896492 kernel: tsc: Detected 2399.998 MHz processor
Mar 13 00:33:18.896497 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Mar 13 00:33:18.896503 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Mar 13 00:33:18.896508 kernel: last_pfn = 0x17a000 max_arch_pfn = 0x10000000000
Mar 13 00:33:18.896513 kernel: MTRR map: 4 entries (2 fixed + 2 variable; max 18), built from 8 variable MTRRs
Mar 13 00:33:18.896518 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT
Mar 13 00:33:18.896523 kernel: last_pfn = 0x7ff7c max_arch_pfn = 0x10000000000
Mar 13 00:33:18.896528 kernel: Using GB pages for direct mapping
Mar 13 00:33:18.896536 kernel: ACPI: Early table checksum verification disabled
Mar 13 00:33:18.896541 kernel: ACPI: RSDP 0x000000007FB7E014 000024 (v02 BOCHS )
Mar 13 00:33:18.896546 kernel: ACPI: XSDT 0x000000007FB7D0E8 000054 (v01 BOCHS BXPC 00000001 01000013)
Mar 13 00:33:18.896551 kernel: ACPI: FACP 0x000000007FB79000 0000F4 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Mar 13 00:33:18.896556 kernel: ACPI: DSDT 0x000000007FB7A000 002443 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Mar 13 00:33:18.896561 kernel: ACPI: FACS 0x000000007FBDD000 000040
Mar 13 00:33:18.896567 kernel: ACPI: APIC 0x000000007FB78000 000080 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Mar 13 00:33:18.896572 kernel: ACPI: HPET 0x000000007FB77000 000038 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Mar 13 00:33:18.896577 kernel: ACPI: MCFG 0x000000007FB76000 00003C (v01 BOCHS BXPC 00000001 BXPC 00000001)
Mar 13 00:33:18.896584 kernel: ACPI: WAET 0x000000007FB75000 000028 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Mar 13 00:33:18.896589 kernel: ACPI: BGRT 0x000000007FB74000 000038 (v01 INTEL EDK2 00000002 01000013)
Mar 13 00:33:18.896595 kernel: ACPI: Reserving FACP table memory at [mem 0x7fb79000-0x7fb790f3]
Mar 13 00:33:18.896600 kernel: ACPI: Reserving DSDT table memory at [mem 0x7fb7a000-0x7fb7c442]
Mar 13 00:33:18.896605 kernel: ACPI: Reserving FACS table memory at [mem 0x7fbdd000-0x7fbdd03f]
Mar 13 00:33:18.896610 kernel: ACPI: Reserving APIC table memory at [mem 0x7fb78000-0x7fb7807f]
Mar 13 00:33:18.896615 kernel: ACPI: Reserving HPET table memory at [mem 0x7fb77000-0x7fb77037]
Mar 13 00:33:18.896620 kernel: ACPI: Reserving MCFG table memory at [mem 0x7fb76000-0x7fb7603b]
Mar 13 00:33:18.896625 kernel: ACPI: Reserving WAET table memory at [mem 0x7fb75000-0x7fb75027]
Mar 13 00:33:18.896633 kernel: ACPI: Reserving BGRT table memory at [mem 0x7fb74000-0x7fb74037]
Mar 13 00:33:18.896638 kernel: No NUMA configuration found
Mar 13 00:33:18.896643 kernel: Faking a node at [mem 0x0000000000000000-0x0000000179ffffff]
Mar 13 00:33:18.896648 kernel: NODE_DATA(0) allocated [mem 0x179ff8dc0-0x179ffffff]
Mar 13 00:33:18.896654 kernel: Zone ranges:
Mar 13 00:33:18.896659 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff]
Mar 13 00:33:18.896664 kernel: DMA32 [mem 0x0000000001000000-0x00000000ffffffff]
Mar 13 00:33:18.896669 kernel: Normal [mem 0x0000000100000000-0x0000000179ffffff]
Mar 13 00:33:18.896674 kernel: Device empty
Mar 13 00:33:18.896681 kernel: Movable zone start for each node
Mar 13 00:33:18.896686 kernel: Early memory node ranges
Mar 13 00:33:18.896691 kernel: node 0: [mem 0x0000000000001000-0x000000000009ffff]
Mar 13 00:33:18.896697 kernel: node 0: [mem 0x0000000000100000-0x000000007ed3efff]
Mar 13 00:33:18.896702 kernel: node 0: [mem 0x000000007ee00000-0x000000007f8ecfff]
Mar 13 00:33:18.896707 kernel: node 0: [mem 0x000000007fbff000-0x000000007ff7bfff]
Mar 13 00:33:18.896712 kernel: node 0: [mem 0x0000000100000000-0x0000000179ffffff]
Mar 13 00:33:18.896717 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x0000000179ffffff]
Mar 13 00:33:18.896723 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Mar 13 00:33:18.896728 kernel: On node 0, zone DMA: 96 pages in unavailable ranges
Mar 13 00:33:18.896741 kernel: On node 0, zone DMA32: 193 pages in unavailable ranges
Mar 13 00:33:18.896756 kernel: On node 0, zone DMA32: 786 pages in unavailable ranges
Mar 13 00:33:18.896769 kernel: On node 0, zone Normal: 132 pages in unavailable ranges
Mar 13 00:33:18.896775 kernel: On node 0, zone Normal: 24576 pages in unavailable ranges
Mar 13 00:33:18.896780 kernel: ACPI: PM-Timer IO Port: 0x608
Mar 13 00:33:18.896786 kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Mar 13 00:33:18.896791 kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Mar 13 00:33:18.896796 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Mar 13 00:33:18.896801 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Mar 13 00:33:18.896814 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Mar 13 00:33:18.896819 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Mar 13 00:33:18.896824 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Mar 13 00:33:18.896830 kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Mar 13 00:33:18.896835 kernel: ACPI: HPET id: 0x8086a201 base: 0xfed00000
Mar 13 00:33:18.896840 kernel: CPU topo: Max. logical packages: 1
Mar 13 00:33:18.896845 kernel: CPU topo: Max. logical dies: 1
Mar 13 00:33:18.896859 kernel: CPU topo: Max. dies per package: 1
Mar 13 00:33:18.896864 kernel: CPU topo: Max. threads per core: 1
Mar 13 00:33:18.896870 kernel: CPU topo: Num. cores per package: 2
Mar 13 00:33:18.896875 kernel: CPU topo: Num. threads per package: 2
Mar 13 00:33:18.896880 kernel: CPU topo: Allowing 2 present CPUs plus 0 hotplug CPUs
Mar 13 00:33:18.896888 kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
Mar 13 00:33:18.896893 kernel: [mem 0x80000000-0xdfffffff] available for PCI devices
Mar 13 00:33:18.896899 kernel: Booting paravirtualized kernel on KVM
Mar 13 00:33:18.896904 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Mar 13 00:33:18.896910 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:2 nr_cpu_ids:2 nr_node_ids:1
Mar 13 00:33:18.896918 kernel: percpu: Embedded 60 pages/cpu s207832 r8192 d29736 u1048576
Mar 13 00:33:18.896923 kernel: pcpu-alloc: s207832 r8192 d29736 u1048576 alloc=1*2097152
Mar 13 00:33:18.896928 kernel: pcpu-alloc: [0] 0 1
Mar 13 00:33:18.896934 kernel: kvm-guest: PV spinlocks disabled, no host support
Mar 13 00:33:18.896940 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=hetzner verity.usrhash=a2116dc4421f78fe124deb19b9ad6d70a0cb4fc0b3349854f4ce4e2904d4925d
Mar 13 00:33:18.896945 kernel: Dentry cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Mar 13 00:33:18.896951 kernel: Inode-cache hash table entries: 262144 (order: 9, 2097152 bytes, linear)
Mar 13 00:33:18.896956 kernel: Fallback order for Node 0: 0
Mar 13 00:33:18.896964 kernel: Built 1 zonelists, mobility grouping on. Total pages: 1022792
Mar 13 00:33:18.896969 kernel: Policy zone: Normal
Mar 13 00:33:18.896974 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Mar 13 00:33:18.896980 kernel: software IO TLB: area num 2.
Mar 13 00:33:18.896985 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=2, Nodes=1
Mar 13 00:33:18.896990 kernel: ftrace: allocating 40099 entries in 157 pages
Mar 13 00:33:18.896996 kernel: ftrace: allocated 157 pages with 5 groups
Mar 13 00:33:18.897001 kernel: Dynamic Preempt: voluntary
Mar 13 00:33:18.897006 kernel: rcu: Preemptible hierarchical RCU implementation.
Mar 13 00:33:18.897014 kernel: rcu: RCU event tracing is enabled.
Mar 13 00:33:18.897020 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=2.
Mar 13 00:33:18.897026 kernel: Trampoline variant of Tasks RCU enabled.
Mar 13 00:33:18.897031 kernel: Rude variant of Tasks RCU enabled.
Mar 13 00:33:18.897037 kernel: Tracing variant of Tasks RCU enabled.
Mar 13 00:33:18.897042 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Mar 13 00:33:18.897048 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=2
Mar 13 00:33:18.897053 kernel: RCU Tasks: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Mar 13 00:33:18.897058 kernel: RCU Tasks Rude: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Mar 13 00:33:18.897064 kernel: RCU Tasks Trace: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Mar 13 00:33:18.897072 kernel: NR_IRQS: 33024, nr_irqs: 440, preallocated irqs: 16
Mar 13 00:33:18.897077 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Mar 13 00:33:18.897083 kernel: Console: colour dummy device 80x25
Mar 13 00:33:18.897088 kernel: printk: legacy console [tty0] enabled
Mar 13 00:33:18.897093 kernel: printk: legacy console [ttyS0] enabled
Mar 13 00:33:18.897099 kernel: ACPI: Core revision 20240827
Mar 13 00:33:18.897104 kernel: clocksource: hpet: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 19112604467 ns
Mar 13 00:33:18.897109 kernel: APIC: Switch to symmetric I/O mode setup
Mar 13 00:33:18.897115 kernel: x2apic enabled
Mar 13 00:33:18.897122 kernel: APIC: Switched APIC routing to: physical x2apic
Mar 13 00:33:18.897128 kernel: ..TIMER: vector=0x30 apic1=0 pin1=2 apic2=-1 pin2=-1
Mar 13 00:33:18.897133 kernel: clocksource: tsc-early: mask: 0xffffffffffffffff max_cycles: 0x229835b7123, max_idle_ns: 440795242976 ns
Mar 13 00:33:18.897139 kernel: Calibrating delay loop (skipped) preset value.. 4799.99 BogoMIPS (lpj=2399998)
Mar 13 00:33:18.897144 kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Mar 13 00:33:18.897149 kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127
Mar 13 00:33:18.897155 kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0
Mar 13 00:33:18.897160 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Mar 13 00:33:18.897168 kernel: Spectre V2 : Mitigation: Enhanced / Automatic IBRS
Mar 13 00:33:18.897173 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Mar 13 00:33:18.897179 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Mar 13 00:33:18.897184 kernel: active return thunk: srso_alias_return_thunk
Mar 13 00:33:18.897190 kernel: Speculative Return Stack Overflow: Mitigation: Safe RET
Mar 13 00:33:18.897195 kernel: Transient Scheduler Attacks: Forcing mitigation on in a VM
Mar 13 00:33:18.897200 kernel: Transient Scheduler Attacks: Vulnerable: Clear CPU buffers attempted, no microcode
Mar 13 00:33:18.897206 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Mar 13 00:33:18.897211 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Mar 13 00:33:18.897219 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Mar 13 00:33:18.897224 kernel: x86/fpu: Supporting XSAVE feature 0x020: 'AVX-512 opmask'
Mar 13 00:33:18.897229 kernel: x86/fpu: Supporting XSAVE feature 0x040: 'AVX-512 Hi256'
Mar 13 00:33:18.897235 kernel: x86/fpu: Supporting XSAVE feature 0x080: 'AVX-512 ZMM_Hi256'
Mar 13 00:33:18.897240 kernel: x86/fpu: Supporting XSAVE feature 0x200: 'Protection Keys User registers'
Mar 13 00:33:18.897246 kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256
Mar 13 00:33:18.897251 kernel: x86/fpu: xstate_offset[5]: 832, xstate_sizes[5]: 64
Mar 13 00:33:18.897256 kernel: x86/fpu: xstate_offset[6]: 896, xstate_sizes[6]: 512
Mar 13 00:33:18.897262 kernel: x86/fpu: xstate_offset[7]: 1408, xstate_sizes[7]: 1024
Mar 13 00:33:18.897285 kernel: x86/fpu: xstate_offset[9]: 2432, xstate_sizes[9]: 8
Mar 13 00:33:18.897290 kernel: x86/fpu: Enabled xstate features 0x2e7, context size is 2440 bytes, using 'compacted' format.
Mar 13 00:33:18.897295 kernel: Freeing SMP alternatives memory: 32K
Mar 13 00:33:18.898146 kernel: pid_max: default: 32768 minimum: 301
Mar 13 00:33:18.898158 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima
Mar 13 00:33:18.898164 kernel: landlock: Up and running.
Mar 13 00:33:18.898170 kernel: SELinux: Initializing.
Mar 13 00:33:18.898175 kernel: Mount-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
Mar 13 00:33:18.898181 kernel: Mountpoint-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
Mar 13 00:33:18.898190 kernel: smpboot: CPU0: AMD EPYC-Genoa Processor (family: 0x19, model: 0x11, stepping: 0x0)
Mar 13 00:33:18.898196 kernel: Performance Events: Fam17h+ core perfctr, AMD PMU driver.
Mar 13 00:33:18.898201 kernel: ... version: 0
Mar 13 00:33:18.898207 kernel: ... bit width: 48
Mar 13 00:33:18.898212 kernel: ... generic registers: 6
Mar 13 00:33:18.898217 kernel: ... value mask: 0000ffffffffffff
Mar 13 00:33:18.898223 kernel: ... max period: 00007fffffffffff
Mar 13 00:33:18.898228 kernel: ... fixed-purpose events: 0
Mar 13 00:33:18.898234 kernel: ... event mask: 000000000000003f
Mar 13 00:33:18.898241 kernel: signal: max sigframe size: 3376
Mar 13 00:33:18.898247 kernel: rcu: Hierarchical SRCU implementation.
Mar 13 00:33:18.898253 kernel: rcu: Max phase no-delay instances is 400.
Mar 13 00:33:18.898258 kernel: Timer migration: 1 hierarchy levels; 8 children per group; 1 crossnode level
Mar 13 00:33:18.898264 kernel: smp: Bringing up secondary CPUs ...
Mar 13 00:33:18.901550 kernel: smpboot: x86: Booting SMP configuration:
Mar 13 00:33:18.901559 kernel: .... node #0, CPUs: #1
Mar 13 00:33:18.901565 kernel: smp: Brought up 1 node, 2 CPUs
Mar 13 00:33:18.901570 kernel: smpboot: Total of 2 processors activated (9599.99 BogoMIPS)
Mar 13 00:33:18.901581 kernel: Memory: 3848520K/4091168K available (14336K kernel code, 2445K rwdata, 26064K rodata, 46200K init, 2560K bss, 237016K reserved, 0K cma-reserved)
Mar 13 00:33:18.901586 kernel: devtmpfs: initialized
Mar 13 00:33:18.901592 kernel: x86/mm: Memory block size: 128MB
Mar 13 00:33:18.901597 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x7fb7f000-0x7fbfefff] (524288 bytes)
Mar 13 00:33:18.901603 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Mar 13 00:33:18.901609 kernel: futex hash table entries: 512 (order: 3, 32768 bytes, linear)
Mar 13 00:33:18.901614 kernel: pinctrl core: initialized pinctrl subsystem
Mar 13 00:33:18.901619 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Mar 13 00:33:18.901625 kernel: audit: initializing netlink subsys (disabled)
Mar 13 00:33:18.901633 kernel: audit: type=2000 audit(1773361996.476:1): state=initialized audit_enabled=0 res=1
Mar 13 00:33:18.901638 kernel: thermal_sys: Registered thermal governor 'step_wise'
Mar 13 00:33:18.901644 kernel: thermal_sys: Registered thermal governor 'user_space'
Mar 13 00:33:18.901649 kernel: cpuidle: using governor menu
Mar 13 00:33:18.901655 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Mar 13 00:33:18.901660 kernel: dca service started, version 1.12.1
Mar 13 00:33:18.901665 kernel: PCI: ECAM [mem 0xe0000000-0xefffffff] (base 0xe0000000) for domain 0000 [bus 00-ff]
Mar 13 00:33:18.901671 kernel: PCI: Using configuration type 1 for base access
Mar 13 00:33:18.901676 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Mar 13 00:33:18.901684 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Mar 13 00:33:18.901690 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Mar 13 00:33:18.901695 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Mar 13 00:33:18.901700 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Mar 13 00:33:18.901706 kernel: ACPI: Added _OSI(Module Device)
Mar 13 00:33:18.901711 kernel: ACPI: Added _OSI(Processor Device)
Mar 13 00:33:18.901716 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Mar 13 00:33:18.901722 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Mar 13 00:33:18.901727 kernel: ACPI: Interpreter enabled
Mar 13 00:33:18.901735 kernel: ACPI: PM: (supports S0 S5)
Mar 13 00:33:18.901740 kernel: ACPI: Using IOAPIC for interrupt routing
Mar 13 00:33:18.901746 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Mar 13 00:33:18.901751 kernel: PCI: Using E820 reservations for host bridge windows
Mar 13 00:33:18.901756 kernel: ACPI: Enabled 2 GPEs in block 00 to 3F
Mar 13 00:33:18.901762 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Mar 13 00:33:18.901935 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3]
Mar 13 00:33:18.902043 kernel: acpi PNP0A08:00: _OSC: platform does not support [PCIeHotplug LTR]
Mar 13 00:33:18.902147 kernel: acpi PNP0A08:00: _OSC: OS now controls [PME AER PCIeCapability]
Mar 13 00:33:18.902154 kernel: PCI host bridge to bus 0000:00
Mar 13 00:33:18.902255 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window]
Mar 13 00:33:18.902382 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xffff window]
Mar 13 00:33:18.902473 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Mar 13 00:33:18.902561 kernel: pci_bus 0000:00: root bus resource [mem 0x80000000-0xdfffffff window]
Mar 13 00:33:18.902649 kernel: pci_bus 0000:00: root bus resource [mem 0xf0000000-0xfebfffff window]
Mar 13 00:33:18.902740 kernel: pci_bus 0000:00: root bus resource [mem 0xc000000000-0xc7ffffffff window]
Mar 13 00:33:18.902830 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Mar 13 00:33:18.902944 kernel: pci 0000:00:00.0: [8086:29c0] type 00 class 0x060000 conventional PCI endpoint
Mar 13 00:33:18.903054 kernel: pci 0000:00:01.0: [1af4:1050] type 00 class 0x030000 conventional PCI endpoint
Mar 13 00:33:18.903151 kernel: pci 0000:00:01.0: BAR 0 [mem 0x80000000-0x807fffff pref]
Mar 13 00:33:18.903247 kernel: pci 0000:00:01.0: BAR 2 [mem 0xc060500000-0xc060503fff 64bit pref]
Mar 13 00:33:18.903391 kernel: pci 0000:00:01.0: BAR 4 [mem 0x8138a000-0x8138afff]
Mar 13 00:33:18.903489 kernel: pci 0000:00:01.0: ROM [mem 0xffff0000-0xffffffff pref]
Mar 13 00:33:18.903585 kernel: pci 0000:00:01.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Mar 13 00:33:18.903688 kernel: pci 0000:00:02.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 13 00:33:18.903784 kernel: pci 0000:00:02.0: BAR 0 [mem 0x81389000-0x81389fff]
Mar 13 00:33:18.903879 kernel: pci 0000:00:02.0: PCI bridge to [bus 01]
Mar 13 00:33:18.903975 kernel: pci 0000:00:02.0: bridge window [mem 0x81200000-0x812fffff]
Mar 13 00:33:18.904074 kernel: pci 0000:00:02.0: bridge window [mem 0xc060000000-0xc0600fffff 64bit pref]
Mar 13 00:33:18.904176 kernel: pci 0000:00:02.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 13 00:33:18.904303 kernel: pci 0000:00:02.1: BAR 0 [mem 0x81388000-0x81388fff]
Mar 13 00:33:18.904403 kernel: pci 0000:00:02.1: PCI bridge to [bus 02]
Mar 13 00:33:18.904498 kernel: pci 0000:00:02.1: bridge window [mem 0x81100000-0x811fffff]
Mar 13 00:33:18.904601 kernel: pci 0000:00:02.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 13 00:33:18.904697 kernel: pci 0000:00:02.2: BAR 0 [mem 0x81387000-0x81387fff]
Mar 13 00:33:18.904795 kernel: pci 0000:00:02.2: PCI bridge to [bus 03]
Mar 13 00:33:18.904890 kernel: pci 0000:00:02.2: bridge window [mem 0x81000000-0x810fffff]
Mar 13 00:33:18.904985 kernel: pci 0000:00:02.2: bridge window [mem 0xc060100000-0xc0601fffff 64bit pref]
Mar 13 00:33:18.905091 kernel: pci 0000:00:02.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 13 00:33:18.905187 kernel: pci 0000:00:02.3: BAR 0 [mem 0x81386000-0x81386fff]
Mar 13 00:33:18.906324 kernel: pci 0000:00:02.3: PCI bridge to [bus 04]
Mar 13 00:33:18.906436 kernel: pci 0000:00:02.3: bridge window [mem 0xc060200000-0xc0602fffff 64bit pref]
Mar 13 00:33:18.906546 kernel: pci 0000:00:02.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 13 00:33:18.906646 kernel: pci 0000:00:02.4: BAR 0 [mem 0x81385000-0x81385fff]
Mar 13 00:33:18.906743 kernel: pci 0000:00:02.4: PCI bridge to [bus 05]
Mar 13 00:33:18.906838 kernel: pci 0000:00:02.4: bridge window [mem 0x80f00000-0x80ffffff]
Mar 13 00:33:18.906935 kernel: pci 0000:00:02.4: bridge window [mem 0xc060300000-0xc0603fffff 64bit pref]
Mar 13 00:33:18.907035 kernel: pci 0000:00:02.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 13 00:33:18.907132 kernel: pci 0000:00:02.5: BAR 0 [mem 0x81384000-0x81384fff]
Mar 13 00:33:18.907230 kernel: pci 0000:00:02.5: PCI bridge to [bus 06]
Mar 13 00:33:18.908394 kernel: pci 0000:00:02.5: bridge window [mem 0x80e00000-0x80efffff]
Mar 13 00:33:18.908499 kernel: pci 0000:00:02.5: bridge window [mem 0xc060400000-0xc0604fffff 64bit pref]
Mar 13 00:33:18.908606 kernel: pci 0000:00:02.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 13 00:33:18.908702 kernel: pci 0000:00:02.6: BAR 0 [mem 0x81383000-0x81383fff]
Mar 13 00:33:18.908799 kernel: pci 0000:00:02.6: PCI bridge to [bus 07]
Mar 13 00:33:18.908894 kernel: pci 0000:00:02.6: bridge window [mem 0x80c00000-0x80dfffff]
Mar 13 00:33:18.908999 kernel: pci 0000:00:02.6: bridge window [mem 0xc000000000-0xc01fffffff 64bit pref]
Mar 13 00:33:18.909101 kernel: pci 0000:00:02.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 13 00:33:18.909199 kernel: pci 0000:00:02.7: BAR 0 [mem 0x81382000-0x81382fff]
Mar 13 00:33:18.910342 kernel: pci 0000:00:02.7: PCI bridge to [bus 08]
Mar 13 00:33:18.910450 kernel: pci 0000:00:02.7: bridge window [mem 0x80a00000-0x80bfffff]
Mar 13 00:33:18.910547 kernel: pci 0000:00:02.7: bridge window [mem 0xc020000000-0xc03fffffff 64bit pref]
Mar 13 00:33:18.910653 kernel: pci 0000:00:03.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 13 00:33:18.910750 kernel: pci 0000:00:03.0: BAR 0 [mem 0x81381000-0x81381fff]
Mar 13 00:33:18.910845 kernel: pci 0000:00:03.0: PCI bridge to [bus 09]
Mar 13 00:33:18.910940 kernel: pci 0000:00:03.0: bridge window [mem 0x80800000-0x809fffff]
Mar 13 00:33:18.911035 kernel: pci 0000:00:03.0: bridge window [mem 0xc040000000-0xc05fffffff 64bit pref]
Mar 13 00:33:18.911136 kernel: pci 0000:00:1f.0: [8086:2918] type 00 class 0x060100 conventional PCI endpoint
Mar 13 00:33:18.911232 kernel: pci 0000:00:1f.0: quirk: [io 0x0600-0x067f] claimed by ICH6 ACPI/GPIO/TCO
Mar 13 00:33:18.911896 kernel: pci 0000:00:1f.2: [8086:2922] type 00 class 0x010601 conventional PCI endpoint
Mar 13 00:33:18.912000 kernel: pci 0000:00:1f.2: BAR 4 [io 0x6040-0x605f]
Mar 13 00:33:18.912097 kernel: pci 0000:00:1f.2: BAR 5 [mem 0x81380000-0x81380fff]
Mar 13 00:33:18.912198 kernel: pci 0000:00:1f.3: [8086:2930] type 00 class 0x0c0500 conventional PCI endpoint
Mar 13 00:33:18.912334 kernel: pci 0000:00:1f.3: BAR 4 [io 0x6000-0x603f]
Mar 13 00:33:18.912444 kernel: pci 0000:01:00.0: [1af4:1041] type 00 class 0x020000 PCIe Endpoint
Mar 13 00:33:18.912549 kernel: pci 0000:01:00.0: BAR 1 [mem 0x81200000-0x81200fff]
Mar 13 00:33:18.912650 kernel: pci 0000:01:00.0: BAR 4 [mem 0xc060000000-0xc060003fff 64bit pref]
Mar 13 00:33:18.912749 kernel: pci 0000:01:00.0: ROM [mem 0xfff80000-0xffffffff pref]
Mar 13 00:33:18.912845 kernel: pci 0000:00:02.0: PCI bridge to [bus 01]
Mar 13 00:33:18.912950 kernel: pci 0000:02:00.0: [1b36:000d] type 00 class 0x0c0330 PCIe Endpoint
Mar 13 00:33:18.913050 kernel: pci 0000:02:00.0: BAR 0 [mem 0x81100000-0x81103fff 64bit]
Mar 13 00:33:18.913145 kernel: pci 0000:00:02.1: PCI bridge to [bus 02]
Mar 13 00:33:18.913253 kernel: pci 0000:03:00.0: [1af4:1043] type 00 class 0x078000 PCIe Endpoint
Mar 13 00:33:18.913402 kernel: pci 0000:03:00.0: BAR 1 [mem 0x81000000-0x81000fff]
Mar 13 00:33:18.913502 kernel: pci 0000:03:00.0: BAR 4 [mem 0xc060100000-0xc060103fff 64bit pref]
Mar 13 00:33:18.913598 kernel: pci 0000:00:02.2: PCI bridge to [bus 03]
Mar 13 00:33:18.913728 kernel: pci 0000:04:00.0: [1af4:1045] type 00 class 0x00ff00 PCIe Endpoint
Mar 13 00:33:18.913834 kernel: pci 0000:04:00.0: BAR 4 [mem 0xc060200000-0xc060203fff 64bit pref]
Mar 13 00:33:18.913930 kernel: pci 0000:00:02.3: PCI bridge to [bus 04]
Mar 13 00:33:18.914042 kernel: pci 0000:05:00.0: [1af4:1044] type 00 class 0x00ff00 PCIe Endpoint
Mar 13 00:33:18.914143 kernel: pci 0000:05:00.0: BAR 1 [mem 0x80f00000-0x80f00fff]
Mar 13 00:33:18.914243 kernel: pci 0000:05:00.0: BAR 4 [mem 0xc060300000-0xc060303fff 64bit pref]
Mar 13 00:33:18.914366 kernel: pci 0000:00:02.4: PCI bridge to [bus 05]
Mar 13 00:33:18.914473 kernel: pci 0000:06:00.0: [1af4:1048] type 00 class 0x010000 PCIe Endpoint
Mar 13 00:33:18.914573 kernel: pci 0000:06:00.0: BAR 1 [mem 0x80e00000-0x80e00fff]
Mar 13 00:33:18.914673 kernel: pci 0000:06:00.0: BAR 4 [mem 0xc060400000-0xc060403fff 64bit pref]
Mar 13 00:33:18.914784 kernel: pci 0000:00:02.5: PCI bridge to [bus 06]
Mar 13 00:33:18.914796 kernel: acpiphp: Slot [0] registered
Mar 13 00:33:18.914924 kernel: pci 0000:07:00.0: [1af4:1041] type 00 class 0x020000 PCIe Endpoint
Mar 13 00:33:18.915029 kernel: pci 0000:07:00.0: BAR 1 [mem 0x80c00000-0x80c00fff]
Mar 13 00:33:18.915130 kernel: pci 0000:07:00.0: BAR 4 [mem 0xc000000000-0xc000003fff 64bit pref]
Mar 13 00:33:18.915229 kernel: pci 0000:07:00.0: ROM [mem 0xfff80000-0xffffffff pref]
Mar 13 00:33:18.915602 kernel: pci 0000:00:02.6: PCI bridge to [bus 07]
Mar 13 00:33:18.915616 kernel: acpiphp: Slot [0-2] registered
Mar 13 00:33:18.915715 kernel: pci 0000:00:02.7: PCI bridge to [bus 08]
Mar 13 00:33:18.915722 kernel: acpiphp: Slot [0-3] registered
Mar 13 00:33:18.915823 kernel: pci 0000:00:03.0: PCI bridge to [bus 09]
Mar 13 00:33:18.915832 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Mar 13 00:33:18.915851 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Mar 13 00:33:18.915859 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Mar 13 00:33:18.915865 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Mar 13 00:33:18.915872 kernel: ACPI: PCI: Interrupt link LNKE configured for IRQ 10
Mar 13 00:33:18.915878 kernel: ACPI: PCI: Interrupt link LNKF configured for IRQ 10
Mar 13 00:33:18.915884 kernel: ACPI: PCI: Interrupt link LNKG configured for IRQ 11
Mar 13 00:33:18.915890 kernel: ACPI: PCI: Interrupt link LNKH configured for IRQ 11
Mar 13 00:33:18.915895 kernel: ACPI: PCI: Interrupt link GSIA configured for IRQ 16
Mar 13 00:33:18.915901 kernel: ACPI: PCI: Interrupt link GSIB configured for IRQ 17
Mar 13 00:33:18.915906 kernel: ACPI: PCI: Interrupt link GSIC configured for IRQ 18
Mar 13 00:33:18.915912 kernel: ACPI: PCI: Interrupt link GSID configured for IRQ 19
Mar 13 00:33:18.915918 kernel: ACPI: PCI: Interrupt link GSIE configured for IRQ 20
Mar 13 00:33:18.915925 kernel: ACPI: PCI: Interrupt link GSIF configured for IRQ 21
Mar 13 00:33:18.915931 kernel: ACPI: PCI: Interrupt link GSIG configured for IRQ 22
Mar 13 00:33:18.915939 kernel: ACPI: PCI: Interrupt link GSIH configured for IRQ 23
Mar 13 00:33:18.915945 kernel: iommu: Default domain type: Translated
Mar 13 00:33:18.915950 kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Mar 13 00:33:18.915956 kernel: efivars: Registered efivars operations
Mar 13 00:33:18.915964 kernel: PCI: Using ACPI for IRQ routing
Mar 13 00:33:18.915970 kernel: PCI: pci_cache_line_size set to 64 bytes
Mar 13 00:33:18.915976 kernel: e820: reserve RAM buffer [mem 0x7df36018-0x7fffffff]
Mar 13 00:33:18.915981 kernel: e820: reserve RAM buffer [mem 0x7df72018-0x7fffffff]
Mar 13 00:33:18.915987 kernel: e820: reserve RAM buffer [mem 0x7dfae018-0x7fffffff]
Mar 13 00:33:18.915992 kernel: e820: reserve RAM buffer [mem 0x7ed3f000-0x7fffffff]
Mar 13 00:33:18.915998 kernel: e820: reserve RAM buffer [mem 0x7f8ed000-0x7fffffff]
Mar 13 00:33:18.916004 kernel: e820: reserve RAM buffer [mem 0x7ff7c000-0x7fffffff]
Mar 13 00:33:18.916011 kernel: e820: reserve RAM buffer [mem 0x17a000000-0x17bffffff]
Mar 13 00:33:18.916108 kernel: pci 0000:00:01.0: vgaarb: setting as boot VGA device
Mar 13 00:33:18.916204 kernel: pci 0000:00:01.0: vgaarb: bridge control possible
Mar 13 00:33:18.916333 kernel: pci 0000:00:01.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Mar 13 00:33:18.916341 kernel: vgaarb: loaded
Mar 13 00:33:18.916347 kernel: hpet0: at MMIO 0xfed00000, IRQs 2, 8, 0
Mar 13 00:33:18.916353 kernel: hpet0: 3 comparators, 64-bit 100.000000 MHz counter
Mar 13 00:33:18.916358 kernel: clocksource: Switched to clocksource kvm-clock
Mar 13 00:33:18.916364 kernel: VFS: Disk quotas dquot_6.6.0
Mar 13 00:33:18.916373 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Mar 13 00:33:18.916379 kernel: pnp: PnP ACPI init
Mar 13 00:33:18.916485 kernel:
system 00:04: [mem 0xe0000000-0xefffffff window] has been reserved Mar 13 00:33:18.916493 kernel: pnp: PnP ACPI: found 5 devices Mar 13 00:33:18.916499 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns Mar 13 00:33:18.916505 kernel: NET: Registered PF_INET protocol family Mar 13 00:33:18.916511 kernel: IP idents hash table entries: 65536 (order: 7, 524288 bytes, linear) Mar 13 00:33:18.916517 kernel: tcp_listen_portaddr_hash hash table entries: 2048 (order: 3, 32768 bytes, linear) Mar 13 00:33:18.916525 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear) Mar 13 00:33:18.916531 kernel: TCP established hash table entries: 32768 (order: 6, 262144 bytes, linear) Mar 13 00:33:18.916536 kernel: TCP bind hash table entries: 32768 (order: 8, 1048576 bytes, linear) Mar 13 00:33:18.916542 kernel: TCP: Hash tables configured (established 32768 bind 32768) Mar 13 00:33:18.916548 kernel: UDP hash table entries: 2048 (order: 4, 65536 bytes, linear) Mar 13 00:33:18.916553 kernel: UDP-Lite hash table entries: 2048 (order: 4, 65536 bytes, linear) Mar 13 00:33:18.916559 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family Mar 13 00:33:18.916565 kernel: NET: Registered PF_XDP protocol family Mar 13 00:33:18.916667 kernel: pci 0000:01:00.0: ROM [mem 0xfff80000-0xffffffff pref]: can't claim; no compatible bridge window Mar 13 00:33:18.916773 kernel: pci 0000:07:00.0: ROM [mem 0xfff80000-0xffffffff pref]: can't claim; no compatible bridge window Mar 13 00:33:18.916870 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x0fff] to [bus 07] add_size 1000 Mar 13 00:33:18.916965 kernel: pci 0000:00:02.7: bridge window [io 0x1000-0x0fff] to [bus 08] add_size 1000 Mar 13 00:33:18.917060 kernel: pci 0000:00:03.0: bridge window [io 0x1000-0x0fff] to [bus 09] add_size 1000 Mar 13 00:33:18.917155 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x1fff]: assigned Mar 13 00:33:18.917251 kernel: pci 0000:00:02.7: bridge 
window [io 0x2000-0x2fff]: assigned Mar 13 00:33:18.919391 kernel: pci 0000:00:03.0: bridge window [io 0x3000-0x3fff]: assigned Mar 13 00:33:18.919504 kernel: pci 0000:01:00.0: ROM [mem 0x81280000-0x812fffff pref]: assigned Mar 13 00:33:18.919614 kernel: pci 0000:00:02.0: PCI bridge to [bus 01] Mar 13 00:33:18.919709 kernel: pci 0000:00:02.0: bridge window [mem 0x81200000-0x812fffff] Mar 13 00:33:18.919804 kernel: pci 0000:00:02.0: bridge window [mem 0xc060000000-0xc0600fffff 64bit pref] Mar 13 00:33:18.919900 kernel: pci 0000:00:02.1: PCI bridge to [bus 02] Mar 13 00:33:18.919994 kernel: pci 0000:00:02.1: bridge window [mem 0x81100000-0x811fffff] Mar 13 00:33:18.920090 kernel: pci 0000:00:02.2: PCI bridge to [bus 03] Mar 13 00:33:18.920185 kernel: pci 0000:00:02.2: bridge window [mem 0x81000000-0x810fffff] Mar 13 00:33:18.921352 kernel: pci 0000:00:02.2: bridge window [mem 0xc060100000-0xc0601fffff 64bit pref] Mar 13 00:33:18.921477 kernel: pci 0000:00:02.3: PCI bridge to [bus 04] Mar 13 00:33:18.921581 kernel: pci 0000:00:02.3: bridge window [mem 0xc060200000-0xc0602fffff 64bit pref] Mar 13 00:33:18.921677 kernel: pci 0000:00:02.4: PCI bridge to [bus 05] Mar 13 00:33:18.921772 kernel: pci 0000:00:02.4: bridge window [mem 0x80f00000-0x80ffffff] Mar 13 00:33:18.921881 kernel: pci 0000:00:02.4: bridge window [mem 0xc060300000-0xc0603fffff 64bit pref] Mar 13 00:33:18.921980 kernel: pci 0000:00:02.5: PCI bridge to [bus 06] Mar 13 00:33:18.922076 kernel: pci 0000:00:02.5: bridge window [mem 0x80e00000-0x80efffff] Mar 13 00:33:18.922172 kernel: pci 0000:00:02.5: bridge window [mem 0xc060400000-0xc0604fffff 64bit pref] Mar 13 00:33:18.923595 kernel: pci 0000:07:00.0: ROM [mem 0x80c80000-0x80cfffff pref]: assigned Mar 13 00:33:18.923917 kernel: pci 0000:00:02.6: PCI bridge to [bus 07] Mar 13 00:33:18.925236 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x1fff] Mar 13 00:33:18.925384 kernel: pci 0000:00:02.6: bridge window [mem 0x80c00000-0x80dfffff] Mar 13 
00:33:18.925489 kernel: pci 0000:00:02.6: bridge window [mem 0xc000000000-0xc01fffffff 64bit pref] Mar 13 00:33:18.925592 kernel: pci 0000:00:02.7: PCI bridge to [bus 08] Mar 13 00:33:18.925694 kernel: pci 0000:00:02.7: bridge window [io 0x2000-0x2fff] Mar 13 00:33:18.925799 kernel: pci 0000:00:02.7: bridge window [mem 0x80a00000-0x80bfffff] Mar 13 00:33:18.925899 kernel: pci 0000:00:02.7: bridge window [mem 0xc020000000-0xc03fffffff 64bit pref] Mar 13 00:33:18.925999 kernel: pci 0000:00:03.0: PCI bridge to [bus 09] Mar 13 00:33:18.926098 kernel: pci 0000:00:03.0: bridge window [io 0x3000-0x3fff] Mar 13 00:33:18.926198 kernel: pci 0000:00:03.0: bridge window [mem 0x80800000-0x809fffff] Mar 13 00:33:18.926318 kernel: pci 0000:00:03.0: bridge window [mem 0xc040000000-0xc05fffffff 64bit pref] Mar 13 00:33:18.926418 kernel: pci_bus 0000:00: resource 4 [io 0x0000-0x0cf7 window] Mar 13 00:33:18.926531 kernel: pci_bus 0000:00: resource 5 [io 0x0d00-0xffff window] Mar 13 00:33:18.926621 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window] Mar 13 00:33:18.926715 kernel: pci_bus 0000:00: resource 7 [mem 0x80000000-0xdfffffff window] Mar 13 00:33:18.926807 kernel: pci_bus 0000:00: resource 8 [mem 0xf0000000-0xfebfffff window] Mar 13 00:33:18.926897 kernel: pci_bus 0000:00: resource 9 [mem 0xc000000000-0xc7ffffffff window] Mar 13 00:33:18.926999 kernel: pci_bus 0000:01: resource 1 [mem 0x81200000-0x812fffff] Mar 13 00:33:18.927098 kernel: pci_bus 0000:01: resource 2 [mem 0xc060000000-0xc0600fffff 64bit pref] Mar 13 00:33:18.927203 kernel: pci_bus 0000:02: resource 1 [mem 0x81100000-0x811fffff] Mar 13 00:33:18.927346 kernel: pci_bus 0000:03: resource 1 [mem 0x81000000-0x810fffff] Mar 13 00:33:18.927443 kernel: pci_bus 0000:03: resource 2 [mem 0xc060100000-0xc0601fffff 64bit pref] Mar 13 00:33:18.927543 kernel: pci_bus 0000:04: resource 2 [mem 0xc060200000-0xc0602fffff 64bit pref] Mar 13 00:33:18.927642 kernel: pci_bus 0000:05: resource 1 [mem 
0x80f00000-0x80ffffff] Mar 13 00:33:18.927736 kernel: pci_bus 0000:05: resource 2 [mem 0xc060300000-0xc0603fffff 64bit pref] Mar 13 00:33:18.927835 kernel: pci_bus 0000:06: resource 1 [mem 0x80e00000-0x80efffff] Mar 13 00:33:18.927929 kernel: pci_bus 0000:06: resource 2 [mem 0xc060400000-0xc0604fffff 64bit pref] Mar 13 00:33:18.928034 kernel: pci_bus 0000:07: resource 0 [io 0x1000-0x1fff] Mar 13 00:33:18.928127 kernel: pci_bus 0000:07: resource 1 [mem 0x80c00000-0x80dfffff] Mar 13 00:33:18.928221 kernel: pci_bus 0000:07: resource 2 [mem 0xc000000000-0xc01fffffff 64bit pref] Mar 13 00:33:18.928339 kernel: pci_bus 0000:08: resource 0 [io 0x2000-0x2fff] Mar 13 00:33:18.928433 kernel: pci_bus 0000:08: resource 1 [mem 0x80a00000-0x80bfffff] Mar 13 00:33:18.928526 kernel: pci_bus 0000:08: resource 2 [mem 0xc020000000-0xc03fffffff 64bit pref] Mar 13 00:33:18.928626 kernel: pci_bus 0000:09: resource 0 [io 0x3000-0x3fff] Mar 13 00:33:18.928720 kernel: pci_bus 0000:09: resource 1 [mem 0x80800000-0x809fffff] Mar 13 00:33:18.928812 kernel: pci_bus 0000:09: resource 2 [mem 0xc040000000-0xc05fffffff 64bit pref] Mar 13 00:33:18.928820 kernel: ACPI: \_SB_.GSIG: Enabled at IRQ 22 Mar 13 00:33:18.928826 kernel: PCI: CLS 0 bytes, default 64 Mar 13 00:33:18.928832 kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB) Mar 13 00:33:18.928838 kernel: software IO TLB: mapped [mem 0x0000000077ffd000-0x000000007bffd000] (64MB) Mar 13 00:33:18.928843 kernel: clocksource: tsc: mask: 0xffffffffffffffff max_cycles: 0x229835b7123, max_idle_ns: 440795242976 ns Mar 13 00:33:18.928852 kernel: Initialise system trusted keyrings Mar 13 00:33:18.928858 kernel: workingset: timestamp_bits=39 max_order=20 bucket_order=0 Mar 13 00:33:18.928864 kernel: Key type asymmetric registered Mar 13 00:33:18.928869 kernel: Asymmetric key parser 'x509' registered Mar 13 00:33:18.928875 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 250) Mar 13 00:33:18.928881 kernel: io scheduler 
mq-deadline registered Mar 13 00:33:18.928886 kernel: io scheduler kyber registered Mar 13 00:33:18.928892 kernel: io scheduler bfq registered Mar 13 00:33:18.928990 kernel: pcieport 0000:00:02.0: PME: Signaling with IRQ 24 Mar 13 00:33:18.929089 kernel: pcieport 0000:00:02.0: AER: enabled with IRQ 24 Mar 13 00:33:18.929187 kernel: pcieport 0000:00:02.1: PME: Signaling with IRQ 25 Mar 13 00:33:18.929301 kernel: pcieport 0000:00:02.1: AER: enabled with IRQ 25 Mar 13 00:33:18.929398 kernel: pcieport 0000:00:02.2: PME: Signaling with IRQ 26 Mar 13 00:33:18.929495 kernel: pcieport 0000:00:02.2: AER: enabled with IRQ 26 Mar 13 00:33:18.929592 kernel: pcieport 0000:00:02.3: PME: Signaling with IRQ 27 Mar 13 00:33:18.929688 kernel: pcieport 0000:00:02.3: AER: enabled with IRQ 27 Mar 13 00:33:18.929788 kernel: pcieport 0000:00:02.4: PME: Signaling with IRQ 28 Mar 13 00:33:18.929896 kernel: pcieport 0000:00:02.4: AER: enabled with IRQ 28 Mar 13 00:33:18.929991 kernel: pcieport 0000:00:02.5: PME: Signaling with IRQ 29 Mar 13 00:33:18.930098 kernel: pcieport 0000:00:02.5: AER: enabled with IRQ 29 Mar 13 00:33:18.930195 kernel: pcieport 0000:00:02.6: PME: Signaling with IRQ 30 Mar 13 00:33:18.930335 kernel: pcieport 0000:00:02.6: AER: enabled with IRQ 30 Mar 13 00:33:18.930433 kernel: pcieport 0000:00:02.7: PME: Signaling with IRQ 31 Mar 13 00:33:18.930529 kernel: pcieport 0000:00:02.7: AER: enabled with IRQ 31 Mar 13 00:33:18.930539 kernel: ACPI: \_SB_.GSIH: Enabled at IRQ 23 Mar 13 00:33:18.930635 kernel: pcieport 0000:00:03.0: PME: Signaling with IRQ 32 Mar 13 00:33:18.930749 kernel: pcieport 0000:00:03.0: AER: enabled with IRQ 32 Mar 13 00:33:18.930757 kernel: ioatdma: Intel(R) QuickData Technology Driver 5.00 Mar 13 00:33:18.930763 kernel: ACPI: \_SB_.GSIF: Enabled at IRQ 21 Mar 13 00:33:18.930769 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Mar 13 00:33:18.930774 kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A Mar 13 
00:33:18.930781 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12 Mar 13 00:33:18.930789 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1 Mar 13 00:33:18.930795 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12 Mar 13 00:33:18.930897 kernel: rtc_cmos 00:03: RTC can wake from S4 Mar 13 00:33:18.930905 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input0 Mar 13 00:33:18.931020 kernel: rtc_cmos 00:03: registered as rtc0 Mar 13 00:33:18.931141 kernel: rtc_cmos 00:03: setting system clock to 2026-03-13T00:33:18 UTC (1773361998) Mar 13 00:33:18.931259 kernel: rtc_cmos 00:03: alarms up to one day, y3k, 242 bytes nvram, hpet irqs Mar 13 00:33:18.931320 kernel: amd_pstate: The CPPC feature is supported but currently disabled by the BIOS. Please enable it if your BIOS has the CPPC option. Mar 13 00:33:18.931329 kernel: amd_pstate: the _CPC object is not present in SBIOS or ACPI disabled Mar 13 00:33:18.931338 kernel: efifb: probing for efifb Mar 13 00:33:18.931347 kernel: efifb: framebuffer at 0x80000000, using 4000k, total 4000k Mar 13 00:33:18.931356 kernel: efifb: mode is 1280x800x32, linelength=5120, pages=1 Mar 13 00:33:18.931364 kernel: efifb: scrolling: redraw Mar 13 00:33:18.931373 kernel: efifb: Truecolor: size=8:8:8:8, shift=24:16:8:0 Mar 13 00:33:18.931383 kernel: Console: switching to colour frame buffer device 160x50 Mar 13 00:33:18.931391 kernel: fb0: EFI VGA frame buffer device Mar 13 00:33:18.931404 kernel: pstore: Using crash dump compression: deflate Mar 13 00:33:18.931413 kernel: pstore: Registered efi_pstore as persistent store backend Mar 13 00:33:18.931421 kernel: NET: Registered PF_INET6 protocol family Mar 13 00:33:18.931429 kernel: Segment Routing with IPv6 Mar 13 00:33:18.931437 kernel: In-situ OAM (IOAM) with IPv6 Mar 13 00:33:18.931446 kernel: NET: Registered PF_PACKET protocol family Mar 13 00:33:18.931454 kernel: Key type dns_resolver registered Mar 13 00:33:18.931462 
kernel: IPI shorthand broadcast: enabled Mar 13 00:33:18.931470 kernel: sched_clock: Marking stable (2774022835, 269344934)->(3083164814, -39797045) Mar 13 00:33:18.931479 kernel: registered taskstats version 1 Mar 13 00:33:18.931485 kernel: Loading compiled-in X.509 certificates Mar 13 00:33:18.931491 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.74-flatcar: 5aff49df330f42445474818d085d5033fee752d8' Mar 13 00:33:18.931497 kernel: Demotion targets for Node 0: null Mar 13 00:33:18.931502 kernel: Key type .fscrypt registered Mar 13 00:33:18.931510 kernel: Key type fscrypt-provisioning registered Mar 13 00:33:18.931519 kernel: ima: No TPM chip found, activating TPM-bypass! Mar 13 00:33:18.931529 kernel: ima: Allocated hash algorithm: sha1 Mar 13 00:33:18.931537 kernel: ima: No architecture policies found Mar 13 00:33:18.931549 kernel: clk: Disabling unused clocks Mar 13 00:33:18.931556 kernel: Warning: unable to open an initial console. Mar 13 00:33:18.931562 kernel: Freeing unused kernel image (initmem) memory: 46200K Mar 13 00:33:18.931568 kernel: Write protecting the kernel read-only data: 40960k Mar 13 00:33:18.931574 kernel: Freeing unused kernel image (rodata/data gap) memory: 560K Mar 13 00:33:18.931579 kernel: Run /init as init process Mar 13 00:33:18.931585 kernel: with arguments: Mar 13 00:33:18.931591 kernel: /init Mar 13 00:33:18.931598 kernel: with environment: Mar 13 00:33:18.931610 kernel: HOME=/ Mar 13 00:33:18.931619 kernel: TERM=linux Mar 13 00:33:18.931628 systemd[1]: Successfully made /usr/ read-only. 
Mar 13 00:33:18.931640 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE)
Mar 13 00:33:18.931650 systemd[1]: Detected virtualization kvm.
Mar 13 00:33:18.931659 systemd[1]: Detected architecture x86-64.
Mar 13 00:33:18.931668 systemd[1]: Running in initrd.
Mar 13 00:33:18.931679 systemd[1]: No hostname configured, using default hostname.
Mar 13 00:33:18.931686 systemd[1]: Hostname set to .
Mar 13 00:33:18.931693 systemd[1]: Initializing machine ID from VM UUID.
Mar 13 00:33:18.931699 systemd[1]: Queued start job for default target initrd.target.
Mar 13 00:33:18.931705 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Mar 13 00:33:18.931711 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Mar 13 00:33:18.931718 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM...
Mar 13 00:33:18.931724 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Mar 13 00:33:18.931732 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT...
Mar 13 00:33:18.931739 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A...
Mar 13 00:33:18.931746 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132...
Mar 13 00:33:18.931752 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr...
Mar 13 00:33:18.931758 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Mar 13 00:33:18.931764 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Mar 13 00:33:18.931770 systemd[1]: Reached target paths.target - Path Units.
Mar 13 00:33:18.931779 systemd[1]: Reached target slices.target - Slice Units.
Mar 13 00:33:18.931785 systemd[1]: Reached target swap.target - Swaps.
Mar 13 00:33:18.931791 systemd[1]: Reached target timers.target - Timer Units.
Mar 13 00:33:18.931797 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket.
Mar 13 00:33:18.931803 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Mar 13 00:33:18.931809 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log).
Mar 13 00:33:18.931816 systemd[1]: Listening on systemd-journald.socket - Journal Sockets.
Mar 13 00:33:18.931826 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Mar 13 00:33:18.931835 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Mar 13 00:33:18.931847 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Mar 13 00:33:18.931854 systemd[1]: Reached target sockets.target - Socket Units.
Mar 13 00:33:18.931860 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup...
Mar 13 00:33:18.931866 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Mar 13 00:33:18.931872 systemd[1]: Finished network-cleanup.service - Network Cleanup.
Mar 13 00:33:18.931878 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply).
Mar 13 00:33:18.931884 systemd[1]: Starting systemd-fsck-usr.service...
Mar 13 00:33:18.931890 systemd[1]: Starting systemd-journald.service - Journal Service...
Mar 13 00:33:18.931899 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Mar 13 00:33:18.931905 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Mar 13 00:33:18.931911 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup.
Mar 13 00:33:18.931941 systemd-journald[197]: Collecting audit messages is disabled.
Mar 13 00:33:18.931960 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Mar 13 00:33:18.931966 systemd[1]: Finished systemd-fsck-usr.service.
Mar 13 00:33:18.931972 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
Mar 13 00:33:18.931979 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Mar 13 00:33:18.931987 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Mar 13 00:33:18.931993 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Mar 13 00:33:18.932000 systemd-journald[197]: Journal started
Mar 13 00:33:18.932014 systemd-journald[197]: Runtime Journal (/run/log/journal/3343a34f6f734c2e8a0167e39c728900) is 8M, max 76.1M, 68.1M free.
Mar 13 00:33:18.894006 systemd-modules-load[198]: Inserted module 'overlay'
Mar 13 00:33:18.937472 systemd[1]: Started systemd-journald.service - Journal Service.
Mar 13 00:33:18.937499 kernel: Bridge firewalling registered
Mar 13 00:33:18.936903 systemd-modules-load[198]: Inserted module 'br_netfilter'
Mar 13 00:33:18.939170 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Mar 13 00:33:18.942772 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Mar 13 00:33:18.944399 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Mar 13 00:33:18.949983 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Mar 13 00:33:18.951453 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Mar 13 00:33:18.962019 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Mar 13 00:33:18.967074 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Mar 13 00:33:18.967136 systemd-tmpfiles[219]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring.
Mar 13 00:33:18.970902 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Mar 13 00:33:18.972490 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook...
Mar 13 00:33:18.973427 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Mar 13 00:33:18.983827 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Mar 13 00:33:18.994680 dracut-cmdline[234]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=hetzner verity.usrhash=a2116dc4421f78fe124deb19b9ad6d70a0cb4fc0b3349854f4ce4e2904d4925d
Mar 13 00:33:19.021020 systemd-resolved[236]: Positive Trust Anchors:
Mar 13 00:33:19.021325 systemd-resolved[236]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Mar 13 00:33:19.021346 systemd-resolved[236]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Mar 13 00:33:19.025517 systemd-resolved[236]: Defaulting to hostname 'linux'.
Mar 13 00:33:19.027127 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Mar 13 00:33:19.027715 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Mar 13 00:33:19.068325 kernel: SCSI subsystem initialized
Mar 13 00:33:19.076300 kernel: Loading iSCSI transport class v2.0-870.
Mar 13 00:33:19.085344 kernel: iscsi: registered transport (tcp)
Mar 13 00:33:19.101556 kernel: iscsi: registered transport (qla4xxx)
Mar 13 00:33:19.101582 kernel: QLogic iSCSI HBA Driver
Mar 13 00:33:19.119238 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
Mar 13 00:33:19.142723 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Mar 13 00:33:19.144603 systemd[1]: Reached target network-pre.target - Preparation for Network.
Mar 13 00:33:19.205158 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook.
Mar 13 00:33:19.207911 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook...
Mar 13 00:33:19.261311 kernel: raid6: avx512x4 gen() 43113 MB/s
Mar 13 00:33:19.279297 kernel: raid6: avx512x2 gen() 46614 MB/s
Mar 13 00:33:19.297306 kernel: raid6: avx512x1 gen() 43781 MB/s
Mar 13 00:33:19.315296 kernel: raid6: avx2x4 gen() 41935 MB/s
Mar 13 00:33:19.333304 kernel: raid6: avx2x2 gen() 49710 MB/s
Mar 13 00:33:19.352663 kernel: raid6: avx2x1 gen() 39310 MB/s
Mar 13 00:33:19.352719 kernel: raid6: using algorithm avx2x2 gen() 49710 MB/s
Mar 13 00:33:19.372475 kernel: raid6: .... xor() 29630 MB/s, rmw enabled
Mar 13 00:33:19.372542 kernel: raid6: using avx512x2 recovery algorithm
Mar 13 00:33:19.411334 kernel: xor: automatically using best checksumming function avx
Mar 13 00:33:19.540358 kernel: Btrfs loaded, zoned=no, fsverity=no
Mar 13 00:33:19.550109 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook.
Mar 13 00:33:19.554341 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Mar 13 00:33:19.577183 systemd-udevd[447]: Using default interface naming scheme 'v255'.
Mar 13 00:33:19.582312 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Mar 13 00:33:19.588380 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook...
Mar 13 00:33:19.623248 dracut-pre-trigger[456]: rd.md=0: removing MD RAID activation
Mar 13 00:33:19.657628 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook.
Mar 13 00:33:19.660971 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Mar 13 00:33:19.738411 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Mar 13 00:33:19.745106 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook...
Mar 13 00:33:19.806484 kernel: virtio_scsi virtio5: 2/0/0 default/read/poll queues
Mar 13 00:33:19.814302 kernel: scsi host0: Virtio SCSI HBA
Mar 13 00:33:19.819296 kernel: scsi 0:0:0:0: Direct-Access QEMU QEMU HARDDISK 2.5+ PQ: 0 ANSI: 5
Mar 13 00:33:19.829306 kernel: libata version 3.00 loaded.
Mar 13 00:33:19.848303 kernel: cryptd: max_cpu_qlen set to 1000
Mar 13 00:33:19.862352 kernel: input: ImExPS/2 Generic Explorer Mouse as /devices/platform/i8042/serio1/input/input3
Mar 13 00:33:19.875298 kernel: ACPI: bus type USB registered
Mar 13 00:33:19.879351 kernel: usbcore: registered new interface driver usbfs
Mar 13 00:33:19.879398 kernel: usbcore: registered new interface driver hub
Mar 13 00:33:19.881327 kernel: usbcore: registered new device driver usb
Mar 13 00:33:19.881533 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Mar 13 00:33:19.881649 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Mar 13 00:33:19.882786 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup...
Mar 13 00:33:19.884877 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Mar 13 00:33:19.886700 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully.
Mar 13 00:33:19.894804 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Mar 13 00:33:19.895238 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Mar 13 00:33:19.900157 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Mar 13 00:33:19.908983 kernel: sd 0:0:0:0: Power-on or device reset occurred
Mar 13 00:33:19.909200 kernel: ahci 0000:00:1f.2: version 3.0
Mar 13 00:33:19.915296 kernel: sd 0:0:0:0: [sda] 160006144 512-byte logical blocks: (81.9 GB/76.3 GiB)
Mar 13 00:33:19.915456 kernel: AES CTR mode by8 optimization enabled
Mar 13 00:33:19.915466 kernel: sd 0:0:0:0: [sda] Write Protect is off
Mar 13 00:33:19.915598 kernel: sd 0:0:0:0: [sda] Mode Sense: 63 00 00 08
Mar 13 00:33:19.915720 kernel: ACPI: \_SB_.GSIA: Enabled at IRQ 16
Mar 13 00:33:19.919529 kernel: sd 0:0:0:0: [sda] Write cache: enabled, read cache: enabled, doesn't support DPO or FUA
Mar 13 00:33:19.925299 kernel: ahci 0000:00:1f.2: AHCI vers 0001.0000, 32 command slots, 1.5 Gbps, SATA mode
Mar 13 00:33:19.925493 kernel: ahci 0000:00:1f.2: 6/6 ports implemented (port mask 0x3f)
Mar 13 00:33:19.925638 kernel: ahci 0000:00:1f.2: flags: 64bit ncq only
Mar 13 00:33:19.931176 kernel: scsi host1: ahci
Mar 13 00:33:19.931522 kernel: scsi host2: ahci
Mar 13 00:33:19.934474 kernel: scsi host3: ahci
Mar 13 00:33:19.936293 kernel: scsi host4: ahci
Mar 13 00:33:19.939871 kernel: scsi host5: ahci
Mar 13 00:33:19.940019 kernel: scsi host6: ahci
Mar 13 00:33:19.963625 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk.
Mar 13 00:33:19.963653 kernel: ata1: SATA max UDMA/133 abar m4096@0x81380000 port 0x81380100 irq 48 lpm-pol 1
Mar 13 00:33:19.963668 kernel: GPT:17805311 != 160006143
Mar 13 00:33:19.963676 kernel: ata2: SATA max UDMA/133 abar m4096@0x81380000 port 0x81380180 irq 48 lpm-pol 1
Mar 13 00:33:19.963684 kernel: GPT:Alternate GPT header not at the end of the disk.
Mar 13 00:33:19.963692 kernel: GPT:17805311 != 160006143
Mar 13 00:33:19.963700 kernel: ata3: SATA max UDMA/133 abar m4096@0x81380000 port 0x81380200 irq 48 lpm-pol 1
Mar 13 00:33:19.963708 kernel: GPT: Use GNU Parted to correct GPT errors.
Mar 13 00:33:19.963716 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
Mar 13 00:33:19.963723 kernel: ata4: SATA max UDMA/133 abar m4096@0x81380000 port 0x81380280 irq 48 lpm-pol 1
Mar 13 00:33:19.963731 kernel: ata5: SATA max UDMA/133 abar m4096@0x81380000 port 0x81380300 irq 48 lpm-pol 1
Mar 13 00:33:19.963741 kernel: ata6: SATA max UDMA/133 abar m4096@0x81380000 port 0x81380380 irq 48 lpm-pol 1
Mar 13 00:33:19.965401 kernel: sd 0:0:0:0: [sda] Attached SCSI disk
Mar 13 00:33:19.978211 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Mar 13 00:33:20.281589 kernel: ata1: SATA link up 1.5 Gbps (SStatus 113 SControl 300)
Mar 13 00:33:20.281685 kernel: ata4: SATA link down (SStatus 0 SControl 300)
Mar 13 00:33:20.281709 kernel: ata1.00: LPM support broken, forcing max_power
Mar 13 00:33:20.290075 kernel: ata1.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100
Mar 13 00:33:20.290130 kernel: ata1.00: applying bridge limits
Mar 13 00:33:20.298897 kernel: ata5: SATA link down (SStatus 0 SControl 300)
Mar 13 00:33:20.311415 kernel: ata6: SATA link down (SStatus 0 SControl 300)
Mar 13 00:33:20.311467 kernel: ata1.00: LPM support broken, forcing max_power
Mar 13 00:33:20.311490 kernel: ata1.00: configured for UDMA/100
Mar 13 00:33:20.312325 kernel: ata3: SATA link down (SStatus 0 SControl 300)
Mar 13 00:33:20.320337 kernel: ata2: SATA link down (SStatus 0 SControl 300)
Mar 13 00:33:20.327344 kernel: scsi 1:0:0:0: CD-ROM QEMU QEMU DVD-ROM 2.5+ PQ: 0 ANSI: 5
Mar 13 00:33:20.354754 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller
Mar 13 00:33:20.358367 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 1
Mar 13 00:33:20.358650 kernel: xhci_hcd 0000:02:00.0: hcc params 0x00087001 hci version 0x100 quirks 0x0000000000000010
Mar 13 00:33:20.365743 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller
Mar 13 00:33:20.365925 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 2
Mar 13 00:33:20.368058 kernel: xhci_hcd 0000:02:00.0: Host supports USB 3.0 SuperSpeed
Mar 13 00:33:20.371862 kernel: hub 1-0:1.0: USB hub found
Mar 13 00:33:20.372039 kernel: hub 1-0:1.0: 4 ports detected
Mar 13 00:33:20.376292 kernel: usb usb2: We don't know the algorithms for LPM for this host, disabling LPM.
Mar 13 00:33:20.379293 kernel: hub 2-0:1.0: USB hub found
Mar 13 00:33:20.392458 kernel: sr 1:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray
Mar 13 00:33:20.392653 kernel: hub 2-0:1.0: 4 ports detected
Mar 13 00:33:20.392779 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
Mar 13 00:33:20.405299 kernel: sr 1:0:0:0: Attached scsi CD-ROM sr0
Mar 13 00:33:20.418794 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - QEMU_HARDDISK EFI-SYSTEM.
Mar 13 00:33:20.425745 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - QEMU_HARDDISK ROOT.
Mar 13 00:33:20.433313 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - QEMU_HARDDISK OEM.
Mar 13 00:33:20.439121 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - QEMU_HARDDISK USR-A.
Mar 13 00:33:20.439803 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - QEMU_HARDDISK USR-A.
Mar 13 00:33:20.441790 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary...
Mar 13 00:33:20.462176 disk-uuid[648]: Primary Header is updated.
Mar 13 00:33:20.462176 disk-uuid[648]: Secondary Entries is updated.
Mar 13 00:33:20.462176 disk-uuid[648]: Secondary Header is updated.
Mar 13 00:33:20.475322 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
Mar 13 00:33:20.487292 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
Mar 13 00:33:20.612322 kernel: usb 1-1: new high-speed USB device number 2 using xhci_hcd
Mar 13 00:33:20.629987 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook.
Mar 13 00:33:20.631342 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems.
Mar 13 00:33:20.631688 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Mar 13 00:33:20.632293 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Mar 13 00:33:20.633537 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook...
Mar 13 00:33:20.656026 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook.
Mar 13 00:33:20.762476 kernel: hid: raw HID events driver (C) Jiri Kosina
Mar 13 00:33:20.771638 kernel: usbcore: registered new interface driver usbhid
Mar 13 00:33:20.771701 kernel: usbhid: USB HID core driver
Mar 13 00:33:20.785405 kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:02.1/0000:02:00.0/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input4
Mar 13 00:33:20.785461 kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:02:00.0-1/input0
Mar 13 00:33:21.501376 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
Mar 13 00:33:21.504943 disk-uuid[649]: The operation has completed successfully.
Mar 13 00:33:21.579598 systemd[1]: disk-uuid.service: Deactivated successfully.
Mar 13 00:33:21.579709 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary.
Mar 13 00:33:21.598830 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr...
Mar 13 00:33:21.614226 sh[681]: Success
Mar 13 00:33:21.632125 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Mar 13 00:33:21.632203 kernel: device-mapper: uevent: version 1.0.3
Mar 13 00:33:21.632784 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev
Mar 13 00:33:21.644641 kernel: device-mapper: verity: sha256 using shash "sha256-ni"
Mar 13 00:33:21.688943 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr.
Mar 13 00:33:21.692367 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr...
Mar 13 00:33:21.701383 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr.
Mar 13 00:33:21.709306 kernel: BTRFS: device fsid 503642f8-c59c-4168-97a8-9c3603183fa3 devid 1 transid 37 /dev/mapper/usr (254:0) scanned by mount (694)
Mar 13 00:33:21.714676 kernel: BTRFS info (device dm-0): first mount of filesystem 503642f8-c59c-4168-97a8-9c3603183fa3
Mar 13 00:33:21.714711 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm
Mar 13 00:33:21.724418 kernel: BTRFS info (device dm-0 state E): enabling ssd optimizations
Mar 13 00:33:21.724468 kernel: BTRFS info (device dm-0 state E): disabling log replay at mount time
Mar 13 00:33:21.726584 kernel: BTRFS info (device dm-0 state E): enabling free space tree
Mar 13 00:33:21.730000 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr.
Mar 13 00:33:21.730758 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System.
Mar 13 00:33:21.731359 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met.
Mar 13 00:33:21.733375 systemd[1]: Starting ignition-setup.service - Ignition (setup)...
Mar 13 00:33:21.734307 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline...
Mar 13 00:33:21.766324 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/sda6 (8:6) scanned by mount (729)
Mar 13 00:33:21.766387 kernel: BTRFS info (device sda6): first mount of filesystem 451985e5-e916-48b1-8100-483c174d7b52
Mar 13 00:33:21.771220 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm
Mar 13 00:33:21.777575 kernel: BTRFS info (device sda6): enabling ssd optimizations
Mar 13 00:33:21.777619 kernel: BTRFS info (device sda6): turning on async discard
Mar 13 00:33:21.778314 kernel: BTRFS info (device sda6): enabling free space tree
Mar 13 00:33:21.787425 kernel: BTRFS info (device sda6): last unmount of filesystem 451985e5-e916-48b1-8100-483c174d7b52
Mar 13 00:33:21.788367 systemd[1]: Finished ignition-setup.service - Ignition (setup).
Mar 13 00:33:21.789894 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)...
Mar 13 00:33:21.866742 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Mar 13 00:33:21.870419 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Mar 13 00:33:21.884535 ignition[793]: Ignition 2.22.0
Mar 13 00:33:21.885196 ignition[793]: Stage: fetch-offline
Mar 13 00:33:21.885232 ignition[793]: no configs at "/usr/lib/ignition/base.d"
Mar 13 00:33:21.885242 ignition[793]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Mar 13 00:33:21.885344 ignition[793]: parsed url from cmdline: ""
Mar 13 00:33:21.885349 ignition[793]: no config URL provided
Mar 13 00:33:21.885355 ignition[793]: reading system config file "/usr/lib/ignition/user.ign"
Mar 13 00:33:21.887990 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline).
Mar 13 00:33:21.885364 ignition[793]: no config at "/usr/lib/ignition/user.ign"
Mar 13 00:33:21.885371 ignition[793]: failed to fetch config: resource requires networking
Mar 13 00:33:21.886109 ignition[793]: Ignition finished successfully
Mar 13 00:33:21.906136 systemd-networkd[867]: lo: Link UP
Mar 13 00:33:21.906146 systemd-networkd[867]: lo: Gained carrier
Mar 13 00:33:21.908448 systemd-networkd[867]: Enumeration completed
Mar 13 00:33:21.908532 systemd[1]: Started systemd-networkd.service - Network Configuration.
Mar 13 00:33:21.909040 systemd[1]: Reached target network.target - Network.
Mar 13 00:33:21.909334 systemd-networkd[867]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Mar 13 00:33:21.909338 systemd-networkd[867]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
Mar 13 00:33:21.910989 systemd-networkd[867]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Mar 13 00:33:21.910995 systemd-networkd[867]: eth1: Configuring with /usr/lib/systemd/network/zz-default.network.
Mar 13 00:33:21.911746 systemd-networkd[867]: eth0: Link UP
Mar 13 00:33:21.911919 systemd-networkd[867]: eth0: Gained carrier
Mar 13 00:33:21.911928 systemd-networkd[867]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Mar 13 00:33:21.912441 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)...
Mar 13 00:33:21.914556 systemd-networkd[867]: eth1: Link UP
Mar 13 00:33:21.914920 systemd-networkd[867]: eth1: Gained carrier
Mar 13 00:33:21.914929 systemd-networkd[867]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Mar 13 00:33:21.939214 ignition[871]: Ignition 2.22.0
Mar 13 00:33:21.939224 ignition[871]: Stage: fetch
Mar 13 00:33:21.939339 ignition[871]: no configs at "/usr/lib/ignition/base.d"
Mar 13 00:33:21.939348 ignition[871]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Mar 13 00:33:21.939416 ignition[871]: parsed url from cmdline: ""
Mar 13 00:33:21.939420 ignition[871]: no config URL provided
Mar 13 00:33:21.939424 ignition[871]: reading system config file "/usr/lib/ignition/user.ign"
Mar 13 00:33:21.939432 ignition[871]: no config at "/usr/lib/ignition/user.ign"
Mar 13 00:33:21.939451 ignition[871]: GET http://169.254.169.254/hetzner/v1/userdata: attempt #1
Mar 13 00:33:21.939589 ignition[871]: GET error: Get "http://169.254.169.254/hetzner/v1/userdata": dial tcp 169.254.169.254:80: connect: network is unreachable
Mar 13 00:33:21.948352 systemd-networkd[867]: eth1: DHCPv4 address 10.0.0.3/32 acquired from 10.0.0.1
Mar 13 00:33:21.977401 systemd-networkd[867]: eth0: DHCPv4 address 157.180.117.151/32, gateway 172.31.1.1 acquired from 172.31.1.1
Mar 13 00:33:22.140751 ignition[871]: GET http://169.254.169.254/hetzner/v1/userdata: attempt #2
Mar 13 00:33:22.148983 ignition[871]: GET result: OK
Mar 13 00:33:22.149042 ignition[871]: parsing config with SHA512: 90dc15662729129bd49c1fcd6b7c51811a143a993036c2cdb6cc4d72b6aaab9e25b0d86e485f46adb4bd6943c9d1b050d68b3d3417f5f838160cfa0a28a9cf65
Mar 13 00:33:22.152976 unknown[871]: fetched base config from "system"
Mar 13 00:33:22.153032 unknown[871]: fetched base config from "system"
Mar 13 00:33:22.153037 unknown[871]: fetched user config from "hetzner"
Mar 13 00:33:22.154476 ignition[871]: fetch: fetch complete
Mar 13 00:33:22.154486 ignition[871]: fetch: fetch passed
Mar 13 00:33:22.154589 ignition[871]: Ignition finished successfully
Mar 13 00:33:22.156892 systemd[1]: Finished ignition-fetch.service - Ignition (fetch).
Mar 13 00:33:22.158924 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)...
Mar 13 00:33:22.193705 ignition[878]: Ignition 2.22.0
Mar 13 00:33:22.194318 ignition[878]: Stage: kargs
Mar 13 00:33:22.194468 ignition[878]: no configs at "/usr/lib/ignition/base.d"
Mar 13 00:33:22.194479 ignition[878]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Mar 13 00:33:22.195187 ignition[878]: kargs: kargs passed
Mar 13 00:33:22.195232 ignition[878]: Ignition finished successfully
Mar 13 00:33:22.199775 systemd[1]: Finished ignition-kargs.service - Ignition (kargs).
Mar 13 00:33:22.203584 systemd[1]: Starting ignition-disks.service - Ignition (disks)...
Mar 13 00:33:22.233047 ignition[885]: Ignition 2.22.0
Mar 13 00:33:22.233058 ignition[885]: Stage: disks
Mar 13 00:33:22.236912 systemd[1]: Finished ignition-disks.service - Ignition (disks).
Mar 13 00:33:22.233165 ignition[885]: no configs at "/usr/lib/ignition/base.d"
Mar 13 00:33:22.233176 ignition[885]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Mar 13 00:33:22.233783 ignition[885]: disks: disks passed
Mar 13 00:33:22.238691 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device.
Mar 13 00:33:22.233828 ignition[885]: Ignition finished successfully
Mar 13 00:33:22.239967 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems.
Mar 13 00:33:22.241171 systemd[1]: Reached target local-fs.target - Local File Systems.
Mar 13 00:33:22.242205 systemd[1]: Reached target sysinit.target - System Initialization.
Mar 13 00:33:22.243183 systemd[1]: Reached target basic.target - Basic System.
Mar 13 00:33:22.245907 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT...
Mar 13 00:33:22.273853 systemd-fsck[893]: ROOT: clean, 15/1628000 files, 120826/1617920 blocks
Mar 13 00:33:22.276822 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT.
Mar 13 00:33:22.279348 systemd[1]: Mounting sysroot.mount - /sysroot...
Mar 13 00:33:22.388327 kernel: EXT4-fs (sda9): mounted filesystem 26348f72-0225-4c06-aedc-823e61beebc6 r/w with ordered data mode. Quota mode: none.
Mar 13 00:33:22.389837 systemd[1]: Mounted sysroot.mount - /sysroot.
Mar 13 00:33:22.390367 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System.
Mar 13 00:33:22.392894 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Mar 13 00:33:22.395329 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr...
Mar 13 00:33:22.397553 systemd[1]: Starting flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent...
Mar 13 00:33:22.398405 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot).
Mar 13 00:33:22.399087 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup.
Mar 13 00:33:22.410737 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr.
Mar 13 00:33:22.413385 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup...
Mar 13 00:33:22.428435 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/sda6 (8:6) scanned by mount (901)
Mar 13 00:33:22.437680 kernel: BTRFS info (device sda6): first mount of filesystem 451985e5-e916-48b1-8100-483c174d7b52
Mar 13 00:33:22.437741 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm
Mar 13 00:33:22.451601 kernel: BTRFS info (device sda6): enabling ssd optimizations
Mar 13 00:33:22.451676 kernel: BTRFS info (device sda6): turning on async discard
Mar 13 00:33:22.451711 kernel: BTRFS info (device sda6): enabling free space tree
Mar 13 00:33:22.456178 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Mar 13 00:33:22.481504 coreos-metadata[903]: Mar 13 00:33:22.481 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/hostname: Attempt #1
Mar 13 00:33:22.483758 coreos-metadata[903]: Mar 13 00:33:22.483 INFO Fetch successful
Mar 13 00:33:22.484897 coreos-metadata[903]: Mar 13 00:33:22.484 INFO wrote hostname ci-4459-2-4-n-bcba60e63d to /sysroot/etc/hostname
Mar 13 00:33:22.487538 systemd[1]: Finished flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent.
Mar 13 00:33:22.489147 initrd-setup-root[929]: cut: /sysroot/etc/passwd: No such file or directory
Mar 13 00:33:22.493305 initrd-setup-root[936]: cut: /sysroot/etc/group: No such file or directory
Mar 13 00:33:22.498735 initrd-setup-root[943]: cut: /sysroot/etc/shadow: No such file or directory
Mar 13 00:33:22.502255 initrd-setup-root[950]: cut: /sysroot/etc/gshadow: No such file or directory
Mar 13 00:33:22.592637 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup.
Mar 13 00:33:22.594384 systemd[1]: Starting ignition-mount.service - Ignition (mount)...
Mar 13 00:33:22.596398 systemd[1]: Starting sysroot-boot.service - /sysroot/boot...
Mar 13 00:33:22.618309 kernel: BTRFS info (device sda6): last unmount of filesystem 451985e5-e916-48b1-8100-483c174d7b52
Mar 13 00:33:22.632039 systemd[1]: Finished sysroot-boot.service - /sysroot/boot.
Mar 13 00:33:22.644031 ignition[1019]: INFO : Ignition 2.22.0
Mar 13 00:33:22.644670 ignition[1019]: INFO : Stage: mount
Mar 13 00:33:22.645047 ignition[1019]: INFO : no configs at "/usr/lib/ignition/base.d"
Mar 13 00:33:22.645047 ignition[1019]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Mar 13 00:33:22.645738 ignition[1019]: INFO : mount: mount passed
Mar 13 00:33:22.645738 ignition[1019]: INFO : Ignition finished successfully
Mar 13 00:33:22.646788 systemd[1]: Finished ignition-mount.service - Ignition (mount).
Mar 13 00:33:22.648262 systemd[1]: Starting ignition-files.service - Ignition (files)...
Mar 13 00:33:22.707156 systemd[1]: sysroot-oem.mount: Deactivated successfully.
Mar 13 00:33:22.708455 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Mar 13 00:33:22.730397 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/sda6 (8:6) scanned by mount (1029)
Mar 13 00:33:22.730442 kernel: BTRFS info (device sda6): first mount of filesystem 451985e5-e916-48b1-8100-483c174d7b52
Mar 13 00:33:22.730456 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm
Mar 13 00:33:22.739475 kernel: BTRFS info (device sda6): enabling ssd optimizations
Mar 13 00:33:22.739506 kernel: BTRFS info (device sda6): turning on async discard
Mar 13 00:33:22.739519 kernel: BTRFS info (device sda6): enabling free space tree
Mar 13 00:33:22.743120 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Mar 13 00:33:22.769949 ignition[1045]: INFO : Ignition 2.22.0
Mar 13 00:33:22.769949 ignition[1045]: INFO : Stage: files
Mar 13 00:33:22.771243 ignition[1045]: INFO : no configs at "/usr/lib/ignition/base.d"
Mar 13 00:33:22.771243 ignition[1045]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Mar 13 00:33:22.771243 ignition[1045]: DEBUG : files: compiled without relabeling support, skipping
Mar 13 00:33:22.772586 ignition[1045]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core"
Mar 13 00:33:22.772586 ignition[1045]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core"
Mar 13 00:33:22.773904 ignition[1045]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core"
Mar 13 00:33:22.773904 ignition[1045]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core"
Mar 13 00:33:22.774698 ignition[1045]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core"
Mar 13 00:33:22.774421 unknown[1045]: wrote ssh authorized keys file for user: core
Mar 13 00:33:22.776138 ignition[1045]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.17.3-linux-amd64.tar.gz"
Mar 13 00:33:22.776707 ignition[1045]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.17.3-linux-amd64.tar.gz: attempt #1
Mar 13 00:33:22.997328 ignition[1045]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK
Mar 13 00:33:23.302038 ignition[1045]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.3-linux-amd64.tar.gz"
Mar 13 00:33:23.303307 ignition[1045]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh"
Mar 13 00:33:23.303307 ignition[1045]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh"
Mar 13 00:33:23.303307 ignition[1045]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml"
Mar 13 00:33:23.303307 ignition[1045]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml"
Mar 13 00:33:23.303307 ignition[1045]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml"
Mar 13 00:33:23.303307 ignition[1045]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml"
Mar 13 00:33:23.303307 ignition[1045]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml"
Mar 13 00:33:23.303307 ignition[1045]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml"
Mar 13 00:33:23.308188 ignition[1045]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf"
Mar 13 00:33:23.308188 ignition[1045]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf"
Mar 13 00:33:23.308188 ignition[1045]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.34.4-x86-64.raw"
Mar 13 00:33:23.310632 ignition[1045]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.34.4-x86-64.raw"
Mar 13 00:33:23.310632 ignition[1045]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.34.4-x86-64.raw"
Mar 13 00:33:23.310632 ignition[1045]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.34.4-x86-64.raw: attempt #1
Mar 13 00:33:23.384483 systemd-networkd[867]: eth0: Gained IPv6LL
Mar 13 00:33:23.640509 systemd-networkd[867]: eth1: Gained IPv6LL
Mar 13 00:33:23.985904 ignition[1045]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK
Mar 13 00:33:24.271792 ignition[1045]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.34.4-x86-64.raw"
Mar 13 00:33:24.271792 ignition[1045]: INFO : files: op(b): [started] processing unit "prepare-helm.service"
Mar 13 00:33:24.273394 ignition[1045]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Mar 13 00:33:24.281069 ignition[1045]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Mar 13 00:33:24.281069 ignition[1045]: INFO : files: op(b): [finished] processing unit "prepare-helm.service"
Mar 13 00:33:24.282377 ignition[1045]: INFO : files: op(d): [started] processing unit "coreos-metadata.service"
Mar 13 00:33:24.282377 ignition[1045]: INFO : files: op(d): op(e): [started] writing systemd drop-in "00-custom-metadata.conf" at "/sysroot/etc/systemd/system/coreos-metadata.service.d/00-custom-metadata.conf"
Mar 13 00:33:24.282377 ignition[1045]: INFO : files: op(d): op(e): [finished] writing systemd drop-in "00-custom-metadata.conf" at "/sysroot/etc/systemd/system/coreos-metadata.service.d/00-custom-metadata.conf"
Mar 13 00:33:24.282377 ignition[1045]: INFO : files: op(d): [finished] processing unit "coreos-metadata.service"
Mar 13 00:33:24.282377 ignition[1045]: INFO : files: op(f): [started] setting preset to enabled for "prepare-helm.service"
Mar 13 00:33:24.282377 ignition[1045]: INFO : files: op(f): [finished] setting preset to enabled for "prepare-helm.service"
Mar 13 00:33:24.282377 ignition[1045]: INFO : files: createResultFile: createFiles: op(10): [started] writing file "/sysroot/etc/.ignition-result.json"
Mar 13 00:33:24.282377 ignition[1045]: INFO : files: createResultFile: createFiles: op(10): [finished] writing file "/sysroot/etc/.ignition-result.json"
Mar 13 00:33:24.282377 ignition[1045]: INFO : files: files passed
Mar 13 00:33:24.282377 ignition[1045]: INFO : Ignition finished successfully
Mar 13 00:33:24.287430 systemd[1]: Finished ignition-files.service - Ignition (files).
Mar 13 00:33:24.289355 systemd[1]: Starting ignition-quench.service - Ignition (record completion)...
Mar 13 00:33:24.290623 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion...
Mar 13 00:33:24.301890 systemd[1]: ignition-quench.service: Deactivated successfully.
Mar 13 00:33:24.301989 systemd[1]: Finished ignition-quench.service - Ignition (record completion).
Mar 13 00:33:24.314041 initrd-setup-root-after-ignition[1076]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Mar 13 00:33:24.314041 initrd-setup-root-after-ignition[1076]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory
Mar 13 00:33:24.316352 initrd-setup-root-after-ignition[1079]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Mar 13 00:33:24.319117 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion.
Mar 13 00:33:24.319987 systemd[1]: Reached target ignition-complete.target - Ignition Complete.
Mar 13 00:33:24.321522 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root...
Mar 13 00:33:24.373707 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Mar 13 00:33:24.373818 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root.
Mar 13 00:33:24.375401 systemd[1]: Reached target initrd-fs.target - Initrd File Systems.
Mar 13 00:33:24.376826 systemd[1]: Reached target initrd.target - Initrd Default Target.
Mar 13 00:33:24.377346 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met.
Mar 13 00:33:24.379379 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook...
Mar 13 00:33:24.420735 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Mar 13 00:33:24.425039 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons...
Mar 13 00:33:24.450618 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups.
Mar 13 00:33:24.452032 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes.
Mar 13 00:33:24.453414 systemd[1]: Stopped target timers.target - Timer Units.
Mar 13 00:33:24.454750 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Mar 13 00:33:24.454944 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Mar 13 00:33:24.456955 systemd[1]: Stopped target initrd.target - Initrd Default Target.
Mar 13 00:33:24.458382 systemd[1]: Stopped target basic.target - Basic System.
Mar 13 00:33:24.459815 systemd[1]: Stopped target ignition-complete.target - Ignition Complete.
Mar 13 00:33:24.461235 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup.
Mar 13 00:33:24.462448 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device.
Mar 13 00:33:24.463611 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System.
Mar 13 00:33:24.464776 systemd[1]: Stopped target remote-fs.target - Remote File Systems.
Mar 13 00:33:24.465704 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems.
Mar 13 00:33:24.466908 systemd[1]: Stopped target sysinit.target - System Initialization.
Mar 13 00:33:24.468093 systemd[1]: Stopped target local-fs.target - Local File Systems.
Mar 13 00:33:24.469128 systemd[1]: Stopped target swap.target - Swaps.
Mar 13 00:33:24.470234 systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Mar 13 00:33:24.470495 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook.
Mar 13 00:33:24.472085 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes.
Mar 13 00:33:24.473094 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Mar 13 00:33:24.474041 systemd[1]: clevis-luks-askpass.path: Deactivated successfully.
Mar 13 00:33:24.474203 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Mar 13 00:33:24.474978 systemd[1]: dracut-initqueue.service: Deactivated successfully.
Mar 13 00:33:24.475120 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook.
Mar 13 00:33:24.476706 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully.
Mar 13 00:33:24.476938 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion.
Mar 13 00:33:24.478245 systemd[1]: ignition-files.service: Deactivated successfully.
Mar 13 00:33:24.478487 systemd[1]: Stopped ignition-files.service - Ignition (files).
Mar 13 00:33:24.479635 systemd[1]: flatcar-metadata-hostname.service: Deactivated successfully.
Mar 13 00:33:24.479817 systemd[1]: Stopped flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent.
Mar 13 00:33:24.482240 systemd[1]: Stopping ignition-mount.service - Ignition (mount)...
Mar 13 00:33:24.485413 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot...
Mar 13 00:33:24.485816 systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Mar 13 00:33:24.485930 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices.
Mar 13 00:33:24.486618 systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Mar 13 00:33:24.486726 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook.
Mar 13 00:33:24.491577 systemd[1]: initrd-cleanup.service: Deactivated successfully.
Mar 13 00:33:24.492081 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons.
Mar 13 00:33:24.507599 ignition[1100]: INFO : Ignition 2.22.0
Mar 13 00:33:24.507599 ignition[1100]: INFO : Stage: umount
Mar 13 00:33:24.509574 ignition[1100]: INFO : no configs at "/usr/lib/ignition/base.d"
Mar 13 00:33:24.509574 ignition[1100]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Mar 13 00:33:24.509574 ignition[1100]: INFO : umount: umount passed
Mar 13 00:33:24.509574 ignition[1100]: INFO : Ignition finished successfully
Mar 13 00:33:24.511229 systemd[1]: sysroot-boot.mount: Deactivated successfully.
Mar 13 00:33:24.511786 systemd[1]: ignition-mount.service: Deactivated successfully.
Mar 13 00:33:24.513967 systemd[1]: Stopped ignition-mount.service - Ignition (mount).
Mar 13 00:33:24.514769 systemd[1]: ignition-disks.service: Deactivated successfully.
Mar 13 00:33:24.514837 systemd[1]: Stopped ignition-disks.service - Ignition (disks).
Mar 13 00:33:24.515205 systemd[1]: ignition-kargs.service: Deactivated successfully.
Mar 13 00:33:24.515240 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs).
Mar 13 00:33:24.515638 systemd[1]: ignition-fetch.service: Deactivated successfully.
Mar 13 00:33:24.515672 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch).
Mar 13 00:33:24.516333 systemd[1]: Stopped target network.target - Network.
Mar 13 00:33:24.516948 systemd[1]: ignition-fetch-offline.service: Deactivated successfully.
Mar 13 00:33:24.516990 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline).
Mar 13 00:33:24.517649 systemd[1]: Stopped target paths.target - Path Units.
Mar 13 00:33:24.518240 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Mar 13 00:33:24.522312 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Mar 13 00:33:24.522689 systemd[1]: Stopped target slices.target - Slice Units.
Mar 13 00:33:24.523345 systemd[1]: Stopped target sockets.target - Socket Units.
Mar 13 00:33:24.525628 systemd[1]: iscsid.socket: Deactivated successfully.
Mar 13 00:33:24.525667 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket.
Mar 13 00:33:24.525981 systemd[1]: iscsiuio.socket: Deactivated successfully.
Mar 13 00:33:24.526009 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Mar 13 00:33:24.526325 systemd[1]: ignition-setup.service: Deactivated successfully.
Mar 13 00:33:24.526362 systemd[1]: Stopped ignition-setup.service - Ignition (setup).
Mar 13 00:33:24.526667 systemd[1]: ignition-setup-pre.service: Deactivated successfully.
Mar 13 00:33:24.526698 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup.
Mar 13 00:33:24.527134 systemd[1]: Stopping systemd-networkd.service - Network Configuration...
Mar 13 00:33:24.527517 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution...
Mar 13 00:33:24.539078 systemd[1]: systemd-resolved.service: Deactivated successfully.
Mar 13 00:33:24.539174 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution.
Mar 13 00:33:24.543604 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully.
Mar 13 00:33:24.543870 systemd[1]: systemd-networkd.service: Deactivated successfully.
Mar 13 00:33:24.543966 systemd[1]: Stopped systemd-networkd.service - Network Configuration.
Mar 13 00:33:24.545859 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully.
Mar 13 00:33:24.546068 systemd[1]: sysroot-boot.service: Deactivated successfully.
Mar 13 00:33:24.546148 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot.
Mar 13 00:33:24.547880 systemd[1]: Stopped target network-pre.target - Preparation for Network.
Mar 13 00:33:24.548256 systemd[1]: systemd-networkd.socket: Deactivated successfully.
Mar 13 00:33:24.548316 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket.
Mar 13 00:33:24.548866 systemd[1]: initrd-setup-root.service: Deactivated successfully.
Mar 13 00:33:24.548910 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup.
Mar 13 00:33:24.551351 systemd[1]: Stopping network-cleanup.service - Network Cleanup...
Mar 13 00:33:24.551676 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully.
Mar 13 00:33:24.551718 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Mar 13 00:33:24.552333 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Mar 13 00:33:24.552370 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables.
Mar 13 00:33:24.552987 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Mar 13 00:33:24.553021 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules.
Mar 13 00:33:24.554409 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Mar 13 00:33:24.554471 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories.
Mar 13 00:33:24.556284 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files...
Mar 13 00:33:24.561574 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Mar 13 00:33:24.561638 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Mar 13 00:33:24.568902 systemd[1]: systemd-udevd.service: Deactivated successfully.
Mar 13 00:33:24.570425 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files.
Mar 13 00:33:24.571001 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Mar 13 00:33:24.571050 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket.
Mar 13 00:33:24.571415 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Mar 13 00:33:24.571443 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket.
Mar 13 00:33:24.571778 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Mar 13 00:33:24.571817 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook.
Mar 13 00:33:24.572210 systemd[1]: dracut-cmdline.service: Deactivated successfully.
Mar 13 00:33:24.572243 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook.
Mar 13 00:33:24.574597 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
Mar 13 00:33:24.574645 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Mar 13 00:33:24.576411 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database...
Mar 13 00:33:24.577341 systemd[1]: systemd-network-generator.service: Deactivated successfully.
Mar 13 00:33:24.577385 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line.
Mar 13 00:33:24.580484 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Mar 13 00:33:24.580530 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Mar 13 00:33:24.581352 systemd[1]: systemd-tmpfiles-setup-dev-early.service: Deactivated successfully.
Mar 13 00:33:24.581389 systemd[1]: Stopped systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Mar 13 00:33:24.582035 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Mar 13 00:33:24.582070 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes.
Mar 13 00:33:24.582679 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Mar 13 00:33:24.582714 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Mar 13 00:33:24.584445 systemd[1]: run-credentials-systemd\x2dnetwork\x2dgenerator.service.mount: Deactivated successfully.
Mar 13 00:33:24.584490 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev\x2dearly.service.mount: Deactivated successfully.
Mar 13 00:33:24.584524 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully.
Mar 13 00:33:24.584560 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully.
Mar 13 00:33:24.584841 systemd[1]: network-cleanup.service: Deactivated successfully.
Mar 13 00:33:24.586368 systemd[1]: Stopped network-cleanup.service - Network Cleanup.
Mar 13 00:33:24.592480 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Mar 13 00:33:24.592579 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database.
Mar 13 00:33:24.593545 systemd[1]: Reached target initrd-switch-root.target - Switch Root.
Mar 13 00:33:24.594622 systemd[1]: Starting initrd-switch-root.service - Switch Root...
Mar 13 00:33:24.608696 systemd[1]: Switching root.
Mar 13 00:33:24.657267 systemd-journald[197]: Journal stopped
Mar 13 00:33:25.765589 systemd-journald[197]: Received SIGTERM from PID 1 (systemd).
Mar 13 00:33:25.768340 kernel: SELinux: policy capability network_peer_controls=1
Mar 13 00:33:25.768355 kernel: SELinux: policy capability open_perms=1
Mar 13 00:33:25.768364 kernel: SELinux: policy capability extended_socket_class=1
Mar 13 00:33:25.768373 kernel: SELinux: policy capability always_check_network=0
Mar 13 00:33:25.768382 kernel: SELinux: policy capability cgroup_seclabel=1
Mar 13 00:33:25.768394 kernel: SELinux: policy capability nnp_nosuid_transition=1
Mar 13 00:33:25.768402 kernel: SELinux: policy capability genfs_seclabel_symlinks=0
Mar 13 00:33:25.768411 kernel: SELinux: policy capability ioctl_skip_cloexec=0
Mar 13 00:33:25.768419 kernel: SELinux: policy capability userspace_initial_context=0
Mar 13 00:33:25.768427 kernel: audit: type=1403 audit(1773362004.805:2): auid=4294967295 ses=4294967295 lsm=selinux res=1
Mar 13 00:33:25.768437 systemd[1]: Successfully loaded SELinux policy in 62.572ms.
Mar 13 00:33:25.768460 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 6.505ms.
Mar 13 00:33:25.768476 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE)
Mar 13 00:33:25.768485 systemd[1]: Detected virtualization kvm.
Mar 13 00:33:25.768496 systemd[1]: Detected architecture x86-64.
Mar 13 00:33:25.768504 systemd[1]: Detected first boot.
Mar 13 00:33:25.768513 systemd[1]: Hostname set to .
Mar 13 00:33:25.768522 systemd[1]: Initializing machine ID from VM UUID.
Mar 13 00:33:25.768532 zram_generator::config[1144]: No configuration found.
Mar 13 00:33:25.768542 kernel: Guest personality initialized and is inactive
Mar 13 00:33:25.768551 kernel: VMCI host device registered (name=vmci, major=10, minor=258)
Mar 13 00:33:25.768559 kernel: Initialized host personality
Mar 13 00:33:25.768570 kernel: NET: Registered PF_VSOCK protocol family
Mar 13 00:33:25.768579 systemd[1]: Populated /etc with preset unit settings.
Mar 13 00:33:25.768590 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully.
Mar 13 00:33:25.768599 systemd[1]: initrd-switch-root.service: Deactivated successfully.
Mar 13 00:33:25.768608 systemd[1]: Stopped initrd-switch-root.service - Switch Root.
Mar 13 00:33:25.768616 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Mar 13 00:33:25.768625 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config.
Mar 13 00:33:25.768634 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run.
Mar 13 00:33:25.768645 systemd[1]: Created slice system-getty.slice - Slice /system/getty.
Mar 13 00:33:25.768654 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe.
Mar 13 00:33:25.768662 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty.
Mar 13 00:33:25.768671 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit.
Mar 13 00:33:25.768680 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck.
Mar 13 00:33:25.768688 systemd[1]: Created slice user.slice - User and Session Slice.
Mar 13 00:33:25.768697 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Mar 13 00:33:25.768706 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Mar 13 00:33:25.768717 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch.
Mar 13 00:33:25.768727 systemd[1]: Set up automount boot.automount - Boot partition Automount Point.
Mar 13 00:33:25.768736 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point.
Mar 13 00:33:25.768745 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Mar 13 00:33:25.768754 systemd[1]: Expecting device dev-ttyS0.device - /dev/ttyS0...
Mar 13 00:33:25.768763 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Mar 13 00:33:25.768774 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Mar 13 00:33:25.768783 systemd[1]: Stopped target initrd-switch-root.target - Switch Root.
Mar 13 00:33:25.768792 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems.
Mar 13 00:33:25.768801 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System.
Mar 13 00:33:25.768809 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes.
Mar 13 00:33:25.768818 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Mar 13 00:33:25.768827 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Mar 13 00:33:25.768835 systemd[1]: Reached target slices.target - Slice Units.
Mar 13 00:33:25.768844 systemd[1]: Reached target swap.target - Swaps.
Mar 13 00:33:25.768853 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes.
Mar 13 00:33:25.768864 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket.
Mar 13 00:33:25.768873 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption.
Mar 13 00:33:25.768881 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Mar 13 00:33:25.768890 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Mar 13 00:33:25.768899 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Mar 13 00:33:25.768908 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket.
Mar 13 00:33:25.768917 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System...
Mar 13 00:33:25.768926 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System...
Mar 13 00:33:25.768934 systemd[1]: Mounting media.mount - External Media Directory...
Mar 13 00:33:25.768945 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Mar 13 00:33:25.768954 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System...
Mar 13 00:33:25.768962 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System...
Mar 13 00:33:25.768971 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp...
Mar 13 00:33:25.768979 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Mar 13 00:33:25.768988 systemd[1]: Reached target machines.target - Containers.
Mar 13 00:33:25.769000 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files...
Mar 13 00:33:25.769009 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Mar 13 00:33:25.769020 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Mar 13 00:33:25.769029 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs...
Mar 13 00:33:25.769037 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Mar 13 00:33:25.769046 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Mar 13 00:33:25.769055 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Mar 13 00:33:25.769063 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse...
Mar 13 00:33:25.769072 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Mar 13 00:33:25.769081 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf).
Mar 13 00:33:25.769091 systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Mar 13 00:33:25.769100 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device.
Mar 13 00:33:25.769109 systemd[1]: systemd-fsck-usr.service: Deactivated successfully.
Mar 13 00:33:25.769118 systemd[1]: Stopped systemd-fsck-usr.service.
Mar 13 00:33:25.769129 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Mar 13 00:33:25.769140 kernel: fuse: init (API version 7.41)
Mar 13 00:33:25.769148 systemd[1]: Starting systemd-journald.service - Journal Service...
Mar 13 00:33:25.769157 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Mar 13 00:33:25.769167 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
Mar 13 00:33:25.769176 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems...
Mar 13 00:33:25.769184 kernel: ACPI: bus type drm_connector registered
Mar 13 00:33:25.769195 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials...
Mar 13 00:33:25.769203 kernel: loop: module loaded
Mar 13 00:33:25.769211 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Mar 13 00:33:25.769220 systemd[1]: verity-setup.service: Deactivated successfully.
Mar 13 00:33:25.769229 systemd[1]: Stopped verity-setup.service.
Mar 13 00:33:25.769238 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Mar 13 00:33:25.769247 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System.
Mar 13 00:33:25.769290 systemd-journald[1225]: Collecting audit messages is disabled.
Mar 13 00:33:25.769321 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System.
Mar 13 00:33:25.769331 systemd[1]: Mounted media.mount - External Media Directory.
Mar 13 00:33:25.769339 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System.
Mar 13 00:33:25.769348 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System.
Mar 13 00:33:25.769357 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp.
Mar 13 00:33:25.769366 systemd-journald[1225]: Journal started
Mar 13 00:33:25.769385 systemd-journald[1225]: Runtime Journal (/run/log/journal/3343a34f6f734c2e8a0167e39c728900) is 8M, max 76.1M, 68.1M free.
Mar 13 00:33:25.421004 systemd[1]: Queued start job for default target multi-user.target.
Mar 13 00:33:25.433612 systemd[1]: Unnecessary job was removed for dev-sda6.device - /dev/sda6.
Mar 13 00:33:25.434035 systemd[1]: systemd-journald.service: Deactivated successfully.
Mar 13 00:33:25.772294 systemd[1]: Started systemd-journald.service - Journal Service.
Mar 13 00:33:25.773115 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files.
Mar 13 00:33:25.773857 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Mar 13 00:33:25.774544 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Mar 13 00:33:25.774765 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs.
Mar 13 00:33:25.775471 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Mar 13 00:33:25.775686 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Mar 13 00:33:25.776563 systemd[1]: modprobe@drm.service: Deactivated successfully.
Mar 13 00:33:25.776784 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Mar 13 00:33:25.777442 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Mar 13 00:33:25.777644 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Mar 13 00:33:25.778389 systemd[1]: modprobe@fuse.service: Deactivated successfully.
Mar 13 00:33:25.778593 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse.
Mar 13 00:33:25.779211 systemd[1]: modprobe@loop.service: Deactivated successfully.
Mar 13 00:33:25.779427 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Mar 13 00:33:25.780094 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Mar 13 00:33:25.780864 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Mar 13 00:33:25.781515 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems.
Mar 13 00:33:25.782133 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials.
Mar 13 00:33:25.792122 systemd[1]: Reached target network-pre.target - Preparation for Network.
Mar 13 00:33:25.796344 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System...
Mar 13 00:33:25.798412 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System...
Mar 13 00:33:25.798791 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/).
Mar 13 00:33:25.798812 systemd[1]: Reached target local-fs.target - Local File Systems.
Mar 13 00:33:25.799824 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management.
Mar 13 00:33:25.804618 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache...
Mar 13 00:33:25.805141 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Mar 13 00:33:25.809379 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database...
Mar 13 00:33:25.813475 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage...
Mar 13 00:33:25.813854 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Mar 13 00:33:25.816750 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed...
Mar 13 00:33:25.817660 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Mar 13 00:33:25.820354 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Mar 13 00:33:25.824431 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/...
Mar 13 00:33:25.826587 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
Mar 13 00:33:25.830933 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System.
Mar 13 00:33:25.839635 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System.
Mar 13 00:33:25.842622 systemd-journald[1225]: Time spent on flushing to /var/log/journal/3343a34f6f734c2e8a0167e39c728900 is 52.509ms for 1248 entries.
Mar 13 00:33:25.842622 systemd-journald[1225]: System Journal (/var/log/journal/3343a34f6f734c2e8a0167e39c728900) is 8M, max 584.8M, 576.8M free.
Mar 13 00:33:25.914522 systemd-journald[1225]: Received client request to flush runtime journal.
Mar 13 00:33:25.914559 kernel: loop0: detected capacity change from 0 to 110984
Mar 13 00:33:25.865452 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed.
Mar 13 00:33:25.867573 systemd[1]: Reached target first-boot-complete.target - First Boot Complete.
Mar 13 00:33:25.875663 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk...
Mar 13 00:33:25.880354 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Mar 13 00:33:25.928738 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher
Mar 13 00:33:25.924718 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Mar 13 00:33:25.926097 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage.
Mar 13 00:33:25.940497 systemd-tmpfiles[1270]: ACLs are not supported, ignoring.
Mar 13 00:33:25.940963 systemd-tmpfiles[1270]: ACLs are not supported, ignoring.
Mar 13 00:33:25.948479 kernel: loop1: detected capacity change from 0 to 128560
Mar 13 00:33:25.949592 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Mar 13 00:33:25.954428 systemd[1]: Starting systemd-sysusers.service - Create System Users...
Mar 13 00:33:25.956309 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk.
Mar 13 00:33:25.986297 kernel: loop2: detected capacity change from 0 to 8
Mar 13 00:33:26.007323 kernel: loop3: detected capacity change from 0 to 219192
Mar 13 00:33:26.009620 systemd[1]: Finished systemd-sysusers.service - Create System Users.
Mar 13 00:33:26.013250 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Mar 13 00:33:26.045313 kernel: loop4: detected capacity change from 0 to 110984
Mar 13 00:33:26.048490 systemd-tmpfiles[1294]: ACLs are not supported, ignoring.
Mar 13 00:33:26.048899 systemd-tmpfiles[1294]: ACLs are not supported, ignoring.
Mar 13 00:33:26.056499 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Mar 13 00:33:26.064577 kernel: loop5: detected capacity change from 0 to 128560
Mar 13 00:33:26.077392 kernel: loop6: detected capacity change from 0 to 8
Mar 13 00:33:26.082380 kernel: loop7: detected capacity change from 0 to 219192
Mar 13 00:33:26.104879 (sd-merge)[1297]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-hetzner'.
Mar 13 00:33:26.106341 (sd-merge)[1297]: Merged extensions into '/usr'.
Mar 13 00:33:26.111658 systemd[1]: Reload requested from client PID 1269 ('systemd-sysext') (unit systemd-sysext.service)...
Mar 13 00:33:26.111755 systemd[1]: Reloading...
Mar 13 00:33:26.193322 zram_generator::config[1324]: No configuration found.
Mar 13 00:33:26.266575 ldconfig[1264]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start.
Mar 13 00:33:26.377410 systemd[1]: etc-machine\x2did.mount: Deactivated successfully.
Mar 13 00:33:26.377689 systemd[1]: Reloading finished in 265 ms.
Mar 13 00:33:26.406148 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache.
Mar 13 00:33:26.407075 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database.
Mar 13 00:33:26.409696 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/.
Mar 13 00:33:26.420521 systemd[1]: Starting ensure-sysext.service...
Mar 13 00:33:26.424435 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Mar 13 00:33:26.429853 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Mar 13 00:33:26.450510 systemd[1]: Reload requested from client PID 1368 ('systemctl') (unit ensure-sysext.service)...
Mar 13 00:33:26.450691 systemd[1]: Reloading...
Mar 13 00:33:26.452842 systemd-tmpfiles[1369]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring.
Mar 13 00:33:26.452871 systemd-tmpfiles[1369]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring.
Mar 13 00:33:26.453091 systemd-tmpfiles[1369]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring.
Mar 13 00:33:26.457282 systemd-tmpfiles[1369]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring.
Mar 13 00:33:26.458064 systemd-tmpfiles[1369]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring.
Mar 13 00:33:26.458282 systemd-tmpfiles[1369]: ACLs are not supported, ignoring.
Mar 13 00:33:26.458353 systemd-tmpfiles[1369]: ACLs are not supported, ignoring.
Mar 13 00:33:26.471452 systemd-tmpfiles[1369]: Detected autofs mount point /boot during canonicalization of boot.
Mar 13 00:33:26.471463 systemd-tmpfiles[1369]: Skipping /boot
Mar 13 00:33:26.472497 systemd-udevd[1370]: Using default interface naming scheme 'v255'.
Mar 13 00:33:26.483265 systemd-tmpfiles[1369]: Detected autofs mount point /boot during canonicalization of boot.
Mar 13 00:33:26.483377 systemd-tmpfiles[1369]: Skipping /boot
Mar 13 00:33:26.535298 zram_generator::config[1397]: No configuration found.
Mar 13 00:33:26.712310 kernel: mousedev: PS/2 mouse device common for all mice
Mar 13 00:33:26.743034 systemd[1]: Reloading finished in 291 ms.
Mar 13 00:33:26.750427 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Mar 13 00:33:26.751728 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Mar 13 00:33:26.777049 systemd[1]: Condition check resulted in dev-ttyS0.device - /dev/ttyS0 being skipped.
Mar 13 00:33:26.795502 systemd[1]: Starting audit-rules.service - Load Audit Rules...
Mar 13 00:33:26.797496 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs...
Mar 13 00:33:26.800189 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog...
Mar 13 00:33:26.804988 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Mar 13 00:33:26.807611 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Mar 13 00:33:26.811789 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP...
Mar 13 00:33:26.817043 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Mar 13 00:33:26.817178 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Mar 13 00:33:26.820469 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Mar 13 00:33:26.822861 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Mar 13 00:33:26.829194 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Mar 13 00:33:26.829935 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Mar 13 00:33:26.830012 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Mar 13 00:33:26.830070 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Mar 13 00:33:26.833401 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Mar 13 00:33:26.833538 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Mar 13 00:33:26.833649 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Mar 13 00:33:26.833698 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Mar 13 00:33:26.833748 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Mar 13 00:33:26.840282 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input5
Mar 13 00:33:26.842032 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Mar 13 00:33:26.842233 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Mar 13 00:33:26.847379 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Mar 13 00:33:26.848060 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Mar 13 00:33:26.848393 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Mar 13 00:33:26.854939 kernel: ACPI: button: Power Button [PWRF]
Mar 13 00:33:26.852491 systemd[1]: Starting systemd-userdbd.service - User Database Manager...
Mar 13 00:33:26.853374 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Mar 13 00:33:26.861743 systemd[1]: Finished ensure-sysext.service.
Mar 13 00:33:26.863064 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP.
Mar 13 00:33:26.870438 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization...
Mar 13 00:33:26.871114 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog.
Mar 13 00:33:26.881455 systemd[1]: Starting systemd-update-done.service - Update is Completed...
Mar 13 00:33:26.901055 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Mar 13 00:33:26.901424 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Mar 13 00:33:26.902621 systemd[1]: modprobe@drm.service: Deactivated successfully.
Mar 13 00:33:26.902810 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Mar 13 00:33:26.905717 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Mar 13 00:33:26.911819 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Mar 13 00:33:26.913173 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Mar 13 00:33:26.916706 systemd[1]: modprobe@loop.service: Deactivated successfully.
Mar 13 00:33:26.916975 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Mar 13 00:33:26.918903 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Mar 13 00:33:26.922657 systemd[1]: Finished systemd-update-done.service - Update is Completed.
Mar 13 00:33:26.930327 kernel: i801_smbus 0000:00:1f.3: Enabling SMBus device
Mar 13 00:33:26.932289 kernel: i801_smbus 0000:00:1f.3: SMBus using PCI interrupt
Mar 13 00:33:26.936678 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - QEMU_HARDDISK OEM.
Mar 13 00:33:26.939446 kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD
Mar 13 00:33:26.941794 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM...
Mar 13 00:33:26.944706 systemd[1]: Condition check resulted in dev-virtio\x2dports-org.qemu.guest_agent.0.device - /dev/virtio-ports/org.qemu.guest_agent.0 being skipped.
Mar 13 00:33:26.944747 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Mar 13 00:33:26.944841 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Mar 13 00:33:26.947408 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Mar 13 00:33:26.953715 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Mar 13 00:33:26.956909 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Mar 13 00:33:26.957327 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Mar 13 00:33:26.957355 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Mar 13 00:33:26.957371 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Mar 13 00:33:26.957493 systemd[1]: Started systemd-userdbd.service - User Database Manager.
Mar 13 00:33:26.963800 augenrules[1535]: No rules
Mar 13 00:33:26.963657 systemd[1]: audit-rules.service: Deactivated successfully.
Mar 13 00:33:26.965426 systemd[1]: Finished audit-rules.service - Load Audit Rules.
Mar 13 00:33:26.975373 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs.
Mar 13 00:33:26.975779 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt).
Mar 13 00:33:26.994675 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM.
Mar 13 00:33:27.000592 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Mar 13 00:33:27.002446 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Mar 13 00:33:27.008670 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Mar 13 00:33:27.008860 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Mar 13 00:33:27.019849 systemd[1]: modprobe@loop.service: Deactivated successfully.
Mar 13 00:33:27.021570 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Mar 13 00:33:27.022071 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Mar 13 00:33:27.022098 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Mar 13 00:33:27.022405 kernel: [drm] pci: virtio-vga detected at 0000:00:01.0
Mar 13 00:33:27.031588 kernel: Console: switching to colour dummy device 80x25
Mar 13 00:33:27.039291 kernel: virtio-pci 0000:00:01.0: vgaarb: deactivate vga console
Mar 13 00:33:27.051564 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Mar 13 00:33:27.058624 kernel: [drm] features: -virgl +edid -resource_blob -host_visible
Mar 13 00:33:27.058660 kernel: [drm] features: -context_init
Mar 13 00:33:27.063293 kernel: EDAC MC: Ver: 3.0.0
Mar 13 00:33:27.065162 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Mar 13 00:33:27.065373 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Mar 13 00:33:27.069238 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Mar 13 00:33:27.093833 kernel: [drm] number of scanouts: 1
Mar 13 00:33:27.093893 kernel: [drm] number of cap sets: 0
Mar 13 00:33:27.101299 kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:00:01.0 on minor 0
Mar 13 00:33:27.109335 kernel: fbcon: virtio_gpudrmfb (fb0) is primary device
Mar 13 00:33:27.109373 kernel: Console: switching to colour frame buffer device 160x50
Mar 13 00:33:27.122298 kernel: virtio-pci 0000:00:01.0: [drm] fb0: virtio_gpudrmfb frame buffer device
Mar 13 00:33:27.125778 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Mar 13 00:33:27.127324 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Mar 13 00:33:27.129888 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Mar 13 00:33:27.180463 systemd-networkd[1490]: lo: Link UP
Mar 13 00:33:27.180473 systemd-networkd[1490]: lo: Gained carrier
Mar 13 00:33:27.185192 systemd-networkd[1490]: Enumeration completed
Mar 13 00:33:27.185295 systemd[1]: Started systemd-networkd.service - Network Configuration.
Mar 13 00:33:27.187799 systemd-networkd[1490]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Mar 13 00:33:27.187810 systemd-networkd[1490]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
Mar 13 00:33:27.188889 systemd-networkd[1490]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Mar 13 00:33:27.188897 systemd-networkd[1490]: eth1: Configuring with /usr/lib/systemd/network/zz-default.network.
Mar 13 00:33:27.189167 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd...
Mar 13 00:33:27.193858 systemd-networkd[1490]: eth0: Link UP
Mar 13 00:33:27.193987 systemd-networkd[1490]: eth0: Gained carrier
Mar 13 00:33:27.194003 systemd-networkd[1490]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Mar 13 00:33:27.195778 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured...
Mar 13 00:33:27.198807 systemd-networkd[1490]: eth1: Link UP
Mar 13 00:33:27.199529 systemd-networkd[1490]: eth1: Gained carrier
Mar 13 00:33:27.199547 systemd-networkd[1490]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Mar 13 00:33:27.209967 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization.
Mar 13 00:33:27.210238 systemd[1]: Reached target time-set.target - System Time Set.
Mar 13 00:33:27.210320 systemd-resolved[1491]: Positive Trust Anchors:
Mar 13 00:33:27.210549 systemd-resolved[1491]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Mar 13 00:33:27.210602 systemd-resolved[1491]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Mar 13 00:33:27.217364 systemd-resolved[1491]: Using system hostname 'ci-4459-2-4-n-bcba60e63d'.
Mar 13 00:33:27.219356 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Mar 13 00:33:27.219804 systemd[1]: Reached target network.target - Network.
Mar 13 00:33:27.219862 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Mar 13 00:33:27.234453 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd.
Mar 13 00:33:27.236719 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Mar 13 00:33:27.237434 systemd-networkd[1490]: eth1: DHCPv4 address 10.0.0.3/32 acquired from 10.0.0.1
Mar 13 00:33:27.238529 systemd-timesyncd[1507]: Network configuration changed, trying to establish connection.
Mar 13 00:33:27.238770 systemd[1]: Reached target sysinit.target - System Initialization.
Mar 13 00:33:27.239974 systemd[1]: Started motdgen.path - Watch for update engine configuration changes.
Mar 13 00:33:27.240096 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data.
Mar 13 00:33:27.240191 systemd[1]: Started google-oslogin-cache.timer - NSS cache refresh timer.
Mar 13 00:33:27.240510 systemd[1]: Started logrotate.timer - Daily rotation of log files.
Mar 13 00:33:27.241136 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information..
Mar 13 00:33:27.241215 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories.
Mar 13 00:33:27.242321 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate).
Mar 13 00:33:27.242353 systemd[1]: Reached target paths.target - Path Units.
Mar 13 00:33:27.242417 systemd[1]: Reached target timers.target - Timer Units.
Mar 13 00:33:27.243532 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket.
Mar 13 00:33:27.245576 systemd[1]: Starting docker.socket - Docker Socket for the API...
Mar 13 00:33:27.249117 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local).
Mar 13 00:33:27.254437 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK).
Mar 13 00:33:27.255476 systemd[1]: Reached target ssh-access.target - SSH Access Available.
Mar 13 00:33:27.259168 systemd-networkd[1490]: eth0: DHCPv4 address 157.180.117.151/32, gateway 172.31.1.1 acquired from 172.31.1.1
Mar 13 00:33:27.259470 systemd-timesyncd[1507]: Network configuration changed, trying to establish connection.
Mar 13 00:33:27.260062 systemd-timesyncd[1507]: Network configuration changed, trying to establish connection.
Mar 13 00:33:27.263544 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket.
Mar 13 00:33:27.267506 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket.
Mar 13 00:33:27.269929 systemd[1]: Listening on docker.socket - Docker Socket for the API.
Mar 13 00:33:27.271393 systemd[1]: Reached target sockets.target - Socket Units.
Mar 13 00:33:27.274147 systemd[1]: Reached target basic.target - Basic System.
Mar 13 00:33:27.274742 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met.
Mar 13 00:33:27.274767 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met.
Mar 13 00:33:27.275783 systemd[1]: Starting containerd.service - containerd container runtime...
Mar 13 00:33:27.281382 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent...
Mar 13 00:33:27.286454 systemd[1]: Starting dbus.service - D-Bus System Message Bus...
Mar 13 00:33:27.288555 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown...
Mar 13 00:33:27.299593 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit...
Mar 13 00:33:27.301876 systemd[1]: Starting extend-filesystems.service - Extend Filesystems...
Mar 13 00:33:27.302896 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment).
Mar 13 00:33:27.305967 systemd[1]: Starting google-oslogin-cache.service - NSS cache refresh...
Mar 13 00:33:27.312128 coreos-metadata[1588]: Mar 13 00:33:27.312 INFO Fetching http://169.254.169.254/hetzner/v1/metadata: Attempt #1
Mar 13 00:33:27.313192 coreos-metadata[1588]: Mar 13 00:33:27.313 INFO Fetch successful
Mar 13 00:33:27.313353 coreos-metadata[1588]: Mar 13 00:33:27.313 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/private-networks: Attempt #1
Mar 13 00:33:27.313648 coreos-metadata[1588]: Mar 13 00:33:27.313 INFO Fetch successful
Mar 13 00:33:27.315711 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd...
Mar 13 00:33:27.320341 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin...
Mar 13 00:33:27.325400 google_oslogin_nss_cache[1595]: oslogin_cache_refresh[1595]: Refreshing passwd entry cache
Mar 13 00:33:27.325410 oslogin_cache_refresh[1595]: Refreshing passwd entry cache
Mar 13 00:33:27.328101 google_oslogin_nss_cache[1595]: oslogin_cache_refresh[1595]: Failure getting users, quitting
Mar 13 00:33:27.328101 google_oslogin_nss_cache[1595]: oslogin_cache_refresh[1595]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak.
Mar 13 00:33:27.328084 oslogin_cache_refresh[1595]: Failure getting users, quitting
Mar 13 00:33:27.328191 google_oslogin_nss_cache[1595]: oslogin_cache_refresh[1595]: Refreshing group entry cache
Mar 13 00:33:27.328096 oslogin_cache_refresh[1595]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak.
Mar 13 00:33:27.328128 oslogin_cache_refresh[1595]: Refreshing group entry cache
Mar 13 00:33:27.328500 systemd[1]: Started qemu-guest-agent.service - QEMU Guest Agent.
Mar 13 00:33:27.330575 jq[1593]: false
Mar 13 00:33:27.329588 oslogin_cache_refresh[1595]: Failure getting groups, quitting
Mar 13 00:33:27.330758 google_oslogin_nss_cache[1595]: oslogin_cache_refresh[1595]: Failure getting groups, quitting
Mar 13 00:33:27.330758 google_oslogin_nss_cache[1595]: oslogin_cache_refresh[1595]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak.
Mar 13 00:33:27.329597 oslogin_cache_refresh[1595]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak.
Mar 13 00:33:27.334883 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline...
Mar 13 00:33:27.339251 extend-filesystems[1594]: Found /dev/sda6
Mar 13 00:33:27.341386 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys...
Mar 13 00:33:27.351229 systemd[1]: Starting systemd-logind.service - User Login Management...
Mar 13 00:33:27.356303 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0).
Mar 13 00:33:27.357358 extend-filesystems[1594]: Found /dev/sda9
Mar 13 00:33:27.356705 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details.
Mar 13 00:33:27.362343 extend-filesystems[1594]: Checking size of /dev/sda9
Mar 13 00:33:27.364215 systemd[1]: Starting update-engine.service - Update Engine...
Mar 13 00:33:27.370373 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition...
Mar 13 00:33:27.381890 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown.
Mar 13 00:33:27.387380 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'.
Mar 13 00:33:27.387588 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped.
Mar 13 00:33:27.387855 systemd[1]: google-oslogin-cache.service: Deactivated successfully.
Mar 13 00:33:27.388024 systemd[1]: Finished google-oslogin-cache.service - NSS cache refresh.
Mar 13 00:33:27.388770 extend-filesystems[1594]: Resized partition /dev/sda9
Mar 13 00:33:27.391597 systemd[1]: motdgen.service: Deactivated successfully.
Mar 13 00:33:27.395737 jq[1617]: true
Mar 13 00:33:27.395529 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd.
Mar 13 00:33:27.398424 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully.
Mar 13 00:33:27.399441 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline.
Mar 13 00:33:27.411760 update_engine[1613]: I20260313 00:33:27.411703 1613 main.cc:92] Flatcar Update Engine starting
Mar 13 00:33:27.413315 extend-filesystems[1624]: resize2fs 1.47.3 (8-Jul-2025)
Mar 13 00:33:27.426302 kernel: EXT4-fs (sda9): resizing filesystem from 1617920 to 19393531 blocks
Mar 13 00:33:27.433369 jq[1627]: true
Mar 13 00:33:27.438454 systemd-logind[1609]: New seat seat0.
Mar 13 00:33:27.441722 (ntainerd)[1635]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR
Mar 13 00:33:27.451019 tar[1626]: linux-amd64/LICENSE
Mar 13 00:33:27.451741 systemd-logind[1609]: Watching system buttons on /dev/input/event3 (Power Button)
Mar 13 00:33:27.451761 systemd-logind[1609]: Watching system buttons on /dev/input/event0 (AT Translated Set 2 keyboard)
Mar 13 00:33:27.451992 systemd[1]: Started systemd-logind.service - User Login Management.
Mar 13 00:33:27.454005 tar[1626]: linux-amd64/helm
Mar 13 00:33:27.457090 dbus-daemon[1589]: [system] SELinux support is enabled
Mar 13 00:33:27.458096 systemd[1]: Started dbus.service - D-Bus System Message Bus.
Mar 13 00:33:27.462774 update_engine[1613]: I20260313 00:33:27.462321 1613 update_check_scheduler.cc:74] Next update check in 4m37s
Mar 13 00:33:27.466039 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml).
Mar 13 00:33:27.466066 systemd[1]: Reached target system-config.target - Load system-provided cloud configs.
Mar 13 00:33:27.471249 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url).
Mar 13 00:33:27.471369 systemd[1]: Reached target user-config.target - Load user-provided cloud configs.
Mar 13 00:33:27.480137 systemd[1]: Started update-engine.service - Update Engine.
Mar 13 00:33:27.483540 dbus-daemon[1589]: [system] Successfully activated service 'org.freedesktop.systemd1'
Mar 13 00:33:27.497752 systemd[1]: Started locksmithd.service - Cluster reboot manager.
Mar 13 00:33:27.556154 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent.
Mar 13 00:33:27.560111 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met.
Mar 13 00:33:27.632763 bash[1671]: Updated "/home/core/.ssh/authorized_keys"
Mar 13 00:33:27.634216 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition.
Mar 13 00:33:27.640177 systemd[1]: Starting sshkeys.service...
Mar 13 00:33:27.670979 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys.
Mar 13 00:33:27.675508 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)...
Mar 13 00:33:27.702420 containerd[1635]: time="2026-03-13T00:33:27Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8
Mar 13 00:33:27.704562 containerd[1635]: time="2026-03-13T00:33:27.704419583Z" level=info msg="starting containerd" revision=4ac6c20c7bbf8177f29e46bbdc658fec02ffb8ad version=v2.0.7
Mar 13 00:33:27.723299 containerd[1635]: time="2026-03-13T00:33:27.722826011Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="28.8µs"
Mar 13 00:33:27.723299 containerd[1635]: time="2026-03-13T00:33:27.722850851Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1
Mar 13 00:33:27.723299 containerd[1635]: time="2026-03-13T00:33:27.722867991Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1
Mar 13 00:33:27.723299 containerd[1635]: time="2026-03-13T00:33:27.723017761Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1
Mar 13 00:33:27.723299 containerd[1635]: time="2026-03-13T00:33:27.723053221Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1
Mar 13 00:33:27.723299 containerd[1635]: time="2026-03-13T00:33:27.723075001Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1
Mar 13 00:33:27.723299 containerd[1635]: time="2026-03-13T00:33:27.723125051Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1
Mar 13 00:33:27.723637 containerd[1635]: time="2026-03-13T00:33:27.723134591Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1
Mar 13 00:33:27.725166 containerd[1635]: time="2026-03-13T00:33:27.724693672Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1
Mar 13 00:33:27.725166 containerd[1635]: time="2026-03-13T00:33:27.724956592Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1
Mar 13 00:33:27.725166 containerd[1635]: time="2026-03-13T00:33:27.724968682Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1
Mar 13 00:33:27.725166 containerd[1635]: time="2026-03-13T00:33:27.724975262Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1
Mar 13 00:33:27.725662 containerd[1635]: time="2026-03-13T00:33:27.725448402Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1
Mar 13 00:33:27.726705 containerd[1635]: time="2026-03-13T00:33:27.726117392Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1
Mar 13 00:33:27.726838 locksmithd[1649]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot"
Mar 13 00:33:27.727398 containerd[1635]: time="2026-03-13T00:33:27.727030323Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1
Mar 13 00:33:27.727398 containerd[1635]: time="2026-03-13T00:33:27.727041533Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1
Mar 13 00:33:27.727398 containerd[1635]: time="2026-03-13T00:33:27.727066783Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1
Mar 13 00:33:27.728885 containerd[1635]: time="2026-03-13T00:33:27.727720323Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1
Mar 13 00:33:27.729738 containerd[1635]: time="2026-03-13T00:33:27.729528054Z" level=info msg="metadata content store policy set" policy=shared
Mar 13 00:33:27.747797 containerd[1635]: time="2026-03-13T00:33:27.746336671Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1
Mar 13 00:33:27.747797 containerd[1635]: time="2026-03-13T00:33:27.746374561Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1
Mar 13 00:33:27.747797 containerd[1635]: time="2026-03-13T00:33:27.746392631Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1
Mar 13 00:33:27.747797 containerd[1635]: time="2026-03-13T00:33:27.746404981Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1
Mar 13 00:33:27.747797 containerd[1635]: time="2026-03-13T00:33:27.746415861Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1
Mar 13 00:33:27.747797 containerd[1635]: time="2026-03-13T00:33:27.746423131Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1
Mar 13 00:33:27.747797 containerd[1635]: time="2026-03-13T00:33:27.746431461Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1
Mar 13 00:33:27.747797 containerd[1635]: time="2026-03-13T00:33:27.746448631Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1
Mar 13 00:33:27.747797 containerd[1635]: time="2026-03-13T00:33:27.746456411Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1
Mar 13 00:33:27.747797 containerd[1635]: time="2026-03-13T00:33:27.746464121Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1
Mar 13 00:33:27.747797 containerd[1635]: time="2026-03-13T00:33:27.746470771Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1
Mar 13 00:33:27.747797 containerd[1635]: time="2026-03-13T00:33:27.746480131Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2
Mar 13 00:33:27.747797 containerd[1635]: time="2026-03-13T00:33:27.746594371Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1
Mar 13 00:33:27.747797 containerd[1635]: time="2026-03-13T00:33:27.746608171Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1
Mar 13 00:33:27.748010 containerd[1635]: time="2026-03-13T00:33:27.746618501Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1
Mar 13 00:33:27.748010 containerd[1635]: time="2026-03-13T00:33:27.746625811Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1
Mar 13 00:33:27.748010 containerd[1635]: time="2026-03-13T00:33:27.746633161Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1
Mar 13 00:33:27.748010 containerd[1635]: time="2026-03-13T00:33:27.746640651Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1
Mar 13 00:33:27.748010 containerd[1635]: time="2026-03-13T00:33:27.746664021Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1
Mar 13 00:33:27.748010 containerd[1635]: time="2026-03-13T00:33:27.746671081Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1
Mar 13 00:33:27.748010 containerd[1635]: time="2026-03-13T00:33:27.746678981Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1
Mar 13 00:33:27.748010 containerd[1635]: time="2026-03-13T00:33:27.746686681Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1
Mar 13 00:33:27.748010 containerd[1635]: time="2026-03-13T00:33:27.746701091Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1
Mar 13 00:33:27.748010 containerd[1635]: time="2026-03-13T00:33:27.746743211Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\""
Mar 13 00:33:27.748010 containerd[1635]: time="2026-03-13T00:33:27.746755921Z" level=info msg="Start snapshots syncer"
Mar 13 00:33:27.748010 containerd[1635]: time="2026-03-13T00:33:27.746768671Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1
Mar 13 00:33:27.748165 containerd[1635]: time="2026-03-13T00:33:27.746942961Z" level=info msg="starting cri plugin" config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}"
Mar 13 00:33:27.748165 containerd[1635]: time="2026-03-13T00:33:27.746978901Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1
Mar 13 00:33:27.748255 containerd[1635]: time="2026-03-13T00:33:27.746999651Z" level=info msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1
Mar 13 00:33:27.748255 containerd[1635]: time="2026-03-13T00:33:27.747072761Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1
Mar 13 00:33:27.748255 containerd[1635]: time="2026-03-13T00:33:27.747091201Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1
Mar 13 00:33:27.748255 containerd[1635]: time="2026-03-13T00:33:27.747098971Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1
Mar 13 00:33:27.748255 containerd[1635]: time="2026-03-13T00:33:27.747105721Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1
Mar 13 00:33:27.748255 containerd[1635]: time="2026-03-13T00:33:27.747113981Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1
Mar 13 00:33:27.748255 containerd[1635]: time="2026-03-13T00:33:27.747121281Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1
Mar 13 00:33:27.748255 containerd[1635]: time="2026-03-13T00:33:27.747129011Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1
Mar 13 00:33:27.748255 containerd[1635]: time="2026-03-13T00:33:27.747143411Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1
Mar 13 00:33:27.748255 containerd[1635]: time="2026-03-13T00:33:27.747150881Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1
Mar 13 00:33:27.748255 containerd[1635]: time="2026-03-13T00:33:27.747162971Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1
Mar 13 00:33:27.748255 containerd[1635]: time="2026-03-13T00:33:27.747184151Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1
Mar 13 00:33:27.748255 containerd[1635]: time="2026-03-13T00:33:27.747193561Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1
Mar 13 00:33:27.748255 containerd[1635]: time="2026-03-13T00:33:27.747199501Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1
Mar 13 00:33:27.748461 containerd[1635]: time="2026-03-13T00:33:27.747207061Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1
Mar 13 00:33:27.748461 containerd[1635]: time="2026-03-13T00:33:27.747212391Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1
Mar 13 00:33:27.748461 containerd[1635]: time="2026-03-13T00:33:27.747218741Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1
Mar 13 00:33:27.748461 containerd[1635]: time="2026-03-13T00:33:27.747231251Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1
Mar 13 00:33:27.748461 containerd[1635]: time="2026-03-13T00:33:27.747243751Z" level=info msg="runtime interface created"
Mar 13 00:33:27.748461 containerd[1635]: time="2026-03-13T00:33:27.747247621Z" level=info msg="created NRI interface"
Mar 13 00:33:27.748461 containerd[1635]: time="2026-03-13T00:33:27.747253401Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1
Mar 13 00:33:27.748461 containerd[1635]: time="2026-03-13T00:33:27.747260801Z" level=info msg="Connect containerd service"
Mar 13 00:33:27.748461 containerd[1635]: time="2026-03-13T00:33:27.747293171Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this"
Mar 13 00:33:27.750074 containerd[1635]: time="2026-03-13T00:33:27.748861182Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config"
Mar 13 00:33:27.758527 coreos-metadata[1678]: Mar 13 00:33:27.758 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/public-keys: Attempt #1
Mar 13 00:33:27.760460 coreos-metadata[1678]: Mar 13 00:33:27.760 INFO Fetch successful
Mar 13 00:33:27.766142 unknown[1678]: wrote ssh authorized keys file for user: core
Mar 13 00:33:27.773681 kernel: EXT4-fs (sda9): resized filesystem to 19393531
Mar 13 00:33:27.796315 extend-filesystems[1624]: Filesystem at /dev/sda9 is mounted on /; on-line resizing required
Mar 13 00:33:27.796315 extend-filesystems[1624]: old_desc_blocks = 1, new_desc_blocks = 10
Mar 13 00:33:27.796315 extend-filesystems[1624]: The filesystem on /dev/sda9 is now 19393531 (4k) blocks long.
Mar 13 00:33:27.798303 systemd[1]: extend-filesystems.service: Deactivated successfully.
Mar 13 00:33:27.818484 update-ssh-keys[1690]: Updated "/home/core/.ssh/authorized_keys"
Mar 13 00:33:27.818567 extend-filesystems[1594]: Resized filesystem in /dev/sda9
Mar 13 00:33:27.799448 systemd[1]: Finished extend-filesystems.service - Extend Filesystems.
Mar 13 00:33:27.811096 systemd[1]: Finished coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys).
Mar 13 00:33:27.823018 systemd[1]: Finished sshkeys.service.
Mar 13 00:33:27.841290 containerd[1635]: time="2026-03-13T00:33:27.840590620Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc
Mar 13 00:33:27.841290 containerd[1635]: time="2026-03-13T00:33:27.840646050Z" level=info msg=serving... address=/run/containerd/containerd.sock
Mar 13 00:33:27.841290 containerd[1635]: time="2026-03-13T00:33:27.840663650Z" level=info msg="Start subscribing containerd event"
Mar 13 00:33:27.841290 containerd[1635]: time="2026-03-13T00:33:27.840687280Z" level=info msg="Start recovering state"
Mar 13 00:33:27.841290 containerd[1635]: time="2026-03-13T00:33:27.840779060Z" level=info msg="Start event monitor"
Mar 13 00:33:27.841290 containerd[1635]: time="2026-03-13T00:33:27.840787820Z" level=info msg="Start cni network conf syncer for default"
Mar 13 00:33:27.841290 containerd[1635]: time="2026-03-13T00:33:27.840793390Z" level=info msg="Start streaming server"
Mar 13 00:33:27.841290 containerd[1635]: time="2026-03-13T00:33:27.840800950Z" level=info msg="Registered namespace \"k8s.io\" with NRI"
Mar 13 00:33:27.841290 containerd[1635]: time="2026-03-13T00:33:27.840806830Z" level=info msg="runtime interface starting up..."
Mar 13 00:33:27.841290 containerd[1635]: time="2026-03-13T00:33:27.840811880Z" level=info msg="starting plugins..."
Mar 13 00:33:27.841290 containerd[1635]: time="2026-03-13T00:33:27.840822190Z" level=info msg="Synchronizing NRI (plugin) with current runtime state"
Mar 13 00:33:27.841033 systemd[1]: Started containerd.service - containerd container runtime.
Mar 13 00:33:27.849283 containerd[1635]: time="2026-03-13T00:33:27.848412773Z" level=info msg="containerd successfully booted in 0.147270s"
Mar 13 00:33:27.866443 sshd_keygen[1625]: ssh-keygen: generating new host keys: RSA ECDSA ED25519
Mar 13 00:33:27.883868 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys.
Mar 13 00:33:27.890466 systemd[1]: Starting issuegen.service - Generate /run/issue...
Mar 13 00:33:27.905296 systemd[1]: issuegen.service: Deactivated successfully.
Mar 13 00:33:27.905511 systemd[1]: Finished issuegen.service - Generate /run/issue.
Mar 13 00:33:27.913451 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions...
Mar 13 00:33:27.924889 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. Mar 13 00:33:27.931488 systemd[1]: Started getty@tty1.service - Getty on tty1. Mar 13 00:33:27.936505 systemd[1]: Started serial-getty@ttyS0.service - Serial Getty on ttyS0. Mar 13 00:33:27.941250 systemd[1]: Reached target getty.target - Login Prompts. Mar 13 00:33:27.971846 tar[1626]: linux-amd64/README.md Mar 13 00:33:27.987984 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. Mar 13 00:33:28.952591 systemd-networkd[1490]: eth1: Gained IPv6LL Mar 13 00:33:28.953488 systemd-timesyncd[1507]: Network configuration changed, trying to establish connection. Mar 13 00:33:28.959031 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Mar 13 00:33:28.961479 systemd[1]: Reached target network-online.target - Network is Online. Mar 13 00:33:28.967659 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 13 00:33:28.972940 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Mar 13 00:33:29.038601 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. Mar 13 00:33:29.080455 systemd-networkd[1490]: eth0: Gained IPv6LL Mar 13 00:33:29.081420 systemd-timesyncd[1507]: Network configuration changed, trying to establish connection. Mar 13 00:33:30.030394 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 13 00:33:30.031159 systemd[1]: Reached target multi-user.target - Multi-User System. Mar 13 00:33:30.033161 systemd[1]: Startup finished in 2.854s (kernel) + 6.122s (initrd) + 5.290s (userspace) = 14.267s. 
Mar 13 00:33:30.036590 (kubelet)[1740]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Mar 13 00:33:30.496394 kubelet[1740]: E0313 00:33:30.496298 1740 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Mar 13 00:33:30.501379 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Mar 13 00:33:30.501578 systemd[1]: kubelet.service: Failed with result 'exit-code'. Mar 13 00:33:30.502196 systemd[1]: kubelet.service: Consumed 882ms CPU time, 257.8M memory peak. Mar 13 00:33:33.545607 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Mar 13 00:33:33.548514 systemd[1]: Started sshd@0-157.180.117.151:22-4.153.228.146:56558.service - OpenSSH per-connection server daemon (4.153.228.146:56558). Mar 13 00:33:34.231968 sshd[1752]: Accepted publickey for core from 4.153.228.146 port 56558 ssh2: RSA SHA256:ihdQa0i/HnNGvKP5m9obD9eorZ8Lhhc0yafWx7ReGkQ Mar 13 00:33:34.238953 sshd-session[1752]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 13 00:33:34.252594 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Mar 13 00:33:34.254994 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Mar 13 00:33:34.277400 systemd-logind[1609]: New session 1 of user core. Mar 13 00:33:34.292409 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Mar 13 00:33:34.297594 systemd[1]: Starting user@500.service - User Manager for UID 500... 
Mar 13 00:33:34.317631 (systemd)[1757]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) Mar 13 00:33:34.321793 systemd-logind[1609]: New session c1 of user core. Mar 13 00:33:34.483505 systemd[1757]: Queued start job for default target default.target. Mar 13 00:33:34.495457 systemd[1757]: Created slice app.slice - User Application Slice. Mar 13 00:33:34.495513 systemd[1757]: Reached target paths.target - Paths. Mar 13 00:33:34.495560 systemd[1757]: Reached target timers.target - Timers. Mar 13 00:33:34.497358 systemd[1757]: Starting dbus.socket - D-Bus User Message Bus Socket... Mar 13 00:33:34.538928 systemd[1757]: Listening on dbus.socket - D-Bus User Message Bus Socket. Mar 13 00:33:34.539045 systemd[1757]: Reached target sockets.target - Sockets. Mar 13 00:33:34.539084 systemd[1757]: Reached target basic.target - Basic System. Mar 13 00:33:34.539120 systemd[1757]: Reached target default.target - Main User Target. Mar 13 00:33:34.539150 systemd[1757]: Startup finished in 204ms. Mar 13 00:33:34.539495 systemd[1]: Started user@500.service - User Manager for UID 500. Mar 13 00:33:34.547398 systemd[1]: Started session-1.scope - Session 1 of User core. Mar 13 00:33:34.929175 systemd[1]: Started sshd@1-157.180.117.151:22-4.153.228.146:56562.service - OpenSSH per-connection server daemon (4.153.228.146:56562). Mar 13 00:33:35.608391 sshd[1768]: Accepted publickey for core from 4.153.228.146 port 56562 ssh2: RSA SHA256:ihdQa0i/HnNGvKP5m9obD9eorZ8Lhhc0yafWx7ReGkQ Mar 13 00:33:35.610246 sshd-session[1768]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 13 00:33:35.620391 systemd-logind[1609]: New session 2 of user core. Mar 13 00:33:35.630526 systemd[1]: Started session-2.scope - Session 2 of User core. 
Mar 13 00:33:35.977933 sshd[1771]: Connection closed by 4.153.228.146 port 56562 Mar 13 00:33:35.979629 sshd-session[1768]: pam_unix(sshd:session): session closed for user core Mar 13 00:33:35.986141 systemd[1]: sshd@1-157.180.117.151:22-4.153.228.146:56562.service: Deactivated successfully. Mar 13 00:33:35.990218 systemd[1]: session-2.scope: Deactivated successfully. Mar 13 00:33:35.992771 systemd-logind[1609]: Session 2 logged out. Waiting for processes to exit. Mar 13 00:33:35.996608 systemd-logind[1609]: Removed session 2. Mar 13 00:33:36.114639 systemd[1]: Started sshd@2-157.180.117.151:22-4.153.228.146:56578.service - OpenSSH per-connection server daemon (4.153.228.146:56578). Mar 13 00:33:36.778422 sshd[1777]: Accepted publickey for core from 4.153.228.146 port 56578 ssh2: RSA SHA256:ihdQa0i/HnNGvKP5m9obD9eorZ8Lhhc0yafWx7ReGkQ Mar 13 00:33:36.781243 sshd-session[1777]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 13 00:33:36.790619 systemd-logind[1609]: New session 3 of user core. Mar 13 00:33:36.796527 systemd[1]: Started session-3.scope - Session 3 of User core. Mar 13 00:33:37.135633 sshd[1780]: Connection closed by 4.153.228.146 port 56578 Mar 13 00:33:37.136588 sshd-session[1777]: pam_unix(sshd:session): session closed for user core Mar 13 00:33:37.143533 systemd-logind[1609]: Session 3 logged out. Waiting for processes to exit. Mar 13 00:33:37.144881 systemd[1]: sshd@2-157.180.117.151:22-4.153.228.146:56578.service: Deactivated successfully. Mar 13 00:33:37.148609 systemd[1]: session-3.scope: Deactivated successfully. Mar 13 00:33:37.151589 systemd-logind[1609]: Removed session 3. Mar 13 00:33:37.263098 systemd[1]: Started sshd@3-157.180.117.151:22-4.153.228.146:56580.service - OpenSSH per-connection server daemon (4.153.228.146:56580). 
Mar 13 00:33:37.917903 sshd[1786]: Accepted publickey for core from 4.153.228.146 port 56580 ssh2: RSA SHA256:ihdQa0i/HnNGvKP5m9obD9eorZ8Lhhc0yafWx7ReGkQ Mar 13 00:33:37.919777 sshd-session[1786]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 13 00:33:37.929385 systemd-logind[1609]: New session 4 of user core. Mar 13 00:33:37.936498 systemd[1]: Started session-4.scope - Session 4 of User core. Mar 13 00:33:38.283365 sshd[1789]: Connection closed by 4.153.228.146 port 56580 Mar 13 00:33:38.284639 sshd-session[1786]: pam_unix(sshd:session): session closed for user core Mar 13 00:33:38.292590 systemd[1]: sshd@3-157.180.117.151:22-4.153.228.146:56580.service: Deactivated successfully. Mar 13 00:33:38.296240 systemd[1]: session-4.scope: Deactivated successfully. Mar 13 00:33:38.297695 systemd-logind[1609]: Session 4 logged out. Waiting for processes to exit. Mar 13 00:33:38.300509 systemd-logind[1609]: Removed session 4. Mar 13 00:33:38.417632 systemd[1]: Started sshd@4-157.180.117.151:22-4.153.228.146:57566.service - OpenSSH per-connection server daemon (4.153.228.146:57566). Mar 13 00:33:39.096254 sshd[1795]: Accepted publickey for core from 4.153.228.146 port 57566 ssh2: RSA SHA256:ihdQa0i/HnNGvKP5m9obD9eorZ8Lhhc0yafWx7ReGkQ Mar 13 00:33:39.098568 sshd-session[1795]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 13 00:33:39.107366 systemd-logind[1609]: New session 5 of user core. Mar 13 00:33:39.115556 systemd[1]: Started session-5.scope - Session 5 of User core. 
Mar 13 00:33:39.359736 sudo[1799]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Mar 13 00:33:39.360398 sudo[1799]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Mar 13 00:33:39.377327 sudo[1799]: pam_unix(sudo:session): session closed for user root Mar 13 00:33:39.498051 sshd[1798]: Connection closed by 4.153.228.146 port 57566 Mar 13 00:33:39.500650 sshd-session[1795]: pam_unix(sshd:session): session closed for user core Mar 13 00:33:39.506564 systemd[1]: sshd@4-157.180.117.151:22-4.153.228.146:57566.service: Deactivated successfully. Mar 13 00:33:39.510296 systemd[1]: session-5.scope: Deactivated successfully. Mar 13 00:33:39.512568 systemd-logind[1609]: Session 5 logged out. Waiting for processes to exit. Mar 13 00:33:39.515559 systemd-logind[1609]: Removed session 5. Mar 13 00:33:39.636784 systemd[1]: Started sshd@5-157.180.117.151:22-4.153.228.146:57570.service - OpenSSH per-connection server daemon (4.153.228.146:57570). Mar 13 00:33:40.304579 sshd[1805]: Accepted publickey for core from 4.153.228.146 port 57570 ssh2: RSA SHA256:ihdQa0i/HnNGvKP5m9obD9eorZ8Lhhc0yafWx7ReGkQ Mar 13 00:33:40.307876 sshd-session[1805]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 13 00:33:40.317254 systemd-logind[1609]: New session 6 of user core. Mar 13 00:33:40.328503 systemd[1]: Started session-6.scope - Session 6 of User core. Mar 13 00:33:40.557335 sudo[1810]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Mar 13 00:33:40.557951 sudo[1810]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Mar 13 00:33:40.559757 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. Mar 13 00:33:40.564542 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... 
Mar 13 00:33:40.571196 sudo[1810]: pam_unix(sudo:session): session closed for user root Mar 13 00:33:40.583613 sudo[1809]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules Mar 13 00:33:40.584903 sudo[1809]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Mar 13 00:33:40.606692 systemd[1]: Starting audit-rules.service - Load Audit Rules... Mar 13 00:33:40.653331 augenrules[1835]: No rules Mar 13 00:33:40.654609 systemd[1]: audit-rules.service: Deactivated successfully. Mar 13 00:33:40.654811 systemd[1]: Finished audit-rules.service - Load Audit Rules. Mar 13 00:33:40.656941 sudo[1809]: pam_unix(sudo:session): session closed for user root Mar 13 00:33:40.735095 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 13 00:33:40.743548 (kubelet)[1845]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Mar 13 00:33:40.778191 sshd[1808]: Connection closed by 4.153.228.146 port 57570 Mar 13 00:33:40.780993 sshd-session[1805]: pam_unix(sshd:session): session closed for user core Mar 13 00:33:40.785073 kubelet[1845]: E0313 00:33:40.785037 1845 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Mar 13 00:33:40.787133 systemd[1]: sshd@5-157.180.117.151:22-4.153.228.146:57570.service: Deactivated successfully. Mar 13 00:33:40.788251 systemd-logind[1609]: Session 6 logged out. Waiting for processes to exit. Mar 13 00:33:40.790192 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Mar 13 00:33:40.790362 systemd[1]: kubelet.service: Failed with result 'exit-code'. 
Mar 13 00:33:40.790614 systemd[1]: kubelet.service: Consumed 179ms CPU time, 108.9M memory peak. Mar 13 00:33:40.791027 systemd[1]: session-6.scope: Deactivated successfully. Mar 13 00:33:40.793235 systemd-logind[1609]: Removed session 6. Mar 13 00:33:40.914596 systemd[1]: Started sshd@6-157.180.117.151:22-4.153.228.146:57574.service - OpenSSH per-connection server daemon (4.153.228.146:57574). Mar 13 00:33:41.585996 sshd[1856]: Accepted publickey for core from 4.153.228.146 port 57574 ssh2: RSA SHA256:ihdQa0i/HnNGvKP5m9obD9eorZ8Lhhc0yafWx7ReGkQ Mar 13 00:33:41.586504 sshd-session[1856]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 13 00:33:41.591098 systemd-logind[1609]: New session 7 of user core. Mar 13 00:33:41.596411 systemd[1]: Started session-7.scope - Session 7 of User core. Mar 13 00:33:41.833058 sudo[1860]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Mar 13 00:33:41.833748 sudo[1860]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Mar 13 00:33:42.116481 systemd[1]: Starting docker.service - Docker Application Container Engine... Mar 13 00:33:42.125638 (dockerd)[1878]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU Mar 13 00:33:42.390880 dockerd[1878]: time="2026-03-13T00:33:42.390484420Z" level=info msg="Starting up" Mar 13 00:33:42.393472 dockerd[1878]: time="2026-03-13T00:33:42.393404301Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider" Mar 13 00:33:42.411137 dockerd[1878]: time="2026-03-13T00:33:42.411034278Z" level=info msg="Creating a containerd client" address=/var/run/docker/libcontainerd/docker-containerd.sock timeout=1m0s Mar 13 00:33:42.426563 systemd[1]: var-lib-docker-check\x2doverlayfs\x2dsupport1717163298-merged.mount: Deactivated successfully. 
Mar 13 00:33:42.459564 dockerd[1878]: time="2026-03-13T00:33:42.459504678Z" level=info msg="Loading containers: start." Mar 13 00:33:42.470309 kernel: Initializing XFRM netlink socket Mar 13 00:33:42.699103 systemd-timesyncd[1507]: Network configuration changed, trying to establish connection. Mar 13 00:33:42.740671 systemd-networkd[1490]: docker0: Link UP Mar 13 00:33:42.744920 dockerd[1878]: time="2026-03-13T00:33:42.744870217Z" level=info msg="Loading containers: done." Mar 13 00:33:42.760766 dockerd[1878]: time="2026-03-13T00:33:42.760713134Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Mar 13 00:33:42.760938 dockerd[1878]: time="2026-03-13T00:33:42.760774064Z" level=info msg="Docker daemon" commit=6430e49a55babd9b8f4d08e70ecb2b68900770fe containerd-snapshotter=false storage-driver=overlay2 version=28.0.4 Mar 13 00:33:42.760938 dockerd[1878]: time="2026-03-13T00:33:42.760840534Z" level=info msg="Initializing buildkit" Mar 13 00:33:42.782339 dockerd[1878]: time="2026-03-13T00:33:42.782305153Z" level=info msg="Completed buildkit initialization" Mar 13 00:33:42.789230 dockerd[1878]: time="2026-03-13T00:33:42.789188976Z" level=info msg="Daemon has completed initialization" Mar 13 00:33:42.789367 systemd[1]: Started docker.service - Docker Application Container Engine. Mar 13 00:33:42.789782 dockerd[1878]: time="2026-03-13T00:33:42.789339196Z" level=info msg="API listen on /run/docker.sock" Mar 13 00:33:43.619168 systemd-timesyncd[1507]: Contacted time server 217.91.44.17:123 (2.flatcar.pool.ntp.org). Mar 13 00:33:43.619239 systemd-timesyncd[1507]: Initial clock synchronization to Fri 2026-03-13 00:33:43.618900 UTC. Mar 13 00:33:43.619925 systemd-resolved[1491]: Clock change detected. Flushing caches. 
Mar 13 00:33:44.239368 containerd[1635]: time="2026-03-13T00:33:44.238854592Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.34.5\"" Mar 13 00:33:44.813308 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2272225483.mount: Deactivated successfully. Mar 13 00:33:45.889787 containerd[1635]: time="2026-03-13T00:33:45.889721640Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.34.5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:33:45.890693 containerd[1635]: time="2026-03-13T00:33:45.890480840Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.34.5: active requests=0, bytes read=27074597" Mar 13 00:33:45.891503 containerd[1635]: time="2026-03-13T00:33:45.891475100Z" level=info msg="ImageCreate event name:\"sha256:364ea2876e41b29691964751b6217cd2e343433690fbe16a5c6a236042684df3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:33:45.893603 containerd[1635]: time="2026-03-13T00:33:45.893577581Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:c548633fcd3b4aad59b70815be4c8be54a0fddaddc3fcffa9371eedb0e96417a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:33:45.894542 containerd[1635]: time="2026-03-13T00:33:45.894495302Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.34.5\" with image id \"sha256:364ea2876e41b29691964751b6217cd2e343433690fbe16a5c6a236042684df3\", repo tag \"registry.k8s.io/kube-apiserver:v1.34.5\", repo digest \"registry.k8s.io/kube-apiserver@sha256:c548633fcd3b4aad59b70815be4c8be54a0fddaddc3fcffa9371eedb0e96417a\", size \"27071096\" in 1.65559496s" Mar 13 00:33:45.894542 containerd[1635]: time="2026-03-13T00:33:45.894549642Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.34.5\" returns image reference \"sha256:364ea2876e41b29691964751b6217cd2e343433690fbe16a5c6a236042684df3\"" Mar 13 00:33:45.895178 containerd[1635]: 
time="2026-03-13T00:33:45.895146422Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.34.5\"" Mar 13 00:33:47.005072 containerd[1635]: time="2026-03-13T00:33:47.005010864Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.34.5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:33:47.006122 containerd[1635]: time="2026-03-13T00:33:47.006085245Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.34.5: active requests=0, bytes read=21165845" Mar 13 00:33:47.006706 containerd[1635]: time="2026-03-13T00:33:47.006672335Z" level=info msg="ImageCreate event name:\"sha256:8926c34822743bb97f9003f92c30127bfeaad8bed71cd36f1c861ed8fda2c154\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:33:47.009112 containerd[1635]: time="2026-03-13T00:33:47.008419696Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:f0426100c873816560c520d542fa28999a98dad909edd04365f3b0eead790da3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:33:47.009112 containerd[1635]: time="2026-03-13T00:33:47.008992196Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.34.5\" with image id \"sha256:8926c34822743bb97f9003f92c30127bfeaad8bed71cd36f1c861ed8fda2c154\", repo tag \"registry.k8s.io/kube-controller-manager:v1.34.5\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:f0426100c873816560c520d542fa28999a98dad909edd04365f3b0eead790da3\", size \"22822771\" in 1.113813804s" Mar 13 00:33:47.009112 containerd[1635]: time="2026-03-13T00:33:47.009012076Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.34.5\" returns image reference \"sha256:8926c34822743bb97f9003f92c30127bfeaad8bed71cd36f1c861ed8fda2c154\"" Mar 13 00:33:47.009430 containerd[1635]: time="2026-03-13T00:33:47.009413476Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.34.5\"" Mar 13 
00:33:47.968155 containerd[1635]: time="2026-03-13T00:33:47.968113595Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.34.5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:33:47.969052 containerd[1635]: time="2026-03-13T00:33:47.968870096Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.34.5: active requests=0, bytes read=15729846" Mar 13 00:33:47.969855 containerd[1635]: time="2026-03-13T00:33:47.969836496Z" level=info msg="ImageCreate event name:\"sha256:f6b3520b1732b4980b2528fe5622e62be26bb6a8d38da81349cb6ccd3a1e6d65\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:33:47.973968 containerd[1635]: time="2026-03-13T00:33:47.973946728Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:b67b0d627c8e99ffa362bd4d9a60ca9a6c449e363a5f88d2aa8c224bd84ca51d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:33:47.974543 containerd[1635]: time="2026-03-13T00:33:47.974524198Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.34.5\" with image id \"sha256:f6b3520b1732b4980b2528fe5622e62be26bb6a8d38da81349cb6ccd3a1e6d65\", repo tag \"registry.k8s.io/kube-scheduler:v1.34.5\", repo digest \"registry.k8s.io/kube-scheduler@sha256:b67b0d627c8e99ffa362bd4d9a60ca9a6c449e363a5f88d2aa8c224bd84ca51d\", size \"17386790\" in 965.084602ms" Mar 13 00:33:47.974594 containerd[1635]: time="2026-03-13T00:33:47.974585148Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.34.5\" returns image reference \"sha256:f6b3520b1732b4980b2528fe5622e62be26bb6a8d38da81349cb6ccd3a1e6d65\"" Mar 13 00:33:47.975168 containerd[1635]: time="2026-03-13T00:33:47.975155648Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.34.5\"" Mar 13 00:33:48.941936 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1868827108.mount: Deactivated successfully. 
Mar 13 00:33:49.125199 containerd[1635]: time="2026-03-13T00:33:49.125155317Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.34.5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:33:49.126273 containerd[1635]: time="2026-03-13T00:33:49.126137118Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.34.5: active requests=0, bytes read=25861798" Mar 13 00:33:49.129137 containerd[1635]: time="2026-03-13T00:33:49.129115479Z" level=info msg="ImageCreate event name:\"sha256:38728cde323c302ed9eca4f1b7c0080d17db50144e39398fcf901d9df13f0c3e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:33:49.133350 containerd[1635]: time="2026-03-13T00:33:49.133326411Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:8a22a3bf452d07af3b5a3064b089d2ad6579d5dd3b850386e05cc0f36dc3f4cf\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:33:49.133776 containerd[1635]: time="2026-03-13T00:33:49.133748401Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.34.5\" with image id \"sha256:38728cde323c302ed9eca4f1b7c0080d17db50144e39398fcf901d9df13f0c3e\", repo tag \"registry.k8s.io/kube-proxy:v1.34.5\", repo digest \"registry.k8s.io/kube-proxy@sha256:8a22a3bf452d07af3b5a3064b089d2ad6579d5dd3b850386e05cc0f36dc3f4cf\", size \"25860789\" in 1.158507153s" Mar 13 00:33:49.133888 containerd[1635]: time="2026-03-13T00:33:49.133822481Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.34.5\" returns image reference \"sha256:38728cde323c302ed9eca4f1b7c0080d17db50144e39398fcf901d9df13f0c3e\"" Mar 13 00:33:49.134365 containerd[1635]: time="2026-03-13T00:33:49.134344691Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.1\"" Mar 13 00:33:49.641090 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount826796163.mount: Deactivated successfully. 
Mar 13 00:33:50.391848 containerd[1635]: time="2026-03-13T00:33:50.391800105Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.12.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:33:50.393176 containerd[1635]: time="2026-03-13T00:33:50.392966475Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.12.1: active requests=0, bytes read=22388101" Mar 13 00:33:50.394250 containerd[1635]: time="2026-03-13T00:33:50.394226736Z" level=info msg="ImageCreate event name:\"sha256:52546a367cc9e0d924aa3b190596a9167fa6e53245023b5b5baf0f07e5443969\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:33:50.396638 containerd[1635]: time="2026-03-13T00:33:50.396599637Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:e8c262566636e6bc340ece6473b0eed193cad045384401529721ddbe6463d31c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:33:50.397183 containerd[1635]: time="2026-03-13T00:33:50.397160837Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.12.1\" with image id \"sha256:52546a367cc9e0d924aa3b190596a9167fa6e53245023b5b5baf0f07e5443969\", repo tag \"registry.k8s.io/coredns/coredns:v1.12.1\", repo digest \"registry.k8s.io/coredns/coredns@sha256:e8c262566636e6bc340ece6473b0eed193cad045384401529721ddbe6463d31c\", size \"22384805\" in 1.262751406s" Mar 13 00:33:50.397215 containerd[1635]: time="2026-03-13T00:33:50.397185937Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.1\" returns image reference \"sha256:52546a367cc9e0d924aa3b190596a9167fa6e53245023b5b5baf0f07e5443969\"" Mar 13 00:33:50.397570 containerd[1635]: time="2026-03-13T00:33:50.397538867Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10.1\"" Mar 13 00:33:50.862898 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3058577177.mount: Deactivated successfully. 
Mar 13 00:33:50.867837 containerd[1635]: time="2026-03-13T00:33:50.867804883Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:33:50.868638 containerd[1635]: time="2026-03-13T00:33:50.868481583Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10.1: active requests=0, bytes read=321240" Mar 13 00:33:50.869510 containerd[1635]: time="2026-03-13T00:33:50.869492834Z" level=info msg="ImageCreate event name:\"sha256:cd073f4c5f6a8e9dc6f3125ba00cf60819cae95c1ec84a1f146ee4a9cf9e803f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:33:50.871251 containerd[1635]: time="2026-03-13T00:33:50.871230014Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:278fb9dbcca9518083ad1e11276933a2e96f23de604a3a08cc3c80002767d24c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:33:50.871711 containerd[1635]: time="2026-03-13T00:33:50.871692425Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10.1\" with image id \"sha256:cd073f4c5f6a8e9dc6f3125ba00cf60819cae95c1ec84a1f146ee4a9cf9e803f\", repo tag \"registry.k8s.io/pause:3.10.1\", repo digest \"registry.k8s.io/pause@sha256:278fb9dbcca9518083ad1e11276933a2e96f23de604a3a08cc3c80002767d24c\", size \"320448\" in 474.132168ms" Mar 13 00:33:50.871783 containerd[1635]: time="2026-03-13T00:33:50.871772465Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10.1\" returns image reference \"sha256:cd073f4c5f6a8e9dc6f3125ba00cf60819cae95c1ec84a1f146ee4a9cf9e803f\"" Mar 13 00:33:50.872134 containerd[1635]: time="2026-03-13T00:33:50.872112215Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.6.5-0\"" Mar 13 00:33:51.371943 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount26497320.mount: Deactivated successfully. Mar 13 00:33:51.818750 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. 
Mar 13 00:33:51.820770 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 13 00:33:51.961738 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 13 00:33:51.968124 (kubelet)[2283]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Mar 13 00:33:52.000185 kubelet[2283]: E0313 00:33:52.000136 2283 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Mar 13 00:33:52.002638 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Mar 13 00:33:52.002819 systemd[1]: kubelet.service: Failed with result 'exit-code'. Mar 13 00:33:52.003131 systemd[1]: kubelet.service: Consumed 136ms CPU time, 110.1M memory peak. 
Mar 13 00:33:52.050394 containerd[1635]: time="2026-03-13T00:33:52.050352366Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.6.5-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:33:52.051476 containerd[1635]: time="2026-03-13T00:33:52.051269656Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.6.5-0: active requests=0, bytes read=22860762" Mar 13 00:33:52.052248 containerd[1635]: time="2026-03-13T00:33:52.052231056Z" level=info msg="ImageCreate event name:\"sha256:a3e246e9556e93d71e2850085ba581b376c76a9187b4b8a01c120f86579ef2b1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:33:52.054588 containerd[1635]: time="2026-03-13T00:33:52.054564397Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:042ef9c02799eb9303abf1aa99b09f09d94b8ee3ba0c2dd3f42dc4e1d3dce534\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:33:52.055536 containerd[1635]: time="2026-03-13T00:33:52.055512228Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.6.5-0\" with image id \"sha256:a3e246e9556e93d71e2850085ba581b376c76a9187b4b8a01c120f86579ef2b1\", repo tag \"registry.k8s.io/etcd:3.6.5-0\", repo digest \"registry.k8s.io/etcd@sha256:042ef9c02799eb9303abf1aa99b09f09d94b8ee3ba0c2dd3f42dc4e1d3dce534\", size \"22871747\" in 1.183381963s" Mar 13 00:33:52.055536 containerd[1635]: time="2026-03-13T00:33:52.055534998Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.6.5-0\" returns image reference \"sha256:a3e246e9556e93d71e2850085ba581b376c76a9187b4b8a01c120f86579ef2b1\"" Mar 13 00:33:53.524792 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Mar 13 00:33:53.525230 systemd[1]: kubelet.service: Consumed 136ms CPU time, 110.1M memory peak. Mar 13 00:33:53.528810 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 13 00:33:53.552829 systemd[1]: Reload requested from client PID 2324 ('systemctl') (unit session-7.scope)... 
Mar 13 00:33:53.552916 systemd[1]: Reloading... Mar 13 00:33:53.654703 zram_generator::config[2367]: No configuration found. Mar 13 00:33:53.831061 systemd[1]: Reloading finished in 277 ms. Mar 13 00:33:53.888988 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM Mar 13 00:33:53.889188 systemd[1]: kubelet.service: Failed with result 'signal'. Mar 13 00:33:53.889862 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Mar 13 00:33:53.889950 systemd[1]: kubelet.service: Consumed 124ms CPU time, 98.5M memory peak. Mar 13 00:33:53.893277 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 13 00:33:54.041023 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 13 00:33:54.050094 (kubelet)[2421]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Mar 13 00:33:54.097385 kubelet[2421]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Mar 13 00:33:54.097385 kubelet[2421]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Mar 13 00:33:54.097385 kubelet[2421]: I0313 00:33:54.097213 2421 server.go:213] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Mar 13 00:33:54.620388 kubelet[2421]: I0313 00:33:54.620356 2421 server.go:529] "Kubelet version" kubeletVersion="v1.34.4" Mar 13 00:33:54.620568 kubelet[2421]: I0313 00:33:54.620502 2421 server.go:531] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Mar 13 00:33:54.622664 kubelet[2421]: I0313 00:33:54.622033 2421 watchdog_linux.go:95] "Systemd watchdog is not enabled" Mar 13 00:33:54.622664 kubelet[2421]: I0313 00:33:54.622046 2421 watchdog_linux.go:137] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Mar 13 00:33:54.622664 kubelet[2421]: I0313 00:33:54.622232 2421 server.go:956] "Client rotation is on, will bootstrap in background" Mar 13 00:33:54.631230 kubelet[2421]: E0313 00:33:54.631205 2421 certificate_manager.go:596] "Failed while requesting a signed certificate from the control plane" err="cannot create certificate signing request: Post \"https://157.180.117.151:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 157.180.117.151:6443: connect: connection refused" logger="kubernetes.io/kube-apiserver-client-kubelet.UnhandledError" Mar 13 00:33:54.633472 kubelet[2421]: I0313 00:33:54.633444 2421 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Mar 13 00:33:54.637070 kubelet[2421]: I0313 00:33:54.637045 2421 server.go:1423] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Mar 13 00:33:54.640716 kubelet[2421]: I0313 00:33:54.640685 2421 server.go:781] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
Defaulting to /" Mar 13 00:33:54.641457 kubelet[2421]: I0313 00:33:54.641425 2421 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Mar 13 00:33:54.641572 kubelet[2421]: I0313 00:33:54.641448 2421 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4459-2-4-n-bcba60e63d","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Mar 13 00:33:54.641572 kubelet[2421]: I0313 00:33:54.641565 2421 topology_manager.go:138] "Creating topology manager with none policy" Mar 13 
00:33:54.641572 kubelet[2421]: I0313 00:33:54.641572 2421 container_manager_linux.go:306] "Creating device plugin manager" Mar 13 00:33:54.641700 kubelet[2421]: I0313 00:33:54.641673 2421 container_manager_linux.go:315] "Creating Dynamic Resource Allocation (DRA) manager" Mar 13 00:33:54.643616 kubelet[2421]: I0313 00:33:54.643590 2421 state_mem.go:36] "Initialized new in-memory state store" Mar 13 00:33:54.643808 kubelet[2421]: I0313 00:33:54.643785 2421 kubelet.go:475] "Attempting to sync node with API server" Mar 13 00:33:54.643808 kubelet[2421]: I0313 00:33:54.643798 2421 kubelet.go:376] "Adding static pod path" path="/etc/kubernetes/manifests" Mar 13 00:33:54.643859 kubelet[2421]: I0313 00:33:54.643816 2421 kubelet.go:387] "Adding apiserver pod source" Mar 13 00:33:54.646401 kubelet[2421]: I0313 00:33:54.646370 2421 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Mar 13 00:33:54.650364 kubelet[2421]: E0313 00:33:54.650328 2421 reflector.go:205] "Failed to watch" err="failed to list *v1.Node: Get \"https://157.180.117.151:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4459-2-4-n-bcba60e63d&limit=500&resourceVersion=0\": dial tcp 157.180.117.151:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Mar 13 00:33:54.650627 kubelet[2421]: E0313 00:33:54.650601 2421 reflector.go:205] "Failed to watch" err="failed to list *v1.Service: Get \"https://157.180.117.151:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 157.180.117.151:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Mar 13 00:33:54.651677 kubelet[2421]: I0313 00:33:54.651006 2421 kuberuntime_manager.go:291] "Container runtime initialized" containerRuntime="containerd" version="v2.0.7" apiVersion="v1" Mar 13 00:33:54.651677 kubelet[2421]: I0313 00:33:54.651405 2421 
kubelet.go:940] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Mar 13 00:33:54.651677 kubelet[2421]: I0313 00:33:54.651421 2421 kubelet.go:964] "Not starting PodCertificateRequest manager because we are in static kubelet mode or the PodCertificateProjection feature gate is disabled" Mar 13 00:33:54.651677 kubelet[2421]: W0313 00:33:54.651469 2421 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. Mar 13 00:33:54.654984 kubelet[2421]: I0313 00:33:54.654804 2421 server.go:1262] "Started kubelet" Mar 13 00:33:54.655978 kubelet[2421]: I0313 00:33:54.655965 2421 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Mar 13 00:33:54.659367 kubelet[2421]: E0313 00:33:54.658343 2421 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://157.180.117.151:6443/api/v1/namespaces/default/events\": dial tcp 157.180.117.151:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4459-2-4-n-bcba60e63d.189c3f5ff5c322a8 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4459-2-4-n-bcba60e63d,UID:ci-4459-2-4-n-bcba60e63d,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4459-2-4-n-bcba60e63d,},FirstTimestamp:2026-03-13 00:33:54.65478212 +0000 UTC m=+0.601020411,LastTimestamp:2026-03-13 00:33:54.65478212 +0000 UTC m=+0.601020411,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4459-2-4-n-bcba60e63d,}" Mar 13 00:33:54.660336 kubelet[2421]: E0313 00:33:54.660284 2421 kubelet.go:1615] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Mar 13 00:33:54.660455 kubelet[2421]: I0313 00:33:54.660432 2421 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Mar 13 00:33:54.661603 kubelet[2421]: I0313 00:33:54.661580 2421 server.go:310] "Adding debug handlers to kubelet server" Mar 13 00:33:54.665678 kubelet[2421]: I0313 00:33:54.665629 2421 ratelimit.go:56] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Mar 13 00:33:54.665754 kubelet[2421]: I0313 00:33:54.665745 2421 server_v1.go:49] "podresources" method="list" useActivePods=true Mar 13 00:33:54.665818 kubelet[2421]: I0313 00:33:54.665800 2421 volume_manager.go:313] "Starting Kubelet Volume Manager" Mar 13 00:33:54.667663 kubelet[2421]: E0313 00:33:54.666183 2421 kubelet_node_status.go:404] "Error getting the current node from lister" err="node \"ci-4459-2-4-n-bcba60e63d\" not found" Mar 13 00:33:54.667663 kubelet[2421]: I0313 00:33:54.666337 2421 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Mar 13 00:33:54.669860 kubelet[2421]: I0313 00:33:54.669844 2421 server.go:249] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Mar 13 00:33:54.669957 kubelet[2421]: I0313 00:33:54.669936 2421 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Mar 13 00:33:54.670006 kubelet[2421]: I0313 00:33:54.669987 2421 reconciler.go:29] "Reconciler: start to sync state" Mar 13 00:33:54.670753 kubelet[2421]: I0313 00:33:54.670742 2421 factory.go:223] Registration of the systemd container factory successfully Mar 13 00:33:54.671334 kubelet[2421]: I0313 00:33:54.671317 2421 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or 
directory Mar 13 00:33:54.671797 kubelet[2421]: E0313 00:33:54.671779 2421 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://157.180.117.151:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4459-2-4-n-bcba60e63d?timeout=10s\": dial tcp 157.180.117.151:6443: connect: connection refused" interval="200ms" Mar 13 00:33:54.673309 kubelet[2421]: I0313 00:33:54.673299 2421 factory.go:223] Registration of the containerd container factory successfully Mar 13 00:33:54.680995 kubelet[2421]: I0313 00:33:54.680973 2421 kubelet_network_linux.go:54] "Initialized iptables rules." protocol="IPv4" Mar 13 00:33:54.682204 kubelet[2421]: I0313 00:33:54.682193 2421 kubelet_network_linux.go:54] "Initialized iptables rules." protocol="IPv6" Mar 13 00:33:54.682258 kubelet[2421]: I0313 00:33:54.682252 2421 status_manager.go:244] "Starting to sync pod status with apiserver" Mar 13 00:33:54.682305 kubelet[2421]: I0313 00:33:54.682300 2421 kubelet.go:2428] "Starting kubelet main sync loop" Mar 13 00:33:54.682365 kubelet[2421]: E0313 00:33:54.682354 2421 kubelet.go:2452] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Mar 13 00:33:54.689040 kubelet[2421]: E0313 00:33:54.689026 2421 reflector.go:205] "Failed to watch" err="failed to list *v1.CSIDriver: Get \"https://157.180.117.151:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 157.180.117.151:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Mar 13 00:33:54.689613 kubelet[2421]: E0313 00:33:54.689600 2421 reflector.go:205] "Failed to watch" err="failed to list *v1.RuntimeClass: Get \"https://157.180.117.151:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 157.180.117.151:6443: connect: connection refused" logger="UnhandledError" 
reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" Mar 13 00:33:54.704399 kubelet[2421]: I0313 00:33:54.704387 2421 cpu_manager.go:221] "Starting CPU manager" policy="none" Mar 13 00:33:54.704614 kubelet[2421]: I0313 00:33:54.704605 2421 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Mar 13 00:33:54.704678 kubelet[2421]: I0313 00:33:54.704671 2421 state_mem.go:36] "Initialized new in-memory state store" Mar 13 00:33:54.707004 kubelet[2421]: I0313 00:33:54.706993 2421 policy_none.go:49] "None policy: Start" Mar 13 00:33:54.707051 kubelet[2421]: I0313 00:33:54.707044 2421 memory_manager.go:187] "Starting memorymanager" policy="None" Mar 13 00:33:54.707091 kubelet[2421]: I0313 00:33:54.707085 2421 state_mem.go:36] "Initializing new in-memory state store" logger="Memory Manager state checkpoint" Mar 13 00:33:54.708576 kubelet[2421]: I0313 00:33:54.708567 2421 policy_none.go:47] "Start" Mar 13 00:33:54.712513 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. Mar 13 00:33:54.722242 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. Mar 13 00:33:54.725620 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. 
Mar 13 00:33:54.735616 kubelet[2421]: E0313 00:33:54.735600 2421 manager.go:513] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Mar 13 00:33:54.735863 kubelet[2421]: I0313 00:33:54.735853 2421 eviction_manager.go:189] "Eviction manager: starting control loop" Mar 13 00:33:54.735934 kubelet[2421]: I0313 00:33:54.735911 2421 container_log_manager.go:146] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 13 00:33:54.736121 kubelet[2421]: I0313 00:33:54.736110 2421 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Mar 13 00:33:54.737235 kubelet[2421]: E0313 00:33:54.737225 2421 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime" Mar 13 00:33:54.737301 kubelet[2421]: E0313 00:33:54.737295 2421 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-4459-2-4-n-bcba60e63d\" not found" Mar 13 00:33:54.797368 systemd[1]: Created slice kubepods-burstable-pod4386427b1588f042cb1f28dbcb739b96.slice - libcontainer container kubepods-burstable-pod4386427b1588f042cb1f28dbcb739b96.slice. Mar 13 00:33:54.808251 kubelet[2421]: E0313 00:33:54.808204 2421 kubelet.go:3216] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-4-n-bcba60e63d\" not found" node="ci-4459-2-4-n-bcba60e63d" Mar 13 00:33:54.813853 systemd[1]: Created slice kubepods-burstable-pod5a91e11390d62b878c5b871ef7b9ed70.slice - libcontainer container kubepods-burstable-pod5a91e11390d62b878c5b871ef7b9ed70.slice. 
Mar 13 00:33:54.823798 kubelet[2421]: E0313 00:33:54.823757 2421 kubelet.go:3216] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-4-n-bcba60e63d\" not found" node="ci-4459-2-4-n-bcba60e63d" Mar 13 00:33:54.829683 systemd[1]: Created slice kubepods-burstable-pod2ecefdde302235cd2e6a23a4eae31b0a.slice - libcontainer container kubepods-burstable-pod2ecefdde302235cd2e6a23a4eae31b0a.slice. Mar 13 00:33:54.832702 kubelet[2421]: E0313 00:33:54.832621 2421 kubelet.go:3216] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-4-n-bcba60e63d\" not found" node="ci-4459-2-4-n-bcba60e63d" Mar 13 00:33:54.838250 kubelet[2421]: I0313 00:33:54.838203 2421 kubelet_node_status.go:75] "Attempting to register node" node="ci-4459-2-4-n-bcba60e63d" Mar 13 00:33:54.838579 kubelet[2421]: E0313 00:33:54.838475 2421 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://157.180.117.151:6443/api/v1/nodes\": dial tcp 157.180.117.151:6443: connect: connection refused" node="ci-4459-2-4-n-bcba60e63d" Mar 13 00:33:54.872388 kubelet[2421]: E0313 00:33:54.872280 2421 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://157.180.117.151:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4459-2-4-n-bcba60e63d?timeout=10s\": dial tcp 157.180.117.151:6443: connect: connection refused" interval="400ms" Mar 13 00:33:54.971107 kubelet[2421]: I0313 00:33:54.970968 2421 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/5a91e11390d62b878c5b871ef7b9ed70-ca-certs\") pod \"kube-controller-manager-ci-4459-2-4-n-bcba60e63d\" (UID: \"5a91e11390d62b878c5b871ef7b9ed70\") " pod="kube-system/kube-controller-manager-ci-4459-2-4-n-bcba60e63d" Mar 13 00:33:54.971107 kubelet[2421]: I0313 00:33:54.971019 2421 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/5a91e11390d62b878c5b871ef7b9ed70-flexvolume-dir\") pod \"kube-controller-manager-ci-4459-2-4-n-bcba60e63d\" (UID: \"5a91e11390d62b878c5b871ef7b9ed70\") " pod="kube-system/kube-controller-manager-ci-4459-2-4-n-bcba60e63d" Mar 13 00:33:54.971347 kubelet[2421]: I0313 00:33:54.971141 2421 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/5a91e11390d62b878c5b871ef7b9ed70-k8s-certs\") pod \"kube-controller-manager-ci-4459-2-4-n-bcba60e63d\" (UID: \"5a91e11390d62b878c5b871ef7b9ed70\") " pod="kube-system/kube-controller-manager-ci-4459-2-4-n-bcba60e63d" Mar 13 00:33:54.971347 kubelet[2421]: I0313 00:33:54.971245 2421 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/5a91e11390d62b878c5b871ef7b9ed70-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4459-2-4-n-bcba60e63d\" (UID: \"5a91e11390d62b878c5b871ef7b9ed70\") " pod="kube-system/kube-controller-manager-ci-4459-2-4-n-bcba60e63d" Mar 13 00:33:54.971347 kubelet[2421]: I0313 00:33:54.971276 2421 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/4386427b1588f042cb1f28dbcb739b96-k8s-certs\") pod \"kube-apiserver-ci-4459-2-4-n-bcba60e63d\" (UID: \"4386427b1588f042cb1f28dbcb739b96\") " pod="kube-system/kube-apiserver-ci-4459-2-4-n-bcba60e63d" Mar 13 00:33:54.971463 kubelet[2421]: I0313 00:33:54.971350 2421 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/5a91e11390d62b878c5b871ef7b9ed70-kubeconfig\") pod \"kube-controller-manager-ci-4459-2-4-n-bcba60e63d\" (UID: 
\"5a91e11390d62b878c5b871ef7b9ed70\") " pod="kube-system/kube-controller-manager-ci-4459-2-4-n-bcba60e63d" Mar 13 00:33:54.971463 kubelet[2421]: I0313 00:33:54.971416 2421 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/2ecefdde302235cd2e6a23a4eae31b0a-kubeconfig\") pod \"kube-scheduler-ci-4459-2-4-n-bcba60e63d\" (UID: \"2ecefdde302235cd2e6a23a4eae31b0a\") " pod="kube-system/kube-scheduler-ci-4459-2-4-n-bcba60e63d" Mar 13 00:33:54.971553 kubelet[2421]: I0313 00:33:54.971464 2421 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/4386427b1588f042cb1f28dbcb739b96-ca-certs\") pod \"kube-apiserver-ci-4459-2-4-n-bcba60e63d\" (UID: \"4386427b1588f042cb1f28dbcb739b96\") " pod="kube-system/kube-apiserver-ci-4459-2-4-n-bcba60e63d" Mar 13 00:33:54.971553 kubelet[2421]: I0313 00:33:54.971536 2421 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/4386427b1588f042cb1f28dbcb739b96-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4459-2-4-n-bcba60e63d\" (UID: \"4386427b1588f042cb1f28dbcb739b96\") " pod="kube-system/kube-apiserver-ci-4459-2-4-n-bcba60e63d" Mar 13 00:33:55.041784 kubelet[2421]: I0313 00:33:55.041711 2421 kubelet_node_status.go:75] "Attempting to register node" node="ci-4459-2-4-n-bcba60e63d" Mar 13 00:33:55.042203 kubelet[2421]: E0313 00:33:55.042164 2421 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://157.180.117.151:6443/api/v1/nodes\": dial tcp 157.180.117.151:6443: connect: connection refused" node="ci-4459-2-4-n-bcba60e63d" Mar 13 00:33:55.112913 containerd[1635]: time="2026-03-13T00:33:55.112841931Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:kube-apiserver-ci-4459-2-4-n-bcba60e63d,Uid:4386427b1588f042cb1f28dbcb739b96,Namespace:kube-system,Attempt:0,}" Mar 13 00:33:55.128292 containerd[1635]: time="2026-03-13T00:33:55.128148357Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4459-2-4-n-bcba60e63d,Uid:5a91e11390d62b878c5b871ef7b9ed70,Namespace:kube-system,Attempt:0,}" Mar 13 00:33:55.136098 containerd[1635]: time="2026-03-13T00:33:55.135957611Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4459-2-4-n-bcba60e63d,Uid:2ecefdde302235cd2e6a23a4eae31b0a,Namespace:kube-system,Attempt:0,}" Mar 13 00:33:55.273833 kubelet[2421]: E0313 00:33:55.273719 2421 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://157.180.117.151:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4459-2-4-n-bcba60e63d?timeout=10s\": dial tcp 157.180.117.151:6443: connect: connection refused" interval="800ms" Mar 13 00:33:55.444457 kubelet[2421]: I0313 00:33:55.444314 2421 kubelet_node_status.go:75] "Attempting to register node" node="ci-4459-2-4-n-bcba60e63d" Mar 13 00:33:55.444633 kubelet[2421]: E0313 00:33:55.444599 2421 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://157.180.117.151:6443/api/v1/nodes\": dial tcp 157.180.117.151:6443: connect: connection refused" node="ci-4459-2-4-n-bcba60e63d" Mar 13 00:33:55.505615 kubelet[2421]: E0313 00:33:55.505550 2421 reflector.go:205] "Failed to watch" err="failed to list *v1.Node: Get \"https://157.180.117.151:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4459-2-4-n-bcba60e63d&limit=500&resourceVersion=0\": dial tcp 157.180.117.151:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Mar 13 00:33:55.592399 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount173540961.mount: Deactivated successfully. 
Mar 13 00:33:55.602956 containerd[1635]: time="2026-03-13T00:33:55.602889225Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Mar 13 00:33:55.607258 containerd[1635]: time="2026-03-13T00:33:55.607149937Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=321160" Mar 13 00:33:55.609594 containerd[1635]: time="2026-03-13T00:33:55.609556588Z" level=info msg="ImageCreate event name:\"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Mar 13 00:33:55.612692 containerd[1635]: time="2026-03-13T00:33:55.612503969Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Mar 13 00:33:55.613595 containerd[1635]: time="2026-03-13T00:33:55.613550700Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=0" Mar 13 00:33:55.614623 containerd[1635]: time="2026-03-13T00:33:55.614579440Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Mar 13 00:33:55.616251 containerd[1635]: time="2026-03-13T00:33:55.616219131Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Mar 13 00:33:55.617851 containerd[1635]: time="2026-03-13T00:33:55.617215001Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id 
\"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 502.350219ms" Mar 13 00:33:55.617851 containerd[1635]: time="2026-03-13T00:33:55.617329881Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=0" Mar 13 00:33:55.624032 containerd[1635]: time="2026-03-13T00:33:55.623986524Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 486.377493ms" Mar 13 00:33:55.625541 containerd[1635]: time="2026-03-13T00:33:55.625476735Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 495.655447ms" Mar 13 00:33:55.657630 kubelet[2421]: E0313 00:33:55.657579 2421 reflector.go:205] "Failed to watch" err="failed to list *v1.RuntimeClass: Get \"https://157.180.117.151:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 157.180.117.151:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" Mar 13 00:33:55.666527 containerd[1635]: time="2026-03-13T00:33:55.666299162Z" level=info msg="connecting to shim 3bc70e961d9540f5543230bf99e2ce941885bad1771f7fe9d22bb73e32bd9083" address="unix:///run/containerd/s/b38096db0b540fdc1cdc7e3bca51dd3513b4369078de6aaefe796b2078cb034a" namespace=k8s.io protocol=ttrpc version=3 Mar 13 
00:33:55.668663 containerd[1635]: time="2026-03-13T00:33:55.668627003Z" level=info msg="connecting to shim 10513e33c823dfcc620f1a742664acb0600348b85fe0c8869c2ff6d7a5c9facb" address="unix:///run/containerd/s/823ce0d920924546cfad431aaf456d3d500f5f5cb4f7ca4fd26dcac365e4840c" namespace=k8s.io protocol=ttrpc version=3 Mar 13 00:33:55.673897 containerd[1635]: time="2026-03-13T00:33:55.673866695Z" level=info msg="connecting to shim 045702106dbddad2202b15837562c871c2c34091f39d6a9b519ccedcf04338c7" address="unix:///run/containerd/s/e450c08987ad390712ecc9a6b334dcc0151958b409b2fa4357d2ca8361f493a5" namespace=k8s.io protocol=ttrpc version=3 Mar 13 00:33:55.699762 systemd[1]: Started cri-containerd-045702106dbddad2202b15837562c871c2c34091f39d6a9b519ccedcf04338c7.scope - libcontainer container 045702106dbddad2202b15837562c871c2c34091f39d6a9b519ccedcf04338c7. Mar 13 00:33:55.701523 systemd[1]: Started cri-containerd-3bc70e961d9540f5543230bf99e2ce941885bad1771f7fe9d22bb73e32bd9083.scope - libcontainer container 3bc70e961d9540f5543230bf99e2ce941885bad1771f7fe9d22bb73e32bd9083. Mar 13 00:33:55.704898 systemd[1]: Started cri-containerd-10513e33c823dfcc620f1a742664acb0600348b85fe0c8869c2ff6d7a5c9facb.scope - libcontainer container 10513e33c823dfcc620f1a742664acb0600348b85fe0c8869c2ff6d7a5c9facb. 
Mar 13 00:33:55.752345 containerd[1635]: time="2026-03-13T00:33:55.752284407Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4459-2-4-n-bcba60e63d,Uid:2ecefdde302235cd2e6a23a4eae31b0a,Namespace:kube-system,Attempt:0,} returns sandbox id \"045702106dbddad2202b15837562c871c2c34091f39d6a9b519ccedcf04338c7\"" Mar 13 00:33:55.757048 containerd[1635]: time="2026-03-13T00:33:55.757025519Z" level=info msg="CreateContainer within sandbox \"045702106dbddad2202b15837562c871c2c34091f39d6a9b519ccedcf04338c7\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Mar 13 00:33:55.768719 containerd[1635]: time="2026-03-13T00:33:55.768692584Z" level=info msg="Container 163d4ee38a5e09fc1a8dad99026cb14a6cec329b41803f3100049974d82c78fb: CDI devices from CRI Config.CDIDevices: []" Mar 13 00:33:55.775507 containerd[1635]: time="2026-03-13T00:33:55.775420467Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4459-2-4-n-bcba60e63d,Uid:5a91e11390d62b878c5b871ef7b9ed70,Namespace:kube-system,Attempt:0,} returns sandbox id \"10513e33c823dfcc620f1a742664acb0600348b85fe0c8869c2ff6d7a5c9facb\"" Mar 13 00:33:55.776106 containerd[1635]: time="2026-03-13T00:33:55.776092757Z" level=info msg="CreateContainer within sandbox \"045702106dbddad2202b15837562c871c2c34091f39d6a9b519ccedcf04338c7\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"163d4ee38a5e09fc1a8dad99026cb14a6cec329b41803f3100049974d82c78fb\"" Mar 13 00:33:55.777053 containerd[1635]: time="2026-03-13T00:33:55.777039298Z" level=info msg="StartContainer for \"163d4ee38a5e09fc1a8dad99026cb14a6cec329b41803f3100049974d82c78fb\"" Mar 13 00:33:55.778363 containerd[1635]: time="2026-03-13T00:33:55.778250138Z" level=info msg="connecting to shim 163d4ee38a5e09fc1a8dad99026cb14a6cec329b41803f3100049974d82c78fb" address="unix:///run/containerd/s/e450c08987ad390712ecc9a6b334dcc0151958b409b2fa4357d2ca8361f493a5" protocol=ttrpc version=3 Mar 13 
00:33:55.779038 containerd[1635]: time="2026-03-13T00:33:55.779024488Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4459-2-4-n-bcba60e63d,Uid:4386427b1588f042cb1f28dbcb739b96,Namespace:kube-system,Attempt:0,} returns sandbox id \"3bc70e961d9540f5543230bf99e2ce941885bad1771f7fe9d22bb73e32bd9083\"" Mar 13 00:33:55.784546 containerd[1635]: time="2026-03-13T00:33:55.784526071Z" level=info msg="CreateContainer within sandbox \"10513e33c823dfcc620f1a742664acb0600348b85fe0c8869c2ff6d7a5c9facb\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Mar 13 00:33:55.785829 containerd[1635]: time="2026-03-13T00:33:55.785729651Z" level=info msg="CreateContainer within sandbox \"3bc70e961d9540f5543230bf99e2ce941885bad1771f7fe9d22bb73e32bd9083\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Mar 13 00:33:55.793967 containerd[1635]: time="2026-03-13T00:33:55.793754395Z" level=info msg="Container a230832493d975dc2294a1a795812e97fb49e31f405b48b8eb56fb9b457939c9: CDI devices from CRI Config.CDIDevices: []" Mar 13 00:33:55.796001 containerd[1635]: time="2026-03-13T00:33:55.795550385Z" level=info msg="Container 56ca4d41103caa85de2104dbb6f90fdea407945a6a0a867aaaa08c41c0be5199: CDI devices from CRI Config.CDIDevices: []" Mar 13 00:33:55.799868 systemd[1]: Started cri-containerd-163d4ee38a5e09fc1a8dad99026cb14a6cec329b41803f3100049974d82c78fb.scope - libcontainer container 163d4ee38a5e09fc1a8dad99026cb14a6cec329b41803f3100049974d82c78fb. 
Mar 13 00:33:55.803757 containerd[1635]: time="2026-03-13T00:33:55.803737759Z" level=info msg="CreateContainer within sandbox \"10513e33c823dfcc620f1a742664acb0600348b85fe0c8869c2ff6d7a5c9facb\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"a230832493d975dc2294a1a795812e97fb49e31f405b48b8eb56fb9b457939c9\"" Mar 13 00:33:55.804022 containerd[1635]: time="2026-03-13T00:33:55.803861309Z" level=info msg="CreateContainer within sandbox \"3bc70e961d9540f5543230bf99e2ce941885bad1771f7fe9d22bb73e32bd9083\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"56ca4d41103caa85de2104dbb6f90fdea407945a6a0a867aaaa08c41c0be5199\"" Mar 13 00:33:55.804151 containerd[1635]: time="2026-03-13T00:33:55.804119949Z" level=info msg="StartContainer for \"a230832493d975dc2294a1a795812e97fb49e31f405b48b8eb56fb9b457939c9\"" Mar 13 00:33:55.804303 containerd[1635]: time="2026-03-13T00:33:55.804291499Z" level=info msg="StartContainer for \"56ca4d41103caa85de2104dbb6f90fdea407945a6a0a867aaaa08c41c0be5199\"" Mar 13 00:33:55.805021 containerd[1635]: time="2026-03-13T00:33:55.805002639Z" level=info msg="connecting to shim a230832493d975dc2294a1a795812e97fb49e31f405b48b8eb56fb9b457939c9" address="unix:///run/containerd/s/823ce0d920924546cfad431aaf456d3d500f5f5cb4f7ca4fd26dcac365e4840c" protocol=ttrpc version=3 Mar 13 00:33:55.805191 containerd[1635]: time="2026-03-13T00:33:55.805177309Z" level=info msg="connecting to shim 56ca4d41103caa85de2104dbb6f90fdea407945a6a0a867aaaa08c41c0be5199" address="unix:///run/containerd/s/b38096db0b540fdc1cdc7e3bca51dd3513b4369078de6aaefe796b2078cb034a" protocol=ttrpc version=3 Mar 13 00:33:55.827756 systemd[1]: Started cri-containerd-a230832493d975dc2294a1a795812e97fb49e31f405b48b8eb56fb9b457939c9.scope - libcontainer container a230832493d975dc2294a1a795812e97fb49e31f405b48b8eb56fb9b457939c9. 
Mar 13 00:33:55.835972 systemd[1]: Started cri-containerd-56ca4d41103caa85de2104dbb6f90fdea407945a6a0a867aaaa08c41c0be5199.scope - libcontainer container 56ca4d41103caa85de2104dbb6f90fdea407945a6a0a867aaaa08c41c0be5199. Mar 13 00:33:55.880515 containerd[1635]: time="2026-03-13T00:33:55.880483401Z" level=info msg="StartContainer for \"163d4ee38a5e09fc1a8dad99026cb14a6cec329b41803f3100049974d82c78fb\" returns successfully" Mar 13 00:33:55.907600 containerd[1635]: time="2026-03-13T00:33:55.907571742Z" level=info msg="StartContainer for \"a230832493d975dc2294a1a795812e97fb49e31f405b48b8eb56fb9b457939c9\" returns successfully" Mar 13 00:33:55.911582 containerd[1635]: time="2026-03-13T00:33:55.911556894Z" level=info msg="StartContainer for \"56ca4d41103caa85de2104dbb6f90fdea407945a6a0a867aaaa08c41c0be5199\" returns successfully" Mar 13 00:33:55.928229 kubelet[2421]: E0313 00:33:55.928187 2421 reflector.go:205] "Failed to watch" err="failed to list *v1.Service: Get \"https://157.180.117.151:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 157.180.117.151:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Mar 13 00:33:56.247725 kubelet[2421]: I0313 00:33:56.247695 2421 kubelet_node_status.go:75] "Attempting to register node" node="ci-4459-2-4-n-bcba60e63d" Mar 13 00:33:56.701364 kubelet[2421]: E0313 00:33:56.701335 2421 kubelet.go:3216] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-4-n-bcba60e63d\" not found" node="ci-4459-2-4-n-bcba60e63d" Mar 13 00:33:56.702739 kubelet[2421]: E0313 00:33:56.702724 2421 kubelet.go:3216] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-4-n-bcba60e63d\" not found" node="ci-4459-2-4-n-bcba60e63d" Mar 13 00:33:56.704863 kubelet[2421]: E0313 00:33:56.704847 2421 kubelet.go:3216] "No need to create a 
mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-4-n-bcba60e63d\" not found" node="ci-4459-2-4-n-bcba60e63d" Mar 13 00:33:57.037922 kubelet[2421]: E0313 00:33:57.037835 2421 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ci-4459-2-4-n-bcba60e63d\" not found" node="ci-4459-2-4-n-bcba60e63d" Mar 13 00:33:57.120691 kubelet[2421]: I0313 00:33:57.120661 2421 kubelet_node_status.go:78] "Successfully registered node" node="ci-4459-2-4-n-bcba60e63d" Mar 13 00:33:57.120691 kubelet[2421]: E0313 00:33:57.120691 2421 kubelet_node_status.go:486] "Error updating node status, will retry" err="error getting node \"ci-4459-2-4-n-bcba60e63d\": node \"ci-4459-2-4-n-bcba60e63d\" not found" Mar 13 00:33:57.141158 kubelet[2421]: E0313 00:33:57.141135 2421 kubelet_node_status.go:404] "Error getting the current node from lister" err="node \"ci-4459-2-4-n-bcba60e63d\" not found" Mar 13 00:33:57.241463 kubelet[2421]: E0313 00:33:57.241407 2421 kubelet_node_status.go:404] "Error getting the current node from lister" err="node \"ci-4459-2-4-n-bcba60e63d\" not found" Mar 13 00:33:57.342150 kubelet[2421]: E0313 00:33:57.342108 2421 kubelet_node_status.go:404] "Error getting the current node from lister" err="node \"ci-4459-2-4-n-bcba60e63d\" not found" Mar 13 00:33:57.443037 kubelet[2421]: E0313 00:33:57.442971 2421 kubelet_node_status.go:404] "Error getting the current node from lister" err="node \"ci-4459-2-4-n-bcba60e63d\" not found" Mar 13 00:33:57.543503 kubelet[2421]: E0313 00:33:57.543436 2421 kubelet_node_status.go:404] "Error getting the current node from lister" err="node \"ci-4459-2-4-n-bcba60e63d\" not found" Mar 13 00:33:57.644054 kubelet[2421]: E0313 00:33:57.643883 2421 kubelet_node_status.go:404] "Error getting the current node from lister" err="node \"ci-4459-2-4-n-bcba60e63d\" not found" Mar 13 00:33:57.709011 kubelet[2421]: E0313 00:33:57.708610 2421 kubelet.go:3216] "No need to create a 
mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-4-n-bcba60e63d\" not found" node="ci-4459-2-4-n-bcba60e63d" Mar 13 00:33:57.709011 kubelet[2421]: E0313 00:33:57.708808 2421 kubelet.go:3216] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-4-n-bcba60e63d\" not found" node="ci-4459-2-4-n-bcba60e63d" Mar 13 00:33:57.744305 kubelet[2421]: E0313 00:33:57.744237 2421 kubelet_node_status.go:404] "Error getting the current node from lister" err="node \"ci-4459-2-4-n-bcba60e63d\" not found" Mar 13 00:33:57.845376 kubelet[2421]: E0313 00:33:57.845309 2421 kubelet_node_status.go:404] "Error getting the current node from lister" err="node \"ci-4459-2-4-n-bcba60e63d\" not found" Mar 13 00:33:57.945598 kubelet[2421]: E0313 00:33:57.945420 2421 kubelet_node_status.go:404] "Error getting the current node from lister" err="node \"ci-4459-2-4-n-bcba60e63d\" not found" Mar 13 00:33:58.046102 kubelet[2421]: E0313 00:33:58.046027 2421 kubelet_node_status.go:404] "Error getting the current node from lister" err="node \"ci-4459-2-4-n-bcba60e63d\" not found" Mar 13 00:33:58.146526 kubelet[2421]: E0313 00:33:58.146284 2421 kubelet_node_status.go:404] "Error getting the current node from lister" err="node \"ci-4459-2-4-n-bcba60e63d\" not found" Mar 13 00:33:58.246927 kubelet[2421]: E0313 00:33:58.246838 2421 kubelet_node_status.go:404] "Error getting the current node from lister" err="node \"ci-4459-2-4-n-bcba60e63d\" not found" Mar 13 00:33:58.347819 kubelet[2421]: E0313 00:33:58.347775 2421 kubelet_node_status.go:404] "Error getting the current node from lister" err="node \"ci-4459-2-4-n-bcba60e63d\" not found" Mar 13 00:33:58.448510 kubelet[2421]: E0313 00:33:58.448457 2421 kubelet_node_status.go:404] "Error getting the current node from lister" err="node \"ci-4459-2-4-n-bcba60e63d\" not found" Mar 13 00:33:58.548889 kubelet[2421]: E0313 00:33:58.548743 2421 kubelet_node_status.go:404] "Error getting 
the current node from lister" err="node \"ci-4459-2-4-n-bcba60e63d\" not found" Mar 13 00:33:58.652981 kubelet[2421]: I0313 00:33:58.652945 2421 apiserver.go:52] "Watching apiserver" Mar 13 00:33:58.667251 kubelet[2421]: I0313 00:33:58.667198 2421 kubelet.go:3220] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4459-2-4-n-bcba60e63d" Mar 13 00:33:58.671061 kubelet[2421]: I0313 00:33:58.670993 2421 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Mar 13 00:33:58.675759 kubelet[2421]: I0313 00:33:58.675636 2421 kubelet.go:3220] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4459-2-4-n-bcba60e63d" Mar 13 00:33:58.680684 kubelet[2421]: I0313 00:33:58.680307 2421 kubelet.go:3220] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4459-2-4-n-bcba60e63d" Mar 13 00:33:59.021905 systemd[1]: Reload requested from client PID 2703 ('systemctl') (unit session-7.scope)... Mar 13 00:33:59.021935 systemd[1]: Reloading... Mar 13 00:33:59.114714 zram_generator::config[2747]: No configuration found. Mar 13 00:33:59.315428 systemd[1]: Reloading finished in 292 ms. Mar 13 00:33:59.350410 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Mar 13 00:33:59.369732 systemd[1]: kubelet.service: Deactivated successfully. Mar 13 00:33:59.370154 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Mar 13 00:33:59.370273 systemd[1]: kubelet.service: Consumed 909ms CPU time, 125.3M memory peak. Mar 13 00:33:59.372308 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 13 00:33:59.540425 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Mar 13 00:33:59.549510 (kubelet)[2798]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Mar 13 00:33:59.585741 kubelet[2798]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Mar 13 00:33:59.586071 kubelet[2798]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 13 00:33:59.586160 kubelet[2798]: I0313 00:33:59.586142 2798 server.go:213] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Mar 13 00:33:59.596840 kubelet[2798]: I0313 00:33:59.596816 2798 server.go:529] "Kubelet version" kubeletVersion="v1.34.4" Mar 13 00:33:59.596960 kubelet[2798]: I0313 00:33:59.596953 2798 server.go:531] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Mar 13 00:33:59.597026 kubelet[2798]: I0313 00:33:59.597019 2798 watchdog_linux.go:95] "Systemd watchdog is not enabled" Mar 13 00:33:59.597068 kubelet[2798]: I0313 00:33:59.597061 2798 watchdog_linux.go:137] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Mar 13 00:33:59.597226 kubelet[2798]: I0313 00:33:59.597217 2798 server.go:956] "Client rotation is on, will bootstrap in background" Mar 13 00:33:59.598354 kubelet[2798]: I0313 00:33:59.598333 2798 certificate_store.go:147] "Loading cert/key pair from a file" filePath="/var/lib/kubelet/pki/kubelet-client-current.pem" Mar 13 00:33:59.599747 kubelet[2798]: I0313 00:33:59.599734 2798 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Mar 13 00:33:59.603144 kubelet[2798]: I0313 00:33:59.603078 2798 server.go:1423] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Mar 13 00:33:59.606365 kubelet[2798]: I0313 00:33:59.606345 2798 server.go:781] "--cgroups-per-qos enabled, but --cgroup-root was not specified. Defaulting to /" Mar 13 00:33:59.606613 kubelet[2798]: I0313 00:33:59.606583 2798 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Mar 13 00:33:59.606978 kubelet[2798]: I0313 00:33:59.606606 2798 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" 
nodeConfig={"NodeName":"ci-4459-2-4-n-bcba60e63d","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Mar 13 00:33:59.607119 kubelet[2798]: I0313 00:33:59.606975 2798 topology_manager.go:138] "Creating topology manager with none policy" Mar 13 00:33:59.607119 kubelet[2798]: I0313 00:33:59.607113 2798 container_manager_linux.go:306] "Creating device plugin manager" Mar 13 00:33:59.607159 kubelet[2798]: I0313 00:33:59.607134 2798 container_manager_linux.go:315] "Creating Dynamic Resource Allocation (DRA) manager" Mar 13 00:33:59.608206 kubelet[2798]: I0313 00:33:59.608180 2798 
state_mem.go:36] "Initialized new in-memory state store" Mar 13 00:33:59.609690 kubelet[2798]: I0313 00:33:59.608480 2798 kubelet.go:475] "Attempting to sync node with API server" Mar 13 00:33:59.609690 kubelet[2798]: I0313 00:33:59.608504 2798 kubelet.go:376] "Adding static pod path" path="/etc/kubernetes/manifests" Mar 13 00:33:59.609690 kubelet[2798]: I0313 00:33:59.608544 2798 kubelet.go:387] "Adding apiserver pod source" Mar 13 00:33:59.609690 kubelet[2798]: I0313 00:33:59.608555 2798 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Mar 13 00:33:59.610913 kubelet[2798]: I0313 00:33:59.610779 2798 kuberuntime_manager.go:291] "Container runtime initialized" containerRuntime="containerd" version="v2.0.7" apiVersion="v1" Mar 13 00:33:59.611733 kubelet[2798]: I0313 00:33:59.611713 2798 kubelet.go:940] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Mar 13 00:33:59.611854 kubelet[2798]: I0313 00:33:59.611839 2798 kubelet.go:964] "Not starting PodCertificateRequest manager because we are in static kubelet mode or the PodCertificateProjection feature gate is disabled" Mar 13 00:33:59.618707 kubelet[2798]: I0313 00:33:59.617969 2798 server.go:1262] "Started kubelet" Mar 13 00:33:59.619241 kubelet[2798]: I0313 00:33:59.619228 2798 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Mar 13 00:33:59.621877 kubelet[2798]: I0313 00:33:59.621860 2798 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Mar 13 00:33:59.622557 kubelet[2798]: I0313 00:33:59.622546 2798 server.go:310] "Adding debug handlers to kubelet server" Mar 13 00:33:59.628359 kubelet[2798]: I0313 00:33:59.628314 2798 ratelimit.go:56] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Mar 13 00:33:59.628406 kubelet[2798]: I0313 00:33:59.628371 2798 server_v1.go:49] "podresources" method="list" useActivePods=true Mar 13 00:33:59.628638 
kubelet[2798]: I0313 00:33:59.628618 2798 server.go:249] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Mar 13 00:33:59.630767 kubelet[2798]: I0313 00:33:59.630733 2798 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Mar 13 00:33:59.631781 kubelet[2798]: I0313 00:33:59.631770 2798 volume_manager.go:313] "Starting Kubelet Volume Manager" Mar 13 00:33:59.632912 kubelet[2798]: I0313 00:33:59.632892 2798 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Mar 13 00:33:59.635080 kubelet[2798]: E0313 00:33:59.635052 2798 kubelet.go:1615] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Mar 13 00:33:59.635271 kubelet[2798]: I0313 00:33:59.635253 2798 factory.go:223] Registration of the systemd container factory successfully Mar 13 00:33:59.635363 kubelet[2798]: I0313 00:33:59.635343 2798 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Mar 13 00:33:59.635876 kubelet[2798]: I0313 00:33:59.635866 2798 reconciler.go:29] "Reconciler: start to sync state" Mar 13 00:33:59.636885 kubelet[2798]: I0313 00:33:59.636850 2798 factory.go:223] Registration of the containerd container factory successfully Mar 13 00:33:59.653465 kubelet[2798]: I0313 00:33:59.653343 2798 kubelet_network_linux.go:54] "Initialized iptables rules." protocol="IPv4" Mar 13 00:33:59.657242 kubelet[2798]: I0313 00:33:59.657228 2798 kubelet_network_linux.go:54] "Initialized iptables rules." 
protocol="IPv6" Mar 13 00:33:59.657755 kubelet[2798]: I0313 00:33:59.657412 2798 status_manager.go:244] "Starting to sync pod status with apiserver" Mar 13 00:33:59.657755 kubelet[2798]: I0313 00:33:59.657435 2798 kubelet.go:2428] "Starting kubelet main sync loop" Mar 13 00:33:59.657755 kubelet[2798]: E0313 00:33:59.657472 2798 kubelet.go:2452] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Mar 13 00:33:59.680616 kubelet[2798]: I0313 00:33:59.680598 2798 cpu_manager.go:221] "Starting CPU manager" policy="none" Mar 13 00:33:59.680770 kubelet[2798]: I0313 00:33:59.680761 2798 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Mar 13 00:33:59.680821 kubelet[2798]: I0313 00:33:59.680815 2798 state_mem.go:36] "Initialized new in-memory state store" Mar 13 00:33:59.680942 kubelet[2798]: I0313 00:33:59.680934 2798 state_mem.go:88] "Updated default CPUSet" cpuSet="" Mar 13 00:33:59.680982 kubelet[2798]: I0313 00:33:59.680969 2798 state_mem.go:96] "Updated CPUSet assignments" assignments={} Mar 13 00:33:59.681009 kubelet[2798]: I0313 00:33:59.681004 2798 policy_none.go:49] "None policy: Start" Mar 13 00:33:59.681852 kubelet[2798]: I0313 00:33:59.681041 2798 memory_manager.go:187] "Starting memorymanager" policy="None" Mar 13 00:33:59.681852 kubelet[2798]: I0313 00:33:59.681050 2798 state_mem.go:36] "Initializing new in-memory state store" logger="Memory Manager state checkpoint" Mar 13 00:33:59.681852 kubelet[2798]: I0313 00:33:59.681121 2798 state_mem.go:77] "Updated machine memory state" logger="Memory Manager state checkpoint" Mar 13 00:33:59.681852 kubelet[2798]: I0313 00:33:59.681127 2798 policy_none.go:47] "Start" Mar 13 00:33:59.685933 kubelet[2798]: E0313 00:33:59.685922 2798 manager.go:513] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Mar 13 00:33:59.686187 kubelet[2798]: I0313 00:33:59.686106 
2798 eviction_manager.go:189] "Eviction manager: starting control loop" Mar 13 00:33:59.686246 kubelet[2798]: I0313 00:33:59.686227 2798 container_log_manager.go:146] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 13 00:33:59.686503 kubelet[2798]: I0313 00:33:59.686492 2798 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Mar 13 00:33:59.687915 kubelet[2798]: E0313 00:33:59.687899 2798 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime" Mar 13 00:33:59.758371 kubelet[2798]: I0313 00:33:59.758331 2798 kubelet.go:3220] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4459-2-4-n-bcba60e63d" Mar 13 00:33:59.758522 kubelet[2798]: I0313 00:33:59.758506 2798 kubelet.go:3220] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4459-2-4-n-bcba60e63d" Mar 13 00:33:59.758749 kubelet[2798]: I0313 00:33:59.758700 2798 kubelet.go:3220] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4459-2-4-n-bcba60e63d" Mar 13 00:33:59.764953 kubelet[2798]: E0313 00:33:59.764927 2798 kubelet.go:3222] "Failed creating a mirror pod" err="pods \"kube-controller-manager-ci-4459-2-4-n-bcba60e63d\" already exists" pod="kube-system/kube-controller-manager-ci-4459-2-4-n-bcba60e63d" Mar 13 00:33:59.765844 kubelet[2798]: E0313 00:33:59.765795 2798 kubelet.go:3222] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4459-2-4-n-bcba60e63d\" already exists" pod="kube-system/kube-scheduler-ci-4459-2-4-n-bcba60e63d" Mar 13 00:33:59.765844 kubelet[2798]: E0313 00:33:59.765836 2798 kubelet.go:3222] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4459-2-4-n-bcba60e63d\" already exists" pod="kube-system/kube-apiserver-ci-4459-2-4-n-bcba60e63d" Mar 13 00:33:59.793288 kubelet[2798]: I0313 00:33:59.793231 2798 kubelet_node_status.go:75] "Attempting to register node" 
node="ci-4459-2-4-n-bcba60e63d" Mar 13 00:33:59.803167 kubelet[2798]: I0313 00:33:59.803107 2798 kubelet_node_status.go:124] "Node was previously registered" node="ci-4459-2-4-n-bcba60e63d" Mar 13 00:33:59.803312 kubelet[2798]: I0313 00:33:59.803269 2798 kubelet_node_status.go:78] "Successfully registered node" node="ci-4459-2-4-n-bcba60e63d" Mar 13 00:33:59.837730 kubelet[2798]: I0313 00:33:59.837494 2798 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/2ecefdde302235cd2e6a23a4eae31b0a-kubeconfig\") pod \"kube-scheduler-ci-4459-2-4-n-bcba60e63d\" (UID: \"2ecefdde302235cd2e6a23a4eae31b0a\") " pod="kube-system/kube-scheduler-ci-4459-2-4-n-bcba60e63d" Mar 13 00:33:59.837730 kubelet[2798]: I0313 00:33:59.837551 2798 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/4386427b1588f042cb1f28dbcb739b96-k8s-certs\") pod \"kube-apiserver-ci-4459-2-4-n-bcba60e63d\" (UID: \"4386427b1588f042cb1f28dbcb739b96\") " pod="kube-system/kube-apiserver-ci-4459-2-4-n-bcba60e63d" Mar 13 00:33:59.837730 kubelet[2798]: I0313 00:33:59.837598 2798 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/4386427b1588f042cb1f28dbcb739b96-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4459-2-4-n-bcba60e63d\" (UID: \"4386427b1588f042cb1f28dbcb739b96\") " pod="kube-system/kube-apiserver-ci-4459-2-4-n-bcba60e63d" Mar 13 00:33:59.837730 kubelet[2798]: I0313 00:33:59.837624 2798 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/5a91e11390d62b878c5b871ef7b9ed70-ca-certs\") pod \"kube-controller-manager-ci-4459-2-4-n-bcba60e63d\" (UID: \"5a91e11390d62b878c5b871ef7b9ed70\") " 
pod="kube-system/kube-controller-manager-ci-4459-2-4-n-bcba60e63d" Mar 13 00:33:59.837730 kubelet[2798]: I0313 00:33:59.837678 2798 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/5a91e11390d62b878c5b871ef7b9ed70-flexvolume-dir\") pod \"kube-controller-manager-ci-4459-2-4-n-bcba60e63d\" (UID: \"5a91e11390d62b878c5b871ef7b9ed70\") " pod="kube-system/kube-controller-manager-ci-4459-2-4-n-bcba60e63d" Mar 13 00:33:59.838069 kubelet[2798]: I0313 00:33:59.837701 2798 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/5a91e11390d62b878c5b871ef7b9ed70-k8s-certs\") pod \"kube-controller-manager-ci-4459-2-4-n-bcba60e63d\" (UID: \"5a91e11390d62b878c5b871ef7b9ed70\") " pod="kube-system/kube-controller-manager-ci-4459-2-4-n-bcba60e63d" Mar 13 00:33:59.838069 kubelet[2798]: I0313 00:33:59.837722 2798 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/5a91e11390d62b878c5b871ef7b9ed70-kubeconfig\") pod \"kube-controller-manager-ci-4459-2-4-n-bcba60e63d\" (UID: \"5a91e11390d62b878c5b871ef7b9ed70\") " pod="kube-system/kube-controller-manager-ci-4459-2-4-n-bcba60e63d" Mar 13 00:33:59.838069 kubelet[2798]: I0313 00:33:59.837743 2798 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/4386427b1588f042cb1f28dbcb739b96-ca-certs\") pod \"kube-apiserver-ci-4459-2-4-n-bcba60e63d\" (UID: \"4386427b1588f042cb1f28dbcb739b96\") " pod="kube-system/kube-apiserver-ci-4459-2-4-n-bcba60e63d" Mar 13 00:33:59.838069 kubelet[2798]: I0313 00:33:59.837765 2798 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: 
\"kubernetes.io/host-path/5a91e11390d62b878c5b871ef7b9ed70-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4459-2-4-n-bcba60e63d\" (UID: \"5a91e11390d62b878c5b871ef7b9ed70\") " pod="kube-system/kube-controller-manager-ci-4459-2-4-n-bcba60e63d" Mar 13 00:34:00.617259 kubelet[2798]: I0313 00:34:00.617178 2798 apiserver.go:52] "Watching apiserver" Mar 13 00:34:00.633561 kubelet[2798]: I0313 00:34:00.633517 2798 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Mar 13 00:34:00.674415 kubelet[2798]: I0313 00:34:00.674308 2798 kubelet.go:3220] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4459-2-4-n-bcba60e63d" Mar 13 00:34:00.675462 kubelet[2798]: I0313 00:34:00.675409 2798 kubelet.go:3220] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4459-2-4-n-bcba60e63d" Mar 13 00:34:00.682008 kubelet[2798]: E0313 00:34:00.681971 2798 kubelet.go:3222] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4459-2-4-n-bcba60e63d\" already exists" pod="kube-system/kube-apiserver-ci-4459-2-4-n-bcba60e63d" Mar 13 00:34:00.683679 kubelet[2798]: E0313 00:34:00.682732 2798 kubelet.go:3222] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4459-2-4-n-bcba60e63d\" already exists" pod="kube-system/kube-scheduler-ci-4459-2-4-n-bcba60e63d" Mar 13 00:34:00.694689 kubelet[2798]: I0313 00:34:00.694624 2798 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-ci-4459-2-4-n-bcba60e63d" podStartSLOduration=2.694577796 podStartE2EDuration="2.694577796s" podCreationTimestamp="2026-03-13 00:33:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 00:34:00.694295006 +0000 UTC m=+1.137550155" watchObservedRunningTime="2026-03-13 00:34:00.694577796 +0000 UTC m=+1.137832955" Mar 13 00:34:00.716422 kubelet[2798]: I0313 00:34:00.716331 2798 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ci-4459-2-4-n-bcba60e63d" podStartSLOduration=2.716318255 podStartE2EDuration="2.716318255s" podCreationTimestamp="2026-03-13 00:33:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 00:34:00.706822131 +0000 UTC m=+1.150077290" watchObservedRunningTime="2026-03-13 00:34:00.716318255 +0000 UTC m=+1.159573414" Mar 13 00:34:05.294957 kubelet[2798]: I0313 00:34:05.294838 2798 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-ci-4459-2-4-n-bcba60e63d" podStartSLOduration=7.294800352 podStartE2EDuration="7.294800352s" podCreationTimestamp="2026-03-13 00:33:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 00:34:00.716841285 +0000 UTC m=+1.160096434" watchObservedRunningTime="2026-03-13 00:34:05.294800352 +0000 UTC m=+5.738055511" Mar 13 00:34:05.849314 kubelet[2798]: I0313 00:34:05.849248 2798 kuberuntime_manager.go:1828] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Mar 13 00:34:05.849904 containerd[1635]: time="2026-03-13T00:34:05.849581373Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." Mar 13 00:34:05.850691 kubelet[2798]: I0313 00:34:05.850627 2798 kubelet_network.go:47] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Mar 13 00:34:06.829105 systemd[1]: Created slice kubepods-besteffort-pod4fcf7fbf_7540_40b1_971b_e1d694133266.slice - libcontainer container kubepods-besteffort-pod4fcf7fbf_7540_40b1_971b_e1d694133266.slice. 
Mar 13 00:34:06.882067 kubelet[2798]: I0313 00:34:06.882017 2798 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/4fcf7fbf-7540-40b1-971b-e1d694133266-kube-proxy\") pod \"kube-proxy-spl8c\" (UID: \"4fcf7fbf-7540-40b1-971b-e1d694133266\") " pod="kube-system/kube-proxy-spl8c" Mar 13 00:34:06.882067 kubelet[2798]: I0313 00:34:06.882046 2798 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/4fcf7fbf-7540-40b1-971b-e1d694133266-xtables-lock\") pod \"kube-proxy-spl8c\" (UID: \"4fcf7fbf-7540-40b1-971b-e1d694133266\") " pod="kube-system/kube-proxy-spl8c" Mar 13 00:34:06.882067 kubelet[2798]: I0313 00:34:06.882060 2798 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4fcf7fbf-7540-40b1-971b-e1d694133266-lib-modules\") pod \"kube-proxy-spl8c\" (UID: \"4fcf7fbf-7540-40b1-971b-e1d694133266\") " pod="kube-system/kube-proxy-spl8c" Mar 13 00:34:06.882067 kubelet[2798]: I0313 00:34:06.882072 2798 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-72sdn\" (UniqueName: \"kubernetes.io/projected/4fcf7fbf-7540-40b1-971b-e1d694133266-kube-api-access-72sdn\") pod \"kube-proxy-spl8c\" (UID: \"4fcf7fbf-7540-40b1-971b-e1d694133266\") " pod="kube-system/kube-proxy-spl8c" Mar 13 00:34:07.057984 systemd[1]: Created slice kubepods-besteffort-podeb1ed8ba_da2a_49b9_af33_7ef163436eb6.slice - libcontainer container kubepods-besteffort-podeb1ed8ba_da2a_49b9_af33_7ef163436eb6.slice. 
Mar 13 00:34:07.084118 kubelet[2798]: I0313 00:34:07.084023 2798 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/eb1ed8ba-da2a-49b9-af33-7ef163436eb6-var-lib-calico\") pod \"tigera-operator-5588576f44-v5c7c\" (UID: \"eb1ed8ba-da2a-49b9-af33-7ef163436eb6\") " pod="tigera-operator/tigera-operator-5588576f44-v5c7c" Mar 13 00:34:07.084118 kubelet[2798]: I0313 00:34:07.084052 2798 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k527l\" (UniqueName: \"kubernetes.io/projected/eb1ed8ba-da2a-49b9-af33-7ef163436eb6-kube-api-access-k527l\") pod \"tigera-operator-5588576f44-v5c7c\" (UID: \"eb1ed8ba-da2a-49b9-af33-7ef163436eb6\") " pod="tigera-operator/tigera-operator-5588576f44-v5c7c" Mar 13 00:34:07.142486 containerd[1635]: time="2026-03-13T00:34:07.142427691Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-spl8c,Uid:4fcf7fbf-7540-40b1-971b-e1d694133266,Namespace:kube-system,Attempt:0,}" Mar 13 00:34:07.171692 containerd[1635]: time="2026-03-13T00:34:07.171568853Z" level=info msg="connecting to shim e5c5e92265a4925ae186c066d9054646e650b8ed6fa8b4830fdbf6a675110a25" address="unix:///run/containerd/s/4ba0534c0a27617d306ee33bdf0727471676861a3599899485e9ae854239e778" namespace=k8s.io protocol=ttrpc version=3 Mar 13 00:34:07.227758 systemd[1]: Started cri-containerd-e5c5e92265a4925ae186c066d9054646e650b8ed6fa8b4830fdbf6a675110a25.scope - libcontainer container e5c5e92265a4925ae186c066d9054646e650b8ed6fa8b4830fdbf6a675110a25. 
Mar 13 00:34:07.249396 containerd[1635]: time="2026-03-13T00:34:07.249358756Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-spl8c,Uid:4fcf7fbf-7540-40b1-971b-e1d694133266,Namespace:kube-system,Attempt:0,} returns sandbox id \"e5c5e92265a4925ae186c066d9054646e650b8ed6fa8b4830fdbf6a675110a25\"" Mar 13 00:34:07.254442 containerd[1635]: time="2026-03-13T00:34:07.254409428Z" level=info msg="CreateContainer within sandbox \"e5c5e92265a4925ae186c066d9054646e650b8ed6fa8b4830fdbf6a675110a25\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Mar 13 00:34:07.265626 containerd[1635]: time="2026-03-13T00:34:07.265602203Z" level=info msg="Container 5aa8288e6fb16d573ebf3f8b1691c3048d46c71f5b27135fdc5dcbadbb0a1a77: CDI devices from CRI Config.CDIDevices: []" Mar 13 00:34:07.271482 containerd[1635]: time="2026-03-13T00:34:07.271423415Z" level=info msg="CreateContainer within sandbox \"e5c5e92265a4925ae186c066d9054646e650b8ed6fa8b4830fdbf6a675110a25\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"5aa8288e6fb16d573ebf3f8b1691c3048d46c71f5b27135fdc5dcbadbb0a1a77\"" Mar 13 00:34:07.273452 containerd[1635]: time="2026-03-13T00:34:07.272510025Z" level=info msg="StartContainer for \"5aa8288e6fb16d573ebf3f8b1691c3048d46c71f5b27135fdc5dcbadbb0a1a77\"" Mar 13 00:34:07.273452 containerd[1635]: time="2026-03-13T00:34:07.273353036Z" level=info msg="connecting to shim 5aa8288e6fb16d573ebf3f8b1691c3048d46c71f5b27135fdc5dcbadbb0a1a77" address="unix:///run/containerd/s/4ba0534c0a27617d306ee33bdf0727471676861a3599899485e9ae854239e778" protocol=ttrpc version=3 Mar 13 00:34:07.288759 systemd[1]: Started cri-containerd-5aa8288e6fb16d573ebf3f8b1691c3048d46c71f5b27135fdc5dcbadbb0a1a77.scope - libcontainer container 5aa8288e6fb16d573ebf3f8b1691c3048d46c71f5b27135fdc5dcbadbb0a1a77. 
Mar 13 00:34:07.348184 containerd[1635]: time="2026-03-13T00:34:07.348115037Z" level=info msg="StartContainer for \"5aa8288e6fb16d573ebf3f8b1691c3048d46c71f5b27135fdc5dcbadbb0a1a77\" returns successfully" Mar 13 00:34:07.363449 containerd[1635]: time="2026-03-13T00:34:07.363392343Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-5588576f44-v5c7c,Uid:eb1ed8ba-da2a-49b9-af33-7ef163436eb6,Namespace:tigera-operator,Attempt:0,}" Mar 13 00:34:07.379267 containerd[1635]: time="2026-03-13T00:34:07.379053650Z" level=info msg="connecting to shim a5d42e4488fe1795bb7bada0948db2c21dffd231c031065f7119f8886c45bf68" address="unix:///run/containerd/s/c2f46b5cd4bf9b2f87cd24084935310e23bdaaf7ba7e83a8a0f769c25b6469cf" namespace=k8s.io protocol=ttrpc version=3 Mar 13 00:34:07.398749 systemd[1]: Started cri-containerd-a5d42e4488fe1795bb7bada0948db2c21dffd231c031065f7119f8886c45bf68.scope - libcontainer container a5d42e4488fe1795bb7bada0948db2c21dffd231c031065f7119f8886c45bf68. Mar 13 00:34:07.438131 containerd[1635]: time="2026-03-13T00:34:07.438097644Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-5588576f44-v5c7c,Uid:eb1ed8ba-da2a-49b9-af33-7ef163436eb6,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"a5d42e4488fe1795bb7bada0948db2c21dffd231c031065f7119f8886c45bf68\"" Mar 13 00:34:07.439457 containerd[1635]: time="2026-03-13T00:34:07.439425785Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.40.7\"" Mar 13 00:34:09.292487 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3313548191.mount: Deactivated successfully. 
Mar 13 00:34:09.672211 kubelet[2798]: I0313 00:34:09.672166 2798 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-spl8c" podStartSLOduration=3.672152245 podStartE2EDuration="3.672152245s" podCreationTimestamp="2026-03-13 00:34:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 00:34:07.708448737 +0000 UTC m=+8.151703956" watchObservedRunningTime="2026-03-13 00:34:09.672152245 +0000 UTC m=+10.115407404" Mar 13 00:34:10.077685 containerd[1635]: time="2026-03-13T00:34:10.077619494Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.40.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:34:10.078860 containerd[1635]: time="2026-03-13T00:34:10.078825564Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.40.7: active requests=0, bytes read=40846156" Mar 13 00:34:10.079880 containerd[1635]: time="2026-03-13T00:34:10.079848305Z" level=info msg="ImageCreate event name:\"sha256:de04da31b5feb10fd313c39b7ac72d47ce9b5b8eb06161142e2e2283059a52c2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:34:10.082037 containerd[1635]: time="2026-03-13T00:34:10.082007296Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:53260704fc6e638633b243729411222e01e1898647352a6e1a09cc046887973a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:34:10.082689 containerd[1635]: time="2026-03-13T00:34:10.082415416Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.40.7\" with image id \"sha256:de04da31b5feb10fd313c39b7ac72d47ce9b5b8eb06161142e2e2283059a52c2\", repo tag \"quay.io/tigera/operator:v1.40.7\", repo digest \"quay.io/tigera/operator@sha256:53260704fc6e638633b243729411222e01e1898647352a6e1a09cc046887973a\", size \"40842151\" in 2.642967581s" Mar 13 00:34:10.082689 containerd[1635]: time="2026-03-13T00:34:10.082451096Z" 
level=info msg="PullImage \"quay.io/tigera/operator:v1.40.7\" returns image reference \"sha256:de04da31b5feb10fd313c39b7ac72d47ce9b5b8eb06161142e2e2283059a52c2\"" Mar 13 00:34:10.086942 containerd[1635]: time="2026-03-13T00:34:10.086903198Z" level=info msg="CreateContainer within sandbox \"a5d42e4488fe1795bb7bada0948db2c21dffd231c031065f7119f8886c45bf68\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" Mar 13 00:34:10.094356 containerd[1635]: time="2026-03-13T00:34:10.094013691Z" level=info msg="Container 65aceb5dd321c420641e76d80efbf4a119009e355c6c74183d40bbbb4a4208bb: CDI devices from CRI Config.CDIDevices: []" Mar 13 00:34:10.098231 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1224528181.mount: Deactivated successfully. Mar 13 00:34:10.100785 containerd[1635]: time="2026-03-13T00:34:10.100762733Z" level=info msg="CreateContainer within sandbox \"a5d42e4488fe1795bb7bada0948db2c21dffd231c031065f7119f8886c45bf68\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"65aceb5dd321c420641e76d80efbf4a119009e355c6c74183d40bbbb4a4208bb\"" Mar 13 00:34:10.101326 containerd[1635]: time="2026-03-13T00:34:10.101295244Z" level=info msg="StartContainer for \"65aceb5dd321c420641e76d80efbf4a119009e355c6c74183d40bbbb4a4208bb\"" Mar 13 00:34:10.102501 containerd[1635]: time="2026-03-13T00:34:10.102469584Z" level=info msg="connecting to shim 65aceb5dd321c420641e76d80efbf4a119009e355c6c74183d40bbbb4a4208bb" address="unix:///run/containerd/s/c2f46b5cd4bf9b2f87cd24084935310e23bdaaf7ba7e83a8a0f769c25b6469cf" protocol=ttrpc version=3 Mar 13 00:34:10.126832 systemd[1]: Started cri-containerd-65aceb5dd321c420641e76d80efbf4a119009e355c6c74183d40bbbb4a4208bb.scope - libcontainer container 65aceb5dd321c420641e76d80efbf4a119009e355c6c74183d40bbbb4a4208bb. 
Mar 13 00:34:10.156405 containerd[1635]: time="2026-03-13T00:34:10.156282646Z" level=info msg="StartContainer for \"65aceb5dd321c420641e76d80efbf4a119009e355c6c74183d40bbbb4a4208bb\" returns successfully" Mar 13 00:34:13.557676 update_engine[1613]: I20260313 00:34:13.556725 1613 update_attempter.cc:509] Updating boot flags... Mar 13 00:34:14.786564 kubelet[2798]: I0313 00:34:14.786316 2798 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-5588576f44-v5c7c" podStartSLOduration=5.142115294 podStartE2EDuration="7.786291015s" podCreationTimestamp="2026-03-13 00:34:07 +0000 UTC" firstStartedPulling="2026-03-13 00:34:07.439064935 +0000 UTC m=+7.882320094" lastFinishedPulling="2026-03-13 00:34:10.083240666 +0000 UTC m=+10.526495815" observedRunningTime="2026-03-13 00:34:10.712545398 +0000 UTC m=+11.155800557" watchObservedRunningTime="2026-03-13 00:34:14.786291015 +0000 UTC m=+15.229546204" Mar 13 00:34:15.141185 sudo[1860]: pam_unix(sudo:session): session closed for user root Mar 13 00:34:15.259671 sshd[1859]: Connection closed by 4.153.228.146 port 57574 Mar 13 00:34:15.260174 sshd-session[1856]: pam_unix(sshd:session): session closed for user core Mar 13 00:34:15.265377 systemd[1]: sshd@6-157.180.117.151:22-4.153.228.146:57574.service: Deactivated successfully. Mar 13 00:34:15.268199 systemd[1]: session-7.scope: Deactivated successfully. Mar 13 00:34:15.268690 systemd[1]: session-7.scope: Consumed 3.534s CPU time, 229.3M memory peak. Mar 13 00:34:15.272528 systemd-logind[1609]: Session 7 logged out. Waiting for processes to exit. Mar 13 00:34:15.275414 systemd-logind[1609]: Removed session 7. Mar 13 00:34:16.975566 systemd[1]: Created slice kubepods-besteffort-pod0bd0b6bf_0139_4e52_a281_6ad91c269ccb.slice - libcontainer container kubepods-besteffort-pod0bd0b6bf_0139_4e52_a281_6ad91c269ccb.slice. 
Mar 13 00:34:17.044287 systemd[1]: Created slice kubepods-besteffort-poda6766f4c_3c79_40d1_8b4f_d6245dc40867.slice - libcontainer container kubepods-besteffort-poda6766f4c_3c79_40d1_8b4f_d6245dc40867.slice. Mar 13 00:34:17.063162 kubelet[2798]: I0313 00:34:17.063123 2798 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a6766f4c-3c79-40d1-8b4f-d6245dc40867-tigera-ca-bundle\") pod \"calico-node-27f76\" (UID: \"a6766f4c-3c79-40d1-8b4f-d6245dc40867\") " pod="calico-system/calico-node-27f76" Mar 13 00:34:17.063162 kubelet[2798]: I0313 00:34:17.063154 2798 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5rqkv\" (UniqueName: \"kubernetes.io/projected/0bd0b6bf-0139-4e52-a281-6ad91c269ccb-kube-api-access-5rqkv\") pod \"calico-typha-95fb8f89c-2gn44\" (UID: \"0bd0b6bf-0139-4e52-a281-6ad91c269ccb\") " pod="calico-system/calico-typha-95fb8f89c-2gn44" Mar 13 00:34:17.063162 kubelet[2798]: I0313 00:34:17.063166 2798 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/a6766f4c-3c79-40d1-8b4f-d6245dc40867-cni-log-dir\") pod \"calico-node-27f76\" (UID: \"a6766f4c-3c79-40d1-8b4f-d6245dc40867\") " pod="calico-system/calico-node-27f76" Mar 13 00:34:17.063162 kubelet[2798]: I0313 00:34:17.063184 2798 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/a6766f4c-3c79-40d1-8b4f-d6245dc40867-node-certs\") pod \"calico-node-27f76\" (UID: \"a6766f4c-3c79-40d1-8b4f-d6245dc40867\") " pod="calico-system/calico-node-27f76" Mar 13 00:34:17.063789 kubelet[2798]: I0313 00:34:17.063194 2798 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: 
\"kubernetes.io/host-path/a6766f4c-3c79-40d1-8b4f-d6245dc40867-var-run-calico\") pod \"calico-node-27f76\" (UID: \"a6766f4c-3c79-40d1-8b4f-d6245dc40867\") " pod="calico-system/calico-node-27f76" Mar 13 00:34:17.063789 kubelet[2798]: I0313 00:34:17.063203 2798 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/a6766f4c-3c79-40d1-8b4f-d6245dc40867-xtables-lock\") pod \"calico-node-27f76\" (UID: \"a6766f4c-3c79-40d1-8b4f-d6245dc40867\") " pod="calico-system/calico-node-27f76" Mar 13 00:34:17.063789 kubelet[2798]: I0313 00:34:17.063212 2798 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/a6766f4c-3c79-40d1-8b4f-d6245dc40867-flexvol-driver-host\") pod \"calico-node-27f76\" (UID: \"a6766f4c-3c79-40d1-8b4f-d6245dc40867\") " pod="calico-system/calico-node-27f76" Mar 13 00:34:17.063789 kubelet[2798]: I0313 00:34:17.063243 2798 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a6766f4c-3c79-40d1-8b4f-d6245dc40867-lib-modules\") pod \"calico-node-27f76\" (UID: \"a6766f4c-3c79-40d1-8b4f-d6245dc40867\") " pod="calico-system/calico-node-27f76" Mar 13 00:34:17.063789 kubelet[2798]: I0313 00:34:17.063256 2798 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bpffs\" (UniqueName: \"kubernetes.io/host-path/a6766f4c-3c79-40d1-8b4f-d6245dc40867-bpffs\") pod \"calico-node-27f76\" (UID: \"a6766f4c-3c79-40d1-8b4f-d6245dc40867\") " pod="calico-system/calico-node-27f76" Mar 13 00:34:17.063918 kubelet[2798]: I0313 00:34:17.063266 2798 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/a6766f4c-3c79-40d1-8b4f-d6245dc40867-cni-bin-dir\") pod 
\"calico-node-27f76\" (UID: \"a6766f4c-3c79-40d1-8b4f-d6245dc40867\") " pod="calico-system/calico-node-27f76" Mar 13 00:34:17.063918 kubelet[2798]: I0313 00:34:17.063275 2798 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/a6766f4c-3c79-40d1-8b4f-d6245dc40867-policysync\") pod \"calico-node-27f76\" (UID: \"a6766f4c-3c79-40d1-8b4f-d6245dc40867\") " pod="calico-system/calico-node-27f76" Mar 13 00:34:17.063918 kubelet[2798]: I0313 00:34:17.063288 2798 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/a6766f4c-3c79-40d1-8b4f-d6245dc40867-sys-fs\") pod \"calico-node-27f76\" (UID: \"a6766f4c-3c79-40d1-8b4f-d6245dc40867\") " pod="calico-system/calico-node-27f76" Mar 13 00:34:17.063918 kubelet[2798]: I0313 00:34:17.063296 2798 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/a6766f4c-3c79-40d1-8b4f-d6245dc40867-var-lib-calico\") pod \"calico-node-27f76\" (UID: \"a6766f4c-3c79-40d1-8b4f-d6245dc40867\") " pod="calico-system/calico-node-27f76" Mar 13 00:34:17.063918 kubelet[2798]: I0313 00:34:17.063308 2798 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/0bd0b6bf-0139-4e52-a281-6ad91c269ccb-typha-certs\") pod \"calico-typha-95fb8f89c-2gn44\" (UID: \"0bd0b6bf-0139-4e52-a281-6ad91c269ccb\") " pod="calico-system/calico-typha-95fb8f89c-2gn44" Mar 13 00:34:17.064028 kubelet[2798]: I0313 00:34:17.063317 2798 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/a6766f4c-3c79-40d1-8b4f-d6245dc40867-cni-net-dir\") pod \"calico-node-27f76\" (UID: \"a6766f4c-3c79-40d1-8b4f-d6245dc40867\") " 
pod="calico-system/calico-node-27f76" Mar 13 00:34:17.064028 kubelet[2798]: I0313 00:34:17.063331 2798 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0bd0b6bf-0139-4e52-a281-6ad91c269ccb-tigera-ca-bundle\") pod \"calico-typha-95fb8f89c-2gn44\" (UID: \"0bd0b6bf-0139-4e52-a281-6ad91c269ccb\") " pod="calico-system/calico-typha-95fb8f89c-2gn44" Mar 13 00:34:17.064028 kubelet[2798]: I0313 00:34:17.063344 2798 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nodeproc\" (UniqueName: \"kubernetes.io/host-path/a6766f4c-3c79-40d1-8b4f-d6245dc40867-nodeproc\") pod \"calico-node-27f76\" (UID: \"a6766f4c-3c79-40d1-8b4f-d6245dc40867\") " pod="calico-system/calico-node-27f76" Mar 13 00:34:17.064028 kubelet[2798]: I0313 00:34:17.063356 2798 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r5p5m\" (UniqueName: \"kubernetes.io/projected/a6766f4c-3c79-40d1-8b4f-d6245dc40867-kube-api-access-r5p5m\") pod \"calico-node-27f76\" (UID: \"a6766f4c-3c79-40d1-8b4f-d6245dc40867\") " pod="calico-system/calico-node-27f76" Mar 13 00:34:17.146897 kubelet[2798]: E0313 00:34:17.146862 2798 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-tp7j8" podUID="7e7f9c17-2933-4641-bd19-7c38cba62524" Mar 13 00:34:17.170525 kubelet[2798]: E0313 00:34:17.170448 2798 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:17.170525 kubelet[2798]: W0313 00:34:17.170465 2798 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: 
[init], error: executable file not found in $PATH, output: "" Mar 13 00:34:17.170525 kubelet[2798]: E0313 00:34:17.170481 2798 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:34:17.174721 kubelet[2798]: E0313 00:34:17.174700 2798 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:17.174721 kubelet[2798]: W0313 00:34:17.174717 2798 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:17.174840 kubelet[2798]: E0313 00:34:17.174731 2798 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:34:17.179583 kubelet[2798]: E0313 00:34:17.179515 2798 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:17.179583 kubelet[2798]: W0313 00:34:17.179529 2798 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:17.179583 kubelet[2798]: E0313 00:34:17.179545 2798 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:34:17.189663 kubelet[2798]: E0313 00:34:17.187743 2798 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:17.189663 kubelet[2798]: W0313 00:34:17.187776 2798 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:17.189663 kubelet[2798]: E0313 00:34:17.187790 2798 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:34:17.193684 kubelet[2798]: E0313 00:34:17.192717 2798 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:17.193684 kubelet[2798]: W0313 00:34:17.192730 2798 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:17.193684 kubelet[2798]: E0313 00:34:17.192741 2798 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:34:17.245778 kubelet[2798]: E0313 00:34:17.245684 2798 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:17.245778 kubelet[2798]: W0313 00:34:17.245707 2798 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:17.245778 kubelet[2798]: E0313 00:34:17.245727 2798 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:34:17.246367 kubelet[2798]: E0313 00:34:17.246353 2798 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:17.246367 kubelet[2798]: W0313 00:34:17.246366 2798 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:17.246420 kubelet[2798]: E0313 00:34:17.246378 2798 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:34:17.246715 kubelet[2798]: E0313 00:34:17.246700 2798 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:17.246715 kubelet[2798]: W0313 00:34:17.246711 2798 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:17.246763 kubelet[2798]: E0313 00:34:17.246720 2798 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:34:17.247863 kubelet[2798]: E0313 00:34:17.247850 2798 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:17.247863 kubelet[2798]: W0313 00:34:17.247861 2798 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:17.247917 kubelet[2798]: E0313 00:34:17.247870 2798 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:34:17.248038 kubelet[2798]: E0313 00:34:17.248027 2798 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:17.248038 kubelet[2798]: W0313 00:34:17.248036 2798 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:17.248073 kubelet[2798]: E0313 00:34:17.248042 2798 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:34:17.248182 kubelet[2798]: E0313 00:34:17.248171 2798 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:17.248182 kubelet[2798]: W0313 00:34:17.248179 2798 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:17.248212 kubelet[2798]: E0313 00:34:17.248185 2798 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:34:17.248335 kubelet[2798]: E0313 00:34:17.248323 2798 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:17.248335 kubelet[2798]: W0313 00:34:17.248333 2798 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:17.248379 kubelet[2798]: E0313 00:34:17.248339 2798 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:34:17.249716 kubelet[2798]: E0313 00:34:17.249700 2798 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:17.249716 kubelet[2798]: W0313 00:34:17.249712 2798 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:17.249776 kubelet[2798]: E0313 00:34:17.249719 2798 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:34:17.249889 kubelet[2798]: E0313 00:34:17.249876 2798 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:17.249889 kubelet[2798]: W0313 00:34:17.249885 2798 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:17.249929 kubelet[2798]: E0313 00:34:17.249890 2798 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:34:17.250032 kubelet[2798]: E0313 00:34:17.250021 2798 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:17.250032 kubelet[2798]: W0313 00:34:17.250030 2798 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:17.250065 kubelet[2798]: E0313 00:34:17.250035 2798 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:34:17.250187 kubelet[2798]: E0313 00:34:17.250163 2798 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:17.250187 kubelet[2798]: W0313 00:34:17.250171 2798 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:17.250187 kubelet[2798]: E0313 00:34:17.250177 2798 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:34:17.250347 kubelet[2798]: E0313 00:34:17.250335 2798 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:17.250347 kubelet[2798]: W0313 00:34:17.250345 2798 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:17.250381 kubelet[2798]: E0313 00:34:17.250353 2798 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:34:17.250519 kubelet[2798]: E0313 00:34:17.250506 2798 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:17.250519 kubelet[2798]: W0313 00:34:17.250515 2798 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:17.250560 kubelet[2798]: E0313 00:34:17.250521 2798 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:34:17.251662 kubelet[2798]: E0313 00:34:17.250977 2798 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:17.251662 kubelet[2798]: W0313 00:34:17.250985 2798 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:17.251662 kubelet[2798]: E0313 00:34:17.250991 2798 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:34:17.251662 kubelet[2798]: E0313 00:34:17.251134 2798 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:17.251662 kubelet[2798]: W0313 00:34:17.251139 2798 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:17.251662 kubelet[2798]: E0313 00:34:17.251145 2798 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:34:17.251777 kubelet[2798]: E0313 00:34:17.251693 2798 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:17.251777 kubelet[2798]: W0313 00:34:17.251699 2798 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:17.251777 kubelet[2798]: E0313 00:34:17.251705 2798 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:34:17.251871 kubelet[2798]: E0313 00:34:17.251860 2798 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:17.251871 kubelet[2798]: W0313 00:34:17.251869 2798 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:17.251899 kubelet[2798]: E0313 00:34:17.251874 2798 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:34:17.252027 kubelet[2798]: E0313 00:34:17.252017 2798 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:17.252027 kubelet[2798]: W0313 00:34:17.252025 2798 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:17.252061 kubelet[2798]: E0313 00:34:17.252030 2798 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:34:17.252189 kubelet[2798]: E0313 00:34:17.252179 2798 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:17.252189 kubelet[2798]: W0313 00:34:17.252187 2798 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:17.252223 kubelet[2798]: E0313 00:34:17.252193 2798 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:34:17.253389 kubelet[2798]: E0313 00:34:17.253375 2798 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:17.253389 kubelet[2798]: W0313 00:34:17.253386 2798 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:17.253442 kubelet[2798]: E0313 00:34:17.253393 2798 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:34:17.264753 kubelet[2798]: E0313 00:34:17.264731 2798 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:17.264753 kubelet[2798]: W0313 00:34:17.264745 2798 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:17.264753 kubelet[2798]: E0313 00:34:17.264757 2798 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:34:17.264921 kubelet[2798]: I0313 00:34:17.264776 2798 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7e7f9c17-2933-4641-bd19-7c38cba62524-kubelet-dir\") pod \"csi-node-driver-tp7j8\" (UID: \"7e7f9c17-2933-4641-bd19-7c38cba62524\") " pod="calico-system/csi-node-driver-tp7j8" Mar 13 00:34:17.264942 kubelet[2798]: E0313 00:34:17.264936 2798 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:17.264960 kubelet[2798]: W0313 00:34:17.264942 2798 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:17.264960 kubelet[2798]: E0313 00:34:17.264950 2798 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:34:17.264999 kubelet[2798]: I0313 00:34:17.264959 2798 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/7e7f9c17-2933-4641-bd19-7c38cba62524-registration-dir\") pod \"csi-node-driver-tp7j8\" (UID: \"7e7f9c17-2933-4641-bd19-7c38cba62524\") " pod="calico-system/csi-node-driver-tp7j8" Mar 13 00:34:17.265185 kubelet[2798]: E0313 00:34:17.265091 2798 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:17.265185 kubelet[2798]: W0313 00:34:17.265101 2798 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:17.265185 kubelet[2798]: E0313 00:34:17.265107 2798 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:34:17.265185 kubelet[2798]: I0313 00:34:17.265115 2798 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/7e7f9c17-2933-4641-bd19-7c38cba62524-socket-dir\") pod \"csi-node-driver-tp7j8\" (UID: \"7e7f9c17-2933-4641-bd19-7c38cba62524\") " pod="calico-system/csi-node-driver-tp7j8" Mar 13 00:34:17.265305 kubelet[2798]: E0313 00:34:17.265296 2798 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:17.265340 kubelet[2798]: W0313 00:34:17.265333 2798 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:17.265374 kubelet[2798]: E0313 00:34:17.265366 2798 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:34:17.265542 kubelet[2798]: E0313 00:34:17.265533 2798 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:17.265681 kubelet[2798]: W0313 00:34:17.265580 2798 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:17.265681 kubelet[2798]: E0313 00:34:17.265589 2798 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:34:17.265785 kubelet[2798]: E0313 00:34:17.265777 2798 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:17.265828 kubelet[2798]: W0313 00:34:17.265810 2798 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:17.265860 kubelet[2798]: E0313 00:34:17.265854 2798 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:34:17.265904 kubelet[2798]: I0313 00:34:17.265896 2798 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/7e7f9c17-2933-4641-bd19-7c38cba62524-varrun\") pod \"csi-node-driver-tp7j8\" (UID: \"7e7f9c17-2933-4641-bd19-7c38cba62524\") " pod="calico-system/csi-node-driver-tp7j8" Mar 13 00:34:17.266716 kubelet[2798]: E0313 00:34:17.266706 2798 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:17.266811 kubelet[2798]: W0313 00:34:17.266764 2798 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:17.266811 kubelet[2798]: E0313 00:34:17.266774 2798 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:34:17.267007 kubelet[2798]: E0313 00:34:17.266997 2798 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:17.267066 kubelet[2798]: W0313 00:34:17.267040 2798 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:17.267066 kubelet[2798]: E0313 00:34:17.267048 2798 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:34:17.267305 kubelet[2798]: E0313 00:34:17.267273 2798 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:17.267305 kubelet[2798]: W0313 00:34:17.267284 2798 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:17.267305 kubelet[2798]: E0313 00:34:17.267292 2798 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:34:17.267583 kubelet[2798]: E0313 00:34:17.267559 2798 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:17.267583 kubelet[2798]: W0313 00:34:17.267567 2798 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:17.267583 kubelet[2798]: E0313 00:34:17.267573 2798 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:34:17.267752 kubelet[2798]: I0313 00:34:17.267741 2798 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5hlmn\" (UniqueName: \"kubernetes.io/projected/7e7f9c17-2933-4641-bd19-7c38cba62524-kube-api-access-5hlmn\") pod \"csi-node-driver-tp7j8\" (UID: \"7e7f9c17-2933-4641-bd19-7c38cba62524\") " pod="calico-system/csi-node-driver-tp7j8" Mar 13 00:34:17.267918 kubelet[2798]: E0313 00:34:17.267908 2798 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:17.267976 kubelet[2798]: W0313 00:34:17.267957 2798 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:17.267976 kubelet[2798]: E0313 00:34:17.267966 2798 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:34:17.268221 kubelet[2798]: E0313 00:34:17.268190 2798 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:17.268221 kubelet[2798]: W0313 00:34:17.268199 2798 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:17.268221 kubelet[2798]: E0313 00:34:17.268208 2798 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:34:17.268556 kubelet[2798]: E0313 00:34:17.268533 2798 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:17.268556 kubelet[2798]: W0313 00:34:17.268541 2798 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:17.268556 kubelet[2798]: E0313 00:34:17.268547 2798 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:34:17.268878 kubelet[2798]: E0313 00:34:17.268846 2798 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:17.268878 kubelet[2798]: W0313 00:34:17.268857 2798 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:17.268878 kubelet[2798]: E0313 00:34:17.268865 2798 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:34:17.269214 kubelet[2798]: E0313 00:34:17.269184 2798 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:17.269214 kubelet[2798]: W0313 00:34:17.269193 2798 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:17.269214 kubelet[2798]: E0313 00:34:17.269199 2798 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:34:17.282531 containerd[1635]: time="2026-03-13T00:34:17.282500140Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-95fb8f89c-2gn44,Uid:0bd0b6bf-0139-4e52-a281-6ad91c269ccb,Namespace:calico-system,Attempt:0,}" Mar 13 00:34:17.303523 containerd[1635]: time="2026-03-13T00:34:17.303186627Z" level=info msg="connecting to shim cb6a581845626b315a010727e1312aa1d293cd1e0e7b7ad27b2846c42ddfe2b3" address="unix:///run/containerd/s/4a0c713b4f9f4de7d099685ae865bdf6b75bc7819300169293b31c517472e55f" namespace=k8s.io protocol=ttrpc version=3 Mar 13 00:34:17.332791 systemd[1]: Started cri-containerd-cb6a581845626b315a010727e1312aa1d293cd1e0e7b7ad27b2846c42ddfe2b3.scope - libcontainer container cb6a581845626b315a010727e1312aa1d293cd1e0e7b7ad27b2846c42ddfe2b3. Mar 13 00:34:17.349688 containerd[1635]: time="2026-03-13T00:34:17.349659199Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-27f76,Uid:a6766f4c-3c79-40d1-8b4f-d6245dc40867,Namespace:calico-system,Attempt:0,}" Mar 13 00:34:17.366725 containerd[1635]: time="2026-03-13T00:34:17.366661707Z" level=info msg="connecting to shim 766dae36db74ba8f78bbe7a9ff332a9d193bb4c9fb87f2ef07d2884219cd44a0" address="unix:///run/containerd/s/f05b3b24552b3ea6e1a1869fe42443b1ec9669e4f832715ccaba488ac63e4a77" namespace=k8s.io protocol=ttrpc version=3 Mar 13 00:34:17.370483 kubelet[2798]: E0313 00:34:17.370226 2798 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:17.370483 kubelet[2798]: W0313 00:34:17.370360 2798 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:17.370483 kubelet[2798]: E0313 00:34:17.370384 2798 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory 
nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:34:17.370778 kubelet[2798]: E0313 00:34:17.370753 2798 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:17.370834 kubelet[2798]: W0313 00:34:17.370807 2798 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:17.370858 kubelet[2798]: E0313 00:34:17.370835 2798 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:34:17.371845 kubelet[2798]: E0313 00:34:17.371829 2798 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:17.371845 kubelet[2798]: W0313 00:34:17.371841 2798 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:17.371911 kubelet[2798]: E0313 00:34:17.371851 2798 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:34:17.372459 kubelet[2798]: E0313 00:34:17.372107 2798 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:17.372459 kubelet[2798]: W0313 00:34:17.372115 2798 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:17.372459 kubelet[2798]: E0313 00:34:17.372124 2798 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:34:17.372947 kubelet[2798]: E0313 00:34:17.372641 2798 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:17.373007 kubelet[2798]: W0313 00:34:17.372948 2798 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:17.373007 kubelet[2798]: E0313 00:34:17.372959 2798 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:34:17.373524 kubelet[2798]: E0313 00:34:17.373492 2798 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:17.373558 kubelet[2798]: W0313 00:34:17.373524 2798 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:17.373558 kubelet[2798]: E0313 00:34:17.373535 2798 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:34:17.374093 kubelet[2798]: E0313 00:34:17.374079 2798 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:17.374093 kubelet[2798]: W0313 00:34:17.374090 2798 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:17.374159 kubelet[2798]: E0313 00:34:17.374098 2798 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:34:17.374526 kubelet[2798]: E0313 00:34:17.374513 2798 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:17.374526 kubelet[2798]: W0313 00:34:17.374523 2798 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:17.374566 kubelet[2798]: E0313 00:34:17.374532 2798 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:34:17.375249 kubelet[2798]: E0313 00:34:17.375234 2798 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:17.375249 kubelet[2798]: W0313 00:34:17.375248 2798 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:17.375691 kubelet[2798]: E0313 00:34:17.375257 2798 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:34:17.375691 kubelet[2798]: E0313 00:34:17.375672 2798 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:17.375691 kubelet[2798]: W0313 00:34:17.375679 2798 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:17.375691 kubelet[2798]: E0313 00:34:17.375686 2798 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:34:17.376184 kubelet[2798]: E0313 00:34:17.376154 2798 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:17.376184 kubelet[2798]: W0313 00:34:17.376165 2798 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:17.376184 kubelet[2798]: E0313 00:34:17.376172 2798 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:34:17.376681 kubelet[2798]: E0313 00:34:17.376629 2798 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:17.376774 kubelet[2798]: W0313 00:34:17.376640 2798 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:17.376797 kubelet[2798]: E0313 00:34:17.376774 2798 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:34:17.377039 kubelet[2798]: E0313 00:34:17.377024 2798 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:17.377039 kubelet[2798]: W0313 00:34:17.377034 2798 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:17.377039 kubelet[2798]: E0313 00:34:17.377041 2798 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:34:17.377334 kubelet[2798]: E0313 00:34:17.377319 2798 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:17.377334 kubelet[2798]: W0313 00:34:17.377330 2798 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:17.377410 kubelet[2798]: E0313 00:34:17.377352 2798 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:34:17.377582 kubelet[2798]: E0313 00:34:17.377563 2798 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:17.377582 kubelet[2798]: W0313 00:34:17.377573 2798 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:17.377582 kubelet[2798]: E0313 00:34:17.377579 2798 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:34:17.377861 kubelet[2798]: E0313 00:34:17.377838 2798 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:17.377861 kubelet[2798]: W0313 00:34:17.377848 2798 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:17.377861 kubelet[2798]: E0313 00:34:17.377855 2798 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:34:17.378170 kubelet[2798]: E0313 00:34:17.378153 2798 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:17.378170 kubelet[2798]: W0313 00:34:17.378163 2798 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:17.378170 kubelet[2798]: E0313 00:34:17.378178 2798 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:34:17.378958 kubelet[2798]: E0313 00:34:17.378875 2798 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:17.378958 kubelet[2798]: W0313 00:34:17.378883 2798 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:17.378958 kubelet[2798]: E0313 00:34:17.378889 2798 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:34:17.379131 kubelet[2798]: E0313 00:34:17.379076 2798 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:17.379131 kubelet[2798]: W0313 00:34:17.379084 2798 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:17.379131 kubelet[2798]: E0313 00:34:17.379090 2798 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:34:17.379346 kubelet[2798]: E0313 00:34:17.379333 2798 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:17.379346 kubelet[2798]: W0313 00:34:17.379344 2798 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:17.379397 kubelet[2798]: E0313 00:34:17.379352 2798 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:34:17.379624 kubelet[2798]: E0313 00:34:17.379542 2798 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:17.379624 kubelet[2798]: W0313 00:34:17.379549 2798 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:17.379624 kubelet[2798]: E0313 00:34:17.379555 2798 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:34:17.379947 kubelet[2798]: E0313 00:34:17.379900 2798 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:17.379947 kubelet[2798]: W0313 00:34:17.379908 2798 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:17.379947 kubelet[2798]: E0313 00:34:17.379914 2798 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:34:17.380496 kubelet[2798]: E0313 00:34:17.380412 2798 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:17.380496 kubelet[2798]: W0313 00:34:17.380421 2798 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:17.380496 kubelet[2798]: E0313 00:34:17.380427 2798 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:34:17.381693 kubelet[2798]: E0313 00:34:17.380972 2798 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:17.381693 kubelet[2798]: W0313 00:34:17.380979 2798 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:17.381693 kubelet[2798]: E0313 00:34:17.380987 2798 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:34:17.381693 kubelet[2798]: E0313 00:34:17.381207 2798 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:17.381693 kubelet[2798]: W0313 00:34:17.381215 2798 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:17.381693 kubelet[2798]: E0313 00:34:17.381223 2798 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:34:17.394352 kubelet[2798]: E0313 00:34:17.393925 2798 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:17.394352 kubelet[2798]: W0313 00:34:17.393958 2798 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:17.394352 kubelet[2798]: E0313 00:34:17.393973 2798 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:34:17.396880 systemd[1]: Started cri-containerd-766dae36db74ba8f78bbe7a9ff332a9d193bb4c9fb87f2ef07d2884219cd44a0.scope - libcontainer container 766dae36db74ba8f78bbe7a9ff332a9d193bb4c9fb87f2ef07d2884219cd44a0. Mar 13 00:34:17.397906 containerd[1635]: time="2026-03-13T00:34:17.397790130Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-95fb8f89c-2gn44,Uid:0bd0b6bf-0139-4e52-a281-6ad91c269ccb,Namespace:calico-system,Attempt:0,} returns sandbox id \"cb6a581845626b315a010727e1312aa1d293cd1e0e7b7ad27b2846c42ddfe2b3\"" Mar 13 00:34:17.399794 containerd[1635]: time="2026-03-13T00:34:17.399773691Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.31.4\"" Mar 13 00:34:17.423489 containerd[1635]: time="2026-03-13T00:34:17.423457209Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-27f76,Uid:a6766f4c-3c79-40d1-8b4f-d6245dc40867,Namespace:calico-system,Attempt:0,} returns sandbox id \"766dae36db74ba8f78bbe7a9ff332a9d193bb4c9fb87f2ef07d2884219cd44a0\"" Mar 13 00:34:18.658249 kubelet[2798]: E0313 00:34:18.658155 2798 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: cni plugin not initialized" pod="calico-system/csi-node-driver-tp7j8" podUID="7e7f9c17-2933-4641-bd19-7c38cba62524" Mar 13 00:34:19.257767 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2743943887.mount: Deactivated successfully. Mar 13 00:34:20.477129 containerd[1635]: time="2026-03-13T00:34:20.477064387Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:34:20.478045 containerd[1635]: time="2026-03-13T00:34:20.477864734Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.31.4: active requests=0, bytes read=36107596" Mar 13 00:34:20.479108 containerd[1635]: time="2026-03-13T00:34:20.479072375Z" level=info msg="ImageCreate event name:\"sha256:46766605472b59b9c16342b2cc74da11f598baa9ba6d1e8b07b3f8ab4f29c55b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:34:20.481067 containerd[1635]: time="2026-03-13T00:34:20.481033774Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:d9396cfcd63dfcf72a65903042e473bb0bafc0cceb56bd71cd84078498a87130\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:34:20.481521 containerd[1635]: time="2026-03-13T00:34:20.481495757Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.31.4\" with image id \"sha256:46766605472b59b9c16342b2cc74da11f598baa9ba6d1e8b07b3f8ab4f29c55b\", repo tag \"ghcr.io/flatcar/calico/typha:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:d9396cfcd63dfcf72a65903042e473bb0bafc0cceb56bd71cd84078498a87130\", size \"36107450\" in 3.081698446s" Mar 13 00:34:20.481582 containerd[1635]: time="2026-03-13T00:34:20.481572305Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.31.4\" returns image reference \"sha256:46766605472b59b9c16342b2cc74da11f598baa9ba6d1e8b07b3f8ab4f29c55b\"" Mar 13 00:34:20.482595 containerd[1635]: time="2026-03-13T00:34:20.482452651Z" level=info msg="PullImage 
\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\"" Mar 13 00:34:20.502410 containerd[1635]: time="2026-03-13T00:34:20.502353632Z" level=info msg="CreateContainer within sandbox \"cb6a581845626b315a010727e1312aa1d293cd1e0e7b7ad27b2846c42ddfe2b3\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Mar 13 00:34:20.511859 containerd[1635]: time="2026-03-13T00:34:20.511821140Z" level=info msg="Container 5e7acbca005e0f34a1a0187db8ebebe00e48cdb50823963fa553a8a285141422: CDI devices from CRI Config.CDIDevices: []" Mar 13 00:34:20.518905 containerd[1635]: time="2026-03-13T00:34:20.518861758Z" level=info msg="CreateContainer within sandbox \"cb6a581845626b315a010727e1312aa1d293cd1e0e7b7ad27b2846c42ddfe2b3\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"5e7acbca005e0f34a1a0187db8ebebe00e48cdb50823963fa553a8a285141422\"" Mar 13 00:34:20.519376 containerd[1635]: time="2026-03-13T00:34:20.519357411Z" level=info msg="StartContainer for \"5e7acbca005e0f34a1a0187db8ebebe00e48cdb50823963fa553a8a285141422\"" Mar 13 00:34:20.520816 containerd[1635]: time="2026-03-13T00:34:20.520778167Z" level=info msg="connecting to shim 5e7acbca005e0f34a1a0187db8ebebe00e48cdb50823963fa553a8a285141422" address="unix:///run/containerd/s/4a0c713b4f9f4de7d099685ae865bdf6b75bc7819300169293b31c517472e55f" protocol=ttrpc version=3 Mar 13 00:34:20.542791 systemd[1]: Started cri-containerd-5e7acbca005e0f34a1a0187db8ebebe00e48cdb50823963fa553a8a285141422.scope - libcontainer container 5e7acbca005e0f34a1a0187db8ebebe00e48cdb50823963fa553a8a285141422. 
Mar 13 00:34:20.593584 containerd[1635]: time="2026-03-13T00:34:20.593531502Z" level=info msg="StartContainer for \"5e7acbca005e0f34a1a0187db8ebebe00e48cdb50823963fa553a8a285141422\" returns successfully" Mar 13 00:34:20.658534 kubelet[2798]: E0313 00:34:20.657838 2798 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-tp7j8" podUID="7e7f9c17-2933-4641-bd19-7c38cba62524" Mar 13 00:34:20.777238 kubelet[2798]: E0313 00:34:20.776867 2798 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:20.777238 kubelet[2798]: W0313 00:34:20.776886 2798 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:20.777238 kubelet[2798]: E0313 00:34:20.776901 2798 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:34:20.777238 kubelet[2798]: E0313 00:34:20.777107 2798 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:20.777238 kubelet[2798]: W0313 00:34:20.777113 2798 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:20.777238 kubelet[2798]: E0313 00:34:20.777119 2798 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:34:20.777421 kubelet[2798]: E0313 00:34:20.777285 2798 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:20.777421 kubelet[2798]: W0313 00:34:20.777320 2798 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:20.777421 kubelet[2798]: E0313 00:34:20.777342 2798 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:34:20.777568 kubelet[2798]: E0313 00:34:20.777550 2798 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:20.777568 kubelet[2798]: W0313 00:34:20.777560 2798 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:20.777568 kubelet[2798]: E0313 00:34:20.777567 2798 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:34:20.777808 kubelet[2798]: E0313 00:34:20.777789 2798 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:20.777808 kubelet[2798]: W0313 00:34:20.777799 2798 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:20.777808 kubelet[2798]: E0313 00:34:20.777806 2798 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:34:20.778025 kubelet[2798]: E0313 00:34:20.778006 2798 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:20.778025 kubelet[2798]: W0313 00:34:20.778016 2798 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:20.778025 kubelet[2798]: E0313 00:34:20.778022 2798 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:34:20.778228 kubelet[2798]: E0313 00:34:20.778210 2798 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:20.778228 kubelet[2798]: W0313 00:34:20.778219 2798 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:20.778228 kubelet[2798]: E0313 00:34:20.778225 2798 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:34:20.778419 kubelet[2798]: E0313 00:34:20.778402 2798 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:20.778419 kubelet[2798]: W0313 00:34:20.778412 2798 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:20.778458 kubelet[2798]: E0313 00:34:20.778426 2798 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:34:20.778616 kubelet[2798]: E0313 00:34:20.778599 2798 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:20.778616 kubelet[2798]: W0313 00:34:20.778609 2798 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:20.778616 kubelet[2798]: E0313 00:34:20.778615 2798 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:34:20.778842 kubelet[2798]: E0313 00:34:20.778824 2798 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:20.778842 kubelet[2798]: W0313 00:34:20.778834 2798 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:20.778842 kubelet[2798]: E0313 00:34:20.778841 2798 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:34:20.779035 kubelet[2798]: E0313 00:34:20.779017 2798 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:20.779062 kubelet[2798]: W0313 00:34:20.779027 2798 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:20.779062 kubelet[2798]: E0313 00:34:20.779042 2798 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:34:20.779229 kubelet[2798]: E0313 00:34:20.779211 2798 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:20.779229 kubelet[2798]: W0313 00:34:20.779222 2798 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:20.779229 kubelet[2798]: E0313 00:34:20.779228 2798 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:34:20.779459 kubelet[2798]: E0313 00:34:20.779428 2798 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:20.779459 kubelet[2798]: W0313 00:34:20.779447 2798 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:20.779459 kubelet[2798]: E0313 00:34:20.779454 2798 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:34:20.779634 kubelet[2798]: E0313 00:34:20.779618 2798 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:20.779634 kubelet[2798]: W0313 00:34:20.779627 2798 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:20.779634 kubelet[2798]: E0313 00:34:20.779633 2798 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:34:20.779869 kubelet[2798]: E0313 00:34:20.779851 2798 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:20.779869 kubelet[2798]: W0313 00:34:20.779861 2798 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:20.779869 kubelet[2798]: E0313 00:34:20.779867 2798 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:34:20.799411 kubelet[2798]: E0313 00:34:20.799375 2798 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:20.799411 kubelet[2798]: W0313 00:34:20.799398 2798 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:20.799411 kubelet[2798]: E0313 00:34:20.799416 2798 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:34:20.799732 kubelet[2798]: E0313 00:34:20.799664 2798 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:20.799732 kubelet[2798]: W0313 00:34:20.799671 2798 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:20.799732 kubelet[2798]: E0313 00:34:20.799677 2798 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:34:20.799942 kubelet[2798]: E0313 00:34:20.799904 2798 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:20.799942 kubelet[2798]: W0313 00:34:20.799932 2798 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:20.799942 kubelet[2798]: E0313 00:34:20.799939 2798 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:34:20.800145 kubelet[2798]: E0313 00:34:20.800130 2798 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:20.800145 kubelet[2798]: W0313 00:34:20.800139 2798 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:20.800145 kubelet[2798]: E0313 00:34:20.800145 2798 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:34:20.800351 kubelet[2798]: E0313 00:34:20.800342 2798 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:20.800351 kubelet[2798]: W0313 00:34:20.800349 2798 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:20.800390 kubelet[2798]: E0313 00:34:20.800356 2798 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:34:20.800560 kubelet[2798]: E0313 00:34:20.800543 2798 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:20.800560 kubelet[2798]: W0313 00:34:20.800553 2798 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:20.800560 kubelet[2798]: E0313 00:34:20.800559 2798 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:34:20.800824 kubelet[2798]: E0313 00:34:20.800807 2798 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:20.800824 kubelet[2798]: W0313 00:34:20.800817 2798 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:20.800824 kubelet[2798]: E0313 00:34:20.800823 2798 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:34:20.801169 kubelet[2798]: E0313 00:34:20.801155 2798 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:20.801169 kubelet[2798]: W0313 00:34:20.801165 2798 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:20.801169 kubelet[2798]: E0313 00:34:20.801171 2798 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:34:20.801382 kubelet[2798]: E0313 00:34:20.801369 2798 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:20.801382 kubelet[2798]: W0313 00:34:20.801380 2798 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:20.801422 kubelet[2798]: E0313 00:34:20.801387 2798 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:34:20.801612 kubelet[2798]: E0313 00:34:20.801595 2798 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:20.801612 kubelet[2798]: W0313 00:34:20.801605 2798 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:20.801612 kubelet[2798]: E0313 00:34:20.801611 2798 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:34:20.801834 kubelet[2798]: E0313 00:34:20.801821 2798 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:20.801834 kubelet[2798]: W0313 00:34:20.801831 2798 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:20.801872 kubelet[2798]: E0313 00:34:20.801837 2798 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:34:20.802047 kubelet[2798]: E0313 00:34:20.802026 2798 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:20.802047 kubelet[2798]: W0313 00:34:20.802036 2798 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:20.802047 kubelet[2798]: E0313 00:34:20.802042 2798 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:34:20.802261 kubelet[2798]: E0313 00:34:20.802244 2798 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:20.802261 kubelet[2798]: W0313 00:34:20.802253 2798 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:20.802261 kubelet[2798]: E0313 00:34:20.802259 2798 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:34:20.802729 kubelet[2798]: E0313 00:34:20.802711 2798 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:20.802729 kubelet[2798]: W0313 00:34:20.802722 2798 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:20.802729 kubelet[2798]: E0313 00:34:20.802729 2798 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:34:20.802927 kubelet[2798]: E0313 00:34:20.802914 2798 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:20.802927 kubelet[2798]: W0313 00:34:20.802924 2798 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:20.802963 kubelet[2798]: E0313 00:34:20.802931 2798 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:34:20.803146 kubelet[2798]: E0313 00:34:20.803106 2798 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:20.803146 kubelet[2798]: W0313 00:34:20.803115 2798 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:20.803146 kubelet[2798]: E0313 00:34:20.803121 2798 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:34:20.803416 kubelet[2798]: E0313 00:34:20.803377 2798 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:20.803416 kubelet[2798]: W0313 00:34:20.803396 2798 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:20.803416 kubelet[2798]: E0313 00:34:20.803403 2798 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 13 00:34:20.803867 kubelet[2798]: E0313 00:34:20.803852 2798 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:20.803867 kubelet[2798]: W0313 00:34:20.803863 2798 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:20.803941 kubelet[2798]: E0313 00:34:20.803870 2798 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 13 00:34:21.728954 kubelet[2798]: I0313 00:34:21.728895 2798 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 13 00:34:21.784978 kubelet[2798]: E0313 00:34:21.784923 2798 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 13 00:34:21.784978 kubelet[2798]: W0313 00:34:21.784960 2798 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 13 00:34:21.784978 kubelet[2798]: E0313 00:34:21.784985 2798 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input"
Mar 13 00:34:21.785348 kubelet[2798]: E0313 00:34:21.785312 2798 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 13 00:34:21.785348 kubelet[2798]: W0313 00:34:21.785332 2798 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 13 00:34:21.785348 kubelet[2798]: E0313 00:34:21.785346 2798 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 13 00:34:21.785708 kubelet[2798]: E0313 00:34:21.785678 2798 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 13 00:34:21.785708 kubelet[2798]: W0313 00:34:21.785696 2798 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 13 00:34:21.785806 kubelet[2798]: E0313 00:34:21.785709 2798 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 13 00:34:21.786115 kubelet[2798]: E0313 00:34:21.786084 2798 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 13 00:34:21.786115 kubelet[2798]: W0313 00:34:21.786105 2798 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 13 00:34:21.786199 kubelet[2798]: E0313 00:34:21.786120 2798 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 13 00:34:21.786541 kubelet[2798]: E0313 00:34:21.786497 2798 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 13 00:34:21.786541 kubelet[2798]: W0313 00:34:21.786527 2798 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 13 00:34:21.786703 kubelet[2798]: E0313 00:34:21.786547 2798 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 13 00:34:21.786991 kubelet[2798]: E0313 00:34:21.786957 2798 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 13 00:34:21.786991 kubelet[2798]: W0313 00:34:21.786981 2798 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 13 00:34:21.787164 kubelet[2798]: E0313 00:34:21.786996 2798 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 13 00:34:21.787380 kubelet[2798]: E0313 00:34:21.787340 2798 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 13 00:34:21.787380 kubelet[2798]: W0313 00:34:21.787367 2798 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 13 00:34:21.787506 kubelet[2798]: E0313 00:34:21.787384 2798 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 13 00:34:21.787826 kubelet[2798]: E0313 00:34:21.787794 2798 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 13 00:34:21.787826 kubelet[2798]: W0313 00:34:21.787815 2798 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 13 00:34:21.787906 kubelet[2798]: E0313 00:34:21.787830 2798 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 13 00:34:21.788220 kubelet[2798]: E0313 00:34:21.788199 2798 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 13 00:34:21.788220 kubelet[2798]: W0313 00:34:21.788216 2798 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 13 00:34:21.788343 kubelet[2798]: E0313 00:34:21.788233 2798 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 13 00:34:21.788632 kubelet[2798]: E0313 00:34:21.788598 2798 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 13 00:34:21.788632 kubelet[2798]: W0313 00:34:21.788623 2798 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 13 00:34:21.788801 kubelet[2798]: E0313 00:34:21.788639 2798 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 13 00:34:21.789075 kubelet[2798]: E0313 00:34:21.789042 2798 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 13 00:34:21.789075 kubelet[2798]: W0313 00:34:21.789064 2798 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 13 00:34:21.789168 kubelet[2798]: E0313 00:34:21.789077 2798 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 13 00:34:21.789459 kubelet[2798]: E0313 00:34:21.789428 2798 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 13 00:34:21.789459 kubelet[2798]: W0313 00:34:21.789451 2798 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 13 00:34:21.789561 kubelet[2798]: E0313 00:34:21.789467 2798 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 13 00:34:21.789983 kubelet[2798]: E0313 00:34:21.789949 2798 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 13 00:34:21.789983 kubelet[2798]: W0313 00:34:21.789970 2798 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 13 00:34:21.790087 kubelet[2798]: E0313 00:34:21.789985 2798 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 13 00:34:21.790394 kubelet[2798]: E0313 00:34:21.790329 2798 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 13 00:34:21.790394 kubelet[2798]: W0313 00:34:21.790345 2798 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 13 00:34:21.790394 kubelet[2798]: E0313 00:34:21.790359 2798 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 13 00:34:21.790736 kubelet[2798]: E0313 00:34:21.790711 2798 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 13 00:34:21.790736 kubelet[2798]: W0313 00:34:21.790730 2798 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 13 00:34:21.790908 kubelet[2798]: E0313 00:34:21.790743 2798 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 13 00:34:21.807366 kubelet[2798]: E0313 00:34:21.807228 2798 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 13 00:34:21.807366 kubelet[2798]: W0313 00:34:21.807254 2798 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 13 00:34:21.807366 kubelet[2798]: E0313 00:34:21.807274 2798 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 13 00:34:21.807734 kubelet[2798]: E0313 00:34:21.807697 2798 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 13 00:34:21.807734 kubelet[2798]: W0313 00:34:21.807717 2798 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 13 00:34:21.807734 kubelet[2798]: E0313 00:34:21.807732 2798 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 13 00:34:21.808365 kubelet[2798]: E0313 00:34:21.808296 2798 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 13 00:34:21.808365 kubelet[2798]: W0313 00:34:21.808325 2798 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 13 00:34:21.808365 kubelet[2798]: E0313 00:34:21.808347 2798 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 13 00:34:21.808893 kubelet[2798]: E0313 00:34:21.808855 2798 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 13 00:34:21.808893 kubelet[2798]: W0313 00:34:21.808882 2798 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 13 00:34:21.809054 kubelet[2798]: E0313 00:34:21.808904 2798 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 13 00:34:21.809541 kubelet[2798]: E0313 00:34:21.809325 2798 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 13 00:34:21.809541 kubelet[2798]: W0313 00:34:21.809345 2798 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 13 00:34:21.809541 kubelet[2798]: E0313 00:34:21.809363 2798 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 13 00:34:21.809879 kubelet[2798]: E0313 00:34:21.809855 2798 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 13 00:34:21.809991 kubelet[2798]: W0313 00:34:21.809953 2798 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 13 00:34:21.809991 kubelet[2798]: E0313 00:34:21.809978 2798 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 13 00:34:21.810379 kubelet[2798]: E0313 00:34:21.810358 2798 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 13 00:34:21.810379 kubelet[2798]: W0313 00:34:21.810376 2798 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 13 00:34:21.810500 kubelet[2798]: E0313 00:34:21.810397 2798 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 13 00:34:21.810899 kubelet[2798]: E0313 00:34:21.810863 2798 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 13 00:34:21.810899 kubelet[2798]: W0313 00:34:21.810888 2798 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 13 00:34:21.811005 kubelet[2798]: E0313 00:34:21.810904 2798 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 13 00:34:21.812030 kubelet[2798]: E0313 00:34:21.811986 2798 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 13 00:34:21.812030 kubelet[2798]: W0313 00:34:21.812021 2798 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 13 00:34:21.812156 kubelet[2798]: E0313 00:34:21.812045 2798 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 13 00:34:21.813819 kubelet[2798]: E0313 00:34:21.813747 2798 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 13 00:34:21.813819 kubelet[2798]: W0313 00:34:21.813788 2798 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 13 00:34:21.813819 kubelet[2798]: E0313 00:34:21.813806 2798 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 13 00:34:21.817122 kubelet[2798]: E0313 00:34:21.816983 2798 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 13 00:34:21.817122 kubelet[2798]: W0313 00:34:21.817010 2798 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 13 00:34:21.817122 kubelet[2798]: E0313 00:34:21.817035 2798 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 13 00:34:21.818432 kubelet[2798]: E0313 00:34:21.818398 2798 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 13 00:34:21.818432 kubelet[2798]: W0313 00:34:21.818420 2798 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 13 00:34:21.818558 kubelet[2798]: E0313 00:34:21.818444 2798 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 13 00:34:21.821050 kubelet[2798]: E0313 00:34:21.820823 2798 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 13 00:34:21.821050 kubelet[2798]: W0313 00:34:21.820846 2798 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 13 00:34:21.821050 kubelet[2798]: E0313 00:34:21.820874 2798 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 13 00:34:21.821428 kubelet[2798]: E0313 00:34:21.821409 2798 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 13 00:34:21.821507 kubelet[2798]: W0313 00:34:21.821491 2798 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 13 00:34:21.821592 kubelet[2798]: E0313 00:34:21.821575 2798 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 13 00:34:21.822504 kubelet[2798]: E0313 00:34:21.822482 2798 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 13 00:34:21.823677 kubelet[2798]: W0313 00:34:21.822583 2798 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 13 00:34:21.823677 kubelet[2798]: E0313 00:34:21.822604 2798 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 13 00:34:21.825056 kubelet[2798]: E0313 00:34:21.825035 2798 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 13 00:34:21.825147 kubelet[2798]: W0313 00:34:21.825131 2798 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 13 00:34:21.825399 kubelet[2798]: E0313 00:34:21.825362 2798 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 13 00:34:21.826718 kubelet[2798]: E0313 00:34:21.826642 2798 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 13 00:34:21.826718 kubelet[2798]: W0313 00:34:21.826713 2798 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 13 00:34:21.826825 kubelet[2798]: E0313 00:34:21.826732 2798 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 13 00:34:21.827462 kubelet[2798]: E0313 00:34:21.827420 2798 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 13 00:34:21.827462 kubelet[2798]: W0313 00:34:21.827446 2798 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 13 00:34:21.827462 kubelet[2798]: E0313 00:34:21.827465 2798 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input"
Mar 13 00:34:22.256998 containerd[1635]: time="2026-03-13T00:34:22.256945646Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 13 00:34:22.258259 containerd[1635]: time="2026-03-13T00:34:22.258128130Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4: active requests=0, bytes read=4630250"
Mar 13 00:34:22.259430 containerd[1635]: time="2026-03-13T00:34:22.259406071Z" level=info msg="ImageCreate event name:\"sha256:a6ea0cf732d820506ae9f1d7e7433a14009026b894fbbb8f346b9a5f5335c47e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 13 00:34:22.261634 containerd[1635]: time="2026-03-13T00:34:22.261610630Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:5fa3492ac4dfef9cc34fe70a51289118e1f715a89133ea730eef81ad789dadbc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 13 00:34:22.262297 containerd[1635]: time="2026-03-13T00:34:22.262273491Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\" with image id \"sha256:a6ea0cf732d820506ae9f1d7e7433a14009026b894fbbb8f346b9a5f5335c47e\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:5fa3492ac4dfef9cc34fe70a51289118e1f715a89133ea730eef81ad789dadbc\", size \"6186255\" in 1.77980055s"
Mar 13 00:34:22.262382 containerd[1635]: time="2026-03-13T00:34:22.262368510Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\" returns image reference \"sha256:a6ea0cf732d820506ae9f1d7e7433a14009026b894fbbb8f346b9a5f5335c47e\""
Mar 13 00:34:22.265884 containerd[1635]: time="2026-03-13T00:34:22.265856881Z" level=info msg="CreateContainer within sandbox \"766dae36db74ba8f78bbe7a9ff332a9d193bb4c9fb87f2ef07d2884219cd44a0\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}"
Mar 13 00:34:22.276162 containerd[1635]: time="2026-03-13T00:34:22.275530545Z" level=info msg="Container cbea261f781c08897920dadefd48693d236b91b68022eb1dd27369ad600becf8: CDI devices from CRI Config.CDIDevices: []"
Mar 13 00:34:22.290584 containerd[1635]: time="2026-03-13T00:34:22.290542825Z" level=info msg="CreateContainer within sandbox \"766dae36db74ba8f78bbe7a9ff332a9d193bb4c9fb87f2ef07d2884219cd44a0\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"cbea261f781c08897920dadefd48693d236b91b68022eb1dd27369ad600becf8\""
Mar 13 00:34:22.290990 containerd[1635]: time="2026-03-13T00:34:22.290970868Z" level=info msg="StartContainer for \"cbea261f781c08897920dadefd48693d236b91b68022eb1dd27369ad600becf8\""
Mar 13 00:34:22.292328 containerd[1635]: time="2026-03-13T00:34:22.292312980Z" level=info msg="connecting to shim cbea261f781c08897920dadefd48693d236b91b68022eb1dd27369ad600becf8" address="unix:///run/containerd/s/f05b3b24552b3ea6e1a1869fe42443b1ec9669e4f832715ccaba488ac63e4a77" protocol=ttrpc version=3
Mar 13 00:34:22.317227 systemd[1]: Started cri-containerd-cbea261f781c08897920dadefd48693d236b91b68022eb1dd27369ad600becf8.scope - libcontainer container cbea261f781c08897920dadefd48693d236b91b68022eb1dd27369ad600becf8.
Mar 13 00:34:22.373556 containerd[1635]: time="2026-03-13T00:34:22.373506692Z" level=info msg="StartContainer for \"cbea261f781c08897920dadefd48693d236b91b68022eb1dd27369ad600becf8\" returns successfully"
Mar 13 00:34:22.385713 systemd[1]: cri-containerd-cbea261f781c08897920dadefd48693d236b91b68022eb1dd27369ad600becf8.scope: Deactivated successfully.
Mar 13 00:34:22.389242 containerd[1635]: time="2026-03-13T00:34:22.388990494Z" level=info msg="received container exit event container_id:\"cbea261f781c08897920dadefd48693d236b91b68022eb1dd27369ad600becf8\" id:\"cbea261f781c08897920dadefd48693d236b91b68022eb1dd27369ad600becf8\" pid:3523 exited_at:{seconds:1773362062 nanos:388354864}"
Mar 13 00:34:22.408141 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-cbea261f781c08897920dadefd48693d236b91b68022eb1dd27369ad600becf8-rootfs.mount: Deactivated successfully.
Mar 13 00:34:22.659506 kubelet[2798]: E0313 00:34:22.659004 2798 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-tp7j8" podUID="7e7f9c17-2933-4641-bd19-7c38cba62524"
Mar 13 00:34:22.739385 containerd[1635]: time="2026-03-13T00:34:22.739146395Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.31.4\""
Mar 13 00:34:22.769970 kubelet[2798]: I0313 00:34:22.769342 2798 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-95fb8f89c-2gn44" podStartSLOduration=3.686377889 podStartE2EDuration="6.769321923s" podCreationTimestamp="2026-03-13 00:34:16 +0000 UTC" firstStartedPulling="2026-03-13 00:34:17.399419048 +0000 UTC m=+17.842674197" lastFinishedPulling="2026-03-13 00:34:20.482363072 +0000 UTC m=+20.925618231" observedRunningTime="2026-03-13 00:34:20.746772459 +0000 UTC m=+21.190027608" watchObservedRunningTime="2026-03-13 00:34:22.769321923 +0000 UTC m=+23.212577112"
Mar 13 00:34:24.659355 kubelet[2798]: E0313 00:34:24.659149 2798 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-tp7j8" podUID="7e7f9c17-2933-4641-bd19-7c38cba62524"
Mar 13 00:34:26.658537 kubelet[2798]: E0313 00:34:26.658444 2798 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-tp7j8" podUID="7e7f9c17-2933-4641-bd19-7c38cba62524"
Mar 13 00:34:28.658912 kubelet[2798]: E0313 00:34:28.658831 2798 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-tp7j8" podUID="7e7f9c17-2933-4641-bd19-7c38cba62524"
Mar 13 00:34:29.052460 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3180344499.mount: Deactivated successfully.
Mar 13 00:34:29.078539 containerd[1635]: time="2026-03-13T00:34:29.078494812Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 13 00:34:29.079231 containerd[1635]: time="2026-03-13T00:34:29.079211626Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.31.4: active requests=0, bytes read=159838564"
Mar 13 00:34:29.080196 containerd[1635]: time="2026-03-13T00:34:29.080153288Z" level=info msg="ImageCreate event name:\"sha256:e6536b93706eda782f82ebadcac3559cb61801d09f982cc0533a134e6a8e1acf\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 13 00:34:29.081682 containerd[1635]: time="2026-03-13T00:34:29.081640605Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:22b9d32dc7480c96272121d5682d53424c6e58653c60fa869b61a1758a11d77f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 13 00:34:29.082294 containerd[1635]: time="2026-03-13T00:34:29.081968472Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.31.4\" with image id \"sha256:e6536b93706eda782f82ebadcac3559cb61801d09f982cc0533a134e6a8e1acf\", repo tag \"ghcr.io/flatcar/calico/node:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/node@sha256:22b9d32dc7480c96272121d5682d53424c6e58653c60fa869b61a1758a11d77f\", size \"159838426\" in 6.342642259s"
Mar 13 00:34:29.082294 containerd[1635]: time="2026-03-13T00:34:29.081989992Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.31.4\" returns image reference \"sha256:e6536b93706eda782f82ebadcac3559cb61801d09f982cc0533a134e6a8e1acf\""
Mar 13 00:34:29.084952 containerd[1635]: time="2026-03-13T00:34:29.084928446Z" level=info msg="CreateContainer within sandbox \"766dae36db74ba8f78bbe7a9ff332a9d193bb4c9fb87f2ef07d2884219cd44a0\" for container &ContainerMetadata{Name:ebpf-bootstrap,Attempt:0,}"
Mar 13 00:34:29.095417 containerd[1635]: time="2026-03-13T00:34:29.094787819Z" level=info msg="Container 696beb04f3f09eff00a7b3238041f85542d28809565a770b6ef947393ed29296: CDI devices from CRI Config.CDIDevices: []"
Mar 13 00:34:29.102136 containerd[1635]: time="2026-03-13T00:34:29.102108415Z" level=info msg="CreateContainer within sandbox \"766dae36db74ba8f78bbe7a9ff332a9d193bb4c9fb87f2ef07d2884219cd44a0\" for &ContainerMetadata{Name:ebpf-bootstrap,Attempt:0,} returns container id \"696beb04f3f09eff00a7b3238041f85542d28809565a770b6ef947393ed29296\""
Mar 13 00:34:29.102661 containerd[1635]: time="2026-03-13T00:34:29.102620430Z" level=info msg="StartContainer for \"696beb04f3f09eff00a7b3238041f85542d28809565a770b6ef947393ed29296\""
Mar 13 00:34:29.103924 containerd[1635]: time="2026-03-13T00:34:29.103800700Z" level=info msg="connecting to shim 696beb04f3f09eff00a7b3238041f85542d28809565a770b6ef947393ed29296" address="unix:///run/containerd/s/f05b3b24552b3ea6e1a1869fe42443b1ec9669e4f832715ccaba488ac63e4a77" protocol=ttrpc version=3
Mar 13 00:34:29.126774 systemd[1]: Started cri-containerd-696beb04f3f09eff00a7b3238041f85542d28809565a770b6ef947393ed29296.scope - libcontainer container 696beb04f3f09eff00a7b3238041f85542d28809565a770b6ef947393ed29296.
Mar 13 00:34:29.188514 containerd[1635]: time="2026-03-13T00:34:29.188435257Z" level=info msg="StartContainer for \"696beb04f3f09eff00a7b3238041f85542d28809565a770b6ef947393ed29296\" returns successfully"
Mar 13 00:34:29.221997 systemd[1]: cri-containerd-696beb04f3f09eff00a7b3238041f85542d28809565a770b6ef947393ed29296.scope: Deactivated successfully.
Mar 13 00:34:29.223617 containerd[1635]: time="2026-03-13T00:34:29.223543570Z" level=info msg="received container exit event container_id:\"696beb04f3f09eff00a7b3238041f85542d28809565a770b6ef947393ed29296\" id:\"696beb04f3f09eff00a7b3238041f85542d28809565a770b6ef947393ed29296\" pid:3577 exited_at:{seconds:1773362069 nanos:223243173}"
Mar 13 00:34:29.764569 containerd[1635]: time="2026-03-13T00:34:29.764525553Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.31.4\""
Mar 13 00:34:30.054471 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-696beb04f3f09eff00a7b3238041f85542d28809565a770b6ef947393ed29296-rootfs.mount: Deactivated successfully.
Mar 13 00:34:30.658880 kubelet[2798]: E0313 00:34:30.658430 2798 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-tp7j8" podUID="7e7f9c17-2933-4641-bd19-7c38cba62524"
Mar 13 00:34:32.658330 kubelet[2798]: E0313 00:34:32.658143 2798 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-tp7j8" podUID="7e7f9c17-2933-4641-bd19-7c38cba62524"
Mar 13 00:34:33.445531 containerd[1635]: time="2026-03-13T00:34:33.445478093Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 13 00:34:33.446548 containerd[1635]: time="2026-03-13T00:34:33.446469337Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.31.4: active requests=0, bytes read=70611671"
Mar 13 00:34:33.447241 containerd[1635]: time="2026-03-13T00:34:33.447221961Z" level=info msg="ImageCreate event name:\"sha256:c433a27dd94ce9242338eece49f11629412dd42552fed314746fcf16ea958b2b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 13 00:34:33.449216 containerd[1635]: time="2026-03-13T00:34:33.448750581Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:f1c5d9a6df01061c5faec4c4b59fb9ba69f8f5164b51e01ea8daa8e373111a04\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 13 00:34:33.449216 containerd[1635]: time="2026-03-13T00:34:33.449127579Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.31.4\" with image id \"sha256:c433a27dd94ce9242338eece49f11629412dd42552fed314746fcf16ea958b2b\", repo tag \"ghcr.io/flatcar/calico/cni:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:f1c5d9a6df01061c5faec4c4b59fb9ba69f8f5164b51e01ea8daa8e373111a04\", size \"72167716\" in 3.68415451s"
Mar 13 00:34:33.449216 containerd[1635]: time="2026-03-13T00:34:33.449146649Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.31.4\" returns image reference \"sha256:c433a27dd94ce9242338eece49f11629412dd42552fed314746fcf16ea958b2b\""
Mar 13 00:34:33.452346 containerd[1635]: time="2026-03-13T00:34:33.452318777Z" level=info msg="CreateContainer within sandbox \"766dae36db74ba8f78bbe7a9ff332a9d193bb4c9fb87f2ef07d2884219cd44a0\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}"
Mar 13 00:34:33.458670 containerd[1635]: time="2026-03-13T00:34:33.458334627Z" level=info msg="Container 0a78290b96e05ea921687080accb1c80bc96d5e27d238f77f4e7f3681e8888c9: CDI devices from CRI Config.CDIDevices: []"
Mar 13 00:34:33.472858 containerd[1635]: time="2026-03-13T00:34:33.472822401Z" level=info msg="CreateContainer within sandbox \"766dae36db74ba8f78bbe7a9ff332a9d193bb4c9fb87f2ef07d2884219cd44a0\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"0a78290b96e05ea921687080accb1c80bc96d5e27d238f77f4e7f3681e8888c9\""
Mar 13 00:34:33.473287 containerd[1635]: time="2026-03-13T00:34:33.473173188Z" level=info msg="StartContainer for \"0a78290b96e05ea921687080accb1c80bc96d5e27d238f77f4e7f3681e8888c9\""
Mar 13 00:34:33.474475 containerd[1635]: time="2026-03-13T00:34:33.474453429Z" level=info msg="connecting to shim 0a78290b96e05ea921687080accb1c80bc96d5e27d238f77f4e7f3681e8888c9" address="unix:///run/containerd/s/f05b3b24552b3ea6e1a1869fe42443b1ec9669e4f832715ccaba488ac63e4a77" protocol=ttrpc version=3
Mar 13 00:34:33.492780 systemd[1]: Started cri-containerd-0a78290b96e05ea921687080accb1c80bc96d5e27d238f77f4e7f3681e8888c9.scope - libcontainer container 0a78290b96e05ea921687080accb1c80bc96d5e27d238f77f4e7f3681e8888c9.
Mar 13 00:34:33.558022 containerd[1635]: time="2026-03-13T00:34:33.557988041Z" level=info msg="StartContainer for \"0a78290b96e05ea921687080accb1c80bc96d5e27d238f77f4e7f3681e8888c9\" returns successfully"
Mar 13 00:34:33.995888 containerd[1635]: time="2026-03-13T00:34:33.995837305Z" level=error msg="failed to reload cni configuration after receiving fs change event(WRITE \"/etc/cni/net.d/calico-kubeconfig\")" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config"
Mar 13 00:34:33.998424 systemd[1]: cri-containerd-0a78290b96e05ea921687080accb1c80bc96d5e27d238f77f4e7f3681e8888c9.scope: Deactivated successfully.
Mar 13 00:34:33.999102 systemd[1]: cri-containerd-0a78290b96e05ea921687080accb1c80bc96d5e27d238f77f4e7f3681e8888c9.scope: Consumed 393ms CPU time, 191.4M memory peak, 1.8M read from disk, 177M written to disk.
Mar 13 00:34:34.000074 containerd[1635]: time="2026-03-13T00:34:34.000052217Z" level=info msg="received container exit event container_id:\"0a78290b96e05ea921687080accb1c80bc96d5e27d238f77f4e7f3681e8888c9\" id:\"0a78290b96e05ea921687080accb1c80bc96d5e27d238f77f4e7f3681e8888c9\" pid:3633 exited_at:{seconds:1773362073 nanos:999903628}"
Mar 13 00:34:34.008378 kubelet[2798]: I0313 00:34:34.008358 2798 kubelet_node_status.go:439] "Fast updating node status as it just became ready"
Mar 13 00:34:34.056548 systemd[1]: Created slice kubepods-burstable-pod7e5979e4_f434_490a_b349_72aee91f876a.slice - libcontainer container kubepods-burstable-pod7e5979e4_f434_490a_b349_72aee91f876a.slice.
Mar 13 00:34:34.078460 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-0a78290b96e05ea921687080accb1c80bc96d5e27d238f77f4e7f3681e8888c9-rootfs.mount: Deactivated successfully.
Mar 13 00:34:34.083042 systemd[1]: Created slice kubepods-burstable-podda12e426_2ad2_4ffc_9b11_b6543119413f.slice - libcontainer container kubepods-burstable-podda12e426_2ad2_4ffc_9b11_b6543119413f.slice.
Mar 13 00:34:34.089017 systemd[1]: Created slice kubepods-besteffort-pod28a7fd80_dee6_42b2_af21_3af7c0ab8ba1.slice - libcontainer container kubepods-besteffort-pod28a7fd80_dee6_42b2_af21_3af7c0ab8ba1.slice.
Mar 13 00:34:34.101285 systemd[1]: Created slice kubepods-besteffort-pod74e2b49f_1624_4b27_9f4b_1bb46fbfa4d3.slice - libcontainer container kubepods-besteffort-pod74e2b49f_1624_4b27_9f4b_1bb46fbfa4d3.slice.
Mar 13 00:34:34.106351 kubelet[2798]: I0313 00:34:34.106329 2798 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xhkx\" (UniqueName: \"kubernetes.io/projected/7e5979e4-f434-490a-b349-72aee91f876a-kube-api-access-5xhkx\") pod \"coredns-66bc5c9577-mgfwk\" (UID: \"7e5979e4-f434-490a-b349-72aee91f876a\") " pod="kube-system/coredns-66bc5c9577-mgfwk"
Mar 13 00:34:34.106676 kubelet[2798]: I0313 00:34:34.106501 2798 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4lxcr\" (UniqueName: \"kubernetes.io/projected/1c3c5981-ba4c-4160-bb56-967a52352e9f-kube-api-access-4lxcr\") pod \"calico-apiserver-7449d668ff-qzpl5\" (UID: \"1c3c5981-ba4c-4160-bb56-967a52352e9f\") " pod="calico-system/calico-apiserver-7449d668ff-qzpl5"
Mar 13 00:34:34.106676 kubelet[2798]: I0313 00:34:34.106518 2798 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/da12e426-2ad2-4ffc-9b11-b6543119413f-config-volume\") pod \"coredns-66bc5c9577-t7h9z\" (UID: \"da12e426-2ad2-4ffc-9b11-b6543119413f\") " pod="kube-system/coredns-66bc5c9577-t7h9z"
Mar 13 00:34:34.106676 kubelet[2798]: I0313 00:34:34.106530 2798 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/1c3c5981-ba4c-4160-bb56-967a52352e9f-calico-apiserver-certs\") pod \"calico-apiserver-7449d668ff-qzpl5\" (UID: \"1c3c5981-ba4c-4160-bb56-967a52352e9f\") " pod="calico-system/calico-apiserver-7449d668ff-qzpl5"
Mar 13 00:34:34.106676 kubelet[2798]: I0313 00:34:34.106541 2798 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/74e2b49f-1624-4b27-9f4b-1bb46fbfa4d3-tigera-ca-bundle\") pod \"calico-kube-controllers-6b485f8c87-fd8vp\" (UID: \"74e2b49f-1624-4b27-9f4b-1bb46fbfa4d3\") " pod="calico-system/calico-kube-controllers-6b485f8c87-fd8vp"
Mar 13 00:34:34.106676 kubelet[2798]: I0313 00:34:34.106554 2798 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/28a7fd80-dee6-42b2-af21-3af7c0ab8ba1-whisker-backend-key-pair\") pod \"whisker-5c7d565994-mf58t\" (UID: \"28a7fd80-dee6-42b2-af21-3af7c0ab8ba1\") " pod="calico-system/whisker-5c7d565994-mf58t"
Mar 13 00:34:34.106798 kubelet[2798]: I0313 00:34:34.106569 2798 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2m25s\" (UniqueName: \"kubernetes.io/projected/da12e426-2ad2-4ffc-9b11-b6543119413f-kube-api-access-2m25s\") pod \"coredns-66bc5c9577-t7h9z\" (UID: \"da12e426-2ad2-4ffc-9b11-b6543119413f\") " pod="kube-system/coredns-66bc5c9577-t7h9z"
Mar 13 00:34:34.106798 kubelet[2798]: I0313 00:34:34.106590 2798 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-config\" (UniqueName: \"kubernetes.io/configmap/28a7fd80-dee6-42b2-af21-3af7c0ab8ba1-nginx-config\") pod \"whisker-5c7d565994-mf58t\" (UID: \"28a7fd80-dee6-42b2-af21-3af7c0ab8ba1\") " pod="calico-system/whisker-5c7d565994-mf58t"
Mar 13 00:34:34.106798 kubelet[2798]: I0313 00:34:34.106601 2798 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k2rpr\" (UniqueName: \"kubernetes.io/projected/28a7fd80-dee6-42b2-af21-3af7c0ab8ba1-kube-api-access-k2rpr\") pod \"whisker-5c7d565994-mf58t\" (UID: \"28a7fd80-dee6-42b2-af21-3af7c0ab8ba1\") " pod="calico-system/whisker-5c7d565994-mf58t"
Mar 13 00:34:34.106798 kubelet[2798]: I0313 00:34:34.106611 2798 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/1c6e5827-ca2b-4238-8d67-bf66b92bd43f-calico-apiserver-certs\") pod \"calico-apiserver-7449d668ff-b29lf\" (UID: \"1c6e5827-ca2b-4238-8d67-bf66b92bd43f\") " pod="calico-system/calico-apiserver-7449d668ff-b29lf"
Mar 13 00:34:34.106798 kubelet[2798]: I0313 00:34:34.106629 2798 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/28a7fd80-dee6-42b2-af21-3af7c0ab8ba1-whisker-ca-bundle\") pod \"whisker-5c7d565994-mf58t\" (UID: \"28a7fd80-dee6-42b2-af21-3af7c0ab8ba1\") " pod="calico-system/whisker-5c7d565994-mf58t"
Mar 13 00:34:34.106895 kubelet[2798]: I0313 00:34:34.106638 2798 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hfz79\" (UniqueName: \"kubernetes.io/projected/1c6e5827-ca2b-4238-8d67-bf66b92bd43f-kube-api-access-hfz79\") pod \"calico-apiserver-7449d668ff-b29lf\" (UID: \"1c6e5827-ca2b-4238-8d67-bf66b92bd43f\") " pod="calico-system/calico-apiserver-7449d668ff-b29lf"
Mar 13 00:34:34.107055 systemd[1]: Created slice kubepods-besteffort-pod1c6e5827_ca2b_4238_8d67_bf66b92bd43f.slice - libcontainer container kubepods-besteffort-pod1c6e5827_ca2b_4238_8d67_bf66b92bd43f.slice.
Mar 13 00:34:34.107753 kubelet[2798]: I0313 00:34:34.107684 2798 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7e5979e4-f434-490a-b349-72aee91f876a-config-volume\") pod \"coredns-66bc5c9577-mgfwk\" (UID: \"7e5979e4-f434-490a-b349-72aee91f876a\") " pod="kube-system/coredns-66bc5c9577-mgfwk"
Mar 13 00:34:34.107753 kubelet[2798]: I0313 00:34:34.107710 2798 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qmrl6\" (UniqueName: \"kubernetes.io/projected/74e2b49f-1624-4b27-9f4b-1bb46fbfa4d3-kube-api-access-qmrl6\") pod \"calico-kube-controllers-6b485f8c87-fd8vp\" (UID: \"74e2b49f-1624-4b27-9f4b-1bb46fbfa4d3\") " pod="calico-system/calico-kube-controllers-6b485f8c87-fd8vp"
Mar 13 00:34:34.112990 systemd[1]: Created slice kubepods-besteffort-pod1c3c5981_ba4c_4160_bb56_967a52352e9f.slice - libcontainer container kubepods-besteffort-pod1c3c5981_ba4c_4160_bb56_967a52352e9f.slice.
Mar 13 00:34:34.118227 systemd[1]: Created slice kubepods-besteffort-pod8927d3f8_4e49_4a05_8332_7df32999988a.slice - libcontainer container kubepods-besteffort-pod8927d3f8_4e49_4a05_8332_7df32999988a.slice.
Mar 13 00:34:34.209010 kubelet[2798]: I0313 00:34:34.208938 2798 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/8927d3f8-4e49-4a05-8332-7df32999988a-goldmane-key-pair\") pod \"goldmane-cccfbd5cf-ntbll\" (UID: \"8927d3f8-4e49-4a05-8332-7df32999988a\") " pod="calico-system/goldmane-cccfbd5cf-ntbll"
Mar 13 00:34:34.209010 kubelet[2798]: I0313 00:34:34.208995 2798 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8927d3f8-4e49-4a05-8332-7df32999988a-config\") pod \"goldmane-cccfbd5cf-ntbll\" (UID: \"8927d3f8-4e49-4a05-8332-7df32999988a\") " pod="calico-system/goldmane-cccfbd5cf-ntbll"
Mar 13 00:34:34.209010 kubelet[2798]: I0313 00:34:34.209006 2798 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8927d3f8-4e49-4a05-8332-7df32999988a-goldmane-ca-bundle\") pod \"goldmane-cccfbd5cf-ntbll\" (UID: \"8927d3f8-4e49-4a05-8332-7df32999988a\") " pod="calico-system/goldmane-cccfbd5cf-ntbll"
Mar 13 00:34:34.209277 kubelet[2798]: I0313 00:34:34.209054 2798 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7trh\" (UniqueName: \"kubernetes.io/projected/8927d3f8-4e49-4a05-8332-7df32999988a-kube-api-access-r7trh\") pod \"goldmane-cccfbd5cf-ntbll\" (UID: \"8927d3f8-4e49-4a05-8332-7df32999988a\") " pod="calico-system/goldmane-cccfbd5cf-ntbll"
Mar 13 00:34:34.371222 containerd[1635]: time="2026-03-13T00:34:34.370966263Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-mgfwk,Uid:7e5979e4-f434-490a-b349-72aee91f876a,Namespace:kube-system,Attempt:0,}"
Mar 13 00:34:34.399312 containerd[1635]: time="2026-03-13T00:34:34.399080957Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-t7h9z,Uid:da12e426-2ad2-4ffc-9b11-b6543119413f,Namespace:kube-system,Attempt:0,}"
Mar 13 00:34:34.399552 kubelet[2798]: I0313 00:34:34.399466 2798 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 13 00:34:34.401561 containerd[1635]: time="2026-03-13T00:34:34.401264963Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-5c7d565994-mf58t,Uid:28a7fd80-dee6-42b2-af21-3af7c0ab8ba1,Namespace:calico-system,Attempt:0,}"
Mar 13 00:34:34.412589 containerd[1635]: time="2026-03-13T00:34:34.412382864Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-6b485f8c87-fd8vp,Uid:74e2b49f-1624-4b27-9f4b-1bb46fbfa4d3,Namespace:calico-system,Attempt:0,}"
Mar 13 00:34:34.416558 containerd[1635]: time="2026-03-13T00:34:34.416427459Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7449d668ff-b29lf,Uid:1c6e5827-ca2b-4238-8d67-bf66b92bd43f,Namespace:calico-system,Attempt:0,}"
Mar 13 00:34:34.438630 containerd[1635]: time="2026-03-13T00:34:34.438571231Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-cccfbd5cf-ntbll,Uid:8927d3f8-4e49-4a05-8332-7df32999988a,Namespace:calico-system,Attempt:0,}"
Mar 13 00:34:34.442280 containerd[1635]: time="2026-03-13T00:34:34.439061998Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7449d668ff-qzpl5,Uid:1c3c5981-ba4c-4160-bb56-967a52352e9f,Namespace:calico-system,Attempt:0,}"
Mar 13 00:34:34.600421 containerd[1635]: time="2026-03-13T00:34:34.599880674Z" level=error msg="Failed to destroy network for sandbox \"402a868b481352a9d794e858d70848c1ba18e81c7153edc1f2cd3d067af336dc\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 13 00:34:34.601942 systemd[1]: run-netns-cni\x2d201b5a67\x2d1f75\x2d3341\x2d34ba\x2d54bc5c21cbfb.mount: Deactivated successfully.
Mar 13 00:34:34.605956 containerd[1635]: time="2026-03-13T00:34:34.604737214Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-5c7d565994-mf58t,Uid:28a7fd80-dee6-42b2-af21-3af7c0ab8ba1,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"402a868b481352a9d794e858d70848c1ba18e81c7153edc1f2cd3d067af336dc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 13 00:34:34.606036 kubelet[2798]: E0313 00:34:34.604909 2798 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"402a868b481352a9d794e858d70848c1ba18e81c7153edc1f2cd3d067af336dc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 13 00:34:34.606036 kubelet[2798]: E0313 00:34:34.604959 2798 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"402a868b481352a9d794e858d70848c1ba18e81c7153edc1f2cd3d067af336dc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-5c7d565994-mf58t"
Mar 13 00:34:34.606036 kubelet[2798]: E0313 00:34:34.604973 2798 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"402a868b481352a9d794e858d70848c1ba18e81c7153edc1f2cd3d067af336dc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-5c7d565994-mf58t"
Mar 13 00:34:34.606112 kubelet[2798]: E0313 00:34:34.605015 2798 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-5c7d565994-mf58t_calico-system(28a7fd80-dee6-42b2-af21-3af7c0ab8ba1)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-5c7d565994-mf58t_calico-system(28a7fd80-dee6-42b2-af21-3af7c0ab8ba1)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"402a868b481352a9d794e858d70848c1ba18e81c7153edc1f2cd3d067af336dc\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-5c7d565994-mf58t" podUID="28a7fd80-dee6-42b2-af21-3af7c0ab8ba1"
Mar 13 00:34:34.610859 containerd[1635]: time="2026-03-13T00:34:34.610831957Z" level=error msg="Failed to destroy network for sandbox \"84508f9066e999bed9c3199a6f70b82203d7da76a1f1f16a09ad9262909d044f\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 13 00:34:34.613168 systemd[1]: run-netns-cni\x2da508ab2f\x2dce74\x2d3fb3\x2d8084\x2dc8de5cac0656.mount: Deactivated successfully.
Mar 13 00:34:34.615082 containerd[1635]: time="2026-03-13T00:34:34.615062270Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-6b485f8c87-fd8vp,Uid:74e2b49f-1624-4b27-9f4b-1bb46fbfa4d3,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"84508f9066e999bed9c3199a6f70b82203d7da76a1f1f16a09ad9262909d044f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 13 00:34:34.616219 kubelet[2798]: E0313 00:34:34.615292 2798 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"84508f9066e999bed9c3199a6f70b82203d7da76a1f1f16a09ad9262909d044f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 13 00:34:34.616219 kubelet[2798]: E0313 00:34:34.615331 2798 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"84508f9066e999bed9c3199a6f70b82203d7da76a1f1f16a09ad9262909d044f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-6b485f8c87-fd8vp"
Mar 13 00:34:34.616219 kubelet[2798]: E0313 00:34:34.615345 2798 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"84508f9066e999bed9c3199a6f70b82203d7da76a1f1f16a09ad9262909d044f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-6b485f8c87-fd8vp"
Mar 13 00:34:34.616316 kubelet[2798]: E0313 00:34:34.615383 2798 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-6b485f8c87-fd8vp_calico-system(74e2b49f-1624-4b27-9f4b-1bb46fbfa4d3)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-6b485f8c87-fd8vp_calico-system(74e2b49f-1624-4b27-9f4b-1bb46fbfa4d3)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"84508f9066e999bed9c3199a6f70b82203d7da76a1f1f16a09ad9262909d044f\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-6b485f8c87-fd8vp" podUID="74e2b49f-1624-4b27-9f4b-1bb46fbfa4d3"
Mar 13 00:34:34.618545 containerd[1635]: time="2026-03-13T00:34:34.618509849Z" level=error msg="Failed to destroy network for sandbox \"15d2ef5f7fb85a7d0dd164bc89a5812be8388c3d8029a4507d987a2be18b87d8\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 13 00:34:34.620403 containerd[1635]: time="2026-03-13T00:34:34.620377007Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7449d668ff-qzpl5,Uid:1c3c5981-ba4c-4160-bb56-967a52352e9f,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"15d2ef5f7fb85a7d0dd164bc89a5812be8388c3d8029a4507d987a2be18b87d8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 13 00:34:34.622452 kubelet[2798]: E0313 00:34:34.621081 2798 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"15d2ef5f7fb85a7d0dd164bc89a5812be8388c3d8029a4507d987a2be18b87d8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 13 00:34:34.622452 kubelet[2798]: E0313 00:34:34.621110 2798 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"15d2ef5f7fb85a7d0dd164bc89a5812be8388c3d8029a4507d987a2be18b87d8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-apiserver-7449d668ff-qzpl5"
Mar 13 00:34:34.622452 kubelet[2798]: E0313 00:34:34.621128 2798 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"15d2ef5f7fb85a7d0dd164bc89a5812be8388c3d8029a4507d987a2be18b87d8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-apiserver-7449d668ff-qzpl5"
Mar 13 00:34:34.622015 systemd[1]: run-netns-cni\x2d94a96074\x2db69c\x2dc0e1\x2dc0e2\x2d3ce7626191b5.mount: Deactivated successfully.
Mar 13 00:34:34.622583 kubelet[2798]: E0313 00:34:34.621160 2798 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-7449d668ff-qzpl5_calico-system(1c3c5981-ba4c-4160-bb56-967a52352e9f)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-7449d668ff-qzpl5_calico-system(1c3c5981-ba4c-4160-bb56-967a52352e9f)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"15d2ef5f7fb85a7d0dd164bc89a5812be8388c3d8029a4507d987a2be18b87d8\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-apiserver-7449d668ff-qzpl5" podUID="1c3c5981-ba4c-4160-bb56-967a52352e9f"
Mar 13 00:34:34.630441 containerd[1635]: time="2026-03-13T00:34:34.630308845Z" level=error msg="Failed to destroy network for sandbox \"d4dc9bef778ec9d8eb330dced7a8764b24fda54bbec5afffbed7272a3d4479e6\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 13 00:34:34.631868 containerd[1635]: time="2026-03-13T00:34:34.631838615Z" level=error msg="Failed to destroy network for sandbox \"1917207ea43cc9ba73ea0bcb8c1c6a07876616199e7ed70b08705d013fb3a8e7\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 13 00:34:34.632592 containerd[1635]: time="2026-03-13T00:34:34.632032444Z" level=error msg="Failed to destroy network for sandbox \"9cbf65b2f9228b790c9666fc3c84a3d27c13f81b6c6b4a826fc2363e4ed873b3\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 13 00:34:34.632871 containerd[1635]: time="2026-03-13T00:34:34.632849749Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-mgfwk,Uid:7e5979e4-f434-490a-b349-72aee91f876a,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"d4dc9bef778ec9d8eb330dced7a8764b24fda54bbec5afffbed7272a3d4479e6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 13 00:34:34.633255 kubelet[2798]: E0313 00:34:34.633161 2798 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d4dc9bef778ec9d8eb330dced7a8764b24fda54bbec5afffbed7272a3d4479e6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 13 00:34:34.633255 kubelet[2798]: E0313 00:34:34.633217 2798 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d4dc9bef778ec9d8eb330dced7a8764b24fda54bbec5afffbed7272a3d4479e6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-66bc5c9577-mgfwk"
Mar 13 00:34:34.633255 kubelet[2798]: E0313 00:34:34.633230 2798 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d4dc9bef778ec9d8eb330dced7a8764b24fda54bbec5afffbed7272a3d4479e6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-66bc5c9577-mgfwk"
Mar 13 00:34:34.633557 kubelet[2798]: E0313 00:34:34.633426 2798 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-66bc5c9577-mgfwk_kube-system(7e5979e4-f434-490a-b349-72aee91f876a)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-66bc5c9577-mgfwk_kube-system(7e5979e4-f434-490a-b349-72aee91f876a)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"d4dc9bef778ec9d8eb330dced7a8764b24fda54bbec5afffbed7272a3d4479e6\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-66bc5c9577-mgfwk" podUID="7e5979e4-f434-490a-b349-72aee91f876a"
Mar 13 00:34:34.634293 containerd[1635]: time="2026-03-13T00:34:34.634205780Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-t7h9z,Uid:da12e426-2ad2-4ffc-9b11-b6543119413f,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"1917207ea43cc9ba73ea0bcb8c1c6a07876616199e7ed70b08705d013fb3a8e7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 13 00:34:34.635327 kubelet[2798]: E0313 00:34:34.635308 2798 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1917207ea43cc9ba73ea0bcb8c1c6a07876616199e7ed70b08705d013fb3a8e7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 13 00:34:34.635378 kubelet[2798]: E0313 00:34:34.635331 2798 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1917207ea43cc9ba73ea0bcb8c1c6a07876616199e7ed70b08705d013fb3a8e7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-66bc5c9577-t7h9z"
Mar 13 00:34:34.635378 kubelet[2798]: E0313 00:34:34.635342 2798 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1917207ea43cc9ba73ea0bcb8c1c6a07876616199e7ed70b08705d013fb3a8e7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-66bc5c9577-t7h9z"
Mar 13 00:34:34.635378 kubelet[2798]: E0313 00:34:34.635365 2798 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-66bc5c9577-t7h9z_kube-system(da12e426-2ad2-4ffc-9b11-b6543119413f)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-66bc5c9577-t7h9z_kube-system(da12e426-2ad2-4ffc-9b11-b6543119413f)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"1917207ea43cc9ba73ea0bcb8c1c6a07876616199e7ed70b08705d013fb3a8e7\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-66bc5c9577-t7h9z" podUID="da12e426-2ad2-4ffc-9b11-b6543119413f"
Mar 13 00:34:34.635516 containerd[1635]: time="2026-03-13T00:34:34.635452033Z" level=error msg="Failed to destroy network for sandbox \"331de107c46689976562d1d423dea02d1cda469d2c8092ad32f44ebe6b09795c\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 13 00:34:34.636185 containerd[1635]: time="2026-03-13T00:34:34.636158568Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-cccfbd5cf-ntbll,Uid:8927d3f8-4e49-4a05-8332-7df32999988a,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"9cbf65b2f9228b790c9666fc3c84a3d27c13f81b6c6b4a826fc2363e4ed873b3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 13 00:34:34.636357 kubelet[2798]: E0313 00:34:34.636333 2798 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9cbf65b2f9228b790c9666fc3c84a3d27c13f81b6c6b4a826fc2363e4ed873b3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 13 00:34:34.636402 kubelet[2798]: E0313 00:34:34.636360 2798 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9cbf65b2f9228b790c9666fc3c84a3d27c13f81b6c6b4a826fc2363e4ed873b3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-cccfbd5cf-ntbll"
Mar 13 00:34:34.636402 kubelet[2798]: E0313 00:34:34.636369 2798 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9cbf65b2f9228b790c9666fc3c84a3d27c13f81b6c6b4a826fc2363e4ed873b3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-cccfbd5cf-ntbll"
Mar 13 00:34:34.636454 kubelet[2798]: E0313 00:34:34.636410 2798 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-cccfbd5cf-ntbll_calico-system(8927d3f8-4e49-4a05-8332-7df32999988a)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-cccfbd5cf-ntbll_calico-system(8927d3f8-4e49-4a05-8332-7df32999988a)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"9cbf65b2f9228b790c9666fc3c84a3d27c13f81b6c6b4a826fc2363e4ed873b3\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-cccfbd5cf-ntbll" podUID="8927d3f8-4e49-4a05-8332-7df32999988a"
Mar 13 00:34:34.637151 containerd[1635]: time="2026-03-13T00:34:34.637125793Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7449d668ff-b29lf,Uid:1c6e5827-ca2b-4238-8d67-bf66b92bd43f,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"331de107c46689976562d1d423dea02d1cda469d2c8092ad32f44ebe6b09795c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 13 00:34:34.637268 kubelet[2798]: E0313 00:34:34.637238 2798 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"331de107c46689976562d1d423dea02d1cda469d2c8092ad32f44ebe6b09795c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 13 00:34:34.637293 kubelet[2798]: E0313 00:34:34.637280 2798 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"331de107c46689976562d1d423dea02d1cda469d2c8092ad32f44ebe6b09795c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-apiserver-7449d668ff-b29lf"
Mar 13 00:34:34.637313 kubelet[2798]: E0313 00:34:34.637290 2798 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"331de107c46689976562d1d423dea02d1cda469d2c8092ad32f44ebe6b09795c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-apiserver-7449d668ff-b29lf"
Mar 13 00:34:34.637490 kubelet[2798]: E0313 00:34:34.637331 2798 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-7449d668ff-b29lf_calico-system(1c6e5827-ca2b-4238-8d67-bf66b92bd43f)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-7449d668ff-b29lf_calico-system(1c6e5827-ca2b-4238-8d67-bf66b92bd43f)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"331de107c46689976562d1d423dea02d1cda469d2c8092ad32f44ebe6b09795c\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-apiserver-7449d668ff-b29lf" podUID="1c6e5827-ca2b-4238-8d67-bf66b92bd43f"
Mar 13 00:34:34.662965 systemd[1]: Created slice kubepods-besteffort-pod7e7f9c17_2933_4641_bd19_7c38cba62524.slice - libcontainer container kubepods-besteffort-pod7e7f9c17_2933_4641_bd19_7c38cba62524.slice.
Mar 13 00:34:34.666220 containerd[1635]: time="2026-03-13T00:34:34.666193521Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-tp7j8,Uid:7e7f9c17-2933-4641-bd19-7c38cba62524,Namespace:calico-system,Attempt:0,}" Mar 13 00:34:34.702288 containerd[1635]: time="2026-03-13T00:34:34.702238887Z" level=error msg="Failed to destroy network for sandbox \"0ec4e7127dcd326476920cf638b8542ba829719ffbed0162f87def19bf510ab5\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 13 00:34:34.703753 containerd[1635]: time="2026-03-13T00:34:34.703724567Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-tp7j8,Uid:7e7f9c17-2933-4641-bd19-7c38cba62524,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"0ec4e7127dcd326476920cf638b8542ba829719ffbed0162f87def19bf510ab5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 13 00:34:34.703957 kubelet[2798]: E0313 00:34:34.703926 2798 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0ec4e7127dcd326476920cf638b8542ba829719ffbed0162f87def19bf510ab5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 13 00:34:34.704021 kubelet[2798]: E0313 00:34:34.703965 2798 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0ec4e7127dcd326476920cf638b8542ba829719ffbed0162f87def19bf510ab5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file 
or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-tp7j8" Mar 13 00:34:34.704021 kubelet[2798]: E0313 00:34:34.703979 2798 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0ec4e7127dcd326476920cf638b8542ba829719ffbed0162f87def19bf510ab5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-tp7j8" Mar 13 00:34:34.704060 kubelet[2798]: E0313 00:34:34.704031 2798 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-tp7j8_calico-system(7e7f9c17-2933-4641-bd19-7c38cba62524)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-tp7j8_calico-system(7e7f9c17-2933-4641-bd19-7c38cba62524)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"0ec4e7127dcd326476920cf638b8542ba829719ffbed0162f87def19bf510ab5\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-tp7j8" podUID="7e7f9c17-2933-4641-bd19-7c38cba62524" Mar 13 00:34:34.810730 containerd[1635]: time="2026-03-13T00:34:34.810626350Z" level=info msg="CreateContainer within sandbox \"766dae36db74ba8f78bbe7a9ff332a9d193bb4c9fb87f2ef07d2884219cd44a0\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Mar 13 00:34:34.820768 containerd[1635]: time="2026-03-13T00:34:34.820731867Z" level=info msg="Container 612aed2999f96dcb257940a0a5140fd09760fe649ca58642583a248c3e188439: CDI devices from CRI Config.CDIDevices: []" Mar 13 00:34:34.829861 containerd[1635]: time="2026-03-13T00:34:34.829782001Z" level=info 
msg="CreateContainer within sandbox \"766dae36db74ba8f78bbe7a9ff332a9d193bb4c9fb87f2ef07d2884219cd44a0\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"612aed2999f96dcb257940a0a5140fd09760fe649ca58642583a248c3e188439\"" Mar 13 00:34:34.830859 containerd[1635]: time="2026-03-13T00:34:34.830810364Z" level=info msg="StartContainer for \"612aed2999f96dcb257940a0a5140fd09760fe649ca58642583a248c3e188439\"" Mar 13 00:34:34.832925 containerd[1635]: time="2026-03-13T00:34:34.832906511Z" level=info msg="connecting to shim 612aed2999f96dcb257940a0a5140fd09760fe649ca58642583a248c3e188439" address="unix:///run/containerd/s/f05b3b24552b3ea6e1a1869fe42443b1ec9669e4f832715ccaba488ac63e4a77" protocol=ttrpc version=3 Mar 13 00:34:34.850747 systemd[1]: Started cri-containerd-612aed2999f96dcb257940a0a5140fd09760fe649ca58642583a248c3e188439.scope - libcontainer container 612aed2999f96dcb257940a0a5140fd09760fe649ca58642583a248c3e188439. Mar 13 00:34:34.923831 containerd[1635]: time="2026-03-13T00:34:34.923741795Z" level=info msg="StartContainer for \"612aed2999f96dcb257940a0a5140fd09760fe649ca58642583a248c3e188439\" returns successfully" Mar 13 00:34:35.119126 kubelet[2798]: I0313 00:34:35.119081 2798 reconciler_common.go:163] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k2rpr\" (UniqueName: \"kubernetes.io/projected/28a7fd80-dee6-42b2-af21-3af7c0ab8ba1-kube-api-access-k2rpr\") pod \"28a7fd80-dee6-42b2-af21-3af7c0ab8ba1\" (UID: \"28a7fd80-dee6-42b2-af21-3af7c0ab8ba1\") " Mar 13 00:34:35.119126 kubelet[2798]: I0313 00:34:35.119132 2798 reconciler_common.go:163] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/28a7fd80-dee6-42b2-af21-3af7c0ab8ba1-whisker-backend-key-pair\") pod \"28a7fd80-dee6-42b2-af21-3af7c0ab8ba1\" (UID: \"28a7fd80-dee6-42b2-af21-3af7c0ab8ba1\") " Mar 13 00:34:35.119506 kubelet[2798]: I0313 00:34:35.119147 2798 reconciler_common.go:163] 
"operationExecutor.UnmountVolume started for volume \"nginx-config\" (UniqueName: \"kubernetes.io/configmap/28a7fd80-dee6-42b2-af21-3af7c0ab8ba1-nginx-config\") pod \"28a7fd80-dee6-42b2-af21-3af7c0ab8ba1\" (UID: \"28a7fd80-dee6-42b2-af21-3af7c0ab8ba1\") " Mar 13 00:34:35.119506 kubelet[2798]: I0313 00:34:35.119159 2798 reconciler_common.go:163] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/28a7fd80-dee6-42b2-af21-3af7c0ab8ba1-whisker-ca-bundle\") pod \"28a7fd80-dee6-42b2-af21-3af7c0ab8ba1\" (UID: \"28a7fd80-dee6-42b2-af21-3af7c0ab8ba1\") " Mar 13 00:34:35.119547 kubelet[2798]: I0313 00:34:35.119521 2798 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/28a7fd80-dee6-42b2-af21-3af7c0ab8ba1-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "28a7fd80-dee6-42b2-af21-3af7c0ab8ba1" (UID: "28a7fd80-dee6-42b2-af21-3af7c0ab8ba1"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 13 00:34:35.120564 kubelet[2798]: I0313 00:34:35.120538 2798 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/28a7fd80-dee6-42b2-af21-3af7c0ab8ba1-nginx-config" (OuterVolumeSpecName: "nginx-config") pod "28a7fd80-dee6-42b2-af21-3af7c0ab8ba1" (UID: "28a7fd80-dee6-42b2-af21-3af7c0ab8ba1"). InnerVolumeSpecName "nginx-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 13 00:34:35.123507 kubelet[2798]: I0313 00:34:35.123409 2798 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28a7fd80-dee6-42b2-af21-3af7c0ab8ba1-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "28a7fd80-dee6-42b2-af21-3af7c0ab8ba1" (UID: "28a7fd80-dee6-42b2-af21-3af7c0ab8ba1"). InnerVolumeSpecName "whisker-backend-key-pair". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 13 00:34:35.123987 kubelet[2798]: I0313 00:34:35.123947 2798 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28a7fd80-dee6-42b2-af21-3af7c0ab8ba1-kube-api-access-k2rpr" (OuterVolumeSpecName: "kube-api-access-k2rpr") pod "28a7fd80-dee6-42b2-af21-3af7c0ab8ba1" (UID: "28a7fd80-dee6-42b2-af21-3af7c0ab8ba1"). InnerVolumeSpecName "kube-api-access-k2rpr". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 13 00:34:35.220181 kubelet[2798]: I0313 00:34:35.219983 2798 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-k2rpr\" (UniqueName: \"kubernetes.io/projected/28a7fd80-dee6-42b2-af21-3af7c0ab8ba1-kube-api-access-k2rpr\") on node \"ci-4459-2-4-n-bcba60e63d\" DevicePath \"\"" Mar 13 00:34:35.220181 kubelet[2798]: I0313 00:34:35.220028 2798 reconciler_common.go:299] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/28a7fd80-dee6-42b2-af21-3af7c0ab8ba1-whisker-backend-key-pair\") on node \"ci-4459-2-4-n-bcba60e63d\" DevicePath \"\"" Mar 13 00:34:35.220181 kubelet[2798]: I0313 00:34:35.220047 2798 reconciler_common.go:299] "Volume detached for volume \"nginx-config\" (UniqueName: \"kubernetes.io/configmap/28a7fd80-dee6-42b2-af21-3af7c0ab8ba1-nginx-config\") on node \"ci-4459-2-4-n-bcba60e63d\" DevicePath \"\"" Mar 13 00:34:35.220181 kubelet[2798]: I0313 00:34:35.220061 2798 reconciler_common.go:299] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/28a7fd80-dee6-42b2-af21-3af7c0ab8ba1-whisker-ca-bundle\") on node \"ci-4459-2-4-n-bcba60e63d\" DevicePath \"\"" Mar 13 00:34:35.467526 systemd[1]: run-netns-cni\x2dd6aa6f4a\x2d9c03\x2df046\x2d2947\x2da9ee0feab963.mount: Deactivated successfully. Mar 13 00:34:35.467866 systemd[1]: run-netns-cni\x2d1512b32b\x2d7bc9\x2df3bf\x2db268\x2db927e4a9ee97.mount: Deactivated successfully. 
Mar 13 00:34:35.468087 systemd[1]: run-netns-cni\x2d1bc8346a\x2db9b5\x2de96a\x2dcaed\x2d25ccb8709728.mount: Deactivated successfully. Mar 13 00:34:35.468261 systemd[1]: run-netns-cni\x2d93d223b9\x2df0a9\x2d5e23\x2d4065\x2da3bade83fd44.mount: Deactivated successfully. Mar 13 00:34:35.468397 systemd[1]: var-lib-kubelet-pods-28a7fd80\x2ddee6\x2d42b2\x2daf21\x2d3af7c0ab8ba1-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dk2rpr.mount: Deactivated successfully. Mar 13 00:34:35.468537 systemd[1]: var-lib-kubelet-pods-28a7fd80\x2ddee6\x2d42b2\x2daf21\x2d3af7c0ab8ba1-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. Mar 13 00:34:35.671831 systemd[1]: Removed slice kubepods-besteffort-pod28a7fd80_dee6_42b2_af21_3af7c0ab8ba1.slice - libcontainer container kubepods-besteffort-pod28a7fd80_dee6_42b2_af21_3af7c0ab8ba1.slice. Mar 13 00:34:35.835064 kubelet[2798]: I0313 00:34:35.835015 2798 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-27f76" podStartSLOduration=2.809953519 podStartE2EDuration="18.835001247s" podCreationTimestamp="2026-03-13 00:34:17 +0000 UTC" firstStartedPulling="2026-03-13 00:34:17.424626237 +0000 UTC m=+17.867881396" lastFinishedPulling="2026-03-13 00:34:33.449673975 +0000 UTC m=+33.892929124" observedRunningTime="2026-03-13 00:34:35.832485281 +0000 UTC m=+36.275740450" watchObservedRunningTime="2026-03-13 00:34:35.835001247 +0000 UTC m=+36.278256406" Mar 13 00:34:35.904436 systemd[1]: Created slice kubepods-besteffort-pod6853ee52_c7d4_46c5_b886_398448fa1d63.slice - libcontainer container kubepods-besteffort-pod6853ee52_c7d4_46c5_b886_398448fa1d63.slice. 
Mar 13 00:34:36.037948 kubelet[2798]: I0313 00:34:36.037810 2798 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6853ee52-c7d4-46c5-b886-398448fa1d63-whisker-ca-bundle\") pod \"whisker-688ff9598c-2chfm\" (UID: \"6853ee52-c7d4-46c5-b886-398448fa1d63\") " pod="calico-system/whisker-688ff9598c-2chfm" Mar 13 00:34:36.037948 kubelet[2798]: I0313 00:34:36.037868 2798 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-config\" (UniqueName: \"kubernetes.io/configmap/6853ee52-c7d4-46c5-b886-398448fa1d63-nginx-config\") pod \"whisker-688ff9598c-2chfm\" (UID: \"6853ee52-c7d4-46c5-b886-398448fa1d63\") " pod="calico-system/whisker-688ff9598c-2chfm" Mar 13 00:34:36.037948 kubelet[2798]: I0313 00:34:36.037889 2798 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/6853ee52-c7d4-46c5-b886-398448fa1d63-whisker-backend-key-pair\") pod \"whisker-688ff9598c-2chfm\" (UID: \"6853ee52-c7d4-46c5-b886-398448fa1d63\") " pod="calico-system/whisker-688ff9598c-2chfm" Mar 13 00:34:36.037948 kubelet[2798]: I0313 00:34:36.037913 2798 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v6s8p\" (UniqueName: \"kubernetes.io/projected/6853ee52-c7d4-46c5-b886-398448fa1d63-kube-api-access-v6s8p\") pod \"whisker-688ff9598c-2chfm\" (UID: \"6853ee52-c7d4-46c5-b886-398448fa1d63\") " pod="calico-system/whisker-688ff9598c-2chfm" Mar 13 00:34:36.209940 containerd[1635]: time="2026-03-13T00:34:36.209899936Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-688ff9598c-2chfm,Uid:6853ee52-c7d4-46c5-b886-398448fa1d63,Namespace:calico-system,Attempt:0,}" Mar 13 00:34:36.357489 systemd-networkd[1490]: cali11ad491df1d: Link UP Mar 13 00:34:36.359902 systemd-networkd[1490]: 
cali11ad491df1d: Gained carrier Mar 13 00:34:36.393113 containerd[1635]: 2026-03-13 00:34:36.247 [ERROR][4018] cni-plugin/utils.go 116: File does not exist, skipping the error since RequireMTUFile is false error=open /var/lib/calico/mtu: no such file or directory filename="/var/lib/calico/mtu" Mar 13 00:34:36.393113 containerd[1635]: 2026-03-13 00:34:36.277 [INFO][4018] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--4--n--bcba60e63d-k8s-whisker--688ff9598c--2chfm-eth0 whisker-688ff9598c- calico-system 6853ee52-c7d4-46c5-b886-398448fa1d63 882 0 2026-03-13 00:34:35 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:688ff9598c projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s ci-4459-2-4-n-bcba60e63d whisker-688ff9598c-2chfm eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] cali11ad491df1d [] [] }} ContainerID="41e1dceefbfc89de57bb72379a8b59c9012f30f69b8a0292f609fe5dee6ae4e9" Namespace="calico-system" Pod="whisker-688ff9598c-2chfm" WorkloadEndpoint="ci--4459--2--4--n--bcba60e63d-k8s-whisker--688ff9598c--2chfm-" Mar 13 00:34:36.393113 containerd[1635]: 2026-03-13 00:34:36.277 [INFO][4018] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="41e1dceefbfc89de57bb72379a8b59c9012f30f69b8a0292f609fe5dee6ae4e9" Namespace="calico-system" Pod="whisker-688ff9598c-2chfm" WorkloadEndpoint="ci--4459--2--4--n--bcba60e63d-k8s-whisker--688ff9598c--2chfm-eth0" Mar 13 00:34:36.393113 containerd[1635]: 2026-03-13 00:34:36.309 [INFO][4064] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="41e1dceefbfc89de57bb72379a8b59c9012f30f69b8a0292f609fe5dee6ae4e9" HandleID="k8s-pod-network.41e1dceefbfc89de57bb72379a8b59c9012f30f69b8a0292f609fe5dee6ae4e9" Workload="ci--4459--2--4--n--bcba60e63d-k8s-whisker--688ff9598c--2chfm-eth0" Mar 13 
00:34:36.393820 containerd[1635]: 2026-03-13 00:34:36.314 [INFO][4064] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="41e1dceefbfc89de57bb72379a8b59c9012f30f69b8a0292f609fe5dee6ae4e9" HandleID="k8s-pod-network.41e1dceefbfc89de57bb72379a8b59c9012f30f69b8a0292f609fe5dee6ae4e9" Workload="ci--4459--2--4--n--bcba60e63d-k8s-whisker--688ff9598c--2chfm-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002fb480), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459-2-4-n-bcba60e63d", "pod":"whisker-688ff9598c-2chfm", "timestamp":"2026-03-13 00:34:36.309188157 +0000 UTC"}, Hostname:"ci-4459-2-4-n-bcba60e63d", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0xc00028edc0)} Mar 13 00:34:36.393820 containerd[1635]: 2026-03-13 00:34:36.314 [INFO][4064] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 13 00:34:36.393820 containerd[1635]: 2026-03-13 00:34:36.314 [INFO][4064] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Mar 13 00:34:36.393820 containerd[1635]: 2026-03-13 00:34:36.314 [INFO][4064] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-4-n-bcba60e63d' Mar 13 00:34:36.393820 containerd[1635]: 2026-03-13 00:34:36.317 [INFO][4064] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.41e1dceefbfc89de57bb72379a8b59c9012f30f69b8a0292f609fe5dee6ae4e9" host="ci-4459-2-4-n-bcba60e63d" Mar 13 00:34:36.393820 containerd[1635]: 2026-03-13 00:34:36.322 [INFO][4064] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4459-2-4-n-bcba60e63d" Mar 13 00:34:36.393820 containerd[1635]: 2026-03-13 00:34:36.326 [INFO][4064] ipam/ipam.go 526: Trying affinity for 192.168.112.64/26 host="ci-4459-2-4-n-bcba60e63d" Mar 13 00:34:36.393820 containerd[1635]: 2026-03-13 00:34:36.327 [INFO][4064] ipam/ipam.go 160: Attempting to load block cidr=192.168.112.64/26 host="ci-4459-2-4-n-bcba60e63d" Mar 13 00:34:36.393820 containerd[1635]: 2026-03-13 00:34:36.329 [INFO][4064] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.112.64/26 host="ci-4459-2-4-n-bcba60e63d" Mar 13 00:34:36.393975 containerd[1635]: 2026-03-13 00:34:36.329 [INFO][4064] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.112.64/26 handle="k8s-pod-network.41e1dceefbfc89de57bb72379a8b59c9012f30f69b8a0292f609fe5dee6ae4e9" host="ci-4459-2-4-n-bcba60e63d" Mar 13 00:34:36.393975 containerd[1635]: 2026-03-13 00:34:36.331 [INFO][4064] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.41e1dceefbfc89de57bb72379a8b59c9012f30f69b8a0292f609fe5dee6ae4e9 Mar 13 00:34:36.393975 containerd[1635]: 2026-03-13 00:34:36.334 [INFO][4064] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.112.64/26 handle="k8s-pod-network.41e1dceefbfc89de57bb72379a8b59c9012f30f69b8a0292f609fe5dee6ae4e9" host="ci-4459-2-4-n-bcba60e63d" Mar 13 00:34:36.393975 containerd[1635]: 2026-03-13 00:34:36.339 [INFO][4064] ipam/ipam.go 1288: 
Successfully claimed IPs: [192.168.112.65/26] block=192.168.112.64/26 handle="k8s-pod-network.41e1dceefbfc89de57bb72379a8b59c9012f30f69b8a0292f609fe5dee6ae4e9" host="ci-4459-2-4-n-bcba60e63d" Mar 13 00:34:36.393975 containerd[1635]: 2026-03-13 00:34:36.339 [INFO][4064] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.112.65/26] handle="k8s-pod-network.41e1dceefbfc89de57bb72379a8b59c9012f30f69b8a0292f609fe5dee6ae4e9" host="ci-4459-2-4-n-bcba60e63d" Mar 13 00:34:36.393975 containerd[1635]: 2026-03-13 00:34:36.340 [INFO][4064] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 13 00:34:36.393975 containerd[1635]: 2026-03-13 00:34:36.340 [INFO][4064] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.112.65/26] IPv6=[] ContainerID="41e1dceefbfc89de57bb72379a8b59c9012f30f69b8a0292f609fe5dee6ae4e9" HandleID="k8s-pod-network.41e1dceefbfc89de57bb72379a8b59c9012f30f69b8a0292f609fe5dee6ae4e9" Workload="ci--4459--2--4--n--bcba60e63d-k8s-whisker--688ff9598c--2chfm-eth0" Mar 13 00:34:36.394080 containerd[1635]: 2026-03-13 00:34:36.346 [INFO][4018] cni-plugin/k8s.go 418: Populated endpoint ContainerID="41e1dceefbfc89de57bb72379a8b59c9012f30f69b8a0292f609fe5dee6ae4e9" Namespace="calico-system" Pod="whisker-688ff9598c-2chfm" WorkloadEndpoint="ci--4459--2--4--n--bcba60e63d-k8s-whisker--688ff9598c--2chfm-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--n--bcba60e63d-k8s-whisker--688ff9598c--2chfm-eth0", GenerateName:"whisker-688ff9598c-", Namespace:"calico-system", SelfLink:"", UID:"6853ee52-c7d4-46c5-b886-398448fa1d63", ResourceVersion:"882", Generation:0, CreationTimestamp:time.Date(2026, time.March, 13, 0, 34, 35, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"688ff9598c", 
"projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-n-bcba60e63d", ContainerID:"", Pod:"whisker-688ff9598c-2chfm", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.112.65/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali11ad491df1d", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 13 00:34:36.394080 containerd[1635]: 2026-03-13 00:34:36.346 [INFO][4018] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.112.65/32] ContainerID="41e1dceefbfc89de57bb72379a8b59c9012f30f69b8a0292f609fe5dee6ae4e9" Namespace="calico-system" Pod="whisker-688ff9598c-2chfm" WorkloadEndpoint="ci--4459--2--4--n--bcba60e63d-k8s-whisker--688ff9598c--2chfm-eth0" Mar 13 00:34:36.394133 containerd[1635]: 2026-03-13 00:34:36.346 [INFO][4018] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali11ad491df1d ContainerID="41e1dceefbfc89de57bb72379a8b59c9012f30f69b8a0292f609fe5dee6ae4e9" Namespace="calico-system" Pod="whisker-688ff9598c-2chfm" WorkloadEndpoint="ci--4459--2--4--n--bcba60e63d-k8s-whisker--688ff9598c--2chfm-eth0" Mar 13 00:34:36.394133 containerd[1635]: 2026-03-13 00:34:36.359 [INFO][4018] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="41e1dceefbfc89de57bb72379a8b59c9012f30f69b8a0292f609fe5dee6ae4e9" Namespace="calico-system" Pod="whisker-688ff9598c-2chfm" WorkloadEndpoint="ci--4459--2--4--n--bcba60e63d-k8s-whisker--688ff9598c--2chfm-eth0" Mar 13 00:34:36.394167 containerd[1635]: 2026-03-13 00:34:36.371 [INFO][4018] 
cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="41e1dceefbfc89de57bb72379a8b59c9012f30f69b8a0292f609fe5dee6ae4e9" Namespace="calico-system" Pod="whisker-688ff9598c-2chfm" WorkloadEndpoint="ci--4459--2--4--n--bcba60e63d-k8s-whisker--688ff9598c--2chfm-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--n--bcba60e63d-k8s-whisker--688ff9598c--2chfm-eth0", GenerateName:"whisker-688ff9598c-", Namespace:"calico-system", SelfLink:"", UID:"6853ee52-c7d4-46c5-b886-398448fa1d63", ResourceVersion:"882", Generation:0, CreationTimestamp:time.Date(2026, time.March, 13, 0, 34, 35, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"688ff9598c", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-n-bcba60e63d", ContainerID:"41e1dceefbfc89de57bb72379a8b59c9012f30f69b8a0292f609fe5dee6ae4e9", Pod:"whisker-688ff9598c-2chfm", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.112.65/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali11ad491df1d", MAC:"ce:cb:4b:97:c9:31", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 13 00:34:36.394207 containerd[1635]: 2026-03-13 00:34:36.388 [INFO][4018] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore 
ContainerID="41e1dceefbfc89de57bb72379a8b59c9012f30f69b8a0292f609fe5dee6ae4e9" Namespace="calico-system" Pod="whisker-688ff9598c-2chfm" WorkloadEndpoint="ci--4459--2--4--n--bcba60e63d-k8s-whisker--688ff9598c--2chfm-eth0" Mar 13 00:34:36.439026 containerd[1635]: time="2026-03-13T00:34:36.438942022Z" level=info msg="connecting to shim 41e1dceefbfc89de57bb72379a8b59c9012f30f69b8a0292f609fe5dee6ae4e9" address="unix:///run/containerd/s/0679278fbe5451db475c7cfceacd23ac793903fb34d3529ea012ef0db767490a" namespace=k8s.io protocol=ttrpc version=3 Mar 13 00:34:36.474781 systemd[1]: Started cri-containerd-41e1dceefbfc89de57bb72379a8b59c9012f30f69b8a0292f609fe5dee6ae4e9.scope - libcontainer container 41e1dceefbfc89de57bb72379a8b59c9012f30f69b8a0292f609fe5dee6ae4e9. Mar 13 00:34:36.530144 containerd[1635]: time="2026-03-13T00:34:36.530116936Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-688ff9598c-2chfm,Uid:6853ee52-c7d4-46c5-b886-398448fa1d63,Namespace:calico-system,Attempt:0,} returns sandbox id \"41e1dceefbfc89de57bb72379a8b59c9012f30f69b8a0292f609fe5dee6ae4e9\"" Mar 13 00:34:36.533905 containerd[1635]: time="2026-03-13T00:34:36.533828416Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.31.4\"" Mar 13 00:34:36.927513 systemd-networkd[1490]: vxlan.calico: Link UP Mar 13 00:34:36.927520 systemd-networkd[1490]: vxlan.calico: Gained carrier Mar 13 00:34:37.661522 kubelet[2798]: I0313 00:34:37.661467 2798 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="28a7fd80-dee6-42b2-af21-3af7c0ab8ba1" path="/var/lib/kubelet/pods/28a7fd80-dee6-42b2-af21-3af7c0ab8ba1/volumes" Mar 13 00:34:38.221327 systemd-networkd[1490]: cali11ad491df1d: Gained IPv6LL Mar 13 00:34:38.401830 containerd[1635]: time="2026-03-13T00:34:38.401784848Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:34:38.402946 containerd[1635]: 
time="2026-03-13T00:34:38.402830413Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.31.4: active requests=0, bytes read=6039889" Mar 13 00:34:38.403772 containerd[1635]: time="2026-03-13T00:34:38.403750739Z" level=info msg="ImageCreate event name:\"sha256:c02b0051502f3aa7f0815d838ea93b53dfb6bd13f185d229260e08200daf7cf7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:34:38.405853 containerd[1635]: time="2026-03-13T00:34:38.405831809Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker@sha256:9690cd395efad501f2e0c40ce4969d87b736ae2e5ed454644e7b0fd8f756bfbc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:34:38.406326 containerd[1635]: time="2026-03-13T00:34:38.406307186Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker:v3.31.4\" with image id \"sha256:c02b0051502f3aa7f0815d838ea93b53dfb6bd13f185d229260e08200daf7cf7\", repo tag \"ghcr.io/flatcar/calico/whisker:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/whisker@sha256:9690cd395efad501f2e0c40ce4969d87b736ae2e5ed454644e7b0fd8f756bfbc\", size \"7595926\" in 1.87245758s" Mar 13 00:34:38.406438 containerd[1635]: time="2026-03-13T00:34:38.406372596Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.31.4\" returns image reference \"sha256:c02b0051502f3aa7f0815d838ea93b53dfb6bd13f185d229260e08200daf7cf7\"" Mar 13 00:34:38.410041 containerd[1635]: time="2026-03-13T00:34:38.410019169Z" level=info msg="CreateContainer within sandbox \"41e1dceefbfc89de57bb72379a8b59c9012f30f69b8a0292f609fe5dee6ae4e9\" for container &ContainerMetadata{Name:whisker,Attempt:0,}" Mar 13 00:34:38.418292 containerd[1635]: time="2026-03-13T00:34:38.417824992Z" level=info msg="Container 8233a587451c84cdd2ddb123008c727cf74b7fd7199a299ba9d39e04ee9af8fa: CDI devices from CRI Config.CDIDevices: []" Mar 13 00:34:38.421484 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3664684141.mount: Deactivated successfully. 
Mar 13 00:34:38.434424 containerd[1635]: time="2026-03-13T00:34:38.434397714Z" level=info msg="CreateContainer within sandbox \"41e1dceefbfc89de57bb72379a8b59c9012f30f69b8a0292f609fe5dee6ae4e9\" for &ContainerMetadata{Name:whisker,Attempt:0,} returns container id \"8233a587451c84cdd2ddb123008c727cf74b7fd7199a299ba9d39e04ee9af8fa\"" Mar 13 00:34:38.434834 containerd[1635]: time="2026-03-13T00:34:38.434818462Z" level=info msg="StartContainer for \"8233a587451c84cdd2ddb123008c727cf74b7fd7199a299ba9d39e04ee9af8fa\"" Mar 13 00:34:38.436192 containerd[1635]: time="2026-03-13T00:34:38.436176466Z" level=info msg="connecting to shim 8233a587451c84cdd2ddb123008c727cf74b7fd7199a299ba9d39e04ee9af8fa" address="unix:///run/containerd/s/0679278fbe5451db475c7cfceacd23ac793903fb34d3529ea012ef0db767490a" protocol=ttrpc version=3 Mar 13 00:34:38.459766 systemd[1]: Started cri-containerd-8233a587451c84cdd2ddb123008c727cf74b7fd7199a299ba9d39e04ee9af8fa.scope - libcontainer container 8233a587451c84cdd2ddb123008c727cf74b7fd7199a299ba9d39e04ee9af8fa. Mar 13 00:34:38.501130 containerd[1635]: time="2026-03-13T00:34:38.501038880Z" level=info msg="StartContainer for \"8233a587451c84cdd2ddb123008c727cf74b7fd7199a299ba9d39e04ee9af8fa\" returns successfully" Mar 13 00:34:38.503762 containerd[1635]: time="2026-03-13T00:34:38.503742467Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\"" Mar 13 00:34:38.860995 systemd-networkd[1490]: vxlan.calico: Gained IPv6LL Mar 13 00:34:40.884084 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2057578990.mount: Deactivated successfully. 
Mar 13 00:34:40.900386 containerd[1635]: time="2026-03-13T00:34:40.900348456Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:34:40.901710 containerd[1635]: time="2026-03-13T00:34:40.901685501Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.31.4: active requests=0, bytes read=17609475" Mar 13 00:34:40.902997 containerd[1635]: time="2026-03-13T00:34:40.902964765Z" level=info msg="ImageCreate event name:\"sha256:0749e3da0398e8402eb119f09acf145e5dd9759adb6eb3802ad6dc1b9bbedf1c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:34:40.904720 containerd[1635]: time="2026-03-13T00:34:40.904688938Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend@sha256:d252061aa298c4b17cf092517b5126af97cf95e0f56b21281b95a5f8702f15fc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:34:40.905079 containerd[1635]: time="2026-03-13T00:34:40.905057037Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\" with image id \"sha256:0749e3da0398e8402eb119f09acf145e5dd9759adb6eb3802ad6dc1b9bbedf1c\", repo tag \"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/whisker-backend@sha256:d252061aa298c4b17cf092517b5126af97cf95e0f56b21281b95a5f8702f15fc\", size \"17609305\" in 2.401196471s" Mar 13 00:34:40.905121 containerd[1635]: time="2026-03-13T00:34:40.905080486Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\" returns image reference \"sha256:0749e3da0398e8402eb119f09acf145e5dd9759adb6eb3802ad6dc1b9bbedf1c\"" Mar 13 00:34:40.908862 containerd[1635]: time="2026-03-13T00:34:40.908838472Z" level=info msg="CreateContainer within sandbox \"41e1dceefbfc89de57bb72379a8b59c9012f30f69b8a0292f609fe5dee6ae4e9\" for container &ContainerMetadata{Name:whisker-backend,Attempt:0,}" Mar 13 00:34:40.917316 
containerd[1635]: time="2026-03-13T00:34:40.916817259Z" level=info msg="Container d0afafe519dec89f3acf1dc20f156f3e384d188d2005c8186d4b57da1b14cd2f: CDI devices from CRI Config.CDIDevices: []" Mar 13 00:34:40.920549 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1060832033.mount: Deactivated successfully. Mar 13 00:34:40.934414 containerd[1635]: time="2026-03-13T00:34:40.934382316Z" level=info msg="CreateContainer within sandbox \"41e1dceefbfc89de57bb72379a8b59c9012f30f69b8a0292f609fe5dee6ae4e9\" for &ContainerMetadata{Name:whisker-backend,Attempt:0,} returns container id \"d0afafe519dec89f3acf1dc20f156f3e384d188d2005c8186d4b57da1b14cd2f\"" Mar 13 00:34:40.935604 containerd[1635]: time="2026-03-13T00:34:40.934831024Z" level=info msg="StartContainer for \"d0afafe519dec89f3acf1dc20f156f3e384d188d2005c8186d4b57da1b14cd2f\"" Mar 13 00:34:40.935604 containerd[1635]: time="2026-03-13T00:34:40.935521212Z" level=info msg="connecting to shim d0afafe519dec89f3acf1dc20f156f3e384d188d2005c8186d4b57da1b14cd2f" address="unix:///run/containerd/s/0679278fbe5451db475c7cfceacd23ac793903fb34d3529ea012ef0db767490a" protocol=ttrpc version=3 Mar 13 00:34:40.953766 systemd[1]: Started cri-containerd-d0afafe519dec89f3acf1dc20f156f3e384d188d2005c8186d4b57da1b14cd2f.scope - libcontainer container d0afafe519dec89f3acf1dc20f156f3e384d188d2005c8186d4b57da1b14cd2f. 
Mar 13 00:34:40.996189 containerd[1635]: time="2026-03-13T00:34:40.996163813Z" level=info msg="StartContainer for \"d0afafe519dec89f3acf1dc20f156f3e384d188d2005c8186d4b57da1b14cd2f\" returns successfully" Mar 13 00:34:41.824924 kubelet[2798]: I0313 00:34:41.824847 2798 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/whisker-688ff9598c-2chfm" podStartSLOduration=2.452822919 podStartE2EDuration="6.824829346s" podCreationTimestamp="2026-03-13 00:34:35 +0000 UTC" firstStartedPulling="2026-03-13 00:34:36.533625427 +0000 UTC m=+36.976880586" lastFinishedPulling="2026-03-13 00:34:40.905631864 +0000 UTC m=+41.348887013" observedRunningTime="2026-03-13 00:34:41.82399067 +0000 UTC m=+42.267245869" watchObservedRunningTime="2026-03-13 00:34:41.824829346 +0000 UTC m=+42.268084525" Mar 13 00:34:45.662288 containerd[1635]: time="2026-03-13T00:34:45.662207254Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-tp7j8,Uid:7e7f9c17-2933-4641-bd19-7c38cba62524,Namespace:calico-system,Attempt:0,}" Mar 13 00:34:45.665129 containerd[1635]: time="2026-03-13T00:34:45.664756566Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-t7h9z,Uid:da12e426-2ad2-4ffc-9b11-b6543119413f,Namespace:kube-system,Attempt:0,}" Mar 13 00:34:45.790250 systemd-networkd[1490]: calibd8c0cbbb2e: Link UP Mar 13 00:34:45.790439 systemd-networkd[1490]: calibd8c0cbbb2e: Gained carrier Mar 13 00:34:45.805288 containerd[1635]: 2026-03-13 00:34:45.728 [INFO][4371] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--4--n--bcba60e63d-k8s-coredns--66bc5c9577--t7h9z-eth0 coredns-66bc5c9577- kube-system da12e426-2ad2-4ffc-9b11-b6543119413f 826 0 2026-03-13 00:34:07 +0000 UTC map[k8s-app:kube-dns pod-template-hash:66bc5c9577 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s 
ci-4459-2-4-n-bcba60e63d coredns-66bc5c9577-t7h9z eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] calibd8c0cbbb2e [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 } {liveness-probe TCP 8080 0 } {readiness-probe TCP 8181 0 }] [] }} ContainerID="78ce650b22b76404d898fc1fd87d0f5f157574f96a7bb15687d851bff25698fa" Namespace="kube-system" Pod="coredns-66bc5c9577-t7h9z" WorkloadEndpoint="ci--4459--2--4--n--bcba60e63d-k8s-coredns--66bc5c9577--t7h9z-" Mar 13 00:34:45.805288 containerd[1635]: 2026-03-13 00:34:45.728 [INFO][4371] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="78ce650b22b76404d898fc1fd87d0f5f157574f96a7bb15687d851bff25698fa" Namespace="kube-system" Pod="coredns-66bc5c9577-t7h9z" WorkloadEndpoint="ci--4459--2--4--n--bcba60e63d-k8s-coredns--66bc5c9577--t7h9z-eth0" Mar 13 00:34:45.805288 containerd[1635]: 2026-03-13 00:34:45.756 [INFO][4391] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="78ce650b22b76404d898fc1fd87d0f5f157574f96a7bb15687d851bff25698fa" HandleID="k8s-pod-network.78ce650b22b76404d898fc1fd87d0f5f157574f96a7bb15687d851bff25698fa" Workload="ci--4459--2--4--n--bcba60e63d-k8s-coredns--66bc5c9577--t7h9z-eth0" Mar 13 00:34:45.805445 containerd[1635]: 2026-03-13 00:34:45.761 [INFO][4391] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="78ce650b22b76404d898fc1fd87d0f5f157574f96a7bb15687d851bff25698fa" HandleID="k8s-pod-network.78ce650b22b76404d898fc1fd87d0f5f157574f96a7bb15687d851bff25698fa" Workload="ci--4459--2--4--n--bcba60e63d-k8s-coredns--66bc5c9577--t7h9z-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002fd7a0), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4459-2-4-n-bcba60e63d", "pod":"coredns-66bc5c9577-t7h9z", "timestamp":"2026-03-13 00:34:45.756011706 +0000 UTC"}, Hostname:"ci-4459-2-4-n-bcba60e63d", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, 
HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0xc0003858c0)} Mar 13 00:34:45.805445 containerd[1635]: 2026-03-13 00:34:45.761 [INFO][4391] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 13 00:34:45.805445 containerd[1635]: 2026-03-13 00:34:45.761 [INFO][4391] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 13 00:34:45.805445 containerd[1635]: 2026-03-13 00:34:45.761 [INFO][4391] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-4-n-bcba60e63d' Mar 13 00:34:45.805445 containerd[1635]: 2026-03-13 00:34:45.763 [INFO][4391] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.78ce650b22b76404d898fc1fd87d0f5f157574f96a7bb15687d851bff25698fa" host="ci-4459-2-4-n-bcba60e63d" Mar 13 00:34:45.805445 containerd[1635]: 2026-03-13 00:34:45.766 [INFO][4391] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4459-2-4-n-bcba60e63d" Mar 13 00:34:45.805445 containerd[1635]: 2026-03-13 00:34:45.768 [INFO][4391] ipam/ipam.go 526: Trying affinity for 192.168.112.64/26 host="ci-4459-2-4-n-bcba60e63d" Mar 13 00:34:45.805445 containerd[1635]: 2026-03-13 00:34:45.769 [INFO][4391] ipam/ipam.go 160: Attempting to load block cidr=192.168.112.64/26 host="ci-4459-2-4-n-bcba60e63d" Mar 13 00:34:45.805445 containerd[1635]: 2026-03-13 00:34:45.771 [INFO][4391] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.112.64/26 host="ci-4459-2-4-n-bcba60e63d" Mar 13 00:34:45.805682 containerd[1635]: 2026-03-13 00:34:45.771 [INFO][4391] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.112.64/26 handle="k8s-pod-network.78ce650b22b76404d898fc1fd87d0f5f157574f96a7bb15687d851bff25698fa" host="ci-4459-2-4-n-bcba60e63d" Mar 13 00:34:45.805682 containerd[1635]: 2026-03-13 00:34:45.772 [INFO][4391] ipam/ipam.go 1806: Creating new handle: 
k8s-pod-network.78ce650b22b76404d898fc1fd87d0f5f157574f96a7bb15687d851bff25698fa Mar 13 00:34:45.805682 containerd[1635]: 2026-03-13 00:34:45.777 [INFO][4391] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.112.64/26 handle="k8s-pod-network.78ce650b22b76404d898fc1fd87d0f5f157574f96a7bb15687d851bff25698fa" host="ci-4459-2-4-n-bcba60e63d" Mar 13 00:34:45.805682 containerd[1635]: 2026-03-13 00:34:45.781 [INFO][4391] ipam/ipam.go 1288: Successfully claimed IPs: [192.168.112.66/26] block=192.168.112.64/26 handle="k8s-pod-network.78ce650b22b76404d898fc1fd87d0f5f157574f96a7bb15687d851bff25698fa" host="ci-4459-2-4-n-bcba60e63d" Mar 13 00:34:45.805682 containerd[1635]: 2026-03-13 00:34:45.781 [INFO][4391] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.112.66/26] handle="k8s-pod-network.78ce650b22b76404d898fc1fd87d0f5f157574f96a7bb15687d851bff25698fa" host="ci-4459-2-4-n-bcba60e63d" Mar 13 00:34:45.805682 containerd[1635]: 2026-03-13 00:34:45.781 [INFO][4391] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. 
Mar 13 00:34:45.805682 containerd[1635]: 2026-03-13 00:34:45.781 [INFO][4391] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.112.66/26] IPv6=[] ContainerID="78ce650b22b76404d898fc1fd87d0f5f157574f96a7bb15687d851bff25698fa" HandleID="k8s-pod-network.78ce650b22b76404d898fc1fd87d0f5f157574f96a7bb15687d851bff25698fa" Workload="ci--4459--2--4--n--bcba60e63d-k8s-coredns--66bc5c9577--t7h9z-eth0" Mar 13 00:34:45.805804 containerd[1635]: 2026-03-13 00:34:45.783 [INFO][4371] cni-plugin/k8s.go 418: Populated endpoint ContainerID="78ce650b22b76404d898fc1fd87d0f5f157574f96a7bb15687d851bff25698fa" Namespace="kube-system" Pod="coredns-66bc5c9577-t7h9z" WorkloadEndpoint="ci--4459--2--4--n--bcba60e63d-k8s-coredns--66bc5c9577--t7h9z-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--n--bcba60e63d-k8s-coredns--66bc5c9577--t7h9z-eth0", GenerateName:"coredns-66bc5c9577-", Namespace:"kube-system", SelfLink:"", UID:"da12e426-2ad2-4ffc-9b11-b6543119413f", ResourceVersion:"826", Generation:0, CreationTimestamp:time.Date(2026, time.March, 13, 0, 34, 7, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"66bc5c9577", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-n-bcba60e63d", ContainerID:"", Pod:"coredns-66bc5c9577-t7h9z", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.112.66/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, 
InterfaceName:"calibd8c0cbbb2e", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 13 00:34:45.805804 containerd[1635]: 2026-03-13 00:34:45.783 [INFO][4371] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.112.66/32] ContainerID="78ce650b22b76404d898fc1fd87d0f5f157574f96a7bb15687d851bff25698fa" Namespace="kube-system" Pod="coredns-66bc5c9577-t7h9z" WorkloadEndpoint="ci--4459--2--4--n--bcba60e63d-k8s-coredns--66bc5c9577--t7h9z-eth0" Mar 13 00:34:45.805804 containerd[1635]: 2026-03-13 00:34:45.783 [INFO][4371] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calibd8c0cbbb2e ContainerID="78ce650b22b76404d898fc1fd87d0f5f157574f96a7bb15687d851bff25698fa" Namespace="kube-system" Pod="coredns-66bc5c9577-t7h9z" WorkloadEndpoint="ci--4459--2--4--n--bcba60e63d-k8s-coredns--66bc5c9577--t7h9z-eth0" Mar 13 00:34:45.805804 containerd[1635]: 2026-03-13 00:34:45.791 [INFO][4371] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="78ce650b22b76404d898fc1fd87d0f5f157574f96a7bb15687d851bff25698fa" Namespace="kube-system" Pod="coredns-66bc5c9577-t7h9z" WorkloadEndpoint="ci--4459--2--4--n--bcba60e63d-k8s-coredns--66bc5c9577--t7h9z-eth0" Mar 13 
00:34:45.805804 containerd[1635]: 2026-03-13 00:34:45.791 [INFO][4371] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="78ce650b22b76404d898fc1fd87d0f5f157574f96a7bb15687d851bff25698fa" Namespace="kube-system" Pod="coredns-66bc5c9577-t7h9z" WorkloadEndpoint="ci--4459--2--4--n--bcba60e63d-k8s-coredns--66bc5c9577--t7h9z-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--n--bcba60e63d-k8s-coredns--66bc5c9577--t7h9z-eth0", GenerateName:"coredns-66bc5c9577-", Namespace:"kube-system", SelfLink:"", UID:"da12e426-2ad2-4ffc-9b11-b6543119413f", ResourceVersion:"826", Generation:0, CreationTimestamp:time.Date(2026, time.March, 13, 0, 34, 7, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"66bc5c9577", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-n-bcba60e63d", ContainerID:"78ce650b22b76404d898fc1fd87d0f5f157574f96a7bb15687d851bff25698fa", Pod:"coredns-66bc5c9577-t7h9z", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.112.66/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calibd8c0cbbb2e", MAC:"92:39:35:38:92:01", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, 
Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 13 00:34:45.805953 containerd[1635]: 2026-03-13 00:34:45.803 [INFO][4371] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="78ce650b22b76404d898fc1fd87d0f5f157574f96a7bb15687d851bff25698fa" Namespace="kube-system" Pod="coredns-66bc5c9577-t7h9z" WorkloadEndpoint="ci--4459--2--4--n--bcba60e63d-k8s-coredns--66bc5c9577--t7h9z-eth0" Mar 13 00:34:45.829950 containerd[1635]: time="2026-03-13T00:34:45.829913605Z" level=info msg="connecting to shim 78ce650b22b76404d898fc1fd87d0f5f157574f96a7bb15687d851bff25698fa" address="unix:///run/containerd/s/384bb90ef96fadc83d0d8bdefbb695b85287fa1834881f53a7077dd8b58643c9" namespace=k8s.io protocol=ttrpc version=3 Mar 13 00:34:45.856797 systemd[1]: Started cri-containerd-78ce650b22b76404d898fc1fd87d0f5f157574f96a7bb15687d851bff25698fa.scope - libcontainer container 78ce650b22b76404d898fc1fd87d0f5f157574f96a7bb15687d851bff25698fa. 
Mar 13 00:34:45.905008 systemd-networkd[1490]: cali889658e3fc2: Link UP Mar 13 00:34:45.908952 systemd-networkd[1490]: cali889658e3fc2: Gained carrier Mar 13 00:34:45.926454 containerd[1635]: 2026-03-13 00:34:45.736 [INFO][4367] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--4--n--bcba60e63d-k8s-csi--node--driver--tp7j8-eth0 csi-node-driver- calico-system 7e7f9c17-2933-4641-bd19-7c38cba62524 685 0 2026-03-13 00:34:17 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:98cbb5577 k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s ci-4459-2-4-n-bcba60e63d csi-node-driver-tp7j8 eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] cali889658e3fc2 [] [] }} ContainerID="e99672bfe17b352e0d7e5b1cc8a6e68261076888bad212bc0e32738468ab3421" Namespace="calico-system" Pod="csi-node-driver-tp7j8" WorkloadEndpoint="ci--4459--2--4--n--bcba60e63d-k8s-csi--node--driver--tp7j8-" Mar 13 00:34:45.926454 containerd[1635]: 2026-03-13 00:34:45.736 [INFO][4367] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="e99672bfe17b352e0d7e5b1cc8a6e68261076888bad212bc0e32738468ab3421" Namespace="calico-system" Pod="csi-node-driver-tp7j8" WorkloadEndpoint="ci--4459--2--4--n--bcba60e63d-k8s-csi--node--driver--tp7j8-eth0" Mar 13 00:34:45.926454 containerd[1635]: 2026-03-13 00:34:45.758 [INFO][4396] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="e99672bfe17b352e0d7e5b1cc8a6e68261076888bad212bc0e32738468ab3421" HandleID="k8s-pod-network.e99672bfe17b352e0d7e5b1cc8a6e68261076888bad212bc0e32738468ab3421" Workload="ci--4459--2--4--n--bcba60e63d-k8s-csi--node--driver--tp7j8-eth0" Mar 13 00:34:45.926454 containerd[1635]: 2026-03-13 00:34:45.763 [INFO][4396] 
ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="e99672bfe17b352e0d7e5b1cc8a6e68261076888bad212bc0e32738468ab3421" HandleID="k8s-pod-network.e99672bfe17b352e0d7e5b1cc8a6e68261076888bad212bc0e32738468ab3421" Workload="ci--4459--2--4--n--bcba60e63d-k8s-csi--node--driver--tp7j8-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0004c1460), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459-2-4-n-bcba60e63d", "pod":"csi-node-driver-tp7j8", "timestamp":"2026-03-13 00:34:45.75813919 +0000 UTC"}, Hostname:"ci-4459-2-4-n-bcba60e63d", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0xc000214580)} Mar 13 00:34:45.926454 containerd[1635]: 2026-03-13 00:34:45.763 [INFO][4396] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 13 00:34:45.926454 containerd[1635]: 2026-03-13 00:34:45.781 [INFO][4396] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Mar 13 00:34:45.926454 containerd[1635]: 2026-03-13 00:34:45.781 [INFO][4396] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-4-n-bcba60e63d' Mar 13 00:34:45.926454 containerd[1635]: 2026-03-13 00:34:45.864 [INFO][4396] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.e99672bfe17b352e0d7e5b1cc8a6e68261076888bad212bc0e32738468ab3421" host="ci-4459-2-4-n-bcba60e63d" Mar 13 00:34:45.926454 containerd[1635]: 2026-03-13 00:34:45.869 [INFO][4396] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4459-2-4-n-bcba60e63d" Mar 13 00:34:45.926454 containerd[1635]: 2026-03-13 00:34:45.874 [INFO][4396] ipam/ipam.go 526: Trying affinity for 192.168.112.64/26 host="ci-4459-2-4-n-bcba60e63d" Mar 13 00:34:45.926454 containerd[1635]: 2026-03-13 00:34:45.875 [INFO][4396] ipam/ipam.go 160: Attempting to load block cidr=192.168.112.64/26 host="ci-4459-2-4-n-bcba60e63d" Mar 13 00:34:45.926454 containerd[1635]: 2026-03-13 00:34:45.877 [INFO][4396] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.112.64/26 host="ci-4459-2-4-n-bcba60e63d" Mar 13 00:34:45.926454 containerd[1635]: 2026-03-13 00:34:45.877 [INFO][4396] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.112.64/26 handle="k8s-pod-network.e99672bfe17b352e0d7e5b1cc8a6e68261076888bad212bc0e32738468ab3421" host="ci-4459-2-4-n-bcba60e63d" Mar 13 00:34:45.926454 containerd[1635]: 2026-03-13 00:34:45.879 [INFO][4396] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.e99672bfe17b352e0d7e5b1cc8a6e68261076888bad212bc0e32738468ab3421 Mar 13 00:34:45.926454 containerd[1635]: 2026-03-13 00:34:45.883 [INFO][4396] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.112.64/26 handle="k8s-pod-network.e99672bfe17b352e0d7e5b1cc8a6e68261076888bad212bc0e32738468ab3421" host="ci-4459-2-4-n-bcba60e63d" Mar 13 00:34:45.926454 containerd[1635]: 2026-03-13 00:34:45.890 [INFO][4396] ipam/ipam.go 1288: 
Successfully claimed IPs: [192.168.112.67/26] block=192.168.112.64/26 handle="k8s-pod-network.e99672bfe17b352e0d7e5b1cc8a6e68261076888bad212bc0e32738468ab3421" host="ci-4459-2-4-n-bcba60e63d" Mar 13 00:34:45.926454 containerd[1635]: 2026-03-13 00:34:45.890 [INFO][4396] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.112.67/26] handle="k8s-pod-network.e99672bfe17b352e0d7e5b1cc8a6e68261076888bad212bc0e32738468ab3421" host="ci-4459-2-4-n-bcba60e63d" Mar 13 00:34:45.926454 containerd[1635]: 2026-03-13 00:34:45.890 [INFO][4396] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 13 00:34:45.926454 containerd[1635]: 2026-03-13 00:34:45.890 [INFO][4396] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.112.67/26] IPv6=[] ContainerID="e99672bfe17b352e0d7e5b1cc8a6e68261076888bad212bc0e32738468ab3421" HandleID="k8s-pod-network.e99672bfe17b352e0d7e5b1cc8a6e68261076888bad212bc0e32738468ab3421" Workload="ci--4459--2--4--n--bcba60e63d-k8s-csi--node--driver--tp7j8-eth0" Mar 13 00:34:45.927962 containerd[1635]: 2026-03-13 00:34:45.895 [INFO][4367] cni-plugin/k8s.go 418: Populated endpoint ContainerID="e99672bfe17b352e0d7e5b1cc8a6e68261076888bad212bc0e32738468ab3421" Namespace="calico-system" Pod="csi-node-driver-tp7j8" WorkloadEndpoint="ci--4459--2--4--n--bcba60e63d-k8s-csi--node--driver--tp7j8-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--n--bcba60e63d-k8s-csi--node--driver--tp7j8-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"7e7f9c17-2933-4641-bd19-7c38cba62524", ResourceVersion:"685", Generation:0, CreationTimestamp:time.Date(2026, time.March, 13, 0, 34, 17, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"98cbb5577", "k8s-app":"csi-node-driver", 
"name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-n-bcba60e63d", ContainerID:"", Pod:"csi-node-driver-tp7j8", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.112.67/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali889658e3fc2", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 13 00:34:45.927962 containerd[1635]: 2026-03-13 00:34:45.895 [INFO][4367] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.112.67/32] ContainerID="e99672bfe17b352e0d7e5b1cc8a6e68261076888bad212bc0e32738468ab3421" Namespace="calico-system" Pod="csi-node-driver-tp7j8" WorkloadEndpoint="ci--4459--2--4--n--bcba60e63d-k8s-csi--node--driver--tp7j8-eth0" Mar 13 00:34:45.927962 containerd[1635]: 2026-03-13 00:34:45.895 [INFO][4367] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali889658e3fc2 ContainerID="e99672bfe17b352e0d7e5b1cc8a6e68261076888bad212bc0e32738468ab3421" Namespace="calico-system" Pod="csi-node-driver-tp7j8" WorkloadEndpoint="ci--4459--2--4--n--bcba60e63d-k8s-csi--node--driver--tp7j8-eth0" Mar 13 00:34:45.927962 containerd[1635]: 2026-03-13 00:34:45.909 [INFO][4367] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="e99672bfe17b352e0d7e5b1cc8a6e68261076888bad212bc0e32738468ab3421" Namespace="calico-system" Pod="csi-node-driver-tp7j8" WorkloadEndpoint="ci--4459--2--4--n--bcba60e63d-k8s-csi--node--driver--tp7j8-eth0" Mar 13 00:34:45.927962 
containerd[1635]: 2026-03-13 00:34:45.909 [INFO][4367] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="e99672bfe17b352e0d7e5b1cc8a6e68261076888bad212bc0e32738468ab3421" Namespace="calico-system" Pod="csi-node-driver-tp7j8" WorkloadEndpoint="ci--4459--2--4--n--bcba60e63d-k8s-csi--node--driver--tp7j8-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--n--bcba60e63d-k8s-csi--node--driver--tp7j8-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"7e7f9c17-2933-4641-bd19-7c38cba62524", ResourceVersion:"685", Generation:0, CreationTimestamp:time.Date(2026, time.March, 13, 0, 34, 17, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"98cbb5577", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-n-bcba60e63d", ContainerID:"e99672bfe17b352e0d7e5b1cc8a6e68261076888bad212bc0e32738468ab3421", Pod:"csi-node-driver-tp7j8", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.112.67/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali889658e3fc2", MAC:"c2:66:70:15:a2:e9", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 13 00:34:45.927962 containerd[1635]: 
2026-03-13 00:34:45.921 [INFO][4367] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="e99672bfe17b352e0d7e5b1cc8a6e68261076888bad212bc0e32738468ab3421" Namespace="calico-system" Pod="csi-node-driver-tp7j8" WorkloadEndpoint="ci--4459--2--4--n--bcba60e63d-k8s-csi--node--driver--tp7j8-eth0" Mar 13 00:34:45.929784 containerd[1635]: time="2026-03-13T00:34:45.929566031Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-t7h9z,Uid:da12e426-2ad2-4ffc-9b11-b6543119413f,Namespace:kube-system,Attempt:0,} returns sandbox id \"78ce650b22b76404d898fc1fd87d0f5f157574f96a7bb15687d851bff25698fa\"" Mar 13 00:34:45.934694 containerd[1635]: time="2026-03-13T00:34:45.934675306Z" level=info msg="CreateContainer within sandbox \"78ce650b22b76404d898fc1fd87d0f5f157574f96a7bb15687d851bff25698fa\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Mar 13 00:34:45.951878 containerd[1635]: time="2026-03-13T00:34:45.951856727Z" level=info msg="Container c675c38d6d26fce333e14574fe21a6f5b7c0e79e01df1fea807c30e739a7f041: CDI devices from CRI Config.CDIDevices: []" Mar 13 00:34:45.952964 containerd[1635]: time="2026-03-13T00:34:45.952921973Z" level=info msg="connecting to shim e99672bfe17b352e0d7e5b1cc8a6e68261076888bad212bc0e32738468ab3421" address="unix:///run/containerd/s/2142c7a87df133f846df359d248f27134e511a2c6ce1d6973f2801241e7f7ff8" namespace=k8s.io protocol=ttrpc version=3 Mar 13 00:34:45.958572 containerd[1635]: time="2026-03-13T00:34:45.958548408Z" level=info msg="CreateContainer within sandbox \"78ce650b22b76404d898fc1fd87d0f5f157574f96a7bb15687d851bff25698fa\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"c675c38d6d26fce333e14574fe21a6f5b7c0e79e01df1fea807c30e739a7f041\"" Mar 13 00:34:45.960486 containerd[1635]: time="2026-03-13T00:34:45.959728234Z" level=info msg="StartContainer for \"c675c38d6d26fce333e14574fe21a6f5b7c0e79e01df1fea807c30e739a7f041\"" Mar 13 00:34:45.960486 containerd[1635]: 
time="2026-03-13T00:34:45.960311152Z" level=info msg="connecting to shim c675c38d6d26fce333e14574fe21a6f5b7c0e79e01df1fea807c30e739a7f041" address="unix:///run/containerd/s/384bb90ef96fadc83d0d8bdefbb695b85287fa1834881f53a7077dd8b58643c9" protocol=ttrpc version=3 Mar 13 00:34:45.986767 systemd[1]: Started cri-containerd-c675c38d6d26fce333e14574fe21a6f5b7c0e79e01df1fea807c30e739a7f041.scope - libcontainer container c675c38d6d26fce333e14574fe21a6f5b7c0e79e01df1fea807c30e739a7f041. Mar 13 00:34:45.993767 systemd[1]: Started cri-containerd-e99672bfe17b352e0d7e5b1cc8a6e68261076888bad212bc0e32738468ab3421.scope - libcontainer container e99672bfe17b352e0d7e5b1cc8a6e68261076888bad212bc0e32738468ab3421. Mar 13 00:34:46.025102 containerd[1635]: time="2026-03-13T00:34:46.024911673Z" level=info msg="StartContainer for \"c675c38d6d26fce333e14574fe21a6f5b7c0e79e01df1fea807c30e739a7f041\" returns successfully" Mar 13 00:34:46.027633 containerd[1635]: time="2026-03-13T00:34:46.027167167Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-tp7j8,Uid:7e7f9c17-2933-4641-bd19-7c38cba62524,Namespace:calico-system,Attempt:0,} returns sandbox id \"e99672bfe17b352e0d7e5b1cc8a6e68261076888bad212bc0e32738468ab3421\"" Mar 13 00:34:46.029674 containerd[1635]: time="2026-03-13T00:34:46.029633870Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.31.4\"" Mar 13 00:34:46.661537 containerd[1635]: time="2026-03-13T00:34:46.661468226Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-6b485f8c87-fd8vp,Uid:74e2b49f-1624-4b27-9f4b-1bb46fbfa4d3,Namespace:calico-system,Attempt:0,}" Mar 13 00:34:46.664887 containerd[1635]: time="2026-03-13T00:34:46.664089359Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7449d668ff-qzpl5,Uid:1c3c5981-ba4c-4160-bb56-967a52352e9f,Namespace:calico-system,Attempt:0,}" Mar 13 00:34:46.665476 containerd[1635]: time="2026-03-13T00:34:46.665125676Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:goldmane-cccfbd5cf-ntbll,Uid:8927d3f8-4e49-4a05-8332-7df32999988a,Namespace:calico-system,Attempt:0,}" Mar 13 00:34:46.828432 systemd-networkd[1490]: cali8b3a2e760bf: Link UP Mar 13 00:34:46.829748 systemd-networkd[1490]: cali8b3a2e760bf: Gained carrier Mar 13 00:34:46.850324 containerd[1635]: 2026-03-13 00:34:46.749 [INFO][4568] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--4--n--bcba60e63d-k8s-calico--apiserver--7449d668ff--qzpl5-eth0 calico-apiserver-7449d668ff- calico-system 1c3c5981-ba4c-4160-bb56-967a52352e9f 823 0 2026-03-13 00:34:16 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:7449d668ff projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4459-2-4-n-bcba60e63d calico-apiserver-7449d668ff-qzpl5 eth0 calico-apiserver [] [] [kns.calico-system ksa.calico-system.calico-apiserver] cali8b3a2e760bf [] [] }} ContainerID="60caa4d20d2849e075a76513648c8fce8e01c68b206c88e52a5575453db60275" Namespace="calico-system" Pod="calico-apiserver-7449d668ff-qzpl5" WorkloadEndpoint="ci--4459--2--4--n--bcba60e63d-k8s-calico--apiserver--7449d668ff--qzpl5-" Mar 13 00:34:46.850324 containerd[1635]: 2026-03-13 00:34:46.750 [INFO][4568] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="60caa4d20d2849e075a76513648c8fce8e01c68b206c88e52a5575453db60275" Namespace="calico-system" Pod="calico-apiserver-7449d668ff-qzpl5" WorkloadEndpoint="ci--4459--2--4--n--bcba60e63d-k8s-calico--apiserver--7449d668ff--qzpl5-eth0" Mar 13 00:34:46.850324 containerd[1635]: 2026-03-13 00:34:46.787 [INFO][4606] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="60caa4d20d2849e075a76513648c8fce8e01c68b206c88e52a5575453db60275" 
HandleID="k8s-pod-network.60caa4d20d2849e075a76513648c8fce8e01c68b206c88e52a5575453db60275" Workload="ci--4459--2--4--n--bcba60e63d-k8s-calico--apiserver--7449d668ff--qzpl5-eth0" Mar 13 00:34:46.850324 containerd[1635]: 2026-03-13 00:34:46.798 [INFO][4606] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="60caa4d20d2849e075a76513648c8fce8e01c68b206c88e52a5575453db60275" HandleID="k8s-pod-network.60caa4d20d2849e075a76513648c8fce8e01c68b206c88e52a5575453db60275" Workload="ci--4459--2--4--n--bcba60e63d-k8s-calico--apiserver--7449d668ff--qzpl5-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002ef770), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459-2-4-n-bcba60e63d", "pod":"calico-apiserver-7449d668ff-qzpl5", "timestamp":"2026-03-13 00:34:46.787773781 +0000 UTC"}, Hostname:"ci-4459-2-4-n-bcba60e63d", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0xc0003fcf20)} Mar 13 00:34:46.850324 containerd[1635]: 2026-03-13 00:34:46.799 [INFO][4606] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 13 00:34:46.850324 containerd[1635]: 2026-03-13 00:34:46.799 [INFO][4606] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Mar 13 00:34:46.850324 containerd[1635]: 2026-03-13 00:34:46.799 [INFO][4606] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-4-n-bcba60e63d' Mar 13 00:34:46.850324 containerd[1635]: 2026-03-13 00:34:46.803 [INFO][4606] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.60caa4d20d2849e075a76513648c8fce8e01c68b206c88e52a5575453db60275" host="ci-4459-2-4-n-bcba60e63d" Mar 13 00:34:46.850324 containerd[1635]: 2026-03-13 00:34:46.807 [INFO][4606] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4459-2-4-n-bcba60e63d" Mar 13 00:34:46.850324 containerd[1635]: 2026-03-13 00:34:46.810 [INFO][4606] ipam/ipam.go 526: Trying affinity for 192.168.112.64/26 host="ci-4459-2-4-n-bcba60e63d" Mar 13 00:34:46.850324 containerd[1635]: 2026-03-13 00:34:46.811 [INFO][4606] ipam/ipam.go 160: Attempting to load block cidr=192.168.112.64/26 host="ci-4459-2-4-n-bcba60e63d" Mar 13 00:34:46.850324 containerd[1635]: 2026-03-13 00:34:46.812 [INFO][4606] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.112.64/26 host="ci-4459-2-4-n-bcba60e63d" Mar 13 00:34:46.850324 containerd[1635]: 2026-03-13 00:34:46.812 [INFO][4606] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.112.64/26 handle="k8s-pod-network.60caa4d20d2849e075a76513648c8fce8e01c68b206c88e52a5575453db60275" host="ci-4459-2-4-n-bcba60e63d" Mar 13 00:34:46.850324 containerd[1635]: 2026-03-13 00:34:46.814 [INFO][4606] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.60caa4d20d2849e075a76513648c8fce8e01c68b206c88e52a5575453db60275 Mar 13 00:34:46.850324 containerd[1635]: 2026-03-13 00:34:46.817 [INFO][4606] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.112.64/26 handle="k8s-pod-network.60caa4d20d2849e075a76513648c8fce8e01c68b206c88e52a5575453db60275" host="ci-4459-2-4-n-bcba60e63d" Mar 13 00:34:46.850324 containerd[1635]: 2026-03-13 00:34:46.822 [INFO][4606] ipam/ipam.go 1288: 
Successfully claimed IPs: [192.168.112.68/26] block=192.168.112.64/26 handle="k8s-pod-network.60caa4d20d2849e075a76513648c8fce8e01c68b206c88e52a5575453db60275" host="ci-4459-2-4-n-bcba60e63d" Mar 13 00:34:46.850324 containerd[1635]: 2026-03-13 00:34:46.822 [INFO][4606] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.112.68/26] handle="k8s-pod-network.60caa4d20d2849e075a76513648c8fce8e01c68b206c88e52a5575453db60275" host="ci-4459-2-4-n-bcba60e63d" Mar 13 00:34:46.850324 containerd[1635]: 2026-03-13 00:34:46.822 [INFO][4606] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 13 00:34:46.850324 containerd[1635]: 2026-03-13 00:34:46.822 [INFO][4606] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.112.68/26] IPv6=[] ContainerID="60caa4d20d2849e075a76513648c8fce8e01c68b206c88e52a5575453db60275" HandleID="k8s-pod-network.60caa4d20d2849e075a76513648c8fce8e01c68b206c88e52a5575453db60275" Workload="ci--4459--2--4--n--bcba60e63d-k8s-calico--apiserver--7449d668ff--qzpl5-eth0" Mar 13 00:34:46.851572 containerd[1635]: 2026-03-13 00:34:46.825 [INFO][4568] cni-plugin/k8s.go 418: Populated endpoint ContainerID="60caa4d20d2849e075a76513648c8fce8e01c68b206c88e52a5575453db60275" Namespace="calico-system" Pod="calico-apiserver-7449d668ff-qzpl5" WorkloadEndpoint="ci--4459--2--4--n--bcba60e63d-k8s-calico--apiserver--7449d668ff--qzpl5-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--n--bcba60e63d-k8s-calico--apiserver--7449d668ff--qzpl5-eth0", GenerateName:"calico-apiserver-7449d668ff-", Namespace:"calico-system", SelfLink:"", UID:"1c3c5981-ba4c-4160-bb56-967a52352e9f", ResourceVersion:"823", Generation:0, CreationTimestamp:time.Date(2026, time.March, 13, 0, 34, 16, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", 
"k8s-app":"calico-apiserver", "pod-template-hash":"7449d668ff", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-n-bcba60e63d", ContainerID:"", Pod:"calico-apiserver-7449d668ff-qzpl5", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.112.68/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"cali8b3a2e760bf", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 13 00:34:46.851572 containerd[1635]: 2026-03-13 00:34:46.825 [INFO][4568] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.112.68/32] ContainerID="60caa4d20d2849e075a76513648c8fce8e01c68b206c88e52a5575453db60275" Namespace="calico-system" Pod="calico-apiserver-7449d668ff-qzpl5" WorkloadEndpoint="ci--4459--2--4--n--bcba60e63d-k8s-calico--apiserver--7449d668ff--qzpl5-eth0" Mar 13 00:34:46.851572 containerd[1635]: 2026-03-13 00:34:46.825 [INFO][4568] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali8b3a2e760bf ContainerID="60caa4d20d2849e075a76513648c8fce8e01c68b206c88e52a5575453db60275" Namespace="calico-system" Pod="calico-apiserver-7449d668ff-qzpl5" WorkloadEndpoint="ci--4459--2--4--n--bcba60e63d-k8s-calico--apiserver--7449d668ff--qzpl5-eth0" Mar 13 00:34:46.851572 containerd[1635]: 2026-03-13 00:34:46.831 [INFO][4568] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="60caa4d20d2849e075a76513648c8fce8e01c68b206c88e52a5575453db60275" Namespace="calico-system" Pod="calico-apiserver-7449d668ff-qzpl5" 
WorkloadEndpoint="ci--4459--2--4--n--bcba60e63d-k8s-calico--apiserver--7449d668ff--qzpl5-eth0" Mar 13 00:34:46.851572 containerd[1635]: 2026-03-13 00:34:46.831 [INFO][4568] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="60caa4d20d2849e075a76513648c8fce8e01c68b206c88e52a5575453db60275" Namespace="calico-system" Pod="calico-apiserver-7449d668ff-qzpl5" WorkloadEndpoint="ci--4459--2--4--n--bcba60e63d-k8s-calico--apiserver--7449d668ff--qzpl5-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--n--bcba60e63d-k8s-calico--apiserver--7449d668ff--qzpl5-eth0", GenerateName:"calico-apiserver-7449d668ff-", Namespace:"calico-system", SelfLink:"", UID:"1c3c5981-ba4c-4160-bb56-967a52352e9f", ResourceVersion:"823", Generation:0, CreationTimestamp:time.Date(2026, time.March, 13, 0, 34, 16, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"7449d668ff", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-n-bcba60e63d", ContainerID:"60caa4d20d2849e075a76513648c8fce8e01c68b206c88e52a5575453db60275", Pod:"calico-apiserver-7449d668ff-qzpl5", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.112.68/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"cali8b3a2e760bf", MAC:"46:a7:26:25:00:65", 
Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 13 00:34:46.851572 containerd[1635]: 2026-03-13 00:34:46.843 [INFO][4568] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="60caa4d20d2849e075a76513648c8fce8e01c68b206c88e52a5575453db60275" Namespace="calico-system" Pod="calico-apiserver-7449d668ff-qzpl5" WorkloadEndpoint="ci--4459--2--4--n--bcba60e63d-k8s-calico--apiserver--7449d668ff--qzpl5-eth0" Mar 13 00:34:46.865741 kubelet[2798]: I0313 00:34:46.865035 2798 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-66bc5c9577-t7h9z" podStartSLOduration=39.865021036 podStartE2EDuration="39.865021036s" podCreationTimestamp="2026-03-13 00:34:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 00:34:46.864772036 +0000 UTC m=+47.308027205" watchObservedRunningTime="2026-03-13 00:34:46.865021036 +0000 UTC m=+47.308276195" Mar 13 00:34:46.893734 containerd[1635]: time="2026-03-13T00:34:46.893086542Z" level=info msg="connecting to shim 60caa4d20d2849e075a76513648c8fce8e01c68b206c88e52a5575453db60275" address="unix:///run/containerd/s/829d2e90c091172c9f14f1bc7f6f308b753cad8fdccd9d6fe6f2f0f71f8e94fb" namespace=k8s.io protocol=ttrpc version=3 Mar 13 00:34:46.927069 systemd[1]: Started cri-containerd-60caa4d20d2849e075a76513648c8fce8e01c68b206c88e52a5575453db60275.scope - libcontainer container 60caa4d20d2849e075a76513648c8fce8e01c68b206c88e52a5575453db60275. 
Mar 13 00:34:46.951274 systemd-networkd[1490]: calid0d40f4e472: Link UP Mar 13 00:34:46.952223 systemd-networkd[1490]: calid0d40f4e472: Gained carrier Mar 13 00:34:46.970842 containerd[1635]: 2026-03-13 00:34:46.753 [INFO][4569] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--4--n--bcba60e63d-k8s-calico--kube--controllers--6b485f8c87--fd8vp-eth0 calico-kube-controllers-6b485f8c87- calico-system 74e2b49f-1624-4b27-9f4b-1bb46fbfa4d3 821 0 2026-03-13 00:34:17 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:6b485f8c87 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ci-4459-2-4-n-bcba60e63d calico-kube-controllers-6b485f8c87-fd8vp eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] calid0d40f4e472 [] [] }} ContainerID="713090deb5c88e34e9353a5227f5a41c6aef6f1eddaded10fc7a3f648b0219be" Namespace="calico-system" Pod="calico-kube-controllers-6b485f8c87-fd8vp" WorkloadEndpoint="ci--4459--2--4--n--bcba60e63d-k8s-calico--kube--controllers--6b485f8c87--fd8vp-" Mar 13 00:34:46.970842 containerd[1635]: 2026-03-13 00:34:46.753 [INFO][4569] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="713090deb5c88e34e9353a5227f5a41c6aef6f1eddaded10fc7a3f648b0219be" Namespace="calico-system" Pod="calico-kube-controllers-6b485f8c87-fd8vp" WorkloadEndpoint="ci--4459--2--4--n--bcba60e63d-k8s-calico--kube--controllers--6b485f8c87--fd8vp-eth0" Mar 13 00:34:46.970842 containerd[1635]: 2026-03-13 00:34:46.790 [INFO][4615] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="713090deb5c88e34e9353a5227f5a41c6aef6f1eddaded10fc7a3f648b0219be" HandleID="k8s-pod-network.713090deb5c88e34e9353a5227f5a41c6aef6f1eddaded10fc7a3f648b0219be" 
Workload="ci--4459--2--4--n--bcba60e63d-k8s-calico--kube--controllers--6b485f8c87--fd8vp-eth0" Mar 13 00:34:46.970842 containerd[1635]: 2026-03-13 00:34:46.799 [INFO][4615] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="713090deb5c88e34e9353a5227f5a41c6aef6f1eddaded10fc7a3f648b0219be" HandleID="k8s-pod-network.713090deb5c88e34e9353a5227f5a41c6aef6f1eddaded10fc7a3f648b0219be" Workload="ci--4459--2--4--n--bcba60e63d-k8s-calico--kube--controllers--6b485f8c87--fd8vp-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002fd860), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459-2-4-n-bcba60e63d", "pod":"calico-kube-controllers-6b485f8c87-fd8vp", "timestamp":"2026-03-13 00:34:46.790809572 +0000 UTC"}, Hostname:"ci-4459-2-4-n-bcba60e63d", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0xc0000cba20)} Mar 13 00:34:46.970842 containerd[1635]: 2026-03-13 00:34:46.799 [INFO][4615] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 13 00:34:46.970842 containerd[1635]: 2026-03-13 00:34:46.822 [INFO][4615] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Mar 13 00:34:46.970842 containerd[1635]: 2026-03-13 00:34:46.822 [INFO][4615] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-4-n-bcba60e63d' Mar 13 00:34:46.970842 containerd[1635]: 2026-03-13 00:34:46.906 [INFO][4615] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.713090deb5c88e34e9353a5227f5a41c6aef6f1eddaded10fc7a3f648b0219be" host="ci-4459-2-4-n-bcba60e63d" Mar 13 00:34:46.970842 containerd[1635]: 2026-03-13 00:34:46.915 [INFO][4615] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4459-2-4-n-bcba60e63d" Mar 13 00:34:46.970842 containerd[1635]: 2026-03-13 00:34:46.919 [INFO][4615] ipam/ipam.go 526: Trying affinity for 192.168.112.64/26 host="ci-4459-2-4-n-bcba60e63d" Mar 13 00:34:46.970842 containerd[1635]: 2026-03-13 00:34:46.921 [INFO][4615] ipam/ipam.go 160: Attempting to load block cidr=192.168.112.64/26 host="ci-4459-2-4-n-bcba60e63d" Mar 13 00:34:46.970842 containerd[1635]: 2026-03-13 00:34:46.923 [INFO][4615] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.112.64/26 host="ci-4459-2-4-n-bcba60e63d" Mar 13 00:34:46.970842 containerd[1635]: 2026-03-13 00:34:46.923 [INFO][4615] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.112.64/26 handle="k8s-pod-network.713090deb5c88e34e9353a5227f5a41c6aef6f1eddaded10fc7a3f648b0219be" host="ci-4459-2-4-n-bcba60e63d" Mar 13 00:34:46.970842 containerd[1635]: 2026-03-13 00:34:46.926 [INFO][4615] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.713090deb5c88e34e9353a5227f5a41c6aef6f1eddaded10fc7a3f648b0219be Mar 13 00:34:46.970842 containerd[1635]: 2026-03-13 00:34:46.933 [INFO][4615] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.112.64/26 handle="k8s-pod-network.713090deb5c88e34e9353a5227f5a41c6aef6f1eddaded10fc7a3f648b0219be" host="ci-4459-2-4-n-bcba60e63d" Mar 13 00:34:46.970842 containerd[1635]: 2026-03-13 00:34:46.939 [INFO][4615] ipam/ipam.go 1288: 
Successfully claimed IPs: [192.168.112.69/26] block=192.168.112.64/26 handle="k8s-pod-network.713090deb5c88e34e9353a5227f5a41c6aef6f1eddaded10fc7a3f648b0219be" host="ci-4459-2-4-n-bcba60e63d" Mar 13 00:34:46.970842 containerd[1635]: 2026-03-13 00:34:46.939 [INFO][4615] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.112.69/26] handle="k8s-pod-network.713090deb5c88e34e9353a5227f5a41c6aef6f1eddaded10fc7a3f648b0219be" host="ci-4459-2-4-n-bcba60e63d" Mar 13 00:34:46.970842 containerd[1635]: 2026-03-13 00:34:46.939 [INFO][4615] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 13 00:34:46.970842 containerd[1635]: 2026-03-13 00:34:46.939 [INFO][4615] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.112.69/26] IPv6=[] ContainerID="713090deb5c88e34e9353a5227f5a41c6aef6f1eddaded10fc7a3f648b0219be" HandleID="k8s-pod-network.713090deb5c88e34e9353a5227f5a41c6aef6f1eddaded10fc7a3f648b0219be" Workload="ci--4459--2--4--n--bcba60e63d-k8s-calico--kube--controllers--6b485f8c87--fd8vp-eth0" Mar 13 00:34:46.971271 containerd[1635]: 2026-03-13 00:34:46.949 [INFO][4569] cni-plugin/k8s.go 418: Populated endpoint ContainerID="713090deb5c88e34e9353a5227f5a41c6aef6f1eddaded10fc7a3f648b0219be" Namespace="calico-system" Pod="calico-kube-controllers-6b485f8c87-fd8vp" WorkloadEndpoint="ci--4459--2--4--n--bcba60e63d-k8s-calico--kube--controllers--6b485f8c87--fd8vp-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--n--bcba60e63d-k8s-calico--kube--controllers--6b485f8c87--fd8vp-eth0", GenerateName:"calico-kube-controllers-6b485f8c87-", Namespace:"calico-system", SelfLink:"", UID:"74e2b49f-1624-4b27-9f4b-1bb46fbfa4d3", ResourceVersion:"821", Generation:0, CreationTimestamp:time.Date(2026, time.March, 13, 0, 34, 17, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), 
Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"6b485f8c87", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-n-bcba60e63d", ContainerID:"", Pod:"calico-kube-controllers-6b485f8c87-fd8vp", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.112.69/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calid0d40f4e472", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 13 00:34:46.971271 containerd[1635]: 2026-03-13 00:34:46.949 [INFO][4569] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.112.69/32] ContainerID="713090deb5c88e34e9353a5227f5a41c6aef6f1eddaded10fc7a3f648b0219be" Namespace="calico-system" Pod="calico-kube-controllers-6b485f8c87-fd8vp" WorkloadEndpoint="ci--4459--2--4--n--bcba60e63d-k8s-calico--kube--controllers--6b485f8c87--fd8vp-eth0" Mar 13 00:34:46.971271 containerd[1635]: 2026-03-13 00:34:46.949 [INFO][4569] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calid0d40f4e472 ContainerID="713090deb5c88e34e9353a5227f5a41c6aef6f1eddaded10fc7a3f648b0219be" Namespace="calico-system" Pod="calico-kube-controllers-6b485f8c87-fd8vp" WorkloadEndpoint="ci--4459--2--4--n--bcba60e63d-k8s-calico--kube--controllers--6b485f8c87--fd8vp-eth0" Mar 13 00:34:46.971271 containerd[1635]: 2026-03-13 00:34:46.960 [INFO][4569] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding 
ContainerID="713090deb5c88e34e9353a5227f5a41c6aef6f1eddaded10fc7a3f648b0219be" Namespace="calico-system" Pod="calico-kube-controllers-6b485f8c87-fd8vp" WorkloadEndpoint="ci--4459--2--4--n--bcba60e63d-k8s-calico--kube--controllers--6b485f8c87--fd8vp-eth0" Mar 13 00:34:46.971271 containerd[1635]: 2026-03-13 00:34:46.961 [INFO][4569] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="713090deb5c88e34e9353a5227f5a41c6aef6f1eddaded10fc7a3f648b0219be" Namespace="calico-system" Pod="calico-kube-controllers-6b485f8c87-fd8vp" WorkloadEndpoint="ci--4459--2--4--n--bcba60e63d-k8s-calico--kube--controllers--6b485f8c87--fd8vp-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--n--bcba60e63d-k8s-calico--kube--controllers--6b485f8c87--fd8vp-eth0", GenerateName:"calico-kube-controllers-6b485f8c87-", Namespace:"calico-system", SelfLink:"", UID:"74e2b49f-1624-4b27-9f4b-1bb46fbfa4d3", ResourceVersion:"821", Generation:0, CreationTimestamp:time.Date(2026, time.March, 13, 0, 34, 17, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"6b485f8c87", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-n-bcba60e63d", ContainerID:"713090deb5c88e34e9353a5227f5a41c6aef6f1eddaded10fc7a3f648b0219be", Pod:"calico-kube-controllers-6b485f8c87-fd8vp", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.112.69/32"}, 
IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calid0d40f4e472", MAC:"26:c6:74:08:46:30", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 13 00:34:46.971271 containerd[1635]: 2026-03-13 00:34:46.968 [INFO][4569] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="713090deb5c88e34e9353a5227f5a41c6aef6f1eddaded10fc7a3f648b0219be" Namespace="calico-system" Pod="calico-kube-controllers-6b485f8c87-fd8vp" WorkloadEndpoint="ci--4459--2--4--n--bcba60e63d-k8s-calico--kube--controllers--6b485f8c87--fd8vp-eth0" Mar 13 00:34:47.000906 containerd[1635]: time="2026-03-13T00:34:47.000869316Z" level=info msg="connecting to shim 713090deb5c88e34e9353a5227f5a41c6aef6f1eddaded10fc7a3f648b0219be" address="unix:///run/containerd/s/be598211b620aeba33ac70ce3b3d19ee8d9d95eb5a1be7d3b8c19fbd6fe7171d" namespace=k8s.io protocol=ttrpc version=3 Mar 13 00:34:47.046757 systemd[1]: Started cri-containerd-713090deb5c88e34e9353a5227f5a41c6aef6f1eddaded10fc7a3f648b0219be.scope - libcontainer container 713090deb5c88e34e9353a5227f5a41c6aef6f1eddaded10fc7a3f648b0219be. 
Mar 13 00:34:47.050419 containerd[1635]: time="2026-03-13T00:34:47.050347824Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7449d668ff-qzpl5,Uid:1c3c5981-ba4c-4160-bb56-967a52352e9f,Namespace:calico-system,Attempt:0,} returns sandbox id \"60caa4d20d2849e075a76513648c8fce8e01c68b206c88e52a5575453db60275\"" Mar 13 00:34:47.061292 systemd-networkd[1490]: calidcdadda2878: Link UP Mar 13 00:34:47.061498 systemd-networkd[1490]: calidcdadda2878: Gained carrier Mar 13 00:34:47.078590 containerd[1635]: 2026-03-13 00:34:46.750 [INFO][4589] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--4--n--bcba60e63d-k8s-goldmane--cccfbd5cf--ntbll-eth0 goldmane-cccfbd5cf- calico-system 8927d3f8-4e49-4a05-8332-7df32999988a 822 0 2026-03-13 00:34:16 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:cccfbd5cf projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s ci-4459-2-4-n-bcba60e63d goldmane-cccfbd5cf-ntbll eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] calidcdadda2878 [] [] }} ContainerID="8c8e18264d17bfffc75b5c1473929b4d4dfc7bde498ce0683e50ce5842548de2" Namespace="calico-system" Pod="goldmane-cccfbd5cf-ntbll" WorkloadEndpoint="ci--4459--2--4--n--bcba60e63d-k8s-goldmane--cccfbd5cf--ntbll-" Mar 13 00:34:47.078590 containerd[1635]: 2026-03-13 00:34:46.750 [INFO][4589] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="8c8e18264d17bfffc75b5c1473929b4d4dfc7bde498ce0683e50ce5842548de2" Namespace="calico-system" Pod="goldmane-cccfbd5cf-ntbll" WorkloadEndpoint="ci--4459--2--4--n--bcba60e63d-k8s-goldmane--cccfbd5cf--ntbll-eth0" Mar 13 00:34:47.078590 containerd[1635]: 2026-03-13 00:34:46.796 [INFO][4607] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 
ContainerID="8c8e18264d17bfffc75b5c1473929b4d4dfc7bde498ce0683e50ce5842548de2" HandleID="k8s-pod-network.8c8e18264d17bfffc75b5c1473929b4d4dfc7bde498ce0683e50ce5842548de2" Workload="ci--4459--2--4--n--bcba60e63d-k8s-goldmane--cccfbd5cf--ntbll-eth0" Mar 13 00:34:47.078590 containerd[1635]: 2026-03-13 00:34:46.804 [INFO][4607] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="8c8e18264d17bfffc75b5c1473929b4d4dfc7bde498ce0683e50ce5842548de2" HandleID="k8s-pod-network.8c8e18264d17bfffc75b5c1473929b4d4dfc7bde498ce0683e50ce5842548de2" Workload="ci--4459--2--4--n--bcba60e63d-k8s-goldmane--cccfbd5cf--ntbll-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000277990), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459-2-4-n-bcba60e63d", "pod":"goldmane-cccfbd5cf-ntbll", "timestamp":"2026-03-13 00:34:46.796300229 +0000 UTC"}, Hostname:"ci-4459-2-4-n-bcba60e63d", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0xc0003371e0)} Mar 13 00:34:47.078590 containerd[1635]: 2026-03-13 00:34:46.804 [INFO][4607] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 13 00:34:47.078590 containerd[1635]: 2026-03-13 00:34:46.939 [INFO][4607] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Mar 13 00:34:47.078590 containerd[1635]: 2026-03-13 00:34:46.939 [INFO][4607] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-4-n-bcba60e63d' Mar 13 00:34:47.078590 containerd[1635]: 2026-03-13 00:34:47.007 [INFO][4607] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.8c8e18264d17bfffc75b5c1473929b4d4dfc7bde498ce0683e50ce5842548de2" host="ci-4459-2-4-n-bcba60e63d" Mar 13 00:34:47.078590 containerd[1635]: 2026-03-13 00:34:47.018 [INFO][4607] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4459-2-4-n-bcba60e63d" Mar 13 00:34:47.078590 containerd[1635]: 2026-03-13 00:34:47.024 [INFO][4607] ipam/ipam.go 526: Trying affinity for 192.168.112.64/26 host="ci-4459-2-4-n-bcba60e63d" Mar 13 00:34:47.078590 containerd[1635]: 2026-03-13 00:34:47.025 [INFO][4607] ipam/ipam.go 160: Attempting to load block cidr=192.168.112.64/26 host="ci-4459-2-4-n-bcba60e63d" Mar 13 00:34:47.078590 containerd[1635]: 2026-03-13 00:34:47.028 [INFO][4607] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.112.64/26 host="ci-4459-2-4-n-bcba60e63d" Mar 13 00:34:47.078590 containerd[1635]: 2026-03-13 00:34:47.028 [INFO][4607] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.112.64/26 handle="k8s-pod-network.8c8e18264d17bfffc75b5c1473929b4d4dfc7bde498ce0683e50ce5842548de2" host="ci-4459-2-4-n-bcba60e63d" Mar 13 00:34:47.078590 containerd[1635]: 2026-03-13 00:34:47.029 [INFO][4607] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.8c8e18264d17bfffc75b5c1473929b4d4dfc7bde498ce0683e50ce5842548de2 Mar 13 00:34:47.078590 containerd[1635]: 2026-03-13 00:34:47.034 [INFO][4607] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.112.64/26 handle="k8s-pod-network.8c8e18264d17bfffc75b5c1473929b4d4dfc7bde498ce0683e50ce5842548de2" host="ci-4459-2-4-n-bcba60e63d" Mar 13 00:34:47.078590 containerd[1635]: 2026-03-13 00:34:47.043 [INFO][4607] ipam/ipam.go 1288: 
Successfully claimed IPs: [192.168.112.70/26] block=192.168.112.64/26 handle="k8s-pod-network.8c8e18264d17bfffc75b5c1473929b4d4dfc7bde498ce0683e50ce5842548de2" host="ci-4459-2-4-n-bcba60e63d" Mar 13 00:34:47.078590 containerd[1635]: 2026-03-13 00:34:47.044 [INFO][4607] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.112.70/26] handle="k8s-pod-network.8c8e18264d17bfffc75b5c1473929b4d4dfc7bde498ce0683e50ce5842548de2" host="ci-4459-2-4-n-bcba60e63d" Mar 13 00:34:47.078590 containerd[1635]: 2026-03-13 00:34:47.044 [INFO][4607] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 13 00:34:47.078590 containerd[1635]: 2026-03-13 00:34:47.044 [INFO][4607] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.112.70/26] IPv6=[] ContainerID="8c8e18264d17bfffc75b5c1473929b4d4dfc7bde498ce0683e50ce5842548de2" HandleID="k8s-pod-network.8c8e18264d17bfffc75b5c1473929b4d4dfc7bde498ce0683e50ce5842548de2" Workload="ci--4459--2--4--n--bcba60e63d-k8s-goldmane--cccfbd5cf--ntbll-eth0" Mar 13 00:34:47.079025 containerd[1635]: 2026-03-13 00:34:47.047 [INFO][4589] cni-plugin/k8s.go 418: Populated endpoint ContainerID="8c8e18264d17bfffc75b5c1473929b4d4dfc7bde498ce0683e50ce5842548de2" Namespace="calico-system" Pod="goldmane-cccfbd5cf-ntbll" WorkloadEndpoint="ci--4459--2--4--n--bcba60e63d-k8s-goldmane--cccfbd5cf--ntbll-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--n--bcba60e63d-k8s-goldmane--cccfbd5cf--ntbll-eth0", GenerateName:"goldmane-cccfbd5cf-", Namespace:"calico-system", SelfLink:"", UID:"8927d3f8-4e49-4a05-8332-7df32999988a", ResourceVersion:"822", Generation:0, CreationTimestamp:time.Date(2026, time.March, 13, 0, 34, 16, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"cccfbd5cf", 
"projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-n-bcba60e63d", ContainerID:"", Pod:"goldmane-cccfbd5cf-ntbll", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.112.70/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"calidcdadda2878", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 13 00:34:47.079025 containerd[1635]: 2026-03-13 00:34:47.047 [INFO][4589] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.112.70/32] ContainerID="8c8e18264d17bfffc75b5c1473929b4d4dfc7bde498ce0683e50ce5842548de2" Namespace="calico-system" Pod="goldmane-cccfbd5cf-ntbll" WorkloadEndpoint="ci--4459--2--4--n--bcba60e63d-k8s-goldmane--cccfbd5cf--ntbll-eth0" Mar 13 00:34:47.079025 containerd[1635]: 2026-03-13 00:34:47.047 [INFO][4589] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calidcdadda2878 ContainerID="8c8e18264d17bfffc75b5c1473929b4d4dfc7bde498ce0683e50ce5842548de2" Namespace="calico-system" Pod="goldmane-cccfbd5cf-ntbll" WorkloadEndpoint="ci--4459--2--4--n--bcba60e63d-k8s-goldmane--cccfbd5cf--ntbll-eth0" Mar 13 00:34:47.079025 containerd[1635]: 2026-03-13 00:34:47.061 [INFO][4589] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="8c8e18264d17bfffc75b5c1473929b4d4dfc7bde498ce0683e50ce5842548de2" Namespace="calico-system" Pod="goldmane-cccfbd5cf-ntbll" WorkloadEndpoint="ci--4459--2--4--n--bcba60e63d-k8s-goldmane--cccfbd5cf--ntbll-eth0" Mar 13 00:34:47.079025 containerd[1635]: 2026-03-13 00:34:47.061 [INFO][4589] 
cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="8c8e18264d17bfffc75b5c1473929b4d4dfc7bde498ce0683e50ce5842548de2" Namespace="calico-system" Pod="goldmane-cccfbd5cf-ntbll" WorkloadEndpoint="ci--4459--2--4--n--bcba60e63d-k8s-goldmane--cccfbd5cf--ntbll-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--n--bcba60e63d-k8s-goldmane--cccfbd5cf--ntbll-eth0", GenerateName:"goldmane-cccfbd5cf-", Namespace:"calico-system", SelfLink:"", UID:"8927d3f8-4e49-4a05-8332-7df32999988a", ResourceVersion:"822", Generation:0, CreationTimestamp:time.Date(2026, time.March, 13, 0, 34, 16, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"cccfbd5cf", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-n-bcba60e63d", ContainerID:"8c8e18264d17bfffc75b5c1473929b4d4dfc7bde498ce0683e50ce5842548de2", Pod:"goldmane-cccfbd5cf-ntbll", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.112.70/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"calidcdadda2878", MAC:"fa:3e:a6:1a:0d:b2", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 13 00:34:47.079025 containerd[1635]: 2026-03-13 00:34:47.074 [INFO][4589] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore 
ContainerID="8c8e18264d17bfffc75b5c1473929b4d4dfc7bde498ce0683e50ce5842548de2" Namespace="calico-system" Pod="goldmane-cccfbd5cf-ntbll" WorkloadEndpoint="ci--4459--2--4--n--bcba60e63d-k8s-goldmane--cccfbd5cf--ntbll-eth0" Mar 13 00:34:47.098953 containerd[1635]: time="2026-03-13T00:34:47.098858605Z" level=info msg="connecting to shim 8c8e18264d17bfffc75b5c1473929b4d4dfc7bde498ce0683e50ce5842548de2" address="unix:///run/containerd/s/aef4152125e3225b6ac6e2d3b13010ca68f490d689711dbc6ed8148ea92d0abe" namespace=k8s.io protocol=ttrpc version=3 Mar 13 00:34:47.123917 systemd[1]: Started cri-containerd-8c8e18264d17bfffc75b5c1473929b4d4dfc7bde498ce0683e50ce5842548de2.scope - libcontainer container 8c8e18264d17bfffc75b5c1473929b4d4dfc7bde498ce0683e50ce5842548de2. Mar 13 00:34:47.139182 containerd[1635]: time="2026-03-13T00:34:47.139110185Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-6b485f8c87-fd8vp,Uid:74e2b49f-1624-4b27-9f4b-1bb46fbfa4d3,Namespace:calico-system,Attempt:0,} returns sandbox id \"713090deb5c88e34e9353a5227f5a41c6aef6f1eddaded10fc7a3f648b0219be\"" Mar 13 00:34:47.184974 containerd[1635]: time="2026-03-13T00:34:47.184864703Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-cccfbd5cf-ntbll,Uid:8927d3f8-4e49-4a05-8332-7df32999988a,Namespace:calico-system,Attempt:0,} returns sandbox id \"8c8e18264d17bfffc75b5c1473929b4d4dfc7bde498ce0683e50ce5842548de2\"" Mar 13 00:34:47.244868 systemd-networkd[1490]: calibd8c0cbbb2e: Gained IPv6LL Mar 13 00:34:47.655206 containerd[1635]: time="2026-03-13T00:34:47.655141986Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:34:47.655978 containerd[1635]: time="2026-03-13T00:34:47.655945065Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.31.4: active requests=0, bytes read=8792502" Mar 13 00:34:47.657037 containerd[1635]: time="2026-03-13T00:34:47.657000512Z" 
level=info msg="ImageCreate event name:\"sha256:4c8cd7d0b10a4df64a5bd90e9845e9d1edbe0e37c2ebfc171bb28698e07abf72\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:34:47.659086 containerd[1635]: time="2026-03-13T00:34:47.658994487Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:ab57dd6f8423ef7b3ff382bf4ca5ace6063bdca77d441d852c75ec58847dd280\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:34:47.660109 containerd[1635]: time="2026-03-13T00:34:47.659326066Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.31.4\" with image id \"sha256:4c8cd7d0b10a4df64a5bd90e9845e9d1edbe0e37c2ebfc171bb28698e07abf72\", repo tag \"ghcr.io/flatcar/calico/csi:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:ab57dd6f8423ef7b3ff382bf4ca5ace6063bdca77d441d852c75ec58847dd280\", size \"10348547\" in 1.629250506s" Mar 13 00:34:47.660109 containerd[1635]: time="2026-03-13T00:34:47.659355506Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.31.4\" returns image reference \"sha256:4c8cd7d0b10a4df64a5bd90e9845e9d1edbe0e37c2ebfc171bb28698e07abf72\"" Mar 13 00:34:47.661570 containerd[1635]: time="2026-03-13T00:34:47.661553311Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.31.4\"" Mar 13 00:34:47.664132 containerd[1635]: time="2026-03-13T00:34:47.664115955Z" level=info msg="CreateContainer within sandbox \"e99672bfe17b352e0d7e5b1cc8a6e68261076888bad212bc0e32738468ab3421\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" Mar 13 00:34:47.678900 containerd[1635]: time="2026-03-13T00:34:47.678874308Z" level=info msg="Container 8d69f09b6681c099af9704057fa300c96bdef88b4766891d6d055a6327363474: CDI devices from CRI Config.CDIDevices: []" Mar 13 00:34:47.682317 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1895191820.mount: Deactivated successfully. 
Mar 13 00:34:47.689055 containerd[1635]: time="2026-03-13T00:34:47.689023483Z" level=info msg="CreateContainer within sandbox \"e99672bfe17b352e0d7e5b1cc8a6e68261076888bad212bc0e32738468ab3421\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"8d69f09b6681c099af9704057fa300c96bdef88b4766891d6d055a6327363474\"" Mar 13 00:34:47.689641 containerd[1635]: time="2026-03-13T00:34:47.689618162Z" level=info msg="StartContainer for \"8d69f09b6681c099af9704057fa300c96bdef88b4766891d6d055a6327363474\"" Mar 13 00:34:47.691039 containerd[1635]: time="2026-03-13T00:34:47.691015378Z" level=info msg="connecting to shim 8d69f09b6681c099af9704057fa300c96bdef88b4766891d6d055a6327363474" address="unix:///run/containerd/s/2142c7a87df133f846df359d248f27134e511a2c6ce1d6973f2801241e7f7ff8" protocol=ttrpc version=3 Mar 13 00:34:47.713847 systemd[1]: Started cri-containerd-8d69f09b6681c099af9704057fa300c96bdef88b4766891d6d055a6327363474.scope - libcontainer container 8d69f09b6681c099af9704057fa300c96bdef88b4766891d6d055a6327363474. 
Mar 13 00:34:47.784042 containerd[1635]: time="2026-03-13T00:34:47.783985590Z" level=info msg="StartContainer for \"8d69f09b6681c099af9704057fa300c96bdef88b4766891d6d055a6327363474\" returns successfully" Mar 13 00:34:47.884998 systemd-networkd[1490]: cali889658e3fc2: Gained IPv6LL Mar 13 00:34:48.076898 systemd-networkd[1490]: calidcdadda2878: Gained IPv6LL Mar 13 00:34:48.334242 systemd-networkd[1490]: cali8b3a2e760bf: Gained IPv6LL Mar 13 00:34:48.652910 systemd-networkd[1490]: calid0d40f4e472: Gained IPv6LL Mar 13 00:34:49.662710 containerd[1635]: time="2026-03-13T00:34:49.662614440Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-mgfwk,Uid:7e5979e4-f434-490a-b349-72aee91f876a,Namespace:kube-system,Attempt:0,}" Mar 13 00:34:49.663995 containerd[1635]: time="2026-03-13T00:34:49.663922868Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7449d668ff-b29lf,Uid:1c6e5827-ca2b-4238-8d67-bf66b92bd43f,Namespace:calico-system,Attempt:0,}" Mar 13 00:34:49.787172 systemd-networkd[1490]: cali687fcfc5363: Link UP Mar 13 00:34:49.787918 systemd-networkd[1490]: cali687fcfc5363: Gained carrier Mar 13 00:34:49.805544 containerd[1635]: 2026-03-13 00:34:49.726 [INFO][4882] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--4--n--bcba60e63d-k8s-coredns--66bc5c9577--mgfwk-eth0 coredns-66bc5c9577- kube-system 7e5979e4-f434-490a-b349-72aee91f876a 817 0 2026-03-13 00:34:07 +0000 UTC map[k8s-app:kube-dns pod-template-hash:66bc5c9577 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4459-2-4-n-bcba60e63d coredns-66bc5c9577-mgfwk eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali687fcfc5363 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 } {liveness-probe TCP 8080 0 } {readiness-probe TCP 8181 0 }] [] }} 
ContainerID="ba873214960ea82aa357c460fd3bb4590700470a90dbad44edf6fe49b1720152" Namespace="kube-system" Pod="coredns-66bc5c9577-mgfwk" WorkloadEndpoint="ci--4459--2--4--n--bcba60e63d-k8s-coredns--66bc5c9577--mgfwk-" Mar 13 00:34:49.805544 containerd[1635]: 2026-03-13 00:34:49.726 [INFO][4882] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="ba873214960ea82aa357c460fd3bb4590700470a90dbad44edf6fe49b1720152" Namespace="kube-system" Pod="coredns-66bc5c9577-mgfwk" WorkloadEndpoint="ci--4459--2--4--n--bcba60e63d-k8s-coredns--66bc5c9577--mgfwk-eth0" Mar 13 00:34:49.805544 containerd[1635]: 2026-03-13 00:34:49.750 [INFO][4907] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="ba873214960ea82aa357c460fd3bb4590700470a90dbad44edf6fe49b1720152" HandleID="k8s-pod-network.ba873214960ea82aa357c460fd3bb4590700470a90dbad44edf6fe49b1720152" Workload="ci--4459--2--4--n--bcba60e63d-k8s-coredns--66bc5c9577--mgfwk-eth0" Mar 13 00:34:49.805544 containerd[1635]: 2026-03-13 00:34:49.756 [INFO][4907] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="ba873214960ea82aa357c460fd3bb4590700470a90dbad44edf6fe49b1720152" HandleID="k8s-pod-network.ba873214960ea82aa357c460fd3bb4590700470a90dbad44edf6fe49b1720152" Workload="ci--4459--2--4--n--bcba60e63d-k8s-coredns--66bc5c9577--mgfwk-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002fd240), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4459-2-4-n-bcba60e63d", "pod":"coredns-66bc5c9577-mgfwk", "timestamp":"2026-03-13 00:34:49.750789624 +0000 UTC"}, Hostname:"ci-4459-2-4-n-bcba60e63d", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0xc000372dc0)} Mar 13 00:34:49.805544 containerd[1635]: 2026-03-13 00:34:49.756 [INFO][4907] ipam/ipam_plugin.go 438: About to acquire host-wide 
IPAM lock. Mar 13 00:34:49.805544 containerd[1635]: 2026-03-13 00:34:49.756 [INFO][4907] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 13 00:34:49.805544 containerd[1635]: 2026-03-13 00:34:49.756 [INFO][4907] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-4-n-bcba60e63d' Mar 13 00:34:49.805544 containerd[1635]: 2026-03-13 00:34:49.759 [INFO][4907] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.ba873214960ea82aa357c460fd3bb4590700470a90dbad44edf6fe49b1720152" host="ci-4459-2-4-n-bcba60e63d" Mar 13 00:34:49.805544 containerd[1635]: 2026-03-13 00:34:49.763 [INFO][4907] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4459-2-4-n-bcba60e63d" Mar 13 00:34:49.805544 containerd[1635]: 2026-03-13 00:34:49.767 [INFO][4907] ipam/ipam.go 526: Trying affinity for 192.168.112.64/26 host="ci-4459-2-4-n-bcba60e63d" Mar 13 00:34:49.805544 containerd[1635]: 2026-03-13 00:34:49.768 [INFO][4907] ipam/ipam.go 160: Attempting to load block cidr=192.168.112.64/26 host="ci-4459-2-4-n-bcba60e63d" Mar 13 00:34:49.805544 containerd[1635]: 2026-03-13 00:34:49.770 [INFO][4907] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.112.64/26 host="ci-4459-2-4-n-bcba60e63d" Mar 13 00:34:49.805544 containerd[1635]: 2026-03-13 00:34:49.770 [INFO][4907] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.112.64/26 handle="k8s-pod-network.ba873214960ea82aa357c460fd3bb4590700470a90dbad44edf6fe49b1720152" host="ci-4459-2-4-n-bcba60e63d" Mar 13 00:34:49.805544 containerd[1635]: 2026-03-13 00:34:49.771 [INFO][4907] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.ba873214960ea82aa357c460fd3bb4590700470a90dbad44edf6fe49b1720152 Mar 13 00:34:49.805544 containerd[1635]: 2026-03-13 00:34:49.776 [INFO][4907] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.112.64/26 
handle="k8s-pod-network.ba873214960ea82aa357c460fd3bb4590700470a90dbad44edf6fe49b1720152" host="ci-4459-2-4-n-bcba60e63d" Mar 13 00:34:49.805544 containerd[1635]: 2026-03-13 00:34:49.781 [INFO][4907] ipam/ipam.go 1288: Successfully claimed IPs: [192.168.112.71/26] block=192.168.112.64/26 handle="k8s-pod-network.ba873214960ea82aa357c460fd3bb4590700470a90dbad44edf6fe49b1720152" host="ci-4459-2-4-n-bcba60e63d" Mar 13 00:34:49.805544 containerd[1635]: 2026-03-13 00:34:49.781 [INFO][4907] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.112.71/26] handle="k8s-pod-network.ba873214960ea82aa357c460fd3bb4590700470a90dbad44edf6fe49b1720152" host="ci-4459-2-4-n-bcba60e63d" Mar 13 00:34:49.805544 containerd[1635]: 2026-03-13 00:34:49.781 [INFO][4907] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 13 00:34:49.805544 containerd[1635]: 2026-03-13 00:34:49.781 [INFO][4907] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.112.71/26] IPv6=[] ContainerID="ba873214960ea82aa357c460fd3bb4590700470a90dbad44edf6fe49b1720152" HandleID="k8s-pod-network.ba873214960ea82aa357c460fd3bb4590700470a90dbad44edf6fe49b1720152" Workload="ci--4459--2--4--n--bcba60e63d-k8s-coredns--66bc5c9577--mgfwk-eth0" Mar 13 00:34:49.805989 containerd[1635]: 2026-03-13 00:34:49.784 [INFO][4882] cni-plugin/k8s.go 418: Populated endpoint ContainerID="ba873214960ea82aa357c460fd3bb4590700470a90dbad44edf6fe49b1720152" Namespace="kube-system" Pod="coredns-66bc5c9577-mgfwk" WorkloadEndpoint="ci--4459--2--4--n--bcba60e63d-k8s-coredns--66bc5c9577--mgfwk-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--n--bcba60e63d-k8s-coredns--66bc5c9577--mgfwk-eth0", GenerateName:"coredns-66bc5c9577-", Namespace:"kube-system", SelfLink:"", UID:"7e5979e4-f434-490a-b349-72aee91f876a", ResourceVersion:"817", Generation:0, CreationTimestamp:time.Date(2026, time.March, 13, 0, 34, 7, 
0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"66bc5c9577", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-n-bcba60e63d", ContainerID:"", Pod:"coredns-66bc5c9577-mgfwk", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.112.71/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali687fcfc5363", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 13 00:34:49.805989 containerd[1635]: 2026-03-13 00:34:49.784 [INFO][4882] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.112.71/32] ContainerID="ba873214960ea82aa357c460fd3bb4590700470a90dbad44edf6fe49b1720152" Namespace="kube-system" Pod="coredns-66bc5c9577-mgfwk" 
WorkloadEndpoint="ci--4459--2--4--n--bcba60e63d-k8s-coredns--66bc5c9577--mgfwk-eth0" Mar 13 00:34:49.805989 containerd[1635]: 2026-03-13 00:34:49.784 [INFO][4882] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali687fcfc5363 ContainerID="ba873214960ea82aa357c460fd3bb4590700470a90dbad44edf6fe49b1720152" Namespace="kube-system" Pod="coredns-66bc5c9577-mgfwk" WorkloadEndpoint="ci--4459--2--4--n--bcba60e63d-k8s-coredns--66bc5c9577--mgfwk-eth0" Mar 13 00:34:49.805989 containerd[1635]: 2026-03-13 00:34:49.788 [INFO][4882] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="ba873214960ea82aa357c460fd3bb4590700470a90dbad44edf6fe49b1720152" Namespace="kube-system" Pod="coredns-66bc5c9577-mgfwk" WorkloadEndpoint="ci--4459--2--4--n--bcba60e63d-k8s-coredns--66bc5c9577--mgfwk-eth0" Mar 13 00:34:49.805989 containerd[1635]: 2026-03-13 00:34:49.789 [INFO][4882] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="ba873214960ea82aa357c460fd3bb4590700470a90dbad44edf6fe49b1720152" Namespace="kube-system" Pod="coredns-66bc5c9577-mgfwk" WorkloadEndpoint="ci--4459--2--4--n--bcba60e63d-k8s-coredns--66bc5c9577--mgfwk-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--n--bcba60e63d-k8s-coredns--66bc5c9577--mgfwk-eth0", GenerateName:"coredns-66bc5c9577-", Namespace:"kube-system", SelfLink:"", UID:"7e5979e4-f434-490a-b349-72aee91f876a", ResourceVersion:"817", Generation:0, CreationTimestamp:time.Date(2026, time.March, 13, 0, 34, 7, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"66bc5c9577", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), 
OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-n-bcba60e63d", ContainerID:"ba873214960ea82aa357c460fd3bb4590700470a90dbad44edf6fe49b1720152", Pod:"coredns-66bc5c9577-mgfwk", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.112.71/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali687fcfc5363", MAC:"fa:48:ac:95:13:81", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 13 00:34:49.806125 containerd[1635]: 2026-03-13 00:34:49.797 [INFO][4882] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="ba873214960ea82aa357c460fd3bb4590700470a90dbad44edf6fe49b1720152" Namespace="kube-system" Pod="coredns-66bc5c9577-mgfwk" WorkloadEndpoint="ci--4459--2--4--n--bcba60e63d-k8s-coredns--66bc5c9577--mgfwk-eth0" Mar 13 00:34:49.842173 containerd[1635]: time="2026-03-13T00:34:49.842143321Z" level=info msg="connecting to shim ba873214960ea82aa357c460fd3bb4590700470a90dbad44edf6fe49b1720152" 
address="unix:///run/containerd/s/4c4db313f9f3e1659746d9690c03a775f39c76ba49854f3122e813c723d67ee0" namespace=k8s.io protocol=ttrpc version=3 Mar 13 00:34:49.875016 systemd[1]: Started cri-containerd-ba873214960ea82aa357c460fd3bb4590700470a90dbad44edf6fe49b1720152.scope - libcontainer container ba873214960ea82aa357c460fd3bb4590700470a90dbad44edf6fe49b1720152. Mar 13 00:34:49.901715 systemd-networkd[1490]: calib1294bd79fe: Link UP Mar 13 00:34:49.909770 systemd-networkd[1490]: calib1294bd79fe: Gained carrier Mar 13 00:34:49.923736 containerd[1635]: 2026-03-13 00:34:49.729 [INFO][4879] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--4--n--bcba60e63d-k8s-calico--apiserver--7449d668ff--b29lf-eth0 calico-apiserver-7449d668ff- calico-system 1c6e5827-ca2b-4238-8d67-bf66b92bd43f 825 0 2026-03-13 00:34:16 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:7449d668ff projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4459-2-4-n-bcba60e63d calico-apiserver-7449d668ff-b29lf eth0 calico-apiserver [] [] [kns.calico-system ksa.calico-system.calico-apiserver] calib1294bd79fe [] [] }} ContainerID="1858639048c6e9ed2f47ea7ce62a34a0bc194943c2750ffd045727466e44d0ae" Namespace="calico-system" Pod="calico-apiserver-7449d668ff-b29lf" WorkloadEndpoint="ci--4459--2--4--n--bcba60e63d-k8s-calico--apiserver--7449d668ff--b29lf-" Mar 13 00:34:49.923736 containerd[1635]: 2026-03-13 00:34:49.729 [INFO][4879] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="1858639048c6e9ed2f47ea7ce62a34a0bc194943c2750ffd045727466e44d0ae" Namespace="calico-system" Pod="calico-apiserver-7449d668ff-b29lf" WorkloadEndpoint="ci--4459--2--4--n--bcba60e63d-k8s-calico--apiserver--7449d668ff--b29lf-eth0" Mar 13 00:34:49.923736 containerd[1635]: 2026-03-13 00:34:49.756 
[INFO][4909] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="1858639048c6e9ed2f47ea7ce62a34a0bc194943c2750ffd045727466e44d0ae" HandleID="k8s-pod-network.1858639048c6e9ed2f47ea7ce62a34a0bc194943c2750ffd045727466e44d0ae" Workload="ci--4459--2--4--n--bcba60e63d-k8s-calico--apiserver--7449d668ff--b29lf-eth0" Mar 13 00:34:49.923736 containerd[1635]: 2026-03-13 00:34:49.761 [INFO][4909] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="1858639048c6e9ed2f47ea7ce62a34a0bc194943c2750ffd045727466e44d0ae" HandleID="k8s-pod-network.1858639048c6e9ed2f47ea7ce62a34a0bc194943c2750ffd045727466e44d0ae" Workload="ci--4459--2--4--n--bcba60e63d-k8s-calico--apiserver--7449d668ff--b29lf-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002fd440), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459-2-4-n-bcba60e63d", "pod":"calico-apiserver-7449d668ff-b29lf", "timestamp":"2026-03-13 00:34:49.756589353 +0000 UTC"}, Hostname:"ci-4459-2-4-n-bcba60e63d", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0xc0002611e0)} Mar 13 00:34:49.923736 containerd[1635]: 2026-03-13 00:34:49.761 [INFO][4909] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 13 00:34:49.923736 containerd[1635]: 2026-03-13 00:34:49.781 [INFO][4909] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Mar 13 00:34:49.923736 containerd[1635]: 2026-03-13 00:34:49.781 [INFO][4909] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-4-n-bcba60e63d' Mar 13 00:34:49.923736 containerd[1635]: 2026-03-13 00:34:49.862 [INFO][4909] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.1858639048c6e9ed2f47ea7ce62a34a0bc194943c2750ffd045727466e44d0ae" host="ci-4459-2-4-n-bcba60e63d" Mar 13 00:34:49.923736 containerd[1635]: 2026-03-13 00:34:49.868 [INFO][4909] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4459-2-4-n-bcba60e63d" Mar 13 00:34:49.923736 containerd[1635]: 2026-03-13 00:34:49.872 [INFO][4909] ipam/ipam.go 526: Trying affinity for 192.168.112.64/26 host="ci-4459-2-4-n-bcba60e63d" Mar 13 00:34:49.923736 containerd[1635]: 2026-03-13 00:34:49.873 [INFO][4909] ipam/ipam.go 160: Attempting to load block cidr=192.168.112.64/26 host="ci-4459-2-4-n-bcba60e63d" Mar 13 00:34:49.923736 containerd[1635]: 2026-03-13 00:34:49.877 [INFO][4909] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.112.64/26 host="ci-4459-2-4-n-bcba60e63d" Mar 13 00:34:49.923736 containerd[1635]: 2026-03-13 00:34:49.877 [INFO][4909] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.112.64/26 handle="k8s-pod-network.1858639048c6e9ed2f47ea7ce62a34a0bc194943c2750ffd045727466e44d0ae" host="ci-4459-2-4-n-bcba60e63d" Mar 13 00:34:49.923736 containerd[1635]: 2026-03-13 00:34:49.880 [INFO][4909] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.1858639048c6e9ed2f47ea7ce62a34a0bc194943c2750ffd045727466e44d0ae Mar 13 00:34:49.923736 containerd[1635]: 2026-03-13 00:34:49.884 [INFO][4909] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.112.64/26 handle="k8s-pod-network.1858639048c6e9ed2f47ea7ce62a34a0bc194943c2750ffd045727466e44d0ae" host="ci-4459-2-4-n-bcba60e63d" Mar 13 00:34:49.923736 containerd[1635]: 2026-03-13 00:34:49.890 [INFO][4909] ipam/ipam.go 1288: 
Successfully claimed IPs: [192.168.112.72/26] block=192.168.112.64/26 handle="k8s-pod-network.1858639048c6e9ed2f47ea7ce62a34a0bc194943c2750ffd045727466e44d0ae" host="ci-4459-2-4-n-bcba60e63d" Mar 13 00:34:49.923736 containerd[1635]: 2026-03-13 00:34:49.890 [INFO][4909] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.112.72/26] handle="k8s-pod-network.1858639048c6e9ed2f47ea7ce62a34a0bc194943c2750ffd045727466e44d0ae" host="ci-4459-2-4-n-bcba60e63d" Mar 13 00:34:49.923736 containerd[1635]: 2026-03-13 00:34:49.890 [INFO][4909] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 13 00:34:49.923736 containerd[1635]: 2026-03-13 00:34:49.890 [INFO][4909] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.112.72/26] IPv6=[] ContainerID="1858639048c6e9ed2f47ea7ce62a34a0bc194943c2750ffd045727466e44d0ae" HandleID="k8s-pod-network.1858639048c6e9ed2f47ea7ce62a34a0bc194943c2750ffd045727466e44d0ae" Workload="ci--4459--2--4--n--bcba60e63d-k8s-calico--apiserver--7449d668ff--b29lf-eth0" Mar 13 00:34:49.924137 containerd[1635]: 2026-03-13 00:34:49.893 [INFO][4879] cni-plugin/k8s.go 418: Populated endpoint ContainerID="1858639048c6e9ed2f47ea7ce62a34a0bc194943c2750ffd045727466e44d0ae" Namespace="calico-system" Pod="calico-apiserver-7449d668ff-b29lf" WorkloadEndpoint="ci--4459--2--4--n--bcba60e63d-k8s-calico--apiserver--7449d668ff--b29lf-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--n--bcba60e63d-k8s-calico--apiserver--7449d668ff--b29lf-eth0", GenerateName:"calico-apiserver-7449d668ff-", Namespace:"calico-system", SelfLink:"", UID:"1c6e5827-ca2b-4238-8d67-bf66b92bd43f", ResourceVersion:"825", Generation:0, CreationTimestamp:time.Date(2026, time.March, 13, 0, 34, 16, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", 
"k8s-app":"calico-apiserver", "pod-template-hash":"7449d668ff", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-n-bcba60e63d", ContainerID:"", Pod:"calico-apiserver-7449d668ff-b29lf", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.112.72/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"calib1294bd79fe", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 13 00:34:49.924137 containerd[1635]: 2026-03-13 00:34:49.893 [INFO][4879] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.112.72/32] ContainerID="1858639048c6e9ed2f47ea7ce62a34a0bc194943c2750ffd045727466e44d0ae" Namespace="calico-system" Pod="calico-apiserver-7449d668ff-b29lf" WorkloadEndpoint="ci--4459--2--4--n--bcba60e63d-k8s-calico--apiserver--7449d668ff--b29lf-eth0" Mar 13 00:34:49.924137 containerd[1635]: 2026-03-13 00:34:49.893 [INFO][4879] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calib1294bd79fe ContainerID="1858639048c6e9ed2f47ea7ce62a34a0bc194943c2750ffd045727466e44d0ae" Namespace="calico-system" Pod="calico-apiserver-7449d668ff-b29lf" WorkloadEndpoint="ci--4459--2--4--n--bcba60e63d-k8s-calico--apiserver--7449d668ff--b29lf-eth0" Mar 13 00:34:49.924137 containerd[1635]: 2026-03-13 00:34:49.909 [INFO][4879] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="1858639048c6e9ed2f47ea7ce62a34a0bc194943c2750ffd045727466e44d0ae" Namespace="calico-system" Pod="calico-apiserver-7449d668ff-b29lf" 
WorkloadEndpoint="ci--4459--2--4--n--bcba60e63d-k8s-calico--apiserver--7449d668ff--b29lf-eth0" Mar 13 00:34:49.924137 containerd[1635]: 2026-03-13 00:34:49.911 [INFO][4879] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="1858639048c6e9ed2f47ea7ce62a34a0bc194943c2750ffd045727466e44d0ae" Namespace="calico-system" Pod="calico-apiserver-7449d668ff-b29lf" WorkloadEndpoint="ci--4459--2--4--n--bcba60e63d-k8s-calico--apiserver--7449d668ff--b29lf-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--n--bcba60e63d-k8s-calico--apiserver--7449d668ff--b29lf-eth0", GenerateName:"calico-apiserver-7449d668ff-", Namespace:"calico-system", SelfLink:"", UID:"1c6e5827-ca2b-4238-8d67-bf66b92bd43f", ResourceVersion:"825", Generation:0, CreationTimestamp:time.Date(2026, time.March, 13, 0, 34, 16, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"7449d668ff", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-n-bcba60e63d", ContainerID:"1858639048c6e9ed2f47ea7ce62a34a0bc194943c2750ffd045727466e44d0ae", Pod:"calico-apiserver-7449d668ff-b29lf", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.112.72/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"calib1294bd79fe", MAC:"06:d9:e5:6a:b5:cd", 
Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 13 00:34:49.924137 containerd[1635]: 2026-03-13 00:34:49.918 [INFO][4879] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="1858639048c6e9ed2f47ea7ce62a34a0bc194943c2750ffd045727466e44d0ae" Namespace="calico-system" Pod="calico-apiserver-7449d668ff-b29lf" WorkloadEndpoint="ci--4459--2--4--n--bcba60e63d-k8s-calico--apiserver--7449d668ff--b29lf-eth0" Mar 13 00:34:49.959107 containerd[1635]: time="2026-03-13T00:34:49.959043665Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-mgfwk,Uid:7e5979e4-f434-490a-b349-72aee91f876a,Namespace:kube-system,Attempt:0,} returns sandbox id \"ba873214960ea82aa357c460fd3bb4590700470a90dbad44edf6fe49b1720152\"" Mar 13 00:34:49.965399 containerd[1635]: time="2026-03-13T00:34:49.965342662Z" level=info msg="CreateContainer within sandbox \"ba873214960ea82aa357c460fd3bb4590700470a90dbad44edf6fe49b1720152\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Mar 13 00:34:49.970976 containerd[1635]: time="2026-03-13T00:34:49.970927460Z" level=info msg="connecting to shim 1858639048c6e9ed2f47ea7ce62a34a0bc194943c2750ffd045727466e44d0ae" address="unix:///run/containerd/s/fb45302c0f3dcf89b00e93377eac61f5190424a5d8643dc4058d0fe1c7064a8b" namespace=k8s.io protocol=ttrpc version=3 Mar 13 00:34:49.979874 containerd[1635]: time="2026-03-13T00:34:49.979855481Z" level=info msg="Container da91442901691fe120d1b9ccff6ad4a5c3b0ef49cdc6416c71a07d620b3a2d94: CDI devices from CRI Config.CDIDevices: []" Mar 13 00:34:49.988240 containerd[1635]: time="2026-03-13T00:34:49.988202803Z" level=info msg="CreateContainer within sandbox \"ba873214960ea82aa357c460fd3bb4590700470a90dbad44edf6fe49b1720152\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"da91442901691fe120d1b9ccff6ad4a5c3b0ef49cdc6416c71a07d620b3a2d94\"" Mar 13 00:34:49.989890 containerd[1635]: 
time="2026-03-13T00:34:49.989864650Z" level=info msg="StartContainer for \"da91442901691fe120d1b9ccff6ad4a5c3b0ef49cdc6416c71a07d620b3a2d94\"" Mar 13 00:34:49.990490 containerd[1635]: time="2026-03-13T00:34:49.990468038Z" level=info msg="connecting to shim da91442901691fe120d1b9ccff6ad4a5c3b0ef49cdc6416c71a07d620b3a2d94" address="unix:///run/containerd/s/4c4db313f9f3e1659746d9690c03a775f39c76ba49854f3122e813c723d67ee0" protocol=ttrpc version=3 Mar 13 00:34:50.000774 systemd[1]: Started cri-containerd-1858639048c6e9ed2f47ea7ce62a34a0bc194943c2750ffd045727466e44d0ae.scope - libcontainer container 1858639048c6e9ed2f47ea7ce62a34a0bc194943c2750ffd045727466e44d0ae. Mar 13 00:34:50.022022 systemd[1]: Started cri-containerd-da91442901691fe120d1b9ccff6ad4a5c3b0ef49cdc6416c71a07d620b3a2d94.scope - libcontainer container da91442901691fe120d1b9ccff6ad4a5c3b0ef49cdc6416c71a07d620b3a2d94. Mar 13 00:34:50.054187 containerd[1635]: time="2026-03-13T00:34:50.054049122Z" level=info msg="StartContainer for \"da91442901691fe120d1b9ccff6ad4a5c3b0ef49cdc6416c71a07d620b3a2d94\" returns successfully" Mar 13 00:34:50.069045 containerd[1635]: time="2026-03-13T00:34:50.069017434Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7449d668ff-b29lf,Uid:1c6e5827-ca2b-4238-8d67-bf66b92bd43f,Namespace:calico-system,Attempt:0,} returns sandbox id \"1858639048c6e9ed2f47ea7ce62a34a0bc194943c2750ffd045727466e44d0ae\"" Mar 13 00:34:50.924459 kubelet[2798]: I0313 00:34:50.924271 2798 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-66bc5c9577-mgfwk" podStartSLOduration=43.924150813 podStartE2EDuration="43.924150813s" podCreationTimestamp="2026-03-13 00:34:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 00:34:50.903039044 +0000 UTC m=+51.346294203" watchObservedRunningTime="2026-03-13 00:34:50.924150813 +0000 UTC m=+51.367405972" Mar 
13 00:34:51.084899 systemd-networkd[1490]: calib1294bd79fe: Gained IPv6LL Mar 13 00:34:51.415434 containerd[1635]: time="2026-03-13T00:34:51.415394374Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:34:51.416573 containerd[1635]: time="2026-03-13T00:34:51.416502233Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.31.4: active requests=0, bytes read=48415780" Mar 13 00:34:51.417385 containerd[1635]: time="2026-03-13T00:34:51.417367221Z" level=info msg="ImageCreate event name:\"sha256:f7ff80340b9b4973ceda29859065985831588b2898f2b4009f742b5789010898\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:34:51.419289 containerd[1635]: time="2026-03-13T00:34:51.419237508Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:d212af1da3dd52a633bc9e36653a7d901d95a570f8d51d1968a837dcf6879730\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:34:51.419677 containerd[1635]: time="2026-03-13T00:34:51.419591817Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.31.4\" with image id \"sha256:f7ff80340b9b4973ceda29859065985831588b2898f2b4009f742b5789010898\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:d212af1da3dd52a633bc9e36653a7d901d95a570f8d51d1968a837dcf6879730\", size \"49971841\" in 3.757596217s" Mar 13 00:34:51.419677 containerd[1635]: time="2026-03-13T00:34:51.419613597Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.31.4\" returns image reference \"sha256:f7ff80340b9b4973ceda29859065985831588b2898f2b4009f742b5789010898\"" Mar 13 00:34:51.421755 containerd[1635]: time="2026-03-13T00:34:51.421729283Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\"" Mar 13 00:34:51.423944 containerd[1635]: time="2026-03-13T00:34:51.423589120Z" level=info 
msg="CreateContainer within sandbox \"60caa4d20d2849e075a76513648c8fce8e01c68b206c88e52a5575453db60275\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Mar 13 00:34:51.429012 containerd[1635]: time="2026-03-13T00:34:51.428997620Z" level=info msg="Container 929eb2e633b715c243b785d59f6ef83afc4c849a67233c6acabf3157247baf35: CDI devices from CRI Config.CDIDevices: []" Mar 13 00:34:51.444572 containerd[1635]: time="2026-03-13T00:34:51.444548523Z" level=info msg="CreateContainer within sandbox \"60caa4d20d2849e075a76513648c8fce8e01c68b206c88e52a5575453db60275\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"929eb2e633b715c243b785d59f6ef83afc4c849a67233c6acabf3157247baf35\"" Mar 13 00:34:51.446250 containerd[1635]: time="2026-03-13T00:34:51.444986422Z" level=info msg="StartContainer for \"929eb2e633b715c243b785d59f6ef83afc4c849a67233c6acabf3157247baf35\"" Mar 13 00:34:51.446359 containerd[1635]: time="2026-03-13T00:34:51.446345899Z" level=info msg="connecting to shim 929eb2e633b715c243b785d59f6ef83afc4c849a67233c6acabf3157247baf35" address="unix:///run/containerd/s/829d2e90c091172c9f14f1bc7f6f308b753cad8fdccd9d6fe6f2f0f71f8e94fb" protocol=ttrpc version=3 Mar 13 00:34:51.465752 systemd[1]: Started cri-containerd-929eb2e633b715c243b785d59f6ef83afc4c849a67233c6acabf3157247baf35.scope - libcontainer container 929eb2e633b715c243b785d59f6ef83afc4c849a67233c6acabf3157247baf35. 
Mar 13 00:34:51.508837 containerd[1635]: time="2026-03-13T00:34:51.508807076Z" level=info msg="StartContainer for \"929eb2e633b715c243b785d59f6ef83afc4c849a67233c6acabf3157247baf35\" returns successfully" Mar 13 00:34:51.725729 systemd-networkd[1490]: cali687fcfc5363: Gained IPv6LL Mar 13 00:34:52.901038 kubelet[2798]: I0313 00:34:52.900969 2798 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 13 00:34:54.838777 containerd[1635]: time="2026-03-13T00:34:54.838718672Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:34:54.840014 containerd[1635]: time="2026-03-13T00:34:54.839834001Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.31.4: active requests=0, bytes read=52406348" Mar 13 00:34:54.840913 containerd[1635]: time="2026-03-13T00:34:54.840889579Z" level=info msg="ImageCreate event name:\"sha256:ff033cc89dab51090bfa1b04e155a5ce1e3b1f324f74b7b2be0dd6f0b6b10e89\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:34:54.842536 containerd[1635]: time="2026-03-13T00:34:54.842521197Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:99b8bb50141ca55b4b6ddfcf2f2fbde838265508ab2ac96ed08e72cd39800713\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:34:54.842945 containerd[1635]: time="2026-03-13T00:34:54.842919046Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\" with image id \"sha256:ff033cc89dab51090bfa1b04e155a5ce1e3b1f324f74b7b2be0dd6f0b6b10e89\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:99b8bb50141ca55b4b6ddfcf2f2fbde838265508ab2ac96ed08e72cd39800713\", size \"53962361\" in 3.421170223s" Mar 13 00:34:54.842980 containerd[1635]: time="2026-03-13T00:34:54.842945896Z" level=info msg="PullImage 
\"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\" returns image reference \"sha256:ff033cc89dab51090bfa1b04e155a5ce1e3b1f324f74b7b2be0dd6f0b6b10e89\"" Mar 13 00:34:54.844615 containerd[1635]: time="2026-03-13T00:34:54.844593104Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.31.4\"" Mar 13 00:34:54.858104 containerd[1635]: time="2026-03-13T00:34:54.858064494Z" level=info msg="CreateContainer within sandbox \"713090deb5c88e34e9353a5227f5a41c6aef6f1eddaded10fc7a3f648b0219be\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" Mar 13 00:34:54.866680 containerd[1635]: time="2026-03-13T00:34:54.866411833Z" level=info msg="Container 16da04be8cab1842483031f89c7529659fd1692331703b7e93813950d24d0e31: CDI devices from CRI Config.CDIDevices: []" Mar 13 00:34:54.874368 containerd[1635]: time="2026-03-13T00:34:54.874338322Z" level=info msg="CreateContainer within sandbox \"713090deb5c88e34e9353a5227f5a41c6aef6f1eddaded10fc7a3f648b0219be\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"16da04be8cab1842483031f89c7529659fd1692331703b7e93813950d24d0e31\"" Mar 13 00:34:54.874830 containerd[1635]: time="2026-03-13T00:34:54.874793951Z" level=info msg="StartContainer for \"16da04be8cab1842483031f89c7529659fd1692331703b7e93813950d24d0e31\"" Mar 13 00:34:54.875528 containerd[1635]: time="2026-03-13T00:34:54.875504441Z" level=info msg="connecting to shim 16da04be8cab1842483031f89c7529659fd1692331703b7e93813950d24d0e31" address="unix:///run/containerd/s/be598211b620aeba33ac70ce3b3d19ee8d9d95eb5a1be7d3b8c19fbd6fe7171d" protocol=ttrpc version=3 Mar 13 00:34:54.896755 systemd[1]: Started cri-containerd-16da04be8cab1842483031f89c7529659fd1692331703b7e93813950d24d0e31.scope - libcontainer container 16da04be8cab1842483031f89c7529659fd1692331703b7e93813950d24d0e31. 
Mar 13 00:34:54.944809 containerd[1635]: time="2026-03-13T00:34:54.944766502Z" level=info msg="StartContainer for \"16da04be8cab1842483031f89c7529659fd1692331703b7e93813950d24d0e31\" returns successfully" Mar 13 00:34:55.949957 kubelet[2798]: I0313 00:34:55.948631 2798 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-apiserver-7449d668ff-qzpl5" podStartSLOduration=35.580513134 podStartE2EDuration="39.948608531s" podCreationTimestamp="2026-03-13 00:34:16 +0000 UTC" firstStartedPulling="2026-03-13 00:34:47.052344509 +0000 UTC m=+47.495599668" lastFinishedPulling="2026-03-13 00:34:51.420439916 +0000 UTC m=+51.863695065" observedRunningTime="2026-03-13 00:34:51.908031146 +0000 UTC m=+52.351286325" watchObservedRunningTime="2026-03-13 00:34:55.948608531 +0000 UTC m=+56.391863720" Mar 13 00:34:55.951901 kubelet[2798]: I0313 00:34:55.950112 2798 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-6b485f8c87-fd8vp" podStartSLOduration=31.246868547 podStartE2EDuration="38.950097489s" podCreationTimestamp="2026-03-13 00:34:17 +0000 UTC" firstStartedPulling="2026-03-13 00:34:47.140727582 +0000 UTC m=+47.583982741" lastFinishedPulling="2026-03-13 00:34:54.843956534 +0000 UTC m=+55.287211683" observedRunningTime="2026-03-13 00:34:55.946443524 +0000 UTC m=+56.389698713" watchObservedRunningTime="2026-03-13 00:34:55.950097489 +0000 UTC m=+56.393352688" Mar 13 00:34:56.815505 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1396830540.mount: Deactivated successfully. 
Mar 13 00:34:57.118273 containerd[1635]: time="2026-03-13T00:34:57.118233183Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:34:57.119413 containerd[1635]: time="2026-03-13T00:34:57.119273121Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.31.4: active requests=0, bytes read=55623386" Mar 13 00:34:57.120273 containerd[1635]: time="2026-03-13T00:34:57.120254030Z" level=info msg="ImageCreate event name:\"sha256:714983e5e920bbe810fab04d9f06bd16ef4e552b0d2deffd7ab2b2c4a001acbb\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:34:57.122184 containerd[1635]: time="2026-03-13T00:34:57.122165268Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane@sha256:44395ca5ebfe88f21ed51acfbec5fc0f31d2762966e2007a0a2eb9b30e35fc4d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:34:57.122600 containerd[1635]: time="2026-03-13T00:34:57.122578237Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/goldmane:v3.31.4\" with image id \"sha256:714983e5e920bbe810fab04d9f06bd16ef4e552b0d2deffd7ab2b2c4a001acbb\", repo tag \"ghcr.io/flatcar/calico/goldmane:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/goldmane@sha256:44395ca5ebfe88f21ed51acfbec5fc0f31d2762966e2007a0a2eb9b30e35fc4d\", size \"55623232\" in 2.277961073s" Mar 13 00:34:57.122634 containerd[1635]: time="2026-03-13T00:34:57.122602697Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.31.4\" returns image reference \"sha256:714983e5e920bbe810fab04d9f06bd16ef4e552b0d2deffd7ab2b2c4a001acbb\"" Mar 13 00:34:57.124901 containerd[1635]: time="2026-03-13T00:34:57.124877655Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\"" Mar 13 00:34:57.127280 containerd[1635]: time="2026-03-13T00:34:57.127263442Z" level=info msg="CreateContainer within sandbox 
\"8c8e18264d17bfffc75b5c1473929b4d4dfc7bde498ce0683e50ce5842548de2\" for container &ContainerMetadata{Name:goldmane,Attempt:0,}" Mar 13 00:34:57.134678 containerd[1635]: time="2026-03-13T00:34:57.133051227Z" level=info msg="Container 1fb5cc5e737d4aeb148549e17c2e2615f9eb1b7cae8f5245ab5af2a8eba0c81d: CDI devices from CRI Config.CDIDevices: []" Mar 13 00:34:57.145366 containerd[1635]: time="2026-03-13T00:34:57.145341293Z" level=info msg="CreateContainer within sandbox \"8c8e18264d17bfffc75b5c1473929b4d4dfc7bde498ce0683e50ce5842548de2\" for &ContainerMetadata{Name:goldmane,Attempt:0,} returns container id \"1fb5cc5e737d4aeb148549e17c2e2615f9eb1b7cae8f5245ab5af2a8eba0c81d\"" Mar 13 00:34:57.145747 containerd[1635]: time="2026-03-13T00:34:57.145684083Z" level=info msg="StartContainer for \"1fb5cc5e737d4aeb148549e17c2e2615f9eb1b7cae8f5245ab5af2a8eba0c81d\"" Mar 13 00:34:57.147336 containerd[1635]: time="2026-03-13T00:34:57.147315981Z" level=info msg="connecting to shim 1fb5cc5e737d4aeb148549e17c2e2615f9eb1b7cae8f5245ab5af2a8eba0c81d" address="unix:///run/containerd/s/aef4152125e3225b6ac6e2d3b13010ca68f490d689711dbc6ed8148ea92d0abe" protocol=ttrpc version=3 Mar 13 00:34:57.166843 systemd[1]: Started cri-containerd-1fb5cc5e737d4aeb148549e17c2e2615f9eb1b7cae8f5245ab5af2a8eba0c81d.scope - libcontainer container 1fb5cc5e737d4aeb148549e17c2e2615f9eb1b7cae8f5245ab5af2a8eba0c81d. 
Mar 13 00:34:57.218504 containerd[1635]: time="2026-03-13T00:34:57.218431733Z" level=info msg="StartContainer for \"1fb5cc5e737d4aeb148549e17c2e2615f9eb1b7cae8f5245ab5af2a8eba0c81d\" returns successfully" Mar 13 00:34:57.964633 kubelet[2798]: I0313 00:34:57.963260 2798 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/goldmane-cccfbd5cf-ntbll" podStartSLOduration=32.027018151 podStartE2EDuration="41.96324003s" podCreationTimestamp="2026-03-13 00:34:16 +0000 UTC" firstStartedPulling="2026-03-13 00:34:47.187273558 +0000 UTC m=+47.630528717" lastFinishedPulling="2026-03-13 00:34:57.123495437 +0000 UTC m=+57.566750596" observedRunningTime="2026-03-13 00:34:57.96242786 +0000 UTC m=+58.405683059" watchObservedRunningTime="2026-03-13 00:34:57.96324003 +0000 UTC m=+58.406495229" Mar 13 00:34:58.814257 containerd[1635]: time="2026-03-13T00:34:58.814214527Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:34:58.815394 containerd[1635]: time="2026-03-13T00:34:58.815277915Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4: active requests=0, bytes read=14704317" Mar 13 00:34:58.816249 containerd[1635]: time="2026-03-13T00:34:58.816234105Z" level=info msg="ImageCreate event name:\"sha256:d7aeb99114cbb6499e9048f43d3faa5f199d1a05ed44165e5974d0368ac32771\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:34:58.817918 containerd[1635]: time="2026-03-13T00:34:58.817903684Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:e41c0d73bcd33ff28ae2f2983cf781a4509d212e102d53883dbbf436ab3cd97d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:34:58.818200 containerd[1635]: time="2026-03-13T00:34:58.818178673Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\" with image id 
\"sha256:d7aeb99114cbb6499e9048f43d3faa5f199d1a05ed44165e5974d0368ac32771\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:e41c0d73bcd33ff28ae2f2983cf781a4509d212e102d53883dbbf436ab3cd97d\", size \"16260314\" in 1.693278438s" Mar 13 00:34:58.818234 containerd[1635]: time="2026-03-13T00:34:58.818202443Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\" returns image reference \"sha256:d7aeb99114cbb6499e9048f43d3faa5f199d1a05ed44165e5974d0368ac32771\"" Mar 13 00:34:58.819089 containerd[1635]: time="2026-03-13T00:34:58.819053572Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.31.4\"" Mar 13 00:34:58.822153 containerd[1635]: time="2026-03-13T00:34:58.822129879Z" level=info msg="CreateContainer within sandbox \"e99672bfe17b352e0d7e5b1cc8a6e68261076888bad212bc0e32738468ab3421\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" Mar 13 00:34:58.830916 containerd[1635]: time="2026-03-13T00:34:58.830723441Z" level=info msg="Container 5b4039e58e30603bf8f14e2723e66ff9e833d3c5bb8d8e8d6b75e8d196d62593: CDI devices from CRI Config.CDIDevices: []" Mar 13 00:34:58.846581 containerd[1635]: time="2026-03-13T00:34:58.846548974Z" level=info msg="CreateContainer within sandbox \"e99672bfe17b352e0d7e5b1cc8a6e68261076888bad212bc0e32738468ab3421\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"5b4039e58e30603bf8f14e2723e66ff9e833d3c5bb8d8e8d6b75e8d196d62593\"" Mar 13 00:34:58.847004 containerd[1635]: time="2026-03-13T00:34:58.846990935Z" level=info msg="StartContainer for \"5b4039e58e30603bf8f14e2723e66ff9e833d3c5bb8d8e8d6b75e8d196d62593\"" Mar 13 00:34:58.848164 containerd[1635]: time="2026-03-13T00:34:58.848149333Z" level=info msg="connecting to shim 5b4039e58e30603bf8f14e2723e66ff9e833d3c5bb8d8e8d6b75e8d196d62593" 
address="unix:///run/containerd/s/2142c7a87df133f846df359d248f27134e511a2c6ce1d6973f2801241e7f7ff8" protocol=ttrpc version=3 Mar 13 00:34:58.866794 systemd[1]: Started cri-containerd-5b4039e58e30603bf8f14e2723e66ff9e833d3c5bb8d8e8d6b75e8d196d62593.scope - libcontainer container 5b4039e58e30603bf8f14e2723e66ff9e833d3c5bb8d8e8d6b75e8d196d62593. Mar 13 00:34:58.950858 containerd[1635]: time="2026-03-13T00:34:58.950813221Z" level=info msg="StartContainer for \"5b4039e58e30603bf8f14e2723e66ff9e833d3c5bb8d8e8d6b75e8d196d62593\" returns successfully" Mar 13 00:34:59.277842 containerd[1635]: time="2026-03-13T00:34:59.277781219Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 13 00:34:59.278689 containerd[1635]: time="2026-03-13T00:34:59.278665228Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.31.4: active requests=0, bytes read=77" Mar 13 00:34:59.280551 containerd[1635]: time="2026-03-13T00:34:59.280510496Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.31.4\" with image id \"sha256:f7ff80340b9b4973ceda29859065985831588b2898f2b4009f742b5789010898\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:d212af1da3dd52a633bc9e36653a7d901d95a570f8d51d1968a837dcf6879730\", size \"49971841\" in 461.439884ms" Mar 13 00:34:59.280551 containerd[1635]: time="2026-03-13T00:34:59.280539726Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.31.4\" returns image reference \"sha256:f7ff80340b9b4973ceda29859065985831588b2898f2b4009f742b5789010898\"" Mar 13 00:34:59.285060 containerd[1635]: time="2026-03-13T00:34:59.284940512Z" level=info msg="CreateContainer within sandbox \"1858639048c6e9ed2f47ea7ce62a34a0bc194943c2750ffd045727466e44d0ae\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Mar 13 00:34:59.297080 containerd[1635]: 
time="2026-03-13T00:34:59.295988743Z" level=info msg="Container 81a3b2981d408d1fad26bde21e1621eb40798577631952b6975ee4d283d97354: CDI devices from CRI Config.CDIDevices: []" Mar 13 00:34:59.307088 containerd[1635]: time="2026-03-13T00:34:59.307044422Z" level=info msg="CreateContainer within sandbox \"1858639048c6e9ed2f47ea7ce62a34a0bc194943c2750ffd045727466e44d0ae\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"81a3b2981d408d1fad26bde21e1621eb40798577631952b6975ee4d283d97354\"" Mar 13 00:34:59.309565 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2206228889.mount: Deactivated successfully. Mar 13 00:34:59.310513 containerd[1635]: time="2026-03-13T00:34:59.310463170Z" level=info msg="StartContainer for \"81a3b2981d408d1fad26bde21e1621eb40798577631952b6975ee4d283d97354\"" Mar 13 00:34:59.316783 containerd[1635]: time="2026-03-13T00:34:59.315718425Z" level=info msg="connecting to shim 81a3b2981d408d1fad26bde21e1621eb40798577631952b6975ee4d283d97354" address="unix:///run/containerd/s/fb45302c0f3dcf89b00e93377eac61f5190424a5d8643dc4058d0fe1c7064a8b" protocol=ttrpc version=3 Mar 13 00:34:59.346870 systemd[1]: Started cri-containerd-81a3b2981d408d1fad26bde21e1621eb40798577631952b6975ee4d283d97354.scope - libcontainer container 81a3b2981d408d1fad26bde21e1621eb40798577631952b6975ee4d283d97354. 
Mar 13 00:34:59.406141 containerd[1635]: time="2026-03-13T00:34:59.406093512Z" level=info msg="StartContainer for \"81a3b2981d408d1fad26bde21e1621eb40798577631952b6975ee4d283d97354\" returns successfully" Mar 13 00:34:59.727856 kubelet[2798]: I0313 00:34:59.727809 2798 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 Mar 13 00:34:59.729012 kubelet[2798]: I0313 00:34:59.728961 2798 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock Mar 13 00:35:00.001971 kubelet[2798]: I0313 00:35:00.001564 2798 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-tp7j8" podStartSLOduration=30.21176555 podStartE2EDuration="43.00155069s" podCreationTimestamp="2026-03-13 00:34:17 +0000 UTC" firstStartedPulling="2026-03-13 00:34:46.029200272 +0000 UTC m=+46.472455431" lastFinishedPulling="2026-03-13 00:34:58.818985422 +0000 UTC m=+59.262240571" observedRunningTime="2026-03-13 00:34:59.988106593 +0000 UTC m=+60.431361752" watchObservedRunningTime="2026-03-13 00:35:00.00155069 +0000 UTC m=+60.444805849" Mar 13 00:35:00.002601 kubelet[2798]: I0313 00:35:00.002264 2798 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-apiserver-7449d668ff-b29lf" podStartSLOduration=34.792751994 podStartE2EDuration="44.002258211s" podCreationTimestamp="2026-03-13 00:34:16 +0000 UTC" firstStartedPulling="2026-03-13 00:34:50.071697798 +0000 UTC m=+50.514952947" lastFinishedPulling="2026-03-13 00:34:59.281204005 +0000 UTC m=+59.724459164" observedRunningTime="2026-03-13 00:35:00.000813411 +0000 UTC m=+60.444068560" watchObservedRunningTime="2026-03-13 00:35:00.002258211 +0000 UTC m=+60.445513370" Mar 13 00:35:00.951565 kubelet[2798]: I0313 00:35:00.951347 2798 prober_manager.go:312] "Failed to trigger a 
manual run" probe="Readiness" Mar 13 00:35:22.514417 kubelet[2798]: I0313 00:35:22.514322 2798 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 13 00:35:28.863336 kubelet[2798]: I0313 00:35:28.862796 2798 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 13 00:35:30.492851 systemd[1]: Started sshd@7-157.180.117.151:22-4.153.228.146:59526.service - OpenSSH per-connection server daemon (4.153.228.146:59526). Mar 13 00:35:31.176503 sshd[5514]: Accepted publickey for core from 4.153.228.146 port 59526 ssh2: RSA SHA256:ihdQa0i/HnNGvKP5m9obD9eorZ8Lhhc0yafWx7ReGkQ Mar 13 00:35:31.179790 sshd-session[5514]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 13 00:35:31.188173 systemd-logind[1609]: New session 8 of user core. Mar 13 00:35:31.197940 systemd[1]: Started session-8.scope - Session 8 of User core. Mar 13 00:35:31.651144 sshd[5517]: Connection closed by 4.153.228.146 port 59526 Mar 13 00:35:31.652221 sshd-session[5514]: pam_unix(sshd:session): session closed for user core Mar 13 00:35:31.660549 systemd[1]: sshd@7-157.180.117.151:22-4.153.228.146:59526.service: Deactivated successfully. Mar 13 00:35:31.666762 systemd[1]: session-8.scope: Deactivated successfully. Mar 13 00:35:31.668791 systemd-logind[1609]: Session 8 logged out. Waiting for processes to exit. Mar 13 00:35:31.672064 systemd-logind[1609]: Removed session 8. Mar 13 00:35:36.780819 systemd[1]: Started sshd@8-157.180.117.151:22-4.153.228.146:59530.service - OpenSSH per-connection server daemon (4.153.228.146:59530). Mar 13 00:35:37.444062 sshd[5530]: Accepted publickey for core from 4.153.228.146 port 59530 ssh2: RSA SHA256:ihdQa0i/HnNGvKP5m9obD9eorZ8Lhhc0yafWx7ReGkQ Mar 13 00:35:37.447037 sshd-session[5530]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 13 00:35:37.452820 systemd-logind[1609]: New session 9 of user core. 
Mar 13 00:35:37.459345 systemd[1]: Started session-9.scope - Session 9 of User core. Mar 13 00:35:37.878861 sshd[5556]: Connection closed by 4.153.228.146 port 59530 Mar 13 00:35:37.880769 sshd-session[5530]: pam_unix(sshd:session): session closed for user core Mar 13 00:35:37.884154 systemd[1]: sshd@8-157.180.117.151:22-4.153.228.146:59530.service: Deactivated successfully. Mar 13 00:35:37.887874 systemd[1]: session-9.scope: Deactivated successfully. Mar 13 00:35:37.889222 systemd-logind[1609]: Session 9 logged out. Waiting for processes to exit. Mar 13 00:35:37.891856 systemd-logind[1609]: Removed session 9. Mar 13 00:35:43.017113 systemd[1]: Started sshd@9-157.180.117.151:22-4.153.228.146:54466.service - OpenSSH per-connection server daemon (4.153.228.146:54466). Mar 13 00:35:43.689530 sshd[5573]: Accepted publickey for core from 4.153.228.146 port 54466 ssh2: RSA SHA256:ihdQa0i/HnNGvKP5m9obD9eorZ8Lhhc0yafWx7ReGkQ Mar 13 00:35:43.691519 sshd-session[5573]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 13 00:35:43.699370 systemd-logind[1609]: New session 10 of user core. Mar 13 00:35:43.706935 systemd[1]: Started session-10.scope - Session 10 of User core. Mar 13 00:35:44.149503 sshd[5576]: Connection closed by 4.153.228.146 port 54466 Mar 13 00:35:44.151987 sshd-session[5573]: pam_unix(sshd:session): session closed for user core Mar 13 00:35:44.158590 systemd-logind[1609]: Session 10 logged out. Waiting for processes to exit. Mar 13 00:35:44.160044 systemd[1]: sshd@9-157.180.117.151:22-4.153.228.146:54466.service: Deactivated successfully. Mar 13 00:35:44.163892 systemd[1]: session-10.scope: Deactivated successfully. Mar 13 00:35:44.166828 systemd-logind[1609]: Removed session 10. Mar 13 00:35:49.292771 systemd[1]: Started sshd@10-157.180.117.151:22-4.153.228.146:48712.service - OpenSSH per-connection server daemon (4.153.228.146:48712). 
Mar 13 00:35:49.956934 sshd[5653]: Accepted publickey for core from 4.153.228.146 port 48712 ssh2: RSA SHA256:ihdQa0i/HnNGvKP5m9obD9eorZ8Lhhc0yafWx7ReGkQ Mar 13 00:35:49.959388 sshd-session[5653]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 13 00:35:49.968808 systemd-logind[1609]: New session 11 of user core. Mar 13 00:35:49.977919 systemd[1]: Started session-11.scope - Session 11 of User core. Mar 13 00:35:50.383080 sshd[5656]: Connection closed by 4.153.228.146 port 48712 Mar 13 00:35:50.384158 sshd-session[5653]: pam_unix(sshd:session): session closed for user core Mar 13 00:35:50.391597 systemd-logind[1609]: Session 11 logged out. Waiting for processes to exit. Mar 13 00:35:50.392165 systemd[1]: sshd@10-157.180.117.151:22-4.153.228.146:48712.service: Deactivated successfully. Mar 13 00:35:50.397187 systemd[1]: session-11.scope: Deactivated successfully. Mar 13 00:35:50.400598 systemd-logind[1609]: Removed session 11. Mar 13 00:35:50.524746 systemd[1]: Started sshd@11-157.180.117.151:22-4.153.228.146:48722.service - OpenSSH per-connection server daemon (4.153.228.146:48722). Mar 13 00:35:51.194397 sshd[5669]: Accepted publickey for core from 4.153.228.146 port 48722 ssh2: RSA SHA256:ihdQa0i/HnNGvKP5m9obD9eorZ8Lhhc0yafWx7ReGkQ Mar 13 00:35:51.196990 sshd-session[5669]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 13 00:35:51.204769 systemd-logind[1609]: New session 12 of user core. Mar 13 00:35:51.212866 systemd[1]: Started session-12.scope - Session 12 of User core. Mar 13 00:35:51.684795 sshd[5672]: Connection closed by 4.153.228.146 port 48722 Mar 13 00:35:51.687133 sshd-session[5669]: pam_unix(sshd:session): session closed for user core Mar 13 00:35:51.696078 systemd[1]: sshd@11-157.180.117.151:22-4.153.228.146:48722.service: Deactivated successfully. Mar 13 00:35:51.700730 systemd[1]: session-12.scope: Deactivated successfully. 
Mar 13 00:35:51.702576 systemd-logind[1609]: Session 12 logged out. Waiting for processes to exit.
Mar 13 00:35:51.705624 systemd-logind[1609]: Removed session 12.
Mar 13 00:35:51.822302 systemd[1]: Started sshd@12-157.180.117.151:22-4.153.228.146:48730.service - OpenSSH per-connection server daemon (4.153.228.146:48730).
Mar 13 00:35:52.479489 sshd[5682]: Accepted publickey for core from 4.153.228.146 port 48730 ssh2: RSA SHA256:ihdQa0i/HnNGvKP5m9obD9eorZ8Lhhc0yafWx7ReGkQ
Mar 13 00:35:52.482246 sshd-session[5682]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 13 00:35:52.491807 systemd-logind[1609]: New session 13 of user core.
Mar 13 00:35:52.499029 systemd[1]: Started session-13.scope - Session 13 of User core.
Mar 13 00:35:52.949121 sshd[5685]: Connection closed by 4.153.228.146 port 48730
Mar 13 00:35:52.950965 sshd-session[5682]: pam_unix(sshd:session): session closed for user core
Mar 13 00:35:52.958359 systemd[1]: sshd@12-157.180.117.151:22-4.153.228.146:48730.service: Deactivated successfully.
Mar 13 00:35:52.963019 systemd[1]: session-13.scope: Deactivated successfully.
Mar 13 00:35:52.964867 systemd-logind[1609]: Session 13 logged out. Waiting for processes to exit.
Mar 13 00:35:52.968040 systemd-logind[1609]: Removed session 13.
Mar 13 00:35:58.082598 systemd[1]: Started sshd@13-157.180.117.151:22-4.153.228.146:48744.service - OpenSSH per-connection server daemon (4.153.228.146:48744).
Mar 13 00:35:58.741478 sshd[5728]: Accepted publickey for core from 4.153.228.146 port 48744 ssh2: RSA SHA256:ihdQa0i/HnNGvKP5m9obD9eorZ8Lhhc0yafWx7ReGkQ
Mar 13 00:35:58.743535 sshd-session[5728]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 13 00:35:58.751943 systemd-logind[1609]: New session 14 of user core.
Mar 13 00:35:58.756798 systemd[1]: Started session-14.scope - Session 14 of User core.
Mar 13 00:35:59.165995 sshd[5731]: Connection closed by 4.153.228.146 port 48744
Mar 13 00:35:59.167023 sshd-session[5728]: pam_unix(sshd:session): session closed for user core
Mar 13 00:35:59.174522 systemd[1]: sshd@13-157.180.117.151:22-4.153.228.146:48744.service: Deactivated successfully.
Mar 13 00:35:59.181006 systemd[1]: session-14.scope: Deactivated successfully.
Mar 13 00:35:59.182983 systemd-logind[1609]: Session 14 logged out. Waiting for processes to exit.
Mar 13 00:35:59.185975 systemd-logind[1609]: Removed session 14.
Mar 13 00:35:59.302044 systemd[1]: Started sshd@14-157.180.117.151:22-4.153.228.146:44886.service - OpenSSH per-connection server daemon (4.153.228.146:44886).
Mar 13 00:35:59.960121 sshd[5767]: Accepted publickey for core from 4.153.228.146 port 44886 ssh2: RSA SHA256:ihdQa0i/HnNGvKP5m9obD9eorZ8Lhhc0yafWx7ReGkQ
Mar 13 00:35:59.963296 sshd-session[5767]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 13 00:35:59.971529 systemd-logind[1609]: New session 15 of user core.
Mar 13 00:35:59.976088 systemd[1]: Started session-15.scope - Session 15 of User core.
Mar 13 00:36:00.580842 sshd[5772]: Connection closed by 4.153.228.146 port 44886
Mar 13 00:36:00.582962 sshd-session[5767]: pam_unix(sshd:session): session closed for user core
Mar 13 00:36:00.590513 systemd-logind[1609]: Session 15 logged out. Waiting for processes to exit.
Mar 13 00:36:00.590927 systemd[1]: sshd@14-157.180.117.151:22-4.153.228.146:44886.service: Deactivated successfully.
Mar 13 00:36:00.595616 systemd[1]: session-15.scope: Deactivated successfully.
Mar 13 00:36:00.599591 systemd-logind[1609]: Removed session 15.
Mar 13 00:36:00.717920 systemd[1]: Started sshd@15-157.180.117.151:22-4.153.228.146:44894.service - OpenSSH per-connection server daemon (4.153.228.146:44894).
Mar 13 00:36:01.396201 sshd[5782]: Accepted publickey for core from 4.153.228.146 port 44894 ssh2: RSA SHA256:ihdQa0i/HnNGvKP5m9obD9eorZ8Lhhc0yafWx7ReGkQ
Mar 13 00:36:01.397069 sshd-session[5782]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 13 00:36:01.401997 systemd-logind[1609]: New session 16 of user core.
Mar 13 00:36:01.413849 systemd[1]: Started session-16.scope - Session 16 of User core.
Mar 13 00:36:02.396030 sshd[5785]: Connection closed by 4.153.228.146 port 44894
Mar 13 00:36:02.397366 sshd-session[5782]: pam_unix(sshd:session): session closed for user core
Mar 13 00:36:02.401359 systemd[1]: sshd@15-157.180.117.151:22-4.153.228.146:44894.service: Deactivated successfully.
Mar 13 00:36:02.403300 systemd[1]: session-16.scope: Deactivated successfully.
Mar 13 00:36:02.404463 systemd-logind[1609]: Session 16 logged out. Waiting for processes to exit.
Mar 13 00:36:02.406286 systemd-logind[1609]: Removed session 16.
Mar 13 00:36:02.538177 systemd[1]: Started sshd@16-157.180.117.151:22-4.153.228.146:44902.service - OpenSSH per-connection server daemon (4.153.228.146:44902).
Mar 13 00:36:03.186584 sshd[5813]: Accepted publickey for core from 4.153.228.146 port 44902 ssh2: RSA SHA256:ihdQa0i/HnNGvKP5m9obD9eorZ8Lhhc0yafWx7ReGkQ
Mar 13 00:36:03.189326 sshd-session[5813]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 13 00:36:03.199316 systemd-logind[1609]: New session 17 of user core.
Mar 13 00:36:03.213904 systemd[1]: Started session-17.scope - Session 17 of User core.
Mar 13 00:36:03.741219 sshd[5816]: Connection closed by 4.153.228.146 port 44902
Mar 13 00:36:03.743105 sshd-session[5813]: pam_unix(sshd:session): session closed for user core
Mar 13 00:36:03.751537 systemd[1]: sshd@16-157.180.117.151:22-4.153.228.146:44902.service: Deactivated successfully.
Mar 13 00:36:03.756799 systemd[1]: session-17.scope: Deactivated successfully.
Mar 13 00:36:03.759402 systemd-logind[1609]: Session 17 logged out. Waiting for processes to exit.
Mar 13 00:36:03.763191 systemd-logind[1609]: Removed session 17.
Mar 13 00:36:03.878392 systemd[1]: Started sshd@17-157.180.117.151:22-4.153.228.146:44916.service - OpenSSH per-connection server daemon (4.153.228.146:44916).
Mar 13 00:36:04.543038 sshd[5828]: Accepted publickey for core from 4.153.228.146 port 44916 ssh2: RSA SHA256:ihdQa0i/HnNGvKP5m9obD9eorZ8Lhhc0yafWx7ReGkQ
Mar 13 00:36:04.545079 sshd-session[5828]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 13 00:36:04.552717 systemd-logind[1609]: New session 18 of user core.
Mar 13 00:36:04.559466 systemd[1]: Started session-18.scope - Session 18 of User core.
Mar 13 00:36:04.956636 sshd[5831]: Connection closed by 4.153.228.146 port 44916
Mar 13 00:36:04.957219 sshd-session[5828]: pam_unix(sshd:session): session closed for user core
Mar 13 00:36:04.962039 systemd[1]: sshd@17-157.180.117.151:22-4.153.228.146:44916.service: Deactivated successfully.
Mar 13 00:36:04.964037 systemd[1]: session-18.scope: Deactivated successfully.
Mar 13 00:36:04.965063 systemd-logind[1609]: Session 18 logged out. Waiting for processes to exit.
Mar 13 00:36:04.967785 systemd-logind[1609]: Removed session 18.
Mar 13 00:36:10.094281 systemd[1]: Started sshd@18-157.180.117.151:22-4.153.228.146:37328.service - OpenSSH per-connection server daemon (4.153.228.146:37328).
Mar 13 00:36:10.757996 sshd[5906]: Accepted publickey for core from 4.153.228.146 port 37328 ssh2: RSA SHA256:ihdQa0i/HnNGvKP5m9obD9eorZ8Lhhc0yafWx7ReGkQ
Mar 13 00:36:10.760513 sshd-session[5906]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 13 00:36:10.770257 systemd-logind[1609]: New session 19 of user core.
Mar 13 00:36:10.776625 systemd[1]: Started session-19.scope - Session 19 of User core.
Mar 13 00:36:11.214630 sshd[5913]: Connection closed by 4.153.228.146 port 37328
Mar 13 00:36:11.216377 sshd-session[5906]: pam_unix(sshd:session): session closed for user core
Mar 13 00:36:11.220967 systemd[1]: sshd@18-157.180.117.151:22-4.153.228.146:37328.service: Deactivated successfully.
Mar 13 00:36:11.225451 systemd[1]: session-19.scope: Deactivated successfully.
Mar 13 00:36:11.227331 systemd-logind[1609]: Session 19 logged out. Waiting for processes to exit.
Mar 13 00:36:11.229431 systemd-logind[1609]: Removed session 19.
Mar 13 00:36:16.351342 systemd[1]: Started sshd@19-157.180.117.151:22-4.153.228.146:37338.service - OpenSSH per-connection server daemon (4.153.228.146:37338).
Mar 13 00:36:16.997426 sshd[5924]: Accepted publickey for core from 4.153.228.146 port 37338 ssh2: RSA SHA256:ihdQa0i/HnNGvKP5m9obD9eorZ8Lhhc0yafWx7ReGkQ
Mar 13 00:36:16.998814 sshd-session[5924]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 13 00:36:17.004693 systemd-logind[1609]: New session 20 of user core.
Mar 13 00:36:17.007799 systemd[1]: Started session-20.scope - Session 20 of User core.
Mar 13 00:36:17.447899 sshd[5927]: Connection closed by 4.153.228.146 port 37338
Mar 13 00:36:17.450013 sshd-session[5924]: pam_unix(sshd:session): session closed for user core
Mar 13 00:36:17.457407 systemd[1]: sshd@19-157.180.117.151:22-4.153.228.146:37338.service: Deactivated successfully.
Mar 13 00:36:17.461464 systemd[1]: session-20.scope: Deactivated successfully.
Mar 13 00:36:17.464305 systemd-logind[1609]: Session 20 logged out. Waiting for processes to exit.
Mar 13 00:36:17.467471 systemd-logind[1609]: Removed session 20.
Mar 13 00:36:34.064591 kubelet[2798]: E0313 00:36:34.064355 2798 controller.go:195] "Failed to update lease" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.0.3:39642->10.0.0.2:2379: read: connection timed out"
Mar 13 00:36:34.293621 systemd[1]: cri-containerd-65aceb5dd321c420641e76d80efbf4a119009e355c6c74183d40bbbb4a4208bb.scope: Deactivated successfully.
Mar 13 00:36:34.295755 systemd[1]: cri-containerd-65aceb5dd321c420641e76d80efbf4a119009e355c6c74183d40bbbb4a4208bb.scope: Consumed 9.015s CPU time, 144.9M memory peak, 420K read from disk.
Mar 13 00:36:34.301155 containerd[1635]: time="2026-03-13T00:36:34.300963159Z" level=info msg="received container exit event container_id:\"65aceb5dd321c420641e76d80efbf4a119009e355c6c74183d40bbbb4a4208bb\" id:\"65aceb5dd321c420641e76d80efbf4a119009e355c6c74183d40bbbb4a4208bb\" pid:3124 exit_status:1 exited_at:{seconds:1773362194 nanos:294581625}"
Mar 13 00:36:34.341122 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-65aceb5dd321c420641e76d80efbf4a119009e355c6c74183d40bbbb4a4208bb-rootfs.mount: Deactivated successfully.
Mar 13 00:36:35.136073 systemd[1]: cri-containerd-a230832493d975dc2294a1a795812e97fb49e31f405b48b8eb56fb9b457939c9.scope: Deactivated successfully.
Mar 13 00:36:35.136342 systemd[1]: cri-containerd-a230832493d975dc2294a1a795812e97fb49e31f405b48b8eb56fb9b457939c9.scope: Consumed 3.241s CPU time, 63.1M memory peak, 64K read from disk.
Mar 13 00:36:35.138491 containerd[1635]: time="2026-03-13T00:36:35.138238855Z" level=info msg="received container exit event container_id:\"a230832493d975dc2294a1a795812e97fb49e31f405b48b8eb56fb9b457939c9\" id:\"a230832493d975dc2294a1a795812e97fb49e31f405b48b8eb56fb9b457939c9\" pid:2640 exit_status:1 exited_at:{seconds:1773362195 nanos:137870597}"
Mar 13 00:36:35.170381 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-a230832493d975dc2294a1a795812e97fb49e31f405b48b8eb56fb9b457939c9-rootfs.mount: Deactivated successfully.
Mar 13 00:36:35.201605 kubelet[2798]: I0313 00:36:35.201514 2798 scope.go:117] "RemoveContainer" containerID="a230832493d975dc2294a1a795812e97fb49e31f405b48b8eb56fb9b457939c9"
Mar 13 00:36:35.205232 kubelet[2798]: I0313 00:36:35.205171 2798 scope.go:117] "RemoveContainer" containerID="65aceb5dd321c420641e76d80efbf4a119009e355c6c74183d40bbbb4a4208bb"
Mar 13 00:36:35.205754 containerd[1635]: time="2026-03-13T00:36:35.205712081Z" level=info msg="CreateContainer within sandbox \"10513e33c823dfcc620f1a742664acb0600348b85fe0c8869c2ff6d7a5c9facb\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:1,}"
Mar 13 00:36:35.209570 containerd[1635]: time="2026-03-13T00:36:35.208348482Z" level=info msg="CreateContainer within sandbox \"a5d42e4488fe1795bb7bada0948db2c21dffd231c031065f7119f8886c45bf68\" for container &ContainerMetadata{Name:tigera-operator,Attempt:1,}"
Mar 13 00:36:35.221447 containerd[1635]: time="2026-03-13T00:36:35.221357079Z" level=info msg="Container fadc74e1cfcf021bc15da42519d99f2ab8ef75b98c9682124217e79fd93638b0: CDI devices from CRI Config.CDIDevices: []"
Mar 13 00:36:35.223987 containerd[1635]: time="2026-03-13T00:36:35.223951691Z" level=info msg="Container c178f7118a2ae4577e7a39753838f33440ed7f680ccd2d5a6787b441c9d1865d: CDI devices from CRI Config.CDIDevices: []"
Mar 13 00:36:35.224847 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1665081008.mount: Deactivated successfully.
Mar 13 00:36:35.236531 containerd[1635]: time="2026-03-13T00:36:35.236474001Z" level=info msg="CreateContainer within sandbox \"a5d42e4488fe1795bb7bada0948db2c21dffd231c031065f7119f8886c45bf68\" for &ContainerMetadata{Name:tigera-operator,Attempt:1,} returns container id \"fadc74e1cfcf021bc15da42519d99f2ab8ef75b98c9682124217e79fd93638b0\""
Mar 13 00:36:35.237354 containerd[1635]: time="2026-03-13T00:36:35.237183985Z" level=info msg="StartContainer for \"fadc74e1cfcf021bc15da42519d99f2ab8ef75b98c9682124217e79fd93638b0\""
Mar 13 00:36:35.238230 containerd[1635]: time="2026-03-13T00:36:35.238204129Z" level=info msg="connecting to shim fadc74e1cfcf021bc15da42519d99f2ab8ef75b98c9682124217e79fd93638b0" address="unix:///run/containerd/s/c2f46b5cd4bf9b2f87cd24084935310e23bdaaf7ba7e83a8a0f769c25b6469cf" protocol=ttrpc version=3
Mar 13 00:36:35.241188 containerd[1635]: time="2026-03-13T00:36:35.241159197Z" level=info msg="CreateContainer within sandbox \"10513e33c823dfcc620f1a742664acb0600348b85fe0c8869c2ff6d7a5c9facb\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:1,} returns container id \"c178f7118a2ae4577e7a39753838f33440ed7f680ccd2d5a6787b441c9d1865d\""
Mar 13 00:36:35.243257 containerd[1635]: time="2026-03-13T00:36:35.243187343Z" level=info msg="StartContainer for \"c178f7118a2ae4577e7a39753838f33440ed7f680ccd2d5a6787b441c9d1865d\""
Mar 13 00:36:35.244327 containerd[1635]: time="2026-03-13T00:36:35.244284925Z" level=info msg="connecting to shim c178f7118a2ae4577e7a39753838f33440ed7f680ccd2d5a6787b441c9d1865d" address="unix:///run/containerd/s/823ce0d920924546cfad431aaf456d3d500f5f5cb4f7ca4fd26dcac365e4840c" protocol=ttrpc version=3
Mar 13 00:36:35.262752 systemd[1]: Started cri-containerd-c178f7118a2ae4577e7a39753838f33440ed7f680ccd2d5a6787b441c9d1865d.scope - libcontainer container c178f7118a2ae4577e7a39753838f33440ed7f680ccd2d5a6787b441c9d1865d.
Mar 13 00:36:35.266143 systemd[1]: Started cri-containerd-fadc74e1cfcf021bc15da42519d99f2ab8ef75b98c9682124217e79fd93638b0.scope - libcontainer container fadc74e1cfcf021bc15da42519d99f2ab8ef75b98c9682124217e79fd93638b0.
Mar 13 00:36:35.303624 containerd[1635]: time="2026-03-13T00:36:35.303578710Z" level=info msg="StartContainer for \"fadc74e1cfcf021bc15da42519d99f2ab8ef75b98c9682124217e79fd93638b0\" returns successfully"
Mar 13 00:36:35.322748 containerd[1635]: time="2026-03-13T00:36:35.322723823Z" level=info msg="StartContainer for \"c178f7118a2ae4577e7a39753838f33440ed7f680ccd2d5a6787b441c9d1865d\" returns successfully"
Mar 13 00:36:38.224066 kubelet[2798]: E0313 00:36:38.222017 2798 event.go:359] "Server rejected event (will not retry!)" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.0.3:39282->10.0.0.2:2379: read: connection timed out" event="&Event{ObjectMeta:{kube-apiserver-ci-4459-2-4-n-bcba60e63d.189c3f839cc9f144 kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:kube-apiserver-ci-4459-2-4-n-bcba60e63d,UID:4386427b1588f042cb1f28dbcb739b96,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Readiness probe failed: HTTP probe failed with statuscode: 500,Source:EventSource{Component:kubelet,Host:ci-4459-2-4-n-bcba60e63d,},FirstTimestamp:2026-03-13 00:36:27.78087866 +0000 UTC m=+148.224133819,LastTimestamp:2026-03-13 00:36:27.78087866 +0000 UTC m=+148.224133819,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4459-2-4-n-bcba60e63d,}"
Mar 13 00:36:38.599005 systemd[1]: cri-containerd-163d4ee38a5e09fc1a8dad99026cb14a6cec329b41803f3100049974d82c78fb.scope: Deactivated successfully.
Mar 13 00:36:38.601019 systemd[1]: cri-containerd-163d4ee38a5e09fc1a8dad99026cb14a6cec329b41803f3100049974d82c78fb.scope: Consumed 1.974s CPU time, 23.7M memory peak.
Mar 13 00:36:38.601270 containerd[1635]: time="2026-03-13T00:36:38.601048619Z" level=info msg="received container exit event container_id:\"163d4ee38a5e09fc1a8dad99026cb14a6cec329b41803f3100049974d82c78fb\" id:\"163d4ee38a5e09fc1a8dad99026cb14a6cec329b41803f3100049974d82c78fb\" pid:2612 exit_status:1 exited_at:{seconds:1773362198 nanos:600759721}"
Mar 13 00:36:38.640355 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-163d4ee38a5e09fc1a8dad99026cb14a6cec329b41803f3100049974d82c78fb-rootfs.mount: Deactivated successfully.
Mar 13 00:36:39.224062 kubelet[2798]: I0313 00:36:39.223997 2798 scope.go:117] "RemoveContainer" containerID="163d4ee38a5e09fc1a8dad99026cb14a6cec329b41803f3100049974d82c78fb"
Mar 13 00:36:39.227707 containerd[1635]: time="2026-03-13T00:36:39.226732084Z" level=info msg="CreateContainer within sandbox \"045702106dbddad2202b15837562c871c2c34091f39d6a9b519ccedcf04338c7\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:1,}"
Mar 13 00:36:39.241106 containerd[1635]: time="2026-03-13T00:36:39.241057797Z" level=info msg="Container 695e9c0cce1991c085e0d1490f46d7cf64422b94dfc607d70a5d3b06fc584276: CDI devices from CRI Config.CDIDevices: []"
Mar 13 00:36:39.252436 containerd[1635]: time="2026-03-13T00:36:39.252351541Z" level=info msg="CreateContainer within sandbox \"045702106dbddad2202b15837562c871c2c34091f39d6a9b519ccedcf04338c7\" for &ContainerMetadata{Name:kube-scheduler,Attempt:1,} returns container id \"695e9c0cce1991c085e0d1490f46d7cf64422b94dfc607d70a5d3b06fc584276\""
Mar 13 00:36:39.253340 containerd[1635]: time="2026-03-13T00:36:39.253278744Z" level=info msg="StartContainer for \"695e9c0cce1991c085e0d1490f46d7cf64422b94dfc607d70a5d3b06fc584276\""
Mar 13 00:36:39.256218 containerd[1635]: time="2026-03-13T00:36:39.256179174Z" level=info msg="connecting to shim 695e9c0cce1991c085e0d1490f46d7cf64422b94dfc607d70a5d3b06fc584276" address="unix:///run/containerd/s/e450c08987ad390712ecc9a6b334dcc0151958b409b2fa4357d2ca8361f493a5" protocol=ttrpc version=3
Mar 13 00:36:39.295840 systemd[1]: Started cri-containerd-695e9c0cce1991c085e0d1490f46d7cf64422b94dfc607d70a5d3b06fc584276.scope - libcontainer container 695e9c0cce1991c085e0d1490f46d7cf64422b94dfc607d70a5d3b06fc584276.
Mar 13 00:36:39.356859 containerd[1635]: time="2026-03-13T00:36:39.356731044Z" level=info msg="StartContainer for \"695e9c0cce1991c085e0d1490f46d7cf64422b94dfc607d70a5d3b06fc584276\" returns successfully"