Mar 3 13:34:55.902503 kernel: Linux version 6.12.74-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 14.3.0 p8) 14.3.0, GNU ld (Gentoo 2.44 p4) 2.44.0) #1 SMP PREEMPT_DYNAMIC Tue Mar 3 10:59:45 -00 2026
Mar 3 13:34:55.902523 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=hetzner verity.usrhash=51ade538e3d3c371f07ae1ec6fa9803fff0566ec060cf4b56dc685fc36d0e01c
Mar 3 13:34:55.902530 kernel: BIOS-provided physical RAM map:
Mar 3 13:34:55.902535 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009ffff] usable
Mar 3 13:34:55.902542 kernel: BIOS-e820: [mem 0x0000000000100000-0x000000007ed3efff] usable
Mar 3 13:34:55.902547 kernel: BIOS-e820: [mem 0x000000007ed3f000-0x000000007edfffff] reserved
Mar 3 13:34:55.902552 kernel: BIOS-e820: [mem 0x000000007ee00000-0x000000007f8ecfff] usable
Mar 3 13:34:55.902557 kernel: BIOS-e820: [mem 0x000000007f8ed000-0x000000007fb6cfff] reserved
Mar 3 13:34:55.902562 kernel: BIOS-e820: [mem 0x000000007fb6d000-0x000000007fb7efff] ACPI data
Mar 3 13:34:55.902567 kernel: BIOS-e820: [mem 0x000000007fb7f000-0x000000007fbfefff] ACPI NVS
Mar 3 13:34:55.902572 kernel: BIOS-e820: [mem 0x000000007fbff000-0x000000007ff7bfff] usable
Mar 3 13:34:55.902576 kernel: BIOS-e820: [mem 0x000000007ff7c000-0x000000007fffffff] reserved
Mar 3 13:34:55.902581 kernel: BIOS-e820: [mem 0x00000000e0000000-0x00000000efffffff] reserved
Mar 3 13:34:55.902588 kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Mar 3 13:34:55.902593 kernel: BIOS-e820: [mem 0x00000000ffc00000-0x00000000ffffffff] reserved
Mar 3 13:34:55.902598 kernel: BIOS-e820: [mem 0x0000000100000000-0x0000000179ffffff] usable
Mar 3 13:34:55.902603 kernel: BIOS-e820: [mem 0x000000fd00000000-0x000000ffffffffff] reserved
Mar 3 13:34:55.902608 kernel: NX (Execute Disable) protection: active
Mar 3 13:34:55.902615 kernel: APIC: Static calls initialized
Mar 3 13:34:55.902620 kernel: e820: update [mem 0x7dfac018-0x7dfb5a57] usable ==> usable
Mar 3 13:34:55.902625 kernel: e820: update [mem 0x7df70018-0x7dfab657] usable ==> usable
Mar 3 13:34:55.902630 kernel: e820: update [mem 0x7df34018-0x7df6f657] usable ==> usable
Mar 3 13:34:55.902635 kernel: extended physical RAM map:
Mar 3 13:34:55.902640 kernel: reserve setup_data: [mem 0x0000000000000000-0x000000000009ffff] usable
Mar 3 13:34:55.902645 kernel: reserve setup_data: [mem 0x0000000000100000-0x000000007df34017] usable
Mar 3 13:34:55.902649 kernel: reserve setup_data: [mem 0x000000007df34018-0x000000007df6f657] usable
Mar 3 13:34:55.902654 kernel: reserve setup_data: [mem 0x000000007df6f658-0x000000007df70017] usable
Mar 3 13:34:55.902659 kernel: reserve setup_data: [mem 0x000000007df70018-0x000000007dfab657] usable
Mar 3 13:34:55.902676 kernel: reserve setup_data: [mem 0x000000007dfab658-0x000000007dfac017] usable
Mar 3 13:34:55.902694 kernel: reserve setup_data: [mem 0x000000007dfac018-0x000000007dfb5a57] usable
Mar 3 13:34:55.902699 kernel: reserve setup_data: [mem 0x000000007dfb5a58-0x000000007ed3efff] usable
Mar 3 13:34:55.902704 kernel: reserve setup_data: [mem 0x000000007ed3f000-0x000000007edfffff] reserved
Mar 3 13:34:55.902708 kernel: reserve setup_data: [mem 0x000000007ee00000-0x000000007f8ecfff] usable
Mar 3 13:34:55.902713 kernel: reserve setup_data: [mem 0x000000007f8ed000-0x000000007fb6cfff] reserved
Mar 3 13:34:55.902718 kernel: reserve setup_data: [mem 0x000000007fb6d000-0x000000007fb7efff] ACPI data
Mar 3 13:34:55.902723 kernel: reserve setup_data: [mem 0x000000007fb7f000-0x000000007fbfefff] ACPI NVS
Mar 3 13:34:55.902728 kernel: reserve setup_data: [mem 0x000000007fbff000-0x000000007ff7bfff] usable
Mar 3 13:34:55.902733 kernel: reserve setup_data: [mem 0x000000007ff7c000-0x000000007fffffff] reserved
Mar 3 13:34:55.902738 kernel: reserve setup_data: [mem 0x00000000e0000000-0x00000000efffffff] reserved
Mar 3 13:34:55.902743 kernel: reserve setup_data: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Mar 3 13:34:55.902752 kernel: reserve setup_data: [mem 0x00000000ffc00000-0x00000000ffffffff] reserved
Mar 3 13:34:55.902758 kernel: reserve setup_data: [mem 0x0000000100000000-0x0000000179ffffff] usable
Mar 3 13:34:55.902763 kernel: reserve setup_data: [mem 0x000000fd00000000-0x000000ffffffffff] reserved
Mar 3 13:34:55.902768 kernel: efi: EFI v2.7 by Ubuntu distribution of EDK II
Mar 3 13:34:55.902773 kernel: efi: SMBIOS=0x7f988000 SMBIOS 3.0=0x7f986000 ACPI=0x7fb7e000 ACPI 2.0=0x7fb7e014 MEMATTR=0x7e845198 RNG=0x7fb73018
Mar 3 13:34:55.902781 kernel: random: crng init done
Mar 3 13:34:55.902786 kernel: efi: Remove mem136: MMIO range=[0xffc00000-0xffffffff] (4MB) from e820 map
Mar 3 13:34:55.902791 kernel: e820: remove [mem 0xffc00000-0xffffffff] reserved
Mar 3 13:34:55.902796 kernel: secureboot: Secure boot disabled
Mar 3 13:34:55.902801 kernel: SMBIOS 3.0.0 present.
Mar 3 13:34:55.902806 kernel: DMI: Hetzner vServer/Standard PC (Q35 + ICH9, 2009), BIOS 20171111 11/11/2017
Mar 3 13:34:55.902811 kernel: DMI: Memory slots populated: 1/1
Mar 3 13:34:55.902816 kernel: Hypervisor detected: KVM
Mar 3 13:34:55.902821 kernel: last_pfn = 0x7ff7c max_arch_pfn = 0x10000000000
Mar 3 13:34:55.902826 kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Mar 3 13:34:55.902831 kernel: kvm-clock: using sched offset of 13606548030 cycles
Mar 3 13:34:55.902839 kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Mar 3 13:34:55.902844 kernel: tsc: Detected 2396.398 MHz processor
Mar 3 13:34:55.902850 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Mar 3 13:34:55.902855 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Mar 3 13:34:55.902860 kernel: last_pfn = 0x17a000 max_arch_pfn = 0x10000000000
Mar 3 13:34:55.902865 kernel: MTRR map: 4 entries (2 fixed + 2 variable; max 18), built from 8 variable MTRRs
Mar 3 13:34:55.902871 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT
Mar 3 13:34:55.902876 kernel: last_pfn = 0x7ff7c max_arch_pfn = 0x10000000000
Mar 3 13:34:55.902881 kernel: Using GB pages for direct mapping
Mar 3 13:34:55.902888 kernel: ACPI: Early table checksum verification disabled
Mar 3 13:34:55.902893 kernel: ACPI: RSDP 0x000000007FB7E014 000024 (v02 BOCHS )
Mar 3 13:34:55.902899 kernel: ACPI: XSDT 0x000000007FB7D0E8 000054 (v01 BOCHS BXPC 00000001 01000013)
Mar 3 13:34:55.902904 kernel: ACPI: FACP 0x000000007FB79000 0000F4 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Mar 3 13:34:55.902909 kernel: ACPI: DSDT 0x000000007FB7A000 002443 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Mar 3 13:34:55.902915 kernel: ACPI: FACS 0x000000007FBDD000 000040
Mar 3 13:34:55.902923 kernel: ACPI: APIC 0x000000007FB78000 000080 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Mar 3 13:34:55.902931 kernel: ACPI: HPET 0x000000007FB77000 000038 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Mar 3 13:34:55.902939 kernel: ACPI: MCFG 0x000000007FB76000 00003C (v01 BOCHS BXPC 00000001 BXPC 00000001)
Mar 3 13:34:55.902950 kernel: ACPI: WAET 0x000000007FB75000 000028 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Mar 3 13:34:55.902958 kernel: ACPI: BGRT 0x000000007FB74000 000038 (v01 INTEL EDK2 00000002 01000013)
Mar 3 13:34:55.902965 kernel: ACPI: Reserving FACP table memory at [mem 0x7fb79000-0x7fb790f3]
Mar 3 13:34:55.902972 kernel: ACPI: Reserving DSDT table memory at [mem 0x7fb7a000-0x7fb7c442]
Mar 3 13:34:55.902980 kernel: ACPI: Reserving FACS table memory at [mem 0x7fbdd000-0x7fbdd03f]
Mar 3 13:34:55.902988 kernel: ACPI: Reserving APIC table memory at [mem 0x7fb78000-0x7fb7807f]
Mar 3 13:34:55.902994 kernel: ACPI: Reserving HPET table memory at [mem 0x7fb77000-0x7fb77037]
Mar 3 13:34:55.903000 kernel: ACPI: Reserving MCFG table memory at [mem 0x7fb76000-0x7fb7603b]
Mar 3 13:34:55.903005 kernel: ACPI: Reserving WAET table memory at [mem 0x7fb75000-0x7fb75027]
Mar 3 13:34:55.903013 kernel: ACPI: Reserving BGRT table memory at [mem 0x7fb74000-0x7fb74037]
Mar 3 13:34:55.903018 kernel: No NUMA configuration found
Mar 3 13:34:55.903023 kernel: Faking a node at [mem 0x0000000000000000-0x0000000179ffffff]
Mar 3 13:34:55.903028 kernel: NODE_DATA(0) allocated [mem 0x179ff6dc0-0x179ffdfff]
Mar 3 13:34:55.903034 kernel: Zone ranges:
Mar 3 13:34:55.903039 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff]
Mar 3 13:34:55.903044 kernel: DMA32 [mem 0x0000000001000000-0x00000000ffffffff]
Mar 3 13:34:55.903049 kernel: Normal [mem 0x0000000100000000-0x0000000179ffffff]
Mar 3 13:34:55.903054 kernel: Device empty
Mar 3 13:34:55.903062 kernel: Movable zone start for each node
Mar 3 13:34:55.903067 kernel: Early memory node ranges
Mar 3 13:34:55.903072 kernel: node 0: [mem 0x0000000000001000-0x000000000009ffff]
Mar 3 13:34:55.903077 kernel: node 0: [mem 0x0000000000100000-0x000000007ed3efff]
Mar 3 13:34:55.903082 kernel: node 0: [mem 0x000000007ee00000-0x000000007f8ecfff]
Mar 3 13:34:55.903087 kernel: node 0: [mem 0x000000007fbff000-0x000000007ff7bfff]
Mar 3 13:34:55.903092 kernel: node 0: [mem 0x0000000100000000-0x0000000179ffffff]
Mar 3 13:34:55.903097 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x0000000179ffffff]
Mar 3 13:34:55.903103 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Mar 3 13:34:55.903108 kernel: On node 0, zone DMA: 96 pages in unavailable ranges
Mar 3 13:34:55.903115 kernel: On node 0, zone DMA32: 193 pages in unavailable ranges
Mar 3 13:34:55.903120 kernel: On node 0, zone DMA32: 786 pages in unavailable ranges
Mar 3 13:34:55.903125 kernel: On node 0, zone Normal: 132 pages in unavailable ranges
Mar 3 13:34:55.903131 kernel: On node 0, zone Normal: 24576 pages in unavailable ranges
Mar 3 13:34:55.903136 kernel: ACPI: PM-Timer IO Port: 0x608
Mar 3 13:34:55.903141 kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Mar 3 13:34:55.903146 kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Mar 3 13:34:55.903151 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Mar 3 13:34:55.903157 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Mar 3 13:34:55.903164 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Mar 3 13:34:55.903169 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Mar 3 13:34:55.903174 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Mar 3 13:34:55.903180 kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Mar 3 13:34:55.903185 kernel: ACPI: HPET id: 0x8086a201 base: 0xfed00000
Mar 3 13:34:55.903190 kernel: CPU topo: Max. logical packages: 1
Mar 3 13:34:55.903195 kernel: CPU topo: Max. logical dies: 1
Mar 3 13:34:55.903209 kernel: CPU topo: Max. dies per package: 1
Mar 3 13:34:55.903215 kernel: CPU topo: Max. threads per core: 1
Mar 3 13:34:55.903220 kernel: CPU topo: Num. cores per package: 2
Mar 3 13:34:55.903226 kernel: CPU topo: Num. threads per package: 2
Mar 3 13:34:55.903231 kernel: CPU topo: Allowing 2 present CPUs plus 0 hotplug CPUs
Mar 3 13:34:55.903238 kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
Mar 3 13:34:55.903244 kernel: [mem 0x80000000-0xdfffffff] available for PCI devices
Mar 3 13:34:55.903249 kernel: Booting paravirtualized kernel on KVM
Mar 3 13:34:55.903255 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Mar 3 13:34:55.903260 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:2 nr_cpu_ids:2 nr_node_ids:1
Mar 3 13:34:55.903268 kernel: percpu: Embedded 60 pages/cpu s207832 r8192 d29736 u1048576
Mar 3 13:34:55.903273 kernel: pcpu-alloc: s207832 r8192 d29736 u1048576 alloc=1*2097152
Mar 3 13:34:55.903279 kernel: pcpu-alloc: [0] 0 1
Mar 3 13:34:55.903284 kernel: kvm-guest: PV spinlocks disabled, no host support
Mar 3 13:34:55.903290 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=hetzner verity.usrhash=51ade538e3d3c371f07ae1ec6fa9803fff0566ec060cf4b56dc685fc36d0e01c
Mar 3 13:34:55.903295 kernel: Dentry cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Mar 3 13:34:55.903301 kernel: Inode-cache hash table entries: 262144 (order: 9, 2097152 bytes, linear)
Mar 3 13:34:55.903306 kernel: Fallback order for Node 0: 0
Mar 3 13:34:55.903314 kernel: Built 1 zonelists, mobility grouping on. Total pages: 1022792
Mar 3 13:34:55.903320 kernel: Policy zone: Normal
Mar 3 13:34:55.903325 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Mar 3 13:34:55.903331 kernel: software IO TLB: area num 2.
Mar 3 13:34:55.903336 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=2, Nodes=1
Mar 3 13:34:55.903341 kernel: ftrace: allocating 40099 entries in 157 pages
Mar 3 13:34:55.903347 kernel: ftrace: allocated 157 pages with 5 groups
Mar 3 13:34:55.903352 kernel: Dynamic Preempt: voluntary
Mar 3 13:34:55.903357 kernel: rcu: Preemptible hierarchical RCU implementation.
Mar 3 13:34:55.903366 kernel: rcu: RCU event tracing is enabled.
Mar 3 13:34:55.903372 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=2.
Mar 3 13:34:55.903378 kernel: Trampoline variant of Tasks RCU enabled.
Mar 3 13:34:55.903383 kernel: Rude variant of Tasks RCU enabled.
Mar 3 13:34:55.903388 kernel: Tracing variant of Tasks RCU enabled.
Mar 3 13:34:55.903394 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Mar 3 13:34:55.903399 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=2
Mar 3 13:34:55.903405 kernel: RCU Tasks: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Mar 3 13:34:55.903410 kernel: RCU Tasks Rude: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Mar 3 13:34:55.903418 kernel: RCU Tasks Trace: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Mar 3 13:34:55.903423 kernel: NR_IRQS: 33024, nr_irqs: 440, preallocated irqs: 16
Mar 3 13:34:55.903429 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Mar 3 13:34:55.903434 kernel: Console: colour dummy device 80x25
Mar 3 13:34:55.903439 kernel: printk: legacy console [tty0] enabled
Mar 3 13:34:55.903445 kernel: printk: legacy console [ttyS0] enabled
Mar 3 13:34:55.903450 kernel: ACPI: Core revision 20240827
Mar 3 13:34:55.903456 kernel: clocksource: hpet: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 19112604467 ns
Mar 3 13:34:55.903461 kernel: APIC: Switch to symmetric I/O mode setup
Mar 3 13:34:55.903469 kernel: x2apic enabled
Mar 3 13:34:55.903474 kernel: APIC: Switched APIC routing to: physical x2apic
Mar 3 13:34:55.903479 kernel: ..TIMER: vector=0x30 apic1=0 pin1=2 apic2=-1 pin2=-1
Mar 3 13:34:55.903485 kernel: clocksource: tsc-early: mask: 0xffffffffffffffff max_cycles: 0x228aecd6e18, max_idle_ns: 440795270957 ns
Mar 3 13:34:55.903499 kernel: Calibrating delay loop (skipped) preset value.. 4792.79 BogoMIPS (lpj=2396398)
Mar 3 13:34:55.903505 kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Mar 3 13:34:55.903511 kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127
Mar 3 13:34:55.903516 kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0
Mar 3 13:34:55.903522 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Mar 3 13:34:55.903531 kernel: Spectre V2 : Mitigation: Enhanced / Automatic IBRS
Mar 3 13:34:55.903536 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Mar 3 13:34:55.903542 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Mar 3 13:34:55.903547 kernel: active return thunk: srso_alias_return_thunk
Mar 3 13:34:55.903553 kernel: Speculative Return Stack Overflow: Mitigation: Safe RET
Mar 3 13:34:55.903558 kernel: Transient Scheduler Attacks: Forcing mitigation on in a VM
Mar 3 13:34:55.903564 kernel: Transient Scheduler Attacks: Vulnerable: Clear CPU buffers attempted, no microcode
Mar 3 13:34:55.903569 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Mar 3 13:34:55.903574 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Mar 3 13:34:55.903582 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Mar 3 13:34:55.903587 kernel: x86/fpu: Supporting XSAVE feature 0x020: 'AVX-512 opmask'
Mar 3 13:34:55.903593 kernel: x86/fpu: Supporting XSAVE feature 0x040: 'AVX-512 Hi256'
Mar 3 13:34:55.903599 kernel: x86/fpu: Supporting XSAVE feature 0x080: 'AVX-512 ZMM_Hi256'
Mar 3 13:34:55.903604 kernel: x86/fpu: Supporting XSAVE feature 0x200: 'Protection Keys User registers'
Mar 3 13:34:55.903610 kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256
Mar 3 13:34:55.903615 kernel: x86/fpu: xstate_offset[5]: 832, xstate_sizes[5]: 64
Mar 3 13:34:55.903620 kernel: x86/fpu: xstate_offset[6]: 896, xstate_sizes[6]: 512
Mar 3 13:34:55.903626 kernel: x86/fpu: xstate_offset[7]: 1408, xstate_sizes[7]: 1024
Mar 3 13:34:55.903633 kernel: x86/fpu: xstate_offset[9]: 2432, xstate_sizes[9]: 8
Mar 3 13:34:55.903639 kernel: x86/fpu: Enabled xstate features 0x2e7, context size is 2440 bytes, using 'compacted' format.
Mar 3 13:34:55.903644 kernel: Freeing SMP alternatives memory: 32K
Mar 3 13:34:55.903650 kernel: pid_max: default: 32768 minimum: 301
Mar 3 13:34:55.903655 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima
Mar 3 13:34:55.903661 kernel: landlock: Up and running.
Mar 3 13:34:55.903684 kernel: SELinux: Initializing.
Mar 3 13:34:55.903690 kernel: Mount-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
Mar 3 13:34:55.903695 kernel: Mountpoint-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
Mar 3 13:34:55.903703 kernel: smpboot: CPU0: AMD EPYC-Genoa Processor (family: 0x19, model: 0x11, stepping: 0x0)
Mar 3 13:34:55.903708 kernel: Performance Events: Fam17h+ core perfctr, AMD PMU driver.
Mar 3 13:34:55.903714 kernel: ... version: 0
Mar 3 13:34:55.903719 kernel: ... bit width: 48
Mar 3 13:34:55.903724 kernel: ... generic registers: 6
Mar 3 13:34:55.903730 kernel: ... value mask: 0000ffffffffffff
Mar 3 13:34:55.903735 kernel: ... max period: 00007fffffffffff
Mar 3 13:34:55.903741 kernel: ... fixed-purpose events: 0
Mar 3 13:34:55.903746 kernel: ... event mask: 000000000000003f
Mar 3 13:34:55.903754 kernel: signal: max sigframe size: 3376
Mar 3 13:34:55.903760 kernel: rcu: Hierarchical SRCU implementation.
Mar 3 13:34:55.903765 kernel: rcu: Max phase no-delay instances is 400.
Mar 3 13:34:55.903771 kernel: Timer migration: 1 hierarchy levels; 8 children per group; 1 crossnode level
Mar 3 13:34:55.903776 kernel: smp: Bringing up secondary CPUs ...
Mar 3 13:34:55.903781 kernel: smpboot: x86: Booting SMP configuration:
Mar 3 13:34:55.903787 kernel: .... node #0, CPUs: #1
Mar 3 13:34:55.903792 kernel: smp: Brought up 1 node, 2 CPUs
Mar 3 13:34:55.903797 kernel: smpboot: Total of 2 processors activated (9585.59 BogoMIPS)
Mar 3 13:34:55.903806 kernel: Memory: 3848512K/4091168K available (14336K kernel code, 2445K rwdata, 26064K rodata, 46200K init, 2560K bss, 237024K reserved, 0K cma-reserved)
Mar 3 13:34:55.903811 kernel: devtmpfs: initialized
Mar 3 13:34:55.903817 kernel: x86/mm: Memory block size: 128MB
Mar 3 13:34:55.903822 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x7fb7f000-0x7fbfefff] (524288 bytes)
Mar 3 13:34:55.903828 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Mar 3 13:34:55.903833 kernel: futex hash table entries: 512 (order: 3, 32768 bytes, linear)
Mar 3 13:34:55.903839 kernel: pinctrl core: initialized pinctrl subsystem
Mar 3 13:34:55.903844 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Mar 3 13:34:55.903850 kernel: audit: initializing netlink subsys (disabled)
Mar 3 13:34:55.903858 kernel: audit: type=2000 audit(1772544892.681:1): state=initialized audit_enabled=0 res=1
Mar 3 13:34:55.903863 kernel: thermal_sys: Registered thermal governor 'step_wise'
Mar 3 13:34:55.903868 kernel: thermal_sys: Registered thermal governor 'user_space'
Mar 3 13:34:55.903874 kernel: cpuidle: using governor menu
Mar 3 13:34:55.903879 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Mar 3 13:34:55.903884 kernel: dca service started, version 1.12.1
Mar 3 13:34:55.903890 kernel: PCI: ECAM [mem 0xe0000000-0xefffffff] (base 0xe0000000) for domain 0000 [bus 00-ff]
Mar 3 13:34:55.903895 kernel: PCI: Using configuration type 1 for base access
Mar 3 13:34:55.903901 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Mar 3 13:34:55.903908 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Mar 3 13:34:55.903913 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Mar 3 13:34:55.903919 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Mar 3 13:34:55.903924 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Mar 3 13:34:55.903930 kernel: ACPI: Added _OSI(Module Device)
Mar 3 13:34:55.903935 kernel: ACPI: Added _OSI(Processor Device)
Mar 3 13:34:55.903940 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Mar 3 13:34:55.903946 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Mar 3 13:34:55.903951 kernel: ACPI: Interpreter enabled
Mar 3 13:34:55.903959 kernel: ACPI: PM: (supports S0 S5)
Mar 3 13:34:55.903964 kernel: ACPI: Using IOAPIC for interrupt routing
Mar 3 13:34:55.903969 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Mar 3 13:34:55.903975 kernel: PCI: Using E820 reservations for host bridge windows
Mar 3 13:34:55.903980 kernel: ACPI: Enabled 2 GPEs in block 00 to 3F
Mar 3 13:34:55.903986 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Mar 3 13:34:55.904142 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3]
Mar 3 13:34:55.904246 kernel: acpi PNP0A08:00: _OSC: platform does not support [PCIeHotplug LTR]
Mar 3 13:34:55.904348 kernel: acpi PNP0A08:00: _OSC: OS now controls [PME AER PCIeCapability]
Mar 3 13:34:55.904355 kernel: PCI host bridge to bus 0000:00
Mar 3 13:34:55.904455 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window]
Mar 3 13:34:55.904554 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xffff window]
Mar 3 13:34:55.904643 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Mar 3 13:34:55.905778 kernel: pci_bus 0000:00: root bus resource [mem 0x80000000-0xdfffffff window]
Mar 3 13:34:55.905877 kernel: pci_bus 0000:00: root bus resource [mem 0xf0000000-0xfebfffff window]
Mar 3 13:34:55.905971 kernel: pci_bus 0000:00: root bus resource [mem 0xc000000000-0xc7ffffffff window]
Mar 3 13:34:55.906061 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Mar 3 13:34:55.906173 kernel: pci 0000:00:00.0: [8086:29c0] type 00 class 0x060000 conventional PCI endpoint
Mar 3 13:34:55.906309 kernel: pci 0000:00:01.0: [1af4:1050] type 00 class 0x030000 conventional PCI endpoint
Mar 3 13:34:55.906435 kernel: pci 0000:00:01.0: BAR 0 [mem 0x80000000-0x807fffff pref]
Mar 3 13:34:55.906561 kernel: pci 0000:00:01.0: BAR 2 [mem 0xc060500000-0xc060503fff 64bit pref]
Mar 3 13:34:55.907714 kernel: pci 0000:00:01.0: BAR 4 [mem 0x8138a000-0x8138afff]
Mar 3 13:34:55.907841 kernel: pci 0000:00:01.0: ROM [mem 0xffff0000-0xffffffff pref]
Mar 3 13:34:55.907952 kernel: pci 0000:00:01.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Mar 3 13:34:55.908073 kernel: pci 0000:00:02.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 3 13:34:55.908186 kernel: pci 0000:00:02.0: BAR 0 [mem 0x81389000-0x81389fff]
Mar 3 13:34:55.908300 kernel: pci 0000:00:02.0: PCI bridge to [bus 01]
Mar 3 13:34:55.908410 kernel: pci 0000:00:02.0: bridge window [mem 0x81200000-0x812fffff]
Mar 3 13:34:55.908537 kernel: pci 0000:00:02.0: bridge window [mem 0xc060000000-0xc0600fffff 64bit pref]
Mar 3 13:34:55.908655 kernel: pci 0000:00:02.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 3 13:34:55.910870 kernel: pci 0000:00:02.1: BAR 0 [mem 0x81388000-0x81388fff]
Mar 3 13:34:55.910978 kernel: pci 0000:00:02.1: PCI bridge to [bus 02]
Mar 3 13:34:55.911076 kernel: pci 0000:00:02.1: bridge window [mem 0x81100000-0x811fffff]
Mar 3 13:34:55.911180 kernel: pci 0000:00:02.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 3 13:34:55.911281 kernel: pci 0000:00:02.2: BAR 0 [mem 0x81387000-0x81387fff]
Mar 3 13:34:55.911379 kernel: pci 0000:00:02.2: PCI bridge to [bus 03]
Mar 3 13:34:55.911475 kernel: pci 0000:00:02.2: bridge window [mem 0x81000000-0x810fffff]
Mar 3 13:34:55.911584 kernel: pci 0000:00:02.2: bridge window [mem 0xc060100000-0xc0601fffff 64bit pref]
Mar 3 13:34:55.912363 kernel: pci 0000:00:02.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 3 13:34:55.912473 kernel: pci 0000:00:02.3: BAR 0 [mem 0x81386000-0x81386fff]
Mar 3 13:34:55.912630 kernel: pci 0000:00:02.3: PCI bridge to [bus 04]
Mar 3 13:34:55.912750 kernel: pci 0000:00:02.3: bridge window [mem 0xc060200000-0xc0602fffff 64bit pref]
Mar 3 13:34:55.912859 kernel: pci 0000:00:02.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 3 13:34:55.912956 kernel: pci 0000:00:02.4: BAR 0 [mem 0x81385000-0x81385fff]
Mar 3 13:34:55.913052 kernel: pci 0000:00:02.4: PCI bridge to [bus 05]
Mar 3 13:34:55.913147 kernel: pci 0000:00:02.4: bridge window [mem 0x80f00000-0x80ffffff]
Mar 3 13:34:55.913245 kernel: pci 0000:00:02.4: bridge window [mem 0xc060300000-0xc0603fffff 64bit pref]
Mar 3 13:34:55.913346 kernel: pci 0000:00:02.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 3 13:34:55.913443 kernel: pci 0000:00:02.5: BAR 0 [mem 0x81384000-0x81384fff]
Mar 3 13:34:55.913556 kernel: pci 0000:00:02.5: PCI bridge to [bus 06]
Mar 3 13:34:55.913651 kernel: pci 0000:00:02.5: bridge window [mem 0x80e00000-0x80efffff]
Mar 3 13:34:55.913766 kernel: pci 0000:00:02.5: bridge window [mem 0xc060400000-0xc0604fffff 64bit pref]
Mar 3 13:34:55.913869 kernel: pci 0000:00:02.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 3 13:34:55.913982 kernel: pci 0000:00:02.6: BAR 0 [mem 0x81383000-0x81383fff]
Mar 3 13:34:55.914080 kernel: pci 0000:00:02.6: PCI bridge to [bus 07]
Mar 3 13:34:55.914176 kernel: pci 0000:00:02.6: bridge window [mem 0x80c00000-0x80dfffff]
Mar 3 13:34:55.914275 kernel: pci 0000:00:02.6: bridge window [mem 0xc000000000-0xc01fffffff 64bit pref]
Mar 3 13:34:55.914376 kernel: pci 0000:00:02.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 3 13:34:55.914472 kernel: pci 0000:00:02.7: BAR 0 [mem 0x81382000-0x81382fff]
Mar 3 13:34:55.914576 kernel: pci 0000:00:02.7: PCI bridge to [bus 08]
Mar 3 13:34:55.914685 kernel: pci 0000:00:02.7: bridge window [mem 0x80a00000-0x80bfffff]
Mar 3 13:34:55.914782 kernel: pci 0000:00:02.7: bridge window [mem 0xc020000000-0xc03fffffff 64bit pref]
Mar 3 13:34:55.914889 kernel: pci 0000:00:03.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Mar 3 13:34:55.914984 kernel: pci 0000:00:03.0: BAR 0 [mem 0x81381000-0x81381fff]
Mar 3 13:34:55.915081 kernel: pci 0000:00:03.0: PCI bridge to [bus 09]
Mar 3 13:34:55.915176 kernel: pci 0000:00:03.0: bridge window [mem 0x80800000-0x809fffff]
Mar 3 13:34:55.915271 kernel: pci 0000:00:03.0: bridge window [mem 0xc040000000-0xc05fffffff 64bit pref]
Mar 3 13:34:55.915374 kernel: pci 0000:00:1f.0: [8086:2918] type 00 class 0x060100 conventional PCI endpoint
Mar 3 13:34:55.915470 kernel: pci 0000:00:1f.0: quirk: [io 0x0600-0x067f] claimed by ICH6 ACPI/GPIO/TCO
Mar 3 13:34:55.915586 kernel: pci 0000:00:1f.2: [8086:2922] type 00 class 0x010601 conventional PCI endpoint
Mar 3 13:34:55.916855 kernel: pci 0000:00:1f.2: BAR 4 [io 0x6040-0x605f]
Mar 3 13:34:55.916970 kernel: pci 0000:00:1f.2: BAR 5 [mem 0x81380000-0x81380fff]
Mar 3 13:34:55.917076 kernel: pci 0000:00:1f.3: [8086:2930] type 00 class 0x0c0500 conventional PCI endpoint
Mar 3 13:34:55.917174 kernel: pci 0000:00:1f.3: BAR 4 [io 0x6000-0x603f]
Mar 3 13:34:55.917285 kernel: pci 0000:01:00.0: [1af4:1041] type 00 class 0x020000 PCIe Endpoint
Mar 3 13:34:55.917389 kernel: pci 0000:01:00.0: BAR 1 [mem 0x81200000-0x81200fff]
Mar 3 13:34:55.917490 kernel: pci 0000:01:00.0: BAR 4 [mem 0xc060000000-0xc060003fff 64bit pref]
Mar 3 13:34:55.917604 kernel: pci 0000:01:00.0: ROM [mem 0xfff80000-0xffffffff pref]
Mar 3 13:34:55.917986 kernel: pci 0000:00:02.0: PCI bridge to [bus 01]
Mar 3 13:34:55.918102 kernel: pci 0000:02:00.0: [1b36:000d] type 00 class 0x0c0330 PCIe Endpoint
Mar 3 13:34:55.918204 kernel: pci 0000:02:00.0: BAR 0 [mem 0x81100000-0x81103fff 64bit]
Mar 3 13:34:55.918301 kernel: pci 0000:00:02.1: PCI bridge to [bus 02]
Mar 3 13:34:55.918412 kernel: pci 0000:03:00.0: [1af4:1043] type 00 class 0x078000 PCIe Endpoint
Mar 3 13:34:55.918526 kernel: pci 0000:03:00.0: BAR 1 [mem 0x81000000-0x81000fff]
Mar 3 13:34:55.918628 kernel: pci 0000:03:00.0: BAR 4 [mem 0xc060100000-0xc060103fff 64bit pref]
Mar 3 13:34:55.918738 kernel: pci 0000:00:02.2: PCI bridge to [bus 03]
Mar 3 13:34:55.918846 kernel: pci 0000:04:00.0: [1af4:1045] type 00 class 0x00ff00 PCIe Endpoint
Mar 3 13:34:55.918947 kernel: pci 0000:04:00.0: BAR 4 [mem 0xc060200000-0xc060203fff 64bit pref]
Mar 3 13:34:55.919043 kernel: pci 0000:00:02.3: PCI bridge to [bus 04]
Mar 3 13:34:55.919157 kernel: pci 0000:05:00.0: [1af4:1044] type 00 class 0x00ff00 PCIe Endpoint
Mar 3 13:34:55.919259 kernel: pci 0000:05:00.0: BAR 1 [mem 0x80f00000-0x80f00fff]
Mar 3 13:34:55.919359 kernel: pci 0000:05:00.0: BAR 4 [mem 0xc060300000-0xc060303fff 64bit pref]
Mar 3 13:34:55.919454 kernel: pci 0000:00:02.4: PCI bridge to [bus 05]
Mar 3 13:34:55.919570 kernel: pci 0000:06:00.0: [1af4:1048] type 00 class 0x010000 PCIe Endpoint
Mar 3 13:34:55.919683 kernel: pci 0000:06:00.0: BAR 1 [mem 0x80e00000-0x80e00fff]
Mar 3 13:34:55.919785 kernel: pci 0000:06:00.0: BAR 4 [mem 0xc060400000-0xc060403fff 64bit pref]
Mar 3 13:34:55.919884 kernel: pci 0000:00:02.5: PCI bridge to [bus 06]
Mar 3 13:34:55.919892 kernel: acpiphp: Slot [0] registered
Mar 3 13:34:55.920000 kernel: pci 0000:07:00.0: [1af4:1041] type 00 class 0x020000 PCIe Endpoint
Mar 3 13:34:55.920100 kernel: pci 0000:07:00.0: BAR 1 [mem 0x80c00000-0x80c00fff]
Mar 3 13:34:55.920201 kernel: pci 0000:07:00.0: BAR 4 [mem 0xc000000000-0xc000003fff 64bit pref]
Mar 3 13:34:55.920300 kernel: pci 0000:07:00.0: ROM [mem 0xfff80000-0xffffffff pref]
Mar 3 13:34:55.920397 kernel: pci 0000:00:02.6: PCI bridge to [bus 07]
Mar 3 13:34:55.920406 kernel: acpiphp: Slot [0-2] registered
Mar 3 13:34:55.920509 kernel: pci 0000:00:02.7: PCI bridge to [bus 08]
Mar 3 13:34:55.920516 kernel: acpiphp: Slot [0-3] registered
Mar 3 13:34:55.920612 kernel: pci 0000:00:03.0: PCI bridge to [bus 09]
Mar 3 13:34:55.920622 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Mar 3 13:34:55.920643 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Mar 3 13:34:55.920652 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Mar 3 13:34:55.920657 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Mar 3 13:34:55.920678 kernel: ACPI: PCI: Interrupt link LNKE configured for IRQ 10
Mar 3 13:34:55.920684 kernel: ACPI: PCI: Interrupt link LNKF configured for IRQ 10
Mar 3 13:34:55.920690 kernel: ACPI: PCI: Interrupt link LNKG configured for IRQ 11
Mar 3 13:34:55.920696 kernel: ACPI: PCI: Interrupt link LNKH configured for IRQ 11
Mar 3 13:34:55.920701 kernel: ACPI: PCI: Interrupt link GSIA configured for IRQ 16
Mar 3 13:34:55.920707 kernel: ACPI: PCI: Interrupt link GSIB configured for IRQ 17
Mar 3 13:34:55.920712 kernel: ACPI: PCI: Interrupt link GSIC configured for IRQ 18
Mar 3 13:34:55.920718 kernel: ACPI: PCI: Interrupt link GSID configured for IRQ 19
Mar 3 13:34:55.920724 kernel: ACPI: PCI: Interrupt link GSIE configured for IRQ 20
Mar 3 13:34:55.920732 kernel: ACPI: PCI: Interrupt link GSIF configured for IRQ 21
Mar 3 13:34:55.920738 kernel: ACPI: PCI: Interrupt link GSIG configured for IRQ 22
Mar 3 13:34:55.920745 kernel: ACPI: PCI: Interrupt link GSIH configured for IRQ 23
Mar 3 13:34:55.920751 kernel: iommu: Default domain type: Translated
Mar 3 13:34:55.920757 kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Mar 3 13:34:55.920766 kernel: efivars: Registered efivars operations
Mar 3 13:34:55.920774 kernel: PCI: Using ACPI for IRQ routing
Mar 3 13:34:55.920780 kernel: PCI: pci_cache_line_size set to 64 bytes
Mar 3 13:34:55.920786 kernel: e820: reserve RAM buffer [mem 0x7df34018-0x7fffffff]
Mar 3 13:34:55.920791 kernel: e820: reserve RAM buffer [mem 0x7df70018-0x7fffffff]
Mar 3 13:34:55.920797 kernel: e820: reserve RAM buffer [mem 0x7dfac018-0x7fffffff]
Mar 3 13:34:55.920803 kernel: e820: reserve RAM buffer [mem 0x7ed3f000-0x7fffffff]
Mar 3 13:34:55.920808 kernel: e820: reserve RAM buffer [mem 0x7f8ed000-0x7fffffff]
Mar 3 13:34:55.920814 kernel: e820: reserve RAM buffer [mem 0x7ff7c000-0x7fffffff]
Mar 3 13:34:55.920822 kernel: e820: reserve RAM buffer [mem 0x17a000000-0x17bffffff]
Mar 3 13:34:55.920920 kernel: pci 0000:00:01.0: vgaarb: setting as boot VGA device
Mar 3 13:34:55.921016 kernel: pci 0000:00:01.0: vgaarb: bridge control possible
Mar 3 13:34:55.921113 kernel: pci 0000:00:01.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Mar 3 13:34:55.921120 kernel: vgaarb: loaded
Mar 3 13:34:55.921126 kernel: hpet0: at MMIO 0xfed00000, IRQs 2, 8, 0
Mar 3 13:34:55.921132 kernel: hpet0: 3 comparators, 64-bit 100.000000 MHz counter
Mar 3 13:34:55.921138 kernel: clocksource: Switched to clocksource kvm-clock
Mar 3 13:34:55.921143 kernel: VFS: Disk quotas dquot_6.6.0
Mar 3 13:34:55.921152 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Mar 3 13:34:55.921158 kernel: pnp: PnP ACPI init
Mar 3 13:34:55.921264 kernel: system 00:04: [mem 0xe0000000-0xefffffff window] has been reserved
Mar 3 13:34:55.921272 kernel: pnp: PnP ACPI: found 5 devices
Mar 3 13:34:55.921277 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Mar 3 13:34:55.921283 kernel: NET: Registered PF_INET protocol family
Mar 3 13:34:55.921289 kernel: IP idents hash table entries: 65536 (order: 7, 524288 bytes, linear)
Mar 3 13:34:55.921295 kernel: tcp_listen_portaddr_hash hash table entries: 2048 (order: 3, 32768 bytes, linear)
Mar 3 13:34:55.921303 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Mar 3 13:34:55.921309 kernel: TCP established hash table entries: 32768 (order: 6, 262144 bytes, linear)
Mar 3 13:34:55.921315 kernel: TCP bind hash table entries: 32768 (order: 8, 1048576 bytes, linear)
Mar 3 13:34:55.921320 kernel: TCP: Hash tables configured (established 32768 bind 32768)
Mar 3 13:34:55.921326 kernel: UDP hash table entries: 2048 (order: 4, 65536 bytes, linear)
Mar 3 13:34:55.921332 kernel: UDP-Lite hash table entries: 2048 (order: 4, 65536 bytes, linear)
Mar 3 13:34:55.921337 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Mar 3 13:34:55.921343 kernel: NET: Registered PF_XDP protocol family
Mar 3 13:34:55.921446 kernel: pci 0000:01:00.0: ROM [mem 0xfff80000-0xffffffff pref]: can't claim; no compatible bridge window
Mar 3 13:34:55.921561 kernel: pci 0000:07:00.0: ROM [mem 0xfff80000-0xffffffff pref]: can't claim; no compatible bridge window
Mar 3 13:34:55.921657 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x0fff] to [bus 07] add_size 1000
Mar 3 13:34:55.921776 kernel: pci 0000:00:02.7: bridge window [io 0x1000-0x0fff] to [bus 08] add_size 1000
Mar 3 13:34:55.921873 kernel: pci 0000:00:03.0: bridge window [io 0x1000-0x0fff] to [bus 09] add_size 1000
Mar 3 13:34:55.921970 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x1fff]: assigned
Mar 3 13:34:55.922065 kernel: pci 0000:00:02.7: bridge window [io 0x2000-0x2fff]: assigned
Mar 3 13:34:55.922161 kernel: pci 0000:00:03.0: bridge window [io 0x3000-0x3fff]: assigned
Mar 3 13:34:55.922263 kernel: pci 0000:01:00.0: ROM [mem 0x81280000-0x812fffff pref]: assigned
Mar 3
13:34:55.922362 kernel: pci 0000:00:02.0: PCI bridge to [bus 01] Mar 3 13:34:55.922459 kernel: pci 0000:00:02.0: bridge window [mem 0x81200000-0x812fffff] Mar 3 13:34:55.922565 kernel: pci 0000:00:02.0: bridge window [mem 0xc060000000-0xc0600fffff 64bit pref] Mar 3 13:34:55.922680 kernel: pci 0000:00:02.1: PCI bridge to [bus 02] Mar 3 13:34:55.922779 kernel: pci 0000:00:02.1: bridge window [mem 0x81100000-0x811fffff] Mar 3 13:34:55.922876 kernel: pci 0000:00:02.2: PCI bridge to [bus 03] Mar 3 13:34:55.922982 kernel: pci 0000:00:02.2: bridge window [mem 0x81000000-0x810fffff] Mar 3 13:34:55.924743 kernel: pci 0000:00:02.2: bridge window [mem 0xc060100000-0xc0601fffff 64bit pref] Mar 3 13:34:55.924854 kernel: pci 0000:00:02.3: PCI bridge to [bus 04] Mar 3 13:34:55.924957 kernel: pci 0000:00:02.3: bridge window [mem 0xc060200000-0xc0602fffff 64bit pref] Mar 3 13:34:55.925054 kernel: pci 0000:00:02.4: PCI bridge to [bus 05] Mar 3 13:34:55.925151 kernel: pci 0000:00:02.4: bridge window [mem 0x80f00000-0x80ffffff] Mar 3 13:34:55.925248 kernel: pci 0000:00:02.4: bridge window [mem 0xc060300000-0xc0603fffff 64bit pref] Mar 3 13:34:55.925345 kernel: pci 0000:00:02.5: PCI bridge to [bus 06] Mar 3 13:34:55.925441 kernel: pci 0000:00:02.5: bridge window [mem 0x80e00000-0x80efffff] Mar 3 13:34:55.925553 kernel: pci 0000:00:02.5: bridge window [mem 0xc060400000-0xc0604fffff 64bit pref] Mar 3 13:34:55.927448 kernel: pci 0000:07:00.0: ROM [mem 0x80c80000-0x80cfffff pref]: assigned Mar 3 13:34:55.927593 kernel: pci 0000:00:02.6: PCI bridge to [bus 07] Mar 3 13:34:55.927779 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x1fff] Mar 3 13:34:55.927895 kernel: pci 0000:00:02.6: bridge window [mem 0x80c00000-0x80dfffff] Mar 3 13:34:55.928005 kernel: pci 0000:00:02.6: bridge window [mem 0xc000000000-0xc01fffffff 64bit pref] Mar 3 13:34:55.928115 kernel: pci 0000:00:02.7: PCI bridge to [bus 08] Mar 3 13:34:55.928214 kernel: pci 0000:00:02.7: bridge window [io 0x2000-0x2fff] Mar 3 
13:34:55.928327 kernel: pci 0000:00:02.7: bridge window [mem 0x80a00000-0x80bfffff] Mar 3 13:34:55.928429 kernel: pci 0000:00:02.7: bridge window [mem 0xc020000000-0xc03fffffff 64bit pref] Mar 3 13:34:55.928536 kernel: pci 0000:00:03.0: PCI bridge to [bus 09] Mar 3 13:34:55.928633 kernel: pci 0000:00:03.0: bridge window [io 0x3000-0x3fff] Mar 3 13:34:55.928758 kernel: pci 0000:00:03.0: bridge window [mem 0x80800000-0x809fffff] Mar 3 13:34:55.928855 kernel: pci 0000:00:03.0: bridge window [mem 0xc040000000-0xc05fffffff 64bit pref] Mar 3 13:34:55.928947 kernel: pci_bus 0000:00: resource 4 [io 0x0000-0x0cf7 window] Mar 3 13:34:55.929036 kernel: pci_bus 0000:00: resource 5 [io 0x0d00-0xffff window] Mar 3 13:34:55.929124 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window] Mar 3 13:34:55.929218 kernel: pci_bus 0000:00: resource 7 [mem 0x80000000-0xdfffffff window] Mar 3 13:34:55.929320 kernel: pci_bus 0000:00: resource 8 [mem 0xf0000000-0xfebfffff window] Mar 3 13:34:55.929428 kernel: pci_bus 0000:00: resource 9 [mem 0xc000000000-0xc7ffffffff window] Mar 3 13:34:55.929546 kernel: pci_bus 0000:01: resource 1 [mem 0x81200000-0x812fffff] Mar 3 13:34:55.929644 kernel: pci_bus 0000:01: resource 2 [mem 0xc060000000-0xc0600fffff 64bit pref] Mar 3 13:34:55.931856 kernel: pci_bus 0000:02: resource 1 [mem 0x81100000-0x811fffff] Mar 3 13:34:55.931997 kernel: pci_bus 0000:03: resource 1 [mem 0x81000000-0x810fffff] Mar 3 13:34:55.932097 kernel: pci_bus 0000:03: resource 2 [mem 0xc060100000-0xc0601fffff 64bit pref] Mar 3 13:34:55.932201 kernel: pci_bus 0000:04: resource 2 [mem 0xc060200000-0xc0602fffff 64bit pref] Mar 3 13:34:55.932304 kernel: pci_bus 0000:05: resource 1 [mem 0x80f00000-0x80ffffff] Mar 3 13:34:55.932399 kernel: pci_bus 0000:05: resource 2 [mem 0xc060300000-0xc0603fffff 64bit pref] Mar 3 13:34:55.932511 kernel: pci_bus 0000:06: resource 1 [mem 0x80e00000-0x80efffff] Mar 3 13:34:55.932610 kernel: pci_bus 0000:06: resource 2 [mem 
0xc060400000-0xc0604fffff 64bit pref] Mar 3 13:34:55.932727 kernel: pci_bus 0000:07: resource 0 [io 0x1000-0x1fff] Mar 3 13:34:55.932824 kernel: pci_bus 0000:07: resource 1 [mem 0x80c00000-0x80dfffff] Mar 3 13:34:55.932918 kernel: pci_bus 0000:07: resource 2 [mem 0xc000000000-0xc01fffffff 64bit pref] Mar 3 13:34:55.933022 kernel: pci_bus 0000:08: resource 0 [io 0x2000-0x2fff] Mar 3 13:34:55.933116 kernel: pci_bus 0000:08: resource 1 [mem 0x80a00000-0x80bfffff] Mar 3 13:34:55.933209 kernel: pci_bus 0000:08: resource 2 [mem 0xc020000000-0xc03fffffff 64bit pref] Mar 3 13:34:55.933314 kernel: pci_bus 0000:09: resource 0 [io 0x3000-0x3fff] Mar 3 13:34:55.933411 kernel: pci_bus 0000:09: resource 1 [mem 0x80800000-0x809fffff] Mar 3 13:34:55.933514 kernel: pci_bus 0000:09: resource 2 [mem 0xc040000000-0xc05fffffff 64bit pref] Mar 3 13:34:55.933523 kernel: ACPI: \_SB_.GSIG: Enabled at IRQ 22 Mar 3 13:34:55.933529 kernel: PCI: CLS 0 bytes, default 64 Mar 3 13:34:55.933536 kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB) Mar 3 13:34:55.933542 kernel: software IO TLB: mapped [mem 0x0000000077ffd000-0x000000007bffd000] (64MB) Mar 3 13:34:55.933548 kernel: clocksource: tsc: mask: 0xffffffffffffffff max_cycles: 0x228aecd6e18, max_idle_ns: 440795270957 ns Mar 3 13:34:55.933557 kernel: Initialise system trusted keyrings Mar 3 13:34:55.933563 kernel: workingset: timestamp_bits=39 max_order=20 bucket_order=0 Mar 3 13:34:55.933569 kernel: Key type asymmetric registered Mar 3 13:34:55.933575 kernel: Asymmetric key parser 'x509' registered Mar 3 13:34:55.933581 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 250) Mar 3 13:34:55.933587 kernel: io scheduler mq-deadline registered Mar 3 13:34:55.933593 kernel: io scheduler kyber registered Mar 3 13:34:55.933599 kernel: io scheduler bfq registered Mar 3 13:34:55.933745 kernel: pcieport 0000:00:02.0: PME: Signaling with IRQ 24 Mar 3 13:34:55.933853 kernel: pcieport 0000:00:02.0: AER: enabled with IRQ 
24 Mar 3 13:34:55.933954 kernel: pcieport 0000:00:02.1: PME: Signaling with IRQ 25 Mar 3 13:34:55.934052 kernel: pcieport 0000:00:02.1: AER: enabled with IRQ 25 Mar 3 13:34:55.934150 kernel: pcieport 0000:00:02.2: PME: Signaling with IRQ 26 Mar 3 13:34:55.934247 kernel: pcieport 0000:00:02.2: AER: enabled with IRQ 26 Mar 3 13:34:55.934346 kernel: pcieport 0000:00:02.3: PME: Signaling with IRQ 27 Mar 3 13:34:55.934444 kernel: pcieport 0000:00:02.3: AER: enabled with IRQ 27 Mar 3 13:34:55.934554 kernel: pcieport 0000:00:02.4: PME: Signaling with IRQ 28 Mar 3 13:34:55.934654 kernel: pcieport 0000:00:02.4: AER: enabled with IRQ 28 Mar 3 13:34:55.934768 kernel: pcieport 0000:00:02.5: PME: Signaling with IRQ 29 Mar 3 13:34:55.934865 kernel: pcieport 0000:00:02.5: AER: enabled with IRQ 29 Mar 3 13:34:55.934964 kernel: pcieport 0000:00:02.6: PME: Signaling with IRQ 30 Mar 3 13:34:55.935061 kernel: pcieport 0000:00:02.6: AER: enabled with IRQ 30 Mar 3 13:34:55.935159 kernel: pcieport 0000:00:02.7: PME: Signaling with IRQ 31 Mar 3 13:34:55.935256 kernel: pcieport 0000:00:02.7: AER: enabled with IRQ 31 Mar 3 13:34:55.935266 kernel: ACPI: \_SB_.GSIH: Enabled at IRQ 23 Mar 3 13:34:55.935363 kernel: pcieport 0000:00:03.0: PME: Signaling with IRQ 32 Mar 3 13:34:55.935460 kernel: pcieport 0000:00:03.0: AER: enabled with IRQ 32 Mar 3 13:34:55.935467 kernel: ioatdma: Intel(R) QuickData Technology Driver 5.00 Mar 3 13:34:55.935474 kernel: ACPI: \_SB_.GSIF: Enabled at IRQ 21 Mar 3 13:34:55.935480 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Mar 3 13:34:55.935486 kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A Mar 3 13:34:55.935505 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12 Mar 3 13:34:55.935511 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1 Mar 3 13:34:55.935517 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12 Mar 3 13:34:55.935624 kernel: rtc_cmos 00:03: RTC can wake from S4 Mar 3 
13:34:55.935633 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input0 Mar 3 13:34:55.936811 kernel: rtc_cmos 00:03: registered as rtc0 Mar 3 13:34:55.936915 kernel: rtc_cmos 00:03: setting system clock to 2026-03-03T13:34:55 UTC (1772544895) Mar 3 13:34:55.937010 kernel: rtc_cmos 00:03: alarms up to one day, y3k, 242 bytes nvram, hpet irqs Mar 3 13:34:55.937023 kernel: amd_pstate: The CPPC feature is supported but currently disabled by the BIOS. Please enable it if your BIOS has the CPPC option. Mar 3 13:34:55.937030 kernel: amd_pstate: the _CPC object is not present in SBIOS or ACPI disabled Mar 3 13:34:55.937036 kernel: efifb: probing for efifb Mar 3 13:34:55.937042 kernel: efifb: framebuffer at 0x80000000, using 4000k, total 4000k Mar 3 13:34:55.937048 kernel: efifb: mode is 1280x800x32, linelength=5120, pages=1 Mar 3 13:34:55.937054 kernel: efifb: scrolling: redraw Mar 3 13:34:55.937060 kernel: efifb: Truecolor: size=8:8:8:8, shift=24:16:8:0 Mar 3 13:34:55.937066 kernel: Console: switching to colour frame buffer device 160x50 Mar 3 13:34:55.937072 kernel: fb0: EFI VGA frame buffer device Mar 3 13:34:55.937081 kernel: pstore: Using crash dump compression: deflate Mar 3 13:34:55.937086 kernel: pstore: Registered efi_pstore as persistent store backend Mar 3 13:34:55.937092 kernel: NET: Registered PF_INET6 protocol family Mar 3 13:34:55.937098 kernel: Segment Routing with IPv6 Mar 3 13:34:55.937104 kernel: In-situ OAM (IOAM) with IPv6 Mar 3 13:34:55.937110 kernel: NET: Registered PF_PACKET protocol family Mar 3 13:34:55.937116 kernel: Key type dns_resolver registered Mar 3 13:34:55.937121 kernel: IPI shorthand broadcast: enabled Mar 3 13:34:55.937127 kernel: sched_clock: Marking stable (2969011571, 270336394)->(3284761419, -45413454) Mar 3 13:34:55.937136 kernel: registered taskstats version 1 Mar 3 13:34:55.937142 kernel: Loading compiled-in X.509 certificates Mar 3 13:34:55.937148 kernel: Loaded X.509 cert 'Kinvolk GmbH: 
Module signing key for 6.12.74-flatcar: bf135b2a3d3664cc6742f4e1848867384c1e52f1' Mar 3 13:34:55.937154 kernel: Demotion targets for Node 0: null Mar 3 13:34:55.937160 kernel: Key type .fscrypt registered Mar 3 13:34:55.937165 kernel: Key type fscrypt-provisioning registered Mar 3 13:34:55.937171 kernel: ima: No TPM chip found, activating TPM-bypass! Mar 3 13:34:55.937177 kernel: ima: Allocated hash algorithm: sha1 Mar 3 13:34:55.937185 kernel: ima: No architecture policies found Mar 3 13:34:55.937191 kernel: clk: Disabling unused clocks Mar 3 13:34:55.937197 kernel: Warning: unable to open an initial console. Mar 3 13:34:55.937204 kernel: Freeing unused kernel image (initmem) memory: 46200K Mar 3 13:34:55.937210 kernel: Write protecting the kernel read-only data: 40960k Mar 3 13:34:55.937216 kernel: Freeing unused kernel image (rodata/data gap) memory: 560K Mar 3 13:34:55.937221 kernel: Run /init as init process Mar 3 13:34:55.937227 kernel: with arguments: Mar 3 13:34:55.937233 kernel: /init Mar 3 13:34:55.937239 kernel: with environment: Mar 3 13:34:55.937248 kernel: HOME=/ Mar 3 13:34:55.937254 kernel: TERM=linux Mar 3 13:34:55.937261 systemd[1]: Successfully made /usr/ read-only. Mar 3 13:34:55.937270 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Mar 3 13:34:55.937277 systemd[1]: Detected virtualization kvm. Mar 3 13:34:55.937283 systemd[1]: Detected architecture x86-64. Mar 3 13:34:55.937289 systemd[1]: Running in initrd. Mar 3 13:34:55.937297 systemd[1]: No hostname configured, using default hostname. Mar 3 13:34:55.937303 systemd[1]: Hostname set to . Mar 3 13:34:55.937309 systemd[1]: Initializing machine ID from VM UUID. 
Mar 3 13:34:55.937316 systemd[1]: Queued start job for default target initrd.target. Mar 3 13:34:55.937322 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Mar 3 13:34:55.937328 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Mar 3 13:34:55.937334 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... Mar 3 13:34:55.937341 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Mar 3 13:34:55.937350 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... Mar 3 13:34:55.937356 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... Mar 3 13:34:55.937364 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132... Mar 3 13:34:55.937370 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr... Mar 3 13:34:55.937377 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Mar 3 13:34:55.937383 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Mar 3 13:34:55.937389 systemd[1]: Reached target paths.target - Path Units. Mar 3 13:34:55.937397 systemd[1]: Reached target slices.target - Slice Units. Mar 3 13:34:55.937404 systemd[1]: Reached target swap.target - Swaps. Mar 3 13:34:55.937410 systemd[1]: Reached target timers.target - Timer Units. Mar 3 13:34:55.937416 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. Mar 3 13:34:55.937422 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. Mar 3 13:34:55.937428 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). Mar 3 13:34:55.937434 systemd[1]: Listening on systemd-journald.socket - Journal Sockets. 
Mar 3 13:34:55.937440 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Mar 3 13:34:55.937448 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Mar 3 13:34:55.937455 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Mar 3 13:34:55.937461 systemd[1]: Reached target sockets.target - Socket Units. Mar 3 13:34:55.937467 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... Mar 3 13:34:55.937473 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Mar 3 13:34:55.937479 systemd[1]: Finished network-cleanup.service - Network Cleanup. Mar 3 13:34:55.937485 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply). Mar 3 13:34:55.937507 systemd[1]: Starting systemd-fsck-usr.service... Mar 3 13:34:55.937513 systemd[1]: Starting systemd-journald.service - Journal Service... Mar 3 13:34:55.937522 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Mar 3 13:34:55.937528 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Mar 3 13:34:55.937535 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. Mar 3 13:34:55.937541 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Mar 3 13:34:55.937549 systemd[1]: Finished systemd-fsck-usr.service. Mar 3 13:34:55.937576 systemd-journald[198]: Collecting audit messages is disabled. Mar 3 13:34:55.937596 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Mar 3 13:34:55.937603 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. 
Mar 3 13:34:55.937611 kernel: Bridge firewalling registered Mar 3 13:34:55.937617 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Mar 3 13:34:55.937623 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Mar 3 13:34:55.937630 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Mar 3 13:34:55.937637 systemd-journald[198]: Journal started Mar 3 13:34:55.937652 systemd-journald[198]: Runtime Journal (/run/log/journal/57e25f6eb24a46c697cfa1ebc3b1b784) is 8M, max 76.1M, 68.1M free. Mar 3 13:34:55.893352 systemd-modules-load[200]: Inserted module 'overlay' Mar 3 13:34:55.930246 systemd-modules-load[200]: Inserted module 'br_netfilter' Mar 3 13:34:55.945020 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Mar 3 13:34:55.945049 systemd[1]: Started systemd-journald.service - Journal Service. Mar 3 13:34:55.957787 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Mar 3 13:34:55.961417 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Mar 3 13:34:55.963556 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Mar 3 13:34:55.967243 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Mar 3 13:34:55.975601 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Mar 3 13:34:55.977983 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... Mar 3 13:34:55.981564 systemd-tmpfiles[221]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring. Mar 3 13:34:55.987431 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Mar 3 13:34:55.990187 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. 
Mar 3 13:34:55.993778 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Mar 3 13:34:55.994512 dracut-cmdline[236]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=hetzner verity.usrhash=51ade538e3d3c371f07ae1ec6fa9803fff0566ec060cf4b56dc685fc36d0e01c Mar 3 13:34:56.031857 systemd-resolved[249]: Positive Trust Anchors: Mar 3 13:34:56.032388 systemd-resolved[249]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Mar 3 13:34:56.032410 systemd-resolved[249]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Mar 3 13:34:56.036338 systemd-resolved[249]: Defaulting to hostname 'linux'. Mar 3 13:34:56.037222 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Mar 3 13:34:56.037705 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Mar 3 13:34:56.075691 kernel: SCSI subsystem initialized Mar 3 13:34:56.083690 kernel: Loading iSCSI transport class v2.0-870. 
Mar 3 13:34:56.092701 kernel: iscsi: registered transport (tcp) Mar 3 13:34:56.111239 kernel: iscsi: registered transport (qla4xxx) Mar 3 13:34:56.111286 kernel: QLogic iSCSI HBA Driver Mar 3 13:34:56.138550 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Mar 3 13:34:56.165226 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Mar 3 13:34:56.167227 systemd[1]: Reached target network-pre.target - Preparation for Network. Mar 3 13:34:56.226807 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. Mar 3 13:34:56.230432 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... Mar 3 13:34:56.289711 kernel: raid6: avx512x4 gen() 43498 MB/s Mar 3 13:34:56.307703 kernel: raid6: avx512x2 gen() 45793 MB/s Mar 3 13:34:56.325719 kernel: raid6: avx512x1 gen() 42747 MB/s Mar 3 13:34:56.343732 kernel: raid6: avx2x4 gen() 46287 MB/s Mar 3 13:34:56.361716 kernel: raid6: avx2x2 gen() 47402 MB/s Mar 3 13:34:56.380753 kernel: raid6: avx2x1 gen() 37841 MB/s Mar 3 13:34:56.380808 kernel: raid6: using algorithm avx2x2 gen() 47402 MB/s Mar 3 13:34:56.400826 kernel: raid6: .... xor() 36364 MB/s, rmw enabled Mar 3 13:34:56.400878 kernel: raid6: using avx512x2 recovery algorithm Mar 3 13:34:56.440707 kernel: xor: automatically using best checksumming function avx Mar 3 13:34:56.552722 kernel: Btrfs loaded, zoned=no, fsverity=no Mar 3 13:34:56.562944 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. Mar 3 13:34:56.564536 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Mar 3 13:34:56.588038 systemd-udevd[448]: Using default interface naming scheme 'v255'. Mar 3 13:34:56.593158 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Mar 3 13:34:56.598084 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... 
Mar 3 13:34:56.621864 dracut-pre-trigger[460]: rd.md=0: removing MD RAID activation Mar 3 13:34:56.657749 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. Mar 3 13:34:56.661414 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Mar 3 13:34:56.741281 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Mar 3 13:34:56.743782 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... Mar 3 13:34:56.820688 kernel: cryptd: max_cpu_qlen set to 1000 Mar 3 13:34:56.826681 kernel: libata version 3.00 loaded. Mar 3 13:34:56.841702 kernel: ahci 0000:00:1f.2: version 3.0 Mar 3 13:34:56.841890 kernel: ACPI: \_SB_.GSIA: Enabled at IRQ 16 Mar 3 13:34:56.841902 kernel: input: ImExPS/2 Generic Explorer Mouse as /devices/platform/i8042/serio1/input/input3 Mar 3 13:34:56.848925 kernel: ahci 0000:00:1f.2: AHCI vers 0001.0000, 32 command slots, 1.5 Gbps, SATA mode Mar 3 13:34:56.849088 kernel: ahci 0000:00:1f.2: 6/6 ports implemented (port mask 0x3f) Mar 3 13:34:56.849205 kernel: ahci 0000:00:1f.2: flags: 64bit ncq only Mar 3 13:34:56.860720 kernel: virtio_scsi virtio5: 2/0/0 default/read/poll queues Mar 3 13:34:56.870686 kernel: scsi host1: ahci Mar 3 13:34:56.872679 kernel: scsi host2: ahci Mar 3 13:34:56.876938 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Mar 3 13:34:56.882538 kernel: ACPI: bus type USB registered Mar 3 13:34:56.882552 kernel: usbcore: registered new interface driver usbfs Mar 3 13:34:56.882562 kernel: usbcore: registered new interface driver hub Mar 3 13:34:56.882575 kernel: usbcore: registered new device driver usb Mar 3 13:34:56.882584 kernel: scsi host3: ahci Mar 3 13:34:56.877062 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Mar 3 13:34:56.883303 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... 
Mar 3 13:34:56.884230 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Mar 3 13:34:56.886517 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully. Mar 3 13:34:56.893850 kernel: scsi host0: Virtio SCSI HBA Mar 3 13:34:56.894027 kernel: scsi 0:0:0:0: Direct-Access QEMU QEMU HARDDISK 2.5+ PQ: 0 ANSI: 5 Mar 3 13:34:56.898507 kernel: AES CTR mode by8 optimization enabled Mar 3 13:34:56.899235 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Mar 3 13:34:56.899339 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Mar 3 13:34:56.901850 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Mar 3 13:34:56.907753 kernel: scsi host4: ahci Mar 3 13:34:56.911686 kernel: scsi host5: ahci Mar 3 13:34:56.927695 kernel: scsi host6: ahci Mar 3 13:34:56.940605 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller Mar 3 13:34:56.940825 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 1 Mar 3 13:34:56.944688 kernel: xhci_hcd 0000:02:00.0: hcc params 0x00087001 hci version 0x100 quirks 0x0000000000000010 Mar 3 13:34:56.955396 kernel: ata1: SATA max UDMA/133 abar m4096@0x81380000 port 0x81380100 irq 37 lpm-pol 1 Mar 3 13:34:56.955408 kernel: ata2: SATA max UDMA/133 abar m4096@0x81380000 port 0x81380180 irq 37 lpm-pol 1 Mar 3 13:34:56.955421 kernel: ata3: SATA max UDMA/133 abar m4096@0x81380000 port 0x81380200 irq 37 lpm-pol 1 Mar 3 13:34:56.956177 kernel: ata4: SATA max UDMA/133 abar m4096@0x81380000 port 0x81380280 irq 37 lpm-pol 1 Mar 3 13:34:56.960128 kernel: ata5: SATA max UDMA/133 abar m4096@0x81380000 port 0x81380300 irq 37 lpm-pol 1 Mar 3 13:34:56.960151 kernel: ata6: SATA max UDMA/133 abar m4096@0x81380000 port 0x81380380 irq 37 lpm-pol 1 Mar 3 13:34:56.972313 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller Mar 3 13:34:56.975690 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 2 Mar 3 
13:34:56.979045 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Mar 3 13:34:56.995543 kernel: xhci_hcd 0000:02:00.0: Host supports USB 3.0 SuperSpeed Mar 3 13:34:56.995716 kernel: sd 0:0:0:0: Power-on or device reset occurred Mar 3 13:34:56.995862 kernel: sd 0:0:0:0: [sda] 160006144 512-byte logical blocks: (81.9 GB/76.3 GiB) Mar 3 13:34:56.995995 kernel: hub 1-0:1.0: USB hub found Mar 3 13:34:56.996127 kernel: hub 1-0:1.0: 4 ports detected Mar 3 13:34:56.996243 kernel: usb usb2: We don't know the algorithms for LPM for this host, disabling LPM. Mar 3 13:34:56.996366 kernel: sd 0:0:0:0: [sda] Write Protect is off Mar 3 13:34:56.996500 kernel: hub 2-0:1.0: USB hub found Mar 3 13:34:56.996623 kernel: sd 0:0:0:0: [sda] Mode Sense: 63 00 00 08 Mar 3 13:34:56.996760 kernel: hub 2-0:1.0: 4 ports detected Mar 3 13:34:56.996875 kernel: sd 0:0:0:0: [sda] Write cache: enabled, read cache: enabled, doesn't support DPO or FUA Mar 3 13:34:57.011261 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk. Mar 3 13:34:57.011288 kernel: GPT:17805311 != 160006143 Mar 3 13:34:57.011298 kernel: GPT:Alternate GPT header not at the end of the disk. Mar 3 13:34:57.013007 kernel: GPT:17805311 != 160006143 Mar 3 13:34:57.014171 kernel: GPT: Use GNU Parted to correct GPT errors. 
Mar 3 13:34:57.015708 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Mar 3 13:34:57.018333 kernel: sd 0:0:0:0: [sda] Attached SCSI disk Mar 3 13:34:57.229734 kernel: usb 1-1: new high-speed USB device number 2 using xhci_hcd Mar 3 13:34:57.280227 kernel: ata6: SATA link down (SStatus 0 SControl 300) Mar 3 13:34:57.280304 kernel: ata4: SATA link down (SStatus 0 SControl 300) Mar 3 13:34:57.280717 kernel: ata2: SATA link down (SStatus 0 SControl 300) Mar 3 13:34:57.285721 kernel: ata5: SATA link down (SStatus 0 SControl 300) Mar 3 13:34:57.297800 kernel: ata1: SATA link up 1.5 Gbps (SStatus 113 SControl 300) Mar 3 13:34:57.297875 kernel: ata3: SATA link down (SStatus 0 SControl 300) Mar 3 13:34:57.300721 kernel: ata1.00: LPM support broken, forcing max_power Mar 3 13:34:57.304727 kernel: ata1.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100 Mar 3 13:34:57.309874 kernel: ata1.00: applying bridge limits Mar 3 13:34:57.320333 kernel: ata1.00: LPM support broken, forcing max_power Mar 3 13:34:57.320370 kernel: ata1.00: configured for UDMA/100 Mar 3 13:34:57.327846 kernel: scsi 1:0:0:0: CD-ROM QEMU QEMU DVD-ROM 2.5+ PQ: 0 ANSI: 5 Mar 3 13:34:57.393099 kernel: sr 1:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray Mar 3 13:34:57.393311 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20 Mar 3 13:34:57.400704 kernel: hid: raw HID events driver (C) Jiri Kosina Mar 3 13:34:57.406972 kernel: usbcore: registered new interface driver usbhid Mar 3 13:34:57.406999 kernel: usbhid: USB HID core driver Mar 3 13:34:57.417766 kernel: sr 1:0:0:0: Attached scsi CD-ROM sr0 Mar 3 13:34:57.423712 kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:02.1/0000:02:00.0/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input4 Mar 3 13:34:57.429712 kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:02:00.0-1/input0 Mar 3 13:34:57.430702 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - QEMU_HARDDISK OEM. 
Mar 3 13:34:57.444257 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - QEMU_HARDDISK EFI-SYSTEM.
Mar 3 13:34:57.451230 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - QEMU_HARDDISK ROOT.
Mar 3 13:34:57.457509 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - QEMU_HARDDISK USR-A.
Mar 3 13:34:57.457863 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - QEMU_HARDDISK USR-A.
Mar 3 13:34:57.459907 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary...
Mar 3 13:34:57.483691 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
Mar 3 13:34:57.484721 disk-uuid[652]: Primary Header is updated.
Mar 3 13:34:57.484721 disk-uuid[652]: Secondary Entries is updated.
Mar 3 13:34:57.484721 disk-uuid[652]: Secondary Header is updated.
Mar 3 13:34:57.647159 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook.
Mar 3 13:34:57.647925 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems.
Mar 3 13:34:57.648412 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Mar 3 13:34:57.649035 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Mar 3 13:34:57.650293 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook...
Mar 3 13:34:57.680906 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook.
Mar 3 13:34:58.507725 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
Mar 3 13:34:58.508268 disk-uuid[654]: The operation has completed successfully.
Mar 3 13:34:58.565016 systemd[1]: disk-uuid.service: Deactivated successfully.
Mar 3 13:34:58.565111 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary.
Mar 3 13:34:58.588771 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr...
Mar 3 13:34:58.601358 sh[684]: Success
Mar 3 13:34:58.617670 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Mar 3 13:34:58.617703 kernel: device-mapper: uevent: version 1.0.3
Mar 3 13:34:58.620719 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev
Mar 3 13:34:58.632711 kernel: device-mapper: verity: sha256 using shash "sha256-ni"
Mar 3 13:34:58.677899 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr.
Mar 3 13:34:58.682238 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr...
Mar 3 13:34:58.685271 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr.
Mar 3 13:34:58.701717 kernel: BTRFS: device fsid f550cb98-648e-4600-9237-4b15eb09827b devid 1 transid 41 /dev/mapper/usr (254:0) scanned by mount (696)
Mar 3 13:34:58.701759 kernel: BTRFS info (device dm-0): first mount of filesystem f550cb98-648e-4600-9237-4b15eb09827b
Mar 3 13:34:58.704824 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm
Mar 3 13:34:58.719020 kernel: BTRFS info (device dm-0 state E): enabling ssd optimizations
Mar 3 13:34:58.719078 kernel: BTRFS info (device dm-0 state E): disabling log replay at mount time
Mar 3 13:34:58.719117 kernel: BTRFS info (device dm-0 state E): enabling free space tree
Mar 3 13:34:58.723386 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr.
Mar 3 13:34:58.724964 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System.
Mar 3 13:34:58.726173 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met.
Mar 3 13:34:58.727388 systemd[1]: Starting ignition-setup.service - Ignition (setup)...
Mar 3 13:34:58.729840 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline...
Mar 3 13:34:58.756697 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/sda6 (8:6) scanned by mount (729)
Mar 3 13:34:58.762279 kernel: BTRFS info (device sda6): first mount of filesystem af9be1e8-b0f0-42a3-a696-521642a3b9f8
Mar 3 13:34:58.762303 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm
Mar 3 13:34:58.768768 kernel: BTRFS info (device sda6): enabling ssd optimizations
Mar 3 13:34:58.768792 kernel: BTRFS info (device sda6): turning on async discard
Mar 3 13:34:58.769689 kernel: BTRFS info (device sda6): enabling free space tree
Mar 3 13:34:58.777700 kernel: BTRFS info (device sda6): last unmount of filesystem af9be1e8-b0f0-42a3-a696-521642a3b9f8
Mar 3 13:34:58.778029 systemd[1]: Finished ignition-setup.service - Ignition (setup).
Mar 3 13:34:58.779270 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)...
Mar 3 13:34:58.864656 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Mar 3 13:34:58.864972 ignition[788]: Ignition 2.22.0
Mar 3 13:34:58.864979 ignition[788]: Stage: fetch-offline
Mar 3 13:34:58.865009 ignition[788]: no configs at "/usr/lib/ignition/base.d"
Mar 3 13:34:58.865017 ignition[788]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Mar 3 13:34:58.865087 ignition[788]: parsed url from cmdline: ""
Mar 3 13:34:58.865091 ignition[788]: no config URL provided
Mar 3 13:34:58.865095 ignition[788]: reading system config file "/usr/lib/ignition/user.ign"
Mar 3 13:34:58.868809 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Mar 3 13:34:58.865101 ignition[788]: no config at "/usr/lib/ignition/user.ign"
Mar 3 13:34:58.865106 ignition[788]: failed to fetch config: resource requires networking
Mar 3 13:34:58.870621 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline).
Mar 3 13:34:58.867727 ignition[788]: Ignition finished successfully
Mar 3 13:34:58.899651 systemd-networkd[871]: lo: Link UP
Mar 3 13:34:58.899659 systemd-networkd[871]: lo: Gained carrier
Mar 3 13:34:58.902025 systemd-networkd[871]: Enumeration completed
Mar 3 13:34:58.902168 systemd[1]: Started systemd-networkd.service - Network Configuration.
Mar 3 13:34:58.902731 systemd-networkd[871]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Mar 3 13:34:58.902736 systemd-networkd[871]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
Mar 3 13:34:58.903247 systemd[1]: Reached target network.target - Network.
Mar 3 13:34:58.903737 systemd-networkd[871]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Mar 3 13:34:58.903741 systemd-networkd[871]: eth1: Configuring with /usr/lib/systemd/network/zz-default.network.
Mar 3 13:34:58.904542 systemd-networkd[871]: eth0: Link UP
Mar 3 13:34:58.905235 systemd-networkd[871]: eth1: Link UP
Mar 3 13:34:58.905406 systemd-networkd[871]: eth0: Gained carrier
Mar 3 13:34:58.905563 systemd-networkd[871]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Mar 3 13:34:58.906755 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)...
Mar 3 13:34:58.912162 systemd-networkd[871]: eth1: Gained carrier
Mar 3 13:34:58.912175 systemd-networkd[871]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Mar 3 13:34:58.928093 ignition[875]: Ignition 2.22.0
Mar 3 13:34:58.928104 ignition[875]: Stage: fetch
Mar 3 13:34:58.928203 ignition[875]: no configs at "/usr/lib/ignition/base.d"
Mar 3 13:34:58.928211 ignition[875]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Mar 3 13:34:58.928271 ignition[875]: parsed url from cmdline: ""
Mar 3 13:34:58.928274 ignition[875]: no config URL provided
Mar 3 13:34:58.928278 ignition[875]: reading system config file "/usr/lib/ignition/user.ign"
Mar 3 13:34:58.928285 ignition[875]: no config at "/usr/lib/ignition/user.ign"
Mar 3 13:34:58.928318 ignition[875]: GET http://169.254.169.254/hetzner/v1/userdata: attempt #1
Mar 3 13:34:58.928456 ignition[875]: GET error: Get "http://169.254.169.254/hetzner/v1/userdata": dial tcp 169.254.169.254:80: connect: network is unreachable
Mar 3 13:34:58.932705 systemd-networkd[871]: eth1: DHCPv4 address 10.0.0.3/32 acquired from 10.0.0.1
Mar 3 13:34:58.965709 systemd-networkd[871]: eth0: DHCPv4 address 89.167.115.149/32, gateway 172.31.1.1 acquired from 172.31.1.1
Mar 3 13:34:59.129616 ignition[875]: GET http://169.254.169.254/hetzner/v1/userdata: attempt #2
Mar 3 13:34:59.135215 ignition[875]: GET result: OK
Mar 3 13:34:59.135321 ignition[875]: parsing config with SHA512: 067f578cace99e62c53f6ea197612db189a6ecf5793cca59549faf9ffc85999cedcba0703e3d14b26f234ba4edbfd99e079b05d22268268d0ef09150078cdd3b
Mar 3 13:34:59.144100 unknown[875]: fetched base config from "system"
Mar 3 13:34:59.144601 ignition[875]: fetch: fetch complete
Mar 3 13:34:59.144126 unknown[875]: fetched base config from "system"
Mar 3 13:34:59.144611 ignition[875]: fetch: fetch passed
Mar 3 13:34:59.144138 unknown[875]: fetched user config from "hetzner"
Mar 3 13:34:59.144876 ignition[875]: Ignition finished successfully
Mar 3 13:34:59.150832 systemd[1]: Finished ignition-fetch.service - Ignition (fetch).
Mar 3 13:34:59.153744 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)...
Mar 3 13:34:59.209354 ignition[883]: Ignition 2.22.0
Mar 3 13:34:59.209376 ignition[883]: Stage: kargs
Mar 3 13:34:59.209616 ignition[883]: no configs at "/usr/lib/ignition/base.d"
Mar 3 13:34:59.209639 ignition[883]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Mar 3 13:34:59.210830 ignition[883]: kargs: kargs passed
Mar 3 13:34:59.210898 ignition[883]: Ignition finished successfully
Mar 3 13:34:59.214639 systemd[1]: Finished ignition-kargs.service - Ignition (kargs).
Mar 3 13:34:59.218725 systemd[1]: Starting ignition-disks.service - Ignition (disks)...
Mar 3 13:34:59.254208 ignition[890]: Ignition 2.22.0
Mar 3 13:34:59.254220 ignition[890]: Stage: disks
Mar 3 13:34:59.254339 ignition[890]: no configs at "/usr/lib/ignition/base.d"
Mar 3 13:34:59.254350 ignition[890]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Mar 3 13:34:59.255005 ignition[890]: disks: disks passed
Mar 3 13:34:59.255043 ignition[890]: Ignition finished successfully
Mar 3 13:34:59.257988 systemd[1]: Finished ignition-disks.service - Ignition (disks).
Mar 3 13:34:59.259104 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device.
Mar 3 13:34:59.259791 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems.
Mar 3 13:34:59.260147 systemd[1]: Reached target local-fs.target - Local File Systems.
Mar 3 13:34:59.261155 systemd[1]: Reached target sysinit.target - System Initialization.
Mar 3 13:34:59.262177 systemd[1]: Reached target basic.target - Basic System.
Mar 3 13:34:59.264040 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT...
Mar 3 13:34:59.300224 systemd-fsck[899]: ROOT: clean, 15/1628000 files, 120826/1617920 blocks
Mar 3 13:34:59.305019 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT.
Mar 3 13:34:59.308337 systemd[1]: Mounting sysroot.mount - /sysroot...
Mar 3 13:34:59.406686 kernel: EXT4-fs (sda9): mounted filesystem f0c751de-febc-4e57-b330-c926d38ed5ec r/w with ordered data mode. Quota mode: none.
Mar 3 13:34:59.407785 systemd[1]: Mounted sysroot.mount - /sysroot.
Mar 3 13:34:59.409600 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System.
Mar 3 13:34:59.412905 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Mar 3 13:34:59.416073 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr...
Mar 3 13:34:59.425826 systemd[1]: Starting flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent...
Mar 3 13:34:59.426848 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot).
Mar 3 13:34:59.426894 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup.
Mar 3 13:34:59.430481 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr.
Mar 3 13:34:59.438688 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/sda6 (8:6) scanned by mount (907)
Mar 3 13:34:59.442133 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup...
Mar 3 13:34:59.450447 kernel: BTRFS info (device sda6): first mount of filesystem af9be1e8-b0f0-42a3-a696-521642a3b9f8
Mar 3 13:34:59.450499 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm
Mar 3 13:34:59.461298 kernel: BTRFS info (device sda6): enabling ssd optimizations
Mar 3 13:34:59.461323 kernel: BTRFS info (device sda6): turning on async discard
Mar 3 13:34:59.461333 kernel: BTRFS info (device sda6): enabling free space tree
Mar 3 13:34:59.470969 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Mar 3 13:34:59.488879 coreos-metadata[909]: Mar 03 13:34:59.488 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/hostname: Attempt #1
Mar 3 13:34:59.490415 coreos-metadata[909]: Mar 03 13:34:59.490 INFO Fetch successful
Mar 3 13:34:59.491336 coreos-metadata[909]: Mar 03 13:34:59.491 INFO wrote hostname ci-4459-2-4-a-1c48930dd2 to /sysroot/etc/hostname
Mar 3 13:34:59.493855 systemd[1]: Finished flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent.
Mar 3 13:34:59.503089 initrd-setup-root[935]: cut: /sysroot/etc/passwd: No such file or directory
Mar 3 13:34:59.507538 initrd-setup-root[942]: cut: /sysroot/etc/group: No such file or directory
Mar 3 13:34:59.511616 initrd-setup-root[949]: cut: /sysroot/etc/shadow: No such file or directory
Mar 3 13:34:59.515427 initrd-setup-root[956]: cut: /sysroot/etc/gshadow: No such file or directory
Mar 3 13:34:59.591004 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup.
Mar 3 13:34:59.592274 systemd[1]: Starting ignition-mount.service - Ignition (mount)...
Mar 3 13:34:59.593512 systemd[1]: Starting sysroot-boot.service - /sysroot/boot...
Mar 3 13:34:59.612697 kernel: BTRFS info (device sda6): last unmount of filesystem af9be1e8-b0f0-42a3-a696-521642a3b9f8
Mar 3 13:34:59.624389 systemd[1]: Finished sysroot-boot.service - /sysroot/boot.
Mar 3 13:34:59.638140 ignition[1024]: INFO : Ignition 2.22.0
Mar 3 13:34:59.638140 ignition[1024]: INFO : Stage: mount
Mar 3 13:34:59.639023 ignition[1024]: INFO : no configs at "/usr/lib/ignition/base.d"
Mar 3 13:34:59.639023 ignition[1024]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Mar 3 13:34:59.639023 ignition[1024]: INFO : mount: mount passed
Mar 3 13:34:59.639023 ignition[1024]: INFO : Ignition finished successfully
Mar 3 13:34:59.640411 systemd[1]: Finished ignition-mount.service - Ignition (mount).
Mar 3 13:34:59.641614 systemd[1]: Starting ignition-files.service - Ignition (files)...
Mar 3 13:34:59.699109 systemd[1]: sysroot-oem.mount: Deactivated successfully.
Mar 3 13:34:59.700353 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Mar 3 13:34:59.742709 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/sda6 (8:6) scanned by mount (1035)
Mar 3 13:34:59.750222 kernel: BTRFS info (device sda6): first mount of filesystem af9be1e8-b0f0-42a3-a696-521642a3b9f8
Mar 3 13:34:59.750247 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm
Mar 3 13:34:59.765531 kernel: BTRFS info (device sda6): enabling ssd optimizations
Mar 3 13:34:59.765557 kernel: BTRFS info (device sda6): turning on async discard
Mar 3 13:34:59.765566 kernel: BTRFS info (device sda6): enabling free space tree
Mar 3 13:34:59.770900 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Mar 3 13:34:59.817044 ignition[1051]: INFO : Ignition 2.22.0
Mar 3 13:34:59.817044 ignition[1051]: INFO : Stage: files
Mar 3 13:34:59.818601 ignition[1051]: INFO : no configs at "/usr/lib/ignition/base.d"
Mar 3 13:34:59.818601 ignition[1051]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Mar 3 13:34:59.819825 ignition[1051]: DEBUG : files: compiled without relabeling support, skipping
Mar 3 13:34:59.820549 ignition[1051]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core"
Mar 3 13:34:59.820549 ignition[1051]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core"
Mar 3 13:34:59.823948 ignition[1051]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core"
Mar 3 13:34:59.825001 ignition[1051]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core"
Mar 3 13:34:59.826022 ignition[1051]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core"
Mar 3 13:34:59.825639 unknown[1051]: wrote ssh authorized keys file for user: core
Mar 3 13:34:59.828493 ignition[1051]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.17.3-linux-amd64.tar.gz"
Mar 3 13:34:59.830065 ignition[1051]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.17.3-linux-amd64.tar.gz: attempt #1
Mar 3 13:34:59.973020 ignition[1051]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK
Mar 3 13:35:00.282858 ignition[1051]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.3-linux-amd64.tar.gz"
Mar 3 13:35:00.282858 ignition[1051]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh"
Mar 3 13:35:00.285785 ignition[1051]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh"
Mar 3 13:35:00.285785 ignition[1051]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml"
Mar 3 13:35:00.285785 ignition[1051]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml"
Mar 3 13:35:00.285785 ignition[1051]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml"
Mar 3 13:35:00.285785 ignition[1051]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml"
Mar 3 13:35:00.285785 ignition[1051]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml"
Mar 3 13:35:00.285785 ignition[1051]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml"
Mar 3 13:35:00.291112 ignition[1051]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf"
Mar 3 13:35:00.291112 ignition[1051]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf"
Mar 3 13:35:00.291112 ignition[1051]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.34.4-x86-64.raw"
Mar 3 13:35:00.291112 ignition[1051]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.34.4-x86-64.raw"
Mar 3 13:35:00.291112 ignition[1051]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.34.4-x86-64.raw"
Mar 3 13:35:00.291112 ignition[1051]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.34.4-x86-64.raw: attempt #1
Mar 3 13:35:00.448890 systemd-networkd[871]: eth0: Gained IPv6LL
Mar 3 13:35:00.449440 systemd-networkd[871]: eth1: Gained IPv6LL
Mar 3 13:35:00.841053 ignition[1051]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK
Mar 3 13:35:01.146426 ignition[1051]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.34.4-x86-64.raw"
Mar 3 13:35:01.146426 ignition[1051]: INFO : files: op(b): [started] processing unit "prepare-helm.service"
Mar 3 13:35:01.148442 ignition[1051]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Mar 3 13:35:01.150741 ignition[1051]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Mar 3 13:35:01.150741 ignition[1051]: INFO : files: op(b): [finished] processing unit "prepare-helm.service"
Mar 3 13:35:01.150741 ignition[1051]: INFO : files: op(d): [started] processing unit "coreos-metadata.service"
Mar 3 13:35:01.154858 ignition[1051]: INFO : files: op(d): op(e): [started] writing systemd drop-in "00-custom-metadata.conf" at "/sysroot/etc/systemd/system/coreos-metadata.service.d/00-custom-metadata.conf"
Mar 3 13:35:01.154858 ignition[1051]: INFO : files: op(d): op(e): [finished] writing systemd drop-in "00-custom-metadata.conf" at "/sysroot/etc/systemd/system/coreos-metadata.service.d/00-custom-metadata.conf"
Mar 3 13:35:01.154858 ignition[1051]: INFO : files: op(d): [finished] processing unit "coreos-metadata.service"
Mar 3 13:35:01.154858 ignition[1051]: INFO : files: op(f): [started] setting preset to enabled for "prepare-helm.service"
Mar 3 13:35:01.154858 ignition[1051]: INFO : files: op(f): [finished] setting preset to enabled for "prepare-helm.service"
Mar 3 13:35:01.154858 ignition[1051]: INFO : files: createResultFile: createFiles: op(10): [started] writing file "/sysroot/etc/.ignition-result.json"
Mar 3 13:35:01.154858 ignition[1051]: INFO : files: createResultFile: createFiles: op(10): [finished] writing file "/sysroot/etc/.ignition-result.json"
Mar 3 13:35:01.154858 ignition[1051]: INFO : files: files passed
Mar 3 13:35:01.154858 ignition[1051]: INFO : Ignition finished successfully
Mar 3 13:35:01.155075 systemd[1]: Finished ignition-files.service - Ignition (files).
Mar 3 13:35:01.157703 systemd[1]: Starting ignition-quench.service - Ignition (record completion)...
Mar 3 13:35:01.162754 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion...
Mar 3 13:35:01.167640 systemd[1]: ignition-quench.service: Deactivated successfully.
Mar 3 13:35:01.167733 systemd[1]: Finished ignition-quench.service - Ignition (record completion).
Mar 3 13:35:01.175682 initrd-setup-root-after-ignition[1081]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Mar 3 13:35:01.175682 initrd-setup-root-after-ignition[1081]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory
Mar 3 13:35:01.177485 initrd-setup-root-after-ignition[1085]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Mar 3 13:35:01.178895 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion.
Mar 3 13:35:01.179998 systemd[1]: Reached target ignition-complete.target - Ignition Complete.
Mar 3 13:35:01.181428 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root...
Mar 3 13:35:01.229378 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Mar 3 13:35:01.229486 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root.
Mar 3 13:35:01.230619 systemd[1]: Reached target initrd-fs.target - Initrd File Systems.
Mar 3 13:35:01.231208 systemd[1]: Reached target initrd.target - Initrd Default Target.
Mar 3 13:35:01.232166 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met.
Mar 3 13:35:01.233786 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook...
Mar 3 13:35:01.248647 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Mar 3 13:35:01.250049 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons...
Mar 3 13:35:01.263315 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups.
Mar 3 13:35:01.263834 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes.
Mar 3 13:35:01.264602 systemd[1]: Stopped target timers.target - Timer Units.
Mar 3 13:35:01.265382 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Mar 3 13:35:01.265493 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Mar 3 13:35:01.266492 systemd[1]: Stopped target initrd.target - Initrd Default Target.
Mar 3 13:35:01.267275 systemd[1]: Stopped target basic.target - Basic System.
Mar 3 13:35:01.268005 systemd[1]: Stopped target ignition-complete.target - Ignition Complete.
Mar 3 13:35:01.268713 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup.
Mar 3 13:35:01.269381 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device.
Mar 3 13:35:01.270091 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System.
Mar 3 13:35:01.270809 systemd[1]: Stopped target remote-fs.target - Remote File Systems.
Mar 3 13:35:01.271515 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems.
Mar 3 13:35:01.272213 systemd[1]: Stopped target sysinit.target - System Initialization.
Mar 3 13:35:01.273017 systemd[1]: Stopped target local-fs.target - Local File Systems.
Mar 3 13:35:01.273698 systemd[1]: Stopped target swap.target - Swaps.
Mar 3 13:35:01.274424 systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Mar 3 13:35:01.274535 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook.
Mar 3 13:35:01.275498 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes.
Mar 3 13:35:01.276236 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Mar 3 13:35:01.276865 systemd[1]: clevis-luks-askpass.path: Deactivated successfully.
Mar 3 13:35:01.276935 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Mar 3 13:35:01.277621 systemd[1]: dracut-initqueue.service: Deactivated successfully.
Mar 3 13:35:01.277760 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook.
Mar 3 13:35:01.278708 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully.
Mar 3 13:35:01.278789 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion.
Mar 3 13:35:01.279440 systemd[1]: ignition-files.service: Deactivated successfully.
Mar 3 13:35:01.279545 systemd[1]: Stopped ignition-files.service - Ignition (files).
Mar 3 13:35:01.280131 systemd[1]: flatcar-metadata-hostname.service: Deactivated successfully.
Mar 3 13:35:01.280222 systemd[1]: Stopped flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent.
Mar 3 13:35:01.281369 systemd[1]: Stopping ignition-mount.service - Ignition (mount)...
Mar 3 13:35:01.283788 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot...
Mar 3 13:35:01.284859 systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Mar 3 13:35:01.284944 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices.
Mar 3 13:35:01.286059 systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Mar 3 13:35:01.286129 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook.
Mar 3 13:35:01.290500 systemd[1]: initrd-cleanup.service: Deactivated successfully.
Mar 3 13:35:01.291026 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons.
Mar 3 13:35:01.305298 ignition[1105]: INFO : Ignition 2.22.0
Mar 3 13:35:01.305298 ignition[1105]: INFO : Stage: umount
Mar 3 13:35:01.307707 ignition[1105]: INFO : no configs at "/usr/lib/ignition/base.d"
Mar 3 13:35:01.307707 ignition[1105]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Mar 3 13:35:01.307707 ignition[1105]: INFO : umount: umount passed
Mar 3 13:35:01.307707 ignition[1105]: INFO : Ignition finished successfully
Mar 3 13:35:01.307286 systemd[1]: ignition-mount.service: Deactivated successfully.
Mar 3 13:35:01.307392 systemd[1]: Stopped ignition-mount.service - Ignition (mount).
Mar 3 13:35:01.308212 systemd[1]: ignition-disks.service: Deactivated successfully.
Mar 3 13:35:01.308289 systemd[1]: Stopped ignition-disks.service - Ignition (disks).
Mar 3 13:35:01.311060 systemd[1]: ignition-kargs.service: Deactivated successfully.
Mar 3 13:35:01.311099 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs).
Mar 3 13:35:01.312740 systemd[1]: ignition-fetch.service: Deactivated successfully.
Mar 3 13:35:01.312784 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch).
Mar 3 13:35:01.313253 systemd[1]: Stopped target network.target - Network.
Mar 3 13:35:01.313991 systemd[1]: ignition-fetch-offline.service: Deactivated successfully.
Mar 3 13:35:01.314045 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline).
Mar 3 13:35:01.314786 systemd[1]: Stopped target paths.target - Path Units.
Mar 3 13:35:01.315424 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Mar 3 13:35:01.319717 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Mar 3 13:35:01.320083 systemd[1]: Stopped target slices.target - Slice Units.
Mar 3 13:35:01.320727 systemd[1]: Stopped target sockets.target - Socket Units.
Mar 3 13:35:01.321359 systemd[1]: iscsid.socket: Deactivated successfully.
Mar 3 13:35:01.321394 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket.
Mar 3 13:35:01.321962 systemd[1]: iscsiuio.socket: Deactivated successfully.
Mar 3 13:35:01.321995 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Mar 3 13:35:01.322747 systemd[1]: ignition-setup.service: Deactivated successfully.
Mar 3 13:35:01.322792 systemd[1]: Stopped ignition-setup.service - Ignition (setup).
Mar 3 13:35:01.323354 systemd[1]: ignition-setup-pre.service: Deactivated successfully.
Mar 3 13:35:01.323388 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup.
Mar 3 13:35:01.324051 systemd[1]: Stopping systemd-networkd.service - Network Configuration...
Mar 3 13:35:01.324683 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution...
Mar 3 13:35:01.326481 systemd[1]: sysroot-boot.mount: Deactivated successfully.
Mar 3 13:35:01.330587 systemd[1]: systemd-resolved.service: Deactivated successfully.
Mar 3 13:35:01.330708 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution.
Mar 3 13:35:01.333807 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully.
Mar 3 13:35:01.334046 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Mar 3 13:35:01.334083 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories.
Mar 3 13:35:01.335523 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Mar 3 13:35:01.339054 systemd[1]: systemd-networkd.service: Deactivated successfully.
Mar 3 13:35:01.339150 systemd[1]: Stopped systemd-networkd.service - Network Configuration.
Mar 3 13:35:01.340563 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully.
Mar 3 13:35:01.340741 systemd[1]: Stopped target network-pre.target - Preparation for Network.
Mar 3 13:35:01.341261 systemd[1]: systemd-networkd.socket: Deactivated successfully.
Mar 3 13:35:01.341293 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket.
Mar 3 13:35:01.343724 systemd[1]: Stopping network-cleanup.service - Network Cleanup...
Mar 3 13:35:01.344042 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully.
Mar 3 13:35:01.344083 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Mar 3 13:35:01.344425 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Mar 3 13:35:01.344468 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables.
Mar 3 13:35:01.348598 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Mar 3 13:35:01.348651 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules.
Mar 3 13:35:01.349483 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files...
Mar 3 13:35:01.351084 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Mar 3 13:35:01.364574 systemd[1]: sysroot-boot.service: Deactivated successfully.
Mar 3 13:35:01.364752 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot.
Mar 3 13:35:01.366654 systemd[1]: initrd-setup-root.service: Deactivated successfully.
Mar 3 13:35:01.366738 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup.
Mar 3 13:35:01.367810 systemd[1]: systemd-udevd.service: Deactivated successfully.
Mar 3 13:35:01.367947 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files.
Mar 3 13:35:01.368837 systemd[1]: network-cleanup.service: Deactivated successfully.
Mar 3 13:35:01.368918 systemd[1]: Stopped network-cleanup.service - Network Cleanup.
Mar 3 13:35:01.370106 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Mar 3 13:35:01.370167 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket.
Mar 3 13:35:01.370856 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Mar 3 13:35:01.370885 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket.
Mar 3 13:35:01.371451 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Mar 3 13:35:01.371502 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook.
Mar 3 13:35:01.372473 systemd[1]: dracut-cmdline.service: Deactivated successfully.
Mar 3 13:35:01.372511 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook.
Mar 3 13:35:01.373478 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
Mar 3 13:35:01.373515 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Mar 3 13:35:01.375746 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database...
Mar 3 13:35:01.376095 systemd[1]: systemd-network-generator.service: Deactivated successfully.
Mar 3 13:35:01.376136 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line.
Mar 3 13:35:01.378568 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Mar 3 13:35:01.378610 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Mar 3 13:35:01.379788 systemd[1]: systemd-tmpfiles-setup-dev-early.service: Deactivated successfully.
Mar 3 13:35:01.379832 systemd[1]: Stopped systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Mar 3 13:35:01.380586 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Mar 3 13:35:01.380623 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes.
Mar 3 13:35:01.381210 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Mar 3 13:35:01.381244 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Mar 3 13:35:01.387469 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Mar 3 13:35:01.387558 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database.
Mar 3 13:35:01.388108 systemd[1]: Reached target initrd-switch-root.target - Switch Root.
Mar 3 13:35:01.389154 systemd[1]: Starting initrd-switch-root.service - Switch Root...
Mar 3 13:35:01.414313 systemd[1]: Switching root.
Mar 3 13:35:01.442711 systemd-journald[198]: Received SIGTERM from PID 1 (systemd).
Mar 3 13:35:01.442766 systemd-journald[198]: Journal stopped
Mar 3 13:35:02.493237 kernel: SELinux: policy capability network_peer_controls=1
Mar 3 13:35:02.493308 kernel: SELinux: policy capability open_perms=1
Mar 3 13:35:02.493318 kernel: SELinux: policy capability extended_socket_class=1
Mar 3 13:35:02.493327 kernel: SELinux: policy capability always_check_network=0
Mar 3 13:35:02.493336 kernel: SELinux: policy capability cgroup_seclabel=1
Mar 3 13:35:02.493351 kernel: SELinux: policy capability nnp_nosuid_transition=1
Mar 3 13:35:02.493360 kernel: SELinux: policy capability genfs_seclabel_symlinks=0
Mar 3 13:35:02.493368 kernel: SELinux: policy capability ioctl_skip_cloexec=0
Mar 3 13:35:02.493377 kernel: SELinux: policy capability userspace_initial_context=0
Mar 3 13:35:02.493385 kernel: audit: type=1403 audit(1772544901.582:2): auid=4294967295 ses=4294967295 lsm=selinux res=1
Mar 3 13:35:02.493395 systemd[1]: Successfully loaded SELinux policy in 65.271ms.
Mar 3 13:35:02.493421 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 6.490ms.
Mar 3 13:35:02.493430 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE)
Mar 3 13:35:02.493439 systemd[1]: Detected virtualization kvm.
Mar 3 13:35:02.493450 systemd[1]: Detected architecture x86-64.
Mar 3 13:35:02.493468 systemd[1]: Detected first boot.
Mar 3 13:35:02.493477 systemd[1]: Hostname set to .
Mar 3 13:35:02.493487 systemd[1]: Initializing machine ID from VM UUID.
Mar 3 13:35:02.493496 zram_generator::config[1152]: No configuration found.
Mar 3 13:35:02.493506 kernel: Guest personality initialized and is inactive
Mar 3 13:35:02.493519 kernel: VMCI host device registered (name=vmci, major=10, minor=258)
Mar 3 13:35:02.493527 kernel: Initialized host personality
Mar 3 13:35:02.493538 kernel: NET: Registered PF_VSOCK protocol family
Mar 3 13:35:02.493547 systemd[1]: Populated /etc with preset unit settings.
Mar 3 13:35:02.493556 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully.
Mar 3 13:35:02.493565 systemd[1]: initrd-switch-root.service: Deactivated successfully.
Mar 3 13:35:02.493573 systemd[1]: Stopped initrd-switch-root.service - Switch Root.
Mar 3 13:35:02.493582 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Mar 3 13:35:02.493595 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config.
Mar 3 13:35:02.493608 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run.
Mar 3 13:35:02.493617 systemd[1]: Created slice system-getty.slice - Slice /system/getty.
Mar 3 13:35:02.493626 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe.
Mar 3 13:35:02.493635 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty.
Mar 3 13:35:02.493644 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit.
Mar 3 13:35:02.493653 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck.
Mar 3 13:35:02.493671 systemd[1]: Created slice user.slice - User and Session Slice.
Mar 3 13:35:02.493680 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Mar 3 13:35:02.493689 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Mar 3 13:35:02.493701 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch.
Mar 3 13:35:02.493710 systemd[1]: Set up automount boot.automount - Boot partition Automount Point.
Mar 3 13:35:02.493720 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point.
Mar 3 13:35:02.493729 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Mar 3 13:35:02.493738 systemd[1]: Expecting device dev-ttyS0.device - /dev/ttyS0...
Mar 3 13:35:02.493747 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Mar 3 13:35:02.493758 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Mar 3 13:35:02.493767 systemd[1]: Stopped target initrd-switch-root.target - Switch Root.
Mar 3 13:35:02.493775 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems.
Mar 3 13:35:02.493784 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System.
Mar 3 13:35:02.493793 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes.
Mar 3 13:35:02.493802 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Mar 3 13:35:02.493811 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Mar 3 13:35:02.493820 systemd[1]: Reached target slices.target - Slice Units.
Mar 3 13:35:02.493829 systemd[1]: Reached target swap.target - Swaps.
Mar 3 13:35:02.493838 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes.
Mar 3 13:35:02.493848 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket.
Mar 3 13:35:02.493857 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption.
Mar 3 13:35:02.493866 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Mar 3 13:35:02.493875 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Mar 3 13:35:02.493884 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Mar 3 13:35:02.493894 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket.
Mar 3 13:35:02.493905 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System...
Mar 3 13:35:02.493914 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System...
Mar 3 13:35:02.493928 systemd[1]: Mounting media.mount - External Media Directory...
Mar 3 13:35:02.493937 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Mar 3 13:35:02.493945 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System...
Mar 3 13:35:02.493954 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System...
Mar 3 13:35:02.493963 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp...
Mar 3 13:35:02.493972 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Mar 3 13:35:02.493981 systemd[1]: Reached target machines.target - Containers.
Mar 3 13:35:02.493990 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files...
Mar 3 13:35:02.493999 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Mar 3 13:35:02.494010 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Mar 3 13:35:02.494019 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs...
Mar 3 13:35:02.494028 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Mar 3 13:35:02.494037 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Mar 3 13:35:02.494046 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Mar 3 13:35:02.494054 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse...
Mar 3 13:35:02.494063 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Mar 3 13:35:02.494072 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf).
Mar 3 13:35:02.494083 systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Mar 3 13:35:02.494092 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device.
Mar 3 13:35:02.494101 systemd[1]: systemd-fsck-usr.service: Deactivated successfully.
Mar 3 13:35:02.494110 systemd[1]: Stopped systemd-fsck-usr.service.
Mar 3 13:35:02.494119 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Mar 3 13:35:02.494128 systemd[1]: Starting systemd-journald.service - Journal Service...
Mar 3 13:35:02.494139 kernel: loop: module loaded
Mar 3 13:35:02.494148 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Mar 3 13:35:02.494159 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
Mar 3 13:35:02.494171 kernel: ACPI: bus type drm_connector registered
Mar 3 13:35:02.494182 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems...
Mar 3 13:35:02.494191 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials...
Mar 3 13:35:02.494200 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Mar 3 13:35:02.494209 systemd[1]: verity-setup.service: Deactivated successfully.
Mar 3 13:35:02.494218 systemd[1]: Stopped verity-setup.service.
Mar 3 13:35:02.494246 systemd-journald[1236]: Collecting audit messages is disabled.
Mar 3 13:35:02.494272 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Mar 3 13:35:02.494287 systemd-journald[1236]: Journal started
Mar 3 13:35:02.494308 systemd-journald[1236]: Runtime Journal (/run/log/journal/57e25f6eb24a46c697cfa1ebc3b1b784) is 8M, max 76.1M, 68.1M free.
Mar 3 13:35:02.171728 systemd[1]: Queued start job for default target multi-user.target.
Mar 3 13:35:02.194993 systemd[1]: Unnecessary job was removed for dev-sda6.device - /dev/sda6.
Mar 3 13:35:02.195488 systemd[1]: systemd-journald.service: Deactivated successfully.
Mar 3 13:35:02.509685 systemd[1]: Started systemd-journald.service - Journal Service.
Mar 3 13:35:02.509723 kernel: fuse: init (API version 7.41)
Mar 3 13:35:02.509417 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System.
Mar 3 13:35:02.510108 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System.
Mar 3 13:35:02.510614 systemd[1]: Mounted media.mount - External Media Directory.
Mar 3 13:35:02.511425 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System.
Mar 3 13:35:02.512032 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System.
Mar 3 13:35:02.512485 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp.
Mar 3 13:35:02.513152 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files.
Mar 3 13:35:02.513862 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Mar 3 13:35:02.514514 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Mar 3 13:35:02.514829 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs.
Mar 3 13:35:02.515492 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Mar 3 13:35:02.515646 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Mar 3 13:35:02.516414 systemd[1]: modprobe@drm.service: Deactivated successfully.
Mar 3 13:35:02.516619 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Mar 3 13:35:02.517266 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Mar 3 13:35:02.517472 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Mar 3 13:35:02.518248 systemd[1]: modprobe@fuse.service: Deactivated successfully.
Mar 3 13:35:02.518442 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse.
Mar 3 13:35:02.519097 systemd[1]: modprobe@loop.service: Deactivated successfully.
Mar 3 13:35:02.519250 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Mar 3 13:35:02.520094 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Mar 3 13:35:02.520745 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Mar 3 13:35:02.521377 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems.
Mar 3 13:35:02.522031 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials.
Mar 3 13:35:02.532986 systemd[1]: Reached target network-pre.target - Preparation for Network.
Mar 3 13:35:02.537726 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System...
Mar 3 13:35:02.539782 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System...
Mar 3 13:35:02.540590 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/).
Mar 3 13:35:02.540709 systemd[1]: Reached target local-fs.target - Local File Systems.
Mar 3 13:35:02.542166 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management.
Mar 3 13:35:02.548110 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache...
Mar 3 13:35:02.548581 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Mar 3 13:35:02.551031 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database...
Mar 3 13:35:02.554802 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage...
Mar 3 13:35:02.555736 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Mar 3 13:35:02.559753 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed...
Mar 3 13:35:02.560131 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Mar 3 13:35:02.561753 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Mar 3 13:35:02.564061 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/...
Mar 3 13:35:02.574239 systemd-journald[1236]: Time spent on flushing to /var/log/journal/57e25f6eb24a46c697cfa1ebc3b1b784 is 28.180ms for 1240 entries.
Mar 3 13:35:02.574239 systemd-journald[1236]: System Journal (/var/log/journal/57e25f6eb24a46c697cfa1ebc3b1b784) is 8M, max 584.8M, 576.8M free.
Mar 3 13:35:02.620119 systemd-journald[1236]: Received client request to flush runtime journal.
Mar 3 13:35:02.569935 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
Mar 3 13:35:02.573112 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System.
Mar 3 13:35:02.573865 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System.
Mar 3 13:35:02.621482 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage.
Mar 3 13:35:02.629690 kernel: loop0: detected capacity change from 0 to 128560
Mar 3 13:35:02.641051 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed.
Mar 3 13:35:02.643053 systemd[1]: Reached target first-boot-complete.target - First Boot Complete.
Mar 3 13:35:02.647912 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk...
Mar 3 13:35:02.649987 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Mar 3 13:35:02.658884 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher
Mar 3 13:35:02.664215 systemd-tmpfiles[1274]: ACLs are not supported, ignoring.
Mar 3 13:35:02.664228 systemd-tmpfiles[1274]: ACLs are not supported, ignoring.
Mar 3 13:35:02.677115 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Mar 3 13:35:02.683573 kernel: loop1: detected capacity change from 0 to 110984
Mar 3 13:35:02.682939 systemd[1]: Starting systemd-sysusers.service - Create System Users...
Mar 3 13:35:02.683570 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Mar 3 13:35:02.699435 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk.
Mar 3 13:35:02.720745 kernel: loop2: detected capacity change from 0 to 8
Mar 3 13:35:02.728057 systemd[1]: Finished systemd-sysusers.service - Create System Users.
Mar 3 13:35:02.733197 kernel: loop3: detected capacity change from 0 to 219192
Mar 3 13:35:02.736256 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Mar 3 13:35:02.778805 kernel: loop4: detected capacity change from 0 to 128560
Mar 3 13:35:02.778184 systemd-tmpfiles[1298]: ACLs are not supported, ignoring.
Mar 3 13:35:02.778194 systemd-tmpfiles[1298]: ACLs are not supported, ignoring.
Mar 3 13:35:02.788795 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Mar 3 13:35:02.797822 kernel: loop5: detected capacity change from 0 to 110984
Mar 3 13:35:02.814698 kernel: loop6: detected capacity change from 0 to 8
Mar 3 13:35:02.818687 kernel: loop7: detected capacity change from 0 to 219192
Mar 3 13:35:02.843644 (sd-merge)[1301]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-hetzner'.
Mar 3 13:35:02.844306 (sd-merge)[1301]: Merged extensions into '/usr'.
Mar 3 13:35:02.852920 systemd[1]: Reload requested from client PID 1273 ('systemd-sysext') (unit systemd-sysext.service)...
Mar 3 13:35:02.852945 systemd[1]: Reloading...
Mar 3 13:35:02.938701 zram_generator::config[1328]: No configuration found.
Mar 3 13:35:03.011320 ldconfig[1268]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start.
Mar 3 13:35:03.109436 systemd[1]: etc-machine\x2did.mount: Deactivated successfully.
Mar 3 13:35:03.109843 systemd[1]: Reloading finished in 256 ms.
Mar 3 13:35:03.135147 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache.
Mar 3 13:35:03.135889 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database.
Mar 3 13:35:03.138995 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/.
Mar 3 13:35:03.143922 systemd[1]: Starting ensure-sysext.service...
Mar 3 13:35:03.147764 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Mar 3 13:35:03.153403 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Mar 3 13:35:03.164569 systemd[1]: Reload requested from client PID 1372 ('systemctl') (unit ensure-sysext.service)...
Mar 3 13:35:03.164643 systemd[1]: Reloading...
Mar 3 13:35:03.176173 systemd-tmpfiles[1373]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring.
Mar 3 13:35:03.176407 systemd-tmpfiles[1373]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring.
Mar 3 13:35:03.176903 systemd-tmpfiles[1373]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring.
Mar 3 13:35:03.177263 systemd-tmpfiles[1373]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring.
Mar 3 13:35:03.178725 systemd-tmpfiles[1373]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring.
Mar 3 13:35:03.178989 systemd-tmpfiles[1373]: ACLs are not supported, ignoring.
Mar 3 13:35:03.179081 systemd-tmpfiles[1373]: ACLs are not supported, ignoring.
Mar 3 13:35:03.186884 systemd-tmpfiles[1373]: Detected autofs mount point /boot during canonicalization of boot.
Mar 3 13:35:03.187753 systemd-tmpfiles[1373]: Skipping /boot
Mar 3 13:35:03.201550 systemd-udevd[1374]: Using default interface naming scheme 'v255'.
Mar 3 13:35:03.210366 systemd-tmpfiles[1373]: Detected autofs mount point /boot during canonicalization of boot.
Mar 3 13:35:03.210760 systemd-tmpfiles[1373]: Skipping /boot
Mar 3 13:35:03.235694 zram_generator::config[1406]: No configuration found.
Mar 3 13:35:03.441060 systemd[1]: Condition check resulted in dev-ttyS0.device - /dev/ttyS0 being skipped.
Mar 3 13:35:03.441264 systemd[1]: Reloading finished in 276 ms.
Mar 3 13:35:03.446710 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input5
Mar 3 13:35:03.452500 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Mar 3 13:35:03.454041 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Mar 3 13:35:03.474746 kernel: mousedev: PS/2 mouse device common for all mice
Mar 3 13:35:03.479952 kernel: ACPI: button: Power Button [PWRF]
Mar 3 13:35:03.484042 systemd[1]: Starting audit-rules.service - Load Audit Rules...
Mar 3 13:35:03.486628 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs...
Mar 3 13:35:03.488809 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog...
Mar 3 13:35:03.493592 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Mar 3 13:35:03.500223 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Mar 3 13:35:03.502036 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP...
Mar 3 13:35:03.508584 systemd[1]: Starting systemd-userdbd.service - User Database Manager...
Mar 3 13:35:03.517704 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Mar 3 13:35:03.517840 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Mar 3 13:35:03.519879 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Mar 3 13:35:03.525325 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Mar 3 13:35:03.531741 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Mar 3 13:35:03.532211 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Mar 3 13:35:03.532303 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Mar 3 13:35:03.532362 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Mar 3 13:35:03.535932 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Mar 3 13:35:03.536065 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Mar 3 13:35:03.536183 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Mar 3 13:35:03.536236 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Mar 3 13:35:03.536286 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Mar 3 13:35:03.541248 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Mar 3 13:35:03.541420 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Mar 3 13:35:03.551742 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Mar 3 13:35:03.552197 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Mar 3 13:35:03.552276 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Mar 3 13:35:03.552363 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Mar 3 13:35:03.552978 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog.
Mar 3 13:35:03.565024 systemd[1]: Starting systemd-update-done.service - Update is Completed...
Mar 3 13:35:03.567890 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP.
Mar 3 13:35:03.569239 systemd[1]: Finished ensure-sysext.service.
Mar 3 13:35:03.577408 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization...
Mar 3 13:35:03.603153 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Mar 3 13:35:03.603331 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Mar 3 13:35:03.605434 systemd[1]: Finished systemd-update-done.service - Update is Completed.
Mar 3 13:35:03.611031 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Mar 3 13:35:03.613374 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Mar 3 13:35:03.614048 systemd[1]: modprobe@loop.service: Deactivated successfully.
Mar 3 13:35:03.614213 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Mar 3 13:35:03.617963 systemd[1]: modprobe@drm.service: Deactivated successfully.
Mar 3 13:35:03.619717 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Mar 3 13:35:03.620961 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Mar 3 13:35:03.621035 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Mar 3 13:35:03.635825 systemd[1]: Started systemd-userdbd.service - User Database Manager.
Mar 3 13:35:03.642546 systemd[1]: Condition check resulted in dev-virtio\x2dports-org.qemu.guest_agent.0.device - /dev/virtio-ports/org.qemu.guest_agent.0 being skipped.
Mar 3 13:35:03.644608 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Mar 3 13:35:03.645012 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Mar 3 13:35:03.647922 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Mar 3 13:35:03.653161 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Mar 3 13:35:03.654797 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Mar 3 13:35:03.655775 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Mar 3 13:35:03.655807 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Mar 3 13:35:03.655828 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Mar 3 13:35:03.656329 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs.
Mar 3 13:35:03.657220 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt).
Mar 3 13:35:03.666466 systemd[1]: modprobe@loop.service: Deactivated successfully.
Mar 3 13:35:03.668238 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Mar 3 13:35:03.677374 augenrules[1540]: No rules
Mar 3 13:35:03.680786 systemd[1]: audit-rules.service: Deactivated successfully.
Mar 3 13:35:03.681013 systemd[1]: Finished audit-rules.service - Load Audit Rules.
Mar 3 13:35:03.697833 kernel: [drm] pci: virtio-vga detected at 0000:00:01.0
Mar 3 13:35:03.707009 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - QEMU_HARDDISK OEM.
Mar 3 13:35:03.719516 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Mar 3 13:35:03.720150 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Mar 3 13:35:03.721743 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Mar 3 13:35:03.725357 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Mar 3 13:35:03.731163 kernel: i801_smbus 0000:00:1f.3: Enabling SMBus device Mar 3 13:35:03.731395 kernel: i801_smbus 0000:00:1f.3: SMBus using PCI interrupt Mar 3 13:35:03.734062 kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD Mar 3 13:35:03.740789 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... Mar 3 13:35:03.741720 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Mar 3 13:35:03.741762 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Mar 3 13:35:03.757194 kernel: Console: switching to colour dummy device 80x25 Mar 3 13:35:03.769704 kernel: virtio-pci 0000:00:01.0: vgaarb: deactivate vga console Mar 3 13:35:03.773914 kernel: [drm] features: -virgl +edid -resource_blob -host_visible Mar 3 13:35:03.773951 kernel: [drm] features: -context_init Mar 3 13:35:03.776783 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. Mar 3 13:35:03.795384 kernel: [drm] number of scanouts: 1 Mar 3 13:35:03.795429 kernel: [drm] number of cap sets: 0 Mar 3 13:35:03.805694 kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:00:01.0 on minor 0 Mar 3 13:35:03.817311 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Mar 3 13:35:03.828790 kernel: fbcon: virtio_gpudrmfb (fb0) is primary device Mar 3 13:35:03.831660 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Mar 3 13:35:03.831920 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. 
Mar 3 13:35:03.834269 kernel: EDAC MC: Ver: 3.0.0 Mar 3 13:35:03.843334 kernel: Console: switching to colour frame buffer device 160x50 Mar 3 13:35:03.863970 systemd-networkd[1489]: lo: Link UP Mar 3 13:35:03.866098 systemd-resolved[1490]: Positive Trust Anchors: Mar 3 13:35:03.866110 systemd-resolved[1490]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Mar 3 13:35:03.866141 systemd-resolved[1490]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Mar 3 13:35:03.866356 systemd-networkd[1489]: lo: Gained carrier Mar 3 13:35:03.873249 systemd-networkd[1489]: Enumeration completed Mar 3 13:35:03.878020 systemd-networkd[1489]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Mar 3 13:35:03.878027 systemd-networkd[1489]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Mar 3 13:35:03.878715 systemd-networkd[1489]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Mar 3 13:35:03.878727 systemd-networkd[1489]: eth1: Configuring with /usr/lib/systemd/network/zz-default.network. Mar 3 13:35:03.879202 systemd-networkd[1489]: eth0: Link UP Mar 3 13:35:03.879841 systemd-networkd[1489]: eth0: Gained carrier Mar 3 13:35:03.879892 systemd-networkd[1489]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. 
Mar 3 13:35:03.880902 systemd-resolved[1490]: Using system hostname 'ci-4459-2-4-a-1c48930dd2'. Mar 3 13:35:03.885199 systemd-networkd[1489]: eth1: Link UP Mar 3 13:35:03.886790 systemd-networkd[1489]: eth1: Gained carrier Mar 3 13:35:03.886858 systemd-networkd[1489]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Mar 3 13:35:03.906602 kernel: virtio-pci 0000:00:01.0: [drm] fb0: virtio_gpudrmfb frame buffer device Mar 3 13:35:03.908107 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization. Mar 3 13:35:03.909349 systemd[1]: Started systemd-networkd.service - Network Configuration. Mar 3 13:35:03.910694 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Mar 3 13:35:03.914052 systemd[1]: Reached target network.target - Network. Mar 3 13:35:03.914169 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Mar 3 13:35:03.914239 systemd[1]: Reached target time-set.target - System Time Set. Mar 3 13:35:03.916774 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd... Mar 3 13:35:03.918777 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured... Mar 3 13:35:03.923741 systemd-networkd[1489]: eth1: DHCPv4 address 10.0.0.3/32 acquired from 10.0.0.1 Mar 3 13:35:03.924458 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Mar 3 13:35:03.926141 systemd-timesyncd[1509]: Network configuration changed, trying to establish connection. Mar 3 13:35:03.937714 systemd-networkd[1489]: eth0: DHCPv4 address 89.167.115.149/32, gateway 172.31.1.1 acquired from 172.31.1.1 Mar 3 13:35:03.955508 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Mar 3 13:35:03.956381 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. 
Mar 3 13:35:03.958930 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd. Mar 3 13:35:03.962853 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Mar 3 13:35:04.019321 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Mar 3 13:35:04.019603 systemd[1]: Reached target sysinit.target - System Initialization. Mar 3 13:35:04.019837 systemd[1]: Started motdgen.path - Watch for update engine configuration changes. Mar 3 13:35:04.021356 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data. Mar 3 13:35:04.021471 systemd[1]: Started google-oslogin-cache.timer - NSS cache refresh timer. Mar 3 13:35:04.021701 systemd[1]: Started logrotate.timer - Daily rotation of log files. Mar 3 13:35:04.021876 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information.. Mar 3 13:35:04.021952 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories. Mar 3 13:35:04.022021 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). Mar 3 13:35:04.022045 systemd[1]: Reached target paths.target - Path Units. Mar 3 13:35:04.023543 systemd[1]: Reached target timers.target - Timer Units. Mar 3 13:35:04.026051 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket. Mar 3 13:35:04.028426 systemd[1]: Starting docker.socket - Docker Socket for the API... Mar 3 13:35:04.033474 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local). Mar 3 13:35:04.036325 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK). Mar 3 13:35:04.037845 systemd[1]: Reached target ssh-access.target - SSH Access Available. 
Mar 3 13:35:04.040588 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket. Mar 3 13:35:04.045710 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket. Mar 3 13:35:04.047329 systemd[1]: Listening on docker.socket - Docker Socket for the API. Mar 3 13:35:04.050408 systemd[1]: Reached target sockets.target - Socket Units. Mar 3 13:35:04.051476 systemd[1]: Reached target basic.target - Basic System. Mar 3 13:35:04.052536 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met. Mar 3 13:35:04.052565 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met. Mar 3 13:35:04.053795 systemd[1]: Starting containerd.service - containerd container runtime... Mar 3 13:35:04.070780 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent... Mar 3 13:35:04.073132 systemd[1]: Starting dbus.service - D-Bus System Message Bus... Mar 3 13:35:04.077771 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown... Mar 3 13:35:04.081553 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... Mar 3 13:35:04.083745 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... Mar 3 13:35:04.087327 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). Mar 3 13:35:04.089576 systemd[1]: Starting google-oslogin-cache.service - NSS cache refresh... Mar 3 13:35:04.092977 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... Mar 3 13:35:04.096079 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin... Mar 3 13:35:04.099494 jq[1592]: false Mar 3 13:35:04.104851 systemd[1]: Started qemu-guest-agent.service - QEMU Guest Agent. 
Mar 3 13:35:04.107434 google_oslogin_nss_cache[1596]: oslogin_cache_refresh[1596]: Refreshing passwd entry cache Mar 3 13:35:04.108735 oslogin_cache_refresh[1596]: Refreshing passwd entry cache Mar 3 13:35:04.110864 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... Mar 3 13:35:04.111242 coreos-metadata[1589]: Mar 03 13:35:04.111 INFO Fetching http://169.254.169.254/hetzner/v1/metadata: Attempt #1 Mar 3 13:35:04.112865 google_oslogin_nss_cache[1596]: oslogin_cache_refresh[1596]: Failure getting users, quitting Mar 3 13:35:04.112908 oslogin_cache_refresh[1596]: Failure getting users, quitting Mar 3 13:35:04.112966 google_oslogin_nss_cache[1596]: oslogin_cache_refresh[1596]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak. Mar 3 13:35:04.112987 oslogin_cache_refresh[1596]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak. Mar 3 13:35:04.113046 google_oslogin_nss_cache[1596]: oslogin_cache_refresh[1596]: Refreshing group entry cache Mar 3 13:35:04.113145 oslogin_cache_refresh[1596]: Refreshing group entry cache Mar 3 13:35:04.114907 google_oslogin_nss_cache[1596]: oslogin_cache_refresh[1596]: Failure getting groups, quitting Mar 3 13:35:04.114907 google_oslogin_nss_cache[1596]: oslogin_cache_refresh[1596]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak. Mar 3 13:35:04.114895 oslogin_cache_refresh[1596]: Failure getting groups, quitting Mar 3 13:35:04.114909 oslogin_cache_refresh[1596]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak. Mar 3 13:35:04.119258 coreos-metadata[1589]: Mar 03 13:35:04.119 INFO Fetch successful Mar 3 13:35:04.119258 coreos-metadata[1589]: Mar 03 13:35:04.119 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/private-networks: Attempt #1 Mar 3 13:35:04.119008 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... 
Mar 3 13:35:04.120715 coreos-metadata[1589]: Mar 03 13:35:04.120 INFO Fetch successful Mar 3 13:35:04.131480 systemd[1]: Starting systemd-logind.service - User Login Management... Mar 3 13:35:04.133193 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0). Mar 3 13:35:04.135317 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. Mar 3 13:35:04.136851 systemd[1]: Starting update-engine.service - Update Engine... Mar 3 13:35:04.143452 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... Mar 3 13:35:04.146584 extend-filesystems[1593]: Found /dev/sda6 Mar 3 13:35:04.156135 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown. Mar 3 13:35:04.159040 jq[1614]: true Mar 3 13:35:04.159468 extend-filesystems[1593]: Found /dev/sda9 Mar 3 13:35:04.159954 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. Mar 3 13:35:04.160159 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. Mar 3 13:35:04.160459 systemd[1]: google-oslogin-cache.service: Deactivated successfully. Mar 3 13:35:04.160651 systemd[1]: Finished google-oslogin-cache.service - NSS cache refresh. Mar 3 13:35:04.167180 extend-filesystems[1593]: Checking size of /dev/sda9 Mar 3 13:35:04.166841 systemd[1]: motdgen.service: Deactivated successfully. Mar 3 13:35:04.172650 update_engine[1613]: I20260303 13:35:04.167721 1613 main.cc:92] Flatcar Update Engine starting Mar 3 13:35:04.167023 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. Mar 3 13:35:04.170037 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. Mar 3 13:35:04.170911 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. 
Mar 3 13:35:04.183878 extend-filesystems[1593]: Resized partition /dev/sda9 Mar 3 13:35:04.198689 extend-filesystems[1635]: resize2fs 1.47.3 (8-Jul-2025) Mar 3 13:35:04.203588 jq[1625]: true Mar 3 13:35:04.216958 kernel: EXT4-fs (sda9): resizing filesystem from 1617920 to 19393531 blocks Mar 3 13:35:04.217194 (ntainerd)[1639]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR Mar 3 13:35:04.243630 tar[1623]: linux-amd64/LICENSE Mar 3 13:35:04.244982 tar[1623]: linux-amd64/helm Mar 3 13:35:04.257110 dbus-daemon[1590]: [system] SELinux support is enabled Mar 3 13:35:04.257252 systemd[1]: Started dbus.service - D-Bus System Message Bus. Mar 3 13:35:04.264321 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). Mar 3 13:35:04.264519 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. Mar 3 13:35:04.265733 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). Mar 3 13:35:04.265750 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. Mar 3 13:35:04.276772 systemd[1]: Started update-engine.service - Update Engine. Mar 3 13:35:04.280028 update_engine[1613]: I20260303 13:35:04.279860 1613 update_check_scheduler.cc:74] Next update check in 2m5s Mar 3 13:35:04.282166 systemd[1]: Started locksmithd.service - Cluster reboot manager. Mar 3 13:35:04.309063 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent. Mar 3 13:35:04.311280 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. Mar 3 13:35:04.330414 systemd-logind[1607]: New seat seat0. 
Mar 3 13:35:04.342558 systemd-logind[1607]: Watching system buttons on /dev/input/event3 (Power Button) Mar 3 13:35:04.342577 systemd-logind[1607]: Watching system buttons on /dev/input/event0 (AT Translated Set 2 keyboard) Mar 3 13:35:04.343429 systemd[1]: Started systemd-logind.service - User Login Management. Mar 3 13:35:04.351353 systemd-timesyncd[1509]: Contacted time server 92.60.37.230:123 (1.flatcar.pool.ntp.org). Mar 3 13:35:04.351533 systemd-timesyncd[1509]: Initial clock synchronization to Tue 2026-03-03 13:35:04.512777 UTC. Mar 3 13:35:04.374742 bash[1666]: Updated "/home/core/.ssh/authorized_keys" Mar 3 13:35:04.374581 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. Mar 3 13:35:04.383246 systemd[1]: Starting sshkeys.service... Mar 3 13:35:04.386376 sshd_keygen[1622]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Mar 3 13:35:04.404530 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys. Mar 3 13:35:04.413786 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)... Mar 3 13:35:04.468125 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. Mar 3 13:35:04.476949 systemd[1]: Starting issuegen.service - Generate /run/issue... 
Mar 3 13:35:04.497292 coreos-metadata[1678]: Mar 03 13:35:04.497 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/public-keys: Attempt #1 Mar 3 13:35:04.502885 coreos-metadata[1678]: Mar 03 13:35:04.502 INFO Fetch successful Mar 3 13:35:04.502943 containerd[1639]: time="2026-03-03T13:35:04Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8 Mar 3 13:35:04.511686 containerd[1639]: time="2026-03-03T13:35:04.508873847Z" level=info msg="starting containerd" revision=4ac6c20c7bbf8177f29e46bbdc658fec02ffb8ad version=v2.0.7 Mar 3 13:35:04.511804 unknown[1678]: wrote ssh authorized keys file for user: core Mar 3 13:35:04.519016 containerd[1639]: time="2026-03-03T13:35:04.518986234Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="7.772µs" Mar 3 13:35:04.519243 containerd[1639]: time="2026-03-03T13:35:04.519226936Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1 Mar 3 13:35:04.519324 containerd[1639]: time="2026-03-03T13:35:04.519314778Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1 Mar 3 13:35:04.520822 containerd[1639]: time="2026-03-03T13:35:04.519493716Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1 Mar 3 13:35:04.520883 containerd[1639]: time="2026-03-03T13:35:04.520871965Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1 Mar 3 13:35:04.520934 containerd[1639]: time="2026-03-03T13:35:04.520926757Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Mar 3 13:35:04.521019 containerd[1639]: time="2026-03-03T13:35:04.521009030Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1
Mar 3 13:35:04.521050 containerd[1639]: time="2026-03-03T13:35:04.521042461Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Mar 3 13:35:04.521273 containerd[1639]: time="2026-03-03T13:35:04.521258024Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Mar 3 13:35:04.521345 containerd[1639]: time="2026-03-03T13:35:04.521335570Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Mar 3 13:35:04.521375 containerd[1639]: time="2026-03-03T13:35:04.521368219Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Mar 3 13:35:04.522694 containerd[1639]: time="2026-03-03T13:35:04.521516021Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1 Mar 3 13:35:04.522687 systemd[1]: issuegen.service: Deactivated successfully. Mar 3 13:35:04.523269 containerd[1639]: time="2026-03-03T13:35:04.523250424Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1 Mar 3 13:35:04.523605 systemd[1]: Finished issuegen.service - Generate /run/issue.
Mar 3 13:35:04.525381 containerd[1639]: time="2026-03-03T13:35:04.523944917Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Mar 3 13:35:04.527843 containerd[1639]: time="2026-03-03T13:35:04.526536366Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Mar 3 13:35:04.527926 containerd[1639]: time="2026-03-03T13:35:04.527907404Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1 Mar 3 13:35:04.527996 containerd[1639]: time="2026-03-03T13:35:04.527983848Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1 Mar 3 13:35:04.532905 containerd[1639]: time="2026-03-03T13:35:04.532886196Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1 Mar 3 13:35:04.533379 containerd[1639]: time="2026-03-03T13:35:04.533016552Z" level=info msg="metadata content store policy set" policy=shared Mar 3 13:35:04.551873 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... 
Mar 3 13:35:04.559387 containerd[1639]: time="2026-03-03T13:35:04.559361171Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1 Mar 3 13:35:04.561424 containerd[1639]: time="2026-03-03T13:35:04.559483344Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1 Mar 3 13:35:04.561424 containerd[1639]: time="2026-03-03T13:35:04.559502883Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1 Mar 3 13:35:04.561424 containerd[1639]: time="2026-03-03T13:35:04.559523224Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1 Mar 3 13:35:04.561424 containerd[1639]: time="2026-03-03T13:35:04.559533028Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1 Mar 3 13:35:04.561424 containerd[1639]: time="2026-03-03T13:35:04.559540389Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1 Mar 3 13:35:04.561424 containerd[1639]: time="2026-03-03T13:35:04.559548301Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1 Mar 3 13:35:04.561424 containerd[1639]: time="2026-03-03T13:35:04.559559158Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1 Mar 3 13:35:04.561424 containerd[1639]: time="2026-03-03T13:35:04.559570685Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1 Mar 3 13:35:04.561424 containerd[1639]: time="2026-03-03T13:35:04.559582803Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1 Mar 3 13:35:04.561424 containerd[1639]: time="2026-03-03T13:35:04.559593099Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1
Mar 3 13:35:04.561424 containerd[1639]: time="2026-03-03T13:35:04.559604396Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2 Mar 3 13:35:04.564064 containerd[1639]: time="2026-03-03T13:35:04.563746813Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1 Mar 3 13:35:04.564064 containerd[1639]: time="2026-03-03T13:35:04.563774805Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1 Mar 3 13:35:04.564064 containerd[1639]: time="2026-03-03T13:35:04.563794825Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1 Mar 3 13:35:04.564064 containerd[1639]: time="2026-03-03T13:35:04.563803347Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1 Mar 3 13:35:04.564064 containerd[1639]: time="2026-03-03T13:35:04.563811990Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1 Mar 3 13:35:04.564064 containerd[1639]: time="2026-03-03T13:35:04.563820123Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1 Mar 3 13:35:04.564064 containerd[1639]: time="2026-03-03T13:35:04.563833102Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1 Mar 3 13:35:04.564064 containerd[1639]: time="2026-03-03T13:35:04.563840533Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1 Mar 3 13:35:04.564064 containerd[1639]: time="2026-03-03T13:35:04.563849907Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1 Mar 3 13:35:04.564064 containerd[1639]: time="2026-03-03T13:35:04.563859091Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1
Mar 3 13:35:04.564064 containerd[1639]: time="2026-03-03T13:35:04.563866713Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1 Mar 3 13:35:04.564064 containerd[1639]: time="2026-03-03T13:35:04.563907233Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\"" Mar 3 13:35:04.564064 containerd[1639]: time="2026-03-03T13:35:04.563917639Z" level=info msg="Start snapshots syncer" Mar 3 13:35:04.564064 containerd[1639]: time="2026-03-03T13:35:04.564023097Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1 Mar 3 13:35:04.564266 containerd[1639]: time="2026-03-03T13:35:04.564235626Z" level=info msg="starting cri plugin" config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}" Mar 3 13:35:04.564353 containerd[1639]: time="2026-03-03T13:35:04.564266903Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1 Mar 3 13:35:04.565561 containerd[1639]: time="2026-03-03T13:35:04.565539123Z" level=info msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1 Mar 3 13:35:04.566770 containerd[1639]: time="2026-03-03T13:35:04.566743801Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1 Mar 3 13:35:04.566845 containerd[1639]: time="2026-03-03T13:35:04.566791202Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1 Mar 3 13:35:04.566845 containerd[1639]: time="2026-03-03T13:35:04.566801627Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1 Mar 3 13:35:04.566845 containerd[1639]: time="2026-03-03T13:35:04.566810120Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1 Mar 3 13:35:04.566845 containerd[1639]: time="2026-03-03T13:35:04.566819855Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1 Mar 3 13:35:04.566845 containerd[1639]: time="2026-03-03T13:35:04.566828157Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1 Mar 3 13:35:04.566845 containerd[1639]: time="2026-03-03T13:35:04.566836309Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1
Mar 3 13:35:04.567047 containerd[1639]: time="2026-03-03T13:35:04.566869389Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1 Mar 3 13:35:04.567047 containerd[1639]: time="2026-03-03T13:35:04.566878533Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1 Mar 3 13:35:04.567047 containerd[1639]: time="2026-03-03T13:35:04.566889619Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1 Mar 3 13:35:04.567047 containerd[1639]: time="2026-03-03T13:35:04.566917091Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Mar 3 13:35:04.567243 containerd[1639]: time="2026-03-03T13:35:04.567122739Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Mar 3 13:35:04.567243 containerd[1639]: time="2026-03-03T13:35:04.567134868Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Mar 3 13:35:04.567243 containerd[1639]: time="2026-03-03T13:35:04.567143190Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Mar 3 13:35:04.567243 containerd[1639]: time="2026-03-03T13:35:04.567148768Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1 Mar 3 13:35:04.567243 containerd[1639]: time="2026-03-03T13:35:04.567162068Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1 Mar 3 13:35:04.567243 containerd[1639]: time="2026-03-03T13:35:04.567174787Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1
Mar 3 13:35:04.567243 containerd[1639]: time="2026-03-03T13:35:04.567231543Z" level=info msg="runtime interface created" Mar 3 13:35:04.567243 containerd[1639]: time="2026-03-03T13:35:04.567236780Z" level=info msg="created NRI interface" Mar 3 13:35:04.567243 containerd[1639]: time="2026-03-03T13:35:04.567243691Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1 Mar 3 13:35:04.567493 containerd[1639]: time="2026-03-03T13:35:04.567252614Z" level=info msg="Connect containerd service" Mar 3 13:35:04.567493 containerd[1639]: time="2026-03-03T13:35:04.567267406Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Mar 3 13:35:04.571690 containerd[1639]: time="2026-03-03T13:35:04.571608732Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Mar 3 13:35:04.574596 locksmithd[1662]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Mar 3 13:35:04.579965 kernel: EXT4-fs (sda9): resized filesystem to 19393531 Mar 3 13:35:04.586143 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. Mar 3 13:35:04.593243 systemd[1]: Started getty@tty1.service - Getty on tty1. Mar 3 13:35:04.614286 update-ssh-keys[1694]: Updated "/home/core/.ssh/authorized_keys" Mar 3 13:35:04.596890 systemd[1]: Started serial-getty@ttyS0.service - Serial Getty on ttyS0. Mar 3 13:35:04.598571 systemd[1]: Reached target getty.target - Login Prompts. Mar 3 13:35:04.614523 systemd[1]: Finished coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys).
Mar 3 13:35:04.617923 extend-filesystems[1635]: Filesystem at /dev/sda9 is mounted on /; on-line resizing required Mar 3 13:35:04.617923 extend-filesystems[1635]: old_desc_blocks = 1, new_desc_blocks = 10 Mar 3 13:35:04.617923 extend-filesystems[1635]: The filesystem on /dev/sda9 is now 19393531 (4k) blocks long. Mar 3 13:35:04.644461 extend-filesystems[1593]: Resized filesystem in /dev/sda9 Mar 3 13:35:04.620331 systemd[1]: extend-filesystems.service: Deactivated successfully. Mar 3 13:35:04.620603 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. Mar 3 13:35:04.635519 systemd[1]: Finished sshkeys.service. Mar 3 13:35:04.668498 containerd[1639]: time="2026-03-03T13:35:04.668417286Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Mar 3 13:35:04.668498 containerd[1639]: time="2026-03-03T13:35:04.668493099Z" level=info msg=serving... address=/run/containerd/containerd.sock Mar 3 13:35:04.668619 containerd[1639]: time="2026-03-03T13:35:04.668519719Z" level=info msg="Start subscribing containerd event" Mar 3 13:35:04.668619 containerd[1639]: time="2026-03-03T13:35:04.668540040Z" level=info msg="Start recovering state" Mar 3 13:35:04.668619 containerd[1639]: time="2026-03-03T13:35:04.668613290Z" level=info msg="Start event monitor" Mar 3 13:35:04.668695 containerd[1639]: time="2026-03-03T13:35:04.668621392Z" level=info msg="Start cni network conf syncer for default" Mar 3 13:35:04.668695 containerd[1639]: time="2026-03-03T13:35:04.668629464Z" level=info msg="Start streaming server" Mar 3 13:35:04.668695 containerd[1639]: time="2026-03-03T13:35:04.668645087Z" level=info msg="Registered namespace \"k8s.io\" with NRI" Mar 3 13:35:04.668695 containerd[1639]: time="2026-03-03T13:35:04.668653099Z" level=info msg="runtime interface starting up..." Mar 3 13:35:04.669769 containerd[1639]: time="2026-03-03T13:35:04.668660350Z" level=info msg="starting plugins..." 
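For scale, the 19393531 four-KiB blocks that extend-filesystems reports above work out to roughly 74 GiB; plain shell arithmetic reproduces the conversion:

```shell
# Convert the resized block count logged above into bytes and whole GiB.
blocks=19393531
block_size=4096                      # the "(4k)" block size from the log line
bytes=$((blocks * block_size))
gib=$((bytes / 1024 / 1024 / 1024))  # integer GiB, rounded down
echo "${bytes} bytes (~${gib} GiB)"
```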
Mar 3 13:35:04.669896 containerd[1639]: time="2026-03-03T13:35:04.669828053Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" Mar 3 13:35:04.671103 systemd[1]: Started containerd.service - containerd container runtime. Mar 3 13:35:04.676627 containerd[1639]: time="2026-03-03T13:35:04.676073036Z" level=info msg="containerd successfully booted in 0.174322s" Mar 3 13:35:04.801660 tar[1623]: linux-amd64/README.md Mar 3 13:35:04.820871 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. Mar 3 13:35:05.056973 systemd-networkd[1489]: eth1: Gained IPv6LL Mar 3 13:35:05.060786 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Mar 3 13:35:05.062180 systemd[1]: Reached target network-online.target - Network is Online. Mar 3 13:35:05.067288 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 3 13:35:05.070919 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Mar 3 13:35:05.107487 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. Mar 3 13:35:05.761082 systemd-networkd[1489]: eth0: Gained IPv6LL Mar 3 13:35:05.797710 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 3 13:35:05.802793 systemd[1]: Reached target multi-user.target - Multi-User System. Mar 3 13:35:05.804120 systemd[1]: Startup finished in 3.052s (kernel) + 5.895s (initrd) + 4.285s (userspace) = 13.234s. 
Mar 3 13:35:05.806043 (kubelet)[1740]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Mar 3 13:35:06.223424 kubelet[1740]: E0303 13:35:06.223347 1740 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Mar 3 13:35:06.227028 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Mar 3 13:35:06.227401 systemd[1]: kubelet.service: Failed with result 'exit-code'. Mar 3 13:35:06.228146 systemd[1]: kubelet.service: Consumed 716ms CPU time, 255.5M memory peak. Mar 3 13:35:10.585844 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Mar 3 13:35:10.587987 systemd[1]: Started sshd@0-89.167.115.149:22-20.161.92.111:38766.service - OpenSSH per-connection server daemon (20.161.92.111:38766). Mar 3 13:35:11.264923 sshd[1752]: Accepted publickey for core from 20.161.92.111 port 38766 ssh2: RSA SHA256:Z8/rK+Wv3rpYCX2pdoo3bSXSUfY9cXN+4if7niANqB4 Mar 3 13:35:11.268616 sshd-session[1752]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 3 13:35:11.281171 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Mar 3 13:35:11.283807 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Mar 3 13:35:11.293764 systemd-logind[1607]: New session 1 of user core. Mar 3 13:35:11.309015 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Mar 3 13:35:11.314182 systemd[1]: Starting user@500.service - User Manager for UID 500... 
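The kubelet failure above recurs throughout this log until /var/lib/kubelet/config.yaml exists; kubeadm normally writes that file during `kubeadm init` or `kubeadm join`, which has not yet run here. A small check of the same precondition the error message points at (the `KUBELET_CONFIG` override is only for illustration):

```shell
# Check for the file named in the kubelet error above. On this host it is
# absent until kubeadm runs; KUBELET_CONFIG can override the default path.
KUBELET_CONFIG="${KUBELET_CONFIG:-/var/lib/kubelet/config.yaml}"
if [ -f "$KUBELET_CONFIG" ]; then
  state=present
else
  state=missing
fi
echo "kubelet config $state: $KUBELET_CONFIG"
```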
Mar 3 13:35:11.331713 (systemd)[1757]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) Mar 3 13:35:11.334310 systemd-logind[1607]: New session c1 of user core. Mar 3 13:35:11.466198 systemd[1757]: Queued start job for default target default.target. Mar 3 13:35:11.471708 systemd[1757]: Created slice app.slice - User Application Slice. Mar 3 13:35:11.471728 systemd[1757]: Reached target paths.target - Paths. Mar 3 13:35:11.471770 systemd[1757]: Reached target timers.target - Timers. Mar 3 13:35:11.473242 systemd[1757]: Starting dbus.socket - D-Bus User Message Bus Socket... Mar 3 13:35:11.485578 systemd[1757]: Listening on dbus.socket - D-Bus User Message Bus Socket. Mar 3 13:35:11.485781 systemd[1757]: Reached target sockets.target - Sockets. Mar 3 13:35:11.485916 systemd[1757]: Reached target basic.target - Basic System. Mar 3 13:35:11.486082 systemd[1757]: Reached target default.target - Main User Target. Mar 3 13:35:11.486174 systemd[1757]: Startup finished in 144ms. Mar 3 13:35:11.486289 systemd[1]: Started user@500.service - User Manager for UID 500. Mar 3 13:35:11.497848 systemd[1]: Started session-1.scope - Session 1 of User core. Mar 3 13:35:11.890263 systemd[1]: Started sshd@1-89.167.115.149:22-20.161.92.111:38780.service - OpenSSH per-connection server daemon (20.161.92.111:38780). Mar 3 13:35:12.573127 sshd[1768]: Accepted publickey for core from 20.161.92.111 port 38780 ssh2: RSA SHA256:Z8/rK+Wv3rpYCX2pdoo3bSXSUfY9cXN+4if7niANqB4 Mar 3 13:35:12.576666 sshd-session[1768]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 3 13:35:12.586757 systemd-logind[1607]: New session 2 of user core. Mar 3 13:35:12.593942 systemd[1]: Started session-2.scope - Session 2 of User core. 
Mar 3 13:35:12.947183 sshd[1771]: Connection closed by 20.161.92.111 port 38780 Mar 3 13:35:12.947927 sshd-session[1768]: pam_unix(sshd:session): session closed for user core Mar 3 13:35:12.952319 systemd[1]: sshd@1-89.167.115.149:22-20.161.92.111:38780.service: Deactivated successfully. Mar 3 13:35:12.955466 systemd[1]: session-2.scope: Deactivated successfully. Mar 3 13:35:12.957858 systemd-logind[1607]: Session 2 logged out. Waiting for processes to exit. Mar 3 13:35:12.959568 systemd-logind[1607]: Removed session 2. Mar 3 13:35:13.081573 systemd[1]: Started sshd@2-89.167.115.149:22-20.161.92.111:38788.service - OpenSSH per-connection server daemon (20.161.92.111:38788). Mar 3 13:35:13.759756 sshd[1777]: Accepted publickey for core from 20.161.92.111 port 38788 ssh2: RSA SHA256:Z8/rK+Wv3rpYCX2pdoo3bSXSUfY9cXN+4if7niANqB4 Mar 3 13:35:13.762466 sshd-session[1777]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 3 13:35:13.771775 systemd-logind[1607]: New session 3 of user core. Mar 3 13:35:13.778907 systemd[1]: Started session-3.scope - Session 3 of User core. Mar 3 13:35:14.127068 sshd[1780]: Connection closed by 20.161.92.111 port 38788 Mar 3 13:35:14.127711 sshd-session[1777]: pam_unix(sshd:session): session closed for user core Mar 3 13:35:14.132511 systemd[1]: sshd@2-89.167.115.149:22-20.161.92.111:38788.service: Deactivated successfully. Mar 3 13:35:14.135154 systemd[1]: session-3.scope: Deactivated successfully. Mar 3 13:35:14.136395 systemd-logind[1607]: Session 3 logged out. Waiting for processes to exit. Mar 3 13:35:14.138182 systemd-logind[1607]: Removed session 3. Mar 3 13:35:14.260076 systemd[1]: Started sshd@3-89.167.115.149:22-20.161.92.111:38804.service - OpenSSH per-connection server daemon (20.161.92.111:38804). 
Mar 3 13:35:14.907567 sshd[1786]: Accepted publickey for core from 20.161.92.111 port 38804 ssh2: RSA SHA256:Z8/rK+Wv3rpYCX2pdoo3bSXSUfY9cXN+4if7niANqB4 Mar 3 13:35:14.911073 sshd-session[1786]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 3 13:35:14.920575 systemd-logind[1607]: New session 4 of user core. Mar 3 13:35:14.930914 systemd[1]: Started session-4.scope - Session 4 of User core. Mar 3 13:35:15.279521 sshd[1789]: Connection closed by 20.161.92.111 port 38804 Mar 3 13:35:15.281061 sshd-session[1786]: pam_unix(sshd:session): session closed for user core Mar 3 13:35:15.288324 systemd-logind[1607]: Session 4 logged out. Waiting for processes to exit. Mar 3 13:35:15.289915 systemd[1]: sshd@3-89.167.115.149:22-20.161.92.111:38804.service: Deactivated successfully. Mar 3 13:35:15.293297 systemd[1]: session-4.scope: Deactivated successfully. Mar 3 13:35:15.295460 systemd-logind[1607]: Removed session 4. Mar 3 13:35:15.421988 systemd[1]: Started sshd@4-89.167.115.149:22-20.161.92.111:38806.service - OpenSSH per-connection server daemon (20.161.92.111:38806). Mar 3 13:35:16.058786 sshd[1795]: Accepted publickey for core from 20.161.92.111 port 38806 ssh2: RSA SHA256:Z8/rK+Wv3rpYCX2pdoo3bSXSUfY9cXN+4if7niANqB4 Mar 3 13:35:16.061446 sshd-session[1795]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 3 13:35:16.070768 systemd-logind[1607]: New session 5 of user core. Mar 3 13:35:16.078894 systemd[1]: Started session-5.scope - Session 5 of User core. Mar 3 13:35:16.323891 sudo[1799]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Mar 3 13:35:16.324604 sudo[1799]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Mar 3 13:35:16.326484 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. Mar 3 13:35:16.329949 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... 
Mar 3 13:35:16.344597 sudo[1799]: pam_unix(sudo:session): session closed for user root Mar 3 13:35:16.466946 sshd[1798]: Connection closed by 20.161.92.111 port 38806 Mar 3 13:35:16.470051 sshd-session[1795]: pam_unix(sshd:session): session closed for user core Mar 3 13:35:16.478916 systemd-logind[1607]: Session 5 logged out. Waiting for processes to exit. Mar 3 13:35:16.480113 systemd[1]: sshd@4-89.167.115.149:22-20.161.92.111:38806.service: Deactivated successfully. Mar 3 13:35:16.484624 systemd[1]: session-5.scope: Deactivated successfully. Mar 3 13:35:16.488270 systemd-logind[1607]: Removed session 5. Mar 3 13:35:16.518993 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 3 13:35:16.529976 (kubelet)[1812]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Mar 3 13:35:16.559472 kubelet[1812]: E0303 13:35:16.559423 1812 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Mar 3 13:35:16.568163 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Mar 3 13:35:16.568334 systemd[1]: kubelet.service: Failed with result 'exit-code'. Mar 3 13:35:16.568899 systemd[1]: kubelet.service: Consumed 189ms CPU time, 109.8M memory peak. Mar 3 13:35:16.598145 systemd[1]: Started sshd@5-89.167.115.149:22-20.161.92.111:38816.service - OpenSSH per-connection server daemon (20.161.92.111:38816). 
Mar 3 13:35:17.242762 sshd[1820]: Accepted publickey for core from 20.161.92.111 port 38816 ssh2: RSA SHA256:Z8/rK+Wv3rpYCX2pdoo3bSXSUfY9cXN+4if7niANqB4 Mar 3 13:35:17.244743 sshd-session[1820]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 3 13:35:17.254779 systemd-logind[1607]: New session 6 of user core. Mar 3 13:35:17.262935 systemd[1]: Started session-6.scope - Session 6 of User core. Mar 3 13:35:17.495479 sudo[1825]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Mar 3 13:35:17.496315 sudo[1825]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Mar 3 13:35:17.505910 sudo[1825]: pam_unix(sudo:session): session closed for user root Mar 3 13:35:17.517285 sudo[1824]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules Mar 3 13:35:17.517936 sudo[1824]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Mar 3 13:35:17.529822 systemd[1]: Starting audit-rules.service - Load Audit Rules... Mar 3 13:35:17.596452 augenrules[1847]: No rules Mar 3 13:35:17.597893 systemd[1]: audit-rules.service: Deactivated successfully. Mar 3 13:35:17.598447 systemd[1]: Finished audit-rules.service - Load Audit Rules. Mar 3 13:35:17.600128 sudo[1824]: pam_unix(sudo:session): session closed for user root Mar 3 13:35:17.721097 sshd[1823]: Connection closed by 20.161.92.111 port 38816 Mar 3 13:35:17.722855 sshd-session[1820]: pam_unix(sshd:session): session closed for user core Mar 3 13:35:17.727960 systemd[1]: sshd@5-89.167.115.149:22-20.161.92.111:38816.service: Deactivated successfully. Mar 3 13:35:17.731344 systemd[1]: session-6.scope: Deactivated successfully. Mar 3 13:35:17.732650 systemd-logind[1607]: Session 6 logged out. Waiting for processes to exit. Mar 3 13:35:17.734428 systemd-logind[1607]: Removed session 6. 
Mar 3 13:35:17.855490 systemd[1]: Started sshd@6-89.167.115.149:22-20.161.92.111:38830.service - OpenSSH per-connection server daemon (20.161.92.111:38830). Mar 3 13:35:18.527137 sshd[1856]: Accepted publickey for core from 20.161.92.111 port 38830 ssh2: RSA SHA256:Z8/rK+Wv3rpYCX2pdoo3bSXSUfY9cXN+4if7niANqB4 Mar 3 13:35:18.529807 sshd-session[1856]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 3 13:35:18.539571 systemd-logind[1607]: New session 7 of user core. Mar 3 13:35:18.548966 systemd[1]: Started session-7.scope - Session 7 of User core. Mar 3 13:35:18.780620 sudo[1860]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Mar 3 13:35:18.781286 sudo[1860]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Mar 3 13:35:19.138175 systemd[1]: Starting docker.service - Docker Application Container Engine... Mar 3 13:35:19.168406 (dockerd)[1878]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU Mar 3 13:35:19.402296 dockerd[1878]: time="2026-03-03T13:35:19.401996690Z" level=info msg="Starting up" Mar 3 13:35:19.403377 dockerd[1878]: time="2026-03-03T13:35:19.403062282Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider" Mar 3 13:35:19.412083 dockerd[1878]: time="2026-03-03T13:35:19.412055812Z" level=info msg="Creating a containerd client" address=/var/run/docker/libcontainerd/docker-containerd.sock timeout=1m0s Mar 3 13:35:19.441280 systemd[1]: var-lib-docker-metacopy\x2dcheck1092519386-merged.mount: Deactivated successfully. Mar 3 13:35:19.458244 dockerd[1878]: time="2026-03-03T13:35:19.458208830Z" level=info msg="Loading containers: start." 
Mar 3 13:35:19.467710 kernel: Initializing XFRM netlink socket Mar 3 13:35:19.686327 systemd-networkd[1489]: docker0: Link UP Mar 3 13:35:19.692908 dockerd[1878]: time="2026-03-03T13:35:19.692869423Z" level=info msg="Loading containers: done." Mar 3 13:35:19.706344 systemd[1]: var-lib-docker-overlay2-opaque\x2dbug\x2dcheck3895296437-merged.mount: Deactivated successfully. Mar 3 13:35:19.708242 dockerd[1878]: time="2026-03-03T13:35:19.708203140Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Mar 3 13:35:19.708311 dockerd[1878]: time="2026-03-03T13:35:19.708279325Z" level=info msg="Docker daemon" commit=6430e49a55babd9b8f4d08e70ecb2b68900770fe containerd-snapshotter=false storage-driver=overlay2 version=28.0.4 Mar 3 13:35:19.708393 dockerd[1878]: time="2026-03-03T13:35:19.708371093Z" level=info msg="Initializing buildkit" Mar 3 13:35:19.730012 dockerd[1878]: time="2026-03-03T13:35:19.729984244Z" level=info msg="Completed buildkit initialization" Mar 3 13:35:19.736302 dockerd[1878]: time="2026-03-03T13:35:19.736265024Z" level=info msg="Daemon has completed initialization" Mar 3 13:35:19.736427 systemd[1]: Started docker.service - Docker Application Container Engine. Mar 3 13:35:19.736793 dockerd[1878]: time="2026-03-03T13:35:19.736747063Z" level=info msg="API listen on /run/docker.sock" Mar 3 13:35:20.293113 containerd[1639]: time="2026-03-03T13:35:20.293066084Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.34.5\"" Mar 3 13:35:20.941254 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3735982330.mount: Deactivated successfully. 
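The overlay2 warning above refers to a kernel build option, CONFIG_OVERLAY_FS_REDIRECT_DIR. Where the running kernel exposes its build config, the option can be inspected directly (the /proc path is the common case, not guaranteed on every kernel):

```shell
# Look up the option named in docker's overlay2 warning. /proc/config.gz is
# only present when the kernel was built with CONFIG_IKCONFIG_PROC.
out=$(zcat /proc/config.gz 2>/dev/null | grep CONFIG_OVERLAY_FS_REDIRECT_DIR \
      || echo "kernel config not exposed at /proc/config.gz")
echo "$out"
```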
Mar 3 13:35:21.889791 containerd[1639]: time="2026-03-03T13:35:21.889743181Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.34.5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 3 13:35:21.890835 containerd[1639]: time="2026-03-03T13:35:21.890638286Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.34.5: active requests=0, bytes read=27074597" Mar 3 13:35:21.892877 containerd[1639]: time="2026-03-03T13:35:21.892856877Z" level=info msg="ImageCreate event name:\"sha256:364ea2876e41b29691964751b6217cd2e343433690fbe16a5c6a236042684df3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 3 13:35:21.895070 containerd[1639]: time="2026-03-03T13:35:21.895051092Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:c548633fcd3b4aad59b70815be4c8be54a0fddaddc3fcffa9371eedb0e96417a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 3 13:35:21.895620 containerd[1639]: time="2026-03-03T13:35:21.895604114Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.34.5\" with image id \"sha256:364ea2876e41b29691964751b6217cd2e343433690fbe16a5c6a236042684df3\", repo tag \"registry.k8s.io/kube-apiserver:v1.34.5\", repo digest \"registry.k8s.io/kube-apiserver@sha256:c548633fcd3b4aad59b70815be4c8be54a0fddaddc3fcffa9371eedb0e96417a\", size \"27071096\" in 1.602493139s" Mar 3 13:35:21.895700 containerd[1639]: time="2026-03-03T13:35:21.895690514Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.34.5\" returns image reference \"sha256:364ea2876e41b29691964751b6217cd2e343433690fbe16a5c6a236042684df3\"" Mar 3 13:35:21.896118 containerd[1639]: time="2026-03-03T13:35:21.896093315Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.34.5\"" Mar 3 13:35:22.969176 containerd[1639]: time="2026-03-03T13:35:22.969123381Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.34.5\" 
labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 3 13:35:22.970368 containerd[1639]: time="2026-03-03T13:35:22.970147433Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.34.5: active requests=0, bytes read=21165845" Mar 3 13:35:22.971162 containerd[1639]: time="2026-03-03T13:35:22.971141926Z" level=info msg="ImageCreate event name:\"sha256:8926c34822743bb97f9003f92c30127bfeaad8bed71cd36f1c861ed8fda2c154\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 3 13:35:22.973335 containerd[1639]: time="2026-03-03T13:35:22.973318444Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:f0426100c873816560c520d542fa28999a98dad909edd04365f3b0eead790da3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 3 13:35:22.973912 containerd[1639]: time="2026-03-03T13:35:22.973892223Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.34.5\" with image id \"sha256:8926c34822743bb97f9003f92c30127bfeaad8bed71cd36f1c861ed8fda2c154\", repo tag \"registry.k8s.io/kube-controller-manager:v1.34.5\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:f0426100c873816560c520d542fa28999a98dad909edd04365f3b0eead790da3\", size \"22822771\" in 1.077779554s" Mar 3 13:35:22.973970 containerd[1639]: time="2026-03-03T13:35:22.973960581Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.34.5\" returns image reference \"sha256:8926c34822743bb97f9003f92c30127bfeaad8bed71cd36f1c861ed8fda2c154\"" Mar 3 13:35:22.974532 containerd[1639]: time="2026-03-03T13:35:22.974502685Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.34.5\"" Mar 3 13:35:23.915452 containerd[1639]: time="2026-03-03T13:35:23.915402549Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.34.5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 3 13:35:23.916324 containerd[1639]: time="2026-03-03T13:35:23.916124153Z" level=info 
msg="stop pulling image registry.k8s.io/kube-scheduler:v1.34.5: active requests=0, bytes read=15729846" Mar 3 13:35:23.917258 containerd[1639]: time="2026-03-03T13:35:23.917240032Z" level=info msg="ImageCreate event name:\"sha256:f6b3520b1732b4980b2528fe5622e62be26bb6a8d38da81349cb6ccd3a1e6d65\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 3 13:35:23.919302 containerd[1639]: time="2026-03-03T13:35:23.919279020Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:b67b0d627c8e99ffa362bd4d9a60ca9a6c449e363a5f88d2aa8c224bd84ca51d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 3 13:35:23.919951 containerd[1639]: time="2026-03-03T13:35:23.919930559Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.34.5\" with image id \"sha256:f6b3520b1732b4980b2528fe5622e62be26bb6a8d38da81349cb6ccd3a1e6d65\", repo tag \"registry.k8s.io/kube-scheduler:v1.34.5\", repo digest \"registry.k8s.io/kube-scheduler@sha256:b67b0d627c8e99ffa362bd4d9a60ca9a6c449e363a5f88d2aa8c224bd84ca51d\", size \"17386790\" in 945.352421ms" Mar 3 13:35:23.919990 containerd[1639]: time="2026-03-03T13:35:23.919953907Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.34.5\" returns image reference \"sha256:f6b3520b1732b4980b2528fe5622e62be26bb6a8d38da81349cb6ccd3a1e6d65\"" Mar 3 13:35:23.920551 containerd[1639]: time="2026-03-03T13:35:23.920527655Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.34.5\"" Mar 3 13:35:24.874694 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1522192873.mount: Deactivated successfully. 
Mar 3 13:35:25.064414 containerd[1639]: time="2026-03-03T13:35:25.064355258Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.34.5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 3 13:35:25.065333 containerd[1639]: time="2026-03-03T13:35:25.065195886Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.34.5: active requests=0, bytes read=25861798" Mar 3 13:35:25.066141 containerd[1639]: time="2026-03-03T13:35:25.066120139Z" level=info msg="ImageCreate event name:\"sha256:38728cde323c302ed9eca4f1b7c0080d17db50144e39398fcf901d9df13f0c3e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 3 13:35:25.067718 containerd[1639]: time="2026-03-03T13:35:25.067698405Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:8a22a3bf452d07af3b5a3064b089d2ad6579d5dd3b850386e05cc0f36dc3f4cf\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 3 13:35:25.068313 containerd[1639]: time="2026-03-03T13:35:25.068287998Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.34.5\" with image id \"sha256:38728cde323c302ed9eca4f1b7c0080d17db50144e39398fcf901d9df13f0c3e\", repo tag \"registry.k8s.io/kube-proxy:v1.34.5\", repo digest \"registry.k8s.io/kube-proxy@sha256:8a22a3bf452d07af3b5a3064b089d2ad6579d5dd3b850386e05cc0f36dc3f4cf\", size \"25860789\" in 1.147480843s" Mar 3 13:35:25.068417 containerd[1639]: time="2026-03-03T13:35:25.068397780Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.34.5\" returns image reference \"sha256:38728cde323c302ed9eca4f1b7c0080d17db50144e39398fcf901d9df13f0c3e\"" Mar 3 13:35:25.070746 containerd[1639]: time="2026-03-03T13:35:25.070731113Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.1\"" Mar 3 13:35:25.647855 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4128494315.mount: Deactivated successfully. 
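A rough throughput figure for the kube-proxy pull above, taking the size (25860789 bytes) and duration (1.147480843 s) straight from the log line; awk handles the floating-point division:

```shell
# Approximate pull rate: image size in bytes over the logged pull duration,
# converted to MiB/s. Both figures come from the kube-proxy line above.
size_bytes=25860789
seconds=1.147480843
rate=$(awk -v b="$size_bytes" -v s="$seconds" 'BEGIN { printf "%.1f", b / s / 1048576 }')
echo "~${rate} MiB/s"
```

The same arithmetic applied to the other pulls in this log gives figures in the same ballpark, consistent with all images coming from the same registry over the same link.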
Mar 3 13:35:26.401121 containerd[1639]: time="2026-03-03T13:35:26.401084106Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.12.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 3 13:35:26.402292 containerd[1639]: time="2026-03-03T13:35:26.402167280Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.12.1: active requests=0, bytes read=22388101" Mar 3 13:35:26.403211 containerd[1639]: time="2026-03-03T13:35:26.403190882Z" level=info msg="ImageCreate event name:\"sha256:52546a367cc9e0d924aa3b190596a9167fa6e53245023b5b5baf0f07e5443969\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 3 13:35:26.405133 containerd[1639]: time="2026-03-03T13:35:26.405111838Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:e8c262566636e6bc340ece6473b0eed193cad045384401529721ddbe6463d31c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 3 13:35:26.405808 containerd[1639]: time="2026-03-03T13:35:26.405789529Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.12.1\" with image id \"sha256:52546a367cc9e0d924aa3b190596a9167fa6e53245023b5b5baf0f07e5443969\", repo tag \"registry.k8s.io/coredns/coredns:v1.12.1\", repo digest \"registry.k8s.io/coredns/coredns@sha256:e8c262566636e6bc340ece6473b0eed193cad045384401529721ddbe6463d31c\", size \"22384805\" in 1.334991442s" Mar 3 13:35:26.405869 containerd[1639]: time="2026-03-03T13:35:26.405858709Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.1\" returns image reference \"sha256:52546a367cc9e0d924aa3b190596a9167fa6e53245023b5b5baf0f07e5443969\"" Mar 3 13:35:26.406435 containerd[1639]: time="2026-03-03T13:35:26.406410565Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10.1\"" Mar 3 13:35:26.635310 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. Mar 3 13:35:26.638866 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... 
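The roughly ten-second gap between each kubelet exit and the next "Scheduled restart job" line in this log is consistent with a unit restart policy like the following drop-in (a sketch only, not the unit file this Flatcar image actually ships; the path is illustrative):

```ini
# /etc/systemd/system/kubelet.service.d/10-restart.conf  (illustrative path)
[Service]
Restart=always
RestartSec=10
```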
Mar 3 13:35:26.824854 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 3 13:35:26.830959 (kubelet)[2222]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Mar 3 13:35:26.866069 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2059053388.mount: Deactivated successfully. Mar 3 13:35:26.872361 kubelet[2222]: E0303 13:35:26.872315 2222 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Mar 3 13:35:26.874396 containerd[1639]: time="2026-03-03T13:35:26.874358247Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 3 13:35:26.875796 containerd[1639]: time="2026-03-03T13:35:26.875689544Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10.1: active requests=0, bytes read=321240" Mar 3 13:35:26.876885 containerd[1639]: time="2026-03-03T13:35:26.876853681Z" level=info msg="ImageCreate event name:\"sha256:cd073f4c5f6a8e9dc6f3125ba00cf60819cae95c1ec84a1f146ee4a9cf9e803f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 3 13:35:26.877407 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Mar 3 13:35:26.878286 systemd[1]: kubelet.service: Failed with result 'exit-code'. Mar 3 13:35:26.878594 systemd[1]: kubelet.service: Consumed 173ms CPU time, 108.6M memory peak. 
Mar 3 13:35:26.879930 containerd[1639]: time="2026-03-03T13:35:26.879888606Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:278fb9dbcca9518083ad1e11276933a2e96f23de604a3a08cc3c80002767d24c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 3 13:35:26.880206 containerd[1639]: time="2026-03-03T13:35:26.880176537Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10.1\" with image id \"sha256:cd073f4c5f6a8e9dc6f3125ba00cf60819cae95c1ec84a1f146ee4a9cf9e803f\", repo tag \"registry.k8s.io/pause:3.10.1\", repo digest \"registry.k8s.io/pause@sha256:278fb9dbcca9518083ad1e11276933a2e96f23de604a3a08cc3c80002767d24c\", size \"320448\" in 473.745367ms" Mar 3 13:35:26.880206 containerd[1639]: time="2026-03-03T13:35:26.880203721Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10.1\" returns image reference \"sha256:cd073f4c5f6a8e9dc6f3125ba00cf60819cae95c1ec84a1f146ee4a9cf9e803f\"" Mar 3 13:35:26.880778 containerd[1639]: time="2026-03-03T13:35:26.880662382Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.6.5-0\"" Mar 3 13:35:27.393895 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount911836409.mount: Deactivated successfully. 
Mar 3 13:35:28.063520 containerd[1639]: time="2026-03-03T13:35:28.063458171Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.6.5-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 3 13:35:28.064441 containerd[1639]: time="2026-03-03T13:35:28.064284620Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.6.5-0: active requests=0, bytes read=22860762" Mar 3 13:35:28.065047 containerd[1639]: time="2026-03-03T13:35:28.065024797Z" level=info msg="ImageCreate event name:\"sha256:a3e246e9556e93d71e2850085ba581b376c76a9187b4b8a01c120f86579ef2b1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 3 13:35:28.067024 containerd[1639]: time="2026-03-03T13:35:28.066995393Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:042ef9c02799eb9303abf1aa99b09f09d94b8ee3ba0c2dd3f42dc4e1d3dce534\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 3 13:35:28.067603 containerd[1639]: time="2026-03-03T13:35:28.067574916Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.6.5-0\" with image id \"sha256:a3e246e9556e93d71e2850085ba581b376c76a9187b4b8a01c120f86579ef2b1\", repo tag \"registry.k8s.io/etcd:3.6.5-0\", repo digest \"registry.k8s.io/etcd@sha256:042ef9c02799eb9303abf1aa99b09f09d94b8ee3ba0c2dd3f42dc4e1d3dce534\", size \"22871747\" in 1.186690356s" Mar 3 13:35:28.067636 containerd[1639]: time="2026-03-03T13:35:28.067603506Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.6.5-0\" returns image reference \"sha256:a3e246e9556e93d71e2850085ba581b376c76a9187b4b8a01c120f86579ef2b1\"" Mar 3 13:35:30.448185 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Mar 3 13:35:30.448470 systemd[1]: kubelet.service: Consumed 173ms CPU time, 108.6M memory peak. Mar 3 13:35:30.452488 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 3 13:35:30.496316 systemd[1]: Reload requested from client PID 2322 ('systemctl') (unit session-7.scope)... 
Mar 3 13:35:30.496330 systemd[1]: Reloading... Mar 3 13:35:30.619690 zram_generator::config[2362]: No configuration found. Mar 3 13:35:30.801536 systemd[1]: Reloading finished in 304 ms. Mar 3 13:35:30.845305 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM Mar 3 13:35:30.845400 systemd[1]: kubelet.service: Failed with result 'signal'. Mar 3 13:35:30.845958 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Mar 3 13:35:30.846002 systemd[1]: kubelet.service: Consumed 112ms CPU time, 98.3M memory peak. Mar 3 13:35:30.847763 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 3 13:35:31.004253 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 3 13:35:31.010952 (kubelet)[2417]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Mar 3 13:35:31.041898 kubelet[2417]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Mar 3 13:35:31.041898 kubelet[2417]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Mar 3 13:35:31.041898 kubelet[2417]: I0303 13:35:31.041869 2417 server.go:213] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Mar 3 13:35:31.798461 kubelet[2417]: I0303 13:35:31.798396 2417 server.go:529] "Kubelet version" kubeletVersion="v1.34.4" Mar 3 13:35:31.798461 kubelet[2417]: I0303 13:35:31.798420 2417 server.go:531] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Mar 3 13:35:31.800711 kubelet[2417]: I0303 13:35:31.800642 2417 watchdog_linux.go:95] "Systemd watchdog is not enabled" Mar 3 13:35:31.800711 kubelet[2417]: I0303 13:35:31.800677 2417 watchdog_linux.go:137] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Mar 3 13:35:31.800879 kubelet[2417]: I0303 13:35:31.800850 2417 server.go:956] "Client rotation is on, will bootstrap in background" Mar 3 13:35:31.808698 kubelet[2417]: I0303 13:35:31.808502 2417 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Mar 3 13:35:31.809144 kubelet[2417]: E0303 13:35:31.809114 2417 certificate_manager.go:596] "Failed while requesting a signed certificate from the control plane" err="cannot create certificate signing request: Post \"https://89.167.115.149:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 89.167.115.149:6443: connect: connection refused" logger="kubernetes.io/kube-apiserver-client-kubelet.UnhandledError" Mar 3 13:35:31.812497 kubelet[2417]: I0303 13:35:31.812468 2417 server.go:1423] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Mar 3 13:35:31.815568 kubelet[2417]: I0303 13:35:31.815553 2417 server.go:781] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
Defaulting to /" Mar 3 13:35:31.816905 kubelet[2417]: I0303 13:35:31.816859 2417 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Mar 3 13:35:31.817016 kubelet[2417]: I0303 13:35:31.816888 2417 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4459-2-4-a-1c48930dd2","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Mar 3 13:35:31.817016 kubelet[2417]: I0303 13:35:31.817005 2417 topology_manager.go:138] "Creating topology manager with none policy" Mar 3 
13:35:31.817016 kubelet[2417]: I0303 13:35:31.817012 2417 container_manager_linux.go:306] "Creating device plugin manager" Mar 3 13:35:31.817180 kubelet[2417]: I0303 13:35:31.817076 2417 container_manager_linux.go:315] "Creating Dynamic Resource Allocation (DRA) manager" Mar 3 13:35:31.819495 kubelet[2417]: I0303 13:35:31.819462 2417 state_mem.go:36] "Initialized new in-memory state store" Mar 3 13:35:31.819722 kubelet[2417]: I0303 13:35:31.819705 2417 kubelet.go:475] "Attempting to sync node with API server" Mar 3 13:35:31.819756 kubelet[2417]: I0303 13:35:31.819726 2417 kubelet.go:376] "Adding static pod path" path="/etc/kubernetes/manifests" Mar 3 13:35:31.819756 kubelet[2417]: I0303 13:35:31.819752 2417 kubelet.go:387] "Adding apiserver pod source" Mar 3 13:35:31.819800 kubelet[2417]: I0303 13:35:31.819764 2417 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Mar 3 13:35:31.821347 kubelet[2417]: E0303 13:35:31.820830 2417 reflector.go:205] "Failed to watch" err="failed to list *v1.Node: Get \"https://89.167.115.149:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4459-2-4-a-1c48930dd2&limit=500&resourceVersion=0\": dial tcp 89.167.115.149:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Mar 3 13:35:31.821347 kubelet[2417]: E0303 13:35:31.821324 2417 reflector.go:205] "Failed to watch" err="failed to list *v1.Service: Get \"https://89.167.115.149:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 89.167.115.149:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Mar 3 13:35:31.821984 kubelet[2417]: I0303 13:35:31.821974 2417 kuberuntime_manager.go:291] "Container runtime initialized" containerRuntime="containerd" version="v2.0.7" apiVersion="v1" Mar 3 13:35:31.822393 kubelet[2417]: I0303 13:35:31.822382 2417 kubelet.go:940] "Not 
starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Mar 3 13:35:31.822451 kubelet[2417]: I0303 13:35:31.822443 2417 kubelet.go:964] "Not starting PodCertificateRequest manager because we are in static kubelet mode or the PodCertificateProjection feature gate is disabled" Mar 3 13:35:31.822517 kubelet[2417]: W0303 13:35:31.822510 2417 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. Mar 3 13:35:31.826076 kubelet[2417]: I0303 13:35:31.826064 2417 server.go:1262] "Started kubelet" Mar 3 13:35:31.827513 kubelet[2417]: I0303 13:35:31.827468 2417 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Mar 3 13:35:31.830147 kubelet[2417]: E0303 13:35:31.829152 2417 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://89.167.115.149:6443/api/v1/namespaces/default/events\": dial tcp 89.167.115.149:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4459-2-4-a-1c48930dd2.1899583961e91a4b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4459-2-4-a-1c48930dd2,UID:ci-4459-2-4-a-1c48930dd2,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4459-2-4-a-1c48930dd2,},FirstTimestamp:2026-03-03 13:35:31.826047563 +0000 UTC m=+0.811755525,LastTimestamp:2026-03-03 13:35:31.826047563 +0000 UTC m=+0.811755525,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4459-2-4-a-1c48930dd2,}" Mar 3 13:35:31.831284 kubelet[2417]: E0303 13:35:31.831260 2417 kubelet.go:1615] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Mar 3 13:35:31.831361 kubelet[2417]: I0303 13:35:31.831336 2417 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Mar 3 13:35:31.832585 kubelet[2417]: I0303 13:35:31.832409 2417 server.go:310] "Adding debug handlers to kubelet server" Mar 3 13:35:31.832775 kubelet[2417]: I0303 13:35:31.832749 2417 volume_manager.go:313] "Starting Kubelet Volume Manager" Mar 3 13:35:31.832953 kubelet[2417]: E0303 13:35:31.832929 2417 kubelet_node_status.go:404] "Error getting the current node from lister" err="node \"ci-4459-2-4-a-1c48930dd2\" not found" Mar 3 13:35:31.837689 kubelet[2417]: I0303 13:35:31.836459 2417 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Mar 3 13:35:31.837689 kubelet[2417]: I0303 13:35:31.836492 2417 reconciler.go:29] "Reconciler: start to sync state" Mar 3 13:35:31.838574 kubelet[2417]: I0303 13:35:31.838524 2417 ratelimit.go:56] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Mar 3 13:35:31.838744 kubelet[2417]: I0303 13:35:31.838731 2417 server_v1.go:49] "podresources" method="list" useActivePods=true Mar 3 13:35:31.839005 kubelet[2417]: I0303 13:35:31.838993 2417 server.go:249] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Mar 3 13:35:31.840293 kubelet[2417]: I0303 13:35:31.840278 2417 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Mar 3 13:35:31.844245 kubelet[2417]: E0303 13:35:31.844219 2417 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://89.167.115.149:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4459-2-4-a-1c48930dd2?timeout=10s\": dial tcp 89.167.115.149:6443: connect: connection refused" interval="200ms" Mar 3 13:35:31.845888 kubelet[2417]: E0303 
13:35:31.845857 2417 reflector.go:205] "Failed to watch" err="failed to list *v1.CSIDriver: Get \"https://89.167.115.149:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 89.167.115.149:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Mar 3 13:35:31.846270 kubelet[2417]: I0303 13:35:31.846162 2417 factory.go:223] Registration of the systemd container factory successfully Mar 3 13:35:31.846441 kubelet[2417]: I0303 13:35:31.846426 2417 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Mar 3 13:35:31.847868 kubelet[2417]: I0303 13:35:31.847855 2417 factory.go:223] Registration of the containerd container factory successfully Mar 3 13:35:31.860928 kubelet[2417]: I0303 13:35:31.860895 2417 cpu_manager.go:221] "Starting CPU manager" policy="none" Mar 3 13:35:31.860928 kubelet[2417]: I0303 13:35:31.860925 2417 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Mar 3 13:35:31.860928 kubelet[2417]: I0303 13:35:31.860937 2417 state_mem.go:36] "Initialized new in-memory state store" Mar 3 13:35:31.862235 kubelet[2417]: I0303 13:35:31.862151 2417 kubelet_network_linux.go:54] "Initialized iptables rules." protocol="IPv4" Mar 3 13:35:31.862472 kubelet[2417]: I0303 13:35:31.862442 2417 policy_none.go:49] "None policy: Start" Mar 3 13:35:31.862472 kubelet[2417]: I0303 13:35:31.862463 2417 memory_manager.go:187] "Starting memorymanager" policy="None" Mar 3 13:35:31.862472 kubelet[2417]: I0303 13:35:31.862472 2417 state_mem.go:36] "Initializing new in-memory state store" logger="Memory Manager state checkpoint" Mar 3 13:35:31.863646 kubelet[2417]: I0303 13:35:31.863616 2417 kubelet_network_linux.go:54] "Initialized iptables rules." 
protocol="IPv6" Mar 3 13:35:31.863646 kubelet[2417]: I0303 13:35:31.863633 2417 status_manager.go:244] "Starting to sync pod status with apiserver" Mar 3 13:35:31.863646 kubelet[2417]: I0303 13:35:31.863647 2417 kubelet.go:2428] "Starting kubelet main sync loop" Mar 3 13:35:31.863753 kubelet[2417]: E0303 13:35:31.863694 2417 kubelet.go:2452] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Mar 3 13:35:31.864334 kubelet[2417]: I0303 13:35:31.863906 2417 policy_none.go:47] "Start" Mar 3 13:35:31.867654 kubelet[2417]: E0303 13:35:31.867635 2417 reflector.go:205] "Failed to watch" err="failed to list *v1.RuntimeClass: Get \"https://89.167.115.149:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 89.167.115.149:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" Mar 3 13:35:31.870628 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. Mar 3 13:35:31.883069 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. Mar 3 13:35:31.885999 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. 
Mar 3 13:35:31.905453 kubelet[2417]: E0303 13:35:31.904553 2417 manager.go:513] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Mar 3 13:35:31.905453 kubelet[2417]: I0303 13:35:31.904816 2417 eviction_manager.go:189] "Eviction manager: starting control loop" Mar 3 13:35:31.905453 kubelet[2417]: I0303 13:35:31.904827 2417 container_log_manager.go:146] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 3 13:35:31.905453 kubelet[2417]: I0303 13:35:31.905118 2417 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Mar 3 13:35:31.907949 kubelet[2417]: E0303 13:35:31.907926 2417 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime" Mar 3 13:35:31.907999 kubelet[2417]: E0303 13:35:31.907962 2417 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-4459-2-4-a-1c48930dd2\" not found" Mar 3 13:35:31.981489 systemd[1]: Created slice kubepods-burstable-podc935c4a8a9f6f7d0099370a1fb38915e.slice - libcontainer container kubepods-burstable-podc935c4a8a9f6f7d0099370a1fb38915e.slice. Mar 3 13:35:31.999385 kubelet[2417]: E0303 13:35:31.999333 2417 kubelet.go:3216] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-4-a-1c48930dd2\" not found" node="ci-4459-2-4-a-1c48930dd2" Mar 3 13:35:32.000689 systemd[1]: Created slice kubepods-burstable-podc856a9a8037c97609a00d34ac066f722.slice - libcontainer container kubepods-burstable-podc856a9a8037c97609a00d34ac066f722.slice. 
Mar 3 13:35:32.003691 kubelet[2417]: E0303 13:35:32.003353 2417 kubelet.go:3216] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-4-a-1c48930dd2\" not found" node="ci-4459-2-4-a-1c48930dd2" Mar 3 13:35:32.007378 kubelet[2417]: I0303 13:35:32.007368 2417 kubelet_node_status.go:75] "Attempting to register node" node="ci-4459-2-4-a-1c48930dd2" Mar 3 13:35:32.008584 kubelet[2417]: E0303 13:35:32.008567 2417 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://89.167.115.149:6443/api/v1/nodes\": dial tcp 89.167.115.149:6443: connect: connection refused" node="ci-4459-2-4-a-1c48930dd2" Mar 3 13:35:32.008608 systemd[1]: Created slice kubepods-burstable-podb55be887bce29f8749013c7dd6d0b968.slice - libcontainer container kubepods-burstable-podb55be887bce29f8749013c7dd6d0b968.slice. Mar 3 13:35:32.010551 kubelet[2417]: E0303 13:35:32.010535 2417 kubelet.go:3216] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-4-a-1c48930dd2\" not found" node="ci-4459-2-4-a-1c48930dd2" Mar 3 13:35:32.037288 kubelet[2417]: I0303 13:35:32.037246 2417 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/c935c4a8a9f6f7d0099370a1fb38915e-ca-certs\") pod \"kube-apiserver-ci-4459-2-4-a-1c48930dd2\" (UID: \"c935c4a8a9f6f7d0099370a1fb38915e\") " pod="kube-system/kube-apiserver-ci-4459-2-4-a-1c48930dd2" Mar 3 13:35:32.045919 kubelet[2417]: E0303 13:35:32.045872 2417 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://89.167.115.149:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4459-2-4-a-1c48930dd2?timeout=10s\": dial tcp 89.167.115.149:6443: connect: connection refused" interval="400ms" Mar 3 13:35:32.138869 kubelet[2417]: I0303 13:35:32.138736 2417 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/c935c4a8a9f6f7d0099370a1fb38915e-k8s-certs\") pod \"kube-apiserver-ci-4459-2-4-a-1c48930dd2\" (UID: \"c935c4a8a9f6f7d0099370a1fb38915e\") " pod="kube-system/kube-apiserver-ci-4459-2-4-a-1c48930dd2" Mar 3 13:35:32.138869 kubelet[2417]: I0303 13:35:32.138839 2417 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/c856a9a8037c97609a00d34ac066f722-ca-certs\") pod \"kube-controller-manager-ci-4459-2-4-a-1c48930dd2\" (UID: \"c856a9a8037c97609a00d34ac066f722\") " pod="kube-system/kube-controller-manager-ci-4459-2-4-a-1c48930dd2" Mar 3 13:35:32.139131 kubelet[2417]: I0303 13:35:32.138868 2417 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/c856a9a8037c97609a00d34ac066f722-flexvolume-dir\") pod \"kube-controller-manager-ci-4459-2-4-a-1c48930dd2\" (UID: \"c856a9a8037c97609a00d34ac066f722\") " pod="kube-system/kube-controller-manager-ci-4459-2-4-a-1c48930dd2" Mar 3 13:35:32.139131 kubelet[2417]: I0303 13:35:32.138927 2417 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/c856a9a8037c97609a00d34ac066f722-kubeconfig\") pod \"kube-controller-manager-ci-4459-2-4-a-1c48930dd2\" (UID: \"c856a9a8037c97609a00d34ac066f722\") " pod="kube-system/kube-controller-manager-ci-4459-2-4-a-1c48930dd2" Mar 3 13:35:32.139131 kubelet[2417]: I0303 13:35:32.138953 2417 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/c856a9a8037c97609a00d34ac066f722-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4459-2-4-a-1c48930dd2\" (UID: 
\"c856a9a8037c97609a00d34ac066f722\") " pod="kube-system/kube-controller-manager-ci-4459-2-4-a-1c48930dd2" Mar 3 13:35:32.139131 kubelet[2417]: I0303 13:35:32.139027 2417 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/b55be887bce29f8749013c7dd6d0b968-kubeconfig\") pod \"kube-scheduler-ci-4459-2-4-a-1c48930dd2\" (UID: \"b55be887bce29f8749013c7dd6d0b968\") " pod="kube-system/kube-scheduler-ci-4459-2-4-a-1c48930dd2" Mar 3 13:35:32.139131 kubelet[2417]: I0303 13:35:32.139051 2417 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/c935c4a8a9f6f7d0099370a1fb38915e-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4459-2-4-a-1c48930dd2\" (UID: \"c935c4a8a9f6f7d0099370a1fb38915e\") " pod="kube-system/kube-apiserver-ci-4459-2-4-a-1c48930dd2" Mar 3 13:35:32.139393 kubelet[2417]: I0303 13:35:32.139105 2417 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/c856a9a8037c97609a00d34ac066f722-k8s-certs\") pod \"kube-controller-manager-ci-4459-2-4-a-1c48930dd2\" (UID: \"c856a9a8037c97609a00d34ac066f722\") " pod="kube-system/kube-controller-manager-ci-4459-2-4-a-1c48930dd2" Mar 3 13:35:32.212497 kubelet[2417]: I0303 13:35:32.212422 2417 kubelet_node_status.go:75] "Attempting to register node" node="ci-4459-2-4-a-1c48930dd2" Mar 3 13:35:32.212991 kubelet[2417]: E0303 13:35:32.212945 2417 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://89.167.115.149:6443/api/v1/nodes\": dial tcp 89.167.115.149:6443: connect: connection refused" node="ci-4459-2-4-a-1c48930dd2" Mar 3 13:35:32.305342 containerd[1639]: time="2026-03-03T13:35:32.305210527Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:kube-apiserver-ci-4459-2-4-a-1c48930dd2,Uid:c935c4a8a9f6f7d0099370a1fb38915e,Namespace:kube-system,Attempt:0,}" Mar 3 13:35:32.307598 containerd[1639]: time="2026-03-03T13:35:32.307531740Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4459-2-4-a-1c48930dd2,Uid:c856a9a8037c97609a00d34ac066f722,Namespace:kube-system,Attempt:0,}" Mar 3 13:35:32.314506 containerd[1639]: time="2026-03-03T13:35:32.314457488Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4459-2-4-a-1c48930dd2,Uid:b55be887bce29f8749013c7dd6d0b968,Namespace:kube-system,Attempt:0,}" Mar 3 13:35:32.448861 kubelet[2417]: E0303 13:35:32.447043 2417 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://89.167.115.149:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4459-2-4-a-1c48930dd2?timeout=10s\": dial tcp 89.167.115.149:6443: connect: connection refused" interval="800ms" Mar 3 13:35:32.616211 kubelet[2417]: I0303 13:35:32.616154 2417 kubelet_node_status.go:75] "Attempting to register node" node="ci-4459-2-4-a-1c48930dd2" Mar 3 13:35:32.616795 kubelet[2417]: E0303 13:35:32.616756 2417 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://89.167.115.149:6443/api/v1/nodes\": dial tcp 89.167.115.149:6443: connect: connection refused" node="ci-4459-2-4-a-1c48930dd2" Mar 3 13:35:32.788630 kubelet[2417]: E0303 13:35:32.788455 2417 reflector.go:205] "Failed to watch" err="failed to list *v1.Service: Get \"https://89.167.115.149:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 89.167.115.149:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Mar 3 13:35:32.831131 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3383338584.mount: Deactivated successfully. 
Mar 3 13:35:32.841019 containerd[1639]: time="2026-03-03T13:35:32.840500770Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Mar 3 13:35:32.844610 containerd[1639]: time="2026-03-03T13:35:32.844515882Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=321160" Mar 3 13:35:32.845703 containerd[1639]: time="2026-03-03T13:35:32.845605527Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Mar 3 13:35:32.850843 containerd[1639]: time="2026-03-03T13:35:32.850777352Z" level=info msg="ImageCreate event name:\"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Mar 3 13:35:32.853730 containerd[1639]: time="2026-03-03T13:35:32.853067892Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=0" Mar 3 13:35:32.854863 containerd[1639]: time="2026-03-03T13:35:32.854800561Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Mar 3 13:35:32.855852 containerd[1639]: time="2026-03-03T13:35:32.855787715Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=0" Mar 3 13:35:32.857060 containerd[1639]: time="2026-03-03T13:35:32.857024463Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Mar 3 13:35:32.859115 
containerd[1639]: time="2026-03-03T13:35:32.859057992Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 541.890893ms" Mar 3 13:35:32.861782 containerd[1639]: time="2026-03-03T13:35:32.861371478Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 552.338957ms" Mar 3 13:35:32.862572 containerd[1639]: time="2026-03-03T13:35:32.862533974Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 552.265927ms" Mar 3 13:35:32.908753 containerd[1639]: time="2026-03-03T13:35:32.908719537Z" level=info msg="connecting to shim 8160feff6e460e0c42ef5320152913d449f11d7b7e54a08c3dc71d1a475ef7a5" address="unix:///run/containerd/s/50ad18e68255cf1071049f09e621c9c9619568dd926cb8b6a83de66dbbaf7a56" namespace=k8s.io protocol=ttrpc version=3 Mar 3 13:35:32.910212 containerd[1639]: time="2026-03-03T13:35:32.910194075Z" level=info msg="connecting to shim bae0e1b80809613d69ac08e65adf27fbcbf4bede06c58d4bf58debd0cf34226c" address="unix:///run/containerd/s/da835e90a6c7b98e41199e5d60a316b15f9188e5ec23363b39057b5a3c023d84" namespace=k8s.io protocol=ttrpc version=3 Mar 3 13:35:32.914558 containerd[1639]: time="2026-03-03T13:35:32.914537793Z" level=info msg="connecting to shim 
10398700551117778987d7477260e6e2515e1b6701dec3aab4d989cb24e4c927" address="unix:///run/containerd/s/e95189b606a7b413bdea73056fba8e4278993e230b9453445365eea7c5b9b8bc" namespace=k8s.io protocol=ttrpc version=3 Mar 3 13:35:32.928635 kubelet[2417]: E0303 13:35:32.928592 2417 reflector.go:205] "Failed to watch" err="failed to list *v1.CSIDriver: Get \"https://89.167.115.149:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 89.167.115.149:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Mar 3 13:35:32.933947 systemd[1]: Started cri-containerd-8160feff6e460e0c42ef5320152913d449f11d7b7e54a08c3dc71d1a475ef7a5.scope - libcontainer container 8160feff6e460e0c42ef5320152913d449f11d7b7e54a08c3dc71d1a475ef7a5. Mar 3 13:35:32.939808 systemd[1]: Started cri-containerd-bae0e1b80809613d69ac08e65adf27fbcbf4bede06c58d4bf58debd0cf34226c.scope - libcontainer container bae0e1b80809613d69ac08e65adf27fbcbf4bede06c58d4bf58debd0cf34226c. Mar 3 13:35:32.945481 systemd[1]: Started cri-containerd-10398700551117778987d7477260e6e2515e1b6701dec3aab4d989cb24e4c927.scope - libcontainer container 10398700551117778987d7477260e6e2515e1b6701dec3aab4d989cb24e4c927. 
Mar 3 13:35:32.999691 containerd[1639]: time="2026-03-03T13:35:32.999050920Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4459-2-4-a-1c48930dd2,Uid:c935c4a8a9f6f7d0099370a1fb38915e,Namespace:kube-system,Attempt:0,} returns sandbox id \"10398700551117778987d7477260e6e2515e1b6701dec3aab4d989cb24e4c927\""
Mar 3 13:35:33.009376 containerd[1639]: time="2026-03-03T13:35:33.009336586Z" level=info msg="CreateContainer within sandbox \"10398700551117778987d7477260e6e2515e1b6701dec3aab4d989cb24e4c927\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}"
Mar 3 13:35:33.015180 containerd[1639]: time="2026-03-03T13:35:33.015147632Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4459-2-4-a-1c48930dd2,Uid:b55be887bce29f8749013c7dd6d0b968,Namespace:kube-system,Attempt:0,} returns sandbox id \"8160feff6e460e0c42ef5320152913d449f11d7b7e54a08c3dc71d1a475ef7a5\""
Mar 3 13:35:33.022261 containerd[1639]: time="2026-03-03T13:35:33.022228345Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4459-2-4-a-1c48930dd2,Uid:c856a9a8037c97609a00d34ac066f722,Namespace:kube-system,Attempt:0,} returns sandbox id \"bae0e1b80809613d69ac08e65adf27fbcbf4bede06c58d4bf58debd0cf34226c\""
Mar 3 13:35:33.022965 containerd[1639]: time="2026-03-03T13:35:33.022950742Z" level=info msg="CreateContainer within sandbox \"8160feff6e460e0c42ef5320152913d449f11d7b7e54a08c3dc71d1a475ef7a5\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}"
Mar 3 13:35:33.024031 containerd[1639]: time="2026-03-03T13:35:33.024005652Z" level=info msg="Container 6207f63513c36ee3afe06def5d74a73353a8acd3537d6aded341b7da2450302d: CDI devices from CRI Config.CDIDevices: []"
Mar 3 13:35:33.030242 containerd[1639]: time="2026-03-03T13:35:33.030212185Z" level=info msg="CreateContainer within sandbox \"bae0e1b80809613d69ac08e65adf27fbcbf4bede06c58d4bf58debd0cf34226c\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}"
Mar 3 13:35:33.035146 containerd[1639]: time="2026-03-03T13:35:33.035128470Z" level=info msg="CreateContainer within sandbox \"10398700551117778987d7477260e6e2515e1b6701dec3aab4d989cb24e4c927\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"6207f63513c36ee3afe06def5d74a73353a8acd3537d6aded341b7da2450302d\""
Mar 3 13:35:33.035551 containerd[1639]: time="2026-03-03T13:35:33.035536532Z" level=info msg="StartContainer for \"6207f63513c36ee3afe06def5d74a73353a8acd3537d6aded341b7da2450302d\""
Mar 3 13:35:33.036704 containerd[1639]: time="2026-03-03T13:35:33.036677222Z" level=info msg="connecting to shim 6207f63513c36ee3afe06def5d74a73353a8acd3537d6aded341b7da2450302d" address="unix:///run/containerd/s/e95189b606a7b413bdea73056fba8e4278993e230b9453445365eea7c5b9b8bc" protocol=ttrpc version=3
Mar 3 13:35:33.040979 containerd[1639]: time="2026-03-03T13:35:33.040602095Z" level=info msg="Container 8b4375371a18044a6cf1841dbb85c8ad936f2468a3360009604b6af8aa3db15a: CDI devices from CRI Config.CDIDevices: []"
Mar 3 13:35:33.042766 containerd[1639]: time="2026-03-03T13:35:33.042729599Z" level=info msg="Container 7160af87749acd009d42b59a3b89da290eb5c5a4ee53bedea1892a5d5b535456: CDI devices from CRI Config.CDIDevices: []"
Mar 3 13:35:33.046949 containerd[1639]: time="2026-03-03T13:35:33.046923737Z" level=info msg="CreateContainer within sandbox \"8160feff6e460e0c42ef5320152913d449f11d7b7e54a08c3dc71d1a475ef7a5\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"8b4375371a18044a6cf1841dbb85c8ad936f2468a3360009604b6af8aa3db15a\""
Mar 3 13:35:33.047423 containerd[1639]: time="2026-03-03T13:35:33.047382620Z" level=info msg="StartContainer for \"8b4375371a18044a6cf1841dbb85c8ad936f2468a3360009604b6af8aa3db15a\""
Mar 3 13:35:33.048818 containerd[1639]: time="2026-03-03T13:35:33.048791494Z" level=info msg="connecting to shim 8b4375371a18044a6cf1841dbb85c8ad936f2468a3360009604b6af8aa3db15a" address="unix:///run/containerd/s/50ad18e68255cf1071049f09e621c9c9619568dd926cb8b6a83de66dbbaf7a56" protocol=ttrpc version=3
Mar 3 13:35:33.055405 containerd[1639]: time="2026-03-03T13:35:33.055381931Z" level=info msg="CreateContainer within sandbox \"bae0e1b80809613d69ac08e65adf27fbcbf4bede06c58d4bf58debd0cf34226c\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"7160af87749acd009d42b59a3b89da290eb5c5a4ee53bedea1892a5d5b535456\""
Mar 3 13:35:33.056527 containerd[1639]: time="2026-03-03T13:35:33.056488043Z" level=info msg="StartContainer for \"7160af87749acd009d42b59a3b89da290eb5c5a4ee53bedea1892a5d5b535456\""
Mar 3 13:35:33.057361 containerd[1639]: time="2026-03-03T13:35:33.057325720Z" level=info msg="connecting to shim 7160af87749acd009d42b59a3b89da290eb5c5a4ee53bedea1892a5d5b535456" address="unix:///run/containerd/s/da835e90a6c7b98e41199e5d60a316b15f9188e5ec23363b39057b5a3c023d84" protocol=ttrpc version=3
Mar 3 13:35:33.059988 systemd[1]: Started cri-containerd-6207f63513c36ee3afe06def5d74a73353a8acd3537d6aded341b7da2450302d.scope - libcontainer container 6207f63513c36ee3afe06def5d74a73353a8acd3537d6aded341b7da2450302d.
Mar 3 13:35:33.072783 systemd[1]: Started cri-containerd-8b4375371a18044a6cf1841dbb85c8ad936f2468a3360009604b6af8aa3db15a.scope - libcontainer container 8b4375371a18044a6cf1841dbb85c8ad936f2468a3360009604b6af8aa3db15a.
Mar 3 13:35:33.081861 systemd[1]: Started cri-containerd-7160af87749acd009d42b59a3b89da290eb5c5a4ee53bedea1892a5d5b535456.scope - libcontainer container 7160af87749acd009d42b59a3b89da290eb5c5a4ee53bedea1892a5d5b535456.
Mar 3 13:35:33.132785 kubelet[2417]: E0303 13:35:33.132731 2417 reflector.go:205] "Failed to watch" err="failed to list *v1.Node: Get \"https://89.167.115.149:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4459-2-4-a-1c48930dd2&limit=500&resourceVersion=0\": dial tcp 89.167.115.149:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Mar 3 13:35:33.154182 containerd[1639]: time="2026-03-03T13:35:33.154138691Z" level=info msg="StartContainer for \"6207f63513c36ee3afe06def5d74a73353a8acd3537d6aded341b7da2450302d\" returns successfully"
Mar 3 13:35:33.155338 containerd[1639]: time="2026-03-03T13:35:33.155039224Z" level=info msg="StartContainer for \"7160af87749acd009d42b59a3b89da290eb5c5a4ee53bedea1892a5d5b535456\" returns successfully"
Mar 3 13:35:33.164054 kubelet[2417]: E0303 13:35:33.164018 2417 reflector.go:205] "Failed to watch" err="failed to list *v1.RuntimeClass: Get \"https://89.167.115.149:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 89.167.115.149:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass"
Mar 3 13:35:33.167865 containerd[1639]: time="2026-03-03T13:35:33.167804661Z" level=info msg="StartContainer for \"8b4375371a18044a6cf1841dbb85c8ad936f2468a3360009604b6af8aa3db15a\" returns successfully"
Mar 3 13:35:33.420336 kubelet[2417]: I0303 13:35:33.419837 2417 kubelet_node_status.go:75] "Attempting to register node" node="ci-4459-2-4-a-1c48930dd2"
Mar 3 13:35:33.877961 kubelet[2417]: E0303 13:35:33.877700 2417 kubelet.go:3216] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-4-a-1c48930dd2\" not found" node="ci-4459-2-4-a-1c48930dd2"
Mar 3 13:35:33.879331 kubelet[2417]: E0303 13:35:33.879308 2417 kubelet.go:3216] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-4-a-1c48930dd2\" not found" node="ci-4459-2-4-a-1c48930dd2"
Mar 3 13:35:33.883032 kubelet[2417]: E0303 13:35:33.883003 2417 kubelet.go:3216] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-4-a-1c48930dd2\" not found" node="ci-4459-2-4-a-1c48930dd2"
Mar 3 13:35:34.478359 kubelet[2417]: E0303 13:35:34.478312 2417 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ci-4459-2-4-a-1c48930dd2\" not found" node="ci-4459-2-4-a-1c48930dd2"
Mar 3 13:35:34.544530 kubelet[2417]: E0303 13:35:34.544416 2417 event.go:359] "Server rejected event (will not retry!)" err="namespaces \"default\" not found" event="&Event{ObjectMeta:{ci-4459-2-4-a-1c48930dd2.1899583961e91a4b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4459-2-4-a-1c48930dd2,UID:ci-4459-2-4-a-1c48930dd2,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4459-2-4-a-1c48930dd2,},FirstTimestamp:2026-03-03 13:35:31.826047563 +0000 UTC m=+0.811755525,LastTimestamp:2026-03-03 13:35:31.826047563 +0000 UTC m=+0.811755525,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4459-2-4-a-1c48930dd2,}"
Mar 3 13:35:34.604258 kubelet[2417]: I0303 13:35:34.603183 2417 kubelet_node_status.go:78] "Successfully registered node" node="ci-4459-2-4-a-1c48930dd2"
Mar 3 13:35:34.604258 kubelet[2417]: E0303 13:35:34.603225 2417 kubelet_node_status.go:486] "Error updating node status, will retry" err="error getting node \"ci-4459-2-4-a-1c48930dd2\": node \"ci-4459-2-4-a-1c48930dd2\" not found"
Mar 3 13:35:34.618597 kubelet[2417]: E0303 13:35:34.618564 2417 kubelet_node_status.go:404] "Error getting the current node from lister" err="node \"ci-4459-2-4-a-1c48930dd2\" not found"
Mar 3 13:35:34.719292 kubelet[2417]: E0303 13:35:34.719243 2417 kubelet_node_status.go:404] "Error getting the current node from lister" err="node \"ci-4459-2-4-a-1c48930dd2\" not found"
Mar 3 13:35:34.820925 kubelet[2417]: E0303 13:35:34.820163 2417 kubelet_node_status.go:404] "Error getting the current node from lister" err="node \"ci-4459-2-4-a-1c48930dd2\" not found"
Mar 3 13:35:34.890225 kubelet[2417]: E0303 13:35:34.890019 2417 kubelet.go:3216] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-4-a-1c48930dd2\" not found" node="ci-4459-2-4-a-1c48930dd2"
Mar 3 13:35:34.893055 kubelet[2417]: E0303 13:35:34.893009 2417 kubelet.go:3216] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-4-a-1c48930dd2\" not found" node="ci-4459-2-4-a-1c48930dd2"
Mar 3 13:35:34.920744 kubelet[2417]: E0303 13:35:34.920687 2417 kubelet_node_status.go:404] "Error getting the current node from lister" err="node \"ci-4459-2-4-a-1c48930dd2\" not found"
Mar 3 13:35:35.021466 kubelet[2417]: E0303 13:35:35.021411 2417 kubelet_node_status.go:404] "Error getting the current node from lister" err="node \"ci-4459-2-4-a-1c48930dd2\" not found"
Mar 3 13:35:35.122599 kubelet[2417]: E0303 13:35:35.122431 2417 kubelet_node_status.go:404] "Error getting the current node from lister" err="node \"ci-4459-2-4-a-1c48930dd2\" not found"
Mar 3 13:35:35.223637 kubelet[2417]: E0303 13:35:35.223566 2417 kubelet_node_status.go:404] "Error getting the current node from lister" err="node \"ci-4459-2-4-a-1c48930dd2\" not found"
Mar 3 13:35:35.324030 kubelet[2417]: E0303 13:35:35.323983 2417 kubelet_node_status.go:404] "Error getting the current node from lister" err="node \"ci-4459-2-4-a-1c48930dd2\" not found"
Mar 3 13:35:35.424477 kubelet[2417]: E0303 13:35:35.424430 2417 kubelet_node_status.go:404] "Error getting the current node from lister" err="node \"ci-4459-2-4-a-1c48930dd2\" not found"
Mar 3 13:35:35.525265 kubelet[2417]: E0303 13:35:35.525212 2417 kubelet_node_status.go:404] "Error getting the current node from lister" err="node \"ci-4459-2-4-a-1c48930dd2\" not found"
Mar 3 13:35:35.533755 kubelet[2417]: I0303 13:35:35.533738 2417 kubelet.go:3220] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4459-2-4-a-1c48930dd2"
Mar 3 13:35:35.541785 kubelet[2417]: I0303 13:35:35.541759 2417 kubelet.go:3220] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4459-2-4-a-1c48930dd2"
Mar 3 13:35:35.547107 kubelet[2417]: I0303 13:35:35.547081 2417 kubelet.go:3220] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4459-2-4-a-1c48930dd2"
Mar 3 13:35:35.823703 kubelet[2417]: I0303 13:35:35.823335 2417 apiserver.go:52] "Watching apiserver"
Mar 3 13:35:35.838103 kubelet[2417]: I0303 13:35:35.838061 2417 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world"
Mar 3 13:35:36.501466 systemd[1]: Reload requested from client PID 2701 ('systemctl') (unit session-7.scope)...
Mar 3 13:35:36.501494 systemd[1]: Reloading...
Mar 3 13:35:36.598720 zram_generator::config[2745]: No configuration found.
Mar 3 13:35:36.778771 systemd[1]: Reloading finished in 276 ms.
Mar 3 13:35:36.804446 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 3 13:35:36.816886 systemd[1]: kubelet.service: Deactivated successfully.
Mar 3 13:35:36.817124 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 3 13:35:36.817170 systemd[1]: kubelet.service: Consumed 1.182s CPU time, 125.3M memory peak.
Mar 3 13:35:36.820119 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 3 13:35:36.998825 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 3 13:35:37.014115 (kubelet)[2796]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS
Mar 3 13:35:37.044660 kubelet[2796]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Mar 3 13:35:37.044660 kubelet[2796]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 3 13:35:37.044660 kubelet[2796]: I0303 13:35:37.044419 2796 server.go:213] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Mar 3 13:35:37.051014 kubelet[2796]: I0303 13:35:37.050962 2796 server.go:529] "Kubelet version" kubeletVersion="v1.34.4"
Mar 3 13:35:37.051135 kubelet[2796]: I0303 13:35:37.051127 2796 server.go:531] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Mar 3 13:35:37.051186 kubelet[2796]: I0303 13:35:37.051179 2796 watchdog_linux.go:95] "Systemd watchdog is not enabled"
Mar 3 13:35:37.051219 kubelet[2796]: I0303 13:35:37.051213 2796 watchdog_linux.go:137] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Mar 3 13:35:37.051418 kubelet[2796]: I0303 13:35:37.051408 2796 server.go:956] "Client rotation is on, will bootstrap in background"
Mar 3 13:35:37.052316 kubelet[2796]: I0303 13:35:37.052304 2796 certificate_store.go:147] "Loading cert/key pair from a file" filePath="/var/lib/kubelet/pki/kubelet-client-current.pem"
Mar 3 13:35:37.054839 kubelet[2796]: I0303 13:35:37.054827 2796 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
Mar 3 13:35:37.060093 kubelet[2796]: I0303 13:35:37.060076 2796 server.go:1423] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Mar 3 13:35:37.064575 kubelet[2796]: I0303 13:35:37.064514 2796 server.go:781] "--cgroups-per-qos enabled, but --cgroup-root was not specified. Defaulting to /"
Mar 3 13:35:37.064958 kubelet[2796]: I0303 13:35:37.064938 2796 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Mar 3 13:35:37.065124 kubelet[2796]: I0303 13:35:37.064993 2796 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4459-2-4-a-1c48930dd2","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Mar 3 13:35:37.065237 kubelet[2796]: I0303 13:35:37.065228 2796 topology_manager.go:138] "Creating topology manager with none policy"
Mar 3 13:35:37.065274 kubelet[2796]: I0303 13:35:37.065269 2796 container_manager_linux.go:306] "Creating device plugin manager"
Mar 3 13:35:37.065316 kubelet[2796]: I0303 13:35:37.065311 2796 container_manager_linux.go:315] "Creating Dynamic Resource Allocation (DRA) manager"
Mar 3 13:35:37.065496 kubelet[2796]: I0303 13:35:37.065488 2796 state_mem.go:36] "Initialized new in-memory state store"
Mar 3 13:35:37.065974 kubelet[2796]: I0303 13:35:37.065964 2796 kubelet.go:475] "Attempting to sync node with API server"
Mar 3 13:35:37.066565 kubelet[2796]: I0303 13:35:37.066548 2796 kubelet.go:376] "Adding static pod path" path="/etc/kubernetes/manifests"
Mar 3 13:35:37.066611 kubelet[2796]: I0303 13:35:37.066574 2796 kubelet.go:387] "Adding apiserver pod source"
Mar 3 13:35:37.066611 kubelet[2796]: I0303 13:35:37.066587 2796 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Mar 3 13:35:37.073905 kubelet[2796]: I0303 13:35:37.073814 2796 kuberuntime_manager.go:291] "Container runtime initialized" containerRuntime="containerd" version="v2.0.7" apiVersion="v1"
Mar 3 13:35:37.075685 kubelet[2796]: I0303 13:35:37.074435 2796 kubelet.go:940] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Mar 3 13:35:37.075685 kubelet[2796]: I0303 13:35:37.074454 2796 kubelet.go:964] "Not starting PodCertificateRequest manager because we are in static kubelet mode or the PodCertificateProjection feature gate is disabled"
Mar 3 13:35:37.075685 kubelet[2796]: I0303 13:35:37.075139 2796 apiserver.go:52] "Watching apiserver"
Mar 3 13:35:37.079154 kubelet[2796]: I0303 13:35:37.079138 2796 server.go:1262] "Started kubelet"
Mar 3 13:35:37.080697 kubelet[2796]: I0303 13:35:37.080582 2796 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Mar 3 13:35:37.082734 kubelet[2796]: I0303 13:35:37.082719 2796 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Mar 3 13:35:37.083464 kubelet[2796]: I0303 13:35:37.083452 2796 server.go:310] "Adding debug handlers to kubelet server"
Mar 3 13:35:37.086511 kubelet[2796]: I0303 13:35:37.086492 2796 ratelimit.go:56] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Mar 3 13:35:37.086581 kubelet[2796]: I0303 13:35:37.086573 2796 server_v1.go:49] "podresources" method="list" useActivePods=true
Mar 3 13:35:37.086749 kubelet[2796]: I0303 13:35:37.086740 2796 server.go:249] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Mar 3 13:35:37.086956 kubelet[2796]: I0303 13:35:37.086945 2796 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key"
Mar 3 13:35:37.088501 kubelet[2796]: I0303 13:35:37.088486 2796 volume_manager.go:313] "Starting Kubelet Volume Manager"
Mar 3 13:35:37.089470 kubelet[2796]: I0303 13:35:37.089456 2796 factory.go:223] Registration of the systemd container factory successfully
Mar 3 13:35:37.089636 kubelet[2796]: I0303 13:35:37.089624 2796 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory
Mar 3 13:35:37.089929 kubelet[2796]: E0303 13:35:37.089919 2796 kubelet.go:1615] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem"
Mar 3 13:35:37.092187 kubelet[2796]: I0303 13:35:37.091586 2796 desired_state_of_world_populator.go:146] "Desired state populator starts to run"
Mar 3 13:35:37.092187 kubelet[2796]: I0303 13:35:37.091722 2796 reconciler.go:29] "Reconciler: start to sync state"
Mar 3 13:35:37.093764 kubelet[2796]: I0303 13:35:37.093753 2796 factory.go:223] Registration of the containerd container factory successfully
Mar 3 13:35:37.094836 kubelet[2796]: I0303 13:35:37.094061 2796 kubelet_network_linux.go:54] "Initialized iptables rules." protocol="IPv4"
Mar 3 13:35:37.095433 kubelet[2796]: I0303 13:35:37.095370 2796 kubelet_network_linux.go:54] "Initialized iptables rules." protocol="IPv6"
Mar 3 13:35:37.095433 kubelet[2796]: I0303 13:35:37.095399 2796 status_manager.go:244] "Starting to sync pod status with apiserver"
Mar 3 13:35:37.095433 kubelet[2796]: I0303 13:35:37.095417 2796 kubelet.go:2428] "Starting kubelet main sync loop"
Mar 3 13:35:37.095501 kubelet[2796]: E0303 13:35:37.095463 2796 kubelet.go:2452] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Mar 3 13:35:37.139393 kubelet[2796]: I0303 13:35:37.139371 2796 cpu_manager.go:221] "Starting CPU manager" policy="none"
Mar 3 13:35:37.139551 kubelet[2796]: I0303 13:35:37.139542 2796 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s"
Mar 3 13:35:37.139592 kubelet[2796]: I0303 13:35:37.139586 2796 state_mem.go:36] "Initialized new in-memory state store"
Mar 3 13:35:37.139732 kubelet[2796]: I0303 13:35:37.139722 2796 state_mem.go:88] "Updated default CPUSet" cpuSet=""
Mar 3 13:35:37.139797 kubelet[2796]: I0303 13:35:37.139777 2796 state_mem.go:96] "Updated CPUSet assignments" assignments={}
Mar 3 13:35:37.139828 kubelet[2796]: I0303 13:35:37.139822 2796 policy_none.go:49] "None policy: Start"
Mar 3 13:35:37.139865 kubelet[2796]: I0303 13:35:37.139858 2796 memory_manager.go:187] "Starting memorymanager" policy="None"
Mar 3 13:35:37.139917 kubelet[2796]: I0303 13:35:37.139909 2796 state_mem.go:36] "Initializing new in-memory state store" logger="Memory Manager state checkpoint"
Mar 3 13:35:37.140046 kubelet[2796]: I0303 13:35:37.140036 2796 state_mem.go:77] "Updated machine memory state" logger="Memory Manager state checkpoint"
Mar 3 13:35:37.140100 kubelet[2796]: I0303 13:35:37.140092 2796 policy_none.go:47] "Start"
Mar 3 13:35:37.143944 kubelet[2796]: E0303 13:35:37.143931 2796 manager.go:513] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Mar 3 13:35:37.144460 kubelet[2796]: I0303 13:35:37.144293 2796 eviction_manager.go:189] "Eviction manager: starting control loop"
Mar 3 13:35:37.144460 kubelet[2796]: I0303 13:35:37.144304 2796 container_log_manager.go:146] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Mar 3 13:35:37.144750 kubelet[2796]: I0303 13:35:37.144564 2796 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Mar 3 13:35:37.148727 kubelet[2796]: E0303 13:35:37.148712 2796 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime"
Mar 3 13:35:37.196773 kubelet[2796]: I0303 13:35:37.196746 2796 kubelet.go:3220] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4459-2-4-a-1c48930dd2"
Mar 3 13:35:37.203566 kubelet[2796]: E0303 13:35:37.203502 2796 kubelet.go:3222] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4459-2-4-a-1c48930dd2\" already exists" pod="kube-system/kube-apiserver-ci-4459-2-4-a-1c48930dd2"
Mar 3 13:35:37.212696 kubelet[2796]: I0303 13:35:37.212553 2796 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-ci-4459-2-4-a-1c48930dd2" podStartSLOduration=2.212519328 podStartE2EDuration="2.212519328s" podCreationTimestamp="2026-03-03 13:35:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-03 13:35:37.212169533 +0000 UTC m=+0.193151418" watchObservedRunningTime="2026-03-03 13:35:37.212519328 +0000 UTC m=+0.193501212"
Mar 3 13:35:37.219593 kubelet[2796]: I0303 13:35:37.219549 2796 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ci-4459-2-4-a-1c48930dd2" podStartSLOduration=2.219541501 podStartE2EDuration="2.219541501s" podCreationTimestamp="2026-03-03 13:35:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-03 13:35:37.219278456 +0000 UTC m=+0.200260310" watchObservedRunningTime="2026-03-03 13:35:37.219541501 +0000 UTC m=+0.200523355"
Mar 3 13:35:37.228008 kubelet[2796]: I0303 13:35:37.227967 2796 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-ci-4459-2-4-a-1c48930dd2" podStartSLOduration=2.227954639 podStartE2EDuration="2.227954639s" podCreationTimestamp="2026-03-03 13:35:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-03 13:35:37.227279812 +0000 UTC m=+0.208261676" watchObservedRunningTime="2026-03-03 13:35:37.227954639 +0000 UTC m=+0.208936493"
Mar 3 13:35:37.251748 kubelet[2796]: I0303 13:35:37.251711 2796 kubelet_node_status.go:75] "Attempting to register node" node="ci-4459-2-4-a-1c48930dd2"
Mar 3 13:35:37.260561 kubelet[2796]: I0303 13:35:37.260528 2796 kubelet_node_status.go:124] "Node was previously registered" node="ci-4459-2-4-a-1c48930dd2"
Mar 3 13:35:37.260830 kubelet[2796]: I0303 13:35:37.260604 2796 kubelet_node_status.go:78] "Successfully registered node" node="ci-4459-2-4-a-1c48930dd2"
Mar 3 13:35:37.292582 kubelet[2796]: I0303 13:35:37.292522 2796 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world"
Mar 3 13:35:37.393517 kubelet[2796]: I0303 13:35:37.393468 2796 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/c856a9a8037c97609a00d34ac066f722-ca-certs\") pod \"kube-controller-manager-ci-4459-2-4-a-1c48930dd2\" (UID: \"c856a9a8037c97609a00d34ac066f722\") " pod="kube-system/kube-controller-manager-ci-4459-2-4-a-1c48930dd2"
Mar 3 13:35:37.393517 kubelet[2796]: I0303 13:35:37.393518 2796 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/b55be887bce29f8749013c7dd6d0b968-kubeconfig\") pod \"kube-scheduler-ci-4459-2-4-a-1c48930dd2\" (UID: \"b55be887bce29f8749013c7dd6d0b968\") " pod="kube-system/kube-scheduler-ci-4459-2-4-a-1c48930dd2"
Mar 3 13:35:37.393851 kubelet[2796]: I0303 13:35:37.393548 2796 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/c935c4a8a9f6f7d0099370a1fb38915e-k8s-certs\") pod \"kube-apiserver-ci-4459-2-4-a-1c48930dd2\" (UID: \"c935c4a8a9f6f7d0099370a1fb38915e\") " pod="kube-system/kube-apiserver-ci-4459-2-4-a-1c48930dd2"
Mar 3 13:35:37.393851 kubelet[2796]: I0303 13:35:37.393572 2796 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/c856a9a8037c97609a00d34ac066f722-flexvolume-dir\") pod \"kube-controller-manager-ci-4459-2-4-a-1c48930dd2\" (UID: \"c856a9a8037c97609a00d34ac066f722\") " pod="kube-system/kube-controller-manager-ci-4459-2-4-a-1c48930dd2"
Mar 3 13:35:37.393851 kubelet[2796]: I0303 13:35:37.393597 2796 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/c856a9a8037c97609a00d34ac066f722-k8s-certs\") pod \"kube-controller-manager-ci-4459-2-4-a-1c48930dd2\" (UID: \"c856a9a8037c97609a00d34ac066f722\") " pod="kube-system/kube-controller-manager-ci-4459-2-4-a-1c48930dd2"
Mar 3 13:35:37.393851 kubelet[2796]: I0303 13:35:37.393621 2796 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/c856a9a8037c97609a00d34ac066f722-kubeconfig\") pod \"kube-controller-manager-ci-4459-2-4-a-1c48930dd2\" (UID: \"c856a9a8037c97609a00d34ac066f722\") " pod="kube-system/kube-controller-manager-ci-4459-2-4-a-1c48930dd2"
Mar 3 13:35:37.393851 kubelet[2796]: I0303 13:35:37.393645 2796 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/c856a9a8037c97609a00d34ac066f722-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4459-2-4-a-1c48930dd2\" (UID: \"c856a9a8037c97609a00d34ac066f722\") " pod="kube-system/kube-controller-manager-ci-4459-2-4-a-1c48930dd2"
Mar 3 13:35:37.394099 kubelet[2796]: I0303 13:35:37.393703 2796 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/c935c4a8a9f6f7d0099370a1fb38915e-ca-certs\") pod \"kube-apiserver-ci-4459-2-4-a-1c48930dd2\" (UID: \"c935c4a8a9f6f7d0099370a1fb38915e\") " pod="kube-system/kube-apiserver-ci-4459-2-4-a-1c48930dd2"
Mar 3 13:35:37.394099 kubelet[2796]: I0303 13:35:37.393726 2796 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/c935c4a8a9f6f7d0099370a1fb38915e-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4459-2-4-a-1c48930dd2\" (UID: \"c935c4a8a9f6f7d0099370a1fb38915e\") " pod="kube-system/kube-apiserver-ci-4459-2-4-a-1c48930dd2"
Mar 3 13:35:43.843309 kubelet[2796]: I0303 13:35:43.843240 2796 kuberuntime_manager.go:1828] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24"
Mar 3 13:35:43.843808 containerd[1639]: time="2026-03-03T13:35:43.843765185Z" level=info msg="No cni config template is specified, wait for other system components to drop the config."
Mar 3 13:35:43.844192 kubelet[2796]: I0303 13:35:43.844150 2796 kubelet_network.go:47] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24"
Mar 3 13:35:44.700901 systemd[1]: Created slice kubepods-besteffort-pod0ff02e58_86ac_446b_9eae_f164244e9881.slice - libcontainer container kubepods-besteffort-pod0ff02e58_86ac_446b_9eae_f164244e9881.slice.
Mar 3 13:35:44.838238 kubelet[2796]: I0303 13:35:44.838142 2796 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/0ff02e58-86ac-446b-9eae-f164244e9881-kube-proxy\") pod \"kube-proxy-5k6nw\" (UID: \"0ff02e58-86ac-446b-9eae-f164244e9881\") " pod="kube-system/kube-proxy-5k6nw"
Mar 3 13:35:44.838238 kubelet[2796]: I0303 13:35:44.838207 2796 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/0ff02e58-86ac-446b-9eae-f164244e9881-xtables-lock\") pod \"kube-proxy-5k6nw\" (UID: \"0ff02e58-86ac-446b-9eae-f164244e9881\") " pod="kube-system/kube-proxy-5k6nw"
Mar 3 13:35:44.838238 kubelet[2796]: I0303 13:35:44.838239 2796 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0ff02e58-86ac-446b-9eae-f164244e9881-lib-modules\") pod \"kube-proxy-5k6nw\" (UID: \"0ff02e58-86ac-446b-9eae-f164244e9881\") " pod="kube-system/kube-proxy-5k6nw"
Mar 3 13:35:44.838502 kubelet[2796]: I0303 13:35:44.838268 2796 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5dvv\" (UniqueName: \"kubernetes.io/projected/0ff02e58-86ac-446b-9eae-f164244e9881-kube-api-access-t5dvv\") pod \"kube-proxy-5k6nw\" (UID: \"0ff02e58-86ac-446b-9eae-f164244e9881\") " pod="kube-system/kube-proxy-5k6nw"
Mar 3 13:35:45.016478 containerd[1639]: time="2026-03-03T13:35:45.015726608Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-5k6nw,Uid:0ff02e58-86ac-446b-9eae-f164244e9881,Namespace:kube-system,Attempt:0,}"
Mar 3 13:35:45.052433 containerd[1639]: time="2026-03-03T13:35:45.052359245Z" level=info msg="connecting to shim d737de52aae478569f8fe089e375271233f4fac1633148775dfd3d71a5c77383" address="unix:///run/containerd/s/b9a84e43667cf7d26a301548fc17921049fb43fe0aeedbb2eca9e3a9e86187e3" namespace=k8s.io protocol=ttrpc version=3
Mar 3 13:35:45.082778 systemd[1]: Started cri-containerd-d737de52aae478569f8fe089e375271233f4fac1633148775dfd3d71a5c77383.scope - libcontainer container d737de52aae478569f8fe089e375271233f4fac1633148775dfd3d71a5c77383.
Mar 3 13:35:45.088165 systemd[1]: Created slice kubepods-besteffort-pod7cde15be_08d4_48ac_bacc_52cbb233fccb.slice - libcontainer container kubepods-besteffort-pod7cde15be_08d4_48ac_bacc_52cbb233fccb.slice.
Mar 3 13:35:45.110951 containerd[1639]: time="2026-03-03T13:35:45.110888774Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-5k6nw,Uid:0ff02e58-86ac-446b-9eae-f164244e9881,Namespace:kube-system,Attempt:0,} returns sandbox id \"d737de52aae478569f8fe089e375271233f4fac1633148775dfd3d71a5c77383\""
Mar 3 13:35:45.116086 containerd[1639]: time="2026-03-03T13:35:45.116066894Z" level=info msg="CreateContainer within sandbox \"d737de52aae478569f8fe089e375271233f4fac1633148775dfd3d71a5c77383\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}"
Mar 3 13:35:45.127769 containerd[1639]: time="2026-03-03T13:35:45.127729656Z" level=info msg="Container 7fb8037c0d049cae440d7bb9c7e9983f5a8fe7fa87b42be50677d79b282257d7: CDI devices from CRI Config.CDIDevices: []"
Mar 3 13:35:45.130194 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1951477058.mount: Deactivated successfully.
Mar 3 13:35:45.133598 containerd[1639]: time="2026-03-03T13:35:45.133567243Z" level=info msg="CreateContainer within sandbox \"d737de52aae478569f8fe089e375271233f4fac1633148775dfd3d71a5c77383\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"7fb8037c0d049cae440d7bb9c7e9983f5a8fe7fa87b42be50677d79b282257d7\""
Mar 3 13:35:45.134123 containerd[1639]: time="2026-03-03T13:35:45.134104447Z" level=info msg="StartContainer for \"7fb8037c0d049cae440d7bb9c7e9983f5a8fe7fa87b42be50677d79b282257d7\""
Mar 3 13:35:45.135085 containerd[1639]: time="2026-03-03T13:35:45.135040880Z" level=info msg="connecting to shim 7fb8037c0d049cae440d7bb9c7e9983f5a8fe7fa87b42be50677d79b282257d7" address="unix:///run/containerd/s/b9a84e43667cf7d26a301548fc17921049fb43fe0aeedbb2eca9e3a9e86187e3" protocol=ttrpc version=3
Mar 3 13:35:45.156768 systemd[1]: Started cri-containerd-7fb8037c0d049cae440d7bb9c7e9983f5a8fe7fa87b42be50677d79b282257d7.scope - libcontainer container 7fb8037c0d049cae440d7bb9c7e9983f5a8fe7fa87b42be50677d79b282257d7.
Mar 3 13:35:45.230804 containerd[1639]: time="2026-03-03T13:35:45.230767215Z" level=info msg="StartContainer for \"7fb8037c0d049cae440d7bb9c7e9983f5a8fe7fa87b42be50677d79b282257d7\" returns successfully" Mar 3 13:35:45.240153 kubelet[2796]: I0303 13:35:45.240113 2796 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwkdf\" (UniqueName: \"kubernetes.io/projected/7cde15be-08d4-48ac-bacc-52cbb233fccb-kube-api-access-xwkdf\") pod \"tigera-operator-5588576f44-cdlhx\" (UID: \"7cde15be-08d4-48ac-bacc-52cbb233fccb\") " pod="tigera-operator/tigera-operator-5588576f44-cdlhx" Mar 3 13:35:45.240153 kubelet[2796]: I0303 13:35:45.240140 2796 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/7cde15be-08d4-48ac-bacc-52cbb233fccb-var-lib-calico\") pod \"tigera-operator-5588576f44-cdlhx\" (UID: \"7cde15be-08d4-48ac-bacc-52cbb233fccb\") " pod="tigera-operator/tigera-operator-5588576f44-cdlhx" Mar 3 13:35:45.394964 containerd[1639]: time="2026-03-03T13:35:45.394864060Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-5588576f44-cdlhx,Uid:7cde15be-08d4-48ac-bacc-52cbb233fccb,Namespace:tigera-operator,Attempt:0,}" Mar 3 13:35:45.430352 containerd[1639]: time="2026-03-03T13:35:45.429609667Z" level=info msg="connecting to shim bddd77a6814877316ecf8b704a1d5101a2d4b862e2c87b9918d978594860a419" address="unix:///run/containerd/s/25855f899f239c4c17e2de247b27cea654382418c9363eca74f6409bd5e50b23" namespace=k8s.io protocol=ttrpc version=3 Mar 3 13:35:45.464444 systemd[1]: Started cri-containerd-bddd77a6814877316ecf8b704a1d5101a2d4b862e2c87b9918d978594860a419.scope - libcontainer container bddd77a6814877316ecf8b704a1d5101a2d4b862e2c87b9918d978594860a419. 
Mar 3 13:35:45.505012 containerd[1639]: time="2026-03-03T13:35:45.504970693Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-5588576f44-cdlhx,Uid:7cde15be-08d4-48ac-bacc-52cbb233fccb,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"bddd77a6814877316ecf8b704a1d5101a2d4b862e2c87b9918d978594860a419\"" Mar 3 13:35:45.506556 containerd[1639]: time="2026-03-03T13:35:45.506396281Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.40.7\"" Mar 3 13:35:46.167254 kubelet[2796]: I0303 13:35:46.166430 2796 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-5k6nw" podStartSLOduration=2.16640797 podStartE2EDuration="2.16640797s" podCreationTimestamp="2026-03-03 13:35:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-03 13:35:46.166196333 +0000 UTC m=+9.147178227" watchObservedRunningTime="2026-03-03 13:35:46.16640797 +0000 UTC m=+9.147389864" Mar 3 13:35:46.983656 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1155602825.mount: Deactivated successfully. 
Mar 3 13:35:47.529605 containerd[1639]: time="2026-03-03T13:35:47.529559819Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.40.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 3 13:35:47.530749 containerd[1639]: time="2026-03-03T13:35:47.530558670Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.40.7: active requests=0, bytes read=40846156" Mar 3 13:35:47.531556 containerd[1639]: time="2026-03-03T13:35:47.531534754Z" level=info msg="ImageCreate event name:\"sha256:de04da31b5feb10fd313c39b7ac72d47ce9b5b8eb06161142e2e2283059a52c2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 3 13:35:47.533236 containerd[1639]: time="2026-03-03T13:35:47.533215197Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:53260704fc6e638633b243729411222e01e1898647352a6e1a09cc046887973a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 3 13:35:47.533681 containerd[1639]: time="2026-03-03T13:35:47.533648314Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.40.7\" with image id \"sha256:de04da31b5feb10fd313c39b7ac72d47ce9b5b8eb06161142e2e2283059a52c2\", repo tag \"quay.io/tigera/operator:v1.40.7\", repo digest \"quay.io/tigera/operator@sha256:53260704fc6e638633b243729411222e01e1898647352a6e1a09cc046887973a\", size \"40842151\" in 2.027230577s" Mar 3 13:35:47.533747 containerd[1639]: time="2026-03-03T13:35:47.533735747Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.40.7\" returns image reference \"sha256:de04da31b5feb10fd313c39b7ac72d47ce9b5b8eb06161142e2e2283059a52c2\"" Mar 3 13:35:47.536533 containerd[1639]: time="2026-03-03T13:35:47.536511409Z" level=info msg="CreateContainer within sandbox \"bddd77a6814877316ecf8b704a1d5101a2d4b862e2c87b9918d978594860a419\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" Mar 3 13:35:47.545855 containerd[1639]: time="2026-03-03T13:35:47.542316948Z" level=info msg="Container 
44e36430d6fa05c64b2e269cc16823a3cc5535921b75e22cf2f67a4a3feb29c1: CDI devices from CRI Config.CDIDevices: []" Mar 3 13:35:47.561204 containerd[1639]: time="2026-03-03T13:35:47.561160653Z" level=info msg="CreateContainer within sandbox \"bddd77a6814877316ecf8b704a1d5101a2d4b862e2c87b9918d978594860a419\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"44e36430d6fa05c64b2e269cc16823a3cc5535921b75e22cf2f67a4a3feb29c1\"" Mar 3 13:35:47.561658 containerd[1639]: time="2026-03-03T13:35:47.561614254Z" level=info msg="StartContainer for \"44e36430d6fa05c64b2e269cc16823a3cc5535921b75e22cf2f67a4a3feb29c1\"" Mar 3 13:35:47.565307 containerd[1639]: time="2026-03-03T13:35:47.565288182Z" level=info msg="connecting to shim 44e36430d6fa05c64b2e269cc16823a3cc5535921b75e22cf2f67a4a3feb29c1" address="unix:///run/containerd/s/25855f899f239c4c17e2de247b27cea654382418c9363eca74f6409bd5e50b23" protocol=ttrpc version=3 Mar 3 13:35:47.585807 systemd[1]: Started cri-containerd-44e36430d6fa05c64b2e269cc16823a3cc5535921b75e22cf2f67a4a3feb29c1.scope - libcontainer container 44e36430d6fa05c64b2e269cc16823a3cc5535921b75e22cf2f67a4a3feb29c1. Mar 3 13:35:47.613747 containerd[1639]: time="2026-03-03T13:35:47.613712573Z" level=info msg="StartContainer for \"44e36430d6fa05c64b2e269cc16823a3cc5535921b75e22cf2f67a4a3feb29c1\" returns successfully" Mar 3 13:35:49.571248 update_engine[1613]: I20260303 13:35:49.570699 1613 update_attempter.cc:509] Updating boot flags... 
Mar 3 13:35:49.879884 kubelet[2796]: I0303 13:35:49.879723 2796 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-5588576f44-cdlhx" podStartSLOduration=2.851306844 podStartE2EDuration="4.87970266s" podCreationTimestamp="2026-03-03 13:35:45 +0000 UTC" firstStartedPulling="2026-03-03 13:35:45.50603705 +0000 UTC m=+8.487018904" lastFinishedPulling="2026-03-03 13:35:47.534432866 +0000 UTC m=+10.515414720" observedRunningTime="2026-03-03 13:35:48.177455294 +0000 UTC m=+11.158437188" watchObservedRunningTime="2026-03-03 13:35:49.87970266 +0000 UTC m=+12.860684514" Mar 3 13:35:52.867722 sudo[1860]: pam_unix(sudo:session): session closed for user root Mar 3 13:35:52.987794 sshd[1859]: Connection closed by 20.161.92.111 port 38830 Mar 3 13:35:52.988963 sshd-session[1856]: pam_unix(sshd:session): session closed for user core Mar 3 13:35:52.994038 systemd[1]: sshd@6-89.167.115.149:22-20.161.92.111:38830.service: Deactivated successfully. Mar 3 13:35:52.999298 systemd[1]: session-7.scope: Deactivated successfully. Mar 3 13:35:53.000713 systemd[1]: session-7.scope: Consumed 4.244s CPU time, 233.2M memory peak. Mar 3 13:35:53.003741 systemd-logind[1607]: Session 7 logged out. Waiting for processes to exit. Mar 3 13:35:53.009178 systemd-logind[1607]: Removed session 7. Mar 3 13:35:54.668977 systemd[1]: Created slice kubepods-besteffort-poda0e4975a_39b4_4603_b656_02aca61e9353.slice - libcontainer container kubepods-besteffort-poda0e4975a_39b4_4603_b656_02aca61e9353.slice. 
Mar 3 13:35:54.696306 kubelet[2796]: I0303 13:35:54.696265 2796 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/a0e4975a-39b4-4603-b656-02aca61e9353-typha-certs\") pod \"calico-typha-797b48589c-m8sml\" (UID: \"a0e4975a-39b4-4603-b656-02aca61e9353\") " pod="calico-system/calico-typha-797b48589c-m8sml" Mar 3 13:35:54.697740 kubelet[2796]: I0303 13:35:54.696318 2796 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a0e4975a-39b4-4603-b656-02aca61e9353-tigera-ca-bundle\") pod \"calico-typha-797b48589c-m8sml\" (UID: \"a0e4975a-39b4-4603-b656-02aca61e9353\") " pod="calico-system/calico-typha-797b48589c-m8sml" Mar 3 13:35:54.697740 kubelet[2796]: I0303 13:35:54.696335 2796 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qql4q\" (UniqueName: \"kubernetes.io/projected/a0e4975a-39b4-4603-b656-02aca61e9353-kube-api-access-qql4q\") pod \"calico-typha-797b48589c-m8sml\" (UID: \"a0e4975a-39b4-4603-b656-02aca61e9353\") " pod="calico-system/calico-typha-797b48589c-m8sml" Mar 3 13:35:54.793403 systemd[1]: Created slice kubepods-besteffort-podc328a717_2e18_4ba4_afe4_8aef6b284d56.slice - libcontainer container kubepods-besteffort-podc328a717_2e18_4ba4_afe4_8aef6b284d56.slice. 
Mar 3 13:35:54.796447 kubelet[2796]: I0303 13:35:54.796417 2796 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bpffs\" (UniqueName: \"kubernetes.io/host-path/c328a717-2e18-4ba4-afe4-8aef6b284d56-bpffs\") pod \"calico-node-9nfsc\" (UID: \"c328a717-2e18-4ba4-afe4-8aef6b284d56\") " pod="calico-system/calico-node-9nfsc" Mar 3 13:35:54.796447 kubelet[2796]: I0303 13:35:54.796446 2796 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/c328a717-2e18-4ba4-afe4-8aef6b284d56-cni-bin-dir\") pod \"calico-node-9nfsc\" (UID: \"c328a717-2e18-4ba4-afe4-8aef6b284d56\") " pod="calico-system/calico-node-9nfsc" Mar 3 13:35:54.799042 kubelet[2796]: I0303 13:35:54.796458 2796 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/c328a717-2e18-4ba4-afe4-8aef6b284d56-flexvol-driver-host\") pod \"calico-node-9nfsc\" (UID: \"c328a717-2e18-4ba4-afe4-8aef6b284d56\") " pod="calico-system/calico-node-9nfsc" Mar 3 13:35:54.799042 kubelet[2796]: I0303 13:35:54.796470 2796 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/c328a717-2e18-4ba4-afe4-8aef6b284d56-var-lib-calico\") pod \"calico-node-9nfsc\" (UID: \"c328a717-2e18-4ba4-afe4-8aef6b284d56\") " pod="calico-system/calico-node-9nfsc" Mar 3 13:35:54.799042 kubelet[2796]: I0303 13:35:54.796481 2796 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nmcbj\" (UniqueName: \"kubernetes.io/projected/c328a717-2e18-4ba4-afe4-8aef6b284d56-kube-api-access-nmcbj\") pod \"calico-node-9nfsc\" (UID: \"c328a717-2e18-4ba4-afe4-8aef6b284d56\") " pod="calico-system/calico-node-9nfsc" Mar 3 13:35:54.799042 kubelet[2796]: I0303 13:35:54.796491 2796 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/c328a717-2e18-4ba4-afe4-8aef6b284d56-sys-fs\") pod \"calico-node-9nfsc\" (UID: \"c328a717-2e18-4ba4-afe4-8aef6b284d56\") " pod="calico-system/calico-node-9nfsc" Mar 3 13:35:54.799042 kubelet[2796]: I0303 13:35:54.796512 2796 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/c328a717-2e18-4ba4-afe4-8aef6b284d56-cni-net-dir\") pod \"calico-node-9nfsc\" (UID: \"c328a717-2e18-4ba4-afe4-8aef6b284d56\") " pod="calico-system/calico-node-9nfsc" Mar 3 13:35:54.799169 kubelet[2796]: I0303 13:35:54.796523 2796 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/c328a717-2e18-4ba4-afe4-8aef6b284d56-policysync\") pod \"calico-node-9nfsc\" (UID: \"c328a717-2e18-4ba4-afe4-8aef6b284d56\") " pod="calico-system/calico-node-9nfsc" Mar 3 13:35:54.799169 kubelet[2796]: I0303 13:35:54.796534 2796 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/c328a717-2e18-4ba4-afe4-8aef6b284d56-cni-log-dir\") pod \"calico-node-9nfsc\" (UID: \"c328a717-2e18-4ba4-afe4-8aef6b284d56\") " pod="calico-system/calico-node-9nfsc" Mar 3 13:35:54.799169 kubelet[2796]: I0303 13:35:54.796544 2796 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c328a717-2e18-4ba4-afe4-8aef6b284d56-tigera-ca-bundle\") pod \"calico-node-9nfsc\" (UID: \"c328a717-2e18-4ba4-afe4-8aef6b284d56\") " pod="calico-system/calico-node-9nfsc" Mar 3 13:35:54.799169 kubelet[2796]: I0303 13:35:54.796555 2796 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/c328a717-2e18-4ba4-afe4-8aef6b284d56-var-run-calico\") pod \"calico-node-9nfsc\" (UID: \"c328a717-2e18-4ba4-afe4-8aef6b284d56\") " pod="calico-system/calico-node-9nfsc" Mar 3 13:35:54.799169 kubelet[2796]: I0303 13:35:54.796571 2796 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/c328a717-2e18-4ba4-afe4-8aef6b284d56-node-certs\") pod \"calico-node-9nfsc\" (UID: \"c328a717-2e18-4ba4-afe4-8aef6b284d56\") " pod="calico-system/calico-node-9nfsc" Mar 3 13:35:54.799250 kubelet[2796]: I0303 13:35:54.796580 2796 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/c328a717-2e18-4ba4-afe4-8aef6b284d56-xtables-lock\") pod \"calico-node-9nfsc\" (UID: \"c328a717-2e18-4ba4-afe4-8aef6b284d56\") " pod="calico-system/calico-node-9nfsc" Mar 3 13:35:54.799250 kubelet[2796]: I0303 13:35:54.796590 2796 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c328a717-2e18-4ba4-afe4-8aef6b284d56-lib-modules\") pod \"calico-node-9nfsc\" (UID: \"c328a717-2e18-4ba4-afe4-8aef6b284d56\") " pod="calico-system/calico-node-9nfsc" Mar 3 13:35:54.799250 kubelet[2796]: I0303 13:35:54.796601 2796 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nodeproc\" (UniqueName: \"kubernetes.io/host-path/c328a717-2e18-4ba4-afe4-8aef6b284d56-nodeproc\") pod \"calico-node-9nfsc\" (UID: \"c328a717-2e18-4ba4-afe4-8aef6b284d56\") " pod="calico-system/calico-node-9nfsc" Mar 3 13:35:54.898974 kubelet[2796]: E0303 13:35:54.898906 2796 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 3 13:35:54.898974 kubelet[2796]: W0303 13:35:54.898935 2796 driver-call.go:149] 
FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 3 13:35:54.898974 kubelet[2796]: E0303 13:35:54.898961 2796 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 3 13:35:54.899520 kubelet[2796]: E0303 13:35:54.899456 2796 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-dlz7j" podUID="01ac93a6-8099-4557-bc77-3925251db8d8" Mar 3 13:35:54.900726 kubelet[2796]: E0303 13:35:54.900701 2796 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 3 13:35:54.900726 kubelet[2796]: W0303 13:35:54.900722 2796 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 3 13:35:54.900800 kubelet[2796]: E0303 13:35:54.900735 2796 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input"
Error: unexpected end of JSON input" Mar 3 13:35:54.923241 kubelet[2796]: E0303 13:35:54.922831 2796 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 3 13:35:54.923241 kubelet[2796]: W0303 13:35:54.922843 2796 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 3 13:35:54.923241 kubelet[2796]: E0303 13:35:54.922852 2796 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 3 13:35:54.975965 containerd[1639]: time="2026-03-03T13:35:54.975901045Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-797b48589c-m8sml,Uid:a0e4975a-39b4-4603-b656-02aca61e9353,Namespace:calico-system,Attempt:0,}" Mar 3 13:35:54.995532 kubelet[2796]: E0303 13:35:54.994425 2796 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 3 13:35:54.995532 kubelet[2796]: W0303 13:35:54.995258 2796 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 3 13:35:54.995532 kubelet[2796]: E0303 13:35:54.995282 2796 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 3 13:35:54.995532 kubelet[2796]: E0303 13:35:54.995468 2796 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 3 13:35:54.995532 kubelet[2796]: W0303 13:35:54.995477 2796 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 3 13:35:54.995532 kubelet[2796]: E0303 13:35:54.995486 2796 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 3 13:35:54.996096 kubelet[2796]: E0303 13:35:54.996087 2796 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 3 13:35:54.996290 kubelet[2796]: W0303 13:35:54.996152 2796 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 3 13:35:54.996290 kubelet[2796]: E0303 13:35:54.996161 2796 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 3 13:35:54.997441 kubelet[2796]: E0303 13:35:54.997380 2796 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 3 13:35:54.997441 kubelet[2796]: W0303 13:35:54.997391 2796 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 3 13:35:54.997441 kubelet[2796]: E0303 13:35:54.997398 2796 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 3 13:35:54.997725 kubelet[2796]: E0303 13:35:54.997717 2796 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 3 13:35:54.997802 kubelet[2796]: W0303 13:35:54.997763 2796 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 3 13:35:54.997802 kubelet[2796]: E0303 13:35:54.997771 2796 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 3 13:35:54.998105 kubelet[2796]: E0303 13:35:54.998096 2796 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 3 13:35:54.998207 kubelet[2796]: W0303 13:35:54.998144 2796 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 3 13:35:54.998207 kubelet[2796]: E0303 13:35:54.998152 2796 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 3 13:35:54.998248 containerd[1639]: time="2026-03-03T13:35:54.998188240Z" level=info msg="connecting to shim f741f2d05db477bf090e8f3364b776108839c0b46bf257e5a7bc2fa003a924e8" address="unix:///run/containerd/s/f342d09da3a5f335d812a3ab9c9dc35766fd0c7a49af090e34ac44ffc2330581" namespace=k8s.io protocol=ttrpc version=3 Mar 3 13:35:54.998426 kubelet[2796]: E0303 13:35:54.998383 2796 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 3 13:35:54.998426 kubelet[2796]: W0303 13:35:54.998390 2796 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 3 13:35:54.998426 kubelet[2796]: E0303 13:35:54.998396 2796 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 3 13:35:54.998919 kubelet[2796]: E0303 13:35:54.998875 2796 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 3 13:35:54.998919 kubelet[2796]: W0303 13:35:54.998884 2796 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 3 13:35:54.998919 kubelet[2796]: E0303 13:35:54.998891 2796 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 3 13:35:54.999228 kubelet[2796]: E0303 13:35:54.999204 2796 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 3 13:35:54.999228 kubelet[2796]: W0303 13:35:54.999212 2796 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 3 13:35:54.999343 kubelet[2796]: E0303 13:35:54.999284 2796 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 3 13:35:54.999539 kubelet[2796]: E0303 13:35:54.999461 2796 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 3 13:35:54.999539 kubelet[2796]: W0303 13:35:54.999468 2796 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 3 13:35:54.999539 kubelet[2796]: E0303 13:35:54.999476 2796 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 3 13:35:54.999720 kubelet[2796]: E0303 13:35:54.999660 2796 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 3 13:35:54.999805 kubelet[2796]: W0303 13:35:54.999747 2796 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 3 13:35:54.999805 kubelet[2796]: E0303 13:35:54.999756 2796 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 3 13:35:55.000102 kubelet[2796]: E0303 13:35:55.000094 2796 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 3 13:35:55.000237 kubelet[2796]: W0303 13:35:55.000148 2796 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 3 13:35:55.000237 kubelet[2796]: E0303 13:35:55.000156 2796 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 3 13:35:55.000548 kubelet[2796]: E0303 13:35:55.000495 2796 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 3 13:35:55.000548 kubelet[2796]: W0303 13:35:55.000504 2796 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 3 13:35:55.000548 kubelet[2796]: E0303 13:35:55.000511 2796 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 3 13:35:55.000889 kubelet[2796]: E0303 13:35:55.000811 2796 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 3 13:35:55.000889 kubelet[2796]: W0303 13:35:55.000819 2796 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 3 13:35:55.000889 kubelet[2796]: E0303 13:35:55.000825 2796 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 3 13:35:55.001120 kubelet[2796]: E0303 13:35:55.001111 2796 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 3 13:35:55.001299 kubelet[2796]: W0303 13:35:55.001166 2796 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 3 13:35:55.001299 kubelet[2796]: E0303 13:35:55.001175 2796 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 3 13:35:55.001404 kubelet[2796]: E0303 13:35:55.001396 2796 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 3 13:35:55.001497 kubelet[2796]: W0303 13:35:55.001473 2796 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 3 13:35:55.001497 kubelet[2796]: E0303 13:35:55.001485 2796 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 3 13:35:55.002089 kubelet[2796]: E0303 13:35:55.001826 2796 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 3 13:35:55.002089 kubelet[2796]: W0303 13:35:55.001834 2796 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 3 13:35:55.002089 kubelet[2796]: E0303 13:35:55.001841 2796 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 3 13:35:55.002269 kubelet[2796]: E0303 13:35:55.002256 2796 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 3 13:35:55.002269 kubelet[2796]: W0303 13:35:55.002266 2796 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 3 13:35:55.002317 kubelet[2796]: E0303 13:35:55.002273 2796 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 3 13:35:55.002500 kubelet[2796]: E0303 13:35:55.002482 2796 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 3 13:35:55.002656 kubelet[2796]: W0303 13:35:55.002546 2796 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 3 13:35:55.002656 kubelet[2796]: E0303 13:35:55.002580 2796 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 3 13:35:55.002965 kubelet[2796]: E0303 13:35:55.002916 2796 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 3 13:35:55.002965 kubelet[2796]: W0303 13:35:55.002927 2796 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 3 13:35:55.002965 kubelet[2796]: E0303 13:35:55.002935 2796 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 3 13:35:55.003295 kubelet[2796]: E0303 13:35:55.003277 2796 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 3 13:35:55.003326 kubelet[2796]: W0303 13:35:55.003308 2796 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 3 13:35:55.003326 kubelet[2796]: E0303 13:35:55.003316 2796 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 3 13:35:55.003363 kubelet[2796]: I0303 13:35:55.003339 2796 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/01ac93a6-8099-4557-bc77-3925251db8d8-registration-dir\") pod \"csi-node-driver-dlz7j\" (UID: \"01ac93a6-8099-4557-bc77-3925251db8d8\") " pod="calico-system/csi-node-driver-dlz7j" Mar 3 13:35:55.003715 kubelet[2796]: E0303 13:35:55.003694 2796 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 3 13:35:55.003715 kubelet[2796]: W0303 13:35:55.003710 2796 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 3 13:35:55.003765 kubelet[2796]: E0303 13:35:55.003718 2796 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 3 13:35:55.003765 kubelet[2796]: I0303 13:35:55.003738 2796 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lblls\" (UniqueName: \"kubernetes.io/projected/01ac93a6-8099-4557-bc77-3925251db8d8-kube-api-access-lblls\") pod \"csi-node-driver-dlz7j\" (UID: \"01ac93a6-8099-4557-bc77-3925251db8d8\") " pod="calico-system/csi-node-driver-dlz7j" Mar 3 13:35:55.004176 kubelet[2796]: E0303 13:35:55.004095 2796 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 3 13:35:55.004176 kubelet[2796]: W0303 13:35:55.004107 2796 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 3 13:35:55.004176 kubelet[2796]: E0303 13:35:55.004114 2796 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 3 13:35:55.004176 kubelet[2796]: I0303 13:35:55.004132 2796 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/01ac93a6-8099-4557-bc77-3925251db8d8-socket-dir\") pod \"csi-node-driver-dlz7j\" (UID: \"01ac93a6-8099-4557-bc77-3925251db8d8\") " pod="calico-system/csi-node-driver-dlz7j" Mar 3 13:35:55.004619 kubelet[2796]: E0303 13:35:55.004450 2796 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 3 13:35:55.004619 kubelet[2796]: W0303 13:35:55.004461 2796 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 3 13:35:55.004619 kubelet[2796]: E0303 13:35:55.004468 2796 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 3 13:35:55.004619 kubelet[2796]: I0303 13:35:55.004482 2796 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/01ac93a6-8099-4557-bc77-3925251db8d8-kubelet-dir\") pod \"csi-node-driver-dlz7j\" (UID: \"01ac93a6-8099-4557-bc77-3925251db8d8\") " pod="calico-system/csi-node-driver-dlz7j" Mar 3 13:35:55.004788 kubelet[2796]: E0303 13:35:55.004768 2796 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 3 13:35:55.004788 kubelet[2796]: W0303 13:35:55.004781 2796 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 3 13:35:55.004900 kubelet[2796]: E0303 13:35:55.004788 2796 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 3 13:35:55.004999 kubelet[2796]: I0303 13:35:55.004977 2796 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/01ac93a6-8099-4557-bc77-3925251db8d8-varrun\") pod \"csi-node-driver-dlz7j\" (UID: \"01ac93a6-8099-4557-bc77-3925251db8d8\") " pod="calico-system/csi-node-driver-dlz7j" Mar 3 13:35:55.005263 kubelet[2796]: E0303 13:35:55.005242 2796 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 3 13:35:55.005263 kubelet[2796]: W0303 13:35:55.005259 2796 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 3 13:35:55.005309 kubelet[2796]: E0303 13:35:55.005268 2796 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 3 13:35:55.005594 kubelet[2796]: E0303 13:35:55.005573 2796 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 3 13:35:55.005594 kubelet[2796]: W0303 13:35:55.005585 2796 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 3 13:35:55.005594 kubelet[2796]: E0303 13:35:55.005592 2796 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 3 13:35:55.005914 kubelet[2796]: E0303 13:35:55.005894 2796 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 3 13:35:55.005914 kubelet[2796]: W0303 13:35:55.005903 2796 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 3 13:35:55.005914 kubelet[2796]: E0303 13:35:55.005910 2796 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 3 13:35:55.006192 kubelet[2796]: E0303 13:35:55.006175 2796 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 3 13:35:55.006192 kubelet[2796]: W0303 13:35:55.006188 2796 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 3 13:35:55.006225 kubelet[2796]: E0303 13:35:55.006195 2796 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 3 13:35:55.006475 kubelet[2796]: E0303 13:35:55.006457 2796 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 3 13:35:55.006475 kubelet[2796]: W0303 13:35:55.006468 2796 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 3 13:35:55.006475 kubelet[2796]: E0303 13:35:55.006475 2796 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 3 13:35:55.006804 kubelet[2796]: E0303 13:35:55.006787 2796 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 3 13:35:55.006804 kubelet[2796]: W0303 13:35:55.006799 2796 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 3 13:35:55.007220 kubelet[2796]: E0303 13:35:55.006806 2796 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 3 13:35:55.007220 kubelet[2796]: E0303 13:35:55.007141 2796 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 3 13:35:55.007220 kubelet[2796]: W0303 13:35:55.007148 2796 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 3 13:35:55.007220 kubelet[2796]: E0303 13:35:55.007183 2796 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 3 13:35:55.007435 kubelet[2796]: E0303 13:35:55.007413 2796 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 3 13:35:55.007435 kubelet[2796]: W0303 13:35:55.007426 2796 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 3 13:35:55.007792 kubelet[2796]: E0303 13:35:55.007432 2796 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 3 13:35:55.008066 kubelet[2796]: E0303 13:35:55.008048 2796 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 3 13:35:55.008066 kubelet[2796]: W0303 13:35:55.008062 2796 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 3 13:35:55.008122 kubelet[2796]: E0303 13:35:55.008069 2796 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 3 13:35:55.008308 kubelet[2796]: E0303 13:35:55.008293 2796 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 3 13:35:55.008308 kubelet[2796]: W0303 13:35:55.008304 2796 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 3 13:35:55.008366 kubelet[2796]: E0303 13:35:55.008311 2796 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 3 13:35:55.025814 systemd[1]: Started cri-containerd-f741f2d05db477bf090e8f3364b776108839c0b46bf257e5a7bc2fa003a924e8.scope - libcontainer container f741f2d05db477bf090e8f3364b776108839c0b46bf257e5a7bc2fa003a924e8. 
Mar 3 13:35:55.067447 containerd[1639]: time="2026-03-03T13:35:55.067391781Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-797b48589c-m8sml,Uid:a0e4975a-39b4-4603-b656-02aca61e9353,Namespace:calico-system,Attempt:0,} returns sandbox id \"f741f2d05db477bf090e8f3364b776108839c0b46bf257e5a7bc2fa003a924e8\"" Mar 3 13:35:55.069710 containerd[1639]: time="2026-03-03T13:35:55.069318374Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.31.4\"" Mar 3 13:35:55.099361 containerd[1639]: time="2026-03-03T13:35:55.099318306Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-9nfsc,Uid:c328a717-2e18-4ba4-afe4-8aef6b284d56,Namespace:calico-system,Attempt:0,}" Mar 3 13:35:55.106333 kubelet[2796]: E0303 13:35:55.106305 2796 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 3 13:35:55.106333 kubelet[2796]: W0303 13:35:55.106324 2796 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 3 13:35:55.106468 kubelet[2796]: E0303 13:35:55.106347 2796 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input"
Mar 3 13:35:55.119885 containerd[1639]: time="2026-03-03T13:35:55.119653013Z" level=info msg="connecting to shim 889d343f7c8840212e1a79cec53967ad685f857ce2400a6eec25d5b0027e906b" address="unix:///run/containerd/s/7c4987813edb4da252bd6261277b8d34b7df6ffb4b338e11cd5bce5491fff45a" namespace=k8s.io protocol=ttrpc version=3
Mar 3 13:35:55.140795 systemd[1]: Started cri-containerd-889d343f7c8840212e1a79cec53967ad685f857ce2400a6eec25d5b0027e906b.scope - libcontainer container 889d343f7c8840212e1a79cec53967ad685f857ce2400a6eec25d5b0027e906b.
Mar 3 13:35:55.170085 containerd[1639]: time="2026-03-03T13:35:55.169993491Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-9nfsc,Uid:c328a717-2e18-4ba4-afe4-8aef6b284d56,Namespace:calico-system,Attempt:0,} returns sandbox id \"889d343f7c8840212e1a79cec53967ad685f857ce2400a6eec25d5b0027e906b\""
Mar 3 13:35:57.035434 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1776052070.mount: Deactivated successfully.
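The repeated kubelet errors above come from its periodic FlexVolume plugin probe: the kubelet execs each driver found under /opt/libexec/kubernetes/kubelet-plugins/volume/exec/<vendor~driver>/ with the argument `init` and parses the JSON status object printed on stdout. Because the `nodeagent~uds/uds` executable is missing here, the call yields empty output and the JSON decode fails with "unexpected end of JSON input". A minimal sketch of the response shape the kubelet expects (a hypothetical stub for illustration, not Calico's real pod2daemon driver):

```shell
#!/bin/sh
# Hypothetical FlexVolume driver stub. The kubelet invokes the driver as
# "<driver> <operation> [args...]" and parses a JSON status object from its
# stdout; an empty reply is what produces "unexpected end of JSON input".
flexvol_driver() {
  case "$1" in
    init)
      # Report success; "attach": false tells the kubelet this driver
      # does not implement attach/detach operations.
      echo '{"status":"Success","capabilities":{"attach":false}}'
      ;;
    *)
      # Unimplemented operations must still answer with a JSON status.
      echo '{"status":"Not supported","message":"'"$1"' is not implemented"}'
      return 1
      ;;
  esac
}
```

In this log the noise is likely transient: Calico normally installs the real `uds` driver via its flexvol init container (note the `pod2daemon-flexvol` image pull that follows), after which the probe stops failing.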
Mar 3 13:35:57.099319 kubelet[2796]: E0303 13:35:57.099283 2796 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-dlz7j" podUID="01ac93a6-8099-4557-bc77-3925251db8d8"
Mar 3 13:35:57.390675 containerd[1639]: time="2026-03-03T13:35:57.390601702Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 3 13:35:57.391658 containerd[1639]: time="2026-03-03T13:35:57.391488404Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.31.4: active requests=0, bytes read=36107596"
Mar 3 13:35:57.392555 containerd[1639]: time="2026-03-03T13:35:57.392516201Z" level=info msg="ImageCreate event name:\"sha256:46766605472b59b9c16342b2cc74da11f598baa9ba6d1e8b07b3f8ab4f29c55b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 3 13:35:57.394046 containerd[1639]: time="2026-03-03T13:35:57.394013600Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:d9396cfcd63dfcf72a65903042e473bb0bafc0cceb56bd71cd84078498a87130\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 3 13:35:57.394876 containerd[1639]: time="2026-03-03T13:35:57.394517307Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.31.4\" with image id \"sha256:46766605472b59b9c16342b2cc74da11f598baa9ba6d1e8b07b3f8ab4f29c55b\", repo tag \"ghcr.io/flatcar/calico/typha:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:d9396cfcd63dfcf72a65903042e473bb0bafc0cceb56bd71cd84078498a87130\", size \"36107450\" in 2.324808272s"
Mar 3 13:35:57.394876 containerd[1639]: time="2026-03-03T13:35:57.394540794Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.31.4\" returns image reference \"sha256:46766605472b59b9c16342b2cc74da11f598baa9ba6d1e8b07b3f8ab4f29c55b\""
Mar 3 13:35:57.395802 containerd[1639]: time="2026-03-03T13:35:57.395790100Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\""
Mar 3 13:35:57.412064 containerd[1639]: time="2026-03-03T13:35:57.412029181Z" level=info msg="CreateContainer within sandbox \"f741f2d05db477bf090e8f3364b776108839c0b46bf257e5a7bc2fa003a924e8\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}"
Mar 3 13:35:57.418438 containerd[1639]: time="2026-03-03T13:35:57.418170918Z" level=info msg="Container 7c55fc1d9435ff9b3486304655a822c1d122b6a65253069bd2151aec346d0423: CDI devices from CRI Config.CDIDevices: []"
Mar 3 13:35:57.423786 containerd[1639]: time="2026-03-03T13:35:57.423758377Z" level=info msg="CreateContainer within sandbox \"f741f2d05db477bf090e8f3364b776108839c0b46bf257e5a7bc2fa003a924e8\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"7c55fc1d9435ff9b3486304655a822c1d122b6a65253069bd2151aec346d0423\""
Mar 3 13:35:57.424646 containerd[1639]: time="2026-03-03T13:35:57.424622324Z" level=info msg="StartContainer for \"7c55fc1d9435ff9b3486304655a822c1d122b6a65253069bd2151aec346d0423\""
Mar 3 13:35:57.425640 containerd[1639]: time="2026-03-03T13:35:57.425613512Z" level=info msg="connecting to shim 7c55fc1d9435ff9b3486304655a822c1d122b6a65253069bd2151aec346d0423" address="unix:///run/containerd/s/f342d09da3a5f335d812a3ab9c9dc35766fd0c7a49af090e34ac44ffc2330581" protocol=ttrpc version=3
Mar 3 13:35:57.439773 systemd[1]: Started cri-containerd-7c55fc1d9435ff9b3486304655a822c1d122b6a65253069bd2151aec346d0423.scope - libcontainer container 7c55fc1d9435ff9b3486304655a822c1d122b6a65253069bd2151aec346d0423.
Mar 3 13:35:57.483597 containerd[1639]: time="2026-03-03T13:35:57.483471136Z" level=info msg="StartContainer for \"7c55fc1d9435ff9b3486304655a822c1d122b6a65253069bd2151aec346d0423\" returns successfully"
Mar 3 13:35:58.225921 kubelet[2796]: E0303 13:35:58.225840 2796 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 3 13:35:58.225921 kubelet[2796]: W0303 13:35:58.225877 2796 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 3 13:35:58.225921 kubelet[2796]: E0303 13:35:58.225905 2796 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 3 13:35:58.236196 kubelet[2796]: E0303 13:35:58.236074 2796 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input" Mar 3 13:35:58.236495 kubelet[2796]: E0303 13:35:58.236460 2796 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 3 13:35:58.236495 kubelet[2796]: W0303 13:35:58.236484 2796 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 3 13:35:58.236595 kubelet[2796]: E0303 13:35:58.236498 2796 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 3 13:35:58.236992 kubelet[2796]: E0303 13:35:58.236950 2796 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 3 13:35:58.236992 kubelet[2796]: W0303 13:35:58.236979 2796 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 3 13:35:58.237116 kubelet[2796]: E0303 13:35:58.236997 2796 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 3 13:35:58.237764 kubelet[2796]: E0303 13:35:58.237729 2796 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 3 13:35:58.237764 kubelet[2796]: W0303 13:35:58.237756 2796 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 3 13:35:58.237892 kubelet[2796]: E0303 13:35:58.237771 2796 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 3 13:35:58.238250 kubelet[2796]: E0303 13:35:58.238195 2796 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 3 13:35:58.238250 kubelet[2796]: W0303 13:35:58.238239 2796 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 3 13:35:58.238340 kubelet[2796]: E0303 13:35:58.238254 2796 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 3 13:35:58.238728 kubelet[2796]: E0303 13:35:58.238637 2796 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 3 13:35:58.238728 kubelet[2796]: W0303 13:35:58.238657 2796 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 3 13:35:58.238920 kubelet[2796]: E0303 13:35:58.238704 2796 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 3 13:35:58.239339 kubelet[2796]: E0303 13:35:58.239304 2796 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 3 13:35:58.239339 kubelet[2796]: W0303 13:35:58.239329 2796 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 3 13:35:58.239441 kubelet[2796]: E0303 13:35:58.239346 2796 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 3 13:35:58.239835 kubelet[2796]: E0303 13:35:58.239800 2796 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 3 13:35:58.239835 kubelet[2796]: W0303 13:35:58.239821 2796 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 3 13:35:58.239953 kubelet[2796]: E0303 13:35:58.239852 2796 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 3 13:35:58.241142 kubelet[2796]: E0303 13:35:58.241106 2796 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 3 13:35:58.241142 kubelet[2796]: W0303 13:35:58.241134 2796 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 3 13:35:58.241279 kubelet[2796]: E0303 13:35:58.241188 2796 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 3 13:35:58.241868 kubelet[2796]: E0303 13:35:58.241831 2796 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 3 13:35:58.241868 kubelet[2796]: W0303 13:35:58.241860 2796 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 3 13:35:58.241981 kubelet[2796]: E0303 13:35:58.241878 2796 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 3 13:35:58.242402 kubelet[2796]: E0303 13:35:58.242367 2796 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 3 13:35:58.242402 kubelet[2796]: W0303 13:35:58.242391 2796 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 3 13:35:58.242504 kubelet[2796]: E0303 13:35:58.242405 2796 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 3 13:35:58.243127 kubelet[2796]: E0303 13:35:58.243090 2796 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 3 13:35:58.243127 kubelet[2796]: W0303 13:35:58.243111 2796 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 3 13:35:58.243245 kubelet[2796]: E0303 13:35:58.243127 2796 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 3 13:35:59.034777 containerd[1639]: time="2026-03-03T13:35:59.034727756Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 3 13:35:59.036031 containerd[1639]: time="2026-03-03T13:35:59.035935608Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4: active requests=0, bytes read=4630250" Mar 3 13:35:59.037016 containerd[1639]: time="2026-03-03T13:35:59.036998112Z" level=info msg="ImageCreate event name:\"sha256:a6ea0cf732d820506ae9f1d7e7433a14009026b894fbbb8f346b9a5f5335c47e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 3 13:35:59.041277 containerd[1639]: time="2026-03-03T13:35:59.041250512Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:5fa3492ac4dfef9cc34fe70a51289118e1f715a89133ea730eef81ad789dadbc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 3 13:35:59.041681 containerd[1639]: time="2026-03-03T13:35:59.041626233Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\" with image id \"sha256:a6ea0cf732d820506ae9f1d7e7433a14009026b894fbbb8f346b9a5f5335c47e\", repo tag 
\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:5fa3492ac4dfef9cc34fe70a51289118e1f715a89133ea730eef81ad789dadbc\", size \"6186255\" in 1.64574019s" Mar 3 13:35:59.041771 containerd[1639]: time="2026-03-03T13:35:59.041729886Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\" returns image reference \"sha256:a6ea0cf732d820506ae9f1d7e7433a14009026b894fbbb8f346b9a5f5335c47e\"" Mar 3 13:35:59.045486 containerd[1639]: time="2026-03-03T13:35:59.045372234Z" level=info msg="CreateContainer within sandbox \"889d343f7c8840212e1a79cec53967ad685f857ce2400a6eec25d5b0027e906b\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Mar 3 13:35:59.055043 containerd[1639]: time="2026-03-03T13:35:59.054325779Z" level=info msg="Container d23342ff5d40c4330d940cb5682b68df6c7cd74255948ede8903d49cffcb4b0d: CDI devices from CRI Config.CDIDevices: []" Mar 3 13:35:59.067490 containerd[1639]: time="2026-03-03T13:35:59.067459788Z" level=info msg="CreateContainer within sandbox \"889d343f7c8840212e1a79cec53967ad685f857ce2400a6eec25d5b0027e906b\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"d23342ff5d40c4330d940cb5682b68df6c7cd74255948ede8903d49cffcb4b0d\"" Mar 3 13:35:59.067914 containerd[1639]: time="2026-03-03T13:35:59.067893792Z" level=info msg="StartContainer for \"d23342ff5d40c4330d940cb5682b68df6c7cd74255948ede8903d49cffcb4b0d\"" Mar 3 13:35:59.068900 containerd[1639]: time="2026-03-03T13:35:59.068878203Z" level=info msg="connecting to shim d23342ff5d40c4330d940cb5682b68df6c7cd74255948ede8903d49cffcb4b0d" address="unix:///run/containerd/s/7c4987813edb4da252bd6261277b8d34b7df6ffb4b338e11cd5bce5491fff45a" protocol=ttrpc version=3 Mar 3 13:35:59.089783 systemd[1]: Started cri-containerd-d23342ff5d40c4330d940cb5682b68df6c7cd74255948ede8903d49cffcb4b0d.scope - libcontainer container d23342ff5d40c4330d940cb5682b68df6c7cd74255948ede8903d49cffcb4b0d. 
Mar 3 13:35:59.096747 kubelet[2796]: E0303 13:35:59.096714 2796 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-dlz7j" podUID="01ac93a6-8099-4557-bc77-3925251db8d8" Mar 3 13:35:59.149344 containerd[1639]: time="2026-03-03T13:35:59.149297026Z" level=info msg="StartContainer for \"d23342ff5d40c4330d940cb5682b68df6c7cd74255948ede8903d49cffcb4b0d\" returns successfully" Mar 3 13:35:59.159994 systemd[1]: cri-containerd-d23342ff5d40c4330d940cb5682b68df6c7cd74255948ede8903d49cffcb4b0d.scope: Deactivated successfully. Mar 3 13:35:59.162960 containerd[1639]: time="2026-03-03T13:35:59.162931975Z" level=info msg="received container exit event container_id:\"d23342ff5d40c4330d940cb5682b68df6c7cd74255948ede8903d49cffcb4b0d\" id:\"d23342ff5d40c4330d940cb5682b68df6c7cd74255948ede8903d49cffcb4b0d\" pid:3505 exited_at:{seconds:1772544959 nanos:162475425}" Mar 3 13:35:59.180385 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-d23342ff5d40c4330d940cb5682b68df6c7cd74255948ede8903d49cffcb4b0d-rootfs.mount: Deactivated successfully. 
Mar 3 13:35:59.199016 kubelet[2796]: I0303 13:35:59.198993 2796 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 3 13:35:59.214697 kubelet[2796]: I0303 13:35:59.214316 2796 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-797b48589c-m8sml" podStartSLOduration=2.887222564 podStartE2EDuration="5.214305273s" podCreationTimestamp="2026-03-03 13:35:54 +0000 UTC" firstStartedPulling="2026-03-03 13:35:55.068555922 +0000 UTC m=+18.049537776" lastFinishedPulling="2026-03-03 13:35:57.395638621 +0000 UTC m=+20.376620485" observedRunningTime="2026-03-03 13:35:58.210191645 +0000 UTC m=+21.191173549" watchObservedRunningTime="2026-03-03 13:35:59.214305273 +0000 UTC m=+22.195287127" Mar 3 13:36:00.205682 containerd[1639]: time="2026-03-03T13:36:00.205634667Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.31.4\"" Mar 3 13:36:01.099220 kubelet[2796]: E0303 13:36:01.097854 2796 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-dlz7j" podUID="01ac93a6-8099-4557-bc77-3925251db8d8" Mar 3 13:36:03.096586 kubelet[2796]: E0303 13:36:03.095951 2796 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-dlz7j" podUID="01ac93a6-8099-4557-bc77-3925251db8d8" Mar 3 13:36:04.213946 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1366951609.mount: Deactivated successfully. 
Mar 3 13:36:04.262266 containerd[1639]: time="2026-03-03T13:36:04.262219953Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 3 13:36:04.263310 containerd[1639]: time="2026-03-03T13:36:04.263179692Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.31.4: active requests=0, bytes read=159838564" Mar 3 13:36:04.264905 containerd[1639]: time="2026-03-03T13:36:04.264882471Z" level=info msg="ImageCreate event name:\"sha256:e6536b93706eda782f82ebadcac3559cb61801d09f982cc0533a134e6a8e1acf\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 3 13:36:04.266844 containerd[1639]: time="2026-03-03T13:36:04.266803652Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:22b9d32dc7480c96272121d5682d53424c6e58653c60fa869b61a1758a11d77f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 3 13:36:04.267070 containerd[1639]: time="2026-03-03T13:36:04.267053922Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.31.4\" with image id \"sha256:e6536b93706eda782f82ebadcac3559cb61801d09f982cc0533a134e6a8e1acf\", repo tag \"ghcr.io/flatcar/calico/node:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/node@sha256:22b9d32dc7480c96272121d5682d53424c6e58653c60fa869b61a1758a11d77f\", size \"159838426\" in 4.061387706s" Mar 3 13:36:04.267121 containerd[1639]: time="2026-03-03T13:36:04.267111613Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.31.4\" returns image reference \"sha256:e6536b93706eda782f82ebadcac3559cb61801d09f982cc0533a134e6a8e1acf\"" Mar 3 13:36:04.272273 containerd[1639]: time="2026-03-03T13:36:04.272247714Z" level=info msg="CreateContainer within sandbox \"889d343f7c8840212e1a79cec53967ad685f857ce2400a6eec25d5b0027e906b\" for container &ContainerMetadata{Name:ebpf-bootstrap,Attempt:0,}" Mar 3 13:36:04.285428 containerd[1639]: time="2026-03-03T13:36:04.284829163Z" level=info msg="Container 
c0b3542b2d902ec89d17163cee62cdce4f6bb703c2c618c270de7863297d63b4: CDI devices from CRI Config.CDIDevices: []" Mar 3 13:36:04.297850 containerd[1639]: time="2026-03-03T13:36:04.297821163Z" level=info msg="CreateContainer within sandbox \"889d343f7c8840212e1a79cec53967ad685f857ce2400a6eec25d5b0027e906b\" for &ContainerMetadata{Name:ebpf-bootstrap,Attempt:0,} returns container id \"c0b3542b2d902ec89d17163cee62cdce4f6bb703c2c618c270de7863297d63b4\"" Mar 3 13:36:04.298431 containerd[1639]: time="2026-03-03T13:36:04.298378633Z" level=info msg="StartContainer for \"c0b3542b2d902ec89d17163cee62cdce4f6bb703c2c618c270de7863297d63b4\"" Mar 3 13:36:04.299576 containerd[1639]: time="2026-03-03T13:36:04.299555081Z" level=info msg="connecting to shim c0b3542b2d902ec89d17163cee62cdce4f6bb703c2c618c270de7863297d63b4" address="unix:///run/containerd/s/7c4987813edb4da252bd6261277b8d34b7df6ffb4b338e11cd5bce5491fff45a" protocol=ttrpc version=3 Mar 3 13:36:04.318772 systemd[1]: Started cri-containerd-c0b3542b2d902ec89d17163cee62cdce4f6bb703c2c618c270de7863297d63b4.scope - libcontainer container c0b3542b2d902ec89d17163cee62cdce4f6bb703c2c618c270de7863297d63b4. Mar 3 13:36:04.385776 containerd[1639]: time="2026-03-03T13:36:04.385716334Z" level=info msg="StartContainer for \"c0b3542b2d902ec89d17163cee62cdce4f6bb703c2c618c270de7863297d63b4\" returns successfully" Mar 3 13:36:04.422236 systemd[1]: cri-containerd-c0b3542b2d902ec89d17163cee62cdce4f6bb703c2c618c270de7863297d63b4.scope: Deactivated successfully. 
Mar 3 13:36:04.424095 containerd[1639]: time="2026-03-03T13:36:04.424060629Z" level=info msg="received container exit event container_id:\"c0b3542b2d902ec89d17163cee62cdce4f6bb703c2c618c270de7863297d63b4\" id:\"c0b3542b2d902ec89d17163cee62cdce4f6bb703c2c618c270de7863297d63b4\" pid:3560 exited_at:{seconds:1772544964 nanos:423797918}" Mar 3 13:36:04.444805 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-c0b3542b2d902ec89d17163cee62cdce4f6bb703c2c618c270de7863297d63b4-rootfs.mount: Deactivated successfully. Mar 3 13:36:05.096720 kubelet[2796]: E0303 13:36:05.096204 2796 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-dlz7j" podUID="01ac93a6-8099-4557-bc77-3925251db8d8" Mar 3 13:36:05.223266 containerd[1639]: time="2026-03-03T13:36:05.223165872Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.31.4\"" Mar 3 13:36:07.095901 kubelet[2796]: E0303 13:36:07.095843 2796 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-dlz7j" podUID="01ac93a6-8099-4557-bc77-3925251db8d8" Mar 3 13:36:07.822191 containerd[1639]: time="2026-03-03T13:36:07.822140435Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 3 13:36:07.823106 containerd[1639]: time="2026-03-03T13:36:07.822968424Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.31.4: active requests=0, bytes read=70611671" Mar 3 13:36:07.823659 containerd[1639]: time="2026-03-03T13:36:07.823636062Z" level=info msg="ImageCreate event 
name:\"sha256:c433a27dd94ce9242338eece49f11629412dd42552fed314746fcf16ea958b2b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 3 13:36:07.825020 containerd[1639]: time="2026-03-03T13:36:07.824998682Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:f1c5d9a6df01061c5faec4c4b59fb9ba69f8f5164b51e01ea8daa8e373111a04\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 3 13:36:07.825439 containerd[1639]: time="2026-03-03T13:36:07.825423602Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.31.4\" with image id \"sha256:c433a27dd94ce9242338eece49f11629412dd42552fed314746fcf16ea958b2b\", repo tag \"ghcr.io/flatcar/calico/cni:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:f1c5d9a6df01061c5faec4c4b59fb9ba69f8f5164b51e01ea8daa8e373111a04\", size \"72167716\" in 2.602164475s" Mar 3 13:36:07.825494 containerd[1639]: time="2026-03-03T13:36:07.825484847Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.31.4\" returns image reference \"sha256:c433a27dd94ce9242338eece49f11629412dd42552fed314746fcf16ea958b2b\"" Mar 3 13:36:07.827915 containerd[1639]: time="2026-03-03T13:36:07.827892142Z" level=info msg="CreateContainer within sandbox \"889d343f7c8840212e1a79cec53967ad685f857ce2400a6eec25d5b0027e906b\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Mar 3 13:36:07.837392 containerd[1639]: time="2026-03-03T13:36:07.836719923Z" level=info msg="Container 3969ceb797a755e3342663dc867a794bb5165bf3ab7fc870c189252d0b76ff23: CDI devices from CRI Config.CDIDevices: []" Mar 3 13:36:07.852037 containerd[1639]: time="2026-03-03T13:36:07.852007654Z" level=info msg="CreateContainer within sandbox \"889d343f7c8840212e1a79cec53967ad685f857ce2400a6eec25d5b0027e906b\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"3969ceb797a755e3342663dc867a794bb5165bf3ab7fc870c189252d0b76ff23\"" Mar 3 13:36:07.852517 containerd[1639]: time="2026-03-03T13:36:07.852490886Z" level=info 
msg="StartContainer for \"3969ceb797a755e3342663dc867a794bb5165bf3ab7fc870c189252d0b76ff23\"" Mar 3 13:36:07.853680 containerd[1639]: time="2026-03-03T13:36:07.853566539Z" level=info msg="connecting to shim 3969ceb797a755e3342663dc867a794bb5165bf3ab7fc870c189252d0b76ff23" address="unix:///run/containerd/s/7c4987813edb4da252bd6261277b8d34b7df6ffb4b338e11cd5bce5491fff45a" protocol=ttrpc version=3 Mar 3 13:36:07.874782 systemd[1]: Started cri-containerd-3969ceb797a755e3342663dc867a794bb5165bf3ab7fc870c189252d0b76ff23.scope - libcontainer container 3969ceb797a755e3342663dc867a794bb5165bf3ab7fc870c189252d0b76ff23. Mar 3 13:36:07.960659 containerd[1639]: time="2026-03-03T13:36:07.960536801Z" level=info msg="StartContainer for \"3969ceb797a755e3342663dc867a794bb5165bf3ab7fc870c189252d0b76ff23\" returns successfully" Mar 3 13:36:08.367993 containerd[1639]: time="2026-03-03T13:36:08.367949881Z" level=error msg="failed to reload cni configuration after receiving fs change event(WRITE \"/etc/cni/net.d/calico-kubeconfig\")" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Mar 3 13:36:08.369929 systemd[1]: cri-containerd-3969ceb797a755e3342663dc867a794bb5165bf3ab7fc870c189252d0b76ff23.scope: Deactivated successfully. Mar 3 13:36:08.370309 systemd[1]: cri-containerd-3969ceb797a755e3342663dc867a794bb5165bf3ab7fc870c189252d0b76ff23.scope: Consumed 369ms CPU time, 185.3M memory peak, 2.5M read from disk, 177M written to disk. 
Mar 3 13:36:08.371584 containerd[1639]: time="2026-03-03T13:36:08.371556355Z" level=info msg="received container exit event container_id:\"3969ceb797a755e3342663dc867a794bb5165bf3ab7fc870c189252d0b76ff23\" id:\"3969ceb797a755e3342663dc867a794bb5165bf3ab7fc870c189252d0b76ff23\" pid:3618 exited_at:{seconds:1772544968 nanos:371286405}" Mar 3 13:36:08.409327 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-3969ceb797a755e3342663dc867a794bb5165bf3ab7fc870c189252d0b76ff23-rootfs.mount: Deactivated successfully. Mar 3 13:36:08.469526 kubelet[2796]: I0303 13:36:08.469505 2796 kubelet_node_status.go:439] "Fast updating node status as it just became ready" Mar 3 13:36:08.503418 kubelet[2796]: I0303 13:36:08.503366 2796 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/69f392ca-620e-4545-90dd-be526067bb84-config-volume\") pod \"coredns-66bc5c9577-j9kwn\" (UID: \"69f392ca-620e-4545-90dd-be526067bb84\") " pod="kube-system/coredns-66bc5c9577-j9kwn" Mar 3 13:36:08.503984 kubelet[2796]: I0303 13:36:08.503972 2796 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h8pjh\" (UniqueName: \"kubernetes.io/projected/69f392ca-620e-4545-90dd-be526067bb84-kube-api-access-h8pjh\") pod \"coredns-66bc5c9577-j9kwn\" (UID: \"69f392ca-620e-4545-90dd-be526067bb84\") " pod="kube-system/coredns-66bc5c9577-j9kwn" Mar 3 13:36:08.504068 kubelet[2796]: I0303 13:36:08.504059 2796 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/24edf4b1-ec3a-4ab5-a696-e9721189d47d-config-volume\") pod \"coredns-66bc5c9577-wx9bf\" (UID: \"24edf4b1-ec3a-4ab5-a696-e9721189d47d\") " pod="kube-system/coredns-66bc5c9577-wx9bf" Mar 3 13:36:08.504167 kubelet[2796]: I0303 13:36:08.504130 2796 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zg2mb\" (UniqueName: \"kubernetes.io/projected/24edf4b1-ec3a-4ab5-a696-e9721189d47d-kube-api-access-zg2mb\") pod \"coredns-66bc5c9577-wx9bf\" (UID: \"24edf4b1-ec3a-4ab5-a696-e9721189d47d\") " pod="kube-system/coredns-66bc5c9577-wx9bf" Mar 3 13:36:08.510473 systemd[1]: Created slice kubepods-burstable-pod69f392ca_620e_4545_90dd_be526067bb84.slice - libcontainer container kubepods-burstable-pod69f392ca_620e_4545_90dd_be526067bb84.slice. Mar 3 13:36:08.524072 systemd[1]: Created slice kubepods-burstable-pod24edf4b1_ec3a_4ab5_a696_e9721189d47d.slice - libcontainer container kubepods-burstable-pod24edf4b1_ec3a_4ab5_a696_e9721189d47d.slice. Mar 3 13:36:08.531602 systemd[1]: Created slice kubepods-besteffort-pod8745b89a_4de0_4efe_bb7d_0887a5a20193.slice - libcontainer container kubepods-besteffort-pod8745b89a_4de0_4efe_bb7d_0887a5a20193.slice. Mar 3 13:36:08.539268 systemd[1]: Created slice kubepods-besteffort-pod79ec080a_ea5e_48da_b785_0639f3bb4193.slice - libcontainer container kubepods-besteffort-pod79ec080a_ea5e_48da_b785_0639f3bb4193.slice. Mar 3 13:36:08.546364 systemd[1]: Created slice kubepods-besteffort-podee6739e9_2745_46b9_936e_2b03da5ded77.slice - libcontainer container kubepods-besteffort-podee6739e9_2745_46b9_936e_2b03da5ded77.slice. Mar 3 13:36:08.551500 systemd[1]: Created slice kubepods-besteffort-pod323b935e_0775_45cf_89a0_f43932ed5b64.slice - libcontainer container kubepods-besteffort-pod323b935e_0775_45cf_89a0_f43932ed5b64.slice. Mar 3 13:36:08.555886 systemd[1]: Created slice kubepods-besteffort-pod67394235_a1d5_4b93_9138_aad850f57cbd.slice - libcontainer container kubepods-besteffort-pod67394235_a1d5_4b93_9138_aad850f57cbd.slice. 
Mar 3 13:36:08.604558 kubelet[2796]: I0303 13:36:08.604521 2796 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hlmmk\" (UniqueName: \"kubernetes.io/projected/67394235-a1d5-4b93-9138-aad850f57cbd-kube-api-access-hlmmk\") pod \"whisker-698594dd47-qmkfb\" (UID: \"67394235-a1d5-4b93-9138-aad850f57cbd\") " pod="calico-system/whisker-698594dd47-qmkfb" Mar 3 13:36:08.604558 kubelet[2796]: I0303 13:36:08.604555 2796 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/ee6739e9-2745-46b9-936e-2b03da5ded77-calico-apiserver-certs\") pod \"calico-apiserver-588cd44876-x8bpf\" (UID: \"ee6739e9-2745-46b9-936e-2b03da5ded77\") " pod="calico-system/calico-apiserver-588cd44876-x8bpf" Mar 3 13:36:08.604558 kubelet[2796]: I0303 13:36:08.604566 2796 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/323b935e-0775-45cf-89a0-f43932ed5b64-calico-apiserver-certs\") pod \"calico-apiserver-588cd44876-hb6b4\" (UID: \"323b935e-0775-45cf-89a0-f43932ed5b64\") " pod="calico-system/calico-apiserver-588cd44876-hb6b4" Mar 3 13:36:08.604764 kubelet[2796]: I0303 13:36:08.604579 2796 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/8745b89a-4de0-4efe-bb7d-0887a5a20193-goldmane-key-pair\") pod \"goldmane-cccfbd5cf-xvc2m\" (UID: \"8745b89a-4de0-4efe-bb7d-0887a5a20193\") " pod="calico-system/goldmane-cccfbd5cf-xvc2m" Mar 3 13:36:08.604764 kubelet[2796]: I0303 13:36:08.604590 2796 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/79ec080a-ea5e-48da-b785-0639f3bb4193-tigera-ca-bundle\") pod 
\"calico-kube-controllers-77c7bb8fcc-hh8jb\" (UID: \"79ec080a-ea5e-48da-b785-0639f3bb4193\") " pod="calico-system/calico-kube-controllers-77c7bb8fcc-hh8jb" Mar 3 13:36:08.604764 kubelet[2796]: I0303 13:36:08.604600 2796 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/67394235-a1d5-4b93-9138-aad850f57cbd-whisker-ca-bundle\") pod \"whisker-698594dd47-qmkfb\" (UID: \"67394235-a1d5-4b93-9138-aad850f57cbd\") " pod="calico-system/whisker-698594dd47-qmkfb" Mar 3 13:36:08.604764 kubelet[2796]: I0303 13:36:08.604616 2796 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-config\" (UniqueName: \"kubernetes.io/configmap/67394235-a1d5-4b93-9138-aad850f57cbd-nginx-config\") pod \"whisker-698594dd47-qmkfb\" (UID: \"67394235-a1d5-4b93-9138-aad850f57cbd\") " pod="calico-system/whisker-698594dd47-qmkfb" Mar 3 13:36:08.604764 kubelet[2796]: I0303 13:36:08.604628 2796 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/67394235-a1d5-4b93-9138-aad850f57cbd-whisker-backend-key-pair\") pod \"whisker-698594dd47-qmkfb\" (UID: \"67394235-a1d5-4b93-9138-aad850f57cbd\") " pod="calico-system/whisker-698594dd47-qmkfb" Mar 3 13:36:08.604860 kubelet[2796]: I0303 13:36:08.604642 2796 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hkpck\" (UniqueName: \"kubernetes.io/projected/ee6739e9-2745-46b9-936e-2b03da5ded77-kube-api-access-hkpck\") pod \"calico-apiserver-588cd44876-x8bpf\" (UID: \"ee6739e9-2745-46b9-936e-2b03da5ded77\") " pod="calico-system/calico-apiserver-588cd44876-x8bpf" Mar 3 13:36:08.604860 kubelet[2796]: I0303 13:36:08.604677 2796 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q8bj2\" 
(UniqueName: \"kubernetes.io/projected/323b935e-0775-45cf-89a0-f43932ed5b64-kube-api-access-q8bj2\") pod \"calico-apiserver-588cd44876-hb6b4\" (UID: \"323b935e-0775-45cf-89a0-f43932ed5b64\") " pod="calico-system/calico-apiserver-588cd44876-hb6b4" Mar 3 13:36:08.604860 kubelet[2796]: I0303 13:36:08.604696 2796 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8745b89a-4de0-4efe-bb7d-0887a5a20193-config\") pod \"goldmane-cccfbd5cf-xvc2m\" (UID: \"8745b89a-4de0-4efe-bb7d-0887a5a20193\") " pod="calico-system/goldmane-cccfbd5cf-xvc2m" Mar 3 13:36:08.604860 kubelet[2796]: I0303 13:36:08.604707 2796 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8745b89a-4de0-4efe-bb7d-0887a5a20193-goldmane-ca-bundle\") pod \"goldmane-cccfbd5cf-xvc2m\" (UID: \"8745b89a-4de0-4efe-bb7d-0887a5a20193\") " pod="calico-system/goldmane-cccfbd5cf-xvc2m" Mar 3 13:36:08.604860 kubelet[2796]: I0303 13:36:08.604717 2796 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2sfgq\" (UniqueName: \"kubernetes.io/projected/8745b89a-4de0-4efe-bb7d-0887a5a20193-kube-api-access-2sfgq\") pod \"goldmane-cccfbd5cf-xvc2m\" (UID: \"8745b89a-4de0-4efe-bb7d-0887a5a20193\") " pod="calico-system/goldmane-cccfbd5cf-xvc2m" Mar 3 13:36:08.604948 kubelet[2796]: I0303 13:36:08.604739 2796 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p2phd\" (UniqueName: \"kubernetes.io/projected/79ec080a-ea5e-48da-b785-0639f3bb4193-kube-api-access-p2phd\") pod \"calico-kube-controllers-77c7bb8fcc-hh8jb\" (UID: \"79ec080a-ea5e-48da-b785-0639f3bb4193\") " pod="calico-system/calico-kube-controllers-77c7bb8fcc-hh8jb" Mar 3 13:36:08.822489 containerd[1639]: time="2026-03-03T13:36:08.822134804Z" level=info 
msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-j9kwn,Uid:69f392ca-620e-4545-90dd-be526067bb84,Namespace:kube-system,Attempt:0,}" Mar 3 13:36:08.835520 containerd[1639]: time="2026-03-03T13:36:08.835118827Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-wx9bf,Uid:24edf4b1-ec3a-4ab5-a696-e9721189d47d,Namespace:kube-system,Attempt:0,}" Mar 3 13:36:08.838907 containerd[1639]: time="2026-03-03T13:36:08.838870837Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-cccfbd5cf-xvc2m,Uid:8745b89a-4de0-4efe-bb7d-0887a5a20193,Namespace:calico-system,Attempt:0,}" Mar 3 13:36:08.848584 containerd[1639]: time="2026-03-03T13:36:08.848175172Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-77c7bb8fcc-hh8jb,Uid:79ec080a-ea5e-48da-b785-0639f3bb4193,Namespace:calico-system,Attempt:0,}" Mar 3 13:36:08.871975 containerd[1639]: time="2026-03-03T13:36:08.871947917Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-698594dd47-qmkfb,Uid:67394235-a1d5-4b93-9138-aad850f57cbd,Namespace:calico-system,Attempt:0,}" Mar 3 13:36:08.874053 containerd[1639]: time="2026-03-03T13:36:08.872194170Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-588cd44876-x8bpf,Uid:ee6739e9-2745-46b9-936e-2b03da5ded77,Namespace:calico-system,Attempt:0,}" Mar 3 13:36:08.874198 containerd[1639]: time="2026-03-03T13:36:08.872219819Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-588cd44876-hb6b4,Uid:323b935e-0775-45cf-89a0-f43932ed5b64,Namespace:calico-system,Attempt:0,}" Mar 3 13:36:09.001789 containerd[1639]: time="2026-03-03T13:36:09.001573903Z" level=error msg="Failed to destroy network for sandbox \"eb4fe9cd70e1b4eae273aba756dd17764d876e79275f468a4ceb995a2d0211f9\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted 
/var/lib/calico/" Mar 3 13:36:09.003469 containerd[1639]: time="2026-03-03T13:36:09.003441692Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-77c7bb8fcc-hh8jb,Uid:79ec080a-ea5e-48da-b785-0639f3bb4193,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"eb4fe9cd70e1b4eae273aba756dd17764d876e79275f468a4ceb995a2d0211f9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 3 13:36:09.003690 kubelet[2796]: E0303 13:36:09.003642 2796 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"eb4fe9cd70e1b4eae273aba756dd17764d876e79275f468a4ceb995a2d0211f9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 3 13:36:09.003898 kubelet[2796]: E0303 13:36:09.003713 2796 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"eb4fe9cd70e1b4eae273aba756dd17764d876e79275f468a4ceb995a2d0211f9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-77c7bb8fcc-hh8jb" Mar 3 13:36:09.003898 kubelet[2796]: E0303 13:36:09.003728 2796 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"eb4fe9cd70e1b4eae273aba756dd17764d876e79275f468a4ceb995a2d0211f9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted 
/var/lib/calico/" pod="calico-system/calico-kube-controllers-77c7bb8fcc-hh8jb" Mar 3 13:36:09.003898 kubelet[2796]: E0303 13:36:09.003770 2796 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-77c7bb8fcc-hh8jb_calico-system(79ec080a-ea5e-48da-b785-0639f3bb4193)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-77c7bb8fcc-hh8jb_calico-system(79ec080a-ea5e-48da-b785-0639f3bb4193)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"eb4fe9cd70e1b4eae273aba756dd17764d876e79275f468a4ceb995a2d0211f9\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-77c7bb8fcc-hh8jb" podUID="79ec080a-ea5e-48da-b785-0639f3bb4193" Mar 3 13:36:09.005170 containerd[1639]: time="2026-03-03T13:36:09.005139447Z" level=error msg="Failed to destroy network for sandbox \"4dfef0eb1bc09c8fd397e867d623bdef21fb665d9b36f716f7aa0986f7572077\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 3 13:36:09.008011 containerd[1639]: time="2026-03-03T13:36:09.007918289Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-588cd44876-hb6b4,Uid:323b935e-0775-45cf-89a0-f43932ed5b64,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"4dfef0eb1bc09c8fd397e867d623bdef21fb665d9b36f716f7aa0986f7572077\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 3 13:36:09.008184 kubelet[2796]: E0303 13:36:09.008103 2796 log.go:32] "RunPodSandbox 
from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4dfef0eb1bc09c8fd397e867d623bdef21fb665d9b36f716f7aa0986f7572077\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 3 13:36:09.008184 kubelet[2796]: E0303 13:36:09.008133 2796 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4dfef0eb1bc09c8fd397e867d623bdef21fb665d9b36f716f7aa0986f7572077\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-apiserver-588cd44876-hb6b4" Mar 3 13:36:09.008184 kubelet[2796]: E0303 13:36:09.008146 2796 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4dfef0eb1bc09c8fd397e867d623bdef21fb665d9b36f716f7aa0986f7572077\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-apiserver-588cd44876-hb6b4" Mar 3 13:36:09.008274 kubelet[2796]: E0303 13:36:09.008200 2796 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-588cd44876-hb6b4_calico-system(323b935e-0775-45cf-89a0-f43932ed5b64)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-588cd44876-hb6b4_calico-system(323b935e-0775-45cf-89a0-f43932ed5b64)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"4dfef0eb1bc09c8fd397e867d623bdef21fb665d9b36f716f7aa0986f7572077\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: 
check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-apiserver-588cd44876-hb6b4" podUID="323b935e-0775-45cf-89a0-f43932ed5b64" Mar 3 13:36:09.039237 containerd[1639]: time="2026-03-03T13:36:09.039182020Z" level=error msg="Failed to destroy network for sandbox \"fd43615d96c87989b72c1b09d6acd8dd00e468835b83963eb095fec565a3d25d\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 3 13:36:09.039705 containerd[1639]: time="2026-03-03T13:36:09.039334647Z" level=error msg="Failed to destroy network for sandbox \"38eaceb203684a63b4a6772fdbac99ac46a11e7da45f8a8f7b200fd9768ba9b2\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 3 13:36:09.040541 containerd[1639]: time="2026-03-03T13:36:09.040515008Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-cccfbd5cf-xvc2m,Uid:8745b89a-4de0-4efe-bb7d-0887a5a20193,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"fd43615d96c87989b72c1b09d6acd8dd00e468835b83963eb095fec565a3d25d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 3 13:36:09.041544 kubelet[2796]: E0303 13:36:09.041504 2796 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"fd43615d96c87989b72c1b09d6acd8dd00e468835b83963eb095fec565a3d25d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 3 
13:36:09.041596 kubelet[2796]: E0303 13:36:09.041560 2796 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"fd43615d96c87989b72c1b09d6acd8dd00e468835b83963eb095fec565a3d25d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-cccfbd5cf-xvc2m" Mar 3 13:36:09.041596 kubelet[2796]: E0303 13:36:09.041582 2796 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"fd43615d96c87989b72c1b09d6acd8dd00e468835b83963eb095fec565a3d25d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-cccfbd5cf-xvc2m" Mar 3 13:36:09.042251 kubelet[2796]: E0303 13:36:09.042009 2796 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-cccfbd5cf-xvc2m_calico-system(8745b89a-4de0-4efe-bb7d-0887a5a20193)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-cccfbd5cf-xvc2m_calico-system(8745b89a-4de0-4efe-bb7d-0887a5a20193)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"fd43615d96c87989b72c1b09d6acd8dd00e468835b83963eb095fec565a3d25d\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-cccfbd5cf-xvc2m" podUID="8745b89a-4de0-4efe-bb7d-0887a5a20193" Mar 3 13:36:09.044324 containerd[1639]: time="2026-03-03T13:36:09.044268744Z" level=error msg="Failed to destroy network for sandbox \"80e81c5e046a4673b7441f8034623518650ee491b78facbba2dd050e5bd2c7f4\"" error="plugin 
type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 3 13:36:09.045463 containerd[1639]: time="2026-03-03T13:36:09.045439901Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-wx9bf,Uid:24edf4b1-ec3a-4ab5-a696-e9721189d47d,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"38eaceb203684a63b4a6772fdbac99ac46a11e7da45f8a8f7b200fd9768ba9b2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 3 13:36:09.045640 containerd[1639]: time="2026-03-03T13:36:09.045608734Z" level=error msg="Failed to destroy network for sandbox \"4b97372159a7d77c048c60ad626bd71f9937b05fcf3dff2b190f1ada53d40b3b\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 3 13:36:09.045762 kubelet[2796]: E0303 13:36:09.045621 2796 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"38eaceb203684a63b4a6772fdbac99ac46a11e7da45f8a8f7b200fd9768ba9b2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 3 13:36:09.045899 kubelet[2796]: E0303 13:36:09.045871 2796 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"38eaceb203684a63b4a6772fdbac99ac46a11e7da45f8a8f7b200fd9768ba9b2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has 
mounted /var/lib/calico/" pod="kube-system/coredns-66bc5c9577-wx9bf" Mar 3 13:36:09.045954 kubelet[2796]: E0303 13:36:09.045944 2796 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"38eaceb203684a63b4a6772fdbac99ac46a11e7da45f8a8f7b200fd9768ba9b2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-66bc5c9577-wx9bf" Mar 3 13:36:09.046067 kubelet[2796]: E0303 13:36:09.046027 2796 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-66bc5c9577-wx9bf_kube-system(24edf4b1-ec3a-4ab5-a696-e9721189d47d)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-66bc5c9577-wx9bf_kube-system(24edf4b1-ec3a-4ab5-a696-e9721189d47d)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"38eaceb203684a63b4a6772fdbac99ac46a11e7da45f8a8f7b200fd9768ba9b2\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-66bc5c9577-wx9bf" podUID="24edf4b1-ec3a-4ab5-a696-e9721189d47d" Mar 3 13:36:09.046635 containerd[1639]: time="2026-03-03T13:36:09.046608345Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-j9kwn,Uid:69f392ca-620e-4545-90dd-be526067bb84,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"80e81c5e046a4673b7441f8034623518650ee491b78facbba2dd050e5bd2c7f4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 3 13:36:09.046909 kubelet[2796]: E0303 13:36:09.046750 2796 
log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"80e81c5e046a4673b7441f8034623518650ee491b78facbba2dd050e5bd2c7f4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 3 13:36:09.046909 kubelet[2796]: E0303 13:36:09.046768 2796 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"80e81c5e046a4673b7441f8034623518650ee491b78facbba2dd050e5bd2c7f4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-66bc5c9577-j9kwn" Mar 3 13:36:09.047091 kubelet[2796]: E0303 13:36:09.047073 2796 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"80e81c5e046a4673b7441f8034623518650ee491b78facbba2dd050e5bd2c7f4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-66bc5c9577-j9kwn" Mar 3 13:36:09.047289 kubelet[2796]: E0303 13:36:09.047267 2796 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-66bc5c9577-j9kwn_kube-system(69f392ca-620e-4545-90dd-be526067bb84)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-66bc5c9577-j9kwn_kube-system(69f392ca-620e-4545-90dd-be526067bb84)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"80e81c5e046a4673b7441f8034623518650ee491b78facbba2dd050e5bd2c7f4\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the 
calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-66bc5c9577-j9kwn" podUID="69f392ca-620e-4545-90dd-be526067bb84" Mar 3 13:36:09.048470 containerd[1639]: time="2026-03-03T13:36:09.048424834Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-698594dd47-qmkfb,Uid:67394235-a1d5-4b93-9138-aad850f57cbd,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"4b97372159a7d77c048c60ad626bd71f9937b05fcf3dff2b190f1ada53d40b3b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 3 13:36:09.048777 kubelet[2796]: E0303 13:36:09.048756 2796 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4b97372159a7d77c048c60ad626bd71f9937b05fcf3dff2b190f1ada53d40b3b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 3 13:36:09.048777 kubelet[2796]: E0303 13:36:09.048809 2796 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4b97372159a7d77c048c60ad626bd71f9937b05fcf3dff2b190f1ada53d40b3b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-698594dd47-qmkfb" Mar 3 13:36:09.048777 kubelet[2796]: E0303 13:36:09.048821 2796 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4b97372159a7d77c048c60ad626bd71f9937b05fcf3dff2b190f1ada53d40b3b\": plugin type=\"calico\" failed (add): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-698594dd47-qmkfb" Mar 3 13:36:09.048972 kubelet[2796]: E0303 13:36:09.048957 2796 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-698594dd47-qmkfb_calico-system(67394235-a1d5-4b93-9138-aad850f57cbd)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-698594dd47-qmkfb_calico-system(67394235-a1d5-4b93-9138-aad850f57cbd)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"4b97372159a7d77c048c60ad626bd71f9937b05fcf3dff2b190f1ada53d40b3b\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-698594dd47-qmkfb" podUID="67394235-a1d5-4b93-9138-aad850f57cbd" Mar 3 13:36:09.052010 containerd[1639]: time="2026-03-03T13:36:09.051984318Z" level=error msg="Failed to destroy network for sandbox \"807cf9cc9e320f3290b232f00ad9556cd643dc314b93cb247655ac56d689a62d\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 3 13:36:09.053217 containerd[1639]: time="2026-03-03T13:36:09.053196139Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-588cd44876-x8bpf,Uid:ee6739e9-2745-46b9-936e-2b03da5ded77,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"807cf9cc9e320f3290b232f00ad9556cd643dc314b93cb247655ac56d689a62d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 3 13:36:09.053335 kubelet[2796]: 
E0303 13:36:09.053313 2796 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"807cf9cc9e320f3290b232f00ad9556cd643dc314b93cb247655ac56d689a62d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 3 13:36:09.053363 kubelet[2796]: E0303 13:36:09.053335 2796 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"807cf9cc9e320f3290b232f00ad9556cd643dc314b93cb247655ac56d689a62d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-apiserver-588cd44876-x8bpf" Mar 3 13:36:09.053385 kubelet[2796]: E0303 13:36:09.053373 2796 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"807cf9cc9e320f3290b232f00ad9556cd643dc314b93cb247655ac56d689a62d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-apiserver-588cd44876-x8bpf" Mar 3 13:36:09.053462 kubelet[2796]: E0303 13:36:09.053431 2796 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-588cd44876-x8bpf_calico-system(ee6739e9-2745-46b9-936e-2b03da5ded77)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-588cd44876-x8bpf_calico-system(ee6739e9-2745-46b9-936e-2b03da5ded77)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"807cf9cc9e320f3290b232f00ad9556cd643dc314b93cb247655ac56d689a62d\\\": plugin type=\\\"calico\\\" failed (add): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-apiserver-588cd44876-x8bpf" podUID="ee6739e9-2745-46b9-936e-2b03da5ded77" Mar 3 13:36:09.101867 systemd[1]: Created slice kubepods-besteffort-pod01ac93a6_8099_4557_bc77_3925251db8d8.slice - libcontainer container kubepods-besteffort-pod01ac93a6_8099_4557_bc77_3925251db8d8.slice. Mar 3 13:36:09.106826 containerd[1639]: time="2026-03-03T13:36:09.106794430Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-dlz7j,Uid:01ac93a6-8099-4557-bc77-3925251db8d8,Namespace:calico-system,Attempt:0,}" Mar 3 13:36:09.142689 containerd[1639]: time="2026-03-03T13:36:09.142627031Z" level=error msg="Failed to destroy network for sandbox \"ac4097e65dda191394939dc6d76bf727865e860d1451a5f324e3a8b6b5dd6538\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 3 13:36:09.143980 containerd[1639]: time="2026-03-03T13:36:09.143955183Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-dlz7j,Uid:01ac93a6-8099-4557-bc77-3925251db8d8,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"ac4097e65dda191394939dc6d76bf727865e860d1451a5f324e3a8b6b5dd6538\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 3 13:36:09.144224 kubelet[2796]: E0303 13:36:09.144185 2796 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ac4097e65dda191394939dc6d76bf727865e860d1451a5f324e3a8b6b5dd6538\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or 
directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 3 13:36:09.144278 kubelet[2796]: E0303 13:36:09.144258 2796 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ac4097e65dda191394939dc6d76bf727865e860d1451a5f324e3a8b6b5dd6538\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-dlz7j" Mar 3 13:36:09.144278 kubelet[2796]: E0303 13:36:09.144274 2796 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ac4097e65dda191394939dc6d76bf727865e860d1451a5f324e3a8b6b5dd6538\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-dlz7j" Mar 3 13:36:09.144364 kubelet[2796]: E0303 13:36:09.144339 2796 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-dlz7j_calico-system(01ac93a6-8099-4557-bc77-3925251db8d8)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-dlz7j_calico-system(01ac93a6-8099-4557-bc77-3925251db8d8)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"ac4097e65dda191394939dc6d76bf727865e860d1451a5f324e3a8b6b5dd6538\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-dlz7j" podUID="01ac93a6-8099-4557-bc77-3925251db8d8" Mar 3 13:36:09.271047 containerd[1639]: time="2026-03-03T13:36:09.270818909Z" level=info msg="CreateContainer within sandbox 
\"889d343f7c8840212e1a79cec53967ad685f857ce2400a6eec25d5b0027e906b\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Mar 3 13:36:09.283433 containerd[1639]: time="2026-03-03T13:36:09.283400665Z" level=info msg="Container cf2bb7bb1b0235d67c7814b01f790e757daa1ac1a305a7679c8379e922701121: CDI devices from CRI Config.CDIDevices: []" Mar 3 13:36:09.289467 containerd[1639]: time="2026-03-03T13:36:09.289442231Z" level=info msg="CreateContainer within sandbox \"889d343f7c8840212e1a79cec53967ad685f857ce2400a6eec25d5b0027e906b\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"cf2bb7bb1b0235d67c7814b01f790e757daa1ac1a305a7679c8379e922701121\"" Mar 3 13:36:09.289874 containerd[1639]: time="2026-03-03T13:36:09.289836003Z" level=info msg="StartContainer for \"cf2bb7bb1b0235d67c7814b01f790e757daa1ac1a305a7679c8379e922701121\"" Mar 3 13:36:09.291037 containerd[1639]: time="2026-03-03T13:36:09.291002012Z" level=info msg="connecting to shim cf2bb7bb1b0235d67c7814b01f790e757daa1ac1a305a7679c8379e922701121" address="unix:///run/containerd/s/7c4987813edb4da252bd6261277b8d34b7df6ffb4b338e11cd5bce5491fff45a" protocol=ttrpc version=3 Mar 3 13:36:09.311787 systemd[1]: Started cri-containerd-cf2bb7bb1b0235d67c7814b01f790e757daa1ac1a305a7679c8379e922701121.scope - libcontainer container cf2bb7bb1b0235d67c7814b01f790e757daa1ac1a305a7679c8379e922701121. 
Mar 3 13:36:09.375901 containerd[1639]: time="2026-03-03T13:36:09.375458395Z" level=info msg="StartContainer for \"cf2bb7bb1b0235d67c7814b01f790e757daa1ac1a305a7679c8379e922701121\" returns successfully" Mar 3 13:36:09.513046 kubelet[2796]: I0303 13:36:09.511637 2796 reconciler_common.go:163] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hlmmk\" (UniqueName: \"kubernetes.io/projected/67394235-a1d5-4b93-9138-aad850f57cbd-kube-api-access-hlmmk\") pod \"67394235-a1d5-4b93-9138-aad850f57cbd\" (UID: \"67394235-a1d5-4b93-9138-aad850f57cbd\") " Mar 3 13:36:09.513558 kubelet[2796]: I0303 13:36:09.513547 2796 reconciler_common.go:163] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/67394235-a1d5-4b93-9138-aad850f57cbd-whisker-backend-key-pair\") pod \"67394235-a1d5-4b93-9138-aad850f57cbd\" (UID: \"67394235-a1d5-4b93-9138-aad850f57cbd\") " Mar 3 13:36:09.513642 kubelet[2796]: I0303 13:36:09.513634 2796 reconciler_common.go:163] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/67394235-a1d5-4b93-9138-aad850f57cbd-whisker-ca-bundle\") pod \"67394235-a1d5-4b93-9138-aad850f57cbd\" (UID: \"67394235-a1d5-4b93-9138-aad850f57cbd\") " Mar 3 13:36:09.519382 kubelet[2796]: I0303 13:36:09.516220 2796 reconciler_common.go:163] "operationExecutor.UnmountVolume started for volume \"nginx-config\" (UniqueName: \"kubernetes.io/configmap/67394235-a1d5-4b93-9138-aad850f57cbd-nginx-config\") pod \"67394235-a1d5-4b93-9138-aad850f57cbd\" (UID: \"67394235-a1d5-4b93-9138-aad850f57cbd\") " Mar 3 13:36:09.519382 kubelet[2796]: I0303 13:36:09.514085 2796 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67394235-a1d5-4b93-9138-aad850f57cbd-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "67394235-a1d5-4b93-9138-aad850f57cbd" (UID: "67394235-a1d5-4b93-9138-aad850f57cbd"). 
InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 3 13:36:09.519382 kubelet[2796]: I0303 13:36:09.516503 2796 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67394235-a1d5-4b93-9138-aad850f57cbd-nginx-config" (OuterVolumeSpecName: "nginx-config") pod "67394235-a1d5-4b93-9138-aad850f57cbd" (UID: "67394235-a1d5-4b93-9138-aad850f57cbd"). InnerVolumeSpecName "nginx-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 3 13:36:09.519382 kubelet[2796]: I0303 13:36:09.516328 2796 reconciler_common.go:299] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/67394235-a1d5-4b93-9138-aad850f57cbd-whisker-ca-bundle\") on node \"ci-4459-2-4-a-1c48930dd2\" DevicePath \"\"" Mar 3 13:36:09.520080 kubelet[2796]: I0303 13:36:09.520065 2796 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67394235-a1d5-4b93-9138-aad850f57cbd-kube-api-access-hlmmk" (OuterVolumeSpecName: "kube-api-access-hlmmk") pod "67394235-a1d5-4b93-9138-aad850f57cbd" (UID: "67394235-a1d5-4b93-9138-aad850f57cbd"). InnerVolumeSpecName "kube-api-access-hlmmk". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 3 13:36:09.520196 kubelet[2796]: I0303 13:36:09.520184 2796 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67394235-a1d5-4b93-9138-aad850f57cbd-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "67394235-a1d5-4b93-9138-aad850f57cbd" (UID: "67394235-a1d5-4b93-9138-aad850f57cbd"). InnerVolumeSpecName "whisker-backend-key-pair". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 3 13:36:09.619233 kubelet[2796]: I0303 13:36:09.619174 2796 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hlmmk\" (UniqueName: \"kubernetes.io/projected/67394235-a1d5-4b93-9138-aad850f57cbd-kube-api-access-hlmmk\") on node \"ci-4459-2-4-a-1c48930dd2\" DevicePath \"\"" Mar 3 13:36:09.619233 kubelet[2796]: I0303 13:36:09.619218 2796 reconciler_common.go:299] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/67394235-a1d5-4b93-9138-aad850f57cbd-whisker-backend-key-pair\") on node \"ci-4459-2-4-a-1c48930dd2\" DevicePath \"\"" Mar 3 13:36:09.619233 kubelet[2796]: I0303 13:36:09.619238 2796 reconciler_common.go:299] "Volume detached for volume \"nginx-config\" (UniqueName: \"kubernetes.io/configmap/67394235-a1d5-4b93-9138-aad850f57cbd-nginx-config\") on node \"ci-4459-2-4-a-1c48930dd2\" DevicePath \"\"" Mar 3 13:36:09.846533 systemd[1]: run-netns-cni\x2d52489215\x2d5b00\x2dbd3f\x2dd618\x2d3ae267c687c6.mount: Deactivated successfully. Mar 3 13:36:09.846799 systemd[1]: run-netns-cni\x2de4a73143\x2d7010\x2d40b3\x2d9bac\x2d84b23f125c1c.mount: Deactivated successfully. Mar 3 13:36:09.846946 systemd[1]: run-netns-cni\x2d74d2e497\x2db029\x2d4ab1\x2df71a\x2d0bdef08d7ba6.mount: Deactivated successfully. Mar 3 13:36:09.847065 systemd[1]: run-netns-cni\x2dbdde5215\x2d044b\x2db878\x2d4cd3\x2d1fe2ae4b5b6a.mount: Deactivated successfully. Mar 3 13:36:09.847191 systemd[1]: run-netns-cni\x2d15068830\x2d6db9\x2d168e\x2d4d3e\x2dbdec101b1890.mount: Deactivated successfully. Mar 3 13:36:09.847309 systemd[1]: run-netns-cni\x2dac607943\x2d232e\x2d6e8d\x2d91c0\x2d68af6a687dbd.mount: Deactivated successfully. Mar 3 13:36:09.847429 systemd[1]: run-netns-cni\x2d9017f92b\x2d92e4\x2dfe52\x2d2992\x2dba6a939352cc.mount: Deactivated successfully. 
Mar 3 13:36:09.847551 systemd[1]: var-lib-kubelet-pods-67394235\x2da1d5\x2d4b93\x2d9138\x2daad850f57cbd-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dhlmmk.mount: Deactivated successfully. Mar 3 13:36:09.847739 systemd[1]: var-lib-kubelet-pods-67394235\x2da1d5\x2d4b93\x2d9138\x2daad850f57cbd-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. Mar 3 13:36:10.267247 systemd[1]: Removed slice kubepods-besteffort-pod67394235_a1d5_4b93_9138_aad850f57cbd.slice - libcontainer container kubepods-besteffort-pod67394235_a1d5_4b93_9138_aad850f57cbd.slice. Mar 3 13:36:10.283439 kubelet[2796]: I0303 13:36:10.283238 2796 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-9nfsc" podStartSLOduration=3.628663573 podStartE2EDuration="16.283213384s" podCreationTimestamp="2026-03-03 13:35:54 +0000 UTC" firstStartedPulling="2026-03-03 13:35:55.171456909 +0000 UTC m=+18.152438773" lastFinishedPulling="2026-03-03 13:36:07.82600673 +0000 UTC m=+30.806988584" observedRunningTime="2026-03-03 13:36:10.279378299 +0000 UTC m=+33.260360193" watchObservedRunningTime="2026-03-03 13:36:10.283213384 +0000 UTC m=+33.264195279" Mar 3 13:36:10.351811 systemd[1]: Created slice kubepods-besteffort-pod7c81a7aa_c336_4066_a7b9_9902b49103fc.slice - libcontainer container kubepods-besteffort-pod7c81a7aa_c336_4066_a7b9_9902b49103fc.slice. 
Mar 3 13:36:10.424086 kubelet[2796]: I0303 13:36:10.423955 2796 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-config\" (UniqueName: \"kubernetes.io/configmap/7c81a7aa-c336-4066-a7b9-9902b49103fc-nginx-config\") pod \"whisker-6c6545cdb4-s8d2n\" (UID: \"7c81a7aa-c336-4066-a7b9-9902b49103fc\") " pod="calico-system/whisker-6c6545cdb4-s8d2n" Mar 3 13:36:10.424086 kubelet[2796]: I0303 13:36:10.423997 2796 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6b6sx\" (UniqueName: \"kubernetes.io/projected/7c81a7aa-c336-4066-a7b9-9902b49103fc-kube-api-access-6b6sx\") pod \"whisker-6c6545cdb4-s8d2n\" (UID: \"7c81a7aa-c336-4066-a7b9-9902b49103fc\") " pod="calico-system/whisker-6c6545cdb4-s8d2n" Mar 3 13:36:10.424086 kubelet[2796]: I0303 13:36:10.424012 2796 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7c81a7aa-c336-4066-a7b9-9902b49103fc-whisker-ca-bundle\") pod \"whisker-6c6545cdb4-s8d2n\" (UID: \"7c81a7aa-c336-4066-a7b9-9902b49103fc\") " pod="calico-system/whisker-6c6545cdb4-s8d2n" Mar 3 13:36:10.424086 kubelet[2796]: I0303 13:36:10.424027 2796 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/7c81a7aa-c336-4066-a7b9-9902b49103fc-whisker-backend-key-pair\") pod \"whisker-6c6545cdb4-s8d2n\" (UID: \"7c81a7aa-c336-4066-a7b9-9902b49103fc\") " pod="calico-system/whisker-6c6545cdb4-s8d2n" Mar 3 13:36:10.660343 containerd[1639]: time="2026-03-03T13:36:10.660300279Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-6c6545cdb4-s8d2n,Uid:7c81a7aa-c336-4066-a7b9-9902b49103fc,Namespace:calico-system,Attempt:0,}" Mar 3 13:36:10.825218 systemd-networkd[1489]: calibdc8b308609: Link UP Mar 3 13:36:10.826468 systemd-networkd[1489]: 
calibdc8b308609: Gained carrier Mar 3 13:36:10.849117 containerd[1639]: 2026-03-03 13:36:10.697 [ERROR][3977] cni-plugin/utils.go 116: File does not exist, skipping the error since RequireMTUFile is false error=open /var/lib/calico/mtu: no such file or directory filename="/var/lib/calico/mtu" Mar 3 13:36:10.849117 containerd[1639]: 2026-03-03 13:36:10.710 [INFO][3977] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--4--a--1c48930dd2-k8s-whisker--6c6545cdb4--s8d2n-eth0 whisker-6c6545cdb4- calico-system 7c81a7aa-c336-4066-a7b9-9902b49103fc 904 0 2026-03-03 13:36:10 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:6c6545cdb4 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s ci-4459-2-4-a-1c48930dd2 whisker-6c6545cdb4-s8d2n eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] calibdc8b308609 [] [] }} ContainerID="bf05c2aae75f8361f7d97fa96f73188f41b1ea32e41e9d884a2b093586ca4d23" Namespace="calico-system" Pod="whisker-6c6545cdb4-s8d2n" WorkloadEndpoint="ci--4459--2--4--a--1c48930dd2-k8s-whisker--6c6545cdb4--s8d2n-" Mar 3 13:36:10.849117 containerd[1639]: 2026-03-03 13:36:10.711 [INFO][3977] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="bf05c2aae75f8361f7d97fa96f73188f41b1ea32e41e9d884a2b093586ca4d23" Namespace="calico-system" Pod="whisker-6c6545cdb4-s8d2n" WorkloadEndpoint="ci--4459--2--4--a--1c48930dd2-k8s-whisker--6c6545cdb4--s8d2n-eth0" Mar 3 13:36:10.849117 containerd[1639]: 2026-03-03 13:36:10.756 [INFO][4023] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="bf05c2aae75f8361f7d97fa96f73188f41b1ea32e41e9d884a2b093586ca4d23" HandleID="k8s-pod-network.bf05c2aae75f8361f7d97fa96f73188f41b1ea32e41e9d884a2b093586ca4d23" Workload="ci--4459--2--4--a--1c48930dd2-k8s-whisker--6c6545cdb4--s8d2n-eth0" Mar 3 13:36:10.849574 
containerd[1639]: 2026-03-03 13:36:10.767 [INFO][4023] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="bf05c2aae75f8361f7d97fa96f73188f41b1ea32e41e9d884a2b093586ca4d23" HandleID="k8s-pod-network.bf05c2aae75f8361f7d97fa96f73188f41b1ea32e41e9d884a2b093586ca4d23" Workload="ci--4459--2--4--a--1c48930dd2-k8s-whisker--6c6545cdb4--s8d2n-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002fda50), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459-2-4-a-1c48930dd2", "pod":"whisker-6c6545cdb4-s8d2n", "timestamp":"2026-03-03 13:36:10.756812747 +0000 UTC"}, Hostname:"ci-4459-2-4-a-1c48930dd2", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0xc0000bf080)} Mar 3 13:36:10.849574 containerd[1639]: 2026-03-03 13:36:10.767 [INFO][4023] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 3 13:36:10.849574 containerd[1639]: 2026-03-03 13:36:10.767 [INFO][4023] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Mar 3 13:36:10.849574 containerd[1639]: 2026-03-03 13:36:10.767 [INFO][4023] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-4-a-1c48930dd2' Mar 3 13:36:10.849574 containerd[1639]: 2026-03-03 13:36:10.769 [INFO][4023] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.bf05c2aae75f8361f7d97fa96f73188f41b1ea32e41e9d884a2b093586ca4d23" host="ci-4459-2-4-a-1c48930dd2" Mar 3 13:36:10.849574 containerd[1639]: 2026-03-03 13:36:10.778 [INFO][4023] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4459-2-4-a-1c48930dd2" Mar 3 13:36:10.849574 containerd[1639]: 2026-03-03 13:36:10.786 [INFO][4023] ipam/ipam.go 526: Trying affinity for 192.168.119.64/26 host="ci-4459-2-4-a-1c48930dd2" Mar 3 13:36:10.849574 containerd[1639]: 2026-03-03 13:36:10.788 [INFO][4023] ipam/ipam.go 160: Attempting to load block cidr=192.168.119.64/26 host="ci-4459-2-4-a-1c48930dd2" Mar 3 13:36:10.849574 containerd[1639]: 2026-03-03 13:36:10.790 [INFO][4023] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.119.64/26 host="ci-4459-2-4-a-1c48930dd2" Mar 3 13:36:10.850022 containerd[1639]: 2026-03-03 13:36:10.790 [INFO][4023] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.119.64/26 handle="k8s-pod-network.bf05c2aae75f8361f7d97fa96f73188f41b1ea32e41e9d884a2b093586ca4d23" host="ci-4459-2-4-a-1c48930dd2" Mar 3 13:36:10.850022 containerd[1639]: 2026-03-03 13:36:10.791 [INFO][4023] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.bf05c2aae75f8361f7d97fa96f73188f41b1ea32e41e9d884a2b093586ca4d23 Mar 3 13:36:10.850022 containerd[1639]: 2026-03-03 13:36:10.796 [INFO][4023] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.119.64/26 handle="k8s-pod-network.bf05c2aae75f8361f7d97fa96f73188f41b1ea32e41e9d884a2b093586ca4d23" host="ci-4459-2-4-a-1c48930dd2" Mar 3 13:36:10.850022 containerd[1639]: 2026-03-03 13:36:10.800 [INFO][4023] ipam/ipam.go 1288: Successfully 
claimed IPs: [192.168.119.65/26] block=192.168.119.64/26 handle="k8s-pod-network.bf05c2aae75f8361f7d97fa96f73188f41b1ea32e41e9d884a2b093586ca4d23" host="ci-4459-2-4-a-1c48930dd2" Mar 3 13:36:10.850022 containerd[1639]: 2026-03-03 13:36:10.800 [INFO][4023] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.119.65/26] handle="k8s-pod-network.bf05c2aae75f8361f7d97fa96f73188f41b1ea32e41e9d884a2b093586ca4d23" host="ci-4459-2-4-a-1c48930dd2" Mar 3 13:36:10.850022 containerd[1639]: 2026-03-03 13:36:10.800 [INFO][4023] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 3 13:36:10.850022 containerd[1639]: 2026-03-03 13:36:10.801 [INFO][4023] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.119.65/26] IPv6=[] ContainerID="bf05c2aae75f8361f7d97fa96f73188f41b1ea32e41e9d884a2b093586ca4d23" HandleID="k8s-pod-network.bf05c2aae75f8361f7d97fa96f73188f41b1ea32e41e9d884a2b093586ca4d23" Workload="ci--4459--2--4--a--1c48930dd2-k8s-whisker--6c6545cdb4--s8d2n-eth0" Mar 3 13:36:10.850157 containerd[1639]: 2026-03-03 13:36:10.808 [INFO][3977] cni-plugin/k8s.go 418: Populated endpoint ContainerID="bf05c2aae75f8361f7d97fa96f73188f41b1ea32e41e9d884a2b093586ca4d23" Namespace="calico-system" Pod="whisker-6c6545cdb4-s8d2n" WorkloadEndpoint="ci--4459--2--4--a--1c48930dd2-k8s-whisker--6c6545cdb4--s8d2n-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--a--1c48930dd2-k8s-whisker--6c6545cdb4--s8d2n-eth0", GenerateName:"whisker-6c6545cdb4-", Namespace:"calico-system", SelfLink:"", UID:"7c81a7aa-c336-4066-a7b9-9902b49103fc", ResourceVersion:"904", Generation:0, CreationTimestamp:time.Date(2026, time.March, 3, 13, 36, 10, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"6c6545cdb4", 
"projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-a-1c48930dd2", ContainerID:"", Pod:"whisker-6c6545cdb4-s8d2n", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.119.65/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"calibdc8b308609", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 3 13:36:10.850157 containerd[1639]: 2026-03-03 13:36:10.809 [INFO][3977] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.119.65/32] ContainerID="bf05c2aae75f8361f7d97fa96f73188f41b1ea32e41e9d884a2b093586ca4d23" Namespace="calico-system" Pod="whisker-6c6545cdb4-s8d2n" WorkloadEndpoint="ci--4459--2--4--a--1c48930dd2-k8s-whisker--6c6545cdb4--s8d2n-eth0" Mar 3 13:36:10.850294 containerd[1639]: 2026-03-03 13:36:10.809 [INFO][3977] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calibdc8b308609 ContainerID="bf05c2aae75f8361f7d97fa96f73188f41b1ea32e41e9d884a2b093586ca4d23" Namespace="calico-system" Pod="whisker-6c6545cdb4-s8d2n" WorkloadEndpoint="ci--4459--2--4--a--1c48930dd2-k8s-whisker--6c6545cdb4--s8d2n-eth0" Mar 3 13:36:10.850294 containerd[1639]: 2026-03-03 13:36:10.828 [INFO][3977] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="bf05c2aae75f8361f7d97fa96f73188f41b1ea32e41e9d884a2b093586ca4d23" Namespace="calico-system" Pod="whisker-6c6545cdb4-s8d2n" WorkloadEndpoint="ci--4459--2--4--a--1c48930dd2-k8s-whisker--6c6545cdb4--s8d2n-eth0" Mar 3 13:36:10.850394 containerd[1639]: 2026-03-03 13:36:10.830 [INFO][3977] cni-plugin/k8s.go 
446: Added Mac, interface name, and active container ID to endpoint ContainerID="bf05c2aae75f8361f7d97fa96f73188f41b1ea32e41e9d884a2b093586ca4d23" Namespace="calico-system" Pod="whisker-6c6545cdb4-s8d2n" WorkloadEndpoint="ci--4459--2--4--a--1c48930dd2-k8s-whisker--6c6545cdb4--s8d2n-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--a--1c48930dd2-k8s-whisker--6c6545cdb4--s8d2n-eth0", GenerateName:"whisker-6c6545cdb4-", Namespace:"calico-system", SelfLink:"", UID:"7c81a7aa-c336-4066-a7b9-9902b49103fc", ResourceVersion:"904", Generation:0, CreationTimestamp:time.Date(2026, time.March, 3, 13, 36, 10, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"6c6545cdb4", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-a-1c48930dd2", ContainerID:"bf05c2aae75f8361f7d97fa96f73188f41b1ea32e41e9d884a2b093586ca4d23", Pod:"whisker-6c6545cdb4-s8d2n", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.119.65/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"calibdc8b308609", MAC:"6a:ab:05:9d:4a:83", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 3 13:36:10.850657 containerd[1639]: 2026-03-03 13:36:10.843 [INFO][3977] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore 
ContainerID="bf05c2aae75f8361f7d97fa96f73188f41b1ea32e41e9d884a2b093586ca4d23" Namespace="calico-system" Pod="whisker-6c6545cdb4-s8d2n" WorkloadEndpoint="ci--4459--2--4--a--1c48930dd2-k8s-whisker--6c6545cdb4--s8d2n-eth0" Mar 3 13:36:10.902631 containerd[1639]: time="2026-03-03T13:36:10.902566635Z" level=info msg="connecting to shim bf05c2aae75f8361f7d97fa96f73188f41b1ea32e41e9d884a2b093586ca4d23" address="unix:///run/containerd/s/577b26b5d8d2ad0b0c2a703f6ef7bb53f92710036535f1c27137b49a21304350" namespace=k8s.io protocol=ttrpc version=3 Mar 3 13:36:10.937132 systemd[1]: Started cri-containerd-bf05c2aae75f8361f7d97fa96f73188f41b1ea32e41e9d884a2b093586ca4d23.scope - libcontainer container bf05c2aae75f8361f7d97fa96f73188f41b1ea32e41e9d884a2b093586ca4d23. Mar 3 13:36:10.998494 containerd[1639]: time="2026-03-03T13:36:10.998455917Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-6c6545cdb4-s8d2n,Uid:7c81a7aa-c336-4066-a7b9-9902b49103fc,Namespace:calico-system,Attempt:0,} returns sandbox id \"bf05c2aae75f8361f7d97fa96f73188f41b1ea32e41e9d884a2b093586ca4d23\"" Mar 3 13:36:11.002048 containerd[1639]: time="2026-03-03T13:36:11.002031280Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.31.4\"" Mar 3 13:36:11.101232 kubelet[2796]: I0303 13:36:11.101153 2796 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="67394235-a1d5-4b93-9138-aad850f57cbd" path="/var/lib/kubelet/pods/67394235-a1d5-4b93-9138-aad850f57cbd/volumes" Mar 3 13:36:11.256281 kubelet[2796]: I0303 13:36:11.256171 2796 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 3 13:36:11.938903 systemd-networkd[1489]: calibdc8b308609: Gained IPv6LL Mar 3 13:36:13.163069 containerd[1639]: time="2026-03-03T13:36:13.163032970Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 3 13:36:13.163927 containerd[1639]: time="2026-03-03T13:36:13.163837594Z" 
level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.31.4: active requests=0, bytes read=6039889" Mar 3 13:36:13.164937 containerd[1639]: time="2026-03-03T13:36:13.164908890Z" level=info msg="ImageCreate event name:\"sha256:c02b0051502f3aa7f0815d838ea93b53dfb6bd13f185d229260e08200daf7cf7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 3 13:36:13.167013 containerd[1639]: time="2026-03-03T13:36:13.166982436Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker@sha256:9690cd395efad501f2e0c40ce4969d87b736ae2e5ed454644e7b0fd8f756bfbc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 3 13:36:13.167410 containerd[1639]: time="2026-03-03T13:36:13.167393792Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker:v3.31.4\" with image id \"sha256:c02b0051502f3aa7f0815d838ea93b53dfb6bd13f185d229260e08200daf7cf7\", repo tag \"ghcr.io/flatcar/calico/whisker:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/whisker@sha256:9690cd395efad501f2e0c40ce4969d87b736ae2e5ed454644e7b0fd8f756bfbc\", size \"7595926\" in 2.165262369s" Mar 3 13:36:13.167475 containerd[1639]: time="2026-03-03T13:36:13.167463260Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.31.4\" returns image reference \"sha256:c02b0051502f3aa7f0815d838ea93b53dfb6bd13f185d229260e08200daf7cf7\"" Mar 3 13:36:13.171328 containerd[1639]: time="2026-03-03T13:36:13.171312922Z" level=info msg="CreateContainer within sandbox \"bf05c2aae75f8361f7d97fa96f73188f41b1ea32e41e9d884a2b093586ca4d23\" for container &ContainerMetadata{Name:whisker,Attempt:0,}" Mar 3 13:36:13.182776 containerd[1639]: time="2026-03-03T13:36:13.182743595Z" level=info msg="Container 4a9a741fedec510f31f0429c3efe659d58388d8b0f3b09f5aa16b204bc51f54b: CDI devices from CRI Config.CDIDevices: []" Mar 3 13:36:13.184424 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount506708542.mount: Deactivated successfully. 
Mar 3 13:36:13.193651 containerd[1639]: time="2026-03-03T13:36:13.193621563Z" level=info msg="CreateContainer within sandbox \"bf05c2aae75f8361f7d97fa96f73188f41b1ea32e41e9d884a2b093586ca4d23\" for &ContainerMetadata{Name:whisker,Attempt:0,} returns container id \"4a9a741fedec510f31f0429c3efe659d58388d8b0f3b09f5aa16b204bc51f54b\"" Mar 3 13:36:13.194173 containerd[1639]: time="2026-03-03T13:36:13.194148007Z" level=info msg="StartContainer for \"4a9a741fedec510f31f0429c3efe659d58388d8b0f3b09f5aa16b204bc51f54b\"" Mar 3 13:36:13.195641 containerd[1639]: time="2026-03-03T13:36:13.195616758Z" level=info msg="connecting to shim 4a9a741fedec510f31f0429c3efe659d58388d8b0f3b09f5aa16b204bc51f54b" address="unix:///run/containerd/s/577b26b5d8d2ad0b0c2a703f6ef7bb53f92710036535f1c27137b49a21304350" protocol=ttrpc version=3 Mar 3 13:36:13.211767 systemd[1]: Started cri-containerd-4a9a741fedec510f31f0429c3efe659d58388d8b0f3b09f5aa16b204bc51f54b.scope - libcontainer container 4a9a741fedec510f31f0429c3efe659d58388d8b0f3b09f5aa16b204bc51f54b. Mar 3 13:36:13.251965 containerd[1639]: time="2026-03-03T13:36:13.251893488Z" level=info msg="StartContainer for \"4a9a741fedec510f31f0429c3efe659d58388d8b0f3b09f5aa16b204bc51f54b\" returns successfully" Mar 3 13:36:13.254031 containerd[1639]: time="2026-03-03T13:36:13.254005352Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\"" Mar 3 13:36:14.513932 kubelet[2796]: I0303 13:36:14.513826 2796 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 3 13:36:14.820018 kubelet[2796]: I0303 13:36:14.818904 2796 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 3 13:36:15.869809 systemd-networkd[1489]: vxlan.calico: Link UP Mar 3 13:36:15.869821 systemd-networkd[1489]: vxlan.calico: Gained carrier Mar 3 13:36:16.151216 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3624841970.mount: Deactivated successfully. 
Mar 3 13:36:16.173186 containerd[1639]: time="2026-03-03T13:36:16.173128985Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 3 13:36:16.174137 containerd[1639]: time="2026-03-03T13:36:16.174077959Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.31.4: active requests=0, bytes read=17609475" Mar 3 13:36:16.175703 containerd[1639]: time="2026-03-03T13:36:16.175057810Z" level=info msg="ImageCreate event name:\"sha256:0749e3da0398e8402eb119f09acf145e5dd9759adb6eb3802ad6dc1b9bbedf1c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 3 13:36:16.176958 containerd[1639]: time="2026-03-03T13:36:16.176934896Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend@sha256:d252061aa298c4b17cf092517b5126af97cf95e0f56b21281b95a5f8702f15fc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 3 13:36:16.177466 containerd[1639]: time="2026-03-03T13:36:16.177447707Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\" with image id \"sha256:0749e3da0398e8402eb119f09acf145e5dd9759adb6eb3802ad6dc1b9bbedf1c\", repo tag \"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/whisker-backend@sha256:d252061aa298c4b17cf092517b5126af97cf95e0f56b21281b95a5f8702f15fc\", size \"17609305\" in 2.923319325s" Mar 3 13:36:16.177531 containerd[1639]: time="2026-03-03T13:36:16.177520049Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\" returns image reference \"sha256:0749e3da0398e8402eb119f09acf145e5dd9759adb6eb3802ad6dc1b9bbedf1c\"" Mar 3 13:36:16.182039 containerd[1639]: time="2026-03-03T13:36:16.182015173Z" level=info msg="CreateContainer within sandbox \"bf05c2aae75f8361f7d97fa96f73188f41b1ea32e41e9d884a2b093586ca4d23\" for container &ContainerMetadata{Name:whisker-backend,Attempt:0,}" Mar 3 13:36:16.195038 
containerd[1639]: time="2026-03-03T13:36:16.194975235Z" level=info msg="Container 2b5d0c4a2707d064d8c03d28973d90eb4ada05945f198fcd993b41d06e1f58ef: CDI devices from CRI Config.CDIDevices: []" Mar 3 13:36:16.212926 containerd[1639]: time="2026-03-03T13:36:16.212884532Z" level=info msg="CreateContainer within sandbox \"bf05c2aae75f8361f7d97fa96f73188f41b1ea32e41e9d884a2b093586ca4d23\" for &ContainerMetadata{Name:whisker-backend,Attempt:0,} returns container id \"2b5d0c4a2707d064d8c03d28973d90eb4ada05945f198fcd993b41d06e1f58ef\"" Mar 3 13:36:16.214935 containerd[1639]: time="2026-03-03T13:36:16.214915304Z" level=info msg="StartContainer for \"2b5d0c4a2707d064d8c03d28973d90eb4ada05945f198fcd993b41d06e1f58ef\"" Mar 3 13:36:16.215871 containerd[1639]: time="2026-03-03T13:36:16.215852360Z" level=info msg="connecting to shim 2b5d0c4a2707d064d8c03d28973d90eb4ada05945f198fcd993b41d06e1f58ef" address="unix:///run/containerd/s/577b26b5d8d2ad0b0c2a703f6ef7bb53f92710036535f1c27137b49a21304350" protocol=ttrpc version=3 Mar 3 13:36:16.243892 systemd[1]: Started cri-containerd-2b5d0c4a2707d064d8c03d28973d90eb4ada05945f198fcd993b41d06e1f58ef.scope - libcontainer container 2b5d0c4a2707d064d8c03d28973d90eb4ada05945f198fcd993b41d06e1f58ef. 
Mar 3 13:36:16.293964 containerd[1639]: time="2026-03-03T13:36:16.293929522Z" level=info msg="StartContainer for \"2b5d0c4a2707d064d8c03d28973d90eb4ada05945f198fcd993b41d06e1f58ef\" returns successfully" Mar 3 13:36:17.297566 kubelet[2796]: I0303 13:36:17.297465 2796 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/whisker-6c6545cdb4-s8d2n" podStartSLOduration=2.119231879 podStartE2EDuration="7.297436894s" podCreationTimestamp="2026-03-03 13:36:10 +0000 UTC" firstStartedPulling="2026-03-03 13:36:11.000028014 +0000 UTC m=+33.981009878" lastFinishedPulling="2026-03-03 13:36:16.178233039 +0000 UTC m=+39.159214893" observedRunningTime="2026-03-03 13:36:17.295923421 +0000 UTC m=+40.276905305" watchObservedRunningTime="2026-03-03 13:36:17.297436894 +0000 UTC m=+40.278418778" Mar 3 13:36:17.313339 systemd-networkd[1489]: vxlan.calico: Gained IPv6LL Mar 3 13:36:20.100400 containerd[1639]: time="2026-03-03T13:36:20.100214710Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-dlz7j,Uid:01ac93a6-8099-4557-bc77-3925251db8d8,Namespace:calico-system,Attempt:0,}" Mar 3 13:36:20.263710 systemd-networkd[1489]: cali76ac596641a: Link UP Mar 3 13:36:20.265875 systemd-networkd[1489]: cali76ac596641a: Gained carrier Mar 3 13:36:20.288739 containerd[1639]: 2026-03-03 13:36:20.182 [INFO][4434] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--4--a--1c48930dd2-k8s-csi--node--driver--dlz7j-eth0 csi-node-driver- calico-system 01ac93a6-8099-4557-bc77-3925251db8d8 719 0 2026-03-03 13:35:54 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:98cbb5577 k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s ci-4459-2-4-a-1c48930dd2 csi-node-driver-dlz7j eth0 csi-node-driver [] [] 
[kns.calico-system ksa.calico-system.csi-node-driver] cali76ac596641a [] [] }} ContainerID="92c99a05abddc311b5fb95b949b5442bb932e12639d3d2a55ef89656d6a8a743" Namespace="calico-system" Pod="csi-node-driver-dlz7j" WorkloadEndpoint="ci--4459--2--4--a--1c48930dd2-k8s-csi--node--driver--dlz7j-" Mar 3 13:36:20.288739 containerd[1639]: 2026-03-03 13:36:20.182 [INFO][4434] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="92c99a05abddc311b5fb95b949b5442bb932e12639d3d2a55ef89656d6a8a743" Namespace="calico-system" Pod="csi-node-driver-dlz7j" WorkloadEndpoint="ci--4459--2--4--a--1c48930dd2-k8s-csi--node--driver--dlz7j-eth0" Mar 3 13:36:20.288739 containerd[1639]: 2026-03-03 13:36:20.223 [INFO][4446] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="92c99a05abddc311b5fb95b949b5442bb932e12639d3d2a55ef89656d6a8a743" HandleID="k8s-pod-network.92c99a05abddc311b5fb95b949b5442bb932e12639d3d2a55ef89656d6a8a743" Workload="ci--4459--2--4--a--1c48930dd2-k8s-csi--node--driver--dlz7j-eth0" Mar 3 13:36:20.290812 containerd[1639]: 2026-03-03 13:36:20.230 [INFO][4446] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="92c99a05abddc311b5fb95b949b5442bb932e12639d3d2a55ef89656d6a8a743" HandleID="k8s-pod-network.92c99a05abddc311b5fb95b949b5442bb932e12639d3d2a55ef89656d6a8a743" Workload="ci--4459--2--4--a--1c48930dd2-k8s-csi--node--driver--dlz7j-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000381d30), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459-2-4-a-1c48930dd2", "pod":"csi-node-driver-dlz7j", "timestamp":"2026-03-03 13:36:20.223765337 +0000 UTC"}, Hostname:"ci-4459-2-4-a-1c48930dd2", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0xc0004ac000)} Mar 3 13:36:20.290812 containerd[1639]: 2026-03-03 13:36:20.230 
[INFO][4446] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 3 13:36:20.290812 containerd[1639]: 2026-03-03 13:36:20.230 [INFO][4446] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 3 13:36:20.290812 containerd[1639]: 2026-03-03 13:36:20.230 [INFO][4446] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-4-a-1c48930dd2' Mar 3 13:36:20.290812 containerd[1639]: 2026-03-03 13:36:20.232 [INFO][4446] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.92c99a05abddc311b5fb95b949b5442bb932e12639d3d2a55ef89656d6a8a743" host="ci-4459-2-4-a-1c48930dd2" Mar 3 13:36:20.290812 containerd[1639]: 2026-03-03 13:36:20.236 [INFO][4446] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4459-2-4-a-1c48930dd2" Mar 3 13:36:20.290812 containerd[1639]: 2026-03-03 13:36:20.240 [INFO][4446] ipam/ipam.go 526: Trying affinity for 192.168.119.64/26 host="ci-4459-2-4-a-1c48930dd2" Mar 3 13:36:20.290812 containerd[1639]: 2026-03-03 13:36:20.241 [INFO][4446] ipam/ipam.go 160: Attempting to load block cidr=192.168.119.64/26 host="ci-4459-2-4-a-1c48930dd2" Mar 3 13:36:20.290812 containerd[1639]: 2026-03-03 13:36:20.243 [INFO][4446] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.119.64/26 host="ci-4459-2-4-a-1c48930dd2" Mar 3 13:36:20.291052 containerd[1639]: 2026-03-03 13:36:20.243 [INFO][4446] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.119.64/26 handle="k8s-pod-network.92c99a05abddc311b5fb95b949b5442bb932e12639d3d2a55ef89656d6a8a743" host="ci-4459-2-4-a-1c48930dd2" Mar 3 13:36:20.291052 containerd[1639]: 2026-03-03 13:36:20.245 [INFO][4446] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.92c99a05abddc311b5fb95b949b5442bb932e12639d3d2a55ef89656d6a8a743 Mar 3 13:36:20.291052 containerd[1639]: 2026-03-03 13:36:20.249 [INFO][4446] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.119.64/26 
handle="k8s-pod-network.92c99a05abddc311b5fb95b949b5442bb932e12639d3d2a55ef89656d6a8a743" host="ci-4459-2-4-a-1c48930dd2" Mar 3 13:36:20.291052 containerd[1639]: 2026-03-03 13:36:20.256 [INFO][4446] ipam/ipam.go 1288: Successfully claimed IPs: [192.168.119.66/26] block=192.168.119.64/26 handle="k8s-pod-network.92c99a05abddc311b5fb95b949b5442bb932e12639d3d2a55ef89656d6a8a743" host="ci-4459-2-4-a-1c48930dd2" Mar 3 13:36:20.291052 containerd[1639]: 2026-03-03 13:36:20.256 [INFO][4446] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.119.66/26] handle="k8s-pod-network.92c99a05abddc311b5fb95b949b5442bb932e12639d3d2a55ef89656d6a8a743" host="ci-4459-2-4-a-1c48930dd2" Mar 3 13:36:20.291052 containerd[1639]: 2026-03-03 13:36:20.256 [INFO][4446] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 3 13:36:20.291052 containerd[1639]: 2026-03-03 13:36:20.256 [INFO][4446] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.119.66/26] IPv6=[] ContainerID="92c99a05abddc311b5fb95b949b5442bb932e12639d3d2a55ef89656d6a8a743" HandleID="k8s-pod-network.92c99a05abddc311b5fb95b949b5442bb932e12639d3d2a55ef89656d6a8a743" Workload="ci--4459--2--4--a--1c48930dd2-k8s-csi--node--driver--dlz7j-eth0" Mar 3 13:36:20.291165 containerd[1639]: 2026-03-03 13:36:20.260 [INFO][4434] cni-plugin/k8s.go 418: Populated endpoint ContainerID="92c99a05abddc311b5fb95b949b5442bb932e12639d3d2a55ef89656d6a8a743" Namespace="calico-system" Pod="csi-node-driver-dlz7j" WorkloadEndpoint="ci--4459--2--4--a--1c48930dd2-k8s-csi--node--driver--dlz7j-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--a--1c48930dd2-k8s-csi--node--driver--dlz7j-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"01ac93a6-8099-4557-bc77-3925251db8d8", ResourceVersion:"719", Generation:0, CreationTimestamp:time.Date(2026, time.March, 3, 13, 35, 54, 0, 
time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"98cbb5577", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-a-1c48930dd2", ContainerID:"", Pod:"csi-node-driver-dlz7j", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.119.66/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali76ac596641a", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 3 13:36:20.291209 containerd[1639]: 2026-03-03 13:36:20.260 [INFO][4434] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.119.66/32] ContainerID="92c99a05abddc311b5fb95b949b5442bb932e12639d3d2a55ef89656d6a8a743" Namespace="calico-system" Pod="csi-node-driver-dlz7j" WorkloadEndpoint="ci--4459--2--4--a--1c48930dd2-k8s-csi--node--driver--dlz7j-eth0" Mar 3 13:36:20.291209 containerd[1639]: 2026-03-03 13:36:20.260 [INFO][4434] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali76ac596641a ContainerID="92c99a05abddc311b5fb95b949b5442bb932e12639d3d2a55ef89656d6a8a743" Namespace="calico-system" Pod="csi-node-driver-dlz7j" WorkloadEndpoint="ci--4459--2--4--a--1c48930dd2-k8s-csi--node--driver--dlz7j-eth0" Mar 3 13:36:20.291209 containerd[1639]: 2026-03-03 13:36:20.266 [INFO][4434] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding 
ContainerID="92c99a05abddc311b5fb95b949b5442bb932e12639d3d2a55ef89656d6a8a743" Namespace="calico-system" Pod="csi-node-driver-dlz7j" WorkloadEndpoint="ci--4459--2--4--a--1c48930dd2-k8s-csi--node--driver--dlz7j-eth0" Mar 3 13:36:20.291274 containerd[1639]: 2026-03-03 13:36:20.267 [INFO][4434] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="92c99a05abddc311b5fb95b949b5442bb932e12639d3d2a55ef89656d6a8a743" Namespace="calico-system" Pod="csi-node-driver-dlz7j" WorkloadEndpoint="ci--4459--2--4--a--1c48930dd2-k8s-csi--node--driver--dlz7j-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--a--1c48930dd2-k8s-csi--node--driver--dlz7j-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"01ac93a6-8099-4557-bc77-3925251db8d8", ResourceVersion:"719", Generation:0, CreationTimestamp:time.Date(2026, time.March, 3, 13, 35, 54, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"98cbb5577", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-a-1c48930dd2", ContainerID:"92c99a05abddc311b5fb95b949b5442bb932e12639d3d2a55ef89656d6a8a743", Pod:"csi-node-driver-dlz7j", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.119.66/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", 
"ksa.calico-system.csi-node-driver"}, InterfaceName:"cali76ac596641a", MAC:"06:ad:20:fa:34:34", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 3 13:36:20.291312 containerd[1639]: 2026-03-03 13:36:20.277 [INFO][4434] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="92c99a05abddc311b5fb95b949b5442bb932e12639d3d2a55ef89656d6a8a743" Namespace="calico-system" Pod="csi-node-driver-dlz7j" WorkloadEndpoint="ci--4459--2--4--a--1c48930dd2-k8s-csi--node--driver--dlz7j-eth0" Mar 3 13:36:20.323658 containerd[1639]: time="2026-03-03T13:36:20.323573459Z" level=info msg="connecting to shim 92c99a05abddc311b5fb95b949b5442bb932e12639d3d2a55ef89656d6a8a743" address="unix:///run/containerd/s/89c516469df0985547efec0fece60b7e204a48f5e04edcc8ba10100f98d15073" namespace=k8s.io protocol=ttrpc version=3 Mar 3 13:36:20.357874 systemd[1]: Started cri-containerd-92c99a05abddc311b5fb95b949b5442bb932e12639d3d2a55ef89656d6a8a743.scope - libcontainer container 92c99a05abddc311b5fb95b949b5442bb932e12639d3d2a55ef89656d6a8a743. 
Mar 3 13:36:20.401725 containerd[1639]: time="2026-03-03T13:36:20.401104319Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-dlz7j,Uid:01ac93a6-8099-4557-bc77-3925251db8d8,Namespace:calico-system,Attempt:0,} returns sandbox id \"92c99a05abddc311b5fb95b949b5442bb932e12639d3d2a55ef89656d6a8a743\"" Mar 3 13:36:20.404082 containerd[1639]: time="2026-03-03T13:36:20.403918951Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.31.4\"" Mar 3 13:36:21.100580 containerd[1639]: time="2026-03-03T13:36:21.099883030Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-cccfbd5cf-xvc2m,Uid:8745b89a-4de0-4efe-bb7d-0887a5a20193,Namespace:calico-system,Attempt:0,}" Mar 3 13:36:21.264448 systemd-networkd[1489]: cali245cfa653bd: Link UP Mar 3 13:36:21.265506 systemd-networkd[1489]: cali245cfa653bd: Gained carrier Mar 3 13:36:21.285589 containerd[1639]: 2026-03-03 13:36:21.182 [INFO][4522] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--4--a--1c48930dd2-k8s-goldmane--cccfbd5cf--xvc2m-eth0 goldmane-cccfbd5cf- calico-system 8745b89a-4de0-4efe-bb7d-0887a5a20193 850 0 2026-03-03 13:35:54 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:cccfbd5cf projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s ci-4459-2-4-a-1c48930dd2 goldmane-cccfbd5cf-xvc2m eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] cali245cfa653bd [] [] }} ContainerID="6af89fc72473a95e8be077ae139eeb4e4ba5e3b2d992e03ba8c267665723cc5f" Namespace="calico-system" Pod="goldmane-cccfbd5cf-xvc2m" WorkloadEndpoint="ci--4459--2--4--a--1c48930dd2-k8s-goldmane--cccfbd5cf--xvc2m-" Mar 3 13:36:21.285589 containerd[1639]: 2026-03-03 13:36:21.183 [INFO][4522] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s 
ContainerID="6af89fc72473a95e8be077ae139eeb4e4ba5e3b2d992e03ba8c267665723cc5f" Namespace="calico-system" Pod="goldmane-cccfbd5cf-xvc2m" WorkloadEndpoint="ci--4459--2--4--a--1c48930dd2-k8s-goldmane--cccfbd5cf--xvc2m-eth0" Mar 3 13:36:21.285589 containerd[1639]: 2026-03-03 13:36:21.218 [INFO][4533] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="6af89fc72473a95e8be077ae139eeb4e4ba5e3b2d992e03ba8c267665723cc5f" HandleID="k8s-pod-network.6af89fc72473a95e8be077ae139eeb4e4ba5e3b2d992e03ba8c267665723cc5f" Workload="ci--4459--2--4--a--1c48930dd2-k8s-goldmane--cccfbd5cf--xvc2m-eth0" Mar 3 13:36:21.285948 containerd[1639]: 2026-03-03 13:36:21.227 [INFO][4533] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="6af89fc72473a95e8be077ae139eeb4e4ba5e3b2d992e03ba8c267665723cc5f" HandleID="k8s-pod-network.6af89fc72473a95e8be077ae139eeb4e4ba5e3b2d992e03ba8c267665723cc5f" Workload="ci--4459--2--4--a--1c48930dd2-k8s-goldmane--cccfbd5cf--xvc2m-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002fd4c0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459-2-4-a-1c48930dd2", "pod":"goldmane-cccfbd5cf-xvc2m", "timestamp":"2026-03-03 13:36:21.21887577 +0000 UTC"}, Hostname:"ci-4459-2-4-a-1c48930dd2", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0xc000283600)} Mar 3 13:36:21.285948 containerd[1639]: 2026-03-03 13:36:21.227 [INFO][4533] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 3 13:36:21.285948 containerd[1639]: 2026-03-03 13:36:21.227 [INFO][4533] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Mar 3 13:36:21.285948 containerd[1639]: 2026-03-03 13:36:21.227 [INFO][4533] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-4-a-1c48930dd2' Mar 3 13:36:21.285948 containerd[1639]: 2026-03-03 13:36:21.230 [INFO][4533] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.6af89fc72473a95e8be077ae139eeb4e4ba5e3b2d992e03ba8c267665723cc5f" host="ci-4459-2-4-a-1c48930dd2" Mar 3 13:36:21.285948 containerd[1639]: 2026-03-03 13:36:21.234 [INFO][4533] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4459-2-4-a-1c48930dd2" Mar 3 13:36:21.285948 containerd[1639]: 2026-03-03 13:36:21.238 [INFO][4533] ipam/ipam.go 526: Trying affinity for 192.168.119.64/26 host="ci-4459-2-4-a-1c48930dd2" Mar 3 13:36:21.285948 containerd[1639]: 2026-03-03 13:36:21.241 [INFO][4533] ipam/ipam.go 160: Attempting to load block cidr=192.168.119.64/26 host="ci-4459-2-4-a-1c48930dd2" Mar 3 13:36:21.285948 containerd[1639]: 2026-03-03 13:36:21.243 [INFO][4533] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.119.64/26 host="ci-4459-2-4-a-1c48930dd2" Mar 3 13:36:21.286512 containerd[1639]: 2026-03-03 13:36:21.243 [INFO][4533] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.119.64/26 handle="k8s-pod-network.6af89fc72473a95e8be077ae139eeb4e4ba5e3b2d992e03ba8c267665723cc5f" host="ci-4459-2-4-a-1c48930dd2" Mar 3 13:36:21.286512 containerd[1639]: 2026-03-03 13:36:21.244 [INFO][4533] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.6af89fc72473a95e8be077ae139eeb4e4ba5e3b2d992e03ba8c267665723cc5f Mar 3 13:36:21.286512 containerd[1639]: 2026-03-03 13:36:21.248 [INFO][4533] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.119.64/26 handle="k8s-pod-network.6af89fc72473a95e8be077ae139eeb4e4ba5e3b2d992e03ba8c267665723cc5f" host="ci-4459-2-4-a-1c48930dd2" Mar 3 13:36:21.286512 containerd[1639]: 2026-03-03 13:36:21.254 [INFO][4533] ipam/ipam.go 1288: Successfully 
claimed IPs: [192.168.119.67/26] block=192.168.119.64/26 handle="k8s-pod-network.6af89fc72473a95e8be077ae139eeb4e4ba5e3b2d992e03ba8c267665723cc5f" host="ci-4459-2-4-a-1c48930dd2" Mar 3 13:36:21.286512 containerd[1639]: 2026-03-03 13:36:21.255 [INFO][4533] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.119.67/26] handle="k8s-pod-network.6af89fc72473a95e8be077ae139eeb4e4ba5e3b2d992e03ba8c267665723cc5f" host="ci-4459-2-4-a-1c48930dd2" Mar 3 13:36:21.286512 containerd[1639]: 2026-03-03 13:36:21.255 [INFO][4533] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 3 13:36:21.286512 containerd[1639]: 2026-03-03 13:36:21.255 [INFO][4533] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.119.67/26] IPv6=[] ContainerID="6af89fc72473a95e8be077ae139eeb4e4ba5e3b2d992e03ba8c267665723cc5f" HandleID="k8s-pod-network.6af89fc72473a95e8be077ae139eeb4e4ba5e3b2d992e03ba8c267665723cc5f" Workload="ci--4459--2--4--a--1c48930dd2-k8s-goldmane--cccfbd5cf--xvc2m-eth0" Mar 3 13:36:21.286647 containerd[1639]: 2026-03-03 13:36:21.258 [INFO][4522] cni-plugin/k8s.go 418: Populated endpoint ContainerID="6af89fc72473a95e8be077ae139eeb4e4ba5e3b2d992e03ba8c267665723cc5f" Namespace="calico-system" Pod="goldmane-cccfbd5cf-xvc2m" WorkloadEndpoint="ci--4459--2--4--a--1c48930dd2-k8s-goldmane--cccfbd5cf--xvc2m-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--a--1c48930dd2-k8s-goldmane--cccfbd5cf--xvc2m-eth0", GenerateName:"goldmane-cccfbd5cf-", Namespace:"calico-system", SelfLink:"", UID:"8745b89a-4de0-4efe-bb7d-0887a5a20193", ResourceVersion:"850", Generation:0, CreationTimestamp:time.Date(2026, time.March, 3, 13, 35, 54, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"cccfbd5cf", 
"projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-a-1c48930dd2", ContainerID:"", Pod:"goldmane-cccfbd5cf-xvc2m", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.119.67/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali245cfa653bd", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 3 13:36:21.286647 containerd[1639]: 2026-03-03 13:36:21.258 [INFO][4522] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.119.67/32] ContainerID="6af89fc72473a95e8be077ae139eeb4e4ba5e3b2d992e03ba8c267665723cc5f" Namespace="calico-system" Pod="goldmane-cccfbd5cf-xvc2m" WorkloadEndpoint="ci--4459--2--4--a--1c48930dd2-k8s-goldmane--cccfbd5cf--xvc2m-eth0" Mar 3 13:36:21.286832 containerd[1639]: 2026-03-03 13:36:21.258 [INFO][4522] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali245cfa653bd ContainerID="6af89fc72473a95e8be077ae139eeb4e4ba5e3b2d992e03ba8c267665723cc5f" Namespace="calico-system" Pod="goldmane-cccfbd5cf-xvc2m" WorkloadEndpoint="ci--4459--2--4--a--1c48930dd2-k8s-goldmane--cccfbd5cf--xvc2m-eth0" Mar 3 13:36:21.286832 containerd[1639]: 2026-03-03 13:36:21.265 [INFO][4522] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="6af89fc72473a95e8be077ae139eeb4e4ba5e3b2d992e03ba8c267665723cc5f" Namespace="calico-system" Pod="goldmane-cccfbd5cf-xvc2m" WorkloadEndpoint="ci--4459--2--4--a--1c48930dd2-k8s-goldmane--cccfbd5cf--xvc2m-eth0" Mar 3 13:36:21.286891 containerd[1639]: 2026-03-03 13:36:21.266 [INFO][4522] 
cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="6af89fc72473a95e8be077ae139eeb4e4ba5e3b2d992e03ba8c267665723cc5f" Namespace="calico-system" Pod="goldmane-cccfbd5cf-xvc2m" WorkloadEndpoint="ci--4459--2--4--a--1c48930dd2-k8s-goldmane--cccfbd5cf--xvc2m-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--a--1c48930dd2-k8s-goldmane--cccfbd5cf--xvc2m-eth0", GenerateName:"goldmane-cccfbd5cf-", Namespace:"calico-system", SelfLink:"", UID:"8745b89a-4de0-4efe-bb7d-0887a5a20193", ResourceVersion:"850", Generation:0, CreationTimestamp:time.Date(2026, time.March, 3, 13, 35, 54, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"cccfbd5cf", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-a-1c48930dd2", ContainerID:"6af89fc72473a95e8be077ae139eeb4e4ba5e3b2d992e03ba8c267665723cc5f", Pod:"goldmane-cccfbd5cf-xvc2m", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.119.67/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali245cfa653bd", MAC:"4e:a6:d9:40:d7:48", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 3 13:36:21.287010 containerd[1639]: 2026-03-03 13:36:21.280 [INFO][4522] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore 
ContainerID="6af89fc72473a95e8be077ae139eeb4e4ba5e3b2d992e03ba8c267665723cc5f" Namespace="calico-system" Pod="goldmane-cccfbd5cf-xvc2m" WorkloadEndpoint="ci--4459--2--4--a--1c48930dd2-k8s-goldmane--cccfbd5cf--xvc2m-eth0" Mar 3 13:36:21.313373 containerd[1639]: time="2026-03-03T13:36:21.313316197Z" level=info msg="connecting to shim 6af89fc72473a95e8be077ae139eeb4e4ba5e3b2d992e03ba8c267665723cc5f" address="unix:///run/containerd/s/8a7473faa6f7e8ba0a9be7575864572d51be424ce487f14a64b8d827003e9d62" namespace=k8s.io protocol=ttrpc version=3 Mar 3 13:36:21.338818 systemd[1]: Started cri-containerd-6af89fc72473a95e8be077ae139eeb4e4ba5e3b2d992e03ba8c267665723cc5f.scope - libcontainer container 6af89fc72473a95e8be077ae139eeb4e4ba5e3b2d992e03ba8c267665723cc5f. Mar 3 13:36:21.346594 systemd-networkd[1489]: cali76ac596641a: Gained IPv6LL Mar 3 13:36:21.411361 containerd[1639]: time="2026-03-03T13:36:21.411278580Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-cccfbd5cf-xvc2m,Uid:8745b89a-4de0-4efe-bb7d-0887a5a20193,Namespace:calico-system,Attempt:0,} returns sandbox id \"6af89fc72473a95e8be077ae139eeb4e4ba5e3b2d992e03ba8c267665723cc5f\"" Mar 3 13:36:22.099777 containerd[1639]: time="2026-03-03T13:36:22.099712612Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-j9kwn,Uid:69f392ca-620e-4545-90dd-be526067bb84,Namespace:kube-system,Attempt:0,}" Mar 3 13:36:22.101429 containerd[1639]: time="2026-03-03T13:36:22.101397933Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-588cd44876-x8bpf,Uid:ee6739e9-2745-46b9-936e-2b03da5ded77,Namespace:calico-system,Attempt:0,}" Mar 3 13:36:22.102514 containerd[1639]: time="2026-03-03T13:36:22.102488198Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-77c7bb8fcc-hh8jb,Uid:79ec080a-ea5e-48da-b785-0639f3bb4193,Namespace:calico-system,Attempt:0,}" Mar 3 13:36:22.106691 containerd[1639]: time="2026-03-03T13:36:22.106609004Z" level=info 
msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 3 13:36:22.113968 containerd[1639]: time="2026-03-03T13:36:22.113936068Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.31.4: active requests=0, bytes read=8792502" Mar 3 13:36:22.122477 containerd[1639]: time="2026-03-03T13:36:22.122439911Z" level=info msg="ImageCreate event name:\"sha256:4c8cd7d0b10a4df64a5bd90e9845e9d1edbe0e37c2ebfc171bb28698e07abf72\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 3 13:36:22.128565 containerd[1639]: time="2026-03-03T13:36:22.128523423Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:ab57dd6f8423ef7b3ff382bf4ca5ace6063bdca77d441d852c75ec58847dd280\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 3 13:36:22.129926 containerd[1639]: time="2026-03-03T13:36:22.129519004Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.31.4\" with image id \"sha256:4c8cd7d0b10a4df64a5bd90e9845e9d1edbe0e37c2ebfc171bb28698e07abf72\", repo tag \"ghcr.io/flatcar/calico/csi:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:ab57dd6f8423ef7b3ff382bf4ca5ace6063bdca77d441d852c75ec58847dd280\", size \"10348547\" in 1.725433957s" Mar 3 13:36:22.129926 containerd[1639]: time="2026-03-03T13:36:22.129865116Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.31.4\" returns image reference \"sha256:4c8cd7d0b10a4df64a5bd90e9845e9d1edbe0e37c2ebfc171bb28698e07abf72\"" Mar 3 13:36:22.134196 containerd[1639]: time="2026-03-03T13:36:22.134116713Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.31.4\"" Mar 3 13:36:22.138802 containerd[1639]: time="2026-03-03T13:36:22.138769646Z" level=info msg="CreateContainer within sandbox \"92c99a05abddc311b5fb95b949b5442bb932e12639d3d2a55ef89656d6a8a743\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" Mar 3 13:36:22.160558 containerd[1639]: 
time="2026-03-03T13:36:22.159779294Z" level=info msg="Container 277fe898a5bbd7749eebfabeaf25a69e41fd90a8a962eca461979feecacc53ef: CDI devices from CRI Config.CDIDevices: []" Mar 3 13:36:22.170373 containerd[1639]: time="2026-03-03T13:36:22.170340771Z" level=info msg="CreateContainer within sandbox \"92c99a05abddc311b5fb95b949b5442bb932e12639d3d2a55ef89656d6a8a743\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"277fe898a5bbd7749eebfabeaf25a69e41fd90a8a962eca461979feecacc53ef\"" Mar 3 13:36:22.173842 containerd[1639]: time="2026-03-03T13:36:22.173823405Z" level=info msg="StartContainer for \"277fe898a5bbd7749eebfabeaf25a69e41fd90a8a962eca461979feecacc53ef\"" Mar 3 13:36:22.177752 containerd[1639]: time="2026-03-03T13:36:22.177722930Z" level=info msg="connecting to shim 277fe898a5bbd7749eebfabeaf25a69e41fd90a8a962eca461979feecacc53ef" address="unix:///run/containerd/s/89c516469df0985547efec0fece60b7e204a48f5e04edcc8ba10100f98d15073" protocol=ttrpc version=3 Mar 3 13:36:22.218030 systemd[1]: Started cri-containerd-277fe898a5bbd7749eebfabeaf25a69e41fd90a8a962eca461979feecacc53ef.scope - libcontainer container 277fe898a5bbd7749eebfabeaf25a69e41fd90a8a962eca461979feecacc53ef. 
Mar 3 13:36:22.306892 systemd-networkd[1489]: cali959a5791523: Link UP Mar 3 13:36:22.307443 systemd-networkd[1489]: cali959a5791523: Gained carrier Mar 3 13:36:22.318381 containerd[1639]: time="2026-03-03T13:36:22.318350662Z" level=info msg="StartContainer for \"277fe898a5bbd7749eebfabeaf25a69e41fd90a8a962eca461979feecacc53ef\" returns successfully" Mar 3 13:36:22.325543 containerd[1639]: 2026-03-03 13:36:22.196 [INFO][4627] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--4--a--1c48930dd2-k8s-calico--apiserver--588cd44876--x8bpf-eth0 calico-apiserver-588cd44876- calico-system ee6739e9-2745-46b9-936e-2b03da5ded77 853 0 2026-03-03 13:35:54 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:588cd44876 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4459-2-4-a-1c48930dd2 calico-apiserver-588cd44876-x8bpf eth0 calico-apiserver [] [] [kns.calico-system ksa.calico-system.calico-apiserver] cali959a5791523 [] [] }} ContainerID="bd6a810d210a5099c926b0b873c2825e6674c58e0b52bbf13821369d15b58c3e" Namespace="calico-system" Pod="calico-apiserver-588cd44876-x8bpf" WorkloadEndpoint="ci--4459--2--4--a--1c48930dd2-k8s-calico--apiserver--588cd44876--x8bpf-" Mar 3 13:36:22.325543 containerd[1639]: 2026-03-03 13:36:22.198 [INFO][4627] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="bd6a810d210a5099c926b0b873c2825e6674c58e0b52bbf13821369d15b58c3e" Namespace="calico-system" Pod="calico-apiserver-588cd44876-x8bpf" WorkloadEndpoint="ci--4459--2--4--a--1c48930dd2-k8s-calico--apiserver--588cd44876--x8bpf-eth0" Mar 3 13:36:22.325543 containerd[1639]: 2026-03-03 13:36:22.249 [INFO][4675] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="bd6a810d210a5099c926b0b873c2825e6674c58e0b52bbf13821369d15b58c3e" 
HandleID="k8s-pod-network.bd6a810d210a5099c926b0b873c2825e6674c58e0b52bbf13821369d15b58c3e" Workload="ci--4459--2--4--a--1c48930dd2-k8s-calico--apiserver--588cd44876--x8bpf-eth0" Mar 3 13:36:22.325719 containerd[1639]: 2026-03-03 13:36:22.262 [INFO][4675] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="bd6a810d210a5099c926b0b873c2825e6674c58e0b52bbf13821369d15b58c3e" HandleID="k8s-pod-network.bd6a810d210a5099c926b0b873c2825e6674c58e0b52bbf13821369d15b58c3e" Workload="ci--4459--2--4--a--1c48930dd2-k8s-calico--apiserver--588cd44876--x8bpf-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000389ca0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459-2-4-a-1c48930dd2", "pod":"calico-apiserver-588cd44876-x8bpf", "timestamp":"2026-03-03 13:36:22.249812085 +0000 UTC"}, Hostname:"ci-4459-2-4-a-1c48930dd2", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0xc000188dc0)} Mar 3 13:36:22.325719 containerd[1639]: 2026-03-03 13:36:22.262 [INFO][4675] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 3 13:36:22.325719 containerd[1639]: 2026-03-03 13:36:22.262 [INFO][4675] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Mar 3 13:36:22.325719 containerd[1639]: 2026-03-03 13:36:22.262 [INFO][4675] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-4-a-1c48930dd2' Mar 3 13:36:22.325719 containerd[1639]: 2026-03-03 13:36:22.266 [INFO][4675] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.bd6a810d210a5099c926b0b873c2825e6674c58e0b52bbf13821369d15b58c3e" host="ci-4459-2-4-a-1c48930dd2" Mar 3 13:36:22.325719 containerd[1639]: 2026-03-03 13:36:22.271 [INFO][4675] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4459-2-4-a-1c48930dd2" Mar 3 13:36:22.325719 containerd[1639]: 2026-03-03 13:36:22.275 [INFO][4675] ipam/ipam.go 526: Trying affinity for 192.168.119.64/26 host="ci-4459-2-4-a-1c48930dd2" Mar 3 13:36:22.325719 containerd[1639]: 2026-03-03 13:36:22.277 [INFO][4675] ipam/ipam.go 160: Attempting to load block cidr=192.168.119.64/26 host="ci-4459-2-4-a-1c48930dd2" Mar 3 13:36:22.325719 containerd[1639]: 2026-03-03 13:36:22.278 [INFO][4675] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.119.64/26 host="ci-4459-2-4-a-1c48930dd2" Mar 3 13:36:22.325874 containerd[1639]: 2026-03-03 13:36:22.278 [INFO][4675] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.119.64/26 handle="k8s-pod-network.bd6a810d210a5099c926b0b873c2825e6674c58e0b52bbf13821369d15b58c3e" host="ci-4459-2-4-a-1c48930dd2" Mar 3 13:36:22.325874 containerd[1639]: 2026-03-03 13:36:22.280 [INFO][4675] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.bd6a810d210a5099c926b0b873c2825e6674c58e0b52bbf13821369d15b58c3e Mar 3 13:36:22.325874 containerd[1639]: 2026-03-03 13:36:22.284 [INFO][4675] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.119.64/26 handle="k8s-pod-network.bd6a810d210a5099c926b0b873c2825e6674c58e0b52bbf13821369d15b58c3e" host="ci-4459-2-4-a-1c48930dd2" Mar 3 13:36:22.325874 containerd[1639]: 2026-03-03 13:36:22.291 [INFO][4675] ipam/ipam.go 1288: Successfully 
claimed IPs: [192.168.119.68/26] block=192.168.119.64/26 handle="k8s-pod-network.bd6a810d210a5099c926b0b873c2825e6674c58e0b52bbf13821369d15b58c3e" host="ci-4459-2-4-a-1c48930dd2" Mar 3 13:36:22.325874 containerd[1639]: 2026-03-03 13:36:22.291 [INFO][4675] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.119.68/26] handle="k8s-pod-network.bd6a810d210a5099c926b0b873c2825e6674c58e0b52bbf13821369d15b58c3e" host="ci-4459-2-4-a-1c48930dd2" Mar 3 13:36:22.325874 containerd[1639]: 2026-03-03 13:36:22.291 [INFO][4675] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 3 13:36:22.325874 containerd[1639]: 2026-03-03 13:36:22.291 [INFO][4675] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.119.68/26] IPv6=[] ContainerID="bd6a810d210a5099c926b0b873c2825e6674c58e0b52bbf13821369d15b58c3e" HandleID="k8s-pod-network.bd6a810d210a5099c926b0b873c2825e6674c58e0b52bbf13821369d15b58c3e" Workload="ci--4459--2--4--a--1c48930dd2-k8s-calico--apiserver--588cd44876--x8bpf-eth0" Mar 3 13:36:22.325974 containerd[1639]: 2026-03-03 13:36:22.299 [INFO][4627] cni-plugin/k8s.go 418: Populated endpoint ContainerID="bd6a810d210a5099c926b0b873c2825e6674c58e0b52bbf13821369d15b58c3e" Namespace="calico-system" Pod="calico-apiserver-588cd44876-x8bpf" WorkloadEndpoint="ci--4459--2--4--a--1c48930dd2-k8s-calico--apiserver--588cd44876--x8bpf-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--a--1c48930dd2-k8s-calico--apiserver--588cd44876--x8bpf-eth0", GenerateName:"calico-apiserver-588cd44876-", Namespace:"calico-system", SelfLink:"", UID:"ee6739e9-2745-46b9-936e-2b03da5ded77", ResourceVersion:"853", Generation:0, CreationTimestamp:time.Date(2026, time.March, 3, 13, 35, 54, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", 
"k8s-app":"calico-apiserver", "pod-template-hash":"588cd44876", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-a-1c48930dd2", ContainerID:"", Pod:"calico-apiserver-588cd44876-x8bpf", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.119.68/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"cali959a5791523", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 3 13:36:22.326011 containerd[1639]: 2026-03-03 13:36:22.299 [INFO][4627] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.119.68/32] ContainerID="bd6a810d210a5099c926b0b873c2825e6674c58e0b52bbf13821369d15b58c3e" Namespace="calico-system" Pod="calico-apiserver-588cd44876-x8bpf" WorkloadEndpoint="ci--4459--2--4--a--1c48930dd2-k8s-calico--apiserver--588cd44876--x8bpf-eth0" Mar 3 13:36:22.326011 containerd[1639]: 2026-03-03 13:36:22.299 [INFO][4627] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali959a5791523 ContainerID="bd6a810d210a5099c926b0b873c2825e6674c58e0b52bbf13821369d15b58c3e" Namespace="calico-system" Pod="calico-apiserver-588cd44876-x8bpf" WorkloadEndpoint="ci--4459--2--4--a--1c48930dd2-k8s-calico--apiserver--588cd44876--x8bpf-eth0" Mar 3 13:36:22.326011 containerd[1639]: 2026-03-03 13:36:22.313 [INFO][4627] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="bd6a810d210a5099c926b0b873c2825e6674c58e0b52bbf13821369d15b58c3e" Namespace="calico-system" Pod="calico-apiserver-588cd44876-x8bpf" 
WorkloadEndpoint="ci--4459--2--4--a--1c48930dd2-k8s-calico--apiserver--588cd44876--x8bpf-eth0" Mar 3 13:36:22.326059 containerd[1639]: 2026-03-03 13:36:22.314 [INFO][4627] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="bd6a810d210a5099c926b0b873c2825e6674c58e0b52bbf13821369d15b58c3e" Namespace="calico-system" Pod="calico-apiserver-588cd44876-x8bpf" WorkloadEndpoint="ci--4459--2--4--a--1c48930dd2-k8s-calico--apiserver--588cd44876--x8bpf-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--a--1c48930dd2-k8s-calico--apiserver--588cd44876--x8bpf-eth0", GenerateName:"calico-apiserver-588cd44876-", Namespace:"calico-system", SelfLink:"", UID:"ee6739e9-2745-46b9-936e-2b03da5ded77", ResourceVersion:"853", Generation:0, CreationTimestamp:time.Date(2026, time.March, 3, 13, 35, 54, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"588cd44876", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-a-1c48930dd2", ContainerID:"bd6a810d210a5099c926b0b873c2825e6674c58e0b52bbf13821369d15b58c3e", Pod:"calico-apiserver-588cd44876-x8bpf", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.119.68/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"cali959a5791523", MAC:"5a:c3:92:5f:4e:52", 
Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 3 13:36:22.326096 containerd[1639]: 2026-03-03 13:36:22.322 [INFO][4627] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="bd6a810d210a5099c926b0b873c2825e6674c58e0b52bbf13821369d15b58c3e" Namespace="calico-system" Pod="calico-apiserver-588cd44876-x8bpf" WorkloadEndpoint="ci--4459--2--4--a--1c48930dd2-k8s-calico--apiserver--588cd44876--x8bpf-eth0" Mar 3 13:36:22.365564 containerd[1639]: time="2026-03-03T13:36:22.364919614Z" level=info msg="connecting to shim bd6a810d210a5099c926b0b873c2825e6674c58e0b52bbf13821369d15b58c3e" address="unix:///run/containerd/s/38d1f4f6c8056481fd119389db95ac52f9c24b820276cdeb45dea28dc0180a9b" namespace=k8s.io protocol=ttrpc version=3 Mar 3 13:36:22.391773 systemd[1]: Started cri-containerd-bd6a810d210a5099c926b0b873c2825e6674c58e0b52bbf13821369d15b58c3e.scope - libcontainer container bd6a810d210a5099c926b0b873c2825e6674c58e0b52bbf13821369d15b58c3e. 
Mar 3 13:36:22.401372 systemd-networkd[1489]: calib7ca8d9d316: Link UP Mar 3 13:36:22.401601 systemd-networkd[1489]: calib7ca8d9d316: Gained carrier Mar 3 13:36:22.423527 containerd[1639]: 2026-03-03 13:36:22.178 [INFO][4640] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--4--a--1c48930dd2-k8s-calico--kube--controllers--77c7bb8fcc--hh8jb-eth0 calico-kube-controllers-77c7bb8fcc- calico-system 79ec080a-ea5e-48da-b785-0639f3bb4193 849 0 2026-03-03 13:35:54 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:77c7bb8fcc projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ci-4459-2-4-a-1c48930dd2 calico-kube-controllers-77c7bb8fcc-hh8jb eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] calib7ca8d9d316 [] [] }} ContainerID="b6882a656dc49a7f8247047829031ebf39607d4c1c39cb0fa3a0ca82bcdf4ef4" Namespace="calico-system" Pod="calico-kube-controllers-77c7bb8fcc-hh8jb" WorkloadEndpoint="ci--4459--2--4--a--1c48930dd2-k8s-calico--kube--controllers--77c7bb8fcc--hh8jb-" Mar 3 13:36:22.423527 containerd[1639]: 2026-03-03 13:36:22.178 [INFO][4640] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="b6882a656dc49a7f8247047829031ebf39607d4c1c39cb0fa3a0ca82bcdf4ef4" Namespace="calico-system" Pod="calico-kube-controllers-77c7bb8fcc-hh8jb" WorkloadEndpoint="ci--4459--2--4--a--1c48930dd2-k8s-calico--kube--controllers--77c7bb8fcc--hh8jb-eth0" Mar 3 13:36:22.423527 containerd[1639]: 2026-03-03 13:36:22.250 [INFO][4663] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="b6882a656dc49a7f8247047829031ebf39607d4c1c39cb0fa3a0ca82bcdf4ef4" HandleID="k8s-pod-network.b6882a656dc49a7f8247047829031ebf39607d4c1c39cb0fa3a0ca82bcdf4ef4" 
Workload="ci--4459--2--4--a--1c48930dd2-k8s-calico--kube--controllers--77c7bb8fcc--hh8jb-eth0" Mar 3 13:36:22.423718 containerd[1639]: 2026-03-03 13:36:22.265 [INFO][4663] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="b6882a656dc49a7f8247047829031ebf39607d4c1c39cb0fa3a0ca82bcdf4ef4" HandleID="k8s-pod-network.b6882a656dc49a7f8247047829031ebf39607d4c1c39cb0fa3a0ca82bcdf4ef4" Workload="ci--4459--2--4--a--1c48930dd2-k8s-calico--kube--controllers--77c7bb8fcc--hh8jb-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000380030), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459-2-4-a-1c48930dd2", "pod":"calico-kube-controllers-77c7bb8fcc-hh8jb", "timestamp":"2026-03-03 13:36:22.250111505 +0000 UTC"}, Hostname:"ci-4459-2-4-a-1c48930dd2", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0xc000282000)} Mar 3 13:36:22.423718 containerd[1639]: 2026-03-03 13:36:22.265 [INFO][4663] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 3 13:36:22.423718 containerd[1639]: 2026-03-03 13:36:22.291 [INFO][4663] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Mar 3 13:36:22.423718 containerd[1639]: 2026-03-03 13:36:22.291 [INFO][4663] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-4-a-1c48930dd2' Mar 3 13:36:22.423718 containerd[1639]: 2026-03-03 13:36:22.368 [INFO][4663] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.b6882a656dc49a7f8247047829031ebf39607d4c1c39cb0fa3a0ca82bcdf4ef4" host="ci-4459-2-4-a-1c48930dd2" Mar 3 13:36:22.423718 containerd[1639]: 2026-03-03 13:36:22.373 [INFO][4663] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4459-2-4-a-1c48930dd2" Mar 3 13:36:22.423718 containerd[1639]: 2026-03-03 13:36:22.377 [INFO][4663] ipam/ipam.go 526: Trying affinity for 192.168.119.64/26 host="ci-4459-2-4-a-1c48930dd2" Mar 3 13:36:22.423718 containerd[1639]: 2026-03-03 13:36:22.379 [INFO][4663] ipam/ipam.go 160: Attempting to load block cidr=192.168.119.64/26 host="ci-4459-2-4-a-1c48930dd2" Mar 3 13:36:22.423718 containerd[1639]: 2026-03-03 13:36:22.381 [INFO][4663] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.119.64/26 host="ci-4459-2-4-a-1c48930dd2" Mar 3 13:36:22.423870 containerd[1639]: 2026-03-03 13:36:22.381 [INFO][4663] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.119.64/26 handle="k8s-pod-network.b6882a656dc49a7f8247047829031ebf39607d4c1c39cb0fa3a0ca82bcdf4ef4" host="ci-4459-2-4-a-1c48930dd2" Mar 3 13:36:22.423870 containerd[1639]: 2026-03-03 13:36:22.384 [INFO][4663] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.b6882a656dc49a7f8247047829031ebf39607d4c1c39cb0fa3a0ca82bcdf4ef4 Mar 3 13:36:22.423870 containerd[1639]: 2026-03-03 13:36:22.388 [INFO][4663] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.119.64/26 handle="k8s-pod-network.b6882a656dc49a7f8247047829031ebf39607d4c1c39cb0fa3a0ca82bcdf4ef4" host="ci-4459-2-4-a-1c48930dd2" Mar 3 13:36:22.423870 containerd[1639]: 2026-03-03 13:36:22.394 [INFO][4663] ipam/ipam.go 1288: Successfully 
claimed IPs: [192.168.119.69/26] block=192.168.119.64/26 handle="k8s-pod-network.b6882a656dc49a7f8247047829031ebf39607d4c1c39cb0fa3a0ca82bcdf4ef4" host="ci-4459-2-4-a-1c48930dd2" Mar 3 13:36:22.423870 containerd[1639]: 2026-03-03 13:36:22.394 [INFO][4663] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.119.69/26] handle="k8s-pod-network.b6882a656dc49a7f8247047829031ebf39607d4c1c39cb0fa3a0ca82bcdf4ef4" host="ci-4459-2-4-a-1c48930dd2" Mar 3 13:36:22.423870 containerd[1639]: 2026-03-03 13:36:22.394 [INFO][4663] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 3 13:36:22.423870 containerd[1639]: 2026-03-03 13:36:22.394 [INFO][4663] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.119.69/26] IPv6=[] ContainerID="b6882a656dc49a7f8247047829031ebf39607d4c1c39cb0fa3a0ca82bcdf4ef4" HandleID="k8s-pod-network.b6882a656dc49a7f8247047829031ebf39607d4c1c39cb0fa3a0ca82bcdf4ef4" Workload="ci--4459--2--4--a--1c48930dd2-k8s-calico--kube--controllers--77c7bb8fcc--hh8jb-eth0" Mar 3 13:36:22.423971 containerd[1639]: 2026-03-03 13:36:22.396 [INFO][4640] cni-plugin/k8s.go 418: Populated endpoint ContainerID="b6882a656dc49a7f8247047829031ebf39607d4c1c39cb0fa3a0ca82bcdf4ef4" Namespace="calico-system" Pod="calico-kube-controllers-77c7bb8fcc-hh8jb" WorkloadEndpoint="ci--4459--2--4--a--1c48930dd2-k8s-calico--kube--controllers--77c7bb8fcc--hh8jb-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--a--1c48930dd2-k8s-calico--kube--controllers--77c7bb8fcc--hh8jb-eth0", GenerateName:"calico-kube-controllers-77c7bb8fcc-", Namespace:"calico-system", SelfLink:"", UID:"79ec080a-ea5e-48da-b785-0639f3bb4193", ResourceVersion:"849", Generation:0, CreationTimestamp:time.Date(2026, time.March, 3, 13, 35, 54, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), 
Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"77c7bb8fcc", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-a-1c48930dd2", ContainerID:"", Pod:"calico-kube-controllers-77c7bb8fcc-hh8jb", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.119.69/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calib7ca8d9d316", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 3 13:36:22.424007 containerd[1639]: 2026-03-03 13:36:22.397 [INFO][4640] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.119.69/32] ContainerID="b6882a656dc49a7f8247047829031ebf39607d4c1c39cb0fa3a0ca82bcdf4ef4" Namespace="calico-system" Pod="calico-kube-controllers-77c7bb8fcc-hh8jb" WorkloadEndpoint="ci--4459--2--4--a--1c48930dd2-k8s-calico--kube--controllers--77c7bb8fcc--hh8jb-eth0" Mar 3 13:36:22.424007 containerd[1639]: 2026-03-03 13:36:22.397 [INFO][4640] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calib7ca8d9d316 ContainerID="b6882a656dc49a7f8247047829031ebf39607d4c1c39cb0fa3a0ca82bcdf4ef4" Namespace="calico-system" Pod="calico-kube-controllers-77c7bb8fcc-hh8jb" WorkloadEndpoint="ci--4459--2--4--a--1c48930dd2-k8s-calico--kube--controllers--77c7bb8fcc--hh8jb-eth0" Mar 3 13:36:22.424007 containerd[1639]: 2026-03-03 13:36:22.406 [INFO][4640] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding 
ContainerID="b6882a656dc49a7f8247047829031ebf39607d4c1c39cb0fa3a0ca82bcdf4ef4" Namespace="calico-system" Pod="calico-kube-controllers-77c7bb8fcc-hh8jb" WorkloadEndpoint="ci--4459--2--4--a--1c48930dd2-k8s-calico--kube--controllers--77c7bb8fcc--hh8jb-eth0" Mar 3 13:36:22.424057 containerd[1639]: 2026-03-03 13:36:22.407 [INFO][4640] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="b6882a656dc49a7f8247047829031ebf39607d4c1c39cb0fa3a0ca82bcdf4ef4" Namespace="calico-system" Pod="calico-kube-controllers-77c7bb8fcc-hh8jb" WorkloadEndpoint="ci--4459--2--4--a--1c48930dd2-k8s-calico--kube--controllers--77c7bb8fcc--hh8jb-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--a--1c48930dd2-k8s-calico--kube--controllers--77c7bb8fcc--hh8jb-eth0", GenerateName:"calico-kube-controllers-77c7bb8fcc-", Namespace:"calico-system", SelfLink:"", UID:"79ec080a-ea5e-48da-b785-0639f3bb4193", ResourceVersion:"849", Generation:0, CreationTimestamp:time.Date(2026, time.March, 3, 13, 35, 54, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"77c7bb8fcc", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-a-1c48930dd2", ContainerID:"b6882a656dc49a7f8247047829031ebf39607d4c1c39cb0fa3a0ca82bcdf4ef4", Pod:"calico-kube-controllers-77c7bb8fcc-hh8jb", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.119.69/32"}, 
IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calib7ca8d9d316", MAC:"aa:82:31:8f:90:bd", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 3 13:36:22.424094 containerd[1639]: 2026-03-03 13:36:22.421 [INFO][4640] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="b6882a656dc49a7f8247047829031ebf39607d4c1c39cb0fa3a0ca82bcdf4ef4" Namespace="calico-system" Pod="calico-kube-controllers-77c7bb8fcc-hh8jb" WorkloadEndpoint="ci--4459--2--4--a--1c48930dd2-k8s-calico--kube--controllers--77c7bb8fcc--hh8jb-eth0" Mar 3 13:36:22.456273 containerd[1639]: time="2026-03-03T13:36:22.456058665Z" level=info msg="connecting to shim b6882a656dc49a7f8247047829031ebf39607d4c1c39cb0fa3a0ca82bcdf4ef4" address="unix:///run/containerd/s/26f89c39b6348bd46f4f8de623e82f676cb0e9d1b5f1ce79548bc458eff98d76" namespace=k8s.io protocol=ttrpc version=3 Mar 3 13:36:22.479134 containerd[1639]: time="2026-03-03T13:36:22.479092105Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-588cd44876-x8bpf,Uid:ee6739e9-2745-46b9-936e-2b03da5ded77,Namespace:calico-system,Attempt:0,} returns sandbox id \"bd6a810d210a5099c926b0b873c2825e6674c58e0b52bbf13821369d15b58c3e\"" Mar 3 13:36:22.502801 systemd[1]: Started cri-containerd-b6882a656dc49a7f8247047829031ebf39607d4c1c39cb0fa3a0ca82bcdf4ef4.scope - libcontainer container b6882a656dc49a7f8247047829031ebf39607d4c1c39cb0fa3a0ca82bcdf4ef4. 
Mar 3 13:36:22.519170 systemd-networkd[1489]: cali6b3776a9e1e: Link UP Mar 3 13:36:22.521285 systemd-networkd[1489]: cali6b3776a9e1e: Gained carrier Mar 3 13:36:22.543463 containerd[1639]: 2026-03-03 13:36:22.210 [INFO][4620] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--4--a--1c48930dd2-k8s-coredns--66bc5c9577--j9kwn-eth0 coredns-66bc5c9577- kube-system 69f392ca-620e-4545-90dd-be526067bb84 841 0 2026-03-03 13:35:45 +0000 UTC map[k8s-app:kube-dns pod-template-hash:66bc5c9577 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4459-2-4-a-1c48930dd2 coredns-66bc5c9577-j9kwn eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali6b3776a9e1e [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 } {liveness-probe TCP 8080 0 } {readiness-probe TCP 8181 0 }] [] }} ContainerID="c917259aa478f08f599f54326f6d2acd7797b04dbe4de2ec9480fd037cd7be69" Namespace="kube-system" Pod="coredns-66bc5c9577-j9kwn" WorkloadEndpoint="ci--4459--2--4--a--1c48930dd2-k8s-coredns--66bc5c9577--j9kwn-" Mar 3 13:36:22.543463 containerd[1639]: 2026-03-03 13:36:22.210 [INFO][4620] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="c917259aa478f08f599f54326f6d2acd7797b04dbe4de2ec9480fd037cd7be69" Namespace="kube-system" Pod="coredns-66bc5c9577-j9kwn" WorkloadEndpoint="ci--4459--2--4--a--1c48930dd2-k8s-coredns--66bc5c9577--j9kwn-eth0" Mar 3 13:36:22.543463 containerd[1639]: 2026-03-03 13:36:22.265 [INFO][4680] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="c917259aa478f08f599f54326f6d2acd7797b04dbe4de2ec9480fd037cd7be69" HandleID="k8s-pod-network.c917259aa478f08f599f54326f6d2acd7797b04dbe4de2ec9480fd037cd7be69" Workload="ci--4459--2--4--a--1c48930dd2-k8s-coredns--66bc5c9577--j9kwn-eth0" Mar 3 13:36:22.543821 containerd[1639]: 2026-03-03 13:36:22.270 [INFO][4680] 
ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="c917259aa478f08f599f54326f6d2acd7797b04dbe4de2ec9480fd037cd7be69" HandleID="k8s-pod-network.c917259aa478f08f599f54326f6d2acd7797b04dbe4de2ec9480fd037cd7be69" Workload="ci--4459--2--4--a--1c48930dd2-k8s-coredns--66bc5c9577--j9kwn-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002eff30), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4459-2-4-a-1c48930dd2", "pod":"coredns-66bc5c9577-j9kwn", "timestamp":"2026-03-03 13:36:22.265730967 +0000 UTC"}, Hostname:"ci-4459-2-4-a-1c48930dd2", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0xc0000fba20)} Mar 3 13:36:22.543821 containerd[1639]: 2026-03-03 13:36:22.271 [INFO][4680] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 3 13:36:22.543821 containerd[1639]: 2026-03-03 13:36:22.394 [INFO][4680] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Mar 3 13:36:22.543821 containerd[1639]: 2026-03-03 13:36:22.394 [INFO][4680] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-4-a-1c48930dd2' Mar 3 13:36:22.543821 containerd[1639]: 2026-03-03 13:36:22.469 [INFO][4680] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.c917259aa478f08f599f54326f6d2acd7797b04dbe4de2ec9480fd037cd7be69" host="ci-4459-2-4-a-1c48930dd2" Mar 3 13:36:22.543821 containerd[1639]: 2026-03-03 13:36:22.478 [INFO][4680] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4459-2-4-a-1c48930dd2" Mar 3 13:36:22.543821 containerd[1639]: 2026-03-03 13:36:22.487 [INFO][4680] ipam/ipam.go 526: Trying affinity for 192.168.119.64/26 host="ci-4459-2-4-a-1c48930dd2" Mar 3 13:36:22.543821 containerd[1639]: 2026-03-03 13:36:22.489 [INFO][4680] ipam/ipam.go 160: Attempting to load block cidr=192.168.119.64/26 host="ci-4459-2-4-a-1c48930dd2" Mar 3 13:36:22.543821 containerd[1639]: 2026-03-03 13:36:22.491 [INFO][4680] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.119.64/26 host="ci-4459-2-4-a-1c48930dd2" Mar 3 13:36:22.544009 containerd[1639]: 2026-03-03 13:36:22.491 [INFO][4680] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.119.64/26 handle="k8s-pod-network.c917259aa478f08f599f54326f6d2acd7797b04dbe4de2ec9480fd037cd7be69" host="ci-4459-2-4-a-1c48930dd2" Mar 3 13:36:22.544009 containerd[1639]: 2026-03-03 13:36:22.493 [INFO][4680] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.c917259aa478f08f599f54326f6d2acd7797b04dbe4de2ec9480fd037cd7be69 Mar 3 13:36:22.544009 containerd[1639]: 2026-03-03 13:36:22.497 [INFO][4680] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.119.64/26 handle="k8s-pod-network.c917259aa478f08f599f54326f6d2acd7797b04dbe4de2ec9480fd037cd7be69" host="ci-4459-2-4-a-1c48930dd2" Mar 3 13:36:22.544009 containerd[1639]: 2026-03-03 13:36:22.508 [INFO][4680] ipam/ipam.go 1288: Successfully 
claimed IPs: [192.168.119.70/26] block=192.168.119.64/26 handle="k8s-pod-network.c917259aa478f08f599f54326f6d2acd7797b04dbe4de2ec9480fd037cd7be69" host="ci-4459-2-4-a-1c48930dd2" Mar 3 13:36:22.544009 containerd[1639]: 2026-03-03 13:36:22.508 [INFO][4680] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.119.70/26] handle="k8s-pod-network.c917259aa478f08f599f54326f6d2acd7797b04dbe4de2ec9480fd037cd7be69" host="ci-4459-2-4-a-1c48930dd2" Mar 3 13:36:22.544009 containerd[1639]: 2026-03-03 13:36:22.508 [INFO][4680] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 3 13:36:22.544009 containerd[1639]: 2026-03-03 13:36:22.508 [INFO][4680] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.119.70/26] IPv6=[] ContainerID="c917259aa478f08f599f54326f6d2acd7797b04dbe4de2ec9480fd037cd7be69" HandleID="k8s-pod-network.c917259aa478f08f599f54326f6d2acd7797b04dbe4de2ec9480fd037cd7be69" Workload="ci--4459--2--4--a--1c48930dd2-k8s-coredns--66bc5c9577--j9kwn-eth0" Mar 3 13:36:22.544131 containerd[1639]: 2026-03-03 13:36:22.515 [INFO][4620] cni-plugin/k8s.go 418: Populated endpoint ContainerID="c917259aa478f08f599f54326f6d2acd7797b04dbe4de2ec9480fd037cd7be69" Namespace="kube-system" Pod="coredns-66bc5c9577-j9kwn" WorkloadEndpoint="ci--4459--2--4--a--1c48930dd2-k8s-coredns--66bc5c9577--j9kwn-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--a--1c48930dd2-k8s-coredns--66bc5c9577--j9kwn-eth0", GenerateName:"coredns-66bc5c9577-", Namespace:"kube-system", SelfLink:"", UID:"69f392ca-620e-4545-90dd-be526067bb84", ResourceVersion:"841", Generation:0, CreationTimestamp:time.Date(2026, time.March, 3, 13, 35, 45, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"66bc5c9577", "projectcalico.org/namespace":"kube-system", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-a-1c48930dd2", ContainerID:"", Pod:"coredns-66bc5c9577-j9kwn", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.119.70/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali6b3776a9e1e", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 3 13:36:22.544131 containerd[1639]: 2026-03-03 13:36:22.515 [INFO][4620] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.119.70/32] ContainerID="c917259aa478f08f599f54326f6d2acd7797b04dbe4de2ec9480fd037cd7be69" Namespace="kube-system" Pod="coredns-66bc5c9577-j9kwn" WorkloadEndpoint="ci--4459--2--4--a--1c48930dd2-k8s-coredns--66bc5c9577--j9kwn-eth0" Mar 3 13:36:22.544131 containerd[1639]: 2026-03-03 13:36:22.515 [INFO][4620] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali6b3776a9e1e 
ContainerID="c917259aa478f08f599f54326f6d2acd7797b04dbe4de2ec9480fd037cd7be69" Namespace="kube-system" Pod="coredns-66bc5c9577-j9kwn" WorkloadEndpoint="ci--4459--2--4--a--1c48930dd2-k8s-coredns--66bc5c9577--j9kwn-eth0" Mar 3 13:36:22.544131 containerd[1639]: 2026-03-03 13:36:22.526 [INFO][4620] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="c917259aa478f08f599f54326f6d2acd7797b04dbe4de2ec9480fd037cd7be69" Namespace="kube-system" Pod="coredns-66bc5c9577-j9kwn" WorkloadEndpoint="ci--4459--2--4--a--1c48930dd2-k8s-coredns--66bc5c9577--j9kwn-eth0" Mar 3 13:36:22.544131 containerd[1639]: 2026-03-03 13:36:22.528 [INFO][4620] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="c917259aa478f08f599f54326f6d2acd7797b04dbe4de2ec9480fd037cd7be69" Namespace="kube-system" Pod="coredns-66bc5c9577-j9kwn" WorkloadEndpoint="ci--4459--2--4--a--1c48930dd2-k8s-coredns--66bc5c9577--j9kwn-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--a--1c48930dd2-k8s-coredns--66bc5c9577--j9kwn-eth0", GenerateName:"coredns-66bc5c9577-", Namespace:"kube-system", SelfLink:"", UID:"69f392ca-620e-4545-90dd-be526067bb84", ResourceVersion:"841", Generation:0, CreationTimestamp:time.Date(2026, time.March, 3, 13, 35, 45, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"66bc5c9577", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-a-1c48930dd2", ContainerID:"c917259aa478f08f599f54326f6d2acd7797b04dbe4de2ec9480fd037cd7be69", 
Pod:"coredns-66bc5c9577-j9kwn", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.119.70/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali6b3776a9e1e", MAC:"ba:80:1c:08:a0:62", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 3 13:36:22.544367 containerd[1639]: 2026-03-03 13:36:22.539 [INFO][4620] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="c917259aa478f08f599f54326f6d2acd7797b04dbe4de2ec9480fd037cd7be69" Namespace="kube-system" Pod="coredns-66bc5c9577-j9kwn" WorkloadEndpoint="ci--4459--2--4--a--1c48930dd2-k8s-coredns--66bc5c9577--j9kwn-eth0" Mar 3 13:36:22.575847 containerd[1639]: time="2026-03-03T13:36:22.574600835Z" level=info msg="connecting to shim c917259aa478f08f599f54326f6d2acd7797b04dbe4de2ec9480fd037cd7be69" address="unix:///run/containerd/s/bc08e4ccb38f57f2a99cbaf2c5ba4164f091dd8e9bd8307aefc3f8e3fefee007" namespace=k8s.io protocol=ttrpc version=3 Mar 3 13:36:22.590883 containerd[1639]: time="2026-03-03T13:36:22.590845028Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:calico-kube-controllers-77c7bb8fcc-hh8jb,Uid:79ec080a-ea5e-48da-b785-0639f3bb4193,Namespace:calico-system,Attempt:0,} returns sandbox id \"b6882a656dc49a7f8247047829031ebf39607d4c1c39cb0fa3a0ca82bcdf4ef4\""
Mar 3 13:36:22.610808 systemd[1]: Started cri-containerd-c917259aa478f08f599f54326f6d2acd7797b04dbe4de2ec9480fd037cd7be69.scope - libcontainer container c917259aa478f08f599f54326f6d2acd7797b04dbe4de2ec9480fd037cd7be69.
Mar 3 13:36:22.658688 containerd[1639]: time="2026-03-03T13:36:22.658643136Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-j9kwn,Uid:69f392ca-620e-4545-90dd-be526067bb84,Namespace:kube-system,Attempt:0,} returns sandbox id \"c917259aa478f08f599f54326f6d2acd7797b04dbe4de2ec9480fd037cd7be69\""
Mar 3 13:36:22.665568 containerd[1639]: time="2026-03-03T13:36:22.665533438Z" level=info msg="CreateContainer within sandbox \"c917259aa478f08f599f54326f6d2acd7797b04dbe4de2ec9480fd037cd7be69\" for container &ContainerMetadata{Name:coredns,Attempt:0,}"
Mar 3 13:36:22.679198 containerd[1639]: time="2026-03-03T13:36:22.679162961Z" level=info msg="Container 8c6b0e75040c6c98897d50b2478b5cd2d8e98e41471e57baecc880e19b581a9a: CDI devices from CRI Config.CDIDevices: []"
Mar 3 13:36:22.683098 containerd[1639]: time="2026-03-03T13:36:22.683070388Z" level=info msg="CreateContainer within sandbox \"c917259aa478f08f599f54326f6d2acd7797b04dbe4de2ec9480fd037cd7be69\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"8c6b0e75040c6c98897d50b2478b5cd2d8e98e41471e57baecc880e19b581a9a\""
Mar 3 13:36:22.683554 containerd[1639]: time="2026-03-03T13:36:22.683513560Z" level=info msg="StartContainer for \"8c6b0e75040c6c98897d50b2478b5cd2d8e98e41471e57baecc880e19b581a9a\""
Mar 3 13:36:22.684997 containerd[1639]: time="2026-03-03T13:36:22.684945872Z" level=info msg="connecting to shim 8c6b0e75040c6c98897d50b2478b5cd2d8e98e41471e57baecc880e19b581a9a" address="unix:///run/containerd/s/bc08e4ccb38f57f2a99cbaf2c5ba4164f091dd8e9bd8307aefc3f8e3fefee007" protocol=ttrpc version=3
Mar 3 13:36:22.703793 systemd[1]: Started cri-containerd-8c6b0e75040c6c98897d50b2478b5cd2d8e98e41471e57baecc880e19b581a9a.scope - libcontainer container 8c6b0e75040c6c98897d50b2478b5cd2d8e98e41471e57baecc880e19b581a9a.
Mar 3 13:36:22.732744 containerd[1639]: time="2026-03-03T13:36:22.732707878Z" level=info msg="StartContainer for \"8c6b0e75040c6c98897d50b2478b5cd2d8e98e41471e57baecc880e19b581a9a\" returns successfully"
Mar 3 13:36:23.136931 systemd-networkd[1489]: cali245cfa653bd: Gained IPv6LL
Mar 3 13:36:23.338185 kubelet[2796]: I0303 13:36:23.338078 2796 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-66bc5c9577-j9kwn" podStartSLOduration=38.338050658 podStartE2EDuration="38.338050658s" podCreationTimestamp="2026-03-03 13:35:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-03 13:36:23.337317472 +0000 UTC m=+46.318299406" watchObservedRunningTime="2026-03-03 13:36:23.338050658 +0000 UTC m=+46.319032552"
Mar 3 13:36:23.520937 systemd-networkd[1489]: cali959a5791523: Gained IPv6LL
Mar 3 13:36:23.905319 systemd-networkd[1489]: cali6b3776a9e1e: Gained IPv6LL
Mar 3 13:36:24.097054 systemd-networkd[1489]: calib7ca8d9d316: Gained IPv6LL
Mar 3 13:36:24.109565 containerd[1639]: time="2026-03-03T13:36:24.108809337Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-wx9bf,Uid:24edf4b1-ec3a-4ab5-a696-e9721189d47d,Namespace:kube-system,Attempt:0,}"
Mar 3 13:36:24.113105 containerd[1639]: time="2026-03-03T13:36:24.112919921Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-588cd44876-hb6b4,Uid:323b935e-0775-45cf-89a0-f43932ed5b64,Namespace:calico-system,Attempt:0,}"
Mar 3 13:36:24.249784 systemd-networkd[1489]: calib396cebb693: Link UP
Mar 3 13:36:24.250012 systemd-networkd[1489]: calib396cebb693: Gained carrier
Mar 3 13:36:24.265736 containerd[1639]: 2026-03-03 13:36:24.172 [INFO][4956] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--4--a--1c48930dd2-k8s-calico--apiserver--588cd44876--hb6b4-eth0 calico-apiserver-588cd44876- calico-system 323b935e-0775-45cf-89a0-f43932ed5b64 851 0 2026-03-03 13:35:54 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:588cd44876 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4459-2-4-a-1c48930dd2 calico-apiserver-588cd44876-hb6b4 eth0 calico-apiserver [] [] [kns.calico-system ksa.calico-system.calico-apiserver] calib396cebb693 [] [] }} ContainerID="a5cee5da1579adc3c16dbd98f779def810be107ca31d43c9af49e065e48693d1" Namespace="calico-system" Pod="calico-apiserver-588cd44876-hb6b4" WorkloadEndpoint="ci--4459--2--4--a--1c48930dd2-k8s-calico--apiserver--588cd44876--hb6b4-"
Mar 3 13:36:24.265736 containerd[1639]: 2026-03-03 13:36:24.172 [INFO][4956] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="a5cee5da1579adc3c16dbd98f779def810be107ca31d43c9af49e065e48693d1" Namespace="calico-system" Pod="calico-apiserver-588cd44876-hb6b4" WorkloadEndpoint="ci--4459--2--4--a--1c48930dd2-k8s-calico--apiserver--588cd44876--hb6b4-eth0"
Mar 3 13:36:24.265736 containerd[1639]: 2026-03-03 13:36:24.205 [INFO][4981] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="a5cee5da1579adc3c16dbd98f779def810be107ca31d43c9af49e065e48693d1" HandleID="k8s-pod-network.a5cee5da1579adc3c16dbd98f779def810be107ca31d43c9af49e065e48693d1" Workload="ci--4459--2--4--a--1c48930dd2-k8s-calico--apiserver--588cd44876--hb6b4-eth0"
Mar 3 13:36:24.265736 containerd[1639]: 2026-03-03 13:36:24.213 [INFO][4981] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="a5cee5da1579adc3c16dbd98f779def810be107ca31d43c9af49e065e48693d1" HandleID="k8s-pod-network.a5cee5da1579adc3c16dbd98f779def810be107ca31d43c9af49e065e48693d1" Workload="ci--4459--2--4--a--1c48930dd2-k8s-calico--apiserver--588cd44876--hb6b4-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000404f90), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459-2-4-a-1c48930dd2", "pod":"calico-apiserver-588cd44876-hb6b4", "timestamp":"2026-03-03 13:36:24.205633128 +0000 UTC"}, Hostname:"ci-4459-2-4-a-1c48930dd2", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0xc000267600)}
Mar 3 13:36:24.265736 containerd[1639]: 2026-03-03 13:36:24.213 [INFO][4981] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock.
Mar 3 13:36:24.265736 containerd[1639]: 2026-03-03 13:36:24.213 [INFO][4981] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock.
Mar 3 13:36:24.265736 containerd[1639]: 2026-03-03 13:36:24.213 [INFO][4981] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-4-a-1c48930dd2'
Mar 3 13:36:24.265736 containerd[1639]: 2026-03-03 13:36:24.215 [INFO][4981] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.a5cee5da1579adc3c16dbd98f779def810be107ca31d43c9af49e065e48693d1" host="ci-4459-2-4-a-1c48930dd2"
Mar 3 13:36:24.265736 containerd[1639]: 2026-03-03 13:36:24.219 [INFO][4981] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4459-2-4-a-1c48930dd2"
Mar 3 13:36:24.265736 containerd[1639]: 2026-03-03 13:36:24.223 [INFO][4981] ipam/ipam.go 526: Trying affinity for 192.168.119.64/26 host="ci-4459-2-4-a-1c48930dd2"
Mar 3 13:36:24.265736 containerd[1639]: 2026-03-03 13:36:24.226 [INFO][4981] ipam/ipam.go 160: Attempting to load block cidr=192.168.119.64/26 host="ci-4459-2-4-a-1c48930dd2"
Mar 3 13:36:24.265736 containerd[1639]: 2026-03-03 13:36:24.228 [INFO][4981] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.119.64/26 host="ci-4459-2-4-a-1c48930dd2"
Mar 3 13:36:24.265736 containerd[1639]: 2026-03-03 13:36:24.228 [INFO][4981] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.119.64/26 handle="k8s-pod-network.a5cee5da1579adc3c16dbd98f779def810be107ca31d43c9af49e065e48693d1" host="ci-4459-2-4-a-1c48930dd2"
Mar 3 13:36:24.265736 containerd[1639]: 2026-03-03 13:36:24.230 [INFO][4981] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.a5cee5da1579adc3c16dbd98f779def810be107ca31d43c9af49e065e48693d1
Mar 3 13:36:24.265736 containerd[1639]: 2026-03-03 13:36:24.234 [INFO][4981] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.119.64/26 handle="k8s-pod-network.a5cee5da1579adc3c16dbd98f779def810be107ca31d43c9af49e065e48693d1" host="ci-4459-2-4-a-1c48930dd2"
Mar 3 13:36:24.265736 containerd[1639]: 2026-03-03 13:36:24.240 [INFO][4981] ipam/ipam.go 1288: Successfully claimed IPs: [192.168.119.71/26] block=192.168.119.64/26 handle="k8s-pod-network.a5cee5da1579adc3c16dbd98f779def810be107ca31d43c9af49e065e48693d1" host="ci-4459-2-4-a-1c48930dd2"
Mar 3 13:36:24.265736 containerd[1639]: 2026-03-03 13:36:24.240 [INFO][4981] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.119.71/26] handle="k8s-pod-network.a5cee5da1579adc3c16dbd98f779def810be107ca31d43c9af49e065e48693d1" host="ci-4459-2-4-a-1c48930dd2"
Mar 3 13:36:24.265736 containerd[1639]: 2026-03-03 13:36:24.240 [INFO][4981] ipam/ipam_plugin.go 459: Released host-wide IPAM lock.
Mar 3 13:36:24.265736 containerd[1639]: 2026-03-03 13:36:24.240 [INFO][4981] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.119.71/26] IPv6=[] ContainerID="a5cee5da1579adc3c16dbd98f779def810be107ca31d43c9af49e065e48693d1" HandleID="k8s-pod-network.a5cee5da1579adc3c16dbd98f779def810be107ca31d43c9af49e065e48693d1" Workload="ci--4459--2--4--a--1c48930dd2-k8s-calico--apiserver--588cd44876--hb6b4-eth0"
Mar 3 13:36:24.266352 containerd[1639]: 2026-03-03 13:36:24.244 [INFO][4956] cni-plugin/k8s.go 418: Populated endpoint ContainerID="a5cee5da1579adc3c16dbd98f779def810be107ca31d43c9af49e065e48693d1" Namespace="calico-system" Pod="calico-apiserver-588cd44876-hb6b4" WorkloadEndpoint="ci--4459--2--4--a--1c48930dd2-k8s-calico--apiserver--588cd44876--hb6b4-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--a--1c48930dd2-k8s-calico--apiserver--588cd44876--hb6b4-eth0", GenerateName:"calico-apiserver-588cd44876-", Namespace:"calico-system", SelfLink:"", UID:"323b935e-0775-45cf-89a0-f43932ed5b64", ResourceVersion:"851", Generation:0, CreationTimestamp:time.Date(2026, time.March, 3, 13, 35, 54, 0, time.Local), DeletionTimestamp:<nil>, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"588cd44876", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-a-1c48930dd2", ContainerID:"", Pod:"calico-apiserver-588cd44876-hb6b4", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.119.71/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"calib396cebb693", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}}
Mar 3 13:36:24.266352 containerd[1639]: 2026-03-03 13:36:24.244 [INFO][4956] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.119.71/32] ContainerID="a5cee5da1579adc3c16dbd98f779def810be107ca31d43c9af49e065e48693d1" Namespace="calico-system" Pod="calico-apiserver-588cd44876-hb6b4" WorkloadEndpoint="ci--4459--2--4--a--1c48930dd2-k8s-calico--apiserver--588cd44876--hb6b4-eth0"
Mar 3 13:36:24.266352 containerd[1639]: 2026-03-03 13:36:24.244 [INFO][4956] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calib396cebb693 ContainerID="a5cee5da1579adc3c16dbd98f779def810be107ca31d43c9af49e065e48693d1" Namespace="calico-system" Pod="calico-apiserver-588cd44876-hb6b4" WorkloadEndpoint="ci--4459--2--4--a--1c48930dd2-k8s-calico--apiserver--588cd44876--hb6b4-eth0"
Mar 3 13:36:24.266352 containerd[1639]: 2026-03-03 13:36:24.250 [INFO][4956] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="a5cee5da1579adc3c16dbd98f779def810be107ca31d43c9af49e065e48693d1" Namespace="calico-system" Pod="calico-apiserver-588cd44876-hb6b4" WorkloadEndpoint="ci--4459--2--4--a--1c48930dd2-k8s-calico--apiserver--588cd44876--hb6b4-eth0"
Mar 3 13:36:24.266352 containerd[1639]: 2026-03-03 13:36:24.251 [INFO][4956] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="a5cee5da1579adc3c16dbd98f779def810be107ca31d43c9af49e065e48693d1" Namespace="calico-system" Pod="calico-apiserver-588cd44876-hb6b4" WorkloadEndpoint="ci--4459--2--4--a--1c48930dd2-k8s-calico--apiserver--588cd44876--hb6b4-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--a--1c48930dd2-k8s-calico--apiserver--588cd44876--hb6b4-eth0", GenerateName:"calico-apiserver-588cd44876-", Namespace:"calico-system", SelfLink:"", UID:"323b935e-0775-45cf-89a0-f43932ed5b64", ResourceVersion:"851", Generation:0, CreationTimestamp:time.Date(2026, time.March, 3, 13, 35, 54, 0, time.Local), DeletionTimestamp:<nil>, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"588cd44876", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-a-1c48930dd2", ContainerID:"a5cee5da1579adc3c16dbd98f779def810be107ca31d43c9af49e065e48693d1", Pod:"calico-apiserver-588cd44876-hb6b4", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.119.71/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"calib396cebb693", MAC:"6a:ba:a2:3c:01:42", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}}
Mar 3 13:36:24.266352 containerd[1639]: 2026-03-03 13:36:24.260 [INFO][4956] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="a5cee5da1579adc3c16dbd98f779def810be107ca31d43c9af49e065e48693d1" Namespace="calico-system" Pod="calico-apiserver-588cd44876-hb6b4" WorkloadEndpoint="ci--4459--2--4--a--1c48930dd2-k8s-calico--apiserver--588cd44876--hb6b4-eth0"
Mar 3 13:36:24.304779 containerd[1639]: time="2026-03-03T13:36:24.304735588Z" level=info msg="connecting to shim a5cee5da1579adc3c16dbd98f779def810be107ca31d43c9af49e065e48693d1" address="unix:///run/containerd/s/29468790aed36038fe9a650217cd404f8ead90f3127bf14bad8f7b4e6cc2cd3a" namespace=k8s.io protocol=ttrpc version=3
Mar 3 13:36:24.342782 systemd[1]: Started cri-containerd-a5cee5da1579adc3c16dbd98f779def810be107ca31d43c9af49e065e48693d1.scope - libcontainer container a5cee5da1579adc3c16dbd98f779def810be107ca31d43c9af49e065e48693d1.
Mar 3 13:36:24.363923 systemd-networkd[1489]: cali4fc39ee3ed5: Link UP
Mar 3 13:36:24.365149 systemd-networkd[1489]: cali4fc39ee3ed5: Gained carrier
Mar 3 13:36:24.385774 containerd[1639]: 2026-03-03 13:36:24.183 [INFO][4966] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--4--a--1c48930dd2-k8s-coredns--66bc5c9577--wx9bf-eth0 coredns-66bc5c9577- kube-system 24edf4b1-ec3a-4ab5-a696-e9721189d47d 845 0 2026-03-03 13:35:45 +0000 UTC map[k8s-app:kube-dns pod-template-hash:66bc5c9577 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4459-2-4-a-1c48930dd2 coredns-66bc5c9577-wx9bf eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali4fc39ee3ed5 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 } {liveness-probe TCP 8080 0 } {readiness-probe TCP 8181 0 }] [] }} ContainerID="eef7830b74df4a79b8a8abead33dd3b71bdb5af774da21a0165394d48cfba9b5" Namespace="kube-system" Pod="coredns-66bc5c9577-wx9bf" WorkloadEndpoint="ci--4459--2--4--a--1c48930dd2-k8s-coredns--66bc5c9577--wx9bf-"
Mar 3 13:36:24.385774 containerd[1639]: 2026-03-03 13:36:24.183 [INFO][4966] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="eef7830b74df4a79b8a8abead33dd3b71bdb5af774da21a0165394d48cfba9b5" Namespace="kube-system" Pod="coredns-66bc5c9577-wx9bf" WorkloadEndpoint="ci--4459--2--4--a--1c48930dd2-k8s-coredns--66bc5c9577--wx9bf-eth0"
Mar 3 13:36:24.385774 containerd[1639]: 2026-03-03 13:36:24.232 [INFO][4983] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="eef7830b74df4a79b8a8abead33dd3b71bdb5af774da21a0165394d48cfba9b5" HandleID="k8s-pod-network.eef7830b74df4a79b8a8abead33dd3b71bdb5af774da21a0165394d48cfba9b5" Workload="ci--4459--2--4--a--1c48930dd2-k8s-coredns--66bc5c9577--wx9bf-eth0"
Mar 3 13:36:24.385774 containerd[1639]: 2026-03-03 13:36:24.242 [INFO][4983] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="eef7830b74df4a79b8a8abead33dd3b71bdb5af774da21a0165394d48cfba9b5" HandleID="k8s-pod-network.eef7830b74df4a79b8a8abead33dd3b71bdb5af774da21a0165394d48cfba9b5" Workload="ci--4459--2--4--a--1c48930dd2-k8s-coredns--66bc5c9577--wx9bf-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000125b40), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4459-2-4-a-1c48930dd2", "pod":"coredns-66bc5c9577-wx9bf", "timestamp":"2026-03-03 13:36:24.232498602 +0000 UTC"}, Hostname:"ci-4459-2-4-a-1c48930dd2", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0xc0001886e0)}
Mar 3 13:36:24.385774 containerd[1639]: 2026-03-03 13:36:24.242 [INFO][4983] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock.
Mar 3 13:36:24.385774 containerd[1639]: 2026-03-03 13:36:24.242 [INFO][4983] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock.
Mar 3 13:36:24.385774 containerd[1639]: 2026-03-03 13:36:24.243 [INFO][4983] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-4-a-1c48930dd2'
Mar 3 13:36:24.385774 containerd[1639]: 2026-03-03 13:36:24.318 [INFO][4983] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.eef7830b74df4a79b8a8abead33dd3b71bdb5af774da21a0165394d48cfba9b5" host="ci-4459-2-4-a-1c48930dd2"
Mar 3 13:36:24.385774 containerd[1639]: 2026-03-03 13:36:24.323 [INFO][4983] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4459-2-4-a-1c48930dd2"
Mar 3 13:36:24.385774 containerd[1639]: 2026-03-03 13:36:24.328 [INFO][4983] ipam/ipam.go 526: Trying affinity for 192.168.119.64/26 host="ci-4459-2-4-a-1c48930dd2"
Mar 3 13:36:24.385774 containerd[1639]: 2026-03-03 13:36:24.330 [INFO][4983] ipam/ipam.go 160: Attempting to load block cidr=192.168.119.64/26 host="ci-4459-2-4-a-1c48930dd2"
Mar 3 13:36:24.385774 containerd[1639]: 2026-03-03 13:36:24.332 [INFO][4983] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.119.64/26 host="ci-4459-2-4-a-1c48930dd2"
Mar 3 13:36:24.385774 containerd[1639]: 2026-03-03 13:36:24.333 [INFO][4983] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.119.64/26 handle="k8s-pod-network.eef7830b74df4a79b8a8abead33dd3b71bdb5af774da21a0165394d48cfba9b5" host="ci-4459-2-4-a-1c48930dd2"
Mar 3 13:36:24.385774 containerd[1639]: 2026-03-03 13:36:24.335 [INFO][4983] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.eef7830b74df4a79b8a8abead33dd3b71bdb5af774da21a0165394d48cfba9b5
Mar 3 13:36:24.385774 containerd[1639]: 2026-03-03 13:36:24.343 [INFO][4983] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.119.64/26 handle="k8s-pod-network.eef7830b74df4a79b8a8abead33dd3b71bdb5af774da21a0165394d48cfba9b5" host="ci-4459-2-4-a-1c48930dd2"
Mar 3 13:36:24.385774 containerd[1639]: 2026-03-03 13:36:24.352 [INFO][4983] ipam/ipam.go 1288: Successfully claimed IPs: [192.168.119.72/26] block=192.168.119.64/26 handle="k8s-pod-network.eef7830b74df4a79b8a8abead33dd3b71bdb5af774da21a0165394d48cfba9b5" host="ci-4459-2-4-a-1c48930dd2"
Mar 3 13:36:24.385774 containerd[1639]: 2026-03-03 13:36:24.353 [INFO][4983] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.119.72/26] handle="k8s-pod-network.eef7830b74df4a79b8a8abead33dd3b71bdb5af774da21a0165394d48cfba9b5" host="ci-4459-2-4-a-1c48930dd2"
Mar 3 13:36:24.385774 containerd[1639]: 2026-03-03 13:36:24.353 [INFO][4983] ipam/ipam_plugin.go 459: Released host-wide IPAM lock.
Mar 3 13:36:24.385774 containerd[1639]: 2026-03-03 13:36:24.353 [INFO][4983] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.119.72/26] IPv6=[] ContainerID="eef7830b74df4a79b8a8abead33dd3b71bdb5af774da21a0165394d48cfba9b5" HandleID="k8s-pod-network.eef7830b74df4a79b8a8abead33dd3b71bdb5af774da21a0165394d48cfba9b5" Workload="ci--4459--2--4--a--1c48930dd2-k8s-coredns--66bc5c9577--wx9bf-eth0"
Mar 3 13:36:24.386824 containerd[1639]: 2026-03-03 13:36:24.358 [INFO][4966] cni-plugin/k8s.go 418: Populated endpoint ContainerID="eef7830b74df4a79b8a8abead33dd3b71bdb5af774da21a0165394d48cfba9b5" Namespace="kube-system" Pod="coredns-66bc5c9577-wx9bf" WorkloadEndpoint="ci--4459--2--4--a--1c48930dd2-k8s-coredns--66bc5c9577--wx9bf-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--a--1c48930dd2-k8s-coredns--66bc5c9577--wx9bf-eth0", GenerateName:"coredns-66bc5c9577-", Namespace:"kube-system", SelfLink:"", UID:"24edf4b1-ec3a-4ab5-a696-e9721189d47d", ResourceVersion:"845", Generation:0, CreationTimestamp:time.Date(2026, time.March, 3, 13, 35, 45, 0, time.Local), DeletionTimestamp:<nil>, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"66bc5c9577", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-a-1c48930dd2", ContainerID:"", Pod:"coredns-66bc5c9577-wx9bf", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.119.72/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali4fc39ee3ed5", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}}
Mar 3 13:36:24.386824 containerd[1639]: 2026-03-03 13:36:24.359 [INFO][4966] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.119.72/32] ContainerID="eef7830b74df4a79b8a8abead33dd3b71bdb5af774da21a0165394d48cfba9b5" Namespace="kube-system" Pod="coredns-66bc5c9577-wx9bf" WorkloadEndpoint="ci--4459--2--4--a--1c48930dd2-k8s-coredns--66bc5c9577--wx9bf-eth0"
Mar 3 13:36:24.386824 containerd[1639]: 2026-03-03 13:36:24.359 [INFO][4966] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali4fc39ee3ed5 ContainerID="eef7830b74df4a79b8a8abead33dd3b71bdb5af774da21a0165394d48cfba9b5" Namespace="kube-system" Pod="coredns-66bc5c9577-wx9bf" WorkloadEndpoint="ci--4459--2--4--a--1c48930dd2-k8s-coredns--66bc5c9577--wx9bf-eth0"
Mar 3 13:36:24.386824 containerd[1639]: 2026-03-03 13:36:24.364 [INFO][4966] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="eef7830b74df4a79b8a8abead33dd3b71bdb5af774da21a0165394d48cfba9b5" Namespace="kube-system" Pod="coredns-66bc5c9577-wx9bf" WorkloadEndpoint="ci--4459--2--4--a--1c48930dd2-k8s-coredns--66bc5c9577--wx9bf-eth0"
Mar 3 13:36:24.386824 containerd[1639]: 2026-03-03 13:36:24.366 [INFO][4966] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="eef7830b74df4a79b8a8abead33dd3b71bdb5af774da21a0165394d48cfba9b5" Namespace="kube-system" Pod="coredns-66bc5c9577-wx9bf" WorkloadEndpoint="ci--4459--2--4--a--1c48930dd2-k8s-coredns--66bc5c9577--wx9bf-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--4--a--1c48930dd2-k8s-coredns--66bc5c9577--wx9bf-eth0", GenerateName:"coredns-66bc5c9577-", Namespace:"kube-system", SelfLink:"", UID:"24edf4b1-ec3a-4ab5-a696-e9721189d47d", ResourceVersion:"845", Generation:0, CreationTimestamp:time.Date(2026, time.March, 3, 13, 35, 45, 0, time.Local), DeletionTimestamp:<nil>, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"66bc5c9577", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-4-a-1c48930dd2", ContainerID:"eef7830b74df4a79b8a8abead33dd3b71bdb5af774da21a0165394d48cfba9b5", Pod:"coredns-66bc5c9577-wx9bf", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.119.72/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali4fc39ee3ed5", MAC:"ce:32:2c:dc:ff:92", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}}
Mar 3 13:36:24.387084 containerd[1639]: 2026-03-03 13:36:24.379 [INFO][4966] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="eef7830b74df4a79b8a8abead33dd3b71bdb5af774da21a0165394d48cfba9b5" Namespace="kube-system" Pod="coredns-66bc5c9577-wx9bf" WorkloadEndpoint="ci--4459--2--4--a--1c48930dd2-k8s-coredns--66bc5c9577--wx9bf-eth0"
Mar 3 13:36:24.438795 containerd[1639]: time="2026-03-03T13:36:24.438756532Z" level=info msg="connecting to shim eef7830b74df4a79b8a8abead33dd3b71bdb5af774da21a0165394d48cfba9b5" address="unix:///run/containerd/s/ac35b2be1ba354464905023c96d6a04c8151b77c148de513f78fe280ccac5bed" namespace=k8s.io protocol=ttrpc version=3
Mar 3 13:36:24.491097 containerd[1639]: time="2026-03-03T13:36:24.491052519Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-588cd44876-hb6b4,Uid:323b935e-0775-45cf-89a0-f43932ed5b64,Namespace:calico-system,Attempt:0,} returns sandbox id \"a5cee5da1579adc3c16dbd98f779def810be107ca31d43c9af49e065e48693d1\""
Mar 3 13:36:24.508821 systemd[1]: Started cri-containerd-eef7830b74df4a79b8a8abead33dd3b71bdb5af774da21a0165394d48cfba9b5.scope - libcontainer container eef7830b74df4a79b8a8abead33dd3b71bdb5af774da21a0165394d48cfba9b5.
Mar 3 13:36:24.573771 containerd[1639]: time="2026-03-03T13:36:24.573633272Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-wx9bf,Uid:24edf4b1-ec3a-4ab5-a696-e9721189d47d,Namespace:kube-system,Attempt:0,} returns sandbox id \"eef7830b74df4a79b8a8abead33dd3b71bdb5af774da21a0165394d48cfba9b5\""
Mar 3 13:36:24.590266 containerd[1639]: time="2026-03-03T13:36:24.590235593Z" level=info msg="CreateContainer within sandbox \"eef7830b74df4a79b8a8abead33dd3b71bdb5af774da21a0165394d48cfba9b5\" for container &ContainerMetadata{Name:coredns,Attempt:0,}"
Mar 3 13:36:24.599225 containerd[1639]: time="2026-03-03T13:36:24.598636707Z" level=info msg="Container 7447b47fcd6148f658aec1397320dcb7da39f55cdf4363e4eaf0b847f1b208c5: CDI devices from CRI Config.CDIDevices: []"
Mar 3 13:36:24.615194 containerd[1639]: time="2026-03-03T13:36:24.615155329Z" level=info msg="CreateContainer within sandbox \"eef7830b74df4a79b8a8abead33dd3b71bdb5af774da21a0165394d48cfba9b5\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"7447b47fcd6148f658aec1397320dcb7da39f55cdf4363e4eaf0b847f1b208c5\""
Mar 3 13:36:24.616039 containerd[1639]: time="2026-03-03T13:36:24.615625942Z" level=info msg="StartContainer for \"7447b47fcd6148f658aec1397320dcb7da39f55cdf4363e4eaf0b847f1b208c5\""
Mar 3 13:36:24.616916 containerd[1639]: time="2026-03-03T13:36:24.616834688Z" level=info msg="connecting to shim 7447b47fcd6148f658aec1397320dcb7da39f55cdf4363e4eaf0b847f1b208c5" address="unix:///run/containerd/s/ac35b2be1ba354464905023c96d6a04c8151b77c148de513f78fe280ccac5bed" protocol=ttrpc version=3
Mar 3 13:36:24.637895 systemd[1]: Started cri-containerd-7447b47fcd6148f658aec1397320dcb7da39f55cdf4363e4eaf0b847f1b208c5.scope - libcontainer container 7447b47fcd6148f658aec1397320dcb7da39f55cdf4363e4eaf0b847f1b208c5.
Mar 3 13:36:24.669789 containerd[1639]: time="2026-03-03T13:36:24.669758479Z" level=info msg="StartContainer for \"7447b47fcd6148f658aec1397320dcb7da39f55cdf4363e4eaf0b847f1b208c5\" returns successfully"
Mar 3 13:36:24.939308 containerd[1639]: time="2026-03-03T13:36:24.939255383Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 3 13:36:24.940042 containerd[1639]: time="2026-03-03T13:36:24.939896669Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.31.4: active requests=0, bytes read=55623386"
Mar 3 13:36:24.940627 containerd[1639]: time="2026-03-03T13:36:24.940602112Z" level=info msg="ImageCreate event name:\"sha256:714983e5e920bbe810fab04d9f06bd16ef4e552b0d2deffd7ab2b2c4a001acbb\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 3 13:36:24.942043 containerd[1639]: time="2026-03-03T13:36:24.942022584Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane@sha256:44395ca5ebfe88f21ed51acfbec5fc0f31d2762966e2007a0a2eb9b30e35fc4d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 3 13:36:24.942489 containerd[1639]: time="2026-03-03T13:36:24.942472746Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/goldmane:v3.31.4\" with image id \"sha256:714983e5e920bbe810fab04d9f06bd16ef4e552b0d2deffd7ab2b2c4a001acbb\", repo tag \"ghcr.io/flatcar/calico/goldmane:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/goldmane@sha256:44395ca5ebfe88f21ed51acfbec5fc0f31d2762966e2007a0a2eb9b30e35fc4d\", size \"55623232\" in 2.808201305s"
Mar 3 13:36:24.942540 containerd[1639]: time="2026-03-03T13:36:24.942531937Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.31.4\" returns image reference \"sha256:714983e5e920bbe810fab04d9f06bd16ef4e552b0d2deffd7ab2b2c4a001acbb\""
Mar 3 13:36:24.944297 containerd[1639]: time="2026-03-03T13:36:24.944105613Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\""
Mar 3 13:36:24.946182 containerd[1639]: time="2026-03-03T13:36:24.946163634Z" level=info msg="CreateContainer within sandbox \"6af89fc72473a95e8be077ae139eeb4e4ba5e3b2d992e03ba8c267665723cc5f\" for container &ContainerMetadata{Name:goldmane,Attempt:0,}"
Mar 3 13:36:24.951951 containerd[1639]: time="2026-03-03T13:36:24.951924783Z" level=info msg="Container 0af21e2e12c6733d690bb8591afcf2d5346a06683d6041412e4ba3b823fd4f0e: CDI devices from CRI Config.CDIDevices: []"
Mar 3 13:36:24.957456 containerd[1639]: time="2026-03-03T13:36:24.957434196Z" level=info msg="CreateContainer within sandbox \"6af89fc72473a95e8be077ae139eeb4e4ba5e3b2d992e03ba8c267665723cc5f\" for &ContainerMetadata{Name:goldmane,Attempt:0,} returns container id \"0af21e2e12c6733d690bb8591afcf2d5346a06683d6041412e4ba3b823fd4f0e\""
Mar 3 13:36:24.957877 containerd[1639]: time="2026-03-03T13:36:24.957864346Z" level=info msg="StartContainer for \"0af21e2e12c6733d690bb8591afcf2d5346a06683d6041412e4ba3b823fd4f0e\""
Mar 3 13:36:24.958654 containerd[1639]: time="2026-03-03T13:36:24.958613077Z" level=info msg="connecting to shim 0af21e2e12c6733d690bb8591afcf2d5346a06683d6041412e4ba3b823fd4f0e" address="unix:///run/containerd/s/8a7473faa6f7e8ba0a9be7575864572d51be424ce487f14a64b8d827003e9d62" protocol=ttrpc version=3
Mar 3 13:36:24.975887 systemd[1]: Started cri-containerd-0af21e2e12c6733d690bb8591afcf2d5346a06683d6041412e4ba3b823fd4f0e.scope - libcontainer container 0af21e2e12c6733d690bb8591afcf2d5346a06683d6041412e4ba3b823fd4f0e.
Mar 3 13:36:25.022776 containerd[1639]: time="2026-03-03T13:36:25.022732991Z" level=info msg="StartContainer for \"0af21e2e12c6733d690bb8591afcf2d5346a06683d6041412e4ba3b823fd4f0e\" returns successfully"
Mar 3 13:36:25.314395 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount56838914.mount: Deactivated successfully.
Mar 3 13:36:25.357275 kubelet[2796]: I0303 13:36:25.356709 2796 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-66bc5c9577-wx9bf" podStartSLOduration=40.356654145 podStartE2EDuration="40.356654145s" podCreationTimestamp="2026-03-03 13:35:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-03 13:36:25.355939789 +0000 UTC m=+48.336921683" watchObservedRunningTime="2026-03-03 13:36:25.356654145 +0000 UTC m=+48.337636039"
Mar 3 13:36:25.372306 kubelet[2796]: I0303 13:36:25.371500 2796 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/goldmane-cccfbd5cf-xvc2m" podStartSLOduration=27.841016329 podStartE2EDuration="31.371470655s" podCreationTimestamp="2026-03-03 13:35:54 +0000 UTC" firstStartedPulling="2026-03-03 13:36:21.41274127 +0000 UTC m=+44.393723134" lastFinishedPulling="2026-03-03 13:36:24.943195606 +0000 UTC m=+47.924177460" observedRunningTime="2026-03-03 13:36:25.367882325 +0000 UTC m=+48.348864179" watchObservedRunningTime="2026-03-03 13:36:25.371470655 +0000 UTC m=+48.352452549"
Mar 3 13:36:25.632907 systemd-networkd[1489]: calib396cebb693: Gained IPv6LL
Mar 3 13:36:26.401903 systemd-networkd[1489]: cali4fc39ee3ed5: Gained IPv6LL
Mar 3 13:36:27.047868 containerd[1639]: time="2026-03-03T13:36:27.047813650Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 3 13:36:27.048576 containerd[1639]: time="2026-03-03T13:36:27.048544200Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4: active requests=0, bytes read=14704317"
Mar 3 13:36:27.049378 containerd[1639]: time="2026-03-03T13:36:27.049326671Z" level=info msg="ImageCreate event name:\"sha256:d7aeb99114cbb6499e9048f43d3faa5f199d1a05ed44165e5974d0368ac32771\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 3 13:36:27.051098 containerd[1639]: time="2026-03-03T13:36:27.051078529Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:e41c0d73bcd33ff28ae2f2983cf781a4509d212e102d53883dbbf436ab3cd97d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 3 13:36:27.051630 containerd[1639]: time="2026-03-03T13:36:27.051468247Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\" with image id \"sha256:d7aeb99114cbb6499e9048f43d3faa5f199d1a05ed44165e5974d0368ac32771\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:e41c0d73bcd33ff28ae2f2983cf781a4509d212e102d53883dbbf436ab3cd97d\", size \"16260314\" in 2.107013677s"
Mar 3 13:36:27.051630 containerd[1639]: time="2026-03-03T13:36:27.051492935Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\" returns image reference \"sha256:d7aeb99114cbb6499e9048f43d3faa5f199d1a05ed44165e5974d0368ac32771\""
Mar 3 13:36:27.052715 containerd[1639]: time="2026-03-03T13:36:27.052701869Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.31.4\""
Mar 3 13:36:27.055033 containerd[1639]: time="2026-03-03T13:36:27.055001517Z" level=info msg="CreateContainer within sandbox \"92c99a05abddc311b5fb95b949b5442bb932e12639d3d2a55ef89656d6a8a743\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}"
Mar 3 13:36:27.064638 containerd[1639]: time="2026-03-03T13:36:27.063869013Z" level=info msg="Container db76583ffec096bd359c51c74f95f5698b4a17952420697f6e6e2f0174aa1b92: CDI
devices from CRI Config.CDIDevices: []" Mar 3 13:36:27.081614 containerd[1639]: time="2026-03-03T13:36:27.081581358Z" level=info msg="CreateContainer within sandbox \"92c99a05abddc311b5fb95b949b5442bb932e12639d3d2a55ef89656d6a8a743\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"db76583ffec096bd359c51c74f95f5698b4a17952420697f6e6e2f0174aa1b92\"" Mar 3 13:36:27.082328 containerd[1639]: time="2026-03-03T13:36:27.082277787Z" level=info msg="StartContainer for \"db76583ffec096bd359c51c74f95f5698b4a17952420697f6e6e2f0174aa1b92\"" Mar 3 13:36:27.083293 containerd[1639]: time="2026-03-03T13:36:27.083273855Z" level=info msg="connecting to shim db76583ffec096bd359c51c74f95f5698b4a17952420697f6e6e2f0174aa1b92" address="unix:///run/containerd/s/89c516469df0985547efec0fece60b7e204a48f5e04edcc8ba10100f98d15073" protocol=ttrpc version=3 Mar 3 13:36:27.101782 systemd[1]: Started cri-containerd-db76583ffec096bd359c51c74f95f5698b4a17952420697f6e6e2f0174aa1b92.scope - libcontainer container db76583ffec096bd359c51c74f95f5698b4a17952420697f6e6e2f0174aa1b92. 
Mar 3 13:36:27.180995 containerd[1639]: time="2026-03-03T13:36:27.180955886Z" level=info msg="StartContainer for \"db76583ffec096bd359c51c74f95f5698b4a17952420697f6e6e2f0174aa1b92\" returns successfully" Mar 3 13:36:27.354630 kubelet[2796]: I0303 13:36:27.354478 2796 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-dlz7j" podStartSLOduration=26.705703005 podStartE2EDuration="33.354453787s" podCreationTimestamp="2026-03-03 13:35:54 +0000 UTC" firstStartedPulling="2026-03-03 13:36:20.403426523 +0000 UTC m=+43.384408387" lastFinishedPulling="2026-03-03 13:36:27.052177305 +0000 UTC m=+50.033159169" observedRunningTime="2026-03-03 13:36:27.353836471 +0000 UTC m=+50.334818335" watchObservedRunningTime="2026-03-03 13:36:27.354453787 +0000 UTC m=+50.335435641" Mar 3 13:36:28.180337 kubelet[2796]: I0303 13:36:28.180291 2796 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 Mar 3 13:36:28.182411 kubelet[2796]: I0303 13:36:28.182361 2796 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock Mar 3 13:36:31.168356 containerd[1639]: time="2026-03-03T13:36:31.168297704Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 3 13:36:31.169186 containerd[1639]: time="2026-03-03T13:36:31.169025289Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.31.4: active requests=0, bytes read=48415780" Mar 3 13:36:31.169748 containerd[1639]: time="2026-03-03T13:36:31.169728517Z" level=info msg="ImageCreate event name:\"sha256:f7ff80340b9b4973ceda29859065985831588b2898f2b4009f742b5789010898\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 3 13:36:31.171505 containerd[1639]: 
time="2026-03-03T13:36:31.171431326Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:d212af1da3dd52a633bc9e36653a7d901d95a570f8d51d1968a837dcf6879730\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 3 13:36:31.171833 containerd[1639]: time="2026-03-03T13:36:31.171812500Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.31.4\" with image id \"sha256:f7ff80340b9b4973ceda29859065985831588b2898f2b4009f742b5789010898\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:d212af1da3dd52a633bc9e36653a7d901d95a570f8d51d1968a837dcf6879730\", size \"49971841\" in 4.118588399s" Mar 3 13:36:31.171868 containerd[1639]: time="2026-03-03T13:36:31.171835496Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.31.4\" returns image reference \"sha256:f7ff80340b9b4973ceda29859065985831588b2898f2b4009f742b5789010898\"" Mar 3 13:36:31.172787 containerd[1639]: time="2026-03-03T13:36:31.172774274Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\"" Mar 3 13:36:31.176445 containerd[1639]: time="2026-03-03T13:36:31.176153332Z" level=info msg="CreateContainer within sandbox \"bd6a810d210a5099c926b0b873c2825e6674c58e0b52bbf13821369d15b58c3e\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Mar 3 13:36:31.182841 containerd[1639]: time="2026-03-03T13:36:31.182736431Z" level=info msg="Container 3bee7ec1067ee961d66637b72b08a201b5aebd75b22883e3e14a9af70d9448fa: CDI devices from CRI Config.CDIDevices: []" Mar 3 13:36:31.186755 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1288387527.mount: Deactivated successfully. 
Mar 3 13:36:31.189872 containerd[1639]: time="2026-03-03T13:36:31.189846727Z" level=info msg="CreateContainer within sandbox \"bd6a810d210a5099c926b0b873c2825e6674c58e0b52bbf13821369d15b58c3e\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"3bee7ec1067ee961d66637b72b08a201b5aebd75b22883e3e14a9af70d9448fa\"" Mar 3 13:36:31.190471 containerd[1639]: time="2026-03-03T13:36:31.190452805Z" level=info msg="StartContainer for \"3bee7ec1067ee961d66637b72b08a201b5aebd75b22883e3e14a9af70d9448fa\"" Mar 3 13:36:31.192231 containerd[1639]: time="2026-03-03T13:36:31.192211299Z" level=info msg="connecting to shim 3bee7ec1067ee961d66637b72b08a201b5aebd75b22883e3e14a9af70d9448fa" address="unix:///run/containerd/s/38d1f4f6c8056481fd119389db95ac52f9c24b820276cdeb45dea28dc0180a9b" protocol=ttrpc version=3 Mar 3 13:36:31.213777 systemd[1]: Started cri-containerd-3bee7ec1067ee961d66637b72b08a201b5aebd75b22883e3e14a9af70d9448fa.scope - libcontainer container 3bee7ec1067ee961d66637b72b08a201b5aebd75b22883e3e14a9af70d9448fa. 
Mar 3 13:36:31.255763 containerd[1639]: time="2026-03-03T13:36:31.255646791Z" level=info msg="StartContainer for \"3bee7ec1067ee961d66637b72b08a201b5aebd75b22883e3e14a9af70d9448fa\" returns successfully" Mar 3 13:36:32.026412 kubelet[2796]: I0303 13:36:32.026164 2796 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-apiserver-588cd44876-x8bpf" podStartSLOduration=29.335327305 podStartE2EDuration="38.025932435s" podCreationTimestamp="2026-03-03 13:35:54 +0000 UTC" firstStartedPulling="2026-03-03 13:36:22.482053778 +0000 UTC m=+45.463035642" lastFinishedPulling="2026-03-03 13:36:31.172658918 +0000 UTC m=+54.153640772" observedRunningTime="2026-03-03 13:36:31.371330278 +0000 UTC m=+54.352312142" watchObservedRunningTime="2026-03-03 13:36:32.025932435 +0000 UTC m=+55.006914289" Mar 3 13:36:33.475359 containerd[1639]: time="2026-03-03T13:36:33.475240360Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 3 13:36:33.476241 containerd[1639]: time="2026-03-03T13:36:33.475871977Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.31.4: active requests=0, bytes read=52406348" Mar 3 13:36:33.476681 containerd[1639]: time="2026-03-03T13:36:33.476640124Z" level=info msg="ImageCreate event name:\"sha256:ff033cc89dab51090bfa1b04e155a5ce1e3b1f324f74b7b2be0dd6f0b6b10e89\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 3 13:36:33.478581 containerd[1639]: time="2026-03-03T13:36:33.478137317Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:99b8bb50141ca55b4b6ddfcf2f2fbde838265508ab2ac96ed08e72cd39800713\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 3 13:36:33.478581 containerd[1639]: time="2026-03-03T13:36:33.478489056Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\" with image id 
\"sha256:ff033cc89dab51090bfa1b04e155a5ce1e3b1f324f74b7b2be0dd6f0b6b10e89\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:99b8bb50141ca55b4b6ddfcf2f2fbde838265508ab2ac96ed08e72cd39800713\", size \"53962361\" in 2.305646307s" Mar 3 13:36:33.478581 containerd[1639]: time="2026-03-03T13:36:33.478509007Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\" returns image reference \"sha256:ff033cc89dab51090bfa1b04e155a5ce1e3b1f324f74b7b2be0dd6f0b6b10e89\"" Mar 3 13:36:33.479511 containerd[1639]: time="2026-03-03T13:36:33.479488216Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.31.4\"" Mar 3 13:36:33.492107 containerd[1639]: time="2026-03-03T13:36:33.492078701Z" level=info msg="CreateContainer within sandbox \"b6882a656dc49a7f8247047829031ebf39607d4c1c39cb0fa3a0ca82bcdf4ef4\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" Mar 3 13:36:33.497695 containerd[1639]: time="2026-03-03T13:36:33.496559213Z" level=info msg="Container 37fe581e2e3672ebc56c79a80ede820ca949f039f4175a3caeb80b1bac4d2cab: CDI devices from CRI Config.CDIDevices: []" Mar 3 13:36:33.504989 containerd[1639]: time="2026-03-03T13:36:33.504960412Z" level=info msg="CreateContainer within sandbox \"b6882a656dc49a7f8247047829031ebf39607d4c1c39cb0fa3a0ca82bcdf4ef4\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"37fe581e2e3672ebc56c79a80ede820ca949f039f4175a3caeb80b1bac4d2cab\"" Mar 3 13:36:33.506255 containerd[1639]: time="2026-03-03T13:36:33.505460398Z" level=info msg="StartContainer for \"37fe581e2e3672ebc56c79a80ede820ca949f039f4175a3caeb80b1bac4d2cab\"" Mar 3 13:36:33.506255 containerd[1639]: time="2026-03-03T13:36:33.506178438Z" level=info msg="connecting to shim 37fe581e2e3672ebc56c79a80ede820ca949f039f4175a3caeb80b1bac4d2cab" 
address="unix:///run/containerd/s/26f89c39b6348bd46f4f8de623e82f676cb0e9d1b5f1ce79548bc458eff98d76" protocol=ttrpc version=3 Mar 3 13:36:33.527784 systemd[1]: Started cri-containerd-37fe581e2e3672ebc56c79a80ede820ca949f039f4175a3caeb80b1bac4d2cab.scope - libcontainer container 37fe581e2e3672ebc56c79a80ede820ca949f039f4175a3caeb80b1bac4d2cab. Mar 3 13:36:33.565808 containerd[1639]: time="2026-03-03T13:36:33.565776743Z" level=info msg="StartContainer for \"37fe581e2e3672ebc56c79a80ede820ca949f039f4175a3caeb80b1bac4d2cab\" returns successfully" Mar 3 13:36:33.980297 containerd[1639]: time="2026-03-03T13:36:33.980216603Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 3 13:36:33.982547 containerd[1639]: time="2026-03-03T13:36:33.982319494Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.31.4: active requests=0, bytes read=77" Mar 3 13:36:33.986104 containerd[1639]: time="2026-03-03T13:36:33.986056108Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.31.4\" with image id \"sha256:f7ff80340b9b4973ceda29859065985831588b2898f2b4009f742b5789010898\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:d212af1da3dd52a633bc9e36653a7d901d95a570f8d51d1968a837dcf6879730\", size \"49971841\" in 506.540368ms" Mar 3 13:36:33.986104 containerd[1639]: time="2026-03-03T13:36:33.986101427Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.31.4\" returns image reference \"sha256:f7ff80340b9b4973ceda29859065985831588b2898f2b4009f742b5789010898\"" Mar 3 13:36:33.992608 containerd[1639]: time="2026-03-03T13:36:33.992233480Z" level=info msg="CreateContainer within sandbox \"a5cee5da1579adc3c16dbd98f779def810be107ca31d43c9af49e065e48693d1\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Mar 3 13:36:34.003984 containerd[1639]: 
time="2026-03-03T13:36:34.003919538Z" level=info msg="Container 52cb4ebe80a5598c3bac0518633fdbc50b481f749467337f05f7d84e53f4c98d: CDI devices from CRI Config.CDIDevices: []" Mar 3 13:36:34.017203 containerd[1639]: time="2026-03-03T13:36:34.017146673Z" level=info msg="CreateContainer within sandbox \"a5cee5da1579adc3c16dbd98f779def810be107ca31d43c9af49e065e48693d1\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"52cb4ebe80a5598c3bac0518633fdbc50b481f749467337f05f7d84e53f4c98d\"" Mar 3 13:36:34.017967 containerd[1639]: time="2026-03-03T13:36:34.017876010Z" level=info msg="StartContainer for \"52cb4ebe80a5598c3bac0518633fdbc50b481f749467337f05f7d84e53f4c98d\"" Mar 3 13:36:34.020784 containerd[1639]: time="2026-03-03T13:36:34.020623567Z" level=info msg="connecting to shim 52cb4ebe80a5598c3bac0518633fdbc50b481f749467337f05f7d84e53f4c98d" address="unix:///run/containerd/s/29468790aed36038fe9a650217cd404f8ead90f3127bf14bad8f7b4e6cc2cd3a" protocol=ttrpc version=3 Mar 3 13:36:34.050900 systemd[1]: Started cri-containerd-52cb4ebe80a5598c3bac0518633fdbc50b481f749467337f05f7d84e53f4c98d.scope - libcontainer container 52cb4ebe80a5598c3bac0518633fdbc50b481f749467337f05f7d84e53f4c98d. 
Mar 3 13:36:34.113481 containerd[1639]: time="2026-03-03T13:36:34.113389052Z" level=info msg="StartContainer for \"52cb4ebe80a5598c3bac0518633fdbc50b481f749467337f05f7d84e53f4c98d\" returns successfully" Mar 3 13:36:34.408682 kubelet[2796]: I0303 13:36:34.408449 2796 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-77c7bb8fcc-hh8jb" podStartSLOduration=29.523061183 podStartE2EDuration="40.408429189s" podCreationTimestamp="2026-03-03 13:35:54 +0000 UTC" firstStartedPulling="2026-03-03 13:36:22.593773219 +0000 UTC m=+45.574755083" lastFinishedPulling="2026-03-03 13:36:33.479141234 +0000 UTC m=+56.460123089" observedRunningTime="2026-03-03 13:36:34.408033844 +0000 UTC m=+57.389015738" watchObservedRunningTime="2026-03-03 13:36:34.408429189 +0000 UTC m=+57.389411093" Mar 3 13:36:34.409261 kubelet[2796]: I0303 13:36:34.409216 2796 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-apiserver-588cd44876-hb6b4" podStartSLOduration=30.91531963 podStartE2EDuration="40.409207361s" podCreationTimestamp="2026-03-03 13:35:54 +0000 UTC" firstStartedPulling="2026-03-03 13:36:24.493052591 +0000 UTC m=+47.474034445" lastFinishedPulling="2026-03-03 13:36:33.986940282 +0000 UTC m=+56.967922176" observedRunningTime="2026-03-03 13:36:34.39694272 +0000 UTC m=+57.377924614" watchObservedRunningTime="2026-03-03 13:36:34.409207361 +0000 UTC m=+57.390189245" Mar 3 13:37:09.605288 update_engine[1613]: I20260303 13:37:09.605209 1613 prefs.cc:52] certificate-report-to-send-update not present in /var/lib/update_engine/prefs Mar 3 13:37:09.606369 update_engine[1613]: I20260303 13:37:09.605288 1613 prefs.cc:52] certificate-report-to-send-download not present in /var/lib/update_engine/prefs Mar 3 13:37:09.606369 update_engine[1613]: I20260303 13:37:09.605726 1613 prefs.cc:52] aleph-version not present in /var/lib/update_engine/prefs Mar 3 13:37:09.607690 update_engine[1613]: I20260303 
13:37:09.606981 1613 omaha_request_params.cc:62] Current group set to stable Mar 3 13:37:09.607690 update_engine[1613]: I20260303 13:37:09.607139 1613 update_attempter.cc:499] Already updated boot flags. Skipping. Mar 3 13:37:09.607690 update_engine[1613]: I20260303 13:37:09.607154 1613 update_attempter.cc:643] Scheduling an action processor start. Mar 3 13:37:09.607690 update_engine[1613]: I20260303 13:37:09.607183 1613 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction Mar 3 13:37:09.607690 update_engine[1613]: I20260303 13:37:09.607231 1613 prefs.cc:52] previous-version not present in /var/lib/update_engine/prefs Mar 3 13:37:09.607690 update_engine[1613]: I20260303 13:37:09.607331 1613 omaha_request_action.cc:271] Posting an Omaha request to disabled Mar 3 13:37:09.607690 update_engine[1613]: I20260303 13:37:09.607367 1613 omaha_request_action.cc:272] Request: Mar 3 13:37:09.607690 update_engine[1613]: Mar 3 13:37:09.607690 update_engine[1613]: Mar 3 13:37:09.607690 update_engine[1613]: Mar 3 13:37:09.607690 update_engine[1613]: Mar 3 13:37:09.607690 update_engine[1613]: Mar 3 13:37:09.607690 update_engine[1613]: Mar 3 13:37:09.607690 update_engine[1613]: Mar 3 13:37:09.607690 update_engine[1613]: Mar 3 13:37:09.607690 update_engine[1613]: I20260303 13:37:09.607380 1613 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Mar 3 13:37:09.615925 locksmithd[1662]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_CHECKING_FOR_UPDATE" NewVersion=0.0.0 NewSize=0 Mar 3 13:37:09.618531 update_engine[1613]: I20260303 13:37:09.617630 1613 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Mar 3 13:37:09.618631 update_engine[1613]: I20260303 13:37:09.618584 1613 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. 
Mar 3 13:37:09.622958 update_engine[1613]: E20260303 13:37:09.622881 1613 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Mar 3 13:37:09.623041 update_engine[1613]: I20260303 13:37:09.623001 1613 libcurl_http_fetcher.cc:283] No HTTP response, retry 1 Mar 3 13:37:19.578473 update_engine[1613]: I20260303 13:37:19.578311 1613 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Mar 3 13:37:19.579363 update_engine[1613]: I20260303 13:37:19.579288 1613 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Mar 3 13:37:19.579967 update_engine[1613]: I20260303 13:37:19.579917 1613 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. Mar 3 13:37:19.580442 update_engine[1613]: E20260303 13:37:19.580392 1613 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Mar 3 13:37:19.580576 update_engine[1613]: I20260303 13:37:19.580492 1613 libcurl_http_fetcher.cc:283] No HTTP response, retry 2 Mar 3 13:37:28.279919 systemd[1]: Started sshd@7-89.167.115.149:22-20.161.92.111:52830.service - OpenSSH per-connection server daemon (20.161.92.111:52830). Mar 3 13:37:28.977574 sshd[5721]: Accepted publickey for core from 20.161.92.111 port 52830 ssh2: RSA SHA256:Z8/rK+Wv3rpYCX2pdoo3bSXSUfY9cXN+4if7niANqB4 Mar 3 13:37:28.980953 sshd-session[5721]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 3 13:37:28.991592 systemd-logind[1607]: New session 8 of user core. Mar 3 13:37:29.000005 systemd[1]: Started session-8.scope - Session 8 of User core. Mar 3 13:37:29.425852 sshd[5724]: Connection closed by 20.161.92.111 port 52830 Mar 3 13:37:29.426895 sshd-session[5721]: pam_unix(sshd:session): session closed for user core Mar 3 13:37:29.435024 systemd[1]: sshd@7-89.167.115.149:22-20.161.92.111:52830.service: Deactivated successfully. Mar 3 13:37:29.439548 systemd[1]: session-8.scope: Deactivated successfully. 
Mar 3 13:37:29.442366 systemd-logind[1607]: Session 8 logged out. Waiting for processes to exit. Mar 3 13:37:29.445575 systemd-logind[1607]: Removed session 8. Mar 3 13:37:29.569977 update_engine[1613]: I20260303 13:37:29.569875 1613 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Mar 3 13:37:29.570463 update_engine[1613]: I20260303 13:37:29.570011 1613 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Mar 3 13:37:29.570686 update_engine[1613]: I20260303 13:37:29.570626 1613 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. Mar 3 13:37:29.571134 update_engine[1613]: E20260303 13:37:29.571094 1613 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Mar 3 13:37:29.571230 update_engine[1613]: I20260303 13:37:29.571202 1613 libcurl_http_fetcher.cc:283] No HTTP response, retry 3 Mar 3 13:37:34.561298 systemd[1]: Started sshd@8-89.167.115.149:22-20.161.92.111:49368.service - OpenSSH per-connection server daemon (20.161.92.111:49368). Mar 3 13:37:35.218743 sshd[5738]: Accepted publickey for core from 20.161.92.111 port 49368 ssh2: RSA SHA256:Z8/rK+Wv3rpYCX2pdoo3bSXSUfY9cXN+4if7niANqB4 Mar 3 13:37:35.222168 sshd-session[5738]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 3 13:37:35.231104 systemd-logind[1607]: New session 9 of user core. Mar 3 13:37:35.239911 systemd[1]: Started session-9.scope - Session 9 of User core. Mar 3 13:37:35.640404 sshd[5741]: Connection closed by 20.161.92.111 port 49368 Mar 3 13:37:35.642087 sshd-session[5738]: pam_unix(sshd:session): session closed for user core Mar 3 13:37:35.649236 systemd[1]: sshd@8-89.167.115.149:22-20.161.92.111:49368.service: Deactivated successfully. Mar 3 13:37:35.654052 systemd[1]: session-9.scope: Deactivated successfully. Mar 3 13:37:35.655937 systemd-logind[1607]: Session 9 logged out. Waiting for processes to exit. Mar 3 13:37:35.659884 systemd-logind[1607]: Removed session 9. 
Mar 3 13:37:39.572103 update_engine[1613]: I20260303 13:37:39.571971 1613 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Mar 3 13:37:39.572103 update_engine[1613]: I20260303 13:37:39.572113 1613 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Mar 3 13:37:39.573245 update_engine[1613]: I20260303 13:37:39.572877 1613 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. Mar 3 13:37:39.573316 update_engine[1613]: E20260303 13:37:39.573253 1613 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Mar 3 13:37:39.573408 update_engine[1613]: I20260303 13:37:39.573359 1613 libcurl_http_fetcher.cc:297] Transfer resulted in an error (0), 0 bytes downloaded Mar 3 13:37:39.573408 update_engine[1613]: I20260303 13:37:39.573376 1613 omaha_request_action.cc:617] Omaha request response: Mar 3 13:37:39.573574 update_engine[1613]: E20260303 13:37:39.573537 1613 omaha_request_action.cc:636] Omaha request network transfer failed. Mar 3 13:37:39.573805 update_engine[1613]: I20260303 13:37:39.573738 1613 action_processor.cc:68] ActionProcessor::ActionComplete: OmahaRequestAction action failed. Aborting processing. Mar 3 13:37:39.573805 update_engine[1613]: I20260303 13:37:39.573763 1613 action_processor.cc:73] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction Mar 3 13:37:39.573805 update_engine[1613]: I20260303 13:37:39.573778 1613 update_attempter.cc:306] Processing Done. Mar 3 13:37:39.573805 update_engine[1613]: E20260303 13:37:39.573804 1613 update_attempter.cc:619] Update failed. 
Mar 3 13:37:39.574513 update_engine[1613]: I20260303 13:37:39.573818 1613 utils.cc:600] Converting error code 2000 to kActionCodeOmahaErrorInHTTPResponse Mar 3 13:37:39.574513 update_engine[1613]: I20260303 13:37:39.573832 1613 payload_state.cc:97] Updating payload state for error code: 37 (kActionCodeOmahaErrorInHTTPResponse) Mar 3 13:37:39.574513 update_engine[1613]: I20260303 13:37:39.573844 1613 payload_state.cc:103] Ignoring failures until we get a valid Omaha response. Mar 3 13:37:39.574513 update_engine[1613]: I20260303 13:37:39.573956 1613 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction Mar 3 13:37:39.574513 update_engine[1613]: I20260303 13:37:39.573987 1613 omaha_request_action.cc:271] Posting an Omaha request to disabled Mar 3 13:37:39.574513 update_engine[1613]: I20260303 13:37:39.574011 1613 omaha_request_action.cc:272] Request: Mar 3 13:37:39.574513 update_engine[1613]: Mar 3 13:37:39.574513 update_engine[1613]: Mar 3 13:37:39.574513 update_engine[1613]: Mar 3 13:37:39.574513 update_engine[1613]: Mar 3 13:37:39.574513 update_engine[1613]: Mar 3 13:37:39.574513 update_engine[1613]: Mar 3 13:37:39.574513 update_engine[1613]: I20260303 13:37:39.574025 1613 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Mar 3 13:37:39.574513 update_engine[1613]: I20260303 13:37:39.574059 1613 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Mar 3 13:37:39.576122 update_engine[1613]: I20260303 13:37:39.574571 1613 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. 
Mar 3 13:37:39.576122 update_engine[1613]: E20260303 13:37:39.574934 1613 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Mar 3 13:37:39.576122 update_engine[1613]: I20260303 13:37:39.575038 1613 libcurl_http_fetcher.cc:297] Transfer resulted in an error (0), 0 bytes downloaded Mar 3 13:37:39.576122 update_engine[1613]: I20260303 13:37:39.575068 1613 omaha_request_action.cc:617] Omaha request response: Mar 3 13:37:39.576122 update_engine[1613]: I20260303 13:37:39.575088 1613 action_processor.cc:65] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction Mar 3 13:37:39.576122 update_engine[1613]: I20260303 13:37:39.575105 1613 action_processor.cc:73] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction Mar 3 13:37:39.576122 update_engine[1613]: I20260303 13:37:39.575121 1613 update_attempter.cc:306] Processing Done. Mar 3 13:37:39.576122 update_engine[1613]: I20260303 13:37:39.575134 1613 update_attempter.cc:310] Error event sent. Mar 3 13:37:39.576122 update_engine[1613]: I20260303 13:37:39.575151 1613 update_check_scheduler.cc:74] Next update check in 43m52s Mar 3 13:37:39.576929 locksmithd[1662]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_REPORTING_ERROR_EVENT" NewVersion=0.0.0 NewSize=0 Mar 3 13:37:39.577553 locksmithd[1662]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_IDLE" NewVersion=0.0.0 NewSize=0 Mar 3 13:37:40.778380 systemd[1]: Started sshd@9-89.167.115.149:22-20.161.92.111:58322.service - OpenSSH per-connection server daemon (20.161.92.111:58322). Mar 3 13:37:41.455124 sshd[5787]: Accepted publickey for core from 20.161.92.111 port 58322 ssh2: RSA SHA256:Z8/rK+Wv3rpYCX2pdoo3bSXSUfY9cXN+4if7niANqB4 Mar 3 13:37:41.458156 sshd-session[5787]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 3 13:37:41.465737 systemd-logind[1607]: New session 10 of user core. 
Mar 3 13:37:41.468814 systemd[1]: Started session-10.scope - Session 10 of User core. Mar 3 13:37:41.921055 sshd[5790]: Connection closed by 20.161.92.111 port 58322 Mar 3 13:37:41.922067 sshd-session[5787]: pam_unix(sshd:session): session closed for user core Mar 3 13:37:41.931557 systemd[1]: sshd@9-89.167.115.149:22-20.161.92.111:58322.service: Deactivated successfully. Mar 3 13:37:41.936217 systemd[1]: session-10.scope: Deactivated successfully. Mar 3 13:37:41.938709 systemd-logind[1607]: Session 10 logged out. Waiting for processes to exit. Mar 3 13:37:41.942242 systemd-logind[1607]: Removed session 10. Mar 3 13:37:47.066002 systemd[1]: Started sshd@10-89.167.115.149:22-20.161.92.111:58326.service - OpenSSH per-connection server daemon (20.161.92.111:58326). Mar 3 13:37:47.741490 sshd[5875]: Accepted publickey for core from 20.161.92.111 port 58326 ssh2: RSA SHA256:Z8/rK+Wv3rpYCX2pdoo3bSXSUfY9cXN+4if7niANqB4 Mar 3 13:37:47.747832 sshd-session[5875]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 3 13:37:47.759086 systemd-logind[1607]: New session 11 of user core. Mar 3 13:37:47.765857 systemd[1]: Started session-11.scope - Session 11 of User core. Mar 3 13:37:48.172858 sshd[5882]: Connection closed by 20.161.92.111 port 58326 Mar 3 13:37:48.175384 sshd-session[5875]: pam_unix(sshd:session): session closed for user core Mar 3 13:37:48.182818 systemd[1]: sshd@10-89.167.115.149:22-20.161.92.111:58326.service: Deactivated successfully. Mar 3 13:37:48.187326 systemd[1]: session-11.scope: Deactivated successfully. Mar 3 13:37:48.189364 systemd-logind[1607]: Session 11 logged out. Waiting for processes to exit. Mar 3 13:37:48.193393 systemd-logind[1607]: Removed session 11. Mar 3 13:37:48.309738 systemd[1]: Started sshd@11-89.167.115.149:22-20.161.92.111:58342.service - OpenSSH per-connection server daemon (20.161.92.111:58342). 
Mar 3 13:37:48.987336 sshd[5895]: Accepted publickey for core from 20.161.92.111 port 58342 ssh2: RSA SHA256:Z8/rK+Wv3rpYCX2pdoo3bSXSUfY9cXN+4if7niANqB4 Mar 3 13:37:48.990125 sshd-session[5895]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 3 13:37:48.999152 systemd-logind[1607]: New session 12 of user core. Mar 3 13:37:49.007914 systemd[1]: Started session-12.scope - Session 12 of User core. Mar 3 13:37:49.471774 sshd[5898]: Connection closed by 20.161.92.111 port 58342 Mar 3 13:37:49.474089 sshd-session[5895]: pam_unix(sshd:session): session closed for user core Mar 3 13:37:49.483148 systemd[1]: sshd@11-89.167.115.149:22-20.161.92.111:58342.service: Deactivated successfully. Mar 3 13:37:49.488414 systemd[1]: session-12.scope: Deactivated successfully. Mar 3 13:37:49.490293 systemd-logind[1607]: Session 12 logged out. Waiting for processes to exit. Mar 3 13:37:49.493810 systemd-logind[1607]: Removed session 12. Mar 3 13:37:49.601831 systemd[1]: Started sshd@12-89.167.115.149:22-20.161.92.111:58354.service - OpenSSH per-connection server daemon (20.161.92.111:58354). Mar 3 13:37:50.251607 sshd[5907]: Accepted publickey for core from 20.161.92.111 port 58354 ssh2: RSA SHA256:Z8/rK+Wv3rpYCX2pdoo3bSXSUfY9cXN+4if7niANqB4 Mar 3 13:37:50.254469 sshd-session[5907]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 3 13:37:50.263770 systemd-logind[1607]: New session 13 of user core. Mar 3 13:37:50.273977 systemd[1]: Started session-13.scope - Session 13 of User core. Mar 3 13:37:50.706430 sshd[5933]: Connection closed by 20.161.92.111 port 58354 Mar 3 13:37:50.707869 sshd-session[5907]: pam_unix(sshd:session): session closed for user core Mar 3 13:37:50.715357 systemd[1]: sshd@12-89.167.115.149:22-20.161.92.111:58354.service: Deactivated successfully. Mar 3 13:37:50.720349 systemd[1]: session-13.scope: Deactivated successfully. Mar 3 13:37:50.725616 systemd-logind[1607]: Session 13 logged out. 
Waiting for processes to exit. Mar 3 13:37:50.727537 systemd-logind[1607]: Removed session 13. Mar 3 13:37:55.845118 systemd[1]: Started sshd@13-89.167.115.149:22-20.161.92.111:45270.service - OpenSSH per-connection server daemon (20.161.92.111:45270). Mar 3 13:37:56.497036 sshd[5975]: Accepted publickey for core from 20.161.92.111 port 45270 ssh2: RSA SHA256:Z8/rK+Wv3rpYCX2pdoo3bSXSUfY9cXN+4if7niANqB4 Mar 3 13:37:56.499417 sshd-session[5975]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 3 13:37:56.504735 systemd-logind[1607]: New session 14 of user core. Mar 3 13:37:56.508786 systemd[1]: Started session-14.scope - Session 14 of User core. Mar 3 13:37:56.953228 sshd[5978]: Connection closed by 20.161.92.111 port 45270 Mar 3 13:37:56.955014 sshd-session[5975]: pam_unix(sshd:session): session closed for user core Mar 3 13:37:56.961642 systemd[1]: sshd@13-89.167.115.149:22-20.161.92.111:45270.service: Deactivated successfully. Mar 3 13:37:56.965776 systemd[1]: session-14.scope: Deactivated successfully. Mar 3 13:37:56.967734 systemd-logind[1607]: Session 14 logged out. Waiting for processes to exit. Mar 3 13:37:56.970890 systemd-logind[1607]: Removed session 14. Mar 3 13:37:57.092063 systemd[1]: Started sshd@14-89.167.115.149:22-20.161.92.111:45274.service - OpenSSH per-connection server daemon (20.161.92.111:45274). Mar 3 13:37:57.746365 sshd[5990]: Accepted publickey for core from 20.161.92.111 port 45274 ssh2: RSA SHA256:Z8/rK+Wv3rpYCX2pdoo3bSXSUfY9cXN+4if7niANqB4 Mar 3 13:37:57.750210 sshd-session[5990]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 3 13:37:57.762268 systemd-logind[1607]: New session 15 of user core. Mar 3 13:37:57.769173 systemd[1]: Started session-15.scope - Session 15 of User core. 
Mar 3 13:37:58.325258 sshd[6015]: Connection closed by 20.161.92.111 port 45274
Mar 3 13:37:58.325885 sshd-session[5990]: pam_unix(sshd:session): session closed for user core
Mar 3 13:37:58.332449 systemd[1]: sshd@14-89.167.115.149:22-20.161.92.111:45274.service: Deactivated successfully.
Mar 3 13:37:58.334474 systemd[1]: session-15.scope: Deactivated successfully.
Mar 3 13:37:58.335300 systemd-logind[1607]: Session 15 logged out. Waiting for processes to exit.
Mar 3 13:37:58.337207 systemd-logind[1607]: Removed session 15.
Mar 3 13:37:58.461359 systemd[1]: Started sshd@15-89.167.115.149:22-20.161.92.111:45278.service - OpenSSH per-connection server daemon (20.161.92.111:45278).
Mar 3 13:37:59.135933 sshd[6026]: Accepted publickey for core from 20.161.92.111 port 45278 ssh2: RSA SHA256:Z8/rK+Wv3rpYCX2pdoo3bSXSUfY9cXN+4if7niANqB4
Mar 3 13:37:59.139098 sshd-session[6026]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 3 13:37:59.148765 systemd-logind[1607]: New session 16 of user core.
Mar 3 13:37:59.162895 systemd[1]: Started session-16.scope - Session 16 of User core.
Mar 3 13:38:00.200378 sshd[6029]: Connection closed by 20.161.92.111 port 45278
Mar 3 13:38:00.202657 sshd-session[6026]: pam_unix(sshd:session): session closed for user core
Mar 3 13:38:00.211299 systemd[1]: sshd@15-89.167.115.149:22-20.161.92.111:45278.service: Deactivated successfully.
Mar 3 13:38:00.216919 systemd[1]: session-16.scope: Deactivated successfully.
Mar 3 13:38:00.219625 systemd-logind[1607]: Session 16 logged out. Waiting for processes to exit.
Mar 3 13:38:00.222157 systemd-logind[1607]: Removed session 16.
Mar 3 13:38:00.339410 systemd[1]: Started sshd@16-89.167.115.149:22-20.161.92.111:60994.service - OpenSSH per-connection server daemon (20.161.92.111:60994).
Mar 3 13:38:01.012020 sshd[6052]: Accepted publickey for core from 20.161.92.111 port 60994 ssh2: RSA SHA256:Z8/rK+Wv3rpYCX2pdoo3bSXSUfY9cXN+4if7niANqB4
Mar 3 13:38:01.015234 sshd-session[6052]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 3 13:38:01.025433 systemd-logind[1607]: New session 17 of user core.
Mar 3 13:38:01.030937 systemd[1]: Started session-17.scope - Session 17 of User core.
Mar 3 13:38:01.564113 sshd[6055]: Connection closed by 20.161.92.111 port 60994
Mar 3 13:38:01.564935 sshd-session[6052]: pam_unix(sshd:session): session closed for user core
Mar 3 13:38:01.569312 systemd[1]: sshd@16-89.167.115.149:22-20.161.92.111:60994.service: Deactivated successfully.
Mar 3 13:38:01.571610 systemd[1]: session-17.scope: Deactivated successfully.
Mar 3 13:38:01.572618 systemd-logind[1607]: Session 17 logged out. Waiting for processes to exit.
Mar 3 13:38:01.574599 systemd-logind[1607]: Removed session 17.
Mar 3 13:38:01.703157 systemd[1]: Started sshd@17-89.167.115.149:22-20.161.92.111:32774.service - OpenSSH per-connection server daemon (20.161.92.111:32774).
Mar 3 13:38:02.345236 sshd[6066]: Accepted publickey for core from 20.161.92.111 port 32774 ssh2: RSA SHA256:Z8/rK+Wv3rpYCX2pdoo3bSXSUfY9cXN+4if7niANqB4
Mar 3 13:38:02.348215 sshd-session[6066]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 3 13:38:02.357807 systemd-logind[1607]: New session 18 of user core.
Mar 3 13:38:02.363943 systemd[1]: Started session-18.scope - Session 18 of User core.
Mar 3 13:38:02.772648 sshd[6069]: Connection closed by 20.161.92.111 port 32774
Mar 3 13:38:02.773856 sshd-session[6066]: pam_unix(sshd:session): session closed for user core
Mar 3 13:38:02.784431 systemd-logind[1607]: Session 18 logged out. Waiting for processes to exit.
Mar 3 13:38:02.786009 systemd[1]: sshd@17-89.167.115.149:22-20.161.92.111:32774.service: Deactivated successfully.
Mar 3 13:38:02.791814 systemd[1]: session-18.scope: Deactivated successfully.
Mar 3 13:38:02.794529 systemd-logind[1607]: Removed session 18.
Mar 3 13:38:07.902851 systemd[1]: Started sshd@18-89.167.115.149:22-20.161.92.111:32790.service - OpenSSH per-connection server daemon (20.161.92.111:32790).
Mar 3 13:38:08.557817 sshd[6105]: Accepted publickey for core from 20.161.92.111 port 32790 ssh2: RSA SHA256:Z8/rK+Wv3rpYCX2pdoo3bSXSUfY9cXN+4if7niANqB4
Mar 3 13:38:08.559945 sshd-session[6105]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 3 13:38:08.566785 systemd-logind[1607]: New session 19 of user core.
Mar 3 13:38:08.569812 systemd[1]: Started session-19.scope - Session 19 of User core.
Mar 3 13:38:09.002892 sshd[6108]: Connection closed by 20.161.92.111 port 32790
Mar 3 13:38:09.003866 sshd-session[6105]: pam_unix(sshd:session): session closed for user core
Mar 3 13:38:09.007381 systemd[1]: sshd@18-89.167.115.149:22-20.161.92.111:32790.service: Deactivated successfully.
Mar 3 13:38:09.010136 systemd[1]: session-19.scope: Deactivated successfully.
Mar 3 13:38:09.011464 systemd-logind[1607]: Session 19 logged out. Waiting for processes to exit.
Mar 3 13:38:09.013226 systemd-logind[1607]: Removed session 19.
Mar 3 13:38:14.144789 systemd[1]: Started sshd@19-89.167.115.149:22-20.161.92.111:44602.service - OpenSSH per-connection server daemon (20.161.92.111:44602).
Mar 3 13:38:14.810983 sshd[6120]: Accepted publickey for core from 20.161.92.111 port 44602 ssh2: RSA SHA256:Z8/rK+Wv3rpYCX2pdoo3bSXSUfY9cXN+4if7niANqB4
Mar 3 13:38:14.813712 sshd-session[6120]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 3 13:38:14.821846 systemd-logind[1607]: New session 20 of user core.
Mar 3 13:38:14.827904 systemd[1]: Started session-20.scope - Session 20 of User core.
Mar 3 13:38:15.261621 sshd[6146]: Connection closed by 20.161.92.111 port 44602
Mar 3 13:38:15.263419 sshd-session[6120]: pam_unix(sshd:session): session closed for user core
Mar 3 13:38:15.273997 systemd[1]: sshd@19-89.167.115.149:22-20.161.92.111:44602.service: Deactivated successfully.
Mar 3 13:38:15.277351 systemd[1]: session-20.scope: Deactivated successfully.
Mar 3 13:38:15.278901 systemd-logind[1607]: Session 20 logged out. Waiting for processes to exit.
Mar 3 13:38:15.281754 systemd-logind[1607]: Removed session 20.
Mar 3 13:38:33.295355 systemd[1]: cri-containerd-7160af87749acd009d42b59a3b89da290eb5c5a4ee53bedea1892a5d5b535456.scope: Deactivated successfully.
Mar 3 13:38:33.297460 containerd[1639]: time="2026-03-03T13:38:33.297219031Z" level=info msg="received container exit event container_id:\"7160af87749acd009d42b59a3b89da290eb5c5a4ee53bedea1892a5d5b535456\" id:\"7160af87749acd009d42b59a3b89da290eb5c5a4ee53bedea1892a5d5b535456\" pid:2649 exit_status:1 exited_at:{seconds:1772545113 nanos:296967649}"
Mar 3 13:38:33.297722 systemd[1]: cri-containerd-7160af87749acd009d42b59a3b89da290eb5c5a4ee53bedea1892a5d5b535456.scope: Consumed 3.312s CPU time, 59.4M memory peak, 64K read from disk.
Mar 3 13:38:33.337169 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-7160af87749acd009d42b59a3b89da290eb5c5a4ee53bedea1892a5d5b535456-rootfs.mount: Deactivated successfully.
Mar 3 13:38:33.507849 kubelet[2796]: E0303 13:38:33.507802 2796 controller.go:195] "Failed to update lease" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.0.3:59746->10.0.0.2:2379: read: connection timed out"
Mar 3 13:38:33.724045 kubelet[2796]: I0303 13:38:33.723877 2796 scope.go:117] "RemoveContainer" containerID="7160af87749acd009d42b59a3b89da290eb5c5a4ee53bedea1892a5d5b535456"
Mar 3 13:38:33.726574 containerd[1639]: time="2026-03-03T13:38:33.726511243Z" level=info msg="CreateContainer within sandbox \"bae0e1b80809613d69ac08e65adf27fbcbf4bede06c58d4bf58debd0cf34226c\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:1,}"
Mar 3 13:38:33.739585 containerd[1639]: time="2026-03-03T13:38:33.739534980Z" level=info msg="Container 3095bcdfc4a8ae0c14b0b89b82560fab98eb6b9473050b2e4501264400426a94: CDI devices from CRI Config.CDIDevices: []"
Mar 3 13:38:33.760147 containerd[1639]: time="2026-03-03T13:38:33.760081106Z" level=info msg="CreateContainer within sandbox \"bae0e1b80809613d69ac08e65adf27fbcbf4bede06c58d4bf58debd0cf34226c\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:1,} returns container id \"3095bcdfc4a8ae0c14b0b89b82560fab98eb6b9473050b2e4501264400426a94\""
Mar 3 13:38:33.761328 containerd[1639]: time="2026-03-03T13:38:33.761165764Z" level=info msg="StartContainer for \"3095bcdfc4a8ae0c14b0b89b82560fab98eb6b9473050b2e4501264400426a94\""
Mar 3 13:38:33.763915 containerd[1639]: time="2026-03-03T13:38:33.763865385Z" level=info msg="connecting to shim 3095bcdfc4a8ae0c14b0b89b82560fab98eb6b9473050b2e4501264400426a94" address="unix:///run/containerd/s/da835e90a6c7b98e41199e5d60a316b15f9188e5ec23363b39057b5a3c023d84" protocol=ttrpc version=3
Mar 3 13:38:33.779295 systemd[1]: cri-containerd-44e36430d6fa05c64b2e269cc16823a3cc5535921b75e22cf2f67a4a3feb29c1.scope: Deactivated successfully.
Mar 3 13:38:33.779825 systemd[1]: cri-containerd-44e36430d6fa05c64b2e269cc16823a3cc5535921b75e22cf2f67a4a3feb29c1.scope: Consumed 7.055s CPU time, 129.3M memory peak, 640K read from disk.
Mar 3 13:38:33.782146 containerd[1639]: time="2026-03-03T13:38:33.782073379Z" level=info msg="received container exit event container_id:\"44e36430d6fa05c64b2e269cc16823a3cc5535921b75e22cf2f67a4a3feb29c1\" id:\"44e36430d6fa05c64b2e269cc16823a3cc5535921b75e22cf2f67a4a3feb29c1\" pid:3124 exit_status:1 exited_at:{seconds:1772545113 nanos:778926437}"
Mar 3 13:38:33.791781 systemd[1]: Started cri-containerd-3095bcdfc4a8ae0c14b0b89b82560fab98eb6b9473050b2e4501264400426a94.scope - libcontainer container 3095bcdfc4a8ae0c14b0b89b82560fab98eb6b9473050b2e4501264400426a94.
Mar 3 13:38:33.809600 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-44e36430d6fa05c64b2e269cc16823a3cc5535921b75e22cf2f67a4a3feb29c1-rootfs.mount: Deactivated successfully.
Mar 3 13:38:33.845233 containerd[1639]: time="2026-03-03T13:38:33.845206937Z" level=info msg="StartContainer for \"3095bcdfc4a8ae0c14b0b89b82560fab98eb6b9473050b2e4501264400426a94\" returns successfully"
Mar 3 13:38:34.728095 kubelet[2796]: I0303 13:38:34.728068 2796 scope.go:117] "RemoveContainer" containerID="44e36430d6fa05c64b2e269cc16823a3cc5535921b75e22cf2f67a4a3feb29c1"
Mar 3 13:38:34.730166 containerd[1639]: time="2026-03-03T13:38:34.730142246Z" level=info msg="CreateContainer within sandbox \"bddd77a6814877316ecf8b704a1d5101a2d4b862e2c87b9918d978594860a419\" for container &ContainerMetadata{Name:tigera-operator,Attempt:1,}"
Mar 3 13:38:34.742167 containerd[1639]: time="2026-03-03T13:38:34.741817965Z" level=info msg="Container 920f91ad52500bf7f1d5028ccd43908416bd9105ef7413c8d06b5c7a12aad660: CDI devices from CRI Config.CDIDevices: []"
Mar 3 13:38:34.748652 containerd[1639]: time="2026-03-03T13:38:34.748622203Z" level=info msg="CreateContainer within sandbox \"bddd77a6814877316ecf8b704a1d5101a2d4b862e2c87b9918d978594860a419\" for &ContainerMetadata{Name:tigera-operator,Attempt:1,} returns container id \"920f91ad52500bf7f1d5028ccd43908416bd9105ef7413c8d06b5c7a12aad660\""
Mar 3 13:38:34.749037 containerd[1639]: time="2026-03-03T13:38:34.749016332Z" level=info msg="StartContainer for \"920f91ad52500bf7f1d5028ccd43908416bd9105ef7413c8d06b5c7a12aad660\""
Mar 3 13:38:34.749656 containerd[1639]: time="2026-03-03T13:38:34.749636164Z" level=info msg="connecting to shim 920f91ad52500bf7f1d5028ccd43908416bd9105ef7413c8d06b5c7a12aad660" address="unix:///run/containerd/s/25855f899f239c4c17e2de247b27cea654382418c9363eca74f6409bd5e50b23" protocol=ttrpc version=3
Mar 3 13:38:34.764772 systemd[1]: Started cri-containerd-920f91ad52500bf7f1d5028ccd43908416bd9105ef7413c8d06b5c7a12aad660.scope - libcontainer container 920f91ad52500bf7f1d5028ccd43908416bd9105ef7413c8d06b5c7a12aad660.
Mar 3 13:38:34.790413 containerd[1639]: time="2026-03-03T13:38:34.790362191Z" level=info msg="StartContainer for \"920f91ad52500bf7f1d5028ccd43908416bd9105ef7413c8d06b5c7a12aad660\" returns successfully"
Mar 3 13:38:37.603812 kubelet[2796]: E0303 13:38:37.601825 2796 event.go:359] "Server rejected event (will not retry!)" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.0.3:59406->10.0.0.2:2379: read: connection timed out" event="&Event{ObjectMeta:{kube-apiserver-ci-4459-2-4-a-1c48930dd2.1899586234c26576 kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:kube-apiserver-ci-4459-2-4-a-1c48930dd2,UID:c935c4a8a9f6f7d0099370a1fb38915e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Readiness probe failed: HTTP probe failed with statuscode: 500,Source:EventSource{Component:kubelet,Host:ci-4459-2-4-a-1c48930dd2,},FirstTimestamp:2026-03-03 13:38:27.162195318 +0000 UTC m=+170.143177172,LastTimestamp:2026-03-03 13:38:27.162195318 +0000 UTC m=+170.143177172,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4459-2-4-a-1c48930dd2,}"
Mar 3 13:38:38.077309 systemd[1]: cri-containerd-8b4375371a18044a6cf1841dbb85c8ad936f2468a3360009604b6af8aa3db15a.scope: Deactivated successfully.
Mar 3 13:38:38.078378 systemd[1]: cri-containerd-8b4375371a18044a6cf1841dbb85c8ad936f2468a3360009604b6af8aa3db15a.scope: Consumed 2.622s CPU time, 21.5M memory peak.
Mar 3 13:38:38.081068 containerd[1639]: time="2026-03-03T13:38:38.080834868Z" level=info msg="received container exit event container_id:\"8b4375371a18044a6cf1841dbb85c8ad936f2468a3360009604b6af8aa3db15a\" id:\"8b4375371a18044a6cf1841dbb85c8ad936f2468a3360009604b6af8aa3db15a\" pid:2642 exit_status:1 exited_at:{seconds:1772545118 nanos:80446147}"
Mar 3 13:38:38.118362 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-8b4375371a18044a6cf1841dbb85c8ad936f2468a3360009604b6af8aa3db15a-rootfs.mount: Deactivated successfully.
Mar 3 13:38:38.749522 kubelet[2796]: I0303 13:38:38.749460 2796 scope.go:117] "RemoveContainer" containerID="8b4375371a18044a6cf1841dbb85c8ad936f2468a3360009604b6af8aa3db15a"
Mar 3 13:38:38.752434 containerd[1639]: time="2026-03-03T13:38:38.752329363Z" level=info msg="CreateContainer within sandbox \"8160feff6e460e0c42ef5320152913d449f11d7b7e54a08c3dc71d1a475ef7a5\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:1,}"
Mar 3 13:38:38.770355 containerd[1639]: time="2026-03-03T13:38:38.767516780Z" level=info msg="Container 04892c98672a2ea508f647179e943c728feabddde3098f60f5667773162ef138: CDI devices from CRI Config.CDIDevices: []"
Mar 3 13:38:38.781043 containerd[1639]: time="2026-03-03T13:38:38.780975230Z" level=info msg="CreateContainer within sandbox \"8160feff6e460e0c42ef5320152913d449f11d7b7e54a08c3dc71d1a475ef7a5\" for &ContainerMetadata{Name:kube-scheduler,Attempt:1,} returns container id \"04892c98672a2ea508f647179e943c728feabddde3098f60f5667773162ef138\""
Mar 3 13:38:38.782833 containerd[1639]: time="2026-03-03T13:38:38.782757978Z" level=info msg="StartContainer for \"04892c98672a2ea508f647179e943c728feabddde3098f60f5667773162ef138\""
Mar 3 13:38:38.785466 containerd[1639]: time="2026-03-03T13:38:38.785410359Z" level=info msg="connecting to shim 04892c98672a2ea508f647179e943c728feabddde3098f60f5667773162ef138" address="unix:///run/containerd/s/50ad18e68255cf1071049f09e621c9c9619568dd926cb8b6a83de66dbbaf7a56" protocol=ttrpc version=3
Mar 3 13:38:38.816925 systemd[1]: Started cri-containerd-04892c98672a2ea508f647179e943c728feabddde3098f60f5667773162ef138.scope - libcontainer container 04892c98672a2ea508f647179e943c728feabddde3098f60f5667773162ef138.
Mar 3 13:38:38.861577 containerd[1639]: time="2026-03-03T13:38:38.861542210Z" level=info msg="StartContainer for \"04892c98672a2ea508f647179e943c728feabddde3098f60f5667773162ef138\" returns successfully"
Mar 3 13:38:43.509566 kubelet[2796]: E0303 13:38:43.509007 2796 controller.go:195] "Failed to update lease" err="Put \"https://89.167.115.149:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4459-2-4-a-1c48930dd2?timeout=10s\": context deadline exceeded"
Mar 3 13:38:45.372809 systemd[1]: cri-containerd-920f91ad52500bf7f1d5028ccd43908416bd9105ef7413c8d06b5c7a12aad660.scope: Deactivated successfully.
Mar 3 13:38:45.374356 systemd[1]: cri-containerd-920f91ad52500bf7f1d5028ccd43908416bd9105ef7413c8d06b5c7a12aad660.scope: Consumed 232ms CPU time, 38.3M memory peak, 1M read from disk.
Mar 3 13:38:45.377802 containerd[1639]: time="2026-03-03T13:38:45.377654402Z" level=info msg="received container exit event container_id:\"920f91ad52500bf7f1d5028ccd43908416bd9105ef7413c8d06b5c7a12aad660\" id:\"920f91ad52500bf7f1d5028ccd43908416bd9105ef7413c8d06b5c7a12aad660\" pid:6256 exit_status:1 exited_at:{seconds:1772545125 nanos:374368889}"
Mar 3 13:38:45.421389 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-920f91ad52500bf7f1d5028ccd43908416bd9105ef7413c8d06b5c7a12aad660-rootfs.mount: Deactivated successfully.
Mar 3 13:38:45.776716 kubelet[2796]: I0303 13:38:45.776634 2796 scope.go:117] "RemoveContainer" containerID="44e36430d6fa05c64b2e269cc16823a3cc5535921b75e22cf2f67a4a3feb29c1"
Mar 3 13:38:45.778506 kubelet[2796]: I0303 13:38:45.777259 2796 scope.go:117] "RemoveContainer" containerID="920f91ad52500bf7f1d5028ccd43908416bd9105ef7413c8d06b5c7a12aad660"
Mar 3 13:38:45.778506 kubelet[2796]: E0303 13:38:45.777444 2796 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tigera-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=tigera-operator pod=tigera-operator-5588576f44-cdlhx_tigera-operator(7cde15be-08d4-48ac-bacc-52cbb233fccb)\"" pod="tigera-operator/tigera-operator-5588576f44-cdlhx" podUID="7cde15be-08d4-48ac-bacc-52cbb233fccb"
Mar 3 13:38:45.780584 containerd[1639]: time="2026-03-03T13:38:45.780529664Z" level=info msg="RemoveContainer for \"44e36430d6fa05c64b2e269cc16823a3cc5535921b75e22cf2f67a4a3feb29c1\""
Mar 3 13:38:45.789142 containerd[1639]: time="2026-03-03T13:38:45.789088229Z" level=info msg="RemoveContainer for \"44e36430d6fa05c64b2e269cc16823a3cc5535921b75e22cf2f67a4a3feb29c1\" returns successfully"