Mar 7 01:10:04.975084 kernel: Linux version 6.6.127-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 13.3.1_p20240614 p17) 13.3.1 20240614, GNU ld (Gentoo 2.42 p3) 2.42.0) #1 SMP PREEMPT_DYNAMIC Fri Mar 6 22:58:19 -00 2026
Mar 7 01:10:04.975101 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=hetzner verity.usrhash=531e046a631dbba7b4aae1b7955ffa961f5ce7d570e89a624d767cf739ab70b5
Mar 7 01:10:04.975111 kernel: BIOS-provided physical RAM map:
Mar 7 01:10:04.975116 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009ffff] usable
Mar 7 01:10:04.975120 kernel: BIOS-e820: [mem 0x0000000000100000-0x000000007ed3efff] usable
Mar 7 01:10:04.975125 kernel: BIOS-e820: [mem 0x000000007ed3f000-0x000000007edfffff] reserved
Mar 7 01:10:04.975130 kernel: BIOS-e820: [mem 0x000000007ee00000-0x000000007f8ecfff] usable
Mar 7 01:10:04.975134 kernel: BIOS-e820: [mem 0x000000007f8ed000-0x000000007f9ecfff] reserved
Mar 7 01:10:04.975139 kernel: BIOS-e820: [mem 0x000000007f9ed000-0x000000007faecfff] type 20
Mar 7 01:10:04.975143 kernel: BIOS-e820: [mem 0x000000007faed000-0x000000007fb6cfff] reserved
Mar 7 01:10:04.975148 kernel: BIOS-e820: [mem 0x000000007fb6d000-0x000000007fb7efff] ACPI data
Mar 7 01:10:04.975155 kernel: BIOS-e820: [mem 0x000000007fb7f000-0x000000007fbfefff] ACPI NVS
Mar 7 01:10:04.975159 kernel: BIOS-e820: [mem 0x000000007fbff000-0x000000007ff7bfff] usable
Mar 7 01:10:04.975164 kernel: BIOS-e820: [mem 0x000000007ff7c000-0x000000007fffffff] reserved
Mar 7 01:10:04.975169 kernel: BIOS-e820: [mem 0x00000000e0000000-0x00000000efffffff] reserved
Mar 7 01:10:04.975174 kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Mar 7 01:10:04.975181 kernel: BIOS-e820: [mem 0x00000000ffc00000-0x00000000ffffffff] reserved
Mar 7 01:10:04.975186 kernel: BIOS-e820: [mem 0x0000000100000000-0x0000000179ffffff] usable
Mar 7 01:10:04.975190 kernel: BIOS-e820: [mem 0x000000fd00000000-0x000000ffffffffff] reserved
Mar 7 01:10:04.975195 kernel: NX (Execute Disable) protection: active
Mar 7 01:10:04.975200 kernel: APIC: Static calls initialized
Mar 7 01:10:04.975205 kernel: efi: EFI v2.7 by Ubuntu distribution of EDK II
Mar 7 01:10:04.975209 kernel: efi: SMBIOS=0x7f988000 SMBIOS 3.0=0x7f986000 ACPI=0x7fb7e000 ACPI 2.0=0x7fb7e014 MEMATTR=0x7e01b198
Mar 7 01:10:04.975214 kernel: efi: Remove mem135: MMIO range=[0xffc00000-0xffffffff] (4MB) from e820 map
Mar 7 01:10:04.975219 kernel: e820: remove [mem 0xffc00000-0xffffffff] reserved
Mar 7 01:10:04.975224 kernel: SMBIOS 3.0.0 present.
Mar 7 01:10:04.975228 kernel: DMI: Hetzner vServer/Standard PC (Q35 + ICH9, 2009), BIOS 20171111 11/11/2017
Mar 7 01:10:04.975233 kernel: Hypervisor detected: KVM
Mar 7 01:10:04.975240 kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Mar 7 01:10:04.975245 kernel: kvm-clock: using sched offset of 12762313029 cycles
Mar 7 01:10:04.975250 kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Mar 7 01:10:04.975255 kernel: tsc: Detected 2399.998 MHz processor
Mar 7 01:10:04.975260 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Mar 7 01:10:04.975265 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Mar 7 01:10:04.975269 kernel: last_pfn = 0x17a000 max_arch_pfn = 0x10000000000
Mar 7 01:10:04.975274 kernel: MTRR map: 4 entries (2 fixed + 2 variable; max 18), built from 8 variable MTRRs
Mar 7 01:10:04.975279 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT
Mar 7 01:10:04.975286 kernel: last_pfn = 0x7ff7c max_arch_pfn = 0x10000000000
Mar 7 01:10:04.975292 kernel: Using GB pages for direct mapping
Mar 7 01:10:04.975297 kernel: Secure boot disabled
Mar 7 01:10:04.975305 kernel: ACPI: Early table checksum verification disabled
Mar 7 01:10:04.975310 kernel: ACPI: RSDP 0x000000007FB7E014 000024 (v02 BOCHS )
Mar 7 01:10:04.975315 kernel: ACPI: XSDT 0x000000007FB7D0E8 000054 (v01 BOCHS BXPC 00000001 01000013)
Mar 7 01:10:04.975321 kernel: ACPI: FACP 0x000000007FB79000 0000F4 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Mar 7 01:10:04.975328 kernel: ACPI: DSDT 0x000000007FB7A000 002443 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Mar 7 01:10:04.975334 kernel: ACPI: FACS 0x000000007FBDD000 000040
Mar 7 01:10:04.975339 kernel: ACPI: APIC 0x000000007FB78000 000080 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Mar 7 01:10:04.975344 kernel: ACPI: HPET 0x000000007FB77000 000038 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Mar 7 01:10:04.975349 kernel: ACPI: MCFG 0x000000007FB76000 00003C (v01 BOCHS BXPC 00000001 BXPC 00000001)
Mar 7 01:10:04.975354 kernel: ACPI: WAET 0x000000007FB75000 000028 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Mar 7 01:10:04.975359 kernel: ACPI: BGRT 0x000000007FB74000 000038 (v01 INTEL EDK2 00000002 01000013)
Mar 7 01:10:04.975367 kernel: ACPI: Reserving FACP table memory at [mem 0x7fb79000-0x7fb790f3]
Mar 7 01:10:04.975372 kernel: ACPI: Reserving DSDT table memory at [mem 0x7fb7a000-0x7fb7c442]
Mar 7 01:10:04.975377 kernel: ACPI: Reserving FACS table memory at [mem 0x7fbdd000-0x7fbdd03f]
Mar 7 01:10:04.975382 kernel: ACPI: Reserving APIC table memory at [mem 0x7fb78000-0x7fb7807f]
Mar 7 01:10:04.975387 kernel: ACPI: Reserving HPET table memory at [mem 0x7fb77000-0x7fb77037]
Mar 7 01:10:04.975392 kernel: ACPI: Reserving MCFG table memory at [mem 0x7fb76000-0x7fb7603b]
Mar 7 01:10:04.975466 kernel: ACPI: Reserving WAET table memory at [mem 0x7fb75000-0x7fb75027]
Mar 7 01:10:04.975471 kernel: ACPI: Reserving BGRT table memory at [mem 0x7fb74000-0x7fb74037]
Mar 7 01:10:04.975476 kernel: No NUMA configuration found
Mar 7 01:10:04.975484 kernel: Faking a node at [mem 0x0000000000000000-0x0000000179ffffff]
Mar 7 01:10:04.975489 kernel: NODE_DATA(0) allocated [mem 0x179ff8000-0x179ffdfff]
Mar 7 01:10:04.975494 kernel: Zone ranges:
Mar 7 01:10:04.975499 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff]
Mar 7 01:10:04.975505 kernel: DMA32 [mem 0x0000000001000000-0x00000000ffffffff]
Mar 7 01:10:04.975510 kernel: Normal [mem 0x0000000100000000-0x0000000179ffffff]
Mar 7 01:10:04.975515 kernel: Movable zone start for each node
Mar 7 01:10:04.975520 kernel: Early memory node ranges
Mar 7 01:10:04.975525 kernel: node 0: [mem 0x0000000000001000-0x000000000009ffff]
Mar 7 01:10:04.975530 kernel: node 0: [mem 0x0000000000100000-0x000000007ed3efff]
Mar 7 01:10:04.975537 kernel: node 0: [mem 0x000000007ee00000-0x000000007f8ecfff]
Mar 7 01:10:04.975542 kernel: node 0: [mem 0x000000007fbff000-0x000000007ff7bfff]
Mar 7 01:10:04.975547 kernel: node 0: [mem 0x0000000100000000-0x0000000179ffffff]
Mar 7 01:10:04.975552 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x0000000179ffffff]
Mar 7 01:10:04.975557 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Mar 7 01:10:04.975562 kernel: On node 0, zone DMA: 96 pages in unavailable ranges
Mar 7 01:10:04.975567 kernel: On node 0, zone DMA32: 193 pages in unavailable ranges
Mar 7 01:10:04.975572 kernel: On node 0, zone DMA32: 786 pages in unavailable ranges
Mar 7 01:10:04.975577 kernel: On node 0, zone Normal: 132 pages in unavailable ranges
Mar 7 01:10:04.975585 kernel: On node 0, zone Normal: 24576 pages in unavailable ranges
Mar 7 01:10:04.975590 kernel: ACPI: PM-Timer IO Port: 0x608
Mar 7 01:10:04.975595 kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Mar 7 01:10:04.975600 kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Mar 7 01:10:04.975605 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Mar 7 01:10:04.975610 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Mar 7 01:10:04.975615 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Mar 7 01:10:04.975620 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Mar 7 01:10:04.975625 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Mar 7 01:10:04.975641 kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Mar 7 01:10:04.975646 kernel: ACPI: HPET id: 0x8086a201 base: 0xfed00000
Mar 7 01:10:04.975651 kernel: smpboot: Allowing 2 CPUs, 0 hotplug CPUs
Mar 7 01:10:04.975656 kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
Mar 7 01:10:04.975661 kernel: [mem 0x80000000-0xdfffffff] available for PCI devices
Mar 7 01:10:04.975666 kernel: Booting paravirtualized kernel on KVM
Mar 7 01:10:04.975671 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Mar 7 01:10:04.975677 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:2 nr_cpu_ids:2 nr_node_ids:1
Mar 7 01:10:04.975682 kernel: percpu: Embedded 57 pages/cpu s196328 r8192 d28952 u1048576
Mar 7 01:10:04.975689 kernel: pcpu-alloc: s196328 r8192 d28952 u1048576 alloc=1*2097152
Mar 7 01:10:04.975694 kernel: pcpu-alloc: [0] 0 1
Mar 7 01:10:04.975699 kernel: kvm-guest: PV spinlocks disabled, no host support
Mar 7 01:10:04.975705 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=hetzner verity.usrhash=531e046a631dbba7b4aae1b7955ffa961f5ce7d570e89a624d767cf739ab70b5
Mar 7 01:10:04.975710 kernel: random: crng init done
Mar 7 01:10:04.975715 kernel: Dentry cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Mar 7 01:10:04.975721 kernel: Inode-cache hash table entries: 262144 (order: 9, 2097152 bytes, linear)
Mar 7 01:10:04.975726 kernel: Fallback order for Node 0: 0
Mar 7 01:10:04.975733 kernel: Built 1 zonelists, mobility grouping on. Total pages: 1004632
Mar 7 01:10:04.975738 kernel: Policy zone: Normal
Mar 7 01:10:04.975743 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Mar 7 01:10:04.975748 kernel: software IO TLB: area num 2.
Mar 7 01:10:04.975753 kernel: Memory: 3827764K/4091168K available (12288K kernel code, 2288K rwdata, 22752K rodata, 42892K init, 2304K bss, 263200K reserved, 0K cma-reserved)
Mar 7 01:10:04.975758 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=2, Nodes=1
Mar 7 01:10:04.975763 kernel: ftrace: allocating 37996 entries in 149 pages
Mar 7 01:10:04.975769 kernel: ftrace: allocated 149 pages with 4 groups
Mar 7 01:10:04.975774 kernel: Dynamic Preempt: voluntary
Mar 7 01:10:04.975779 kernel: rcu: Preemptible hierarchical RCU implementation.
Mar 7 01:10:04.975787 kernel: rcu: RCU event tracing is enabled.
Mar 7 01:10:04.975793 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=2.
Mar 7 01:10:04.975798 kernel: Trampoline variant of Tasks RCU enabled.
Mar 7 01:10:04.975810 kernel: Rude variant of Tasks RCU enabled.
Mar 7 01:10:04.975818 kernel: Tracing variant of Tasks RCU enabled.
Mar 7 01:10:04.975823 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Mar 7 01:10:04.975828 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=2
Mar 7 01:10:04.975834 kernel: NR_IRQS: 33024, nr_irqs: 440, preallocated irqs: 16
Mar 7 01:10:04.975839 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Mar 7 01:10:04.975844 kernel: Console: colour dummy device 80x25
Mar 7 01:10:04.975850 kernel: printk: console [tty0] enabled
Mar 7 01:10:04.975855 kernel: printk: console [ttyS0] enabled
Mar 7 01:10:04.975863 kernel: ACPI: Core revision 20230628
Mar 7 01:10:04.975868 kernel: clocksource: hpet: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 19112604467 ns
Mar 7 01:10:04.975873 kernel: APIC: Switch to symmetric I/O mode setup
Mar 7 01:10:04.975879 kernel: x2apic enabled
Mar 7 01:10:04.975884 kernel: APIC: Switched APIC routing to: physical x2apic
Mar 7 01:10:04.975892 kernel: ..TIMER: vector=0x30 apic1=0 pin1=2 apic2=-1 pin2=-1
Mar 7 01:10:04.975897 kernel: tsc: Marking TSC unstable due to TSCs unsynchronized
Mar 7 01:10:04.975903 kernel: Calibrating delay loop (skipped) preset value.. 4799.99 BogoMIPS (lpj=2399998)
Mar 7 01:10:04.975908 kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Mar 7 01:10:04.975913 kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127
Mar 7 01:10:04.975919 kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0
Mar 7 01:10:04.975924 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Mar 7 01:10:04.975929 kernel: Spectre V2 : Mitigation: Enhanced / Automatic IBRS
Mar 7 01:10:04.975935 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Mar 7 01:10:04.975942 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Mar 7 01:10:04.975948 kernel: active return thunk: srso_alias_return_thunk
Mar 7 01:10:04.975953 kernel: Speculative Return Stack Overflow: Mitigation: Safe RET
Mar 7 01:10:04.975958 kernel: Transient Scheduler Attacks: Forcing mitigation on in a VM
Mar 7 01:10:04.975963 kernel: Transient Scheduler Attacks: Vulnerable: Clear CPU buffers attempted, no microcode
Mar 7 01:10:04.975969 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Mar 7 01:10:04.975974 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Mar 7 01:10:04.975980 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Mar 7 01:10:04.975987 kernel: x86/fpu: Supporting XSAVE feature 0x020: 'AVX-512 opmask'
Mar 7 01:10:04.975993 kernel: x86/fpu: Supporting XSAVE feature 0x040: 'AVX-512 Hi256'
Mar 7 01:10:04.975998 kernel: x86/fpu: Supporting XSAVE feature 0x080: 'AVX-512 ZMM_Hi256'
Mar 7 01:10:04.976003 kernel: x86/fpu: Supporting XSAVE feature 0x200: 'Protection Keys User registers'
Mar 7 01:10:04.976008 kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256
Mar 7 01:10:04.976014 kernel: x86/fpu: xstate_offset[5]: 832, xstate_sizes[5]: 64
Mar 7 01:10:04.976019 kernel: x86/fpu: xstate_offset[6]: 896, xstate_sizes[6]: 512
Mar 7 01:10:04.976024 kernel: x86/fpu: xstate_offset[7]: 1408, xstate_sizes[7]: 1024
Mar 7 01:10:04.976030 kernel: x86/fpu: xstate_offset[9]: 2432, xstate_sizes[9]: 8
Mar 7 01:10:04.976037 kernel: x86/fpu: Enabled xstate features 0x2e7, context size is 2440 bytes, using 'compacted' format.
Mar 7 01:10:04.976042 kernel: Freeing SMP alternatives memory: 32K
Mar 7 01:10:04.976048 kernel: pid_max: default: 32768 minimum: 301
Mar 7 01:10:04.976053 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,integrity
Mar 7 01:10:04.976058 kernel: landlock: Up and running.
Mar 7 01:10:04.976064 kernel: SELinux: Initializing.
Mar 7 01:10:04.976069 kernel: Mount-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
Mar 7 01:10:04.976074 kernel: Mountpoint-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
Mar 7 01:10:04.976080 kernel: smpboot: CPU0: AMD EPYC-Genoa Processor (family: 0x19, model: 0x11, stepping: 0x0)
Mar 7 01:10:04.976087 kernel: RCU Tasks: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Mar 7 01:10:04.976092 kernel: RCU Tasks Rude: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Mar 7 01:10:04.976098 kernel: RCU Tasks Trace: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Mar 7 01:10:04.976103 kernel: Performance Events: Fam17h+ core perfctr, AMD PMU driver.
Mar 7 01:10:04.976109 kernel: ... version: 0
Mar 7 01:10:04.976114 kernel: ... bit width: 48
Mar 7 01:10:04.976119 kernel: ... generic registers: 6
Mar 7 01:10:04.976125 kernel: ... value mask: 0000ffffffffffff
Mar 7 01:10:04.976130 kernel: ... max period: 00007fffffffffff
Mar 7 01:10:04.976137 kernel: ... fixed-purpose events: 0
Mar 7 01:10:04.976143 kernel: ... event mask: 000000000000003f
Mar 7 01:10:04.976148 kernel: signal: max sigframe size: 3376
Mar 7 01:10:04.976153 kernel: rcu: Hierarchical SRCU implementation.
Mar 7 01:10:04.976159 kernel: rcu: Max phase no-delay instances is 400.
Mar 7 01:10:04.976164 kernel: smp: Bringing up secondary CPUs ...
Mar 7 01:10:04.976169 kernel: smpboot: x86: Booting SMP configuration:
Mar 7 01:10:04.976175 kernel: .... node #0, CPUs: #1
Mar 7 01:10:04.976180 kernel: smp: Brought up 1 node, 2 CPUs
Mar 7 01:10:04.976187 kernel: smpboot: Max logical packages: 1
Mar 7 01:10:04.976193 kernel: smpboot: Total of 2 processors activated (9599.99 BogoMIPS)
Mar 7 01:10:04.976198 kernel: devtmpfs: initialized
Mar 7 01:10:04.976203 kernel: x86/mm: Memory block size: 128MB
Mar 7 01:10:04.976209 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x7fb7f000-0x7fbfefff] (524288 bytes)
Mar 7 01:10:04.976214 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Mar 7 01:10:04.976220 kernel: futex hash table entries: 512 (order: 3, 32768 bytes, linear)
Mar 7 01:10:04.976225 kernel: pinctrl core: initialized pinctrl subsystem
Mar 7 01:10:04.976230 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Mar 7 01:10:04.976238 kernel: audit: initializing netlink subsys (disabled)
Mar 7 01:10:04.976243 kernel: audit: type=2000 audit(1772845802.873:1): state=initialized audit_enabled=0 res=1
Mar 7 01:10:04.976249 kernel: thermal_sys: Registered thermal governor 'step_wise'
Mar 7 01:10:04.976254 kernel: thermal_sys: Registered thermal governor 'user_space'
Mar 7 01:10:04.976259 kernel: cpuidle: using governor menu
Mar 7 01:10:04.976264 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Mar 7 01:10:04.976270 kernel: dca service started, version 1.12.1
Mar 7 01:10:04.976275 kernel: PCI: MMCONFIG for domain 0000 [bus 00-ff] at [mem 0xe0000000-0xefffffff] (base 0xe0000000)
Mar 7 01:10:04.976280 kernel: PCI: Using configuration type 1 for base access
Mar 7 01:10:04.976288 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Mar 7 01:10:04.976294 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Mar 7 01:10:04.976299 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Mar 7 01:10:04.976304 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Mar 7 01:10:04.976309 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Mar 7 01:10:04.976315 kernel: ACPI: Added _OSI(Module Device)
Mar 7 01:10:04.976320 kernel: ACPI: Added _OSI(Processor Device)
Mar 7 01:10:04.976325 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Mar 7 01:10:04.976330 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Mar 7 01:10:04.976338 kernel: ACPI: _OSC evaluation for CPUs failed, trying _PDC
Mar 7 01:10:04.976343 kernel: ACPI: Interpreter enabled
Mar 7 01:10:04.976348 kernel: ACPI: PM: (supports S0 S5)
Mar 7 01:10:04.976354 kernel: ACPI: Using IOAPIC for interrupt routing
Mar 7 01:10:04.976359 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Mar 7 01:10:04.976364 kernel: PCI: Using E820 reservations for host bridge windows
Mar 7 01:10:04.976369 kernel: ACPI: Enabled 2 GPEs in block 00 to 3F
Mar 7 01:10:04.976375 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Mar 7 01:10:04.976536 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3]
Mar 7 01:10:04.976656 kernel: acpi PNP0A08:00: _OSC: platform does not support [PCIeHotplug LTR]
Mar 7 01:10:04.976755 kernel: acpi PNP0A08:00: _OSC: OS now controls [PME AER PCIeCapability]
Mar 7 01:10:04.976761 kernel: PCI host bridge to bus 0000:00
Mar 7 01:10:04.976862 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window]
Mar 7 01:10:04.976951 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xffff window]
Mar 7 01:10:04.977038 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Mar 7 01:10:04.977129 kernel: pci_bus 0000:00: root bus resource [mem 0x80000000-0xdfffffff window]
Mar 7 01:10:04.977217 kernel: pci_bus 0000:00: root bus resource [mem 0xf0000000-0xfebfffff window]
Mar 7 01:10:04.977304 kernel: pci_bus 0000:00: root bus resource [mem 0xc000000000-0xc7ffffffff window]
Mar 7 01:10:04.977390 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Mar 7 01:10:04.977513 kernel: pci 0000:00:00.0: [8086:29c0] type 00 class 0x060000
Mar 7 01:10:04.977619 kernel: pci 0000:00:01.0: [1af4:1050] type 00 class 0x030000
Mar 7 01:10:04.977726 kernel: pci 0000:00:01.0: reg 0x10: [mem 0x80000000-0x807fffff pref]
Mar 7 01:10:04.977825 kernel: pci 0000:00:01.0: reg 0x18: [mem 0xc060500000-0xc060503fff 64bit pref]
Mar 7 01:10:04.977923 kernel: pci 0000:00:01.0: reg 0x20: [mem 0x8138a000-0x8138afff]
Mar 7 01:10:04.978019 kernel: pci 0000:00:01.0: reg 0x30: [mem 0xffff0000-0xffffffff pref]
Mar 7 01:10:04.978121 kernel: pci 0000:00:01.0: BAR 0: assigned to efifb
Mar 7 01:10:04.978217 kernel: pci 0000:00:01.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Mar 7 01:10:04.978323 kernel: pci 0000:00:02.0: [1b36:000c] type 01 class 0x060400
Mar 7 01:10:04.978435 kernel: pci 0000:00:02.0: reg 0x10: [mem 0x81389000-0x81389fff]
Mar 7 01:10:04.978545 kernel: pci 0000:00:02.1: [1b36:000c] type 01 class 0x060400
Mar 7 01:10:04.978651 kernel: pci 0000:00:02.1: reg 0x10: [mem 0x81388000-0x81388fff]
Mar 7 01:10:04.978758 kernel: pci 0000:00:02.2: [1b36:000c] type 01 class 0x060400
Mar 7 01:10:04.978855 kernel: pci 0000:00:02.2: reg 0x10: [mem 0x81387000-0x81387fff]
Mar 7 01:10:04.978962 kernel: pci 0000:00:02.3: [1b36:000c] type 01 class 0x060400
Mar 7 01:10:04.979059 kernel: pci 0000:00:02.3: reg 0x10: [mem 0x81386000-0x81386fff]
Mar 7 01:10:04.979165 kernel: pci 0000:00:02.4: [1b36:000c] type 01 class 0x060400
Mar 7 01:10:04.979261 kernel: pci 0000:00:02.4: reg 0x10: [mem 0x81385000-0x81385fff]
Mar 7 01:10:04.979364 kernel: pci 0000:00:02.5: [1b36:000c] type 01 class 0x060400
Mar 7 01:10:04.979473 kernel: pci 0000:00:02.5: reg 0x10: [mem 0x81384000-0x81384fff]
Mar 7 01:10:04.979577 kernel: pci 0000:00:02.6: [1b36:000c] type 01 class 0x060400
Mar 7 01:10:04.979681 kernel: pci 0000:00:02.6: reg 0x10: [mem 0x81383000-0x81383fff]
Mar 7 01:10:04.979788 kernel: pci 0000:00:02.7: [1b36:000c] type 01 class 0x060400
Mar 7 01:10:04.979885 kernel: pci 0000:00:02.7: reg 0x10: [mem 0x81382000-0x81382fff]
Mar 7 01:10:04.979988 kernel: pci 0000:00:03.0: [1b36:000c] type 01 class 0x060400
Mar 7 01:10:04.980083 kernel: pci 0000:00:03.0: reg 0x10: [mem 0x81381000-0x81381fff]
Mar 7 01:10:04.980186 kernel: pci 0000:00:1f.0: [8086:2918] type 00 class 0x060100
Mar 7 01:10:04.980282 kernel: pci 0000:00:1f.0: quirk: [io 0x0600-0x067f] claimed by ICH6 ACPI/GPIO/TCO
Mar 7 01:10:04.980387 kernel: pci 0000:00:1f.2: [8086:2922] type 00 class 0x010601
Mar 7 01:10:04.980501 kernel: pci 0000:00:1f.2: reg 0x20: [io 0x6040-0x605f]
Mar 7 01:10:04.980599 kernel: pci 0000:00:1f.2: reg 0x24: [mem 0x81380000-0x81380fff]
Mar 7 01:10:04.980713 kernel: pci 0000:00:1f.3: [8086:2930] type 00 class 0x0c0500
Mar 7 01:10:04.980811 kernel: pci 0000:00:1f.3: reg 0x20: [io 0x6000-0x603f]
Mar 7 01:10:04.980920 kernel: pci 0000:01:00.0: [1af4:1041] type 00 class 0x020000
Mar 7 01:10:04.981026 kernel: pci 0000:01:00.0: reg 0x14: [mem 0x81200000-0x81200fff]
Mar 7 01:10:04.981127 kernel: pci 0000:01:00.0: reg 0x20: [mem 0xc060000000-0xc060003fff 64bit pref]
Mar 7 01:10:04.981227 kernel: pci 0000:01:00.0: reg 0x30: [mem 0xfff80000-0xffffffff pref]
Mar 7 01:10:04.981324 kernel: pci 0000:00:02.0: PCI bridge to [bus 01]
Mar 7 01:10:04.981432 kernel: pci 0000:00:02.0: bridge window [mem 0x81200000-0x812fffff]
Mar 7 01:10:04.981529 kernel: pci 0000:00:02.0: bridge window [mem 0xc060000000-0xc0600fffff 64bit pref]
Mar 7 01:10:04.981645 kernel: pci 0000:02:00.0: [1b36:000d] type 00 class 0x0c0330
Mar 7 01:10:04.981750 kernel: pci 0000:02:00.0: reg 0x10: [mem 0x81100000-0x81103fff 64bit]
Mar 7 01:10:04.981848 kernel: pci 0000:00:02.1: PCI bridge to [bus 02]
Mar 7 01:10:04.981943 kernel: pci 0000:00:02.1: bridge window [mem 0x81100000-0x811fffff]
Mar 7 01:10:04.982051 kernel: pci 0000:03:00.0: [1af4:1043] type 00 class 0x078000
Mar 7 01:10:04.982152 kernel: pci 0000:03:00.0: reg 0x14: [mem 0x81000000-0x81000fff]
Mar 7 01:10:04.982252 kernel: pci 0000:03:00.0: reg 0x20: [mem 0xc060100000-0xc060103fff 64bit pref]
Mar 7 01:10:04.982349 kernel: pci 0000:00:02.2: PCI bridge to [bus 03]
Mar 7 01:10:04.982505 kernel: pci 0000:00:02.2: bridge window [mem 0x81000000-0x810fffff]
Mar 7 01:10:04.982604 kernel: pci 0000:00:02.2: bridge window [mem 0xc060100000-0xc0601fffff 64bit pref]
Mar 7 01:10:04.982721 kernel: pci 0000:04:00.0: [1af4:1045] type 00 class 0x00ff00
Mar 7 01:10:04.982822 kernel: pci 0000:04:00.0: reg 0x20: [mem 0xc060200000-0xc060203fff 64bit pref]
Mar 7 01:10:04.982920 kernel: pci 0000:00:02.3: PCI bridge to [bus 04]
Mar 7 01:10:04.983014 kernel: pci 0000:00:02.3: bridge window [mem 0xc060200000-0xc0602fffff 64bit pref]
Mar 7 01:10:04.983121 kernel: pci 0000:05:00.0: [1af4:1044] type 00 class 0x00ff00
Mar 7 01:10:04.983224 kernel: pci 0000:05:00.0: reg 0x14: [mem 0x80f00000-0x80f00fff]
Mar 7 01:10:04.983323 kernel: pci 0000:05:00.0: reg 0x20: [mem 0xc060300000-0xc060303fff 64bit pref]
Mar 7 01:10:04.985456 kernel: pci 0000:00:02.4: PCI bridge to [bus 05]
Mar 7 01:10:04.985568 kernel: pci 0000:00:02.4: bridge window [mem 0x80f00000-0x80ffffff]
Mar 7 01:10:04.985678 kernel: pci 0000:00:02.4: bridge window [mem 0xc060300000-0xc0603fffff 64bit pref]
Mar 7 01:10:04.985789 kernel: pci 0000:06:00.0: [1af4:1048] type 00 class 0x010000
Mar 7 01:10:04.985892 kernel: pci 0000:06:00.0: reg 0x14: [mem 0x80e00000-0x80e00fff]
Mar 7 01:10:04.985998 kernel: pci 0000:06:00.0: reg 0x20: [mem 0xc060400000-0xc060403fff 64bit pref]
Mar 7 01:10:04.986096 kernel: pci 0000:00:02.5: PCI bridge to [bus 06]
Mar 7 01:10:04.986193 kernel: pci 0000:00:02.5: bridge window [mem 0x80e00000-0x80efffff]
Mar 7 01:10:04.986289 kernel: pci 0000:00:02.5: bridge window [mem 0xc060400000-0xc0604fffff 64bit pref]
Mar 7 01:10:04.986296 kernel: acpiphp: Slot [0] registered
Mar 7 01:10:04.986418 kernel: pci 0000:07:00.0: [1af4:1041] type 00 class 0x020000
Mar 7 01:10:04.986522 kernel: pci 0000:07:00.0: reg 0x14: [mem 0x80c00000-0x80c00fff]
Mar 7 01:10:04.986622 kernel: pci 0000:07:00.0: reg 0x20: [mem 0xc000000000-0xc000003fff 64bit pref]
Mar 7 01:10:04.986736 kernel: pci 0000:07:00.0: reg 0x30: [mem 0xfff80000-0xffffffff pref]
Mar 7 01:10:04.986837 kernel: pci 0000:00:02.6: PCI bridge to [bus 07]
Mar 7 01:10:04.986933 kernel: pci 0000:00:02.6: bridge window [mem 0x80c00000-0x80dfffff]
Mar 7 01:10:04.987027 kernel: pci 0000:00:02.6: bridge window [mem 0xc000000000-0xc01fffffff 64bit pref]
Mar 7 01:10:04.987034 kernel: acpiphp: Slot [0-2] registered
Mar 7 01:10:04.987130 kernel: pci 0000:00:02.7: PCI bridge to [bus 08]
Mar 7 01:10:04.987226 kernel: pci 0000:00:02.7: bridge window [mem 0x80a00000-0x80bfffff]
Mar 7 01:10:04.987321 kernel: pci 0000:00:02.7: bridge window [mem 0xc020000000-0xc03fffffff 64bit pref]
Mar 7 01:10:04.987331 kernel: acpiphp: Slot [0-3] registered
Mar 7 01:10:04.987437 kernel: pci 0000:00:03.0: PCI bridge to [bus 09]
Mar 7 01:10:04.987532 kernel: pci 0000:00:03.0: bridge window [mem 0x80800000-0x809fffff]
Mar 7 01:10:04.987629 kernel: pci 0000:00:03.0: bridge window [mem 0xc040000000-0xc05fffffff 64bit pref]
Mar 7 01:10:04.987643 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Mar 7 01:10:04.987649 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Mar 7 01:10:04.987654 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Mar 7 01:10:04.987660 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Mar 7 01:10:04.987665 kernel: ACPI: PCI: Interrupt link LNKE configured for IRQ 10
Mar 7 01:10:04.987674 kernel: ACPI: PCI: Interrupt link LNKF configured for IRQ 10
Mar 7 01:10:04.987679 kernel: ACPI: PCI: Interrupt link LNKG configured for IRQ 11
Mar 7 01:10:04.987685 kernel: ACPI: PCI: Interrupt link LNKH configured for IRQ 11
Mar 7 01:10:04.987690 kernel: ACPI: PCI: Interrupt link GSIA configured for IRQ 16
Mar 7 01:10:04.987695 kernel: ACPI: PCI: Interrupt link GSIB configured for IRQ 17
Mar 7 01:10:04.987701 kernel: ACPI: PCI: Interrupt link GSIC configured for IRQ 18
Mar 7 01:10:04.987706 kernel: ACPI: PCI: Interrupt link GSID configured for IRQ 19
Mar 7 01:10:04.987712 kernel: ACPI: PCI: Interrupt link GSIE configured for IRQ 20
Mar 7 01:10:04.987717 kernel: ACPI: PCI: Interrupt link GSIF configured for IRQ 21
Mar 7 01:10:04.987725 kernel: ACPI: PCI: Interrupt link GSIG configured for IRQ 22
Mar 7 01:10:04.987731 kernel: ACPI: PCI: Interrupt link GSIH configured for IRQ 23
Mar 7 01:10:04.987736 kernel: iommu: Default domain type: Translated
Mar 7 01:10:04.987741 kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Mar 7 01:10:04.987747 kernel: efivars: Registered efivars operations
Mar 7 01:10:04.987752 kernel: PCI: Using ACPI for IRQ routing
Mar 7 01:10:04.987757 kernel: PCI: pci_cache_line_size set to 64 bytes
Mar 7 01:10:04.987763 kernel: e820: reserve RAM buffer [mem 0x7ed3f000-0x7fffffff]
Mar 7 01:10:04.987771 kernel: e820: reserve RAM buffer [mem 0x7f8ed000-0x7fffffff]
Mar 7 01:10:04.987777 kernel: e820: reserve RAM buffer [mem 0x7ff7c000-0x7fffffff]
Mar 7 01:10:04.987782 kernel: e820: reserve RAM buffer [mem 0x17a000000-0x17bffffff]
Mar 7 01:10:04.987883 kernel: pci 0000:00:01.0: vgaarb: setting as boot VGA device
Mar 7 01:10:04.987980 kernel: pci 0000:00:01.0: vgaarb: bridge control possible
Mar 7 01:10:04.988076 kernel: pci 0000:00:01.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Mar 7 01:10:04.988083 kernel: vgaarb: loaded
Mar 7 01:10:04.988088 kernel: hpet0: at MMIO 0xfed00000, IRQs 2, 8, 0
Mar 7 01:10:04.988093 kernel: hpet0: 3 comparators, 64-bit 100.000000 MHz counter
Mar 7 01:10:04.988101 kernel: clocksource: Switched to clocksource kvm-clock
Mar 7 01:10:04.988107 kernel: VFS: Disk quotas dquot_6.6.0
Mar 7 01:10:04.988113 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Mar 7 01:10:04.988118 kernel: pnp: PnP ACPI init
Mar 7 01:10:04.988226 kernel: system 00:04: [mem 0xe0000000-0xefffffff window] has been reserved
Mar 7 01:10:04.988234 kernel: pnp: PnP ACPI: found 5 devices
Mar 7 01:10:04.988239 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Mar 7 01:10:04.988245 kernel: NET: Registered PF_INET protocol family
Mar 7 01:10:04.988266 kernel: IP idents hash table entries: 65536 (order: 7, 524288 bytes, linear)
Mar 7 01:10:04.988275 kernel: tcp_listen_portaddr_hash hash table entries: 2048 (order: 3, 32768 bytes, linear)
Mar 7 01:10:04.988280 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Mar 7 01:10:04.988286 kernel: TCP established hash table entries: 32768 (order: 6, 262144 bytes, linear)
Mar 7 01:10:04.988292 kernel: TCP bind hash table entries: 32768 (order: 8, 1048576 bytes, linear)
Mar 7 01:10:04.988298 kernel: TCP: Hash tables configured (established 32768 bind 32768)
Mar 7 01:10:04.988303 kernel: UDP hash table entries: 2048 (order: 4, 65536 bytes, linear)
Mar 7 01:10:04.988309 kernel: UDP-Lite hash table entries: 2048 (order: 4, 65536 bytes, linear)
Mar 7 01:10:04.988314 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Mar 7 01:10:04.988322 kernel: NET: Registered PF_XDP protocol family
Mar 7 01:10:04.990467 kernel: pci 0000:01:00.0: can't claim BAR 6 [mem 0xfff80000-0xffffffff pref]: no compatible bridge window
Mar 7 01:10:04.990587 kernel: pci 0000:07:00.0: can't claim BAR 6 [mem 0xfff80000-0xffffffff pref]: no compatible bridge window
Mar 7 01:10:04.990700 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x0fff] to [bus 07] add_size 1000
Mar 7 01:10:04.990799 kernel: pci 0000:00:02.7: bridge window [io 0x1000-0x0fff] to [bus 08] add_size 1000
Mar 7 01:10:04.990896 kernel: pci 0000:00:03.0: bridge window [io 0x1000-0x0fff] to [bus 09] add_size 1000
Mar 7 01:10:04.990993 kernel: pci 0000:00:02.6: BAR 13: assigned [io 0x1000-0x1fff]
Mar 7 01:10:04.991093 kernel: pci 0000:00:02.7: BAR 13: assigned [io 0x2000-0x2fff]
Mar 7 01:10:04.991191 kernel: pci 0000:00:03.0: BAR 13: assigned [io 0x3000-0x3fff]
Mar 7 01:10:04.991294 kernel: pci 0000:01:00.0: BAR 6: assigned [mem 0x81280000-0x812fffff pref]
Mar 7 01:10:04.991392 kernel: pci 0000:00:02.0: PCI bridge to [bus 01]
Mar 7 01:10:04.993524 kernel: pci 0000:00:02.0: bridge window [mem 0x81200000-0x812fffff]
Mar 7 01:10:04.993660 kernel: pci 0000:00:02.0: bridge window [mem 0xc060000000-0xc0600fffff 64bit pref]
Mar 7 01:10:04.993774 kernel: pci 0000:00:02.1: PCI bridge to [bus 02]
Mar 7 01:10:04.993873 kernel: pci 0000:00:02.1: bridge window [mem 0x81100000-0x811fffff]
Mar 7 01:10:04.993970 kernel: pci 0000:00:02.2: PCI bridge to [bus 03]
Mar 7 01:10:04.994066 kernel: pci 0000:00:02.2: bridge window [mem 0x81000000-0x810fffff]
Mar 7 01:10:04.994161 kernel: pci 0000:00:02.2: bridge window [mem 0xc060100000-0xc0601fffff 64bit pref]
Mar 7 01:10:04.994259 kernel: pci 0000:00:02.3: PCI bridge to [bus 04]
Mar 7 01:10:04.994353 kernel: pci 0000:00:02.3: bridge window [mem 0xc060200000-0xc0602fffff 64bit pref]
Mar 7 01:10:04.994475 kernel: pci 0000:00:02.4: PCI bridge to [bus 05]
Mar 7 01:10:04.994573 kernel: pci 0000:00:02.4: bridge window [mem 0x80f00000-0x80ffffff]
Mar 7 01:10:04.994679 kernel: pci 0000:00:02.4: bridge window [mem 0xc060300000-0xc0603fffff 64bit pref]
Mar 7 01:10:04.994778 kernel: pci 0000:00:02.5: PCI bridge to [bus 06]
Mar 7 01:10:04.994877 kernel: pci 0000:00:02.5: bridge window [mem 0x80e00000-0x80efffff]
Mar 7 01:10:04.994973 kernel: pci 0000:00:02.5: bridge window [mem 0xc060400000-0xc0604fffff 64bit pref]
Mar 7 01:10:04.995075 kernel: pci 0000:07:00.0: BAR 6: assigned [mem 0x80c80000-0x80cfffff pref]
Mar 7 01:10:04.995171 kernel: pci 0000:00:02.6: PCI bridge to [bus 07]
Mar 7 01:10:04.995271 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x1fff]
Mar 7 01:10:04.995366 kernel: pci 0000:00:02.6: bridge window [mem 0x80c00000-0x80dfffff]
Mar 7 01:10:04.995477 kernel: pci 0000:00:02.6: bridge window [mem 0xc000000000-0xc01fffffff 64bit pref]
Mar 7 01:10:04.995574 kernel: pci 0000:00:02.7: PCI bridge to [bus 08]
Mar 7 01:10:04.995685 kernel: pci 0000:00:02.7: bridge window [io 0x2000-0x2fff]
Mar 7 01:10:04.995783 kernel: pci 0000:00:02.7: bridge window [mem 0x80a00000-0x80bfffff]
Mar 7 01:10:04.995878 kernel: pci 0000:00:02.7: bridge window [mem 0xc020000000-0xc03fffffff 64bit pref]
Mar 7 01:10:04.995976 kernel: pci 0000:00:03.0: PCI bridge to [bus 09]
Mar 7 01:10:04.996072 kernel: pci 0000:00:03.0: bridge window [io 0x3000-0x3fff]
Mar 7 01:10:04.996171 kernel: pci 0000:00:03.0: bridge window [mem 0x80800000-0x809fffff]
Mar 7 01:10:04.996267 kernel: pci 0000:00:03.0: bridge window [mem 0xc040000000-0xc05fffffff 64bit pref]
Mar 7 01:10:04.996363 kernel: pci_bus 0000:00: resource 4 [io 0x0000-0x0cf7 window]
Mar 7 01:10:04.999391 kernel: pci_bus 0000:00: resource 5 [io 0x0d00-0xffff window]
Mar 7 01:10:04.999501 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Mar 7 01:10:04.999599 kernel: pci_bus 0000:00: resource 7 [mem 0x80000000-0xdfffffff window]
Mar 7 01:10:04.999694 kernel: pci_bus 0000:00: resource 8 [mem 0xf0000000-0xfebfffff window]
Mar 7 01:10:04.999781 kernel: pci_bus 0000:00: resource 9 [mem 0xc000000000-0xc7ffffffff window]
Mar 7 01:10:05.001754 kernel: pci_bus 0000:01: resource 1 [mem 0x81200000-0x812fffff]
Mar 7 01:10:05.001865 kernel: pci_bus 0000:01: resource 2 [mem 0xc060000000-0xc0600fffff 64bit pref]
Mar 7 01:10:05.001971 kernel: pci_bus 0000:02: resource 1 [mem 0x81100000-0x811fffff]
Mar 7 01:10:05.002075 kernel: pci_bus 0000:03: resource 1 [mem 0x81000000-0x810fffff]
Mar 7 01:10:05.002168 kernel: pci_bus 0000:03: resource 2 [mem 0xc060100000-0xc0601fffff 64bit pref]
Mar 7 01:10:05.002268 kernel: pci_bus 0000:04: resource 2 [mem 0xc060200000-0xc0602fffff 64bit pref]
Mar 7 01:10:05.002367 kernel: pci_bus 0000:05: resource 1 [mem 0x80f00000-0x80ffffff]
Mar 7 01:10:05.002489 kernel: pci_bus 0000:05: resource 2 [mem 0xc060300000-0xc0603fffff 64bit pref]
Mar 7 01:10:05.002590 kernel: pci_bus 0000:06: resource 1 [mem 0x80e00000-0x80efffff]
Mar 7 01:10:05.002699 kernel: pci_bus 0000:06: resource 2 [mem 0xc060400000-0xc0604fffff 64bit pref]
Mar 7 01:10:05.002799 kernel: pci_bus 0000:07: resource 0 [io 0x1000-0x1fff]
Mar 7 01:10:05.002893 kernel: pci_bus 0000:07: resource 1 [mem 0x80c00000-0x80dfffff]
Mar 7 01:10:05.002985 kernel: pci_bus 0000:07: resource 2 [mem 0xc000000000-0xc01fffffff 64bit pref]
Mar 7 01:10:05.003088 kernel: pci_bus 0000:08: resource 0 [io 0x2000-0x2fff]
Mar 7 01:10:05.003182 kernel: pci_bus 0000:08: resource 1 [mem 0x80a00000-0x80bfffff]
Mar 7 01:10:05.003276 kernel: pci_bus 0000:08: resource 2 [mem 0xc020000000-0xc03fffffff 64bit pref]
Mar 7 01:10:05.003379 kernel: pci_bus 0000:09: resource 0 [io 0x3000-0x3fff]
Mar 7 01:10:05.004632 kernel: pci_bus 0000:09: resource 1 [mem 0x80800000-0x809fffff]
Mar 7 01:10:05.004741 kernel: pci_bus 0000:09: resource 2 [mem 0xc040000000-0xc05fffffff 64bit pref]
Mar 7 01:10:05.004750 kernel: ACPI: \_SB_.GSIG:
Enabled at IRQ 22 Mar 7 01:10:05.004756 kernel: PCI: CLS 0 bytes, default 64 Mar 7 01:10:05.004762 kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB) Mar 7 01:10:05.004768 kernel: software IO TLB: mapped [mem 0x0000000077ffd000-0x000000007bffd000] (64MB) Mar 7 01:10:05.004774 kernel: Initialise system trusted keyrings Mar 7 01:10:05.004784 kernel: workingset: timestamp_bits=39 max_order=20 bucket_order=0 Mar 7 01:10:05.004791 kernel: Key type asymmetric registered Mar 7 01:10:05.004797 kernel: Asymmetric key parser 'x509' registered Mar 7 01:10:05.004802 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 251) Mar 7 01:10:05.004808 kernel: io scheduler mq-deadline registered Mar 7 01:10:05.004814 kernel: io scheduler kyber registered Mar 7 01:10:05.004820 kernel: io scheduler bfq registered Mar 7 01:10:05.004920 kernel: pcieport 0000:00:02.0: PME: Signaling with IRQ 24 Mar 7 01:10:05.005021 kernel: pcieport 0000:00:02.0: AER: enabled with IRQ 24 Mar 7 01:10:05.005123 kernel: pcieport 0000:00:02.1: PME: Signaling with IRQ 25 Mar 7 01:10:05.005221 kernel: pcieport 0000:00:02.1: AER: enabled with IRQ 25 Mar 7 01:10:05.005319 kernel: pcieport 0000:00:02.2: PME: Signaling with IRQ 26 Mar 7 01:10:05.005433 kernel: pcieport 0000:00:02.2: AER: enabled with IRQ 26 Mar 7 01:10:05.005534 kernel: pcieport 0000:00:02.3: PME: Signaling with IRQ 27 Mar 7 01:10:05.005631 kernel: pcieport 0000:00:02.3: AER: enabled with IRQ 27 Mar 7 01:10:05.005737 kernel: pcieport 0000:00:02.4: PME: Signaling with IRQ 28 Mar 7 01:10:05.005834 kernel: pcieport 0000:00:02.4: AER: enabled with IRQ 28 Mar 7 01:10:05.005936 kernel: pcieport 0000:00:02.5: PME: Signaling with IRQ 29 Mar 7 01:10:05.006578 kernel: pcieport 0000:00:02.5: AER: enabled with IRQ 29 Mar 7 01:10:05.006691 kernel: pcieport 0000:00:02.6: PME: Signaling with IRQ 30 Mar 7 01:10:05.006789 kernel: pcieport 0000:00:02.6: AER: enabled with IRQ 30 Mar 7 01:10:05.006886 kernel: pcieport 0000:00:02.7: 
PME: Signaling with IRQ 31 Mar 7 01:10:05.006982 kernel: pcieport 0000:00:02.7: AER: enabled with IRQ 31 Mar 7 01:10:05.006989 kernel: ACPI: \_SB_.GSIH: Enabled at IRQ 23 Mar 7 01:10:05.007085 kernel: pcieport 0000:00:03.0: PME: Signaling with IRQ 32 Mar 7 01:10:05.007186 kernel: pcieport 0000:00:03.0: AER: enabled with IRQ 32 Mar 7 01:10:05.007193 kernel: ioatdma: Intel(R) QuickData Technology Driver 5.00 Mar 7 01:10:05.007199 kernel: ACPI: \_SB_.GSIF: Enabled at IRQ 21 Mar 7 01:10:05.007204 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Mar 7 01:10:05.007210 kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A Mar 7 01:10:05.007216 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12 Mar 7 01:10:05.007222 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1 Mar 7 01:10:05.007228 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12 Mar 7 01:10:05.007330 kernel: rtc_cmos 00:03: RTC can wake from S4 Mar 7 01:10:05.007341 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input0 Mar 7 01:10:05.007443 kernel: rtc_cmos 00:03: registered as rtc0 Mar 7 01:10:05.007536 kernel: rtc_cmos 00:03: setting system clock to 2026-03-07T01:10:04 UTC (1772845804) Mar 7 01:10:05.007628 kernel: rtc_cmos 00:03: alarms up to one day, y3k, 242 bytes nvram, hpet irqs Mar 7 01:10:05.007644 kernel: amd_pstate: the _CPC object is not present in SBIOS or ACPI disabled Mar 7 01:10:05.007650 kernel: efifb: probing for efifb Mar 7 01:10:05.007656 kernel: efifb: framebuffer at 0x80000000, using 4032k, total 4032k Mar 7 01:10:05.007661 kernel: efifb: mode is 1280x800x32, linelength=5120, pages=1 Mar 7 01:10:05.007670 kernel: efifb: scrolling: redraw Mar 7 01:10:05.007676 kernel: efifb: Truecolor: size=8:8:8:8, shift=24:16:8:0 Mar 7 01:10:05.007682 kernel: Console: switching to colour frame buffer device 160x50 Mar 7 01:10:05.007690 kernel: fb0: EFI VGA frame buffer device Mar 7 01:10:05.007696 kernel: 
pstore: Using crash dump compression: deflate Mar 7 01:10:05.007701 kernel: pstore: Registered efi_pstore as persistent store backend Mar 7 01:10:05.007707 kernel: NET: Registered PF_INET6 protocol family Mar 7 01:10:05.007713 kernel: Segment Routing with IPv6 Mar 7 01:10:05.007718 kernel: In-situ OAM (IOAM) with IPv6 Mar 7 01:10:05.007726 kernel: NET: Registered PF_PACKET protocol family Mar 7 01:10:05.007732 kernel: Key type dns_resolver registered Mar 7 01:10:05.007738 kernel: IPI shorthand broadcast: enabled Mar 7 01:10:05.007743 kernel: sched_clock: Marking stable (1434011496, 216062935)->(1712483886, -62409455) Mar 7 01:10:05.007749 kernel: registered taskstats version 1 Mar 7 01:10:05.007755 kernel: Loading compiled-in X.509 certificates Mar 7 01:10:05.007761 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.6.127-flatcar: da286e6f6c247ee6f65a875c513de7da57782e90' Mar 7 01:10:05.007766 kernel: Key type .fscrypt registered Mar 7 01:10:05.007772 kernel: Key type fscrypt-provisioning registered Mar 7 01:10:05.007780 kernel: ima: No TPM chip found, activating TPM-bypass! 
Mar 7 01:10:05.007786 kernel: ima: Allocated hash algorithm: sha1 Mar 7 01:10:05.007792 kernel: ima: No architecture policies found Mar 7 01:10:05.007797 kernel: clk: Disabling unused clocks Mar 7 01:10:05.007803 kernel: Freeing unused kernel image (initmem) memory: 42892K Mar 7 01:10:05.007808 kernel: Write protecting the kernel read-only data: 36864k Mar 7 01:10:05.007814 kernel: Freeing unused kernel image (rodata/data gap) memory: 1824K Mar 7 01:10:05.007820 kernel: Run /init as init process Mar 7 01:10:05.007825 kernel: with arguments: Mar 7 01:10:05.007835 kernel: /init Mar 7 01:10:05.007841 kernel: with environment: Mar 7 01:10:05.007846 kernel: HOME=/ Mar 7 01:10:05.007852 kernel: TERM=linux Mar 7 01:10:05.007859 systemd[1]: systemd 255 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT default-hierarchy=unified) Mar 7 01:10:05.007867 systemd[1]: Detected virtualization kvm. Mar 7 01:10:05.007874 systemd[1]: Detected architecture x86-64. Mar 7 01:10:05.007882 systemd[1]: Running in initrd. Mar 7 01:10:05.007888 systemd[1]: No hostname configured, using default hostname. Mar 7 01:10:05.007894 systemd[1]: Hostname set to . Mar 7 01:10:05.007900 systemd[1]: Initializing machine ID from VM UUID. Mar 7 01:10:05.007906 systemd[1]: Queued start job for default target initrd.target. Mar 7 01:10:05.007912 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Mar 7 01:10:05.007918 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Mar 7 01:10:05.007925 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... 
Mar 7 01:10:05.007933 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Mar 7 01:10:05.007939 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... Mar 7 01:10:05.007945 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... Mar 7 01:10:05.007952 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132... Mar 7 01:10:05.007958 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr... Mar 7 01:10:05.007964 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Mar 7 01:10:05.007970 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Mar 7 01:10:05.007978 systemd[1]: Reached target paths.target - Path Units. Mar 7 01:10:05.007984 systemd[1]: Reached target slices.target - Slice Units. Mar 7 01:10:05.007990 systemd[1]: Reached target swap.target - Swaps. Mar 7 01:10:05.007996 systemd[1]: Reached target timers.target - Timer Units. Mar 7 01:10:05.008002 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. Mar 7 01:10:05.008008 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. Mar 7 01:10:05.008014 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). Mar 7 01:10:05.008020 systemd[1]: Listening on systemd-journald.socket - Journal Socket. Mar 7 01:10:05.008025 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Mar 7 01:10:05.008034 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Mar 7 01:10:05.008040 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Mar 7 01:10:05.008046 systemd[1]: Reached target sockets.target - Socket Units. Mar 7 01:10:05.008052 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... 
Mar 7 01:10:05.008060 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Mar 7 01:10:05.008066 systemd[1]: Finished network-cleanup.service - Network Cleanup. Mar 7 01:10:05.008072 systemd[1]: Starting systemd-fsck-usr.service... Mar 7 01:10:05.008078 systemd[1]: Starting systemd-journald.service - Journal Service... Mar 7 01:10:05.008086 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Mar 7 01:10:05.008111 systemd-journald[187]: Collecting audit messages is disabled. Mar 7 01:10:05.008125 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Mar 7 01:10:05.008131 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. Mar 7 01:10:05.008139 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Mar 7 01:10:05.008145 systemd[1]: Finished systemd-fsck-usr.service. Mar 7 01:10:05.008152 systemd-journald[187]: Journal started Mar 7 01:10:05.008168 systemd-journald[187]: Runtime Journal (/run/log/journal/867eba112a0c477889a3aa10f34e5022) is 8.0M, max 76.3M, 68.3M free. Mar 7 01:10:05.005989 systemd-modules-load[189]: Inserted module 'overlay' Mar 7 01:10:05.013266 systemd[1]: Started systemd-journald.service - Journal Service. Mar 7 01:10:05.014294 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Mar 7 01:10:05.022587 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Mar 7 01:10:05.023891 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Mar 7 01:10:05.026530 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Mar 7 01:10:05.033422 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. 
Mar 7 01:10:05.038423 kernel: Bridge firewalling registered Mar 7 01:10:05.036266 systemd-modules-load[189]: Inserted module 'br_netfilter' Mar 7 01:10:05.038601 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Mar 7 01:10:05.046566 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Mar 7 01:10:05.047705 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Mar 7 01:10:05.049255 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Mar 7 01:10:05.050192 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Mar 7 01:10:05.055566 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... Mar 7 01:10:05.057508 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Mar 7 01:10:05.058656 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Mar 7 01:10:05.067384 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Mar 7 01:10:05.071048 dracut-cmdline[217]: dracut-dracut-053 Mar 7 01:10:05.073800 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Mar 7 01:10:05.076019 dracut-cmdline[217]: Using kernel command line parameters: rd.driver.pre=btrfs rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=hetzner verity.usrhash=531e046a631dbba7b4aae1b7955ffa961f5ce7d570e89a624d767cf739ab70b5 Mar 7 01:10:05.092511 systemd-resolved[225]: Positive Trust Anchors: Mar 7 01:10:05.092526 systemd-resolved[225]: . 
IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Mar 7 01:10:05.092548 systemd-resolved[225]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Mar 7 01:10:05.095137 systemd-resolved[225]: Defaulting to hostname 'linux'. Mar 7 01:10:05.096718 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Mar 7 01:10:05.097343 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Mar 7 01:10:05.136436 kernel: SCSI subsystem initialized Mar 7 01:10:05.144417 kernel: Loading iSCSI transport class v2.0-870. Mar 7 01:10:05.153421 kernel: iscsi: registered transport (tcp) Mar 7 01:10:05.169439 kernel: iscsi: registered transport (qla4xxx) Mar 7 01:10:05.169495 kernel: QLogic iSCSI HBA Driver Mar 7 01:10:05.204384 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. Mar 7 01:10:05.209528 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... Mar 7 01:10:05.231376 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. 
Mar 7 01:10:05.231464 kernel: device-mapper: uevent: version 1.0.3 Mar 7 01:10:05.234434 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@redhat.com Mar 7 01:10:05.271458 kernel: raid6: avx512x4 gen() 44740 MB/s Mar 7 01:10:05.289423 kernel: raid6: avx512x2 gen() 46234 MB/s Mar 7 01:10:05.307446 kernel: raid6: avx512x1 gen() 42612 MB/s Mar 7 01:10:05.325445 kernel: raid6: avx2x4 gen() 46738 MB/s Mar 7 01:10:05.343448 kernel: raid6: avx2x2 gen() 49050 MB/s Mar 7 01:10:05.362480 kernel: raid6: avx2x1 gen() 39531 MB/s Mar 7 01:10:05.362513 kernel: raid6: using algorithm avx2x2 gen() 49050 MB/s Mar 7 01:10:05.382581 kernel: raid6: .... xor() 37322 MB/s, rmw enabled Mar 7 01:10:05.382614 kernel: raid6: using avx512x2 recovery algorithm Mar 7 01:10:05.421431 kernel: xor: automatically using best checksumming function avx Mar 7 01:10:05.539460 kernel: Btrfs loaded, zoned=no, fsverity=no Mar 7 01:10:05.553500 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. Mar 7 01:10:05.560525 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Mar 7 01:10:05.594471 systemd-udevd[407]: Using default interface naming scheme 'v255'. Mar 7 01:10:05.600060 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Mar 7 01:10:05.607537 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... Mar 7 01:10:05.632259 dracut-pre-trigger[417]: rd.md=0: removing MD RAID activation Mar 7 01:10:05.659742 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. Mar 7 01:10:05.669876 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Mar 7 01:10:05.742909 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Mar 7 01:10:05.750582 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... 
Mar 7 01:10:05.760314 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. Mar 7 01:10:05.761260 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. Mar 7 01:10:05.762844 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Mar 7 01:10:05.763163 systemd[1]: Reached target remote-fs.target - Remote File Systems. Mar 7 01:10:05.769570 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... Mar 7 01:10:05.791213 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. Mar 7 01:10:05.838760 kernel: ACPI: bus type USB registered Mar 7 01:10:05.838808 kernel: usbcore: registered new interface driver usbfs Mar 7 01:10:05.844478 kernel: cryptd: max_cpu_qlen set to 1000 Mar 7 01:10:05.847349 kernel: usbcore: registered new interface driver hub Mar 7 01:10:05.850536 kernel: scsi host0: Virtio SCSI HBA Mar 7 01:10:05.857534 kernel: usbcore: registered new device driver usb Mar 7 01:10:05.858893 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Mar 7 01:10:05.864857 kernel: scsi 0:0:0:0: Direct-Access QEMU QEMU HARDDISK 2.5+ PQ: 0 ANSI: 5 Mar 7 01:10:05.859005 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Mar 7 01:10:05.859481 systemd[1]: Stopping dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Mar 7 01:10:05.859792 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Mar 7 01:10:05.859881 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Mar 7 01:10:05.860206 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... Mar 7 01:10:05.875143 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Mar 7 01:10:05.880313 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Mar 7 01:10:05.881353 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. 
Mar 7 01:10:05.886759 kernel: AVX2 version of gcm_enc/dec engaged. Mar 7 01:10:05.886780 kernel: AES CTR mode by8 optimization enabled Mar 7 01:10:05.891577 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Mar 7 01:10:05.901425 kernel: libata version 3.00 loaded. Mar 7 01:10:05.920315 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Mar 7 01:10:05.924670 kernel: ahci 0000:00:1f.2: version 3.0 Mar 7 01:10:05.924870 kernel: ACPI: \_SB_.GSIA: Enabled at IRQ 16 Mar 7 01:10:05.933425 kernel: ahci 0000:00:1f.2: AHCI 0001.0000 32 slots 6 ports 1.5 Gbps 0x3f impl SATA mode Mar 7 01:10:05.933648 kernel: ahci 0000:00:1f.2: flags: 64bit ncq only Mar 7 01:10:05.932438 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Mar 7 01:10:05.940424 kernel: scsi host1: ahci Mar 7 01:10:05.944232 kernel: scsi host2: ahci Mar 7 01:10:05.947409 kernel: scsi host3: ahci Mar 7 01:10:05.947611 kernel: scsi host4: ahci Mar 7 01:10:05.950353 kernel: scsi host5: ahci Mar 7 01:10:05.963848 kernel: scsi host6: ahci Mar 7 01:10:05.964074 kernel: ata1: SATA max UDMA/133 abar m4096@0x81380000 port 0x81380100 irq 48 Mar 7 01:10:05.964083 kernel: ata2: SATA max UDMA/133 abar m4096@0x81380000 port 0x81380180 irq 48 Mar 7 01:10:05.964092 kernel: ata3: SATA max UDMA/133 abar m4096@0x81380000 port 0x81380200 irq 48 Mar 7 01:10:05.964100 kernel: ata4: SATA max UDMA/133 abar m4096@0x81380000 port 0x81380280 irq 48 Mar 7 01:10:05.964107 kernel: ata5: SATA max UDMA/133 abar m4096@0x81380000 port 0x81380300 irq 48 Mar 7 01:10:05.964115 kernel: ata6: SATA max UDMA/133 abar m4096@0x81380000 port 0x81380380 irq 48 Mar 7 01:10:05.959824 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. 
Mar 7 01:10:05.975013 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller Mar 7 01:10:05.975209 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 1 Mar 7 01:10:05.975328 kernel: xhci_hcd 0000:02:00.0: hcc params 0x00087001 hci version 0x100 quirks 0x0000000000000010 Mar 7 01:10:05.977656 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller Mar 7 01:10:05.977826 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 2 Mar 7 01:10:05.981341 kernel: xhci_hcd 0000:02:00.0: Host supports USB 3.0 SuperSpeed Mar 7 01:10:05.984182 kernel: hub 1-0:1.0: USB hub found Mar 7 01:10:05.984390 kernel: hub 1-0:1.0: 4 ports detected Mar 7 01:10:05.988319 kernel: usb usb2: We don't know the algorithms for LPM for this host, disabling LPM. Mar 7 01:10:05.988523 kernel: hub 2-0:1.0: USB hub found Mar 7 01:10:05.988667 kernel: hub 2-0:1.0: 4 ports detected Mar 7 01:10:06.228806 kernel: usb 1-1: new high-speed USB device number 2 using xhci_hcd Mar 7 01:10:06.280438 kernel: ata2: SATA link down (SStatus 0 SControl 300) Mar 7 01:10:06.280549 kernel: ata1: SATA link up 1.5 Gbps (SStatus 113 SControl 300) Mar 7 01:10:06.288783 kernel: ata4: SATA link down (SStatus 0 SControl 300) Mar 7 01:10:06.289451 kernel: ata3: SATA link down (SStatus 0 SControl 300) Mar 7 01:10:06.300961 kernel: ata1.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100 Mar 7 01:10:06.301020 kernel: ata1.00: applying bridge limits Mar 7 01:10:06.306467 kernel: ata5: SATA link down (SStatus 0 SControl 300) Mar 7 01:10:06.306555 kernel: ata6: SATA link down (SStatus 0 SControl 300) Mar 7 01:10:06.314477 kernel: ata1.00: configured for UDMA/100 Mar 7 01:10:06.321471 kernel: scsi 1:0:0:0: CD-ROM QEMU QEMU DVD-ROM 2.5+ PQ: 0 ANSI: 5 Mar 7 01:10:06.357697 kernel: sd 0:0:0:0: Power-on or device reset occurred Mar 7 01:10:06.358084 kernel: sd 0:0:0:0: [sda] 160006144 512-byte logical blocks: (81.9 GB/76.3 GiB) Mar 7 01:10:06.363773 kernel: sd 0:0:0:0: [sda] Write Protect is off Mar 7 
01:10:06.366734 kernel: sd 0:0:0:0: [sda] Mode Sense: 63 00 00 08 Mar 7 01:10:06.367052 kernel: sd 0:0:0:0: [sda] Write cache: enabled, read cache: enabled, doesn't support DPO or FUA Mar 7 01:10:06.383421 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk. Mar 7 01:10:06.383473 kernel: GPT:17805311 != 160006143 Mar 7 01:10:06.383498 kernel: GPT:Alternate GPT header not at the end of the disk. Mar 7 01:10:06.383514 kernel: GPT:17805311 != 160006143 Mar 7 01:10:06.384636 kernel: GPT: Use GNU Parted to correct GPT errors. Mar 7 01:10:06.387429 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Mar 7 01:10:06.389292 kernel: sd 0:0:0:0: [sda] Attached SCSI disk Mar 7 01:10:06.392429 kernel: hid: raw HID events driver (C) Jiri Kosina Mar 7 01:10:06.399611 kernel: usbcore: registered new interface driver usbhid Mar 7 01:10:06.399649 kernel: usbhid: USB HID core driver Mar 7 01:10:06.408443 kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:02.1/0000:02:00.0/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input2 Mar 7 01:10:06.408501 kernel: sr 1:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray Mar 7 01:10:06.410419 kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:02:00.0-1/input0 Mar 7 01:10:06.410618 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20 Mar 7 01:10:06.427461 kernel: sr 1:0:0:0: Attached scsi CD-ROM sr0 Mar 7 01:10:06.434280 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - QEMU_HARDDISK EFI-SYSTEM. Mar 7 01:10:06.447669 kernel: BTRFS: device label OEM devid 1 transid 9 /dev/sda6 scanned by (udev-worker) (476) Mar 7 01:10:06.447698 kernel: BTRFS: device fsid 3bed8db9-42ad-4483-9cc8-1ad17a6cd948 devid 1 transid 34 /dev/sda3 scanned by (udev-worker) (468) Mar 7 01:10:06.455636 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - QEMU_HARDDISK ROOT. 
Mar 7 01:10:06.461669 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - QEMU_HARDDISK USR-A. Mar 7 01:10:06.462091 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - QEMU_HARDDISK USR-A. Mar 7 01:10:06.466320 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - QEMU_HARDDISK OEM. Mar 7 01:10:06.470553 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... Mar 7 01:10:06.478951 disk-uuid[587]: Primary Header is updated. Mar 7 01:10:06.478951 disk-uuid[587]: Secondary Entries is updated. Mar 7 01:10:06.478951 disk-uuid[587]: Secondary Header is updated. Mar 7 01:10:06.484446 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Mar 7 01:10:06.490451 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Mar 7 01:10:06.497433 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Mar 7 01:10:07.502499 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Mar 7 01:10:07.507023 disk-uuid[588]: The operation has completed successfully. Mar 7 01:10:07.576224 systemd[1]: disk-uuid.service: Deactivated successfully. Mar 7 01:10:07.576320 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. Mar 7 01:10:07.588503 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr... Mar 7 01:10:07.591482 sh[609]: Success Mar 7 01:10:07.603454 kernel: device-mapper: verity: sha256 using implementation "sha256-ni" Mar 7 01:10:07.644982 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr. Mar 7 01:10:07.652504 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr... Mar 7 01:10:07.653456 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr. 
Mar 7 01:10:07.667625 kernel: BTRFS info (device dm-0): first mount of filesystem 3bed8db9-42ad-4483-9cc8-1ad17a6cd948 Mar 7 01:10:07.667670 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm Mar 7 01:10:07.672151 kernel: BTRFS warning (device dm-0): 'nologreplay' is deprecated, use 'rescue=nologreplay' instead Mar 7 01:10:07.672173 kernel: BTRFS info (device dm-0): disabling log replay at mount time Mar 7 01:10:07.675801 kernel: BTRFS info (device dm-0): using free space tree Mar 7 01:10:07.684425 kernel: BTRFS info (device dm-0): enabling ssd optimizations Mar 7 01:10:07.686118 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr. Mar 7 01:10:07.687061 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met. Mar 7 01:10:07.692515 systemd[1]: Starting ignition-setup.service - Ignition (setup)... Mar 7 01:10:07.693511 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... Mar 7 01:10:07.711787 kernel: BTRFS info (device sda6): first mount of filesystem 872bf425-12c9-4ef2-aaf0-71379b3513d9 Mar 7 01:10:07.711838 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm Mar 7 01:10:07.711848 kernel: BTRFS info (device sda6): using free space tree Mar 7 01:10:07.720798 kernel: BTRFS info (device sda6): enabling ssd optimizations Mar 7 01:10:07.720851 kernel: BTRFS info (device sda6): auto enabling async discard Mar 7 01:10:07.733846 systemd[1]: mnt-oem.mount: Deactivated successfully. Mar 7 01:10:07.734442 kernel: BTRFS info (device sda6): last unmount of filesystem 872bf425-12c9-4ef2-aaf0-71379b3513d9 Mar 7 01:10:07.740344 systemd[1]: Finished ignition-setup.service - Ignition (setup). Mar 7 01:10:07.747557 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... 
Mar 7 01:10:07.793684 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Mar 7 01:10:07.800143 systemd[1]: Starting systemd-networkd.service - Network Configuration... Mar 7 01:10:07.820590 ignition[732]: Ignition 2.19.0 Mar 7 01:10:07.821182 ignition[732]: Stage: fetch-offline Mar 7 01:10:07.820828 systemd-networkd[791]: lo: Link UP Mar 7 01:10:07.821237 ignition[732]: no configs at "/usr/lib/ignition/base.d" Mar 7 01:10:07.820833 systemd-networkd[791]: lo: Gained carrier Mar 7 01:10:07.821249 ignition[732]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Mar 7 01:10:07.821329 ignition[732]: parsed url from cmdline: "" Mar 7 01:10:07.821332 ignition[732]: no config URL provided Mar 7 01:10:07.821337 ignition[732]: reading system config file "/usr/lib/ignition/user.ign" Mar 7 01:10:07.821347 ignition[732]: no config at "/usr/lib/ignition/user.ign" Mar 7 01:10:07.824195 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). Mar 7 01:10:07.821352 ignition[732]: failed to fetch config: resource requires networking Mar 7 01:10:07.825559 systemd-networkd[791]: Enumeration completed Mar 7 01:10:07.821530 ignition[732]: Ignition finished successfully Mar 7 01:10:07.826209 systemd-networkd[791]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Mar 7 01:10:07.826213 systemd-networkd[791]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Mar 7 01:10:07.827237 systemd[1]: Started systemd-networkd.service - Network Configuration. Mar 7 01:10:07.827753 systemd-networkd[791]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Mar 7 01:10:07.827757 systemd-networkd[791]: eth1: Configuring with /usr/lib/systemd/network/zz-default.network. 
Mar 7 01:10:07.828370 systemd-networkd[791]: eth0: Link UP
Mar 7 01:10:07.828374 systemd-networkd[791]: eth0: Gained carrier
Mar 7 01:10:07.828381 systemd-networkd[791]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Mar 7 01:10:07.828688 systemd[1]: Reached target network.target - Network.
Mar 7 01:10:07.832621 systemd-networkd[791]: eth1: Link UP
Mar 7 01:10:07.832625 systemd-networkd[791]: eth1: Gained carrier
Mar 7 01:10:07.832631 systemd-networkd[791]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Mar 7 01:10:07.833552 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)...
Mar 7 01:10:07.846117 ignition[798]: Ignition 2.19.0
Mar 7 01:10:07.846129 ignition[798]: Stage: fetch
Mar 7 01:10:07.846261 ignition[798]: no configs at "/usr/lib/ignition/base.d"
Mar 7 01:10:07.846271 ignition[798]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Mar 7 01:10:07.846347 ignition[798]: parsed url from cmdline: ""
Mar 7 01:10:07.846351 ignition[798]: no config URL provided
Mar 7 01:10:07.846355 ignition[798]: reading system config file "/usr/lib/ignition/user.ign"
Mar 7 01:10:07.846363 ignition[798]: no config at "/usr/lib/ignition/user.ign"
Mar 7 01:10:07.846377 ignition[798]: GET http://169.254.169.254/hetzner/v1/userdata: attempt #1
Mar 7 01:10:07.846519 ignition[798]: GET error: Get "http://169.254.169.254/hetzner/v1/userdata": dial tcp 169.254.169.254:80: connect: network is unreachable
Mar 7 01:10:07.870459 systemd-networkd[791]: eth1: DHCPv4 address 10.0.0.3/32 acquired from 10.0.0.1
Mar 7 01:10:07.890448 systemd-networkd[791]: eth0: DHCPv4 address 89.167.115.210/32, gateway 172.31.1.1 acquired from 172.31.1.1
Mar 7 01:10:08.046720 ignition[798]: GET http://169.254.169.254/hetzner/v1/userdata: attempt #2
Mar 7 01:10:08.058840 ignition[798]: GET result: OK
Mar 7 01:10:08.058954 ignition[798]: parsing config with SHA512: 19cab71d32507a05de6535e545638489bd5543a3c4ddb94f9d870a93500de591ba24acc4b7ffd418791baab843be82f9a64907792cc7ed6b23ccb7558ac6e696
Mar 7 01:10:08.063999 unknown[798]: fetched base config from "system"
Mar 7 01:10:08.064025 unknown[798]: fetched base config from "system"
Mar 7 01:10:08.064463 ignition[798]: fetch: fetch complete
Mar 7 01:10:08.064037 unknown[798]: fetched user config from "hetzner"
Mar 7 01:10:08.064473 ignition[798]: fetch: fetch passed
Mar 7 01:10:08.064564 ignition[798]: Ignition finished successfully
Mar 7 01:10:08.070083 systemd[1]: Finished ignition-fetch.service - Ignition (fetch).
Mar 7 01:10:08.076719 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)...
Mar 7 01:10:08.115604 ignition[807]: Ignition 2.19.0
Mar 7 01:10:08.115621 ignition[807]: Stage: kargs
Mar 7 01:10:08.115882 ignition[807]: no configs at "/usr/lib/ignition/base.d"
Mar 7 01:10:08.115901 ignition[807]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Mar 7 01:10:08.117085 ignition[807]: kargs: kargs passed
Mar 7 01:10:08.117153 ignition[807]: Ignition finished successfully
Mar 7 01:10:08.121545 systemd[1]: Finished ignition-kargs.service - Ignition (kargs).
Mar 7 01:10:08.132636 systemd[1]: Starting ignition-disks.service - Ignition (disks)...
Mar 7 01:10:08.174026 ignition[813]: Ignition 2.19.0
Mar 7 01:10:08.174054 ignition[813]: Stage: disks
Mar 7 01:10:08.174377 ignition[813]: no configs at "/usr/lib/ignition/base.d"
Mar 7 01:10:08.174441 ignition[813]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Mar 7 01:10:08.175973 ignition[813]: disks: disks passed
Mar 7 01:10:08.176087 ignition[813]: Ignition finished successfully
Mar 7 01:10:08.178902 systemd[1]: Finished ignition-disks.service - Ignition (disks).
Mar 7 01:10:08.180607 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device.
Mar 7 01:10:08.182220 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems.
Mar 7 01:10:08.182913 systemd[1]: Reached target local-fs.target - Local File Systems.
Mar 7 01:10:08.184260 systemd[1]: Reached target sysinit.target - System Initialization.
Mar 7 01:10:08.185561 systemd[1]: Reached target basic.target - Basic System.
Mar 7 01:10:08.191613 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT...
Mar 7 01:10:08.227724 systemd-fsck[821]: ROOT: clean, 14/1628000 files, 120691/1617920 blocks
Mar 7 01:10:08.232979 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT.
Mar 7 01:10:08.240007 systemd[1]: Mounting sysroot.mount - /sysroot...
Mar 7 01:10:08.330429 kernel: EXT4-fs (sda9): mounted filesystem aab0506b-de72-4dd2-9393-24d7958f49a5 r/w with ordered data mode. Quota mode: none.
Mar 7 01:10:08.331646 systemd[1]: Mounted sysroot.mount - /sysroot.
Mar 7 01:10:08.333704 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System.
Mar 7 01:10:08.340575 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Mar 7 01:10:08.343996 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr...
Mar 7 01:10:08.359422 kernel: BTRFS: device label OEM devid 1 transid 10 /dev/sda6 scanned by mount (829)
Mar 7 01:10:08.358080 systemd[1]: Starting flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent...
Mar 7 01:10:08.359637 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot).
Mar 7 01:10:08.388423 kernel: BTRFS info (device sda6): first mount of filesystem 872bf425-12c9-4ef2-aaf0-71379b3513d9
Mar 7 01:10:08.388464 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm
Mar 7 01:10:08.388487 kernel: BTRFS info (device sda6): using free space tree
Mar 7 01:10:08.388507 kernel: BTRFS info (device sda6): enabling ssd optimizations
Mar 7 01:10:08.388527 kernel: BTRFS info (device sda6): auto enabling async discard
Mar 7 01:10:08.359704 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup.
Mar 7 01:10:08.393175 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Mar 7 01:10:08.394072 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr.
Mar 7 01:10:08.405561 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup...
Mar 7 01:10:08.413885 coreos-metadata[831]: Mar 07 01:10:08.413 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/hostname: Attempt #1
Mar 7 01:10:08.414697 coreos-metadata[831]: Mar 07 01:10:08.414 INFO Fetch successful
Mar 7 01:10:08.415928 coreos-metadata[831]: Mar 07 01:10:08.415 INFO wrote hostname ci-4081-3-6-n-e40d23dcbc to /sysroot/etc/hostname
Mar 7 01:10:08.417092 systemd[1]: Finished flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent.
Mar 7 01:10:08.437708 initrd-setup-root[857]: cut: /sysroot/etc/passwd: No such file or directory
Mar 7 01:10:08.443561 initrd-setup-root[864]: cut: /sysroot/etc/group: No such file or directory
Mar 7 01:10:08.447392 initrd-setup-root[871]: cut: /sysroot/etc/shadow: No such file or directory
Mar 7 01:10:08.452333 initrd-setup-root[878]: cut: /sysroot/etc/gshadow: No such file or directory
Mar 7 01:10:08.540130 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup.
Mar 7 01:10:08.546500 systemd[1]: Starting ignition-mount.service - Ignition (mount)...
Mar 7 01:10:08.548516 systemd[1]: Starting sysroot-boot.service - /sysroot/boot...
Mar 7 01:10:08.556475 kernel: BTRFS info (device sda6): last unmount of filesystem 872bf425-12c9-4ef2-aaf0-71379b3513d9
Mar 7 01:10:08.574891 ignition[946]: INFO : Ignition 2.19.0
Mar 7 01:10:08.575554 ignition[946]: INFO : Stage: mount
Mar 7 01:10:08.575924 systemd[1]: Finished sysroot-boot.service - /sysroot/boot.
Mar 7 01:10:08.577141 ignition[946]: INFO : no configs at "/usr/lib/ignition/base.d"
Mar 7 01:10:08.577141 ignition[946]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Mar 7 01:10:08.577141 ignition[946]: INFO : mount: mount passed
Mar 7 01:10:08.577141 ignition[946]: INFO : Ignition finished successfully
Mar 7 01:10:08.578226 systemd[1]: Finished ignition-mount.service - Ignition (mount).
Mar 7 01:10:08.583526 systemd[1]: Starting ignition-files.service - Ignition (files)...
Mar 7 01:10:08.666920 systemd[1]: sysroot-oem.mount: Deactivated successfully.
Mar 7 01:10:08.671636 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Mar 7 01:10:08.701932 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/sda6 scanned by mount (958)
Mar 7 01:10:08.702001 kernel: BTRFS info (device sda6): first mount of filesystem 872bf425-12c9-4ef2-aaf0-71379b3513d9
Mar 7 01:10:08.709076 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm
Mar 7 01:10:08.714607 kernel: BTRFS info (device sda6): using free space tree
Mar 7 01:10:08.729080 kernel: BTRFS info (device sda6): enabling ssd optimizations
Mar 7 01:10:08.729156 kernel: BTRFS info (device sda6): auto enabling async discard
Mar 7 01:10:08.734350 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Mar 7 01:10:08.769439 ignition[974]: INFO : Ignition 2.19.0
Mar 7 01:10:08.769439 ignition[974]: INFO : Stage: files
Mar 7 01:10:08.769439 ignition[974]: INFO : no configs at "/usr/lib/ignition/base.d"
Mar 7 01:10:08.769439 ignition[974]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Mar 7 01:10:08.773012 ignition[974]: DEBUG : files: compiled without relabeling support, skipping
Mar 7 01:10:08.773685 ignition[974]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core"
Mar 7 01:10:08.773685 ignition[974]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core"
Mar 7 01:10:08.776871 ignition[974]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core"
Mar 7 01:10:08.777755 ignition[974]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core"
Mar 7 01:10:08.778968 unknown[974]: wrote ssh authorized keys file for user: core
Mar 7 01:10:08.779929 ignition[974]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core"
Mar 7 01:10:08.780820 ignition[974]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.17.3-linux-amd64.tar.gz"
Mar 7 01:10:08.782008 ignition[974]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.17.3-linux-amd64.tar.gz: attempt #1
Mar 7 01:10:08.934190 systemd-networkd[791]: eth1: Gained IPv6LL
Mar 7 01:10:08.985824 ignition[974]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK
Mar 7 01:10:09.292133 ignition[974]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.3-linux-amd64.tar.gz"
Mar 7 01:10:09.292133 ignition[974]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh"
Mar 7 01:10:09.295614 ignition[974]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh"
Mar 7 01:10:09.295614 ignition[974]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml"
Mar 7 01:10:09.295614 ignition[974]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml"
Mar 7 01:10:09.295614 ignition[974]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml"
Mar 7 01:10:09.295614 ignition[974]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml"
Mar 7 01:10:09.295614 ignition[974]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml"
Mar 7 01:10:09.295614 ignition[974]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml"
Mar 7 01:10:09.295614 ignition[974]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf"
Mar 7 01:10:09.295614 ignition[974]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf"
Mar 7 01:10:09.295614 ignition[974]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.33.8-x86-64.raw"
Mar 7 01:10:09.295614 ignition[974]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.33.8-x86-64.raw"
Mar 7 01:10:09.295614 ignition[974]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.33.8-x86-64.raw"
Mar 7 01:10:09.295614 ignition[974]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.33.8-x86-64.raw: attempt #1
Mar 7 01:10:09.636866 systemd-networkd[791]: eth0: Gained IPv6LL
Mar 7 01:10:09.705294 ignition[974]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK
Mar 7 01:10:10.005045 ignition[974]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.33.8-x86-64.raw"
Mar 7 01:10:10.005045 ignition[974]: INFO : files: op(b): [started] processing unit "prepare-helm.service"
Mar 7 01:10:10.006377 ignition[974]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Mar 7 01:10:10.006377 ignition[974]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Mar 7 01:10:10.006377 ignition[974]: INFO : files: op(b): [finished] processing unit "prepare-helm.service"
Mar 7 01:10:10.006377 ignition[974]: INFO : files: op(d): [started] processing unit "coreos-metadata.service"
Mar 7 01:10:10.006377 ignition[974]: INFO : files: op(d): op(e): [started] writing systemd drop-in "00-custom-metadata.conf" at "/sysroot/etc/systemd/system/coreos-metadata.service.d/00-custom-metadata.conf"
Mar 7 01:10:10.006377 ignition[974]: INFO : files: op(d): op(e): [finished] writing systemd drop-in "00-custom-metadata.conf" at "/sysroot/etc/systemd/system/coreos-metadata.service.d/00-custom-metadata.conf"
Mar 7 01:10:10.006377 ignition[974]: INFO : files: op(d): [finished] processing unit "coreos-metadata.service"
Mar 7 01:10:10.006377 ignition[974]: INFO : files: op(f): [started] setting preset to enabled for "prepare-helm.service"
Mar 7 01:10:10.011147 ignition[974]: INFO : files: op(f): [finished] setting preset to enabled for "prepare-helm.service"
Mar 7 01:10:10.011147 ignition[974]: INFO : files: createResultFile: createFiles: op(10): [started] writing file "/sysroot/etc/.ignition-result.json"
Mar 7 01:10:10.011147 ignition[974]: INFO : files: createResultFile: createFiles: op(10): [finished] writing file "/sysroot/etc/.ignition-result.json"
Mar 7 01:10:10.011147 ignition[974]: INFO : files: files passed
Mar 7 01:10:10.011147 ignition[974]: INFO : Ignition finished successfully
Mar 7 01:10:10.008202 systemd[1]: Finished ignition-files.service - Ignition (files).
Mar 7 01:10:10.018606 systemd[1]: Starting ignition-quench.service - Ignition (record completion)...
Mar 7 01:10:10.020577 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion...
Mar 7 01:10:10.022818 systemd[1]: ignition-quench.service: Deactivated successfully.
Mar 7 01:10:10.023332 systemd[1]: Finished ignition-quench.service - Ignition (record completion).
Mar 7 01:10:10.032896 initrd-setup-root-after-ignition[1004]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Mar 7 01:10:10.032896 initrd-setup-root-after-ignition[1004]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory
Mar 7 01:10:10.034457 initrd-setup-root-after-ignition[1008]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Mar 7 01:10:10.036001 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion.
Mar 7 01:10:10.037270 systemd[1]: Reached target ignition-complete.target - Ignition Complete.
Mar 7 01:10:10.040499 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root...
Mar 7 01:10:10.070350 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Mar 7 01:10:10.070492 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root.
Mar 7 01:10:10.072074 systemd[1]: Reached target initrd-fs.target - Initrd File Systems.
Mar 7 01:10:10.073373 systemd[1]: Reached target initrd.target - Initrd Default Target.
Mar 7 01:10:10.074032 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met.
Mar 7 01:10:10.074882 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook...
Mar 7 01:10:10.103275 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Mar 7 01:10:10.111576 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons...
Mar 7 01:10:10.120629 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups.
Mar 7 01:10:10.121325 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes.
Mar 7 01:10:10.122275 systemd[1]: Stopped target timers.target - Timer Units.
Mar 7 01:10:10.123164 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Mar 7 01:10:10.123273 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Mar 7 01:10:10.124455 systemd[1]: Stopped target initrd.target - Initrd Default Target.
Mar 7 01:10:10.125341 systemd[1]: Stopped target basic.target - Basic System.
Mar 7 01:10:10.126189 systemd[1]: Stopped target ignition-complete.target - Ignition Complete.
Mar 7 01:10:10.127062 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup.
Mar 7 01:10:10.127958 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device.
Mar 7 01:10:10.128880 systemd[1]: Stopped target remote-fs.target - Remote File Systems.
Mar 7 01:10:10.129807 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems.
Mar 7 01:10:10.130741 systemd[1]: Stopped target sysinit.target - System Initialization.
Mar 7 01:10:10.131611 systemd[1]: Stopped target local-fs.target - Local File Systems.
Mar 7 01:10:10.132506 systemd[1]: Stopped target swap.target - Swaps.
Mar 7 01:10:10.133364 systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Mar 7 01:10:10.133513 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook.
Mar 7 01:10:10.134689 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes.
Mar 7 01:10:10.135575 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Mar 7 01:10:10.136428 systemd[1]: clevis-luks-askpass.path: Deactivated successfully.
Mar 7 01:10:10.136558 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Mar 7 01:10:10.137367 systemd[1]: dracut-initqueue.service: Deactivated successfully.
Mar 7 01:10:10.137520 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook.
Mar 7 01:10:10.138585 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully.
Mar 7 01:10:10.138722 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion.
Mar 7 01:10:10.139502 systemd[1]: ignition-files.service: Deactivated successfully.
Mar 7 01:10:10.139596 systemd[1]: Stopped ignition-files.service - Ignition (files).
Mar 7 01:10:10.140371 systemd[1]: flatcar-metadata-hostname.service: Deactivated successfully.
Mar 7 01:10:10.140483 systemd[1]: Stopped flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent.
Mar 7 01:10:10.152536 systemd[1]: Stopping ignition-mount.service - Ignition (mount)...
Mar 7 01:10:10.152902 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Mar 7 01:10:10.153014 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes.
Mar 7 01:10:10.155215 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot...
Mar 7 01:10:10.155582 systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Mar 7 01:10:10.155701 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices.
Mar 7 01:10:10.156902 systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Mar 7 01:10:10.157373 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook.
Mar 7 01:10:10.161954 systemd[1]: initrd-cleanup.service: Deactivated successfully.
Mar 7 01:10:10.162381 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons.
Mar 7 01:10:10.173582 ignition[1028]: INFO : Ignition 2.19.0
Mar 7 01:10:10.173582 ignition[1028]: INFO : Stage: umount
Mar 7 01:10:10.173582 ignition[1028]: INFO : no configs at "/usr/lib/ignition/base.d"
Mar 7 01:10:10.173582 ignition[1028]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Mar 7 01:10:10.173582 ignition[1028]: INFO : umount: umount passed
Mar 7 01:10:10.173582 ignition[1028]: INFO : Ignition finished successfully
Mar 7 01:10:10.176078 systemd[1]: ignition-mount.service: Deactivated successfully.
Mar 7 01:10:10.176447 systemd[1]: Stopped ignition-mount.service - Ignition (mount).
Mar 7 01:10:10.177462 systemd[1]: ignition-disks.service: Deactivated successfully.
Mar 7 01:10:10.177539 systemd[1]: Stopped ignition-disks.service - Ignition (disks).
Mar 7 01:10:10.178009 systemd[1]: ignition-kargs.service: Deactivated successfully.
Mar 7 01:10:10.178052 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs).
Mar 7 01:10:10.178393 systemd[1]: ignition-fetch.service: Deactivated successfully.
Mar 7 01:10:10.181474 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch).
Mar 7 01:10:10.182173 systemd[1]: Stopped target network.target - Network.
Mar 7 01:10:10.183458 systemd[1]: ignition-fetch-offline.service: Deactivated successfully.
Mar 7 01:10:10.183508 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline).
Mar 7 01:10:10.184171 systemd[1]: Stopped target paths.target - Path Units.
Mar 7 01:10:10.185447 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Mar 7 01:10:10.189457 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Mar 7 01:10:10.189850 systemd[1]: Stopped target slices.target - Slice Units.
Mar 7 01:10:10.190457 systemd[1]: Stopped target sockets.target - Socket Units.
Mar 7 01:10:10.191118 systemd[1]: iscsid.socket: Deactivated successfully.
Mar 7 01:10:10.191166 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket.
Mar 7 01:10:10.191734 systemd[1]: iscsiuio.socket: Deactivated successfully.
Mar 7 01:10:10.191774 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Mar 7 01:10:10.192308 systemd[1]: ignition-setup.service: Deactivated successfully.
Mar 7 01:10:10.192353 systemd[1]: Stopped ignition-setup.service - Ignition (setup).
Mar 7 01:10:10.192937 systemd[1]: ignition-setup-pre.service: Deactivated successfully.
Mar 7 01:10:10.192975 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup.
Mar 7 01:10:10.193688 systemd[1]: Stopping systemd-networkd.service - Network Configuration...
Mar 7 01:10:10.194394 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution...
Mar 7 01:10:10.196029 systemd[1]: sysroot-boot.mount: Deactivated successfully.
Mar 7 01:10:10.197474 systemd-networkd[791]: eth1: DHCPv6 lease lost
Mar 7 01:10:10.197598 systemd[1]: sysroot-boot.service: Deactivated successfully.
Mar 7 01:10:10.197705 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot.
Mar 7 01:10:10.198509 systemd[1]: systemd-resolved.service: Deactivated successfully.
Mar 7 01:10:10.198607 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution.
Mar 7 01:10:10.202154 systemd[1]: initrd-setup-root.service: Deactivated successfully.
Mar 7 01:10:10.202207 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup.
Mar 7 01:10:10.202861 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Mar 7 01:10:10.202899 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories.
Mar 7 01:10:10.204038 systemd-networkd[791]: eth0: DHCPv6 lease lost
Mar 7 01:10:10.205349 systemd[1]: systemd-networkd.service: Deactivated successfully.
Mar 7 01:10:10.205531 systemd[1]: Stopped systemd-networkd.service - Network Configuration.
Mar 7 01:10:10.206325 systemd[1]: systemd-networkd.socket: Deactivated successfully.
Mar 7 01:10:10.206358 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket.
Mar 7 01:10:10.209490 systemd[1]: Stopping network-cleanup.service - Network Cleanup...
Mar 7 01:10:10.209821 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully.
Mar 7 01:10:10.209865 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Mar 7 01:10:10.210259 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Mar 7 01:10:10.210297 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables.
Mar 7 01:10:10.210695 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Mar 7 01:10:10.210743 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules.
Mar 7 01:10:10.211172 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files...
Mar 7 01:10:10.222764 systemd[1]: systemd-udevd.service: Deactivated successfully.
Mar 7 01:10:10.222904 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files.
Mar 7 01:10:10.225148 systemd[1]: network-cleanup.service: Deactivated successfully.
Mar 7 01:10:10.225270 systemd[1]: Stopped network-cleanup.service - Network Cleanup.
Mar 7 01:10:10.226723 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Mar 7 01:10:10.226776 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket.
Mar 7 01:10:10.227444 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Mar 7 01:10:10.227483 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket.
Mar 7 01:10:10.228043 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Mar 7 01:10:10.228085 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook.
Mar 7 01:10:10.229079 systemd[1]: dracut-cmdline.service: Deactivated successfully.
Mar 7 01:10:10.229121 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook.
Mar 7 01:10:10.230133 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
Mar 7 01:10:10.230176 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Mar 7 01:10:10.243583 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database...
Mar 7 01:10:10.243953 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Mar 7 01:10:10.244011 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Mar 7 01:10:10.244429 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Mar 7 01:10:10.244480 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Mar 7 01:10:10.250409 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Mar 7 01:10:10.250523 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database.
Mar 7 01:10:10.251082 systemd[1]: Reached target initrd-switch-root.target - Switch Root.
Mar 7 01:10:10.255574 systemd[1]: Starting initrd-switch-root.service - Switch Root...
Mar 7 01:10:10.262374 systemd[1]: Switching root.
Mar 7 01:10:10.317357 systemd-journald[187]: Journal stopped
Mar 7 01:10:11.365227 systemd-journald[187]: Received SIGTERM from PID 1 (systemd).
Mar 7 01:10:11.365309 kernel: SELinux: policy capability network_peer_controls=1
Mar 7 01:10:11.365321 kernel: SELinux: policy capability open_perms=1
Mar 7 01:10:11.365330 kernel: SELinux: policy capability extended_socket_class=1
Mar 7 01:10:11.365344 kernel: SELinux: policy capability always_check_network=0
Mar 7 01:10:11.365352 kernel: SELinux: policy capability cgroup_seclabel=1
Mar 7 01:10:11.365361 kernel: SELinux: policy capability nnp_nosuid_transition=1
Mar 7 01:10:11.365369 kernel: SELinux: policy capability genfs_seclabel_symlinks=0
Mar 7 01:10:11.365378 kernel: SELinux: policy capability ioctl_skip_cloexec=0
Mar 7 01:10:11.365386 kernel: audit: type=1403 audit(1772845810.481:2): auid=4294967295 ses=4294967295 lsm=selinux res=1
Mar 7 01:10:11.366707 systemd[1]: Successfully loaded SELinux policy in 48.087ms.
Mar 7 01:10:11.366737 systemd[1]: Relabeled /dev, /dev/shm, /run, /sys/fs/cgroup in 9.756ms.
Mar 7 01:10:11.366756 systemd[1]: systemd 255 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT default-hierarchy=unified)
Mar 7 01:10:11.366767 systemd[1]: Detected virtualization kvm.
Mar 7 01:10:11.366776 systemd[1]: Detected architecture x86-64.
Mar 7 01:10:11.366801 systemd[1]: Detected first boot.
Mar 7 01:10:11.366833 systemd[1]: Hostname set to <ci-4081-3-6-n-e40d23dcbc>.
Mar 7 01:10:11.366863 systemd[1]: Initializing machine ID from VM UUID.
Mar 7 01:10:11.366904 zram_generator::config[1072]: No configuration found.
Mar 7 01:10:11.366939 systemd[1]: Populated /etc with preset unit settings.
Mar 7 01:10:11.366971 systemd[1]: initrd-switch-root.service: Deactivated successfully.
Mar 7 01:10:11.367002 systemd[1]: Stopped initrd-switch-root.service - Switch Root.
Mar 7 01:10:11.367031 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Mar 7 01:10:11.367041 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config.
Mar 7 01:10:11.367058 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run.
Mar 7 01:10:11.367067 systemd[1]: Created slice system-getty.slice - Slice /system/getty.
Mar 7 01:10:11.367078 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe.
Mar 7 01:10:11.367088 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty.
Mar 7 01:10:11.367097 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit.
Mar 7 01:10:11.367106 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck.
Mar 7 01:10:11.367115 systemd[1]: Created slice user.slice - User and Session Slice.
Mar 7 01:10:11.367124 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Mar 7 01:10:11.367133 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Mar 7 01:10:11.367146 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch.
Mar 7 01:10:11.367155 systemd[1]: Set up automount boot.automount - Boot partition Automount Point.
Mar 7 01:10:11.367166 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point.
Mar 7 01:10:11.367175 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Mar 7 01:10:11.367184 systemd[1]: Expecting device dev-ttyS0.device - /dev/ttyS0...
Mar 7 01:10:11.367194 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Mar 7 01:10:11.367202 systemd[1]: Stopped target initrd-switch-root.target - Switch Root.
Mar 7 01:10:11.367212 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems. Mar 7 01:10:11.367223 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System. Mar 7 01:10:11.367231 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes. Mar 7 01:10:11.367240 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Mar 7 01:10:11.367249 systemd[1]: Reached target remote-fs.target - Remote File Systems. Mar 7 01:10:11.367258 systemd[1]: Reached target slices.target - Slice Units. Mar 7 01:10:11.367266 systemd[1]: Reached target swap.target - Swaps. Mar 7 01:10:11.367275 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes. Mar 7 01:10:11.367285 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket. Mar 7 01:10:11.367293 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Mar 7 01:10:11.367305 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Mar 7 01:10:11.367314 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Mar 7 01:10:11.367323 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket. Mar 7 01:10:11.367331 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System... Mar 7 01:10:11.367340 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System... Mar 7 01:10:11.367349 systemd[1]: Mounting media.mount - External Media Directory... Mar 7 01:10:11.367358 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Mar 7 01:10:11.367367 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System... Mar 7 01:10:11.367376 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System... Mar 7 01:10:11.367387 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp... 
Mar 7 01:10:11.367645 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw). Mar 7 01:10:11.367662 systemd[1]: Reached target machines.target - Containers. Mar 7 01:10:11.367679 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files... Mar 7 01:10:11.367688 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Mar 7 01:10:11.367697 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Mar 7 01:10:11.367706 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs... Mar 7 01:10:11.367715 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Mar 7 01:10:11.367727 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Mar 7 01:10:11.367736 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Mar 7 01:10:11.367745 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse... Mar 7 01:10:11.367755 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Mar 7 01:10:11.367763 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf). Mar 7 01:10:11.367772 systemd[1]: systemd-fsck-root.service: Deactivated successfully. Mar 7 01:10:11.367781 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device. Mar 7 01:10:11.367793 systemd[1]: systemd-fsck-usr.service: Deactivated successfully. Mar 7 01:10:11.367805 systemd[1]: Stopped systemd-fsck-usr.service. Mar 7 01:10:11.367816 systemd[1]: Starting systemd-journald.service - Journal Service... Mar 7 01:10:11.367825 kernel: fuse: init (API version 7.39) Mar 7 01:10:11.367834 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... 
Mar 7 01:10:11.367843 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Mar 7 01:10:11.367852 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems... Mar 7 01:10:11.367884 systemd-journald[1155]: Collecting audit messages is disabled. Mar 7 01:10:11.367904 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Mar 7 01:10:11.367913 systemd[1]: verity-setup.service: Deactivated successfully. Mar 7 01:10:11.367923 systemd[1]: Stopped verity-setup.service. Mar 7 01:10:11.367932 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Mar 7 01:10:11.367941 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System. Mar 7 01:10:11.367949 kernel: loop: module loaded Mar 7 01:10:11.367957 kernel: ACPI: bus type drm_connector registered Mar 7 01:10:11.367966 systemd-journald[1155]: Journal started Mar 7 01:10:11.367985 systemd-journald[1155]: Runtime Journal (/run/log/journal/867eba112a0c477889a3aa10f34e5022) is 8.0M, max 76.3M, 68.3M free. Mar 7 01:10:11.374881 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System. Mar 7 01:10:11.045296 systemd[1]: Queued start job for default target multi-user.target. Mar 7 01:10:11.071042 systemd[1]: Unnecessary job was removed for dev-sda6.device - /dev/sda6. Mar 7 01:10:11.071553 systemd[1]: systemd-journald.service: Deactivated successfully. Mar 7 01:10:11.378635 systemd[1]: Started systemd-journald.service - Journal Service. Mar 7 01:10:11.380016 systemd[1]: Mounted media.mount - External Media Directory. Mar 7 01:10:11.381015 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System. Mar 7 01:10:11.381500 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System. Mar 7 01:10:11.382009 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp. 
Mar 7 01:10:11.382653 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files. Mar 7 01:10:11.383250 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Mar 7 01:10:11.383876 systemd[1]: modprobe@configfs.service: Deactivated successfully. Mar 7 01:10:11.384024 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs. Mar 7 01:10:11.384726 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Mar 7 01:10:11.384855 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Mar 7 01:10:11.385713 systemd[1]: modprobe@drm.service: Deactivated successfully. Mar 7 01:10:11.385842 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Mar 7 01:10:11.386459 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Mar 7 01:10:11.386581 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Mar 7 01:10:11.387186 systemd[1]: modprobe@fuse.service: Deactivated successfully. Mar 7 01:10:11.387305 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse. Mar 7 01:10:11.388073 systemd[1]: modprobe@loop.service: Deactivated successfully. Mar 7 01:10:11.388197 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Mar 7 01:10:11.388786 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Mar 7 01:10:11.389344 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Mar 7 01:10:11.390184 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems. Mar 7 01:10:11.400165 systemd[1]: Reached target network-pre.target - Preparation for Network. Mar 7 01:10:11.407544 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System... Mar 7 01:10:11.413139 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System... 
Mar 7 01:10:11.413530 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/). Mar 7 01:10:11.413565 systemd[1]: Reached target local-fs.target - Local File Systems. Mar 7 01:10:11.415049 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management (Varlink). Mar 7 01:10:11.418538 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown... Mar 7 01:10:11.421066 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache... Mar 7 01:10:11.421608 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Mar 7 01:10:11.427617 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database... Mar 7 01:10:11.428848 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage... Mar 7 01:10:11.429209 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Mar 7 01:10:11.431479 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed... Mar 7 01:10:11.432259 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Mar 7 01:10:11.436617 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Mar 7 01:10:11.445084 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/... Mar 7 01:10:11.447837 systemd[1]: Starting systemd-sysusers.service - Create System Users... Mar 7 01:10:11.451077 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System. Mar 7 01:10:11.451530 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System. Mar 7 01:10:11.452747 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown. 
Mar 7 01:10:11.478527 kernel: loop0: detected capacity change from 0 to 228704 Mar 7 01:10:11.483557 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed. Mar 7 01:10:11.484385 systemd[1]: Reached target first-boot-complete.target - First Boot Complete. Mar 7 01:10:11.487595 systemd-journald[1155]: Time spent on flushing to /var/log/journal/867eba112a0c477889a3aa10f34e5022 is 65.968ms for 1182 entries. Mar 7 01:10:11.487595 systemd-journald[1155]: System Journal (/var/log/journal/867eba112a0c477889a3aa10f34e5022) is 8.0M, max 584.8M, 576.8M free. Mar 7 01:10:11.574163 systemd-journald[1155]: Received client request to flush runtime journal. Mar 7 01:10:11.574203 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher Mar 7 01:10:11.574226 kernel: loop1: detected capacity change from 0 to 140768 Mar 7 01:10:11.494927 systemd[1]: Starting systemd-machine-id-commit.service - Commit a transient machine-id on disk... Mar 7 01:10:11.500166 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Mar 7 01:10:11.563734 systemd[1]: Finished systemd-sysusers.service - Create System Users. Mar 7 01:10:11.573572 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Mar 7 01:10:11.575760 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage. Mar 7 01:10:11.588722 systemd[1]: etc-machine\x2did.mount: Deactivated successfully. Mar 7 01:10:11.592226 systemd[1]: Finished systemd-machine-id-commit.service - Commit a transient machine-id on disk. Mar 7 01:10:11.628029 kernel: loop2: detected capacity change from 0 to 142488 Mar 7 01:10:11.628559 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Mar 7 01:10:11.639768 systemd[1]: Starting systemd-udev-settle.service - Wait for udev To Complete Device Initialization... Mar 7 01:10:11.643606 systemd-tmpfiles[1206]: ACLs are not supported, ignoring. 
Mar 7 01:10:11.643621 systemd-tmpfiles[1206]: ACLs are not supported, ignoring. Mar 7 01:10:11.665859 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Mar 7 01:10:11.669903 udevadm[1213]: systemd-udev-settle.service is deprecated. Please fix lvm2-activation.service, lvm2-activation-early.service not to pull it in. Mar 7 01:10:11.685745 kernel: loop3: detected capacity change from 0 to 8 Mar 7 01:10:11.710423 kernel: loop4: detected capacity change from 0 to 228704 Mar 7 01:10:11.727439 kernel: loop5: detected capacity change from 0 to 140768 Mar 7 01:10:11.744427 kernel: loop6: detected capacity change from 0 to 142488 Mar 7 01:10:11.762808 kernel: loop7: detected capacity change from 0 to 8 Mar 7 01:10:11.763712 (sd-merge)[1217]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-hetzner'. Mar 7 01:10:11.765892 (sd-merge)[1217]: Merged extensions into '/usr'. Mar 7 01:10:11.772030 systemd[1]: Reloading requested from client PID 1192 ('systemd-sysext') (unit systemd-sysext.service)... Mar 7 01:10:11.772245 systemd[1]: Reloading... Mar 7 01:10:11.848428 zram_generator::config[1243]: No configuration found. Mar 7 01:10:11.941363 ldconfig[1187]: /sbin/ldconfig: /lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start. Mar 7 01:10:11.978726 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Mar 7 01:10:12.015788 systemd[1]: Reloading finished in 242 ms. Mar 7 01:10:12.045234 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache. Mar 7 01:10:12.046105 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database. Mar 7 01:10:12.046921 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/. 
Mar 7 01:10:12.056549 systemd[1]: Starting ensure-sysext.service... Mar 7 01:10:12.058086 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Mar 7 01:10:12.062527 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Mar 7 01:10:12.069503 systemd[1]: Reloading requested from client PID 1287 ('systemctl') (unit ensure-sysext.service)... Mar 7 01:10:12.069517 systemd[1]: Reloading... Mar 7 01:10:12.086208 systemd-tmpfiles[1288]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring. Mar 7 01:10:12.086522 systemd-tmpfiles[1288]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring. Mar 7 01:10:12.087319 systemd-tmpfiles[1288]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring. Mar 7 01:10:12.088387 systemd-tmpfiles[1288]: ACLs are not supported, ignoring. Mar 7 01:10:12.088545 systemd-tmpfiles[1288]: ACLs are not supported, ignoring. Mar 7 01:10:12.098159 systemd-tmpfiles[1288]: Detected autofs mount point /boot during canonicalization of boot. Mar 7 01:10:12.098173 systemd-tmpfiles[1288]: Skipping /boot Mar 7 01:10:12.105570 systemd-udevd[1289]: Using default interface naming scheme 'v255'. Mar 7 01:10:12.123484 zram_generator::config[1314]: No configuration found. Mar 7 01:10:12.132988 systemd-tmpfiles[1288]: Detected autofs mount point /boot during canonicalization of boot. Mar 7 01:10:12.133005 systemd-tmpfiles[1288]: Skipping /boot Mar 7 01:10:12.258148 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Mar 7 01:10:12.313573 systemd[1]: Reloading finished in 243 ms. 
Mar 7 01:10:12.324469 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input3 Mar 7 01:10:12.329006 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Mar 7 01:10:12.331117 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Mar 7 01:10:12.344418 kernel: ACPI: button: Power Button [PWRF] Mar 7 01:10:12.344751 systemd[1]: Condition check resulted in dev-ttyS0.device - /dev/ttyS0 being skipped. Mar 7 01:10:12.349581 systemd[1]: Starting audit-rules.service - Load Security Auditing Rules... Mar 7 01:10:12.352259 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs... Mar 7 01:10:12.354534 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog... Mar 7 01:10:12.358594 systemd[1]: Starting systemd-networkd.service - Network Configuration... Mar 7 01:10:12.367591 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Mar 7 01:10:12.370564 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP... Mar 7 01:10:12.376114 systemd[1]: Starting systemd-userdbd.service - User Database Manager... Mar 7 01:10:12.381281 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Mar 7 01:10:12.381433 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Mar 7 01:10:12.390011 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Mar 7 01:10:12.391635 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Mar 7 01:10:12.393821 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Mar 7 01:10:12.395247 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. 
Mar 7 01:10:12.395337 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Mar 7 01:10:12.401280 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Mar 7 01:10:12.401437 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Mar 7 01:10:12.401563 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Mar 7 01:10:12.401616 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Mar 7 01:10:12.407367 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Mar 7 01:10:12.407980 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Mar 7 01:10:12.414665 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Mar 7 01:10:12.415149 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Mar 7 01:10:12.415300 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Mar 7 01:10:12.417203 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Mar 7 01:10:12.417371 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Mar 7 01:10:12.422550 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP. Mar 7 01:10:12.429426 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog. Mar 7 01:10:12.434176 systemd[1]: Finished ensure-sysext.service. 
Mar 7 01:10:12.456699 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Mar 7 01:10:12.456865 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Mar 7 01:10:12.461860 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Mar 7 01:10:12.472546 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization... Mar 7 01:10:12.475831 systemd[1]: Starting systemd-update-done.service - Update is Completed... Mar 7 01:10:12.477603 systemd[1]: modprobe@drm.service: Deactivated successfully. Mar 7 01:10:12.477798 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Mar 7 01:10:12.487468 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 34 scanned by (udev-worker) (1374) Mar 7 01:10:12.490231 systemd[1]: modprobe@loop.service: Deactivated successfully. Mar 7 01:10:12.490447 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Mar 7 01:10:12.491036 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Mar 7 01:10:12.499081 systemd[1]: Started systemd-userdbd.service - User Database Manager. Mar 7 01:10:12.507158 systemd[1]: Finished systemd-update-done.service - Update is Completed. Mar 7 01:10:12.518484 systemd[1]: Condition check resulted in dev-virtio\x2dports-org.qemu.guest_agent.0.device - /dev/virtio-ports/org.qemu.guest_agent.0 being skipped. Mar 7 01:10:12.520934 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Mar 7 01:10:12.521042 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Mar 7 01:10:12.532554 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... 
Mar 7 01:10:12.536151 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Mar 7 01:10:12.541544 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Mar 7 01:10:12.542529 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Mar 7 01:10:12.542555 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Mar 7 01:10:12.544539 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs. Mar 7 01:10:12.545636 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). Mar 7 01:10:12.549761 kernel: mousedev: PS/2 mouse device common for all mice Mar 7 01:10:12.554309 augenrules[1440]: No rules Mar 7 01:10:12.556087 systemd[1]: Finished audit-rules.service - Load Security Auditing Rules. Mar 7 01:10:12.560099 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Mar 7 01:10:12.560714 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Mar 7 01:10:12.579277 systemd[1]: modprobe@loop.service: Deactivated successfully. Mar 7 01:10:12.581010 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Mar 7 01:10:12.582372 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Mar 7 01:10:12.582571 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Mar 7 01:10:12.585269 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Mar 7 01:10:12.585342 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. 
Mar 7 01:10:12.623917 systemd-resolved[1393]: Positive Trust Anchors: Mar 7 01:10:12.624376 systemd-resolved[1393]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Mar 7 01:10:12.624453 systemd-resolved[1393]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Mar 7 01:10:12.630005 systemd-resolved[1393]: Using system hostname 'ci-4081-3-6-n-e40d23dcbc'. Mar 7 01:10:12.632445 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Mar 7 01:10:12.632978 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Mar 7 01:10:12.653480 kernel: i801_smbus 0000:00:1f.3: Enabling SMBus device Mar 7 01:10:12.653754 kernel: i801_smbus 0000:00:1f.3: SMBus using PCI interrupt Mar 7 01:10:12.657405 kernel: i2c i2c-0: 1/1 memory slots populated (from DMI) Mar 7 01:10:12.657637 kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD Mar 7 01:10:12.656030 systemd-networkd[1392]: lo: Link UP Mar 7 01:10:12.656035 systemd-networkd[1392]: lo: Gained carrier Mar 7 01:10:12.658171 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization. Mar 7 01:10:12.658645 systemd[1]: Reached target time-set.target - System Time Set. Mar 7 01:10:12.660443 systemd-timesyncd[1417]: No network connectivity, watching for changes. Mar 7 01:10:12.660853 systemd-networkd[1392]: Enumeration completed Mar 7 01:10:12.660922 systemd[1]: Started systemd-networkd.service - Network Configuration. 
Mar 7 01:10:12.661507 systemd[1]: Reached target network.target - Network. Mar 7 01:10:12.661772 systemd-networkd[1392]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Mar 7 01:10:12.661776 systemd-networkd[1392]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Mar 7 01:10:12.662722 systemd-networkd[1392]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Mar 7 01:10:12.662726 systemd-networkd[1392]: eth1: Configuring with /usr/lib/systemd/network/zz-default.network. Mar 7 01:10:12.663420 systemd-networkd[1392]: eth0: Link UP Mar 7 01:10:12.663425 systemd-networkd[1392]: eth0: Gained carrier Mar 7 01:10:12.663435 systemd-networkd[1392]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Mar 7 01:10:12.668708 systemd-networkd[1392]: eth1: Link UP Mar 7 01:10:12.668715 systemd-networkd[1392]: eth1: Gained carrier Mar 7 01:10:12.668729 systemd-networkd[1392]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Mar 7 01:10:12.670512 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured... Mar 7 01:10:12.675671 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Mar 7 01:10:12.703990 systemd-networkd[1392]: eth1: DHCPv4 address 10.0.0.3/32 acquired from 10.0.0.1 Mar 7 01:10:12.707634 systemd-timesyncd[1417]: Network configuration changed, trying to establish connection. Mar 7 01:10:12.721139 kernel: input: ImExPS/2 Generic Explorer Mouse as /devices/platform/i8042/serio1/input/input4 Mar 7 01:10:12.720549 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Mar 7 01:10:12.720802 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. 
Mar 7 01:10:12.724469 systemd-networkd[1392]: eth0: DHCPv4 address 89.167.115.210/32, gateway 172.31.1.1 acquired from 172.31.1.1 Mar 7 01:10:12.726592 systemd-timesyncd[1417]: Network configuration changed, trying to establish connection. Mar 7 01:10:12.728356 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Mar 7 01:10:12.736059 kernel: EDAC MC: Ver: 3.0.0 Mar 7 01:10:12.747337 kernel: [drm] pci: virtio-vga detected at 0000:00:01.0 Mar 7 01:10:12.747437 kernel: Console: switching to colour dummy device 80x25 Mar 7 01:10:12.750417 kernel: virtio-pci 0000:00:01.0: vgaarb: deactivate vga console Mar 7 01:10:12.753780 kernel: [drm] features: -virgl +edid -resource_blob -host_visible Mar 7 01:10:12.753823 kernel: [drm] features: -context_init Mar 7 01:10:12.758658 kernel: [drm] number of scanouts: 1 Mar 7 01:10:12.758727 kernel: [drm] number of cap sets: 0 Mar 7 01:10:12.758107 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - QEMU_HARDDISK OEM. Mar 7 01:10:12.762047 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... Mar 7 01:10:12.762443 kernel: [drm] Initialized virtio_gpu 0.1.0 0 for 0000:00:01.0 on minor 0 Mar 7 01:10:12.765434 kernel: fbcon: virtio_gpudrmfb (fb0) is primary device Mar 7 01:10:12.769845 kernel: Console: switching to colour frame buffer device 160x50 Mar 7 01:10:12.776885 kernel: virtio-pci 0000:00:01.0: [drm] fb0: virtio_gpudrmfb frame buffer device Mar 7 01:10:12.777977 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Mar 7 01:10:12.778179 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Mar 7 01:10:12.782392 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Mar 7 01:10:12.792728 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. 
Mar 7 01:10:12.828201 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Mar 7 01:10:12.889536 systemd[1]: Finished systemd-udev-settle.service - Wait for udev To Complete Device Initialization. Mar 7 01:10:12.897704 systemd[1]: Starting lvm2-activation-early.service - Activation of LVM2 logical volumes... Mar 7 01:10:12.913675 lvm[1476]: WARNING: Failed to connect to lvmetad. Falling back to device scanning. Mar 7 01:10:12.949315 systemd[1]: Finished lvm2-activation-early.service - Activation of LVM2 logical volumes. Mar 7 01:10:12.951425 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Mar 7 01:10:12.951564 systemd[1]: Reached target sysinit.target - System Initialization. Mar 7 01:10:12.951860 systemd[1]: Started motdgen.path - Watch for update engine configuration changes. Mar 7 01:10:12.952032 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data. Mar 7 01:10:12.952811 systemd[1]: Started logrotate.timer - Daily rotation of log files. Mar 7 01:10:12.954113 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information.. Mar 7 01:10:12.954783 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories. Mar 7 01:10:12.954912 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). Mar 7 01:10:12.954953 systemd[1]: Reached target paths.target - Path Units. Mar 7 01:10:12.955038 systemd[1]: Reached target timers.target - Timer Units. Mar 7 01:10:12.956729 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket. Mar 7 01:10:12.960812 systemd[1]: Starting docker.socket - Docker Socket for the API... Mar 7 01:10:12.969321 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket. 
Mar 7 01:10:12.981725 systemd[1]: Starting lvm2-activation.service - Activation of LVM2 logical volumes... Mar 7 01:10:12.985045 systemd[1]: Listening on docker.socket - Docker Socket for the API. Mar 7 01:10:12.985630 systemd[1]: Reached target sockets.target - Socket Units. Mar 7 01:10:12.987454 systemd[1]: Reached target basic.target - Basic System. Mar 7 01:10:12.988944 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met. Mar 7 01:10:12.989028 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met. Mar 7 01:10:12.989831 lvm[1480]: WARNING: Failed to connect to lvmetad. Falling back to device scanning. Mar 7 01:10:12.997660 systemd[1]: Starting containerd.service - containerd container runtime... Mar 7 01:10:13.013598 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent... Mar 7 01:10:13.020669 systemd[1]: Starting dbus.service - D-Bus System Message Bus... Mar 7 01:10:13.025150 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... Mar 7 01:10:13.029723 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... Mar 7 01:10:13.031889 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). Mar 7 01:10:13.033332 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... Mar 7 01:10:13.038598 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin... Mar 7 01:10:13.043573 systemd[1]: Started qemu-guest-agent.service - QEMU Guest Agent. Mar 7 01:10:13.049588 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... Mar 7 01:10:13.058560 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... 
Mar 7 01:10:13.066783 jq[1486]: false
Mar 7 01:10:13.090031 extend-filesystems[1487]: Found loop4
Mar 7 01:10:13.090031 extend-filesystems[1487]: Found loop5
Mar 7 01:10:13.090031 extend-filesystems[1487]: Found loop6
Mar 7 01:10:13.090031 extend-filesystems[1487]: Found loop7
Mar 7 01:10:13.090031 extend-filesystems[1487]: Found sda
Mar 7 01:10:13.090031 extend-filesystems[1487]: Found sda1
Mar 7 01:10:13.090031 extend-filesystems[1487]: Found sda2
Mar 7 01:10:13.090031 extend-filesystems[1487]: Found sda3
Mar 7 01:10:13.090031 extend-filesystems[1487]: Found usr
Mar 7 01:10:13.090031 extend-filesystems[1487]: Found sda4
Mar 7 01:10:13.090031 extend-filesystems[1487]: Found sda6
Mar 7 01:10:13.090031 extend-filesystems[1487]: Found sda7
Mar 7 01:10:13.090031 extend-filesystems[1487]: Found sda9
Mar 7 01:10:13.090031 extend-filesystems[1487]: Checking size of /dev/sda9
Mar 7 01:10:13.127856 coreos-metadata[1482]: Mar 07 01:10:13.081 INFO Fetching http://169.254.169.254/hetzner/v1/metadata: Attempt #1
Mar 7 01:10:13.127856 coreos-metadata[1482]: Mar 07 01:10:13.085 INFO Fetch successful
Mar 7 01:10:13.127856 coreos-metadata[1482]: Mar 07 01:10:13.088 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/private-networks: Attempt #1
Mar 7 01:10:13.127856 coreos-metadata[1482]: Mar 07 01:10:13.094 INFO Fetch successful
Mar 7 01:10:13.069600 systemd[1]: Starting systemd-logind.service - User Login Management...
Mar 7 01:10:13.094509 dbus-daemon[1483]: [system] SELinux support is enabled
Mar 7 01:10:13.161730 kernel: EXT4-fs (sda9): resizing filesystem from 1617920 to 19393531 blocks
Mar 7 01:10:13.071325 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0).
Mar 7 01:10:13.162028 extend-filesystems[1487]: Resized partition /dev/sda9
Mar 7 01:10:13.071785 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details.
Mar 7 01:10:13.165369 extend-filesystems[1516]: resize2fs 1.47.1 (20-May-2024)
Mar 7 01:10:13.073353 systemd[1]: Starting update-engine.service - Update Engine...
Mar 7 01:10:13.079510 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition...
Mar 7 01:10:13.081646 systemd[1]: Finished lvm2-activation.service - Activation of LVM2 logical volumes.
Mar 7 01:10:13.177483 jq[1498]: true
Mar 7 01:10:13.098649 systemd[1]: Started dbus.service - D-Bus System Message Bus.
Mar 7 01:10:13.126960 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'.
Mar 7 01:10:13.128222 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped.
Mar 7 01:10:13.128600 systemd[1]: motdgen.service: Deactivated successfully.
Mar 7 01:10:13.128764 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd.
Mar 7 01:10:13.137944 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully.
Mar 7 01:10:13.138157 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline.
Mar 7 01:10:13.154230 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml).
Mar 7 01:10:13.154261 systemd[1]: Reached target system-config.target - Load system-provided cloud configs.
Mar 7 01:10:13.155215 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url).
Mar 7 01:10:13.155234 systemd[1]: Reached target user-config.target - Load user-provided cloud configs.
Mar 7 01:10:13.199419 tar[1513]: linux-amd64/LICENSE
Mar 7 01:10:13.199419 tar[1513]: linux-amd64/helm
Mar 7 01:10:13.199689 update_engine[1496]: I20260307 01:10:13.195830  1496 main.cc:92] Flatcar Update Engine starting
Mar 7 01:10:13.193450 (ntainerd)[1517]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR
Mar 7 01:10:13.201737 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 34 scanned by (udev-worker) (1333)
Mar 7 01:10:13.205129 systemd[1]: Started update-engine.service - Update Engine.
Mar 7 01:10:13.210808 update_engine[1496]: I20260307 01:10:13.210644  1496 update_check_scheduler.cc:74] Next update check in 2m48s
Mar 7 01:10:13.215873 jq[1515]: true
Mar 7 01:10:13.217552 systemd[1]: Started locksmithd.service - Cluster reboot manager.
Mar 7 01:10:13.315997 systemd-logind[1493]: New seat seat0.
Mar 7 01:10:13.322023 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent.
Mar 7 01:10:13.328165 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met.
Mar 7 01:10:13.341995 systemd-logind[1493]: Watching system buttons on /dev/input/event2 (Power Button)
Mar 7 01:10:13.344452 systemd-logind[1493]: Watching system buttons on /dev/input/event0 (AT Translated Set 2 keyboard)
Mar 7 01:10:13.344702 systemd[1]: Started systemd-logind.service - User Login Management.
Mar 7 01:10:13.406852 bash[1553]: Updated "/home/core/.ssh/authorized_keys"
Mar 7 01:10:13.408743 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition.
Mar 7 01:10:13.424875 systemd[1]: Starting sshkeys.service...
Mar 7 01:10:13.448919 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys.
Mar 7 01:10:13.459659 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)...
Mar 7 01:10:13.479804 sshd_keygen[1512]: ssh-keygen: generating new host keys: RSA ECDSA ED25519
Mar 7 01:10:13.494072 kernel: EXT4-fs (sda9): resized filesystem to 19393531
Mar 7 01:10:13.498745 coreos-metadata[1563]: Mar 07 01:10:13.498 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/public-keys: Attempt #1
Mar 7 01:10:13.515645 coreos-metadata[1563]: Mar 07 01:10:13.501 INFO Fetch successful
Mar 7 01:10:13.506428 locksmithd[1531]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot"
Mar 7 01:10:13.517298 unknown[1563]: wrote ssh authorized keys file for user: core
Mar 7 01:10:13.519024 extend-filesystems[1516]: Filesystem at /dev/sda9 is mounted on /; on-line resizing required
Mar 7 01:10:13.519024 extend-filesystems[1516]: old_desc_blocks = 1, new_desc_blocks = 10
Mar 7 01:10:13.519024 extend-filesystems[1516]: The filesystem on /dev/sda9 is now 19393531 (4k) blocks long.
Mar 7 01:10:13.523702 extend-filesystems[1487]: Resized filesystem in /dev/sda9
Mar 7 01:10:13.523702 extend-filesystems[1487]: Found sr0
Mar 7 01:10:13.519335 systemd[1]: extend-filesystems.service: Deactivated successfully.
Mar 7 01:10:13.519960 systemd[1]: Finished extend-filesystems.service - Extend Filesystems.
Mar 7 01:10:13.525147 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys.
Mar 7 01:10:13.536021 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd.
Mar 7 01:10:13.540435 containerd[1517]: time="2026-03-07T01:10:13.539702231Z" level=info msg="starting containerd" revision=174e0d1785eeda18dc2beba45e1d5a188771636b version=v1.7.21
Mar 7 01:10:13.544660 systemd[1]: Starting issuegen.service - Generate /run/issue...
Mar 7 01:10:13.548828 update-ssh-keys[1577]: Updated "/home/core/.ssh/authorized_keys"
Mar 7 01:10:13.548676 systemd[1]: Started sshd@0-89.167.115.210:22-4.153.228.146:59748.service - OpenSSH per-connection server daemon (4.153.228.146:59748).
Mar 7 01:10:13.552221 systemd[1]: Finished coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys).
Mar 7 01:10:13.554883 systemd[1]: issuegen.service: Deactivated successfully.
Mar 7 01:10:13.555057 systemd[1]: Finished issuegen.service - Generate /run/issue.
Mar 7 01:10:13.556767 systemd[1]: Finished sshkeys.service.
Mar 7 01:10:13.579669 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions...
Mar 7 01:10:13.584361 containerd[1517]: time="2026-03-07T01:10:13.584285288Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.aufs\"..." type=io.containerd.snapshotter.v1
Mar 7 01:10:13.587871 containerd[1517]: time="2026-03-07T01:10:13.587736281Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.aufs\"..." error="aufs is not supported (modprobe aufs failed: exit status 1 \"modprobe: FATAL: Module aufs not found in directory /lib/modules/6.6.127-flatcar\\n\"): skip plugin" type=io.containerd.snapshotter.v1
Mar 7 01:10:13.587871 containerd[1517]: time="2026-03-07T01:10:13.587760611Z" level=info msg="loading plugin \"io.containerd.event.v1.exchange\"..." type=io.containerd.event.v1
Mar 7 01:10:13.587871 containerd[1517]: time="2026-03-07T01:10:13.587773971Z" level=info msg="loading plugin \"io.containerd.internal.v1.opt\"..." type=io.containerd.internal.v1
Mar 7 01:10:13.587961 containerd[1517]: time="2026-03-07T01:10:13.587908081Z" level=info msg="loading plugin \"io.containerd.warning.v1.deprecations\"..." type=io.containerd.warning.v1
Mar 7 01:10:13.587961 containerd[1517]: time="2026-03-07T01:10:13.587919771Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.blockfile\"..." type=io.containerd.snapshotter.v1
Mar 7 01:10:13.587988 containerd[1517]: time="2026-03-07T01:10:13.587971261Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.blockfile\"..." error="no scratch file generator: skip plugin" type=io.containerd.snapshotter.v1
Mar 7 01:10:13.587988 containerd[1517]: time="2026-03-07T01:10:13.587979741Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.btrfs\"..." type=io.containerd.snapshotter.v1
Mar 7 01:10:13.588287 containerd[1517]: time="2026-03-07T01:10:13.588119551Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.btrfs\"..." error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" type=io.containerd.snapshotter.v1
Mar 7 01:10:13.588287 containerd[1517]: time="2026-03-07T01:10:13.588132671Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.devmapper\"..." type=io.containerd.snapshotter.v1
Mar 7 01:10:13.588287 containerd[1517]: time="2026-03-07T01:10:13.588142611Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.devmapper\"..." error="devmapper not configured: skip plugin" type=io.containerd.snapshotter.v1
Mar 7 01:10:13.588287 containerd[1517]: time="2026-03-07T01:10:13.588149751Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.native\"..." type=io.containerd.snapshotter.v1
Mar 7 01:10:13.588287 containerd[1517]: time="2026-03-07T01:10:13.588212601Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.overlayfs\"..." type=io.containerd.snapshotter.v1
Mar 7 01:10:13.588631 containerd[1517]: time="2026-03-07T01:10:13.588389892Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.zfs\"..." type=io.containerd.snapshotter.v1
Mar 7 01:10:13.589558 containerd[1517]: time="2026-03-07T01:10:13.589537672Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.zfs\"..." error="path /var/lib/containerd/io.containerd.snapshotter.v1.zfs must be a zfs filesystem to be used with the zfs snapshotter: skip plugin" type=io.containerd.snapshotter.v1
Mar 7 01:10:13.589558 containerd[1517]: time="2026-03-07T01:10:13.589556192Z" level=info msg="loading plugin \"io.containerd.content.v1.content\"..." type=io.containerd.content.v1
Mar 7 01:10:13.589654 containerd[1517]: time="2026-03-07T01:10:13.589639263Z" level=info msg="loading plugin \"io.containerd.metadata.v1.bolt\"..." type=io.containerd.metadata.v1
Mar 7 01:10:13.589705 containerd[1517]: time="2026-03-07T01:10:13.589691823Z" level=info msg="metadata content store policy set" policy=shared
Mar 7 01:10:13.595903 containerd[1517]: time="2026-03-07T01:10:13.595475097Z" level=info msg="loading plugin \"io.containerd.gc.v1.scheduler\"..." type=io.containerd.gc.v1
Mar 7 01:10:13.595903 containerd[1517]: time="2026-03-07T01:10:13.595523667Z" level=info msg="loading plugin \"io.containerd.differ.v1.walking\"..." type=io.containerd.differ.v1
Mar 7 01:10:13.595903 containerd[1517]: time="2026-03-07T01:10:13.595536547Z" level=info msg="loading plugin \"io.containerd.lease.v1.manager\"..." type=io.containerd.lease.v1
Mar 7 01:10:13.595903 containerd[1517]: time="2026-03-07T01:10:13.595548377Z" level=info msg="loading plugin \"io.containerd.streaming.v1.manager\"..." type=io.containerd.streaming.v1
Mar 7 01:10:13.595903 containerd[1517]: time="2026-03-07T01:10:13.595561687Z" level=info msg="loading plugin \"io.containerd.runtime.v1.linux\"..." type=io.containerd.runtime.v1
Mar 7 01:10:13.595903 containerd[1517]: time="2026-03-07T01:10:13.595698988Z" level=info msg="loading plugin \"io.containerd.monitor.v1.cgroups\"..." type=io.containerd.monitor.v1
Mar 7 01:10:13.595903 containerd[1517]: time="2026-03-07T01:10:13.595861908Z" level=info msg="loading plugin \"io.containerd.runtime.v2.task\"..." type=io.containerd.runtime.v2
Mar 7 01:10:13.596058 containerd[1517]: time="2026-03-07T01:10:13.595944278Z" level=info msg="loading plugin \"io.containerd.runtime.v2.shim\"..." type=io.containerd.runtime.v2
Mar 7 01:10:13.596058 containerd[1517]: time="2026-03-07T01:10:13.595955378Z" level=info msg="loading plugin \"io.containerd.sandbox.store.v1.local\"..." type=io.containerd.sandbox.store.v1
Mar 7 01:10:13.596058 containerd[1517]: time="2026-03-07T01:10:13.595965038Z" level=info msg="loading plugin \"io.containerd.sandbox.controller.v1.local\"..." type=io.containerd.sandbox.controller.v1
Mar 7 01:10:13.596058 containerd[1517]: time="2026-03-07T01:10:13.595974368Z" level=info msg="loading plugin \"io.containerd.service.v1.containers-service\"..." type=io.containerd.service.v1
Mar 7 01:10:13.596058 containerd[1517]: time="2026-03-07T01:10:13.595983428Z" level=info msg="loading plugin \"io.containerd.service.v1.content-service\"..." type=io.containerd.service.v1
Mar 7 01:10:13.596058 containerd[1517]: time="2026-03-07T01:10:13.595992188Z" level=info msg="loading plugin \"io.containerd.service.v1.diff-service\"..." type=io.containerd.service.v1
Mar 7 01:10:13.596058 containerd[1517]: time="2026-03-07T01:10:13.596001848Z" level=info msg="loading plugin \"io.containerd.service.v1.images-service\"..." type=io.containerd.service.v1
Mar 7 01:10:13.596058 containerd[1517]: time="2026-03-07T01:10:13.596012528Z" level=info msg="loading plugin \"io.containerd.service.v1.introspection-service\"..." type=io.containerd.service.v1
Mar 7 01:10:13.596058 containerd[1517]: time="2026-03-07T01:10:13.596021848Z" level=info msg="loading plugin \"io.containerd.service.v1.namespaces-service\"..." type=io.containerd.service.v1
Mar 7 01:10:13.596058 containerd[1517]: time="2026-03-07T01:10:13.596031068Z" level=info msg="loading plugin \"io.containerd.service.v1.snapshots-service\"..." type=io.containerd.service.v1
Mar 7 01:10:13.596058 containerd[1517]: time="2026-03-07T01:10:13.596039888Z" level=info msg="loading plugin \"io.containerd.service.v1.tasks-service\"..." type=io.containerd.service.v1
Mar 7 01:10:13.596058 containerd[1517]: time="2026-03-07T01:10:13.596054898Z" level=info msg="loading plugin \"io.containerd.grpc.v1.containers\"..." type=io.containerd.grpc.v1
Mar 7 01:10:13.596210 containerd[1517]: time="2026-03-07T01:10:13.596064528Z" level=info msg="loading plugin \"io.containerd.grpc.v1.content\"..." type=io.containerd.grpc.v1
Mar 7 01:10:13.596210 containerd[1517]: time="2026-03-07T01:10:13.596076108Z" level=info msg="loading plugin \"io.containerd.grpc.v1.diff\"..." type=io.containerd.grpc.v1
Mar 7 01:10:13.596210 containerd[1517]: time="2026-03-07T01:10:13.596085338Z" level=info msg="loading plugin \"io.containerd.grpc.v1.events\"..." type=io.containerd.grpc.v1
Mar 7 01:10:13.596210 containerd[1517]: time="2026-03-07T01:10:13.596093738Z" level=info msg="loading plugin \"io.containerd.grpc.v1.images\"..." type=io.containerd.grpc.v1
Mar 7 01:10:13.596210 containerd[1517]: time="2026-03-07T01:10:13.596103028Z" level=info msg="loading plugin \"io.containerd.grpc.v1.introspection\"..." type=io.containerd.grpc.v1
Mar 7 01:10:13.596210 containerd[1517]: time="2026-03-07T01:10:13.596111478Z" level=info msg="loading plugin \"io.containerd.grpc.v1.leases\"..." type=io.containerd.grpc.v1
Mar 7 01:10:13.596210 containerd[1517]: time="2026-03-07T01:10:13.596120568Z" level=info msg="loading plugin \"io.containerd.grpc.v1.namespaces\"..." type=io.containerd.grpc.v1
Mar 7 01:10:13.596210 containerd[1517]: time="2026-03-07T01:10:13.596129518Z" level=info msg="loading plugin \"io.containerd.grpc.v1.sandbox-controllers\"..." type=io.containerd.grpc.v1
Mar 7 01:10:13.596210 containerd[1517]: time="2026-03-07T01:10:13.596139838Z" level=info msg="loading plugin \"io.containerd.grpc.v1.sandboxes\"..." type=io.containerd.grpc.v1
Mar 7 01:10:13.596210 containerd[1517]: time="2026-03-07T01:10:13.596148158Z" level=info msg="loading plugin \"io.containerd.grpc.v1.snapshots\"..." type=io.containerd.grpc.v1
Mar 7 01:10:13.596210 containerd[1517]: time="2026-03-07T01:10:13.596156468Z" level=info msg="loading plugin \"io.containerd.grpc.v1.streaming\"..." type=io.containerd.grpc.v1
Mar 7 01:10:13.596210 containerd[1517]: time="2026-03-07T01:10:13.596169018Z" level=info msg="loading plugin \"io.containerd.grpc.v1.tasks\"..." type=io.containerd.grpc.v1
Mar 7 01:10:13.596210 containerd[1517]: time="2026-03-07T01:10:13.596180158Z" level=info msg="loading plugin \"io.containerd.transfer.v1.local\"..." type=io.containerd.transfer.v1
Mar 7 01:10:13.596210 containerd[1517]: time="2026-03-07T01:10:13.596195148Z" level=info msg="loading plugin \"io.containerd.grpc.v1.transfer\"..." type=io.containerd.grpc.v1
Mar 7 01:10:13.596210 containerd[1517]: time="2026-03-07T01:10:13.596203568Z" level=info msg="loading plugin \"io.containerd.grpc.v1.version\"..." type=io.containerd.grpc.v1
Mar 7 01:10:13.597085 containerd[1517]: time="2026-03-07T01:10:13.596210918Z" level=info msg="loading plugin \"io.containerd.internal.v1.restart\"..." type=io.containerd.internal.v1
Mar 7 01:10:13.597085 containerd[1517]: time="2026-03-07T01:10:13.596253178Z" level=info msg="loading plugin \"io.containerd.tracing.processor.v1.otlp\"..." type=io.containerd.tracing.processor.v1
Mar 7 01:10:13.597085 containerd[1517]: time="2026-03-07T01:10:13.596266278Z" level=info msg="skip loading plugin \"io.containerd.tracing.processor.v1.otlp\"..." error="skip plugin: tracing endpoint not configured" type=io.containerd.tracing.processor.v1
Mar 7 01:10:13.597085 containerd[1517]: time="2026-03-07T01:10:13.596274128Z" level=info msg="loading plugin \"io.containerd.internal.v1.tracing\"..." type=io.containerd.internal.v1
Mar 7 01:10:13.597085 containerd[1517]: time="2026-03-07T01:10:13.596281868Z" level=info msg="skip loading plugin \"io.containerd.internal.v1.tracing\"..." error="skip plugin: tracing endpoint not configured" type=io.containerd.internal.v1
Mar 7 01:10:13.597085 containerd[1517]: time="2026-03-07T01:10:13.596288608Z" level=info msg="loading plugin \"io.containerd.grpc.v1.healthcheck\"..." type=io.containerd.grpc.v1
Mar 7 01:10:13.597085 containerd[1517]: time="2026-03-07T01:10:13.596297028Z" level=info msg="loading plugin \"io.containerd.nri.v1.nri\"..." type=io.containerd.nri.v1
Mar 7 01:10:13.597085 containerd[1517]: time="2026-03-07T01:10:13.596309458Z" level=info msg="NRI interface is disabled by configuration."
Mar 7 01:10:13.597085 containerd[1517]: time="2026-03-07T01:10:13.596317538Z" level=info msg="loading plugin \"io.containerd.grpc.v1.cri\"..." type=io.containerd.grpc.v1
Mar 7 01:10:13.597211 containerd[1517]: time="2026-03-07T01:10:13.597098209Z" level=info msg="Start cri plugin with config {PluginConfig:{ContainerdConfig:{Snapshotter:overlayfs DefaultRuntimeName:runc DefaultRuntime:{Type: Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:} UntrustedWorkloadRuntime:{Type: Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:} Runtimes:map[runc:{Type:io.containerd.runc.v2 Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[SystemdCgroup:true] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:podsandbox}] NoPivot:false DisableSnapshotAnnotations:true DiscardUnpackedLayers:false IgnoreBlockIONotEnabledErrors:false IgnoreRdtNotEnabledErrors:false} CniConfig:{NetworkPluginBinDir:/opt/cni/bin NetworkPluginConfDir:/etc/cni/net.d NetworkPluginMaxConfNum:1 NetworkPluginSetupSerially:false NetworkPluginConfTemplate: IPPreference:} Registry:{ConfigPath: Mirrors:map[] Configs:map[] Auths:map[] Headers:map[]} ImageDecryption:{KeyModel:node} DisableTCPService:true StreamServerAddress:127.0.0.1 StreamServerPort:0 StreamIdleTimeout:4h0m0s EnableSelinux:true SelinuxCategoryRange:1024 SandboxImage:registry.k8s.io/pause:3.8 StatsCollectPeriod:10 SystemdCgroup:false EnableTLSStreaming:false X509KeyPairStreaming:{TLSCertFile: TLSKeyFile:} MaxContainerLogLineSize:16384 DisableCgroup:false DisableApparmor:false RestrictOOMScoreAdj:false MaxConcurrentDownloads:3 DisableProcMount:false UnsetSeccompProfile: TolerateMissingHugetlbController:true DisableHugetlbController:true DeviceOwnershipFromSecurityContext:false IgnoreImageDefinedVolumes:false NetNSMountsUnderStateDir:false EnableUnprivilegedPorts:false EnableUnprivilegedICMP:false EnableCDI:false CDISpecDirs:[/etc/cdi /var/run/cdi] ImagePullProgressTimeout:5m0s DrainExecSyncIOTimeout:0s ImagePullWithSyncFs:false IgnoreDeprecationWarnings:[]} ContainerdRootDir:/var/lib/containerd ContainerdEndpoint:/run/containerd/containerd.sock RootDir:/var/lib/containerd/io.containerd.grpc.v1.cri StateDir:/run/containerd/io.containerd.grpc.v1.cri}"
Mar 7 01:10:13.597211 containerd[1517]: time="2026-03-07T01:10:13.597143379Z" level=info msg="Connect containerd service"
Mar 7 01:10:13.597211 containerd[1517]: time="2026-03-07T01:10:13.597170379Z" level=info msg="using legacy CRI server"
Mar 7 01:10:13.597211 containerd[1517]: time="2026-03-07T01:10:13.597176029Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this"
Mar 7 01:10:13.597379 containerd[1517]: time="2026-03-07T01:10:13.597241839Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\""
Mar 7 01:10:13.599151 containerd[1517]: time="2026-03-07T01:10:13.598707000Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config"
Mar 7 01:10:13.599151 containerd[1517]: time="2026-03-07T01:10:13.598866930Z" level=info msg="Start subscribing containerd event"
Mar 7 01:10:13.599151 containerd[1517]: time="2026-03-07T01:10:13.598904420Z" level=info msg="Start recovering state"
Mar 7 01:10:13.599151 containerd[1517]: time="2026-03-07T01:10:13.598955580Z" level=info msg="Start event monitor"
Mar 7 01:10:13.599151 containerd[1517]: time="2026-03-07T01:10:13.598963340Z" level=info msg="Start snapshots syncer"
Mar 7 01:10:13.599151 containerd[1517]: time="2026-03-07T01:10:13.598970060Z" level=info msg="Start cni network conf syncer for default"
Mar 7 01:10:13.599151 containerd[1517]: time="2026-03-07T01:10:13.598976710Z" level=info msg="Start streaming server"
Mar 7 01:10:13.600629 containerd[1517]: time="2026-03-07T01:10:13.600488192Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc
Mar 7 01:10:13.600629 containerd[1517]: time="2026-03-07T01:10:13.600560812Z" level=info msg=serving... address=/run/containerd/containerd.sock
Mar 7 01:10:13.600749 systemd[1]: Started containerd.service - containerd container runtime.
Mar 7 01:10:13.604012 containerd[1517]: time="2026-03-07T01:10:13.603977474Z" level=info msg="containerd successfully booted in 0.070173s"
Mar 7 01:10:13.612383 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions.
Mar 7 01:10:13.624124 systemd[1]: Started getty@tty1.service - Getty on tty1.
Mar 7 01:10:13.633887 systemd[1]: Started serial-getty@ttyS0.service - Serial Getty on ttyS0.
Mar 7 01:10:13.635158 systemd[1]: Reached target getty.target - Login Prompts.
Mar 7 01:10:13.797331 tar[1513]: linux-amd64/README.md
Mar 7 01:10:13.808949 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin.
Mar 7 01:10:14.244900 systemd-networkd[1392]: eth0: Gained IPv6LL
Mar 7 01:10:14.246623 systemd-timesyncd[1417]: Network configuration changed, trying to establish connection.
Mar 7 01:10:14.251139 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured.
Mar 7 01:10:14.254866 systemd[1]: Reached target network-online.target - Network is Online.
Mar 7 01:10:14.264866 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 7 01:10:14.280677 systemd[1]: Starting nvidia.service - NVIDIA Configure Service...
Mar 7 01:10:14.312767 systemd-networkd[1392]: eth1: Gained IPv6LL
Mar 7 01:10:14.314370 systemd-timesyncd[1417]: Network configuration changed, trying to establish connection.
Mar 7 01:10:14.321100 sshd[1582]: Accepted publickey for core from 4.153.228.146 port 59748 ssh2: RSA SHA256:cfLbcynJBGQiJlcpT05nBKNU4f9DyADpOV1ay9ga6kI
Mar 7 01:10:14.323998 sshd[1582]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 7 01:10:14.327185 systemd[1]: Finished nvidia.service - NVIDIA Configure Service.
Mar 7 01:10:14.337604 systemd-logind[1493]: New session 1 of user core.
Mar 7 01:10:14.339963 systemd[1]: Created slice user-500.slice - User Slice of UID 500.
Mar 7 01:10:14.347793 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500...
Mar 7 01:10:14.363263 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500.
Mar 7 01:10:14.373209 systemd[1]: Starting user@500.service - User Manager for UID 500...
Mar 7 01:10:14.377881 (systemd)[1612]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0)
Mar 7 01:10:14.477754 systemd[1612]: Queued start job for default target default.target.
Mar 7 01:10:14.481891 systemd[1612]: Created slice app.slice - User Application Slice.
Mar 7 01:10:14.481914 systemd[1612]: Reached target paths.target - Paths.
Mar 7 01:10:14.481926 systemd[1612]: Reached target timers.target - Timers.
Mar 7 01:10:14.484335 systemd[1612]: Starting dbus.socket - D-Bus User Message Bus Socket...
Mar 7 01:10:14.494053 systemd[1612]: Listening on dbus.socket - D-Bus User Message Bus Socket.
Mar 7 01:10:14.494213 systemd[1612]: Reached target sockets.target - Sockets.
Mar 7 01:10:14.494306 systemd[1612]: Reached target basic.target - Basic System.
Mar 7 01:10:14.494351 systemd[1612]: Reached target default.target - Main User Target.
Mar 7 01:10:14.494384 systemd[1612]: Startup finished in 110ms.
Mar 7 01:10:14.494554 systemd[1]: Started user@500.service - User Manager for UID 500.
Mar 7 01:10:14.502545 systemd[1]: Started session-1.scope - Session 1 of User core.
Mar 7 01:10:15.034616 systemd[1]: Started sshd@1-89.167.115.210:22-4.153.228.146:59760.service - OpenSSH per-connection server daemon (4.153.228.146:59760).
Mar 7 01:10:15.252090 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 7 01:10:15.255110 systemd[1]: Reached target multi-user.target - Multi-User System.
Mar 7 01:10:15.259179 systemd[1]: Startup finished in 1.595s (kernel) + 5.730s (initrd) + 4.824s (userspace) = 12.149s.
Mar 7 01:10:15.266913 (kubelet)[1630]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Mar 7 01:10:15.763051 sshd[1623]: Accepted publickey for core from 4.153.228.146 port 59760 ssh2: RSA SHA256:cfLbcynJBGQiJlcpT05nBKNU4f9DyADpOV1ay9ga6kI
Mar 7 01:10:15.765615 sshd[1623]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 7 01:10:15.774581 systemd-logind[1493]: New session 2 of user core.
Mar 7 01:10:15.778653 systemd[1]: Started session-2.scope - Session 2 of User core.
Mar 7 01:10:15.948583 kubelet[1630]: E0307 01:10:15.948486    1630 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Mar 7 01:10:15.954717 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Mar 7 01:10:15.955041 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Mar 7 01:10:15.955629 systemd[1]: kubelet.service: Consumed 1.001s CPU time.
Mar 7 01:10:16.288767 sshd[1623]: pam_unix(sshd:session): session closed for user core
Mar 7 01:10:16.294815 systemd-logind[1493]: Session 2 logged out. Waiting for processes to exit.
Mar 7 01:10:16.296252 systemd[1]: sshd@1-89.167.115.210:22-4.153.228.146:59760.service: Deactivated successfully.
Mar 7 01:10:16.302127 systemd[1]: session-2.scope: Deactivated successfully.
Mar 7 01:10:16.305792 systemd-logind[1493]: Removed session 2.
Mar 7 01:10:16.422816 systemd[1]: Started sshd@2-89.167.115.210:22-4.153.228.146:59776.service - OpenSSH per-connection server daemon (4.153.228.146:59776).
Mar 7 01:10:17.183052 sshd[1646]: Accepted publickey for core from 4.153.228.146 port 59776 ssh2: RSA SHA256:cfLbcynJBGQiJlcpT05nBKNU4f9DyADpOV1ay9ga6kI
Mar 7 01:10:17.184358 sshd[1646]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 7 01:10:17.193214 systemd-logind[1493]: New session 3 of user core.
Mar 7 01:10:17.203736 systemd[1]: Started session-3.scope - Session 3 of User core.
Mar 7 01:10:17.702591 sshd[1646]: pam_unix(sshd:session): session closed for user core
Mar 7 01:10:17.707487 systemd[1]: sshd@2-89.167.115.210:22-4.153.228.146:59776.service: Deactivated successfully.
Mar 7 01:10:17.711215 systemd[1]: session-3.scope: Deactivated successfully.
Mar 7 01:10:17.713393 systemd-logind[1493]: Session 3 logged out. Waiting for processes to exit.
Mar 7 01:10:17.715561 systemd-logind[1493]: Removed session 3.
Mar 7 01:10:17.835530 systemd[1]: Started sshd@3-89.167.115.210:22-4.153.228.146:59784.service - OpenSSH per-connection server daemon (4.153.228.146:59784).
Mar 7 01:10:18.610471 sshd[1653]: Accepted publickey for core from 4.153.228.146 port 59784 ssh2: RSA SHA256:cfLbcynJBGQiJlcpT05nBKNU4f9DyADpOV1ay9ga6kI
Mar 7 01:10:18.613322 sshd[1653]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 7 01:10:18.622491 systemd-logind[1493]: New session 4 of user core.
Mar 7 01:10:18.628653 systemd[1]: Started session-4.scope - Session 4 of User core.
Mar 7 01:10:19.140691 sshd[1653]: pam_unix(sshd:session): session closed for user core
Mar 7 01:10:19.145651 systemd[1]: sshd@3-89.167.115.210:22-4.153.228.146:59784.service: Deactivated successfully.
Mar 7 01:10:19.149314 systemd[1]: session-4.scope: Deactivated successfully.
Mar 7 01:10:19.151654 systemd-logind[1493]: Session 4 logged out. Waiting for processes to exit.
Mar 7 01:10:19.153977 systemd-logind[1493]: Removed session 4.
Mar 7 01:10:19.281864 systemd[1]: Started sshd@4-89.167.115.210:22-4.153.228.146:51866.service - OpenSSH per-connection server daemon (4.153.228.146:51866).
Mar 7 01:10:20.034304 sshd[1660]: Accepted publickey for core from 4.153.228.146 port 51866 ssh2: RSA SHA256:cfLbcynJBGQiJlcpT05nBKNU4f9DyADpOV1ay9ga6kI
Mar 7 01:10:20.036994 sshd[1660]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 7 01:10:20.045111 systemd-logind[1493]: New session 5 of user core.
Mar 7 01:10:20.052684 systemd[1]: Started session-5.scope - Session 5 of User core.
Mar 7 01:10:20.456092 sudo[1663]:     core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1
Mar 7 01:10:20.456836 sudo[1663]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Mar 7 01:10:20.477131 sudo[1663]: pam_unix(sudo:session): session closed for user root
Mar 7 01:10:20.598678 sshd[1660]: pam_unix(sshd:session): session closed for user core
Mar 7 01:10:20.605620 systemd[1]: sshd@4-89.167.115.210:22-4.153.228.146:51866.service: Deactivated successfully.
Mar 7 01:10:20.608640 systemd[1]: session-5.scope: Deactivated successfully.
Mar 7 01:10:20.609703 systemd-logind[1493]: Session 5 logged out. Waiting for processes to exit.
Mar 7 01:10:20.611677 systemd-logind[1493]: Removed session 5.
Mar 7 01:10:20.738197 systemd[1]: Started sshd@5-89.167.115.210:22-4.153.228.146:51874.service - OpenSSH per-connection server daemon (4.153.228.146:51874).
Mar 7 01:10:21.499530 sshd[1668]: Accepted publickey for core from 4.153.228.146 port 51874 ssh2: RSA SHA256:cfLbcynJBGQiJlcpT05nBKNU4f9DyADpOV1ay9ga6kI
Mar 7 01:10:21.502340 sshd[1668]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 7 01:10:21.510025 systemd-logind[1493]: New session 6 of user core.
Mar 7 01:10:21.521627 systemd[1]: Started session-6.scope - Session 6 of User core.
Mar 7 01:10:21.912586 sudo[1672]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/bin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules
Mar 7 01:10:21.913906 sudo[1672]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Mar 7 01:10:21.920779 sudo[1672]: pam_unix(sudo:session): session closed for user root
Mar 7 01:10:21.934065 sudo[1671]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/bin/systemctl restart audit-rules
Mar 7 01:10:21.934931 sudo[1671]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Mar 7 01:10:21.962825 systemd[1]: Stopping audit-rules.service - Load Security Auditing Rules...
Mar 7 01:10:21.971311 auditctl[1675]: No rules
Mar 7 01:10:21.973043 systemd[1]: audit-rules.service: Deactivated successfully.
Mar 7 01:10:21.973517 systemd[1]: Stopped audit-rules.service - Load Security Auditing Rules.
Mar 7 01:10:21.981452 systemd[1]: Starting audit-rules.service - Load Security Auditing Rules...
Mar 7 01:10:22.045492 augenrules[1693]: No rules
Mar 7 01:10:22.048074 systemd[1]: Finished audit-rules.service - Load Security Auditing Rules.
Mar 7 01:10:22.050801 sudo[1671]: pam_unix(sudo:session): session closed for user root
Mar 7 01:10:22.172565 sshd[1668]: pam_unix(sshd:session): session closed for user core
Mar 7 01:10:22.180786 systemd[1]: sshd@5-89.167.115.210:22-4.153.228.146:51874.service: Deactivated successfully.
Mar 7 01:10:22.184596 systemd[1]: session-6.scope: Deactivated successfully.
Mar 7 01:10:22.185906 systemd-logind[1493]: Session 6 logged out. Waiting for processes to exit.
Mar 7 01:10:22.187794 systemd-logind[1493]: Removed session 6.
Mar 7 01:10:22.311153 systemd[1]: Started sshd@6-89.167.115.210:22-4.153.228.146:51878.service - OpenSSH per-connection server daemon (4.153.228.146:51878).
Mar 7 01:10:23.066874 sshd[1701]: Accepted publickey for core from 4.153.228.146 port 51878 ssh2: RSA SHA256:cfLbcynJBGQiJlcpT05nBKNU4f9DyADpOV1ay9ga6kI
Mar 7 01:10:23.068283 sshd[1701]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 7 01:10:23.076460 systemd-logind[1493]: New session 7 of user core.
Mar 7 01:10:23.092708 systemd[1]: Started session-7.scope - Session 7 of User core.
Mar 7 01:10:23.473686 sudo[1704]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh
Mar 7 01:10:23.474387 sudo[1704]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Mar 7 01:10:23.815750 systemd[1]: Starting docker.service - Docker Application Container Engine...
Mar 7 01:10:23.816161 (dockerd)[1719]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU
Mar 7 01:10:24.092485 dockerd[1719]: time="2026-03-07T01:10:24.092069872Z" level=info msg="Starting up"
Mar 7 01:10:24.148536 systemd[1]: var-lib-docker-check\x2doverlayfs\x2dsupport936734406-merged.mount: Deactivated successfully.
Mar 7 01:10:24.171680 dockerd[1719]: time="2026-03-07T01:10:24.171649588Z" level=info msg="Loading containers: start."
Mar 7 01:10:24.263430 kernel: Initializing XFRM netlink socket
Mar 7 01:10:24.284584 systemd-timesyncd[1417]: Network configuration changed, trying to establish connection.
Mar 7 01:10:24.335574 systemd-networkd[1392]: docker0: Link UP
Mar 7 01:10:24.347684 dockerd[1719]: time="2026-03-07T01:10:24.347596125Z" level=info msg="Loading containers: done."
Mar 7 01:10:24.359717 dockerd[1719]: time="2026-03-07T01:10:24.359683095Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2
Mar 7 01:10:24.359902 dockerd[1719]: time="2026-03-07T01:10:24.359765955Z" level=info msg="Docker daemon" commit=061aa95809be396a6b5542618d8a34b02a21ff77 containerd-snapshotter=false storage-driver=overlay2 version=26.1.0
Mar 7 01:10:24.359902 dockerd[1719]: time="2026-03-07T01:10:24.359863385Z" level=info msg="Daemon has completed initialization"
Mar 7 01:10:24.387092 dockerd[1719]: time="2026-03-07T01:10:24.387052928Z" level=info msg="API listen on /run/docker.sock"
Mar 7 01:10:24.387298 systemd[1]: Started docker.service - Docker Application Container Engine.
Mar 7 01:10:24.469551 systemd-timesyncd[1417]: Contacted time server 144.76.76.107:123 (2.flatcar.pool.ntp.org).
Mar 7 01:10:24.469621 systemd-timesyncd[1417]: Initial clock synchronization to Sat 2026-03-07 01:10:24.291600 UTC.
Mar 7 01:10:24.829171 containerd[1517]: time="2026-03-07T01:10:24.828254915Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.33.9\""
Mar 7 01:10:25.147092 systemd[1]: var-lib-docker-overlay2-opaque\x2dbug\x2dcheck214120899-merged.mount: Deactivated successfully.
Mar 7 01:10:25.448492 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1379687484.mount: Deactivated successfully.
Mar 7 01:10:26.205444 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1.
Mar 7 01:10:26.216244 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 7 01:10:26.367530 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 7 01:10:26.371469 (kubelet)[1925]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Mar 7 01:10:26.411169 kubelet[1925]: E0307 01:10:26.411134 1925 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Mar 7 01:10:26.416198 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Mar 7 01:10:26.416477 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Mar 7 01:10:26.716066 containerd[1517]: time="2026-03-07T01:10:26.716012167Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.33.9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 7 01:10:26.717013 containerd[1517]: time="2026-03-07T01:10:26.716832663Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.33.9: active requests=0, bytes read=30116286"
Mar 7 01:10:26.718545 containerd[1517]: time="2026-03-07T01:10:26.717780456Z" level=info msg="ImageCreate event name:\"sha256:d3c49e1d7c1cb22893888d0d7a4142c80e16023143fdd2c0225a362ec08ab4a4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 7 01:10:26.720989 containerd[1517]: time="2026-03-07T01:10:26.720948372Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:a1fe354f8b36dbce37fef26c3731e2376fb8eb7375e7df3068df7ad11656f022\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 7 01:10:26.721890 containerd[1517]: time="2026-03-07T01:10:26.721436054Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.33.9\" with image id \"sha256:d3c49e1d7c1cb22893888d0d7a4142c80e16023143fdd2c0225a362ec08ab4a4\", repo tag \"registry.k8s.io/kube-apiserver:v1.33.9\", repo digest \"registry.k8s.io/kube-apiserver@sha256:a1fe354f8b36dbce37fef26c3731e2376fb8eb7375e7df3068df7ad11656f022\", size \"30112785\" in 1.893133974s"
Mar 7 01:10:26.721890 containerd[1517]: time="2026-03-07T01:10:26.721481170Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.33.9\" returns image reference \"sha256:d3c49e1d7c1cb22893888d0d7a4142c80e16023143fdd2c0225a362ec08ab4a4\""
Mar 7 01:10:26.722104 containerd[1517]: time="2026-03-07T01:10:26.722075626Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.33.9\""
Mar 7 01:10:28.061114 containerd[1517]: time="2026-03-07T01:10:28.061058801Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.33.9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 7 01:10:28.062146 containerd[1517]: time="2026-03-07T01:10:28.061984634Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.33.9: active requests=0, bytes read=26021832"
Mar 7 01:10:28.064085 containerd[1517]: time="2026-03-07T01:10:28.062860760Z" level=info msg="ImageCreate event name:\"sha256:bdbe897c17b593b8163eebd3c55c6723711b8b775bf7e554da6d75d33d114e98\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 7 01:10:28.065714 containerd[1517]: time="2026-03-07T01:10:28.064695387Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:a495c9f30cfd4d57ae6c27cb21e477b9b1ddebdace61762e80a06fe264a0d61a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 7 01:10:28.065714 containerd[1517]: time="2026-03-07T01:10:28.065460568Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.33.9\" with image id \"sha256:bdbe897c17b593b8163eebd3c55c6723711b8b775bf7e554da6d75d33d114e98\", repo tag \"registry.k8s.io/kube-controller-manager:v1.33.9\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:a495c9f30cfd4d57ae6c27cb21e477b9b1ddebdace61762e80a06fe264a0d61a\", size \"27678758\" in 1.343359876s"
Mar 7 01:10:28.065714 containerd[1517]: time="2026-03-07T01:10:28.065490377Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.33.9\" returns image reference \"sha256:bdbe897c17b593b8163eebd3c55c6723711b8b775bf7e554da6d75d33d114e98\""
Mar 7 01:10:28.066190 containerd[1517]: time="2026-03-07T01:10:28.066167223Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.33.9\""
Mar 7 01:10:29.300466 containerd[1517]: time="2026-03-07T01:10:29.300415504Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.33.9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 7 01:10:29.301610 containerd[1517]: time="2026-03-07T01:10:29.301421019Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.33.9: active requests=0, bytes read=20162768"
Mar 7 01:10:29.302448 containerd[1517]: time="2026-03-07T01:10:29.302425981Z" level=info msg="ImageCreate event name:\"sha256:04e9a75bd404b7d5d286565ebcd5e8d5a2be3355e6cb0c3f1ab9db53fe6f180a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 7 01:10:29.305726 containerd[1517]: time="2026-03-07T01:10:29.304698027Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:d1533368d3acd772e3d11225337a61be319b5ecf7523adeff7ebfe4107ab05b5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 7 01:10:29.305726 containerd[1517]: time="2026-03-07T01:10:29.305454616Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.33.9\" with image id \"sha256:04e9a75bd404b7d5d286565ebcd5e8d5a2be3355e6cb0c3f1ab9db53fe6f180a\", repo tag \"registry.k8s.io/kube-scheduler:v1.33.9\", repo digest \"registry.k8s.io/kube-scheduler@sha256:d1533368d3acd772e3d11225337a61be319b5ecf7523adeff7ebfe4107ab05b5\", size \"21819712\" in 1.239263235s"
Mar 7 01:10:29.305726 containerd[1517]: time="2026-03-07T01:10:29.305485498Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.33.9\" returns image reference \"sha256:04e9a75bd404b7d5d286565ebcd5e8d5a2be3355e6cb0c3f1ab9db53fe6f180a\""
Mar 7 01:10:29.305949 containerd[1517]: time="2026-03-07T01:10:29.305924802Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.33.9\""
Mar 7 01:10:30.312075 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount622469162.mount: Deactivated successfully.
Mar 7 01:10:30.646673 containerd[1517]: time="2026-03-07T01:10:30.646313083Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.33.9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 7 01:10:30.647550 containerd[1517]: time="2026-03-07T01:10:30.647357803Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.33.9: active requests=0, bytes read=31828675"
Mar 7 01:10:30.649297 containerd[1517]: time="2026-03-07T01:10:30.648556143Z" level=info msg="ImageCreate event name:\"sha256:36d290108190a8d792e275b3e6ba8f1c0def0fd717573d69c3970816d945510a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 7 01:10:30.650478 containerd[1517]: time="2026-03-07T01:10:30.650450116Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:079ba0e77e457dbf755e78bf3a6d736b7eb73d021fe53b853a0b82bbb2c17322\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 7 01:10:30.650998 containerd[1517]: time="2026-03-07T01:10:30.650896314Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.33.9\" with image id \"sha256:36d290108190a8d792e275b3e6ba8f1c0def0fd717573d69c3970816d945510a\", repo tag \"registry.k8s.io/kube-proxy:v1.33.9\", repo digest \"registry.k8s.io/kube-proxy@sha256:079ba0e77e457dbf755e78bf3a6d736b7eb73d021fe53b853a0b82bbb2c17322\", size \"31827666\" in 1.344750428s"
Mar 7 01:10:30.650998 containerd[1517]: time="2026-03-07T01:10:30.650926990Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.33.9\" returns image reference \"sha256:36d290108190a8d792e275b3e6ba8f1c0def0fd717573d69c3970816d945510a\""
Mar 7 01:10:30.651393 containerd[1517]: time="2026-03-07T01:10:30.651321158Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.0\""
Mar 7 01:10:31.150889 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount455138267.mount: Deactivated successfully.
Mar 7 01:10:31.906224 containerd[1517]: time="2026-03-07T01:10:31.906170604Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.12.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 7 01:10:31.907261 containerd[1517]: time="2026-03-07T01:10:31.907226610Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.12.0: active requests=0, bytes read=20942332"
Mar 7 01:10:31.908084 containerd[1517]: time="2026-03-07T01:10:31.908049069Z" level=info msg="ImageCreate event name:\"sha256:1cf5f116067c67da67f97bff78c4bbc76913f59057c18627b96facaced73ea0b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 7 01:10:31.910190 containerd[1517]: time="2026-03-07T01:10:31.910157696Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:40384aa1f5ea6bfdc77997d243aec73da05f27aed0c5e9d65bfa98933c519d97\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 7 01:10:31.911196 containerd[1517]: time="2026-03-07T01:10:31.911169350Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.12.0\" with image id \"sha256:1cf5f116067c67da67f97bff78c4bbc76913f59057c18627b96facaced73ea0b\", repo tag \"registry.k8s.io/coredns/coredns:v1.12.0\", repo digest \"registry.k8s.io/coredns/coredns@sha256:40384aa1f5ea6bfdc77997d243aec73da05f27aed0c5e9d65bfa98933c519d97\", size \"20939036\" in 1.259652895s"
Mar 7 01:10:31.911196 containerd[1517]: time="2026-03-07T01:10:31.911193011Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.0\" returns image reference \"sha256:1cf5f116067c67da67f97bff78c4bbc76913f59057c18627b96facaced73ea0b\""
Mar 7 01:10:31.911916 containerd[1517]: time="2026-03-07T01:10:31.911876727Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\""
Mar 7 01:10:32.395971 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2877402035.mount: Deactivated successfully.
Mar 7 01:10:32.406435 containerd[1517]: time="2026-03-07T01:10:32.406356748Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 7 01:10:32.407798 containerd[1517]: time="2026-03-07T01:10:32.407715731Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=321160"
Mar 7 01:10:32.409121 containerd[1517]: time="2026-03-07T01:10:32.408982286Z" level=info msg="ImageCreate event name:\"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 7 01:10:32.413314 containerd[1517]: time="2026-03-07T01:10:32.412775210Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 7 01:10:32.413314 containerd[1517]: time="2026-03-07T01:10:32.413198059Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 501.280158ms"
Mar 7 01:10:32.413314 containerd[1517]: time="2026-03-07T01:10:32.413218023Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\""
Mar 7 01:10:32.414525 containerd[1517]: time="2026-03-07T01:10:32.414485708Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.24-0\""
Mar 7 01:10:32.976025 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4165748505.mount: Deactivated successfully.
Mar 7 01:10:33.750870 containerd[1517]: time="2026-03-07T01:10:33.750827712Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.24-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 7 01:10:33.751855 containerd[1517]: time="2026-03-07T01:10:33.751735142Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.24-0: active requests=0, bytes read=23718940"
Mar 7 01:10:33.753798 containerd[1517]: time="2026-03-07T01:10:33.752617982Z" level=info msg="ImageCreate event name:\"sha256:8cb12dd0c3e42c6d0175d09a060358cbb68a3ecc2ba4dbb00327c7d760e1425d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 7 01:10:33.754574 containerd[1517]: time="2026-03-07T01:10:33.754558806Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:251e7e490f64859d329cd963bc879dc04acf3d7195bb52c4c50b4a07bedf37d6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 7 01:10:33.755275 containerd[1517]: time="2026-03-07T01:10:33.755257155Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.24-0\" with image id \"sha256:8cb12dd0c3e42c6d0175d09a060358cbb68a3ecc2ba4dbb00327c7d760e1425d\", repo tag \"registry.k8s.io/etcd:3.5.24-0\", repo digest \"registry.k8s.io/etcd@sha256:251e7e490f64859d329cd963bc879dc04acf3d7195bb52c4c50b4a07bedf37d6\", size \"23716032\" in 1.340729995s"
Mar 7 01:10:33.755326 containerd[1517]: time="2026-03-07T01:10:33.755315873Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.24-0\" returns image reference \"sha256:8cb12dd0c3e42c6d0175d09a060358cbb68a3ecc2ba4dbb00327c7d760e1425d\""
Mar 7 01:10:36.179175 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 7 01:10:36.191628 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 7 01:10:36.218556 systemd[1]: Reloading requested from client PID 2096 ('systemctl') (unit session-7.scope)...
Mar 7 01:10:36.218569 systemd[1]: Reloading...
Mar 7 01:10:36.343446 zram_generator::config[2151]: No configuration found.
Mar 7 01:10:36.409641 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Mar 7 01:10:36.470301 systemd[1]: Reloading finished in 251 ms.
Mar 7 01:10:36.521491 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 7 01:10:36.529685 (kubelet)[2181]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS
Mar 7 01:10:36.533702 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 7 01:10:36.536138 systemd[1]: kubelet.service: Deactivated successfully.
Mar 7 01:10:36.536806 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 7 01:10:36.544815 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 7 01:10:36.655250 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 7 01:10:36.660384 (kubelet)[2197]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS
Mar 7 01:10:36.696978 kubelet[2197]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 7 01:10:36.696978 kubelet[2197]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Mar 7 01:10:36.696978 kubelet[2197]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 7 01:10:36.697977 kubelet[2197]: I0307 01:10:36.697923 2197 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Mar 7 01:10:37.018529 kubelet[2197]: I0307 01:10:37.018486 2197 server.go:530] "Kubelet version" kubeletVersion="v1.33.8"
Mar 7 01:10:37.018529 kubelet[2197]: I0307 01:10:37.018506 2197 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Mar 7 01:10:37.018732 kubelet[2197]: I0307 01:10:37.018656 2197 server.go:956] "Client rotation is on, will bootstrap in background"
Mar 7 01:10:37.038424 kubelet[2197]: E0307 01:10:37.038339 2197 certificate_manager.go:596] "Failed while requesting a signed certificate from the control plane" err="cannot create certificate signing request: Post \"https://89.167.115.210:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 89.167.115.210:6443: connect: connection refused" logger="kubernetes.io/kube-apiserver-client-kubelet.UnhandledError"
Mar 7 01:10:37.042924 kubelet[2197]: I0307 01:10:37.042699 2197 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
Mar 7 01:10:37.052254 kubelet[2197]: E0307 01:10:37.052213 2197 log.go:32] "RuntimeConfig from runtime service failed" err="rpc error: code = Unimplemented desc = unknown method RuntimeConfig for service runtime.v1.RuntimeService"
Mar 7 01:10:37.052448 kubelet[2197]: I0307 01:10:37.052378 2197 server.go:1423] "CRI implementation should be updated to support RuntimeConfig when KubeletCgroupDriverFromCRI feature gate has been enabled. Falling back to using cgroupDriver from kubelet config."
Mar 7 01:10:37.059956 kubelet[2197]: I0307 01:10:37.059924 2197 server.go:782] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /"
Mar 7 01:10:37.060585 kubelet[2197]: I0307 01:10:37.060541 2197 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Mar 7 01:10:37.060818 kubelet[2197]: I0307 01:10:37.060587 2197 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4081-3-6-n-e40d23dcbc","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Mar 7 01:10:37.060881 kubelet[2197]: I0307 01:10:37.060828 2197 topology_manager.go:138] "Creating topology manager with none policy"
Mar 7 01:10:37.060881 kubelet[2197]: I0307 01:10:37.060843 2197 container_manager_linux.go:303] "Creating device plugin manager"
Mar 7 01:10:37.061055 kubelet[2197]: I0307 01:10:37.061037 2197 state_mem.go:36] "Initialized new in-memory state store"
Mar 7 01:10:37.068608 kubelet[2197]: I0307 01:10:37.068582 2197 kubelet.go:480] "Attempting to sync node with API server"
Mar 7 01:10:37.068653 kubelet[2197]: I0307 01:10:37.068618 2197 kubelet.go:375] "Adding static pod path" path="/etc/kubernetes/manifests"
Mar 7 01:10:37.068678 kubelet[2197]: I0307 01:10:37.068660 2197 kubelet.go:386] "Adding apiserver pod source"
Mar 7 01:10:37.071450 kubelet[2197]: I0307 01:10:37.071325 2197 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Mar 7 01:10:37.077358 kubelet[2197]: I0307 01:10:37.077321 2197 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="containerd" version="v1.7.21" apiVersion="v1"
Mar 7 01:10:37.078255 kubelet[2197]: I0307 01:10:37.078227 2197 kubelet.go:935] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Mar 7 01:10:37.079334 kubelet[2197]: W0307 01:10:37.079228 2197 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating.
Mar 7 01:10:37.080844 kubelet[2197]: E0307 01:10:37.080541 2197 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: Get \"https://89.167.115.210:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 89.167.115.210:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Mar 7 01:10:37.080844 kubelet[2197]: E0307 01:10:37.080601 2197 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: Get \"https://89.167.115.210:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4081-3-6-n-e40d23dcbc&limit=500&resourceVersion=0\": dial tcp 89.167.115.210:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Mar 7 01:10:37.085378 kubelet[2197]: I0307 01:10:37.085354 2197 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Mar 7 01:10:37.085687 kubelet[2197]: I0307 01:10:37.085447 2197 server.go:1289] "Started kubelet"
Mar 7 01:10:37.087279 kubelet[2197]: I0307 01:10:37.087252 2197 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Mar 7 01:10:37.090444 kubelet[2197]: E0307 01:10:37.089418 2197 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://89.167.115.210:6443/api/v1/namespaces/default/events\": dial tcp 89.167.115.210:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4081-3-6-n-e40d23dcbc.189a69e559010808 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4081-3-6-n-e40d23dcbc,UID:ci-4081-3-6-n-e40d23dcbc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4081-3-6-n-e40d23dcbc,},FirstTimestamp:2026-03-07 01:10:37.085378568 +0000 UTC m=+0.421688964,LastTimestamp:2026-03-07 01:10:37.085378568 +0000 UTC m=+0.421688964,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4081-3-6-n-e40d23dcbc,}"
Mar 7 01:10:37.091148 kubelet[2197]: I0307 01:10:37.091126 2197 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Mar 7 01:10:37.091812 kubelet[2197]: I0307 01:10:37.091801 2197 server.go:317] "Adding debug handlers to kubelet server"
Mar 7 01:10:37.092361 kubelet[2197]: I0307 01:10:37.092316 2197 volume_manager.go:297] "Starting Kubelet Volume Manager"
Mar 7 01:10:37.093440 kubelet[2197]: E0307 01:10:37.093084 2197 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4081-3-6-n-e40d23dcbc\" not found"
Mar 7 01:10:37.094267 kubelet[2197]: I0307 01:10:37.094238 2197 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Mar 7 01:10:37.094483 kubelet[2197]: I0307 01:10:37.094463 2197 reconciler.go:26] "Reconciler: start to sync state"
Mar 7 01:10:37.098799 kubelet[2197]: E0307 01:10:37.098768 2197 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: Get \"https://89.167.115.210:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 89.167.115.210:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Mar 7 01:10:37.099433 kubelet[2197]: E0307 01:10:37.098862 2197 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://89.167.115.210:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4081-3-6-n-e40d23dcbc?timeout=10s\": dial tcp 89.167.115.210:6443: connect: connection refused" interval="200ms"
Mar 7 01:10:37.099433 kubelet[2197]: I0307 01:10:37.098899 2197 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key"
Mar 7 01:10:37.099433 kubelet[2197]: I0307 01:10:37.099004 2197 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Mar 7 01:10:37.099433 kubelet[2197]: I0307 01:10:37.099254 2197 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Mar 7 01:10:37.104027 kubelet[2197]: E0307 01:10:37.104005 2197 kubelet.go:1600] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem"
Mar 7 01:10:37.104114 kubelet[2197]: I0307 01:10:37.104103 2197 factory.go:223] Registration of the systemd container factory successfully
Mar 7 01:10:37.104207 kubelet[2197]: I0307 01:10:37.104196 2197 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory
Mar 7 01:10:37.107825 kubelet[2197]: I0307 01:10:37.107814 2197 factory.go:223] Registration of the containerd container factory successfully
Mar 7 01:10:37.117440 kubelet[2197]: I0307 01:10:37.117378 2197 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Mar 7 01:10:37.122583 kubelet[2197]: I0307 01:10:37.122546 2197 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Mar 7 01:10:37.122583 kubelet[2197]: I0307 01:10:37.122581 2197 status_manager.go:230] "Starting to sync pod status with apiserver"
Mar 7 01:10:37.122641 kubelet[2197]: I0307 01:10:37.122595 2197 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Mar 7 01:10:37.122641 kubelet[2197]: I0307 01:10:37.122600 2197 kubelet.go:2436] "Starting kubelet main sync loop"
Mar 7 01:10:37.123973 kubelet[2197]: E0307 01:10:37.122629 2197 kubelet.go:2460] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Mar 7 01:10:37.123973 kubelet[2197]: E0307 01:10:37.123818 2197 reflector.go:200] "Failed to watch" err="failed to list *v1.RuntimeClass: Get \"https://89.167.115.210:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 89.167.115.210:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass"
Mar 7 01:10:37.124537 kubelet[2197]: I0307 01:10:37.124527 2197 cpu_manager.go:221] "Starting CPU manager" policy="none"
Mar 7 01:10:37.124641 kubelet[2197]: I0307 01:10:37.124633 2197 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s"
Mar 7 01:10:37.124709 kubelet[2197]: I0307 01:10:37.124703 2197 state_mem.go:36] "Initialized new in-memory state store"
Mar 7 01:10:37.126967 kubelet[2197]: I0307 01:10:37.126957 2197 policy_none.go:49] "None policy: Start"
Mar 7 01:10:37.127032 kubelet[2197]: I0307 01:10:37.127024 2197 memory_manager.go:186] "Starting memorymanager" policy="None"
Mar 7 01:10:37.127076 kubelet[2197]: I0307 01:10:37.127069 2197 state_mem.go:35] "Initializing new in-memory state store"
Mar 7 01:10:37.131792 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice.
Mar 7 01:10:37.141627 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice.
Mar 7 01:10:37.153750 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice.
Mar 7 01:10:37.154720 kubelet[2197]: E0307 01:10:37.154707 2197 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Mar 7 01:10:37.155193 kubelet[2197]: I0307 01:10:37.155183 2197 eviction_manager.go:189] "Eviction manager: starting control loop" Mar 7 01:10:37.155255 kubelet[2197]: I0307 01:10:37.155237 2197 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 7 01:10:37.155485 kubelet[2197]: I0307 01:10:37.155475 2197 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Mar 7 01:10:37.156155 kubelet[2197]: E0307 01:10:37.156144 2197 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime" Mar 7 01:10:37.156265 kubelet[2197]: E0307 01:10:37.156257 2197 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-4081-3-6-n-e40d23dcbc\" not found" Mar 7 01:10:37.243552 systemd[1]: Created slice kubepods-burstable-pod6deea6d952b3f72c7d75a6827f33c096.slice - libcontainer container kubepods-burstable-pod6deea6d952b3f72c7d75a6827f33c096.slice. 
Mar 7 01:10:37.258109 kubelet[2197]: I0307 01:10:37.258025 2197 kubelet_node_status.go:75] "Attempting to register node" node="ci-4081-3-6-n-e40d23dcbc" Mar 7 01:10:37.259152 kubelet[2197]: E0307 01:10:37.258477 2197 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://89.167.115.210:6443/api/v1/nodes\": dial tcp 89.167.115.210:6443: connect: connection refused" node="ci-4081-3-6-n-e40d23dcbc" Mar 7 01:10:37.261987 kubelet[2197]: E0307 01:10:37.261948 2197 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4081-3-6-n-e40d23dcbc\" not found" node="ci-4081-3-6-n-e40d23dcbc" Mar 7 01:10:37.267928 systemd[1]: Created slice kubepods-burstable-pod2b4ab3695228850ff9c093db9086482d.slice - libcontainer container kubepods-burstable-pod2b4ab3695228850ff9c093db9086482d.slice. Mar 7 01:10:37.278844 kubelet[2197]: E0307 01:10:37.278683 2197 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4081-3-6-n-e40d23dcbc\" not found" node="ci-4081-3-6-n-e40d23dcbc" Mar 7 01:10:37.284481 systemd[1]: Created slice kubepods-burstable-podc714ccd199a2f1af73714bbf4100aa30.slice - libcontainer container kubepods-burstable-podc714ccd199a2f1af73714bbf4100aa30.slice. 
Mar 7 01:10:37.287942 kubelet[2197]: E0307 01:10:37.287903 2197 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4081-3-6-n-e40d23dcbc\" not found" node="ci-4081-3-6-n-e40d23dcbc" Mar 7 01:10:37.299775 kubelet[2197]: E0307 01:10:37.299696 2197 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://89.167.115.210:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4081-3-6-n-e40d23dcbc?timeout=10s\": dial tcp 89.167.115.210:6443: connect: connection refused" interval="400ms" Mar 7 01:10:37.396122 kubelet[2197]: I0307 01:10:37.396064 2197 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/6deea6d952b3f72c7d75a6827f33c096-ca-certs\") pod \"kube-apiserver-ci-4081-3-6-n-e40d23dcbc\" (UID: \"6deea6d952b3f72c7d75a6827f33c096\") " pod="kube-system/kube-apiserver-ci-4081-3-6-n-e40d23dcbc" Mar 7 01:10:37.396122 kubelet[2197]: I0307 01:10:37.396121 2197 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/6deea6d952b3f72c7d75a6827f33c096-k8s-certs\") pod \"kube-apiserver-ci-4081-3-6-n-e40d23dcbc\" (UID: \"6deea6d952b3f72c7d75a6827f33c096\") " pod="kube-system/kube-apiserver-ci-4081-3-6-n-e40d23dcbc" Mar 7 01:10:37.396456 kubelet[2197]: I0307 01:10:37.396153 2197 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/2b4ab3695228850ff9c093db9086482d-ca-certs\") pod \"kube-controller-manager-ci-4081-3-6-n-e40d23dcbc\" (UID: \"2b4ab3695228850ff9c093db9086482d\") " pod="kube-system/kube-controller-manager-ci-4081-3-6-n-e40d23dcbc" Mar 7 01:10:37.396456 kubelet[2197]: I0307 01:10:37.396180 2197 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/2b4ab3695228850ff9c093db9086482d-flexvolume-dir\") pod \"kube-controller-manager-ci-4081-3-6-n-e40d23dcbc\" (UID: \"2b4ab3695228850ff9c093db9086482d\") " pod="kube-system/kube-controller-manager-ci-4081-3-6-n-e40d23dcbc" Mar 7 01:10:37.396456 kubelet[2197]: I0307 01:10:37.396225 2197 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/6deea6d952b3f72c7d75a6827f33c096-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4081-3-6-n-e40d23dcbc\" (UID: \"6deea6d952b3f72c7d75a6827f33c096\") " pod="kube-system/kube-apiserver-ci-4081-3-6-n-e40d23dcbc" Mar 7 01:10:37.396456 kubelet[2197]: I0307 01:10:37.396248 2197 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/2b4ab3695228850ff9c093db9086482d-k8s-certs\") pod \"kube-controller-manager-ci-4081-3-6-n-e40d23dcbc\" (UID: \"2b4ab3695228850ff9c093db9086482d\") " pod="kube-system/kube-controller-manager-ci-4081-3-6-n-e40d23dcbc" Mar 7 01:10:37.396456 kubelet[2197]: I0307 01:10:37.396271 2197 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/2b4ab3695228850ff9c093db9086482d-kubeconfig\") pod \"kube-controller-manager-ci-4081-3-6-n-e40d23dcbc\" (UID: \"2b4ab3695228850ff9c093db9086482d\") " pod="kube-system/kube-controller-manager-ci-4081-3-6-n-e40d23dcbc" Mar 7 01:10:37.396962 kubelet[2197]: I0307 01:10:37.396295 2197 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/2b4ab3695228850ff9c093db9086482d-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4081-3-6-n-e40d23dcbc\" (UID: \"2b4ab3695228850ff9c093db9086482d\") " 
pod="kube-system/kube-controller-manager-ci-4081-3-6-n-e40d23dcbc" Mar 7 01:10:37.396962 kubelet[2197]: I0307 01:10:37.396320 2197 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/c714ccd199a2f1af73714bbf4100aa30-kubeconfig\") pod \"kube-scheduler-ci-4081-3-6-n-e40d23dcbc\" (UID: \"c714ccd199a2f1af73714bbf4100aa30\") " pod="kube-system/kube-scheduler-ci-4081-3-6-n-e40d23dcbc" Mar 7 01:10:37.461629 kubelet[2197]: I0307 01:10:37.461521 2197 kubelet_node_status.go:75] "Attempting to register node" node="ci-4081-3-6-n-e40d23dcbc" Mar 7 01:10:37.462124 kubelet[2197]: E0307 01:10:37.462087 2197 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://89.167.115.210:6443/api/v1/nodes\": dial tcp 89.167.115.210:6443: connect: connection refused" node="ci-4081-3-6-n-e40d23dcbc" Mar 7 01:10:37.564488 containerd[1517]: time="2026-03-07T01:10:37.564210752Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4081-3-6-n-e40d23dcbc,Uid:6deea6d952b3f72c7d75a6827f33c096,Namespace:kube-system,Attempt:0,}" Mar 7 01:10:37.582091 containerd[1517]: time="2026-03-07T01:10:37.582011960Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4081-3-6-n-e40d23dcbc,Uid:2b4ab3695228850ff9c093db9086482d,Namespace:kube-system,Attempt:0,}" Mar 7 01:10:37.589850 containerd[1517]: time="2026-03-07T01:10:37.589793253Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4081-3-6-n-e40d23dcbc,Uid:c714ccd199a2f1af73714bbf4100aa30,Namespace:kube-system,Attempt:0,}" Mar 7 01:10:37.613070 kubelet[2197]: E0307 01:10:37.612884 2197 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://89.167.115.210:6443/api/v1/namespaces/default/events\": dial tcp 89.167.115.210:6443: connect: connection refused" 
event="&Event{ObjectMeta:{ci-4081-3-6-n-e40d23dcbc.189a69e559010808 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4081-3-6-n-e40d23dcbc,UID:ci-4081-3-6-n-e40d23dcbc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4081-3-6-n-e40d23dcbc,},FirstTimestamp:2026-03-07 01:10:37.085378568 +0000 UTC m=+0.421688964,LastTimestamp:2026-03-07 01:10:37.085378568 +0000 UTC m=+0.421688964,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4081-3-6-n-e40d23dcbc,}" Mar 7 01:10:37.701052 kubelet[2197]: E0307 01:10:37.700435 2197 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://89.167.115.210:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4081-3-6-n-e40d23dcbc?timeout=10s\": dial tcp 89.167.115.210:6443: connect: connection refused" interval="800ms" Mar 7 01:10:37.865052 kubelet[2197]: I0307 01:10:37.864917 2197 kubelet_node_status.go:75] "Attempting to register node" node="ci-4081-3-6-n-e40d23dcbc" Mar 7 01:10:37.865897 kubelet[2197]: E0307 01:10:37.865848 2197 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://89.167.115.210:6443/api/v1/nodes\": dial tcp 89.167.115.210:6443: connect: connection refused" node="ci-4081-3-6-n-e40d23dcbc" Mar 7 01:10:37.984642 kubelet[2197]: E0307 01:10:37.984502 2197 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: Get \"https://89.167.115.210:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 89.167.115.210:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Mar 7 01:10:38.039445 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3367665020.mount: Deactivated 
successfully. Mar 7 01:10:38.046977 containerd[1517]: time="2026-03-07T01:10:38.046872108Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Mar 7 01:10:38.048340 containerd[1517]: time="2026-03-07T01:10:38.048221405Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Mar 7 01:10:38.049859 containerd[1517]: time="2026-03-07T01:10:38.049767280Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=0" Mar 7 01:10:38.053495 containerd[1517]: time="2026-03-07T01:10:38.053376547Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=312078" Mar 7 01:10:38.055216 containerd[1517]: time="2026-03-07T01:10:38.055166006Z" level=info msg="ImageCreate event name:\"sha256:4873874c08efc72e9729683a83ffbb7502ee729e9a5ac097723806ea7fa13517\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Mar 7 01:10:38.056963 containerd[1517]: time="2026-03-07T01:10:38.056803092Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Mar 7 01:10:38.056963 containerd[1517]: time="2026-03-07T01:10:38.056848962Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=0" Mar 7 01:10:38.061510 containerd[1517]: time="2026-03-07T01:10:38.061365514Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Mar 7 
01:10:38.064695 containerd[1517]: time="2026-03-07T01:10:38.064651900Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4873874c08efc72e9729683a83ffbb7502ee729e9a5ac097723806ea7fa13517\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"311286\" in 500.298223ms" Mar 7 01:10:38.066981 containerd[1517]: time="2026-03-07T01:10:38.066929430Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4873874c08efc72e9729683a83ffbb7502ee729e9a5ac097723806ea7fa13517\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"311286\" in 477.042651ms" Mar 7 01:10:38.068443 containerd[1517]: time="2026-03-07T01:10:38.068116027Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4873874c08efc72e9729683a83ffbb7502ee729e9a5ac097723806ea7fa13517\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"311286\" in 485.998337ms" Mar 7 01:10:38.178891 containerd[1517]: time="2026-03-07T01:10:38.178651863Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 7 01:10:38.178891 containerd[1517]: time="2026-03-07T01:10:38.178700413Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 7 01:10:38.178891 containerd[1517]: time="2026-03-07T01:10:38.178711419Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 7 01:10:38.179176 containerd[1517]: time="2026-03-07T01:10:38.179098996Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 7 01:10:38.181212 containerd[1517]: time="2026-03-07T01:10:38.181142287Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 7 01:10:38.181314 containerd[1517]: time="2026-03-07T01:10:38.181196175Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 7 01:10:38.181314 containerd[1517]: time="2026-03-07T01:10:38.181222263Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 7 01:10:38.181371 containerd[1517]: time="2026-03-07T01:10:38.181297557Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 7 01:10:38.183249 containerd[1517]: time="2026-03-07T01:10:38.181566579Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 7 01:10:38.183249 containerd[1517]: time="2026-03-07T01:10:38.181617390Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 7 01:10:38.183249 containerd[1517]: time="2026-03-07T01:10:38.181631554Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 7 01:10:38.183249 containerd[1517]: time="2026-03-07T01:10:38.181730745Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 7 01:10:38.203040 systemd[1]: Started cri-containerd-28451ef59cf4a379a051e5be392783009c391f2ca8e0533974dc9c6f459801b5.scope - libcontainer container 28451ef59cf4a379a051e5be392783009c391f2ca8e0533974dc9c6f459801b5. 
Mar 7 01:10:38.214606 systemd[1]: Started cri-containerd-e600065ac5d0bb53cb5ae7c2b16aa761e4ac57689946ea07f2d71443bcaf459e.scope - libcontainer container e600065ac5d0bb53cb5ae7c2b16aa761e4ac57689946ea07f2d71443bcaf459e. Mar 7 01:10:38.228576 systemd[1]: Started cri-containerd-c64ee6765446386f22c4f2080672b0e63d6fda831195072753573ca6df2b128c.scope - libcontainer container c64ee6765446386f22c4f2080672b0e63d6fda831195072753573ca6df2b128c. Mar 7 01:10:38.260196 containerd[1517]: time="2026-03-07T01:10:38.260167199Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4081-3-6-n-e40d23dcbc,Uid:c714ccd199a2f1af73714bbf4100aa30,Namespace:kube-system,Attempt:0,} returns sandbox id \"28451ef59cf4a379a051e5be392783009c391f2ca8e0533974dc9c6f459801b5\"" Mar 7 01:10:38.267419 containerd[1517]: time="2026-03-07T01:10:38.266880967Z" level=info msg="CreateContainer within sandbox \"28451ef59cf4a379a051e5be392783009c391f2ca8e0533974dc9c6f459801b5\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Mar 7 01:10:38.291573 containerd[1517]: time="2026-03-07T01:10:38.291536181Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4081-3-6-n-e40d23dcbc,Uid:6deea6d952b3f72c7d75a6827f33c096,Namespace:kube-system,Attempt:0,} returns sandbox id \"e600065ac5d0bb53cb5ae7c2b16aa761e4ac57689946ea07f2d71443bcaf459e\"" Mar 7 01:10:38.295024 containerd[1517]: time="2026-03-07T01:10:38.294998525Z" level=info msg="CreateContainer within sandbox \"28451ef59cf4a379a051e5be392783009c391f2ca8e0533974dc9c6f459801b5\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"b10634c722411af6f2beb366d3e22c24d275189d436fb8401462fb5395750c77\"" Mar 7 01:10:38.295719 containerd[1517]: time="2026-03-07T01:10:38.295699810Z" level=info msg="StartContainer for \"b10634c722411af6f2beb366d3e22c24d275189d436fb8401462fb5395750c77\"" Mar 7 01:10:38.298691 containerd[1517]: time="2026-03-07T01:10:38.298663304Z" level=info 
msg="CreateContainer within sandbox \"e600065ac5d0bb53cb5ae7c2b16aa761e4ac57689946ea07f2d71443bcaf459e\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Mar 7 01:10:38.301573 containerd[1517]: time="2026-03-07T01:10:38.301546535Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4081-3-6-n-e40d23dcbc,Uid:2b4ab3695228850ff9c093db9086482d,Namespace:kube-system,Attempt:0,} returns sandbox id \"c64ee6765446386f22c4f2080672b0e63d6fda831195072753573ca6df2b128c\"" Mar 7 01:10:38.305513 containerd[1517]: time="2026-03-07T01:10:38.305483534Z" level=info msg="CreateContainer within sandbox \"c64ee6765446386f22c4f2080672b0e63d6fda831195072753573ca6df2b128c\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Mar 7 01:10:38.316077 containerd[1517]: time="2026-03-07T01:10:38.316045967Z" level=info msg="CreateContainer within sandbox \"e600065ac5d0bb53cb5ae7c2b16aa761e4ac57689946ea07f2d71443bcaf459e\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"ee5226160a4075d2471c657eb046f83aa4fdcb510eb8b72b6b103613335c5073\"" Mar 7 01:10:38.316676 containerd[1517]: time="2026-03-07T01:10:38.316656259Z" level=info msg="StartContainer for \"ee5226160a4075d2471c657eb046f83aa4fdcb510eb8b72b6b103613335c5073\"" Mar 7 01:10:38.324898 containerd[1517]: time="2026-03-07T01:10:38.324827130Z" level=info msg="CreateContainer within sandbox \"c64ee6765446386f22c4f2080672b0e63d6fda831195072753573ca6df2b128c\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"53ea4186dae333b566c978013acf0388dd4cd1ceb0c6471aaf066554535ca174\"" Mar 7 01:10:38.325252 containerd[1517]: time="2026-03-07T01:10:38.325224071Z" level=info msg="StartContainer for \"53ea4186dae333b566c978013acf0388dd4cd1ceb0c6471aaf066554535ca174\"" Mar 7 01:10:38.328578 systemd[1]: Started cri-containerd-b10634c722411af6f2beb366d3e22c24d275189d436fb8401462fb5395750c77.scope - libcontainer container 
b10634c722411af6f2beb366d3e22c24d275189d436fb8401462fb5395750c77. Mar 7 01:10:38.345554 systemd[1]: Started cri-containerd-ee5226160a4075d2471c657eb046f83aa4fdcb510eb8b72b6b103613335c5073.scope - libcontainer container ee5226160a4075d2471c657eb046f83aa4fdcb510eb8b72b6b103613335c5073. Mar 7 01:10:38.367581 systemd[1]: Started cri-containerd-53ea4186dae333b566c978013acf0388dd4cd1ceb0c6471aaf066554535ca174.scope - libcontainer container 53ea4186dae333b566c978013acf0388dd4cd1ceb0c6471aaf066554535ca174. Mar 7 01:10:38.391988 containerd[1517]: time="2026-03-07T01:10:38.391937582Z" level=info msg="StartContainer for \"b10634c722411af6f2beb366d3e22c24d275189d436fb8401462fb5395750c77\" returns successfully" Mar 7 01:10:38.398119 kubelet[2197]: E0307 01:10:38.398085 2197 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: Get \"https://89.167.115.210:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4081-3-6-n-e40d23dcbc&limit=500&resourceVersion=0\": dial tcp 89.167.115.210:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Mar 7 01:10:38.405840 kubelet[2197]: E0307 01:10:38.405501 2197 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: Get \"https://89.167.115.210:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 89.167.115.210:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Mar 7 01:10:38.408796 containerd[1517]: time="2026-03-07T01:10:38.408159566Z" level=info msg="StartContainer for \"ee5226160a4075d2471c657eb046f83aa4fdcb510eb8b72b6b103613335c5073\" returns successfully" Mar 7 01:10:38.414604 kubelet[2197]: E0307 01:10:38.414573 2197 reflector.go:200] "Failed to watch" err="failed to list *v1.RuntimeClass: Get \"https://89.167.115.210:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 89.167.115.210:6443: connect: 
connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" Mar 7 01:10:38.438219 containerd[1517]: time="2026-03-07T01:10:38.438113379Z" level=info msg="StartContainer for \"53ea4186dae333b566c978013acf0388dd4cd1ceb0c6471aaf066554535ca174\" returns successfully" Mar 7 01:10:38.667594 kubelet[2197]: I0307 01:10:38.667562 2197 kubelet_node_status.go:75] "Attempting to register node" node="ci-4081-3-6-n-e40d23dcbc" Mar 7 01:10:39.133619 kubelet[2197]: E0307 01:10:39.133590 2197 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4081-3-6-n-e40d23dcbc\" not found" node="ci-4081-3-6-n-e40d23dcbc" Mar 7 01:10:39.135721 kubelet[2197]: E0307 01:10:39.135704 2197 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4081-3-6-n-e40d23dcbc\" not found" node="ci-4081-3-6-n-e40d23dcbc" Mar 7 01:10:39.138130 kubelet[2197]: E0307 01:10:39.138115 2197 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4081-3-6-n-e40d23dcbc\" not found" node="ci-4081-3-6-n-e40d23dcbc" Mar 7 01:10:39.448758 kubelet[2197]: E0307 01:10:39.448451 2197 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ci-4081-3-6-n-e40d23dcbc\" not found" node="ci-4081-3-6-n-e40d23dcbc" Mar 7 01:10:39.542288 kubelet[2197]: I0307 01:10:39.542134 2197 kubelet_node_status.go:78] "Successfully registered node" node="ci-4081-3-6-n-e40d23dcbc" Mar 7 01:10:39.542288 kubelet[2197]: E0307 01:10:39.542164 2197 kubelet_node_status.go:548] "Error updating node status, will retry" err="error getting node \"ci-4081-3-6-n-e40d23dcbc\": node \"ci-4081-3-6-n-e40d23dcbc\" not found" Mar 7 01:10:39.552816 kubelet[2197]: E0307 01:10:39.552787 2197 kubelet_node_status.go:466] "Error getting the current node from lister" err="node 
\"ci-4081-3-6-n-e40d23dcbc\" not found" Mar 7 01:10:39.652980 kubelet[2197]: E0307 01:10:39.652934 2197 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4081-3-6-n-e40d23dcbc\" not found" Mar 7 01:10:39.753829 kubelet[2197]: E0307 01:10:39.753454 2197 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4081-3-6-n-e40d23dcbc\" not found" Mar 7 01:10:39.854024 kubelet[2197]: E0307 01:10:39.853930 2197 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4081-3-6-n-e40d23dcbc\" not found" Mar 7 01:10:39.954585 kubelet[2197]: E0307 01:10:39.954512 2197 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4081-3-6-n-e40d23dcbc\" not found" Mar 7 01:10:40.054880 kubelet[2197]: E0307 01:10:40.054700 2197 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4081-3-6-n-e40d23dcbc\" not found" Mar 7 01:10:40.142003 kubelet[2197]: E0307 01:10:40.141655 2197 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4081-3-6-n-e40d23dcbc\" not found" node="ci-4081-3-6-n-e40d23dcbc" Mar 7 01:10:40.142003 kubelet[2197]: E0307 01:10:40.141815 2197 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4081-3-6-n-e40d23dcbc\" not found" node="ci-4081-3-6-n-e40d23dcbc" Mar 7 01:10:40.155181 kubelet[2197]: E0307 01:10:40.155132 2197 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4081-3-6-n-e40d23dcbc\" not found" Mar 7 01:10:40.256315 kubelet[2197]: E0307 01:10:40.256240 2197 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4081-3-6-n-e40d23dcbc\" not found" Mar 7 01:10:40.357306 kubelet[2197]: E0307 01:10:40.357133 2197 kubelet_node_status.go:466] "Error getting the current node from lister" err="node 
\"ci-4081-3-6-n-e40d23dcbc\" not found" Mar 7 01:10:40.458751 kubelet[2197]: E0307 01:10:40.458311 2197 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4081-3-6-n-e40d23dcbc\" not found" Mar 7 01:10:40.558556 kubelet[2197]: E0307 01:10:40.558519 2197 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4081-3-6-n-e40d23dcbc\" not found" Mar 7 01:10:40.658819 kubelet[2197]: E0307 01:10:40.658606 2197 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4081-3-6-n-e40d23dcbc\" not found" Mar 7 01:10:40.758823 kubelet[2197]: E0307 01:10:40.758776 2197 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4081-3-6-n-e40d23dcbc\" not found" Mar 7 01:10:40.894183 kubelet[2197]: I0307 01:10:40.893917 2197 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4081-3-6-n-e40d23dcbc" Mar 7 01:10:40.906006 kubelet[2197]: I0307 01:10:40.905772 2197 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4081-3-6-n-e40d23dcbc" Mar 7 01:10:40.910359 kubelet[2197]: I0307 01:10:40.910245 2197 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4081-3-6-n-e40d23dcbc" Mar 7 01:10:41.082762 kubelet[2197]: I0307 01:10:41.082707 2197 apiserver.go:52] "Watching apiserver" Mar 7 01:10:41.094502 kubelet[2197]: I0307 01:10:41.094439 2197 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Mar 7 01:10:41.616807 systemd[1]: Reloading requested from client PID 2487 ('systemctl') (unit session-7.scope)... Mar 7 01:10:41.616820 systemd[1]: Reloading... Mar 7 01:10:41.689431 zram_generator::config[2530]: No configuration found. 
Mar 7 01:10:41.776593 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Mar 7 01:10:41.849430 systemd[1]: Reloading finished in 232 ms. Mar 7 01:10:41.887470 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Mar 7 01:10:41.916656 systemd[1]: kubelet.service: Deactivated successfully. Mar 7 01:10:41.917420 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Mar 7 01:10:41.922770 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 7 01:10:42.068664 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 7 01:10:42.070646 (kubelet)[2578]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Mar 7 01:10:42.100124 kubelet[2578]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 7 01:10:42.100124 kubelet[2578]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Mar 7 01:10:42.100124 kubelet[2578]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Mar 7 01:10:42.100626 kubelet[2578]: I0307 01:10:42.100159 2578 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Mar 7 01:10:42.106154 kubelet[2578]: I0307 01:10:42.106121 2578 server.go:530] "Kubelet version" kubeletVersion="v1.33.8" Mar 7 01:10:42.106154 kubelet[2578]: I0307 01:10:42.106138 2578 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Mar 7 01:10:42.106302 kubelet[2578]: I0307 01:10:42.106287 2578 server.go:956] "Client rotation is on, will bootstrap in background" Mar 7 01:10:42.107904 kubelet[2578]: I0307 01:10:42.107886 2578 certificate_store.go:147] "Loading cert/key pair from a file" filePath="/var/lib/kubelet/pki/kubelet-client-current.pem" Mar 7 01:10:42.111473 kubelet[2578]: I0307 01:10:42.111311 2578 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Mar 7 01:10:42.113737 kubelet[2578]: E0307 01:10:42.113715 2578 log.go:32] "RuntimeConfig from runtime service failed" err="rpc error: code = Unimplemented desc = unknown method RuntimeConfig for service runtime.v1.RuntimeService" Mar 7 01:10:42.113737 kubelet[2578]: I0307 01:10:42.113733 2578 server.go:1423] "CRI implementation should be updated to support RuntimeConfig when KubeletCgroupDriverFromCRI feature gate has been enabled. Falling back to using cgroupDriver from kubelet config." Mar 7 01:10:42.116433 kubelet[2578]: I0307 01:10:42.116393 2578 server.go:782] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Mar 7 01:10:42.116612 kubelet[2578]: I0307 01:10:42.116589 2578 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Mar 7 01:10:42.116723 kubelet[2578]: I0307 01:10:42.116606 2578 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4081-3-6-n-e40d23dcbc","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Mar 7 01:10:42.116779 kubelet[2578]: I0307 01:10:42.116725 2578 topology_manager.go:138] "Creating topology manager with none policy" Mar 7 
01:10:42.116779 kubelet[2578]: I0307 01:10:42.116732 2578 container_manager_linux.go:303] "Creating device plugin manager" Mar 7 01:10:42.116779 kubelet[2578]: I0307 01:10:42.116777 2578 state_mem.go:36] "Initialized new in-memory state store" Mar 7 01:10:42.117125 kubelet[2578]: I0307 01:10:42.116936 2578 kubelet.go:480] "Attempting to sync node with API server" Mar 7 01:10:42.117125 kubelet[2578]: I0307 01:10:42.116947 2578 kubelet.go:375] "Adding static pod path" path="/etc/kubernetes/manifests" Mar 7 01:10:42.117125 kubelet[2578]: I0307 01:10:42.116982 2578 kubelet.go:386] "Adding apiserver pod source" Mar 7 01:10:42.117125 kubelet[2578]: I0307 01:10:42.116994 2578 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Mar 7 01:10:42.120574 kubelet[2578]: I0307 01:10:42.120556 2578 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="containerd" version="v1.7.21" apiVersion="v1" Mar 7 01:10:42.121028 kubelet[2578]: I0307 01:10:42.121013 2578 kubelet.go:935] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Mar 7 01:10:42.124903 kubelet[2578]: I0307 01:10:42.124808 2578 watchdog_linux.go:99] "Systemd watchdog is not enabled" Mar 7 01:10:42.124903 kubelet[2578]: I0307 01:10:42.124833 2578 server.go:1289] "Started kubelet" Mar 7 01:10:42.125092 kubelet[2578]: I0307 01:10:42.125052 2578 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Mar 7 01:10:42.125230 kubelet[2578]: I0307 01:10:42.125194 2578 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Mar 7 01:10:42.126661 kubelet[2578]: I0307 01:10:42.125453 2578 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Mar 7 01:10:42.128565 kubelet[2578]: I0307 01:10:42.128507 2578 server.go:317] "Adding debug handlers to kubelet server" Mar 7 01:10:42.129836 
kubelet[2578]: I0307 01:10:42.129825 2578 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Mar 7 01:10:42.131216 kubelet[2578]: I0307 01:10:42.130730 2578 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Mar 7 01:10:42.134047 kubelet[2578]: I0307 01:10:42.134027 2578 volume_manager.go:297] "Starting Kubelet Volume Manager" Mar 7 01:10:42.134374 kubelet[2578]: I0307 01:10:42.134356 2578 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Mar 7 01:10:42.134540 kubelet[2578]: I0307 01:10:42.134526 2578 reconciler.go:26] "Reconciler: start to sync state" Mar 7 01:10:42.137553 kubelet[2578]: E0307 01:10:42.137540 2578 kubelet.go:1600] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Mar 7 01:10:42.139675 kubelet[2578]: I0307 01:10:42.138746 2578 factory.go:223] Registration of the containerd container factory successfully Mar 7 01:10:42.139675 kubelet[2578]: I0307 01:10:42.138760 2578 factory.go:223] Registration of the systemd container factory successfully Mar 7 01:10:42.139675 kubelet[2578]: I0307 01:10:42.138805 2578 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Mar 7 01:10:42.146746 kubelet[2578]: I0307 01:10:42.146718 2578 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Mar 7 01:10:42.147815 kubelet[2578]: I0307 01:10:42.147802 2578 kubelet_network_linux.go:49] "Initialized iptables rules." 
protocol="IPv6" Mar 7 01:10:42.147875 kubelet[2578]: I0307 01:10:42.147869 2578 status_manager.go:230] "Starting to sync pod status with apiserver" Mar 7 01:10:42.147922 kubelet[2578]: I0307 01:10:42.147915 2578 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Mar 7 01:10:42.147951 kubelet[2578]: I0307 01:10:42.147946 2578 kubelet.go:2436] "Starting kubelet main sync loop" Mar 7 01:10:42.148176 kubelet[2578]: E0307 01:10:42.148002 2578 kubelet.go:2460] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Mar 7 01:10:42.184181 kubelet[2578]: I0307 01:10:42.184156 2578 cpu_manager.go:221] "Starting CPU manager" policy="none" Mar 7 01:10:42.184181 kubelet[2578]: I0307 01:10:42.184169 2578 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Mar 7 01:10:42.184181 kubelet[2578]: I0307 01:10:42.184184 2578 state_mem.go:36] "Initialized new in-memory state store" Mar 7 01:10:42.184324 kubelet[2578]: I0307 01:10:42.184275 2578 state_mem.go:88] "Updated default CPUSet" cpuSet="" Mar 7 01:10:42.184324 kubelet[2578]: I0307 01:10:42.184282 2578 state_mem.go:96] "Updated CPUSet assignments" assignments={} Mar 7 01:10:42.184324 kubelet[2578]: I0307 01:10:42.184295 2578 policy_none.go:49] "None policy: Start" Mar 7 01:10:42.184324 kubelet[2578]: I0307 01:10:42.184305 2578 memory_manager.go:186] "Starting memorymanager" policy="None" Mar 7 01:10:42.184324 kubelet[2578]: I0307 01:10:42.184315 2578 state_mem.go:35] "Initializing new in-memory state store" Mar 7 01:10:42.184445 kubelet[2578]: I0307 01:10:42.184430 2578 state_mem.go:75] "Updated machine memory state" Mar 7 01:10:42.188239 kubelet[2578]: E0307 01:10:42.188128 2578 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Mar 7 01:10:42.188538 kubelet[2578]: I0307 01:10:42.188524 
2578 eviction_manager.go:189] "Eviction manager: starting control loop" Mar 7 01:10:42.188578 kubelet[2578]: I0307 01:10:42.188538 2578 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 7 01:10:42.188724 kubelet[2578]: I0307 01:10:42.188711 2578 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Mar 7 01:10:42.190768 kubelet[2578]: E0307 01:10:42.190751 2578 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime" Mar 7 01:10:42.249433 kubelet[2578]: I0307 01:10:42.249148 2578 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4081-3-6-n-e40d23dcbc" Mar 7 01:10:42.249433 kubelet[2578]: I0307 01:10:42.249239 2578 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4081-3-6-n-e40d23dcbc" Mar 7 01:10:42.249433 kubelet[2578]: I0307 01:10:42.249361 2578 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4081-3-6-n-e40d23dcbc" Mar 7 01:10:42.253457 kubelet[2578]: E0307 01:10:42.253437 2578 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4081-3-6-n-e40d23dcbc\" already exists" pod="kube-system/kube-apiserver-ci-4081-3-6-n-e40d23dcbc" Mar 7 01:10:42.254038 kubelet[2578]: E0307 01:10:42.253954 2578 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4081-3-6-n-e40d23dcbc\" already exists" pod="kube-system/kube-scheduler-ci-4081-3-6-n-e40d23dcbc" Mar 7 01:10:42.254038 kubelet[2578]: E0307 01:10:42.253965 2578 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-controller-manager-ci-4081-3-6-n-e40d23dcbc\" already exists" pod="kube-system/kube-controller-manager-ci-4081-3-6-n-e40d23dcbc" Mar 7 01:10:42.298593 kubelet[2578]: I0307 01:10:42.298123 2578 kubelet_node_status.go:75] "Attempting to register node" 
node="ci-4081-3-6-n-e40d23dcbc" Mar 7 01:10:42.305044 kubelet[2578]: I0307 01:10:42.304717 2578 kubelet_node_status.go:124] "Node was previously registered" node="ci-4081-3-6-n-e40d23dcbc" Mar 7 01:10:42.305044 kubelet[2578]: I0307 01:10:42.304834 2578 kubelet_node_status.go:78] "Successfully registered node" node="ci-4081-3-6-n-e40d23dcbc" Mar 7 01:10:42.335977 kubelet[2578]: I0307 01:10:42.335923 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/2b4ab3695228850ff9c093db9086482d-kubeconfig\") pod \"kube-controller-manager-ci-4081-3-6-n-e40d23dcbc\" (UID: \"2b4ab3695228850ff9c093db9086482d\") " pod="kube-system/kube-controller-manager-ci-4081-3-6-n-e40d23dcbc" Mar 7 01:10:42.335977 kubelet[2578]: I0307 01:10:42.335971 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/c714ccd199a2f1af73714bbf4100aa30-kubeconfig\") pod \"kube-scheduler-ci-4081-3-6-n-e40d23dcbc\" (UID: \"c714ccd199a2f1af73714bbf4100aa30\") " pod="kube-system/kube-scheduler-ci-4081-3-6-n-e40d23dcbc" Mar 7 01:10:42.336128 kubelet[2578]: I0307 01:10:42.336000 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/6deea6d952b3f72c7d75a6827f33c096-ca-certs\") pod \"kube-apiserver-ci-4081-3-6-n-e40d23dcbc\" (UID: \"6deea6d952b3f72c7d75a6827f33c096\") " pod="kube-system/kube-apiserver-ci-4081-3-6-n-e40d23dcbc" Mar 7 01:10:42.336128 kubelet[2578]: I0307 01:10:42.336023 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/6deea6d952b3f72c7d75a6827f33c096-k8s-certs\") pod \"kube-apiserver-ci-4081-3-6-n-e40d23dcbc\" (UID: \"6deea6d952b3f72c7d75a6827f33c096\") " 
pod="kube-system/kube-apiserver-ci-4081-3-6-n-e40d23dcbc" Mar 7 01:10:42.336128 kubelet[2578]: I0307 01:10:42.336045 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/2b4ab3695228850ff9c093db9086482d-k8s-certs\") pod \"kube-controller-manager-ci-4081-3-6-n-e40d23dcbc\" (UID: \"2b4ab3695228850ff9c093db9086482d\") " pod="kube-system/kube-controller-manager-ci-4081-3-6-n-e40d23dcbc" Mar 7 01:10:42.336128 kubelet[2578]: I0307 01:10:42.336070 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/2b4ab3695228850ff9c093db9086482d-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4081-3-6-n-e40d23dcbc\" (UID: \"2b4ab3695228850ff9c093db9086482d\") " pod="kube-system/kube-controller-manager-ci-4081-3-6-n-e40d23dcbc" Mar 7 01:10:42.336128 kubelet[2578]: I0307 01:10:42.336094 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/6deea6d952b3f72c7d75a6827f33c096-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4081-3-6-n-e40d23dcbc\" (UID: \"6deea6d952b3f72c7d75a6827f33c096\") " pod="kube-system/kube-apiserver-ci-4081-3-6-n-e40d23dcbc" Mar 7 01:10:42.336244 kubelet[2578]: I0307 01:10:42.336116 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/2b4ab3695228850ff9c093db9086482d-ca-certs\") pod \"kube-controller-manager-ci-4081-3-6-n-e40d23dcbc\" (UID: \"2b4ab3695228850ff9c093db9086482d\") " pod="kube-system/kube-controller-manager-ci-4081-3-6-n-e40d23dcbc" Mar 7 01:10:42.336244 kubelet[2578]: I0307 01:10:42.336140 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" 
(UniqueName: \"kubernetes.io/host-path/2b4ab3695228850ff9c093db9086482d-flexvolume-dir\") pod \"kube-controller-manager-ci-4081-3-6-n-e40d23dcbc\" (UID: \"2b4ab3695228850ff9c093db9086482d\") " pod="kube-system/kube-controller-manager-ci-4081-3-6-n-e40d23dcbc" Mar 7 01:10:43.122438 kubelet[2578]: I0307 01:10:43.120882 2578 apiserver.go:52] "Watching apiserver" Mar 7 01:10:43.135311 kubelet[2578]: I0307 01:10:43.135237 2578 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Mar 7 01:10:43.176279 kubelet[2578]: I0307 01:10:43.173909 2578 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4081-3-6-n-e40d23dcbc" Mar 7 01:10:43.180126 kubelet[2578]: E0307 01:10:43.180059 2578 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4081-3-6-n-e40d23dcbc\" already exists" pod="kube-system/kube-scheduler-ci-4081-3-6-n-e40d23dcbc" Mar 7 01:10:43.198101 kubelet[2578]: I0307 01:10:43.197969 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ci-4081-3-6-n-e40d23dcbc" podStartSLOduration=3.197955996 podStartE2EDuration="3.197955996s" podCreationTimestamp="2026-03-07 01:10:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 01:10:43.190645111 +0000 UTC m=+1.115612152" watchObservedRunningTime="2026-03-07 01:10:43.197955996 +0000 UTC m=+1.122923047" Mar 7 01:10:43.205225 kubelet[2578]: I0307 01:10:43.205189 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-ci-4081-3-6-n-e40d23dcbc" podStartSLOduration=3.205176534 podStartE2EDuration="3.205176534s" podCreationTimestamp="2026-03-07 01:10:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 01:10:43.198505401 +0000 UTC 
m=+1.123472452" watchObservedRunningTime="2026-03-07 01:10:43.205176534 +0000 UTC m=+1.130143585" Mar 7 01:10:46.891594 kubelet[2578]: I0307 01:10:46.891519 2578 kuberuntime_manager.go:1746] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Mar 7 01:10:46.892110 kubelet[2578]: I0307 01:10:46.891885 2578 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Mar 7 01:10:46.892147 containerd[1517]: time="2026-03-07T01:10:46.891762390Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." Mar 7 01:10:47.713002 kubelet[2578]: I0307 01:10:47.712014 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-ci-4081-3-6-n-e40d23dcbc" podStartSLOduration=7.712000153 podStartE2EDuration="7.712000153s" podCreationTimestamp="2026-03-07 01:10:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 01:10:43.205604952 +0000 UTC m=+1.130572003" watchObservedRunningTime="2026-03-07 01:10:47.712000153 +0000 UTC m=+5.636967194" Mar 7 01:10:47.722486 systemd[1]: Created slice kubepods-besteffort-pode5917c3e_7a44_4b38_beeb_18c4766c1dac.slice - libcontainer container kubepods-besteffort-pode5917c3e_7a44_4b38_beeb_18c4766c1dac.slice. 
Mar 7 01:10:47.780350 kubelet[2578]: I0307 01:10:47.780294 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2p2bl\" (UniqueName: \"kubernetes.io/projected/e5917c3e-7a44-4b38-beeb-18c4766c1dac-kube-api-access-2p2bl\") pod \"kube-proxy-nmzcx\" (UID: \"e5917c3e-7a44-4b38-beeb-18c4766c1dac\") " pod="kube-system/kube-proxy-nmzcx" Mar 7 01:10:47.780493 kubelet[2578]: I0307 01:10:47.780371 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/e5917c3e-7a44-4b38-beeb-18c4766c1dac-kube-proxy\") pod \"kube-proxy-nmzcx\" (UID: \"e5917c3e-7a44-4b38-beeb-18c4766c1dac\") " pod="kube-system/kube-proxy-nmzcx" Mar 7 01:10:47.780493 kubelet[2578]: I0307 01:10:47.780432 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/e5917c3e-7a44-4b38-beeb-18c4766c1dac-xtables-lock\") pod \"kube-proxy-nmzcx\" (UID: \"e5917c3e-7a44-4b38-beeb-18c4766c1dac\") " pod="kube-system/kube-proxy-nmzcx" Mar 7 01:10:47.780493 kubelet[2578]: I0307 01:10:47.780454 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e5917c3e-7a44-4b38-beeb-18c4766c1dac-lib-modules\") pod \"kube-proxy-nmzcx\" (UID: \"e5917c3e-7a44-4b38-beeb-18c4766c1dac\") " pod="kube-system/kube-proxy-nmzcx" Mar 7 01:10:47.840659 systemd[1]: Created slice kubepods-besteffort-pod7c194493_3b24_4c2d_bb77_815f24e5c7a0.slice - libcontainer container kubepods-besteffort-pod7c194493_3b24_4c2d_bb77_815f24e5c7a0.slice. 
Mar 7 01:10:47.881548 kubelet[2578]: I0307 01:10:47.881085 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/7c194493-3b24-4c2d-bb77-815f24e5c7a0-var-lib-calico\") pod \"tigera-operator-6bf85f8dd-5fjdk\" (UID: \"7c194493-3b24-4c2d-bb77-815f24e5c7a0\") " pod="tigera-operator/tigera-operator-6bf85f8dd-5fjdk" Mar 7 01:10:47.881548 kubelet[2578]: I0307 01:10:47.881122 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bz9p4\" (UniqueName: \"kubernetes.io/projected/7c194493-3b24-4c2d-bb77-815f24e5c7a0-kube-api-access-bz9p4\") pod \"tigera-operator-6bf85f8dd-5fjdk\" (UID: \"7c194493-3b24-4c2d-bb77-815f24e5c7a0\") " pod="tigera-operator/tigera-operator-6bf85f8dd-5fjdk" Mar 7 01:10:48.032678 containerd[1517]: time="2026-03-07T01:10:48.032202628Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-nmzcx,Uid:e5917c3e-7a44-4b38-beeb-18c4766c1dac,Namespace:kube-system,Attempt:0,}" Mar 7 01:10:48.071182 containerd[1517]: time="2026-03-07T01:10:48.071041158Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 7 01:10:48.071587 containerd[1517]: time="2026-03-07T01:10:48.071231621Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 7 01:10:48.071587 containerd[1517]: time="2026-03-07T01:10:48.071298342Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 7 01:10:48.071793 containerd[1517]: time="2026-03-07T01:10:48.071591091Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 7 01:10:48.107624 systemd[1]: Started cri-containerd-bdab0556d1338c3b4b0a7271e982b3c73d8d4f91acdc87bab0f8111cb305ce02.scope - libcontainer container bdab0556d1338c3b4b0a7271e982b3c73d8d4f91acdc87bab0f8111cb305ce02. Mar 7 01:10:48.137526 containerd[1517]: time="2026-03-07T01:10:48.137466007Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-nmzcx,Uid:e5917c3e-7a44-4b38-beeb-18c4766c1dac,Namespace:kube-system,Attempt:0,} returns sandbox id \"bdab0556d1338c3b4b0a7271e982b3c73d8d4f91acdc87bab0f8111cb305ce02\"" Mar 7 01:10:48.143798 containerd[1517]: time="2026-03-07T01:10:48.143767955Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-6bf85f8dd-5fjdk,Uid:7c194493-3b24-4c2d-bb77-815f24e5c7a0,Namespace:tigera-operator,Attempt:0,}" Mar 7 01:10:48.144079 containerd[1517]: time="2026-03-07T01:10:48.143837902Z" level=info msg="CreateContainer within sandbox \"bdab0556d1338c3b4b0a7271e982b3c73d8d4f91acdc87bab0f8111cb305ce02\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Mar 7 01:10:48.164113 containerd[1517]: time="2026-03-07T01:10:48.164002329Z" level=info msg="CreateContainer within sandbox \"bdab0556d1338c3b4b0a7271e982b3c73d8d4f91acdc87bab0f8111cb305ce02\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"daf9cdcca06bc6b509789d8c0c2d2bca8c25373a4f5cbae0ae803179bea0dcfb\"" Mar 7 01:10:48.165447 containerd[1517]: time="2026-03-07T01:10:48.164878457Z" level=info msg="StartContainer for \"daf9cdcca06bc6b509789d8c0c2d2bca8c25373a4f5cbae0ae803179bea0dcfb\"" Mar 7 01:10:48.174739 containerd[1517]: time="2026-03-07T01:10:48.174654286Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 7 01:10:48.174911 containerd[1517]: time="2026-03-07T01:10:48.174893729Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 7 01:10:48.174991 containerd[1517]: time="2026-03-07T01:10:48.174975645Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 7 01:10:48.175174 containerd[1517]: time="2026-03-07T01:10:48.175157229Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 7 01:10:48.198549 systemd[1]: Started cri-containerd-413501224bc3aac686f1ce7fd5d5758d55e0e42e5b488d7612830120771e085f.scope - libcontainer container 413501224bc3aac686f1ce7fd5d5758d55e0e42e5b488d7612830120771e085f. Mar 7 01:10:48.201925 systemd[1]: Started cri-containerd-daf9cdcca06bc6b509789d8c0c2d2bca8c25373a4f5cbae0ae803179bea0dcfb.scope - libcontainer container daf9cdcca06bc6b509789d8c0c2d2bca8c25373a4f5cbae0ae803179bea0dcfb. Mar 7 01:10:48.235866 containerd[1517]: time="2026-03-07T01:10:48.235809595Z" level=info msg="StartContainer for \"daf9cdcca06bc6b509789d8c0c2d2bca8c25373a4f5cbae0ae803179bea0dcfb\" returns successfully" Mar 7 01:10:48.242649 containerd[1517]: time="2026-03-07T01:10:48.242548613Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-6bf85f8dd-5fjdk,Uid:7c194493-3b24-4c2d-bb77-815f24e5c7a0,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"413501224bc3aac686f1ce7fd5d5758d55e0e42e5b488d7612830120771e085f\"" Mar 7 01:10:48.246849 containerd[1517]: time="2026-03-07T01:10:48.246210410Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.40.7\"" Mar 7 01:10:49.951833 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3718825458.mount: Deactivated successfully. 
Mar 7 01:10:50.601789 containerd[1517]: time="2026-03-07T01:10:50.601739919Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.40.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 01:10:50.602873 containerd[1517]: time="2026-03-07T01:10:50.602778730Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.40.7: active requests=0, bytes read=40846156" Mar 7 01:10:50.604571 containerd[1517]: time="2026-03-07T01:10:50.603710875Z" level=info msg="ImageCreate event name:\"sha256:de04da31b5feb10fd313c39b7ac72d47ce9b5b8eb06161142e2e2283059a52c2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 01:10:50.606364 containerd[1517]: time="2026-03-07T01:10:50.605801956Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:53260704fc6e638633b243729411222e01e1898647352a6e1a09cc046887973a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 01:10:50.606364 containerd[1517]: time="2026-03-07T01:10:50.606259886Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.40.7\" with image id \"sha256:de04da31b5feb10fd313c39b7ac72d47ce9b5b8eb06161142e2e2283059a52c2\", repo tag \"quay.io/tigera/operator:v1.40.7\", repo digest \"quay.io/tigera/operator@sha256:53260704fc6e638633b243729411222e01e1898647352a6e1a09cc046887973a\", size \"40842151\" in 2.360019595s" Mar 7 01:10:50.606364 containerd[1517]: time="2026-03-07T01:10:50.606280739Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.40.7\" returns image reference \"sha256:de04da31b5feb10fd313c39b7ac72d47ce9b5b8eb06161142e2e2283059a52c2\"" Mar 7 01:10:50.609239 containerd[1517]: time="2026-03-07T01:10:50.609211398Z" level=info msg="CreateContainer within sandbox \"413501224bc3aac686f1ce7fd5d5758d55e0e42e5b488d7612830120771e085f\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" Mar 7 01:10:50.631996 containerd[1517]: time="2026-03-07T01:10:50.631946445Z" level=info msg="CreateContainer within sandbox 
\"413501224bc3aac686f1ce7fd5d5758d55e0e42e5b488d7612830120771e085f\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"8b3ae79596106a8c76118ecce1b81b0457ad20a1d0b3463bd982c82d98b69ab8\"" Mar 7 01:10:50.633596 containerd[1517]: time="2026-03-07T01:10:50.632327954Z" level=info msg="StartContainer for \"8b3ae79596106a8c76118ecce1b81b0457ad20a1d0b3463bd982c82d98b69ab8\"" Mar 7 01:10:50.658581 systemd[1]: Started cri-containerd-8b3ae79596106a8c76118ecce1b81b0457ad20a1d0b3463bd982c82d98b69ab8.scope - libcontainer container 8b3ae79596106a8c76118ecce1b81b0457ad20a1d0b3463bd982c82d98b69ab8. Mar 7 01:10:50.680244 containerd[1517]: time="2026-03-07T01:10:50.680186005Z" level=info msg="StartContainer for \"8b3ae79596106a8c76118ecce1b81b0457ad20a1d0b3463bd982c82d98b69ab8\" returns successfully" Mar 7 01:10:51.205889 kubelet[2578]: I0307 01:10:51.205818 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-nmzcx" podStartSLOduration=4.205799714 podStartE2EDuration="4.205799714s" podCreationTimestamp="2026-03-07 01:10:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 01:10:49.199625076 +0000 UTC m=+7.124592157" watchObservedRunningTime="2026-03-07 01:10:51.205799714 +0000 UTC m=+9.130766775" Mar 7 01:10:51.206647 kubelet[2578]: I0307 01:10:51.205971 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-6bf85f8dd-5fjdk" podStartSLOduration=1.844064796 podStartE2EDuration="4.205963221s" podCreationTimestamp="2026-03-07 01:10:47 +0000 UTC" firstStartedPulling="2026-03-07 01:10:48.245082372 +0000 UTC m=+6.170049423" lastFinishedPulling="2026-03-07 01:10:50.606980807 +0000 UTC m=+8.531947848" observedRunningTime="2026-03-07 01:10:51.205625513 +0000 UTC m=+9.130592574" watchObservedRunningTime="2026-03-07 01:10:51.205963221 +0000 UTC m=+9.130930262" Mar 7 
01:10:55.908753 sudo[1704]: pam_unix(sudo:session): session closed for user root Mar 7 01:10:56.029295 sshd[1701]: pam_unix(sshd:session): session closed for user core Mar 7 01:10:56.033129 systemd-logind[1493]: Session 7 logged out. Waiting for processes to exit. Mar 7 01:10:56.035355 systemd[1]: sshd@6-89.167.115.210:22-4.153.228.146:51878.service: Deactivated successfully. Mar 7 01:10:56.041855 systemd[1]: session-7.scope: Deactivated successfully. Mar 7 01:10:56.042172 systemd[1]: session-7.scope: Consumed 4.365s CPU time, 157.7M memory peak, 0B memory swap peak. Mar 7 01:10:56.043924 systemd-logind[1493]: Removed session 7. Mar 7 01:10:57.688746 systemd[1]: Created slice kubepods-besteffort-podc41dcc46_138e_4c90_a556_927e589ea977.slice - libcontainer container kubepods-besteffort-podc41dcc46_138e_4c90_a556_927e589ea977.slice. Mar 7 01:10:57.751481 kubelet[2578]: I0307 01:10:57.750894 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/c41dcc46-138e-4c90-a556-927e589ea977-typha-certs\") pod \"calico-typha-84bf5446f6-rzhwz\" (UID: \"c41dcc46-138e-4c90-a556-927e589ea977\") " pod="calico-system/calico-typha-84bf5446f6-rzhwz" Mar 7 01:10:57.751481 kubelet[2578]: I0307 01:10:57.750918 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6gfdh\" (UniqueName: \"kubernetes.io/projected/c41dcc46-138e-4c90-a556-927e589ea977-kube-api-access-6gfdh\") pod \"calico-typha-84bf5446f6-rzhwz\" (UID: \"c41dcc46-138e-4c90-a556-927e589ea977\") " pod="calico-system/calico-typha-84bf5446f6-rzhwz" Mar 7 01:10:57.751481 kubelet[2578]: I0307 01:10:57.750931 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c41dcc46-138e-4c90-a556-927e589ea977-tigera-ca-bundle\") pod \"calico-typha-84bf5446f6-rzhwz\" (UID: 
\"c41dcc46-138e-4c90-a556-927e589ea977\") " pod="calico-system/calico-typha-84bf5446f6-rzhwz" Mar 7 01:10:57.751885 systemd[1]: Created slice kubepods-besteffort-pod9a4f6928_5888_4159_935e_d4409c18c5ce.slice - libcontainer container kubepods-besteffort-pod9a4f6928_5888_4159_935e_d4409c18c5ce.slice. Mar 7 01:10:57.841904 kubelet[2578]: E0307 01:10:57.841853 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-gpr4h" podUID="b2c4739c-e360-4c95-9680-eefc754cc98b" Mar 7 01:10:57.852260 kubelet[2578]: I0307 01:10:57.852229 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nodeproc\" (UniqueName: \"kubernetes.io/host-path/9a4f6928-5888-4159-935e-d4409c18c5ce-nodeproc\") pod \"calico-node-vtmpc\" (UID: \"9a4f6928-5888-4159-935e-d4409c18c5ce\") " pod="calico-system/calico-node-vtmpc" Mar 7 01:10:57.852260 kubelet[2578]: I0307 01:10:57.852253 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qtk6h\" (UniqueName: \"kubernetes.io/projected/9a4f6928-5888-4159-935e-d4409c18c5ce-kube-api-access-qtk6h\") pod \"calico-node-vtmpc\" (UID: \"9a4f6928-5888-4159-935e-d4409c18c5ce\") " pod="calico-system/calico-node-vtmpc" Mar 7 01:10:57.852390 kubelet[2578]: I0307 01:10:57.852366 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9a4f6928-5888-4159-935e-d4409c18c5ce-lib-modules\") pod \"calico-node-vtmpc\" (UID: \"9a4f6928-5888-4159-935e-d4409c18c5ce\") " pod="calico-system/calico-node-vtmpc" Mar 7 01:10:57.852390 kubelet[2578]: I0307 01:10:57.852379 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"node-certs\" (UniqueName: \"kubernetes.io/secret/9a4f6928-5888-4159-935e-d4409c18c5ce-node-certs\") pod \"calico-node-vtmpc\" (UID: \"9a4f6928-5888-4159-935e-d4409c18c5ce\") " pod="calico-system/calico-node-vtmpc" Mar 7 01:10:57.852443 kubelet[2578]: I0307 01:10:57.852390 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/9a4f6928-5888-4159-935e-d4409c18c5ce-sys-fs\") pod \"calico-node-vtmpc\" (UID: \"9a4f6928-5888-4159-935e-d4409c18c5ce\") " pod="calico-system/calico-node-vtmpc" Mar 7 01:10:57.853630 kubelet[2578]: I0307 01:10:57.853602 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/9a4f6928-5888-4159-935e-d4409c18c5ce-var-run-calico\") pod \"calico-node-vtmpc\" (UID: \"9a4f6928-5888-4159-935e-d4409c18c5ce\") " pod="calico-system/calico-node-vtmpc" Mar 7 01:10:57.853630 kubelet[2578]: I0307 01:10:57.853629 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/9a4f6928-5888-4159-935e-d4409c18c5ce-xtables-lock\") pod \"calico-node-vtmpc\" (UID: \"9a4f6928-5888-4159-935e-d4409c18c5ce\") " pod="calico-system/calico-node-vtmpc" Mar 7 01:10:57.853718 kubelet[2578]: I0307 01:10:57.853645 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/9a4f6928-5888-4159-935e-d4409c18c5ce-var-lib-calico\") pod \"calico-node-vtmpc\" (UID: \"9a4f6928-5888-4159-935e-d4409c18c5ce\") " pod="calico-system/calico-node-vtmpc" Mar 7 01:10:57.853718 kubelet[2578]: I0307 01:10:57.853658 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: 
\"kubernetes.io/host-path/9a4f6928-5888-4159-935e-d4409c18c5ce-flexvol-driver-host\") pod \"calico-node-vtmpc\" (UID: \"9a4f6928-5888-4159-935e-d4409c18c5ce\") " pod="calico-system/calico-node-vtmpc" Mar 7 01:10:57.853718 kubelet[2578]: I0307 01:10:57.853690 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/9a4f6928-5888-4159-935e-d4409c18c5ce-policysync\") pod \"calico-node-vtmpc\" (UID: \"9a4f6928-5888-4159-935e-d4409c18c5ce\") " pod="calico-system/calico-node-vtmpc" Mar 7 01:10:57.853718 kubelet[2578]: I0307 01:10:57.853701 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bpffs\" (UniqueName: \"kubernetes.io/host-path/9a4f6928-5888-4159-935e-d4409c18c5ce-bpffs\") pod \"calico-node-vtmpc\" (UID: \"9a4f6928-5888-4159-935e-d4409c18c5ce\") " pod="calico-system/calico-node-vtmpc" Mar 7 01:10:57.853718 kubelet[2578]: I0307 01:10:57.853711 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/9a4f6928-5888-4159-935e-d4409c18c5ce-cni-bin-dir\") pod \"calico-node-vtmpc\" (UID: \"9a4f6928-5888-4159-935e-d4409c18c5ce\") " pod="calico-system/calico-node-vtmpc" Mar 7 01:10:57.853833 kubelet[2578]: I0307 01:10:57.853727 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/9a4f6928-5888-4159-935e-d4409c18c5ce-cni-log-dir\") pod \"calico-node-vtmpc\" (UID: \"9a4f6928-5888-4159-935e-d4409c18c5ce\") " pod="calico-system/calico-node-vtmpc" Mar 7 01:10:57.853833 kubelet[2578]: I0307 01:10:57.853737 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9a4f6928-5888-4159-935e-d4409c18c5ce-tigera-ca-bundle\") pod 
\"calico-node-vtmpc\" (UID: \"9a4f6928-5888-4159-935e-d4409c18c5ce\") " pod="calico-system/calico-node-vtmpc" Mar 7 01:10:57.853833 kubelet[2578]: I0307 01:10:57.853770 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/9a4f6928-5888-4159-935e-d4409c18c5ce-cni-net-dir\") pod \"calico-node-vtmpc\" (UID: \"9a4f6928-5888-4159-935e-d4409c18c5ce\") " pod="calico-system/calico-node-vtmpc" Mar 7 01:10:57.955022 kubelet[2578]: I0307 01:10:57.953953 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/b2c4739c-e360-4c95-9680-eefc754cc98b-varrun\") pod \"csi-node-driver-gpr4h\" (UID: \"b2c4739c-e360-4c95-9680-eefc754cc98b\") " pod="calico-system/csi-node-driver-gpr4h" Mar 7 01:10:57.955022 kubelet[2578]: I0307 01:10:57.953998 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/b2c4739c-e360-4c95-9680-eefc754cc98b-registration-dir\") pod \"csi-node-driver-gpr4h\" (UID: \"b2c4739c-e360-4c95-9680-eefc754cc98b\") " pod="calico-system/csi-node-driver-gpr4h" Mar 7 01:10:57.955022 kubelet[2578]: I0307 01:10:57.954081 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/b2c4739c-e360-4c95-9680-eefc754cc98b-socket-dir\") pod \"csi-node-driver-gpr4h\" (UID: \"b2c4739c-e360-4c95-9680-eefc754cc98b\") " pod="calico-system/csi-node-driver-gpr4h" Mar 7 01:10:57.955022 kubelet[2578]: I0307 01:10:57.954093 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2vtqp\" (UniqueName: \"kubernetes.io/projected/b2c4739c-e360-4c95-9680-eefc754cc98b-kube-api-access-2vtqp\") pod \"csi-node-driver-gpr4h\" (UID: 
\"b2c4739c-e360-4c95-9680-eefc754cc98b\") " pod="calico-system/csi-node-driver-gpr4h" Mar 7 01:10:57.955022 kubelet[2578]: I0307 01:10:57.954124 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b2c4739c-e360-4c95-9680-eefc754cc98b-kubelet-dir\") pod \"csi-node-driver-gpr4h\" (UID: \"b2c4739c-e360-4c95-9680-eefc754cc98b\") " pod="calico-system/csi-node-driver-gpr4h" Mar 7 01:10:57.956198 kubelet[2578]: E0307 01:10:57.956169 2578 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:10:57.956308 kubelet[2578]: W0307 01:10:57.956297 2578 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:10:57.956431 kubelet[2578]: E0307 01:10:57.956421 2578 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 01:10:57.956747 kubelet[2578]: E0307 01:10:57.956738 2578 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:10:57.956853 kubelet[2578]: W0307 01:10:57.956844 2578 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:10:57.956894 kubelet[2578]: E0307 01:10:57.956884 2578 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 01:10:57.958824 kubelet[2578]: E0307 01:10:57.958812 2578 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:10:57.958887 kubelet[2578]: W0307 01:10:57.958869 2578 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:10:57.958943 kubelet[2578]: E0307 01:10:57.958934 2578 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 01:10:57.959911 kubelet[2578]: E0307 01:10:57.959856 2578 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:10:57.959911 kubelet[2578]: W0307 01:10:57.959866 2578 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:10:57.959911 kubelet[2578]: E0307 01:10:57.959874 2578 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 01:10:57.961187 kubelet[2578]: E0307 01:10:57.960706 2578 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:10:57.961187 kubelet[2578]: W0307 01:10:57.960717 2578 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:10:57.961187 kubelet[2578]: E0307 01:10:57.960726 2578 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 01:10:57.961187 kubelet[2578]: E0307 01:10:57.960961 2578 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:10:57.961187 kubelet[2578]: W0307 01:10:57.960967 2578 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:10:57.961187 kubelet[2578]: E0307 01:10:57.960974 2578 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 01:10:57.961314 kubelet[2578]: E0307 01:10:57.961209 2578 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:10:57.961314 kubelet[2578]: W0307 01:10:57.961215 2578 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:10:57.961314 kubelet[2578]: E0307 01:10:57.961222 2578 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 01:10:57.961564 kubelet[2578]: E0307 01:10:57.961443 2578 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:10:57.961564 kubelet[2578]: W0307 01:10:57.961451 2578 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:10:57.961564 kubelet[2578]: E0307 01:10:57.961457 2578 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 01:10:57.961744 kubelet[2578]: E0307 01:10:57.961730 2578 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:10:57.961744 kubelet[2578]: W0307 01:10:57.961739 2578 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:10:57.961783 kubelet[2578]: E0307 01:10:57.961746 2578 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 01:10:57.962010 kubelet[2578]: E0307 01:10:57.961995 2578 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:10:57.962032 kubelet[2578]: W0307 01:10:57.962013 2578 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:10:57.962032 kubelet[2578]: E0307 01:10:57.962020 2578 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 01:10:57.962231 kubelet[2578]: E0307 01:10:57.962216 2578 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:10:57.962231 kubelet[2578]: W0307 01:10:57.962226 2578 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:10:57.962273 kubelet[2578]: E0307 01:10:57.962232 2578 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 01:10:57.962484 kubelet[2578]: E0307 01:10:57.962469 2578 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:10:57.962484 kubelet[2578]: W0307 01:10:57.962479 2578 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:10:57.962541 kubelet[2578]: E0307 01:10:57.962486 2578 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 01:10:57.962723 kubelet[2578]: E0307 01:10:57.962708 2578 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:10:57.962747 kubelet[2578]: W0307 01:10:57.962725 2578 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:10:57.962747 kubelet[2578]: E0307 01:10:57.962731 2578 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 01:10:57.962959 kubelet[2578]: E0307 01:10:57.962944 2578 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:10:57.962959 kubelet[2578]: W0307 01:10:57.962954 2578 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:10:57.962996 kubelet[2578]: E0307 01:10:57.962968 2578 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 01:10:57.963174 kubelet[2578]: E0307 01:10:57.963160 2578 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:10:57.963174 kubelet[2578]: W0307 01:10:57.963168 2578 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:10:57.963217 kubelet[2578]: E0307 01:10:57.963174 2578 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 01:10:57.963414 kubelet[2578]: E0307 01:10:57.963380 2578 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:10:57.963414 kubelet[2578]: W0307 01:10:57.963389 2578 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:10:57.963414 kubelet[2578]: E0307 01:10:57.963405 2578 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 01:10:57.963620 kubelet[2578]: E0307 01:10:57.963603 2578 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:10:57.963620 kubelet[2578]: W0307 01:10:57.963614 2578 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:10:57.963620 kubelet[2578]: E0307 01:10:57.963620 2578 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 01:10:57.963846 kubelet[2578]: E0307 01:10:57.963830 2578 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:10:57.963846 kubelet[2578]: W0307 01:10:57.963841 2578 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:10:57.963886 kubelet[2578]: E0307 01:10:57.963847 2578 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 01:10:57.964203 kubelet[2578]: E0307 01:10:57.964167 2578 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:10:57.964203 kubelet[2578]: W0307 01:10:57.964177 2578 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:10:57.964203 kubelet[2578]: E0307 01:10:57.964202 2578 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 01:10:57.964942 kubelet[2578]: E0307 01:10:57.964467 2578 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:10:57.964942 kubelet[2578]: W0307 01:10:57.964475 2578 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:10:57.964942 kubelet[2578]: E0307 01:10:57.964499 2578 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 01:10:57.965029 kubelet[2578]: E0307 01:10:57.964958 2578 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:10:57.965091 kubelet[2578]: W0307 01:10:57.964965 2578 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:10:57.965091 kubelet[2578]: E0307 01:10:57.965084 2578 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 01:10:57.965543 kubelet[2578]: E0307 01:10:57.965524 2578 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:10:57.965579 kubelet[2578]: W0307 01:10:57.965556 2578 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:10:57.965579 kubelet[2578]: E0307 01:10:57.965563 2578 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 01:10:57.965865 kubelet[2578]: E0307 01:10:57.965848 2578 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:10:57.965865 kubelet[2578]: W0307 01:10:57.965858 2578 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:10:57.965921 kubelet[2578]: E0307 01:10:57.965866 2578 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 01:10:57.966167 kubelet[2578]: E0307 01:10:57.966152 2578 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:10:57.966167 kubelet[2578]: W0307 01:10:57.966163 2578 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:10:57.966223 kubelet[2578]: E0307 01:10:57.966170 2578 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 01:10:57.967446 kubelet[2578]: E0307 01:10:57.966458 2578 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:10:57.967446 kubelet[2578]: W0307 01:10:57.966466 2578 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:10:57.967446 kubelet[2578]: E0307 01:10:57.966473 2578 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 01:10:57.967446 kubelet[2578]: E0307 01:10:57.966773 2578 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:10:57.967446 kubelet[2578]: W0307 01:10:57.966781 2578 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:10:57.967446 kubelet[2578]: E0307 01:10:57.966788 2578 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 01:10:57.972274 kubelet[2578]: E0307 01:10:57.972261 2578 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:10:57.972353 kubelet[2578]: W0307 01:10:57.972325 2578 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:10:57.972353 kubelet[2578]: E0307 01:10:57.972336 2578 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 01:10:57.996043 containerd[1517]: time="2026-03-07T01:10:57.996010834Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-84bf5446f6-rzhwz,Uid:c41dcc46-138e-4c90-a556-927e589ea977,Namespace:calico-system,Attempt:0,}" Mar 7 01:10:58.019106 containerd[1517]: time="2026-03-07T01:10:58.018472335Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 7 01:10:58.019106 containerd[1517]: time="2026-03-07T01:10:58.019065045Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 7 01:10:58.019106 containerd[1517]: time="2026-03-07T01:10:58.019074123Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 7 01:10:58.019296 containerd[1517]: time="2026-03-07T01:10:58.019138062Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 7 01:10:58.036626 systemd[1]: Started cri-containerd-72211f4854d8c3b640ff5562726e0cea99ea290ac9545e251e19b5ad1f54abbf.scope - libcontainer container 72211f4854d8c3b640ff5562726e0cea99ea290ac9545e251e19b5ad1f54abbf. Mar 7 01:10:58.054777 kubelet[2578]: E0307 01:10:58.054744 2578 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:10:58.054777 kubelet[2578]: W0307 01:10:58.054761 2578 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:10:58.054777 kubelet[2578]: E0307 01:10:58.054777 2578 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 01:10:58.055064 kubelet[2578]: E0307 01:10:58.054983 2578 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:10:58.055064 kubelet[2578]: W0307 01:10:58.054989 2578 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:10:58.055064 kubelet[2578]: E0307 01:10:58.054997 2578 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 01:10:58.055573 kubelet[2578]: E0307 01:10:58.055557 2578 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:10:58.055573 kubelet[2578]: W0307 01:10:58.055569 2578 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:10:58.056371 kubelet[2578]: E0307 01:10:58.055578 2578 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 01:10:58.056371 kubelet[2578]: E0307 01:10:58.055799 2578 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:10:58.056371 kubelet[2578]: W0307 01:10:58.055806 2578 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:10:58.056371 kubelet[2578]: E0307 01:10:58.055812 2578 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 01:10:58.056598 containerd[1517]: time="2026-03-07T01:10:58.056567853Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-vtmpc,Uid:9a4f6928-5888-4159-935e-d4409c18c5ce,Namespace:calico-system,Attempt:0,}" Mar 7 01:10:58.057457 kubelet[2578]: E0307 01:10:58.057442 2578 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:10:58.057457 kubelet[2578]: W0307 01:10:58.057452 2578 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:10:58.057457 kubelet[2578]: E0307 01:10:58.057460 2578 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 01:10:58.057700 kubelet[2578]: E0307 01:10:58.057664 2578 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:10:58.057700 kubelet[2578]: W0307 01:10:58.057675 2578 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:10:58.057700 kubelet[2578]: E0307 01:10:58.057682 2578 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 01:10:58.058062 kubelet[2578]: E0307 01:10:58.057983 2578 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:10:58.058062 kubelet[2578]: W0307 01:10:58.057990 2578 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:10:58.058062 kubelet[2578]: E0307 01:10:58.057997 2578 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 01:10:58.058428 kubelet[2578]: E0307 01:10:58.058388 2578 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:10:58.058428 kubelet[2578]: W0307 01:10:58.058418 2578 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:10:58.058428 kubelet[2578]: E0307 01:10:58.058425 2578 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 01:10:58.058983 kubelet[2578]: E0307 01:10:58.058962 2578 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 7 01:10:58.058983 kubelet[2578]: W0307 01:10:58.058973 2578 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 7 01:10:58.058983 kubelet[2578]: E0307 01:10:58.058982 2578 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 7 01:10:58.083042 containerd[1517]: time="2026-03-07T01:10:58.083018865Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-84bf5446f6-rzhwz,Uid:c41dcc46-138e-4c90-a556-927e589ea977,Namespace:calico-system,Attempt:0,} returns sandbox id \"72211f4854d8c3b640ff5562726e0cea99ea290ac9545e251e19b5ad1f54abbf\""
Mar 7 01:10:58.084690 containerd[1517]: time="2026-03-07T01:10:58.084503275Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.31.4\""
Mar 7 01:10:58.089252 containerd[1517]: time="2026-03-07T01:10:58.089174187Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Mar 7 01:10:58.089309 containerd[1517]: time="2026-03-07T01:10:58.089248615Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Mar 7 01:10:58.089309 containerd[1517]: time="2026-03-07T01:10:58.089267932Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Mar 7 01:10:58.089441 containerd[1517]: time="2026-03-07T01:10:58.089352438Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Mar 7 01:10:58.108613 systemd[1]: Started cri-containerd-24389ad71d26fc0585aa17d895517330987f224be5762cc9fd1fa6a705f61ac0.scope - libcontainer container 24389ad71d26fc0585aa17d895517330987f224be5762cc9fd1fa6a705f61ac0.
Mar 7 01:10:58.110682 update_engine[1496]: I20260307 01:10:58.110071 1496 update_attempter.cc:509] Updating boot flags... 
Mar 7 01:10:58.136180 containerd[1517]: time="2026-03-07T01:10:58.136150540Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-vtmpc,Uid:9a4f6928-5888-4159-935e-d4409c18c5ce,Namespace:calico-system,Attempt:0,} returns sandbox id \"24389ad71d26fc0585aa17d895517330987f224be5762cc9fd1fa6a705f61ac0\""
Mar 7 01:10:58.161446 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 34 scanned by (udev-worker) (3124)
Mar 7 01:10:58.220519 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 34 scanned by (udev-worker) (3125)
Mar 7 01:10:58.228036 kubelet[2578]: E0307 01:10:58.228006 2578 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 7 01:10:58.228454 kubelet[2578]: W0307 01:10:58.228116 2578 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 7 01:10:58.228454 kubelet[2578]: E0307 01:10:58.228134 2578 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 7 01:10:59.916630 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1987468970.mount: Deactivated successfully.
Mar 7 01:11:00.150048 kubelet[2578]: E0307 01:11:00.148896 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-gpr4h" podUID="b2c4739c-e360-4c95-9680-eefc754cc98b"
Mar 7 01:11:01.406915 containerd[1517]: time="2026-03-07T01:11:01.406868381Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 7 01:11:01.407958 containerd[1517]: time="2026-03-07T01:11:01.407870598Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.31.4: active requests=0, bytes read=36107596"
Mar 7 01:11:01.408960 containerd[1517]: time="2026-03-07T01:11:01.408927377Z" level=info msg="ImageCreate event name:\"sha256:46766605472b59b9c16342b2cc74da11f598baa9ba6d1e8b07b3f8ab4f29c55b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 7 01:11:01.410610 containerd[1517]: time="2026-03-07T01:11:01.410583302Z" level=info msg="ImageCreate event 
name:\"ghcr.io/flatcar/calico/typha@sha256:d9396cfcd63dfcf72a65903042e473bb0bafc0cceb56bd71cd84078498a87130\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 01:11:01.411352 containerd[1517]: time="2026-03-07T01:11:01.411022789Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.31.4\" with image id \"sha256:46766605472b59b9c16342b2cc74da11f598baa9ba6d1e8b07b3f8ab4f29c55b\", repo tag \"ghcr.io/flatcar/calico/typha:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:d9396cfcd63dfcf72a65903042e473bb0bafc0cceb56bd71cd84078498a87130\", size \"36107450\" in 3.326497457s" Mar 7 01:11:01.411352 containerd[1517]: time="2026-03-07T01:11:01.411045706Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.31.4\" returns image reference \"sha256:46766605472b59b9c16342b2cc74da11f598baa9ba6d1e8b07b3f8ab4f29c55b\"" Mar 7 01:11:01.413034 containerd[1517]: time="2026-03-07T01:11:01.412757013Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\"" Mar 7 01:11:01.424149 containerd[1517]: time="2026-03-07T01:11:01.424118726Z" level=info msg="CreateContainer within sandbox \"72211f4854d8c3b640ff5562726e0cea99ea290ac9545e251e19b5ad1f54abbf\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Mar 7 01:11:01.439875 containerd[1517]: time="2026-03-07T01:11:01.439838770Z" level=info msg="CreateContainer within sandbox \"72211f4854d8c3b640ff5562726e0cea99ea290ac9545e251e19b5ad1f54abbf\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"7cc04583e2e675fb8b038e0c315b05b40a7953144974ea7876a414b5b910b5e2\"" Mar 7 01:11:01.441542 containerd[1517]: time="2026-03-07T01:11:01.440538301Z" level=info msg="StartContainer for \"7cc04583e2e675fb8b038e0c315b05b40a7953144974ea7876a414b5b910b5e2\"" Mar 7 01:11:01.462524 systemd[1]: Started cri-containerd-7cc04583e2e675fb8b038e0c315b05b40a7953144974ea7876a414b5b910b5e2.scope - libcontainer container 
7cc04583e2e675fb8b038e0c315b05b40a7953144974ea7876a414b5b910b5e2. Mar 7 01:11:01.503842 containerd[1517]: time="2026-03-07T01:11:01.503735201Z" level=info msg="StartContainer for \"7cc04583e2e675fb8b038e0c315b05b40a7953144974ea7876a414b5b910b5e2\" returns successfully" Mar 7 01:11:02.154172 kubelet[2578]: E0307 01:11:02.154042 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-gpr4h" podUID="b2c4739c-e360-4c95-9680-eefc754cc98b" Mar 7 01:11:02.258723 kubelet[2578]: E0307 01:11:02.258649 2578 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:11:02.258723 kubelet[2578]: W0307 01:11:02.258682 2578 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:11:02.258723 kubelet[2578]: E0307 01:11:02.258708 2578 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 01:11:02.259962 kubelet[2578]: E0307 01:11:02.259924 2578 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:11:02.259962 kubelet[2578]: W0307 01:11:02.259949 2578 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:11:02.260113 kubelet[2578]: E0307 01:11:02.259968 2578 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 01:11:02.272180 kubelet[2578]: E0307 01:11:02.272171 2578 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 7 01:11:02.272294 kubelet[2578]: W0307 01:11:02.272216 2578 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 7 01:11:02.272294 kubelet[2578]: E0307 01:11:02.272225 2578 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 01:11:02.272653 kubelet[2578]: E0307 01:11:02.272643 2578 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:11:02.272748 kubelet[2578]: W0307 01:11:02.272690 2578 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:11:02.272748 kubelet[2578]: E0307 01:11:02.272699 2578 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 01:11:02.290246 kubelet[2578]: E0307 01:11:02.290231 2578 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:11:02.290366 kubelet[2578]: W0307 01:11:02.290303 2578 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:11:02.290366 kubelet[2578]: E0307 01:11:02.290316 2578 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 01:11:02.290806 kubelet[2578]: E0307 01:11:02.290706 2578 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:11:02.290806 kubelet[2578]: W0307 01:11:02.290715 2578 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:11:02.290806 kubelet[2578]: E0307 01:11:02.290723 2578 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 01:11:02.291152 kubelet[2578]: E0307 01:11:02.291068 2578 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:11:02.291152 kubelet[2578]: W0307 01:11:02.291076 2578 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:11:02.291152 kubelet[2578]: E0307 01:11:02.291083 2578 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 01:11:02.291501 kubelet[2578]: E0307 01:11:02.291429 2578 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:11:02.291501 kubelet[2578]: W0307 01:11:02.291439 2578 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:11:02.291501 kubelet[2578]: E0307 01:11:02.291449 2578 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 01:11:02.291797 kubelet[2578]: E0307 01:11:02.291760 2578 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:11:02.291797 kubelet[2578]: W0307 01:11:02.291768 2578 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:11:02.291797 kubelet[2578]: E0307 01:11:02.291774 2578 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 01:11:02.292158 kubelet[2578]: E0307 01:11:02.292149 2578 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:11:02.292280 kubelet[2578]: W0307 01:11:02.292196 2578 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:11:02.292280 kubelet[2578]: E0307 01:11:02.292203 2578 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 01:11:02.292676 kubelet[2578]: E0307 01:11:02.292654 2578 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:11:02.292711 kubelet[2578]: W0307 01:11:02.292675 2578 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:11:02.292711 kubelet[2578]: E0307 01:11:02.292691 2578 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 01:11:02.293029 kubelet[2578]: E0307 01:11:02.293001 2578 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:11:02.293029 kubelet[2578]: W0307 01:11:02.293018 2578 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:11:02.293071 kubelet[2578]: E0307 01:11:02.293030 2578 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 01:11:02.293608 kubelet[2578]: E0307 01:11:02.293523 2578 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:11:02.293608 kubelet[2578]: W0307 01:11:02.293532 2578 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:11:02.293608 kubelet[2578]: E0307 01:11:02.293540 2578 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 01:11:02.294336 kubelet[2578]: E0307 01:11:02.294300 2578 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:11:02.294336 kubelet[2578]: W0307 01:11:02.294327 2578 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:11:02.294490 kubelet[2578]: E0307 01:11:02.294347 2578 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 01:11:02.294950 kubelet[2578]: E0307 01:11:02.294913 2578 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:11:02.294950 kubelet[2578]: W0307 01:11:02.294947 2578 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:11:02.295066 kubelet[2578]: E0307 01:11:02.294968 2578 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 01:11:02.295592 kubelet[2578]: E0307 01:11:02.295546 2578 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:11:02.295592 kubelet[2578]: W0307 01:11:02.295567 2578 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:11:02.295592 kubelet[2578]: E0307 01:11:02.295582 2578 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 01:11:02.296281 kubelet[2578]: E0307 01:11:02.296234 2578 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:11:02.296281 kubelet[2578]: W0307 01:11:02.296268 2578 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:11:02.296281 kubelet[2578]: E0307 01:11:02.296296 2578 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 01:11:02.296943 kubelet[2578]: E0307 01:11:02.296907 2578 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:11:02.296943 kubelet[2578]: W0307 01:11:02.296938 2578 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:11:02.297042 kubelet[2578]: E0307 01:11:02.296962 2578 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 01:11:02.297921 kubelet[2578]: E0307 01:11:02.297855 2578 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:11:02.297921 kubelet[2578]: W0307 01:11:02.297907 2578 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:11:02.298036 kubelet[2578]: E0307 01:11:02.297930 2578 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 01:11:02.298853 kubelet[2578]: E0307 01:11:02.298759 2578 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:11:02.298853 kubelet[2578]: W0307 01:11:02.298793 2578 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:11:02.298853 kubelet[2578]: E0307 01:11:02.298818 2578 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 01:11:02.299613 kubelet[2578]: E0307 01:11:02.299572 2578 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:11:02.299613 kubelet[2578]: W0307 01:11:02.299592 2578 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:11:02.299613 kubelet[2578]: E0307 01:11:02.299606 2578 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input"
Mar 7 01:11:03.219644 kubelet[2578]: I0307 01:11:03.219586 2578 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 7 01:11:03.278768 kubelet[2578]: E0307 01:11:03.278718 2578 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 7 01:11:03.278768 kubelet[2578]: W0307 01:11:03.278746 2578 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 7 01:11:03.278768 kubelet[2578]: E0307 01:11:03.278764 2578 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 7 01:11:03.283067 kubelet[2578]: E0307 01:11:03.283045 2578 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 7 01:11:03.283067 kubelet[2578]: W0307 01:11:03.283057 2578 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 7 01:11:03.283067 kubelet[2578]: E0307 01:11:03.283064 2578 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input"
Mar 7 01:11:03.284684 containerd[1517]: time="2026-03-07T01:11:03.284646941Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 7 01:11:03.285672 containerd[1517]: time="2026-03-07T01:11:03.285558586Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4: active requests=0, bytes read=4630250"
Mar 7 01:11:03.286450 containerd[1517]: time="2026-03-07T01:11:03.286428515Z" level=info msg="ImageCreate event name:\"sha256:a6ea0cf732d820506ae9f1d7e7433a14009026b894fbbb8f346b9a5f5335c47e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 7 01:11:03.289102 containerd[1517]: time="2026-03-07T01:11:03.288525858Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:5fa3492ac4dfef9cc34fe70a51289118e1f715a89133ea730eef81ad789dadbc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 7 01:11:03.289102 containerd[1517]: time="2026-03-07T01:11:03.288972231Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\" with image id \"sha256:a6ea0cf732d820506ae9f1d7e7433a14009026b894fbbb8f346b9a5f5335c47e\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:5fa3492ac4dfef9cc34fe70a51289118e1f715a89133ea730eef81ad789dadbc\", size \"6186255\" in 1.876196801s"
Mar 7 01:11:03.289102 containerd[1517]: time="2026-03-07T01:11:03.289002837Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\" returns image reference \"sha256:a6ea0cf732d820506ae9f1d7e7433a14009026b894fbbb8f346b9a5f5335c47e\""
Mar 7 01:11:03.292792 containerd[1517]: time="2026-03-07T01:11:03.292703996Z" level=info msg="CreateContainer within sandbox \"24389ad71d26fc0585aa17d895517330987f224be5762cc9fd1fa6a705f61ac0\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}"
Mar 7 01:11:03.297785 kubelet[2578]: E0307 01:11:03.297771 2578 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 7 01:11:03.297881 kubelet[2578]: W0307 01:11:03.297859 2578 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 7 01:11:03.297938 kubelet[2578]: E0307 01:11:03.297894 2578 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 7 01:11:03.298172 kubelet[2578]: E0307 01:11:03.298154 2578 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 7 01:11:03.298172 kubelet[2578]: W0307 01:11:03.298164 2578 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 7 01:11:03.298172 kubelet[2578]: E0307 01:11:03.298170 2578 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input"
Mar 7 01:11:03.301629 kubelet[2578]: E0307 01:11:03.301605 2578 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 7 01:11:03.301629 kubelet[2578]: W0307 01:11:03.301612 2578 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 7 01:11:03.301629 kubelet[2578]: E0307 01:11:03.301620 2578 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 01:11:03.302026 kubelet[2578]: E0307 01:11:03.302005 2578 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:11:03.302026 kubelet[2578]: W0307 01:11:03.302017 2578 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:11:03.302026 kubelet[2578]: E0307 01:11:03.302026 2578 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 01:11:03.302285 kubelet[2578]: E0307 01:11:03.302262 2578 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:11:03.302285 kubelet[2578]: W0307 01:11:03.302273 2578 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:11:03.302285 kubelet[2578]: E0307 01:11:03.302280 2578 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 01:11:03.302544 kubelet[2578]: E0307 01:11:03.302531 2578 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:11:03.302544 kubelet[2578]: W0307 01:11:03.302541 2578 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:11:03.302584 kubelet[2578]: E0307 01:11:03.302548 2578 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 01:11:03.303049 kubelet[2578]: E0307 01:11:03.302838 2578 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:11:03.303049 kubelet[2578]: W0307 01:11:03.302848 2578 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:11:03.303049 kubelet[2578]: E0307 01:11:03.302856 2578 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 01:11:03.304350 kubelet[2578]: E0307 01:11:03.304332 2578 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:11:03.304350 kubelet[2578]: W0307 01:11:03.304344 2578 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:11:03.304350 kubelet[2578]: E0307 01:11:03.304352 2578 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 7 01:11:03.304678 kubelet[2578]: E0307 01:11:03.304659 2578 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 7 01:11:03.304678 kubelet[2578]: W0307 01:11:03.304671 2578 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 7 01:11:03.304678 kubelet[2578]: E0307 01:11:03.304678 2578 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 7 01:11:03.310299 containerd[1517]: time="2026-03-07T01:11:03.310270109Z" level=info msg="CreateContainer within sandbox \"24389ad71d26fc0585aa17d895517330987f224be5762cc9fd1fa6a705f61ac0\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"c1b46872a8a6f4e1e43c19a0b2c98ff1d8e7d7581db407d384a33bd64718cc7f\"" Mar 7 01:11:03.310996 containerd[1517]: time="2026-03-07T01:11:03.310691096Z" level=info msg="StartContainer for \"c1b46872a8a6f4e1e43c19a0b2c98ff1d8e7d7581db407d384a33bd64718cc7f\"" Mar 7 01:11:03.335012 systemd[1]: run-containerd-runc-k8s.io-c1b46872a8a6f4e1e43c19a0b2c98ff1d8e7d7581db407d384a33bd64718cc7f-runc.7SFL9J.mount: Deactivated successfully. Mar 7 01:11:03.344526 systemd[1]: Started cri-containerd-c1b46872a8a6f4e1e43c19a0b2c98ff1d8e7d7581db407d384a33bd64718cc7f.scope - libcontainer container c1b46872a8a6f4e1e43c19a0b2c98ff1d8e7d7581db407d384a33bd64718cc7f. Mar 7 01:11:03.368365 containerd[1517]: time="2026-03-07T01:11:03.368260625Z" level=info msg="StartContainer for \"c1b46872a8a6f4e1e43c19a0b2c98ff1d8e7d7581db407d384a33bd64718cc7f\" returns successfully" Mar 7 01:11:03.379999 systemd[1]: cri-containerd-c1b46872a8a6f4e1e43c19a0b2c98ff1d8e7d7581db407d384a33bd64718cc7f.scope: Deactivated successfully. Mar 7 01:11:03.399876 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-c1b46872a8a6f4e1e43c19a0b2c98ff1d8e7d7581db407d384a33bd64718cc7f-rootfs.mount: Deactivated successfully. 
Mar 7 01:11:03.526806 containerd[1517]: time="2026-03-07T01:11:03.526634780Z" level=info msg="shim disconnected" id=c1b46872a8a6f4e1e43c19a0b2c98ff1d8e7d7581db407d384a33bd64718cc7f namespace=k8s.io Mar 7 01:11:03.526806 containerd[1517]: time="2026-03-07T01:11:03.526721779Z" level=warning msg="cleaning up after shim disconnected" id=c1b46872a8a6f4e1e43c19a0b2c98ff1d8e7d7581db407d384a33bd64718cc7f namespace=k8s.io Mar 7 01:11:03.526806 containerd[1517]: time="2026-03-07T01:11:03.526735137Z" level=info msg="cleaning up dead shim" namespace=k8s.io Mar 7 01:11:04.148708 kubelet[2578]: E0307 01:11:04.148464 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-gpr4h" podUID="b2c4739c-e360-4c95-9680-eefc754cc98b" Mar 7 01:11:04.225618 containerd[1517]: time="2026-03-07T01:11:04.225501880Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.31.4\"" Mar 7 01:11:04.250228 kubelet[2578]: I0307 01:11:04.248926 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-84bf5446f6-rzhwz" podStartSLOduration=3.921268823 podStartE2EDuration="7.248852755s" podCreationTimestamp="2026-03-07 01:10:57 +0000 UTC" firstStartedPulling="2026-03-07 01:10:58.084066558 +0000 UTC m=+16.009033609" lastFinishedPulling="2026-03-07 01:11:01.41165049 +0000 UTC m=+19.336617541" observedRunningTime="2026-03-07 01:11:02.233238499 +0000 UTC m=+20.158205590" watchObservedRunningTime="2026-03-07 01:11:04.248852755 +0000 UTC m=+22.173819836" Mar 7 01:11:04.613348 kubelet[2578]: I0307 01:11:04.613083 2578 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 7 01:11:06.148879 kubelet[2578]: E0307 01:11:06.148790 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime 
network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-gpr4h" podUID="b2c4739c-e360-4c95-9680-eefc754cc98b" Mar 7 01:11:08.149227 kubelet[2578]: E0307 01:11:08.149088 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-gpr4h" podUID="b2c4739c-e360-4c95-9680-eefc754cc98b" Mar 7 01:11:10.148823 kubelet[2578]: E0307 01:11:10.148743 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-gpr4h" podUID="b2c4739c-e360-4c95-9680-eefc754cc98b" Mar 7 01:11:12.150555 kubelet[2578]: E0307 01:11:12.150508 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-gpr4h" podUID="b2c4739c-e360-4c95-9680-eefc754cc98b" Mar 7 01:11:12.756285 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3805110621.mount: Deactivated successfully. 
Mar 7 01:11:12.784218 containerd[1517]: time="2026-03-07T01:11:12.784184676Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 01:11:12.785021 containerd[1517]: time="2026-03-07T01:11:12.784862281Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.31.4: active requests=0, bytes read=159838564" Mar 7 01:11:12.785634 containerd[1517]: time="2026-03-07T01:11:12.785604043Z" level=info msg="ImageCreate event name:\"sha256:e6536b93706eda782f82ebadcac3559cb61801d09f982cc0533a134e6a8e1acf\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 01:11:12.787519 containerd[1517]: time="2026-03-07T01:11:12.787450476Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:22b9d32dc7480c96272121d5682d53424c6e58653c60fa869b61a1758a11d77f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 01:11:12.788010 containerd[1517]: time="2026-03-07T01:11:12.787943497Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.31.4\" with image id \"sha256:e6536b93706eda782f82ebadcac3559cb61801d09f982cc0533a134e6a8e1acf\", repo tag \"ghcr.io/flatcar/calico/node:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/node@sha256:22b9d32dc7480c96272121d5682d53424c6e58653c60fa869b61a1758a11d77f\", size \"159838426\" in 8.562390983s" Mar 7 01:11:12.788010 containerd[1517]: time="2026-03-07T01:11:12.787967345Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.31.4\" returns image reference \"sha256:e6536b93706eda782f82ebadcac3559cb61801d09f982cc0533a134e6a8e1acf\"" Mar 7 01:11:12.793510 containerd[1517]: time="2026-03-07T01:11:12.793465627Z" level=info msg="CreateContainer within sandbox \"24389ad71d26fc0585aa17d895517330987f224be5762cc9fd1fa6a705f61ac0\" for container &ContainerMetadata{Name:ebpf-bootstrap,Attempt:0,}" Mar 7 01:11:12.808290 containerd[1517]: time="2026-03-07T01:11:12.808262238Z" level=info msg="CreateContainer 
within sandbox \"24389ad71d26fc0585aa17d895517330987f224be5762cc9fd1fa6a705f61ac0\" for &ContainerMetadata{Name:ebpf-bootstrap,Attempt:0,} returns container id \"060bb5a34f86859476c027f7e3d57faad28590d2015ec002de823798319d361a\"" Mar 7 01:11:12.809483 containerd[1517]: time="2026-03-07T01:11:12.808856271Z" level=info msg="StartContainer for \"060bb5a34f86859476c027f7e3d57faad28590d2015ec002de823798319d361a\"" Mar 7 01:11:12.836836 systemd[1]: run-containerd-runc-k8s.io-060bb5a34f86859476c027f7e3d57faad28590d2015ec002de823798319d361a-runc.YWXmqV.mount: Deactivated successfully. Mar 7 01:11:12.845563 systemd[1]: Started cri-containerd-060bb5a34f86859476c027f7e3d57faad28590d2015ec002de823798319d361a.scope - libcontainer container 060bb5a34f86859476c027f7e3d57faad28590d2015ec002de823798319d361a. Mar 7 01:11:12.871440 containerd[1517]: time="2026-03-07T01:11:12.871066126Z" level=info msg="StartContainer for \"060bb5a34f86859476c027f7e3d57faad28590d2015ec002de823798319d361a\" returns successfully" Mar 7 01:11:12.906176 systemd[1]: cri-containerd-060bb5a34f86859476c027f7e3d57faad28590d2015ec002de823798319d361a.scope: Deactivated successfully. 
Mar 7 01:11:13.006894 containerd[1517]: time="2026-03-07T01:11:13.006773109Z" level=info msg="shim disconnected" id=060bb5a34f86859476c027f7e3d57faad28590d2015ec002de823798319d361a namespace=k8s.io Mar 7 01:11:13.006894 containerd[1517]: time="2026-03-07T01:11:13.006844893Z" level=warning msg="cleaning up after shim disconnected" id=060bb5a34f86859476c027f7e3d57faad28590d2015ec002de823798319d361a namespace=k8s.io Mar 7 01:11:13.006894 containerd[1517]: time="2026-03-07T01:11:13.006851973Z" level=info msg="cleaning up dead shim" namespace=k8s.io Mar 7 01:11:13.245698 containerd[1517]: time="2026-03-07T01:11:13.245476862Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.31.4\"" Mar 7 01:11:13.759733 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-060bb5a34f86859476c027f7e3d57faad28590d2015ec002de823798319d361a-rootfs.mount: Deactivated successfully. Mar 7 01:11:14.149245 kubelet[2578]: E0307 01:11:14.148491 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-gpr4h" podUID="b2c4739c-e360-4c95-9680-eefc754cc98b" Mar 7 01:11:16.149554 kubelet[2578]: E0307 01:11:16.149311 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-gpr4h" podUID="b2c4739c-e360-4c95-9680-eefc754cc98b" Mar 7 01:11:18.149535 kubelet[2578]: E0307 01:11:18.149225 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-gpr4h" 
podUID="b2c4739c-e360-4c95-9680-eefc754cc98b" Mar 7 01:11:20.149353 kubelet[2578]: E0307 01:11:20.149293 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-gpr4h" podUID="b2c4739c-e360-4c95-9680-eefc754cc98b" Mar 7 01:11:22.149532 kubelet[2578]: E0307 01:11:22.149210 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-gpr4h" podUID="b2c4739c-e360-4c95-9680-eefc754cc98b" Mar 7 01:11:23.260068 containerd[1517]: time="2026-03-07T01:11:23.260008014Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 01:11:23.261319 containerd[1517]: time="2026-03-07T01:11:23.261148339Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.31.4: active requests=0, bytes read=70611671" Mar 7 01:11:23.262242 containerd[1517]: time="2026-03-07T01:11:23.262187858Z" level=info msg="ImageCreate event name:\"sha256:c433a27dd94ce9242338eece49f11629412dd42552fed314746fcf16ea958b2b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 01:11:23.264138 containerd[1517]: time="2026-03-07T01:11:23.264104025Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:f1c5d9a6df01061c5faec4c4b59fb9ba69f8f5164b51e01ea8daa8e373111a04\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 01:11:23.265028 containerd[1517]: time="2026-03-07T01:11:23.264678156Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.31.4\" with image id \"sha256:c433a27dd94ce9242338eece49f11629412dd42552fed314746fcf16ea958b2b\", repo tag 
\"ghcr.io/flatcar/calico/cni:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:f1c5d9a6df01061c5faec4c4b59fb9ba69f8f5164b51e01ea8daa8e373111a04\", size \"72167716\" in 10.019093481s" Mar 7 01:11:23.265028 containerd[1517]: time="2026-03-07T01:11:23.264724234Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.31.4\" returns image reference \"sha256:c433a27dd94ce9242338eece49f11629412dd42552fed314746fcf16ea958b2b\"" Mar 7 01:11:23.269996 containerd[1517]: time="2026-03-07T01:11:23.269961209Z" level=info msg="CreateContainer within sandbox \"24389ad71d26fc0585aa17d895517330987f224be5762cc9fd1fa6a705f61ac0\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Mar 7 01:11:23.285385 containerd[1517]: time="2026-03-07T01:11:23.285335238Z" level=info msg="CreateContainer within sandbox \"24389ad71d26fc0585aa17d895517330987f224be5762cc9fd1fa6a705f61ac0\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"212277b356ea385656740622e518ef30796ddb0d3d286c5d9d2bbdb63dc4d711\"" Mar 7 01:11:23.287330 containerd[1517]: time="2026-03-07T01:11:23.286150067Z" level=info msg="StartContainer for \"212277b356ea385656740622e518ef30796ddb0d3d286c5d9d2bbdb63dc4d711\"" Mar 7 01:11:23.314197 systemd[1]: run-containerd-runc-k8s.io-212277b356ea385656740622e518ef30796ddb0d3d286c5d9d2bbdb63dc4d711-runc.KqRYqT.mount: Deactivated successfully. Mar 7 01:11:23.323561 systemd[1]: Started cri-containerd-212277b356ea385656740622e518ef30796ddb0d3d286c5d9d2bbdb63dc4d711.scope - libcontainer container 212277b356ea385656740622e518ef30796ddb0d3d286c5d9d2bbdb63dc4d711. 
Mar 7 01:11:23.353129 containerd[1517]: time="2026-03-07T01:11:23.353086327Z" level=info msg="StartContainer for \"212277b356ea385656740622e518ef30796ddb0d3d286c5d9d2bbdb63dc4d711\" returns successfully" Mar 7 01:11:23.849443 containerd[1517]: time="2026-03-07T01:11:23.849348928Z" level=error msg="failed to reload cni configuration after receiving fs change event(WRITE \"/etc/cni/net.d/calico-kubeconfig\")" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Mar 7 01:11:23.852030 systemd[1]: cri-containerd-212277b356ea385656740622e518ef30796ddb0d3d286c5d9d2bbdb63dc4d711.scope: Deactivated successfully. Mar 7 01:11:23.872258 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-212277b356ea385656740622e518ef30796ddb0d3d286c5d9d2bbdb63dc4d711-rootfs.mount: Deactivated successfully. Mar 7 01:11:23.877701 containerd[1517]: time="2026-03-07T01:11:23.877538221Z" level=info msg="shim disconnected" id=212277b356ea385656740622e518ef30796ddb0d3d286c5d9d2bbdb63dc4d711 namespace=k8s.io Mar 7 01:11:23.877701 containerd[1517]: time="2026-03-07T01:11:23.877581849Z" level=warning msg="cleaning up after shim disconnected" id=212277b356ea385656740622e518ef30796ddb0d3d286c5d9d2bbdb63dc4d711 namespace=k8s.io Mar 7 01:11:23.877701 containerd[1517]: time="2026-03-07T01:11:23.877590159Z" level=info msg="cleaning up dead shim" namespace=k8s.io Mar 7 01:11:23.940389 kubelet[2578]: I0307 01:11:23.940071 2578 kubelet_node_status.go:501] "Fast updating node status as it just became ready" Mar 7 01:11:23.980494 systemd[1]: Created slice kubepods-burstable-pod9e84ea27_025b_43b3_a817_46d96e7f19b0.slice - libcontainer container kubepods-burstable-pod9e84ea27_025b_43b3_a817_46d96e7f19b0.slice. Mar 7 01:11:23.992703 systemd[1]: Created slice kubepods-besteffort-poddd5d2b91_08a1_4e2b_97af_8acbc0f68c75.slice - libcontainer container kubepods-besteffort-poddd5d2b91_08a1_4e2b_97af_8acbc0f68c75.slice. 
Mar 7 01:11:24.005310 systemd[1]: Created slice kubepods-besteffort-pod9a2395af_edcb_49ca_a47a_8c642cac381d.slice - libcontainer container kubepods-besteffort-pod9a2395af_edcb_49ca_a47a_8c642cac381d.slice. Mar 7 01:11:24.016678 systemd[1]: Created slice kubepods-besteffort-pod32e149ae_638d_489c_a5af_ccd7aafddaba.slice - libcontainer container kubepods-besteffort-pod32e149ae_638d_489c_a5af_ccd7aafddaba.slice. Mar 7 01:11:24.024876 systemd[1]: Created slice kubepods-besteffort-pod4298d66b_eb58_4358_8b7f_a00b4a5de60e.slice - libcontainer container kubepods-besteffort-pod4298d66b_eb58_4358_8b7f_a00b4a5de60e.slice. Mar 7 01:11:24.031707 systemd[1]: Created slice kubepods-burstable-pod9a59049d_3a86_490e_8990_b534f38c98ed.slice - libcontainer container kubepods-burstable-pod9a59049d_3a86_490e_8990_b534f38c98ed.slice. Mar 7 01:11:24.038426 systemd[1]: Created slice kubepods-besteffort-podc6423fb5_e2f2_4a85_962c_3faf4ee6a34d.slice - libcontainer container kubepods-besteffort-podc6423fb5_e2f2_4a85_962c_3faf4ee6a34d.slice. 
Mar 7 01:11:24.053669 kubelet[2578]: I0307 01:11:24.053552 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7zsnl\" (UniqueName: \"kubernetes.io/projected/9a59049d-3a86-490e-8990-b534f38c98ed-kube-api-access-7zsnl\") pod \"coredns-674b8bbfcf-2fcnt\" (UID: \"9a59049d-3a86-490e-8990-b534f38c98ed\") " pod="kube-system/coredns-674b8bbfcf-2fcnt" Mar 7 01:11:24.054096 kubelet[2578]: I0307 01:11:24.054043 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/dd5d2b91-08a1-4e2b-97af-8acbc0f68c75-calico-apiserver-certs\") pod \"calico-apiserver-7cdd895f86-5kmtj\" (UID: \"dd5d2b91-08a1-4e2b-97af-8acbc0f68c75\") " pod="calico-system/calico-apiserver-7cdd895f86-5kmtj" Mar 7 01:11:24.054096 kubelet[2578]: I0307 01:11:24.054067 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8tjqr\" (UniqueName: \"kubernetes.io/projected/9a2395af-edcb-49ca-a47a-8c642cac381d-kube-api-access-8tjqr\") pod \"goldmane-5b85766d88-8nxm8\" (UID: \"9a2395af-edcb-49ca-a47a-8c642cac381d\") " pod="calico-system/goldmane-5b85766d88-8nxm8" Mar 7 01:11:24.054096 kubelet[2578]: I0307 01:11:24.054079 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-65nz6\" (UniqueName: \"kubernetes.io/projected/32e149ae-638d-489c-a5af-ccd7aafddaba-kube-api-access-65nz6\") pod \"whisker-d895dd87-pfsm5\" (UID: \"32e149ae-638d-489c-a5af-ccd7aafddaba\") " pod="calico-system/whisker-d895dd87-pfsm5" Mar 7 01:11:24.054426 kubelet[2578]: I0307 01:11:24.054322 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r5gr2\" (UniqueName: \"kubernetes.io/projected/9e84ea27-025b-43b3-a817-46d96e7f19b0-kube-api-access-r5gr2\") pod \"coredns-674b8bbfcf-2g6xd\" (UID: 
\"9e84ea27-025b-43b3-a817-46d96e7f19b0\") " pod="kube-system/coredns-674b8bbfcf-2g6xd" Mar 7 01:11:24.054426 kubelet[2578]: I0307 01:11:24.054338 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/32e149ae-638d-489c-a5af-ccd7aafddaba-whisker-ca-bundle\") pod \"whisker-d895dd87-pfsm5\" (UID: \"32e149ae-638d-489c-a5af-ccd7aafddaba\") " pod="calico-system/whisker-d895dd87-pfsm5" Mar 7 01:11:24.054495 kubelet[2578]: I0307 01:11:24.054470 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9a59049d-3a86-490e-8990-b534f38c98ed-config-volume\") pod \"coredns-674b8bbfcf-2fcnt\" (UID: \"9a59049d-3a86-490e-8990-b534f38c98ed\") " pod="kube-system/coredns-674b8bbfcf-2fcnt" Mar 7 01:11:24.054521 kubelet[2578]: I0307 01:11:24.054501 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9a2395af-edcb-49ca-a47a-8c642cac381d-config\") pod \"goldmane-5b85766d88-8nxm8\" (UID: \"9a2395af-edcb-49ca-a47a-8c642cac381d\") " pod="calico-system/goldmane-5b85766d88-8nxm8" Mar 7 01:11:24.054541 kubelet[2578]: I0307 01:11:24.054521 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/32e149ae-638d-489c-a5af-ccd7aafddaba-whisker-backend-key-pair\") pod \"whisker-d895dd87-pfsm5\" (UID: \"32e149ae-638d-489c-a5af-ccd7aafddaba\") " pod="calico-system/whisker-d895dd87-pfsm5" Mar 7 01:11:24.054565 kubelet[2578]: I0307 01:11:24.054558 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9e84ea27-025b-43b3-a817-46d96e7f19b0-config-volume\") pod \"coredns-674b8bbfcf-2g6xd\" (UID: 
\"9e84ea27-025b-43b3-a817-46d96e7f19b0\") " pod="kube-system/coredns-674b8bbfcf-2g6xd" Mar 7 01:11:24.054585 kubelet[2578]: I0307 01:11:24.054571 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9a2395af-edcb-49ca-a47a-8c642cac381d-goldmane-ca-bundle\") pod \"goldmane-5b85766d88-8nxm8\" (UID: \"9a2395af-edcb-49ca-a47a-8c642cac381d\") " pod="calico-system/goldmane-5b85766d88-8nxm8" Mar 7 01:11:24.054603 kubelet[2578]: I0307 01:11:24.054588 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c6423fb5-e2f2-4a85-962c-3faf4ee6a34d-tigera-ca-bundle\") pod \"calico-kube-controllers-cf679fccd-lv8s8\" (UID: \"c6423fb5-e2f2-4a85-962c-3faf4ee6a34d\") " pod="calico-system/calico-kube-controllers-cf679fccd-lv8s8" Mar 7 01:11:24.054623 kubelet[2578]: I0307 01:11:24.054613 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s968k\" (UniqueName: \"kubernetes.io/projected/4298d66b-eb58-4358-8b7f-a00b4a5de60e-kube-api-access-s968k\") pod \"calico-apiserver-7cdd895f86-pmmf8\" (UID: \"4298d66b-eb58-4358-8b7f-a00b4a5de60e\") " pod="calico-system/calico-apiserver-7cdd895f86-pmmf8" Mar 7 01:11:24.054969 kubelet[2578]: I0307 01:11:24.054642 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/9a2395af-edcb-49ca-a47a-8c642cac381d-goldmane-key-pair\") pod \"goldmane-5b85766d88-8nxm8\" (UID: \"9a2395af-edcb-49ca-a47a-8c642cac381d\") " pod="calico-system/goldmane-5b85766d88-8nxm8" Mar 7 01:11:24.054969 kubelet[2578]: I0307 01:11:24.054692 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gd8zb\" (UniqueName: 
\"kubernetes.io/projected/dd5d2b91-08a1-4e2b-97af-8acbc0f68c75-kube-api-access-gd8zb\") pod \"calico-apiserver-7cdd895f86-5kmtj\" (UID: \"dd5d2b91-08a1-4e2b-97af-8acbc0f68c75\") " pod="calico-system/calico-apiserver-7cdd895f86-5kmtj" Mar 7 01:11:24.054969 kubelet[2578]: I0307 01:11:24.054708 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bpf66\" (UniqueName: \"kubernetes.io/projected/c6423fb5-e2f2-4a85-962c-3faf4ee6a34d-kube-api-access-bpf66\") pod \"calico-kube-controllers-cf679fccd-lv8s8\" (UID: \"c6423fb5-e2f2-4a85-962c-3faf4ee6a34d\") " pod="calico-system/calico-kube-controllers-cf679fccd-lv8s8" Mar 7 01:11:24.054969 kubelet[2578]: I0307 01:11:24.054720 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/4298d66b-eb58-4358-8b7f-a00b4a5de60e-calico-apiserver-certs\") pod \"calico-apiserver-7cdd895f86-pmmf8\" (UID: \"4298d66b-eb58-4358-8b7f-a00b4a5de60e\") " pod="calico-system/calico-apiserver-7cdd895f86-pmmf8" Mar 7 01:11:24.054969 kubelet[2578]: I0307 01:11:24.054732 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-config\" (UniqueName: \"kubernetes.io/configmap/32e149ae-638d-489c-a5af-ccd7aafddaba-nginx-config\") pod \"whisker-d895dd87-pfsm5\" (UID: \"32e149ae-638d-489c-a5af-ccd7aafddaba\") " pod="calico-system/whisker-d895dd87-pfsm5" Mar 7 01:11:24.159361 systemd[1]: Created slice kubepods-besteffort-podb2c4739c_e360_4c95_9680_eefc754cc98b.slice - libcontainer container kubepods-besteffort-podb2c4739c_e360_4c95_9680_eefc754cc98b.slice. 
Mar 7 01:11:24.190441 containerd[1517]: time="2026-03-07T01:11:24.189355597Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-gpr4h,Uid:b2c4739c-e360-4c95-9680-eefc754cc98b,Namespace:calico-system,Attempt:0,}" Mar 7 01:11:24.290035 containerd[1517]: time="2026-03-07T01:11:24.290005427Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-2g6xd,Uid:9e84ea27-025b-43b3-a817-46d96e7f19b0,Namespace:kube-system,Attempt:0,}" Mar 7 01:11:24.298031 containerd[1517]: time="2026-03-07T01:11:24.297973892Z" level=info msg="CreateContainer within sandbox \"24389ad71d26fc0585aa17d895517330987f224be5762cc9fd1fa6a705f61ac0\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Mar 7 01:11:24.300031 containerd[1517]: time="2026-03-07T01:11:24.300002037Z" level=error msg="Failed to destroy network for sandbox \"6d620b985993c3b8720330d88f70b48ec7711327345e0569e0e2eb0531b2e089\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 01:11:24.300445 containerd[1517]: time="2026-03-07T01:11:24.300426737Z" level=error msg="encountered an error cleaning up failed sandbox \"6d620b985993c3b8720330d88f70b48ec7711327345e0569e0e2eb0531b2e089\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 01:11:24.300548 containerd[1517]: time="2026-03-07T01:11:24.300532651Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-gpr4h,Uid:b2c4739c-e360-4c95-9680-eefc754cc98b,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"6d620b985993c3b8720330d88f70b48ec7711327345e0569e0e2eb0531b2e089\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no 
such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 01:11:24.303199 kubelet[2578]: E0307 01:11:24.303119 2578 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6d620b985993c3b8720330d88f70b48ec7711327345e0569e0e2eb0531b2e089\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 01:11:24.303199 kubelet[2578]: E0307 01:11:24.303186 2578 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6d620b985993c3b8720330d88f70b48ec7711327345e0569e0e2eb0531b2e089\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-gpr4h" Mar 7 01:11:24.304116 kubelet[2578]: E0307 01:11:24.303203 2578 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6d620b985993c3b8720330d88f70b48ec7711327345e0569e0e2eb0531b2e089\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-gpr4h" Mar 7 01:11:24.304116 kubelet[2578]: E0307 01:11:24.303251 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-gpr4h_calico-system(b2c4739c-e360-4c95-9680-eefc754cc98b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-gpr4h_calico-system(b2c4739c-e360-4c95-9680-eefc754cc98b)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox 
\\\"6d620b985993c3b8720330d88f70b48ec7711327345e0569e0e2eb0531b2e089\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-gpr4h" podUID="b2c4739c-e360-4c95-9680-eefc754cc98b" Mar 7 01:11:24.310137 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-6d620b985993c3b8720330d88f70b48ec7711327345e0569e0e2eb0531b2e089-shm.mount: Deactivated successfully. Mar 7 01:11:24.311266 containerd[1517]: time="2026-03-07T01:11:24.310569130Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7cdd895f86-5kmtj,Uid:dd5d2b91-08a1-4e2b-97af-8acbc0f68c75,Namespace:calico-system,Attempt:0,}" Mar 7 01:11:24.312940 containerd[1517]: time="2026-03-07T01:11:24.312903381Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-5b85766d88-8nxm8,Uid:9a2395af-edcb-49ca-a47a-8c642cac381d,Namespace:calico-system,Attempt:0,}" Mar 7 01:11:24.323033 containerd[1517]: time="2026-03-07T01:11:24.322994666Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-d895dd87-pfsm5,Uid:32e149ae-638d-489c-a5af-ccd7aafddaba,Namespace:calico-system,Attempt:0,}" Mar 7 01:11:24.330099 containerd[1517]: time="2026-03-07T01:11:24.330066794Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7cdd895f86-pmmf8,Uid:4298d66b-eb58-4358-8b7f-a00b4a5de60e,Namespace:calico-system,Attempt:0,}" Mar 7 01:11:24.336597 containerd[1517]: time="2026-03-07T01:11:24.336548109Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-2fcnt,Uid:9a59049d-3a86-490e-8990-b534f38c98ed,Namespace:kube-system,Attempt:0,}" Mar 7 01:11:24.340696 containerd[1517]: time="2026-03-07T01:11:24.340656276Z" level=info msg="CreateContainer within sandbox \"24389ad71d26fc0585aa17d895517330987f224be5762cc9fd1fa6a705f61ac0\" for &ContainerMetadata{Name:calico-node,Attempt:0,} 
returns container id \"5b5a91af529efeca02e953877ce9d94d32debf076d1ebbf98df1d5a3c7b5b06a\"" Mar 7 01:11:24.342894 containerd[1517]: time="2026-03-07T01:11:24.341891138Z" level=info msg="StartContainer for \"5b5a91af529efeca02e953877ce9d94d32debf076d1ebbf98df1d5a3c7b5b06a\"" Mar 7 01:11:24.342894 containerd[1517]: time="2026-03-07T01:11:24.342876252Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-cf679fccd-lv8s8,Uid:c6423fb5-e2f2-4a85-962c-3faf4ee6a34d,Namespace:calico-system,Attempt:0,}" Mar 7 01:11:24.394589 systemd[1]: Started cri-containerd-5b5a91af529efeca02e953877ce9d94d32debf076d1ebbf98df1d5a3c7b5b06a.scope - libcontainer container 5b5a91af529efeca02e953877ce9d94d32debf076d1ebbf98df1d5a3c7b5b06a. Mar 7 01:11:24.461496 containerd[1517]: time="2026-03-07T01:11:24.461384651Z" level=info msg="StartContainer for \"5b5a91af529efeca02e953877ce9d94d32debf076d1ebbf98df1d5a3c7b5b06a\" returns successfully" Mar 7 01:11:24.490223 containerd[1517]: time="2026-03-07T01:11:24.490180688Z" level=error msg="Failed to destroy network for sandbox \"7550b93c1236f21cc3534f0954099343d94287df49b186bec307d38d9a388211\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 01:11:24.491600 containerd[1517]: time="2026-03-07T01:11:24.491570653Z" level=error msg="encountered an error cleaning up failed sandbox \"7550b93c1236f21cc3534f0954099343d94287df49b186bec307d38d9a388211\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 01:11:24.492854 containerd[1517]: time="2026-03-07T01:11:24.492831084Z" level=error msg="RunPodSandbox for 
&PodSandboxMetadata{Name:coredns-674b8bbfcf-2g6xd,Uid:9e84ea27-025b-43b3-a817-46d96e7f19b0,Namespace:kube-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"7550b93c1236f21cc3534f0954099343d94287df49b186bec307d38d9a388211\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 01:11:24.494595 kubelet[2578]: E0307 01:11:24.493545 2578 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7550b93c1236f21cc3534f0954099343d94287df49b186bec307d38d9a388211\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 01:11:24.495233 kubelet[2578]: E0307 01:11:24.494769 2578 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7550b93c1236f21cc3534f0954099343d94287df49b186bec307d38d9a388211\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-2g6xd" Mar 7 01:11:24.495233 kubelet[2578]: E0307 01:11:24.494797 2578 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7550b93c1236f21cc3534f0954099343d94287df49b186bec307d38d9a388211\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-2g6xd" Mar 7 01:11:24.495233 kubelet[2578]: E0307 01:11:24.495099 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"CreatePodSandbox\" for \"coredns-674b8bbfcf-2g6xd_kube-system(9e84ea27-025b-43b3-a817-46d96e7f19b0)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-674b8bbfcf-2g6xd_kube-system(9e84ea27-025b-43b3-a817-46d96e7f19b0)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"7550b93c1236f21cc3534f0954099343d94287df49b186bec307d38d9a388211\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-674b8bbfcf-2g6xd" podUID="9e84ea27-025b-43b3-a817-46d96e7f19b0" Mar 7 01:11:24.550181 containerd[1517]: time="2026-03-07T01:11:24.550115631Z" level=error msg="Failed to destroy network for sandbox \"aa36742035b6c7029f45e2175197ca8cc331b0bc97047ac8dc523d7087a38d21\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 01:11:24.553489 containerd[1517]: time="2026-03-07T01:11:24.553443185Z" level=error msg="encountered an error cleaning up failed sandbox \"aa36742035b6c7029f45e2175197ca8cc331b0bc97047ac8dc523d7087a38d21\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 01:11:24.553571 containerd[1517]: time="2026-03-07T01:11:24.553493092Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7cdd895f86-5kmtj,Uid:dd5d2b91-08a1-4e2b-97af-8acbc0f68c75,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"aa36742035b6c7029f45e2175197ca8cc331b0bc97047ac8dc523d7087a38d21\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node 
container is running and has mounted /var/lib/calico/" Mar 7 01:11:24.553798 kubelet[2578]: E0307 01:11:24.553690 2578 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"aa36742035b6c7029f45e2175197ca8cc331b0bc97047ac8dc523d7087a38d21\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 01:11:24.553798 kubelet[2578]: E0307 01:11:24.553740 2578 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"aa36742035b6c7029f45e2175197ca8cc331b0bc97047ac8dc523d7087a38d21\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-apiserver-7cdd895f86-5kmtj" Mar 7 01:11:24.553798 kubelet[2578]: E0307 01:11:24.553758 2578 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"aa36742035b6c7029f45e2175197ca8cc331b0bc97047ac8dc523d7087a38d21\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-apiserver-7cdd895f86-5kmtj" Mar 7 01:11:24.553882 kubelet[2578]: E0307 01:11:24.553794 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-7cdd895f86-5kmtj_calico-system(dd5d2b91-08a1-4e2b-97af-8acbc0f68c75)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-7cdd895f86-5kmtj_calico-system(dd5d2b91-08a1-4e2b-97af-8acbc0f68c75)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox 
\\\"aa36742035b6c7029f45e2175197ca8cc331b0bc97047ac8dc523d7087a38d21\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-apiserver-7cdd895f86-5kmtj" podUID="dd5d2b91-08a1-4e2b-97af-8acbc0f68c75" Mar 7 01:11:24.757308 containerd[1517]: 2026-03-07 01:11:24.683 [INFO][3709] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="efa6d58c4c990d9a7d263f83f65df51d9192669fc08e8c61eca6e0082c540289" Mar 7 01:11:24.757308 containerd[1517]: 2026-03-07 01:11:24.683 [INFO][3709] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="efa6d58c4c990d9a7d263f83f65df51d9192669fc08e8c61eca6e0082c540289" iface="eth0" netns="/var/run/netns/cni-ac7b8961-232e-8f1c-84a9-72f6797c2168" Mar 7 01:11:24.757308 containerd[1517]: 2026-03-07 01:11:24.684 [INFO][3709] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="efa6d58c4c990d9a7d263f83f65df51d9192669fc08e8c61eca6e0082c540289" iface="eth0" netns="/var/run/netns/cni-ac7b8961-232e-8f1c-84a9-72f6797c2168" Mar 7 01:11:24.757308 containerd[1517]: 2026-03-07 01:11:24.684 [INFO][3709] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. 
ContainerID="efa6d58c4c990d9a7d263f83f65df51d9192669fc08e8c61eca6e0082c540289" iface="eth0" netns="/var/run/netns/cni-ac7b8961-232e-8f1c-84a9-72f6797c2168" Mar 7 01:11:24.757308 containerd[1517]: 2026-03-07 01:11:24.684 [INFO][3709] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="efa6d58c4c990d9a7d263f83f65df51d9192669fc08e8c61eca6e0082c540289" Mar 7 01:11:24.757308 containerd[1517]: 2026-03-07 01:11:24.684 [INFO][3709] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="efa6d58c4c990d9a7d263f83f65df51d9192669fc08e8c61eca6e0082c540289" Mar 7 01:11:24.757308 containerd[1517]: 2026-03-07 01:11:24.726 [INFO][3762] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="efa6d58c4c990d9a7d263f83f65df51d9192669fc08e8c61eca6e0082c540289" HandleID="k8s-pod-network.efa6d58c4c990d9a7d263f83f65df51d9192669fc08e8c61eca6e0082c540289" Workload="ci--4081--3--6--n--e40d23dcbc-k8s-calico--kube--controllers--cf679fccd--lv8s8-eth0" Mar 7 01:11:24.757308 containerd[1517]: 2026-03-07 01:11:24.726 [INFO][3762] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 7 01:11:24.757308 containerd[1517]: 2026-03-07 01:11:24.726 [INFO][3762] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 7 01:11:24.757308 containerd[1517]: 2026-03-07 01:11:24.730 [WARNING][3762] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="efa6d58c4c990d9a7d263f83f65df51d9192669fc08e8c61eca6e0082c540289" HandleID="k8s-pod-network.efa6d58c4c990d9a7d263f83f65df51d9192669fc08e8c61eca6e0082c540289" Workload="ci--4081--3--6--n--e40d23dcbc-k8s-calico--kube--controllers--cf679fccd--lv8s8-eth0" Mar 7 01:11:24.757308 containerd[1517]: 2026-03-07 01:11:24.730 [INFO][3762] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="efa6d58c4c990d9a7d263f83f65df51d9192669fc08e8c61eca6e0082c540289" HandleID="k8s-pod-network.efa6d58c4c990d9a7d263f83f65df51d9192669fc08e8c61eca6e0082c540289" Workload="ci--4081--3--6--n--e40d23dcbc-k8s-calico--kube--controllers--cf679fccd--lv8s8-eth0" Mar 7 01:11:24.757308 containerd[1517]: 2026-03-07 01:11:24.734 [INFO][3762] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 7 01:11:24.757308 containerd[1517]: 2026-03-07 01:11:24.746 [INFO][3709] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="efa6d58c4c990d9a7d263f83f65df51d9192669fc08e8c61eca6e0082c540289" Mar 7 01:11:24.764347 containerd[1517]: time="2026-03-07T01:11:24.763827096Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-cf679fccd-lv8s8,Uid:c6423fb5-e2f2-4a85-962c-3faf4ee6a34d,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"efa6d58c4c990d9a7d263f83f65df51d9192669fc08e8c61eca6e0082c540289\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 01:11:24.764933 kubelet[2578]: E0307 01:11:24.764469 2578 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"efa6d58c4c990d9a7d263f83f65df51d9192669fc08e8c61eca6e0082c540289\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted 
/var/lib/calico/" Mar 7 01:11:24.764933 kubelet[2578]: E0307 01:11:24.764642 2578 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"efa6d58c4c990d9a7d263f83f65df51d9192669fc08e8c61eca6e0082c540289\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-cf679fccd-lv8s8" Mar 7 01:11:24.764933 kubelet[2578]: E0307 01:11:24.764763 2578 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"efa6d58c4c990d9a7d263f83f65df51d9192669fc08e8c61eca6e0082c540289\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-cf679fccd-lv8s8" Mar 7 01:11:24.765058 kubelet[2578]: E0307 01:11:24.764807 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-cf679fccd-lv8s8_calico-system(c6423fb5-e2f2-4a85-962c-3faf4ee6a34d)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-cf679fccd-lv8s8_calico-system(c6423fb5-e2f2-4a85-962c-3faf4ee6a34d)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"efa6d58c4c990d9a7d263f83f65df51d9192669fc08e8c61eca6e0082c540289\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-cf679fccd-lv8s8" podUID="c6423fb5-e2f2-4a85-962c-3faf4ee6a34d" Mar 7 01:11:24.794944 containerd[1517]: 2026-03-07 01:11:24.699 [INFO][3694] cni-plugin/k8s.go 652: Cleaning up netns 
ContainerID="ac99b4e71911a8a14a04f7cab9ad8b527bc650238cd27d5c17d976b7747273b0" Mar 7 01:11:24.794944 containerd[1517]: 2026-03-07 01:11:24.701 [INFO][3694] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="ac99b4e71911a8a14a04f7cab9ad8b527bc650238cd27d5c17d976b7747273b0" iface="eth0" netns="/var/run/netns/cni-833cb442-3e77-9bc7-1fc2-494bea61efd2" Mar 7 01:11:24.794944 containerd[1517]: 2026-03-07 01:11:24.701 [INFO][3694] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="ac99b4e71911a8a14a04f7cab9ad8b527bc650238cd27d5c17d976b7747273b0" iface="eth0" netns="/var/run/netns/cni-833cb442-3e77-9bc7-1fc2-494bea61efd2" Mar 7 01:11:24.794944 containerd[1517]: 2026-03-07 01:11:24.702 [INFO][3694] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. ContainerID="ac99b4e71911a8a14a04f7cab9ad8b527bc650238cd27d5c17d976b7747273b0" iface="eth0" netns="/var/run/netns/cni-833cb442-3e77-9bc7-1fc2-494bea61efd2" Mar 7 01:11:24.794944 containerd[1517]: 2026-03-07 01:11:24.702 [INFO][3694] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="ac99b4e71911a8a14a04f7cab9ad8b527bc650238cd27d5c17d976b7747273b0" Mar 7 01:11:24.794944 containerd[1517]: 2026-03-07 01:11:24.702 [INFO][3694] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="ac99b4e71911a8a14a04f7cab9ad8b527bc650238cd27d5c17d976b7747273b0" Mar 7 01:11:24.794944 containerd[1517]: 2026-03-07 01:11:24.775 [INFO][3774] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="ac99b4e71911a8a14a04f7cab9ad8b527bc650238cd27d5c17d976b7747273b0" HandleID="k8s-pod-network.ac99b4e71911a8a14a04f7cab9ad8b527bc650238cd27d5c17d976b7747273b0" Workload="ci--4081--3--6--n--e40d23dcbc-k8s-goldmane--5b85766d88--8nxm8-eth0" Mar 7 01:11:24.794944 containerd[1517]: 2026-03-07 01:11:24.775 [INFO][3774] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. 
Mar 7 01:11:24.794944 containerd[1517]: 2026-03-07 01:11:24.775 [INFO][3774] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 7 01:11:24.794944 containerd[1517]: 2026-03-07 01:11:24.787 [WARNING][3774] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. Ignoring ContainerID="ac99b4e71911a8a14a04f7cab9ad8b527bc650238cd27d5c17d976b7747273b0" HandleID="k8s-pod-network.ac99b4e71911a8a14a04f7cab9ad8b527bc650238cd27d5c17d976b7747273b0" Workload="ci--4081--3--6--n--e40d23dcbc-k8s-goldmane--5b85766d88--8nxm8-eth0" Mar 7 01:11:24.794944 containerd[1517]: 2026-03-07 01:11:24.787 [INFO][3774] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="ac99b4e71911a8a14a04f7cab9ad8b527bc650238cd27d5c17d976b7747273b0" HandleID="k8s-pod-network.ac99b4e71911a8a14a04f7cab9ad8b527bc650238cd27d5c17d976b7747273b0" Workload="ci--4081--3--6--n--e40d23dcbc-k8s-goldmane--5b85766d88--8nxm8-eth0" Mar 7 01:11:24.794944 containerd[1517]: 2026-03-07 01:11:24.789 [INFO][3774] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 7 01:11:24.794944 containerd[1517]: 2026-03-07 01:11:24.792 [INFO][3694] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="ac99b4e71911a8a14a04f7cab9ad8b527bc650238cd27d5c17d976b7747273b0" Mar 7 01:11:24.799853 containerd[1517]: 2026-03-07 01:11:24.684 [INFO][3708] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="1e1fb0826234b85da7e97910e375a5aade57e7e0c3114a62d0245b5181fb52a0" Mar 7 01:11:24.799853 containerd[1517]: 2026-03-07 01:11:24.684 [INFO][3708] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="1e1fb0826234b85da7e97910e375a5aade57e7e0c3114a62d0245b5181fb52a0" iface="eth0" netns="/var/run/netns/cni-80bf0a71-335c-67c0-26c9-0978f866d161" Mar 7 01:11:24.799853 containerd[1517]: 2026-03-07 01:11:24.684 [INFO][3708] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. 
ContainerID="1e1fb0826234b85da7e97910e375a5aade57e7e0c3114a62d0245b5181fb52a0" iface="eth0" netns="/var/run/netns/cni-80bf0a71-335c-67c0-26c9-0978f866d161" Mar 7 01:11:24.799853 containerd[1517]: 2026-03-07 01:11:24.685 [INFO][3708] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. ContainerID="1e1fb0826234b85da7e97910e375a5aade57e7e0c3114a62d0245b5181fb52a0" iface="eth0" netns="/var/run/netns/cni-80bf0a71-335c-67c0-26c9-0978f866d161" Mar 7 01:11:24.799853 containerd[1517]: 2026-03-07 01:11:24.685 [INFO][3708] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="1e1fb0826234b85da7e97910e375a5aade57e7e0c3114a62d0245b5181fb52a0" Mar 7 01:11:24.799853 containerd[1517]: 2026-03-07 01:11:24.685 [INFO][3708] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="1e1fb0826234b85da7e97910e375a5aade57e7e0c3114a62d0245b5181fb52a0" Mar 7 01:11:24.799853 containerd[1517]: 2026-03-07 01:11:24.776 [INFO][3764] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="1e1fb0826234b85da7e97910e375a5aade57e7e0c3114a62d0245b5181fb52a0" HandleID="k8s-pod-network.1e1fb0826234b85da7e97910e375a5aade57e7e0c3114a62d0245b5181fb52a0" Workload="ci--4081--3--6--n--e40d23dcbc-k8s-whisker--d895dd87--pfsm5-eth0" Mar 7 01:11:24.799853 containerd[1517]: 2026-03-07 01:11:24.777 [INFO][3764] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 7 01:11:24.799853 containerd[1517]: 2026-03-07 01:11:24.789 [INFO][3764] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 7 01:11:24.799853 containerd[1517]: 2026-03-07 01:11:24.794 [WARNING][3764] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="1e1fb0826234b85da7e97910e375a5aade57e7e0c3114a62d0245b5181fb52a0" HandleID="k8s-pod-network.1e1fb0826234b85da7e97910e375a5aade57e7e0c3114a62d0245b5181fb52a0" Workload="ci--4081--3--6--n--e40d23dcbc-k8s-whisker--d895dd87--pfsm5-eth0" Mar 7 01:11:24.799853 containerd[1517]: 2026-03-07 01:11:24.794 [INFO][3764] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="1e1fb0826234b85da7e97910e375a5aade57e7e0c3114a62d0245b5181fb52a0" HandleID="k8s-pod-network.1e1fb0826234b85da7e97910e375a5aade57e7e0c3114a62d0245b5181fb52a0" Workload="ci--4081--3--6--n--e40d23dcbc-k8s-whisker--d895dd87--pfsm5-eth0" Mar 7 01:11:24.799853 containerd[1517]: 2026-03-07 01:11:24.795 [INFO][3764] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 7 01:11:24.799853 containerd[1517]: 2026-03-07 01:11:24.797 [INFO][3708] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="1e1fb0826234b85da7e97910e375a5aade57e7e0c3114a62d0245b5181fb52a0" Mar 7 01:11:24.803554 containerd[1517]: time="2026-03-07T01:11:24.803340239Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-5b85766d88-8nxm8,Uid:9a2395af-edcb-49ca-a47a-8c642cac381d,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"ac99b4e71911a8a14a04f7cab9ad8b527bc650238cd27d5c17d976b7747273b0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 01:11:24.804026 kubelet[2578]: E0307 01:11:24.803969 2578 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ac99b4e71911a8a14a04f7cab9ad8b527bc650238cd27d5c17d976b7747273b0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 01:11:24.804260 
kubelet[2578]: E0307 01:11:24.804120 2578 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ac99b4e71911a8a14a04f7cab9ad8b527bc650238cd27d5c17d976b7747273b0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-5b85766d88-8nxm8" Mar 7 01:11:24.804260 kubelet[2578]: E0307 01:11:24.804143 2578 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ac99b4e71911a8a14a04f7cab9ad8b527bc650238cd27d5c17d976b7747273b0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-5b85766d88-8nxm8" Mar 7 01:11:24.804450 kubelet[2578]: E0307 01:11:24.804367 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-5b85766d88-8nxm8_calico-system(9a2395af-edcb-49ca-a47a-8c642cac381d)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-5b85766d88-8nxm8_calico-system(9a2395af-edcb-49ca-a47a-8c642cac381d)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"ac99b4e71911a8a14a04f7cab9ad8b527bc650238cd27d5c17d976b7747273b0\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-5b85766d88-8nxm8" podUID="9a2395af-edcb-49ca-a47a-8c642cac381d" Mar 7 01:11:24.808470 containerd[1517]: time="2026-03-07T01:11:24.808383462Z" level=error msg="RunPodSandbox for 
&PodSandboxMetadata{Name:whisker-d895dd87-pfsm5,Uid:32e149ae-638d-489c-a5af-ccd7aafddaba,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"1e1fb0826234b85da7e97910e375a5aade57e7e0c3114a62d0245b5181fb52a0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 01:11:24.809020 kubelet[2578]: E0307 01:11:24.808959 2578 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1e1fb0826234b85da7e97910e375a5aade57e7e0c3114a62d0245b5181fb52a0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 01:11:24.809020 kubelet[2578]: E0307 01:11:24.808990 2578 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1e1fb0826234b85da7e97910e375a5aade57e7e0c3114a62d0245b5181fb52a0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-d895dd87-pfsm5" Mar 7 01:11:24.811237 containerd[1517]: 2026-03-07 01:11:24.695 [INFO][3737] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="4a97572b51fefe5bdee13f951fc167a969d64886d2dafafa80678c97a2e62e53" Mar 7 01:11:24.811237 containerd[1517]: 2026-03-07 01:11:24.696 [INFO][3737] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. 
ContainerID="4a97572b51fefe5bdee13f951fc167a969d64886d2dafafa80678c97a2e62e53" iface="eth0" netns="/var/run/netns/cni-3e0b8e75-a5a0-d887-ea49-0a5b1cd6a777" Mar 7 01:11:24.811237 containerd[1517]: 2026-03-07 01:11:24.696 [INFO][3737] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="4a97572b51fefe5bdee13f951fc167a969d64886d2dafafa80678c97a2e62e53" iface="eth0" netns="/var/run/netns/cni-3e0b8e75-a5a0-d887-ea49-0a5b1cd6a777" Mar 7 01:11:24.811237 containerd[1517]: 2026-03-07 01:11:24.696 [INFO][3737] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. ContainerID="4a97572b51fefe5bdee13f951fc167a969d64886d2dafafa80678c97a2e62e53" iface="eth0" netns="/var/run/netns/cni-3e0b8e75-a5a0-d887-ea49-0a5b1cd6a777" Mar 7 01:11:24.811237 containerd[1517]: 2026-03-07 01:11:24.696 [INFO][3737] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="4a97572b51fefe5bdee13f951fc167a969d64886d2dafafa80678c97a2e62e53" Mar 7 01:11:24.811237 containerd[1517]: 2026-03-07 01:11:24.696 [INFO][3737] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="4a97572b51fefe5bdee13f951fc167a969d64886d2dafafa80678c97a2e62e53" Mar 7 01:11:24.811237 containerd[1517]: 2026-03-07 01:11:24.782 [INFO][3772] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="4a97572b51fefe5bdee13f951fc167a969d64886d2dafafa80678c97a2e62e53" HandleID="k8s-pod-network.4a97572b51fefe5bdee13f951fc167a969d64886d2dafafa80678c97a2e62e53" Workload="ci--4081--3--6--n--e40d23dcbc-k8s-calico--apiserver--7cdd895f86--pmmf8-eth0" Mar 7 01:11:24.811237 containerd[1517]: 2026-03-07 01:11:24.783 [INFO][3772] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 7 01:11:24.811237 containerd[1517]: 2026-03-07 01:11:24.795 [INFO][3772] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Mar 7 01:11:24.811237 containerd[1517]: 2026-03-07 01:11:24.800 [WARNING][3772] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. Ignoring ContainerID="4a97572b51fefe5bdee13f951fc167a969d64886d2dafafa80678c97a2e62e53" HandleID="k8s-pod-network.4a97572b51fefe5bdee13f951fc167a969d64886d2dafafa80678c97a2e62e53" Workload="ci--4081--3--6--n--e40d23dcbc-k8s-calico--apiserver--7cdd895f86--pmmf8-eth0" Mar 7 01:11:24.811237 containerd[1517]: 2026-03-07 01:11:24.800 [INFO][3772] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="4a97572b51fefe5bdee13f951fc167a969d64886d2dafafa80678c97a2e62e53" HandleID="k8s-pod-network.4a97572b51fefe5bdee13f951fc167a969d64886d2dafafa80678c97a2e62e53" Workload="ci--4081--3--6--n--e40d23dcbc-k8s-calico--apiserver--7cdd895f86--pmmf8-eth0" Mar 7 01:11:24.811237 containerd[1517]: 2026-03-07 01:11:24.802 [INFO][3772] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 7 01:11:24.811237 containerd[1517]: 2026-03-07 01:11:24.807 [INFO][3737] cni-plugin/k8s.go 665: Teardown processing complete. 
ContainerID="4a97572b51fefe5bdee13f951fc167a969d64886d2dafafa80678c97a2e62e53" Mar 7 01:11:24.814935 containerd[1517]: time="2026-03-07T01:11:24.814848009Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7cdd895f86-pmmf8,Uid:4298d66b-eb58-4358-8b7f-a00b4a5de60e,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"4a97572b51fefe5bdee13f951fc167a969d64886d2dafafa80678c97a2e62e53\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 01:11:24.815194 kubelet[2578]: E0307 01:11:24.815124 2578 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4a97572b51fefe5bdee13f951fc167a969d64886d2dafafa80678c97a2e62e53\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 01:11:24.815194 kubelet[2578]: E0307 01:11:24.815183 2578 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4a97572b51fefe5bdee13f951fc167a969d64886d2dafafa80678c97a2e62e53\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-apiserver-7cdd895f86-pmmf8" Mar 7 01:11:24.815249 kubelet[2578]: E0307 01:11:24.815207 2578 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4a97572b51fefe5bdee13f951fc167a969d64886d2dafafa80678c97a2e62e53\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has 
mounted /var/lib/calico/" pod="calico-system/calico-apiserver-7cdd895f86-pmmf8" Mar 7 01:11:24.815278 kubelet[2578]: E0307 01:11:24.815246 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-7cdd895f86-pmmf8_calico-system(4298d66b-eb58-4358-8b7f-a00b4a5de60e)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-7cdd895f86-pmmf8_calico-system(4298d66b-eb58-4358-8b7f-a00b4a5de60e)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"4a97572b51fefe5bdee13f951fc167a969d64886d2dafafa80678c97a2e62e53\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-apiserver-7cdd895f86-pmmf8" podUID="4298d66b-eb58-4358-8b7f-a00b4a5de60e" Mar 7 01:11:24.818806 containerd[1517]: 2026-03-07 01:11:24.696 [INFO][3736] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="9a6c845a3e67aa0710f9cc63e66d4fad7ce9935a0c1d222959acf6c357c3ebe8" Mar 7 01:11:24.818806 containerd[1517]: 2026-03-07 01:11:24.701 [INFO][3736] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="9a6c845a3e67aa0710f9cc63e66d4fad7ce9935a0c1d222959acf6c357c3ebe8" iface="eth0" netns="/var/run/netns/cni-877b87a7-1ce6-eee7-7382-75e4bf456605" Mar 7 01:11:24.818806 containerd[1517]: 2026-03-07 01:11:24.703 [INFO][3736] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="9a6c845a3e67aa0710f9cc63e66d4fad7ce9935a0c1d222959acf6c357c3ebe8" iface="eth0" netns="/var/run/netns/cni-877b87a7-1ce6-eee7-7382-75e4bf456605" Mar 7 01:11:24.818806 containerd[1517]: 2026-03-07 01:11:24.704 [INFO][3736] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. 
ContainerID="9a6c845a3e67aa0710f9cc63e66d4fad7ce9935a0c1d222959acf6c357c3ebe8" iface="eth0" netns="/var/run/netns/cni-877b87a7-1ce6-eee7-7382-75e4bf456605" Mar 7 01:11:24.818806 containerd[1517]: 2026-03-07 01:11:24.704 [INFO][3736] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="9a6c845a3e67aa0710f9cc63e66d4fad7ce9935a0c1d222959acf6c357c3ebe8" Mar 7 01:11:24.818806 containerd[1517]: 2026-03-07 01:11:24.704 [INFO][3736] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="9a6c845a3e67aa0710f9cc63e66d4fad7ce9935a0c1d222959acf6c357c3ebe8" Mar 7 01:11:24.818806 containerd[1517]: 2026-03-07 01:11:24.784 [INFO][3780] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="9a6c845a3e67aa0710f9cc63e66d4fad7ce9935a0c1d222959acf6c357c3ebe8" HandleID="k8s-pod-network.9a6c845a3e67aa0710f9cc63e66d4fad7ce9935a0c1d222959acf6c357c3ebe8" Workload="ci--4081--3--6--n--e40d23dcbc-k8s-coredns--674b8bbfcf--2fcnt-eth0" Mar 7 01:11:24.818806 containerd[1517]: 2026-03-07 01:11:24.785 [INFO][3780] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 7 01:11:24.818806 containerd[1517]: 2026-03-07 01:11:24.802 [INFO][3780] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 7 01:11:24.818806 containerd[1517]: 2026-03-07 01:11:24.812 [WARNING][3780] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="9a6c845a3e67aa0710f9cc63e66d4fad7ce9935a0c1d222959acf6c357c3ebe8" HandleID="k8s-pod-network.9a6c845a3e67aa0710f9cc63e66d4fad7ce9935a0c1d222959acf6c357c3ebe8" Workload="ci--4081--3--6--n--e40d23dcbc-k8s-coredns--674b8bbfcf--2fcnt-eth0" Mar 7 01:11:24.818806 containerd[1517]: 2026-03-07 01:11:24.812 [INFO][3780] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="9a6c845a3e67aa0710f9cc63e66d4fad7ce9935a0c1d222959acf6c357c3ebe8" HandleID="k8s-pod-network.9a6c845a3e67aa0710f9cc63e66d4fad7ce9935a0c1d222959acf6c357c3ebe8" Workload="ci--4081--3--6--n--e40d23dcbc-k8s-coredns--674b8bbfcf--2fcnt-eth0" Mar 7 01:11:24.818806 containerd[1517]: 2026-03-07 01:11:24.813 [INFO][3780] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 7 01:11:24.818806 containerd[1517]: 2026-03-07 01:11:24.816 [INFO][3736] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="9a6c845a3e67aa0710f9cc63e66d4fad7ce9935a0c1d222959acf6c357c3ebe8" Mar 7 01:11:24.821299 containerd[1517]: time="2026-03-07T01:11:24.821265417Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-2fcnt,Uid:9a59049d-3a86-490e-8990-b534f38c98ed,Namespace:kube-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"9a6c845a3e67aa0710f9cc63e66d4fad7ce9935a0c1d222959acf6c357c3ebe8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 01:11:24.821857 kubelet[2578]: E0307 01:11:24.821827 2578 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9a6c845a3e67aa0710f9cc63e66d4fad7ce9935a0c1d222959acf6c357c3ebe8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 7 01:11:24.821912 
kubelet[2578]: E0307 01:11:24.821869 2578 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9a6c845a3e67aa0710f9cc63e66d4fad7ce9935a0c1d222959acf6c357c3ebe8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-2fcnt" Mar 7 01:11:24.821912 kubelet[2578]: E0307 01:11:24.821898 2578 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9a6c845a3e67aa0710f9cc63e66d4fad7ce9935a0c1d222959acf6c357c3ebe8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-2fcnt" Mar 7 01:11:24.821987 kubelet[2578]: E0307 01:11:24.821942 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-674b8bbfcf-2fcnt_kube-system(9a59049d-3a86-490e-8990-b534f38c98ed)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-674b8bbfcf-2fcnt_kube-system(9a59049d-3a86-490e-8990-b534f38c98ed)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"9a6c845a3e67aa0710f9cc63e66d4fad7ce9935a0c1d222959acf6c357c3ebe8\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-674b8bbfcf-2fcnt" podUID="9a59049d-3a86-490e-8990-b534f38c98ed" Mar 7 01:11:25.277778 kubelet[2578]: I0307 01:11:25.277665 2578 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aa36742035b6c7029f45e2175197ca8cc331b0bc97047ac8dc523d7087a38d21" Mar 7 01:11:25.282674 
containerd[1517]: time="2026-03-07T01:11:25.280895433Z" level=info msg="StopPodSandbox for \"aa36742035b6c7029f45e2175197ca8cc331b0bc97047ac8dc523d7087a38d21\"" Mar 7 01:11:25.282674 containerd[1517]: time="2026-03-07T01:11:25.281568383Z" level=info msg="Ensure that sandbox aa36742035b6c7029f45e2175197ca8cc331b0bc97047ac8dc523d7087a38d21 in task-service has been cleanup successfully" Mar 7 01:11:25.282900 kubelet[2578]: I0307 01:11:25.281306 2578 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7550b93c1236f21cc3534f0954099343d94287df49b186bec307d38d9a388211" Mar 7 01:11:25.288441 containerd[1517]: time="2026-03-07T01:11:25.285767853Z" level=info msg="StopPodSandbox for \"7550b93c1236f21cc3534f0954099343d94287df49b186bec307d38d9a388211\"" Mar 7 01:11:25.288441 containerd[1517]: time="2026-03-07T01:11:25.286042270Z" level=info msg="Ensure that sandbox 7550b93c1236f21cc3534f0954099343d94287df49b186bec307d38d9a388211 in task-service has been cleanup successfully" Mar 7 01:11:25.295359 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-7550b93c1236f21cc3534f0954099343d94287df49b186bec307d38d9a388211-shm.mount: Deactivated successfully. 
Mar 7 01:11:25.302906 kubelet[2578]: I0307 01:11:25.301338 2578 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6d620b985993c3b8720330d88f70b48ec7711327345e0569e0e2eb0531b2e089" Mar 7 01:11:25.303068 containerd[1517]: time="2026-03-07T01:11:25.302823191Z" level=info msg="StopPodSandbox for \"6d620b985993c3b8720330d88f70b48ec7711327345e0569e0e2eb0531b2e089\"" Mar 7 01:11:25.303733 containerd[1517]: time="2026-03-07T01:11:25.303331038Z" level=info msg="Ensure that sandbox 6d620b985993c3b8720330d88f70b48ec7711327345e0569e0e2eb0531b2e089 in task-service has been cleanup successfully" Mar 7 01:11:25.317147 containerd[1517]: time="2026-03-07T01:11:25.317046487Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-5b85766d88-8nxm8,Uid:9a2395af-edcb-49ca-a47a-8c642cac381d,Namespace:calico-system,Attempt:0,}" Mar 7 01:11:25.318446 containerd[1517]: time="2026-03-07T01:11:25.318086090Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-cf679fccd-lv8s8,Uid:c6423fb5-e2f2-4a85-962c-3faf4ee6a34d,Namespace:calico-system,Attempt:0,}" Mar 7 01:11:25.329305 containerd[1517]: time="2026-03-07T01:11:25.327960183Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-2fcnt,Uid:9a59049d-3a86-490e-8990-b534f38c98ed,Namespace:kube-system,Attempt:0,}" Mar 7 01:11:25.329305 containerd[1517]: time="2026-03-07T01:11:25.328272480Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7cdd895f86-pmmf8,Uid:4298d66b-eb58-4358-8b7f-a00b4a5de60e,Namespace:calico-system,Attempt:0,}" Mar 7 01:11:25.348443 kubelet[2578]: I0307 01:11:25.347175 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-vtmpc" podStartSLOduration=3.219008001 podStartE2EDuration="28.347161844s" podCreationTimestamp="2026-03-07 01:10:57 +0000 UTC" firstStartedPulling="2026-03-07 01:10:58.137725064 +0000 UTC m=+16.062692115" 
lastFinishedPulling="2026-03-07 01:11:23.265878917 +0000 UTC m=+41.190845958" observedRunningTime="2026-03-07 01:11:25.346438028 +0000 UTC m=+43.271405079" watchObservedRunningTime="2026-03-07 01:11:25.347161844 +0000 UTC m=+43.272128885" Mar 7 01:11:25.466294 kubelet[2578]: I0307 01:11:25.466265 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/32e149ae-638d-489c-a5af-ccd7aafddaba-whisker-backend-key-pair\") pod \"32e149ae-638d-489c-a5af-ccd7aafddaba\" (UID: \"32e149ae-638d-489c-a5af-ccd7aafddaba\") " Mar 7 01:11:25.466539 kubelet[2578]: I0307 01:11:25.466526 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"nginx-config\" (UniqueName: \"kubernetes.io/configmap/32e149ae-638d-489c-a5af-ccd7aafddaba-nginx-config\") pod \"32e149ae-638d-489c-a5af-ccd7aafddaba\" (UID: \"32e149ae-638d-489c-a5af-ccd7aafddaba\") " Mar 7 01:11:25.466615 kubelet[2578]: I0307 01:11:25.466605 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/32e149ae-638d-489c-a5af-ccd7aafddaba-whisker-ca-bundle\") pod \"32e149ae-638d-489c-a5af-ccd7aafddaba\" (UID: \"32e149ae-638d-489c-a5af-ccd7aafddaba\") " Mar 7 01:11:25.466668 kubelet[2578]: I0307 01:11:25.466661 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-65nz6\" (UniqueName: \"kubernetes.io/projected/32e149ae-638d-489c-a5af-ccd7aafddaba-kube-api-access-65nz6\") pod \"32e149ae-638d-489c-a5af-ccd7aafddaba\" (UID: \"32e149ae-638d-489c-a5af-ccd7aafddaba\") " Mar 7 01:11:25.474676 kubelet[2578]: I0307 01:11:25.474640 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/32e149ae-638d-489c-a5af-ccd7aafddaba-nginx-config" (OuterVolumeSpecName: "nginx-config") pod "32e149ae-638d-489c-a5af-ccd7aafddaba" (UID: 
"32e149ae-638d-489c-a5af-ccd7aafddaba"). InnerVolumeSpecName "nginx-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 7 01:11:25.476534 kubelet[2578]: I0307 01:11:25.474802 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/32e149ae-638d-489c-a5af-ccd7aafddaba-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "32e149ae-638d-489c-a5af-ccd7aafddaba" (UID: "32e149ae-638d-489c-a5af-ccd7aafddaba"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 7 01:11:25.477559 kubelet[2578]: I0307 01:11:25.477536 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32e149ae-638d-489c-a5af-ccd7aafddaba-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "32e149ae-638d-489c-a5af-ccd7aafddaba" (UID: "32e149ae-638d-489c-a5af-ccd7aafddaba"). InnerVolumeSpecName "whisker-backend-key-pair". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 7 01:11:25.479438 kubelet[2578]: I0307 01:11:25.478155 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32e149ae-638d-489c-a5af-ccd7aafddaba-kube-api-access-65nz6" (OuterVolumeSpecName: "kube-api-access-65nz6") pod "32e149ae-638d-489c-a5af-ccd7aafddaba" (UID: "32e149ae-638d-489c-a5af-ccd7aafddaba"). InnerVolumeSpecName "kube-api-access-65nz6". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 7 01:11:25.567763 kubelet[2578]: I0307 01:11:25.567731 2578 reconciler_common.go:299] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/32e149ae-638d-489c-a5af-ccd7aafddaba-whisker-backend-key-pair\") on node \"ci-4081-3-6-n-e40d23dcbc\" DevicePath \"\"" Mar 7 01:11:25.568231 kubelet[2578]: I0307 01:11:25.568176 2578 reconciler_common.go:299] "Volume detached for volume \"nginx-config\" (UniqueName: \"kubernetes.io/configmap/32e149ae-638d-489c-a5af-ccd7aafddaba-nginx-config\") on node \"ci-4081-3-6-n-e40d23dcbc\" DevicePath \"\"" Mar 7 01:11:25.568231 kubelet[2578]: I0307 01:11:25.568189 2578 reconciler_common.go:299] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/32e149ae-638d-489c-a5af-ccd7aafddaba-whisker-ca-bundle\") on node \"ci-4081-3-6-n-e40d23dcbc\" DevicePath \"\"" Mar 7 01:11:25.568231 kubelet[2578]: I0307 01:11:25.568198 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-65nz6\" (UniqueName: \"kubernetes.io/projected/32e149ae-638d-489c-a5af-ccd7aafddaba-kube-api-access-65nz6\") on node \"ci-4081-3-6-n-e40d23dcbc\" DevicePath \"\"" Mar 7 01:11:25.586995 containerd[1517]: 2026-03-07 01:11:25.490 [INFO][3838] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="6d620b985993c3b8720330d88f70b48ec7711327345e0569e0e2eb0531b2e089" Mar 7 01:11:25.586995 containerd[1517]: 2026-03-07 01:11:25.490 [INFO][3838] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="6d620b985993c3b8720330d88f70b48ec7711327345e0569e0e2eb0531b2e089" iface="eth0" netns="/var/run/netns/cni-37e85831-dac9-f9cb-f2d1-22751a19915e" Mar 7 01:11:25.586995 containerd[1517]: 2026-03-07 01:11:25.490 [INFO][3838] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. 
ContainerID="6d620b985993c3b8720330d88f70b48ec7711327345e0569e0e2eb0531b2e089" iface="eth0" netns="/var/run/netns/cni-37e85831-dac9-f9cb-f2d1-22751a19915e" Mar 7 01:11:25.586995 containerd[1517]: 2026-03-07 01:11:25.493 [INFO][3838] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. ContainerID="6d620b985993c3b8720330d88f70b48ec7711327345e0569e0e2eb0531b2e089" iface="eth0" netns="/var/run/netns/cni-37e85831-dac9-f9cb-f2d1-22751a19915e" Mar 7 01:11:25.586995 containerd[1517]: 2026-03-07 01:11:25.494 [INFO][3838] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="6d620b985993c3b8720330d88f70b48ec7711327345e0569e0e2eb0531b2e089" Mar 7 01:11:25.586995 containerd[1517]: 2026-03-07 01:11:25.494 [INFO][3838] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="6d620b985993c3b8720330d88f70b48ec7711327345e0569e0e2eb0531b2e089" Mar 7 01:11:25.586995 containerd[1517]: 2026-03-07 01:11:25.558 [INFO][3934] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="6d620b985993c3b8720330d88f70b48ec7711327345e0569e0e2eb0531b2e089" HandleID="k8s-pod-network.6d620b985993c3b8720330d88f70b48ec7711327345e0569e0e2eb0531b2e089" Workload="ci--4081--3--6--n--e40d23dcbc-k8s-csi--node--driver--gpr4h-eth0" Mar 7 01:11:25.586995 containerd[1517]: 2026-03-07 01:11:25.560 [INFO][3934] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 7 01:11:25.586995 containerd[1517]: 2026-03-07 01:11:25.561 [INFO][3934] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 7 01:11:25.586995 containerd[1517]: 2026-03-07 01:11:25.569 [WARNING][3934] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="6d620b985993c3b8720330d88f70b48ec7711327345e0569e0e2eb0531b2e089" HandleID="k8s-pod-network.6d620b985993c3b8720330d88f70b48ec7711327345e0569e0e2eb0531b2e089" Workload="ci--4081--3--6--n--e40d23dcbc-k8s-csi--node--driver--gpr4h-eth0" Mar 7 01:11:25.586995 containerd[1517]: 2026-03-07 01:11:25.569 [INFO][3934] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="6d620b985993c3b8720330d88f70b48ec7711327345e0569e0e2eb0531b2e089" HandleID="k8s-pod-network.6d620b985993c3b8720330d88f70b48ec7711327345e0569e0e2eb0531b2e089" Workload="ci--4081--3--6--n--e40d23dcbc-k8s-csi--node--driver--gpr4h-eth0" Mar 7 01:11:25.586995 containerd[1517]: 2026-03-07 01:11:25.573 [INFO][3934] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 7 01:11:25.586995 containerd[1517]: 2026-03-07 01:11:25.580 [INFO][3838] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="6d620b985993c3b8720330d88f70b48ec7711327345e0569e0e2eb0531b2e089" Mar 7 01:11:25.587687 containerd[1517]: time="2026-03-07T01:11:25.587170052Z" level=info msg="TearDown network for sandbox \"6d620b985993c3b8720330d88f70b48ec7711327345e0569e0e2eb0531b2e089\" successfully" Mar 7 01:11:25.587687 containerd[1517]: time="2026-03-07T01:11:25.587217510Z" level=info msg="StopPodSandbox for \"6d620b985993c3b8720330d88f70b48ec7711327345e0569e0e2eb0531b2e089\" returns successfully" Mar 7 01:11:25.588324 containerd[1517]: time="2026-03-07T01:11:25.588079542Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-gpr4h,Uid:b2c4739c-e360-4c95-9680-eefc754cc98b,Namespace:calico-system,Attempt:1,}" Mar 7 01:11:25.629723 containerd[1517]: 2026-03-07 01:11:25.465 [INFO][3840] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="7550b93c1236f21cc3534f0954099343d94287df49b186bec307d38d9a388211" Mar 7 01:11:25.629723 containerd[1517]: 2026-03-07 01:11:25.465 [INFO][3840] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. 
ContainerID="7550b93c1236f21cc3534f0954099343d94287df49b186bec307d38d9a388211" iface="eth0" netns="/var/run/netns/cni-d5186757-4b20-b16c-9b2c-544da04a70bb" Mar 7 01:11:25.629723 containerd[1517]: 2026-03-07 01:11:25.466 [INFO][3840] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="7550b93c1236f21cc3534f0954099343d94287df49b186bec307d38d9a388211" iface="eth0" netns="/var/run/netns/cni-d5186757-4b20-b16c-9b2c-544da04a70bb" Mar 7 01:11:25.629723 containerd[1517]: 2026-03-07 01:11:25.466 [INFO][3840] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. ContainerID="7550b93c1236f21cc3534f0954099343d94287df49b186bec307d38d9a388211" iface="eth0" netns="/var/run/netns/cni-d5186757-4b20-b16c-9b2c-544da04a70bb" Mar 7 01:11:25.629723 containerd[1517]: 2026-03-07 01:11:25.466 [INFO][3840] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="7550b93c1236f21cc3534f0954099343d94287df49b186bec307d38d9a388211" Mar 7 01:11:25.629723 containerd[1517]: 2026-03-07 01:11:25.466 [INFO][3840] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="7550b93c1236f21cc3534f0954099343d94287df49b186bec307d38d9a388211" Mar 7 01:11:25.629723 containerd[1517]: 2026-03-07 01:11:25.596 [INFO][3926] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="7550b93c1236f21cc3534f0954099343d94287df49b186bec307d38d9a388211" HandleID="k8s-pod-network.7550b93c1236f21cc3534f0954099343d94287df49b186bec307d38d9a388211" Workload="ci--4081--3--6--n--e40d23dcbc-k8s-coredns--674b8bbfcf--2g6xd-eth0" Mar 7 01:11:25.629723 containerd[1517]: 2026-03-07 01:11:25.596 [INFO][3926] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 7 01:11:25.629723 containerd[1517]: 2026-03-07 01:11:25.596 [INFO][3926] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 7 01:11:25.629723 containerd[1517]: 2026-03-07 01:11:25.610 [WARNING][3926] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="7550b93c1236f21cc3534f0954099343d94287df49b186bec307d38d9a388211" HandleID="k8s-pod-network.7550b93c1236f21cc3534f0954099343d94287df49b186bec307d38d9a388211" Workload="ci--4081--3--6--n--e40d23dcbc-k8s-coredns--674b8bbfcf--2g6xd-eth0" Mar 7 01:11:25.629723 containerd[1517]: 2026-03-07 01:11:25.610 [INFO][3926] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="7550b93c1236f21cc3534f0954099343d94287df49b186bec307d38d9a388211" HandleID="k8s-pod-network.7550b93c1236f21cc3534f0954099343d94287df49b186bec307d38d9a388211" Workload="ci--4081--3--6--n--e40d23dcbc-k8s-coredns--674b8bbfcf--2g6xd-eth0" Mar 7 01:11:25.629723 containerd[1517]: 2026-03-07 01:11:25.615 [INFO][3926] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 7 01:11:25.629723 containerd[1517]: 2026-03-07 01:11:25.623 [INFO][3840] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="7550b93c1236f21cc3534f0954099343d94287df49b186bec307d38d9a388211" Mar 7 01:11:25.630375 containerd[1517]: time="2026-03-07T01:11:25.629871750Z" level=info msg="TearDown network for sandbox \"7550b93c1236f21cc3534f0954099343d94287df49b186bec307d38d9a388211\" successfully" Mar 7 01:11:25.630375 containerd[1517]: time="2026-03-07T01:11:25.629892089Z" level=info msg="StopPodSandbox for \"7550b93c1236f21cc3534f0954099343d94287df49b186bec307d38d9a388211\" returns successfully" Mar 7 01:11:25.630650 containerd[1517]: time="2026-03-07T01:11:25.630580778Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-2g6xd,Uid:9e84ea27-025b-43b3-a817-46d96e7f19b0,Namespace:kube-system,Attempt:1,}" Mar 7 01:11:25.649592 containerd[1517]: 2026-03-07 01:11:25.448 [INFO][3839] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="aa36742035b6c7029f45e2175197ca8cc331b0bc97047ac8dc523d7087a38d21" Mar 7 01:11:25.649592 containerd[1517]: 2026-03-07 01:11:25.457 [INFO][3839] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. 
ContainerID="aa36742035b6c7029f45e2175197ca8cc331b0bc97047ac8dc523d7087a38d21" iface="eth0" netns="/var/run/netns/cni-98a3ccb3-30c5-eef3-0a50-e84fd42273f1" Mar 7 01:11:25.649592 containerd[1517]: 2026-03-07 01:11:25.458 [INFO][3839] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="aa36742035b6c7029f45e2175197ca8cc331b0bc97047ac8dc523d7087a38d21" iface="eth0" netns="/var/run/netns/cni-98a3ccb3-30c5-eef3-0a50-e84fd42273f1" Mar 7 01:11:25.649592 containerd[1517]: 2026-03-07 01:11:25.459 [INFO][3839] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. ContainerID="aa36742035b6c7029f45e2175197ca8cc331b0bc97047ac8dc523d7087a38d21" iface="eth0" netns="/var/run/netns/cni-98a3ccb3-30c5-eef3-0a50-e84fd42273f1" Mar 7 01:11:25.649592 containerd[1517]: 2026-03-07 01:11:25.459 [INFO][3839] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="aa36742035b6c7029f45e2175197ca8cc331b0bc97047ac8dc523d7087a38d21" Mar 7 01:11:25.649592 containerd[1517]: 2026-03-07 01:11:25.459 [INFO][3839] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="aa36742035b6c7029f45e2175197ca8cc331b0bc97047ac8dc523d7087a38d21" Mar 7 01:11:25.649592 containerd[1517]: 2026-03-07 01:11:25.598 [INFO][3920] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="aa36742035b6c7029f45e2175197ca8cc331b0bc97047ac8dc523d7087a38d21" HandleID="k8s-pod-network.aa36742035b6c7029f45e2175197ca8cc331b0bc97047ac8dc523d7087a38d21" Workload="ci--4081--3--6--n--e40d23dcbc-k8s-calico--apiserver--7cdd895f86--5kmtj-eth0" Mar 7 01:11:25.649592 containerd[1517]: 2026-03-07 01:11:25.598 [INFO][3920] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 7 01:11:25.649592 containerd[1517]: 2026-03-07 01:11:25.618 [INFO][3920] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Mar 7 01:11:25.649592 containerd[1517]: 2026-03-07 01:11:25.630 [WARNING][3920] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. Ignoring ContainerID="aa36742035b6c7029f45e2175197ca8cc331b0bc97047ac8dc523d7087a38d21" HandleID="k8s-pod-network.aa36742035b6c7029f45e2175197ca8cc331b0bc97047ac8dc523d7087a38d21" Workload="ci--4081--3--6--n--e40d23dcbc-k8s-calico--apiserver--7cdd895f86--5kmtj-eth0" Mar 7 01:11:25.649592 containerd[1517]: 2026-03-07 01:11:25.630 [INFO][3920] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="aa36742035b6c7029f45e2175197ca8cc331b0bc97047ac8dc523d7087a38d21" HandleID="k8s-pod-network.aa36742035b6c7029f45e2175197ca8cc331b0bc97047ac8dc523d7087a38d21" Workload="ci--4081--3--6--n--e40d23dcbc-k8s-calico--apiserver--7cdd895f86--5kmtj-eth0" Mar 7 01:11:25.649592 containerd[1517]: 2026-03-07 01:11:25.633 [INFO][3920] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 7 01:11:25.649592 containerd[1517]: 2026-03-07 01:11:25.643 [INFO][3839] cni-plugin/k8s.go 665: Teardown processing complete. 
ContainerID="aa36742035b6c7029f45e2175197ca8cc331b0bc97047ac8dc523d7087a38d21" Mar 7 01:11:25.650714 containerd[1517]: time="2026-03-07T01:11:25.650197511Z" level=info msg="TearDown network for sandbox \"aa36742035b6c7029f45e2175197ca8cc331b0bc97047ac8dc523d7087a38d21\" successfully" Mar 7 01:11:25.650714 containerd[1517]: time="2026-03-07T01:11:25.650612121Z" level=info msg="StopPodSandbox for \"aa36742035b6c7029f45e2175197ca8cc331b0bc97047ac8dc523d7087a38d21\" returns successfully" Mar 7 01:11:25.652285 containerd[1517]: time="2026-03-07T01:11:25.652251207Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7cdd895f86-5kmtj,Uid:dd5d2b91-08a1-4e2b-97af-8acbc0f68c75,Namespace:calico-system,Attempt:1,}" Mar 7 01:11:25.713871 systemd-networkd[1392]: calif3629baf55b: Link UP Mar 7 01:11:25.718931 systemd-networkd[1392]: calif3629baf55b: Gained carrier Mar 7 01:11:25.753347 containerd[1517]: 2026-03-07 01:11:25.453 [ERROR][3881] cni-plugin/utils.go 116: File does not exist, skipping the error since RequireMTUFile is false error=open /var/lib/calico/mtu: no such file or directory filename="/var/lib/calico/mtu" Mar 7 01:11:25.753347 containerd[1517]: 2026-03-07 01:11:25.489 [INFO][3881] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4081--3--6--n--e40d23dcbc-k8s-calico--kube--controllers--cf679fccd--lv8s8-eth0 calico-kube-controllers-cf679fccd- calico-system c6423fb5-e2f2-4a85-962c-3faf4ee6a34d 873 0 2026-03-07 01:10:57 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:cf679fccd projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ci-4081-3-6-n-e40d23dcbc calico-kube-controllers-cf679fccd-lv8s8 eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] calif3629baf55b [] [] }} 
ContainerID="febbd0769d9169deac964b48da2739506b9b90adbf8b43d4485fee546aa210c1" Namespace="calico-system" Pod="calico-kube-controllers-cf679fccd-lv8s8" WorkloadEndpoint="ci--4081--3--6--n--e40d23dcbc-k8s-calico--kube--controllers--cf679fccd--lv8s8-" Mar 7 01:11:25.753347 containerd[1517]: 2026-03-07 01:11:25.489 [INFO][3881] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="febbd0769d9169deac964b48da2739506b9b90adbf8b43d4485fee546aa210c1" Namespace="calico-system" Pod="calico-kube-controllers-cf679fccd-lv8s8" WorkloadEndpoint="ci--4081--3--6--n--e40d23dcbc-k8s-calico--kube--controllers--cf679fccd--lv8s8-eth0" Mar 7 01:11:25.753347 containerd[1517]: 2026-03-07 01:11:25.593 [INFO][3939] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="febbd0769d9169deac964b48da2739506b9b90adbf8b43d4485fee546aa210c1" HandleID="k8s-pod-network.febbd0769d9169deac964b48da2739506b9b90adbf8b43d4485fee546aa210c1" Workload="ci--4081--3--6--n--e40d23dcbc-k8s-calico--kube--controllers--cf679fccd--lv8s8-eth0" Mar 7 01:11:25.753347 containerd[1517]: 2026-03-07 01:11:25.602 [INFO][3939] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="febbd0769d9169deac964b48da2739506b9b90adbf8b43d4485fee546aa210c1" HandleID="k8s-pod-network.febbd0769d9169deac964b48da2739506b9b90adbf8b43d4485fee546aa210c1" Workload="ci--4081--3--6--n--e40d23dcbc-k8s-calico--kube--controllers--cf679fccd--lv8s8-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002fbe20), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4081-3-6-n-e40d23dcbc", "pod":"calico-kube-controllers-cf679fccd-lv8s8", "timestamp":"2026-03-07 01:11:25.593127873 +0000 UTC"}, Hostname:"ci-4081-3-6-n-e40d23dcbc", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0xc0004e6160)} Mar 7 
01:11:25.753347 containerd[1517]: 2026-03-07 01:11:25.602 [INFO][3939] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 7 01:11:25.753347 containerd[1517]: 2026-03-07 01:11:25.635 [INFO][3939] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 7 01:11:25.753347 containerd[1517]: 2026-03-07 01:11:25.636 [INFO][3939] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4081-3-6-n-e40d23dcbc' Mar 7 01:11:25.753347 containerd[1517]: 2026-03-07 01:11:25.643 [INFO][3939] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.febbd0769d9169deac964b48da2739506b9b90adbf8b43d4485fee546aa210c1" host="ci-4081-3-6-n-e40d23dcbc" Mar 7 01:11:25.753347 containerd[1517]: 2026-03-07 01:11:25.655 [INFO][3939] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4081-3-6-n-e40d23dcbc" Mar 7 01:11:25.753347 containerd[1517]: 2026-03-07 01:11:25.668 [INFO][3939] ipam/ipam.go 526: Trying affinity for 192.168.95.64/26 host="ci-4081-3-6-n-e40d23dcbc" Mar 7 01:11:25.753347 containerd[1517]: 2026-03-07 01:11:25.671 [INFO][3939] ipam/ipam.go 160: Attempting to load block cidr=192.168.95.64/26 host="ci-4081-3-6-n-e40d23dcbc" Mar 7 01:11:25.753347 containerd[1517]: 2026-03-07 01:11:25.673 [INFO][3939] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.95.64/26 host="ci-4081-3-6-n-e40d23dcbc" Mar 7 01:11:25.753347 containerd[1517]: 2026-03-07 01:11:25.673 [INFO][3939] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.95.64/26 handle="k8s-pod-network.febbd0769d9169deac964b48da2739506b9b90adbf8b43d4485fee546aa210c1" host="ci-4081-3-6-n-e40d23dcbc" Mar 7 01:11:25.753347 containerd[1517]: 2026-03-07 01:11:25.675 [INFO][3939] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.febbd0769d9169deac964b48da2739506b9b90adbf8b43d4485fee546aa210c1 Mar 7 01:11:25.753347 containerd[1517]: 2026-03-07 01:11:25.679 [INFO][3939] ipam/ipam.go 1272: Writing block in order to claim 
IPs block=192.168.95.64/26 handle="k8s-pod-network.febbd0769d9169deac964b48da2739506b9b90adbf8b43d4485fee546aa210c1" host="ci-4081-3-6-n-e40d23dcbc" Mar 7 01:11:25.753347 containerd[1517]: 2026-03-07 01:11:25.685 [INFO][3939] ipam/ipam.go 1288: Successfully claimed IPs: [192.168.95.65/26] block=192.168.95.64/26 handle="k8s-pod-network.febbd0769d9169deac964b48da2739506b9b90adbf8b43d4485fee546aa210c1" host="ci-4081-3-6-n-e40d23dcbc" Mar 7 01:11:25.753347 containerd[1517]: 2026-03-07 01:11:25.685 [INFO][3939] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.95.65/26] handle="k8s-pod-network.febbd0769d9169deac964b48da2739506b9b90adbf8b43d4485fee546aa210c1" host="ci-4081-3-6-n-e40d23dcbc" Mar 7 01:11:25.753347 containerd[1517]: 2026-03-07 01:11:25.685 [INFO][3939] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 7 01:11:25.753347 containerd[1517]: 2026-03-07 01:11:25.685 [INFO][3939] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.95.65/26] IPv6=[] ContainerID="febbd0769d9169deac964b48da2739506b9b90adbf8b43d4485fee546aa210c1" HandleID="k8s-pod-network.febbd0769d9169deac964b48da2739506b9b90adbf8b43d4485fee546aa210c1" Workload="ci--4081--3--6--n--e40d23dcbc-k8s-calico--kube--controllers--cf679fccd--lv8s8-eth0" Mar 7 01:11:25.756550 containerd[1517]: 2026-03-07 01:11:25.694 [INFO][3881] cni-plugin/k8s.go 418: Populated endpoint ContainerID="febbd0769d9169deac964b48da2739506b9b90adbf8b43d4485fee546aa210c1" Namespace="calico-system" Pod="calico-kube-controllers-cf679fccd-lv8s8" WorkloadEndpoint="ci--4081--3--6--n--e40d23dcbc-k8s-calico--kube--controllers--cf679fccd--lv8s8-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--e40d23dcbc-k8s-calico--kube--controllers--cf679fccd--lv8s8-eth0", GenerateName:"calico-kube-controllers-cf679fccd-", Namespace:"calico-system", SelfLink:"", 
UID:"c6423fb5-e2f2-4a85-962c-3faf4ee6a34d", ResourceVersion:"873", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 1, 10, 57, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"cf679fccd", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-e40d23dcbc", ContainerID:"", Pod:"calico-kube-controllers-cf679fccd-lv8s8", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.95.65/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calif3629baf55b", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 01:11:25.756550 containerd[1517]: 2026-03-07 01:11:25.694 [INFO][3881] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.95.65/32] ContainerID="febbd0769d9169deac964b48da2739506b9b90adbf8b43d4485fee546aa210c1" Namespace="calico-system" Pod="calico-kube-controllers-cf679fccd-lv8s8" WorkloadEndpoint="ci--4081--3--6--n--e40d23dcbc-k8s-calico--kube--controllers--cf679fccd--lv8s8-eth0" Mar 7 01:11:25.756550 containerd[1517]: 2026-03-07 01:11:25.694 [INFO][3881] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calif3629baf55b ContainerID="febbd0769d9169deac964b48da2739506b9b90adbf8b43d4485fee546aa210c1" Namespace="calico-system" Pod="calico-kube-controllers-cf679fccd-lv8s8" 
WorkloadEndpoint="ci--4081--3--6--n--e40d23dcbc-k8s-calico--kube--controllers--cf679fccd--lv8s8-eth0" Mar 7 01:11:25.756550 containerd[1517]: 2026-03-07 01:11:25.724 [INFO][3881] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="febbd0769d9169deac964b48da2739506b9b90adbf8b43d4485fee546aa210c1" Namespace="calico-system" Pod="calico-kube-controllers-cf679fccd-lv8s8" WorkloadEndpoint="ci--4081--3--6--n--e40d23dcbc-k8s-calico--kube--controllers--cf679fccd--lv8s8-eth0" Mar 7 01:11:25.756550 containerd[1517]: 2026-03-07 01:11:25.726 [INFO][3881] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="febbd0769d9169deac964b48da2739506b9b90adbf8b43d4485fee546aa210c1" Namespace="calico-system" Pod="calico-kube-controllers-cf679fccd-lv8s8" WorkloadEndpoint="ci--4081--3--6--n--e40d23dcbc-k8s-calico--kube--controllers--cf679fccd--lv8s8-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--e40d23dcbc-k8s-calico--kube--controllers--cf679fccd--lv8s8-eth0", GenerateName:"calico-kube-controllers-cf679fccd-", Namespace:"calico-system", SelfLink:"", UID:"c6423fb5-e2f2-4a85-962c-3faf4ee6a34d", ResourceVersion:"873", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 1, 10, 57, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"cf679fccd", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-e40d23dcbc", 
ContainerID:"febbd0769d9169deac964b48da2739506b9b90adbf8b43d4485fee546aa210c1", Pod:"calico-kube-controllers-cf679fccd-lv8s8", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.95.65/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calif3629baf55b", MAC:"3e:a8:ee:3a:26:77", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 01:11:25.756550 containerd[1517]: 2026-03-07 01:11:25.741 [INFO][3881] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="febbd0769d9169deac964b48da2739506b9b90adbf8b43d4485fee546aa210c1" Namespace="calico-system" Pod="calico-kube-controllers-cf679fccd-lv8s8" WorkloadEndpoint="ci--4081--3--6--n--e40d23dcbc-k8s-calico--kube--controllers--cf679fccd--lv8s8-eth0" Mar 7 01:11:25.797649 systemd-networkd[1392]: cali8b5458b32a4: Link UP Mar 7 01:11:25.798836 systemd-networkd[1392]: cali8b5458b32a4: Gained carrier Mar 7 01:11:25.813852 containerd[1517]: time="2026-03-07T01:11:25.811570677Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 7 01:11:25.813852 containerd[1517]: time="2026-03-07T01:11:25.811696641Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 7 01:11:25.813852 containerd[1517]: time="2026-03-07T01:11:25.811727720Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 7 01:11:25.813852 containerd[1517]: time="2026-03-07T01:11:25.811873193Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 7 01:11:25.832542 systemd[1]: Started cri-containerd-febbd0769d9169deac964b48da2739506b9b90adbf8b43d4485fee546aa210c1.scope - libcontainer container febbd0769d9169deac964b48da2739506b9b90adbf8b43d4485fee546aa210c1. Mar 7 01:11:25.838422 containerd[1517]: 2026-03-07 01:11:25.523 [ERROR][3865] cni-plugin/utils.go 116: File does not exist, skipping the error since RequireMTUFile is false error=open /var/lib/calico/mtu: no such file or directory filename="/var/lib/calico/mtu" Mar 7 01:11:25.838422 containerd[1517]: 2026-03-07 01:11:25.551 [INFO][3865] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4081--3--6--n--e40d23dcbc-k8s-coredns--674b8bbfcf--2fcnt-eth0 coredns-674b8bbfcf- kube-system 9a59049d-3a86-490e-8990-b534f38c98ed 877 0 2026-03-07 01:10:47 +0000 UTC map[k8s-app:kube-dns pod-template-hash:674b8bbfcf projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4081-3-6-n-e40d23dcbc coredns-674b8bbfcf-2fcnt eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali8b5458b32a4 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="a2f022f6ac642ad8d23ad1a9dd70fddb4ce2e3e4b2fff712794eae753777718c" Namespace="kube-system" Pod="coredns-674b8bbfcf-2fcnt" WorkloadEndpoint="ci--4081--3--6--n--e40d23dcbc-k8s-coredns--674b8bbfcf--2fcnt-" Mar 7 01:11:25.838422 containerd[1517]: 2026-03-07 01:11:25.551 [INFO][3865] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="a2f022f6ac642ad8d23ad1a9dd70fddb4ce2e3e4b2fff712794eae753777718c" Namespace="kube-system" Pod="coredns-674b8bbfcf-2fcnt" WorkloadEndpoint="ci--4081--3--6--n--e40d23dcbc-k8s-coredns--674b8bbfcf--2fcnt-eth0" Mar 7 01:11:25.838422 containerd[1517]: 2026-03-07 01:11:25.636 [INFO][3954] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 
ContainerID="a2f022f6ac642ad8d23ad1a9dd70fddb4ce2e3e4b2fff712794eae753777718c" HandleID="k8s-pod-network.a2f022f6ac642ad8d23ad1a9dd70fddb4ce2e3e4b2fff712794eae753777718c" Workload="ci--4081--3--6--n--e40d23dcbc-k8s-coredns--674b8bbfcf--2fcnt-eth0" Mar 7 01:11:25.838422 containerd[1517]: 2026-03-07 01:11:25.642 [INFO][3954] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="a2f022f6ac642ad8d23ad1a9dd70fddb4ce2e3e4b2fff712794eae753777718c" HandleID="k8s-pod-network.a2f022f6ac642ad8d23ad1a9dd70fddb4ce2e3e4b2fff712794eae753777718c" Workload="ci--4081--3--6--n--e40d23dcbc-k8s-coredns--674b8bbfcf--2fcnt-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00030be90), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4081-3-6-n-e40d23dcbc", "pod":"coredns-674b8bbfcf-2fcnt", "timestamp":"2026-03-07 01:11:25.636712051 +0000 UTC"}, Hostname:"ci-4081-3-6-n-e40d23dcbc", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0xc000114dc0)} Mar 7 01:11:25.838422 containerd[1517]: 2026-03-07 01:11:25.642 [INFO][3954] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 7 01:11:25.838422 containerd[1517]: 2026-03-07 01:11:25.685 [INFO][3954] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Mar 7 01:11:25.838422 containerd[1517]: 2026-03-07 01:11:25.685 [INFO][3954] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4081-3-6-n-e40d23dcbc' Mar 7 01:11:25.838422 containerd[1517]: 2026-03-07 01:11:25.743 [INFO][3954] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.a2f022f6ac642ad8d23ad1a9dd70fddb4ce2e3e4b2fff712794eae753777718c" host="ci-4081-3-6-n-e40d23dcbc" Mar 7 01:11:25.838422 containerd[1517]: 2026-03-07 01:11:25.750 [INFO][3954] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4081-3-6-n-e40d23dcbc" Mar 7 01:11:25.838422 containerd[1517]: 2026-03-07 01:11:25.763 [INFO][3954] ipam/ipam.go 526: Trying affinity for 192.168.95.64/26 host="ci-4081-3-6-n-e40d23dcbc" Mar 7 01:11:25.838422 containerd[1517]: 2026-03-07 01:11:25.765 [INFO][3954] ipam/ipam.go 160: Attempting to load block cidr=192.168.95.64/26 host="ci-4081-3-6-n-e40d23dcbc" Mar 7 01:11:25.838422 containerd[1517]: 2026-03-07 01:11:25.766 [INFO][3954] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.95.64/26 host="ci-4081-3-6-n-e40d23dcbc" Mar 7 01:11:25.838422 containerd[1517]: 2026-03-07 01:11:25.766 [INFO][3954] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.95.64/26 handle="k8s-pod-network.a2f022f6ac642ad8d23ad1a9dd70fddb4ce2e3e4b2fff712794eae753777718c" host="ci-4081-3-6-n-e40d23dcbc" Mar 7 01:11:25.838422 containerd[1517]: 2026-03-07 01:11:25.767 [INFO][3954] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.a2f022f6ac642ad8d23ad1a9dd70fddb4ce2e3e4b2fff712794eae753777718c Mar 7 01:11:25.838422 containerd[1517]: 2026-03-07 01:11:25.771 [INFO][3954] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.95.64/26 handle="k8s-pod-network.a2f022f6ac642ad8d23ad1a9dd70fddb4ce2e3e4b2fff712794eae753777718c" host="ci-4081-3-6-n-e40d23dcbc" Mar 7 01:11:25.838422 containerd[1517]: 2026-03-07 01:11:25.788 [INFO][3954] ipam/ipam.go 1288: Successfully claimed 
IPs: [192.168.95.66/26] block=192.168.95.64/26 handle="k8s-pod-network.a2f022f6ac642ad8d23ad1a9dd70fddb4ce2e3e4b2fff712794eae753777718c" host="ci-4081-3-6-n-e40d23dcbc" Mar 7 01:11:25.838422 containerd[1517]: 2026-03-07 01:11:25.788 [INFO][3954] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.95.66/26] handle="k8s-pod-network.a2f022f6ac642ad8d23ad1a9dd70fddb4ce2e3e4b2fff712794eae753777718c" host="ci-4081-3-6-n-e40d23dcbc" Mar 7 01:11:25.838422 containerd[1517]: 2026-03-07 01:11:25.788 [INFO][3954] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 7 01:11:25.838422 containerd[1517]: 2026-03-07 01:11:25.788 [INFO][3954] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.95.66/26] IPv6=[] ContainerID="a2f022f6ac642ad8d23ad1a9dd70fddb4ce2e3e4b2fff712794eae753777718c" HandleID="k8s-pod-network.a2f022f6ac642ad8d23ad1a9dd70fddb4ce2e3e4b2fff712794eae753777718c" Workload="ci--4081--3--6--n--e40d23dcbc-k8s-coredns--674b8bbfcf--2fcnt-eth0" Mar 7 01:11:25.838827 containerd[1517]: 2026-03-07 01:11:25.791 [INFO][3865] cni-plugin/k8s.go 418: Populated endpoint ContainerID="a2f022f6ac642ad8d23ad1a9dd70fddb4ce2e3e4b2fff712794eae753777718c" Namespace="kube-system" Pod="coredns-674b8bbfcf-2fcnt" WorkloadEndpoint="ci--4081--3--6--n--e40d23dcbc-k8s-coredns--674b8bbfcf--2fcnt-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--e40d23dcbc-k8s-coredns--674b8bbfcf--2fcnt-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"9a59049d-3a86-490e-8990-b534f38c98ed", ResourceVersion:"877", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 1, 10, 47, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", 
"projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-e40d23dcbc", ContainerID:"", Pod:"coredns-674b8bbfcf-2fcnt", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.95.66/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali8b5458b32a4", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 01:11:25.838827 containerd[1517]: 2026-03-07 01:11:25.791 [INFO][3865] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.95.66/32] ContainerID="a2f022f6ac642ad8d23ad1a9dd70fddb4ce2e3e4b2fff712794eae753777718c" Namespace="kube-system" Pod="coredns-674b8bbfcf-2fcnt" WorkloadEndpoint="ci--4081--3--6--n--e40d23dcbc-k8s-coredns--674b8bbfcf--2fcnt-eth0" Mar 7 01:11:25.838827 containerd[1517]: 2026-03-07 01:11:25.791 [INFO][3865] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali8b5458b32a4 ContainerID="a2f022f6ac642ad8d23ad1a9dd70fddb4ce2e3e4b2fff712794eae753777718c" Namespace="kube-system" Pod="coredns-674b8bbfcf-2fcnt" WorkloadEndpoint="ci--4081--3--6--n--e40d23dcbc-k8s-coredns--674b8bbfcf--2fcnt-eth0" Mar 7 01:11:25.838827 containerd[1517]: 2026-03-07 01:11:25.797 [INFO][3865] cni-plugin/dataplane_linux.go 508: Disabling IPv4 
forwarding ContainerID="a2f022f6ac642ad8d23ad1a9dd70fddb4ce2e3e4b2fff712794eae753777718c" Namespace="kube-system" Pod="coredns-674b8bbfcf-2fcnt" WorkloadEndpoint="ci--4081--3--6--n--e40d23dcbc-k8s-coredns--674b8bbfcf--2fcnt-eth0" Mar 7 01:11:25.838827 containerd[1517]: 2026-03-07 01:11:25.797 [INFO][3865] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="a2f022f6ac642ad8d23ad1a9dd70fddb4ce2e3e4b2fff712794eae753777718c" Namespace="kube-system" Pod="coredns-674b8bbfcf-2fcnt" WorkloadEndpoint="ci--4081--3--6--n--e40d23dcbc-k8s-coredns--674b8bbfcf--2fcnt-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--e40d23dcbc-k8s-coredns--674b8bbfcf--2fcnt-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"9a59049d-3a86-490e-8990-b534f38c98ed", ResourceVersion:"877", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 1, 10, 47, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-e40d23dcbc", ContainerID:"a2f022f6ac642ad8d23ad1a9dd70fddb4ce2e3e4b2fff712794eae753777718c", Pod:"coredns-674b8bbfcf-2fcnt", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.95.66/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali8b5458b32a4", MAC:"62:97:a8:35:67:68", 
Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 01:11:25.838827 containerd[1517]: 2026-03-07 01:11:25.828 [INFO][3865] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="a2f022f6ac642ad8d23ad1a9dd70fddb4ce2e3e4b2fff712794eae753777718c" Namespace="kube-system" Pod="coredns-674b8bbfcf-2fcnt" WorkloadEndpoint="ci--4081--3--6--n--e40d23dcbc-k8s-coredns--674b8bbfcf--2fcnt-eth0" Mar 7 01:11:25.896438 containerd[1517]: time="2026-03-07T01:11:25.895491739Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-cf679fccd-lv8s8,Uid:c6423fb5-e2f2-4a85-962c-3faf4ee6a34d,Namespace:calico-system,Attempt:0,} returns sandbox id \"febbd0769d9169deac964b48da2739506b9b90adbf8b43d4485fee546aa210c1\"" Mar 7 01:11:25.897867 containerd[1517]: time="2026-03-07T01:11:25.897649242Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\"" Mar 7 01:11:25.908874 containerd[1517]: time="2026-03-07T01:11:25.908637904Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 7 01:11:25.908874 containerd[1517]: time="2026-03-07T01:11:25.908701051Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 7 01:11:25.908874 containerd[1517]: time="2026-03-07T01:11:25.908711461Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 7 01:11:25.908874 containerd[1517]: time="2026-03-07T01:11:25.908798237Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 7 01:11:25.929571 systemd[1]: Started cri-containerd-a2f022f6ac642ad8d23ad1a9dd70fddb4ce2e3e4b2fff712794eae753777718c.scope - libcontainer container a2f022f6ac642ad8d23ad1a9dd70fddb4ce2e3e4b2fff712794eae753777718c. Mar 7 01:11:25.948549 systemd-networkd[1392]: calidfd259856e7: Link UP Mar 7 01:11:25.948738 systemd-networkd[1392]: calidfd259856e7: Gained carrier Mar 7 01:11:25.982429 containerd[1517]: 2026-03-07 01:11:25.510 [ERROR][3857] cni-plugin/utils.go 116: File does not exist, skipping the error since RequireMTUFile is false error=open /var/lib/calico/mtu: no such file or directory filename="/var/lib/calico/mtu" Mar 7 01:11:25.982429 containerd[1517]: 2026-03-07 01:11:25.536 [INFO][3857] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4081--3--6--n--e40d23dcbc-k8s-goldmane--5b85766d88--8nxm8-eth0 goldmane-5b85766d88- calico-system 9a2395af-edcb-49ca-a47a-8c642cac381d 876 0 2026-03-07 01:10:57 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:5b85766d88 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s ci-4081-3-6-n-e40d23dcbc goldmane-5b85766d88-8nxm8 eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] calidfd259856e7 [] [] }} ContainerID="3430c767ec7cdfeb05b382975eaf4897507eadf286a7b98791b122e01f1da2a5" Namespace="calico-system" Pod="goldmane-5b85766d88-8nxm8" WorkloadEndpoint="ci--4081--3--6--n--e40d23dcbc-k8s-goldmane--5b85766d88--8nxm8-" Mar 7 01:11:25.982429 containerd[1517]: 2026-03-07 01:11:25.537 [INFO][3857] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s 
ContainerID="3430c767ec7cdfeb05b382975eaf4897507eadf286a7b98791b122e01f1da2a5" Namespace="calico-system" Pod="goldmane-5b85766d88-8nxm8" WorkloadEndpoint="ci--4081--3--6--n--e40d23dcbc-k8s-goldmane--5b85766d88--8nxm8-eth0" Mar 7 01:11:25.982429 containerd[1517]: 2026-03-07 01:11:25.651 [INFO][3950] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="3430c767ec7cdfeb05b382975eaf4897507eadf286a7b98791b122e01f1da2a5" HandleID="k8s-pod-network.3430c767ec7cdfeb05b382975eaf4897507eadf286a7b98791b122e01f1da2a5" Workload="ci--4081--3--6--n--e40d23dcbc-k8s-goldmane--5b85766d88--8nxm8-eth0" Mar 7 01:11:25.982429 containerd[1517]: 2026-03-07 01:11:25.658 [INFO][3950] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="3430c767ec7cdfeb05b382975eaf4897507eadf286a7b98791b122e01f1da2a5" HandleID="k8s-pod-network.3430c767ec7cdfeb05b382975eaf4897507eadf286a7b98791b122e01f1da2a5" Workload="ci--4081--3--6--n--e40d23dcbc-k8s-goldmane--5b85766d88--8nxm8-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002fbd80), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4081-3-6-n-e40d23dcbc", "pod":"goldmane-5b85766d88-8nxm8", "timestamp":"2026-03-07 01:11:25.651641595 +0000 UTC"}, Hostname:"ci-4081-3-6-n-e40d23dcbc", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0xc000545760)} Mar 7 01:11:25.982429 containerd[1517]: 2026-03-07 01:11:25.659 [INFO][3950] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 7 01:11:25.982429 containerd[1517]: 2026-03-07 01:11:25.788 [INFO][3950] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Mar 7 01:11:25.982429 containerd[1517]: 2026-03-07 01:11:25.788 [INFO][3950] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4081-3-6-n-e40d23dcbc' Mar 7 01:11:25.982429 containerd[1517]: 2026-03-07 01:11:25.849 [INFO][3950] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.3430c767ec7cdfeb05b382975eaf4897507eadf286a7b98791b122e01f1da2a5" host="ci-4081-3-6-n-e40d23dcbc" Mar 7 01:11:25.982429 containerd[1517]: 2026-03-07 01:11:25.867 [INFO][3950] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4081-3-6-n-e40d23dcbc" Mar 7 01:11:25.982429 containerd[1517]: 2026-03-07 01:11:25.884 [INFO][3950] ipam/ipam.go 526: Trying affinity for 192.168.95.64/26 host="ci-4081-3-6-n-e40d23dcbc" Mar 7 01:11:25.982429 containerd[1517]: 2026-03-07 01:11:25.888 [INFO][3950] ipam/ipam.go 160: Attempting to load block cidr=192.168.95.64/26 host="ci-4081-3-6-n-e40d23dcbc" Mar 7 01:11:25.982429 containerd[1517]: 2026-03-07 01:11:25.899 [INFO][3950] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.95.64/26 host="ci-4081-3-6-n-e40d23dcbc" Mar 7 01:11:25.982429 containerd[1517]: 2026-03-07 01:11:25.900 [INFO][3950] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.95.64/26 handle="k8s-pod-network.3430c767ec7cdfeb05b382975eaf4897507eadf286a7b98791b122e01f1da2a5" host="ci-4081-3-6-n-e40d23dcbc" Mar 7 01:11:25.982429 containerd[1517]: 2026-03-07 01:11:25.904 [INFO][3950] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.3430c767ec7cdfeb05b382975eaf4897507eadf286a7b98791b122e01f1da2a5 Mar 7 01:11:25.982429 containerd[1517]: 2026-03-07 01:11:25.911 [INFO][3950] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.95.64/26 handle="k8s-pod-network.3430c767ec7cdfeb05b382975eaf4897507eadf286a7b98791b122e01f1da2a5" host="ci-4081-3-6-n-e40d23dcbc" Mar 7 01:11:25.982429 containerd[1517]: 2026-03-07 01:11:25.930 [INFO][3950] ipam/ipam.go 1288: Successfully claimed 
IPs: [192.168.95.67/26] block=192.168.95.64/26 handle="k8s-pod-network.3430c767ec7cdfeb05b382975eaf4897507eadf286a7b98791b122e01f1da2a5" host="ci-4081-3-6-n-e40d23dcbc" Mar 7 01:11:25.982429 containerd[1517]: 2026-03-07 01:11:25.931 [INFO][3950] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.95.67/26] handle="k8s-pod-network.3430c767ec7cdfeb05b382975eaf4897507eadf286a7b98791b122e01f1da2a5" host="ci-4081-3-6-n-e40d23dcbc" Mar 7 01:11:25.982429 containerd[1517]: 2026-03-07 01:11:25.931 [INFO][3950] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 7 01:11:25.982429 containerd[1517]: 2026-03-07 01:11:25.931 [INFO][3950] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.95.67/26] IPv6=[] ContainerID="3430c767ec7cdfeb05b382975eaf4897507eadf286a7b98791b122e01f1da2a5" HandleID="k8s-pod-network.3430c767ec7cdfeb05b382975eaf4897507eadf286a7b98791b122e01f1da2a5" Workload="ci--4081--3--6--n--e40d23dcbc-k8s-goldmane--5b85766d88--8nxm8-eth0" Mar 7 01:11:25.982869 containerd[1517]: 2026-03-07 01:11:25.941 [INFO][3857] cni-plugin/k8s.go 418: Populated endpoint ContainerID="3430c767ec7cdfeb05b382975eaf4897507eadf286a7b98791b122e01f1da2a5" Namespace="calico-system" Pod="goldmane-5b85766d88-8nxm8" WorkloadEndpoint="ci--4081--3--6--n--e40d23dcbc-k8s-goldmane--5b85766d88--8nxm8-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--e40d23dcbc-k8s-goldmane--5b85766d88--8nxm8-eth0", GenerateName:"goldmane-5b85766d88-", Namespace:"calico-system", SelfLink:"", UID:"9a2395af-edcb-49ca-a47a-8c642cac381d", ResourceVersion:"876", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 1, 10, 57, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"5b85766d88", 
"projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-e40d23dcbc", ContainerID:"", Pod:"goldmane-5b85766d88-8nxm8", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.95.67/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"calidfd259856e7", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 01:11:25.982869 containerd[1517]: 2026-03-07 01:11:25.941 [INFO][3857] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.95.67/32] ContainerID="3430c767ec7cdfeb05b382975eaf4897507eadf286a7b98791b122e01f1da2a5" Namespace="calico-system" Pod="goldmane-5b85766d88-8nxm8" WorkloadEndpoint="ci--4081--3--6--n--e40d23dcbc-k8s-goldmane--5b85766d88--8nxm8-eth0" Mar 7 01:11:25.982869 containerd[1517]: 2026-03-07 01:11:25.941 [INFO][3857] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calidfd259856e7 ContainerID="3430c767ec7cdfeb05b382975eaf4897507eadf286a7b98791b122e01f1da2a5" Namespace="calico-system" Pod="goldmane-5b85766d88-8nxm8" WorkloadEndpoint="ci--4081--3--6--n--e40d23dcbc-k8s-goldmane--5b85766d88--8nxm8-eth0" Mar 7 01:11:25.982869 containerd[1517]: 2026-03-07 01:11:25.948 [INFO][3857] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="3430c767ec7cdfeb05b382975eaf4897507eadf286a7b98791b122e01f1da2a5" Namespace="calico-system" Pod="goldmane-5b85766d88-8nxm8" WorkloadEndpoint="ci--4081--3--6--n--e40d23dcbc-k8s-goldmane--5b85766d88--8nxm8-eth0" Mar 7 01:11:25.982869 containerd[1517]: 2026-03-07 01:11:25.950 [INFO][3857] 
cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="3430c767ec7cdfeb05b382975eaf4897507eadf286a7b98791b122e01f1da2a5" Namespace="calico-system" Pod="goldmane-5b85766d88-8nxm8" WorkloadEndpoint="ci--4081--3--6--n--e40d23dcbc-k8s-goldmane--5b85766d88--8nxm8-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--e40d23dcbc-k8s-goldmane--5b85766d88--8nxm8-eth0", GenerateName:"goldmane-5b85766d88-", Namespace:"calico-system", SelfLink:"", UID:"9a2395af-edcb-49ca-a47a-8c642cac381d", ResourceVersion:"876", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 1, 10, 57, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"5b85766d88", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-e40d23dcbc", ContainerID:"3430c767ec7cdfeb05b382975eaf4897507eadf286a7b98791b122e01f1da2a5", Pod:"goldmane-5b85766d88-8nxm8", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.95.67/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"calidfd259856e7", MAC:"52:b0:24:68:82:7c", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 01:11:25.982869 containerd[1517]: 2026-03-07 01:11:25.974 [INFO][3857] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore 
ContainerID="3430c767ec7cdfeb05b382975eaf4897507eadf286a7b98791b122e01f1da2a5" Namespace="calico-system" Pod="goldmane-5b85766d88-8nxm8" WorkloadEndpoint="ci--4081--3--6--n--e40d23dcbc-k8s-goldmane--5b85766d88--8nxm8-eth0" Mar 7 01:11:26.021601 systemd-networkd[1392]: califc3c9bf6373: Link UP Mar 7 01:11:26.023190 containerd[1517]: time="2026-03-07T01:11:26.022894310Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 7 01:11:26.023190 containerd[1517]: time="2026-03-07T01:11:26.022943098Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 7 01:11:26.023190 containerd[1517]: time="2026-03-07T01:11:26.022963077Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 7 01:11:26.023190 containerd[1517]: time="2026-03-07T01:11:26.023033874Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 7 01:11:26.023934 systemd-networkd[1392]: califc3c9bf6373: Gained carrier Mar 7 01:11:26.051037 systemd[1]: Started cri-containerd-3430c767ec7cdfeb05b382975eaf4897507eadf286a7b98791b122e01f1da2a5.scope - libcontainer container 3430c767ec7cdfeb05b382975eaf4897507eadf286a7b98791b122e01f1da2a5. 
Mar 7 01:11:26.057141 containerd[1517]: 2026-03-07 01:11:25.546 [ERROR][3885] cni-plugin/utils.go 116: File does not exist, skipping the error since RequireMTUFile is false error=open /var/lib/calico/mtu: no such file or directory filename="/var/lib/calico/mtu" Mar 7 01:11:26.057141 containerd[1517]: 2026-03-07 01:11:25.579 [INFO][3885] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4081--3--6--n--e40d23dcbc-k8s-calico--apiserver--7cdd895f86--pmmf8-eth0 calico-apiserver-7cdd895f86- calico-system 4298d66b-eb58-4358-8b7f-a00b4a5de60e 875 0 2026-03-07 01:10:57 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:7cdd895f86 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4081-3-6-n-e40d23dcbc calico-apiserver-7cdd895f86-pmmf8 eth0 calico-apiserver [] [] [kns.calico-system ksa.calico-system.calico-apiserver] califc3c9bf6373 [] [] }} ContainerID="7a1f5b82d89e8a71d0438dd8dd244af407072ac3a65d126412fec47cbe237393" Namespace="calico-system" Pod="calico-apiserver-7cdd895f86-pmmf8" WorkloadEndpoint="ci--4081--3--6--n--e40d23dcbc-k8s-calico--apiserver--7cdd895f86--pmmf8-" Mar 7 01:11:26.057141 containerd[1517]: 2026-03-07 01:11:25.579 [INFO][3885] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="7a1f5b82d89e8a71d0438dd8dd244af407072ac3a65d126412fec47cbe237393" Namespace="calico-system" Pod="calico-apiserver-7cdd895f86-pmmf8" WorkloadEndpoint="ci--4081--3--6--n--e40d23dcbc-k8s-calico--apiserver--7cdd895f86--pmmf8-eth0" Mar 7 01:11:26.057141 containerd[1517]: 2026-03-07 01:11:25.647 [INFO][3964] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="7a1f5b82d89e8a71d0438dd8dd244af407072ac3a65d126412fec47cbe237393" HandleID="k8s-pod-network.7a1f5b82d89e8a71d0438dd8dd244af407072ac3a65d126412fec47cbe237393" 
Workload="ci--4081--3--6--n--e40d23dcbc-k8s-calico--apiserver--7cdd895f86--pmmf8-eth0" Mar 7 01:11:26.057141 containerd[1517]: 2026-03-07 01:11:25.664 [INFO][3964] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="7a1f5b82d89e8a71d0438dd8dd244af407072ac3a65d126412fec47cbe237393" HandleID="k8s-pod-network.7a1f5b82d89e8a71d0438dd8dd244af407072ac3a65d126412fec47cbe237393" Workload="ci--4081--3--6--n--e40d23dcbc-k8s-calico--apiserver--7cdd895f86--pmmf8-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00033b440), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4081-3-6-n-e40d23dcbc", "pod":"calico-apiserver-7cdd895f86-pmmf8", "timestamp":"2026-03-07 01:11:25.647611658 +0000 UTC"}, Hostname:"ci-4081-3-6-n-e40d23dcbc", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0xc000252f20)} Mar 7 01:11:26.057141 containerd[1517]: 2026-03-07 01:11:25.664 [INFO][3964] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 7 01:11:26.057141 containerd[1517]: 2026-03-07 01:11:25.933 [INFO][3964] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Mar 7 01:11:26.057141 containerd[1517]: 2026-03-07 01:11:25.933 [INFO][3964] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4081-3-6-n-e40d23dcbc' Mar 7 01:11:26.057141 containerd[1517]: 2026-03-07 01:11:25.971 [INFO][3964] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.7a1f5b82d89e8a71d0438dd8dd244af407072ac3a65d126412fec47cbe237393" host="ci-4081-3-6-n-e40d23dcbc" Mar 7 01:11:26.057141 containerd[1517]: 2026-03-07 01:11:25.979 [INFO][3964] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4081-3-6-n-e40d23dcbc" Mar 7 01:11:26.057141 containerd[1517]: 2026-03-07 01:11:25.987 [INFO][3964] ipam/ipam.go 526: Trying affinity for 192.168.95.64/26 host="ci-4081-3-6-n-e40d23dcbc" Mar 7 01:11:26.057141 containerd[1517]: 2026-03-07 01:11:25.989 [INFO][3964] ipam/ipam.go 160: Attempting to load block cidr=192.168.95.64/26 host="ci-4081-3-6-n-e40d23dcbc" Mar 7 01:11:26.057141 containerd[1517]: 2026-03-07 01:11:25.995 [INFO][3964] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.95.64/26 host="ci-4081-3-6-n-e40d23dcbc" Mar 7 01:11:26.057141 containerd[1517]: 2026-03-07 01:11:25.995 [INFO][3964] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.95.64/26 handle="k8s-pod-network.7a1f5b82d89e8a71d0438dd8dd244af407072ac3a65d126412fec47cbe237393" host="ci-4081-3-6-n-e40d23dcbc" Mar 7 01:11:26.057141 containerd[1517]: 2026-03-07 01:11:26.000 [INFO][3964] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.7a1f5b82d89e8a71d0438dd8dd244af407072ac3a65d126412fec47cbe237393 Mar 7 01:11:26.057141 containerd[1517]: 2026-03-07 01:11:26.005 [INFO][3964] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.95.64/26 handle="k8s-pod-network.7a1f5b82d89e8a71d0438dd8dd244af407072ac3a65d126412fec47cbe237393" host="ci-4081-3-6-n-e40d23dcbc" Mar 7 01:11:26.057141 containerd[1517]: 2026-03-07 01:11:26.009 [INFO][3964] ipam/ipam.go 1288: Successfully claimed 
IPs: [192.168.95.68/26] block=192.168.95.64/26 handle="k8s-pod-network.7a1f5b82d89e8a71d0438dd8dd244af407072ac3a65d126412fec47cbe237393" host="ci-4081-3-6-n-e40d23dcbc" Mar 7 01:11:26.057141 containerd[1517]: 2026-03-07 01:11:26.009 [INFO][3964] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.95.68/26] handle="k8s-pod-network.7a1f5b82d89e8a71d0438dd8dd244af407072ac3a65d126412fec47cbe237393" host="ci-4081-3-6-n-e40d23dcbc" Mar 7 01:11:26.057141 containerd[1517]: 2026-03-07 01:11:26.009 [INFO][3964] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 7 01:11:26.057141 containerd[1517]: 2026-03-07 01:11:26.009 [INFO][3964] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.95.68/26] IPv6=[] ContainerID="7a1f5b82d89e8a71d0438dd8dd244af407072ac3a65d126412fec47cbe237393" HandleID="k8s-pod-network.7a1f5b82d89e8a71d0438dd8dd244af407072ac3a65d126412fec47cbe237393" Workload="ci--4081--3--6--n--e40d23dcbc-k8s-calico--apiserver--7cdd895f86--pmmf8-eth0" Mar 7 01:11:26.057584 containerd[1517]: 2026-03-07 01:11:26.013 [INFO][3885] cni-plugin/k8s.go 418: Populated endpoint ContainerID="7a1f5b82d89e8a71d0438dd8dd244af407072ac3a65d126412fec47cbe237393" Namespace="calico-system" Pod="calico-apiserver-7cdd895f86-pmmf8" WorkloadEndpoint="ci--4081--3--6--n--e40d23dcbc-k8s-calico--apiserver--7cdd895f86--pmmf8-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--e40d23dcbc-k8s-calico--apiserver--7cdd895f86--pmmf8-eth0", GenerateName:"calico-apiserver-7cdd895f86-", Namespace:"calico-system", SelfLink:"", UID:"4298d66b-eb58-4358-8b7f-a00b4a5de60e", ResourceVersion:"875", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 1, 10, 57, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", 
"pod-template-hash":"7cdd895f86", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-e40d23dcbc", ContainerID:"", Pod:"calico-apiserver-7cdd895f86-pmmf8", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.95.68/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"califc3c9bf6373", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 01:11:26.057584 containerd[1517]: 2026-03-07 01:11:26.013 [INFO][3885] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.95.68/32] ContainerID="7a1f5b82d89e8a71d0438dd8dd244af407072ac3a65d126412fec47cbe237393" Namespace="calico-system" Pod="calico-apiserver-7cdd895f86-pmmf8" WorkloadEndpoint="ci--4081--3--6--n--e40d23dcbc-k8s-calico--apiserver--7cdd895f86--pmmf8-eth0" Mar 7 01:11:26.057584 containerd[1517]: 2026-03-07 01:11:26.013 [INFO][3885] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to califc3c9bf6373 ContainerID="7a1f5b82d89e8a71d0438dd8dd244af407072ac3a65d126412fec47cbe237393" Namespace="calico-system" Pod="calico-apiserver-7cdd895f86-pmmf8" WorkloadEndpoint="ci--4081--3--6--n--e40d23dcbc-k8s-calico--apiserver--7cdd895f86--pmmf8-eth0" Mar 7 01:11:26.057584 containerd[1517]: 2026-03-07 01:11:26.026 [INFO][3885] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="7a1f5b82d89e8a71d0438dd8dd244af407072ac3a65d126412fec47cbe237393" Namespace="calico-system" Pod="calico-apiserver-7cdd895f86-pmmf8" 
WorkloadEndpoint="ci--4081--3--6--n--e40d23dcbc-k8s-calico--apiserver--7cdd895f86--pmmf8-eth0" Mar 7 01:11:26.057584 containerd[1517]: 2026-03-07 01:11:26.035 [INFO][3885] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="7a1f5b82d89e8a71d0438dd8dd244af407072ac3a65d126412fec47cbe237393" Namespace="calico-system" Pod="calico-apiserver-7cdd895f86-pmmf8" WorkloadEndpoint="ci--4081--3--6--n--e40d23dcbc-k8s-calico--apiserver--7cdd895f86--pmmf8-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--e40d23dcbc-k8s-calico--apiserver--7cdd895f86--pmmf8-eth0", GenerateName:"calico-apiserver-7cdd895f86-", Namespace:"calico-system", SelfLink:"", UID:"4298d66b-eb58-4358-8b7f-a00b4a5de60e", ResourceVersion:"875", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 1, 10, 57, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"7cdd895f86", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-e40d23dcbc", ContainerID:"7a1f5b82d89e8a71d0438dd8dd244af407072ac3a65d126412fec47cbe237393", Pod:"calico-apiserver-7cdd895f86-pmmf8", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.95.68/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"califc3c9bf6373", MAC:"5e:2e:f1:7c:80:1a", 
Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 01:11:26.057584 containerd[1517]: 2026-03-07 01:11:26.050 [INFO][3885] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="7a1f5b82d89e8a71d0438dd8dd244af407072ac3a65d126412fec47cbe237393" Namespace="calico-system" Pod="calico-apiserver-7cdd895f86-pmmf8" WorkloadEndpoint="ci--4081--3--6--n--e40d23dcbc-k8s-calico--apiserver--7cdd895f86--pmmf8-eth0" Mar 7 01:11:26.060919 containerd[1517]: time="2026-03-07T01:11:26.060710041Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-2fcnt,Uid:9a59049d-3a86-490e-8990-b534f38c98ed,Namespace:kube-system,Attempt:0,} returns sandbox id \"a2f022f6ac642ad8d23ad1a9dd70fddb4ce2e3e4b2fff712794eae753777718c\"" Mar 7 01:11:26.081520 containerd[1517]: time="2026-03-07T01:11:26.081478025Z" level=info msg="CreateContainer within sandbox \"a2f022f6ac642ad8d23ad1a9dd70fddb4ce2e3e4b2fff712794eae753777718c\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Mar 7 01:11:26.094292 containerd[1517]: time="2026-03-07T01:11:26.093591776Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 7 01:11:26.094292 containerd[1517]: time="2026-03-07T01:11:26.094075146Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 7 01:11:26.094292 containerd[1517]: time="2026-03-07T01:11:26.094086035Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 7 01:11:26.094292 containerd[1517]: time="2026-03-07T01:11:26.094169632Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 7 01:11:26.119800 systemd[1]: Started cri-containerd-7a1f5b82d89e8a71d0438dd8dd244af407072ac3a65d126412fec47cbe237393.scope - libcontainer container 7a1f5b82d89e8a71d0438dd8dd244af407072ac3a65d126412fec47cbe237393. Mar 7 01:11:26.127259 containerd[1517]: time="2026-03-07T01:11:26.126996329Z" level=info msg="CreateContainer within sandbox \"a2f022f6ac642ad8d23ad1a9dd70fddb4ce2e3e4b2fff712794eae753777718c\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"7c287f1e0b4926774e9878a2a74726917e5f986853bd7bc0f9c32a2c4ec4b742\"" Mar 7 01:11:26.130586 containerd[1517]: time="2026-03-07T01:11:26.130553504Z" level=info msg="StartContainer for \"7c287f1e0b4926774e9878a2a74726917e5f986853bd7bc0f9c32a2c4ec4b742\"" Mar 7 01:11:26.137379 systemd-networkd[1392]: cali4173cd9ad7b: Link UP Mar 7 01:11:26.137661 systemd-networkd[1392]: cali4173cd9ad7b: Gained carrier Mar 7 01:11:26.155991 containerd[1517]: 2026-03-07 01:11:25.689 [ERROR][3990] cni-plugin/utils.go 116: File does not exist, skipping the error since RequireMTUFile is false error=open /var/lib/calico/mtu: no such file or directory filename="/var/lib/calico/mtu" Mar 7 01:11:26.155991 containerd[1517]: 2026-03-07 01:11:25.702 [INFO][3990] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4081--3--6--n--e40d23dcbc-k8s-coredns--674b8bbfcf--2g6xd-eth0 coredns-674b8bbfcf- kube-system 9e84ea27-025b-43b3-a817-46d96e7f19b0 895 0 2026-03-07 01:10:47 +0000 UTC map[k8s-app:kube-dns pod-template-hash:674b8bbfcf projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4081-3-6-n-e40d23dcbc coredns-674b8bbfcf-2g6xd eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali4173cd9ad7b [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} 
ContainerID="2cbb2d69339d11b81f984fc78418b54d058103983570ba9cbb6f1207182611b2" Namespace="kube-system" Pod="coredns-674b8bbfcf-2g6xd" WorkloadEndpoint="ci--4081--3--6--n--e40d23dcbc-k8s-coredns--674b8bbfcf--2g6xd-" Mar 7 01:11:26.155991 containerd[1517]: 2026-03-07 01:11:25.702 [INFO][3990] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="2cbb2d69339d11b81f984fc78418b54d058103983570ba9cbb6f1207182611b2" Namespace="kube-system" Pod="coredns-674b8bbfcf-2g6xd" WorkloadEndpoint="ci--4081--3--6--n--e40d23dcbc-k8s-coredns--674b8bbfcf--2g6xd-eth0" Mar 7 01:11:26.155991 containerd[1517]: 2026-03-07 01:11:25.749 [INFO][4019] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="2cbb2d69339d11b81f984fc78418b54d058103983570ba9cbb6f1207182611b2" HandleID="k8s-pod-network.2cbb2d69339d11b81f984fc78418b54d058103983570ba9cbb6f1207182611b2" Workload="ci--4081--3--6--n--e40d23dcbc-k8s-coredns--674b8bbfcf--2g6xd-eth0" Mar 7 01:11:26.155991 containerd[1517]: 2026-03-07 01:11:25.756 [INFO][4019] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="2cbb2d69339d11b81f984fc78418b54d058103983570ba9cbb6f1207182611b2" HandleID="k8s-pod-network.2cbb2d69339d11b81f984fc78418b54d058103983570ba9cbb6f1207182611b2" Workload="ci--4081--3--6--n--e40d23dcbc-k8s-coredns--674b8bbfcf--2g6xd-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002ef8e0), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4081-3-6-n-e40d23dcbc", "pod":"coredns-674b8bbfcf-2g6xd", "timestamp":"2026-03-07 01:11:25.749779884 +0000 UTC"}, Hostname:"ci-4081-3-6-n-e40d23dcbc", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0xc000358f20)} Mar 7 01:11:26.155991 containerd[1517]: 2026-03-07 01:11:25.756 [INFO][4019] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM 
lock. Mar 7 01:11:26.155991 containerd[1517]: 2026-03-07 01:11:26.010 [INFO][4019] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 7 01:11:26.155991 containerd[1517]: 2026-03-07 01:11:26.010 [INFO][4019] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4081-3-6-n-e40d23dcbc' Mar 7 01:11:26.155991 containerd[1517]: 2026-03-07 01:11:26.069 [INFO][4019] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.2cbb2d69339d11b81f984fc78418b54d058103983570ba9cbb6f1207182611b2" host="ci-4081-3-6-n-e40d23dcbc" Mar 7 01:11:26.155991 containerd[1517]: 2026-03-07 01:11:26.084 [INFO][4019] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4081-3-6-n-e40d23dcbc" Mar 7 01:11:26.155991 containerd[1517]: 2026-03-07 01:11:26.093 [INFO][4019] ipam/ipam.go 526: Trying affinity for 192.168.95.64/26 host="ci-4081-3-6-n-e40d23dcbc" Mar 7 01:11:26.155991 containerd[1517]: 2026-03-07 01:11:26.101 [INFO][4019] ipam/ipam.go 160: Attempting to load block cidr=192.168.95.64/26 host="ci-4081-3-6-n-e40d23dcbc" Mar 7 01:11:26.155991 containerd[1517]: 2026-03-07 01:11:26.105 [INFO][4019] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.95.64/26 host="ci-4081-3-6-n-e40d23dcbc" Mar 7 01:11:26.155991 containerd[1517]: 2026-03-07 01:11:26.105 [INFO][4019] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.95.64/26 handle="k8s-pod-network.2cbb2d69339d11b81f984fc78418b54d058103983570ba9cbb6f1207182611b2" host="ci-4081-3-6-n-e40d23dcbc" Mar 7 01:11:26.155991 containerd[1517]: 2026-03-07 01:11:26.108 [INFO][4019] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.2cbb2d69339d11b81f984fc78418b54d058103983570ba9cbb6f1207182611b2 Mar 7 01:11:26.155991 containerd[1517]: 2026-03-07 01:11:26.116 [INFO][4019] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.95.64/26 handle="k8s-pod-network.2cbb2d69339d11b81f984fc78418b54d058103983570ba9cbb6f1207182611b2" 
host="ci-4081-3-6-n-e40d23dcbc" Mar 7 01:11:26.155991 containerd[1517]: 2026-03-07 01:11:26.126 [INFO][4019] ipam/ipam.go 1288: Successfully claimed IPs: [192.168.95.69/26] block=192.168.95.64/26 handle="k8s-pod-network.2cbb2d69339d11b81f984fc78418b54d058103983570ba9cbb6f1207182611b2" host="ci-4081-3-6-n-e40d23dcbc" Mar 7 01:11:26.155991 containerd[1517]: 2026-03-07 01:11:26.127 [INFO][4019] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.95.69/26] handle="k8s-pod-network.2cbb2d69339d11b81f984fc78418b54d058103983570ba9cbb6f1207182611b2" host="ci-4081-3-6-n-e40d23dcbc" Mar 7 01:11:26.155991 containerd[1517]: 2026-03-07 01:11:26.127 [INFO][4019] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 7 01:11:26.155991 containerd[1517]: 2026-03-07 01:11:26.127 [INFO][4019] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.95.69/26] IPv6=[] ContainerID="2cbb2d69339d11b81f984fc78418b54d058103983570ba9cbb6f1207182611b2" HandleID="k8s-pod-network.2cbb2d69339d11b81f984fc78418b54d058103983570ba9cbb6f1207182611b2" Workload="ci--4081--3--6--n--e40d23dcbc-k8s-coredns--674b8bbfcf--2g6xd-eth0" Mar 7 01:11:26.156649 containerd[1517]: 2026-03-07 01:11:26.129 [INFO][3990] cni-plugin/k8s.go 418: Populated endpoint ContainerID="2cbb2d69339d11b81f984fc78418b54d058103983570ba9cbb6f1207182611b2" Namespace="kube-system" Pod="coredns-674b8bbfcf-2g6xd" WorkloadEndpoint="ci--4081--3--6--n--e40d23dcbc-k8s-coredns--674b8bbfcf--2g6xd-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--e40d23dcbc-k8s-coredns--674b8bbfcf--2g6xd-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"9e84ea27-025b-43b3-a817-46d96e7f19b0", ResourceVersion:"895", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 1, 10, 47, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), 
Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-e40d23dcbc", ContainerID:"", Pod:"coredns-674b8bbfcf-2g6xd", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.95.69/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali4173cd9ad7b", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 01:11:26.156649 containerd[1517]: 2026-03-07 01:11:26.129 [INFO][3990] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.95.69/32] ContainerID="2cbb2d69339d11b81f984fc78418b54d058103983570ba9cbb6f1207182611b2" Namespace="kube-system" Pod="coredns-674b8bbfcf-2g6xd" WorkloadEndpoint="ci--4081--3--6--n--e40d23dcbc-k8s-coredns--674b8bbfcf--2g6xd-eth0" Mar 7 01:11:26.156649 containerd[1517]: 2026-03-07 01:11:26.129 [INFO][3990] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali4173cd9ad7b ContainerID="2cbb2d69339d11b81f984fc78418b54d058103983570ba9cbb6f1207182611b2" Namespace="kube-system" Pod="coredns-674b8bbfcf-2g6xd" 
WorkloadEndpoint="ci--4081--3--6--n--e40d23dcbc-k8s-coredns--674b8bbfcf--2g6xd-eth0" Mar 7 01:11:26.156649 containerd[1517]: 2026-03-07 01:11:26.135 [INFO][3990] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="2cbb2d69339d11b81f984fc78418b54d058103983570ba9cbb6f1207182611b2" Namespace="kube-system" Pod="coredns-674b8bbfcf-2g6xd" WorkloadEndpoint="ci--4081--3--6--n--e40d23dcbc-k8s-coredns--674b8bbfcf--2g6xd-eth0" Mar 7 01:11:26.156649 containerd[1517]: 2026-03-07 01:11:26.136 [INFO][3990] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="2cbb2d69339d11b81f984fc78418b54d058103983570ba9cbb6f1207182611b2" Namespace="kube-system" Pod="coredns-674b8bbfcf-2g6xd" WorkloadEndpoint="ci--4081--3--6--n--e40d23dcbc-k8s-coredns--674b8bbfcf--2g6xd-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--e40d23dcbc-k8s-coredns--674b8bbfcf--2g6xd-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"9e84ea27-025b-43b3-a817-46d96e7f19b0", ResourceVersion:"895", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 1, 10, 47, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-e40d23dcbc", ContainerID:"2cbb2d69339d11b81f984fc78418b54d058103983570ba9cbb6f1207182611b2", Pod:"coredns-674b8bbfcf-2g6xd", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.95.69/32"}, 
IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali4173cd9ad7b", MAC:"ee:1d:e8:df:a6:cf", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 01:11:26.156649 containerd[1517]: 2026-03-07 01:11:26.147 [INFO][3990] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="2cbb2d69339d11b81f984fc78418b54d058103983570ba9cbb6f1207182611b2" Namespace="kube-system" Pod="coredns-674b8bbfcf-2g6xd" WorkloadEndpoint="ci--4081--3--6--n--e40d23dcbc-k8s-coredns--674b8bbfcf--2g6xd-eth0" Mar 7 01:11:26.160800 systemd[1]: Removed slice kubepods-besteffort-pod32e149ae_638d_489c_a5af_ccd7aafddaba.slice - libcontainer container kubepods-besteffort-pod32e149ae_638d_489c_a5af_ccd7aafddaba.slice. Mar 7 01:11:26.208149 systemd[1]: Started cri-containerd-7c287f1e0b4926774e9878a2a74726917e5f986853bd7bc0f9c32a2c4ec4b742.scope - libcontainer container 7c287f1e0b4926774e9878a2a74726917e5f986853bd7bc0f9c32a2c4ec4b742. Mar 7 01:11:26.231993 containerd[1517]: time="2026-03-07T01:11:26.231903493Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 7 01:11:26.233241 containerd[1517]: time="2026-03-07T01:11:26.233116970Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 7 01:11:26.233241 containerd[1517]: time="2026-03-07T01:11:26.233170998Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 7 01:11:26.233772 containerd[1517]: time="2026-03-07T01:11:26.233584700Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 7 01:11:26.234930 containerd[1517]: time="2026-03-07T01:11:26.234842845Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-5b85766d88-8nxm8,Uid:9a2395af-edcb-49ca-a47a-8c642cac381d,Namespace:calico-system,Attempt:0,} returns sandbox id \"3430c767ec7cdfeb05b382975eaf4897507eadf286a7b98791b122e01f1da2a5\"" Mar 7 01:11:26.242039 systemd-networkd[1392]: caliaa77fcdc760: Link UP Mar 7 01:11:26.242650 systemd-networkd[1392]: caliaa77fcdc760: Gained carrier Mar 7 01:11:26.268190 containerd[1517]: time="2026-03-07T01:11:26.268159342Z" level=info msg="StartContainer for \"7c287f1e0b4926774e9878a2a74726917e5f986853bd7bc0f9c32a2c4ec4b742\" returns successfully" Mar 7 01:11:26.283775 systemd[1]: Started cri-containerd-2cbb2d69339d11b81f984fc78418b54d058103983570ba9cbb6f1207182611b2.scope - libcontainer container 2cbb2d69339d11b81f984fc78418b54d058103983570ba9cbb6f1207182611b2. 
Mar 7 01:11:26.298462 containerd[1517]: 2026-03-07 01:11:25.692 [ERROR][3971] cni-plugin/utils.go 116: File does not exist, skipping the error since RequireMTUFile is false error=open /var/lib/calico/mtu: no such file or directory filename="/var/lib/calico/mtu" Mar 7 01:11:26.298462 containerd[1517]: 2026-03-07 01:11:25.709 [INFO][3971] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4081--3--6--n--e40d23dcbc-k8s-csi--node--driver--gpr4h-eth0 csi-node-driver- calico-system b2c4739c-e360-4c95-9680-eefc754cc98b 896 0 2026-03-07 01:10:57 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:6d9d697c7c k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s ci-4081-3-6-n-e40d23dcbc csi-node-driver-gpr4h eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] caliaa77fcdc760 [] [] }} ContainerID="f92e9c16fbac8fe41c25f2c23a7308df996ea1160632969acda399e8260d2d2d" Namespace="calico-system" Pod="csi-node-driver-gpr4h" WorkloadEndpoint="ci--4081--3--6--n--e40d23dcbc-k8s-csi--node--driver--gpr4h-" Mar 7 01:11:26.298462 containerd[1517]: 2026-03-07 01:11:25.709 [INFO][3971] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="f92e9c16fbac8fe41c25f2c23a7308df996ea1160632969acda399e8260d2d2d" Namespace="calico-system" Pod="csi-node-driver-gpr4h" WorkloadEndpoint="ci--4081--3--6--n--e40d23dcbc-k8s-csi--node--driver--gpr4h-eth0" Mar 7 01:11:26.298462 containerd[1517]: 2026-03-07 01:11:25.786 [INFO][4026] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="f92e9c16fbac8fe41c25f2c23a7308df996ea1160632969acda399e8260d2d2d" HandleID="k8s-pod-network.f92e9c16fbac8fe41c25f2c23a7308df996ea1160632969acda399e8260d2d2d" 
Workload="ci--4081--3--6--n--e40d23dcbc-k8s-csi--node--driver--gpr4h-eth0" Mar 7 01:11:26.298462 containerd[1517]: 2026-03-07 01:11:25.824 [INFO][4026] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="f92e9c16fbac8fe41c25f2c23a7308df996ea1160632969acda399e8260d2d2d" HandleID="k8s-pod-network.f92e9c16fbac8fe41c25f2c23a7308df996ea1160632969acda399e8260d2d2d" Workload="ci--4081--3--6--n--e40d23dcbc-k8s-csi--node--driver--gpr4h-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000305e90), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4081-3-6-n-e40d23dcbc", "pod":"csi-node-driver-gpr4h", "timestamp":"2026-03-07 01:11:25.786349878 +0000 UTC"}, Hostname:"ci-4081-3-6-n-e40d23dcbc", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0xc0001142c0)} Mar 7 01:11:26.298462 containerd[1517]: 2026-03-07 01:11:25.824 [INFO][4026] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 7 01:11:26.298462 containerd[1517]: 2026-03-07 01:11:26.127 [INFO][4026] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Mar 7 01:11:26.298462 containerd[1517]: 2026-03-07 01:11:26.127 [INFO][4026] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4081-3-6-n-e40d23dcbc' Mar 7 01:11:26.298462 containerd[1517]: 2026-03-07 01:11:26.172 [INFO][4026] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.f92e9c16fbac8fe41c25f2c23a7308df996ea1160632969acda399e8260d2d2d" host="ci-4081-3-6-n-e40d23dcbc" Mar 7 01:11:26.298462 containerd[1517]: 2026-03-07 01:11:26.186 [INFO][4026] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4081-3-6-n-e40d23dcbc" Mar 7 01:11:26.298462 containerd[1517]: 2026-03-07 01:11:26.197 [INFO][4026] ipam/ipam.go 526: Trying affinity for 192.168.95.64/26 host="ci-4081-3-6-n-e40d23dcbc" Mar 7 01:11:26.298462 containerd[1517]: 2026-03-07 01:11:26.199 [INFO][4026] ipam/ipam.go 160: Attempting to load block cidr=192.168.95.64/26 host="ci-4081-3-6-n-e40d23dcbc" Mar 7 01:11:26.298462 containerd[1517]: 2026-03-07 01:11:26.203 [INFO][4026] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.95.64/26 host="ci-4081-3-6-n-e40d23dcbc" Mar 7 01:11:26.298462 containerd[1517]: 2026-03-07 01:11:26.204 [INFO][4026] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.95.64/26 handle="k8s-pod-network.f92e9c16fbac8fe41c25f2c23a7308df996ea1160632969acda399e8260d2d2d" host="ci-4081-3-6-n-e40d23dcbc" Mar 7 01:11:26.298462 containerd[1517]: 2026-03-07 01:11:26.210 [INFO][4026] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.f92e9c16fbac8fe41c25f2c23a7308df996ea1160632969acda399e8260d2d2d Mar 7 01:11:26.298462 containerd[1517]: 2026-03-07 01:11:26.219 [INFO][4026] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.95.64/26 handle="k8s-pod-network.f92e9c16fbac8fe41c25f2c23a7308df996ea1160632969acda399e8260d2d2d" host="ci-4081-3-6-n-e40d23dcbc" Mar 7 01:11:26.298462 containerd[1517]: 2026-03-07 01:11:26.226 [INFO][4026] ipam/ipam.go 1288: Successfully claimed 
IPs: [192.168.95.70/26] block=192.168.95.64/26 handle="k8s-pod-network.f92e9c16fbac8fe41c25f2c23a7308df996ea1160632969acda399e8260d2d2d" host="ci-4081-3-6-n-e40d23dcbc" Mar 7 01:11:26.298462 containerd[1517]: 2026-03-07 01:11:26.227 [INFO][4026] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.95.70/26] handle="k8s-pod-network.f92e9c16fbac8fe41c25f2c23a7308df996ea1160632969acda399e8260d2d2d" host="ci-4081-3-6-n-e40d23dcbc" Mar 7 01:11:26.298462 containerd[1517]: 2026-03-07 01:11:26.227 [INFO][4026] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 7 01:11:26.298462 containerd[1517]: 2026-03-07 01:11:26.227 [INFO][4026] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.95.70/26] IPv6=[] ContainerID="f92e9c16fbac8fe41c25f2c23a7308df996ea1160632969acda399e8260d2d2d" HandleID="k8s-pod-network.f92e9c16fbac8fe41c25f2c23a7308df996ea1160632969acda399e8260d2d2d" Workload="ci--4081--3--6--n--e40d23dcbc-k8s-csi--node--driver--gpr4h-eth0" Mar 7 01:11:26.298884 containerd[1517]: 2026-03-07 01:11:26.239 [INFO][3971] cni-plugin/k8s.go 418: Populated endpoint ContainerID="f92e9c16fbac8fe41c25f2c23a7308df996ea1160632969acda399e8260d2d2d" Namespace="calico-system" Pod="csi-node-driver-gpr4h" WorkloadEndpoint="ci--4081--3--6--n--e40d23dcbc-k8s-csi--node--driver--gpr4h-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--e40d23dcbc-k8s-csi--node--driver--gpr4h-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"b2c4739c-e360-4c95-9680-eefc754cc98b", ResourceVersion:"896", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 1, 10, 57, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"6d9d697c7c", "k8s-app":"csi-node-driver", "name":"csi-node-driver", 
"pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-e40d23dcbc", ContainerID:"", Pod:"csi-node-driver-gpr4h", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.95.70/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"caliaa77fcdc760", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 01:11:26.298884 containerd[1517]: 2026-03-07 01:11:26.239 [INFO][3971] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.95.70/32] ContainerID="f92e9c16fbac8fe41c25f2c23a7308df996ea1160632969acda399e8260d2d2d" Namespace="calico-system" Pod="csi-node-driver-gpr4h" WorkloadEndpoint="ci--4081--3--6--n--e40d23dcbc-k8s-csi--node--driver--gpr4h-eth0" Mar 7 01:11:26.298884 containerd[1517]: 2026-03-07 01:11:26.239 [INFO][3971] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to caliaa77fcdc760 ContainerID="f92e9c16fbac8fe41c25f2c23a7308df996ea1160632969acda399e8260d2d2d" Namespace="calico-system" Pod="csi-node-driver-gpr4h" WorkloadEndpoint="ci--4081--3--6--n--e40d23dcbc-k8s-csi--node--driver--gpr4h-eth0" Mar 7 01:11:26.298884 containerd[1517]: 2026-03-07 01:11:26.255 [INFO][3971] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="f92e9c16fbac8fe41c25f2c23a7308df996ea1160632969acda399e8260d2d2d" Namespace="calico-system" Pod="csi-node-driver-gpr4h" WorkloadEndpoint="ci--4081--3--6--n--e40d23dcbc-k8s-csi--node--driver--gpr4h-eth0" Mar 7 01:11:26.298884 containerd[1517]: 2026-03-07 
01:11:26.260 [INFO][3971] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="f92e9c16fbac8fe41c25f2c23a7308df996ea1160632969acda399e8260d2d2d" Namespace="calico-system" Pod="csi-node-driver-gpr4h" WorkloadEndpoint="ci--4081--3--6--n--e40d23dcbc-k8s-csi--node--driver--gpr4h-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--e40d23dcbc-k8s-csi--node--driver--gpr4h-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"b2c4739c-e360-4c95-9680-eefc754cc98b", ResourceVersion:"896", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 1, 10, 57, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"6d9d697c7c", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-e40d23dcbc", ContainerID:"f92e9c16fbac8fe41c25f2c23a7308df996ea1160632969acda399e8260d2d2d", Pod:"csi-node-driver-gpr4h", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.95.70/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"caliaa77fcdc760", MAC:"e6:2b:ba:d2:05:4a", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 01:11:26.298884 containerd[1517]: 2026-03-07 01:11:26.293 
[INFO][3971] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="f92e9c16fbac8fe41c25f2c23a7308df996ea1160632969acda399e8260d2d2d" Namespace="calico-system" Pod="csi-node-driver-gpr4h" WorkloadEndpoint="ci--4081--3--6--n--e40d23dcbc-k8s-csi--node--driver--gpr4h-eth0" Mar 7 01:11:26.305940 systemd[1]: run-netns-cni\x2d98a3ccb3\x2d30c5\x2deef3\x2d0a50\x2de84fd42273f1.mount: Deactivated successfully. Mar 7 01:11:26.306769 systemd[1]: run-netns-cni\x2dd5186757\x2d4b20\x2db16c\x2d9b2c\x2d544da04a70bb.mount: Deactivated successfully. Mar 7 01:11:26.306933 systemd[1]: var-lib-kubelet-pods-32e149ae\x2d638d\x2d489c\x2da5af\x2dccd7aafddaba-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2d65nz6.mount: Deactivated successfully. Mar 7 01:11:26.307059 systemd[1]: run-netns-cni\x2d37e85831\x2ddac9\x2df9cb\x2df2d1\x2d22751a19915e.mount: Deactivated successfully. Mar 7 01:11:26.307178 systemd[1]: var-lib-kubelet-pods-32e149ae\x2d638d\x2d489c\x2da5af\x2dccd7aafddaba-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. Mar 7 01:11:26.338185 containerd[1517]: time="2026-03-07T01:11:26.331587254Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 7 01:11:26.338185 containerd[1517]: time="2026-03-07T01:11:26.331643432Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 7 01:11:26.338185 containerd[1517]: time="2026-03-07T01:11:26.331661962Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 7 01:11:26.338185 containerd[1517]: time="2026-03-07T01:11:26.331740169Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 7 01:11:26.375684 systemd[1]: Started cri-containerd-f92e9c16fbac8fe41c25f2c23a7308df996ea1160632969acda399e8260d2d2d.scope - libcontainer container f92e9c16fbac8fe41c25f2c23a7308df996ea1160632969acda399e8260d2d2d. Mar 7 01:11:26.396865 kubelet[2578]: I0307 01:11:26.396803 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-674b8bbfcf-2fcnt" podStartSLOduration=39.396787861 podStartE2EDuration="39.396787861s" podCreationTimestamp="2026-03-07 01:10:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 01:11:26.362698727 +0000 UTC m=+44.287665768" watchObservedRunningTime="2026-03-07 01:11:26.396787861 +0000 UTC m=+44.321754902" Mar 7 01:11:26.434209 systemd-networkd[1392]: cali2b9e67f2168: Link UP Mar 7 01:11:26.434517 systemd-networkd[1392]: cali2b9e67f2168: Gained carrier Mar 7 01:11:26.469448 systemd[1]: Created slice kubepods-besteffort-pod19a5e925_9ab8_49d1_a520_8c5053e158f3.slice - libcontainer container kubepods-besteffort-pod19a5e925_9ab8_49d1_a520_8c5053e158f3.slice. 
Mar 7 01:11:26.471183 containerd[1517]: time="2026-03-07T01:11:26.470891928Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7cdd895f86-pmmf8,Uid:4298d66b-eb58-4358-8b7f-a00b4a5de60e,Namespace:calico-system,Attempt:0,} returns sandbox id \"7a1f5b82d89e8a71d0438dd8dd244af407072ac3a65d126412fec47cbe237393\"" Mar 7 01:11:26.490584 containerd[1517]: time="2026-03-07T01:11:26.489736806Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-gpr4h,Uid:b2c4739c-e360-4c95-9680-eefc754cc98b,Namespace:calico-system,Attempt:1,} returns sandbox id \"f92e9c16fbac8fe41c25f2c23a7308df996ea1160632969acda399e8260d2d2d\"" Mar 7 01:11:26.495003 containerd[1517]: 2026-03-07 01:11:25.734 [ERROR][4000] cni-plugin/utils.go 116: File does not exist, skipping the error since RequireMTUFile is false error=open /var/lib/calico/mtu: no such file or directory filename="/var/lib/calico/mtu" Mar 7 01:11:26.495003 containerd[1517]: 2026-03-07 01:11:25.757 [INFO][4000] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4081--3--6--n--e40d23dcbc-k8s-calico--apiserver--7cdd895f86--5kmtj-eth0 calico-apiserver-7cdd895f86- calico-system dd5d2b91-08a1-4e2b-97af-8acbc0f68c75 894 0 2026-03-07 01:10:57 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:7cdd895f86 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4081-3-6-n-e40d23dcbc calico-apiserver-7cdd895f86-5kmtj eth0 calico-apiserver [] [] [kns.calico-system ksa.calico-system.calico-apiserver] cali2b9e67f2168 [] [] }} ContainerID="2976bafea0560be21ac9a9d8307a3aaa07ffed41608b5ba64af944887655c5a8" Namespace="calico-system" Pod="calico-apiserver-7cdd895f86-5kmtj" WorkloadEndpoint="ci--4081--3--6--n--e40d23dcbc-k8s-calico--apiserver--7cdd895f86--5kmtj-" Mar 7 01:11:26.495003 
containerd[1517]: 2026-03-07 01:11:25.757 [INFO][4000] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="2976bafea0560be21ac9a9d8307a3aaa07ffed41608b5ba64af944887655c5a8" Namespace="calico-system" Pod="calico-apiserver-7cdd895f86-5kmtj" WorkloadEndpoint="ci--4081--3--6--n--e40d23dcbc-k8s-calico--apiserver--7cdd895f86--5kmtj-eth0" Mar 7 01:11:26.495003 containerd[1517]: 2026-03-07 01:11:25.842 [INFO][4039] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="2976bafea0560be21ac9a9d8307a3aaa07ffed41608b5ba64af944887655c5a8" HandleID="k8s-pod-network.2976bafea0560be21ac9a9d8307a3aaa07ffed41608b5ba64af944887655c5a8" Workload="ci--4081--3--6--n--e40d23dcbc-k8s-calico--apiserver--7cdd895f86--5kmtj-eth0" Mar 7 01:11:26.495003 containerd[1517]: 2026-03-07 01:11:25.858 [INFO][4039] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="2976bafea0560be21ac9a9d8307a3aaa07ffed41608b5ba64af944887655c5a8" HandleID="k8s-pod-network.2976bafea0560be21ac9a9d8307a3aaa07ffed41608b5ba64af944887655c5a8" Workload="ci--4081--3--6--n--e40d23dcbc-k8s-calico--apiserver--7cdd895f86--5kmtj-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00004f320), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4081-3-6-n-e40d23dcbc", "pod":"calico-apiserver-7cdd895f86-5kmtj", "timestamp":"2026-03-07 01:11:25.842132484 +0000 UTC"}, Hostname:"ci-4081-3-6-n-e40d23dcbc", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0xc0001889a0)} Mar 7 01:11:26.495003 containerd[1517]: 2026-03-07 01:11:25.858 [INFO][4039] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 7 01:11:26.495003 containerd[1517]: 2026-03-07 01:11:26.229 [INFO][4039] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Mar 7 01:11:26.495003 containerd[1517]: 2026-03-07 01:11:26.229 [INFO][4039] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4081-3-6-n-e40d23dcbc' Mar 7 01:11:26.495003 containerd[1517]: 2026-03-07 01:11:26.276 [INFO][4039] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.2976bafea0560be21ac9a9d8307a3aaa07ffed41608b5ba64af944887655c5a8" host="ci-4081-3-6-n-e40d23dcbc" Mar 7 01:11:26.495003 containerd[1517]: 2026-03-07 01:11:26.301 [INFO][4039] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4081-3-6-n-e40d23dcbc" Mar 7 01:11:26.495003 containerd[1517]: 2026-03-07 01:11:26.320 [INFO][4039] ipam/ipam.go 526: Trying affinity for 192.168.95.64/26 host="ci-4081-3-6-n-e40d23dcbc" Mar 7 01:11:26.495003 containerd[1517]: 2026-03-07 01:11:26.335 [INFO][4039] ipam/ipam.go 160: Attempting to load block cidr=192.168.95.64/26 host="ci-4081-3-6-n-e40d23dcbc" Mar 7 01:11:26.495003 containerd[1517]: 2026-03-07 01:11:26.364 [INFO][4039] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.95.64/26 host="ci-4081-3-6-n-e40d23dcbc" Mar 7 01:11:26.495003 containerd[1517]: 2026-03-07 01:11:26.365 [INFO][4039] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.95.64/26 handle="k8s-pod-network.2976bafea0560be21ac9a9d8307a3aaa07ffed41608b5ba64af944887655c5a8" host="ci-4081-3-6-n-e40d23dcbc" Mar 7 01:11:26.495003 containerd[1517]: 2026-03-07 01:11:26.378 [INFO][4039] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.2976bafea0560be21ac9a9d8307a3aaa07ffed41608b5ba64af944887655c5a8 Mar 7 01:11:26.495003 containerd[1517]: 2026-03-07 01:11:26.397 [INFO][4039] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.95.64/26 handle="k8s-pod-network.2976bafea0560be21ac9a9d8307a3aaa07ffed41608b5ba64af944887655c5a8" host="ci-4081-3-6-n-e40d23dcbc" Mar 7 01:11:26.495003 containerd[1517]: 2026-03-07 01:11:26.410 [INFO][4039] ipam/ipam.go 1288: Successfully claimed 
IPs: [192.168.95.71/26] block=192.168.95.64/26 handle="k8s-pod-network.2976bafea0560be21ac9a9d8307a3aaa07ffed41608b5ba64af944887655c5a8" host="ci-4081-3-6-n-e40d23dcbc" Mar 7 01:11:26.495003 containerd[1517]: 2026-03-07 01:11:26.410 [INFO][4039] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.95.71/26] handle="k8s-pod-network.2976bafea0560be21ac9a9d8307a3aaa07ffed41608b5ba64af944887655c5a8" host="ci-4081-3-6-n-e40d23dcbc" Mar 7 01:11:26.495003 containerd[1517]: 2026-03-07 01:11:26.410 [INFO][4039] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 7 01:11:26.495003 containerd[1517]: 2026-03-07 01:11:26.410 [INFO][4039] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.95.71/26] IPv6=[] ContainerID="2976bafea0560be21ac9a9d8307a3aaa07ffed41608b5ba64af944887655c5a8" HandleID="k8s-pod-network.2976bafea0560be21ac9a9d8307a3aaa07ffed41608b5ba64af944887655c5a8" Workload="ci--4081--3--6--n--e40d23dcbc-k8s-calico--apiserver--7cdd895f86--5kmtj-eth0" Mar 7 01:11:26.496493 containerd[1517]: 2026-03-07 01:11:26.418 [INFO][4000] cni-plugin/k8s.go 418: Populated endpoint ContainerID="2976bafea0560be21ac9a9d8307a3aaa07ffed41608b5ba64af944887655c5a8" Namespace="calico-system" Pod="calico-apiserver-7cdd895f86-5kmtj" WorkloadEndpoint="ci--4081--3--6--n--e40d23dcbc-k8s-calico--apiserver--7cdd895f86--5kmtj-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--e40d23dcbc-k8s-calico--apiserver--7cdd895f86--5kmtj-eth0", GenerateName:"calico-apiserver-7cdd895f86-", Namespace:"calico-system", SelfLink:"", UID:"dd5d2b91-08a1-4e2b-97af-8acbc0f68c75", ResourceVersion:"894", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 1, 10, 57, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", 
"pod-template-hash":"7cdd895f86", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-e40d23dcbc", ContainerID:"", Pod:"calico-apiserver-7cdd895f86-5kmtj", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.95.71/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"cali2b9e67f2168", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 01:11:26.496493 containerd[1517]: 2026-03-07 01:11:26.418 [INFO][4000] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.95.71/32] ContainerID="2976bafea0560be21ac9a9d8307a3aaa07ffed41608b5ba64af944887655c5a8" Namespace="calico-system" Pod="calico-apiserver-7cdd895f86-5kmtj" WorkloadEndpoint="ci--4081--3--6--n--e40d23dcbc-k8s-calico--apiserver--7cdd895f86--5kmtj-eth0" Mar 7 01:11:26.496493 containerd[1517]: 2026-03-07 01:11:26.419 [INFO][4000] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali2b9e67f2168 ContainerID="2976bafea0560be21ac9a9d8307a3aaa07ffed41608b5ba64af944887655c5a8" Namespace="calico-system" Pod="calico-apiserver-7cdd895f86-5kmtj" WorkloadEndpoint="ci--4081--3--6--n--e40d23dcbc-k8s-calico--apiserver--7cdd895f86--5kmtj-eth0" Mar 7 01:11:26.496493 containerd[1517]: 2026-03-07 01:11:26.438 [INFO][4000] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="2976bafea0560be21ac9a9d8307a3aaa07ffed41608b5ba64af944887655c5a8" Namespace="calico-system" Pod="calico-apiserver-7cdd895f86-5kmtj" 
WorkloadEndpoint="ci--4081--3--6--n--e40d23dcbc-k8s-calico--apiserver--7cdd895f86--5kmtj-eth0" Mar 7 01:11:26.496493 containerd[1517]: 2026-03-07 01:11:26.441 [INFO][4000] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="2976bafea0560be21ac9a9d8307a3aaa07ffed41608b5ba64af944887655c5a8" Namespace="calico-system" Pod="calico-apiserver-7cdd895f86-5kmtj" WorkloadEndpoint="ci--4081--3--6--n--e40d23dcbc-k8s-calico--apiserver--7cdd895f86--5kmtj-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--e40d23dcbc-k8s-calico--apiserver--7cdd895f86--5kmtj-eth0", GenerateName:"calico-apiserver-7cdd895f86-", Namespace:"calico-system", SelfLink:"", UID:"dd5d2b91-08a1-4e2b-97af-8acbc0f68c75", ResourceVersion:"894", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 1, 10, 57, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"7cdd895f86", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-e40d23dcbc", ContainerID:"2976bafea0560be21ac9a9d8307a3aaa07ffed41608b5ba64af944887655c5a8", Pod:"calico-apiserver-7cdd895f86-5kmtj", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.95.71/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"cali2b9e67f2168", MAC:"f6:da:3e:29:12:59", 
Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 01:11:26.496493 containerd[1517]: 2026-03-07 01:11:26.488 [INFO][4000] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="2976bafea0560be21ac9a9d8307a3aaa07ffed41608b5ba64af944887655c5a8" Namespace="calico-system" Pod="calico-apiserver-7cdd895f86-5kmtj" WorkloadEndpoint="ci--4081--3--6--n--e40d23dcbc-k8s-calico--apiserver--7cdd895f86--5kmtj-eth0" Mar 7 01:11:26.496493 containerd[1517]: time="2026-03-07T01:11:26.496378787Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-2g6xd,Uid:9e84ea27-025b-43b3-a817-46d96e7f19b0,Namespace:kube-system,Attempt:1,} returns sandbox id \"2cbb2d69339d11b81f984fc78418b54d058103983570ba9cbb6f1207182611b2\"" Mar 7 01:11:26.503190 containerd[1517]: time="2026-03-07T01:11:26.502563227Z" level=info msg="CreateContainer within sandbox \"2cbb2d69339d11b81f984fc78418b54d058103983570ba9cbb6f1207182611b2\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Mar 7 01:11:26.527598 containerd[1517]: time="2026-03-07T01:11:26.527555917Z" level=info msg="CreateContainer within sandbox \"2cbb2d69339d11b81f984fc78418b54d058103983570ba9cbb6f1207182611b2\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"b90c3b4c92e94c9bef4c5bd8f65762b83f1bdcb87ad652029575c5a8a9e95b18\"" Mar 7 01:11:26.529252 containerd[1517]: time="2026-03-07T01:11:26.528147661Z" level=info msg="StartContainer for \"b90c3b4c92e94c9bef4c5bd8f65762b83f1bdcb87ad652029575c5a8a9e95b18\"" Mar 7 01:11:26.540800 containerd[1517]: time="2026-03-07T01:11:26.540590169Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 7 01:11:26.540800 containerd[1517]: time="2026-03-07T01:11:26.540648136Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 7 01:11:26.540800 containerd[1517]: time="2026-03-07T01:11:26.540658475Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 7 01:11:26.540800 containerd[1517]: time="2026-03-07T01:11:26.540735832Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 7 01:11:26.564604 systemd[1]: Started cri-containerd-b90c3b4c92e94c9bef4c5bd8f65762b83f1bdcb87ad652029575c5a8a9e95b18.scope - libcontainer container b90c3b4c92e94c9bef4c5bd8f65762b83f1bdcb87ad652029575c5a8a9e95b18. Mar 7 01:11:26.569817 systemd[1]: Started cri-containerd-2976bafea0560be21ac9a9d8307a3aaa07ffed41608b5ba64af944887655c5a8.scope - libcontainer container 2976bafea0560be21ac9a9d8307a3aaa07ffed41608b5ba64af944887655c5a8. Mar 7 01:11:26.574897 kubelet[2578]: I0307 01:11:26.574858 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mt84c\" (UniqueName: \"kubernetes.io/projected/19a5e925-9ab8-49d1-a520-8c5053e158f3-kube-api-access-mt84c\") pod \"whisker-7c998d4b96-pt4hf\" (UID: \"19a5e925-9ab8-49d1-a520-8c5053e158f3\") " pod="calico-system/whisker-7c998d4b96-pt4hf" Mar 7 01:11:26.574897 kubelet[2578]: I0307 01:11:26.574898 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/19a5e925-9ab8-49d1-a520-8c5053e158f3-whisker-ca-bundle\") pod \"whisker-7c998d4b96-pt4hf\" (UID: \"19a5e925-9ab8-49d1-a520-8c5053e158f3\") " pod="calico-system/whisker-7c998d4b96-pt4hf" Mar 7 01:11:26.575007 kubelet[2578]: I0307 01:11:26.574912 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: 
\"kubernetes.io/secret/19a5e925-9ab8-49d1-a520-8c5053e158f3-whisker-backend-key-pair\") pod \"whisker-7c998d4b96-pt4hf\" (UID: \"19a5e925-9ab8-49d1-a520-8c5053e158f3\") " pod="calico-system/whisker-7c998d4b96-pt4hf" Mar 7 01:11:26.575007 kubelet[2578]: I0307 01:11:26.574925 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-config\" (UniqueName: \"kubernetes.io/configmap/19a5e925-9ab8-49d1-a520-8c5053e158f3-nginx-config\") pod \"whisker-7c998d4b96-pt4hf\" (UID: \"19a5e925-9ab8-49d1-a520-8c5053e158f3\") " pod="calico-system/whisker-7c998d4b96-pt4hf" Mar 7 01:11:26.606157 containerd[1517]: time="2026-03-07T01:11:26.606073212Z" level=info msg="StartContainer for \"b90c3b4c92e94c9bef4c5bd8f65762b83f1bdcb87ad652029575c5a8a9e95b18\" returns successfully" Mar 7 01:11:26.696081 containerd[1517]: time="2026-03-07T01:11:26.696024538Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7cdd895f86-5kmtj,Uid:dd5d2b91-08a1-4e2b-97af-8acbc0f68c75,Namespace:calico-system,Attempt:1,} returns sandbox id \"2976bafea0560be21ac9a9d8307a3aaa07ffed41608b5ba64af944887655c5a8\"" Mar 7 01:11:26.775379 containerd[1517]: time="2026-03-07T01:11:26.775340968Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-7c998d4b96-pt4hf,Uid:19a5e925-9ab8-49d1-a520-8c5053e158f3,Namespace:calico-system,Attempt:0,}" Mar 7 01:11:26.932079 systemd-networkd[1392]: caliabd2c646706: Link UP Mar 7 01:11:26.932822 systemd-networkd[1392]: caliabd2c646706: Gained carrier Mar 7 01:11:26.958961 containerd[1517]: 2026-03-07 01:11:26.835 [ERROR][4567] cni-plugin/utils.go 116: File does not exist, skipping the error since RequireMTUFile is false error=open /var/lib/calico/mtu: no such file or directory filename="/var/lib/calico/mtu" Mar 7 01:11:26.958961 containerd[1517]: 2026-03-07 01:11:26.848 [INFO][4567] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} 
{ci--4081--3--6--n--e40d23dcbc-k8s-whisker--7c998d4b96--pt4hf-eth0 whisker-7c998d4b96- calico-system 19a5e925-9ab8-49d1-a520-8c5053e158f3 943 0 2026-03-07 01:11:26 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:7c998d4b96 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s ci-4081-3-6-n-e40d23dcbc whisker-7c998d4b96-pt4hf eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] caliabd2c646706 [] [] }} ContainerID="05b03ace6f5a64c75bd4e1ae73264acab50e6e430dfa6c08ed37bbae4da4518a" Namespace="calico-system" Pod="whisker-7c998d4b96-pt4hf" WorkloadEndpoint="ci--4081--3--6--n--e40d23dcbc-k8s-whisker--7c998d4b96--pt4hf-" Mar 7 01:11:26.958961 containerd[1517]: 2026-03-07 01:11:26.849 [INFO][4567] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="05b03ace6f5a64c75bd4e1ae73264acab50e6e430dfa6c08ed37bbae4da4518a" Namespace="calico-system" Pod="whisker-7c998d4b96-pt4hf" WorkloadEndpoint="ci--4081--3--6--n--e40d23dcbc-k8s-whisker--7c998d4b96--pt4hf-eth0" Mar 7 01:11:26.958961 containerd[1517]: 2026-03-07 01:11:26.888 [INFO][4581] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="05b03ace6f5a64c75bd4e1ae73264acab50e6e430dfa6c08ed37bbae4da4518a" HandleID="k8s-pod-network.05b03ace6f5a64c75bd4e1ae73264acab50e6e430dfa6c08ed37bbae4da4518a" Workload="ci--4081--3--6--n--e40d23dcbc-k8s-whisker--7c998d4b96--pt4hf-eth0" Mar 7 01:11:26.958961 containerd[1517]: 2026-03-07 01:11:26.895 [INFO][4581] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="05b03ace6f5a64c75bd4e1ae73264acab50e6e430dfa6c08ed37bbae4da4518a" HandleID="k8s-pod-network.05b03ace6f5a64c75bd4e1ae73264acab50e6e430dfa6c08ed37bbae4da4518a" Workload="ci--4081--3--6--n--e40d23dcbc-k8s-whisker--7c998d4b96--pt4hf-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002fddd0), 
Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4081-3-6-n-e40d23dcbc", "pod":"whisker-7c998d4b96-pt4hf", "timestamp":"2026-03-07 01:11:26.888942253 +0000 UTC"}, Hostname:"ci-4081-3-6-n-e40d23dcbc", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0xc000371340)} Mar 7 01:11:26.958961 containerd[1517]: 2026-03-07 01:11:26.896 [INFO][4581] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 7 01:11:26.958961 containerd[1517]: 2026-03-07 01:11:26.896 [INFO][4581] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 7 01:11:26.958961 containerd[1517]: 2026-03-07 01:11:26.896 [INFO][4581] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4081-3-6-n-e40d23dcbc' Mar 7 01:11:26.958961 containerd[1517]: 2026-03-07 01:11:26.898 [INFO][4581] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.05b03ace6f5a64c75bd4e1ae73264acab50e6e430dfa6c08ed37bbae4da4518a" host="ci-4081-3-6-n-e40d23dcbc" Mar 7 01:11:26.958961 containerd[1517]: 2026-03-07 01:11:26.902 [INFO][4581] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4081-3-6-n-e40d23dcbc" Mar 7 01:11:26.958961 containerd[1517]: 2026-03-07 01:11:26.907 [INFO][4581] ipam/ipam.go 526: Trying affinity for 192.168.95.64/26 host="ci-4081-3-6-n-e40d23dcbc" Mar 7 01:11:26.958961 containerd[1517]: 2026-03-07 01:11:26.908 [INFO][4581] ipam/ipam.go 160: Attempting to load block cidr=192.168.95.64/26 host="ci-4081-3-6-n-e40d23dcbc" Mar 7 01:11:26.958961 containerd[1517]: 2026-03-07 01:11:26.911 [INFO][4581] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.95.64/26 host="ci-4081-3-6-n-e40d23dcbc" Mar 7 01:11:26.958961 containerd[1517]: 2026-03-07 01:11:26.912 [INFO][4581] ipam/ipam.go 1245: Attempting to assign 1 addresses from block 
block=192.168.95.64/26 handle="k8s-pod-network.05b03ace6f5a64c75bd4e1ae73264acab50e6e430dfa6c08ed37bbae4da4518a" host="ci-4081-3-6-n-e40d23dcbc" Mar 7 01:11:26.958961 containerd[1517]: 2026-03-07 01:11:26.914 [INFO][4581] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.05b03ace6f5a64c75bd4e1ae73264acab50e6e430dfa6c08ed37bbae4da4518a Mar 7 01:11:26.958961 containerd[1517]: 2026-03-07 01:11:26.918 [INFO][4581] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.95.64/26 handle="k8s-pod-network.05b03ace6f5a64c75bd4e1ae73264acab50e6e430dfa6c08ed37bbae4da4518a" host="ci-4081-3-6-n-e40d23dcbc" Mar 7 01:11:26.958961 containerd[1517]: 2026-03-07 01:11:26.923 [INFO][4581] ipam/ipam.go 1288: Successfully claimed IPs: [192.168.95.72/26] block=192.168.95.64/26 handle="k8s-pod-network.05b03ace6f5a64c75bd4e1ae73264acab50e6e430dfa6c08ed37bbae4da4518a" host="ci-4081-3-6-n-e40d23dcbc" Mar 7 01:11:26.958961 containerd[1517]: 2026-03-07 01:11:26.924 [INFO][4581] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.95.72/26] handle="k8s-pod-network.05b03ace6f5a64c75bd4e1ae73264acab50e6e430dfa6c08ed37bbae4da4518a" host="ci-4081-3-6-n-e40d23dcbc" Mar 7 01:11:26.958961 containerd[1517]: 2026-03-07 01:11:26.924 [INFO][4581] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. 
Mar 7 01:11:26.958961 containerd[1517]: 2026-03-07 01:11:26.924 [INFO][4581] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.95.72/26] IPv6=[] ContainerID="05b03ace6f5a64c75bd4e1ae73264acab50e6e430dfa6c08ed37bbae4da4518a" HandleID="k8s-pod-network.05b03ace6f5a64c75bd4e1ae73264acab50e6e430dfa6c08ed37bbae4da4518a" Workload="ci--4081--3--6--n--e40d23dcbc-k8s-whisker--7c998d4b96--pt4hf-eth0" Mar 7 01:11:26.959464 containerd[1517]: 2026-03-07 01:11:26.927 [INFO][4567] cni-plugin/k8s.go 418: Populated endpoint ContainerID="05b03ace6f5a64c75bd4e1ae73264acab50e6e430dfa6c08ed37bbae4da4518a" Namespace="calico-system" Pod="whisker-7c998d4b96-pt4hf" WorkloadEndpoint="ci--4081--3--6--n--e40d23dcbc-k8s-whisker--7c998d4b96--pt4hf-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--e40d23dcbc-k8s-whisker--7c998d4b96--pt4hf-eth0", GenerateName:"whisker-7c998d4b96-", Namespace:"calico-system", SelfLink:"", UID:"19a5e925-9ab8-49d1-a520-8c5053e158f3", ResourceVersion:"943", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 1, 11, 26, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"7c998d4b96", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-e40d23dcbc", ContainerID:"", Pod:"whisker-7c998d4b96-pt4hf", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.95.72/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", 
"ksa.calico-system.whisker"}, InterfaceName:"caliabd2c646706", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 01:11:26.959464 containerd[1517]: 2026-03-07 01:11:26.927 [INFO][4567] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.95.72/32] ContainerID="05b03ace6f5a64c75bd4e1ae73264acab50e6e430dfa6c08ed37bbae4da4518a" Namespace="calico-system" Pod="whisker-7c998d4b96-pt4hf" WorkloadEndpoint="ci--4081--3--6--n--e40d23dcbc-k8s-whisker--7c998d4b96--pt4hf-eth0" Mar 7 01:11:26.959464 containerd[1517]: 2026-03-07 01:11:26.927 [INFO][4567] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to caliabd2c646706 ContainerID="05b03ace6f5a64c75bd4e1ae73264acab50e6e430dfa6c08ed37bbae4da4518a" Namespace="calico-system" Pod="whisker-7c998d4b96-pt4hf" WorkloadEndpoint="ci--4081--3--6--n--e40d23dcbc-k8s-whisker--7c998d4b96--pt4hf-eth0" Mar 7 01:11:26.959464 containerd[1517]: 2026-03-07 01:11:26.933 [INFO][4567] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="05b03ace6f5a64c75bd4e1ae73264acab50e6e430dfa6c08ed37bbae4da4518a" Namespace="calico-system" Pod="whisker-7c998d4b96-pt4hf" WorkloadEndpoint="ci--4081--3--6--n--e40d23dcbc-k8s-whisker--7c998d4b96--pt4hf-eth0" Mar 7 01:11:26.959464 containerd[1517]: 2026-03-07 01:11:26.934 [INFO][4567] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="05b03ace6f5a64c75bd4e1ae73264acab50e6e430dfa6c08ed37bbae4da4518a" Namespace="calico-system" Pod="whisker-7c998d4b96-pt4hf" WorkloadEndpoint="ci--4081--3--6--n--e40d23dcbc-k8s-whisker--7c998d4b96--pt4hf-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--e40d23dcbc-k8s-whisker--7c998d4b96--pt4hf-eth0", GenerateName:"whisker-7c998d4b96-", Namespace:"calico-system", SelfLink:"", 
UID:"19a5e925-9ab8-49d1-a520-8c5053e158f3", ResourceVersion:"943", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 1, 11, 26, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"7c998d4b96", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-e40d23dcbc", ContainerID:"05b03ace6f5a64c75bd4e1ae73264acab50e6e430dfa6c08ed37bbae4da4518a", Pod:"whisker-7c998d4b96-pt4hf", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.95.72/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"caliabd2c646706", MAC:"b2:b1:eb:67:8b:16", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 01:11:26.959464 containerd[1517]: 2026-03-07 01:11:26.952 [INFO][4567] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="05b03ace6f5a64c75bd4e1ae73264acab50e6e430dfa6c08ed37bbae4da4518a" Namespace="calico-system" Pod="whisker-7c998d4b96-pt4hf" WorkloadEndpoint="ci--4081--3--6--n--e40d23dcbc-k8s-whisker--7c998d4b96--pt4hf-eth0" Mar 7 01:11:26.979919 containerd[1517]: time="2026-03-07T01:11:26.979821728Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 7 01:11:26.979919 containerd[1517]: time="2026-03-07T01:11:26.979916455Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 7 01:11:26.980050 containerd[1517]: time="2026-03-07T01:11:26.979952623Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 7 01:11:26.980208 containerd[1517]: time="2026-03-07T01:11:26.980164694Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 7 01:11:27.012618 systemd[1]: Started cri-containerd-05b03ace6f5a64c75bd4e1ae73264acab50e6e430dfa6c08ed37bbae4da4518a.scope - libcontainer container 05b03ace6f5a64c75bd4e1ae73264acab50e6e430dfa6c08ed37bbae4da4518a. Mar 7 01:11:27.035433 kernel: calico-node[4494]: memfd_create() called without MFD_EXEC or MFD_NOEXEC_SEAL set Mar 7 01:11:27.069709 containerd[1517]: time="2026-03-07T01:11:27.069643795Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-7c998d4b96-pt4hf,Uid:19a5e925-9ab8-49d1-a520-8c5053e158f3,Namespace:calico-system,Attempt:0,} returns sandbox id \"05b03ace6f5a64c75bd4e1ae73264acab50e6e430dfa6c08ed37bbae4da4518a\"" Mar 7 01:11:27.076667 systemd-networkd[1392]: calidfd259856e7: Gained IPv6LL Mar 7 01:11:27.147117 systemd-networkd[1392]: calif3629baf55b: Gained IPv6LL Mar 7 01:11:27.147391 systemd-networkd[1392]: cali8b5458b32a4: Gained IPv6LL Mar 7 01:11:27.363985 kubelet[2578]: I0307 01:11:27.363734 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-674b8bbfcf-2g6xd" podStartSLOduration=40.363716838 podStartE2EDuration="40.363716838s" podCreationTimestamp="2026-03-07 01:10:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 01:11:27.362587835 +0000 UTC m=+45.287554876" watchObservedRunningTime="2026-03-07 01:11:27.363716838 +0000 UTC m=+45.288683879" Mar 7 01:11:27.557430 systemd-networkd[1392]: vxlan.calico: Link UP Mar 7 
01:11:27.557439 systemd-networkd[1392]: vxlan.calico: Gained carrier Mar 7 01:11:27.588547 systemd-networkd[1392]: cali4173cd9ad7b: Gained IPv6LL Mar 7 01:11:27.591504 systemd-networkd[1392]: cali2b9e67f2168: Gained IPv6LL Mar 7 01:11:27.653557 systemd-networkd[1392]: califc3c9bf6373: Gained IPv6LL Mar 7 01:11:28.161610 kubelet[2578]: I0307 01:11:28.161435 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32e149ae-638d-489c-a5af-ccd7aafddaba" path="/var/lib/kubelet/pods/32e149ae-638d-489c-a5af-ccd7aafddaba/volumes" Mar 7 01:11:28.165752 systemd-networkd[1392]: caliaa77fcdc760: Gained IPv6LL Mar 7 01:11:28.996968 systemd-networkd[1392]: caliabd2c646706: Gained IPv6LL Mar 7 01:11:29.061223 systemd-networkd[1392]: vxlan.calico: Gained IPv6LL Mar 7 01:11:29.152639 containerd[1517]: time="2026-03-07T01:11:29.152575598Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 01:11:29.153609 containerd[1517]: time="2026-03-07T01:11:29.153427323Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.31.4: active requests=0, bytes read=52406348" Mar 7 01:11:29.154409 containerd[1517]: time="2026-03-07T01:11:29.154301099Z" level=info msg="ImageCreate event name:\"sha256:ff033cc89dab51090bfa1b04e155a5ce1e3b1f324f74b7b2be0dd6f0b6b10e89\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 01:11:29.156179 containerd[1517]: time="2026-03-07T01:11:29.156142337Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:99b8bb50141ca55b4b6ddfcf2f2fbde838265508ab2ac96ed08e72cd39800713\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 01:11:29.156644 containerd[1517]: time="2026-03-07T01:11:29.156610199Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\" with image id 
\"sha256:ff033cc89dab51090bfa1b04e155a5ce1e3b1f324f74b7b2be0dd6f0b6b10e89\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:99b8bb50141ca55b4b6ddfcf2f2fbde838265508ab2ac96ed08e72cd39800713\", size \"53962361\" in 3.258932908s" Mar 7 01:11:29.156685 containerd[1517]: time="2026-03-07T01:11:29.156646417Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\" returns image reference \"sha256:ff033cc89dab51090bfa1b04e155a5ce1e3b1f324f74b7b2be0dd6f0b6b10e89\"" Mar 7 01:11:29.158525 containerd[1517]: time="2026-03-07T01:11:29.158077151Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.31.4\"" Mar 7 01:11:29.170462 containerd[1517]: time="2026-03-07T01:11:29.169955744Z" level=info msg="CreateContainer within sandbox \"febbd0769d9169deac964b48da2739506b9b90adbf8b43d4485fee546aa210c1\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" Mar 7 01:11:29.182312 containerd[1517]: time="2026-03-07T01:11:29.182268660Z" level=info msg="CreateContainer within sandbox \"febbd0769d9169deac964b48da2739506b9b90adbf8b43d4485fee546aa210c1\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"2bc79fdeb9a9926590683748125695e7ba12209c1982e46099b384c5b3f3883e\"" Mar 7 01:11:29.184587 containerd[1517]: time="2026-03-07T01:11:29.182852597Z" level=info msg="StartContainer for \"2bc79fdeb9a9926590683748125695e7ba12209c1982e46099b384c5b3f3883e\"" Mar 7 01:11:29.228546 systemd[1]: Started cri-containerd-2bc79fdeb9a9926590683748125695e7ba12209c1982e46099b384c5b3f3883e.scope - libcontainer container 2bc79fdeb9a9926590683748125695e7ba12209c1982e46099b384c5b3f3883e. 
Mar 7 01:11:29.265772 containerd[1517]: time="2026-03-07T01:11:29.265682272Z" level=info msg="StartContainer for \"2bc79fdeb9a9926590683748125695e7ba12209c1982e46099b384c5b3f3883e\" returns successfully" Mar 7 01:11:29.359585 kubelet[2578]: I0307 01:11:29.359508 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-cf679fccd-lv8s8" podStartSLOduration=29.099216642 podStartE2EDuration="32.359472376s" podCreationTimestamp="2026-03-07 01:10:57 +0000 UTC" firstStartedPulling="2026-03-07 01:11:25.897118065 +0000 UTC m=+43.822085116" lastFinishedPulling="2026-03-07 01:11:29.157373809 +0000 UTC m=+47.082340850" observedRunningTime="2026-03-07 01:11:29.359173437 +0000 UTC m=+47.284140488" watchObservedRunningTime="2026-03-07 01:11:29.359472376 +0000 UTC m=+47.284439427" Mar 7 01:11:31.772890 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount58063285.mount: Deactivated successfully. Mar 7 01:11:33.041298 containerd[1517]: time="2026-03-07T01:11:33.041251835Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 01:11:33.042408 containerd[1517]: time="2026-03-07T01:11:33.042363486Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.31.4: active requests=0, bytes read=55623386" Mar 7 01:11:33.043473 containerd[1517]: time="2026-03-07T01:11:33.043447979Z" level=info msg="ImageCreate event name:\"sha256:714983e5e920bbe810fab04d9f06bd16ef4e552b0d2deffd7ab2b2c4a001acbb\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 01:11:33.045640 containerd[1517]: time="2026-03-07T01:11:33.045608384Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane@sha256:44395ca5ebfe88f21ed51acfbec5fc0f31d2762966e2007a0a2eb9b30e35fc4d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 01:11:33.046140 containerd[1517]: time="2026-03-07T01:11:33.046055398Z" 
level=info msg="Pulled image \"ghcr.io/flatcar/calico/goldmane:v3.31.4\" with image id \"sha256:714983e5e920bbe810fab04d9f06bd16ef4e552b0d2deffd7ab2b2c4a001acbb\", repo tag \"ghcr.io/flatcar/calico/goldmane:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/goldmane@sha256:44395ca5ebfe88f21ed51acfbec5fc0f31d2762966e2007a0a2eb9b30e35fc4d\", size \"55623232\" in 3.887952068s" Mar 7 01:11:33.046140 containerd[1517]: time="2026-03-07T01:11:33.046075977Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.31.4\" returns image reference \"sha256:714983e5e920bbe810fab04d9f06bd16ef4e552b0d2deffd7ab2b2c4a001acbb\"" Mar 7 01:11:33.047441 containerd[1517]: time="2026-03-07T01:11:33.047419460Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.31.4\"" Mar 7 01:11:33.049951 containerd[1517]: time="2026-03-07T01:11:33.049927513Z" level=info msg="CreateContainer within sandbox \"3430c767ec7cdfeb05b382975eaf4897507eadf286a7b98791b122e01f1da2a5\" for container &ContainerMetadata{Name:goldmane,Attempt:0,}" Mar 7 01:11:33.068492 containerd[1517]: time="2026-03-07T01:11:33.068454211Z" level=info msg="CreateContainer within sandbox \"3430c767ec7cdfeb05b382975eaf4897507eadf286a7b98791b122e01f1da2a5\" for &ContainerMetadata{Name:goldmane,Attempt:0,} returns container id \"5f76eac20603047f69b5a483285eeb195336ad86ba79d7dd10e94192406bda87\"" Mar 7 01:11:33.069237 containerd[1517]: time="2026-03-07T01:11:33.069212335Z" level=info msg="StartContainer for \"5f76eac20603047f69b5a483285eeb195336ad86ba79d7dd10e94192406bda87\"" Mar 7 01:11:33.105518 systemd[1]: Started cri-containerd-5f76eac20603047f69b5a483285eeb195336ad86ba79d7dd10e94192406bda87.scope - libcontainer container 5f76eac20603047f69b5a483285eeb195336ad86ba79d7dd10e94192406bda87. 
Mar 7 01:11:33.141968 containerd[1517]: time="2026-03-07T01:11:33.141903962Z" level=info msg="StartContainer for \"5f76eac20603047f69b5a483285eeb195336ad86ba79d7dd10e94192406bda87\" returns successfully" Mar 7 01:11:33.385317 kubelet[2578]: I0307 01:11:33.384575 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/goldmane-5b85766d88-8nxm8" podStartSLOduration=29.577180913 podStartE2EDuration="36.38455708s" podCreationTimestamp="2026-03-07 01:10:57 +0000 UTC" firstStartedPulling="2026-03-07 01:11:26.239451514 +0000 UTC m=+44.164418565" lastFinishedPulling="2026-03-07 01:11:33.046827691 +0000 UTC m=+50.971794732" observedRunningTime="2026-03-07 01:11:33.383665512 +0000 UTC m=+51.308632583" watchObservedRunningTime="2026-03-07 01:11:33.38455708 +0000 UTC m=+51.309524151" Mar 7 01:11:34.406738 systemd[1]: run-containerd-runc-k8s.io-5f76eac20603047f69b5a483285eeb195336ad86ba79d7dd10e94192406bda87-runc.T1LWep.mount: Deactivated successfully. Mar 7 01:11:36.632059 containerd[1517]: time="2026-03-07T01:11:36.632009010Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 01:11:36.632998 containerd[1517]: time="2026-03-07T01:11:36.632902390Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.31.4: active requests=0, bytes read=48415780" Mar 7 01:11:36.633736 containerd[1517]: time="2026-03-07T01:11:36.633611518Z" level=info msg="ImageCreate event name:\"sha256:f7ff80340b9b4973ceda29859065985831588b2898f2b4009f742b5789010898\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 01:11:36.635188 containerd[1517]: time="2026-03-07T01:11:36.635171418Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:d212af1da3dd52a633bc9e36653a7d901d95a570f8d51d1968a837dcf6879730\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 01:11:36.635658 containerd[1517]: 
time="2026-03-07T01:11:36.635631994Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.31.4\" with image id \"sha256:f7ff80340b9b4973ceda29859065985831588b2898f2b4009f742b5789010898\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:d212af1da3dd52a633bc9e36653a7d901d95a570f8d51d1968a837dcf6879730\", size \"49971841\" in 3.587907993s" Mar 7 01:11:36.635691 containerd[1517]: time="2026-03-07T01:11:36.635661833Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.31.4\" returns image reference \"sha256:f7ff80340b9b4973ceda29859065985831588b2898f2b4009f742b5789010898\"" Mar 7 01:11:36.637692 containerd[1517]: time="2026-03-07T01:11:36.637574211Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.31.4\"" Mar 7 01:11:36.639335 containerd[1517]: time="2026-03-07T01:11:36.639311436Z" level=info msg="CreateContainer within sandbox \"7a1f5b82d89e8a71d0438dd8dd244af407072ac3a65d126412fec47cbe237393\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Mar 7 01:11:36.660211 containerd[1517]: time="2026-03-07T01:11:36.660171699Z" level=info msg="CreateContainer within sandbox \"7a1f5b82d89e8a71d0438dd8dd244af407072ac3a65d126412fec47cbe237393\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"4a9c858d84a1a63546d86e118ae2ae694c3133e8841fe6fefa646a4c0b3a996e\"" Mar 7 01:11:36.660463 containerd[1517]: time="2026-03-07T01:11:36.660447780Z" level=info msg="StartContainer for \"4a9c858d84a1a63546d86e118ae2ae694c3133e8841fe6fefa646a4c0b3a996e\"" Mar 7 01:11:36.687596 systemd[1]: Started cri-containerd-4a9c858d84a1a63546d86e118ae2ae694c3133e8841fe6fefa646a4c0b3a996e.scope - libcontainer container 4a9c858d84a1a63546d86e118ae2ae694c3133e8841fe6fefa646a4c0b3a996e. 
Mar 7 01:11:36.727249 containerd[1517]: time="2026-03-07T01:11:36.727180358Z" level=info msg="StartContainer for \"4a9c858d84a1a63546d86e118ae2ae694c3133e8841fe6fefa646a4c0b3a996e\" returns successfully" Mar 7 01:11:37.381336 kubelet[2578]: I0307 01:11:37.380997 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-apiserver-7cdd895f86-pmmf8" podStartSLOduration=30.224544475 podStartE2EDuration="40.38098352s" podCreationTimestamp="2026-03-07 01:10:57 +0000 UTC" firstStartedPulling="2026-03-07 01:11:26.479907895 +0000 UTC m=+44.404874936" lastFinishedPulling="2026-03-07 01:11:36.63634694 +0000 UTC m=+54.561313981" observedRunningTime="2026-03-07 01:11:37.380789826 +0000 UTC m=+55.305756867" watchObservedRunningTime="2026-03-07 01:11:37.38098352 +0000 UTC m=+55.305950571" Mar 7 01:11:39.217181 containerd[1517]: time="2026-03-07T01:11:39.217099449Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 01:11:39.218269 containerd[1517]: time="2026-03-07T01:11:39.218114659Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.31.4: active requests=0, bytes read=8792502" Mar 7 01:11:39.219194 containerd[1517]: time="2026-03-07T01:11:39.219073561Z" level=info msg="ImageCreate event name:\"sha256:4c8cd7d0b10a4df64a5bd90e9845e9d1edbe0e37c2ebfc171bb28698e07abf72\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 01:11:39.220868 containerd[1517]: time="2026-03-07T01:11:39.220825879Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:ab57dd6f8423ef7b3ff382bf4ca5ace6063bdca77d441d852c75ec58847dd280\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 01:11:39.221428 containerd[1517]: time="2026-03-07T01:11:39.221339404Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.31.4\" with image id \"sha256:4c8cd7d0b10a4df64a5bd90e9845e9d1edbe0e37c2ebfc171bb28698e07abf72\", 
repo tag \"ghcr.io/flatcar/calico/csi:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:ab57dd6f8423ef7b3ff382bf4ca5ace6063bdca77d441d852c75ec58847dd280\", size \"10348547\" in 2.583746333s" Mar 7 01:11:39.221428 containerd[1517]: time="2026-03-07T01:11:39.221376943Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.31.4\" returns image reference \"sha256:4c8cd7d0b10a4df64a5bd90e9845e9d1edbe0e37c2ebfc171bb28698e07abf72\"" Mar 7 01:11:39.223435 containerd[1517]: time="2026-03-07T01:11:39.223219297Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.31.4\"" Mar 7 01:11:39.226268 containerd[1517]: time="2026-03-07T01:11:39.226235658Z" level=info msg="CreateContainer within sandbox \"f92e9c16fbac8fe41c25f2c23a7308df996ea1160632969acda399e8260d2d2d\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" Mar 7 01:11:39.248296 containerd[1517]: time="2026-03-07T01:11:39.248260154Z" level=info msg="CreateContainer within sandbox \"f92e9c16fbac8fe41c25f2c23a7308df996ea1160632969acda399e8260d2d2d\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"a6894184dcd060c5e6fa150c9479a7c1d24bd43c41a3666b2394055ffcc8008f\"" Mar 7 01:11:39.250110 containerd[1517]: time="2026-03-07T01:11:39.248978203Z" level=info msg="StartContainer for \"a6894184dcd060c5e6fa150c9479a7c1d24bd43c41a3666b2394055ffcc8008f\"" Mar 7 01:11:39.284577 systemd[1]: Started cri-containerd-a6894184dcd060c5e6fa150c9479a7c1d24bd43c41a3666b2394055ffcc8008f.scope - libcontainer container a6894184dcd060c5e6fa150c9479a7c1d24bd43c41a3666b2394055ffcc8008f. 
Mar 7 01:11:39.312424 containerd[1517]: time="2026-03-07T01:11:39.312285194Z" level=info msg="StartContainer for \"a6894184dcd060c5e6fa150c9479a7c1d24bd43c41a3666b2394055ffcc8008f\" returns successfully" Mar 7 01:11:39.674907 containerd[1517]: time="2026-03-07T01:11:39.674836759Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 01:11:39.676054 containerd[1517]: time="2026-03-07T01:11:39.676018384Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.31.4: active requests=0, bytes read=77" Mar 7 01:11:39.678155 containerd[1517]: time="2026-03-07T01:11:39.678059802Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.31.4\" with image id \"sha256:f7ff80340b9b4973ceda29859065985831588b2898f2b4009f742b5789010898\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:d212af1da3dd52a633bc9e36653a7d901d95a570f8d51d1968a837dcf6879730\", size \"49971841\" in 454.815425ms" Mar 7 01:11:39.678155 containerd[1517]: time="2026-03-07T01:11:39.678085542Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.31.4\" returns image reference \"sha256:f7ff80340b9b4973ceda29859065985831588b2898f2b4009f742b5789010898\"" Mar 7 01:11:39.679719 containerd[1517]: time="2026-03-07T01:11:39.679656356Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.31.4\"" Mar 7 01:11:39.684670 containerd[1517]: time="2026-03-07T01:11:39.684579959Z" level=info msg="CreateContainer within sandbox \"2976bafea0560be21ac9a9d8307a3aaa07ffed41608b5ba64af944887655c5a8\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Mar 7 01:11:39.704983 containerd[1517]: time="2026-03-07T01:11:39.704894576Z" level=info msg="CreateContainer within sandbox \"2976bafea0560be21ac9a9d8307a3aaa07ffed41608b5ba64af944887655c5a8\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container 
id \"d6c8ee3f5456d96910b0a4587d135e72b190b6b0e3b04a0bc91c816726d2bd36\"" Mar 7 01:11:39.707727 containerd[1517]: time="2026-03-07T01:11:39.707441100Z" level=info msg="StartContainer for \"d6c8ee3f5456d96910b0a4587d135e72b190b6b0e3b04a0bc91c816726d2bd36\"" Mar 7 01:11:39.745888 systemd[1]: Started cri-containerd-d6c8ee3f5456d96910b0a4587d135e72b190b6b0e3b04a0bc91c816726d2bd36.scope - libcontainer container d6c8ee3f5456d96910b0a4587d135e72b190b6b0e3b04a0bc91c816726d2bd36. Mar 7 01:11:39.785493 containerd[1517]: time="2026-03-07T01:11:39.785458954Z" level=info msg="StartContainer for \"d6c8ee3f5456d96910b0a4587d135e72b190b6b0e3b04a0bc91c816726d2bd36\" returns successfully" Mar 7 01:11:41.392184 kubelet[2578]: I0307 01:11:41.390542 2578 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 7 01:11:41.588703 containerd[1517]: time="2026-03-07T01:11:41.588339647Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 01:11:41.589454 containerd[1517]: time="2026-03-07T01:11:41.589328249Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.31.4: active requests=0, bytes read=6039889" Mar 7 01:11:41.590232 containerd[1517]: time="2026-03-07T01:11:41.590175144Z" level=info msg="ImageCreate event name:\"sha256:c02b0051502f3aa7f0815d838ea93b53dfb6bd13f185d229260e08200daf7cf7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 01:11:41.591811 containerd[1517]: time="2026-03-07T01:11:41.591775819Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker@sha256:9690cd395efad501f2e0c40ce4969d87b736ae2e5ed454644e7b0fd8f756bfbc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 01:11:41.592691 containerd[1517]: time="2026-03-07T01:11:41.592305184Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker:v3.31.4\" with image id 
\"sha256:c02b0051502f3aa7f0815d838ea93b53dfb6bd13f185d229260e08200daf7cf7\", repo tag \"ghcr.io/flatcar/calico/whisker:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/whisker@sha256:9690cd395efad501f2e0c40ce4969d87b736ae2e5ed454644e7b0fd8f756bfbc\", size \"7595926\" in 1.91258413s" Mar 7 01:11:41.592691 containerd[1517]: time="2026-03-07T01:11:41.592332883Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.31.4\" returns image reference \"sha256:c02b0051502f3aa7f0815d838ea93b53dfb6bd13f185d229260e08200daf7cf7\"" Mar 7 01:11:41.593832 containerd[1517]: time="2026-03-07T01:11:41.593791562Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\"" Mar 7 01:11:41.595579 containerd[1517]: time="2026-03-07T01:11:41.595470204Z" level=info msg="CreateContainer within sandbox \"05b03ace6f5a64c75bd4e1ae73264acab50e6e430dfa6c08ed37bbae4da4518a\" for container &ContainerMetadata{Name:whisker,Attempt:0,}" Mar 7 01:11:41.616496 containerd[1517]: time="2026-03-07T01:11:41.616463367Z" level=info msg="CreateContainer within sandbox \"05b03ace6f5a64c75bd4e1ae73264acab50e6e430dfa6c08ed37bbae4da4518a\" for &ContainerMetadata{Name:whisker,Attempt:0,} returns container id \"4e7aa0a8fd65410616af4514c8bb86cc732cf0313146c197352d1461da826146\"" Mar 7 01:11:41.617305 containerd[1517]: time="2026-03-07T01:11:41.617265875Z" level=info msg="StartContainer for \"4e7aa0a8fd65410616af4514c8bb86cc732cf0313146c197352d1461da826146\"" Mar 7 01:11:41.649530 systemd[1]: Started cri-containerd-4e7aa0a8fd65410616af4514c8bb86cc732cf0313146c197352d1461da826146.scope - libcontainer container 4e7aa0a8fd65410616af4514c8bb86cc732cf0313146c197352d1461da826146. 
Mar 7 01:11:41.685772 containerd[1517]: time="2026-03-07T01:11:41.685665292Z" level=info msg="StartContainer for \"4e7aa0a8fd65410616af4514c8bb86cc732cf0313146c197352d1461da826146\" returns successfully" Mar 7 01:11:42.136469 containerd[1517]: time="2026-03-07T01:11:42.136300908Z" level=info msg="StopPodSandbox for \"aa36742035b6c7029f45e2175197ca8cc331b0bc97047ac8dc523d7087a38d21\"" Mar 7 01:11:42.237551 containerd[1517]: 2026-03-07 01:11:42.202 [WARNING][5185] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="aa36742035b6c7029f45e2175197ca8cc331b0bc97047ac8dc523d7087a38d21" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--e40d23dcbc-k8s-calico--apiserver--7cdd895f86--5kmtj-eth0", GenerateName:"calico-apiserver-7cdd895f86-", Namespace:"calico-system", SelfLink:"", UID:"dd5d2b91-08a1-4e2b-97af-8acbc0f68c75", ResourceVersion:"1035", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 1, 10, 57, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"7cdd895f86", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-e40d23dcbc", ContainerID:"2976bafea0560be21ac9a9d8307a3aaa07ffed41608b5ba64af944887655c5a8", Pod:"calico-apiserver-7cdd895f86-5kmtj", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.95.71/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", 
Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"cali2b9e67f2168", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 01:11:42.237551 containerd[1517]: 2026-03-07 01:11:42.203 [INFO][5185] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="aa36742035b6c7029f45e2175197ca8cc331b0bc97047ac8dc523d7087a38d21" Mar 7 01:11:42.237551 containerd[1517]: 2026-03-07 01:11:42.203 [INFO][5185] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="aa36742035b6c7029f45e2175197ca8cc331b0bc97047ac8dc523d7087a38d21" iface="eth0" netns="" Mar 7 01:11:42.237551 containerd[1517]: 2026-03-07 01:11:42.203 [INFO][5185] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="aa36742035b6c7029f45e2175197ca8cc331b0bc97047ac8dc523d7087a38d21" Mar 7 01:11:42.237551 containerd[1517]: 2026-03-07 01:11:42.203 [INFO][5185] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="aa36742035b6c7029f45e2175197ca8cc331b0bc97047ac8dc523d7087a38d21" Mar 7 01:11:42.237551 containerd[1517]: 2026-03-07 01:11:42.224 [INFO][5195] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="aa36742035b6c7029f45e2175197ca8cc331b0bc97047ac8dc523d7087a38d21" HandleID="k8s-pod-network.aa36742035b6c7029f45e2175197ca8cc331b0bc97047ac8dc523d7087a38d21" Workload="ci--4081--3--6--n--e40d23dcbc-k8s-calico--apiserver--7cdd895f86--5kmtj-eth0" Mar 7 01:11:42.237551 containerd[1517]: 2026-03-07 01:11:42.224 [INFO][5195] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 7 01:11:42.237551 containerd[1517]: 2026-03-07 01:11:42.224 [INFO][5195] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 7 01:11:42.237551 containerd[1517]: 2026-03-07 01:11:42.232 [WARNING][5195] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="aa36742035b6c7029f45e2175197ca8cc331b0bc97047ac8dc523d7087a38d21" HandleID="k8s-pod-network.aa36742035b6c7029f45e2175197ca8cc331b0bc97047ac8dc523d7087a38d21" Workload="ci--4081--3--6--n--e40d23dcbc-k8s-calico--apiserver--7cdd895f86--5kmtj-eth0" Mar 7 01:11:42.237551 containerd[1517]: 2026-03-07 01:11:42.232 [INFO][5195] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="aa36742035b6c7029f45e2175197ca8cc331b0bc97047ac8dc523d7087a38d21" HandleID="k8s-pod-network.aa36742035b6c7029f45e2175197ca8cc331b0bc97047ac8dc523d7087a38d21" Workload="ci--4081--3--6--n--e40d23dcbc-k8s-calico--apiserver--7cdd895f86--5kmtj-eth0" Mar 7 01:11:42.237551 containerd[1517]: 2026-03-07 01:11:42.233 [INFO][5195] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 7 01:11:42.237551 containerd[1517]: 2026-03-07 01:11:42.235 [INFO][5185] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="aa36742035b6c7029f45e2175197ca8cc331b0bc97047ac8dc523d7087a38d21" Mar 7 01:11:42.237551 containerd[1517]: time="2026-03-07T01:11:42.237458744Z" level=info msg="TearDown network for sandbox \"aa36742035b6c7029f45e2175197ca8cc331b0bc97047ac8dc523d7087a38d21\" successfully" Mar 7 01:11:42.237551 containerd[1517]: time="2026-03-07T01:11:42.237477603Z" level=info msg="StopPodSandbox for \"aa36742035b6c7029f45e2175197ca8cc331b0bc97047ac8dc523d7087a38d21\" returns successfully" Mar 7 01:11:42.238115 containerd[1517]: time="2026-03-07T01:11:42.238091845Z" level=info msg="RemovePodSandbox for \"aa36742035b6c7029f45e2175197ca8cc331b0bc97047ac8dc523d7087a38d21\"" Mar 7 01:11:42.238115 containerd[1517]: time="2026-03-07T01:11:42.238116005Z" level=info msg="Forcibly stopping sandbox \"aa36742035b6c7029f45e2175197ca8cc331b0bc97047ac8dc523d7087a38d21\"" Mar 7 01:11:42.289440 containerd[1517]: 2026-03-07 01:11:42.262 [WARNING][5210] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="aa36742035b6c7029f45e2175197ca8cc331b0bc97047ac8dc523d7087a38d21" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--e40d23dcbc-k8s-calico--apiserver--7cdd895f86--5kmtj-eth0", GenerateName:"calico-apiserver-7cdd895f86-", Namespace:"calico-system", SelfLink:"", UID:"dd5d2b91-08a1-4e2b-97af-8acbc0f68c75", ResourceVersion:"1035", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 1, 10, 57, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"7cdd895f86", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-e40d23dcbc", ContainerID:"2976bafea0560be21ac9a9d8307a3aaa07ffed41608b5ba64af944887655c5a8", Pod:"calico-apiserver-7cdd895f86-5kmtj", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.95.71/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"cali2b9e67f2168", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 01:11:42.289440 containerd[1517]: 2026-03-07 01:11:42.262 [INFO][5210] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="aa36742035b6c7029f45e2175197ca8cc331b0bc97047ac8dc523d7087a38d21" Mar 7 01:11:42.289440 containerd[1517]: 2026-03-07 01:11:42.262 [INFO][5210] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called 
with no netns name, ignoring. ContainerID="aa36742035b6c7029f45e2175197ca8cc331b0bc97047ac8dc523d7087a38d21" iface="eth0" netns="" Mar 7 01:11:42.289440 containerd[1517]: 2026-03-07 01:11:42.262 [INFO][5210] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="aa36742035b6c7029f45e2175197ca8cc331b0bc97047ac8dc523d7087a38d21" Mar 7 01:11:42.289440 containerd[1517]: 2026-03-07 01:11:42.262 [INFO][5210] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="aa36742035b6c7029f45e2175197ca8cc331b0bc97047ac8dc523d7087a38d21" Mar 7 01:11:42.289440 containerd[1517]: 2026-03-07 01:11:42.279 [INFO][5218] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="aa36742035b6c7029f45e2175197ca8cc331b0bc97047ac8dc523d7087a38d21" HandleID="k8s-pod-network.aa36742035b6c7029f45e2175197ca8cc331b0bc97047ac8dc523d7087a38d21" Workload="ci--4081--3--6--n--e40d23dcbc-k8s-calico--apiserver--7cdd895f86--5kmtj-eth0" Mar 7 01:11:42.289440 containerd[1517]: 2026-03-07 01:11:42.279 [INFO][5218] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 7 01:11:42.289440 containerd[1517]: 2026-03-07 01:11:42.279 [INFO][5218] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 7 01:11:42.289440 containerd[1517]: 2026-03-07 01:11:42.284 [WARNING][5218] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="aa36742035b6c7029f45e2175197ca8cc331b0bc97047ac8dc523d7087a38d21" HandleID="k8s-pod-network.aa36742035b6c7029f45e2175197ca8cc331b0bc97047ac8dc523d7087a38d21" Workload="ci--4081--3--6--n--e40d23dcbc-k8s-calico--apiserver--7cdd895f86--5kmtj-eth0" Mar 7 01:11:42.289440 containerd[1517]: 2026-03-07 01:11:42.284 [INFO][5218] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="aa36742035b6c7029f45e2175197ca8cc331b0bc97047ac8dc523d7087a38d21" HandleID="k8s-pod-network.aa36742035b6c7029f45e2175197ca8cc331b0bc97047ac8dc523d7087a38d21" Workload="ci--4081--3--6--n--e40d23dcbc-k8s-calico--apiserver--7cdd895f86--5kmtj-eth0" Mar 7 01:11:42.289440 containerd[1517]: 2026-03-07 01:11:42.285 [INFO][5218] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 7 01:11:42.289440 containerd[1517]: 2026-03-07 01:11:42.287 [INFO][5210] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="aa36742035b6c7029f45e2175197ca8cc331b0bc97047ac8dc523d7087a38d21" Mar 7 01:11:42.289440 containerd[1517]: time="2026-03-07T01:11:42.289048878Z" level=info msg="TearDown network for sandbox \"aa36742035b6c7029f45e2175197ca8cc331b0bc97047ac8dc523d7087a38d21\" successfully" Mar 7 01:11:42.293066 containerd[1517]: time="2026-03-07T01:11:42.293038007Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"aa36742035b6c7029f45e2175197ca8cc331b0bc97047ac8dc523d7087a38d21\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Mar 7 01:11:42.293123 containerd[1517]: time="2026-03-07T01:11:42.293091895Z" level=info msg="RemovePodSandbox \"aa36742035b6c7029f45e2175197ca8cc331b0bc97047ac8dc523d7087a38d21\" returns successfully" Mar 7 01:11:42.293531 containerd[1517]: time="2026-03-07T01:11:42.293510923Z" level=info msg="StopPodSandbox for \"7550b93c1236f21cc3534f0954099343d94287df49b186bec307d38d9a388211\"" Mar 7 01:11:42.343131 containerd[1517]: 2026-03-07 01:11:42.318 [WARNING][5232] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="7550b93c1236f21cc3534f0954099343d94287df49b186bec307d38d9a388211" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--e40d23dcbc-k8s-coredns--674b8bbfcf--2g6xd-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"9e84ea27-025b-43b3-a817-46d96e7f19b0", ResourceVersion:"968", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 1, 10, 47, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-e40d23dcbc", ContainerID:"2cbb2d69339d11b81f984fc78418b54d058103983570ba9cbb6f1207182611b2", Pod:"coredns-674b8bbfcf-2g6xd", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.95.69/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali4173cd9ad7b", MAC:"", 
Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 01:11:42.343131 containerd[1517]: 2026-03-07 01:11:42.318 [INFO][5232] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="7550b93c1236f21cc3534f0954099343d94287df49b186bec307d38d9a388211" Mar 7 01:11:42.343131 containerd[1517]: 2026-03-07 01:11:42.318 [INFO][5232] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="7550b93c1236f21cc3534f0954099343d94287df49b186bec307d38d9a388211" iface="eth0" netns="" Mar 7 01:11:42.343131 containerd[1517]: 2026-03-07 01:11:42.318 [INFO][5232] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="7550b93c1236f21cc3534f0954099343d94287df49b186bec307d38d9a388211" Mar 7 01:11:42.343131 containerd[1517]: 2026-03-07 01:11:42.318 [INFO][5232] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="7550b93c1236f21cc3534f0954099343d94287df49b186bec307d38d9a388211" Mar 7 01:11:42.343131 containerd[1517]: 2026-03-07 01:11:42.333 [INFO][5239] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="7550b93c1236f21cc3534f0954099343d94287df49b186bec307d38d9a388211" HandleID="k8s-pod-network.7550b93c1236f21cc3534f0954099343d94287df49b186bec307d38d9a388211" Workload="ci--4081--3--6--n--e40d23dcbc-k8s-coredns--674b8bbfcf--2g6xd-eth0" Mar 7 01:11:42.343131 containerd[1517]: 2026-03-07 01:11:42.334 [INFO][5239] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. 
Mar 7 01:11:42.343131 containerd[1517]: 2026-03-07 01:11:42.334 [INFO][5239] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 7 01:11:42.343131 containerd[1517]: 2026-03-07 01:11:42.338 [WARNING][5239] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. Ignoring ContainerID="7550b93c1236f21cc3534f0954099343d94287df49b186bec307d38d9a388211" HandleID="k8s-pod-network.7550b93c1236f21cc3534f0954099343d94287df49b186bec307d38d9a388211" Workload="ci--4081--3--6--n--e40d23dcbc-k8s-coredns--674b8bbfcf--2g6xd-eth0" Mar 7 01:11:42.343131 containerd[1517]: 2026-03-07 01:11:42.338 [INFO][5239] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="7550b93c1236f21cc3534f0954099343d94287df49b186bec307d38d9a388211" HandleID="k8s-pod-network.7550b93c1236f21cc3534f0954099343d94287df49b186bec307d38d9a388211" Workload="ci--4081--3--6--n--e40d23dcbc-k8s-coredns--674b8bbfcf--2g6xd-eth0" Mar 7 01:11:42.343131 containerd[1517]: 2026-03-07 01:11:42.339 [INFO][5239] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 7 01:11:42.343131 containerd[1517]: 2026-03-07 01:11:42.341 [INFO][5232] cni-plugin/k8s.go 665: Teardown processing complete. 
ContainerID="7550b93c1236f21cc3534f0954099343d94287df49b186bec307d38d9a388211" Mar 7 01:11:42.343714 containerd[1517]: time="2026-03-07T01:11:42.343161582Z" level=info msg="TearDown network for sandbox \"7550b93c1236f21cc3534f0954099343d94287df49b186bec307d38d9a388211\" successfully" Mar 7 01:11:42.343714 containerd[1517]: time="2026-03-07T01:11:42.343183602Z" level=info msg="StopPodSandbox for \"7550b93c1236f21cc3534f0954099343d94287df49b186bec307d38d9a388211\" returns successfully" Mar 7 01:11:42.343810 containerd[1517]: time="2026-03-07T01:11:42.343719866Z" level=info msg="RemovePodSandbox for \"7550b93c1236f21cc3534f0954099343d94287df49b186bec307d38d9a388211\"" Mar 7 01:11:42.343810 containerd[1517]: time="2026-03-07T01:11:42.343739216Z" level=info msg="Forcibly stopping sandbox \"7550b93c1236f21cc3534f0954099343d94287df49b186bec307d38d9a388211\"" Mar 7 01:11:42.401156 containerd[1517]: 2026-03-07 01:11:42.367 [WARNING][5253] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="7550b93c1236f21cc3534f0954099343d94287df49b186bec307d38d9a388211" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--e40d23dcbc-k8s-coredns--674b8bbfcf--2g6xd-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"9e84ea27-025b-43b3-a817-46d96e7f19b0", ResourceVersion:"968", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 1, 10, 47, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-e40d23dcbc", ContainerID:"2cbb2d69339d11b81f984fc78418b54d058103983570ba9cbb6f1207182611b2", Pod:"coredns-674b8bbfcf-2g6xd", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.95.69/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali4173cd9ad7b", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 01:11:42.401156 containerd[1517]: 2026-03-07 
01:11:42.367 [INFO][5253] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="7550b93c1236f21cc3534f0954099343d94287df49b186bec307d38d9a388211" Mar 7 01:11:42.401156 containerd[1517]: 2026-03-07 01:11:42.367 [INFO][5253] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="7550b93c1236f21cc3534f0954099343d94287df49b186bec307d38d9a388211" iface="eth0" netns="" Mar 7 01:11:42.401156 containerd[1517]: 2026-03-07 01:11:42.368 [INFO][5253] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="7550b93c1236f21cc3534f0954099343d94287df49b186bec307d38d9a388211" Mar 7 01:11:42.401156 containerd[1517]: 2026-03-07 01:11:42.368 [INFO][5253] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="7550b93c1236f21cc3534f0954099343d94287df49b186bec307d38d9a388211" Mar 7 01:11:42.401156 containerd[1517]: 2026-03-07 01:11:42.388 [INFO][5260] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="7550b93c1236f21cc3534f0954099343d94287df49b186bec307d38d9a388211" HandleID="k8s-pod-network.7550b93c1236f21cc3534f0954099343d94287df49b186bec307d38d9a388211" Workload="ci--4081--3--6--n--e40d23dcbc-k8s-coredns--674b8bbfcf--2g6xd-eth0" Mar 7 01:11:42.401156 containerd[1517]: 2026-03-07 01:11:42.389 [INFO][5260] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 7 01:11:42.401156 containerd[1517]: 2026-03-07 01:11:42.389 [INFO][5260] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 7 01:11:42.401156 containerd[1517]: 2026-03-07 01:11:42.394 [WARNING][5260] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="7550b93c1236f21cc3534f0954099343d94287df49b186bec307d38d9a388211" HandleID="k8s-pod-network.7550b93c1236f21cc3534f0954099343d94287df49b186bec307d38d9a388211" Workload="ci--4081--3--6--n--e40d23dcbc-k8s-coredns--674b8bbfcf--2g6xd-eth0" Mar 7 01:11:42.401156 containerd[1517]: 2026-03-07 01:11:42.394 [INFO][5260] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="7550b93c1236f21cc3534f0954099343d94287df49b186bec307d38d9a388211" HandleID="k8s-pod-network.7550b93c1236f21cc3534f0954099343d94287df49b186bec307d38d9a388211" Workload="ci--4081--3--6--n--e40d23dcbc-k8s-coredns--674b8bbfcf--2g6xd-eth0" Mar 7 01:11:42.401156 containerd[1517]: 2026-03-07 01:11:42.396 [INFO][5260] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 7 01:11:42.401156 containerd[1517]: 2026-03-07 01:11:42.398 [INFO][5253] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="7550b93c1236f21cc3534f0954099343d94287df49b186bec307d38d9a388211" Mar 7 01:11:42.401156 containerd[1517]: time="2026-03-07T01:11:42.400305282Z" level=info msg="TearDown network for sandbox \"7550b93c1236f21cc3534f0954099343d94287df49b186bec307d38d9a388211\" successfully" Mar 7 01:11:42.404164 containerd[1517]: time="2026-03-07T01:11:42.404080986Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"7550b93c1236f21cc3534f0954099343d94287df49b186bec307d38d9a388211\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Mar 7 01:11:42.404164 containerd[1517]: time="2026-03-07T01:11:42.404131975Z" level=info msg="RemovePodSandbox \"7550b93c1236f21cc3534f0954099343d94287df49b186bec307d38d9a388211\" returns successfully" Mar 7 01:11:42.404793 containerd[1517]: time="2026-03-07T01:11:42.404666181Z" level=info msg="StopPodSandbox for \"6d620b985993c3b8720330d88f70b48ec7711327345e0569e0e2eb0531b2e089\"" Mar 7 01:11:42.461892 containerd[1517]: 2026-03-07 01:11:42.436 [WARNING][5276] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="6d620b985993c3b8720330d88f70b48ec7711327345e0569e0e2eb0531b2e089" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--e40d23dcbc-k8s-csi--node--driver--gpr4h-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"b2c4739c-e360-4c95-9680-eefc754cc98b", ResourceVersion:"922", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 1, 10, 57, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"6d9d697c7c", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-e40d23dcbc", ContainerID:"f92e9c16fbac8fe41c25f2c23a7308df996ea1160632969acda399e8260d2d2d", Pod:"csi-node-driver-gpr4h", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.95.70/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", 
Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"caliaa77fcdc760", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 01:11:42.461892 containerd[1517]: 2026-03-07 01:11:42.437 [INFO][5276] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="6d620b985993c3b8720330d88f70b48ec7711327345e0569e0e2eb0531b2e089" Mar 7 01:11:42.461892 containerd[1517]: 2026-03-07 01:11:42.437 [INFO][5276] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="6d620b985993c3b8720330d88f70b48ec7711327345e0569e0e2eb0531b2e089" iface="eth0" netns="" Mar 7 01:11:42.461892 containerd[1517]: 2026-03-07 01:11:42.437 [INFO][5276] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="6d620b985993c3b8720330d88f70b48ec7711327345e0569e0e2eb0531b2e089" Mar 7 01:11:42.461892 containerd[1517]: 2026-03-07 01:11:42.437 [INFO][5276] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="6d620b985993c3b8720330d88f70b48ec7711327345e0569e0e2eb0531b2e089" Mar 7 01:11:42.461892 containerd[1517]: 2026-03-07 01:11:42.452 [INFO][5284] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="6d620b985993c3b8720330d88f70b48ec7711327345e0569e0e2eb0531b2e089" HandleID="k8s-pod-network.6d620b985993c3b8720330d88f70b48ec7711327345e0569e0e2eb0531b2e089" Workload="ci--4081--3--6--n--e40d23dcbc-k8s-csi--node--driver--gpr4h-eth0" Mar 7 01:11:42.461892 containerd[1517]: 2026-03-07 01:11:42.452 [INFO][5284] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 7 01:11:42.461892 containerd[1517]: 2026-03-07 01:11:42.452 [INFO][5284] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 7 01:11:42.461892 containerd[1517]: 2026-03-07 01:11:42.457 [WARNING][5284] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="6d620b985993c3b8720330d88f70b48ec7711327345e0569e0e2eb0531b2e089" HandleID="k8s-pod-network.6d620b985993c3b8720330d88f70b48ec7711327345e0569e0e2eb0531b2e089" Workload="ci--4081--3--6--n--e40d23dcbc-k8s-csi--node--driver--gpr4h-eth0" Mar 7 01:11:42.461892 containerd[1517]: 2026-03-07 01:11:42.457 [INFO][5284] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="6d620b985993c3b8720330d88f70b48ec7711327345e0569e0e2eb0531b2e089" HandleID="k8s-pod-network.6d620b985993c3b8720330d88f70b48ec7711327345e0569e0e2eb0531b2e089" Workload="ci--4081--3--6--n--e40d23dcbc-k8s-csi--node--driver--gpr4h-eth0" Mar 7 01:11:42.461892 containerd[1517]: 2026-03-07 01:11:42.458 [INFO][5284] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 7 01:11:42.461892 containerd[1517]: 2026-03-07 01:11:42.460 [INFO][5276] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="6d620b985993c3b8720330d88f70b48ec7711327345e0569e0e2eb0531b2e089" Mar 7 01:11:42.461892 containerd[1517]: time="2026-03-07T01:11:42.461779931Z" level=info msg="TearDown network for sandbox \"6d620b985993c3b8720330d88f70b48ec7711327345e0569e0e2eb0531b2e089\" successfully" Mar 7 01:11:42.461892 containerd[1517]: time="2026-03-07T01:11:42.461802850Z" level=info msg="StopPodSandbox for \"6d620b985993c3b8720330d88f70b48ec7711327345e0569e0e2eb0531b2e089\" returns successfully" Mar 7 01:11:42.462237 containerd[1517]: time="2026-03-07T01:11:42.462174720Z" level=info msg="RemovePodSandbox for \"6d620b985993c3b8720330d88f70b48ec7711327345e0569e0e2eb0531b2e089\"" Mar 7 01:11:42.462237 containerd[1517]: time="2026-03-07T01:11:42.462192680Z" level=info msg="Forcibly stopping sandbox \"6d620b985993c3b8720330d88f70b48ec7711327345e0569e0e2eb0531b2e089\"" Mar 7 01:11:42.511514 containerd[1517]: 2026-03-07 01:11:42.486 [WARNING][5298] cni-plugin/k8s.go 616: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="6d620b985993c3b8720330d88f70b48ec7711327345e0569e0e2eb0531b2e089" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--n--e40d23dcbc-k8s-csi--node--driver--gpr4h-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"b2c4739c-e360-4c95-9680-eefc754cc98b", ResourceVersion:"922", Generation:0, CreationTimestamp:time.Date(2026, time.March, 7, 1, 10, 57, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"6d9d697c7c", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-n-e40d23dcbc", ContainerID:"f92e9c16fbac8fe41c25f2c23a7308df996ea1160632969acda399e8260d2d2d", Pod:"csi-node-driver-gpr4h", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.95.70/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"caliaa77fcdc760", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 7 01:11:42.511514 containerd[1517]: 2026-03-07 01:11:42.486 [INFO][5298] cni-plugin/k8s.go 652: Cleaning up netns ContainerID="6d620b985993c3b8720330d88f70b48ec7711327345e0569e0e2eb0531b2e089" Mar 7 01:11:42.511514 containerd[1517]: 2026-03-07 01:11:42.486 [INFO][5298] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called 
with no netns name, ignoring. ContainerID="6d620b985993c3b8720330d88f70b48ec7711327345e0569e0e2eb0531b2e089" iface="eth0" netns="" Mar 7 01:11:42.511514 containerd[1517]: 2026-03-07 01:11:42.486 [INFO][5298] cni-plugin/k8s.go 659: Releasing IP address(es) ContainerID="6d620b985993c3b8720330d88f70b48ec7711327345e0569e0e2eb0531b2e089" Mar 7 01:11:42.511514 containerd[1517]: 2026-03-07 01:11:42.486 [INFO][5298] cni-plugin/utils.go 204: Calico CNI releasing IP address ContainerID="6d620b985993c3b8720330d88f70b48ec7711327345e0569e0e2eb0531b2e089" Mar 7 01:11:42.511514 containerd[1517]: 2026-03-07 01:11:42.502 [INFO][5306] ipam/ipam_plugin.go 497: Releasing address using handleID ContainerID="6d620b985993c3b8720330d88f70b48ec7711327345e0569e0e2eb0531b2e089" HandleID="k8s-pod-network.6d620b985993c3b8720330d88f70b48ec7711327345e0569e0e2eb0531b2e089" Workload="ci--4081--3--6--n--e40d23dcbc-k8s-csi--node--driver--gpr4h-eth0" Mar 7 01:11:42.511514 containerd[1517]: 2026-03-07 01:11:42.502 [INFO][5306] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 7 01:11:42.511514 containerd[1517]: 2026-03-07 01:11:42.502 [INFO][5306] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 7 01:11:42.511514 containerd[1517]: 2026-03-07 01:11:42.506 [WARNING][5306] ipam/ipam_plugin.go 514: Asked to release address but it doesn't exist. 
Ignoring ContainerID="6d620b985993c3b8720330d88f70b48ec7711327345e0569e0e2eb0531b2e089" HandleID="k8s-pod-network.6d620b985993c3b8720330d88f70b48ec7711327345e0569e0e2eb0531b2e089" Workload="ci--4081--3--6--n--e40d23dcbc-k8s-csi--node--driver--gpr4h-eth0" Mar 7 01:11:42.511514 containerd[1517]: 2026-03-07 01:11:42.506 [INFO][5306] ipam/ipam_plugin.go 525: Releasing address using workloadID ContainerID="6d620b985993c3b8720330d88f70b48ec7711327345e0569e0e2eb0531b2e089" HandleID="k8s-pod-network.6d620b985993c3b8720330d88f70b48ec7711327345e0569e0e2eb0531b2e089" Workload="ci--4081--3--6--n--e40d23dcbc-k8s-csi--node--driver--gpr4h-eth0" Mar 7 01:11:42.511514 containerd[1517]: 2026-03-07 01:11:42.508 [INFO][5306] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 7 01:11:42.511514 containerd[1517]: 2026-03-07 01:11:42.509 [INFO][5298] cni-plugin/k8s.go 665: Teardown processing complete. ContainerID="6d620b985993c3b8720330d88f70b48ec7711327345e0569e0e2eb0531b2e089" Mar 7 01:11:42.511882 containerd[1517]: time="2026-03-07T01:11:42.511553406Z" level=info msg="TearDown network for sandbox \"6d620b985993c3b8720330d88f70b48ec7711327345e0569e0e2eb0531b2e089\" successfully" Mar 7 01:11:42.518882 containerd[1517]: time="2026-03-07T01:11:42.518845423Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"6d620b985993c3b8720330d88f70b48ec7711327345e0569e0e2eb0531b2e089\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Mar 7 01:11:42.518970 containerd[1517]: time="2026-03-07T01:11:42.518893352Z" level=info msg="RemovePodSandbox \"6d620b985993c3b8720330d88f70b48ec7711327345e0569e0e2eb0531b2e089\" returns successfully" Mar 7 01:11:43.748873 kubelet[2578]: I0307 01:11:43.748756 2578 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 7 01:11:43.783054 kubelet[2578]: I0307 01:11:43.782941 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-apiserver-7cdd895f86-5kmtj" podStartSLOduration=33.804681387 podStartE2EDuration="46.782923116s" podCreationTimestamp="2026-03-07 01:10:57 +0000 UTC" firstStartedPulling="2026-03-07 01:11:26.700505263 +0000 UTC m=+44.625472314" lastFinishedPulling="2026-03-07 01:11:39.678747002 +0000 UTC m=+57.603714043" observedRunningTime="2026-03-07 01:11:40.396013086 +0000 UTC m=+58.320980137" watchObservedRunningTime="2026-03-07 01:11:43.782923116 +0000 UTC m=+61.707890187" Mar 7 01:11:44.104545 containerd[1517]: time="2026-03-07T01:11:44.104061268Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 01:11:44.105215 containerd[1517]: time="2026-03-07T01:11:44.105127610Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4: active requests=0, bytes read=14704317" Mar 7 01:11:44.106634 containerd[1517]: time="2026-03-07T01:11:44.106594230Z" level=info msg="ImageCreate event name:\"sha256:d7aeb99114cbb6499e9048f43d3faa5f199d1a05ed44165e5974d0368ac32771\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 01:11:44.110495 containerd[1517]: time="2026-03-07T01:11:44.110452747Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:e41c0d73bcd33ff28ae2f2983cf781a4509d212e102d53883dbbf436ab3cd97d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 01:11:44.111177 containerd[1517]: 
time="2026-03-07T01:11:44.110889185Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\" with image id \"sha256:d7aeb99114cbb6499e9048f43d3faa5f199d1a05ed44165e5974d0368ac32771\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:e41c0d73bcd33ff28ae2f2983cf781a4509d212e102d53883dbbf436ab3cd97d\", size \"16260314\" in 2.517071264s" Mar 7 01:11:44.111177 containerd[1517]: time="2026-03-07T01:11:44.110914594Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\" returns image reference \"sha256:d7aeb99114cbb6499e9048f43d3faa5f199d1a05ed44165e5974d0368ac32771\"" Mar 7 01:11:44.113270 containerd[1517]: time="2026-03-07T01:11:44.113082846Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\"" Mar 7 01:11:44.115982 containerd[1517]: time="2026-03-07T01:11:44.115967810Z" level=info msg="CreateContainer within sandbox \"f92e9c16fbac8fe41c25f2c23a7308df996ea1160632969acda399e8260d2d2d\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" Mar 7 01:11:44.130887 containerd[1517]: time="2026-03-07T01:11:44.130859141Z" level=info msg="CreateContainer within sandbox \"f92e9c16fbac8fe41c25f2c23a7308df996ea1160632969acda399e8260d2d2d\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"2a2c1f740a80f33962aec28c603096411173445d91c05d0e906c18244eaea850\"" Mar 7 01:11:44.131579 containerd[1517]: time="2026-03-07T01:11:44.131269709Z" level=info msg="StartContainer for \"2a2c1f740a80f33962aec28c603096411173445d91c05d0e906c18244eaea850\"" Mar 7 01:11:44.156365 systemd[1]: run-containerd-runc-k8s.io-2a2c1f740a80f33962aec28c603096411173445d91c05d0e906c18244eaea850-runc.XJlSJ0.mount: Deactivated successfully. 
Mar 7 01:11:44.162501 systemd[1]: Started cri-containerd-2a2c1f740a80f33962aec28c603096411173445d91c05d0e906c18244eaea850.scope - libcontainer container 2a2c1f740a80f33962aec28c603096411173445d91c05d0e906c18244eaea850. Mar 7 01:11:44.184509 containerd[1517]: time="2026-03-07T01:11:44.184296880Z" level=info msg="StartContainer for \"2a2c1f740a80f33962aec28c603096411173445d91c05d0e906c18244eaea850\" returns successfully" Mar 7 01:11:44.226884 kubelet[2578]: I0307 01:11:44.226826 2578 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 Mar 7 01:11:44.228143 kubelet[2578]: I0307 01:11:44.228120 2578 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock Mar 7 01:11:46.334542 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount407629868.mount: Deactivated successfully. Mar 7 01:11:46.349574 containerd[1517]: time="2026-03-07T01:11:46.349536196Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 01:11:46.350570 containerd[1517]: time="2026-03-07T01:11:46.350391484Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.31.4: active requests=0, bytes read=17609475" Mar 7 01:11:46.351631 containerd[1517]: time="2026-03-07T01:11:46.351502366Z" level=info msg="ImageCreate event name:\"sha256:0749e3da0398e8402eb119f09acf145e5dd9759adb6eb3802ad6dc1b9bbedf1c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 01:11:46.353537 containerd[1517]: time="2026-03-07T01:11:46.353505784Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend@sha256:d252061aa298c4b17cf092517b5126af97cf95e0f56b21281b95a5f8702f15fc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 7 01:11:46.353995 
containerd[1517]: time="2026-03-07T01:11:46.353885024Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\" with image id \"sha256:0749e3da0398e8402eb119f09acf145e5dd9759adb6eb3802ad6dc1b9bbedf1c\", repo tag \"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/whisker-backend@sha256:d252061aa298c4b17cf092517b5126af97cf95e0f56b21281b95a5f8702f15fc\", size \"17609305\" in 2.240766818s" Mar 7 01:11:46.354070 containerd[1517]: time="2026-03-07T01:11:46.354057619Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\" returns image reference \"sha256:0749e3da0398e8402eb119f09acf145e5dd9759adb6eb3802ad6dc1b9bbedf1c\"" Mar 7 01:11:46.357521 containerd[1517]: time="2026-03-07T01:11:46.357494941Z" level=info msg="CreateContainer within sandbox \"05b03ace6f5a64c75bd4e1ae73264acab50e6e430dfa6c08ed37bbae4da4518a\" for container &ContainerMetadata{Name:whisker-backend,Attempt:0,}" Mar 7 01:11:46.371293 containerd[1517]: time="2026-03-07T01:11:46.371265564Z" level=info msg="CreateContainer within sandbox \"05b03ace6f5a64c75bd4e1ae73264acab50e6e430dfa6c08ed37bbae4da4518a\" for &ContainerMetadata{Name:whisker-backend,Attempt:0,} returns container id \"a208fc23d7077b27028a7b026ae1a61e399a28f4b859b5910d5a8390ebea9ddf\"" Mar 7 01:11:46.371698 containerd[1517]: time="2026-03-07T01:11:46.371679424Z" level=info msg="StartContainer for \"a208fc23d7077b27028a7b026ae1a61e399a28f4b859b5910d5a8390ebea9ddf\"" Mar 7 01:11:46.396509 systemd[1]: Started cri-containerd-a208fc23d7077b27028a7b026ae1a61e399a28f4b859b5910d5a8390ebea9ddf.scope - libcontainer container a208fc23d7077b27028a7b026ae1a61e399a28f4b859b5910d5a8390ebea9ddf. 
Mar 7 01:11:46.435470 containerd[1517]: time="2026-03-07T01:11:46.435439017Z" level=info msg="StartContainer for \"a208fc23d7077b27028a7b026ae1a61e399a28f4b859b5910d5a8390ebea9ddf\" returns successfully" Mar 7 01:11:47.431946 kubelet[2578]: I0307 01:11:47.431856 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-gpr4h" podStartSLOduration=32.815153006 podStartE2EDuration="50.431833573s" podCreationTimestamp="2026-03-07 01:10:57 +0000 UTC" firstStartedPulling="2026-03-07 01:11:26.495598411 +0000 UTC m=+44.420565452" lastFinishedPulling="2026-03-07 01:11:44.112278968 +0000 UTC m=+62.037246019" observedRunningTime="2026-03-07 01:11:44.421302626 +0000 UTC m=+62.346269707" watchObservedRunningTime="2026-03-07 01:11:47.431833573 +0000 UTC m=+65.356800654" Mar 7 01:11:47.432777 kubelet[2578]: I0307 01:11:47.431990 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/whisker-7c998d4b96-pt4hf" podStartSLOduration=2.165114178 podStartE2EDuration="21.431984109s" podCreationTimestamp="2026-03-07 01:11:26 +0000 UTC" firstStartedPulling="2026-03-07 01:11:27.087697225 +0000 UTC m=+45.012664276" lastFinishedPulling="2026-03-07 01:11:46.354567156 +0000 UTC m=+64.279534207" observedRunningTime="2026-03-07 01:11:47.431300317 +0000 UTC m=+65.356267398" watchObservedRunningTime="2026-03-07 01:11:47.431984109 +0000 UTC m=+65.356951190" Mar 7 01:12:13.181819 systemd[1]: Started sshd@7-89.167.115.210:22-4.153.228.146:35862.service - OpenSSH per-connection server daemon (4.153.228.146:35862). Mar 7 01:12:13.953117 sshd[5530]: Accepted publickey for core from 4.153.228.146 port 35862 ssh2: RSA SHA256:cfLbcynJBGQiJlcpT05nBKNU4f9DyADpOV1ay9ga6kI Mar 7 01:12:13.955897 sshd[5530]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 7 01:12:13.959695 systemd-logind[1493]: New session 8 of user core. 
Mar 7 01:12:13.965825 systemd[1]: Started session-8.scope - Session 8 of User core. Mar 7 01:12:14.545618 sshd[5530]: pam_unix(sshd:session): session closed for user core Mar 7 01:12:14.552088 systemd-logind[1493]: Session 8 logged out. Waiting for processes to exit. Mar 7 01:12:14.553002 systemd[1]: sshd@7-89.167.115.210:22-4.153.228.146:35862.service: Deactivated successfully. Mar 7 01:12:14.557453 systemd[1]: session-8.scope: Deactivated successfully. Mar 7 01:12:14.559576 systemd-logind[1493]: Removed session 8. Mar 7 01:12:19.679619 systemd[1]: Started sshd@8-89.167.115.210:22-4.153.228.146:34670.service - OpenSSH per-connection server daemon (4.153.228.146:34670). Mar 7 01:12:20.435454 sshd[5546]: Accepted publickey for core from 4.153.228.146 port 34670 ssh2: RSA SHA256:cfLbcynJBGQiJlcpT05nBKNU4f9DyADpOV1ay9ga6kI Mar 7 01:12:20.437583 sshd[5546]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 7 01:12:20.446529 systemd-logind[1493]: New session 9 of user core. Mar 7 01:12:20.450699 systemd[1]: Started session-9.scope - Session 9 of User core. Mar 7 01:12:21.006069 sshd[5546]: pam_unix(sshd:session): session closed for user core Mar 7 01:12:21.009869 systemd-logind[1493]: Session 9 logged out. Waiting for processes to exit. Mar 7 01:12:21.010503 systemd[1]: sshd@8-89.167.115.210:22-4.153.228.146:34670.service: Deactivated successfully. Mar 7 01:12:21.012542 systemd[1]: session-9.scope: Deactivated successfully. Mar 7 01:12:21.013344 systemd-logind[1493]: Removed session 9. Mar 7 01:12:26.147576 systemd[1]: Started sshd@9-89.167.115.210:22-4.153.228.146:34686.service - OpenSSH per-connection server daemon (4.153.228.146:34686). 
Mar 7 01:12:26.910546 sshd[5560]: Accepted publickey for core from 4.153.228.146 port 34686 ssh2: RSA SHA256:cfLbcynJBGQiJlcpT05nBKNU4f9DyADpOV1ay9ga6kI Mar 7 01:12:26.915979 sshd[5560]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 7 01:12:26.926615 systemd-logind[1493]: New session 10 of user core. Mar 7 01:12:26.937704 systemd[1]: Started session-10.scope - Session 10 of User core. Mar 7 01:12:27.519809 sshd[5560]: pam_unix(sshd:session): session closed for user core Mar 7 01:12:27.522926 systemd[1]: sshd@9-89.167.115.210:22-4.153.228.146:34686.service: Deactivated successfully. Mar 7 01:12:27.524976 systemd[1]: session-10.scope: Deactivated successfully. Mar 7 01:12:27.527135 systemd-logind[1493]: Session 10 logged out. Waiting for processes to exit. Mar 7 01:12:27.528977 systemd-logind[1493]: Removed session 10. Mar 7 01:12:27.655841 systemd[1]: Started sshd@10-89.167.115.210:22-4.153.228.146:34698.service - OpenSSH per-connection server daemon (4.153.228.146:34698). Mar 7 01:12:28.420243 sshd[5596]: Accepted publickey for core from 4.153.228.146 port 34698 ssh2: RSA SHA256:cfLbcynJBGQiJlcpT05nBKNU4f9DyADpOV1ay9ga6kI Mar 7 01:12:28.421624 sshd[5596]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 7 01:12:28.426664 systemd-logind[1493]: New session 11 of user core. Mar 7 01:12:28.436565 systemd[1]: Started session-11.scope - Session 11 of User core. Mar 7 01:12:29.030251 sshd[5596]: pam_unix(sshd:session): session closed for user core Mar 7 01:12:29.032843 systemd[1]: sshd@10-89.167.115.210:22-4.153.228.146:34698.service: Deactivated successfully. Mar 7 01:12:29.035272 systemd[1]: session-11.scope: Deactivated successfully. Mar 7 01:12:29.036586 systemd-logind[1493]: Session 11 logged out. Waiting for processes to exit. Mar 7 01:12:29.038035 systemd-logind[1493]: Removed session 11. 
Mar 7 01:12:29.168833 systemd[1]: Started sshd@11-89.167.115.210:22-4.153.228.146:42694.service - OpenSSH per-connection server daemon (4.153.228.146:42694). Mar 7 01:12:29.920574 sshd[5623]: Accepted publickey for core from 4.153.228.146 port 42694 ssh2: RSA SHA256:cfLbcynJBGQiJlcpT05nBKNU4f9DyADpOV1ay9ga6kI Mar 7 01:12:29.923617 sshd[5623]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 7 01:12:29.932471 systemd-logind[1493]: New session 12 of user core. Mar 7 01:12:29.937652 systemd[1]: Started session-12.scope - Session 12 of User core. Mar 7 01:12:30.535689 sshd[5623]: pam_unix(sshd:session): session closed for user core Mar 7 01:12:30.541388 systemd[1]: sshd@11-89.167.115.210:22-4.153.228.146:42694.service: Deactivated successfully. Mar 7 01:12:30.545982 systemd[1]: session-12.scope: Deactivated successfully. Mar 7 01:12:30.548795 systemd-logind[1493]: Session 12 logged out. Waiting for processes to exit. Mar 7 01:12:30.550044 systemd-logind[1493]: Removed session 12. Mar 7 01:12:35.670813 systemd[1]: Started sshd@12-89.167.115.210:22-4.153.228.146:42710.service - OpenSSH per-connection server daemon (4.153.228.146:42710). Mar 7 01:12:36.421789 sshd[5694]: Accepted publickey for core from 4.153.228.146 port 42710 ssh2: RSA SHA256:cfLbcynJBGQiJlcpT05nBKNU4f9DyADpOV1ay9ga6kI Mar 7 01:12:36.423472 sshd[5694]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 7 01:12:36.428243 systemd-logind[1493]: New session 13 of user core. Mar 7 01:12:36.432542 systemd[1]: Started session-13.scope - Session 13 of User core. Mar 7 01:12:37.009691 sshd[5694]: pam_unix(sshd:session): session closed for user core Mar 7 01:12:37.017747 systemd[1]: sshd@12-89.167.115.210:22-4.153.228.146:42710.service: Deactivated successfully. Mar 7 01:12:37.023345 systemd[1]: session-13.scope: Deactivated successfully. Mar 7 01:12:37.024918 systemd-logind[1493]: Session 13 logged out. Waiting for processes to exit. 
Mar 7 01:12:37.027452 systemd-logind[1493]: Removed session 13. Mar 7 01:12:37.144818 systemd[1]: Started sshd@13-89.167.115.210:22-4.153.228.146:42726.service - OpenSSH per-connection server daemon (4.153.228.146:42726). Mar 7 01:12:37.878457 sshd[5706]: Accepted publickey for core from 4.153.228.146 port 42726 ssh2: RSA SHA256:cfLbcynJBGQiJlcpT05nBKNU4f9DyADpOV1ay9ga6kI Mar 7 01:12:37.881241 sshd[5706]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 7 01:12:37.889460 systemd-logind[1493]: New session 14 of user core. Mar 7 01:12:37.898647 systemd[1]: Started session-14.scope - Session 14 of User core. Mar 7 01:12:38.751754 sshd[5706]: pam_unix(sshd:session): session closed for user core Mar 7 01:12:38.755598 systemd-logind[1493]: Session 14 logged out. Waiting for processes to exit. Mar 7 01:12:38.756655 systemd[1]: sshd@13-89.167.115.210:22-4.153.228.146:42726.service: Deactivated successfully. Mar 7 01:12:38.758415 systemd[1]: session-14.scope: Deactivated successfully. Mar 7 01:12:38.759955 systemd-logind[1493]: Removed session 14. Mar 7 01:12:38.888820 systemd[1]: Started sshd@14-89.167.115.210:22-4.153.228.146:42736.service - OpenSSH per-connection server daemon (4.153.228.146:42736). Mar 7 01:12:39.664527 sshd[5717]: Accepted publickey for core from 4.153.228.146 port 42736 ssh2: RSA SHA256:cfLbcynJBGQiJlcpT05nBKNU4f9DyADpOV1ay9ga6kI Mar 7 01:12:39.667072 sshd[5717]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 7 01:12:39.675123 systemd-logind[1493]: New session 15 of user core. Mar 7 01:12:39.686561 systemd[1]: Started session-15.scope - Session 15 of User core. Mar 7 01:12:40.825691 sshd[5717]: pam_unix(sshd:session): session closed for user core Mar 7 01:12:40.830822 systemd[1]: sshd@14-89.167.115.210:22-4.153.228.146:42736.service: Deactivated successfully. Mar 7 01:12:40.833140 systemd[1]: session-15.scope: Deactivated successfully. 
Mar 7 01:12:40.834047 systemd-logind[1493]: Session 15 logged out. Waiting for processes to exit. Mar 7 01:12:40.836091 systemd-logind[1493]: Removed session 15. Mar 7 01:12:40.963637 systemd[1]: Started sshd@15-89.167.115.210:22-4.153.228.146:45124.service - OpenSSH per-connection server daemon (4.153.228.146:45124). Mar 7 01:12:41.727161 sshd[5743]: Accepted publickey for core from 4.153.228.146 port 45124 ssh2: RSA SHA256:cfLbcynJBGQiJlcpT05nBKNU4f9DyADpOV1ay9ga6kI Mar 7 01:12:41.729721 sshd[5743]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 7 01:12:41.738713 systemd-logind[1493]: New session 16 of user core. Mar 7 01:12:41.744664 systemd[1]: Started session-16.scope - Session 16 of User core. Mar 7 01:12:42.424939 sshd[5743]: pam_unix(sshd:session): session closed for user core Mar 7 01:12:42.430217 systemd[1]: sshd@15-89.167.115.210:22-4.153.228.146:45124.service: Deactivated successfully. Mar 7 01:12:42.433668 systemd[1]: session-16.scope: Deactivated successfully. Mar 7 01:12:42.437190 systemd-logind[1493]: Session 16 logged out. Waiting for processes to exit. Mar 7 01:12:42.440022 systemd-logind[1493]: Removed session 16. Mar 7 01:12:42.561170 systemd[1]: Started sshd@16-89.167.115.210:22-4.153.228.146:45128.service - OpenSSH per-connection server daemon (4.153.228.146:45128). Mar 7 01:12:43.317642 sshd[5756]: Accepted publickey for core from 4.153.228.146 port 45128 ssh2: RSA SHA256:cfLbcynJBGQiJlcpT05nBKNU4f9DyADpOV1ay9ga6kI Mar 7 01:12:43.321012 sshd[5756]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 7 01:12:43.330180 systemd-logind[1493]: New session 17 of user core. Mar 7 01:12:43.335733 systemd[1]: Started session-17.scope - Session 17 of User core. Mar 7 01:12:43.931216 sshd[5756]: pam_unix(sshd:session): session closed for user core Mar 7 01:12:43.938966 systemd[1]: sshd@16-89.167.115.210:22-4.153.228.146:45128.service: Deactivated successfully. 
Mar 7 01:12:43.943319 systemd[1]: session-17.scope: Deactivated successfully. Mar 7 01:12:43.944642 systemd-logind[1493]: Session 17 logged out. Waiting for processes to exit. Mar 7 01:12:43.947236 systemd-logind[1493]: Removed session 17. Mar 7 01:12:49.069945 systemd[1]: Started sshd@17-89.167.115.210:22-4.153.228.146:59998.service - OpenSSH per-connection server daemon (4.153.228.146:59998). Mar 7 01:12:49.827592 sshd[5786]: Accepted publickey for core from 4.153.228.146 port 59998 ssh2: RSA SHA256:cfLbcynJBGQiJlcpT05nBKNU4f9DyADpOV1ay9ga6kI Mar 7 01:12:49.830502 sshd[5786]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 7 01:12:49.838845 systemd-logind[1493]: New session 18 of user core. Mar 7 01:12:49.844658 systemd[1]: Started session-18.scope - Session 18 of User core. Mar 7 01:12:50.427462 sshd[5786]: pam_unix(sshd:session): session closed for user core Mar 7 01:12:50.432664 systemd-logind[1493]: Session 18 logged out. Waiting for processes to exit. Mar 7 01:12:50.433710 systemd[1]: sshd@17-89.167.115.210:22-4.153.228.146:59998.service: Deactivated successfully. Mar 7 01:12:50.436852 systemd[1]: session-18.scope: Deactivated successfully. Mar 7 01:12:50.438718 systemd-logind[1493]: Removed session 18. Mar 7 01:12:55.556829 systemd[1]: Started sshd@18-89.167.115.210:22-4.153.228.146:60014.service - OpenSSH per-connection server daemon (4.153.228.146:60014). Mar 7 01:12:56.306135 sshd[5799]: Accepted publickey for core from 4.153.228.146 port 60014 ssh2: RSA SHA256:cfLbcynJBGQiJlcpT05nBKNU4f9DyADpOV1ay9ga6kI Mar 7 01:12:56.307544 sshd[5799]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 7 01:12:56.316199 systemd-logind[1493]: New session 19 of user core. Mar 7 01:12:56.321607 systemd[1]: Started session-19.scope - Session 19 of User core. 
Mar 7 01:12:56.918786 sshd[5799]: pam_unix(sshd:session): session closed for user core Mar 7 01:12:56.928357 systemd[1]: sshd@18-89.167.115.210:22-4.153.228.146:60014.service: Deactivated successfully. Mar 7 01:12:56.934688 systemd[1]: session-19.scope: Deactivated successfully. Mar 7 01:12:56.936545 systemd-logind[1493]: Session 19 logged out. Waiting for processes to exit. Mar 7 01:12:56.939362 systemd-logind[1493]: Removed session 19. Mar 7 01:13:00.395711 systemd[1]: run-containerd-runc-k8s.io-2bc79fdeb9a9926590683748125695e7ba12209c1982e46099b384c5b3f3883e-runc.tKhTPi.mount: Deactivated successfully. Mar 7 01:13:01.113227 update_engine[1496]: I20260307 01:13:01.113116 1496 prefs.cc:52] certificate-report-to-send-update not present in /var/lib/update_engine/prefs Mar 7 01:13:01.113227 update_engine[1496]: I20260307 01:13:01.113225 1496 prefs.cc:52] certificate-report-to-send-download not present in /var/lib/update_engine/prefs Mar 7 01:13:01.114211 update_engine[1496]: I20260307 01:13:01.113679 1496 prefs.cc:52] aleph-version not present in /var/lib/update_engine/prefs Mar 7 01:13:01.114723 update_engine[1496]: I20260307 01:13:01.114674 1496 omaha_request_params.cc:62] Current group set to lts Mar 7 01:13:01.117181 update_engine[1496]: I20260307 01:13:01.116592 1496 update_attempter.cc:499] Already updated boot flags. Skipping. Mar 7 01:13:01.117181 update_engine[1496]: I20260307 01:13:01.116632 1496 update_attempter.cc:643] Scheduling an action processor start. 
Mar 7 01:13:01.117181 update_engine[1496]: I20260307 01:13:01.116664 1496 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction Mar 7 01:13:01.117181 update_engine[1496]: I20260307 01:13:01.116740 1496 prefs.cc:52] previous-version not present in /var/lib/update_engine/prefs Mar 7 01:13:01.117181 update_engine[1496]: I20260307 01:13:01.116864 1496 omaha_request_action.cc:271] Posting an Omaha request to disabled Mar 7 01:13:01.117181 update_engine[1496]: I20260307 01:13:01.116880 1496 omaha_request_action.cc:272] Request: Mar 7 01:13:01.117181 update_engine[1496]: I20260307 01:13:01.116896 1496 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Mar 7 01:13:01.117803 locksmithd[1531]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_CHECKING_FOR_UPDATE" NewVersion=0.0.0 NewSize=0 Mar 7 01:13:01.124783 update_engine[1496]: I20260307 01:13:01.124729 1496 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Mar 7 01:13:01.125262 update_engine[1496]: I20260307 01:13:01.125180 1496 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds.
Mar 7 01:13:01.126538 update_engine[1496]: E20260307 01:13:01.126478 1496 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Mar 7 01:13:01.126631 update_engine[1496]: I20260307 01:13:01.126587 1496 libcurl_http_fetcher.cc:283] No HTTP response, retry 1 Mar 7 01:13:11.117633 update_engine[1496]: I20260307 01:13:11.117489 1496 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Mar 7 01:13:11.118487 update_engine[1496]: I20260307 01:13:11.117973 1496 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Mar 7 01:13:11.118487 update_engine[1496]: I20260307 01:13:11.118362 1496 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. Mar 7 01:13:11.119290 update_engine[1496]: E20260307 01:13:11.119199 1496 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Mar 7 01:13:11.119494 update_engine[1496]: I20260307 01:13:11.119321 1496 libcurl_http_fetcher.cc:283] No HTTP response, retry 2 Mar 7 01:13:13.835127 kubelet[2578]: E0307 01:13:13.834961 2578 controller.go:195] "Failed to update lease" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.0.3:48322->10.0.0.2:2379: read: connection timed out" Mar 7 01:13:13.845912 systemd[1]: cri-containerd-b10634c722411af6f2beb366d3e22c24d275189d436fb8401462fb5395750c77.scope: Deactivated successfully. Mar 7 01:13:13.846986 systemd[1]: cri-containerd-b10634c722411af6f2beb366d3e22c24d275189d436fb8401462fb5395750c77.scope: Consumed 2.195s CPU time, 16.1M memory peak, 0B memory swap peak. 
Mar 7 01:13:13.887564 containerd[1517]: time="2026-03-07T01:13:13.884979810Z" level=info msg="shim disconnected" id=b10634c722411af6f2beb366d3e22c24d275189d436fb8401462fb5395750c77 namespace=k8s.io Mar 7 01:13:13.887564 containerd[1517]: time="2026-03-07T01:13:13.885046169Z" level=warning msg="cleaning up after shim disconnected" id=b10634c722411af6f2beb366d3e22c24d275189d436fb8401462fb5395750c77 namespace=k8s.io Mar 7 01:13:13.887564 containerd[1517]: time="2026-03-07T01:13:13.885058939Z" level=info msg="cleaning up dead shim" namespace=k8s.io Mar 7 01:13:13.893113 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-b10634c722411af6f2beb366d3e22c24d275189d436fb8401462fb5395750c77-rootfs.mount: Deactivated successfully. Mar 7 01:13:14.238934 systemd[1]: cri-containerd-53ea4186dae333b566c978013acf0388dd4cd1ceb0c6471aaf066554535ca174.scope: Deactivated successfully. Mar 7 01:13:14.239465 systemd[1]: cri-containerd-53ea4186dae333b566c978013acf0388dd4cd1ceb0c6471aaf066554535ca174.scope: Consumed 3.790s CPU time, 17.8M memory peak, 0B memory swap peak. Mar 7 01:13:14.288467 containerd[1517]: time="2026-03-07T01:13:14.286039632Z" level=info msg="shim disconnected" id=53ea4186dae333b566c978013acf0388dd4cd1ceb0c6471aaf066554535ca174 namespace=k8s.io Mar 7 01:13:14.288467 containerd[1517]: time="2026-03-07T01:13:14.286119670Z" level=warning msg="cleaning up after shim disconnected" id=53ea4186dae333b566c978013acf0388dd4cd1ceb0c6471aaf066554535ca174 namespace=k8s.io Mar 7 01:13:14.288467 containerd[1517]: time="2026-03-07T01:13:14.286131960Z" level=info msg="cleaning up dead shim" namespace=k8s.io Mar 7 01:13:14.290594 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-53ea4186dae333b566c978013acf0388dd4cd1ceb0c6471aaf066554535ca174-rootfs.mount: Deactivated successfully. 
Mar 7 01:13:14.308906 containerd[1517]: time="2026-03-07T01:13:14.308822532Z" level=warning msg="cleanup warnings time=\"2026-03-07T01:13:14Z\" level=warning msg=\"failed to remove runc container\" error=\"runc did not terminate successfully: exit status 255: \" runtime=io.containerd.runc.v2\n" namespace=k8s.io Mar 7 01:13:14.431361 systemd[1]: cri-containerd-8b3ae79596106a8c76118ecce1b81b0457ad20a1d0b3463bd982c82d98b69ab8.scope: Deactivated successfully. Mar 7 01:13:14.432101 systemd[1]: cri-containerd-8b3ae79596106a8c76118ecce1b81b0457ad20a1d0b3463bd982c82d98b69ab8.scope: Consumed 7.857s CPU time. Mar 7 01:13:14.469267 containerd[1517]: time="2026-03-07T01:13:14.468847388Z" level=info msg="shim disconnected" id=8b3ae79596106a8c76118ecce1b81b0457ad20a1d0b3463bd982c82d98b69ab8 namespace=k8s.io Mar 7 01:13:14.469267 containerd[1517]: time="2026-03-07T01:13:14.468925407Z" level=warning msg="cleaning up after shim disconnected" id=8b3ae79596106a8c76118ecce1b81b0457ad20a1d0b3463bd982c82d98b69ab8 namespace=k8s.io Mar 7 01:13:14.469267 containerd[1517]: time="2026-03-07T01:13:14.468945367Z" level=info msg="cleaning up dead shim" namespace=k8s.io Mar 7 01:13:14.476085 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-8b3ae79596106a8c76118ecce1b81b0457ad20a1d0b3463bd982c82d98b69ab8-rootfs.mount: Deactivated successfully. 
Mar 7 01:13:14.615371 kubelet[2578]: I0307 01:13:14.615277 2578 scope.go:117] "RemoveContainer" containerID="53ea4186dae333b566c978013acf0388dd4cd1ceb0c6471aaf066554535ca174" Mar 7 01:13:14.620435 kubelet[2578]: I0307 01:13:14.620006 2578 scope.go:117] "RemoveContainer" containerID="b10634c722411af6f2beb366d3e22c24d275189d436fb8401462fb5395750c77" Mar 7 01:13:14.620867 containerd[1517]: time="2026-03-07T01:13:14.620811324Z" level=info msg="CreateContainer within sandbox \"c64ee6765446386f22c4f2080672b0e63d6fda831195072753573ca6df2b128c\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:1,}" Mar 7 01:13:14.623623 kubelet[2578]: I0307 01:13:14.623324 2578 scope.go:117] "RemoveContainer" containerID="8b3ae79596106a8c76118ecce1b81b0457ad20a1d0b3463bd982c82d98b69ab8" Mar 7 01:13:14.627266 containerd[1517]: time="2026-03-07T01:13:14.627212557Z" level=info msg="CreateContainer within sandbox \"413501224bc3aac686f1ce7fd5d5758d55e0e42e5b488d7612830120771e085f\" for container &ContainerMetadata{Name:tigera-operator,Attempt:1,}" Mar 7 01:13:14.627527 containerd[1517]: time="2026-03-07T01:13:14.627457494Z" level=info msg="CreateContainer within sandbox \"28451ef59cf4a379a051e5be392783009c391f2ca8e0533974dc9c6f459801b5\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:1,}" Mar 7 01:13:14.659775 containerd[1517]: time="2026-03-07T01:13:14.659340691Z" level=info msg="CreateContainer within sandbox \"c64ee6765446386f22c4f2080672b0e63d6fda831195072753573ca6df2b128c\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:1,} returns container id \"496d18b39ac62fb08760c4505888d4a14e898167069b009f8ecdfce0dfdaf411\"" Mar 7 01:13:14.660862 containerd[1517]: time="2026-03-07T01:13:14.660822781Z" level=info msg="StartContainer for \"496d18b39ac62fb08760c4505888d4a14e898167069b009f8ecdfce0dfdaf411\"" Mar 7 01:13:14.666738 containerd[1517]: time="2026-03-07T01:13:14.666699811Z" level=info msg="CreateContainer within sandbox 
\"413501224bc3aac686f1ce7fd5d5758d55e0e42e5b488d7612830120771e085f\" for &ContainerMetadata{Name:tigera-operator,Attempt:1,} returns container id \"9a78b53b30479477121752ca082b5240d33b7d46e9a2e65530fbc44b690f7891\"" Mar 7 01:13:14.669002 containerd[1517]: time="2026-03-07T01:13:14.668942300Z" level=info msg="StartContainer for \"9a78b53b30479477121752ca082b5240d33b7d46e9a2e65530fbc44b690f7891\"" Mar 7 01:13:14.671228 containerd[1517]: time="2026-03-07T01:13:14.671121911Z" level=info msg="CreateContainer within sandbox \"28451ef59cf4a379a051e5be392783009c391f2ca8e0533974dc9c6f459801b5\" for &ContainerMetadata{Name:kube-scheduler,Attempt:1,} returns container id \"6962d387d3f17c0a0715376fb43ec6994f0489584a97b1fd80239269b52e6e15\"" Mar 7 01:13:14.672436 containerd[1517]: time="2026-03-07T01:13:14.671869320Z" level=info msg="StartContainer for \"6962d387d3f17c0a0715376fb43ec6994f0489584a97b1fd80239269b52e6e15\"" Mar 7 01:13:14.703538 systemd[1]: Started cri-containerd-496d18b39ac62fb08760c4505888d4a14e898167069b009f8ecdfce0dfdaf411.scope - libcontainer container 496d18b39ac62fb08760c4505888d4a14e898167069b009f8ecdfce0dfdaf411. Mar 7 01:13:14.710510 systemd[1]: Started cri-containerd-6962d387d3f17c0a0715376fb43ec6994f0489584a97b1fd80239269b52e6e15.scope - libcontainer container 6962d387d3f17c0a0715376fb43ec6994f0489584a97b1fd80239269b52e6e15. Mar 7 01:13:14.712260 systemd[1]: Started cri-containerd-9a78b53b30479477121752ca082b5240d33b7d46e9a2e65530fbc44b690f7891.scope - libcontainer container 9a78b53b30479477121752ca082b5240d33b7d46e9a2e65530fbc44b690f7891. 
Mar 7 01:13:14.752779 containerd[1517]: time="2026-03-07T01:13:14.752304839Z" level=info msg="StartContainer for \"9a78b53b30479477121752ca082b5240d33b7d46e9a2e65530fbc44b690f7891\" returns successfully" Mar 7 01:13:14.757767 containerd[1517]: time="2026-03-07T01:13:14.756945045Z" level=info msg="StartContainer for \"496d18b39ac62fb08760c4505888d4a14e898167069b009f8ecdfce0dfdaf411\" returns successfully" Mar 7 01:13:14.769090 containerd[1517]: time="2026-03-07T01:13:14.769063280Z" level=info msg="StartContainer for \"6962d387d3f17c0a0715376fb43ec6994f0489584a97b1fd80239269b52e6e15\" returns successfully" Mar 7 01:13:18.248041 kubelet[2578]: E0307 01:13:18.246659 2578 event.go:359] "Server rejected event (will not retry!)" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.0.3:48128->10.0.0.2:2379: read: connection timed out" event="&Event{ObjectMeta:{kube-apiserver-ci-4081-3-6-n-e40d23dcbc.189a6a086f70d82d kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:kube-apiserver-ci-4081-3-6-n-e40d23dcbc,UID:6deea6d952b3f72c7d75a6827f33c096,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Readiness probe failed: HTTP probe failed with statuscode: 500,Source:EventSource{Component:kubelet,Host:ci-4081-3-6-n-e40d23dcbc,},FirstTimestamp:2026-03-07 01:13:07.785660461 +0000 UTC m=+145.710627542,LastTimestamp:2026-03-07 01:13:07.785660461 +0000 UTC m=+145.710627542,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4081-3-6-n-e40d23dcbc,}"