Sep 13 00:07:04.895998 kernel: Linux version 6.6.106-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 13.3.1_p20240614 p17) 13.3.1 20240614, GNU ld (Gentoo 2.42 p3) 2.42.0) #1 SMP PREEMPT_DYNAMIC Fri Sep 12 22:30:50 -00 2025
Sep 13 00:07:04.896032 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=hetzner verity.usrhash=2945e6465d436b7d1da8a9350a0544af0bd9aec821cd06987451d5e1d3071534
Sep 13 00:07:04.896046 kernel: BIOS-provided physical RAM map:
Sep 13 00:07:04.896055 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
Sep 13 00:07:04.896063 kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved
Sep 13 00:07:04.896071 kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved
Sep 13 00:07:04.896081 kernel: BIOS-e820: [mem 0x0000000000100000-0x000000007cfdbfff] usable
Sep 13 00:07:04.896090 kernel: BIOS-e820: [mem 0x000000007cfdc000-0x000000007cffffff] reserved
Sep 13 00:07:04.896102 kernel: BIOS-e820: [mem 0x00000000b0000000-0x00000000bfffffff] reserved
Sep 13 00:07:04.896111 kernel: BIOS-e820: [mem 0x00000000fed1c000-0x00000000fed1ffff] reserved
Sep 13 00:07:04.896120 kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Sep 13 00:07:04.896129 kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved
Sep 13 00:07:04.896137 kernel: NX (Execute Disable) protection: active
Sep 13 00:07:04.896146 kernel: APIC: Static calls initialized
Sep 13 00:07:04.896160 kernel: SMBIOS 2.8 present.
Sep 13 00:07:04.896170 kernel: DMI: Hetzner vServer/Standard PC (Q35 + ICH9, 2009), BIOS 20171111 11/11/2017
Sep 13 00:07:04.896180 kernel: Hypervisor detected: KVM
Sep 13 00:07:04.896189 kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Sep 13 00:07:04.896198 kernel: kvm-clock: using sched offset of 3171398702 cycles
Sep 13 00:07:04.896208 kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Sep 13 00:07:04.896217 kernel: tsc: Detected 2445.404 MHz processor
Sep 13 00:07:04.896243 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Sep 13 00:07:04.896253 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Sep 13 00:07:04.896282 kernel: last_pfn = 0x7cfdc max_arch_pfn = 0x400000000
Sep 13 00:07:04.896294 kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 19), built from 8 variable MTRRs
Sep 13 00:07:04.896304 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT
Sep 13 00:07:04.896313 kernel: Using GB pages for direct mapping
Sep 13 00:07:04.896322 kernel: ACPI: Early table checksum verification disabled
Sep 13 00:07:04.896331 kernel: ACPI: RSDP 0x00000000000F5270 000014 (v00 BOCHS )
Sep 13 00:07:04.896341 kernel: ACPI: RSDT 0x000000007CFE2693 000038 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Sep 13 00:07:04.896351 kernel: ACPI: FACP 0x000000007CFE2483 0000F4 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Sep 13 00:07:04.896361 kernel: ACPI: DSDT 0x000000007CFE0040 002443 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Sep 13 00:07:04.896376 kernel: ACPI: FACS 0x000000007CFE0000 000040
Sep 13 00:07:04.896387 kernel: ACPI: APIC 0x000000007CFE2577 000080 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Sep 13 00:07:04.896398 kernel: ACPI: HPET 0x000000007CFE25F7 000038 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Sep 13 00:07:04.896407 kernel: ACPI: MCFG 0x000000007CFE262F 00003C (v01 BOCHS BXPC 00000001 BXPC 00000001)
Sep 13 00:07:04.896416 kernel: ACPI: WAET 0x000000007CFE266B 000028 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Sep 13 00:07:04.896426 kernel: ACPI: Reserving FACP table memory at [mem 0x7cfe2483-0x7cfe2576]
Sep 13 00:07:04.896435 kernel: ACPI: Reserving DSDT table memory at [mem 0x7cfe0040-0x7cfe2482]
Sep 13 00:07:04.896445 kernel: ACPI: Reserving FACS table memory at [mem 0x7cfe0000-0x7cfe003f]
Sep 13 00:07:04.896463 kernel: ACPI: Reserving APIC table memory at [mem 0x7cfe2577-0x7cfe25f6]
Sep 13 00:07:04.896475 kernel: ACPI: Reserving HPET table memory at [mem 0x7cfe25f7-0x7cfe262e]
Sep 13 00:07:04.896486 kernel: ACPI: Reserving MCFG table memory at [mem 0x7cfe262f-0x7cfe266a]
Sep 13 00:07:04.896497 kernel: ACPI: Reserving WAET table memory at [mem 0x7cfe266b-0x7cfe2692]
Sep 13 00:07:04.896507 kernel: No NUMA configuration found
Sep 13 00:07:04.896517 kernel: Faking a node at [mem 0x0000000000000000-0x000000007cfdbfff]
Sep 13 00:07:04.896529 kernel: NODE_DATA(0) allocated [mem 0x7cfd6000-0x7cfdbfff]
Sep 13 00:07:04.896540 kernel: Zone ranges:
Sep 13 00:07:04.896551 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff]
Sep 13 00:07:04.896561 kernel: DMA32 [mem 0x0000000001000000-0x000000007cfdbfff]
Sep 13 00:07:04.896572 kernel: Normal empty
Sep 13 00:07:04.896582 kernel: Movable zone start for each node
Sep 13 00:07:04.896592 kernel: Early memory node ranges
Sep 13 00:07:04.896603 kernel: node 0: [mem 0x0000000000001000-0x000000000009efff]
Sep 13 00:07:04.896613 kernel: node 0: [mem 0x0000000000100000-0x000000007cfdbfff]
Sep 13 00:07:04.896624 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000007cfdbfff]
Sep 13 00:07:04.896637 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Sep 13 00:07:04.896648 kernel: On node 0, zone DMA: 97 pages in unavailable ranges
Sep 13 00:07:04.896658 kernel: On node 0, zone DMA32: 12324 pages in unavailable ranges
Sep 13 00:07:04.896667 kernel: ACPI: PM-Timer IO Port: 0x608
Sep 13 00:07:04.896677 kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Sep 13 00:07:04.896687 kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Sep 13 00:07:04.896698 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Sep 13 00:07:04.896708 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Sep 13 00:07:04.896718 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Sep 13 00:07:04.896731 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Sep 13 00:07:04.896741 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Sep 13 00:07:04.896751 kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Sep 13 00:07:04.896761 kernel: ACPI: HPET id: 0x8086a201 base: 0xfed00000
Sep 13 00:07:04.896772 kernel: smpboot: Allowing 2 CPUs, 0 hotplug CPUs
Sep 13 00:07:04.896782 kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
Sep 13 00:07:04.896808 kernel: [mem 0xc0000000-0xfed1bfff] available for PCI devices
Sep 13 00:07:04.896819 kernel: Booting paravirtualized kernel on KVM
Sep 13 00:07:04.896829 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Sep 13 00:07:04.896843 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:2 nr_cpu_ids:2 nr_node_ids:1
Sep 13 00:07:04.896854 kernel: percpu: Embedded 58 pages/cpu s197160 r8192 d32216 u1048576
Sep 13 00:07:04.896864 kernel: pcpu-alloc: s197160 r8192 d32216 u1048576 alloc=1*2097152
Sep 13 00:07:04.896874 kernel: pcpu-alloc: [0] 0 1
Sep 13 00:07:04.896884 kernel: kvm-guest: PV spinlocks disabled, no host support
Sep 13 00:07:04.896896 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=hetzner verity.usrhash=2945e6465d436b7d1da8a9350a0544af0bd9aec821cd06987451d5e1d3071534
Sep 13 00:07:04.896908 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space.
Sep 13 00:07:04.896918 kernel: random: crng init done
Sep 13 00:07:04.896931 kernel: Dentry cache hash table entries: 262144 (order: 9, 2097152 bytes, linear)
Sep 13 00:07:04.896942 kernel: Inode-cache hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Sep 13 00:07:04.896951 kernel: Fallback order for Node 0: 0
Sep 13 00:07:04.896961 kernel: Built 1 zonelists, mobility grouping on. Total pages: 503708
Sep 13 00:07:04.896971 kernel: Policy zone: DMA32
Sep 13 00:07:04.896981 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Sep 13 00:07:04.896991 kernel: Memory: 1922052K/2047464K available (12288K kernel code, 2293K rwdata, 22744K rodata, 42884K init, 2312K bss, 125152K reserved, 0K cma-reserved)
Sep 13 00:07:04.897001 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=2, Nodes=1
Sep 13 00:07:04.897011 kernel: ftrace: allocating 37974 entries in 149 pages
Sep 13 00:07:04.897025 kernel: ftrace: allocated 149 pages with 4 groups
Sep 13 00:07:04.897035 kernel: Dynamic Preempt: voluntary
Sep 13 00:07:04.897044 kernel: rcu: Preemptible hierarchical RCU implementation.
Sep 13 00:07:04.897056 kernel: rcu: RCU event tracing is enabled.
Sep 13 00:07:04.897067 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=2.
Sep 13 00:07:04.897078 kernel: Trampoline variant of Tasks RCU enabled.
Sep 13 00:07:04.897089 kernel: Rude variant of Tasks RCU enabled.
Sep 13 00:07:04.897099 kernel: Tracing variant of Tasks RCU enabled.
Sep 13 00:07:04.897110 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Sep 13 00:07:04.897125 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=2
Sep 13 00:07:04.897135 kernel: NR_IRQS: 33024, nr_irqs: 440, preallocated irqs: 16
Sep 13 00:07:04.897145 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Sep 13 00:07:04.897155 kernel: Console: colour VGA+ 80x25
Sep 13 00:07:04.897166 kernel: printk: console [tty0] enabled
Sep 13 00:07:04.897177 kernel: printk: console [ttyS0] enabled
Sep 13 00:07:04.897188 kernel: ACPI: Core revision 20230628
Sep 13 00:07:04.897199 kernel: clocksource: hpet: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 19112604467 ns
Sep 13 00:07:04.897210 kernel: APIC: Switch to symmetric I/O mode setup
Sep 13 00:07:04.897225 kernel: x2apic enabled
Sep 13 00:07:04.897255 kernel: APIC: Switched APIC routing to: physical x2apic
Sep 13 00:07:04.897312 kernel: ..TIMER: vector=0x30 apic1=0 pin1=2 apic2=-1 pin2=-1
Sep 13 00:07:04.897329 kernel: tsc: Marking TSC unstable due to TSCs unsynchronized
Sep 13 00:07:04.897340 kernel: Calibrating delay loop (skipped) preset value.. 4890.80 BogoMIPS (lpj=2445404)
Sep 13 00:07:04.897352 kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Sep 13 00:07:04.897363 kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127
Sep 13 00:07:04.897373 kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0
Sep 13 00:07:04.897384 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Sep 13 00:07:04.897408 kernel: Spectre V2 : Mitigation: Retpolines
Sep 13 00:07:04.897419 kernel: Spectre V2 : Spectre v2 / SpectreRSB: Filling RSB on context switch and VMEXIT
Sep 13 00:07:04.897429 kernel: Spectre V2 : Enabling Speculation Barrier for firmware calls
Sep 13 00:07:04.897444 kernel: active return thunk: retbleed_return_thunk
Sep 13 00:07:04.897454 kernel: RETBleed: Mitigation: untrained return thunk
Sep 13 00:07:04.897465 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Sep 13 00:07:04.897476 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Sep 13 00:07:04.897486 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Sep 13 00:07:04.897497 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Sep 13 00:07:04.897511 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Sep 13 00:07:04.897523 kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256
Sep 13 00:07:04.897534 kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format.
Sep 13 00:07:04.897545 kernel: Freeing SMP alternatives memory: 32K
Sep 13 00:07:04.897556 kernel: pid_max: default: 32768 minimum: 301
Sep 13 00:07:04.897568 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,integrity
Sep 13 00:07:04.897578 kernel: landlock: Up and running.
Sep 13 00:07:04.897588 kernel: SELinux: Initializing.
Sep 13 00:07:04.897603 kernel: Mount-cache hash table entries: 4096 (order: 3, 32768 bytes, linear)
Sep 13 00:07:04.897614 kernel: Mountpoint-cache hash table entries: 4096 (order: 3, 32768 bytes, linear)
Sep 13 00:07:04.897625 kernel: smpboot: CPU0: AMD EPYC-Rome Processor (family: 0x17, model: 0x31, stepping: 0x0)
Sep 13 00:07:04.897636 kernel: RCU Tasks: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Sep 13 00:07:04.897647 kernel: RCU Tasks Rude: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Sep 13 00:07:04.897658 kernel: RCU Tasks Trace: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Sep 13 00:07:04.897670 kernel: Performance Events: Fam17h+ core perfctr, AMD PMU driver.
Sep 13 00:07:04.897680 kernel: ... version: 0
Sep 13 00:07:04.897694 kernel: ... bit width: 48
Sep 13 00:07:04.897705 kernel: ... generic registers: 6
Sep 13 00:07:04.897716 kernel: ... value mask: 0000ffffffffffff
Sep 13 00:07:04.897726 kernel: ... max period: 00007fffffffffff
Sep 13 00:07:04.897737 kernel: ... fixed-purpose events: 0
Sep 13 00:07:04.897747 kernel: ... event mask: 000000000000003f
Sep 13 00:07:04.897758 kernel: signal: max sigframe size: 1776
Sep 13 00:07:04.897768 kernel: rcu: Hierarchical SRCU implementation.
Sep 13 00:07:04.897779 kernel: rcu: Max phase no-delay instances is 400.
Sep 13 00:07:04.897790 kernel: smp: Bringing up secondary CPUs ...
Sep 13 00:07:04.897805 kernel: smpboot: x86: Booting SMP configuration:
Sep 13 00:07:04.897815 kernel: .... node #0, CPUs: #1
Sep 13 00:07:04.897826 kernel: smp: Brought up 1 node, 2 CPUs
Sep 13 00:07:04.897837 kernel: smpboot: Max logical packages: 1
Sep 13 00:07:04.897848 kernel: smpboot: Total of 2 processors activated (9781.61 BogoMIPS)
Sep 13 00:07:04.897859 kernel: devtmpfs: initialized
Sep 13 00:07:04.897870 kernel: x86/mm: Memory block size: 128MB
Sep 13 00:07:04.897882 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Sep 13 00:07:04.897894 kernel: futex hash table entries: 512 (order: 3, 32768 bytes, linear)
Sep 13 00:07:04.897909 kernel: pinctrl core: initialized pinctrl subsystem
Sep 13 00:07:04.897919 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Sep 13 00:07:04.897929 kernel: audit: initializing netlink subsys (disabled)
Sep 13 00:07:04.897941 kernel: audit: type=2000 audit(1757722024.297:1): state=initialized audit_enabled=0 res=1
Sep 13 00:07:04.897953 kernel: thermal_sys: Registered thermal governor 'step_wise'
Sep 13 00:07:04.897965 kernel: thermal_sys: Registered thermal governor 'user_space'
Sep 13 00:07:04.897977 kernel: cpuidle: using governor menu
Sep 13 00:07:04.897989 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Sep 13 00:07:04.898000 kernel: dca service started, version 1.12.1
Sep 13 00:07:04.898015 kernel: PCI: MMCONFIG for domain 0000 [bus 00-ff] at [mem 0xb0000000-0xbfffffff] (base 0xb0000000)
Sep 13 00:07:04.898025 kernel: PCI: Using configuration type 1 for base access
Sep 13 00:07:04.898037 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Sep 13 00:07:04.898049 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Sep 13 00:07:04.898060 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Sep 13 00:07:04.898071 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Sep 13 00:07:04.898082 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Sep 13 00:07:04.898093 kernel: ACPI: Added _OSI(Module Device)
Sep 13 00:07:04.898103 kernel: ACPI: Added _OSI(Processor Device)
Sep 13 00:07:04.898117 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Sep 13 00:07:04.898129 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Sep 13 00:07:04.898139 kernel: ACPI: _OSC evaluation for CPUs failed, trying _PDC
Sep 13 00:07:04.898150 kernel: ACPI: Interpreter enabled
Sep 13 00:07:04.898161 kernel: ACPI: PM: (supports S0 S5)
Sep 13 00:07:04.898171 kernel: ACPI: Using IOAPIC for interrupt routing
Sep 13 00:07:04.898182 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Sep 13 00:07:04.898193 kernel: PCI: Using E820 reservations for host bridge windows
Sep 13 00:07:04.898204 kernel: ACPI: Enabled 2 GPEs in block 00 to 3F
Sep 13 00:07:04.898218 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Sep 13 00:07:04.898503 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3]
Sep 13 00:07:04.898652 kernel: acpi PNP0A08:00: _OSC: platform does not support [PCIeHotplug LTR]
Sep 13 00:07:04.898788 kernel: acpi PNP0A08:00: _OSC: OS now controls [PME AER PCIeCapability]
Sep 13 00:07:04.898806 kernel: PCI host bridge to bus 0000:00
Sep 13 00:07:04.898950 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window]
Sep 13 00:07:04.899100 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xffff window]
Sep 13 00:07:04.899215 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Sep 13 00:07:04.899378 kernel: pci_bus 0000:00: root bus resource [mem 0x7d000000-0xafffffff window]
Sep 13 00:07:04.899497 kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
Sep 13 00:07:04.899614 kernel: pci_bus 0000:00: root bus resource [mem 0x100000000-0x8ffffffff window]
Sep 13 00:07:04.899727 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Sep 13 00:07:04.899872 kernel: pci 0000:00:00.0: [8086:29c0] type 00 class 0x060000
Sep 13 00:07:04.900022 kernel: pci 0000:00:01.0: [1af4:1050] type 00 class 0x030000
Sep 13 00:07:04.900158 kernel: pci 0000:00:01.0: reg 0x10: [mem 0xfb800000-0xfbffffff pref]
Sep 13 00:07:04.900352 kernel: pci 0000:00:01.0: reg 0x18: [mem 0xfd200000-0xfd203fff 64bit pref]
Sep 13 00:07:04.900483 kernel: pci 0000:00:01.0: reg 0x20: [mem 0xfea10000-0xfea10fff]
Sep 13 00:07:04.900610 kernel: pci 0000:00:01.0: reg 0x30: [mem 0xfea00000-0xfea0ffff pref]
Sep 13 00:07:04.900772 kernel: pci 0000:00:01.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Sep 13 00:07:04.900930 kernel: pci 0000:00:02.0: [1b36:000c] type 01 class 0x060400
Sep 13 00:07:04.901073 kernel: pci 0000:00:02.0: reg 0x10: [mem 0xfea11000-0xfea11fff]
Sep 13 00:07:04.901214 kernel: pci 0000:00:02.1: [1b36:000c] type 01 class 0x060400
Sep 13 00:07:04.901396 kernel: pci 0000:00:02.1: reg 0x10: [mem 0xfea12000-0xfea12fff]
Sep 13 00:07:04.901536 kernel: pci 0000:00:02.2: [1b36:000c] type 01 class 0x060400
Sep 13 00:07:04.901667 kernel: pci 0000:00:02.2: reg 0x10: [mem 0xfea13000-0xfea13fff]
Sep 13 00:07:04.901810 kernel: pci 0000:00:02.3: [1b36:000c] type 01 class 0x060400
Sep 13 00:07:04.901947 kernel: pci 0000:00:02.3: reg 0x10: [mem 0xfea14000-0xfea14fff]
Sep 13 00:07:04.902093 kernel: pci 0000:00:02.4: [1b36:000c] type 01 class 0x060400
Sep 13 00:07:04.902220 kernel: pci 0000:00:02.4: reg 0x10: [mem 0xfea15000-0xfea15fff]
Sep 13 00:07:04.902415 kernel: pci 0000:00:02.5: [1b36:000c] type 01 class 0x060400
Sep 13 00:07:04.902486 kernel: pci 0000:00:02.5: reg 0x10: [mem 0xfea16000-0xfea16fff]
Sep 13 00:07:04.902558 kernel: pci 0000:00:02.6: [1b36:000c] type 01 class 0x060400
Sep 13 00:07:04.902632 kernel: pci 0000:00:02.6: reg 0x10: [mem 0xfea17000-0xfea17fff]
Sep 13 00:07:04.902701 kernel: pci 0000:00:02.7: [1b36:000c] type 01 class 0x060400
Sep 13 00:07:04.902764 kernel: pci 0000:00:02.7: reg 0x10: [mem 0xfea18000-0xfea18fff]
Sep 13 00:07:04.902831 kernel: pci 0000:00:03.0: [1b36:000c] type 01 class 0x060400
Sep 13 00:07:04.902893 kernel: pci 0000:00:03.0: reg 0x10: [mem 0xfea19000-0xfea19fff]
Sep 13 00:07:04.902962 kernel: pci 0000:00:1f.0: [8086:2918] type 00 class 0x060100
Sep 13 00:07:04.903044 kernel: pci 0000:00:1f.0: quirk: [io 0x0600-0x067f] claimed by ICH6 ACPI/GPIO/TCO
Sep 13 00:07:04.903179 kernel: pci 0000:00:1f.2: [8086:2922] type 00 class 0x010601
Sep 13 00:07:04.903382 kernel: pci 0000:00:1f.2: reg 0x20: [io 0xc040-0xc05f]
Sep 13 00:07:04.903512 kernel: pci 0000:00:1f.2: reg 0x24: [mem 0xfea1a000-0xfea1afff]
Sep 13 00:07:04.903649 kernel: pci 0000:00:1f.3: [8086:2930] type 00 class 0x0c0500
Sep 13 00:07:04.903777 kernel: pci 0000:00:1f.3: reg 0x20: [io 0x0700-0x073f]
Sep 13 00:07:04.903917 kernel: pci 0000:01:00.0: [1af4:1041] type 00 class 0x020000
Sep 13 00:07:04.904063 kernel: pci 0000:01:00.0: reg 0x14: [mem 0xfe880000-0xfe880fff]
Sep 13 00:07:04.904196 kernel: pci 0000:01:00.0: reg 0x20: [mem 0xfd000000-0xfd003fff 64bit pref]
Sep 13 00:07:04.904394 kernel: pci 0000:01:00.0: reg 0x30: [mem 0xfe800000-0xfe87ffff pref]
Sep 13 00:07:04.904521 kernel: pci 0000:00:02.0: PCI bridge to [bus 01]
Sep 13 00:07:04.904646 kernel: pci 0000:00:02.0: bridge window [mem 0xfe800000-0xfe9fffff]
Sep 13 00:07:04.904794 kernel: pci 0000:00:02.0: bridge window [mem 0xfd000000-0xfd1fffff 64bit pref]
Sep 13 00:07:04.904946 kernel: pci 0000:02:00.0: [1b36:000d] type 00 class 0x0c0330
Sep 13 00:07:04.905089 kernel: pci 0000:02:00.0: reg 0x10: [mem 0xfe600000-0xfe603fff 64bit]
Sep 13 00:07:04.905222 kernel: pci 0000:00:02.1: PCI bridge to [bus 02]
Sep 13 00:07:04.905446 kernel: pci 0000:00:02.1: bridge window [mem 0xfe600000-0xfe7fffff]
Sep 13 00:07:04.905569 kernel: pci 0000:00:02.1: bridge window [mem 0xfce00000-0xfcffffff 64bit pref]
Sep 13 00:07:04.905714 kernel: pci 0000:03:00.0: [1af4:1043] type 00 class 0x078000
Sep 13 00:07:04.905847 kernel: pci 0000:03:00.0: reg 0x14: [mem 0xfe400000-0xfe400fff]
Sep 13 00:07:04.906010 kernel: pci 0000:03:00.0: reg 0x20: [mem 0xfcc00000-0xfcc03fff 64bit pref]
Sep 13 00:07:04.906137 kernel: pci 0000:00:02.2: PCI bridge to [bus 03]
Sep 13 00:07:04.906331 kernel: pci 0000:00:02.2: bridge window [mem 0xfe400000-0xfe5fffff]
Sep 13 00:07:04.906467 kernel: pci 0000:00:02.2: bridge window [mem 0xfcc00000-0xfcdfffff 64bit pref]
Sep 13 00:07:04.906610 kernel: pci 0000:04:00.0: [1af4:1045] type 00 class 0x00ff00
Sep 13 00:07:04.906739 kernel: pci 0000:04:00.0: reg 0x20: [mem 0xfca00000-0xfca03fff 64bit pref]
Sep 13 00:07:04.906867 kernel: pci 0000:00:02.3: PCI bridge to [bus 04]
Sep 13 00:07:04.907002 kernel: pci 0000:00:02.3: bridge window [mem 0xfe200000-0xfe3fffff]
Sep 13 00:07:04.907125 kernel: pci 0000:00:02.3: bridge window [mem 0xfca00000-0xfcbfffff 64bit pref]
Sep 13 00:07:04.907326 kernel: pci 0000:05:00.0: [1af4:1044] type 00 class 0x00ff00
Sep 13 00:07:04.907466 kernel: pci 0000:05:00.0: reg 0x20: [mem 0xfc800000-0xfc803fff 64bit pref]
Sep 13 00:07:04.907596 kernel: pci 0000:00:02.4: PCI bridge to [bus 05]
Sep 13 00:07:04.907722 kernel: pci 0000:00:02.4: bridge window [mem 0xfe000000-0xfe1fffff]
Sep 13 00:07:04.907851 kernel: pci 0000:00:02.4: bridge window [mem 0xfc800000-0xfc9fffff 64bit pref]
Sep 13 00:07:04.908006 kernel: pci 0000:06:00.0: [1af4:1048] type 00 class 0x010000
Sep 13 00:07:04.908156 kernel: pci 0000:06:00.0: reg 0x14: [mem 0xfde00000-0xfde00fff]
Sep 13 00:07:04.908415 kernel: pci 0000:06:00.0: reg 0x20: [mem 0xfc600000-0xfc603fff 64bit pref]
Sep 13 00:07:04.908542 kernel: pci 0000:00:02.5: PCI bridge to [bus 06]
Sep 13 00:07:04.908665 kernel: pci 0000:00:02.5: bridge window [mem 0xfde00000-0xfdffffff]
Sep 13 00:07:04.908794 kernel: pci 0000:00:02.5: bridge window [mem 0xfc600000-0xfc7fffff 64bit pref]
Sep 13 00:07:04.908811 kernel: acpiphp: Slot [0] registered
Sep 13 00:07:04.909346 kernel: pci 0000:07:00.0: [1af4:1041] type 00 class 0x020000
Sep 13 00:07:04.909497 kernel: pci 0000:07:00.0: reg 0x14: [mem 0xfdc80000-0xfdc80fff]
Sep 13 00:07:04.909634 kernel: pci 0000:07:00.0: reg 0x20: [mem 0xfc400000-0xfc403fff 64bit pref]
Sep 13 00:07:04.909770 kernel: pci 0000:07:00.0: reg 0x30: [mem 0xfdc00000-0xfdc7ffff pref]
Sep 13 00:07:04.909892 kernel: pci 0000:00:02.6: PCI bridge to [bus 07]
Sep 13 00:07:04.909997 kernel: pci 0000:00:02.6: bridge window [mem 0xfdc00000-0xfddfffff]
Sep 13 00:07:04.910112 kernel: pci 0000:00:02.6: bridge window [mem 0xfc400000-0xfc5fffff 64bit pref]
Sep 13 00:07:04.910129 kernel: acpiphp: Slot [0-2] registered
Sep 13 00:07:04.910260 kernel: pci 0000:00:02.7: PCI bridge to [bus 08]
Sep 13 00:07:04.910614 kernel: pci 0000:00:02.7: bridge window [mem 0xfda00000-0xfdbfffff]
Sep 13 00:07:04.910731 kernel: pci 0000:00:02.7: bridge window [mem 0xfc200000-0xfc3fffff 64bit pref]
Sep 13 00:07:04.910747 kernel: acpiphp: Slot [0-3] registered
Sep 13 00:07:04.910871 kernel: pci 0000:00:03.0: PCI bridge to [bus 09]
Sep 13 00:07:04.910996 kernel: pci 0000:00:03.0: bridge window [mem 0xfd800000-0xfd9fffff]
Sep 13 00:07:04.911114 kernel: pci 0000:00:03.0: bridge window [mem 0xfc000000-0xfc1fffff 64bit pref]
Sep 13 00:07:04.911131 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Sep 13 00:07:04.911142 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Sep 13 00:07:04.911157 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Sep 13 00:07:04.911168 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Sep 13 00:07:04.911179 kernel: ACPI: PCI: Interrupt link LNKE configured for IRQ 10
Sep 13 00:07:04.911189 kernel: ACPI: PCI: Interrupt link LNKF configured for IRQ 10
Sep 13 00:07:04.911200 kernel: ACPI: PCI: Interrupt link LNKG configured for IRQ 11
Sep 13 00:07:04.911210 kernel: ACPI: PCI: Interrupt link LNKH configured for IRQ 11
Sep 13 00:07:04.911220 kernel: ACPI: PCI: Interrupt link GSIA configured for IRQ 16
Sep 13 00:07:04.911246 kernel: ACPI: PCI: Interrupt link GSIB configured for IRQ 17
Sep 13 00:07:04.911256 kernel: ACPI: PCI: Interrupt link GSIC configured for IRQ 18
Sep 13 00:07:04.911283 kernel: ACPI: PCI: Interrupt link GSID configured for IRQ 19
Sep 13 00:07:04.911294 kernel: ACPI: PCI: Interrupt link GSIE configured for IRQ 20
Sep 13 00:07:04.911305 kernel: ACPI: PCI: Interrupt link GSIF configured for IRQ 21
Sep 13 00:07:04.911316 kernel: ACPI: PCI: Interrupt link GSIG configured for IRQ 22
Sep 13 00:07:04.911327 kernel: ACPI: PCI: Interrupt link GSIH configured for IRQ 23
Sep 13 00:07:04.911338 kernel: iommu: Default domain type: Translated
Sep 13 00:07:04.911348 kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Sep 13 00:07:04.911358 kernel: PCI: Using ACPI for IRQ routing
Sep 13 00:07:04.911369 kernel: PCI: pci_cache_line_size set to 64 bytes
Sep 13 00:07:04.911383 kernel: e820: reserve RAM buffer [mem 0x0009fc00-0x0009ffff]
Sep 13 00:07:04.911394 kernel: e820: reserve RAM buffer [mem 0x7cfdc000-0x7fffffff]
Sep 13 00:07:04.911522 kernel: pci 0000:00:01.0: vgaarb: setting as boot VGA device
Sep 13 00:07:04.911652 kernel: pci 0000:00:01.0: vgaarb: bridge control possible
Sep 13 00:07:04.911774 kernel: pci 0000:00:01.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Sep 13 00:07:04.911792 kernel: vgaarb: loaded
Sep 13 00:07:04.911804 kernel: hpet0: at MMIO 0xfed00000, IRQs 2, 8, 0
Sep 13 00:07:04.911815 kernel: hpet0: 3 comparators, 64-bit 100.000000 MHz counter
Sep 13 00:07:04.911827 kernel: clocksource: Switched to clocksource kvm-clock
Sep 13 00:07:04.911843 kernel: VFS: Disk quotas dquot_6.6.0
Sep 13 00:07:04.911854 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Sep 13 00:07:04.911865 kernel: pnp: PnP ACPI init
Sep 13 00:07:04.912004 kernel: system 00:04: [mem 0xb0000000-0xbfffffff window] has been reserved
Sep 13 00:07:04.912025 kernel: pnp: PnP ACPI: found 5 devices
Sep 13 00:07:04.912037 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Sep 13 00:07:04.912048 kernel: NET: Registered PF_INET protocol family
Sep 13 00:07:04.912060 kernel: IP idents hash table entries: 32768 (order: 6, 262144 bytes, linear)
Sep 13 00:07:04.912076 kernel: tcp_listen_portaddr_hash hash table entries: 1024 (order: 2, 16384 bytes, linear)
Sep 13 00:07:04.912088 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Sep 13 00:07:04.912099 kernel: TCP established hash table entries: 16384 (order: 5, 131072 bytes, linear)
Sep 13 00:07:04.912111 kernel: TCP bind hash table entries: 16384 (order: 7, 524288 bytes, linear)
Sep 13 00:07:04.912122 kernel: TCP: Hash tables configured (established 16384 bind 16384)
Sep 13 00:07:04.912133 kernel: UDP hash table entries: 1024 (order: 3, 32768 bytes, linear)
Sep 13 00:07:04.912143 kernel: UDP-Lite hash table entries: 1024 (order: 3, 32768 bytes, linear)
Sep 13 00:07:04.912152 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Sep 13 00:07:04.912163 kernel: NET: Registered PF_XDP protocol family
Sep 13 00:07:04.912325 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x0fff] to [bus 07] add_size 1000
Sep 13 00:07:04.912455 kernel: pci 0000:00:02.7: bridge window [io 0x1000-0x0fff] to [bus 08] add_size 1000
Sep 13 00:07:04.913011 kernel: pci 0000:00:03.0: bridge window [io 0x1000-0x0fff] to [bus 09] add_size 1000
Sep 13 00:07:04.913143 kernel: pci 0000:00:02.6: BAR 13: assigned [io 0x1000-0x1fff]
Sep 13 00:07:04.913325 kernel: pci 0000:00:02.7: BAR 13: assigned [io 0x2000-0x2fff]
Sep 13 00:07:04.913513 kernel: pci 0000:00:03.0: BAR 13: assigned [io 0x3000-0x3fff]
Sep 13 00:07:04.914334 kernel: pci 0000:00:02.0: PCI bridge to [bus 01]
Sep 13 00:07:04.914476 kernel: pci 0000:00:02.0: bridge window [mem 0xfe800000-0xfe9fffff]
Sep 13 00:07:04.914606 kernel: pci 0000:00:02.0: bridge window [mem 0xfd000000-0xfd1fffff 64bit pref]
Sep 13 00:07:04.914743 kernel: pci 0000:00:02.1: PCI bridge to [bus 02]
Sep 13 00:07:04.914870 kernel: pci 0000:00:02.1: bridge window [mem 0xfe600000-0xfe7fffff]
Sep 13 00:07:04.915057 kernel: pci 0000:00:02.1: bridge window [mem 0xfce00000-0xfcffffff 64bit pref]
Sep 13 00:07:04.915196 kernel: pci 0000:00:02.2: PCI bridge to [bus 03]
Sep 13 00:07:04.916447 kernel: pci 0000:00:02.2: bridge window [mem 0xfe400000-0xfe5fffff]
Sep 13 00:07:04.916586 kernel: pci 0000:00:02.2: bridge window [mem 0xfcc00000-0xfcdfffff 64bit pref]
Sep 13 00:07:04.916732 kernel: pci 0000:00:02.3: PCI bridge to [bus 04]
Sep 13 00:07:04.916862 kernel: pci 0000:00:02.3: bridge window [mem 0xfe200000-0xfe3fffff]
Sep 13 00:07:04.916986 kernel: pci 0000:00:02.3: bridge window [mem 0xfca00000-0xfcbfffff 64bit pref]
Sep 13 00:07:04.917115 kernel: pci 0000:00:02.4: PCI bridge to [bus 05]
Sep 13 00:07:04.917260 kernel: pci 0000:00:02.4: bridge window [mem 0xfe000000-0xfe1fffff]
Sep 13 00:07:04.918438 kernel: pci 0000:00:02.4: bridge window [mem 0xfc800000-0xfc9fffff 64bit pref]
Sep 13 00:07:04.918568 kernel: pci 0000:00:02.5: PCI bridge to [bus 06]
Sep 13 00:07:04.918704 kernel: pci 0000:00:02.5: bridge window [mem 0xfde00000-0xfdffffff]
Sep 13 00:07:04.918857 kernel: pci 0000:00:02.5: bridge window [mem 0xfc600000-0xfc7fffff 64bit pref]
Sep 13 00:07:04.918990 kernel: pci 0000:00:02.6: PCI bridge to [bus 07]
Sep 13 00:07:04.919121 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x1fff]
Sep 13 00:07:04.920376 kernel: pci 0000:00:02.6: bridge window [mem 0xfdc00000-0xfddfffff]
Sep 13 00:07:04.920528 kernel: pci 0000:00:02.6: bridge window [mem 0xfc400000-0xfc5fffff 64bit pref]
Sep 13 00:07:04.920663 kernel: pci 0000:00:02.7: PCI bridge to [bus 08]
Sep 13 00:07:04.920795 kernel: pci 0000:00:02.7: bridge window [io 0x2000-0x2fff]
Sep 13 00:07:04.920925 kernel: pci 0000:00:02.7: bridge window [mem 0xfda00000-0xfdbfffff]
Sep 13 00:07:04.921047 kernel: pci 0000:00:02.7: bridge window [mem 0xfc200000-0xfc3fffff 64bit pref]
Sep 13 00:07:04.921181 kernel: pci 0000:00:03.0: PCI bridge to [bus 09]
Sep 13 00:07:04.921358 kernel: pci 0000:00:03.0: bridge window [io 0x3000-0x3fff]
Sep 13 00:07:04.921485 kernel: pci 0000:00:03.0: bridge window [mem 0xfd800000-0xfd9fffff]
Sep 13 00:07:04.921613 kernel: pci 0000:00:03.0: bridge window [mem 0xfc000000-0xfc1fffff 64bit pref]
Sep 13 00:07:04.921749 kernel: pci_bus 0000:00: resource 4 [io 0x0000-0x0cf7 window]
Sep 13 00:07:04.921868 kernel: pci_bus 0000:00: resource 5 [io 0x0d00-0xffff window]
Sep 13 00:07:04.921986 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Sep 13 00:07:04.922103 kernel: pci_bus 0000:00: resource 7 [mem 0x7d000000-0xafffffff window]
Sep 13 00:07:04.922219 kernel: pci_bus 0000:00: resource 8 [mem 0xc0000000-0xfebfffff window]
Sep 13 00:07:04.924413 kernel: pci_bus 0000:00: resource 9 [mem 0x100000000-0x8ffffffff window]
Sep 13 00:07:04.924575 kernel: pci_bus 0000:01: resource 1 [mem 0xfe800000-0xfe9fffff]
Sep 13 00:07:04.924706 kernel: pci_bus 0000:01: resource 2 [mem 0xfd000000-0xfd1fffff 64bit pref]
Sep 13 00:07:04.924845 kernel: pci_bus 0000:02: resource 1 [mem 0xfe600000-0xfe7fffff]
Sep 13 00:07:04.924969 kernel: pci_bus 0000:02: resource 2 [mem 0xfce00000-0xfcffffff 64bit pref]
Sep 13 00:07:04.925096 kernel: pci_bus 0000:03: resource 1 [mem 0xfe400000-0xfe5fffff]
Sep 13 00:07:04.925206 kernel: pci_bus 0000:03: resource 2 [mem 0xfcc00000-0xfcdfffff 64bit pref]
Sep 13 00:07:04.926411 kernel: pci_bus 0000:04: resource 1 [mem 0xfe200000-0xfe3fffff]
Sep 13 00:07:04.926540 kernel: pci_bus 0000:04: resource 2 [mem 0xfca00000-0xfcbfffff 64bit pref]
Sep 13 00:07:04.926677 kernel: pci_bus 0000:05: resource 1 [mem 0xfe000000-0xfe1fffff]
Sep 13 00:07:04.926796 kernel: pci_bus 0000:05: resource 2 [mem 0xfc800000-0xfc9fffff 64bit pref]
Sep 13 00:07:04.926928 kernel: pci_bus 0000:06: resource 1 [mem 0xfde00000-0xfdffffff]
Sep 13 00:07:04.927045 kernel: pci_bus 0000:06: resource 2 [mem 0xfc600000-0xfc7fffff 64bit pref]
Sep 13 00:07:04.927184 kernel: pci_bus 0000:07: resource 0 [io 0x1000-0x1fff]
Sep 13 00:07:04.928409 kernel: pci_bus 0000:07: resource 1 [mem 0xfdc00000-0xfddfffff]
Sep 13 00:07:04.928538 kernel: pci_bus 0000:07: resource 2 [mem 0xfc400000-0xfc5fffff 64bit pref]
Sep 13 00:07:04.928668 kernel: pci_bus 0000:08: resource 0 [io 0x2000-0x2fff]
Sep 13 00:07:04.928777 kernel: pci_bus 0000:08: resource 1 [mem 0xfda00000-0xfdbfffff]
Sep 13 00:07:04.928890 kernel: pci_bus 0000:08: resource 2 [mem 0xfc200000-0xfc3fffff 64bit pref]
Sep 13 00:07:04.929032 kernel: pci_bus 0000:09: resource 0 [io 0x3000-0x3fff]
Sep 13 00:07:04.929158 kernel: pci_bus 0000:09: resource 1 [mem 0xfd800000-0xfd9fffff]
Sep 13 00:07:04.929314 kernel: pci_bus 0000:09: resource 2 [mem 0xfc000000-0xfc1fffff 64bit pref]
Sep 13 00:07:04.929334 kernel: ACPI: \_SB_.GSIG: Enabled at IRQ 22
Sep 13 00:07:04.929347 kernel: PCI: CLS 0 bytes, default 64
Sep 13 00:07:04.929360 kernel: Initialise system trusted keyrings
Sep 13 00:07:04.929371 kernel: workingset: timestamp_bits=39 max_order=19 bucket_order=0
Sep 13 00:07:04.929382 kernel: Key type asymmetric registered
Sep 13 00:07:04.929394 kernel: Asymmetric key parser 'x509' registered
Sep 13 00:07:04.929406 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 251)
Sep 13 00:07:04.929423 kernel: io scheduler mq-deadline registered
Sep 13 00:07:04.929435 kernel: io scheduler kyber registered
Sep 13 00:07:04.929446 kernel: io scheduler bfq registered
Sep 13 00:07:04.929579 kernel: pcieport 0000:00:02.0: PME: Signaling with IRQ 24
Sep 13 00:07:04.933297 kernel: pcieport 0000:00:02.0: AER: enabled with IRQ 24
Sep 13 00:07:04.933503 kernel: pcieport 0000:00:02.1: PME: Signaling with IRQ 25
Sep 13 00:07:04.933645 kernel: pcieport 0000:00:02.1: AER: enabled with IRQ 25
Sep 13 00:07:04.933784 kernel: pcieport 0000:00:02.2: PME: Signaling with IRQ 26
Sep 13 00:07:04.933928 kernel: pcieport 0000:00:02.2: AER: enabled with IRQ 26
Sep 13 00:07:04.934065 kernel: pcieport 0000:00:02.3: PME: Signaling with IRQ 27
Sep 13 00:07:04.934204 kernel: pcieport 0000:00:02.3: AER: enabled with IRQ 27
Sep 13 00:07:04.936632 kernel: pcieport 0000:00:02.4: PME: Signaling with IRQ 28
Sep 13 00:07:04.936846 kernel: pcieport 0000:00:02.4: AER: enabled with IRQ 28
Sep 13 00:07:04.937467 kernel: pcieport 0000:00:02.5: PME: Signaling with IRQ 29
Sep 13 00:07:04.937619 kernel: pcieport 0000:00:02.5: AER: enabled with IRQ 29
Sep 13 00:07:04.937764 kernel: pcieport 0000:00:02.6: PME: Signaling with IRQ 30
Sep 13 00:07:04.937915 kernel: pcieport 0000:00:02.6: AER: enabled with IRQ 30
Sep 13 00:07:04.938058 kernel: pcieport 0000:00:02.7: PME: Signaling with IRQ 31
Sep 13 00:07:04.938196 kernel: pcieport 0000:00:02.7: AER: enabled with IRQ 31
Sep 13 00:07:04.938217 kernel: ACPI: \_SB_.GSIH: Enabled at IRQ 23
Sep 13 00:07:04.939421 kernel: pcieport 0000:00:03.0: PME: Signaling with IRQ 32
Sep 13 00:07:04.939564 kernel: pcieport 0000:00:03.0: AER: enabled with IRQ 32
Sep 13 00:07:04.939585 kernel: ioatdma: Intel(R) QuickData Technology Driver 5.00
Sep 13 00:07:04.939599 kernel: ACPI: \_SB_.GSIF: Enabled at IRQ 21
Sep 13 00:07:04.939611 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Sep 13 00:07:04.939631 kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
Sep 13 00:07:04.939643 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12
Sep 13 00:07:04.939655 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1
Sep 13 00:07:04.939666 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12
Sep 13 00:07:04.939753 kernel: rtc_cmos 00:03: RTC can wake from S4
Sep 13 00:07:04.939768 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input0
Sep 13 00:07:04.941353 kernel: rtc_cmos 00:03: registered as rtc0
Sep 13 00:07:04.941484 kernel: rtc_cmos 00:03: setting system clock to 2025-09-13T00:07:04 UTC (1757722024)
Sep 13 00:07:04.941617 kernel: rtc_cmos 00:03: alarms up to one day, y3k, 242 bytes nvram, hpet irqs
Sep 13 00:07:04.941637 kernel: amd_pstate: the _CPC object is not present in SBIOS or ACPI disabled
Sep 13 00:07:04.941650 kernel: NET: Registered PF_INET6 protocol family
Sep 13 00:07:04.941661 kernel: Segment Routing with IPv6
Sep 13 00:07:04.941673 kernel: In-situ OAM (IOAM) with IPv6
Sep 13 00:07:04.941684 kernel: NET: Registered PF_PACKET protocol family
Sep 13 00:07:04.941696 kernel: Key type dns_resolver registered
Sep 13 00:07:04.941708 kernel: IPI shorthand broadcast: enabled
Sep 13 00:07:04.941719 kernel: sched_clock: Marking stable (1264011374, 174754817)->(1452065233, -13299042)
Sep 13 00:07:04.941738 kernel: registered taskstats version 1
Sep 13 00:07:04.941750 kernel: Loading compiled-in X.509 certificates
Sep 13 00:07:04.941762 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.6.106-flatcar: 1274e0c573ac8d09163d6bc6d1ee1445fb2f8cc6'
Sep 13 00:07:04.941773 kernel: Key type .fscrypt registered
Sep 13 00:07:04.941784 kernel: Key type fscrypt-provisioning registered
Sep 13 00:07:04.941797 kernel: ima: No TPM chip found, activating TPM-bypass!
Sep 13 00:07:04.941809 kernel: ima: Allocated hash algorithm: sha1
Sep 13 00:07:04.941821 kernel: ima: No architecture policies found
Sep 13 00:07:04.941837 kernel: clk: Disabling unused clocks
Sep 13 00:07:04.941847 kernel: Freeing unused kernel image (initmem) memory: 42884K
Sep 13 00:07:04.941853 kernel: Write protecting the kernel read-only data: 36864k
Sep 13 00:07:04.941860 kernel: Freeing unused kernel image (rodata/data gap) memory: 1832K
Sep 13 00:07:04.941866 kernel: Run /init as init process
Sep 13 00:07:04.941873 kernel: with arguments:
Sep 13 00:07:04.941879 kernel: /init
Sep 13 00:07:04.941885 kernel: with environment:
Sep 13 00:07:04.941891 kernel: HOME=/
Sep 13 00:07:04.941897 kernel: TERM=linux
Sep 13 00:07:04.941905 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a
Sep 13 00:07:04.941913 systemd[1]: systemd 255 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT default-hierarchy=unified)
Sep 13 00:07:04.941922 systemd[1]: Detected virtualization kvm.
Sep 13 00:07:04.941929 systemd[1]: Detected architecture x86-64.
Sep 13 00:07:04.941935 systemd[1]: Running in initrd.
Sep 13 00:07:04.941941 systemd[1]: No hostname configured, using default hostname.
Sep 13 00:07:04.941948 systemd[1]: Hostname set to .
Sep 13 00:07:04.941956 systemd[1]: Initializing machine ID from VM UUID.
Sep 13 00:07:04.941962 systemd[1]: Queued start job for default target initrd.target.
Sep 13 00:07:04.941969 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Sep 13 00:07:04.941976 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Sep 13 00:07:04.941983 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM...
Sep 13 00:07:04.941990 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Sep 13 00:07:04.941996 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT...
Sep 13 00:07:04.942003 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A...
Sep 13 00:07:04.942012 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132...
Sep 13 00:07:04.942019 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr...
Sep 13 00:07:04.942025 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Sep 13 00:07:04.942037 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Sep 13 00:07:04.942049 systemd[1]: Reached target paths.target - Path Units.
Sep 13 00:07:04.942061 systemd[1]: Reached target slices.target - Slice Units.
Sep 13 00:07:04.942074 systemd[1]: Reached target swap.target - Swaps.
Sep 13 00:07:04.942089 systemd[1]: Reached target timers.target - Timer Units.
Sep 13 00:07:04.942102 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket.
Sep 13 00:07:04.942115 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Sep 13 00:07:04.942127 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log).
Sep 13 00:07:04.942138 systemd[1]: Listening on systemd-journald.socket - Journal Socket.
Sep 13 00:07:04.942150 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Sep 13 00:07:04.942163 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Sep 13 00:07:04.942176 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Sep 13 00:07:04.942191 systemd[1]: Reached target sockets.target - Socket Units.
Sep 13 00:07:04.942204 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup...
Sep 13 00:07:04.942217 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Sep 13 00:07:04.942244 systemd[1]: Finished network-cleanup.service - Network Cleanup.
Sep 13 00:07:04.942258 systemd[1]: Starting systemd-fsck-usr.service...
Sep 13 00:07:04.942287 systemd[1]: Starting systemd-journald.service - Journal Service...
Sep 13 00:07:04.942314 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Sep 13 00:07:04.942328 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Sep 13 00:07:04.942366 systemd-journald[188]: Collecting audit messages is disabled.
Sep 13 00:07:04.942405 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup.
Sep 13 00:07:04.942419 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Sep 13 00:07:04.942432 systemd[1]: Finished systemd-fsck-usr.service.
Sep 13 00:07:04.942449 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
Sep 13 00:07:04.942462 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Sep 13 00:07:04.942474 kernel: Bridge firewalling registered
Sep 13 00:07:04.942486 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Sep 13 00:07:04.942500 systemd-journald[188]: Journal started
Sep 13 00:07:04.942532 systemd-journald[188]: Runtime Journal (/run/log/journal/1640c12c69434b00bc6ccbf0bba0e1db) is 4.8M, max 38.4M, 33.6M free.
Sep 13 00:07:04.912855 systemd-modules-load[189]: Inserted module 'overlay'
Sep 13 00:07:04.982892 systemd[1]: Started systemd-journald.service - Journal Service.
Sep 13 00:07:04.939126 systemd-modules-load[189]: Inserted module 'br_netfilter'
Sep 13 00:07:04.984365 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Sep 13 00:07:04.985036 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Sep 13 00:07:04.992409 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Sep 13 00:07:04.994366 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Sep 13 00:07:04.997426 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Sep 13 00:07:05.002122 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Sep 13 00:07:05.015602 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Sep 13 00:07:05.020541 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Sep 13 00:07:05.026475 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook...
Sep 13 00:07:05.028158 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Sep 13 00:07:05.029783 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Sep 13 00:07:05.042016 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Sep 13 00:07:05.052857 dracut-cmdline[219]: dracut-dracut-053
Sep 13 00:07:05.055953 dracut-cmdline[219]: Using kernel command line parameters: rd.driver.pre=btrfs rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=hetzner verity.usrhash=2945e6465d436b7d1da8a9350a0544af0bd9aec821cd06987451d5e1d3071534
Sep 13 00:07:05.084449 systemd-resolved[223]: Positive Trust Anchors:
Sep 13 00:07:05.084464 systemd-resolved[223]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Sep 13 00:07:05.084489 systemd-resolved[223]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Sep 13 00:07:05.087744 systemd-resolved[223]: Defaulting to hostname 'linux'.
Sep 13 00:07:05.088638 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Sep 13 00:07:05.094431 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Sep 13 00:07:05.126340 kernel: SCSI subsystem initialized
Sep 13 00:07:05.137330 kernel: Loading iSCSI transport class v2.0-870.
Sep 13 00:07:05.147301 kernel: iscsi: registered transport (tcp)
Sep 13 00:07:05.169380 kernel: iscsi: registered transport (qla4xxx)
Sep 13 00:07:05.169473 kernel: QLogic iSCSI HBA Driver
Sep 13 00:07:05.206491 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook.
Sep 13 00:07:05.211435 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook...
Sep 13 00:07:05.237317 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Sep 13 00:07:05.237399 kernel: device-mapper: uevent: version 1.0.3
Sep 13 00:07:05.238302 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@redhat.com
Sep 13 00:07:05.275316 kernel: raid6: avx2x4 gen() 32088 MB/s
Sep 13 00:07:05.293323 kernel: raid6: avx2x2 gen() 23415 MB/s
Sep 13 00:07:05.311758 kernel: raid6: avx2x1 gen() 16904 MB/s
Sep 13 00:07:05.311844 kernel: raid6: using algorithm avx2x4 gen() 32088 MB/s
Sep 13 00:07:05.330607 kernel: raid6: .... xor() 4748 MB/s, rmw enabled
Sep 13 00:07:05.330719 kernel: raid6: using avx2x2 recovery algorithm
Sep 13 00:07:05.352332 kernel: xor: automatically using best checksumming function avx
Sep 13 00:07:05.505310 kernel: Btrfs loaded, zoned=no, fsverity=no
Sep 13 00:07:05.517044 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook.
Sep 13 00:07:05.523437 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Sep 13 00:07:05.544715 systemd-udevd[406]: Using default interface naming scheme 'v255'.
Sep 13 00:07:05.550224 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Sep 13 00:07:05.559435 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook...
Sep 13 00:07:05.575072 dracut-pre-trigger[411]: rd.md=0: removing MD RAID activation
Sep 13 00:07:05.610680 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook.
Sep 13 00:07:05.617573 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Sep 13 00:07:05.662209 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Sep 13 00:07:05.670466 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook...
Sep 13 00:07:05.681703 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook.
Sep 13 00:07:05.683391 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems.
Sep 13 00:07:05.685044 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Sep 13 00:07:05.686160 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Sep 13 00:07:05.693443 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook...
Sep 13 00:07:05.704389 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook.
Sep 13 00:07:05.758301 kernel: ACPI: bus type USB registered
Sep 13 00:07:05.761288 kernel: usbcore: registered new interface driver usbfs
Sep 13 00:07:05.765473 kernel: usbcore: registered new interface driver hub
Sep 13 00:07:05.765537 kernel: scsi host0: Virtio SCSI HBA
Sep 13 00:07:05.765782 kernel: usbcore: registered new device driver usb
Sep 13 00:07:05.789303 kernel: scsi 0:0:0:0: Direct-Access QEMU QEMU HARDDISK 2.5+ PQ: 0 ANSI: 5
Sep 13 00:07:05.791324 kernel: cryptd: max_cpu_qlen set to 1000
Sep 13 00:07:05.828610 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
Sep 13 00:07:05.828729 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Sep 13 00:07:05.830100 systemd[1]: Stopping dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Sep 13 00:07:05.831955 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Sep 13 00:07:05.832145 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Sep 13 00:07:05.834810 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup...
Sep 13 00:07:05.842539 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Sep 13 00:07:05.857356 kernel: libata version 3.00 loaded.
Sep 13 00:07:05.866289 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller
Sep 13 00:07:05.866562 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 1
Sep 13 00:07:05.866663 kernel: xhci_hcd 0000:02:00.0: hcc params 0x00087001 hci version 0x100 quirks 0x0000000000000010
Sep 13 00:07:05.868667 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller
Sep 13 00:07:05.868790 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 2
Sep 13 00:07:05.868876 kernel: xhci_hcd 0000:02:00.0: Host supports USB 3.0 SuperSpeed
Sep 13 00:07:05.875370 kernel: hub 1-0:1.0: USB hub found
Sep 13 00:07:05.875652 kernel: hub 1-0:1.0: 4 ports detected
Sep 13 00:07:05.875746 kernel: usb usb2: We don't know the algorithms for LPM for this host, disabling LPM.
Sep 13 00:07:05.877365 kernel: hub 2-0:1.0: USB hub found
Sep 13 00:07:05.878289 kernel: hub 2-0:1.0: 4 ports detected
Sep 13 00:07:05.890302 kernel: ahci 0000:00:1f.2: version 3.0
Sep 13 00:07:05.890475 kernel: ACPI: \_SB_.GSIA: Enabled at IRQ 16
Sep 13 00:07:05.891516 kernel: ahci 0000:00:1f.2: AHCI 0001.0000 32 slots 6 ports 1.5 Gbps 0x3f impl SATA mode
Sep 13 00:07:05.893705 kernel: ahci 0000:00:1f.2: flags: 64bit ncq only
Sep 13 00:07:05.893835 kernel: AVX2 version of gcm_enc/dec engaged.
Sep 13 00:07:05.893846 kernel: AES CTR mode by8 optimization enabled
Sep 13 00:07:05.894285 kernel: scsi host1: ahci
Sep 13 00:07:05.897361 kernel: scsi host2: ahci
Sep 13 00:07:05.897481 kernel: scsi host3: ahci
Sep 13 00:07:05.898930 kernel: scsi host4: ahci
Sep 13 00:07:05.899103 kernel: scsi host5: ahci
Sep 13 00:07:05.899192 kernel: scsi host6: ahci
Sep 13 00:07:05.899326 kernel: ata1: SATA max UDMA/133 abar m4096@0xfea1a000 port 0xfea1a100 irq 49
Sep 13 00:07:05.899337 kernel: ata2: SATA max UDMA/133 abar m4096@0xfea1a000 port 0xfea1a180 irq 49
Sep 13 00:07:05.899345 kernel: ata3: SATA max UDMA/133 abar m4096@0xfea1a000 port 0xfea1a200 irq 49
Sep 13 00:07:05.899353 kernel: ata4: SATA max UDMA/133 abar m4096@0xfea1a000 port 0xfea1a280 irq 49
Sep 13 00:07:05.899367 kernel: ata5: SATA max UDMA/133 abar m4096@0xfea1a000 port 0xfea1a300 irq 49
Sep 13 00:07:05.899374 kernel: ata6: SATA max UDMA/133 abar m4096@0xfea1a000 port 0xfea1a380 irq 49
Sep 13 00:07:05.935713 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Sep 13 00:07:05.942438 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Sep 13 00:07:05.953908 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Sep 13 00:07:06.116321 kernel: usb 1-1: new high-speed USB device number 2 using xhci_hcd
Sep 13 00:07:06.213704 kernel: ata4: SATA link down (SStatus 0 SControl 300)
Sep 13 00:07:06.213812 kernel: ata6: SATA link down (SStatus 0 SControl 300)
Sep 13 00:07:06.213833 kernel: ata2: SATA link down (SStatus 0 SControl 300)
Sep 13 00:07:06.213850 kernel: ata1: SATA link up 1.5 Gbps (SStatus 113 SControl 300)
Sep 13 00:07:06.213867 kernel: ata5: SATA link down (SStatus 0 SControl 300)
Sep 13 00:07:06.217299 kernel: ata3: SATA link down (SStatus 0 SControl 300)
Sep 13 00:07:06.217359 kernel: ata1.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100
Sep 13 00:07:06.218525 kernel: ata1.00: applying bridge limits
Sep 13 00:07:06.220595 kernel: ata1.00: configured for UDMA/100
Sep 13 00:07:06.221444 kernel: scsi 1:0:0:0: CD-ROM QEMU QEMU DVD-ROM 2.5+ PQ: 0 ANSI: 5
Sep 13 00:07:06.250781 kernel: sd 0:0:0:0: Power-on or device reset occurred
Sep 13 00:07:06.250993 kernel: sd 0:0:0:0: [sda] 80003072 512-byte logical blocks: (41.0 GB/38.1 GiB)
Sep 13 00:07:06.259802 kernel: sd 0:0:0:0: [sda] Write Protect is off
Sep 13 00:07:06.260256 kernel: sd 0:0:0:0: [sda] Mode Sense: 63 00 00 08
Sep 13 00:07:06.260510 kernel: hid: raw HID events driver (C) Jiri Kosina
Sep 13 00:07:06.264326 kernel: sd 0:0:0:0: [sda] Write cache: enabled, read cache: enabled, doesn't support DPO or FUA
Sep 13 00:07:06.269483 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk.
Sep 13 00:07:06.269549 kernel: GPT:17805311 != 80003071
Sep 13 00:07:06.269563 kernel: GPT:Alternate GPT header not at the end of the disk.
Sep 13 00:07:06.270652 kernel: GPT:17805311 != 80003071
Sep 13 00:07:06.271636 kernel: GPT: Use GNU Parted to correct GPT errors.
Sep 13 00:07:06.272748 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
Sep 13 00:07:06.274803 kernel: sd 0:0:0:0: [sda] Attached SCSI disk
Sep 13 00:07:06.283583 kernel: usbcore: registered new interface driver usbhid
Sep 13 00:07:06.283633 kernel: usbhid: USB HID core driver
Sep 13 00:07:06.289397 kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:02.1/0000:02:00.0/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input3
Sep 13 00:07:06.289437 kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:02:00.0-1/input0
Sep 13 00:07:06.298629 kernel: sr 1:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray
Sep 13 00:07:06.298848 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
Sep 13 00:07:06.317624 kernel: sr 1:0:0:0: Attached scsi CD-ROM sr0
Sep 13 00:07:06.323309 kernel: BTRFS: device label OEM devid 1 transid 9 /dev/sda6 scanned by (udev-worker) (449)
Sep 13 00:07:06.327304 kernel: BTRFS: device fsid fa70a3b0-3d47-4508-bba0-9fa4607626aa devid 1 transid 36 /dev/sda3 scanned by (udev-worker) (452)
Sep 13 00:07:06.327922 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - QEMU_HARDDISK EFI-SYSTEM.
Sep 13 00:07:06.354406 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - QEMU_HARDDISK ROOT.
Sep 13 00:07:06.362320 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - QEMU_HARDDISK OEM.
Sep 13 00:07:06.365942 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - QEMU_HARDDISK USR-A.
Sep 13 00:07:06.366513 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - QEMU_HARDDISK USR-A.
Sep 13 00:07:06.374426 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary...
Sep 13 00:07:06.380295 disk-uuid[572]: Primary Header is updated.
Sep 13 00:07:06.380295 disk-uuid[572]: Secondary Entries is updated.
Sep 13 00:07:06.380295 disk-uuid[572]: Secondary Header is updated.
Sep 13 00:07:06.389628 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
Sep 13 00:07:06.395318 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
Sep 13 00:07:06.413310 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
Sep 13 00:07:07.402299 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
Sep 13 00:07:07.404066 disk-uuid[574]: The operation has completed successfully.
Sep 13 00:07:07.447310 systemd[1]: disk-uuid.service: Deactivated successfully.
Sep 13 00:07:07.447408 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary.
Sep 13 00:07:07.466413 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr...
Sep 13 00:07:07.468700 sh[593]: Success
Sep 13 00:07:07.480357 kernel: device-mapper: verity: sha256 using implementation "sha256-ni"
Sep 13 00:07:07.541564 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr.
Sep 13 00:07:07.542354 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr.
Sep 13 00:07:07.544619 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr...
Sep 13 00:07:07.561800 kernel: BTRFS info (device dm-0): first mount of filesystem fa70a3b0-3d47-4508-bba0-9fa4607626aa Sep 13 00:07:07.561851 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm Sep 13 00:07:07.561863 kernel: BTRFS warning (device dm-0): 'nologreplay' is deprecated, use 'rescue=nologreplay' instead Sep 13 00:07:07.564702 kernel: BTRFS info (device dm-0): disabling log replay at mount time Sep 13 00:07:07.564734 kernel: BTRFS info (device dm-0): using free space tree Sep 13 00:07:07.573288 kernel: BTRFS info (device dm-0): enabling ssd optimizations Sep 13 00:07:07.575037 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr. Sep 13 00:07:07.576198 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met. Sep 13 00:07:07.588409 systemd[1]: Starting ignition-setup.service - Ignition (setup)... Sep 13 00:07:07.591375 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... Sep 13 00:07:07.611106 kernel: BTRFS info (device sda6): first mount of filesystem 94088f30-ba7d-4694-bba6-875359d7b417 Sep 13 00:07:07.611166 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm Sep 13 00:07:07.611179 kernel: BTRFS info (device sda6): using free space tree Sep 13 00:07:07.618310 kernel: BTRFS info (device sda6): enabling ssd optimizations Sep 13 00:07:07.618354 kernel: BTRFS info (device sda6): auto enabling async discard Sep 13 00:07:07.627773 systemd[1]: mnt-oem.mount: Deactivated successfully. Sep 13 00:07:07.630387 kernel: BTRFS info (device sda6): last unmount of filesystem 94088f30-ba7d-4694-bba6-875359d7b417 Sep 13 00:07:07.635689 systemd[1]: Finished ignition-setup.service - Ignition (setup). Sep 13 00:07:07.641407 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... Sep 13 00:07:07.672331 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Sep 13 00:07:07.685654 systemd[1]: Starting systemd-networkd.service - Network Configuration... Sep 13 00:07:07.711006 ignition[735]: Ignition 2.19.0 Sep 13 00:07:07.711026 ignition[735]: Stage: fetch-offline Sep 13 00:07:07.711094 ignition[735]: no configs at "/usr/lib/ignition/base.d" Sep 13 00:07:07.711108 ignition[735]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Sep 13 00:07:07.711209 ignition[735]: parsed url from cmdline: "" Sep 13 00:07:07.711213 ignition[735]: no config URL provided Sep 13 00:07:07.711220 ignition[735]: reading system config file "/usr/lib/ignition/user.ign" Sep 13 00:07:07.714977 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). Sep 13 00:07:07.711252 ignition[735]: no config at "/usr/lib/ignition/user.ign" Sep 13 00:07:07.716001 systemd-networkd[775]: lo: Link UP Sep 13 00:07:07.711260 ignition[735]: failed to fetch config: resource requires networking Sep 13 00:07:07.716004 systemd-networkd[775]: lo: Gained carrier Sep 13 00:07:07.712941 ignition[735]: Ignition finished successfully Sep 13 00:07:07.720476 systemd-networkd[775]: Enumeration completed Sep 13 00:07:07.720869 systemd[1]: Started systemd-networkd.service - Network Configuration. Sep 13 00:07:07.721256 systemd-networkd[775]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. 
Sep 13 00:07:07.721258 systemd-networkd[775]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Sep 13 00:07:07.722115 systemd-networkd[775]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Sep 13 00:07:07.722117 systemd-networkd[775]: eth1: Configuring with /usr/lib/systemd/network/zz-default.network. Sep 13 00:07:07.722531 systemd-networkd[775]: eth0: Link UP Sep 13 00:07:07.722534 systemd-networkd[775]: eth0: Gained carrier Sep 13 00:07:07.722540 systemd-networkd[775]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Sep 13 00:07:07.725683 systemd[1]: Reached target network.target - Network. Sep 13 00:07:07.726465 systemd-networkd[775]: eth1: Link UP Sep 13 00:07:07.726468 systemd-networkd[775]: eth1: Gained carrier Sep 13 00:07:07.726474 systemd-networkd[775]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Sep 13 00:07:07.732440 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)... Sep 13 00:07:07.743481 ignition[784]: Ignition 2.19.0 Sep 13 00:07:07.743491 ignition[784]: Stage: fetch Sep 13 00:07:07.744473 ignition[784]: no configs at "/usr/lib/ignition/base.d" Sep 13 00:07:07.744489 ignition[784]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Sep 13 00:07:07.744619 ignition[784]: parsed url from cmdline: "" Sep 13 00:07:07.744624 ignition[784]: no config URL provided Sep 13 00:07:07.744631 ignition[784]: reading system config file "/usr/lib/ignition/user.ign" Sep 13 00:07:07.744645 ignition[784]: no config at "/usr/lib/ignition/user.ign" Sep 13 00:07:07.744665 ignition[784]: GET http://169.254.169.254/hetzner/v1/userdata: attempt #1 Sep 13 00:07:07.744812 ignition[784]: GET error: Get "http://169.254.169.254/hetzner/v1/userdata": dial tcp 169.254.169.254:80: connect: network is unreachable Sep 13 00:07:07.757335 systemd-networkd[775]: eth1: DHCPv4 address 10.0.0.3/32 acquired from 10.0.0.1 Sep 13 00:07:07.788359 systemd-networkd[775]: eth0: DHCPv4 address 157.180.121.11/32, gateway 172.31.1.1 acquired from 172.31.1.1 Sep 13 00:07:07.945284 ignition[784]: GET http://169.254.169.254/hetzner/v1/userdata: attempt #2 Sep 13 00:07:07.949200 ignition[784]: GET result: OK Sep 13 00:07:07.949293 ignition[784]: parsing config with SHA512: ee936842ebdaa933c802325e184340a93df7eac961128942e409c51c8323a9289861592d582d29ba24501e4cf83bafba5112eaa2dc52c8147a6e1845aeae3dda Sep 13 00:07:07.952759 unknown[784]: fetched base config from "system" Sep 13 00:07:07.953054 ignition[784]: fetch: fetch complete Sep 13 00:07:07.952769 unknown[784]: fetched base config from "system" Sep 13 00:07:07.953058 ignition[784]: fetch: fetch passed Sep 13 00:07:07.952773 unknown[784]: fetched user config from "hetzner" Sep 13 00:07:07.953094 ignition[784]: Ignition finished successfully Sep 13 00:07:07.954999 systemd[1]: Finished ignition-fetch.service - Ignition (fetch). Sep 13 00:07:07.959484 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... 
Sep 13 00:07:07.974661 ignition[791]: Ignition 2.19.0 Sep 13 00:07:07.974673 ignition[791]: Stage: kargs Sep 13 00:07:07.974849 ignition[791]: no configs at "/usr/lib/ignition/base.d" Sep 13 00:07:07.974860 ignition[791]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Sep 13 00:07:07.975711 ignition[791]: kargs: kargs passed Sep 13 00:07:07.975759 ignition[791]: Ignition finished successfully Sep 13 00:07:07.977568 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). Sep 13 00:07:07.981428 systemd[1]: Starting ignition-disks.service - Ignition (disks)... Sep 13 00:07:07.997952 ignition[797]: Ignition 2.19.0 Sep 13 00:07:07.997968 ignition[797]: Stage: disks Sep 13 00:07:08.003765 systemd[1]: Finished ignition-disks.service - Ignition (disks). Sep 13 00:07:07.998301 ignition[797]: no configs at "/usr/lib/ignition/base.d" Sep 13 00:07:08.005123 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. Sep 13 00:07:07.998317 ignition[797]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Sep 13 00:07:08.006225 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. Sep 13 00:07:08.000462 ignition[797]: disks: disks passed Sep 13 00:07:08.007468 systemd[1]: Reached target local-fs.target - Local File Systems. Sep 13 00:07:08.000528 ignition[797]: Ignition finished successfully Sep 13 00:07:08.008560 systemd[1]: Reached target sysinit.target - System Initialization. Sep 13 00:07:08.009727 systemd[1]: Reached target basic.target - Basic System. Sep 13 00:07:08.016559 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... Sep 13 00:07:08.028538 systemd-fsck[805]: ROOT: clean, 14/1628000 files, 120691/1617920 blocks Sep 13 00:07:08.030660 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. Sep 13 00:07:08.036424 systemd[1]: Mounting sysroot.mount - /sysroot... Sep 13 00:07:08.101344 kernel: EXT4-fs (sda9): mounted filesystem 3a3ecd49-b269-4fcb-bb61-e2994e1868ee r/w with ordered data mode. Quota mode: none. Sep 13 00:07:08.101998 systemd[1]: Mounted sysroot.mount - /sysroot. Sep 13 00:07:08.103103 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. Sep 13 00:07:08.108385 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Sep 13 00:07:08.111398 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... Sep 13 00:07:08.113635 systemd[1]: Starting flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent... Sep 13 00:07:08.116212 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). Sep 13 00:07:08.117370 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. Sep 13 00:07:08.120140 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. Sep 13 00:07:08.124337 kernel: BTRFS: device label OEM devid 1 transid 10 /dev/sda6 scanned by mount (813) Sep 13 00:07:08.129740 kernel: BTRFS info (device sda6): first mount of filesystem 94088f30-ba7d-4694-bba6-875359d7b417 Sep 13 00:07:08.129769 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm Sep 13 00:07:08.129778 kernel: BTRFS info (device sda6): using free space tree Sep 13 00:07:08.130251 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... 
Sep 13 00:07:08.142365 kernel: BTRFS info (device sda6): enabling ssd optimizations Sep 13 00:07:08.142435 kernel: BTRFS info (device sda6): auto enabling async discard Sep 13 00:07:08.149181 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Sep 13 00:07:08.182305 initrd-setup-root[841]: cut: /sysroot/etc/passwd: No such file or directory Sep 13 00:07:08.184679 coreos-metadata[815]: Sep 13 00:07:08.183 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/hostname: Attempt #1 Sep 13 00:07:08.185702 coreos-metadata[815]: Sep 13 00:07:08.185 INFO Fetch successful Sep 13 00:07:08.185702 coreos-metadata[815]: Sep 13 00:07:08.185 INFO wrote hostname ci-4081-3-5-n-294a4568b6 to /sysroot/etc/hostname Sep 13 00:07:08.188757 systemd[1]: Finished flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent. Sep 13 00:07:08.190146 initrd-setup-root[849]: cut: /sysroot/etc/group: No such file or directory Sep 13 00:07:08.193114 initrd-setup-root[856]: cut: /sysroot/etc/shadow: No such file or directory Sep 13 00:07:08.196745 initrd-setup-root[863]: cut: /sysroot/etc/gshadow: No such file or directory Sep 13 00:07:08.274833 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. Sep 13 00:07:08.279419 systemd[1]: Starting ignition-mount.service - Ignition (mount)... Sep 13 00:07:08.283473 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... Sep 13 00:07:08.291303 kernel: BTRFS info (device sda6): last unmount of filesystem 94088f30-ba7d-4694-bba6-875359d7b417 Sep 13 00:07:08.318954 ignition[930]: INFO : Ignition 2.19.0 Sep 13 00:07:08.320805 ignition[930]: INFO : Stage: mount Sep 13 00:07:08.320805 ignition[930]: INFO : no configs at "/usr/lib/ignition/base.d" Sep 13 00:07:08.320805 ignition[930]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Sep 13 00:07:08.324360 ignition[930]: INFO : mount: mount passed Sep 13 00:07:08.324360 ignition[930]: INFO : Ignition finished successfully Sep 13 00:07:08.322782 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. Sep 13 00:07:08.323645 systemd[1]: Finished ignition-mount.service - Ignition (mount). Sep 13 00:07:08.332462 systemd[1]: Starting ignition-files.service - Ignition (files)... Sep 13 00:07:08.559963 systemd[1]: sysroot-oem.mount: Deactivated successfully. Sep 13 00:07:08.567506 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Sep 13 00:07:08.581460 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/sda6 scanned by mount (942) Sep 13 00:07:08.586519 kernel: BTRFS info (device sda6): first mount of filesystem 94088f30-ba7d-4694-bba6-875359d7b417 Sep 13 00:07:08.586580 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm Sep 13 00:07:08.586592 kernel: BTRFS info (device sda6): using free space tree Sep 13 00:07:08.593869 kernel: BTRFS info (device sda6): enabling ssd optimizations Sep 13 00:07:08.593933 kernel: BTRFS info (device sda6): auto enabling async discard Sep 13 00:07:08.596740 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. 
Sep 13 00:07:08.624968 ignition[959]: INFO : Ignition 2.19.0 Sep 13 00:07:08.624968 ignition[959]: INFO : Stage: files Sep 13 00:07:08.626437 ignition[959]: INFO : no configs at "/usr/lib/ignition/base.d" Sep 13 00:07:08.626437 ignition[959]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Sep 13 00:07:08.626437 ignition[959]: DEBUG : files: compiled without relabeling support, skipping Sep 13 00:07:08.629485 ignition[959]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core" Sep 13 00:07:08.629485 ignition[959]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core" Sep 13 00:07:08.631991 ignition[959]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core" Sep 13 00:07:08.631991 ignition[959]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core" Sep 13 00:07:08.631991 ignition[959]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core" Sep 13 00:07:08.630939 unknown[959]: wrote ssh authorized keys file for user: core Sep 13 00:07:08.635884 ignition[959]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.17.3-linux-amd64.tar.gz" Sep 13 00:07:08.635884 ignition[959]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.17.3-linux-amd64.tar.gz: attempt #1 Sep 13 00:07:08.805720 ignition[959]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK Sep 13 00:07:08.952499 systemd-networkd[775]: eth1: Gained IPv6LL Sep 13 00:07:09.117014 ignition[959]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.3-linux-amd64.tar.gz" Sep 13 00:07:09.118365 ignition[959]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh" Sep 13 00:07:09.118365 ignition[959]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh" Sep 13 00:07:09.118365 ignition[959]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml" Sep 13 00:07:09.118365 ignition[959]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml" Sep 13 00:07:09.118365 ignition[959]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml" Sep 13 00:07:09.118365 ignition[959]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml" Sep 13 00:07:09.118365 ignition[959]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml" Sep 13 00:07:09.118365 ignition[959]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml" Sep 13 00:07:09.118365 ignition[959]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf" Sep 13 00:07:09.118365 ignition[959]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf" Sep 13 00:07:09.118365 ignition[959]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.33.0-x86-64.raw" Sep 13 00:07:09.118365 ignition[959]: INFO : files: 
createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.33.0-x86-64.raw" Sep 13 00:07:09.118365 ignition[959]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.33.0-x86-64.raw" Sep 13 00:07:09.118365 ignition[959]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.33.0-x86-64.raw: attempt #1 Sep 13 00:07:09.464503 systemd-networkd[775]: eth0: Gained IPv6LL Sep 13 00:07:09.535640 ignition[959]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK Sep 13 00:07:09.676865 ignition[959]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.33.0-x86-64.raw" Sep 13 00:07:09.676865 ignition[959]: INFO : files: op(b): [started] processing unit "prepare-helm.service" Sep 13 00:07:09.680651 ignition[959]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Sep 13 00:07:09.680651 ignition[959]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Sep 13 00:07:09.680651 ignition[959]: INFO : files: op(b): [finished] processing unit "prepare-helm.service" Sep 13 00:07:09.680651 ignition[959]: INFO : files: op(d): [started] processing unit "coreos-metadata.service" Sep 13 00:07:09.680651 ignition[959]: INFO : files: op(d): op(e): [started] writing systemd drop-in "00-custom-metadata.conf" at "/sysroot/etc/systemd/system/coreos-metadata.service.d/00-custom-metadata.conf" Sep 13 00:07:09.680651 ignition[959]: INFO : files: op(d): op(e): [finished] writing systemd drop-in "00-custom-metadata.conf" at "/sysroot/etc/systemd/system/coreos-metadata.service.d/00-custom-metadata.conf" Sep 13 00:07:09.680651 ignition[959]: INFO : files: op(d): [finished] processing unit "coreos-metadata.service" Sep 13 00:07:09.680651 ignition[959]: INFO : files: op(f): [started] setting preset to enabled for "prepare-helm.service" Sep 13 00:07:09.680651 ignition[959]: INFO : files: op(f): [finished] setting preset to enabled for "prepare-helm.service" Sep 13 00:07:09.680651 ignition[959]: INFO : files: createResultFile: createFiles: op(10): [started] writing file "/sysroot/etc/.ignition-result.json" Sep 13 00:07:09.680651 ignition[959]: INFO : files: createResultFile: createFiles: op(10): [finished] writing file "/sysroot/etc/.ignition-result.json" Sep 13 00:07:09.680651 ignition[959]: INFO : files: files passed Sep 13 00:07:09.680651 ignition[959]: INFO : Ignition finished successfully Sep 13 00:07:09.680479 systemd[1]: Finished ignition-files.service - Ignition (files). Sep 13 00:07:09.688406 systemd[1]: Starting ignition-quench.service - Ignition (record completion)... Sep 13 00:07:09.692400 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion... Sep 13 00:07:09.693573 systemd[1]: ignition-quench.service: Deactivated successfully. Sep 13 00:07:09.693671 systemd[1]: Finished ignition-quench.service - Ignition (record completion). 
Sep 13 00:07:09.704990 initrd-setup-root-after-ignition[988]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Sep 13 00:07:09.704990 initrd-setup-root-after-ignition[988]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory Sep 13 00:07:09.707551 initrd-setup-root-after-ignition[992]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Sep 13 00:07:09.709209 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion. Sep 13 00:07:09.710190 systemd[1]: Reached target ignition-complete.target - Ignition Complete. Sep 13 00:07:09.715459 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root... Sep 13 00:07:09.733730 systemd[1]: initrd-parse-etc.service: Deactivated successfully. Sep 13 00:07:09.733842 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root. Sep 13 00:07:09.734953 systemd[1]: Reached target initrd-fs.target - Initrd File Systems. Sep 13 00:07:09.736355 systemd[1]: Reached target initrd.target - Initrd Default Target. Sep 13 00:07:09.737698 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met. Sep 13 00:07:09.743437 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook... Sep 13 00:07:09.754873 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Sep 13 00:07:09.762488 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons... Sep 13 00:07:09.772217 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups. Sep 13 00:07:09.772818 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes. Sep 13 00:07:09.774188 systemd[1]: Stopped target timers.target - Timer Units. Sep 13 00:07:09.775259 systemd[1]: dracut-pre-pivot.service: Deactivated successfully. Sep 13 00:07:09.775425 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Sep 13 00:07:09.776662 systemd[1]: Stopped target initrd.target - Initrd Default Target. Sep 13 00:07:09.777464 systemd[1]: Stopped target basic.target - Basic System. Sep 13 00:07:09.778653 systemd[1]: Stopped target ignition-complete.target - Ignition Complete. Sep 13 00:07:09.779823 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup. Sep 13 00:07:09.780842 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device. Sep 13 00:07:09.781911 systemd[1]: Stopped target remote-fs.target - Remote File Systems. Sep 13 00:07:09.782965 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems. Sep 13 00:07:09.784090 systemd[1]: Stopped target sysinit.target - System Initialization. Sep 13 00:07:09.785108 systemd[1]: Stopped target local-fs.target - Local File Systems. Sep 13 00:07:09.786209 systemd[1]: Stopped target swap.target - Swaps. Sep 13 00:07:09.787157 systemd[1]: dracut-pre-mount.service: Deactivated successfully. Sep 13 00:07:09.787286 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook. Sep 13 00:07:09.788502 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes. Sep 13 00:07:09.789168 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Sep 13 00:07:09.790104 systemd[1]: clevis-luks-askpass.path: Deactivated successfully. Sep 13 00:07:09.792309 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. 
Sep 13 00:07:09.792861 systemd[1]: dracut-initqueue.service: Deactivated successfully. Sep 13 00:07:09.792946 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook. Sep 13 00:07:09.794665 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully. Sep 13 00:07:09.794850 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion. Sep 13 00:07:09.796350 systemd[1]: ignition-files.service: Deactivated successfully. Sep 13 00:07:09.796450 systemd[1]: Stopped ignition-files.service - Ignition (files). Sep 13 00:07:09.797565 systemd[1]: flatcar-metadata-hostname.service: Deactivated successfully. Sep 13 00:07:09.797666 systemd[1]: Stopped flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent. Sep 13 00:07:09.804717 systemd[1]: Stopping ignition-mount.service - Ignition (mount)... Sep 13 00:07:09.805171 systemd[1]: kmod-static-nodes.service: Deactivated successfully. Sep 13 00:07:09.805332 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes. Sep 13 00:07:09.808431 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot... Sep 13 00:07:09.808862 systemd[1]: systemd-udev-trigger.service: Deactivated successfully. Sep 13 00:07:09.808984 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices. Sep 13 00:07:09.809623 systemd[1]: dracut-pre-trigger.service: Deactivated successfully. Sep 13 00:07:09.809735 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook. Sep 13 00:07:09.815249 systemd[1]: initrd-cleanup.service: Deactivated successfully. Sep 13 00:07:09.815345 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons. Sep 13 00:07:09.821655 ignition[1012]: INFO : Ignition 2.19.0 Sep 13 00:07:09.823496 ignition[1012]: INFO : Stage: umount Sep 13 00:07:09.823496 ignition[1012]: INFO : no configs at "/usr/lib/ignition/base.d" Sep 13 00:07:09.823496 ignition[1012]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Sep 13 00:07:09.823496 ignition[1012]: INFO : umount: umount passed Sep 13 00:07:09.823496 ignition[1012]: INFO : Ignition finished successfully Sep 13 00:07:09.826491 systemd[1]: ignition-mount.service: Deactivated successfully. Sep 13 00:07:09.826564 systemd[1]: Stopped ignition-mount.service - Ignition (mount). Sep 13 00:07:09.831659 systemd[1]: sysroot-boot.mount: Deactivated successfully. Sep 13 00:07:09.832293 systemd[1]: ignition-disks.service: Deactivated successfully. Sep 13 00:07:09.832421 systemd[1]: Stopped ignition-disks.service - Ignition (disks). Sep 13 00:07:09.833905 systemd[1]: ignition-kargs.service: Deactivated successfully. Sep 13 00:07:09.833944 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs). Sep 13 00:07:09.834476 systemd[1]: ignition-fetch.service: Deactivated successfully. Sep 13 00:07:09.834524 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch). Sep 13 00:07:09.835021 systemd[1]: Stopped target network.target - Network. Sep 13 00:07:09.835937 systemd[1]: ignition-fetch-offline.service: Deactivated successfully. Sep 13 00:07:09.835979 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline). Sep 13 00:07:09.836973 systemd[1]: Stopped target paths.target - Path Units. Sep 13 00:07:09.837889 systemd[1]: systemd-ask-password-console.path: Deactivated successfully. Sep 13 00:07:09.841311 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. 
Sep 13 00:07:09.842026 systemd[1]: Stopped target slices.target - Slice Units. Sep 13 00:07:09.842945 systemd[1]: Stopped target sockets.target - Socket Units. Sep 13 00:07:09.844065 systemd[1]: iscsid.socket: Deactivated successfully. Sep 13 00:07:09.844097 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket. Sep 13 00:07:09.845177 systemd[1]: iscsiuio.socket: Deactivated successfully. Sep 13 00:07:09.845206 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket. Sep 13 00:07:09.846057 systemd[1]: ignition-setup.service: Deactivated successfully. Sep 13 00:07:09.846113 systemd[1]: Stopped ignition-setup.service - Ignition (setup). Sep 13 00:07:09.846937 systemd[1]: ignition-setup-pre.service: Deactivated successfully. Sep 13 00:07:09.846971 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup. Sep 13 00:07:09.847989 systemd[1]: Stopping systemd-networkd.service - Network Configuration... Sep 13 00:07:09.849032 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution... Sep 13 00:07:09.850462 systemd[1]: sysroot-boot.service: Deactivated successfully. Sep 13 00:07:09.850552 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot. Sep 13 00:07:09.851531 systemd-networkd[775]: eth0: DHCPv6 lease lost Sep 13 00:07:09.851598 systemd[1]: initrd-setup-root.service: Deactivated successfully. Sep 13 00:07:09.851668 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup. Sep 13 00:07:09.855327 systemd-networkd[775]: eth1: DHCPv6 lease lost Sep 13 00:07:09.856090 systemd[1]: systemd-resolved.service: Deactivated successfully. Sep 13 00:07:09.856172 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution. Sep 13 00:07:09.859365 systemd[1]: systemd-networkd.service: Deactivated successfully. Sep 13 00:07:09.859498 systemd[1]: Stopped systemd-networkd.service - Network Configuration. Sep 13 00:07:09.861923 systemd[1]: systemd-networkd.socket: Deactivated successfully. Sep 13 00:07:09.861975 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket. Sep 13 00:07:09.868439 systemd[1]: Stopping network-cleanup.service - Network Cleanup... Sep 13 00:07:09.869034 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully. Sep 13 00:07:09.869105 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Sep 13 00:07:09.871653 systemd[1]: systemd-sysctl.service: Deactivated successfully. Sep 13 00:07:09.871704 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables. Sep 13 00:07:09.872370 systemd[1]: systemd-modules-load.service: Deactivated successfully. Sep 13 00:07:09.872444 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules. Sep 13 00:07:09.873116 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully. Sep 13 00:07:09.873164 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories. Sep 13 00:07:09.874437 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files... Sep 13 00:07:09.886894 systemd[1]: network-cleanup.service: Deactivated successfully. Sep 13 00:07:09.887027 systemd[1]: Stopped network-cleanup.service - Network Cleanup. Sep 13 00:07:09.893822 systemd[1]: systemd-udevd.service: Deactivated successfully. Sep 13 00:07:09.894003 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files. Sep 13 00:07:09.895388 systemd[1]: systemd-udevd-control.socket: Deactivated successfully. 
Sep 13 00:07:09.895437 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket. Sep 13 00:07:09.896344 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully. Sep 13 00:07:09.896384 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket. Sep 13 00:07:09.897505 systemd[1]: dracut-pre-udev.service: Deactivated successfully. Sep 13 00:07:09.897559 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook. Sep 13 00:07:09.899059 systemd[1]: dracut-cmdline.service: Deactivated successfully. Sep 13 00:07:09.899108 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook. Sep 13 00:07:09.900179 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Sep 13 00:07:09.900248 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Sep 13 00:07:09.910455 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database... Sep 13 00:07:09.911640 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully. Sep 13 00:07:09.911706 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Sep 13 00:07:09.912420 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Sep 13 00:07:09.912472 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Sep 13 00:07:09.916468 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully. Sep 13 00:07:09.916561 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database. Sep 13 00:07:09.917656 systemd[1]: Reached target initrd-switch-root.target - Switch Root. Sep 13 00:07:09.928512 systemd[1]: Starting initrd-switch-root.service - Switch Root... Sep 13 00:07:09.934947 systemd[1]: Switching root. Sep 13 00:07:09.963581 systemd-journald[188]: Journal stopped Sep 13 00:07:10.743501 systemd-journald[188]: Received SIGTERM from PID 1 (systemd). Sep 13 00:07:10.743560 kernel: SELinux: policy capability network_peer_controls=1 Sep 13 00:07:10.743572 kernel: SELinux: policy capability open_perms=1 Sep 13 00:07:10.743580 kernel: SELinux: policy capability extended_socket_class=1 Sep 13 00:07:10.743587 kernel: SELinux: policy capability always_check_network=0 Sep 13 00:07:10.743597 kernel: SELinux: policy capability cgroup_seclabel=1 Sep 13 00:07:10.743605 kernel: SELinux: policy capability nnp_nosuid_transition=1 Sep 13 00:07:10.743613 kernel: SELinux: policy capability genfs_seclabel_symlinks=0 Sep 13 00:07:10.743624 kernel: SELinux: policy capability ioctl_skip_cloexec=0 Sep 13 00:07:10.743636 kernel: audit: type=1403 audit(1757722030.112:2): auid=4294967295 ses=4294967295 lsm=selinux res=1 Sep 13 00:07:10.743648 systemd[1]: Successfully loaded SELinux policy in 36.420ms. Sep 13 00:07:10.743665 systemd[1]: Relabeled /dev, /dev/shm, /run, /sys/fs/cgroup in 10.730ms. Sep 13 00:07:10.743674 systemd[1]: systemd 255 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT default-hierarchy=unified) Sep 13 00:07:10.743683 systemd[1]: Detected virtualization kvm. Sep 13 00:07:10.743693 systemd[1]: Detected architecture x86-64. Sep 13 00:07:10.743701 systemd[1]: Detected first boot. Sep 13 00:07:10.743709 systemd[1]: Hostname set to <ci-4081-3-5-n-294a4568b6>. Sep 13 00:07:10.743718 systemd[1]: Initializing machine ID from VM UUID.
Sep 13 00:07:10.743726 zram_generator::config[1054]: No configuration found. Sep 13 00:07:10.743737 systemd[1]: Populated /etc with preset unit settings. Sep 13 00:07:10.743745 systemd[1]: initrd-switch-root.service: Deactivated successfully. Sep 13 00:07:10.743754 systemd[1]: Stopped initrd-switch-root.service - Switch Root. Sep 13 00:07:10.743762 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1. Sep 13 00:07:10.743772 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config. Sep 13 00:07:10.743780 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run. Sep 13 00:07:10.743789 systemd[1]: Created slice system-getty.slice - Slice /system/getty. Sep 13 00:07:10.743797 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe. Sep 13 00:07:10.743805 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty. Sep 13 00:07:10.743813 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit. Sep 13 00:07:10.743822 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck. Sep 13 00:07:10.743831 systemd[1]: Created slice user.slice - User and Session Slice. Sep 13 00:07:10.743839 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Sep 13 00:07:10.743848 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Sep 13 00:07:10.743856 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch. Sep 13 00:07:10.743864 systemd[1]: Set up automount boot.automount - Boot partition Automount Point. Sep 13 00:07:10.743873 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point. Sep 13 00:07:10.743881 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Sep 13 00:07:10.743889 systemd[1]: Expecting device dev-ttyS0.device - /dev/ttyS0... Sep 13 00:07:10.743898 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Sep 13 00:07:10.743907 systemd[1]: Stopped target initrd-switch-root.target - Switch Root. Sep 13 00:07:10.743916 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems. Sep 13 00:07:10.743924 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System. Sep 13 00:07:10.743932 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes. Sep 13 00:07:10.743941 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Sep 13 00:07:10.743953 systemd[1]: Reached target remote-fs.target - Remote File Systems. Sep 13 00:07:10.743963 systemd[1]: Reached target slices.target - Slice Units. Sep 13 00:07:10.743971 systemd[1]: Reached target swap.target - Swaps. Sep 13 00:07:10.743979 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes. Sep 13 00:07:10.743987 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket. Sep 13 00:07:10.743995 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Sep 13 00:07:10.744004 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Sep 13 00:07:10.744012 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Sep 13 00:07:10.744020 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket. 
Sep 13 00:07:10.744028 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System... Sep 13 00:07:10.744038 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System... Sep 13 00:07:10.744046 systemd[1]: Mounting media.mount - External Media Directory... Sep 13 00:07:10.744055 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Sep 13 00:07:10.744063 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System... Sep 13 00:07:10.744070 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System... Sep 13 00:07:10.744079 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp... Sep 13 00:07:10.744088 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw). Sep 13 00:07:10.744100 systemd[1]: Reached target machines.target - Containers. Sep 13 00:07:10.744110 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files... Sep 13 00:07:10.744119 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Sep 13 00:07:10.744127 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Sep 13 00:07:10.744135 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs... Sep 13 00:07:10.744143 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Sep 13 00:07:10.744151 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Sep 13 00:07:10.744161 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Sep 13 00:07:10.744169 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse... Sep 13 00:07:10.744178 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Sep 13 00:07:10.744190 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf). Sep 13 00:07:10.744198 systemd[1]: systemd-fsck-root.service: Deactivated successfully. Sep 13 00:07:10.744207 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device. Sep 13 00:07:10.744215 systemd[1]: systemd-fsck-usr.service: Deactivated successfully. Sep 13 00:07:10.744222 systemd[1]: Stopped systemd-fsck-usr.service. Sep 13 00:07:10.744245 systemd[1]: Starting systemd-journald.service - Journal Service... Sep 13 00:07:10.744254 kernel: fuse: init (API version 7.39) Sep 13 00:07:10.744263 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Sep 13 00:07:10.744372 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Sep 13 00:07:10.744387 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems... Sep 13 00:07:10.744397 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Sep 13 00:07:10.744407 kernel: ACPI: bus type drm_connector registered Sep 13 00:07:10.744415 systemd[1]: verity-setup.service: Deactivated successfully. Sep 13 00:07:10.744424 kernel: loop: module loaded Sep 13 00:07:10.744432 systemd[1]: Stopped verity-setup.service. Sep 13 00:07:10.744445 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). 
Sep 13 00:07:10.744454 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System. Sep 13 00:07:10.744462 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System. Sep 13 00:07:10.744470 systemd[1]: Mounted media.mount - External Media Directory. Sep 13 00:07:10.744495 systemd-journald[1130]: Collecting audit messages is disabled. Sep 13 00:07:10.744515 systemd-journald[1130]: Journal started Sep 13 00:07:10.744533 systemd-journald[1130]: Runtime Journal (/run/log/journal/1640c12c69434b00bc6ccbf0bba0e1db) is 4.8M, max 38.4M, 33.6M free. Sep 13 00:07:10.499455 systemd[1]: Queued start job for default target multi-user.target. Sep 13 00:07:10.516280 systemd[1]: Unnecessary job was removed for dev-sda6.device - /dev/sda6. Sep 13 00:07:10.516624 systemd[1]: systemd-journald.service: Deactivated successfully. Sep 13 00:07:10.748337 systemd[1]: Started systemd-journald.service - Journal Service. Sep 13 00:07:10.748160 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System. Sep 13 00:07:10.749547 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System. Sep 13 00:07:10.750223 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp. Sep 13 00:07:10.752105 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Sep 13 00:07:10.752916 systemd[1]: modprobe@configfs.service: Deactivated successfully. Sep 13 00:07:10.753030 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs. Sep 13 00:07:10.753923 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Sep 13 00:07:10.754023 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Sep 13 00:07:10.755194 systemd[1]: modprobe@drm.service: Deactivated successfully. Sep 13 00:07:10.755593 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Sep 13 00:07:10.756831 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Sep 13 00:07:10.757417 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Sep 13 00:07:10.758607 systemd[1]: modprobe@fuse.service: Deactivated successfully. Sep 13 00:07:10.758706 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse. Sep 13 00:07:10.760683 systemd[1]: modprobe@loop.service: Deactivated successfully. Sep 13 00:07:10.760780 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Sep 13 00:07:10.761474 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Sep 13 00:07:10.762108 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Sep 13 00:07:10.763747 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems. Sep 13 00:07:10.775106 systemd[1]: Reached target network-pre.target - Preparation for Network. Sep 13 00:07:10.782386 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System... Sep 13 00:07:10.785370 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System... Sep 13 00:07:10.785933 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/). Sep 13 00:07:10.786015 systemd[1]: Reached target local-fs.target - Local File Systems. Sep 13 00:07:10.787254 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management (Varlink). Sep 13 00:07:10.789927 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown... 
Sep 13 00:07:10.800527 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache... Sep 13 00:07:10.801123 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Sep 13 00:07:10.802755 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database... Sep 13 00:07:10.807090 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage... Sep 13 00:07:10.807883 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Sep 13 00:07:10.810725 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed... Sep 13 00:07:10.811217 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Sep 13 00:07:10.814399 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Sep 13 00:07:10.818193 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/... Sep 13 00:07:10.821516 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files. Sep 13 00:07:10.822119 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System. Sep 13 00:07:10.823029 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System. Sep 13 00:07:10.824914 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown. Sep 13 00:07:10.831951 systemd-journald[1130]: Time spent on flushing to /var/log/journal/1640c12c69434b00bc6ccbf0bba0e1db is 26.026ms for 1129 entries. Sep 13 00:07:10.831951 systemd-journald[1130]: System Journal (/var/log/journal/1640c12c69434b00bc6ccbf0bba0e1db) is 8.0M, max 584.8M, 576.8M free. Sep 13 00:07:10.871732 systemd-journald[1130]: Received client request to flush runtime journal. Sep 13 00:07:10.834433 systemd[1]: Starting systemd-sysusers.service - Create System Users... Sep 13 00:07:10.841187 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Sep 13 00:07:10.850311 systemd[1]: Starting systemd-udev-settle.service - Wait for udev To Complete Device Initialization... Sep 13 00:07:10.851808 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed. Sep 13 00:07:10.853940 systemd[1]: Reached target first-boot-complete.target - First Boot Complete. Sep 13 00:07:10.866410 systemd[1]: Starting systemd-machine-id-commit.service - Commit a transient machine-id on disk... Sep 13 00:07:10.873193 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage. Sep 13 00:07:10.882301 kernel: loop0: detected capacity change from 0 to 229808 Sep 13 00:07:10.895800 udevadm[1182]: systemd-udev-settle.service is deprecated. Please fix lvm2-activation.service, lvm2-activation-early.service not to pull it in. Sep 13 00:07:10.901222 systemd[1]: etc-machine\x2did.mount: Deactivated successfully. Sep 13 00:07:10.904763 systemd[1]: Finished systemd-machine-id-commit.service - Commit a transient machine-id on disk. Sep 13 00:07:10.906625 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Sep 13 00:07:10.922138 systemd[1]: Finished systemd-sysusers.service - Create System Users. Sep 13 00:07:10.930643 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher Sep 13 00:07:10.930918 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... 
Sep 13 00:07:10.952692 systemd-tmpfiles[1193]: ACLs are not supported, ignoring. Sep 13 00:07:10.952986 systemd-tmpfiles[1193]: ACLs are not supported, ignoring. Sep 13 00:07:10.957304 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Sep 13 00:07:10.961783 kernel: loop1: detected capacity change from 0 to 140768 Sep 13 00:07:11.002300 kernel: loop2: detected capacity change from 0 to 142488 Sep 13 00:07:11.043297 kernel: loop3: detected capacity change from 0 to 8 Sep 13 00:07:11.072660 kernel: loop4: detected capacity change from 0 to 229808 Sep 13 00:07:11.092464 kernel: loop5: detected capacity change from 0 to 140768 Sep 13 00:07:11.117307 kernel: loop6: detected capacity change from 0 to 142488 Sep 13 00:07:11.135318 kernel: loop7: detected capacity change from 0 to 8 Sep 13 00:07:11.136713 (sd-merge)[1199]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-hetzner'. Sep 13 00:07:11.137169 (sd-merge)[1199]: Merged extensions into '/usr'. Sep 13 00:07:11.143557 systemd[1]: Reloading requested from client PID 1173 ('systemd-sysext') (unit systemd-sysext.service)... Sep 13 00:07:11.143676 systemd[1]: Reloading... Sep 13 00:07:11.212299 zram_generator::config[1227]: No configuration found. Sep 13 00:07:11.321145 ldconfig[1168]: /sbin/ldconfig: /lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start. Sep 13 00:07:11.323444 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Sep 13 00:07:11.363240 systemd[1]: Reloading finished in 219 ms. Sep 13 00:07:11.387881 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache. Sep 13 00:07:11.389223 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/. Sep 13 00:07:11.399509 systemd[1]: Starting ensure-sysext.service... Sep 13 00:07:11.401649 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Sep 13 00:07:11.409803 systemd[1]: Reloading requested from client PID 1268 ('systemctl') (unit ensure-sysext.service)... Sep 13 00:07:11.409816 systemd[1]: Reloading... Sep 13 00:07:11.421675 systemd-tmpfiles[1269]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring. Sep 13 00:07:11.421906 systemd-tmpfiles[1269]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring. Sep 13 00:07:11.422528 systemd-tmpfiles[1269]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring. Sep 13 00:07:11.422721 systemd-tmpfiles[1269]: ACLs are not supported, ignoring. Sep 13 00:07:11.422773 systemd-tmpfiles[1269]: ACLs are not supported, ignoring. Sep 13 00:07:11.425410 systemd-tmpfiles[1269]: Detected autofs mount point /boot during canonicalization of boot. Sep 13 00:07:11.425420 systemd-tmpfiles[1269]: Skipping /boot Sep 13 00:07:11.431032 systemd-tmpfiles[1269]: Detected autofs mount point /boot during canonicalization of boot. Sep 13 00:07:11.431124 systemd-tmpfiles[1269]: Skipping /boot Sep 13 00:07:11.463300 zram_generator::config[1294]: No configuration found. Sep 13 00:07:11.538778 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. 
Sep 13 00:07:11.576820 systemd[1]: Reloading finished in 166 ms. Sep 13 00:07:11.589953 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database. Sep 13 00:07:11.594595 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Sep 13 00:07:11.599437 systemd[1]: Starting audit-rules.service - Load Security Auditing Rules... Sep 13 00:07:11.601401 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs... Sep 13 00:07:11.604447 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog... Sep 13 00:07:11.608708 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Sep 13 00:07:11.611685 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Sep 13 00:07:11.621208 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP... Sep 13 00:07:11.633572 systemd[1]: Starting systemd-userdbd.service - User Database Manager... Sep 13 00:07:11.636004 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Sep 13 00:07:11.637497 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Sep 13 00:07:11.645695 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Sep 13 00:07:11.651592 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Sep 13 00:07:11.658541 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Sep 13 00:07:11.659349 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Sep 13 00:07:11.659526 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Sep 13 00:07:11.660219 systemd-udevd[1347]: Using default interface naming scheme 'v255'. Sep 13 00:07:11.660807 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Sep 13 00:07:11.661157 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Sep 13 00:07:11.670450 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Sep 13 00:07:11.670619 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Sep 13 00:07:11.673605 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Sep 13 00:07:11.675393 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Sep 13 00:07:11.675525 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Sep 13 00:07:11.676216 systemd[1]: modprobe@loop.service: Deactivated successfully. Sep 13 00:07:11.676383 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Sep 13 00:07:11.679148 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP. Sep 13 00:07:11.691427 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog. Sep 13 00:07:11.693704 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Sep 13 00:07:11.694045 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. 
Sep 13 00:07:11.697970 systemd[1]: Started systemd-userdbd.service - User Database Manager. Sep 13 00:07:11.700005 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Sep 13 00:07:11.700133 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Sep 13 00:07:11.704138 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Sep 13 00:07:11.704802 augenrules[1376]: No rules Sep 13 00:07:11.704899 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Sep 13 00:07:11.719482 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Sep 13 00:07:11.722015 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Sep 13 00:07:11.724401 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Sep 13 00:07:11.724445 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Sep 13 00:07:11.725530 systemd[1]: Starting systemd-update-done.service - Update is Completed... Sep 13 00:07:11.726057 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Sep 13 00:07:11.726314 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Sep 13 00:07:11.728663 systemd[1]: Finished ensure-sysext.service. Sep 13 00:07:11.729440 systemd[1]: Finished audit-rules.service - Load Security Auditing Rules. Sep 13 00:07:11.737664 systemd[1]: Starting systemd-networkd.service - Network Configuration... Sep 13 00:07:11.740770 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization... Sep 13 00:07:11.741848 systemd[1]: modprobe@drm.service: Deactivated successfully. Sep 13 00:07:11.743354 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Sep 13 00:07:11.767660 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs. Sep 13 00:07:11.769956 systemd[1]: modprobe@loop.service: Deactivated successfully. Sep 13 00:07:11.770067 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Sep 13 00:07:11.770973 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Sep 13 00:07:11.770995 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). Sep 13 00:07:11.790163 systemd[1]: Finished systemd-update-done.service - Update is Completed. Sep 13 00:07:11.796675 systemd[1]: Condition check resulted in dev-ttyS0.device - /dev/ttyS0 being skipped. Sep 13 00:07:11.827289 kernel: mousedev: PS/2 mouse device common for all mice Sep 13 00:07:11.860960 systemd-resolved[1345]: Positive Trust Anchors: Sep 13 00:07:11.860972 systemd-resolved[1345]: . 
IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Sep 13 00:07:11.860998 systemd-resolved[1345]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Sep 13 00:07:11.869699 systemd-resolved[1345]: Using system hostname 'ci-4081-3-5-n-294a4568b6'. Sep 13 00:07:11.870847 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Sep 13 00:07:11.871833 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Sep 13 00:07:11.872301 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input4 Sep 13 00:07:11.878914 systemd-networkd[1394]: lo: Link UP Sep 13 00:07:11.878923 systemd-networkd[1394]: lo: Gained carrier Sep 13 00:07:11.879616 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization. Sep 13 00:07:11.880556 systemd[1]: Reached target time-set.target - System Time Set. Sep 13 00:07:11.881732 systemd-networkd[1394]: Enumeration completed Sep 13 00:07:11.881796 systemd[1]: Started systemd-networkd.service - Network Configuration. Sep 13 00:07:11.882621 systemd[1]: Reached target network.target - Network. Sep 13 00:07:11.885728 systemd-timesyncd[1398]: No network connectivity, watching for changes. Sep 13 00:07:11.888794 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured... Sep 13 00:07:11.891198 systemd-networkd[1394]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Sep 13 00:07:11.891207 systemd-networkd[1394]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Sep 13 00:07:11.893224 systemd-networkd[1394]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Sep 13 00:07:11.893247 systemd-networkd[1394]: eth1: Configuring with /usr/lib/systemd/network/zz-default.network. Sep 13 00:07:11.894368 systemd-networkd[1394]: eth0: Link UP Sep 13 00:07:11.894377 systemd-networkd[1394]: eth0: Gained carrier Sep 13 00:07:11.894390 systemd-networkd[1394]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Sep 13 00:07:11.898489 systemd-networkd[1394]: eth1: Link UP Sep 13 00:07:11.898498 systemd-networkd[1394]: eth1: Gained carrier Sep 13 00:07:11.898509 systemd-networkd[1394]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Sep 13 00:07:11.905950 systemd-networkd[1394]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Sep 13 00:07:11.918156 kernel: ACPI: button: Power Button [PWRF] Sep 13 00:07:11.917790 systemd[1]: Condition check resulted in dev-virtio\x2dports-org.qemu.guest_agent.0.device - /dev/virtio-ports/org.qemu.guest_agent.0 being skipped. Sep 13 00:07:11.917832 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). 
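[Editor's note] The "." IN DS record logged by systemd-resolved is the DNS root trust anchor, and the long list that follows is the built-in set of negative trust anchors, zones for which DNSSEC validation is skipped. As a hedged sketch, additional anchors can be dropped into /etc/dnssec-trust-anchors.d/ per dnssec-trust-anchors.d(5); the file name and domain below are illustrative:

    # *.negative files list extra zones to exempt from DNSSEC validation;
    # *.positive files hold DS/DNSKEY records.
    mkdir -p /etc/dnssec-trust-anchors.d
    printf 'example.internal\n' >/etc/dnssec-trust-anchors.d/site.negative
    systemctl restart systemd-resolved
    resolvectl status   # reports per-link DNSSEC state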
Sep 13 00:07:11.917904 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Sep 13 00:07:11.923834 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Sep 13 00:07:11.926343 systemd-networkd[1394]: eth1: DHCPv4 address 10.0.0.3/32 acquired from 10.0.0.1 Sep 13 00:07:11.926478 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Sep 13 00:07:11.927688 systemd-timesyncd[1398]: Network configuration changed, trying to establish connection. Sep 13 00:07:11.930381 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Sep 13 00:07:11.930876 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Sep 13 00:07:11.930903 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). Sep 13 00:07:11.930913 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Sep 13 00:07:11.936566 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Sep 13 00:07:11.936709 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Sep 13 00:07:11.937585 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 36 scanned by (udev-worker) (1403) Sep 13 00:07:11.948886 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Sep 13 00:07:11.949006 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Sep 13 00:07:11.951098 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Sep 13 00:07:11.953355 systemd[1]: modprobe@loop.service: Deactivated successfully. Sep 13 00:07:11.953511 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Sep 13 00:07:11.955856 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Sep 13 00:07:11.957398 systemd-networkd[1394]: eth0: DHCPv4 address 157.180.121.11/32, gateway 172.31.1.1 acquired from 172.31.1.1 Sep 13 00:07:11.957626 systemd-timesyncd[1398]: Network configuration changed, trying to establish connection. Sep 13 00:07:11.958358 systemd-timesyncd[1398]: Network configuration changed, trying to establish connection. 
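[Editor's note] Both NICs are matched by the catch-all /usr/lib/systemd/network/zz-default.network and pick up DHCPv4 leases (10.0.0.3/32 on eth1, 157.180.121.11/32 on eth0, as logged). A sketch of what such a catch-all file typically contains; the shipped Flatcar file may differ in detail:

    # A file in /etc/systemd/network/ takes precedence over /usr/lib/...
    cat <<'EOF' >/etc/systemd/network/zz-default.network
    [Match]
    Name=*

    [Network]
    DHCP=yes
    EOF
    networkctl reload        # re-evaluate .network files without a restart
    networkctl status eth0   # shows the DHCPv4 address and gateway seen above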
Sep 13 00:07:11.967284 kernel: [drm] pci: virtio-vga detected at 0000:00:01.0 Sep 13 00:07:11.972281 kernel: virtio-pci 0000:00:01.0: vgaarb: deactivate vga console Sep 13 00:07:11.975499 kernel: Console: switching to colour dummy device 80x25 Sep 13 00:07:11.976999 kernel: [drm] features: -virgl +edid -resource_blob -host_visible Sep 13 00:07:11.977023 kernel: [drm] features: -context_init Sep 13 00:07:11.979284 kernel: [drm] number of scanouts: 1 Sep 13 00:07:11.979317 kernel: [drm] number of cap sets: 0 Sep 13 00:07:11.981291 kernel: input: ImExPS/2 Generic Explorer Mouse as /devices/platform/i8042/serio1/input/input5 Sep 13 00:07:11.987874 kernel: i801_smbus 0000:00:1f.3: SMBus using PCI interrupt Sep 13 00:07:11.991103 kernel: i2c i2c-0: 1/1 memory slots populated (from DMI) Sep 13 00:07:11.991526 kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD Sep 13 00:07:11.991684 kernel: EDAC MC: Ver: 3.0.0 Sep 13 00:07:12.008527 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - QEMU_HARDDISK OEM. Sep 13 00:07:12.019488 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... Sep 13 00:07:12.039962 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Sep 13 00:07:12.043308 kernel: [drm] Initialized virtio_gpu 0.1.0 0 for 0000:00:01.0 on minor 0 Sep 13 00:07:12.043720 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. Sep 13 00:07:12.055556 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Sep 13 00:07:12.055918 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Sep 13 00:07:12.062457 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Sep 13 00:07:12.074108 kernel: fbcon: virtio_gpudrmfb (fb0) is primary device Sep 13 00:07:12.074185 kernel: Console: switching to colour frame buffer device 160x50 Sep 13 00:07:12.081309 kernel: virtio-pci 0000:00:01.0: [drm] fb0: virtio_gpudrmfb frame buffer device Sep 13 00:07:12.087896 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Sep 13 00:07:12.088336 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Sep 13 00:07:12.096381 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Sep 13 00:07:12.143564 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Sep 13 00:07:12.204441 systemd[1]: Finished systemd-udev-settle.service - Wait for udev To Complete Device Initialization. Sep 13 00:07:12.209519 systemd[1]: Starting lvm2-activation-early.service - Activation of LVM2 logical volumes... Sep 13 00:07:12.219469 lvm[1458]: WARNING: Failed to connect to lvmetad. Falling back to device scanning. Sep 13 00:07:12.246344 systemd[1]: Finished lvm2-activation-early.service - Activation of LVM2 logical volumes. Sep 13 00:07:12.247050 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Sep 13 00:07:12.247181 systemd[1]: Reached target sysinit.target - System Initialization. Sep 13 00:07:12.247384 systemd[1]: Started motdgen.path - Watch for update engine configuration changes. Sep 13 00:07:12.247525 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data. Sep 13 00:07:12.247772 systemd[1]: Started logrotate.timer - Daily rotation of log files. 
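[Editor's note] Unit names like systemd-fsck@dev-disk-by\x2dlabel-OEM.service above embed an escaped device path; the \x2d sequences are systemd's path escaping, which systemd-escape reproduces:

    systemd-escape --path /dev/disk/by-label/OEM
    # -> dev-disk-by\x2dlabel-OEM
    systemctl status "systemd-fsck@$(systemd-escape --path /dev/disk/by-label/OEM).service"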
Sep 13 00:07:12.247898 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information.. Sep 13 00:07:12.247966 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories. Sep 13 00:07:12.248029 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). Sep 13 00:07:12.248050 systemd[1]: Reached target paths.target - Path Units. Sep 13 00:07:12.248094 systemd[1]: Reached target timers.target - Timer Units. Sep 13 00:07:12.250625 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket. Sep 13 00:07:12.251877 systemd[1]: Starting docker.socket - Docker Socket for the API... Sep 13 00:07:12.256918 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket. Sep 13 00:07:12.258373 systemd[1]: Starting lvm2-activation.service - Activation of LVM2 logical volumes... Sep 13 00:07:12.259875 systemd[1]: Listening on docker.socket - Docker Socket for the API. Sep 13 00:07:12.260021 systemd[1]: Reached target sockets.target - Socket Units. Sep 13 00:07:12.260106 systemd[1]: Reached target basic.target - Basic System. Sep 13 00:07:12.260212 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met. Sep 13 00:07:12.260259 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met. Sep 13 00:07:12.262064 systemd[1]: Starting containerd.service - containerd container runtime... Sep 13 00:07:12.268445 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent... Sep 13 00:07:12.272960 lvm[1462]: WARNING: Failed to connect to lvmetad. Falling back to device scanning. Sep 13 00:07:12.275474 systemd[1]: Starting dbus.service - D-Bus System Message Bus... Sep 13 00:07:12.279389 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... Sep 13 00:07:12.292524 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... Sep 13 00:07:12.293401 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). Sep 13 00:07:12.296576 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... Sep 13 00:07:12.307462 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin... Sep 13 00:07:12.321322 coreos-metadata[1464]: Sep 13 00:07:12.314 INFO Fetching http://169.254.169.254/hetzner/v1/metadata: Attempt #1 Sep 13 00:07:12.314462 systemd[1]: Started qemu-guest-agent.service - QEMU Guest Agent. Sep 13 00:07:12.324440 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... 
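[Editor's note] dbus.socket, docker.socket and sshd.socket are all listening before their services run: socket activation, where systemd holds the socket and starts the service on the first connection. A minimal hypothetical pair (demo.socket and demo@.service are made-up names, not Flatcar units):

    cat <<'EOF' >/etc/systemd/system/demo.socket
    [Socket]
    ListenStream=/run/demo.sock
    Accept=yes
    EOF
    cat <<'EOF' >/etc/systemd/system/demo@.service
    [Service]
    # Per-connection echo server; the accepted socket becomes stdin/stdout
    ExecStart=/usr/bin/cat
    StandardInput=socket
    EOF
    systemctl enable --now demo.socket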
Sep 13 00:07:12.328531 extend-filesystems[1469]: Found loop4 Sep 13 00:07:12.328531 extend-filesystems[1469]: Found loop5 Sep 13 00:07:12.328531 extend-filesystems[1469]: Found loop6 Sep 13 00:07:12.328531 extend-filesystems[1469]: Found loop7 Sep 13 00:07:12.328531 extend-filesystems[1469]: Found sda Sep 13 00:07:12.328531 extend-filesystems[1469]: Found sda1 Sep 13 00:07:12.328531 extend-filesystems[1469]: Found sda2 Sep 13 00:07:12.328531 extend-filesystems[1469]: Found sda3 Sep 13 00:07:12.328531 extend-filesystems[1469]: Found usr Sep 13 00:07:12.328531 extend-filesystems[1469]: Found sda4 Sep 13 00:07:12.328531 extend-filesystems[1469]: Found sda6 Sep 13 00:07:12.328531 extend-filesystems[1469]: Found sda7 Sep 13 00:07:12.328531 extend-filesystems[1469]: Found sda9 Sep 13 00:07:12.328531 extend-filesystems[1469]: Checking size of /dev/sda9 Sep 13 00:07:12.403769 coreos-metadata[1464]: Sep 13 00:07:12.338 INFO Fetch successful Sep 13 00:07:12.403769 coreos-metadata[1464]: Sep 13 00:07:12.339 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/private-networks: Attempt #1 Sep 13 00:07:12.403769 coreos-metadata[1464]: Sep 13 00:07:12.341 INFO Fetch successful Sep 13 00:07:12.403886 jq[1468]: false Sep 13 00:07:12.332413 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... Sep 13 00:07:12.337877 systemd[1]: Starting systemd-logind.service - User Login Management... Sep 13 00:07:12.352166 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0). Sep 13 00:07:12.352714 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. Sep 13 00:07:12.356205 systemd[1]: Starting update-engine.service - Update Engine... Sep 13 00:07:12.416300 extend-filesystems[1469]: Resized partition /dev/sda9 Sep 13 00:07:12.368291 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... Sep 13 00:07:12.417789 dbus-daemon[1465]: [system] SELinux support is enabled Sep 13 00:07:12.422688 extend-filesystems[1504]: resize2fs 1.47.1 (20-May-2024) Sep 13 00:07:12.374638 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. Sep 13 00:07:12.423206 jq[1484]: true Sep 13 00:07:12.374788 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. Sep 13 00:07:12.429530 tar[1488]: linux-amd64/LICENSE Sep 13 00:07:12.429530 tar[1488]: linux-amd64/helm Sep 13 00:07:12.389162 systemd[1]: Finished lvm2-activation.service - Activation of LVM2 logical volumes. Sep 13 00:07:12.407325 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. Sep 13 00:07:12.408455 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. Sep 13 00:07:12.418673 systemd[1]: Started dbus.service - D-Bus System Message Bus. Sep 13 00:07:12.425083 systemd[1]: motdgen.service: Deactivated successfully. Sep 13 00:07:12.425224 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. Sep 13 00:07:12.429084 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). 
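[Editor's note] coreos-metadata is reading the Hetzner link-local metadata service; the endpoints in the log can be queried by hand for debugging (the sshkeys unit later fetches .../public-keys the same way):

    # Endpoints exactly as they appear in the log
    curl -s http://169.254.169.254/hetzner/v1/metadata
    curl -s http://169.254.169.254/hetzner/v1/metadata/private-networks
    curl -s http://169.254.169.254/hetzner/v1/metadata/public-keys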
Sep 13 00:07:12.440349 kernel: EXT4-fs (sda9): resizing filesystem from 1617920 to 9393147 blocks Sep 13 00:07:12.429108 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. Sep 13 00:07:12.429518 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). Sep 13 00:07:12.429532 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. Sep 13 00:07:12.448854 (ntainerd)[1501]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR Sep 13 00:07:12.453539 update_engine[1481]: I20250913 00:07:12.453006 1481 main.cc:92] Flatcar Update Engine starting Sep 13 00:07:12.456648 systemd[1]: Started update-engine.service - Update Engine. Sep 13 00:07:12.463405 update_engine[1481]: I20250913 00:07:12.463357 1481 update_check_scheduler.cc:74] Next update check in 2m52s Sep 13 00:07:12.465390 systemd[1]: Started locksmithd.service - Cluster reboot manager. Sep 13 00:07:12.469434 jq[1500]: true Sep 13 00:07:12.468214 systemd-logind[1477]: New seat seat0. Sep 13 00:07:12.469818 systemd-logind[1477]: Watching system buttons on /dev/input/event2 (Power Button) Sep 13 00:07:12.469836 systemd-logind[1477]: Watching system buttons on /dev/input/event0 (AT Translated Set 2 keyboard) Sep 13 00:07:12.470060 systemd[1]: Started systemd-logind.service - User Login Management. Sep 13 00:07:12.512320 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent. Sep 13 00:07:12.517125 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. Sep 13 00:07:12.573805 bash[1534]: Updated "/home/core/.ssh/authorized_keys" Sep 13 00:07:12.579484 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 36 scanned by (udev-worker) (1385) Sep 13 00:07:12.574508 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. Sep 13 00:07:12.585534 systemd[1]: Starting sshkeys.service... Sep 13 00:07:12.627195 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys. Sep 13 00:07:12.639902 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)... Sep 13 00:07:12.657292 kernel: EXT4-fs (sda9): resized filesystem to 9393147 Sep 13 00:07:12.685643 extend-filesystems[1504]: Filesystem at /dev/sda9 is mounted on /; on-line resizing required Sep 13 00:07:12.685643 extend-filesystems[1504]: old_desc_blocks = 1, new_desc_blocks = 5 Sep 13 00:07:12.685643 extend-filesystems[1504]: The filesystem on /dev/sda9 is now 9393147 (4k) blocks long. Sep 13 00:07:12.690445 extend-filesystems[1469]: Resized filesystem in /dev/sda9 Sep 13 00:07:12.690445 extend-filesystems[1469]: Found sr0 Sep 13 00:07:12.692209 systemd[1]: extend-filesystems.service: Deactivated successfully. Sep 13 00:07:12.692395 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. 
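[Editor's note] extend-filesystems grew the root filesystem online from 1617920 to 9393147 4k blocks after the sda9 partition was enlarged. A hedged sketch of the equivalent manual steps (the Flatcar unit automates this; growpart is from cloud-utils):

    growpart /dev/sda 9   # extend partition 9 to the end of the disk
    resize2fs /dev/sda9   # ext4 online grow, matching the kernel lines above
    df -h /               # confirm the new capacity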
Sep 13 00:07:12.707648 coreos-metadata[1538]: Sep 13 00:07:12.707 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/public-keys: Attempt #1 Sep 13 00:07:12.711177 coreos-metadata[1538]: Sep 13 00:07:12.708 INFO Fetch successful Sep 13 00:07:12.718804 unknown[1538]: wrote ssh authorized keys file for user: core Sep 13 00:07:12.750526 update-ssh-keys[1549]: Updated "/home/core/.ssh/authorized_keys" Sep 13 00:07:12.752342 systemd[1]: Finished coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys). Sep 13 00:07:12.757982 systemd[1]: Finished sshkeys.service. Sep 13 00:07:12.784877 locksmithd[1514]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Sep 13 00:07:12.801823 containerd[1501]: time="2025-09-13T00:07:12.801757871Z" level=info msg="starting containerd" revision=174e0d1785eeda18dc2beba45e1d5a188771636b version=v1.7.21 Sep 13 00:07:12.856961 containerd[1501]: time="2025-09-13T00:07:12.855216604Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.aufs\"..." type=io.containerd.snapshotter.v1 Sep 13 00:07:12.862407 containerd[1501]: time="2025-09-13T00:07:12.862367415Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.aufs\"..." error="aufs is not supported (modprobe aufs failed: exit status 1 \"modprobe: FATAL: Module aufs not found in directory /lib/modules/6.6.106-flatcar\\n\"): skip plugin" type=io.containerd.snapshotter.v1 Sep 13 00:07:12.862407 containerd[1501]: time="2025-09-13T00:07:12.862403773Z" level=info msg="loading plugin \"io.containerd.event.v1.exchange\"..." type=io.containerd.event.v1 Sep 13 00:07:12.862504 containerd[1501]: time="2025-09-13T00:07:12.862418431Z" level=info msg="loading plugin \"io.containerd.internal.v1.opt\"..." type=io.containerd.internal.v1 Sep 13 00:07:12.862569 containerd[1501]: time="2025-09-13T00:07:12.862549085Z" level=info msg="loading plugin \"io.containerd.warning.v1.deprecations\"..." type=io.containerd.warning.v1 Sep 13 00:07:12.862591 containerd[1501]: time="2025-09-13T00:07:12.862577609Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.blockfile\"..." type=io.containerd.snapshotter.v1 Sep 13 00:07:12.862656 containerd[1501]: time="2025-09-13T00:07:12.862637953Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.blockfile\"..." error="no scratch file generator: skip plugin" type=io.containerd.snapshotter.v1 Sep 13 00:07:12.862674 containerd[1501]: time="2025-09-13T00:07:12.862657018Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.btrfs\"..." type=io.containerd.snapshotter.v1 Sep 13 00:07:12.862815 containerd[1501]: time="2025-09-13T00:07:12.862796159Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.btrfs\"..." error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" type=io.containerd.snapshotter.v1 Sep 13 00:07:12.862815 containerd[1501]: time="2025-09-13T00:07:12.862813411Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.devmapper\"..." type=io.containerd.snapshotter.v1 Sep 13 00:07:12.862858 containerd[1501]: time="2025-09-13T00:07:12.862823861Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.devmapper\"..." error="devmapper not configured: skip plugin" type=io.containerd.snapshotter.v1 Sep 13 00:07:12.862858 containerd[1501]: time="2025-09-13T00:07:12.862831625Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.native\"..." 
type=io.containerd.snapshotter.v1 Sep 13 00:07:12.862915 containerd[1501]: time="2025-09-13T00:07:12.862892970Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.overlayfs\"..." type=io.containerd.snapshotter.v1 Sep 13 00:07:12.863079 containerd[1501]: time="2025-09-13T00:07:12.863055325Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.zfs\"..." type=io.containerd.snapshotter.v1 Sep 13 00:07:12.863210 containerd[1501]: time="2025-09-13T00:07:12.863183485Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.zfs\"..." error="path /var/lib/containerd/io.containerd.snapshotter.v1.zfs must be a zfs filesystem to be used with the zfs snapshotter: skip plugin" type=io.containerd.snapshotter.v1 Sep 13 00:07:12.863210 containerd[1501]: time="2025-09-13T00:07:12.863203623Z" level=info msg="loading plugin \"io.containerd.content.v1.content\"..." type=io.containerd.content.v1 Sep 13 00:07:12.865345 containerd[1501]: time="2025-09-13T00:07:12.865325292Z" level=info msg="loading plugin \"io.containerd.metadata.v1.bolt\"..." type=io.containerd.metadata.v1 Sep 13 00:07:12.865394 containerd[1501]: time="2025-09-13T00:07:12.865378712Z" level=info msg="metadata content store policy set" policy=shared Sep 13 00:07:12.869208 containerd[1501]: time="2025-09-13T00:07:12.869179110Z" level=info msg="loading plugin \"io.containerd.gc.v1.scheduler\"..." type=io.containerd.gc.v1 Sep 13 00:07:12.869297 containerd[1501]: time="2025-09-13T00:07:12.869239844Z" level=info msg="loading plugin \"io.containerd.differ.v1.walking\"..." type=io.containerd.differ.v1 Sep 13 00:07:12.869297 containerd[1501]: time="2025-09-13T00:07:12.869256886Z" level=info msg="loading plugin \"io.containerd.lease.v1.manager\"..." type=io.containerd.lease.v1 Sep 13 00:07:12.871319 containerd[1501]: time="2025-09-13T00:07:12.871301421Z" level=info msg="loading plugin \"io.containerd.streaming.v1.manager\"..." type=io.containerd.streaming.v1 Sep 13 00:07:12.871372 containerd[1501]: time="2025-09-13T00:07:12.871321468Z" level=info msg="loading plugin \"io.containerd.runtime.v1.linux\"..." type=io.containerd.runtime.v1 Sep 13 00:07:12.871443 containerd[1501]: time="2025-09-13T00:07:12.871424121Z" level=info msg="loading plugin \"io.containerd.monitor.v1.cgroups\"..." type=io.containerd.monitor.v1 Sep 13 00:07:12.871662 containerd[1501]: time="2025-09-13T00:07:12.871643893Z" level=info msg="loading plugin \"io.containerd.runtime.v2.task\"..." type=io.containerd.runtime.v2 Sep 13 00:07:12.871745 containerd[1501]: time="2025-09-13T00:07:12.871726908Z" level=info msg="loading plugin \"io.containerd.runtime.v2.shim\"..." type=io.containerd.runtime.v2 Sep 13 00:07:12.871779 containerd[1501]: time="2025-09-13T00:07:12.871746195Z" level=info msg="loading plugin \"io.containerd.sandbox.store.v1.local\"..." type=io.containerd.sandbox.store.v1 Sep 13 00:07:12.871779 containerd[1501]: time="2025-09-13T00:07:12.871756915Z" level=info msg="loading plugin \"io.containerd.sandbox.controller.v1.local\"..." type=io.containerd.sandbox.controller.v1 Sep 13 00:07:12.871779 containerd[1501]: time="2025-09-13T00:07:12.871768346Z" level=info msg="loading plugin \"io.containerd.service.v1.containers-service\"..." type=io.containerd.service.v1 Sep 13 00:07:12.871850 containerd[1501]: time="2025-09-13T00:07:12.871779157Z" level=info msg="loading plugin \"io.containerd.service.v1.content-service\"..." 
type=io.containerd.service.v1 Sep 13 00:07:12.871850 containerd[1501]: time="2025-09-13T00:07:12.871787883Z" level=info msg="loading plugin \"io.containerd.service.v1.diff-service\"..." type=io.containerd.service.v1 Sep 13 00:07:12.871850 containerd[1501]: time="2025-09-13T00:07:12.871799124Z" level=info msg="loading plugin \"io.containerd.service.v1.images-service\"..." type=io.containerd.service.v1 Sep 13 00:07:12.871850 containerd[1501]: time="2025-09-13T00:07:12.871810796Z" level=info msg="loading plugin \"io.containerd.service.v1.introspection-service\"..." type=io.containerd.service.v1 Sep 13 00:07:12.871850 containerd[1501]: time="2025-09-13T00:07:12.871820695Z" level=info msg="loading plugin \"io.containerd.service.v1.namespaces-service\"..." type=io.containerd.service.v1 Sep 13 00:07:12.871850 containerd[1501]: time="2025-09-13T00:07:12.871831384Z" level=info msg="loading plugin \"io.containerd.service.v1.snapshots-service\"..." type=io.containerd.service.v1 Sep 13 00:07:12.871850 containerd[1501]: time="2025-09-13T00:07:12.871840241Z" level=info msg="loading plugin \"io.containerd.service.v1.tasks-service\"..." type=io.containerd.service.v1 Sep 13 00:07:12.872039 containerd[1501]: time="2025-09-13T00:07:12.871857644Z" level=info msg="loading plugin \"io.containerd.grpc.v1.containers\"..." type=io.containerd.grpc.v1 Sep 13 00:07:12.872039 containerd[1501]: time="2025-09-13T00:07:12.871868585Z" level=info msg="loading plugin \"io.containerd.grpc.v1.content\"..." type=io.containerd.grpc.v1 Sep 13 00:07:12.872039 containerd[1501]: time="2025-09-13T00:07:12.871878052Z" level=info msg="loading plugin \"io.containerd.grpc.v1.diff\"..." type=io.containerd.grpc.v1 Sep 13 00:07:12.872039 containerd[1501]: time="2025-09-13T00:07:12.871894513Z" level=info msg="loading plugin \"io.containerd.grpc.v1.events\"..." type=io.containerd.grpc.v1 Sep 13 00:07:12.872039 containerd[1501]: time="2025-09-13T00:07:12.871906095Z" level=info msg="loading plugin \"io.containerd.grpc.v1.images\"..." type=io.containerd.grpc.v1 Sep 13 00:07:12.872039 containerd[1501]: time="2025-09-13T00:07:12.871919019Z" level=info msg="loading plugin \"io.containerd.grpc.v1.introspection\"..." type=io.containerd.grpc.v1 Sep 13 00:07:12.872039 containerd[1501]: time="2025-09-13T00:07:12.871928246Z" level=info msg="loading plugin \"io.containerd.grpc.v1.leases\"..." type=io.containerd.grpc.v1 Sep 13 00:07:12.872039 containerd[1501]: time="2025-09-13T00:07:12.871937444Z" level=info msg="loading plugin \"io.containerd.grpc.v1.namespaces\"..." type=io.containerd.grpc.v1 Sep 13 00:07:12.872039 containerd[1501]: time="2025-09-13T00:07:12.871947031Z" level=info msg="loading plugin \"io.containerd.grpc.v1.sandbox-controllers\"..." type=io.containerd.grpc.v1 Sep 13 00:07:12.872039 containerd[1501]: time="2025-09-13T00:07:12.871960336Z" level=info msg="loading plugin \"io.containerd.grpc.v1.sandboxes\"..." type=io.containerd.grpc.v1 Sep 13 00:07:12.872039 containerd[1501]: time="2025-09-13T00:07:12.871969704Z" level=info msg="loading plugin \"io.containerd.grpc.v1.snapshots\"..." type=io.containerd.grpc.v1 Sep 13 00:07:12.872039 containerd[1501]: time="2025-09-13T00:07:12.871980324Z" level=info msg="loading plugin \"io.containerd.grpc.v1.streaming\"..." type=io.containerd.grpc.v1 Sep 13 00:07:12.872039 containerd[1501]: time="2025-09-13T00:07:12.871989852Z" level=info msg="loading plugin \"io.containerd.grpc.v1.tasks\"..." 
type=io.containerd.grpc.v1 Sep 13 00:07:12.872039 containerd[1501]: time="2025-09-13T00:07:12.872004039Z" level=info msg="loading plugin \"io.containerd.transfer.v1.local\"..." type=io.containerd.transfer.v1 Sep 13 00:07:12.872039 containerd[1501]: time="2025-09-13T00:07:12.872020439Z" level=info msg="loading plugin \"io.containerd.grpc.v1.transfer\"..." type=io.containerd.grpc.v1 Sep 13 00:07:12.872679 containerd[1501]: time="2025-09-13T00:07:12.872030929Z" level=info msg="loading plugin \"io.containerd.grpc.v1.version\"..." type=io.containerd.grpc.v1 Sep 13 00:07:12.872679 containerd[1501]: time="2025-09-13T00:07:12.872039996Z" level=info msg="loading plugin \"io.containerd.internal.v1.restart\"..." type=io.containerd.internal.v1 Sep 13 00:07:12.872679 containerd[1501]: time="2025-09-13T00:07:12.872074020Z" level=info msg="loading plugin \"io.containerd.tracing.processor.v1.otlp\"..." type=io.containerd.tracing.processor.v1 Sep 13 00:07:12.872679 containerd[1501]: time="2025-09-13T00:07:12.872087275Z" level=info msg="skip loading plugin \"io.containerd.tracing.processor.v1.otlp\"..." error="skip plugin: tracing endpoint not configured" type=io.containerd.tracing.processor.v1 Sep 13 00:07:12.872679 containerd[1501]: time="2025-09-13T00:07:12.872095460Z" level=info msg="loading plugin \"io.containerd.internal.v1.tracing\"..." type=io.containerd.internal.v1 Sep 13 00:07:12.872679 containerd[1501]: time="2025-09-13T00:07:12.872104407Z" level=info msg="skip loading plugin \"io.containerd.internal.v1.tracing\"..." error="skip plugin: tracing endpoint not configured" type=io.containerd.internal.v1 Sep 13 00:07:12.872679 containerd[1501]: time="2025-09-13T00:07:12.872111309Z" level=info msg="loading plugin \"io.containerd.grpc.v1.healthcheck\"..." type=io.containerd.grpc.v1 Sep 13 00:07:12.872679 containerd[1501]: time="2025-09-13T00:07:12.872120898Z" level=info msg="loading plugin \"io.containerd.nri.v1.nri\"..." type=io.containerd.nri.v1 Sep 13 00:07:12.872679 containerd[1501]: time="2025-09-13T00:07:12.872128652Z" level=info msg="NRI interface is disabled by configuration." Sep 13 00:07:12.872679 containerd[1501]: time="2025-09-13T00:07:12.872139342Z" level=info msg="loading plugin \"io.containerd.grpc.v1.cri\"..." 
type=io.containerd.grpc.v1 Sep 13 00:07:12.878729 containerd[1501]: time="2025-09-13T00:07:12.877809817Z" level=info msg="Start cri plugin with config {PluginConfig:{ContainerdConfig:{Snapshotter:overlayfs DefaultRuntimeName:runc DefaultRuntime:{Type: Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:} UntrustedWorkloadRuntime:{Type: Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:} Runtimes:map[runc:{Type:io.containerd.runc.v2 Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[SystemdCgroup:true] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:podsandbox}] NoPivot:false DisableSnapshotAnnotations:true DiscardUnpackedLayers:false IgnoreBlockIONotEnabledErrors:false IgnoreRdtNotEnabledErrors:false} CniConfig:{NetworkPluginBinDir:/opt/cni/bin NetworkPluginConfDir:/etc/cni/net.d NetworkPluginMaxConfNum:1 NetworkPluginSetupSerially:false NetworkPluginConfTemplate: IPPreference:} Registry:{ConfigPath: Mirrors:map[] Configs:map[] Auths:map[] Headers:map[]} ImageDecryption:{KeyModel:node} DisableTCPService:true StreamServerAddress:127.0.0.1 StreamServerPort:0 StreamIdleTimeout:4h0m0s EnableSelinux:true SelinuxCategoryRange:1024 SandboxImage:registry.k8s.io/pause:3.8 StatsCollectPeriod:10 SystemdCgroup:false EnableTLSStreaming:false X509KeyPairStreaming:{TLSCertFile: TLSKeyFile:} MaxContainerLogLineSize:16384 DisableCgroup:false DisableApparmor:false RestrictOOMScoreAdj:false MaxConcurrentDownloads:3 DisableProcMount:false UnsetSeccompProfile: TolerateMissingHugetlbController:true DisableHugetlbController:true DeviceOwnershipFromSecurityContext:false IgnoreImageDefinedVolumes:false NetNSMountsUnderStateDir:false EnableUnprivilegedPorts:false EnableUnprivilegedICMP:false EnableCDI:false CDISpecDirs:[/etc/cdi /var/run/cdi] ImagePullProgressTimeout:5m0s DrainExecSyncIOTimeout:0s ImagePullWithSyncFs:false IgnoreDeprecationWarnings:[]} ContainerdRootDir:/var/lib/containerd ContainerdEndpoint:/run/containerd/containerd.sock RootDir:/var/lib/containerd/io.containerd.grpc.v1.cri StateDir:/run/containerd/io.containerd.grpc.v1.cri}" Sep 13 00:07:12.878729 containerd[1501]: time="2025-09-13T00:07:12.878048996Z" level=info msg="Connect containerd service" Sep 13 00:07:12.878729 containerd[1501]: time="2025-09-13T00:07:12.878097096Z" level=info msg="using legacy CRI server" Sep 13 00:07:12.878729 containerd[1501]: time="2025-09-13T00:07:12.878105552Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Sep 13 00:07:12.878729 containerd[1501]: time="2025-09-13T00:07:12.878204468Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\"" Sep 13 00:07:12.880833 containerd[1501]: time="2025-09-13T00:07:12.880248391Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Sep 13 00:07:12.881008 
containerd[1501]: time="2025-09-13T00:07:12.880981766Z" level=info msg="Start subscribing containerd event" Sep 13 00:07:12.881065 sshd_keygen[1485]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Sep 13 00:07:12.881339 containerd[1501]: time="2025-09-13T00:07:12.881325030Z" level=info msg="Start recovering state" Sep 13 00:07:12.881620 containerd[1501]: time="2025-09-13T00:07:12.881608492Z" level=info msg="Start event monitor" Sep 13 00:07:12.882321 containerd[1501]: time="2025-09-13T00:07:12.882305399Z" level=info msg="Start snapshots syncer" Sep 13 00:07:12.882385 containerd[1501]: time="2025-09-13T00:07:12.882364199Z" level=info msg="Start cni network conf syncer for default" Sep 13 00:07:12.885881 containerd[1501]: time="2025-09-13T00:07:12.882572560Z" level=info msg="Start streaming server" Sep 13 00:07:12.885881 containerd[1501]: time="2025-09-13T00:07:12.881859102Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Sep 13 00:07:12.885881 containerd[1501]: time="2025-09-13T00:07:12.882636670Z" level=info msg=serving... address=/run/containerd/containerd.sock Sep 13 00:07:12.885881 containerd[1501]: time="2025-09-13T00:07:12.882688527Z" level=info msg="containerd successfully booted in 0.082733s" Sep 13 00:07:12.884363 systemd[1]: Started containerd.service - containerd container runtime. Sep 13 00:07:12.903743 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. Sep 13 00:07:12.913313 systemd[1]: Starting issuegen.service - Generate /run/issue... Sep 13 00:07:12.925702 systemd[1]: issuegen.service: Deactivated successfully. Sep 13 00:07:12.925892 systemd[1]: Finished issuegen.service - Generate /run/issue. Sep 13 00:07:12.935183 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... Sep 13 00:07:12.947737 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. Sep 13 00:07:12.958581 systemd[1]: Started getty@tty1.service - Getty on tty1. Sep 13 00:07:12.960790 systemd[1]: Started serial-getty@ttyS0.service - Serial Getty on ttyS0. Sep 13 00:07:12.962749 systemd[1]: Reached target getty.target - Login Prompts. Sep 13 00:07:13.098667 tar[1488]: linux-amd64/README.md Sep 13 00:07:13.107427 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. Sep 13 00:07:13.176505 systemd-networkd[1394]: eth0: Gained IPv6LL Sep 13 00:07:13.177044 systemd-timesyncd[1398]: Network configuration changed, trying to establish connection. Sep 13 00:07:13.179143 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Sep 13 00:07:13.181964 systemd[1]: Reached target network-online.target - Network is Online. Sep 13 00:07:13.193790 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 13 00:07:13.197939 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Sep 13 00:07:13.218517 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. Sep 13 00:07:13.752805 systemd-networkd[1394]: eth1: Gained IPv6LL Sep 13 00:07:13.753984 systemd-timesyncd[1398]: Network configuration changed, trying to establish connection. Sep 13 00:07:14.223121 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 13 00:07:14.224037 systemd[1]: Reached target multi-user.target - Multi-User System. Sep 13 00:07:14.226418 systemd[1]: Startup finished in 1.390s (kernel) + 5.424s (initrd) + 4.149s (userspace) = 10.964s. 
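[Editor's note] The CRI config dumped above shows SystemdCgroup:true on the runc runtime options (the top-level SystemdCgroup:false is a separate, legacy knob). As a sketch, that option corresponds to this config.toml fragment in containerd 1.7's v2 schema; the authoritative state is the dump in the log, and appending is shown only for the shape of the keys:

    # NOTE: blind appending can duplicate sections; merge into the real config
    cat <<'EOF' >>/etc/containerd/config.toml
    [plugins."io.containerd.grpc.v1.cri".containerd.runtimes.runc]
      runtime_type = "io.containerd.runc.v2"
      [plugins."io.containerd.grpc.v1.cri".containerd.runtimes.runc.options]
        SystemdCgroup = true
    EOF
    systemctl restart containerd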
Sep 13 00:07:14.227976 (kubelet)[1594]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 13 00:07:15.062835 kubelet[1594]: E0913 00:07:15.062743 1594 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 13 00:07:15.065992 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 13 00:07:15.066321 systemd[1]: kubelet.service: Failed with result 'exit-code'. Sep 13 00:07:15.066748 systemd[1]: kubelet.service: Consumed 1.233s CPU time. Sep 13 00:07:25.103075 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. Sep 13 00:07:25.108696 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 13 00:07:25.193745 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 13 00:07:25.206664 (kubelet)[1612]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 13 00:07:25.260327 kubelet[1612]: E0913 00:07:25.260161 1612 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 13 00:07:25.263162 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 13 00:07:25.263375 systemd[1]: kubelet.service: Failed with result 'exit-code'. Sep 13 00:07:35.352882 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. Sep 13 00:07:35.358462 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 13 00:07:35.456912 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 13 00:07:35.468506 (kubelet)[1627]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 13 00:07:35.505295 kubelet[1627]: E0913 00:07:35.505174 1627 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 13 00:07:35.508077 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 13 00:07:35.508263 systemd[1]: kubelet.service: Failed with result 'exit-code'. Sep 13 00:07:37.842526 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Sep 13 00:07:37.843630 systemd[1]: Started sshd@0-157.180.121.11:22-147.75.109.163:33400.service - OpenSSH per-connection server daemon (147.75.109.163:33400). Sep 13 00:07:38.815490 sshd[1635]: Accepted publickey for core from 147.75.109.163 port 33400 ssh2: RSA SHA256:GymMDYnosJimc4ujfdMuxEHSH4lnFIHEzFRMhgLPZDY Sep 13 00:07:38.817050 sshd[1635]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 13 00:07:38.826156 systemd[1]: Created slice user-500.slice - User Slice of UID 500. 
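[Editor's note] The kubelet crash loop above is expected on a node that has not been joined yet: the stock kubelet.service starts, finds no /var/lib/kubelet/config.yaml, and exits, and systemd restarts it every ten seconds. That file is normally written by kubeadm during init/join; a hedged sketch of a minimal file of that kind, with illustrative values only:

    # Normally produced by `kubeadm init` / `kubeadm join`, not written by hand
    mkdir -p /var/lib/kubelet
    cat <<'EOF' >/var/lib/kubelet/config.yaml
    apiVersion: kubelet.config.k8s.io/v1beta1
    kind: KubeletConfiguration
    cgroupDriver: systemd   # matches SystemdCgroup=true in containerd
    containerRuntimeEndpoint: unix:///run/containerd/containerd.sock
    EOF
    systemctl restart kubelet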
Sep 13 00:07:38.837893 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Sep 13 00:07:38.842213 systemd-logind[1477]: New session 1 of user core. Sep 13 00:07:38.850012 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Sep 13 00:07:38.857031 systemd[1]: Starting user@500.service - User Manager for UID 500... Sep 13 00:07:38.862703 (systemd)[1639]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) Sep 13 00:07:38.987045 systemd[1639]: Queued start job for default target default.target. Sep 13 00:07:38.999582 systemd[1639]: Created slice app.slice - User Application Slice. Sep 13 00:07:38.999621 systemd[1639]: Reached target paths.target - Paths. Sep 13 00:07:38.999640 systemd[1639]: Reached target timers.target - Timers. Sep 13 00:07:39.001131 systemd[1639]: Starting dbus.socket - D-Bus User Message Bus Socket... Sep 13 00:07:39.012436 systemd[1639]: Listening on dbus.socket - D-Bus User Message Bus Socket. Sep 13 00:07:39.012697 systemd[1639]: Reached target sockets.target - Sockets. Sep 13 00:07:39.012812 systemd[1639]: Reached target basic.target - Basic System. Sep 13 00:07:39.012927 systemd[1639]: Reached target default.target - Main User Target. Sep 13 00:07:39.012959 systemd[1]: Started user@500.service - User Manager for UID 500. Sep 13 00:07:39.013082 systemd[1639]: Startup finished in 143ms. Sep 13 00:07:39.020417 systemd[1]: Started session-1.scope - Session 1 of User core. Sep 13 00:07:39.736021 systemd[1]: Started sshd@1-157.180.121.11:22-147.75.109.163:40084.service - OpenSSH per-connection server daemon (147.75.109.163:40084). Sep 13 00:07:40.813967 sshd[1650]: Accepted publickey for core from 147.75.109.163 port 40084 ssh2: RSA SHA256:GymMDYnosJimc4ujfdMuxEHSH4lnFIHEzFRMhgLPZDY Sep 13 00:07:40.815408 sshd[1650]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 13 00:07:40.820208 systemd-logind[1477]: New session 2 of user core. Sep 13 00:07:40.826400 systemd[1]: Started session-2.scope - Session 2 of User core. Sep 13 00:07:41.559332 sshd[1650]: pam_unix(sshd:session): session closed for user core Sep 13 00:07:41.562786 systemd-logind[1477]: Session 2 logged out. Waiting for processes to exit. Sep 13 00:07:41.563324 systemd[1]: sshd@1-157.180.121.11:22-147.75.109.163:40084.service: Deactivated successfully. Sep 13 00:07:41.565162 systemd[1]: session-2.scope: Deactivated successfully. Sep 13 00:07:41.566129 systemd-logind[1477]: Removed session 2. Sep 13 00:07:41.708560 systemd[1]: Started sshd@2-157.180.121.11:22-147.75.109.163:40096.service - OpenSSH per-connection server daemon (147.75.109.163:40096). Sep 13 00:07:42.685891 sshd[1657]: Accepted publickey for core from 147.75.109.163 port 40096 ssh2: RSA SHA256:GymMDYnosJimc4ujfdMuxEHSH4lnFIHEzFRMhgLPZDY Sep 13 00:07:42.687432 sshd[1657]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 13 00:07:42.692351 systemd-logind[1477]: New session 3 of user core. Sep 13 00:07:42.702544 systemd[1]: Started session-3.scope - Session 3 of User core. Sep 13 00:07:43.356951 sshd[1657]: pam_unix(sshd:session): session closed for user core Sep 13 00:07:43.360447 systemd[1]: sshd@2-157.180.121.11:22-147.75.109.163:40096.service: Deactivated successfully. Sep 13 00:07:43.362381 systemd[1]: session-3.scope: Deactivated successfully. Sep 13 00:07:43.363023 systemd-logind[1477]: Session 3 logged out. Waiting for processes to exit. 
Sep 13 00:07:43.364145 systemd-logind[1477]: Removed session 3. Sep 13 00:07:43.527716 systemd[1]: Started sshd@3-157.180.121.11:22-147.75.109.163:40104.service - OpenSSH per-connection server daemon (147.75.109.163:40104). Sep 13 00:07:45.111378 systemd-timesyncd[1398]: Contacted time server 128.140.109.119:123 (2.flatcar.pool.ntp.org). Sep 13 00:07:45.111443 systemd-timesyncd[1398]: Initial clock synchronization to Sat 2025-09-13 00:07:45.111141 UTC. Sep 13 00:07:45.111499 systemd-resolved[1345]: Clock change detected. Flushing caches. Sep 13 00:07:45.503269 sshd[1664]: Accepted publickey for core from 147.75.109.163 port 40104 ssh2: RSA SHA256:GymMDYnosJimc4ujfdMuxEHSH4lnFIHEzFRMhgLPZDY Sep 13 00:07:45.504646 sshd[1664]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 13 00:07:45.510541 systemd-logind[1477]: New session 4 of user core. Sep 13 00:07:45.517260 systemd[1]: Started session-4.scope - Session 4 of User core. Sep 13 00:07:46.177552 sshd[1664]: pam_unix(sshd:session): session closed for user core Sep 13 00:07:46.180680 systemd[1]: sshd@3-157.180.121.11:22-147.75.109.163:40104.service: Deactivated successfully. Sep 13 00:07:46.182844 systemd[1]: session-4.scope: Deactivated successfully. Sep 13 00:07:46.184451 systemd-logind[1477]: Session 4 logged out. Waiting for processes to exit. Sep 13 00:07:46.185565 systemd-logind[1477]: Removed session 4. Sep 13 00:07:46.384281 systemd[1]: Started sshd@4-157.180.121.11:22-147.75.109.163:40118.service - OpenSSH per-connection server daemon (147.75.109.163:40118). Sep 13 00:07:46.603561 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3. Sep 13 00:07:46.609540 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 13 00:07:46.698032 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 13 00:07:46.701355 (kubelet)[1681]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 13 00:07:46.737207 kubelet[1681]: E0913 00:07:46.737147 1681 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 13 00:07:46.740351 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 13 00:07:46.740563 systemd[1]: kubelet.service: Failed with result 'exit-code'. Sep 13 00:07:47.455379 sshd[1671]: Accepted publickey for core from 147.75.109.163 port 40118 ssh2: RSA SHA256:GymMDYnosJimc4ujfdMuxEHSH4lnFIHEzFRMhgLPZDY Sep 13 00:07:47.456855 sshd[1671]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 13 00:07:47.461596 systemd-logind[1477]: New session 5 of user core. Sep 13 00:07:47.472149 systemd[1]: Started session-5.scope - Session 5 of User core. Sep 13 00:07:48.032075 sudo[1689]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Sep 13 00:07:48.032367 sudo[1689]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 13 00:07:48.049188 sudo[1689]: pam_unix(sudo:session): session closed for user root Sep 13 00:07:48.225127 sshd[1671]: pam_unix(sshd:session): session closed for user core Sep 13 00:07:48.229499 systemd[1]: sshd@4-157.180.121.11:22-147.75.109.163:40118.service: Deactivated successfully. 
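[Editor's note] The 33-second jump at 00:07:45 is the first successful NTP synchronization (128.140.109.119 from 2.flatcar.pool.ntp.org, as logged), after which systemd-resolved flushes its caches. Server selection for systemd-timesyncd lives in timesyncd.conf; a sketch using the pool from the log (Flatcar ships its default elsewhere, so this file is illustrative):

    cat <<'EOF' >/etc/systemd/timesyncd.conf
    [Time]
    NTP=2.flatcar.pool.ntp.org
    EOF
    systemctl restart systemd-timesyncd
    timedatectl timesync-status   # shows the contacted server and offset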
Sep 13 00:07:48.231361 systemd[1]: session-5.scope: Deactivated successfully. Sep 13 00:07:48.232233 systemd-logind[1477]: Session 5 logged out. Waiting for processes to exit. Sep 13 00:07:48.233522 systemd-logind[1477]: Removed session 5. Sep 13 00:07:48.414256 systemd[1]: Started sshd@5-157.180.121.11:22-147.75.109.163:40128.service - OpenSSH per-connection server daemon (147.75.109.163:40128). Sep 13 00:07:49.486173 sshd[1694]: Accepted publickey for core from 147.75.109.163 port 40128 ssh2: RSA SHA256:GymMDYnosJimc4ujfdMuxEHSH4lnFIHEzFRMhgLPZDY Sep 13 00:07:49.488090 sshd[1694]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 13 00:07:49.494026 systemd-logind[1477]: New session 6 of user core. Sep 13 00:07:49.500202 systemd[1]: Started session-6.scope - Session 6 of User core. Sep 13 00:07:50.056332 sudo[1698]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/bin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Sep 13 00:07:50.056692 sudo[1698]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 13 00:07:50.060808 sudo[1698]: pam_unix(sudo:session): session closed for user root Sep 13 00:07:50.067142 sudo[1697]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/bin/systemctl restart audit-rules Sep 13 00:07:50.067518 sudo[1697]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 13 00:07:50.085307 systemd[1]: Stopping audit-rules.service - Load Security Auditing Rules... Sep 13 00:07:50.087269 auditctl[1701]: No rules Sep 13 00:07:50.087642 systemd[1]: audit-rules.service: Deactivated successfully. Sep 13 00:07:50.087799 systemd[1]: Stopped audit-rules.service - Load Security Auditing Rules. Sep 13 00:07:50.093446 systemd[1]: Starting audit-rules.service - Load Security Auditing Rules... Sep 13 00:07:50.120840 augenrules[1719]: No rules Sep 13 00:07:50.122266 systemd[1]: Finished audit-rules.service - Load Security Auditing Rules. Sep 13 00:07:50.123836 sudo[1697]: pam_unix(sudo:session): session closed for user root Sep 13 00:07:50.299324 sshd[1694]: pam_unix(sshd:session): session closed for user core Sep 13 00:07:50.302802 systemd[1]: sshd@5-157.180.121.11:22-147.75.109.163:40128.service: Deactivated successfully. Sep 13 00:07:50.304735 systemd[1]: session-6.scope: Deactivated successfully. Sep 13 00:07:50.305961 systemd-logind[1477]: Session 6 logged out. Waiting for processes to exit. Sep 13 00:07:50.307202 systemd-logind[1477]: Removed session 6. Sep 13 00:07:50.484201 systemd[1]: Started sshd@6-157.180.121.11:22-147.75.109.163:53390.service - OpenSSH per-connection server daemon (147.75.109.163:53390). Sep 13 00:07:51.557431 sshd[1727]: Accepted publickey for core from 147.75.109.163 port 53390 ssh2: RSA SHA256:GymMDYnosJimc4ujfdMuxEHSH4lnFIHEzFRMhgLPZDY Sep 13 00:07:51.559418 sshd[1727]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 13 00:07:51.567170 systemd-logind[1477]: New session 7 of user core. Sep 13 00:07:51.574228 systemd[1]: Started session-7.scope - Session 7 of User core. Sep 13 00:07:52.125987 sudo[1730]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Sep 13 00:07:52.126330 sudo[1730]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 13 00:07:52.379123 systemd[1]: Starting docker.service - Docker Application Container Engine... 
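[Editor's note] Both auditctl and augenrules report "No rules" above because the sudo session deleted the rule files before restarting audit-rules.service. Rules normally live as fragments under /etc/audit/rules.d/ and are compiled by augenrules; a sketch with a hypothetical watch rule:

    # 10-identity.rules is a made-up example fragment
    cat <<'EOF' >/etc/audit/rules.d/10-identity.rules
    -w /etc/passwd -p wa -k identity
    EOF
    augenrules --load   # merge fragments and load, as audit-rules.service does
    auditctl -l         # list the rules now active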
Sep 13 00:07:52.379196 (dockerd)[1745]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU Sep 13 00:07:52.769308 dockerd[1745]: time="2025-09-13T00:07:52.769094339Z" level=info msg="Starting up" Sep 13 00:07:52.903240 dockerd[1745]: time="2025-09-13T00:07:52.902903479Z" level=info msg="Loading containers: start." Sep 13 00:07:53.060369 kernel: Initializing XFRM netlink socket Sep 13 00:07:53.170101 systemd-networkd[1394]: docker0: Link UP Sep 13 00:07:53.187113 dockerd[1745]: time="2025-09-13T00:07:53.187054538Z" level=info msg="Loading containers: done." Sep 13 00:07:53.200527 systemd[1]: var-lib-docker-overlay2-opaque\x2dbug\x2dcheck2585193038-merged.mount: Deactivated successfully. Sep 13 00:07:53.201959 dockerd[1745]: time="2025-09-13T00:07:53.201484561Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Sep 13 00:07:53.201959 dockerd[1745]: time="2025-09-13T00:07:53.201592633Z" level=info msg="Docker daemon" commit=061aa95809be396a6b5542618d8a34b02a21ff77 containerd-snapshotter=false storage-driver=overlay2 version=26.1.0 Sep 13 00:07:53.201959 dockerd[1745]: time="2025-09-13T00:07:53.201688504Z" level=info msg="Daemon has completed initialization" Sep 13 00:07:53.236825 dockerd[1745]: time="2025-09-13T00:07:53.236744011Z" level=info msg="API listen on /run/docker.sock" Sep 13 00:07:53.238466 systemd[1]: Started docker.service - Docker Application Container Engine. Sep 13 00:07:54.356272 containerd[1501]: time="2025-09-13T00:07:54.356224547Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.33.5\"" Sep 13 00:07:54.872611 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount431188128.mount: Deactivated successfully. Sep 13 00:07:55.859047 systemd[1]: Started sshd@7-157.180.121.11:22-103.13.206.71:34952.service - OpenSSH per-connection server daemon (103.13.206.71:34952). 
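[Editor's note] dockerd came up on overlay2 and warns that native diff is degraded because the kernel enables CONFIG_OVERLAY_FS_REDIRECT_DIR; this is informational, not an error. The storage driver can be pinned explicitly in daemon.json, sketched here (overlay2 is already the autodetected choice above):

    mkdir -p /etc/docker
    cat <<'EOF' >/etc/docker/daemon.json
    { "storage-driver": "overlay2" }
    EOF
    systemctl restart docker
    docker info --format '{{.Driver}}'   # -> overlay2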
Sep 13 00:07:56.430160 containerd[1501]: time="2025-09-13T00:07:56.430100062Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.33.5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 13 00:07:56.431252 containerd[1501]: time="2025-09-13T00:07:56.431125916Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.33.5: active requests=0, bytes read=30114993" Sep 13 00:07:56.433549 containerd[1501]: time="2025-09-13T00:07:56.432276544Z" level=info msg="ImageCreate event name:\"sha256:b7335a56022aba291f5df653c01b7ab98d64fb5cab221378617f4a1236e06a62\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 13 00:07:56.434628 containerd[1501]: time="2025-09-13T00:07:56.434608077Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:1b9c6c00bc1fe86860e72efb8e4148f9e436a132eba4ca636ca4f48d61d6dfb4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 13 00:07:56.435513 containerd[1501]: time="2025-09-13T00:07:56.435482176Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.33.5\" with image id \"sha256:b7335a56022aba291f5df653c01b7ab98d64fb5cab221378617f4a1236e06a62\", repo tag \"registry.k8s.io/kube-apiserver:v1.33.5\", repo digest \"registry.k8s.io/kube-apiserver@sha256:1b9c6c00bc1fe86860e72efb8e4148f9e436a132eba4ca636ca4f48d61d6dfb4\", size \"30111492\" in 2.079205531s" Sep 13 00:07:56.435559 containerd[1501]: time="2025-09-13T00:07:56.435518084Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.33.5\" returns image reference \"sha256:b7335a56022aba291f5df653c01b7ab98d64fb5cab221378617f4a1236e06a62\"" Sep 13 00:07:56.436488 containerd[1501]: time="2025-09-13T00:07:56.436472153Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.33.5\"" Sep 13 00:07:56.853805 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 4. Sep 13 00:07:56.863169 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 13 00:07:56.966100 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 13 00:07:56.966772 (kubelet)[1951]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 13 00:07:57.011640 kubelet[1951]: E0913 00:07:57.011500 1951 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 13 00:07:57.015390 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 13 00:07:57.015660 systemd[1]: kubelet.service: Failed with result 'exit-code'. Sep 13 00:07:57.038906 sshd[1937]: Invalid user sales1 from 103.13.206.71 port 34952 Sep 13 00:07:57.260698 sshd[1937]: Received disconnect from 103.13.206.71 port 34952:11: Bye Bye [preauth] Sep 13 00:07:57.260698 sshd[1937]: Disconnected from invalid user sales1 103.13.206.71 port 34952 [preauth] Sep 13 00:07:57.264039 systemd[1]: sshd@7-157.180.121.11:22-103.13.206.71:34952.service: Deactivated successfully. 
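[Editor's note] The PullImage lines are containerd's CRI plugin fetching control-plane images into the k8s.io namespace; Docker is not involved in these pulls. The same pull can be reproduced for debugging with either CLI, assuming crictl is pointed at containerd's socket:

    ctr -n k8s.io images pull registry.k8s.io/kube-apiserver:v1.33.5
    ctr -n k8s.io images ls | grep kube-apiserver
    # or via the CRI API:
    crictl pull registry.k8s.io/kube-apiserver:v1.33.5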
Sep 13 00:07:57.630368 containerd[1501]: time="2025-09-13T00:07:57.630299148Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.33.5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 13 00:07:57.631411 containerd[1501]: time="2025-09-13T00:07:57.631212911Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.33.5: active requests=0, bytes read=26020866" Sep 13 00:07:57.633160 containerd[1501]: time="2025-09-13T00:07:57.632212947Z" level=info msg="ImageCreate event name:\"sha256:8bb43160a0df4d7d34c89d9edbc48735bc2f830771e4b501937338221be0f668\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 13 00:07:57.634511 containerd[1501]: time="2025-09-13T00:07:57.634484538Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:1082a6ab67fb46397314dd36b36cb197ba4a4c5365033e9ad22bc7edaaaabd5c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 13 00:07:57.635385 containerd[1501]: time="2025-09-13T00:07:57.635359518Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.33.5\" with image id \"sha256:8bb43160a0df4d7d34c89d9edbc48735bc2f830771e4b501937338221be0f668\", repo tag \"registry.k8s.io/kube-controller-manager:v1.33.5\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:1082a6ab67fb46397314dd36b36cb197ba4a4c5365033e9ad22bc7edaaaabd5c\", size \"27681301\" in 1.198806253s" Sep 13 00:07:57.635463 containerd[1501]: time="2025-09-13T00:07:57.635450579Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.33.5\" returns image reference \"sha256:8bb43160a0df4d7d34c89d9edbc48735bc2f830771e4b501937338221be0f668\"" Sep 13 00:07:57.636141 containerd[1501]: time="2025-09-13T00:07:57.636109746Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.33.5\"" Sep 13 00:07:58.269041 update_engine[1481]: I20250913 00:07:58.268904 1481 update_attempter.cc:509] Updating boot flags... 
Sep 13 00:07:58.308937 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 36 scanned by (udev-worker) (1973) Sep 13 00:07:58.366788 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 36 scanned by (udev-worker) (1969) Sep 13 00:07:58.903432 containerd[1501]: time="2025-09-13T00:07:58.903371768Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.33.5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 13 00:07:58.904491 containerd[1501]: time="2025-09-13T00:07:58.904279992Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.33.5: active requests=0, bytes read=20155590" Sep 13 00:07:58.906700 containerd[1501]: time="2025-09-13T00:07:58.905238349Z" level=info msg="ImageCreate event name:\"sha256:33b680aadf474b7e5e73957fc00c6af86dd0484c699c8461ba33ee656d1823bf\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 13 00:07:58.907843 containerd[1501]: time="2025-09-13T00:07:58.907414020Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:3e7b57c9d9f06b77f0064e5be7f3df61e0151101160acd5fdecce911df28a189\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 13 00:07:58.908400 containerd[1501]: time="2025-09-13T00:07:58.908372978Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.33.5\" with image id \"sha256:33b680aadf474b7e5e73957fc00c6af86dd0484c699c8461ba33ee656d1823bf\", repo tag \"registry.k8s.io/kube-scheduler:v1.33.5\", repo digest \"registry.k8s.io/kube-scheduler@sha256:3e7b57c9d9f06b77f0064e5be7f3df61e0151101160acd5fdecce911df28a189\", size \"21816043\" in 1.272237134s" Sep 13 00:07:58.908443 containerd[1501]: time="2025-09-13T00:07:58.908399598Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.33.5\" returns image reference \"sha256:33b680aadf474b7e5e73957fc00c6af86dd0484c699c8461ba33ee656d1823bf\"" Sep 13 00:07:58.909302 containerd[1501]: time="2025-09-13T00:07:58.909277614Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.33.5\"" Sep 13 00:07:59.822596 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount145070678.mount: Deactivated successfully. 
Sep 13 00:08:00.130048 containerd[1501]: time="2025-09-13T00:08:00.129980395Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.33.5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 13 00:08:00.131290 containerd[1501]: time="2025-09-13T00:08:00.131125362Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.33.5: active requests=0, bytes read=31929497" Sep 13 00:08:00.131998 containerd[1501]: time="2025-09-13T00:08:00.131965809Z" level=info msg="ImageCreate event name:\"sha256:2844ee7bb56c2c194e1f4adafb9e7b60b9ed16aa4d07ab8ad1f019362e2efab3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 13 00:08:00.134280 containerd[1501]: time="2025-09-13T00:08:00.134240014Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:71445ec84ad98bd52a7784865a9d31b1b50b56092d3f7699edc39eefd71befe1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 13 00:08:00.134775 containerd[1501]: time="2025-09-13T00:08:00.134741625Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.33.5\" with image id \"sha256:2844ee7bb56c2c194e1f4adafb9e7b60b9ed16aa4d07ab8ad1f019362e2efab3\", repo tag \"registry.k8s.io/kube-proxy:v1.33.5\", repo digest \"registry.k8s.io/kube-proxy@sha256:71445ec84ad98bd52a7784865a9d31b1b50b56092d3f7699edc39eefd71befe1\", size \"31928488\" in 1.225377618s" Sep 13 00:08:00.134775 containerd[1501]: time="2025-09-13T00:08:00.134771852Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.33.5\" returns image reference \"sha256:2844ee7bb56c2c194e1f4adafb9e7b60b9ed16aa4d07ab8ad1f019362e2efab3\"" Sep 13 00:08:00.135668 containerd[1501]: time="2025-09-13T00:08:00.135620203Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.0\"" Sep 13 00:08:00.602619 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4007677666.mount: Deactivated successfully. 
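[editor's note] The containerd entries above repeat a fixed pattern: each successful pull reports the image reference, a byte size, and a wall-clock duration. A hedged parser for exactly that pattern (the regex is fitted to the messages as printed in this log, not to containerd logs in general):

    # Hedged sketch: turn the `Pulled image ... size "<bytes>" in <duration>`
    # entries above into throughput figures. Fitted to this log's formatting only.
    import re

    PULL_RE = re.compile(
        r'Pulled image \\?"(?P<ref>[^"\\]+)\\?".*size \\?"(?P<size>\d+)\\?" '
        r'in (?P<dur>[\d.]+)(?P<unit>ms|s)'
    )

    def pull_throughput(line: str):
        m = PULL_RE.search(line)
        if m is None:
            return None
        seconds = float(m.group("dur")) * (0.001 if m.group("unit") == "ms" else 1.0)
        mib = int(m.group("size")) / (1024 * 1024)
        return m.group("ref"), mib / seconds

    # The kube-proxy entry above (31928488 bytes in 1.225377618s) works out
    # to roughly 24.8 MiB/s.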
Sep 13 00:08:01.423044 containerd[1501]: time="2025-09-13T00:08:01.422964490Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.12.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 13 00:08:01.424422 containerd[1501]: time="2025-09-13T00:08:01.424374595Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.12.0: active requests=0, bytes read=20942332" Sep 13 00:08:01.424927 containerd[1501]: time="2025-09-13T00:08:01.424841030Z" level=info msg="ImageCreate event name:\"sha256:1cf5f116067c67da67f97bff78c4bbc76913f59057c18627b96facaced73ea0b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 13 00:08:01.427663 containerd[1501]: time="2025-09-13T00:08:01.427628538Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:40384aa1f5ea6bfdc77997d243aec73da05f27aed0c5e9d65bfa98933c519d97\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 13 00:08:01.428676 containerd[1501]: time="2025-09-13T00:08:01.428649012Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.12.0\" with image id \"sha256:1cf5f116067c67da67f97bff78c4bbc76913f59057c18627b96facaced73ea0b\", repo tag \"registry.k8s.io/coredns/coredns:v1.12.0\", repo digest \"registry.k8s.io/coredns/coredns@sha256:40384aa1f5ea6bfdc77997d243aec73da05f27aed0c5e9d65bfa98933c519d97\", size \"20939036\" in 1.292998411s" Sep 13 00:08:01.428843 containerd[1501]: time="2025-09-13T00:08:01.428749490Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.0\" returns image reference \"sha256:1cf5f116067c67da67f97bff78c4bbc76913f59057c18627b96facaced73ea0b\"" Sep 13 00:08:01.429336 containerd[1501]: time="2025-09-13T00:08:01.429305723Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\"" Sep 13 00:08:01.851061 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3377041162.mount: Deactivated successfully. 
Sep 13 00:08:01.857305 containerd[1501]: time="2025-09-13T00:08:01.857250829Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 13 00:08:01.858192 containerd[1501]: time="2025-09-13T00:08:01.858131410Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=321160" Sep 13 00:08:01.860311 containerd[1501]: time="2025-09-13T00:08:01.859038271Z" level=info msg="ImageCreate event name:\"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 13 00:08:01.862066 containerd[1501]: time="2025-09-13T00:08:01.861094096Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 13 00:08:01.862066 containerd[1501]: time="2025-09-13T00:08:01.861884579Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 432.529243ms" Sep 13 00:08:01.862066 containerd[1501]: time="2025-09-13T00:08:01.861936487Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\"" Sep 13 00:08:01.862623 containerd[1501]: time="2025-09-13T00:08:01.862517426Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.21-0\"" Sep 13 00:08:02.277894 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3706285882.mount: Deactivated successfully. Sep 13 00:08:03.663620 containerd[1501]: time="2025-09-13T00:08:03.663546604Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.21-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 13 00:08:03.664810 containerd[1501]: time="2025-09-13T00:08:03.664614377Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.21-0: active requests=0, bytes read=58378491" Sep 13 00:08:03.665973 containerd[1501]: time="2025-09-13T00:08:03.665596970Z" level=info msg="ImageCreate event name:\"sha256:499038711c0816eda03a1ad96a8eb0440c005baa6949698223c6176b7f5077e1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 13 00:08:03.668448 containerd[1501]: time="2025-09-13T00:08:03.668407571Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:d58c035df557080a27387d687092e3fc2b64c6d0e3162dc51453a115f847d121\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 13 00:08:03.669478 containerd[1501]: time="2025-09-13T00:08:03.669441771Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.21-0\" with image id \"sha256:499038711c0816eda03a1ad96a8eb0440c005baa6949698223c6176b7f5077e1\", repo tag \"registry.k8s.io/etcd:3.5.21-0\", repo digest \"registry.k8s.io/etcd@sha256:d58c035df557080a27387d687092e3fc2b64c6d0e3162dc51453a115f847d121\", size \"58938593\" in 1.806712188s" Sep 13 00:08:03.669534 containerd[1501]: time="2025-09-13T00:08:03.669480123Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.21-0\" returns image reference \"sha256:499038711c0816eda03a1ad96a8eb0440c005baa6949698223c6176b7f5077e1\"" Sep 13 00:08:06.764411 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. 
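[editor's note] After the etcd pull, the whole control-plane image set (apiserver, controller-manager, scheduler, proxy, coredns, pause, etcd) is local to containerd. A hedged check via the standard CRI client, assuming crictl is installed and pointed at this host's containerd socket; the tool does not appear in this log:

    # Hedged sketch: list the images visible at the CRI endpoint and confirm
    # the pulls logged above. Assumes `crictl` is installed and configured.
    import subprocess

    result = subprocess.run(
        ["crictl", "images", "--output", "json"],
        capture_output=True, text=True, check=True,
    )
    print(result.stdout)  # should include kube-apiserver v1.33.5 ... etcd 3.5.21-0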
Sep 13 00:08:06.770337 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 13 00:08:06.799623 systemd[1]: Reloading requested from client PID 2133 ('systemctl') (unit session-7.scope)... Sep 13 00:08:06.799642 systemd[1]: Reloading... Sep 13 00:08:06.904945 zram_generator::config[2172]: No configuration found. Sep 13 00:08:07.011793 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Sep 13 00:08:07.088144 systemd[1]: Reloading finished in 288 ms. Sep 13 00:08:07.132619 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM Sep 13 00:08:07.132689 systemd[1]: kubelet.service: Failed with result 'signal'. Sep 13 00:08:07.132953 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Sep 13 00:08:07.138261 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 13 00:08:07.225144 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 13 00:08:07.234259 (kubelet)[2228]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Sep 13 00:08:07.276206 kubelet[2228]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 13 00:08:07.276206 kubelet[2228]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Sep 13 00:08:07.276206 kubelet[2228]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
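[editor's note] The restarted kubelet comes up but repeats the same deprecation warnings: --container-runtime-endpoint and --volume-plugin-dir belong in the config file, and the pause image moves to the runtime side. A hedged summary of where each flag is meant to live (field names follow the public KubeletConfiguration v1beta1 schema; the containerd detail is an assumption about this host's runtime):

    # Hedged sketch: destinations for the deprecated kubelet flags warned
    # about above. The containerd mapping is an assumption about this host.
    FLAG_DESTINATIONS = {
        "--container-runtime-endpoint": "KubeletConfiguration.containerRuntimeEndpoint",
        "--volume-plugin-dir":          "KubeletConfiguration.volumePluginDir",
        # The pause/sandbox image is a runtime-side setting: containerd reads
        # sandbox_image from the CRI plugin section of its config.toml.
        "--pod-infra-container-image":  "containerd config.toml sandbox_image",
    }
    for flag, dest in FLAG_DESTINATIONS.items():
        print(f"{flag:32} -> {dest}")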
Sep 13 00:08:07.278290 kubelet[2228]: I0913 00:08:07.277468 2228 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Sep 13 00:08:07.781169 kubelet[2228]: I0913 00:08:07.781119 2228 server.go:530] "Kubelet version" kubeletVersion="v1.33.0" Sep 13 00:08:07.781169 kubelet[2228]: I0913 00:08:07.781150 2228 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Sep 13 00:08:07.781887 kubelet[2228]: I0913 00:08:07.781785 2228 server.go:956] "Client rotation is on, will bootstrap in background" Sep 13 00:08:07.811921 kubelet[2228]: I0913 00:08:07.811807 2228 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Sep 13 00:08:07.815048 kubelet[2228]: E0913 00:08:07.814990 2228 certificate_manager.go:596] "Failed while requesting a signed certificate from the control plane" err="cannot create certificate signing request: Post \"https://157.180.121.11:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 157.180.121.11:6443: connect: connection refused" logger="kubernetes.io/kube-apiserver-client-kubelet.UnhandledError" Sep 13 00:08:07.827045 kubelet[2228]: E0913 00:08:07.826995 2228 log.go:32] "RuntimeConfig from runtime service failed" err="rpc error: code = Unimplemented desc = unknown method RuntimeConfig for service runtime.v1.RuntimeService" Sep 13 00:08:07.827045 kubelet[2228]: I0913 00:08:07.827036 2228 server.go:1423] "CRI implementation should be updated to support RuntimeConfig when KubeletCgroupDriverFromCRI feature gate has been enabled. Falling back to using cgroupDriver from kubelet config." Sep 13 00:08:07.831563 kubelet[2228]: I0913 00:08:07.831539 2228 server.go:782] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Sep 13 00:08:07.835788 kubelet[2228]: I0913 00:08:07.835731 2228 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Sep 13 00:08:07.838757 kubelet[2228]: I0913 00:08:07.835776 2228 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4081-3-5-n-294a4568b6","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Sep 13 00:08:07.838757 kubelet[2228]: I0913 00:08:07.838751 2228 topology_manager.go:138] "Creating topology manager with none policy" Sep 13 00:08:07.838998 kubelet[2228]: I0913 00:08:07.838767 2228 container_manager_linux.go:303] "Creating device plugin manager" Sep 13 00:08:07.839652 kubelet[2228]: I0913 00:08:07.839619 2228 state_mem.go:36] "Initialized new in-memory state store" Sep 13 00:08:07.843014 kubelet[2228]: I0913 00:08:07.842988 2228 kubelet.go:480] "Attempting to sync node with API server" Sep 13 00:08:07.843014 kubelet[2228]: I0913 00:08:07.843014 2228 kubelet.go:375] "Adding static pod path" path="/etc/kubernetes/manifests" Sep 13 00:08:07.844281 kubelet[2228]: I0913 00:08:07.844045 2228 kubelet.go:386] "Adding apiserver pod source" Sep 13 00:08:07.846450 kubelet[2228]: I0913 00:08:07.846042 2228 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Sep 13 00:08:07.850243 kubelet[2228]: E0913 00:08:07.849589 2228 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: Get \"https://157.180.121.11:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4081-3-5-n-294a4568b6&limit=500&resourceVersion=0\": dial tcp 157.180.121.11:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Sep 13 00:08:07.855639 kubelet[2228]: E0913 00:08:07.855491 2228 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: Get \"https://157.180.121.11:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 157.180.121.11:6443: connect: connection refused" logger="UnhandledError" 
reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Sep 13 00:08:07.856176 kubelet[2228]: I0913 00:08:07.856134 2228 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="containerd" version="v1.7.21" apiVersion="v1" Sep 13 00:08:07.857410 kubelet[2228]: I0913 00:08:07.856837 2228 kubelet.go:935] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Sep 13 00:08:07.857628 kubelet[2228]: W0913 00:08:07.857590 2228 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. Sep 13 00:08:07.862952 kubelet[2228]: I0913 00:08:07.862279 2228 watchdog_linux.go:99] "Systemd watchdog is not enabled" Sep 13 00:08:07.862952 kubelet[2228]: I0913 00:08:07.862329 2228 server.go:1289] "Started kubelet" Sep 13 00:08:07.866450 kubelet[2228]: I0913 00:08:07.865754 2228 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Sep 13 00:08:07.866450 kubelet[2228]: I0913 00:08:07.866186 2228 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Sep 13 00:08:07.869202 kubelet[2228]: I0913 00:08:07.869185 2228 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Sep 13 00:08:07.873583 kubelet[2228]: E0913 00:08:07.872280 2228 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://157.180.121.11:6443/api/v1/namespaces/default/events\": dial tcp 157.180.121.11:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4081-3-5-n-294a4568b6.1864aeed14e3ebc5 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4081-3-5-n-294a4568b6,UID:ci-4081-3-5-n-294a4568b6,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4081-3-5-n-294a4568b6,},FirstTimestamp:2025-09-13 00:08:07.862299589 +0000 UTC m=+0.622409367,LastTimestamp:2025-09-13 00:08:07.862299589 +0000 UTC m=+0.622409367,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4081-3-5-n-294a4568b6,}" Sep 13 00:08:07.875147 kubelet[2228]: I0913 00:08:07.875114 2228 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Sep 13 00:08:07.877546 kubelet[2228]: I0913 00:08:07.877212 2228 server.go:317] "Adding debug handlers to kubelet server" Sep 13 00:08:07.877692 kubelet[2228]: I0913 00:08:07.877657 2228 kubelet_network_linux.go:49] "Initialized iptables rules." 
protocol="IPv6" Sep 13 00:08:07.878133 kubelet[2228]: I0913 00:08:07.878110 2228 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Sep 13 00:08:07.880969 kubelet[2228]: I0913 00:08:07.880896 2228 volume_manager.go:297] "Starting Kubelet Volume Manager" Sep 13 00:08:07.881185 kubelet[2228]: E0913 00:08:07.881151 2228 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4081-3-5-n-294a4568b6\" not found" Sep 13 00:08:07.881235 kubelet[2228]: I0913 00:08:07.881224 2228 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Sep 13 00:08:07.881334 kubelet[2228]: I0913 00:08:07.881311 2228 reconciler.go:26] "Reconciler: start to sync state" Sep 13 00:08:07.882548 kubelet[2228]: E0913 00:08:07.882502 2228 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: Get \"https://157.180.121.11:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 157.180.121.11:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Sep 13 00:08:07.882593 kubelet[2228]: E0913 00:08:07.882575 2228 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://157.180.121.11:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4081-3-5-n-294a4568b6?timeout=10s\": dial tcp 157.180.121.11:6443: connect: connection refused" interval="200ms" Sep 13 00:08:07.883094 kubelet[2228]: I0913 00:08:07.882792 2228 factory.go:223] Registration of the systemd container factory successfully Sep 13 00:08:07.883094 kubelet[2228]: I0913 00:08:07.882893 2228 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Sep 13 00:08:07.886452 kubelet[2228]: I0913 00:08:07.886435 2228 factory.go:223] Registration of the containerd container factory successfully Sep 13 00:08:07.891387 kubelet[2228]: E0913 00:08:07.891352 2228 kubelet.go:1600] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Sep 13 00:08:07.893034 kubelet[2228]: I0913 00:08:07.893014 2228 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Sep 13 00:08:07.893034 kubelet[2228]: I0913 00:08:07.893034 2228 status_manager.go:230] "Starting to sync pod status with apiserver" Sep 13 00:08:07.893100 kubelet[2228]: I0913 00:08:07.893055 2228 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Sep 13 00:08:07.893100 kubelet[2228]: I0913 00:08:07.893061 2228 kubelet.go:2436] "Starting kubelet main sync loop" Sep 13 00:08:07.893100 kubelet[2228]: E0913 00:08:07.893089 2228 kubelet.go:2460] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Sep 13 00:08:07.896466 kubelet[2228]: E0913 00:08:07.896437 2228 reflector.go:200] "Failed to watch" err="failed to list *v1.RuntimeClass: Get \"https://157.180.121.11:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 157.180.121.11:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" Sep 13 00:08:07.919788 kubelet[2228]: I0913 00:08:07.919753 2228 cpu_manager.go:221] "Starting CPU manager" policy="none" Sep 13 00:08:07.919788 kubelet[2228]: I0913 00:08:07.919770 2228 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Sep 13 00:08:07.919788 kubelet[2228]: I0913 00:08:07.919785 2228 state_mem.go:36] "Initialized new in-memory state store" Sep 13 00:08:07.921971 kubelet[2228]: I0913 00:08:07.921905 2228 policy_none.go:49] "None policy: Start" Sep 13 00:08:07.921971 kubelet[2228]: I0913 00:08:07.921938 2228 memory_manager.go:186] "Starting memorymanager" policy="None" Sep 13 00:08:07.921971 kubelet[2228]: I0913 00:08:07.921947 2228 state_mem.go:35] "Initializing new in-memory state store" Sep 13 00:08:07.927324 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. Sep 13 00:08:07.937840 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. Sep 13 00:08:07.941006 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. Sep 13 00:08:07.952470 kubelet[2228]: E0913 00:08:07.952149 2228 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Sep 13 00:08:07.952470 kubelet[2228]: I0913 00:08:07.952420 2228 eviction_manager.go:189] "Eviction manager: starting control loop" Sep 13 00:08:07.952470 kubelet[2228]: I0913 00:08:07.952435 2228 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Sep 13 00:08:07.952922 kubelet[2228]: I0913 00:08:07.952889 2228 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Sep 13 00:08:07.956698 kubelet[2228]: E0913 00:08:07.956566 2228 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime" Sep 13 00:08:07.956698 kubelet[2228]: E0913 00:08:07.956614 2228 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-4081-3-5-n-294a4568b6\" not found" Sep 13 00:08:07.983570 systemd[1]: Started sshd@8-157.180.121.11:22-101.126.54.167:43778.service - OpenSSH per-connection server daemon (101.126.54.167:43778). Sep 13 00:08:08.024514 systemd[1]: Created slice kubepods-burstable-pod2b01612d6c7200421e1fc5ce4bba481a.slice - libcontainer container kubepods-burstable-pod2b01612d6c7200421e1fc5ce4bba481a.slice. 
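[editor's note] The reconciler entries that follow attach hostPath volumes (ca-certs, k8s-certs, kubeconfig, usr-share-ca-certificates, flexvolume-dir) to the static control-plane pods whose kubepods-burstable slices are being created around this point. For orientation, a hedged sketch of one such volume/mount pair as it would appear in a static-pod manifest; the paths follow the usual kubeadm layout and are assumptions, since the manifests themselves are not in this log:

    # Hedged sketch: the manifest shape behind one of the
    # "VerifyControllerAttachedVolume" entries below. Paths are assumed
    # kubeadm defaults, not read from this host.
    import yaml  # assumes PyYAML

    k8s_certs = {
        "volume": {"name": "k8s-certs",
                   "hostPath": {"path": "/etc/kubernetes/pki",
                                "type": "DirectoryOrCreate"}},
        "mount": {"name": "k8s-certs",
                  "mountPath": "/etc/kubernetes/pki",
                  "readOnly": True},
    }
    print(yaml.safe_dump(k8s_certs, sort_keys=False))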
Sep 13 00:08:08.044311 kubelet[2228]: E0913 00:08:08.043327 2228 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4081-3-5-n-294a4568b6\" not found" node="ci-4081-3-5-n-294a4568b6" Sep 13 00:08:08.047963 systemd[1]: Created slice kubepods-burstable-pod46af876cb6aae2f53f3768e155faba91.slice - libcontainer container kubepods-burstable-pod46af876cb6aae2f53f3768e155faba91.slice. Sep 13 00:08:08.054849 kubelet[2228]: I0913 00:08:08.054804 2228 kubelet_node_status.go:75] "Attempting to register node" node="ci-4081-3-5-n-294a4568b6" Sep 13 00:08:08.055468 kubelet[2228]: E0913 00:08:08.055299 2228 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://157.180.121.11:6443/api/v1/nodes\": dial tcp 157.180.121.11:6443: connect: connection refused" node="ci-4081-3-5-n-294a4568b6" Sep 13 00:08:08.057397 kubelet[2228]: E0913 00:08:08.057335 2228 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4081-3-5-n-294a4568b6\" not found" node="ci-4081-3-5-n-294a4568b6" Sep 13 00:08:08.062149 systemd[1]: Created slice kubepods-burstable-poda3e6731a273bf932ff2542338e26c204.slice - libcontainer container kubepods-burstable-poda3e6731a273bf932ff2542338e26c204.slice. Sep 13 00:08:08.064646 kubelet[2228]: E0913 00:08:08.064616 2228 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4081-3-5-n-294a4568b6\" not found" node="ci-4081-3-5-n-294a4568b6" Sep 13 00:08:08.085296 kubelet[2228]: I0913 00:08:08.085247 2228 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/2b01612d6c7200421e1fc5ce4bba481a-flexvolume-dir\") pod \"kube-controller-manager-ci-4081-3-5-n-294a4568b6\" (UID: \"2b01612d6c7200421e1fc5ce4bba481a\") " pod="kube-system/kube-controller-manager-ci-4081-3-5-n-294a4568b6" Sep 13 00:08:08.085296 kubelet[2228]: I0913 00:08:08.085304 2228 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/2b01612d6c7200421e1fc5ce4bba481a-k8s-certs\") pod \"kube-controller-manager-ci-4081-3-5-n-294a4568b6\" (UID: \"2b01612d6c7200421e1fc5ce4bba481a\") " pod="kube-system/kube-controller-manager-ci-4081-3-5-n-294a4568b6" Sep 13 00:08:08.085654 kubelet[2228]: I0913 00:08:08.085367 2228 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/2b01612d6c7200421e1fc5ce4bba481a-kubeconfig\") pod \"kube-controller-manager-ci-4081-3-5-n-294a4568b6\" (UID: \"2b01612d6c7200421e1fc5ce4bba481a\") " pod="kube-system/kube-controller-manager-ci-4081-3-5-n-294a4568b6" Sep 13 00:08:08.085654 kubelet[2228]: I0913 00:08:08.085399 2228 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/2b01612d6c7200421e1fc5ce4bba481a-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4081-3-5-n-294a4568b6\" (UID: \"2b01612d6c7200421e1fc5ce4bba481a\") " pod="kube-system/kube-controller-manager-ci-4081-3-5-n-294a4568b6" Sep 13 00:08:08.085654 kubelet[2228]: I0913 00:08:08.085427 2228 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: 
\"kubernetes.io/host-path/46af876cb6aae2f53f3768e155faba91-kubeconfig\") pod \"kube-scheduler-ci-4081-3-5-n-294a4568b6\" (UID: \"46af876cb6aae2f53f3768e155faba91\") " pod="kube-system/kube-scheduler-ci-4081-3-5-n-294a4568b6" Sep 13 00:08:08.085654 kubelet[2228]: I0913 00:08:08.085453 2228 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/a3e6731a273bf932ff2542338e26c204-ca-certs\") pod \"kube-apiserver-ci-4081-3-5-n-294a4568b6\" (UID: \"a3e6731a273bf932ff2542338e26c204\") " pod="kube-system/kube-apiserver-ci-4081-3-5-n-294a4568b6" Sep 13 00:08:08.085654 kubelet[2228]: I0913 00:08:08.085485 2228 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/a3e6731a273bf932ff2542338e26c204-k8s-certs\") pod \"kube-apiserver-ci-4081-3-5-n-294a4568b6\" (UID: \"a3e6731a273bf932ff2542338e26c204\") " pod="kube-system/kube-apiserver-ci-4081-3-5-n-294a4568b6" Sep 13 00:08:08.085810 kubelet[2228]: I0913 00:08:08.085508 2228 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/a3e6731a273bf932ff2542338e26c204-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4081-3-5-n-294a4568b6\" (UID: \"a3e6731a273bf932ff2542338e26c204\") " pod="kube-system/kube-apiserver-ci-4081-3-5-n-294a4568b6" Sep 13 00:08:08.085810 kubelet[2228]: I0913 00:08:08.085534 2228 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/2b01612d6c7200421e1fc5ce4bba481a-ca-certs\") pod \"kube-controller-manager-ci-4081-3-5-n-294a4568b6\" (UID: \"2b01612d6c7200421e1fc5ce4bba481a\") " pod="kube-system/kube-controller-manager-ci-4081-3-5-n-294a4568b6" Sep 13 00:08:08.087661 kubelet[2228]: E0913 00:08:08.087612 2228 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://157.180.121.11:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4081-3-5-n-294a4568b6?timeout=10s\": dial tcp 157.180.121.11:6443: connect: connection refused" interval="400ms" Sep 13 00:08:08.258286 kubelet[2228]: I0913 00:08:08.258241 2228 kubelet_node_status.go:75] "Attempting to register node" node="ci-4081-3-5-n-294a4568b6" Sep 13 00:08:08.258638 kubelet[2228]: E0913 00:08:08.258580 2228 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://157.180.121.11:6443/api/v1/nodes\": dial tcp 157.180.121.11:6443: connect: connection refused" node="ci-4081-3-5-n-294a4568b6" Sep 13 00:08:08.344856 containerd[1501]: time="2025-09-13T00:08:08.344718530Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4081-3-5-n-294a4568b6,Uid:2b01612d6c7200421e1fc5ce4bba481a,Namespace:kube-system,Attempt:0,}" Sep 13 00:08:08.360136 containerd[1501]: time="2025-09-13T00:08:08.360089368Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4081-3-5-n-294a4568b6,Uid:46af876cb6aae2f53f3768e155faba91,Namespace:kube-system,Attempt:0,}" Sep 13 00:08:08.370772 containerd[1501]: time="2025-09-13T00:08:08.370546821Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4081-3-5-n-294a4568b6,Uid:a3e6731a273bf932ff2542338e26c204,Namespace:kube-system,Attempt:0,}" Sep 13 00:08:08.488786 kubelet[2228]: E0913 00:08:08.488717 2228 controller.go:145] "Failed to 
ensure lease exists, will retry" err="Get \"https://157.180.121.11:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4081-3-5-n-294a4568b6?timeout=10s\": dial tcp 157.180.121.11:6443: connect: connection refused" interval="800ms" Sep 13 00:08:08.660428 kubelet[2228]: I0913 00:08:08.660394 2228 kubelet_node_status.go:75] "Attempting to register node" node="ci-4081-3-5-n-294a4568b6" Sep 13 00:08:08.660766 kubelet[2228]: E0913 00:08:08.660726 2228 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://157.180.121.11:6443/api/v1/nodes\": dial tcp 157.180.121.11:6443: connect: connection refused" node="ci-4081-3-5-n-294a4568b6" Sep 13 00:08:08.666378 kubelet[2228]: E0913 00:08:08.666338 2228 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: Get \"https://157.180.121.11:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4081-3-5-n-294a4568b6&limit=500&resourceVersion=0\": dial tcp 157.180.121.11:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Sep 13 00:08:08.790053 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount517402248.mount: Deactivated successfully. Sep 13 00:08:08.798616 containerd[1501]: time="2025-09-13T00:08:08.798550186Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Sep 13 00:08:08.799764 containerd[1501]: time="2025-09-13T00:08:08.799732894Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Sep 13 00:08:08.800971 containerd[1501]: time="2025-09-13T00:08:08.800903499Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=0" Sep 13 00:08:08.801369 containerd[1501]: time="2025-09-13T00:08:08.801318248Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=0" Sep 13 00:08:08.802297 containerd[1501]: time="2025-09-13T00:08:08.802250596Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Sep 13 00:08:08.803653 containerd[1501]: time="2025-09-13T00:08:08.803611698Z" level=info msg="ImageCreate event name:\"sha256:4873874c08efc72e9729683a83ffbb7502ee729e9a5ac097723806ea7fa13517\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Sep 13 00:08:08.804200 containerd[1501]: time="2025-09-13T00:08:08.804122707Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=312078" Sep 13 00:08:08.806490 containerd[1501]: time="2025-09-13T00:08:08.806463968Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Sep 13 00:08:08.809469 containerd[1501]: time="2025-09-13T00:08:08.809374958Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4873874c08efc72e9729683a83ffbb7502ee729e9a5ac097723806ea7fa13517\", repo tag \"registry.k8s.io/pause:3.8\", repo digest 
\"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"311286\" in 449.204277ms" Sep 13 00:08:08.813477 containerd[1501]: time="2025-09-13T00:08:08.812880723Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4873874c08efc72e9729683a83ffbb7502ee729e9a5ac097723806ea7fa13517\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"311286\" in 468.053769ms" Sep 13 00:08:08.813477 containerd[1501]: time="2025-09-13T00:08:08.813364280Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4873874c08efc72e9729683a83ffbb7502ee729e9a5ac097723806ea7fa13517\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"311286\" in 442.716109ms" Sep 13 00:08:08.880221 kubelet[2228]: E0913 00:08:08.880174 2228 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: Get \"https://157.180.121.11:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 157.180.121.11:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Sep 13 00:08:08.935111 containerd[1501]: time="2025-09-13T00:08:08.934890844Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Sep 13 00:08:08.935111 containerd[1501]: time="2025-09-13T00:08:08.935036016Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Sep 13 00:08:08.935872 containerd[1501]: time="2025-09-13T00:08:08.935081852Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 13 00:08:08.936162 containerd[1501]: time="2025-09-13T00:08:08.936092157Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 13 00:08:08.941839 containerd[1501]: time="2025-09-13T00:08:08.941590900Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Sep 13 00:08:08.941839 containerd[1501]: time="2025-09-13T00:08:08.941648679Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Sep 13 00:08:08.941839 containerd[1501]: time="2025-09-13T00:08:08.941678083Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 13 00:08:08.941839 containerd[1501]: time="2025-09-13T00:08:08.941765317Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 13 00:08:08.952965 containerd[1501]: time="2025-09-13T00:08:08.952774505Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Sep 13 00:08:08.952965 containerd[1501]: time="2025-09-13T00:08:08.952827414Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Sep 13 00:08:08.954432 containerd[1501]: time="2025-09-13T00:08:08.954390336Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 13 00:08:08.956083 containerd[1501]: time="2025-09-13T00:08:08.955060061Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 13 00:08:08.971176 systemd[1]: Started cri-containerd-315552f54f8628a455bb6a05910ba78bdd5323a8d7244466a5038537aef4569d.scope - libcontainer container 315552f54f8628a455bb6a05910ba78bdd5323a8d7244466a5038537aef4569d. Sep 13 00:08:08.974056 systemd[1]: Started cri-containerd-0203571e286c334dc0af8869f3d7139b2d9186f941ef835cc74f3a9ae504982f.scope - libcontainer container 0203571e286c334dc0af8869f3d7139b2d9186f941ef835cc74f3a9ae504982f. Sep 13 00:08:08.999372 systemd[1]: Started cri-containerd-4833fe00aeeec52d76423aff34e0d706ab0f2c3692f00c7d9ff005f97b7ff768.scope - libcontainer container 4833fe00aeeec52d76423aff34e0d706ab0f2c3692f00c7d9ff005f97b7ff768. Sep 13 00:08:09.049538 containerd[1501]: time="2025-09-13T00:08:09.049342970Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4081-3-5-n-294a4568b6,Uid:46af876cb6aae2f53f3768e155faba91,Namespace:kube-system,Attempt:0,} returns sandbox id \"315552f54f8628a455bb6a05910ba78bdd5323a8d7244466a5038537aef4569d\"" Sep 13 00:08:09.061499 containerd[1501]: time="2025-09-13T00:08:09.061433626Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4081-3-5-n-294a4568b6,Uid:2b01612d6c7200421e1fc5ce4bba481a,Namespace:kube-system,Attempt:0,} returns sandbox id \"0203571e286c334dc0af8869f3d7139b2d9186f941ef835cc74f3a9ae504982f\"" Sep 13 00:08:09.066318 containerd[1501]: time="2025-09-13T00:08:09.066200326Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4081-3-5-n-294a4568b6,Uid:a3e6731a273bf932ff2542338e26c204,Namespace:kube-system,Attempt:0,} returns sandbox id \"4833fe00aeeec52d76423aff34e0d706ab0f2c3692f00c7d9ff005f97b7ff768\"" Sep 13 00:08:09.068184 containerd[1501]: time="2025-09-13T00:08:09.067899894Z" level=info msg="CreateContainer within sandbox \"315552f54f8628a455bb6a05910ba78bdd5323a8d7244466a5038537aef4569d\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Sep 13 00:08:09.070931 containerd[1501]: time="2025-09-13T00:08:09.070790275Z" level=info msg="CreateContainer within sandbox \"0203571e286c334dc0af8869f3d7139b2d9186f941ef835cc74f3a9ae504982f\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Sep 13 00:08:09.072599 containerd[1501]: time="2025-09-13T00:08:09.072473422Z" level=info msg="CreateContainer within sandbox \"4833fe00aeeec52d76423aff34e0d706ab0f2c3692f00c7d9ff005f97b7ff768\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Sep 13 00:08:09.086944 containerd[1501]: time="2025-09-13T00:08:09.086767390Z" level=info msg="CreateContainer within sandbox \"315552f54f8628a455bb6a05910ba78bdd5323a8d7244466a5038537aef4569d\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"26a2acc9b508df78313e7184612aa6c026fad194beef1fdbff26242749583e63\"" Sep 13 00:08:09.087632 containerd[1501]: time="2025-09-13T00:08:09.087108740Z" level=info msg="CreateContainer within sandbox \"4833fe00aeeec52d76423aff34e0d706ab0f2c3692f00c7d9ff005f97b7ff768\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container 
id \"3af490f9fad1145d0700825c73e3f7a75dc1597bfd6bdfb5f48172a44ddbbc7e\"" Sep 13 00:08:09.088957 containerd[1501]: time="2025-09-13T00:08:09.087927306Z" level=info msg="StartContainer for \"26a2acc9b508df78313e7184612aa6c026fad194beef1fdbff26242749583e63\"" Sep 13 00:08:09.089084 containerd[1501]: time="2025-09-13T00:08:09.089068426Z" level=info msg="StartContainer for \"3af490f9fad1145d0700825c73e3f7a75dc1597bfd6bdfb5f48172a44ddbbc7e\"" Sep 13 00:08:09.098099 containerd[1501]: time="2025-09-13T00:08:09.098028240Z" level=info msg="CreateContainer within sandbox \"0203571e286c334dc0af8869f3d7139b2d9186f941ef835cc74f3a9ae504982f\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"5d9cfaea7ec87371b1a1d4bee9a785a71a176f78533a5cecfb3e8b9c0da9bd22\"" Sep 13 00:08:09.099124 containerd[1501]: time="2025-09-13T00:08:09.099102494Z" level=info msg="StartContainer for \"5d9cfaea7ec87371b1a1d4bee9a785a71a176f78533a5cecfb3e8b9c0da9bd22\"" Sep 13 00:08:09.124082 systemd[1]: Started cri-containerd-3af490f9fad1145d0700825c73e3f7a75dc1597bfd6bdfb5f48172a44ddbbc7e.scope - libcontainer container 3af490f9fad1145d0700825c73e3f7a75dc1597bfd6bdfb5f48172a44ddbbc7e. Sep 13 00:08:09.126856 systemd[1]: Started cri-containerd-26a2acc9b508df78313e7184612aa6c026fad194beef1fdbff26242749583e63.scope - libcontainer container 26a2acc9b508df78313e7184612aa6c026fad194beef1fdbff26242749583e63. Sep 13 00:08:09.140102 systemd[1]: Started cri-containerd-5d9cfaea7ec87371b1a1d4bee9a785a71a176f78533a5cecfb3e8b9c0da9bd22.scope - libcontainer container 5d9cfaea7ec87371b1a1d4bee9a785a71a176f78533a5cecfb3e8b9c0da9bd22. Sep 13 00:08:09.160261 kubelet[2228]: E0913 00:08:09.160225 2228 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: Get \"https://157.180.121.11:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 157.180.121.11:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Sep 13 00:08:09.190014 containerd[1501]: time="2025-09-13T00:08:09.189766326Z" level=info msg="StartContainer for \"5d9cfaea7ec87371b1a1d4bee9a785a71a176f78533a5cecfb3e8b9c0da9bd22\" returns successfully" Sep 13 00:08:09.206688 containerd[1501]: time="2025-09-13T00:08:09.206276511Z" level=info msg="StartContainer for \"26a2acc9b508df78313e7184612aa6c026fad194beef1fdbff26242749583e63\" returns successfully" Sep 13 00:08:09.206688 containerd[1501]: time="2025-09-13T00:08:09.206359536Z" level=info msg="StartContainer for \"3af490f9fad1145d0700825c73e3f7a75dc1597bfd6bdfb5f48172a44ddbbc7e\" returns successfully" Sep 13 00:08:09.290113 kubelet[2228]: E0913 00:08:09.290041 2228 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://157.180.121.11:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4081-3-5-n-294a4568b6?timeout=10s\": dial tcp 157.180.121.11:6443: connect: connection refused" interval="1.6s" Sep 13 00:08:09.464749 kubelet[2228]: I0913 00:08:09.464128 2228 kubelet_node_status.go:75] "Attempting to register node" node="ci-4081-3-5-n-294a4568b6" Sep 13 00:08:09.464749 kubelet[2228]: E0913 00:08:09.464519 2228 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://157.180.121.11:6443/api/v1/nodes\": dial tcp 157.180.121.11:6443: connect: connection refused" node="ci-4081-3-5-n-294a4568b6" Sep 13 00:08:09.472876 kubelet[2228]: E0913 00:08:09.472813 2228 reflector.go:200] "Failed to watch" err="failed 
to list *v1.RuntimeClass: Get \"https://157.180.121.11:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 157.180.121.11:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" Sep 13 00:08:09.926425 kubelet[2228]: E0913 00:08:09.926392 2228 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4081-3-5-n-294a4568b6\" not found" node="ci-4081-3-5-n-294a4568b6" Sep 13 00:08:09.926790 kubelet[2228]: E0913 00:08:09.926763 2228 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4081-3-5-n-294a4568b6\" not found" node="ci-4081-3-5-n-294a4568b6" Sep 13 00:08:09.929758 kubelet[2228]: E0913 00:08:09.929734 2228 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4081-3-5-n-294a4568b6\" not found" node="ci-4081-3-5-n-294a4568b6" Sep 13 00:08:10.940058 kubelet[2228]: E0913 00:08:10.940025 2228 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4081-3-5-n-294a4568b6\" not found" node="ci-4081-3-5-n-294a4568b6" Sep 13 00:08:10.940680 kubelet[2228]: E0913 00:08:10.940373 2228 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4081-3-5-n-294a4568b6\" not found" node="ci-4081-3-5-n-294a4568b6" Sep 13 00:08:10.942089 kubelet[2228]: E0913 00:08:10.942074 2228 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4081-3-5-n-294a4568b6\" not found" node="ci-4081-3-5-n-294a4568b6" Sep 13 00:08:11.024436 kubelet[2228]: E0913 00:08:11.024370 2228 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ci-4081-3-5-n-294a4568b6\" not found" node="ci-4081-3-5-n-294a4568b6" Sep 13 00:08:11.067035 kubelet[2228]: I0913 00:08:11.067006 2228 kubelet_node_status.go:75] "Attempting to register node" node="ci-4081-3-5-n-294a4568b6" Sep 13 00:08:11.085652 kubelet[2228]: I0913 00:08:11.085442 2228 kubelet_node_status.go:78] "Successfully registered node" node="ci-4081-3-5-n-294a4568b6" Sep 13 00:08:11.182660 kubelet[2228]: I0913 00:08:11.182584 2228 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4081-3-5-n-294a4568b6" Sep 13 00:08:11.193515 kubelet[2228]: E0913 00:08:11.193340 2228 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4081-3-5-n-294a4568b6\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-ci-4081-3-5-n-294a4568b6" Sep 13 00:08:11.193515 kubelet[2228]: I0913 00:08:11.193388 2228 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4081-3-5-n-294a4568b6" Sep 13 00:08:11.196184 kubelet[2228]: E0913 00:08:11.196131 2228 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-controller-manager-ci-4081-3-5-n-294a4568b6\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-controller-manager-ci-4081-3-5-n-294a4568b6" Sep 13 00:08:11.196184 kubelet[2228]: I0913 00:08:11.196160 2228 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4081-3-5-n-294a4568b6" Sep 13 00:08:11.197961 kubelet[2228]: E0913 00:08:11.197935 2228 kubelet.go:3311] "Failed 
creating a mirror pod" err="pods \"kube-scheduler-ci-4081-3-5-n-294a4568b6\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-ci-4081-3-5-n-294a4568b6" Sep 13 00:08:11.406177 sshd[2264]: Invalid user ftp_id from 101.126.54.167 port 43778 Sep 13 00:08:11.826937 sshd[2264]: Received disconnect from 101.126.54.167 port 43778:11: Bye Bye [preauth] Sep 13 00:08:11.826937 sshd[2264]: Disconnected from invalid user ftp_id 101.126.54.167 port 43778 [preauth] Sep 13 00:08:11.828384 systemd[1]: sshd@8-157.180.121.11:22-101.126.54.167:43778.service: Deactivated successfully. Sep 13 00:08:11.853130 kubelet[2228]: I0913 00:08:11.853074 2228 apiserver.go:52] "Watching apiserver" Sep 13 00:08:11.881398 kubelet[2228]: I0913 00:08:11.881324 2228 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Sep 13 00:08:13.181641 systemd[1]: Reloading requested from client PID 2520 ('systemctl') (unit session-7.scope)... Sep 13 00:08:13.181661 systemd[1]: Reloading... Sep 13 00:08:13.239045 zram_generator::config[2560]: No configuration found. Sep 13 00:08:13.334599 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Sep 13 00:08:13.405226 systemd[1]: Reloading finished in 223 ms. Sep 13 00:08:13.439956 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Sep 13 00:08:13.456652 systemd[1]: kubelet.service: Deactivated successfully. Sep 13 00:08:13.457157 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Sep 13 00:08:13.463539 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 13 00:08:13.575143 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 13 00:08:13.579851 (kubelet)[2611]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Sep 13 00:08:13.634191 kubelet[2611]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 13 00:08:13.634191 kubelet[2611]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Sep 13 00:08:13.634191 kubelet[2611]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
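[editor's note] The three mirror-pod failures above ("no PriorityClass with name system-node-critical was found") are transient: the built-in classes system-node-critical and system-cluster-critical are created by the apiserver itself shortly after it becomes healthy, after which the mirror pods are accepted. A hedged check once the API answers, assuming kubectl and the conventional kubeadm admin kubeconfig are present on this host (neither appears in the log):

    # Hedged sketch: confirm the built-in priority classes exist once the
    # apiserver is reachable. kubectl and the admin.conf path are assumptions.
    import subprocess

    subprocess.run(
        ["kubectl", "--kubeconfig", "/etc/kubernetes/admin.conf",
         "get", "priorityclass",
         "system-node-critical", "system-cluster-critical"],
        check=True,
    )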
Sep 13 00:08:13.634534 kubelet[2611]: I0913 00:08:13.634222 2611 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Sep 13 00:08:13.643791 kubelet[2611]: I0913 00:08:13.643748 2611 server.go:530] "Kubelet version" kubeletVersion="v1.33.0"
Sep 13 00:08:13.644427 kubelet[2611]: I0913 00:08:13.643977 2611 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Sep 13 00:08:13.645295 kubelet[2611]: I0913 00:08:13.645272 2611 server.go:956] "Client rotation is on, will bootstrap in background"
Sep 13 00:08:13.647057 kubelet[2611]: I0913 00:08:13.647023 2611 certificate_store.go:147] "Loading cert/key pair from a file" filePath="/var/lib/kubelet/pki/kubelet-client-current.pem"
Sep 13 00:08:13.652949 kubelet[2611]: I0913 00:08:13.652810 2611 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
Sep 13 00:08:13.662165 kubelet[2611]: E0913 00:08:13.662135 2611 log.go:32] "RuntimeConfig from runtime service failed" err="rpc error: code = Unimplemented desc = unknown method RuntimeConfig for service runtime.v1.RuntimeService"
Sep 13 00:08:13.662412 kubelet[2611]: I0913 00:08:13.662311 2611 server.go:1423] "CRI implementation should be updated to support RuntimeConfig when KubeletCgroupDriverFromCRI feature gate has been enabled. Falling back to using cgroupDriver from kubelet config."
Sep 13 00:08:13.664947 kubelet[2611]: I0913 00:08:13.664933 2611 server.go:782] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /"
Sep 13 00:08:13.665423 kubelet[2611]: I0913 00:08:13.665190 2611 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Sep 13 00:08:13.665423 kubelet[2611]: I0913 00:08:13.665215 2611 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4081-3-5-n-294a4568b6","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Sep 13 00:08:13.665423 kubelet[2611]: I0913 00:08:13.665352 2611 topology_manager.go:138] "Creating topology manager with none policy"
Sep 13 00:08:13.665423 kubelet[2611]: I0913 00:08:13.665360 2611 container_manager_linux.go:303] "Creating device plugin manager"
Sep 13 00:08:13.666645 kubelet[2611]: I0913 00:08:13.666591 2611 state_mem.go:36] "Initialized new in-memory state store"
Sep 13 00:08:13.668836 kubelet[2611]: I0913 00:08:13.668749 2611 kubelet.go:480] "Attempting to sync node with API server"
Sep 13 00:08:13.668836 kubelet[2611]: I0913 00:08:13.668767 2611 kubelet.go:375] "Adding static pod path" path="/etc/kubernetes/manifests"
Sep 13 00:08:13.668836 kubelet[2611]: I0913 00:08:13.668786 2611 kubelet.go:386] "Adding apiserver pod source"
Sep 13 00:08:13.668836 kubelet[2611]: I0913 00:08:13.668802 2611 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Sep 13 00:08:13.677746 kubelet[2611]: I0913 00:08:13.676470 2611 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="containerd" version="v1.7.21" apiVersion="v1"
Sep 13 00:08:13.677746 kubelet[2611]: I0913 00:08:13.676903 2611 kubelet.go:935] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Sep 13 00:08:13.684059 kubelet[2611]: I0913 00:08:13.684032 2611 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Sep 13 00:08:13.684160 kubelet[2611]: I0913 00:08:13.684124 2611 server.go:1289] "Started kubelet"
Sep 13 00:08:13.688892 kubelet[2611]: I0913 00:08:13.688834 2611 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Sep 13 00:08:13.690633 kubelet[2611]: I0913 00:08:13.690569 2611 server.go:317] "Adding debug handlers to kubelet server"
Sep 13 00:08:13.691373 kubelet[2611]: I0913 00:08:13.690722 2611 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Sep 13 00:08:13.696802 kubelet[2611]: I0913 00:08:13.696720 2611 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Sep 13 00:08:13.697337 kubelet[2611]: I0913 00:08:13.697248 2611 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Sep 13 00:08:13.697684 kubelet[2611]: I0913 00:08:13.697476 2611 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Sep 13 00:08:13.699520 kubelet[2611]: I0913 00:08:13.699494 2611 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key"
Sep 13 00:08:13.702800 kubelet[2611]: I0913 00:08:13.701537 2611 volume_manager.go:297] "Starting Kubelet Volume Manager"
Sep 13 00:08:13.702800 kubelet[2611]: I0913 00:08:13.701784 2611 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Sep 13 00:08:13.702800 kubelet[2611]: I0913 00:08:13.702003 2611 reconciler.go:26] "Reconciler: start to sync state"
Sep 13 00:08:13.706464 kubelet[2611]: I0913 00:08:13.706029 2611 factory.go:223] Registration of the systemd container factory successfully
Sep 13 00:08:13.706464 kubelet[2611]: I0913 00:08:13.706140 2611 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory
Sep 13 00:08:13.706645 kubelet[2611]: E0913 00:08:13.706625 2611 kubelet.go:1600] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem"
Sep 13 00:08:13.709797 kubelet[2611]: I0913 00:08:13.708905 2611 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Sep 13 00:08:13.709797 kubelet[2611]: I0913 00:08:13.708941 2611 status_manager.go:230] "Starting to sync pod status with apiserver"
Sep 13 00:08:13.709797 kubelet[2611]: I0913 00:08:13.708964 2611 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Sep 13 00:08:13.709797 kubelet[2611]: I0913 00:08:13.708971 2611 kubelet.go:2436] "Starting kubelet main sync loop"
Sep 13 00:08:13.709797 kubelet[2611]: E0913 00:08:13.709003 2611 kubelet.go:2460] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Sep 13 00:08:13.713644 kubelet[2611]: I0913 00:08:13.713424 2611 factory.go:223] Registration of the containerd container factory successfully
Sep 13 00:08:13.764688 kubelet[2611]: I0913 00:08:13.764666 2611 cpu_manager.go:221] "Starting CPU manager" policy="none"
Sep 13 00:08:13.765573 kubelet[2611]: I0913 00:08:13.764813 2611 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s"
Sep 13 00:08:13.765573 kubelet[2611]: I0913 00:08:13.764830 2611 state_mem.go:36] "Initialized new in-memory state store"
Sep 13 00:08:13.765573 kubelet[2611]: I0913 00:08:13.764989 2611 state_mem.go:88] "Updated default CPUSet" cpuSet=""
Sep 13 00:08:13.765573 kubelet[2611]: I0913 00:08:13.764997 2611 state_mem.go:96] "Updated CPUSet assignments" assignments={}
Sep 13 00:08:13.765573 kubelet[2611]: I0913 00:08:13.765012 2611 policy_none.go:49] "None policy: Start"
Sep 13 00:08:13.765573 kubelet[2611]: I0913 00:08:13.765021 2611 memory_manager.go:186] "Starting memorymanager" policy="None"
Sep 13 00:08:13.765573 kubelet[2611]: I0913 00:08:13.765028 2611 state_mem.go:35] "Initializing new in-memory state store"
Sep 13 00:08:13.765573 kubelet[2611]: I0913 00:08:13.765097 2611 state_mem.go:75] "Updated machine memory state"
Sep 13 00:08:13.768580 kubelet[2611]: E0913 00:08:13.768563 2611 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Sep 13 00:08:13.769049 kubelet[2611]: I0913 00:08:13.769029 2611 eviction_manager.go:189] "Eviction manager: starting control loop"
Sep 13 00:08:13.769105 kubelet[2611]: I0913 00:08:13.769048 2611 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Sep 13 00:08:13.769362 kubelet[2611]: I0913 00:08:13.769344 2611 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Sep 13 00:08:13.774441 kubelet[2611]: E0913 00:08:13.773066 2611 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime"
Sep 13 00:08:13.810312 kubelet[2611]: I0913 00:08:13.810278 2611 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4081-3-5-n-294a4568b6"
Sep 13 00:08:13.810613 kubelet[2611]: I0913 00:08:13.810278 2611 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4081-3-5-n-294a4568b6"
Sep 13 00:08:13.810762 kubelet[2611]: I0913 00:08:13.810440 2611 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4081-3-5-n-294a4568b6"
Sep 13 00:08:13.881346 kubelet[2611]: I0913 00:08:13.881310 2611 kubelet_node_status.go:75] "Attempting to register node" node="ci-4081-3-5-n-294a4568b6"
Sep 13 00:08:13.889665 kubelet[2611]: I0913 00:08:13.889636 2611 kubelet_node_status.go:124] "Node was previously registered" node="ci-4081-3-5-n-294a4568b6"
Sep 13 00:08:13.889838 kubelet[2611]: I0913 00:08:13.889707 2611 kubelet_node_status.go:78] "Successfully registered node" node="ci-4081-3-5-n-294a4568b6"
Sep 13 00:08:14.003655 kubelet[2611]: I0913 00:08:14.003547 2611 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/a3e6731a273bf932ff2542338e26c204-ca-certs\") pod \"kube-apiserver-ci-4081-3-5-n-294a4568b6\" (UID: \"a3e6731a273bf932ff2542338e26c204\") " pod="kube-system/kube-apiserver-ci-4081-3-5-n-294a4568b6"
Sep 13 00:08:14.003655 kubelet[2611]: I0913 00:08:14.003584 2611 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/a3e6731a273bf932ff2542338e26c204-k8s-certs\") pod \"kube-apiserver-ci-4081-3-5-n-294a4568b6\" (UID: \"a3e6731a273bf932ff2542338e26c204\") " pod="kube-system/kube-apiserver-ci-4081-3-5-n-294a4568b6"
Sep 13 00:08:14.003655 kubelet[2611]: I0913 00:08:14.003607 2611 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/2b01612d6c7200421e1fc5ce4bba481a-flexvolume-dir\") pod \"kube-controller-manager-ci-4081-3-5-n-294a4568b6\" (UID: \"2b01612d6c7200421e1fc5ce4bba481a\") " pod="kube-system/kube-controller-manager-ci-4081-3-5-n-294a4568b6"
Sep 13 00:08:14.003655 kubelet[2611]: I0913 00:08:14.003626 2611 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/2b01612d6c7200421e1fc5ce4bba481a-kubeconfig\") pod \"kube-controller-manager-ci-4081-3-5-n-294a4568b6\" (UID: \"2b01612d6c7200421e1fc5ce4bba481a\") " pod="kube-system/kube-controller-manager-ci-4081-3-5-n-294a4568b6"
Sep 13 00:08:14.003655 kubelet[2611]: I0913 00:08:14.003646 2611 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/2b01612d6c7200421e1fc5ce4bba481a-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4081-3-5-n-294a4568b6\" (UID: \"2b01612d6c7200421e1fc5ce4bba481a\") " pod="kube-system/kube-controller-manager-ci-4081-3-5-n-294a4568b6"
Sep 13 00:08:14.003846 kubelet[2611]: I0913 00:08:14.003692 2611 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/46af876cb6aae2f53f3768e155faba91-kubeconfig\") pod \"kube-scheduler-ci-4081-3-5-n-294a4568b6\" (UID: \"46af876cb6aae2f53f3768e155faba91\") " pod="kube-system/kube-scheduler-ci-4081-3-5-n-294a4568b6"
Sep 13 00:08:14.003846 kubelet[2611]: I0913 00:08:14.003720 2611 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/a3e6731a273bf932ff2542338e26c204-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4081-3-5-n-294a4568b6\" (UID: \"a3e6731a273bf932ff2542338e26c204\") " pod="kube-system/kube-apiserver-ci-4081-3-5-n-294a4568b6"
Sep 13 00:08:14.003846 kubelet[2611]: I0913 00:08:14.003738 2611 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/2b01612d6c7200421e1fc5ce4bba481a-ca-certs\") pod \"kube-controller-manager-ci-4081-3-5-n-294a4568b6\" (UID: \"2b01612d6c7200421e1fc5ce4bba481a\") " pod="kube-system/kube-controller-manager-ci-4081-3-5-n-294a4568b6"
Sep 13 00:08:14.003846 kubelet[2611]: I0913 00:08:14.003756 2611 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/2b01612d6c7200421e1fc5ce4bba481a-k8s-certs\") pod \"kube-controller-manager-ci-4081-3-5-n-294a4568b6\" (UID: \"2b01612d6c7200421e1fc5ce4bba481a\") " pod="kube-system/kube-controller-manager-ci-4081-3-5-n-294a4568b6"
Sep 13 00:08:14.677730 kubelet[2611]: I0913 00:08:14.675998 2611 apiserver.go:52] "Watching apiserver"
Sep 13 00:08:14.703300 kubelet[2611]: I0913 00:08:14.702771 2611 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Sep 13 00:08:14.750935 kubelet[2611]: I0913 00:08:14.749417 2611 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4081-3-5-n-294a4568b6"
Sep 13 00:08:14.755942 kubelet[2611]: E0913 00:08:14.755883 2611 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4081-3-5-n-294a4568b6\" already exists" pod="kube-system/kube-apiserver-ci-4081-3-5-n-294a4568b6"
Sep 13 00:08:14.783524 kubelet[2611]: I0913 00:08:14.783279 2611 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-ci-4081-3-5-n-294a4568b6" podStartSLOduration=1.7832567030000002 podStartE2EDuration="1.783256703s" podCreationTimestamp="2025-09-13 00:08:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-13 00:08:14.769692293 +0000 UTC m=+1.183725591" watchObservedRunningTime="2025-09-13 00:08:14.783256703 +0000 UTC m=+1.197290001"
Sep 13 00:08:14.796198 kubelet[2611]: I0913 00:08:14.796028 2611 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ci-4081-3-5-n-294a4568b6" podStartSLOduration=1.795990776 podStartE2EDuration="1.795990776s" podCreationTimestamp="2025-09-13 00:08:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-13 00:08:14.78411296 +0000 UTC m=+1.198146277" watchObservedRunningTime="2025-09-13 00:08:14.795990776 +0000 UTC m=+1.210024093"
Sep 13 00:08:14.796198 kubelet[2611]: I0913 00:08:14.796151 2611 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-ci-4081-3-5-n-294a4568b6" podStartSLOduration=1.796140377 podStartE2EDuration="1.796140377s" podCreationTimestamp="2025-09-13 00:08:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-13 00:08:14.794994187 +0000 UTC m=+1.209027504" watchObservedRunningTime="2025-09-13 00:08:14.796140377 +0000 UTC m=+1.210173704"
Sep 13 00:08:18.246317 kubelet[2611]: I0913 00:08:18.246261 2611 kuberuntime_manager.go:1746] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24"
Sep 13 00:08:18.248534 containerd[1501]: time="2025-09-13T00:08:18.248474665Z" level=info msg="No cni config template is specified, wait for other system components to drop the config."
Sep 13 00:08:18.248866 kubelet[2611]: I0913 00:08:18.248686 2611 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24"
Sep 13 00:08:18.734270 systemd[1]: Created slice kubepods-besteffort-pod40f4e968_5762_49d9_b041_daf27f0b3d0c.slice - libcontainer container kubepods-besteffort-pod40f4e968_5762_49d9_b041_daf27f0b3d0c.slice.
Sep 13 00:08:18.736749 kubelet[2611]: I0913 00:08:18.736706 2611 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/40f4e968-5762-49d9-b041-daf27f0b3d0c-kube-proxy\") pod \"kube-proxy-vxlgp\" (UID: \"40f4e968-5762-49d9-b041-daf27f0b3d0c\") " pod="kube-system/kube-proxy-vxlgp"
Sep 13 00:08:18.736843 kubelet[2611]: I0913 00:08:18.736763 2611 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/40f4e968-5762-49d9-b041-daf27f0b3d0c-lib-modules\") pod \"kube-proxy-vxlgp\" (UID: \"40f4e968-5762-49d9-b041-daf27f0b3d0c\") " pod="kube-system/kube-proxy-vxlgp"
Sep 13 00:08:18.736843 kubelet[2611]: I0913 00:08:18.736783 2611 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-462tk\" (UniqueName: \"kubernetes.io/projected/40f4e968-5762-49d9-b041-daf27f0b3d0c-kube-api-access-462tk\") pod \"kube-proxy-vxlgp\" (UID: \"40f4e968-5762-49d9-b041-daf27f0b3d0c\") " pod="kube-system/kube-proxy-vxlgp"
Sep 13 00:08:18.736843 kubelet[2611]: I0913 00:08:18.736804 2611 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/40f4e968-5762-49d9-b041-daf27f0b3d0c-xtables-lock\") pod \"kube-proxy-vxlgp\" (UID: \"40f4e968-5762-49d9-b041-daf27f0b3d0c\") " pod="kube-system/kube-proxy-vxlgp"
Sep 13 00:08:18.844396 kubelet[2611]: E0913 00:08:18.844360 2611 projected.go:289] Couldn't get configMap kube-system/kube-root-ca.crt: configmap "kube-root-ca.crt" not found
Sep 13 00:08:18.844396 kubelet[2611]: E0913 00:08:18.844393 2611 projected.go:194] Error preparing data for projected volume kube-api-access-462tk for pod kube-system/kube-proxy-vxlgp: configmap "kube-root-ca.crt" not found
Sep 13 00:08:18.844541 kubelet[2611]: E0913 00:08:18.844466 2611 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/40f4e968-5762-49d9-b041-daf27f0b3d0c-kube-api-access-462tk podName:40f4e968-5762-49d9-b041-daf27f0b3d0c nodeName:}" failed. No retries permitted until 2025-09-13 00:08:19.344442382 +0000 UTC m=+5.758475689 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-462tk" (UniqueName: "kubernetes.io/projected/40f4e968-5762-49d9-b041-daf27f0b3d0c-kube-api-access-462tk") pod "kube-proxy-vxlgp" (UID: "40f4e968-5762-49d9-b041-daf27f0b3d0c") : configmap "kube-root-ca.crt" not found
Sep 13 00:08:19.469289 systemd[1]: Created slice kubepods-besteffort-pod7102f3d0_c242_4316_8a3c_1fb9fdea0dda.slice - libcontainer container kubepods-besteffort-pod7102f3d0_c242_4316_8a3c_1fb9fdea0dda.slice.
Sep 13 00:08:19.544871 kubelet[2611]: I0913 00:08:19.544795 2611 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/7102f3d0-c242-4316-8a3c-1fb9fdea0dda-var-lib-calico\") pod \"tigera-operator-755d956888-g9qb5\" (UID: \"7102f3d0-c242-4316-8a3c-1fb9fdea0dda\") " pod="tigera-operator/tigera-operator-755d956888-g9qb5"
Sep 13 00:08:19.544871 kubelet[2611]: I0913 00:08:19.544870 2611 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dh8z6\" (UniqueName: \"kubernetes.io/projected/7102f3d0-c242-4316-8a3c-1fb9fdea0dda-kube-api-access-dh8z6\") pod \"tigera-operator-755d956888-g9qb5\" (UID: \"7102f3d0-c242-4316-8a3c-1fb9fdea0dda\") " pod="tigera-operator/tigera-operator-755d956888-g9qb5"
Sep 13 00:08:19.643897 containerd[1501]: time="2025-09-13T00:08:19.643824504Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-vxlgp,Uid:40f4e968-5762-49d9-b041-daf27f0b3d0c,Namespace:kube-system,Attempt:0,}"
Sep 13 00:08:19.677116 containerd[1501]: time="2025-09-13T00:08:19.676198429Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Sep 13 00:08:19.677116 containerd[1501]: time="2025-09-13T00:08:19.676289897Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Sep 13 00:08:19.677116 containerd[1501]: time="2025-09-13T00:08:19.676326729Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Sep 13 00:08:19.677116 containerd[1501]: time="2025-09-13T00:08:19.676425411Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Sep 13 00:08:19.704218 systemd[1]: Started cri-containerd-30c16441f78d2afb08e9ff2ae5a249c245781b8e866e69788f7b176b2c9ae0f8.scope - libcontainer container 30c16441f78d2afb08e9ff2ae5a249c245781b8e866e69788f7b176b2c9ae0f8.
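The durationBeforeRetry 500ms in the MountVolume failure above is the first step of an exponential backoff: each consecutive failure of the same volume operation lengthens the wait before the kubelet tries again, up to a cap. A sketch of that schedule (the initial 500ms is taken from the log; the doubling factor and the cap are assumptions about the kubelet's nestedpendingoperations backoff, not values read from this log):

    package main

    import (
        "fmt"
        "time"
    )

    func main() {
        wait := 500 * time.Millisecond            // initial delay, as logged: durationBeforeRetry 500ms
        maxWait := 2*time.Minute + 2*time.Second // assumed upper bound on the backoff

        for attempt := 1; attempt <= 8; attempt++ {
            fmt.Printf("failure %d: next retry in %v\n", attempt, wait)
            wait *= 2 // assumed doubling per consecutive failure
            if wait > maxWait {
                wait = maxWait
            }
        }
    }

In this boot the backoff never got past its first step: the retry was permitted from 00:08:19.344, kube-root-ca.crt had appeared by then, and the kube-proxy sandbox is already up at 00:08:19.7.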
Sep 13 00:08:19.726136 containerd[1501]: time="2025-09-13T00:08:19.725683280Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-vxlgp,Uid:40f4e968-5762-49d9-b041-daf27f0b3d0c,Namespace:kube-system,Attempt:0,} returns sandbox id \"30c16441f78d2afb08e9ff2ae5a249c245781b8e866e69788f7b176b2c9ae0f8\""
Sep 13 00:08:19.732628 containerd[1501]: time="2025-09-13T00:08:19.732588687Z" level=info msg="CreateContainer within sandbox \"30c16441f78d2afb08e9ff2ae5a249c245781b8e866e69788f7b176b2c9ae0f8\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}"
Sep 13 00:08:19.745922 containerd[1501]: time="2025-09-13T00:08:19.745862395Z" level=info msg="CreateContainer within sandbox \"30c16441f78d2afb08e9ff2ae5a249c245781b8e866e69788f7b176b2c9ae0f8\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"67cb38f9a8cc073906bfeda04965df9b6b13cae531b8f88b27b50dc50c64992d\""
Sep 13 00:08:19.746967 containerd[1501]: time="2025-09-13T00:08:19.746934762Z" level=info msg="StartContainer for \"67cb38f9a8cc073906bfeda04965df9b6b13cae531b8f88b27b50dc50c64992d\""
Sep 13 00:08:19.776072 systemd[1]: Started cri-containerd-67cb38f9a8cc073906bfeda04965df9b6b13cae531b8f88b27b50dc50c64992d.scope - libcontainer container 67cb38f9a8cc073906bfeda04965df9b6b13cae531b8f88b27b50dc50c64992d.
Sep 13 00:08:19.777664 containerd[1501]: time="2025-09-13T00:08:19.777051045Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-755d956888-g9qb5,Uid:7102f3d0-c242-4316-8a3c-1fb9fdea0dda,Namespace:tigera-operator,Attempt:0,}"
Sep 13 00:08:19.802859 containerd[1501]: time="2025-09-13T00:08:19.802218488Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Sep 13 00:08:19.802859 containerd[1501]: time="2025-09-13T00:08:19.802281540Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Sep 13 00:08:19.802859 containerd[1501]: time="2025-09-13T00:08:19.802306930Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Sep 13 00:08:19.802859 containerd[1501]: time="2025-09-13T00:08:19.802375032Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Sep 13 00:08:19.828502 systemd[1]: Started cri-containerd-f3aedd9214944d7bc2567f5189347a66ee73b93a8763a391ac77d0888ad8f434.scope - libcontainer container f3aedd9214944d7bc2567f5189347a66ee73b93a8763a391ac77d0888ad8f434.
Sep 13 00:08:19.832677 containerd[1501]: time="2025-09-13T00:08:19.832545469Z" level=info msg="StartContainer for \"67cb38f9a8cc073906bfeda04965df9b6b13cae531b8f88b27b50dc50c64992d\" returns successfully"
Sep 13 00:08:19.874829 containerd[1501]: time="2025-09-13T00:08:19.874755835Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-755d956888-g9qb5,Uid:7102f3d0-c242-4316-8a3c-1fb9fdea0dda,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"f3aedd9214944d7bc2567f5189347a66ee73b93a8763a391ac77d0888ad8f434\""
Sep 13 00:08:19.877412 containerd[1501]: time="2025-09-13T00:08:19.877367607Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.6\""
Sep 13 00:08:20.454679 systemd[1]: run-containerd-runc-k8s.io-30c16441f78d2afb08e9ff2ae5a249c245781b8e866e69788f7b176b2c9ae0f8-runc.fv7yY6.mount: Deactivated successfully.
Sep 13 00:08:21.905396 kubelet[2611]: I0913 00:08:21.905321 2611 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-vxlgp" podStartSLOduration=3.905301252 podStartE2EDuration="3.905301252s" podCreationTimestamp="2025-09-13 00:08:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-13 00:08:20.770413732 +0000 UTC m=+7.184447039" watchObservedRunningTime="2025-09-13 00:08:21.905301252 +0000 UTC m=+8.319334548"
Sep 13 00:08:22.009361 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount336736918.mount: Deactivated successfully.
Sep 13 00:08:22.424148 containerd[1501]: time="2025-09-13T00:08:22.424070320Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.38.6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 13 00:08:22.425462 containerd[1501]: time="2025-09-13T00:08:22.425305159Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.38.6: active requests=0, bytes read=25062609"
Sep 13 00:08:22.427945 containerd[1501]: time="2025-09-13T00:08:22.426264073Z" level=info msg="ImageCreate event name:\"sha256:1911afdd8478c6ca3036ff85614050d5d19acc0f0c3f6a5a7b3e34b38dd309c9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 13 00:08:22.429000 containerd[1501]: time="2025-09-13T00:08:22.428958772Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:00a7a9b62f9b9a4e0856128b078539783b8352b07f707bff595cb604cc580f6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 13 00:08:22.430053 containerd[1501]: time="2025-09-13T00:08:22.430021277Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.38.6\" with image id \"sha256:1911afdd8478c6ca3036ff85614050d5d19acc0f0c3f6a5a7b3e34b38dd309c9\", repo tag \"quay.io/tigera/operator:v1.38.6\", repo digest \"quay.io/tigera/operator@sha256:00a7a9b62f9b9a4e0856128b078539783b8352b07f707bff595cb604cc580f6e\", size \"25058604\" in 2.552617921s"
Sep 13 00:08:22.430168 containerd[1501]: time="2025-09-13T00:08:22.430139696Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.6\" returns image reference \"sha256:1911afdd8478c6ca3036ff85614050d5d19acc0f0c3f6a5a7b3e34b38dd309c9\""
Sep 13 00:08:22.436221 containerd[1501]: time="2025-09-13T00:08:22.436152722Z" level=info msg="CreateContainer within sandbox \"f3aedd9214944d7bc2567f5189347a66ee73b93a8763a391ac77d0888ad8f434\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}"
Sep 13 00:08:22.453557 containerd[1501]: time="2025-09-13T00:08:22.453483311Z" level=info msg="CreateContainer within sandbox \"f3aedd9214944d7bc2567f5189347a66ee73b93a8763a391ac77d0888ad8f434\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"79b5701b2ceb633b53145f7995275084a94d8ec47131687ba1fb115130506abb\""
Sep 13 00:08:22.456059 containerd[1501]: time="2025-09-13T00:08:22.455991189Z" level=info msg="StartContainer for \"79b5701b2ceb633b53145f7995275084a94d8ec47131687ba1fb115130506abb\""
Sep 13 00:08:22.489112 systemd[1]: Started cri-containerd-79b5701b2ceb633b53145f7995275084a94d8ec47131687ba1fb115130506abb.scope - libcontainer container 79b5701b2ceb633b53145f7995275084a94d8ec47131687ba1fb115130506abb.
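podStartSLOduration in these tracker entries is simply watchObservedRunningTime minus podCreationTimestamp, and the logged numbers reproduce exactly. A quick Go check using the kube-proxy values above (note time.Parse accepts a fractional second in the input even when the layout string omits it):

    package main

    import (
        "fmt"
        "log"
        "time"
    )

    func main() {
        const layout = "2006-01-02 15:04:05 -0700 MST"
        created, err := time.Parse(layout, "2025-09-13 00:08:18 +0000 UTC")
        if err != nil {
            log.Fatal(err)
        }
        observed, err := time.Parse(layout, "2025-09-13 00:08:21.905301252 +0000 UTC")
        if err != nil {
            log.Fatal(err)
        }
        fmt.Println(observed.Sub(created)) // 3.905301252s, matching podStartSLOduration
    }

The zero-valued firstStartedPulling/lastFinishedPulling ("0001-01-01 00:00:00") just mean no image pull was needed; for the tigera-operator pod further down, those fields are populated and the image-pull window is subtracted from the SLO duration, which is why its podStartSLOduration (2.802s) is shorter than its podStartE2EDuration (5.357s).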
Sep 13 00:08:22.518857 containerd[1501]: time="2025-09-13T00:08:22.518770372Z" level=info msg="StartContainer for \"79b5701b2ceb633b53145f7995275084a94d8ec47131687ba1fb115130506abb\" returns successfully"
Sep 13 00:08:24.356802 kubelet[2611]: I0913 00:08:24.356681 2611 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-755d956888-g9qb5" podStartSLOduration=2.802109481 podStartE2EDuration="5.35666688s" podCreationTimestamp="2025-09-13 00:08:19 +0000 UTC" firstStartedPulling="2025-09-13 00:08:19.876784763 +0000 UTC m=+6.290818060" lastFinishedPulling="2025-09-13 00:08:22.431342161 +0000 UTC m=+8.845375459" observedRunningTime="2025-09-13 00:08:22.784971517 +0000 UTC m=+9.199004844" watchObservedRunningTime="2025-09-13 00:08:24.35666688 +0000 UTC m=+10.770700177"
Sep 13 00:08:28.561751 sudo[1730]: pam_unix(sudo:session): session closed for user root
Sep 13 00:08:28.741636 sshd[1727]: pam_unix(sshd:session): session closed for user core
Sep 13 00:08:28.745231 systemd[1]: sshd@6-157.180.121.11:22-147.75.109.163:53390.service: Deactivated successfully.
Sep 13 00:08:28.747584 systemd[1]: session-7.scope: Deactivated successfully.
Sep 13 00:08:28.748014 systemd[1]: session-7.scope: Consumed 4.697s CPU time, 143.2M memory peak, 0B memory swap peak.
Sep 13 00:08:28.750010 systemd-logind[1477]: Session 7 logged out. Waiting for processes to exit.
Sep 13 00:08:28.751442 systemd-logind[1477]: Removed session 7.
Sep 13 00:08:31.881846 systemd[1]: Created slice kubepods-besteffort-pod096b2ea7_daff_4f0b_b6b5_d160ae082bbb.slice - libcontainer container kubepods-besteffort-pod096b2ea7_daff_4f0b_b6b5_d160ae082bbb.slice.
Sep 13 00:08:31.925214 kubelet[2611]: I0913 00:08:31.925007 2611 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/096b2ea7-daff-4f0b-b6b5-d160ae082bbb-tigera-ca-bundle\") pod \"calico-typha-5499d5d6c9-9wtgx\" (UID: \"096b2ea7-daff-4f0b-b6b5-d160ae082bbb\") " pod="calico-system/calico-typha-5499d5d6c9-9wtgx"
Sep 13 00:08:31.925214 kubelet[2611]: I0913 00:08:31.925050 2611 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/096b2ea7-daff-4f0b-b6b5-d160ae082bbb-typha-certs\") pod \"calico-typha-5499d5d6c9-9wtgx\" (UID: \"096b2ea7-daff-4f0b-b6b5-d160ae082bbb\") " pod="calico-system/calico-typha-5499d5d6c9-9wtgx"
Sep 13 00:08:31.925214 kubelet[2611]: I0913 00:08:31.925070 2611 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hr9jp\" (UniqueName: \"kubernetes.io/projected/096b2ea7-daff-4f0b-b6b5-d160ae082bbb-kube-api-access-hr9jp\") pod \"calico-typha-5499d5d6c9-9wtgx\" (UID: \"096b2ea7-daff-4f0b-b6b5-d160ae082bbb\") " pod="calico-system/calico-typha-5499d5d6c9-9wtgx"
Sep 13 00:08:32.194015 containerd[1501]: time="2025-09-13T00:08:32.193707483Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-5499d5d6c9-9wtgx,Uid:096b2ea7-daff-4f0b-b6b5-d160ae082bbb,Namespace:calico-system,Attempt:0,}"
Sep 13 00:08:32.248023 containerd[1501]: time="2025-09-13T00:08:32.245431144Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Sep 13 00:08:32.248023 containerd[1501]: time="2025-09-13T00:08:32.245487591Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Sep 13 00:08:32.248023 containerd[1501]: time="2025-09-13T00:08:32.245501618Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Sep 13 00:08:32.248023 containerd[1501]: time="2025-09-13T00:08:32.245564738Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Sep 13 00:08:32.256158 systemd[1]: Created slice kubepods-besteffort-poda3bdb8cf_f266_48a9_a8db_23795e9e58ef.slice - libcontainer container kubepods-besteffort-poda3bdb8cf_f266_48a9_a8db_23795e9e58ef.slice.
Sep 13 00:08:32.309037 systemd[1]: Started cri-containerd-5230d891fbb1d3a36ae5a97a51bc14276e6a73de628494017ab18b9dcbf7db07.scope - libcontainer container 5230d891fbb1d3a36ae5a97a51bc14276e6a73de628494017ab18b9dcbf7db07.
Sep 13 00:08:32.328744 kubelet[2611]: I0913 00:08:32.328700 2611 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/a3bdb8cf-f266-48a9-a8db-23795e9e58ef-cni-bin-dir\") pod \"calico-node-hkqvn\" (UID: \"a3bdb8cf-f266-48a9-a8db-23795e9e58ef\") " pod="calico-system/calico-node-hkqvn"
Sep 13 00:08:32.328744 kubelet[2611]: I0913 00:08:32.328744 2611 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/a3bdb8cf-f266-48a9-a8db-23795e9e58ef-cni-log-dir\") pod \"calico-node-hkqvn\" (UID: \"a3bdb8cf-f266-48a9-a8db-23795e9e58ef\") " pod="calico-system/calico-node-hkqvn"
Sep 13 00:08:32.328949 kubelet[2611]: I0913 00:08:32.328759 2611 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdjjm\" (UniqueName: \"kubernetes.io/projected/a3bdb8cf-f266-48a9-a8db-23795e9e58ef-kube-api-access-kdjjm\") pod \"calico-node-hkqvn\" (UID: \"a3bdb8cf-f266-48a9-a8db-23795e9e58ef\") " pod="calico-system/calico-node-hkqvn"
Sep 13 00:08:32.328949 kubelet[2611]: I0913 00:08:32.328776 2611 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/a3bdb8cf-f266-48a9-a8db-23795e9e58ef-var-lib-calico\") pod \"calico-node-hkqvn\" (UID: \"a3bdb8cf-f266-48a9-a8db-23795e9e58ef\") " pod="calico-system/calico-node-hkqvn"
Sep 13 00:08:32.328949 kubelet[2611]: I0913 00:08:32.328788 2611 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/a3bdb8cf-f266-48a9-a8db-23795e9e58ef-cni-net-dir\") pod \"calico-node-hkqvn\" (UID: \"a3bdb8cf-f266-48a9-a8db-23795e9e58ef\") " pod="calico-system/calico-node-hkqvn"
Sep 13 00:08:32.328949 kubelet[2611]: I0913 00:08:32.328803 2611 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a3bdb8cf-f266-48a9-a8db-23795e9e58ef-lib-modules\") pod \"calico-node-hkqvn\" (UID: \"a3bdb8cf-f266-48a9-a8db-23795e9e58ef\") " pod="calico-system/calico-node-hkqvn"
Sep 13 00:08:32.328949 kubelet[2611]: I0913 00:08:32.328814 2611 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/a3bdb8cf-f266-48a9-a8db-23795e9e58ef-policysync\") pod \"calico-node-hkqvn\" (UID: \"a3bdb8cf-f266-48a9-a8db-23795e9e58ef\") " pod="calico-system/calico-node-hkqvn"
Sep 13 00:08:32.329522 kubelet[2611]: I0913 00:08:32.328835 2611 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/a3bdb8cf-f266-48a9-a8db-23795e9e58ef-var-run-calico\") pod \"calico-node-hkqvn\" (UID: \"a3bdb8cf-f266-48a9-a8db-23795e9e58ef\") " pod="calico-system/calico-node-hkqvn"
Sep 13 00:08:32.329522 kubelet[2611]: I0913 00:08:32.328852 2611 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/a3bdb8cf-f266-48a9-a8db-23795e9e58ef-node-certs\") pod \"calico-node-hkqvn\" (UID: \"a3bdb8cf-f266-48a9-a8db-23795e9e58ef\") " pod="calico-system/calico-node-hkqvn"
Sep 13 00:08:32.329522 kubelet[2611]: I0913 00:08:32.328870 2611 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a3bdb8cf-f266-48a9-a8db-23795e9e58ef-tigera-ca-bundle\") pod \"calico-node-hkqvn\" (UID: \"a3bdb8cf-f266-48a9-a8db-23795e9e58ef\") " pod="calico-system/calico-node-hkqvn"
Sep 13 00:08:32.329522 kubelet[2611]: I0913 00:08:32.328881 2611 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/a3bdb8cf-f266-48a9-a8db-23795e9e58ef-xtables-lock\") pod \"calico-node-hkqvn\" (UID: \"a3bdb8cf-f266-48a9-a8db-23795e9e58ef\") " pod="calico-system/calico-node-hkqvn"
Sep 13 00:08:32.329522 kubelet[2611]: I0913 00:08:32.328898 2611 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/a3bdb8cf-f266-48a9-a8db-23795e9e58ef-flexvol-driver-host\") pod \"calico-node-hkqvn\" (UID: \"a3bdb8cf-f266-48a9-a8db-23795e9e58ef\") " pod="calico-system/calico-node-hkqvn"
Sep 13 00:08:32.427373 containerd[1501]: time="2025-09-13T00:08:32.427323083Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-5499d5d6c9-9wtgx,Uid:096b2ea7-daff-4f0b-b6b5-d160ae082bbb,Namespace:calico-system,Attempt:0,} returns sandbox id \"5230d891fbb1d3a36ae5a97a51bc14276e6a73de628494017ab18b9dcbf7db07\""
Sep 13 00:08:32.434326 containerd[1501]: time="2025-09-13T00:08:32.433967608Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.3\""
Sep 13 00:08:32.435131 kubelet[2611]: E0913 00:08:32.434832 2611 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 00:08:32.435131 kubelet[2611]: W0913 00:08:32.435042 2611 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 00:08:32.436824 kubelet[2611]: E0913 00:08:32.436621 2611 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 00:08:32.438949 kubelet[2611]: E0913 00:08:32.438682 2611 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 00:08:32.439036 kubelet[2611]: W0913 00:08:32.438948 2611 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 00:08:32.439036 kubelet[2611]: E0913 00:08:32.438979 2611 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 00:08:32.440096 kubelet[2611]: E0913 00:08:32.439558 2611 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 00:08:32.440096 kubelet[2611]: W0913 00:08:32.439571 2611 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 00:08:32.440096 kubelet[2611]: E0913 00:08:32.439958 2611 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 00:08:32.440404 kubelet[2611]: E0913 00:08:32.440255 2611 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 00:08:32.440404 kubelet[2611]: W0913 00:08:32.440271 2611 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 00:08:32.440404 kubelet[2611]: E0913 00:08:32.440285 2611 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 00:08:32.441830 kubelet[2611]: E0913 00:08:32.441742 2611 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 00:08:32.441830 kubelet[2611]: W0913 00:08:32.441757 2611 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 00:08:32.441830 kubelet[2611]: E0913 00:08:32.441771 2611 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 13 00:08:32.443136 kubelet[2611]: E0913 00:08:32.443113 2611 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 00:08:32.443136 kubelet[2611]: W0913 00:08:32.443130 2611 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 00:08:32.443199 kubelet[2611]: E0913 00:08:32.443142 2611 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
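Every one of these triplets is the same root cause reported three ways: the uds FlexVolume driver binary is absent from /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/, the driver call therefore produces no output at all, and decoding that empty output as JSON fails before anything else can be reported. The error string the kubelet prints is exactly what Go's encoding/json returns for empty input, which a tiny standalone check reproduces (a sketch, not kubelet code):

    package main

    import (
        "encoding/json"
        "fmt"
    )

    func main() {
        var status map[string]interface{}
        // An absent driver executable yields empty output; decoding "" fails first.
        err := json.Unmarshal([]byte(""), &status)
        fmt.Println(err) // unexpected end of JSON input
    }

On Calico nodes this probe noise is typically transient: the flexvol-driver-host host path mounted above exists so that calico-node can populate the driver directory once it starts.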
Error: unexpected end of JSON input" Sep 13 00:08:32.445834 kubelet[2611]: E0913 00:08:32.445506 2611 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 00:08:32.445834 kubelet[2611]: W0913 00:08:32.445552 2611 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 00:08:32.445834 kubelet[2611]: E0913 00:08:32.445566 2611 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 00:08:32.511745 kubelet[2611]: E0913 00:08:32.511383 2611 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-pdqq4" podUID="d0c095e9-b846-4f71-bc12-2e92665be871" Sep 13 00:08:32.521448 kubelet[2611]: E0913 00:08:32.521416 2611 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 00:08:32.521448 kubelet[2611]: W0913 00:08:32.521437 2611 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 00:08:32.521738 kubelet[2611]: E0913 00:08:32.521455 2611 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 00:08:32.521738 kubelet[2611]: E0913 00:08:32.521674 2611 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 00:08:32.521738 kubelet[2611]: W0913 00:08:32.521683 2611 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 00:08:32.521738 kubelet[2611]: E0913 00:08:32.521696 2611 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 00:08:32.522002 kubelet[2611]: E0913 00:08:32.521906 2611 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 00:08:32.522002 kubelet[2611]: W0913 00:08:32.521937 2611 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 00:08:32.522002 kubelet[2611]: E0913 00:08:32.521949 2611 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 13 00:08:32.522235 kubelet[2611]: E0913 00:08:32.522208 2611 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 00:08:32.522235 kubelet[2611]: W0913 00:08:32.522217 2611 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 00:08:32.522235 kubelet[2611]: E0913 00:08:32.522228 2611 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 00:08:32.522435 kubelet[2611]: E0913 00:08:32.522421 2611 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 00:08:32.522435 kubelet[2611]: W0913 00:08:32.522433 2611 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 00:08:32.522673 kubelet[2611]: E0913 00:08:32.522442 2611 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 00:08:32.522722 kubelet[2611]: E0913 00:08:32.522678 2611 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 00:08:32.522722 kubelet[2611]: W0913 00:08:32.522686 2611 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 00:08:32.522722 kubelet[2611]: E0913 00:08:32.522695 2611 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 00:08:32.522881 kubelet[2611]: E0913 00:08:32.522864 2611 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 00:08:32.522881 kubelet[2611]: W0913 00:08:32.522877 2611 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 00:08:32.522950 kubelet[2611]: E0913 00:08:32.522887 2611 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 00:08:32.523137 kubelet[2611]: E0913 00:08:32.523123 2611 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 00:08:32.523137 kubelet[2611]: W0913 00:08:32.523134 2611 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 00:08:32.523205 kubelet[2611]: E0913 00:08:32.523143 2611 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 13 00:08:32.523362 kubelet[2611]: E0913 00:08:32.523339 2611 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 00:08:32.523399 kubelet[2611]: W0913 00:08:32.523372 2611 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 00:08:32.523399 kubelet[2611]: E0913 00:08:32.523383 2611 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 00:08:32.523830 kubelet[2611]: E0913 00:08:32.523813 2611 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 00:08:32.523830 kubelet[2611]: W0913 00:08:32.523826 2611 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 00:08:32.524277 kubelet[2611]: E0913 00:08:32.523835 2611 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 00:08:32.524885 kubelet[2611]: E0913 00:08:32.524868 2611 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 00:08:32.524885 kubelet[2611]: W0913 00:08:32.524883 2611 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 00:08:32.525147 kubelet[2611]: E0913 00:08:32.524893 2611 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 00:08:32.525381 kubelet[2611]: E0913 00:08:32.525269 2611 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 00:08:32.525429 kubelet[2611]: W0913 00:08:32.525381 2611 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 00:08:32.525429 kubelet[2611]: E0913 00:08:32.525391 2611 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 00:08:32.525808 kubelet[2611]: E0913 00:08:32.525788 2611 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 00:08:32.525808 kubelet[2611]: W0913 00:08:32.525806 2611 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 00:08:32.525920 kubelet[2611]: E0913 00:08:32.525818 2611 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 13 00:08:32.527499 kubelet[2611]: E0913 00:08:32.527297 2611 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 00:08:32.527499 kubelet[2611]: W0913 00:08:32.527312 2611 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 00:08:32.527499 kubelet[2611]: E0913 00:08:32.527368 2611 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 00:08:32.527656 kubelet[2611]: E0913 00:08:32.527571 2611 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 00:08:32.527656 kubelet[2611]: W0913 00:08:32.527579 2611 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 00:08:32.528650 kubelet[2611]: E0913 00:08:32.528614 2611 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 00:08:32.529131 kubelet[2611]: E0913 00:08:32.529115 2611 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 00:08:32.529131 kubelet[2611]: W0913 00:08:32.529128 2611 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 00:08:32.529209 kubelet[2611]: E0913 00:08:32.529139 2611 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 00:08:32.529475 kubelet[2611]: E0913 00:08:32.529394 2611 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 00:08:32.529475 kubelet[2611]: W0913 00:08:32.529404 2611 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 00:08:32.529475 kubelet[2611]: E0913 00:08:32.529414 2611 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 00:08:32.530060 kubelet[2611]: E0913 00:08:32.530044 2611 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 00:08:32.530060 kubelet[2611]: W0913 00:08:32.530057 2611 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 00:08:32.530132 kubelet[2611]: E0913 00:08:32.530067 2611 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 13 00:08:32.531124 kubelet[2611]: E0913 00:08:32.531092 2611 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 00:08:32.531124 kubelet[2611]: W0913 00:08:32.531110 2611 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 00:08:32.531188 kubelet[2611]: E0913 00:08:32.531164 2611 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 00:08:32.531381 kubelet[2611]: E0913 00:08:32.531361 2611 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 00:08:32.531381 kubelet[2611]: W0913 00:08:32.531374 2611 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 00:08:32.531444 kubelet[2611]: E0913 00:08:32.531383 2611 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 00:08:32.533189 kubelet[2611]: E0913 00:08:32.533169 2611 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 00:08:32.533189 kubelet[2611]: W0913 00:08:32.533182 2611 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 00:08:32.533264 kubelet[2611]: E0913 00:08:32.533196 2611 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 00:08:32.533264 kubelet[2611]: I0913 00:08:32.533217 2611 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhjmn\" (UniqueName: \"kubernetes.io/projected/d0c095e9-b846-4f71-bc12-2e92665be871-kube-api-access-vhjmn\") pod \"csi-node-driver-pdqq4\" (UID: \"d0c095e9-b846-4f71-bc12-2e92665be871\") " pod="calico-system/csi-node-driver-pdqq4" Sep 13 00:08:32.533681 kubelet[2611]: E0913 00:08:32.533660 2611 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 00:08:32.533681 kubelet[2611]: W0913 00:08:32.533676 2611 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 00:08:32.533746 kubelet[2611]: E0913 00:08:32.533685 2611 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 13 00:08:32.533746 kubelet[2611]: I0913 00:08:32.533707 2611 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d0c095e9-b846-4f71-bc12-2e92665be871-kubelet-dir\") pod \"csi-node-driver-pdqq4\" (UID: \"d0c095e9-b846-4f71-bc12-2e92665be871\") " pod="calico-system/csi-node-driver-pdqq4" Sep 13 00:08:32.534348 kubelet[2611]: E0913 00:08:32.534150 2611 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 00:08:32.534348 kubelet[2611]: W0913 00:08:32.534160 2611 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 00:08:32.534348 kubelet[2611]: E0913 00:08:32.534169 2611 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 00:08:32.534348 kubelet[2611]: I0913 00:08:32.534183 2611 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/d0c095e9-b846-4f71-bc12-2e92665be871-varrun\") pod \"csi-node-driver-pdqq4\" (UID: \"d0c095e9-b846-4f71-bc12-2e92665be871\") " pod="calico-system/csi-node-driver-pdqq4" Sep 13 00:08:32.534668 kubelet[2611]: E0913 00:08:32.534650 2611 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 00:08:32.534668 kubelet[2611]: W0913 00:08:32.534665 2611 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 00:08:32.534949 kubelet[2611]: E0913 00:08:32.534676 2611 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 00:08:32.534949 kubelet[2611]: I0913 00:08:32.534691 2611 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/d0c095e9-b846-4f71-bc12-2e92665be871-socket-dir\") pod \"csi-node-driver-pdqq4\" (UID: \"d0c095e9-b846-4f71-bc12-2e92665be871\") " pod="calico-system/csi-node-driver-pdqq4" Sep 13 00:08:32.536152 kubelet[2611]: E0913 00:08:32.536128 2611 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 00:08:32.536152 kubelet[2611]: W0913 00:08:32.536149 2611 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 00:08:32.536232 kubelet[2611]: E0913 00:08:32.536163 2611 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Sep 13 00:08:32.536232 kubelet[2611]: I0913 00:08:32.536212 2611 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/d0c095e9-b846-4f71-bc12-2e92665be871-registration-dir\") pod \"csi-node-driver-pdqq4\" (UID: \"d0c095e9-b846-4f71-bc12-2e92665be871\") " pod="calico-system/csi-node-driver-pdqq4"
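The reconciler_common.go lines above enumerate the five volumes the csi-node-driver pod mounts: one projected service-account token and four hostPath directories. A hypothetical reconstruction using k8s.io/api/core/v1 types is sketched below; only the volume names come from this log, and every host path is an illustrative guess, not something the log states.

```go
// Hypothetical reconstruction of the csi-node-driver pod volumes implied by the
// reconciler lines above. Host paths are illustrative assumptions.
package main

import (
	"fmt"

	corev1 "k8s.io/api/core/v1"
)

func hostPathVol(name, path string) corev1.Volume {
	t := corev1.HostPathDirectory // assume the directory already exists on the node
	return corev1.Volume{
		Name: name,
		VolumeSource: corev1.VolumeSource{
			HostPath: &corev1.HostPathVolumeSource{Path: path, Type: &t},
		},
	}
}

func main() {
	vols := []corev1.Volume{
		{
			// kubernetes.io/projected in the log: a projected SA token volume
			Name: "kube-api-access-vhjmn",
			VolumeSource: corev1.VolumeSource{
				Projected: &corev1.ProjectedVolumeSource{
					Sources: []corev1.VolumeProjection{
						{ServiceAccountToken: &corev1.ServiceAccountTokenProjection{Path: "token"}},
					},
				},
			},
		},
		hostPathVol("kubelet-dir", "/var/lib/kubelet"),                     // illustrative path
		hostPathVol("varrun", "/var/run"),                                  // illustrative path
		hostPathVol("socket-dir", "/var/lib/kubelet/plugins"),              // illustrative path
		hostPathVol("registration-dir", "/var/lib/kubelet/plugins_registry"), // illustrative path
	}
	for _, v := range vols {
		fmt.Println(v.Name)
	}
}
```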
Sep 13 00:08:32.563633 containerd[1501]: time="2025-09-13T00:08:32.563563567Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-hkqvn,Uid:a3bdb8cf-f266-48a9-a8db-23795e9e58ef,Namespace:calico-system,Attempt:0,}"
Sep 13 00:08:32.583213 containerd[1501]: time="2025-09-13T00:08:32.583115932Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Sep 13 00:08:32.583213 containerd[1501]: time="2025-09-13T00:08:32.583168302Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Sep 13 00:08:32.583213 containerd[1501]: time="2025-09-13T00:08:32.583180886Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Sep 13 00:08:32.583523 containerd[1501]: time="2025-09-13T00:08:32.583480466Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Sep 13 00:08:32.602087 systemd[1]: Started cri-containerd-ca5b64bf84b0b58cdcc3ed88055bb8250edefe88dedf30a45c65d45687757299.scope - libcontainer container ca5b64bf84b0b58cdcc3ed88055bb8250edefe88dedf30a45c65d45687757299.
Sep 13 00:08:32.619520 containerd[1501]: time="2025-09-13T00:08:32.619488185Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-hkqvn,Uid:a3bdb8cf-f266-48a9-a8db-23795e9e58ef,Namespace:calico-system,Attempt:0,} returns sandbox id \"ca5b64bf84b0b58cdcc3ed88055bb8250edefe88dedf30a45c65d45687757299\""
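The RunPodSandbox pair above is containerd's CRI plugin servicing the kubelet: the sandbox is created first, then containers are created and started inside it. A minimal sketch of the same gRPC sequence from a CRI client's side follows, assuming the conventional containerd socket path; this is an illustration of the CRI v1 API, not kubelet or containerd source.

```go
// Minimal sketch of the CRI calls behind "RunPodSandbox ... returns sandbox id":
// sandbox first, then CreateContainer and StartContainer inside it. The socket
// path and the node image tag are assumptions for illustration.
package main

import (
	"context"
	"log"

	"google.golang.org/grpc"
	"google.golang.org/grpc/credentials/insecure"
	runtimeapi "k8s.io/cri-api/pkg/apis/runtime/v1"
)

func main() {
	conn, err := grpc.Dial("unix:///run/containerd/containerd.sock",
		grpc.WithTransportCredentials(insecure.NewCredentials()))
	if err != nil {
		log.Fatal(err)
	}
	defer conn.Close()
	rt := runtimeapi.NewRuntimeServiceClient(conn)
	ctx := context.Background()

	// Metadata matches the sandbox logged above.
	sandboxCfg := &runtimeapi.PodSandboxConfig{
		Metadata: &runtimeapi.PodSandboxMetadata{
			Name:      "calico-node-hkqvn",
			Uid:       "a3bdb8cf-f266-48a9-a8db-23795e9e58ef",
			Namespace: "calico-system",
			Attempt:   0,
		},
	}
	sb, err := rt.RunPodSandbox(ctx, &runtimeapi.RunPodSandboxRequest{Config: sandboxCfg})
	if err != nil {
		log.Fatal(err)
	}

	ctr, err := rt.CreateContainer(ctx, &runtimeapi.CreateContainerRequest{
		PodSandboxId: sb.PodSandboxId,
		Config: &runtimeapi.ContainerConfig{
			Metadata: &runtimeapi.ContainerMetadata{Name: "calico-node"},
			Image:    &runtimeapi.ImageSpec{Image: "ghcr.io/flatcar/calico/node:v3.30.3"}, // assumed tag
		},
		SandboxConfig: sandboxCfg,
	})
	if err != nil {
		log.Fatal(err)
	}
	if _, err := rt.StartContainer(ctx, &runtimeapi.StartContainerRequest{ContainerId: ctr.ContainerId}); err != nil {
		log.Fatal(err)
	}
	log.Printf("sandbox %s, container %s started", sb.PodSandboxId, ctr.ContainerId)
}
```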
Sep 13 00:08:33.712902 kubelet[2611]: E0913 00:08:33.712772 2611 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-pdqq4" podUID="d0c095e9-b846-4f71-bc12-2e92665be871"
Sep 13 00:08:34.139031 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3613970160.mount: Deactivated successfully.
Sep 13 00:08:34.563251 containerd[1501]: time="2025-09-13T00:08:34.562780206Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.30.3: active requests=0, bytes read=35237389"
Sep 13 00:08:34.565868 containerd[1501]: time="2025-09-13T00:08:34.565635488Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.30.3\" with image id \"sha256:1d7bb7b0cce2924d35c7c26f6b6600409ea7c9535074c3d2e517ffbb3a0e0b36\", repo tag \"ghcr.io/flatcar/calico/typha:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:f4a3d61ffda9c98a53adeb412c5af404ca3727a3cc2d0b4ef28d197bdd47ecaa\", size \"35237243\" in 2.131575424s"
Sep 13 00:08:34.565868 containerd[1501]: time="2025-09-13T00:08:34.565680994Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.3\" returns image reference \"sha256:1d7bb7b0cce2924d35c7c26f6b6600409ea7c9535074c3d2e517ffbb3a0e0b36\""
Sep 13 00:08:34.569313 containerd[1501]: time="2025-09-13T00:08:34.569007492Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\""
Sep 13 00:08:34.572956 containerd[1501]: time="2025-09-13T00:08:34.572891109Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 13 00:08:34.577748 containerd[1501]: time="2025-09-13T00:08:34.574777969Z" level=info msg="ImageCreate event name:\"sha256:1d7bb7b0cce2924d35c7c26f6b6600409ea7c9535074c3d2e517ffbb3a0e0b36\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 13 00:08:34.584409 containerd[1501]: time="2025-09-13T00:08:34.581704674Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:f4a3d61ffda9c98a53adeb412c5af404ca3727a3cc2d0b4ef28d197bdd47ecaa\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 13 00:08:34.591740 containerd[1501]: time="2025-09-13T00:08:34.591698766Z" level=info msg="CreateContainer within sandbox \"5230d891fbb1d3a36ae5a97a51bc14276e6a73de628494017ab18b9dcbf7db07\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}"
Sep 13 00:08:34.624836 containerd[1501]: time="2025-09-13T00:08:34.624789522Z" level=info msg="CreateContainer within sandbox \"5230d891fbb1d3a36ae5a97a51bc14276e6a73de628494017ab18b9dcbf7db07\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"d593e389fe82abb21ec79405424ba6ec229d1cf7483d082b2ebd968e57466d52\""
Sep 13 00:08:34.625743 containerd[1501]: time="2025-09-13T00:08:34.625715824Z" level=info msg="StartContainer for \"d593e389fe82abb21ec79405424ba6ec229d1cf7483d082b2ebd968e57466d52\""
Sep 13 00:08:34.671063 systemd[1]: Started cri-containerd-d593e389fe82abb21ec79405424ba6ec229d1cf7483d082b2ebd968e57466d52.scope - libcontainer container d593e389fe82abb21ec79405424ba6ec229d1cf7483d082b2ebd968e57466d52.
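The "Pulled image" line reports the typha image size and pull duration, which pin down the average transfer rate. A back-of-the-envelope check, using only the two numbers the log already prints:

```go
// 35,237,243 bytes in 2.131575424s is roughly 15.8 MiB/s average pull
// throughput. Pure arithmetic on the values from the Pulled-image line above.
package main

import "fmt"

func main() {
	const bytes = 35237243.0    // "size" from the Pulled image line
	const seconds = 2.131575424 // duration from the same line
	fmt.Printf("%.2f MiB/s\n", bytes/seconds/(1<<20))
}
```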
Sep 13 00:08:34.714638 containerd[1501]: time="2025-09-13T00:08:34.714593380Z" level=info msg="StartContainer for \"d593e389fe82abb21ec79405424ba6ec229d1cf7483d082b2ebd968e57466d52\" returns successfully"
Sep 13 00:08:34.946879 kubelet[2611]: E0913 00:08:34.946742 2611 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 00:08:34.946879 kubelet[2611]: W0913 00:08:34.946766 2611 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 00:08:34.946879 kubelet[2611]: E0913 00:08:34.946787 2611 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
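The recurring E/W/E triplet comes from the kubelet's FlexVolume probe: it execs the driver binary under /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/ with the argument init and expects a JSON status reply on stdout. The binary is missing, stdout is empty, and unmarshalling an empty buffer fails with exactly "unexpected end of JSON input". A minimal sketch of that failure cascade, assuming a simplified reply shape (this is not kubelet's driver-call.go):

```go
// Minimal sketch of why a missing FlexVolume binary surfaces as
// "unexpected end of JSON input": the exec yields no stdout, and unmarshalling
// the empty buffer fails before the exec error itself is reported.
package main

import (
	"encoding/json"
	"fmt"
	"os/exec"
)

// DriverStatus approximates the JSON a FlexVolume driver prints for "init".
type DriverStatus struct {
	Status       string          `json:"status"`
	Message      string          `json:"message,omitempty"`
	Capabilities map[string]bool `json:"capabilities,omitempty"`
}

func callDriver(path string, args ...string) (*DriverStatus, error) {
	out, execErr := exec.Command(path, args...).Output() // empty when the binary is absent
	var st DriverStatus
	if err := json.Unmarshal(out, &st); err != nil {
		// With out == "" this is exactly "unexpected end of JSON input".
		return nil, fmt.Errorf("failed to unmarshal output %q: %v (exec error: %v)", out, err, execErr)
	}
	return &st, nil
}

func main() {
	_, err := callDriver("/opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds", "init")
	fmt.Println(err) // mirrors the E/W pair the kubelet logs above
}
```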
Sep 13 00:08:35.711950 kubelet[2611]: E0913 00:08:35.711854 2611 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-pdqq4" podUID="d0c095e9-b846-4f71-bc12-2e92665be871"
Sep 13 00:08:35.881263 kubelet[2611]: I0913 00:08:35.880298 2611 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Sep 13 00:08:35.958139 kubelet[2611]: E0913 00:08:35.958101 2611 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 00:08:35.958139 kubelet[2611]: W0913 00:08:35.958127 2611 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 00:08:35.958139 kubelet[2611]: E0913 00:08:35.958150 2611 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
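The pod_workers.go line repeats because the runtime keeps reporting NetworkReady=false until a CNI network config appears; calico-node writes one once it is running, after which csi-node-driver-pdqq4 can sync. The sketch below mimics the readiness check by looking for config files in the CNI confdir; /etc/cni/net.d is the conventional default and is assumed here rather than taken from this log.

```go
// Illustrative only: "cni plugin not initialized" clears once a network config
// (*.conf, *.conflist, *.json) shows up in the CNI confdir.
package main

import (
	"fmt"
	"os"
	"path/filepath"
)

func cniReady(confDir string) bool {
	for _, pat := range []string{"*.conf", "*.conflist", "*.json"} {
		matches, _ := filepath.Glob(filepath.Join(confDir, pat))
		if len(matches) > 0 {
			return true
		}
	}
	return false
}

func main() {
	if cniReady("/etc/cni/net.d") { // conventional default path, assumed
		fmt.Println("NetworkReady=true")
	} else {
		fmt.Println("NetworkReady=false: cni plugin not initialized")
		os.Exit(1)
	}
}
```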
Sep 13 00:08:35.971568 kubelet[2611]: E0913 00:08:35.971450 2611 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 13 00:08:35.971568 kubelet[2611]: W0913 00:08:35.971463 2611 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 13 00:08:35.971568 kubelet[2611]: E0913 00:08:35.971478 2611 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 13 00:08:35.971990 kubelet[2611]: E0913 00:08:35.971806 2611 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 00:08:35.971990 kubelet[2611]: W0913 00:08:35.971819 2611 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 00:08:35.971990 kubelet[2611]: E0913 00:08:35.971828 2611 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 00:08:35.972192 kubelet[2611]: E0913 00:08:35.972140 2611 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 00:08:35.972192 kubelet[2611]: W0913 00:08:35.972154 2611 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 00:08:35.972192 kubelet[2611]: E0913 00:08:35.972163 2611 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 00:08:35.972699 kubelet[2611]: E0913 00:08:35.972655 2611 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 00:08:35.972762 kubelet[2611]: W0913 00:08:35.972673 2611 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 00:08:35.972762 kubelet[2611]: E0913 00:08:35.972742 2611 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 00:08:35.973059 kubelet[2611]: E0913 00:08:35.973034 2611 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 00:08:35.973059 kubelet[2611]: W0913 00:08:35.973047 2611 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 00:08:35.973059 kubelet[2611]: E0913 00:08:35.973056 2611 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 00:08:35.973345 kubelet[2611]: E0913 00:08:35.973331 2611 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 00:08:35.973345 kubelet[2611]: W0913 00:08:35.973342 2611 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 00:08:35.973477 kubelet[2611]: E0913 00:08:35.973350 2611 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 13 00:08:35.973520 kubelet[2611]: E0913 00:08:35.973506 2611 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 00:08:35.973520 kubelet[2611]: W0913 00:08:35.973513 2611 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 00:08:35.973594 kubelet[2611]: E0913 00:08:35.973531 2611 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 00:08:35.973742 kubelet[2611]: E0913 00:08:35.973714 2611 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 00:08:35.973742 kubelet[2611]: W0913 00:08:35.973735 2611 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 00:08:35.973822 kubelet[2611]: E0913 00:08:35.973751 2611 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 00:08:35.974069 kubelet[2611]: E0913 00:08:35.974047 2611 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 00:08:35.974069 kubelet[2611]: W0913 00:08:35.974060 2611 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 00:08:35.974069 kubelet[2611]: E0913 00:08:35.974070 2611 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 00:08:35.974783 kubelet[2611]: E0913 00:08:35.974764 2611 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 00:08:35.974783 kubelet[2611]: W0913 00:08:35.974777 2611 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 00:08:35.974855 kubelet[2611]: E0913 00:08:35.974788 2611 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 00:08:35.975040 kubelet[2611]: E0913 00:08:35.975017 2611 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 00:08:35.975040 kubelet[2611]: W0913 00:08:35.975039 2611 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 00:08:35.975216 kubelet[2611]: E0913 00:08:35.975051 2611 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 13 00:08:35.975316 kubelet[2611]: E0913 00:08:35.975301 2611 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 00:08:35.975316 kubelet[2611]: W0913 00:08:35.975314 2611 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 00:08:35.975446 kubelet[2611]: E0913 00:08:35.975323 2611 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 00:08:35.975547 kubelet[2611]: E0913 00:08:35.975506 2611 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 00:08:35.975547 kubelet[2611]: W0913 00:08:35.975514 2611 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 00:08:35.975547 kubelet[2611]: E0913 00:08:35.975521 2611 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 00:08:35.975937 kubelet[2611]: E0913 00:08:35.975893 2611 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 00:08:35.975937 kubelet[2611]: W0913 00:08:35.975906 2611 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 00:08:35.976065 kubelet[2611]: E0913 00:08:35.975930 2611 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 13 00:08:35.976273 kubelet[2611]: E0913 00:08:35.976212 2611 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 13 00:08:35.976273 kubelet[2611]: W0913 00:08:35.976240 2611 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 13 00:08:35.976273 kubelet[2611]: E0913 00:08:35.976250 2611 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 13 00:08:36.272328 containerd[1501]: time="2025-09-13T00:08:36.272199164Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 13 00:08:36.273601 containerd[1501]: time="2025-09-13T00:08:36.273398111Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3: active requests=0, bytes read=4446660" Sep 13 00:08:36.274944 containerd[1501]: time="2025-09-13T00:08:36.274413990Z" level=info msg="ImageCreate event name:\"sha256:4f2b088ed6fdfc6a97ac0650a4ba8171107d6656ce265c592e4c8423fd10e5c4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 13 00:08:36.278042 containerd[1501]: time="2025-09-13T00:08:36.277885001Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:81bdfcd9dbd36624dc35354e8c181c75631ba40e6c7df5820f5f56cea36f0ef9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 13 00:08:36.279537 containerd[1501]: time="2025-09-13T00:08:36.279501101Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" with image id \"sha256:4f2b088ed6fdfc6a97ac0650a4ba8171107d6656ce265c592e4c8423fd10e5c4\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:81bdfcd9dbd36624dc35354e8c181c75631ba40e6c7df5820f5f56cea36f0ef9\", size \"5939323\" in 1.710456579s" Sep 13 00:08:36.279537 containerd[1501]: time="2025-09-13T00:08:36.279533152Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" returns image reference \"sha256:4f2b088ed6fdfc6a97ac0650a4ba8171107d6656ce265c592e4c8423fd10e5c4\"" Sep 13 00:08:36.283894 containerd[1501]: time="2025-09-13T00:08:36.283764777Z" level=info msg="CreateContainer within sandbox \"ca5b64bf84b0b58cdcc3ed88055bb8250edefe88dedf30a45c65d45687757299\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Sep 13 00:08:36.314228 containerd[1501]: time="2025-09-13T00:08:36.314193701Z" level=info msg="CreateContainer within sandbox \"ca5b64bf84b0b58cdcc3ed88055bb8250edefe88dedf30a45c65d45687757299\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"e284fe8210742ab71adeeada3bd5ecc1f9f2d432200e024317c1911f3d27c809\"" Sep 13 00:08:36.315596 containerd[1501]: time="2025-09-13T00:08:36.315226763Z" level=info msg="StartContainer for \"e284fe8210742ab71adeeada3bd5ecc1f9f2d432200e024317c1911f3d27c809\"" Sep 13 00:08:36.364086 systemd[1]: Started cri-containerd-e284fe8210742ab71adeeada3bd5ecc1f9f2d432200e024317c1911f3d27c809.scope - libcontainer container e284fe8210742ab71adeeada3bd5ecc1f9f2d432200e024317c1911f3d27c809. Sep 13 00:08:36.396366 containerd[1501]: time="2025-09-13T00:08:36.395081917Z" level=info msg="StartContainer for \"e284fe8210742ab71adeeada3bd5ecc1f9f2d432200e024317c1911f3d27c809\" returns successfully" Sep 13 00:08:36.409672 systemd[1]: cri-containerd-e284fe8210742ab71adeeada3bd5ecc1f9f2d432200e024317c1911f3d27c809.scope: Deactivated successfully. Sep 13 00:08:36.433779 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-e284fe8210742ab71adeeada3bd5ecc1f9f2d432200e024317c1911f3d27c809-rootfs.mount: Deactivated successfully. 
Sep 13 00:08:36.455177 containerd[1501]: time="2025-09-13T00:08:36.442114382Z" level=info msg="shim disconnected" id=e284fe8210742ab71adeeada3bd5ecc1f9f2d432200e024317c1911f3d27c809 namespace=k8s.io Sep 13 00:08:36.455177 containerd[1501]: time="2025-09-13T00:08:36.455169343Z" level=warning msg="cleaning up after shim disconnected" id=e284fe8210742ab71adeeada3bd5ecc1f9f2d432200e024317c1911f3d27c809 namespace=k8s.io Sep 13 00:08:36.455177 containerd[1501]: time="2025-09-13T00:08:36.455185465Z" level=info msg="cleaning up dead shim" namespace=k8s.io Sep 13 00:08:36.464598 containerd[1501]: time="2025-09-13T00:08:36.464555187Z" level=warning msg="cleanup warnings time=\"2025-09-13T00:08:36Z\" level=warning msg=\"failed to remove runc container\" error=\"runc did not terminate successfully: exit status 255: \" runtime=io.containerd.runc.v2\n" namespace=k8s.io Sep 13 00:08:36.889678 containerd[1501]: time="2025-09-13T00:08:36.889559289Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.3\"" Sep 13 00:08:36.914265 kubelet[2611]: I0913 00:08:36.912121 2611 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-5499d5d6c9-9wtgx" podStartSLOduration=3.777107919 podStartE2EDuration="5.911153767s" podCreationTimestamp="2025-09-13 00:08:31 +0000 UTC" firstStartedPulling="2025-09-13 00:08:32.432492446 +0000 UTC m=+18.846525744" lastFinishedPulling="2025-09-13 00:08:34.566538295 +0000 UTC m=+20.980571592" observedRunningTime="2025-09-13 00:08:34.900225756 +0000 UTC m=+21.314259063" watchObservedRunningTime="2025-09-13 00:08:36.911153767 +0000 UTC m=+23.325187074" Sep 13 00:08:37.535749 kubelet[2611]: I0913 00:08:37.535364 2611 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 13 00:08:37.711411 kubelet[2611]: E0913 00:08:37.711339 2611 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-pdqq4" podUID="d0c095e9-b846-4f71-bc12-2e92665be871" Sep 13 00:08:39.714466 kubelet[2611]: E0913 00:08:39.712589 2611 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-pdqq4" podUID="d0c095e9-b846-4f71-bc12-2e92665be871" Sep 13 00:08:39.924994 containerd[1501]: time="2025-09-13T00:08:39.924904328Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 13 00:08:39.926570 containerd[1501]: time="2025-09-13T00:08:39.926026844Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.30.3: active requests=0, bytes read=70440613" Sep 13 00:08:39.927878 containerd[1501]: time="2025-09-13T00:08:39.927835611Z" level=info msg="ImageCreate event name:\"sha256:034822460c2f667e1f4a7679c843cc35ce1bf2c25dec86f04e07fb403df7e458\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 13 00:08:39.931004 containerd[1501]: time="2025-09-13T00:08:39.930966193Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:73d1e391050490d54e5bee8ff2b1a50a8be1746c98dc530361b00e8c0ab63f87\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 13 00:08:39.933426 containerd[1501]: 
time="2025-09-13T00:08:39.932937849Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.30.3\" with image id \"sha256:034822460c2f667e1f4a7679c843cc35ce1bf2c25dec86f04e07fb403df7e458\", repo tag \"ghcr.io/flatcar/calico/cni:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:73d1e391050490d54e5bee8ff2b1a50a8be1746c98dc530361b00e8c0ab63f87\", size \"71933316\" in 3.04331505s" Sep 13 00:08:39.933426 containerd[1501]: time="2025-09-13T00:08:39.932986762Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.3\" returns image reference \"sha256:034822460c2f667e1f4a7679c843cc35ce1bf2c25dec86f04e07fb403df7e458\"" Sep 13 00:08:39.938372 containerd[1501]: time="2025-09-13T00:08:39.938323986Z" level=info msg="CreateContainer within sandbox \"ca5b64bf84b0b58cdcc3ed88055bb8250edefe88dedf30a45c65d45687757299\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Sep 13 00:08:39.955816 containerd[1501]: time="2025-09-13T00:08:39.955739385Z" level=info msg="CreateContainer within sandbox \"ca5b64bf84b0b58cdcc3ed88055bb8250edefe88dedf30a45c65d45687757299\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"0beeb42c44d7e2340746d0dc8ad505b83ca5e59071964048b60f5ac80fc26545\"" Sep 13 00:08:39.956895 containerd[1501]: time="2025-09-13T00:08:39.956848838Z" level=info msg="StartContainer for \"0beeb42c44d7e2340746d0dc8ad505b83ca5e59071964048b60f5ac80fc26545\"" Sep 13 00:08:40.001432 systemd[1]: Started cri-containerd-0beeb42c44d7e2340746d0dc8ad505b83ca5e59071964048b60f5ac80fc26545.scope - libcontainer container 0beeb42c44d7e2340746d0dc8ad505b83ca5e59071964048b60f5ac80fc26545. Sep 13 00:08:40.041806 containerd[1501]: time="2025-09-13T00:08:40.041758681Z" level=info msg="StartContainer for \"0beeb42c44d7e2340746d0dc8ad505b83ca5e59071964048b60f5ac80fc26545\" returns successfully" Sep 13 00:08:40.548491 systemd[1]: cri-containerd-0beeb42c44d7e2340746d0dc8ad505b83ca5e59071964048b60f5ac80fc26545.scope: Deactivated successfully. Sep 13 00:08:40.591046 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-0beeb42c44d7e2340746d0dc8ad505b83ca5e59071964048b60f5ac80fc26545-rootfs.mount: Deactivated successfully. Sep 13 00:08:40.595473 kubelet[2611]: I0913 00:08:40.595261 2611 kubelet_node_status.go:501] "Fast updating node status as it just became ready" Sep 13 00:08:40.617434 containerd[1501]: time="2025-09-13T00:08:40.617129535Z" level=info msg="shim disconnected" id=0beeb42c44d7e2340746d0dc8ad505b83ca5e59071964048b60f5ac80fc26545 namespace=k8s.io Sep 13 00:08:40.617434 containerd[1501]: time="2025-09-13T00:08:40.617263478Z" level=warning msg="cleaning up after shim disconnected" id=0beeb42c44d7e2340746d0dc8ad505b83ca5e59071964048b60f5ac80fc26545 namespace=k8s.io Sep 13 00:08:40.617434 containerd[1501]: time="2025-09-13T00:08:40.617273105Z" level=info msg="cleaning up dead shim" namespace=k8s.io Sep 13 00:08:40.679194 systemd[1]: Created slice kubepods-besteffort-pod3a77a062_d867_4bd5_b1d4_b714b684c994.slice - libcontainer container kubepods-besteffort-pod3a77a062_d867_4bd5_b1d4_b714b684c994.slice. Sep 13 00:08:40.687026 systemd[1]: Created slice kubepods-besteffort-pod66c3fc14_19e6_48c3_91eb_518f5181421b.slice - libcontainer container kubepods-besteffort-pod66c3fc14_19e6_48c3_91eb_518f5181421b.slice. Sep 13 00:08:40.696197 systemd[1]: Created slice kubepods-burstable-pod3ba023c8_210d_4e58_bf51_ba6b7484e570.slice - libcontainer container kubepods-burstable-pod3ba023c8_210d_4e58_bf51_ba6b7484e570.slice. 
Sep 13 00:08:40.703964 systemd[1]: Created slice kubepods-besteffort-pod81fa2d11_a454_4bc0_824e_4ee1c118d1e3.slice - libcontainer container kubepods-besteffort-pod81fa2d11_a454_4bc0_824e_4ee1c118d1e3.slice. Sep 13 00:08:40.709735 kubelet[2611]: I0913 00:08:40.709689 2611 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bgfpq\" (UniqueName: \"kubernetes.io/projected/807cba0d-efbf-495e-bbeb-ab5413159760-kube-api-access-bgfpq\") pod \"coredns-674b8bbfcf-ts6l4\" (UID: \"807cba0d-efbf-495e-bbeb-ab5413159760\") " pod="kube-system/coredns-674b8bbfcf-ts6l4" Sep 13 00:08:40.709735 kubelet[2611]: I0913 00:08:40.709728 2611 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x296l\" (UniqueName: \"kubernetes.io/projected/66c3fc14-19e6-48c3-91eb-518f5181421b-kube-api-access-x296l\") pod \"calico-apiserver-9c789df89-chbwf\" (UID: \"66c3fc14-19e6-48c3-91eb-518f5181421b\") " pod="calico-apiserver/calico-apiserver-9c789df89-chbwf" Sep 13 00:08:40.709735 kubelet[2611]: I0913 00:08:40.709745 2611 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/81fa2d11-a454-4bc0-824e-4ee1c118d1e3-calico-apiserver-certs\") pod \"calico-apiserver-9c789df89-kvgsc\" (UID: \"81fa2d11-a454-4bc0-824e-4ee1c118d1e3\") " pod="calico-apiserver/calico-apiserver-9c789df89-kvgsc" Sep 13 00:08:40.710632 kubelet[2611]: I0913 00:08:40.709758 2611 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/807cba0d-efbf-495e-bbeb-ab5413159760-config-volume\") pod \"coredns-674b8bbfcf-ts6l4\" (UID: \"807cba0d-efbf-495e-bbeb-ab5413159760\") " pod="kube-system/coredns-674b8bbfcf-ts6l4" Sep 13 00:08:40.710632 kubelet[2611]: I0913 00:08:40.709773 2611 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/66c3fc14-19e6-48c3-91eb-518f5181421b-calico-apiserver-certs\") pod \"calico-apiserver-9c789df89-chbwf\" (UID: \"66c3fc14-19e6-48c3-91eb-518f5181421b\") " pod="calico-apiserver/calico-apiserver-9c789df89-chbwf" Sep 13 00:08:40.710632 kubelet[2611]: I0913 00:08:40.709801 2611 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3a77a062-d867-4bd5-b1d4-b714b684c994-whisker-ca-bundle\") pod \"whisker-57d6c7f84c-tpttf\" (UID: \"3a77a062-d867-4bd5-b1d4-b714b684c994\") " pod="calico-system/whisker-57d6c7f84c-tpttf" Sep 13 00:08:40.710632 kubelet[2611]: I0913 00:08:40.709829 2611 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4lggf\" (UniqueName: \"kubernetes.io/projected/e8347be8-88b8-4276-8b5d-e1c006f27806-kube-api-access-4lggf\") pod \"goldmane-54d579b49d-zn7kg\" (UID: \"e8347be8-88b8-4276-8b5d-e1c006f27806\") " pod="calico-system/goldmane-54d579b49d-zn7kg" Sep 13 00:08:40.710632 kubelet[2611]: I0913 00:08:40.709845 2611 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3ba023c8-210d-4e58-bf51-ba6b7484e570-config-volume\") pod \"coredns-674b8bbfcf-q6mzd\" (UID: \"3ba023c8-210d-4e58-bf51-ba6b7484e570\") " pod="kube-system/coredns-674b8bbfcf-q6mzd" Sep 13 
00:08:40.711756 kubelet[2611]: I0913 00:08:40.709859 2611 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9t7dh\" (UniqueName: \"kubernetes.io/projected/3a77a062-d867-4bd5-b1d4-b714b684c994-kube-api-access-9t7dh\") pod \"whisker-57d6c7f84c-tpttf\" (UID: \"3a77a062-d867-4bd5-b1d4-b714b684c994\") " pod="calico-system/whisker-57d6c7f84c-tpttf" Sep 13 00:08:40.711756 kubelet[2611]: I0913 00:08:40.709871 2611 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8347be8-88b8-4276-8b5d-e1c006f27806-config\") pod \"goldmane-54d579b49d-zn7kg\" (UID: \"e8347be8-88b8-4276-8b5d-e1c006f27806\") " pod="calico-system/goldmane-54d579b49d-zn7kg" Sep 13 00:08:40.711756 kubelet[2611]: I0913 00:08:40.709885 2611 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e8347be8-88b8-4276-8b5d-e1c006f27806-goldmane-ca-bundle\") pod \"goldmane-54d579b49d-zn7kg\" (UID: \"e8347be8-88b8-4276-8b5d-e1c006f27806\") " pod="calico-system/goldmane-54d579b49d-zn7kg" Sep 13 00:08:40.711756 kubelet[2611]: I0913 00:08:40.709897 2611 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/e8347be8-88b8-4276-8b5d-e1c006f27806-goldmane-key-pair\") pod \"goldmane-54d579b49d-zn7kg\" (UID: \"e8347be8-88b8-4276-8b5d-e1c006f27806\") " pod="calico-system/goldmane-54d579b49d-zn7kg" Sep 13 00:08:40.711756 kubelet[2611]: I0913 00:08:40.710700 2611 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxrs8\" (UniqueName: \"kubernetes.io/projected/3ba023c8-210d-4e58-bf51-ba6b7484e570-kube-api-access-xxrs8\") pod \"coredns-674b8bbfcf-q6mzd\" (UID: \"3ba023c8-210d-4e58-bf51-ba6b7484e570\") " pod="kube-system/coredns-674b8bbfcf-q6mzd" Sep 13 00:08:40.711922 kubelet[2611]: I0913 00:08:40.710758 2611 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swssl\" (UniqueName: \"kubernetes.io/projected/4166a8bf-c6d2-4b14-bbe8-bc5fa59948b2-kube-api-access-swssl\") pod \"calico-kube-controllers-577b6875b7-hxt89\" (UID: \"4166a8bf-c6d2-4b14-bbe8-bc5fa59948b2\") " pod="calico-system/calico-kube-controllers-577b6875b7-hxt89" Sep 13 00:08:40.711922 kubelet[2611]: I0913 00:08:40.710776 2611 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/3a77a062-d867-4bd5-b1d4-b714b684c994-whisker-backend-key-pair\") pod \"whisker-57d6c7f84c-tpttf\" (UID: \"3a77a062-d867-4bd5-b1d4-b714b684c994\") " pod="calico-system/whisker-57d6c7f84c-tpttf" Sep 13 00:08:40.711922 kubelet[2611]: I0913 00:08:40.710808 2611 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5h79p\" (UniqueName: \"kubernetes.io/projected/81fa2d11-a454-4bc0-824e-4ee1c118d1e3-kube-api-access-5h79p\") pod \"calico-apiserver-9c789df89-kvgsc\" (UID: \"81fa2d11-a454-4bc0-824e-4ee1c118d1e3\") " pod="calico-apiserver/calico-apiserver-9c789df89-kvgsc" Sep 13 00:08:40.711922 kubelet[2611]: I0913 00:08:40.710821 2611 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/4166a8bf-c6d2-4b14-bbe8-bc5fa59948b2-tigera-ca-bundle\") pod \"calico-kube-controllers-577b6875b7-hxt89\" (UID: \"4166a8bf-c6d2-4b14-bbe8-bc5fa59948b2\") " pod="calico-system/calico-kube-controllers-577b6875b7-hxt89" Sep 13 00:08:40.711987 systemd[1]: Created slice kubepods-burstable-pod807cba0d_efbf_495e_bbeb_ab5413159760.slice - libcontainer container kubepods-burstable-pod807cba0d_efbf_495e_bbeb_ab5413159760.slice. Sep 13 00:08:40.719550 systemd[1]: Created slice kubepods-besteffort-pod4166a8bf_c6d2_4b14_bbe8_bc5fa59948b2.slice - libcontainer container kubepods-besteffort-pod4166a8bf_c6d2_4b14_bbe8_bc5fa59948b2.slice. Sep 13 00:08:40.726086 systemd[1]: Created slice kubepods-besteffort-pode8347be8_88b8_4276_8b5d_e1c006f27806.slice - libcontainer container kubepods-besteffort-pode8347be8_88b8_4276_8b5d_e1c006f27806.slice. Sep 13 00:08:40.900409 containerd[1501]: time="2025-09-13T00:08:40.900316161Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.3\"" Sep 13 00:08:40.983822 containerd[1501]: time="2025-09-13T00:08:40.983626843Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-57d6c7f84c-tpttf,Uid:3a77a062-d867-4bd5-b1d4-b714b684c994,Namespace:calico-system,Attempt:0,}" Sep 13 00:08:40.993572 containerd[1501]: time="2025-09-13T00:08:40.993503636Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-9c789df89-chbwf,Uid:66c3fc14-19e6-48c3-91eb-518f5181421b,Namespace:calico-apiserver,Attempt:0,}" Sep 13 00:08:41.023991 containerd[1501]: time="2025-09-13T00:08:41.022620970Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-ts6l4,Uid:807cba0d-efbf-495e-bbeb-ab5413159760,Namespace:kube-system,Attempt:0,}" Sep 13 00:08:41.029346 containerd[1501]: time="2025-09-13T00:08:41.027307029Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-q6mzd,Uid:3ba023c8-210d-4e58-bf51-ba6b7484e570,Namespace:kube-system,Attempt:0,}" Sep 13 00:08:41.029346 containerd[1501]: time="2025-09-13T00:08:41.027555519Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-9c789df89-kvgsc,Uid:81fa2d11-a454-4bc0-824e-4ee1c118d1e3,Namespace:calico-apiserver,Attempt:0,}" Sep 13 00:08:41.029346 containerd[1501]: time="2025-09-13T00:08:41.028025789Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-577b6875b7-hxt89,Uid:4166a8bf-c6d2-4b14-bbe8-bc5fa59948b2,Namespace:calico-system,Attempt:0,}" Sep 13 00:08:41.032190 containerd[1501]: time="2025-09-13T00:08:41.032107683Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-zn7kg,Uid:e8347be8-88b8-4276-8b5d-e1c006f27806,Namespace:calico-system,Attempt:0,}" Sep 13 00:08:41.312369 containerd[1501]: time="2025-09-13T00:08:41.312069406Z" level=error msg="Failed to destroy network for sandbox \"70302e1b99e4a70b05a0e6a485a959769c320e7396db1eccf4b6e32b8c3ba19c\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 13 00:08:41.320709 containerd[1501]: time="2025-09-13T00:08:41.320658181Z" level=error msg="encountered an error cleaning up failed sandbox \"70302e1b99e4a70b05a0e6a485a959769c320e7396db1eccf4b6e32b8c3ba19c\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted 
/var/lib/calico/" Sep 13 00:08:41.320840 containerd[1501]: time="2025-09-13T00:08:41.320739675Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-q6mzd,Uid:3ba023c8-210d-4e58-bf51-ba6b7484e570,Namespace:kube-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"70302e1b99e4a70b05a0e6a485a959769c320e7396db1eccf4b6e32b8c3ba19c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 13 00:08:41.326063 containerd[1501]: time="2025-09-13T00:08:41.326011533Z" level=error msg="Failed to destroy network for sandbox \"f3f76841deaea1b7ce378f2a1523b9b857f562b5d6ba2bed49e181590f793f36\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 13 00:08:41.327932 containerd[1501]: time="2025-09-13T00:08:41.327241791Z" level=error msg="encountered an error cleaning up failed sandbox \"f3f76841deaea1b7ce378f2a1523b9b857f562b5d6ba2bed49e181590f793f36\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 13 00:08:41.327932 containerd[1501]: time="2025-09-13T00:08:41.327305992Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-ts6l4,Uid:807cba0d-efbf-495e-bbeb-ab5413159760,Namespace:kube-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"f3f76841deaea1b7ce378f2a1523b9b857f562b5d6ba2bed49e181590f793f36\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 13 00:08:41.336105 containerd[1501]: time="2025-09-13T00:08:41.336061833Z" level=error msg="Failed to destroy network for sandbox \"a628c9f57a387c30d21df3f761670db06db768ddf12b19eb8d70d9c8345f905b\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 13 00:08:41.336443 containerd[1501]: time="2025-09-13T00:08:41.336412285Z" level=error msg="encountered an error cleaning up failed sandbox \"a628c9f57a387c30d21df3f761670db06db768ddf12b19eb8d70d9c8345f905b\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 13 00:08:41.336496 containerd[1501]: time="2025-09-13T00:08:41.336479763Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-577b6875b7-hxt89,Uid:4166a8bf-c6d2-4b14-bbe8-bc5fa59948b2,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"a628c9f57a387c30d21df3f761670db06db768ddf12b19eb8d70d9c8345f905b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 13 00:08:41.337649 containerd[1501]: time="2025-09-13T00:08:41.337617727Z" level=error msg="Failed to destroy network for sandbox \"9de479bdd8fcad82feee5c64a72af5b4272a54cf434bea76104a486fa2c45353\"" error="plugin 
type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 13 00:08:41.338077 kubelet[2611]: E0913 00:08:41.338002 2611 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a628c9f57a387c30d21df3f761670db06db768ddf12b19eb8d70d9c8345f905b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 13 00:08:41.339569 containerd[1501]: time="2025-09-13T00:08:41.338091734Z" level=error msg="encountered an error cleaning up failed sandbox \"9de479bdd8fcad82feee5c64a72af5b4272a54cf434bea76104a486fa2c45353\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 13 00:08:41.339569 containerd[1501]: time="2025-09-13T00:08:41.339022095Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-57d6c7f84c-tpttf,Uid:3a77a062-d867-4bd5-b1d4-b714b684c994,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"9de479bdd8fcad82feee5c64a72af5b4272a54cf434bea76104a486fa2c45353\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 13 00:08:41.339889 kubelet[2611]: E0913 00:08:41.339869 2611 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a628c9f57a387c30d21df3f761670db06db768ddf12b19eb8d70d9c8345f905b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-577b6875b7-hxt89" Sep 13 00:08:41.340365 kubelet[2611]: E0913 00:08:41.340143 2611 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a628c9f57a387c30d21df3f761670db06db768ddf12b19eb8d70d9c8345f905b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-577b6875b7-hxt89" Sep 13 00:08:41.340365 kubelet[2611]: E0913 00:08:41.340200 2611 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-577b6875b7-hxt89_calico-system(4166a8bf-c6d2-4b14-bbe8-bc5fa59948b2)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-577b6875b7-hxt89_calico-system(4166a8bf-c6d2-4b14-bbe8-bc5fa59948b2)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"a628c9f57a387c30d21df3f761670db06db768ddf12b19eb8d70d9c8345f905b\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-577b6875b7-hxt89" podUID="4166a8bf-c6d2-4b14-bbe8-bc5fa59948b2" Sep 13 00:08:41.341148 kubelet[2611]: E0913 00:08:41.338546 2611 log.go:32] 
"RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f3f76841deaea1b7ce378f2a1523b9b857f562b5d6ba2bed49e181590f793f36\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 13 00:08:41.342533 kubelet[2611]: E0913 00:08:41.341490 2611 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9de479bdd8fcad82feee5c64a72af5b4272a54cf434bea76104a486fa2c45353\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 13 00:08:41.342533 kubelet[2611]: E0913 00:08:41.341589 2611 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9de479bdd8fcad82feee5c64a72af5b4272a54cf434bea76104a486fa2c45353\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-57d6c7f84c-tpttf" Sep 13 00:08:41.342533 kubelet[2611]: E0913 00:08:41.341607 2611 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9de479bdd8fcad82feee5c64a72af5b4272a54cf434bea76104a486fa2c45353\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-57d6c7f84c-tpttf" Sep 13 00:08:41.342533 kubelet[2611]: E0913 00:08:41.341488 2611 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f3f76841deaea1b7ce378f2a1523b9b857f562b5d6ba2bed49e181590f793f36\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-ts6l4" Sep 13 00:08:41.342677 kubelet[2611]: E0913 00:08:41.341744 2611 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f3f76841deaea1b7ce378f2a1523b9b857f562b5d6ba2bed49e181590f793f36\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-ts6l4" Sep 13 00:08:41.342677 kubelet[2611]: E0913 00:08:41.341780 2611 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-674b8bbfcf-ts6l4_kube-system(807cba0d-efbf-495e-bbeb-ab5413159760)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-674b8bbfcf-ts6l4_kube-system(807cba0d-efbf-495e-bbeb-ab5413159760)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"f3f76841deaea1b7ce378f2a1523b9b857f562b5d6ba2bed49e181590f793f36\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-674b8bbfcf-ts6l4" 
podUID="807cba0d-efbf-495e-bbeb-ab5413159760" Sep 13 00:08:41.342677 kubelet[2611]: E0913 00:08:41.338524 2611 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"70302e1b99e4a70b05a0e6a485a959769c320e7396db1eccf4b6e32b8c3ba19c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 13 00:08:41.342785 kubelet[2611]: E0913 00:08:41.341919 2611 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"70302e1b99e4a70b05a0e6a485a959769c320e7396db1eccf4b6e32b8c3ba19c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-q6mzd" Sep 13 00:08:41.342785 kubelet[2611]: E0913 00:08:41.341951 2611 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"70302e1b99e4a70b05a0e6a485a959769c320e7396db1eccf4b6e32b8c3ba19c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-q6mzd" Sep 13 00:08:41.342785 kubelet[2611]: E0913 00:08:41.341976 2611 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-674b8bbfcf-q6mzd_kube-system(3ba023c8-210d-4e58-bf51-ba6b7484e570)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-674b8bbfcf-q6mzd_kube-system(3ba023c8-210d-4e58-bf51-ba6b7484e570)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"70302e1b99e4a70b05a0e6a485a959769c320e7396db1eccf4b6e32b8c3ba19c\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-674b8bbfcf-q6mzd" podUID="3ba023c8-210d-4e58-bf51-ba6b7484e570" Sep 13 00:08:41.343165 kubelet[2611]: E0913 00:08:41.341808 2611 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-57d6c7f84c-tpttf_calico-system(3a77a062-d867-4bd5-b1d4-b714b684c994)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-57d6c7f84c-tpttf_calico-system(3a77a062-d867-4bd5-b1d4-b714b684c994)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"9de479bdd8fcad82feee5c64a72af5b4272a54cf434bea76104a486fa2c45353\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-57d6c7f84c-tpttf" podUID="3a77a062-d867-4bd5-b1d4-b714b684c994" Sep 13 00:08:41.350768 containerd[1501]: time="2025-09-13T00:08:41.349782409Z" level=error msg="Failed to destroy network for sandbox \"545275614684121451b5fa9dd505392c25919306491a0a341953c2c0ea40c9e5\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 13 00:08:41.351208 containerd[1501]: time="2025-09-13T00:08:41.351087769Z" level=error 
msg="encountered an error cleaning up failed sandbox \"545275614684121451b5fa9dd505392c25919306491a0a341953c2c0ea40c9e5\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 13 00:08:41.351208 containerd[1501]: time="2025-09-13T00:08:41.351133606Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-9c789df89-kvgsc,Uid:81fa2d11-a454-4bc0-824e-4ee1c118d1e3,Namespace:calico-apiserver,Attempt:0,} failed, error" error="failed to setup network for sandbox \"545275614684121451b5fa9dd505392c25919306491a0a341953c2c0ea40c9e5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 13 00:08:41.351764 kubelet[2611]: E0913 00:08:41.351582 2611 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"545275614684121451b5fa9dd505392c25919306491a0a341953c2c0ea40c9e5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 13 00:08:41.351764 kubelet[2611]: E0913 00:08:41.351657 2611 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"545275614684121451b5fa9dd505392c25919306491a0a341953c2c0ea40c9e5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-9c789df89-kvgsc" Sep 13 00:08:41.351764 kubelet[2611]: E0913 00:08:41.351676 2611 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"545275614684121451b5fa9dd505392c25919306491a0a341953c2c0ea40c9e5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-9c789df89-kvgsc" Sep 13 00:08:41.352065 kubelet[2611]: E0913 00:08:41.351753 2611 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-9c789df89-kvgsc_calico-apiserver(81fa2d11-a454-4bc0-824e-4ee1c118d1e3)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-9c789df89-kvgsc_calico-apiserver(81fa2d11-a454-4bc0-824e-4ee1c118d1e3)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"545275614684121451b5fa9dd505392c25919306491a0a341953c2c0ea40c9e5\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-9c789df89-kvgsc" podUID="81fa2d11-a454-4bc0-824e-4ee1c118d1e3" Sep 13 00:08:41.355669 containerd[1501]: time="2025-09-13T00:08:41.355633323Z" level=error msg="Failed to destroy network for sandbox \"c1d75c013bdcf693fcd2a3ddcc3e013d373b11c7d8c24377f3bb7e3da51ee478\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running 
and has mounted /var/lib/calico/" Sep 13 00:08:41.356109 containerd[1501]: time="2025-09-13T00:08:41.355964539Z" level=error msg="encountered an error cleaning up failed sandbox \"c1d75c013bdcf693fcd2a3ddcc3e013d373b11c7d8c24377f3bb7e3da51ee478\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 13 00:08:41.356109 containerd[1501]: time="2025-09-13T00:08:41.356026836Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-9c789df89-chbwf,Uid:66c3fc14-19e6-48c3-91eb-518f5181421b,Namespace:calico-apiserver,Attempt:0,} failed, error" error="failed to setup network for sandbox \"c1d75c013bdcf693fcd2a3ddcc3e013d373b11c7d8c24377f3bb7e3da51ee478\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 13 00:08:41.357215 kubelet[2611]: E0913 00:08:41.356718 2611 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c1d75c013bdcf693fcd2a3ddcc3e013d373b11c7d8c24377f3bb7e3da51ee478\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 13 00:08:41.357215 kubelet[2611]: E0913 00:08:41.356770 2611 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c1d75c013bdcf693fcd2a3ddcc3e013d373b11c7d8c24377f3bb7e3da51ee478\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-9c789df89-chbwf" Sep 13 00:08:41.357215 kubelet[2611]: E0913 00:08:41.356819 2611 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c1d75c013bdcf693fcd2a3ddcc3e013d373b11c7d8c24377f3bb7e3da51ee478\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-9c789df89-chbwf" Sep 13 00:08:41.357345 containerd[1501]: time="2025-09-13T00:08:41.357022431Z" level=error msg="Failed to destroy network for sandbox \"917fdbefc3908f0bf83798fac2c4846875ba56865120df4665ebe4f06d46340d\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 13 00:08:41.357648 kubelet[2611]: E0913 00:08:41.356880 2611 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-9c789df89-chbwf_calico-apiserver(66c3fc14-19e6-48c3-91eb-518f5181421b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-9c789df89-chbwf_calico-apiserver(66c3fc14-19e6-48c3-91eb-518f5181421b)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"c1d75c013bdcf693fcd2a3ddcc3e013d373b11c7d8c24377f3bb7e3da51ee478\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node 
container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-9c789df89-chbwf" podUID="66c3fc14-19e6-48c3-91eb-518f5181421b" Sep 13 00:08:41.357708 containerd[1501]: time="2025-09-13T00:08:41.357483393Z" level=error msg="encountered an error cleaning up failed sandbox \"917fdbefc3908f0bf83798fac2c4846875ba56865120df4665ebe4f06d46340d\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 13 00:08:41.357708 containerd[1501]: time="2025-09-13T00:08:41.357519552Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-zn7kg,Uid:e8347be8-88b8-4276-8b5d-e1c006f27806,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"917fdbefc3908f0bf83798fac2c4846875ba56865120df4665ebe4f06d46340d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 13 00:08:41.358049 kubelet[2611]: E0913 00:08:41.357896 2611 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"917fdbefc3908f0bf83798fac2c4846875ba56865120df4665ebe4f06d46340d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 13 00:08:41.358049 kubelet[2611]: E0913 00:08:41.357983 2611 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"917fdbefc3908f0bf83798fac2c4846875ba56865120df4665ebe4f06d46340d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-54d579b49d-zn7kg" Sep 13 00:08:41.358049 kubelet[2611]: E0913 00:08:41.357996 2611 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"917fdbefc3908f0bf83798fac2c4846875ba56865120df4665ebe4f06d46340d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-54d579b49d-zn7kg" Sep 13 00:08:41.358261 kubelet[2611]: E0913 00:08:41.358201 2611 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-54d579b49d-zn7kg_calico-system(e8347be8-88b8-4276-8b5d-e1c006f27806)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-54d579b49d-zn7kg_calico-system(e8347be8-88b8-4276-8b5d-e1c006f27806)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"917fdbefc3908f0bf83798fac2c4846875ba56865120df4665ebe4f06d46340d\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-54d579b49d-zn7kg" podUID="e8347be8-88b8-4276-8b5d-e1c006f27806" Sep 13 00:08:41.720405 systemd[1]: Created slice kubepods-besteffort-podd0c095e9_b846_4f71_bc12_2e92665be871.slice - libcontainer container 
kubepods-besteffort-podd0c095e9_b846_4f71_bc12_2e92665be871.slice. Sep 13 00:08:41.723453 containerd[1501]: time="2025-09-13T00:08:41.723415136Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-pdqq4,Uid:d0c095e9-b846-4f71-bc12-2e92665be871,Namespace:calico-system,Attempt:0,}" Sep 13 00:08:41.775976 containerd[1501]: time="2025-09-13T00:08:41.775842035Z" level=error msg="Failed to destroy network for sandbox \"132b716fbd3e34dd666a81d223dc704a3e4527a521620c1192a0fd481e16d6ea\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 13 00:08:41.776370 containerd[1501]: time="2025-09-13T00:08:41.776294441Z" level=error msg="encountered an error cleaning up failed sandbox \"132b716fbd3e34dd666a81d223dc704a3e4527a521620c1192a0fd481e16d6ea\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 13 00:08:41.776370 containerd[1501]: time="2025-09-13T00:08:41.776348203Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-pdqq4,Uid:d0c095e9-b846-4f71-bc12-2e92665be871,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"132b716fbd3e34dd666a81d223dc704a3e4527a521620c1192a0fd481e16d6ea\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 13 00:08:41.776698 kubelet[2611]: E0913 00:08:41.776615 2611 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"132b716fbd3e34dd666a81d223dc704a3e4527a521620c1192a0fd481e16d6ea\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 13 00:08:41.776818 kubelet[2611]: E0913 00:08:41.776699 2611 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"132b716fbd3e34dd666a81d223dc704a3e4527a521620c1192a0fd481e16d6ea\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-pdqq4" Sep 13 00:08:41.776818 kubelet[2611]: E0913 00:08:41.776720 2611 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"132b716fbd3e34dd666a81d223dc704a3e4527a521620c1192a0fd481e16d6ea\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-pdqq4" Sep 13 00:08:41.776818 kubelet[2611]: E0913 00:08:41.776779 2611 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-pdqq4_calico-system(d0c095e9-b846-4f71-bc12-2e92665be871)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-pdqq4_calico-system(d0c095e9-b846-4f71-bc12-2e92665be871)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox 
\\\"132b716fbd3e34dd666a81d223dc704a3e4527a521620c1192a0fd481e16d6ea\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-pdqq4" podUID="d0c095e9-b846-4f71-bc12-2e92665be871" Sep 13 00:08:41.900565 kubelet[2611]: I0913 00:08:41.900521 2611 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="917fdbefc3908f0bf83798fac2c4846875ba56865120df4665ebe4f06d46340d" Sep 13 00:08:41.903859 kubelet[2611]: I0913 00:08:41.903296 2611 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="70302e1b99e4a70b05a0e6a485a959769c320e7396db1eccf4b6e32b8c3ba19c" Sep 13 00:08:41.918003 kubelet[2611]: I0913 00:08:41.917782 2611 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a628c9f57a387c30d21df3f761670db06db768ddf12b19eb8d70d9c8345f905b" Sep 13 00:08:41.920747 kubelet[2611]: I0913 00:08:41.920692 2611 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9de479bdd8fcad82feee5c64a72af5b4272a54cf434bea76104a486fa2c45353" Sep 13 00:08:41.927943 kubelet[2611]: I0913 00:08:41.926331 2611 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="545275614684121451b5fa9dd505392c25919306491a0a341953c2c0ea40c9e5" Sep 13 00:08:41.928083 kubelet[2611]: I0913 00:08:41.928041 2611 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f3f76841deaea1b7ce378f2a1523b9b857f562b5d6ba2bed49e181590f793f36" Sep 13 00:08:41.931221 kubelet[2611]: I0913 00:08:41.930081 2611 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c1d75c013bdcf693fcd2a3ddcc3e013d373b11c7d8c24377f3bb7e3da51ee478" Sep 13 00:08:41.931683 kubelet[2611]: I0913 00:08:41.931635 2611 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="132b716fbd3e34dd666a81d223dc704a3e4527a521620c1192a0fd481e16d6ea" Sep 13 00:08:41.945753 containerd[1501]: time="2025-09-13T00:08:41.945717157Z" level=info msg="StopPodSandbox for \"a628c9f57a387c30d21df3f761670db06db768ddf12b19eb8d70d9c8345f905b\"" Sep 13 00:08:41.947790 containerd[1501]: time="2025-09-13T00:08:41.947741048Z" level=info msg="Ensure that sandbox a628c9f57a387c30d21df3f761670db06db768ddf12b19eb8d70d9c8345f905b in task-service has been cleanup successfully" Sep 13 00:08:41.949004 containerd[1501]: time="2025-09-13T00:08:41.948971386Z" level=info msg="StopPodSandbox for \"132b716fbd3e34dd666a81d223dc704a3e4527a521620c1192a0fd481e16d6ea\"" Sep 13 00:08:41.949142 containerd[1501]: time="2025-09-13T00:08:41.949112493Z" level=info msg="Ensure that sandbox 132b716fbd3e34dd666a81d223dc704a3e4527a521620c1192a0fd481e16d6ea in task-service has been cleanup successfully" Sep 13 00:08:41.955147 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-c1d75c013bdcf693fcd2a3ddcc3e013d373b11c7d8c24377f3bb7e3da51ee478-shm.mount: Deactivated successfully. Sep 13 00:08:41.955444 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-9de479bdd8fcad82feee5c64a72af5b4272a54cf434bea76104a486fa2c45353-shm.mount: Deactivated successfully. 
Sep 13 00:08:41.955621 containerd[1501]: time="2025-09-13T00:08:41.955582598Z" level=info msg="StopPodSandbox for \"917fdbefc3908f0bf83798fac2c4846875ba56865120df4665ebe4f06d46340d\"" Sep 13 00:08:41.958751 containerd[1501]: time="2025-09-13T00:08:41.955724015Z" level=info msg="Ensure that sandbox 917fdbefc3908f0bf83798fac2c4846875ba56865120df4665ebe4f06d46340d in task-service has been cleanup successfully" Sep 13 00:08:41.960449 containerd[1501]: time="2025-09-13T00:08:41.959941387Z" level=info msg="StopPodSandbox for \"9de479bdd8fcad82feee5c64a72af5b4272a54cf434bea76104a486fa2c45353\"" Sep 13 00:08:41.960449 containerd[1501]: time="2025-09-13T00:08:41.960068848Z" level=info msg="Ensure that sandbox 9de479bdd8fcad82feee5c64a72af5b4272a54cf434bea76104a486fa2c45353 in task-service has been cleanup successfully" Sep 13 00:08:41.960449 containerd[1501]: time="2025-09-13T00:08:41.960387992Z" level=info msg="StopPodSandbox for \"70302e1b99e4a70b05a0e6a485a959769c320e7396db1eccf4b6e32b8c3ba19c\"" Sep 13 00:08:41.960566 containerd[1501]: time="2025-09-13T00:08:41.960540120Z" level=info msg="Ensure that sandbox 70302e1b99e4a70b05a0e6a485a959769c320e7396db1eccf4b6e32b8c3ba19c in task-service has been cleanup successfully" Sep 13 00:08:41.961369 containerd[1501]: time="2025-09-13T00:08:41.961058301Z" level=info msg="StopPodSandbox for \"545275614684121451b5fa9dd505392c25919306491a0a341953c2c0ea40c9e5\"" Sep 13 00:08:41.961369 containerd[1501]: time="2025-09-13T00:08:41.961201010Z" level=info msg="Ensure that sandbox 545275614684121451b5fa9dd505392c25919306491a0a341953c2c0ea40c9e5 in task-service has been cleanup successfully" Sep 13 00:08:41.965709 containerd[1501]: time="2025-09-13T00:08:41.965687903Z" level=info msg="StopPodSandbox for \"f3f76841deaea1b7ce378f2a1523b9b857f562b5d6ba2bed49e181590f793f36\"" Sep 13 00:08:41.965962 containerd[1501]: time="2025-09-13T00:08:41.965941562Z" level=info msg="Ensure that sandbox f3f76841deaea1b7ce378f2a1523b9b857f562b5d6ba2bed49e181590f793f36 in task-service has been cleanup successfully" Sep 13 00:08:41.968457 containerd[1501]: time="2025-09-13T00:08:41.968093525Z" level=info msg="StopPodSandbox for \"c1d75c013bdcf693fcd2a3ddcc3e013d373b11c7d8c24377f3bb7e3da51ee478\"" Sep 13 00:08:41.969383 containerd[1501]: time="2025-09-13T00:08:41.969354431Z" level=info msg="Ensure that sandbox c1d75c013bdcf693fcd2a3ddcc3e013d373b11c7d8c24377f3bb7e3da51ee478 in task-service has been cleanup successfully" Sep 13 00:08:42.038387 containerd[1501]: time="2025-09-13T00:08:42.038182982Z" level=error msg="StopPodSandbox for \"70302e1b99e4a70b05a0e6a485a959769c320e7396db1eccf4b6e32b8c3ba19c\" failed" error="failed to destroy network for sandbox \"70302e1b99e4a70b05a0e6a485a959769c320e7396db1eccf4b6e32b8c3ba19c\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 13 00:08:42.039825 kubelet[2611]: E0913 00:08:42.039603 2611 log.go:32] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"70302e1b99e4a70b05a0e6a485a959769c320e7396db1eccf4b6e32b8c3ba19c\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="70302e1b99e4a70b05a0e6a485a959769c320e7396db1eccf4b6e32b8c3ba19c" Sep 13 00:08:42.039825 kubelet[2611]: E0913 00:08:42.039686 2611 
kuberuntime_manager.go:1586] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"70302e1b99e4a70b05a0e6a485a959769c320e7396db1eccf4b6e32b8c3ba19c"} Sep 13 00:08:42.039825 kubelet[2611]: E0913 00:08:42.039743 2611 kuberuntime_manager.go:1161] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"3ba023c8-210d-4e58-bf51-ba6b7484e570\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"70302e1b99e4a70b05a0e6a485a959769c320e7396db1eccf4b6e32b8c3ba19c\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Sep 13 00:08:42.039825 kubelet[2611]: E0913 00:08:42.039782 2611 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"3ba023c8-210d-4e58-bf51-ba6b7484e570\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"70302e1b99e4a70b05a0e6a485a959769c320e7396db1eccf4b6e32b8c3ba19c\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-674b8bbfcf-q6mzd" podUID="3ba023c8-210d-4e58-bf51-ba6b7484e570" Sep 13 00:08:42.041793 containerd[1501]: time="2025-09-13T00:08:42.041485888Z" level=error msg="StopPodSandbox for \"c1d75c013bdcf693fcd2a3ddcc3e013d373b11c7d8c24377f3bb7e3da51ee478\" failed" error="failed to destroy network for sandbox \"c1d75c013bdcf693fcd2a3ddcc3e013d373b11c7d8c24377f3bb7e3da51ee478\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 13 00:08:42.041884 kubelet[2611]: E0913 00:08:42.041653 2611 log.go:32] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"c1d75c013bdcf693fcd2a3ddcc3e013d373b11c7d8c24377f3bb7e3da51ee478\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="c1d75c013bdcf693fcd2a3ddcc3e013d373b11c7d8c24377f3bb7e3da51ee478" Sep 13 00:08:42.041884 kubelet[2611]: E0913 00:08:42.041694 2611 kuberuntime_manager.go:1586] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"c1d75c013bdcf693fcd2a3ddcc3e013d373b11c7d8c24377f3bb7e3da51ee478"} Sep 13 00:08:42.041884 kubelet[2611]: E0913 00:08:42.041718 2611 kuberuntime_manager.go:1161] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"66c3fc14-19e6-48c3-91eb-518f5181421b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"c1d75c013bdcf693fcd2a3ddcc3e013d373b11c7d8c24377f3bb7e3da51ee478\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Sep 13 00:08:42.041884 kubelet[2611]: E0913 00:08:42.041739 2611 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"66c3fc14-19e6-48c3-91eb-518f5181421b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"c1d75c013bdcf693fcd2a3ddcc3e013d373b11c7d8c24377f3bb7e3da51ee478\\\": plugin 
type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-9c789df89-chbwf" podUID="66c3fc14-19e6-48c3-91eb-518f5181421b" Sep 13 00:08:42.058195 containerd[1501]: time="2025-09-13T00:08:42.056232771Z" level=error msg="StopPodSandbox for \"545275614684121451b5fa9dd505392c25919306491a0a341953c2c0ea40c9e5\" failed" error="failed to destroy network for sandbox \"545275614684121451b5fa9dd505392c25919306491a0a341953c2c0ea40c9e5\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 13 00:08:42.058583 kubelet[2611]: E0913 00:08:42.058440 2611 log.go:32] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"545275614684121451b5fa9dd505392c25919306491a0a341953c2c0ea40c9e5\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="545275614684121451b5fa9dd505392c25919306491a0a341953c2c0ea40c9e5" Sep 13 00:08:42.058583 kubelet[2611]: E0913 00:08:42.058487 2611 kuberuntime_manager.go:1586] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"545275614684121451b5fa9dd505392c25919306491a0a341953c2c0ea40c9e5"} Sep 13 00:08:42.058925 kubelet[2611]: E0913 00:08:42.058768 2611 kuberuntime_manager.go:1161] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"81fa2d11-a454-4bc0-824e-4ee1c118d1e3\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"545275614684121451b5fa9dd505392c25919306491a0a341953c2c0ea40c9e5\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Sep 13 00:08:42.058925 kubelet[2611]: E0913 00:08:42.058811 2611 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"81fa2d11-a454-4bc0-824e-4ee1c118d1e3\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"545275614684121451b5fa9dd505392c25919306491a0a341953c2c0ea40c9e5\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-9c789df89-kvgsc" podUID="81fa2d11-a454-4bc0-824e-4ee1c118d1e3" Sep 13 00:08:42.059301 containerd[1501]: time="2025-09-13T00:08:42.059072071Z" level=error msg="StopPodSandbox for \"132b716fbd3e34dd666a81d223dc704a3e4527a521620c1192a0fd481e16d6ea\" failed" error="failed to destroy network for sandbox \"132b716fbd3e34dd666a81d223dc704a3e4527a521620c1192a0fd481e16d6ea\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 13 00:08:42.059348 kubelet[2611]: E0913 00:08:42.059205 2611 log.go:32] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"132b716fbd3e34dd666a81d223dc704a3e4527a521620c1192a0fd481e16d6ea\": plugin type=\"calico\" failed (delete): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="132b716fbd3e34dd666a81d223dc704a3e4527a521620c1192a0fd481e16d6ea" Sep 13 00:08:42.059348 kubelet[2611]: E0913 00:08:42.059230 2611 kuberuntime_manager.go:1586] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"132b716fbd3e34dd666a81d223dc704a3e4527a521620c1192a0fd481e16d6ea"} Sep 13 00:08:42.059348 kubelet[2611]: E0913 00:08:42.059249 2611 kuberuntime_manager.go:1161] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"d0c095e9-b846-4f71-bc12-2e92665be871\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"132b716fbd3e34dd666a81d223dc704a3e4527a521620c1192a0fd481e16d6ea\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Sep 13 00:08:42.059348 kubelet[2611]: E0913 00:08:42.059273 2611 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"d0c095e9-b846-4f71-bc12-2e92665be871\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"132b716fbd3e34dd666a81d223dc704a3e4527a521620c1192a0fd481e16d6ea\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-pdqq4" podUID="d0c095e9-b846-4f71-bc12-2e92665be871" Sep 13 00:08:42.064600 containerd[1501]: time="2025-09-13T00:08:42.064366537Z" level=error msg="StopPodSandbox for \"9de479bdd8fcad82feee5c64a72af5b4272a54cf434bea76104a486fa2c45353\" failed" error="failed to destroy network for sandbox \"9de479bdd8fcad82feee5c64a72af5b4272a54cf434bea76104a486fa2c45353\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 13 00:08:42.064654 kubelet[2611]: E0913 00:08:42.064489 2611 log.go:32] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"9de479bdd8fcad82feee5c64a72af5b4272a54cf434bea76104a486fa2c45353\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="9de479bdd8fcad82feee5c64a72af5b4272a54cf434bea76104a486fa2c45353" Sep 13 00:08:42.064654 kubelet[2611]: E0913 00:08:42.064515 2611 kuberuntime_manager.go:1586] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"9de479bdd8fcad82feee5c64a72af5b4272a54cf434bea76104a486fa2c45353"} Sep 13 00:08:42.064654 kubelet[2611]: E0913 00:08:42.064537 2611 kuberuntime_manager.go:1161] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"3a77a062-d867-4bd5-b1d4-b714b684c994\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"9de479bdd8fcad82feee5c64a72af5b4272a54cf434bea76104a486fa2c45353\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Sep 13 00:08:42.064654 kubelet[2611]: E0913 00:08:42.064577 2611 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"KillPodSandbox\" for \"3a77a062-d867-4bd5-b1d4-b714b684c994\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"9de479bdd8fcad82feee5c64a72af5b4272a54cf434bea76104a486fa2c45353\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-57d6c7f84c-tpttf" podUID="3a77a062-d867-4bd5-b1d4-b714b684c994" Sep 13 00:08:42.065069 containerd[1501]: time="2025-09-13T00:08:42.064842856Z" level=error msg="StopPodSandbox for \"917fdbefc3908f0bf83798fac2c4846875ba56865120df4665ebe4f06d46340d\" failed" error="failed to destroy network for sandbox \"917fdbefc3908f0bf83798fac2c4846875ba56865120df4665ebe4f06d46340d\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 13 00:08:42.065205 kubelet[2611]: E0913 00:08:42.065186 2611 log.go:32] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"917fdbefc3908f0bf83798fac2c4846875ba56865120df4665ebe4f06d46340d\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="917fdbefc3908f0bf83798fac2c4846875ba56865120df4665ebe4f06d46340d" Sep 13 00:08:42.065299 kubelet[2611]: E0913 00:08:42.065286 2611 kuberuntime_manager.go:1586] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"917fdbefc3908f0bf83798fac2c4846875ba56865120df4665ebe4f06d46340d"} Sep 13 00:08:42.065382 kubelet[2611]: E0913 00:08:42.065371 2611 kuberuntime_manager.go:1161] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"e8347be8-88b8-4276-8b5d-e1c006f27806\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"917fdbefc3908f0bf83798fac2c4846875ba56865120df4665ebe4f06d46340d\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Sep 13 00:08:42.065497 kubelet[2611]: E0913 00:08:42.065456 2611 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"e8347be8-88b8-4276-8b5d-e1c006f27806\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"917fdbefc3908f0bf83798fac2c4846875ba56865120df4665ebe4f06d46340d\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-54d579b49d-zn7kg" podUID="e8347be8-88b8-4276-8b5d-e1c006f27806" Sep 13 00:08:42.067906 containerd[1501]: time="2025-09-13T00:08:42.067588730Z" level=error msg="StopPodSandbox for \"a628c9f57a387c30d21df3f761670db06db768ddf12b19eb8d70d9c8345f905b\" failed" error="failed to destroy network for sandbox \"a628c9f57a387c30d21df3f761670db06db768ddf12b19eb8d70d9c8345f905b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 13 00:08:42.068022 kubelet[2611]: E0913 00:08:42.067697 2611 log.go:32] 
"StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"a628c9f57a387c30d21df3f761670db06db768ddf12b19eb8d70d9c8345f905b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="a628c9f57a387c30d21df3f761670db06db768ddf12b19eb8d70d9c8345f905b" Sep 13 00:08:42.068022 kubelet[2611]: E0913 00:08:42.067722 2611 kuberuntime_manager.go:1586] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"a628c9f57a387c30d21df3f761670db06db768ddf12b19eb8d70d9c8345f905b"} Sep 13 00:08:42.068022 kubelet[2611]: E0913 00:08:42.067744 2611 kuberuntime_manager.go:1161] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"4166a8bf-c6d2-4b14-bbe8-bc5fa59948b2\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"a628c9f57a387c30d21df3f761670db06db768ddf12b19eb8d70d9c8345f905b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Sep 13 00:08:42.068022 kubelet[2611]: E0913 00:08:42.067766 2611 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"4166a8bf-c6d2-4b14-bbe8-bc5fa59948b2\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"a628c9f57a387c30d21df3f761670db06db768ddf12b19eb8d70d9c8345f905b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-577b6875b7-hxt89" podUID="4166a8bf-c6d2-4b14-bbe8-bc5fa59948b2" Sep 13 00:08:42.069459 containerd[1501]: time="2025-09-13T00:08:42.069420586Z" level=error msg="StopPodSandbox for \"f3f76841deaea1b7ce378f2a1523b9b857f562b5d6ba2bed49e181590f793f36\" failed" error="failed to destroy network for sandbox \"f3f76841deaea1b7ce378f2a1523b9b857f562b5d6ba2bed49e181590f793f36\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 13 00:08:42.069584 kubelet[2611]: E0913 00:08:42.069549 2611 log.go:32] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"f3f76841deaea1b7ce378f2a1523b9b857f562b5d6ba2bed49e181590f793f36\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="f3f76841deaea1b7ce378f2a1523b9b857f562b5d6ba2bed49e181590f793f36" Sep 13 00:08:42.069635 kubelet[2611]: E0913 00:08:42.069586 2611 kuberuntime_manager.go:1586] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"f3f76841deaea1b7ce378f2a1523b9b857f562b5d6ba2bed49e181590f793f36"} Sep 13 00:08:42.069635 kubelet[2611]: E0913 00:08:42.069609 2611 kuberuntime_manager.go:1161] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"807cba0d-efbf-495e-bbeb-ab5413159760\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"f3f76841deaea1b7ce378f2a1523b9b857f562b5d6ba2bed49e181590f793f36\\\": plugin type=\\\"calico\\\" failed 
(delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Sep 13 00:08:42.069635 kubelet[2611]: E0913 00:08:42.069625 2611 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"807cba0d-efbf-495e-bbeb-ab5413159760\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"f3f76841deaea1b7ce378f2a1523b9b857f562b5d6ba2bed49e181590f793f36\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-674b8bbfcf-ts6l4" podUID="807cba0d-efbf-495e-bbeb-ab5413159760" Sep 13 00:08:45.106902 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount108999567.mount: Deactivated successfully. Sep 13 00:08:45.185408 containerd[1501]: time="2025-09-13T00:08:45.185322873Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.30.3: active requests=0, bytes read=157078339" Sep 13 00:08:45.189889 containerd[1501]: time="2025-09-13T00:08:45.189522785Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.30.3\" with image id \"sha256:ce9c4ac0f175f22c56e80844e65379d9ebe1d8a4e2bbb38dc1db0f53a8826f0f\", repo tag \"ghcr.io/flatcar/calico/node:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/node@sha256:bcb8146fcaeced1e1c88fad3eaa697f1680746bd23c3e7e8d4535bc484c6f2a1\", size \"157078201\" in 4.28911453s" Sep 13 00:08:45.189889 containerd[1501]: time="2025-09-13T00:08:45.189562080Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.3\" returns image reference \"sha256:ce9c4ac0f175f22c56e80844e65379d9ebe1d8a4e2bbb38dc1db0f53a8826f0f\"" Sep 13 00:08:45.191100 containerd[1501]: time="2025-09-13T00:08:45.191046814Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 13 00:08:45.238313 containerd[1501]: time="2025-09-13T00:08:45.238164181Z" level=info msg="ImageCreate event name:\"sha256:ce9c4ac0f175f22c56e80844e65379d9ebe1d8a4e2bbb38dc1db0f53a8826f0f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 13 00:08:45.239038 containerd[1501]: time="2025-09-13T00:08:45.238632006Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:bcb8146fcaeced1e1c88fad3eaa697f1680746bd23c3e7e8d4535bc484c6f2a1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 13 00:08:45.240875 containerd[1501]: time="2025-09-13T00:08:45.240823354Z" level=info msg="CreateContainer within sandbox \"ca5b64bf84b0b58cdcc3ed88055bb8250edefe88dedf30a45c65d45687757299\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Sep 13 00:08:45.300312 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3071921791.mount: Deactivated successfully. 
Sep 13 00:08:45.318931 containerd[1501]: time="2025-09-13T00:08:45.318189752Z" level=info msg="CreateContainer within sandbox \"ca5b64bf84b0b58cdcc3ed88055bb8250edefe88dedf30a45c65d45687757299\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"3cfa10dd4f9f1ae5174cdb858108f237a2c81852a624b8c7d404647bc2ba0853\"" Sep 13 00:08:45.323723 containerd[1501]: time="2025-09-13T00:08:45.323672156Z" level=info msg="StartContainer for \"3cfa10dd4f9f1ae5174cdb858108f237a2c81852a624b8c7d404647bc2ba0853\"" Sep 13 00:08:45.384087 systemd[1]: Started cri-containerd-3cfa10dd4f9f1ae5174cdb858108f237a2c81852a624b8c7d404647bc2ba0853.scope - libcontainer container 3cfa10dd4f9f1ae5174cdb858108f237a2c81852a624b8c7d404647bc2ba0853. Sep 13 00:08:45.426117 containerd[1501]: time="2025-09-13T00:08:45.426058529Z" level=info msg="StartContainer for \"3cfa10dd4f9f1ae5174cdb858108f237a2c81852a624b8c7d404647bc2ba0853\" returns successfully" Sep 13 00:08:45.523694 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Sep 13 00:08:45.525063 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld <Jason@zx2c4.com>. All Rights Reserved. Sep 13 00:08:45.744775 containerd[1501]: time="2025-09-13T00:08:45.744599491Z" level=info msg="StopPodSandbox for \"9de479bdd8fcad82feee5c64a72af5b4272a54cf434bea76104a486fa2c45353\"" Sep 13 00:08:46.115685 containerd[1501]: 2025-09-13 00:08:45.848 [INFO][3850] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="9de479bdd8fcad82feee5c64a72af5b4272a54cf434bea76104a486fa2c45353" Sep 13 00:08:46.115685 containerd[1501]: 2025-09-13 00:08:45.849 [INFO][3850] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="9de479bdd8fcad82feee5c64a72af5b4272a54cf434bea76104a486fa2c45353" iface="eth0" netns="/var/run/netns/cni-1aeeaf98-fc8a-1ad8-7f8a-689ae5f61d44" Sep 13 00:08:46.115685 containerd[1501]: 2025-09-13 00:08:45.850 [INFO][3850] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="9de479bdd8fcad82feee5c64a72af5b4272a54cf434bea76104a486fa2c45353" iface="eth0" netns="/var/run/netns/cni-1aeeaf98-fc8a-1ad8-7f8a-689ae5f61d44" Sep 13 00:08:46.115685 containerd[1501]: 2025-09-13 00:08:45.851 [INFO][3850] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. ContainerID="9de479bdd8fcad82feee5c64a72af5b4272a54cf434bea76104a486fa2c45353" iface="eth0" netns="/var/run/netns/cni-1aeeaf98-fc8a-1ad8-7f8a-689ae5f61d44" Sep 13 00:08:46.115685 containerd[1501]: 2025-09-13 00:08:45.851 [INFO][3850] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="9de479bdd8fcad82feee5c64a72af5b4272a54cf434bea76104a486fa2c45353" Sep 13 00:08:46.115685 containerd[1501]: 2025-09-13 00:08:45.851 [INFO][3850] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="9de479bdd8fcad82feee5c64a72af5b4272a54cf434bea76104a486fa2c45353" Sep 13 00:08:46.115685 containerd[1501]: 2025-09-13 00:08:46.084 [INFO][3857] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="9de479bdd8fcad82feee5c64a72af5b4272a54cf434bea76104a486fa2c45353" HandleID="k8s-pod-network.9de479bdd8fcad82feee5c64a72af5b4272a54cf434bea76104a486fa2c45353" Workload="ci--4081--3--5--n--294a4568b6-k8s-whisker--57d6c7f84c--tpttf-eth0" Sep 13 00:08:46.115685 containerd[1501]: 2025-09-13 00:08:46.087 [INFO][3857] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 13 00:08:46.115685 containerd[1501]: 2025-09-13 00:08:46.088 [INFO][3857] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock.
Sep 13 00:08:46.115685 containerd[1501]: 2025-09-13 00:08:46.102 [WARNING][3857] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="9de479bdd8fcad82feee5c64a72af5b4272a54cf434bea76104a486fa2c45353" HandleID="k8s-pod-network.9de479bdd8fcad82feee5c64a72af5b4272a54cf434bea76104a486fa2c45353" Workload="ci--4081--3--5--n--294a4568b6-k8s-whisker--57d6c7f84c--tpttf-eth0" Sep 13 00:08:46.115685 containerd[1501]: 2025-09-13 00:08:46.102 [INFO][3857] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="9de479bdd8fcad82feee5c64a72af5b4272a54cf434bea76104a486fa2c45353" HandleID="k8s-pod-network.9de479bdd8fcad82feee5c64a72af5b4272a54cf434bea76104a486fa2c45353" Workload="ci--4081--3--5--n--294a4568b6-k8s-whisker--57d6c7f84c--tpttf-eth0" Sep 13 00:08:46.115685 containerd[1501]: 2025-09-13 00:08:46.108 [INFO][3857] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 13 00:08:46.115685 containerd[1501]: 2025-09-13 00:08:46.113 [INFO][3850] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="9de479bdd8fcad82feee5c64a72af5b4272a54cf434bea76104a486fa2c45353" Sep 13 00:08:46.118194 containerd[1501]: time="2025-09-13T00:08:46.116345073Z" level=info msg="TearDown network for sandbox \"9de479bdd8fcad82feee5c64a72af5b4272a54cf434bea76104a486fa2c45353\" successfully" Sep 13 00:08:46.118194 containerd[1501]: time="2025-09-13T00:08:46.116383125Z" level=info msg="StopPodSandbox for \"9de479bdd8fcad82feee5c64a72af5b4272a54cf434bea76104a486fa2c45353\" returns successfully" Sep 13 00:08:46.120649 systemd[1]: run-netns-cni\x2d1aeeaf98\x2dfc8a\x2d1ad8\x2d7f8a\x2d689ae5f61d44.mount: Deactivated successfully. Sep 13 00:08:46.176170 kubelet[2611]: I0913 00:08:46.175655 2611 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9t7dh\" (UniqueName: \"kubernetes.io/projected/3a77a062-d867-4bd5-b1d4-b714b684c994-kube-api-access-9t7dh\") pod \"3a77a062-d867-4bd5-b1d4-b714b684c994\" (UID: \"3a77a062-d867-4bd5-b1d4-b714b684c994\") " Sep 13 00:08:46.176170 kubelet[2611]: I0913 00:08:46.175713 2611 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/3a77a062-d867-4bd5-b1d4-b714b684c994-whisker-backend-key-pair\") pod \"3a77a062-d867-4bd5-b1d4-b714b684c994\" (UID: \"3a77a062-d867-4bd5-b1d4-b714b684c994\") " Sep 13 00:08:46.176170 kubelet[2611]: I0913 00:08:46.175744 2611 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3a77a062-d867-4bd5-b1d4-b714b684c994-whisker-ca-bundle\") pod \"3a77a062-d867-4bd5-b1d4-b714b684c994\" (UID: \"3a77a062-d867-4bd5-b1d4-b714b684c994\") " Sep 13 00:08:46.203239 systemd[1]: var-lib-kubelet-pods-3a77a062\x2dd867\x2d4bd5\x2db1d4\x2db714b684c994-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. Sep 13 00:08:46.209702 kubelet[2611]: I0913 00:08:46.209620 2611 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3a77a062-d867-4bd5-b1d4-b714b684c994-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "3a77a062-d867-4bd5-b1d4-b714b684c994" (UID: "3a77a062-d867-4bd5-b1d4-b714b684c994"). InnerVolumeSpecName "whisker-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Sep 13 00:08:46.210685 kubelet[2611]: I0913 00:08:46.208293 2611 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a77a062-d867-4bd5-b1d4-b714b684c994-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "3a77a062-d867-4bd5-b1d4-b714b684c994" (UID: "3a77a062-d867-4bd5-b1d4-b714b684c994"). InnerVolumeSpecName "whisker-backend-key-pair". PluginName "kubernetes.io/secret", VolumeGIDValue "" Sep 13 00:08:46.218004 systemd[1]: var-lib-kubelet-pods-3a77a062\x2dd867\x2d4bd5\x2db1d4\x2db714b684c994-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2d9t7dh.mount: Deactivated successfully. Sep 13 00:08:46.220430 kubelet[2611]: I0913 00:08:46.219241 2611 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a77a062-d867-4bd5-b1d4-b714b684c994-kube-api-access-9t7dh" (OuterVolumeSpecName: "kube-api-access-9t7dh") pod "3a77a062-d867-4bd5-b1d4-b714b684c994" (UID: "3a77a062-d867-4bd5-b1d4-b714b684c994"). InnerVolumeSpecName "kube-api-access-9t7dh". PluginName "kubernetes.io/projected", VolumeGIDValue "" Sep 13 00:08:46.277191 kubelet[2611]: I0913 00:08:46.277129 2611 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9t7dh\" (UniqueName: \"kubernetes.io/projected/3a77a062-d867-4bd5-b1d4-b714b684c994-kube-api-access-9t7dh\") on node \"ci-4081-3-5-n-294a4568b6\" DevicePath \"\"" Sep 13 00:08:46.277191 kubelet[2611]: I0913 00:08:46.277184 2611 reconciler_common.go:299] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/3a77a062-d867-4bd5-b1d4-b714b684c994-whisker-backend-key-pair\") on node \"ci-4081-3-5-n-294a4568b6\" DevicePath \"\"" Sep 13 00:08:46.277191 kubelet[2611]: I0913 00:08:46.277202 2611 reconciler_common.go:299] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3a77a062-d867-4bd5-b1d4-b714b684c994-whisker-ca-bundle\") on node \"ci-4081-3-5-n-294a4568b6\" DevicePath \"\"" Sep 13 00:08:46.966613 systemd[1]: Removed slice kubepods-besteffort-pod3a77a062_d867_4bd5_b1d4_b714b684c994.slice - libcontainer container kubepods-besteffort-pod3a77a062_d867_4bd5_b1d4_b714b684c994.slice. Sep 13 00:08:47.052290 kubelet[2611]: I0913 00:08:47.048228 2611 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-hkqvn" podStartSLOduration=2.461959791 podStartE2EDuration="15.031867699s" podCreationTimestamp="2025-09-13 00:08:32 +0000 UTC" firstStartedPulling="2025-09-13 00:08:32.621309546 +0000 UTC m=+19.035342844" lastFinishedPulling="2025-09-13 00:08:45.191217456 +0000 UTC m=+31.605250752" observedRunningTime="2025-09-13 00:08:45.988730719 +0000 UTC m=+32.402764016" watchObservedRunningTime="2025-09-13 00:08:47.031867699 +0000 UTC m=+33.445900996" Sep 13 00:08:47.118444 systemd[1]: Created slice kubepods-besteffort-pod883968fc_9ef4_4da5_94b6_4c848d3d1a0f.slice - libcontainer container kubepods-besteffort-pod883968fc_9ef4_4da5_94b6_4c848d3d1a0f.slice. 
Sep 13 00:08:47.188123 kubelet[2611]: I0913 00:08:47.188064 2611 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h75pg\" (UniqueName: \"kubernetes.io/projected/883968fc-9ef4-4da5-94b6-4c848d3d1a0f-kube-api-access-h75pg\") pod \"whisker-7cf7564cbb-8kk22\" (UID: \"883968fc-9ef4-4da5-94b6-4c848d3d1a0f\") " pod="calico-system/whisker-7cf7564cbb-8kk22" Sep 13 00:08:47.188123 kubelet[2611]: I0913 00:08:47.188114 2611 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/883968fc-9ef4-4da5-94b6-4c848d3d1a0f-whisker-ca-bundle\") pod \"whisker-7cf7564cbb-8kk22\" (UID: \"883968fc-9ef4-4da5-94b6-4c848d3d1a0f\") " pod="calico-system/whisker-7cf7564cbb-8kk22" Sep 13 00:08:47.188123 kubelet[2611]: I0913 00:08:47.188136 2611 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/883968fc-9ef4-4da5-94b6-4c848d3d1a0f-whisker-backend-key-pair\") pod \"whisker-7cf7564cbb-8kk22\" (UID: \"883968fc-9ef4-4da5-94b6-4c848d3d1a0f\") " pod="calico-system/whisker-7cf7564cbb-8kk22" Sep 13 00:08:47.425480 containerd[1501]: time="2025-09-13T00:08:47.425423291Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-7cf7564cbb-8kk22,Uid:883968fc-9ef4-4da5-94b6-4c848d3d1a0f,Namespace:calico-system,Attempt:0,}" Sep 13 00:08:47.646016 kernel: bpftool[4060]: memfd_create() called without MFD_EXEC or MFD_NOEXEC_SEAL set Sep 13 00:08:47.672111 systemd-networkd[1394]: cali613030eaccb: Link UP Sep 13 00:08:47.672337 systemd-networkd[1394]: cali613030eaccb: Gained carrier Sep 13 00:08:47.693961 containerd[1501]: 2025-09-13 00:08:47.517 [INFO][4015] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 13 00:08:47.693961 containerd[1501]: 2025-09-13 00:08:47.531 [INFO][4015] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4081--3--5--n--294a4568b6-k8s-whisker--7cf7564cbb--8kk22-eth0 whisker-7cf7564cbb- calico-system 883968fc-9ef4-4da5-94b6-4c848d3d1a0f 883 0 2025-09-13 00:08:47 +0000 UTC <nil> <nil> map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:7cf7564cbb projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s ci-4081-3-5-n-294a4568b6 whisker-7cf7564cbb-8kk22 eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] cali613030eaccb [] [] <nil>}} ContainerID="e9ba8a7f256593d281ab98d097b71e46702e1ce571ca81b684fa37a53832a2c4" Namespace="calico-system" Pod="whisker-7cf7564cbb-8kk22" WorkloadEndpoint="ci--4081--3--5--n--294a4568b6-k8s-whisker--7cf7564cbb--8kk22-" Sep 13 00:08:47.693961 containerd[1501]: 2025-09-13 00:08:47.531 [INFO][4015] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="e9ba8a7f256593d281ab98d097b71e46702e1ce571ca81b684fa37a53832a2c4" Namespace="calico-system" Pod="whisker-7cf7564cbb-8kk22" WorkloadEndpoint="ci--4081--3--5--n--294a4568b6-k8s-whisker--7cf7564cbb--8kk22-eth0" Sep 13 00:08:47.693961 containerd[1501]: 2025-09-13 00:08:47.563 [INFO][4027] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="e9ba8a7f256593d281ab98d097b71e46702e1ce571ca81b684fa37a53832a2c4" HandleID="k8s-pod-network.e9ba8a7f256593d281ab98d097b71e46702e1ce571ca81b684fa37a53832a2c4"
Workload="ci--4081--3--5--n--294a4568b6-k8s-whisker--7cf7564cbb--8kk22-eth0" Sep 13 00:08:47.693961 containerd[1501]: 2025-09-13 00:08:47.565 [INFO][4027] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="e9ba8a7f256593d281ab98d097b71e46702e1ce571ca81b684fa37a53832a2c4" HandleID="k8s-pod-network.e9ba8a7f256593d281ab98d097b71e46702e1ce571ca81b684fa37a53832a2c4" Workload="ci--4081--3--5--n--294a4568b6-k8s-whisker--7cf7564cbb--8kk22-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002cf020), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4081-3-5-n-294a4568b6", "pod":"whisker-7cf7564cbb-8kk22", "timestamp":"2025-09-13 00:08:47.563636457 +0000 UTC"}, Hostname:"ci-4081-3-5-n-294a4568b6", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 13 00:08:47.693961 containerd[1501]: 2025-09-13 00:08:47.565 [INFO][4027] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 13 00:08:47.693961 containerd[1501]: 2025-09-13 00:08:47.565 [INFO][4027] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 13 00:08:47.693961 containerd[1501]: 2025-09-13 00:08:47.565 [INFO][4027] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4081-3-5-n-294a4568b6' Sep 13 00:08:47.693961 containerd[1501]: 2025-09-13 00:08:47.575 [INFO][4027] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.e9ba8a7f256593d281ab98d097b71e46702e1ce571ca81b684fa37a53832a2c4" host="ci-4081-3-5-n-294a4568b6" Sep 13 00:08:47.693961 containerd[1501]: 2025-09-13 00:08:47.613 [INFO][4027] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4081-3-5-n-294a4568b6" Sep 13 00:08:47.693961 containerd[1501]: 2025-09-13 00:08:47.622 [INFO][4027] ipam/ipam.go 511: Trying affinity for 192.168.44.128/26 host="ci-4081-3-5-n-294a4568b6" Sep 13 00:08:47.693961 containerd[1501]: 2025-09-13 00:08:47.625 [INFO][4027] ipam/ipam.go 158: Attempting to load block cidr=192.168.44.128/26 host="ci-4081-3-5-n-294a4568b6" Sep 13 00:08:47.693961 containerd[1501]: 2025-09-13 00:08:47.627 [INFO][4027] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.44.128/26 host="ci-4081-3-5-n-294a4568b6" Sep 13 00:08:47.693961 containerd[1501]: 2025-09-13 00:08:47.627 [INFO][4027] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.44.128/26 handle="k8s-pod-network.e9ba8a7f256593d281ab98d097b71e46702e1ce571ca81b684fa37a53832a2c4" host="ci-4081-3-5-n-294a4568b6" Sep 13 00:08:47.693961 containerd[1501]: 2025-09-13 00:08:47.628 [INFO][4027] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.e9ba8a7f256593d281ab98d097b71e46702e1ce571ca81b684fa37a53832a2c4 Sep 13 00:08:47.693961 containerd[1501]: 2025-09-13 00:08:47.634 [INFO][4027] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.44.128/26 handle="k8s-pod-network.e9ba8a7f256593d281ab98d097b71e46702e1ce571ca81b684fa37a53832a2c4" host="ci-4081-3-5-n-294a4568b6" Sep 13 00:08:47.693961 containerd[1501]: 2025-09-13 00:08:47.639 [INFO][4027] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.44.129/26] block=192.168.44.128/26 handle="k8s-pod-network.e9ba8a7f256593d281ab98d097b71e46702e1ce571ca81b684fa37a53832a2c4" host="ci-4081-3-5-n-294a4568b6" Sep 13 00:08:47.693961 containerd[1501]: 2025-09-13 00:08:47.639 [INFO][4027] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.44.129/26] 
handle="k8s-pod-network.e9ba8a7f256593d281ab98d097b71e46702e1ce571ca81b684fa37a53832a2c4" host="ci-4081-3-5-n-294a4568b6" Sep 13 00:08:47.693961 containerd[1501]: 2025-09-13 00:08:47.639 [INFO][4027] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 13 00:08:47.693961 containerd[1501]: 2025-09-13 00:08:47.639 [INFO][4027] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.44.129/26] IPv6=[] ContainerID="e9ba8a7f256593d281ab98d097b71e46702e1ce571ca81b684fa37a53832a2c4" HandleID="k8s-pod-network.e9ba8a7f256593d281ab98d097b71e46702e1ce571ca81b684fa37a53832a2c4" Workload="ci--4081--3--5--n--294a4568b6-k8s-whisker--7cf7564cbb--8kk22-eth0" Sep 13 00:08:47.695558 containerd[1501]: 2025-09-13 00:08:47.645 [INFO][4015] cni-plugin/k8s.go 418: Populated endpoint ContainerID="e9ba8a7f256593d281ab98d097b71e46702e1ce571ca81b684fa37a53832a2c4" Namespace="calico-system" Pod="whisker-7cf7564cbb-8kk22" WorkloadEndpoint="ci--4081--3--5--n--294a4568b6-k8s-whisker--7cf7564cbb--8kk22-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--5--n--294a4568b6-k8s-whisker--7cf7564cbb--8kk22-eth0", GenerateName:"whisker-7cf7564cbb-", Namespace:"calico-system", SelfLink:"", UID:"883968fc-9ef4-4da5-94b6-4c848d3d1a0f", ResourceVersion:"883", Generation:0, CreationTimestamp:time.Date(2025, time.September, 13, 0, 8, 47, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"7cf7564cbb", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-5-n-294a4568b6", ContainerID:"", Pod:"whisker-7cf7564cbb-8kk22", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.44.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali613030eaccb", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 13 00:08:47.695558 containerd[1501]: 2025-09-13 00:08:47.645 [INFO][4015] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.44.129/32] ContainerID="e9ba8a7f256593d281ab98d097b71e46702e1ce571ca81b684fa37a53832a2c4" Namespace="calico-system" Pod="whisker-7cf7564cbb-8kk22" WorkloadEndpoint="ci--4081--3--5--n--294a4568b6-k8s-whisker--7cf7564cbb--8kk22-eth0" Sep 13 00:08:47.695558 containerd[1501]: 2025-09-13 00:08:47.645 [INFO][4015] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali613030eaccb ContainerID="e9ba8a7f256593d281ab98d097b71e46702e1ce571ca81b684fa37a53832a2c4" Namespace="calico-system" Pod="whisker-7cf7564cbb-8kk22" WorkloadEndpoint="ci--4081--3--5--n--294a4568b6-k8s-whisker--7cf7564cbb--8kk22-eth0" Sep 13 00:08:47.695558 containerd[1501]: 2025-09-13 00:08:47.668 [INFO][4015] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="e9ba8a7f256593d281ab98d097b71e46702e1ce571ca81b684fa37a53832a2c4" Namespace="calico-system" Pod="whisker-7cf7564cbb-8kk22" WorkloadEndpoint="ci--4081--3--5--n--294a4568b6-k8s-whisker--7cf7564cbb--8kk22-eth0" Sep 13 
00:08:47.695558 containerd[1501]: 2025-09-13 00:08:47.670 [INFO][4015] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="e9ba8a7f256593d281ab98d097b71e46702e1ce571ca81b684fa37a53832a2c4" Namespace="calico-system" Pod="whisker-7cf7564cbb-8kk22" WorkloadEndpoint="ci--4081--3--5--n--294a4568b6-k8s-whisker--7cf7564cbb--8kk22-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--5--n--294a4568b6-k8s-whisker--7cf7564cbb--8kk22-eth0", GenerateName:"whisker-7cf7564cbb-", Namespace:"calico-system", SelfLink:"", UID:"883968fc-9ef4-4da5-94b6-4c848d3d1a0f", ResourceVersion:"883", Generation:0, CreationTimestamp:time.Date(2025, time.September, 13, 0, 8, 47, 0, time.Local), DeletionTimestamp:<nil>, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"7cf7564cbb", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-5-n-294a4568b6", ContainerID:"e9ba8a7f256593d281ab98d097b71e46702e1ce571ca81b684fa37a53832a2c4", Pod:"whisker-7cf7564cbb-8kk22", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.44.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali613030eaccb", MAC:"2a:07:10:8a:62:e4", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 13 00:08:47.695558 containerd[1501]: 2025-09-13 00:08:47.689 [INFO][4015] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="e9ba8a7f256593d281ab98d097b71e46702e1ce571ca81b684fa37a53832a2c4" Namespace="calico-system" Pod="whisker-7cf7564cbb-8kk22" WorkloadEndpoint="ci--4081--3--5--n--294a4568b6-k8s-whisker--7cf7564cbb--8kk22-eth0" Sep 13 00:08:47.714248 kubelet[2611]: I0913 00:08:47.713690 2611 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3a77a062-d867-4bd5-b1d4-b714b684c994" path="/var/lib/kubelet/pods/3a77a062-d867-4bd5-b1d4-b714b684c994/volumes" Sep 13 00:08:47.724326 containerd[1501]: time="2025-09-13T00:08:47.724046059Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Sep 13 00:08:47.724326 containerd[1501]: time="2025-09-13T00:08:47.724159674Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Sep 13 00:08:47.724326 containerd[1501]: time="2025-09-13T00:08:47.724202394Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 13 00:08:47.725936 containerd[1501]: time="2025-09-13T00:08:47.724593613Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 13 00:08:47.750704 systemd[1]: Started cri-containerd-e9ba8a7f256593d281ab98d097b71e46702e1ce571ca81b684fa37a53832a2c4.scope - libcontainer container e9ba8a7f256593d281ab98d097b71e46702e1ce571ca81b684fa37a53832a2c4.
Sep 13 00:08:47.810782 containerd[1501]: time="2025-09-13T00:08:47.810749424Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-7cf7564cbb-8kk22,Uid:883968fc-9ef4-4da5-94b6-4c848d3d1a0f,Namespace:calico-system,Attempt:0,} returns sandbox id \"e9ba8a7f256593d281ab98d097b71e46702e1ce571ca81b684fa37a53832a2c4\"" Sep 13 00:08:47.813520 containerd[1501]: time="2025-09-13T00:08:47.813471040Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.3\"" Sep 13 00:08:47.912622 systemd-networkd[1394]: vxlan.calico: Link UP Sep 13 00:08:47.912629 systemd-networkd[1394]: vxlan.calico: Gained carrier Sep 13 00:08:49.217571 systemd-networkd[1394]: vxlan.calico: Gained IPv6LL Sep 13 00:08:49.554033 containerd[1501]: time="2025-09-13T00:08:49.553745715Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 13 00:08:49.554886 containerd[1501]: time="2025-09-13T00:08:49.554817667Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.3: active requests=0, bytes read=4661291" Sep 13 00:08:49.556939 containerd[1501]: time="2025-09-13T00:08:49.556066471Z" level=info msg="ImageCreate event name:\"sha256:9a4eedeed4a531acefb7f5d0a1b7e3856b1a9a24d9e7d25deef2134d7a734c2d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 13 00:08:49.558072 containerd[1501]: time="2025-09-13T00:08:49.558016708Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker@sha256:e7113761fc7633d515882f0d48b5c8d0b8e62f3f9d34823f2ee194bb16d2ec44\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 13 00:08:49.559579 containerd[1501]: time="2025-09-13T00:08:49.558834842Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker:v3.30.3\" with image id \"sha256:9a4eedeed4a531acefb7f5d0a1b7e3856b1a9a24d9e7d25deef2134d7a734c2d\", repo tag \"ghcr.io/flatcar/calico/whisker:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/whisker@sha256:e7113761fc7633d515882f0d48b5c8d0b8e62f3f9d34823f2ee194bb16d2ec44\", size \"6153986\" in 1.745167229s" Sep 13 00:08:49.559579 containerd[1501]: time="2025-09-13T00:08:49.558866221Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.3\" returns image reference \"sha256:9a4eedeed4a531acefb7f5d0a1b7e3856b1a9a24d9e7d25deef2134d7a734c2d\"" Sep 13 00:08:49.563444 containerd[1501]: time="2025-09-13T00:08:49.563396513Z" level=info msg="CreateContainer within sandbox \"e9ba8a7f256593d281ab98d097b71e46702e1ce571ca81b684fa37a53832a2c4\" for container &ContainerMetadata{Name:whisker,Attempt:0,}" Sep 13 00:08:49.581808 containerd[1501]: time="2025-09-13T00:08:49.581748068Z" level=info msg="CreateContainer within sandbox \"e9ba8a7f256593d281ab98d097b71e46702e1ce571ca81b684fa37a53832a2c4\" for &ContainerMetadata{Name:whisker,Attempt:0,} returns container id \"3cc401bdc503b20ceb8007b5f00e970d494d6944183aeaebc7d66e6cd10dc83f\"" Sep 13 00:08:49.583788 containerd[1501]: time="2025-09-13T00:08:49.582701696Z" level=info msg="StartContainer for \"3cc401bdc503b20ceb8007b5f00e970d494d6944183aeaebc7d66e6cd10dc83f\"" Sep 13 00:08:49.617318 systemd[1]: run-containerd-runc-k8s.io-3cc401bdc503b20ceb8007b5f00e970d494d6944183aeaebc7d66e6cd10dc83f-runc.OIiA4u.mount: Deactivated successfully. Sep 13 00:08:49.625047 systemd[1]: Started cri-containerd-3cc401bdc503b20ceb8007b5f00e970d494d6944183aeaebc7d66e6cd10dc83f.scope - libcontainer container 3cc401bdc503b20ceb8007b5f00e970d494d6944183aeaebc7d66e6cd10dc83f. 
Sep 13 00:08:49.660852 containerd[1501]: time="2025-09-13T00:08:49.660785986Z" level=info msg="StartContainer for \"3cc401bdc503b20ceb8007b5f00e970d494d6944183aeaebc7d66e6cd10dc83f\" returns successfully" Sep 13 00:08:49.662239 containerd[1501]: time="2025-09-13T00:08:49.662072251Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\"" Sep 13 00:08:49.665945 systemd-networkd[1394]: cali613030eaccb: Gained IPv6LL Sep 13 00:08:51.905005 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount466752289.mount: Deactivated successfully. Sep 13 00:08:51.920366 containerd[1501]: time="2025-09-13T00:08:51.920313703Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 13 00:08:51.921330 containerd[1501]: time="2025-09-13T00:08:51.921281627Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.3: active requests=0, bytes read=33085545" Sep 13 00:08:51.922095 containerd[1501]: time="2025-09-13T00:08:51.922057859Z" level=info msg="ImageCreate event name:\"sha256:7e29b0984d517678aab6ca138482c318989f6f28daf9d3b5dd6e4a5a3115ac16\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 13 00:08:51.924267 containerd[1501]: time="2025-09-13T00:08:51.924222879Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend@sha256:29becebc47401da9997a2a30f4c25c511a5f379d17275680b048224829af71a5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 13 00:08:51.925326 containerd[1501]: time="2025-09-13T00:08:51.924695199Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" with image id \"sha256:7e29b0984d517678aab6ca138482c318989f6f28daf9d3b5dd6e4a5a3115ac16\", repo tag \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/whisker-backend@sha256:29becebc47401da9997a2a30f4c25c511a5f379d17275680b048224829af71a5\", size \"33085375\" in 2.26259807s" Sep 13 00:08:51.925326 containerd[1501]: time="2025-09-13T00:08:51.924720327Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" returns image reference \"sha256:7e29b0984d517678aab6ca138482c318989f6f28daf9d3b5dd6e4a5a3115ac16\"" Sep 13 00:08:51.928614 containerd[1501]: time="2025-09-13T00:08:51.928585250Z" level=info msg="CreateContainer within sandbox \"e9ba8a7f256593d281ab98d097b71e46702e1ce571ca81b684fa37a53832a2c4\" for container &ContainerMetadata{Name:whisker-backend,Attempt:0,}" Sep 13 00:08:51.938133 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount707450062.mount: Deactivated successfully. Sep 13 00:08:51.941958 containerd[1501]: time="2025-09-13T00:08:51.941902275Z" level=info msg="CreateContainer within sandbox \"e9ba8a7f256593d281ab98d097b71e46702e1ce571ca81b684fa37a53832a2c4\" for &ContainerMetadata{Name:whisker-backend,Attempt:0,} returns container id \"68df3c8febc3af09d3e17ff5972f243b4af40c9d62635a6cf3154cc560e5e4bb\"" Sep 13 00:08:51.942631 containerd[1501]: time="2025-09-13T00:08:51.942607894Z" level=info msg="StartContainer for \"68df3c8febc3af09d3e17ff5972f243b4af40c9d62635a6cf3154cc560e5e4bb\"" Sep 13 00:08:51.968058 systemd[1]: Started cri-containerd-68df3c8febc3af09d3e17ff5972f243b4af40c9d62635a6cf3154cc560e5e4bb.scope - libcontainer container 68df3c8febc3af09d3e17ff5972f243b4af40c9d62635a6cf3154cc560e5e4bb. 
Sep 13 00:08:52.007432 containerd[1501]: time="2025-09-13T00:08:52.007392153Z" level=info msg="StartContainer for \"68df3c8febc3af09d3e17ff5972f243b4af40c9d62635a6cf3154cc560e5e4bb\" returns successfully" Sep 13 00:08:52.710990 containerd[1501]: time="2025-09-13T00:08:52.710632277Z" level=info msg="StopPodSandbox for \"c1d75c013bdcf693fcd2a3ddcc3e013d373b11c7d8c24377f3bb7e3da51ee478\"" Sep 13 00:08:52.794293 containerd[1501]: 2025-09-13 00:08:52.753 [INFO][4301] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="c1d75c013bdcf693fcd2a3ddcc3e013d373b11c7d8c24377f3bb7e3da51ee478" Sep 13 00:08:52.794293 containerd[1501]: 2025-09-13 00:08:52.754 [INFO][4301] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="c1d75c013bdcf693fcd2a3ddcc3e013d373b11c7d8c24377f3bb7e3da51ee478" iface="eth0" netns="/var/run/netns/cni-de84e222-851e-6156-cf25-1fb5cfb2ca74" Sep 13 00:08:52.794293 containerd[1501]: 2025-09-13 00:08:52.754 [INFO][4301] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="c1d75c013bdcf693fcd2a3ddcc3e013d373b11c7d8c24377f3bb7e3da51ee478" iface="eth0" netns="/var/run/netns/cni-de84e222-851e-6156-cf25-1fb5cfb2ca74" Sep 13 00:08:52.794293 containerd[1501]: 2025-09-13 00:08:52.754 [INFO][4301] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. ContainerID="c1d75c013bdcf693fcd2a3ddcc3e013d373b11c7d8c24377f3bb7e3da51ee478" iface="eth0" netns="/var/run/netns/cni-de84e222-851e-6156-cf25-1fb5cfb2ca74" Sep 13 00:08:52.794293 containerd[1501]: 2025-09-13 00:08:52.754 [INFO][4301] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="c1d75c013bdcf693fcd2a3ddcc3e013d373b11c7d8c24377f3bb7e3da51ee478" Sep 13 00:08:52.794293 containerd[1501]: 2025-09-13 00:08:52.754 [INFO][4301] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="c1d75c013bdcf693fcd2a3ddcc3e013d373b11c7d8c24377f3bb7e3da51ee478" Sep 13 00:08:52.794293 containerd[1501]: 2025-09-13 00:08:52.780 [INFO][4308] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="c1d75c013bdcf693fcd2a3ddcc3e013d373b11c7d8c24377f3bb7e3da51ee478" HandleID="k8s-pod-network.c1d75c013bdcf693fcd2a3ddcc3e013d373b11c7d8c24377f3bb7e3da51ee478" Workload="ci--4081--3--5--n--294a4568b6-k8s-calico--apiserver--9c789df89--chbwf-eth0" Sep 13 00:08:52.794293 containerd[1501]: 2025-09-13 00:08:52.780 [INFO][4308] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 13 00:08:52.794293 containerd[1501]: 2025-09-13 00:08:52.780 [INFO][4308] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 13 00:08:52.794293 containerd[1501]: 2025-09-13 00:08:52.787 [WARNING][4308] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="c1d75c013bdcf693fcd2a3ddcc3e013d373b11c7d8c24377f3bb7e3da51ee478" HandleID="k8s-pod-network.c1d75c013bdcf693fcd2a3ddcc3e013d373b11c7d8c24377f3bb7e3da51ee478" Workload="ci--4081--3--5--n--294a4568b6-k8s-calico--apiserver--9c789df89--chbwf-eth0" Sep 13 00:08:52.794293 containerd[1501]: 2025-09-13 00:08:52.787 [INFO][4308] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="c1d75c013bdcf693fcd2a3ddcc3e013d373b11c7d8c24377f3bb7e3da51ee478" HandleID="k8s-pod-network.c1d75c013bdcf693fcd2a3ddcc3e013d373b11c7d8c24377f3bb7e3da51ee478" Workload="ci--4081--3--5--n--294a4568b6-k8s-calico--apiserver--9c789df89--chbwf-eth0" Sep 13 00:08:52.794293 containerd[1501]: 2025-09-13 00:08:52.789 [INFO][4308] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 13 00:08:52.794293 containerd[1501]: 2025-09-13 00:08:52.791 [INFO][4301] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="c1d75c013bdcf693fcd2a3ddcc3e013d373b11c7d8c24377f3bb7e3da51ee478" Sep 13 00:08:52.797972 containerd[1501]: time="2025-09-13T00:08:52.794476930Z" level=info msg="TearDown network for sandbox \"c1d75c013bdcf693fcd2a3ddcc3e013d373b11c7d8c24377f3bb7e3da51ee478\" successfully" Sep 13 00:08:52.797972 containerd[1501]: time="2025-09-13T00:08:52.794523137Z" level=info msg="StopPodSandbox for \"c1d75c013bdcf693fcd2a3ddcc3e013d373b11c7d8c24377f3bb7e3da51ee478\" returns successfully" Sep 13 00:08:52.797972 containerd[1501]: time="2025-09-13T00:08:52.796862123Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-9c789df89-chbwf,Uid:66c3fc14-19e6-48c3-91eb-518f5181421b,Namespace:calico-apiserver,Attempt:1,}" Sep 13 00:08:52.798232 systemd[1]: run-netns-cni\x2dde84e222\x2d851e\x2d6156\x2dcf25\x2d1fb5cfb2ca74.mount: Deactivated successfully. Sep 13 00:08:52.933800 systemd-networkd[1394]: cali936bbd726b3: Link UP Sep 13 00:08:52.934125 systemd-networkd[1394]: cali936bbd726b3: Gained carrier Sep 13 00:08:52.960711 containerd[1501]: 2025-09-13 00:08:52.849 [INFO][4318] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4081--3--5--n--294a4568b6-k8s-calico--apiserver--9c789df89--chbwf-eth0 calico-apiserver-9c789df89- calico-apiserver 66c3fc14-19e6-48c3-91eb-518f5181421b 910 0 2025-09-13 00:08:29 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:9c789df89 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4081-3-5-n-294a4568b6 calico-apiserver-9c789df89-chbwf eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali936bbd726b3 [] [] }} ContainerID="3a9ad9d11477efcdcd06b6b044f06d51ec462d5c936e47f2d742a40029171596" Namespace="calico-apiserver" Pod="calico-apiserver-9c789df89-chbwf" WorkloadEndpoint="ci--4081--3--5--n--294a4568b6-k8s-calico--apiserver--9c789df89--chbwf-" Sep 13 00:08:52.960711 containerd[1501]: 2025-09-13 00:08:52.850 [INFO][4318] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="3a9ad9d11477efcdcd06b6b044f06d51ec462d5c936e47f2d742a40029171596" Namespace="calico-apiserver" Pod="calico-apiserver-9c789df89-chbwf" WorkloadEndpoint="ci--4081--3--5--n--294a4568b6-k8s-calico--apiserver--9c789df89--chbwf-eth0" Sep 13 00:08:52.960711 containerd[1501]: 2025-09-13 00:08:52.880 [INFO][4326] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="3a9ad9d11477efcdcd06b6b044f06d51ec462d5c936e47f2d742a40029171596" HandleID="k8s-pod-network.3a9ad9d11477efcdcd06b6b044f06d51ec462d5c936e47f2d742a40029171596" Workload="ci--4081--3--5--n--294a4568b6-k8s-calico--apiserver--9c789df89--chbwf-eth0" Sep 13 00:08:52.960711 containerd[1501]: 2025-09-13 00:08:52.880 [INFO][4326] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="3a9ad9d11477efcdcd06b6b044f06d51ec462d5c936e47f2d742a40029171596" HandleID="k8s-pod-network.3a9ad9d11477efcdcd06b6b044f06d51ec462d5c936e47f2d742a40029171596" Workload="ci--4081--3--5--n--294a4568b6-k8s-calico--apiserver--9c789df89--chbwf-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00024f610), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4081-3-5-n-294a4568b6", 
"pod":"calico-apiserver-9c789df89-chbwf", "timestamp":"2025-09-13 00:08:52.880261156 +0000 UTC"}, Hostname:"ci-4081-3-5-n-294a4568b6", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 13 00:08:52.960711 containerd[1501]: 2025-09-13 00:08:52.880 [INFO][4326] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 13 00:08:52.960711 containerd[1501]: 2025-09-13 00:08:52.880 [INFO][4326] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 13 00:08:52.960711 containerd[1501]: 2025-09-13 00:08:52.880 [INFO][4326] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4081-3-5-n-294a4568b6' Sep 13 00:08:52.960711 containerd[1501]: 2025-09-13 00:08:52.891 [INFO][4326] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.3a9ad9d11477efcdcd06b6b044f06d51ec462d5c936e47f2d742a40029171596" host="ci-4081-3-5-n-294a4568b6" Sep 13 00:08:52.960711 containerd[1501]: 2025-09-13 00:08:52.896 [INFO][4326] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4081-3-5-n-294a4568b6" Sep 13 00:08:52.960711 containerd[1501]: 2025-09-13 00:08:52.901 [INFO][4326] ipam/ipam.go 511: Trying affinity for 192.168.44.128/26 host="ci-4081-3-5-n-294a4568b6" Sep 13 00:08:52.960711 containerd[1501]: 2025-09-13 00:08:52.902 [INFO][4326] ipam/ipam.go 158: Attempting to load block cidr=192.168.44.128/26 host="ci-4081-3-5-n-294a4568b6" Sep 13 00:08:52.960711 containerd[1501]: 2025-09-13 00:08:52.905 [INFO][4326] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.44.128/26 host="ci-4081-3-5-n-294a4568b6" Sep 13 00:08:52.960711 containerd[1501]: 2025-09-13 00:08:52.905 [INFO][4326] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.44.128/26 handle="k8s-pod-network.3a9ad9d11477efcdcd06b6b044f06d51ec462d5c936e47f2d742a40029171596" host="ci-4081-3-5-n-294a4568b6" Sep 13 00:08:52.960711 containerd[1501]: 2025-09-13 00:08:52.906 [INFO][4326] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.3a9ad9d11477efcdcd06b6b044f06d51ec462d5c936e47f2d742a40029171596 Sep 13 00:08:52.960711 containerd[1501]: 2025-09-13 00:08:52.918 [INFO][4326] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.44.128/26 handle="k8s-pod-network.3a9ad9d11477efcdcd06b6b044f06d51ec462d5c936e47f2d742a40029171596" host="ci-4081-3-5-n-294a4568b6" Sep 13 00:08:52.960711 containerd[1501]: 2025-09-13 00:08:52.926 [INFO][4326] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.44.130/26] block=192.168.44.128/26 handle="k8s-pod-network.3a9ad9d11477efcdcd06b6b044f06d51ec462d5c936e47f2d742a40029171596" host="ci-4081-3-5-n-294a4568b6" Sep 13 00:08:52.960711 containerd[1501]: 2025-09-13 00:08:52.926 [INFO][4326] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.44.130/26] handle="k8s-pod-network.3a9ad9d11477efcdcd06b6b044f06d51ec462d5c936e47f2d742a40029171596" host="ci-4081-3-5-n-294a4568b6" Sep 13 00:08:52.960711 containerd[1501]: 2025-09-13 00:08:52.926 [INFO][4326] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 13 00:08:52.960711 containerd[1501]: 2025-09-13 00:08:52.926 [INFO][4326] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.44.130/26] IPv6=[] ContainerID="3a9ad9d11477efcdcd06b6b044f06d51ec462d5c936e47f2d742a40029171596" HandleID="k8s-pod-network.3a9ad9d11477efcdcd06b6b044f06d51ec462d5c936e47f2d742a40029171596" Workload="ci--4081--3--5--n--294a4568b6-k8s-calico--apiserver--9c789df89--chbwf-eth0" Sep 13 00:08:52.963159 containerd[1501]: 2025-09-13 00:08:52.929 [INFO][4318] cni-plugin/k8s.go 418: Populated endpoint ContainerID="3a9ad9d11477efcdcd06b6b044f06d51ec462d5c936e47f2d742a40029171596" Namespace="calico-apiserver" Pod="calico-apiserver-9c789df89-chbwf" WorkloadEndpoint="ci--4081--3--5--n--294a4568b6-k8s-calico--apiserver--9c789df89--chbwf-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--5--n--294a4568b6-k8s-calico--apiserver--9c789df89--chbwf-eth0", GenerateName:"calico-apiserver-9c789df89-", Namespace:"calico-apiserver", SelfLink:"", UID:"66c3fc14-19e6-48c3-91eb-518f5181421b", ResourceVersion:"910", Generation:0, CreationTimestamp:time.Date(2025, time.September, 13, 0, 8, 29, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"9c789df89", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-5-n-294a4568b6", ContainerID:"", Pod:"calico-apiserver-9c789df89-chbwf", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.44.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali936bbd726b3", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 13 00:08:52.963159 containerd[1501]: 2025-09-13 00:08:52.929 [INFO][4318] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.44.130/32] ContainerID="3a9ad9d11477efcdcd06b6b044f06d51ec462d5c936e47f2d742a40029171596" Namespace="calico-apiserver" Pod="calico-apiserver-9c789df89-chbwf" WorkloadEndpoint="ci--4081--3--5--n--294a4568b6-k8s-calico--apiserver--9c789df89--chbwf-eth0" Sep 13 00:08:52.963159 containerd[1501]: 2025-09-13 00:08:52.929 [INFO][4318] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali936bbd726b3 ContainerID="3a9ad9d11477efcdcd06b6b044f06d51ec462d5c936e47f2d742a40029171596" Namespace="calico-apiserver" Pod="calico-apiserver-9c789df89-chbwf" WorkloadEndpoint="ci--4081--3--5--n--294a4568b6-k8s-calico--apiserver--9c789df89--chbwf-eth0" Sep 13 00:08:52.963159 containerd[1501]: 2025-09-13 00:08:52.935 [INFO][4318] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="3a9ad9d11477efcdcd06b6b044f06d51ec462d5c936e47f2d742a40029171596" Namespace="calico-apiserver" Pod="calico-apiserver-9c789df89-chbwf" WorkloadEndpoint="ci--4081--3--5--n--294a4568b6-k8s-calico--apiserver--9c789df89--chbwf-eth0" Sep 13 00:08:52.963159 containerd[1501]: 2025-09-13 00:08:52.936 
[INFO][4318] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="3a9ad9d11477efcdcd06b6b044f06d51ec462d5c936e47f2d742a40029171596" Namespace="calico-apiserver" Pod="calico-apiserver-9c789df89-chbwf" WorkloadEndpoint="ci--4081--3--5--n--294a4568b6-k8s-calico--apiserver--9c789df89--chbwf-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--5--n--294a4568b6-k8s-calico--apiserver--9c789df89--chbwf-eth0", GenerateName:"calico-apiserver-9c789df89-", Namespace:"calico-apiserver", SelfLink:"", UID:"66c3fc14-19e6-48c3-91eb-518f5181421b", ResourceVersion:"910", Generation:0, CreationTimestamp:time.Date(2025, time.September, 13, 0, 8, 29, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"9c789df89", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-5-n-294a4568b6", ContainerID:"3a9ad9d11477efcdcd06b6b044f06d51ec462d5c936e47f2d742a40029171596", Pod:"calico-apiserver-9c789df89-chbwf", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.44.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali936bbd726b3", MAC:"a6:69:a7:e4:d4:eb", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 13 00:08:52.963159 containerd[1501]: 2025-09-13 00:08:52.956 [INFO][4318] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="3a9ad9d11477efcdcd06b6b044f06d51ec462d5c936e47f2d742a40029171596" Namespace="calico-apiserver" Pod="calico-apiserver-9c789df89-chbwf" WorkloadEndpoint="ci--4081--3--5--n--294a4568b6-k8s-calico--apiserver--9c789df89--chbwf-eth0" Sep 13 00:08:52.988393 containerd[1501]: time="2025-09-13T00:08:52.987144832Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Sep 13 00:08:52.988393 containerd[1501]: time="2025-09-13T00:08:52.987237596Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Sep 13 00:08:52.988393 containerd[1501]: time="2025-09-13T00:08:52.987254398Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 13 00:08:52.989282 containerd[1501]: time="2025-09-13T00:08:52.988347377Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 13 00:08:53.019626 kubelet[2611]: I0913 00:08:53.018803 2611 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/whisker-7cf7564cbb-8kk22" podStartSLOduration=1.905596435 podStartE2EDuration="6.018781913s" podCreationTimestamp="2025-09-13 00:08:47 +0000 UTC" firstStartedPulling="2025-09-13 00:08:47.812586271 +0000 UTC m=+34.226619567" lastFinishedPulling="2025-09-13 00:08:51.925771748 +0000 UTC m=+38.339805045" observedRunningTime="2025-09-13 00:08:53.018447022 +0000 UTC m=+39.432480350" watchObservedRunningTime="2025-09-13 00:08:53.018781913 +0000 UTC m=+39.432815220" Sep 13 00:08:53.025461 systemd[1]: Started cri-containerd-3a9ad9d11477efcdcd06b6b044f06d51ec462d5c936e47f2d742a40029171596.scope - libcontainer container 3a9ad9d11477efcdcd06b6b044f06d51ec462d5c936e47f2d742a40029171596. Sep 13 00:08:53.082526 containerd[1501]: time="2025-09-13T00:08:53.082472956Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-9c789df89-chbwf,Uid:66c3fc14-19e6-48c3-91eb-518f5181421b,Namespace:calico-apiserver,Attempt:1,} returns sandbox id \"3a9ad9d11477efcdcd06b6b044f06d51ec462d5c936e47f2d742a40029171596\"" Sep 13 00:08:53.084671 containerd[1501]: time="2025-09-13T00:08:53.084537043Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\"" Sep 13 00:08:53.714224 containerd[1501]: time="2025-09-13T00:08:53.713772523Z" level=info msg="StopPodSandbox for \"f3f76841deaea1b7ce378f2a1523b9b857f562b5d6ba2bed49e181590f793f36\"" Sep 13 00:08:53.815499 containerd[1501]: 2025-09-13 00:08:53.772 [INFO][4405] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="f3f76841deaea1b7ce378f2a1523b9b857f562b5d6ba2bed49e181590f793f36" Sep 13 00:08:53.815499 containerd[1501]: 2025-09-13 00:08:53.772 [INFO][4405] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="f3f76841deaea1b7ce378f2a1523b9b857f562b5d6ba2bed49e181590f793f36" iface="eth0" netns="/var/run/netns/cni-9137fdaa-33af-8569-73dd-b45368677239" Sep 13 00:08:53.815499 containerd[1501]: 2025-09-13 00:08:53.774 [INFO][4405] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="f3f76841deaea1b7ce378f2a1523b9b857f562b5d6ba2bed49e181590f793f36" iface="eth0" netns="/var/run/netns/cni-9137fdaa-33af-8569-73dd-b45368677239" Sep 13 00:08:53.815499 containerd[1501]: 2025-09-13 00:08:53.774 [INFO][4405] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. 
ContainerID="f3f76841deaea1b7ce378f2a1523b9b857f562b5d6ba2bed49e181590f793f36" iface="eth0" netns="/var/run/netns/cni-9137fdaa-33af-8569-73dd-b45368677239" Sep 13 00:08:53.815499 containerd[1501]: 2025-09-13 00:08:53.774 [INFO][4405] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="f3f76841deaea1b7ce378f2a1523b9b857f562b5d6ba2bed49e181590f793f36" Sep 13 00:08:53.815499 containerd[1501]: 2025-09-13 00:08:53.774 [INFO][4405] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="f3f76841deaea1b7ce378f2a1523b9b857f562b5d6ba2bed49e181590f793f36" Sep 13 00:08:53.815499 containerd[1501]: 2025-09-13 00:08:53.798 [INFO][4413] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="f3f76841deaea1b7ce378f2a1523b9b857f562b5d6ba2bed49e181590f793f36" HandleID="k8s-pod-network.f3f76841deaea1b7ce378f2a1523b9b857f562b5d6ba2bed49e181590f793f36" Workload="ci--4081--3--5--n--294a4568b6-k8s-coredns--674b8bbfcf--ts6l4-eth0" Sep 13 00:08:53.815499 containerd[1501]: 2025-09-13 00:08:53.798 [INFO][4413] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 13 00:08:53.815499 containerd[1501]: 2025-09-13 00:08:53.799 [INFO][4413] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 13 00:08:53.815499 containerd[1501]: 2025-09-13 00:08:53.807 [WARNING][4413] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="f3f76841deaea1b7ce378f2a1523b9b857f562b5d6ba2bed49e181590f793f36" HandleID="k8s-pod-network.f3f76841deaea1b7ce378f2a1523b9b857f562b5d6ba2bed49e181590f793f36" Workload="ci--4081--3--5--n--294a4568b6-k8s-coredns--674b8bbfcf--ts6l4-eth0" Sep 13 00:08:53.815499 containerd[1501]: 2025-09-13 00:08:53.807 [INFO][4413] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="f3f76841deaea1b7ce378f2a1523b9b857f562b5d6ba2bed49e181590f793f36" HandleID="k8s-pod-network.f3f76841deaea1b7ce378f2a1523b9b857f562b5d6ba2bed49e181590f793f36" Workload="ci--4081--3--5--n--294a4568b6-k8s-coredns--674b8bbfcf--ts6l4-eth0" Sep 13 00:08:53.815499 containerd[1501]: 2025-09-13 00:08:53.809 [INFO][4413] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 13 00:08:53.815499 containerd[1501]: 2025-09-13 00:08:53.813 [INFO][4405] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="f3f76841deaea1b7ce378f2a1523b9b857f562b5d6ba2bed49e181590f793f36" Sep 13 00:08:53.818103 containerd[1501]: time="2025-09-13T00:08:53.818028634Z" level=info msg="TearDown network for sandbox \"f3f76841deaea1b7ce378f2a1523b9b857f562b5d6ba2bed49e181590f793f36\" successfully" Sep 13 00:08:53.818103 containerd[1501]: time="2025-09-13T00:08:53.818100940Z" level=info msg="StopPodSandbox for \"f3f76841deaea1b7ce378f2a1523b9b857f562b5d6ba2bed49e181590f793f36\" returns successfully" Sep 13 00:08:53.818681 systemd[1]: run-netns-cni\x2d9137fdaa\x2d33af\x2d8569\x2d73dd\x2db45368677239.mount: Deactivated successfully. 
Sep 13 00:08:53.821262 containerd[1501]: time="2025-09-13T00:08:53.821107292Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-ts6l4,Uid:807cba0d-efbf-495e-bbeb-ab5413159760,Namespace:kube-system,Attempt:1,}" Sep 13 00:08:53.947757 systemd-networkd[1394]: calida62db25490: Link UP Sep 13 00:08:53.948978 systemd-networkd[1394]: calida62db25490: Gained carrier Sep 13 00:08:53.970350 containerd[1501]: 2025-09-13 00:08:53.867 [INFO][4419] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4081--3--5--n--294a4568b6-k8s-coredns--674b8bbfcf--ts6l4-eth0 coredns-674b8bbfcf- kube-system 807cba0d-efbf-495e-bbeb-ab5413159760 923 0 2025-09-13 00:08:19 +0000 UTC map[k8s-app:kube-dns pod-template-hash:674b8bbfcf projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4081-3-5-n-294a4568b6 coredns-674b8bbfcf-ts6l4 eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] calida62db25490 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="32cb9490f7fee95bc363c6381c707ea9caa62c1397abd7d20ac28e4f70d5b6f0" Namespace="kube-system" Pod="coredns-674b8bbfcf-ts6l4" WorkloadEndpoint="ci--4081--3--5--n--294a4568b6-k8s-coredns--674b8bbfcf--ts6l4-" Sep 13 00:08:53.970350 containerd[1501]: 2025-09-13 00:08:53.867 [INFO][4419] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="32cb9490f7fee95bc363c6381c707ea9caa62c1397abd7d20ac28e4f70d5b6f0" Namespace="kube-system" Pod="coredns-674b8bbfcf-ts6l4" WorkloadEndpoint="ci--4081--3--5--n--294a4568b6-k8s-coredns--674b8bbfcf--ts6l4-eth0" Sep 13 00:08:53.970350 containerd[1501]: 2025-09-13 00:08:53.902 [INFO][4431] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="32cb9490f7fee95bc363c6381c707ea9caa62c1397abd7d20ac28e4f70d5b6f0" HandleID="k8s-pod-network.32cb9490f7fee95bc363c6381c707ea9caa62c1397abd7d20ac28e4f70d5b6f0" Workload="ci--4081--3--5--n--294a4568b6-k8s-coredns--674b8bbfcf--ts6l4-eth0" Sep 13 00:08:53.970350 containerd[1501]: 2025-09-13 00:08:53.902 [INFO][4431] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="32cb9490f7fee95bc363c6381c707ea9caa62c1397abd7d20ac28e4f70d5b6f0" HandleID="k8s-pod-network.32cb9490f7fee95bc363c6381c707ea9caa62c1397abd7d20ac28e4f70d5b6f0" Workload="ci--4081--3--5--n--294a4568b6-k8s-coredns--674b8bbfcf--ts6l4-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d5600), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4081-3-5-n-294a4568b6", "pod":"coredns-674b8bbfcf-ts6l4", "timestamp":"2025-09-13 00:08:53.902013397 +0000 UTC"}, Hostname:"ci-4081-3-5-n-294a4568b6", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 13 00:08:53.970350 containerd[1501]: 2025-09-13 00:08:53.902 [INFO][4431] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 13 00:08:53.970350 containerd[1501]: 2025-09-13 00:08:53.902 [INFO][4431] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 13 00:08:53.970350 containerd[1501]: 2025-09-13 00:08:53.902 [INFO][4431] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4081-3-5-n-294a4568b6' Sep 13 00:08:53.970350 containerd[1501]: 2025-09-13 00:08:53.909 [INFO][4431] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.32cb9490f7fee95bc363c6381c707ea9caa62c1397abd7d20ac28e4f70d5b6f0" host="ci-4081-3-5-n-294a4568b6" Sep 13 00:08:53.970350 containerd[1501]: 2025-09-13 00:08:53.916 [INFO][4431] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4081-3-5-n-294a4568b6" Sep 13 00:08:53.970350 containerd[1501]: 2025-09-13 00:08:53.922 [INFO][4431] ipam/ipam.go 511: Trying affinity for 192.168.44.128/26 host="ci-4081-3-5-n-294a4568b6" Sep 13 00:08:53.970350 containerd[1501]: 2025-09-13 00:08:53.924 [INFO][4431] ipam/ipam.go 158: Attempting to load block cidr=192.168.44.128/26 host="ci-4081-3-5-n-294a4568b6" Sep 13 00:08:53.970350 containerd[1501]: 2025-09-13 00:08:53.926 [INFO][4431] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.44.128/26 host="ci-4081-3-5-n-294a4568b6" Sep 13 00:08:53.970350 containerd[1501]: 2025-09-13 00:08:53.926 [INFO][4431] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.44.128/26 handle="k8s-pod-network.32cb9490f7fee95bc363c6381c707ea9caa62c1397abd7d20ac28e4f70d5b6f0" host="ci-4081-3-5-n-294a4568b6" Sep 13 00:08:53.970350 containerd[1501]: 2025-09-13 00:08:53.929 [INFO][4431] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.32cb9490f7fee95bc363c6381c707ea9caa62c1397abd7d20ac28e4f70d5b6f0 Sep 13 00:08:53.970350 containerd[1501]: 2025-09-13 00:08:53.933 [INFO][4431] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.44.128/26 handle="k8s-pod-network.32cb9490f7fee95bc363c6381c707ea9caa62c1397abd7d20ac28e4f70d5b6f0" host="ci-4081-3-5-n-294a4568b6" Sep 13 00:08:53.970350 containerd[1501]: 2025-09-13 00:08:53.941 [INFO][4431] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.44.131/26] block=192.168.44.128/26 handle="k8s-pod-network.32cb9490f7fee95bc363c6381c707ea9caa62c1397abd7d20ac28e4f70d5b6f0" host="ci-4081-3-5-n-294a4568b6" Sep 13 00:08:53.970350 containerd[1501]: 2025-09-13 00:08:53.941 [INFO][4431] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.44.131/26] handle="k8s-pod-network.32cb9490f7fee95bc363c6381c707ea9caa62c1397abd7d20ac28e4f70d5b6f0" host="ci-4081-3-5-n-294a4568b6" Sep 13 00:08:53.970350 containerd[1501]: 2025-09-13 00:08:53.941 [INFO][4431] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 13 00:08:53.970350 containerd[1501]: 2025-09-13 00:08:53.941 [INFO][4431] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.44.131/26] IPv6=[] ContainerID="32cb9490f7fee95bc363c6381c707ea9caa62c1397abd7d20ac28e4f70d5b6f0" HandleID="k8s-pod-network.32cb9490f7fee95bc363c6381c707ea9caa62c1397abd7d20ac28e4f70d5b6f0" Workload="ci--4081--3--5--n--294a4568b6-k8s-coredns--674b8bbfcf--ts6l4-eth0" Sep 13 00:08:53.974638 containerd[1501]: 2025-09-13 00:08:53.944 [INFO][4419] cni-plugin/k8s.go 418: Populated endpoint ContainerID="32cb9490f7fee95bc363c6381c707ea9caa62c1397abd7d20ac28e4f70d5b6f0" Namespace="kube-system" Pod="coredns-674b8bbfcf-ts6l4" WorkloadEndpoint="ci--4081--3--5--n--294a4568b6-k8s-coredns--674b8bbfcf--ts6l4-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--5--n--294a4568b6-k8s-coredns--674b8bbfcf--ts6l4-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"807cba0d-efbf-495e-bbeb-ab5413159760", ResourceVersion:"923", Generation:0, CreationTimestamp:time.Date(2025, time.September, 13, 0, 8, 19, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-5-n-294a4568b6", ContainerID:"", Pod:"coredns-674b8bbfcf-ts6l4", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.44.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calida62db25490", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 13 00:08:53.974638 containerd[1501]: 2025-09-13 00:08:53.944 [INFO][4419] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.44.131/32] ContainerID="32cb9490f7fee95bc363c6381c707ea9caa62c1397abd7d20ac28e4f70d5b6f0" Namespace="kube-system" Pod="coredns-674b8bbfcf-ts6l4" WorkloadEndpoint="ci--4081--3--5--n--294a4568b6-k8s-coredns--674b8bbfcf--ts6l4-eth0" Sep 13 00:08:53.974638 containerd[1501]: 2025-09-13 00:08:53.944 [INFO][4419] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calida62db25490 ContainerID="32cb9490f7fee95bc363c6381c707ea9caa62c1397abd7d20ac28e4f70d5b6f0" Namespace="kube-system" Pod="coredns-674b8bbfcf-ts6l4" WorkloadEndpoint="ci--4081--3--5--n--294a4568b6-k8s-coredns--674b8bbfcf--ts6l4-eth0" Sep 13 00:08:53.974638 containerd[1501]: 2025-09-13 00:08:53.948 [INFO][4419] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="32cb9490f7fee95bc363c6381c707ea9caa62c1397abd7d20ac28e4f70d5b6f0" Namespace="kube-system" 
Pod="coredns-674b8bbfcf-ts6l4" WorkloadEndpoint="ci--4081--3--5--n--294a4568b6-k8s-coredns--674b8bbfcf--ts6l4-eth0" Sep 13 00:08:53.974638 containerd[1501]: 2025-09-13 00:08:53.950 [INFO][4419] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="32cb9490f7fee95bc363c6381c707ea9caa62c1397abd7d20ac28e4f70d5b6f0" Namespace="kube-system" Pod="coredns-674b8bbfcf-ts6l4" WorkloadEndpoint="ci--4081--3--5--n--294a4568b6-k8s-coredns--674b8bbfcf--ts6l4-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--5--n--294a4568b6-k8s-coredns--674b8bbfcf--ts6l4-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"807cba0d-efbf-495e-bbeb-ab5413159760", ResourceVersion:"923", Generation:0, CreationTimestamp:time.Date(2025, time.September, 13, 0, 8, 19, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-5-n-294a4568b6", ContainerID:"32cb9490f7fee95bc363c6381c707ea9caa62c1397abd7d20ac28e4f70d5b6f0", Pod:"coredns-674b8bbfcf-ts6l4", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.44.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calida62db25490", MAC:"d6:f2:72:72:f4:41", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 13 00:08:53.974638 containerd[1501]: 2025-09-13 00:08:53.965 [INFO][4419] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="32cb9490f7fee95bc363c6381c707ea9caa62c1397abd7d20ac28e4f70d5b6f0" Namespace="kube-system" Pod="coredns-674b8bbfcf-ts6l4" WorkloadEndpoint="ci--4081--3--5--n--294a4568b6-k8s-coredns--674b8bbfcf--ts6l4-eth0" Sep 13 00:08:54.003583 containerd[1501]: time="2025-09-13T00:08:54.003075110Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Sep 13 00:08:54.003583 containerd[1501]: time="2025-09-13T00:08:54.003166332Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Sep 13 00:08:54.003583 containerd[1501]: time="2025-09-13T00:08:54.003181591Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 13 00:08:54.003583 containerd[1501]: time="2025-09-13T00:08:54.003399030Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 13 00:08:54.041289 systemd[1]: Started cri-containerd-32cb9490f7fee95bc363c6381c707ea9caa62c1397abd7d20ac28e4f70d5b6f0.scope - libcontainer container 32cb9490f7fee95bc363c6381c707ea9caa62c1397abd7d20ac28e4f70d5b6f0. Sep 13 00:08:54.096281 containerd[1501]: time="2025-09-13T00:08:54.096223627Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-ts6l4,Uid:807cba0d-efbf-495e-bbeb-ab5413159760,Namespace:kube-system,Attempt:1,} returns sandbox id \"32cb9490f7fee95bc363c6381c707ea9caa62c1397abd7d20ac28e4f70d5b6f0\"" Sep 13 00:08:54.101306 containerd[1501]: time="2025-09-13T00:08:54.101115077Z" level=info msg="CreateContainer within sandbox \"32cb9490f7fee95bc363c6381c707ea9caa62c1397abd7d20ac28e4f70d5b6f0\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Sep 13 00:08:54.126873 containerd[1501]: time="2025-09-13T00:08:54.126732881Z" level=info msg="CreateContainer within sandbox \"32cb9490f7fee95bc363c6381c707ea9caa62c1397abd7d20ac28e4f70d5b6f0\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"0eea106d62e95ce802aa1bc3b86627f27449acd40fb4d8aa4ce88d78e261e899\"" Sep 13 00:08:54.127935 containerd[1501]: time="2025-09-13T00:08:54.127903605Z" level=info msg="StartContainer for \"0eea106d62e95ce802aa1bc3b86627f27449acd40fb4d8aa4ce88d78e261e899\"" Sep 13 00:08:54.153103 systemd[1]: Started cri-containerd-0eea106d62e95ce802aa1bc3b86627f27449acd40fb4d8aa4ce88d78e261e899.scope - libcontainer container 0eea106d62e95ce802aa1bc3b86627f27449acd40fb4d8aa4ce88d78e261e899. Sep 13 00:08:54.180090 containerd[1501]: time="2025-09-13T00:08:54.180047404Z" level=info msg="StartContainer for \"0eea106d62e95ce802aa1bc3b86627f27449acd40fb4d8aa4ce88d78e261e899\" returns successfully" Sep 13 00:08:54.341210 systemd-networkd[1394]: cali936bbd726b3: Gained IPv6LL Sep 13 00:08:54.714331 containerd[1501]: time="2025-09-13T00:08:54.714066868Z" level=info msg="StopPodSandbox for \"132b716fbd3e34dd666a81d223dc704a3e4527a521620c1192a0fd481e16d6ea\"" Sep 13 00:08:54.865228 containerd[1501]: 2025-09-13 00:08:54.793 [INFO][4535] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="132b716fbd3e34dd666a81d223dc704a3e4527a521620c1192a0fd481e16d6ea" Sep 13 00:08:54.865228 containerd[1501]: 2025-09-13 00:08:54.793 [INFO][4535] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="132b716fbd3e34dd666a81d223dc704a3e4527a521620c1192a0fd481e16d6ea" iface="eth0" netns="/var/run/netns/cni-eadd8309-919f-6a5f-152e-7ac56cd7b548" Sep 13 00:08:54.865228 containerd[1501]: 2025-09-13 00:08:54.795 [INFO][4535] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="132b716fbd3e34dd666a81d223dc704a3e4527a521620c1192a0fd481e16d6ea" iface="eth0" netns="/var/run/netns/cni-eadd8309-919f-6a5f-152e-7ac56cd7b548" Sep 13 00:08:54.865228 containerd[1501]: 2025-09-13 00:08:54.795 [INFO][4535] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. 
ContainerID="132b716fbd3e34dd666a81d223dc704a3e4527a521620c1192a0fd481e16d6ea" iface="eth0" netns="/var/run/netns/cni-eadd8309-919f-6a5f-152e-7ac56cd7b548" Sep 13 00:08:54.865228 containerd[1501]: 2025-09-13 00:08:54.795 [INFO][4535] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="132b716fbd3e34dd666a81d223dc704a3e4527a521620c1192a0fd481e16d6ea" Sep 13 00:08:54.865228 containerd[1501]: 2025-09-13 00:08:54.795 [INFO][4535] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="132b716fbd3e34dd666a81d223dc704a3e4527a521620c1192a0fd481e16d6ea" Sep 13 00:08:54.865228 containerd[1501]: 2025-09-13 00:08:54.845 [INFO][4542] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="132b716fbd3e34dd666a81d223dc704a3e4527a521620c1192a0fd481e16d6ea" HandleID="k8s-pod-network.132b716fbd3e34dd666a81d223dc704a3e4527a521620c1192a0fd481e16d6ea" Workload="ci--4081--3--5--n--294a4568b6-k8s-csi--node--driver--pdqq4-eth0" Sep 13 00:08:54.865228 containerd[1501]: 2025-09-13 00:08:54.846 [INFO][4542] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 13 00:08:54.865228 containerd[1501]: 2025-09-13 00:08:54.846 [INFO][4542] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 13 00:08:54.865228 containerd[1501]: 2025-09-13 00:08:54.856 [WARNING][4542] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="132b716fbd3e34dd666a81d223dc704a3e4527a521620c1192a0fd481e16d6ea" HandleID="k8s-pod-network.132b716fbd3e34dd666a81d223dc704a3e4527a521620c1192a0fd481e16d6ea" Workload="ci--4081--3--5--n--294a4568b6-k8s-csi--node--driver--pdqq4-eth0" Sep 13 00:08:54.865228 containerd[1501]: 2025-09-13 00:08:54.856 [INFO][4542] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="132b716fbd3e34dd666a81d223dc704a3e4527a521620c1192a0fd481e16d6ea" HandleID="k8s-pod-network.132b716fbd3e34dd666a81d223dc704a3e4527a521620c1192a0fd481e16d6ea" Workload="ci--4081--3--5--n--294a4568b6-k8s-csi--node--driver--pdqq4-eth0" Sep 13 00:08:54.865228 containerd[1501]: 2025-09-13 00:08:54.859 [INFO][4542] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 13 00:08:54.865228 containerd[1501]: 2025-09-13 00:08:54.861 [INFO][4535] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="132b716fbd3e34dd666a81d223dc704a3e4527a521620c1192a0fd481e16d6ea" Sep 13 00:08:54.868709 containerd[1501]: time="2025-09-13T00:08:54.867097354Z" level=info msg="TearDown network for sandbox \"132b716fbd3e34dd666a81d223dc704a3e4527a521620c1192a0fd481e16d6ea\" successfully" Sep 13 00:08:54.868709 containerd[1501]: time="2025-09-13T00:08:54.867131046Z" level=info msg="StopPodSandbox for \"132b716fbd3e34dd666a81d223dc704a3e4527a521620c1192a0fd481e16d6ea\" returns successfully" Sep 13 00:08:54.869337 systemd[1]: run-netns-cni\x2deadd8309\x2d919f\x2d6a5f\x2d152e\x2d7ac56cd7b548.mount: Deactivated successfully. 
Sep 13 00:08:54.873293 containerd[1501]: time="2025-09-13T00:08:54.873107058Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-pdqq4,Uid:d0c095e9-b846-4f71-bc12-2e92665be871,Namespace:calico-system,Attempt:1,}" Sep 13 00:08:55.017412 systemd-networkd[1394]: calid910e85c87b: Link UP Sep 13 00:08:55.020791 systemd-networkd[1394]: calid910e85c87b: Gained carrier Sep 13 00:08:55.067602 containerd[1501]: 2025-09-13 00:08:54.919 [INFO][4549] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4081--3--5--n--294a4568b6-k8s-csi--node--driver--pdqq4-eth0 csi-node-driver- calico-system d0c095e9-b846-4f71-bc12-2e92665be871 933 0 2025-09-13 00:08:32 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:6c96d95cc7 k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s ci-4081-3-5-n-294a4568b6 csi-node-driver-pdqq4 eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] calid910e85c87b [] [] }} ContainerID="2aeeb1e67e0ab98e4632cde4b2c2a3b3d848b6449cd7ba19bc91d6b34385ec27" Namespace="calico-system" Pod="csi-node-driver-pdqq4" WorkloadEndpoint="ci--4081--3--5--n--294a4568b6-k8s-csi--node--driver--pdqq4-" Sep 13 00:08:55.067602 containerd[1501]: 2025-09-13 00:08:54.920 [INFO][4549] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="2aeeb1e67e0ab98e4632cde4b2c2a3b3d848b6449cd7ba19bc91d6b34385ec27" Namespace="calico-system" Pod="csi-node-driver-pdqq4" WorkloadEndpoint="ci--4081--3--5--n--294a4568b6-k8s-csi--node--driver--pdqq4-eth0" Sep 13 00:08:55.067602 containerd[1501]: 2025-09-13 00:08:54.960 [INFO][4561] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="2aeeb1e67e0ab98e4632cde4b2c2a3b3d848b6449cd7ba19bc91d6b34385ec27" HandleID="k8s-pod-network.2aeeb1e67e0ab98e4632cde4b2c2a3b3d848b6449cd7ba19bc91d6b34385ec27" Workload="ci--4081--3--5--n--294a4568b6-k8s-csi--node--driver--pdqq4-eth0" Sep 13 00:08:55.067602 containerd[1501]: 2025-09-13 00:08:54.960 [INFO][4561] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="2aeeb1e67e0ab98e4632cde4b2c2a3b3d848b6449cd7ba19bc91d6b34385ec27" HandleID="k8s-pod-network.2aeeb1e67e0ab98e4632cde4b2c2a3b3d848b6449cd7ba19bc91d6b34385ec27" Workload="ci--4081--3--5--n--294a4568b6-k8s-csi--node--driver--pdqq4-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00048e1e0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4081-3-5-n-294a4568b6", "pod":"csi-node-driver-pdqq4", "timestamp":"2025-09-13 00:08:54.96062305 +0000 UTC"}, Hostname:"ci-4081-3-5-n-294a4568b6", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 13 00:08:55.067602 containerd[1501]: 2025-09-13 00:08:54.960 [INFO][4561] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 13 00:08:55.067602 containerd[1501]: 2025-09-13 00:08:54.960 [INFO][4561] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 13 00:08:55.067602 containerd[1501]: 2025-09-13 00:08:54.960 [INFO][4561] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4081-3-5-n-294a4568b6'
Sep 13 00:08:55.067602 containerd[1501]: 2025-09-13 00:08:54.971 [INFO][4561] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.2aeeb1e67e0ab98e4632cde4b2c2a3b3d848b6449cd7ba19bc91d6b34385ec27" host="ci-4081-3-5-n-294a4568b6"
Sep 13 00:08:55.067602 containerd[1501]: 2025-09-13 00:08:54.978 [INFO][4561] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4081-3-5-n-294a4568b6"
Sep 13 00:08:55.067602 containerd[1501]: 2025-09-13 00:08:54.985 [INFO][4561] ipam/ipam.go 511: Trying affinity for 192.168.44.128/26 host="ci-4081-3-5-n-294a4568b6"
Sep 13 00:08:55.067602 containerd[1501]: 2025-09-13 00:08:54.988 [INFO][4561] ipam/ipam.go 158: Attempting to load block cidr=192.168.44.128/26 host="ci-4081-3-5-n-294a4568b6"
Sep 13 00:08:55.067602 containerd[1501]: 2025-09-13 00:08:54.991 [INFO][4561] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.44.128/26 host="ci-4081-3-5-n-294a4568b6"
Sep 13 00:08:55.067602 containerd[1501]: 2025-09-13 00:08:54.991 [INFO][4561] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.44.128/26 handle="k8s-pod-network.2aeeb1e67e0ab98e4632cde4b2c2a3b3d848b6449cd7ba19bc91d6b34385ec27" host="ci-4081-3-5-n-294a4568b6"
Sep 13 00:08:55.067602 containerd[1501]: 2025-09-13 00:08:54.994 [INFO][4561] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.2aeeb1e67e0ab98e4632cde4b2c2a3b3d848b6449cd7ba19bc91d6b34385ec27
Sep 13 00:08:55.067602 containerd[1501]: 2025-09-13 00:08:54.999 [INFO][4561] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.44.128/26 handle="k8s-pod-network.2aeeb1e67e0ab98e4632cde4b2c2a3b3d848b6449cd7ba19bc91d6b34385ec27" host="ci-4081-3-5-n-294a4568b6"
Sep 13 00:08:55.067602 containerd[1501]: 2025-09-13 00:08:55.005 [INFO][4561] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.44.132/26] block=192.168.44.128/26 handle="k8s-pod-network.2aeeb1e67e0ab98e4632cde4b2c2a3b3d848b6449cd7ba19bc91d6b34385ec27" host="ci-4081-3-5-n-294a4568b6"
Sep 13 00:08:55.067602 containerd[1501]: 2025-09-13 00:08:55.005 [INFO][4561] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.44.132/26] handle="k8s-pod-network.2aeeb1e67e0ab98e4632cde4b2c2a3b3d848b6449cd7ba19bc91d6b34385ec27" host="ci-4081-3-5-n-294a4568b6"
Sep 13 00:08:55.067602 containerd[1501]: 2025-09-13 00:08:55.006 [INFO][4561] ipam/ipam_plugin.go 374: Released host-wide IPAM lock.
Sep 13 00:08:55.067602 containerd[1501]: 2025-09-13 00:08:55.006 [INFO][4561] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.44.132/26] IPv6=[] ContainerID="2aeeb1e67e0ab98e4632cde4b2c2a3b3d848b6449cd7ba19bc91d6b34385ec27" HandleID="k8s-pod-network.2aeeb1e67e0ab98e4632cde4b2c2a3b3d848b6449cd7ba19bc91d6b34385ec27" Workload="ci--4081--3--5--n--294a4568b6-k8s-csi--node--driver--pdqq4-eth0"
Sep 13 00:08:55.071502 containerd[1501]: 2025-09-13 00:08:55.010 [INFO][4549] cni-plugin/k8s.go 418: Populated endpoint ContainerID="2aeeb1e67e0ab98e4632cde4b2c2a3b3d848b6449cd7ba19bc91d6b34385ec27" Namespace="calico-system" Pod="csi-node-driver-pdqq4" WorkloadEndpoint="ci--4081--3--5--n--294a4568b6-k8s-csi--node--driver--pdqq4-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--5--n--294a4568b6-k8s-csi--node--driver--pdqq4-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"d0c095e9-b846-4f71-bc12-2e92665be871", ResourceVersion:"933", Generation:0, CreationTimestamp:time.Date(2025, time.September, 13, 0, 8, 32, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"6c96d95cc7", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-5-n-294a4568b6", ContainerID:"", Pod:"csi-node-driver-pdqq4", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.44.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calid910e85c87b", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}}
Sep 13 00:08:55.071502 containerd[1501]: 2025-09-13 00:08:55.010 [INFO][4549] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.44.132/32] ContainerID="2aeeb1e67e0ab98e4632cde4b2c2a3b3d848b6449cd7ba19bc91d6b34385ec27" Namespace="calico-system" Pod="csi-node-driver-pdqq4" WorkloadEndpoint="ci--4081--3--5--n--294a4568b6-k8s-csi--node--driver--pdqq4-eth0"
Sep 13 00:08:55.071502 containerd[1501]: 2025-09-13 00:08:55.010 [INFO][4549] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calid910e85c87b ContainerID="2aeeb1e67e0ab98e4632cde4b2c2a3b3d848b6449cd7ba19bc91d6b34385ec27" Namespace="calico-system" Pod="csi-node-driver-pdqq4" WorkloadEndpoint="ci--4081--3--5--n--294a4568b6-k8s-csi--node--driver--pdqq4-eth0"
Sep 13 00:08:55.071502 containerd[1501]: 2025-09-13 00:08:55.019 [INFO][4549] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="2aeeb1e67e0ab98e4632cde4b2c2a3b3d848b6449cd7ba19bc91d6b34385ec27" Namespace="calico-system" Pod="csi-node-driver-pdqq4" WorkloadEndpoint="ci--4081--3--5--n--294a4568b6-k8s-csi--node--driver--pdqq4-eth0"
Sep 13 00:08:55.071502 containerd[1501]: 2025-09-13 00:08:55.020 [INFO][4549] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="2aeeb1e67e0ab98e4632cde4b2c2a3b3d848b6449cd7ba19bc91d6b34385ec27" Namespace="calico-system" Pod="csi-node-driver-pdqq4" WorkloadEndpoint="ci--4081--3--5--n--294a4568b6-k8s-csi--node--driver--pdqq4-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--5--n--294a4568b6-k8s-csi--node--driver--pdqq4-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"d0c095e9-b846-4f71-bc12-2e92665be871", ResourceVersion:"933", Generation:0, CreationTimestamp:time.Date(2025, time.September, 13, 0, 8, 32, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"6c96d95cc7", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-5-n-294a4568b6", ContainerID:"2aeeb1e67e0ab98e4632cde4b2c2a3b3d848b6449cd7ba19bc91d6b34385ec27", Pod:"csi-node-driver-pdqq4", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.44.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calid910e85c87b", MAC:"0e:88:ec:02:f2:8a", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}}
Sep 13 00:08:55.071502 containerd[1501]: 2025-09-13 00:08:55.058 [INFO][4549] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="2aeeb1e67e0ab98e4632cde4b2c2a3b3d848b6449cd7ba19bc91d6b34385ec27" Namespace="calico-system" Pod="csi-node-driver-pdqq4" WorkloadEndpoint="ci--4081--3--5--n--294a4568b6-k8s-csi--node--driver--pdqq4-eth0"
Sep 13 00:08:55.086572 kubelet[2611]: I0913 00:08:55.085188 2611 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-674b8bbfcf-ts6l4" podStartSLOduration=36.085167671 podStartE2EDuration="36.085167671s" podCreationTimestamp="2025-09-13 00:08:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-13 00:08:55.053231454 +0000 UTC m=+41.467264751" watchObservedRunningTime="2025-09-13 00:08:55.085167671 +0000 UTC m=+41.499200967"
Sep 13 00:08:55.118502 containerd[1501]: time="2025-09-13T00:08:55.117156035Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Sep 13 00:08:55.118502 containerd[1501]: time="2025-09-13T00:08:55.117242659Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Sep 13 00:08:55.118502 containerd[1501]: time="2025-09-13T00:08:55.117262025Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Sep 13 00:08:55.118502 containerd[1501]: time="2025-09-13T00:08:55.117384105Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Sep 13 00:08:55.153262 systemd[1]: Started cri-containerd-2aeeb1e67e0ab98e4632cde4b2c2a3b3d848b6449cd7ba19bc91d6b34385ec27.scope - libcontainer container 2aeeb1e67e0ab98e4632cde4b2c2a3b3d848b6449cd7ba19bc91d6b34385ec27.
Sep 13 00:08:55.206727 containerd[1501]: time="2025-09-13T00:08:55.206680206Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-pdqq4,Uid:d0c095e9-b846-4f71-bc12-2e92665be871,Namespace:calico-system,Attempt:1,} returns sandbox id \"2aeeb1e67e0ab98e4632cde4b2c2a3b3d848b6449cd7ba19bc91d6b34385ec27\""
Sep 13 00:08:55.554047 systemd-networkd[1394]: calida62db25490: Gained IPv6LL
Sep 13 00:08:55.587538 containerd[1501]: time="2025-09-13T00:08:55.587459335Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 13 00:08:55.590715 containerd[1501]: time="2025-09-13T00:08:55.589869252Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.3: active requests=0, bytes read=47333864"
Sep 13 00:08:55.595020 containerd[1501]: time="2025-09-13T00:08:55.594972267Z" level=info msg="ImageCreate event name:\"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 13 00:08:55.602023 containerd[1501]: time="2025-09-13T00:08:55.601975801Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 13 00:08:55.602842 containerd[1501]: time="2025-09-13T00:08:55.602818177Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" with image id \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\", size \"48826583\" in 2.518256268s"
Sep 13 00:08:55.602940 containerd[1501]: time="2025-09-13T00:08:55.602924757Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" returns image reference \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\""
Sep 13 00:08:55.611167 containerd[1501]: time="2025-09-13T00:08:55.611132548Z" level=info msg="CreateContainer within sandbox \"3a9ad9d11477efcdcd06b6b044f06d51ec462d5c936e47f2d742a40029171596\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}"
Sep 13 00:08:55.613757 containerd[1501]: time="2025-09-13T00:08:55.613738303Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.3\""
Sep 13 00:08:55.630482 containerd[1501]: time="2025-09-13T00:08:55.630447817Z" level=info msg="CreateContainer within sandbox \"3a9ad9d11477efcdcd06b6b044f06d51ec462d5c936e47f2d742a40029171596\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"3f8d487d5de6e6883659430df407f5c0306811f952171c357b75840a8a3320d9\""
Sep 13 00:08:55.631757 containerd[1501]: time="2025-09-13T00:08:55.631724711Z" level=info msg="StartContainer for \"3f8d487d5de6e6883659430df407f5c0306811f952171c357b75840a8a3320d9\""
Sep 13 00:08:55.665095 systemd[1]: Started cri-containerd-3f8d487d5de6e6883659430df407f5c0306811f952171c357b75840a8a3320d9.scope - libcontainer container 3f8d487d5de6e6883659430df407f5c0306811f952171c357b75840a8a3320d9.
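Annotation: the ipam.go entries above (110 → 691/394 → 511 → 158 → 235 → 1220 → 1243 → 1256 → 878) trace Calico's block-affinity IPAM path: confirm this host's affinity to the 192.168.44.128/26 block, load the block, then claim the next free address (here 192.168.44.132) under the host-wide lock. A minimal Go sketch of the claim step, using illustrative stand-in types rather than Calico's real API:

    package main

    import "fmt"

    // Stand-in for one /26 allocation block whose affinity to this host
    // has already been confirmed ("Affinity is confirmed and block has
    // been loaded").
    type block struct {
        cidr string
        used map[string]bool // address -> already allocated
    }

    // assign claims the first free address in the block; the real IPAM
    // then persists the owning handle back to the datastore ("Writing
    // block in order to claim IPs"). The handle here is hypothetical.
    func (b *block) assign(handle string, pool []string) (string, error) {
        for _, ip := range pool {
            if !b.used[ip] {
                b.used[ip] = true
                return ip, nil
            }
        }
        return "", fmt.Errorf("no free addresses in %s for %s", b.cidr, handle)
    }

    func main() {
        b := &block{cidr: "192.168.44.128/26", used: map[string]bool{
            "192.168.44.129": true, "192.168.44.130": true, "192.168.44.131": true,
        }}
        ip, err := b.assign("k8s-pod-network.<sandbox-id>", []string{
            "192.168.44.129", "192.168.44.130", "192.168.44.131", "192.168.44.132",
        })
        if err != nil {
            panic(err)
        }
        fmt.Println("claimed", ip) // claimed 192.168.44.132, as in the log
    }

The per-host lock ("About to acquire host-wide IPAM lock" / "Released host-wide IPAM lock") serializes exactly this read-modify-write of the block, which is why the concurrent coredns, goldmane, and apiserver assignments below each get a distinct address from the same /26.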
Sep 13 00:08:55.732246 containerd[1501]: time="2025-09-13T00:08:55.731769224Z" level=info msg="StopPodSandbox for \"70302e1b99e4a70b05a0e6a485a959769c320e7396db1eccf4b6e32b8c3ba19c\""
Sep 13 00:08:55.737408 containerd[1501]: time="2025-09-13T00:08:55.737376218Z" level=info msg="StopPodSandbox for \"917fdbefc3908f0bf83798fac2c4846875ba56865120df4665ebe4f06d46340d\""
Sep 13 00:08:55.756935 containerd[1501]: time="2025-09-13T00:08:55.756653416Z" level=info msg="StartContainer for \"3f8d487d5de6e6883659430df407f5c0306811f952171c357b75840a8a3320d9\" returns successfully"
Sep 13 00:08:55.912268 containerd[1501]: 2025-09-13 00:08:55.872 [INFO][4680] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="70302e1b99e4a70b05a0e6a485a959769c320e7396db1eccf4b6e32b8c3ba19c"
Sep 13 00:08:55.912268 containerd[1501]: 2025-09-13 00:08:55.872 [INFO][4680] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="70302e1b99e4a70b05a0e6a485a959769c320e7396db1eccf4b6e32b8c3ba19c" iface="eth0" netns="/var/run/netns/cni-24dc31fa-5aa4-ec35-3a9e-11e49ec3b43e"
Sep 13 00:08:55.912268 containerd[1501]: 2025-09-13 00:08:55.873 [INFO][4680] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="70302e1b99e4a70b05a0e6a485a959769c320e7396db1eccf4b6e32b8c3ba19c" iface="eth0" netns="/var/run/netns/cni-24dc31fa-5aa4-ec35-3a9e-11e49ec3b43e"
Sep 13 00:08:55.912268 containerd[1501]: 2025-09-13 00:08:55.873 [INFO][4680] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. ContainerID="70302e1b99e4a70b05a0e6a485a959769c320e7396db1eccf4b6e32b8c3ba19c" iface="eth0" netns="/var/run/netns/cni-24dc31fa-5aa4-ec35-3a9e-11e49ec3b43e"
Sep 13 00:08:55.912268 containerd[1501]: 2025-09-13 00:08:55.873 [INFO][4680] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="70302e1b99e4a70b05a0e6a485a959769c320e7396db1eccf4b6e32b8c3ba19c"
Sep 13 00:08:55.912268 containerd[1501]: 2025-09-13 00:08:55.873 [INFO][4680] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="70302e1b99e4a70b05a0e6a485a959769c320e7396db1eccf4b6e32b8c3ba19c"
Sep 13 00:08:55.912268 containerd[1501]: 2025-09-13 00:08:55.895 [INFO][4703] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="70302e1b99e4a70b05a0e6a485a959769c320e7396db1eccf4b6e32b8c3ba19c" HandleID="k8s-pod-network.70302e1b99e4a70b05a0e6a485a959769c320e7396db1eccf4b6e32b8c3ba19c" Workload="ci--4081--3--5--n--294a4568b6-k8s-coredns--674b8bbfcf--q6mzd-eth0"
Sep 13 00:08:55.912268 containerd[1501]: 2025-09-13 00:08:55.896 [INFO][4703] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock.
Sep 13 00:08:55.912268 containerd[1501]: 2025-09-13 00:08:55.896 [INFO][4703] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock.
Sep 13 00:08:55.912268 containerd[1501]: 2025-09-13 00:08:55.902 [WARNING][4703] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="70302e1b99e4a70b05a0e6a485a959769c320e7396db1eccf4b6e32b8c3ba19c" HandleID="k8s-pod-network.70302e1b99e4a70b05a0e6a485a959769c320e7396db1eccf4b6e32b8c3ba19c" Workload="ci--4081--3--5--n--294a4568b6-k8s-coredns--674b8bbfcf--q6mzd-eth0"
Sep 13 00:08:55.912268 containerd[1501]: 2025-09-13 00:08:55.902 [INFO][4703] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="70302e1b99e4a70b05a0e6a485a959769c320e7396db1eccf4b6e32b8c3ba19c" HandleID="k8s-pod-network.70302e1b99e4a70b05a0e6a485a959769c320e7396db1eccf4b6e32b8c3ba19c" Workload="ci--4081--3--5--n--294a4568b6-k8s-coredns--674b8bbfcf--q6mzd-eth0"
Sep 13 00:08:55.912268 containerd[1501]: 2025-09-13 00:08:55.904 [INFO][4703] ipam/ipam_plugin.go 374: Released host-wide IPAM lock.
Sep 13 00:08:55.912268 containerd[1501]: 2025-09-13 00:08:55.908 [INFO][4680] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="70302e1b99e4a70b05a0e6a485a959769c320e7396db1eccf4b6e32b8c3ba19c"
Sep 13 00:08:55.914205 containerd[1501]: time="2025-09-13T00:08:55.912813432Z" level=info msg="TearDown network for sandbox \"70302e1b99e4a70b05a0e6a485a959769c320e7396db1eccf4b6e32b8c3ba19c\" successfully"
Sep 13 00:08:55.914205 containerd[1501]: time="2025-09-13T00:08:55.912850883Z" level=info msg="StopPodSandbox for \"70302e1b99e4a70b05a0e6a485a959769c320e7396db1eccf4b6e32b8c3ba19c\" returns successfully"
Sep 13 00:08:55.915278 containerd[1501]: time="2025-09-13T00:08:55.914753153Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-q6mzd,Uid:3ba023c8-210d-4e58-bf51-ba6b7484e570,Namespace:kube-system,Attempt:1,}"
Sep 13 00:08:55.915173 systemd[1]: run-netns-cni\x2d24dc31fa\x2d5aa4\x2dec35\x2d3a9e\x2d11e49ec3b43e.mount: Deactivated successfully.
Sep 13 00:08:55.923818 containerd[1501]: 2025-09-13 00:08:55.866 [INFO][4679] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="917fdbefc3908f0bf83798fac2c4846875ba56865120df4665ebe4f06d46340d"
Sep 13 00:08:55.923818 containerd[1501]: 2025-09-13 00:08:55.867 [INFO][4679] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="917fdbefc3908f0bf83798fac2c4846875ba56865120df4665ebe4f06d46340d" iface="eth0" netns="/var/run/netns/cni-b5aab047-761f-5920-51c8-4b2420a3ec91"
Sep 13 00:08:55.923818 containerd[1501]: 2025-09-13 00:08:55.868 [INFO][4679] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="917fdbefc3908f0bf83798fac2c4846875ba56865120df4665ebe4f06d46340d" iface="eth0" netns="/var/run/netns/cni-b5aab047-761f-5920-51c8-4b2420a3ec91"
Sep 13 00:08:55.923818 containerd[1501]: 2025-09-13 00:08:55.868 [INFO][4679] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. ContainerID="917fdbefc3908f0bf83798fac2c4846875ba56865120df4665ebe4f06d46340d" iface="eth0" netns="/var/run/netns/cni-b5aab047-761f-5920-51c8-4b2420a3ec91"
Sep 13 00:08:55.923818 containerd[1501]: 2025-09-13 00:08:55.868 [INFO][4679] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="917fdbefc3908f0bf83798fac2c4846875ba56865120df4665ebe4f06d46340d"
Sep 13 00:08:55.923818 containerd[1501]: 2025-09-13 00:08:55.868 [INFO][4679] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="917fdbefc3908f0bf83798fac2c4846875ba56865120df4665ebe4f06d46340d"
Sep 13 00:08:55.923818 containerd[1501]: 2025-09-13 00:08:55.900 [INFO][4698] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="917fdbefc3908f0bf83798fac2c4846875ba56865120df4665ebe4f06d46340d" HandleID="k8s-pod-network.917fdbefc3908f0bf83798fac2c4846875ba56865120df4665ebe4f06d46340d" Workload="ci--4081--3--5--n--294a4568b6-k8s-goldmane--54d579b49d--zn7kg-eth0"
Sep 13 00:08:55.923818 containerd[1501]: 2025-09-13 00:08:55.901 [INFO][4698] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock.
Sep 13 00:08:55.923818 containerd[1501]: 2025-09-13 00:08:55.904 [INFO][4698] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock.
Sep 13 00:08:55.923818 containerd[1501]: 2025-09-13 00:08:55.911 [WARNING][4698] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="917fdbefc3908f0bf83798fac2c4846875ba56865120df4665ebe4f06d46340d" HandleID="k8s-pod-network.917fdbefc3908f0bf83798fac2c4846875ba56865120df4665ebe4f06d46340d" Workload="ci--4081--3--5--n--294a4568b6-k8s-goldmane--54d579b49d--zn7kg-eth0"
Sep 13 00:08:55.923818 containerd[1501]: 2025-09-13 00:08:55.911 [INFO][4698] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="917fdbefc3908f0bf83798fac2c4846875ba56865120df4665ebe4f06d46340d" HandleID="k8s-pod-network.917fdbefc3908f0bf83798fac2c4846875ba56865120df4665ebe4f06d46340d" Workload="ci--4081--3--5--n--294a4568b6-k8s-goldmane--54d579b49d--zn7kg-eth0"
Sep 13 00:08:55.923818 containerd[1501]: 2025-09-13 00:08:55.918 [INFO][4698] ipam/ipam_plugin.go 374: Released host-wide IPAM lock.
Sep 13 00:08:55.923818 containerd[1501]: 2025-09-13 00:08:55.921 [INFO][4679] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="917fdbefc3908f0bf83798fac2c4846875ba56865120df4665ebe4f06d46340d"
Sep 13 00:08:55.925266 containerd[1501]: time="2025-09-13T00:08:55.924154020Z" level=info msg="TearDown network for sandbox \"917fdbefc3908f0bf83798fac2c4846875ba56865120df4665ebe4f06d46340d\" successfully"
Sep 13 00:08:55.925266 containerd[1501]: time="2025-09-13T00:08:55.924177654Z" level=info msg="StopPodSandbox for \"917fdbefc3908f0bf83798fac2c4846875ba56865120df4665ebe4f06d46340d\" returns successfully"
Sep 13 00:08:55.925796 containerd[1501]: time="2025-09-13T00:08:55.925570205Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-zn7kg,Uid:e8347be8-88b8-4276-8b5d-e1c006f27806,Namespace:calico-system,Attempt:1,}"
Sep 13 00:08:55.927434 systemd[1]: run-netns-cni\x2db5aab047\x2d761f\x2d5920\x2d51c8\x2d4b2420a3ec91.mount: Deactivated successfully.
Sep 13 00:08:56.040406 kubelet[2611]: I0913 00:08:56.040348 2611 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-9c789df89-chbwf" podStartSLOduration=24.520206319 podStartE2EDuration="27.04033387s" podCreationTimestamp="2025-09-13 00:08:29 +0000 UTC" firstStartedPulling="2025-09-13 00:08:53.084254301 +0000 UTC m=+39.498287597" lastFinishedPulling="2025-09-13 00:08:55.60438185 +0000 UTC m=+42.018415148" observedRunningTime="2025-09-13 00:08:56.039676103 +0000 UTC m=+42.453709420" watchObservedRunningTime="2025-09-13 00:08:56.04033387 +0000 UTC m=+42.454367167"
Sep 13 00:08:56.075365 systemd-networkd[1394]: calicc9c3689856: Link UP
Sep 13 00:08:56.076343 systemd-networkd[1394]: calicc9c3689856: Gained carrier
Sep 13 00:08:56.100276 containerd[1501]: 2025-09-13 00:08:55.973 [INFO][4711] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4081--3--5--n--294a4568b6-k8s-coredns--674b8bbfcf--q6mzd-eth0 coredns-674b8bbfcf- kube-system 3ba023c8-210d-4e58-bf51-ba6b7484e570 954 0 2025-09-13 00:08:19 +0000 UTC map[k8s-app:kube-dns pod-template-hash:674b8bbfcf projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4081-3-5-n-294a4568b6 coredns-674b8bbfcf-q6mzd eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] calicc9c3689856 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="acfa6321962f6322b063ced1b193c73c093756247aa55f09cbec0abaac80aa34" Namespace="kube-system" Pod="coredns-674b8bbfcf-q6mzd" WorkloadEndpoint="ci--4081--3--5--n--294a4568b6-k8s-coredns--674b8bbfcf--q6mzd-"
Sep 13 00:08:56.100276 containerd[1501]: 2025-09-13 00:08:55.973 [INFO][4711] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="acfa6321962f6322b063ced1b193c73c093756247aa55f09cbec0abaac80aa34" Namespace="kube-system" Pod="coredns-674b8bbfcf-q6mzd" WorkloadEndpoint="ci--4081--3--5--n--294a4568b6-k8s-coredns--674b8bbfcf--q6mzd-eth0"
Sep 13 00:08:56.100276 containerd[1501]: 2025-09-13 00:08:56.011 [INFO][4735] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="acfa6321962f6322b063ced1b193c73c093756247aa55f09cbec0abaac80aa34" HandleID="k8s-pod-network.acfa6321962f6322b063ced1b193c73c093756247aa55f09cbec0abaac80aa34" Workload="ci--4081--3--5--n--294a4568b6-k8s-coredns--674b8bbfcf--q6mzd-eth0"
Sep 13 00:08:56.100276 containerd[1501]: 2025-09-13 00:08:56.011 [INFO][4735] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="acfa6321962f6322b063ced1b193c73c093756247aa55f09cbec0abaac80aa34" HandleID="k8s-pod-network.acfa6321962f6322b063ced1b193c73c093756247aa55f09cbec0abaac80aa34" Workload="ci--4081--3--5--n--294a4568b6-k8s-coredns--674b8bbfcf--q6mzd-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00024f2c0), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4081-3-5-n-294a4568b6", "pod":"coredns-674b8bbfcf-q6mzd", "timestamp":"2025-09-13 00:08:56.011165065 +0000 UTC"}, Hostname:"ci-4081-3-5-n-294a4568b6", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"}
Sep 13 00:08:56.100276 containerd[1501]: 2025-09-13 00:08:56.011 [INFO][4735] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock.
Sep 13 00:08:56.100276 containerd[1501]: 2025-09-13 00:08:56.011 [INFO][4735] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock.
Sep 13 00:08:56.100276 containerd[1501]: 2025-09-13 00:08:56.011 [INFO][4735] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4081-3-5-n-294a4568b6'
Sep 13 00:08:56.100276 containerd[1501]: 2025-09-13 00:08:56.019 [INFO][4735] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.acfa6321962f6322b063ced1b193c73c093756247aa55f09cbec0abaac80aa34" host="ci-4081-3-5-n-294a4568b6"
Sep 13 00:08:56.100276 containerd[1501]: 2025-09-13 00:08:56.024 [INFO][4735] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4081-3-5-n-294a4568b6"
Sep 13 00:08:56.100276 containerd[1501]: 2025-09-13 00:08:56.035 [INFO][4735] ipam/ipam.go 511: Trying affinity for 192.168.44.128/26 host="ci-4081-3-5-n-294a4568b6"
Sep 13 00:08:56.100276 containerd[1501]: 2025-09-13 00:08:56.038 [INFO][4735] ipam/ipam.go 158: Attempting to load block cidr=192.168.44.128/26 host="ci-4081-3-5-n-294a4568b6"
Sep 13 00:08:56.100276 containerd[1501]: 2025-09-13 00:08:56.041 [INFO][4735] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.44.128/26 host="ci-4081-3-5-n-294a4568b6"
Sep 13 00:08:56.100276 containerd[1501]: 2025-09-13 00:08:56.042 [INFO][4735] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.44.128/26 handle="k8s-pod-network.acfa6321962f6322b063ced1b193c73c093756247aa55f09cbec0abaac80aa34" host="ci-4081-3-5-n-294a4568b6"
Sep 13 00:08:56.100276 containerd[1501]: 2025-09-13 00:08:56.046 [INFO][4735] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.acfa6321962f6322b063ced1b193c73c093756247aa55f09cbec0abaac80aa34
Sep 13 00:08:56.100276 containerd[1501]: 2025-09-13 00:08:56.052 [INFO][4735] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.44.128/26 handle="k8s-pod-network.acfa6321962f6322b063ced1b193c73c093756247aa55f09cbec0abaac80aa34" host="ci-4081-3-5-n-294a4568b6"
Sep 13 00:08:56.100276 containerd[1501]: 2025-09-13 00:08:56.060 [INFO][4735] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.44.133/26] block=192.168.44.128/26 handle="k8s-pod-network.acfa6321962f6322b063ced1b193c73c093756247aa55f09cbec0abaac80aa34" host="ci-4081-3-5-n-294a4568b6"
Sep 13 00:08:56.100276 containerd[1501]: 2025-09-13 00:08:56.060 [INFO][4735] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.44.133/26] handle="k8s-pod-network.acfa6321962f6322b063ced1b193c73c093756247aa55f09cbec0abaac80aa34" host="ci-4081-3-5-n-294a4568b6"
Sep 13 00:08:56.100276 containerd[1501]: 2025-09-13 00:08:56.061 [INFO][4735] ipam/ipam_plugin.go 374: Released host-wide IPAM lock.
Sep 13 00:08:56.100276 containerd[1501]: 2025-09-13 00:08:56.065 [INFO][4735] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.44.133/26] IPv6=[] ContainerID="acfa6321962f6322b063ced1b193c73c093756247aa55f09cbec0abaac80aa34" HandleID="k8s-pod-network.acfa6321962f6322b063ced1b193c73c093756247aa55f09cbec0abaac80aa34" Workload="ci--4081--3--5--n--294a4568b6-k8s-coredns--674b8bbfcf--q6mzd-eth0"
Sep 13 00:08:56.103082 containerd[1501]: 2025-09-13 00:08:56.068 [INFO][4711] cni-plugin/k8s.go 418: Populated endpoint ContainerID="acfa6321962f6322b063ced1b193c73c093756247aa55f09cbec0abaac80aa34" Namespace="kube-system" Pod="coredns-674b8bbfcf-q6mzd" WorkloadEndpoint="ci--4081--3--5--n--294a4568b6-k8s-coredns--674b8bbfcf--q6mzd-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--5--n--294a4568b6-k8s-coredns--674b8bbfcf--q6mzd-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"3ba023c8-210d-4e58-bf51-ba6b7484e570", ResourceVersion:"954", Generation:0, CreationTimestamp:time.Date(2025, time.September, 13, 0, 8, 19, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-5-n-294a4568b6", ContainerID:"", Pod:"coredns-674b8bbfcf-q6mzd", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.44.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calicc9c3689856", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}}
Sep 13 00:08:56.103082 containerd[1501]: 2025-09-13 00:08:56.069 [INFO][4711] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.44.133/32] ContainerID="acfa6321962f6322b063ced1b193c73c093756247aa55f09cbec0abaac80aa34" Namespace="kube-system" Pod="coredns-674b8bbfcf-q6mzd" WorkloadEndpoint="ci--4081--3--5--n--294a4568b6-k8s-coredns--674b8bbfcf--q6mzd-eth0"
Sep 13 00:08:56.103082 containerd[1501]: 2025-09-13 00:08:56.069 [INFO][4711] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calicc9c3689856 ContainerID="acfa6321962f6322b063ced1b193c73c093756247aa55f09cbec0abaac80aa34" Namespace="kube-system" Pod="coredns-674b8bbfcf-q6mzd" WorkloadEndpoint="ci--4081--3--5--n--294a4568b6-k8s-coredns--674b8bbfcf--q6mzd-eth0"
Sep 13 00:08:56.103082 containerd[1501]: 2025-09-13 00:08:56.076 [INFO][4711] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="acfa6321962f6322b063ced1b193c73c093756247aa55f09cbec0abaac80aa34" Namespace="kube-system" Pod="coredns-674b8bbfcf-q6mzd" WorkloadEndpoint="ci--4081--3--5--n--294a4568b6-k8s-coredns--674b8bbfcf--q6mzd-eth0"
Sep 13 00:08:56.103082 containerd[1501]: 2025-09-13 00:08:56.077 [INFO][4711] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="acfa6321962f6322b063ced1b193c73c093756247aa55f09cbec0abaac80aa34" Namespace="kube-system" Pod="coredns-674b8bbfcf-q6mzd" WorkloadEndpoint="ci--4081--3--5--n--294a4568b6-k8s-coredns--674b8bbfcf--q6mzd-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--5--n--294a4568b6-k8s-coredns--674b8bbfcf--q6mzd-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"3ba023c8-210d-4e58-bf51-ba6b7484e570", ResourceVersion:"954", Generation:0, CreationTimestamp:time.Date(2025, time.September, 13, 0, 8, 19, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-5-n-294a4568b6", ContainerID:"acfa6321962f6322b063ced1b193c73c093756247aa55f09cbec0abaac80aa34", Pod:"coredns-674b8bbfcf-q6mzd", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.44.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calicc9c3689856", MAC:"52:93:71:35:76:ed", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}}
Sep 13 00:08:56.103082 containerd[1501]: 2025-09-13 00:08:56.091 [INFO][4711] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="acfa6321962f6322b063ced1b193c73c093756247aa55f09cbec0abaac80aa34" Namespace="kube-system" Pod="coredns-674b8bbfcf-q6mzd" WorkloadEndpoint="ci--4081--3--5--n--294a4568b6-k8s-coredns--674b8bbfcf--q6mzd-eth0"
Sep 13 00:08:56.135006 containerd[1501]: time="2025-09-13T00:08:56.130759839Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Sep 13 00:08:56.135006 containerd[1501]: time="2025-09-13T00:08:56.130805274Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Sep 13 00:08:56.135006 containerd[1501]: time="2025-09-13T00:08:56.130814682Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Sep 13 00:08:56.135006 containerd[1501]: time="2025-09-13T00:08:56.131423888Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Sep 13 00:08:56.175306 systemd[1]: Started cri-containerd-acfa6321962f6322b063ced1b193c73c093756247aa55f09cbec0abaac80aa34.scope - libcontainer container acfa6321962f6322b063ced1b193c73c093756247aa55f09cbec0abaac80aa34.
Sep 13 00:08:56.235620 systemd-networkd[1394]: cali255e78b914b: Link UP
Sep 13 00:08:56.236062 systemd-networkd[1394]: cali255e78b914b: Gained carrier
Sep 13 00:08:56.259588 containerd[1501]: 2025-09-13 00:08:55.978 [INFO][4724] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4081--3--5--n--294a4568b6-k8s-goldmane--54d579b49d--zn7kg-eth0 goldmane-54d579b49d- calico-system e8347be8-88b8-4276-8b5d-e1c006f27806 953 0 2025-09-13 00:08:31 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:54d579b49d projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s ci-4081-3-5-n-294a4568b6 goldmane-54d579b49d-zn7kg eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] cali255e78b914b [] [] }} ContainerID="3e1f1eb8931fb7c95629805f0bfb07bd8f7e57e8c17fd97bb5b65304d8bbb817" Namespace="calico-system" Pod="goldmane-54d579b49d-zn7kg" WorkloadEndpoint="ci--4081--3--5--n--294a4568b6-k8s-goldmane--54d579b49d--zn7kg-"
Sep 13 00:08:56.259588 containerd[1501]: 2025-09-13 00:08:55.979 [INFO][4724] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="3e1f1eb8931fb7c95629805f0bfb07bd8f7e57e8c17fd97bb5b65304d8bbb817" Namespace="calico-system" Pod="goldmane-54d579b49d-zn7kg" WorkloadEndpoint="ci--4081--3--5--n--294a4568b6-k8s-goldmane--54d579b49d--zn7kg-eth0"
Sep 13 00:08:56.259588 containerd[1501]: 2025-09-13 00:08:56.019 [INFO][4741] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="3e1f1eb8931fb7c95629805f0bfb07bd8f7e57e8c17fd97bb5b65304d8bbb817" HandleID="k8s-pod-network.3e1f1eb8931fb7c95629805f0bfb07bd8f7e57e8c17fd97bb5b65304d8bbb817" Workload="ci--4081--3--5--n--294a4568b6-k8s-goldmane--54d579b49d--zn7kg-eth0"
Sep 13 00:08:56.259588 containerd[1501]: 2025-09-13 00:08:56.020 [INFO][4741] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="3e1f1eb8931fb7c95629805f0bfb07bd8f7e57e8c17fd97bb5b65304d8bbb817" HandleID="k8s-pod-network.3e1f1eb8931fb7c95629805f0bfb07bd8f7e57e8c17fd97bb5b65304d8bbb817" Workload="ci--4081--3--5--n--294a4568b6-k8s-goldmane--54d579b49d--zn7kg-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00024f210), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4081-3-5-n-294a4568b6", "pod":"goldmane-54d579b49d-zn7kg", "timestamp":"2025-09-13 00:08:56.018388499 +0000 UTC"}, Hostname:"ci-4081-3-5-n-294a4568b6", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"}
Sep 13 00:08:56.259588 containerd[1501]: 2025-09-13 00:08:56.021 [INFO][4741] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock.
Sep 13 00:08:56.259588 containerd[1501]: 2025-09-13 00:08:56.060 [INFO][4741] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock.
Sep 13 00:08:56.259588 containerd[1501]: 2025-09-13 00:08:56.061 [INFO][4741] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4081-3-5-n-294a4568b6'
Sep 13 00:08:56.259588 containerd[1501]: 2025-09-13 00:08:56.122 [INFO][4741] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.3e1f1eb8931fb7c95629805f0bfb07bd8f7e57e8c17fd97bb5b65304d8bbb817" host="ci-4081-3-5-n-294a4568b6"
Sep 13 00:08:56.259588 containerd[1501]: 2025-09-13 00:08:56.138 [INFO][4741] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4081-3-5-n-294a4568b6"
Sep 13 00:08:56.259588 containerd[1501]: 2025-09-13 00:08:56.154 [INFO][4741] ipam/ipam.go 511: Trying affinity for 192.168.44.128/26 host="ci-4081-3-5-n-294a4568b6"
Sep 13 00:08:56.259588 containerd[1501]: 2025-09-13 00:08:56.183 [INFO][4741] ipam/ipam.go 158: Attempting to load block cidr=192.168.44.128/26 host="ci-4081-3-5-n-294a4568b6"
Sep 13 00:08:56.259588 containerd[1501]: 2025-09-13 00:08:56.191 [INFO][4741] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.44.128/26 host="ci-4081-3-5-n-294a4568b6"
Sep 13 00:08:56.259588 containerd[1501]: 2025-09-13 00:08:56.191 [INFO][4741] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.44.128/26 handle="k8s-pod-network.3e1f1eb8931fb7c95629805f0bfb07bd8f7e57e8c17fd97bb5b65304d8bbb817" host="ci-4081-3-5-n-294a4568b6"
Sep 13 00:08:56.259588 containerd[1501]: 2025-09-13 00:08:56.194 [INFO][4741] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.3e1f1eb8931fb7c95629805f0bfb07bd8f7e57e8c17fd97bb5b65304d8bbb817
Sep 13 00:08:56.259588 containerd[1501]: 2025-09-13 00:08:56.199 [INFO][4741] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.44.128/26 handle="k8s-pod-network.3e1f1eb8931fb7c95629805f0bfb07bd8f7e57e8c17fd97bb5b65304d8bbb817" host="ci-4081-3-5-n-294a4568b6"
Sep 13 00:08:56.259588 containerd[1501]: 2025-09-13 00:08:56.209 [INFO][4741] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.44.134/26] block=192.168.44.128/26 handle="k8s-pod-network.3e1f1eb8931fb7c95629805f0bfb07bd8f7e57e8c17fd97bb5b65304d8bbb817" host="ci-4081-3-5-n-294a4568b6"
Sep 13 00:08:56.259588 containerd[1501]: 2025-09-13 00:08:56.209 [INFO][4741] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.44.134/26] handle="k8s-pod-network.3e1f1eb8931fb7c95629805f0bfb07bd8f7e57e8c17fd97bb5b65304d8bbb817" host="ci-4081-3-5-n-294a4568b6"
Sep 13 00:08:56.259588 containerd[1501]: 2025-09-13 00:08:56.210 [INFO][4741] ipam/ipam_plugin.go 374: Released host-wide IPAM lock.
Sep 13 00:08:56.259588 containerd[1501]: 2025-09-13 00:08:56.210 [INFO][4741] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.44.134/26] IPv6=[] ContainerID="3e1f1eb8931fb7c95629805f0bfb07bd8f7e57e8c17fd97bb5b65304d8bbb817" HandleID="k8s-pod-network.3e1f1eb8931fb7c95629805f0bfb07bd8f7e57e8c17fd97bb5b65304d8bbb817" Workload="ci--4081--3--5--n--294a4568b6-k8s-goldmane--54d579b49d--zn7kg-eth0"
Sep 13 00:08:56.261782 containerd[1501]: 2025-09-13 00:08:56.216 [INFO][4724] cni-plugin/k8s.go 418: Populated endpoint ContainerID="3e1f1eb8931fb7c95629805f0bfb07bd8f7e57e8c17fd97bb5b65304d8bbb817" Namespace="calico-system" Pod="goldmane-54d579b49d-zn7kg" WorkloadEndpoint="ci--4081--3--5--n--294a4568b6-k8s-goldmane--54d579b49d--zn7kg-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--5--n--294a4568b6-k8s-goldmane--54d579b49d--zn7kg-eth0", GenerateName:"goldmane-54d579b49d-", Namespace:"calico-system", SelfLink:"", UID:"e8347be8-88b8-4276-8b5d-e1c006f27806", ResourceVersion:"953", Generation:0, CreationTimestamp:time.Date(2025, time.September, 13, 0, 8, 31, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"54d579b49d", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-5-n-294a4568b6", ContainerID:"", Pod:"goldmane-54d579b49d-zn7kg", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.44.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali255e78b914b", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}}
Sep 13 00:08:56.261782 containerd[1501]: 2025-09-13 00:08:56.220 [INFO][4724] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.44.134/32] ContainerID="3e1f1eb8931fb7c95629805f0bfb07bd8f7e57e8c17fd97bb5b65304d8bbb817" Namespace="calico-system" Pod="goldmane-54d579b49d-zn7kg" WorkloadEndpoint="ci--4081--3--5--n--294a4568b6-k8s-goldmane--54d579b49d--zn7kg-eth0"
Sep 13 00:08:56.261782 containerd[1501]: 2025-09-13 00:08:56.220 [INFO][4724] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali255e78b914b ContainerID="3e1f1eb8931fb7c95629805f0bfb07bd8f7e57e8c17fd97bb5b65304d8bbb817" Namespace="calico-system" Pod="goldmane-54d579b49d-zn7kg" WorkloadEndpoint="ci--4081--3--5--n--294a4568b6-k8s-goldmane--54d579b49d--zn7kg-eth0"
Sep 13 00:08:56.261782 containerd[1501]: 2025-09-13 00:08:56.237 [INFO][4724] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="3e1f1eb8931fb7c95629805f0bfb07bd8f7e57e8c17fd97bb5b65304d8bbb817" Namespace="calico-system" Pod="goldmane-54d579b49d-zn7kg" WorkloadEndpoint="ci--4081--3--5--n--294a4568b6-k8s-goldmane--54d579b49d--zn7kg-eth0"
Sep 13 00:08:56.261782 containerd[1501]: 2025-09-13 00:08:56.238 [INFO][4724] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="3e1f1eb8931fb7c95629805f0bfb07bd8f7e57e8c17fd97bb5b65304d8bbb817" Namespace="calico-system" Pod="goldmane-54d579b49d-zn7kg" WorkloadEndpoint="ci--4081--3--5--n--294a4568b6-k8s-goldmane--54d579b49d--zn7kg-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--5--n--294a4568b6-k8s-goldmane--54d579b49d--zn7kg-eth0", GenerateName:"goldmane-54d579b49d-", Namespace:"calico-system", SelfLink:"", UID:"e8347be8-88b8-4276-8b5d-e1c006f27806", ResourceVersion:"953", Generation:0, CreationTimestamp:time.Date(2025, time.September, 13, 0, 8, 31, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"54d579b49d", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-5-n-294a4568b6", ContainerID:"3e1f1eb8931fb7c95629805f0bfb07bd8f7e57e8c17fd97bb5b65304d8bbb817", Pod:"goldmane-54d579b49d-zn7kg", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.44.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali255e78b914b", MAC:"ae:81:66:06:2f:8c", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}}
Sep 13 00:08:56.261782 containerd[1501]: 2025-09-13 00:08:56.257 [INFO][4724] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="3e1f1eb8931fb7c95629805f0bfb07bd8f7e57e8c17fd97bb5b65304d8bbb817" Namespace="calico-system" Pod="goldmane-54d579b49d-zn7kg" WorkloadEndpoint="ci--4081--3--5--n--294a4568b6-k8s-goldmane--54d579b49d--zn7kg-eth0"
Sep 13 00:08:56.296385 containerd[1501]: time="2025-09-13T00:08:56.294964227Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Sep 13 00:08:56.296385 containerd[1501]: time="2025-09-13T00:08:56.295021875Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Sep 13 00:08:56.296385 containerd[1501]: time="2025-09-13T00:08:56.295034288Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Sep 13 00:08:56.296385 containerd[1501]: time="2025-09-13T00:08:56.295107085Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Sep 13 00:08:56.322357 systemd[1]: Started cri-containerd-3e1f1eb8931fb7c95629805f0bfb07bd8f7e57e8c17fd97bb5b65304d8bbb817.scope - libcontainer container 3e1f1eb8931fb7c95629805f0bfb07bd8f7e57e8c17fd97bb5b65304d8bbb817.
Sep 13 00:08:56.329838 containerd[1501]: time="2025-09-13T00:08:56.329210768Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-q6mzd,Uid:3ba023c8-210d-4e58-bf51-ba6b7484e570,Namespace:kube-system,Attempt:1,} returns sandbox id \"acfa6321962f6322b063ced1b193c73c093756247aa55f09cbec0abaac80aa34\""
Sep 13 00:08:56.336101 containerd[1501]: time="2025-09-13T00:08:56.336063244Z" level=info msg="CreateContainer within sandbox \"acfa6321962f6322b063ced1b193c73c093756247aa55f09cbec0abaac80aa34\" for container &ContainerMetadata{Name:coredns,Attempt:0,}"
Sep 13 00:08:56.348500 containerd[1501]: time="2025-09-13T00:08:56.348454954Z" level=info msg="CreateContainer within sandbox \"acfa6321962f6322b063ced1b193c73c093756247aa55f09cbec0abaac80aa34\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"dcac1e74aeae5c45eddc947c65e18a00424f1e841609db1cca561e24b4d754a9\""
Sep 13 00:08:56.349207 containerd[1501]: time="2025-09-13T00:08:56.349179338Z" level=info msg="StartContainer for \"dcac1e74aeae5c45eddc947c65e18a00424f1e841609db1cca561e24b4d754a9\""
Sep 13 00:08:56.407245 systemd[1]: Started cri-containerd-dcac1e74aeae5c45eddc947c65e18a00424f1e841609db1cca561e24b4d754a9.scope - libcontainer container dcac1e74aeae5c45eddc947c65e18a00424f1e841609db1cca561e24b4d754a9.
Sep 13 00:08:56.451131 containerd[1501]: time="2025-09-13T00:08:56.449861929Z" level=info msg="StartContainer for \"dcac1e74aeae5c45eddc947c65e18a00424f1e841609db1cca561e24b4d754a9\" returns successfully"
Sep 13 00:08:56.474689 containerd[1501]: time="2025-09-13T00:08:56.474648287Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-zn7kg,Uid:e8347be8-88b8-4276-8b5d-e1c006f27806,Namespace:calico-system,Attempt:1,} returns sandbox id \"3e1f1eb8931fb7c95629805f0bfb07bd8f7e57e8c17fd97bb5b65304d8bbb817\""
Sep 13 00:08:56.711983 containerd[1501]: time="2025-09-13T00:08:56.711169764Z" level=info msg="StopPodSandbox for \"545275614684121451b5fa9dd505392c25919306491a0a341953c2c0ea40c9e5\""
Sep 13 00:08:56.834028 containerd[1501]: 2025-09-13 00:08:56.774 [INFO][4902] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="545275614684121451b5fa9dd505392c25919306491a0a341953c2c0ea40c9e5"
Sep 13 00:08:56.834028 containerd[1501]: 2025-09-13 00:08:56.774 [INFO][4902] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="545275614684121451b5fa9dd505392c25919306491a0a341953c2c0ea40c9e5" iface="eth0" netns="/var/run/netns/cni-bd9cd036-1bb5-0d85-f3f2-801a08e9522a"
Sep 13 00:08:56.834028 containerd[1501]: 2025-09-13 00:08:56.775 [INFO][4902] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="545275614684121451b5fa9dd505392c25919306491a0a341953c2c0ea40c9e5" iface="eth0" netns="/var/run/netns/cni-bd9cd036-1bb5-0d85-f3f2-801a08e9522a"
Sep 13 00:08:56.834028 containerd[1501]: 2025-09-13 00:08:56.778 [INFO][4902] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. ContainerID="545275614684121451b5fa9dd505392c25919306491a0a341953c2c0ea40c9e5" iface="eth0" netns="/var/run/netns/cni-bd9cd036-1bb5-0d85-f3f2-801a08e9522a"
Sep 13 00:08:56.834028 containerd[1501]: 2025-09-13 00:08:56.778 [INFO][4902] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="545275614684121451b5fa9dd505392c25919306491a0a341953c2c0ea40c9e5"
Sep 13 00:08:56.834028 containerd[1501]: 2025-09-13 00:08:56.778 [INFO][4902] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="545275614684121451b5fa9dd505392c25919306491a0a341953c2c0ea40c9e5"
Sep 13 00:08:56.834028 containerd[1501]: 2025-09-13 00:08:56.807 [INFO][4910] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="545275614684121451b5fa9dd505392c25919306491a0a341953c2c0ea40c9e5" HandleID="k8s-pod-network.545275614684121451b5fa9dd505392c25919306491a0a341953c2c0ea40c9e5" Workload="ci--4081--3--5--n--294a4568b6-k8s-calico--apiserver--9c789df89--kvgsc-eth0"
Sep 13 00:08:56.834028 containerd[1501]: 2025-09-13 00:08:56.807 [INFO][4910] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock.
Sep 13 00:08:56.834028 containerd[1501]: 2025-09-13 00:08:56.807 [INFO][4910] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock.
Sep 13 00:08:56.834028 containerd[1501]: 2025-09-13 00:08:56.817 [WARNING][4910] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="545275614684121451b5fa9dd505392c25919306491a0a341953c2c0ea40c9e5" HandleID="k8s-pod-network.545275614684121451b5fa9dd505392c25919306491a0a341953c2c0ea40c9e5" Workload="ci--4081--3--5--n--294a4568b6-k8s-calico--apiserver--9c789df89--kvgsc-eth0"
Sep 13 00:08:56.834028 containerd[1501]: 2025-09-13 00:08:56.817 [INFO][4910] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="545275614684121451b5fa9dd505392c25919306491a0a341953c2c0ea40c9e5" HandleID="k8s-pod-network.545275614684121451b5fa9dd505392c25919306491a0a341953c2c0ea40c9e5" Workload="ci--4081--3--5--n--294a4568b6-k8s-calico--apiserver--9c789df89--kvgsc-eth0"
Sep 13 00:08:56.834028 containerd[1501]: 2025-09-13 00:08:56.819 [INFO][4910] ipam/ipam_plugin.go 374: Released host-wide IPAM lock.
Sep 13 00:08:56.834028 containerd[1501]: 2025-09-13 00:08:56.826 [INFO][4902] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="545275614684121451b5fa9dd505392c25919306491a0a341953c2c0ea40c9e5"
Sep 13 00:08:56.834028 containerd[1501]: time="2025-09-13T00:08:56.831869948Z" level=info msg="TearDown network for sandbox \"545275614684121451b5fa9dd505392c25919306491a0a341953c2c0ea40c9e5\" successfully"
Sep 13 00:08:56.834028 containerd[1501]: time="2025-09-13T00:08:56.833948408Z" level=info msg="StopPodSandbox for \"545275614684121451b5fa9dd505392c25919306491a0a341953c2c0ea40c9e5\" returns successfully"
Sep 13 00:08:56.834367 systemd[1]: run-netns-cni\x2dbd9cd036\x2d1bb5\x2d0d85\x2df3f2\x2d801a08e9522a.mount: Deactivated successfully.
Sep 13 00:08:56.836382 containerd[1501]: time="2025-09-13T00:08:56.836306206Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-9c789df89-kvgsc,Uid:81fa2d11-a454-4bc0-824e-4ee1c118d1e3,Namespace:calico-apiserver,Attempt:1,}"
Sep 13 00:08:56.898115 systemd-networkd[1394]: calid910e85c87b: Gained IPv6LL
Sep 13 00:08:56.984115 systemd-networkd[1394]: calib93f8e9ce6e: Link UP
Sep 13 00:08:56.985461 systemd-networkd[1394]: calib93f8e9ce6e: Gained carrier
Sep 13 00:08:57.008826 containerd[1501]: 2025-09-13 00:08:56.893 [INFO][4916] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4081--3--5--n--294a4568b6-k8s-calico--apiserver--9c789df89--kvgsc-eth0 calico-apiserver-9c789df89- calico-apiserver 81fa2d11-a454-4bc0-824e-4ee1c118d1e3 973 0 2025-09-13 00:08:29 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:9c789df89 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4081-3-5-n-294a4568b6 calico-apiserver-9c789df89-kvgsc eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calib93f8e9ce6e [] [] }} ContainerID="acf9d09bd39d90b1a9851dd9a15eebe92d6b1f6854703a56a9187441b19428c1" Namespace="calico-apiserver" Pod="calico-apiserver-9c789df89-kvgsc" WorkloadEndpoint="ci--4081--3--5--n--294a4568b6-k8s-calico--apiserver--9c789df89--kvgsc-"
Sep 13 00:08:57.008826 containerd[1501]: 2025-09-13 00:08:56.893 [INFO][4916] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="acf9d09bd39d90b1a9851dd9a15eebe92d6b1f6854703a56a9187441b19428c1" Namespace="calico-apiserver" Pod="calico-apiserver-9c789df89-kvgsc" WorkloadEndpoint="ci--4081--3--5--n--294a4568b6-k8s-calico--apiserver--9c789df89--kvgsc-eth0"
Sep 13 00:08:57.008826 containerd[1501]: 2025-09-13 00:08:56.931 [INFO][4929] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="acf9d09bd39d90b1a9851dd9a15eebe92d6b1f6854703a56a9187441b19428c1" HandleID="k8s-pod-network.acf9d09bd39d90b1a9851dd9a15eebe92d6b1f6854703a56a9187441b19428c1" Workload="ci--4081--3--5--n--294a4568b6-k8s-calico--apiserver--9c789df89--kvgsc-eth0"
Sep 13 00:08:57.008826 containerd[1501]: 2025-09-13 00:08:56.931 [INFO][4929] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="acf9d09bd39d90b1a9851dd9a15eebe92d6b1f6854703a56a9187441b19428c1" HandleID="k8s-pod-network.acf9d09bd39d90b1a9851dd9a15eebe92d6b1f6854703a56a9187441b19428c1" Workload="ci--4081--3--5--n--294a4568b6-k8s-calico--apiserver--9c789df89--kvgsc-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d56c0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4081-3-5-n-294a4568b6", "pod":"calico-apiserver-9c789df89-kvgsc", "timestamp":"2025-09-13 00:08:56.930991512 +0000 UTC"}, Hostname:"ci-4081-3-5-n-294a4568b6", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"}
Sep 13 00:08:57.008826 containerd[1501]: 2025-09-13 00:08:56.931 [INFO][4929] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock.
Sep 13 00:08:57.008826 containerd[1501]: 2025-09-13 00:08:56.931 [INFO][4929] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock.
Sep 13 00:08:57.008826 containerd[1501]: 2025-09-13 00:08:56.931 [INFO][4929] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4081-3-5-n-294a4568b6'
Sep 13 00:08:57.008826 containerd[1501]: 2025-09-13 00:08:56.938 [INFO][4929] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.acf9d09bd39d90b1a9851dd9a15eebe92d6b1f6854703a56a9187441b19428c1" host="ci-4081-3-5-n-294a4568b6"
Sep 13 00:08:57.008826 containerd[1501]: 2025-09-13 00:08:56.944 [INFO][4929] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4081-3-5-n-294a4568b6"
Sep 13 00:08:57.008826 containerd[1501]: 2025-09-13 00:08:56.949 [INFO][4929] ipam/ipam.go 511: Trying affinity for 192.168.44.128/26 host="ci-4081-3-5-n-294a4568b6"
Sep 13 00:08:57.008826 containerd[1501]: 2025-09-13 00:08:56.951 [INFO][4929] ipam/ipam.go 158: Attempting to load block cidr=192.168.44.128/26 host="ci-4081-3-5-n-294a4568b6"
Sep 13 00:08:57.008826 containerd[1501]: 2025-09-13 00:08:56.955 [INFO][4929] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.44.128/26 host="ci-4081-3-5-n-294a4568b6"
Sep 13 00:08:57.008826 containerd[1501]: 2025-09-13 00:08:56.955 [INFO][4929] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.44.128/26 handle="k8s-pod-network.acf9d09bd39d90b1a9851dd9a15eebe92d6b1f6854703a56a9187441b19428c1" host="ci-4081-3-5-n-294a4568b6"
Sep 13 00:08:57.008826 containerd[1501]: 2025-09-13 00:08:56.957 [INFO][4929] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.acf9d09bd39d90b1a9851dd9a15eebe92d6b1f6854703a56a9187441b19428c1
Sep 13 00:08:57.008826 containerd[1501]: 2025-09-13 00:08:56.962 [INFO][4929] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.44.128/26 handle="k8s-pod-network.acf9d09bd39d90b1a9851dd9a15eebe92d6b1f6854703a56a9187441b19428c1" host="ci-4081-3-5-n-294a4568b6"
Sep 13 00:08:57.008826 containerd[1501]: 2025-09-13 00:08:56.972 [INFO][4929] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.44.135/26] block=192.168.44.128/26 handle="k8s-pod-network.acf9d09bd39d90b1a9851dd9a15eebe92d6b1f6854703a56a9187441b19428c1" host="ci-4081-3-5-n-294a4568b6"
Sep 13 00:08:57.008826 containerd[1501]: 2025-09-13 00:08:56.972 [INFO][4929] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.44.135/26] handle="k8s-pod-network.acf9d09bd39d90b1a9851dd9a15eebe92d6b1f6854703a56a9187441b19428c1" host="ci-4081-3-5-n-294a4568b6"
Sep 13 00:08:57.008826 containerd[1501]: 2025-09-13 00:08:56.972 [INFO][4929] ipam/ipam_plugin.go 374: Released host-wide IPAM lock.
Sep 13 00:08:57.008826 containerd[1501]: 2025-09-13 00:08:56.972 [INFO][4929] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.44.135/26] IPv6=[] ContainerID="acf9d09bd39d90b1a9851dd9a15eebe92d6b1f6854703a56a9187441b19428c1" HandleID="k8s-pod-network.acf9d09bd39d90b1a9851dd9a15eebe92d6b1f6854703a56a9187441b19428c1" Workload="ci--4081--3--5--n--294a4568b6-k8s-calico--apiserver--9c789df89--kvgsc-eth0"
Sep 13 00:08:57.009600 containerd[1501]: 2025-09-13 00:08:56.978 [INFO][4916] cni-plugin/k8s.go 418: Populated endpoint ContainerID="acf9d09bd39d90b1a9851dd9a15eebe92d6b1f6854703a56a9187441b19428c1" Namespace="calico-apiserver" Pod="calico-apiserver-9c789df89-kvgsc" WorkloadEndpoint="ci--4081--3--5--n--294a4568b6-k8s-calico--apiserver--9c789df89--kvgsc-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--5--n--294a4568b6-k8s-calico--apiserver--9c789df89--kvgsc-eth0", GenerateName:"calico-apiserver-9c789df89-", Namespace:"calico-apiserver", SelfLink:"", UID:"81fa2d11-a454-4bc0-824e-4ee1c118d1e3", ResourceVersion:"973", Generation:0, CreationTimestamp:time.Date(2025, time.September, 13, 0, 8, 29, 0, time.Local), DeletionTimestamp:<nil>, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"9c789df89", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-5-n-294a4568b6", ContainerID:"", Pod:"calico-apiserver-9c789df89-kvgsc", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.44.135/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calib93f8e9ce6e", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}}
Sep 13 00:08:57.009600 containerd[1501]: 2025-09-13 00:08:56.979 [INFO][4916] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.44.135/32] ContainerID="acf9d09bd39d90b1a9851dd9a15eebe92d6b1f6854703a56a9187441b19428c1" Namespace="calico-apiserver" Pod="calico-apiserver-9c789df89-kvgsc" WorkloadEndpoint="ci--4081--3--5--n--294a4568b6-k8s-calico--apiserver--9c789df89--kvgsc-eth0"
Sep 13 00:08:57.009600 containerd[1501]: 2025-09-13 00:08:56.979 [INFO][4916] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calib93f8e9ce6e ContainerID="acf9d09bd39d90b1a9851dd9a15eebe92d6b1f6854703a56a9187441b19428c1" Namespace="calico-apiserver" Pod="calico-apiserver-9c789df89-kvgsc" WorkloadEndpoint="ci--4081--3--5--n--294a4568b6-k8s-calico--apiserver--9c789df89--kvgsc-eth0"
Sep 13 00:08:57.009600 containerd[1501]: 2025-09-13 00:08:56.988 [INFO][4916] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="acf9d09bd39d90b1a9851dd9a15eebe92d6b1f6854703a56a9187441b19428c1" Namespace="calico-apiserver" Pod="calico-apiserver-9c789df89-kvgsc" WorkloadEndpoint="ci--4081--3--5--n--294a4568b6-k8s-calico--apiserver--9c789df89--kvgsc-eth0"
Sep 13 00:08:57.009600 containerd[1501]: 2025-09-13 00:08:56.989 [INFO][4916] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="acf9d09bd39d90b1a9851dd9a15eebe92d6b1f6854703a56a9187441b19428c1" Namespace="calico-apiserver" Pod="calico-apiserver-9c789df89-kvgsc" WorkloadEndpoint="ci--4081--3--5--n--294a4568b6-k8s-calico--apiserver--9c789df89--kvgsc-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--5--n--294a4568b6-k8s-calico--apiserver--9c789df89--kvgsc-eth0", GenerateName:"calico-apiserver-9c789df89-", Namespace:"calico-apiserver", SelfLink:"", UID:"81fa2d11-a454-4bc0-824e-4ee1c118d1e3", ResourceVersion:"973", Generation:0, CreationTimestamp:time.Date(2025, time.September, 13, 0, 8, 29, 0, time.Local), DeletionTimestamp:<nil>, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"9c789df89", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-5-n-294a4568b6", ContainerID:"acf9d09bd39d90b1a9851dd9a15eebe92d6b1f6854703a56a9187441b19428c1", Pod:"calico-apiserver-9c789df89-kvgsc", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.44.135/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calib93f8e9ce6e", MAC:"86:84:b4:e9:9f:84", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}}
Sep 13 00:08:57.009600 containerd[1501]: 2025-09-13 00:08:57.005 [INFO][4916] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="acf9d09bd39d90b1a9851dd9a15eebe92d6b1f6854703a56a9187441b19428c1" Namespace="calico-apiserver" Pod="calico-apiserver-9c789df89-kvgsc" WorkloadEndpoint="ci--4081--3--5--n--294a4568b6-k8s-calico--apiserver--9c789df89--kvgsc-eth0"
Sep 13 00:08:57.033546 containerd[1501]: time="2025-09-13T00:08:57.033164585Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Sep 13 00:08:57.033546 containerd[1501]: time="2025-09-13T00:08:57.033536204Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Sep 13 00:08:57.033705 containerd[1501]: time="2025-09-13T00:08:57.033577061Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Sep 13 00:08:57.033705 containerd[1501]: time="2025-09-13T00:08:57.033677611Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Sep 13 00:08:57.074321 systemd[1]: Started cri-containerd-acf9d09bd39d90b1a9851dd9a15eebe92d6b1f6854703a56a9187441b19428c1.scope - libcontainer container acf9d09bd39d90b1a9851dd9a15eebe92d6b1f6854703a56a9187441b19428c1.
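Each journald line above nests a containerd record, which in turn may nest a Calico timestamped record. A hypothetical stdlib-only helper (not part of any tool shown in this log) that peels off the outer journald layer into timestamp, unit, pid, and payload:

    package main

    import (
    	"fmt"
    	"regexp"
    )

    // Outer journald layer as printed here: "Sep 13 HH:MM:SS.micros unit[pid]: payload".
    var entryRE = regexp.MustCompile(`^(\w{3} +\d{1,2} \d{2}:\d{2}:\d{2}\.\d+) (\S+)\[(\d+)\]: (.*)$`)

    func main() {
    	line := `Sep 13 00:08:57.074321 systemd[1]: Started cri-containerd-acf9d09bd39d90b1a9851dd9a15eebe92d6b1f6854703a56a9187441b19428c1.scope - libcontainer container acf9d09bd39d90b1a9851dd9a15eebe92d6b1f6854703a56a9187441b19428c1.`
    	if m := entryRE.FindStringSubmatch(line); m != nil {
    		fmt.Printf("time=%q unit=%q pid=%s\npayload=%q\n", m[1], m[2], m[3], m[4])
    	}
    }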
Sep 13 00:08:57.099839 kubelet[2611]: I0913 00:08:57.099698 2611 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-674b8bbfcf-q6mzd" podStartSLOduration=38.099676731 podStartE2EDuration="38.099676731s" podCreationTimestamp="2025-09-13 00:08:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-13 00:08:57.066427191 +0000 UTC m=+43.480460488" watchObservedRunningTime="2025-09-13 00:08:57.099676731 +0000 UTC m=+43.513710029"
Sep 13 00:08:57.192681 containerd[1501]: time="2025-09-13T00:08:57.192466693Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-9c789df89-kvgsc,Uid:81fa2d11-a454-4bc0-824e-4ee1c118d1e3,Namespace:calico-apiserver,Attempt:1,} returns sandbox id \"acf9d09bd39d90b1a9851dd9a15eebe92d6b1f6854703a56a9187441b19428c1\""
Sep 13 00:08:57.203458 containerd[1501]: time="2025-09-13T00:08:57.203349340Z" level=info msg="CreateContainer within sandbox \"acf9d09bd39d90b1a9851dd9a15eebe92d6b1f6854703a56a9187441b19428c1\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}"
Sep 13 00:08:57.220351 containerd[1501]: time="2025-09-13T00:08:57.220269164Z" level=info msg="CreateContainer within sandbox \"acf9d09bd39d90b1a9851dd9a15eebe92d6b1f6854703a56a9187441b19428c1\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"04977dbf7bd1ac12953d11323ff971bd91a93e6c0ada31a60862058523d84262\""
Sep 13 00:08:57.221463 containerd[1501]: time="2025-09-13T00:08:57.221435789Z" level=info msg="StartContainer for \"04977dbf7bd1ac12953d11323ff971bd91a93e6c0ada31a60862058523d84262\""
Sep 13 00:08:57.307108 systemd[1]: Started cri-containerd-04977dbf7bd1ac12953d11323ff971bd91a93e6c0ada31a60862058523d84262.scope - libcontainer container 04977dbf7bd1ac12953d11323ff971bd91a93e6c0ada31a60862058523d84262.
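The kubelet's podStartSLOduration excludes image-pull time; here firstStartedPulling/lastFinishedPulling are zero-valued, so it equals podStartE2EDuration, which is simply observedRunningTime minus podCreationTimestamp. A stdlib-only check of that arithmetic, using the timestamps printed above:

    package main

    import (
    	"fmt"
    	"time"
    )

    func main() {
    	// Layout matches time.Time's default String() form used in the kubelet line.
    	const layout = "2006-01-02 15:04:05.999999999 -0700 MST"
    	created, _ := time.Parse(layout, "2025-09-13 00:08:19 +0000 UTC")
    	running, _ := time.Parse(layout, "2025-09-13 00:08:57.099676731 +0000 UTC")
    	fmt.Println(running.Sub(created)) // 38.099676731s, matching podStartSLOduration
    }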
Sep 13 00:08:57.404818 containerd[1501]: time="2025-09-13T00:08:57.404695675Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 13 00:08:57.406247 containerd[1501]: time="2025-09-13T00:08:57.406197801Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.3: active requests=0, bytes read=8760527"
Sep 13 00:08:57.407652 containerd[1501]: time="2025-09-13T00:08:57.407343998Z" level=info msg="ImageCreate event name:\"sha256:666f4e02e75c30547109a06ed75b415a990a970811173aa741379cfaac4d9dd7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 13 00:08:57.409724 containerd[1501]: time="2025-09-13T00:08:57.409697056Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:f22c88018d8b58c4ef0052f594b216a13bd6852166ac131a538c5ab2fba23bb2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 13 00:08:57.410978 containerd[1501]: time="2025-09-13T00:08:57.410397654Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.30.3\" with image id \"sha256:666f4e02e75c30547109a06ed75b415a990a970811173aa741379cfaac4d9dd7\", repo tag \"ghcr.io/flatcar/calico/csi:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:f22c88018d8b58c4ef0052f594b216a13bd6852166ac131a538c5ab2fba23bb2\", size \"10253230\" in 1.795684798s"
Sep 13 00:08:57.411082 containerd[1501]: time="2025-09-13T00:08:57.411064869Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.3\" returns image reference \"sha256:666f4e02e75c30547109a06ed75b415a990a970811173aa741379cfaac4d9dd7\""
Sep 13 00:08:57.412191 containerd[1501]: time="2025-09-13T00:08:57.412129673Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.3\""
Sep 13 00:08:57.417190 containerd[1501]: time="2025-09-13T00:08:57.417090618Z" level=info msg="CreateContainer within sandbox \"2aeeb1e67e0ab98e4632cde4b2c2a3b3d848b6449cd7ba19bc91d6b34385ec27\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}"
Sep 13 00:08:57.449500 containerd[1501]: time="2025-09-13T00:08:57.449069427Z" level=info msg="CreateContainer within sandbox \"2aeeb1e67e0ab98e4632cde4b2c2a3b3d848b6449cd7ba19bc91d6b34385ec27\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"154a47f176118696fc884ded46b05112cd74acd51a612ecf13488e559337fff6\""
Sep 13 00:08:57.454577 containerd[1501]: time="2025-09-13T00:08:57.454440964Z" level=info msg="StartContainer for \"154a47f176118696fc884ded46b05112cd74acd51a612ecf13488e559337fff6\""
Sep 13 00:08:57.495255 containerd[1501]: time="2025-09-13T00:08:57.495206107Z" level=info msg="StartContainer for \"04977dbf7bd1ac12953d11323ff971bd91a93e6c0ada31a60862058523d84262\" returns successfully"
Sep 13 00:08:57.537142 systemd-networkd[1394]: calicc9c3689856: Gained IPv6LL
Sep 13 00:08:57.544866 systemd[1]: Started cri-containerd-154a47f176118696fc884ded46b05112cd74acd51a612ecf13488e559337fff6.scope - libcontainer container 154a47f176118696fc884ded46b05112cd74acd51a612ecf13488e559337fff6.
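Back-of-envelope only: relating the "bytes read=8760527" counter to the reported 1.795684798s pull time gives a rough transfer rate; whether that counter covers the entire pull is an assumption here, since the "Pulled image" line reports a larger unpacked size (10253230).

    package main

    import "fmt"

    func main() {
    	const bytesRead = 8760527.0 // "stop pulling image ...: active requests=0, bytes read=8760527"
    	const seconds = 1.795684798 // "Pulled image ... in 1.795684798s"
    	fmt.Printf("~%.2f MiB/s\n", bytesRead/seconds/(1<<20)) // ~4.65 MiB/s
    }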
Sep 13 00:08:57.626181 containerd[1501]: time="2025-09-13T00:08:57.626123818Z" level=info msg="StartContainer for \"154a47f176118696fc884ded46b05112cd74acd51a612ecf13488e559337fff6\" returns successfully"
Sep 13 00:08:57.719622 containerd[1501]: time="2025-09-13T00:08:57.719282882Z" level=info msg="StopPodSandbox for \"a628c9f57a387c30d21df3f761670db06db768ddf12b19eb8d70d9c8345f905b\""
Sep 13 00:08:57.837599 containerd[1501]: 2025-09-13 00:08:57.793 [INFO][5073] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="a628c9f57a387c30d21df3f761670db06db768ddf12b19eb8d70d9c8345f905b"
Sep 13 00:08:57.837599 containerd[1501]: 2025-09-13 00:08:57.793 [INFO][5073] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="a628c9f57a387c30d21df3f761670db06db768ddf12b19eb8d70d9c8345f905b" iface="eth0" netns="/var/run/netns/cni-89027727-ddc5-285d-6115-cb5c857a1552"
Sep 13 00:08:57.837599 containerd[1501]: 2025-09-13 00:08:57.794 [INFO][5073] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="a628c9f57a387c30d21df3f761670db06db768ddf12b19eb8d70d9c8345f905b" iface="eth0" netns="/var/run/netns/cni-89027727-ddc5-285d-6115-cb5c857a1552"
Sep 13 00:08:57.837599 containerd[1501]: 2025-09-13 00:08:57.794 [INFO][5073] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. ContainerID="a628c9f57a387c30d21df3f761670db06db768ddf12b19eb8d70d9c8345f905b" iface="eth0" netns="/var/run/netns/cni-89027727-ddc5-285d-6115-cb5c857a1552"
Sep 13 00:08:57.837599 containerd[1501]: 2025-09-13 00:08:57.794 [INFO][5073] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="a628c9f57a387c30d21df3f761670db06db768ddf12b19eb8d70d9c8345f905b"
Sep 13 00:08:57.837599 containerd[1501]: 2025-09-13 00:08:57.794 [INFO][5073] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="a628c9f57a387c30d21df3f761670db06db768ddf12b19eb8d70d9c8345f905b"
Sep 13 00:08:57.837599 containerd[1501]: 2025-09-13 00:08:57.821 [INFO][5080] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="a628c9f57a387c30d21df3f761670db06db768ddf12b19eb8d70d9c8345f905b" HandleID="k8s-pod-network.a628c9f57a387c30d21df3f761670db06db768ddf12b19eb8d70d9c8345f905b" Workload="ci--4081--3--5--n--294a4568b6-k8s-calico--kube--controllers--577b6875b7--hxt89-eth0"
Sep 13 00:08:57.837599 containerd[1501]: 2025-09-13 00:08:57.822 [INFO][5080] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock.
Sep 13 00:08:57.837599 containerd[1501]: 2025-09-13 00:08:57.822 [INFO][5080] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock.
Sep 13 00:08:57.837599 containerd[1501]: 2025-09-13 00:08:57.828 [WARNING][5080] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="a628c9f57a387c30d21df3f761670db06db768ddf12b19eb8d70d9c8345f905b" HandleID="k8s-pod-network.a628c9f57a387c30d21df3f761670db06db768ddf12b19eb8d70d9c8345f905b" Workload="ci--4081--3--5--n--294a4568b6-k8s-calico--kube--controllers--577b6875b7--hxt89-eth0"
Sep 13 00:08:57.837599 containerd[1501]: 2025-09-13 00:08:57.828 [INFO][5080] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="a628c9f57a387c30d21df3f761670db06db768ddf12b19eb8d70d9c8345f905b" HandleID="k8s-pod-network.a628c9f57a387c30d21df3f761670db06db768ddf12b19eb8d70d9c8345f905b" Workload="ci--4081--3--5--n--294a4568b6-k8s-calico--kube--controllers--577b6875b7--hxt89-eth0"
Sep 13 00:08:57.837599 containerd[1501]: 2025-09-13 00:08:57.829 [INFO][5080] ipam/ipam_plugin.go 374: Released host-wide IPAM lock.
Sep 13 00:08:57.837599 containerd[1501]: 2025-09-13 00:08:57.833 [INFO][5073] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="a628c9f57a387c30d21df3f761670db06db768ddf12b19eb8d70d9c8345f905b"
Sep 13 00:08:57.840246 containerd[1501]: time="2025-09-13T00:08:57.840201499Z" level=info msg="TearDown network for sandbox \"a628c9f57a387c30d21df3f761670db06db768ddf12b19eb8d70d9c8345f905b\" successfully"
Sep 13 00:08:57.840306 containerd[1501]: time="2025-09-13T00:08:57.840246193Z" level=info msg="StopPodSandbox for \"a628c9f57a387c30d21df3f761670db06db768ddf12b19eb8d70d9c8345f905b\" returns successfully"
Sep 13 00:08:57.842254 containerd[1501]: time="2025-09-13T00:08:57.842215458Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-577b6875b7-hxt89,Uid:4166a8bf-c6d2-4b14-bbe8-bc5fa59948b2,Namespace:calico-system,Attempt:1,}"
Sep 13 00:08:57.848219 systemd[1]: run-netns-cni\x2d89027727\x2dddc5\x2d285d\x2d6115\x2dcb5c857a1552.mount: Deactivated successfully.
Sep 13 00:08:58.050105 systemd-networkd[1394]: cali255e78b914b: Gained IPv6LL
Sep 13 00:08:58.140514 systemd-networkd[1394]: cali8e5aadf41af: Link UP
Sep 13 00:08:58.141540 systemd-networkd[1394]: cali8e5aadf41af: Gained carrier
Sep 13 00:08:58.179837 kubelet[2611]: I0913 00:08:58.177379 2611 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-9c789df89-kvgsc" podStartSLOduration=29.177354357 podStartE2EDuration="29.177354357s" podCreationTimestamp="2025-09-13 00:08:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-13 00:08:58.103635929 +0000 UTC m=+44.517669226" watchObservedRunningTime="2025-09-13 00:08:58.177354357 +0000 UTC m=+44.591387655"
Sep 13 00:08:58.187347 containerd[1501]: 2025-09-13 00:08:57.995 [INFO][5087] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4081--3--5--n--294a4568b6-k8s-calico--kube--controllers--577b6875b7--hxt89-eth0 calico-kube-controllers-577b6875b7- calico-system 4166a8bf-c6d2-4b14-bbe8-bc5fa59948b2 995 0 2025-09-13 00:08:32 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:577b6875b7 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ci-4081-3-5-n-294a4568b6 calico-kube-controllers-577b6875b7-hxt89 eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali8e5aadf41af [] [] }} ContainerID="6bf9568832be6da1b5e90f03d75bec7ca27702190bfb20532005105b81e53c81" Namespace="calico-system" Pod="calico-kube-controllers-577b6875b7-hxt89" WorkloadEndpoint="ci--4081--3--5--n--294a4568b6-k8s-calico--kube--controllers--577b6875b7--hxt89-"
Sep 13 00:08:58.187347 containerd[1501]: 2025-09-13 00:08:57.996 [INFO][5087] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="6bf9568832be6da1b5e90f03d75bec7ca27702190bfb20532005105b81e53c81" Namespace="calico-system" Pod="calico-kube-controllers-577b6875b7-hxt89" WorkloadEndpoint="ci--4081--3--5--n--294a4568b6-k8s-calico--kube--controllers--577b6875b7--hxt89-eth0"
Sep 13 00:08:58.187347 containerd[1501]: 2025-09-13 00:08:58.033 [INFO][5101] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="6bf9568832be6da1b5e90f03d75bec7ca27702190bfb20532005105b81e53c81" HandleID="k8s-pod-network.6bf9568832be6da1b5e90f03d75bec7ca27702190bfb20532005105b81e53c81" Workload="ci--4081--3--5--n--294a4568b6-k8s-calico--kube--controllers--577b6875b7--hxt89-eth0"
Sep 13 00:08:58.187347 containerd[1501]: 2025-09-13 00:08:58.033 [INFO][5101] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="6bf9568832be6da1b5e90f03d75bec7ca27702190bfb20532005105b81e53c81" HandleID="k8s-pod-network.6bf9568832be6da1b5e90f03d75bec7ca27702190bfb20532005105b81e53c81" Workload="ci--4081--3--5--n--294a4568b6-k8s-calico--kube--controllers--577b6875b7--hxt89-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00024f230), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4081-3-5-n-294a4568b6", "pod":"calico-kube-controllers-577b6875b7-hxt89", "timestamp":"2025-09-13 00:08:58.03340964 +0000 UTC"}, Hostname:"ci-4081-3-5-n-294a4568b6", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"}
Sep 13 00:08:58.187347 containerd[1501]: 2025-09-13 00:08:58.033 [INFO][5101] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock.
Sep 13 00:08:58.187347 containerd[1501]: 2025-09-13 00:08:58.033 [INFO][5101] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock.
Sep 13 00:08:58.187347 containerd[1501]: 2025-09-13 00:08:58.033 [INFO][5101] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4081-3-5-n-294a4568b6'
Sep 13 00:08:58.187347 containerd[1501]: 2025-09-13 00:08:58.042 [INFO][5101] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.6bf9568832be6da1b5e90f03d75bec7ca27702190bfb20532005105b81e53c81" host="ci-4081-3-5-n-294a4568b6"
Sep 13 00:08:58.187347 containerd[1501]: 2025-09-13 00:08:58.049 [INFO][5101] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4081-3-5-n-294a4568b6"
Sep 13 00:08:58.187347 containerd[1501]: 2025-09-13 00:08:58.072 [INFO][5101] ipam/ipam.go 511: Trying affinity for 192.168.44.128/26 host="ci-4081-3-5-n-294a4568b6"
Sep 13 00:08:58.187347 containerd[1501]: 2025-09-13 00:08:58.081 [INFO][5101] ipam/ipam.go 158: Attempting to load block cidr=192.168.44.128/26 host="ci-4081-3-5-n-294a4568b6"
Sep 13 00:08:58.187347 containerd[1501]: 2025-09-13 00:08:58.089 [INFO][5101] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.44.128/26 host="ci-4081-3-5-n-294a4568b6"
Sep 13 00:08:58.187347 containerd[1501]: 2025-09-13 00:08:58.090 [INFO][5101] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.44.128/26 handle="k8s-pod-network.6bf9568832be6da1b5e90f03d75bec7ca27702190bfb20532005105b81e53c81" host="ci-4081-3-5-n-294a4568b6"
Sep 13 00:08:58.187347 containerd[1501]: 2025-09-13 00:08:58.095 [INFO][5101] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.6bf9568832be6da1b5e90f03d75bec7ca27702190bfb20532005105b81e53c81
Sep 13 00:08:58.187347 containerd[1501]: 2025-09-13 00:08:58.110 [INFO][5101] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.44.128/26 handle="k8s-pod-network.6bf9568832be6da1b5e90f03d75bec7ca27702190bfb20532005105b81e53c81" host="ci-4081-3-5-n-294a4568b6"
Sep 13 00:08:58.187347 containerd[1501]: 2025-09-13 00:08:58.127 [INFO][5101] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.44.136/26] block=192.168.44.128/26 handle="k8s-pod-network.6bf9568832be6da1b5e90f03d75bec7ca27702190bfb20532005105b81e53c81" host="ci-4081-3-5-n-294a4568b6"
Sep 13 00:08:58.187347 containerd[1501]: 2025-09-13 00:08:58.127 [INFO][5101] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.44.136/26] handle="k8s-pod-network.6bf9568832be6da1b5e90f03d75bec7ca27702190bfb20532005105b81e53c81" host="ci-4081-3-5-n-294a4568b6"
Sep 13 00:08:58.187347 containerd[1501]: 2025-09-13 00:08:58.128 [INFO][5101] ipam/ipam_plugin.go 374: Released host-wide IPAM lock.
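The AutoAssignArgs dump above shows the request shape the CNI IPAM plugin sends for each pod. A local mirror of those fields (not the real libcalico-go types), filled with the values from the log, to make the shape readable:

    package main

    import "fmt"

    // Local mirror of the fields printed in the ipam.AutoAssignArgs dump above;
    // these are NOT the real libcalico-go types, just the logged shape.
    type AutoAssignArgs struct {
    	Num4, Num6  int
    	HandleID    *string
    	Attrs       map[string]string
    	Hostname    string
    	IntendedUse string
    }

    func main() {
    	handle := "k8s-pod-network.6bf9568832be6da1b5e90f03d75bec7ca27702190bfb20532005105b81e53c81"
    	args := AutoAssignArgs{
    		Num4:     1, // one IPv4 address requested, no IPv6
    		HandleID: &handle,
    		Attrs: map[string]string{
    			"namespace": "calico-system",
    			"node":      "ci-4081-3-5-n-294a4568b6",
    			"pod":       "calico-kube-controllers-577b6875b7-hxt89",
    		},
    		Hostname:    "ci-4081-3-5-n-294a4568b6",
    		IntendedUse: "Workload",
    	}
    	fmt.Printf("%+v\n", args)
    }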
Sep 13 00:08:58.187347 containerd[1501]: 2025-09-13 00:08:58.128 [INFO][5101] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.44.136/26] IPv6=[] ContainerID="6bf9568832be6da1b5e90f03d75bec7ca27702190bfb20532005105b81e53c81" HandleID="k8s-pod-network.6bf9568832be6da1b5e90f03d75bec7ca27702190bfb20532005105b81e53c81" Workload="ci--4081--3--5--n--294a4568b6-k8s-calico--kube--controllers--577b6875b7--hxt89-eth0"
Sep 13 00:08:58.188410 containerd[1501]: 2025-09-13 00:08:58.133 [INFO][5087] cni-plugin/k8s.go 418: Populated endpoint ContainerID="6bf9568832be6da1b5e90f03d75bec7ca27702190bfb20532005105b81e53c81" Namespace="calico-system" Pod="calico-kube-controllers-577b6875b7-hxt89" WorkloadEndpoint="ci--4081--3--5--n--294a4568b6-k8s-calico--kube--controllers--577b6875b7--hxt89-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--5--n--294a4568b6-k8s-calico--kube--controllers--577b6875b7--hxt89-eth0", GenerateName:"calico-kube-controllers-577b6875b7-", Namespace:"calico-system", SelfLink:"", UID:"4166a8bf-c6d2-4b14-bbe8-bc5fa59948b2", ResourceVersion:"995", Generation:0, CreationTimestamp:time.Date(2025, time.September, 13, 0, 8, 32, 0, time.Local), DeletionTimestamp:<nil>, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"577b6875b7", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-5-n-294a4568b6", ContainerID:"", Pod:"calico-kube-controllers-577b6875b7-hxt89", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.44.136/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali8e5aadf41af", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}}
Sep 13 00:08:58.188410 containerd[1501]: 2025-09-13 00:08:58.133 [INFO][5087] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.44.136/32] ContainerID="6bf9568832be6da1b5e90f03d75bec7ca27702190bfb20532005105b81e53c81" Namespace="calico-system" Pod="calico-kube-controllers-577b6875b7-hxt89" WorkloadEndpoint="ci--4081--3--5--n--294a4568b6-k8s-calico--kube--controllers--577b6875b7--hxt89-eth0"
Sep 13 00:08:58.188410 containerd[1501]: 2025-09-13 00:08:58.133 [INFO][5087] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali8e5aadf41af ContainerID="6bf9568832be6da1b5e90f03d75bec7ca27702190bfb20532005105b81e53c81" Namespace="calico-system" Pod="calico-kube-controllers-577b6875b7-hxt89" WorkloadEndpoint="ci--4081--3--5--n--294a4568b6-k8s-calico--kube--controllers--577b6875b7--hxt89-eth0"
Sep 13 00:08:58.188410 containerd[1501]: 2025-09-13 00:08:58.141 [INFO][5087] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="6bf9568832be6da1b5e90f03d75bec7ca27702190bfb20532005105b81e53c81" Namespace="calico-system" Pod="calico-kube-controllers-577b6875b7-hxt89" WorkloadEndpoint="ci--4081--3--5--n--294a4568b6-k8s-calico--kube--controllers--577b6875b7--hxt89-eth0"
Sep 13 00:08:58.188410 containerd[1501]: 2025-09-13 00:08:58.142 [INFO][5087] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="6bf9568832be6da1b5e90f03d75bec7ca27702190bfb20532005105b81e53c81" Namespace="calico-system" Pod="calico-kube-controllers-577b6875b7-hxt89" WorkloadEndpoint="ci--4081--3--5--n--294a4568b6-k8s-calico--kube--controllers--577b6875b7--hxt89-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--5--n--294a4568b6-k8s-calico--kube--controllers--577b6875b7--hxt89-eth0", GenerateName:"calico-kube-controllers-577b6875b7-", Namespace:"calico-system", SelfLink:"", UID:"4166a8bf-c6d2-4b14-bbe8-bc5fa59948b2", ResourceVersion:"995", Generation:0, CreationTimestamp:time.Date(2025, time.September, 13, 0, 8, 32, 0, time.Local), DeletionTimestamp:<nil>, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"577b6875b7", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-5-n-294a4568b6", ContainerID:"6bf9568832be6da1b5e90f03d75bec7ca27702190bfb20532005105b81e53c81", Pod:"calico-kube-controllers-577b6875b7-hxt89", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.44.136/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali8e5aadf41af", MAC:"92:35:8a:a0:b0:12", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}}
Sep 13 00:08:58.188410 containerd[1501]: 2025-09-13 00:08:58.180 [INFO][5087] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="6bf9568832be6da1b5e90f03d75bec7ca27702190bfb20532005105b81e53c81" Namespace="calico-system" Pod="calico-kube-controllers-577b6875b7-hxt89" WorkloadEndpoint="ci--4081--3--5--n--294a4568b6-k8s-calico--kube--controllers--577b6875b7--hxt89-eth0"
Sep 13 00:08:58.229754 containerd[1501]: time="2025-09-13T00:08:58.229517517Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Sep 13 00:08:58.229754 containerd[1501]: time="2025-09-13T00:08:58.229608096Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Sep 13 00:08:58.232239 containerd[1501]: time="2025-09-13T00:08:58.230318363Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Sep 13 00:08:58.232239 containerd[1501]: time="2025-09-13T00:08:58.230461282Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Sep 13 00:08:58.279318 systemd[1]: Started cri-containerd-6bf9568832be6da1b5e90f03d75bec7ca27702190bfb20532005105b81e53c81.scope - libcontainer container 6bf9568832be6da1b5e90f03d75bec7ca27702190bfb20532005105b81e53c81.
Sep 13 00:08:58.369173 systemd-networkd[1394]: calib93f8e9ce6e: Gained IPv6LL
Sep 13 00:08:58.446343 containerd[1501]: time="2025-09-13T00:08:58.446269037Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-577b6875b7-hxt89,Uid:4166a8bf-c6d2-4b14-bbe8-bc5fa59948b2,Namespace:calico-system,Attempt:1,} returns sandbox id \"6bf9568832be6da1b5e90f03d75bec7ca27702190bfb20532005105b81e53c81\""
Sep 13 00:08:59.585182 systemd-networkd[1394]: cali8e5aadf41af: Gained IPv6LL
Sep 13 00:09:01.359816 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3573084310.mount: Deactivated successfully.
Sep 13 00:09:01.965495 containerd[1501]: time="2025-09-13T00:09:01.965385060Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 13 00:09:02.035105 containerd[1501]: time="2025-09-13T00:09:02.009158560Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.3: active requests=0, bytes read=66357526"
Sep 13 00:09:02.058809 containerd[1501]: time="2025-09-13T00:09:02.058741440Z" level=info msg="ImageCreate event name:\"sha256:a7d029fd8f6be94c26af980675c1650818e1e6e19dbd2f8c13e6e61963f021e8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 13 00:09:02.102554 containerd[1501]: time="2025-09-13T00:09:02.102307811Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane@sha256:46297703ab3739331a00a58f0d6a5498c8d3b6523ad947eed68592ee0f3e79f0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 13 00:09:02.103428 containerd[1501]: time="2025-09-13T00:09:02.103295147Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/goldmane:v3.30.3\" with image id \"sha256:a7d029fd8f6be94c26af980675c1650818e1e6e19dbd2f8c13e6e61963f021e8\", repo tag \"ghcr.io/flatcar/calico/goldmane:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/goldmane@sha256:46297703ab3739331a00a58f0d6a5498c8d3b6523ad947eed68592ee0f3e79f0\", size \"66357372\" in 4.691133162s"
Sep 13 00:09:02.103428 containerd[1501]: time="2025-09-13T00:09:02.103338238Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.3\" returns image reference \"sha256:a7d029fd8f6be94c26af980675c1650818e1e6e19dbd2f8c13e6e61963f021e8\""
Sep 13 00:09:02.166105 containerd[1501]: time="2025-09-13T00:09:02.164765214Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\""
Sep 13 00:09:02.280438 containerd[1501]: time="2025-09-13T00:09:02.280293726Z" level=info msg="CreateContainer within sandbox \"3e1f1eb8931fb7c95629805f0bfb07bd8f7e57e8c17fd97bb5b65304d8bbb817\" for container &ContainerMetadata{Name:goldmane,Attempt:0,}"
Sep 13 00:09:02.378971 containerd[1501]: time="2025-09-13T00:09:02.378864108Z" level=info msg="CreateContainer within sandbox \"3e1f1eb8931fb7c95629805f0bfb07bd8f7e57e8c17fd97bb5b65304d8bbb817\" for &ContainerMetadata{Name:goldmane,Attempt:0,} returns container id \"3ec7bb290736439913447f6240a844e54f5ab8eddd8abed325fcd178b1f913d1\""
Sep 13 00:09:02.382994 containerd[1501]: time="2025-09-13T00:09:02.382945931Z" level=info msg="StartContainer for \"3ec7bb290736439913447f6240a844e54f5ab8eddd8abed325fcd178b1f913d1\""
Sep 13 00:09:02.480092 systemd[1]: Started cri-containerd-3ec7bb290736439913447f6240a844e54f5ab8eddd8abed325fcd178b1f913d1.scope - libcontainer container 3ec7bb290736439913447f6240a844e54f5ab8eddd8abed325fcd178b1f913d1.
Sep 13 00:09:02.552454 containerd[1501]: time="2025-09-13T00:09:02.552324184Z" level=info msg="StartContainer for \"3ec7bb290736439913447f6240a844e54f5ab8eddd8abed325fcd178b1f913d1\" returns successfully"
Sep 13 00:09:03.423641 kubelet[2611]: I0913 00:09:03.395842 2611 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/goldmane-54d579b49d-zn7kg" podStartSLOduration=26.734818969 podStartE2EDuration="32.386267266s" podCreationTimestamp="2025-09-13 00:08:31 +0000 UTC" firstStartedPulling="2025-09-13 00:08:56.478161108 +0000 UTC m=+42.892194405" lastFinishedPulling="2025-09-13 00:09:02.129609385 +0000 UTC m=+48.543642702" observedRunningTime="2025-09-13 00:09:03.369242426 +0000 UTC m=+49.783275752" watchObservedRunningTime="2025-09-13 00:09:03.386267266 +0000 UTC m=+49.800300563"
Sep 13 00:09:04.088440 containerd[1501]: time="2025-09-13T00:09:04.088366374Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 13 00:09:04.089878 containerd[1501]: time="2025-09-13T00:09:04.089797793Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3: active requests=0, bytes read=14698542"
Sep 13 00:09:04.091609 containerd[1501]: time="2025-09-13T00:09:04.090185142Z" level=info msg="ImageCreate event name:\"sha256:b8f31c4fdaed3fa08af64de3d37d65a4c2ea0d9f6f522cb60d2e0cb424f8dd8a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 13 00:09:04.094081 containerd[1501]: time="2025-09-13T00:09:04.093128493Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:731ab232ca708102ab332340b1274d5cd656aa896ecc5368ee95850b811df86f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 13 00:09:04.094081 containerd[1501]: time="2025-09-13T00:09:04.093572298Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" with image id \"sha256:b8f31c4fdaed3fa08af64de3d37d65a4c2ea0d9f6f522cb60d2e0cb424f8dd8a\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:731ab232ca708102ab332340b1274d5cd656aa896ecc5368ee95850b811df86f\", size \"16191197\" in 1.928752632s"
Sep 13 00:09:04.094081 containerd[1501]: time="2025-09-13T00:09:04.093603607Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" returns image reference \"sha256:b8f31c4fdaed3fa08af64de3d37d65a4c2ea0d9f6f522cb60d2e0cb424f8dd8a\""
Sep 13 00:09:04.117903 containerd[1501]: time="2025-09-13T00:09:04.117853301Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\""
Sep 13 00:09:04.123902 containerd[1501]: time="2025-09-13T00:09:04.123841836Z" level=info msg="CreateContainer within sandbox \"2aeeb1e67e0ab98e4632cde4b2c2a3b3d848b6449cd7ba19bc91d6b34385ec27\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}"
Sep 13 00:09:04.143476 containerd[1501]: time="2025-09-13T00:09:04.142807309Z" level=info msg="CreateContainer within sandbox \"2aeeb1e67e0ab98e4632cde4b2c2a3b3d848b6449cd7ba19bc91d6b34385ec27\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"4d2d9e530c12ff36bcafb5dd3b838f30d3c7696c8afd2d52854d297d67005f43\""
Sep 13 00:09:04.146775 containerd[1501]: time="2025-09-13T00:09:04.145386997Z" level=info msg="StartContainer for \"4d2d9e530c12ff36bcafb5dd3b838f30d3c7696c8afd2d52854d297d67005f43\""
Sep 13 00:09:04.209255 systemd[1]: Started cri-containerd-4d2d9e530c12ff36bcafb5dd3b838f30d3c7696c8afd2d52854d297d67005f43.scope - libcontainer container 4d2d9e530c12ff36bcafb5dd3b838f30d3c7696c8afd2d52854d297d67005f43.
Sep 13 00:09:04.256626 containerd[1501]: time="2025-09-13T00:09:04.256593895Z" level=info msg="StartContainer for \"4d2d9e530c12ff36bcafb5dd3b838f30d3c7696c8afd2d52854d297d67005f43\" returns successfully"
Sep 13 00:09:04.899057 kubelet[2611]: I0913 00:09:04.898961 2611 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0
Sep 13 00:09:04.899057 kubelet[2611]: I0913 00:09:04.899051 2611 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock
Sep 13 00:09:05.306336 kubelet[2611]: I0913 00:09:05.305766 2611 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-pdqq4" podStartSLOduration=24.41457676 podStartE2EDuration="33.30574479s" podCreationTimestamp="2025-09-13 00:08:32 +0000 UTC" firstStartedPulling="2025-09-13 00:08:55.209136109 +0000 UTC m=+41.623169406" lastFinishedPulling="2025-09-13 00:09:04.100304139 +0000 UTC m=+50.514337436" observedRunningTime="2025-09-13 00:09:05.298505046 +0000 UTC m=+51.712538363" watchObservedRunningTime="2025-09-13 00:09:05.30574479 +0000 UTC m=+51.719778087"
Sep 13 00:09:06.261533 containerd[1501]: time="2025-09-13T00:09:06.261150106Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 13 00:09:06.262953 containerd[1501]: time="2025-09-13T00:09:06.262639696Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.3: active requests=0, bytes read=51277746"
Sep 13 00:09:06.264312 containerd[1501]: time="2025-09-13T00:09:06.264283534Z" level=info msg="ImageCreate event name:\"sha256:df191a54fb79de3c693f8b1b864a1bd3bd14f63b3fff9d5fa4869c471ce3cd37\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 13 00:09:06.267934 containerd[1501]: time="2025-09-13T00:09:06.266828175Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:27c4187717f08f0a5727019d8beb7597665eb47e69eaa1d7d091a7e28913e577\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 13 00:09:06.267934 containerd[1501]: time="2025-09-13T00:09:06.267759505Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" with image id \"sha256:df191a54fb79de3c693f8b1b864a1bd3bd14f63b3fff9d5fa4869c471ce3cd37\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:27c4187717f08f0a5727019d8beb7597665eb47e69eaa1d7d091a7e28913e577\", size \"52770417\" in 2.149867842s"
Sep 13 00:09:06.267934 containerd[1501]: time="2025-09-13T00:09:06.267792837Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" returns image reference \"sha256:df191a54fb79de3c693f8b1b864a1bd3bd14f63b3fff9d5fa4869c471ce3cd37\""
Sep 13 00:09:06.380099 containerd[1501]: time="2025-09-13T00:09:06.380056269Z" level=info msg="CreateContainer within sandbox \"6bf9568832be6da1b5e90f03d75bec7ca27702190bfb20532005105b81e53c81\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}"
Sep 13 00:09:06.393898 containerd[1501]: time="2025-09-13T00:09:06.393854374Z" level=info msg="CreateContainer within sandbox \"6bf9568832be6da1b5e90f03d75bec7ca27702190bfb20532005105b81e53c81\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"8b55bc07e1516f8dfd7f7a6acf2a839134d3aa25a0f72760ffeed69bf552163e\""
Sep 13 00:09:06.394727 containerd[1501]: time="2025-09-13T00:09:06.394698150Z" level=info msg="StartContainer for \"8b55bc07e1516f8dfd7f7a6acf2a839134d3aa25a0f72760ffeed69bf552163e\""
Sep 13 00:09:06.460138 systemd[1]: Started cri-containerd-8b55bc07e1516f8dfd7f7a6acf2a839134d3aa25a0f72760ffeed69bf552163e.scope - libcontainer container 8b55bc07e1516f8dfd7f7a6acf2a839134d3aa25a0f72760ffeed69bf552163e.
Sep 13 00:09:06.526571 containerd[1501]: time="2025-09-13T00:09:06.525737249Z" level=info msg="StartContainer for \"8b55bc07e1516f8dfd7f7a6acf2a839134d3aa25a0f72760ffeed69bf552163e\" returns successfully"
Sep 13 00:09:07.461396 kubelet[2611]: I0913 00:09:07.461276 2611 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-577b6875b7-hxt89" podStartSLOduration=27.597811021 podStartE2EDuration="35.452832791s" podCreationTimestamp="2025-09-13 00:08:32 +0000 UTC" firstStartedPulling="2025-09-13 00:08:58.450704832 +0000 UTC m=+44.864738128" lastFinishedPulling="2025-09-13 00:09:06.305726602 +0000 UTC m=+52.719759898" observedRunningTime="2025-09-13 00:09:07.394717792 +0000 UTC m=+53.808751089" watchObservedRunningTime="2025-09-13 00:09:07.452832791 +0000 UTC m=+53.866866089"
Sep 13 00:09:13.907410 containerd[1501]: time="2025-09-13T00:09:13.907193880Z" level=info msg="StopPodSandbox for \"132b716fbd3e34dd666a81d223dc704a3e4527a521620c1192a0fd481e16d6ea\""
Sep 13 00:09:14.285264 containerd[1501]: 2025-09-13 00:09:14.113 [WARNING][5414] cni-plugin/k8s.go 604: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="132b716fbd3e34dd666a81d223dc704a3e4527a521620c1192a0fd481e16d6ea" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--5--n--294a4568b6-k8s-csi--node--driver--pdqq4-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"d0c095e9-b846-4f71-bc12-2e92665be871", ResourceVersion:"1049", Generation:0, CreationTimestamp:time.Date(2025, time.September, 13, 0, 8, 32, 0, time.Local), DeletionTimestamp:<nil>, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"6c96d95cc7", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-5-n-294a4568b6", ContainerID:"2aeeb1e67e0ab98e4632cde4b2c2a3b3d848b6449cd7ba19bc91d6b34385ec27", Pod:"csi-node-driver-pdqq4", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.44.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calid910e85c87b", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}}
Sep 13 00:09:14.285264 containerd[1501]: 2025-09-13 00:09:14.115 [INFO][5414] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="132b716fbd3e34dd666a81d223dc704a3e4527a521620c1192a0fd481e16d6ea"
Sep 13 00:09:14.285264 containerd[1501]: 2025-09-13 00:09:14.115 [INFO][5414] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="132b716fbd3e34dd666a81d223dc704a3e4527a521620c1192a0fd481e16d6ea" iface="eth0" netns=""
Sep 13 00:09:14.285264 containerd[1501]: 2025-09-13 00:09:14.115 [INFO][5414] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="132b716fbd3e34dd666a81d223dc704a3e4527a521620c1192a0fd481e16d6ea"
Sep 13 00:09:14.285264 containerd[1501]: 2025-09-13 00:09:14.115 [INFO][5414] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="132b716fbd3e34dd666a81d223dc704a3e4527a521620c1192a0fd481e16d6ea"
Sep 13 00:09:14.285264 containerd[1501]: 2025-09-13 00:09:14.267 [INFO][5421] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="132b716fbd3e34dd666a81d223dc704a3e4527a521620c1192a0fd481e16d6ea" HandleID="k8s-pod-network.132b716fbd3e34dd666a81d223dc704a3e4527a521620c1192a0fd481e16d6ea" Workload="ci--4081--3--5--n--294a4568b6-k8s-csi--node--driver--pdqq4-eth0"
Sep 13 00:09:14.285264 containerd[1501]: 2025-09-13 00:09:14.269 [INFO][5421] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock.
Sep 13 00:09:14.285264 containerd[1501]: 2025-09-13 00:09:14.270 [INFO][5421] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock.
Sep 13 00:09:14.285264 containerd[1501]: 2025-09-13 00:09:14.278 [WARNING][5421] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="132b716fbd3e34dd666a81d223dc704a3e4527a521620c1192a0fd481e16d6ea" HandleID="k8s-pod-network.132b716fbd3e34dd666a81d223dc704a3e4527a521620c1192a0fd481e16d6ea" Workload="ci--4081--3--5--n--294a4568b6-k8s-csi--node--driver--pdqq4-eth0"
Sep 13 00:09:14.285264 containerd[1501]: 2025-09-13 00:09:14.278 [INFO][5421] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="132b716fbd3e34dd666a81d223dc704a3e4527a521620c1192a0fd481e16d6ea" HandleID="k8s-pod-network.132b716fbd3e34dd666a81d223dc704a3e4527a521620c1192a0fd481e16d6ea" Workload="ci--4081--3--5--n--294a4568b6-k8s-csi--node--driver--pdqq4-eth0"
Sep 13 00:09:14.285264 containerd[1501]: 2025-09-13 00:09:14.280 [INFO][5421] ipam/ipam_plugin.go 374: Released host-wide IPAM lock.
Sep 13 00:09:14.285264 containerd[1501]: 2025-09-13 00:09:14.282 [INFO][5414] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="132b716fbd3e34dd666a81d223dc704a3e4527a521620c1192a0fd481e16d6ea"
Sep 13 00:09:14.295085 containerd[1501]: time="2025-09-13T00:09:14.295008827Z" level=info msg="TearDown network for sandbox \"132b716fbd3e34dd666a81d223dc704a3e4527a521620c1192a0fd481e16d6ea\" successfully"
Sep 13 00:09:14.295085 containerd[1501]: time="2025-09-13T00:09:14.295066355Z" level=info msg="StopPodSandbox for \"132b716fbd3e34dd666a81d223dc704a3e4527a521620c1192a0fd481e16d6ea\" returns successfully"
Sep 13 00:09:14.409446 containerd[1501]: time="2025-09-13T00:09:14.409402073Z" level=info msg="RemovePodSandbox for \"132b716fbd3e34dd666a81d223dc704a3e4527a521620c1192a0fd481e16d6ea\""
Sep 13 00:09:14.409446 containerd[1501]: time="2025-09-13T00:09:14.409449382Z" level=info msg="Forcibly stopping sandbox \"132b716fbd3e34dd666a81d223dc704a3e4527a521620c1192a0fd481e16d6ea\""
Sep 13 00:09:14.501286 containerd[1501]: 2025-09-13 00:09:14.461 [WARNING][5436] cni-plugin/k8s.go 604: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="132b716fbd3e34dd666a81d223dc704a3e4527a521620c1192a0fd481e16d6ea" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--5--n--294a4568b6-k8s-csi--node--driver--pdqq4-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"d0c095e9-b846-4f71-bc12-2e92665be871", ResourceVersion:"1049", Generation:0, CreationTimestamp:time.Date(2025, time.September, 13, 0, 8, 32, 0, time.Local), DeletionTimestamp:<nil>, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"6c96d95cc7", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-5-n-294a4568b6", ContainerID:"2aeeb1e67e0ab98e4632cde4b2c2a3b3d848b6449cd7ba19bc91d6b34385ec27", Pod:"csi-node-driver-pdqq4", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.44.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calid910e85c87b", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}}
Sep 13 00:09:14.501286 containerd[1501]: 2025-09-13 00:09:14.462 [INFO][5436] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="132b716fbd3e34dd666a81d223dc704a3e4527a521620c1192a0fd481e16d6ea"
Sep 13 00:09:14.501286 containerd[1501]: 2025-09-13 00:09:14.462 [INFO][5436] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="132b716fbd3e34dd666a81d223dc704a3e4527a521620c1192a0fd481e16d6ea" iface="eth0" netns=""
Sep 13 00:09:14.501286 containerd[1501]: 2025-09-13 00:09:14.462 [INFO][5436] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="132b716fbd3e34dd666a81d223dc704a3e4527a521620c1192a0fd481e16d6ea"
Sep 13 00:09:14.501286 containerd[1501]: 2025-09-13 00:09:14.463 [INFO][5436] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="132b716fbd3e34dd666a81d223dc704a3e4527a521620c1192a0fd481e16d6ea"
Sep 13 00:09:14.501286 containerd[1501]: 2025-09-13 00:09:14.486 [INFO][5443] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="132b716fbd3e34dd666a81d223dc704a3e4527a521620c1192a0fd481e16d6ea" HandleID="k8s-pod-network.132b716fbd3e34dd666a81d223dc704a3e4527a521620c1192a0fd481e16d6ea" Workload="ci--4081--3--5--n--294a4568b6-k8s-csi--node--driver--pdqq4-eth0"
Sep 13 00:09:14.501286 containerd[1501]: 2025-09-13 00:09:14.486 [INFO][5443] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock.
Sep 13 00:09:14.501286 containerd[1501]: 2025-09-13 00:09:14.486 [INFO][5443] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock.
Sep 13 00:09:14.501286 containerd[1501]: 2025-09-13 00:09:14.491 [WARNING][5443] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="132b716fbd3e34dd666a81d223dc704a3e4527a521620c1192a0fd481e16d6ea" HandleID="k8s-pod-network.132b716fbd3e34dd666a81d223dc704a3e4527a521620c1192a0fd481e16d6ea" Workload="ci--4081--3--5--n--294a4568b6-k8s-csi--node--driver--pdqq4-eth0"
Sep 13 00:09:14.501286 containerd[1501]: 2025-09-13 00:09:14.492 [INFO][5443] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="132b716fbd3e34dd666a81d223dc704a3e4527a521620c1192a0fd481e16d6ea" HandleID="k8s-pod-network.132b716fbd3e34dd666a81d223dc704a3e4527a521620c1192a0fd481e16d6ea" Workload="ci--4081--3--5--n--294a4568b6-k8s-csi--node--driver--pdqq4-eth0"
Sep 13 00:09:14.501286 containerd[1501]: 2025-09-13 00:09:14.493 [INFO][5443] ipam/ipam_plugin.go 374: Released host-wide IPAM lock.
Sep 13 00:09:14.501286 containerd[1501]: 2025-09-13 00:09:14.497 [INFO][5436] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="132b716fbd3e34dd666a81d223dc704a3e4527a521620c1192a0fd481e16d6ea"
Sep 13 00:09:14.501286 containerd[1501]: time="2025-09-13T00:09:14.501079581Z" level=info msg="TearDown network for sandbox \"132b716fbd3e34dd666a81d223dc704a3e4527a521620c1192a0fd481e16d6ea\" successfully"
Sep 13 00:09:14.525784 containerd[1501]: time="2025-09-13T00:09:14.525728951Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"132b716fbd3e34dd666a81d223dc704a3e4527a521620c1192a0fd481e16d6ea\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Sep 13 00:09:14.531773 containerd[1501]: time="2025-09-13T00:09:14.531728685Z" level=info msg="RemovePodSandbox \"132b716fbd3e34dd666a81d223dc704a3e4527a521620c1192a0fd481e16d6ea\" returns successfully"
Sep 13 00:09:14.532512 containerd[1501]: time="2025-09-13T00:09:14.532236409Z" level=info msg="StopPodSandbox for \"a628c9f57a387c30d21df3f761670db06db768ddf12b19eb8d70d9c8345f905b\""
Sep 13 00:09:14.588822 containerd[1501]: 2025-09-13 00:09:14.560 [WARNING][5457] cni-plugin/k8s.go 604: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="a628c9f57a387c30d21df3f761670db06db768ddf12b19eb8d70d9c8345f905b" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--5--n--294a4568b6-k8s-calico--kube--controllers--577b6875b7--hxt89-eth0", GenerateName:"calico-kube-controllers-577b6875b7-", Namespace:"calico-system", SelfLink:"", UID:"4166a8bf-c6d2-4b14-bbe8-bc5fa59948b2", ResourceVersion:"1063", Generation:0, CreationTimestamp:time.Date(2025, time.September, 13, 0, 8, 32, 0, time.Local), DeletionTimestamp:<nil>, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"577b6875b7", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-5-n-294a4568b6", ContainerID:"6bf9568832be6da1b5e90f03d75bec7ca27702190bfb20532005105b81e53c81", Pod:"calico-kube-controllers-577b6875b7-hxt89", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.44.136/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali8e5aadf41af", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}}
Sep 13 00:09:14.588822 containerd[1501]: 2025-09-13 00:09:14.560 [INFO][5457] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="a628c9f57a387c30d21df3f761670db06db768ddf12b19eb8d70d9c8345f905b"
Sep 13 00:09:14.588822 containerd[1501]: 2025-09-13 00:09:14.561 [INFO][5457] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="a628c9f57a387c30d21df3f761670db06db768ddf12b19eb8d70d9c8345f905b" iface="eth0" netns=""
Sep 13 00:09:14.588822 containerd[1501]: 2025-09-13 00:09:14.561 [INFO][5457] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="a628c9f57a387c30d21df3f761670db06db768ddf12b19eb8d70d9c8345f905b"
Sep 13 00:09:14.588822 containerd[1501]: 2025-09-13 00:09:14.561 [INFO][5457] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="a628c9f57a387c30d21df3f761670db06db768ddf12b19eb8d70d9c8345f905b"
Sep 13 00:09:14.588822 containerd[1501]: 2025-09-13 00:09:14.577 [INFO][5464] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="a628c9f57a387c30d21df3f761670db06db768ddf12b19eb8d70d9c8345f905b" HandleID="k8s-pod-network.a628c9f57a387c30d21df3f761670db06db768ddf12b19eb8d70d9c8345f905b" Workload="ci--4081--3--5--n--294a4568b6-k8s-calico--kube--controllers--577b6875b7--hxt89-eth0"
Sep 13 00:09:14.588822 containerd[1501]: 2025-09-13 00:09:14.577 [INFO][5464] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock.
Sep 13 00:09:14.588822 containerd[1501]: 2025-09-13 00:09:14.577 [INFO][5464] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock.
Sep 13 00:09:14.588822 containerd[1501]: 2025-09-13 00:09:14.583 [WARNING][5464] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="a628c9f57a387c30d21df3f761670db06db768ddf12b19eb8d70d9c8345f905b" HandleID="k8s-pod-network.a628c9f57a387c30d21df3f761670db06db768ddf12b19eb8d70d9c8345f905b" Workload="ci--4081--3--5--n--294a4568b6-k8s-calico--kube--controllers--577b6875b7--hxt89-eth0"
Sep 13 00:09:14.588822 containerd[1501]: 2025-09-13 00:09:14.583 [INFO][5464] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="a628c9f57a387c30d21df3f761670db06db768ddf12b19eb8d70d9c8345f905b" HandleID="k8s-pod-network.a628c9f57a387c30d21df3f761670db06db768ddf12b19eb8d70d9c8345f905b" Workload="ci--4081--3--5--n--294a4568b6-k8s-calico--kube--controllers--577b6875b7--hxt89-eth0"
Sep 13 00:09:14.588822 containerd[1501]: 2025-09-13 00:09:14.584 [INFO][5464] ipam/ipam_plugin.go 374: Released host-wide IPAM lock.
Sep 13 00:09:14.588822 containerd[1501]: 2025-09-13 00:09:14.586 [INFO][5457] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="a628c9f57a387c30d21df3f761670db06db768ddf12b19eb8d70d9c8345f905b"
Sep 13 00:09:14.588822 containerd[1501]: time="2025-09-13T00:09:14.588497701Z" level=info msg="TearDown network for sandbox \"a628c9f57a387c30d21df3f761670db06db768ddf12b19eb8d70d9c8345f905b\" successfully"
Sep 13 00:09:14.588822 containerd[1501]: time="2025-09-13T00:09:14.588523800Z" level=info msg="StopPodSandbox for \"a628c9f57a387c30d21df3f761670db06db768ddf12b19eb8d70d9c8345f905b\" returns successfully"
Sep 13 00:09:14.589730 containerd[1501]: time="2025-09-13T00:09:14.589406337Z" level=info msg="RemovePodSandbox for \"a628c9f57a387c30d21df3f761670db06db768ddf12b19eb8d70d9c8345f905b\""
Sep 13 00:09:14.589730 containerd[1501]: time="2025-09-13T00:09:14.589432727Z" level=info msg="Forcibly stopping sandbox \"a628c9f57a387c30d21df3f761670db06db768ddf12b19eb8d70d9c8345f905b\""
Sep 13 00:09:14.658952 containerd[1501]: 2025-09-13 00:09:14.622 [WARNING][5478] cni-plugin/k8s.go 604: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="a628c9f57a387c30d21df3f761670db06db768ddf12b19eb8d70d9c8345f905b" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--5--n--294a4568b6-k8s-calico--kube--controllers--577b6875b7--hxt89-eth0", GenerateName:"calico-kube-controllers-577b6875b7-", Namespace:"calico-system", SelfLink:"", UID:"4166a8bf-c6d2-4b14-bbe8-bc5fa59948b2", ResourceVersion:"1063", Generation:0, CreationTimestamp:time.Date(2025, time.September, 13, 0, 8, 32, 0, time.Local), DeletionTimestamp:<nil>, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"577b6875b7", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-5-n-294a4568b6", ContainerID:"6bf9568832be6da1b5e90f03d75bec7ca27702190bfb20532005105b81e53c81", Pod:"calico-kube-controllers-577b6875b7-hxt89", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.44.136/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali8e5aadf41af", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}}
Sep 13 00:09:14.658952 containerd[1501]: 2025-09-13 00:09:14.622 [INFO][5478] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="a628c9f57a387c30d21df3f761670db06db768ddf12b19eb8d70d9c8345f905b"
Sep 13 00:09:14.658952 containerd[1501]: 2025-09-13 00:09:14.622 [INFO][5478] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="a628c9f57a387c30d21df3f761670db06db768ddf12b19eb8d70d9c8345f905b" iface="eth0" netns=""
Sep 13 00:09:14.658952 containerd[1501]: 2025-09-13 00:09:14.622 [INFO][5478] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="a628c9f57a387c30d21df3f761670db06db768ddf12b19eb8d70d9c8345f905b"
Sep 13 00:09:14.658952 containerd[1501]: 2025-09-13 00:09:14.622 [INFO][5478] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="a628c9f57a387c30d21df3f761670db06db768ddf12b19eb8d70d9c8345f905b"
Sep 13 00:09:14.658952 containerd[1501]: 2025-09-13 00:09:14.643 [INFO][5486] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="a628c9f57a387c30d21df3f761670db06db768ddf12b19eb8d70d9c8345f905b" HandleID="k8s-pod-network.a628c9f57a387c30d21df3f761670db06db768ddf12b19eb8d70d9c8345f905b" Workload="ci--4081--3--5--n--294a4568b6-k8s-calico--kube--controllers--577b6875b7--hxt89-eth0"
Sep 13 00:09:14.658952 containerd[1501]: 2025-09-13 00:09:14.643 [INFO][5486] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock.
Sep 13 00:09:14.658952 containerd[1501]: 2025-09-13 00:09:14.643 [INFO][5486] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock.
Sep 13 00:09:14.658952 containerd[1501]: 2025-09-13 00:09:14.650 [WARNING][5486] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="a628c9f57a387c30d21df3f761670db06db768ddf12b19eb8d70d9c8345f905b" HandleID="k8s-pod-network.a628c9f57a387c30d21df3f761670db06db768ddf12b19eb8d70d9c8345f905b" Workload="ci--4081--3--5--n--294a4568b6-k8s-calico--kube--controllers--577b6875b7--hxt89-eth0"
Sep 13 00:09:14.658952 containerd[1501]: 2025-09-13 00:09:14.650 [INFO][5486] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="a628c9f57a387c30d21df3f761670db06db768ddf12b19eb8d70d9c8345f905b" HandleID="k8s-pod-network.a628c9f57a387c30d21df3f761670db06db768ddf12b19eb8d70d9c8345f905b" Workload="ci--4081--3--5--n--294a4568b6-k8s-calico--kube--controllers--577b6875b7--hxt89-eth0"
Sep 13 00:09:14.658952 containerd[1501]: 2025-09-13 00:09:14.652 [INFO][5486] ipam/ipam_plugin.go 374: Released host-wide IPAM lock.
Sep 13 00:09:14.658952 containerd[1501]: 2025-09-13 00:09:14.655 [INFO][5478] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="a628c9f57a387c30d21df3f761670db06db768ddf12b19eb8d70d9c8345f905b"
Sep 13 00:09:14.658952 containerd[1501]: time="2025-09-13T00:09:14.658014030Z" level=info msg="TearDown network for sandbox \"a628c9f57a387c30d21df3f761670db06db768ddf12b19eb8d70d9c8345f905b\" successfully"
Sep 13 00:09:14.662811 containerd[1501]: time="2025-09-13T00:09:14.662775721Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"a628c9f57a387c30d21df3f761670db06db768ddf12b19eb8d70d9c8345f905b\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Sep 13 00:09:14.662899 containerd[1501]: time="2025-09-13T00:09:14.662845192Z" level=info msg="RemovePodSandbox \"a628c9f57a387c30d21df3f761670db06db768ddf12b19eb8d70d9c8345f905b\" returns successfully"
Sep 13 00:09:14.663408 containerd[1501]: time="2025-09-13T00:09:14.663384725Z" level=info msg="StopPodSandbox for \"917fdbefc3908f0bf83798fac2c4846875ba56865120df4665ebe4f06d46340d\""
Sep 13 00:09:14.755138 containerd[1501]: 2025-09-13 00:09:14.705 [WARNING][5500] cni-plugin/k8s.go 604: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="917fdbefc3908f0bf83798fac2c4846875ba56865120df4665ebe4f06d46340d" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--5--n--294a4568b6-k8s-goldmane--54d579b49d--zn7kg-eth0", GenerateName:"goldmane-54d579b49d-", Namespace:"calico-system", SelfLink:"", UID:"e8347be8-88b8-4276-8b5d-e1c006f27806", ResourceVersion:"1035", Generation:0, CreationTimestamp:time.Date(2025, time.September, 13, 0, 8, 31, 0, time.Local), DeletionTimestamp:<nil>, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"54d579b49d", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-5-n-294a4568b6", ContainerID:"3e1f1eb8931fb7c95629805f0bfb07bd8f7e57e8c17fd97bb5b65304d8bbb817", Pod:"goldmane-54d579b49d-zn7kg", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.44.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali255e78b914b", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}}
Sep 13 00:09:14.755138 containerd[1501]: 2025-09-13 00:09:14.705 [INFO][5500] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="917fdbefc3908f0bf83798fac2c4846875ba56865120df4665ebe4f06d46340d"
Sep 13 00:09:14.755138 containerd[1501]: 2025-09-13 00:09:14.705 [INFO][5500] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="917fdbefc3908f0bf83798fac2c4846875ba56865120df4665ebe4f06d46340d" iface="eth0" netns=""
Sep 13 00:09:14.755138 containerd[1501]: 2025-09-13 00:09:14.705 [INFO][5500] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="917fdbefc3908f0bf83798fac2c4846875ba56865120df4665ebe4f06d46340d"
Sep 13 00:09:14.755138 containerd[1501]: 2025-09-13 00:09:14.705 [INFO][5500] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="917fdbefc3908f0bf83798fac2c4846875ba56865120df4665ebe4f06d46340d"
Sep 13 00:09:14.755138 containerd[1501]: 2025-09-13 00:09:14.738 [INFO][5507] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="917fdbefc3908f0bf83798fac2c4846875ba56865120df4665ebe4f06d46340d" HandleID="k8s-pod-network.917fdbefc3908f0bf83798fac2c4846875ba56865120df4665ebe4f06d46340d" Workload="ci--4081--3--5--n--294a4568b6-k8s-goldmane--54d579b49d--zn7kg-eth0"
Sep 13 00:09:14.755138 containerd[1501]: 2025-09-13 00:09:14.739 [INFO][5507] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock.
Sep 13 00:09:14.755138 containerd[1501]: 2025-09-13 00:09:14.739 [INFO][5507] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock.
Sep 13 00:09:14.755138 containerd[1501]: 2025-09-13 00:09:14.748 [WARNING][5507] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist.
Ignoring ContainerID="917fdbefc3908f0bf83798fac2c4846875ba56865120df4665ebe4f06d46340d" HandleID="k8s-pod-network.917fdbefc3908f0bf83798fac2c4846875ba56865120df4665ebe4f06d46340d" Workload="ci--4081--3--5--n--294a4568b6-k8s-goldmane--54d579b49d--zn7kg-eth0" Sep 13 00:09:14.755138 containerd[1501]: 2025-09-13 00:09:14.748 [INFO][5507] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="917fdbefc3908f0bf83798fac2c4846875ba56865120df4665ebe4f06d46340d" HandleID="k8s-pod-network.917fdbefc3908f0bf83798fac2c4846875ba56865120df4665ebe4f06d46340d" Workload="ci--4081--3--5--n--294a4568b6-k8s-goldmane--54d579b49d--zn7kg-eth0" Sep 13 00:09:14.755138 containerd[1501]: 2025-09-13 00:09:14.749 [INFO][5507] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 13 00:09:14.755138 containerd[1501]: 2025-09-13 00:09:14.752 [INFO][5500] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="917fdbefc3908f0bf83798fac2c4846875ba56865120df4665ebe4f06d46340d" Sep 13 00:09:14.755807 containerd[1501]: time="2025-09-13T00:09:14.755169072Z" level=info msg="TearDown network for sandbox \"917fdbefc3908f0bf83798fac2c4846875ba56865120df4665ebe4f06d46340d\" successfully" Sep 13 00:09:14.755807 containerd[1501]: time="2025-09-13T00:09:14.755202625Z" level=info msg="StopPodSandbox for \"917fdbefc3908f0bf83798fac2c4846875ba56865120df4665ebe4f06d46340d\" returns successfully" Sep 13 00:09:14.756181 containerd[1501]: time="2025-09-13T00:09:14.755892491Z" level=info msg="RemovePodSandbox for \"917fdbefc3908f0bf83798fac2c4846875ba56865120df4665ebe4f06d46340d\"" Sep 13 00:09:14.756283 containerd[1501]: time="2025-09-13T00:09:14.756244932Z" level=info msg="Forcibly stopping sandbox \"917fdbefc3908f0bf83798fac2c4846875ba56865120df4665ebe4f06d46340d\"" Sep 13 00:09:14.847536 containerd[1501]: 2025-09-13 00:09:14.803 [WARNING][5521] cni-plugin/k8s.go 604: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="917fdbefc3908f0bf83798fac2c4846875ba56865120df4665ebe4f06d46340d" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--5--n--294a4568b6-k8s-goldmane--54d579b49d--zn7kg-eth0", GenerateName:"goldmane-54d579b49d-", Namespace:"calico-system", SelfLink:"", UID:"e8347be8-88b8-4276-8b5d-e1c006f27806", ResourceVersion:"1035", Generation:0, CreationTimestamp:time.Date(2025, time.September, 13, 0, 8, 31, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"54d579b49d", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-5-n-294a4568b6", ContainerID:"3e1f1eb8931fb7c95629805f0bfb07bd8f7e57e8c17fd97bb5b65304d8bbb817", Pod:"goldmane-54d579b49d-zn7kg", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.44.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali255e78b914b", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 13 00:09:14.847536 containerd[1501]: 2025-09-13 00:09:14.803 [INFO][5521] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="917fdbefc3908f0bf83798fac2c4846875ba56865120df4665ebe4f06d46340d" Sep 13 00:09:14.847536 containerd[1501]: 2025-09-13 00:09:14.803 [INFO][5521] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="917fdbefc3908f0bf83798fac2c4846875ba56865120df4665ebe4f06d46340d" iface="eth0" netns="" Sep 13 00:09:14.847536 containerd[1501]: 2025-09-13 00:09:14.803 [INFO][5521] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="917fdbefc3908f0bf83798fac2c4846875ba56865120df4665ebe4f06d46340d" Sep 13 00:09:14.847536 containerd[1501]: 2025-09-13 00:09:14.803 [INFO][5521] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="917fdbefc3908f0bf83798fac2c4846875ba56865120df4665ebe4f06d46340d" Sep 13 00:09:14.847536 containerd[1501]: 2025-09-13 00:09:14.831 [INFO][5528] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="917fdbefc3908f0bf83798fac2c4846875ba56865120df4665ebe4f06d46340d" HandleID="k8s-pod-network.917fdbefc3908f0bf83798fac2c4846875ba56865120df4665ebe4f06d46340d" Workload="ci--4081--3--5--n--294a4568b6-k8s-goldmane--54d579b49d--zn7kg-eth0" Sep 13 00:09:14.847536 containerd[1501]: 2025-09-13 00:09:14.831 [INFO][5528] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 13 00:09:14.847536 containerd[1501]: 2025-09-13 00:09:14.831 [INFO][5528] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 13 00:09:14.847536 containerd[1501]: 2025-09-13 00:09:14.839 [WARNING][5528] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="917fdbefc3908f0bf83798fac2c4846875ba56865120df4665ebe4f06d46340d" HandleID="k8s-pod-network.917fdbefc3908f0bf83798fac2c4846875ba56865120df4665ebe4f06d46340d" Workload="ci--4081--3--5--n--294a4568b6-k8s-goldmane--54d579b49d--zn7kg-eth0" Sep 13 00:09:14.847536 containerd[1501]: 2025-09-13 00:09:14.839 [INFO][5528] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="917fdbefc3908f0bf83798fac2c4846875ba56865120df4665ebe4f06d46340d" HandleID="k8s-pod-network.917fdbefc3908f0bf83798fac2c4846875ba56865120df4665ebe4f06d46340d" Workload="ci--4081--3--5--n--294a4568b6-k8s-goldmane--54d579b49d--zn7kg-eth0" Sep 13 00:09:14.847536 containerd[1501]: 2025-09-13 00:09:14.841 [INFO][5528] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 13 00:09:14.847536 containerd[1501]: 2025-09-13 00:09:14.844 [INFO][5521] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="917fdbefc3908f0bf83798fac2c4846875ba56865120df4665ebe4f06d46340d" Sep 13 00:09:14.847536 containerd[1501]: time="2025-09-13T00:09:14.847325128Z" level=info msg="TearDown network for sandbox \"917fdbefc3908f0bf83798fac2c4846875ba56865120df4665ebe4f06d46340d\" successfully" Sep 13 00:09:14.866984 containerd[1501]: time="2025-09-13T00:09:14.866892707Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"917fdbefc3908f0bf83798fac2c4846875ba56865120df4665ebe4f06d46340d\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Sep 13 00:09:14.867667 containerd[1501]: time="2025-09-13T00:09:14.867260308Z" level=info msg="RemovePodSandbox \"917fdbefc3908f0bf83798fac2c4846875ba56865120df4665ebe4f06d46340d\" returns successfully" Sep 13 00:09:14.868272 containerd[1501]: time="2025-09-13T00:09:14.867988003Z" level=info msg="StopPodSandbox for \"9de479bdd8fcad82feee5c64a72af5b4272a54cf434bea76104a486fa2c45353\"" Sep 13 00:09:14.976308 containerd[1501]: 2025-09-13 00:09:14.929 [WARNING][5542] cni-plugin/k8s.go 598: WorkloadEndpoint does not exist in the datastore, moving forward with the clean up ContainerID="9de479bdd8fcad82feee5c64a72af5b4272a54cf434bea76104a486fa2c45353" WorkloadEndpoint="ci--4081--3--5--n--294a4568b6-k8s-whisker--57d6c7f84c--tpttf-eth0" Sep 13 00:09:14.976308 containerd[1501]: 2025-09-13 00:09:14.929 [INFO][5542] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="9de479bdd8fcad82feee5c64a72af5b4272a54cf434bea76104a486fa2c45353" Sep 13 00:09:14.976308 containerd[1501]: 2025-09-13 00:09:14.929 [INFO][5542] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. 
ContainerID="9de479bdd8fcad82feee5c64a72af5b4272a54cf434bea76104a486fa2c45353" iface="eth0" netns="" Sep 13 00:09:14.976308 containerd[1501]: 2025-09-13 00:09:14.929 [INFO][5542] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="9de479bdd8fcad82feee5c64a72af5b4272a54cf434bea76104a486fa2c45353" Sep 13 00:09:14.976308 containerd[1501]: 2025-09-13 00:09:14.929 [INFO][5542] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="9de479bdd8fcad82feee5c64a72af5b4272a54cf434bea76104a486fa2c45353" Sep 13 00:09:14.976308 containerd[1501]: 2025-09-13 00:09:14.958 [INFO][5549] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="9de479bdd8fcad82feee5c64a72af5b4272a54cf434bea76104a486fa2c45353" HandleID="k8s-pod-network.9de479bdd8fcad82feee5c64a72af5b4272a54cf434bea76104a486fa2c45353" Workload="ci--4081--3--5--n--294a4568b6-k8s-whisker--57d6c7f84c--tpttf-eth0" Sep 13 00:09:14.976308 containerd[1501]: 2025-09-13 00:09:14.958 [INFO][5549] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 13 00:09:14.976308 containerd[1501]: 2025-09-13 00:09:14.958 [INFO][5549] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 13 00:09:14.976308 containerd[1501]: 2025-09-13 00:09:14.967 [WARNING][5549] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="9de479bdd8fcad82feee5c64a72af5b4272a54cf434bea76104a486fa2c45353" HandleID="k8s-pod-network.9de479bdd8fcad82feee5c64a72af5b4272a54cf434bea76104a486fa2c45353" Workload="ci--4081--3--5--n--294a4568b6-k8s-whisker--57d6c7f84c--tpttf-eth0" Sep 13 00:09:14.976308 containerd[1501]: 2025-09-13 00:09:14.967 [INFO][5549] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="9de479bdd8fcad82feee5c64a72af5b4272a54cf434bea76104a486fa2c45353" HandleID="k8s-pod-network.9de479bdd8fcad82feee5c64a72af5b4272a54cf434bea76104a486fa2c45353" Workload="ci--4081--3--5--n--294a4568b6-k8s-whisker--57d6c7f84c--tpttf-eth0" Sep 13 00:09:14.976308 containerd[1501]: 2025-09-13 00:09:14.970 [INFO][5549] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 13 00:09:14.976308 containerd[1501]: 2025-09-13 00:09:14.973 [INFO][5542] cni-plugin/k8s.go 653: Teardown processing complete. 
ContainerID="9de479bdd8fcad82feee5c64a72af5b4272a54cf434bea76104a486fa2c45353" Sep 13 00:09:14.976308 containerd[1501]: time="2025-09-13T00:09:14.976155642Z" level=info msg="TearDown network for sandbox \"9de479bdd8fcad82feee5c64a72af5b4272a54cf434bea76104a486fa2c45353\" successfully" Sep 13 00:09:14.976308 containerd[1501]: time="2025-09-13T00:09:14.976185448Z" level=info msg="StopPodSandbox for \"9de479bdd8fcad82feee5c64a72af5b4272a54cf434bea76104a486fa2c45353\" returns successfully" Sep 13 00:09:14.977445 containerd[1501]: time="2025-09-13T00:09:14.977400970Z" level=info msg="RemovePodSandbox for \"9de479bdd8fcad82feee5c64a72af5b4272a54cf434bea76104a486fa2c45353\"" Sep 13 00:09:14.977445 containerd[1501]: time="2025-09-13T00:09:14.977440604Z" level=info msg="Forcibly stopping sandbox \"9de479bdd8fcad82feee5c64a72af5b4272a54cf434bea76104a486fa2c45353\"" Sep 13 00:09:15.096979 containerd[1501]: 2025-09-13 00:09:15.044 [WARNING][5563] cni-plugin/k8s.go 598: WorkloadEndpoint does not exist in the datastore, moving forward with the clean up ContainerID="9de479bdd8fcad82feee5c64a72af5b4272a54cf434bea76104a486fa2c45353" WorkloadEndpoint="ci--4081--3--5--n--294a4568b6-k8s-whisker--57d6c7f84c--tpttf-eth0" Sep 13 00:09:15.096979 containerd[1501]: 2025-09-13 00:09:15.045 [INFO][5563] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="9de479bdd8fcad82feee5c64a72af5b4272a54cf434bea76104a486fa2c45353" Sep 13 00:09:15.096979 containerd[1501]: 2025-09-13 00:09:15.045 [INFO][5563] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="9de479bdd8fcad82feee5c64a72af5b4272a54cf434bea76104a486fa2c45353" iface="eth0" netns="" Sep 13 00:09:15.096979 containerd[1501]: 2025-09-13 00:09:15.045 [INFO][5563] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="9de479bdd8fcad82feee5c64a72af5b4272a54cf434bea76104a486fa2c45353" Sep 13 00:09:15.096979 containerd[1501]: 2025-09-13 00:09:15.045 [INFO][5563] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="9de479bdd8fcad82feee5c64a72af5b4272a54cf434bea76104a486fa2c45353" Sep 13 00:09:15.096979 containerd[1501]: 2025-09-13 00:09:15.080 [INFO][5571] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="9de479bdd8fcad82feee5c64a72af5b4272a54cf434bea76104a486fa2c45353" HandleID="k8s-pod-network.9de479bdd8fcad82feee5c64a72af5b4272a54cf434bea76104a486fa2c45353" Workload="ci--4081--3--5--n--294a4568b6-k8s-whisker--57d6c7f84c--tpttf-eth0" Sep 13 00:09:15.096979 containerd[1501]: 2025-09-13 00:09:15.080 [INFO][5571] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 13 00:09:15.096979 containerd[1501]: 2025-09-13 00:09:15.080 [INFO][5571] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 13 00:09:15.096979 containerd[1501]: 2025-09-13 00:09:15.089 [WARNING][5571] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="9de479bdd8fcad82feee5c64a72af5b4272a54cf434bea76104a486fa2c45353" HandleID="k8s-pod-network.9de479bdd8fcad82feee5c64a72af5b4272a54cf434bea76104a486fa2c45353" Workload="ci--4081--3--5--n--294a4568b6-k8s-whisker--57d6c7f84c--tpttf-eth0" Sep 13 00:09:15.096979 containerd[1501]: 2025-09-13 00:09:15.089 [INFO][5571] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="9de479bdd8fcad82feee5c64a72af5b4272a54cf434bea76104a486fa2c45353" HandleID="k8s-pod-network.9de479bdd8fcad82feee5c64a72af5b4272a54cf434bea76104a486fa2c45353" Workload="ci--4081--3--5--n--294a4568b6-k8s-whisker--57d6c7f84c--tpttf-eth0" Sep 13 00:09:15.096979 containerd[1501]: 2025-09-13 00:09:15.090 [INFO][5571] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 13 00:09:15.096979 containerd[1501]: 2025-09-13 00:09:15.093 [INFO][5563] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="9de479bdd8fcad82feee5c64a72af5b4272a54cf434bea76104a486fa2c45353" Sep 13 00:09:15.104504 containerd[1501]: time="2025-09-13T00:09:15.096980702Z" level=info msg="TearDown network for sandbox \"9de479bdd8fcad82feee5c64a72af5b4272a54cf434bea76104a486fa2c45353\" successfully" Sep 13 00:09:15.128043 containerd[1501]: time="2025-09-13T00:09:15.127966245Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"9de479bdd8fcad82feee5c64a72af5b4272a54cf434bea76104a486fa2c45353\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Sep 13 00:09:15.128462 containerd[1501]: time="2025-09-13T00:09:15.128078375Z" level=info msg="RemovePodSandbox \"9de479bdd8fcad82feee5c64a72af5b4272a54cf434bea76104a486fa2c45353\" returns successfully" Sep 13 00:09:15.128626 containerd[1501]: time="2025-09-13T00:09:15.128584416Z" level=info msg="StopPodSandbox for \"f3f76841deaea1b7ce378f2a1523b9b857f562b5d6ba2bed49e181590f793f36\"" Sep 13 00:09:15.201841 containerd[1501]: 2025-09-13 00:09:15.163 [WARNING][5585] cni-plugin/k8s.go 604: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="f3f76841deaea1b7ce378f2a1523b9b857f562b5d6ba2bed49e181590f793f36" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--5--n--294a4568b6-k8s-coredns--674b8bbfcf--ts6l4-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"807cba0d-efbf-495e-bbeb-ab5413159760", ResourceVersion:"940", Generation:0, CreationTimestamp:time.Date(2025, time.September, 13, 0, 8, 19, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-5-n-294a4568b6", ContainerID:"32cb9490f7fee95bc363c6381c707ea9caa62c1397abd7d20ac28e4f70d5b6f0", Pod:"coredns-674b8bbfcf-ts6l4", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.44.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calida62db25490", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 13 00:09:15.201841 containerd[1501]: 2025-09-13 00:09:15.164 [INFO][5585] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="f3f76841deaea1b7ce378f2a1523b9b857f562b5d6ba2bed49e181590f793f36" Sep 13 00:09:15.201841 containerd[1501]: 2025-09-13 00:09:15.164 [INFO][5585] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="f3f76841deaea1b7ce378f2a1523b9b857f562b5d6ba2bed49e181590f793f36" iface="eth0" netns="" Sep 13 00:09:15.201841 containerd[1501]: 2025-09-13 00:09:15.164 [INFO][5585] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="f3f76841deaea1b7ce378f2a1523b9b857f562b5d6ba2bed49e181590f793f36" Sep 13 00:09:15.201841 containerd[1501]: 2025-09-13 00:09:15.164 [INFO][5585] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="f3f76841deaea1b7ce378f2a1523b9b857f562b5d6ba2bed49e181590f793f36" Sep 13 00:09:15.201841 containerd[1501]: 2025-09-13 00:09:15.187 [INFO][5592] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="f3f76841deaea1b7ce378f2a1523b9b857f562b5d6ba2bed49e181590f793f36" HandleID="k8s-pod-network.f3f76841deaea1b7ce378f2a1523b9b857f562b5d6ba2bed49e181590f793f36" Workload="ci--4081--3--5--n--294a4568b6-k8s-coredns--674b8bbfcf--ts6l4-eth0" Sep 13 00:09:15.201841 containerd[1501]: 2025-09-13 00:09:15.187 [INFO][5592] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 13 00:09:15.201841 containerd[1501]: 2025-09-13 00:09:15.187 [INFO][5592] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 13 00:09:15.201841 containerd[1501]: 2025-09-13 00:09:15.195 [WARNING][5592] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="f3f76841deaea1b7ce378f2a1523b9b857f562b5d6ba2bed49e181590f793f36" HandleID="k8s-pod-network.f3f76841deaea1b7ce378f2a1523b9b857f562b5d6ba2bed49e181590f793f36" Workload="ci--4081--3--5--n--294a4568b6-k8s-coredns--674b8bbfcf--ts6l4-eth0" Sep 13 00:09:15.201841 containerd[1501]: 2025-09-13 00:09:15.195 [INFO][5592] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="f3f76841deaea1b7ce378f2a1523b9b857f562b5d6ba2bed49e181590f793f36" HandleID="k8s-pod-network.f3f76841deaea1b7ce378f2a1523b9b857f562b5d6ba2bed49e181590f793f36" Workload="ci--4081--3--5--n--294a4568b6-k8s-coredns--674b8bbfcf--ts6l4-eth0" Sep 13 00:09:15.201841 containerd[1501]: 2025-09-13 00:09:15.197 [INFO][5592] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 13 00:09:15.201841 containerd[1501]: 2025-09-13 00:09:15.199 [INFO][5585] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="f3f76841deaea1b7ce378f2a1523b9b857f562b5d6ba2bed49e181590f793f36" Sep 13 00:09:15.202879 containerd[1501]: time="2025-09-13T00:09:15.201870894Z" level=info msg="TearDown network for sandbox \"f3f76841deaea1b7ce378f2a1523b9b857f562b5d6ba2bed49e181590f793f36\" successfully" Sep 13 00:09:15.202879 containerd[1501]: time="2025-09-13T00:09:15.201900329Z" level=info msg="StopPodSandbox for \"f3f76841deaea1b7ce378f2a1523b9b857f562b5d6ba2bed49e181590f793f36\" returns successfully" Sep 13 00:09:15.203859 containerd[1501]: time="2025-09-13T00:09:15.203820224Z" level=info msg="RemovePodSandbox for \"f3f76841deaea1b7ce378f2a1523b9b857f562b5d6ba2bed49e181590f793f36\"" Sep 13 00:09:15.203907 containerd[1501]: time="2025-09-13T00:09:15.203891818Z" level=info msg="Forcibly stopping sandbox \"f3f76841deaea1b7ce378f2a1523b9b857f562b5d6ba2bed49e181590f793f36\"" Sep 13 00:09:15.275203 containerd[1501]: 2025-09-13 00:09:15.245 [WARNING][5607] cni-plugin/k8s.go 604: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="f3f76841deaea1b7ce378f2a1523b9b857f562b5d6ba2bed49e181590f793f36" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--5--n--294a4568b6-k8s-coredns--674b8bbfcf--ts6l4-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"807cba0d-efbf-495e-bbeb-ab5413159760", ResourceVersion:"940", Generation:0, CreationTimestamp:time.Date(2025, time.September, 13, 0, 8, 19, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-5-n-294a4568b6", ContainerID:"32cb9490f7fee95bc363c6381c707ea9caa62c1397abd7d20ac28e4f70d5b6f0", Pod:"coredns-674b8bbfcf-ts6l4", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.44.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calida62db25490", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 13 00:09:15.275203 containerd[1501]: 2025-09-13 00:09:15.245 [INFO][5607] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="f3f76841deaea1b7ce378f2a1523b9b857f562b5d6ba2bed49e181590f793f36" Sep 13 00:09:15.275203 containerd[1501]: 2025-09-13 00:09:15.245 [INFO][5607] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="f3f76841deaea1b7ce378f2a1523b9b857f562b5d6ba2bed49e181590f793f36" iface="eth0" netns="" Sep 13 00:09:15.275203 containerd[1501]: 2025-09-13 00:09:15.245 [INFO][5607] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="f3f76841deaea1b7ce378f2a1523b9b857f562b5d6ba2bed49e181590f793f36" Sep 13 00:09:15.275203 containerd[1501]: 2025-09-13 00:09:15.245 [INFO][5607] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="f3f76841deaea1b7ce378f2a1523b9b857f562b5d6ba2bed49e181590f793f36" Sep 13 00:09:15.275203 containerd[1501]: 2025-09-13 00:09:15.262 [INFO][5614] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="f3f76841deaea1b7ce378f2a1523b9b857f562b5d6ba2bed49e181590f793f36" HandleID="k8s-pod-network.f3f76841deaea1b7ce378f2a1523b9b857f562b5d6ba2bed49e181590f793f36" Workload="ci--4081--3--5--n--294a4568b6-k8s-coredns--674b8bbfcf--ts6l4-eth0" Sep 13 00:09:15.275203 containerd[1501]: 2025-09-13 00:09:15.263 [INFO][5614] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 13 00:09:15.275203 containerd[1501]: 2025-09-13 00:09:15.263 [INFO][5614] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 13 00:09:15.275203 containerd[1501]: 2025-09-13 00:09:15.269 [WARNING][5614] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="f3f76841deaea1b7ce378f2a1523b9b857f562b5d6ba2bed49e181590f793f36" HandleID="k8s-pod-network.f3f76841deaea1b7ce378f2a1523b9b857f562b5d6ba2bed49e181590f793f36" Workload="ci--4081--3--5--n--294a4568b6-k8s-coredns--674b8bbfcf--ts6l4-eth0" Sep 13 00:09:15.275203 containerd[1501]: 2025-09-13 00:09:15.269 [INFO][5614] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="f3f76841deaea1b7ce378f2a1523b9b857f562b5d6ba2bed49e181590f793f36" HandleID="k8s-pod-network.f3f76841deaea1b7ce378f2a1523b9b857f562b5d6ba2bed49e181590f793f36" Workload="ci--4081--3--5--n--294a4568b6-k8s-coredns--674b8bbfcf--ts6l4-eth0" Sep 13 00:09:15.275203 containerd[1501]: 2025-09-13 00:09:15.271 [INFO][5614] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 13 00:09:15.275203 containerd[1501]: 2025-09-13 00:09:15.273 [INFO][5607] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="f3f76841deaea1b7ce378f2a1523b9b857f562b5d6ba2bed49e181590f793f36" Sep 13 00:09:15.275880 containerd[1501]: time="2025-09-13T00:09:15.275256820Z" level=info msg="TearDown network for sandbox \"f3f76841deaea1b7ce378f2a1523b9b857f562b5d6ba2bed49e181590f793f36\" successfully" Sep 13 00:09:15.278019 containerd[1501]: time="2025-09-13T00:09:15.277992104Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"f3f76841deaea1b7ce378f2a1523b9b857f562b5d6ba2bed49e181590f793f36\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Sep 13 00:09:15.278264 containerd[1501]: time="2025-09-13T00:09:15.278092463Z" level=info msg="RemovePodSandbox \"f3f76841deaea1b7ce378f2a1523b9b857f562b5d6ba2bed49e181590f793f36\" returns successfully" Sep 13 00:09:15.278537 containerd[1501]: time="2025-09-13T00:09:15.278499687Z" level=info msg="StopPodSandbox for \"c1d75c013bdcf693fcd2a3ddcc3e013d373b11c7d8c24377f3bb7e3da51ee478\"" Sep 13 00:09:15.342753 containerd[1501]: 2025-09-13 00:09:15.310 [WARNING][5628] cni-plugin/k8s.go 604: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="c1d75c013bdcf693fcd2a3ddcc3e013d373b11c7d8c24377f3bb7e3da51ee478" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--5--n--294a4568b6-k8s-calico--apiserver--9c789df89--chbwf-eth0", GenerateName:"calico-apiserver-9c789df89-", Namespace:"calico-apiserver", SelfLink:"", UID:"66c3fc14-19e6-48c3-91eb-518f5181421b", ResourceVersion:"996", Generation:0, CreationTimestamp:time.Date(2025, time.September, 13, 0, 8, 29, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"9c789df89", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-5-n-294a4568b6", ContainerID:"3a9ad9d11477efcdcd06b6b044f06d51ec462d5c936e47f2d742a40029171596", Pod:"calico-apiserver-9c789df89-chbwf", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.44.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali936bbd726b3", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 13 00:09:15.342753 containerd[1501]: 2025-09-13 00:09:15.310 [INFO][5628] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="c1d75c013bdcf693fcd2a3ddcc3e013d373b11c7d8c24377f3bb7e3da51ee478" Sep 13 00:09:15.342753 containerd[1501]: 2025-09-13 00:09:15.310 [INFO][5628] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="c1d75c013bdcf693fcd2a3ddcc3e013d373b11c7d8c24377f3bb7e3da51ee478" iface="eth0" netns="" Sep 13 00:09:15.342753 containerd[1501]: 2025-09-13 00:09:15.310 [INFO][5628] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="c1d75c013bdcf693fcd2a3ddcc3e013d373b11c7d8c24377f3bb7e3da51ee478" Sep 13 00:09:15.342753 containerd[1501]: 2025-09-13 00:09:15.310 [INFO][5628] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="c1d75c013bdcf693fcd2a3ddcc3e013d373b11c7d8c24377f3bb7e3da51ee478" Sep 13 00:09:15.342753 containerd[1501]: 2025-09-13 00:09:15.329 [INFO][5635] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="c1d75c013bdcf693fcd2a3ddcc3e013d373b11c7d8c24377f3bb7e3da51ee478" HandleID="k8s-pod-network.c1d75c013bdcf693fcd2a3ddcc3e013d373b11c7d8c24377f3bb7e3da51ee478" Workload="ci--4081--3--5--n--294a4568b6-k8s-calico--apiserver--9c789df89--chbwf-eth0" Sep 13 00:09:15.342753 containerd[1501]: 2025-09-13 00:09:15.329 [INFO][5635] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 13 00:09:15.342753 containerd[1501]: 2025-09-13 00:09:15.330 [INFO][5635] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 13 00:09:15.342753 containerd[1501]: 2025-09-13 00:09:15.337 [WARNING][5635] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="c1d75c013bdcf693fcd2a3ddcc3e013d373b11c7d8c24377f3bb7e3da51ee478" HandleID="k8s-pod-network.c1d75c013bdcf693fcd2a3ddcc3e013d373b11c7d8c24377f3bb7e3da51ee478" Workload="ci--4081--3--5--n--294a4568b6-k8s-calico--apiserver--9c789df89--chbwf-eth0" Sep 13 00:09:15.342753 containerd[1501]: 2025-09-13 00:09:15.337 [INFO][5635] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="c1d75c013bdcf693fcd2a3ddcc3e013d373b11c7d8c24377f3bb7e3da51ee478" HandleID="k8s-pod-network.c1d75c013bdcf693fcd2a3ddcc3e013d373b11c7d8c24377f3bb7e3da51ee478" Workload="ci--4081--3--5--n--294a4568b6-k8s-calico--apiserver--9c789df89--chbwf-eth0" Sep 13 00:09:15.342753 containerd[1501]: 2025-09-13 00:09:15.338 [INFO][5635] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 13 00:09:15.342753 containerd[1501]: 2025-09-13 00:09:15.341 [INFO][5628] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="c1d75c013bdcf693fcd2a3ddcc3e013d373b11c7d8c24377f3bb7e3da51ee478" Sep 13 00:09:15.343802 containerd[1501]: time="2025-09-13T00:09:15.342793232Z" level=info msg="TearDown network for sandbox \"c1d75c013bdcf693fcd2a3ddcc3e013d373b11c7d8c24377f3bb7e3da51ee478\" successfully" Sep 13 00:09:15.343802 containerd[1501]: time="2025-09-13T00:09:15.342818520Z" level=info msg="StopPodSandbox for \"c1d75c013bdcf693fcd2a3ddcc3e013d373b11c7d8c24377f3bb7e3da51ee478\" returns successfully" Sep 13 00:09:15.343802 containerd[1501]: time="2025-09-13T00:09:15.343251122Z" level=info msg="RemovePodSandbox for \"c1d75c013bdcf693fcd2a3ddcc3e013d373b11c7d8c24377f3bb7e3da51ee478\"" Sep 13 00:09:15.343802 containerd[1501]: time="2025-09-13T00:09:15.343277041Z" level=info msg="Forcibly stopping sandbox \"c1d75c013bdcf693fcd2a3ddcc3e013d373b11c7d8c24377f3bb7e3da51ee478\"" Sep 13 00:09:15.419280 containerd[1501]: 2025-09-13 00:09:15.373 [WARNING][5649] cni-plugin/k8s.go 604: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="c1d75c013bdcf693fcd2a3ddcc3e013d373b11c7d8c24377f3bb7e3da51ee478" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--5--n--294a4568b6-k8s-calico--apiserver--9c789df89--chbwf-eth0", GenerateName:"calico-apiserver-9c789df89-", Namespace:"calico-apiserver", SelfLink:"", UID:"66c3fc14-19e6-48c3-91eb-518f5181421b", ResourceVersion:"996", Generation:0, CreationTimestamp:time.Date(2025, time.September, 13, 0, 8, 29, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"9c789df89", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-5-n-294a4568b6", ContainerID:"3a9ad9d11477efcdcd06b6b044f06d51ec462d5c936e47f2d742a40029171596", Pod:"calico-apiserver-9c789df89-chbwf", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.44.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali936bbd726b3", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 13 00:09:15.419280 containerd[1501]: 2025-09-13 00:09:15.374 [INFO][5649] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="c1d75c013bdcf693fcd2a3ddcc3e013d373b11c7d8c24377f3bb7e3da51ee478" Sep 13 00:09:15.419280 containerd[1501]: 2025-09-13 00:09:15.374 [INFO][5649] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="c1d75c013bdcf693fcd2a3ddcc3e013d373b11c7d8c24377f3bb7e3da51ee478" iface="eth0" netns="" Sep 13 00:09:15.419280 containerd[1501]: 2025-09-13 00:09:15.374 [INFO][5649] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="c1d75c013bdcf693fcd2a3ddcc3e013d373b11c7d8c24377f3bb7e3da51ee478" Sep 13 00:09:15.419280 containerd[1501]: 2025-09-13 00:09:15.374 [INFO][5649] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="c1d75c013bdcf693fcd2a3ddcc3e013d373b11c7d8c24377f3bb7e3da51ee478" Sep 13 00:09:15.419280 containerd[1501]: 2025-09-13 00:09:15.408 [INFO][5656] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="c1d75c013bdcf693fcd2a3ddcc3e013d373b11c7d8c24377f3bb7e3da51ee478" HandleID="k8s-pod-network.c1d75c013bdcf693fcd2a3ddcc3e013d373b11c7d8c24377f3bb7e3da51ee478" Workload="ci--4081--3--5--n--294a4568b6-k8s-calico--apiserver--9c789df89--chbwf-eth0" Sep 13 00:09:15.419280 containerd[1501]: 2025-09-13 00:09:15.408 [INFO][5656] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 13 00:09:15.419280 containerd[1501]: 2025-09-13 00:09:15.408 [INFO][5656] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 13 00:09:15.419280 containerd[1501]: 2025-09-13 00:09:15.414 [WARNING][5656] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="c1d75c013bdcf693fcd2a3ddcc3e013d373b11c7d8c24377f3bb7e3da51ee478" HandleID="k8s-pod-network.c1d75c013bdcf693fcd2a3ddcc3e013d373b11c7d8c24377f3bb7e3da51ee478" Workload="ci--4081--3--5--n--294a4568b6-k8s-calico--apiserver--9c789df89--chbwf-eth0" Sep 13 00:09:15.419280 containerd[1501]: 2025-09-13 00:09:15.414 [INFO][5656] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="c1d75c013bdcf693fcd2a3ddcc3e013d373b11c7d8c24377f3bb7e3da51ee478" HandleID="k8s-pod-network.c1d75c013bdcf693fcd2a3ddcc3e013d373b11c7d8c24377f3bb7e3da51ee478" Workload="ci--4081--3--5--n--294a4568b6-k8s-calico--apiserver--9c789df89--chbwf-eth0" Sep 13 00:09:15.419280 containerd[1501]: 2025-09-13 00:09:15.415 [INFO][5656] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 13 00:09:15.419280 containerd[1501]: 2025-09-13 00:09:15.417 [INFO][5649] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="c1d75c013bdcf693fcd2a3ddcc3e013d373b11c7d8c24377f3bb7e3da51ee478" Sep 13 00:09:15.419280 containerd[1501]: time="2025-09-13T00:09:15.419258369Z" level=info msg="TearDown network for sandbox \"c1d75c013bdcf693fcd2a3ddcc3e013d373b11c7d8c24377f3bb7e3da51ee478\" successfully" Sep 13 00:09:15.424562 containerd[1501]: time="2025-09-13T00:09:15.422899925Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"c1d75c013bdcf693fcd2a3ddcc3e013d373b11c7d8c24377f3bb7e3da51ee478\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Sep 13 00:09:15.424562 containerd[1501]: time="2025-09-13T00:09:15.422987870Z" level=info msg="RemovePodSandbox \"c1d75c013bdcf693fcd2a3ddcc3e013d373b11c7d8c24377f3bb7e3da51ee478\" returns successfully" Sep 13 00:09:15.424562 containerd[1501]: time="2025-09-13T00:09:15.423421975Z" level=info msg="StopPodSandbox for \"70302e1b99e4a70b05a0e6a485a959769c320e7396db1eccf4b6e32b8c3ba19c\"" Sep 13 00:09:15.480734 containerd[1501]: 2025-09-13 00:09:15.451 [WARNING][5670] cni-plugin/k8s.go 604: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="70302e1b99e4a70b05a0e6a485a959769c320e7396db1eccf4b6e32b8c3ba19c" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--5--n--294a4568b6-k8s-coredns--674b8bbfcf--q6mzd-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"3ba023c8-210d-4e58-bf51-ba6b7484e570", ResourceVersion:"980", Generation:0, CreationTimestamp:time.Date(2025, time.September, 13, 0, 8, 19, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-5-n-294a4568b6", ContainerID:"acfa6321962f6322b063ced1b193c73c093756247aa55f09cbec0abaac80aa34", Pod:"coredns-674b8bbfcf-q6mzd", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.44.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calicc9c3689856", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 13 00:09:15.480734 containerd[1501]: 2025-09-13 00:09:15.451 [INFO][5670] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="70302e1b99e4a70b05a0e6a485a959769c320e7396db1eccf4b6e32b8c3ba19c" Sep 13 00:09:15.480734 containerd[1501]: 2025-09-13 00:09:15.451 [INFO][5670] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="70302e1b99e4a70b05a0e6a485a959769c320e7396db1eccf4b6e32b8c3ba19c" iface="eth0" netns="" Sep 13 00:09:15.480734 containerd[1501]: 2025-09-13 00:09:15.451 [INFO][5670] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="70302e1b99e4a70b05a0e6a485a959769c320e7396db1eccf4b6e32b8c3ba19c" Sep 13 00:09:15.480734 containerd[1501]: 2025-09-13 00:09:15.451 [INFO][5670] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="70302e1b99e4a70b05a0e6a485a959769c320e7396db1eccf4b6e32b8c3ba19c" Sep 13 00:09:15.480734 containerd[1501]: 2025-09-13 00:09:15.470 [INFO][5678] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="70302e1b99e4a70b05a0e6a485a959769c320e7396db1eccf4b6e32b8c3ba19c" HandleID="k8s-pod-network.70302e1b99e4a70b05a0e6a485a959769c320e7396db1eccf4b6e32b8c3ba19c" Workload="ci--4081--3--5--n--294a4568b6-k8s-coredns--674b8bbfcf--q6mzd-eth0" Sep 13 00:09:15.480734 containerd[1501]: 2025-09-13 00:09:15.470 [INFO][5678] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 13 00:09:15.480734 containerd[1501]: 2025-09-13 00:09:15.470 [INFO][5678] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 13 00:09:15.480734 containerd[1501]: 2025-09-13 00:09:15.475 [WARNING][5678] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="70302e1b99e4a70b05a0e6a485a959769c320e7396db1eccf4b6e32b8c3ba19c" HandleID="k8s-pod-network.70302e1b99e4a70b05a0e6a485a959769c320e7396db1eccf4b6e32b8c3ba19c" Workload="ci--4081--3--5--n--294a4568b6-k8s-coredns--674b8bbfcf--q6mzd-eth0" Sep 13 00:09:15.480734 containerd[1501]: 2025-09-13 00:09:15.475 [INFO][5678] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="70302e1b99e4a70b05a0e6a485a959769c320e7396db1eccf4b6e32b8c3ba19c" HandleID="k8s-pod-network.70302e1b99e4a70b05a0e6a485a959769c320e7396db1eccf4b6e32b8c3ba19c" Workload="ci--4081--3--5--n--294a4568b6-k8s-coredns--674b8bbfcf--q6mzd-eth0" Sep 13 00:09:15.480734 containerd[1501]: 2025-09-13 00:09:15.476 [INFO][5678] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 13 00:09:15.480734 containerd[1501]: 2025-09-13 00:09:15.479 [INFO][5670] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="70302e1b99e4a70b05a0e6a485a959769c320e7396db1eccf4b6e32b8c3ba19c" Sep 13 00:09:15.481386 containerd[1501]: time="2025-09-13T00:09:15.480766174Z" level=info msg="TearDown network for sandbox \"70302e1b99e4a70b05a0e6a485a959769c320e7396db1eccf4b6e32b8c3ba19c\" successfully" Sep 13 00:09:15.481386 containerd[1501]: time="2025-09-13T00:09:15.480792463Z" level=info msg="StopPodSandbox for \"70302e1b99e4a70b05a0e6a485a959769c320e7396db1eccf4b6e32b8c3ba19c\" returns successfully" Sep 13 00:09:15.481386 containerd[1501]: time="2025-09-13T00:09:15.481286310Z" level=info msg="RemovePodSandbox for \"70302e1b99e4a70b05a0e6a485a959769c320e7396db1eccf4b6e32b8c3ba19c\"" Sep 13 00:09:15.481386 containerd[1501]: time="2025-09-13T00:09:15.481314002Z" level=info msg="Forcibly stopping sandbox \"70302e1b99e4a70b05a0e6a485a959769c320e7396db1eccf4b6e32b8c3ba19c\"" Sep 13 00:09:15.546512 containerd[1501]: 2025-09-13 00:09:15.512 [WARNING][5693] cni-plugin/k8s.go 604: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="70302e1b99e4a70b05a0e6a485a959769c320e7396db1eccf4b6e32b8c3ba19c" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--5--n--294a4568b6-k8s-coredns--674b8bbfcf--q6mzd-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"3ba023c8-210d-4e58-bf51-ba6b7484e570", ResourceVersion:"980", Generation:0, CreationTimestamp:time.Date(2025, time.September, 13, 0, 8, 19, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-5-n-294a4568b6", ContainerID:"acfa6321962f6322b063ced1b193c73c093756247aa55f09cbec0abaac80aa34", Pod:"coredns-674b8bbfcf-q6mzd", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.44.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calicc9c3689856", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 13 00:09:15.546512 containerd[1501]: 2025-09-13 00:09:15.512 [INFO][5693] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="70302e1b99e4a70b05a0e6a485a959769c320e7396db1eccf4b6e32b8c3ba19c" Sep 13 00:09:15.546512 containerd[1501]: 2025-09-13 00:09:15.512 [INFO][5693] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="70302e1b99e4a70b05a0e6a485a959769c320e7396db1eccf4b6e32b8c3ba19c" iface="eth0" netns="" Sep 13 00:09:15.546512 containerd[1501]: 2025-09-13 00:09:15.512 [INFO][5693] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="70302e1b99e4a70b05a0e6a485a959769c320e7396db1eccf4b6e32b8c3ba19c" Sep 13 00:09:15.546512 containerd[1501]: 2025-09-13 00:09:15.512 [INFO][5693] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="70302e1b99e4a70b05a0e6a485a959769c320e7396db1eccf4b6e32b8c3ba19c" Sep 13 00:09:15.546512 containerd[1501]: 2025-09-13 00:09:15.532 [INFO][5700] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="70302e1b99e4a70b05a0e6a485a959769c320e7396db1eccf4b6e32b8c3ba19c" HandleID="k8s-pod-network.70302e1b99e4a70b05a0e6a485a959769c320e7396db1eccf4b6e32b8c3ba19c" Workload="ci--4081--3--5--n--294a4568b6-k8s-coredns--674b8bbfcf--q6mzd-eth0" Sep 13 00:09:15.546512 containerd[1501]: 2025-09-13 00:09:15.532 [INFO][5700] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 13 00:09:15.546512 containerd[1501]: 2025-09-13 00:09:15.532 [INFO][5700] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 13 00:09:15.546512 containerd[1501]: 2025-09-13 00:09:15.539 [WARNING][5700] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="70302e1b99e4a70b05a0e6a485a959769c320e7396db1eccf4b6e32b8c3ba19c" HandleID="k8s-pod-network.70302e1b99e4a70b05a0e6a485a959769c320e7396db1eccf4b6e32b8c3ba19c" Workload="ci--4081--3--5--n--294a4568b6-k8s-coredns--674b8bbfcf--q6mzd-eth0" Sep 13 00:09:15.546512 containerd[1501]: 2025-09-13 00:09:15.540 [INFO][5700] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="70302e1b99e4a70b05a0e6a485a959769c320e7396db1eccf4b6e32b8c3ba19c" HandleID="k8s-pod-network.70302e1b99e4a70b05a0e6a485a959769c320e7396db1eccf4b6e32b8c3ba19c" Workload="ci--4081--3--5--n--294a4568b6-k8s-coredns--674b8bbfcf--q6mzd-eth0" Sep 13 00:09:15.546512 containerd[1501]: 2025-09-13 00:09:15.541 [INFO][5700] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 13 00:09:15.546512 containerd[1501]: 2025-09-13 00:09:15.543 [INFO][5693] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="70302e1b99e4a70b05a0e6a485a959769c320e7396db1eccf4b6e32b8c3ba19c" Sep 13 00:09:15.546512 containerd[1501]: time="2025-09-13T00:09:15.545250616Z" level=info msg="TearDown network for sandbox \"70302e1b99e4a70b05a0e6a485a959769c320e7396db1eccf4b6e32b8c3ba19c\" successfully" Sep 13 00:09:15.548494 containerd[1501]: time="2025-09-13T00:09:15.548458329Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"70302e1b99e4a70b05a0e6a485a959769c320e7396db1eccf4b6e32b8c3ba19c\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Sep 13 00:09:15.548605 containerd[1501]: time="2025-09-13T00:09:15.548525365Z" level=info msg="RemovePodSandbox \"70302e1b99e4a70b05a0e6a485a959769c320e7396db1eccf4b6e32b8c3ba19c\" returns successfully" Sep 13 00:09:15.549340 containerd[1501]: time="2025-09-13T00:09:15.549111225Z" level=info msg="StopPodSandbox for \"545275614684121451b5fa9dd505392c25919306491a0a341953c2c0ea40c9e5\"" Sep 13 00:09:15.609012 containerd[1501]: 2025-09-13 00:09:15.580 [WARNING][5714] cni-plugin/k8s.go 604: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="545275614684121451b5fa9dd505392c25919306491a0a341953c2c0ea40c9e5" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--5--n--294a4568b6-k8s-calico--apiserver--9c789df89--kvgsc-eth0", GenerateName:"calico-apiserver-9c789df89-", Namespace:"calico-apiserver", SelfLink:"", UID:"81fa2d11-a454-4bc0-824e-4ee1c118d1e3", ResourceVersion:"1012", Generation:0, CreationTimestamp:time.Date(2025, time.September, 13, 0, 8, 29, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"9c789df89", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-5-n-294a4568b6", ContainerID:"acf9d09bd39d90b1a9851dd9a15eebe92d6b1f6854703a56a9187441b19428c1", Pod:"calico-apiserver-9c789df89-kvgsc", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.44.135/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calib93f8e9ce6e", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 13 00:09:15.609012 containerd[1501]: 2025-09-13 00:09:15.580 [INFO][5714] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="545275614684121451b5fa9dd505392c25919306491a0a341953c2c0ea40c9e5" Sep 13 00:09:15.609012 containerd[1501]: 2025-09-13 00:09:15.581 [INFO][5714] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="545275614684121451b5fa9dd505392c25919306491a0a341953c2c0ea40c9e5" iface="eth0" netns="" Sep 13 00:09:15.609012 containerd[1501]: 2025-09-13 00:09:15.581 [INFO][5714] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="545275614684121451b5fa9dd505392c25919306491a0a341953c2c0ea40c9e5" Sep 13 00:09:15.609012 containerd[1501]: 2025-09-13 00:09:15.581 [INFO][5714] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="545275614684121451b5fa9dd505392c25919306491a0a341953c2c0ea40c9e5" Sep 13 00:09:15.609012 containerd[1501]: 2025-09-13 00:09:15.597 [INFO][5721] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="545275614684121451b5fa9dd505392c25919306491a0a341953c2c0ea40c9e5" HandleID="k8s-pod-network.545275614684121451b5fa9dd505392c25919306491a0a341953c2c0ea40c9e5" Workload="ci--4081--3--5--n--294a4568b6-k8s-calico--apiserver--9c789df89--kvgsc-eth0" Sep 13 00:09:15.609012 containerd[1501]: 2025-09-13 00:09:15.598 [INFO][5721] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 13 00:09:15.609012 containerd[1501]: 2025-09-13 00:09:15.598 [INFO][5721] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 13 00:09:15.609012 containerd[1501]: 2025-09-13 00:09:15.603 [WARNING][5721] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="545275614684121451b5fa9dd505392c25919306491a0a341953c2c0ea40c9e5" HandleID="k8s-pod-network.545275614684121451b5fa9dd505392c25919306491a0a341953c2c0ea40c9e5" Workload="ci--4081--3--5--n--294a4568b6-k8s-calico--apiserver--9c789df89--kvgsc-eth0" Sep 13 00:09:15.609012 containerd[1501]: 2025-09-13 00:09:15.603 [INFO][5721] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="545275614684121451b5fa9dd505392c25919306491a0a341953c2c0ea40c9e5" HandleID="k8s-pod-network.545275614684121451b5fa9dd505392c25919306491a0a341953c2c0ea40c9e5" Workload="ci--4081--3--5--n--294a4568b6-k8s-calico--apiserver--9c789df89--kvgsc-eth0" Sep 13 00:09:15.609012 containerd[1501]: 2025-09-13 00:09:15.604 [INFO][5721] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 13 00:09:15.609012 containerd[1501]: 2025-09-13 00:09:15.606 [INFO][5714] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="545275614684121451b5fa9dd505392c25919306491a0a341953c2c0ea40c9e5" Sep 13 00:09:15.609012 containerd[1501]: time="2025-09-13T00:09:15.609095460Z" level=info msg="TearDown network for sandbox \"545275614684121451b5fa9dd505392c25919306491a0a341953c2c0ea40c9e5\" successfully" Sep 13 00:09:15.609012 containerd[1501]: time="2025-09-13T00:09:15.609119084Z" level=info msg="StopPodSandbox for \"545275614684121451b5fa9dd505392c25919306491a0a341953c2c0ea40c9e5\" returns successfully" Sep 13 00:09:15.610160 containerd[1501]: time="2025-09-13T00:09:15.609777871Z" level=info msg="RemovePodSandbox for \"545275614684121451b5fa9dd505392c25919306491a0a341953c2c0ea40c9e5\"" Sep 13 00:09:15.610160 containerd[1501]: time="2025-09-13T00:09:15.609804311Z" level=info msg="Forcibly stopping sandbox \"545275614684121451b5fa9dd505392c25919306491a0a341953c2c0ea40c9e5\"" Sep 13 00:09:15.674968 containerd[1501]: 2025-09-13 00:09:15.637 [WARNING][5736] cni-plugin/k8s.go 604: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="545275614684121451b5fa9dd505392c25919306491a0a341953c2c0ea40c9e5" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--5--n--294a4568b6-k8s-calico--apiserver--9c789df89--kvgsc-eth0", GenerateName:"calico-apiserver-9c789df89-", Namespace:"calico-apiserver", SelfLink:"", UID:"81fa2d11-a454-4bc0-824e-4ee1c118d1e3", ResourceVersion:"1012", Generation:0, CreationTimestamp:time.Date(2025, time.September, 13, 0, 8, 29, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"9c789df89", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-5-n-294a4568b6", ContainerID:"acf9d09bd39d90b1a9851dd9a15eebe92d6b1f6854703a56a9187441b19428c1", Pod:"calico-apiserver-9c789df89-kvgsc", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.44.135/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calib93f8e9ce6e", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 13 00:09:15.674968 containerd[1501]: 2025-09-13 00:09:15.638 [INFO][5736] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="545275614684121451b5fa9dd505392c25919306491a0a341953c2c0ea40c9e5" Sep 13 00:09:15.674968 containerd[1501]: 2025-09-13 00:09:15.638 [INFO][5736] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="545275614684121451b5fa9dd505392c25919306491a0a341953c2c0ea40c9e5" iface="eth0" netns="" Sep 13 00:09:15.674968 containerd[1501]: 2025-09-13 00:09:15.638 [INFO][5736] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="545275614684121451b5fa9dd505392c25919306491a0a341953c2c0ea40c9e5" Sep 13 00:09:15.674968 containerd[1501]: 2025-09-13 00:09:15.638 [INFO][5736] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="545275614684121451b5fa9dd505392c25919306491a0a341953c2c0ea40c9e5" Sep 13 00:09:15.674968 containerd[1501]: 2025-09-13 00:09:15.663 [INFO][5743] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="545275614684121451b5fa9dd505392c25919306491a0a341953c2c0ea40c9e5" HandleID="k8s-pod-network.545275614684121451b5fa9dd505392c25919306491a0a341953c2c0ea40c9e5" Workload="ci--4081--3--5--n--294a4568b6-k8s-calico--apiserver--9c789df89--kvgsc-eth0" Sep 13 00:09:15.674968 containerd[1501]: 2025-09-13 00:09:15.663 [INFO][5743] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 13 00:09:15.674968 containerd[1501]: 2025-09-13 00:09:15.663 [INFO][5743] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 13 00:09:15.674968 containerd[1501]: 2025-09-13 00:09:15.668 [WARNING][5743] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="545275614684121451b5fa9dd505392c25919306491a0a341953c2c0ea40c9e5" HandleID="k8s-pod-network.545275614684121451b5fa9dd505392c25919306491a0a341953c2c0ea40c9e5" Workload="ci--4081--3--5--n--294a4568b6-k8s-calico--apiserver--9c789df89--kvgsc-eth0" Sep 13 00:09:15.674968 containerd[1501]: 2025-09-13 00:09:15.668 [INFO][5743] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="545275614684121451b5fa9dd505392c25919306491a0a341953c2c0ea40c9e5" HandleID="k8s-pod-network.545275614684121451b5fa9dd505392c25919306491a0a341953c2c0ea40c9e5" Workload="ci--4081--3--5--n--294a4568b6-k8s-calico--apiserver--9c789df89--kvgsc-eth0" Sep 13 00:09:15.674968 containerd[1501]: 2025-09-13 00:09:15.670 [INFO][5743] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 13 00:09:15.674968 containerd[1501]: 2025-09-13 00:09:15.672 [INFO][5736] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="545275614684121451b5fa9dd505392c25919306491a0a341953c2c0ea40c9e5" Sep 13 00:09:15.675732 containerd[1501]: time="2025-09-13T00:09:15.675007083Z" level=info msg="TearDown network for sandbox \"545275614684121451b5fa9dd505392c25919306491a0a341953c2c0ea40c9e5\" successfully" Sep 13 00:09:15.693509 containerd[1501]: time="2025-09-13T00:09:15.693459736Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"545275614684121451b5fa9dd505392c25919306491a0a341953c2c0ea40c9e5\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Sep 13 00:09:15.693581 containerd[1501]: time="2025-09-13T00:09:15.693528795Z" level=info msg="RemovePodSandbox \"545275614684121451b5fa9dd505392c25919306491a0a341953c2c0ea40c9e5\" returns successfully" Sep 13 00:09:18.026007 systemd[1]: run-containerd-runc-k8s.io-3cfa10dd4f9f1ae5174cdb858108f237a2c81852a624b8c7d404647bc2ba0853-runc.ai2fzB.mount: Deactivated successfully. Sep 13 00:10:05.425179 update_engine[1481]: I20250913 00:10:05.425063 1481 prefs.cc:52] certificate-report-to-send-update not present in /var/lib/update_engine/prefs Sep 13 00:10:05.425179 update_engine[1481]: I20250913 00:10:05.425177 1481 prefs.cc:52] certificate-report-to-send-download not present in /var/lib/update_engine/prefs Sep 13 00:10:05.472959 update_engine[1481]: I20250913 00:10:05.454679 1481 prefs.cc:52] aleph-version not present in /var/lib/update_engine/prefs Sep 13 00:10:05.472959 update_engine[1481]: I20250913 00:10:05.460668 1481 omaha_request_params.cc:62] Current group set to lts Sep 13 00:10:05.472959 update_engine[1481]: I20250913 00:10:05.463128 1481 update_attempter.cc:499] Already updated boot flags. Skipping. Sep 13 00:10:05.472959 update_engine[1481]: I20250913 00:10:05.463155 1481 update_attempter.cc:643] Scheduling an action processor start. 
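The Calico warnings in the teardown above ("CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP" and "Asked to release address but it doesn't exist. Ignoring") are the benign, idempotent cleanup path: the WorkloadEndpoint for calico-apiserver-9c789df89-kvgsc is owned by the live container acf9d09bd39d90b1..., so the stale sandbox 545275614684... is removed without touching the endpoint or its 192.168.44.135/32 allocation. If leaked IPAM handles were genuinely suspected, they could be audited with calicoctl; a minimal sketch, assuming calicoctl is installed and configured against this cluster's datastore:

    # Show how the address from the log is allocated and which handle owns it
    calicoctl ipam show --ip=192.168.44.135
    # Cross-check IPAM allocations against live workload endpoints
    calicoctl ipam check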
Sep 13 00:10:05.472959 update_engine[1481]: I20250913 00:10:05.463184 1481 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction Sep 13 00:10:05.472959 update_engine[1481]: I20250913 00:10:05.463228 1481 prefs.cc:52] previous-version not present in /var/lib/update_engine/prefs Sep 13 00:10:05.472959 update_engine[1481]: I20250913 00:10:05.463310 1481 omaha_request_action.cc:271] Posting an Omaha request to disabled Sep 13 00:10:05.472959 update_engine[1481]: I20250913 00:10:05.463320 1481 omaha_request_action.cc:272] Request: Sep 13 00:10:05.472959 update_engine[1481]: [Omaha request XML body not reproduced here: its markup was stripped when the log was captured] Sep 13 00:10:05.472959 update_engine[1481]: I20250913 00:10:05.463325 1481 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Sep 13 00:10:05.482118 locksmithd[1514]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_CHECKING_FOR_UPDATE" NewVersion=0.0.0 NewSize=0 Sep 13 00:10:05.499714 update_engine[1481]: I20250913 00:10:05.498733 1481 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Sep 13 00:10:05.499714 update_engine[1481]: I20250913 00:10:05.499198 1481 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. Sep 13 00:10:05.504044 update_engine[1481]: E20250913 00:10:05.500748 1481 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Sep 13 00:10:05.504044 update_engine[1481]: I20250913 00:10:05.500838 1481 libcurl_http_fetcher.cc:283] No HTTP response, retry 1 Sep 13 00:10:15.273249 update_engine[1481]: I20250913 00:10:15.273063 1481 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Sep 13 00:10:15.277175 update_engine[1481]: I20250913 00:10:15.273409 1481 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Sep 13 00:10:15.277175 update_engine[1481]: I20250913 00:10:15.273679 1481 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. Sep 13 00:10:15.277175 update_engine[1481]: E20250913 00:10:15.274461 1481 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Sep 13 00:10:15.277175 update_engine[1481]: I20250913 00:10:15.274514 1481 libcurl_http_fetcher.cc:283] No HTTP response, retry 2 Sep 13 00:10:17.985495 systemd[1]: run-containerd-runc-k8s.io-3cfa10dd4f9f1ae5174cdb858108f237a2c81852a624b8c7d404647bc2ba0853-runc.IJJnTo.mount: Deactivated successfully. Sep 13 00:10:25.273966 update_engine[1481]: I20250913 00:10:25.273460 1481 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Sep 13 00:10:25.273966 update_engine[1481]: I20250913 00:10:25.273728 1481 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Sep 13 00:10:25.274516 update_engine[1481]: I20250913 00:10:25.273998 1481 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds.
Sep 13 00:10:25.274632 update_engine[1481]: E20250913 00:10:25.274592 1481 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Sep 13 00:10:25.274717 update_engine[1481]: I20250913 00:10:25.274649 1481 libcurl_http_fetcher.cc:283] No HTTP response, retry 3 Sep 13 00:10:35.273781 update_engine[1481]: I20250913 00:10:35.273679 1481 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Sep 13 00:10:35.275195 update_engine[1481]: I20250913 00:10:35.274724 1481 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Sep 13 00:10:35.275195 update_engine[1481]: I20250913 00:10:35.275065 1481 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. Sep 13 00:10:35.276596 update_engine[1481]: E20250913 00:10:35.275964 1481 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Sep 13 00:10:35.276596 update_engine[1481]: I20250913 00:10:35.276017 1481 libcurl_http_fetcher.cc:297] Transfer resulted in an error (0), 0 bytes downloaded Sep 13 00:10:35.276596 update_engine[1481]: I20250913 00:10:35.276028 1481 omaha_request_action.cc:617] Omaha request response: Sep 13 00:10:35.276596 update_engine[1481]: E20250913 00:10:35.276118 1481 omaha_request_action.cc:636] Omaha request network transfer failed. Sep 13 00:10:35.276596 update_engine[1481]: I20250913 00:10:35.276144 1481 action_processor.cc:68] ActionProcessor::ActionComplete: OmahaRequestAction action failed. Aborting processing. Sep 13 00:10:35.276596 update_engine[1481]: I20250913 00:10:35.276149 1481 action_processor.cc:73] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction Sep 13 00:10:35.276596 update_engine[1481]: I20250913 00:10:35.276152 1481 update_attempter.cc:306] Processing Done. Sep 13 00:10:35.276596 update_engine[1481]: E20250913 00:10:35.276166 1481 update_attempter.cc:619] Update failed. Sep 13 00:10:35.276596 update_engine[1481]: I20250913 00:10:35.276172 1481 utils.cc:600] Converting error code 2000 to kActionCodeOmahaErrorInHTTPResponse Sep 13 00:10:35.276596 update_engine[1481]: I20250913 00:10:35.276176 1481 payload_state.cc:97] Updating payload state for error code: 37 (kActionCodeOmahaErrorInHTTPResponse) Sep 13 00:10:35.276596 update_engine[1481]: I20250913 00:10:35.276181 1481 payload_state.cc:103] Ignoring failures until we get a valid Omaha response. Sep 13 00:10:35.276596 update_engine[1481]: I20250913 00:10:35.276255 1481 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction Sep 13 00:10:35.276596 update_engine[1481]: I20250913 00:10:35.276276 1481 omaha_request_action.cc:271] Posting an Omaha request to disabled Sep 13 00:10:35.276596 update_engine[1481]: I20250913 00:10:35.276280 1481 omaha_request_action.cc:272] Request: Sep 13 00:10:35.276596 update_engine[1481]: [Omaha request XML body not reproduced here: its markup was stripped when the log was captured] Sep 13 00:10:35.278963 update_engine[1481]: I20250913 00:10:35.276285 1481 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Sep 13 00:10:35.278963 update_engine[1481]: I20250913 00:10:35.276383 1481 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Sep 13 00:10:35.278963 update_engine[1481]: I20250913 00:10:35.276510 1481 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds.
Sep 13 00:10:35.278963 update_engine[1481]: E20250913 00:10:35.277431 1481 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Sep 13 00:10:35.278963 update_engine[1481]: I20250913 00:10:35.277477 1481 libcurl_http_fetcher.cc:297] Transfer resulted in an error (0), 0 bytes downloaded Sep 13 00:10:35.278963 update_engine[1481]: I20250913 00:10:35.277483 1481 omaha_request_action.cc:617] Omaha request response: Sep 13 00:10:35.278963 update_engine[1481]: I20250913 00:10:35.277488 1481 action_processor.cc:65] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction Sep 13 00:10:35.278963 update_engine[1481]: I20250913 00:10:35.277493 1481 action_processor.cc:73] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction Sep 13 00:10:35.278963 update_engine[1481]: I20250913 00:10:35.277495 1481 update_attempter.cc:306] Processing Done. Sep 13 00:10:35.278963 update_engine[1481]: I20250913 00:10:35.277499 1481 update_attempter.cc:310] Error event sent. Sep 13 00:10:35.282248 locksmithd[1514]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_REPORTING_ERROR_EVENT" NewVersion=0.0.0 NewSize=0 Sep 13 00:10:35.282524 update_engine[1481]: I20250913 00:10:35.279021 1481 update_check_scheduler.cc:74] Next update check in 44m18s Sep 13 00:10:35.284978 locksmithd[1514]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_IDLE" NewVersion=0.0.0 NewSize=0 Sep 13 00:10:46.335214 systemd[1]: Started sshd@9-157.180.121.11:22-103.13.206.71:35694.service - OpenSSH per-connection server daemon (103.13.206.71:35694). Sep 13 00:10:47.521169 sshd[6055]: Invalid user ypwang from 103.13.206.71 port 35694 Sep 13 00:10:47.743032 sshd[6055]: Received disconnect from 103.13.206.71 port 35694:11: Bye Bye [preauth] Sep 13 00:10:47.743032 sshd[6055]: Disconnected from invalid user ypwang 103.13.206.71 port 35694 [preauth] Sep 13 00:10:47.745418 systemd[1]: sshd@9-157.180.121.11:22-103.13.206.71:35694.service: Deactivated successfully. Sep 13 00:10:50.294593 systemd[1]: Started sshd@10-157.180.121.11:22-147.75.109.163:43656.service - OpenSSH per-connection server daemon (147.75.109.163:43656). Sep 13 00:10:51.427386 sshd[6088]: Accepted publickey for core from 147.75.109.163 port 43656 ssh2: RSA SHA256:GymMDYnosJimc4ujfdMuxEHSH4lnFIHEzFRMhgLPZDY Sep 13 00:10:51.430427 sshd[6088]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 13 00:10:51.442754 systemd-logind[1477]: New session 8 of user core. Sep 13 00:10:51.446193 systemd[1]: Started session-8.scope - Session 8 of User core. Sep 13 00:10:51.629260 systemd[1]: Started sshd@11-157.180.121.11:22-101.126.54.167:49248.service - OpenSSH per-connection server daemon (101.126.54.167:49248). Sep 13 00:10:52.836715 sshd[6088]: pam_unix(sshd:session): session closed for user core Sep 13 00:10:52.844207 systemd[1]: sshd@10-157.180.121.11:22-147.75.109.163:43656.service: Deactivated successfully. Sep 13 00:10:52.846344 systemd[1]: session-8.scope: Deactivated successfully. Sep 13 00:10:52.848416 systemd-logind[1477]: Session 8 logged out. Waiting for processes to exit. Sep 13 00:10:52.850620 systemd-logind[1477]: Removed session 8. Sep 13 00:10:57.990336 systemd[1]: Started sshd@12-157.180.121.11:22-147.75.109.163:43658.service - OpenSSH per-connection server daemon (147.75.109.163:43658). 
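The update_engine loop above is self-inflicted and harmless: the Omaha endpoint is the literal string "disabled", so curl's "Could not resolve host: disabled" is the expected outcome, the attempt gives up after its retries, and the engine simply schedules the next check ("Next update check in 44m18s"). On Flatcar, the endpoint and the "lts" group seen at the start of the attempt normally come from /etc/flatcar/update.conf; a sketch of the configuration this log implies, hedged because the file itself never appears in the log:

    # /etc/flatcar/update.conf (inferred from "Current group set to lts"
    # and "Posting an Omaha request to disabled"; not read from the host)
    GROUP=lts
    SERVER=disabled

Between checks, update_engine_client -status on the node should report UPDATE_STATUS_IDLE, matching the final locksmithd line above.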
Sep 13 00:10:58.987794 sshd[6123]: Accepted publickey for core from 147.75.109.163 port 43658 ssh2: RSA SHA256:GymMDYnosJimc4ujfdMuxEHSH4lnFIHEzFRMhgLPZDY Sep 13 00:10:58.990138 sshd[6123]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 13 00:10:58.998097 systemd-logind[1477]: New session 9 of user core. Sep 13 00:10:59.002044 systemd[1]: Started session-9.scope - Session 9 of User core. Sep 13 00:10:59.778362 sshd[6123]: pam_unix(sshd:session): session closed for user core Sep 13 00:10:59.781484 systemd[1]: sshd@12-157.180.121.11:22-147.75.109.163:43658.service: Deactivated successfully. Sep 13 00:10:59.783185 systemd[1]: session-9.scope: Deactivated successfully. Sep 13 00:10:59.784498 systemd-logind[1477]: Session 9 logged out. Waiting for processes to exit. Sep 13 00:10:59.785600 systemd-logind[1477]: Removed session 9. Sep 13 00:10:59.979589 systemd[1]: Started sshd@13-157.180.121.11:22-147.75.109.163:43674.service - OpenSSH per-connection server daemon (147.75.109.163:43674). Sep 13 00:11:01.066798 sshd[6137]: Accepted publickey for core from 147.75.109.163 port 43674 ssh2: RSA SHA256:GymMDYnosJimc4ujfdMuxEHSH4lnFIHEzFRMhgLPZDY Sep 13 00:11:01.068599 sshd[6137]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 13 00:11:01.074551 systemd-logind[1477]: New session 10 of user core. Sep 13 00:11:01.080127 systemd[1]: Started session-10.scope - Session 10 of User core. Sep 13 00:11:01.954620 sshd[6137]: pam_unix(sshd:session): session closed for user core Sep 13 00:11:01.961587 systemd[1]: sshd@13-157.180.121.11:22-147.75.109.163:43674.service: Deactivated successfully. Sep 13 00:11:01.966322 systemd[1]: session-10.scope: Deactivated successfully. Sep 13 00:11:01.967954 systemd-logind[1477]: Session 10 logged out. Waiting for processes to exit. Sep 13 00:11:01.970663 systemd-logind[1477]: Removed session 10. Sep 13 00:11:02.108258 systemd[1]: Started sshd@14-157.180.121.11:22-147.75.109.163:58476.service - OpenSSH per-connection server daemon (147.75.109.163:58476). Sep 13 00:11:03.089010 sshd[6152]: Accepted publickey for core from 147.75.109.163 port 58476 ssh2: RSA SHA256:GymMDYnosJimc4ujfdMuxEHSH4lnFIHEzFRMhgLPZDY Sep 13 00:11:03.091067 sshd[6152]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 13 00:11:03.097506 systemd-logind[1477]: New session 11 of user core. Sep 13 00:11:03.101199 systemd[1]: Started session-11.scope - Session 11 of User core. Sep 13 00:11:03.906748 sshd[6152]: pam_unix(sshd:session): session closed for user core Sep 13 00:11:03.914847 systemd-logind[1477]: Session 11 logged out. Waiting for processes to exit. Sep 13 00:11:03.916288 systemd[1]: sshd@14-157.180.121.11:22-147.75.109.163:58476.service: Deactivated successfully. Sep 13 00:11:03.919091 systemd[1]: session-11.scope: Deactivated successfully. Sep 13 00:11:03.920902 systemd-logind[1477]: Removed session 11. Sep 13 00:11:07.381086 systemd[1]: run-containerd-runc-k8s.io-8b55bc07e1516f8dfd7f7a6acf2a839134d3aa25a0f72760ffeed69bf552163e-runc.1xKIla.mount: Deactivated successfully. Sep 13 00:11:08.286183 sshd[6092]: Connection closed by 101.126.54.167 port 49248 [preauth] Sep 13 00:11:08.289755 systemd[1]: sshd@11-157.180.121.11:22-101.126.54.167:49248.service: Deactivated successfully. Sep 13 00:11:09.074150 systemd[1]: Started sshd@15-157.180.121.11:22-147.75.109.163:58488.service - OpenSSH per-connection server daemon (147.75.109.163:58488). 
Sep 13 00:11:10.077474 sshd[6207]: Accepted publickey for core from 147.75.109.163 port 58488 ssh2: RSA SHA256:GymMDYnosJimc4ujfdMuxEHSH4lnFIHEzFRMhgLPZDY Sep 13 00:11:10.080411 sshd[6207]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 13 00:11:10.085266 systemd-logind[1477]: New session 12 of user core. Sep 13 00:11:10.091206 systemd[1]: Started session-12.scope - Session 12 of User core. Sep 13 00:11:10.899659 sshd[6207]: pam_unix(sshd:session): session closed for user core Sep 13 00:11:10.907544 systemd-logind[1477]: Session 12 logged out. Waiting for processes to exit. Sep 13 00:11:10.908498 systemd[1]: sshd@15-157.180.121.11:22-147.75.109.163:58488.service: Deactivated successfully. Sep 13 00:11:10.911682 systemd[1]: session-12.scope: Deactivated successfully. Sep 13 00:11:10.912963 systemd-logind[1477]: Removed session 12. Sep 13 00:11:11.065980 systemd[1]: Started sshd@16-157.180.121.11:22-147.75.109.163:44676.service - OpenSSH per-connection server daemon (147.75.109.163:44676). Sep 13 00:11:12.077572 sshd[6220]: Accepted publickey for core from 147.75.109.163 port 44676 ssh2: RSA SHA256:GymMDYnosJimc4ujfdMuxEHSH4lnFIHEzFRMhgLPZDY Sep 13 00:11:12.079676 sshd[6220]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 13 00:11:12.085092 systemd-logind[1477]: New session 13 of user core. Sep 13 00:11:12.090157 systemd[1]: Started session-13.scope - Session 13 of User core. Sep 13 00:11:13.023469 sshd[6220]: pam_unix(sshd:session): session closed for user core Sep 13 00:11:13.031155 systemd-logind[1477]: Session 13 logged out. Waiting for processes to exit. Sep 13 00:11:13.031608 systemd[1]: sshd@16-157.180.121.11:22-147.75.109.163:44676.service: Deactivated successfully. Sep 13 00:11:13.033647 systemd[1]: session-13.scope: Deactivated successfully. Sep 13 00:11:13.034926 systemd-logind[1477]: Removed session 13. Sep 13 00:11:13.233322 systemd[1]: Started sshd@17-157.180.121.11:22-147.75.109.163:44682.service - OpenSSH per-connection server daemon (147.75.109.163:44682). Sep 13 00:11:14.337686 sshd[6231]: Accepted publickey for core from 147.75.109.163 port 44682 ssh2: RSA SHA256:GymMDYnosJimc4ujfdMuxEHSH4lnFIHEzFRMhgLPZDY Sep 13 00:11:14.339863 sshd[6231]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 13 00:11:14.344208 systemd-logind[1477]: New session 14 of user core. Sep 13 00:11:14.345164 systemd[1]: Started session-14.scope - Session 14 of User core. Sep 13 00:11:15.846379 sshd[6231]: pam_unix(sshd:session): session closed for user core Sep 13 00:11:15.864144 systemd[1]: sshd@17-157.180.121.11:22-147.75.109.163:44682.service: Deactivated successfully. Sep 13 00:11:15.867145 systemd[1]: session-14.scope: Deactivated successfully. Sep 13 00:11:15.868544 systemd-logind[1477]: Session 14 logged out. Waiting for processes to exit. Sep 13 00:11:15.871087 systemd-logind[1477]: Removed session 14. Sep 13 00:11:15.990974 systemd[1]: Started sshd@18-157.180.121.11:22-147.75.109.163:44694.service - OpenSSH per-connection server daemon (147.75.109.163:44694). Sep 13 00:11:17.013320 sshd[6251]: Accepted publickey for core from 147.75.109.163 port 44694 ssh2: RSA SHA256:GymMDYnosJimc4ujfdMuxEHSH4lnFIHEzFRMhgLPZDY Sep 13 00:11:17.014891 sshd[6251]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 13 00:11:17.019561 systemd-logind[1477]: New session 15 of user core. Sep 13 00:11:17.026169 systemd[1]: Started session-15.scope - Session 15 of User core. 
Sep 13 00:11:18.203134 sshd[6251]: pam_unix(sshd:session): session closed for user core Sep 13 00:11:18.209573 systemd-logind[1477]: Session 15 logged out. Waiting for processes to exit. Sep 13 00:11:18.210573 systemd[1]: sshd@18-157.180.121.11:22-147.75.109.163:44694.service: Deactivated successfully. Sep 13 00:11:18.214197 systemd[1]: session-15.scope: Deactivated successfully. Sep 13 00:11:18.215858 systemd-logind[1477]: Removed session 15. Sep 13 00:11:18.363254 systemd[1]: Started sshd@19-157.180.121.11:22-147.75.109.163:44708.service - OpenSSH per-connection server daemon (147.75.109.163:44708). Sep 13 00:11:19.391044 sshd[6282]: Accepted publickey for core from 147.75.109.163 port 44708 ssh2: RSA SHA256:GymMDYnosJimc4ujfdMuxEHSH4lnFIHEzFRMhgLPZDY Sep 13 00:11:19.393254 sshd[6282]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 13 00:11:19.397706 systemd-logind[1477]: New session 16 of user core. Sep 13 00:11:19.405105 systemd[1]: Started session-16.scope - Session 16 of User core. Sep 13 00:11:20.366137 sshd[6282]: pam_unix(sshd:session): session closed for user core Sep 13 00:11:20.369193 systemd[1]: sshd@19-157.180.121.11:22-147.75.109.163:44708.service: Deactivated successfully. Sep 13 00:11:20.371565 systemd[1]: session-16.scope: Deactivated successfully. Sep 13 00:11:20.372854 systemd-logind[1477]: Session 16 logged out. Waiting for processes to exit. Sep 13 00:11:20.374268 systemd-logind[1477]: Removed session 16. Sep 13 00:11:25.574408 systemd[1]: Started sshd@20-157.180.121.11:22-147.75.109.163:51218.service - OpenSSH per-connection server daemon (147.75.109.163:51218). Sep 13 00:11:26.736004 sshd[6322]: Accepted publickey for core from 147.75.109.163 port 51218 ssh2: RSA SHA256:GymMDYnosJimc4ujfdMuxEHSH4lnFIHEzFRMhgLPZDY Sep 13 00:11:26.738676 sshd[6322]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 13 00:11:26.746676 systemd-logind[1477]: New session 17 of user core. Sep 13 00:11:26.751076 systemd[1]: Started session-17.scope - Session 17 of User core. Sep 13 00:11:28.086377 sshd[6322]: pam_unix(sshd:session): session closed for user core Sep 13 00:11:28.090022 systemd[1]: sshd@20-157.180.121.11:22-147.75.109.163:51218.service: Deactivated successfully. Sep 13 00:11:28.092406 systemd[1]: session-17.scope: Deactivated successfully. Sep 13 00:11:28.093786 systemd-logind[1477]: Session 17 logged out. Waiting for processes to exit. Sep 13 00:11:28.095249 systemd-logind[1477]: Removed session 17. Sep 13 00:11:33.271060 systemd[1]: Started sshd@21-157.180.121.11:22-147.75.109.163:47504.service - OpenSSH per-connection server daemon (147.75.109.163:47504). Sep 13 00:11:34.371192 sshd[6343]: Accepted publickey for core from 147.75.109.163 port 47504 ssh2: RSA SHA256:GymMDYnosJimc4ujfdMuxEHSH4lnFIHEzFRMhgLPZDY Sep 13 00:11:34.373510 sshd[6343]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 13 00:11:34.378257 systemd-logind[1477]: New session 18 of user core. Sep 13 00:11:34.382094 systemd[1]: Started session-18.scope - Session 18 of User core. Sep 13 00:11:35.218211 sshd[6343]: pam_unix(sshd:session): session closed for user core Sep 13 00:11:35.222984 systemd[1]: sshd@21-157.180.121.11:22-147.75.109.163:47504.service: Deactivated successfully. Sep 13 00:11:35.229048 systemd[1]: session-18.scope: Deactivated successfully. Sep 13 00:11:35.234263 systemd-logind[1477]: Session 18 logged out. Waiting for processes to exit. 
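Two SSH populations are interleaved through this stretch: the pre-auth probe from 103.13.206.71 (invalid user "ypwang", disconnected before authentication) and the legitimate "core" sessions 8 through 18, all of which authenticate with the same RSA public key; the last of those sessions closes just below. That split is consistent with Flatcar's key-only sshd defaults; the directives below sketch that posture and are assumed, not read from this host's sshd_config:

    # sshd_config excerpt consistent with the logged behavior (assumption)
    PasswordAuthentication no
    PermitRootLogin no
    AllowUsers core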
Sep 13 00:11:35.236122 systemd-logind[1477]: Removed session 18. Sep 13 00:11:43.386712 systemd[1]: Started sshd@22-157.180.121.11:22-101.126.54.167:52740.service - OpenSSH per-connection server daemon (101.126.54.167:52740). Sep 13 00:11:50.855116 systemd[1]: cri-containerd-79b5701b2ceb633b53145f7995275084a94d8ec47131687ba1fb115130506abb.scope: Deactivated successfully. Sep 13 00:11:50.855675 systemd[1]: cri-containerd-79b5701b2ceb633b53145f7995275084a94d8ec47131687ba1fb115130506abb.scope: Consumed 14.977s CPU time. Sep 13 00:11:51.005660 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-79b5701b2ceb633b53145f7995275084a94d8ec47131687ba1fb115130506abb-rootfs.mount: Deactivated successfully. Sep 13 00:11:51.070062 containerd[1501]: time="2025-09-13T00:11:51.033574606Z" level=info msg="shim disconnected" id=79b5701b2ceb633b53145f7995275084a94d8ec47131687ba1fb115130506abb namespace=k8s.io Sep 13 00:11:51.070062 containerd[1501]: time="2025-09-13T00:11:51.070053077Z" level=warning msg="cleaning up after shim disconnected" id=79b5701b2ceb633b53145f7995275084a94d8ec47131687ba1fb115130506abb namespace=k8s.io Sep 13 00:11:51.070062 containerd[1501]: time="2025-09-13T00:11:51.070072453Z" level=info msg="cleaning up dead shim" namespace=k8s.io Sep 13 00:11:51.355631 kubelet[2611]: E0913 00:11:51.349350 2611 controller.go:195] "Failed to update lease" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.0.3:33508->10.0.0.2:2379: read: connection timed out" Sep 13 00:11:52.017631 kubelet[2611]: I0913 00:11:52.013718 2611 scope.go:117] "RemoveContainer" containerID="79b5701b2ceb633b53145f7995275084a94d8ec47131687ba1fb115130506abb" Sep 13 00:11:52.090317 systemd[1]: cri-containerd-5d9cfaea7ec87371b1a1d4bee9a785a71a176f78533a5cecfb3e8b9c0da9bd22.scope: Deactivated successfully. Sep 13 00:11:52.090815 systemd[1]: cri-containerd-5d9cfaea7ec87371b1a1d4bee9a785a71a176f78533a5cecfb3e8b9c0da9bd22.scope: Consumed 4.394s CPU time, 25.2M memory peak, 0B memory swap peak. Sep 13 00:11:52.138184 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-5d9cfaea7ec87371b1a1d4bee9a785a71a176f78533a5cecfb3e8b9c0da9bd22-rootfs.mount: Deactivated successfully. Sep 13 00:11:52.139139 containerd[1501]: time="2025-09-13T00:11:52.138874427Z" level=info msg="shim disconnected" id=5d9cfaea7ec87371b1a1d4bee9a785a71a176f78533a5cecfb3e8b9c0da9bd22 namespace=k8s.io Sep 13 00:11:52.139139 containerd[1501]: time="2025-09-13T00:11:52.139043644Z" level=warning msg="cleaning up after shim disconnected" id=5d9cfaea7ec87371b1a1d4bee9a785a71a176f78533a5cecfb3e8b9c0da9bd22 namespace=k8s.io Sep 13 00:11:52.139139 containerd[1501]: time="2025-09-13T00:11:52.139053523Z" level=info msg="cleaning up dead shim" namespace=k8s.io Sep 13 00:11:52.161397 containerd[1501]: time="2025-09-13T00:11:52.161346353Z" level=info msg="CreateContainer within sandbox \"f3aedd9214944d7bc2567f5189347a66ee73b93a8763a391ac77d0888ad8f434\" for container &ContainerMetadata{Name:tigera-operator,Attempt:1,}" Sep 13 00:11:52.231837 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2861177698.mount: Deactivated successfully. 
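The sequence above is kubelet crash recovery in miniature: systemd deactivates the tigera-operator container's scope (after 14.977s of CPU), containerd reaps the disconnected shim, and kubelet's RemoveContainer clears the dead instance before a replacement (Attempt:1) is created in the same sandbox, continuing just below. On the node, the exited instance and its exit code could be inspected with crictl; a sketch, assuming crictl is pointed at containerd's CRI socket:

    # List recently exited containers, then pull the exit code of the dead one
    crictl --runtime-endpoint unix:///run/containerd/containerd.sock ps -a --state exited
    crictl inspect 79b5701b2ceb6 | grep -m1 exitCode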
Sep 13 00:11:52.264492 containerd[1501]: time="2025-09-13T00:11:52.264228189Z" level=info msg="CreateContainer within sandbox \"f3aedd9214944d7bc2567f5189347a66ee73b93a8763a391ac77d0888ad8f434\" for &ContainerMetadata{Name:tigera-operator,Attempt:1,} returns container id \"54dc06322f0ecd6caf9eaee0c10ac5a7e767e53118123cbe9100dad77bfb06bb\"" Sep 13 00:11:52.269690 containerd[1501]: time="2025-09-13T00:11:52.269588652Z" level=info msg="StartContainer for \"54dc06322f0ecd6caf9eaee0c10ac5a7e767e53118123cbe9100dad77bfb06bb\"" Sep 13 00:11:52.325587 systemd[1]: Started cri-containerd-54dc06322f0ecd6caf9eaee0c10ac5a7e767e53118123cbe9100dad77bfb06bb.scope - libcontainer container 54dc06322f0ecd6caf9eaee0c10ac5a7e767e53118123cbe9100dad77bfb06bb. Sep 13 00:11:52.385223 containerd[1501]: time="2025-09-13T00:11:52.384474356Z" level=info msg="StartContainer for \"54dc06322f0ecd6caf9eaee0c10ac5a7e767e53118123cbe9100dad77bfb06bb\" returns successfully" Sep 13 00:11:52.982424 kubelet[2611]: I0913 00:11:52.982378 2611 scope.go:117] "RemoveContainer" containerID="5d9cfaea7ec87371b1a1d4bee9a785a71a176f78533a5cecfb3e8b9c0da9bd22" Sep 13 00:11:52.984610 containerd[1501]: time="2025-09-13T00:11:52.984570837Z" level=info msg="CreateContainer within sandbox \"0203571e286c334dc0af8869f3d7139b2d9186f941ef835cc74f3a9ae504982f\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:1,}" Sep 13 00:11:53.010487 containerd[1501]: time="2025-09-13T00:11:53.010411151Z" level=info msg="CreateContainer within sandbox \"0203571e286c334dc0af8869f3d7139b2d9186f941ef835cc74f3a9ae504982f\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:1,} returns container id \"b6460e4a4582b074493386b803b92ea770549ffe8a387c4e5130eda2630ce370\"" Sep 13 00:11:53.011157 containerd[1501]: time="2025-09-13T00:11:53.011117145Z" level=info msg="StartContainer for \"b6460e4a4582b074493386b803b92ea770549ffe8a387c4e5130eda2630ce370\"" Sep 13 00:11:53.042209 systemd[1]: Started cri-containerd-b6460e4a4582b074493386b803b92ea770549ffe8a387c4e5130eda2630ce370.scope - libcontainer container b6460e4a4582b074493386b803b92ea770549ffe8a387c4e5130eda2630ce370. 
Sep 13 00:11:53.094360 containerd[1501]: time="2025-09-13T00:11:53.094210997Z" level=info msg="StartContainer for \"b6460e4a4582b074493386b803b92ea770549ffe8a387c4e5130eda2630ce370\" returns successfully" Sep 13 00:11:54.929403 kubelet[2611]: E0913 00:11:54.890401 2611 event.go:359] "Server rejected event (will not retry!)" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.0.3:33296->10.0.0.2:2379: read: connection timed out" event="&Event{ObjectMeta:{kube-apiserver-ci-4081-3-5-n-294a4568b6.1864af1ff3f949c4 kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:kube-apiserver-ci-4081-3-5-n-294a4568b6,UID:a3e6731a273bf932ff2542338e26c204,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Readiness probe failed: HTTP probe failed with statuscode: 500,Source:EventSource{Component:kubelet,Host:ci-4081-3-5-n-294a4568b6,},FirstTimestamp:2025-09-13 00:11:46.353383876 +0000 UTC m=+212.767417203,LastTimestamp:2025-09-13 00:11:46.353383876 +0000 UTC m=+212.767417203,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4081-3-5-n-294a4568b6,}" Sep 13 00:11:56.895868 systemd[1]: cri-containerd-26a2acc9b508df78313e7184612aa6c026fad194beef1fdbff26242749583e63.scope: Deactivated successfully. Sep 13 00:11:56.897877 systemd[1]: cri-containerd-26a2acc9b508df78313e7184612aa6c026fad194beef1fdbff26242749583e63.scope: Consumed 2.761s CPU time, 24.2M memory peak, 0B memory swap peak. Sep 13 00:11:56.948842 containerd[1501]: time="2025-09-13T00:11:56.948738517Z" level=info msg="shim disconnected" id=26a2acc9b508df78313e7184612aa6c026fad194beef1fdbff26242749583e63 namespace=k8s.io Sep 13 00:11:56.949337 containerd[1501]: time="2025-09-13T00:11:56.948842752Z" level=warning msg="cleaning up after shim disconnected" id=26a2acc9b508df78313e7184612aa6c026fad194beef1fdbff26242749583e63 namespace=k8s.io Sep 13 00:11:56.949337 containerd[1501]: time="2025-09-13T00:11:56.948859013Z" level=info msg="cleaning up dead shim" namespace=k8s.io Sep 13 00:11:56.951231 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-26a2acc9b508df78313e7184612aa6c026fad194beef1fdbff26242749583e63-rootfs.mount: Deactivated successfully. 
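The kubelet errors in this stretch share one root cause: reads from etcd at 10.0.0.2:2379 are timing out, which stalls the node's lease renewal, gets the kube-apiserver readiness-probe event rejected, and precedes the kube-scheduler shim teardown above. A hedged health probe against the member named in the log (the certificate paths below are kubeadm-style defaults and purely an assumption):

    ETCDCTL_API=3 etcdctl --endpoints=https://10.0.0.2:2379 \
      --cacert=/etc/kubernetes/pki/etcd/ca.crt \
      --cert=/etc/kubernetes/pki/etcd/server.crt \
      --key=/etc/kubernetes/pki/etcd/server.key \
      endpoint health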
Sep 13 00:11:57.030853 kubelet[2611]: I0913 00:11:57.030800 2611 scope.go:117] "RemoveContainer" containerID="26a2acc9b508df78313e7184612aa6c026fad194beef1fdbff26242749583e63" Sep 13 00:11:57.037463 containerd[1501]: time="2025-09-13T00:11:57.037293491Z" level=info msg="CreateContainer within sandbox \"315552f54f8628a455bb6a05910ba78bdd5323a8d7244466a5038537aef4569d\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:1,}" Sep 13 00:11:57.052937 containerd[1501]: time="2025-09-13T00:11:57.052485048Z" level=info msg="CreateContainer within sandbox \"315552f54f8628a455bb6a05910ba78bdd5323a8d7244466a5038537aef4569d\" for &ContainerMetadata{Name:kube-scheduler,Attempt:1,} returns container id \"7581116638888a0b51809f37448a8ca0e70ed688335ec0a8ec8b0751f6a7be7c\"" Sep 13 00:11:57.054566 containerd[1501]: time="2025-09-13T00:11:57.054521898Z" level=info msg="StartContainer for \"7581116638888a0b51809f37448a8ca0e70ed688335ec0a8ec8b0751f6a7be7c\"" Sep 13 00:11:57.094503 systemd[1]: Started cri-containerd-7581116638888a0b51809f37448a8ca0e70ed688335ec0a8ec8b0751f6a7be7c.scope - libcontainer container 7581116638888a0b51809f37448a8ca0e70ed688335ec0a8ec8b0751f6a7be7c. Sep 13 00:11:57.136293 containerd[1501]: time="2025-09-13T00:11:57.136222500Z" level=info msg="StartContainer for \"7581116638888a0b51809f37448a8ca0e70ed688335ec0a8ec8b0751f6a7be7c\" returns successfully"
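The log closes with kube-scheduler recreated and started successfully, mirroring the tigera-operator and kube-controller-manager recoveries above. Each restart is Attempt:1, a single crash under the etcd stall rather than a crash loop; one way to confirm that afterwards, assuming a kubeconfig with read access to kube-system (the kubelet's path below is an assumption):

    kubectl --kubeconfig /etc/kubernetes/kubelet.conf -n kube-system get pods \
      -o custom-columns=NAME:.metadata.name,RESTARTS:.status.containerStatuses[0].restartCount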