Sep 12 17:32:49.847406 kernel: Linux version 6.6.106-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 13.3.1_p20240614 p17) 13.3.1 20240614, GNU ld (Gentoo 2.42 p3) 2.42.0) #1 SMP PREEMPT_DYNAMIC Fri Sep 12 16:05:08 -00 2025
Sep 12 17:32:49.847427 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=hetzner verity.usrhash=1ff9ec556ac80c67ae2340139aa421bf26af13357ec9e72632b4878e9945dc9a
Sep 12 17:32:49.847435 kernel: BIOS-provided physical RAM map:
Sep 12 17:32:49.847440 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
Sep 12 17:32:49.847444 kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved
Sep 12 17:32:49.847449 kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved
Sep 12 17:32:49.847454 kernel: BIOS-e820: [mem 0x0000000000100000-0x000000007cfdbfff] usable
Sep 12 17:32:49.847459 kernel: BIOS-e820: [mem 0x000000007cfdc000-0x000000007cffffff] reserved
Sep 12 17:32:49.847465 kernel: BIOS-e820: [mem 0x00000000b0000000-0x00000000bfffffff] reserved
Sep 12 17:32:49.847470 kernel: BIOS-e820: [mem 0x00000000fed1c000-0x00000000fed1ffff] reserved
Sep 12 17:32:49.847474 kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Sep 12 17:32:49.847479 kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved
Sep 12 17:32:49.847483 kernel: NX (Execute Disable) protection: active
Sep 12 17:32:49.847488 kernel: APIC: Static calls initialized
Sep 12 17:32:49.847495 kernel: SMBIOS 2.8 present.
Sep 12 17:32:49.847501 kernel: DMI: Hetzner vServer/Standard PC (Q35 + ICH9, 2009), BIOS 20171111 11/11/2017
Sep 12 17:32:49.847505 kernel: Hypervisor detected: KVM
Sep 12 17:32:49.847510 kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Sep 12 17:32:49.847515 kernel: kvm-clock: using sched offset of 2957590673 cycles
Sep 12 17:32:49.847540 kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Sep 12 17:32:49.847546 kernel: tsc: Detected 2445.406 MHz processor
Sep 12 17:32:49.847551 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Sep 12 17:32:49.847557 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Sep 12 17:32:49.847564 kernel: last_pfn = 0x7cfdc max_arch_pfn = 0x400000000
Sep 12 17:32:49.847569 kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 19), built from 8 variable MTRRs
Sep 12 17:32:49.847575 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT
Sep 12 17:32:49.847580 kernel: Using GB pages for direct mapping
Sep 12 17:32:49.847585 kernel: ACPI: Early table checksum verification disabled
Sep 12 17:32:49.847590 kernel: ACPI: RSDP 0x00000000000F5270 000014 (v00 BOCHS )
Sep 12 17:32:49.847595 kernel: ACPI: RSDT 0x000000007CFE2693 000038 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Sep 12 17:32:49.847600 kernel: ACPI: FACP 0x000000007CFE2483 0000F4 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Sep 12 17:32:49.847605 kernel: ACPI: DSDT 0x000000007CFE0040 002443 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Sep 12 17:32:49.847612 kernel: ACPI: FACS 0x000000007CFE0000 000040
Sep 12 17:32:49.847617 kernel: ACPI: APIC 0x000000007CFE2577 000080 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Sep 12 17:32:49.847622 kernel: ACPI: HPET 0x000000007CFE25F7 000038 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Sep 12 17:32:49.847627 kernel: ACPI: MCFG 0x000000007CFE262F 00003C (v01 BOCHS BXPC 00000001 BXPC 00000001)
Sep 12 17:32:49.847632 kernel: ACPI: WAET 0x000000007CFE266B 000028 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Sep 12 17:32:49.847638 kernel: ACPI: Reserving FACP table memory at [mem 0x7cfe2483-0x7cfe2576]
Sep 12 17:32:49.847643 kernel: ACPI: Reserving DSDT table memory at [mem 0x7cfe0040-0x7cfe2482]
Sep 12 17:32:49.847648 kernel: ACPI: Reserving FACS table memory at [mem 0x7cfe0000-0x7cfe003f]
Sep 12 17:32:49.847656 kernel: ACPI: Reserving APIC table memory at [mem 0x7cfe2577-0x7cfe25f6]
Sep 12 17:32:49.847662 kernel: ACPI: Reserving HPET table memory at [mem 0x7cfe25f7-0x7cfe262e]
Sep 12 17:32:49.847668 kernel: ACPI: Reserving MCFG table memory at [mem 0x7cfe262f-0x7cfe266a]
Sep 12 17:32:49.847673 kernel: ACPI: Reserving WAET table memory at [mem 0x7cfe266b-0x7cfe2692]
Sep 12 17:32:49.847678 kernel: No NUMA configuration found
Sep 12 17:32:49.847684 kernel: Faking a node at [mem 0x0000000000000000-0x000000007cfdbfff]
Sep 12 17:32:49.847690 kernel: NODE_DATA(0) allocated [mem 0x7cfd6000-0x7cfdbfff]
Sep 12 17:32:49.847696 kernel: Zone ranges:
Sep 12 17:32:49.847701 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff]
Sep 12 17:32:49.847707 kernel: DMA32 [mem 0x0000000001000000-0x000000007cfdbfff]
Sep 12 17:32:49.847712 kernel: Normal empty
Sep 12 17:32:49.847717 kernel: Movable zone start for each node
Sep 12 17:32:49.847722 kernel: Early memory node ranges
Sep 12 17:32:49.847728 kernel: node 0: [mem 0x0000000000001000-0x000000000009efff]
Sep 12 17:32:49.847733 kernel: node 0: [mem 0x0000000000100000-0x000000007cfdbfff]
Sep 12 17:32:49.847739 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000007cfdbfff]
Sep 12 17:32:49.847745 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Sep 12 17:32:49.847751 kernel: On node 0, zone DMA: 97 pages in unavailable ranges
Sep 12 17:32:49.847756 kernel: On node 0, zone DMA32: 12324 pages in unavailable ranges
Sep 12 17:32:49.847761 kernel: ACPI: PM-Timer IO Port: 0x608
Sep 12 17:32:49.847767 kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Sep 12 17:32:49.847772 kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Sep 12 17:32:49.847777 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Sep 12 17:32:49.847783 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Sep 12 17:32:49.847788 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Sep 12 17:32:49.847795 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Sep 12 17:32:49.847800 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Sep 12 17:32:49.847806 kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Sep 12 17:32:49.847811 kernel: ACPI: HPET id: 0x8086a201 base: 0xfed00000
Sep 12 17:32:49.847816 kernel: smpboot: Allowing 2 CPUs, 0 hotplug CPUs
Sep 12 17:32:49.847822 kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
Sep 12 17:32:49.847827 kernel: [mem 0xc0000000-0xfed1bfff] available for PCI devices
Sep 12 17:32:49.847832 kernel: Booting paravirtualized kernel on KVM
Sep 12 17:32:49.847838 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Sep 12 17:32:49.847844 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:2 nr_cpu_ids:2 nr_node_ids:1
Sep 12 17:32:49.847850 kernel: percpu: Embedded 58 pages/cpu s197160 r8192 d32216 u1048576
Sep 12 17:32:49.847855 kernel: pcpu-alloc: s197160 r8192 d32216 u1048576 alloc=1*2097152
Sep 12 17:32:49.847860 kernel: pcpu-alloc: [0] 0 1
Sep 12 17:32:49.847865 kernel: kvm-guest: PV spinlocks disabled, no host support
Sep 12 17:32:49.847871 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=hetzner verity.usrhash=1ff9ec556ac80c67ae2340139aa421bf26af13357ec9e72632b4878e9945dc9a
Sep 12 17:32:49.847877 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space.
Sep 12 17:32:49.847883 kernel: random: crng init done
Sep 12 17:32:49.847889 kernel: Dentry cache hash table entries: 262144 (order: 9, 2097152 bytes, linear)
Sep 12 17:32:49.847907 kernel: Inode-cache hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Sep 12 17:32:49.847912 kernel: Fallback order for Node 0: 0
Sep 12 17:32:49.847917 kernel: Built 1 zonelists, mobility grouping on. Total pages: 503708
Sep 12 17:32:49.847923 kernel: Policy zone: DMA32
Sep 12 17:32:49.847928 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Sep 12 17:32:49.847934 kernel: Memory: 1922056K/2047464K available (12288K kernel code, 2293K rwdata, 22744K rodata, 42884K init, 2312K bss, 125148K reserved, 0K cma-reserved)
Sep 12 17:32:49.847939 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=2, Nodes=1
Sep 12 17:32:49.847945 kernel: ftrace: allocating 37974 entries in 149 pages
Sep 12 17:32:49.847951 kernel: ftrace: allocated 149 pages with 4 groups
Sep 12 17:32:49.847957 kernel: Dynamic Preempt: voluntary
Sep 12 17:32:49.847962 kernel: rcu: Preemptible hierarchical RCU implementation.
Sep 12 17:32:49.847968 kernel: rcu: RCU event tracing is enabled.
Sep 12 17:32:49.847974 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=2.
Sep 12 17:32:49.847979 kernel: Trampoline variant of Tasks RCU enabled.
Sep 12 17:32:49.847985 kernel: Rude variant of Tasks RCU enabled.
Sep 12 17:32:49.847990 kernel: Tracing variant of Tasks RCU enabled.
Sep 12 17:32:49.847995 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Sep 12 17:32:49.848001 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=2
Sep 12 17:32:49.848007 kernel: NR_IRQS: 33024, nr_irqs: 440, preallocated irqs: 16
Sep 12 17:32:49.848013 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Sep 12 17:32:49.848018 kernel: Console: colour VGA+ 80x25
Sep 12 17:32:49.848024 kernel: printk: console [tty0] enabled
Sep 12 17:32:49.848029 kernel: printk: console [ttyS0] enabled
Sep 12 17:32:49.848035 kernel: ACPI: Core revision 20230628
Sep 12 17:32:49.848040 kernel: clocksource: hpet: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 19112604467 ns
Sep 12 17:32:49.848046 kernel: APIC: Switch to symmetric I/O mode setup
Sep 12 17:32:49.848051 kernel: x2apic enabled
Sep 12 17:32:49.848058 kernel: APIC: Switched APIC routing to: physical x2apic
Sep 12 17:32:49.848063 kernel: ..TIMER: vector=0x30 apic1=0 pin1=2 apic2=-1 pin2=-1
Sep 12 17:32:49.848069 kernel: tsc: Marking TSC unstable due to TSCs unsynchronized
Sep 12 17:32:49.848074 kernel: Calibrating delay loop (skipped) preset value.. 4890.81 BogoMIPS (lpj=2445406)
Sep 12 17:32:49.848080 kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Sep 12 17:32:49.848085 kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127
Sep 12 17:32:49.848091 kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0
Sep 12 17:32:49.848096 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Sep 12 17:32:49.848107 kernel: Spectre V2 : Mitigation: Retpolines
Sep 12 17:32:49.848113 kernel: Spectre V2 : Spectre v2 / SpectreRSB: Filling RSB on context switch and VMEXIT
Sep 12 17:32:49.848118 kernel: Spectre V2 : Enabling Speculation Barrier for firmware calls
Sep 12 17:32:49.848125 kernel: active return thunk: retbleed_return_thunk
Sep 12 17:32:49.848131 kernel: RETBleed: Mitigation: untrained return thunk
Sep 12 17:32:49.848137 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Sep 12 17:32:49.848143 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Sep 12 17:32:49.848148 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Sep 12 17:32:49.848154 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Sep 12 17:32:49.848161 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Sep 12 17:32:49.848167 kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256
Sep 12 17:32:49.848172 kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format.
Sep 12 17:32:49.848178 kernel: Freeing SMP alternatives memory: 32K
Sep 12 17:32:49.848184 kernel: pid_max: default: 32768 minimum: 301
Sep 12 17:32:49.848190 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,integrity
Sep 12 17:32:49.848195 kernel: landlock: Up and running.
Sep 12 17:32:49.848201 kernel: SELinux: Initializing.
Sep 12 17:32:49.848208 kernel: Mount-cache hash table entries: 4096 (order: 3, 32768 bytes, linear)
Sep 12 17:32:49.848213 kernel: Mountpoint-cache hash table entries: 4096 (order: 3, 32768 bytes, linear)
Sep 12 17:32:49.848219 kernel: smpboot: CPU0: AMD EPYC-Rome Processor (family: 0x17, model: 0x31, stepping: 0x0)
Sep 12 17:32:49.848225 kernel: RCU Tasks: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Sep 12 17:32:49.848231 kernel: RCU Tasks Rude: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Sep 12 17:32:49.848237 kernel: RCU Tasks Trace: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Sep 12 17:32:49.848243 kernel: Performance Events: Fam17h+ core perfctr, AMD PMU driver.
Sep 12 17:32:49.848248 kernel: ... version: 0
Sep 12 17:32:49.848254 kernel: ... bit width: 48
Sep 12 17:32:49.848261 kernel: ... generic registers: 6
Sep 12 17:32:49.848266 kernel: ... value mask: 0000ffffffffffff
Sep 12 17:32:49.848272 kernel: ... max period: 00007fffffffffff
Sep 12 17:32:49.848278 kernel: ... fixed-purpose events: 0
Sep 12 17:32:49.848283 kernel: ... event mask: 000000000000003f
Sep 12 17:32:49.848289 kernel: signal: max sigframe size: 1776
Sep 12 17:32:49.848294 kernel: rcu: Hierarchical SRCU implementation.
Sep 12 17:32:49.848300 kernel: rcu: Max phase no-delay instances is 400.
Sep 12 17:32:49.848306 kernel: smp: Bringing up secondary CPUs ...
Sep 12 17:32:49.848312 kernel: smpboot: x86: Booting SMP configuration:
Sep 12 17:32:49.848318 kernel: .... node #0, CPUs: #1
Sep 12 17:32:49.848324 kernel: smp: Brought up 1 node, 2 CPUs
Sep 12 17:32:49.848329 kernel: smpboot: Max logical packages: 1
Sep 12 17:32:49.848335 kernel: smpboot: Total of 2 processors activated (9781.62 BogoMIPS)
Sep 12 17:32:49.848340 kernel: devtmpfs: initialized
Sep 12 17:32:49.848346 kernel: x86/mm: Memory block size: 128MB
Sep 12 17:32:49.848352 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Sep 12 17:32:49.848358 kernel: futex hash table entries: 512 (order: 3, 32768 bytes, linear)
Sep 12 17:32:49.848364 kernel: pinctrl core: initialized pinctrl subsystem
Sep 12 17:32:49.848370 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Sep 12 17:32:49.848376 kernel: audit: initializing netlink subsys (disabled)
Sep 12 17:32:49.848382 kernel: audit: type=2000 audit(1757698369.677:1): state=initialized audit_enabled=0 res=1
Sep 12 17:32:49.848387 kernel: thermal_sys: Registered thermal governor 'step_wise'
Sep 12 17:32:49.848393 kernel: thermal_sys: Registered thermal governor 'user_space'
Sep 12 17:32:49.848398 kernel: cpuidle: using governor menu
Sep 12 17:32:49.848404 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Sep 12 17:32:49.848410 kernel: dca service started, version 1.12.1
Sep 12 17:32:49.848417 kernel: PCI: MMCONFIG for domain 0000 [bus 00-ff] at [mem 0xb0000000-0xbfffffff] (base 0xb0000000)
Sep 12 17:32:49.848422 kernel: PCI: Using configuration type 1 for base access
Sep 12 17:32:49.848428 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Sep 12 17:32:49.848434 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Sep 12 17:32:49.848439 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Sep 12 17:32:49.848445 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Sep 12 17:32:49.848450 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Sep 12 17:32:49.848456 kernel: ACPI: Added _OSI(Module Device)
Sep 12 17:32:49.848462 kernel: ACPI: Added _OSI(Processor Device)
Sep 12 17:32:49.848468 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Sep 12 17:32:49.848474 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Sep 12 17:32:49.848479 kernel: ACPI: _OSC evaluation for CPUs failed, trying _PDC
Sep 12 17:32:49.848485 kernel: ACPI: Interpreter enabled
Sep 12 17:32:49.848491 kernel: ACPI: PM: (supports S0 S5)
Sep 12 17:32:49.848496 kernel: ACPI: Using IOAPIC for interrupt routing
Sep 12 17:32:49.848502 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Sep 12 17:32:49.848508 kernel: PCI: Using E820 reservations for host bridge windows
Sep 12 17:32:49.848513 kernel: ACPI: Enabled 2 GPEs in block 00 to 3F
Sep 12 17:32:49.848587 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Sep 12 17:32:49.848704 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3]
Sep 12 17:32:49.848777 kernel: acpi PNP0A08:00: _OSC: platform does not support [PCIeHotplug LTR]
Sep 12 17:32:49.848844 kernel: acpi PNP0A08:00: _OSC: OS now controls [PME AER PCIeCapability]
Sep 12 17:32:49.848852 kernel: PCI host bridge to bus 0000:00
Sep 12 17:32:49.848941 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window]
Sep 12 17:32:49.849001 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xffff window]
Sep 12 17:32:49.849060 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Sep 12 17:32:49.849115 kernel: pci_bus 0000:00: root bus resource [mem 0x7d000000-0xafffffff window]
Sep 12 17:32:49.849167 kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
Sep 12 17:32:49.849219 kernel: pci_bus 0000:00: root bus resource [mem 0x100000000-0x8ffffffff window]
Sep 12 17:32:49.849272 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Sep 12 17:32:49.849345 kernel: pci 0000:00:00.0: [8086:29c0] type 00 class 0x060000
Sep 12 17:32:49.849420 kernel: pci 0000:00:01.0: [1af4:1050] type 00 class 0x030000
Sep 12 17:32:49.849482 kernel: pci 0000:00:01.0: reg 0x10: [mem 0xfb800000-0xfbffffff pref]
Sep 12 17:32:49.849564 kernel: pci 0000:00:01.0: reg 0x18: [mem 0xfd200000-0xfd203fff 64bit pref]
Sep 12 17:32:49.849629 kernel: pci 0000:00:01.0: reg 0x20: [mem 0xfea10000-0xfea10fff]
Sep 12 17:32:49.849692 kernel: pci 0000:00:01.0: reg 0x30: [mem 0xfea00000-0xfea0ffff pref]
Sep 12 17:32:49.849754 kernel: pci 0000:00:01.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Sep 12 17:32:49.849823 kernel: pci 0000:00:02.0: [1b36:000c] type 01 class 0x060400
Sep 12 17:32:49.849889 kernel: pci 0000:00:02.0: reg 0x10: [mem 0xfea11000-0xfea11fff]
Sep 12 17:32:49.849973 kernel: pci 0000:00:02.1: [1b36:000c] type 01 class 0x060400
Sep 12 17:32:49.850037 kernel: pci 0000:00:02.1: reg 0x10: [mem 0xfea12000-0xfea12fff]
Sep 12 17:32:49.850106 kernel: pci 0000:00:02.2: [1b36:000c] type 01 class 0x060400
Sep 12 17:32:49.850167 kernel: pci 0000:00:02.2: reg 0x10: [mem 0xfea13000-0xfea13fff]
Sep 12 17:32:49.850233 kernel: pci 0000:00:02.3: [1b36:000c] type 01 class 0x060400
Sep 12 17:32:49.850298 kernel: pci 0000:00:02.3: reg 0x10: [mem 0xfea14000-0xfea14fff]
Sep 12 17:32:49.850369 kernel: pci 0000:00:02.4: [1b36:000c] type 01 class 0x060400
Sep 12 17:32:49.850431 kernel: pci 0000:00:02.4: reg 0x10: [mem 0xfea15000-0xfea15fff]
Sep 12 17:32:49.850497 kernel: pci 0000:00:02.5: [1b36:000c] type 01 class 0x060400
Sep 12 17:32:49.850586 kernel: pci 0000:00:02.5: reg 0x10: [mem 0xfea16000-0xfea16fff]
Sep 12 17:32:49.850655 kernel: pci 0000:00:02.6: [1b36:000c] type 01 class 0x060400
Sep 12 17:32:49.850720 kernel: pci 0000:00:02.6: reg 0x10: [mem 0xfea17000-0xfea17fff]
Sep 12 17:32:49.850786 kernel: pci 0000:00:02.7: [1b36:000c] type 01 class 0x060400
Sep 12 17:32:49.850848 kernel: pci 0000:00:02.7: reg 0x10: [mem 0xfea18000-0xfea18fff]
Sep 12 17:32:49.850929 kernel: pci 0000:00:03.0: [1b36:000c] type 01 class 0x060400
Sep 12 17:32:49.850993 kernel: pci 0000:00:03.0: reg 0x10: [mem 0xfea19000-0xfea19fff]
Sep 12 17:32:49.851061 kernel: pci 0000:00:1f.0: [8086:2918] type 00 class 0x060100
Sep 12 17:32:49.851127 kernel: pci 0000:00:1f.0: quirk: [io 0x0600-0x067f] claimed by ICH6 ACPI/GPIO/TCO
Sep 12 17:32:49.851193 kernel: pci 0000:00:1f.2: [8086:2922] type 00 class 0x010601
Sep 12 17:32:49.851254 kernel: pci 0000:00:1f.2: reg 0x20: [io 0xc040-0xc05f]
Sep 12 17:32:49.851313 kernel: pci 0000:00:1f.2: reg 0x24: [mem 0xfea1a000-0xfea1afff]
Sep 12 17:32:49.851378 kernel: pci 0000:00:1f.3: [8086:2930] type 00 class 0x0c0500
Sep 12 17:32:49.851439 kernel: pci 0000:00:1f.3: reg 0x20: [io 0x0700-0x073f]
Sep 12 17:32:49.851513 kernel: pci 0000:01:00.0: [1af4:1041] type 00 class 0x020000
Sep 12 17:32:49.851616 kernel: pci 0000:01:00.0: reg 0x14: [mem 0xfe880000-0xfe880fff]
Sep 12 17:32:49.851682 kernel: pci 0000:01:00.0: reg 0x20: [mem 0xfd000000-0xfd003fff 64bit pref]
Sep 12 17:32:49.851746 kernel: pci 0000:01:00.0: reg 0x30: [mem 0xfe800000-0xfe87ffff pref]
Sep 12 17:32:49.851807 kernel: pci 0000:00:02.0: PCI bridge to [bus 01]
Sep 12 17:32:49.851867 kernel: pci 0000:00:02.0: bridge window [mem 0xfe800000-0xfe9fffff]
Sep 12 17:32:49.851942 kernel: pci 0000:00:02.0: bridge window [mem 0xfd000000-0xfd1fffff 64bit pref]
Sep 12 17:32:49.852014 kernel: pci 0000:02:00.0: [1b36:000d] type 00 class 0x0c0330
Sep 12 17:32:49.852172 kernel: pci 0000:02:00.0: reg 0x10: [mem 0xfe600000-0xfe603fff 64bit]
Sep 12 17:32:49.852278 kernel: pci 0000:00:02.1: PCI bridge to [bus 02]
Sep 12 17:32:49.852348 kernel: pci 0000:00:02.1: bridge window [mem 0xfe600000-0xfe7fffff]
Sep 12 17:32:49.852411 kernel: pci 0000:00:02.1: bridge window [mem 0xfce00000-0xfcffffff 64bit pref]
Sep 12 17:32:49.852484 kernel: pci 0000:03:00.0: [1af4:1043] type 00 class 0x078000
Sep 12 17:32:49.852959 kernel: pci 0000:03:00.0: reg 0x14: [mem 0xfe400000-0xfe400fff]
Sep 12 17:32:49.853040 kernel: pci 0000:03:00.0: reg 0x20: [mem 0xfcc00000-0xfcc03fff 64bit pref]
Sep 12 17:32:49.853104 kernel: pci 0000:00:02.2: PCI bridge to [bus 03]
Sep 12 17:32:49.853165 kernel: pci 0000:00:02.2: bridge window [mem 0xfe400000-0xfe5fffff]
Sep 12 17:32:49.854671 kernel: pci 0000:00:02.2: bridge window [mem 0xfcc00000-0xfcdfffff 64bit pref]
Sep 12 17:32:49.854754 kernel: pci 0000:04:00.0: [1af4:1045] type 00 class 0x00ff00
Sep 12 17:32:49.854823 kernel: pci 0000:04:00.0: reg 0x20: [mem 0xfca00000-0xfca03fff 64bit pref]
Sep 12 17:32:49.854885 kernel: pci 0000:00:02.3: PCI bridge to [bus 04]
Sep 12 17:32:49.854964 kernel: pci 0000:00:02.3: bridge window [mem 0xfe200000-0xfe3fffff]
Sep 12 17:32:49.855032 kernel: pci 0000:00:02.3: bridge window [mem 0xfca00000-0xfcbfffff 64bit pref]
Sep 12 17:32:49.855102 kernel: pci 0000:05:00.0: [1af4:1044] type 00 class 0x00ff00
Sep 12 17:32:49.855166 kernel: pci 0000:05:00.0: reg 0x20: [mem 0xfc800000-0xfc803fff 64bit pref]
Sep 12 17:32:49.855229 kernel: pci 0000:00:02.4: PCI bridge to [bus 05]
Sep 12 17:32:49.855290 kernel: pci 0000:00:02.4: bridge window [mem 0xfe000000-0xfe1fffff]
Sep 12 17:32:49.855349 kernel: pci 0000:00:02.4: bridge window [mem 0xfc800000-0xfc9fffff 64bit pref]
Sep 12 17:32:49.855419 kernel: pci 0000:06:00.0: [1af4:1048] type 00 class 0x010000
Sep 12 17:32:49.856633 kernel: pci 0000:06:00.0: reg 0x14: [mem 0xfde00000-0xfde00fff]
Sep 12 17:32:49.856709 kernel: pci 0000:06:00.0: reg 0x20: [mem 0xfc600000-0xfc603fff 64bit pref]
Sep 12 17:32:49.856772 kernel: pci 0000:00:02.5: PCI bridge to [bus 06]
Sep 12 17:32:49.856875 kernel: pci 0000:00:02.5: bridge window [mem 0xfde00000-0xfdffffff]
Sep 12 17:32:49.856968 kernel: pci 0000:00:02.5: bridge window [mem 0xfc600000-0xfc7fffff 64bit pref]
Sep 12 17:32:49.856978 kernel: acpiphp: Slot [0] registered
Sep 12 17:32:49.857049 kernel: pci 0000:07:00.0: [1af4:1041] type 00 class 0x020000
Sep 12 17:32:49.857120 kernel: pci 0000:07:00.0: reg 0x14: [mem 0xfdc80000-0xfdc80fff]
Sep 12 17:32:49.857183 kernel: pci 0000:07:00.0: reg 0x20: [mem 0xfc400000-0xfc403fff 64bit pref]
Sep 12 17:32:49.857246 kernel: pci 0000:07:00.0: reg 0x30: [mem 0xfdc00000-0xfdc7ffff pref]
Sep 12 17:32:49.857308 kernel: pci 0000:00:02.6: PCI bridge to [bus 07]
Sep 12 17:32:49.857367 kernel: pci 0000:00:02.6: bridge window [mem 0xfdc00000-0xfddfffff]
Sep 12 17:32:49.857426 kernel: pci 0000:00:02.6: bridge window [mem 0xfc400000-0xfc5fffff 64bit pref]
Sep 12 17:32:49.857435 kernel: acpiphp: Slot [0-2] registered
Sep 12 17:32:49.857493 kernel: pci 0000:00:02.7: PCI bridge to [bus 08]
Sep 12 17:32:49.857937 kernel: pci 0000:00:02.7: bridge window [mem 0xfda00000-0xfdbfffff]
Sep 12 17:32:49.858014 kernel: pci 0000:00:02.7: bridge window [mem 0xfc200000-0xfc3fffff 64bit pref]
Sep 12 17:32:49.858024 kernel: acpiphp: Slot [0-3] registered
Sep 12 17:32:49.858134 kernel: pci 0000:00:03.0: PCI bridge to [bus 09]
Sep 12 17:32:49.858208 kernel: pci 0000:00:03.0: bridge window [mem 0xfd800000-0xfd9fffff]
Sep 12 17:32:49.858271 kernel: pci 0000:00:03.0: bridge window [mem 0xfc000000-0xfc1fffff 64bit pref]
Sep 12 17:32:49.858280 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Sep 12 17:32:49.858286 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Sep 12 17:32:49.858292 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Sep 12 17:32:49.858301 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Sep 12 17:32:49.858307 kernel: ACPI: PCI: Interrupt link LNKE configured for IRQ 10
Sep 12 17:32:49.858312 kernel: ACPI: PCI: Interrupt link LNKF configured for IRQ 10
Sep 12 17:32:49.858318 kernel: ACPI: PCI: Interrupt link LNKG configured for IRQ 11
Sep 12 17:32:49.858324 kernel: ACPI: PCI: Interrupt link LNKH configured for IRQ 11
Sep 12 17:32:49.858329 kernel: ACPI: PCI: Interrupt link GSIA configured for IRQ 16
Sep 12 17:32:49.858335 kernel: ACPI: PCI: Interrupt link GSIB configured for IRQ 17
Sep 12 17:32:49.858341 kernel: ACPI: PCI: Interrupt link GSIC configured for IRQ 18
Sep 12 17:32:49.858346 kernel: ACPI: PCI: Interrupt link GSID configured for IRQ 19
Sep 12 17:32:49.858354 kernel: ACPI: PCI: Interrupt link GSIE configured for IRQ 20
Sep 12 17:32:49.858657 kernel: ACPI: PCI: Interrupt link GSIF configured for IRQ 21
Sep 12 17:32:49.858666 kernel: ACPI: PCI: Interrupt link GSIG configured for IRQ 22
Sep 12 17:32:49.858672 kernel: ACPI: PCI: Interrupt link GSIH configured for IRQ 23
Sep 12 17:32:49.858678 kernel: iommu: Default domain type: Translated
Sep 12 17:32:49.858684 kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Sep 12 17:32:49.858690 kernel: PCI: Using ACPI for IRQ routing
Sep 12 17:32:49.858696 kernel: PCI: pci_cache_line_size set to 64 bytes
Sep 12 17:32:49.858701 kernel: e820: reserve RAM buffer [mem 0x0009fc00-0x0009ffff]
Sep 12 17:32:49.858709 kernel: e820: reserve RAM buffer [mem 0x7cfdc000-0x7fffffff]
Sep 12 17:32:49.858788 kernel: pci 0000:00:01.0: vgaarb: setting as boot VGA device
Sep 12 17:32:49.858852 kernel: pci 0000:00:01.0: vgaarb: bridge control possible
Sep 12 17:32:49.858927 kernel: pci 0000:00:01.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Sep 12 17:32:49.858937 kernel: vgaarb: loaded
Sep 12 17:32:49.858943 kernel: hpet0: at MMIO 0xfed00000, IRQs 2, 8, 0
Sep 12 17:32:49.858949 kernel: hpet0: 3 comparators, 64-bit 100.000000 MHz counter
Sep 12 17:32:49.858954 kernel: clocksource: Switched to clocksource kvm-clock
Sep 12 17:32:49.858963 kernel: VFS: Disk quotas dquot_6.6.0
Sep 12 17:32:49.858969 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Sep 12 17:32:49.858975 kernel: pnp: PnP ACPI init
Sep 12 17:32:49.859047 kernel: system 00:04: [mem 0xb0000000-0xbfffffff window] has been reserved
Sep 12 17:32:49.859057 kernel: pnp: PnP ACPI: found 5 devices
Sep 12 17:32:49.859063 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Sep 12 17:32:49.859068 kernel: NET: Registered PF_INET protocol family
Sep 12 17:32:49.859074 kernel: IP idents hash table entries: 32768 (order: 6, 262144 bytes, linear)
Sep 12 17:32:49.859082 kernel: tcp_listen_portaddr_hash hash table entries: 1024 (order: 2, 16384 bytes, linear)
Sep 12 17:32:49.859088 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Sep 12 17:32:49.859094 kernel: TCP established hash table entries: 16384 (order: 5, 131072 bytes, linear)
Sep 12 17:32:49.859100 kernel: TCP bind hash table entries: 16384 (order: 7, 524288 bytes, linear)
Sep 12 17:32:49.859106 kernel: TCP: Hash tables configured (established 16384 bind 16384)
Sep 12 17:32:49.859112 kernel: UDP hash table entries: 1024 (order: 3, 32768 bytes, linear)
Sep 12 17:32:49.859117 kernel: UDP-Lite hash table entries: 1024 (order: 3, 32768 bytes, linear)
Sep 12 17:32:49.859123 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Sep 12 17:32:49.859129 kernel: NET: Registered PF_XDP protocol family
Sep 12 17:32:49.859194 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x0fff] to [bus 07] add_size 1000
Sep 12 17:32:49.859256 kernel: pci 0000:00:02.7: bridge window [io 0x1000-0x0fff] to [bus 08] add_size 1000
Sep 12 17:32:49.859317 kernel: pci 0000:00:03.0: bridge window [io 0x1000-0x0fff] to [bus 09] add_size 1000
Sep 12 17:32:49.859378 kernel: pci 0000:00:02.6: BAR 13: assigned [io 0x1000-0x1fff]
Sep 12 17:32:49.859437 kernel: pci 0000:00:02.7: BAR 13: assigned [io 0x2000-0x2fff]
Sep 12 17:32:49.859496 kernel: pci 0000:00:03.0: BAR 13: assigned [io 0x3000-0x3fff]
Sep 12 17:32:49.859586 kernel: pci 0000:00:02.0: PCI bridge to [bus 01]
Sep 12 17:32:49.859654 kernel: pci 0000:00:02.0: bridge window [mem 0xfe800000-0xfe9fffff]
Sep 12 17:32:49.859714 kernel: pci 0000:00:02.0: bridge window [mem 0xfd000000-0xfd1fffff 64bit pref]
Sep 12 17:32:49.859774 kernel: pci 0000:00:02.1: PCI bridge to [bus 02]
Sep 12 17:32:49.859835 kernel: pci 0000:00:02.1: bridge window [mem 0xfe600000-0xfe7fffff]
Sep 12 17:32:49.859908 kernel: pci 0000:00:02.1: bridge window [mem 0xfce00000-0xfcffffff 64bit pref]
Sep 12 17:32:49.859973 kernel: pci 0000:00:02.2: PCI bridge to [bus 03]
Sep 12 17:32:49.860034 kernel: pci 0000:00:02.2: bridge window [mem 0xfe400000-0xfe5fffff]
Sep 12 17:32:49.860094 kernel: pci 0000:00:02.2: bridge window [mem 0xfcc00000-0xfcdfffff 64bit pref]
Sep 12 17:32:49.860154 kernel: pci 0000:00:02.3: PCI bridge to [bus 04]
Sep 12 17:32:49.860251 kernel: pci 0000:00:02.3: bridge window [mem 0xfe200000-0xfe3fffff]
Sep 12 17:32:49.860316 kernel: pci 0000:00:02.3: bridge window [mem 0xfca00000-0xfcbfffff 64bit pref]
Sep 12 17:32:49.860378 kernel: pci 0000:00:02.4: PCI bridge to [bus 05]
Sep 12 17:32:49.860438 kernel: pci 0000:00:02.4: bridge window [mem 0xfe000000-0xfe1fffff]
Sep 12 17:32:49.860499 kernel: pci 0000:00:02.4: bridge window [mem 0xfc800000-0xfc9fffff 64bit pref]
Sep 12 17:32:49.862669 kernel: pci 0000:00:02.5: PCI bridge to [bus 06]
Sep 12 17:32:49.862762 kernel: pci 0000:00:02.5: bridge window [mem 0xfde00000-0xfdffffff]
Sep 12 17:32:49.862838 kernel: pci 0000:00:02.5: bridge window [mem 0xfc600000-0xfc7fffff 64bit pref]
Sep 12 17:32:49.862920 kernel: pci 0000:00:02.6: PCI bridge to [bus 07]
Sep 12 17:32:49.862986 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x1fff]
Sep 12 17:32:49.863048 kernel: pci 0000:00:02.6: bridge window [mem 0xfdc00000-0xfddfffff]
Sep 12 17:32:49.863110 kernel: pci 0000:00:02.6: bridge window [mem 0xfc400000-0xfc5fffff 64bit pref]
Sep 12 17:32:49.863170 kernel: pci 0000:00:02.7: PCI bridge to [bus 08]
Sep 12 17:32:49.863230 kernel: pci 0000:00:02.7: bridge window [io 0x2000-0x2fff]
Sep 12 17:32:49.863290 kernel: pci 0000:00:02.7: bridge window [mem 0xfda00000-0xfdbfffff]
Sep 12 17:32:49.863349 kernel: pci 0000:00:02.7: bridge window [mem 0xfc200000-0xfc3fffff 64bit pref]
Sep 12 17:32:49.863409 kernel: pci 0000:00:03.0: PCI bridge to [bus 09]
Sep 12 17:32:49.863474 kernel: pci 0000:00:03.0: bridge window [io 0x3000-0x3fff]
Sep 12 17:32:49.864505 kernel: pci 0000:00:03.0: bridge window [mem 0xfd800000-0xfd9fffff]
Sep 12 17:32:49.864628 kernel: pci 0000:00:03.0: bridge window [mem 0xfc000000-0xfc1fffff 64bit pref]
Sep 12 17:32:49.864697 kernel: pci_bus 0000:00: resource 4 [io 0x0000-0x0cf7 window]
Sep 12 17:32:49.864754 kernel: pci_bus 0000:00: resource 5 [io 0x0d00-0xffff window]
Sep 12 17:32:49.864808 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Sep 12 17:32:49.864861 kernel: pci_bus 0000:00: resource 7 [mem 0x7d000000-0xafffffff window]
Sep 12 17:32:49.864932 kernel: pci_bus 0000:00: resource 8 [mem 0xc0000000-0xfebfffff window]
Sep 12 17:32:49.864987 kernel: pci_bus 0000:00: resource 9 [mem 0x100000000-0x8ffffffff window]
Sep 12 17:32:49.865050 kernel: pci_bus 0000:01: resource 1 [mem 0xfe800000-0xfe9fffff]
Sep 12 17:32:49.865112 kernel: pci_bus 0000:01: resource 2 [mem 0xfd000000-0xfd1fffff 64bit pref]
Sep 12 17:32:49.865174 kernel: pci_bus 0000:02: resource 1 [mem 0xfe600000-0xfe7fffff]
Sep 12 17:32:49.865230 kernel: pci_bus 0000:02: resource 2 [mem 0xfce00000-0xfcffffff 64bit pref]
Sep 12 17:32:49.865290 kernel: pci_bus 0000:03: resource 1 [mem 0xfe400000-0xfe5fffff]
Sep 12 17:32:49.865345 kernel: pci_bus 0000:03: resource 2 [mem 0xfcc00000-0xfcdfffff 64bit pref]
Sep 12 17:32:49.865411 kernel: pci_bus 0000:04: resource 1 [mem 0xfe200000-0xfe3fffff]
Sep 12 17:32:49.865468 kernel: pci_bus 0000:04: resource 2 [mem 0xfca00000-0xfcbfffff 64bit pref]
Sep 12 17:32:49.865563 kernel: pci_bus 0000:05: resource 1 [mem 0xfe000000-0xfe1fffff]
Sep 12 17:32:49.866595 kernel: pci_bus 0000:05: resource 2 [mem 0xfc800000-0xfc9fffff 64bit pref]
Sep 12 17:32:49.866667 kernel: pci_bus 0000:06: resource 1 [mem 0xfde00000-0xfdffffff]
Sep 12 17:32:49.866724 kernel: pci_bus 0000:06: resource 2 [mem 0xfc600000-0xfc7fffff 64bit pref]
Sep 12 17:32:49.866786 kernel: pci_bus 0000:07: resource 0 [io 0x1000-0x1fff]
Sep 12 17:32:49.866847 kernel: pci_bus 0000:07: resource 1 [mem 0xfdc00000-0xfddfffff]
Sep 12 17:32:49.866914 kernel: pci_bus 0000:07: resource 2 [mem 0xfc400000-0xfc5fffff 64bit pref]
Sep 12 17:32:49.866977 kernel: pci_bus 0000:08: resource 0 [io 0x2000-0x2fff]
Sep 12 17:32:49.867032 kernel: pci_bus 0000:08: resource 1 [mem 0xfda00000-0xfdbfffff]
Sep 12 17:32:49.867088 kernel: pci_bus 0000:08: resource 2 [mem 0xfc200000-0xfc3fffff 64bit pref]
Sep 12 17:32:49.867151 kernel: pci_bus 0000:09: resource 0 [io 0x3000-0x3fff]
Sep 12 17:32:49.867212 kernel: pci_bus 0000:09: resource 1 [mem 0xfd800000-0xfd9fffff]
Sep 12 17:32:49.867268 kernel: pci_bus 0000:09: resource 2 [mem 0xfc000000-0xfc1fffff 64bit pref]
Sep 12 17:32:49.867277 kernel: ACPI: \_SB_.GSIG: Enabled at IRQ 22
Sep 12 17:32:49.867284 kernel: PCI: CLS 0 bytes, default 64
Sep 12 17:32:49.867290 kernel: Initialise system trusted keyrings
Sep 12 17:32:49.867297 kernel: workingset: timestamp_bits=39 max_order=19 bucket_order=0
Sep 12 17:32:49.867303 kernel: Key type asymmetric registered
Sep 12 17:32:49.867309 kernel: Asymmetric key parser 'x509' registered
Sep 12 17:32:49.867315 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 251)
Sep 12 17:32:49.867323 kernel: io scheduler mq-deadline registered
Sep 12 17:32:49.867329 kernel: io scheduler kyber registered
Sep 12 17:32:49.867336 kernel: io scheduler bfq registered
Sep 12 17:32:49.867399 kernel: pcieport 0000:00:02.0: PME: Signaling with IRQ 24
Sep 12 17:32:49.867461 kernel: pcieport 0000:00:02.0: AER: enabled with IRQ 24
Sep 12 17:32:49.868543 kernel: pcieport 0000:00:02.1: PME: Signaling with IRQ 25
Sep 12 17:32:49.868618 kernel: pcieport 0000:00:02.1: AER: enabled with IRQ 25
Sep 12 17:32:49.868683 kernel: pcieport 0000:00:02.2: PME: Signaling with IRQ 26
Sep 12 17:32:49.868750 kernel: pcieport 0000:00:02.2: AER: enabled with IRQ 26
Sep 12 17:32:49.868813 kernel: pcieport 0000:00:02.3: PME: Signaling with IRQ 27
Sep 12 17:32:49.868873 kernel: pcieport 0000:00:02.3: AER: enabled with IRQ 27
Sep 12 17:32:49.868951 kernel: pcieport 0000:00:02.4: PME: Signaling with IRQ 28
Sep 12 17:32:49.869012 kernel: pcieport 0000:00:02.4: AER: enabled with IRQ 28
Sep 12 17:32:49.869072 kernel: pcieport 0000:00:02.5: PME: Signaling with IRQ 29
Sep 12 17:32:49.869131 kernel: pcieport 0000:00:02.5: AER: enabled with IRQ 29
Sep 12 17:32:49.869191 kernel: pcieport 0000:00:02.6: PME: Signaling with IRQ 30
Sep 12 17:32:49.869251 kernel: pcieport 0000:00:02.6: AER: enabled with IRQ 30
Sep 12 17:32:49.869317 kernel: pcieport 0000:00:02.7: PME: Signaling with IRQ 31
Sep 12 17:32:49.869377 kernel: pcieport 0000:00:02.7: AER: enabled with IRQ 31
Sep 12 17:32:49.869386 kernel: ACPI: \_SB_.GSIH: Enabled at IRQ 23
Sep 12 17:32:49.869444 kernel: pcieport 0000:00:03.0: PME: Signaling with IRQ 32
Sep 12 17:32:49.869504 kernel: pcieport 0000:00:03.0: AER: enabled with IRQ 32
Sep 12 17:32:49.869513 kernel: ioatdma: Intel(R) QuickData Technology Driver 5.00
Sep 12 17:32:49.870557 kernel: ACPI: \_SB_.GSIF: Enabled at IRQ 21
Sep 12 17:32:49.870564 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Sep 12 17:32:49.870574 kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
Sep 12 17:32:49.870580 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12
Sep 12 17:32:49.870586 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1
Sep 12 17:32:49.870592 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12
Sep 12 17:32:49.870672 kernel: rtc_cmos 00:03: RTC can wake from S4
Sep 12 17:32:49.870682 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input0
Sep 12 17:32:49.870739 kernel: rtc_cmos 00:03: registered as rtc0
Sep 12 17:32:49.870795 kernel: rtc_cmos 00:03: setting system clock to 2025-09-12T17:32:49 UTC (1757698369)
Sep 12 17:32:49.870853 kernel: rtc_cmos 00:03: alarms up to one day, y3k, 242 bytes nvram, hpet irqs
Sep 12 17:32:49.870864 kernel: amd_pstate: the _CPC object is not present in SBIOS or ACPI disabled
Sep 12 17:32:49.870871 kernel: NET: Registered PF_INET6 protocol family
Sep 12 17:32:49.870877 kernel: Segment Routing with IPv6
Sep 12 17:32:49.870883 kernel: In-situ OAM (IOAM) with IPv6
Sep 12 17:32:49.870889 kernel: NET: Registered PF_PACKET protocol family
Sep 12 17:32:49.870908 kernel: Key type dns_resolver registered
Sep 12 17:32:49.870914 kernel: IPI shorthand broadcast: enabled
Sep 12 17:32:49.870920 kernel: sched_clock: Marking stable (1053007485, 134565795)->(1195774129, -8200849)
Sep 12 17:32:49.870928 kernel: registered taskstats version 1
Sep 12 17:32:49.870934 kernel: Loading compiled-in X.509 certificates
Sep 12 17:32:49.870941 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.6.106-flatcar: 449ba23cbe21e08b3bddb674b4885682335ee1f9'
Sep 12 17:32:49.870946 kernel: Key type .fscrypt registered
Sep 12 17:32:49.870952 kernel: Key type fscrypt-provisioning registered
Sep 12 17:32:49.870958 kernel: ima: No TPM chip found, activating TPM-bypass!
Sep 12 17:32:49.870964 kernel: ima: Allocated hash algorithm: sha1
Sep 12 17:32:49.870970 kernel: ima: No architecture policies found
Sep 12 17:32:49.870977 kernel: clk: Disabling unused clocks
Sep 12 17:32:49.870985 kernel: Freeing unused kernel image (initmem) memory: 42884K
Sep 12 17:32:49.870991 kernel: Write protecting the kernel read-only data: 36864k
Sep 12 17:32:49.870997 kernel: Freeing unused kernel image (rodata/data gap) memory: 1832K
Sep 12 17:32:49.871003 kernel: Run /init as init process
Sep 12 17:32:49.871009 kernel: with arguments:
Sep 12 17:32:49.871015 kernel: /init
Sep 12 17:32:49.871021 kernel: with environment:
Sep 12 17:32:49.871026 kernel: HOME=/
Sep 12 17:32:49.871032 kernel: TERM=linux
Sep 12 17:32:49.871039 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a
Sep 12 17:32:49.871047 systemd[1]: systemd 255 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT default-hierarchy=unified)
Sep 12 17:32:49.871056 systemd[1]: Detected virtualization kvm.
Sep 12 17:32:49.871063 systemd[1]: Detected architecture x86-64.
Sep 12 17:32:49.871069 systemd[1]: Running in initrd.
Sep 12 17:32:49.871075 systemd[1]: No hostname configured, using default hostname.
Sep 12 17:32:49.871081 systemd[1]: Hostname set to .
Sep 12 17:32:49.871089 systemd[1]: Initializing machine ID from VM UUID.
Sep 12 17:32:49.871095 systemd[1]: Queued start job for default target initrd.target.
Sep 12 17:32:49.871101 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Sep 12 17:32:49.871108 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Sep 12 17:32:49.871115 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM...
Sep 12 17:32:49.871121 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Sep 12 17:32:49.871128 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT...
Sep 12 17:32:49.871134 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A...
Sep 12 17:32:49.871143 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132...
Sep 12 17:32:49.871149 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr...
Sep 12 17:32:49.871156 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Sep 12 17:32:49.871162 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Sep 12 17:32:49.871169 systemd[1]: Reached target paths.target - Path Units.
Sep 12 17:32:49.871175 systemd[1]: Reached target slices.target - Slice Units.
Sep 12 17:32:49.871182 systemd[1]: Reached target swap.target - Swaps.
Sep 12 17:32:49.871189 systemd[1]: Reached target timers.target - Timer Units.
Sep 12 17:32:49.871196 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket.
Sep 12 17:32:49.871202 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Sep 12 17:32:49.871209 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log).
Sep 12 17:32:49.871215 systemd[1]: Listening on systemd-journald.socket - Journal Socket.
Sep 12 17:32:49.871222 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Sep 12 17:32:49.871228 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Sep 12 17:32:49.871234 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Sep 12 17:32:49.871241 systemd[1]: Reached target sockets.target - Socket Units.
Sep 12 17:32:49.871248 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup...
Sep 12 17:32:49.871257 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Sep 12 17:32:49.871268 systemd[1]: Finished network-cleanup.service - Network Cleanup.
Sep 12 17:32:49.871280 systemd[1]: Starting systemd-fsck-usr.service...
Sep 12 17:32:49.871292 systemd[1]: Starting systemd-journald.service - Journal Service...
Sep 12 17:32:49.871300 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Sep 12 17:32:49.871306 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Sep 12 17:32:49.871328 systemd-journald[187]: Collecting audit messages is disabled.
Sep 12 17:32:49.871347 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup.
Sep 12 17:32:49.871354 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Sep 12 17:32:49.871361 systemd[1]: Finished systemd-fsck-usr.service.
Sep 12 17:32:49.871370 systemd-journald[187]: Journal started
Sep 12 17:32:49.871386 systemd-journald[187]: Runtime Journal (/run/log/journal/4a9a170610e3462c88a2b29b897bd04d) is 4.8M, max 38.4M, 33.6M free.
Sep 12 17:32:49.855774 systemd-modules-load[188]: Inserted module 'overlay'
Sep 12 17:32:49.909481 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Sep 12 17:32:49.909508 kernel: Bridge firewalling registered
Sep 12 17:32:49.909538 systemd[1]: Started systemd-journald.service - Journal Service.
Sep 12 17:32:49.881749 systemd-modules-load[188]: Inserted module 'br_netfilter'
Sep 12 17:32:49.910257 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Sep 12 17:32:49.911133 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Sep 12 17:32:49.916647 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Sep 12 17:32:49.918258 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Sep 12 17:32:49.920649 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
Sep 12 17:32:49.922142 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Sep 12 17:32:49.930615 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Sep 12 17:32:49.938294 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Sep 12 17:32:49.941593 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook...
Sep 12 17:32:49.944190 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Sep 12 17:32:49.950780 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Sep 12 17:32:49.952925 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Sep 12 17:32:49.957680 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Sep 12 17:32:49.959586 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Sep 12 17:32:49.966541 dracut-cmdline[214]: dracut-dracut-053
Sep 12 17:32:49.969008 dracut-cmdline[214]: Using kernel command line parameters: rd.driver.pre=btrfs rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=hetzner verity.usrhash=1ff9ec556ac80c67ae2340139aa421bf26af13357ec9e72632b4878e9945dc9a
Sep 12 17:32:49.982637 systemd-resolved[225]: Positive Trust Anchors:
Sep 12 17:32:49.982653 systemd-resolved[225]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Sep 12 17:32:49.982694 systemd-resolved[225]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Sep 12 17:32:49.991371 systemd-resolved[225]: Defaulting to hostname 'linux'.
Sep 12 17:32:49.992136 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Sep 12 17:32:49.992884 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Sep 12 17:32:50.022566 kernel: SCSI subsystem initialized
Sep 12 17:32:50.029548 kernel: Loading iSCSI transport class v2.0-870.
Sep 12 17:32:50.038551 kernel: iscsi: registered transport (tcp)
Sep 12 17:32:50.055553 kernel: iscsi: registered transport (qla4xxx)
Sep 12 17:32:50.055583 kernel: QLogic iSCSI HBA Driver
Sep 12 17:32:50.088288 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook.
Sep 12 17:32:50.095642 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook...
Sep 12 17:32:50.115564 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Sep 12 17:32:50.115605 kernel: device-mapper: uevent: version 1.0.3
Sep 12 17:32:50.115618 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@redhat.com
Sep 12 17:32:50.153553 kernel: raid6: avx2x4 gen() 32919 MB/s
Sep 12 17:32:50.170557 kernel: raid6: avx2x2 gen() 30361 MB/s
Sep 12 17:32:50.187650 kernel: raid6: avx2x1 gen() 25612 MB/s
Sep 12 17:32:50.187680 kernel: raid6: using algorithm avx2x4 gen() 32919 MB/s
Sep 12 17:32:50.205729 kernel: raid6: .... xor() 5375 MB/s, rmw enabled
Sep 12 17:32:50.205757 kernel: raid6: using avx2x2 recovery algorithm
Sep 12 17:32:50.222550 kernel: xor: automatically using best checksumming function avx
Sep 12 17:32:50.343566 kernel: Btrfs loaded, zoned=no, fsverity=no
Sep 12 17:32:50.353197 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook.
Sep 12 17:32:50.360697 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Sep 12 17:32:50.372649 systemd-udevd[406]: Using default interface naming scheme 'v255'.
Sep 12 17:32:50.376136 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Sep 12 17:32:50.384637 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook...
Sep 12 17:32:50.398600 dracut-pre-trigger[408]: rd.md=0: removing MD RAID activation
Sep 12 17:32:50.426874 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook.
Sep 12 17:32:50.429704 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Sep 12 17:32:50.476293 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Sep 12 17:32:50.484659 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook...
Sep 12 17:32:50.496684 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook.
Sep 12 17:32:50.498198 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems.
Sep 12 17:32:50.499678 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Sep 12 17:32:50.500626 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Sep 12 17:32:50.505811 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook...
Sep 12 17:32:50.514145 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook.
Sep 12 17:32:50.561056 kernel: scsi host0: Virtio SCSI HBA
Sep 12 17:32:50.586550 kernel: scsi 0:0:0:0: Direct-Access QEMU QEMU HARDDISK 2.5+ PQ: 0 ANSI: 5
Sep 12 17:32:50.591602 kernel: cryptd: max_cpu_qlen set to 1000
Sep 12 17:32:50.594588 kernel: ACPI: bus type USB registered
Sep 12 17:32:50.602254 kernel: usbcore: registered new interface driver usbfs
Sep 12 17:32:50.597432 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
Sep 12 17:32:50.597497 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Sep 12 17:32:50.599025 systemd[1]: Stopping dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Sep 12 17:32:50.599537 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Sep 12 17:32:50.599660 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Sep 12 17:32:50.600513 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup...
Sep 12 17:32:50.609476 kernel: usbcore: registered new interface driver hub
Sep 12 17:32:50.608603 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Sep 12 17:32:50.614563 kernel: usbcore: registered new device driver usb
Sep 12 17:32:50.614584 kernel: libata version 3.00 loaded.
Sep 12 17:32:50.630540 kernel: AVX2 version of gcm_enc/dec engaged.
Sep 12 17:32:50.630562 kernel: AES CTR mode by8 optimization enabled
Sep 12 17:32:50.665557 kernel: ahci 0000:00:1f.2: version 3.0
Sep 12 17:32:50.665708 kernel: ACPI: \_SB_.GSIA: Enabled at IRQ 16
Sep 12 17:32:50.667537 kernel: ahci 0000:00:1f.2: AHCI 0001.0000 32 slots 6 ports 1.5 Gbps 0x3f impl SATA mode
Sep 12 17:32:50.667646 kernel: ahci 0000:00:1f.2: flags: 64bit ncq only
Sep 12 17:32:50.678562 kernel: scsi host1: ahci
Sep 12 17:32:50.679143 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Sep 12 17:32:50.685540 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller
Sep 12 17:32:50.685661 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 1
Sep 12 17:32:50.685745 kernel: xhci_hcd 0000:02:00.0: hcc params 0x00087001 hci version 0x100 quirks 0x0000000000000010
Sep 12 17:32:50.687686 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller
Sep 12 17:32:50.688989 kernel: scsi host2: ahci
Sep 12 17:32:50.689021 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 2
Sep 12 17:32:50.688752 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Sep 12 17:32:50.701312 kernel: xhci_hcd 0000:02:00.0: Host supports USB 3.0 SuperSpeed
Sep 12 17:32:50.701426 kernel: hub 1-0:1.0: USB hub found
Sep 12 17:32:50.701543 kernel: hub 1-0:1.0: 4 ports detected
Sep 12 17:32:50.701648 kernel: usb usb2: We don't know the algorithms for LPM for this host, disabling LPM.
Sep 12 17:32:50.701739 kernel: scsi host3: ahci
Sep 12 17:32:50.701817 kernel: hub 2-0:1.0: USB hub found
Sep 12 17:32:50.701919 kernel: hub 2-0:1.0: 4 ports detected
Sep 12 17:32:50.701998 kernel: sd 0:0:0:0: Power-on or device reset occurred
Sep 12 17:32:50.702090 kernel: scsi host4: ahci
Sep 12 17:32:50.702168 kernel: sd 0:0:0:0: [sda] 80003072 512-byte logical blocks: (41.0 GB/38.1 GiB)
Sep 12 17:32:50.702261 kernel: sd 0:0:0:0: [sda] Write Protect is off
Sep 12 17:32:50.702344 kernel: sd 0:0:0:0: [sda] Mode Sense: 63 00 00 08
Sep 12 17:32:50.702423 kernel: scsi host5: ahci
Sep 12 17:32:50.703599 kernel: sd 0:0:0:0: [sda] Write cache: enabled, read cache: enabled, doesn't support DPO or FUA
Sep 12 17:32:50.705547 kernel: scsi host6: ahci
Sep 12 17:32:50.711548 kernel: ata1: SATA max UDMA/133 abar m4096@0xfea1a000 port 0xfea1a100 irq 46
Sep 12 17:32:50.711574 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk.
Sep 12 17:32:50.711583 kernel: ata2: SATA max UDMA/133 abar m4096@0xfea1a000 port 0xfea1a180 irq 46
Sep 12 17:32:50.711591 kernel: GPT:17805311 != 80003071
Sep 12 17:32:50.711598 kernel: ata3: SATA max UDMA/133 abar m4096@0xfea1a000 port 0xfea1a200 irq 46
Sep 12 17:32:50.711605 kernel: GPT:Alternate GPT header not at the end of the disk.
Sep 12 17:32:50.711613 kernel: ata4: SATA max UDMA/133 abar m4096@0xfea1a000 port 0xfea1a280 irq 46
Sep 12 17:32:50.711624 kernel: GPT:17805311 != 80003071
Sep 12 17:32:50.711631 kernel: ata5: SATA max UDMA/133 abar m4096@0xfea1a000 port 0xfea1a300 irq 46
Sep 12 17:32:50.711638 kernel: GPT: Use GNU Parted to correct GPT errors.
Sep 12 17:32:50.711645 kernel: ata6: SATA max UDMA/133 abar m4096@0xfea1a000 port 0xfea1a380 irq 46
Sep 12 17:32:50.711652 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
Sep 12 17:32:50.721702 kernel: sd 0:0:0:0: [sda] Attached SCSI disk
Sep 12 17:32:50.729279 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Sep 12 17:32:50.943547 kernel: usb 1-1: new high-speed USB device number 2 using xhci_hcd
Sep 12 17:32:51.020284 kernel: ata2: SATA link down (SStatus 0 SControl 300)
Sep 12 17:32:51.020364 kernel: ata3: SATA link down (SStatus 0 SControl 300)
Sep 12 17:32:51.020546 kernel: ata1: SATA link up 1.5 Gbps (SStatus 113 SControl 300)
Sep 12 17:32:51.023415 kernel: ata1.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100
Sep 12 17:32:51.023435 kernel: ata1.00: applying bridge limits
Sep 12 17:32:51.029925 kernel: ata1.00: configured for UDMA/100
Sep 12 17:32:51.029960 kernel: ata4: SATA link down (SStatus 0 SControl 300)
Sep 12 17:32:51.029972 kernel: ata6: SATA link down (SStatus 0 SControl 300)
Sep 12 17:32:51.029981 kernel: scsi 1:0:0:0: CD-ROM QEMU QEMU DVD-ROM 2.5+ PQ: 0 ANSI: 5
Sep 12 17:32:51.030020 kernel: ata5: SATA link down (SStatus 0 SControl 300)
Sep 12 17:32:51.067537 kernel: sr 1:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray
Sep 12 17:32:51.067716 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
Sep 12 17:32:51.084205 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - QEMU_HARDDISK EFI-SYSTEM.
Sep 12 17:32:51.092840 kernel: BTRFS: device fsid 6dad227e-2c0d-42e6-b0d2-5c756384bc19 devid 1 transid 34 /dev/sda3 scanned by (udev-worker) (453)
Sep 12 17:32:51.092858 kernel: sr 1:0:0:0: Attached scsi CD-ROM sr0
Sep 12 17:32:51.093003 kernel: BTRFS: device label OEM devid 1 transid 9 /dev/sda6 scanned by (udev-worker) (474)
Sep 12 17:32:51.103294 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - QEMU_HARDDISK ROOT.
Sep 12 17:32:51.111561 kernel: hid: raw HID events driver (C) Jiri Kosina
Sep 12 17:32:51.112311 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - QEMU_HARDDISK OEM.
Sep 12 17:32:51.116679 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - QEMU_HARDDISK USR-A.
Sep 12 17:32:51.119571 kernel: usbcore: registered new interface driver usbhid
Sep 12 17:32:51.119586 kernel: usbhid: USB HID core driver
Sep 12 17:32:51.118970 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - QEMU_HARDDISK USR-A.
Sep 12 17:32:51.126841 kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:02.1/0000:02:00.0/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input2
Sep 12 17:32:51.126863 kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:02:00.0-1/input0
Sep 12 17:32:51.127618 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary...
Sep 12 17:32:51.131939 disk-uuid[579]: Primary Header is updated.
Sep 12 17:32:51.131939 disk-uuid[579]: Secondary Entries is updated.
Sep 12 17:32:51.131939 disk-uuid[579]: Secondary Header is updated.
Sep 12 17:32:51.136537 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
Sep 12 17:32:51.141551 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
Sep 12 17:32:51.145553 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
Sep 12 17:32:52.147582 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
Sep 12 17:32:52.149337 disk-uuid[580]: The operation has completed successfully.
Sep 12 17:32:52.185969 systemd[1]: disk-uuid.service: Deactivated successfully.
Sep 12 17:32:52.186058 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary.
Sep 12 17:32:52.209669 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr...
Sep 12 17:32:52.212001 sh[601]: Success
Sep 12 17:32:52.224797 kernel: device-mapper: verity: sha256 using implementation "sha256-ni"
Sep 12 17:32:52.262635 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr.
Sep 12 17:32:52.270537 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr...
Sep 12 17:32:52.271993 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr.
Sep 12 17:32:52.284548 kernel: BTRFS info (device dm-0): first mount of filesystem 6dad227e-2c0d-42e6-b0d2-5c756384bc19
Sep 12 17:32:52.284578 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm
Sep 12 17:32:52.286961 kernel: BTRFS warning (device dm-0): 'nologreplay' is deprecated, use 'rescue=nologreplay' instead
Sep 12 17:32:52.288961 kernel: BTRFS info (device dm-0): disabling log replay at mount time
Sep 12 17:32:52.291335 kernel: BTRFS info (device dm-0): using free space tree
Sep 12 17:32:52.299555 kernel: BTRFS info (device dm-0): enabling ssd optimizations
Sep 12 17:32:52.301068 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr.
Sep 12 17:32:52.301901 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met.
Sep 12 17:32:52.313663 systemd[1]: Starting ignition-setup.service - Ignition (setup)...
Sep 12 17:32:52.316742 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline...
Sep 12 17:32:52.329420 kernel: BTRFS info (device sda6): first mount of filesystem 4080f51d-d3f2-4545-8f59-3798077218dc
Sep 12 17:32:52.329451 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm
Sep 12 17:32:52.329463 kernel: BTRFS info (device sda6): using free space tree
Sep 12 17:32:52.335264 kernel: BTRFS info (device sda6): enabling ssd optimizations
Sep 12 17:32:52.335294 kernel: BTRFS info (device sda6): auto enabling async discard
Sep 12 17:32:52.345138 systemd[1]: mnt-oem.mount: Deactivated successfully.
Sep 12 17:32:52.346106 kernel: BTRFS info (device sda6): last unmount of filesystem 4080f51d-d3f2-4545-8f59-3798077218dc
Sep 12 17:32:52.350752 systemd[1]: Finished ignition-setup.service - Ignition (setup).
Sep 12 17:32:52.356945 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)...
Sep 12 17:32:52.413935 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Sep 12 17:32:52.420675 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Sep 12 17:32:52.423569 ignition[714]: Ignition 2.19.0
Sep 12 17:32:52.423586 ignition[714]: Stage: fetch-offline
Sep 12 17:32:52.424759 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline).
Sep 12 17:32:52.423638 ignition[714]: no configs at "/usr/lib/ignition/base.d"
Sep 12 17:32:52.423646 ignition[714]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Sep 12 17:32:52.423741 ignition[714]: parsed url from cmdline: ""
Sep 12 17:32:52.423744 ignition[714]: no config URL provided
Sep 12 17:32:52.423749 ignition[714]: reading system config file "/usr/lib/ignition/user.ign"
Sep 12 17:32:52.423757 ignition[714]: no config at "/usr/lib/ignition/user.ign"
Sep 12 17:32:52.423763 ignition[714]: failed to fetch config: resource requires networking
Sep 12 17:32:52.424036 ignition[714]: Ignition finished successfully
Sep 12 17:32:52.438191 systemd-networkd[786]: lo: Link UP
Sep 12 17:32:52.438197 systemd-networkd[786]: lo: Gained carrier
Sep 12 17:32:52.440752 systemd-networkd[786]: Enumeration completed
Sep 12 17:32:52.440923 systemd[1]: Started systemd-networkd.service - Network Configuration.
Sep 12 17:32:52.442083 systemd-networkd[786]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Sep 12 17:32:52.442086 systemd-networkd[786]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
Sep 12 17:32:52.442741 systemd[1]: Reached target network.target - Network.
Sep 12 17:32:52.443592 systemd-networkd[786]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Sep 12 17:32:52.443595 systemd-networkd[786]: eth1: Configuring with /usr/lib/systemd/network/zz-default.network.
Sep 12 17:32:52.444183 systemd-networkd[786]: eth0: Link UP
Sep 12 17:32:52.444186 systemd-networkd[786]: eth0: Gained carrier
Sep 12 17:32:52.444192 systemd-networkd[786]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Sep 12 17:32:52.448693 systemd-networkd[786]: eth1: Link UP
Sep 12 17:32:52.448696 systemd-networkd[786]: eth1: Gained carrier
Sep 12 17:32:52.448701 systemd-networkd[786]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Sep 12 17:32:52.449646 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)...
Sep 12 17:32:52.458401 ignition[789]: Ignition 2.19.0
Sep 12 17:32:52.458411 ignition[789]: Stage: fetch
Sep 12 17:32:52.458555 ignition[789]: no configs at "/usr/lib/ignition/base.d"
Sep 12 17:32:52.458564 ignition[789]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Sep 12 17:32:52.458622 ignition[789]: parsed url from cmdline: ""
Sep 12 17:32:52.458625 ignition[789]: no config URL provided
Sep 12 17:32:52.458628 ignition[789]: reading system config file "/usr/lib/ignition/user.ign"
Sep 12 17:32:52.458634 ignition[789]: no config at "/usr/lib/ignition/user.ign"
Sep 12 17:32:52.458648 ignition[789]: GET http://169.254.169.254/hetzner/v1/userdata: attempt #1
Sep 12 17:32:52.458793 ignition[789]: GET error: Get "http://169.254.169.254/hetzner/v1/userdata": dial tcp 169.254.169.254:80: connect: network is unreachable
Sep 12 17:32:52.472554 systemd-networkd[786]: eth1: DHCPv4 address 10.0.0.3/32 acquired from 10.0.0.1
Sep 12 17:32:52.505574 systemd-networkd[786]: eth0: DHCPv4 address 135.181.96.215/32, gateway 172.31.1.1 acquired from 172.31.1.1
Sep 12 17:32:52.659866 ignition[789]: GET http://169.254.169.254/hetzner/v1/userdata: attempt #2
Sep 12 17:32:52.663679 ignition[789]: GET result: OK
Sep 12 17:32:52.663749 ignition[789]: parsing config with SHA512: 3e5fafff626a834d71d0564a362b808ae40f48d72fb4bcd6df108b7f4562213525f4fa8b7645c0219258661d6fe206d3d432ad0f0f0639615d1c4ef08118cdf0
Sep 12 17:32:52.667959 unknown[789]: fetched base config from "system"
Sep 12 17:32:52.667973 unknown[789]: fetched base config from "system"
Sep 12 17:32:52.667978 unknown[789]: fetched user config from "hetzner"
Sep 12 17:32:52.669761 ignition[789]: fetch: fetch complete
Sep 12 17:32:52.669767 ignition[789]: fetch: fetch passed
Sep 12 17:32:52.671477 systemd[1]: Finished ignition-fetch.service - Ignition (fetch).
Sep 12 17:32:52.669807 ignition[789]: Ignition finished successfully
Sep 12 17:32:52.676660 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)...
Sep 12 17:32:52.687319 ignition[797]: Ignition 2.19.0
Sep 12 17:32:52.687330 ignition[797]: Stage: kargs
Sep 12 17:32:52.687476 ignition[797]: no configs at "/usr/lib/ignition/base.d"
Sep 12 17:32:52.687485 ignition[797]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Sep 12 17:32:52.688247 ignition[797]: kargs: kargs passed
Sep 12 17:32:52.690550 systemd[1]: Finished ignition-kargs.service - Ignition (kargs).
Sep 12 17:32:52.688282 ignition[797]: Ignition finished successfully
Sep 12 17:32:52.697634 systemd[1]: Starting ignition-disks.service - Ignition (disks)...
Sep 12 17:32:52.707981 ignition[803]: Ignition 2.19.0
Sep 12 17:32:52.707992 ignition[803]: Stage: disks
Sep 12 17:32:52.708148 ignition[803]: no configs at "/usr/lib/ignition/base.d"
Sep 12 17:32:52.712191 systemd[1]: Finished ignition-disks.service - Ignition (disks).
Sep 12 17:32:52.708157 ignition[803]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Sep 12 17:32:52.714609 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device.
Sep 12 17:32:52.708935 ignition[803]: disks: disks passed
Sep 12 17:32:52.715398 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems.
Sep 12 17:32:52.708972 ignition[803]: Ignition finished successfully
Sep 12 17:32:52.716590 systemd[1]: Reached target local-fs.target - Local File Systems.
Sep 12 17:32:52.717909 systemd[1]: Reached target sysinit.target - System Initialization.
Sep 12 17:32:52.719164 systemd[1]: Reached target basic.target - Basic System.
Sep 12 17:32:52.728689 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT...
Sep 12 17:32:52.741231 systemd-fsck[811]: ROOT: clean, 14/1628000 files, 120691/1617920 blocks
Sep 12 17:32:52.744172 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT.
Sep 12 17:32:52.748614 systemd[1]: Mounting sysroot.mount - /sysroot...
Sep 12 17:32:52.823542 kernel: EXT4-fs (sda9): mounted filesystem 791ad691-63ae-4dbc-8ce3-6c8819e56736 r/w with ordered data mode. Quota mode: none.
Sep 12 17:32:52.824258 systemd[1]: Mounted sysroot.mount - /sysroot.
Sep 12 17:32:52.825046 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System.
Sep 12 17:32:52.830577 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Sep 12 17:32:52.832118 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr...
Sep 12 17:32:52.834638 systemd[1]: Starting flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent...
Sep 12 17:32:52.836080 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot).
Sep 12 17:32:52.836848 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup.
Sep 12 17:32:52.840544 kernel: BTRFS: device label OEM devid 1 transid 10 /dev/sda6 scanned by mount (819)
Sep 12 17:32:52.841608 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr.
Sep 12 17:32:52.852770 kernel: BTRFS info (device sda6): first mount of filesystem 4080f51d-d3f2-4545-8f59-3798077218dc
Sep 12 17:32:52.852789 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm
Sep 12 17:32:52.852798 kernel: BTRFS info (device sda6): using free space tree
Sep 12 17:32:52.852805 kernel: BTRFS info (device sda6): enabling ssd optimizations
Sep 12 17:32:52.852813 kernel: BTRFS info (device sda6): auto enabling async discard
Sep 12 17:32:52.854374 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Sep 12 17:32:52.861983 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup...
Sep 12 17:32:52.893397 coreos-metadata[821]: Sep 12 17:32:52.893 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/hostname: Attempt #1
Sep 12 17:32:52.894822 coreos-metadata[821]: Sep 12 17:32:52.894 INFO Fetch successful
Sep 12 17:32:52.896207 coreos-metadata[821]: Sep 12 17:32:52.895 INFO wrote hostname ci-4081-3-6-f-c182586e87 to /sysroot/etc/hostname
Sep 12 17:32:52.897347 initrd-setup-root[846]: cut: /sysroot/etc/passwd: No such file or directory
Sep 12 17:32:52.898254 systemd[1]: Finished flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent.
Sep 12 17:32:52.901820 initrd-setup-root[854]: cut: /sysroot/etc/group: No such file or directory
Sep 12 17:32:52.904900 initrd-setup-root[861]: cut: /sysroot/etc/shadow: No such file or directory
Sep 12 17:32:52.907789 initrd-setup-root[868]: cut: /sysroot/etc/gshadow: No such file or directory
Sep 12 17:32:52.970447 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup.
Sep 12 17:32:52.975602 systemd[1]: Starting ignition-mount.service - Ignition (mount)...
Sep 12 17:32:52.978627 systemd[1]: Starting sysroot-boot.service - /sysroot/boot...
Sep 12 17:32:52.983540 kernel: BTRFS info (device sda6): last unmount of filesystem 4080f51d-d3f2-4545-8f59-3798077218dc
Sep 12 17:32:53.002143 ignition[936]: INFO : Ignition 2.19.0
Sep 12 17:32:53.004013 ignition[936]: INFO : Stage: mount
Sep 12 17:32:53.004013 ignition[936]: INFO : no configs at "/usr/lib/ignition/base.d"
Sep 12 17:32:53.004013 ignition[936]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Sep 12 17:32:53.006049 ignition[936]: INFO : mount: mount passed
Sep 12 17:32:53.006049 ignition[936]: INFO : Ignition finished successfully
Sep 12 17:32:53.008030 systemd[1]: Finished ignition-mount.service - Ignition (mount).
Sep 12 17:32:53.010448 systemd[1]: Finished sysroot-boot.service - /sysroot/boot.
Sep 12 17:32:53.015625 systemd[1]: Starting ignition-files.service - Ignition (files)...
Sep 12 17:32:53.283584 systemd[1]: sysroot-oem.mount: Deactivated successfully.
Sep 12 17:32:53.288991 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Sep 12 17:32:53.304557 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/sda6 scanned by mount (947)
Sep 12 17:32:53.309986 kernel: BTRFS info (device sda6): first mount of filesystem 4080f51d-d3f2-4545-8f59-3798077218dc
Sep 12 17:32:53.310017 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm
Sep 12 17:32:53.312814 kernel: BTRFS info (device sda6): using free space tree
Sep 12 17:32:53.322113 kernel: BTRFS info (device sda6): enabling ssd optimizations
Sep 12 17:32:53.322141 kernel: BTRFS info (device sda6): auto enabling async discard
Sep 12 17:32:53.325984 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Sep 12 17:32:53.348412 ignition[964]: INFO : Ignition 2.19.0
Sep 12 17:32:53.349563 ignition[964]: INFO : Stage: files
Sep 12 17:32:53.349563 ignition[964]: INFO : no configs at "/usr/lib/ignition/base.d"
Sep 12 17:32:53.351723 ignition[964]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Sep 12 17:32:53.351723 ignition[964]: DEBUG : files: compiled without relabeling support, skipping
Sep 12 17:32:53.353330 ignition[964]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core"
Sep 12 17:32:53.353330 ignition[964]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core"
Sep 12 17:32:53.357398 ignition[964]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core"
Sep 12 17:32:53.358465 ignition[964]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core"
Sep 12 17:32:53.359736 unknown[964]: wrote ssh authorized keys file for user: core
Sep 12 17:32:53.360980 ignition[964]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core"
Sep 12 17:32:53.362259 ignition[964]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.17.0-linux-amd64.tar.gz"
Sep 12 17:32:53.363314 ignition[964]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.17.0-linux-amd64.tar.gz: attempt #1
Sep 12 17:32:53.579761 systemd-networkd[786]: eth1: Gained IPv6LL
Sep 12 17:32:53.662185 ignition[964]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK
Sep 12 17:32:53.900085 systemd-networkd[786]: eth0: Gained IPv6LL
Sep 12 17:32:54.059411 ignition[964]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.0-linux-amd64.tar.gz"
Sep 12 17:32:54.059411 ignition[964]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh"
Sep 12 17:32:54.063351 ignition[964]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh"
Sep 12 17:32:54.063351 ignition[964]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml"
Sep 12 17:32:54.063351 ignition[964]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml"
Sep 12 17:32:54.063351 ignition[964]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml"
Sep 12 17:32:54.063351 ignition[964]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml"
Sep 12 17:32:54.063351 ignition[964]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml"
Sep 12 17:32:54.063351 ignition[964]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml"
Sep 12 17:32:54.063351 ignition[964]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf"
Sep 12 17:32:54.063351 ignition[964]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf"
Sep 12 17:32:54.063351 ignition[964]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.32.4-x86-64.raw"
Sep 12 17:32:54.063351 ignition[964]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.32.4-x86-64.raw"
Sep 12 17:32:54.063351 ignition[964]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.32.4-x86-64.raw"
Sep 12 17:32:54.063351 ignition[964]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.32.4-x86-64.raw: attempt #1
Sep 12 17:32:54.512547 ignition[964]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK
Sep 12 17:32:54.904897 ignition[964]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.32.4-x86-64.raw"
Sep 12 17:32:54.904897 ignition[964]: INFO : files: op(b): [started] processing unit "prepare-helm.service"
Sep 12 17:32:54.909746 ignition[964]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Sep 12 17:32:54.909746 ignition[964]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Sep 12 17:32:54.909746 ignition[964]: INFO : files: op(b): [finished] processing unit "prepare-helm.service"
Sep 12 17:32:54.909746 ignition[964]: INFO : files: op(d): [started] processing unit "coreos-metadata.service"
Sep 12 17:32:54.909746 ignition[964]: INFO : files: op(d): op(e): [started] writing systemd drop-in "00-custom-metadata.conf" at "/sysroot/etc/systemd/system/coreos-metadata.service.d/00-custom-metadata.conf"
Sep 12 17:32:54.909746 ignition[964]: INFO : files: op(d): op(e): [finished] writing systemd drop-in "00-custom-metadata.conf" at "/sysroot/etc/systemd/system/coreos-metadata.service.d/00-custom-metadata.conf"
Sep 12 17:32:54.909746 ignition[964]: INFO : files: op(d): [finished] processing unit "coreos-metadata.service"
Sep 12 17:32:54.909746 ignition[964]: INFO : files: op(f): [started] setting preset to enabled for "prepare-helm.service"
Sep 12 17:32:54.909746 ignition[964]: INFO : files: op(f): [finished] setting preset to enabled for "prepare-helm.service"
Sep 12 17:32:54.909746 ignition[964]: INFO : files: createResultFile: createFiles: op(10): [started] writing file "/sysroot/etc/.ignition-result.json"
Sep 12 17:32:54.909746 ignition[964]: INFO : files: createResultFile: createFiles: op(10): [finished] writing file "/sysroot/etc/.ignition-result.json"
Sep 12 17:32:54.909746 ignition[964]: INFO : files: files passed
Sep 12 17:32:54.909746 ignition[964]: INFO : Ignition finished successfully
Sep 12 17:32:54.911411 systemd[1]: Finished ignition-files.service - Ignition (files).
Sep 12 17:32:54.923731 systemd[1]: Starting ignition-quench.service - Ignition (record completion)...
Sep 12 17:32:54.926729 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion...
Sep 12 17:32:54.928214 systemd[1]: ignition-quench.service: Deactivated successfully.
Sep 12 17:32:54.928339 systemd[1]: Finished ignition-quench.service - Ignition (record completion).
Sep 12 17:32:54.948994 initrd-setup-root-after-ignition[993]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Sep 12 17:32:54.948994 initrd-setup-root-after-ignition[993]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory
Sep 12 17:32:54.953694 initrd-setup-root-after-ignition[997]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Sep 12 17:32:54.954819 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion.
Sep 12 17:32:54.957047 systemd[1]: Reached target ignition-complete.target - Ignition Complete.
Sep 12 17:32:54.964734 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root...
Sep 12 17:32:55.004502 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Sep 12 17:32:55.004702 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root.
Sep 12 17:32:55.007352 systemd[1]: Reached target initrd-fs.target - Initrd File Systems.
Sep 12 17:32:55.009227 systemd[1]: Reached target initrd.target - Initrd Default Target.
Sep 12 17:32:55.011458 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met.
Sep 12 17:32:55.017720 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook...
Sep 12 17:32:55.036081 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Sep 12 17:32:55.043792 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons...
Sep 12 17:32:55.059064 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups.
Sep 12 17:32:55.060443 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes.
Sep 12 17:32:55.063160 systemd[1]: Stopped target timers.target - Timer Units.
Sep 12 17:32:55.065486 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Sep 12 17:32:55.065812 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Sep 12 17:32:55.068186 systemd[1]: Stopped target initrd.target - Initrd Default Target.
Sep 12 17:32:55.069766 systemd[1]: Stopped target basic.target - Basic System.
Sep 12 17:32:55.072077 systemd[1]: Stopped target ignition-complete.target - Ignition Complete.
Sep 12 17:32:55.074074 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup.
Sep 12 17:32:55.076169 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device.
Sep 12 17:32:55.078250 systemd[1]: Stopped target remote-fs.target - Remote File Systems.
Sep 12 17:32:55.080687 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems.
Sep 12 17:32:55.083005 systemd[1]: Stopped target sysinit.target - System Initialization.
Sep 12 17:32:55.085237 systemd[1]: Stopped target local-fs.target - Local File Systems.
Sep 12 17:32:55.087751 systemd[1]: Stopped target swap.target - Swaps.
Sep 12 17:32:55.089683 systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Sep 12 17:32:55.089930 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook.
Sep 12 17:32:55.092298 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes.
Sep 12 17:32:55.093729 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Sep 12 17:32:55.095900 systemd[1]: clevis-luks-askpass.path: Deactivated successfully.
Sep 12 17:32:55.097083 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Sep 12 17:32:55.098242 systemd[1]: dracut-initqueue.service: Deactivated successfully.
Sep 12 17:32:55.098415 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook.
Sep 12 17:32:55.101267 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully.
Sep 12 17:32:55.101442 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion.
Sep 12 17:32:55.102758 systemd[1]: ignition-files.service: Deactivated successfully.
Sep 12 17:32:55.103038 systemd[1]: Stopped ignition-files.service - Ignition (files).
Sep 12 17:32:55.104607 systemd[1]: flatcar-metadata-hostname.service: Deactivated successfully.
Sep 12 17:32:55.104900 systemd[1]: Stopped flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent.
Sep 12 17:32:55.112865 systemd[1]: Stopping ignition-mount.service - Ignition (mount)...
Sep 12 17:32:55.114268 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Sep 12 17:32:55.114515 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes.
Sep 12 17:32:55.121787 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot...
Sep 12 17:32:55.122695 systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Sep 12 17:32:55.122880 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices.
Sep 12 17:32:55.129958 systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Sep 12 17:32:55.130488 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook.
Sep 12 17:32:55.136795 systemd[1]: initrd-cleanup.service: Deactivated successfully.
Sep 12 17:32:55.136904 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons.
Sep 12 17:32:55.141297 ignition[1017]: INFO : Ignition 2.19.0
Sep 12 17:32:55.143721 ignition[1017]: INFO : Stage: umount
Sep 12 17:32:55.143721 ignition[1017]: INFO : no configs at "/usr/lib/ignition/base.d"
Sep 12 17:32:55.143721 ignition[1017]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Sep 12 17:32:55.143721 ignition[1017]: INFO : umount: umount passed
Sep 12 17:32:55.143721 ignition[1017]: INFO : Ignition finished successfully
Sep 12 17:32:55.143270 systemd[1]: ignition-mount.service: Deactivated successfully.
Sep 12 17:32:55.143352 systemd[1]: Stopped ignition-mount.service - Ignition (mount).
Sep 12 17:32:55.144702 systemd[1]: ignition-disks.service: Deactivated successfully.
Sep 12 17:32:55.144784 systemd[1]: Stopped ignition-disks.service - Ignition (disks).
Sep 12 17:32:55.145560 systemd[1]: ignition-kargs.service: Deactivated successfully.
Sep 12 17:32:55.145603 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs).
Sep 12 17:32:55.153040 systemd[1]: ignition-fetch.service: Deactivated successfully.
Sep 12 17:32:55.153082 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch).
Sep 12 17:32:55.159813 systemd[1]: Stopped target network.target - Network.
Sep 12 17:32:55.168557 systemd[1]: ignition-fetch-offline.service: Deactivated successfully.
Sep 12 17:32:55.168611 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline).
Sep 12 17:32:55.169720 systemd[1]: Stopped target paths.target - Path Units.
Sep 12 17:32:55.170907 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Sep 12 17:32:55.171554 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Sep 12 17:32:55.172262 systemd[1]: Stopped target slices.target - Slice Units.
Sep 12 17:32:55.174066 systemd[1]: Stopped target sockets.target - Socket Units.
Sep 12 17:32:55.180454 systemd[1]: iscsid.socket: Deactivated successfully.
Sep 12 17:32:55.180496 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket.
Sep 12 17:32:55.181963 systemd[1]: iscsiuio.socket: Deactivated successfully.
Sep 12 17:32:55.181997 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Sep 12 17:32:55.183146 systemd[1]: ignition-setup.service: Deactivated successfully.
Sep 12 17:32:55.183189 systemd[1]: Stopped ignition-setup.service - Ignition (setup).
Sep 12 17:32:55.184364 systemd[1]: ignition-setup-pre.service: Deactivated successfully.
Sep 12 17:32:55.184404 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup.
Sep 12 17:32:55.185686 systemd[1]: Stopping systemd-networkd.service - Network Configuration...
Sep 12 17:32:55.187255 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution...
Sep 12 17:32:55.189970 systemd[1]: sysroot-boot.mount: Deactivated successfully.
Sep 12 17:32:55.190620 systemd[1]: sysroot-boot.service: Deactivated successfully.
Sep 12 17:32:55.190743 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot.
Sep 12 17:32:55.191219 systemd-networkd[786]: eth0: DHCPv6 lease lost
Sep 12 17:32:55.193278 systemd[1]: systemd-resolved.service: Deactivated successfully.
Sep 12 17:32:55.193371 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution.
Sep 12 17:32:55.194765 systemd-networkd[786]: eth1: DHCPv6 lease lost
Sep 12 17:32:55.196461 systemd[1]: initrd-setup-root.service: Deactivated successfully.
Sep 12 17:32:55.196736 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup.
Sep 12 17:32:55.198421 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Sep 12 17:32:55.198476 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories.
Sep 12 17:32:55.200284 systemd[1]: systemd-networkd.service: Deactivated successfully.
Sep 12 17:32:55.200389 systemd[1]: Stopped systemd-networkd.service - Network Configuration.
Sep 12 17:32:55.201760 systemd[1]: systemd-networkd.socket: Deactivated successfully.
Sep 12 17:32:55.201803 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket.
Sep 12 17:32:55.207686 systemd[1]: Stopping network-cleanup.service - Network Cleanup...
Sep 12 17:32:55.209228 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully.
Sep 12 17:32:55.209293 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Sep 12 17:32:55.210182 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Sep 12 17:32:55.210233 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables.
Sep 12 17:32:55.211903 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Sep 12 17:32:55.211954 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules.
Sep 12 17:32:55.213490 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files...
Sep 12 17:32:55.225379 systemd[1]: network-cleanup.service: Deactivated successfully.
Sep 12 17:32:55.225510 systemd[1]: Stopped network-cleanup.service - Network Cleanup.
Sep 12 17:32:55.229214 systemd[1]: systemd-udevd.service: Deactivated successfully.
Sep 12 17:32:55.229466 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files.
Sep 12 17:32:55.231676 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Sep 12 17:32:55.231746 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket.
Sep 12 17:32:55.233054 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Sep 12 17:32:55.233092 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket.
Sep 12 17:32:55.234776 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Sep 12 17:32:55.234831 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook.
Sep 12 17:32:55.237028 systemd[1]: dracut-cmdline.service: Deactivated successfully.
Sep 12 17:32:55.237083 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook.
Sep 12 17:32:55.238902 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
Sep 12 17:32:55.238958 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Sep 12 17:32:55.248749 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database...
Sep 12 17:32:55.250162 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Sep 12 17:32:55.250258 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Sep 12 17:32:55.252366 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Sep 12 17:32:55.252456 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Sep 12 17:32:55.266303 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Sep 12 17:32:55.266496 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database.
Sep 12 17:32:55.269085 systemd[1]: Reached target initrd-switch-root.target - Switch Root.
Sep 12 17:32:55.274705 systemd[1]: Starting initrd-switch-root.service - Switch Root...
Sep 12 17:32:55.290277 systemd[1]: Switching root.
Sep 12 17:32:55.339871 systemd-journald[187]: Journal stopped
Sep 12 17:32:56.345682 systemd-journald[187]: Received SIGTERM from PID 1 (systemd).
Sep 12 17:32:56.345722 kernel: SELinux: policy capability network_peer_controls=1
Sep 12 17:32:56.345736 kernel: SELinux: policy capability open_perms=1
Sep 12 17:32:56.345744 kernel: SELinux: policy capability extended_socket_class=1
Sep 12 17:32:56.345751 kernel: SELinux: policy capability always_check_network=0
Sep 12 17:32:56.345761 kernel: SELinux: policy capability cgroup_seclabel=1
Sep 12 17:32:56.345771 kernel: SELinux: policy capability nnp_nosuid_transition=1
Sep 12 17:32:56.345781 kernel: SELinux: policy capability genfs_seclabel_symlinks=0
Sep 12 17:32:56.345788 kernel: SELinux: policy capability ioctl_skip_cloexec=0
Sep 12 17:32:56.345795 kernel: audit: type=1403 audit(1757698375.487:2): auid=4294967295 ses=4294967295 lsm=selinux res=1
Sep 12 17:32:56.345805 systemd[1]: Successfully loaded SELinux policy in 53.668ms.
Sep 12 17:32:56.345823 systemd[1]: Relabeled /dev, /dev/shm, /run, /sys/fs/cgroup in 18.053ms.
Sep 12 17:32:56.345832 systemd[1]: systemd 255 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT default-hierarchy=unified)
Sep 12 17:32:56.345851 systemd[1]: Detected virtualization kvm.
Sep 12 17:32:56.345859 systemd[1]: Detected architecture x86-64.
Sep 12 17:32:56.345867 systemd[1]: Detected first boot.
Sep 12 17:32:56.345875 systemd[1]: Hostname set to <ci-4081-3-6-f-c182586e87>.
Sep 12 17:32:56.345883 systemd[1]: Initializing machine ID from VM UUID.
Sep 12 17:32:56.345891 zram_generator::config[1060]: No configuration found.
Sep 12 17:32:56.345904 systemd[1]: Populated /etc with preset unit settings.
Sep 12 17:32:56.345913 systemd[1]: initrd-switch-root.service: Deactivated successfully.
Sep 12 17:32:56.345920 systemd[1]: Stopped initrd-switch-root.service - Switch Root.
Sep 12 17:32:56.345928 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Sep 12 17:32:56.345936 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config.
Sep 12 17:32:56.345944 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run.
Sep 12 17:32:56.345952 systemd[1]: Created slice system-getty.slice - Slice /system/getty.
Sep 12 17:32:56.345960 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe.
Sep 12 17:32:56.345969 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty.
Sep 12 17:32:56.345977 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit.
Sep 12 17:32:56.345985 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck.
Sep 12 17:32:56.345993 systemd[1]: Created slice user.slice - User and Session Slice.
Sep 12 17:32:56.346002 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Sep 12 17:32:56.346010 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Sep 12 17:32:56.346018 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch.
Sep 12 17:32:56.346026 systemd[1]: Set up automount boot.automount - Boot partition Automount Point.
Sep 12 17:32:56.346034 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point.
Sep 12 17:32:56.346044 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Sep 12 17:32:56.346052 systemd[1]: Expecting device dev-ttyS0.device - /dev/ttyS0...
Sep 12 17:32:56.346060 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Sep 12 17:32:56.346068 systemd[1]: Stopped target initrd-switch-root.target - Switch Root.
Sep 12 17:32:56.346077 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems.
Sep 12 17:32:56.346085 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System.
Sep 12 17:32:56.346094 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes.
Sep 12 17:32:56.346102 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Sep 12 17:32:56.346112 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Sep 12 17:32:56.346119 systemd[1]: Reached target slices.target - Slice Units.
Sep 12 17:32:56.346128 systemd[1]: Reached target swap.target - Swaps.
Sep 12 17:32:56.346136 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes.
Sep 12 17:32:56.346145 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket.
Sep 12 17:32:56.346153 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Sep 12 17:32:56.346162 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Sep 12 17:32:56.346170 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Sep 12 17:32:56.346179 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket.
Sep 12 17:32:56.346187 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System...
Sep 12 17:32:56.346195 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System...
Sep 12 17:32:56.346203 systemd[1]: Mounting media.mount - External Media Directory...
Sep 12 17:32:56.346211 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Sep 12 17:32:56.346219 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System...
Sep 12 17:32:56.346227 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System...
Sep 12 17:32:56.346235 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp...
Sep 12 17:32:56.346245 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Sep 12 17:32:56.346257 systemd[1]: Reached target machines.target - Containers.
Sep 12 17:32:56.346267 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files...
Sep 12 17:32:56.346275 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Sep 12 17:32:56.346284 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Sep 12 17:32:56.346292 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs...
Sep 12 17:32:56.346301 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Sep 12 17:32:56.346310 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Sep 12 17:32:56.346318 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Sep 12 17:32:56.346326 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse...
Sep 12 17:32:56.346334 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Sep 12 17:32:56.346342 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf).
Sep 12 17:32:56.346350 systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Sep 12 17:32:56.346358 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device.
Sep 12 17:32:56.346367 systemd[1]: systemd-fsck-usr.service: Deactivated successfully.
Sep 12 17:32:56.346375 systemd[1]: Stopped systemd-fsck-usr.service.
Sep 12 17:32:56.346382 kernel: loop: module loaded
Sep 12 17:32:56.346390 systemd[1]: Starting systemd-journald.service - Journal Service...
Sep 12 17:32:56.346398 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Sep 12 17:32:56.346406 kernel: fuse: init (API version 7.39)
Sep 12 17:32:56.346414 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
Sep 12 17:32:56.346422 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems...
Sep 12 17:32:56.346431 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Sep 12 17:32:56.346441 systemd[1]: verity-setup.service: Deactivated successfully.
Sep 12 17:32:56.346448 kernel: ACPI: bus type drm_connector registered
Sep 12 17:32:56.346456 systemd[1]: Stopped verity-setup.service.
Sep 12 17:32:56.346465 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Sep 12 17:32:56.346474 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System.
Sep 12 17:32:56.346482 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System.
Sep 12 17:32:56.346490 systemd[1]: Mounted media.mount - External Media Directory.
Sep 12 17:32:56.346509 systemd-journald[1140]: Collecting audit messages is disabled.
Sep 12 17:32:56.347607 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System.
Sep 12 17:32:56.347621 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System.
Sep 12 17:32:56.347631 systemd-journald[1140]: Journal started
Sep 12 17:32:56.347652 systemd-journald[1140]: Runtime Journal (/run/log/journal/4a9a170610e3462c88a2b29b897bd04d) is 4.8M, max 38.4M, 33.6M free.
Sep 12 17:32:56.068396 systemd[1]: Queued start job for default target multi-user.target.
Sep 12 17:32:56.096558 systemd[1]: Unnecessary job was removed for dev-sda6.device - /dev/sda6.
Sep 12 17:32:56.097153 systemd[1]: systemd-journald.service: Deactivated successfully.
Sep 12 17:32:56.349565 systemd[1]: Started systemd-journald.service - Journal Service.
Sep 12 17:32:56.350862 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp.
Sep 12 17:32:56.351440 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Sep 12 17:32:56.352978 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Sep 12 17:32:56.353073 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs.
Sep 12 17:32:56.353769 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Sep 12 17:32:56.353868 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Sep 12 17:32:56.355690 systemd[1]: modprobe@drm.service: Deactivated successfully.
Sep 12 17:32:56.355782 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Sep 12 17:32:56.356383 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Sep 12 17:32:56.356468 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Sep 12 17:32:56.357220 systemd[1]: modprobe@fuse.service: Deactivated successfully.
Sep 12 17:32:56.357302 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse.
Sep 12 17:32:56.358789 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files.
Sep 12 17:32:56.359498 systemd[1]: modprobe@loop.service: Deactivated successfully.
Sep 12 17:32:56.359749 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Sep 12 17:32:56.360422 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Sep 12 17:32:56.361161 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Sep 12 17:32:56.361869 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems.
Sep 12 17:32:56.368439 systemd[1]: Reached target network-pre.target - Preparation for Network.
Sep 12 17:32:56.373610 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System...
Sep 12 17:32:56.376447 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System...
Sep 12 17:32:56.377025 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/).
Sep 12 17:32:56.377108 systemd[1]: Reached target local-fs.target - Local File Systems.
Sep 12 17:32:56.378288 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management (Varlink).
Sep 12 17:32:56.381243 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown...
Sep 12 17:32:56.387899 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache...
Sep 12 17:32:56.388563 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Sep 12 17:32:56.393334 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database...
Sep 12 17:32:56.395669 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage...
Sep 12 17:32:56.396627 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Sep 12 17:32:56.400625 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed...
Sep 12 17:32:56.401264 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Sep 12 17:32:56.402901 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Sep 12 17:32:56.405097 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/...
Sep 12 17:32:56.409596 systemd[1]: Starting systemd-sysusers.service - Create System Users...
Sep 12 17:32:56.411175 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System.
Sep 12 17:32:56.411720 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System.
Sep 12 17:32:56.412331 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown.
Sep 12 17:32:56.420476 systemd-journald[1140]: Time spent on flushing to /var/log/journal/4a9a170610e3462c88a2b29b897bd04d is 39.426ms for 1130 entries.
Sep 12 17:32:56.420476 systemd-journald[1140]: System Journal (/var/log/journal/4a9a170610e3462c88a2b29b897bd04d) is 8.0M, max 584.8M, 576.8M free.
Sep 12 17:32:56.476263 systemd-journald[1140]: Received client request to flush runtime journal.
Sep 12 17:32:56.476300 kernel: loop0: detected capacity change from 0 to 142488
Sep 12 17:32:56.476315 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher
Sep 12 17:32:56.422778 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Sep 12 17:32:56.427312 systemd[1]: Starting systemd-udev-settle.service - Wait for udev To Complete Device Initialization...
Sep 12 17:32:56.428747 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed.
Sep 12 17:32:56.429933 systemd[1]: Reached target first-boot-complete.target - First Boot Complete.
Sep 12 17:32:56.432693 systemd[1]: Starting systemd-machine-id-commit.service - Commit a transient machine-id on disk...
Sep 12 17:32:56.463173 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Sep 12 17:32:56.476492 udevadm[1186]: systemd-udev-settle.service is deprecated. Please fix lvm2-activation-early.service, lvm2-activation.service not to pull it in.
Sep 12 17:32:56.478057 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage.
Sep 12 17:32:56.490544 systemd[1]: Finished systemd-sysusers.service - Create System Users.
Sep 12 17:32:56.497536 kernel: loop1: detected capacity change from 0 to 8
Sep 12 17:32:56.498008 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Sep 12 17:32:56.499175 systemd[1]: etc-machine\x2did.mount: Deactivated successfully.
Sep 12 17:32:56.499736 systemd[1]: Finished systemd-machine-id-commit.service - Commit a transient machine-id on disk.
Sep 12 17:32:56.519809 systemd-tmpfiles[1198]: ACLs are not supported, ignoring.
Sep 12 17:32:56.520226 systemd-tmpfiles[1198]: ACLs are not supported, ignoring.
Sep 12 17:32:56.522539 kernel: loop2: detected capacity change from 0 to 140768
Sep 12 17:32:56.528013 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Sep 12 17:32:56.561062 kernel: loop3: detected capacity change from 0 to 224512
Sep 12 17:32:56.604538 kernel: loop4: detected capacity change from 0 to 142488
Sep 12 17:32:56.627552 kernel: loop5: detected capacity change from 0 to 8
Sep 12 17:32:56.631541 kernel: loop6: detected capacity change from 0 to 140768
Sep 12 17:32:56.645545 kernel: loop7: detected capacity change from 0 to 224512
Sep 12 17:32:56.662337 (sd-merge)[1206]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-hetzner'.
Sep 12 17:32:56.662759 (sd-merge)[1206]: Merged extensions into '/usr'.
Sep 12 17:32:56.674003 systemd[1]: Reloading requested from client PID 1180 ('systemd-sysext') (unit systemd-sysext.service)...
Sep 12 17:32:56.674124 systemd[1]: Reloading...
Sep 12 17:32:56.736585 zram_generator::config[1229]: No configuration found.
Sep 12 17:32:56.813320 ldconfig[1175]: /sbin/ldconfig: /lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start.
Sep 12 17:32:56.837641 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Sep 12 17:32:56.874504 systemd[1]: Reloading finished in 200 ms.
Sep 12 17:32:56.894890 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache.
Sep 12 17:32:56.895856 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/.
Sep 12 17:32:56.906147 systemd[1]: Starting ensure-sysext.service...
Sep 12 17:32:56.911761 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Sep 12 17:32:56.915495 systemd[1]: Reloading requested from client PID 1275 ('systemctl') (unit ensure-sysext.service)...
Sep 12 17:32:56.915506 systemd[1]: Reloading...
Sep 12 17:32:56.930488 systemd-tmpfiles[1276]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring.
Sep 12 17:32:56.931016 systemd-tmpfiles[1276]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring.
Sep 12 17:32:56.931707 systemd-tmpfiles[1276]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring.
Sep 12 17:32:56.932007 systemd-tmpfiles[1276]: ACLs are not supported, ignoring.
Sep 12 17:32:56.932108 systemd-tmpfiles[1276]: ACLs are not supported, ignoring.
Sep 12 17:32:56.933981 systemd-tmpfiles[1276]: Detected autofs mount point /boot during canonicalization of boot.
Sep 12 17:32:56.934048 systemd-tmpfiles[1276]: Skipping /boot
Sep 12 17:32:56.939641 systemd-tmpfiles[1276]: Detected autofs mount point /boot during canonicalization of boot.
Sep 12 17:32:56.939704 systemd-tmpfiles[1276]: Skipping /boot
Sep 12 17:32:56.980817 zram_generator::config[1301]: No configuration found.
Sep 12 17:32:57.053624 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Sep 12 17:32:57.090556 systemd[1]: Reloading finished in 174 ms.
Sep 12 17:32:57.103482 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database.
Sep 12 17:32:57.107822 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Sep 12 17:32:57.112685 systemd[1]: Starting audit-rules.service - Load Security Auditing Rules...
Sep 12 17:32:57.115445 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs...
Sep 12 17:32:57.122647 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog...
Sep 12 17:32:57.125640 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Sep 12 17:32:57.129647 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Sep 12 17:32:57.131792 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP...
Sep 12 17:32:57.138684 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Sep 12 17:32:57.139202 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Sep 12 17:32:57.142188 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Sep 12 17:32:57.150788 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Sep 12 17:32:57.156869 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Sep 12 17:32:57.158329 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Sep 12 17:32:57.161787 systemd[1]: Starting systemd-userdbd.service - User Database Manager...
Sep 12 17:32:57.163047 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Sep 12 17:32:57.163729 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Sep 12 17:32:57.163822 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Sep 12 17:32:57.165435 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Sep 12 17:32:57.165931 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Sep 12 17:32:57.171148 systemd-udevd[1358]: Using default interface naming scheme 'v255'.
Sep 12 17:32:57.172886 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Sep 12 17:32:57.173042 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Sep 12 17:32:57.176031 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Sep 12 17:32:57.178683 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Sep 12 17:32:57.179274 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Sep 12 17:32:57.179350 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Sep 12 17:32:57.179963 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog.
Sep 12 17:32:57.180906 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP.
Sep 12 17:32:57.182102 systemd[1]: modprobe@loop.service: Deactivated successfully.
Sep 12 17:32:57.182344 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Sep 12 17:32:57.190402 augenrules[1378]: No rules
Sep 12 17:32:57.191568 systemd[1]: Finished audit-rules.service - Load Security Auditing Rules.
Sep 12 17:32:57.193288 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Sep 12 17:32:57.194756 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Sep 12 17:32:57.197688 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Sep 12 17:32:57.201043 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Sep 12 17:32:57.202012 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Sep 12 17:32:57.208758 systemd[1]: Starting systemd-update-done.service - Update is Completed...
Sep 12 17:32:57.209249 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Sep 12 17:32:57.209814 systemd[1]: Started systemd-userdbd.service - User Database Manager.
Sep 12 17:32:57.211083 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Sep 12 17:32:57.211185 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Sep 12 17:32:57.212700 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Sep 12 17:32:57.212823 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Sep 12 17:32:57.215725 systemd[1]: modprobe@drm.service: Deactivated successfully.
Sep 12 17:32:57.215840 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Sep 12 17:32:57.216947 systemd[1]: Finished ensure-sysext.service. Sep 12 17:32:57.220496 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Sep 12 17:32:57.228001 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization... Sep 12 17:32:57.228954 systemd[1]: Finished systemd-update-done.service - Update is Completed. Sep 12 17:32:57.229691 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Sep 12 17:32:57.231221 systemd[1]: modprobe@loop.service: Deactivated successfully. Sep 12 17:32:57.231324 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Sep 12 17:32:57.240646 systemd[1]: Starting systemd-networkd.service - Network Configuration... Sep 12 17:32:57.241127 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Sep 12 17:32:57.251983 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs. Sep 12 17:32:57.254170 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). Sep 12 17:32:57.286313 systemd[1]: Condition check resulted in dev-ttyS0.device - /dev/ttyS0 being skipped. Sep 12 17:32:57.318980 systemd-resolved[1357]: Positive Trust Anchors: Sep 12 17:32:57.320304 systemd-resolved[1357]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Sep 12 17:32:57.320332 systemd-resolved[1357]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Sep 12 17:32:57.330428 systemd-resolved[1357]: Using system hostname 'ci-4081-3-6-f-c182586e87'. Sep 12 17:32:57.331223 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization. Sep 12 17:32:57.331845 systemd[1]: Reached target time-set.target - System Time Set. Sep 12 17:32:57.332050 systemd-networkd[1404]: lo: Link UP Sep 12 17:32:57.332053 systemd-networkd[1404]: lo: Gained carrier Sep 12 17:32:57.333065 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Sep 12 17:32:57.333600 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Sep 12 17:32:57.336462 systemd-timesyncd[1396]: No network connectivity, watching for changes. Sep 12 17:32:57.337242 systemd-networkd[1404]: Enumeration completed Sep 12 17:32:57.337302 systemd[1]: Started systemd-networkd.service - Network Configuration. Sep 12 17:32:57.338112 systemd[1]: Reached target network.target - Network. Sep 12 17:32:57.340776 systemd-networkd[1404]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Sep 12 17:32:57.340785 systemd-networkd[1404]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. 
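The "Positive Trust Anchors" block is systemd-resolved loading its built-in DNSSEC root trust anchor (the DS record with key tag 20326 is the IANA root KSK-2017), and the negative anchors are the private and reserved zones it will never attempt to validate. Whether validation is actually enforced is a resolved.conf setting; a sketch:

    # /etc/systemd/resolved.conf (sketch)
    [Resolve]
    DNSSEC=allow-downgrade   # validate when the upstream supports it, fall back otherwise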
Sep 12 17:32:57.343227 systemd-networkd[1404]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Sep 12 17:32:57.343236 systemd-networkd[1404]: eth1: Configuring with /usr/lib/systemd/network/zz-default.network. Sep 12 17:32:57.345644 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured... Sep 12 17:32:57.346686 systemd-networkd[1404]: eth0: Link UP Sep 12 17:32:57.346692 systemd-networkd[1404]: eth0: Gained carrier Sep 12 17:32:57.346703 systemd-networkd[1404]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Sep 12 17:32:57.350801 systemd-networkd[1404]: eth1: Link UP Sep 12 17:32:57.350810 systemd-networkd[1404]: eth1: Gained carrier Sep 12 17:32:57.350821 systemd-networkd[1404]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Sep 12 17:32:57.361467 systemd-networkd[1404]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Sep 12 17:32:57.363168 systemd-networkd[1404]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Sep 12 17:32:57.369546 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 34 scanned by (udev-worker) (1414) Sep 12 17:32:57.380600 systemd-networkd[1404]: eth1: DHCPv4 address 10.0.0.3/32 acquired from 10.0.0.1 Sep 12 17:32:57.381132 systemd-timesyncd[1396]: Network configuration changed, trying to establish connection. Sep 12 17:32:57.394583 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input3 Sep 12 17:32:57.408548 kernel: mousedev: PS/2 mouse device common for all mice Sep 12 17:32:57.408581 kernel: ACPI: button: Power Button [PWRF] Sep 12 17:32:57.409594 systemd-networkd[1404]: eth0: DHCPv4 address 135.181.96.215/32, gateway 172.31.1.1 acquired from 172.31.1.1 Sep 12 17:32:57.409792 systemd-timesyncd[1396]: Network configuration changed, trying to establish connection. Sep 12 17:32:57.410842 systemd-timesyncd[1396]: Network configuration changed, trying to establish connection. Sep 12 17:32:57.415375 systemd[1]: Condition check resulted in dev-virtio\x2dports-org.qemu.guest_agent.0.device - /dev/virtio-ports/org.qemu.guest_agent.0 being skipped. Sep 12 17:32:57.416620 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Sep 12 17:32:57.416705 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Sep 12 17:32:57.422663 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Sep 12 17:32:57.426220 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Sep 12 17:32:57.431262 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Sep 12 17:32:57.432101 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Sep 12 17:32:57.432128 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). 
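The repeated "based on potentially unpredictable interface name" warnings mean zz-default.network matches eth0 and eth1 only by their kernel names, which are not guaranteed stable across reboots or hardware changes. Pinning the match to the hardware address avoids the warning; a sketch with a placeholder MAC:

    # /etc/systemd/network/10-uplink.network (sketch)
    [Match]
    MACAddress=aa:bb:cc:dd:ee:ff   # placeholder; substitute the NIC's real MAC

    [Network]
    DHCP=yes

The DHCPv4 leases that follow, a /32 on each interface with a gateway outside the assigned prefix, are the provider's point-to-point addressing style and are applied as-is by networkd.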
Sep 12 17:32:57.432138 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Sep 12 17:32:57.432370 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Sep 12 17:32:57.432843 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Sep 12 17:32:57.435384 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - QEMU_HARDDISK OEM. Sep 12 17:32:57.444538 kernel: input: ImExPS/2 Generic Explorer Mouse as /devices/platform/i8042/serio1/input/input4 Sep 12 17:32:57.448733 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Sep 12 17:32:57.449002 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Sep 12 17:32:57.451849 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... Sep 12 17:32:57.453564 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Sep 12 17:32:57.453791 systemd[1]: modprobe@loop.service: Deactivated successfully. Sep 12 17:32:57.453905 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Sep 12 17:32:57.456533 kernel: [drm] pci: virtio-vga detected at 0000:00:01.0 Sep 12 17:32:57.455580 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Sep 12 17:32:57.458544 kernel: virtio-pci 0000:00:01.0: vgaarb: deactivate vga console Sep 12 17:32:57.463621 kernel: i801_smbus 0000:00:1f.3: SMBus using PCI interrupt Sep 12 17:32:57.463769 kernel: i2c i2c-0: 1/1 memory slots populated (from DMI) Sep 12 17:32:57.463911 kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD Sep 12 17:32:57.476116 kernel: Console: switching to colour dummy device 80x25 Sep 12 17:32:57.479546 kernel: [drm] features: -virgl +edid -resource_blob -host_visible Sep 12 17:32:57.479576 kernel: [drm] features: -context_init Sep 12 17:32:57.480611 kernel: EDAC MC: Ver: 3.0.0 Sep 12 17:32:57.483536 kernel: [drm] number of scanouts: 1 Sep 12 17:32:57.483992 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. Sep 12 17:32:57.493998 kernel: [drm] number of cap sets: 0 Sep 12 17:32:57.493770 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Sep 12 17:32:57.495547 kernel: [drm] Initialized virtio_gpu 0.1.0 0 for 0000:00:01.0 on minor 0 Sep 12 17:32:57.504566 kernel: fbcon: virtio_gpudrmfb (fb0) is primary device Sep 12 17:32:57.504627 kernel: Console: switching to colour frame buffer device 160x50 Sep 12 17:32:57.513080 kernel: virtio-pci 0000:00:01.0: [drm] fb0: virtio_gpudrmfb frame buffer device Sep 12 17:32:57.526584 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Sep 12 17:32:57.526716 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Sep 12 17:32:57.536710 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Sep 12 17:32:57.566058 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Sep 12 17:32:57.660758 systemd[1]: Finished systemd-udev-settle.service - Wait for udev To Complete Device Initialization. Sep 12 17:32:57.667744 systemd[1]: Starting lvm2-activation-early.service - Activation of LVM2 logical volumes... 
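Unit names like systemd-fsck@dev-disk-by\x2dlabel-OEM.service embed the device path using systemd's escaping rules: the leading '/' is dropped, remaining '/' characters become '-', and literal '-' characters become \x2d. systemd-escape reproduces the mapping:

    $ systemd-escape -p /dev/disk/by-label/OEM
    dev-disk-by\x2dlabel-OEM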
Sep 12 17:32:57.683457 lvm[1462]: WARNING: Failed to connect to lvmetad. Falling back to device scanning. Sep 12 17:32:57.717310 systemd[1]: Finished lvm2-activation-early.service - Activation of LVM2 logical volumes. Sep 12 17:32:57.718848 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Sep 12 17:32:57.719006 systemd[1]: Reached target sysinit.target - System Initialization. Sep 12 17:32:57.719222 systemd[1]: Started motdgen.path - Watch for update engine configuration changes. Sep 12 17:32:57.719414 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data. Sep 12 17:32:57.719798 systemd[1]: Started logrotate.timer - Daily rotation of log files. Sep 12 17:32:57.720089 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information.. Sep 12 17:32:57.720188 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories. Sep 12 17:32:57.720293 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). Sep 12 17:32:57.720324 systemd[1]: Reached target paths.target - Path Units. Sep 12 17:32:57.720393 systemd[1]: Reached target timers.target - Timer Units. Sep 12 17:32:57.723729 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket. Sep 12 17:32:57.725464 systemd[1]: Starting docker.socket - Docker Socket for the API... Sep 12 17:32:57.731511 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket. Sep 12 17:32:57.733237 systemd[1]: Starting lvm2-activation.service - Activation of LVM2 logical volumes... Sep 12 17:32:57.734028 systemd[1]: Listening on docker.socket - Docker Socket for the API. Sep 12 17:32:57.734228 systemd[1]: Reached target sockets.target - Socket Units. Sep 12 17:32:57.734313 systemd[1]: Reached target basic.target - Basic System. Sep 12 17:32:57.734434 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met. Sep 12 17:32:57.734464 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met. Sep 12 17:32:57.737661 systemd[1]: Starting containerd.service - containerd container runtime... Sep 12 17:32:57.750776 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent... Sep 12 17:32:57.754250 lvm[1466]: WARNING: Failed to connect to lvmetad. Falling back to device scanning. Sep 12 17:32:57.763762 systemd[1]: Starting dbus.service - D-Bus System Message Bus... Sep 12 17:32:57.780027 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... Sep 12 17:32:57.784653 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... Sep 12 17:32:57.785364 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). Sep 12 17:32:57.790874 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... Sep 12 17:32:57.794635 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin... Sep 12 17:32:57.804689 jq[1472]: false Sep 12 17:32:57.805631 systemd[1]: Started qemu-guest-agent.service - QEMU Guest Agent. Sep 12 17:32:57.813727 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... Sep 12 17:32:57.819703 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... 
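docker.socket and sshd.socket above are socket activation: systemd owns the listening socket, and the daemon only gets involved once traffic arrives. sshd.socket additionally uses the per-connection variant, which is why every SSH login later in this log appears as its own sshd@<n>-<local>:22-<peer>:<port>.service instance. A sketch of that pattern:

    # sshd.socket (sketch of the per-connection pattern)
    [Socket]
    ListenStream=22
    Accept=yes            # spawn one sshd@... instance per incoming connection

    [Install]
    WantedBy=sockets.target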
Sep 12 17:32:57.832657 systemd[1]: Starting systemd-logind.service - User Login Management... Sep 12 17:32:57.833436 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0). Sep 12 17:32:57.835898 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. Sep 12 17:32:57.836650 systemd[1]: Starting update-engine.service - Update Engine... Sep 12 17:32:57.840641 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... Sep 12 17:32:57.844733 coreos-metadata[1468]: Sep 12 17:32:57.844 INFO Fetching http://169.254.169.254/hetzner/v1/metadata: Attempt #1 Sep 12 17:32:57.848243 systemd[1]: Finished lvm2-activation.service - Activation of LVM2 logical volumes. Sep 12 17:32:57.851378 coreos-metadata[1468]: Sep 12 17:32:57.849 INFO Fetch successful Sep 12 17:32:57.851436 coreos-metadata[1468]: Sep 12 17:32:57.851 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/private-networks: Attempt #1 Sep 12 17:32:57.852733 extend-filesystems[1473]: Found loop4 Sep 12 17:32:57.852733 extend-filesystems[1473]: Found loop5 Sep 12 17:32:57.852733 extend-filesystems[1473]: Found loop6 Sep 12 17:32:57.852733 extend-filesystems[1473]: Found loop7 Sep 12 17:32:57.852733 extend-filesystems[1473]: Found sda Sep 12 17:32:57.852733 extend-filesystems[1473]: Found sda1 Sep 12 17:32:57.852733 extend-filesystems[1473]: Found sda2 Sep 12 17:32:57.852733 extend-filesystems[1473]: Found sda3 Sep 12 17:32:57.852733 extend-filesystems[1473]: Found usr Sep 12 17:32:57.852733 extend-filesystems[1473]: Found sda4 Sep 12 17:32:57.852733 extend-filesystems[1473]: Found sda6 Sep 12 17:32:57.852733 extend-filesystems[1473]: Found sda7 Sep 12 17:32:57.852733 extend-filesystems[1473]: Found sda9 Sep 12 17:32:57.852733 extend-filesystems[1473]: Checking size of /dev/sda9 Sep 12 17:32:57.913174 coreos-metadata[1468]: Sep 12 17:32:57.852 INFO Fetch successful Sep 12 17:32:57.913212 extend-filesystems[1473]: Resized partition /dev/sda9 Sep 12 17:32:57.871009 dbus-daemon[1469]: [system] SELinux support is enabled Sep 12 17:32:57.856229 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. Sep 12 17:32:57.918436 extend-filesystems[1505]: resize2fs 1.47.1 (20-May-2024) Sep 12 17:32:57.856984 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. Sep 12 17:32:57.921254 jq[1483]: true Sep 12 17:32:57.875363 systemd[1]: Started dbus.service - D-Bus System Message Bus. Sep 12 17:32:57.889906 (ntainerd)[1494]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR Sep 12 17:32:57.893749 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). Sep 12 17:32:57.921887 update_engine[1482]: I20250912 17:32:57.891009 1482 main.cc:92] Flatcar Update Engine starting Sep 12 17:32:57.921887 update_engine[1482]: I20250912 17:32:57.898650 1482 update_check_scheduler.cc:74] Next update check in 11m37s Sep 12 17:32:57.893780 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. 
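coreos-metadata is reading the Hetzner metadata service at the usual link-local address; the endpoints logged above can be queried by hand from the instance for debugging, assuming curl is available:

    $ curl -s http://169.254.169.254/hetzner/v1/metadata
    $ curl -s http://169.254.169.254/hetzner/v1/metadata/private-networks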
Sep 12 17:32:57.899864 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). Sep 12 17:32:57.899880 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. Sep 12 17:32:57.902086 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. Sep 12 17:32:57.902210 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. Sep 12 17:32:57.909171 systemd[1]: Started update-engine.service - Update Engine. Sep 12 17:32:57.917026 systemd[1]: Started locksmithd.service - Cluster reboot manager. Sep 12 17:32:57.928556 kernel: EXT4-fs (sda9): resizing filesystem from 1617920 to 9393147 blocks Sep 12 17:32:57.935173 jq[1500]: true Sep 12 17:32:57.935467 tar[1489]: linux-amd64/LICENSE Sep 12 17:32:57.940849 tar[1489]: linux-amd64/helm Sep 12 17:32:57.945120 systemd[1]: motdgen.service: Deactivated successfully. Sep 12 17:32:57.945252 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. Sep 12 17:32:58.007540 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 34 scanned by (udev-worker) (1415) Sep 12 17:32:58.042222 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent. Sep 12 17:32:58.044862 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. Sep 12 17:32:58.047861 locksmithd[1506]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Sep 12 17:32:58.049172 systemd-logind[1481]: New seat seat0. Sep 12 17:32:58.060361 systemd-logind[1481]: Watching system buttons on /dev/input/event2 (Power Button) Sep 12 17:32:58.060380 systemd-logind[1481]: Watching system buttons on /dev/input/event0 (AT Translated Set 2 keyboard) Sep 12 17:32:58.060535 systemd[1]: Started systemd-logind.service - User Login Management. Sep 12 17:32:58.079553 bash[1542]: Updated "/home/core/.ssh/authorized_keys" Sep 12 17:32:58.080576 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. Sep 12 17:32:58.088884 systemd[1]: Starting sshkeys.service... Sep 12 17:32:58.104544 kernel: EXT4-fs (sda9): resized filesystem to 9393147 Sep 12 17:32:58.103711 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys. Sep 12 17:32:58.111838 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)... Sep 12 17:32:58.140972 extend-filesystems[1505]: Filesystem at /dev/sda9 is mounted on /; on-line resizing required Sep 12 17:32:58.140972 extend-filesystems[1505]: old_desc_blocks = 1, new_desc_blocks = 5 Sep 12 17:32:58.140972 extend-filesystems[1505]: The filesystem on /dev/sda9 is now 9393147 (4k) blocks long. Sep 12 17:32:58.139604 systemd[1]: extend-filesystems.service: Deactivated successfully. Sep 12 17:32:58.142083 extend-filesystems[1473]: Resized filesystem in /dev/sda9 Sep 12 17:32:58.142083 extend-filesystems[1473]: Found sr0 Sep 12 17:32:58.139748 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. 
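The resize that extend-filesystems kicks off here grows the root ext4 filesystem from 1617920 to 9393147 blocks of 4 KiB, i.e. from about 6.2 GiB to about 35.8 GiB (9393147 × 4096 ≈ 38.5 GB). Because the filesystem is mounted on /, resize2fs performs an on-line grow, which for ext4 needs no unmount; the equivalent manual step is simply:

    $ resize2fs /dev/sda9    # on-line grow to fill the enlarged partition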
Sep 12 17:32:58.163655 coreos-metadata[1551]: Sep 12 17:32:58.163 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/public-keys: Attempt #1 Sep 12 17:32:58.164841 coreos-metadata[1551]: Sep 12 17:32:58.164 INFO Fetch successful Sep 12 17:32:58.166677 unknown[1551]: wrote ssh authorized keys file for user: core Sep 12 17:32:58.178544 containerd[1494]: time="2025-09-12T17:32:58.177713657Z" level=info msg="starting containerd" revision=174e0d1785eeda18dc2beba45e1d5a188771636b version=v1.7.21 Sep 12 17:32:58.190589 update-ssh-keys[1558]: Updated "/home/core/.ssh/authorized_keys" Sep 12 17:32:58.191934 systemd[1]: Finished coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys). Sep 12 17:32:58.195011 systemd[1]: Finished sshkeys.service. Sep 12 17:32:58.216067 containerd[1494]: time="2025-09-12T17:32:58.215901553Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.aufs\"..." type=io.containerd.snapshotter.v1 Sep 12 17:32:58.217269 containerd[1494]: time="2025-09-12T17:32:58.217238811Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.aufs\"..." error="aufs is not supported (modprobe aufs failed: exit status 1 \"modprobe: FATAL: Module aufs not found in directory /lib/modules/6.6.106-flatcar\\n\"): skip plugin" type=io.containerd.snapshotter.v1 Sep 12 17:32:58.217335 containerd[1494]: time="2025-09-12T17:32:58.217324602Z" level=info msg="loading plugin \"io.containerd.event.v1.exchange\"..." type=io.containerd.event.v1 Sep 12 17:32:58.217399 containerd[1494]: time="2025-09-12T17:32:58.217387009Z" level=info msg="loading plugin \"io.containerd.internal.v1.opt\"..." type=io.containerd.internal.v1 Sep 12 17:32:58.217570 containerd[1494]: time="2025-09-12T17:32:58.217555084Z" level=info msg="loading plugin \"io.containerd.warning.v1.deprecations\"..." type=io.containerd.warning.v1 Sep 12 17:32:58.217621 containerd[1494]: time="2025-09-12T17:32:58.217611309Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.blockfile\"..." type=io.containerd.snapshotter.v1 Sep 12 17:32:58.217711 containerd[1494]: time="2025-09-12T17:32:58.217696368Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.blockfile\"..." error="no scratch file generator: skip plugin" type=io.containerd.snapshotter.v1 Sep 12 17:32:58.217753 containerd[1494]: time="2025-09-12T17:32:58.217744529Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.btrfs\"..." type=io.containerd.snapshotter.v1 Sep 12 17:32:58.217957 containerd[1494]: time="2025-09-12T17:32:58.217940777Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.btrfs\"..." error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" type=io.containerd.snapshotter.v1 Sep 12 17:32:58.218811 containerd[1494]: time="2025-09-12T17:32:58.218541053Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.devmapper\"..." type=io.containerd.snapshotter.v1 Sep 12 17:32:58.218811 containerd[1494]: time="2025-09-12T17:32:58.218558556Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.devmapper\"..." error="devmapper not configured: skip plugin" type=io.containerd.snapshotter.v1 Sep 12 17:32:58.218811 containerd[1494]: time="2025-09-12T17:32:58.218567693Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.native\"..." 
type=io.containerd.snapshotter.v1 Sep 12 17:32:58.218811 containerd[1494]: time="2025-09-12T17:32:58.218629919Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.overlayfs\"..." type=io.containerd.snapshotter.v1 Sep 12 17:32:58.218811 containerd[1494]: time="2025-09-12T17:32:58.218786603Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.zfs\"..." type=io.containerd.snapshotter.v1 Sep 12 17:32:58.219006 containerd[1494]: time="2025-09-12T17:32:58.218991196Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.zfs\"..." error="path /var/lib/containerd/io.containerd.snapshotter.v1.zfs must be a zfs filesystem to be used with the zfs snapshotter: skip plugin" type=io.containerd.snapshotter.v1 Sep 12 17:32:58.220801 containerd[1494]: time="2025-09-12T17:32:58.220538969Z" level=info msg="loading plugin \"io.containerd.content.v1.content\"..." type=io.containerd.content.v1 Sep 12 17:32:58.220801 containerd[1494]: time="2025-09-12T17:32:58.220614821Z" level=info msg="loading plugin \"io.containerd.metadata.v1.bolt\"..." type=io.containerd.metadata.v1 Sep 12 17:32:58.220801 containerd[1494]: time="2025-09-12T17:32:58.220653684Z" level=info msg="metadata content store policy set" policy=shared Sep 12 17:32:58.224940 containerd[1494]: time="2025-09-12T17:32:58.224924292Z" level=info msg="loading plugin \"io.containerd.gc.v1.scheduler\"..." type=io.containerd.gc.v1 Sep 12 17:32:58.225019 containerd[1494]: time="2025-09-12T17:32:58.225006916Z" level=info msg="loading plugin \"io.containerd.differ.v1.walking\"..." type=io.containerd.differ.v1 Sep 12 17:32:58.225088 containerd[1494]: time="2025-09-12T17:32:58.225077158Z" level=info msg="loading plugin \"io.containerd.lease.v1.manager\"..." type=io.containerd.lease.v1 Sep 12 17:32:58.225135 containerd[1494]: time="2025-09-12T17:32:58.225125439Z" level=info msg="loading plugin \"io.containerd.streaming.v1.manager\"..." type=io.containerd.streaming.v1 Sep 12 17:32:58.225176 containerd[1494]: time="2025-09-12T17:32:58.225167007Z" level=info msg="loading plugin \"io.containerd.runtime.v1.linux\"..." type=io.containerd.runtime.v1 Sep 12 17:32:58.225307 containerd[1494]: time="2025-09-12T17:32:58.225282243Z" level=info msg="loading plugin \"io.containerd.monitor.v1.cgroups\"..." type=io.containerd.monitor.v1 Sep 12 17:32:58.228340 containerd[1494]: time="2025-09-12T17:32:58.227580202Z" level=info msg="loading plugin \"io.containerd.runtime.v2.task\"..." type=io.containerd.runtime.v2 Sep 12 17:32:58.228340 containerd[1494]: time="2025-09-12T17:32:58.227693374Z" level=info msg="loading plugin \"io.containerd.runtime.v2.shim\"..." type=io.containerd.runtime.v2 Sep 12 17:32:58.228340 containerd[1494]: time="2025-09-12T17:32:58.227707540Z" level=info msg="loading plugin \"io.containerd.sandbox.store.v1.local\"..." type=io.containerd.sandbox.store.v1 Sep 12 17:32:58.228340 containerd[1494]: time="2025-09-12T17:32:58.227717830Z" level=info msg="loading plugin \"io.containerd.sandbox.controller.v1.local\"..." type=io.containerd.sandbox.controller.v1 Sep 12 17:32:58.228340 containerd[1494]: time="2025-09-12T17:32:58.227728620Z" level=info msg="loading plugin \"io.containerd.service.v1.containers-service\"..." type=io.containerd.service.v1 Sep 12 17:32:58.228340 containerd[1494]: time="2025-09-12T17:32:58.227738128Z" level=info msg="loading plugin \"io.containerd.service.v1.content-service\"..." 
type=io.containerd.service.v1 Sep 12 17:32:58.228340 containerd[1494]: time="2025-09-12T17:32:58.227747134Z" level=info msg="loading plugin \"io.containerd.service.v1.diff-service\"..." type=io.containerd.service.v1 Sep 12 17:32:58.228340 containerd[1494]: time="2025-09-12T17:32:58.227757504Z" level=info msg="loading plugin \"io.containerd.service.v1.images-service\"..." type=io.containerd.service.v1 Sep 12 17:32:58.228340 containerd[1494]: time="2025-09-12T17:32:58.227768144Z" level=info msg="loading plugin \"io.containerd.service.v1.introspection-service\"..." type=io.containerd.service.v1 Sep 12 17:32:58.228340 containerd[1494]: time="2025-09-12T17:32:58.227777502Z" level=info msg="loading plugin \"io.containerd.service.v1.namespaces-service\"..." type=io.containerd.service.v1 Sep 12 17:32:58.228340 containerd[1494]: time="2025-09-12T17:32:58.227786519Z" level=info msg="loading plugin \"io.containerd.service.v1.snapshots-service\"..." type=io.containerd.service.v1 Sep 12 17:32:58.228340 containerd[1494]: time="2025-09-12T17:32:58.227794543Z" level=info msg="loading plugin \"io.containerd.service.v1.tasks-service\"..." type=io.containerd.service.v1 Sep 12 17:32:58.228340 containerd[1494]: time="2025-09-12T17:32:58.227811455Z" level=info msg="loading plugin \"io.containerd.grpc.v1.containers\"..." type=io.containerd.grpc.v1 Sep 12 17:32:58.228340 containerd[1494]: time="2025-09-12T17:32:58.227833817Z" level=info msg="loading plugin \"io.containerd.grpc.v1.content\"..." type=io.containerd.grpc.v1 Sep 12 17:32:58.228575 containerd[1494]: time="2025-09-12T17:32:58.227843175Z" level=info msg="loading plugin \"io.containerd.grpc.v1.diff\"..." type=io.containerd.grpc.v1 Sep 12 17:32:58.228575 containerd[1494]: time="2025-09-12T17:32:58.227852392Z" level=info msg="loading plugin \"io.containerd.grpc.v1.events\"..." type=io.containerd.grpc.v1 Sep 12 17:32:58.228575 containerd[1494]: time="2025-09-12T17:32:58.227861549Z" level=info msg="loading plugin \"io.containerd.grpc.v1.images\"..." type=io.containerd.grpc.v1 Sep 12 17:32:58.228575 containerd[1494]: time="2025-09-12T17:32:58.227871819Z" level=info msg="loading plugin \"io.containerd.grpc.v1.introspection\"..." type=io.containerd.grpc.v1 Sep 12 17:32:58.228575 containerd[1494]: time="2025-09-12T17:32:58.227881076Z" level=info msg="loading plugin \"io.containerd.grpc.v1.leases\"..." type=io.containerd.grpc.v1 Sep 12 17:32:58.228575 containerd[1494]: time="2025-09-12T17:32:58.227890243Z" level=info msg="loading plugin \"io.containerd.grpc.v1.namespaces\"..." type=io.containerd.grpc.v1 Sep 12 17:32:58.228575 containerd[1494]: time="2025-09-12T17:32:58.227899460Z" level=info msg="loading plugin \"io.containerd.grpc.v1.sandbox-controllers\"..." type=io.containerd.grpc.v1 Sep 12 17:32:58.228575 containerd[1494]: time="2025-09-12T17:32:58.227913366Z" level=info msg="loading plugin \"io.containerd.grpc.v1.sandboxes\"..." type=io.containerd.grpc.v1 Sep 12 17:32:58.228575 containerd[1494]: time="2025-09-12T17:32:58.227922273Z" level=info msg="loading plugin \"io.containerd.grpc.v1.snapshots\"..." type=io.containerd.grpc.v1 Sep 12 17:32:58.228575 containerd[1494]: time="2025-09-12T17:32:58.227932352Z" level=info msg="loading plugin \"io.containerd.grpc.v1.streaming\"..." type=io.containerd.grpc.v1 Sep 12 17:32:58.228575 containerd[1494]: time="2025-09-12T17:32:58.227941630Z" level=info msg="loading plugin \"io.containerd.grpc.v1.tasks\"..." 
type=io.containerd.grpc.v1 Sep 12 17:32:58.228575 containerd[1494]: time="2025-09-12T17:32:58.227952811Z" level=info msg="loading plugin \"io.containerd.transfer.v1.local\"..." type=io.containerd.transfer.v1 Sep 12 17:32:58.228575 containerd[1494]: time="2025-09-12T17:32:58.227968440Z" level=info msg="loading plugin \"io.containerd.grpc.v1.transfer\"..." type=io.containerd.grpc.v1 Sep 12 17:32:58.228575 containerd[1494]: time="2025-09-12T17:32:58.227977206Z" level=info msg="loading plugin \"io.containerd.grpc.v1.version\"..." type=io.containerd.grpc.v1 Sep 12 17:32:58.228575 containerd[1494]: time="2025-09-12T17:32:58.227985942Z" level=info msg="loading plugin \"io.containerd.internal.v1.restart\"..." type=io.containerd.internal.v1 Sep 12 17:32:58.228764 containerd[1494]: time="2025-09-12T17:32:58.228029715Z" level=info msg="loading plugin \"io.containerd.tracing.processor.v1.otlp\"..." type=io.containerd.tracing.processor.v1 Sep 12 17:32:58.228764 containerd[1494]: time="2025-09-12T17:32:58.228043721Z" level=info msg="skip loading plugin \"io.containerd.tracing.processor.v1.otlp\"..." error="skip plugin: tracing endpoint not configured" type=io.containerd.tracing.processor.v1 Sep 12 17:32:58.228764 containerd[1494]: time="2025-09-12T17:32:58.228051786Z" level=info msg="loading plugin \"io.containerd.internal.v1.tracing\"..." type=io.containerd.internal.v1 Sep 12 17:32:58.228764 containerd[1494]: time="2025-09-12T17:32:58.228061484Z" level=info msg="skip loading plugin \"io.containerd.internal.v1.tracing\"..." error="skip plugin: tracing endpoint not configured" type=io.containerd.internal.v1 Sep 12 17:32:58.228764 containerd[1494]: time="2025-09-12T17:32:58.228069119Z" level=info msg="loading plugin \"io.containerd.grpc.v1.healthcheck\"..." type=io.containerd.grpc.v1 Sep 12 17:32:58.228764 containerd[1494]: time="2025-09-12T17:32:58.228127959Z" level=info msg="loading plugin \"io.containerd.nri.v1.nri\"..." type=io.containerd.nri.v1 Sep 12 17:32:58.228764 containerd[1494]: time="2025-09-12T17:32:58.228141494Z" level=info msg="NRI interface is disabled by configuration." Sep 12 17:32:58.228764 containerd[1494]: time="2025-09-12T17:32:58.228149459Z" level=info msg="loading plugin \"io.containerd.grpc.v1.cri\"..." 
type=io.containerd.grpc.v1 Sep 12 17:32:58.231428 containerd[1494]: time="2025-09-12T17:32:58.231377402Z" level=info msg="Start cri plugin with config {PluginConfig:{ContainerdConfig:{Snapshotter:overlayfs DefaultRuntimeName:runc DefaultRuntime:{Type: Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:} UntrustedWorkloadRuntime:{Type: Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:} Runtimes:map[runc:{Type:io.containerd.runc.v2 Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[SystemdCgroup:true] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:podsandbox}] NoPivot:false DisableSnapshotAnnotations:true DiscardUnpackedLayers:false IgnoreBlockIONotEnabledErrors:false IgnoreRdtNotEnabledErrors:false} CniConfig:{NetworkPluginBinDir:/opt/cni/bin NetworkPluginConfDir:/etc/cni/net.d NetworkPluginMaxConfNum:1 NetworkPluginSetupSerially:false NetworkPluginConfTemplate: IPPreference:} Registry:{ConfigPath: Mirrors:map[] Configs:map[] Auths:map[] Headers:map[]} ImageDecryption:{KeyModel:node} DisableTCPService:true StreamServerAddress:127.0.0.1 StreamServerPort:0 StreamIdleTimeout:4h0m0s EnableSelinux:true SelinuxCategoryRange:1024 SandboxImage:registry.k8s.io/pause:3.8 StatsCollectPeriod:10 SystemdCgroup:false EnableTLSStreaming:false X509KeyPairStreaming:{TLSCertFile: TLSKeyFile:} MaxContainerLogLineSize:16384 DisableCgroup:false DisableApparmor:false RestrictOOMScoreAdj:false MaxConcurrentDownloads:3 DisableProcMount:false UnsetSeccompProfile: TolerateMissingHugetlbController:true DisableHugetlbController:true DeviceOwnershipFromSecurityContext:false IgnoreImageDefinedVolumes:false NetNSMountsUnderStateDir:false EnableUnprivilegedPorts:false EnableUnprivilegedICMP:false EnableCDI:false CDISpecDirs:[/etc/cdi /var/run/cdi] ImagePullProgressTimeout:5m0s DrainExecSyncIOTimeout:0s ImagePullWithSyncFs:false IgnoreDeprecationWarnings:[]} ContainerdRootDir:/var/lib/containerd ContainerdEndpoint:/run/containerd/containerd.sock RootDir:/var/lib/containerd/io.containerd.grpc.v1.cri StateDir:/run/containerd/io.containerd.grpc.v1.cri}" Sep 12 17:32:58.233054 containerd[1494]: time="2025-09-12T17:32:58.232550742Z" level=info msg="Connect containerd service" Sep 12 17:32:58.233054 containerd[1494]: time="2025-09-12T17:32:58.232598301Z" level=info msg="using legacy CRI server" Sep 12 17:32:58.233054 containerd[1494]: time="2025-09-12T17:32:58.232605765Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Sep 12 17:32:58.233054 containerd[1494]: time="2025-09-12T17:32:58.232685635Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\"" Sep 12 17:32:58.233320 containerd[1494]: time="2025-09-12T17:32:58.233301360Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Sep 12 17:32:58.236779 
containerd[1494]: time="2025-09-12T17:32:58.236754945Z" level=info msg="Start subscribing containerd event" Sep 12 17:32:58.236973 containerd[1494]: time="2025-09-12T17:32:58.236958377Z" level=info msg="Start recovering state" Sep 12 17:32:58.237085 containerd[1494]: time="2025-09-12T17:32:58.237072481Z" level=info msg="Start event monitor" Sep 12 17:32:58.243538 containerd[1494]: time="2025-09-12T17:32:58.239539566Z" level=info msg="Start snapshots syncer" Sep 12 17:32:58.243538 containerd[1494]: time="2025-09-12T17:32:58.239554595Z" level=info msg="Start cni network conf syncer for default" Sep 12 17:32:58.243538 containerd[1494]: time="2025-09-12T17:32:58.239560886Z" level=info msg="Start streaming server" Sep 12 17:32:58.243538 containerd[1494]: time="2025-09-12T17:32:58.237607083Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Sep 12 17:32:58.243538 containerd[1494]: time="2025-09-12T17:32:58.239674900Z" level=info msg=serving... address=/run/containerd/containerd.sock Sep 12 17:32:58.243538 containerd[1494]: time="2025-09-12T17:32:58.242515747Z" level=info msg="containerd successfully booted in 0.066629s" Sep 12 17:32:58.239776 systemd[1]: Started containerd.service - containerd container runtime. Sep 12 17:32:58.264285 sshd_keygen[1498]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Sep 12 17:32:58.279872 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. Sep 12 17:32:58.286733 systemd[1]: Starting issuegen.service - Generate /run/issue... Sep 12 17:32:58.293453 systemd[1]: issuegen.service: Deactivated successfully. Sep 12 17:32:58.293621 systemd[1]: Finished issuegen.service - Generate /run/issue. Sep 12 17:32:58.313979 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... Sep 12 17:32:58.321582 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. Sep 12 17:32:58.328989 systemd[1]: Started getty@tty1.service - Getty on tty1. Sep 12 17:32:58.338741 systemd[1]: Started serial-getty@ttyS0.service - Serial Getty on ttyS0. Sep 12 17:32:58.340412 systemd[1]: Reached target getty.target - Login Prompts. Sep 12 17:32:58.520388 tar[1489]: linux-amd64/README.md Sep 12 17:32:58.528438 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. Sep 12 17:32:59.019874 systemd-networkd[1404]: eth0: Gained IPv6LL Sep 12 17:32:59.021174 systemd-timesyncd[1396]: Network configuration changed, trying to establish connection. Sep 12 17:32:59.021344 systemd-networkd[1404]: eth1: Gained IPv6LL Sep 12 17:32:59.021775 systemd-timesyncd[1396]: Network configuration changed, trying to establish connection. Sep 12 17:32:59.025326 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Sep 12 17:32:59.027406 systemd[1]: Reached target network-online.target - Network is Online. Sep 12 17:32:59.039698 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 12 17:32:59.042773 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Sep 12 17:32:59.060325 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. Sep 12 17:32:59.933770 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 12 17:32:59.934884 systemd[1]: Reached target multi-user.target - Multi-User System. 
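Two details are worth pulling out of containerd's startup above. First, the snapshotter probing: aufs, btrfs, devmapper and zfs are all skipped because their prerequisites are missing, leaving overlayfs as the effective default. Second, the "Start cri plugin" config dump shows runc driven through io.containerd.runc.v2 with SystemdCgroup:true. Both correspond to config.toml settings; a sketch for containerd 1.7:

    # /etc/containerd/config.toml (sketch)
    version = 2
    [plugins."io.containerd.grpc.v1.cri".containerd]
      snapshotter = "overlayfs"
      [plugins."io.containerd.grpc.v1.cri".containerd.runtimes.runc]
        runtime_type = "io.containerd.runc.v2"
        [plugins."io.containerd.grpc.v1.cri".containerd.runtimes.runc.options]
          SystemdCgroup = true   # cgroups managed via systemd, matching the dump above

The "failed to load cni during init" error is likewise expected at this point: /etc/cni/net.d is still empty, and the message clears once a network plugin installs its conflist there.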
Sep 12 17:32:59.936584 (kubelet)[1601]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 12 17:32:59.936725 systemd[1]: Startup finished in 1.168s (kernel) + 5.798s (initrd) + 4.500s (userspace) = 11.468s. Sep 12 17:33:00.695861 kubelet[1601]: E0912 17:33:00.695787 1601 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 12 17:33:00.698150 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 12 17:33:00.698290 systemd[1]: kubelet.service: Failed with result 'exit-code'. Sep 12 17:33:00.698545 systemd[1]: kubelet.service: Consumed 1.114s CPU time. Sep 12 17:33:09.303829 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Sep 12 17:33:09.305143 systemd[1]: Started sshd@0-135.181.96.215:22-147.75.109.163:39884.service - OpenSSH per-connection server daemon (147.75.109.163:39884). Sep 12 17:33:10.383936 sshd[1613]: Accepted publickey for core from 147.75.109.163 port 39884 ssh2: RSA SHA256:0lLQDI1exHD7h3/xNDVcYYaT5o4Ke3hnBgY7BErDCBA Sep 12 17:33:10.385655 sshd[1613]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 17:33:10.392930 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Sep 12 17:33:10.398792 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Sep 12 17:33:10.400771 systemd-logind[1481]: New session 1 of user core. Sep 12 17:33:10.408484 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Sep 12 17:33:10.415049 systemd[1]: Starting user@500.service - User Manager for UID 500... Sep 12 17:33:10.418083 (systemd)[1617]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) Sep 12 17:33:10.502812 systemd[1617]: Queued start job for default target default.target. Sep 12 17:33:10.512249 systemd[1617]: Created slice app.slice - User Application Slice. Sep 12 17:33:10.512272 systemd[1617]: Reached target paths.target - Paths. Sep 12 17:33:10.512282 systemd[1617]: Reached target timers.target - Timers. Sep 12 17:33:10.513271 systemd[1617]: Starting dbus.socket - D-Bus User Message Bus Socket... Sep 12 17:33:10.523358 systemd[1617]: Listening on dbus.socket - D-Bus User Message Bus Socket. Sep 12 17:33:10.523405 systemd[1617]: Reached target sockets.target - Sockets. Sep 12 17:33:10.523416 systemd[1617]: Reached target basic.target - Basic System. Sep 12 17:33:10.523537 systemd[1]: Started user@500.service - User Manager for UID 500. Sep 12 17:33:10.524035 systemd[1617]: Reached target default.target - Main User Target. Sep 12 17:33:10.524083 systemd[1617]: Startup finished in 101ms. Sep 12 17:33:10.528647 systemd[1]: Started session-1.scope - Session 1 of User core. Sep 12 17:33:10.856091 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. Sep 12 17:33:10.866732 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 12 17:33:10.950200 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
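The kubelet failure is the normal pre-bootstrap state: /var/lib/kubelet/config.yaml is written by kubeadm during init/join, so until the node is bootstrapped the unit exits and systemd keeps rescheduling it. For orientation, a minimal sketch of what eventually lands at that path (kubeadm generates a much fuller file):

    # /var/lib/kubelet/config.yaml (sketch; the real file comes from kubeadm)
    apiVersion: kubelet.config.k8s.io/v1beta1
    kind: KubeletConfiguration
    cgroupDriver: systemd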
Sep 12 17:33:10.952797 (kubelet)[1634]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 12 17:33:10.991320 kubelet[1634]: E0912 17:33:10.991270 1634 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 12 17:33:10.993945 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 12 17:33:10.994099 systemd[1]: kubelet.service: Failed with result 'exit-code'. Sep 12 17:33:11.269749 systemd[1]: Started sshd@1-135.181.96.215:22-147.75.109.163:39890.service - OpenSSH per-connection server daemon (147.75.109.163:39890). Sep 12 17:33:12.235167 sshd[1643]: Accepted publickey for core from 147.75.109.163 port 39890 ssh2: RSA SHA256:0lLQDI1exHD7h3/xNDVcYYaT5o4Ke3hnBgY7BErDCBA Sep 12 17:33:12.236345 sshd[1643]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 17:33:12.240593 systemd-logind[1481]: New session 2 of user core. Sep 12 17:33:12.246669 systemd[1]: Started session-2.scope - Session 2 of User core. Sep 12 17:33:12.908306 sshd[1643]: pam_unix(sshd:session): session closed for user core Sep 12 17:33:12.910512 systemd[1]: sshd@1-135.181.96.215:22-147.75.109.163:39890.service: Deactivated successfully. Sep 12 17:33:12.911828 systemd[1]: session-2.scope: Deactivated successfully. Sep 12 17:33:12.912754 systemd-logind[1481]: Session 2 logged out. Waiting for processes to exit. Sep 12 17:33:12.913704 systemd-logind[1481]: Removed session 2. Sep 12 17:33:13.073704 systemd[1]: Started sshd@2-135.181.96.215:22-147.75.109.163:39906.service - OpenSSH per-connection server daemon (147.75.109.163:39906). Sep 12 17:33:14.038230 sshd[1650]: Accepted publickey for core from 147.75.109.163 port 39906 ssh2: RSA SHA256:0lLQDI1exHD7h3/xNDVcYYaT5o4Ke3hnBgY7BErDCBA Sep 12 17:33:14.039615 sshd[1650]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 17:33:14.044037 systemd-logind[1481]: New session 3 of user core. Sep 12 17:33:14.052662 systemd[1]: Started session-3.scope - Session 3 of User core. Sep 12 17:33:14.707750 sshd[1650]: pam_unix(sshd:session): session closed for user core Sep 12 17:33:14.710651 systemd-logind[1481]: Session 3 logged out. Waiting for processes to exit. Sep 12 17:33:14.711348 systemd[1]: sshd@2-135.181.96.215:22-147.75.109.163:39906.service: Deactivated successfully. Sep 12 17:33:14.712481 systemd[1]: session-3.scope: Deactivated successfully. Sep 12 17:33:14.713235 systemd-logind[1481]: Removed session 3. Sep 12 17:33:14.908936 systemd[1]: Started sshd@3-135.181.96.215:22-147.75.109.163:39910.service - OpenSSH per-connection server daemon (147.75.109.163:39910). Sep 12 17:33:15.979825 sshd[1657]: Accepted publickey for core from 147.75.109.163 port 39910 ssh2: RSA SHA256:0lLQDI1exHD7h3/xNDVcYYaT5o4Ke3hnBgY7BErDCBA Sep 12 17:33:15.981051 sshd[1657]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 17:33:15.985747 systemd-logind[1481]: New session 4 of user core. Sep 12 17:33:15.994712 systemd[1]: Started session-4.scope - Session 4 of User core. 
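"Scheduled restart job, restart counter is at 1" is systemd applying the unit's Restart= policy; the roughly ten-second spacing between kubelet attempts in this log matches the conventional kubeadm service settings, sketched here:

    # kubelet.service, [Service] section (sketch)
    Restart=always
    RestartSec=10    # matches the ~10 s gap between the failures above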
Sep 12 17:33:16.723071 sshd[1657]: pam_unix(sshd:session): session closed for user core Sep 12 17:33:16.725685 systemd[1]: sshd@3-135.181.96.215:22-147.75.109.163:39910.service: Deactivated successfully. Sep 12 17:33:16.727162 systemd[1]: session-4.scope: Deactivated successfully. Sep 12 17:33:16.728420 systemd-logind[1481]: Session 4 logged out. Waiting for processes to exit. Sep 12 17:33:16.729583 systemd-logind[1481]: Removed session 4. Sep 12 17:33:16.871798 systemd[1]: Started sshd@4-135.181.96.215:22-147.75.109.163:39916.service - OpenSSH per-connection server daemon (147.75.109.163:39916). Sep 12 17:33:17.837918 sshd[1664]: Accepted publickey for core from 147.75.109.163 port 39916 ssh2: RSA SHA256:0lLQDI1exHD7h3/xNDVcYYaT5o4Ke3hnBgY7BErDCBA Sep 12 17:33:17.839583 sshd[1664]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 17:33:17.845859 systemd-logind[1481]: New session 5 of user core. Sep 12 17:33:17.858723 systemd[1]: Started session-5.scope - Session 5 of User core. Sep 12 17:33:18.361713 sudo[1667]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Sep 12 17:33:18.361974 sudo[1667]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 12 17:33:18.374034 sudo[1667]: pam_unix(sudo:session): session closed for user root Sep 12 17:33:18.531565 sshd[1664]: pam_unix(sshd:session): session closed for user core Sep 12 17:33:18.534109 systemd[1]: sshd@4-135.181.96.215:22-147.75.109.163:39916.service: Deactivated successfully. Sep 12 17:33:18.535867 systemd[1]: session-5.scope: Deactivated successfully. Sep 12 17:33:18.536672 systemd-logind[1481]: Session 5 logged out. Waiting for processes to exit. Sep 12 17:33:18.537577 systemd-logind[1481]: Removed session 5. Sep 12 17:33:18.731740 systemd[1]: Started sshd@5-135.181.96.215:22-147.75.109.163:39924.service - OpenSSH per-connection server daemon (147.75.109.163:39924). Sep 12 17:33:19.802678 sshd[1672]: Accepted publickey for core from 147.75.109.163 port 39924 ssh2: RSA SHA256:0lLQDI1exHD7h3/xNDVcYYaT5o4Ke3hnBgY7BErDCBA Sep 12 17:33:19.804652 sshd[1672]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 17:33:19.810401 systemd-logind[1481]: New session 6 of user core. Sep 12 17:33:19.815700 systemd[1]: Started session-6.scope - Session 6 of User core. Sep 12 17:33:20.372288 sudo[1676]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/bin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Sep 12 17:33:20.372579 sudo[1676]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 12 17:33:20.375358 sudo[1676]: pam_unix(sudo:session): session closed for user root Sep 12 17:33:20.379420 sudo[1675]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/bin/systemctl restart audit-rules Sep 12 17:33:20.379747 sudo[1675]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 12 17:33:20.391093 systemd[1]: Stopping audit-rules.service - Load Security Auditing Rules... Sep 12 17:33:20.393600 auditctl[1679]: No rules Sep 12 17:33:20.391997 systemd[1]: audit-rules.service: Deactivated successfully. Sep 12 17:33:20.392128 systemd[1]: Stopped audit-rules.service - Load Security Auditing Rules. Sep 12 17:33:20.394487 systemd[1]: Starting audit-rules.service - Load Security Auditing Rules... Sep 12 17:33:20.418135 augenrules[1697]: No rules Sep 12 17:33:20.419189 systemd[1]: Finished audit-rules.service - Load Security Auditing Rules. 
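The audit-rules restart above follows the augenrules flow: rule fragments live under /etc/audit/rules.d/, augenrules merges and loads them, and "No rules" simply reflects that the preceding sudo removed every fragment. The same reload by hand:

    $ augenrules --load   # rebuild the merged rule set from /etc/audit/rules.d and load it
    $ auditctl -l         # list loaded rules; prints "No rules" when the set is empty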
Sep 12 17:33:20.420707 sudo[1675]: pam_unix(sudo:session): session closed for user root Sep 12 17:33:20.598462 sshd[1672]: pam_unix(sshd:session): session closed for user core Sep 12 17:33:20.602339 systemd[1]: sshd@5-135.181.96.215:22-147.75.109.163:39924.service: Deactivated successfully. Sep 12 17:33:20.604658 systemd[1]: session-6.scope: Deactivated successfully. Sep 12 17:33:20.605598 systemd-logind[1481]: Session 6 logged out. Waiting for processes to exit. Sep 12 17:33:20.607450 systemd-logind[1481]: Removed session 6. Sep 12 17:33:20.789263 systemd[1]: Started sshd@6-135.181.96.215:22-147.75.109.163:40398.service - OpenSSH per-connection server daemon (147.75.109.163:40398). Sep 12 17:33:21.106293 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. Sep 12 17:33:21.112253 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 12 17:33:21.250578 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 12 17:33:21.261707 (kubelet)[1715]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 12 17:33:21.290528 kubelet[1715]: E0912 17:33:21.290480 1715 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 12 17:33:21.292666 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 12 17:33:21.292781 systemd[1]: kubelet.service: Failed with result 'exit-code'. Sep 12 17:33:21.867011 sshd[1705]: Accepted publickey for core from 147.75.109.163 port 40398 ssh2: RSA SHA256:0lLQDI1exHD7h3/xNDVcYYaT5o4Ke3hnBgY7BErDCBA Sep 12 17:33:21.869179 sshd[1705]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 17:33:21.877216 systemd-logind[1481]: New session 7 of user core. Sep 12 17:33:21.881661 systemd[1]: Started session-7.scope - Session 7 of User core. Sep 12 17:33:22.439638 sudo[1723]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Sep 12 17:33:22.440140 sudo[1723]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 12 17:33:22.832851 systemd[1]: Starting docker.service - Docker Application Container Engine... Sep 12 17:33:22.835098 (dockerd)[1740]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU Sep 12 17:33:23.248708 dockerd[1740]: time="2025-09-12T17:33:23.248614900Z" level=info msg="Starting up" Sep 12 17:33:23.391918 dockerd[1740]: time="2025-09-12T17:33:23.391660875Z" level=info msg="Loading containers: start." Sep 12 17:33:23.541581 kernel: Initializing XFRM netlink socket Sep 12 17:33:23.581198 systemd-timesyncd[1396]: Network configuration changed, trying to establish connection. Sep 12 17:33:24.762959 systemd-timesyncd[1396]: Contacted time server 85.215.166.214:123 (2.flatcar.pool.ntp.org). Sep 12 17:33:24.763183 systemd-timesyncd[1396]: Initial clock synchronization to Fri 2025-09-12 17:33:24.762657 UTC. Sep 12 17:33:24.763572 systemd-resolved[1357]: Clock change detected. Flushing caches. 
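The jump from 17:33:23 to 17:33:24 mid-stream here is systemd-timesyncd stepping the clock on its first successful NTP exchange, after which systemd-resolved notices the clock change and flushes its caches. The current synchronization state can be inspected at any time with:

    $ timedatectl timesync-status   # shows the selected server, offset and poll interval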
Sep 12 17:33:24.811378 systemd-networkd[1404]: docker0: Link UP Sep 12 17:33:24.829139 dockerd[1740]: time="2025-09-12T17:33:24.829061475Z" level=info msg="Loading containers: done." Sep 12 17:33:24.849646 dockerd[1740]: time="2025-09-12T17:33:24.849563258Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Sep 12 17:33:24.849826 dockerd[1740]: time="2025-09-12T17:33:24.849691569Z" level=info msg="Docker daemon" commit=061aa95809be396a6b5542618d8a34b02a21ff77 containerd-snapshotter=false storage-driver=overlay2 version=26.1.0 Sep 12 17:33:24.849871 dockerd[1740]: time="2025-09-12T17:33:24.849828656Z" level=info msg="Daemon has completed initialization" Sep 12 17:33:24.885529 dockerd[1740]: time="2025-09-12T17:33:24.885453376Z" level=info msg="API listen on /run/docker.sock" Sep 12 17:33:24.885919 systemd[1]: Started docker.service - Docker Application Container Engine. Sep 12 17:33:26.001010 containerd[1494]: time="2025-09-12T17:33:26.000904701Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.32.9\"" Sep 12 17:33:26.543779 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount347105986.mount: Deactivated successfully. Sep 12 17:33:27.707389 containerd[1494]: time="2025-09-12T17:33:27.707331164Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.32.9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:33:27.708449 containerd[1494]: time="2025-09-12T17:33:27.708242523Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.32.9: active requests=0, bytes read=28838016" Sep 12 17:33:27.710249 containerd[1494]: time="2025-09-12T17:33:27.709161326Z" level=info msg="ImageCreate event name:\"sha256:abd2b525baf428ffb8b8b7d1e09761dc5cdb7ed0c7896a9427e29e84f8eafc59\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:33:27.711556 containerd[1494]: time="2025-09-12T17:33:27.711529166Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:6df11cc2ad9679b1117be34d3a0230add88bc0a08fd7a3ebc26b680575e8de97\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:33:27.712475 containerd[1494]: time="2025-09-12T17:33:27.712445173Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.32.9\" with image id \"sha256:abd2b525baf428ffb8b8b7d1e09761dc5cdb7ed0c7896a9427e29e84f8eafc59\", repo tag \"registry.k8s.io/kube-apiserver:v1.32.9\", repo digest \"registry.k8s.io/kube-apiserver@sha256:6df11cc2ad9679b1117be34d3a0230add88bc0a08fd7a3ebc26b680575e8de97\", size \"28834515\" in 1.71144825s" Sep 12 17:33:27.712515 containerd[1494]: time="2025-09-12T17:33:27.712478696Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.32.9\" returns image reference \"sha256:abd2b525baf428ffb8b8b7d1e09761dc5cdb7ed0c7896a9427e29e84f8eafc59\"" Sep 12 17:33:27.713099 containerd[1494]: time="2025-09-12T17:33:27.713043005Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.32.9\"" Sep 12 17:33:28.890812 containerd[1494]: time="2025-09-12T17:33:28.890741509Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.32.9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:33:28.891959 containerd[1494]: time="2025-09-12T17:33:28.891795475Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.32.9: active requests=0, bytes read=24787049" Sep 12 
17:33:28.893244 containerd[1494]: time="2025-09-12T17:33:28.892959898Z" level=info msg="ImageCreate event name:\"sha256:0debe32fbb7223500fcf8c312f2a568a5abd3ed9274d8ec6780cfb30b8861e91\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:33:28.895670 containerd[1494]: time="2025-09-12T17:33:28.895634904Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:243c4b8e3bce271fcb1b78008ab996ab6976b1a20096deac08338fcd17979922\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:33:28.896405 containerd[1494]: time="2025-09-12T17:33:28.896280055Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.32.9\" with image id \"sha256:0debe32fbb7223500fcf8c312f2a568a5abd3ed9274d8ec6780cfb30b8861e91\", repo tag \"registry.k8s.io/kube-controller-manager:v1.32.9\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:243c4b8e3bce271fcb1b78008ab996ab6976b1a20096deac08338fcd17979922\", size \"26421706\" in 1.183207936s" Sep 12 17:33:28.896405 containerd[1494]: time="2025-09-12T17:33:28.896312044Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.32.9\" returns image reference \"sha256:0debe32fbb7223500fcf8c312f2a568a5abd3ed9274d8ec6780cfb30b8861e91\"" Sep 12 17:33:28.897010 containerd[1494]: time="2025-09-12T17:33:28.896987170Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.32.9\"" Sep 12 17:33:29.841757 containerd[1494]: time="2025-09-12T17:33:29.841692840Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.32.9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:33:29.842553 containerd[1494]: time="2025-09-12T17:33:29.842514690Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.32.9: active requests=0, bytes read=19176311" Sep 12 17:33:29.845236 containerd[1494]: time="2025-09-12T17:33:29.843302608Z" level=info msg="ImageCreate event name:\"sha256:6934c23b154fcb9bf54ed5913782de746735a49f4daa4732285915050cd44ad5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:33:29.848504 containerd[1494]: time="2025-09-12T17:33:29.848462173Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:50c49520dbd0e8b4076b6a5c77d8014df09ea3d59a73e8bafd2678d51ebb92d5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:33:29.849286 containerd[1494]: time="2025-09-12T17:33:29.849260560Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.32.9\" with image id \"sha256:6934c23b154fcb9bf54ed5913782de746735a49f4daa4732285915050cd44ad5\", repo tag \"registry.k8s.io/kube-scheduler:v1.32.9\", repo digest \"registry.k8s.io/kube-scheduler@sha256:50c49520dbd0e8b4076b6a5c77d8014df09ea3d59a73e8bafd2678d51ebb92d5\", size \"20810986\" in 952.246149ms" Sep 12 17:33:29.849329 containerd[1494]: time="2025-09-12T17:33:29.849287149Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.32.9\" returns image reference \"sha256:6934c23b154fcb9bf54ed5913782de746735a49f4daa4732285915050cd44ad5\"" Sep 12 17:33:29.850115 containerd[1494]: time="2025-09-12T17:33:29.850091107Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.32.9\"" Sep 12 17:33:30.752444 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3873716111.mount: Deactivated successfully. 
Sep 12 17:33:31.409565 containerd[1494]: time="2025-09-12T17:33:31.409500308Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.32.9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:33:31.410563 containerd[1494]: time="2025-09-12T17:33:31.410412349Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.32.9: active requests=0, bytes read=30924234" Sep 12 17:33:31.412102 containerd[1494]: time="2025-09-12T17:33:31.411266500Z" level=info msg="ImageCreate event name:\"sha256:fa3fdca615a501743d8deb39729a96e731312aac8d96accec061d5265360332f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:33:31.414052 containerd[1494]: time="2025-09-12T17:33:31.414024281Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:886af02535dc34886e4618b902f8c140d89af57233a245621d29642224516064\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:33:31.415658 containerd[1494]: time="2025-09-12T17:33:31.415115007Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.32.9\" with image id \"sha256:fa3fdca615a501743d8deb39729a96e731312aac8d96accec061d5265360332f\", repo tag \"registry.k8s.io/kube-proxy:v1.32.9\", repo digest \"registry.k8s.io/kube-proxy@sha256:886af02535dc34886e4618b902f8c140d89af57233a245621d29642224516064\", size \"30923225\" in 1.564954439s" Sep 12 17:33:31.415658 containerd[1494]: time="2025-09-12T17:33:31.415146475Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.32.9\" returns image reference \"sha256:fa3fdca615a501743d8deb39729a96e731312aac8d96accec061d5265360332f\"" Sep 12 17:33:31.415847 containerd[1494]: time="2025-09-12T17:33:31.415824005Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\"" Sep 12 17:33:31.890807 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount48035908.mount: Deactivated successfully. Sep 12 17:33:32.502756 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3. Sep 12 17:33:32.509581 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... 
Sep 12 17:33:32.588934 containerd[1494]: time="2025-09-12T17:33:32.588838507Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.11.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:33:32.593279 containerd[1494]: time="2025-09-12T17:33:32.590544136Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.11.3: active requests=0, bytes read=18565335" Sep 12 17:33:32.593279 containerd[1494]: time="2025-09-12T17:33:32.590609408Z" level=info msg="ImageCreate event name:\"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:33:32.597283 containerd[1494]: time="2025-09-12T17:33:32.597078859Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:33:32.598807 containerd[1494]: time="2025-09-12T17:33:32.598784988Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.11.3\" with image id \"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\", repo tag \"registry.k8s.io/coredns/coredns:v1.11.3\", repo digest \"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\", size \"18562039\" in 1.182930374s" Sep 12 17:33:32.598886 containerd[1494]: time="2025-09-12T17:33:32.598873594Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\" returns image reference \"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\"" Sep 12 17:33:32.600986 containerd[1494]: time="2025-09-12T17:33:32.600957922Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\"" Sep 12 17:33:32.602414 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 12 17:33:32.602518 (kubelet)[2011]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 12 17:33:32.634966 kubelet[2011]: E0912 17:33:32.634927 2011 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 12 17:33:32.637841 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 12 17:33:32.637982 systemd[1]: kubelet.service: Failed with result 'exit-code'. Sep 12 17:33:33.049246 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3241278364.mount: Deactivated successfully. 
Sep 12 17:33:33.054126 containerd[1494]: time="2025-09-12T17:33:33.054081745Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:33:33.054922 containerd[1494]: time="2025-09-12T17:33:33.054881685Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=321160" Sep 12 17:33:33.055638 containerd[1494]: time="2025-09-12T17:33:33.055597708Z" level=info msg="ImageCreate event name:\"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:33:33.057596 containerd[1494]: time="2025-09-12T17:33:33.057574695Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:33:33.058882 containerd[1494]: time="2025-09-12T17:33:33.058396546Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 457.408577ms" Sep 12 17:33:33.058882 containerd[1494]: time="2025-09-12T17:33:33.058433185Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\"" Sep 12 17:33:33.059072 containerd[1494]: time="2025-09-12T17:33:33.059057536Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.16-0\"" Sep 12 17:33:33.533861 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1404342962.mount: Deactivated successfully. Sep 12 17:33:34.978376 containerd[1494]: time="2025-09-12T17:33:34.978303397Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.16-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:33:34.979302 containerd[1494]: time="2025-09-12T17:33:34.979191032Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.16-0: active requests=0, bytes read=57682132" Sep 12 17:33:34.980321 containerd[1494]: time="2025-09-12T17:33:34.979943242Z" level=info msg="ImageCreate event name:\"sha256:a9e7e6b294baf1695fccb862d956c5d3ad8510e1e4ca1535f35dc09f247abbfc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:33:34.982206 containerd[1494]: time="2025-09-12T17:33:34.982178483Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:c6a9d11cc5c04b114ccdef39a9265eeef818e3d02f5359be035ae784097fdec5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:33:34.983302 containerd[1494]: time="2025-09-12T17:33:34.983269870Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.16-0\" with image id \"sha256:a9e7e6b294baf1695fccb862d956c5d3ad8510e1e4ca1535f35dc09f247abbfc\", repo tag \"registry.k8s.io/etcd:3.5.16-0\", repo digest \"registry.k8s.io/etcd@sha256:c6a9d11cc5c04b114ccdef39a9265eeef818e3d02f5359be035ae784097fdec5\", size \"57680541\" in 1.924148174s" Sep 12 17:33:34.983341 containerd[1494]: time="2025-09-12T17:33:34.983300398Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.16-0\" returns image reference \"sha256:a9e7e6b294baf1695fccb862d956c5d3ad8510e1e4ca1535f35dc09f247abbfc\"" Sep 12 17:33:37.776049 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. 
Sep 12 17:33:37.783440 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 12 17:33:37.806049 systemd[1]: Reloading requested from client PID 2103 ('systemctl') (unit session-7.scope)... Sep 12 17:33:37.806066 systemd[1]: Reloading... Sep 12 17:33:37.884242 zram_generator::config[2143]: No configuration found. Sep 12 17:33:37.960727 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Sep 12 17:33:38.017901 systemd[1]: Reloading finished in 211 ms. Sep 12 17:33:38.053768 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 12 17:33:38.056626 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Sep 12 17:33:38.057354 systemd[1]: kubelet.service: Deactivated successfully. Sep 12 17:33:38.057529 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Sep 12 17:33:38.062604 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 12 17:33:38.142606 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 12 17:33:38.146108 (kubelet)[2199]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Sep 12 17:33:38.184325 kubelet[2199]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 12 17:33:38.184325 kubelet[2199]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Sep 12 17:33:38.184325 kubelet[2199]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Sep 12 17:33:38.184325 kubelet[2199]: I0912 17:33:38.184127 2199 server.go:215] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Sep 12 17:33:38.439159 kubelet[2199]: I0912 17:33:38.439055 2199 server.go:520] "Kubelet version" kubeletVersion="v1.32.4" Sep 12 17:33:38.439159 kubelet[2199]: I0912 17:33:38.439090 2199 server.go:522] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Sep 12 17:33:38.439550 kubelet[2199]: I0912 17:33:38.439438 2199 server.go:954] "Client rotation is on, will bootstrap in background" Sep 12 17:33:38.470254 kubelet[2199]: E0912 17:33:38.469524 2199 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://135.181.96.215:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 135.181.96.215:6443: connect: connection refused" logger="UnhandledError" Sep 12 17:33:38.471600 kubelet[2199]: I0912 17:33:38.471439 2199 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Sep 12 17:33:38.483128 kubelet[2199]: E0912 17:33:38.482662 2199 log.go:32] "RuntimeConfig from runtime service failed" err="rpc error: code = Unimplemented desc = unknown method RuntimeConfig for service runtime.v1.RuntimeService" Sep 12 17:33:38.483128 kubelet[2199]: I0912 17:33:38.482693 2199 server.go:1421] "CRI implementation should be updated to support RuntimeConfig when KubeletCgroupDriverFromCRI feature gate has been enabled. Falling back to using cgroupDriver from kubelet config." Sep 12 17:33:38.487464 kubelet[2199]: I0912 17:33:38.487418 2199 server.go:772] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Sep 12 17:33:38.491206 kubelet[2199]: I0912 17:33:38.491148 2199 container_manager_linux.go:268] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Sep 12 17:33:38.491380 kubelet[2199]: I0912 17:33:38.491201 2199 container_manager_linux.go:273] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4081-3-6-f-c182586e87","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Sep 12 17:33:38.493176 kubelet[2199]: I0912 17:33:38.493128 2199 topology_manager.go:138] "Creating topology manager with none policy" Sep 12 17:33:38.493176 kubelet[2199]: I0912 17:33:38.493177 2199 container_manager_linux.go:304] "Creating device plugin manager" Sep 12 17:33:38.494581 kubelet[2199]: I0912 17:33:38.494555 2199 state_mem.go:36] "Initialized new in-memory state store" Sep 12 17:33:38.498269 kubelet[2199]: I0912 17:33:38.498253 2199 kubelet.go:446] "Attempting to sync node with API server" Sep 12 17:33:38.498311 kubelet[2199]: I0912 17:33:38.498280 2199 kubelet.go:341] "Adding static pod path" path="/etc/kubernetes/manifests" Sep 12 17:33:38.498311 kubelet[2199]: I0912 17:33:38.498297 2199 kubelet.go:352] "Adding apiserver pod source" Sep 12 17:33:38.498311 kubelet[2199]: I0912 17:33:38.498306 2199 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Sep 12 17:33:38.506601 kubelet[2199]: I0912 17:33:38.505925 2199 kuberuntime_manager.go:269] "Container runtime initialized" containerRuntime="containerd" version="v1.7.21" apiVersion="v1" Sep 12 17:33:38.509230 kubelet[2199]: I0912 17:33:38.509191 2199 kubelet.go:890] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Sep 12 17:33:38.510033 kubelet[2199]: W0912 17:33:38.509993 2199 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. 
Sep 12 17:33:38.510910 kubelet[2199]: I0912 17:33:38.510615 2199 watchdog_linux.go:99] "Systemd watchdog is not enabled" Sep 12 17:33:38.510910 kubelet[2199]: I0912 17:33:38.510650 2199 server.go:1287] "Started kubelet" Sep 12 17:33:38.510910 kubelet[2199]: W0912 17:33:38.510780 2199 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://135.181.96.215:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 135.181.96.215:6443: connect: connection refused Sep 12 17:33:38.510910 kubelet[2199]: E0912 17:33:38.510828 2199 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://135.181.96.215:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 135.181.96.215:6443: connect: connection refused" logger="UnhandledError" Sep 12 17:33:38.510910 kubelet[2199]: W0912 17:33:38.510892 2199 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://135.181.96.215:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4081-3-6-f-c182586e87&limit=500&resourceVersion=0": dial tcp 135.181.96.215:6443: connect: connection refused Sep 12 17:33:38.511050 kubelet[2199]: E0912 17:33:38.510918 2199 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://135.181.96.215:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4081-3-6-f-c182586e87&limit=500&resourceVersion=0\": dial tcp 135.181.96.215:6443: connect: connection refused" logger="UnhandledError" Sep 12 17:33:38.517381 kubelet[2199]: I0912 17:33:38.516985 2199 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Sep 12 17:33:38.517381 kubelet[2199]: I0912 17:33:38.517293 2199 server.go:243] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Sep 12 17:33:38.519635 kubelet[2199]: E0912 17:33:38.517842 2199 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://135.181.96.215:6443/api/v1/namespaces/default/events\": dial tcp 135.181.96.215:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4081-3-6-f-c182586e87.1864996621978a5c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4081-3-6-f-c182586e87,UID:ci-4081-3-6-f-c182586e87,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4081-3-6-f-c182586e87,},FirstTimestamp:2025-09-12 17:33:38.510629468 +0000 UTC m=+0.361485885,LastTimestamp:2025-09-12 17:33:38.510629468 +0000 UTC m=+0.361485885,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4081-3-6-f-c182586e87,}" Sep 12 17:33:38.519882 kubelet[2199]: I0912 17:33:38.519859 2199 server.go:169] "Starting to listen" address="0.0.0.0" port=10250 Sep 12 17:33:38.520333 kubelet[2199]: I0912 17:33:38.520307 2199 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Sep 12 17:33:38.520886 kubelet[2199]: I0912 17:33:38.520851 2199 server.go:479] "Adding debug handlers to kubelet server" Sep 12 17:33:38.521945 kubelet[2199]: I0912 17:33:38.521779 2199 dynamic_serving_content.go:135] "Starting controller" 
name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Sep 12 17:33:38.524692 kubelet[2199]: E0912 17:33:38.524588 2199 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4081-3-6-f-c182586e87\" not found" Sep 12 17:33:38.524692 kubelet[2199]: I0912 17:33:38.524622 2199 volume_manager.go:297] "Starting Kubelet Volume Manager" Sep 12 17:33:38.524809 kubelet[2199]: I0912 17:33:38.524782 2199 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Sep 12 17:33:38.524959 kubelet[2199]: I0912 17:33:38.524839 2199 reconciler.go:26] "Reconciler: start to sync state" Sep 12 17:33:38.526384 kubelet[2199]: W0912 17:33:38.526338 2199 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://135.181.96.215:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 135.181.96.215:6443: connect: connection refused Sep 12 17:33:38.526644 kubelet[2199]: E0912 17:33:38.526393 2199 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://135.181.96.215:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 135.181.96.215:6443: connect: connection refused" logger="UnhandledError" Sep 12 17:33:38.526817 kubelet[2199]: E0912 17:33:38.526698 2199 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://135.181.96.215:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4081-3-6-f-c182586e87?timeout=10s\": dial tcp 135.181.96.215:6443: connect: connection refused" interval="200ms" Sep 12 17:33:38.527818 kubelet[2199]: I0912 17:33:38.527788 2199 factory.go:221] Registration of the systemd container factory successfully Sep 12 17:33:38.527955 kubelet[2199]: I0912 17:33:38.527883 2199 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Sep 12 17:33:38.531105 kubelet[2199]: I0912 17:33:38.531029 2199 factory.go:221] Registration of the containerd container factory successfully Sep 12 17:33:38.542315 kubelet[2199]: I0912 17:33:38.542273 2199 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Sep 12 17:33:38.543257 kubelet[2199]: I0912 17:33:38.543209 2199 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Sep 12 17:33:38.543257 kubelet[2199]: I0912 17:33:38.543242 2199 status_manager.go:227] "Starting to sync pod status with apiserver" Sep 12 17:33:38.543326 kubelet[2199]: I0912 17:33:38.543267 2199 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Sep 12 17:33:38.543326 kubelet[2199]: I0912 17:33:38.543274 2199 kubelet.go:2382] "Starting kubelet main sync loop" Sep 12 17:33:38.543326 kubelet[2199]: E0912 17:33:38.543308 2199 kubelet.go:2406] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Sep 12 17:33:38.552081 kubelet[2199]: W0912 17:33:38.552031 2199 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://135.181.96.215:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 135.181.96.215:6443: connect: connection refused Sep 12 17:33:38.552081 kubelet[2199]: E0912 17:33:38.552075 2199 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://135.181.96.215:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 135.181.96.215:6443: connect: connection refused" logger="UnhandledError" Sep 12 17:33:38.558174 kubelet[2199]: I0912 17:33:38.558138 2199 cpu_manager.go:221] "Starting CPU manager" policy="none" Sep 12 17:33:38.558174 kubelet[2199]: I0912 17:33:38.558153 2199 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Sep 12 17:33:38.558309 kubelet[2199]: I0912 17:33:38.558190 2199 state_mem.go:36] "Initialized new in-memory state store" Sep 12 17:33:38.560031 kubelet[2199]: I0912 17:33:38.560005 2199 policy_none.go:49] "None policy: Start" Sep 12 17:33:38.560031 kubelet[2199]: I0912 17:33:38.560033 2199 memory_manager.go:186] "Starting memorymanager" policy="None" Sep 12 17:33:38.560098 kubelet[2199]: I0912 17:33:38.560043 2199 state_mem.go:35] "Initializing new in-memory state store" Sep 12 17:33:38.565205 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. Sep 12 17:33:38.581179 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. Sep 12 17:33:38.583758 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. Sep 12 17:33:38.599863 kubelet[2199]: I0912 17:33:38.599840 2199 manager.go:519] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Sep 12 17:33:38.600008 kubelet[2199]: I0912 17:33:38.599976 2199 eviction_manager.go:189] "Eviction manager: starting control loop" Sep 12 17:33:38.600008 kubelet[2199]: I0912 17:33:38.599992 2199 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Sep 12 17:33:38.600328 kubelet[2199]: I0912 17:33:38.600305 2199 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Sep 12 17:33:38.602124 kubelet[2199]: E0912 17:33:38.602093 2199 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime" Sep 12 17:33:38.602273 kubelet[2199]: E0912 17:33:38.602253 2199 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-4081-3-6-f-c182586e87\" not found" Sep 12 17:33:38.654590 systemd[1]: Created slice kubepods-burstable-podc3d885514471810970ca3bb7f49a949f.slice - libcontainer container kubepods-burstable-podc3d885514471810970ca3bb7f49a949f.slice. 
Sep 12 17:33:38.662952 kubelet[2199]: E0912 17:33:38.662902 2199 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4081-3-6-f-c182586e87\" not found" node="ci-4081-3-6-f-c182586e87" Sep 12 17:33:38.665341 systemd[1]: Created slice kubepods-burstable-pod631e63f3ef4f8ccac2e85a3db4258939.slice - libcontainer container kubepods-burstable-pod631e63f3ef4f8ccac2e85a3db4258939.slice. Sep 12 17:33:38.667984 kubelet[2199]: E0912 17:33:38.667778 2199 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4081-3-6-f-c182586e87\" not found" node="ci-4081-3-6-f-c182586e87" Sep 12 17:33:38.669956 systemd[1]: Created slice kubepods-burstable-podaa00d48b317e00057ea93af74e4cc0d2.slice - libcontainer container kubepods-burstable-podaa00d48b317e00057ea93af74e4cc0d2.slice. Sep 12 17:33:38.671379 kubelet[2199]: E0912 17:33:38.671345 2199 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4081-3-6-f-c182586e87\" not found" node="ci-4081-3-6-f-c182586e87" Sep 12 17:33:38.702343 kubelet[2199]: I0912 17:33:38.702064 2199 kubelet_node_status.go:75] "Attempting to register node" node="ci-4081-3-6-f-c182586e87" Sep 12 17:33:38.702438 kubelet[2199]: E0912 17:33:38.702400 2199 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://135.181.96.215:6443/api/v1/nodes\": dial tcp 135.181.96.215:6443: connect: connection refused" node="ci-4081-3-6-f-c182586e87" Sep 12 17:33:38.725960 kubelet[2199]: I0912 17:33:38.725747 2199 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/c3d885514471810970ca3bb7f49a949f-ca-certs\") pod \"kube-apiserver-ci-4081-3-6-f-c182586e87\" (UID: \"c3d885514471810970ca3bb7f49a949f\") " pod="kube-system/kube-apiserver-ci-4081-3-6-f-c182586e87" Sep 12 17:33:38.725960 kubelet[2199]: I0912 17:33:38.725782 2199 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/c3d885514471810970ca3bb7f49a949f-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4081-3-6-f-c182586e87\" (UID: \"c3d885514471810970ca3bb7f49a949f\") " pod="kube-system/kube-apiserver-ci-4081-3-6-f-c182586e87" Sep 12 17:33:38.725960 kubelet[2199]: I0912 17:33:38.725803 2199 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/631e63f3ef4f8ccac2e85a3db4258939-ca-certs\") pod \"kube-controller-manager-ci-4081-3-6-f-c182586e87\" (UID: \"631e63f3ef4f8ccac2e85a3db4258939\") " pod="kube-system/kube-controller-manager-ci-4081-3-6-f-c182586e87" Sep 12 17:33:38.725960 kubelet[2199]: I0912 17:33:38.725820 2199 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/631e63f3ef4f8ccac2e85a3db4258939-k8s-certs\") pod \"kube-controller-manager-ci-4081-3-6-f-c182586e87\" (UID: \"631e63f3ef4f8ccac2e85a3db4258939\") " pod="kube-system/kube-controller-manager-ci-4081-3-6-f-c182586e87" Sep 12 17:33:38.725960 kubelet[2199]: I0912 17:33:38.725834 2199 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/aa00d48b317e00057ea93af74e4cc0d2-kubeconfig\") pod 
\"kube-scheduler-ci-4081-3-6-f-c182586e87\" (UID: \"aa00d48b317e00057ea93af74e4cc0d2\") " pod="kube-system/kube-scheduler-ci-4081-3-6-f-c182586e87" Sep 12 17:33:38.726127 kubelet[2199]: I0912 17:33:38.725848 2199 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/c3d885514471810970ca3bb7f49a949f-k8s-certs\") pod \"kube-apiserver-ci-4081-3-6-f-c182586e87\" (UID: \"c3d885514471810970ca3bb7f49a949f\") " pod="kube-system/kube-apiserver-ci-4081-3-6-f-c182586e87" Sep 12 17:33:38.726127 kubelet[2199]: I0912 17:33:38.725863 2199 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/631e63f3ef4f8ccac2e85a3db4258939-flexvolume-dir\") pod \"kube-controller-manager-ci-4081-3-6-f-c182586e87\" (UID: \"631e63f3ef4f8ccac2e85a3db4258939\") " pod="kube-system/kube-controller-manager-ci-4081-3-6-f-c182586e87" Sep 12 17:33:38.726127 kubelet[2199]: I0912 17:33:38.725878 2199 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/631e63f3ef4f8ccac2e85a3db4258939-kubeconfig\") pod \"kube-controller-manager-ci-4081-3-6-f-c182586e87\" (UID: \"631e63f3ef4f8ccac2e85a3db4258939\") " pod="kube-system/kube-controller-manager-ci-4081-3-6-f-c182586e87" Sep 12 17:33:38.726127 kubelet[2199]: I0912 17:33:38.725908 2199 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/631e63f3ef4f8ccac2e85a3db4258939-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4081-3-6-f-c182586e87\" (UID: \"631e63f3ef4f8ccac2e85a3db4258939\") " pod="kube-system/kube-controller-manager-ci-4081-3-6-f-c182586e87" Sep 12 17:33:38.728096 kubelet[2199]: E0912 17:33:38.728056 2199 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://135.181.96.215:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4081-3-6-f-c182586e87?timeout=10s\": dial tcp 135.181.96.215:6443: connect: connection refused" interval="400ms" Sep 12 17:33:38.904410 kubelet[2199]: I0912 17:33:38.904348 2199 kubelet_node_status.go:75] "Attempting to register node" node="ci-4081-3-6-f-c182586e87" Sep 12 17:33:38.904826 kubelet[2199]: E0912 17:33:38.904775 2199 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://135.181.96.215:6443/api/v1/nodes\": dial tcp 135.181.96.215:6443: connect: connection refused" node="ci-4081-3-6-f-c182586e87" Sep 12 17:33:38.964169 containerd[1494]: time="2025-09-12T17:33:38.964053916Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4081-3-6-f-c182586e87,Uid:c3d885514471810970ca3bb7f49a949f,Namespace:kube-system,Attempt:0,}" Sep 12 17:33:38.971708 containerd[1494]: time="2025-09-12T17:33:38.971638447Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4081-3-6-f-c182586e87,Uid:631e63f3ef4f8ccac2e85a3db4258939,Namespace:kube-system,Attempt:0,}" Sep 12 17:33:38.972511 containerd[1494]: time="2025-09-12T17:33:38.972460769Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4081-3-6-f-c182586e87,Uid:aa00d48b317e00057ea93af74e4cc0d2,Namespace:kube-system,Attempt:0,}" Sep 12 17:33:39.129533 kubelet[2199]: E0912 17:33:39.129455 2199 controller.go:145] "Failed to ensure lease exists, will 
retry" err="Get \"https://135.181.96.215:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4081-3-6-f-c182586e87?timeout=10s\": dial tcp 135.181.96.215:6443: connect: connection refused" interval="800ms" Sep 12 17:33:39.307265 kubelet[2199]: I0912 17:33:39.307206 2199 kubelet_node_status.go:75] "Attempting to register node" node="ci-4081-3-6-f-c182586e87" Sep 12 17:33:39.307703 kubelet[2199]: E0912 17:33:39.307542 2199 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://135.181.96.215:6443/api/v1/nodes\": dial tcp 135.181.96.215:6443: connect: connection refused" node="ci-4081-3-6-f-c182586e87" Sep 12 17:33:39.434062 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2589375204.mount: Deactivated successfully. Sep 12 17:33:39.441634 containerd[1494]: time="2025-09-12T17:33:39.441550671Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Sep 12 17:33:39.444280 containerd[1494]: time="2025-09-12T17:33:39.444206911Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=312078" Sep 12 17:33:39.445458 containerd[1494]: time="2025-09-12T17:33:39.445360514Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Sep 12 17:33:39.447322 containerd[1494]: time="2025-09-12T17:33:39.447185486Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Sep 12 17:33:39.449864 containerd[1494]: time="2025-09-12T17:33:39.447974997Z" level=info msg="ImageCreate event name:\"sha256:4873874c08efc72e9729683a83ffbb7502ee729e9a5ac097723806ea7fa13517\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Sep 12 17:33:39.450615 containerd[1494]: time="2025-09-12T17:33:39.450553852Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=0" Sep 12 17:33:39.451525 containerd[1494]: time="2025-09-12T17:33:39.451240480Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=0" Sep 12 17:33:39.453718 containerd[1494]: time="2025-09-12T17:33:39.453671378Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Sep 12 17:33:39.456899 containerd[1494]: time="2025-09-12T17:33:39.456824641Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4873874c08efc72e9729683a83ffbb7502ee729e9a5ac097723806ea7fa13517\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"311286\" in 484.291927ms" Sep 12 17:33:39.458959 containerd[1494]: time="2025-09-12T17:33:39.458899642Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4873874c08efc72e9729683a83ffbb7502ee729e9a5ac097723806ea7fa13517\", repo tag \"registry.k8s.io/pause:3.8\", repo digest 
\"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"311286\" in 494.763241ms" Sep 12 17:33:39.464369 containerd[1494]: time="2025-09-12T17:33:39.464318162Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4873874c08efc72e9729683a83ffbb7502ee729e9a5ac097723806ea7fa13517\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"311286\" in 492.60206ms" Sep 12 17:33:39.565302 kubelet[2199]: W0912 17:33:39.565031 2199 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://135.181.96.215:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 135.181.96.215:6443: connect: connection refused Sep 12 17:33:39.565302 kubelet[2199]: E0912 17:33:39.565133 2199 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://135.181.96.215:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 135.181.96.215:6443: connect: connection refused" logger="UnhandledError" Sep 12 17:33:39.606996 kubelet[2199]: W0912 17:33:39.606938 2199 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://135.181.96.215:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 135.181.96.215:6443: connect: connection refused Sep 12 17:33:39.607204 kubelet[2199]: E0912 17:33:39.607031 2199 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://135.181.96.215:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 135.181.96.215:6443: connect: connection refused" logger="UnhandledError" Sep 12 17:33:39.639106 containerd[1494]: time="2025-09-12T17:33:39.636837924Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Sep 12 17:33:39.639106 containerd[1494]: time="2025-09-12T17:33:39.636932000Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Sep 12 17:33:39.639106 containerd[1494]: time="2025-09-12T17:33:39.636953441Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 12 17:33:39.639106 containerd[1494]: time="2025-09-12T17:33:39.637093574Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 12 17:33:39.645038 containerd[1494]: time="2025-09-12T17:33:39.644124367Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Sep 12 17:33:39.647493 containerd[1494]: time="2025-09-12T17:33:39.646093740Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Sep 12 17:33:39.647493 containerd[1494]: time="2025-09-12T17:33:39.646118907Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 12 17:33:39.647493 containerd[1494]: time="2025-09-12T17:33:39.646751152Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 12 17:33:39.661629 containerd[1494]: time="2025-09-12T17:33:39.661345478Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Sep 12 17:33:39.661629 containerd[1494]: time="2025-09-12T17:33:39.661405441Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Sep 12 17:33:39.661629 containerd[1494]: time="2025-09-12T17:33:39.661421531Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 12 17:33:39.661629 containerd[1494]: time="2025-09-12T17:33:39.661495580Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 12 17:33:39.679363 systemd[1]: Started cri-containerd-68f0a35c74934ce78267c3e621151e23f2d6b6bb0e496461cd802e7aa0d8209e.scope - libcontainer container 68f0a35c74934ce78267c3e621151e23f2d6b6bb0e496461cd802e7aa0d8209e. Sep 12 17:33:39.684308 systemd[1]: Started cri-containerd-c969666c657fc931da851be3b1fea5fd6a61fae8d9168d556458778ca41b5a47.scope - libcontainer container c969666c657fc931da851be3b1fea5fd6a61fae8d9168d556458778ca41b5a47. Sep 12 17:33:39.699318 systemd[1]: Started cri-containerd-e489b027ddbf82b7dc26f7be3be9f77c25a8731cf7f59e4806f2d3ca29ec3ea2.scope - libcontainer container e489b027ddbf82b7dc26f7be3be9f77c25a8731cf7f59e4806f2d3ca29ec3ea2. Sep 12 17:33:39.735055 containerd[1494]: time="2025-09-12T17:33:39.735007351Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4081-3-6-f-c182586e87,Uid:c3d885514471810970ca3bb7f49a949f,Namespace:kube-system,Attempt:0,} returns sandbox id \"c969666c657fc931da851be3b1fea5fd6a61fae8d9168d556458778ca41b5a47\"" Sep 12 17:33:39.740662 containerd[1494]: time="2025-09-12T17:33:39.740622661Z" level=info msg="CreateContainer within sandbox \"c969666c657fc931da851be3b1fea5fd6a61fae8d9168d556458778ca41b5a47\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Sep 12 17:33:39.761866 containerd[1494]: time="2025-09-12T17:33:39.761825358Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4081-3-6-f-c182586e87,Uid:aa00d48b317e00057ea93af74e4cc0d2,Namespace:kube-system,Attempt:0,} returns sandbox id \"68f0a35c74934ce78267c3e621151e23f2d6b6bb0e496461cd802e7aa0d8209e\"" Sep 12 17:33:39.764779 containerd[1494]: time="2025-09-12T17:33:39.764426095Z" level=info msg="CreateContainer within sandbox \"c969666c657fc931da851be3b1fea5fd6a61fae8d9168d556458778ca41b5a47\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"2974aba6a2acc4622f738c2465c52ec550b736f914eb66456644d1fd3e2d61a8\"" Sep 12 17:33:39.765163 containerd[1494]: time="2025-09-12T17:33:39.765132118Z" level=info msg="StartContainer for \"2974aba6a2acc4622f738c2465c52ec550b736f914eb66456644d1fd3e2d61a8\"" Sep 12 17:33:39.765371 containerd[1494]: time="2025-09-12T17:33:39.765182493Z" level=info msg="CreateContainer within sandbox \"68f0a35c74934ce78267c3e621151e23f2d6b6bb0e496461cd802e7aa0d8209e\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Sep 12 17:33:39.778366 containerd[1494]: 
time="2025-09-12T17:33:39.778282296Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4081-3-6-f-c182586e87,Uid:631e63f3ef4f8ccac2e85a3db4258939,Namespace:kube-system,Attempt:0,} returns sandbox id \"e489b027ddbf82b7dc26f7be3be9f77c25a8731cf7f59e4806f2d3ca29ec3ea2\"" Sep 12 17:33:39.782367 containerd[1494]: time="2025-09-12T17:33:39.782278290Z" level=info msg="CreateContainer within sandbox \"e489b027ddbf82b7dc26f7be3be9f77c25a8731cf7f59e4806f2d3ca29ec3ea2\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Sep 12 17:33:39.790639 containerd[1494]: time="2025-09-12T17:33:39.790596888Z" level=info msg="CreateContainer within sandbox \"68f0a35c74934ce78267c3e621151e23f2d6b6bb0e496461cd802e7aa0d8209e\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"0c9078338a68164d9c44a82df350b50f518ef7b10c6c0343e95d9dbfda6139b3\"" Sep 12 17:33:39.791146 containerd[1494]: time="2025-09-12T17:33:39.791090774Z" level=info msg="StartContainer for \"0c9078338a68164d9c44a82df350b50f518ef7b10c6c0343e95d9dbfda6139b3\"" Sep 12 17:33:39.798651 containerd[1494]: time="2025-09-12T17:33:39.798536906Z" level=info msg="CreateContainer within sandbox \"e489b027ddbf82b7dc26f7be3be9f77c25a8731cf7f59e4806f2d3ca29ec3ea2\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"8caa9e73f6120948e5ba5c5ebb6443ea4f2c29ac056432d78f4ac11f856f9e2b\"" Sep 12 17:33:39.799257 containerd[1494]: time="2025-09-12T17:33:39.798881763Z" level=info msg="StartContainer for \"8caa9e73f6120948e5ba5c5ebb6443ea4f2c29ac056432d78f4ac11f856f9e2b\"" Sep 12 17:33:39.799335 systemd[1]: Started cri-containerd-2974aba6a2acc4622f738c2465c52ec550b736f914eb66456644d1fd3e2d61a8.scope - libcontainer container 2974aba6a2acc4622f738c2465c52ec550b736f914eb66456644d1fd3e2d61a8. Sep 12 17:33:39.832340 systemd[1]: Started cri-containerd-0c9078338a68164d9c44a82df350b50f518ef7b10c6c0343e95d9dbfda6139b3.scope - libcontainer container 0c9078338a68164d9c44a82df350b50f518ef7b10c6c0343e95d9dbfda6139b3. Sep 12 17:33:39.834975 kubelet[2199]: W0912 17:33:39.834928 2199 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://135.181.96.215:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4081-3-6-f-c182586e87&limit=500&resourceVersion=0": dial tcp 135.181.96.215:6443: connect: connection refused Sep 12 17:33:39.835041 kubelet[2199]: E0912 17:33:39.834984 2199 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://135.181.96.215:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4081-3-6-f-c182586e87&limit=500&resourceVersion=0\": dial tcp 135.181.96.215:6443: connect: connection refused" logger="UnhandledError" Sep 12 17:33:39.835560 systemd[1]: Started cri-containerd-8caa9e73f6120948e5ba5c5ebb6443ea4f2c29ac056432d78f4ac11f856f9e2b.scope - libcontainer container 8caa9e73f6120948e5ba5c5ebb6443ea4f2c29ac056432d78f4ac11f856f9e2b. 
Sep 12 17:33:39.862564 containerd[1494]: time="2025-09-12T17:33:39.862527625Z" level=info msg="StartContainer for \"2974aba6a2acc4622f738c2465c52ec550b736f914eb66456644d1fd3e2d61a8\" returns successfully" Sep 12 17:33:39.890833 containerd[1494]: time="2025-09-12T17:33:39.890797825Z" level=info msg="StartContainer for \"0c9078338a68164d9c44a82df350b50f518ef7b10c6c0343e95d9dbfda6139b3\" returns successfully" Sep 12 17:33:39.891245 containerd[1494]: time="2025-09-12T17:33:39.891067710Z" level=info msg="StartContainer for \"8caa9e73f6120948e5ba5c5ebb6443ea4f2c29ac056432d78f4ac11f856f9e2b\" returns successfully" Sep 12 17:33:39.892498 kubelet[2199]: W0912 17:33:39.892359 2199 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://135.181.96.215:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 135.181.96.215:6443: connect: connection refused Sep 12 17:33:39.892498 kubelet[2199]: E0912 17:33:39.892434 2199 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://135.181.96.215:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 135.181.96.215:6443: connect: connection refused" logger="UnhandledError" Sep 12 17:33:39.930378 kubelet[2199]: E0912 17:33:39.930327 2199 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://135.181.96.215:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4081-3-6-f-c182586e87?timeout=10s\": dial tcp 135.181.96.215:6443: connect: connection refused" interval="1.6s" Sep 12 17:33:40.110468 kubelet[2199]: I0912 17:33:40.110023 2199 kubelet_node_status.go:75] "Attempting to register node" node="ci-4081-3-6-f-c182586e87" Sep 12 17:33:40.110958 kubelet[2199]: E0912 17:33:40.110928 2199 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://135.181.96.215:6443/api/v1/nodes\": dial tcp 135.181.96.215:6443: connect: connection refused" node="ci-4081-3-6-f-c182586e87" Sep 12 17:33:40.565187 kubelet[2199]: E0912 17:33:40.565022 2199 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4081-3-6-f-c182586e87\" not found" node="ci-4081-3-6-f-c182586e87" Sep 12 17:33:40.566846 kubelet[2199]: E0912 17:33:40.566456 2199 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4081-3-6-f-c182586e87\" not found" node="ci-4081-3-6-f-c182586e87" Sep 12 17:33:40.567751 kubelet[2199]: E0912 17:33:40.567623 2199 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4081-3-6-f-c182586e87\" not found" node="ci-4081-3-6-f-c182586e87" Sep 12 17:33:41.539519 kubelet[2199]: E0912 17:33:41.539401 2199 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ci-4081-3-6-f-c182586e87\" not found" node="ci-4081-3-6-f-c182586e87" Sep 12 17:33:41.574089 kubelet[2199]: E0912 17:33:41.573774 2199 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4081-3-6-f-c182586e87\" not found" node="ci-4081-3-6-f-c182586e87" Sep 12 17:33:41.574089 kubelet[2199]: E0912 17:33:41.573932 2199 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4081-3-6-f-c182586e87\" not found" 
node="ci-4081-3-6-f-c182586e87" Sep 12 17:33:41.643941 kubelet[2199]: E0912 17:33:41.643878 2199 csi_plugin.go:308] Failed to initialize CSINode: error updating CSINode annotation: timed out waiting for the condition; caused by: nodes "ci-4081-3-6-f-c182586e87" not found Sep 12 17:33:41.714100 kubelet[2199]: I0912 17:33:41.714053 2199 kubelet_node_status.go:75] "Attempting to register node" node="ci-4081-3-6-f-c182586e87" Sep 12 17:33:41.731499 kubelet[2199]: I0912 17:33:41.731454 2199 kubelet_node_status.go:78] "Successfully registered node" node="ci-4081-3-6-f-c182586e87" Sep 12 17:33:41.731499 kubelet[2199]: E0912 17:33:41.731496 2199 kubelet_node_status.go:548] "Error updating node status, will retry" err="error getting node \"ci-4081-3-6-f-c182586e87\": node \"ci-4081-3-6-f-c182586e87\" not found" Sep 12 17:33:41.746548 kubelet[2199]: E0912 17:33:41.746485 2199 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4081-3-6-f-c182586e87\" not found" Sep 12 17:33:41.846820 kubelet[2199]: E0912 17:33:41.846633 2199 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4081-3-6-f-c182586e87\" not found" Sep 12 17:33:41.947736 kubelet[2199]: E0912 17:33:41.947642 2199 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4081-3-6-f-c182586e87\" not found" Sep 12 17:33:42.048594 kubelet[2199]: E0912 17:33:42.048534 2199 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4081-3-6-f-c182586e87\" not found" Sep 12 17:33:42.149295 kubelet[2199]: E0912 17:33:42.149185 2199 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4081-3-6-f-c182586e87\" not found" Sep 12 17:33:42.249969 kubelet[2199]: E0912 17:33:42.249926 2199 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4081-3-6-f-c182586e87\" not found" Sep 12 17:33:42.350837 kubelet[2199]: E0912 17:33:42.350784 2199 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4081-3-6-f-c182586e87\" not found" Sep 12 17:33:42.451526 kubelet[2199]: E0912 17:33:42.451386 2199 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4081-3-6-f-c182586e87\" not found" Sep 12 17:33:42.551882 kubelet[2199]: E0912 17:33:42.551829 2199 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4081-3-6-f-c182586e87\" not found" Sep 12 17:33:42.652585 kubelet[2199]: E0912 17:33:42.652537 2199 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4081-3-6-f-c182586e87\" not found" Sep 12 17:33:42.753681 kubelet[2199]: E0912 17:33:42.753639 2199 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4081-3-6-f-c182586e87\" not found" Sep 12 17:33:42.853847 kubelet[2199]: E0912 17:33:42.853788 2199 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4081-3-6-f-c182586e87\" not found" Sep 12 17:33:42.886937 kubelet[2199]: E0912 17:33:42.886897 2199 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4081-3-6-f-c182586e87\" not found" node="ci-4081-3-6-f-c182586e87" Sep 12 17:33:42.954009 kubelet[2199]: E0912 17:33:42.953939 2199 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4081-3-6-f-c182586e87\" not found" Sep 12 17:33:43.027022 kubelet[2199]: I0912 
17:33:43.026613 2199 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4081-3-6-f-c182586e87" Sep 12 17:33:43.036523 kubelet[2199]: I0912 17:33:43.036473 2199 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4081-3-6-f-c182586e87" Sep 12 17:33:43.041109 kubelet[2199]: I0912 17:33:43.041059 2199 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4081-3-6-f-c182586e87" Sep 12 17:33:43.084417 kubelet[2199]: I0912 17:33:43.084098 2199 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4081-3-6-f-c182586e87" Sep 12 17:33:43.089548 kubelet[2199]: E0912 17:33:43.089333 2199 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-controller-manager-ci-4081-3-6-f-c182586e87\" already exists" pod="kube-system/kube-controller-manager-ci-4081-3-6-f-c182586e87" Sep 12 17:33:43.504674 kubelet[2199]: I0912 17:33:43.504638 2199 apiserver.go:52] "Watching apiserver" Sep 12 17:33:43.525050 kubelet[2199]: I0912 17:33:43.525014 2199 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Sep 12 17:33:43.599155 systemd[1]: Reloading requested from client PID 2470 ('systemctl') (unit session-7.scope)... Sep 12 17:33:43.599171 systemd[1]: Reloading... Sep 12 17:33:43.664255 zram_generator::config[2506]: No configuration found. Sep 12 17:33:43.752776 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Sep 12 17:33:43.822887 systemd[1]: Reloading finished in 223 ms. Sep 12 17:33:43.849043 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Sep 12 17:33:43.851170 kubelet[2199]: I0912 17:33:43.849033 2199 dynamic_cafile_content.go:175] "Shutting down controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Sep 12 17:33:43.869156 systemd[1]: kubelet.service: Deactivated successfully. Sep 12 17:33:43.869340 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Sep 12 17:33:43.874777 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 12 17:33:43.958180 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 12 17:33:43.968688 (kubelet)[2561]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Sep 12 17:33:44.019688 kubelet[2561]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 12 17:33:44.019688 kubelet[2561]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Sep 12 17:33:44.019688 kubelet[2561]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Sep 12 17:33:44.020007 kubelet[2561]: I0912 17:33:44.019744 2561 server.go:215] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Sep 12 17:33:44.026345 kubelet[2561]: I0912 17:33:44.026325 2561 server.go:520] "Kubelet version" kubeletVersion="v1.32.4"
Sep 12 17:33:44.027814 kubelet[2561]: I0912 17:33:44.026433 2561 server.go:522] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Sep 12 17:33:44.027814 kubelet[2561]: I0912 17:33:44.026767 2561 server.go:954] "Client rotation is on, will bootstrap in background"
Sep 12 17:33:44.028831 kubelet[2561]: I0912 17:33:44.028816 2561 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem".
Sep 12 17:33:44.030540 kubelet[2561]: I0912 17:33:44.030526 2561 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
Sep 12 17:33:44.035468 kubelet[2561]: E0912 17:33:44.035434 2561 log.go:32] "RuntimeConfig from runtime service failed" err="rpc error: code = Unimplemented desc = unknown method RuntimeConfig for service runtime.v1.RuntimeService"
Sep 12 17:33:44.035468 kubelet[2561]: I0912 17:33:44.035466 2561 server.go:1421] "CRI implementation should be updated to support RuntimeConfig when KubeletCgroupDriverFromCRI feature gate has been enabled. Falling back to using cgroupDriver from kubelet config."
Sep 12 17:33:44.040139 kubelet[2561]: I0912 17:33:44.039599 2561 server.go:772] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /"
Sep 12 17:33:44.040139 kubelet[2561]: I0912 17:33:44.039772 2561 container_manager_linux.go:268] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Sep 12 17:33:44.040139 kubelet[2561]: I0912 17:33:44.039790 2561 container_manager_linux.go:273] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4081-3-6-f-c182586e87","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
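The nodeConfig dump above is plain JSON, so the hard-eviction thresholds it carries (memory.available < 100Mi, nodefs.available < 10%, and so on) can be lifted back out programmatically. A minimal sketch that decodes a trimmed copy of the logged object into ad-hoc Go types; the struct here is illustrative, not the kubelet's own configuration type:

package main

import (
	"encoding/json"
	"fmt"
)

// Ad-hoc mirror of the fields of interest in the logged nodeConfig JSON.
type nodeConfig struct {
	CgroupDriver           string
	CgroupVersion          int
	HardEvictionThresholds []struct {
		Signal   string
		Operator string
		Value    struct {
			Quantity   *string // e.g. "100Mi"; null in the log when a percentage is used
			Percentage float64
		}
	}
}

func main() {
	// Trimmed verbatim from the kubelet log line above.
	logged := `{"CgroupDriver":"systemd","CgroupVersion":2,
	  "HardEvictionThresholds":[
	    {"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0}},
	    {"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1}}]}`

	var cfg nodeConfig
	if err := json.Unmarshal([]byte(logged), &cfg); err != nil {
		panic(err)
	}
	for _, t := range cfg.HardEvictionThresholds {
		if t.Value.Quantity != nil {
			fmt.Printf("%s %s %s\n", t.Signal, t.Operator, *t.Value.Quantity)
		} else {
			fmt.Printf("%s %s %.0f%%\n", t.Signal, t.Operator, t.Value.Percentage*100)
		}
	}
}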
Sep 12 17:33:44.040139 kubelet[2561]: I0912 17:33:44.039921 2561 topology_manager.go:138] "Creating topology manager with none policy"
Sep 12 17:33:44.040334 kubelet[2561]: I0912 17:33:44.039927 2561 container_manager_linux.go:304] "Creating device plugin manager"
Sep 12 17:33:44.040334 kubelet[2561]: I0912 17:33:44.039960 2561 state_mem.go:36] "Initialized new in-memory state store"
Sep 12 17:33:44.040334 kubelet[2561]: I0912 17:33:44.040069 2561 kubelet.go:446] "Attempting to sync node with API server"
Sep 12 17:33:44.042662 kubelet[2561]: I0912 17:33:44.042634 2561 kubelet.go:341] "Adding static pod path" path="/etc/kubernetes/manifests"
Sep 12 17:33:44.042716 kubelet[2561]: I0912 17:33:44.042669 2561 kubelet.go:352] "Adding apiserver pod source"
Sep 12 17:33:44.042716 kubelet[2561]: I0912 17:33:44.042681 2561 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Sep 12 17:33:44.048277 kubelet[2561]: I0912 17:33:44.047391 2561 kuberuntime_manager.go:269] "Container runtime initialized" containerRuntime="containerd" version="v1.7.21" apiVersion="v1"
Sep 12 17:33:44.048277 kubelet[2561]: I0912 17:33:44.047722 2561 kubelet.go:890] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Sep 12 17:33:44.048277 kubelet[2561]: I0912 17:33:44.048066 2561 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Sep 12 17:33:44.048277 kubelet[2561]: I0912 17:33:44.048091 2561 server.go:1287] "Started kubelet"
Sep 12 17:33:44.049613 kubelet[2561]: I0912 17:33:44.049590 2561 server.go:169] "Starting to listen" address="0.0.0.0" port=10250
Sep 12 17:33:44.050234 kubelet[2561]: I0912 17:33:44.050174 2561 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Sep 12 17:33:44.050493 kubelet[2561]: I0912 17:33:44.050457 2561 server.go:243] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Sep 12 17:33:44.060985 kubelet[2561]: I0912 17:33:44.060820 2561 server.go:479] "Adding debug handlers to kubelet server"
Sep 12 17:33:44.063845 kubelet[2561]: E0912 17:33:44.063816 2561 kubelet.go:1555] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem"
Sep 12 17:33:44.064100 kubelet[2561]: I0912 17:33:44.064088 2561 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Sep 12 17:33:44.064523 kubelet[2561]: I0912 17:33:44.064513 2561 volume_manager.go:297] "Starting Kubelet Volume Manager"
Sep 12 17:33:44.064789 kubelet[2561]: I0912 17:33:44.064776 2561 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key"
Sep 12 17:33:44.067351 kubelet[2561]: I0912 17:33:44.067140 2561 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Sep 12 17:33:44.068058 kubelet[2561]: I0912 17:33:44.068045 2561 reconciler.go:26] "Reconciler: start to sync state"
Sep 12 17:33:44.070050 kubelet[2561]: I0912 17:33:44.070017 2561 factory.go:221] Registration of the systemd container factory successfully
Sep 12 17:33:44.070105 kubelet[2561]: I0912 17:33:44.070089 2561 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory
Sep 12 17:33:44.073094 kubelet[2561]: I0912 17:33:44.073073 2561 factory.go:221] Registration of the containerd container factory successfully
Sep 12 17:33:44.077421 kubelet[2561]: I0912 17:33:44.077339 2561 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4"
Sep 12 17:33:44.078960 kubelet[2561]: I0912 17:33:44.078934 2561 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6"
Sep 12 17:33:44.078960 kubelet[2561]: I0912 17:33:44.078958 2561 status_manager.go:227] "Starting to sync pod status with apiserver"
Sep 12 17:33:44.079027 kubelet[2561]: I0912 17:33:44.078975 2561 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Sep 12 17:33:44.079027 kubelet[2561]: I0912 17:33:44.078981 2561 kubelet.go:2382] "Starting kubelet main sync loop"
Sep 12 17:33:44.079027 kubelet[2561]: E0912 17:33:44.079013 2561 kubelet.go:2406] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Sep 12 17:33:44.121493 kubelet[2561]: I0912 17:33:44.121458 2561 cpu_manager.go:221] "Starting CPU manager" policy="none"
Sep 12 17:33:44.121493 kubelet[2561]: I0912 17:33:44.121499 2561 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s"
Sep 12 17:33:44.121631 kubelet[2561]: I0912 17:33:44.121518 2561 state_mem.go:36] "Initialized new in-memory state store"
Sep 12 17:33:44.121686 kubelet[2561]: I0912 17:33:44.121663 2561 state_mem.go:88] "Updated default CPUSet" cpuSet=""
Sep 12 17:33:44.121716 kubelet[2561]: I0912 17:33:44.121681 2561 state_mem.go:96] "Updated CPUSet assignments" assignments={}
Sep 12 17:33:44.121716 kubelet[2561]: I0912 17:33:44.121701 2561 policy_none.go:49] "None policy: Start"
Sep 12 17:33:44.121768 kubelet[2561]: I0912 17:33:44.121724 2561 memory_manager.go:186] "Starting memorymanager" policy="None"
Sep 12 17:33:44.121768 kubelet[2561]: I0912 17:33:44.121737 2561 state_mem.go:35] "Initializing new in-memory state store"
Sep 12 17:33:44.121859 kubelet[2561]: I0912 17:33:44.121840 2561 state_mem.go:75] "Updated machine memory state"
Sep 12 17:33:44.127862 kubelet[2561]: I0912 17:33:44.127841 2561 manager.go:519] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found"
Sep 12 17:33:44.128065 kubelet[2561]: I0912 17:33:44.127990 2561 eviction_manager.go:189] "Eviction manager: starting control loop"
Sep 12 17:33:44.128065 kubelet[2561]: I0912 17:33:44.128005 2561 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Sep 12 17:33:44.128373 kubelet[2561]: I0912 17:33:44.128354 2561 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Sep 12 17:33:44.131243 kubelet[2561]: E0912 17:33:44.130693 2561 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime"
err="no imagefs label for configured runtime" Sep 12 17:33:44.179857 kubelet[2561]: I0912 17:33:44.179722 2561 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4081-3-6-f-c182586e87" Sep 12 17:33:44.181536 kubelet[2561]: I0912 17:33:44.181070 2561 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4081-3-6-f-c182586e87" Sep 12 17:33:44.181775 kubelet[2561]: I0912 17:33:44.181101 2561 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4081-3-6-f-c182586e87" Sep 12 17:33:44.190897 kubelet[2561]: E0912 17:33:44.190875 2561 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4081-3-6-f-c182586e87\" already exists" pod="kube-system/kube-apiserver-ci-4081-3-6-f-c182586e87" Sep 12 17:33:44.193582 kubelet[2561]: E0912 17:33:44.193518 2561 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-controller-manager-ci-4081-3-6-f-c182586e87\" already exists" pod="kube-system/kube-controller-manager-ci-4081-3-6-f-c182586e87" Sep 12 17:33:44.193777 kubelet[2561]: E0912 17:33:44.193706 2561 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4081-3-6-f-c182586e87\" already exists" pod="kube-system/kube-scheduler-ci-4081-3-6-f-c182586e87" Sep 12 17:33:44.238972 kubelet[2561]: I0912 17:33:44.238935 2561 kubelet_node_status.go:75] "Attempting to register node" node="ci-4081-3-6-f-c182586e87" Sep 12 17:33:44.246070 kubelet[2561]: I0912 17:33:44.246033 2561 kubelet_node_status.go:124] "Node was previously registered" node="ci-4081-3-6-f-c182586e87" Sep 12 17:33:44.246070 kubelet[2561]: I0912 17:33:44.246098 2561 kubelet_node_status.go:78] "Successfully registered node" node="ci-4081-3-6-f-c182586e87" Sep 12 17:33:44.269366 kubelet[2561]: I0912 17:33:44.269322 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/c3d885514471810970ca3bb7f49a949f-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4081-3-6-f-c182586e87\" (UID: \"c3d885514471810970ca3bb7f49a949f\") " pod="kube-system/kube-apiserver-ci-4081-3-6-f-c182586e87" Sep 12 17:33:44.269472 kubelet[2561]: I0912 17:33:44.269371 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/631e63f3ef4f8ccac2e85a3db4258939-flexvolume-dir\") pod \"kube-controller-manager-ci-4081-3-6-f-c182586e87\" (UID: \"631e63f3ef4f8ccac2e85a3db4258939\") " pod="kube-system/kube-controller-manager-ci-4081-3-6-f-c182586e87" Sep 12 17:33:44.269472 kubelet[2561]: I0912 17:33:44.269399 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/631e63f3ef4f8ccac2e85a3db4258939-kubeconfig\") pod \"kube-controller-manager-ci-4081-3-6-f-c182586e87\" (UID: \"631e63f3ef4f8ccac2e85a3db4258939\") " pod="kube-system/kube-controller-manager-ci-4081-3-6-f-c182586e87" Sep 12 17:33:44.269472 kubelet[2561]: I0912 17:33:44.269419 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/aa00d48b317e00057ea93af74e4cc0d2-kubeconfig\") pod \"kube-scheduler-ci-4081-3-6-f-c182586e87\" (UID: \"aa00d48b317e00057ea93af74e4cc0d2\") " pod="kube-system/kube-scheduler-ci-4081-3-6-f-c182586e87" Sep 12 17:33:44.269472 
kubelet[2561]: I0912 17:33:44.269439 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/c3d885514471810970ca3bb7f49a949f-ca-certs\") pod \"kube-apiserver-ci-4081-3-6-f-c182586e87\" (UID: \"c3d885514471810970ca3bb7f49a949f\") " pod="kube-system/kube-apiserver-ci-4081-3-6-f-c182586e87" Sep 12 17:33:44.269472 kubelet[2561]: I0912 17:33:44.269459 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/c3d885514471810970ca3bb7f49a949f-k8s-certs\") pod \"kube-apiserver-ci-4081-3-6-f-c182586e87\" (UID: \"c3d885514471810970ca3bb7f49a949f\") " pod="kube-system/kube-apiserver-ci-4081-3-6-f-c182586e87" Sep 12 17:33:44.269591 kubelet[2561]: I0912 17:33:44.269477 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/631e63f3ef4f8ccac2e85a3db4258939-ca-certs\") pod \"kube-controller-manager-ci-4081-3-6-f-c182586e87\" (UID: \"631e63f3ef4f8ccac2e85a3db4258939\") " pod="kube-system/kube-controller-manager-ci-4081-3-6-f-c182586e87" Sep 12 17:33:44.269591 kubelet[2561]: I0912 17:33:44.269496 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/631e63f3ef4f8ccac2e85a3db4258939-k8s-certs\") pod \"kube-controller-manager-ci-4081-3-6-f-c182586e87\" (UID: \"631e63f3ef4f8ccac2e85a3db4258939\") " pod="kube-system/kube-controller-manager-ci-4081-3-6-f-c182586e87" Sep 12 17:33:44.269591 kubelet[2561]: I0912 17:33:44.269518 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/631e63f3ef4f8ccac2e85a3db4258939-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4081-3-6-f-c182586e87\" (UID: \"631e63f3ef4f8ccac2e85a3db4258939\") " pod="kube-system/kube-controller-manager-ci-4081-3-6-f-c182586e87" Sep 12 17:33:44.616254 update_engine[1482]: I20250912 17:33:44.615782 1482 update_attempter.cc:509] Updating boot flags... 
Sep 12 17:33:44.682250 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 34 scanned by (udev-worker) (2604) Sep 12 17:33:45.048762 kubelet[2561]: I0912 17:33:45.048726 2561 apiserver.go:52] "Watching apiserver" Sep 12 17:33:45.068176 kubelet[2561]: I0912 17:33:45.068122 2561 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Sep 12 17:33:45.103610 kubelet[2561]: I0912 17:33:45.103416 2561 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4081-3-6-f-c182586e87" Sep 12 17:33:45.104068 kubelet[2561]: I0912 17:33:45.104058 2561 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4081-3-6-f-c182586e87" Sep 12 17:33:45.108875 kubelet[2561]: E0912 17:33:45.108844 2561 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4081-3-6-f-c182586e87\" already exists" pod="kube-system/kube-apiserver-ci-4081-3-6-f-c182586e87" Sep 12 17:33:45.113404 kubelet[2561]: E0912 17:33:45.113380 2561 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4081-3-6-f-c182586e87\" already exists" pod="kube-system/kube-scheduler-ci-4081-3-6-f-c182586e87" Sep 12 17:33:45.136884 kubelet[2561]: I0912 17:33:45.136772 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-ci-4081-3-6-f-c182586e87" podStartSLOduration=2.136759046 podStartE2EDuration="2.136759046s" podCreationTimestamp="2025-09-12 17:33:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-12 17:33:45.130023847 +0000 UTC m=+1.157438937" watchObservedRunningTime="2025-09-12 17:33:45.136759046 +0000 UTC m=+1.164174126" Sep 12 17:33:45.145555 kubelet[2561]: I0912 17:33:45.145276 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ci-4081-3-6-f-c182586e87" podStartSLOduration=2.145265316 podStartE2EDuration="2.145265316s" podCreationTimestamp="2025-09-12 17:33:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-12 17:33:45.145004628 +0000 UTC m=+1.172419707" watchObservedRunningTime="2025-09-12 17:33:45.145265316 +0000 UTC m=+1.172680396" Sep 12 17:33:45.145555 kubelet[2561]: I0912 17:33:45.145445 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-ci-4081-3-6-f-c182586e87" podStartSLOduration=2.14543821 podStartE2EDuration="2.14543821s" podCreationTimestamp="2025-09-12 17:33:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-12 17:33:45.138406385 +0000 UTC m=+1.165821475" watchObservedRunningTime="2025-09-12 17:33:45.14543821 +0000 UTC m=+1.172853291" Sep 12 17:33:48.794311 kubelet[2561]: I0912 17:33:48.794209 2561 kuberuntime_manager.go:1702] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Sep 12 17:33:48.795406 kubelet[2561]: I0912 17:33:48.795350 2561 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Sep 12 17:33:48.795462 containerd[1494]: time="2025-09-12T17:33:48.794730545Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." 
Sep 12 17:33:49.757942 systemd[1]: Created slice kubepods-besteffort-pod8ffe2424_eeff_437d_8015_9c61415ed3dd.slice - libcontainer container kubepods-besteffort-pod8ffe2424_eeff_437d_8015_9c61415ed3dd.slice.
Sep 12 17:33:49.806411 kubelet[2561]: I0912 17:33:49.806362 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/8ffe2424-eeff-437d-8015-9c61415ed3dd-xtables-lock\") pod \"kube-proxy-qrf6b\" (UID: \"8ffe2424-eeff-437d-8015-9c61415ed3dd\") " pod="kube-system/kube-proxy-qrf6b"
Sep 12 17:33:49.806411 kubelet[2561]: I0912 17:33:49.806408 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/8ffe2424-eeff-437d-8015-9c61415ed3dd-kube-proxy\") pod \"kube-proxy-qrf6b\" (UID: \"8ffe2424-eeff-437d-8015-9c61415ed3dd\") " pod="kube-system/kube-proxy-qrf6b"
Sep 12 17:33:49.806750 kubelet[2561]: I0912 17:33:49.806427 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8ffe2424-eeff-437d-8015-9c61415ed3dd-lib-modules\") pod \"kube-proxy-qrf6b\" (UID: \"8ffe2424-eeff-437d-8015-9c61415ed3dd\") " pod="kube-system/kube-proxy-qrf6b"
Sep 12 17:33:49.806750 kubelet[2561]: I0912 17:33:49.806442 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mx9x9\" (UniqueName: \"kubernetes.io/projected/8ffe2424-eeff-437d-8015-9c61415ed3dd-kube-api-access-mx9x9\") pod \"kube-proxy-qrf6b\" (UID: \"8ffe2424-eeff-437d-8015-9c61415ed3dd\") " pod="kube-system/kube-proxy-qrf6b"
Sep 12 17:33:49.944907 systemd[1]: Created slice kubepods-besteffort-pod455eafee_2740_43aa_942c_239daaa71966.slice - libcontainer container kubepods-besteffort-pod455eafee_2740_43aa_942c_239daaa71966.slice.
Sep 12 17:33:50.008502 kubelet[2561]: I0912 17:33:50.008398 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wsp5f\" (UniqueName: \"kubernetes.io/projected/455eafee-2740-43aa-942c-239daaa71966-kube-api-access-wsp5f\") pod \"tigera-operator-755d956888-b89cb\" (UID: \"455eafee-2740-43aa-942c-239daaa71966\") " pod="tigera-operator/tigera-operator-755d956888-b89cb"
Sep 12 17:33:50.008645 kubelet[2561]: I0912 17:33:50.008515 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/455eafee-2740-43aa-942c-239daaa71966-var-lib-calico\") pod \"tigera-operator-755d956888-b89cb\" (UID: \"455eafee-2740-43aa-942c-239daaa71966\") " pod="tigera-operator/tigera-operator-755d956888-b89cb"
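The Created slice lines and the volume lines refer to the same pod two different ways: the slice name embeds the pod UID (8ffe2424-eeff-437d-8015-9c61415ed3dd) with dashes swapped for underscores, inside a kubepods-besteffort-pod….slice wrapper. A small sketch of that mapping as inferred from these lines alone (cgroup v2, systemd driver, BestEffort QoS), not taken from kubelet source:

package main

import (
	"fmt"
	"strings"
)

// sliceForBestEffortPod rebuilds the systemd slice name seen in the journal
// above for a BestEffort pod, by escaping the UID's dashes as underscores.
func sliceForBestEffortPod(uid string) string {
	return "kubepods-besteffort-pod" + strings.ReplaceAll(uid, "-", "_") + ".slice"
}

func main() {
	// UID of kube-proxy-qrf6b from the VerifyControllerAttachedVolume lines.
	fmt.Println(sliceForBestEffortPod("8ffe2424-eeff-437d-8015-9c61415ed3dd"))
	// Output: kubepods-besteffort-pod8ffe2424_eeff_437d_8015_9c61415ed3dd.slice
}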
Sep 12 17:33:50.068843 containerd[1494]: time="2025-09-12T17:33:50.068625658Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-qrf6b,Uid:8ffe2424-eeff-437d-8015-9c61415ed3dd,Namespace:kube-system,Attempt:0,}"
Sep 12 17:33:50.105273 containerd[1494]: time="2025-09-12T17:33:50.104650228Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Sep 12 17:33:50.105273 containerd[1494]: time="2025-09-12T17:33:50.104725179Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Sep 12 17:33:50.105273 containerd[1494]: time="2025-09-12T17:33:50.104767518Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Sep 12 17:33:50.105273 containerd[1494]: time="2025-09-12T17:33:50.104933369Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Sep 12 17:33:50.139479 systemd[1]: Started cri-containerd-d1d3c710c7072c40febf36bc613d77c05e20ca2fd54c726051bbe05fe6d87332.scope - libcontainer container d1d3c710c7072c40febf36bc613d77c05e20ca2fd54c726051bbe05fe6d87332.
Sep 12 17:33:50.165852 containerd[1494]: time="2025-09-12T17:33:50.165806893Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-qrf6b,Uid:8ffe2424-eeff-437d-8015-9c61415ed3dd,Namespace:kube-system,Attempt:0,} returns sandbox id \"d1d3c710c7072c40febf36bc613d77c05e20ca2fd54c726051bbe05fe6d87332\""
Sep 12 17:33:50.170531 containerd[1494]: time="2025-09-12T17:33:50.170449638Z" level=info msg="CreateContainer within sandbox \"d1d3c710c7072c40febf36bc613d77c05e20ca2fd54c726051bbe05fe6d87332\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}"
Sep 12 17:33:50.186868 containerd[1494]: time="2025-09-12T17:33:50.186721931Z" level=info msg="CreateContainer within sandbox \"d1d3c710c7072c40febf36bc613d77c05e20ca2fd54c726051bbe05fe6d87332\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"9312cda3ce4f930435d0944f16ba04253f2289dddd5b4ca8b8e73309da90cd06\""
Sep 12 17:33:50.188521 containerd[1494]: time="2025-09-12T17:33:50.188479446Z" level=info msg="StartContainer for \"9312cda3ce4f930435d0944f16ba04253f2289dddd5b4ca8b8e73309da90cd06\""
Sep 12 17:33:50.219370 systemd[1]: Started cri-containerd-9312cda3ce4f930435d0944f16ba04253f2289dddd5b4ca8b8e73309da90cd06.scope - libcontainer container 9312cda3ce4f930435d0944f16ba04253f2289dddd5b4ca8b8e73309da90cd06.
Sep 12 17:33:50.248266 containerd[1494]: time="2025-09-12T17:33:50.248016474Z" level=info msg="StartContainer for \"9312cda3ce4f930435d0944f16ba04253f2289dddd5b4ca8b8e73309da90cd06\" returns successfully"
Sep 12 17:33:50.249574 containerd[1494]: time="2025-09-12T17:33:50.249530242Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-755d956888-b89cb,Uid:455eafee-2740-43aa-942c-239daaa71966,Namespace:tigera-operator,Attempt:0,}"
Sep 12 17:33:50.284256 containerd[1494]: time="2025-09-12T17:33:50.283397747Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Sep 12 17:33:50.284256 containerd[1494]: time="2025-09-12T17:33:50.283478138Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Sep 12 17:33:50.284256 containerd[1494]: time="2025-09-12T17:33:50.283543762Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Sep 12 17:33:50.284584 containerd[1494]: time="2025-09-12T17:33:50.283889480Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Sep 12 17:33:50.311538 systemd[1]: Started cri-containerd-0e355f3981ea5289ada5a20ebfc71d55d43e7b87be88411efa187d7a87bde99c.scope - libcontainer container 0e355f3981ea5289ada5a20ebfc71d55d43e7b87be88411efa187d7a87bde99c.
Sep 12 17:33:50.369574 containerd[1494]: time="2025-09-12T17:33:50.369455485Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-755d956888-b89cb,Uid:455eafee-2740-43aa-942c-239daaa71966,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"0e355f3981ea5289ada5a20ebfc71d55d43e7b87be88411efa187d7a87bde99c\""
Sep 12 17:33:50.373588 containerd[1494]: time="2025-09-12T17:33:50.373353935Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.6\""
Sep 12 17:33:51.125080 kubelet[2561]: I0912 17:33:51.124673 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-qrf6b" podStartSLOduration=2.124657165 podStartE2EDuration="2.124657165s" podCreationTimestamp="2025-09-12 17:33:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-12 17:33:51.124436281 +0000 UTC m=+7.151851371" watchObservedRunningTime="2025-09-12 17:33:51.124657165 +0000 UTC m=+7.152072255"
Sep 12 17:33:52.097679 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3455667092.mount: Deactivated successfully.
Sep 12 17:33:52.499345 containerd[1494]: time="2025-09-12T17:33:52.499004987Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.38.6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 17:33:52.499906 containerd[1494]: time="2025-09-12T17:33:52.499788376Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.38.6: active requests=0, bytes read=25062609"
Sep 12 17:33:52.500486 containerd[1494]: time="2025-09-12T17:33:52.500450007Z" level=info msg="ImageCreate event name:\"sha256:1911afdd8478c6ca3036ff85614050d5d19acc0f0c3f6a5a7b3e34b38dd309c9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 17:33:52.502196 containerd[1494]: time="2025-09-12T17:33:52.502179108Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:00a7a9b62f9b9a4e0856128b078539783b8352b07f707bff595cb604cc580f6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 17:33:52.503050 containerd[1494]: time="2025-09-12T17:33:52.502708571Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.38.6\" with image id \"sha256:1911afdd8478c6ca3036ff85614050d5d19acc0f0c3f6a5a7b3e34b38dd309c9\", repo tag \"quay.io/tigera/operator:v1.38.6\", repo digest \"quay.io/tigera/operator@sha256:00a7a9b62f9b9a4e0856128b078539783b8352b07f707bff595cb604cc580f6e\", size \"25058604\" in 2.129323258s"
Sep 12 17:33:52.503050 containerd[1494]: time="2025-09-12T17:33:52.502733949Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.6\" returns image reference \"sha256:1911afdd8478c6ca3036ff85614050d5d19acc0f0c3f6a5a7b3e34b38dd309c9\""
Sep 12 17:33:52.504825 containerd[1494]: time="2025-09-12T17:33:52.504792639Z" level=info msg="CreateContainer within sandbox \"0e355f3981ea5289ada5a20ebfc71d55d43e7b87be88411efa187d7a87bde99c\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}"
Sep 12 17:33:52.520862 containerd[1494]: time="2025-09-12T17:33:52.520351033Z" level=info msg="CreateContainer within sandbox \"0e355f3981ea5289ada5a20ebfc71d55d43e7b87be88411efa187d7a87bde99c\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"a1917a782fb112f95cf697c3a66d2973a99473ae138b1910d6043562a5153aff\""
Sep 12 17:33:52.522166 containerd[1494]: time="2025-09-12T17:33:52.522139827Z" level=info msg="StartContainer for \"a1917a782fb112f95cf697c3a66d2973a99473ae138b1910d6043562a5153aff\""
Sep 12 17:33:52.543339 systemd[1]: Started cri-containerd-a1917a782fb112f95cf697c3a66d2973a99473ae138b1910d6043562a5153aff.scope - libcontainer container a1917a782fb112f95cf697c3a66d2973a99473ae138b1910d6043562a5153aff.
Sep 12 17:33:52.559095 containerd[1494]: time="2025-09-12T17:33:52.559063973Z" level=info msg="StartContainer for \"a1917a782fb112f95cf697c3a66d2973a99473ae138b1910d6043562a5153aff\" returns successfully"
Sep 12 17:33:54.359934 kubelet[2561]: I0912 17:33:54.359837 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-755d956888-b89cb" podStartSLOduration=3.228326438 podStartE2EDuration="5.359821317s" podCreationTimestamp="2025-09-12 17:33:49 +0000 UTC" firstStartedPulling="2025-09-12 17:33:50.372011066 +0000 UTC m=+6.399426167" lastFinishedPulling="2025-09-12 17:33:52.503505967 +0000 UTC m=+8.530921046" observedRunningTime="2025-09-12 17:33:53.127133342 +0000 UTC m=+9.154548432" watchObservedRunningTime="2025-09-12 17:33:54.359821317 +0000 UTC m=+10.387236397"
Sep 12 17:33:58.208607 sudo[1723]: pam_unix(sudo:session): session closed for user root
Sep 12 17:33:58.388324 sshd[1705]: pam_unix(sshd:session): session closed for user core
Sep 12 17:33:58.391984 systemd[1]: sshd@6-135.181.96.215:22-147.75.109.163:40398.service: Deactivated successfully.
Sep 12 17:33:58.394599 systemd[1]: session-7.scope: Deactivated successfully.
Sep 12 17:33:58.394738 systemd[1]: session-7.scope: Consumed 4.213s CPU time, 142.0M memory peak, 0B memory swap peak.
Sep 12 17:33:58.395508 systemd-logind[1481]: Session 7 logged out. Waiting for processes to exit.
Sep 12 17:33:58.398260 systemd-logind[1481]: Removed session 7.
Sep 12 17:34:00.837724 systemd[1]: Created slice kubepods-besteffort-pod98dd9f51_0044_4ac8_b33d_c4f2bbbcc67d.slice - libcontainer container kubepods-besteffort-pod98dd9f51_0044_4ac8_b33d_c4f2bbbcc67d.slice.
Sep 12 17:34:00.880537 kubelet[2561]: I0912 17:34:00.880460 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/98dd9f51-0044-4ac8-b33d-c4f2bbbcc67d-tigera-ca-bundle\") pod \"calico-typha-7bb9f6f777-n9trb\" (UID: \"98dd9f51-0044-4ac8-b33d-c4f2bbbcc67d\") " pod="calico-system/calico-typha-7bb9f6f777-n9trb"
Sep 12 17:34:00.880537 kubelet[2561]: I0912 17:34:00.880505 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ldk5v\" (UniqueName: \"kubernetes.io/projected/98dd9f51-0044-4ac8-b33d-c4f2bbbcc67d-kube-api-access-ldk5v\") pod \"calico-typha-7bb9f6f777-n9trb\" (UID: \"98dd9f51-0044-4ac8-b33d-c4f2bbbcc67d\") " pod="calico-system/calico-typha-7bb9f6f777-n9trb"
Sep 12 17:34:00.881106 kubelet[2561]: I0912 17:34:00.880842 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/98dd9f51-0044-4ac8-b33d-c4f2bbbcc67d-typha-certs\") pod \"calico-typha-7bb9f6f777-n9trb\" (UID: \"98dd9f51-0044-4ac8-b33d-c4f2bbbcc67d\") " pod="calico-system/calico-typha-7bb9f6f777-n9trb"
Sep 12 17:34:01.134539 systemd[1]: Created slice kubepods-besteffort-pod5e4423bf_638a_42c9_bba6_67fb5bbbd0bc.slice - libcontainer container kubepods-besteffort-pod5e4423bf_638a_42c9_bba6_67fb5bbbd0bc.slice.
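For the tigera-operator pod a few entries up, the tracker reports both an end-to-end figure and an SLO figure; the difference is the image-pull window (lastFinishedPulling minus firstStartedPulling). A sketch recomputing both from the printed timestamps, using the same layout as in the earlier example; the printed wall-clock values drop the kubelet's monotonic readings, so the recomputed SLO duration agrees with the logged 3.228326438 only to within a few tens of nanoseconds:

package main

import (
	"fmt"
	"time"
)

const layout = "2006-01-02 15:04:05.999999999 -0700 MST"

func mustParse(s string) time.Time {
	t, err := time.Parse(layout, s)
	if err != nil {
		panic(err)
	}
	return t
}

func main() {
	created := mustParse("2025-09-12 17:33:49 +0000 UTC")
	watched := mustParse("2025-09-12 17:33:54.359821317 +0000 UTC")
	pullStart := mustParse("2025-09-12 17:33:50.372011066 +0000 UTC")
	pullEnd := mustParse("2025-09-12 17:33:52.503505967 +0000 UTC")

	e2e := watched.Sub(created)    // 5.359821317s, matching podStartE2EDuration
	pull := pullEnd.Sub(pullStart) // ~2.131494901s spent pulling the operator image
	fmt.Println(e2e, e2e-pull)     // SLO duration ≈ 3.228326416s vs logged 3.228326438
}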
Sep 12 17:34:01.157105 containerd[1494]: time="2025-09-12T17:34:01.157062249Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-7bb9f6f777-n9trb,Uid:98dd9f51-0044-4ac8-b33d-c4f2bbbcc67d,Namespace:calico-system,Attempt:0,}"
Sep 12 17:34:01.181844 kubelet[2561]: I0912 17:34:01.181725 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/5e4423bf-638a-42c9-bba6-67fb5bbbd0bc-cni-log-dir\") pod \"calico-node-vsntk\" (UID: \"5e4423bf-638a-42c9-bba6-67fb5bbbd0bc\") " pod="calico-system/calico-node-vsntk"
Sep 12 17:34:01.181844 kubelet[2561]: I0912 17:34:01.181765 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nhfvr\" (UniqueName: \"kubernetes.io/projected/5e4423bf-638a-42c9-bba6-67fb5bbbd0bc-kube-api-access-nhfvr\") pod \"calico-node-vsntk\" (UID: \"5e4423bf-638a-42c9-bba6-67fb5bbbd0bc\") " pod="calico-system/calico-node-vsntk"
Sep 12 17:34:01.181844 kubelet[2561]: I0912 17:34:01.181784 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/5e4423bf-638a-42c9-bba6-67fb5bbbd0bc-lib-modules\") pod \"calico-node-vsntk\" (UID: \"5e4423bf-638a-42c9-bba6-67fb5bbbd0bc\") " pod="calico-system/calico-node-vsntk"
Sep 12 17:34:01.181844 kubelet[2561]: I0912 17:34:01.181796 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/5e4423bf-638a-42c9-bba6-67fb5bbbd0bc-var-lib-calico\") pod \"calico-node-vsntk\" (UID: \"5e4423bf-638a-42c9-bba6-67fb5bbbd0bc\") " pod="calico-system/calico-node-vsntk"
Sep 12 17:34:01.181844 kubelet[2561]: I0912 17:34:01.181809 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/5e4423bf-638a-42c9-bba6-67fb5bbbd0bc-cni-net-dir\") pod \"calico-node-vsntk\" (UID: \"5e4423bf-638a-42c9-bba6-67fb5bbbd0bc\") " pod="calico-system/calico-node-vsntk"
Sep 12 17:34:01.182370 kubelet[2561]: I0912 17:34:01.181824 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/5e4423bf-638a-42c9-bba6-67fb5bbbd0bc-var-run-calico\") pod \"calico-node-vsntk\" (UID: \"5e4423bf-638a-42c9-bba6-67fb5bbbd0bc\") " pod="calico-system/calico-node-vsntk"
Sep 12 17:34:01.182370 kubelet[2561]: I0912 17:34:01.181839 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/5e4423bf-638a-42c9-bba6-67fb5bbbd0bc-flexvol-driver-host\") pod \"calico-node-vsntk\" (UID: \"5e4423bf-638a-42c9-bba6-67fb5bbbd0bc\") " pod="calico-system/calico-node-vsntk"
Sep 12 17:34:01.182370 kubelet[2561]: I0912 17:34:01.181852 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/5e4423bf-638a-42c9-bba6-67fb5bbbd0bc-node-certs\") pod \"calico-node-vsntk\" (UID: \"5e4423bf-638a-42c9-bba6-67fb5bbbd0bc\") " pod="calico-system/calico-node-vsntk"
Sep 12 17:34:01.182370 kubelet[2561]: I0912 17:34:01.181865 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5e4423bf-638a-42c9-bba6-67fb5bbbd0bc-tigera-ca-bundle\") pod \"calico-node-vsntk\" (UID: \"5e4423bf-638a-42c9-bba6-67fb5bbbd0bc\") " pod="calico-system/calico-node-vsntk"
Sep 12 17:34:01.182370 kubelet[2561]: I0912 17:34:01.181880 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/5e4423bf-638a-42c9-bba6-67fb5bbbd0bc-xtables-lock\") pod \"calico-node-vsntk\" (UID: \"5e4423bf-638a-42c9-bba6-67fb5bbbd0bc\") " pod="calico-system/calico-node-vsntk"
Sep 12 17:34:01.182470 kubelet[2561]: I0912 17:34:01.181896 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/5e4423bf-638a-42c9-bba6-67fb5bbbd0bc-cni-bin-dir\") pod \"calico-node-vsntk\" (UID: \"5e4423bf-638a-42c9-bba6-67fb5bbbd0bc\") " pod="calico-system/calico-node-vsntk"
Sep 12 17:34:01.182470 kubelet[2561]: I0912 17:34:01.181909 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/5e4423bf-638a-42c9-bba6-67fb5bbbd0bc-policysync\") pod \"calico-node-vsntk\" (UID: \"5e4423bf-638a-42c9-bba6-67fb5bbbd0bc\") " pod="calico-system/calico-node-vsntk"
Sep 12 17:34:01.194161 containerd[1494]: time="2025-09-12T17:34:01.193688140Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Sep 12 17:34:01.194270 containerd[1494]: time="2025-09-12T17:34:01.194108678Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Sep 12 17:34:01.194377 containerd[1494]: time="2025-09-12T17:34:01.194357548Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Sep 12 17:34:01.194570 containerd[1494]: time="2025-09-12T17:34:01.194494371Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Sep 12 17:34:01.231351 systemd[1]: Started cri-containerd-4294636397c75518f87a47cb662d85e5b94584868fb3fdcddfcb3ac970479b15.scope - libcontainer container 4294636397c75518f87a47cb662d85e5b94584868fb3fdcddfcb3ac970479b15.
Sep 12 17:34:01.269450 containerd[1494]: time="2025-09-12T17:34:01.269391978Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-7bb9f6f777-n9trb,Uid:98dd9f51-0044-4ac8-b33d-c4f2bbbcc67d,Namespace:calico-system,Attempt:0,} returns sandbox id \"4294636397c75518f87a47cb662d85e5b94584868fb3fdcddfcb3ac970479b15\""
Sep 12 17:34:01.279584 containerd[1494]: time="2025-09-12T17:34:01.279536142Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.3\""
Sep 12 17:34:01.287739 kubelet[2561]: E0912 17:34:01.287656 2561 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 17:34:01.287739 kubelet[2561]: W0912 17:34:01.287674 2561 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 17:34:01.288654 kubelet[2561]: E0912 17:34:01.288574 2561 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 17:34:01.293050 kubelet[2561]: E0912 17:34:01.293000 2561 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 17:34:01.293050 kubelet[2561]: W0912 17:34:01.293015 2561 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 17:34:01.293050 kubelet[2561]: E0912 17:34:01.293029 2561 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 17:34:01.348258 kubelet[2561]: E0912 17:34:01.347998 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-fsx7j" podUID="53bbc8dc-3621-4d89-be2b-ae2eb974a294"
Sep 12 17:34:01.370297 kubelet[2561]: E0912 17:34:01.370259 2561 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 17:34:01.370405 kubelet[2561]: W0912 17:34:01.370304 2561 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 17:34:01.370405 kubelet[2561]: E0912 17:34:01.370323 2561 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 17:34:01.371321 kubelet[2561]: E0912 17:34:01.371288 2561 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 17:34:01.371368 kubelet[2561]: W0912 17:34:01.371305 2561 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 17:34:01.371368 kubelet[2561]: E0912 17:34:01.371338 2561 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 17:34:01.371528 kubelet[2561]: E0912 17:34:01.371508 2561 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 17:34:01.371528 kubelet[2561]: W0912 17:34:01.371521 2561 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 17:34:01.371528 kubelet[2561]: E0912 17:34:01.371529 2561 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 17:34:01.371780 kubelet[2561]: E0912 17:34:01.371758 2561 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 17:34:01.371780 kubelet[2561]: W0912 17:34:01.371771 2561 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 17:34:01.371780 kubelet[2561]: E0912 17:34:01.371780 2561 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 17:34:01.371965 kubelet[2561]: E0912 17:34:01.371946 2561 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 17:34:01.371965 kubelet[2561]: W0912 17:34:01.371958 2561 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 17:34:01.371965 kubelet[2561]: E0912 17:34:01.371966 2561 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 17:34:01.372425 kubelet[2561]: E0912 17:34:01.372405 2561 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 17:34:01.372425 kubelet[2561]: W0912 17:34:01.372418 2561 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 17:34:01.372425 kubelet[2561]: E0912 17:34:01.372426 2561 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 17:34:01.372596 kubelet[2561]: E0912 17:34:01.372582 2561 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 17:34:01.372596 kubelet[2561]: W0912 17:34:01.372595 2561 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 17:34:01.372647 kubelet[2561]: E0912 17:34:01.372602 2561 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 17:34:01.374797 kubelet[2561]: E0912 17:34:01.374421 2561 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 17:34:01.374797 kubelet[2561]: W0912 17:34:01.374433 2561 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 17:34:01.374797 kubelet[2561]: E0912 17:34:01.374441 2561 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 17:34:01.374797 kubelet[2561]: E0912 17:34:01.374575 2561 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 17:34:01.374797 kubelet[2561]: W0912 17:34:01.374604 2561 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 17:34:01.374797 kubelet[2561]: E0912 17:34:01.374611 2561 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 17:34:01.374797 kubelet[2561]: E0912 17:34:01.374738 2561 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 17:34:01.374797 kubelet[2561]: W0912 17:34:01.374745 2561 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 17:34:01.374797 kubelet[2561]: E0912 17:34:01.374751 2561 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 17:34:01.374966 kubelet[2561]: E0912 17:34:01.374913 2561 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 17:34:01.374966 kubelet[2561]: W0912 17:34:01.374921 2561 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 17:34:01.374966 kubelet[2561]: E0912 17:34:01.374928 2561 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 17:34:01.375476 kubelet[2561]: E0912 17:34:01.375453 2561 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 17:34:01.375476 kubelet[2561]: W0912 17:34:01.375468 2561 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 17:34:01.375476 kubelet[2561]: E0912 17:34:01.375476 2561 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 17:34:01.376747 kubelet[2561]: E0912 17:34:01.376727 2561 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 17:34:01.376986 kubelet[2561]: W0912 17:34:01.376955 2561 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 17:34:01.376986 kubelet[2561]: E0912 17:34:01.376972 2561 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 17:34:01.377535 kubelet[2561]: E0912 17:34:01.377236 2561 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 17:34:01.377535 kubelet[2561]: W0912 17:34:01.377532 2561 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 17:34:01.377605 kubelet[2561]: E0912 17:34:01.377544 2561 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 17:34:01.378027 kubelet[2561]: E0912 17:34:01.377996 2561 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 17:34:01.378027 kubelet[2561]: W0912 17:34:01.378010 2561 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 17:34:01.378027 kubelet[2561]: E0912 17:34:01.378018 2561 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 17:34:01.378404 kubelet[2561]: E0912 17:34:01.378373 2561 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 17:34:01.378404 kubelet[2561]: W0912 17:34:01.378388 2561 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 17:34:01.378404 kubelet[2561]: E0912 17:34:01.378396 2561 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 17:34:01.378702 kubelet[2561]: E0912 17:34:01.378680 2561 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 17:34:01.378702 kubelet[2561]: W0912 17:34:01.378694 2561 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 17:34:01.378865 kubelet[2561]: E0912 17:34:01.378833 2561 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 17:34:01.379382 kubelet[2561]: E0912 17:34:01.379360 2561 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 17:34:01.379382 kubelet[2561]: W0912 17:34:01.379374 2561 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 17:34:01.379382 kubelet[2561]: E0912 17:34:01.379383 2561 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 17:34:01.379932 kubelet[2561]: E0912 17:34:01.379899 2561 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 17:34:01.379932 kubelet[2561]: W0912 17:34:01.379913 2561 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 17:34:01.379932 kubelet[2561]: E0912 17:34:01.379921 2561 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 17:34:01.380485 kubelet[2561]: E0912 17:34:01.380464 2561 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 17:34:01.380485 kubelet[2561]: W0912 17:34:01.380478 2561 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 17:34:01.380485 kubelet[2561]: E0912 17:34:01.380487 2561 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 17:34:01.383624 kubelet[2561]: E0912 17:34:01.383592 2561 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 17:34:01.383624 kubelet[2561]: W0912 17:34:01.383607 2561 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 17:34:01.383686 kubelet[2561]: E0912 17:34:01.383615 2561 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 17:34:01.384505 kubelet[2561]: I0912 17:34:01.383974 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vgjgr\" (UniqueName: \"kubernetes.io/projected/53bbc8dc-3621-4d89-be2b-ae2eb974a294-kube-api-access-vgjgr\") pod \"csi-node-driver-fsx7j\" (UID: \"53bbc8dc-3621-4d89-be2b-ae2eb974a294\") " pod="calico-system/csi-node-driver-fsx7j"
Sep 12 17:34:01.385712 kubelet[2561]: E0912 17:34:01.384830 2561 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 17:34:01.385712 kubelet[2561]: W0912 17:34:01.384843 2561 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 17:34:01.385712 kubelet[2561]: E0912 17:34:01.384855 2561 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
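Each of these triplets is the same failure: the flexvolume prober finds the nodeagent~uds plugin directory, the uds driver binary is missing, the init call therefore returns empty output, and decoding that empty output as JSON fails. The "unexpected end of JSON input" text is exactly what Go's encoding/json returns for empty input, which a short sketch reproduces:

package main

import (
	"encoding/json"
	"fmt"
)

func main() {
	var status map[string]interface{}
	// Decoding the empty driver output (output: "") fails before any JSON is seen.
	err := json.Unmarshal([]byte(""), &status)
	fmt.Println(err) // unexpected end of JSON input
}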
Error: unexpected end of JSON input" Sep 12 17:34:01.385712 kubelet[2561]: E0912 17:34:01.385253 2561 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:34:01.385712 kubelet[2561]: W0912 17:34:01.385261 2561 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:34:01.385712 kubelet[2561]: E0912 17:34:01.385342 2561 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:34:01.385712 kubelet[2561]: E0912 17:34:01.385660 2561 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:34:01.385712 kubelet[2561]: W0912 17:34:01.385677 2561 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:34:01.385712 kubelet[2561]: E0912 17:34:01.385685 2561 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:34:01.385895 kubelet[2561]: I0912 17:34:01.385702 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/53bbc8dc-3621-4d89-be2b-ae2eb974a294-kubelet-dir\") pod \"csi-node-driver-fsx7j\" (UID: \"53bbc8dc-3621-4d89-be2b-ae2eb974a294\") " pod="calico-system/csi-node-driver-fsx7j" Sep 12 17:34:01.386621 kubelet[2561]: E0912 17:34:01.386595 2561 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:34:01.386621 kubelet[2561]: W0912 17:34:01.386614 2561 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:34:01.386700 kubelet[2561]: E0912 17:34:01.386637 2561 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:34:01.387352 kubelet[2561]: E0912 17:34:01.387318 2561 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:34:01.387352 kubelet[2561]: W0912 17:34:01.387330 2561 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:34:01.387421 kubelet[2561]: E0912 17:34:01.387347 2561 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:34:01.387754 kubelet[2561]: E0912 17:34:01.387695 2561 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:34:01.387754 kubelet[2561]: W0912 17:34:01.387707 2561 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:34:01.387754 kubelet[2561]: E0912 17:34:01.387715 2561 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:34:01.388104 kubelet[2561]: I0912 17:34:01.388068 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/53bbc8dc-3621-4d89-be2b-ae2eb974a294-registration-dir\") pod \"csi-node-driver-fsx7j\" (UID: \"53bbc8dc-3621-4d89-be2b-ae2eb974a294\") " pod="calico-system/csi-node-driver-fsx7j" Sep 12 17:34:01.388267 kubelet[2561]: E0912 17:34:01.388212 2561 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:34:01.388267 kubelet[2561]: W0912 17:34:01.388260 2561 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:34:01.388330 kubelet[2561]: E0912 17:34:01.388273 2561 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:34:01.389043 kubelet[2561]: E0912 17:34:01.389005 2561 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:34:01.389074 kubelet[2561]: W0912 17:34:01.389048 2561 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:34:01.389074 kubelet[2561]: E0912 17:34:01.389069 2561 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:34:01.389355 kubelet[2561]: E0912 17:34:01.389330 2561 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:34:01.389355 kubelet[2561]: W0912 17:34:01.389345 2561 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:34:01.389355 kubelet[2561]: E0912 17:34:01.389353 2561 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:34:01.389437 kubelet[2561]: I0912 17:34:01.389368 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/53bbc8dc-3621-4d89-be2b-ae2eb974a294-socket-dir\") pod \"csi-node-driver-fsx7j\" (UID: \"53bbc8dc-3621-4d89-be2b-ae2eb974a294\") " pod="calico-system/csi-node-driver-fsx7j" Sep 12 17:34:01.389964 kubelet[2561]: E0912 17:34:01.389932 2561 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:34:01.389964 kubelet[2561]: W0912 17:34:01.389948 2561 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:34:01.389964 kubelet[2561]: E0912 17:34:01.389963 2561 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:34:01.390301 kubelet[2561]: I0912 17:34:01.390275 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/53bbc8dc-3621-4d89-be2b-ae2eb974a294-varrun\") pod \"csi-node-driver-fsx7j\" (UID: \"53bbc8dc-3621-4d89-be2b-ae2eb974a294\") " pod="calico-system/csi-node-driver-fsx7j" Sep 12 17:34:01.390368 kubelet[2561]: E0912 17:34:01.390351 2561 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:34:01.390368 kubelet[2561]: W0912 17:34:01.390363 2561 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:34:01.390412 kubelet[2561]: E0912 17:34:01.390375 2561 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:34:01.390907 kubelet[2561]: E0912 17:34:01.390883 2561 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:34:01.390907 kubelet[2561]: W0912 17:34:01.390898 2561 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:34:01.390974 kubelet[2561]: E0912 17:34:01.390940 2561 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:34:01.391403 kubelet[2561]: E0912 17:34:01.391380 2561 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:34:01.391403 kubelet[2561]: W0912 17:34:01.391394 2561 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:34:01.391403 kubelet[2561]: E0912 17:34:01.391403 2561 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
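The five VerifyControllerAttachedVolume entries above cover the csi-node-driver pod's volume set: one projected service-account token (kube-api-access-vgjgr) and four kubernetes.io/host-path volumes. A rough Go sketch of the host-path declarations using k8s.io/api follows; the concrete paths are assumptions for illustration only, since the log confirms just the volume names and the host-path plugin.

package main

import (
	"fmt"

	corev1 "k8s.io/api/core/v1"
)

// hostPath builds a kubernetes.io/host-path volume like the four the
// reconciler is attaching for csi-node-driver-fsx7j.
func hostPath(name, path string) corev1.Volume {
	return corev1.Volume{
		Name: name,
		VolumeSource: corev1.VolumeSource{
			HostPath: &corev1.HostPathVolumeSource{Path: path},
		},
	}
}

func main() {
	// Paths are hypothetical placeholders; only the names come from the log.
	vols := []corev1.Volume{
		hostPath("kubelet-dir", "/var/lib/kubelet"),
		hostPath("registration-dir", "/var/lib/kubelet/plugins_registry"),
		hostPath("socket-dir", "/var/lib/kubelet/plugins/csi.tigera.io"),
		hostPath("varrun", "/var/run"),
	}
	for _, v := range vols {
		fmt.Printf("%s -> %s\n", v.Name, v.HostPath.Path)
	}
}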
Sep 12 17:34:01.441272 containerd[1494]: time="2025-09-12T17:34:01.441192585Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-vsntk,Uid:5e4423bf-638a-42c9-bba6-67fb5bbbd0bc,Namespace:calico-system,Attempt:0,}"
Sep 12 17:34:01.465299 containerd[1494]: time="2025-09-12T17:34:01.464565408Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Sep 12 17:34:01.465548 containerd[1494]: time="2025-09-12T17:34:01.465432017Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Sep 12 17:34:01.465649 containerd[1494]: time="2025-09-12T17:34:01.465504957Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Sep 12 17:34:01.465923 containerd[1494]: time="2025-09-12T17:34:01.465861853Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Sep 12 17:34:01.487388 systemd[1]: Started cri-containerd-ca7c90a17f9ecf47ab769b74bab16df3cb9ef935b94714f76e5beff7e8e4cbfe.scope - libcontainer container ca7c90a17f9ecf47ab769b74bab16df3cb9ef935b94714f76e5beff7e8e4cbfe.
Sep 12 17:34:01.518955 kubelet[2561]: E0912 17:34:01.518819 2561 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 17:34:01.518955 kubelet[2561]: W0912 17:34:01.518833 2561 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 17:34:01.518955 kubelet[2561]: E0912 17:34:01.518844 2561 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
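The three-line burst keeps recurring because every plugin-probe pass re-runs the FlexVolume driver's init call against a binary that is not installed. A minimal standalone Go sketch (not kubelet's actual driver-call code; the DriverStatus shape is illustrative) reproduces both halves of the message pair: the exec failure, and the empty output that encoding/json then rejects with "unexpected end of JSON input".

package main

import (
	"encoding/json"
	"fmt"
	"os/exec"
)

// DriverStatus is an illustrative stand-in for the JSON object a
// FlexVolume driver is expected to print in response to "init".
type DriverStatus struct {
	Status       string          `json:"status"`
	Capabilities map[string]bool `json:"capabilities,omitempty"`
}

func main() {
	// A bare name forces a $PATH lookup (kubelet invokes the full
	// /opt/libexec/.../uds path); with no "uds" installed this fails
	// exactly like the log's driver-call.go:149 line:
	//   exec: "uds": executable file not found in $PATH
	out, err := exec.Command("uds", "init").CombinedOutput()
	if err != nil {
		fmt.Printf("FlexVolume: driver call failed: %v, output: %q\n", err, out)
	}

	// The probe then decodes whatever the driver printed; empty output is
	// why driver-call.go:262 reports "unexpected end of JSON input".
	var st DriverStatus
	if err := json.Unmarshal(out, &st); err != nil {
		fmt.Printf("failed to unmarshal output: %v\n", err)
	}
}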
Sep 12 17:34:01.522859 containerd[1494]: time="2025-09-12T17:34:01.522526752Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-vsntk,Uid:5e4423bf-638a-42c9-bba6-67fb5bbbd0bc,Namespace:calico-system,Attempt:0,} returns sandbox id \"ca7c90a17f9ecf47ab769b74bab16df3cb9ef935b94714f76e5beff7e8e4cbfe\""
Sep 12 17:34:03.084126 kubelet[2561]: E0912 17:34:03.083991 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-fsx7j" podUID="53bbc8dc-3621-4d89-be2b-ae2eb974a294"
Sep 12 17:34:03.100633 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1224160251.mount: Deactivated successfully.
Sep 12 17:34:03.511091 containerd[1494]: time="2025-09-12T17:34:03.510454157Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 17:34:03.511786 containerd[1494]: time="2025-09-12T17:34:03.511753530Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.30.3: active requests=0, bytes read=35237389"
Sep 12 17:34:03.526259 containerd[1494]: time="2025-09-12T17:34:03.526167865Z" level=info msg="ImageCreate event name:\"sha256:1d7bb7b0cce2924d35c7c26f6b6600409ea7c9535074c3d2e517ffbb3a0e0b36\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 17:34:03.555485 containerd[1494]: time="2025-09-12T17:34:03.555323093Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:f4a3d61ffda9c98a53adeb412c5af404ca3727a3cc2d0b4ef28d197bdd47ecaa\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 17:34:03.557344 containerd[1494]: time="2025-09-12T17:34:03.556820947Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.30.3\" with image id \"sha256:1d7bb7b0cce2924d35c7c26f6b6600409ea7c9535074c3d2e517ffbb3a0e0b36\", repo tag \"ghcr.io/flatcar/calico/typha:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:f4a3d61ffda9c98a53adeb412c5af404ca3727a3cc2d0b4ef28d197bdd47ecaa\", size \"35237243\" in 2.277253565s"
Sep 12 17:34:03.557659 containerd[1494]: time="2025-09-12T17:34:03.557626712Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.3\" returns image reference \"sha256:1d7bb7b0cce2924d35c7c26f6b6600409ea7c9535074c3d2e517ffbb3a0e0b36\""
Sep 12 17:34:03.559079 containerd[1494]: time="2025-09-12T17:34:03.559045795Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\""
Sep 12 17:34:03.576288 containerd[1494]: time="2025-09-12T17:34:03.576210998Z" level=info msg="CreateContainer within sandbox \"4294636397c75518f87a47cb662d85e5b94584868fb3fdcddfcb3ac970479b15\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}"
Sep 12 17:34:03.591256 containerd[1494]: time="2025-09-12T17:34:03.590436411Z" level=info msg="CreateContainer within sandbox \"4294636397c75518f87a47cb662d85e5b94584868fb3fdcddfcb3ac970479b15\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"2ceebc4ed5bb7e115177d2a21de0ba8e252ffea06a02f8fea63e32777f4dd43e\""
Sep 12 17:34:03.593655 containerd[1494]: time="2025-09-12T17:34:03.593619278Z" level=info msg="StartContainer for \"2ceebc4ed5bb7e115177d2a21de0ba8e252ffea06a02f8fea63e32777f4dd43e\""
Sep 12 17:34:03.640367 systemd[1]: Started cri-containerd-2ceebc4ed5bb7e115177d2a21de0ba8e252ffea06a02f8fea63e32777f4dd43e.scope - libcontainer container 2ceebc4ed5bb7e115177d2a21de0ba8e252ffea06a02f8fea63e32777f4dd43e.
Sep 12 17:34:03.679007 containerd[1494]: time="2025-09-12T17:34:03.678860121Z" level=info msg="StartContainer for \"2ceebc4ed5bb7e115177d2a21de0ba8e252ffea06a02f8fea63e32777f4dd43e\" returns successfully"
Sep 12 17:34:04.200067 kubelet[2561]: E0912 17:34:04.199951 2561 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 17:34:04.200067 kubelet[2561]: W0912 17:34:04.199971 2561 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 17:34:04.200067 kubelet[2561]: E0912 17:34:04.199991 2561 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
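The containerd entries above are the server side of the kubelet's CRI calls: RunPodSandbox returns the sandbox id, then CreateContainer and StartContainer run the calico-typha container inside an already-created sandbox. A sketch of the client side over gRPC, assuming k8s.io/cri-api and the default containerd socket path (this is not the kubelet's actual code path, just the same RPC):

package main

import (
	"context"
	"fmt"

	"google.golang.org/grpc"
	"google.golang.org/grpc/credentials/insecure"
	runtimeapi "k8s.io/cri-api/pkg/apis/runtime/v1"
)

func main() {
	// Default containerd CRI socket; adjust if relocated on your host.
	conn, err := grpc.Dial("unix:///run/containerd/containerd.sock",
		grpc.WithTransportCredentials(insecure.NewCredentials()))
	if err != nil {
		panic(err)
	}
	defer conn.Close()
	rt := runtimeapi.NewRuntimeServiceClient(conn)

	// Metadata matches the log's &PodSandboxMetadata{...} for calico-node-vsntk.
	resp, err := rt.RunPodSandbox(context.Background(), &runtimeapi.RunPodSandboxRequest{
		Config: &runtimeapi.PodSandboxConfig{
			Metadata: &runtimeapi.PodSandboxMetadata{
				Name:      "calico-node-vsntk",
				Uid:       "5e4423bf-638a-42c9-bba6-67fb5bbbd0bc",
				Namespace: "calico-system",
				Attempt:   0,
			},
		},
	})
	if err != nil {
		panic(err)
	}
	// containerd echoes this id back as: RunPodSandbox ... returns sandbox id "ca7c90a1..."
	fmt.Println("sandbox id:", resp.PodSandboxId)
}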
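The "Error syncing pod" lines for csi-node-driver-fsx7j (at 17:34:03 above and again at 17:34:05 below) come from the kubelet gating pod sync on the runtime's NetworkReady condition, which stays false until calico-node installs a CNI config. A sketch of reading that condition directly over CRI, under the same k8s.io/cri-api and default-socket assumptions as above:

package main

import (
	"context"
	"fmt"

	"google.golang.org/grpc"
	"google.golang.org/grpc/credentials/insecure"
	runtimeapi "k8s.io/cri-api/pkg/apis/runtime/v1"
)

func main() {
	conn, err := grpc.Dial("unix:///run/containerd/containerd.sock",
		grpc.WithTransportCredentials(insecure.NewCredentials()))
	if err != nil {
		panic(err)
	}
	defer conn.Close()

	resp, err := runtimeapi.NewRuntimeServiceClient(conn).
		Status(context.Background(), &runtimeapi.StatusRequest{})
	if err != nil {
		panic(err)
	}
	// While the CNI plugin is uninitialized, NetworkReady reports false with
	// reason NetworkPluginNotReady, which is what pod_workers.go surfaces.
	for _, c := range resp.Status.Conditions {
		fmt.Printf("%s=%v reason=%q message=%q\n", c.Type, c.Status, c.Reason, c.Message)
	}
}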
Sep 12 17:34:05.079345 kubelet[2561]: E0912 17:34:05.079267 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-fsx7j" podUID="53bbc8dc-3621-4d89-be2b-ae2eb974a294"
Sep 12 17:34:05.162961 kubelet[2561]: I0912 17:34:05.162592 2561 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Sep 12 17:34:05.211317 kubelet[2561]: E0912 17:34:05.211266 2561 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 17:34:05.211317 kubelet[2561]: W0912 17:34:05.211299 2561 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 17:34:05.211317 kubelet[2561]: E0912 17:34:05.211317 2561 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 17:34:05.224958 kubelet[2561]: E0912 17:34:05.224848 2561 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 17:34:05.224958 kubelet[2561]: W0912 17:34:05.224857 2561 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 17:34:05.224958 kubelet[2561]: E0912 17:34:05.224869 2561 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:34:05.225494 kubelet[2561]: E0912 17:34:05.225298 2561 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:34:05.225494 kubelet[2561]: W0912 17:34:05.225309 2561 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:34:05.225494 kubelet[2561]: E0912 17:34:05.225331 2561 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:34:05.225849 kubelet[2561]: E0912 17:34:05.225618 2561 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:34:05.225849 kubelet[2561]: W0912 17:34:05.225629 2561 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:34:05.225849 kubelet[2561]: E0912 17:34:05.225643 2561 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:34:05.226086 kubelet[2561]: E0912 17:34:05.226075 2561 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:34:05.226153 kubelet[2561]: W0912 17:34:05.226142 2561 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:34:05.226267 kubelet[2561]: E0912 17:34:05.226239 2561 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:34:05.226542 kubelet[2561]: E0912 17:34:05.226466 2561 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:34:05.226542 kubelet[2561]: W0912 17:34:05.226477 2561 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:34:05.226733 kubelet[2561]: E0912 17:34:05.226618 2561 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:34:05.226821 kubelet[2561]: E0912 17:34:05.226792 2561 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:34:05.226821 kubelet[2561]: W0912 17:34:05.226809 2561 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:34:05.226870 kubelet[2561]: E0912 17:34:05.226845 2561 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:34:05.227135 kubelet[2561]: E0912 17:34:05.227102 2561 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:34:05.227594 kubelet[2561]: W0912 17:34:05.227118 2561 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:34:05.227744 kubelet[2561]: E0912 17:34:05.227609 2561 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:34:05.227823 kubelet[2561]: E0912 17:34:05.227798 2561 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:34:05.227823 kubelet[2561]: W0912 17:34:05.227813 2561 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:34:05.227873 kubelet[2561]: E0912 17:34:05.227863 2561 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:34:05.228087 kubelet[2561]: E0912 17:34:05.228067 2561 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:34:05.228087 kubelet[2561]: W0912 17:34:05.228081 2561 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:34:05.228149 kubelet[2561]: E0912 17:34:05.228096 2561 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:34:05.228351 kubelet[2561]: E0912 17:34:05.228328 2561 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:34:05.228351 kubelet[2561]: W0912 17:34:05.228343 2561 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:34:05.228488 kubelet[2561]: E0912 17:34:05.228357 2561 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:34:05.228620 kubelet[2561]: E0912 17:34:05.228498 2561 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:34:05.228620 kubelet[2561]: W0912 17:34:05.228509 2561 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:34:05.228620 kubelet[2561]: E0912 17:34:05.228531 2561 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:34:05.228759 kubelet[2561]: E0912 17:34:05.228743 2561 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:34:05.228759 kubelet[2561]: W0912 17:34:05.228757 2561 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:34:05.228857 kubelet[2561]: E0912 17:34:05.228771 2561 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:34:05.229092 kubelet[2561]: E0912 17:34:05.229070 2561 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:34:05.229092 kubelet[2561]: W0912 17:34:05.229083 2561 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:34:05.229185 kubelet[2561]: E0912 17:34:05.229097 2561 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:34:05.229345 kubelet[2561]: E0912 17:34:05.229320 2561 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:34:05.229345 kubelet[2561]: W0912 17:34:05.229335 2561 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:34:05.229415 kubelet[2561]: E0912 17:34:05.229348 2561 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:34:05.229881 kubelet[2561]: E0912 17:34:05.229860 2561 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:34:05.229881 kubelet[2561]: W0912 17:34:05.229876 2561 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:34:05.229967 kubelet[2561]: E0912 17:34:05.229886 2561 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:34:05.373958 containerd[1494]: time="2025-09-12T17:34:05.371995948Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:34:05.375381 containerd[1494]: time="2025-09-12T17:34:05.375332363Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3: active requests=0, bytes read=4446660" Sep 12 17:34:05.377120 containerd[1494]: time="2025-09-12T17:34:05.376148945Z" level=info msg="ImageCreate event name:\"sha256:4f2b088ed6fdfc6a97ac0650a4ba8171107d6656ce265c592e4c8423fd10e5c4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:34:05.378614 containerd[1494]: time="2025-09-12T17:34:05.377906028Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:81bdfcd9dbd36624dc35354e8c181c75631ba40e6c7df5820f5f56cea36f0ef9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:34:05.378614 containerd[1494]: time="2025-09-12T17:34:05.378517827Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" with image id \"sha256:4f2b088ed6fdfc6a97ac0650a4ba8171107d6656ce265c592e4c8423fd10e5c4\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:81bdfcd9dbd36624dc35354e8c181c75631ba40e6c7df5820f5f56cea36f0ef9\", size \"5939323\" in 1.819440723s" Sep 12 17:34:05.378614 containerd[1494]: time="2025-09-12T17:34:05.378545090Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" returns image reference \"sha256:4f2b088ed6fdfc6a97ac0650a4ba8171107d6656ce265c592e4c8423fd10e5c4\"" Sep 12 17:34:05.381175 containerd[1494]: time="2025-09-12T17:34:05.381137701Z" level=info msg="CreateContainer within sandbox \"ca7c90a17f9ecf47ab769b74bab16df3cb9ef935b94714f76e5beff7e8e4cbfe\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Sep 12 17:34:05.424973 containerd[1494]: time="2025-09-12T17:34:05.424544071Z" level=info msg="CreateContainer within sandbox \"ca7c90a17f9ecf47ab769b74bab16df3cb9ef935b94714f76e5beff7e8e4cbfe\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"61dbaf4a18d06a6bf17404cbb1f42563c350b685d7fa5e2afe51c97ec4a0aaf9\"" Sep 12 17:34:05.427251 containerd[1494]: time="2025-09-12T17:34:05.425888033Z" level=info msg="StartContainer for \"61dbaf4a18d06a6bf17404cbb1f42563c350b685d7fa5e2afe51c97ec4a0aaf9\"" Sep 12 17:34:05.457759 systemd[1]: Started cri-containerd-61dbaf4a18d06a6bf17404cbb1f42563c350b685d7fa5e2afe51c97ec4a0aaf9.scope - libcontainer container 61dbaf4a18d06a6bf17404cbb1f42563c350b685d7fa5e2afe51c97ec4a0aaf9. Sep 12 17:34:05.483168 containerd[1494]: time="2025-09-12T17:34:05.483127521Z" level=info msg="StartContainer for \"61dbaf4a18d06a6bf17404cbb1f42563c350b685d7fa5e2afe51c97ec4a0aaf9\" returns successfully" Sep 12 17:34:05.493173 systemd[1]: cri-containerd-61dbaf4a18d06a6bf17404cbb1f42563c350b685d7fa5e2afe51c97ec4a0aaf9.scope: Deactivated successfully. Sep 12 17:34:05.512961 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-61dbaf4a18d06a6bf17404cbb1f42563c350b685d7fa5e2afe51c97ec4a0aaf9-rootfs.mount: Deactivated successfully. 
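[Editor's note: the repeated driver-call.go / plugins.go errors above come from the kubelet's periodic probe of the FlexVolume plugin directory. The nodeagent~uds driver directory exists before the flexvol-driver init container (pulled and started just above) has installed the uds binary, so exec'ing "uds init" produces no output, and unmarshaling empty output is itself an error. A minimal Go sketch of that failure mode follows; the types and helper names are illustrative, not kubelet's actual driver-call.go.]

    // Sketch: why a missing FlexVolume driver binary surfaces as
    // "unexpected end of JSON input": the exec fails, captured stdout is
    // empty, and json.Unmarshal of an empty byte slice returns exactly
    // that error string.
    package main

    import (
        "encoding/json"
        "fmt"
        "os/exec"
    )

    // DriverStatus mirrors the shape of a FlexVolume "init" reply,
    // e.g. {"status":"Success","capabilities":{"attach":false}};
    // field names here are illustrative.
    type DriverStatus struct {
        Status       string          `json:"status"`
        Capabilities map[string]bool `json:"capabilities,omitempty"`
    }

    func probeDriver(path string) (*DriverStatus, error) {
        out, execErr := exec.Command(path, "init").Output() // empty when the binary is absent
        var st DriverStatus
        if err := json.Unmarshal(out, &st); err != nil {
            // With out == "" this is "unexpected end of JSON input".
            return nil, fmt.Errorf("failed to unmarshal output %q: %v (exec error: %v)", out, err, execErr)
        }
        return &st, nil
    }

    func main() {
        _, err := probeDriver("/opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds")
        fmt.Println(err)
    }

The errors stop on their own once the flexvol-driver container has copied the uds binary into place and a later probe succeeds.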
Sep 12 17:34:05.530077 containerd[1494]: time="2025-09-12T17:34:05.517794140Z" level=info msg="shim disconnected" id=61dbaf4a18d06a6bf17404cbb1f42563c350b685d7fa5e2afe51c97ec4a0aaf9 namespace=k8s.io
Sep 12 17:34:05.530077 containerd[1494]: time="2025-09-12T17:34:05.530056091Z" level=warning msg="cleaning up after shim disconnected" id=61dbaf4a18d06a6bf17404cbb1f42563c350b685d7fa5e2afe51c97ec4a0aaf9 namespace=k8s.io
Sep 12 17:34:05.530077 containerd[1494]: time="2025-09-12T17:34:05.530067073Z" level=info msg="cleaning up dead shim" namespace=k8s.io
Sep 12 17:34:06.163574 containerd[1494]: time="2025-09-12T17:34:06.163085952Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.3\""
Sep 12 17:34:06.180756 kubelet[2561]: I0912 17:34:06.180446 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-7bb9f6f777-n9trb" podStartSLOduration=3.900941261 podStartE2EDuration="6.180430306s" podCreationTimestamp="2025-09-12 17:34:00 +0000 UTC" firstStartedPulling="2025-09-12 17:34:01.279193473 +0000 UTC m=+17.306608583" lastFinishedPulling="2025-09-12 17:34:03.558682538 +0000 UTC m=+19.586097628" observedRunningTime="2025-09-12 17:34:04.196885977 +0000 UTC m=+20.224301077" watchObservedRunningTime="2025-09-12 17:34:06.180430306 +0000 UTC m=+22.207845396"
Sep 12 17:34:07.079457 kubelet[2561]: E0912 17:34:07.079410 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-fsx7j" podUID="53bbc8dc-3621-4d89-be2b-ae2eb974a294"
Sep 12 17:34:07.948190 kubelet[2561]: I0912 17:34:07.948082 2561 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Sep 12 17:34:09.080084 kubelet[2561]: E0912 17:34:09.079991 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-fsx7j" podUID="53bbc8dc-3621-4d89-be2b-ae2eb974a294"
Sep 12 17:34:10.050108 containerd[1494]: time="2025-09-12T17:34:10.050056463Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 17:34:10.051108 containerd[1494]: time="2025-09-12T17:34:10.051008064Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.30.3: active requests=0, bytes read=70440613"
Sep 12 17:34:10.052677 containerd[1494]: time="2025-09-12T17:34:10.051769783Z" level=info msg="ImageCreate event name:\"sha256:034822460c2f667e1f4a7679c843cc35ce1bf2c25dec86f04e07fb403df7e458\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 17:34:10.053971 containerd[1494]: time="2025-09-12T17:34:10.053406027Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:73d1e391050490d54e5bee8ff2b1a50a8be1746c98dc530361b00e8c0ab63f87\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 17:34:10.053971 containerd[1494]: time="2025-09-12T17:34:10.053869969Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.30.3\" with image id \"sha256:034822460c2f667e1f4a7679c843cc35ce1bf2c25dec86f04e07fb403df7e458\", repo tag \"ghcr.io/flatcar/calico/cni:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:73d1e391050490d54e5bee8ff2b1a50a8be1746c98dc530361b00e8c0ab63f87\", size \"71933316\" in 3.890741527s"
Sep 12 17:34:10.053971 containerd[1494]: time="2025-09-12T17:34:10.053899095Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.3\" returns image reference \"sha256:034822460c2f667e1f4a7679c843cc35ce1bf2c25dec86f04e07fb403df7e458\""
Sep 12 17:34:10.056661 containerd[1494]: time="2025-09-12T17:34:10.056641323Z" level=info msg="CreateContainer within sandbox \"ca7c90a17f9ecf47ab769b74bab16df3cb9ef935b94714f76e5beff7e8e4cbfe\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}"
Sep 12 17:34:10.072975 containerd[1494]: time="2025-09-12T17:34:10.072931159Z" level=info msg="CreateContainer within sandbox \"ca7c90a17f9ecf47ab769b74bab16df3cb9ef935b94714f76e5beff7e8e4cbfe\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"7ded94e2d3b7fa6391de909edeabd66ed45d9d284619ace2c67eea10fdf1ad6f\""
Sep 12 17:34:10.075462 containerd[1494]: time="2025-09-12T17:34:10.074354306Z" level=info msg="StartContainer for \"7ded94e2d3b7fa6391de909edeabd66ed45d9d284619ace2c67eea10fdf1ad6f\""
Sep 12 17:34:10.112417 systemd[1]: Started cri-containerd-7ded94e2d3b7fa6391de909edeabd66ed45d9d284619ace2c67eea10fdf1ad6f.scope - libcontainer container 7ded94e2d3b7fa6391de909edeabd66ed45d9d284619ace2c67eea10fdf1ad6f.
Sep 12 17:34:10.145293 containerd[1494]: time="2025-09-12T17:34:10.144212672Z" level=info msg="StartContainer for \"7ded94e2d3b7fa6391de909edeabd66ed45d9d284619ace2c67eea10fdf1ad6f\" returns successfully"
Sep 12 17:34:10.690430 systemd[1]: cri-containerd-7ded94e2d3b7fa6391de909edeabd66ed45d9d284619ace2c67eea10fdf1ad6f.scope: Deactivated successfully.
Sep 12 17:34:10.732907 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-7ded94e2d3b7fa6391de909edeabd66ed45d9d284619ace2c67eea10fdf1ad6f-rootfs.mount: Deactivated successfully.
Sep 12 17:34:10.741451 containerd[1494]: time="2025-09-12T17:34:10.741361743Z" level=info msg="shim disconnected" id=7ded94e2d3b7fa6391de909edeabd66ed45d9d284619ace2c67eea10fdf1ad6f namespace=k8s.io
Sep 12 17:34:10.741680 containerd[1494]: time="2025-09-12T17:34:10.741631316Z" level=warning msg="cleaning up after shim disconnected" id=7ded94e2d3b7fa6391de909edeabd66ed45d9d284619ace2c67eea10fdf1ad6f namespace=k8s.io
Sep 12 17:34:10.741680 containerd[1494]: time="2025-09-12T17:34:10.741661984Z" level=info msg="cleaning up dead shim" namespace=k8s.io
Sep 12 17:34:10.758454 kubelet[2561]: I0912 17:34:10.758392 2561 kubelet_node_status.go:501] "Fast updating node status as it just became ready"
Sep 12 17:34:10.837185 kubelet[2561]: W0912 17:34:10.837150 2561 reflector.go:569] object-"calico-apiserver"/"calico-apiserver-certs": failed to list *v1.Secret: secrets "calico-apiserver-certs" is forbidden: User "system:node:ci-4081-3-6-f-c182586e87" cannot list resource "secrets" in API group "" in the namespace "calico-apiserver": no relationship found between node 'ci-4081-3-6-f-c182586e87' and this object
Sep 12 17:34:10.839274 kubelet[2561]: E0912 17:34:10.838949 2561 reflector.go:166] "Unhandled Error" err="object-\"calico-apiserver\"/\"calico-apiserver-certs\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"calico-apiserver-certs\" is forbidden: User \"system:node:ci-4081-3-6-f-c182586e87\" cannot list resource \"secrets\" in API group \"\" in the namespace \"calico-apiserver\": no relationship found between node 'ci-4081-3-6-f-c182586e87' and this object" logger="UnhandledError"
Sep 12 17:34:10.841470 systemd[1]: Created slice kubepods-besteffort-pod55ae9bf9_0ab6_42c6_be46_b863c939859b.slice - libcontainer container kubepods-besteffort-pod55ae9bf9_0ab6_42c6_be46_b863c939859b.slice.
Sep 12 17:34:10.856494 systemd[1]: Created slice kubepods-besteffort-pod8d975815_4db9_4493_ae13_0fed34b78044.slice - libcontainer container kubepods-besteffort-pod8d975815_4db9_4493_ae13_0fed34b78044.slice.
Sep 12 17:34:10.865028 systemd[1]: Created slice kubepods-burstable-pod33f03454_7bb2_47fc_bcae_807921ce8ad3.slice - libcontainer container kubepods-burstable-pod33f03454_7bb2_47fc_bcae_807921ce8ad3.slice.
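[Editor's note: every RunPodSandbox failure below traces to one missing file. The Calico CNI plugin reads /var/lib/calico/nodename, which the calico/node container writes once it is running; since the node image is still being pulled at this point, both "add" and "delete" fail with the same stat error for every pod. A hedged Go sketch of that guard follows; it is illustrative, not Calico's actual source.]

    // Sketch of the nodename guard behind the CNI errors below: refuse to
    // set up or tear down a pod network until /var/lib/calico/nodename
    // exists.
    package main

    import (
        "fmt"
        "os"
        "strings"
    )

    const nodenameFile = "/var/lib/calico/nodename"

    func detectNodename() (string, error) {
        data, err := os.ReadFile(nodenameFile)
        if err != nil {
            // This is the failure containerd relays for both add and delete.
            return "", fmt.Errorf("stat %s: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/", nodenameFile)
        }
        return strings.TrimSpace(string(data)), nil
    }

    func main() {
        name, err := detectNodename()
        if err != nil {
            fmt.Println("cni plugin not ready:", err)
            return
        }
        fmt.Println("running CNI for node", name)
    }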
Sep 12 17:34:10.871827 kubelet[2561]: I0912 17:34:10.871808 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8d975815-4db9-4493-ae13-0fed34b78044-config\") pod \"goldmane-54d579b49d-wnp7f\" (UID: \"8d975815-4db9-4493-ae13-0fed34b78044\") " pod="calico-system/goldmane-54d579b49d-wnp7f"
Sep 12 17:34:10.872140 kubelet[2561]: I0912 17:34:10.872125 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b4kpz\" (UniqueName: \"kubernetes.io/projected/55ae9bf9-0ab6-42c6-be46-b863c939859b-kube-api-access-b4kpz\") pod \"whisker-67c45c5486-4zft2\" (UID: \"55ae9bf9-0ab6-42c6-be46-b863c939859b\") " pod="calico-system/whisker-67c45c5486-4zft2"
Sep 12 17:34:10.873014 kubelet[2561]: I0912 17:34:10.872399 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/33f03454-7bb2-47fc-bcae-807921ce8ad3-config-volume\") pod \"coredns-668d6bf9bc-s4d7g\" (UID: \"33f03454-7bb2-47fc-bcae-807921ce8ad3\") " pod="kube-system/coredns-668d6bf9bc-s4d7g"
Sep 12 17:34:10.872923 systemd[1]: Created slice kubepods-besteffort-pod9ee60034_1a31_4418_b743_0f97ab43bc92.slice - libcontainer container kubepods-besteffort-pod9ee60034_1a31_4418_b743_0f97ab43bc92.slice.
Sep 12 17:34:10.873721 kubelet[2561]: I0912 17:34:10.873618 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/0d943c3b-957f-40f9-b21f-ac657206aace-calico-apiserver-certs\") pod \"calico-apiserver-6dd46897bd-htdld\" (UID: \"0d943c3b-957f-40f9-b21f-ac657206aace\") " pod="calico-apiserver/calico-apiserver-6dd46897bd-htdld"
Sep 12 17:34:10.873850 kubelet[2561]: I0912 17:34:10.873837 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-775r5\" (UniqueName: \"kubernetes.io/projected/0d943c3b-957f-40f9-b21f-ac657206aace-kube-api-access-775r5\") pod \"calico-apiserver-6dd46897bd-htdld\" (UID: \"0d943c3b-957f-40f9-b21f-ac657206aace\") " pod="calico-apiserver/calico-apiserver-6dd46897bd-htdld"
Sep 12 17:34:10.874393 kubelet[2561]: I0912 17:34:10.873909 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hmt79\" (UniqueName: \"kubernetes.io/projected/33f03454-7bb2-47fc-bcae-807921ce8ad3-kube-api-access-hmt79\") pod \"coredns-668d6bf9bc-s4d7g\" (UID: \"33f03454-7bb2-47fc-bcae-807921ce8ad3\") " pod="kube-system/coredns-668d6bf9bc-s4d7g"
Sep 12 17:34:10.874393 kubelet[2561]: I0912 17:34:10.873929 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxw92\" (UniqueName: \"kubernetes.io/projected/8d975815-4db9-4493-ae13-0fed34b78044-kube-api-access-xxw92\") pod \"goldmane-54d579b49d-wnp7f\" (UID: \"8d975815-4db9-4493-ae13-0fed34b78044\") " pod="calico-system/goldmane-54d579b49d-wnp7f"
Sep 12 17:34:10.874393 kubelet[2561]: I0912 17:34:10.873942 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9mzkc\" (UniqueName: \"kubernetes.io/projected/9ee60034-1a31-4418-b743-0f97ab43bc92-kube-api-access-9mzkc\") pod \"calico-apiserver-6dd46897bd-bbjqp\" (UID: \"9ee60034-1a31-4418-b743-0f97ab43bc92\") " pod="calico-apiserver/calico-apiserver-6dd46897bd-bbjqp"
Sep 12 17:34:10.874393 kubelet[2561]: I0912 17:34:10.873955 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1a745a4f-a77e-4660-9ac5-801112a8b773-tigera-ca-bundle\") pod \"calico-kube-controllers-599b568c97-s5dwl\" (UID: \"1a745a4f-a77e-4660-9ac5-801112a8b773\") " pod="calico-system/calico-kube-controllers-599b568c97-s5dwl"
Sep 12 17:34:10.874393 kubelet[2561]: I0912 17:34:10.873969 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bv4tk\" (UniqueName: \"kubernetes.io/projected/1a745a4f-a77e-4660-9ac5-801112a8b773-kube-api-access-bv4tk\") pod \"calico-kube-controllers-599b568c97-s5dwl\" (UID: \"1a745a4f-a77e-4660-9ac5-801112a8b773\") " pod="calico-system/calico-kube-controllers-599b568c97-s5dwl"
Sep 12 17:34:10.874612 kubelet[2561]: I0912 17:34:10.873982 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cbad7146-1812-4262-a0a7-fad3ba169a40-config-volume\") pod \"coredns-668d6bf9bc-szrkp\" (UID: \"cbad7146-1812-4262-a0a7-fad3ba169a40\") " pod="kube-system/coredns-668d6bf9bc-szrkp"
Sep 12 17:34:10.874612 kubelet[2561]: I0912 17:34:10.873996 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nw2vg\" (UniqueName: \"kubernetes.io/projected/cbad7146-1812-4262-a0a7-fad3ba169a40-kube-api-access-nw2vg\") pod \"coredns-668d6bf9bc-szrkp\" (UID: \"cbad7146-1812-4262-a0a7-fad3ba169a40\") " pod="kube-system/coredns-668d6bf9bc-szrkp"
Sep 12 17:34:10.874612 kubelet[2561]: I0912 17:34:10.874009 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/9ee60034-1a31-4418-b743-0f97ab43bc92-calico-apiserver-certs\") pod \"calico-apiserver-6dd46897bd-bbjqp\" (UID: \"9ee60034-1a31-4418-b743-0f97ab43bc92\") " pod="calico-apiserver/calico-apiserver-6dd46897bd-bbjqp"
Sep 12 17:34:10.874612 kubelet[2561]: I0912 17:34:10.874021 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/55ae9bf9-0ab6-42c6-be46-b863c939859b-whisker-ca-bundle\") pod \"whisker-67c45c5486-4zft2\" (UID: \"55ae9bf9-0ab6-42c6-be46-b863c939859b\") " pod="calico-system/whisker-67c45c5486-4zft2"
Sep 12 17:34:10.874612 kubelet[2561]: I0912 17:34:10.874039 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/8d975815-4db9-4493-ae13-0fed34b78044-goldmane-key-pair\") pod \"goldmane-54d579b49d-wnp7f\" (UID: \"8d975815-4db9-4493-ae13-0fed34b78044\") " pod="calico-system/goldmane-54d579b49d-wnp7f"
Sep 12 17:34:10.876003 kubelet[2561]: I0912 17:34:10.874051 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8d975815-4db9-4493-ae13-0fed34b78044-goldmane-ca-bundle\") pod \"goldmane-54d579b49d-wnp7f\" (UID: \"8d975815-4db9-4493-ae13-0fed34b78044\") " pod="calico-system/goldmane-54d579b49d-wnp7f"
Sep 12 17:34:10.876003 kubelet[2561]: I0912 17:34:10.874065 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/55ae9bf9-0ab6-42c6-be46-b863c939859b-whisker-backend-key-pair\") pod \"whisker-67c45c5486-4zft2\" (UID: \"55ae9bf9-0ab6-42c6-be46-b863c939859b\") " pod="calico-system/whisker-67c45c5486-4zft2"
Sep 12 17:34:10.880830 systemd[1]: Created slice kubepods-burstable-podcbad7146_1812_4262_a0a7_fad3ba169a40.slice - libcontainer container kubepods-burstable-podcbad7146_1812_4262_a0a7_fad3ba169a40.slice.
Sep 12 17:34:10.889009 systemd[1]: Created slice kubepods-besteffort-pod1a745a4f_a77e_4660_9ac5_801112a8b773.slice - libcontainer container kubepods-besteffort-pod1a745a4f_a77e_4660_9ac5_801112a8b773.slice.
Sep 12 17:34:10.895430 systemd[1]: Created slice kubepods-besteffort-pod0d943c3b_957f_40f9_b21f_ac657206aace.slice - libcontainer container kubepods-besteffort-pod0d943c3b_957f_40f9_b21f_ac657206aace.slice.
Sep 12 17:34:11.087793 systemd[1]: Created slice kubepods-besteffort-pod53bbc8dc_3621_4d89_be2b_ae2eb974a294.slice - libcontainer container kubepods-besteffort-pod53bbc8dc_3621_4d89_be2b_ae2eb974a294.slice.
Sep 12 17:34:11.090530 containerd[1494]: time="2025-09-12T17:34:11.090494204Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-fsx7j,Uid:53bbc8dc-3621-4d89-be2b-ae2eb974a294,Namespace:calico-system,Attempt:0,}"
Sep 12 17:34:11.150440 containerd[1494]: time="2025-09-12T17:34:11.150155743Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-67c45c5486-4zft2,Uid:55ae9bf9-0ab6-42c6-be46-b863c939859b,Namespace:calico-system,Attempt:0,}"
Sep 12 17:34:11.162710 containerd[1494]: time="2025-09-12T17:34:11.162654785Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-wnp7f,Uid:8d975815-4db9-4493-ae13-0fed34b78044,Namespace:calico-system,Attempt:0,}"
Sep 12 17:34:11.170308 containerd[1494]: time="2025-09-12T17:34:11.170257127Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-s4d7g,Uid:33f03454-7bb2-47fc-bcae-807921ce8ad3,Namespace:kube-system,Attempt:0,}"
Sep 12 17:34:11.187830 containerd[1494]: time="2025-09-12T17:34:11.186892453Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-szrkp,Uid:cbad7146-1812-4262-a0a7-fad3ba169a40,Namespace:kube-system,Attempt:0,}"
Sep 12 17:34:11.193872 containerd[1494]: time="2025-09-12T17:34:11.193485285Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-599b568c97-s5dwl,Uid:1a745a4f-a77e-4660-9ac5-801112a8b773,Namespace:calico-system,Attempt:0,}"
Sep 12 17:34:11.239495 containerd[1494]: time="2025-09-12T17:34:11.239176337Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.3\""
Sep 12 17:34:11.394625 containerd[1494]: time="2025-09-12T17:34:11.394043198Z" level=error msg="Failed to destroy network for sandbox \"8839e8ab9f8eb8f13c11b60ca60e7bc34bd56fdca9a761fac69e876e635d1405\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 12 17:34:11.398195 containerd[1494]: time="2025-09-12T17:34:11.398156698Z" level=error msg="encountered an error cleaning up failed sandbox \"8839e8ab9f8eb8f13c11b60ca60e7bc34bd56fdca9a761fac69e876e635d1405\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 12 17:34:11.398297 containerd[1494]: time="2025-09-12T17:34:11.398237231Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-67c45c5486-4zft2,Uid:55ae9bf9-0ab6-42c6-be46-b863c939859b,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"8839e8ab9f8eb8f13c11b60ca60e7bc34bd56fdca9a761fac69e876e635d1405\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 12 17:34:11.404413 containerd[1494]: time="2025-09-12T17:34:11.403030363Z" level=error msg="Failed to destroy network for sandbox \"c5a7a51de0db103b1e8498594fbe9c552c1c5356287bf859d6c0f88b7d13cefa\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 12 17:34:11.404413 containerd[1494]: time="2025-09-12T17:34:11.403323351Z" level=error msg="encountered an error cleaning up failed sandbox \"c5a7a51de0db103b1e8498594fbe9c552c1c5356287bf859d6c0f88b7d13cefa\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 12 17:34:11.404413 containerd[1494]: time="2025-09-12T17:34:11.403358447Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-szrkp,Uid:cbad7146-1812-4262-a0a7-fad3ba169a40,Namespace:kube-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"c5a7a51de0db103b1e8498594fbe9c552c1c5356287bf859d6c0f88b7d13cefa\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 12 17:34:11.404524 kubelet[2561]: E0912 17:34:11.403228 2561 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8839e8ab9f8eb8f13c11b60ca60e7bc34bd56fdca9a761fac69e876e635d1405\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 12 17:34:11.404524 kubelet[2561]: E0912 17:34:11.403304 2561 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8839e8ab9f8eb8f13c11b60ca60e7bc34bd56fdca9a761fac69e876e635d1405\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-67c45c5486-4zft2"
Sep 12 17:34:11.404524 kubelet[2561]: E0912 17:34:11.403324 2561 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8839e8ab9f8eb8f13c11b60ca60e7bc34bd56fdca9a761fac69e876e635d1405\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-67c45c5486-4zft2"
Sep 12 17:34:11.404613 kubelet[2561]: E0912 17:34:11.403365 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-67c45c5486-4zft2_calico-system(55ae9bf9-0ab6-42c6-be46-b863c939859b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-67c45c5486-4zft2_calico-system(55ae9bf9-0ab6-42c6-be46-b863c939859b)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"8839e8ab9f8eb8f13c11b60ca60e7bc34bd56fdca9a761fac69e876e635d1405\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-67c45c5486-4zft2" podUID="55ae9bf9-0ab6-42c6-be46-b863c939859b"
Sep 12 17:34:11.408927 kubelet[2561]: E0912 17:34:11.408906 2561 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c5a7a51de0db103b1e8498594fbe9c552c1c5356287bf859d6c0f88b7d13cefa\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 12 17:34:11.409091 kubelet[2561]: E0912 17:34:11.409076 2561 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c5a7a51de0db103b1e8498594fbe9c552c1c5356287bf859d6c0f88b7d13cefa\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-szrkp"
Sep 12 17:34:11.409482 kubelet[2561]: E0912 17:34:11.409256 2561 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c5a7a51de0db103b1e8498594fbe9c552c1c5356287bf859d6c0f88b7d13cefa\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-szrkp"
Sep 12 17:34:11.409482 kubelet[2561]: E0912 17:34:11.409300 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-668d6bf9bc-szrkp_kube-system(cbad7146-1812-4262-a0a7-fad3ba169a40)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-668d6bf9bc-szrkp_kube-system(cbad7146-1812-4262-a0a7-fad3ba169a40)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"c5a7a51de0db103b1e8498594fbe9c552c1c5356287bf859d6c0f88b7d13cefa\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-szrkp" podUID="cbad7146-1812-4262-a0a7-fad3ba169a40"
Sep 12 17:34:11.425433 containerd[1494]: time="2025-09-12T17:34:11.425324867Z" level=error msg="Failed to destroy network for sandbox \"8d3c00ad0af03fd362d9677b923aa1969166be62afd063bd0b73f7ecf9b0feaa\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 12 17:34:11.425734 containerd[1494]: time="2025-09-12T17:34:11.425711852Z" level=error msg="encountered an error cleaning up failed sandbox \"8d3c00ad0af03fd362d9677b923aa1969166be62afd063bd0b73f7ecf9b0feaa\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 12 17:34:11.425903 containerd[1494]: time="2025-09-12T17:34:11.425883689Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-599b568c97-s5dwl,Uid:1a745a4f-a77e-4660-9ac5-801112a8b773,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"8d3c00ad0af03fd362d9677b923aa1969166be62afd063bd0b73f7ecf9b0feaa\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 12 17:34:11.426386 containerd[1494]: time="2025-09-12T17:34:11.425835067Z" level=error msg="Failed to destroy network for sandbox \"fda3f332aa630c70d7cfbd1b402fa296a294f6bf502a4c10eb91d19b97d978e3\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 12 17:34:11.426612 containerd[1494]: time="2025-09-12T17:34:11.426591856Z" level=error msg="encountered an error cleaning up failed sandbox \"fda3f332aa630c70d7cfbd1b402fa296a294f6bf502a4c10eb91d19b97d978e3\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 12 17:34:11.426743 containerd[1494]: time="2025-09-12T17:34:11.426678079Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-fsx7j,Uid:53bbc8dc-3621-4d89-be2b-ae2eb974a294,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"fda3f332aa630c70d7cfbd1b402fa296a294f6bf502a4c10eb91d19b97d978e3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 12 17:34:11.427266 kubelet[2561]: E0912 17:34:11.426899 2561 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"fda3f332aa630c70d7cfbd1b402fa296a294f6bf502a4c10eb91d19b97d978e3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 12 17:34:11.427266 kubelet[2561]: E0912 17:34:11.426949 2561 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"fda3f332aa630c70d7cfbd1b402fa296a294f6bf502a4c10eb91d19b97d978e3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-fsx7j"
Sep 12 17:34:11.427266 kubelet[2561]: E0912 17:34:11.426965 2561 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"fda3f332aa630c70d7cfbd1b402fa296a294f6bf502a4c10eb91d19b97d978e3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-fsx7j"
Sep 12 17:34:11.427353 kubelet[2561]: E0912 17:34:11.426995 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-fsx7j_calico-system(53bbc8dc-3621-4d89-be2b-ae2eb974a294)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-fsx7j_calico-system(53bbc8dc-3621-4d89-be2b-ae2eb974a294)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"fda3f332aa630c70d7cfbd1b402fa296a294f6bf502a4c10eb91d19b97d978e3\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-fsx7j" podUID="53bbc8dc-3621-4d89-be2b-ae2eb974a294"
Sep 12 17:34:11.428696 kubelet[2561]: E0912 17:34:11.428537 2561 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8d3c00ad0af03fd362d9677b923aa1969166be62afd063bd0b73f7ecf9b0feaa\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 12 17:34:11.428696 kubelet[2561]: E0912 17:34:11.428586 2561 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8d3c00ad0af03fd362d9677b923aa1969166be62afd063bd0b73f7ecf9b0feaa\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-599b568c97-s5dwl"
Sep 12 17:34:11.428696 kubelet[2561]: E0912 17:34:11.428602 2561 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8d3c00ad0af03fd362d9677b923aa1969166be62afd063bd0b73f7ecf9b0feaa\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-599b568c97-s5dwl"
Sep 12 17:34:11.429130 kubelet[2561]: E0912 17:34:11.428633 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-599b568c97-s5dwl_calico-system(1a745a4f-a77e-4660-9ac5-801112a8b773)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-599b568c97-s5dwl_calico-system(1a745a4f-a77e-4660-9ac5-801112a8b773)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"8d3c00ad0af03fd362d9677b923aa1969166be62afd063bd0b73f7ecf9b0feaa\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-599b568c97-s5dwl" podUID="1a745a4f-a77e-4660-9ac5-801112a8b773"
Sep 12 17:34:11.437268 containerd[1494]: time="2025-09-12T17:34:11.437214429Z" level=error msg="Failed to destroy network for sandbox \"3dcba8787d31ee60c778e4fcb2630c4594e8b38a67e6d502bba2326930192848\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 12 17:34:11.437584 containerd[1494]: time="2025-09-12T17:34:11.437541582Z" level=error msg="encountered an error cleaning up failed sandbox \"3dcba8787d31ee60c778e4fcb2630c4594e8b38a67e6d502bba2326930192848\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 12 17:34:11.437625 containerd[1494]: time="2025-09-12T17:34:11.437602197Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-wnp7f,Uid:8d975815-4db9-4493-ae13-0fed34b78044,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"3dcba8787d31ee60c778e4fcb2630c4594e8b38a67e6d502bba2326930192848\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 12 17:34:11.437965 kubelet[2561]: E0912 17:34:11.437763 2561 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3dcba8787d31ee60c778e4fcb2630c4594e8b38a67e6d502bba2326930192848\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 12 17:34:11.437965 kubelet[2561]: E0912 17:34:11.437801 2561 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3dcba8787d31ee60c778e4fcb2630c4594e8b38a67e6d502bba2326930192848\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-54d579b49d-wnp7f"
Sep 12 17:34:11.437965 kubelet[2561]: E0912 17:34:11.437815 2561 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3dcba8787d31ee60c778e4fcb2630c4594e8b38a67e6d502bba2326930192848\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-54d579b49d-wnp7f"
Sep 12 17:34:11.438044 kubelet[2561]: E0912 17:34:11.437840 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-54d579b49d-wnp7f_calico-system(8d975815-4db9-4493-ae13-0fed34b78044)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-54d579b49d-wnp7f_calico-system(8d975815-4db9-4493-ae13-0fed34b78044)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"3dcba8787d31ee60c778e4fcb2630c4594e8b38a67e6d502bba2326930192848\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-54d579b49d-wnp7f" podUID="8d975815-4db9-4493-ae13-0fed34b78044"
Sep 12 17:34:11.446053 containerd[1494]: time="2025-09-12T17:34:11.446002386Z" level=error msg="Failed to destroy network for sandbox \"66452d647c7edebd5506ddc6e3bc28bf1104546ea2fa33230da48fad374cdb30\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 12 17:34:11.446284 containerd[1494]: time="2025-09-12T17:34:11.446250648Z" level=error msg="encountered an error cleaning up failed sandbox \"66452d647c7edebd5506ddc6e3bc28bf1104546ea2fa33230da48fad374cdb30\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 12 17:34:11.446336 containerd[1494]: time="2025-09-12T17:34:11.446300994Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-s4d7g,Uid:33f03454-7bb2-47fc-bcae-807921ce8ad3,Namespace:kube-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"66452d647c7edebd5506ddc6e3bc28bf1104546ea2fa33230da48fad374cdb30\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 12 17:34:11.446567 kubelet[2561]: E0912 17:34:11.446513 2561 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"66452d647c7edebd5506ddc6e3bc28bf1104546ea2fa33230da48fad374cdb30\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 12 17:34:11.446643 kubelet[2561]: E0912 17:34:11.446572 2561 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"66452d647c7edebd5506ddc6e3bc28bf1104546ea2fa33230da48fad374cdb30\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-s4d7g"
Sep 12 17:34:11.446643 kubelet[2561]: E0912 17:34:11.446591 2561 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"66452d647c7edebd5506ddc6e3bc28bf1104546ea2fa33230da48fad374cdb30\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-s4d7g"
Sep 12 17:34:11.446643 kubelet[2561]: E0912 17:34:11.446626 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-668d6bf9bc-s4d7g_kube-system(33f03454-7bb2-47fc-bcae-807921ce8ad3)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-668d6bf9bc-s4d7g_kube-system(33f03454-7bb2-47fc-bcae-807921ce8ad3)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"66452d647c7edebd5506ddc6e3bc28bf1104546ea2fa33230da48fad374cdb30\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-s4d7g" podUID="33f03454-7bb2-47fc-bcae-807921ce8ad3"
Sep 12 17:34:11.778079 containerd[1494]: time="2025-09-12T17:34:11.778035372Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6dd46897bd-bbjqp,Uid:9ee60034-1a31-4418-b743-0f97ab43bc92,Namespace:calico-apiserver,Attempt:0,}"
Sep 12 17:34:11.800610 containerd[1494]: time="2025-09-12T17:34:11.800329453Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6dd46897bd-htdld,Uid:0d943c3b-957f-40f9-b21f-ac657206aace,Namespace:calico-apiserver,Attempt:0,}"
Sep 12 17:34:11.842686 containerd[1494]: time="2025-09-12T17:34:11.842627495Z" level=error msg="Failed to destroy network for sandbox \"1e490ff9aeb9142a469899b9994e9e803af0a211a6813cd0269e8e566e92a5b6\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 12 17:34:11.844401 containerd[1494]: time="2025-09-12T17:34:11.842950700Z" level=error msg="encountered an error cleaning up failed sandbox \"1e490ff9aeb9142a469899b9994e9e803af0a211a6813cd0269e8e566e92a5b6\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 12 17:34:11.844401 containerd[1494]: time="2025-09-12T17:34:11.842998740Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6dd46897bd-bbjqp,Uid:9ee60034-1a31-4418-b743-0f97ab43bc92,Namespace:calico-apiserver,Attempt:0,} failed, error" error="failed to setup network for sandbox \"1e490ff9aeb9142a469899b9994e9e803af0a211a6813cd0269e8e566e92a5b6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 12 17:34:11.844639 kubelet[2561]: E0912 17:34:11.843278 2561 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1e490ff9aeb9142a469899b9994e9e803af0a211a6813cd0269e8e566e92a5b6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 12 17:34:11.844639 kubelet[2561]: E0912 17:34:11.843329 2561 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1e490ff9aeb9142a469899b9994e9e803af0a211a6813cd0269e8e566e92a5b6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-6dd46897bd-bbjqp"
Sep 12 17:34:11.844639 kubelet[2561]: E0912 17:34:11.843352 2561 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1e490ff9aeb9142a469899b9994e9e803af0a211a6813cd0269e8e566e92a5b6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-6dd46897bd-bbjqp"
Sep 12 17:34:11.845140 kubelet[2561]: E0912 17:34:11.843390 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-6dd46897bd-bbjqp_calico-apiserver(9ee60034-1a31-4418-b743-0f97ab43bc92)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-6dd46897bd-bbjqp_calico-apiserver(9ee60034-1a31-4418-b743-0f97ab43bc92)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"1e490ff9aeb9142a469899b9994e9e803af0a211a6813cd0269e8e566e92a5b6\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-6dd46897bd-bbjqp" podUID="9ee60034-1a31-4418-b743-0f97ab43bc92"
Sep 12 17:34:11.886969 containerd[1494]: time="2025-09-12T17:34:11.886818104Z" level=error msg="Failed to destroy network for sandbox \"f5db53b3536194d1bf8beb85b1555599a7f618239326baeb59bc23059b5d3697\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 12 17:34:11.887558 containerd[1494]: time="2025-09-12T17:34:11.887492165Z" level=error msg="encountered an error cleaning up failed sandbox \"f5db53b3536194d1bf8beb85b1555599a7f618239326baeb59bc23059b5d3697\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 12 17:34:11.887765 containerd[1494]: time="2025-09-12T17:34:11.887577437Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6dd46897bd-htdld,Uid:0d943c3b-957f-40f9-b21f-ac657206aace,Namespace:calico-apiserver,Attempt:0,} failed, error" error="failed to setup network for sandbox \"f5db53b3536194d1bf8beb85b1555599a7f618239326baeb59bc23059b5d3697\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 12 17:34:11.888021 kubelet[2561]: E0912 17:34:11.887940 2561 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f5db53b3536194d1bf8beb85b1555599a7f618239326baeb59bc23059b5d3697\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 12 17:34:11.888021 kubelet[2561]: E0912 17:34:11.888005 2561 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f5db53b3536194d1bf8beb85b1555599a7f618239326baeb59bc23059b5d3697\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-6dd46897bd-htdld"
Sep 12 17:34:11.888186 kubelet[2561]: E0912 17:34:11.888027 2561 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f5db53b3536194d1bf8beb85b1555599a7f618239326baeb59bc23059b5d3697\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-6dd46897bd-htdld"
Sep 12 17:34:11.888186 kubelet[2561]: E0912 17:34:11.888080 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-6dd46897bd-htdld_calico-apiserver(0d943c3b-957f-40f9-b21f-ac657206aace)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-6dd46897bd-htdld_calico-apiserver(0d943c3b-957f-40f9-b21f-ac657206aace)\\\": rpc error: code =
Unknown desc = failed to setup network for sandbox \\\"f5db53b3536194d1bf8beb85b1555599a7f618239326baeb59bc23059b5d3697\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-6dd46897bd-htdld" podUID="0d943c3b-957f-40f9-b21f-ac657206aace" Sep 12 17:34:12.073095 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-8839e8ab9f8eb8f13c11b60ca60e7bc34bd56fdca9a761fac69e876e635d1405-shm.mount: Deactivated successfully. Sep 12 17:34:12.073197 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-fda3f332aa630c70d7cfbd1b402fa296a294f6bf502a4c10eb91d19b97d978e3-shm.mount: Deactivated successfully. Sep 12 17:34:12.237589 kubelet[2561]: I0912 17:34:12.237507 2561 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1e490ff9aeb9142a469899b9994e9e803af0a211a6813cd0269e8e566e92a5b6" Sep 12 17:34:12.243051 kubelet[2561]: I0912 17:34:12.243009 2561 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8d3c00ad0af03fd362d9677b923aa1969166be62afd063bd0b73f7ecf9b0feaa" Sep 12 17:34:12.264912 kubelet[2561]: I0912 17:34:12.264343 2561 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c5a7a51de0db103b1e8498594fbe9c552c1c5356287bf859d6c0f88b7d13cefa" Sep 12 17:34:12.267491 kubelet[2561]: I0912 17:34:12.267429 2561 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="66452d647c7edebd5506ddc6e3bc28bf1104546ea2fa33230da48fad374cdb30" Sep 12 17:34:12.271372 kubelet[2561]: I0912 17:34:12.271323 2561 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f5db53b3536194d1bf8beb85b1555599a7f618239326baeb59bc23059b5d3697" Sep 12 17:34:12.276381 kubelet[2561]: I0912 17:34:12.275053 2561 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3dcba8787d31ee60c778e4fcb2630c4594e8b38a67e6d502bba2326930192848" Sep 12 17:34:12.277941 kubelet[2561]: I0912 17:34:12.277914 2561 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8839e8ab9f8eb8f13c11b60ca60e7bc34bd56fdca9a761fac69e876e635d1405" Sep 12 17:34:12.281188 kubelet[2561]: I0912 17:34:12.281149 2561 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fda3f332aa630c70d7cfbd1b402fa296a294f6bf502a4c10eb91d19b97d978e3" Sep 12 17:34:12.297254 containerd[1494]: time="2025-09-12T17:34:12.297154207Z" level=info msg="StopPodSandbox for \"c5a7a51de0db103b1e8498594fbe9c552c1c5356287bf859d6c0f88b7d13cefa\"" Sep 12 17:34:12.298984 containerd[1494]: time="2025-09-12T17:34:12.298318178Z" level=info msg="StopPodSandbox for \"fda3f332aa630c70d7cfbd1b402fa296a294f6bf502a4c10eb91d19b97d978e3\"" Sep 12 17:34:12.302042 containerd[1494]: time="2025-09-12T17:34:12.301954686Z" level=info msg="Ensure that sandbox fda3f332aa630c70d7cfbd1b402fa296a294f6bf502a4c10eb91d19b97d978e3 in task-service has been cleanup successfully" Sep 12 17:34:12.303666 containerd[1494]: time="2025-09-12T17:34:12.303119928Z" level=info msg="StopPodSandbox for \"8d3c00ad0af03fd362d9677b923aa1969166be62afd063bd0b73f7ecf9b0feaa\"" Sep 12 17:34:12.303666 containerd[1494]: time="2025-09-12T17:34:12.303361066Z" level=info msg="Ensure that sandbox 8d3c00ad0af03fd362d9677b923aa1969166be62afd063bd0b73f7ecf9b0feaa in task-service has been cleanup successfully" Sep 12 17:34:12.304669 containerd[1494]: 
time="2025-09-12T17:34:12.304613676Z" level=info msg="StopPodSandbox for \"1e490ff9aeb9142a469899b9994e9e803af0a211a6813cd0269e8e566e92a5b6\"" Sep 12 17:34:12.305129 containerd[1494]: time="2025-09-12T17:34:12.304822633Z" level=info msg="Ensure that sandbox c5a7a51de0db103b1e8498594fbe9c552c1c5356287bf859d6c0f88b7d13cefa in task-service has been cleanup successfully" Sep 12 17:34:12.313276 containerd[1494]: time="2025-09-12T17:34:12.304890431Z" level=info msg="StopPodSandbox for \"f5db53b3536194d1bf8beb85b1555599a7f618239326baeb59bc23059b5d3697\"" Sep 12 17:34:12.313276 containerd[1494]: time="2025-09-12T17:34:12.304918855Z" level=info msg="StopPodSandbox for \"66452d647c7edebd5506ddc6e3bc28bf1104546ea2fa33230da48fad374cdb30\"" Sep 12 17:34:12.313276 containerd[1494]: time="2025-09-12T17:34:12.304949954Z" level=info msg="StopPodSandbox for \"3dcba8787d31ee60c778e4fcb2630c4594e8b38a67e6d502bba2326930192848\"" Sep 12 17:34:12.313276 containerd[1494]: time="2025-09-12T17:34:12.304975464Z" level=info msg="StopPodSandbox for \"8839e8ab9f8eb8f13c11b60ca60e7bc34bd56fdca9a761fac69e876e635d1405\"" Sep 12 17:34:12.314651 containerd[1494]: time="2025-09-12T17:34:12.314324321Z" level=info msg="Ensure that sandbox 1e490ff9aeb9142a469899b9994e9e803af0a211a6813cd0269e8e566e92a5b6 in task-service has been cleanup successfully" Sep 12 17:34:12.314985 containerd[1494]: time="2025-09-12T17:34:12.314923128Z" level=info msg="Ensure that sandbox f5db53b3536194d1bf8beb85b1555599a7f618239326baeb59bc23059b5d3697 in task-service has been cleanup successfully" Sep 12 17:34:12.316755 containerd[1494]: time="2025-09-12T17:34:12.315166921Z" level=info msg="Ensure that sandbox 66452d647c7edebd5506ddc6e3bc28bf1104546ea2fa33230da48fad374cdb30 in task-service has been cleanup successfully" Sep 12 17:34:12.318207 containerd[1494]: time="2025-09-12T17:34:12.316528417Z" level=info msg="Ensure that sandbox 3dcba8787d31ee60c778e4fcb2630c4594e8b38a67e6d502bba2326930192848 in task-service has been cleanup successfully" Sep 12 17:34:12.324297 containerd[1494]: time="2025-09-12T17:34:12.316669085Z" level=info msg="Ensure that sandbox 8839e8ab9f8eb8f13c11b60ca60e7bc34bd56fdca9a761fac69e876e635d1405 in task-service has been cleanup successfully" Sep 12 17:34:12.413545 containerd[1494]: time="2025-09-12T17:34:12.413480241Z" level=error msg="StopPodSandbox for \"8d3c00ad0af03fd362d9677b923aa1969166be62afd063bd0b73f7ecf9b0feaa\" failed" error="failed to destroy network for sandbox \"8d3c00ad0af03fd362d9677b923aa1969166be62afd063bd0b73f7ecf9b0feaa\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:34:12.413933 kubelet[2561]: E0912 17:34:12.413735 2561 log.go:32] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"8d3c00ad0af03fd362d9677b923aa1969166be62afd063bd0b73f7ecf9b0feaa\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="8d3c00ad0af03fd362d9677b923aa1969166be62afd063bd0b73f7ecf9b0feaa" Sep 12 17:34:12.418448 kubelet[2561]: E0912 17:34:12.413806 2561 kuberuntime_manager.go:1546] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"8d3c00ad0af03fd362d9677b923aa1969166be62afd063bd0b73f7ecf9b0feaa"} Sep 12 17:34:12.418517 kubelet[2561]: E0912 17:34:12.418468 2561 
kuberuntime_manager.go:1146] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"1a745a4f-a77e-4660-9ac5-801112a8b773\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d3c00ad0af03fd362d9677b923aa1969166be62afd063bd0b73f7ecf9b0feaa\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Sep 12 17:34:12.418517 kubelet[2561]: E0912 17:34:12.418493 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"1a745a4f-a77e-4660-9ac5-801112a8b773\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d3c00ad0af03fd362d9677b923aa1969166be62afd063bd0b73f7ecf9b0feaa\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-599b568c97-s5dwl" podUID="1a745a4f-a77e-4660-9ac5-801112a8b773" Sep 12 17:34:12.424304 containerd[1494]: time="2025-09-12T17:34:12.423617656Z" level=error msg="StopPodSandbox for \"fda3f332aa630c70d7cfbd1b402fa296a294f6bf502a4c10eb91d19b97d978e3\" failed" error="failed to destroy network for sandbox \"fda3f332aa630c70d7cfbd1b402fa296a294f6bf502a4c10eb91d19b97d978e3\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:34:12.424363 kubelet[2561]: E0912 17:34:12.423783 2561 log.go:32] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"fda3f332aa630c70d7cfbd1b402fa296a294f6bf502a4c10eb91d19b97d978e3\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="fda3f332aa630c70d7cfbd1b402fa296a294f6bf502a4c10eb91d19b97d978e3" Sep 12 17:34:12.424363 kubelet[2561]: E0912 17:34:12.423815 2561 kuberuntime_manager.go:1546] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"fda3f332aa630c70d7cfbd1b402fa296a294f6bf502a4c10eb91d19b97d978e3"} Sep 12 17:34:12.424363 kubelet[2561]: E0912 17:34:12.423839 2561 kuberuntime_manager.go:1146] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"53bbc8dc-3621-4d89-be2b-ae2eb974a294\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"fda3f332aa630c70d7cfbd1b402fa296a294f6bf502a4c10eb91d19b97d978e3\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Sep 12 17:34:12.424363 kubelet[2561]: E0912 17:34:12.423868 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"53bbc8dc-3621-4d89-be2b-ae2eb974a294\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"fda3f332aa630c70d7cfbd1b402fa296a294f6bf502a4c10eb91d19b97d978e3\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" 
pod="calico-system/csi-node-driver-fsx7j" podUID="53bbc8dc-3621-4d89-be2b-ae2eb974a294" Sep 12 17:34:12.426605 containerd[1494]: time="2025-09-12T17:34:12.426389110Z" level=error msg="StopPodSandbox for \"c5a7a51de0db103b1e8498594fbe9c552c1c5356287bf859d6c0f88b7d13cefa\" failed" error="failed to destroy network for sandbox \"c5a7a51de0db103b1e8498594fbe9c552c1c5356287bf859d6c0f88b7d13cefa\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:34:12.426839 kubelet[2561]: E0912 17:34:12.426811 2561 log.go:32] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"c5a7a51de0db103b1e8498594fbe9c552c1c5356287bf859d6c0f88b7d13cefa\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="c5a7a51de0db103b1e8498594fbe9c552c1c5356287bf859d6c0f88b7d13cefa" Sep 12 17:34:12.426927 kubelet[2561]: E0912 17:34:12.426845 2561 kuberuntime_manager.go:1546] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"c5a7a51de0db103b1e8498594fbe9c552c1c5356287bf859d6c0f88b7d13cefa"} Sep 12 17:34:12.426927 kubelet[2561]: E0912 17:34:12.426869 2561 kuberuntime_manager.go:1146] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"cbad7146-1812-4262-a0a7-fad3ba169a40\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"c5a7a51de0db103b1e8498594fbe9c552c1c5356287bf859d6c0f88b7d13cefa\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Sep 12 17:34:12.426927 kubelet[2561]: E0912 17:34:12.426886 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"cbad7146-1812-4262-a0a7-fad3ba169a40\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"c5a7a51de0db103b1e8498594fbe9c552c1c5356287bf859d6c0f88b7d13cefa\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-szrkp" podUID="cbad7146-1812-4262-a0a7-fad3ba169a40" Sep 12 17:34:12.429291 containerd[1494]: time="2025-09-12T17:34:12.429214658Z" level=error msg="StopPodSandbox for \"f5db53b3536194d1bf8beb85b1555599a7f618239326baeb59bc23059b5d3697\" failed" error="failed to destroy network for sandbox \"f5db53b3536194d1bf8beb85b1555599a7f618239326baeb59bc23059b5d3697\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:34:12.429557 kubelet[2561]: E0912 17:34:12.429537 2561 log.go:32] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"f5db53b3536194d1bf8beb85b1555599a7f618239326baeb59bc23059b5d3697\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="f5db53b3536194d1bf8beb85b1555599a7f618239326baeb59bc23059b5d3697" Sep 12 
17:34:12.429650 kubelet[2561]: E0912 17:34:12.429636 2561 kuberuntime_manager.go:1546] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"f5db53b3536194d1bf8beb85b1555599a7f618239326baeb59bc23059b5d3697"} Sep 12 17:34:12.429713 kubelet[2561]: E0912 17:34:12.429703 2561 kuberuntime_manager.go:1146] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"0d943c3b-957f-40f9-b21f-ac657206aace\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"f5db53b3536194d1bf8beb85b1555599a7f618239326baeb59bc23059b5d3697\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Sep 12 17:34:12.430158 kubelet[2561]: E0912 17:34:12.429786 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"0d943c3b-957f-40f9-b21f-ac657206aace\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"f5db53b3536194d1bf8beb85b1555599a7f618239326baeb59bc23059b5d3697\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-6dd46897bd-htdld" podUID="0d943c3b-957f-40f9-b21f-ac657206aace" Sep 12 17:34:12.430158 kubelet[2561]: E0912 17:34:12.430042 2561 log.go:32] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"1e490ff9aeb9142a469899b9994e9e803af0a211a6813cd0269e8e566e92a5b6\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="1e490ff9aeb9142a469899b9994e9e803af0a211a6813cd0269e8e566e92a5b6" Sep 12 17:34:12.430158 kubelet[2561]: E0912 17:34:12.430070 2561 kuberuntime_manager.go:1546] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"1e490ff9aeb9142a469899b9994e9e803af0a211a6813cd0269e8e566e92a5b6"} Sep 12 17:34:12.430158 kubelet[2561]: E0912 17:34:12.430090 2561 kuberuntime_manager.go:1146] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"9ee60034-1a31-4418-b743-0f97ab43bc92\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1e490ff9aeb9142a469899b9994e9e803af0a211a6813cd0269e8e566e92a5b6\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Sep 12 17:34:12.430308 containerd[1494]: time="2025-09-12T17:34:12.429888087Z" level=error msg="StopPodSandbox for \"1e490ff9aeb9142a469899b9994e9e803af0a211a6813cd0269e8e566e92a5b6\" failed" error="failed to destroy network for sandbox \"1e490ff9aeb9142a469899b9994e9e803af0a211a6813cd0269e8e566e92a5b6\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:34:12.430334 kubelet[2561]: E0912 17:34:12.430105 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"9ee60034-1a31-4418-b743-0f97ab43bc92\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox 
\\\"1e490ff9aeb9142a469899b9994e9e803af0a211a6813cd0269e8e566e92a5b6\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-6dd46897bd-bbjqp" podUID="9ee60034-1a31-4418-b743-0f97ab43bc92" Sep 12 17:34:12.431498 containerd[1494]: time="2025-09-12T17:34:12.431374391Z" level=error msg="StopPodSandbox for \"66452d647c7edebd5506ddc6e3bc28bf1104546ea2fa33230da48fad374cdb30\" failed" error="failed to destroy network for sandbox \"66452d647c7edebd5506ddc6e3bc28bf1104546ea2fa33230da48fad374cdb30\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:34:12.431646 kubelet[2561]: E0912 17:34:12.431612 2561 log.go:32] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"66452d647c7edebd5506ddc6e3bc28bf1104546ea2fa33230da48fad374cdb30\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="66452d647c7edebd5506ddc6e3bc28bf1104546ea2fa33230da48fad374cdb30" Sep 12 17:34:12.431646 kubelet[2561]: E0912 17:34:12.431652 2561 kuberuntime_manager.go:1546] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"66452d647c7edebd5506ddc6e3bc28bf1104546ea2fa33230da48fad374cdb30"} Sep 12 17:34:12.431723 kubelet[2561]: E0912 17:34:12.431673 2561 kuberuntime_manager.go:1146] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"33f03454-7bb2-47fc-bcae-807921ce8ad3\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"66452d647c7edebd5506ddc6e3bc28bf1104546ea2fa33230da48fad374cdb30\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Sep 12 17:34:12.431723 kubelet[2561]: E0912 17:34:12.431689 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"33f03454-7bb2-47fc-bcae-807921ce8ad3\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"66452d647c7edebd5506ddc6e3bc28bf1104546ea2fa33230da48fad374cdb30\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-s4d7g" podUID="33f03454-7bb2-47fc-bcae-807921ce8ad3" Sep 12 17:34:12.436117 containerd[1494]: time="2025-09-12T17:34:12.436079888Z" level=error msg="StopPodSandbox for \"8839e8ab9f8eb8f13c11b60ca60e7bc34bd56fdca9a761fac69e876e635d1405\" failed" error="failed to destroy network for sandbox \"8839e8ab9f8eb8f13c11b60ca60e7bc34bd56fdca9a761fac69e876e635d1405\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:34:12.436258 kubelet[2561]: E0912 17:34:12.436211 2561 log.go:32] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox 
\"8839e8ab9f8eb8f13c11b60ca60e7bc34bd56fdca9a761fac69e876e635d1405\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="8839e8ab9f8eb8f13c11b60ca60e7bc34bd56fdca9a761fac69e876e635d1405" Sep 12 17:34:12.436314 kubelet[2561]: E0912 17:34:12.436261 2561 kuberuntime_manager.go:1546] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"8839e8ab9f8eb8f13c11b60ca60e7bc34bd56fdca9a761fac69e876e635d1405"} Sep 12 17:34:12.436314 kubelet[2561]: E0912 17:34:12.436281 2561 kuberuntime_manager.go:1146] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"55ae9bf9-0ab6-42c6-be46-b863c939859b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8839e8ab9f8eb8f13c11b60ca60e7bc34bd56fdca9a761fac69e876e635d1405\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Sep 12 17:34:12.436314 kubelet[2561]: E0912 17:34:12.436297 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"55ae9bf9-0ab6-42c6-be46-b863c939859b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8839e8ab9f8eb8f13c11b60ca60e7bc34bd56fdca9a761fac69e876e635d1405\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-67c45c5486-4zft2" podUID="55ae9bf9-0ab6-42c6-be46-b863c939859b" Sep 12 17:34:12.437992 containerd[1494]: time="2025-09-12T17:34:12.437956182Z" level=error msg="StopPodSandbox for \"3dcba8787d31ee60c778e4fcb2630c4594e8b38a67e6d502bba2326930192848\" failed" error="failed to destroy network for sandbox \"3dcba8787d31ee60c778e4fcb2630c4594e8b38a67e6d502bba2326930192848\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:34:12.438088 kubelet[2561]: E0912 17:34:12.438060 2561 log.go:32] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"3dcba8787d31ee60c778e4fcb2630c4594e8b38a67e6d502bba2326930192848\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="3dcba8787d31ee60c778e4fcb2630c4594e8b38a67e6d502bba2326930192848" Sep 12 17:34:12.438134 kubelet[2561]: E0912 17:34:12.438089 2561 kuberuntime_manager.go:1546] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"3dcba8787d31ee60c778e4fcb2630c4594e8b38a67e6d502bba2326930192848"} Sep 12 17:34:12.438156 kubelet[2561]: E0912 17:34:12.438122 2561 kuberuntime_manager.go:1146] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"8d975815-4db9-4493-ae13-0fed34b78044\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"3dcba8787d31ee60c778e4fcb2630c4594e8b38a67e6d502bba2326930192848\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted 
/var/lib/calico/\"" Sep 12 17:34:12.438197 kubelet[2561]: E0912 17:34:12.438152 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"8d975815-4db9-4493-ae13-0fed34b78044\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"3dcba8787d31ee60c778e4fcb2630c4594e8b38a67e6d502bba2326930192848\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-54d579b49d-wnp7f" podUID="8d975815-4db9-4493-ae13-0fed34b78044" Sep 12 17:34:18.701232 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1629976762.mount: Deactivated successfully. Sep 12 17:34:18.801640 containerd[1494]: time="2025-09-12T17:34:18.789570132Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.30.3: active requests=0, bytes read=157078339" Sep 12 17:34:18.808912 containerd[1494]: time="2025-09-12T17:34:18.808865351Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:34:18.811367 containerd[1494]: time="2025-09-12T17:34:18.810811773Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.30.3\" with image id \"sha256:ce9c4ac0f175f22c56e80844e65379d9ebe1d8a4e2bbb38dc1db0f53a8826f0f\", repo tag \"ghcr.io/flatcar/calico/node:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/node@sha256:bcb8146fcaeced1e1c88fad3eaa697f1680746bd23c3e7e8d4535bc484c6f2a1\", size \"157078201\" in 7.564098988s" Sep 12 17:34:18.811481 containerd[1494]: time="2025-09-12T17:34:18.811461953Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.3\" returns image reference \"sha256:ce9c4ac0f175f22c56e80844e65379d9ebe1d8a4e2bbb38dc1db0f53a8826f0f\"" Sep 12 17:34:18.846811 containerd[1494]: time="2025-09-12T17:34:18.846679812Z" level=info msg="ImageCreate event name:\"sha256:ce9c4ac0f175f22c56e80844e65379d9ebe1d8a4e2bbb38dc1db0f53a8826f0f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:34:18.847178 containerd[1494]: time="2025-09-12T17:34:18.847074650Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:bcb8146fcaeced1e1c88fad3eaa697f1680746bd23c3e7e8d4535bc484c6f2a1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:34:18.874857 containerd[1494]: time="2025-09-12T17:34:18.874829270Z" level=info msg="CreateContainer within sandbox \"ca7c90a17f9ecf47ab769b74bab16df3cb9ef935b94714f76e5beff7e8e4cbfe\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Sep 12 17:34:18.927761 containerd[1494]: time="2025-09-12T17:34:18.927528798Z" level=info msg="CreateContainer within sandbox \"ca7c90a17f9ecf47ab769b74bab16df3cb9ef935b94714f76e5beff7e8e4cbfe\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"286a299eacff7870608a7cd93d1c8ab356a3c3014097a853184039ec28e19e94\"" Sep 12 17:34:18.933094 containerd[1494]: time="2025-09-12T17:34:18.932965642Z" level=info msg="StartContainer for \"286a299eacff7870608a7cd93d1c8ab356a3c3014097a853184039ec28e19e94\"" Sep 12 17:34:19.053342 systemd[1]: Started cri-containerd-286a299eacff7870608a7cd93d1c8ab356a3c3014097a853184039ec28e19e94.scope - libcontainer container 286a299eacff7870608a7cd93d1c8ab356a3c3014097a853184039ec28e19e94. 
Sep 12 17:34:19.085127 containerd[1494]: time="2025-09-12T17:34:19.084470897Z" level=info msg="StartContainer for \"286a299eacff7870608a7cd93d1c8ab356a3c3014097a853184039ec28e19e94\" returns successfully" Sep 12 17:34:19.166644 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Sep 12 17:34:19.168488 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. Sep 12 17:34:19.320347 containerd[1494]: time="2025-09-12T17:34:19.320142667Z" level=info msg="StopPodSandbox for \"8839e8ab9f8eb8f13c11b60ca60e7bc34bd56fdca9a761fac69e876e635d1405\"" Sep 12 17:34:19.465104 kubelet[2561]: I0912 17:34:19.454989 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-vsntk" podStartSLOduration=1.122048944 podStartE2EDuration="18.445590427s" podCreationTimestamp="2025-09-12 17:34:01 +0000 UTC" firstStartedPulling="2025-09-12 17:34:01.524068789 +0000 UTC m=+17.551483869" lastFinishedPulling="2025-09-12 17:34:18.847610272 +0000 UTC m=+34.875025352" observedRunningTime="2025-09-12 17:34:19.351408989 +0000 UTC m=+35.378824079" watchObservedRunningTime="2025-09-12 17:34:19.445590427 +0000 UTC m=+35.473005507" Sep 12 17:34:19.634160 containerd[1494]: 2025-09-12 17:34:19.443 [INFO][3798] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="8839e8ab9f8eb8f13c11b60ca60e7bc34bd56fdca9a761fac69e876e635d1405" Sep 12 17:34:19.634160 containerd[1494]: 2025-09-12 17:34:19.444 [INFO][3798] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="8839e8ab9f8eb8f13c11b60ca60e7bc34bd56fdca9a761fac69e876e635d1405" iface="eth0" netns="/var/run/netns/cni-17a12a44-804c-9aaf-f684-8dec3e8b273d" Sep 12 17:34:19.634160 containerd[1494]: 2025-09-12 17:34:19.445 [INFO][3798] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="8839e8ab9f8eb8f13c11b60ca60e7bc34bd56fdca9a761fac69e876e635d1405" iface="eth0" netns="/var/run/netns/cni-17a12a44-804c-9aaf-f684-8dec3e8b273d" Sep 12 17:34:19.634160 containerd[1494]: 2025-09-12 17:34:19.446 [INFO][3798] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. ContainerID="8839e8ab9f8eb8f13c11b60ca60e7bc34bd56fdca9a761fac69e876e635d1405" iface="eth0" netns="/var/run/netns/cni-17a12a44-804c-9aaf-f684-8dec3e8b273d" Sep 12 17:34:19.634160 containerd[1494]: 2025-09-12 17:34:19.446 [INFO][3798] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="8839e8ab9f8eb8f13c11b60ca60e7bc34bd56fdca9a761fac69e876e635d1405" Sep 12 17:34:19.634160 containerd[1494]: 2025-09-12 17:34:19.446 [INFO][3798] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="8839e8ab9f8eb8f13c11b60ca60e7bc34bd56fdca9a761fac69e876e635d1405" Sep 12 17:34:19.634160 containerd[1494]: 2025-09-12 17:34:19.610 [INFO][3822] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="8839e8ab9f8eb8f13c11b60ca60e7bc34bd56fdca9a761fac69e876e635d1405" HandleID="k8s-pod-network.8839e8ab9f8eb8f13c11b60ca60e7bc34bd56fdca9a761fac69e876e635d1405" Workload="ci--4081--3--6--f--c182586e87-k8s-whisker--67c45c5486--4zft2-eth0" Sep 12 17:34:19.634160 containerd[1494]: 2025-09-12 17:34:19.612 [INFO][3822] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 17:34:19.634160 containerd[1494]: 2025-09-12 17:34:19.612 [INFO][3822] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
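
The pod_startup_latency_tracker entry above records calico-node-vsntk's timeline: pod created 17:34:01, image pull from 17:34:01.52 to 17:34:18.85, observed running 17:34:19.35. Redoing the subtraction with time.Parse (the layout below matches the timestamp format kubelet prints, which is Go's default time.Time string form) shows that nearly all of the ~18 s end-to-end startup was the node image pull:

    package main

    import (
        "fmt"
        "time"
    )

    const layout = "2006-01-02 15:04:05.999999999 -0700 MST"

    func mustParse(s string) time.Time {
        t, err := time.Parse(layout, s)
        if err != nil {
            panic(err)
        }
        return t
    }

    func main() {
        created := mustParse("2025-09-12 17:34:01 +0000 UTC")
        pullStart := mustParse("2025-09-12 17:34:01.524068789 +0000 UTC")
        pullEnd := mustParse("2025-09-12 17:34:18.847610272 +0000 UTC")
        running := mustParse("2025-09-12 17:34:19.351408989 +0000 UTC")

        fmt.Println("image pull:      ", pullEnd.Sub(pullStart)) // ~17.3s of the startup
        fmt.Println("created->running:", running.Sub(created))   // ~18.35s; kubelet's own
        // podStartE2EDuration ("18.445590427s") is measured against its watch
        // event timestamps instead, hence the slightly larger figure in the log.
    }
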
Sep 12 17:34:19.634160 containerd[1494]: 2025-09-12 17:34:19.627 [WARNING][3822] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="8839e8ab9f8eb8f13c11b60ca60e7bc34bd56fdca9a761fac69e876e635d1405" HandleID="k8s-pod-network.8839e8ab9f8eb8f13c11b60ca60e7bc34bd56fdca9a761fac69e876e635d1405" Workload="ci--4081--3--6--f--c182586e87-k8s-whisker--67c45c5486--4zft2-eth0" Sep 12 17:34:19.634160 containerd[1494]: 2025-09-12 17:34:19.627 [INFO][3822] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="8839e8ab9f8eb8f13c11b60ca60e7bc34bd56fdca9a761fac69e876e635d1405" HandleID="k8s-pod-network.8839e8ab9f8eb8f13c11b60ca60e7bc34bd56fdca9a761fac69e876e635d1405" Workload="ci--4081--3--6--f--c182586e87-k8s-whisker--67c45c5486--4zft2-eth0" Sep 12 17:34:19.634160 containerd[1494]: 2025-09-12 17:34:19.629 [INFO][3822] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 17:34:19.634160 containerd[1494]: 2025-09-12 17:34:19.632 [INFO][3798] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="8839e8ab9f8eb8f13c11b60ca60e7bc34bd56fdca9a761fac69e876e635d1405" Sep 12 17:34:19.640052 containerd[1494]: time="2025-09-12T17:34:19.639996480Z" level=info msg="TearDown network for sandbox \"8839e8ab9f8eb8f13c11b60ca60e7bc34bd56fdca9a761fac69e876e635d1405\" successfully" Sep 12 17:34:19.640052 containerd[1494]: time="2025-09-12T17:34:19.640039392Z" level=info msg="StopPodSandbox for \"8839e8ab9f8eb8f13c11b60ca60e7bc34bd56fdca9a761fac69e876e635d1405\" returns successfully" Sep 12 17:34:19.704138 systemd[1]: run-netns-cni\x2d17a12a44\x2d804c\x2d9aaf\x2df684\x2d8dec3e8b273d.mount: Deactivated successfully. Sep 12 17:34:19.772769 kubelet[2561]: I0912 17:34:19.772409 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/55ae9bf9-0ab6-42c6-be46-b863c939859b-whisker-ca-bundle\") pod \"55ae9bf9-0ab6-42c6-be46-b863c939859b\" (UID: \"55ae9bf9-0ab6-42c6-be46-b863c939859b\") " Sep 12 17:34:19.772769 kubelet[2561]: I0912 17:34:19.772485 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/55ae9bf9-0ab6-42c6-be46-b863c939859b-whisker-backend-key-pair\") pod \"55ae9bf9-0ab6-42c6-be46-b863c939859b\" (UID: \"55ae9bf9-0ab6-42c6-be46-b863c939859b\") " Sep 12 17:34:19.772769 kubelet[2561]: I0912 17:34:19.772509 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b4kpz\" (UniqueName: \"kubernetes.io/projected/55ae9bf9-0ab6-42c6-be46-b863c939859b-kube-api-access-b4kpz\") pod \"55ae9bf9-0ab6-42c6-be46-b863c939859b\" (UID: \"55ae9bf9-0ab6-42c6-be46-b863c939859b\") " Sep 12 17:34:19.773903 kubelet[2561]: I0912 17:34:19.772706 2561 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/55ae9bf9-0ab6-42c6-be46-b863c939859b-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "55ae9bf9-0ab6-42c6-be46-b863c939859b" (UID: "55ae9bf9-0ab6-42c6-be46-b863c939859b"). InnerVolumeSpecName "whisker-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Sep 12 17:34:19.778342 kubelet[2561]: I0912 17:34:19.778317 2561 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55ae9bf9-0ab6-42c6-be46-b863c939859b-kube-api-access-b4kpz" (OuterVolumeSpecName: "kube-api-access-b4kpz") pod "55ae9bf9-0ab6-42c6-be46-b863c939859b" (UID: "55ae9bf9-0ab6-42c6-be46-b863c939859b"). InnerVolumeSpecName "kube-api-access-b4kpz". PluginName "kubernetes.io/projected", VolumeGIDValue "" Sep 12 17:34:19.781504 kubelet[2561]: I0912 17:34:19.780323 2561 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55ae9bf9-0ab6-42c6-be46-b863c939859b-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "55ae9bf9-0ab6-42c6-be46-b863c939859b" (UID: "55ae9bf9-0ab6-42c6-be46-b863c939859b"). InnerVolumeSpecName "whisker-backend-key-pair". PluginName "kubernetes.io/secret", VolumeGIDValue "" Sep 12 17:34:19.780544 systemd[1]: var-lib-kubelet-pods-55ae9bf9\x2d0ab6\x2d42c6\x2dbe46\x2db863c939859b-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2db4kpz.mount: Deactivated successfully. Sep 12 17:34:19.780652 systemd[1]: var-lib-kubelet-pods-55ae9bf9\x2d0ab6\x2d42c6\x2dbe46\x2db863c939859b-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. Sep 12 17:34:19.877088 kubelet[2561]: I0912 17:34:19.877024 2561 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-b4kpz\" (UniqueName: \"kubernetes.io/projected/55ae9bf9-0ab6-42c6-be46-b863c939859b-kube-api-access-b4kpz\") on node \"ci-4081-3-6-f-c182586e87\" DevicePath \"\"" Sep 12 17:34:19.877088 kubelet[2561]: I0912 17:34:19.877073 2561 reconciler_common.go:299] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/55ae9bf9-0ab6-42c6-be46-b863c939859b-whisker-ca-bundle\") on node \"ci-4081-3-6-f-c182586e87\" DevicePath \"\"" Sep 12 17:34:19.877088 kubelet[2561]: I0912 17:34:19.877087 2561 reconciler_common.go:299] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/55ae9bf9-0ab6-42c6-be46-b863c939859b-whisker-backend-key-pair\") on node \"ci-4081-3-6-f-c182586e87\" DevicePath \"\"" Sep 12 17:34:20.096636 systemd[1]: Removed slice kubepods-besteffort-pod55ae9bf9_0ab6_42c6_be46_b863c939859b.slice - libcontainer container kubepods-besteffort-pod55ae9bf9_0ab6_42c6_be46_b863c939859b.slice. Sep 12 17:34:20.435687 systemd[1]: Created slice kubepods-besteffort-poddb25bf8a_44b6_46b5_8949_fab1e8ffb48a.slice - libcontainer container kubepods-besteffort-poddb25bf8a_44b6_46b5_8949_fab1e8ffb48a.slice. 
Sep 12 17:34:20.484725 kubelet[2561]: I0912 17:34:20.484684 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/db25bf8a-44b6-46b5-8949-fab1e8ffb48a-whisker-ca-bundle\") pod \"whisker-58bcf489cc-l5xmn\" (UID: \"db25bf8a-44b6-46b5-8949-fab1e8ffb48a\") " pod="calico-system/whisker-58bcf489cc-l5xmn" Sep 12 17:34:20.485152 kubelet[2561]: I0912 17:34:20.484737 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2kcp5\" (UniqueName: \"kubernetes.io/projected/db25bf8a-44b6-46b5-8949-fab1e8ffb48a-kube-api-access-2kcp5\") pod \"whisker-58bcf489cc-l5xmn\" (UID: \"db25bf8a-44b6-46b5-8949-fab1e8ffb48a\") " pod="calico-system/whisker-58bcf489cc-l5xmn" Sep 12 17:34:20.485152 kubelet[2561]: I0912 17:34:20.484764 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/db25bf8a-44b6-46b5-8949-fab1e8ffb48a-whisker-backend-key-pair\") pod \"whisker-58bcf489cc-l5xmn\" (UID: \"db25bf8a-44b6-46b5-8949-fab1e8ffb48a\") " pod="calico-system/whisker-58bcf489cc-l5xmn" Sep 12 17:34:20.754979 containerd[1494]: time="2025-09-12T17:34:20.754945073Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-58bcf489cc-l5xmn,Uid:db25bf8a-44b6-46b5-8949-fab1e8ffb48a,Namespace:calico-system,Attempt:0,}" Sep 12 17:34:20.935763 systemd-networkd[1404]: cali713090c83ee: Link UP Sep 12 17:34:20.936059 systemd-networkd[1404]: cali713090c83ee: Gained carrier Sep 12 17:34:20.961842 containerd[1494]: 2025-09-12 17:34:20.839 [INFO][3963] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 12 17:34:20.961842 containerd[1494]: 2025-09-12 17:34:20.854 [INFO][3963] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4081--3--6--f--c182586e87-k8s-whisker--58bcf489cc--l5xmn-eth0 whisker-58bcf489cc- calico-system db25bf8a-44b6-46b5-8949-fab1e8ffb48a 870 0 2025-09-12 17:34:20 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:58bcf489cc projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s ci-4081-3-6-f-c182586e87 whisker-58bcf489cc-l5xmn eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] cali713090c83ee [] [] }} ContainerID="279aaa5b125d7559f881b35f19266c102c03725463b7236693e224041f7b080e" Namespace="calico-system" Pod="whisker-58bcf489cc-l5xmn" WorkloadEndpoint="ci--4081--3--6--f--c182586e87-k8s-whisker--58bcf489cc--l5xmn-" Sep 12 17:34:20.961842 containerd[1494]: 2025-09-12 17:34:20.854 [INFO][3963] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="279aaa5b125d7559f881b35f19266c102c03725463b7236693e224041f7b080e" Namespace="calico-system" Pod="whisker-58bcf489cc-l5xmn" WorkloadEndpoint="ci--4081--3--6--f--c182586e87-k8s-whisker--58bcf489cc--l5xmn-eth0" Sep 12 17:34:20.961842 containerd[1494]: 2025-09-12 17:34:20.879 [INFO][3975] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="279aaa5b125d7559f881b35f19266c102c03725463b7236693e224041f7b080e" HandleID="k8s-pod-network.279aaa5b125d7559f881b35f19266c102c03725463b7236693e224041f7b080e" Workload="ci--4081--3--6--f--c182586e87-k8s-whisker--58bcf489cc--l5xmn-eth0" Sep 12 17:34:20.961842 containerd[1494]: 2025-09-12 17:34:20.880 [INFO][3975] 
ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="279aaa5b125d7559f881b35f19266c102c03725463b7236693e224041f7b080e" HandleID="k8s-pod-network.279aaa5b125d7559f881b35f19266c102c03725463b7236693e224041f7b080e" Workload="ci--4081--3--6--f--c182586e87-k8s-whisker--58bcf489cc--l5xmn-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000332020), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4081-3-6-f-c182586e87", "pod":"whisker-58bcf489cc-l5xmn", "timestamp":"2025-09-12 17:34:20.879879443 +0000 UTC"}, Hostname:"ci-4081-3-6-f-c182586e87", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 17:34:20.961842 containerd[1494]: 2025-09-12 17:34:20.880 [INFO][3975] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 17:34:20.961842 containerd[1494]: 2025-09-12 17:34:20.880 [INFO][3975] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 12 17:34:20.961842 containerd[1494]: 2025-09-12 17:34:20.880 [INFO][3975] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4081-3-6-f-c182586e87' Sep 12 17:34:20.961842 containerd[1494]: 2025-09-12 17:34:20.888 [INFO][3975] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.279aaa5b125d7559f881b35f19266c102c03725463b7236693e224041f7b080e" host="ci-4081-3-6-f-c182586e87" Sep 12 17:34:20.961842 containerd[1494]: 2025-09-12 17:34:20.897 [INFO][3975] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4081-3-6-f-c182586e87" Sep 12 17:34:20.961842 containerd[1494]: 2025-09-12 17:34:20.903 [INFO][3975] ipam/ipam.go 511: Trying affinity for 192.168.34.128/26 host="ci-4081-3-6-f-c182586e87" Sep 12 17:34:20.961842 containerd[1494]: 2025-09-12 17:34:20.904 [INFO][3975] ipam/ipam.go 158: Attempting to load block cidr=192.168.34.128/26 host="ci-4081-3-6-f-c182586e87" Sep 12 17:34:20.961842 containerd[1494]: 2025-09-12 17:34:20.907 [INFO][3975] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.34.128/26 host="ci-4081-3-6-f-c182586e87" Sep 12 17:34:20.961842 containerd[1494]: 2025-09-12 17:34:20.907 [INFO][3975] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.34.128/26 handle="k8s-pod-network.279aaa5b125d7559f881b35f19266c102c03725463b7236693e224041f7b080e" host="ci-4081-3-6-f-c182586e87" Sep 12 17:34:20.961842 containerd[1494]: 2025-09-12 17:34:20.908 [INFO][3975] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.279aaa5b125d7559f881b35f19266c102c03725463b7236693e224041f7b080e Sep 12 17:34:20.961842 containerd[1494]: 2025-09-12 17:34:20.912 [INFO][3975] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.34.128/26 handle="k8s-pod-network.279aaa5b125d7559f881b35f19266c102c03725463b7236693e224041f7b080e" host="ci-4081-3-6-f-c182586e87" Sep 12 17:34:20.961842 containerd[1494]: 2025-09-12 17:34:20.917 [INFO][3975] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.34.129/26] block=192.168.34.128/26 handle="k8s-pod-network.279aaa5b125d7559f881b35f19266c102c03725463b7236693e224041f7b080e" host="ci-4081-3-6-f-c182586e87" Sep 12 17:34:20.961842 containerd[1494]: 2025-09-12 17:34:20.917 [INFO][3975] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.34.129/26] handle="k8s-pod-network.279aaa5b125d7559f881b35f19266c102c03725463b7236693e224041f7b080e" host="ci-4081-3-6-f-c182586e87" Sep 12 17:34:20.961842 
containerd[1494]: 2025-09-12 17:34:20.917 [INFO][3975] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 17:34:20.961842 containerd[1494]: 2025-09-12 17:34:20.917 [INFO][3975] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.34.129/26] IPv6=[] ContainerID="279aaa5b125d7559f881b35f19266c102c03725463b7236693e224041f7b080e" HandleID="k8s-pod-network.279aaa5b125d7559f881b35f19266c102c03725463b7236693e224041f7b080e" Workload="ci--4081--3--6--f--c182586e87-k8s-whisker--58bcf489cc--l5xmn-eth0" Sep 12 17:34:20.962598 containerd[1494]: 2025-09-12 17:34:20.921 [INFO][3963] cni-plugin/k8s.go 418: Populated endpoint ContainerID="279aaa5b125d7559f881b35f19266c102c03725463b7236693e224041f7b080e" Namespace="calico-system" Pod="whisker-58bcf489cc-l5xmn" WorkloadEndpoint="ci--4081--3--6--f--c182586e87-k8s-whisker--58bcf489cc--l5xmn-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--f--c182586e87-k8s-whisker--58bcf489cc--l5xmn-eth0", GenerateName:"whisker-58bcf489cc-", Namespace:"calico-system", SelfLink:"", UID:"db25bf8a-44b6-46b5-8949-fab1e8ffb48a", ResourceVersion:"870", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 34, 20, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"58bcf489cc", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-f-c182586e87", ContainerID:"", Pod:"whisker-58bcf489cc-l5xmn", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.34.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali713090c83ee", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:34:20.962598 containerd[1494]: 2025-09-12 17:34:20.921 [INFO][3963] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.34.129/32] ContainerID="279aaa5b125d7559f881b35f19266c102c03725463b7236693e224041f7b080e" Namespace="calico-system" Pod="whisker-58bcf489cc-l5xmn" WorkloadEndpoint="ci--4081--3--6--f--c182586e87-k8s-whisker--58bcf489cc--l5xmn-eth0" Sep 12 17:34:20.962598 containerd[1494]: 2025-09-12 17:34:20.921 [INFO][3963] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali713090c83ee ContainerID="279aaa5b125d7559f881b35f19266c102c03725463b7236693e224041f7b080e" Namespace="calico-system" Pod="whisker-58bcf489cc-l5xmn" WorkloadEndpoint="ci--4081--3--6--f--c182586e87-k8s-whisker--58bcf489cc--l5xmn-eth0" Sep 12 17:34:20.962598 containerd[1494]: 2025-09-12 17:34:20.936 [INFO][3963] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="279aaa5b125d7559f881b35f19266c102c03725463b7236693e224041f7b080e" Namespace="calico-system" Pod="whisker-58bcf489cc-l5xmn" WorkloadEndpoint="ci--4081--3--6--f--c182586e87-k8s-whisker--58bcf489cc--l5xmn-eth0" Sep 12 17:34:20.962598 containerd[1494]: 2025-09-12 17:34:20.937 [INFO][3963] cni-plugin/k8s.go 446: Added Mac, interface name, and active container 
ID to endpoint ContainerID="279aaa5b125d7559f881b35f19266c102c03725463b7236693e224041f7b080e" Namespace="calico-system" Pod="whisker-58bcf489cc-l5xmn" WorkloadEndpoint="ci--4081--3--6--f--c182586e87-k8s-whisker--58bcf489cc--l5xmn-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--f--c182586e87-k8s-whisker--58bcf489cc--l5xmn-eth0", GenerateName:"whisker-58bcf489cc-", Namespace:"calico-system", SelfLink:"", UID:"db25bf8a-44b6-46b5-8949-fab1e8ffb48a", ResourceVersion:"870", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 34, 20, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"58bcf489cc", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-f-c182586e87", ContainerID:"279aaa5b125d7559f881b35f19266c102c03725463b7236693e224041f7b080e", Pod:"whisker-58bcf489cc-l5xmn", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.34.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali713090c83ee", MAC:"b2:15:01:d3:5f:d5", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:34:20.962598 containerd[1494]: 2025-09-12 17:34:20.955 [INFO][3963] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="279aaa5b125d7559f881b35f19266c102c03725463b7236693e224041f7b080e" Namespace="calico-system" Pod="whisker-58bcf489cc-l5xmn" WorkloadEndpoint="ci--4081--3--6--f--c182586e87-k8s-whisker--58bcf489cc--l5xmn-eth0" Sep 12 17:34:20.988516 containerd[1494]: time="2025-09-12T17:34:20.988437526Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Sep 12 17:34:20.988516 containerd[1494]: time="2025-09-12T17:34:20.988480357Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Sep 12 17:34:20.988516 containerd[1494]: time="2025-09-12T17:34:20.988500364Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 12 17:34:20.988837 containerd[1494]: time="2025-09-12T17:34:20.988610363Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 12 17:34:21.012013 systemd[1]: Started cri-containerd-279aaa5b125d7559f881b35f19266c102c03725463b7236693e224041f7b080e.scope - libcontainer container 279aaa5b125d7559f881b35f19266c102c03725463b7236693e224041f7b080e. 
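
The IPAM trace above is Calico's block-affinity scheme at work: this node already holds an affinity for the block 192.168.34.128/26, so the plugin takes the host-wide IPAM lock, confirms and loads the block, and hands the first free address, 192.168.34.129, to the new whisker endpoint as a /32 behind cali713090c83ee. A quick check of the block arithmetic with the standard library:

    package main

    import (
        "fmt"
        "net/netip"
    )

    func main() {
        block := netip.MustParsePrefix("192.168.34.128/26")
        size := 1 << (32 - block.Bits()) // a /26 block holds 64 addresses
        first := block.Addr().Next()     // .129, the first address after the block base

        fmt.Printf("block %s: %d addrs, first assignment %s\n", block, size, first)
        fmt.Println(block.Contains(netip.MustParseAddr("192.168.34.129"))) // true
    }
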
Sep 12 17:34:21.064287 containerd[1494]: time="2025-09-12T17:34:21.064257119Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-58bcf489cc-l5xmn,Uid:db25bf8a-44b6-46b5-8949-fab1e8ffb48a,Namespace:calico-system,Attempt:0,} returns sandbox id \"279aaa5b125d7559f881b35f19266c102c03725463b7236693e224041f7b080e\"" Sep 12 17:34:21.067910 containerd[1494]: time="2025-09-12T17:34:21.067644357Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.3\"" Sep 12 17:34:21.072239 kernel: bpftool[4057]: memfd_create() called without MFD_EXEC or MFD_NOEXEC_SEAL set Sep 12 17:34:21.257370 systemd-networkd[1404]: vxlan.calico: Link UP Sep 12 17:34:21.257377 systemd-networkd[1404]: vxlan.calico: Gained carrier Sep 12 17:34:22.082745 kubelet[2561]: I0912 17:34:22.082699 2561 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="55ae9bf9-0ab6-42c6-be46-b863c939859b" path="/var/lib/kubelet/pods/55ae9bf9-0ab6-42c6-be46-b863c939859b/volumes" Sep 12 17:34:22.854520 systemd-networkd[1404]: vxlan.calico: Gained IPv6LL Sep 12 17:34:22.854864 systemd-networkd[1404]: cali713090c83ee: Gained IPv6LL Sep 12 17:34:23.158629 containerd[1494]: time="2025-09-12T17:34:23.158482398Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:34:23.160112 containerd[1494]: time="2025-09-12T17:34:23.159973351Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.3: active requests=0, bytes read=4661291" Sep 12 17:34:23.162202 containerd[1494]: time="2025-09-12T17:34:23.160941018Z" level=info msg="ImageCreate event name:\"sha256:9a4eedeed4a531acefb7f5d0a1b7e3856b1a9a24d9e7d25deef2134d7a734c2d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:34:23.164256 containerd[1494]: time="2025-09-12T17:34:23.163168019Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker@sha256:e7113761fc7633d515882f0d48b5c8d0b8e62f3f9d34823f2ee194bb16d2ec44\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:34:23.164256 containerd[1494]: time="2025-09-12T17:34:23.164088156Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker:v3.30.3\" with image id \"sha256:9a4eedeed4a531acefb7f5d0a1b7e3856b1a9a24d9e7d25deef2134d7a734c2d\", repo tag \"ghcr.io/flatcar/calico/whisker:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/whisker@sha256:e7113761fc7633d515882f0d48b5c8d0b8e62f3f9d34823f2ee194bb16d2ec44\", size \"6153986\" in 2.096176945s" Sep 12 17:34:23.164256 containerd[1494]: time="2025-09-12T17:34:23.164121119Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.3\" returns image reference \"sha256:9a4eedeed4a531acefb7f5d0a1b7e3856b1a9a24d9e7d25deef2134d7a734c2d\"" Sep 12 17:34:23.166629 containerd[1494]: time="2025-09-12T17:34:23.166591430Z" level=info msg="CreateContainer within sandbox \"279aaa5b125d7559f881b35f19266c102c03725463b7236693e224041f7b080e\" for container &ContainerMetadata{Name:whisker,Attempt:0,}" Sep 12 17:34:23.178461 containerd[1494]: time="2025-09-12T17:34:23.178424610Z" level=info msg="CreateContainer within sandbox \"279aaa5b125d7559f881b35f19266c102c03725463b7236693e224041f7b080e\" for &ContainerMetadata{Name:whisker,Attempt:0,} returns container id \"e9e8379fc3fc5af6fb3ff0eefd9755f4a4196b5ea9affc40c6c1b55322eac54f\"" Sep 12 17:34:23.179079 containerd[1494]: time="2025-09-12T17:34:23.179060490Z" level=info msg="StartContainer for \"e9e8379fc3fc5af6fb3ff0eefd9755f4a4196b5ea9affc40c6c1b55322eac54f\"" 
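systemd-networkd reports vxlan.calico going Link UP and gaining carrier here, with IPv6LL following a second later, so the overlay device for the 192.168.34.128/26 pod network is up before the first workload address carries traffic. A short way to verify both the overlay device and the per-pod veth from Go, using the vishvananda/netlink package (that it is the same binding Calico's dataplane builds on is an assumption):

    package main

    import (
        "fmt"
        "log"

        "github.com/vishvananda/netlink"
    )

    func main() {
        // Look up the overlay device systemd-networkd reported as "Gained carrier".
        link, err := netlink.LinkByName("vxlan.calico")
        if err != nil {
            log.Fatalf("lookup vxlan.calico: %v", err)
        }
        attrs := link.Attrs()
        fmt.Printf("type=%s index=%d mtu=%d state=%s\n",
            link.Type(), attrs.Index, attrs.MTU, attrs.OperState)

        // The host side of the whisker pod's veth, from the sandbox above.
        if veth, err := netlink.LinkByName("cali713090c83ee"); err == nil {
            fmt.Printf("cali713090c83ee mac=%s\n", veth.Attrs().HardwareAddr)
        }
    }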
Sep 12 17:34:23.205372 systemd[1]: Started cri-containerd-e9e8379fc3fc5af6fb3ff0eefd9755f4a4196b5ea9affc40c6c1b55322eac54f.scope - libcontainer container e9e8379fc3fc5af6fb3ff0eefd9755f4a4196b5ea9affc40c6c1b55322eac54f. Sep 12 17:34:23.242212 containerd[1494]: time="2025-09-12T17:34:23.242118956Z" level=info msg="StartContainer for \"e9e8379fc3fc5af6fb3ff0eefd9755f4a4196b5ea9affc40c6c1b55322eac54f\" returns successfully" Sep 12 17:34:23.244322 containerd[1494]: time="2025-09-12T17:34:23.244071630Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\"" Sep 12 17:34:24.084882 containerd[1494]: time="2025-09-12T17:34:24.083865466Z" level=info msg="StopPodSandbox for \"3dcba8787d31ee60c778e4fcb2630c4594e8b38a67e6d502bba2326930192848\"" Sep 12 17:34:24.086690 containerd[1494]: time="2025-09-12T17:34:24.086211432Z" level=info msg="StopPodSandbox for \"c5a7a51de0db103b1e8498594fbe9c552c1c5356287bf859d6c0f88b7d13cefa\"" Sep 12 17:34:24.088588 containerd[1494]: time="2025-09-12T17:34:24.087132129Z" level=info msg="StopPodSandbox for \"1e490ff9aeb9142a469899b9994e9e803af0a211a6813cd0269e8e566e92a5b6\"" Sep 12 17:34:24.294299 containerd[1494]: 2025-09-12 17:34:24.242 [INFO][4217] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="3dcba8787d31ee60c778e4fcb2630c4594e8b38a67e6d502bba2326930192848" Sep 12 17:34:24.294299 containerd[1494]: 2025-09-12 17:34:24.242 [INFO][4217] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="3dcba8787d31ee60c778e4fcb2630c4594e8b38a67e6d502bba2326930192848" iface="eth0" netns="/var/run/netns/cni-bf0015d6-c20e-de80-e12f-0f9209e11f10" Sep 12 17:34:24.294299 containerd[1494]: 2025-09-12 17:34:24.242 [INFO][4217] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="3dcba8787d31ee60c778e4fcb2630c4594e8b38a67e6d502bba2326930192848" iface="eth0" netns="/var/run/netns/cni-bf0015d6-c20e-de80-e12f-0f9209e11f10" Sep 12 17:34:24.294299 containerd[1494]: 2025-09-12 17:34:24.243 [INFO][4217] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. ContainerID="3dcba8787d31ee60c778e4fcb2630c4594e8b38a67e6d502bba2326930192848" iface="eth0" netns="/var/run/netns/cni-bf0015d6-c20e-de80-e12f-0f9209e11f10" Sep 12 17:34:24.294299 containerd[1494]: 2025-09-12 17:34:24.243 [INFO][4217] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="3dcba8787d31ee60c778e4fcb2630c4594e8b38a67e6d502bba2326930192848" Sep 12 17:34:24.294299 containerd[1494]: 2025-09-12 17:34:24.243 [INFO][4217] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="3dcba8787d31ee60c778e4fcb2630c4594e8b38a67e6d502bba2326930192848" Sep 12 17:34:24.294299 containerd[1494]: 2025-09-12 17:34:24.277 [INFO][4253] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="3dcba8787d31ee60c778e4fcb2630c4594e8b38a67e6d502bba2326930192848" HandleID="k8s-pod-network.3dcba8787d31ee60c778e4fcb2630c4594e8b38a67e6d502bba2326930192848" Workload="ci--4081--3--6--f--c182586e87-k8s-goldmane--54d579b49d--wnp7f-eth0" Sep 12 17:34:24.294299 containerd[1494]: 2025-09-12 17:34:24.277 [INFO][4253] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 17:34:24.294299 containerd[1494]: 2025-09-12 17:34:24.277 [INFO][4253] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 12 17:34:24.294299 containerd[1494]: 2025-09-12 17:34:24.282 [WARNING][4253] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="3dcba8787d31ee60c778e4fcb2630c4594e8b38a67e6d502bba2326930192848" HandleID="k8s-pod-network.3dcba8787d31ee60c778e4fcb2630c4594e8b38a67e6d502bba2326930192848" Workload="ci--4081--3--6--f--c182586e87-k8s-goldmane--54d579b49d--wnp7f-eth0" Sep 12 17:34:24.294299 containerd[1494]: 2025-09-12 17:34:24.282 [INFO][4253] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="3dcba8787d31ee60c778e4fcb2630c4594e8b38a67e6d502bba2326930192848" HandleID="k8s-pod-network.3dcba8787d31ee60c778e4fcb2630c4594e8b38a67e6d502bba2326930192848" Workload="ci--4081--3--6--f--c182586e87-k8s-goldmane--54d579b49d--wnp7f-eth0" Sep 12 17:34:24.294299 containerd[1494]: 2025-09-12 17:34:24.284 [INFO][4253] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 17:34:24.294299 containerd[1494]: 2025-09-12 17:34:24.289 [INFO][4217] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="3dcba8787d31ee60c778e4fcb2630c4594e8b38a67e6d502bba2326930192848" Sep 12 17:34:24.294299 containerd[1494]: time="2025-09-12T17:34:24.294141144Z" level=info msg="TearDown network for sandbox \"3dcba8787d31ee60c778e4fcb2630c4594e8b38a67e6d502bba2326930192848\" successfully" Sep 12 17:34:24.294299 containerd[1494]: time="2025-09-12T17:34:24.294180017Z" level=info msg="StopPodSandbox for \"3dcba8787d31ee60c778e4fcb2630c4594e8b38a67e6d502bba2326930192848\" returns successfully" Sep 12 17:34:24.297007 systemd[1]: run-netns-cni\x2dbf0015d6\x2dc20e\x2dde80\x2de12f\x2d0f9209e11f10.mount: Deactivated successfully. Sep 12 17:34:24.300106 containerd[1494]: time="2025-09-12T17:34:24.299691604Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-wnp7f,Uid:8d975815-4db9-4493-ae13-0fed34b78044,Namespace:calico-system,Attempt:1,}" Sep 12 17:34:24.312577 containerd[1494]: 2025-09-12 17:34:24.214 [INFO][4227] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="c5a7a51de0db103b1e8498594fbe9c552c1c5356287bf859d6c0f88b7d13cefa" Sep 12 17:34:24.312577 containerd[1494]: 2025-09-12 17:34:24.216 [INFO][4227] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="c5a7a51de0db103b1e8498594fbe9c552c1c5356287bf859d6c0f88b7d13cefa" iface="eth0" netns="/var/run/netns/cni-38419799-3aab-b671-f36e-3b8929106f0a" Sep 12 17:34:24.312577 containerd[1494]: 2025-09-12 17:34:24.216 [INFO][4227] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="c5a7a51de0db103b1e8498594fbe9c552c1c5356287bf859d6c0f88b7d13cefa" iface="eth0" netns="/var/run/netns/cni-38419799-3aab-b671-f36e-3b8929106f0a" Sep 12 17:34:24.312577 containerd[1494]: 2025-09-12 17:34:24.217 [INFO][4227] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. 
ContainerID="c5a7a51de0db103b1e8498594fbe9c552c1c5356287bf859d6c0f88b7d13cefa" iface="eth0" netns="/var/run/netns/cni-38419799-3aab-b671-f36e-3b8929106f0a" Sep 12 17:34:24.312577 containerd[1494]: 2025-09-12 17:34:24.217 [INFO][4227] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="c5a7a51de0db103b1e8498594fbe9c552c1c5356287bf859d6c0f88b7d13cefa" Sep 12 17:34:24.312577 containerd[1494]: 2025-09-12 17:34:24.217 [INFO][4227] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="c5a7a51de0db103b1e8498594fbe9c552c1c5356287bf859d6c0f88b7d13cefa" Sep 12 17:34:24.312577 containerd[1494]: 2025-09-12 17:34:24.276 [INFO][4243] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="c5a7a51de0db103b1e8498594fbe9c552c1c5356287bf859d6c0f88b7d13cefa" HandleID="k8s-pod-network.c5a7a51de0db103b1e8498594fbe9c552c1c5356287bf859d6c0f88b7d13cefa" Workload="ci--4081--3--6--f--c182586e87-k8s-coredns--668d6bf9bc--szrkp-eth0" Sep 12 17:34:24.312577 containerd[1494]: 2025-09-12 17:34:24.279 [INFO][4243] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 17:34:24.312577 containerd[1494]: 2025-09-12 17:34:24.284 [INFO][4243] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 12 17:34:24.312577 containerd[1494]: 2025-09-12 17:34:24.291 [WARNING][4243] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="c5a7a51de0db103b1e8498594fbe9c552c1c5356287bf859d6c0f88b7d13cefa" HandleID="k8s-pod-network.c5a7a51de0db103b1e8498594fbe9c552c1c5356287bf859d6c0f88b7d13cefa" Workload="ci--4081--3--6--f--c182586e87-k8s-coredns--668d6bf9bc--szrkp-eth0" Sep 12 17:34:24.312577 containerd[1494]: 2025-09-12 17:34:24.291 [INFO][4243] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="c5a7a51de0db103b1e8498594fbe9c552c1c5356287bf859d6c0f88b7d13cefa" HandleID="k8s-pod-network.c5a7a51de0db103b1e8498594fbe9c552c1c5356287bf859d6c0f88b7d13cefa" Workload="ci--4081--3--6--f--c182586e87-k8s-coredns--668d6bf9bc--szrkp-eth0" Sep 12 17:34:24.312577 containerd[1494]: 2025-09-12 17:34:24.292 [INFO][4243] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 17:34:24.312577 containerd[1494]: 2025-09-12 17:34:24.303 [INFO][4227] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="c5a7a51de0db103b1e8498594fbe9c552c1c5356287bf859d6c0f88b7d13cefa" Sep 12 17:34:24.315448 systemd[1]: run-netns-cni\x2d38419799\x2d3aab\x2db671\x2df36e\x2d3b8929106f0a.mount: Deactivated successfully. Sep 12 17:34:24.316344 containerd[1494]: time="2025-09-12T17:34:24.315643904Z" level=info msg="TearDown network for sandbox \"c5a7a51de0db103b1e8498594fbe9c552c1c5356287bf859d6c0f88b7d13cefa\" successfully" Sep 12 17:34:24.316344 containerd[1494]: time="2025-09-12T17:34:24.315661729Z" level=info msg="StopPodSandbox for \"c5a7a51de0db103b1e8498594fbe9c552c1c5356287bf859d6c0f88b7d13cefa\" returns successfully" Sep 12 17:34:24.316920 containerd[1494]: time="2025-09-12T17:34:24.316772644Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-szrkp,Uid:cbad7146-1812-4262-a0a7-fad3ba169a40,Namespace:kube-system,Attempt:1,}" Sep 12 17:34:24.325379 containerd[1494]: 2025-09-12 17:34:24.219 [INFO][4228] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="1e490ff9aeb9142a469899b9994e9e803af0a211a6813cd0269e8e566e92a5b6" Sep 12 17:34:24.325379 containerd[1494]: 2025-09-12 17:34:24.222 [INFO][4228] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. 
ContainerID="1e490ff9aeb9142a469899b9994e9e803af0a211a6813cd0269e8e566e92a5b6" iface="eth0" netns="/var/run/netns/cni-b3226eee-1491-09f5-84d7-b35ceed90f62" Sep 12 17:34:24.325379 containerd[1494]: 2025-09-12 17:34:24.222 [INFO][4228] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="1e490ff9aeb9142a469899b9994e9e803af0a211a6813cd0269e8e566e92a5b6" iface="eth0" netns="/var/run/netns/cni-b3226eee-1491-09f5-84d7-b35ceed90f62" Sep 12 17:34:24.325379 containerd[1494]: 2025-09-12 17:34:24.222 [INFO][4228] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. ContainerID="1e490ff9aeb9142a469899b9994e9e803af0a211a6813cd0269e8e566e92a5b6" iface="eth0" netns="/var/run/netns/cni-b3226eee-1491-09f5-84d7-b35ceed90f62" Sep 12 17:34:24.325379 containerd[1494]: 2025-09-12 17:34:24.222 [INFO][4228] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="1e490ff9aeb9142a469899b9994e9e803af0a211a6813cd0269e8e566e92a5b6" Sep 12 17:34:24.325379 containerd[1494]: 2025-09-12 17:34:24.223 [INFO][4228] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="1e490ff9aeb9142a469899b9994e9e803af0a211a6813cd0269e8e566e92a5b6" Sep 12 17:34:24.325379 containerd[1494]: 2025-09-12 17:34:24.289 [INFO][4248] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="1e490ff9aeb9142a469899b9994e9e803af0a211a6813cd0269e8e566e92a5b6" HandleID="k8s-pod-network.1e490ff9aeb9142a469899b9994e9e803af0a211a6813cd0269e8e566e92a5b6" Workload="ci--4081--3--6--f--c182586e87-k8s-calico--apiserver--6dd46897bd--bbjqp-eth0" Sep 12 17:34:24.325379 containerd[1494]: 2025-09-12 17:34:24.289 [INFO][4248] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 17:34:24.325379 containerd[1494]: 2025-09-12 17:34:24.292 [INFO][4248] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 12 17:34:24.325379 containerd[1494]: 2025-09-12 17:34:24.306 [WARNING][4248] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="1e490ff9aeb9142a469899b9994e9e803af0a211a6813cd0269e8e566e92a5b6" HandleID="k8s-pod-network.1e490ff9aeb9142a469899b9994e9e803af0a211a6813cd0269e8e566e92a5b6" Workload="ci--4081--3--6--f--c182586e87-k8s-calico--apiserver--6dd46897bd--bbjqp-eth0" Sep 12 17:34:24.325379 containerd[1494]: 2025-09-12 17:34:24.310 [INFO][4248] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="1e490ff9aeb9142a469899b9994e9e803af0a211a6813cd0269e8e566e92a5b6" HandleID="k8s-pod-network.1e490ff9aeb9142a469899b9994e9e803af0a211a6813cd0269e8e566e92a5b6" Workload="ci--4081--3--6--f--c182586e87-k8s-calico--apiserver--6dd46897bd--bbjqp-eth0" Sep 12 17:34:24.325379 containerd[1494]: 2025-09-12 17:34:24.311 [INFO][4248] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 17:34:24.325379 containerd[1494]: 2025-09-12 17:34:24.321 [INFO][4228] cni-plugin/k8s.go 653: Teardown processing complete. 
ContainerID="1e490ff9aeb9142a469899b9994e9e803af0a211a6813cd0269e8e566e92a5b6" Sep 12 17:34:24.327854 containerd[1494]: time="2025-09-12T17:34:24.327769849Z" level=info msg="TearDown network for sandbox \"1e490ff9aeb9142a469899b9994e9e803af0a211a6813cd0269e8e566e92a5b6\" successfully" Sep 12 17:34:24.327854 containerd[1494]: time="2025-09-12T17:34:24.327789545Z" level=info msg="StopPodSandbox for \"1e490ff9aeb9142a469899b9994e9e803af0a211a6813cd0269e8e566e92a5b6\" returns successfully" Sep 12 17:34:24.328569 systemd[1]: run-netns-cni\x2db3226eee\x2d1491\x2d09f5\x2d84d7\x2db35ceed90f62.mount: Deactivated successfully. Sep 12 17:34:24.329959 containerd[1494]: time="2025-09-12T17:34:24.329608587Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6dd46897bd-bbjqp,Uid:9ee60034-1a31-4418-b743-0f97ab43bc92,Namespace:calico-apiserver,Attempt:1,}" Sep 12 17:34:24.447714 systemd-networkd[1404]: calif622695599e: Link UP Sep 12 17:34:24.447872 systemd-networkd[1404]: calif622695599e: Gained carrier Sep 12 17:34:24.462703 containerd[1494]: 2025-09-12 17:34:24.388 [INFO][4274] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4081--3--6--f--c182586e87-k8s-coredns--668d6bf9bc--szrkp-eth0 coredns-668d6bf9bc- kube-system cbad7146-1812-4262-a0a7-fad3ba169a40 891 0 2025-09-12 17:33:49 +0000 UTC map[k8s-app:kube-dns pod-template-hash:668d6bf9bc projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4081-3-6-f-c182586e87 coredns-668d6bf9bc-szrkp eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] calif622695599e [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="ea4dba399394ac14f99bd30ac548af6eaf19877ec2518480e8b0a89a7b6f3199" Namespace="kube-system" Pod="coredns-668d6bf9bc-szrkp" WorkloadEndpoint="ci--4081--3--6--f--c182586e87-k8s-coredns--668d6bf9bc--szrkp-" Sep 12 17:34:24.462703 containerd[1494]: 2025-09-12 17:34:24.389 [INFO][4274] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="ea4dba399394ac14f99bd30ac548af6eaf19877ec2518480e8b0a89a7b6f3199" Namespace="kube-system" Pod="coredns-668d6bf9bc-szrkp" WorkloadEndpoint="ci--4081--3--6--f--c182586e87-k8s-coredns--668d6bf9bc--szrkp-eth0" Sep 12 17:34:24.462703 containerd[1494]: 2025-09-12 17:34:24.412 [INFO][4309] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="ea4dba399394ac14f99bd30ac548af6eaf19877ec2518480e8b0a89a7b6f3199" HandleID="k8s-pod-network.ea4dba399394ac14f99bd30ac548af6eaf19877ec2518480e8b0a89a7b6f3199" Workload="ci--4081--3--6--f--c182586e87-k8s-coredns--668d6bf9bc--szrkp-eth0" Sep 12 17:34:24.462703 containerd[1494]: 2025-09-12 17:34:24.412 [INFO][4309] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="ea4dba399394ac14f99bd30ac548af6eaf19877ec2518480e8b0a89a7b6f3199" HandleID="k8s-pod-network.ea4dba399394ac14f99bd30ac548af6eaf19877ec2518480e8b0a89a7b6f3199" Workload="ci--4081--3--6--f--c182586e87-k8s-coredns--668d6bf9bc--szrkp-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d5860), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4081-3-6-f-c182586e87", "pod":"coredns-668d6bf9bc-szrkp", "timestamp":"2025-09-12 17:34:24.412666811 +0000 UTC"}, Hostname:"ci-4081-3-6-f-c182586e87", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), 
HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 17:34:24.462703 containerd[1494]: 2025-09-12 17:34:24.412 [INFO][4309] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 17:34:24.462703 containerd[1494]: 2025-09-12 17:34:24.413 [INFO][4309] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 12 17:34:24.462703 containerd[1494]: 2025-09-12 17:34:24.413 [INFO][4309] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4081-3-6-f-c182586e87' Sep 12 17:34:24.462703 containerd[1494]: 2025-09-12 17:34:24.418 [INFO][4309] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.ea4dba399394ac14f99bd30ac548af6eaf19877ec2518480e8b0a89a7b6f3199" host="ci-4081-3-6-f-c182586e87" Sep 12 17:34:24.462703 containerd[1494]: 2025-09-12 17:34:24.422 [INFO][4309] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4081-3-6-f-c182586e87" Sep 12 17:34:24.462703 containerd[1494]: 2025-09-12 17:34:24.426 [INFO][4309] ipam/ipam.go 511: Trying affinity for 192.168.34.128/26 host="ci-4081-3-6-f-c182586e87" Sep 12 17:34:24.462703 containerd[1494]: 2025-09-12 17:34:24.428 [INFO][4309] ipam/ipam.go 158: Attempting to load block cidr=192.168.34.128/26 host="ci-4081-3-6-f-c182586e87" Sep 12 17:34:24.462703 containerd[1494]: 2025-09-12 17:34:24.430 [INFO][4309] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.34.128/26 host="ci-4081-3-6-f-c182586e87" Sep 12 17:34:24.462703 containerd[1494]: 2025-09-12 17:34:24.430 [INFO][4309] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.34.128/26 handle="k8s-pod-network.ea4dba399394ac14f99bd30ac548af6eaf19877ec2518480e8b0a89a7b6f3199" host="ci-4081-3-6-f-c182586e87" Sep 12 17:34:24.462703 containerd[1494]: 2025-09-12 17:34:24.431 [INFO][4309] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.ea4dba399394ac14f99bd30ac548af6eaf19877ec2518480e8b0a89a7b6f3199 Sep 12 17:34:24.462703 containerd[1494]: 2025-09-12 17:34:24.435 [INFO][4309] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.34.128/26 handle="k8s-pod-network.ea4dba399394ac14f99bd30ac548af6eaf19877ec2518480e8b0a89a7b6f3199" host="ci-4081-3-6-f-c182586e87" Sep 12 17:34:24.462703 containerd[1494]: 2025-09-12 17:34:24.439 [INFO][4309] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.34.130/26] block=192.168.34.128/26 handle="k8s-pod-network.ea4dba399394ac14f99bd30ac548af6eaf19877ec2518480e8b0a89a7b6f3199" host="ci-4081-3-6-f-c182586e87" Sep 12 17:34:24.462703 containerd[1494]: 2025-09-12 17:34:24.440 [INFO][4309] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.34.130/26] handle="k8s-pod-network.ea4dba399394ac14f99bd30ac548af6eaf19877ec2518480e8b0a89a7b6f3199" host="ci-4081-3-6-f-c182586e87" Sep 12 17:34:24.462703 containerd[1494]: 2025-09-12 17:34:24.440 [INFO][4309] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
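The walk above (ipam.go 691 through 1256) is the whole allocation: the node already holds an affinity for block 192.168.34.128/26, the block is loaded, and the next free address, 192.168.34.130, is claimed by writing the block back to the datastore. A /26 gives this node 64 addresses, handed out sequentially in this log: .129 went to the whisker pod, .130 to coredns-szrkp here, and .131 and .132 follow below. The span of the block, worked out with the standard library only:

    package main

    import (
        "fmt"
        "net/netip"
    )

    func main() {
        // The affine block from the ipam.go entries above.
        block := netip.MustParsePrefix("192.168.34.128/26")
        size := 1 << (32 - block.Bits()) // 2^(32-26) = 64 addresses

        first := block.Addr()
        last := first
        for i := 0; i < size-1; i++ {
            last = last.Next()
        }
        fmt.Printf("block %s: %d addresses, %s .. %s\n", block, size, first, last)
        // Output: block 192.168.34.128/26: 64 addresses, 192.168.34.128 .. 192.168.34.191
    }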
Sep 12 17:34:24.462703 containerd[1494]: 2025-09-12 17:34:24.440 [INFO][4309] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.34.130/26] IPv6=[] ContainerID="ea4dba399394ac14f99bd30ac548af6eaf19877ec2518480e8b0a89a7b6f3199" HandleID="k8s-pod-network.ea4dba399394ac14f99bd30ac548af6eaf19877ec2518480e8b0a89a7b6f3199" Workload="ci--4081--3--6--f--c182586e87-k8s-coredns--668d6bf9bc--szrkp-eth0" Sep 12 17:34:24.463649 containerd[1494]: 2025-09-12 17:34:24.442 [INFO][4274] cni-plugin/k8s.go 418: Populated endpoint ContainerID="ea4dba399394ac14f99bd30ac548af6eaf19877ec2518480e8b0a89a7b6f3199" Namespace="kube-system" Pod="coredns-668d6bf9bc-szrkp" WorkloadEndpoint="ci--4081--3--6--f--c182586e87-k8s-coredns--668d6bf9bc--szrkp-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--f--c182586e87-k8s-coredns--668d6bf9bc--szrkp-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"cbad7146-1812-4262-a0a7-fad3ba169a40", ResourceVersion:"891", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 33, 49, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-f-c182586e87", ContainerID:"", Pod:"coredns-668d6bf9bc-szrkp", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.34.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calif622695599e", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:34:24.463649 containerd[1494]: 2025-09-12 17:34:24.442 [INFO][4274] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.34.130/32] ContainerID="ea4dba399394ac14f99bd30ac548af6eaf19877ec2518480e8b0a89a7b6f3199" Namespace="kube-system" Pod="coredns-668d6bf9bc-szrkp" WorkloadEndpoint="ci--4081--3--6--f--c182586e87-k8s-coredns--668d6bf9bc--szrkp-eth0" Sep 12 17:34:24.463649 containerd[1494]: 2025-09-12 17:34:24.442 [INFO][4274] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calif622695599e ContainerID="ea4dba399394ac14f99bd30ac548af6eaf19877ec2518480e8b0a89a7b6f3199" Namespace="kube-system" Pod="coredns-668d6bf9bc-szrkp" WorkloadEndpoint="ci--4081--3--6--f--c182586e87-k8s-coredns--668d6bf9bc--szrkp-eth0" Sep 12 17:34:24.463649 containerd[1494]: 2025-09-12 17:34:24.447 [INFO][4274] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="ea4dba399394ac14f99bd30ac548af6eaf19877ec2518480e8b0a89a7b6f3199" Namespace="kube-system" 
Pod="coredns-668d6bf9bc-szrkp" WorkloadEndpoint="ci--4081--3--6--f--c182586e87-k8s-coredns--668d6bf9bc--szrkp-eth0" Sep 12 17:34:24.463649 containerd[1494]: 2025-09-12 17:34:24.447 [INFO][4274] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="ea4dba399394ac14f99bd30ac548af6eaf19877ec2518480e8b0a89a7b6f3199" Namespace="kube-system" Pod="coredns-668d6bf9bc-szrkp" WorkloadEndpoint="ci--4081--3--6--f--c182586e87-k8s-coredns--668d6bf9bc--szrkp-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--f--c182586e87-k8s-coredns--668d6bf9bc--szrkp-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"cbad7146-1812-4262-a0a7-fad3ba169a40", ResourceVersion:"891", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 33, 49, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-f-c182586e87", ContainerID:"ea4dba399394ac14f99bd30ac548af6eaf19877ec2518480e8b0a89a7b6f3199", Pod:"coredns-668d6bf9bc-szrkp", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.34.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calif622695599e", MAC:"7e:5c:c0:d4:15:a9", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:34:24.463649 containerd[1494]: 2025-09-12 17:34:24.460 [INFO][4274] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="ea4dba399394ac14f99bd30ac548af6eaf19877ec2518480e8b0a89a7b6f3199" Namespace="kube-system" Pod="coredns-668d6bf9bc-szrkp" WorkloadEndpoint="ci--4081--3--6--f--c182586e87-k8s-coredns--668d6bf9bc--szrkp-eth0" Sep 12 17:34:24.485342 containerd[1494]: time="2025-09-12T17:34:24.485195043Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Sep 12 17:34:24.485881 containerd[1494]: time="2025-09-12T17:34:24.485363190Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Sep 12 17:34:24.485881 containerd[1494]: time="2025-09-12T17:34:24.485391673Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 12 17:34:24.487486 containerd[1494]: time="2025-09-12T17:34:24.487335871Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 12 17:34:24.504385 systemd[1]: Started cri-containerd-ea4dba399394ac14f99bd30ac548af6eaf19877ec2518480e8b0a89a7b6f3199.scope - libcontainer container ea4dba399394ac14f99bd30ac548af6eaf19877ec2518480e8b0a89a7b6f3199. Sep 12 17:34:24.551372 containerd[1494]: time="2025-09-12T17:34:24.551310027Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-szrkp,Uid:cbad7146-1812-4262-a0a7-fad3ba169a40,Namespace:kube-system,Attempt:1,} returns sandbox id \"ea4dba399394ac14f99bd30ac548af6eaf19877ec2518480e8b0a89a7b6f3199\"" Sep 12 17:34:24.554568 containerd[1494]: time="2025-09-12T17:34:24.554414573Z" level=info msg="CreateContainer within sandbox \"ea4dba399394ac14f99bd30ac548af6eaf19877ec2518480e8b0a89a7b6f3199\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Sep 12 17:34:24.564605 systemd-networkd[1404]: calia6ca5fbe0b7: Link UP Sep 12 17:34:24.565445 systemd-networkd[1404]: calia6ca5fbe0b7: Gained carrier Sep 12 17:34:24.581070 containerd[1494]: 2025-09-12 17:34:24.377 [INFO][4265] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4081--3--6--f--c182586e87-k8s-goldmane--54d579b49d--wnp7f-eth0 goldmane-54d579b49d- calico-system 8d975815-4db9-4493-ae13-0fed34b78044 893 0 2025-09-12 17:34:00 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:54d579b49d projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s ci-4081-3-6-f-c182586e87 goldmane-54d579b49d-wnp7f eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] calia6ca5fbe0b7 [] [] }} ContainerID="5a0d91b33f460614a3b619958e39f5021fa374631aaf7aa43e52be6e4d546d12" Namespace="calico-system" Pod="goldmane-54d579b49d-wnp7f" WorkloadEndpoint="ci--4081--3--6--f--c182586e87-k8s-goldmane--54d579b49d--wnp7f-" Sep 12 17:34:24.581070 containerd[1494]: 2025-09-12 17:34:24.377 [INFO][4265] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="5a0d91b33f460614a3b619958e39f5021fa374631aaf7aa43e52be6e4d546d12" Namespace="calico-system" Pod="goldmane-54d579b49d-wnp7f" WorkloadEndpoint="ci--4081--3--6--f--c182586e87-k8s-goldmane--54d579b49d--wnp7f-eth0" Sep 12 17:34:24.581070 containerd[1494]: 2025-09-12 17:34:24.423 [INFO][4298] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="5a0d91b33f460614a3b619958e39f5021fa374631aaf7aa43e52be6e4d546d12" HandleID="k8s-pod-network.5a0d91b33f460614a3b619958e39f5021fa374631aaf7aa43e52be6e4d546d12" Workload="ci--4081--3--6--f--c182586e87-k8s-goldmane--54d579b49d--wnp7f-eth0" Sep 12 17:34:24.581070 containerd[1494]: 2025-09-12 17:34:24.423 [INFO][4298] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="5a0d91b33f460614a3b619958e39f5021fa374631aaf7aa43e52be6e4d546d12" HandleID="k8s-pod-network.5a0d91b33f460614a3b619958e39f5021fa374631aaf7aa43e52be6e4d546d12" Workload="ci--4081--3--6--f--c182586e87-k8s-goldmane--54d579b49d--wnp7f-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00024ef50), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4081-3-6-f-c182586e87", "pod":"goldmane-54d579b49d-wnp7f", "timestamp":"2025-09-12 17:34:24.422853887 +0000 UTC"}, Hostname:"ci-4081-3-6-f-c182586e87", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), 
IntendedUse:"Workload"} Sep 12 17:34:24.581070 containerd[1494]: 2025-09-12 17:34:24.423 [INFO][4298] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 17:34:24.581070 containerd[1494]: 2025-09-12 17:34:24.441 [INFO][4298] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 12 17:34:24.581070 containerd[1494]: 2025-09-12 17:34:24.441 [INFO][4298] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4081-3-6-f-c182586e87' Sep 12 17:34:24.581070 containerd[1494]: 2025-09-12 17:34:24.519 [INFO][4298] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.5a0d91b33f460614a3b619958e39f5021fa374631aaf7aa43e52be6e4d546d12" host="ci-4081-3-6-f-c182586e87" Sep 12 17:34:24.581070 containerd[1494]: 2025-09-12 17:34:24.525 [INFO][4298] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4081-3-6-f-c182586e87" Sep 12 17:34:24.581070 containerd[1494]: 2025-09-12 17:34:24.530 [INFO][4298] ipam/ipam.go 511: Trying affinity for 192.168.34.128/26 host="ci-4081-3-6-f-c182586e87" Sep 12 17:34:24.581070 containerd[1494]: 2025-09-12 17:34:24.532 [INFO][4298] ipam/ipam.go 158: Attempting to load block cidr=192.168.34.128/26 host="ci-4081-3-6-f-c182586e87" Sep 12 17:34:24.581070 containerd[1494]: 2025-09-12 17:34:24.534 [INFO][4298] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.34.128/26 host="ci-4081-3-6-f-c182586e87" Sep 12 17:34:24.581070 containerd[1494]: 2025-09-12 17:34:24.534 [INFO][4298] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.34.128/26 handle="k8s-pod-network.5a0d91b33f460614a3b619958e39f5021fa374631aaf7aa43e52be6e4d546d12" host="ci-4081-3-6-f-c182586e87" Sep 12 17:34:24.581070 containerd[1494]: 2025-09-12 17:34:24.535 [INFO][4298] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.5a0d91b33f460614a3b619958e39f5021fa374631aaf7aa43e52be6e4d546d12 Sep 12 17:34:24.581070 containerd[1494]: 2025-09-12 17:34:24.539 [INFO][4298] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.34.128/26 handle="k8s-pod-network.5a0d91b33f460614a3b619958e39f5021fa374631aaf7aa43e52be6e4d546d12" host="ci-4081-3-6-f-c182586e87" Sep 12 17:34:24.581070 containerd[1494]: 2025-09-12 17:34:24.547 [INFO][4298] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.34.131/26] block=192.168.34.128/26 handle="k8s-pod-network.5a0d91b33f460614a3b619958e39f5021fa374631aaf7aa43e52be6e4d546d12" host="ci-4081-3-6-f-c182586e87" Sep 12 17:34:24.581070 containerd[1494]: 2025-09-12 17:34:24.548 [INFO][4298] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.34.131/26] handle="k8s-pod-network.5a0d91b33f460614a3b619958e39f5021fa374631aaf7aa43e52be6e4d546d12" host="ci-4081-3-6-f-c182586e87" Sep 12 17:34:24.581070 containerd[1494]: 2025-09-12 17:34:24.548 [INFO][4298] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 12 17:34:24.581070 containerd[1494]: 2025-09-12 17:34:24.548 [INFO][4298] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.34.131/26] IPv6=[] ContainerID="5a0d91b33f460614a3b619958e39f5021fa374631aaf7aa43e52be6e4d546d12" HandleID="k8s-pod-network.5a0d91b33f460614a3b619958e39f5021fa374631aaf7aa43e52be6e4d546d12" Workload="ci--4081--3--6--f--c182586e87-k8s-goldmane--54d579b49d--wnp7f-eth0" Sep 12 17:34:24.582293 containerd[1494]: 2025-09-12 17:34:24.560 [INFO][4265] cni-plugin/k8s.go 418: Populated endpoint ContainerID="5a0d91b33f460614a3b619958e39f5021fa374631aaf7aa43e52be6e4d546d12" Namespace="calico-system" Pod="goldmane-54d579b49d-wnp7f" WorkloadEndpoint="ci--4081--3--6--f--c182586e87-k8s-goldmane--54d579b49d--wnp7f-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--f--c182586e87-k8s-goldmane--54d579b49d--wnp7f-eth0", GenerateName:"goldmane-54d579b49d-", Namespace:"calico-system", SelfLink:"", UID:"8d975815-4db9-4493-ae13-0fed34b78044", ResourceVersion:"893", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 34, 0, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"54d579b49d", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-f-c182586e87", ContainerID:"", Pod:"goldmane-54d579b49d-wnp7f", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.34.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"calia6ca5fbe0b7", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:34:24.582293 containerd[1494]: 2025-09-12 17:34:24.561 [INFO][4265] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.34.131/32] ContainerID="5a0d91b33f460614a3b619958e39f5021fa374631aaf7aa43e52be6e4d546d12" Namespace="calico-system" Pod="goldmane-54d579b49d-wnp7f" WorkloadEndpoint="ci--4081--3--6--f--c182586e87-k8s-goldmane--54d579b49d--wnp7f-eth0" Sep 12 17:34:24.582293 containerd[1494]: 2025-09-12 17:34:24.561 [INFO][4265] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calia6ca5fbe0b7 ContainerID="5a0d91b33f460614a3b619958e39f5021fa374631aaf7aa43e52be6e4d546d12" Namespace="calico-system" Pod="goldmane-54d579b49d-wnp7f" WorkloadEndpoint="ci--4081--3--6--f--c182586e87-k8s-goldmane--54d579b49d--wnp7f-eth0" Sep 12 17:34:24.582293 containerd[1494]: 2025-09-12 17:34:24.565 [INFO][4265] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="5a0d91b33f460614a3b619958e39f5021fa374631aaf7aa43e52be6e4d546d12" Namespace="calico-system" Pod="goldmane-54d579b49d-wnp7f" WorkloadEndpoint="ci--4081--3--6--f--c182586e87-k8s-goldmane--54d579b49d--wnp7f-eth0" Sep 12 17:34:24.582293 containerd[1494]: 2025-09-12 17:34:24.566 [INFO][4265] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="5a0d91b33f460614a3b619958e39f5021fa374631aaf7aa43e52be6e4d546d12" 
Namespace="calico-system" Pod="goldmane-54d579b49d-wnp7f" WorkloadEndpoint="ci--4081--3--6--f--c182586e87-k8s-goldmane--54d579b49d--wnp7f-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--f--c182586e87-k8s-goldmane--54d579b49d--wnp7f-eth0", GenerateName:"goldmane-54d579b49d-", Namespace:"calico-system", SelfLink:"", UID:"8d975815-4db9-4493-ae13-0fed34b78044", ResourceVersion:"893", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 34, 0, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"54d579b49d", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-f-c182586e87", ContainerID:"5a0d91b33f460614a3b619958e39f5021fa374631aaf7aa43e52be6e4d546d12", Pod:"goldmane-54d579b49d-wnp7f", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.34.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"calia6ca5fbe0b7", MAC:"4e:2d:da:40:b2:59", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:34:24.582293 containerd[1494]: 2025-09-12 17:34:24.578 [INFO][4265] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="5a0d91b33f460614a3b619958e39f5021fa374631aaf7aa43e52be6e4d546d12" Namespace="calico-system" Pod="goldmane-54d579b49d-wnp7f" WorkloadEndpoint="ci--4081--3--6--f--c182586e87-k8s-goldmane--54d579b49d--wnp7f-eth0" Sep 12 17:34:24.583398 containerd[1494]: time="2025-09-12T17:34:24.583368238Z" level=info msg="CreateContainer within sandbox \"ea4dba399394ac14f99bd30ac548af6eaf19877ec2518480e8b0a89a7b6f3199\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"6109edc69051f77d969c40b6886435753e722d9ba2b2d4300178871af517f9a0\"" Sep 12 17:34:24.583914 containerd[1494]: time="2025-09-12T17:34:24.583791707Z" level=info msg="StartContainer for \"6109edc69051f77d969c40b6886435753e722d9ba2b2d4300178871af517f9a0\"" Sep 12 17:34:24.608353 systemd[1]: Started cri-containerd-6109edc69051f77d969c40b6886435753e722d9ba2b2d4300178871af517f9a0.scope - libcontainer container 6109edc69051f77d969c40b6886435753e722d9ba2b2d4300178871af517f9a0. Sep 12 17:34:24.611042 containerd[1494]: time="2025-09-12T17:34:24.610357589Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Sep 12 17:34:24.611042 containerd[1494]: time="2025-09-12T17:34:24.610613723Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Sep 12 17:34:24.611138 containerd[1494]: time="2025-09-12T17:34:24.610951990Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 12 17:34:24.611138 containerd[1494]: time="2025-09-12T17:34:24.611036590Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 12 17:34:24.635377 systemd[1]: Started cri-containerd-5a0d91b33f460614a3b619958e39f5021fa374631aaf7aa43e52be6e4d546d12.scope - libcontainer container 5a0d91b33f460614a3b619958e39f5021fa374631aaf7aa43e52be6e4d546d12. Sep 12 17:34:24.644724 containerd[1494]: time="2025-09-12T17:34:24.644644205Z" level=info msg="StartContainer for \"6109edc69051f77d969c40b6886435753e722d9ba2b2d4300178871af517f9a0\" returns successfully" Sep 12 17:34:24.672414 systemd-networkd[1404]: calif4e7666601f: Link UP Sep 12 17:34:24.673008 systemd-networkd[1404]: calif4e7666601f: Gained carrier Sep 12 17:34:24.686482 containerd[1494]: 2025-09-12 17:34:24.383 [INFO][4283] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4081--3--6--f--c182586e87-k8s-calico--apiserver--6dd46897bd--bbjqp-eth0 calico-apiserver-6dd46897bd- calico-apiserver 9ee60034-1a31-4418-b743-0f97ab43bc92 892 0 2025-09-12 17:33:58 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:6dd46897bd projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4081-3-6-f-c182586e87 calico-apiserver-6dd46897bd-bbjqp eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calif4e7666601f [] [] }} ContainerID="bf699b73a6ad071f5fccc960cf1d02448f7d1eb2a2d22333c04578c66b33e8c5" Namespace="calico-apiserver" Pod="calico-apiserver-6dd46897bd-bbjqp" WorkloadEndpoint="ci--4081--3--6--f--c182586e87-k8s-calico--apiserver--6dd46897bd--bbjqp-" Sep 12 17:34:24.686482 containerd[1494]: 2025-09-12 17:34:24.383 [INFO][4283] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="bf699b73a6ad071f5fccc960cf1d02448f7d1eb2a2d22333c04578c66b33e8c5" Namespace="calico-apiserver" Pod="calico-apiserver-6dd46897bd-bbjqp" WorkloadEndpoint="ci--4081--3--6--f--c182586e87-k8s-calico--apiserver--6dd46897bd--bbjqp-eth0" Sep 12 17:34:24.686482 containerd[1494]: 2025-09-12 17:34:24.426 [INFO][4304] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="bf699b73a6ad071f5fccc960cf1d02448f7d1eb2a2d22333c04578c66b33e8c5" HandleID="k8s-pod-network.bf699b73a6ad071f5fccc960cf1d02448f7d1eb2a2d22333c04578c66b33e8c5" Workload="ci--4081--3--6--f--c182586e87-k8s-calico--apiserver--6dd46897bd--bbjqp-eth0" Sep 12 17:34:24.686482 containerd[1494]: 2025-09-12 17:34:24.426 [INFO][4304] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="bf699b73a6ad071f5fccc960cf1d02448f7d1eb2a2d22333c04578c66b33e8c5" HandleID="k8s-pod-network.bf699b73a6ad071f5fccc960cf1d02448f7d1eb2a2d22333c04578c66b33e8c5" Workload="ci--4081--3--6--f--c182586e87-k8s-calico--apiserver--6dd46897bd--bbjqp-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d56a0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4081-3-6-f-c182586e87", "pod":"calico-apiserver-6dd46897bd-bbjqp", "timestamp":"2025-09-12 17:34:24.426543006 +0000 UTC"}, Hostname:"ci-4081-3-6-f-c182586e87", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 17:34:24.686482 containerd[1494]: 2025-09-12 17:34:24.426 [INFO][4304] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. 
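ipam_plugin.go 265 prints the request it hands to the allocator as a literal ipam.AutoAssignArgs; the one for calico-apiserver-6dd46897bd-bbjqp above is reproduced below as compilable Go. Every field value is copied from the log line; only the import path is an assumption about the libcalico-go layout:

    package main

    import (
        "fmt"
        "net"

        "github.com/projectcalico/calico/libcalico-go/lib/ipam"
    )

    func main() {
        handleID := "k8s-pod-network.bf699b73a6ad071f5fccc960cf1d02448f7d1eb2a2d22333c04578c66b33e8c5"
        args := ipam.AutoAssignArgs{
            Num4:     1,
            Num6:     0,
            HandleID: &handleID,
            Attrs: map[string]string{
                "namespace": "calico-apiserver",
                "node":      "ci-4081-3-6-f-c182586e87",
                "pod":       "calico-apiserver-6dd46897bd-bbjqp",
                "timestamp": "2025-09-12 17:34:24.426543006 +0000 UTC",
            },
            Hostname:    "ci-4081-3-6-f-c182586e87",
            IPv4Pools:   []net.IPNet{}, // empty: fall back to the node's affine blocks
            IPv6Pools:   []net.IPNet{},
            IntendedUse: "Workload",
        }
        fmt.Printf("requesting %d IPv4 address(es), handle %s\n", args.Num4, *args.HandleID)
    }

Leaving IPv4Pools empty is what sends the request down the affinity path logged above, rather than pinning it to a specific pool.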
Sep 12 17:34:24.686482 containerd[1494]: 2025-09-12 17:34:24.549 [INFO][4304] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 12 17:34:24.686482 containerd[1494]: 2025-09-12 17:34:24.549 [INFO][4304] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4081-3-6-f-c182586e87' Sep 12 17:34:24.686482 containerd[1494]: 2025-09-12 17:34:24.619 [INFO][4304] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.bf699b73a6ad071f5fccc960cf1d02448f7d1eb2a2d22333c04578c66b33e8c5" host="ci-4081-3-6-f-c182586e87" Sep 12 17:34:24.686482 containerd[1494]: 2025-09-12 17:34:24.627 [INFO][4304] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4081-3-6-f-c182586e87" Sep 12 17:34:24.686482 containerd[1494]: 2025-09-12 17:34:24.638 [INFO][4304] ipam/ipam.go 511: Trying affinity for 192.168.34.128/26 host="ci-4081-3-6-f-c182586e87" Sep 12 17:34:24.686482 containerd[1494]: 2025-09-12 17:34:24.640 [INFO][4304] ipam/ipam.go 158: Attempting to load block cidr=192.168.34.128/26 host="ci-4081-3-6-f-c182586e87" Sep 12 17:34:24.686482 containerd[1494]: 2025-09-12 17:34:24.642 [INFO][4304] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.34.128/26 host="ci-4081-3-6-f-c182586e87" Sep 12 17:34:24.686482 containerd[1494]: 2025-09-12 17:34:24.643 [INFO][4304] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.34.128/26 handle="k8s-pod-network.bf699b73a6ad071f5fccc960cf1d02448f7d1eb2a2d22333c04578c66b33e8c5" host="ci-4081-3-6-f-c182586e87" Sep 12 17:34:24.686482 containerd[1494]: 2025-09-12 17:34:24.645 [INFO][4304] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.bf699b73a6ad071f5fccc960cf1d02448f7d1eb2a2d22333c04578c66b33e8c5 Sep 12 17:34:24.686482 containerd[1494]: 2025-09-12 17:34:24.652 [INFO][4304] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.34.128/26 handle="k8s-pod-network.bf699b73a6ad071f5fccc960cf1d02448f7d1eb2a2d22333c04578c66b33e8c5" host="ci-4081-3-6-f-c182586e87" Sep 12 17:34:24.686482 containerd[1494]: 2025-09-12 17:34:24.658 [INFO][4304] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.34.132/26] block=192.168.34.128/26 handle="k8s-pod-network.bf699b73a6ad071f5fccc960cf1d02448f7d1eb2a2d22333c04578c66b33e8c5" host="ci-4081-3-6-f-c182586e87" Sep 12 17:34:24.686482 containerd[1494]: 2025-09-12 17:34:24.659 [INFO][4304] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.34.132/26] handle="k8s-pod-network.bf699b73a6ad071f5fccc960cf1d02448f7d1eb2a2d22333c04578c66b33e8c5" host="ci-4081-3-6-f-c182586e87" Sep 12 17:34:24.686482 containerd[1494]: 2025-09-12 17:34:24.659 [INFO][4304] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
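Every HandleID in this log follows one pattern: the fixed prefix "k8s-pod-network." plus the sandbox's 64-hex-character ContainerID. That is what lets the release path at ipam_plugin.go 412 find an allocation from nothing but the container being torn down. A trivial sketch of the convention (the helper name is hypothetical):

    package main

    import "fmt"

    // handleIDForSandbox is a hypothetical helper mirroring the handle names
    // the IPAM plugin logs: "k8s-pod-network." + sandbox ContainerID.
    func handleIDForSandbox(containerID string) string {
        return "k8s-pod-network." + containerID
    }

    func main() {
        // Matches the HandleID in the calico-apiserver entries above.
        fmt.Println(handleIDForSandbox("bf699b73a6ad071f5fccc960cf1d02448f7d1eb2a2d22333c04578c66b33e8c5"))
    }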
Sep 12 17:34:24.686482 containerd[1494]: 2025-09-12 17:34:24.659 [INFO][4304] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.34.132/26] IPv6=[] ContainerID="bf699b73a6ad071f5fccc960cf1d02448f7d1eb2a2d22333c04578c66b33e8c5" HandleID="k8s-pod-network.bf699b73a6ad071f5fccc960cf1d02448f7d1eb2a2d22333c04578c66b33e8c5" Workload="ci--4081--3--6--f--c182586e87-k8s-calico--apiserver--6dd46897bd--bbjqp-eth0" Sep 12 17:34:24.687988 containerd[1494]: 2025-09-12 17:34:24.665 [INFO][4283] cni-plugin/k8s.go 418: Populated endpoint ContainerID="bf699b73a6ad071f5fccc960cf1d02448f7d1eb2a2d22333c04578c66b33e8c5" Namespace="calico-apiserver" Pod="calico-apiserver-6dd46897bd-bbjqp" WorkloadEndpoint="ci--4081--3--6--f--c182586e87-k8s-calico--apiserver--6dd46897bd--bbjqp-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--f--c182586e87-k8s-calico--apiserver--6dd46897bd--bbjqp-eth0", GenerateName:"calico-apiserver-6dd46897bd-", Namespace:"calico-apiserver", SelfLink:"", UID:"9ee60034-1a31-4418-b743-0f97ab43bc92", ResourceVersion:"892", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 33, 58, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6dd46897bd", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-f-c182586e87", ContainerID:"", Pod:"calico-apiserver-6dd46897bd-bbjqp", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.34.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calif4e7666601f", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:34:24.687988 containerd[1494]: 2025-09-12 17:34:24.665 [INFO][4283] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.34.132/32] ContainerID="bf699b73a6ad071f5fccc960cf1d02448f7d1eb2a2d22333c04578c66b33e8c5" Namespace="calico-apiserver" Pod="calico-apiserver-6dd46897bd-bbjqp" WorkloadEndpoint="ci--4081--3--6--f--c182586e87-k8s-calico--apiserver--6dd46897bd--bbjqp-eth0" Sep 12 17:34:24.687988 containerd[1494]: 2025-09-12 17:34:24.665 [INFO][4283] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calif4e7666601f ContainerID="bf699b73a6ad071f5fccc960cf1d02448f7d1eb2a2d22333c04578c66b33e8c5" Namespace="calico-apiserver" Pod="calico-apiserver-6dd46897bd-bbjqp" WorkloadEndpoint="ci--4081--3--6--f--c182586e87-k8s-calico--apiserver--6dd46897bd--bbjqp-eth0" Sep 12 17:34:24.687988 containerd[1494]: 2025-09-12 17:34:24.673 [INFO][4283] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="bf699b73a6ad071f5fccc960cf1d02448f7d1eb2a2d22333c04578c66b33e8c5" Namespace="calico-apiserver" Pod="calico-apiserver-6dd46897bd-bbjqp" WorkloadEndpoint="ci--4081--3--6--f--c182586e87-k8s-calico--apiserver--6dd46897bd--bbjqp-eth0" Sep 12 17:34:24.687988 containerd[1494]: 2025-09-12 
17:34:24.674 [INFO][4283] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="bf699b73a6ad071f5fccc960cf1d02448f7d1eb2a2d22333c04578c66b33e8c5" Namespace="calico-apiserver" Pod="calico-apiserver-6dd46897bd-bbjqp" WorkloadEndpoint="ci--4081--3--6--f--c182586e87-k8s-calico--apiserver--6dd46897bd--bbjqp-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--f--c182586e87-k8s-calico--apiserver--6dd46897bd--bbjqp-eth0", GenerateName:"calico-apiserver-6dd46897bd-", Namespace:"calico-apiserver", SelfLink:"", UID:"9ee60034-1a31-4418-b743-0f97ab43bc92", ResourceVersion:"892", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 33, 58, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6dd46897bd", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-f-c182586e87", ContainerID:"bf699b73a6ad071f5fccc960cf1d02448f7d1eb2a2d22333c04578c66b33e8c5", Pod:"calico-apiserver-6dd46897bd-bbjqp", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.34.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calif4e7666601f", MAC:"86:ee:5e:b9:5d:11", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:34:24.687988 containerd[1494]: 2025-09-12 17:34:24.683 [INFO][4283] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="bf699b73a6ad071f5fccc960cf1d02448f7d1eb2a2d22333c04578c66b33e8c5" Namespace="calico-apiserver" Pod="calico-apiserver-6dd46897bd-bbjqp" WorkloadEndpoint="ci--4081--3--6--f--c182586e87-k8s-calico--apiserver--6dd46897bd--bbjqp-eth0" Sep 12 17:34:24.711894 containerd[1494]: time="2025-09-12T17:34:24.711792759Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Sep 12 17:34:24.713518 containerd[1494]: time="2025-09-12T17:34:24.711873250Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Sep 12 17:34:24.713518 containerd[1494]: time="2025-09-12T17:34:24.711884732Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 12 17:34:24.713518 containerd[1494]: time="2025-09-12T17:34:24.713417884Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 12 17:34:24.737452 systemd[1]: Started cri-containerd-bf699b73a6ad071f5fccc960cf1d02448f7d1eb2a2d22333c04578c66b33e8c5.scope - libcontainer container bf699b73a6ad071f5fccc960cf1d02448f7d1eb2a2d22333c04578c66b33e8c5. 
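Each "Started cri-containerd-<id>.scope" entry is systemd running the sandbox's shim-managed container, and the same IDs are addressable through containerd's Go client. A sketch that loads the calico-apiserver sandbox just started above; the socket path and the "k8s.io" namespace are the stock CRI defaults, assumed here:

    package main

    import (
        "context"
        "fmt"
        "log"

        containerd "github.com/containerd/containerd"
        "github.com/containerd/containerd/namespaces"
    )

    func main() {
        client, err := containerd.New("/run/containerd/containerd.sock")
        if err != nil {
            log.Fatal(err)
        }
        defer client.Close()

        // CRI-managed containers live in the "k8s.io" namespace.
        ctx := namespaces.WithNamespace(context.Background(), "k8s.io")
        c, err := client.LoadContainer(ctx, "bf699b73a6ad071f5fccc960cf1d02448f7d1eb2a2d22333c04578c66b33e8c5")
        if err != nil {
            log.Fatal(err)
        }
        info, err := c.Info(ctx)
        if err != nil {
            log.Fatal(err)
        }
        // Runtime.Name should report io.containerd.runc.v2, matching the
        // "loading plugin" entries above.
        fmt.Printf("id=%s image=%s runtime=%s\n", info.ID, info.Image, info.Runtime.Name)
    }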
Sep 12 17:34:24.738419 containerd[1494]: time="2025-09-12T17:34:24.738342289Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-wnp7f,Uid:8d975815-4db9-4493-ae13-0fed34b78044,Namespace:calico-system,Attempt:1,} returns sandbox id \"5a0d91b33f460614a3b619958e39f5021fa374631aaf7aa43e52be6e4d546d12\"" Sep 12 17:34:24.779253 containerd[1494]: time="2025-09-12T17:34:24.779179962Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6dd46897bd-bbjqp,Uid:9ee60034-1a31-4418-b743-0f97ab43bc92,Namespace:calico-apiserver,Attempt:1,} returns sandbox id \"bf699b73a6ad071f5fccc960cf1d02448f7d1eb2a2d22333c04578c66b33e8c5\"" Sep 12 17:34:25.082359 containerd[1494]: time="2025-09-12T17:34:25.082311258Z" level=info msg="StopPodSandbox for \"66452d647c7edebd5506ddc6e3bc28bf1104546ea2fa33230da48fad374cdb30\"" Sep 12 17:34:25.225716 containerd[1494]: 2025-09-12 17:34:25.156 [INFO][4520] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="66452d647c7edebd5506ddc6e3bc28bf1104546ea2fa33230da48fad374cdb30" Sep 12 17:34:25.225716 containerd[1494]: 2025-09-12 17:34:25.157 [INFO][4520] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="66452d647c7edebd5506ddc6e3bc28bf1104546ea2fa33230da48fad374cdb30" iface="eth0" netns="/var/run/netns/cni-796630c7-2586-8610-8e54-d10c758d159a" Sep 12 17:34:25.225716 containerd[1494]: 2025-09-12 17:34:25.158 [INFO][4520] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="66452d647c7edebd5506ddc6e3bc28bf1104546ea2fa33230da48fad374cdb30" iface="eth0" netns="/var/run/netns/cni-796630c7-2586-8610-8e54-d10c758d159a" Sep 12 17:34:25.225716 containerd[1494]: 2025-09-12 17:34:25.158 [INFO][4520] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. ContainerID="66452d647c7edebd5506ddc6e3bc28bf1104546ea2fa33230da48fad374cdb30" iface="eth0" netns="/var/run/netns/cni-796630c7-2586-8610-8e54-d10c758d159a" Sep 12 17:34:25.225716 containerd[1494]: 2025-09-12 17:34:25.158 [INFO][4520] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="66452d647c7edebd5506ddc6e3bc28bf1104546ea2fa33230da48fad374cdb30" Sep 12 17:34:25.225716 containerd[1494]: 2025-09-12 17:34:25.158 [INFO][4520] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="66452d647c7edebd5506ddc6e3bc28bf1104546ea2fa33230da48fad374cdb30" Sep 12 17:34:25.225716 containerd[1494]: 2025-09-12 17:34:25.206 [INFO][4527] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="66452d647c7edebd5506ddc6e3bc28bf1104546ea2fa33230da48fad374cdb30" HandleID="k8s-pod-network.66452d647c7edebd5506ddc6e3bc28bf1104546ea2fa33230da48fad374cdb30" Workload="ci--4081--3--6--f--c182586e87-k8s-coredns--668d6bf9bc--s4d7g-eth0" Sep 12 17:34:25.225716 containerd[1494]: 2025-09-12 17:34:25.206 [INFO][4527] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 17:34:25.225716 containerd[1494]: 2025-09-12 17:34:25.207 [INFO][4527] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 12 17:34:25.225716 containerd[1494]: 2025-09-12 17:34:25.217 [WARNING][4527] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="66452d647c7edebd5506ddc6e3bc28bf1104546ea2fa33230da48fad374cdb30" HandleID="k8s-pod-network.66452d647c7edebd5506ddc6e3bc28bf1104546ea2fa33230da48fad374cdb30" Workload="ci--4081--3--6--f--c182586e87-k8s-coredns--668d6bf9bc--s4d7g-eth0" Sep 12 17:34:25.225716 containerd[1494]: 2025-09-12 17:34:25.217 [INFO][4527] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="66452d647c7edebd5506ddc6e3bc28bf1104546ea2fa33230da48fad374cdb30" HandleID="k8s-pod-network.66452d647c7edebd5506ddc6e3bc28bf1104546ea2fa33230da48fad374cdb30" Workload="ci--4081--3--6--f--c182586e87-k8s-coredns--668d6bf9bc--s4d7g-eth0" Sep 12 17:34:25.225716 containerd[1494]: 2025-09-12 17:34:25.220 [INFO][4527] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 17:34:25.225716 containerd[1494]: 2025-09-12 17:34:25.223 [INFO][4520] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="66452d647c7edebd5506ddc6e3bc28bf1104546ea2fa33230da48fad374cdb30" Sep 12 17:34:25.226658 containerd[1494]: time="2025-09-12T17:34:25.225895022Z" level=info msg="TearDown network for sandbox \"66452d647c7edebd5506ddc6e3bc28bf1104546ea2fa33230da48fad374cdb30\" successfully" Sep 12 17:34:25.226658 containerd[1494]: time="2025-09-12T17:34:25.225932123Z" level=info msg="StopPodSandbox for \"66452d647c7edebd5506ddc6e3bc28bf1104546ea2fa33230da48fad374cdb30\" returns successfully" Sep 12 17:34:25.226972 containerd[1494]: time="2025-09-12T17:34:25.226907792Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-s4d7g,Uid:33f03454-7bb2-47fc-bcae-807921ce8ad3,Namespace:kube-system,Attempt:1,}" Sep 12 17:34:25.326092 systemd[1]: run-netns-cni\x2d796630c7\x2d2586\x2d8610\x2d8e54\x2dd10c758d159a.mount: Deactivated successfully. Sep 12 17:34:25.361192 kubelet[2561]: I0912 17:34:25.361052 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-668d6bf9bc-szrkp" podStartSLOduration=36.361031448 podStartE2EDuration="36.361031448s" podCreationTimestamp="2025-09-12 17:33:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-12 17:34:25.360243232 +0000 UTC m=+41.387658322" watchObservedRunningTime="2025-09-12 17:34:25.361031448 +0000 UTC m=+41.388446548" Sep 12 17:34:25.422422 systemd-networkd[1404]: cali8bcc98ec2c8: Link UP Sep 12 17:34:25.422567 systemd-networkd[1404]: cali8bcc98ec2c8: Gained carrier Sep 12 17:34:25.435795 containerd[1494]: 2025-09-12 17:34:25.291 [INFO][4534] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4081--3--6--f--c182586e87-k8s-coredns--668d6bf9bc--s4d7g-eth0 coredns-668d6bf9bc- kube-system 33f03454-7bb2-47fc-bcae-807921ce8ad3 910 0 2025-09-12 17:33:49 +0000 UTC map[k8s-app:kube-dns pod-template-hash:668d6bf9bc projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4081-3-6-f-c182586e87 coredns-668d6bf9bc-s4d7g eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali8bcc98ec2c8 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="47c0aa89385cfc22baf7a1d40394f1009cfaaa8af45053624c69e492710e4244" Namespace="kube-system" Pod="coredns-668d6bf9bc-s4d7g" WorkloadEndpoint="ci--4081--3--6--f--c182586e87-k8s-coredns--668d6bf9bc--s4d7g-" Sep 12 17:34:25.435795 containerd[1494]: 2025-09-12 17:34:25.291 [INFO][4534] cni-plugin/k8s.go 74: Extracted 
identifiers for CmdAddK8s ContainerID="47c0aa89385cfc22baf7a1d40394f1009cfaaa8af45053624c69e492710e4244" Namespace="kube-system" Pod="coredns-668d6bf9bc-s4d7g" WorkloadEndpoint="ci--4081--3--6--f--c182586e87-k8s-coredns--668d6bf9bc--s4d7g-eth0" Sep 12 17:34:25.435795 containerd[1494]: 2025-09-12 17:34:25.356 [INFO][4545] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="47c0aa89385cfc22baf7a1d40394f1009cfaaa8af45053624c69e492710e4244" HandleID="k8s-pod-network.47c0aa89385cfc22baf7a1d40394f1009cfaaa8af45053624c69e492710e4244" Workload="ci--4081--3--6--f--c182586e87-k8s-coredns--668d6bf9bc--s4d7g-eth0" Sep 12 17:34:25.435795 containerd[1494]: 2025-09-12 17:34:25.357 [INFO][4545] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="47c0aa89385cfc22baf7a1d40394f1009cfaaa8af45053624c69e492710e4244" HandleID="k8s-pod-network.47c0aa89385cfc22baf7a1d40394f1009cfaaa8af45053624c69e492710e4244" Workload="ci--4081--3--6--f--c182586e87-k8s-coredns--668d6bf9bc--s4d7g-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00004f9d0), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4081-3-6-f-c182586e87", "pod":"coredns-668d6bf9bc-s4d7g", "timestamp":"2025-09-12 17:34:25.356776104 +0000 UTC"}, Hostname:"ci-4081-3-6-f-c182586e87", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 17:34:25.435795 containerd[1494]: 2025-09-12 17:34:25.358 [INFO][4545] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 17:34:25.435795 containerd[1494]: 2025-09-12 17:34:25.358 [INFO][4545] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 12 17:34:25.435795 containerd[1494]: 2025-09-12 17:34:25.359 [INFO][4545] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4081-3-6-f-c182586e87' Sep 12 17:34:25.435795 containerd[1494]: 2025-09-12 17:34:25.388 [INFO][4545] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.47c0aa89385cfc22baf7a1d40394f1009cfaaa8af45053624c69e492710e4244" host="ci-4081-3-6-f-c182586e87" Sep 12 17:34:25.435795 containerd[1494]: 2025-09-12 17:34:25.399 [INFO][4545] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4081-3-6-f-c182586e87" Sep 12 17:34:25.435795 containerd[1494]: 2025-09-12 17:34:25.404 [INFO][4545] ipam/ipam.go 511: Trying affinity for 192.168.34.128/26 host="ci-4081-3-6-f-c182586e87" Sep 12 17:34:25.435795 containerd[1494]: 2025-09-12 17:34:25.406 [INFO][4545] ipam/ipam.go 158: Attempting to load block cidr=192.168.34.128/26 host="ci-4081-3-6-f-c182586e87" Sep 12 17:34:25.435795 containerd[1494]: 2025-09-12 17:34:25.407 [INFO][4545] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.34.128/26 host="ci-4081-3-6-f-c182586e87" Sep 12 17:34:25.435795 containerd[1494]: 2025-09-12 17:34:25.407 [INFO][4545] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.34.128/26 handle="k8s-pod-network.47c0aa89385cfc22baf7a1d40394f1009cfaaa8af45053624c69e492710e4244" host="ci-4081-3-6-f-c182586e87" Sep 12 17:34:25.435795 containerd[1494]: 2025-09-12 17:34:25.409 [INFO][4545] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.47c0aa89385cfc22baf7a1d40394f1009cfaaa8af45053624c69e492710e4244 Sep 12 17:34:25.435795 containerd[1494]: 2025-09-12 17:34:25.413 [INFO][4545] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.34.128/26 
handle="k8s-pod-network.47c0aa89385cfc22baf7a1d40394f1009cfaaa8af45053624c69e492710e4244" host="ci-4081-3-6-f-c182586e87" Sep 12 17:34:25.435795 containerd[1494]: 2025-09-12 17:34:25.417 [INFO][4545] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.34.133/26] block=192.168.34.128/26 handle="k8s-pod-network.47c0aa89385cfc22baf7a1d40394f1009cfaaa8af45053624c69e492710e4244" host="ci-4081-3-6-f-c182586e87" Sep 12 17:34:25.435795 containerd[1494]: 2025-09-12 17:34:25.417 [INFO][4545] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.34.133/26] handle="k8s-pod-network.47c0aa89385cfc22baf7a1d40394f1009cfaaa8af45053624c69e492710e4244" host="ci-4081-3-6-f-c182586e87" Sep 12 17:34:25.435795 containerd[1494]: 2025-09-12 17:34:25.417 [INFO][4545] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 17:34:25.435795 containerd[1494]: 2025-09-12 17:34:25.417 [INFO][4545] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.34.133/26] IPv6=[] ContainerID="47c0aa89385cfc22baf7a1d40394f1009cfaaa8af45053624c69e492710e4244" HandleID="k8s-pod-network.47c0aa89385cfc22baf7a1d40394f1009cfaaa8af45053624c69e492710e4244" Workload="ci--4081--3--6--f--c182586e87-k8s-coredns--668d6bf9bc--s4d7g-eth0" Sep 12 17:34:25.436837 containerd[1494]: 2025-09-12 17:34:25.420 [INFO][4534] cni-plugin/k8s.go 418: Populated endpoint ContainerID="47c0aa89385cfc22baf7a1d40394f1009cfaaa8af45053624c69e492710e4244" Namespace="kube-system" Pod="coredns-668d6bf9bc-s4d7g" WorkloadEndpoint="ci--4081--3--6--f--c182586e87-k8s-coredns--668d6bf9bc--s4d7g-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--f--c182586e87-k8s-coredns--668d6bf9bc--s4d7g-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"33f03454-7bb2-47fc-bcae-807921ce8ad3", ResourceVersion:"910", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 33, 49, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-f-c182586e87", ContainerID:"", Pod:"coredns-668d6bf9bc-s4d7g", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.34.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali8bcc98ec2c8", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:34:25.436837 containerd[1494]: 2025-09-12 17:34:25.420 [INFO][4534] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.34.133/32] 
ContainerID="47c0aa89385cfc22baf7a1d40394f1009cfaaa8af45053624c69e492710e4244" Namespace="kube-system" Pod="coredns-668d6bf9bc-s4d7g" WorkloadEndpoint="ci--4081--3--6--f--c182586e87-k8s-coredns--668d6bf9bc--s4d7g-eth0" Sep 12 17:34:25.436837 containerd[1494]: 2025-09-12 17:34:25.420 [INFO][4534] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali8bcc98ec2c8 ContainerID="47c0aa89385cfc22baf7a1d40394f1009cfaaa8af45053624c69e492710e4244" Namespace="kube-system" Pod="coredns-668d6bf9bc-s4d7g" WorkloadEndpoint="ci--4081--3--6--f--c182586e87-k8s-coredns--668d6bf9bc--s4d7g-eth0" Sep 12 17:34:25.436837 containerd[1494]: 2025-09-12 17:34:25.422 [INFO][4534] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="47c0aa89385cfc22baf7a1d40394f1009cfaaa8af45053624c69e492710e4244" Namespace="kube-system" Pod="coredns-668d6bf9bc-s4d7g" WorkloadEndpoint="ci--4081--3--6--f--c182586e87-k8s-coredns--668d6bf9bc--s4d7g-eth0" Sep 12 17:34:25.436837 containerd[1494]: 2025-09-12 17:34:25.422 [INFO][4534] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="47c0aa89385cfc22baf7a1d40394f1009cfaaa8af45053624c69e492710e4244" Namespace="kube-system" Pod="coredns-668d6bf9bc-s4d7g" WorkloadEndpoint="ci--4081--3--6--f--c182586e87-k8s-coredns--668d6bf9bc--s4d7g-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--f--c182586e87-k8s-coredns--668d6bf9bc--s4d7g-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"33f03454-7bb2-47fc-bcae-807921ce8ad3", ResourceVersion:"910", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 33, 49, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-f-c182586e87", ContainerID:"47c0aa89385cfc22baf7a1d40394f1009cfaaa8af45053624c69e492710e4244", Pod:"coredns-668d6bf9bc-s4d7g", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.34.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali8bcc98ec2c8", MAC:"02:71:52:b6:41:c9", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:34:25.436837 containerd[1494]: 2025-09-12 17:34:25.430 [INFO][4534] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="47c0aa89385cfc22baf7a1d40394f1009cfaaa8af45053624c69e492710e4244" Namespace="kube-system" Pod="coredns-668d6bf9bc-s4d7g" 
WorkloadEndpoint="ci--4081--3--6--f--c182586e87-k8s-coredns--668d6bf9bc--s4d7g-eth0" Sep 12 17:34:25.462758 containerd[1494]: time="2025-09-12T17:34:25.457687105Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Sep 12 17:34:25.462758 containerd[1494]: time="2025-09-12T17:34:25.457746947Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Sep 12 17:34:25.462758 containerd[1494]: time="2025-09-12T17:34:25.457756635Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 12 17:34:25.462758 containerd[1494]: time="2025-09-12T17:34:25.457815686Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 12 17:34:25.482484 systemd[1]: Started cri-containerd-47c0aa89385cfc22baf7a1d40394f1009cfaaa8af45053624c69e492710e4244.scope - libcontainer container 47c0aa89385cfc22baf7a1d40394f1009cfaaa8af45053624c69e492710e4244. Sep 12 17:34:25.518291 containerd[1494]: time="2025-09-12T17:34:25.518049816Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-s4d7g,Uid:33f03454-7bb2-47fc-bcae-807921ce8ad3,Namespace:kube-system,Attempt:1,} returns sandbox id \"47c0aa89385cfc22baf7a1d40394f1009cfaaa8af45053624c69e492710e4244\"" Sep 12 17:34:25.521107 containerd[1494]: time="2025-09-12T17:34:25.521075842Z" level=info msg="CreateContainer within sandbox \"47c0aa89385cfc22baf7a1d40394f1009cfaaa8af45053624c69e492710e4244\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Sep 12 17:34:25.537398 containerd[1494]: time="2025-09-12T17:34:25.537366990Z" level=info msg="CreateContainer within sandbox \"47c0aa89385cfc22baf7a1d40394f1009cfaaa8af45053624c69e492710e4244\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"a5d3018c4520db227a36718d4d61725d7a45103c426eddb2e97eea344a47083e\"" Sep 12 17:34:25.539067 containerd[1494]: time="2025-09-12T17:34:25.538948913Z" level=info msg="StartContainer for \"a5d3018c4520db227a36718d4d61725d7a45103c426eddb2e97eea344a47083e\"" Sep 12 17:34:25.562605 systemd[1]: Started cri-containerd-a5d3018c4520db227a36718d4d61725d7a45103c426eddb2e97eea344a47083e.scope - libcontainer container a5d3018c4520db227a36718d4d61725d7a45103c426eddb2e97eea344a47083e. 
Sep 12 17:34:25.585516 containerd[1494]: time="2025-09-12T17:34:25.585486485Z" level=info msg="StartContainer for \"a5d3018c4520db227a36718d4d61725d7a45103c426eddb2e97eea344a47083e\" returns successfully" Sep 12 17:34:25.926770 systemd-networkd[1404]: calif622695599e: Gained IPv6LL Sep 12 17:34:25.927031 systemd-networkd[1404]: calif4e7666601f: Gained IPv6LL Sep 12 17:34:26.061463 containerd[1494]: time="2025-09-12T17:34:26.061387624Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:34:26.062256 containerd[1494]: time="2025-09-12T17:34:26.062200535Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.3: active requests=0, bytes read=33085545" Sep 12 17:34:26.063002 containerd[1494]: time="2025-09-12T17:34:26.062945811Z" level=info msg="ImageCreate event name:\"sha256:7e29b0984d517678aab6ca138482c318989f6f28daf9d3b5dd6e4a5a3115ac16\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:34:26.065047 containerd[1494]: time="2025-09-12T17:34:26.065010764Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend@sha256:29becebc47401da9997a2a30f4c25c511a5f379d17275680b048224829af71a5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:34:26.065936 containerd[1494]: time="2025-09-12T17:34:26.065567762Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" with image id \"sha256:7e29b0984d517678aab6ca138482c318989f6f28daf9d3b5dd6e4a5a3115ac16\", repo tag \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/whisker-backend@sha256:29becebc47401da9997a2a30f4c25c511a5f379d17275680b048224829af71a5\", size \"33085375\" in 2.821468812s" Sep 12 17:34:26.065936 containerd[1494]: time="2025-09-12T17:34:26.065599273Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" returns image reference \"sha256:7e29b0984d517678aab6ca138482c318989f6f28daf9d3b5dd6e4a5a3115ac16\"" Sep 12 17:34:26.066564 containerd[1494]: time="2025-09-12T17:34:26.066547861Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.3\"" Sep 12 17:34:26.067934 containerd[1494]: time="2025-09-12T17:34:26.067915279Z" level=info msg="CreateContainer within sandbox \"279aaa5b125d7559f881b35f19266c102c03725463b7236693e224041f7b080e\" for container &ContainerMetadata{Name:whisker-backend,Attempt:0,}" Sep 12 17:34:26.077433 containerd[1494]: time="2025-09-12T17:34:26.077400357Z" level=info msg="CreateContainer within sandbox \"279aaa5b125d7559f881b35f19266c102c03725463b7236693e224041f7b080e\" for &ContainerMetadata{Name:whisker-backend,Attempt:0,} returns container id \"5b43f3c96d57a85dd50cd2f040698f1d5e7197f578abd3046be6b94ecc4293da\"" Sep 12 17:34:26.077822 containerd[1494]: time="2025-09-12T17:34:26.077785783Z" level=info msg="StartContainer for \"5b43f3c96d57a85dd50cd2f040698f1d5e7197f578abd3046be6b94ecc4293da\"" Sep 12 17:34:26.098347 systemd[1]: Started cri-containerd-5b43f3c96d57a85dd50cd2f040698f1d5e7197f578abd3046be6b94ecc4293da.scope - libcontainer container 5b43f3c96d57a85dd50cd2f040698f1d5e7197f578abd3046be6b94ecc4293da. 
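The whisker-backend pull above references the same image three ways: by tag (`:v3.30.3`), by repo digest (`@sha256:29be...`), and by image id (the config blob's sha256). A small sketch separating the two reference forms seen in those lines (simplified: real reference parsing also handles registry hosts with ports and other corner cases):

```go
package main

import (
	"fmt"
	"strings"
)

// splitRef separates an image reference into name plus tag or digest,
// covering the two forms in the pull log above.
func splitRef(ref string) (name, tag, digest string) {
	if i := strings.Index(ref, "@"); i >= 0 {
		return ref[:i], "", ref[i+1:]
	}
	if i := strings.LastIndex(ref, ":"); i >= 0 && !strings.Contains(ref[i:], "/") {
		return ref[:i], ref[i+1:], ""
	}
	return ref, "", ""
}

func main() {
	for _, ref := range []string{
		"ghcr.io/flatcar/calico/whisker-backend:v3.30.3",
		"ghcr.io/flatcar/calico/whisker-backend@sha256:29becebc47401da9997a2a30f4c25c511a5f379d17275680b048224829af71a5",
	} {
		n, t, d := splitRef(ref)
		fmt.Printf("name=%s tag=%s digest=%s\n", n, t, d)
	}
}
```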
Sep 12 17:34:26.139626 containerd[1494]: time="2025-09-12T17:34:26.139575003Z" level=info msg="StartContainer for \"5b43f3c96d57a85dd50cd2f040698f1d5e7197f578abd3046be6b94ecc4293da\" returns successfully" Sep 12 17:34:26.378004 kubelet[2561]: I0912 17:34:26.377798 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-668d6bf9bc-s4d7g" podStartSLOduration=37.377781735 podStartE2EDuration="37.377781735s" podCreationTimestamp="2025-09-12 17:33:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-12 17:34:26.366142134 +0000 UTC m=+42.393557215" watchObservedRunningTime="2025-09-12 17:34:26.377781735 +0000 UTC m=+42.405196835" Sep 12 17:34:26.379870 kubelet[2561]: I0912 17:34:26.379201 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/whisker-58bcf489cc-l5xmn" podStartSLOduration=1.37911334 podStartE2EDuration="6.379086774s" podCreationTimestamp="2025-09-12 17:34:20 +0000 UTC" firstStartedPulling="2025-09-12 17:34:21.066408162 +0000 UTC m=+37.093823243" lastFinishedPulling="2025-09-12 17:34:26.066381587 +0000 UTC m=+42.093796677" observedRunningTime="2025-09-12 17:34:26.37688768 +0000 UTC m=+42.404302770" watchObservedRunningTime="2025-09-12 17:34:26.379086774 +0000 UTC m=+42.406501864" Sep 12 17:34:26.439389 systemd-networkd[1404]: calia6ca5fbe0b7: Gained IPv6LL Sep 12 17:34:27.081847 containerd[1494]: time="2025-09-12T17:34:27.081373748Z" level=info msg="StopPodSandbox for \"f5db53b3536194d1bf8beb85b1555599a7f618239326baeb59bc23059b5d3697\"" Sep 12 17:34:27.082755 containerd[1494]: time="2025-09-12T17:34:27.082462891Z" level=info msg="StopPodSandbox for \"fda3f332aa630c70d7cfbd1b402fa296a294f6bf502a4c10eb91d19b97d978e3\"" Sep 12 17:34:27.088106 containerd[1494]: time="2025-09-12T17:34:27.088014983Z" level=info msg="StopPodSandbox for \"8d3c00ad0af03fd362d9677b923aa1969166be62afd063bd0b73f7ecf9b0feaa\"" Sep 12 17:34:27.206638 systemd-networkd[1404]: cali8bcc98ec2c8: Gained IPv6LL Sep 12 17:34:27.283531 containerd[1494]: 2025-09-12 17:34:27.243 [INFO][4717] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="f5db53b3536194d1bf8beb85b1555599a7f618239326baeb59bc23059b5d3697" Sep 12 17:34:27.283531 containerd[1494]: 2025-09-12 17:34:27.243 [INFO][4717] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="f5db53b3536194d1bf8beb85b1555599a7f618239326baeb59bc23059b5d3697" iface="eth0" netns="/var/run/netns/cni-c6686ed0-737d-a335-d110-99749e84afc6" Sep 12 17:34:27.283531 containerd[1494]: 2025-09-12 17:34:27.243 [INFO][4717] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="f5db53b3536194d1bf8beb85b1555599a7f618239326baeb59bc23059b5d3697" iface="eth0" netns="/var/run/netns/cni-c6686ed0-737d-a335-d110-99749e84afc6" Sep 12 17:34:27.283531 containerd[1494]: 2025-09-12 17:34:27.244 [INFO][4717] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. 
ContainerID="f5db53b3536194d1bf8beb85b1555599a7f618239326baeb59bc23059b5d3697" iface="eth0" netns="/var/run/netns/cni-c6686ed0-737d-a335-d110-99749e84afc6" Sep 12 17:34:27.283531 containerd[1494]: 2025-09-12 17:34:27.244 [INFO][4717] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="f5db53b3536194d1bf8beb85b1555599a7f618239326baeb59bc23059b5d3697" Sep 12 17:34:27.283531 containerd[1494]: 2025-09-12 17:34:27.244 [INFO][4717] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="f5db53b3536194d1bf8beb85b1555599a7f618239326baeb59bc23059b5d3697" Sep 12 17:34:27.283531 containerd[1494]: 2025-09-12 17:34:27.268 [INFO][4753] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="f5db53b3536194d1bf8beb85b1555599a7f618239326baeb59bc23059b5d3697" HandleID="k8s-pod-network.f5db53b3536194d1bf8beb85b1555599a7f618239326baeb59bc23059b5d3697" Workload="ci--4081--3--6--f--c182586e87-k8s-calico--apiserver--6dd46897bd--htdld-eth0" Sep 12 17:34:27.283531 containerd[1494]: 2025-09-12 17:34:27.269 [INFO][4753] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 17:34:27.283531 containerd[1494]: 2025-09-12 17:34:27.269 [INFO][4753] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 12 17:34:27.283531 containerd[1494]: 2025-09-12 17:34:27.273 [WARNING][4753] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="f5db53b3536194d1bf8beb85b1555599a7f618239326baeb59bc23059b5d3697" HandleID="k8s-pod-network.f5db53b3536194d1bf8beb85b1555599a7f618239326baeb59bc23059b5d3697" Workload="ci--4081--3--6--f--c182586e87-k8s-calico--apiserver--6dd46897bd--htdld-eth0" Sep 12 17:34:27.283531 containerd[1494]: 2025-09-12 17:34:27.273 [INFO][4753] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="f5db53b3536194d1bf8beb85b1555599a7f618239326baeb59bc23059b5d3697" HandleID="k8s-pod-network.f5db53b3536194d1bf8beb85b1555599a7f618239326baeb59bc23059b5d3697" Workload="ci--4081--3--6--f--c182586e87-k8s-calico--apiserver--6dd46897bd--htdld-eth0" Sep 12 17:34:27.283531 containerd[1494]: 2025-09-12 17:34:27.276 [INFO][4753] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 17:34:27.283531 containerd[1494]: 2025-09-12 17:34:27.280 [INFO][4717] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="f5db53b3536194d1bf8beb85b1555599a7f618239326baeb59bc23059b5d3697" Sep 12 17:34:27.284924 containerd[1494]: time="2025-09-12T17:34:27.284306684Z" level=info msg="TearDown network for sandbox \"f5db53b3536194d1bf8beb85b1555599a7f618239326baeb59bc23059b5d3697\" successfully" Sep 12 17:34:27.284924 containerd[1494]: time="2025-09-12T17:34:27.284330019Z" level=info msg="StopPodSandbox for \"f5db53b3536194d1bf8beb85b1555599a7f618239326baeb59bc23059b5d3697\" returns successfully" Sep 12 17:34:27.286331 containerd[1494]: time="2025-09-12T17:34:27.286293729Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6dd46897bd-htdld,Uid:0d943c3b-957f-40f9-b21f-ac657206aace,Namespace:calico-apiserver,Attempt:1,}" Sep 12 17:34:27.286836 systemd[1]: run-netns-cni\x2dc6686ed0\x2d737d\x2da335\x2dd110\x2d99749e84afc6.mount: Deactivated successfully. Sep 12 17:34:27.298726 containerd[1494]: 2025-09-12 17:34:27.219 [INFO][4724] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="8d3c00ad0af03fd362d9677b923aa1969166be62afd063bd0b73f7ecf9b0feaa" Sep 12 17:34:27.298726 containerd[1494]: 2025-09-12 17:34:27.219 [INFO][4724] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. 
ContainerID="8d3c00ad0af03fd362d9677b923aa1969166be62afd063bd0b73f7ecf9b0feaa" iface="eth0" netns="/var/run/netns/cni-fe7c3356-dd8c-7941-200a-0824e2ff5615" Sep 12 17:34:27.298726 containerd[1494]: 2025-09-12 17:34:27.221 [INFO][4724] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="8d3c00ad0af03fd362d9677b923aa1969166be62afd063bd0b73f7ecf9b0feaa" iface="eth0" netns="/var/run/netns/cni-fe7c3356-dd8c-7941-200a-0824e2ff5615" Sep 12 17:34:27.298726 containerd[1494]: 2025-09-12 17:34:27.225 [INFO][4724] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. ContainerID="8d3c00ad0af03fd362d9677b923aa1969166be62afd063bd0b73f7ecf9b0feaa" iface="eth0" netns="/var/run/netns/cni-fe7c3356-dd8c-7941-200a-0824e2ff5615" Sep 12 17:34:27.298726 containerd[1494]: 2025-09-12 17:34:27.225 [INFO][4724] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="8d3c00ad0af03fd362d9677b923aa1969166be62afd063bd0b73f7ecf9b0feaa" Sep 12 17:34:27.298726 containerd[1494]: 2025-09-12 17:34:27.225 [INFO][4724] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="8d3c00ad0af03fd362d9677b923aa1969166be62afd063bd0b73f7ecf9b0feaa" Sep 12 17:34:27.298726 containerd[1494]: 2025-09-12 17:34:27.275 [INFO][4742] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="8d3c00ad0af03fd362d9677b923aa1969166be62afd063bd0b73f7ecf9b0feaa" HandleID="k8s-pod-network.8d3c00ad0af03fd362d9677b923aa1969166be62afd063bd0b73f7ecf9b0feaa" Workload="ci--4081--3--6--f--c182586e87-k8s-calico--kube--controllers--599b568c97--s5dwl-eth0" Sep 12 17:34:27.298726 containerd[1494]: 2025-09-12 17:34:27.276 [INFO][4742] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 17:34:27.298726 containerd[1494]: 2025-09-12 17:34:27.276 [INFO][4742] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 12 17:34:27.298726 containerd[1494]: 2025-09-12 17:34:27.284 [WARNING][4742] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="8d3c00ad0af03fd362d9677b923aa1969166be62afd063bd0b73f7ecf9b0feaa" HandleID="k8s-pod-network.8d3c00ad0af03fd362d9677b923aa1969166be62afd063bd0b73f7ecf9b0feaa" Workload="ci--4081--3--6--f--c182586e87-k8s-calico--kube--controllers--599b568c97--s5dwl-eth0" Sep 12 17:34:27.298726 containerd[1494]: 2025-09-12 17:34:27.286 [INFO][4742] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="8d3c00ad0af03fd362d9677b923aa1969166be62afd063bd0b73f7ecf9b0feaa" HandleID="k8s-pod-network.8d3c00ad0af03fd362d9677b923aa1969166be62afd063bd0b73f7ecf9b0feaa" Workload="ci--4081--3--6--f--c182586e87-k8s-calico--kube--controllers--599b568c97--s5dwl-eth0" Sep 12 17:34:27.298726 containerd[1494]: 2025-09-12 17:34:27.288 [INFO][4742] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 17:34:27.298726 containerd[1494]: 2025-09-12 17:34:27.292 [INFO][4724] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="8d3c00ad0af03fd362d9677b923aa1969166be62afd063bd0b73f7ecf9b0feaa" Sep 12 17:34:27.302756 systemd[1]: run-netns-cni\x2dfe7c3356\x2ddd8c\x2d7941\x2d200a\x2d0824e2ff5615.mount: Deactivated successfully. 
Sep 12 17:34:27.304371 containerd[1494]: time="2025-09-12T17:34:27.303290877Z" level=info msg="TearDown network for sandbox \"8d3c00ad0af03fd362d9677b923aa1969166be62afd063bd0b73f7ecf9b0feaa\" successfully" Sep 12 17:34:27.304371 containerd[1494]: time="2025-09-12T17:34:27.303313369Z" level=info msg="StopPodSandbox for \"8d3c00ad0af03fd362d9677b923aa1969166be62afd063bd0b73f7ecf9b0feaa\" returns successfully" Sep 12 17:34:27.310390 containerd[1494]: 2025-09-12 17:34:27.235 [INFO][4725] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="fda3f332aa630c70d7cfbd1b402fa296a294f6bf502a4c10eb91d19b97d978e3" Sep 12 17:34:27.310390 containerd[1494]: 2025-09-12 17:34:27.236 [INFO][4725] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="fda3f332aa630c70d7cfbd1b402fa296a294f6bf502a4c10eb91d19b97d978e3" iface="eth0" netns="/var/run/netns/cni-6e347a2a-a675-f4b5-19c2-755ede5a9c1d" Sep 12 17:34:27.310390 containerd[1494]: 2025-09-12 17:34:27.236 [INFO][4725] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="fda3f332aa630c70d7cfbd1b402fa296a294f6bf502a4c10eb91d19b97d978e3" iface="eth0" netns="/var/run/netns/cni-6e347a2a-a675-f4b5-19c2-755ede5a9c1d" Sep 12 17:34:27.310390 containerd[1494]: 2025-09-12 17:34:27.236 [INFO][4725] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. ContainerID="fda3f332aa630c70d7cfbd1b402fa296a294f6bf502a4c10eb91d19b97d978e3" iface="eth0" netns="/var/run/netns/cni-6e347a2a-a675-f4b5-19c2-755ede5a9c1d" Sep 12 17:34:27.310390 containerd[1494]: 2025-09-12 17:34:27.236 [INFO][4725] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="fda3f332aa630c70d7cfbd1b402fa296a294f6bf502a4c10eb91d19b97d978e3" Sep 12 17:34:27.310390 containerd[1494]: 2025-09-12 17:34:27.236 [INFO][4725] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="fda3f332aa630c70d7cfbd1b402fa296a294f6bf502a4c10eb91d19b97d978e3" Sep 12 17:34:27.310390 containerd[1494]: 2025-09-12 17:34:27.276 [INFO][4745] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="fda3f332aa630c70d7cfbd1b402fa296a294f6bf502a4c10eb91d19b97d978e3" HandleID="k8s-pod-network.fda3f332aa630c70d7cfbd1b402fa296a294f6bf502a4c10eb91d19b97d978e3" Workload="ci--4081--3--6--f--c182586e87-k8s-csi--node--driver--fsx7j-eth0" Sep 12 17:34:27.310390 containerd[1494]: 2025-09-12 17:34:27.277 [INFO][4745] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 17:34:27.310390 containerd[1494]: 2025-09-12 17:34:27.289 [INFO][4745] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 12 17:34:27.310390 containerd[1494]: 2025-09-12 17:34:27.298 [WARNING][4745] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="fda3f332aa630c70d7cfbd1b402fa296a294f6bf502a4c10eb91d19b97d978e3" HandleID="k8s-pod-network.fda3f332aa630c70d7cfbd1b402fa296a294f6bf502a4c10eb91d19b97d978e3" Workload="ci--4081--3--6--f--c182586e87-k8s-csi--node--driver--fsx7j-eth0" Sep 12 17:34:27.310390 containerd[1494]: 2025-09-12 17:34:27.300 [INFO][4745] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="fda3f332aa630c70d7cfbd1b402fa296a294f6bf502a4c10eb91d19b97d978e3" HandleID="k8s-pod-network.fda3f332aa630c70d7cfbd1b402fa296a294f6bf502a4c10eb91d19b97d978e3" Workload="ci--4081--3--6--f--c182586e87-k8s-csi--node--driver--fsx7j-eth0" Sep 12 17:34:27.310390 containerd[1494]: 2025-09-12 17:34:27.305 [INFO][4745] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
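Each teardown above follows the same idempotent release pattern: take the host-wide IPAM lock, try to release by handleID, tolerate "Asked to release address but it doesn't exist. Ignoring", then fall back to releasing by workloadID. A minimal sketch of that tolerant-release shape, assuming a toy in-memory store (not Calico's actual IPAM):

```go
package main

import (
	"fmt"
	"sync"
)

// Toy allocator: handleID -> allocated address.
type ipam struct {
	mu    sync.Mutex // stands in for the host-wide IPAM lock in the log
	byKey map[string]string
}

// release is idempotent: a missing handle is logged and ignored,
// mirroring the WARNING lines in the teardowns above.
func (a *ipam) release(handleID string) {
	a.mu.Lock()
	defer a.mu.Unlock()
	addr, ok := a.byKey[handleID]
	if !ok {
		fmt.Printf("Asked to release %q but it doesn't exist. Ignoring\n", handleID)
		return
	}
	delete(a.byKey, handleID)
	fmt.Printf("released %s for %q\n", addr, handleID)
}

func main() {
	a := &ipam{byKey: map[string]string{"k8s-pod-network.abc": "192.168.34.133"}}
	a.release("k8s-pod-network.abc") // releases the address
	a.release("k8s-pod-network.abc") // second call is a no-op, as in the log
}
```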
Sep 12 17:34:27.310390 containerd[1494]: 2025-09-12 17:34:27.308 [INFO][4725] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="fda3f332aa630c70d7cfbd1b402fa296a294f6bf502a4c10eb91d19b97d978e3" Sep 12 17:34:27.313134 containerd[1494]: time="2025-09-12T17:34:27.312401805Z" level=info msg="TearDown network for sandbox \"fda3f332aa630c70d7cfbd1b402fa296a294f6bf502a4c10eb91d19b97d978e3\" successfully" Sep 12 17:34:27.313134 containerd[1494]: time="2025-09-12T17:34:27.312421062Z" level=info msg="StopPodSandbox for \"fda3f332aa630c70d7cfbd1b402fa296a294f6bf502a4c10eb91d19b97d978e3\" returns successfully" Sep 12 17:34:27.313134 containerd[1494]: time="2025-09-12T17:34:27.312601281Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-599b568c97-s5dwl,Uid:1a745a4f-a77e-4660-9ac5-801112a8b773,Namespace:calico-system,Attempt:1,}" Sep 12 17:34:27.313609 systemd[1]: run-netns-cni\x2d6e347a2a\x2da675\x2df4b5\x2d19c2\x2d755ede5a9c1d.mount: Deactivated successfully. Sep 12 17:34:27.316901 containerd[1494]: time="2025-09-12T17:34:27.316874934Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-fsx7j,Uid:53bbc8dc-3621-4d89-be2b-ae2eb974a294,Namespace:calico-system,Attempt:1,}" Sep 12 17:34:27.449130 systemd-networkd[1404]: calib96ad64bb2c: Link UP Sep 12 17:34:27.450288 systemd-networkd[1404]: calib96ad64bb2c: Gained carrier Sep 12 17:34:27.468672 containerd[1494]: 2025-09-12 17:34:27.367 [INFO][4766] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4081--3--6--f--c182586e87-k8s-calico--apiserver--6dd46897bd--htdld-eth0 calico-apiserver-6dd46897bd- calico-apiserver 0d943c3b-957f-40f9-b21f-ac657206aace 950 0 2025-09-12 17:33:58 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:6dd46897bd projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4081-3-6-f-c182586e87 calico-apiserver-6dd46897bd-htdld eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calib96ad64bb2c [] [] }} ContainerID="de5c2b268ceea309c4a25537a0bf220e73a72b8c18816adefa558cf960204d33" Namespace="calico-apiserver" Pod="calico-apiserver-6dd46897bd-htdld" WorkloadEndpoint="ci--4081--3--6--f--c182586e87-k8s-calico--apiserver--6dd46897bd--htdld-" Sep 12 17:34:27.468672 containerd[1494]: 2025-09-12 17:34:27.367 [INFO][4766] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="de5c2b268ceea309c4a25537a0bf220e73a72b8c18816adefa558cf960204d33" Namespace="calico-apiserver" Pod="calico-apiserver-6dd46897bd-htdld" WorkloadEndpoint="ci--4081--3--6--f--c182586e87-k8s-calico--apiserver--6dd46897bd--htdld-eth0" Sep 12 17:34:27.468672 containerd[1494]: 2025-09-12 17:34:27.400 [INFO][4800] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="de5c2b268ceea309c4a25537a0bf220e73a72b8c18816adefa558cf960204d33" HandleID="k8s-pod-network.de5c2b268ceea309c4a25537a0bf220e73a72b8c18816adefa558cf960204d33" Workload="ci--4081--3--6--f--c182586e87-k8s-calico--apiserver--6dd46897bd--htdld-eth0" Sep 12 17:34:27.468672 containerd[1494]: 2025-09-12 17:34:27.401 [INFO][4800] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="de5c2b268ceea309c4a25537a0bf220e73a72b8c18816adefa558cf960204d33" HandleID="k8s-pod-network.de5c2b268ceea309c4a25537a0bf220e73a72b8c18816adefa558cf960204d33" 
Workload="ci--4081--3--6--f--c182586e87-k8s-calico--apiserver--6dd46897bd--htdld-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d5a90), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4081-3-6-f-c182586e87", "pod":"calico-apiserver-6dd46897bd-htdld", "timestamp":"2025-09-12 17:34:27.400835055 +0000 UTC"}, Hostname:"ci-4081-3-6-f-c182586e87", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 17:34:27.468672 containerd[1494]: 2025-09-12 17:34:27.401 [INFO][4800] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 17:34:27.468672 containerd[1494]: 2025-09-12 17:34:27.401 [INFO][4800] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 12 17:34:27.468672 containerd[1494]: 2025-09-12 17:34:27.401 [INFO][4800] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4081-3-6-f-c182586e87' Sep 12 17:34:27.468672 containerd[1494]: 2025-09-12 17:34:27.408 [INFO][4800] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.de5c2b268ceea309c4a25537a0bf220e73a72b8c18816adefa558cf960204d33" host="ci-4081-3-6-f-c182586e87" Sep 12 17:34:27.468672 containerd[1494]: 2025-09-12 17:34:27.412 [INFO][4800] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4081-3-6-f-c182586e87" Sep 12 17:34:27.468672 containerd[1494]: 2025-09-12 17:34:27.420 [INFO][4800] ipam/ipam.go 511: Trying affinity for 192.168.34.128/26 host="ci-4081-3-6-f-c182586e87" Sep 12 17:34:27.468672 containerd[1494]: 2025-09-12 17:34:27.424 [INFO][4800] ipam/ipam.go 158: Attempting to load block cidr=192.168.34.128/26 host="ci-4081-3-6-f-c182586e87" Sep 12 17:34:27.468672 containerd[1494]: 2025-09-12 17:34:27.426 [INFO][4800] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.34.128/26 host="ci-4081-3-6-f-c182586e87" Sep 12 17:34:27.468672 containerd[1494]: 2025-09-12 17:34:27.427 [INFO][4800] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.34.128/26 handle="k8s-pod-network.de5c2b268ceea309c4a25537a0bf220e73a72b8c18816adefa558cf960204d33" host="ci-4081-3-6-f-c182586e87" Sep 12 17:34:27.468672 containerd[1494]: 2025-09-12 17:34:27.428 [INFO][4800] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.de5c2b268ceea309c4a25537a0bf220e73a72b8c18816adefa558cf960204d33 Sep 12 17:34:27.468672 containerd[1494]: 2025-09-12 17:34:27.433 [INFO][4800] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.34.128/26 handle="k8s-pod-network.de5c2b268ceea309c4a25537a0bf220e73a72b8c18816adefa558cf960204d33" host="ci-4081-3-6-f-c182586e87" Sep 12 17:34:27.468672 containerd[1494]: 2025-09-12 17:34:27.440 [INFO][4800] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.34.134/26] block=192.168.34.128/26 handle="k8s-pod-network.de5c2b268ceea309c4a25537a0bf220e73a72b8c18816adefa558cf960204d33" host="ci-4081-3-6-f-c182586e87" Sep 12 17:34:27.468672 containerd[1494]: 2025-09-12 17:34:27.440 [INFO][4800] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.34.134/26] handle="k8s-pod-network.de5c2b268ceea309c4a25537a0bf220e73a72b8c18816adefa558cf960204d33" host="ci-4081-3-6-f-c182586e87" Sep 12 17:34:27.468672 containerd[1494]: 2025-09-12 17:34:27.440 [INFO][4800] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 12 17:34:27.468672 containerd[1494]: 2025-09-12 17:34:27.440 [INFO][4800] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.34.134/26] IPv6=[] ContainerID="de5c2b268ceea309c4a25537a0bf220e73a72b8c18816adefa558cf960204d33" HandleID="k8s-pod-network.de5c2b268ceea309c4a25537a0bf220e73a72b8c18816adefa558cf960204d33" Workload="ci--4081--3--6--f--c182586e87-k8s-calico--apiserver--6dd46897bd--htdld-eth0" Sep 12 17:34:27.470718 containerd[1494]: 2025-09-12 17:34:27.445 [INFO][4766] cni-plugin/k8s.go 418: Populated endpoint ContainerID="de5c2b268ceea309c4a25537a0bf220e73a72b8c18816adefa558cf960204d33" Namespace="calico-apiserver" Pod="calico-apiserver-6dd46897bd-htdld" WorkloadEndpoint="ci--4081--3--6--f--c182586e87-k8s-calico--apiserver--6dd46897bd--htdld-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--f--c182586e87-k8s-calico--apiserver--6dd46897bd--htdld-eth0", GenerateName:"calico-apiserver-6dd46897bd-", Namespace:"calico-apiserver", SelfLink:"", UID:"0d943c3b-957f-40f9-b21f-ac657206aace", ResourceVersion:"950", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 33, 58, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6dd46897bd", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-f-c182586e87", ContainerID:"", Pod:"calico-apiserver-6dd46897bd-htdld", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.34.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calib96ad64bb2c", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:34:27.470718 containerd[1494]: 2025-09-12 17:34:27.445 [INFO][4766] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.34.134/32] ContainerID="de5c2b268ceea309c4a25537a0bf220e73a72b8c18816adefa558cf960204d33" Namespace="calico-apiserver" Pod="calico-apiserver-6dd46897bd-htdld" WorkloadEndpoint="ci--4081--3--6--f--c182586e87-k8s-calico--apiserver--6dd46897bd--htdld-eth0" Sep 12 17:34:27.470718 containerd[1494]: 2025-09-12 17:34:27.445 [INFO][4766] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calib96ad64bb2c ContainerID="de5c2b268ceea309c4a25537a0bf220e73a72b8c18816adefa558cf960204d33" Namespace="calico-apiserver" Pod="calico-apiserver-6dd46897bd-htdld" WorkloadEndpoint="ci--4081--3--6--f--c182586e87-k8s-calico--apiserver--6dd46897bd--htdld-eth0" Sep 12 17:34:27.470718 containerd[1494]: 2025-09-12 17:34:27.452 [INFO][4766] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="de5c2b268ceea309c4a25537a0bf220e73a72b8c18816adefa558cf960204d33" Namespace="calico-apiserver" Pod="calico-apiserver-6dd46897bd-htdld" WorkloadEndpoint="ci--4081--3--6--f--c182586e87-k8s-calico--apiserver--6dd46897bd--htdld-eth0" Sep 12 17:34:27.470718 containerd[1494]: 2025-09-12 
17:34:27.452 [INFO][4766] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="de5c2b268ceea309c4a25537a0bf220e73a72b8c18816adefa558cf960204d33" Namespace="calico-apiserver" Pod="calico-apiserver-6dd46897bd-htdld" WorkloadEndpoint="ci--4081--3--6--f--c182586e87-k8s-calico--apiserver--6dd46897bd--htdld-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--f--c182586e87-k8s-calico--apiserver--6dd46897bd--htdld-eth0", GenerateName:"calico-apiserver-6dd46897bd-", Namespace:"calico-apiserver", SelfLink:"", UID:"0d943c3b-957f-40f9-b21f-ac657206aace", ResourceVersion:"950", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 33, 58, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6dd46897bd", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-f-c182586e87", ContainerID:"de5c2b268ceea309c4a25537a0bf220e73a72b8c18816adefa558cf960204d33", Pod:"calico-apiserver-6dd46897bd-htdld", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.34.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calib96ad64bb2c", MAC:"9e:d9:21:d3:e9:07", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:34:27.470718 containerd[1494]: 2025-09-12 17:34:27.461 [INFO][4766] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="de5c2b268ceea309c4a25537a0bf220e73a72b8c18816adefa558cf960204d33" Namespace="calico-apiserver" Pod="calico-apiserver-6dd46897bd-htdld" WorkloadEndpoint="ci--4081--3--6--f--c182586e87-k8s-calico--apiserver--6dd46897bd--htdld-eth0" Sep 12 17:34:27.500474 containerd[1494]: time="2025-09-12T17:34:27.499781333Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Sep 12 17:34:27.500474 containerd[1494]: time="2025-09-12T17:34:27.499833602Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Sep 12 17:34:27.500474 containerd[1494]: time="2025-09-12T17:34:27.499848791Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 12 17:34:27.500474 containerd[1494]: time="2025-09-12T17:34:27.499929191Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 12 17:34:27.516818 systemd[1]: Started cri-containerd-de5c2b268ceea309c4a25537a0bf220e73a72b8c18816adefa558cf960204d33.scope - libcontainer container de5c2b268ceea309c4a25537a0bf220e73a72b8c18816adefa558cf960204d33. 
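The host-side interface names above (calif4e7666601f, cali8bcc98ec2c8, calib96ad64bb2c, ...) are deterministic: a "cali" prefix plus an 11-hex-character hash suffix, which together fit the 15-character Linux interface-name limit. A sketch of that kind of scheme; the prefix and suffix length match the names in the log, but the exact hash input below is an assumption, not lifted from Calico's source:

```go
package main

import (
	"crypto/sha1"
	"encoding/hex"
	"fmt"
)

// vethName derives a stable, 15-character interface name from a workload
// identity. Assumed input format "namespace.pod"; Calico's real function
// may hash a different string.
func vethName(namespace, pod string) string {
	sum := sha1.Sum([]byte(namespace + "." + pod))
	// "cali" (4 chars) + 11 hex chars = 15, the maximum ifname length.
	return "cali" + hex.EncodeToString(sum[:])[:11]
}

func main() {
	fmt.Println(vethName("kube-system", "coredns-668d6bf9bc-s4d7g"))
}
```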
Sep 12 17:34:27.587308 systemd-networkd[1404]: califec7a5e1d4c: Link UP Sep 12 17:34:27.590329 systemd-networkd[1404]: califec7a5e1d4c: Gained carrier Sep 12 17:34:27.613770 containerd[1494]: time="2025-09-12T17:34:27.613374923Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6dd46897bd-htdld,Uid:0d943c3b-957f-40f9-b21f-ac657206aace,Namespace:calico-apiserver,Attempt:1,} returns sandbox id \"de5c2b268ceea309c4a25537a0bf220e73a72b8c18816adefa558cf960204d33\"" Sep 12 17:34:27.621490 containerd[1494]: 2025-09-12 17:34:27.407 [INFO][4777] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4081--3--6--f--c182586e87-k8s-calico--kube--controllers--599b568c97--s5dwl-eth0 calico-kube-controllers-599b568c97- calico-system 1a745a4f-a77e-4660-9ac5-801112a8b773 948 0 2025-09-12 17:34:01 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:599b568c97 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ci-4081-3-6-f-c182586e87 calico-kube-controllers-599b568c97-s5dwl eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] califec7a5e1d4c [] [] }} ContainerID="7737ddf2ca213a87f62309a181f8f48d31a299a97234f587068401b4b3a059af" Namespace="calico-system" Pod="calico-kube-controllers-599b568c97-s5dwl" WorkloadEndpoint="ci--4081--3--6--f--c182586e87-k8s-calico--kube--controllers--599b568c97--s5dwl-" Sep 12 17:34:27.621490 containerd[1494]: 2025-09-12 17:34:27.407 [INFO][4777] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="7737ddf2ca213a87f62309a181f8f48d31a299a97234f587068401b4b3a059af" Namespace="calico-system" Pod="calico-kube-controllers-599b568c97-s5dwl" WorkloadEndpoint="ci--4081--3--6--f--c182586e87-k8s-calico--kube--controllers--599b568c97--s5dwl-eth0" Sep 12 17:34:27.621490 containerd[1494]: 2025-09-12 17:34:27.438 [INFO][4808] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="7737ddf2ca213a87f62309a181f8f48d31a299a97234f587068401b4b3a059af" HandleID="k8s-pod-network.7737ddf2ca213a87f62309a181f8f48d31a299a97234f587068401b4b3a059af" Workload="ci--4081--3--6--f--c182586e87-k8s-calico--kube--controllers--599b568c97--s5dwl-eth0" Sep 12 17:34:27.621490 containerd[1494]: 2025-09-12 17:34:27.438 [INFO][4808] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="7737ddf2ca213a87f62309a181f8f48d31a299a97234f587068401b4b3a059af" HandleID="k8s-pod-network.7737ddf2ca213a87f62309a181f8f48d31a299a97234f587068401b4b3a059af" Workload="ci--4081--3--6--f--c182586e87-k8s-calico--kube--controllers--599b568c97--s5dwl-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00024eff0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4081-3-6-f-c182586e87", "pod":"calico-kube-controllers-599b568c97-s5dwl", "timestamp":"2025-09-12 17:34:27.438185456 +0000 UTC"}, Hostname:"ci-4081-3-6-f-c182586e87", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 17:34:27.621490 containerd[1494]: 2025-09-12 17:34:27.438 [INFO][4808] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. 
Sep 12 17:34:27.621490 containerd[1494]: 2025-09-12 17:34:27.440 [INFO][4808] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 12 17:34:27.621490 containerd[1494]: 2025-09-12 17:34:27.440 [INFO][4808] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4081-3-6-f-c182586e87' Sep 12 17:34:27.621490 containerd[1494]: 2025-09-12 17:34:27.509 [INFO][4808] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.7737ddf2ca213a87f62309a181f8f48d31a299a97234f587068401b4b3a059af" host="ci-4081-3-6-f-c182586e87" Sep 12 17:34:27.621490 containerd[1494]: 2025-09-12 17:34:27.514 [INFO][4808] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4081-3-6-f-c182586e87" Sep 12 17:34:27.621490 containerd[1494]: 2025-09-12 17:34:27.518 [INFO][4808] ipam/ipam.go 511: Trying affinity for 192.168.34.128/26 host="ci-4081-3-6-f-c182586e87" Sep 12 17:34:27.621490 containerd[1494]: 2025-09-12 17:34:27.520 [INFO][4808] ipam/ipam.go 158: Attempting to load block cidr=192.168.34.128/26 host="ci-4081-3-6-f-c182586e87" Sep 12 17:34:27.621490 containerd[1494]: 2025-09-12 17:34:27.525 [INFO][4808] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.34.128/26 host="ci-4081-3-6-f-c182586e87" Sep 12 17:34:27.621490 containerd[1494]: 2025-09-12 17:34:27.525 [INFO][4808] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.34.128/26 handle="k8s-pod-network.7737ddf2ca213a87f62309a181f8f48d31a299a97234f587068401b4b3a059af" host="ci-4081-3-6-f-c182586e87" Sep 12 17:34:27.621490 containerd[1494]: 2025-09-12 17:34:27.529 [INFO][4808] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.7737ddf2ca213a87f62309a181f8f48d31a299a97234f587068401b4b3a059af Sep 12 17:34:27.621490 containerd[1494]: 2025-09-12 17:34:27.544 [INFO][4808] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.34.128/26 handle="k8s-pod-network.7737ddf2ca213a87f62309a181f8f48d31a299a97234f587068401b4b3a059af" host="ci-4081-3-6-f-c182586e87" Sep 12 17:34:27.621490 containerd[1494]: 2025-09-12 17:34:27.564 [INFO][4808] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.34.135/26] block=192.168.34.128/26 handle="k8s-pod-network.7737ddf2ca213a87f62309a181f8f48d31a299a97234f587068401b4b3a059af" host="ci-4081-3-6-f-c182586e87" Sep 12 17:34:27.621490 containerd[1494]: 2025-09-12 17:34:27.565 [INFO][4808] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.34.135/26] handle="k8s-pod-network.7737ddf2ca213a87f62309a181f8f48d31a299a97234f587068401b4b3a059af" host="ci-4081-3-6-f-c182586e87" Sep 12 17:34:27.621490 containerd[1494]: 2025-09-12 17:34:27.565 [INFO][4808] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 12 17:34:27.621490 containerd[1494]: 2025-09-12 17:34:27.566 [INFO][4808] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.34.135/26] IPv6=[] ContainerID="7737ddf2ca213a87f62309a181f8f48d31a299a97234f587068401b4b3a059af" HandleID="k8s-pod-network.7737ddf2ca213a87f62309a181f8f48d31a299a97234f587068401b4b3a059af" Workload="ci--4081--3--6--f--c182586e87-k8s-calico--kube--controllers--599b568c97--s5dwl-eth0" Sep 12 17:34:27.622675 containerd[1494]: 2025-09-12 17:34:27.574 [INFO][4777] cni-plugin/k8s.go 418: Populated endpoint ContainerID="7737ddf2ca213a87f62309a181f8f48d31a299a97234f587068401b4b3a059af" Namespace="calico-system" Pod="calico-kube-controllers-599b568c97-s5dwl" WorkloadEndpoint="ci--4081--3--6--f--c182586e87-k8s-calico--kube--controllers--599b568c97--s5dwl-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--f--c182586e87-k8s-calico--kube--controllers--599b568c97--s5dwl-eth0", GenerateName:"calico-kube-controllers-599b568c97-", Namespace:"calico-system", SelfLink:"", UID:"1a745a4f-a77e-4660-9ac5-801112a8b773", ResourceVersion:"948", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 34, 1, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"599b568c97", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-f-c182586e87", ContainerID:"", Pod:"calico-kube-controllers-599b568c97-s5dwl", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.34.135/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"califec7a5e1d4c", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:34:27.622675 containerd[1494]: 2025-09-12 17:34:27.575 [INFO][4777] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.34.135/32] ContainerID="7737ddf2ca213a87f62309a181f8f48d31a299a97234f587068401b4b3a059af" Namespace="calico-system" Pod="calico-kube-controllers-599b568c97-s5dwl" WorkloadEndpoint="ci--4081--3--6--f--c182586e87-k8s-calico--kube--controllers--599b568c97--s5dwl-eth0" Sep 12 17:34:27.622675 containerd[1494]: 2025-09-12 17:34:27.575 [INFO][4777] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to califec7a5e1d4c ContainerID="7737ddf2ca213a87f62309a181f8f48d31a299a97234f587068401b4b3a059af" Namespace="calico-system" Pod="calico-kube-controllers-599b568c97-s5dwl" WorkloadEndpoint="ci--4081--3--6--f--c182586e87-k8s-calico--kube--controllers--599b568c97--s5dwl-eth0" Sep 12 17:34:27.622675 containerd[1494]: 2025-09-12 17:34:27.591 [INFO][4777] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="7737ddf2ca213a87f62309a181f8f48d31a299a97234f587068401b4b3a059af" Namespace="calico-system" Pod="calico-kube-controllers-599b568c97-s5dwl" 
WorkloadEndpoint="ci--4081--3--6--f--c182586e87-k8s-calico--kube--controllers--599b568c97--s5dwl-eth0" Sep 12 17:34:27.622675 containerd[1494]: 2025-09-12 17:34:27.593 [INFO][4777] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="7737ddf2ca213a87f62309a181f8f48d31a299a97234f587068401b4b3a059af" Namespace="calico-system" Pod="calico-kube-controllers-599b568c97-s5dwl" WorkloadEndpoint="ci--4081--3--6--f--c182586e87-k8s-calico--kube--controllers--599b568c97--s5dwl-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--f--c182586e87-k8s-calico--kube--controllers--599b568c97--s5dwl-eth0", GenerateName:"calico-kube-controllers-599b568c97-", Namespace:"calico-system", SelfLink:"", UID:"1a745a4f-a77e-4660-9ac5-801112a8b773", ResourceVersion:"948", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 34, 1, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"599b568c97", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-f-c182586e87", ContainerID:"7737ddf2ca213a87f62309a181f8f48d31a299a97234f587068401b4b3a059af", Pod:"calico-kube-controllers-599b568c97-s5dwl", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.34.135/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"califec7a5e1d4c", MAC:"5a:ca:51:8a:b8:ff", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:34:27.622675 containerd[1494]: 2025-09-12 17:34:27.617 [INFO][4777] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="7737ddf2ca213a87f62309a181f8f48d31a299a97234f587068401b4b3a059af" Namespace="calico-system" Pod="calico-kube-controllers-599b568c97-s5dwl" WorkloadEndpoint="ci--4081--3--6--f--c182586e87-k8s-calico--kube--controllers--599b568c97--s5dwl-eth0" Sep 12 17:34:27.658051 containerd[1494]: time="2025-09-12T17:34:27.657971528Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Sep 12 17:34:27.660341 containerd[1494]: time="2025-09-12T17:34:27.658667700Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Sep 12 17:34:27.660341 containerd[1494]: time="2025-09-12T17:34:27.660267103Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 12 17:34:27.660681 containerd[1494]: time="2025-09-12T17:34:27.660572229Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 12 17:34:27.682364 systemd[1]: Started cri-containerd-7737ddf2ca213a87f62309a181f8f48d31a299a97234f587068401b4b3a059af.scope - libcontainer container 7737ddf2ca213a87f62309a181f8f48d31a299a97234f587068401b4b3a059af. Sep 12 17:34:27.704440 systemd-networkd[1404]: cali33a3b8905b9: Link UP Sep 12 17:34:27.706600 systemd-networkd[1404]: cali33a3b8905b9: Gained carrier Sep 12 17:34:27.726412 containerd[1494]: 2025-09-12 17:34:27.433 [INFO][4788] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4081--3--6--f--c182586e87-k8s-csi--node--driver--fsx7j-eth0 csi-node-driver- calico-system 53bbc8dc-3621-4d89-be2b-ae2eb974a294 949 0 2025-09-12 17:34:01 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:6c96d95cc7 k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s ci-4081-3-6-f-c182586e87 csi-node-driver-fsx7j eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] cali33a3b8905b9 [] [] }} ContainerID="2947822c56d8fb60e9e8a7cc57ecf12731351ea16c8878ff24b12e11d93bcdaa" Namespace="calico-system" Pod="csi-node-driver-fsx7j" WorkloadEndpoint="ci--4081--3--6--f--c182586e87-k8s-csi--node--driver--fsx7j-" Sep 12 17:34:27.726412 containerd[1494]: 2025-09-12 17:34:27.433 [INFO][4788] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="2947822c56d8fb60e9e8a7cc57ecf12731351ea16c8878ff24b12e11d93bcdaa" Namespace="calico-system" Pod="csi-node-driver-fsx7j" WorkloadEndpoint="ci--4081--3--6--f--c182586e87-k8s-csi--node--driver--fsx7j-eth0" Sep 12 17:34:27.726412 containerd[1494]: 2025-09-12 17:34:27.484 [INFO][4817] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="2947822c56d8fb60e9e8a7cc57ecf12731351ea16c8878ff24b12e11d93bcdaa" HandleID="k8s-pod-network.2947822c56d8fb60e9e8a7cc57ecf12731351ea16c8878ff24b12e11d93bcdaa" Workload="ci--4081--3--6--f--c182586e87-k8s-csi--node--driver--fsx7j-eth0" Sep 12 17:34:27.726412 containerd[1494]: 2025-09-12 17:34:27.484 [INFO][4817] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="2947822c56d8fb60e9e8a7cc57ecf12731351ea16c8878ff24b12e11d93bcdaa" HandleID="k8s-pod-network.2947822c56d8fb60e9e8a7cc57ecf12731351ea16c8878ff24b12e11d93bcdaa" Workload="ci--4081--3--6--f--c182586e87-k8s-csi--node--driver--fsx7j-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d5600), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4081-3-6-f-c182586e87", "pod":"csi-node-driver-fsx7j", "timestamp":"2025-09-12 17:34:27.484037586 +0000 UTC"}, Hostname:"ci-4081-3-6-f-c182586e87", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 17:34:27.726412 containerd[1494]: 2025-09-12 17:34:27.484 [INFO][4817] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 17:34:27.726412 containerd[1494]: 2025-09-12 17:34:27.566 [INFO][4817] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 12 17:34:27.726412 containerd[1494]: 2025-09-12 17:34:27.566 [INFO][4817] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4081-3-6-f-c182586e87' Sep 12 17:34:27.726412 containerd[1494]: 2025-09-12 17:34:27.613 [INFO][4817] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.2947822c56d8fb60e9e8a7cc57ecf12731351ea16c8878ff24b12e11d93bcdaa" host="ci-4081-3-6-f-c182586e87" Sep 12 17:34:27.726412 containerd[1494]: 2025-09-12 17:34:27.628 [INFO][4817] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4081-3-6-f-c182586e87" Sep 12 17:34:27.726412 containerd[1494]: 2025-09-12 17:34:27.636 [INFO][4817] ipam/ipam.go 511: Trying affinity for 192.168.34.128/26 host="ci-4081-3-6-f-c182586e87" Sep 12 17:34:27.726412 containerd[1494]: 2025-09-12 17:34:27.641 [INFO][4817] ipam/ipam.go 158: Attempting to load block cidr=192.168.34.128/26 host="ci-4081-3-6-f-c182586e87" Sep 12 17:34:27.726412 containerd[1494]: 2025-09-12 17:34:27.646 [INFO][4817] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.34.128/26 host="ci-4081-3-6-f-c182586e87" Sep 12 17:34:27.726412 containerd[1494]: 2025-09-12 17:34:27.646 [INFO][4817] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.34.128/26 handle="k8s-pod-network.2947822c56d8fb60e9e8a7cc57ecf12731351ea16c8878ff24b12e11d93bcdaa" host="ci-4081-3-6-f-c182586e87" Sep 12 17:34:27.726412 containerd[1494]: 2025-09-12 17:34:27.648 [INFO][4817] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.2947822c56d8fb60e9e8a7cc57ecf12731351ea16c8878ff24b12e11d93bcdaa Sep 12 17:34:27.726412 containerd[1494]: 2025-09-12 17:34:27.665 [INFO][4817] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.34.128/26 handle="k8s-pod-network.2947822c56d8fb60e9e8a7cc57ecf12731351ea16c8878ff24b12e11d93bcdaa" host="ci-4081-3-6-f-c182586e87" Sep 12 17:34:27.726412 containerd[1494]: 2025-09-12 17:34:27.676 [INFO][4817] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.34.136/26] block=192.168.34.128/26 handle="k8s-pod-network.2947822c56d8fb60e9e8a7cc57ecf12731351ea16c8878ff24b12e11d93bcdaa" host="ci-4081-3-6-f-c182586e87" Sep 12 17:34:27.726412 containerd[1494]: 2025-09-12 17:34:27.676 [INFO][4817] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.34.136/26] handle="k8s-pod-network.2947822c56d8fb60e9e8a7cc57ecf12731351ea16c8878ff24b12e11d93bcdaa" host="ci-4081-3-6-f-c182586e87" Sep 12 17:34:27.726412 containerd[1494]: 2025-09-12 17:34:27.676 [INFO][4817] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
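The [4817] entries above trace Calico's block-affinity IPAM path end to end: take the host-wide lock, look up the block affine to the node (192.168.34.128/26), load and confirm it, claim a free address from it, write the block back to the datastore, release the lock. Below is a minimal, self-contained Go sketch of that flow; the block type and the autoAssign/next helpers are hypothetical stand-ins for illustration, not the real Calico ipam package API.

package main

import (
	"fmt"
	"net"
	"sync"
)

// block models one /26 allocation block with a host affinity,
// mirroring what the "Attempting to load block" entries refer to.
type block struct {
	cidr      *net.IPNet
	affinity  string          // host this block is affine to
	allocated map[string]bool // ip -> already claimed
}

var ipamLock sync.Mutex // stands in for the host-wide IPAM lock in the log

// autoAssign mirrors the logged sequence: lock, check the host's affine
// block, claim the first free address, "write" the block, unlock.
func autoAssign(b *block, host, handleID string) (net.IP, error) {
	ipamLock.Lock() // "Acquired host-wide IPAM lock."
	defer ipamLock.Unlock()

	if b.affinity != host { // "Trying affinity for 192.168.34.128/26"
		return nil, fmt.Errorf("no affine block for host %s", host)
	}
	// "Attempting to assign 1 addresses from block"
	for ip := b.cidr.IP.Mask(b.cidr.Mask); b.cidr.Contains(ip); ip = next(ip) {
		if !b.allocated[ip.String()] {
			b.allocated[ip.String()] = true // "Writing block in order to claim IPs"
			_ = handleID                    // the real code records the handle with the claim
			return ip, nil                  // "Successfully claimed IPs"
		}
	}
	return nil, fmt.Errorf("block %s exhausted", b.cidr)
}

// next returns the numerically following IP address.
func next(ip net.IP) net.IP {
	out := make(net.IP, len(ip))
	copy(out, ip)
	for i := len(out) - 1; i >= 0; i-- {
		out[i]++
		if out[i] != 0 {
			break
		}
	}
	return out
}

func main() {
	_, cidr, _ := net.ParseCIDR("192.168.34.128/26")
	b := &block{cidr: cidr, affinity: "ci-4081-3-6-f-c182586e87", allocated: map[string]bool{}}
	// .128 through .135 are already claimed, consistent with the earlier
	// [4808] assignment of 192.168.34.135 to the kube-controllers pod.
	for i := 128; i <= 135; i++ {
		b.allocated[fmt.Sprintf("192.168.34.%d", i)] = true
	}
	ip, err := autoAssign(b, "ci-4081-3-6-f-c182586e87",
		"k8s-pod-network.2947822c56d8fb60e9e8a7cc57ecf12731351ea16c8878ff24b12e11d93bcdaa")
	fmt.Println(ip, err) // 192.168.34.136 <nil>, matching the address the log claims
}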
Sep 12 17:34:27.726412 containerd[1494]: 2025-09-12 17:34:27.676 [INFO][4817] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.34.136/26] IPv6=[] ContainerID="2947822c56d8fb60e9e8a7cc57ecf12731351ea16c8878ff24b12e11d93bcdaa" HandleID="k8s-pod-network.2947822c56d8fb60e9e8a7cc57ecf12731351ea16c8878ff24b12e11d93bcdaa" Workload="ci--4081--3--6--f--c182586e87-k8s-csi--node--driver--fsx7j-eth0" Sep 12 17:34:27.727690 containerd[1494]: 2025-09-12 17:34:27.683 [INFO][4788] cni-plugin/k8s.go 418: Populated endpoint ContainerID="2947822c56d8fb60e9e8a7cc57ecf12731351ea16c8878ff24b12e11d93bcdaa" Namespace="calico-system" Pod="csi-node-driver-fsx7j" WorkloadEndpoint="ci--4081--3--6--f--c182586e87-k8s-csi--node--driver--fsx7j-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--f--c182586e87-k8s-csi--node--driver--fsx7j-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"53bbc8dc-3621-4d89-be2b-ae2eb974a294", ResourceVersion:"949", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 34, 1, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"6c96d95cc7", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-f-c182586e87", ContainerID:"", Pod:"csi-node-driver-fsx7j", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.34.136/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali33a3b8905b9", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:34:27.727690 containerd[1494]: 2025-09-12 17:34:27.684 [INFO][4788] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.34.136/32] ContainerID="2947822c56d8fb60e9e8a7cc57ecf12731351ea16c8878ff24b12e11d93bcdaa" Namespace="calico-system" Pod="csi-node-driver-fsx7j" WorkloadEndpoint="ci--4081--3--6--f--c182586e87-k8s-csi--node--driver--fsx7j-eth0" Sep 12 17:34:27.727690 containerd[1494]: 2025-09-12 17:34:27.684 [INFO][4788] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali33a3b8905b9 ContainerID="2947822c56d8fb60e9e8a7cc57ecf12731351ea16c8878ff24b12e11d93bcdaa" Namespace="calico-system" Pod="csi-node-driver-fsx7j" WorkloadEndpoint="ci--4081--3--6--f--c182586e87-k8s-csi--node--driver--fsx7j-eth0" Sep 12 17:34:27.727690 containerd[1494]: 2025-09-12 17:34:27.709 [INFO][4788] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="2947822c56d8fb60e9e8a7cc57ecf12731351ea16c8878ff24b12e11d93bcdaa" Namespace="calico-system" Pod="csi-node-driver-fsx7j" WorkloadEndpoint="ci--4081--3--6--f--c182586e87-k8s-csi--node--driver--fsx7j-eth0" Sep 12 17:34:27.727690 containerd[1494]: 2025-09-12 17:34:27.712 [INFO][4788] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint 
ContainerID="2947822c56d8fb60e9e8a7cc57ecf12731351ea16c8878ff24b12e11d93bcdaa" Namespace="calico-system" Pod="csi-node-driver-fsx7j" WorkloadEndpoint="ci--4081--3--6--f--c182586e87-k8s-csi--node--driver--fsx7j-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--f--c182586e87-k8s-csi--node--driver--fsx7j-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"53bbc8dc-3621-4d89-be2b-ae2eb974a294", ResourceVersion:"949", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 34, 1, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"6c96d95cc7", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-f-c182586e87", ContainerID:"2947822c56d8fb60e9e8a7cc57ecf12731351ea16c8878ff24b12e11d93bcdaa", Pod:"csi-node-driver-fsx7j", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.34.136/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali33a3b8905b9", MAC:"16:b3:0f:3b:ac:0c", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:34:27.727690 containerd[1494]: 2025-09-12 17:34:27.721 [INFO][4788] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="2947822c56d8fb60e9e8a7cc57ecf12731351ea16c8878ff24b12e11d93bcdaa" Namespace="calico-system" Pod="csi-node-driver-fsx7j" WorkloadEndpoint="ci--4081--3--6--f--c182586e87-k8s-csi--node--driver--fsx7j-eth0" Sep 12 17:34:27.745054 containerd[1494]: time="2025-09-12T17:34:27.744829438Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Sep 12 17:34:27.745054 containerd[1494]: time="2025-09-12T17:34:27.744872741Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Sep 12 17:34:27.745054 containerd[1494]: time="2025-09-12T17:34:27.744881937Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 12 17:34:27.745054 containerd[1494]: time="2025-09-12T17:34:27.744949374Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 12 17:34:27.759781 systemd[1]: Started cri-containerd-2947822c56d8fb60e9e8a7cc57ecf12731351ea16c8878ff24b12e11d93bcdaa.scope - libcontainer container 2947822c56d8fb60e9e8a7cc57ecf12731351ea16c8878ff24b12e11d93bcdaa. 
Sep 12 17:34:27.793659 containerd[1494]: time="2025-09-12T17:34:27.793499980Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-599b568c97-s5dwl,Uid:1a745a4f-a77e-4660-9ac5-801112a8b773,Namespace:calico-system,Attempt:1,} returns sandbox id \"7737ddf2ca213a87f62309a181f8f48d31a299a97234f587068401b4b3a059af\"" Sep 12 17:34:27.807398 containerd[1494]: time="2025-09-12T17:34:27.807320513Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-fsx7j,Uid:53bbc8dc-3621-4d89-be2b-ae2eb974a294,Namespace:calico-system,Attempt:1,} returns sandbox id \"2947822c56d8fb60e9e8a7cc57ecf12731351ea16c8878ff24b12e11d93bcdaa\"" Sep 12 17:34:28.849792 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4037800526.mount: Deactivated successfully. Sep 12 17:34:29.062474 systemd-networkd[1404]: cali33a3b8905b9: Gained IPv6LL Sep 12 17:34:29.184038 containerd[1494]: time="2025-09-12T17:34:29.183790987Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:34:29.184845 containerd[1494]: time="2025-09-12T17:34:29.184810827Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.3: active requests=0, bytes read=66357526" Sep 12 17:34:29.185875 containerd[1494]: time="2025-09-12T17:34:29.185841257Z" level=info msg="ImageCreate event name:\"sha256:a7d029fd8f6be94c26af980675c1650818e1e6e19dbd2f8c13e6e61963f021e8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:34:29.195156 containerd[1494]: time="2025-09-12T17:34:29.194428449Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane@sha256:46297703ab3739331a00a58f0d6a5498c8d3b6523ad947eed68592ee0f3e79f0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:34:29.195156 containerd[1494]: time="2025-09-12T17:34:29.195074517Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/goldmane:v3.30.3\" with image id \"sha256:a7d029fd8f6be94c26af980675c1650818e1e6e19dbd2f8c13e6e61963f021e8\", repo tag \"ghcr.io/flatcar/calico/goldmane:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/goldmane@sha256:46297703ab3739331a00a58f0d6a5498c8d3b6523ad947eed68592ee0f3e79f0\", size \"66357372\" in 3.128098929s" Sep 12 17:34:29.195156 containerd[1494]: time="2025-09-12T17:34:29.195096428Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.3\" returns image reference \"sha256:a7d029fd8f6be94c26af980675c1650818e1e6e19dbd2f8c13e6e61963f021e8\"" Sep 12 17:34:29.196292 containerd[1494]: time="2025-09-12T17:34:29.196199725Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\"" Sep 12 17:34:29.198427 containerd[1494]: time="2025-09-12T17:34:29.198396692Z" level=info msg="CreateContainer within sandbox \"5a0d91b33f460614a3b619958e39f5021fa374631aaf7aa43e52be6e4d546d12\" for container &ContainerMetadata{Name:goldmane,Attempt:0,}" Sep 12 17:34:29.214342 containerd[1494]: time="2025-09-12T17:34:29.214311063Z" level=info msg="CreateContainer within sandbox \"5a0d91b33f460614a3b619958e39f5021fa374631aaf7aa43e52be6e4d546d12\" for &ContainerMetadata{Name:goldmane,Attempt:0,} returns container id \"c069596cb3f11b2bc0a19bee2d67332733dfb32070b9427552ea3c39246e5e10\"" Sep 12 17:34:29.215348 containerd[1494]: time="2025-09-12T17:34:29.215320273Z" level=info msg="StartContainer for \"c069596cb3f11b2bc0a19bee2d67332733dfb32070b9427552ea3c39246e5e10\"" Sep 12 17:34:29.245860 systemd[1]: Started 
cri-containerd-c069596cb3f11b2bc0a19bee2d67332733dfb32070b9427552ea3c39246e5e10.scope - libcontainer container c069596cb3f11b2bc0a19bee2d67332733dfb32070b9427552ea3c39246e5e10. Sep 12 17:34:29.256779 systemd-networkd[1404]: califec7a5e1d4c: Gained IPv6LL Sep 12 17:34:29.284091 containerd[1494]: time="2025-09-12T17:34:29.284044487Z" level=info msg="StartContainer for \"c069596cb3f11b2bc0a19bee2d67332733dfb32070b9427552ea3c39246e5e10\" returns successfully" Sep 12 17:34:29.382729 systemd-networkd[1404]: calib96ad64bb2c: Gained IPv6LL Sep 12 17:34:29.409246 kubelet[2561]: I0912 17:34:29.409088 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/goldmane-54d579b49d-wnp7f" podStartSLOduration=24.953333231 podStartE2EDuration="29.409072382s" podCreationTimestamp="2025-09-12 17:34:00 +0000 UTC" firstStartedPulling="2025-09-12 17:34:24.740298188 +0000 UTC m=+40.767713278" lastFinishedPulling="2025-09-12 17:34:29.196037339 +0000 UTC m=+45.223452429" observedRunningTime="2025-09-12 17:34:29.407591213 +0000 UTC m=+45.435006293" watchObservedRunningTime="2025-09-12 17:34:29.409072382 +0000 UTC m=+45.436487462" Sep 12 17:34:29.456365 systemd[1]: run-containerd-runc-k8s.io-c069596cb3f11b2bc0a19bee2d67332733dfb32070b9427552ea3c39246e5e10-runc.Wq5tCX.mount: Deactivated successfully. Sep 12 17:34:32.572688 containerd[1494]: time="2025-09-12T17:34:32.572641229Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:34:32.573660 containerd[1494]: time="2025-09-12T17:34:32.573590936Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.3: active requests=0, bytes read=47333864" Sep 12 17:34:32.575588 containerd[1494]: time="2025-09-12T17:34:32.574759545Z" level=info msg="ImageCreate event name:\"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:34:32.576953 containerd[1494]: time="2025-09-12T17:34:32.576394733Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:34:32.576953 containerd[1494]: time="2025-09-12T17:34:32.576854208Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" with image id \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\", size \"48826583\" in 3.380564905s" Sep 12 17:34:32.576953 containerd[1494]: time="2025-09-12T17:34:32.576878042Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" returns image reference \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\"" Sep 12 17:34:32.578463 containerd[1494]: time="2025-09-12T17:34:32.577718383Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\"" Sep 12 17:34:32.579832 containerd[1494]: time="2025-09-12T17:34:32.579781165Z" level=info msg="CreateContainer within sandbox \"bf699b73a6ad071f5fccc960cf1d02448f7d1eb2a2d22333c04578c66b33e8c5\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Sep 12 17:34:32.628045 containerd[1494]: time="2025-09-12T17:34:32.628011966Z" level=info msg="CreateContainer within sandbox 
\"bf699b73a6ad071f5fccc960cf1d02448f7d1eb2a2d22333c04578c66b33e8c5\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"e94337ca3fcc19a1c713de9c1c3d2a8bb3b1e0a3076e8916122a231f525f7884\"" Sep 12 17:34:32.629234 containerd[1494]: time="2025-09-12T17:34:32.629197427Z" level=info msg="StartContainer for \"e94337ca3fcc19a1c713de9c1c3d2a8bb3b1e0a3076e8916122a231f525f7884\"" Sep 12 17:34:32.668393 systemd[1]: Started cri-containerd-e94337ca3fcc19a1c713de9c1c3d2a8bb3b1e0a3076e8916122a231f525f7884.scope - libcontainer container e94337ca3fcc19a1c713de9c1c3d2a8bb3b1e0a3076e8916122a231f525f7884. Sep 12 17:34:32.714574 containerd[1494]: time="2025-09-12T17:34:32.714519385Z" level=info msg="StartContainer for \"e94337ca3fcc19a1c713de9c1c3d2a8bb3b1e0a3076e8916122a231f525f7884\" returns successfully" Sep 12 17:34:33.179443 containerd[1494]: time="2025-09-12T17:34:33.179390363Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:34:33.180501 containerd[1494]: time="2025-09-12T17:34:33.180466477Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.3: active requests=0, bytes read=77" Sep 12 17:34:33.185664 containerd[1494]: time="2025-09-12T17:34:33.185622999Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" with image id \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\", size \"48826583\" in 607.88005ms" Sep 12 17:34:33.185923 containerd[1494]: time="2025-09-12T17:34:33.185777850Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" returns image reference \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\"" Sep 12 17:34:33.187341 containerd[1494]: time="2025-09-12T17:34:33.187316345Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\"" Sep 12 17:34:33.190766 containerd[1494]: time="2025-09-12T17:34:33.190739415Z" level=info msg="CreateContainer within sandbox \"de5c2b268ceea309c4a25537a0bf220e73a72b8c18816adefa558cf960204d33\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Sep 12 17:34:33.210118 containerd[1494]: time="2025-09-12T17:34:33.209769067Z" level=info msg="CreateContainer within sandbox \"de5c2b268ceea309c4a25537a0bf220e73a72b8c18816adefa558cf960204d33\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"e2fe180c915108fb98652d0a2f57e335f21719bd8859d34ff7eece4383c71d56\"" Sep 12 17:34:33.211024 containerd[1494]: time="2025-09-12T17:34:33.210293653Z" level=info msg="StartContainer for \"e2fe180c915108fb98652d0a2f57e335f21719bd8859d34ff7eece4383c71d56\"" Sep 12 17:34:33.245864 systemd[1]: Started cri-containerd-e2fe180c915108fb98652d0a2f57e335f21719bd8859d34ff7eece4383c71d56.scope - libcontainer container e2fe180c915108fb98652d0a2f57e335f21719bd8859d34ff7eece4383c71d56. 
Sep 12 17:34:33.300376 containerd[1494]: time="2025-09-12T17:34:33.300320658Z" level=info msg="StartContainer for \"e2fe180c915108fb98652d0a2f57e335f21719bd8859d34ff7eece4383c71d56\" returns successfully" Sep 12 17:34:33.415822 kubelet[2561]: I0912 17:34:33.415770 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-6dd46897bd-htdld" podStartSLOduration=29.845927071 podStartE2EDuration="35.41575321s" podCreationTimestamp="2025-09-12 17:33:58 +0000 UTC" firstStartedPulling="2025-09-12 17:34:27.616969636 +0000 UTC m=+43.644384716" lastFinishedPulling="2025-09-12 17:34:33.186795765 +0000 UTC m=+49.214210855" observedRunningTime="2025-09-12 17:34:33.414968635 +0000 UTC m=+49.442383714" watchObservedRunningTime="2025-09-12 17:34:33.41575321 +0000 UTC m=+49.443168291" Sep 12 17:34:34.412983 kubelet[2561]: I0912 17:34:34.410023 2561 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 12 17:34:34.413434 kubelet[2561]: I0912 17:34:34.410041 2561 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 12 17:34:35.313399 kubelet[2561]: I0912 17:34:35.312741 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-6dd46897bd-bbjqp" podStartSLOduration=29.51558052 podStartE2EDuration="37.312710225s" podCreationTimestamp="2025-09-12 17:33:58 +0000 UTC" firstStartedPulling="2025-09-12 17:34:24.780502978 +0000 UTC m=+40.807918058" lastFinishedPulling="2025-09-12 17:34:32.577632682 +0000 UTC m=+48.605047763" observedRunningTime="2025-09-12 17:34:33.427430279 +0000 UTC m=+49.454845359" watchObservedRunningTime="2025-09-12 17:34:35.312710225 +0000 UTC m=+51.340125315" Sep 12 17:34:36.574489 containerd[1494]: time="2025-09-12T17:34:36.574424068Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:34:36.575898 containerd[1494]: time="2025-09-12T17:34:36.575627601Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.3: active requests=0, bytes read=51277746" Sep 12 17:34:36.577259 containerd[1494]: time="2025-09-12T17:34:36.576965927Z" level=info msg="ImageCreate event name:\"sha256:df191a54fb79de3c693f8b1b864a1bd3bd14f63b3fff9d5fa4869c471ce3cd37\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:34:36.579887 containerd[1494]: time="2025-09-12T17:34:36.579846674Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:27c4187717f08f0a5727019d8beb7597665eb47e69eaa1d7d091a7e28913e577\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:34:36.580296 containerd[1494]: time="2025-09-12T17:34:36.580263818Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" with image id \"sha256:df191a54fb79de3c693f8b1b864a1bd3bd14f63b3fff9d5fa4869c471ce3cd37\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:27c4187717f08f0a5727019d8beb7597665eb47e69eaa1d7d091a7e28913e577\", size \"52770417\" in 3.392918559s" Sep 12 17:34:36.580296 containerd[1494]: time="2025-09-12T17:34:36.580294377Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" returns image reference \"sha256:df191a54fb79de3c693f8b1b864a1bd3bd14f63b3fff9d5fa4869c471ce3cd37\"" Sep 12 17:34:36.587923 containerd[1494]: time="2025-09-12T17:34:36.587892082Z" level=info 
msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.3\"" Sep 12 17:34:36.758173 containerd[1494]: time="2025-09-12T17:34:36.757987918Z" level=info msg="CreateContainer within sandbox \"7737ddf2ca213a87f62309a181f8f48d31a299a97234f587068401b4b3a059af\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" Sep 12 17:34:36.826754 containerd[1494]: time="2025-09-12T17:34:36.826636086Z" level=info msg="CreateContainer within sandbox \"7737ddf2ca213a87f62309a181f8f48d31a299a97234f587068401b4b3a059af\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"21bb87a3fda7d5325fac931727e1c42d0390ec9032f0d28c1c0fb5313199a735\"" Sep 12 17:34:36.827732 containerd[1494]: time="2025-09-12T17:34:36.827682793Z" level=info msg="StartContainer for \"21bb87a3fda7d5325fac931727e1c42d0390ec9032f0d28c1c0fb5313199a735\"" Sep 12 17:34:36.988569 systemd[1]: Started cri-containerd-21bb87a3fda7d5325fac931727e1c42d0390ec9032f0d28c1c0fb5313199a735.scope - libcontainer container 21bb87a3fda7d5325fac931727e1c42d0390ec9032f0d28c1c0fb5313199a735. Sep 12 17:34:37.059704 containerd[1494]: time="2025-09-12T17:34:37.058853929Z" level=info msg="StartContainer for \"21bb87a3fda7d5325fac931727e1c42d0390ec9032f0d28c1c0fb5313199a735\" returns successfully" Sep 12 17:34:37.630840 kubelet[2561]: I0912 17:34:37.630785 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-599b568c97-s5dwl" podStartSLOduration=27.835704816 podStartE2EDuration="36.621844122s" podCreationTimestamp="2025-09-12 17:34:01 +0000 UTC" firstStartedPulling="2025-09-12 17:34:27.795529676 +0000 UTC m=+43.822944755" lastFinishedPulling="2025-09-12 17:34:36.581668981 +0000 UTC m=+52.609084061" observedRunningTime="2025-09-12 17:34:37.618469929 +0000 UTC m=+53.645885010" watchObservedRunningTime="2025-09-12 17:34:37.621844122 +0000 UTC m=+53.649259202" Sep 12 17:34:38.327114 containerd[1494]: time="2025-09-12T17:34:38.327029237Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:34:38.329260 containerd[1494]: time="2025-09-12T17:34:38.329088327Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.3: active requests=0, bytes read=8760527" Sep 12 17:34:38.340161 containerd[1494]: time="2025-09-12T17:34:38.340136214Z" level=info msg="ImageCreate event name:\"sha256:666f4e02e75c30547109a06ed75b415a990a970811173aa741379cfaac4d9dd7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:34:38.342155 containerd[1494]: time="2025-09-12T17:34:38.342118931Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:f22c88018d8b58c4ef0052f594b216a13bd6852166ac131a538c5ab2fba23bb2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:34:38.342923 containerd[1494]: time="2025-09-12T17:34:38.342554199Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.30.3\" with image id \"sha256:666f4e02e75c30547109a06ed75b415a990a970811173aa741379cfaac4d9dd7\", repo tag \"ghcr.io/flatcar/calico/csi:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:f22c88018d8b58c4ef0052f594b216a13bd6852166ac131a538c5ab2fba23bb2\", size \"10253230\" in 1.754632211s" Sep 12 17:34:38.342923 containerd[1494]: time="2025-09-12T17:34:38.342577733Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.3\" returns image reference \"sha256:666f4e02e75c30547109a06ed75b415a990a970811173aa741379cfaac4d9dd7\"" Sep 12 
17:34:38.352837 containerd[1494]: time="2025-09-12T17:34:38.352807932Z" level=info msg="CreateContainer within sandbox \"2947822c56d8fb60e9e8a7cc57ecf12731351ea16c8878ff24b12e11d93bcdaa\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" Sep 12 17:34:38.380285 containerd[1494]: time="2025-09-12T17:34:38.380249472Z" level=info msg="CreateContainer within sandbox \"2947822c56d8fb60e9e8a7cc57ecf12731351ea16c8878ff24b12e11d93bcdaa\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"4cd0a197c3531133e7586a50606030c026600480910934024329d31884cebc0a\"" Sep 12 17:34:38.380872 containerd[1494]: time="2025-09-12T17:34:38.380849871Z" level=info msg="StartContainer for \"4cd0a197c3531133e7586a50606030c026600480910934024329d31884cebc0a\"" Sep 12 17:34:38.413331 systemd[1]: Started cri-containerd-4cd0a197c3531133e7586a50606030c026600480910934024329d31884cebc0a.scope - libcontainer container 4cd0a197c3531133e7586a50606030c026600480910934024329d31884cebc0a. Sep 12 17:34:38.436363 containerd[1494]: time="2025-09-12T17:34:38.436325816Z" level=info msg="StartContainer for \"4cd0a197c3531133e7586a50606030c026600480910934024329d31884cebc0a\" returns successfully" Sep 12 17:34:38.443235 containerd[1494]: time="2025-09-12T17:34:38.443199156Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\"" Sep 12 17:34:40.447704 containerd[1494]: time="2025-09-12T17:34:40.447635127Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:34:40.448687 containerd[1494]: time="2025-09-12T17:34:40.448613055Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3: active requests=0, bytes read=14698542" Sep 12 17:34:40.450203 containerd[1494]: time="2025-09-12T17:34:40.449279538Z" level=info msg="ImageCreate event name:\"sha256:b8f31c4fdaed3fa08af64de3d37d65a4c2ea0d9f6f522cb60d2e0cb424f8dd8a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:34:40.451562 containerd[1494]: time="2025-09-12T17:34:40.451000090Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:731ab232ca708102ab332340b1274d5cd656aa896ecc5368ee95850b811df86f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:34:40.451562 containerd[1494]: time="2025-09-12T17:34:40.451455997Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" with image id \"sha256:b8f31c4fdaed3fa08af64de3d37d65a4c2ea0d9f6f522cb60d2e0cb424f8dd8a\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:731ab232ca708102ab332340b1274d5cd656aa896ecc5368ee95850b811df86f\", size \"16191197\" in 2.008097412s" Sep 12 17:34:40.451562 containerd[1494]: time="2025-09-12T17:34:40.451480061Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" returns image reference \"sha256:b8f31c4fdaed3fa08af64de3d37d65a4c2ea0d9f6f522cb60d2e0cb424f8dd8a\"" Sep 12 17:34:40.499762 containerd[1494]: time="2025-09-12T17:34:40.499725029Z" level=info msg="CreateContainer within sandbox \"2947822c56d8fb60e9e8a7cc57ecf12731351ea16c8878ff24b12e11d93bcdaa\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" Sep 12 17:34:40.529623 containerd[1494]: time="2025-09-12T17:34:40.529582805Z" level=info msg="CreateContainer within sandbox 
\"2947822c56d8fb60e9e8a7cc57ecf12731351ea16c8878ff24b12e11d93bcdaa\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"1a4bba00bebb4515befc15a86cd2a1adc4514f595b645bf5415b5dbd4cd6af35\"" Sep 12 17:34:40.530110 containerd[1494]: time="2025-09-12T17:34:40.530094476Z" level=info msg="StartContainer for \"1a4bba00bebb4515befc15a86cd2a1adc4514f595b645bf5415b5dbd4cd6af35\"" Sep 12 17:34:40.607881 systemd[1]: Started cri-containerd-1a4bba00bebb4515befc15a86cd2a1adc4514f595b645bf5415b5dbd4cd6af35.scope - libcontainer container 1a4bba00bebb4515befc15a86cd2a1adc4514f595b645bf5415b5dbd4cd6af35. Sep 12 17:34:40.639135 containerd[1494]: time="2025-09-12T17:34:40.639092514Z" level=info msg="StartContainer for \"1a4bba00bebb4515befc15a86cd2a1adc4514f595b645bf5415b5dbd4cd6af35\" returns successfully" Sep 12 17:34:41.238304 kubelet[2561]: I0912 17:34:41.232437 2561 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 Sep 12 17:34:41.242356 kubelet[2561]: I0912 17:34:41.242333 2561 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock Sep 12 17:34:41.657617 kubelet[2561]: I0912 17:34:41.657527 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-fsx7j" podStartSLOduration=27.979738713 podStartE2EDuration="40.655475693s" podCreationTimestamp="2025-09-12 17:34:01 +0000 UTC" firstStartedPulling="2025-09-12 17:34:27.808568034 +0000 UTC m=+43.835983114" lastFinishedPulling="2025-09-12 17:34:40.484305013 +0000 UTC m=+56.511720094" observedRunningTime="2025-09-12 17:34:41.654733419 +0000 UTC m=+57.682148519" watchObservedRunningTime="2025-09-12 17:34:41.655475693 +0000 UTC m=+57.682890772" Sep 12 17:34:44.115026 containerd[1494]: time="2025-09-12T17:34:44.114869113Z" level=info msg="StopPodSandbox for \"3dcba8787d31ee60c778e4fcb2630c4594e8b38a67e6d502bba2326930192848\"" Sep 12 17:34:44.551469 containerd[1494]: 2025-09-12 17:34:44.345 [WARNING][5335] cni-plugin/k8s.go 604: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="3dcba8787d31ee60c778e4fcb2630c4594e8b38a67e6d502bba2326930192848" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--f--c182586e87-k8s-goldmane--54d579b49d--wnp7f-eth0", GenerateName:"goldmane-54d579b49d-", Namespace:"calico-system", SelfLink:"", UID:"8d975815-4db9-4493-ae13-0fed34b78044", ResourceVersion:"976", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 34, 0, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"54d579b49d", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-f-c182586e87", ContainerID:"5a0d91b33f460614a3b619958e39f5021fa374631aaf7aa43e52be6e4d546d12", Pod:"goldmane-54d579b49d-wnp7f", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.34.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"calia6ca5fbe0b7", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:34:44.551469 containerd[1494]: 2025-09-12 17:34:44.347 [INFO][5335] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="3dcba8787d31ee60c778e4fcb2630c4594e8b38a67e6d502bba2326930192848" Sep 12 17:34:44.551469 containerd[1494]: 2025-09-12 17:34:44.347 [INFO][5335] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="3dcba8787d31ee60c778e4fcb2630c4594e8b38a67e6d502bba2326930192848" iface="eth0" netns="" Sep 12 17:34:44.551469 containerd[1494]: 2025-09-12 17:34:44.347 [INFO][5335] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="3dcba8787d31ee60c778e4fcb2630c4594e8b38a67e6d502bba2326930192848" Sep 12 17:34:44.551469 containerd[1494]: 2025-09-12 17:34:44.347 [INFO][5335] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="3dcba8787d31ee60c778e4fcb2630c4594e8b38a67e6d502bba2326930192848" Sep 12 17:34:44.551469 containerd[1494]: 2025-09-12 17:34:44.528 [INFO][5342] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="3dcba8787d31ee60c778e4fcb2630c4594e8b38a67e6d502bba2326930192848" HandleID="k8s-pod-network.3dcba8787d31ee60c778e4fcb2630c4594e8b38a67e6d502bba2326930192848" Workload="ci--4081--3--6--f--c182586e87-k8s-goldmane--54d579b49d--wnp7f-eth0" Sep 12 17:34:44.551469 containerd[1494]: 2025-09-12 17:34:44.530 [INFO][5342] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 17:34:44.551469 containerd[1494]: 2025-09-12 17:34:44.530 [INFO][5342] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 12 17:34:44.551469 containerd[1494]: 2025-09-12 17:34:44.544 [WARNING][5342] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="3dcba8787d31ee60c778e4fcb2630c4594e8b38a67e6d502bba2326930192848" HandleID="k8s-pod-network.3dcba8787d31ee60c778e4fcb2630c4594e8b38a67e6d502bba2326930192848" Workload="ci--4081--3--6--f--c182586e87-k8s-goldmane--54d579b49d--wnp7f-eth0" Sep 12 17:34:44.551469 containerd[1494]: 2025-09-12 17:34:44.544 [INFO][5342] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="3dcba8787d31ee60c778e4fcb2630c4594e8b38a67e6d502bba2326930192848" HandleID="k8s-pod-network.3dcba8787d31ee60c778e4fcb2630c4594e8b38a67e6d502bba2326930192848" Workload="ci--4081--3--6--f--c182586e87-k8s-goldmane--54d579b49d--wnp7f-eth0" Sep 12 17:34:44.551469 containerd[1494]: 2025-09-12 17:34:44.546 [INFO][5342] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 17:34:44.551469 containerd[1494]: 2025-09-12 17:34:44.548 [INFO][5335] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="3dcba8787d31ee60c778e4fcb2630c4594e8b38a67e6d502bba2326930192848" Sep 12 17:34:44.556685 containerd[1494]: time="2025-09-12T17:34:44.551507301Z" level=info msg="TearDown network for sandbox \"3dcba8787d31ee60c778e4fcb2630c4594e8b38a67e6d502bba2326930192848\" successfully" Sep 12 17:34:44.556685 containerd[1494]: time="2025-09-12T17:34:44.551539542Z" level=info msg="StopPodSandbox for \"3dcba8787d31ee60c778e4fcb2630c4594e8b38a67e6d502bba2326930192848\" returns successfully" Sep 12 17:34:44.748666 containerd[1494]: time="2025-09-12T17:34:44.748034275Z" level=info msg="RemovePodSandbox for \"3dcba8787d31ee60c778e4fcb2630c4594e8b38a67e6d502bba2326930192848\"" Sep 12 17:34:44.754071 containerd[1494]: time="2025-09-12T17:34:44.754016552Z" level=info msg="Forcibly stopping sandbox \"3dcba8787d31ee60c778e4fcb2630c4594e8b38a67e6d502bba2326930192848\"" Sep 12 17:34:44.848539 containerd[1494]: 2025-09-12 17:34:44.801 [WARNING][5357] cni-plugin/k8s.go 604: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="3dcba8787d31ee60c778e4fcb2630c4594e8b38a67e6d502bba2326930192848" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--f--c182586e87-k8s-goldmane--54d579b49d--wnp7f-eth0", GenerateName:"goldmane-54d579b49d-", Namespace:"calico-system", SelfLink:"", UID:"8d975815-4db9-4493-ae13-0fed34b78044", ResourceVersion:"976", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 34, 0, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"54d579b49d", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-f-c182586e87", ContainerID:"5a0d91b33f460614a3b619958e39f5021fa374631aaf7aa43e52be6e4d546d12", Pod:"goldmane-54d579b49d-wnp7f", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.34.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"calia6ca5fbe0b7", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:34:44.848539 containerd[1494]: 2025-09-12 17:34:44.801 [INFO][5357] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="3dcba8787d31ee60c778e4fcb2630c4594e8b38a67e6d502bba2326930192848" Sep 12 17:34:44.848539 containerd[1494]: 2025-09-12 17:34:44.801 [INFO][5357] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="3dcba8787d31ee60c778e4fcb2630c4594e8b38a67e6d502bba2326930192848" iface="eth0" netns="" Sep 12 17:34:44.848539 containerd[1494]: 2025-09-12 17:34:44.801 [INFO][5357] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="3dcba8787d31ee60c778e4fcb2630c4594e8b38a67e6d502bba2326930192848" Sep 12 17:34:44.848539 containerd[1494]: 2025-09-12 17:34:44.801 [INFO][5357] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="3dcba8787d31ee60c778e4fcb2630c4594e8b38a67e6d502bba2326930192848" Sep 12 17:34:44.848539 containerd[1494]: 2025-09-12 17:34:44.830 [INFO][5364] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="3dcba8787d31ee60c778e4fcb2630c4594e8b38a67e6d502bba2326930192848" HandleID="k8s-pod-network.3dcba8787d31ee60c778e4fcb2630c4594e8b38a67e6d502bba2326930192848" Workload="ci--4081--3--6--f--c182586e87-k8s-goldmane--54d579b49d--wnp7f-eth0" Sep 12 17:34:44.848539 containerd[1494]: 2025-09-12 17:34:44.831 [INFO][5364] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 17:34:44.848539 containerd[1494]: 2025-09-12 17:34:44.831 [INFO][5364] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 12 17:34:44.848539 containerd[1494]: 2025-09-12 17:34:44.839 [WARNING][5364] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="3dcba8787d31ee60c778e4fcb2630c4594e8b38a67e6d502bba2326930192848" HandleID="k8s-pod-network.3dcba8787d31ee60c778e4fcb2630c4594e8b38a67e6d502bba2326930192848" Workload="ci--4081--3--6--f--c182586e87-k8s-goldmane--54d579b49d--wnp7f-eth0" Sep 12 17:34:44.848539 containerd[1494]: 2025-09-12 17:34:44.839 [INFO][5364] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="3dcba8787d31ee60c778e4fcb2630c4594e8b38a67e6d502bba2326930192848" HandleID="k8s-pod-network.3dcba8787d31ee60c778e4fcb2630c4594e8b38a67e6d502bba2326930192848" Workload="ci--4081--3--6--f--c182586e87-k8s-goldmane--54d579b49d--wnp7f-eth0" Sep 12 17:34:44.848539 containerd[1494]: 2025-09-12 17:34:44.841 [INFO][5364] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 17:34:44.848539 containerd[1494]: 2025-09-12 17:34:44.844 [INFO][5357] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="3dcba8787d31ee60c778e4fcb2630c4594e8b38a67e6d502bba2326930192848" Sep 12 17:34:44.851617 containerd[1494]: time="2025-09-12T17:34:44.849373209Z" level=info msg="TearDown network for sandbox \"3dcba8787d31ee60c778e4fcb2630c4594e8b38a67e6d502bba2326930192848\" successfully" Sep 12 17:34:44.872995 containerd[1494]: time="2025-09-12T17:34:44.872923511Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"3dcba8787d31ee60c778e4fcb2630c4594e8b38a67e6d502bba2326930192848\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Sep 12 17:34:44.897816 containerd[1494]: time="2025-09-12T17:34:44.897764316Z" level=info msg="RemovePodSandbox \"3dcba8787d31ee60c778e4fcb2630c4594e8b38a67e6d502bba2326930192848\" returns successfully" Sep 12 17:34:44.906398 containerd[1494]: time="2025-09-12T17:34:44.906329785Z" level=info msg="StopPodSandbox for \"fda3f332aa630c70d7cfbd1b402fa296a294f6bf502a4c10eb91d19b97d978e3\"" Sep 12 17:34:44.994765 containerd[1494]: 2025-09-12 17:34:44.952 [WARNING][5379] cni-plugin/k8s.go 604: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="fda3f332aa630c70d7cfbd1b402fa296a294f6bf502a4c10eb91d19b97d978e3" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--f--c182586e87-k8s-csi--node--driver--fsx7j-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"53bbc8dc-3621-4d89-be2b-ae2eb974a294", ResourceVersion:"1047", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 34, 1, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"6c96d95cc7", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-f-c182586e87", ContainerID:"2947822c56d8fb60e9e8a7cc57ecf12731351ea16c8878ff24b12e11d93bcdaa", Pod:"csi-node-driver-fsx7j", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.34.136/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali33a3b8905b9", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:34:44.994765 containerd[1494]: 2025-09-12 17:34:44.952 [INFO][5379] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="fda3f332aa630c70d7cfbd1b402fa296a294f6bf502a4c10eb91d19b97d978e3" Sep 12 17:34:44.994765 containerd[1494]: 2025-09-12 17:34:44.953 [INFO][5379] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="fda3f332aa630c70d7cfbd1b402fa296a294f6bf502a4c10eb91d19b97d978e3" iface="eth0" netns="" Sep 12 17:34:44.994765 containerd[1494]: 2025-09-12 17:34:44.953 [INFO][5379] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="fda3f332aa630c70d7cfbd1b402fa296a294f6bf502a4c10eb91d19b97d978e3" Sep 12 17:34:44.994765 containerd[1494]: 2025-09-12 17:34:44.953 [INFO][5379] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="fda3f332aa630c70d7cfbd1b402fa296a294f6bf502a4c10eb91d19b97d978e3" Sep 12 17:34:44.994765 containerd[1494]: 2025-09-12 17:34:44.981 [INFO][5386] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="fda3f332aa630c70d7cfbd1b402fa296a294f6bf502a4c10eb91d19b97d978e3" HandleID="k8s-pod-network.fda3f332aa630c70d7cfbd1b402fa296a294f6bf502a4c10eb91d19b97d978e3" Workload="ci--4081--3--6--f--c182586e87-k8s-csi--node--driver--fsx7j-eth0" Sep 12 17:34:44.994765 containerd[1494]: 2025-09-12 17:34:44.981 [INFO][5386] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 17:34:44.994765 containerd[1494]: 2025-09-12 17:34:44.982 [INFO][5386] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 12 17:34:44.994765 containerd[1494]: 2025-09-12 17:34:44.988 [WARNING][5386] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="fda3f332aa630c70d7cfbd1b402fa296a294f6bf502a4c10eb91d19b97d978e3" HandleID="k8s-pod-network.fda3f332aa630c70d7cfbd1b402fa296a294f6bf502a4c10eb91d19b97d978e3" Workload="ci--4081--3--6--f--c182586e87-k8s-csi--node--driver--fsx7j-eth0" Sep 12 17:34:44.994765 containerd[1494]: 2025-09-12 17:34:44.988 [INFO][5386] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="fda3f332aa630c70d7cfbd1b402fa296a294f6bf502a4c10eb91d19b97d978e3" HandleID="k8s-pod-network.fda3f332aa630c70d7cfbd1b402fa296a294f6bf502a4c10eb91d19b97d978e3" Workload="ci--4081--3--6--f--c182586e87-k8s-csi--node--driver--fsx7j-eth0" Sep 12 17:34:44.994765 containerd[1494]: 2025-09-12 17:34:44.990 [INFO][5386] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 17:34:44.994765 containerd[1494]: 2025-09-12 17:34:44.992 [INFO][5379] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="fda3f332aa630c70d7cfbd1b402fa296a294f6bf502a4c10eb91d19b97d978e3" Sep 12 17:34:44.994765 containerd[1494]: time="2025-09-12T17:34:44.994641010Z" level=info msg="TearDown network for sandbox \"fda3f332aa630c70d7cfbd1b402fa296a294f6bf502a4c10eb91d19b97d978e3\" successfully" Sep 12 17:34:44.994765 containerd[1494]: time="2025-09-12T17:34:44.994665696Z" level=info msg="StopPodSandbox for \"fda3f332aa630c70d7cfbd1b402fa296a294f6bf502a4c10eb91d19b97d978e3\" returns successfully" Sep 12 17:34:44.996638 containerd[1494]: time="2025-09-12T17:34:44.995027274Z" level=info msg="RemovePodSandbox for \"fda3f332aa630c70d7cfbd1b402fa296a294f6bf502a4c10eb91d19b97d978e3\"" Sep 12 17:34:44.996638 containerd[1494]: time="2025-09-12T17:34:44.995050869Z" level=info msg="Forcibly stopping sandbox \"fda3f332aa630c70d7cfbd1b402fa296a294f6bf502a4c10eb91d19b97d978e3\"" Sep 12 17:34:45.055470 containerd[1494]: 2025-09-12 17:34:45.025 [WARNING][5400] cni-plugin/k8s.go 604: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="fda3f332aa630c70d7cfbd1b402fa296a294f6bf502a4c10eb91d19b97d978e3" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--f--c182586e87-k8s-csi--node--driver--fsx7j-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"53bbc8dc-3621-4d89-be2b-ae2eb974a294", ResourceVersion:"1047", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 34, 1, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"6c96d95cc7", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-f-c182586e87", ContainerID:"2947822c56d8fb60e9e8a7cc57ecf12731351ea16c8878ff24b12e11d93bcdaa", Pod:"csi-node-driver-fsx7j", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.34.136/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali33a3b8905b9", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:34:45.055470 containerd[1494]: 2025-09-12 17:34:45.025 [INFO][5400] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="fda3f332aa630c70d7cfbd1b402fa296a294f6bf502a4c10eb91d19b97d978e3" Sep 12 17:34:45.055470 containerd[1494]: 2025-09-12 17:34:45.025 [INFO][5400] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="fda3f332aa630c70d7cfbd1b402fa296a294f6bf502a4c10eb91d19b97d978e3" iface="eth0" netns="" Sep 12 17:34:45.055470 containerd[1494]: 2025-09-12 17:34:45.025 [INFO][5400] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="fda3f332aa630c70d7cfbd1b402fa296a294f6bf502a4c10eb91d19b97d978e3" Sep 12 17:34:45.055470 containerd[1494]: 2025-09-12 17:34:45.025 [INFO][5400] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="fda3f332aa630c70d7cfbd1b402fa296a294f6bf502a4c10eb91d19b97d978e3" Sep 12 17:34:45.055470 containerd[1494]: 2025-09-12 17:34:45.043 [INFO][5407] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="fda3f332aa630c70d7cfbd1b402fa296a294f6bf502a4c10eb91d19b97d978e3" HandleID="k8s-pod-network.fda3f332aa630c70d7cfbd1b402fa296a294f6bf502a4c10eb91d19b97d978e3" Workload="ci--4081--3--6--f--c182586e87-k8s-csi--node--driver--fsx7j-eth0" Sep 12 17:34:45.055470 containerd[1494]: 2025-09-12 17:34:45.043 [INFO][5407] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 17:34:45.055470 containerd[1494]: 2025-09-12 17:34:45.043 [INFO][5407] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 12 17:34:45.055470 containerd[1494]: 2025-09-12 17:34:45.049 [WARNING][5407] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="fda3f332aa630c70d7cfbd1b402fa296a294f6bf502a4c10eb91d19b97d978e3" HandleID="k8s-pod-network.fda3f332aa630c70d7cfbd1b402fa296a294f6bf502a4c10eb91d19b97d978e3" Workload="ci--4081--3--6--f--c182586e87-k8s-csi--node--driver--fsx7j-eth0" Sep 12 17:34:45.055470 containerd[1494]: 2025-09-12 17:34:45.049 [INFO][5407] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="fda3f332aa630c70d7cfbd1b402fa296a294f6bf502a4c10eb91d19b97d978e3" HandleID="k8s-pod-network.fda3f332aa630c70d7cfbd1b402fa296a294f6bf502a4c10eb91d19b97d978e3" Workload="ci--4081--3--6--f--c182586e87-k8s-csi--node--driver--fsx7j-eth0" Sep 12 17:34:45.055470 containerd[1494]: 2025-09-12 17:34:45.051 [INFO][5407] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 17:34:45.055470 containerd[1494]: 2025-09-12 17:34:45.053 [INFO][5400] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="fda3f332aa630c70d7cfbd1b402fa296a294f6bf502a4c10eb91d19b97d978e3" Sep 12 17:34:45.056995 containerd[1494]: time="2025-09-12T17:34:45.055523207Z" level=info msg="TearDown network for sandbox \"fda3f332aa630c70d7cfbd1b402fa296a294f6bf502a4c10eb91d19b97d978e3\" successfully" Sep 12 17:34:45.062657 containerd[1494]: time="2025-09-12T17:34:45.062619574Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"fda3f332aa630c70d7cfbd1b402fa296a294f6bf502a4c10eb91d19b97d978e3\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Sep 12 17:34:45.062715 containerd[1494]: time="2025-09-12T17:34:45.062695586Z" level=info msg="RemovePodSandbox \"fda3f332aa630c70d7cfbd1b402fa296a294f6bf502a4c10eb91d19b97d978e3\" returns successfully" Sep 12 17:34:45.063187 containerd[1494]: time="2025-09-12T17:34:45.063152745Z" level=info msg="StopPodSandbox for \"8839e8ab9f8eb8f13c11b60ca60e7bc34bd56fdca9a761fac69e876e635d1405\"" Sep 12 17:34:45.129553 containerd[1494]: 2025-09-12 17:34:45.095 [WARNING][5421] cni-plugin/k8s.go 598: WorkloadEndpoint does not exist in the datastore, moving forward with the clean up ContainerID="8839e8ab9f8eb8f13c11b60ca60e7bc34bd56fdca9a761fac69e876e635d1405" WorkloadEndpoint="ci--4081--3--6--f--c182586e87-k8s-whisker--67c45c5486--4zft2-eth0" Sep 12 17:34:45.129553 containerd[1494]: 2025-09-12 17:34:45.096 [INFO][5421] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="8839e8ab9f8eb8f13c11b60ca60e7bc34bd56fdca9a761fac69e876e635d1405" Sep 12 17:34:45.129553 containerd[1494]: 2025-09-12 17:34:45.096 [INFO][5421] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. 
ContainerID="8839e8ab9f8eb8f13c11b60ca60e7bc34bd56fdca9a761fac69e876e635d1405" iface="eth0" netns="" Sep 12 17:34:45.129553 containerd[1494]: 2025-09-12 17:34:45.096 [INFO][5421] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="8839e8ab9f8eb8f13c11b60ca60e7bc34bd56fdca9a761fac69e876e635d1405" Sep 12 17:34:45.129553 containerd[1494]: 2025-09-12 17:34:45.096 [INFO][5421] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="8839e8ab9f8eb8f13c11b60ca60e7bc34bd56fdca9a761fac69e876e635d1405" Sep 12 17:34:45.129553 containerd[1494]: 2025-09-12 17:34:45.117 [INFO][5429] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="8839e8ab9f8eb8f13c11b60ca60e7bc34bd56fdca9a761fac69e876e635d1405" HandleID="k8s-pod-network.8839e8ab9f8eb8f13c11b60ca60e7bc34bd56fdca9a761fac69e876e635d1405" Workload="ci--4081--3--6--f--c182586e87-k8s-whisker--67c45c5486--4zft2-eth0" Sep 12 17:34:45.129553 containerd[1494]: 2025-09-12 17:34:45.118 [INFO][5429] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 17:34:45.129553 containerd[1494]: 2025-09-12 17:34:45.118 [INFO][5429] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 12 17:34:45.129553 containerd[1494]: 2025-09-12 17:34:45.123 [WARNING][5429] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="8839e8ab9f8eb8f13c11b60ca60e7bc34bd56fdca9a761fac69e876e635d1405" HandleID="k8s-pod-network.8839e8ab9f8eb8f13c11b60ca60e7bc34bd56fdca9a761fac69e876e635d1405" Workload="ci--4081--3--6--f--c182586e87-k8s-whisker--67c45c5486--4zft2-eth0" Sep 12 17:34:45.129553 containerd[1494]: 2025-09-12 17:34:45.123 [INFO][5429] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="8839e8ab9f8eb8f13c11b60ca60e7bc34bd56fdca9a761fac69e876e635d1405" HandleID="k8s-pod-network.8839e8ab9f8eb8f13c11b60ca60e7bc34bd56fdca9a761fac69e876e635d1405" Workload="ci--4081--3--6--f--c182586e87-k8s-whisker--67c45c5486--4zft2-eth0" Sep 12 17:34:45.129553 containerd[1494]: 2025-09-12 17:34:45.125 [INFO][5429] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 17:34:45.129553 containerd[1494]: 2025-09-12 17:34:45.127 [INFO][5421] cni-plugin/k8s.go 653: Teardown processing complete. 
ContainerID="8839e8ab9f8eb8f13c11b60ca60e7bc34bd56fdca9a761fac69e876e635d1405" Sep 12 17:34:45.129553 containerd[1494]: time="2025-09-12T17:34:45.129493372Z" level=info msg="TearDown network for sandbox \"8839e8ab9f8eb8f13c11b60ca60e7bc34bd56fdca9a761fac69e876e635d1405\" successfully" Sep 12 17:34:45.129553 containerd[1494]: time="2025-09-12T17:34:45.129516636Z" level=info msg="StopPodSandbox for \"8839e8ab9f8eb8f13c11b60ca60e7bc34bd56fdca9a761fac69e876e635d1405\" returns successfully" Sep 12 17:34:45.133051 containerd[1494]: time="2025-09-12T17:34:45.131348035Z" level=info msg="RemovePodSandbox for \"8839e8ab9f8eb8f13c11b60ca60e7bc34bd56fdca9a761fac69e876e635d1405\"" Sep 12 17:34:45.133051 containerd[1494]: time="2025-09-12T17:34:45.131375416Z" level=info msg="Forcibly stopping sandbox \"8839e8ab9f8eb8f13c11b60ca60e7bc34bd56fdca9a761fac69e876e635d1405\"" Sep 12 17:34:45.187732 containerd[1494]: 2025-09-12 17:34:45.159 [WARNING][5444] cni-plugin/k8s.go 598: WorkloadEndpoint does not exist in the datastore, moving forward with the clean up ContainerID="8839e8ab9f8eb8f13c11b60ca60e7bc34bd56fdca9a761fac69e876e635d1405" WorkloadEndpoint="ci--4081--3--6--f--c182586e87-k8s-whisker--67c45c5486--4zft2-eth0" Sep 12 17:34:45.187732 containerd[1494]: 2025-09-12 17:34:45.159 [INFO][5444] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="8839e8ab9f8eb8f13c11b60ca60e7bc34bd56fdca9a761fac69e876e635d1405" Sep 12 17:34:45.187732 containerd[1494]: 2025-09-12 17:34:45.159 [INFO][5444] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="8839e8ab9f8eb8f13c11b60ca60e7bc34bd56fdca9a761fac69e876e635d1405" iface="eth0" netns="" Sep 12 17:34:45.187732 containerd[1494]: 2025-09-12 17:34:45.159 [INFO][5444] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="8839e8ab9f8eb8f13c11b60ca60e7bc34bd56fdca9a761fac69e876e635d1405" Sep 12 17:34:45.187732 containerd[1494]: 2025-09-12 17:34:45.159 [INFO][5444] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="8839e8ab9f8eb8f13c11b60ca60e7bc34bd56fdca9a761fac69e876e635d1405" Sep 12 17:34:45.187732 containerd[1494]: 2025-09-12 17:34:45.177 [INFO][5452] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="8839e8ab9f8eb8f13c11b60ca60e7bc34bd56fdca9a761fac69e876e635d1405" HandleID="k8s-pod-network.8839e8ab9f8eb8f13c11b60ca60e7bc34bd56fdca9a761fac69e876e635d1405" Workload="ci--4081--3--6--f--c182586e87-k8s-whisker--67c45c5486--4zft2-eth0" Sep 12 17:34:45.187732 containerd[1494]: 2025-09-12 17:34:45.177 [INFO][5452] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 17:34:45.187732 containerd[1494]: 2025-09-12 17:34:45.177 [INFO][5452] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 12 17:34:45.187732 containerd[1494]: 2025-09-12 17:34:45.183 [WARNING][5452] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="8839e8ab9f8eb8f13c11b60ca60e7bc34bd56fdca9a761fac69e876e635d1405" HandleID="k8s-pod-network.8839e8ab9f8eb8f13c11b60ca60e7bc34bd56fdca9a761fac69e876e635d1405" Workload="ci--4081--3--6--f--c182586e87-k8s-whisker--67c45c5486--4zft2-eth0" Sep 12 17:34:45.187732 containerd[1494]: 2025-09-12 17:34:45.183 [INFO][5452] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="8839e8ab9f8eb8f13c11b60ca60e7bc34bd56fdca9a761fac69e876e635d1405" HandleID="k8s-pod-network.8839e8ab9f8eb8f13c11b60ca60e7bc34bd56fdca9a761fac69e876e635d1405" Workload="ci--4081--3--6--f--c182586e87-k8s-whisker--67c45c5486--4zft2-eth0" Sep 12 17:34:45.187732 containerd[1494]: 2025-09-12 17:34:45.184 [INFO][5452] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 17:34:45.187732 containerd[1494]: 2025-09-12 17:34:45.186 [INFO][5444] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="8839e8ab9f8eb8f13c11b60ca60e7bc34bd56fdca9a761fac69e876e635d1405" Sep 12 17:34:45.188591 containerd[1494]: time="2025-09-12T17:34:45.187769826Z" level=info msg="TearDown network for sandbox \"8839e8ab9f8eb8f13c11b60ca60e7bc34bd56fdca9a761fac69e876e635d1405\" successfully" Sep 12 17:34:45.190796 containerd[1494]: time="2025-09-12T17:34:45.190768647Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"8839e8ab9f8eb8f13c11b60ca60e7bc34bd56fdca9a761fac69e876e635d1405\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Sep 12 17:34:45.190888 containerd[1494]: time="2025-09-12T17:34:45.190821777Z" level=info msg="RemovePodSandbox \"8839e8ab9f8eb8f13c11b60ca60e7bc34bd56fdca9a761fac69e876e635d1405\" returns successfully" Sep 12 17:34:45.191255 containerd[1494]: time="2025-09-12T17:34:45.191191712Z" level=info msg="StopPodSandbox for \"8d3c00ad0af03fd362d9677b923aa1969166be62afd063bd0b73f7ecf9b0feaa\"" Sep 12 17:34:45.257189 containerd[1494]: 2025-09-12 17:34:45.227 [WARNING][5466] cni-plugin/k8s.go 604: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="8d3c00ad0af03fd362d9677b923aa1969166be62afd063bd0b73f7ecf9b0feaa" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--f--c182586e87-k8s-calico--kube--controllers--599b568c97--s5dwl-eth0", GenerateName:"calico-kube-controllers-599b568c97-", Namespace:"calico-system", SelfLink:"", UID:"1a745a4f-a77e-4660-9ac5-801112a8b773", ResourceVersion:"1029", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 34, 1, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"599b568c97", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-f-c182586e87", ContainerID:"7737ddf2ca213a87f62309a181f8f48d31a299a97234f587068401b4b3a059af", Pod:"calico-kube-controllers-599b568c97-s5dwl", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.34.135/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"califec7a5e1d4c", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:34:45.257189 containerd[1494]: 2025-09-12 17:34:45.228 [INFO][5466] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="8d3c00ad0af03fd362d9677b923aa1969166be62afd063bd0b73f7ecf9b0feaa" Sep 12 17:34:45.257189 containerd[1494]: 2025-09-12 17:34:45.228 [INFO][5466] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="8d3c00ad0af03fd362d9677b923aa1969166be62afd063bd0b73f7ecf9b0feaa" iface="eth0" netns="" Sep 12 17:34:45.257189 containerd[1494]: 2025-09-12 17:34:45.228 [INFO][5466] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="8d3c00ad0af03fd362d9677b923aa1969166be62afd063bd0b73f7ecf9b0feaa" Sep 12 17:34:45.257189 containerd[1494]: 2025-09-12 17:34:45.228 [INFO][5466] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="8d3c00ad0af03fd362d9677b923aa1969166be62afd063bd0b73f7ecf9b0feaa" Sep 12 17:34:45.257189 containerd[1494]: 2025-09-12 17:34:45.245 [INFO][5473] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="8d3c00ad0af03fd362d9677b923aa1969166be62afd063bd0b73f7ecf9b0feaa" HandleID="k8s-pod-network.8d3c00ad0af03fd362d9677b923aa1969166be62afd063bd0b73f7ecf9b0feaa" Workload="ci--4081--3--6--f--c182586e87-k8s-calico--kube--controllers--599b568c97--s5dwl-eth0" Sep 12 17:34:45.257189 containerd[1494]: 2025-09-12 17:34:45.245 [INFO][5473] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 17:34:45.257189 containerd[1494]: 2025-09-12 17:34:45.245 [INFO][5473] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 12 17:34:45.257189 containerd[1494]: 2025-09-12 17:34:45.250 [WARNING][5473] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="8d3c00ad0af03fd362d9677b923aa1969166be62afd063bd0b73f7ecf9b0feaa" HandleID="k8s-pod-network.8d3c00ad0af03fd362d9677b923aa1969166be62afd063bd0b73f7ecf9b0feaa" Workload="ci--4081--3--6--f--c182586e87-k8s-calico--kube--controllers--599b568c97--s5dwl-eth0" Sep 12 17:34:45.257189 containerd[1494]: 2025-09-12 17:34:45.250 [INFO][5473] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="8d3c00ad0af03fd362d9677b923aa1969166be62afd063bd0b73f7ecf9b0feaa" HandleID="k8s-pod-network.8d3c00ad0af03fd362d9677b923aa1969166be62afd063bd0b73f7ecf9b0feaa" Workload="ci--4081--3--6--f--c182586e87-k8s-calico--kube--controllers--599b568c97--s5dwl-eth0" Sep 12 17:34:45.257189 containerd[1494]: 2025-09-12 17:34:45.253 [INFO][5473] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 17:34:45.257189 containerd[1494]: 2025-09-12 17:34:45.255 [INFO][5466] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="8d3c00ad0af03fd362d9677b923aa1969166be62afd063bd0b73f7ecf9b0feaa" Sep 12 17:34:45.259070 containerd[1494]: time="2025-09-12T17:34:45.257233848Z" level=info msg="TearDown network for sandbox \"8d3c00ad0af03fd362d9677b923aa1969166be62afd063bd0b73f7ecf9b0feaa\" successfully" Sep 12 17:34:45.259070 containerd[1494]: time="2025-09-12T17:34:45.257257002Z" level=info msg="StopPodSandbox for \"8d3c00ad0af03fd362d9677b923aa1969166be62afd063bd0b73f7ecf9b0feaa\" returns successfully" Sep 12 17:34:45.259070 containerd[1494]: time="2025-09-12T17:34:45.257861487Z" level=info msg="RemovePodSandbox for \"8d3c00ad0af03fd362d9677b923aa1969166be62afd063bd0b73f7ecf9b0feaa\"" Sep 12 17:34:45.259070 containerd[1494]: time="2025-09-12T17:34:45.257891153Z" level=info msg="Forcibly stopping sandbox \"8d3c00ad0af03fd362d9677b923aa1969166be62afd063bd0b73f7ecf9b0feaa\"" Sep 12 17:34:45.315248 containerd[1494]: 2025-09-12 17:34:45.283 [WARNING][5487] cni-plugin/k8s.go 604: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="8d3c00ad0af03fd362d9677b923aa1969166be62afd063bd0b73f7ecf9b0feaa" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--f--c182586e87-k8s-calico--kube--controllers--599b568c97--s5dwl-eth0", GenerateName:"calico-kube-controllers-599b568c97-", Namespace:"calico-system", SelfLink:"", UID:"1a745a4f-a77e-4660-9ac5-801112a8b773", ResourceVersion:"1029", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 34, 1, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"599b568c97", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-f-c182586e87", ContainerID:"7737ddf2ca213a87f62309a181f8f48d31a299a97234f587068401b4b3a059af", Pod:"calico-kube-controllers-599b568c97-s5dwl", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.34.135/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"califec7a5e1d4c", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:34:45.315248 containerd[1494]: 2025-09-12 17:34:45.284 [INFO][5487] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="8d3c00ad0af03fd362d9677b923aa1969166be62afd063bd0b73f7ecf9b0feaa" Sep 12 17:34:45.315248 containerd[1494]: 2025-09-12 17:34:45.284 [INFO][5487] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="8d3c00ad0af03fd362d9677b923aa1969166be62afd063bd0b73f7ecf9b0feaa" iface="eth0" netns="" Sep 12 17:34:45.315248 containerd[1494]: 2025-09-12 17:34:45.284 [INFO][5487] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="8d3c00ad0af03fd362d9677b923aa1969166be62afd063bd0b73f7ecf9b0feaa" Sep 12 17:34:45.315248 containerd[1494]: 2025-09-12 17:34:45.284 [INFO][5487] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="8d3c00ad0af03fd362d9677b923aa1969166be62afd063bd0b73f7ecf9b0feaa" Sep 12 17:34:45.315248 containerd[1494]: 2025-09-12 17:34:45.301 [INFO][5494] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="8d3c00ad0af03fd362d9677b923aa1969166be62afd063bd0b73f7ecf9b0feaa" HandleID="k8s-pod-network.8d3c00ad0af03fd362d9677b923aa1969166be62afd063bd0b73f7ecf9b0feaa" Workload="ci--4081--3--6--f--c182586e87-k8s-calico--kube--controllers--599b568c97--s5dwl-eth0" Sep 12 17:34:45.315248 containerd[1494]: 2025-09-12 17:34:45.301 [INFO][5494] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 17:34:45.315248 containerd[1494]: 2025-09-12 17:34:45.301 [INFO][5494] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 12 17:34:45.315248 containerd[1494]: 2025-09-12 17:34:45.308 [WARNING][5494] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="8d3c00ad0af03fd362d9677b923aa1969166be62afd063bd0b73f7ecf9b0feaa" HandleID="k8s-pod-network.8d3c00ad0af03fd362d9677b923aa1969166be62afd063bd0b73f7ecf9b0feaa" Workload="ci--4081--3--6--f--c182586e87-k8s-calico--kube--controllers--599b568c97--s5dwl-eth0" Sep 12 17:34:45.315248 containerd[1494]: 2025-09-12 17:34:45.308 [INFO][5494] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="8d3c00ad0af03fd362d9677b923aa1969166be62afd063bd0b73f7ecf9b0feaa" HandleID="k8s-pod-network.8d3c00ad0af03fd362d9677b923aa1969166be62afd063bd0b73f7ecf9b0feaa" Workload="ci--4081--3--6--f--c182586e87-k8s-calico--kube--controllers--599b568c97--s5dwl-eth0" Sep 12 17:34:45.315248 containerd[1494]: 2025-09-12 17:34:45.310 [INFO][5494] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 17:34:45.315248 containerd[1494]: 2025-09-12 17:34:45.312 [INFO][5487] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="8d3c00ad0af03fd362d9677b923aa1969166be62afd063bd0b73f7ecf9b0feaa" Sep 12 17:34:45.315248 containerd[1494]: time="2025-09-12T17:34:45.315087557Z" level=info msg="TearDown network for sandbox \"8d3c00ad0af03fd362d9677b923aa1969166be62afd063bd0b73f7ecf9b0feaa\" successfully" Sep 12 17:34:45.338156 containerd[1494]: time="2025-09-12T17:34:45.338112098Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"8d3c00ad0af03fd362d9677b923aa1969166be62afd063bd0b73f7ecf9b0feaa\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Sep 12 17:34:45.338295 containerd[1494]: time="2025-09-12T17:34:45.338166049Z" level=info msg="RemovePodSandbox \"8d3c00ad0af03fd362d9677b923aa1969166be62afd063bd0b73f7ecf9b0feaa\" returns successfully" Sep 12 17:34:45.338624 containerd[1494]: time="2025-09-12T17:34:45.338571339Z" level=info msg="StopPodSandbox for \"1e490ff9aeb9142a469899b9994e9e803af0a211a6813cd0269e8e566e92a5b6\"" Sep 12 17:34:45.425393 containerd[1494]: 2025-09-12 17:34:45.396 [WARNING][5508] cni-plugin/k8s.go 604: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="1e490ff9aeb9142a469899b9994e9e803af0a211a6813cd0269e8e566e92a5b6" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--f--c182586e87-k8s-calico--apiserver--6dd46897bd--bbjqp-eth0", GenerateName:"calico-apiserver-6dd46897bd-", Namespace:"calico-apiserver", SelfLink:"", UID:"9ee60034-1a31-4418-b743-0f97ab43bc92", ResourceVersion:"1007", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 33, 58, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6dd46897bd", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-f-c182586e87", ContainerID:"bf699b73a6ad071f5fccc960cf1d02448f7d1eb2a2d22333c04578c66b33e8c5", Pod:"calico-apiserver-6dd46897bd-bbjqp", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.34.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calif4e7666601f", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:34:45.425393 containerd[1494]: 2025-09-12 17:34:45.397 [INFO][5508] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="1e490ff9aeb9142a469899b9994e9e803af0a211a6813cd0269e8e566e92a5b6" Sep 12 17:34:45.425393 containerd[1494]: 2025-09-12 17:34:45.397 [INFO][5508] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="1e490ff9aeb9142a469899b9994e9e803af0a211a6813cd0269e8e566e92a5b6" iface="eth0" netns="" Sep 12 17:34:45.425393 containerd[1494]: 2025-09-12 17:34:45.397 [INFO][5508] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="1e490ff9aeb9142a469899b9994e9e803af0a211a6813cd0269e8e566e92a5b6" Sep 12 17:34:45.425393 containerd[1494]: 2025-09-12 17:34:45.397 [INFO][5508] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="1e490ff9aeb9142a469899b9994e9e803af0a211a6813cd0269e8e566e92a5b6" Sep 12 17:34:45.425393 containerd[1494]: 2025-09-12 17:34:45.415 [INFO][5515] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="1e490ff9aeb9142a469899b9994e9e803af0a211a6813cd0269e8e566e92a5b6" HandleID="k8s-pod-network.1e490ff9aeb9142a469899b9994e9e803af0a211a6813cd0269e8e566e92a5b6" Workload="ci--4081--3--6--f--c182586e87-k8s-calico--apiserver--6dd46897bd--bbjqp-eth0" Sep 12 17:34:45.425393 containerd[1494]: 2025-09-12 17:34:45.415 [INFO][5515] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 17:34:45.425393 containerd[1494]: 2025-09-12 17:34:45.415 [INFO][5515] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 12 17:34:45.425393 containerd[1494]: 2025-09-12 17:34:45.420 [WARNING][5515] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="1e490ff9aeb9142a469899b9994e9e803af0a211a6813cd0269e8e566e92a5b6" HandleID="k8s-pod-network.1e490ff9aeb9142a469899b9994e9e803af0a211a6813cd0269e8e566e92a5b6" Workload="ci--4081--3--6--f--c182586e87-k8s-calico--apiserver--6dd46897bd--bbjqp-eth0" Sep 12 17:34:45.425393 containerd[1494]: 2025-09-12 17:34:45.420 [INFO][5515] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="1e490ff9aeb9142a469899b9994e9e803af0a211a6813cd0269e8e566e92a5b6" HandleID="k8s-pod-network.1e490ff9aeb9142a469899b9994e9e803af0a211a6813cd0269e8e566e92a5b6" Workload="ci--4081--3--6--f--c182586e87-k8s-calico--apiserver--6dd46897bd--bbjqp-eth0" Sep 12 17:34:45.425393 containerd[1494]: 2025-09-12 17:34:45.421 [INFO][5515] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 17:34:45.425393 containerd[1494]: 2025-09-12 17:34:45.423 [INFO][5508] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="1e490ff9aeb9142a469899b9994e9e803af0a211a6813cd0269e8e566e92a5b6" Sep 12 17:34:45.425393 containerd[1494]: time="2025-09-12T17:34:45.425287311Z" level=info msg="TearDown network for sandbox \"1e490ff9aeb9142a469899b9994e9e803af0a211a6813cd0269e8e566e92a5b6\" successfully" Sep 12 17:34:45.425393 containerd[1494]: time="2025-09-12T17:34:45.425311106Z" level=info msg="StopPodSandbox for \"1e490ff9aeb9142a469899b9994e9e803af0a211a6813cd0269e8e566e92a5b6\" returns successfully" Sep 12 17:34:45.426674 containerd[1494]: time="2025-09-12T17:34:45.426342683Z" level=info msg="RemovePodSandbox for \"1e490ff9aeb9142a469899b9994e9e803af0a211a6813cd0269e8e566e92a5b6\"" Sep 12 17:34:45.426674 containerd[1494]: time="2025-09-12T17:34:45.426376196Z" level=info msg="Forcibly stopping sandbox \"1e490ff9aeb9142a469899b9994e9e803af0a211a6813cd0269e8e566e92a5b6\"" Sep 12 17:34:45.482322 containerd[1494]: 2025-09-12 17:34:45.455 [WARNING][5529] cni-plugin/k8s.go 604: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="1e490ff9aeb9142a469899b9994e9e803af0a211a6813cd0269e8e566e92a5b6" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--f--c182586e87-k8s-calico--apiserver--6dd46897bd--bbjqp-eth0", GenerateName:"calico-apiserver-6dd46897bd-", Namespace:"calico-apiserver", SelfLink:"", UID:"9ee60034-1a31-4418-b743-0f97ab43bc92", ResourceVersion:"1007", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 33, 58, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6dd46897bd", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-f-c182586e87", ContainerID:"bf699b73a6ad071f5fccc960cf1d02448f7d1eb2a2d22333c04578c66b33e8c5", Pod:"calico-apiserver-6dd46897bd-bbjqp", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.34.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calif4e7666601f", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:34:45.482322 containerd[1494]: 2025-09-12 17:34:45.456 [INFO][5529] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="1e490ff9aeb9142a469899b9994e9e803af0a211a6813cd0269e8e566e92a5b6" Sep 12 17:34:45.482322 containerd[1494]: 2025-09-12 17:34:45.456 [INFO][5529] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="1e490ff9aeb9142a469899b9994e9e803af0a211a6813cd0269e8e566e92a5b6" iface="eth0" netns="" Sep 12 17:34:45.482322 containerd[1494]: 2025-09-12 17:34:45.456 [INFO][5529] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="1e490ff9aeb9142a469899b9994e9e803af0a211a6813cd0269e8e566e92a5b6" Sep 12 17:34:45.482322 containerd[1494]: 2025-09-12 17:34:45.456 [INFO][5529] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="1e490ff9aeb9142a469899b9994e9e803af0a211a6813cd0269e8e566e92a5b6" Sep 12 17:34:45.482322 containerd[1494]: 2025-09-12 17:34:45.472 [INFO][5536] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="1e490ff9aeb9142a469899b9994e9e803af0a211a6813cd0269e8e566e92a5b6" HandleID="k8s-pod-network.1e490ff9aeb9142a469899b9994e9e803af0a211a6813cd0269e8e566e92a5b6" Workload="ci--4081--3--6--f--c182586e87-k8s-calico--apiserver--6dd46897bd--bbjqp-eth0" Sep 12 17:34:45.482322 containerd[1494]: 2025-09-12 17:34:45.472 [INFO][5536] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 17:34:45.482322 containerd[1494]: 2025-09-12 17:34:45.472 [INFO][5536] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 12 17:34:45.482322 containerd[1494]: 2025-09-12 17:34:45.477 [WARNING][5536] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="1e490ff9aeb9142a469899b9994e9e803af0a211a6813cd0269e8e566e92a5b6" HandleID="k8s-pod-network.1e490ff9aeb9142a469899b9994e9e803af0a211a6813cd0269e8e566e92a5b6" Workload="ci--4081--3--6--f--c182586e87-k8s-calico--apiserver--6dd46897bd--bbjqp-eth0" Sep 12 17:34:45.482322 containerd[1494]: 2025-09-12 17:34:45.477 [INFO][5536] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="1e490ff9aeb9142a469899b9994e9e803af0a211a6813cd0269e8e566e92a5b6" HandleID="k8s-pod-network.1e490ff9aeb9142a469899b9994e9e803af0a211a6813cd0269e8e566e92a5b6" Workload="ci--4081--3--6--f--c182586e87-k8s-calico--apiserver--6dd46897bd--bbjqp-eth0" Sep 12 17:34:45.482322 containerd[1494]: 2025-09-12 17:34:45.479 [INFO][5536] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 17:34:45.482322 containerd[1494]: 2025-09-12 17:34:45.480 [INFO][5529] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="1e490ff9aeb9142a469899b9994e9e803af0a211a6813cd0269e8e566e92a5b6" Sep 12 17:34:45.482908 containerd[1494]: time="2025-09-12T17:34:45.482402023Z" level=info msg="TearDown network for sandbox \"1e490ff9aeb9142a469899b9994e9e803af0a211a6813cd0269e8e566e92a5b6\" successfully" Sep 12 17:34:45.485652 containerd[1494]: time="2025-09-12T17:34:45.485590461Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"1e490ff9aeb9142a469899b9994e9e803af0a211a6813cd0269e8e566e92a5b6\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Sep 12 17:34:45.485708 containerd[1494]: time="2025-09-12T17:34:45.485672955Z" level=info msg="RemovePodSandbox \"1e490ff9aeb9142a469899b9994e9e803af0a211a6813cd0269e8e566e92a5b6\" returns successfully" Sep 12 17:34:45.486132 containerd[1494]: time="2025-09-12T17:34:45.486104076Z" level=info msg="StopPodSandbox for \"66452d647c7edebd5506ddc6e3bc28bf1104546ea2fa33230da48fad374cdb30\"" Sep 12 17:34:45.543060 containerd[1494]: 2025-09-12 17:34:45.514 [WARNING][5551] cni-plugin/k8s.go 604: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="66452d647c7edebd5506ddc6e3bc28bf1104546ea2fa33230da48fad374cdb30" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--f--c182586e87-k8s-coredns--668d6bf9bc--s4d7g-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"33f03454-7bb2-47fc-bcae-807921ce8ad3", ResourceVersion:"936", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 33, 49, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-f-c182586e87", ContainerID:"47c0aa89385cfc22baf7a1d40394f1009cfaaa8af45053624c69e492710e4244", Pod:"coredns-668d6bf9bc-s4d7g", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.34.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali8bcc98ec2c8", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:34:45.543060 containerd[1494]: 2025-09-12 17:34:45.514 [INFO][5551] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="66452d647c7edebd5506ddc6e3bc28bf1104546ea2fa33230da48fad374cdb30" Sep 12 17:34:45.543060 containerd[1494]: 2025-09-12 17:34:45.514 [INFO][5551] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="66452d647c7edebd5506ddc6e3bc28bf1104546ea2fa33230da48fad374cdb30" iface="eth0" netns="" Sep 12 17:34:45.543060 containerd[1494]: 2025-09-12 17:34:45.514 [INFO][5551] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="66452d647c7edebd5506ddc6e3bc28bf1104546ea2fa33230da48fad374cdb30" Sep 12 17:34:45.543060 containerd[1494]: 2025-09-12 17:34:45.515 [INFO][5551] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="66452d647c7edebd5506ddc6e3bc28bf1104546ea2fa33230da48fad374cdb30" Sep 12 17:34:45.543060 containerd[1494]: 2025-09-12 17:34:45.532 [INFO][5558] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="66452d647c7edebd5506ddc6e3bc28bf1104546ea2fa33230da48fad374cdb30" HandleID="k8s-pod-network.66452d647c7edebd5506ddc6e3bc28bf1104546ea2fa33230da48fad374cdb30" Workload="ci--4081--3--6--f--c182586e87-k8s-coredns--668d6bf9bc--s4d7g-eth0" Sep 12 17:34:45.543060 containerd[1494]: 2025-09-12 17:34:45.532 [INFO][5558] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 17:34:45.543060 containerd[1494]: 2025-09-12 17:34:45.532 [INFO][5558] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 12 17:34:45.543060 containerd[1494]: 2025-09-12 17:34:45.538 [WARNING][5558] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="66452d647c7edebd5506ddc6e3bc28bf1104546ea2fa33230da48fad374cdb30" HandleID="k8s-pod-network.66452d647c7edebd5506ddc6e3bc28bf1104546ea2fa33230da48fad374cdb30" Workload="ci--4081--3--6--f--c182586e87-k8s-coredns--668d6bf9bc--s4d7g-eth0" Sep 12 17:34:45.543060 containerd[1494]: 2025-09-12 17:34:45.538 [INFO][5558] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="66452d647c7edebd5506ddc6e3bc28bf1104546ea2fa33230da48fad374cdb30" HandleID="k8s-pod-network.66452d647c7edebd5506ddc6e3bc28bf1104546ea2fa33230da48fad374cdb30" Workload="ci--4081--3--6--f--c182586e87-k8s-coredns--668d6bf9bc--s4d7g-eth0" Sep 12 17:34:45.543060 containerd[1494]: 2025-09-12 17:34:45.539 [INFO][5558] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 17:34:45.543060 containerd[1494]: 2025-09-12 17:34:45.541 [INFO][5551] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="66452d647c7edebd5506ddc6e3bc28bf1104546ea2fa33230da48fad374cdb30" Sep 12 17:34:45.544153 containerd[1494]: time="2025-09-12T17:34:45.543094243Z" level=info msg="TearDown network for sandbox \"66452d647c7edebd5506ddc6e3bc28bf1104546ea2fa33230da48fad374cdb30\" successfully" Sep 12 17:34:45.544153 containerd[1494]: time="2025-09-12T17:34:45.543117236Z" level=info msg="StopPodSandbox for \"66452d647c7edebd5506ddc6e3bc28bf1104546ea2fa33230da48fad374cdb30\" returns successfully" Sep 12 17:34:45.544153 containerd[1494]: time="2025-09-12T17:34:45.543545630Z" level=info msg="RemovePodSandbox for \"66452d647c7edebd5506ddc6e3bc28bf1104546ea2fa33230da48fad374cdb30\"" Sep 12 17:34:45.544153 containerd[1494]: time="2025-09-12T17:34:45.543568223Z" level=info msg="Forcibly stopping sandbox \"66452d647c7edebd5506ddc6e3bc28bf1104546ea2fa33230da48fad374cdb30\"" Sep 12 17:34:45.605247 containerd[1494]: 2025-09-12 17:34:45.577 [WARNING][5573] cni-plugin/k8s.go 604: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="66452d647c7edebd5506ddc6e3bc28bf1104546ea2fa33230da48fad374cdb30" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--f--c182586e87-k8s-coredns--668d6bf9bc--s4d7g-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"33f03454-7bb2-47fc-bcae-807921ce8ad3", ResourceVersion:"936", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 33, 49, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-f-c182586e87", ContainerID:"47c0aa89385cfc22baf7a1d40394f1009cfaaa8af45053624c69e492710e4244", Pod:"coredns-668d6bf9bc-s4d7g", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.34.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali8bcc98ec2c8", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:34:45.605247 containerd[1494]: 2025-09-12 17:34:45.577 [INFO][5573] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="66452d647c7edebd5506ddc6e3bc28bf1104546ea2fa33230da48fad374cdb30" Sep 12 17:34:45.605247 containerd[1494]: 2025-09-12 17:34:45.577 [INFO][5573] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="66452d647c7edebd5506ddc6e3bc28bf1104546ea2fa33230da48fad374cdb30" iface="eth0" netns="" Sep 12 17:34:45.605247 containerd[1494]: 2025-09-12 17:34:45.577 [INFO][5573] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="66452d647c7edebd5506ddc6e3bc28bf1104546ea2fa33230da48fad374cdb30" Sep 12 17:34:45.605247 containerd[1494]: 2025-09-12 17:34:45.577 [INFO][5573] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="66452d647c7edebd5506ddc6e3bc28bf1104546ea2fa33230da48fad374cdb30" Sep 12 17:34:45.605247 containerd[1494]: 2025-09-12 17:34:45.595 [INFO][5580] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="66452d647c7edebd5506ddc6e3bc28bf1104546ea2fa33230da48fad374cdb30" HandleID="k8s-pod-network.66452d647c7edebd5506ddc6e3bc28bf1104546ea2fa33230da48fad374cdb30" Workload="ci--4081--3--6--f--c182586e87-k8s-coredns--668d6bf9bc--s4d7g-eth0" Sep 12 17:34:45.605247 containerd[1494]: 2025-09-12 17:34:45.595 [INFO][5580] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 17:34:45.605247 containerd[1494]: 2025-09-12 17:34:45.595 [INFO][5580] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 12 17:34:45.605247 containerd[1494]: 2025-09-12 17:34:45.600 [WARNING][5580] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="66452d647c7edebd5506ddc6e3bc28bf1104546ea2fa33230da48fad374cdb30" HandleID="k8s-pod-network.66452d647c7edebd5506ddc6e3bc28bf1104546ea2fa33230da48fad374cdb30" Workload="ci--4081--3--6--f--c182586e87-k8s-coredns--668d6bf9bc--s4d7g-eth0" Sep 12 17:34:45.605247 containerd[1494]: 2025-09-12 17:34:45.600 [INFO][5580] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="66452d647c7edebd5506ddc6e3bc28bf1104546ea2fa33230da48fad374cdb30" HandleID="k8s-pod-network.66452d647c7edebd5506ddc6e3bc28bf1104546ea2fa33230da48fad374cdb30" Workload="ci--4081--3--6--f--c182586e87-k8s-coredns--668d6bf9bc--s4d7g-eth0" Sep 12 17:34:45.605247 containerd[1494]: 2025-09-12 17:34:45.601 [INFO][5580] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 17:34:45.605247 containerd[1494]: 2025-09-12 17:34:45.603 [INFO][5573] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="66452d647c7edebd5506ddc6e3bc28bf1104546ea2fa33230da48fad374cdb30" Sep 12 17:34:45.605247 containerd[1494]: time="2025-09-12T17:34:45.605073749Z" level=info msg="TearDown network for sandbox \"66452d647c7edebd5506ddc6e3bc28bf1104546ea2fa33230da48fad374cdb30\" successfully" Sep 12 17:34:45.608206 containerd[1494]: time="2025-09-12T17:34:45.608074155Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"66452d647c7edebd5506ddc6e3bc28bf1104546ea2fa33230da48fad374cdb30\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Sep 12 17:34:45.608206 containerd[1494]: time="2025-09-12T17:34:45.608128296Z" level=info msg="RemovePodSandbox \"66452d647c7edebd5506ddc6e3bc28bf1104546ea2fa33230da48fad374cdb30\" returns successfully" Sep 12 17:34:45.608669 containerd[1494]: time="2025-09-12T17:34:45.608652711Z" level=info msg="StopPodSandbox for \"c5a7a51de0db103b1e8498594fbe9c552c1c5356287bf859d6c0f88b7d13cefa\"" Sep 12 17:34:45.660994 containerd[1494]: 2025-09-12 17:34:45.634 [WARNING][5595] cni-plugin/k8s.go 604: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="c5a7a51de0db103b1e8498594fbe9c552c1c5356287bf859d6c0f88b7d13cefa" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--f--c182586e87-k8s-coredns--668d6bf9bc--szrkp-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"cbad7146-1812-4262-a0a7-fad3ba169a40", ResourceVersion:"915", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 33, 49, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-f-c182586e87", ContainerID:"ea4dba399394ac14f99bd30ac548af6eaf19877ec2518480e8b0a89a7b6f3199", Pod:"coredns-668d6bf9bc-szrkp", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.34.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calif622695599e", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:34:45.660994 containerd[1494]: 2025-09-12 17:34:45.634 [INFO][5595] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="c5a7a51de0db103b1e8498594fbe9c552c1c5356287bf859d6c0f88b7d13cefa" Sep 12 17:34:45.660994 containerd[1494]: 2025-09-12 17:34:45.634 [INFO][5595] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="c5a7a51de0db103b1e8498594fbe9c552c1c5356287bf859d6c0f88b7d13cefa" iface="eth0" netns="" Sep 12 17:34:45.660994 containerd[1494]: 2025-09-12 17:34:45.634 [INFO][5595] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="c5a7a51de0db103b1e8498594fbe9c552c1c5356287bf859d6c0f88b7d13cefa" Sep 12 17:34:45.660994 containerd[1494]: 2025-09-12 17:34:45.634 [INFO][5595] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="c5a7a51de0db103b1e8498594fbe9c552c1c5356287bf859d6c0f88b7d13cefa" Sep 12 17:34:45.660994 containerd[1494]: 2025-09-12 17:34:45.650 [INFO][5602] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="c5a7a51de0db103b1e8498594fbe9c552c1c5356287bf859d6c0f88b7d13cefa" HandleID="k8s-pod-network.c5a7a51de0db103b1e8498594fbe9c552c1c5356287bf859d6c0f88b7d13cefa" Workload="ci--4081--3--6--f--c182586e87-k8s-coredns--668d6bf9bc--szrkp-eth0" Sep 12 17:34:45.660994 containerd[1494]: 2025-09-12 17:34:45.650 [INFO][5602] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 17:34:45.660994 containerd[1494]: 2025-09-12 17:34:45.651 [INFO][5602] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 12 17:34:45.660994 containerd[1494]: 2025-09-12 17:34:45.656 [WARNING][5602] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="c5a7a51de0db103b1e8498594fbe9c552c1c5356287bf859d6c0f88b7d13cefa" HandleID="k8s-pod-network.c5a7a51de0db103b1e8498594fbe9c552c1c5356287bf859d6c0f88b7d13cefa" Workload="ci--4081--3--6--f--c182586e87-k8s-coredns--668d6bf9bc--szrkp-eth0" Sep 12 17:34:45.660994 containerd[1494]: 2025-09-12 17:34:45.656 [INFO][5602] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="c5a7a51de0db103b1e8498594fbe9c552c1c5356287bf859d6c0f88b7d13cefa" HandleID="k8s-pod-network.c5a7a51de0db103b1e8498594fbe9c552c1c5356287bf859d6c0f88b7d13cefa" Workload="ci--4081--3--6--f--c182586e87-k8s-coredns--668d6bf9bc--szrkp-eth0" Sep 12 17:34:45.660994 containerd[1494]: 2025-09-12 17:34:45.658 [INFO][5602] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 17:34:45.660994 containerd[1494]: 2025-09-12 17:34:45.659 [INFO][5595] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="c5a7a51de0db103b1e8498594fbe9c552c1c5356287bf859d6c0f88b7d13cefa" Sep 12 17:34:45.663001 containerd[1494]: time="2025-09-12T17:34:45.661308999Z" level=info msg="TearDown network for sandbox \"c5a7a51de0db103b1e8498594fbe9c552c1c5356287bf859d6c0f88b7d13cefa\" successfully" Sep 12 17:34:45.663001 containerd[1494]: time="2025-09-12T17:34:45.661330269Z" level=info msg="StopPodSandbox for \"c5a7a51de0db103b1e8498594fbe9c552c1c5356287bf859d6c0f88b7d13cefa\" returns successfully" Sep 12 17:34:45.663001 containerd[1494]: time="2025-09-12T17:34:45.661607359Z" level=info msg="RemovePodSandbox for \"c5a7a51de0db103b1e8498594fbe9c552c1c5356287bf859d6c0f88b7d13cefa\"" Sep 12 17:34:45.663001 containerd[1494]: time="2025-09-12T17:34:45.661636554Z" level=info msg="Forcibly stopping sandbox \"c5a7a51de0db103b1e8498594fbe9c552c1c5356287bf859d6c0f88b7d13cefa\"" Sep 12 17:34:45.722372 containerd[1494]: 2025-09-12 17:34:45.695 [WARNING][5616] cni-plugin/k8s.go 604: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="c5a7a51de0db103b1e8498594fbe9c552c1c5356287bf859d6c0f88b7d13cefa" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--f--c182586e87-k8s-coredns--668d6bf9bc--szrkp-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"cbad7146-1812-4262-a0a7-fad3ba169a40", ResourceVersion:"915", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 33, 49, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-f-c182586e87", ContainerID:"ea4dba399394ac14f99bd30ac548af6eaf19877ec2518480e8b0a89a7b6f3199", Pod:"coredns-668d6bf9bc-szrkp", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.34.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calif622695599e", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:34:45.722372 containerd[1494]: 2025-09-12 17:34:45.695 [INFO][5616] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="c5a7a51de0db103b1e8498594fbe9c552c1c5356287bf859d6c0f88b7d13cefa" Sep 12 17:34:45.722372 containerd[1494]: 2025-09-12 17:34:45.696 [INFO][5616] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="c5a7a51de0db103b1e8498594fbe9c552c1c5356287bf859d6c0f88b7d13cefa" iface="eth0" netns="" Sep 12 17:34:45.722372 containerd[1494]: 2025-09-12 17:34:45.696 [INFO][5616] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="c5a7a51de0db103b1e8498594fbe9c552c1c5356287bf859d6c0f88b7d13cefa" Sep 12 17:34:45.722372 containerd[1494]: 2025-09-12 17:34:45.696 [INFO][5616] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="c5a7a51de0db103b1e8498594fbe9c552c1c5356287bf859d6c0f88b7d13cefa" Sep 12 17:34:45.722372 containerd[1494]: 2025-09-12 17:34:45.713 [INFO][5623] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="c5a7a51de0db103b1e8498594fbe9c552c1c5356287bf859d6c0f88b7d13cefa" HandleID="k8s-pod-network.c5a7a51de0db103b1e8498594fbe9c552c1c5356287bf859d6c0f88b7d13cefa" Workload="ci--4081--3--6--f--c182586e87-k8s-coredns--668d6bf9bc--szrkp-eth0" Sep 12 17:34:45.722372 containerd[1494]: 2025-09-12 17:34:45.713 [INFO][5623] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 17:34:45.722372 containerd[1494]: 2025-09-12 17:34:45.713 [INFO][5623] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 12 17:34:45.722372 containerd[1494]: 2025-09-12 17:34:45.717 [WARNING][5623] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="c5a7a51de0db103b1e8498594fbe9c552c1c5356287bf859d6c0f88b7d13cefa" HandleID="k8s-pod-network.c5a7a51de0db103b1e8498594fbe9c552c1c5356287bf859d6c0f88b7d13cefa" Workload="ci--4081--3--6--f--c182586e87-k8s-coredns--668d6bf9bc--szrkp-eth0" Sep 12 17:34:45.722372 containerd[1494]: 2025-09-12 17:34:45.718 [INFO][5623] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="c5a7a51de0db103b1e8498594fbe9c552c1c5356287bf859d6c0f88b7d13cefa" HandleID="k8s-pod-network.c5a7a51de0db103b1e8498594fbe9c552c1c5356287bf859d6c0f88b7d13cefa" Workload="ci--4081--3--6--f--c182586e87-k8s-coredns--668d6bf9bc--szrkp-eth0" Sep 12 17:34:45.722372 containerd[1494]: 2025-09-12 17:34:45.719 [INFO][5623] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 17:34:45.722372 containerd[1494]: 2025-09-12 17:34:45.720 [INFO][5616] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="c5a7a51de0db103b1e8498594fbe9c552c1c5356287bf859d6c0f88b7d13cefa" Sep 12 17:34:45.724266 containerd[1494]: time="2025-09-12T17:34:45.722759944Z" level=info msg="TearDown network for sandbox \"c5a7a51de0db103b1e8498594fbe9c552c1c5356287bf859d6c0f88b7d13cefa\" successfully" Sep 12 17:34:45.744762 containerd[1494]: time="2025-09-12T17:34:45.744718051Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"c5a7a51de0db103b1e8498594fbe9c552c1c5356287bf859d6c0f88b7d13cefa\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Sep 12 17:34:45.744855 containerd[1494]: time="2025-09-12T17:34:45.744789024Z" level=info msg="RemovePodSandbox \"c5a7a51de0db103b1e8498594fbe9c552c1c5356287bf859d6c0f88b7d13cefa\" returns successfully" Sep 12 17:34:45.745207 containerd[1494]: time="2025-09-12T17:34:45.745185098Z" level=info msg="StopPodSandbox for \"f5db53b3536194d1bf8beb85b1555599a7f618239326baeb59bc23059b5d3697\"" Sep 12 17:34:45.806911 containerd[1494]: 2025-09-12 17:34:45.774 [WARNING][5637] cni-plugin/k8s.go 604: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="f5db53b3536194d1bf8beb85b1555599a7f618239326baeb59bc23059b5d3697" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--f--c182586e87-k8s-calico--apiserver--6dd46897bd--htdld-eth0", GenerateName:"calico-apiserver-6dd46897bd-", Namespace:"calico-apiserver", SelfLink:"", UID:"0d943c3b-957f-40f9-b21f-ac657206aace", ResourceVersion:"998", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 33, 58, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6dd46897bd", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-f-c182586e87", ContainerID:"de5c2b268ceea309c4a25537a0bf220e73a72b8c18816adefa558cf960204d33", Pod:"calico-apiserver-6dd46897bd-htdld", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.34.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calib96ad64bb2c", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:34:45.806911 containerd[1494]: 2025-09-12 17:34:45.774 [INFO][5637] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="f5db53b3536194d1bf8beb85b1555599a7f618239326baeb59bc23059b5d3697" Sep 12 17:34:45.806911 containerd[1494]: 2025-09-12 17:34:45.775 [INFO][5637] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="f5db53b3536194d1bf8beb85b1555599a7f618239326baeb59bc23059b5d3697" iface="eth0" netns="" Sep 12 17:34:45.806911 containerd[1494]: 2025-09-12 17:34:45.776 [INFO][5637] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="f5db53b3536194d1bf8beb85b1555599a7f618239326baeb59bc23059b5d3697" Sep 12 17:34:45.806911 containerd[1494]: 2025-09-12 17:34:45.776 [INFO][5637] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="f5db53b3536194d1bf8beb85b1555599a7f618239326baeb59bc23059b5d3697" Sep 12 17:34:45.806911 containerd[1494]: 2025-09-12 17:34:45.796 [INFO][5644] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="f5db53b3536194d1bf8beb85b1555599a7f618239326baeb59bc23059b5d3697" HandleID="k8s-pod-network.f5db53b3536194d1bf8beb85b1555599a7f618239326baeb59bc23059b5d3697" Workload="ci--4081--3--6--f--c182586e87-k8s-calico--apiserver--6dd46897bd--htdld-eth0" Sep 12 17:34:45.806911 containerd[1494]: 2025-09-12 17:34:45.796 [INFO][5644] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 17:34:45.806911 containerd[1494]: 2025-09-12 17:34:45.796 [INFO][5644] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 12 17:34:45.806911 containerd[1494]: 2025-09-12 17:34:45.801 [WARNING][5644] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="f5db53b3536194d1bf8beb85b1555599a7f618239326baeb59bc23059b5d3697" HandleID="k8s-pod-network.f5db53b3536194d1bf8beb85b1555599a7f618239326baeb59bc23059b5d3697" Workload="ci--4081--3--6--f--c182586e87-k8s-calico--apiserver--6dd46897bd--htdld-eth0" Sep 12 17:34:45.806911 containerd[1494]: 2025-09-12 17:34:45.801 [INFO][5644] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="f5db53b3536194d1bf8beb85b1555599a7f618239326baeb59bc23059b5d3697" HandleID="k8s-pod-network.f5db53b3536194d1bf8beb85b1555599a7f618239326baeb59bc23059b5d3697" Workload="ci--4081--3--6--f--c182586e87-k8s-calico--apiserver--6dd46897bd--htdld-eth0" Sep 12 17:34:45.806911 containerd[1494]: 2025-09-12 17:34:45.803 [INFO][5644] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 17:34:45.806911 containerd[1494]: 2025-09-12 17:34:45.804 [INFO][5637] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="f5db53b3536194d1bf8beb85b1555599a7f618239326baeb59bc23059b5d3697" Sep 12 17:34:45.806911 containerd[1494]: time="2025-09-12T17:34:45.806797015Z" level=info msg="TearDown network for sandbox \"f5db53b3536194d1bf8beb85b1555599a7f618239326baeb59bc23059b5d3697\" successfully" Sep 12 17:34:45.806911 containerd[1494]: time="2025-09-12T17:34:45.806818255Z" level=info msg="StopPodSandbox for \"f5db53b3536194d1bf8beb85b1555599a7f618239326baeb59bc23059b5d3697\" returns successfully" Sep 12 17:34:45.807793 containerd[1494]: time="2025-09-12T17:34:45.807766285Z" level=info msg="RemovePodSandbox for \"f5db53b3536194d1bf8beb85b1555599a7f618239326baeb59bc23059b5d3697\"" Sep 12 17:34:45.807845 containerd[1494]: time="2025-09-12T17:34:45.807795179Z" level=info msg="Forcibly stopping sandbox \"f5db53b3536194d1bf8beb85b1555599a7f618239326baeb59bc23059b5d3697\"" Sep 12 17:34:45.871161 containerd[1494]: 2025-09-12 17:34:45.839 [WARNING][5658] cni-plugin/k8s.go 604: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="f5db53b3536194d1bf8beb85b1555599a7f618239326baeb59bc23059b5d3697" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--f--c182586e87-k8s-calico--apiserver--6dd46897bd--htdld-eth0", GenerateName:"calico-apiserver-6dd46897bd-", Namespace:"calico-apiserver", SelfLink:"", UID:"0d943c3b-957f-40f9-b21f-ac657206aace", ResourceVersion:"998", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 33, 58, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6dd46897bd", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-f-c182586e87", ContainerID:"de5c2b268ceea309c4a25537a0bf220e73a72b8c18816adefa558cf960204d33", Pod:"calico-apiserver-6dd46897bd-htdld", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.34.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calib96ad64bb2c", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:34:45.871161 containerd[1494]: 2025-09-12 17:34:45.840 [INFO][5658] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="f5db53b3536194d1bf8beb85b1555599a7f618239326baeb59bc23059b5d3697" Sep 12 17:34:45.871161 containerd[1494]: 2025-09-12 17:34:45.840 [INFO][5658] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="f5db53b3536194d1bf8beb85b1555599a7f618239326baeb59bc23059b5d3697" iface="eth0" netns="" Sep 12 17:34:45.871161 containerd[1494]: 2025-09-12 17:34:45.840 [INFO][5658] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="f5db53b3536194d1bf8beb85b1555599a7f618239326baeb59bc23059b5d3697" Sep 12 17:34:45.871161 containerd[1494]: 2025-09-12 17:34:45.840 [INFO][5658] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="f5db53b3536194d1bf8beb85b1555599a7f618239326baeb59bc23059b5d3697" Sep 12 17:34:45.871161 containerd[1494]: 2025-09-12 17:34:45.860 [INFO][5665] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="f5db53b3536194d1bf8beb85b1555599a7f618239326baeb59bc23059b5d3697" HandleID="k8s-pod-network.f5db53b3536194d1bf8beb85b1555599a7f618239326baeb59bc23059b5d3697" Workload="ci--4081--3--6--f--c182586e87-k8s-calico--apiserver--6dd46897bd--htdld-eth0" Sep 12 17:34:45.871161 containerd[1494]: 2025-09-12 17:34:45.860 [INFO][5665] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 17:34:45.871161 containerd[1494]: 2025-09-12 17:34:45.861 [INFO][5665] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 12 17:34:45.871161 containerd[1494]: 2025-09-12 17:34:45.865 [WARNING][5665] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="f5db53b3536194d1bf8beb85b1555599a7f618239326baeb59bc23059b5d3697" HandleID="k8s-pod-network.f5db53b3536194d1bf8beb85b1555599a7f618239326baeb59bc23059b5d3697" Workload="ci--4081--3--6--f--c182586e87-k8s-calico--apiserver--6dd46897bd--htdld-eth0" Sep 12 17:34:45.871161 containerd[1494]: 2025-09-12 17:34:45.866 [INFO][5665] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="f5db53b3536194d1bf8beb85b1555599a7f618239326baeb59bc23059b5d3697" HandleID="k8s-pod-network.f5db53b3536194d1bf8beb85b1555599a7f618239326baeb59bc23059b5d3697" Workload="ci--4081--3--6--f--c182586e87-k8s-calico--apiserver--6dd46897bd--htdld-eth0" Sep 12 17:34:45.871161 containerd[1494]: 2025-09-12 17:34:45.867 [INFO][5665] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 17:34:45.871161 containerd[1494]: 2025-09-12 17:34:45.869 [INFO][5658] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="f5db53b3536194d1bf8beb85b1555599a7f618239326baeb59bc23059b5d3697" Sep 12 17:34:45.872156 containerd[1494]: time="2025-09-12T17:34:45.871189794Z" level=info msg="TearDown network for sandbox \"f5db53b3536194d1bf8beb85b1555599a7f618239326baeb59bc23059b5d3697\" successfully" Sep 12 17:34:45.876237 containerd[1494]: time="2025-09-12T17:34:45.876141253Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"f5db53b3536194d1bf8beb85b1555599a7f618239326baeb59bc23059b5d3697\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Sep 12 17:34:45.876320 containerd[1494]: time="2025-09-12T17:34:45.876192970Z" level=info msg="RemovePodSandbox \"f5db53b3536194d1bf8beb85b1555599a7f618239326baeb59bc23059b5d3697\" returns successfully" Sep 12 17:34:57.071675 systemd[1]: run-containerd-runc-k8s.io-c069596cb3f11b2bc0a19bee2d67332733dfb32070b9427552ea3c39246e5e10-runc.D0fCii.mount: Deactivated successfully. Sep 12 17:35:21.361965 systemd[1]: run-containerd-runc-k8s.io-286a299eacff7870608a7cd93d1c8ab356a3c3014097a853184039ec28e19e94-runc.x3RvG6.mount: Deactivated successfully. Sep 12 17:35:24.295656 systemd[1]: Started sshd@7-135.181.96.215:22-147.75.109.163:58990.service - OpenSSH per-connection server daemon (147.75.109.163:58990). Sep 12 17:35:25.347258 sshd[5818]: Accepted publickey for core from 147.75.109.163 port 58990 ssh2: RSA SHA256:0lLQDI1exHD7h3/xNDVcYYaT5o4Ke3hnBgY7BErDCBA Sep 12 17:35:25.350491 sshd[5818]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 17:35:25.358305 systemd-logind[1481]: New session 8 of user core. Sep 12 17:35:25.364379 systemd[1]: Started session-8.scope - Session 8 of User core. Sep 12 17:35:26.541686 sshd[5818]: pam_unix(sshd:session): session closed for user core Sep 12 17:35:26.548366 systemd-logind[1481]: Session 8 logged out. Waiting for processes to exit. Sep 12 17:35:26.549179 systemd[1]: sshd@7-135.181.96.215:22-147.75.109.163:58990.service: Deactivated successfully. Sep 12 17:35:26.551196 systemd[1]: session-8.scope: Deactivated successfully. Sep 12 17:35:26.552626 systemd-logind[1481]: Removed session 8. Sep 12 17:35:31.721627 systemd[1]: Started sshd@8-135.181.96.215:22-147.75.109.163:52350.service - OpenSSH per-connection server daemon (147.75.109.163:52350). 
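The two passes above, StopPodSandbox followed by the forced RemovePodSandbox, tear down the same sandbox, so the second IPAM release finds nothing to free ("Asked to release address but it doesn't exist. Ignoring") and the WEP guard refuses to delete an endpoint whose recorded ContainerID (de5c2b26...) no longer matches the sandbox being removed (f5db53b3...). A minimal sketch of that idempotency guard follows; the types and helpers are hypothetical stand-ins, not Calico's real API:

// Package cni sketches the idempotent teardown visible in the Calico CNI
// records above. Hypothetical stand-ins, not Calico's internals.
package cni

import "errors"

// ErrNotFound stands in for IPAM's "allocation does not exist" result.
var ErrNotFound = errors.New("address not found")

// WorkloadEndpoint carries only the field the guard needs: the container
// that currently owns the endpoint.
type WorkloadEndpoint struct {
	ContainerID string
}

// Teardown releases the IP allocation for cniContainerID and decides
// whether the WorkloadEndpoint itself may be deleted.
func Teardown(cniContainerID string, wep *WorkloadEndpoint, releaseIP func(handleID string) error) (deleteWEP bool, err error) {
	// Release under the handle "k8s-pod-network.<containerID>". A missing
	// allocation is ignored, which is why the second DEL above logs the
	// "doesn't exist. Ignoring" warning yet still reports
	// "Teardown processing complete".
	if err := releaseIP("k8s-pod-network." + cniContainerID); err != nil && !errors.Is(err, ErrNotFound) {
		return false, err
	}
	// If the endpoint now records a different (newer) container, as in
	// "CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't
	// delete WEP", leave the endpoint for its current owner.
	if wep != nil && wep.ContainerID != cniContainerID {
		return false, nil
	}
	return true, nil
}

Treating a missing allocation as success is what lets the repeated DEL end in "RemovePodSandbox ... returns successfully" instead of failing the kubelet's sandbox garbage collection.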
Sep 12 17:35:32.746256 sshd[5854]: Accepted publickey for core from 147.75.109.163 port 52350 ssh2: RSA SHA256:0lLQDI1exHD7h3/xNDVcYYaT5o4Ke3hnBgY7BErDCBA
Sep 12 17:35:32.747537 sshd[5854]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 12 17:35:32.752233 systemd-logind[1481]: New session 9 of user core.
Sep 12 17:35:32.758370 systemd[1]: Started session-9.scope - Session 9 of User core.
Sep 12 17:35:33.579114 sshd[5854]: pam_unix(sshd:session): session closed for user core
Sep 12 17:35:33.586443 systemd[1]: sshd@8-135.181.96.215:22-147.75.109.163:52350.service: Deactivated successfully.
Sep 12 17:35:33.587992 systemd[1]: session-9.scope: Deactivated successfully.
Sep 12 17:35:33.590097 systemd-logind[1481]: Session 9 logged out. Waiting for processes to exit.
Sep 12 17:35:33.591336 systemd-logind[1481]: Removed session 9.
Sep 12 17:35:37.644537 systemd[1]: run-containerd-runc-k8s.io-21bb87a3fda7d5325fac931727e1c42d0390ec9032f0d28c1c0fb5313199a735-runc.1NHmMm.mount: Deactivated successfully.
Sep 12 17:35:38.748725 systemd[1]: Started sshd@9-135.181.96.215:22-147.75.109.163:52352.service - OpenSSH per-connection server daemon (147.75.109.163:52352).
Sep 12 17:35:39.760376 sshd[5886]: Accepted publickey for core from 147.75.109.163 port 52352 ssh2: RSA SHA256:0lLQDI1exHD7h3/xNDVcYYaT5o4Ke3hnBgY7BErDCBA
Sep 12 17:35:39.763412 sshd[5886]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 12 17:35:39.771666 systemd-logind[1481]: New session 10 of user core.
Sep 12 17:35:39.778473 systemd[1]: Started session-10.scope - Session 10 of User core.
Sep 12 17:35:40.602759 sshd[5886]: pam_unix(sshd:session): session closed for user core
Sep 12 17:35:40.605815 systemd-logind[1481]: Session 10 logged out. Waiting for processes to exit.
Sep 12 17:35:40.606924 systemd[1]: sshd@9-135.181.96.215:22-147.75.109.163:52352.service: Deactivated successfully.
Sep 12 17:35:40.609610 systemd[1]: session-10.scope: Deactivated successfully.
Sep 12 17:35:40.612181 systemd-logind[1481]: Removed session 10.
Sep 12 17:35:40.780299 systemd[1]: Started sshd@10-135.181.96.215:22-147.75.109.163:37842.service - OpenSSH per-connection server daemon (147.75.109.163:37842).
Sep 12 17:35:41.757442 sshd[5904]: Accepted publickey for core from 147.75.109.163 port 37842 ssh2: RSA SHA256:0lLQDI1exHD7h3/xNDVcYYaT5o4Ke3hnBgY7BErDCBA
Sep 12 17:35:41.759570 sshd[5904]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 12 17:35:41.766386 systemd-logind[1481]: New session 11 of user core.
Sep 12 17:35:41.770549 systemd[1]: Started session-11.scope - Session 11 of User core.
Sep 12 17:35:42.614365 sshd[5904]: pam_unix(sshd:session): session closed for user core
Sep 12 17:35:42.622826 systemd[1]: sshd@10-135.181.96.215:22-147.75.109.163:37842.service: Deactivated successfully.
Sep 12 17:35:42.626211 systemd[1]: session-11.scope: Deactivated successfully.
Sep 12 17:35:42.627363 systemd-logind[1481]: Session 11 logged out. Waiting for processes to exit.
Sep 12 17:35:42.628451 systemd-logind[1481]: Removed session 11.
Sep 12 17:35:42.777821 systemd[1]: Started sshd@11-135.181.96.215:22-147.75.109.163:37854.service - OpenSSH per-connection server daemon (147.75.109.163:37854).
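Each "Accepted publickey" record above identifies the client key only by its SHA256 fingerprint. The same "SHA256:..." string can be recomputed from an authorized_keys entry; a small sketch using golang.org/x/crypto/ssh, where the file argument is a hypothetical input holding one key line:

package main

import (
	"fmt"
	"log"
	"os"

	"golang.org/x/crypto/ssh"
)

func main() {
	if len(os.Args) != 2 {
		log.Fatal("usage: fingerprint <file with one authorized_keys line>")
	}
	data, err := os.ReadFile(os.Args[1])
	if err != nil {
		log.Fatal(err)
	}
	pub, comment, _, _, err := ssh.ParseAuthorizedKey(data)
	if err != nil {
		log.Fatal(err)
	}
	// FingerprintSHA256 yields the same "SHA256:<base64>" form that the
	// sshd "Accepted publickey" records carry.
	fmt.Println(pub.Type(), ssh.FingerprintSHA256(pub), comment)
}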
Sep 12 17:35:43.797779 sshd[5924]: Accepted publickey for core from 147.75.109.163 port 37854 ssh2: RSA SHA256:0lLQDI1exHD7h3/xNDVcYYaT5o4Ke3hnBgY7BErDCBA
Sep 12 17:35:43.799510 sshd[5924]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 12 17:35:43.803249 systemd-logind[1481]: New session 12 of user core.
Sep 12 17:35:43.813340 systemd[1]: Started session-12.scope - Session 12 of User core.
Sep 12 17:35:44.551196 sshd[5924]: pam_unix(sshd:session): session closed for user core
Sep 12 17:35:44.554960 systemd[1]: sshd@11-135.181.96.215:22-147.75.109.163:37854.service: Deactivated successfully.
Sep 12 17:35:44.556656 systemd[1]: session-12.scope: Deactivated successfully.
Sep 12 17:35:44.558015 systemd-logind[1481]: Session 12 logged out. Waiting for processes to exit.
Sep 12 17:35:44.559180 systemd-logind[1481]: Removed session 12.
Sep 12 17:35:49.724734 systemd[1]: Started sshd@12-135.181.96.215:22-147.75.109.163:37862.service - OpenSSH per-connection server daemon (147.75.109.163:37862).
Sep 12 17:35:50.736703 sshd[5941]: Accepted publickey for core from 147.75.109.163 port 37862 ssh2: RSA SHA256:0lLQDI1exHD7h3/xNDVcYYaT5o4Ke3hnBgY7BErDCBA
Sep 12 17:35:50.739393 sshd[5941]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 12 17:35:50.744455 systemd-logind[1481]: New session 13 of user core.
Sep 12 17:35:50.751358 systemd[1]: Started session-13.scope - Session 13 of User core.
Sep 12 17:35:51.578678 sshd[5941]: pam_unix(sshd:session): session closed for user core
Sep 12 17:35:51.586417 systemd[1]: sshd@12-135.181.96.215:22-147.75.109.163:37862.service: Deactivated successfully.
Sep 12 17:35:51.588316 systemd[1]: session-13.scope: Deactivated successfully.
Sep 12 17:35:51.590622 systemd-logind[1481]: Session 13 logged out. Waiting for processes to exit.
Sep 12 17:35:51.592086 systemd-logind[1481]: Removed session 13.
Sep 12 17:35:51.738545 systemd[1]: Started sshd@13-135.181.96.215:22-147.75.109.163:40682.service - OpenSSH per-connection server daemon (147.75.109.163:40682).
Sep 12 17:35:52.771640 sshd[5992]: Accepted publickey for core from 147.75.109.163 port 40682 ssh2: RSA SHA256:0lLQDI1exHD7h3/xNDVcYYaT5o4Ke3hnBgY7BErDCBA
Sep 12 17:35:52.773499 sshd[5992]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 12 17:35:52.777395 systemd-logind[1481]: New session 14 of user core.
Sep 12 17:35:52.781399 systemd[1]: Started session-14.scope - Session 14 of User core.
Sep 12 17:35:53.775786 sshd[5992]: pam_unix(sshd:session): session closed for user core
Sep 12 17:35:53.785393 systemd[1]: sshd@13-135.181.96.215:22-147.75.109.163:40682.service: Deactivated successfully.
Sep 12 17:35:53.786931 systemd[1]: session-14.scope: Deactivated successfully.
Sep 12 17:35:53.788261 systemd-logind[1481]: Session 14 logged out. Waiting for processes to exit.
Sep 12 17:35:53.789362 systemd-logind[1481]: Removed session 14.
Sep 12 17:35:53.942681 systemd[1]: Started sshd@14-135.181.96.215:22-147.75.109.163:40684.service - OpenSSH per-connection server daemon (147.75.109.163:40684).
Sep 12 17:35:54.939698 sshd[6003]: Accepted publickey for core from 147.75.109.163 port 40684 ssh2: RSA SHA256:0lLQDI1exHD7h3/xNDVcYYaT5o4Ke3hnBgY7BErDCBA
Sep 12 17:35:54.941033 sshd[6003]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 12 17:35:54.946103 systemd-logind[1481]: New session 15 of user core.
Sep 12 17:35:54.949366 systemd[1]: Started session-15.scope - Session 15 of User core.
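The logind records pair strictly: "New session N of user core." when the key is accepted, "Removed session N." once the scope is gone, so session lifetimes fall out of a dump like this one directly. A rough sketch, assuming one record per line on stdin and the journal short timestamp format shown above (which omits the year, so only durations are meaningful):

package main

import (
	"bufio"
	"fmt"
	"os"
	"regexp"
	"time"
)

// Pairs systemd-logind "New session N" / "Removed session N" records and
// prints each session's duration.
var (
	tsLayout = "Jan _2 15:04:05.000000"
	newRe    = regexp.MustCompile(`^(\w+ +\d+ [\d:.]+) .*New session (\d+) of user (\S+)\.`)
	delRe    = regexp.MustCompile(`^(\w+ +\d+ [\d:.]+) .*Removed session (\d+)\.`)
)

func main() {
	opened := map[string]time.Time{}
	sc := bufio.NewScanner(os.Stdin)
	for sc.Scan() {
		line := sc.Text()
		if m := newRe.FindStringSubmatch(line); m != nil {
			t, _ := time.Parse(tsLayout, m[1])
			opened[m[2]] = t
		} else if m := delRe.FindStringSubmatch(line); m != nil {
			if start, ok := opened[m[2]]; ok {
				t, _ := time.Parse(tsLayout, m[1])
				fmt.Printf("session %s: %s\n", m[2], t.Sub(start))
			}
		}
	}
}

Run against the records above, sessions 8 through 15 each come out at roughly one to two seconds, consistent with a CI harness probing the host rather than interactive logins.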
Sep 12 17:35:56.387850 sshd[6003]: pam_unix(sshd:session): session closed for user core
Sep 12 17:35:56.398472 systemd[1]: sshd@14-135.181.96.215:22-147.75.109.163:40684.service: Deactivated successfully.
Sep 12 17:35:56.401110 systemd[1]: session-15.scope: Deactivated successfully.
Sep 12 17:35:56.403065 systemd-logind[1481]: Session 15 logged out. Waiting for processes to exit.
Sep 12 17:35:56.405790 systemd-logind[1481]: Removed session 15.
Sep 12 17:35:56.547510 systemd[1]: Started sshd@15-135.181.96.215:22-147.75.109.163:40700.service - OpenSSH per-connection server daemon (147.75.109.163:40700).
Sep 12 17:35:57.560199 sshd[6023]: Accepted publickey for core from 147.75.109.163 port 40700 ssh2: RSA SHA256:0lLQDI1exHD7h3/xNDVcYYaT5o4Ke3hnBgY7BErDCBA
Sep 12 17:35:57.563948 sshd[6023]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 12 17:35:57.569717 systemd-logind[1481]: New session 16 of user core.
Sep 12 17:35:57.575508 systemd[1]: Started session-16.scope - Session 16 of User core.
Sep 12 17:35:58.900127 sshd[6023]: pam_unix(sshd:session): session closed for user core
Sep 12 17:35:58.905839 systemd[1]: sshd@15-135.181.96.215:22-147.75.109.163:40700.service: Deactivated successfully.
Sep 12 17:35:58.906426 systemd-logind[1481]: Session 16 logged out. Waiting for processes to exit.
Sep 12 17:35:58.909487 systemd[1]: session-16.scope: Deactivated successfully.
Sep 12 17:35:58.911839 systemd-logind[1481]: Removed session 16.
Sep 12 17:35:59.074594 systemd[1]: Started sshd@16-135.181.96.215:22-147.75.109.163:40710.service - OpenSSH per-connection server daemon (147.75.109.163:40710).
Sep 12 17:36:00.098887 sshd[6063]: Accepted publickey for core from 147.75.109.163 port 40710 ssh2: RSA SHA256:0lLQDI1exHD7h3/xNDVcYYaT5o4Ke3hnBgY7BErDCBA
Sep 12 17:36:00.101072 sshd[6063]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 12 17:36:00.109064 systemd-logind[1481]: New session 17 of user core.
Sep 12 17:36:00.113183 systemd[1]: Started session-17.scope - Session 17 of User core.
Sep 12 17:36:01.124633 sshd[6063]: pam_unix(sshd:session): session closed for user core
Sep 12 17:36:01.129466 systemd[1]: sshd@16-135.181.96.215:22-147.75.109.163:40710.service: Deactivated successfully.
Sep 12 17:36:01.131805 systemd[1]: session-17.scope: Deactivated successfully.
Sep 12 17:36:01.133722 systemd-logind[1481]: Session 17 logged out. Waiting for processes to exit.
Sep 12 17:36:01.134898 systemd-logind[1481]: Removed session 17.
Sep 12 17:36:06.332303 systemd[1]: Started sshd@17-135.181.96.215:22-147.75.109.163:42768.service - OpenSSH per-connection server daemon (147.75.109.163:42768).
Sep 12 17:36:07.467606 sshd[6098]: Accepted publickey for core from 147.75.109.163 port 42768 ssh2: RSA SHA256:0lLQDI1exHD7h3/xNDVcYYaT5o4Ke3hnBgY7BErDCBA
Sep 12 17:36:07.469645 sshd[6098]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 12 17:36:07.476812 systemd-logind[1481]: New session 18 of user core.
Sep 12 17:36:07.480328 systemd[1]: Started session-18.scope - Session 18 of User core.
Sep 12 17:36:07.657608 systemd[1]: run-containerd-runc-k8s.io-21bb87a3fda7d5325fac931727e1c42d0390ec9032f0d28c1c0fb5313199a735-runc.QqY2ac.mount: Deactivated successfully.
Sep 12 17:36:08.694601 sshd[6098]: pam_unix(sshd:session): session closed for user core
Sep 12 17:36:08.698404 systemd-logind[1481]: Session 18 logged out. Waiting for processes to exit.
Sep 12 17:36:08.700151 systemd[1]: sshd@17-135.181.96.215:22-147.75.109.163:42768.service: Deactivated successfully.
Sep 12 17:36:08.702819 systemd[1]: session-18.scope: Deactivated successfully.
Sep 12 17:36:08.704104 systemd-logind[1481]: Removed session 18.
Sep 12 17:36:13.893546 systemd[1]: Started sshd@18-135.181.96.215:22-147.75.109.163:38522.service - OpenSSH per-connection server daemon (147.75.109.163:38522).
Sep 12 17:36:15.005889 sshd[6129]: Accepted publickey for core from 147.75.109.163 port 38522 ssh2: RSA SHA256:0lLQDI1exHD7h3/xNDVcYYaT5o4Ke3hnBgY7BErDCBA
Sep 12 17:36:15.005992 sshd[6129]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 12 17:36:15.010860 systemd-logind[1481]: New session 19 of user core.
Sep 12 17:36:15.018358 systemd[1]: Started session-19.scope - Session 19 of User core.
Sep 12 17:36:16.016584 sshd[6129]: pam_unix(sshd:session): session closed for user core
Sep 12 17:36:16.023868 systemd-logind[1481]: Session 19 logged out. Waiting for processes to exit.
Sep 12 17:36:16.028460 systemd[1]: sshd@18-135.181.96.215:22-147.75.109.163:38522.service: Deactivated successfully.
Sep 12 17:36:16.031889 systemd[1]: session-19.scope: Deactivated successfully.
Sep 12 17:36:16.037281 systemd-logind[1481]: Removed session 19.
Sep 12 17:36:32.287397 systemd[1]: cri-containerd-0c9078338a68164d9c44a82df350b50f518ef7b10c6c0343e95d9dbfda6139b3.scope: Deactivated successfully.
Sep 12 17:36:32.287700 systemd[1]: cri-containerd-0c9078338a68164d9c44a82df350b50f518ef7b10c6c0343e95d9dbfda6139b3.scope: Consumed 1.913s CPU time, 24.0M memory peak, 0B memory swap peak.
Sep 12 17:36:32.330190 kubelet[2561]: E0912 17:36:32.323732 2561 controller.go:195] "Failed to update lease" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.0.3:39484->10.0.0.2:2379: read: connection timed out"
Sep 12 17:36:32.464902 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-0c9078338a68164d9c44a82df350b50f518ef7b10c6c0343e95d9dbfda6139b3-rootfs.mount: Deactivated successfully.
Sep 12 17:36:32.550457 systemd[1]: cri-containerd-a1917a782fb112f95cf697c3a66d2973a99473ae138b1910d6043562a5153aff.scope: Deactivated successfully.
Sep 12 17:36:32.550663 systemd[1]: cri-containerd-a1917a782fb112f95cf697c3a66d2973a99473ae138b1910d6043562a5153aff.scope: Consumed 15.269s CPU time.
Sep 12 17:36:32.569825 containerd[1494]: time="2025-09-12T17:36:32.495725427Z" level=info msg="shim disconnected" id=0c9078338a68164d9c44a82df350b50f518ef7b10c6c0343e95d9dbfda6139b3 namespace=k8s.io
Sep 12 17:36:32.570961 containerd[1494]: time="2025-09-12T17:36:32.569824239Z" level=warning msg="cleaning up after shim disconnected" id=0c9078338a68164d9c44a82df350b50f518ef7b10c6c0343e95d9dbfda6139b3 namespace=k8s.io
Sep 12 17:36:32.570961 containerd[1494]: time="2025-09-12T17:36:32.569839187Z" level=info msg="cleaning up dead shim" namespace=k8s.io
Sep 12 17:36:32.592380 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-a1917a782fb112f95cf697c3a66d2973a99473ae138b1910d6043562a5153aff-rootfs.mount: Deactivated successfully.
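Each cri-containerd-*.scope deactivation above carries the cgroup's final accounting (CPU time consumed, memory peak), and containerd then reaps the exited shim ("shim disconnected" ... "cleaning up dead shim"). The same task exits can be watched through containerd's event stream; a minimal sketch against the containerd Go client, assuming the k8s.io namespace seen in the shim records, with the filter expression being an assumption about containerd's event-filter syntax:

package main

import (
	"context"
	"log"

	containerd "github.com/containerd/containerd"
	"github.com/containerd/containerd/namespaces"
)

func main() {
	// Connect to the same containerd instance the kubelet uses.
	client, err := containerd.New("/run/containerd/containerd.sock")
	if err != nil {
		log.Fatal(err)
	}
	defer client.Close()

	// CRI pods and containers live in the "k8s.io" namespace, matching
	// the namespace=k8s.io fields in the shim records above.
	ctx := namespaces.WithNamespace(context.Background(), "k8s.io")

	// Subscribe to task-exit events only.
	events, errs := client.Subscribe(ctx, `topic=="/tasks/exit"`)
	for {
		select {
		case env := <-events:
			log.Printf("%s %s %s", env.Timestamp, env.Namespace, env.Topic)
		case err := <-errs:
			log.Fatal(err)
		}
	}
}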
Sep 12 17:36:32.594442 containerd[1494]: time="2025-09-12T17:36:32.593617083Z" level=info msg="shim disconnected" id=a1917a782fb112f95cf697c3a66d2973a99473ae138b1910d6043562a5153aff namespace=k8s.io
Sep 12 17:36:32.594442 containerd[1494]: time="2025-09-12T17:36:32.593659061Z" level=warning msg="cleaning up after shim disconnected" id=a1917a782fb112f95cf697c3a66d2973a99473ae138b1910d6043562a5153aff namespace=k8s.io
Sep 12 17:36:32.594442 containerd[1494]: time="2025-09-12T17:36:32.593665734Z" level=info msg="cleaning up dead shim" namespace=k8s.io
Sep 12 17:36:32.625274 systemd[1]: cri-containerd-8caa9e73f6120948e5ba5c5ebb6443ea4f2c29ac056432d78f4ac11f856f9e2b.scope: Deactivated successfully.
Sep 12 17:36:32.625688 systemd[1]: cri-containerd-8caa9e73f6120948e5ba5c5ebb6443ea4f2c29ac056432d78f4ac11f856f9e2b.scope: Consumed 3.352s CPU time, 24.8M memory peak, 0B memory swap peak.
Sep 12 17:36:32.670877 containerd[1494]: time="2025-09-12T17:36:32.670812811Z" level=info msg="shim disconnected" id=8caa9e73f6120948e5ba5c5ebb6443ea4f2c29ac056432d78f4ac11f856f9e2b namespace=k8s.io
Sep 12 17:36:32.671312 containerd[1494]: time="2025-09-12T17:36:32.671257294Z" level=warning msg="cleaning up after shim disconnected" id=8caa9e73f6120948e5ba5c5ebb6443ea4f2c29ac056432d78f4ac11f856f9e2b namespace=k8s.io
Sep 12 17:36:32.671312 containerd[1494]: time="2025-09-12T17:36:32.671278644Z" level=info msg="cleaning up dead shim" namespace=k8s.io
Sep 12 17:36:32.672760 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-8caa9e73f6120948e5ba5c5ebb6443ea4f2c29ac056432d78f4ac11f856f9e2b-rootfs.mount: Deactivated successfully.
Sep 12 17:36:33.145087 kubelet[2561]: I0912 17:36:33.145036 2561 scope.go:117] "RemoveContainer" containerID="8caa9e73f6120948e5ba5c5ebb6443ea4f2c29ac056432d78f4ac11f856f9e2b"
Sep 12 17:36:33.148460 kubelet[2561]: I0912 17:36:33.148308 2561 scope.go:117] "RemoveContainer" containerID="a1917a782fb112f95cf697c3a66d2973a99473ae138b1910d6043562a5153aff"
Sep 12 17:36:33.148607 kubelet[2561]: I0912 17:36:33.148574 2561 scope.go:117] "RemoveContainer" containerID="0c9078338a68164d9c44a82df350b50f518ef7b10c6c0343e95d9dbfda6139b3"
Sep 12 17:36:33.186528 containerd[1494]: time="2025-09-12T17:36:33.186463724Z" level=info msg="CreateContainer within sandbox \"0e355f3981ea5289ada5a20ebfc71d55d43e7b87be88411efa187d7a87bde99c\" for container &ContainerMetadata{Name:tigera-operator,Attempt:1,}"
Sep 12 17:36:33.186749 containerd[1494]: time="2025-09-12T17:36:33.186703534Z" level=info msg="CreateContainer within sandbox \"68f0a35c74934ce78267c3e621151e23f2d6b6bb0e496461cd802e7aa0d8209e\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:1,}"
Sep 12 17:36:33.188476 containerd[1494]: time="2025-09-12T17:36:33.188373153Z" level=info msg="CreateContainer within sandbox \"e489b027ddbf82b7dc26f7be3be9f77c25a8731cf7f59e4806f2d3ca29ec3ea2\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:1,}"
Sep 12 17:36:33.332172 containerd[1494]: time="2025-09-12T17:36:33.331807338Z" level=info msg="CreateContainer within sandbox \"e489b027ddbf82b7dc26f7be3be9f77c25a8731cf7f59e4806f2d3ca29ec3ea2\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:1,} returns container id \"b0df3af36b057c43fb1b623f0f78689f89a9aed20b5de5f87fe4936dddecca27\""
Sep 12 17:36:33.336151 containerd[1494]: time="2025-09-12T17:36:33.336129320Z" level=info msg="CreateContainer within sandbox \"0e355f3981ea5289ada5a20ebfc71d55d43e7b87be88411efa187d7a87bde99c\" for &ContainerMetadata{Name:tigera-operator,Attempt:1,} returns container id \"b09684be4242994b93a9a1f82efc5972ea86d500c4ee21bcf198bad931f4b8df\""
Sep 12 17:36:33.338335 containerd[1494]: time="2025-09-12T17:36:33.337413949Z" level=info msg="StartContainer for \"b0df3af36b057c43fb1b623f0f78689f89a9aed20b5de5f87fe4936dddecca27\""
Sep 12 17:36:33.341683 containerd[1494]: time="2025-09-12T17:36:33.341659308Z" level=info msg="CreateContainer within sandbox \"68f0a35c74934ce78267c3e621151e23f2d6b6bb0e496461cd802e7aa0d8209e\" for &ContainerMetadata{Name:kube-scheduler,Attempt:1,} returns container id \"6bb3c6a973221e3a473cb25dacff6ac390a65b84f98e4a8da48b9cde31c9c0de\""
Sep 12 17:36:33.342119 containerd[1494]: time="2025-09-12T17:36:33.342102348Z" level=info msg="StartContainer for \"b09684be4242994b93a9a1f82efc5972ea86d500c4ee21bcf198bad931f4b8df\""
Sep 12 17:36:33.354347 containerd[1494]: time="2025-09-12T17:36:33.354321997Z" level=info msg="StartContainer for \"6bb3c6a973221e3a473cb25dacff6ac390a65b84f98e4a8da48b9cde31c9c0de\""
Sep 12 17:36:33.384888 systemd[1]: Started cri-containerd-b09684be4242994b93a9a1f82efc5972ea86d500c4ee21bcf198bad931f4b8df.scope - libcontainer container b09684be4242994b93a9a1f82efc5972ea86d500c4ee21bcf198bad931f4b8df.
Sep 12 17:36:33.394991 systemd[1]: Started cri-containerd-6bb3c6a973221e3a473cb25dacff6ac390a65b84f98e4a8da48b9cde31c9c0de.scope - libcontainer container 6bb3c6a973221e3a473cb25dacff6ac390a65b84f98e4a8da48b9cde31c9c0de.
Sep 12 17:36:33.404981 systemd[1]: Started cri-containerd-b0df3af36b057c43fb1b623f0f78689f89a9aed20b5de5f87fe4936dddecca27.scope - libcontainer container b0df3af36b057c43fb1b623f0f78689f89a9aed20b5de5f87fe4936dddecca27.
Sep 12 17:36:33.471717 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount482300301.mount: Deactivated successfully.
Sep 12 17:36:33.473551 containerd[1494]: time="2025-09-12T17:36:33.473475332Z" level=info msg="StartContainer for \"6bb3c6a973221e3a473cb25dacff6ac390a65b84f98e4a8da48b9cde31c9c0de\" returns successfully"
Sep 12 17:36:33.483850 containerd[1494]: time="2025-09-12T17:36:33.482866076Z" level=info msg="StartContainer for \"b09684be4242994b93a9a1f82efc5972ea86d500c4ee21bcf198bad931f4b8df\" returns successfully"
Sep 12 17:36:33.492758 containerd[1494]: time="2025-09-12T17:36:33.492679814Z" level=info msg="StartContainer for \"b0df3af36b057c43fb1b623f0f78689f89a9aed20b5de5f87fe4936dddecca27\" returns successfully"
Sep 12 17:36:35.486946 kubelet[2561]: E0912 17:36:35.464253 2561 event.go:359] "Server rejected event (will not retry!)" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.0.3:39308->10.0.0.2:2379: read: connection timed out" event="&Event{ObjectMeta:{kube-apiserver-ci-4081-3-6-f-c182586e87.1864998ce4b68265 kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:kube-apiserver-ci-4081-3-6-f-c182586e87,UID:c3d885514471810970ca3bb7f49a949f,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Readiness probe failed: HTTP probe failed with statuscode: 500,Source:EventSource{Component:kubelet,Host:ci-4081-3-6-f-c182586e87,},FirstTimestamp:2025-09-12 17:36:24.992973413 +0000 UTC m=+161.020388523,LastTimestamp:2025-09-12 17:36:24.992973413 +0000 UTC m=+161.020388523,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4081-3-6-f-c182586e87,}"