Sep 12 23:56:08.901600 kernel: Booting Linux on physical CPU 0x0000000000 [0x413fd0c1]
Sep 12 23:56:08.901631 kernel: Linux version 6.6.106-flatcar (build@pony-truck.infra.kinvolk.io) (aarch64-cros-linux-gnu-gcc (Gentoo Hardened 13.3.1_p20240614 p17) 13.3.1 20240614, GNU ld (Gentoo 2.42 p3) 2.42.0) #1 SMP PREEMPT Fri Sep 12 22:36:20 -00 2025
Sep 12 23:56:08.901644 kernel: KASLR enabled
Sep 12 23:56:08.901652 kernel: efi: EFI v2.7 by Ubuntu distribution of EDK II
Sep 12 23:56:08.901660 kernel: efi: SMBIOS 3.0=0x139ed0000 MEMATTR=0x1390c1018 ACPI 2.0=0x136760018 RNG=0x13676e918 MEMRESERVE=0x136b43d18
Sep 12 23:56:08.901667 kernel: random: crng init done
Sep 12 23:56:08.901677 kernel: ACPI: Early table checksum verification disabled
Sep 12 23:56:08.901684 kernel: ACPI: RSDP 0x0000000136760018 000024 (v02 BOCHS )
Sep 12 23:56:08.901693 kernel: ACPI: XSDT 0x000000013676FE98 00006C (v01 BOCHS BXPC 00000001 01000013)
Sep 12 23:56:08.901703 kernel: ACPI: FACP 0x000000013676FA98 000114 (v06 BOCHS BXPC 00000001 BXPC 00000001)
Sep 12 23:56:08.903788 kernel: ACPI: DSDT 0x0000000136767518 001468 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Sep 12 23:56:08.903797 kernel: ACPI: APIC 0x000000013676FC18 000108 (v04 BOCHS BXPC 00000001 BXPC 00000001)
Sep 12 23:56:08.903803 kernel: ACPI: PPTT 0x000000013676FD98 000060 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Sep 12 23:56:08.903810 kernel: ACPI: GTDT 0x000000013676D898 000060 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Sep 12 23:56:08.903818 kernel: ACPI: MCFG 0x000000013676FF98 00003C (v01 BOCHS BXPC 00000001 BXPC 00000001)
Sep 12 23:56:08.903833 kernel: ACPI: SPCR 0x000000013676E818 000050 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Sep 12 23:56:08.903839 kernel: ACPI: DBG2 0x000000013676E898 000057 (v00 BOCHS BXPC 00000001 BXPC 00000001)
Sep 12 23:56:08.903846 kernel: ACPI: IORT 0x000000013676E418 000080 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Sep 12 23:56:08.903852 kernel: ACPI: BGRT 0x000000013676E798 000038 (v01 INTEL EDK2 00000002 01000013)
Sep 12 23:56:08.903859 kernel: ACPI: SPCR: console: pl011,mmio32,0x9000000,9600
Sep 12 23:56:08.903866 kernel: NUMA: Failed to initialise from firmware
Sep 12 23:56:08.903872 kernel: NUMA: Faking a node at [mem 0x0000000040000000-0x0000000139ffffff]
Sep 12 23:56:08.903879 kernel: NUMA: NODE_DATA [mem 0x13966e800-0x139673fff]
Sep 12 23:56:08.903885 kernel: Zone ranges:
Sep 12 23:56:08.903891 kernel: DMA [mem 0x0000000040000000-0x00000000ffffffff]
Sep 12 23:56:08.903899 kernel: DMA32 empty
Sep 12 23:56:08.903906 kernel: Normal [mem 0x0000000100000000-0x0000000139ffffff]
Sep 12 23:56:08.903912 kernel: Movable zone start for each node
Sep 12 23:56:08.903918 kernel: Early memory node ranges
Sep 12 23:56:08.903924 kernel: node 0: [mem 0x0000000040000000-0x000000013676ffff]
Sep 12 23:56:08.903931 kernel: node 0: [mem 0x0000000136770000-0x0000000136b3ffff]
Sep 12 23:56:08.903937 kernel: node 0: [mem 0x0000000136b40000-0x0000000139e1ffff]
Sep 12 23:56:08.903943 kernel: node 0: [mem 0x0000000139e20000-0x0000000139eaffff]
Sep 12 23:56:08.903950 kernel: node 0: [mem 0x0000000139eb0000-0x0000000139ebffff]
Sep 12 23:56:08.903956 kernel: node 0: [mem 0x0000000139ec0000-0x0000000139fdffff]
Sep 12 23:56:08.903963 kernel: node 0: [mem 0x0000000139fe0000-0x0000000139ffffff]
Sep 12 23:56:08.903969 kernel: Initmem setup node 0 [mem 0x0000000040000000-0x0000000139ffffff]
Sep 12 23:56:08.903977 kernel: On node 0, zone Normal: 24576 pages in unavailable ranges
Sep 12 23:56:08.903983 kernel: psci: probing for conduit method from ACPI.
Sep 12 23:56:08.903990 kernel: psci: PSCIv1.1 detected in firmware.
Sep 12 23:56:08.904000 kernel: psci: Using standard PSCI v0.2 function IDs
Sep 12 23:56:08.904007 kernel: psci: Trusted OS migration not required
Sep 12 23:56:08.904014 kernel: psci: SMC Calling Convention v1.1
Sep 12 23:56:08.904023 kernel: smccc: KVM: hypervisor services detected (0x00000000 0x00000000 0x00000000 0x00000003)
Sep 12 23:56:08.904030 kernel: percpu: Embedded 31 pages/cpu s86632 r8192 d32152 u126976
Sep 12 23:56:08.904037 kernel: pcpu-alloc: s86632 r8192 d32152 u126976 alloc=31*4096
Sep 12 23:56:08.904044 kernel: pcpu-alloc: [0] 0 [0] 1
Sep 12 23:56:08.904051 kernel: Detected PIPT I-cache on CPU0
Sep 12 23:56:08.904057 kernel: CPU features: detected: GIC system register CPU interface
Sep 12 23:56:08.904064 kernel: CPU features: detected: Hardware dirty bit management
Sep 12 23:56:08.904071 kernel: CPU features: detected: Spectre-v4
Sep 12 23:56:08.904077 kernel: CPU features: detected: Spectre-BHB
Sep 12 23:56:08.904084 kernel: CPU features: kernel page table isolation forced ON by KASLR
Sep 12 23:56:08.904093 kernel: CPU features: detected: Kernel page table isolation (KPTI)
Sep 12 23:56:08.904100 kernel: CPU features: detected: ARM erratum 1418040
Sep 12 23:56:08.904107 kernel: CPU features: detected: SSBS not fully self-synchronizing
Sep 12 23:56:08.904113 kernel: alternatives: applying boot alternatives
Sep 12 23:56:08.904122 kernel: Kernel command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyAMA0,115200n8 flatcar.first_boot=detected acpi=force flatcar.oem.id=hetzner verity.usrhash=e1b46f3c9e154636c32f6cde6e746a00a6b37ca7432cb4e16d172c05f584a8c9
Sep 12 23:56:08.904129 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space.
Sep 12 23:56:08.904136 kernel: Dentry cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Sep 12 23:56:08.904143 kernel: Inode-cache hash table entries: 262144 (order: 9, 2097152 bytes, linear)
Sep 12 23:56:08.904150 kernel: Fallback order for Node 0: 0
Sep 12 23:56:08.904156 kernel: Built 1 zonelists, mobility grouping on. Total pages: 1008000
Sep 12 23:56:08.904163 kernel: Policy zone: Normal
Sep 12 23:56:08.904171 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Sep 12 23:56:08.904178 kernel: software IO TLB: area num 2.
Sep 12 23:56:08.904185 kernel: software IO TLB: mapped [mem 0x00000000fbfff000-0x00000000fffff000] (64MB)
Sep 12 23:56:08.904192 kernel: Memory: 3882740K/4096000K available (10304K kernel code, 2186K rwdata, 8108K rodata, 39488K init, 897K bss, 213260K reserved, 0K cma-reserved)
Sep 12 23:56:08.904199 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=2, Nodes=1
Sep 12 23:56:08.904206 kernel: rcu: Preemptible hierarchical RCU implementation.
Sep 12 23:56:08.904213 kernel: rcu: RCU event tracing is enabled.
Sep 12 23:56:08.904220 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=2.
Sep 12 23:56:08.904227 kernel: Trampoline variant of Tasks RCU enabled.
Sep 12 23:56:08.904234 kernel: Tracing variant of Tasks RCU enabled.
Sep 12 23:56:08.904241 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Sep 12 23:56:08.904249 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=2
Sep 12 23:56:08.904256 kernel: NR_IRQS: 64, nr_irqs: 64, preallocated irqs: 0
Sep 12 23:56:08.904263 kernel: GICv3: 256 SPIs implemented
Sep 12 23:56:08.904269 kernel: GICv3: 0 Extended SPIs implemented
Sep 12 23:56:08.904276 kernel: Root IRQ handler: gic_handle_irq
Sep 12 23:56:08.904283 kernel: GICv3: GICv3 features: 16 PPIs, DirectLPI
Sep 12 23:56:08.904290 kernel: GICv3: CPU0: found redistributor 0 region 0:0x00000000080a0000
Sep 12 23:56:08.904297 kernel: ITS [mem 0x08080000-0x0809ffff]
Sep 12 23:56:08.904304 kernel: ITS@0x0000000008080000: allocated 8192 Devices @1000c0000 (indirect, esz 8, psz 64K, shr 1)
Sep 12 23:56:08.904311 kernel: ITS@0x0000000008080000: allocated 8192 Interrupt Collections @1000d0000 (flat, esz 8, psz 64K, shr 1)
Sep 12 23:56:08.904318 kernel: GICv3: using LPI property table @0x00000001000e0000
Sep 12 23:56:08.904325 kernel: GICv3: CPU0: using allocated LPI pending table @0x00000001000f0000
Sep 12 23:56:08.904333 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Sep 12 23:56:08.904340 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
Sep 12 23:56:08.904347 kernel: arch_timer: cp15 timer(s) running at 25.00MHz (virt).
Sep 12 23:56:08.904368 kernel: clocksource: arch_sys_counter: mask: 0xffffffffffffff max_cycles: 0x5c40939b5, max_idle_ns: 440795202646 ns
Sep 12 23:56:08.904376 kernel: sched_clock: 56 bits at 25MHz, resolution 40ns, wraps every 4398046511100ns
Sep 12 23:56:08.904383 kernel: Console: colour dummy device 80x25
Sep 12 23:56:08.904390 kernel: ACPI: Core revision 20230628
Sep 12 23:56:08.904398 kernel: Calibrating delay loop (skipped), value calculated using timer frequency.. 50.00 BogoMIPS (lpj=25000)
Sep 12 23:56:08.904405 kernel: pid_max: default: 32768 minimum: 301
Sep 12 23:56:08.904412 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,integrity
Sep 12 23:56:08.904421 kernel: landlock: Up and running.
Sep 12 23:56:08.904428 kernel: SELinux: Initializing.
Sep 12 23:56:08.904435 kernel: Mount-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
Sep 12 23:56:08.904442 kernel: Mountpoint-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
Sep 12 23:56:08.904449 kernel: RCU Tasks: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Sep 12 23:56:08.904457 kernel: RCU Tasks Trace: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Sep 12 23:56:08.904464 kernel: rcu: Hierarchical SRCU implementation.
Sep 12 23:56:08.904471 kernel: rcu: Max phase no-delay instances is 400.
Sep 12 23:56:08.904478 kernel: Platform MSI: ITS@0x8080000 domain created
Sep 12 23:56:08.904487 kernel: PCI/MSI: ITS@0x8080000 domain created
Sep 12 23:56:08.904494 kernel: Remapping and enabling EFI services.
Sep 12 23:56:08.904501 kernel: smp: Bringing up secondary CPUs ...
Sep 12 23:56:08.904518 kernel: Detected PIPT I-cache on CPU1
Sep 12 23:56:08.904525 kernel: GICv3: CPU1: found redistributor 1 region 0:0x00000000080c0000
Sep 12 23:56:08.904532 kernel: GICv3: CPU1: using allocated LPI pending table @0x0000000100100000
Sep 12 23:56:08.904539 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
Sep 12 23:56:08.904547 kernel: CPU1: Booted secondary processor 0x0000000001 [0x413fd0c1]
Sep 12 23:56:08.904554 kernel: smp: Brought up 1 node, 2 CPUs
Sep 12 23:56:08.904561 kernel: SMP: Total of 2 processors activated.
Sep 12 23:56:08.904571 kernel: CPU features: detected: 32-bit EL0 Support
Sep 12 23:56:08.904579 kernel: CPU features: detected: Data cache clean to the PoU not required for I/D coherence
Sep 12 23:56:08.904592 kernel: CPU features: detected: Common not Private translations
Sep 12 23:56:08.904601 kernel: CPU features: detected: CRC32 instructions
Sep 12 23:56:08.904608 kernel: CPU features: detected: Enhanced Virtualization Traps
Sep 12 23:56:08.904616 kernel: CPU features: detected: RCpc load-acquire (LDAPR)
Sep 12 23:56:08.904623 kernel: CPU features: detected: LSE atomic instructions
Sep 12 23:56:08.904631 kernel: CPU features: detected: Privileged Access Never
Sep 12 23:56:08.904638 kernel: CPU features: detected: RAS Extension Support
Sep 12 23:56:08.904647 kernel: CPU features: detected: Speculative Store Bypassing Safe (SSBS)
Sep 12 23:56:08.904655 kernel: CPU: All CPU(s) started at EL1
Sep 12 23:56:08.904662 kernel: alternatives: applying system-wide alternatives
Sep 12 23:56:08.904670 kernel: devtmpfs: initialized
Sep 12 23:56:08.904678 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Sep 12 23:56:08.904686 kernel: futex hash table entries: 512 (order: 3, 32768 bytes, linear)
Sep 12 23:56:08.904693 kernel: pinctrl core: initialized pinctrl subsystem
Sep 12 23:56:08.904702 kernel: SMBIOS 3.0.0 present.
Sep 12 23:56:08.905861 kernel: DMI: Hetzner vServer/KVM Virtual Machine, BIOS 20171111 11/11/2017
Sep 12 23:56:08.905872 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Sep 12 23:56:08.905879 kernel: DMA: preallocated 512 KiB GFP_KERNEL pool for atomic allocations
Sep 12 23:56:08.905900 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Sep 12 23:56:08.905909 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Sep 12 23:56:08.905917 kernel: audit: initializing netlink subsys (disabled)
Sep 12 23:56:08.905924 kernel: audit: type=2000 audit(0.012:1): state=initialized audit_enabled=0 res=1
Sep 12 23:56:08.905932 kernel: thermal_sys: Registered thermal governor 'step_wise'
Sep 12 23:56:08.905947 kernel: cpuidle: using governor menu
Sep 12 23:56:08.905955 kernel: hw-breakpoint: found 6 breakpoint and 4 watchpoint registers.
Sep 12 23:56:08.905962 kernel: ASID allocator initialised with 32768 entries
Sep 12 23:56:08.905970 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Sep 12 23:56:08.905977 kernel: Serial: AMBA PL011 UART driver
Sep 12 23:56:08.905985 kernel: Modules: 2G module region forced by RANDOMIZE_MODULE_REGION_FULL
Sep 12 23:56:08.905992 kernel: Modules: 0 pages in range for non-PLT usage
Sep 12 23:56:08.906000 kernel: Modules: 508992 pages in range for PLT usage
Sep 12 23:56:08.906008 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Sep 12 23:56:08.906015 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 1.00 GiB page
Sep 12 23:56:08.906025 kernel: HugeTLB: registered 32.0 MiB page size, pre-allocated 0 pages
Sep 12 23:56:08.906033 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 32.0 MiB page
Sep 12 23:56:08.906040 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Sep 12 23:56:08.906048 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 2.00 MiB page
Sep 12 23:56:08.906055 kernel: HugeTLB: registered 64.0 KiB page size, pre-allocated 0 pages
Sep 12 23:56:08.906062 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 64.0 KiB page
Sep 12 23:56:08.906069 kernel: ACPI: Added _OSI(Module Device)
Sep 12 23:56:08.906077 kernel: ACPI: Added _OSI(Processor Device)
Sep 12 23:56:08.906084 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Sep 12 23:56:08.906093 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Sep 12 23:56:08.906101 kernel: ACPI: Interpreter enabled
Sep 12 23:56:08.906109 kernel: ACPI: Using GIC for interrupt routing
Sep 12 23:56:08.906116 kernel: ACPI: MCFG table detected, 1 entries
Sep 12 23:56:08.906124 kernel: ARMH0011:00: ttyAMA0 at MMIO 0x9000000 (irq = 12, base_baud = 0) is a SBSA
Sep 12 23:56:08.906131 kernel: printk: console [ttyAMA0] enabled
Sep 12 23:56:08.906139 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Sep 12 23:56:08.906327 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3]
Sep 12 23:56:08.906406 kernel: acpi PNP0A08:00: _OSC: platform does not support [LTR]
Sep 12 23:56:08.906473 kernel: acpi PNP0A08:00: _OSC: OS now controls [PCIeHotplug PME AER PCIeCapability]
Sep 12 23:56:08.906562 kernel: acpi PNP0A08:00: ECAM area [mem 0x4010000000-0x401fffffff] reserved by PNP0C02:00
Sep 12 23:56:08.906631 kernel: acpi PNP0A08:00: ECAM at [mem 0x4010000000-0x401fffffff] for [bus 00-ff]
Sep 12 23:56:08.906641 kernel: ACPI: Remapped I/O 0x000000003eff0000 to [io 0x0000-0xffff window]
Sep 12 23:56:08.906649 kernel: PCI host bridge to bus 0000:00
Sep 12 23:56:08.907850 kernel: pci_bus 0000:00: root bus resource [mem 0x10000000-0x3efeffff window]
Sep 12 23:56:08.907955 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0xffff window]
Sep 12 23:56:08.908019 kernel: pci_bus 0000:00: root bus resource [mem 0x8000000000-0xffffffffff window]
Sep 12 23:56:08.908078 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Sep 12 23:56:08.908165 kernel: pci 0000:00:00.0: [1b36:0008] type 00 class 0x060000
Sep 12 23:56:08.908244 kernel: pci 0000:00:01.0: [1af4:1050] type 00 class 0x038000
Sep 12 23:56:08.908313 kernel: pci 0000:00:01.0: reg 0x14: [mem 0x11289000-0x11289fff]
Sep 12 23:56:08.908381 kernel: pci 0000:00:01.0: reg 0x20: [mem 0x8000600000-0x8000603fff 64bit pref]
Sep 12 23:56:08.908465 kernel: pci 0000:00:02.0: [1b36:000c] type 01 class 0x060400
Sep 12 23:56:08.908576 kernel: pci 0000:00:02.0: reg 0x10: [mem 0x11288000-0x11288fff]
Sep 12 23:56:08.908653 kernel: pci 0000:00:02.1: [1b36:000c] type 01 class 0x060400
Sep 12 23:56:08.908738 kernel: pci 0000:00:02.1: reg 0x10: [mem 0x11287000-0x11287fff]
Sep 12 23:56:08.908817 kernel: pci 0000:00:02.2: [1b36:000c] type 01 class 0x060400
Sep 12 23:56:08.908885 kernel: pci 0000:00:02.2: reg 0x10: [mem 0x11286000-0x11286fff]
Sep 12 23:56:08.908963 kernel: pci 0000:00:02.3: [1b36:000c] type 01 class 0x060400
Sep 12 23:56:08.909030 kernel: pci 0000:00:02.3: reg 0x10: [mem 0x11285000-0x11285fff]
Sep 12 23:56:08.909102 kernel: pci 0000:00:02.4: [1b36:000c] type 01 class 0x060400
Sep 12 23:56:08.909169 kernel: pci 0000:00:02.4: reg 0x10: [mem 0x11284000-0x11284fff]
Sep 12 23:56:08.909242 kernel: pci 0000:00:02.5: [1b36:000c] type 01 class 0x060400
Sep 12 23:56:08.909308 kernel: pci 0000:00:02.5: reg 0x10: [mem 0x11283000-0x11283fff]
Sep 12 23:56:08.909384 kernel: pci 0000:00:02.6: [1b36:000c] type 01 class 0x060400
Sep 12 23:56:08.909662 kernel: pci 0000:00:02.6: reg 0x10: [mem 0x11282000-0x11282fff]
Sep 12 23:56:08.909780 kernel: pci 0000:00:02.7: [1b36:000c] type 01 class 0x060400
Sep 12 23:56:08.909851 kernel: pci 0000:00:02.7: reg 0x10: [mem 0x11281000-0x11281fff]
Sep 12 23:56:08.909992 kernel: pci 0000:00:03.0: [1b36:000c] type 01 class 0x060400
Sep 12 23:56:08.910066 kernel: pci 0000:00:03.0: reg 0x10: [mem 0x11280000-0x11280fff]
Sep 12 23:56:08.910147 kernel: pci 0000:00:04.0: [1b36:0002] type 00 class 0x070002
Sep 12 23:56:08.910214 kernel: pci 0000:00:04.0: reg 0x10: [io 0x0000-0x0007]
Sep 12 23:56:08.910290 kernel: pci 0000:01:00.0: [1af4:1041] type 00 class 0x020000
Sep 12 23:56:08.910358 kernel: pci 0000:01:00.0: reg 0x14: [mem 0x11000000-0x11000fff]
Sep 12 23:56:08.910427 kernel: pci 0000:01:00.0: reg 0x20: [mem 0x8000000000-0x8000003fff 64bit pref]
Sep 12 23:56:08.910496 kernel: pci 0000:01:00.0: reg 0x30: [mem 0xfff80000-0xffffffff pref]
Sep 12 23:56:08.910595 kernel: pci 0000:02:00.0: [1b36:000d] type 00 class 0x0c0330
Sep 12 23:56:08.910673 kernel: pci 0000:02:00.0: reg 0x10: [mem 0x10e00000-0x10e03fff 64bit]
Sep 12 23:56:08.910889 kernel: pci 0000:03:00.0: [1af4:1043] type 00 class 0x078000
Sep 12 23:56:08.910963 kernel: pci 0000:03:00.0: reg 0x14: [mem 0x10c00000-0x10c00fff]
Sep 12 23:56:08.911029 kernel: pci 0000:03:00.0: reg 0x20: [mem 0x8000100000-0x8000103fff 64bit pref]
Sep 12 23:56:08.911106 kernel: pci 0000:04:00.0: [1af4:1045] type 00 class 0x00ff00
Sep 12 23:56:08.911175 kernel: pci 0000:04:00.0: reg 0x20: [mem 0x8000200000-0x8000203fff 64bit pref]
Sep 12 23:56:08.911256 kernel: pci 0000:05:00.0: [1af4:1044] type 00 class 0x00ff00
Sep 12 23:56:08.911324 kernel: pci 0000:05:00.0: reg 0x20: [mem 0x8000300000-0x8000303fff 64bit pref]
Sep 12 23:56:08.911399 kernel: pci 0000:06:00.0: [1af4:1048] type 00 class 0x010000
Sep 12 23:56:08.911469 kernel: pci 0000:06:00.0: reg 0x14: [mem 0x10600000-0x10600fff]
Sep 12 23:56:08.911597 kernel: pci 0000:06:00.0: reg 0x20: [mem 0x8000400000-0x8000403fff 64bit pref]
Sep 12 23:56:08.911693 kernel: pci 0000:07:00.0: [1af4:1041] type 00 class 0x020000
Sep 12 23:56:08.911792 kernel: pci 0000:07:00.0: reg 0x14: [mem 0x10400000-0x10400fff]
Sep 12 23:56:08.911865 kernel: pci 0000:07:00.0: reg 0x20: [mem 0x8000500000-0x8000503fff 64bit pref]
Sep 12 23:56:08.911933 kernel: pci 0000:07:00.0: reg 0x30: [mem 0xfff80000-0xffffffff pref]
Sep 12 23:56:08.917823 kernel: pci 0000:00:02.0: bridge window [io 0x1000-0x0fff] to [bus 01] add_size 1000
Sep 12 23:56:08.917958 kernel: pci 0000:00:02.0: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 01] add_size 100000 add_align 100000
Sep 12 23:56:08.918142 kernel: pci 0000:00:02.0: bridge window [mem 0x00100000-0x001fffff] to [bus 01] add_size 100000 add_align 100000
Sep 12 23:56:08.918227 kernel: pci 0000:00:02.1: bridge window [io 0x1000-0x0fff] to [bus 02] add_size 1000
Sep 12 23:56:08.918306 kernel: pci 0000:00:02.1: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 02] add_size 200000 add_align 100000
Sep 12 23:56:08.918373 kernel: pci 0000:00:02.1: bridge window [mem 0x00100000-0x001fffff] to [bus 02] add_size 100000 add_align 100000
Sep 12 23:56:08.918445 kernel: pci 0000:00:02.2: bridge window [io 0x1000-0x0fff] to [bus 03] add_size 1000
Sep 12 23:56:08.918555 kernel: pci 0000:00:02.2: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 03] add_size 100000 add_align 100000
Sep 12 23:56:08.918676 kernel: pci 0000:00:02.2: bridge window [mem 0x00100000-0x001fffff] to [bus 03] add_size 100000 add_align 100000
Sep 12 23:56:08.918807 kernel: pci 0000:00:02.3: bridge window [io 0x1000-0x0fff] to [bus 04] add_size 1000
Sep 12 23:56:08.918880 kernel: pci 0000:00:02.3: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 04] add_size 100000 add_align 100000
Sep 12 23:56:08.918954 kernel: pci 0000:00:02.3: bridge window [mem 0x00100000-0x000fffff] to [bus 04] add_size 200000 add_align 100000
Sep 12 23:56:08.919025 kernel: pci 0000:00:02.4: bridge window [io 0x1000-0x0fff] to [bus 05] add_size 1000
Sep 12 23:56:08.919094 kernel: pci 0000:00:02.4: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 05] add_size 100000 add_align 100000
Sep 12 23:56:08.919177 kernel: pci 0000:00:02.4: bridge window [mem 0x00100000-0x000fffff] to [bus 05] add_size 200000 add_align 100000
Sep 12 23:56:08.919255 kernel: pci 0000:00:02.5: bridge window [io 0x1000-0x0fff] to [bus 06] add_size 1000
Sep 12 23:56:08.919322 kernel: pci 0000:00:02.5: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 06] add_size 100000 add_align 100000
Sep 12 23:56:08.919388 kernel: pci 0000:00:02.5: bridge window [mem 0x00100000-0x001fffff] to [bus 06] add_size 100000 add_align 100000
Sep 12 23:56:08.919460 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x0fff] to [bus 07] add_size 1000
Sep 12 23:56:08.919562 kernel: pci 0000:00:02.6: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 07] add_size 100000 add_align 100000
Sep 12 23:56:08.919652 kernel: pci 0000:00:02.6: bridge window [mem 0x00100000-0x001fffff] to [bus 07] add_size 100000 add_align 100000
Sep 12 23:56:08.921859 kernel: pci 0000:00:02.7: bridge window [io 0x1000-0x0fff] to [bus 08] add_size 1000
Sep 12 23:56:08.921969 kernel: pci 0000:00:02.7: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 08] add_size 200000 add_align 100000
Sep 12 23:56:08.922036 kernel: pci 0000:00:02.7: bridge window [mem 0x00100000-0x000fffff] to [bus 08] add_size 200000 add_align 100000
Sep 12 23:56:08.922108 kernel: pci 0000:00:03.0: bridge window [io 0x1000-0x0fff] to [bus 09] add_size 1000
Sep 12 23:56:08.922175 kernel: pci 0000:00:03.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 09] add_size 200000 add_align 100000
Sep 12 23:56:08.922306 kernel: pci 0000:00:03.0: bridge window [mem 0x00100000-0x000fffff] to [bus 09] add_size 200000 add_align 100000
Sep 12 23:56:08.922382 kernel: pci 0000:00:02.0: BAR 14: assigned [mem 0x10000000-0x101fffff]
Sep 12 23:56:08.922449 kernel: pci 0000:00:02.0: BAR 15: assigned [mem 0x8000000000-0x80001fffff 64bit pref]
Sep 12 23:56:08.922541 kernel: pci 0000:00:02.1: BAR 14: assigned [mem 0x10200000-0x103fffff]
Sep 12 23:56:08.922619 kernel: pci 0000:00:02.1: BAR 15: assigned [mem 0x8000200000-0x80003fffff 64bit pref]
Sep 12 23:56:08.922691 kernel: pci 0000:00:02.2: BAR 14: assigned [mem 0x10400000-0x105fffff]
Sep 12 23:56:08.922776 kernel: pci 0000:00:02.2: BAR 15: assigned [mem 0x8000400000-0x80005fffff 64bit pref]
Sep 12 23:56:08.922855 kernel: pci 0000:00:02.3: BAR 14: assigned [mem 0x10600000-0x107fffff]
Sep 12 23:56:08.922921 kernel: pci 0000:00:02.3: BAR 15: assigned [mem 0x8000600000-0x80007fffff 64bit pref]
Sep 12 23:56:08.922991 kernel: pci 0000:00:02.4: BAR 14: assigned [mem 0x10800000-0x109fffff]
Sep 12 23:56:08.923057 kernel: pci 0000:00:02.4: BAR 15: assigned [mem 0x8000800000-0x80009fffff 64bit pref]
Sep 12 23:56:08.923127 kernel: pci 0000:00:02.5: BAR 14: assigned [mem 0x10a00000-0x10bfffff]
Sep 12 23:56:08.923193 kernel: pci 0000:00:02.5: BAR 15: assigned [mem 0x8000a00000-0x8000bfffff 64bit pref]
Sep 12 23:56:08.923265 kernel: pci 0000:00:02.6: BAR 14: assigned [mem 0x10c00000-0x10dfffff]
Sep 12 23:56:08.923335 kernel: pci 0000:00:02.6: BAR 15: assigned [mem 0x8000c00000-0x8000dfffff 64bit pref]
Sep 12 23:56:08.923403 kernel: pci 0000:00:02.7: BAR 14: assigned [mem 0x10e00000-0x10ffffff]
Sep 12 23:56:08.923469 kernel: pci 0000:00:02.7: BAR 15: assigned [mem 0x8000e00000-0x8000ffffff 64bit pref]
Sep 12 23:56:08.923584 kernel: pci 0000:00:03.0: BAR 14: assigned [mem 0x11000000-0x111fffff]
Sep 12 23:56:08.923658 kernel: pci 0000:00:03.0: BAR 15: assigned [mem 0x8001000000-0x80011fffff 64bit pref]
Sep 12 23:56:08.925198 kernel: pci 0000:00:01.0: BAR 4: assigned [mem 0x8001200000-0x8001203fff 64bit pref]
Sep 12 23:56:08.925314 kernel: pci 0000:00:01.0: BAR 1: assigned [mem 0x11200000-0x11200fff]
Sep 12 23:56:08.925397 kernel: pci 0000:00:02.0: BAR 0: assigned [mem 0x11201000-0x11201fff]
Sep 12 23:56:08.925464 kernel: pci 0000:00:02.0: BAR 13: assigned [io 0x1000-0x1fff]
Sep 12 23:56:08.925582 kernel: pci 0000:00:02.1: BAR 0: assigned [mem 0x11202000-0x11202fff]
Sep 12 23:56:08.925653 kernel: pci 0000:00:02.1: BAR 13: assigned [io 0x2000-0x2fff]
Sep 12 23:56:08.925738 kernel: pci 0000:00:02.2: BAR 0: assigned [mem 0x11203000-0x11203fff]
Sep 12 23:56:08.925808 kernel: pci 0000:00:02.2: BAR 13: assigned [io 0x3000-0x3fff]
Sep 12 23:56:08.925878 kernel: pci 0000:00:02.3: BAR 0: assigned [mem 0x11204000-0x11204fff]
Sep 12 23:56:08.925944 kernel: pci 0000:00:02.3: BAR 13: assigned [io 0x4000-0x4fff]
Sep 12 23:56:08.926022 kernel: pci 0000:00:02.4: BAR 0: assigned [mem 0x11205000-0x11205fff]
Sep 12 23:56:08.926091 kernel: pci 0000:00:02.4: BAR 13: assigned [io 0x5000-0x5fff]
Sep 12 23:56:08.926163 kernel: pci 0000:00:02.5: BAR 0: assigned [mem 0x11206000-0x11206fff]
Sep 12 23:56:08.926228 kernel: pci 0000:00:02.5: BAR 13: assigned [io 0x6000-0x6fff]
Sep 12 23:56:08.926299 kernel: pci 0000:00:02.6: BAR 0: assigned [mem 0x11207000-0x11207fff]
Sep 12 23:56:08.926366 kernel: pci 0000:00:02.6: BAR 13: assigned [io 0x7000-0x7fff]
Sep 12 23:56:08.926434 kernel: pci 0000:00:02.7: BAR 0: assigned [mem 0x11208000-0x11208fff]
Sep 12 23:56:08.926501 kernel: pci 0000:00:02.7: BAR 13: assigned [io 0x8000-0x8fff]
Sep 12 23:56:08.926588 kernel: pci 0000:00:03.0: BAR 0: assigned [mem 0x11209000-0x11209fff]
Sep 12 23:56:08.926656 kernel: pci 0000:00:03.0: BAR 13: assigned [io 0x9000-0x9fff]
Sep 12 23:56:08.926785 kernel: pci 0000:00:04.0: BAR 0: assigned [io 0xa000-0xa007]
Sep 12 23:56:08.926866 kernel: pci 0000:01:00.0: BAR 6: assigned [mem 0x10000000-0x1007ffff pref]
Sep 12 23:56:08.926935 kernel: pci 0000:01:00.0: BAR 4: assigned [mem 0x8000000000-0x8000003fff 64bit pref]
Sep 12 23:56:08.927005 kernel: pci 0000:01:00.0: BAR 1: assigned [mem 0x10080000-0x10080fff]
Sep 12 23:56:08.927074 kernel: pci 0000:00:02.0: PCI bridge to [bus 01]
Sep 12 23:56:08.927145 kernel: pci 0000:00:02.0: bridge window [io 0x1000-0x1fff]
Sep 12 23:56:08.927211 kernel: pci 0000:00:02.0: bridge window [mem 0x10000000-0x101fffff]
Sep 12 23:56:08.927277 kernel: pci 0000:00:02.0: bridge window [mem 0x8000000000-0x80001fffff 64bit pref]
Sep 12 23:56:08.927349 kernel: pci 0000:02:00.0: BAR 0: assigned [mem 0x10200000-0x10203fff 64bit]
Sep 12 23:56:08.927419 kernel: pci 0000:00:02.1: PCI bridge to [bus 02]
Sep 12 23:56:08.927489 kernel: pci 0000:00:02.1: bridge window [io 0x2000-0x2fff]
Sep 12 23:56:08.927587 kernel: pci 0000:00:02.1: bridge window [mem 0x10200000-0x103fffff]
Sep 12 23:56:08.927655 kernel: pci 0000:00:02.1: bridge window [mem 0x8000200000-0x80003fffff 64bit pref]
Sep 12 23:56:08.927830 kernel: pci 0000:03:00.0: BAR 4: assigned [mem 0x8000400000-0x8000403fff 64bit pref]
Sep 12 23:56:08.927905 kernel: pci 0000:03:00.0: BAR 1: assigned [mem 0x10400000-0x10400fff]
Sep 12 23:56:08.927975 kernel: pci 0000:00:02.2: PCI bridge to [bus 03]
Sep 12 23:56:08.928040 kernel: pci 0000:00:02.2: bridge window [io 0x3000-0x3fff]
Sep 12 23:56:08.928103 kernel: pci 0000:00:02.2: bridge window [mem 0x10400000-0x105fffff]
Sep 12 23:56:08.928176 kernel: pci 0000:00:02.2: bridge window [mem 0x8000400000-0x80005fffff 64bit pref]
Sep 12 23:56:08.928250 kernel: pci 0000:04:00.0: BAR 4: assigned [mem 0x8000600000-0x8000603fff 64bit pref]
Sep 12 23:56:08.928319 kernel: pci 0000:00:02.3: PCI bridge to [bus 04]
Sep 12 23:56:08.928385 kernel: pci 0000:00:02.3: bridge window [io 0x4000-0x4fff]
Sep 12 23:56:08.928449 kernel: pci 0000:00:02.3: bridge window [mem 0x10600000-0x107fffff]
Sep 12 23:56:08.928575 kernel: pci 0000:00:02.3: bridge window [mem 0x8000600000-0x80007fffff 64bit pref]
Sep 12 23:56:08.928666 kernel: pci 0000:05:00.0: BAR 4: assigned [mem 0x8000800000-0x8000803fff 64bit pref]
Sep 12 23:56:08.928803 kernel: pci 0000:00:02.4: PCI bridge to [bus 05]
Sep 12 23:56:08.928891 kernel: pci 0000:00:02.4: bridge window [io 0x5000-0x5fff]
Sep 12 23:56:08.928956 kernel: pci 0000:00:02.4: bridge window [mem 0x10800000-0x109fffff]
Sep 12 23:56:08.929024 kernel: pci 0000:00:02.4: bridge window [mem 0x8000800000-0x80009fffff 64bit pref]
Sep 12 23:56:08.929097 kernel: pci 0000:06:00.0: BAR 4: assigned [mem 0x8000a00000-0x8000a03fff 64bit pref]
Sep 12 23:56:08.929166 kernel: pci 0000:06:00.0: BAR 1: assigned [mem 0x10a00000-0x10a00fff]
Sep 12 23:56:08.929236 kernel: pci 0000:00:02.5: PCI bridge to [bus 06]
Sep 12 23:56:08.929326 kernel: pci 0000:00:02.5: bridge window [io 0x6000-0x6fff]
Sep 12 23:56:08.929401 kernel: pci 0000:00:02.5: bridge window [mem 0x10a00000-0x10bfffff]
Sep 12 23:56:08.929495 kernel: pci 0000:00:02.5: bridge window [mem 0x8000a00000-0x8000bfffff 64bit pref]
Sep 12 23:56:08.929599 kernel: pci 0000:07:00.0: BAR 6: assigned [mem 0x10c00000-0x10c7ffff pref]
Sep 12 23:56:08.929686 kernel: pci 0000:07:00.0: BAR 4: assigned [mem 0x8000c00000-0x8000c03fff 64bit pref]
Sep 12 23:56:08.929941 kernel: pci 0000:07:00.0: BAR 1: assigned [mem 0x10c80000-0x10c80fff]
Sep 12 23:56:08.930026 kernel: pci 0000:00:02.6: PCI bridge to [bus 07]
Sep 12 23:56:08.930092 kernel: pci 0000:00:02.6: bridge window [io 0x7000-0x7fff]
Sep 12 23:56:08.930159 kernel: pci 0000:00:02.6: bridge window [mem 0x10c00000-0x10dfffff]
Sep 12 23:56:08.930249 kernel: pci 0000:00:02.6: bridge window [mem 0x8000c00000-0x8000dfffff 64bit pref]
Sep 12 23:56:08.930321 kernel: pci 0000:00:02.7: PCI bridge to [bus 08]
Sep 12 23:56:08.930385 kernel: pci 0000:00:02.7: bridge window [io 0x8000-0x8fff]
Sep 12 23:56:08.930450 kernel: pci 0000:00:02.7: bridge window [mem 0x10e00000-0x10ffffff]
Sep 12 23:56:08.930529 kernel: pci 0000:00:02.7: bridge window [mem 0x8000e00000-0x8000ffffff 64bit pref]
Sep 12 23:56:08.930604 kernel: pci 0000:00:03.0: PCI bridge to [bus 09]
Sep 12 23:56:08.930671 kernel: pci 0000:00:03.0: bridge window [io 0x9000-0x9fff]
Sep 12 23:56:08.930785 kernel: pci 0000:00:03.0: bridge window [mem 0x11000000-0x111fffff]
Sep 12 23:56:08.930861 kernel: pci 0000:00:03.0: bridge window [mem 0x8001000000-0x80011fffff 64bit pref]
Sep 12 23:56:08.930930 kernel: pci_bus 0000:00: resource 4 [mem 0x10000000-0x3efeffff window]
Sep 12 23:56:08.930989 kernel: pci_bus 0000:00: resource 5 [io 0x0000-0xffff window]
Sep 12 23:56:08.931047 kernel: pci_bus 0000:00: resource 6 [mem 0x8000000000-0xffffffffff window]
Sep 12 23:56:08.931125 kernel: pci_bus 0000:01: resource 0 [io 0x1000-0x1fff]
Sep 12 23:56:08.931187 kernel: pci_bus 0000:01: resource 1 [mem 0x10000000-0x101fffff]
Sep 12 23:56:08.931248 kernel: pci_bus 0000:01: resource 2 [mem 0x8000000000-0x80001fffff 64bit pref]
Sep 12 23:56:08.931319 kernel: pci_bus 0000:02: resource 0 [io 0x2000-0x2fff]
Sep 12 23:56:08.931401 kernel: pci_bus 0000:02: resource 1 [mem 0x10200000-0x103fffff]
Sep 12 23:56:08.931464 kernel: pci_bus 0000:02: resource 2 [mem 0x8000200000-0x80003fffff 64bit pref]
Sep 12 23:56:08.931544 kernel: pci_bus 0000:03: resource 0 [io 0x3000-0x3fff]
Sep 12 23:56:08.931607 kernel: pci_bus 0000:03: resource 1 [mem 0x10400000-0x105fffff]
Sep 12 23:56:08.931670 kernel: pci_bus 0000:03: resource 2 [mem 0x8000400000-0x80005fffff 64bit pref]
Sep 12 23:56:08.931768 kernel: pci_bus 0000:04: resource 0 [io 0x4000-0x4fff]
Sep 12 23:56:08.931836 kernel: pci_bus 0000:04: resource 1 [mem 0x10600000-0x107fffff]
Sep 12 23:56:08.931899 kernel: pci_bus 0000:04: resource 2 [mem 0x8000600000-0x80007fffff 64bit pref]
Sep 12 23:56:08.931978 kernel: pci_bus 0000:05: resource 0 [io 0x5000-0x5fff]
Sep 12 23:56:08.932042 kernel: pci_bus 0000:05: resource 1 [mem 0x10800000-0x109fffff]
Sep 12 23:56:08.932102 kernel: pci_bus 0000:05: resource 2 [mem 0x8000800000-0x80009fffff 64bit pref]
Sep 12 23:56:08.932172 kernel: pci_bus 0000:06: resource 0 [io 0x6000-0x6fff]
Sep 12 23:56:08.932238 kernel: pci_bus 0000:06: resource 1 [mem 0x10a00000-0x10bfffff]
Sep 12 23:56:08.932300 kernel: pci_bus 0000:06: resource 2 [mem 0x8000a00000-0x8000bfffff 64bit pref]
Sep 12 23:56:08.932371 kernel: pci_bus 0000:07: resource 0 [io 0x7000-0x7fff]
Sep 12 23:56:08.932452 kernel: pci_bus 0000:07: resource 1 [mem 0x10c00000-0x10dfffff]
Sep 12 23:56:08.932562 kernel: pci_bus 0000:07: resource 2 [mem 0x8000c00000-0x8000dfffff 64bit pref]
Sep 12 23:56:08.932647 kernel: pci_bus 0000:08: resource 0 [io 0x8000-0x8fff]
Sep 12 23:56:08.934316 kernel: pci_bus 0000:08: resource 1 [mem 0x10e00000-0x10ffffff]
Sep 12 23:56:08.934430 kernel: pci_bus 0000:08: resource 2 [mem 0x8000e00000-0x8000ffffff 64bit pref]
Sep 12 23:56:08.934563 kernel: pci_bus 0000:09: resource 0 [io 0x9000-0x9fff]
Sep 12 23:56:08.934691 kernel: pci_bus 0000:09: resource 1 [mem 0x11000000-0x111fffff]
Sep 12 23:56:08.934800 kernel: pci_bus 0000:09: resource 2 [mem 0x8001000000-0x80011fffff 64bit pref]
Sep 12 23:56:08.934819 kernel: ACPI: PCI: Interrupt link GSI0 configured for IRQ 35
Sep 12 23:56:08.934828 kernel: ACPI: PCI: Interrupt link GSI1 configured for IRQ 36
Sep 12 23:56:08.934836 kernel: ACPI: PCI: Interrupt link GSI2 configured for IRQ 37
Sep 12 23:56:08.934843 kernel: ACPI: PCI: Interrupt link GSI3 configured for IRQ 38
Sep 12 23:56:08.934851 kernel: iommu: Default domain type: Translated
Sep 12 23:56:08.934859 kernel: iommu: DMA domain TLB invalidation policy: strict mode
Sep 12 23:56:08.934867 kernel: efivars: Registered efivars operations
Sep 12 23:56:08.934875 kernel: vgaarb: loaded
Sep 12 23:56:08.934883 kernel: clocksource: Switched to clocksource arch_sys_counter
Sep 12 23:56:08.934891 kernel: VFS: Disk quotas dquot_6.6.0
Sep 12 23:56:08.934901 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Sep 12 23:56:08.934909 kernel: pnp: PnP ACPI init
Sep 12 23:56:08.934988 kernel: system 00:00: [mem 0x4010000000-0x401fffffff window] could not be reserved
Sep 12 23:56:08.935001 kernel: pnp: PnP ACPI: found 1 devices
Sep 12 23:56:08.935008 kernel: NET: Registered PF_INET protocol family
Sep 12 23:56:08.935016 kernel: IP idents hash table entries: 65536 (order: 7, 524288 bytes, linear)
Sep 12 23:56:08.935024 kernel: tcp_listen_portaddr_hash hash table entries: 2048 (order: 3, 32768 bytes, linear)
Sep 12 23:56:08.935032 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Sep 12 23:56:08.935043 kernel: TCP established hash table entries: 32768 (order: 6, 262144 bytes, linear)
Sep 12 23:56:08.935050 kernel: TCP bind hash table entries: 32768 (order: 8, 1048576 bytes, linear)
Sep 12 23:56:08.935058 kernel: TCP: Hash tables configured (established 32768
bind 32768) Sep 12 23:56:08.935066 kernel: UDP hash table entries: 2048 (order: 4, 65536 bytes, linear) Sep 12 23:56:08.935074 kernel: UDP-Lite hash table entries: 2048 (order: 4, 65536 bytes, linear) Sep 12 23:56:08.935082 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family Sep 12 23:56:08.935159 kernel: pci 0000:02:00.0: enabling device (0000 -> 0002) Sep 12 23:56:08.935171 kernel: PCI: CLS 0 bytes, default 64 Sep 12 23:56:08.935181 kernel: kvm [1]: HYP mode not available Sep 12 23:56:08.935189 kernel: Initialise system trusted keyrings Sep 12 23:56:08.935199 kernel: workingset: timestamp_bits=39 max_order=20 bucket_order=0 Sep 12 23:56:08.935207 kernel: Key type asymmetric registered Sep 12 23:56:08.935215 kernel: Asymmetric key parser 'x509' registered Sep 12 23:56:08.935223 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 250) Sep 12 23:56:08.935230 kernel: io scheduler mq-deadline registered Sep 12 23:56:08.935238 kernel: io scheduler kyber registered Sep 12 23:56:08.935246 kernel: io scheduler bfq registered Sep 12 23:56:08.935254 kernel: ACPI: \_SB_.PCI0.GSI2: Enabled at IRQ 37 Sep 12 23:56:08.935374 kernel: pcieport 0000:00:02.0: PME: Signaling with IRQ 50 Sep 12 23:56:08.935449 kernel: pcieport 0000:00:02.0: AER: enabled with IRQ 50 Sep 12 23:56:08.935531 kernel: pcieport 0000:00:02.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Sep 12 23:56:08.935610 kernel: pcieport 0000:00:02.1: PME: Signaling with IRQ 51 Sep 12 23:56:08.935678 kernel: pcieport 0000:00:02.1: AER: enabled with IRQ 51 Sep 12 23:56:08.936753 kernel: pcieport 0000:00:02.1: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Sep 12 23:56:08.936869 kernel: pcieport 0000:00:02.2: PME: Signaling with IRQ 52 Sep 12 23:56:08.936944 kernel: pcieport 0000:00:02.2: AER: enabled with IRQ 52 Sep 12 23:56:08.937012 kernel: pcieport 
0000:00:02.2: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Sep 12 23:56:08.937085 kernel: pcieport 0000:00:02.3: PME: Signaling with IRQ 53 Sep 12 23:56:08.937152 kernel: pcieport 0000:00:02.3: AER: enabled with IRQ 53 Sep 12 23:56:08.937218 kernel: pcieport 0000:00:02.3: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Sep 12 23:56:08.937311 kernel: pcieport 0000:00:02.4: PME: Signaling with IRQ 54 Sep 12 23:56:08.937379 kernel: pcieport 0000:00:02.4: AER: enabled with IRQ 54 Sep 12 23:56:08.937447 kernel: pcieport 0000:00:02.4: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Sep 12 23:56:08.937536 kernel: pcieport 0000:00:02.5: PME: Signaling with IRQ 55 Sep 12 23:56:08.937611 kernel: pcieport 0000:00:02.5: AER: enabled with IRQ 55 Sep 12 23:56:08.937679 kernel: pcieport 0000:00:02.5: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Sep 12 23:56:08.937771 kernel: pcieport 0000:00:02.6: PME: Signaling with IRQ 56 Sep 12 23:56:08.937841 kernel: pcieport 0000:00:02.6: AER: enabled with IRQ 56 Sep 12 23:56:08.937907 kernel: pcieport 0000:00:02.6: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Sep 12 23:56:08.937978 kernel: pcieport 0000:00:02.7: PME: Signaling with IRQ 57 Sep 12 23:56:08.938055 kernel: pcieport 0000:00:02.7: AER: enabled with IRQ 57 Sep 12 23:56:08.938125 kernel: pcieport 0000:00:02.7: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Sep 12 23:56:08.938139 kernel: ACPI: \_SB_.PCI0.GSI3: Enabled at IRQ 38 Sep 12 23:56:08.938210 kernel: pcieport 0000:00:03.0: PME: Signaling with IRQ 58 Sep 12 23:56:08.938280 kernel: pcieport 0000:00:03.0: AER: enabled 
with IRQ 58 Sep 12 23:56:08.938347 kernel: pcieport 0000:00:03.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Sep 12 23:56:08.938357 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXSYBUS:00/PNP0C0C:00/input/input0 Sep 12 23:56:08.938365 kernel: ACPI: button: Power Button [PWRB] Sep 12 23:56:08.938373 kernel: ACPI: \_SB_.PCI0.GSI1: Enabled at IRQ 36 Sep 12 23:56:08.938448 kernel: virtio-pci 0000:04:00.0: enabling device (0000 -> 0002) Sep 12 23:56:08.938587 kernel: virtio-pci 0000:07:00.0: enabling device (0000 -> 0002) Sep 12 23:56:08.938602 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Sep 12 23:56:08.938610 kernel: ACPI: \_SB_.PCI0.GSI0: Enabled at IRQ 35 Sep 12 23:56:08.938683 kernel: serial 0000:00:04.0: enabling device (0000 -> 0001) Sep 12 23:56:08.938695 kernel: 0000:00:04.0: ttyS0 at I/O 0xa000 (irq = 45, base_baud = 115200) is a 16550A Sep 12 23:56:08.938703 kernel: thunder_xcv, ver 1.0 Sep 12 23:56:08.938726 kernel: thunder_bgx, ver 1.0 Sep 12 23:56:08.938735 kernel: nicpf, ver 1.0 Sep 12 23:56:08.938748 kernel: nicvf, ver 1.0 Sep 12 23:56:08.938840 kernel: rtc-efi rtc-efi.0: registered as rtc0 Sep 12 23:56:08.938956 kernel: rtc-efi rtc-efi.0: setting system clock to 2025-09-12T23:56:08 UTC (1757721368) Sep 12 23:56:08.938971 kernel: hid: raw HID events driver (C) Jiri Kosina Sep 12 23:56:08.938979 kernel: hw perfevents: enabled with armv8_pmuv3_0 PMU driver, 7 counters available Sep 12 23:56:08.938987 kernel: watchdog: Delayed init of the lockup detector failed: -19 Sep 12 23:56:08.938995 kernel: watchdog: Hard watchdog permanently disabled Sep 12 23:56:08.939003 kernel: NET: Registered PF_INET6 protocol family Sep 12 23:56:08.939015 kernel: Segment Routing with IPv6 Sep 12 23:56:08.939023 kernel: In-situ OAM (IOAM) with IPv6 Sep 12 23:56:08.939031 kernel: NET: Registered PF_PACKET protocol family Sep 12 23:56:08.939039 kernel: Key type dns_resolver 
registered Sep 12 23:56:08.939047 kernel: registered taskstats version 1 Sep 12 23:56:08.939055 kernel: Loading compiled-in X.509 certificates Sep 12 23:56:08.939062 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.6.106-flatcar: 036ad4721a31543be5c000f2896b40d1e5515c6e' Sep 12 23:56:08.939070 kernel: Key type .fscrypt registered Sep 12 23:56:08.939078 kernel: Key type fscrypt-provisioning registered Sep 12 23:56:08.939087 kernel: ima: No TPM chip found, activating TPM-bypass! Sep 12 23:56:08.939095 kernel: ima: Allocated hash algorithm: sha1 Sep 12 23:56:08.939105 kernel: ima: No architecture policies found Sep 12 23:56:08.939113 kernel: alg: No test for fips(ansi_cprng) (fips_ansi_cprng) Sep 12 23:56:08.939121 kernel: clk: Disabling unused clocks Sep 12 23:56:08.939129 kernel: Freeing unused kernel memory: 39488K Sep 12 23:56:08.939137 kernel: Run /init as init process Sep 12 23:56:08.939144 kernel: with arguments: Sep 12 23:56:08.939152 kernel: /init Sep 12 23:56:08.939161 kernel: with environment: Sep 12 23:56:08.939169 kernel: HOME=/ Sep 12 23:56:08.939177 kernel: TERM=linux Sep 12 23:56:08.939184 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a Sep 12 23:56:08.939194 systemd[1]: systemd 255 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT default-hierarchy=unified) Sep 12 23:56:08.939204 systemd[1]: Detected virtualization kvm. Sep 12 23:56:08.939213 systemd[1]: Detected architecture arm64. Sep 12 23:56:08.939222 systemd[1]: Running in initrd. Sep 12 23:56:08.939230 systemd[1]: No hostname configured, using default hostname. Sep 12 23:56:08.939238 systemd[1]: Hostname set to . Sep 12 23:56:08.939247 systemd[1]: Initializing machine ID from VM UUID. 
Sep 12 23:56:08.939255 systemd[1]: Queued start job for default target initrd.target. Sep 12 23:56:08.939264 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Sep 12 23:56:08.939272 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Sep 12 23:56:08.939281 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... Sep 12 23:56:08.939291 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Sep 12 23:56:08.939299 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... Sep 12 23:56:08.939308 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... Sep 12 23:56:08.939318 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132... Sep 12 23:56:08.939331 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr... Sep 12 23:56:08.939340 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Sep 12 23:56:08.939348 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Sep 12 23:56:08.939358 systemd[1]: Reached target paths.target - Path Units. Sep 12 23:56:08.939367 systemd[1]: Reached target slices.target - Slice Units. Sep 12 23:56:08.939375 systemd[1]: Reached target swap.target - Swaps. Sep 12 23:56:08.939383 systemd[1]: Reached target timers.target - Timer Units. Sep 12 23:56:08.939391 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. Sep 12 23:56:08.939400 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. Sep 12 23:56:08.939408 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). 
Sep 12 23:56:08.939416 systemd[1]: Listening on systemd-journald.socket - Journal Socket. Sep 12 23:56:08.939424 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Sep 12 23:56:08.939434 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Sep 12 23:56:08.939443 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Sep 12 23:56:08.939451 systemd[1]: Reached target sockets.target - Socket Units. Sep 12 23:56:08.939459 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... Sep 12 23:56:08.939468 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Sep 12 23:56:08.939477 systemd[1]: Finished network-cleanup.service - Network Cleanup. Sep 12 23:56:08.939485 systemd[1]: Starting systemd-fsck-usr.service... Sep 12 23:56:08.939493 systemd[1]: Starting systemd-journald.service - Journal Service... Sep 12 23:56:08.939503 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Sep 12 23:56:08.939552 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Sep 12 23:56:08.939561 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. Sep 12 23:56:08.939598 systemd-journald[235]: Collecting audit messages is disabled. Sep 12 23:56:08.939624 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Sep 12 23:56:08.939632 systemd[1]: Finished systemd-fsck-usr.service. Sep 12 23:56:08.939641 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Sep 12 23:56:08.939650 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Sep 12 23:56:08.939659 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. 
Sep 12 23:56:08.939669 kernel: Bridge firewalling registered Sep 12 23:56:08.939678 systemd-journald[235]: Journal started Sep 12 23:56:08.939698 systemd-journald[235]: Runtime Journal (/run/log/journal/e9dddb51187f473fafc9db20b1b43a71) is 8.0M, max 76.6M, 68.6M free. Sep 12 23:56:08.915985 systemd-modules-load[236]: Inserted module 'overlay' Sep 12 23:56:08.936406 systemd-modules-load[236]: Inserted module 'br_netfilter' Sep 12 23:56:08.951205 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Sep 12 23:56:08.951231 systemd[1]: Started systemd-journald.service - Journal Service. Sep 12 23:56:08.954085 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Sep 12 23:56:08.955056 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Sep 12 23:56:08.961099 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Sep 12 23:56:08.967022 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Sep 12 23:56:08.969963 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Sep 12 23:56:08.974939 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Sep 12 23:56:08.988170 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Sep 12 23:56:08.997838 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Sep 12 23:56:09.003018 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... Sep 12 23:56:09.004929 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Sep 12 23:56:09.009899 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... 
Sep 12 23:56:09.020227 dracut-cmdline[271]: dracut-dracut-053 Sep 12 23:56:09.025212 dracut-cmdline[271]: Using kernel command line parameters: rd.driver.pre=btrfs BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyAMA0,115200n8 flatcar.first_boot=detected acpi=force flatcar.oem.id=hetzner verity.usrhash=e1b46f3c9e154636c32f6cde6e746a00a6b37ca7432cb4e16d172c05f584a8c9 Sep 12 23:56:09.052268 systemd-resolved[277]: Positive Trust Anchors: Sep 12 23:56:09.052283 systemd-resolved[277]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Sep 12 23:56:09.052315 systemd-resolved[277]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Sep 12 23:56:09.063303 systemd-resolved[277]: Defaulting to hostname 'linux'. Sep 12 23:56:09.065545 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Sep 12 23:56:09.066409 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Sep 12 23:56:09.111783 kernel: SCSI subsystem initialized Sep 12 23:56:09.117752 kernel: Loading iSCSI transport class v2.0-870. Sep 12 23:56:09.129788 kernel: iscsi: registered transport (tcp) Sep 12 23:56:09.144378 kernel: iscsi: registered transport (qla4xxx) Sep 12 23:56:09.144484 kernel: QLogic iSCSI HBA Driver Sep 12 23:56:09.193752 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. 
Sep 12 23:56:09.200898 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... Sep 12 23:56:09.224094 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. Sep 12 23:56:09.224176 kernel: device-mapper: uevent: version 1.0.3 Sep 12 23:56:09.224801 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@redhat.com Sep 12 23:56:09.275744 kernel: raid6: neonx8 gen() 15410 MB/s Sep 12 23:56:09.292763 kernel: raid6: neonx4 gen() 15442 MB/s Sep 12 23:56:09.309771 kernel: raid6: neonx2 gen() 12984 MB/s Sep 12 23:56:09.326771 kernel: raid6: neonx1 gen() 10313 MB/s Sep 12 23:56:09.343776 kernel: raid6: int64x8 gen() 6862 MB/s Sep 12 23:56:09.360769 kernel: raid6: int64x4 gen() 7236 MB/s Sep 12 23:56:09.377780 kernel: raid6: int64x2 gen() 6042 MB/s Sep 12 23:56:09.394778 kernel: raid6: int64x1 gen() 4957 MB/s Sep 12 23:56:09.394862 kernel: raid6: using algorithm neonx4 gen() 15442 MB/s Sep 12 23:56:09.411786 kernel: raid6: .... xor() 11876 MB/s, rmw enabled Sep 12 23:56:09.411883 kernel: raid6: using neon recovery algorithm Sep 12 23:56:09.417759 kernel: xor: measuring software checksum speed Sep 12 23:56:09.417835 kernel: 8regs : 19735 MB/sec Sep 12 23:56:09.417846 kernel: 32regs : 17499 MB/sec Sep 12 23:56:09.418870 kernel: arm64_neon : 26963 MB/sec Sep 12 23:56:09.418920 kernel: xor: using function: arm64_neon (26963 MB/sec) Sep 12 23:56:09.471756 kernel: Btrfs loaded, zoned=no, fsverity=no Sep 12 23:56:09.486399 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. Sep 12 23:56:09.493024 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Sep 12 23:56:09.511546 systemd-udevd[456]: Using default interface naming scheme 'v255'. Sep 12 23:56:09.516114 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. 
Sep 12 23:56:09.527102 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... Sep 12 23:56:09.544984 dracut-pre-trigger[465]: rd.md=0: removing MD RAID activation Sep 12 23:56:09.586696 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. Sep 12 23:56:09.595081 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Sep 12 23:56:09.660979 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Sep 12 23:56:09.667953 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... Sep 12 23:56:09.697773 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. Sep 12 23:56:09.700194 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. Sep 12 23:56:09.701026 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Sep 12 23:56:09.704599 systemd[1]: Reached target remote-fs.target - Remote File Systems. Sep 12 23:56:09.709992 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... Sep 12 23:56:09.727986 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. 
Sep 12 23:56:09.757165 kernel: ACPI: bus type USB registered Sep 12 23:56:09.757223 kernel: usbcore: registered new interface driver usbfs Sep 12 23:56:09.757247 kernel: usbcore: registered new interface driver hub Sep 12 23:56:09.758769 kernel: usbcore: registered new device driver usb Sep 12 23:56:09.807745 kernel: scsi host0: Virtio SCSI HBA Sep 12 23:56:09.815772 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller Sep 12 23:56:09.816023 kernel: scsi 0:0:0:0: CD-ROM QEMU QEMU CD-ROM 2.5+ PQ: 0 ANSI: 5 Sep 12 23:56:09.816053 kernel: scsi 0:0:0:1: Direct-Access QEMU QEMU HARDDISK 2.5+ PQ: 0 ANSI: 5 Sep 12 23:56:09.816069 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 1 Sep 12 23:56:09.816165 kernel: xhci_hcd 0000:02:00.0: hcc params 0x00087001 hci version 0x100 quirks 0x0000000000000010 Sep 12 23:56:09.814646 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Sep 12 23:56:09.821743 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller Sep 12 23:56:09.821915 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 2 Sep 12 23:56:09.821998 kernel: xhci_hcd 0000:02:00.0: Host supports USB 3.0 SuperSpeed Sep 12 23:56:09.822077 kernel: hub 1-0:1.0: USB hub found Sep 12 23:56:09.822202 kernel: hub 1-0:1.0: 4 ports detected Sep 12 23:56:09.814800 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Sep 12 23:56:09.822315 systemd[1]: Stopping dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Sep 12 23:56:09.826438 kernel: usb usb2: We don't know the algorithms for LPM for this host, disabling LPM. Sep 12 23:56:09.825411 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Sep 12 23:56:09.829449 kernel: hub 2-0:1.0: USB hub found Sep 12 23:56:09.829633 kernel: hub 2-0:1.0: 4 ports detected Sep 12 23:56:09.825609 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. 
Sep 12 23:56:09.827471 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... Sep 12 23:56:09.840021 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Sep 12 23:56:09.855024 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Sep 12 23:56:09.862922 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Sep 12 23:56:09.866411 kernel: sr 0:0:0:0: Power-on or device reset occurred Sep 12 23:56:09.866598 kernel: sr 0:0:0:0: [sr0] scsi3-mmc drive: 16x/50x cd/rw xa/form2 cdda tray Sep 12 23:56:09.866693 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20 Sep 12 23:56:09.868770 kernel: sr 0:0:0:0: Attached scsi CD-ROM sr0 Sep 12 23:56:09.881823 kernel: sd 0:0:0:1: Power-on or device reset occurred Sep 12 23:56:09.884604 kernel: sd 0:0:0:1: [sda] 80003072 512-byte logical blocks: (41.0 GB/38.1 GiB) Sep 12 23:56:09.884900 kernel: sd 0:0:0:1: [sda] Write Protect is off Sep 12 23:56:09.885023 kernel: sd 0:0:0:1: [sda] Mode Sense: 63 00 00 08 Sep 12 23:56:09.885112 kernel: sd 0:0:0:1: [sda] Write cache: enabled, read cache: enabled, doesn't support DPO or FUA Sep 12 23:56:09.893815 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk. Sep 12 23:56:09.893882 kernel: GPT:17805311 != 80003071 Sep 12 23:56:09.893894 kernel: GPT:Alternate GPT header not at the end of the disk. Sep 12 23:56:09.893906 kernel: GPT:17805311 != 80003071 Sep 12 23:56:09.895121 kernel: GPT: Use GNU Parted to correct GPT errors. Sep 12 23:56:09.895737 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Sep 12 23:56:09.896735 kernel: sd 0:0:0:1: [sda] Attached SCSI disk Sep 12 23:56:09.903753 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. 
Sep 12 23:56:09.951560 kernel: BTRFS: device label OEM devid 1 transid 9 /dev/sda6 scanned by (udev-worker) (514) Sep 12 23:56:09.951620 kernel: BTRFS: device fsid 29bc4da8-c689-46a2-a16a-b7bbc722db77 devid 1 transid 37 /dev/sda3 scanned by (udev-worker) (512) Sep 12 23:56:09.959355 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - QEMU_HARDDISK EFI-SYSTEM. Sep 12 23:56:09.967534 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - QEMU_HARDDISK ROOT. Sep 12 23:56:09.973851 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - QEMU_HARDDISK USR-A. Sep 12 23:56:09.975316 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - QEMU_HARDDISK USR-A. Sep 12 23:56:09.983790 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - QEMU_HARDDISK OEM. Sep 12 23:56:09.994011 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... Sep 12 23:56:10.000971 disk-uuid[574]: Primary Header is updated. Sep 12 23:56:10.000971 disk-uuid[574]: Secondary Entries is updated. Sep 12 23:56:10.000971 disk-uuid[574]: Secondary Header is updated. 
Sep 12 23:56:10.007773 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Sep 12 23:56:10.013751 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Sep 12 23:56:10.021771 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Sep 12 23:56:10.061856 kernel: usb 1-1: new high-speed USB device number 2 using xhci_hcd Sep 12 23:56:10.201731 kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:02.1/0000:02:00.0/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input1 Sep 12 23:56:10.207736 kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:02:00.0-1/input0 Sep 12 23:56:10.208000 kernel: usbcore: registered new interface driver usbhid Sep 12 23:56:10.210106 kernel: usbhid: USB HID core driver Sep 12 23:56:10.305747 kernel: usb 1-2: new high-speed USB device number 3 using xhci_hcd Sep 12 23:56:10.434746 kernel: input: QEMU QEMU USB Keyboard as /devices/pci0000:00/0000:00:02.1/0000:02:00.0/usb1/1-2/1-2:1.0/0003:0627:0001.0002/input/input2 Sep 12 23:56:10.487786 kernel: hid-generic 0003:0627:0001.0002: input,hidraw1: USB HID v1.11 Keyboard [QEMU QEMU USB Keyboard] on usb-0000:02:00.0-2/input0 Sep 12 23:56:11.030899 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Sep 12 23:56:11.031623 disk-uuid[575]: The operation has completed successfully. Sep 12 23:56:11.086992 systemd[1]: disk-uuid.service: Deactivated successfully. Sep 12 23:56:11.087117 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. Sep 12 23:56:11.109110 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr... Sep 12 23:56:11.115254 sh[592]: Success Sep 12 23:56:11.130842 kernel: device-mapper: verity: sha256 using implementation "sha256-ce" Sep 12 23:56:11.194604 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr. Sep 12 23:56:11.208104 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr... 
Sep 12 23:56:11.213151 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr. Sep 12 23:56:11.229369 kernel: BTRFS info (device dm-0): first mount of filesystem 29bc4da8-c689-46a2-a16a-b7bbc722db77 Sep 12 23:56:11.229435 kernel: BTRFS info (device dm-0): using crc32c (crc32c-generic) checksum algorithm Sep 12 23:56:11.229450 kernel: BTRFS warning (device dm-0): 'nologreplay' is deprecated, use 'rescue=nologreplay' instead Sep 12 23:56:11.230758 kernel: BTRFS info (device dm-0): disabling log replay at mount time Sep 12 23:56:11.230808 kernel: BTRFS info (device dm-0): using free space tree Sep 12 23:56:11.237880 kernel: BTRFS info (device dm-0): enabling ssd optimizations Sep 12 23:56:11.240168 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr. Sep 12 23:56:11.240991 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met. Sep 12 23:56:11.247158 systemd[1]: Starting ignition-setup.service - Ignition (setup)... Sep 12 23:56:11.253002 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... Sep 12 23:56:11.269417 kernel: BTRFS info (device sda6): first mount of filesystem abbcf5a1-cc71-42ce-94f9-860f3aeda368 Sep 12 23:56:11.269556 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm Sep 12 23:56:11.269582 kernel: BTRFS info (device sda6): using free space tree Sep 12 23:56:11.276851 kernel: BTRFS info (device sda6): enabling ssd optimizations Sep 12 23:56:11.276919 kernel: BTRFS info (device sda6): auto enabling async discard Sep 12 23:56:11.290741 kernel: BTRFS info (device sda6): last unmount of filesystem abbcf5a1-cc71-42ce-94f9-860f3aeda368 Sep 12 23:56:11.291441 systemd[1]: mnt-oem.mount: Deactivated successfully. Sep 12 23:56:11.300189 systemd[1]: Finished ignition-setup.service - Ignition (setup). 
Sep 12 23:56:11.304931 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... Sep 12 23:56:11.393056 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Sep 12 23:56:11.400096 systemd[1]: Starting systemd-networkd.service - Network Configuration... Sep 12 23:56:11.438195 ignition[684]: Ignition 2.19.0 Sep 12 23:56:11.438208 ignition[684]: Stage: fetch-offline Sep 12 23:56:11.440920 systemd-networkd[779]: lo: Link UP Sep 12 23:56:11.438250 ignition[684]: no configs at "/usr/lib/ignition/base.d" Sep 12 23:56:11.440924 systemd-networkd[779]: lo: Gained carrier Sep 12 23:56:11.438260 ignition[684]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Sep 12 23:56:11.441876 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). Sep 12 23:56:11.438432 ignition[684]: parsed url from cmdline: "" Sep 12 23:56:11.442743 systemd-networkd[779]: Enumeration completed Sep 12 23:56:11.438436 ignition[684]: no config URL provided Sep 12 23:56:11.443533 systemd-networkd[779]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Sep 12 23:56:11.438440 ignition[684]: reading system config file "/usr/lib/ignition/user.ign" Sep 12 23:56:11.443538 systemd-networkd[779]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Sep 12 23:56:11.438447 ignition[684]: no config at "/usr/lib/ignition/user.ign" Sep 12 23:56:11.444231 systemd[1]: Started systemd-networkd.service - Network Configuration. Sep 12 23:56:11.438452 ignition[684]: failed to fetch config: resource requires networking Sep 12 23:56:11.446578 systemd-networkd[779]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. 
Sep 12 23:56:11.438670 ignition[684]: Ignition finished successfully Sep 12 23:56:11.446581 systemd-networkd[779]: eth1: Configuring with /usr/lib/systemd/network/zz-default.network. Sep 12 23:56:11.447187 systemd[1]: Reached target network.target - Network. Sep 12 23:56:11.447234 systemd-networkd[779]: eth0: Link UP Sep 12 23:56:11.447237 systemd-networkd[779]: eth0: Gained carrier Sep 12 23:56:11.447245 systemd-networkd[779]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Sep 12 23:56:11.456104 systemd-networkd[779]: eth1: Link UP Sep 12 23:56:11.456107 systemd-networkd[779]: eth1: Gained carrier Sep 12 23:56:11.456119 systemd-networkd[779]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Sep 12 23:56:11.456936 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)... Sep 12 23:56:11.473605 ignition[782]: Ignition 2.19.0 Sep 12 23:56:11.473617 ignition[782]: Stage: fetch Sep 12 23:56:11.473924 ignition[782]: no configs at "/usr/lib/ignition/base.d" Sep 12 23:56:11.473935 ignition[782]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Sep 12 23:56:11.474064 ignition[782]: parsed url from cmdline: "" Sep 12 23:56:11.474070 ignition[782]: no config URL provided Sep 12 23:56:11.474075 ignition[782]: reading system config file "/usr/lib/ignition/user.ign" Sep 12 23:56:11.474083 ignition[782]: no config at "/usr/lib/ignition/user.ign" Sep 12 23:56:11.474113 ignition[782]: GET http://169.254.169.254/hetzner/v1/userdata: attempt #1 Sep 12 23:56:11.474592 ignition[782]: GET error: Get "http://169.254.169.254/hetzner/v1/userdata": dial tcp 169.254.169.254:80: connect: network is unreachable Sep 12 23:56:11.504825 systemd-networkd[779]: eth1: DHCPv4 address 10.0.0.3/32 acquired from 10.0.0.1 Sep 12 23:56:11.518894 systemd-networkd[779]: eth0: DHCPv4 address 91.99.152.252/32, gateway 172.31.1.1 
acquired from 172.31.1.1 Sep 12 23:56:11.674803 ignition[782]: GET http://169.254.169.254/hetzner/v1/userdata: attempt #2 Sep 12 23:56:11.682968 ignition[782]: GET result: OK Sep 12 23:56:11.683275 ignition[782]: parsing config with SHA512: 243da4112d671121768d7fb4ec584853e9f1786a0caf8de9c1ddfecefa14d25ca815eb01a2efd19bd3e717bf17f31d6a6ffcd566e965e6ebe3a7d9e9bc9a096f Sep 12 23:56:11.691201 unknown[782]: fetched base config from "system" Sep 12 23:56:11.691224 unknown[782]: fetched base config from "system" Sep 12 23:56:11.691973 ignition[782]: fetch: fetch complete Sep 12 23:56:11.691230 unknown[782]: fetched user config from "hetzner" Sep 12 23:56:11.691979 ignition[782]: fetch: fetch passed Sep 12 23:56:11.692062 ignition[782]: Ignition finished successfully Sep 12 23:56:11.694439 systemd[1]: Finished ignition-fetch.service - Ignition (fetch). Sep 12 23:56:11.704050 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... Sep 12 23:56:11.721201 ignition[789]: Ignition 2.19.0 Sep 12 23:56:11.721224 ignition[789]: Stage: kargs Sep 12 23:56:11.721637 ignition[789]: no configs at "/usr/lib/ignition/base.d" Sep 12 23:56:11.721656 ignition[789]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Sep 12 23:56:11.727694 ignition[789]: kargs: kargs passed Sep 12 23:56:11.728835 ignition[789]: Ignition finished successfully Sep 12 23:56:11.731058 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). Sep 12 23:56:11.737183 systemd[1]: Starting ignition-disks.service - Ignition (disks)... Sep 12 23:56:11.752251 ignition[795]: Ignition 2.19.0 Sep 12 23:56:11.752261 ignition[795]: Stage: disks Sep 12 23:56:11.752522 ignition[795]: no configs at "/usr/lib/ignition/base.d" Sep 12 23:56:11.752534 ignition[795]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Sep 12 23:56:11.753632 ignition[795]: disks: disks passed Sep 12 23:56:11.755389 systemd[1]: Finished ignition-disks.service - Ignition (disks). 
Sep 12 23:56:11.753690 ignition[795]: Ignition finished successfully Sep 12 23:56:11.757144 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. Sep 12 23:56:11.758880 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. Sep 12 23:56:11.760154 systemd[1]: Reached target local-fs.target - Local File Systems. Sep 12 23:56:11.761445 systemd[1]: Reached target sysinit.target - System Initialization. Sep 12 23:56:11.762792 systemd[1]: Reached target basic.target - Basic System. Sep 12 23:56:11.770050 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... Sep 12 23:56:11.793615 systemd-fsck[804]: ROOT: clean, 14/1628000 files, 120691/1617920 blocks Sep 12 23:56:11.799323 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. Sep 12 23:56:11.807949 systemd[1]: Mounting sysroot.mount - /sysroot... Sep 12 23:56:11.870779 kernel: EXT4-fs (sda9): mounted filesystem d35fd879-6758-447b-9fdd-bb21dd7c5b2b r/w with ordered data mode. Quota mode: none. Sep 12 23:56:11.872590 systemd[1]: Mounted sysroot.mount - /sysroot. Sep 12 23:56:11.874471 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. Sep 12 23:56:11.882947 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Sep 12 23:56:11.889026 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... Sep 12 23:56:11.898035 systemd[1]: Starting flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent... Sep 12 23:56:11.901823 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). 
Sep 12 23:56:11.905503 kernel: BTRFS: device label OEM devid 1 transid 10 /dev/sda6 scanned by mount (812) Sep 12 23:56:11.905531 kernel: BTRFS info (device sda6): first mount of filesystem abbcf5a1-cc71-42ce-94f9-860f3aeda368 Sep 12 23:56:11.901989 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. Sep 12 23:56:11.909454 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm Sep 12 23:56:11.909634 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. Sep 12 23:56:11.913742 kernel: BTRFS info (device sda6): using free space tree Sep 12 23:56:11.918994 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... Sep 12 23:56:11.932752 kernel: BTRFS info (device sda6): enabling ssd optimizations Sep 12 23:56:11.932830 kernel: BTRFS info (device sda6): auto enabling async discard Sep 12 23:56:11.937798 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Sep 12 23:56:11.978743 initrd-setup-root[839]: cut: /sysroot/etc/passwd: No such file or directory Sep 12 23:56:11.982993 coreos-metadata[814]: Sep 12 23:56:11.982 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/hostname: Attempt #1 Sep 12 23:56:11.987416 initrd-setup-root[846]: cut: /sysroot/etc/group: No such file or directory Sep 12 23:56:11.988460 coreos-metadata[814]: Sep 12 23:56:11.986 INFO Fetch successful Sep 12 23:56:11.988460 coreos-metadata[814]: Sep 12 23:56:11.986 INFO wrote hostname ci-4081-3-5-n-326e2e5946 to /sysroot/etc/hostname Sep 12 23:56:11.991632 systemd[1]: Finished flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent. Sep 12 23:56:11.996792 initrd-setup-root[854]: cut: /sysroot/etc/shadow: No such file or directory Sep 12 23:56:12.002725 initrd-setup-root[861]: cut: /sysroot/etc/gshadow: No such file or directory Sep 12 23:56:12.122769 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. Sep 12 23:56:12.134020 systemd[1]: Starting ignition-mount.service - Ignition (mount)... 
Sep 12 23:56:12.137994 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... Sep 12 23:56:12.147948 kernel: BTRFS info (device sda6): last unmount of filesystem abbcf5a1-cc71-42ce-94f9-860f3aeda368 Sep 12 23:56:12.174264 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. Sep 12 23:56:12.175788 ignition[929]: INFO : Ignition 2.19.0 Sep 12 23:56:12.177757 ignition[929]: INFO : Stage: mount Sep 12 23:56:12.177757 ignition[929]: INFO : no configs at "/usr/lib/ignition/base.d" Sep 12 23:56:12.177757 ignition[929]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Sep 12 23:56:12.180827 ignition[929]: INFO : mount: mount passed Sep 12 23:56:12.180827 ignition[929]: INFO : Ignition finished successfully Sep 12 23:56:12.180117 systemd[1]: Finished ignition-mount.service - Ignition (mount). Sep 12 23:56:12.189929 systemd[1]: Starting ignition-files.service - Ignition (files)... Sep 12 23:56:12.230309 systemd[1]: sysroot-oem.mount: Deactivated successfully. Sep 12 23:56:12.238078 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Sep 12 23:56:12.253985 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/sda6 scanned by mount (940) Sep 12 23:56:12.256156 kernel: BTRFS info (device sda6): first mount of filesystem abbcf5a1-cc71-42ce-94f9-860f3aeda368 Sep 12 23:56:12.256221 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm Sep 12 23:56:12.256236 kernel: BTRFS info (device sda6): using free space tree Sep 12 23:56:12.260814 kernel: BTRFS info (device sda6): enabling ssd optimizations Sep 12 23:56:12.260891 kernel: BTRFS info (device sda6): auto enabling async discard Sep 12 23:56:12.264394 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. 
Sep 12 23:56:12.290272 ignition[958]: INFO : Ignition 2.19.0 Sep 12 23:56:12.290272 ignition[958]: INFO : Stage: files Sep 12 23:56:12.293091 ignition[958]: INFO : no configs at "/usr/lib/ignition/base.d" Sep 12 23:56:12.293091 ignition[958]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Sep 12 23:56:12.293091 ignition[958]: DEBUG : files: compiled without relabeling support, skipping Sep 12 23:56:12.293091 ignition[958]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core" Sep 12 23:56:12.293091 ignition[958]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core" Sep 12 23:56:12.299067 ignition[958]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core" Sep 12 23:56:12.299067 ignition[958]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core" Sep 12 23:56:12.301282 unknown[958]: wrote ssh authorized keys file for user: core Sep 12 23:56:12.302914 ignition[958]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core" Sep 12 23:56:12.304760 ignition[958]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.17.0-linux-arm64.tar.gz" Sep 12 23:56:12.306172 ignition[958]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.17.0-linux-arm64.tar.gz: attempt #1 Sep 12 23:56:12.431893 ignition[958]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK Sep 12 23:56:12.473983 systemd-networkd[779]: eth0: Gained IPv6LL Sep 12 23:56:12.707809 ignition[958]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.0-linux-arm64.tar.gz" Sep 12 23:56:12.707809 ignition[958]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh" Sep 12 23:56:12.710854 ignition[958]: INFO : files: createFilesystemsFiles: createFiles: op(4): 
[finished] writing file "/sysroot/home/core/install.sh" Sep 12 23:56:12.710854 ignition[958]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml" Sep 12 23:56:12.710854 ignition[958]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml" Sep 12 23:56:12.710854 ignition[958]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml" Sep 12 23:56:12.710854 ignition[958]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml" Sep 12 23:56:12.710854 ignition[958]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml" Sep 12 23:56:12.710854 ignition[958]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml" Sep 12 23:56:12.710854 ignition[958]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf" Sep 12 23:56:12.710854 ignition[958]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf" Sep 12 23:56:12.710854 ignition[958]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.32.4-arm64.raw" Sep 12 23:56:12.710854 ignition[958]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.32.4-arm64.raw" Sep 12 23:56:12.710854 ignition[958]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.32.4-arm64.raw" Sep 12 23:56:12.710854 ignition[958]: INFO : files: createFilesystemsFiles: createFiles: 
op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.32.4-arm64.raw: attempt #1 Sep 12 23:56:13.079598 ignition[958]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK Sep 12 23:56:13.498068 systemd-networkd[779]: eth1: Gained IPv6LL Sep 12 23:56:14.165575 ignition[958]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.32.4-arm64.raw" Sep 12 23:56:14.167985 ignition[958]: INFO : files: op(b): [started] processing unit "prepare-helm.service" Sep 12 23:56:14.167985 ignition[958]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Sep 12 23:56:14.167985 ignition[958]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Sep 12 23:56:14.167985 ignition[958]: INFO : files: op(b): [finished] processing unit "prepare-helm.service" Sep 12 23:56:14.167985 ignition[958]: INFO : files: op(d): [started] processing unit "coreos-metadata.service" Sep 12 23:56:14.167985 ignition[958]: INFO : files: op(d): op(e): [started] writing systemd drop-in "00-custom-metadata.conf" at "/sysroot/etc/systemd/system/coreos-metadata.service.d/00-custom-metadata.conf" Sep 12 23:56:14.167985 ignition[958]: INFO : files: op(d): op(e): [finished] writing systemd drop-in "00-custom-metadata.conf" at "/sysroot/etc/systemd/system/coreos-metadata.service.d/00-custom-metadata.conf" Sep 12 23:56:14.167985 ignition[958]: INFO : files: op(d): [finished] processing unit "coreos-metadata.service" Sep 12 23:56:14.167985 ignition[958]: INFO : files: op(f): [started] setting preset to enabled for "prepare-helm.service" Sep 12 23:56:14.167985 ignition[958]: INFO : files: op(f): [finished] setting preset to enabled for "prepare-helm.service" Sep 12 23:56:14.167985 ignition[958]: INFO : files: createResultFile: createFiles: op(10): [started] 
writing file "/sysroot/etc/.ignition-result.json" Sep 12 23:56:14.183003 ignition[958]: INFO : files: createResultFile: createFiles: op(10): [finished] writing file "/sysroot/etc/.ignition-result.json" Sep 12 23:56:14.183003 ignition[958]: INFO : files: files passed Sep 12 23:56:14.183003 ignition[958]: INFO : Ignition finished successfully Sep 12 23:56:14.170338 systemd[1]: Finished ignition-files.service - Ignition (files). Sep 12 23:56:14.178066 systemd[1]: Starting ignition-quench.service - Ignition (record completion)... Sep 12 23:56:14.184923 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion... Sep 12 23:56:14.191392 systemd[1]: ignition-quench.service: Deactivated successfully. Sep 12 23:56:14.191566 systemd[1]: Finished ignition-quench.service - Ignition (record completion). Sep 12 23:56:14.212775 initrd-setup-root-after-ignition[986]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Sep 12 23:56:14.212775 initrd-setup-root-after-ignition[986]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory Sep 12 23:56:14.215655 initrd-setup-root-after-ignition[990]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Sep 12 23:56:14.218376 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion. Sep 12 23:56:14.219625 systemd[1]: Reached target ignition-complete.target - Ignition Complete. Sep 12 23:56:14.231979 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root... Sep 12 23:56:14.271081 systemd[1]: initrd-parse-etc.service: Deactivated successfully. Sep 12 23:56:14.271287 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root. Sep 12 23:56:14.274290 systemd[1]: Reached target initrd-fs.target - Initrd File Systems. Sep 12 23:56:14.275791 systemd[1]: Reached target initrd.target - Initrd Default Target. 
Sep 12 23:56:14.277382 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met. Sep 12 23:56:14.282994 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook... Sep 12 23:56:14.300760 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Sep 12 23:56:14.309038 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons... Sep 12 23:56:14.320206 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups. Sep 12 23:56:14.322920 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes. Sep 12 23:56:14.325045 systemd[1]: Stopped target timers.target - Timer Units. Sep 12 23:56:14.325962 systemd[1]: dracut-pre-pivot.service: Deactivated successfully. Sep 12 23:56:14.326117 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Sep 12 23:56:14.328021 systemd[1]: Stopped target initrd.target - Initrd Default Target. Sep 12 23:56:14.330090 systemd[1]: Stopped target basic.target - Basic System. Sep 12 23:56:14.330838 systemd[1]: Stopped target ignition-complete.target - Ignition Complete. Sep 12 23:56:14.331994 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup. Sep 12 23:56:14.333284 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device. Sep 12 23:56:14.334488 systemd[1]: Stopped target remote-fs.target - Remote File Systems. Sep 12 23:56:14.335768 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems. Sep 12 23:56:14.337282 systemd[1]: Stopped target sysinit.target - System Initialization. Sep 12 23:56:14.338577 systemd[1]: Stopped target local-fs.target - Local File Systems. Sep 12 23:56:14.339876 systemd[1]: Stopped target swap.target - Swaps. Sep 12 23:56:14.341023 systemd[1]: dracut-pre-mount.service: Deactivated successfully. 
Sep 12 23:56:14.341198 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook. Sep 12 23:56:14.342471 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes. Sep 12 23:56:14.343848 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Sep 12 23:56:14.345154 systemd[1]: clevis-luks-askpass.path: Deactivated successfully. Sep 12 23:56:14.345272 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Sep 12 23:56:14.346641 systemd[1]: dracut-initqueue.service: Deactivated successfully. Sep 12 23:56:14.346881 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook. Sep 12 23:56:14.348696 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully. Sep 12 23:56:14.348889 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion. Sep 12 23:56:14.350350 systemd[1]: ignition-files.service: Deactivated successfully. Sep 12 23:56:14.350551 systemd[1]: Stopped ignition-files.service - Ignition (files). Sep 12 23:56:14.351575 systemd[1]: flatcar-metadata-hostname.service: Deactivated successfully. Sep 12 23:56:14.351737 systemd[1]: Stopped flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent. Sep 12 23:56:14.364922 systemd[1]: Stopping ignition-mount.service - Ignition (mount)... Sep 12 23:56:14.369096 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot... Sep 12 23:56:14.369673 systemd[1]: systemd-udev-trigger.service: Deactivated successfully. Sep 12 23:56:14.370879 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices. Sep 12 23:56:14.373851 systemd[1]: dracut-pre-trigger.service: Deactivated successfully. Sep 12 23:56:14.374250 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook. Sep 12 23:56:14.381927 systemd[1]: initrd-cleanup.service: Deactivated successfully. 
Sep 12 23:56:14.382124 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons. Sep 12 23:56:14.393143 ignition[1011]: INFO : Ignition 2.19.0 Sep 12 23:56:14.393143 ignition[1011]: INFO : Stage: umount Sep 12 23:56:14.396314 ignition[1011]: INFO : no configs at "/usr/lib/ignition/base.d" Sep 12 23:56:14.396314 ignition[1011]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Sep 12 23:56:14.396314 ignition[1011]: INFO : umount: umount passed Sep 12 23:56:14.396314 ignition[1011]: INFO : Ignition finished successfully Sep 12 23:56:14.396741 systemd[1]: ignition-mount.service: Deactivated successfully. Sep 12 23:56:14.396849 systemd[1]: Stopped ignition-mount.service - Ignition (mount). Sep 12 23:56:14.397728 systemd[1]: ignition-disks.service: Deactivated successfully. Sep 12 23:56:14.397785 systemd[1]: Stopped ignition-disks.service - Ignition (disks). Sep 12 23:56:14.399564 systemd[1]: ignition-kargs.service: Deactivated successfully. Sep 12 23:56:14.399620 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs). Sep 12 23:56:14.401228 systemd[1]: ignition-fetch.service: Deactivated successfully. Sep 12 23:56:14.401271 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch). Sep 12 23:56:14.402339 systemd[1]: Stopped target network.target - Network. Sep 12 23:56:14.404077 systemd[1]: ignition-fetch-offline.service: Deactivated successfully. Sep 12 23:56:14.404147 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline). Sep 12 23:56:14.405496 systemd[1]: Stopped target paths.target - Path Units. Sep 12 23:56:14.407554 systemd[1]: systemd-ask-password-console.path: Deactivated successfully. Sep 12 23:56:14.410789 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Sep 12 23:56:14.411946 systemd[1]: Stopped target slices.target - Slice Units. Sep 12 23:56:14.413437 systemd[1]: Stopped target sockets.target - Socket Units. 
Sep 12 23:56:14.415047 systemd[1]: iscsid.socket: Deactivated successfully. Sep 12 23:56:14.415105 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket. Sep 12 23:56:14.416131 systemd[1]: iscsiuio.socket: Deactivated successfully. Sep 12 23:56:14.416172 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket. Sep 12 23:56:14.417279 systemd[1]: ignition-setup.service: Deactivated successfully. Sep 12 23:56:14.417338 systemd[1]: Stopped ignition-setup.service - Ignition (setup). Sep 12 23:56:14.418418 systemd[1]: ignition-setup-pre.service: Deactivated successfully. Sep 12 23:56:14.418470 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup. Sep 12 23:56:14.420164 systemd[1]: Stopping systemd-networkd.service - Network Configuration... Sep 12 23:56:14.421141 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution... Sep 12 23:56:14.423678 systemd[1]: sysroot-boot.mount: Deactivated successfully. Sep 12 23:56:14.424229 systemd[1]: sysroot-boot.service: Deactivated successfully. Sep 12 23:56:14.424321 systemd-networkd[779]: eth1: DHCPv6 lease lost Sep 12 23:56:14.424325 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot. Sep 12 23:56:14.425978 systemd[1]: initrd-setup-root.service: Deactivated successfully. Sep 12 23:56:14.426062 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup. Sep 12 23:56:14.428251 systemd-networkd[779]: eth0: DHCPv6 lease lost Sep 12 23:56:14.431120 systemd[1]: systemd-networkd.service: Deactivated successfully. Sep 12 23:56:14.431255 systemd[1]: Stopped systemd-networkd.service - Network Configuration. Sep 12 23:56:14.433809 systemd[1]: systemd-resolved.service: Deactivated successfully. Sep 12 23:56:14.434260 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution. Sep 12 23:56:14.437338 systemd[1]: systemd-networkd.socket: Deactivated successfully. Sep 12 23:56:14.437392 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket. 
Sep 12 23:56:14.446154 systemd[1]: Stopping network-cleanup.service - Network Cleanup... Sep 12 23:56:14.448402 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully. Sep 12 23:56:14.448496 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Sep 12 23:56:14.449624 systemd[1]: systemd-sysctl.service: Deactivated successfully. Sep 12 23:56:14.449700 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables. Sep 12 23:56:14.451935 systemd[1]: systemd-modules-load.service: Deactivated successfully. Sep 12 23:56:14.451997 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules. Sep 12 23:56:14.453973 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully. Sep 12 23:56:14.454102 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories. Sep 12 23:56:14.455729 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files... Sep 12 23:56:14.470964 systemd[1]: network-cleanup.service: Deactivated successfully. Sep 12 23:56:14.471170 systemd[1]: Stopped network-cleanup.service - Network Cleanup. Sep 12 23:56:14.485688 systemd[1]: systemd-udevd.service: Deactivated successfully. Sep 12 23:56:14.486110 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files. Sep 12 23:56:14.489018 systemd[1]: systemd-udevd-control.socket: Deactivated successfully. Sep 12 23:56:14.489201 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket. Sep 12 23:56:14.490681 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully. Sep 12 23:56:14.491188 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket. Sep 12 23:56:14.492045 systemd[1]: dracut-pre-udev.service: Deactivated successfully. Sep 12 23:56:14.492102 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook. Sep 12 23:56:14.494189 systemd[1]: dracut-cmdline.service: Deactivated successfully. 
Sep 12 23:56:14.494244 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook. Sep 12 23:56:14.495849 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Sep 12 23:56:14.495897 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Sep 12 23:56:14.503955 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database... Sep 12 23:56:14.504632 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully. Sep 12 23:56:14.504720 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Sep 12 23:56:14.506604 systemd[1]: systemd-tmpfiles-setup-dev-early.service: Deactivated successfully. Sep 12 23:56:14.506658 systemd[1]: Stopped systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Sep 12 23:56:14.507604 systemd[1]: kmod-static-nodes.service: Deactivated successfully. Sep 12 23:56:14.507647 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes. Sep 12 23:56:14.508588 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Sep 12 23:56:14.508635 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Sep 12 23:56:14.514384 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully. Sep 12 23:56:14.514567 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database. Sep 12 23:56:14.516572 systemd[1]: Reached target initrd-switch-root.target - Switch Root. Sep 12 23:56:14.522967 systemd[1]: Starting initrd-switch-root.service - Switch Root... Sep 12 23:56:14.531409 systemd[1]: Switching root. Sep 12 23:56:14.556447 systemd-journald[235]: Journal stopped Sep 12 23:56:15.533558 systemd-journald[235]: Received SIGTERM from PID 1 (systemd). 
Sep 12 23:56:15.533648 kernel: SELinux: policy capability network_peer_controls=1 Sep 12 23:56:15.533662 kernel: SELinux: policy capability open_perms=1 Sep 12 23:56:15.533676 kernel: SELinux: policy capability extended_socket_class=1 Sep 12 23:56:15.533685 kernel: SELinux: policy capability always_check_network=0 Sep 12 23:56:15.533694 kernel: SELinux: policy capability cgroup_seclabel=1 Sep 12 23:56:15.533727 kernel: SELinux: policy capability nnp_nosuid_transition=1 Sep 12 23:56:15.533738 kernel: SELinux: policy capability genfs_seclabel_symlinks=0 Sep 12 23:56:15.533753 kernel: SELinux: policy capability ioctl_skip_cloexec=0 Sep 12 23:56:15.533763 kernel: audit: type=1403 audit(1757721374.729:2): auid=4294967295 ses=4294967295 lsm=selinux res=1 Sep 12 23:56:15.533774 systemd[1]: Successfully loaded SELinux policy in 37.768ms. Sep 12 23:56:15.533791 systemd[1]: Relabeled /dev, /dev/shm, /run, /sys/fs/cgroup in 12.072ms. Sep 12 23:56:15.533805 systemd[1]: systemd 255 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT default-hierarchy=unified) Sep 12 23:56:15.533816 systemd[1]: Detected virtualization kvm. Sep 12 23:56:15.533827 systemd[1]: Detected architecture arm64. Sep 12 23:56:15.533837 systemd[1]: Detected first boot. Sep 12 23:56:15.533847 systemd[1]: Hostname set to . Sep 12 23:56:15.533862 systemd[1]: Initializing machine ID from VM UUID. Sep 12 23:56:15.533873 zram_generator::config[1054]: No configuration found. Sep 12 23:56:15.533886 systemd[1]: Populated /etc with preset unit settings. Sep 12 23:56:15.533899 systemd[1]: initrd-switch-root.service: Deactivated successfully. Sep 12 23:56:15.533910 systemd[1]: Stopped initrd-switch-root.service - Switch Root. 
Sep 12 23:56:15.533921 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1. Sep 12 23:56:15.533932 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config. Sep 12 23:56:15.533943 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run. Sep 12 23:56:15.533954 systemd[1]: Created slice system-getty.slice - Slice /system/getty. Sep 12 23:56:15.533965 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe. Sep 12 23:56:15.533976 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty. Sep 12 23:56:15.533988 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit. Sep 12 23:56:15.533999 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck. Sep 12 23:56:15.534009 systemd[1]: Created slice user.slice - User and Session Slice. Sep 12 23:56:15.534020 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Sep 12 23:56:15.534032 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Sep 12 23:56:15.534047 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch. Sep 12 23:56:15.534058 systemd[1]: Set up automount boot.automount - Boot partition Automount Point. Sep 12 23:56:15.534068 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point. Sep 12 23:56:15.534079 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Sep 12 23:56:15.534091 systemd[1]: Expecting device dev-ttyAMA0.device - /dev/ttyAMA0... Sep 12 23:56:15.534102 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Sep 12 23:56:15.534113 systemd[1]: Stopped target initrd-switch-root.target - Switch Root. 
Sep 12 23:56:15.534123 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems. Sep 12 23:56:15.534135 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System. Sep 12 23:56:15.534146 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes. Sep 12 23:56:15.534158 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Sep 12 23:56:15.534168 systemd[1]: Reached target remote-fs.target - Remote File Systems. Sep 12 23:56:15.534178 systemd[1]: Reached target slices.target - Slice Units. Sep 12 23:56:15.534189 systemd[1]: Reached target swap.target - Swaps. Sep 12 23:56:15.534199 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes. Sep 12 23:56:15.534211 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket. Sep 12 23:56:15.534221 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Sep 12 23:56:15.534237 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Sep 12 23:56:15.534249 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Sep 12 23:56:15.534261 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket. Sep 12 23:56:15.534272 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System... Sep 12 23:56:15.534282 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System... Sep 12 23:56:15.534293 systemd[1]: Mounting media.mount - External Media Directory... Sep 12 23:56:15.534303 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System... Sep 12 23:56:15.534314 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System... Sep 12 23:56:15.534324 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp... 
Sep 12 23:56:15.534335 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw). Sep 12 23:56:15.534346 systemd[1]: Reached target machines.target - Containers. Sep 12 23:56:15.534358 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files... Sep 12 23:56:15.534370 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Sep 12 23:56:15.534385 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Sep 12 23:56:15.534402 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs... Sep 12 23:56:15.534413 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Sep 12 23:56:15.534425 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Sep 12 23:56:15.534436 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Sep 12 23:56:15.534446 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse... Sep 12 23:56:15.534457 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Sep 12 23:56:15.534479 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf). Sep 12 23:56:15.534494 systemd[1]: systemd-fsck-root.service: Deactivated successfully. Sep 12 23:56:15.534505 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device. Sep 12 23:56:15.534515 systemd[1]: systemd-fsck-usr.service: Deactivated successfully. Sep 12 23:56:15.534526 systemd[1]: Stopped systemd-fsck-usr.service. Sep 12 23:56:15.534539 systemd[1]: Starting systemd-journald.service - Journal Service... Sep 12 23:56:15.534550 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... 
Sep 12 23:56:15.534561 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Sep 12 23:56:15.534571 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems... Sep 12 23:56:15.534581 kernel: fuse: init (API version 7.39) Sep 12 23:56:15.534592 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Sep 12 23:56:15.534603 systemd[1]: verity-setup.service: Deactivated successfully. Sep 12 23:56:15.534614 systemd[1]: Stopped verity-setup.service. Sep 12 23:56:15.534625 kernel: loop: module loaded Sep 12 23:56:15.534637 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System. Sep 12 23:56:15.534649 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System. Sep 12 23:56:15.534659 systemd[1]: Mounted media.mount - External Media Directory. Sep 12 23:56:15.534670 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System. Sep 12 23:56:15.534681 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System. Sep 12 23:56:15.534693 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp. Sep 12 23:56:15.536537 systemd-journald[1128]: Collecting audit messages is disabled. Sep 12 23:56:15.536591 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Sep 12 23:56:15.536605 systemd-journald[1128]: Journal started Sep 12 23:56:15.536629 systemd-journald[1128]: Runtime Journal (/run/log/journal/e9dddb51187f473fafc9db20b1b43a71) is 8.0M, max 76.6M, 68.6M free. Sep 12 23:56:15.253523 systemd[1]: Queued start job for default target multi-user.target. Sep 12 23:56:15.279087 systemd[1]: Unnecessary job was removed for dev-sda6.device - /dev/sda6. Sep 12 23:56:15.279566 systemd[1]: systemd-journald.service: Deactivated successfully. Sep 12 23:56:15.546668 systemd[1]: Started systemd-journald.service - Journal Service. 
Sep 12 23:56:15.546777 kernel: ACPI: bus type drm_connector registered Sep 12 23:56:15.550902 systemd[1]: modprobe@configfs.service: Deactivated successfully. Sep 12 23:56:15.551572 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs. Sep 12 23:56:15.554406 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files. Sep 12 23:56:15.556129 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Sep 12 23:56:15.556283 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Sep 12 23:56:15.558078 systemd[1]: modprobe@drm.service: Deactivated successfully. Sep 12 23:56:15.559905 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Sep 12 23:56:15.560926 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Sep 12 23:56:15.561065 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Sep 12 23:56:15.562141 systemd[1]: modprobe@fuse.service: Deactivated successfully. Sep 12 23:56:15.562269 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse. Sep 12 23:56:15.563257 systemd[1]: modprobe@loop.service: Deactivated successfully. Sep 12 23:56:15.563396 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Sep 12 23:56:15.565394 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Sep 12 23:56:15.566595 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Sep 12 23:56:15.574922 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems. Sep 12 23:56:15.582911 systemd[1]: Reached target network-pre.target - Preparation for Network. Sep 12 23:56:15.589976 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System... Sep 12 23:56:15.595041 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System... 
Sep 12 23:56:15.596144 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/). Sep 12 23:56:15.596188 systemd[1]: Reached target local-fs.target - Local File Systems. Sep 12 23:56:15.597923 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management (Varlink). Sep 12 23:56:15.600914 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown... Sep 12 23:56:15.608101 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache... Sep 12 23:56:15.609151 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Sep 12 23:56:15.612978 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database... Sep 12 23:56:15.618981 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage... Sep 12 23:56:15.621549 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Sep 12 23:56:15.624940 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed... Sep 12 23:56:15.626102 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Sep 12 23:56:15.629987 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Sep 12 23:56:15.634604 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/... Sep 12 23:56:15.643109 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Sep 12 23:56:15.647583 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System. Sep 12 23:56:15.648918 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System. 
Sep 12 23:56:15.651327 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown. Sep 12 23:56:15.663319 systemd-journald[1128]: Time spent on flushing to /var/log/journal/e9dddb51187f473fafc9db20b1b43a71 is 75.508ms for 1126 entries. Sep 12 23:56:15.663319 systemd-journald[1128]: System Journal (/var/log/journal/e9dddb51187f473fafc9db20b1b43a71) is 8.0M, max 584.8M, 576.8M free. Sep 12 23:56:15.763700 systemd-journald[1128]: Received client request to flush runtime journal. Sep 12 23:56:15.763801 kernel: loop0: detected capacity change from 0 to 8 Sep 12 23:56:15.763842 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher Sep 12 23:56:15.763872 kernel: loop1: detected capacity change from 0 to 114432 Sep 12 23:56:15.685132 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed. Sep 12 23:56:15.686550 systemd[1]: Reached target first-boot-complete.target - First Boot Complete. Sep 12 23:56:15.698091 systemd[1]: Starting systemd-machine-id-commit.service - Commit a transient machine-id on disk... Sep 12 23:56:15.732775 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Sep 12 23:56:15.741239 systemd[1]: Starting systemd-udev-settle.service - Wait for udev To Complete Device Initialization... Sep 12 23:56:15.761554 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Sep 12 23:56:15.771334 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage. Sep 12 23:56:15.773457 systemd-tmpfiles[1169]: ACLs are not supported, ignoring. Sep 12 23:56:15.773482 systemd-tmpfiles[1169]: ACLs are not supported, ignoring. Sep 12 23:56:15.784816 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Sep 12 23:56:15.801133 systemd[1]: Starting systemd-sysusers.service - Create System Users... Sep 12 23:56:15.808133 systemd[1]: etc-machine\x2did.mount: Deactivated successfully. 
Sep 12 23:56:15.809029 systemd[1]: Finished systemd-machine-id-commit.service - Commit a transient machine-id on disk. Sep 12 23:56:15.814064 udevadm[1180]: systemd-udev-settle.service is deprecated. Please fix lvm2-activation-early.service, lvm2-activation.service not to pull it in. Sep 12 23:56:15.820038 kernel: loop2: detected capacity change from 0 to 207008 Sep 12 23:56:15.842081 systemd[1]: Finished systemd-sysusers.service - Create System Users. Sep 12 23:56:15.852040 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Sep 12 23:56:15.870130 systemd-tmpfiles[1192]: ACLs are not supported, ignoring. Sep 12 23:56:15.870622 systemd-tmpfiles[1192]: ACLs are not supported, ignoring. Sep 12 23:56:15.875797 kernel: loop3: detected capacity change from 0 to 114328 Sep 12 23:56:15.878190 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Sep 12 23:56:15.911860 kernel: loop4: detected capacity change from 0 to 8 Sep 12 23:56:15.916759 kernel: loop5: detected capacity change from 0 to 114432 Sep 12 23:56:15.935373 kernel: loop6: detected capacity change from 0 to 207008 Sep 12 23:56:15.955780 kernel: loop7: detected capacity change from 0 to 114328 Sep 12 23:56:15.971080 (sd-merge)[1196]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-hetzner'. Sep 12 23:56:15.971622 (sd-merge)[1196]: Merged extensions into '/usr'. Sep 12 23:56:15.978943 systemd[1]: Reloading requested from client PID 1168 ('systemd-sysext') (unit systemd-sysext.service)... Sep 12 23:56:15.978967 systemd[1]: Reloading... Sep 12 23:56:16.116767 zram_generator::config[1222]: No configuration found. Sep 12 23:56:16.255175 ldconfig[1163]: /sbin/ldconfig: /lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start. 
Sep 12 23:56:16.273617 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Sep 12 23:56:16.324623 systemd[1]: Reloading finished in 344 ms. Sep 12 23:56:16.348874 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache. Sep 12 23:56:16.356766 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/. Sep 12 23:56:16.367444 systemd[1]: Starting ensure-sysext.service... Sep 12 23:56:16.371993 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Sep 12 23:56:16.381393 systemd[1]: Reloading requested from client PID 1259 ('systemctl') (unit ensure-sysext.service)... Sep 12 23:56:16.381419 systemd[1]: Reloading... Sep 12 23:56:16.434534 systemd-tmpfiles[1260]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring. Sep 12 23:56:16.434938 systemd-tmpfiles[1260]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring. Sep 12 23:56:16.435615 systemd-tmpfiles[1260]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring. Sep 12 23:56:16.436167 systemd-tmpfiles[1260]: ACLs are not supported, ignoring. Sep 12 23:56:16.436217 systemd-tmpfiles[1260]: ACLs are not supported, ignoring. Sep 12 23:56:16.442785 systemd-tmpfiles[1260]: Detected autofs mount point /boot during canonicalization of boot. Sep 12 23:56:16.442800 systemd-tmpfiles[1260]: Skipping /boot Sep 12 23:56:16.458210 systemd-tmpfiles[1260]: Detected autofs mount point /boot during canonicalization of boot. Sep 12 23:56:16.458228 systemd-tmpfiles[1260]: Skipping /boot Sep 12 23:56:16.512743 zram_generator::config[1289]: No configuration found. 
Sep 12 23:56:16.629783 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Sep 12 23:56:16.678818 systemd[1]: Reloading finished in 296 ms. Sep 12 23:56:16.697896 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database. Sep 12 23:56:16.706531 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Sep 12 23:56:16.716297 systemd[1]: Starting audit-rules.service - Load Security Auditing Rules... Sep 12 23:56:16.724764 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs... Sep 12 23:56:16.728689 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog... Sep 12 23:56:16.748803 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Sep 12 23:56:16.755146 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Sep 12 23:56:16.764032 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP... Sep 12 23:56:16.775673 systemd[1]: Starting systemd-userdbd.service - User Database Manager... Sep 12 23:56:16.778838 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Sep 12 23:56:16.783891 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Sep 12 23:56:16.788069 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Sep 12 23:56:16.792236 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Sep 12 23:56:16.793302 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. 
Sep 12 23:56:16.797400 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Sep 12 23:56:16.797650 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Sep 12 23:56:16.802191 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Sep 12 23:56:16.804926 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Sep 12 23:56:16.806988 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Sep 12 23:56:16.821850 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Sep 12 23:56:16.822038 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Sep 12 23:56:16.829819 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP. Sep 12 23:56:16.831317 systemd[1]: Finished ensure-sysext.service. Sep 12 23:56:16.849802 augenrules[1355]: No rules Sep 12 23:56:16.850393 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization... Sep 12 23:56:16.852808 systemd[1]: Finished audit-rules.service - Load Security Auditing Rules. Sep 12 23:56:16.855798 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog. Sep 12 23:56:16.869689 systemd[1]: Starting systemd-update-done.service - Update is Completed... Sep 12 23:56:16.871346 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Sep 12 23:56:16.871535 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Sep 12 23:56:16.874102 systemd[1]: modprobe@loop.service: Deactivated successfully. Sep 12 23:56:16.874263 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Sep 12 23:56:16.875341 systemd-udevd[1338]: Using default interface naming scheme 'v255'. 
Sep 12 23:56:16.877256 systemd[1]: modprobe@drm.service: Deactivated successfully. Sep 12 23:56:16.877845 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Sep 12 23:56:16.884182 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Sep 12 23:56:16.884249 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Sep 12 23:56:16.896326 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs. Sep 12 23:56:16.898239 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). Sep 12 23:56:16.904675 systemd[1]: Finished systemd-update-done.service - Update is Completed. Sep 12 23:56:16.912554 systemd[1]: Started systemd-userdbd.service - User Database Manager. Sep 12 23:56:16.920941 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Sep 12 23:56:16.930999 systemd[1]: Starting systemd-networkd.service - Network Configuration... Sep 12 23:56:16.997589 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization. Sep 12 23:56:16.998661 systemd[1]: Reached target time-set.target - System Time Set. Sep 12 23:56:17.046645 systemd-resolved[1335]: Positive Trust Anchors: Sep 12 23:56:17.046667 systemd-resolved[1335]: . 
IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Sep 12 23:56:17.046701 systemd-resolved[1335]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Sep 12 23:56:17.051229 systemd-networkd[1378]: lo: Link UP Sep 12 23:56:17.051579 systemd-networkd[1378]: lo: Gained carrier Sep 12 23:56:17.052330 systemd-networkd[1378]: Enumeration completed Sep 12 23:56:17.052570 systemd[1]: Started systemd-networkd.service - Network Configuration. Sep 12 23:56:17.061895 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured... Sep 12 23:56:17.063550 systemd-resolved[1335]: Using system hostname 'ci-4081-3-5-n-326e2e5946'. Sep 12 23:56:17.070804 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Sep 12 23:56:17.072915 systemd[1]: Reached target network.target - Network. Sep 12 23:56:17.073575 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Sep 12 23:56:17.085494 systemd[1]: Condition check resulted in dev-ttyAMA0.device - /dev/ttyAMA0 being skipped. Sep 12 23:56:17.184752 kernel: mousedev: PS/2 mouse device common for all mice Sep 12 23:56:17.187333 systemd-networkd[1378]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Sep 12 23:56:17.187343 systemd-networkd[1378]: eth1: Configuring with /usr/lib/systemd/network/zz-default.network. 
Sep 12 23:56:17.188207 systemd-networkd[1378]: eth1: Link UP Sep 12 23:56:17.188211 systemd-networkd[1378]: eth1: Gained carrier Sep 12 23:56:17.188254 systemd-networkd[1378]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Sep 12 23:56:17.204294 systemd-networkd[1378]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Sep 12 23:56:17.204308 systemd-networkd[1378]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Sep 12 23:56:17.205089 systemd-networkd[1378]: eth0: Link UP Sep 12 23:56:17.205098 systemd-networkd[1378]: eth0: Gained carrier Sep 12 23:56:17.205114 systemd-networkd[1378]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Sep 12 23:56:17.214807 systemd-networkd[1378]: eth1: DHCPv4 address 10.0.0.3/32 acquired from 10.0.0.1 Sep 12 23:56:17.215480 systemd-timesyncd[1354]: Network configuration changed, trying to establish connection. Sep 12 23:56:17.250337 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 37 scanned by (udev-worker) (1383) Sep 12 23:56:17.266964 systemd-networkd[1378]: eth0: DHCPv4 address 91.99.152.252/32, gateway 172.31.1.1 acquired from 172.31.1.1 Sep 12 23:56:17.267584 systemd-timesyncd[1354]: Network configuration changed, trying to establish connection. Sep 12 23:56:17.267874 systemd-timesyncd[1354]: Network configuration changed, trying to establish connection. Sep 12 23:56:17.296371 systemd[1]: Condition check resulted in dev-virtio\x2dports-org.qemu.guest_agent.0.device - /dev/virtio-ports/org.qemu.guest_agent.0 being skipped. Sep 12 23:56:17.297447 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. 
Sep 12 23:56:17.306046 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Sep 12 23:56:17.309555 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Sep 12 23:56:17.312852 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Sep 12 23:56:17.314180 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Sep 12 23:56:17.314218 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). Sep 12 23:56:17.319638 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Sep 12 23:56:17.320663 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Sep 12 23:56:17.342078 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Sep 12 23:56:17.344278 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Sep 12 23:56:17.347333 systemd[1]: modprobe@loop.service: Deactivated successfully. Sep 12 23:56:17.348043 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Sep 12 23:56:17.357630 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Sep 12 23:56:17.357687 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. 
Sep 12 23:56:17.371046 kernel: [drm] pci: virtio-gpu-pci detected at 0000:00:01.0 Sep 12 23:56:17.371130 kernel: [drm] features: -virgl +edid -resource_blob -host_visible Sep 12 23:56:17.371144 kernel: [drm] features: -context_init Sep 12 23:56:17.375860 kernel: [drm] number of scanouts: 1 Sep 12 23:56:17.375961 kernel: [drm] number of cap sets: 0 Sep 12 23:56:17.375976 kernel: [drm] Initialized virtio_gpu 0.1.0 0 for 0000:00:01.0 on minor 0 Sep 12 23:56:17.376082 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - QEMU_HARDDISK OEM. Sep 12 23:56:17.382941 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... Sep 12 23:56:17.383745 kernel: Console: switching to colour frame buffer device 160x50 Sep 12 23:56:17.394775 kernel: virtio-pci 0000:00:01.0: [drm] fb0: virtio_gpudrmfb frame buffer device Sep 12 23:56:17.408179 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Sep 12 23:56:17.418865 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. Sep 12 23:56:17.491546 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Sep 12 23:56:17.519492 systemd[1]: Finished systemd-udev-settle.service - Wait for udev To Complete Device Initialization. Sep 12 23:56:17.529933 systemd[1]: Starting lvm2-activation-early.service - Activation of LVM2 logical volumes... Sep 12 23:56:17.545818 lvm[1436]: WARNING: Failed to connect to lvmetad. Falling back to device scanning. Sep 12 23:56:17.571859 systemd[1]: Finished lvm2-activation-early.service - Activation of LVM2 logical volumes. Sep 12 23:56:17.573702 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Sep 12 23:56:17.574648 systemd[1]: Reached target sysinit.target - System Initialization. Sep 12 23:56:17.575586 systemd[1]: Started motdgen.path - Watch for update engine configuration changes. 
Sep 12 23:56:17.576550 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data. Sep 12 23:56:17.577702 systemd[1]: Started logrotate.timer - Daily rotation of log files. Sep 12 23:56:17.578808 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information.. Sep 12 23:56:17.579547 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories. Sep 12 23:56:17.580385 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). Sep 12 23:56:17.580420 systemd[1]: Reached target paths.target - Path Units. Sep 12 23:56:17.581150 systemd[1]: Reached target timers.target - Timer Units. Sep 12 23:56:17.583019 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket. Sep 12 23:56:17.585994 systemd[1]: Starting docker.socket - Docker Socket for the API... Sep 12 23:56:17.593687 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket. Sep 12 23:56:17.596232 systemd[1]: Starting lvm2-activation.service - Activation of LVM2 logical volumes... Sep 12 23:56:17.597928 systemd[1]: Listening on docker.socket - Docker Socket for the API. Sep 12 23:56:17.598837 systemd[1]: Reached target sockets.target - Socket Units. Sep 12 23:56:17.599650 systemd[1]: Reached target basic.target - Basic System. Sep 12 23:56:17.600684 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met. Sep 12 23:56:17.600732 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met. Sep 12 23:56:17.603867 systemd[1]: Starting containerd.service - containerd container runtime... Sep 12 23:56:17.607645 lvm[1440]: WARNING: Failed to connect to lvmetad. Falling back to device scanning. 
Sep 12 23:56:17.613024 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent... Sep 12 23:56:17.618059 systemd[1]: Starting dbus.service - D-Bus System Message Bus... Sep 12 23:56:17.630095 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... Sep 12 23:56:17.634000 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... Sep 12 23:56:17.635509 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). Sep 12 23:56:17.639259 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... Sep 12 23:56:17.649955 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin... Sep 12 23:56:17.653146 jq[1444]: false Sep 12 23:56:17.654179 systemd[1]: Started qemu-guest-agent.service - QEMU Guest Agent. Sep 12 23:56:17.658354 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... Sep 12 23:56:17.663996 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... Sep 12 23:56:17.667840 systemd[1]: Starting systemd-logind.service - User Login Management... Sep 12 23:56:17.671639 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0). Sep 12 23:56:17.672207 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. Sep 12 23:56:17.673008 systemd[1]: Starting update-engine.service - Update Engine... Sep 12 23:56:17.684414 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... Sep 12 23:56:17.687236 systemd[1]: Finished lvm2-activation.service - Activation of LVM2 logical volumes. Sep 12 23:56:17.693590 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. 
Sep 12 23:56:17.693795 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. Sep 12 23:56:17.720762 jq[1455]: true Sep 12 23:56:17.724001 (ntainerd)[1466]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR Sep 12 23:56:17.747900 coreos-metadata[1442]: Sep 12 23:56:17.742 INFO Fetching http://169.254.169.254/hetzner/v1/metadata: Attempt #1 Sep 12 23:56:17.751279 coreos-metadata[1442]: Sep 12 23:56:17.750 INFO Fetch successful Sep 12 23:56:17.754406 coreos-metadata[1442]: Sep 12 23:56:17.751 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/private-networks: Attempt #1 Sep 12 23:56:17.751638 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. Sep 12 23:56:17.751854 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. Sep 12 23:56:17.757311 coreos-metadata[1442]: Sep 12 23:56:17.756 INFO Fetch successful Sep 12 23:56:17.786765 dbus-daemon[1443]: [system] SELinux support is enabled Sep 12 23:56:17.787345 systemd[1]: Started dbus.service - D-Bus System Message Bus. 
Sep 12 23:56:17.793210 jq[1472]: true Sep 12 23:56:17.793419 tar[1459]: linux-arm64/LICENSE Sep 12 23:56:17.793419 tar[1459]: linux-arm64/helm Sep 12 23:56:17.799484 extend-filesystems[1445]: Found loop4 Sep 12 23:56:17.799484 extend-filesystems[1445]: Found loop5 Sep 12 23:56:17.799484 extend-filesystems[1445]: Found loop6 Sep 12 23:56:17.799484 extend-filesystems[1445]: Found loop7 Sep 12 23:56:17.799484 extend-filesystems[1445]: Found sda Sep 12 23:56:17.799484 extend-filesystems[1445]: Found sda1 Sep 12 23:56:17.799484 extend-filesystems[1445]: Found sda2 Sep 12 23:56:17.799484 extend-filesystems[1445]: Found sda3 Sep 12 23:56:17.799484 extend-filesystems[1445]: Found usr Sep 12 23:56:17.799484 extend-filesystems[1445]: Found sda4 Sep 12 23:56:17.799484 extend-filesystems[1445]: Found sda6 Sep 12 23:56:17.799484 extend-filesystems[1445]: Found sda7 Sep 12 23:56:17.799484 extend-filesystems[1445]: Found sda9 Sep 12 23:56:17.799484 extend-filesystems[1445]: Checking size of /dev/sda9 Sep 12 23:56:17.794027 systemd[1]: motdgen.service: Deactivated successfully. Sep 12 23:56:17.850691 update_engine[1454]: I20250912 23:56:17.817948 1454 main.cc:92] Flatcar Update Engine starting Sep 12 23:56:17.850691 update_engine[1454]: I20250912 23:56:17.833168 1454 update_check_scheduler.cc:74] Next update check in 2m31s Sep 12 23:56:17.794230 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. Sep 12 23:56:17.806112 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). Sep 12 23:56:17.806154 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. 
Sep 12 23:56:17.808830 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). Sep 12 23:56:17.808854 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. Sep 12 23:56:17.833115 systemd[1]: Started update-engine.service - Update Engine. Sep 12 23:56:17.843906 systemd[1]: Started locksmithd.service - Cluster reboot manager. Sep 12 23:56:17.871601 extend-filesystems[1445]: Resized partition /dev/sda9 Sep 12 23:56:17.874290 systemd-logind[1453]: New seat seat0. Sep 12 23:56:17.885946 extend-filesystems[1498]: resize2fs 1.47.1 (20-May-2024) Sep 12 23:56:17.877468 systemd-logind[1453]: Watching system buttons on /dev/input/event0 (Power Button) Sep 12 23:56:17.877492 systemd-logind[1453]: Watching system buttons on /dev/input/event2 (QEMU QEMU USB Keyboard) Sep 12 23:56:17.878005 systemd[1]: Started systemd-logind.service - User Login Management. Sep 12 23:56:17.896546 kernel: EXT4-fs (sda9): resizing filesystem from 1617920 to 9393147 blocks Sep 12 23:56:17.967790 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent. Sep 12 23:56:17.970421 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. Sep 12 23:56:18.009314 bash[1512]: Updated "/home/core/.ssh/authorized_keys" Sep 12 23:56:18.010048 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. Sep 12 23:56:18.032379 systemd[1]: Starting sshkeys.service... Sep 12 23:56:18.080663 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys. 
Sep 12 23:56:18.099776 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 37 scanned by (udev-worker) (1394) Sep 12 23:56:18.104732 kernel: EXT4-fs (sda9): resized filesystem to 9393147 Sep 12 23:56:18.129263 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)... Sep 12 23:56:18.137377 extend-filesystems[1498]: Filesystem at /dev/sda9 is mounted on /; on-line resizing required Sep 12 23:56:18.137377 extend-filesystems[1498]: old_desc_blocks = 1, new_desc_blocks = 5 Sep 12 23:56:18.137377 extend-filesystems[1498]: The filesystem on /dev/sda9 is now 9393147 (4k) blocks long. Sep 12 23:56:18.137586 systemd[1]: extend-filesystems.service: Deactivated successfully. Sep 12 23:56:18.150091 extend-filesystems[1445]: Resized filesystem in /dev/sda9 Sep 12 23:56:18.150091 extend-filesystems[1445]: Found sr0 Sep 12 23:56:18.139771 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. Sep 12 23:56:18.159840 containerd[1466]: time="2025-09-12T23:56:18.156159280Z" level=info msg="starting containerd" revision=174e0d1785eeda18dc2beba45e1d5a188771636b version=v1.7.21 Sep 12 23:56:18.227733 containerd[1466]: time="2025-09-12T23:56:18.225425840Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.aufs\"..." type=io.containerd.snapshotter.v1 Sep 12 23:56:18.227733 containerd[1466]: time="2025-09-12T23:56:18.227173920Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.aufs\"..." error="aufs is not supported (modprobe aufs failed: exit status 1 \"modprobe: FATAL: Module aufs not found in directory /lib/modules/6.6.106-flatcar\\n\"): skip plugin" type=io.containerd.snapshotter.v1 Sep 12 23:56:18.227733 containerd[1466]: time="2025-09-12T23:56:18.227216360Z" level=info msg="loading plugin \"io.containerd.event.v1.exchange\"..." 
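The resize2fs lines above record the root filesystem growing from 1617920 to 9393147 blocks; per the kernel's "(4k) blocks" note these are 4 KiB blocks. A quick back-of-the-envelope check of what that means in bytes (block counts copied from the log):

```python
# Sanity-check the ext4 online resize of /dev/sda9 reported above.
# Block counts are taken verbatim from the resize2fs/kernel lines;
# the 4096-byte block size comes from the "(4k)" note in the log.
BLOCK_SIZE = 4096

old_blocks = 1_617_920
new_blocks = 9_393_147

old_gib = old_blocks * BLOCK_SIZE / 2**30
new_gib = new_blocks * BLOCK_SIZE / 2**30
print(f"before resize: {old_gib:.2f} GiB")
print(f"after resize:  {new_gib:.2f} GiB")
```

So the on-line resize grew the root partition from roughly 6.2 GiB to roughly 35.8 GiB, consistent with a first boot on a larger disk than the image was built for.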
type=io.containerd.event.v1 Sep 12 23:56:18.227733 containerd[1466]: time="2025-09-12T23:56:18.227234520Z" level=info msg="loading plugin \"io.containerd.internal.v1.opt\"..." type=io.containerd.internal.v1 Sep 12 23:56:18.227733 containerd[1466]: time="2025-09-12T23:56:18.227480240Z" level=info msg="loading plugin \"io.containerd.warning.v1.deprecations\"..." type=io.containerd.warning.v1 Sep 12 23:56:18.227733 containerd[1466]: time="2025-09-12T23:56:18.227505640Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.blockfile\"..." type=io.containerd.snapshotter.v1 Sep 12 23:56:18.227733 containerd[1466]: time="2025-09-12T23:56:18.227569880Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.blockfile\"..." error="no scratch file generator: skip plugin" type=io.containerd.snapshotter.v1 Sep 12 23:56:18.227733 containerd[1466]: time="2025-09-12T23:56:18.227582840Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.btrfs\"..." type=io.containerd.snapshotter.v1 Sep 12 23:56:18.227989 coreos-metadata[1521]: Sep 12 23:56:18.226 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/public-keys: Attempt #1 Sep 12 23:56:18.228241 containerd[1466]: time="2025-09-12T23:56:18.227837920Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.btrfs\"..." error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" type=io.containerd.snapshotter.v1 Sep 12 23:56:18.228241 containerd[1466]: time="2025-09-12T23:56:18.227856520Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.devmapper\"..." type=io.containerd.snapshotter.v1 Sep 12 23:56:18.228241 containerd[1466]: time="2025-09-12T23:56:18.227870960Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.devmapper\"..." 
error="devmapper not configured: skip plugin" type=io.containerd.snapshotter.v1 Sep 12 23:56:18.228241 containerd[1466]: time="2025-09-12T23:56:18.227880680Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.native\"..." type=io.containerd.snapshotter.v1 Sep 12 23:56:18.228241 containerd[1466]: time="2025-09-12T23:56:18.227960120Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.overlayfs\"..." type=io.containerd.snapshotter.v1 Sep 12 23:56:18.228241 containerd[1466]: time="2025-09-12T23:56:18.228149960Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.zfs\"..." type=io.containerd.snapshotter.v1 Sep 12 23:56:18.228366 containerd[1466]: time="2025-09-12T23:56:18.228244840Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.zfs\"..." error="path /var/lib/containerd/io.containerd.snapshotter.v1.zfs must be a zfs filesystem to be used with the zfs snapshotter: skip plugin" type=io.containerd.snapshotter.v1 Sep 12 23:56:18.228366 containerd[1466]: time="2025-09-12T23:56:18.228258560Z" level=info msg="loading plugin \"io.containerd.content.v1.content\"..." type=io.containerd.content.v1 Sep 12 23:56:18.228366 containerd[1466]: time="2025-09-12T23:56:18.228344440Z" level=info msg="loading plugin \"io.containerd.metadata.v1.bolt\"..." type=io.containerd.metadata.v1 Sep 12 23:56:18.228431 containerd[1466]: time="2025-09-12T23:56:18.228386960Z" level=info msg="metadata content store policy set" policy=shared Sep 12 23:56:18.230078 coreos-metadata[1521]: Sep 12 23:56:18.229 INFO Fetch successful Sep 12 23:56:18.232684 unknown[1521]: wrote ssh authorized keys file for user: core Sep 12 23:56:18.233861 systemd-networkd[1378]: eth0: Gained IPv6LL Sep 12 23:56:18.235826 systemd-timesyncd[1354]: Network configuration changed, trying to establish connection. Sep 12 23:56:18.239294 containerd[1466]: time="2025-09-12T23:56:18.239238840Z" level=info msg="loading plugin \"io.containerd.gc.v1.scheduler\"..." 
type=io.containerd.gc.v1 Sep 12 23:56:18.239377 containerd[1466]: time="2025-09-12T23:56:18.239323800Z" level=info msg="loading plugin \"io.containerd.differ.v1.walking\"..." type=io.containerd.differ.v1 Sep 12 23:56:18.239377 containerd[1466]: time="2025-09-12T23:56:18.239343920Z" level=info msg="loading plugin \"io.containerd.lease.v1.manager\"..." type=io.containerd.lease.v1 Sep 12 23:56:18.239377 containerd[1466]: time="2025-09-12T23:56:18.239362800Z" level=info msg="loading plugin \"io.containerd.streaming.v1.manager\"..." type=io.containerd.streaming.v1 Sep 12 23:56:18.239428 containerd[1466]: time="2025-09-12T23:56:18.239385440Z" level=info msg="loading plugin \"io.containerd.runtime.v1.linux\"..." type=io.containerd.runtime.v1 Sep 12 23:56:18.240782 containerd[1466]: time="2025-09-12T23:56:18.239588560Z" level=info msg="loading plugin \"io.containerd.monitor.v1.cgroups\"..." type=io.containerd.monitor.v1 Sep 12 23:56:18.240354 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Sep 12 23:56:18.244148 systemd[1]: Reached target network-online.target - Network is Online. Sep 12 23:56:18.244923 containerd[1466]: time="2025-09-12T23:56:18.244875520Z" level=info msg="loading plugin \"io.containerd.runtime.v2.task\"..." type=io.containerd.runtime.v2 Sep 12 23:56:18.245194 containerd[1466]: time="2025-09-12T23:56:18.245161320Z" level=info msg="loading plugin \"io.containerd.runtime.v2.shim\"..." type=io.containerd.runtime.v2 Sep 12 23:56:18.245777 containerd[1466]: time="2025-09-12T23:56:18.245756640Z" level=info msg="loading plugin \"io.containerd.sandbox.store.v1.local\"..." type=io.containerd.sandbox.store.v1 Sep 12 23:56:18.245884 containerd[1466]: time="2025-09-12T23:56:18.245865680Z" level=info msg="loading plugin \"io.containerd.sandbox.controller.v1.local\"..." 
type=io.containerd.sandbox.controller.v1 Sep 12 23:56:18.246054 containerd[1466]: time="2025-09-12T23:56:18.246037520Z" level=info msg="loading plugin \"io.containerd.service.v1.containers-service\"..." type=io.containerd.service.v1 Sep 12 23:56:18.248745 containerd[1466]: time="2025-09-12T23:56:18.246370120Z" level=info msg="loading plugin \"io.containerd.service.v1.content-service\"..." type=io.containerd.service.v1 Sep 12 23:56:18.248745 containerd[1466]: time="2025-09-12T23:56:18.246398960Z" level=info msg="loading plugin \"io.containerd.service.v1.diff-service\"..." type=io.containerd.service.v1 Sep 12 23:56:18.248745 containerd[1466]: time="2025-09-12T23:56:18.246417040Z" level=info msg="loading plugin \"io.containerd.service.v1.images-service\"..." type=io.containerd.service.v1 Sep 12 23:56:18.248745 containerd[1466]: time="2025-09-12T23:56:18.246442920Z" level=info msg="loading plugin \"io.containerd.service.v1.introspection-service\"..." type=io.containerd.service.v1 Sep 12 23:56:18.248745 containerd[1466]: time="2025-09-12T23:56:18.246478400Z" level=info msg="loading plugin \"io.containerd.service.v1.namespaces-service\"..." type=io.containerd.service.v1 Sep 12 23:56:18.248745 containerd[1466]: time="2025-09-12T23:56:18.246498040Z" level=info msg="loading plugin \"io.containerd.service.v1.snapshots-service\"..." type=io.containerd.service.v1 Sep 12 23:56:18.248745 containerd[1466]: time="2025-09-12T23:56:18.246515360Z" level=info msg="loading plugin \"io.containerd.service.v1.tasks-service\"..." type=io.containerd.service.v1 Sep 12 23:56:18.248745 containerd[1466]: time="2025-09-12T23:56:18.246544680Z" level=info msg="loading plugin \"io.containerd.grpc.v1.containers\"..." type=io.containerd.grpc.v1 Sep 12 23:56:18.248745 containerd[1466]: time="2025-09-12T23:56:18.246565160Z" level=info msg="loading plugin \"io.containerd.grpc.v1.content\"..." 
type=io.containerd.grpc.v1 Sep 12 23:56:18.248745 containerd[1466]: time="2025-09-12T23:56:18.246582280Z" level=info msg="loading plugin \"io.containerd.grpc.v1.diff\"..." type=io.containerd.grpc.v1 Sep 12 23:56:18.248745 containerd[1466]: time="2025-09-12T23:56:18.246597400Z" level=info msg="loading plugin \"io.containerd.grpc.v1.events\"..." type=io.containerd.grpc.v1 Sep 12 23:56:18.248745 containerd[1466]: time="2025-09-12T23:56:18.246613680Z" level=info msg="loading plugin \"io.containerd.grpc.v1.images\"..." type=io.containerd.grpc.v1 Sep 12 23:56:18.248745 containerd[1466]: time="2025-09-12T23:56:18.246632720Z" level=info msg="loading plugin \"io.containerd.grpc.v1.introspection\"..." type=io.containerd.grpc.v1 Sep 12 23:56:18.248745 containerd[1466]: time="2025-09-12T23:56:18.246649320Z" level=info msg="loading plugin \"io.containerd.grpc.v1.leases\"..." type=io.containerd.grpc.v1 Sep 12 23:56:18.249964 containerd[1466]: time="2025-09-12T23:56:18.246666840Z" level=info msg="loading plugin \"io.containerd.grpc.v1.namespaces\"..." type=io.containerd.grpc.v1 Sep 12 23:56:18.249964 containerd[1466]: time="2025-09-12T23:56:18.246684040Z" level=info msg="loading plugin \"io.containerd.grpc.v1.sandbox-controllers\"..." type=io.containerd.grpc.v1 Sep 12 23:56:18.249964 containerd[1466]: time="2025-09-12T23:56:18.246737160Z" level=info msg="loading plugin \"io.containerd.grpc.v1.sandboxes\"..." type=io.containerd.grpc.v1 Sep 12 23:56:18.249964 containerd[1466]: time="2025-09-12T23:56:18.246757720Z" level=info msg="loading plugin \"io.containerd.grpc.v1.snapshots\"..." type=io.containerd.grpc.v1 Sep 12 23:56:18.249964 containerd[1466]: time="2025-09-12T23:56:18.246774520Z" level=info msg="loading plugin \"io.containerd.grpc.v1.streaming\"..." type=io.containerd.grpc.v1 Sep 12 23:56:18.249964 containerd[1466]: time="2025-09-12T23:56:18.246792000Z" level=info msg="loading plugin \"io.containerd.grpc.v1.tasks\"..." 
type=io.containerd.grpc.v1 Sep 12 23:56:18.249964 containerd[1466]: time="2025-09-12T23:56:18.246821720Z" level=info msg="loading plugin \"io.containerd.transfer.v1.local\"..." type=io.containerd.transfer.v1 Sep 12 23:56:18.249964 containerd[1466]: time="2025-09-12T23:56:18.246852320Z" level=info msg="loading plugin \"io.containerd.grpc.v1.transfer\"..." type=io.containerd.grpc.v1 Sep 12 23:56:18.249964 containerd[1466]: time="2025-09-12T23:56:18.246869200Z" level=info msg="loading plugin \"io.containerd.grpc.v1.version\"..." type=io.containerd.grpc.v1 Sep 12 23:56:18.249964 containerd[1466]: time="2025-09-12T23:56:18.246892520Z" level=info msg="loading plugin \"io.containerd.internal.v1.restart\"..." type=io.containerd.internal.v1 Sep 12 23:56:18.249964 containerd[1466]: time="2025-09-12T23:56:18.247017720Z" level=info msg="loading plugin \"io.containerd.tracing.processor.v1.otlp\"..." type=io.containerd.tracing.processor.v1 Sep 12 23:56:18.249964 containerd[1466]: time="2025-09-12T23:56:18.247041840Z" level=info msg="skip loading plugin \"io.containerd.tracing.processor.v1.otlp\"..." error="skip plugin: tracing endpoint not configured" type=io.containerd.tracing.processor.v1 Sep 12 23:56:18.249964 containerd[1466]: time="2025-09-12T23:56:18.247054320Z" level=info msg="loading plugin \"io.containerd.internal.v1.tracing\"..." type=io.containerd.internal.v1 Sep 12 23:56:18.250213 containerd[1466]: time="2025-09-12T23:56:18.247072160Z" level=info msg="skip loading plugin \"io.containerd.internal.v1.tracing\"..." error="skip plugin: tracing endpoint not configured" type=io.containerd.internal.v1 Sep 12 23:56:18.250213 containerd[1466]: time="2025-09-12T23:56:18.247086080Z" level=info msg="loading plugin \"io.containerd.grpc.v1.healthcheck\"..." type=io.containerd.grpc.v1 Sep 12 23:56:18.250213 containerd[1466]: time="2025-09-12T23:56:18.247117120Z" level=info msg="loading plugin \"io.containerd.nri.v1.nri\"..." 
type=io.containerd.nri.v1 Sep 12 23:56:18.250213 containerd[1466]: time="2025-09-12T23:56:18.247132080Z" level=info msg="NRI interface is disabled by configuration." Sep 12 23:56:18.250213 containerd[1466]: time="2025-09-12T23:56:18.247146800Z" level=info msg="loading plugin \"io.containerd.grpc.v1.cri\"..." type=io.containerd.grpc.v1 Sep 12 23:56:18.250312 containerd[1466]: time="2025-09-12T23:56:18.247619960Z" level=info msg="Start cri plugin with config {PluginConfig:{ContainerdConfig:{Snapshotter:overlayfs DefaultRuntimeName:runc DefaultRuntime:{Type: Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:} UntrustedWorkloadRuntime:{Type: Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:} Runtimes:map[runc:{Type:io.containerd.runc.v2 Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[SystemdCgroup:true] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:podsandbox}] NoPivot:false DisableSnapshotAnnotations:true DiscardUnpackedLayers:false IgnoreBlockIONotEnabledErrors:false IgnoreRdtNotEnabledErrors:false} CniConfig:{NetworkPluginBinDir:/opt/cni/bin NetworkPluginConfDir:/etc/cni/net.d NetworkPluginMaxConfNum:1 NetworkPluginSetupSerially:false NetworkPluginConfTemplate: IPPreference:} Registry:{ConfigPath: Mirrors:map[] Configs:map[] Auths:map[] Headers:map[]} ImageDecryption:{KeyModel:node} DisableTCPService:true StreamServerAddress:127.0.0.1 StreamServerPort:0 StreamIdleTimeout:4h0m0s EnableSelinux:true 
SelinuxCategoryRange:1024 SandboxImage:registry.k8s.io/pause:3.8 StatsCollectPeriod:10 SystemdCgroup:false EnableTLSStreaming:false X509KeyPairStreaming:{TLSCertFile: TLSKeyFile:} MaxContainerLogLineSize:16384 DisableCgroup:false DisableApparmor:false RestrictOOMScoreAdj:false MaxConcurrentDownloads:3 DisableProcMount:false UnsetSeccompProfile: TolerateMissingHugetlbController:true DisableHugetlbController:true DeviceOwnershipFromSecurityContext:false IgnoreImageDefinedVolumes:false NetNSMountsUnderStateDir:false EnableUnprivilegedPorts:false EnableUnprivilegedICMP:false EnableCDI:false CDISpecDirs:[/etc/cdi /var/run/cdi] ImagePullProgressTimeout:5m0s DrainExecSyncIOTimeout:0s ImagePullWithSyncFs:false IgnoreDeprecationWarnings:[]} ContainerdRootDir:/var/lib/containerd ContainerdEndpoint:/run/containerd/containerd.sock RootDir:/var/lib/containerd/io.containerd.grpc.v1.cri StateDir:/run/containerd/io.containerd.grpc.v1.cri}" Sep 12 23:56:18.252035 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... 
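The flattened `PluginConfig` dump above is containerd's in-memory CRI configuration printed as a Go struct. For readability, a partial reconstruction of what the equivalent `config.toml` would look like — field names follow the containerd 1.7 CRI plugin conventions, and only values actually visible in the dump (overlayfs snapshotter, runc with `SystemdCgroup:true`, `pause:3.8` sandbox image, SELinux enabled, CNI paths) are included; treat it as a sketch, not the host's actual file:

```toml
# Partial reconstruction of the CRI config dumped above (containerd 1.7).
version = 2

[plugins."io.containerd.grpc.v1.cri"]
  sandbox_image = "registry.k8s.io/pause:3.8"   # SandboxImage
  enable_selinux = true                          # EnableSelinux:true

  [plugins."io.containerd.grpc.v1.cri".containerd]
    snapshotter = "overlayfs"                    # Snapshotter:overlayfs
    default_runtime_name = "runc"                # DefaultRuntimeName:runc

    [plugins."io.containerd.grpc.v1.cri".containerd.runtimes.runc]
      runtime_type = "io.containerd.runc.v2"     # Type:io.containerd.runc.v2

      [plugins."io.containerd.grpc.v1.cri".containerd.runtimes.runc.options]
        SystemdCgroup = true                     # Options:map[SystemdCgroup:true]

  [plugins."io.containerd.grpc.v1.cri".cni]
    bin_dir = "/opt/cni/bin"                     # NetworkPluginBinDir
    conf_dir = "/etc/cni/net.d"                  # NetworkPluginConfDir
```

The `conf_dir` value also explains the `cni config load failed: no network config found in /etc/cni/net.d` error a few lines below: no CNI plugin has written a network config yet at this point in boot.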
Sep 12 23:56:18.256762 containerd[1466]: time="2025-09-12T23:56:18.247695360Z" level=info msg="Connect containerd service" Sep 12 23:56:18.257387 containerd[1466]: time="2025-09-12T23:56:18.256940520Z" level=info msg="using legacy CRI server" Sep 12 23:56:18.257387 containerd[1466]: time="2025-09-12T23:56:18.256970800Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Sep 12 23:56:18.257387 containerd[1466]: time="2025-09-12T23:56:18.257095160Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\"" Sep 12 23:56:18.258384 containerd[1466]: time="2025-09-12T23:56:18.258303000Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Sep 12 23:56:18.259155 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Sep 12 23:56:18.262350 containerd[1466]: time="2025-09-12T23:56:18.261307320Z" level=info msg=serving... 
address=/run/containerd/containerd.sock.ttrpc Sep 12 23:56:18.270914 containerd[1466]: time="2025-09-12T23:56:18.263640120Z" level=info msg="Start subscribing containerd event" Sep 12 23:56:18.270914 containerd[1466]: time="2025-09-12T23:56:18.270900880Z" level=info msg="Start recovering state" Sep 12 23:56:18.271059 containerd[1466]: time="2025-09-12T23:56:18.270988160Z" level=info msg="Start event monitor" Sep 12 23:56:18.271059 containerd[1466]: time="2025-09-12T23:56:18.271001320Z" level=info msg="Start snapshots syncer" Sep 12 23:56:18.271059 containerd[1466]: time="2025-09-12T23:56:18.271013320Z" level=info msg="Start cni network conf syncer for default" Sep 12 23:56:18.271059 containerd[1466]: time="2025-09-12T23:56:18.271020920Z" level=info msg="Start streaming server" Sep 12 23:56:18.272684 containerd[1466]: time="2025-09-12T23:56:18.270848920Z" level=info msg=serving... address=/run/containerd/containerd.sock Sep 12 23:56:18.272684 containerd[1466]: time="2025-09-12T23:56:18.271252800Z" level=info msg="containerd successfully booted in 0.125892s" Sep 12 23:56:18.271367 systemd[1]: Started containerd.service - containerd container runtime. Sep 12 23:56:18.304819 update-ssh-keys[1538]: Updated "/home/core/.ssh/authorized_keys" Sep 12 23:56:18.305854 systemd[1]: Finished coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys). Sep 12 23:56:18.310557 systemd[1]: Finished sshkeys.service. Sep 12 23:56:18.328023 locksmithd[1488]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Sep 12 23:56:18.347784 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. Sep 12 23:56:18.843381 sshd_keygen[1484]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Sep 12 23:56:18.897522 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. Sep 12 23:56:18.898827 tar[1459]: linux-arm64/README.md Sep 12 23:56:18.917236 systemd[1]: Starting issuegen.service - Generate /run/issue... 
Sep 12 23:56:18.928320 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. Sep 12 23:56:18.931508 systemd[1]: issuegen.service: Deactivated successfully. Sep 12 23:56:18.931865 systemd[1]: Finished issuegen.service - Generate /run/issue. Sep 12 23:56:18.940232 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... Sep 12 23:56:18.956232 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. Sep 12 23:56:18.967224 systemd[1]: Started getty@tty1.service - Getty on tty1. Sep 12 23:56:18.976669 systemd[1]: Started serial-getty@ttyAMA0.service - Serial Getty on ttyAMA0. Sep 12 23:56:18.980028 systemd[1]: Reached target getty.target - Login Prompts. Sep 12 23:56:19.193904 systemd-networkd[1378]: eth1: Gained IPv6LL Sep 12 23:56:19.195641 systemd-timesyncd[1354]: Network configuration changed, trying to establish connection. Sep 12 23:56:19.240123 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 12 23:56:19.241955 systemd[1]: Reached target multi-user.target - Multi-User System. Sep 12 23:56:19.247435 systemd[1]: Startup finished in 826ms (kernel) + 6.031s (initrd) + 4.555s (userspace) = 11.412s. Sep 12 23:56:19.251667 (kubelet)[1574]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 12 23:56:19.806135 kubelet[1574]: E0912 23:56:19.805160 1574 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 12 23:56:19.809156 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 12 23:56:19.809319 systemd[1]: kubelet.service: Failed with result 'exit-code'. 
Sep 12 23:56:30.060762 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. Sep 12 23:56:30.071068 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 12 23:56:30.229051 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 12 23:56:30.229940 (kubelet)[1593]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 12 23:56:30.281215 kubelet[1593]: E0912 23:56:30.281108 1593 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 12 23:56:30.285009 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 12 23:56:30.285173 systemd[1]: kubelet.service: Failed with result 'exit-code'. Sep 12 23:56:40.536207 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. Sep 12 23:56:40.546093 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 12 23:56:40.672700 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Sep 12 23:56:40.678056 (kubelet)[1607]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 12 23:56:40.724610 kubelet[1607]: E0912 23:56:40.724536 1607 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 12 23:56:40.727511 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 12 23:56:40.727661 systemd[1]: kubelet.service: Failed with result 'exit-code'. Sep 12 23:56:49.451558 systemd-timesyncd[1354]: Contacted time server 157.90.247.99:123 (2.flatcar.pool.ntp.org). Sep 12 23:56:49.451643 systemd-timesyncd[1354]: Initial clock synchronization to Fri 2025-09-12 23:56:49.129431 UTC. Sep 12 23:56:50.978272 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3. Sep 12 23:56:50.986056 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 12 23:56:51.112444 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Sep 12 23:56:51.123650 (kubelet)[1621]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 12 23:56:51.169141 kubelet[1621]: E0912 23:56:51.169092 1621 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 12 23:56:51.173058 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 12 23:56:51.173285 systemd[1]: kubelet.service: Failed with result 'exit-code'. Sep 12 23:56:56.230139 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Sep 12 23:56:56.238358 systemd[1]: Started sshd@0-91.99.152.252:22-147.75.109.163:48800.service - OpenSSH per-connection server daemon (147.75.109.163:48800). Sep 12 23:56:57.201178 sshd[1630]: Accepted publickey for core from 147.75.109.163 port 48800 ssh2: RSA SHA256:bk/7TLrptUsRlsRU8kT0ooDVsm6tbA2jrK7QjRZsxaM Sep 12 23:56:57.203227 sshd[1630]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 23:56:57.213138 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Sep 12 23:56:57.220186 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Sep 12 23:56:57.224817 systemd-logind[1453]: New session 1 of user core. Sep 12 23:56:57.233741 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Sep 12 23:56:57.239048 systemd[1]: Starting user@500.service - User Manager for UID 500... Sep 12 23:56:57.262816 (systemd)[1634]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) Sep 12 23:56:57.379927 systemd[1634]: Queued start job for default target default.target. 
Sep 12 23:56:57.391982 systemd[1634]: Created slice app.slice - User Application Slice. Sep 12 23:56:57.392049 systemd[1634]: Reached target paths.target - Paths. Sep 12 23:56:57.392078 systemd[1634]: Reached target timers.target - Timers. Sep 12 23:56:57.394198 systemd[1634]: Starting dbus.socket - D-Bus User Message Bus Socket... Sep 12 23:56:57.410907 systemd[1634]: Listening on dbus.socket - D-Bus User Message Bus Socket. Sep 12 23:56:57.411080 systemd[1634]: Reached target sockets.target - Sockets. Sep 12 23:56:57.411097 systemd[1634]: Reached target basic.target - Basic System. Sep 12 23:56:57.411149 systemd[1634]: Reached target default.target - Main User Target. Sep 12 23:56:57.411180 systemd[1634]: Startup finished in 141ms. Sep 12 23:56:57.411306 systemd[1]: Started user@500.service - User Manager for UID 500. Sep 12 23:56:57.422059 systemd[1]: Started session-1.scope - Session 1 of User core. Sep 12 23:56:58.103072 systemd[1]: Started sshd@1-91.99.152.252:22-147.75.109.163:48808.service - OpenSSH per-connection server daemon (147.75.109.163:48808). Sep 12 23:56:59.076063 sshd[1645]: Accepted publickey for core from 147.75.109.163 port 48808 ssh2: RSA SHA256:bk/7TLrptUsRlsRU8kT0ooDVsm6tbA2jrK7QjRZsxaM Sep 12 23:56:59.078181 sshd[1645]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 23:56:59.086114 systemd-logind[1453]: New session 2 of user core. Sep 12 23:56:59.094038 systemd[1]: Started session-2.scope - Session 2 of User core. Sep 12 23:56:59.757136 sshd[1645]: pam_unix(sshd:session): session closed for user core Sep 12 23:56:59.761102 systemd[1]: sshd@1-91.99.152.252:22-147.75.109.163:48808.service: Deactivated successfully. Sep 12 23:56:59.762623 systemd[1]: session-2.scope: Deactivated successfully. Sep 12 23:56:59.764553 systemd-logind[1453]: Session 2 logged out. Waiting for processes to exit. Sep 12 23:56:59.765891 systemd-logind[1453]: Removed session 2. 
Sep 12 23:56:59.926008 systemd[1]: Started sshd@2-91.99.152.252:22-147.75.109.163:48812.service - OpenSSH per-connection server daemon (147.75.109.163:48812). Sep 12 23:57:00.917583 sshd[1652]: Accepted publickey for core from 147.75.109.163 port 48812 ssh2: RSA SHA256:bk/7TLrptUsRlsRU8kT0ooDVsm6tbA2jrK7QjRZsxaM Sep 12 23:57:00.919657 sshd[1652]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 23:57:00.927591 systemd-logind[1453]: New session 3 of user core. Sep 12 23:57:00.935135 systemd[1]: Started session-3.scope - Session 3 of User core. Sep 12 23:57:01.405228 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 4. Sep 12 23:57:01.418530 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 12 23:57:01.568916 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 12 23:57:01.581177 (kubelet)[1664]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 12 23:57:01.598654 sshd[1652]: pam_unix(sshd:session): session closed for user core Sep 12 23:57:01.602127 systemd-logind[1453]: Session 3 logged out. Waiting for processes to exit. Sep 12 23:57:01.604327 systemd[1]: sshd@2-91.99.152.252:22-147.75.109.163:48812.service: Deactivated successfully. Sep 12 23:57:01.606716 systemd[1]: session-3.scope: Deactivated successfully. Sep 12 23:57:01.608210 systemd-logind[1453]: Removed session 3. 
Sep 12 23:57:01.630139 kubelet[1664]: E0912 23:57:01.630062 1664 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 12 23:57:01.633334 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 12 23:57:01.634007 systemd[1]: kubelet.service: Failed with result 'exit-code'. Sep 12 23:57:01.778368 systemd[1]: Started sshd@3-91.99.152.252:22-147.75.109.163:57874.service - OpenSSH per-connection server daemon (147.75.109.163:57874). Sep 12 23:57:02.758149 sshd[1674]: Accepted publickey for core from 147.75.109.163 port 57874 ssh2: RSA SHA256:bk/7TLrptUsRlsRU8kT0ooDVsm6tbA2jrK7QjRZsxaM Sep 12 23:57:02.760235 sshd[1674]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 23:57:02.765318 systemd-logind[1453]: New session 4 of user core. Sep 12 23:57:02.773056 systemd[1]: Started session-4.scope - Session 4 of User core. Sep 12 23:57:02.835865 update_engine[1454]: I20250912 23:57:02.835583 1454 update_attempter.cc:509] Updating boot flags... Sep 12 23:57:02.873615 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 37 scanned by (udev-worker) (1686) Sep 12 23:57:02.937742 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 37 scanned by (udev-worker) (1686) Sep 12 23:57:03.438995 sshd[1674]: pam_unix(sshd:session): session closed for user core Sep 12 23:57:03.445408 systemd[1]: sshd@3-91.99.152.252:22-147.75.109.163:57874.service: Deactivated successfully. Sep 12 23:57:03.448445 systemd[1]: session-4.scope: Deactivated successfully. Sep 12 23:57:03.451422 systemd-logind[1453]: Session 4 logged out. Waiting for processes to exit. Sep 12 23:57:03.452764 systemd-logind[1453]: Removed session 4. 
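This is the fourth identical kubelet failure: the unit exits because `/var/lib/kubelet/config.yaml` does not exist (it is normally written by `kubeadm init`/`kubeadm join`, which has not run yet), and systemd keeps rescheduling it. The spacing between each "Main process exited" and the next "Scheduled restart job" line is consistent with `Restart=on-failure` and a `RestartSec` of roughly 10 s — an inference from the timestamps, not something the log states directly:

```python
# Restart cadence of kubelet.service, from the journal timestamps above.
# Each pair is (seconds past 23:56:00 of "Main process exited",
#               seconds past 23:56:00 of the following "Scheduled restart job").
pairs = [
    (19.809156, 30.060762),  # failure -> restart counter 1
    (30.285009, 40.536207),  # failure -> restart counter 2
    (40.727511, 50.978272),  # failure -> restart counter 3
    (51.173058, 61.405228),  # failure -> restart counter 4 (23:57:01.405)
]
for failed, restarted in pairs:
    print(f"restart delay: {restarted - failed:.2f} s")
```

Every gap comes out near 10.25 s (the third interval is slightly distorted in wall-clock terms by the systemd-timesyncd clock step at 23:56:49). The loop is expected to continue until something provisions the kubelet config file.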
Sep 12 23:57:03.619141 systemd[1]: Started sshd@4-91.99.152.252:22-147.75.109.163:57890.service - OpenSSH per-connection server daemon (147.75.109.163:57890). Sep 12 23:57:04.598680 sshd[1699]: Accepted publickey for core from 147.75.109.163 port 57890 ssh2: RSA SHA256:bk/7TLrptUsRlsRU8kT0ooDVsm6tbA2jrK7QjRZsxaM Sep 12 23:57:04.600776 sshd[1699]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 23:57:04.607546 systemd-logind[1453]: New session 5 of user core. Sep 12 23:57:04.611926 systemd[1]: Started session-5.scope - Session 5 of User core. Sep 12 23:57:05.133758 sudo[1702]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Sep 12 23:57:05.134137 sudo[1702]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 12 23:57:05.148808 sudo[1702]: pam_unix(sudo:session): session closed for user root Sep 12 23:57:05.309204 sshd[1699]: pam_unix(sshd:session): session closed for user core Sep 12 23:57:05.314450 systemd[1]: sshd@4-91.99.152.252:22-147.75.109.163:57890.service: Deactivated successfully. Sep 12 23:57:05.316934 systemd[1]: session-5.scope: Deactivated successfully. Sep 12 23:57:05.318227 systemd-logind[1453]: Session 5 logged out. Waiting for processes to exit. Sep 12 23:57:05.320810 systemd-logind[1453]: Removed session 5. Sep 12 23:57:05.486148 systemd[1]: Started sshd@5-91.99.152.252:22-147.75.109.163:57898.service - OpenSSH per-connection server daemon (147.75.109.163:57898). Sep 12 23:57:06.453698 sshd[1707]: Accepted publickey for core from 147.75.109.163 port 57898 ssh2: RSA SHA256:bk/7TLrptUsRlsRU8kT0ooDVsm6tbA2jrK7QjRZsxaM Sep 12 23:57:06.458588 sshd[1707]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 23:57:06.464228 systemd-logind[1453]: New session 6 of user core. Sep 12 23:57:06.477101 systemd[1]: Started session-6.scope - Session 6 of User core. 
Sep 12 23:57:06.970773 sudo[1711]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/bin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Sep 12 23:57:06.971052 sudo[1711]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 12 23:57:06.982552 sudo[1711]: pam_unix(sudo:session): session closed for user root Sep 12 23:57:06.988673 sudo[1710]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/bin/systemctl restart audit-rules Sep 12 23:57:06.989244 sudo[1710]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 12 23:57:07.013586 systemd[1]: Stopping audit-rules.service - Load Security Auditing Rules... Sep 12 23:57:07.017550 auditctl[1714]: No rules Sep 12 23:57:07.018195 systemd[1]: audit-rules.service: Deactivated successfully. Sep 12 23:57:07.018383 systemd[1]: Stopped audit-rules.service - Load Security Auditing Rules. Sep 12 23:57:07.027387 systemd[1]: Starting audit-rules.service - Load Security Auditing Rules... Sep 12 23:57:07.056310 augenrules[1732]: No rules Sep 12 23:57:07.058836 systemd[1]: Finished audit-rules.service - Load Security Auditing Rules. Sep 12 23:57:07.060973 sudo[1710]: pam_unix(sudo:session): session closed for user root Sep 12 23:57:07.220504 sshd[1707]: pam_unix(sshd:session): session closed for user core Sep 12 23:57:07.226567 systemd[1]: sshd@5-91.99.152.252:22-147.75.109.163:57898.service: Deactivated successfully. Sep 12 23:57:07.230722 systemd[1]: session-6.scope: Deactivated successfully. Sep 12 23:57:07.233973 systemd-logind[1453]: Session 6 logged out. Waiting for processes to exit. Sep 12 23:57:07.235469 systemd-logind[1453]: Removed session 6. Sep 12 23:57:07.401182 systemd[1]: Started sshd@6-91.99.152.252:22-147.75.109.163:57904.service - OpenSSH per-connection server daemon (147.75.109.163:57904). 
Sep 12 23:57:08.380330 sshd[1740]: Accepted publickey for core from 147.75.109.163 port 57904 ssh2: RSA SHA256:bk/7TLrptUsRlsRU8kT0ooDVsm6tbA2jrK7QjRZsxaM Sep 12 23:57:08.383729 sshd[1740]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 23:57:08.391796 systemd-logind[1453]: New session 7 of user core. Sep 12 23:57:08.398936 systemd[1]: Started session-7.scope - Session 7 of User core. Sep 12 23:57:08.903388 sudo[1743]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Sep 12 23:57:08.903683 sudo[1743]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 12 23:57:09.218224 systemd[1]: Starting docker.service - Docker Application Container Engine... Sep 12 23:57:09.218627 (dockerd)[1758]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU Sep 12 23:57:09.478605 dockerd[1758]: time="2025-09-12T23:57:09.478134841Z" level=info msg="Starting up" Sep 12 23:57:09.560220 systemd[1]: var-lib-docker-check\x2doverlayfs\x2dsupport2558344079-merged.mount: Deactivated successfully. Sep 12 23:57:09.585791 dockerd[1758]: time="2025-09-12T23:57:09.585454352Z" level=info msg="Loading containers: start." Sep 12 23:57:09.706728 kernel: Initializing XFRM netlink socket Sep 12 23:57:09.799097 systemd-networkd[1378]: docker0: Link UP Sep 12 23:57:09.819933 dockerd[1758]: time="2025-09-12T23:57:09.819186055Z" level=info msg="Loading containers: done." 
Sep 12 23:57:09.839363 dockerd[1758]: time="2025-09-12T23:57:09.839290506Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Sep 12 23:57:09.840181 dockerd[1758]: time="2025-09-12T23:57:09.839741067Z" level=info msg="Docker daemon" commit=061aa95809be396a6b5542618d8a34b02a21ff77 containerd-snapshotter=false storage-driver=overlay2 version=26.1.0 Sep 12 23:57:09.840181 dockerd[1758]: time="2025-09-12T23:57:09.839905662Z" level=info msg="Daemon has completed initialization" Sep 12 23:57:09.884082 dockerd[1758]: time="2025-09-12T23:57:09.883874199Z" level=info msg="API listen on /run/docker.sock" Sep 12 23:57:09.884686 systemd[1]: Started docker.service - Docker Application Container Engine. Sep 12 23:57:10.556111 systemd[1]: var-lib-docker-overlay2-opaque\x2dbug\x2dcheck1826171707-merged.mount: Deactivated successfully. Sep 12 23:57:11.011018 containerd[1466]: time="2025-09-12T23:57:11.010889463Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.32.9\"" Sep 12 23:57:11.654745 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 5. Sep 12 23:57:11.666001 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 12 23:57:11.759594 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2927866426.mount: Deactivated successfully. Sep 12 23:57:11.855012 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Sep 12 23:57:11.858611 (kubelet)[1913]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 12 23:57:11.918634 kubelet[1913]: E0912 23:57:11.918508 1913 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 12 23:57:11.922426 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 12 23:57:11.922580 systemd[1]: kubelet.service: Failed with result 'exit-code'. Sep 12 23:57:13.288778 containerd[1466]: time="2025-09-12T23:57:13.288670620Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.32.9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:57:13.290854 containerd[1466]: time="2025-09-12T23:57:13.290787543Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.32.9: active requests=0, bytes read=26363783" Sep 12 23:57:13.292343 containerd[1466]: time="2025-09-12T23:57:13.292281192Z" level=info msg="ImageCreate event name:\"sha256:02ea53851f07db91ed471dab1ab11541f5c294802371cd8f0cfd423cd5c71002\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:57:13.295735 containerd[1466]: time="2025-09-12T23:57:13.295589488Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:6df11cc2ad9679b1117be34d3a0230add88bc0a08fd7a3ebc26b680575e8de97\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:57:13.297650 containerd[1466]: time="2025-09-12T23:57:13.297385491Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.32.9\" with image id \"sha256:02ea53851f07db91ed471dab1ab11541f5c294802371cd8f0cfd423cd5c71002\", repo tag \"registry.k8s.io/kube-apiserver:v1.32.9\", repo 
digest \"registry.k8s.io/kube-apiserver@sha256:6df11cc2ad9679b1117be34d3a0230add88bc0a08fd7a3ebc26b680575e8de97\", size \"26360284\" in 2.28635483s" Sep 12 23:57:13.297650 containerd[1466]: time="2025-09-12T23:57:13.297441706Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.32.9\" returns image reference \"sha256:02ea53851f07db91ed471dab1ab11541f5c294802371cd8f0cfd423cd5c71002\"" Sep 12 23:57:13.298872 containerd[1466]: time="2025-09-12T23:57:13.298642023Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.32.9\"" Sep 12 23:57:14.976848 containerd[1466]: time="2025-09-12T23:57:14.976784147Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.32.9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:57:14.978791 containerd[1466]: time="2025-09-12T23:57:14.978744861Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.32.9: active requests=0, bytes read=22531220" Sep 12 23:57:14.979685 containerd[1466]: time="2025-09-12T23:57:14.978912028Z" level=info msg="ImageCreate event name:\"sha256:f0bcbad5082c944520b370596a2384affda710b9d7daf84e8a48352699af8e4b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:57:14.983543 containerd[1466]: time="2025-09-12T23:57:14.983476125Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:243c4b8e3bce271fcb1b78008ab996ab6976b1a20096deac08338fcd17979922\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:57:14.985012 containerd[1466]: time="2025-09-12T23:57:14.984962735Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.32.9\" with image id \"sha256:f0bcbad5082c944520b370596a2384affda710b9d7daf84e8a48352699af8e4b\", repo tag \"registry.k8s.io/kube-controller-manager:v1.32.9\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:243c4b8e3bce271fcb1b78008ab996ab6976b1a20096deac08338fcd17979922\", size \"24099975\" in 
1.686279027s" Sep 12 23:57:14.985012 containerd[1466]: time="2025-09-12T23:57:14.985007222Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.32.9\" returns image reference \"sha256:f0bcbad5082c944520b370596a2384affda710b9d7daf84e8a48352699af8e4b\"" Sep 12 23:57:14.985647 containerd[1466]: time="2025-09-12T23:57:14.985614070Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.32.9\"" Sep 12 23:57:16.373985 containerd[1466]: time="2025-09-12T23:57:16.373911853Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.32.9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:57:16.375768 containerd[1466]: time="2025-09-12T23:57:16.375726303Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.32.9: active requests=0, bytes read=17484344" Sep 12 23:57:16.377583 containerd[1466]: time="2025-09-12T23:57:16.377524013Z" level=info msg="ImageCreate event name:\"sha256:1d625baf81b59592006d97a6741bc947698ed222b612ac10efa57b7aa96d2a27\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:57:16.383873 containerd[1466]: time="2025-09-12T23:57:16.383798242Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:50c49520dbd0e8b4076b6a5c77d8014df09ea3d59a73e8bafd2678d51ebb92d5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:57:16.385948 containerd[1466]: time="2025-09-12T23:57:16.385613570Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.32.9\" with image id \"sha256:1d625baf81b59592006d97a6741bc947698ed222b612ac10efa57b7aa96d2a27\", repo tag \"registry.k8s.io/kube-scheduler:v1.32.9\", repo digest \"registry.k8s.io/kube-scheduler@sha256:50c49520dbd0e8b4076b6a5c77d8014df09ea3d59a73e8bafd2678d51ebb92d5\", size \"19053117\" in 1.399953409s" Sep 12 23:57:16.385948 containerd[1466]: time="2025-09-12T23:57:16.385677650Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.32.9\" returns image reference 
\"sha256:1d625baf81b59592006d97a6741bc947698ed222b612ac10efa57b7aa96d2a27\"" Sep 12 23:57:16.386687 containerd[1466]: time="2025-09-12T23:57:16.386643761Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.32.9\"" Sep 12 23:57:17.617360 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3155698932.mount: Deactivated successfully. Sep 12 23:57:17.944436 containerd[1466]: time="2025-09-12T23:57:17.944330206Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.32.9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:57:17.946769 containerd[1466]: time="2025-09-12T23:57:17.946719870Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.32.9: active requests=0, bytes read=27417843" Sep 12 23:57:17.948373 containerd[1466]: time="2025-09-12T23:57:17.948313525Z" level=info msg="ImageCreate event name:\"sha256:72b57ec14d31e8422925ef4c3eff44822cdc04a11fd30d13824f1897d83a16d4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:57:17.955482 containerd[1466]: time="2025-09-12T23:57:17.955116557Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:886af02535dc34886e4618b902f8c140d89af57233a245621d29642224516064\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:57:17.956688 containerd[1466]: time="2025-09-12T23:57:17.956645883Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.32.9\" with image id \"sha256:72b57ec14d31e8422925ef4c3eff44822cdc04a11fd30d13824f1897d83a16d4\", repo tag \"registry.k8s.io/kube-proxy:v1.32.9\", repo digest \"registry.k8s.io/kube-proxy@sha256:886af02535dc34886e4618b902f8c140d89af57233a245621d29642224516064\", size \"27416836\" in 1.569792902s" Sep 12 23:57:17.956844 containerd[1466]: time="2025-09-12T23:57:17.956825407Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.32.9\" returns image reference \"sha256:72b57ec14d31e8422925ef4c3eff44822cdc04a11fd30d13824f1897d83a16d4\"" Sep 12 23:57:17.957364 
containerd[1466]: time="2025-09-12T23:57:17.957337086Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\"" Sep 12 23:57:18.647070 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2135946054.mount: Deactivated successfully. Sep 12 23:57:19.439784 containerd[1466]: time="2025-09-12T23:57:19.439454769Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.11.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:57:19.441476 containerd[1466]: time="2025-09-12T23:57:19.440970459Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.11.3: active requests=0, bytes read=16951714" Sep 12 23:57:19.442955 containerd[1466]: time="2025-09-12T23:57:19.442903239Z" level=info msg="ImageCreate event name:\"sha256:2f6c962e7b8311337352d9fdea917da2184d9919f4da7695bc2a6517cf392fe4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:57:19.448302 containerd[1466]: time="2025-09-12T23:57:19.448213909Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:57:19.450751 containerd[1466]: time="2025-09-12T23:57:19.449842624Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.11.3\" with image id \"sha256:2f6c962e7b8311337352d9fdea917da2184d9919f4da7695bc2a6517cf392fe4\", repo tag \"registry.k8s.io/coredns/coredns:v1.11.3\", repo digest \"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\", size \"16948420\" in 1.492467456s" Sep 12 23:57:19.450751 containerd[1466]: time="2025-09-12T23:57:19.449890704Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\" returns image reference \"sha256:2f6c962e7b8311337352d9fdea917da2184d9919f4da7695bc2a6517cf392fe4\"" Sep 12 23:57:19.451363 containerd[1466]: time="2025-09-12T23:57:19.451138178Z" 
level=info msg="PullImage \"registry.k8s.io/pause:3.10\"" Sep 12 23:57:19.980268 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount431211981.mount: Deactivated successfully. Sep 12 23:57:19.989925 containerd[1466]: time="2025-09-12T23:57:19.987798551Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:57:19.989925 containerd[1466]: time="2025-09-12T23:57:19.988938995Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=268723" Sep 12 23:57:19.990551 containerd[1466]: time="2025-09-12T23:57:19.990243022Z" level=info msg="ImageCreate event name:\"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:57:19.992955 containerd[1466]: time="2025-09-12T23:57:19.992915583Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:57:19.994276 containerd[1466]: time="2025-09-12T23:57:19.994240393Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"267933\" in 543.065685ms" Sep 12 23:57:19.994418 containerd[1466]: time="2025-09-12T23:57:19.994402257Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\"" Sep 12 23:57:19.995264 containerd[1466]: time="2025-09-12T23:57:19.995223329Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.16-0\"" Sep 12 23:57:20.664793 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2660171769.mount: 
Deactivated successfully. Sep 12 23:57:22.154601 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 6. Sep 12 23:57:22.164091 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 12 23:57:22.313479 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 12 23:57:22.318499 (kubelet)[2101]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 12 23:57:22.365641 kubelet[2101]: E0912 23:57:22.365597 2101 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 12 23:57:22.368184 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 12 23:57:22.368337 systemd[1]: kubelet.service: Failed with result 'exit-code'. 
Sep 12 23:57:22.996421 containerd[1466]: time="2025-09-12T23:57:22.996350396Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.16-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:57:22.998057 containerd[1466]: time="2025-09-12T23:57:22.997989444Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.16-0: active requests=0, bytes read=67943239" Sep 12 23:57:22.999620 containerd[1466]: time="2025-09-12T23:57:22.999530423Z" level=info msg="ImageCreate event name:\"sha256:7fc9d4aa817aa6a3e549f3cd49d1f7b496407be979fc36dd5f356d59ce8c3a82\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:57:23.004608 containerd[1466]: time="2025-09-12T23:57:23.004541812Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:c6a9d11cc5c04b114ccdef39a9265eeef818e3d02f5359be035ae784097fdec5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:57:23.007585 containerd[1466]: time="2025-09-12T23:57:23.007293426Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.16-0\" with image id \"sha256:7fc9d4aa817aa6a3e549f3cd49d1f7b496407be979fc36dd5f356d59ce8c3a82\", repo tag \"registry.k8s.io/etcd:3.5.16-0\", repo digest \"registry.k8s.io/etcd@sha256:c6a9d11cc5c04b114ccdef39a9265eeef818e3d02f5359be035ae784097fdec5\", size \"67941650\" in 3.012023912s" Sep 12 23:57:23.007585 containerd[1466]: time="2025-09-12T23:57:23.007407138Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.16-0\" returns image reference \"sha256:7fc9d4aa817aa6a3e549f3cd49d1f7b496407be979fc36dd5f356d59ce8c3a82\"" Sep 12 23:57:29.473401 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Sep 12 23:57:29.482244 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 12 23:57:29.521464 systemd[1]: Reloading requested from client PID 2137 ('systemctl') (unit session-7.scope)... Sep 12 23:57:29.521487 systemd[1]: Reloading... 
Sep 12 23:57:29.659817 zram_generator::config[2180]: No configuration found. Sep 12 23:57:29.756088 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Sep 12 23:57:29.829420 systemd[1]: Reloading finished in 307 ms. Sep 12 23:57:29.893215 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM Sep 12 23:57:29.893352 systemd[1]: kubelet.service: Failed with result 'signal'. Sep 12 23:57:29.893833 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Sep 12 23:57:29.900228 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 12 23:57:30.036666 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 12 23:57:30.047611 (kubelet)[2226]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Sep 12 23:57:30.096658 kubelet[2226]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 12 23:57:30.097069 kubelet[2226]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Sep 12 23:57:30.097144 kubelet[2226]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Sep 12 23:57:30.097346 kubelet[2226]: I0912 23:57:30.097277 2226 server.go:215] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Sep 12 23:57:31.121505 kubelet[2226]: I0912 23:57:31.121448 2226 server.go:520] "Kubelet version" kubeletVersion="v1.32.4" Sep 12 23:57:31.122586 kubelet[2226]: I0912 23:57:31.122137 2226 server.go:522] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Sep 12 23:57:31.122947 kubelet[2226]: I0912 23:57:31.122868 2226 server.go:954] "Client rotation is on, will bootstrap in background" Sep 12 23:57:31.165124 kubelet[2226]: E0912 23:57:31.165011 2226 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://91.99.152.252:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 91.99.152.252:6443: connect: connection refused" logger="UnhandledError" Sep 12 23:57:31.167162 kubelet[2226]: I0912 23:57:31.167112 2226 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Sep 12 23:57:31.174084 kubelet[2226]: E0912 23:57:31.174046 2226 log.go:32] "RuntimeConfig from runtime service failed" err="rpc error: code = Unimplemented desc = unknown method RuntimeConfig for service runtime.v1.RuntimeService" Sep 12 23:57:31.174084 kubelet[2226]: I0912 23:57:31.174080 2226 server.go:1421] "CRI implementation should be updated to support RuntimeConfig when KubeletCgroupDriverFromCRI feature gate has been enabled. Falling back to using cgroupDriver from kubelet config." Sep 12 23:57:31.176959 kubelet[2226]: I0912 23:57:31.176929 2226 server.go:772] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Sep 12 23:57:31.178017 kubelet[2226]: I0912 23:57:31.177944 2226 container_manager_linux.go:268] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Sep 12 23:57:31.178219 kubelet[2226]: I0912 23:57:31.178007 2226 container_manager_linux.go:273] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4081-3-5-n-326e2e5946","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Sep 12 23:57:31.178338 kubelet[2226]: I0912 23:57:31.178275 2226 topology_manager.go:138] "Creating topology manager 
with none policy" Sep 12 23:57:31.178338 kubelet[2226]: I0912 23:57:31.178306 2226 container_manager_linux.go:304] "Creating device plugin manager" Sep 12 23:57:31.178551 kubelet[2226]: I0912 23:57:31.178518 2226 state_mem.go:36] "Initialized new in-memory state store" Sep 12 23:57:31.182451 kubelet[2226]: I0912 23:57:31.182289 2226 kubelet.go:446] "Attempting to sync node with API server" Sep 12 23:57:31.182451 kubelet[2226]: I0912 23:57:31.182328 2226 kubelet.go:341] "Adding static pod path" path="/etc/kubernetes/manifests" Sep 12 23:57:31.182451 kubelet[2226]: I0912 23:57:31.182351 2226 kubelet.go:352] "Adding apiserver pod source" Sep 12 23:57:31.182451 kubelet[2226]: I0912 23:57:31.182361 2226 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Sep 12 23:57:31.186496 kubelet[2226]: W0912 23:57:31.185783 2226 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://91.99.152.252:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4081-3-5-n-326e2e5946&limit=500&resourceVersion=0": dial tcp 91.99.152.252:6443: connect: connection refused Sep 12 23:57:31.186496 kubelet[2226]: E0912 23:57:31.185867 2226 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://91.99.152.252:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4081-3-5-n-326e2e5946&limit=500&resourceVersion=0\": dial tcp 91.99.152.252:6443: connect: connection refused" logger="UnhandledError" Sep 12 23:57:31.186496 kubelet[2226]: W0912 23:57:31.186397 2226 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://91.99.152.252:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 91.99.152.252:6443: connect: connection refused Sep 12 23:57:31.186496 kubelet[2226]: E0912 23:57:31.186439 2226 reflector.go:166] "Unhandled Error" 
err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://91.99.152.252:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 91.99.152.252:6443: connect: connection refused" logger="UnhandledError" Sep 12 23:57:31.188069 kubelet[2226]: I0912 23:57:31.188046 2226 kuberuntime_manager.go:269] "Container runtime initialized" containerRuntime="containerd" version="v1.7.21" apiVersion="v1" Sep 12 23:57:31.188891 kubelet[2226]: I0912 23:57:31.188873 2226 kubelet.go:890] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Sep 12 23:57:31.189089 kubelet[2226]: W0912 23:57:31.189078 2226 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. Sep 12 23:57:31.190089 kubelet[2226]: I0912 23:57:31.190066 2226 watchdog_linux.go:99] "Systemd watchdog is not enabled" Sep 12 23:57:31.190468 kubelet[2226]: I0912 23:57:31.190190 2226 server.go:1287] "Started kubelet" Sep 12 23:57:31.196521 kubelet[2226]: I0912 23:57:31.196487 2226 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Sep 12 23:57:31.199732 kubelet[2226]: E0912 23:57:31.199352 2226 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://91.99.152.252:6443/api/v1/namespaces/default/events\": dial tcp 91.99.152.252:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4081-3-5-n-326e2e5946.1864ae58d8461bdd default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4081-3-5-n-326e2e5946,UID:ci-4081-3-5-n-326e2e5946,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4081-3-5-n-326e2e5946,},FirstTimestamp:2025-09-12 23:57:31.190164445 +0000 UTC m=+1.137593475,LastTimestamp:2025-09-12 23:57:31.190164445 +0000 UTC 
m=+1.137593475,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4081-3-5-n-326e2e5946,}" Sep 12 23:57:31.204782 kubelet[2226]: I0912 23:57:31.204607 2226 volume_manager.go:297] "Starting Kubelet Volume Manager" Sep 12 23:57:31.204910 kubelet[2226]: I0912 23:57:31.204861 2226 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Sep 12 23:57:31.205271 kubelet[2226]: E0912 23:57:31.205243 2226 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4081-3-5-n-326e2e5946\" not found" Sep 12 23:57:31.207109 kubelet[2226]: I0912 23:57:31.206693 2226 server.go:169] "Starting to listen" address="0.0.0.0" port=10250 Sep 12 23:57:31.207814 kubelet[2226]: I0912 23:57:31.207790 2226 server.go:479] "Adding debug handlers to kubelet server" Sep 12 23:57:31.208894 kubelet[2226]: I0912 23:57:31.208836 2226 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Sep 12 23:57:31.209985 kubelet[2226]: I0912 23:57:31.209089 2226 server.go:243] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Sep 12 23:57:31.209985 kubelet[2226]: E0912 23:57:31.209351 2226 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://91.99.152.252:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4081-3-5-n-326e2e5946?timeout=10s\": dial tcp 91.99.152.252:6443: connect: connection refused" interval="200ms" Sep 12 23:57:31.209985 kubelet[2226]: I0912 23:57:31.209641 2226 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Sep 12 23:57:31.210950 kubelet[2226]: I0912 23:57:31.210927 2226 reconciler.go:26] "Reconciler: start to sync state" Sep 12 23:57:31.211166 kubelet[2226]: W0912 23:57:31.211101 2226 reflector.go:569] 
k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://91.99.152.252:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 91.99.152.252:6443: connect: connection refused Sep 12 23:57:31.211244 kubelet[2226]: E0912 23:57:31.211177 2226 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://91.99.152.252:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 91.99.152.252:6443: connect: connection refused" logger="UnhandledError" Sep 12 23:57:31.212859 kubelet[2226]: I0912 23:57:31.212813 2226 factory.go:221] Registration of the systemd container factory successfully Sep 12 23:57:31.213035 kubelet[2226]: I0912 23:57:31.212996 2226 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Sep 12 23:57:31.215350 kubelet[2226]: E0912 23:57:31.214871 2226 kubelet.go:1555] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Sep 12 23:57:31.215350 kubelet[2226]: I0912 23:57:31.215053 2226 factory.go:221] Registration of the containerd container factory successfully Sep 12 23:57:31.226356 kubelet[2226]: I0912 23:57:31.226306 2226 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Sep 12 23:57:31.227957 kubelet[2226]: I0912 23:57:31.227920 2226 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Sep 12 23:57:31.228085 kubelet[2226]: I0912 23:57:31.228075 2226 status_manager.go:227] "Starting to sync pod status with apiserver" Sep 12 23:57:31.228155 kubelet[2226]: I0912 23:57:31.228144 2226 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Sep 12 23:57:31.228204 kubelet[2226]: I0912 23:57:31.228195 2226 kubelet.go:2382] "Starting kubelet main sync loop" Sep 12 23:57:31.228317 kubelet[2226]: E0912 23:57:31.228282 2226 kubelet.go:2406] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Sep 12 23:57:31.236419 kubelet[2226]: W0912 23:57:31.236269 2226 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://91.99.152.252:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 91.99.152.252:6443: connect: connection refused Sep 12 23:57:31.237727 kubelet[2226]: E0912 23:57:31.236740 2226 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://91.99.152.252:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 91.99.152.252:6443: connect: connection refused" logger="UnhandledError" Sep 12 23:57:31.243981 kubelet[2226]: I0912 23:57:31.243847 2226 cpu_manager.go:221] "Starting CPU manager" policy="none" Sep 12 23:57:31.243981 kubelet[2226]: I0912 23:57:31.243976 2226 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Sep 12 23:57:31.244138 kubelet[2226]: I0912 23:57:31.243998 2226 state_mem.go:36] "Initialized new in-memory state store" Sep 12 23:57:31.245965 kubelet[2226]: I0912 23:57:31.245940 2226 policy_none.go:49] "None policy: Start" Sep 12 23:57:31.246024 kubelet[2226]: I0912 23:57:31.245971 2226 memory_manager.go:186] "Starting memorymanager" policy="None" Sep 12 23:57:31.246024 kubelet[2226]: I0912 23:57:31.245984 2226 state_mem.go:35] "Initializing new in-memory state store" Sep 12 23:57:31.252048 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. 
Sep 12 23:57:31.262232 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. Sep 12 23:57:31.266905 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. Sep 12 23:57:31.274976 kubelet[2226]: I0912 23:57:31.274940 2226 manager.go:519] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Sep 12 23:57:31.275757 kubelet[2226]: I0912 23:57:31.275735 2226 eviction_manager.go:189] "Eviction manager: starting control loop" Sep 12 23:57:31.277026 kubelet[2226]: I0912 23:57:31.275884 2226 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Sep 12 23:57:31.277492 kubelet[2226]: I0912 23:57:31.277459 2226 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Sep 12 23:57:31.279856 kubelet[2226]: E0912 23:57:31.279265 2226 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime" Sep 12 23:57:31.279976 kubelet[2226]: E0912 23:57:31.279958 2226 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-4081-3-5-n-326e2e5946\" not found" Sep 12 23:57:31.346959 systemd[1]: Created slice kubepods-burstable-podd337cf5b029a73c04e03d8ba88efb8af.slice - libcontainer container kubepods-burstable-podd337cf5b029a73c04e03d8ba88efb8af.slice. Sep 12 23:57:31.356222 kubelet[2226]: E0912 23:57:31.356012 2226 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4081-3-5-n-326e2e5946\" not found" node="ci-4081-3-5-n-326e2e5946" Sep 12 23:57:31.363306 systemd[1]: Created slice kubepods-burstable-pod725b835a0e6843a5ff8e88d12bc3be4c.slice - libcontainer container kubepods-burstable-pod725b835a0e6843a5ff8e88d12bc3be4c.slice. 
Sep 12 23:57:31.367069 kubelet[2226]: E0912 23:57:31.366834 2226 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4081-3-5-n-326e2e5946\" not found" node="ci-4081-3-5-n-326e2e5946" Sep 12 23:57:31.369837 systemd[1]: Created slice kubepods-burstable-pod3af20e4468a469bbc7b7ef803c7541a1.slice - libcontainer container kubepods-burstable-pod3af20e4468a469bbc7b7ef803c7541a1.slice. Sep 12 23:57:31.373746 kubelet[2226]: E0912 23:57:31.372098 2226 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4081-3-5-n-326e2e5946\" not found" node="ci-4081-3-5-n-326e2e5946" Sep 12 23:57:31.380761 kubelet[2226]: I0912 23:57:31.380212 2226 kubelet_node_status.go:75] "Attempting to register node" node="ci-4081-3-5-n-326e2e5946" Sep 12 23:57:31.381131 kubelet[2226]: E0912 23:57:31.381100 2226 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://91.99.152.252:6443/api/v1/nodes\": dial tcp 91.99.152.252:6443: connect: connection refused" node="ci-4081-3-5-n-326e2e5946" Sep 12 23:57:31.410210 kubelet[2226]: E0912 23:57:31.410160 2226 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://91.99.152.252:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4081-3-5-n-326e2e5946?timeout=10s\": dial tcp 91.99.152.252:6443: connect: connection refused" interval="400ms" Sep 12 23:57:31.412844 kubelet[2226]: I0912 23:57:31.412733 2226 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/d337cf5b029a73c04e03d8ba88efb8af-k8s-certs\") pod \"kube-apiserver-ci-4081-3-5-n-326e2e5946\" (UID: \"d337cf5b029a73c04e03d8ba88efb8af\") " pod="kube-system/kube-apiserver-ci-4081-3-5-n-326e2e5946" Sep 12 23:57:31.412844 kubelet[2226]: I0912 23:57:31.412786 2226 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/725b835a0e6843a5ff8e88d12bc3be4c-k8s-certs\") pod \"kube-controller-manager-ci-4081-3-5-n-326e2e5946\" (UID: \"725b835a0e6843a5ff8e88d12bc3be4c\") " pod="kube-system/kube-controller-manager-ci-4081-3-5-n-326e2e5946" Sep 12 23:57:31.412844 kubelet[2226]: I0912 23:57:31.412811 2226 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/725b835a0e6843a5ff8e88d12bc3be4c-kubeconfig\") pod \"kube-controller-manager-ci-4081-3-5-n-326e2e5946\" (UID: \"725b835a0e6843a5ff8e88d12bc3be4c\") " pod="kube-system/kube-controller-manager-ci-4081-3-5-n-326e2e5946" Sep 12 23:57:31.412844 kubelet[2226]: I0912 23:57:31.412834 2226 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/725b835a0e6843a5ff8e88d12bc3be4c-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4081-3-5-n-326e2e5946\" (UID: \"725b835a0e6843a5ff8e88d12bc3be4c\") " pod="kube-system/kube-controller-manager-ci-4081-3-5-n-326e2e5946" Sep 12 23:57:31.413189 kubelet[2226]: I0912 23:57:31.412878 2226 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/3af20e4468a469bbc7b7ef803c7541a1-kubeconfig\") pod \"kube-scheduler-ci-4081-3-5-n-326e2e5946\" (UID: \"3af20e4468a469bbc7b7ef803c7541a1\") " pod="kube-system/kube-scheduler-ci-4081-3-5-n-326e2e5946" Sep 12 23:57:31.413189 kubelet[2226]: I0912 23:57:31.412902 2226 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/d337cf5b029a73c04e03d8ba88efb8af-ca-certs\") pod \"kube-apiserver-ci-4081-3-5-n-326e2e5946\" (UID: \"d337cf5b029a73c04e03d8ba88efb8af\") " 
pod="kube-system/kube-apiserver-ci-4081-3-5-n-326e2e5946" Sep 12 23:57:31.413189 kubelet[2226]: I0912 23:57:31.412939 2226 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/d337cf5b029a73c04e03d8ba88efb8af-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4081-3-5-n-326e2e5946\" (UID: \"d337cf5b029a73c04e03d8ba88efb8af\") " pod="kube-system/kube-apiserver-ci-4081-3-5-n-326e2e5946" Sep 12 23:57:31.413189 kubelet[2226]: I0912 23:57:31.412961 2226 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/725b835a0e6843a5ff8e88d12bc3be4c-ca-certs\") pod \"kube-controller-manager-ci-4081-3-5-n-326e2e5946\" (UID: \"725b835a0e6843a5ff8e88d12bc3be4c\") " pod="kube-system/kube-controller-manager-ci-4081-3-5-n-326e2e5946" Sep 12 23:57:31.413189 kubelet[2226]: I0912 23:57:31.412983 2226 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/725b835a0e6843a5ff8e88d12bc3be4c-flexvolume-dir\") pod \"kube-controller-manager-ci-4081-3-5-n-326e2e5946\" (UID: \"725b835a0e6843a5ff8e88d12bc3be4c\") " pod="kube-system/kube-controller-manager-ci-4081-3-5-n-326e2e5946" Sep 12 23:57:31.584439 kubelet[2226]: I0912 23:57:31.584234 2226 kubelet_node_status.go:75] "Attempting to register node" node="ci-4081-3-5-n-326e2e5946" Sep 12 23:57:31.585273 kubelet[2226]: E0912 23:57:31.585233 2226 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://91.99.152.252:6443/api/v1/nodes\": dial tcp 91.99.152.252:6443: connect: connection refused" node="ci-4081-3-5-n-326e2e5946" Sep 12 23:57:31.658638 containerd[1466]: time="2025-09-12T23:57:31.658213625Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:kube-apiserver-ci-4081-3-5-n-326e2e5946,Uid:d337cf5b029a73c04e03d8ba88efb8af,Namespace:kube-system,Attempt:0,}" Sep 12 23:57:31.670036 containerd[1466]: time="2025-09-12T23:57:31.669985518Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4081-3-5-n-326e2e5946,Uid:725b835a0e6843a5ff8e88d12bc3be4c,Namespace:kube-system,Attempt:0,}" Sep 12 23:57:31.675223 containerd[1466]: time="2025-09-12T23:57:31.674654204Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4081-3-5-n-326e2e5946,Uid:3af20e4468a469bbc7b7ef803c7541a1,Namespace:kube-system,Attempt:0,}" Sep 12 23:57:31.813187 kubelet[2226]: E0912 23:57:31.812537 2226 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://91.99.152.252:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4081-3-5-n-326e2e5946?timeout=10s\": dial tcp 91.99.152.252:6443: connect: connection refused" interval="800ms" Sep 12 23:57:31.988036 kubelet[2226]: I0912 23:57:31.987316 2226 kubelet_node_status.go:75] "Attempting to register node" node="ci-4081-3-5-n-326e2e5946" Sep 12 23:57:31.988036 kubelet[2226]: E0912 23:57:31.987835 2226 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://91.99.152.252:6443/api/v1/nodes\": dial tcp 91.99.152.252:6443: connect: connection refused" node="ci-4081-3-5-n-326e2e5946" Sep 12 23:57:32.095102 kubelet[2226]: W0912 23:57:32.094971 2226 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://91.99.152.252:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4081-3-5-n-326e2e5946&limit=500&resourceVersion=0": dial tcp 91.99.152.252:6443: connect: connection refused Sep 12 23:57:32.095293 kubelet[2226]: E0912 23:57:32.095112 2226 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get 
\"https://91.99.152.252:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4081-3-5-n-326e2e5946&limit=500&resourceVersion=0\": dial tcp 91.99.152.252:6443: connect: connection refused" logger="UnhandledError" Sep 12 23:57:32.294209 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1620540924.mount: Deactivated successfully. Sep 12 23:57:32.301723 containerd[1466]: time="2025-09-12T23:57:32.301623157Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Sep 12 23:57:32.304460 containerd[1466]: time="2025-09-12T23:57:32.304224045Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=0" Sep 12 23:57:32.307341 containerd[1466]: time="2025-09-12T23:57:32.305901173Z" level=info msg="ImageCreate event name:\"sha256:4e42fb3c9d90ed7895bc04a9d96fe3102a65b521f485cc5a4f3dd818afef9cef\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Sep 12 23:57:32.309836 containerd[1466]: time="2025-09-12T23:57:32.308736300Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Sep 12 23:57:32.310125 containerd[1466]: time="2025-09-12T23:57:32.310102615Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=0" Sep 12 23:57:32.310276 containerd[1466]: time="2025-09-12T23:57:32.310255762Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Sep 12 23:57:32.311911 containerd[1466]: time="2025-09-12T23:57:32.311878881Z" level=info msg="ImageCreate event 
name:\"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Sep 12 23:57:32.313095 containerd[1466]: time="2025-09-12T23:57:32.313064725Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4e42fb3c9d90ed7895bc04a9d96fe3102a65b521f485cc5a4f3dd818afef9cef\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"268403\" in 654.753603ms" Sep 12 23:57:32.314009 containerd[1466]: time="2025-09-12T23:57:32.313967920Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=269193" Sep 12 23:57:32.321620 containerd[1466]: time="2025-09-12T23:57:32.321574028Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4e42fb3c9d90ed7895bc04a9d96fe3102a65b521f485cc5a4f3dd818afef9cef\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"268403\" in 645.699723ms" Sep 12 23:57:32.322577 containerd[1466]: time="2025-09-12T23:57:32.322543155Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4e42fb3c9d90ed7895bc04a9d96fe3102a65b521f485cc5a4f3dd818afef9cef\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"268403\" in 651.882955ms" Sep 12 23:57:32.430803 containerd[1466]: time="2025-09-12T23:57:32.430587134Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Sep 12 23:57:32.430980 containerd[1466]: time="2025-09-12T23:57:32.430765885Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Sep 12 23:57:32.434006 containerd[1466]: time="2025-09-12T23:57:32.433912146Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 12 23:57:32.434261 containerd[1466]: time="2025-09-12T23:57:32.434071493Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 12 23:57:32.437966 containerd[1466]: time="2025-09-12T23:57:32.437669752Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Sep 12 23:57:32.437966 containerd[1466]: time="2025-09-12T23:57:32.437748046Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Sep 12 23:57:32.437966 containerd[1466]: time="2025-09-12T23:57:32.437773010Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 12 23:57:32.437966 containerd[1466]: time="2025-09-12T23:57:32.437868626Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 12 23:57:32.443968 containerd[1466]: time="2025-09-12T23:57:32.443840093Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Sep 12 23:57:32.443968 containerd[1466]: time="2025-09-12T23:57:32.443907585Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Sep 12 23:57:32.443968 containerd[1466]: time="2025-09-12T23:57:32.443918707Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 12 23:57:32.444840 containerd[1466]: time="2025-09-12T23:57:32.444785376Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 12 23:57:32.460233 kubelet[2226]: W0912 23:57:32.460151 2226 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://91.99.152.252:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 91.99.152.252:6443: connect: connection refused Sep 12 23:57:32.460233 kubelet[2226]: E0912 23:57:32.460221 2226 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://91.99.152.252:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 91.99.152.252:6443: connect: connection refused" logger="UnhandledError" Sep 12 23:57:32.465462 systemd[1]: Started cri-containerd-1679c1f24e4fbc3fece971bd04a285eab5f6e08d124a156adc60a06173b09e69.scope - libcontainer container 1679c1f24e4fbc3fece971bd04a285eab5f6e08d124a156adc60a06173b09e69. Sep 12 23:57:32.482202 systemd[1]: Started cri-containerd-e8e8d41df0140b4b135eafe49358d0de7193fa327b38d9f23e771e008055395a.scope - libcontainer container e8e8d41df0140b4b135eafe49358d0de7193fa327b38d9f23e771e008055395a. Sep 12 23:57:32.490062 systemd[1]: Started cri-containerd-88edc675682a63e2069128978a3f0daf8d826d3fd1cc641dcae29457e9fc4898.scope - libcontainer container 88edc675682a63e2069128978a3f0daf8d826d3fd1cc641dcae29457e9fc4898. 
Sep 12 23:57:32.542172 containerd[1466]: time="2025-09-12T23:57:32.542033819Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4081-3-5-n-326e2e5946,Uid:d337cf5b029a73c04e03d8ba88efb8af,Namespace:kube-system,Attempt:0,} returns sandbox id \"1679c1f24e4fbc3fece971bd04a285eab5f6e08d124a156adc60a06173b09e69\"" Sep 12 23:57:32.550148 containerd[1466]: time="2025-09-12T23:57:32.549916374Z" level=info msg="CreateContainer within sandbox \"1679c1f24e4fbc3fece971bd04a285eab5f6e08d124a156adc60a06173b09e69\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Sep 12 23:57:32.557100 containerd[1466]: time="2025-09-12T23:57:32.556294871Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4081-3-5-n-326e2e5946,Uid:725b835a0e6843a5ff8e88d12bc3be4c,Namespace:kube-system,Attempt:0,} returns sandbox id \"e8e8d41df0140b4b135eafe49358d0de7193fa327b38d9f23e771e008055395a\"" Sep 12 23:57:32.560175 containerd[1466]: time="2025-09-12T23:57:32.560140813Z" level=info msg="CreateContainer within sandbox \"e8e8d41df0140b4b135eafe49358d0de7193fa327b38d9f23e771e008055395a\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Sep 12 23:57:32.567168 containerd[1466]: time="2025-09-12T23:57:32.567127214Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4081-3-5-n-326e2e5946,Uid:3af20e4468a469bbc7b7ef803c7541a1,Namespace:kube-system,Attempt:0,} returns sandbox id \"88edc675682a63e2069128978a3f0daf8d826d3fd1cc641dcae29457e9fc4898\"" Sep 12 23:57:32.570923 containerd[1466]: time="2025-09-12T23:57:32.570820049Z" level=info msg="CreateContainer within sandbox \"88edc675682a63e2069128978a3f0daf8d826d3fd1cc641dcae29457e9fc4898\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Sep 12 23:57:32.574117 kubelet[2226]: W0912 23:57:32.573977 2226 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get 
"https://91.99.152.252:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 91.99.152.252:6443: connect: connection refused Sep 12 23:57:32.574117 kubelet[2226]: E0912 23:57:32.574059 2226 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://91.99.152.252:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 91.99.152.252:6443: connect: connection refused" logger="UnhandledError" Sep 12 23:57:32.575959 containerd[1466]: time="2025-09-12T23:57:32.575910885Z" level=info msg="CreateContainer within sandbox \"1679c1f24e4fbc3fece971bd04a285eab5f6e08d124a156adc60a06173b09e69\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"4a298e71bf0cc1747ae38972e5800e9b4bed697de591c0a82c2c7fa952a58fe1\"" Sep 12 23:57:32.578744 containerd[1466]: time="2025-09-12T23:57:32.577415263Z" level=info msg="StartContainer for \"4a298e71bf0cc1747ae38972e5800e9b4bed697de591c0a82c2c7fa952a58fe1\"" Sep 12 23:57:32.591310 containerd[1466]: time="2025-09-12T23:57:32.591261204Z" level=info msg="CreateContainer within sandbox \"e8e8d41df0140b4b135eafe49358d0de7193fa327b38d9f23e771e008055395a\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"80056097e8513360e60e718873c9e9449add05ec256867a1eb1be4f5d2a69e4b\"" Sep 12 23:57:32.592332 containerd[1466]: time="2025-09-12T23:57:32.592297862Z" level=info msg="StartContainer for \"80056097e8513360e60e718873c9e9449add05ec256867a1eb1be4f5d2a69e4b\"" Sep 12 23:57:32.597234 containerd[1466]: time="2025-09-12T23:57:32.596961544Z" level=info msg="CreateContainer within sandbox \"88edc675682a63e2069128978a3f0daf8d826d3fd1cc641dcae29457e9fc4898\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"39cbfc4ed1703a11461788e254170fff464333fbe3251ef2079c81db41d4ce14\"" Sep 12 23:57:32.597812 
containerd[1466]: time="2025-09-12T23:57:32.597744879Z" level=info msg="StartContainer for \"39cbfc4ed1703a11461788e254170fff464333fbe3251ef2079c81db41d4ce14\"" Sep 12 23:57:32.613774 kubelet[2226]: E0912 23:57:32.613392 2226 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://91.99.152.252:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4081-3-5-n-326e2e5946?timeout=10s\": dial tcp 91.99.152.252:6443: connect: connection refused" interval="1.6s" Sep 12 23:57:32.625974 systemd[1]: Started cri-containerd-4a298e71bf0cc1747ae38972e5800e9b4bed697de591c0a82c2c7fa952a58fe1.scope - libcontainer container 4a298e71bf0cc1747ae38972e5800e9b4bed697de591c0a82c2c7fa952a58fe1. Sep 12 23:57:32.630462 systemd[1]: Started cri-containerd-80056097e8513360e60e718873c9e9449add05ec256867a1eb1be4f5d2a69e4b.scope - libcontainer container 80056097e8513360e60e718873c9e9449add05ec256867a1eb1be4f5d2a69e4b. Sep 12 23:57:32.667116 systemd[1]: Started cri-containerd-39cbfc4ed1703a11461788e254170fff464333fbe3251ef2079c81db41d4ce14.scope - libcontainer container 39cbfc4ed1703a11461788e254170fff464333fbe3251ef2079c81db41d4ce14. 
Sep 12 23:57:32.733737 containerd[1466]: time="2025-09-12T23:57:32.731986684Z" level=info msg="StartContainer for \"4a298e71bf0cc1747ae38972e5800e9b4bed697de591c0a82c2c7fa952a58fe1\" returns successfully" Sep 12 23:57:32.733737 containerd[1466]: time="2025-09-12T23:57:32.732130989Z" level=info msg="StartContainer for \"80056097e8513360e60e718873c9e9449add05ec256867a1eb1be4f5d2a69e4b\" returns successfully" Sep 12 23:57:32.741173 containerd[1466]: time="2025-09-12T23:57:32.741081928Z" level=info msg="StartContainer for \"39cbfc4ed1703a11461788e254170fff464333fbe3251ef2079c81db41d4ce14\" returns successfully" Sep 12 23:57:32.775367 kubelet[2226]: W0912 23:57:32.775212 2226 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://91.99.152.252:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 91.99.152.252:6443: connect: connection refused Sep 12 23:57:32.775367 kubelet[2226]: E0912 23:57:32.775285 2226 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://91.99.152.252:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 91.99.152.252:6443: connect: connection refused" logger="UnhandledError" Sep 12 23:57:32.792143 kubelet[2226]: I0912 23:57:32.792055 2226 kubelet_node_status.go:75] "Attempting to register node" node="ci-4081-3-5-n-326e2e5946" Sep 12 23:57:32.792502 kubelet[2226]: E0912 23:57:32.792471 2226 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://91.99.152.252:6443/api/v1/nodes\": dial tcp 91.99.152.252:6443: connect: connection refused" node="ci-4081-3-5-n-326e2e5946" Sep 12 23:57:33.252625 kubelet[2226]: E0912 23:57:33.252581 2226 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4081-3-5-n-326e2e5946\" not found" node="ci-4081-3-5-n-326e2e5946" Sep 12 
23:57:33.254328 kubelet[2226]: E0912 23:57:33.254293 2226 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4081-3-5-n-326e2e5946\" not found" node="ci-4081-3-5-n-326e2e5946" Sep 12 23:57:33.258588 kubelet[2226]: E0912 23:57:33.258559 2226 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4081-3-5-n-326e2e5946\" not found" node="ci-4081-3-5-n-326e2e5946" Sep 12 23:57:34.260231 kubelet[2226]: E0912 23:57:34.260191 2226 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4081-3-5-n-326e2e5946\" not found" node="ci-4081-3-5-n-326e2e5946" Sep 12 23:57:34.261897 kubelet[2226]: E0912 23:57:34.261864 2226 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4081-3-5-n-326e2e5946\" not found" node="ci-4081-3-5-n-326e2e5946" Sep 12 23:57:34.396369 kubelet[2226]: I0912 23:57:34.396103 2226 kubelet_node_status.go:75] "Attempting to register node" node="ci-4081-3-5-n-326e2e5946" Sep 12 23:57:34.684907 kubelet[2226]: E0912 23:57:34.684832 2226 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ci-4081-3-5-n-326e2e5946\" not found" node="ci-4081-3-5-n-326e2e5946" Sep 12 23:57:34.692406 kubelet[2226]: I0912 23:57:34.692360 2226 kubelet_node_status.go:78] "Successfully registered node" node="ci-4081-3-5-n-326e2e5946" Sep 12 23:57:34.705939 kubelet[2226]: I0912 23:57:34.705886 2226 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4081-3-5-n-326e2e5946" Sep 12 23:57:34.819455 kubelet[2226]: E0912 23:57:34.819366 2226 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-controller-manager-ci-4081-3-5-n-326e2e5946\" is forbidden: no PriorityClass with name system-node-critical was found" 
pod="kube-system/kube-controller-manager-ci-4081-3-5-n-326e2e5946" Sep 12 23:57:34.819455 kubelet[2226]: I0912 23:57:34.819433 2226 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4081-3-5-n-326e2e5946" Sep 12 23:57:34.827787 kubelet[2226]: E0912 23:57:34.827737 2226 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4081-3-5-n-326e2e5946\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-ci-4081-3-5-n-326e2e5946" Sep 12 23:57:34.827787 kubelet[2226]: I0912 23:57:34.827776 2226 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4081-3-5-n-326e2e5946" Sep 12 23:57:34.838039 kubelet[2226]: E0912 23:57:34.837982 2226 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4081-3-5-n-326e2e5946\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-ci-4081-3-5-n-326e2e5946" Sep 12 23:57:35.189740 kubelet[2226]: I0912 23:57:35.188838 2226 apiserver.go:52] "Watching apiserver" Sep 12 23:57:35.210000 kubelet[2226]: I0912 23:57:35.209958 2226 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Sep 12 23:57:35.403229 kubelet[2226]: I0912 23:57:35.403193 2226 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4081-3-5-n-326e2e5946" Sep 12 23:57:36.996339 systemd[1]: Reloading requested from client PID 2500 ('systemctl') (unit session-7.scope)... Sep 12 23:57:36.996731 systemd[1]: Reloading... Sep 12 23:57:37.088749 zram_generator::config[2536]: No configuration found. Sep 12 23:57:37.217229 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. 
Sep 12 23:57:37.309400 systemd[1]: Reloading finished in 312 ms.
Sep 12 23:57:37.347957 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 12 23:57:37.356139 systemd[1]: kubelet.service: Deactivated successfully.
Sep 12 23:57:37.356444 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 12 23:57:37.356511 systemd[1]: kubelet.service: Consumed 1.579s CPU time, 127.8M memory peak, 0B memory swap peak.
Sep 12 23:57:37.362178 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 12 23:57:37.500103 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 12 23:57:37.512211 (kubelet)[2585]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS
Sep 12 23:57:37.568299 kubelet[2585]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Sep 12 23:57:37.568299 kubelet[2585]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Sep 12 23:57:37.568299 kubelet[2585]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Sep 12 23:57:37.568299 kubelet[2585]: I0912 23:57:37.567352 2585 server.go:215] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Sep 12 23:57:37.580110 kubelet[2585]: I0912 23:57:37.580058 2585 server.go:520] "Kubelet version" kubeletVersion="v1.32.4"
Sep 12 23:57:37.580110 kubelet[2585]: I0912 23:57:37.580096 2585 server.go:522] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Sep 12 23:57:37.580489 kubelet[2585]: I0912 23:57:37.580455 2585 server.go:954] "Client rotation is on, will bootstrap in background"
Sep 12 23:57:37.581957 kubelet[2585]: I0912 23:57:37.581927 2585 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem".
Sep 12 23:57:37.584474 kubelet[2585]: I0912 23:57:37.584428 2585 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
Sep 12 23:57:37.588679 kubelet[2585]: E0912 23:57:37.588380 2585 log.go:32] "RuntimeConfig from runtime service failed" err="rpc error: code = Unimplemented desc = unknown method RuntimeConfig for service runtime.v1.RuntimeService"
Sep 12 23:57:37.588679 kubelet[2585]: I0912 23:57:37.588453 2585 server.go:1421] "CRI implementation should be updated to support RuntimeConfig when KubeletCgroupDriverFromCRI feature gate has been enabled. Falling back to using cgroupDriver from kubelet config."
Sep 12 23:57:37.591312 kubelet[2585]: I0912 23:57:37.591122 2585 server.go:772] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /"
Sep 12 23:57:37.591471 kubelet[2585]: I0912 23:57:37.591431 2585 container_manager_linux.go:268] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Sep 12 23:57:37.591747 kubelet[2585]: I0912 23:57:37.591473 2585 container_manager_linux.go:273] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4081-3-5-n-326e2e5946","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Sep 12 23:57:37.591747 kubelet[2585]: I0912 23:57:37.591739 2585 topology_manager.go:138] "Creating topology manager with none policy"
Sep 12 23:57:37.591876 kubelet[2585]: I0912 23:57:37.591753 2585 container_manager_linux.go:304] "Creating device plugin manager"
Sep 12 23:57:37.591876 kubelet[2585]: I0912 23:57:37.591807 2585 state_mem.go:36] "Initialized new in-memory state store"
Sep 12 23:57:37.591991 kubelet[2585]: I0912 23:57:37.591977 2585 kubelet.go:446] "Attempting to sync node with API server"
Sep 12 23:57:37.592022 kubelet[2585]: I0912 23:57:37.592004 2585 kubelet.go:341] "Adding static pod path" path="/etc/kubernetes/manifests"
Sep 12 23:57:37.592059 kubelet[2585]: I0912 23:57:37.592027 2585 kubelet.go:352] "Adding apiserver pod source"
Sep 12 23:57:37.592059 kubelet[2585]: I0912 23:57:37.592040 2585 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Sep 12 23:57:37.596650 kubelet[2585]: I0912 23:57:37.596321 2585 kuberuntime_manager.go:269] "Container runtime initialized" containerRuntime="containerd" version="v1.7.21" apiVersion="v1"
Sep 12 23:57:37.599084 kubelet[2585]: I0912 23:57:37.597699 2585 kubelet.go:890] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Sep 12 23:57:37.602330 kubelet[2585]: I0912 23:57:37.602178 2585 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Sep 12 23:57:37.602513 kubelet[2585]: I0912 23:57:37.602501 2585 server.go:1287] "Started kubelet"
Sep 12 23:57:37.615098 kubelet[2585]: I0912 23:57:37.615070 2585 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Sep 12 23:57:37.621973 kubelet[2585]: I0912 23:57:37.621939 2585 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key"
Sep 12 23:57:37.622590 kubelet[2585]: I0912 23:57:37.622546 2585 server.go:169] "Starting to listen" address="0.0.0.0" port=10250
Sep 12 23:57:37.624114 kubelet[2585]: I0912 23:57:37.624094 2585 volume_manager.go:297] "Starting Kubelet Volume Manager"
Sep 12 23:57:37.625663 kubelet[2585]: I0912 23:57:37.625592 2585 server.go:479] "Adding debug handlers to kubelet server"
Sep 12 23:57:37.626110 kubelet[2585]: E0912 23:57:37.626084 2585 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4081-3-5-n-326e2e5946\" not found"
Sep 12 23:57:37.627616 kubelet[2585]: I0912 23:57:37.627533 2585 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Sep 12 23:57:37.629744 kubelet[2585]: I0912 23:57:37.628815 2585 server.go:243] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Sep 12 23:57:37.631678 kubelet[2585]: I0912 23:57:37.630559 2585 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Sep 12 23:57:37.631975 kubelet[2585]: I0912 23:57:37.631960 2585 reconciler.go:26] "Reconciler: start to sync state"
Sep 12 23:57:37.636079 kubelet[2585]: I0912 23:57:37.635823 2585 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4"
Sep 12 23:57:37.637420 kubelet[2585]: I0912 23:57:37.637393 2585 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6"
Sep 12 23:57:37.637540 kubelet[2585]: I0912 23:57:37.637530 2585 status_manager.go:227] "Starting to sync pod status with apiserver"
Sep 12 23:57:37.637604 kubelet[2585]: I0912 23:57:37.637595 2585 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Sep 12 23:57:37.637662 kubelet[2585]: I0912 23:57:37.637652 2585 kubelet.go:2382] "Starting kubelet main sync loop"
Sep 12 23:57:37.637840 kubelet[2585]: E0912 23:57:37.637813 2585 kubelet.go:2406] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Sep 12 23:57:37.647823 kubelet[2585]: I0912 23:57:37.647779 2585 factory.go:221] Registration of the systemd container factory successfully
Sep 12 23:57:37.647958 kubelet[2585]: I0912 23:57:37.647888 2585 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory
Sep 12 23:57:37.651730 kubelet[2585]: E0912 23:57:37.651324 2585 kubelet.go:1555] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem"
Sep 12 23:57:37.651858 kubelet[2585]: I0912 23:57:37.651815 2585 factory.go:221] Registration of the containerd container factory successfully
Sep 12 23:57:37.709491 kubelet[2585]: I0912 23:57:37.709447 2585 cpu_manager.go:221] "Starting CPU manager" policy="none"
Sep 12 23:57:37.709811 kubelet[2585]: I0912 23:57:37.709794 2585 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s"
Sep 12 23:57:37.709888 kubelet[2585]: I0912 23:57:37.709878 2585 state_mem.go:36] "Initialized new in-memory state store"
Sep 12 23:57:37.710131 kubelet[2585]: I0912 23:57:37.710113 2585 state_mem.go:88] "Updated default CPUSet" cpuSet=""
Sep 12 23:57:37.710223 kubelet[2585]: I0912 23:57:37.710197 2585 state_mem.go:96] "Updated CPUSet assignments" assignments={}
Sep 12 23:57:37.710298 kubelet[2585]: I0912 23:57:37.710288 2585 policy_none.go:49] "None policy: Start"
Sep 12 23:57:37.710358 kubelet[2585]: I0912 23:57:37.710349 2585 memory_manager.go:186] "Starting memorymanager" policy="None"
Sep 12 23:57:37.710518 kubelet[2585]: I0912 23:57:37.710406 2585 state_mem.go:35] "Initializing new in-memory state store"
Sep 12 23:57:37.710745 kubelet[2585]: I0912 23:57:37.710729 2585 state_mem.go:75] "Updated machine memory state"
Sep 12 23:57:37.715766 kubelet[2585]: I0912 23:57:37.715418 2585 manager.go:519] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found"
Sep 12 23:57:37.715766 kubelet[2585]: I0912 23:57:37.715588 2585 eviction_manager.go:189] "Eviction manager: starting control loop"
Sep 12 23:57:37.715766 kubelet[2585]: I0912 23:57:37.715599 2585 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Sep 12 23:57:37.716370 kubelet[2585]: I0912 23:57:37.716352 2585 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Sep 12 23:57:37.718477 kubelet[2585]: E0912 23:57:37.718447 2585 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime"
Sep 12 23:57:37.738679 kubelet[2585]: I0912 23:57:37.738639 2585 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4081-3-5-n-326e2e5946"
Sep 12 23:57:37.739133 kubelet[2585]: I0912 23:57:37.738888 2585 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4081-3-5-n-326e2e5946"
Sep 12 23:57:37.739455 kubelet[2585]: I0912 23:57:37.739440 2585 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4081-3-5-n-326e2e5946"
Sep 12 23:57:37.749373 kubelet[2585]: E0912 23:57:37.749183 2585 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-controller-manager-ci-4081-3-5-n-326e2e5946\" already exists" pod="kube-system/kube-controller-manager-ci-4081-3-5-n-326e2e5946"
Sep 12 23:57:37.821809 kubelet[2585]: I0912 23:57:37.820808 2585 kubelet_node_status.go:75] "Attempting to register node" node="ci-4081-3-5-n-326e2e5946"
Sep 12 23:57:37.833421 kubelet[2585]: I0912 23:57:37.833268 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/725b835a0e6843a5ff8e88d12bc3be4c-ca-certs\") pod \"kube-controller-manager-ci-4081-3-5-n-326e2e5946\" (UID: \"725b835a0e6843a5ff8e88d12bc3be4c\") " pod="kube-system/kube-controller-manager-ci-4081-3-5-n-326e2e5946"
Sep 12 23:57:37.833421 kubelet[2585]: I0912 23:57:37.833361 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/725b835a0e6843a5ff8e88d12bc3be4c-kubeconfig\") pod \"kube-controller-manager-ci-4081-3-5-n-326e2e5946\" (UID: \"725b835a0e6843a5ff8e88d12bc3be4c\") " pod="kube-system/kube-controller-manager-ci-4081-3-5-n-326e2e5946"
Sep 12 23:57:37.833421 kubelet[2585]: I0912 23:57:37.833381 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/3af20e4468a469bbc7b7ef803c7541a1-kubeconfig\") pod \"kube-scheduler-ci-4081-3-5-n-326e2e5946\" (UID: \"3af20e4468a469bbc7b7ef803c7541a1\") " pod="kube-system/kube-scheduler-ci-4081-3-5-n-326e2e5946"
Sep 12 23:57:37.834004 kubelet[2585]: I0912 23:57:37.833493 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/d337cf5b029a73c04e03d8ba88efb8af-k8s-certs\") pod \"kube-apiserver-ci-4081-3-5-n-326e2e5946\" (UID: \"d337cf5b029a73c04e03d8ba88efb8af\") " pod="kube-system/kube-apiserver-ci-4081-3-5-n-326e2e5946"
Sep 12 23:57:37.834004 kubelet[2585]: I0912 23:57:37.833535 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/d337cf5b029a73c04e03d8ba88efb8af-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4081-3-5-n-326e2e5946\" (UID: \"d337cf5b029a73c04e03d8ba88efb8af\") " pod="kube-system/kube-apiserver-ci-4081-3-5-n-326e2e5946"
Sep 12 23:57:37.834004 kubelet[2585]: I0912 23:57:37.833660 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/725b835a0e6843a5ff8e88d12bc3be4c-flexvolume-dir\") pod \"kube-controller-manager-ci-4081-3-5-n-326e2e5946\" (UID: \"725b835a0e6843a5ff8e88d12bc3be4c\") " pod="kube-system/kube-controller-manager-ci-4081-3-5-n-326e2e5946"
Sep 12 23:57:37.834004 kubelet[2585]: I0912 23:57:37.833701 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/725b835a0e6843a5ff8e88d12bc3be4c-k8s-certs\") pod \"kube-controller-manager-ci-4081-3-5-n-326e2e5946\" (UID: \"725b835a0e6843a5ff8e88d12bc3be4c\") " pod="kube-system/kube-controller-manager-ci-4081-3-5-n-326e2e5946"
Sep 12 23:57:37.834004 kubelet[2585]: I0912 23:57:37.833777 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/725b835a0e6843a5ff8e88d12bc3be4c-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4081-3-5-n-326e2e5946\" (UID: \"725b835a0e6843a5ff8e88d12bc3be4c\") " pod="kube-system/kube-controller-manager-ci-4081-3-5-n-326e2e5946"
Sep 12 23:57:37.834250 kubelet[2585]: I0912 23:57:37.833811 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/d337cf5b029a73c04e03d8ba88efb8af-ca-certs\") pod \"kube-apiserver-ci-4081-3-5-n-326e2e5946\" (UID: \"d337cf5b029a73c04e03d8ba88efb8af\") " pod="kube-system/kube-apiserver-ci-4081-3-5-n-326e2e5946"
Sep 12 23:57:37.834945 kubelet[2585]: I0912 23:57:37.834903 2585 kubelet_node_status.go:124] "Node was previously registered" node="ci-4081-3-5-n-326e2e5946"
Sep 12 23:57:37.835009 kubelet[2585]: I0912 23:57:37.834990 2585 kubelet_node_status.go:78] "Successfully registered node" node="ci-4081-3-5-n-326e2e5946"
Sep 12 23:57:38.593390 kubelet[2585]: I0912 23:57:38.593051 2585 apiserver.go:52] "Watching apiserver"
Sep 12 23:57:38.632042 kubelet[2585]: I0912 23:57:38.631965 2585 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Sep 12 23:57:38.689798 kubelet[2585]: I0912 23:57:38.688509 2585 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4081-3-5-n-326e2e5946"
Sep 12 23:57:38.691527 kubelet[2585]: I0912 23:57:38.691220 2585 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4081-3-5-n-326e2e5946"
Sep 12 23:57:38.699998 kubelet[2585]: E0912 23:57:38.698953 2585 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4081-3-5-n-326e2e5946\" already exists" pod="kube-system/kube-apiserver-ci-4081-3-5-n-326e2e5946"
Sep 12 23:57:38.702278 kubelet[2585]: E0912 23:57:38.702126 2585 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4081-3-5-n-326e2e5946\" already exists" pod="kube-system/kube-scheduler-ci-4081-3-5-n-326e2e5946"
Sep 12 23:57:38.738739 kubelet[2585]: I0912 23:57:38.738574 2585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ci-4081-3-5-n-326e2e5946" podStartSLOduration=3.7385549769999997 podStartE2EDuration="3.738554977s" podCreationTimestamp="2025-09-12 23:57:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-12 23:57:38.726997784 +0000 UTC m=+1.206230911" watchObservedRunningTime="2025-09-12 23:57:38.738554977 +0000 UTC m=+1.217788104"
Sep 12 23:57:38.760171 kubelet[2585]: I0912 23:57:38.759540 2585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-ci-4081-3-5-n-326e2e5946" podStartSLOduration=1.7595187669999999 podStartE2EDuration="1.759518767s" podCreationTimestamp="2025-09-12 23:57:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-12 23:57:38.739591069 +0000 UTC m=+1.218824236" watchObservedRunningTime="2025-09-12 23:57:38.759518767 +0000 UTC m=+1.238751894"
Sep 12 23:57:38.760171 kubelet[2585]: I0912 23:57:38.759699 2585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-ci-4081-3-5-n-326e2e5946" podStartSLOduration=1.759694069 podStartE2EDuration="1.759694069s" podCreationTimestamp="2025-09-12 23:57:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-12 23:57:38.755078201 +0000 UTC m=+1.234311368" watchObservedRunningTime="2025-09-12 23:57:38.759694069 +0000 UTC m=+1.238927196"
Sep 12 23:57:41.302916 kubelet[2585]: I0912 23:57:41.302871 2585 kuberuntime_manager.go:1702] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24"
Sep 12 23:57:41.303867 kubelet[2585]: I0912 23:57:41.303372 2585 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24"
Sep 12 23:57:41.303905 containerd[1466]: time="2025-09-12T23:57:41.303169731Z" level=info msg="No cni config template is specified, wait for other system components to drop the config."
Sep 12 23:57:42.165457 kubelet[2585]: I0912 23:57:42.165356 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/3cb272c4-4739-4ce9-ba27-b306ad8242d3-xtables-lock\") pod \"kube-proxy-7lnz7\" (UID: \"3cb272c4-4739-4ce9-ba27-b306ad8242d3\") " pod="kube-system/kube-proxy-7lnz7"
Sep 12 23:57:42.165457 kubelet[2585]: I0912 23:57:42.165398 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cvdt2\" (UniqueName: \"kubernetes.io/projected/3cb272c4-4739-4ce9-ba27-b306ad8242d3-kube-api-access-cvdt2\") pod \"kube-proxy-7lnz7\" (UID: \"3cb272c4-4739-4ce9-ba27-b306ad8242d3\") " pod="kube-system/kube-proxy-7lnz7"
Sep 12 23:57:42.165457 kubelet[2585]: I0912 23:57:42.165418 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/3cb272c4-4739-4ce9-ba27-b306ad8242d3-kube-proxy\") pod \"kube-proxy-7lnz7\" (UID: \"3cb272c4-4739-4ce9-ba27-b306ad8242d3\") " pod="kube-system/kube-proxy-7lnz7"
Sep 12 23:57:42.165457 kubelet[2585]: I0912 23:57:42.165434 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3cb272c4-4739-4ce9-ba27-b306ad8242d3-lib-modules\") pod \"kube-proxy-7lnz7\" (UID: \"3cb272c4-4739-4ce9-ba27-b306ad8242d3\") " pod="kube-system/kube-proxy-7lnz7"
Sep 12 23:57:42.166336 systemd[1]: Created slice kubepods-besteffort-pod3cb272c4_4739_4ce9_ba27_b306ad8242d3.slice - libcontainer container kubepods-besteffort-pod3cb272c4_4739_4ce9_ba27_b306ad8242d3.slice.
Sep 12 23:57:42.441008 systemd[1]: Created slice kubepods-besteffort-pod83609fce_88b6_432b_827c_b41a82031760.slice - libcontainer container kubepods-besteffort-pod83609fce_88b6_432b_827c_b41a82031760.slice.
Sep 12 23:57:42.468182 kubelet[2585]: I0912 23:57:42.468035 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/83609fce-88b6-432b-827c-b41a82031760-var-lib-calico\") pod \"tigera-operator-755d956888-6lb7w\" (UID: \"83609fce-88b6-432b-827c-b41a82031760\") " pod="tigera-operator/tigera-operator-755d956888-6lb7w"
Sep 12 23:57:42.468182 kubelet[2585]: I0912 23:57:42.468104 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dxkfl\" (UniqueName: \"kubernetes.io/projected/83609fce-88b6-432b-827c-b41a82031760-kube-api-access-dxkfl\") pod \"tigera-operator-755d956888-6lb7w\" (UID: \"83609fce-88b6-432b-827c-b41a82031760\") " pod="tigera-operator/tigera-operator-755d956888-6lb7w"
Sep 12 23:57:42.475816 containerd[1466]: time="2025-09-12T23:57:42.475762373Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-7lnz7,Uid:3cb272c4-4739-4ce9-ba27-b306ad8242d3,Namespace:kube-system,Attempt:0,}"
Sep 12 23:57:42.505855 containerd[1466]: time="2025-09-12T23:57:42.505751310Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Sep 12 23:57:42.505997 containerd[1466]: time="2025-09-12T23:57:42.505876643Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Sep 12 23:57:42.505997 containerd[1466]: time="2025-09-12T23:57:42.505909207Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Sep 12 23:57:42.506511 containerd[1466]: time="2025-09-12T23:57:42.506459505Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Sep 12 23:57:42.539121 systemd[1]: Started cri-containerd-d76d3a5c2199ab7618a193a266d9f018fbc212840d208b63533a5a90d31b14d0.scope - libcontainer container d76d3a5c2199ab7618a193a266d9f018fbc212840d208b63533a5a90d31b14d0.
Sep 12 23:57:42.578800 containerd[1466]: time="2025-09-12T23:57:42.578758404Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-7lnz7,Uid:3cb272c4-4739-4ce9-ba27-b306ad8242d3,Namespace:kube-system,Attempt:0,} returns sandbox id \"d76d3a5c2199ab7618a193a266d9f018fbc212840d208b63533a5a90d31b14d0\""
Sep 12 23:57:42.583038 containerd[1466]: time="2025-09-12T23:57:42.582995933Z" level=info msg="CreateContainer within sandbox \"d76d3a5c2199ab7618a193a266d9f018fbc212840d208b63533a5a90d31b14d0\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}"
Sep 12 23:57:42.601387 containerd[1466]: time="2025-09-12T23:57:42.601321954Z" level=info msg="CreateContainer within sandbox \"d76d3a5c2199ab7618a193a266d9f018fbc212840d208b63533a5a90d31b14d0\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"3cebc4e55ed7eb2697568f5c8cd175313e28a81a48f6825f3185b42a9ba3af56\""
Sep 12 23:57:42.603336 containerd[1466]: time="2025-09-12T23:57:42.603140947Z" level=info msg="StartContainer for \"3cebc4e55ed7eb2697568f5c8cd175313e28a81a48f6825f3185b42a9ba3af56\""
Sep 12 23:57:42.630918 systemd[1]: Started cri-containerd-3cebc4e55ed7eb2697568f5c8cd175313e28a81a48f6825f3185b42a9ba3af56.scope - libcontainer container 3cebc4e55ed7eb2697568f5c8cd175313e28a81a48f6825f3185b42a9ba3af56.
Sep 12 23:57:42.661454 containerd[1466]: time="2025-09-12T23:57:42.661409040Z" level=info msg="StartContainer for \"3cebc4e55ed7eb2697568f5c8cd175313e28a81a48f6825f3185b42a9ba3af56\" returns successfully"
Sep 12 23:57:42.716522 kubelet[2585]: I0912 23:57:42.715187 2585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-7lnz7" podStartSLOduration=0.715158413 podStartE2EDuration="715.158413ms" podCreationTimestamp="2025-09-12 23:57:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-12 23:57:42.713816831 +0000 UTC m=+5.193049958" watchObservedRunningTime="2025-09-12 23:57:42.715158413 +0000 UTC m=+5.194391580"
Sep 12 23:57:42.745900 containerd[1466]: time="2025-09-12T23:57:42.745396377Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-755d956888-6lb7w,Uid:83609fce-88b6-432b-827c-b41a82031760,Namespace:tigera-operator,Attempt:0,}"
Sep 12 23:57:42.791152 containerd[1466]: time="2025-09-12T23:57:42.785892467Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Sep 12 23:57:42.791152 containerd[1466]: time="2025-09-12T23:57:42.785947712Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Sep 12 23:57:42.791152 containerd[1466]: time="2025-09-12T23:57:42.785958314Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Sep 12 23:57:42.791152 containerd[1466]: time="2025-09-12T23:57:42.786052124Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Sep 12 23:57:42.809152 systemd[1]: Started cri-containerd-2113b2e3e5c70c8fb3ec3c27c0133fda6a3b87ff46ec94954b2ff416879b196d.scope - libcontainer container 2113b2e3e5c70c8fb3ec3c27c0133fda6a3b87ff46ec94954b2ff416879b196d.
Sep 12 23:57:42.851356 containerd[1466]: time="2025-09-12T23:57:42.851310157Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-755d956888-6lb7w,Uid:83609fce-88b6-432b-827c-b41a82031760,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"2113b2e3e5c70c8fb3ec3c27c0133fda6a3b87ff46ec94954b2ff416879b196d\""
Sep 12 23:57:42.854978 containerd[1466]: time="2025-09-12T23:57:42.854945502Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.6\""
Sep 12 23:57:44.494683 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount780344958.mount: Deactivated successfully.
Sep 12 23:57:44.913436 containerd[1466]: time="2025-09-12T23:57:44.911751572Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.38.6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 23:57:44.913436 containerd[1466]: time="2025-09-12T23:57:44.913365808Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.38.6: active requests=0, bytes read=22152365"
Sep 12 23:57:44.914100 containerd[1466]: time="2025-09-12T23:57:44.913938264Z" level=info msg="ImageCreate event name:\"sha256:dd2e197838b00861b08ae5f480dfbfb9a519722e35ced99346315722309cbe9f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 23:57:44.916751 containerd[1466]: time="2025-09-12T23:57:44.916662049Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:00a7a9b62f9b9a4e0856128b078539783b8352b07f707bff595cb604cc580f6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 23:57:44.917644 containerd[1466]: time="2025-09-12T23:57:44.917603180Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.38.6\" with image id \"sha256:dd2e197838b00861b08ae5f480dfbfb9a519722e35ced99346315722309cbe9f\", repo tag \"quay.io/tigera/operator:v1.38.6\", repo digest \"quay.io/tigera/operator@sha256:00a7a9b62f9b9a4e0856128b078539783b8352b07f707bff595cb604cc580f6e\", size \"22148360\" in 2.062620994s"
Sep 12 23:57:44.917730 containerd[1466]: time="2025-09-12T23:57:44.917643744Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.6\" returns image reference \"sha256:dd2e197838b00861b08ae5f480dfbfb9a519722e35ced99346315722309cbe9f\""
Sep 12 23:57:44.921650 containerd[1466]: time="2025-09-12T23:57:44.921606729Z" level=info msg="CreateContainer within sandbox \"2113b2e3e5c70c8fb3ec3c27c0133fda6a3b87ff46ec94954b2ff416879b196d\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}"
Sep 12 23:57:44.940689 containerd[1466]: time="2025-09-12T23:57:44.940595693Z" level=info msg="CreateContainer within sandbox \"2113b2e3e5c70c8fb3ec3c27c0133fda6a3b87ff46ec94954b2ff416879b196d\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"af616cc4cf459d10914859f7e902fe0176edb59d6465e7b3a83262548dbd05c7\""
Sep 12 23:57:44.942358 containerd[1466]: time="2025-09-12T23:57:44.941608671Z" level=info msg="StartContainer for \"af616cc4cf459d10914859f7e902fe0176edb59d6465e7b3a83262548dbd05c7\""
Sep 12 23:57:44.975758 systemd[1]: Started cri-containerd-af616cc4cf459d10914859f7e902fe0176edb59d6465e7b3a83262548dbd05c7.scope - libcontainer container af616cc4cf459d10914859f7e902fe0176edb59d6465e7b3a83262548dbd05c7.
Sep 12 23:57:45.012401 containerd[1466]: time="2025-09-12T23:57:45.012353135Z" level=info msg="StartContainer for \"af616cc4cf459d10914859f7e902fe0176edb59d6465e7b3a83262548dbd05c7\" returns successfully"
Sep 12 23:57:45.729355 kubelet[2585]: I0912 23:57:45.729047 2585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-755d956888-6lb7w" podStartSLOduration=1.663080614 podStartE2EDuration="3.729028148s" podCreationTimestamp="2025-09-12 23:57:42 +0000 UTC" firstStartedPulling="2025-09-12 23:57:42.85313355 +0000 UTC m=+5.332366637" lastFinishedPulling="2025-09-12 23:57:44.919081044 +0000 UTC m=+7.398314171" observedRunningTime="2025-09-12 23:57:45.726778578 +0000 UTC m=+8.206011745" watchObservedRunningTime="2025-09-12 23:57:45.729028148 +0000 UTC m=+8.208261275"
Sep 12 23:57:51.326236 sudo[1743]: pam_unix(sudo:session): session closed for user root
Sep 12 23:57:51.487872 sshd[1740]: pam_unix(sshd:session): session closed for user core
Sep 12 23:57:51.493129 systemd[1]: sshd@6-91.99.152.252:22-147.75.109.163:57904.service: Deactivated successfully.
Sep 12 23:57:51.498654 systemd[1]: session-7.scope: Deactivated successfully.
Sep 12 23:57:51.500341 systemd[1]: session-7.scope: Consumed 8.268s CPU time, 152.2M memory peak, 0B memory swap peak.
Sep 12 23:57:51.502812 systemd-logind[1453]: Session 7 logged out. Waiting for processes to exit.
Sep 12 23:57:51.505867 systemd-logind[1453]: Removed session 7.
Sep 12 23:58:00.618400 systemd[1]: Created slice kubepods-besteffort-pod97b510e2_b26c_4933_a30d_6f3e0aaf1ce5.slice - libcontainer container kubepods-besteffort-pod97b510e2_b26c_4933_a30d_6f3e0aaf1ce5.slice.
Sep 12 23:58:00.691628 kubelet[2585]: I0912 23:58:00.691549 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/97b510e2-b26c-4933-a30d-6f3e0aaf1ce5-tigera-ca-bundle\") pod \"calico-typha-67cc5b5fbd-z76hj\" (UID: \"97b510e2-b26c-4933-a30d-6f3e0aaf1ce5\") " pod="calico-system/calico-typha-67cc5b5fbd-z76hj" Sep 12 23:58:00.691628 kubelet[2585]: I0912 23:58:00.691604 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/97b510e2-b26c-4933-a30d-6f3e0aaf1ce5-typha-certs\") pod \"calico-typha-67cc5b5fbd-z76hj\" (UID: \"97b510e2-b26c-4933-a30d-6f3e0aaf1ce5\") " pod="calico-system/calico-typha-67cc5b5fbd-z76hj" Sep 12 23:58:00.691628 kubelet[2585]: I0912 23:58:00.691625 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-npb2t\" (UniqueName: \"kubernetes.io/projected/97b510e2-b26c-4933-a30d-6f3e0aaf1ce5-kube-api-access-npb2t\") pod \"calico-typha-67cc5b5fbd-z76hj\" (UID: \"97b510e2-b26c-4933-a30d-6f3e0aaf1ce5\") " pod="calico-system/calico-typha-67cc5b5fbd-z76hj" Sep 12 23:58:00.815146 systemd[1]: Created slice kubepods-besteffort-podbff68e9e_4a73_4202_ad54_0c25f6a3277b.slice - libcontainer container kubepods-besteffort-podbff68e9e_4a73_4202_ad54_0c25f6a3277b.slice. 
Sep 12 23:58:00.893802 kubelet[2585]: I0912 23:58:00.893619 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/bff68e9e-4a73-4202-ad54-0c25f6a3277b-lib-modules\") pod \"calico-node-n2hgd\" (UID: \"bff68e9e-4a73-4202-ad54-0c25f6a3277b\") " pod="calico-system/calico-node-n2hgd" Sep 12 23:58:00.893802 kubelet[2585]: I0912 23:58:00.893665 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/bff68e9e-4a73-4202-ad54-0c25f6a3277b-policysync\") pod \"calico-node-n2hgd\" (UID: \"bff68e9e-4a73-4202-ad54-0c25f6a3277b\") " pod="calico-system/calico-node-n2hgd" Sep 12 23:58:00.893802 kubelet[2585]: I0912 23:58:00.893691 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/bff68e9e-4a73-4202-ad54-0c25f6a3277b-flexvol-driver-host\") pod \"calico-node-n2hgd\" (UID: \"bff68e9e-4a73-4202-ad54-0c25f6a3277b\") " pod="calico-system/calico-node-n2hgd" Sep 12 23:58:00.894226 kubelet[2585]: I0912 23:58:00.893975 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bff68e9e-4a73-4202-ad54-0c25f6a3277b-tigera-ca-bundle\") pod \"calico-node-n2hgd\" (UID: \"bff68e9e-4a73-4202-ad54-0c25f6a3277b\") " pod="calico-system/calico-node-n2hgd" Sep 12 23:58:00.894226 kubelet[2585]: I0912 23:58:00.894006 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/bff68e9e-4a73-4202-ad54-0c25f6a3277b-cni-bin-dir\") pod \"calico-node-n2hgd\" (UID: \"bff68e9e-4a73-4202-ad54-0c25f6a3277b\") " pod="calico-system/calico-node-n2hgd" Sep 12 23:58:00.894226 kubelet[2585]: I0912 23:58:00.894025 2585 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/bff68e9e-4a73-4202-ad54-0c25f6a3277b-cni-log-dir\") pod \"calico-node-n2hgd\" (UID: \"bff68e9e-4a73-4202-ad54-0c25f6a3277b\") " pod="calico-system/calico-node-n2hgd" Sep 12 23:58:00.894226 kubelet[2585]: I0912 23:58:00.894043 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/bff68e9e-4a73-4202-ad54-0c25f6a3277b-var-lib-calico\") pod \"calico-node-n2hgd\" (UID: \"bff68e9e-4a73-4202-ad54-0c25f6a3277b\") " pod="calico-system/calico-node-n2hgd" Sep 12 23:58:00.894226 kubelet[2585]: I0912 23:58:00.894057 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/bff68e9e-4a73-4202-ad54-0c25f6a3277b-var-run-calico\") pod \"calico-node-n2hgd\" (UID: \"bff68e9e-4a73-4202-ad54-0c25f6a3277b\") " pod="calico-system/calico-node-n2hgd" Sep 12 23:58:00.894370 kubelet[2585]: I0912 23:58:00.894074 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/bff68e9e-4a73-4202-ad54-0c25f6a3277b-cni-net-dir\") pod \"calico-node-n2hgd\" (UID: \"bff68e9e-4a73-4202-ad54-0c25f6a3277b\") " pod="calico-system/calico-node-n2hgd" Sep 12 23:58:00.894370 kubelet[2585]: I0912 23:58:00.894090 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/bff68e9e-4a73-4202-ad54-0c25f6a3277b-node-certs\") pod \"calico-node-n2hgd\" (UID: \"bff68e9e-4a73-4202-ad54-0c25f6a3277b\") " pod="calico-system/calico-node-n2hgd" Sep 12 23:58:00.894370 kubelet[2585]: I0912 23:58:00.894129 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/bff68e9e-4a73-4202-ad54-0c25f6a3277b-xtables-lock\") pod \"calico-node-n2hgd\" (UID: \"bff68e9e-4a73-4202-ad54-0c25f6a3277b\") " pod="calico-system/calico-node-n2hgd" Sep 12 23:58:00.894370 kubelet[2585]: I0912 23:58:00.894151 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h275s\" (UniqueName: \"kubernetes.io/projected/bff68e9e-4a73-4202-ad54-0c25f6a3277b-kube-api-access-h275s\") pod \"calico-node-n2hgd\" (UID: \"bff68e9e-4a73-4202-ad54-0c25f6a3277b\") " pod="calico-system/calico-node-n2hgd" Sep 12 23:58:00.923860 containerd[1466]: time="2025-09-12T23:58:00.923805184Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-67cc5b5fbd-z76hj,Uid:97b510e2-b26c-4933-a30d-6f3e0aaf1ce5,Namespace:calico-system,Attempt:0,}" Sep 12 23:58:00.971744 containerd[1466]: time="2025-09-12T23:58:00.971463808Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Sep 12 23:58:00.971744 containerd[1466]: time="2025-09-12T23:58:00.971531172Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Sep 12 23:58:00.971744 containerd[1466]: time="2025-09-12T23:58:00.971542652Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 12 23:58:00.971744 containerd[1466]: time="2025-09-12T23:58:00.971633377Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 12 23:58:00.997894 kubelet[2585]: E0912 23:58:00.997415 2585 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:58:00.997894 kubelet[2585]: W0912 23:58:00.997813 2585 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:58:00.997894 kubelet[2585]: E0912 23:58:00.997850 2585 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:58:00.998405 kubelet[2585]: E0912 23:58:00.998093 2585 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:58:00.998405 kubelet[2585]: W0912 23:58:00.998348 2585 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:58:00.998405 kubelet[2585]: E0912 23:58:00.998363 2585 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:58:01.000812 kubelet[2585]: E0912 23:58:00.998535 2585 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:58:01.000812 kubelet[2585]: W0912 23:58:00.998551 2585 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:58:01.000812 kubelet[2585]: E0912 23:58:00.998561 2585 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:58:01.000812 kubelet[2585]: E0912 23:58:00.999889 2585 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:58:01.000812 kubelet[2585]: W0912 23:58:00.999905 2585 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:58:01.000812 kubelet[2585]: E0912 23:58:00.999918 2585 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:58:01.000812 kubelet[2585]: E0912 23:58:01.000378 2585 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:58:01.000812 kubelet[2585]: W0912 23:58:01.000388 2585 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:58:01.000812 kubelet[2585]: E0912 23:58:01.000414 2585 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:58:01.000812 kubelet[2585]: E0912 23:58:01.000613 2585 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:58:01.001171 kubelet[2585]: W0912 23:58:01.000622 2585 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:58:01.001171 kubelet[2585]: E0912 23:58:01.000631 2585 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:58:01.002305 kubelet[2585]: E0912 23:58:01.001098 2585 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:58:01.002305 kubelet[2585]: W0912 23:58:01.001743 2585 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:58:01.002305 kubelet[2585]: E0912 23:58:01.001766 2585 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:58:01.002305 kubelet[2585]: E0912 23:58:01.001968 2585 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:58:01.002305 kubelet[2585]: W0912 23:58:01.001978 2585 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:58:01.002305 kubelet[2585]: E0912 23:58:01.001988 2585 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:58:01.002536 kubelet[2585]: E0912 23:58:01.002329 2585 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:58:01.002536 kubelet[2585]: W0912 23:58:01.002339 2585 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:58:01.002536 kubelet[2585]: E0912 23:58:01.002401 2585 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:58:01.010946 systemd[1]: Started cri-containerd-dd7692968a4046c6b3d661112edd6c5e40e762f0f8202b13be4737ee151638e4.scope - libcontainer container dd7692968a4046c6b3d661112edd6c5e40e762f0f8202b13be4737ee151638e4. Sep 12 23:58:01.012565 kubelet[2585]: E0912 23:58:01.011451 2585 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:58:01.012565 kubelet[2585]: W0912 23:58:01.011476 2585 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:58:01.012565 kubelet[2585]: E0912 23:58:01.011501 2585 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:58:01.028583 kubelet[2585]: E0912 23:58:01.028509 2585 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:58:01.028583 kubelet[2585]: W0912 23:58:01.028536 2585 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:58:01.028583 kubelet[2585]: E0912 23:58:01.028566 2585 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:58:01.074355 kubelet[2585]: E0912 23:58:01.073407 2585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-vdhzb" podUID="ed6d6d1f-7445-4602-8d9e-b0dd54215b8f" Sep 12 23:58:01.080392 kubelet[2585]: I0912 23:58:01.080331 2585 status_manager.go:890] "Failed to get status for pod" podUID="ed6d6d1f-7445-4602-8d9e-b0dd54215b8f" pod="calico-system/csi-node-driver-vdhzb" err="pods \"csi-node-driver-vdhzb\" is forbidden: User \"system:node:ci-4081-3-5-n-326e2e5946\" cannot get resource \"pods\" in API group \"\" in the namespace \"calico-system\": no relationship found between node 'ci-4081-3-5-n-326e2e5946' and this object" Sep 12 23:58:01.085568 kubelet[2585]: E0912 23:58:01.085531 2585 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:58:01.085568 kubelet[2585]: W0912 23:58:01.085558 2585 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, 
output: "" Sep 12 23:58:01.085882 kubelet[2585]: E0912 23:58:01.085582 2585 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:58:01.086867 kubelet[2585]: E0912 23:58:01.086831 2585 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:58:01.086974 kubelet[2585]: W0912 23:58:01.086863 2585 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:58:01.086974 kubelet[2585]: E0912 23:58:01.086919 2585 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:58:01.089028 kubelet[2585]: E0912 23:58:01.088966 2585 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:58:01.089028 kubelet[2585]: W0912 23:58:01.088986 2585 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:58:01.089028 kubelet[2585]: E0912 23:58:01.089003 2585 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:58:01.089283 kubelet[2585]: E0912 23:58:01.089260 2585 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:58:01.089320 kubelet[2585]: W0912 23:58:01.089278 2585 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:58:01.089320 kubelet[2585]: E0912 23:58:01.089306 2585 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:58:01.089521 kubelet[2585]: E0912 23:58:01.089499 2585 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:58:01.089556 kubelet[2585]: W0912 23:58:01.089526 2585 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:58:01.089556 kubelet[2585]: E0912 23:58:01.089536 2585 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:58:01.089747 kubelet[2585]: E0912 23:58:01.089732 2585 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:58:01.089747 kubelet[2585]: W0912 23:58:01.089744 2585 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:58:01.089815 kubelet[2585]: E0912 23:58:01.089754 2585 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:58:01.090012 kubelet[2585]: E0912 23:58:01.089923 2585 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:58:01.090012 kubelet[2585]: W0912 23:58:01.089937 2585 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:58:01.090012 kubelet[2585]: E0912 23:58:01.089960 2585 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:58:01.090377 kubelet[2585]: E0912 23:58:01.090296 2585 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:58:01.090517 kubelet[2585]: W0912 23:58:01.090415 2585 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:58:01.090517 kubelet[2585]: E0912 23:58:01.090431 2585 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:58:01.091809 kubelet[2585]: E0912 23:58:01.091265 2585 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:58:01.091809 kubelet[2585]: W0912 23:58:01.091290 2585 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:58:01.091809 kubelet[2585]: E0912 23:58:01.091303 2585 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:58:01.092054 kubelet[2585]: E0912 23:58:01.092023 2585 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:58:01.092104 kubelet[2585]: W0912 23:58:01.092088 2585 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:58:01.092153 kubelet[2585]: E0912 23:58:01.092105 2585 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:58:01.092748 kubelet[2585]: E0912 23:58:01.092726 2585 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:58:01.092748 kubelet[2585]: W0912 23:58:01.092744 2585 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:58:01.092839 kubelet[2585]: E0912 23:58:01.092761 2585 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:58:01.094245 kubelet[2585]: E0912 23:58:01.094214 2585 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:58:01.094245 kubelet[2585]: W0912 23:58:01.094236 2585 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:58:01.094245 kubelet[2585]: E0912 23:58:01.094249 2585 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:58:01.095463 kubelet[2585]: E0912 23:58:01.095433 2585 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:58:01.095463 kubelet[2585]: W0912 23:58:01.095453 2585 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:58:01.095463 kubelet[2585]: E0912 23:58:01.095466 2585 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:58:01.095960 kubelet[2585]: E0912 23:58:01.095920 2585 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:58:01.095960 kubelet[2585]: W0912 23:58:01.095936 2585 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:58:01.096043 kubelet[2585]: E0912 23:58:01.095968 2585 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:58:01.097258 kubelet[2585]: E0912 23:58:01.097231 2585 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:58:01.097258 kubelet[2585]: W0912 23:58:01.097250 2585 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:58:01.097356 kubelet[2585]: E0912 23:58:01.097264 2585 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:58:01.097578 kubelet[2585]: E0912 23:58:01.097556 2585 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:58:01.097578 kubelet[2585]: W0912 23:58:01.097571 2585 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:58:01.097671 kubelet[2585]: E0912 23:58:01.097581 2585 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:58:01.097997 kubelet[2585]: E0912 23:58:01.097975 2585 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:58:01.097997 kubelet[2585]: W0912 23:58:01.097989 2585 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:58:01.098076 kubelet[2585]: E0912 23:58:01.098000 2585 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:58:01.098568 kubelet[2585]: E0912 23:58:01.098538 2585 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:58:01.098568 kubelet[2585]: W0912 23:58:01.098557 2585 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:58:01.098568 kubelet[2585]: E0912 23:58:01.098569 2585 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:58:01.099080 kubelet[2585]: E0912 23:58:01.099060 2585 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:58:01.099080 kubelet[2585]: W0912 23:58:01.099075 2585 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:58:01.099176 kubelet[2585]: E0912 23:58:01.099087 2585 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:58:01.100034 kubelet[2585]: E0912 23:58:01.099572 2585 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:58:01.100034 kubelet[2585]: W0912 23:58:01.099590 2585 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:58:01.100034 kubelet[2585]: E0912 23:58:01.099602 2585 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:58:01.100380 kubelet[2585]: E0912 23:58:01.100361 2585 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:58:01.100380 kubelet[2585]: W0912 23:58:01.100376 2585 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:58:01.100456 kubelet[2585]: E0912 23:58:01.100388 2585 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:58:01.100650 kubelet[2585]: I0912 23:58:01.100628 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ed6d6d1f-7445-4602-8d9e-b0dd54215b8f-kubelet-dir\") pod \"csi-node-driver-vdhzb\" (UID: \"ed6d6d1f-7445-4602-8d9e-b0dd54215b8f\") " pod="calico-system/csi-node-driver-vdhzb" Sep 12 23:58:01.101002 kubelet[2585]: E0912 23:58:01.100975 2585 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:58:01.101002 kubelet[2585]: W0912 23:58:01.100995 2585 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:58:01.101685 kubelet[2585]: E0912 23:58:01.101013 2585 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:58:01.101826 kubelet[2585]: E0912 23:58:01.101808 2585 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:58:01.101826 kubelet[2585]: W0912 23:58:01.101823 2585 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:58:01.101902 kubelet[2585]: E0912 23:58:01.101837 2585 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:58:01.102246 kubelet[2585]: E0912 23:58:01.102224 2585 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:58:01.102246 kubelet[2585]: W0912 23:58:01.102240 2585 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:58:01.102327 kubelet[2585]: E0912 23:58:01.102251 2585 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:58:01.102524 kubelet[2585]: I0912 23:58:01.102277 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/ed6d6d1f-7445-4602-8d9e-b0dd54215b8f-varrun\") pod \"csi-node-driver-vdhzb\" (UID: \"ed6d6d1f-7445-4602-8d9e-b0dd54215b8f\") " pod="calico-system/csi-node-driver-vdhzb" Sep 12 23:58:01.103358 kubelet[2585]: E0912 23:58:01.102750 2585 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:58:01.103358 kubelet[2585]: W0912 23:58:01.102787 2585 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:58:01.103358 kubelet[2585]: E0912 23:58:01.102807 2585 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:58:01.103618 kubelet[2585]: E0912 23:58:01.103599 2585 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:58:01.103695 kubelet[2585]: W0912 23:58:01.103681 2585 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:58:01.104221 kubelet[2585]: E0912 23:58:01.103993 2585 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:58:01.104727 kubelet[2585]: E0912 23:58:01.104343 2585 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:58:01.104829 kubelet[2585]: W0912 23:58:01.104812 2585 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:58:01.104884 kubelet[2585]: E0912 23:58:01.104873 2585 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:58:01.105187 kubelet[2585]: I0912 23:58:01.104958 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/ed6d6d1f-7445-4602-8d9e-b0dd54215b8f-socket-dir\") pod \"csi-node-driver-vdhzb\" (UID: \"ed6d6d1f-7445-4602-8d9e-b0dd54215b8f\") " pod="calico-system/csi-node-driver-vdhzb" Sep 12 23:58:01.106820 kubelet[2585]: E0912 23:58:01.106796 2585 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:58:01.106936 kubelet[2585]: W0912 23:58:01.106922 2585 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:58:01.107069 kubelet[2585]: E0912 23:58:01.107053 2585 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:58:01.107191 kubelet[2585]: I0912 23:58:01.107176 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6ldb\" (UniqueName: \"kubernetes.io/projected/ed6d6d1f-7445-4602-8d9e-b0dd54215b8f-kube-api-access-c6ldb\") pod \"csi-node-driver-vdhzb\" (UID: \"ed6d6d1f-7445-4602-8d9e-b0dd54215b8f\") " pod="calico-system/csi-node-driver-vdhzb" Sep 12 23:58:01.107434 kubelet[2585]: E0912 23:58:01.107404 2585 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:58:01.107434 kubelet[2585]: W0912 23:58:01.107429 2585 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:58:01.107510 kubelet[2585]: E0912 23:58:01.107448 2585 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:58:01.107953 kubelet[2585]: E0912 23:58:01.107929 2585 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:58:01.107953 kubelet[2585]: W0912 23:58:01.107950 2585 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:58:01.108067 kubelet[2585]: E0912 23:58:01.107976 2585 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:58:01.108399 kubelet[2585]: E0912 23:58:01.108372 2585 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:58:01.108399 kubelet[2585]: W0912 23:58:01.108386 2585 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:58:01.108399 kubelet[2585]: E0912 23:58:01.108405 2585 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:58:01.108519 kubelet[2585]: I0912 23:58:01.108428 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/ed6d6d1f-7445-4602-8d9e-b0dd54215b8f-registration-dir\") pod \"csi-node-driver-vdhzb\" (UID: \"ed6d6d1f-7445-4602-8d9e-b0dd54215b8f\") " pod="calico-system/csi-node-driver-vdhzb" Sep 12 23:58:01.108827 kubelet[2585]: E0912 23:58:01.108802 2585 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:58:01.108886 kubelet[2585]: W0912 23:58:01.108868 2585 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:58:01.108912 kubelet[2585]: E0912 23:58:01.108900 2585 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:58:01.109538 kubelet[2585]: E0912 23:58:01.109514 2585 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:58:01.109538 kubelet[2585]: W0912 23:58:01.109533 2585 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:58:01.109781 kubelet[2585]: E0912 23:58:01.109642 2585 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:58:01.110038 kubelet[2585]: E0912 23:58:01.110019 2585 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:58:01.110038 kubelet[2585]: W0912 23:58:01.110034 2585 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:58:01.110133 kubelet[2585]: E0912 23:58:01.110046 2585 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:58:01.110329 kubelet[2585]: E0912 23:58:01.110314 2585 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:58:01.110329 kubelet[2585]: W0912 23:58:01.110326 2585 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:58:01.110393 kubelet[2585]: E0912 23:58:01.110336 2585 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:58:01.118805 containerd[1466]: time="2025-09-12T23:58:01.118741512Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-n2hgd,Uid:bff68e9e-4a73-4202-ad54-0c25f6a3277b,Namespace:calico-system,Attempt:0,}" Sep 12 23:58:01.177856 containerd[1466]: time="2025-09-12T23:58:01.177743166Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Sep 12 23:58:01.177856 containerd[1466]: time="2025-09-12T23:58:01.177811089Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Sep 12 23:58:01.178055 containerd[1466]: time="2025-09-12T23:58:01.177827610Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 12 23:58:01.178055 containerd[1466]: time="2025-09-12T23:58:01.177931856Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 12 23:58:01.186298 containerd[1466]: time="2025-09-12T23:58:01.186256069Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-67cc5b5fbd-z76hj,Uid:97b510e2-b26c-4933-a30d-6f3e0aaf1ce5,Namespace:calico-system,Attempt:0,} returns sandbox id \"dd7692968a4046c6b3d661112edd6c5e40e762f0f8202b13be4737ee151638e4\"" Sep 12 23:58:01.191784 containerd[1466]: time="2025-09-12T23:58:01.191574559Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.3\"" Sep 12 23:58:01.210056 kubelet[2585]: E0912 23:58:01.210019 2585 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:58:01.210056 kubelet[2585]: W0912 23:58:01.210046 2585 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:58:01.210262 kubelet[2585]: E0912 23:58:01.210073 2585 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:58:01.211633 kubelet[2585]: E0912 23:58:01.211602 2585 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:58:01.211633 kubelet[2585]: W0912 23:58:01.211626 2585 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:58:01.211840 kubelet[2585]: E0912 23:58:01.211663 2585 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:58:01.213737 kubelet[2585]: E0912 23:58:01.213035 2585 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:58:01.213737 kubelet[2585]: W0912 23:58:01.213062 2585 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:58:01.213737 kubelet[2585]: E0912 23:58:01.213087 2585 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:58:01.213737 kubelet[2585]: E0912 23:58:01.213845 2585 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:58:01.213737 kubelet[2585]: W0912 23:58:01.213859 2585 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:58:01.213737 kubelet[2585]: E0912 23:58:01.213873 2585 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:58:01.214275 kubelet[2585]: E0912 23:58:01.214072 2585 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:58:01.214275 kubelet[2585]: W0912 23:58:01.214082 2585 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:58:01.214275 kubelet[2585]: E0912 23:58:01.214092 2585 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:58:01.215726 kubelet[2585]: E0912 23:58:01.214389 2585 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:58:01.215726 kubelet[2585]: W0912 23:58:01.214406 2585 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:58:01.215726 kubelet[2585]: E0912 23:58:01.214417 2585 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:58:01.215726 kubelet[2585]: E0912 23:58:01.214620 2585 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:58:01.215726 kubelet[2585]: W0912 23:58:01.214630 2585 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:58:01.215726 kubelet[2585]: E0912 23:58:01.214640 2585 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:58:01.215726 kubelet[2585]: E0912 23:58:01.214985 2585 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:58:01.215726 kubelet[2585]: W0912 23:58:01.214997 2585 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:58:01.215726 kubelet[2585]: E0912 23:58:01.215008 2585 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:58:01.215726 kubelet[2585]: E0912 23:58:01.215486 2585 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:58:01.214968 systemd[1]: Started cri-containerd-eaaf952d1de8d62483e39d3768aac0024efd2f4f83101ab04a5c62c0de1ebc0c.scope - libcontainer container eaaf952d1de8d62483e39d3768aac0024efd2f4f83101ab04a5c62c0de1ebc0c. 
Sep 12 23:58:01.216620 kubelet[2585]: W0912 23:58:01.215500 2585 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:58:01.216620 kubelet[2585]: E0912 23:58:01.215529 2585 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:58:01.216620 kubelet[2585]: E0912 23:58:01.215810 2585 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:58:01.216620 kubelet[2585]: W0912 23:58:01.215822 2585 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:58:01.216620 kubelet[2585]: E0912 23:58:01.215831 2585 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:58:01.216620 kubelet[2585]: E0912 23:58:01.216397 2585 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:58:01.216620 kubelet[2585]: W0912 23:58:01.216412 2585 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:58:01.216620 kubelet[2585]: E0912 23:58:01.216424 2585 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:58:01.216620 kubelet[2585]: E0912 23:58:01.216600 2585 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:58:01.216620 kubelet[2585]: W0912 23:58:01.216609 2585 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:58:01.219451 kubelet[2585]: E0912 23:58:01.216617 2585 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:58:01.219451 kubelet[2585]: E0912 23:58:01.216928 2585 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:58:01.219451 kubelet[2585]: W0912 23:58:01.216940 2585 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:58:01.219451 kubelet[2585]: E0912 23:58:01.216950 2585 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:58:01.219451 kubelet[2585]: E0912 23:58:01.217756 2585 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:58:01.219451 kubelet[2585]: W0912 23:58:01.217777 2585 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:58:01.219451 kubelet[2585]: E0912 23:58:01.217816 2585 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:58:01.219451 kubelet[2585]: E0912 23:58:01.218520 2585 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:58:01.219451 kubelet[2585]: W0912 23:58:01.218534 2585 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:58:01.219451 kubelet[2585]: E0912 23:58:01.218623 2585 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:58:01.219653 kubelet[2585]: E0912 23:58:01.218754 2585 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:58:01.219653 kubelet[2585]: W0912 23:58:01.218762 2585 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:58:01.219653 kubelet[2585]: E0912 23:58:01.218849 2585 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:58:01.219653 kubelet[2585]: E0912 23:58:01.218932 2585 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:58:01.219653 kubelet[2585]: W0912 23:58:01.218939 2585 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:58:01.219653 kubelet[2585]: E0912 23:58:01.219039 2585 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:58:01.220406 kubelet[2585]: E0912 23:58:01.219688 2585 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:58:01.220406 kubelet[2585]: W0912 23:58:01.219703 2585 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:58:01.220406 kubelet[2585]: E0912 23:58:01.219805 2585 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:58:01.220406 kubelet[2585]: E0912 23:58:01.220162 2585 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:58:01.220406 kubelet[2585]: W0912 23:58:01.220177 2585 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:58:01.220406 kubelet[2585]: E0912 23:58:01.220215 2585 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:58:01.220553 kubelet[2585]: E0912 23:58:01.220469 2585 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:58:01.220553 kubelet[2585]: W0912 23:58:01.220480 2585 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:58:01.221773 kubelet[2585]: E0912 23:58:01.220941 2585 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:58:01.221773 kubelet[2585]: W0912 23:58:01.220961 2585 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:58:01.221773 kubelet[2585]: E0912 23:58:01.220977 2585 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:58:01.221773 kubelet[2585]: E0912 23:58:01.221478 2585 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:58:01.221773 kubelet[2585]: W0912 23:58:01.221498 2585 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:58:01.221773 kubelet[2585]: E0912 23:58:01.221526 2585 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:58:01.221773 kubelet[2585]: E0912 23:58:01.221740 2585 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:58:01.222129 kubelet[2585]: E0912 23:58:01.221883 2585 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:58:01.222129 kubelet[2585]: W0912 23:58:01.221893 2585 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:58:01.222129 kubelet[2585]: E0912 23:58:01.221940 2585 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:58:01.222555 kubelet[2585]: E0912 23:58:01.222520 2585 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:58:01.223243 kubelet[2585]: W0912 23:58:01.222700 2585 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:58:01.223243 kubelet[2585]: E0912 23:58:01.222893 2585 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:58:01.223728 kubelet[2585]: E0912 23:58:01.223472 2585 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:58:01.223728 kubelet[2585]: W0912 23:58:01.223508 2585 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:58:01.223728 kubelet[2585]: E0912 23:58:01.223523 2585 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:58:01.271843 kubelet[2585]: E0912 23:58:01.271780 2585 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:58:01.271843 kubelet[2585]: W0912 23:58:01.271803 2585 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:58:01.271843 kubelet[2585]: E0912 23:58:01.271843 2585 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:58:01.272897 containerd[1466]: time="2025-09-12T23:58:01.272485687Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-n2hgd,Uid:bff68e9e-4a73-4202-ad54-0c25f6a3277b,Namespace:calico-system,Attempt:0,} returns sandbox id \"eaaf952d1de8d62483e39d3768aac0024efd2f4f83101ab04a5c62c0de1ebc0c\"" Sep 12 23:58:02.638758 kubelet[2585]: E0912 23:58:02.638610 2585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-vdhzb" podUID="ed6d6d1f-7445-4602-8d9e-b0dd54215b8f" Sep 12 23:58:02.697489 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3424065043.mount: Deactivated successfully. Sep 12 23:58:03.635992 containerd[1466]: time="2025-09-12T23:58:03.635916648Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:58:03.637768 containerd[1466]: time="2025-09-12T23:58:03.637514811Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.30.3: active requests=0, bytes read=33105775" Sep 12 23:58:03.638977 containerd[1466]: time="2025-09-12T23:58:03.638926084Z" level=info msg="ImageCreate event name:\"sha256:6a1496fdc48cc0b9ab3c10aef777497484efac5df9efbfbbdf9775e9583645cb\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:58:03.643660 containerd[1466]: time="2025-09-12T23:58:03.643537164Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:f4a3d61ffda9c98a53adeb412c5af404ca3727a3cc2d0b4ef28d197bdd47ecaa\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:58:03.644332 containerd[1466]: time="2025-09-12T23:58:03.644283602Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.30.3\" with image id 
\"sha256:6a1496fdc48cc0b9ab3c10aef777497484efac5df9efbfbbdf9775e9583645cb\", repo tag \"ghcr.io/flatcar/calico/typha:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:f4a3d61ffda9c98a53adeb412c5af404ca3727a3cc2d0b4ef28d197bdd47ecaa\", size \"33105629\" in 2.451960763s" Sep 12 23:58:03.644332 containerd[1466]: time="2025-09-12T23:58:03.644321884Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.3\" returns image reference \"sha256:6a1496fdc48cc0b9ab3c10aef777497484efac5df9efbfbbdf9775e9583645cb\"" Sep 12 23:58:03.645812 containerd[1466]: time="2025-09-12T23:58:03.645587670Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\"" Sep 12 23:58:03.665113 containerd[1466]: time="2025-09-12T23:58:03.665037199Z" level=info msg="CreateContainer within sandbox \"dd7692968a4046c6b3d661112edd6c5e40e762f0f8202b13be4737ee151638e4\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Sep 12 23:58:03.689255 containerd[1466]: time="2025-09-12T23:58:03.689077886Z" level=info msg="CreateContainer within sandbox \"dd7692968a4046c6b3d661112edd6c5e40e762f0f8202b13be4737ee151638e4\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"7c8c1434bbd45524797dc0d97b688151c9eaa64105c8e48335df8e54e44f47be\"" Sep 12 23:58:03.690766 containerd[1466]: time="2025-09-12T23:58:03.690090899Z" level=info msg="StartContainer for \"7c8c1434bbd45524797dc0d97b688151c9eaa64105c8e48335df8e54e44f47be\"" Sep 12 23:58:03.737982 systemd[1]: Started cri-containerd-7c8c1434bbd45524797dc0d97b688151c9eaa64105c8e48335df8e54e44f47be.scope - libcontainer container 7c8c1434bbd45524797dc0d97b688151c9eaa64105c8e48335df8e54e44f47be. 
Sep 12 23:58:03.787814 containerd[1466]: time="2025-09-12T23:58:03.787651481Z" level=info msg="StartContainer for \"7c8c1434bbd45524797dc0d97b688151c9eaa64105c8e48335df8e54e44f47be\" returns successfully" Sep 12 23:58:04.639430 kubelet[2585]: E0912 23:58:04.638559 2585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-vdhzb" podUID="ed6d6d1f-7445-4602-8d9e-b0dd54215b8f" Sep 12 23:58:04.656210 systemd[1]: run-containerd-runc-k8s.io-7c8c1434bbd45524797dc0d97b688151c9eaa64105c8e48335df8e54e44f47be-runc.lAsIpF.mount: Deactivated successfully. Sep 12 23:58:04.788051 kubelet[2585]: I0912 23:58:04.787168 2585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-67cc5b5fbd-z76hj" podStartSLOduration=2.332363381 podStartE2EDuration="4.787145055s" podCreationTimestamp="2025-09-12 23:58:00 +0000 UTC" firstStartedPulling="2025-09-12 23:58:01.190532342 +0000 UTC m=+23.669765429" lastFinishedPulling="2025-09-12 23:58:03.645314016 +0000 UTC m=+26.124547103" observedRunningTime="2025-09-12 23:58:04.78644758 +0000 UTC m=+27.265680747" watchObservedRunningTime="2025-09-12 23:58:04.787145055 +0000 UTC m=+27.266378222" Sep 12 23:58:04.824327 kubelet[2585]: E0912 23:58:04.824217 2585 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:58:04.824327 kubelet[2585]: W0912 23:58:04.824256 2585 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:58:04.824327 kubelet[2585]: E0912 23:58:04.824312 2585 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory 
nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:58:05.216761 containerd[1466]: time="2025-09-12T23:58:05.215657838Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:58:05.217908 containerd[1466]: time="2025-09-12T23:58:05.217865948Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3: active requests=0, bytes read=4266814" Sep 12 23:58:05.218048 containerd[1466]: time="2025-09-12T23:58:05.217905910Z" level=info msg="ImageCreate event name:\"sha256:29e6f31ad72882b1b817dd257df6b7981e4d7d31d872b7fe2cf102c6e2af27a5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:58:05.221557 containerd[1466]: time="2025-09-12T23:58:05.221497848Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:81bdfcd9dbd36624dc35354e8c181c75631ba40e6c7df5820f5f56cea36f0ef9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:58:05.222857 containerd[1466]: time="2025-09-12T23:58:05.222798393Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" with image id \"sha256:29e6f31ad72882b1b817dd257df6b7981e4d7d31d872b7fe2cf102c6e2af27a5\", repo tag 
\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:81bdfcd9dbd36624dc35354e8c181c75631ba40e6c7df5820f5f56cea36f0ef9\", size \"5636015\" in 1.577084716s" Sep 12 23:58:05.222975 containerd[1466]: time="2025-09-12T23:58:05.222858996Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" returns image reference \"sha256:29e6f31ad72882b1b817dd257df6b7981e4d7d31d872b7fe2cf102c6e2af27a5\"" Sep 12 23:58:05.226949 containerd[1466]: time="2025-09-12T23:58:05.226904956Z" level=info msg="CreateContainer within sandbox \"eaaf952d1de8d62483e39d3768aac0024efd2f4f83101ab04a5c62c0de1ebc0c\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Sep 12 23:58:05.249115 containerd[1466]: time="2025-09-12T23:58:05.248977331Z" level=info msg="CreateContainer within sandbox \"eaaf952d1de8d62483e39d3768aac0024efd2f4f83101ab04a5c62c0de1ebc0c\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"d4b353e2e7aa254eb811c53c6754eb4634281f33b2aa30f4f09c4b6619a275a5\"" Sep 12 23:58:05.250388 containerd[1466]: time="2025-09-12T23:58:05.250107187Z" level=info msg="StartContainer for \"d4b353e2e7aa254eb811c53c6754eb4634281f33b2aa30f4f09c4b6619a275a5\"" Sep 12 23:58:05.290956 systemd[1]: Started cri-containerd-d4b353e2e7aa254eb811c53c6754eb4634281f33b2aa30f4f09c4b6619a275a5.scope - libcontainer container d4b353e2e7aa254eb811c53c6754eb4634281f33b2aa30f4f09c4b6619a275a5. Sep 12 23:58:05.329430 containerd[1466]: time="2025-09-12T23:58:05.329383120Z" level=info msg="StartContainer for \"d4b353e2e7aa254eb811c53c6754eb4634281f33b2aa30f4f09c4b6619a275a5\" returns successfully" Sep 12 23:58:05.347467 systemd[1]: cri-containerd-d4b353e2e7aa254eb811c53c6754eb4634281f33b2aa30f4f09c4b6619a275a5.scope: Deactivated successfully. 
Sep 12 23:58:05.381261 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-d4b353e2e7aa254eb811c53c6754eb4634281f33b2aa30f4f09c4b6619a275a5-rootfs.mount: Deactivated successfully. Sep 12 23:58:05.491517 containerd[1466]: time="2025-09-12T23:58:05.489353176Z" level=info msg="shim disconnected" id=d4b353e2e7aa254eb811c53c6754eb4634281f33b2aa30f4f09c4b6619a275a5 namespace=k8s.io Sep 12 23:58:05.491517 containerd[1466]: time="2025-09-12T23:58:05.489407379Z" level=warning msg="cleaning up after shim disconnected" id=d4b353e2e7aa254eb811c53c6754eb4634281f33b2aa30f4f09c4b6619a275a5 namespace=k8s.io Sep 12 23:58:05.491517 containerd[1466]: time="2025-09-12T23:58:05.489418140Z" level=info msg="cleaning up dead shim" namespace=k8s.io Sep 12 23:58:05.775675 kubelet[2585]: I0912 23:58:05.775233 2585 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 12 23:58:05.778875 containerd[1466]: time="2025-09-12T23:58:05.778275670Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.3\"" Sep 12 23:58:06.639437 kubelet[2585]: E0912 23:58:06.638675 2585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-vdhzb" podUID="ed6d6d1f-7445-4602-8d9e-b0dd54215b8f" Sep 12 23:58:08.638939 kubelet[2585]: E0912 23:58:08.638842 2585 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-vdhzb" podUID="ed6d6d1f-7445-4602-8d9e-b0dd54215b8f" Sep 12 23:58:09.496611 containerd[1466]: time="2025-09-12T23:58:09.496546494Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.30.3\" labels:{key:\"io.cri-containerd.image\" 
value:\"managed\"}" Sep 12 23:58:09.498741 containerd[1466]: time="2025-09-12T23:58:09.498097565Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.30.3: active requests=0, bytes read=65913477" Sep 12 23:58:09.499935 containerd[1466]: time="2025-09-12T23:58:09.499894048Z" level=info msg="ImageCreate event name:\"sha256:7077a1dc632ee598cbfa626f9e3c9bca5b20c0d1e1e557995890125b2e8d2e23\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:58:09.506897 containerd[1466]: time="2025-09-12T23:58:09.506768123Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:73d1e391050490d54e5bee8ff2b1a50a8be1746c98dc530361b00e8c0ab63f87\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:58:09.514245 containerd[1466]: time="2025-09-12T23:58:09.513932931Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.30.3\" with image id \"sha256:7077a1dc632ee598cbfa626f9e3c9bca5b20c0d1e1e557995890125b2e8d2e23\", repo tag \"ghcr.io/flatcar/calico/cni:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:73d1e391050490d54e5bee8ff2b1a50a8be1746c98dc530361b00e8c0ab63f87\", size \"67282718\" in 3.735545096s" Sep 12 23:58:09.514245 containerd[1466]: time="2025-09-12T23:58:09.513986054Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.3\" returns image reference \"sha256:7077a1dc632ee598cbfa626f9e3c9bca5b20c0d1e1e557995890125b2e8d2e23\"" Sep 12 23:58:09.524004 containerd[1466]: time="2025-09-12T23:58:09.523948390Z" level=info msg="CreateContainer within sandbox \"eaaf952d1de8d62483e39d3768aac0024efd2f4f83101ab04a5c62c0de1ebc0c\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Sep 12 23:58:09.544271 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1049746581.mount: Deactivated successfully. 
Sep 12 23:58:09.548948 containerd[1466]: time="2025-09-12T23:58:09.548528758Z" level=info msg="CreateContainer within sandbox \"eaaf952d1de8d62483e39d3768aac0024efd2f4f83101ab04a5c62c0de1ebc0c\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"abbef7e51bd2816753df987d8d6f931bc777560439662d5b63d21a49f0413415\"" Sep 12 23:58:09.549624 containerd[1466]: time="2025-09-12T23:58:09.549174707Z" level=info msg="StartContainer for \"abbef7e51bd2816753df987d8d6f931bc777560439662d5b63d21a49f0413415\"" Sep 12 23:58:09.591032 systemd[1]: Started cri-containerd-abbef7e51bd2816753df987d8d6f931bc777560439662d5b63d21a49f0413415.scope - libcontainer container abbef7e51bd2816753df987d8d6f931bc777560439662d5b63d21a49f0413415. Sep 12 23:58:09.632739 containerd[1466]: time="2025-09-12T23:58:09.632582132Z" level=info msg="StartContainer for \"abbef7e51bd2816753df987d8d6f931bc777560439662d5b63d21a49f0413415\" returns successfully" Sep 12 23:58:10.263268 containerd[1466]: time="2025-09-12T23:58:10.263207003Z" level=error msg="failed to reload cni configuration after receiving fs change event(WRITE \"/etc/cni/net.d/calico-kubeconfig\")" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Sep 12 23:58:10.266501 systemd[1]: cri-containerd-abbef7e51bd2816753df987d8d6f931bc777560439662d5b63d21a49f0413415.scope: Deactivated successfully. Sep 12 23:58:10.289248 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-abbef7e51bd2816753df987d8d6f931bc777560439662d5b63d21a49f0413415-rootfs.mount: Deactivated successfully. Sep 12 23:58:10.334995 kubelet[2585]: I0912 23:58:10.333352 2585 kubelet_node_status.go:501] "Fast updating node status as it just became ready" Sep 12 23:58:10.391182 systemd[1]: Created slice kubepods-besteffort-pod94d742ff_488d_49c2_b166_9848bac8f37e.slice - libcontainer container kubepods-besteffort-pod94d742ff_488d_49c2_b166_9848bac8f37e.slice. 
Sep 12 23:58:10.392991 containerd[1466]: time="2025-09-12T23:58:10.392270339Z" level=info msg="shim disconnected" id=abbef7e51bd2816753df987d8d6f931bc777560439662d5b63d21a49f0413415 namespace=k8s.io
Sep 12 23:58:10.392991 containerd[1466]: time="2025-09-12T23:58:10.392343582Z" level=warning msg="cleaning up after shim disconnected" id=abbef7e51bd2816753df987d8d6f931bc777560439662d5b63d21a49f0413415 namespace=k8s.io
Sep 12 23:58:10.392991 containerd[1466]: time="2025-09-12T23:58:10.392354422Z" level=info msg="cleaning up dead shim" namespace=k8s.io
Sep 12 23:58:10.401759 kubelet[2585]: W0912 23:58:10.400832 2585 reflector.go:569] object-"calico-apiserver"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:ci-4081-3-5-n-326e2e5946" cannot list resource "configmaps" in API group "" in the namespace "calico-apiserver": no relationship found between node 'ci-4081-3-5-n-326e2e5946' and this object
Sep 12 23:58:10.402124 kubelet[2585]: E0912 23:58:10.402008 2585 reflector.go:166] "Unhandled Error" err="object-\"calico-apiserver\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:ci-4081-3-5-n-326e2e5946\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"calico-apiserver\": no relationship found between node 'ci-4081-3-5-n-326e2e5946' and this object" logger="UnhandledError"
Sep 12 23:58:10.404225 kubelet[2585]: W0912 23:58:10.403907 2585 reflector.go:569] object-"kube-system"/"coredns": failed to list *v1.ConfigMap: configmaps "coredns" is forbidden: User "system:node:ci-4081-3-5-n-326e2e5946" cannot list resource "configmaps" in API group "" in the namespace "kube-system": no relationship found between node 'ci-4081-3-5-n-326e2e5946' and this object
Sep 12 23:58:10.405298 kubelet[2585]: E0912 23:58:10.404451 2585 reflector.go:166] "Unhandled Error" err="object-\"kube-system\"/\"coredns\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"coredns\" is forbidden: User \"system:node:ci-4081-3-5-n-326e2e5946\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"kube-system\": no relationship found between node 'ci-4081-3-5-n-326e2e5946' and this object" logger="UnhandledError"
Sep 12 23:58:10.409145 kubelet[2585]: W0912 23:58:10.408416 2585 reflector.go:569] object-"calico-apiserver"/"calico-apiserver-certs": failed to list *v1.Secret: secrets "calico-apiserver-certs" is forbidden: User "system:node:ci-4081-3-5-n-326e2e5946" cannot list resource "secrets" in API group "" in the namespace "calico-apiserver": no relationship found between node 'ci-4081-3-5-n-326e2e5946' and this object
Sep 12 23:58:10.412433 kubelet[2585]: E0912 23:58:10.409594 2585 reflector.go:166] "Unhandled Error" err="object-\"calico-apiserver\"/\"calico-apiserver-certs\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"calico-apiserver-certs\" is forbidden: User \"system:node:ci-4081-3-5-n-326e2e5946\" cannot list resource \"secrets\" in API group \"\" in the namespace \"calico-apiserver\": no relationship found between node 'ci-4081-3-5-n-326e2e5946' and this object" logger="UnhandledError"
Sep 12 23:58:10.418607 systemd[1]: Created slice kubepods-besteffort-pod66998075_6768_4629_a394_a7e649462c77.slice - libcontainer container kubepods-besteffort-pod66998075_6768_4629_a394_a7e649462c77.slice.
Sep 12 23:58:10.442090 systemd[1]: Created slice kubepods-burstable-poddc685a7f_f1eb_41f5_8dc9_3a23d11d38d9.slice - libcontainer container kubepods-burstable-poddc685a7f_f1eb_41f5_8dc9_3a23d11d38d9.slice.
Sep 12 23:58:10.455354 systemd[1]: Created slice kubepods-besteffort-pode2914591_bb15_4357_8b0d_6d29e5119ec7.slice - libcontainer container kubepods-besteffort-pode2914591_bb15_4357_8b0d_6d29e5119ec7.slice.
Sep 12 23:58:10.467870 systemd[1]: Created slice kubepods-burstable-pod62ed9b16_b2ad_4c43_931a_9ec3cc4358d1.slice - libcontainer container kubepods-burstable-pod62ed9b16_b2ad_4c43_931a_9ec3cc4358d1.slice.
Sep 12 23:58:10.479396 systemd[1]: Created slice kubepods-besteffort-pod49f008dc_1da3_46cf_ac99_9976edc0d9dc.slice - libcontainer container kubepods-besteffort-pod49f008dc_1da3_46cf_ac99_9976edc0d9dc.slice.
Sep 12 23:58:10.490261 kubelet[2585]: I0912 23:58:10.490051 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/dc685a7f-f1eb-41f5-8dc9-3a23d11d38d9-config-volume\") pod \"coredns-668d6bf9bc-hvkxc\" (UID: \"dc685a7f-f1eb-41f5-8dc9-3a23d11d38d9\") " pod="kube-system/coredns-668d6bf9bc-hvkxc"
Sep 12 23:58:10.490261 kubelet[2585]: I0912 23:58:10.490098 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/49f008dc-1da3-46cf-ac99-9976edc0d9dc-calico-apiserver-certs\") pod \"calico-apiserver-6c79dd94d7-qdfzm\" (UID: \"49f008dc-1da3-46cf-ac99-9976edc0d9dc\") " pod="calico-apiserver/calico-apiserver-6c79dd94d7-qdfzm"
Sep 12 23:58:10.490261 kubelet[2585]: I0912 23:58:10.490122 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gbfvz\" (UniqueName: \"kubernetes.io/projected/66998075-6768-4629-a394-a7e649462c77-kube-api-access-gbfvz\") pod \"calico-apiserver-6c79dd94d7-k7nnj\" (UID: \"66998075-6768-4629-a394-a7e649462c77\") " pod="calico-apiserver/calico-apiserver-6c79dd94d7-k7nnj"
Sep 12 23:58:10.490261 kubelet[2585]: I0912 23:58:10.490183 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/94d742ff-488d-49c2-b166-9848bac8f37e-tigera-ca-bundle\") pod \"calico-kube-controllers-6c4999d64-82t4c\" (UID: \"94d742ff-488d-49c2-b166-9848bac8f37e\") " pod="calico-system/calico-kube-controllers-6c4999d64-82t4c"
Sep 12 23:58:10.490261 kubelet[2585]: I0912 23:58:10.490208 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/66998075-6768-4629-a394-a7e649462c77-calico-apiserver-certs\") pod \"calico-apiserver-6c79dd94d7-k7nnj\" (UID: \"66998075-6768-4629-a394-a7e649462c77\") " pod="calico-apiserver/calico-apiserver-6c79dd94d7-k7nnj"
Sep 12 23:58:10.490490 kubelet[2585]: I0912 23:58:10.490237 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7hjjf\" (UniqueName: \"kubernetes.io/projected/62ed9b16-b2ad-4c43-931a-9ec3cc4358d1-kube-api-access-7hjjf\") pod \"coredns-668d6bf9bc-th8mg\" (UID: \"62ed9b16-b2ad-4c43-931a-9ec3cc4358d1\") " pod="kube-system/coredns-668d6bf9bc-th8mg"
Sep 12 23:58:10.490490 kubelet[2585]: I0912 23:58:10.490258 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mhdq\" (UniqueName: \"kubernetes.io/projected/dc685a7f-f1eb-41f5-8dc9-3a23d11d38d9-kube-api-access-8mhdq\") pod \"coredns-668d6bf9bc-hvkxc\" (UID: \"dc685a7f-f1eb-41f5-8dc9-3a23d11d38d9\") " pod="kube-system/coredns-668d6bf9bc-hvkxc"
Sep 12 23:58:10.490490 kubelet[2585]: I0912 23:58:10.490276 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/d77a8544-93ca-4785-a4a1-06bf0c577901-whisker-backend-key-pair\") pod \"whisker-7bbf9bf896-96lc9\" (UID: \"d77a8544-93ca-4785-a4a1-06bf0c577901\") " pod="calico-system/whisker-7bbf9bf896-96lc9"
Sep 12 23:58:10.490490 kubelet[2585]: I0912 23:58:10.490291 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d77a8544-93ca-4785-a4a1-06bf0c577901-whisker-ca-bundle\") pod \"whisker-7bbf9bf896-96lc9\" (UID: \"d77a8544-93ca-4785-a4a1-06bf0c577901\") " pod="calico-system/whisker-7bbf9bf896-96lc9"
Sep 12 23:58:10.490490 kubelet[2585]: I0912 23:58:10.490308 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x6hv4\" (UniqueName: \"kubernetes.io/projected/49f008dc-1da3-46cf-ac99-9976edc0d9dc-kube-api-access-x6hv4\") pod \"calico-apiserver-6c79dd94d7-qdfzm\" (UID: \"49f008dc-1da3-46cf-ac99-9976edc0d9dc\") " pod="calico-apiserver/calico-apiserver-6c79dd94d7-qdfzm"
Sep 12 23:58:10.490639 kubelet[2585]: I0912 23:58:10.490323 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/e2914591-bb15-4357-8b0d-6d29e5119ec7-goldmane-key-pair\") pod \"goldmane-54d579b49d-bjt98\" (UID: \"e2914591-bb15-4357-8b0d-6d29e5119ec7\") " pod="calico-system/goldmane-54d579b49d-bjt98"
Sep 12 23:58:10.490639 kubelet[2585]: I0912 23:58:10.490341 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hjdf4\" (UniqueName: \"kubernetes.io/projected/d77a8544-93ca-4785-a4a1-06bf0c577901-kube-api-access-hjdf4\") pod \"whisker-7bbf9bf896-96lc9\" (UID: \"d77a8544-93ca-4785-a4a1-06bf0c577901\") " pod="calico-system/whisker-7bbf9bf896-96lc9"
Sep 12 23:58:10.490639 kubelet[2585]: I0912 23:58:10.490358 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n6qkt\" (UniqueName: \"kubernetes.io/projected/94d742ff-488d-49c2-b166-9848bac8f37e-kube-api-access-n6qkt\") pod \"calico-kube-controllers-6c4999d64-82t4c\" (UID: \"94d742ff-488d-49c2-b166-9848bac8f37e\") " pod="calico-system/calico-kube-controllers-6c4999d64-82t4c"
Sep 12 23:58:10.490639 kubelet[2585]: I0912 23:58:10.490384 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wm9ll\" (UniqueName: \"kubernetes.io/projected/e2914591-bb15-4357-8b0d-6d29e5119ec7-kube-api-access-wm9ll\") pod \"goldmane-54d579b49d-bjt98\" (UID: \"e2914591-bb15-4357-8b0d-6d29e5119ec7\") " pod="calico-system/goldmane-54d579b49d-bjt98"
Sep 12 23:58:10.490639 kubelet[2585]: I0912 23:58:10.490399 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e2914591-bb15-4357-8b0d-6d29e5119ec7-config\") pod \"goldmane-54d579b49d-bjt98\" (UID: \"e2914591-bb15-4357-8b0d-6d29e5119ec7\") " pod="calico-system/goldmane-54d579b49d-bjt98"
Sep 12 23:58:10.490876 kubelet[2585]: I0912 23:58:10.490415 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e2914591-bb15-4357-8b0d-6d29e5119ec7-goldmane-ca-bundle\") pod \"goldmane-54d579b49d-bjt98\" (UID: \"e2914591-bb15-4357-8b0d-6d29e5119ec7\") " pod="calico-system/goldmane-54d579b49d-bjt98"
Sep 12 23:58:10.490876 kubelet[2585]: I0912 23:58:10.490431 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/62ed9b16-b2ad-4c43-931a-9ec3cc4358d1-config-volume\") pod \"coredns-668d6bf9bc-th8mg\" (UID: \"62ed9b16-b2ad-4c43-931a-9ec3cc4358d1\") " pod="kube-system/coredns-668d6bf9bc-th8mg"
Sep 12 23:58:10.492747 systemd[1]: Created slice kubepods-besteffort-podd77a8544_93ca_4785_a4a1_06bf0c577901.slice - libcontainer container kubepods-besteffort-podd77a8544_93ca_4785_a4a1_06bf0c577901.slice.
Sep 12 23:58:10.647071 systemd[1]: Created slice kubepods-besteffort-poded6d6d1f_7445_4602_8d9e_b0dd54215b8f.slice - libcontainer container kubepods-besteffort-poded6d6d1f_7445_4602_8d9e_b0dd54215b8f.slice.
Sep 12 23:58:10.661418 containerd[1466]: time="2025-09-12T23:58:10.660936124Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-vdhzb,Uid:ed6d6d1f-7445-4602-8d9e-b0dd54215b8f,Namespace:calico-system,Attempt:0,}"
Sep 12 23:58:10.707238 containerd[1466]: time="2025-09-12T23:58:10.706017475Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-6c4999d64-82t4c,Uid:94d742ff-488d-49c2-b166-9848bac8f37e,Namespace:calico-system,Attempt:0,}"
Sep 12 23:58:10.761603 containerd[1466]: time="2025-09-12T23:58:10.761530456Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-bjt98,Uid:e2914591-bb15-4357-8b0d-6d29e5119ec7,Namespace:calico-system,Attempt:0,}"
Sep 12 23:58:10.766261 containerd[1466]: time="2025-09-12T23:58:10.766169425Z" level=error msg="Failed to destroy network for sandbox \"c3dad0ac9ee4c998847493e6792a88312db0dbe2f675e68f2090ab9dd9714fa2\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 12 23:58:10.767354 containerd[1466]: time="2025-09-12T23:58:10.767055785Z" level=error msg="encountered an error cleaning up failed sandbox \"c3dad0ac9ee4c998847493e6792a88312db0dbe2f675e68f2090ab9dd9714fa2\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 12 23:58:10.767354 containerd[1466]: time="2025-09-12T23:58:10.767332398Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-vdhzb,Uid:ed6d6d1f-7445-4602-8d9e-b0dd54215b8f,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"c3dad0ac9ee4c998847493e6792a88312db0dbe2f675e68f2090ab9dd9714fa2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 12 23:58:10.767702 kubelet[2585]: E0912 23:58:10.767661 2585 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c3dad0ac9ee4c998847493e6792a88312db0dbe2f675e68f2090ab9dd9714fa2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 12 23:58:10.767827 kubelet[2585]: E0912 23:58:10.767808 2585 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c3dad0ac9ee4c998847493e6792a88312db0dbe2f675e68f2090ab9dd9714fa2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-vdhzb"
Sep 12 23:58:10.767856 kubelet[2585]: E0912 23:58:10.767836 2585 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c3dad0ac9ee4c998847493e6792a88312db0dbe2f675e68f2090ab9dd9714fa2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-vdhzb"
Sep 12 23:58:10.767921 kubelet[2585]: E0912 23:58:10.767899 2585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-vdhzb_calico-system(ed6d6d1f-7445-4602-8d9e-b0dd54215b8f)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-vdhzb_calico-system(ed6d6d1f-7445-4602-8d9e-b0dd54215b8f)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"c3dad0ac9ee4c998847493e6792a88312db0dbe2f675e68f2090ab9dd9714fa2\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-vdhzb" podUID="ed6d6d1f-7445-4602-8d9e-b0dd54215b8f"
Sep 12 23:58:10.802348 containerd[1466]: time="2025-09-12T23:58:10.801135561Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-7bbf9bf896-96lc9,Uid:d77a8544-93ca-4785-a4a1-06bf0c577901,Namespace:calico-system,Attempt:0,}"
Sep 12 23:58:10.805735 containerd[1466]: time="2025-09-12T23:58:10.804447110Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.3\""
Sep 12 23:58:10.806516 kubelet[2585]: I0912 23:58:10.806347 2585 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c3dad0ac9ee4c998847493e6792a88312db0dbe2f675e68f2090ab9dd9714fa2"
Sep 12 23:58:10.809227 containerd[1466]: time="2025-09-12T23:58:10.809175083Z" level=info msg="StopPodSandbox for \"c3dad0ac9ee4c998847493e6792a88312db0dbe2f675e68f2090ab9dd9714fa2\""
Sep 12 23:58:10.809592 containerd[1466]: time="2025-09-12T23:58:10.809387533Z" level=info msg="Ensure that sandbox c3dad0ac9ee4c998847493e6792a88312db0dbe2f675e68f2090ab9dd9714fa2 in task-service has been cleanup successfully"
Sep 12 23:58:10.831033 containerd[1466]: time="2025-09-12T23:58:10.830978145Z" level=error msg="Failed to destroy network for sandbox \"7e321eb757593a56c9ed12f777eae66d167b04d882e2b07864e347691e3c3f98\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 12 23:58:10.835760 containerd[1466]: time="2025-09-12T23:58:10.835517790Z" level=error msg="encountered an error cleaning up failed sandbox \"7e321eb757593a56c9ed12f777eae66d167b04d882e2b07864e347691e3c3f98\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 12 23:58:10.835760 containerd[1466]: time="2025-09-12T23:58:10.835749440Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-6c4999d64-82t4c,Uid:94d742ff-488d-49c2-b166-9848bac8f37e,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"7e321eb757593a56c9ed12f777eae66d167b04d882e2b07864e347691e3c3f98\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 12 23:58:10.836953 kubelet[2585]: E0912 23:58:10.836811 2585 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7e321eb757593a56c9ed12f777eae66d167b04d882e2b07864e347691e3c3f98\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 12 23:58:10.836953 kubelet[2585]: E0912 23:58:10.836877 2585 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7e321eb757593a56c9ed12f777eae66d167b04d882e2b07864e347691e3c3f98\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-6c4999d64-82t4c"
Sep 12 23:58:10.836953 kubelet[2585]: E0912 23:58:10.836910 2585 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7e321eb757593a56c9ed12f777eae66d167b04d882e2b07864e347691e3c3f98\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-6c4999d64-82t4c"
Sep 12 23:58:10.837446 kubelet[2585]: E0912 23:58:10.836955 2585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-6c4999d64-82t4c_calico-system(94d742ff-488d-49c2-b166-9848bac8f37e)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-6c4999d64-82t4c_calico-system(94d742ff-488d-49c2-b166-9848bac8f37e)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"7e321eb757593a56c9ed12f777eae66d167b04d882e2b07864e347691e3c3f98\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-6c4999d64-82t4c" podUID="94d742ff-488d-49c2-b166-9848bac8f37e"
Sep 12 23:58:10.892630 containerd[1466]: time="2025-09-12T23:58:10.892091099Z" level=error msg="StopPodSandbox for \"c3dad0ac9ee4c998847493e6792a88312db0dbe2f675e68f2090ab9dd9714fa2\" failed" error="failed to destroy network for sandbox \"c3dad0ac9ee4c998847493e6792a88312db0dbe2f675e68f2090ab9dd9714fa2\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 12 23:58:10.892804 kubelet[2585]: E0912 23:58:10.892436 2585 log.go:32] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"c3dad0ac9ee4c998847493e6792a88312db0dbe2f675e68f2090ab9dd9714fa2\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="c3dad0ac9ee4c998847493e6792a88312db0dbe2f675e68f2090ab9dd9714fa2"
Sep 12 23:58:10.892804 kubelet[2585]: E0912 23:58:10.892508 2585 kuberuntime_manager.go:1546] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"c3dad0ac9ee4c998847493e6792a88312db0dbe2f675e68f2090ab9dd9714fa2"}
Sep 12 23:58:10.892804 kubelet[2585]: E0912 23:58:10.892592 2585 kuberuntime_manager.go:1146] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"ed6d6d1f-7445-4602-8d9e-b0dd54215b8f\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"c3dad0ac9ee4c998847493e6792a88312db0dbe2f675e68f2090ab9dd9714fa2\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\""
Sep 12 23:58:10.892804 kubelet[2585]: E0912 23:58:10.892616 2585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"ed6d6d1f-7445-4602-8d9e-b0dd54215b8f\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"c3dad0ac9ee4c998847493e6792a88312db0dbe2f675e68f2090ab9dd9714fa2\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-vdhzb" podUID="ed6d6d1f-7445-4602-8d9e-b0dd54215b8f"
Sep 12 23:58:10.910637 containerd[1466]: time="2025-09-12T23:58:10.910437845Z" level=error msg="Failed to destroy network for sandbox \"282c32aff9fb2fe91803704b8dfa8fe772bfc540faa928a4e1c8e6d09d4264d0\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 12 23:58:10.911208 containerd[1466]: time="2025-09-12T23:58:10.911047873Z" level=error msg="encountered an error cleaning up failed sandbox \"282c32aff9fb2fe91803704b8dfa8fe772bfc540faa928a4e1c8e6d09d4264d0\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 12 23:58:10.911208 containerd[1466]: time="2025-09-12T23:58:10.911101915Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-bjt98,Uid:e2914591-bb15-4357-8b0d-6d29e5119ec7,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"282c32aff9fb2fe91803704b8dfa8fe772bfc540faa928a4e1c8e6d09d4264d0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 12 23:58:10.911512 kubelet[2585]: E0912 23:58:10.911477 2585 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"282c32aff9fb2fe91803704b8dfa8fe772bfc540faa928a4e1c8e6d09d4264d0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 12 23:58:10.911667 kubelet[2585]: E0912 23:58:10.911647 2585 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"282c32aff9fb2fe91803704b8dfa8fe772bfc540faa928a4e1c8e6d09d4264d0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-54d579b49d-bjt98"
Sep 12 23:58:10.911867 kubelet[2585]: E0912 23:58:10.911773 2585 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"282c32aff9fb2fe91803704b8dfa8fe772bfc540faa928a4e1c8e6d09d4264d0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-54d579b49d-bjt98"
Sep 12 23:58:10.911867 kubelet[2585]: E0912 23:58:10.911828 2585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-54d579b49d-bjt98_calico-system(e2914591-bb15-4357-8b0d-6d29e5119ec7)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-54d579b49d-bjt98_calico-system(e2914591-bb15-4357-8b0d-6d29e5119ec7)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"282c32aff9fb2fe91803704b8dfa8fe772bfc540faa928a4e1c8e6d09d4264d0\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-54d579b49d-bjt98" podUID="e2914591-bb15-4357-8b0d-6d29e5119ec7"
Sep 12 23:58:10.934979 containerd[1466]: time="2025-09-12T23:58:10.934840905Z" level=error msg="Failed to destroy network for sandbox \"2d0a7e0dfa539a0b63dc73f5bb6fcdf7eef5bcd276b776fe1f97bf8253a16573\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 12 23:58:10.935525 containerd[1466]: time="2025-09-12T23:58:10.935352168Z" level=error msg="encountered an error cleaning up failed sandbox \"2d0a7e0dfa539a0b63dc73f5bb6fcdf7eef5bcd276b776fe1f97bf8253a16573\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 12 23:58:10.935525 containerd[1466]: time="2025-09-12T23:58:10.935412251Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-7bbf9bf896-96lc9,Uid:d77a8544-93ca-4785-a4a1-06bf0c577901,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"2d0a7e0dfa539a0b63dc73f5bb6fcdf7eef5bcd276b776fe1f97bf8253a16573\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 12 23:58:10.936001 kubelet[2585]: E0912 23:58:10.935868 2585 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2d0a7e0dfa539a0b63dc73f5bb6fcdf7eef5bcd276b776fe1f97bf8253a16573\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 12 23:58:10.936001 kubelet[2585]: E0912 23:58:10.935929 2585 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2d0a7e0dfa539a0b63dc73f5bb6fcdf7eef5bcd276b776fe1f97bf8253a16573\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-7bbf9bf896-96lc9"
Sep 12 23:58:10.936001 kubelet[2585]: E0912 23:58:10.935949 2585 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2d0a7e0dfa539a0b63dc73f5bb6fcdf7eef5bcd276b776fe1f97bf8253a16573\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-7bbf9bf896-96lc9"
Sep 12 23:58:10.936629 kubelet[2585]: E0912 23:58:10.936084 2585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-7bbf9bf896-96lc9_calico-system(d77a8544-93ca-4785-a4a1-06bf0c577901)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-7bbf9bf896-96lc9_calico-system(d77a8544-93ca-4785-a4a1-06bf0c577901)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"2d0a7e0dfa539a0b63dc73f5bb6fcdf7eef5bcd276b776fe1f97bf8253a16573\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-7bbf9bf896-96lc9" podUID="d77a8544-93ca-4785-a4a1-06bf0c577901"
Sep 12 23:58:11.593044 kubelet[2585]: E0912 23:58:11.592557 2585 configmap.go:193] Couldn't get configMap kube-system/coredns: failed to sync configmap cache: timed out waiting for the condition
Sep 12 23:58:11.593044 kubelet[2585]: E0912 23:58:11.592689 2585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/62ed9b16-b2ad-4c43-931a-9ec3cc4358d1-config-volume podName:62ed9b16-b2ad-4c43-931a-9ec3cc4358d1 nodeName:}" failed. No retries permitted until 2025-09-12 23:58:12.092658942 +0000 UTC m=+34.571892069 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-volume" (UniqueName: "kubernetes.io/configmap/62ed9b16-b2ad-4c43-931a-9ec3cc4358d1-config-volume") pod "coredns-668d6bf9bc-th8mg" (UID: "62ed9b16-b2ad-4c43-931a-9ec3cc4358d1") : failed to sync configmap cache: timed out waiting for the condition
Sep 12 23:58:11.593044 kubelet[2585]: E0912 23:58:11.592963 2585 configmap.go:193] Couldn't get configMap kube-system/coredns: failed to sync configmap cache: timed out waiting for the condition
Sep 12 23:58:11.593044 kubelet[2585]: E0912 23:58:11.593000 2585 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/dc685a7f-f1eb-41f5-8dc9-3a23d11d38d9-config-volume podName:dc685a7f-f1eb-41f5-8dc9-3a23d11d38d9 nodeName:}" failed. No retries permitted until 2025-09-12 23:58:12.092988357 +0000 UTC m=+34.572221524 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-volume" (UniqueName: "kubernetes.io/configmap/dc685a7f-f1eb-41f5-8dc9-3a23d11d38d9-config-volume") pod "coredns-668d6bf9bc-hvkxc" (UID: "dc685a7f-f1eb-41f5-8dc9-3a23d11d38d9") : failed to sync configmap cache: timed out waiting for the condition
Sep 12 23:58:11.632680 containerd[1466]: time="2025-09-12T23:58:11.631983125Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6c79dd94d7-k7nnj,Uid:66998075-6768-4629-a394-a7e649462c77,Namespace:calico-apiserver,Attempt:0,}"
Sep 12 23:58:11.686829 containerd[1466]: time="2025-09-12T23:58:11.686780753Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6c79dd94d7-qdfzm,Uid:49f008dc-1da3-46cf-ac99-9976edc0d9dc,Namespace:calico-apiserver,Attempt:0,}"
Sep 12 23:58:11.730927 containerd[1466]: time="2025-09-12T23:58:11.730863546Z" level=error msg="Failed to destroy network for sandbox \"3b93add247b5e858d9d88970724082df69dd56fa1aca05988269da465b1395a8\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 12 23:58:11.731757 containerd[1466]: time="2025-09-12T23:58:11.731652021Z" level=error msg="encountered an error cleaning up failed sandbox \"3b93add247b5e858d9d88970724082df69dd56fa1aca05988269da465b1395a8\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 12 23:58:11.735818 containerd[1466]: time="2025-09-12T23:58:11.731810188Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6c79dd94d7-k7nnj,Uid:66998075-6768-4629-a394-a7e649462c77,Namespace:calico-apiserver,Attempt:0,} failed, error" error="failed to setup network for sandbox \"3b93add247b5e858d9d88970724082df69dd56fa1aca05988269da465b1395a8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 12 23:58:11.735982 kubelet[2585]: E0912 23:58:11.732123 2585 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3b93add247b5e858d9d88970724082df69dd56fa1aca05988269da465b1395a8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 12 23:58:11.735982 kubelet[2585]: E0912 23:58:11.732184 2585 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3b93add247b5e858d9d88970724082df69dd56fa1aca05988269da465b1395a8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-6c79dd94d7-k7nnj"
Sep 12 23:58:11.735982 kubelet[2585]: E0912 23:58:11.732204 2585 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3b93add247b5e858d9d88970724082df69dd56fa1aca05988269da465b1395a8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-6c79dd94d7-k7nnj"
Sep 12 23:58:11.736181 kubelet[2585]: E0912 23:58:11.732248 2585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-6c79dd94d7-k7nnj_calico-apiserver(66998075-6768-4629-a394-a7e649462c77)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-6c79dd94d7-k7nnj_calico-apiserver(66998075-6768-4629-a394-a7e649462c77)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"3b93add247b5e858d9d88970724082df69dd56fa1aca05988269da465b1395a8\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-6c79dd94d7-k7nnj" podUID="66998075-6768-4629-a394-a7e649462c77"
Sep 12 23:58:11.736731 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-3b93add247b5e858d9d88970724082df69dd56fa1aca05988269da465b1395a8-shm.mount: Deactivated successfully.
Sep 12 23:58:11.785461 containerd[1466]: time="2025-09-12T23:58:11.785346640Z" level=error msg="Failed to destroy network for sandbox \"b9df369dd46643a45a5c1476773d2818afe85e6d27637181179c4b15ffbb6933\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 23:58:11.785779 containerd[1466]: time="2025-09-12T23:58:11.785746298Z" level=error msg="encountered an error cleaning up failed sandbox \"b9df369dd46643a45a5c1476773d2818afe85e6d27637181179c4b15ffbb6933\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 23:58:11.785850 containerd[1466]: time="2025-09-12T23:58:11.785824701Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6c79dd94d7-qdfzm,Uid:49f008dc-1da3-46cf-ac99-9976edc0d9dc,Namespace:calico-apiserver,Attempt:0,} failed, error" error="failed to setup network for sandbox \"b9df369dd46643a45a5c1476773d2818afe85e6d27637181179c4b15ffbb6933\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 23:58:11.786114 kubelet[2585]: E0912 23:58:11.786058 2585 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b9df369dd46643a45a5c1476773d2818afe85e6d27637181179c4b15ffbb6933\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 23:58:11.786177 kubelet[2585]: E0912 23:58:11.786134 2585 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = 
failed to setup network for sandbox \"b9df369dd46643a45a5c1476773d2818afe85e6d27637181179c4b15ffbb6933\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-6c79dd94d7-qdfzm" Sep 12 23:58:11.786177 kubelet[2585]: E0912 23:58:11.786154 2585 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b9df369dd46643a45a5c1476773d2818afe85e6d27637181179c4b15ffbb6933\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-6c79dd94d7-qdfzm" Sep 12 23:58:11.786232 kubelet[2585]: E0912 23:58:11.786198 2585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-6c79dd94d7-qdfzm_calico-apiserver(49f008dc-1da3-46cf-ac99-9976edc0d9dc)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-6c79dd94d7-qdfzm_calico-apiserver(49f008dc-1da3-46cf-ac99-9976edc0d9dc)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"b9df369dd46643a45a5c1476773d2818afe85e6d27637181179c4b15ffbb6933\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-6c79dd94d7-qdfzm" podUID="49f008dc-1da3-46cf-ac99-9976edc0d9dc" Sep 12 23:58:11.790359 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-b9df369dd46643a45a5c1476773d2818afe85e6d27637181179c4b15ffbb6933-shm.mount: Deactivated successfully. 
Sep 12 23:58:11.810642 kubelet[2585]: I0912 23:58:11.810561 2585 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3b93add247b5e858d9d88970724082df69dd56fa1aca05988269da465b1395a8" Sep 12 23:58:11.814027 containerd[1466]: time="2025-09-12T23:58:11.813829222Z" level=info msg="StopPodSandbox for \"3b93add247b5e858d9d88970724082df69dd56fa1aca05988269da465b1395a8\"" Sep 12 23:58:11.814969 containerd[1466]: time="2025-09-12T23:58:11.814914270Z" level=info msg="Ensure that sandbox 3b93add247b5e858d9d88970724082df69dd56fa1aca05988269da465b1395a8 in task-service has been cleanup successfully" Sep 12 23:58:11.815471 kubelet[2585]: I0912 23:58:11.815346 2585 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="282c32aff9fb2fe91803704b8dfa8fe772bfc540faa928a4e1c8e6d09d4264d0" Sep 12 23:58:11.819729 containerd[1466]: time="2025-09-12T23:58:11.819641600Z" level=info msg="StopPodSandbox for \"282c32aff9fb2fe91803704b8dfa8fe772bfc540faa928a4e1c8e6d09d4264d0\"" Sep 12 23:58:11.820161 containerd[1466]: time="2025-09-12T23:58:11.820004416Z" level=info msg="Ensure that sandbox 282c32aff9fb2fe91803704b8dfa8fe772bfc540faa928a4e1c8e6d09d4264d0 in task-service has been cleanup successfully" Sep 12 23:58:11.823523 kubelet[2585]: I0912 23:58:11.823366 2585 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7e321eb757593a56c9ed12f777eae66d167b04d882e2b07864e347691e3c3f98" Sep 12 23:58:11.825134 containerd[1466]: time="2025-09-12T23:58:11.825057440Z" level=info msg="StopPodSandbox for \"7e321eb757593a56c9ed12f777eae66d167b04d882e2b07864e347691e3c3f98\"" Sep 12 23:58:11.826819 containerd[1466]: time="2025-09-12T23:58:11.826645710Z" level=info msg="Ensure that sandbox 7e321eb757593a56c9ed12f777eae66d167b04d882e2b07864e347691e3c3f98 in task-service has been cleanup successfully" Sep 12 23:58:11.830945 kubelet[2585]: I0912 23:58:11.830905 2585 pod_container_deletor.go:80] "Container not found in 
pod's containers" containerID="b9df369dd46643a45a5c1476773d2818afe85e6d27637181179c4b15ffbb6933" Sep 12 23:58:11.832990 containerd[1466]: time="2025-09-12T23:58:11.832455488Z" level=info msg="StopPodSandbox for \"b9df369dd46643a45a5c1476773d2818afe85e6d27637181179c4b15ffbb6933\"" Sep 12 23:58:11.832990 containerd[1466]: time="2025-09-12T23:58:11.832685178Z" level=info msg="Ensure that sandbox b9df369dd46643a45a5c1476773d2818afe85e6d27637181179c4b15ffbb6933 in task-service has been cleanup successfully" Sep 12 23:58:11.837111 kubelet[2585]: I0912 23:58:11.837051 2585 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2d0a7e0dfa539a0b63dc73f5bb6fcdf7eef5bcd276b776fe1f97bf8253a16573" Sep 12 23:58:11.837761 containerd[1466]: time="2025-09-12T23:58:11.837680399Z" level=info msg="StopPodSandbox for \"2d0a7e0dfa539a0b63dc73f5bb6fcdf7eef5bcd276b776fe1f97bf8253a16573\"" Sep 12 23:58:11.838893 containerd[1466]: time="2025-09-12T23:58:11.838740566Z" level=info msg="Ensure that sandbox 2d0a7e0dfa539a0b63dc73f5bb6fcdf7eef5bcd276b776fe1f97bf8253a16573 in task-service has been cleanup successfully" Sep 12 23:58:11.894392 containerd[1466]: time="2025-09-12T23:58:11.894254586Z" level=error msg="StopPodSandbox for \"3b93add247b5e858d9d88970724082df69dd56fa1aca05988269da465b1395a8\" failed" error="failed to destroy network for sandbox \"3b93add247b5e858d9d88970724082df69dd56fa1aca05988269da465b1395a8\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 23:58:11.897689 kubelet[2585]: E0912 23:58:11.897326 2585 log.go:32] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"3b93add247b5e858d9d88970724082df69dd56fa1aca05988269da465b1395a8\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that 
the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="3b93add247b5e858d9d88970724082df69dd56fa1aca05988269da465b1395a8" Sep 12 23:58:11.897689 kubelet[2585]: E0912 23:58:11.897381 2585 kuberuntime_manager.go:1546] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"3b93add247b5e858d9d88970724082df69dd56fa1aca05988269da465b1395a8"} Sep 12 23:58:11.897689 kubelet[2585]: E0912 23:58:11.897484 2585 kuberuntime_manager.go:1146] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"66998075-6768-4629-a394-a7e649462c77\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"3b93add247b5e858d9d88970724082df69dd56fa1aca05988269da465b1395a8\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Sep 12 23:58:11.897689 kubelet[2585]: E0912 23:58:11.897513 2585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"66998075-6768-4629-a394-a7e649462c77\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"3b93add247b5e858d9d88970724082df69dd56fa1aca05988269da465b1395a8\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-6c79dd94d7-k7nnj" podUID="66998075-6768-4629-a394-a7e649462c77" Sep 12 23:58:11.914028 containerd[1466]: time="2025-09-12T23:58:11.913965019Z" level=error msg="StopPodSandbox for \"7e321eb757593a56c9ed12f777eae66d167b04d882e2b07864e347691e3c3f98\" failed" error="failed to destroy network for sandbox \"7e321eb757593a56c9ed12f777eae66d167b04d882e2b07864e347691e3c3f98\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or 
directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 23:58:11.914396 kubelet[2585]: E0912 23:58:11.914360 2585 log.go:32] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"7e321eb757593a56c9ed12f777eae66d167b04d882e2b07864e347691e3c3f98\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="7e321eb757593a56c9ed12f777eae66d167b04d882e2b07864e347691e3c3f98" Sep 12 23:58:11.914396 kubelet[2585]: E0912 23:58:11.914412 2585 kuberuntime_manager.go:1546] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"7e321eb757593a56c9ed12f777eae66d167b04d882e2b07864e347691e3c3f98"} Sep 12 23:58:11.914555 kubelet[2585]: E0912 23:58:11.914446 2585 kuberuntime_manager.go:1146] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"94d742ff-488d-49c2-b166-9848bac8f37e\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"7e321eb757593a56c9ed12f777eae66d167b04d882e2b07864e347691e3c3f98\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Sep 12 23:58:11.914555 kubelet[2585]: E0912 23:58:11.914471 2585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"94d742ff-488d-49c2-b166-9848bac8f37e\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"7e321eb757593a56c9ed12f777eae66d167b04d882e2b07864e347691e3c3f98\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" 
pod="calico-system/calico-kube-controllers-6c4999d64-82t4c" podUID="94d742ff-488d-49c2-b166-9848bac8f37e" Sep 12 23:58:11.914989 containerd[1466]: time="2025-09-12T23:58:11.914946263Z" level=error msg="StopPodSandbox for \"282c32aff9fb2fe91803704b8dfa8fe772bfc540faa928a4e1c8e6d09d4264d0\" failed" error="failed to destroy network for sandbox \"282c32aff9fb2fe91803704b8dfa8fe772bfc540faa928a4e1c8e6d09d4264d0\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 23:58:11.915475 kubelet[2585]: E0912 23:58:11.915436 2585 log.go:32] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"282c32aff9fb2fe91803704b8dfa8fe772bfc540faa928a4e1c8e6d09d4264d0\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="282c32aff9fb2fe91803704b8dfa8fe772bfc540faa928a4e1c8e6d09d4264d0" Sep 12 23:58:11.915542 kubelet[2585]: E0912 23:58:11.915487 2585 kuberuntime_manager.go:1546] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"282c32aff9fb2fe91803704b8dfa8fe772bfc540faa928a4e1c8e6d09d4264d0"} Sep 12 23:58:11.915542 kubelet[2585]: E0912 23:58:11.915513 2585 kuberuntime_manager.go:1146] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"e2914591-bb15-4357-8b0d-6d29e5119ec7\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"282c32aff9fb2fe91803704b8dfa8fe772bfc540faa928a4e1c8e6d09d4264d0\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Sep 12 23:58:11.915542 kubelet[2585]: E0912 23:58:11.915532 2585 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"e2914591-bb15-4357-8b0d-6d29e5119ec7\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"282c32aff9fb2fe91803704b8dfa8fe772bfc540faa928a4e1c8e6d09d4264d0\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-54d579b49d-bjt98" podUID="e2914591-bb15-4357-8b0d-6d29e5119ec7" Sep 12 23:58:11.922317 containerd[1466]: time="2025-09-12T23:58:11.922257907Z" level=error msg="StopPodSandbox for \"2d0a7e0dfa539a0b63dc73f5bb6fcdf7eef5bcd276b776fe1f97bf8253a16573\" failed" error="failed to destroy network for sandbox \"2d0a7e0dfa539a0b63dc73f5bb6fcdf7eef5bcd276b776fe1f97bf8253a16573\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 23:58:11.922568 kubelet[2585]: E0912 23:58:11.922531 2585 log.go:32] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"2d0a7e0dfa539a0b63dc73f5bb6fcdf7eef5bcd276b776fe1f97bf8253a16573\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2d0a7e0dfa539a0b63dc73f5bb6fcdf7eef5bcd276b776fe1f97bf8253a16573" Sep 12 23:58:11.922646 kubelet[2585]: E0912 23:58:11.922602 2585 kuberuntime_manager.go:1546] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"2d0a7e0dfa539a0b63dc73f5bb6fcdf7eef5bcd276b776fe1f97bf8253a16573"} Sep 12 23:58:11.922673 kubelet[2585]: E0912 23:58:11.922653 2585 kuberuntime_manager.go:1146] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for 
\"d77a8544-93ca-4785-a4a1-06bf0c577901\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2d0a7e0dfa539a0b63dc73f5bb6fcdf7eef5bcd276b776fe1f97bf8253a16573\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Sep 12 23:58:11.922844 kubelet[2585]: E0912 23:58:11.922693 2585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"d77a8544-93ca-4785-a4a1-06bf0c577901\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2d0a7e0dfa539a0b63dc73f5bb6fcdf7eef5bcd276b776fe1f97bf8253a16573\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-7bbf9bf896-96lc9" podUID="d77a8544-93ca-4785-a4a1-06bf0c577901" Sep 12 23:58:11.924828 containerd[1466]: time="2025-09-12T23:58:11.924675454Z" level=error msg="StopPodSandbox for \"b9df369dd46643a45a5c1476773d2818afe85e6d27637181179c4b15ffbb6933\" failed" error="failed to destroy network for sandbox \"b9df369dd46643a45a5c1476773d2818afe85e6d27637181179c4b15ffbb6933\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 23:58:11.924994 kubelet[2585]: E0912 23:58:11.924947 2585 log.go:32] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"b9df369dd46643a45a5c1476773d2818afe85e6d27637181179c4b15ffbb6933\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
podSandboxID="b9df369dd46643a45a5c1476773d2818afe85e6d27637181179c4b15ffbb6933" Sep 12 23:58:11.925037 kubelet[2585]: E0912 23:58:11.924999 2585 kuberuntime_manager.go:1546] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"b9df369dd46643a45a5c1476773d2818afe85e6d27637181179c4b15ffbb6933"} Sep 12 23:58:11.925081 kubelet[2585]: E0912 23:58:11.925037 2585 kuberuntime_manager.go:1146] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"49f008dc-1da3-46cf-ac99-9976edc0d9dc\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"b9df369dd46643a45a5c1476773d2818afe85e6d27637181179c4b15ffbb6933\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Sep 12 23:58:11.925081 kubelet[2585]: E0912 23:58:11.925057 2585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"49f008dc-1da3-46cf-ac99-9976edc0d9dc\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"b9df369dd46643a45a5c1476773d2818afe85e6d27637181179c4b15ffbb6933\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-6c79dd94d7-qdfzm" podUID="49f008dc-1da3-46cf-ac99-9976edc0d9dc" Sep 12 23:58:12.250078 containerd[1466]: time="2025-09-12T23:58:12.250020375Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-hvkxc,Uid:dc685a7f-f1eb-41f5-8dc9-3a23d11d38d9,Namespace:kube-system,Attempt:0,}" Sep 12 23:58:12.277730 containerd[1466]: time="2025-09-12T23:58:12.277663941Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:coredns-668d6bf9bc-th8mg,Uid:62ed9b16-b2ad-4c43-931a-9ec3cc4358d1,Namespace:kube-system,Attempt:0,}" Sep 12 23:58:12.371265 containerd[1466]: time="2025-09-12T23:58:12.371089975Z" level=error msg="Failed to destroy network for sandbox \"348f3835785bdb15cdb5abb8b74a97bc344aad17949ebf75bb73ba1d63b96b70\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 23:58:12.371668 containerd[1466]: time="2025-09-12T23:58:12.371635879Z" level=error msg="encountered an error cleaning up failed sandbox \"348f3835785bdb15cdb5abb8b74a97bc344aad17949ebf75bb73ba1d63b96b70\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 23:58:12.371746 containerd[1466]: time="2025-09-12T23:58:12.371697321Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-hvkxc,Uid:dc685a7f-f1eb-41f5-8dc9-3a23d11d38d9,Namespace:kube-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"348f3835785bdb15cdb5abb8b74a97bc344aad17949ebf75bb73ba1d63b96b70\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 23:58:12.373533 kubelet[2585]: E0912 23:58:12.373003 2585 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"348f3835785bdb15cdb5abb8b74a97bc344aad17949ebf75bb73ba1d63b96b70\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 23:58:12.373533 kubelet[2585]: E0912 
23:58:12.373072 2585 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"348f3835785bdb15cdb5abb8b74a97bc344aad17949ebf75bb73ba1d63b96b70\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-hvkxc" Sep 12 23:58:12.373533 kubelet[2585]: E0912 23:58:12.373097 2585 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"348f3835785bdb15cdb5abb8b74a97bc344aad17949ebf75bb73ba1d63b96b70\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-hvkxc" Sep 12 23:58:12.373943 kubelet[2585]: E0912 23:58:12.373162 2585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-668d6bf9bc-hvkxc_kube-system(dc685a7f-f1eb-41f5-8dc9-3a23d11d38d9)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-668d6bf9bc-hvkxc_kube-system(dc685a7f-f1eb-41f5-8dc9-3a23d11d38d9)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"348f3835785bdb15cdb5abb8b74a97bc344aad17949ebf75bb73ba1d63b96b70\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-hvkxc" podUID="dc685a7f-f1eb-41f5-8dc9-3a23d11d38d9" Sep 12 23:58:12.392176 containerd[1466]: time="2025-09-12T23:58:12.392022168Z" level=error msg="Failed to destroy network for sandbox \"5b2dbe2aea561ebb825363ab4fd4162e4976fcfbef7dac50de1e5877f15bbce8\"" error="plugin type=\"calico\" failed (delete): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 23:58:12.394389 containerd[1466]: time="2025-09-12T23:58:12.394239144Z" level=error msg="encountered an error cleaning up failed sandbox \"5b2dbe2aea561ebb825363ab4fd4162e4976fcfbef7dac50de1e5877f15bbce8\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 23:58:12.394389 containerd[1466]: time="2025-09-12T23:58:12.394335988Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-th8mg,Uid:62ed9b16-b2ad-4c43-931a-9ec3cc4358d1,Namespace:kube-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"5b2dbe2aea561ebb825363ab4fd4162e4976fcfbef7dac50de1e5877f15bbce8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 23:58:12.396908 kubelet[2585]: E0912 23:58:12.395131 2585 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5b2dbe2aea561ebb825363ab4fd4162e4976fcfbef7dac50de1e5877f15bbce8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 23:58:12.396908 kubelet[2585]: E0912 23:58:12.395207 2585 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5b2dbe2aea561ebb825363ab4fd4162e4976fcfbef7dac50de1e5877f15bbce8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted 
/var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-th8mg" Sep 12 23:58:12.396908 kubelet[2585]: E0912 23:58:12.395228 2585 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5b2dbe2aea561ebb825363ab4fd4162e4976fcfbef7dac50de1e5877f15bbce8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-th8mg" Sep 12 23:58:12.397098 kubelet[2585]: E0912 23:58:12.395276 2585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-668d6bf9bc-th8mg_kube-system(62ed9b16-b2ad-4c43-931a-9ec3cc4358d1)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-668d6bf9bc-th8mg_kube-system(62ed9b16-b2ad-4c43-931a-9ec3cc4358d1)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"5b2dbe2aea561ebb825363ab4fd4162e4976fcfbef7dac50de1e5877f15bbce8\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-th8mg" podUID="62ed9b16-b2ad-4c43-931a-9ec3cc4358d1" Sep 12 23:58:12.842625 kubelet[2585]: I0912 23:58:12.842584 2585 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5b2dbe2aea561ebb825363ab4fd4162e4976fcfbef7dac50de1e5877f15bbce8" Sep 12 23:58:12.843948 containerd[1466]: time="2025-09-12T23:58:12.843903834Z" level=info msg="StopPodSandbox for \"5b2dbe2aea561ebb825363ab4fd4162e4976fcfbef7dac50de1e5877f15bbce8\"" Sep 12 23:58:12.844200 containerd[1466]: time="2025-09-12T23:58:12.844110003Z" level=info msg="Ensure that sandbox 5b2dbe2aea561ebb825363ab4fd4162e4976fcfbef7dac50de1e5877f15bbce8 in task-service has been cleanup successfully" Sep 12 23:58:12.847243 
kubelet[2585]: I0912 23:58:12.847188 2585 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="348f3835785bdb15cdb5abb8b74a97bc344aad17949ebf75bb73ba1d63b96b70" Sep 12 23:58:12.848506 containerd[1466]: time="2025-09-12T23:58:12.848461752Z" level=info msg="StopPodSandbox for \"348f3835785bdb15cdb5abb8b74a97bc344aad17949ebf75bb73ba1d63b96b70\"" Sep 12 23:58:12.848714 containerd[1466]: time="2025-09-12T23:58:12.848680202Z" level=info msg="Ensure that sandbox 348f3835785bdb15cdb5abb8b74a97bc344aad17949ebf75bb73ba1d63b96b70 in task-service has been cleanup successfully" Sep 12 23:58:12.885296 containerd[1466]: time="2025-09-12T23:58:12.885244277Z" level=error msg="StopPodSandbox for \"5b2dbe2aea561ebb825363ab4fd4162e4976fcfbef7dac50de1e5877f15bbce8\" failed" error="failed to destroy network for sandbox \"5b2dbe2aea561ebb825363ab4fd4162e4976fcfbef7dac50de1e5877f15bbce8\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 23:58:12.886753 kubelet[2585]: E0912 23:58:12.885685 2585 log.go:32] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"5b2dbe2aea561ebb825363ab4fd4162e4976fcfbef7dac50de1e5877f15bbce8\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="5b2dbe2aea561ebb825363ab4fd4162e4976fcfbef7dac50de1e5877f15bbce8" Sep 12 23:58:12.886753 kubelet[2585]: E0912 23:58:12.885747 2585 kuberuntime_manager.go:1546] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"5b2dbe2aea561ebb825363ab4fd4162e4976fcfbef7dac50de1e5877f15bbce8"} Sep 12 23:58:12.886753 kubelet[2585]: E0912 23:58:12.885782 2585 kuberuntime_manager.go:1146] "killPodWithSyncResult failed" err="failed to 
\"KillPodSandbox\" for \"62ed9b16-b2ad-4c43-931a-9ec3cc4358d1\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"5b2dbe2aea561ebb825363ab4fd4162e4976fcfbef7dac50de1e5877f15bbce8\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Sep 12 23:58:12.886753 kubelet[2585]: E0912 23:58:12.885808 2585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"62ed9b16-b2ad-4c43-931a-9ec3cc4358d1\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"5b2dbe2aea561ebb825363ab4fd4162e4976fcfbef7dac50de1e5877f15bbce8\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-th8mg" podUID="62ed9b16-b2ad-4c43-931a-9ec3cc4358d1" Sep 12 23:58:12.887395 containerd[1466]: time="2025-09-12T23:58:12.887287446Z" level=error msg="StopPodSandbox for \"348f3835785bdb15cdb5abb8b74a97bc344aad17949ebf75bb73ba1d63b96b70\" failed" error="failed to destroy network for sandbox \"348f3835785bdb15cdb5abb8b74a97bc344aad17949ebf75bb73ba1d63b96b70\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 23:58:12.887552 kubelet[2585]: E0912 23:58:12.887501 2585 log.go:32] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"348f3835785bdb15cdb5abb8b74a97bc344aad17949ebf75bb73ba1d63b96b70\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
podSandboxID="348f3835785bdb15cdb5abb8b74a97bc344aad17949ebf75bb73ba1d63b96b70" Sep 12 23:58:12.887597 kubelet[2585]: E0912 23:58:12.887553 2585 kuberuntime_manager.go:1546] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"348f3835785bdb15cdb5abb8b74a97bc344aad17949ebf75bb73ba1d63b96b70"} Sep 12 23:58:12.887597 kubelet[2585]: E0912 23:58:12.887586 2585 kuberuntime_manager.go:1146] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"dc685a7f-f1eb-41f5-8dc9-3a23d11d38d9\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"348f3835785bdb15cdb5abb8b74a97bc344aad17949ebf75bb73ba1d63b96b70\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Sep 12 23:58:12.887745 kubelet[2585]: E0912 23:58:12.887611 2585 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"dc685a7f-f1eb-41f5-8dc9-3a23d11d38d9\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"348f3835785bdb15cdb5abb8b74a97bc344aad17949ebf75bb73ba1d63b96b70\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-hvkxc" podUID="dc685a7f-f1eb-41f5-8dc9-3a23d11d38d9" Sep 12 23:58:17.472099 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1636136002.mount: Deactivated successfully. 
Sep 12 23:58:17.509176 containerd[1466]: time="2025-09-12T23:58:17.509093480Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:58:17.511568 containerd[1466]: time="2025-09-12T23:58:17.511493257Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.30.3: active requests=0, bytes read=151100457" Sep 12 23:58:17.512620 containerd[1466]: time="2025-09-12T23:58:17.511877753Z" level=info msg="ImageCreate event name:\"sha256:2b8abd2140fc4464ed664d225fe38e5b90bbfcf62996b484b0fc0e0537b6a4a9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:58:17.515929 containerd[1466]: time="2025-09-12T23:58:17.515858115Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:bcb8146fcaeced1e1c88fad3eaa697f1680746bd23c3e7e8d4535bc484c6f2a1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:58:17.516831 containerd[1466]: time="2025-09-12T23:58:17.516745511Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.30.3\" with image id \"sha256:2b8abd2140fc4464ed664d225fe38e5b90bbfcf62996b484b0fc0e0537b6a4a9\", repo tag \"ghcr.io/flatcar/calico/node:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/node@sha256:bcb8146fcaeced1e1c88fad3eaa697f1680746bd23c3e7e8d4535bc484c6f2a1\", size \"151100319\" in 6.712200358s" Sep 12 23:58:17.516966 containerd[1466]: time="2025-09-12T23:58:17.516939279Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.3\" returns image reference \"sha256:2b8abd2140fc4464ed664d225fe38e5b90bbfcf62996b484b0fc0e0537b6a4a9\"" Sep 12 23:58:17.535218 containerd[1466]: time="2025-09-12T23:58:17.535177782Z" level=info msg="CreateContainer within sandbox \"eaaf952d1de8d62483e39d3768aac0024efd2f4f83101ab04a5c62c0de1ebc0c\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Sep 12 23:58:17.569009 containerd[1466]: time="2025-09-12T23:58:17.568944636Z" level=info 
msg="CreateContainer within sandbox \"eaaf952d1de8d62483e39d3768aac0024efd2f4f83101ab04a5c62c0de1ebc0c\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"37c8f1dce7d6f0f90821b3a4a8cb4e9f43195649843d83df054b1d6fdbc2be2c\"" Sep 12 23:58:17.569893 containerd[1466]: time="2025-09-12T23:58:17.569861834Z" level=info msg="StartContainer for \"37c8f1dce7d6f0f90821b3a4a8cb4e9f43195649843d83df054b1d6fdbc2be2c\"" Sep 12 23:58:17.603940 systemd[1]: Started cri-containerd-37c8f1dce7d6f0f90821b3a4a8cb4e9f43195649843d83df054b1d6fdbc2be2c.scope - libcontainer container 37c8f1dce7d6f0f90821b3a4a8cb4e9f43195649843d83df054b1d6fdbc2be2c. Sep 12 23:58:17.637200 containerd[1466]: time="2025-09-12T23:58:17.637070130Z" level=info msg="StartContainer for \"37c8f1dce7d6f0f90821b3a4a8cb4e9f43195649843d83df054b1d6fdbc2be2c\" returns successfully" Sep 12 23:58:17.811906 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Sep 12 23:58:17.812320 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. 
Sep 12 23:58:17.927729 kubelet[2585]: I0912 23:58:17.926316 2585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-n2hgd" podStartSLOduration=1.689216088 podStartE2EDuration="17.926299465s" podCreationTimestamp="2025-09-12 23:58:00 +0000 UTC" firstStartedPulling="2025-09-12 23:58:01.280892264 +0000 UTC m=+23.760125351" lastFinishedPulling="2025-09-12 23:58:17.517975561 +0000 UTC m=+39.997208728" observedRunningTime="2025-09-12 23:58:17.923876286 +0000 UTC m=+40.403109453" watchObservedRunningTime="2025-09-12 23:58:17.926299465 +0000 UTC m=+40.405532592" Sep 12 23:58:18.012231 containerd[1466]: time="2025-09-12T23:58:18.012180475Z" level=info msg="StopPodSandbox for \"2d0a7e0dfa539a0b63dc73f5bb6fcdf7eef5bcd276b776fe1f97bf8253a16573\"" Sep 12 23:58:18.207100 containerd[1466]: 2025-09-12 23:58:18.107 [INFO][3788] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="2d0a7e0dfa539a0b63dc73f5bb6fcdf7eef5bcd276b776fe1f97bf8253a16573" Sep 12 23:58:18.207100 containerd[1466]: 2025-09-12 23:58:18.108 [INFO][3788] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="2d0a7e0dfa539a0b63dc73f5bb6fcdf7eef5bcd276b776fe1f97bf8253a16573" iface="eth0" netns="/var/run/netns/cni-b34462ab-f7cc-8825-7ed1-888a5b56daa8" Sep 12 23:58:18.207100 containerd[1466]: 2025-09-12 23:58:18.108 [INFO][3788] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="2d0a7e0dfa539a0b63dc73f5bb6fcdf7eef5bcd276b776fe1f97bf8253a16573" iface="eth0" netns="/var/run/netns/cni-b34462ab-f7cc-8825-7ed1-888a5b56daa8" Sep 12 23:58:18.207100 containerd[1466]: 2025-09-12 23:58:18.108 [INFO][3788] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. 
ContainerID="2d0a7e0dfa539a0b63dc73f5bb6fcdf7eef5bcd276b776fe1f97bf8253a16573" iface="eth0" netns="/var/run/netns/cni-b34462ab-f7cc-8825-7ed1-888a5b56daa8" Sep 12 23:58:18.207100 containerd[1466]: 2025-09-12 23:58:18.108 [INFO][3788] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="2d0a7e0dfa539a0b63dc73f5bb6fcdf7eef5bcd276b776fe1f97bf8253a16573" Sep 12 23:58:18.207100 containerd[1466]: 2025-09-12 23:58:18.108 [INFO][3788] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="2d0a7e0dfa539a0b63dc73f5bb6fcdf7eef5bcd276b776fe1f97bf8253a16573" Sep 12 23:58:18.207100 containerd[1466]: 2025-09-12 23:58:18.181 [INFO][3800] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="2d0a7e0dfa539a0b63dc73f5bb6fcdf7eef5bcd276b776fe1f97bf8253a16573" HandleID="k8s-pod-network.2d0a7e0dfa539a0b63dc73f5bb6fcdf7eef5bcd276b776fe1f97bf8253a16573" Workload="ci--4081--3--5--n--326e2e5946-k8s-whisker--7bbf9bf896--96lc9-eth0" Sep 12 23:58:18.207100 containerd[1466]: 2025-09-12 23:58:18.182 [INFO][3800] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 23:58:18.207100 containerd[1466]: 2025-09-12 23:58:18.182 [INFO][3800] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 12 23:58:18.207100 containerd[1466]: 2025-09-12 23:58:18.197 [WARNING][3800] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="2d0a7e0dfa539a0b63dc73f5bb6fcdf7eef5bcd276b776fe1f97bf8253a16573" HandleID="k8s-pod-network.2d0a7e0dfa539a0b63dc73f5bb6fcdf7eef5bcd276b776fe1f97bf8253a16573" Workload="ci--4081--3--5--n--326e2e5946-k8s-whisker--7bbf9bf896--96lc9-eth0" Sep 12 23:58:18.207100 containerd[1466]: 2025-09-12 23:58:18.197 [INFO][3800] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="2d0a7e0dfa539a0b63dc73f5bb6fcdf7eef5bcd276b776fe1f97bf8253a16573" HandleID="k8s-pod-network.2d0a7e0dfa539a0b63dc73f5bb6fcdf7eef5bcd276b776fe1f97bf8253a16573" Workload="ci--4081--3--5--n--326e2e5946-k8s-whisker--7bbf9bf896--96lc9-eth0" Sep 12 23:58:18.207100 containerd[1466]: 2025-09-12 23:58:18.200 [INFO][3800] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 23:58:18.207100 containerd[1466]: 2025-09-12 23:58:18.204 [INFO][3788] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="2d0a7e0dfa539a0b63dc73f5bb6fcdf7eef5bcd276b776fe1f97bf8253a16573" Sep 12 23:58:18.208581 containerd[1466]: time="2025-09-12T23:58:18.207644820Z" level=info msg="TearDown network for sandbox \"2d0a7e0dfa539a0b63dc73f5bb6fcdf7eef5bcd276b776fe1f97bf8253a16573\" successfully" Sep 12 23:58:18.208581 containerd[1466]: time="2025-09-12T23:58:18.207681942Z" level=info msg="StopPodSandbox for \"2d0a7e0dfa539a0b63dc73f5bb6fcdf7eef5bcd276b776fe1f97bf8253a16573\" returns successfully" Sep 12 23:58:18.254192 kubelet[2585]: I0912 23:58:18.253323 2585 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d77a8544-93ca-4785-a4a1-06bf0c577901-whisker-ca-bundle\") pod \"d77a8544-93ca-4785-a4a1-06bf0c577901\" (UID: \"d77a8544-93ca-4785-a4a1-06bf0c577901\") " Sep 12 23:58:18.254192 kubelet[2585]: I0912 23:58:18.253407 2585 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: 
\"kubernetes.io/secret/d77a8544-93ca-4785-a4a1-06bf0c577901-whisker-backend-key-pair\") pod \"d77a8544-93ca-4785-a4a1-06bf0c577901\" (UID: \"d77a8544-93ca-4785-a4a1-06bf0c577901\") " Sep 12 23:58:18.254192 kubelet[2585]: I0912 23:58:18.253434 2585 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hjdf4\" (UniqueName: \"kubernetes.io/projected/d77a8544-93ca-4785-a4a1-06bf0c577901-kube-api-access-hjdf4\") pod \"d77a8544-93ca-4785-a4a1-06bf0c577901\" (UID: \"d77a8544-93ca-4785-a4a1-06bf0c577901\") " Sep 12 23:58:18.255802 kubelet[2585]: I0912 23:58:18.255310 2585 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d77a8544-93ca-4785-a4a1-06bf0c577901-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "d77a8544-93ca-4785-a4a1-06bf0c577901" (UID: "d77a8544-93ca-4785-a4a1-06bf0c577901"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Sep 12 23:58:18.260694 kubelet[2585]: I0912 23:58:18.260406 2585 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d77a8544-93ca-4785-a4a1-06bf0c577901-kube-api-access-hjdf4" (OuterVolumeSpecName: "kube-api-access-hjdf4") pod "d77a8544-93ca-4785-a4a1-06bf0c577901" (UID: "d77a8544-93ca-4785-a4a1-06bf0c577901"). InnerVolumeSpecName "kube-api-access-hjdf4". PluginName "kubernetes.io/projected", VolumeGIDValue "" Sep 12 23:58:18.262792 kubelet[2585]: I0912 23:58:18.262689 2585 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d77a8544-93ca-4785-a4a1-06bf0c577901-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "d77a8544-93ca-4785-a4a1-06bf0c577901" (UID: "d77a8544-93ca-4785-a4a1-06bf0c577901"). InnerVolumeSpecName "whisker-backend-key-pair". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Sep 12 23:58:18.354810 kubelet[2585]: I0912 23:58:18.354665 2585 reconciler_common.go:299] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/d77a8544-93ca-4785-a4a1-06bf0c577901-whisker-backend-key-pair\") on node \"ci-4081-3-5-n-326e2e5946\" DevicePath \"\"" Sep 12 23:58:18.354810 kubelet[2585]: I0912 23:58:18.354739 2585 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hjdf4\" (UniqueName: \"kubernetes.io/projected/d77a8544-93ca-4785-a4a1-06bf0c577901-kube-api-access-hjdf4\") on node \"ci-4081-3-5-n-326e2e5946\" DevicePath \"\"" Sep 12 23:58:18.354810 kubelet[2585]: I0912 23:58:18.354751 2585 reconciler_common.go:299] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d77a8544-93ca-4785-a4a1-06bf0c577901-whisker-ca-bundle\") on node \"ci-4081-3-5-n-326e2e5946\" DevicePath \"\"" Sep 12 23:58:18.476559 systemd[1]: run-netns-cni\x2db34462ab\x2df7cc\x2d8825\x2d7ed1\x2d888a5b56daa8.mount: Deactivated successfully. Sep 12 23:58:18.476658 systemd[1]: var-lib-kubelet-pods-d77a8544\x2d93ca\x2d4785\x2da4a1\x2d06bf0c577901-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dhjdf4.mount: Deactivated successfully. Sep 12 23:58:18.476750 systemd[1]: var-lib-kubelet-pods-d77a8544\x2d93ca\x2d4785\x2da4a1\x2d06bf0c577901-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. Sep 12 23:58:18.894433 systemd[1]: Removed slice kubepods-besteffort-podd77a8544_93ca_4785_a4a1_06bf0c577901.slice - libcontainer container kubepods-besteffort-podd77a8544_93ca_4785_a4a1_06bf0c577901.slice. Sep 12 23:58:18.918306 systemd[1]: run-containerd-runc-k8s.io-37c8f1dce7d6f0f90821b3a4a8cb4e9f43195649843d83df054b1d6fdbc2be2c-runc.MAWNeM.mount: Deactivated successfully. 
Sep 12 23:58:18.984144 systemd[1]: Created slice kubepods-besteffort-pod00fc1575_e0b8_47af_b100_9a4260f14add.slice - libcontainer container kubepods-besteffort-pod00fc1575_e0b8_47af_b100_9a4260f14add.slice. Sep 12 23:58:19.061484 kubelet[2585]: I0912 23:58:19.061397 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/00fc1575-e0b8-47af-b100-9a4260f14add-whisker-backend-key-pair\") pod \"whisker-58478cff6c-kqfx8\" (UID: \"00fc1575-e0b8-47af-b100-9a4260f14add\") " pod="calico-system/whisker-58478cff6c-kqfx8" Sep 12 23:58:19.061484 kubelet[2585]: I0912 23:58:19.061472 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/00fc1575-e0b8-47af-b100-9a4260f14add-whisker-ca-bundle\") pod \"whisker-58478cff6c-kqfx8\" (UID: \"00fc1575-e0b8-47af-b100-9a4260f14add\") " pod="calico-system/whisker-58478cff6c-kqfx8" Sep 12 23:58:19.062442 kubelet[2585]: I0912 23:58:19.061515 2585 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tmmmq\" (UniqueName: \"kubernetes.io/projected/00fc1575-e0b8-47af-b100-9a4260f14add-kube-api-access-tmmmq\") pod \"whisker-58478cff6c-kqfx8\" (UID: \"00fc1575-e0b8-47af-b100-9a4260f14add\") " pod="calico-system/whisker-58478cff6c-kqfx8" Sep 12 23:58:19.292146 containerd[1466]: time="2025-09-12T23:58:19.291442058Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-58478cff6c-kqfx8,Uid:00fc1575-e0b8-47af-b100-9a4260f14add,Namespace:calico-system,Attempt:0,}" Sep 12 23:58:19.502437 systemd-networkd[1378]: caliecfe456a0dd: Link UP Sep 12 23:58:19.502595 systemd-networkd[1378]: caliecfe456a0dd: Gained carrier Sep 12 23:58:19.543054 containerd[1466]: 2025-09-12 23:58:19.334 [INFO][3845] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 12 
23:58:19.543054 containerd[1466]: 2025-09-12 23:58:19.354 [INFO][3845] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4081--3--5--n--326e2e5946-k8s-whisker--58478cff6c--kqfx8-eth0 whisker-58478cff6c- calico-system 00fc1575-e0b8-47af-b100-9a4260f14add 913 0 2025-09-12 23:58:18 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:58478cff6c projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s ci-4081-3-5-n-326e2e5946 whisker-58478cff6c-kqfx8 eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] caliecfe456a0dd [] [] }} ContainerID="dfa2da7c6f6034002d64086cb7320d1c3b9f756d2c4a6b02dc1f1dd5b041de0f" Namespace="calico-system" Pod="whisker-58478cff6c-kqfx8" WorkloadEndpoint="ci--4081--3--5--n--326e2e5946-k8s-whisker--58478cff6c--kqfx8-" Sep 12 23:58:19.543054 containerd[1466]: 2025-09-12 23:58:19.354 [INFO][3845] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="dfa2da7c6f6034002d64086cb7320d1c3b9f756d2c4a6b02dc1f1dd5b041de0f" Namespace="calico-system" Pod="whisker-58478cff6c-kqfx8" WorkloadEndpoint="ci--4081--3--5--n--326e2e5946-k8s-whisker--58478cff6c--kqfx8-eth0" Sep 12 23:58:19.543054 containerd[1466]: 2025-09-12 23:58:19.399 [INFO][3856] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="dfa2da7c6f6034002d64086cb7320d1c3b9f756d2c4a6b02dc1f1dd5b041de0f" HandleID="k8s-pod-network.dfa2da7c6f6034002d64086cb7320d1c3b9f756d2c4a6b02dc1f1dd5b041de0f" Workload="ci--4081--3--5--n--326e2e5946-k8s-whisker--58478cff6c--kqfx8-eth0" Sep 12 23:58:19.543054 containerd[1466]: 2025-09-12 23:58:19.401 [INFO][3856] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="dfa2da7c6f6034002d64086cb7320d1c3b9f756d2c4a6b02dc1f1dd5b041de0f" HandleID="k8s-pod-network.dfa2da7c6f6034002d64086cb7320d1c3b9f756d2c4a6b02dc1f1dd5b041de0f" 
Workload="ci--4081--3--5--n--326e2e5946-k8s-whisker--58478cff6c--kqfx8-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400024b020), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4081-3-5-n-326e2e5946", "pod":"whisker-58478cff6c-kqfx8", "timestamp":"2025-09-12 23:58:19.399445275 +0000 UTC"}, Hostname:"ci-4081-3-5-n-326e2e5946", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 23:58:19.543054 containerd[1466]: 2025-09-12 23:58:19.401 [INFO][3856] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 23:58:19.543054 containerd[1466]: 2025-09-12 23:58:19.401 [INFO][3856] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 12 23:58:19.543054 containerd[1466]: 2025-09-12 23:58:19.401 [INFO][3856] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4081-3-5-n-326e2e5946' Sep 12 23:58:19.543054 containerd[1466]: 2025-09-12 23:58:19.418 [INFO][3856] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.dfa2da7c6f6034002d64086cb7320d1c3b9f756d2c4a6b02dc1f1dd5b041de0f" host="ci-4081-3-5-n-326e2e5946" Sep 12 23:58:19.543054 containerd[1466]: 2025-09-12 23:58:19.425 [INFO][3856] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4081-3-5-n-326e2e5946" Sep 12 23:58:19.543054 containerd[1466]: 2025-09-12 23:58:19.432 [INFO][3856] ipam/ipam.go 511: Trying affinity for 192.168.36.64/26 host="ci-4081-3-5-n-326e2e5946" Sep 12 23:58:19.543054 containerd[1466]: 2025-09-12 23:58:19.436 [INFO][3856] ipam/ipam.go 158: Attempting to load block cidr=192.168.36.64/26 host="ci-4081-3-5-n-326e2e5946" Sep 12 23:58:19.543054 containerd[1466]: 2025-09-12 23:58:19.441 [INFO][3856] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.36.64/26 host="ci-4081-3-5-n-326e2e5946" Sep 12 23:58:19.543054 
containerd[1466]: 2025-09-12 23:58:19.441 [INFO][3856] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.36.64/26 handle="k8s-pod-network.dfa2da7c6f6034002d64086cb7320d1c3b9f756d2c4a6b02dc1f1dd5b041de0f" host="ci-4081-3-5-n-326e2e5946" Sep 12 23:58:19.543054 containerd[1466]: 2025-09-12 23:58:19.449 [INFO][3856] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.dfa2da7c6f6034002d64086cb7320d1c3b9f756d2c4a6b02dc1f1dd5b041de0f Sep 12 23:58:19.543054 containerd[1466]: 2025-09-12 23:58:19.466 [INFO][3856] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.36.64/26 handle="k8s-pod-network.dfa2da7c6f6034002d64086cb7320d1c3b9f756d2c4a6b02dc1f1dd5b041de0f" host="ci-4081-3-5-n-326e2e5946" Sep 12 23:58:19.543054 containerd[1466]: 2025-09-12 23:58:19.488 [INFO][3856] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.36.65/26] block=192.168.36.64/26 handle="k8s-pod-network.dfa2da7c6f6034002d64086cb7320d1c3b9f756d2c4a6b02dc1f1dd5b041de0f" host="ci-4081-3-5-n-326e2e5946" Sep 12 23:58:19.543054 containerd[1466]: 2025-09-12 23:58:19.488 [INFO][3856] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.36.65/26] handle="k8s-pod-network.dfa2da7c6f6034002d64086cb7320d1c3b9f756d2c4a6b02dc1f1dd5b041de0f" host="ci-4081-3-5-n-326e2e5946" Sep 12 23:58:19.543054 containerd[1466]: 2025-09-12 23:58:19.488 [INFO][3856] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 12 23:58:19.543054 containerd[1466]: 2025-09-12 23:58:19.488 [INFO][3856] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.36.65/26] IPv6=[] ContainerID="dfa2da7c6f6034002d64086cb7320d1c3b9f756d2c4a6b02dc1f1dd5b041de0f" HandleID="k8s-pod-network.dfa2da7c6f6034002d64086cb7320d1c3b9f756d2c4a6b02dc1f1dd5b041de0f" Workload="ci--4081--3--5--n--326e2e5946-k8s-whisker--58478cff6c--kqfx8-eth0" Sep 12 23:58:19.543601 containerd[1466]: 2025-09-12 23:58:19.492 [INFO][3845] cni-plugin/k8s.go 418: Populated endpoint ContainerID="dfa2da7c6f6034002d64086cb7320d1c3b9f756d2c4a6b02dc1f1dd5b041de0f" Namespace="calico-system" Pod="whisker-58478cff6c-kqfx8" WorkloadEndpoint="ci--4081--3--5--n--326e2e5946-k8s-whisker--58478cff6c--kqfx8-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--5--n--326e2e5946-k8s-whisker--58478cff6c--kqfx8-eth0", GenerateName:"whisker-58478cff6c-", Namespace:"calico-system", SelfLink:"", UID:"00fc1575-e0b8-47af-b100-9a4260f14add", ResourceVersion:"913", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 23, 58, 18, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"58478cff6c", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-5-n-326e2e5946", ContainerID:"", Pod:"whisker-58478cff6c-kqfx8", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.36.65/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", 
"ksa.calico-system.whisker"}, InterfaceName:"caliecfe456a0dd", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 23:58:19.543601 containerd[1466]: 2025-09-12 23:58:19.492 [INFO][3845] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.36.65/32] ContainerID="dfa2da7c6f6034002d64086cb7320d1c3b9f756d2c4a6b02dc1f1dd5b041de0f" Namespace="calico-system" Pod="whisker-58478cff6c-kqfx8" WorkloadEndpoint="ci--4081--3--5--n--326e2e5946-k8s-whisker--58478cff6c--kqfx8-eth0" Sep 12 23:58:19.543601 containerd[1466]: 2025-09-12 23:58:19.492 [INFO][3845] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to caliecfe456a0dd ContainerID="dfa2da7c6f6034002d64086cb7320d1c3b9f756d2c4a6b02dc1f1dd5b041de0f" Namespace="calico-system" Pod="whisker-58478cff6c-kqfx8" WorkloadEndpoint="ci--4081--3--5--n--326e2e5946-k8s-whisker--58478cff6c--kqfx8-eth0" Sep 12 23:58:19.543601 containerd[1466]: 2025-09-12 23:58:19.501 [INFO][3845] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="dfa2da7c6f6034002d64086cb7320d1c3b9f756d2c4a6b02dc1f1dd5b041de0f" Namespace="calico-system" Pod="whisker-58478cff6c-kqfx8" WorkloadEndpoint="ci--4081--3--5--n--326e2e5946-k8s-whisker--58478cff6c--kqfx8-eth0" Sep 12 23:58:19.543601 containerd[1466]: 2025-09-12 23:58:19.506 [INFO][3845] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="dfa2da7c6f6034002d64086cb7320d1c3b9f756d2c4a6b02dc1f1dd5b041de0f" Namespace="calico-system" Pod="whisker-58478cff6c-kqfx8" WorkloadEndpoint="ci--4081--3--5--n--326e2e5946-k8s-whisker--58478cff6c--kqfx8-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--5--n--326e2e5946-k8s-whisker--58478cff6c--kqfx8-eth0", GenerateName:"whisker-58478cff6c-", Namespace:"calico-system", SelfLink:"", 
UID:"00fc1575-e0b8-47af-b100-9a4260f14add", ResourceVersion:"913", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 23, 58, 18, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"58478cff6c", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-5-n-326e2e5946", ContainerID:"dfa2da7c6f6034002d64086cb7320d1c3b9f756d2c4a6b02dc1f1dd5b041de0f", Pod:"whisker-58478cff6c-kqfx8", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.36.65/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"caliecfe456a0dd", MAC:"6e:22:b4:96:41:c2", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 23:58:19.543601 containerd[1466]: 2025-09-12 23:58:19.540 [INFO][3845] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="dfa2da7c6f6034002d64086cb7320d1c3b9f756d2c4a6b02dc1f1dd5b041de0f" Namespace="calico-system" Pod="whisker-58478cff6c-kqfx8" WorkloadEndpoint="ci--4081--3--5--n--326e2e5946-k8s-whisker--58478cff6c--kqfx8-eth0" Sep 12 23:58:19.575968 containerd[1466]: time="2025-09-12T23:58:19.575842454Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Sep 12 23:58:19.576111 containerd[1466]: time="2025-09-12T23:58:19.575987180Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Sep 12 23:58:19.578146 containerd[1466]: time="2025-09-12T23:58:19.578044101Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 12 23:58:19.578448 containerd[1466]: time="2025-09-12T23:58:19.578358354Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 12 23:58:19.613217 systemd[1]: run-containerd-runc-k8s.io-dfa2da7c6f6034002d64086cb7320d1c3b9f756d2c4a6b02dc1f1dd5b041de0f-runc.KOY9wS.mount: Deactivated successfully. Sep 12 23:58:19.623162 systemd[1]: Started cri-containerd-dfa2da7c6f6034002d64086cb7320d1c3b9f756d2c4a6b02dc1f1dd5b041de0f.scope - libcontainer container dfa2da7c6f6034002d64086cb7320d1c3b9f756d2c4a6b02dc1f1dd5b041de0f. Sep 12 23:58:19.644768 kubelet[2585]: I0912 23:58:19.644722 2585 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d77a8544-93ca-4785-a4a1-06bf0c577901" path="/var/lib/kubelet/pods/d77a8544-93ca-4785-a4a1-06bf0c577901/volumes" Sep 12 23:58:19.734966 containerd[1466]: time="2025-09-12T23:58:19.734898263Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-58478cff6c-kqfx8,Uid:00fc1575-e0b8-47af-b100-9a4260f14add,Namespace:calico-system,Attempt:0,} returns sandbox id \"dfa2da7c6f6034002d64086cb7320d1c3b9f756d2c4a6b02dc1f1dd5b041de0f\"" Sep 12 23:58:19.745930 containerd[1466]: time="2025-09-12T23:58:19.745625769Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.3\"" Sep 12 23:58:20.475396 systemd[1]: run-containerd-runc-k8s.io-37c8f1dce7d6f0f90821b3a4a8cb4e9f43195649843d83df054b1d6fdbc2be2c-runc.o3FA2k.mount: Deactivated successfully. 
Sep 12 23:58:21.255742 containerd[1466]: time="2025-09-12T23:58:21.255648028Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:58:21.257488 containerd[1466]: time="2025-09-12T23:58:21.257428138Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.3: active requests=0, bytes read=4605606" Sep 12 23:58:21.260474 containerd[1466]: time="2025-09-12T23:58:21.259936476Z" level=info msg="ImageCreate event name:\"sha256:270a0129ec34c3ad6ae6d56c0afce111eb0baa25dfdacb63722ec5887bafd3c5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:58:21.265353 containerd[1466]: time="2025-09-12T23:58:21.265298245Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker@sha256:e7113761fc7633d515882f0d48b5c8d0b8e62f3f9d34823f2ee194bb16d2ec44\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:58:21.268973 containerd[1466]: time="2025-09-12T23:58:21.268860823Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker:v3.30.3\" with image id \"sha256:270a0129ec34c3ad6ae6d56c0afce111eb0baa25dfdacb63722ec5887bafd3c5\", repo tag \"ghcr.io/flatcar/calico/whisker:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/whisker@sha256:e7113761fc7633d515882f0d48b5c8d0b8e62f3f9d34823f2ee194bb16d2ec44\", size \"5974839\" in 1.523190412s" Sep 12 23:58:21.268973 containerd[1466]: time="2025-09-12T23:58:21.268924666Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.3\" returns image reference \"sha256:270a0129ec34c3ad6ae6d56c0afce111eb0baa25dfdacb63722ec5887bafd3c5\"" Sep 12 23:58:21.278367 containerd[1466]: time="2025-09-12T23:58:21.278058942Z" level=info msg="CreateContainer within sandbox \"dfa2da7c6f6034002d64086cb7320d1c3b9f756d2c4a6b02dc1f1dd5b041de0f\" for container &ContainerMetadata{Name:whisker,Attempt:0,}" Sep 12 23:58:21.298116 containerd[1466]: time="2025-09-12T23:58:21.297975838Z" level=info 
msg="CreateContainer within sandbox \"dfa2da7c6f6034002d64086cb7320d1c3b9f756d2c4a6b02dc1f1dd5b041de0f\" for &ContainerMetadata{Name:whisker,Attempt:0,} returns container id \"90271facc1ff4a95446f8e0d50c0629cdc4ac3a3b57630a5b88f1a8c4399559a\"" Sep 12 23:58:21.298831 containerd[1466]: time="2025-09-12T23:58:21.298798430Z" level=info msg="StartContainer for \"90271facc1ff4a95446f8e0d50c0629cdc4ac3a3b57630a5b88f1a8c4399559a\"" Sep 12 23:58:21.335105 systemd[1]: Started cri-containerd-90271facc1ff4a95446f8e0d50c0629cdc4ac3a3b57630a5b88f1a8c4399559a.scope - libcontainer container 90271facc1ff4a95446f8e0d50c0629cdc4ac3a3b57630a5b88f1a8c4399559a. Sep 12 23:58:21.370206 systemd-networkd[1378]: caliecfe456a0dd: Gained IPv6LL Sep 12 23:58:21.375698 containerd[1466]: time="2025-09-12T23:58:21.375628945Z" level=info msg="StartContainer for \"90271facc1ff4a95446f8e0d50c0629cdc4ac3a3b57630a5b88f1a8c4399559a\" returns successfully" Sep 12 23:58:21.379098 containerd[1466]: time="2025-09-12T23:58:21.379037038Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\"" Sep 12 23:58:22.640269 containerd[1466]: time="2025-09-12T23:58:22.640186122Z" level=info msg="StopPodSandbox for \"c3dad0ac9ee4c998847493e6792a88312db0dbe2f675e68f2090ab9dd9714fa2\"" Sep 12 23:58:22.641337 containerd[1466]: time="2025-09-12T23:58:22.640199683Z" level=info msg="StopPodSandbox for \"3b93add247b5e858d9d88970724082df69dd56fa1aca05988269da465b1395a8\"" Sep 12 23:58:22.801726 containerd[1466]: 2025-09-12 23:58:22.736 [INFO][4136] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="3b93add247b5e858d9d88970724082df69dd56fa1aca05988269da465b1395a8" Sep 12 23:58:22.801726 containerd[1466]: 2025-09-12 23:58:22.736 [INFO][4136] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. 
ContainerID="3b93add247b5e858d9d88970724082df69dd56fa1aca05988269da465b1395a8" iface="eth0" netns="/var/run/netns/cni-6f106c6e-5770-d3d6-fb8e-fae7cb02e90f" Sep 12 23:58:22.801726 containerd[1466]: 2025-09-12 23:58:22.737 [INFO][4136] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="3b93add247b5e858d9d88970724082df69dd56fa1aca05988269da465b1395a8" iface="eth0" netns="/var/run/netns/cni-6f106c6e-5770-d3d6-fb8e-fae7cb02e90f" Sep 12 23:58:22.801726 containerd[1466]: 2025-09-12 23:58:22.738 [INFO][4136] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. ContainerID="3b93add247b5e858d9d88970724082df69dd56fa1aca05988269da465b1395a8" iface="eth0" netns="/var/run/netns/cni-6f106c6e-5770-d3d6-fb8e-fae7cb02e90f" Sep 12 23:58:22.801726 containerd[1466]: 2025-09-12 23:58:22.739 [INFO][4136] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="3b93add247b5e858d9d88970724082df69dd56fa1aca05988269da465b1395a8" Sep 12 23:58:22.801726 containerd[1466]: 2025-09-12 23:58:22.739 [INFO][4136] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="3b93add247b5e858d9d88970724082df69dd56fa1aca05988269da465b1395a8" Sep 12 23:58:22.801726 containerd[1466]: 2025-09-12 23:58:22.782 [INFO][4148] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="3b93add247b5e858d9d88970724082df69dd56fa1aca05988269da465b1395a8" HandleID="k8s-pod-network.3b93add247b5e858d9d88970724082df69dd56fa1aca05988269da465b1395a8" Workload="ci--4081--3--5--n--326e2e5946-k8s-calico--apiserver--6c79dd94d7--k7nnj-eth0" Sep 12 23:58:22.801726 containerd[1466]: 2025-09-12 23:58:22.782 [INFO][4148] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 23:58:22.801726 containerd[1466]: 2025-09-12 23:58:22.782 [INFO][4148] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 12 23:58:22.801726 containerd[1466]: 2025-09-12 23:58:22.792 [WARNING][4148] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="3b93add247b5e858d9d88970724082df69dd56fa1aca05988269da465b1395a8" HandleID="k8s-pod-network.3b93add247b5e858d9d88970724082df69dd56fa1aca05988269da465b1395a8" Workload="ci--4081--3--5--n--326e2e5946-k8s-calico--apiserver--6c79dd94d7--k7nnj-eth0" Sep 12 23:58:22.801726 containerd[1466]: 2025-09-12 23:58:22.792 [INFO][4148] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="3b93add247b5e858d9d88970724082df69dd56fa1aca05988269da465b1395a8" HandleID="k8s-pod-network.3b93add247b5e858d9d88970724082df69dd56fa1aca05988269da465b1395a8" Workload="ci--4081--3--5--n--326e2e5946-k8s-calico--apiserver--6c79dd94d7--k7nnj-eth0" Sep 12 23:58:22.801726 containerd[1466]: 2025-09-12 23:58:22.795 [INFO][4148] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 23:58:22.801726 containerd[1466]: 2025-09-12 23:58:22.799 [INFO][4136] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="3b93add247b5e858d9d88970724082df69dd56fa1aca05988269da465b1395a8" Sep 12 23:58:22.804524 containerd[1466]: time="2025-09-12T23:58:22.802293942Z" level=info msg="TearDown network for sandbox \"3b93add247b5e858d9d88970724082df69dd56fa1aca05988269da465b1395a8\" successfully" Sep 12 23:58:22.804524 containerd[1466]: time="2025-09-12T23:58:22.802843283Z" level=info msg="StopPodSandbox for \"3b93add247b5e858d9d88970724082df69dd56fa1aca05988269da465b1395a8\" returns successfully" Sep 12 23:58:22.805440 containerd[1466]: time="2025-09-12T23:58:22.805252776Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6c79dd94d7-k7nnj,Uid:66998075-6768-4629-a394-a7e649462c77,Namespace:calico-apiserver,Attempt:1,}" Sep 12 23:58:22.808140 systemd[1]: run-netns-cni\x2d6f106c6e\x2d5770\x2dd3d6\x2dfb8e\x2dfae7cb02e90f.mount: Deactivated successfully. 
Sep 12 23:58:22.829121 containerd[1466]: 2025-09-12 23:58:22.735 [INFO][4129] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="c3dad0ac9ee4c998847493e6792a88312db0dbe2f675e68f2090ab9dd9714fa2" Sep 12 23:58:22.829121 containerd[1466]: 2025-09-12 23:58:22.735 [INFO][4129] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="c3dad0ac9ee4c998847493e6792a88312db0dbe2f675e68f2090ab9dd9714fa2" iface="eth0" netns="/var/run/netns/cni-5c98cd8d-e3d6-c383-a1f5-aec13a86cc2b" Sep 12 23:58:22.829121 containerd[1466]: 2025-09-12 23:58:22.736 [INFO][4129] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="c3dad0ac9ee4c998847493e6792a88312db0dbe2f675e68f2090ab9dd9714fa2" iface="eth0" netns="/var/run/netns/cni-5c98cd8d-e3d6-c383-a1f5-aec13a86cc2b" Sep 12 23:58:22.829121 containerd[1466]: 2025-09-12 23:58:22.737 [INFO][4129] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. ContainerID="c3dad0ac9ee4c998847493e6792a88312db0dbe2f675e68f2090ab9dd9714fa2" iface="eth0" netns="/var/run/netns/cni-5c98cd8d-e3d6-c383-a1f5-aec13a86cc2b" Sep 12 23:58:22.829121 containerd[1466]: 2025-09-12 23:58:22.737 [INFO][4129] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="c3dad0ac9ee4c998847493e6792a88312db0dbe2f675e68f2090ab9dd9714fa2" Sep 12 23:58:22.829121 containerd[1466]: 2025-09-12 23:58:22.737 [INFO][4129] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="c3dad0ac9ee4c998847493e6792a88312db0dbe2f675e68f2090ab9dd9714fa2" Sep 12 23:58:22.829121 containerd[1466]: 2025-09-12 23:58:22.782 [INFO][4146] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="c3dad0ac9ee4c998847493e6792a88312db0dbe2f675e68f2090ab9dd9714fa2" HandleID="k8s-pod-network.c3dad0ac9ee4c998847493e6792a88312db0dbe2f675e68f2090ab9dd9714fa2" Workload="ci--4081--3--5--n--326e2e5946-k8s-csi--node--driver--vdhzb-eth0" Sep 12 23:58:22.829121 containerd[1466]: 2025-09-12 23:58:22.782 [INFO][4146] 
ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 23:58:22.829121 containerd[1466]: 2025-09-12 23:58:22.795 [INFO][4146] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 12 23:58:22.829121 containerd[1466]: 2025-09-12 23:58:22.815 [WARNING][4146] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="c3dad0ac9ee4c998847493e6792a88312db0dbe2f675e68f2090ab9dd9714fa2" HandleID="k8s-pod-network.c3dad0ac9ee4c998847493e6792a88312db0dbe2f675e68f2090ab9dd9714fa2" Workload="ci--4081--3--5--n--326e2e5946-k8s-csi--node--driver--vdhzb-eth0" Sep 12 23:58:22.829121 containerd[1466]: 2025-09-12 23:58:22.816 [INFO][4146] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="c3dad0ac9ee4c998847493e6792a88312db0dbe2f675e68f2090ab9dd9714fa2" HandleID="k8s-pod-network.c3dad0ac9ee4c998847493e6792a88312db0dbe2f675e68f2090ab9dd9714fa2" Workload="ci--4081--3--5--n--326e2e5946-k8s-csi--node--driver--vdhzb-eth0" Sep 12 23:58:22.829121 containerd[1466]: 2025-09-12 23:58:22.819 [INFO][4146] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 23:58:22.829121 containerd[1466]: 2025-09-12 23:58:22.821 [INFO][4129] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="c3dad0ac9ee4c998847493e6792a88312db0dbe2f675e68f2090ab9dd9714fa2" Sep 12 23:58:22.834655 containerd[1466]: time="2025-09-12T23:58:22.831777840Z" level=info msg="TearDown network for sandbox \"c3dad0ac9ee4c998847493e6792a88312db0dbe2f675e68f2090ab9dd9714fa2\" successfully" Sep 12 23:58:22.834655 containerd[1466]: time="2025-09-12T23:58:22.831823002Z" level=info msg="StopPodSandbox for \"c3dad0ac9ee4c998847493e6792a88312db0dbe2f675e68f2090ab9dd9714fa2\" returns successfully" Sep 12 23:58:22.834074 systemd[1]: run-netns-cni\x2d5c98cd8d\x2de3d6\x2dc383\x2da1f5\x2daec13a86cc2b.mount: Deactivated successfully. 
Sep 12 23:58:22.837051 containerd[1466]: time="2025-09-12T23:58:22.835007325Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-vdhzb,Uid:ed6d6d1f-7445-4602-8d9e-b0dd54215b8f,Namespace:calico-system,Attempt:1,}" Sep 12 23:58:23.020079 systemd-networkd[1378]: cali83c6c543473: Link UP Sep 12 23:58:23.022412 systemd-networkd[1378]: cali83c6c543473: Gained carrier Sep 12 23:58:23.046063 containerd[1466]: 2025-09-12 23:58:22.878 [INFO][4160] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 12 23:58:23.046063 containerd[1466]: 2025-09-12 23:58:22.896 [INFO][4160] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4081--3--5--n--326e2e5946-k8s-calico--apiserver--6c79dd94d7--k7nnj-eth0 calico-apiserver-6c79dd94d7- calico-apiserver 66998075-6768-4629-a394-a7e649462c77 933 0 2025-09-12 23:57:50 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:6c79dd94d7 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4081-3-5-n-326e2e5946 calico-apiserver-6c79dd94d7-k7nnj eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali83c6c543473 [] [] }} ContainerID="a487103d5eff70b8f13840377734d309c2ce664230a6e5eaa595b8997abac4de" Namespace="calico-apiserver" Pod="calico-apiserver-6c79dd94d7-k7nnj" WorkloadEndpoint="ci--4081--3--5--n--326e2e5946-k8s-calico--apiserver--6c79dd94d7--k7nnj-" Sep 12 23:58:23.046063 containerd[1466]: 2025-09-12 23:58:22.896 [INFO][4160] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="a487103d5eff70b8f13840377734d309c2ce664230a6e5eaa595b8997abac4de" Namespace="calico-apiserver" Pod="calico-apiserver-6c79dd94d7-k7nnj" WorkloadEndpoint="ci--4081--3--5--n--326e2e5946-k8s-calico--apiserver--6c79dd94d7--k7nnj-eth0" Sep 12 23:58:23.046063 
containerd[1466]: 2025-09-12 23:58:22.941 [INFO][4184] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="a487103d5eff70b8f13840377734d309c2ce664230a6e5eaa595b8997abac4de" HandleID="k8s-pod-network.a487103d5eff70b8f13840377734d309c2ce664230a6e5eaa595b8997abac4de" Workload="ci--4081--3--5--n--326e2e5946-k8s-calico--apiserver--6c79dd94d7--k7nnj-eth0" Sep 12 23:58:23.046063 containerd[1466]: 2025-09-12 23:58:22.941 [INFO][4184] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="a487103d5eff70b8f13840377734d309c2ce664230a6e5eaa595b8997abac4de" HandleID="k8s-pod-network.a487103d5eff70b8f13840377734d309c2ce664230a6e5eaa595b8997abac4de" Workload="ci--4081--3--5--n--326e2e5946-k8s-calico--apiserver--6c79dd94d7--k7nnj-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002d2ff0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4081-3-5-n-326e2e5946", "pod":"calico-apiserver-6c79dd94d7-k7nnj", "timestamp":"2025-09-12 23:58:22.941470396 +0000 UTC"}, Hostname:"ci-4081-3-5-n-326e2e5946", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 23:58:23.046063 containerd[1466]: 2025-09-12 23:58:22.941 [INFO][4184] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 23:58:23.046063 containerd[1466]: 2025-09-12 23:58:22.942 [INFO][4184] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 12 23:58:23.046063 containerd[1466]: 2025-09-12 23:58:22.942 [INFO][4184] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4081-3-5-n-326e2e5946' Sep 12 23:58:23.046063 containerd[1466]: 2025-09-12 23:58:22.960 [INFO][4184] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.a487103d5eff70b8f13840377734d309c2ce664230a6e5eaa595b8997abac4de" host="ci-4081-3-5-n-326e2e5946" Sep 12 23:58:23.046063 containerd[1466]: 2025-09-12 23:58:22.967 [INFO][4184] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4081-3-5-n-326e2e5946" Sep 12 23:58:23.046063 containerd[1466]: 2025-09-12 23:58:22.976 [INFO][4184] ipam/ipam.go 511: Trying affinity for 192.168.36.64/26 host="ci-4081-3-5-n-326e2e5946" Sep 12 23:58:23.046063 containerd[1466]: 2025-09-12 23:58:22.980 [INFO][4184] ipam/ipam.go 158: Attempting to load block cidr=192.168.36.64/26 host="ci-4081-3-5-n-326e2e5946" Sep 12 23:58:23.046063 containerd[1466]: 2025-09-12 23:58:22.984 [INFO][4184] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.36.64/26 host="ci-4081-3-5-n-326e2e5946" Sep 12 23:58:23.046063 containerd[1466]: 2025-09-12 23:58:22.985 [INFO][4184] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.36.64/26 handle="k8s-pod-network.a487103d5eff70b8f13840377734d309c2ce664230a6e5eaa595b8997abac4de" host="ci-4081-3-5-n-326e2e5946" Sep 12 23:58:23.046063 containerd[1466]: 2025-09-12 23:58:22.988 [INFO][4184] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.a487103d5eff70b8f13840377734d309c2ce664230a6e5eaa595b8997abac4de Sep 12 23:58:23.046063 containerd[1466]: 2025-09-12 23:58:22.996 [INFO][4184] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.36.64/26 handle="k8s-pod-network.a487103d5eff70b8f13840377734d309c2ce664230a6e5eaa595b8997abac4de" host="ci-4081-3-5-n-326e2e5946" Sep 12 23:58:23.046063 containerd[1466]: 2025-09-12 23:58:23.008 [INFO][4184] ipam/ipam.go 1256: 
Successfully claimed IPs: [192.168.36.66/26] block=192.168.36.64/26 handle="k8s-pod-network.a487103d5eff70b8f13840377734d309c2ce664230a6e5eaa595b8997abac4de" host="ci-4081-3-5-n-326e2e5946" Sep 12 23:58:23.046063 containerd[1466]: 2025-09-12 23:58:23.008 [INFO][4184] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.36.66/26] handle="k8s-pod-network.a487103d5eff70b8f13840377734d309c2ce664230a6e5eaa595b8997abac4de" host="ci-4081-3-5-n-326e2e5946" Sep 12 23:58:23.046063 containerd[1466]: 2025-09-12 23:58:23.008 [INFO][4184] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 23:58:23.046063 containerd[1466]: 2025-09-12 23:58:23.008 [INFO][4184] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.36.66/26] IPv6=[] ContainerID="a487103d5eff70b8f13840377734d309c2ce664230a6e5eaa595b8997abac4de" HandleID="k8s-pod-network.a487103d5eff70b8f13840377734d309c2ce664230a6e5eaa595b8997abac4de" Workload="ci--4081--3--5--n--326e2e5946-k8s-calico--apiserver--6c79dd94d7--k7nnj-eth0" Sep 12 23:58:23.046940 containerd[1466]: 2025-09-12 23:58:23.013 [INFO][4160] cni-plugin/k8s.go 418: Populated endpoint ContainerID="a487103d5eff70b8f13840377734d309c2ce664230a6e5eaa595b8997abac4de" Namespace="calico-apiserver" Pod="calico-apiserver-6c79dd94d7-k7nnj" WorkloadEndpoint="ci--4081--3--5--n--326e2e5946-k8s-calico--apiserver--6c79dd94d7--k7nnj-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--5--n--326e2e5946-k8s-calico--apiserver--6c79dd94d7--k7nnj-eth0", GenerateName:"calico-apiserver-6c79dd94d7-", Namespace:"calico-apiserver", SelfLink:"", UID:"66998075-6768-4629-a394-a7e649462c77", ResourceVersion:"933", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 23, 57, 50, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", 
"app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6c79dd94d7", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-5-n-326e2e5946", ContainerID:"", Pod:"calico-apiserver-6c79dd94d7-k7nnj", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.36.66/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali83c6c543473", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 23:58:23.046940 containerd[1466]: 2025-09-12 23:58:23.013 [INFO][4160] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.36.66/32] ContainerID="a487103d5eff70b8f13840377734d309c2ce664230a6e5eaa595b8997abac4de" Namespace="calico-apiserver" Pod="calico-apiserver-6c79dd94d7-k7nnj" WorkloadEndpoint="ci--4081--3--5--n--326e2e5946-k8s-calico--apiserver--6c79dd94d7--k7nnj-eth0" Sep 12 23:58:23.046940 containerd[1466]: 2025-09-12 23:58:23.013 [INFO][4160] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali83c6c543473 ContainerID="a487103d5eff70b8f13840377734d309c2ce664230a6e5eaa595b8997abac4de" Namespace="calico-apiserver" Pod="calico-apiserver-6c79dd94d7-k7nnj" WorkloadEndpoint="ci--4081--3--5--n--326e2e5946-k8s-calico--apiserver--6c79dd94d7--k7nnj-eth0" Sep 12 23:58:23.046940 containerd[1466]: 2025-09-12 23:58:23.023 [INFO][4160] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="a487103d5eff70b8f13840377734d309c2ce664230a6e5eaa595b8997abac4de" Namespace="calico-apiserver" 
Pod="calico-apiserver-6c79dd94d7-k7nnj" WorkloadEndpoint="ci--4081--3--5--n--326e2e5946-k8s-calico--apiserver--6c79dd94d7--k7nnj-eth0" Sep 12 23:58:23.046940 containerd[1466]: 2025-09-12 23:58:23.024 [INFO][4160] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="a487103d5eff70b8f13840377734d309c2ce664230a6e5eaa595b8997abac4de" Namespace="calico-apiserver" Pod="calico-apiserver-6c79dd94d7-k7nnj" WorkloadEndpoint="ci--4081--3--5--n--326e2e5946-k8s-calico--apiserver--6c79dd94d7--k7nnj-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--5--n--326e2e5946-k8s-calico--apiserver--6c79dd94d7--k7nnj-eth0", GenerateName:"calico-apiserver-6c79dd94d7-", Namespace:"calico-apiserver", SelfLink:"", UID:"66998075-6768-4629-a394-a7e649462c77", ResourceVersion:"933", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 23, 57, 50, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6c79dd94d7", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-5-n-326e2e5946", ContainerID:"a487103d5eff70b8f13840377734d309c2ce664230a6e5eaa595b8997abac4de", Pod:"calico-apiserver-6c79dd94d7-k7nnj", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.36.66/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, 
InterfaceName:"cali83c6c543473", MAC:"e2:62:ac:16:0c:e9", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 23:58:23.046940 containerd[1466]: 2025-09-12 23:58:23.037 [INFO][4160] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="a487103d5eff70b8f13840377734d309c2ce664230a6e5eaa595b8997abac4de" Namespace="calico-apiserver" Pod="calico-apiserver-6c79dd94d7-k7nnj" WorkloadEndpoint="ci--4081--3--5--n--326e2e5946-k8s-calico--apiserver--6c79dd94d7--k7nnj-eth0" Sep 12 23:58:23.085699 containerd[1466]: time="2025-09-12T23:58:23.084527571Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Sep 12 23:58:23.085699 containerd[1466]: time="2025-09-12T23:58:23.084596413Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Sep 12 23:58:23.085699 containerd[1466]: time="2025-09-12T23:58:23.084659576Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 12 23:58:23.088387 containerd[1466]: time="2025-09-12T23:58:23.088290235Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 12 23:58:23.123641 systemd[1]: Started cri-containerd-a487103d5eff70b8f13840377734d309c2ce664230a6e5eaa595b8997abac4de.scope - libcontainer container a487103d5eff70b8f13840377734d309c2ce664230a6e5eaa595b8997abac4de. 
Sep 12 23:58:23.146062 systemd-networkd[1378]: calicb57a4712a1: Link UP Sep 12 23:58:23.147162 systemd-networkd[1378]: calicb57a4712a1: Gained carrier Sep 12 23:58:23.176696 containerd[1466]: 2025-09-12 23:58:22.895 [INFO][4169] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 12 23:58:23.176696 containerd[1466]: 2025-09-12 23:58:22.917 [INFO][4169] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4081--3--5--n--326e2e5946-k8s-csi--node--driver--vdhzb-eth0 csi-node-driver- calico-system ed6d6d1f-7445-4602-8d9e-b0dd54215b8f 932 0 2025-09-12 23:58:01 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:6c96d95cc7 k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s ci-4081-3-5-n-326e2e5946 csi-node-driver-vdhzb eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] calicb57a4712a1 [] [] }} ContainerID="cec3f72e2a154c18e795f75ae44c72769e391b4cd7e152a22dedbe89193ce5e1" Namespace="calico-system" Pod="csi-node-driver-vdhzb" WorkloadEndpoint="ci--4081--3--5--n--326e2e5946-k8s-csi--node--driver--vdhzb-" Sep 12 23:58:23.176696 containerd[1466]: 2025-09-12 23:58:22.917 [INFO][4169] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="cec3f72e2a154c18e795f75ae44c72769e391b4cd7e152a22dedbe89193ce5e1" Namespace="calico-system" Pod="csi-node-driver-vdhzb" WorkloadEndpoint="ci--4081--3--5--n--326e2e5946-k8s-csi--node--driver--vdhzb-eth0" Sep 12 23:58:23.176696 containerd[1466]: 2025-09-12 23:58:22.962 [INFO][4189] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="cec3f72e2a154c18e795f75ae44c72769e391b4cd7e152a22dedbe89193ce5e1" HandleID="k8s-pod-network.cec3f72e2a154c18e795f75ae44c72769e391b4cd7e152a22dedbe89193ce5e1" 
Workload="ci--4081--3--5--n--326e2e5946-k8s-csi--node--driver--vdhzb-eth0" Sep 12 23:58:23.176696 containerd[1466]: 2025-09-12 23:58:22.963 [INFO][4189] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="cec3f72e2a154c18e795f75ae44c72769e391b4cd7e152a22dedbe89193ce5e1" HandleID="k8s-pod-network.cec3f72e2a154c18e795f75ae44c72769e391b4cd7e152a22dedbe89193ce5e1" Workload="ci--4081--3--5--n--326e2e5946-k8s-csi--node--driver--vdhzb-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400024b100), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4081-3-5-n-326e2e5946", "pod":"csi-node-driver-vdhzb", "timestamp":"2025-09-12 23:58:22.962570211 +0000 UTC"}, Hostname:"ci-4081-3-5-n-326e2e5946", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 23:58:23.176696 containerd[1466]: 2025-09-12 23:58:22.963 [INFO][4189] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 23:58:23.176696 containerd[1466]: 2025-09-12 23:58:23.008 [INFO][4189] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 12 23:58:23.176696 containerd[1466]: 2025-09-12 23:58:23.008 [INFO][4189] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4081-3-5-n-326e2e5946' Sep 12 23:58:23.176696 containerd[1466]: 2025-09-12 23:58:23.060 [INFO][4189] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.cec3f72e2a154c18e795f75ae44c72769e391b4cd7e152a22dedbe89193ce5e1" host="ci-4081-3-5-n-326e2e5946" Sep 12 23:58:23.176696 containerd[1466]: 2025-09-12 23:58:23.073 [INFO][4189] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4081-3-5-n-326e2e5946" Sep 12 23:58:23.176696 containerd[1466]: 2025-09-12 23:58:23.086 [INFO][4189] ipam/ipam.go 511: Trying affinity for 192.168.36.64/26 host="ci-4081-3-5-n-326e2e5946" Sep 12 23:58:23.176696 containerd[1466]: 2025-09-12 23:58:23.092 [INFO][4189] ipam/ipam.go 158: Attempting to load block cidr=192.168.36.64/26 host="ci-4081-3-5-n-326e2e5946" Sep 12 23:58:23.176696 containerd[1466]: 2025-09-12 23:58:23.096 [INFO][4189] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.36.64/26 host="ci-4081-3-5-n-326e2e5946" Sep 12 23:58:23.176696 containerd[1466]: 2025-09-12 23:58:23.096 [INFO][4189] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.36.64/26 handle="k8s-pod-network.cec3f72e2a154c18e795f75ae44c72769e391b4cd7e152a22dedbe89193ce5e1" host="ci-4081-3-5-n-326e2e5946" Sep 12 23:58:23.176696 containerd[1466]: 2025-09-12 23:58:23.098 [INFO][4189] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.cec3f72e2a154c18e795f75ae44c72769e391b4cd7e152a22dedbe89193ce5e1 Sep 12 23:58:23.176696 containerd[1466]: 2025-09-12 23:58:23.108 [INFO][4189] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.36.64/26 handle="k8s-pod-network.cec3f72e2a154c18e795f75ae44c72769e391b4cd7e152a22dedbe89193ce5e1" host="ci-4081-3-5-n-326e2e5946" Sep 12 23:58:23.176696 containerd[1466]: 2025-09-12 23:58:23.127 [INFO][4189] ipam/ipam.go 1256: 
Successfully claimed IPs: [192.168.36.67/26] block=192.168.36.64/26 handle="k8s-pod-network.cec3f72e2a154c18e795f75ae44c72769e391b4cd7e152a22dedbe89193ce5e1" host="ci-4081-3-5-n-326e2e5946" Sep 12 23:58:23.176696 containerd[1466]: 2025-09-12 23:58:23.127 [INFO][4189] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.36.67/26] handle="k8s-pod-network.cec3f72e2a154c18e795f75ae44c72769e391b4cd7e152a22dedbe89193ce5e1" host="ci-4081-3-5-n-326e2e5946" Sep 12 23:58:23.176696 containerd[1466]: 2025-09-12 23:58:23.127 [INFO][4189] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 23:58:23.176696 containerd[1466]: 2025-09-12 23:58:23.127 [INFO][4189] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.36.67/26] IPv6=[] ContainerID="cec3f72e2a154c18e795f75ae44c72769e391b4cd7e152a22dedbe89193ce5e1" HandleID="k8s-pod-network.cec3f72e2a154c18e795f75ae44c72769e391b4cd7e152a22dedbe89193ce5e1" Workload="ci--4081--3--5--n--326e2e5946-k8s-csi--node--driver--vdhzb-eth0" Sep 12 23:58:23.177547 containerd[1466]: 2025-09-12 23:58:23.138 [INFO][4169] cni-plugin/k8s.go 418: Populated endpoint ContainerID="cec3f72e2a154c18e795f75ae44c72769e391b4cd7e152a22dedbe89193ce5e1" Namespace="calico-system" Pod="csi-node-driver-vdhzb" WorkloadEndpoint="ci--4081--3--5--n--326e2e5946-k8s-csi--node--driver--vdhzb-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--5--n--326e2e5946-k8s-csi--node--driver--vdhzb-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"ed6d6d1f-7445-4602-8d9e-b0dd54215b8f", ResourceVersion:"932", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 23, 58, 1, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"6c96d95cc7", "k8s-app":"csi-node-driver", 
"name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-5-n-326e2e5946", ContainerID:"", Pod:"csi-node-driver-vdhzb", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.36.67/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calicb57a4712a1", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 23:58:23.177547 containerd[1466]: 2025-09-12 23:58:23.138 [INFO][4169] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.36.67/32] ContainerID="cec3f72e2a154c18e795f75ae44c72769e391b4cd7e152a22dedbe89193ce5e1" Namespace="calico-system" Pod="csi-node-driver-vdhzb" WorkloadEndpoint="ci--4081--3--5--n--326e2e5946-k8s-csi--node--driver--vdhzb-eth0" Sep 12 23:58:23.177547 containerd[1466]: 2025-09-12 23:58:23.138 [INFO][4169] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calicb57a4712a1 ContainerID="cec3f72e2a154c18e795f75ae44c72769e391b4cd7e152a22dedbe89193ce5e1" Namespace="calico-system" Pod="csi-node-driver-vdhzb" WorkloadEndpoint="ci--4081--3--5--n--326e2e5946-k8s-csi--node--driver--vdhzb-eth0" Sep 12 23:58:23.177547 containerd[1466]: 2025-09-12 23:58:23.148 [INFO][4169] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="cec3f72e2a154c18e795f75ae44c72769e391b4cd7e152a22dedbe89193ce5e1" Namespace="calico-system" Pod="csi-node-driver-vdhzb" WorkloadEndpoint="ci--4081--3--5--n--326e2e5946-k8s-csi--node--driver--vdhzb-eth0" Sep 12 23:58:23.177547 
containerd[1466]: 2025-09-12 23:58:23.149 [INFO][4169] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="cec3f72e2a154c18e795f75ae44c72769e391b4cd7e152a22dedbe89193ce5e1" Namespace="calico-system" Pod="csi-node-driver-vdhzb" WorkloadEndpoint="ci--4081--3--5--n--326e2e5946-k8s-csi--node--driver--vdhzb-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--5--n--326e2e5946-k8s-csi--node--driver--vdhzb-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"ed6d6d1f-7445-4602-8d9e-b0dd54215b8f", ResourceVersion:"932", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 23, 58, 1, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"6c96d95cc7", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-5-n-326e2e5946", ContainerID:"cec3f72e2a154c18e795f75ae44c72769e391b4cd7e152a22dedbe89193ce5e1", Pod:"csi-node-driver-vdhzb", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.36.67/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calicb57a4712a1", MAC:"fa:dc:d6:16:ce:25", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 23:58:23.177547 containerd[1466]: 
2025-09-12 23:58:23.173 [INFO][4169] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="cec3f72e2a154c18e795f75ae44c72769e391b4cd7e152a22dedbe89193ce5e1" Namespace="calico-system" Pod="csi-node-driver-vdhzb" WorkloadEndpoint="ci--4081--3--5--n--326e2e5946-k8s-csi--node--driver--vdhzb-eth0" Sep 12 23:58:23.240347 containerd[1466]: time="2025-09-12T23:58:23.233343746Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Sep 12 23:58:23.240347 containerd[1466]: time="2025-09-12T23:58:23.233446190Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Sep 12 23:58:23.240347 containerd[1466]: time="2025-09-12T23:58:23.233458750Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 12 23:58:23.240347 containerd[1466]: time="2025-09-12T23:58:23.233543953Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 12 23:58:23.284784 systemd[1]: Started cri-containerd-cec3f72e2a154c18e795f75ae44c72769e391b4cd7e152a22dedbe89193ce5e1.scope - libcontainer container cec3f72e2a154c18e795f75ae44c72769e391b4cd7e152a22dedbe89193ce5e1. 
Sep 12 23:58:23.333822 containerd[1466]: time="2025-09-12T23:58:23.333274210Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6c79dd94d7-k7nnj,Uid:66998075-6768-4629-a394-a7e649462c77,Namespace:calico-apiserver,Attempt:1,} returns sandbox id \"a487103d5eff70b8f13840377734d309c2ce664230a6e5eaa595b8997abac4de\"" Sep 12 23:58:23.439740 containerd[1466]: time="2025-09-12T23:58:23.439678882Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-vdhzb,Uid:ed6d6d1f-7445-4602-8d9e-b0dd54215b8f,Namespace:calico-system,Attempt:1,} returns sandbox id \"cec3f72e2a154c18e795f75ae44c72769e391b4cd7e152a22dedbe89193ce5e1\"" Sep 12 23:58:23.642427 containerd[1466]: time="2025-09-12T23:58:23.642260394Z" level=info msg="StopPodSandbox for \"7e321eb757593a56c9ed12f777eae66d167b04d882e2b07864e347691e3c3f98\"" Sep 12 23:58:23.822025 containerd[1466]: 2025-09-12 23:58:23.755 [INFO][4330] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="7e321eb757593a56c9ed12f777eae66d167b04d882e2b07864e347691e3c3f98" Sep 12 23:58:23.822025 containerd[1466]: 2025-09-12 23:58:23.756 [INFO][4330] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="7e321eb757593a56c9ed12f777eae66d167b04d882e2b07864e347691e3c3f98" iface="eth0" netns="/var/run/netns/cni-e778e546-0535-3202-fa71-50e1f687fe69" Sep 12 23:58:23.822025 containerd[1466]: 2025-09-12 23:58:23.758 [INFO][4330] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="7e321eb757593a56c9ed12f777eae66d167b04d882e2b07864e347691e3c3f98" iface="eth0" netns="/var/run/netns/cni-e778e546-0535-3202-fa71-50e1f687fe69" Sep 12 23:58:23.822025 containerd[1466]: 2025-09-12 23:58:23.761 [INFO][4330] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. 
ContainerID="7e321eb757593a56c9ed12f777eae66d167b04d882e2b07864e347691e3c3f98" iface="eth0" netns="/var/run/netns/cni-e778e546-0535-3202-fa71-50e1f687fe69" Sep 12 23:58:23.822025 containerd[1466]: 2025-09-12 23:58:23.761 [INFO][4330] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="7e321eb757593a56c9ed12f777eae66d167b04d882e2b07864e347691e3c3f98" Sep 12 23:58:23.822025 containerd[1466]: 2025-09-12 23:58:23.762 [INFO][4330] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="7e321eb757593a56c9ed12f777eae66d167b04d882e2b07864e347691e3c3f98" Sep 12 23:58:23.822025 containerd[1466]: 2025-09-12 23:58:23.793 [INFO][4337] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="7e321eb757593a56c9ed12f777eae66d167b04d882e2b07864e347691e3c3f98" HandleID="k8s-pod-network.7e321eb757593a56c9ed12f777eae66d167b04d882e2b07864e347691e3c3f98" Workload="ci--4081--3--5--n--326e2e5946-k8s-calico--kube--controllers--6c4999d64--82t4c-eth0" Sep 12 23:58:23.822025 containerd[1466]: 2025-09-12 23:58:23.794 [INFO][4337] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 23:58:23.822025 containerd[1466]: 2025-09-12 23:58:23.794 [INFO][4337] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 12 23:58:23.822025 containerd[1466]: 2025-09-12 23:58:23.814 [WARNING][4337] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="7e321eb757593a56c9ed12f777eae66d167b04d882e2b07864e347691e3c3f98" HandleID="k8s-pod-network.7e321eb757593a56c9ed12f777eae66d167b04d882e2b07864e347691e3c3f98" Workload="ci--4081--3--5--n--326e2e5946-k8s-calico--kube--controllers--6c4999d64--82t4c-eth0" Sep 12 23:58:23.822025 containerd[1466]: 2025-09-12 23:58:23.815 [INFO][4337] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="7e321eb757593a56c9ed12f777eae66d167b04d882e2b07864e347691e3c3f98" HandleID="k8s-pod-network.7e321eb757593a56c9ed12f777eae66d167b04d882e2b07864e347691e3c3f98" Workload="ci--4081--3--5--n--326e2e5946-k8s-calico--kube--controllers--6c4999d64--82t4c-eth0" Sep 12 23:58:23.822025 containerd[1466]: 2025-09-12 23:58:23.817 [INFO][4337] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 23:58:23.822025 containerd[1466]: 2025-09-12 23:58:23.819 [INFO][4330] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="7e321eb757593a56c9ed12f777eae66d167b04d882e2b07864e347691e3c3f98" Sep 12 23:58:23.822521 containerd[1466]: time="2025-09-12T23:58:23.822364046Z" level=info msg="TearDown network for sandbox \"7e321eb757593a56c9ed12f777eae66d167b04d882e2b07864e347691e3c3f98\" successfully" Sep 12 23:58:23.822521 containerd[1466]: time="2025-09-12T23:58:23.822394967Z" level=info msg="StopPodSandbox for \"7e321eb757593a56c9ed12f777eae66d167b04d882e2b07864e347691e3c3f98\" returns successfully" Sep 12 23:58:23.825123 containerd[1466]: time="2025-09-12T23:58:23.824947265Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-6c4999d64-82t4c,Uid:94d742ff-488d-49c2-b166-9848bac8f37e,Namespace:calico-system,Attempt:1,}" Sep 12 23:58:23.827863 systemd[1]: run-netns-cni\x2de778e546\x2d0535\x2d3202\x2dfa71\x2d50e1f687fe69.mount: Deactivated successfully. 
Sep 12 23:58:24.003592 kubelet[2585]: I0912 23:58:24.003554 2585 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 12 23:58:24.022807 systemd-networkd[1378]: calide756206fe3: Link UP Sep 12 23:58:24.028173 systemd-networkd[1378]: calide756206fe3: Gained carrier Sep 12 23:58:24.060806 containerd[1466]: 2025-09-12 23:58:23.882 [INFO][4344] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 12 23:58:24.060806 containerd[1466]: 2025-09-12 23:58:23.901 [INFO][4344] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4081--3--5--n--326e2e5946-k8s-calico--kube--controllers--6c4999d64--82t4c-eth0 calico-kube-controllers-6c4999d64- calico-system 94d742ff-488d-49c2-b166-9848bac8f37e 945 0 2025-09-12 23:58:01 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:6c4999d64 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ci-4081-3-5-n-326e2e5946 calico-kube-controllers-6c4999d64-82t4c eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] calide756206fe3 [] [] }} ContainerID="7aea2aa7848c44875e8f00feb031c6d89d7731ad1fa2a8376fd3054b3b18a753" Namespace="calico-system" Pod="calico-kube-controllers-6c4999d64-82t4c" WorkloadEndpoint="ci--4081--3--5--n--326e2e5946-k8s-calico--kube--controllers--6c4999d64--82t4c-" Sep 12 23:58:24.060806 containerd[1466]: 2025-09-12 23:58:23.901 [INFO][4344] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="7aea2aa7848c44875e8f00feb031c6d89d7731ad1fa2a8376fd3054b3b18a753" Namespace="calico-system" Pod="calico-kube-controllers-6c4999d64-82t4c" WorkloadEndpoint="ci--4081--3--5--n--326e2e5946-k8s-calico--kube--controllers--6c4999d64--82t4c-eth0" Sep 12 23:58:24.060806 containerd[1466]: 2025-09-12 23:58:23.940 [INFO][4357] 
ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="7aea2aa7848c44875e8f00feb031c6d89d7731ad1fa2a8376fd3054b3b18a753" HandleID="k8s-pod-network.7aea2aa7848c44875e8f00feb031c6d89d7731ad1fa2a8376fd3054b3b18a753" Workload="ci--4081--3--5--n--326e2e5946-k8s-calico--kube--controllers--6c4999d64--82t4c-eth0" Sep 12 23:58:24.060806 containerd[1466]: 2025-09-12 23:58:23.940 [INFO][4357] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="7aea2aa7848c44875e8f00feb031c6d89d7731ad1fa2a8376fd3054b3b18a753" HandleID="k8s-pod-network.7aea2aa7848c44875e8f00feb031c6d89d7731ad1fa2a8376fd3054b3b18a753" Workload="ci--4081--3--5--n--326e2e5946-k8s-calico--kube--controllers--6c4999d64--82t4c-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002d2ff0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4081-3-5-n-326e2e5946", "pod":"calico-kube-controllers-6c4999d64-82t4c", "timestamp":"2025-09-12 23:58:23.940407404 +0000 UTC"}, Hostname:"ci-4081-3-5-n-326e2e5946", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 23:58:24.060806 containerd[1466]: 2025-09-12 23:58:23.940 [INFO][4357] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 23:58:24.060806 containerd[1466]: 2025-09-12 23:58:23.940 [INFO][4357] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 12 23:58:24.060806 containerd[1466]: 2025-09-12 23:58:23.940 [INFO][4357] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4081-3-5-n-326e2e5946' Sep 12 23:58:24.060806 containerd[1466]: 2025-09-12 23:58:23.954 [INFO][4357] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.7aea2aa7848c44875e8f00feb031c6d89d7731ad1fa2a8376fd3054b3b18a753" host="ci-4081-3-5-n-326e2e5946" Sep 12 23:58:24.060806 containerd[1466]: 2025-09-12 23:58:23.962 [INFO][4357] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4081-3-5-n-326e2e5946" Sep 12 23:58:24.060806 containerd[1466]: 2025-09-12 23:58:23.969 [INFO][4357] ipam/ipam.go 511: Trying affinity for 192.168.36.64/26 host="ci-4081-3-5-n-326e2e5946" Sep 12 23:58:24.060806 containerd[1466]: 2025-09-12 23:58:23.973 [INFO][4357] ipam/ipam.go 158: Attempting to load block cidr=192.168.36.64/26 host="ci-4081-3-5-n-326e2e5946" Sep 12 23:58:24.060806 containerd[1466]: 2025-09-12 23:58:23.978 [INFO][4357] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.36.64/26 host="ci-4081-3-5-n-326e2e5946" Sep 12 23:58:24.060806 containerd[1466]: 2025-09-12 23:58:23.978 [INFO][4357] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.36.64/26 handle="k8s-pod-network.7aea2aa7848c44875e8f00feb031c6d89d7731ad1fa2a8376fd3054b3b18a753" host="ci-4081-3-5-n-326e2e5946" Sep 12 23:58:24.060806 containerd[1466]: 2025-09-12 23:58:23.983 [INFO][4357] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.7aea2aa7848c44875e8f00feb031c6d89d7731ad1fa2a8376fd3054b3b18a753 Sep 12 23:58:24.060806 containerd[1466]: 2025-09-12 23:58:23.991 [INFO][4357] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.36.64/26 handle="k8s-pod-network.7aea2aa7848c44875e8f00feb031c6d89d7731ad1fa2a8376fd3054b3b18a753" host="ci-4081-3-5-n-326e2e5946" Sep 12 23:58:24.060806 containerd[1466]: 2025-09-12 23:58:24.006 [INFO][4357] ipam/ipam.go 1256: 
Successfully claimed IPs: [192.168.36.68/26] block=192.168.36.64/26 handle="k8s-pod-network.7aea2aa7848c44875e8f00feb031c6d89d7731ad1fa2a8376fd3054b3b18a753" host="ci-4081-3-5-n-326e2e5946" Sep 12 23:58:24.060806 containerd[1466]: 2025-09-12 23:58:24.007 [INFO][4357] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.36.68/26] handle="k8s-pod-network.7aea2aa7848c44875e8f00feb031c6d89d7731ad1fa2a8376fd3054b3b18a753" host="ci-4081-3-5-n-326e2e5946" Sep 12 23:58:24.060806 containerd[1466]: 2025-09-12 23:58:24.007 [INFO][4357] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 23:58:24.060806 containerd[1466]: 2025-09-12 23:58:24.007 [INFO][4357] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.36.68/26] IPv6=[] ContainerID="7aea2aa7848c44875e8f00feb031c6d89d7731ad1fa2a8376fd3054b3b18a753" HandleID="k8s-pod-network.7aea2aa7848c44875e8f00feb031c6d89d7731ad1fa2a8376fd3054b3b18a753" Workload="ci--4081--3--5--n--326e2e5946-k8s-calico--kube--controllers--6c4999d64--82t4c-eth0" Sep 12 23:58:24.061466 containerd[1466]: 2025-09-12 23:58:24.016 [INFO][4344] cni-plugin/k8s.go 418: Populated endpoint ContainerID="7aea2aa7848c44875e8f00feb031c6d89d7731ad1fa2a8376fd3054b3b18a753" Namespace="calico-system" Pod="calico-kube-controllers-6c4999d64-82t4c" WorkloadEndpoint="ci--4081--3--5--n--326e2e5946-k8s-calico--kube--controllers--6c4999d64--82t4c-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--5--n--326e2e5946-k8s-calico--kube--controllers--6c4999d64--82t4c-eth0", GenerateName:"calico-kube-controllers-6c4999d64-", Namespace:"calico-system", SelfLink:"", UID:"94d742ff-488d-49c2-b166-9848bac8f37e", ResourceVersion:"945", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 23, 58, 1, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), 
Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"6c4999d64", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-5-n-326e2e5946", ContainerID:"", Pod:"calico-kube-controllers-6c4999d64-82t4c", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.36.68/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calide756206fe3", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 23:58:24.061466 containerd[1466]: 2025-09-12 23:58:24.016 [INFO][4344] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.36.68/32] ContainerID="7aea2aa7848c44875e8f00feb031c6d89d7731ad1fa2a8376fd3054b3b18a753" Namespace="calico-system" Pod="calico-kube-controllers-6c4999d64-82t4c" WorkloadEndpoint="ci--4081--3--5--n--326e2e5946-k8s-calico--kube--controllers--6c4999d64--82t4c-eth0" Sep 12 23:58:24.061466 containerd[1466]: 2025-09-12 23:58:24.016 [INFO][4344] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calide756206fe3 ContainerID="7aea2aa7848c44875e8f00feb031c6d89d7731ad1fa2a8376fd3054b3b18a753" Namespace="calico-system" Pod="calico-kube-controllers-6c4999d64-82t4c" WorkloadEndpoint="ci--4081--3--5--n--326e2e5946-k8s-calico--kube--controllers--6c4999d64--82t4c-eth0" Sep 12 23:58:24.061466 containerd[1466]: 2025-09-12 23:58:24.028 [INFO][4344] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding 
ContainerID="7aea2aa7848c44875e8f00feb031c6d89d7731ad1fa2a8376fd3054b3b18a753" Namespace="calico-system" Pod="calico-kube-controllers-6c4999d64-82t4c" WorkloadEndpoint="ci--4081--3--5--n--326e2e5946-k8s-calico--kube--controllers--6c4999d64--82t4c-eth0" Sep 12 23:58:24.061466 containerd[1466]: 2025-09-12 23:58:24.030 [INFO][4344] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="7aea2aa7848c44875e8f00feb031c6d89d7731ad1fa2a8376fd3054b3b18a753" Namespace="calico-system" Pod="calico-kube-controllers-6c4999d64-82t4c" WorkloadEndpoint="ci--4081--3--5--n--326e2e5946-k8s-calico--kube--controllers--6c4999d64--82t4c-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--5--n--326e2e5946-k8s-calico--kube--controllers--6c4999d64--82t4c-eth0", GenerateName:"calico-kube-controllers-6c4999d64-", Namespace:"calico-system", SelfLink:"", UID:"94d742ff-488d-49c2-b166-9848bac8f37e", ResourceVersion:"945", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 23, 58, 1, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"6c4999d64", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-5-n-326e2e5946", ContainerID:"7aea2aa7848c44875e8f00feb031c6d89d7731ad1fa2a8376fd3054b3b18a753", Pod:"calico-kube-controllers-6c4999d64-82t4c", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.36.68/32"}, 
IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calide756206fe3", MAC:"2e:03:d8:71:b0:a1", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 23:58:24.061466 containerd[1466]: 2025-09-12 23:58:24.054 [INFO][4344] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="7aea2aa7848c44875e8f00feb031c6d89d7731ad1fa2a8376fd3054b3b18a753" Namespace="calico-system" Pod="calico-kube-controllers-6c4999d64-82t4c" WorkloadEndpoint="ci--4081--3--5--n--326e2e5946-k8s-calico--kube--controllers--6c4999d64--82t4c-eth0" Sep 12 23:58:24.130037 containerd[1466]: time="2025-09-12T23:58:24.127967060Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Sep 12 23:58:24.130772 containerd[1466]: time="2025-09-12T23:58:24.129181146Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Sep 12 23:58:24.130772 containerd[1466]: time="2025-09-12T23:58:24.129836211Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 12 23:58:24.132083 containerd[1466]: time="2025-09-12T23:58:24.131959692Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 12 23:58:24.177068 systemd[1]: Started cri-containerd-7aea2aa7848c44875e8f00feb031c6d89d7731ad1fa2a8376fd3054b3b18a753.scope - libcontainer container 7aea2aa7848c44875e8f00feb031c6d89d7731ad1fa2a8376fd3054b3b18a753. 
Sep 12 23:58:24.185945 systemd-networkd[1378]: calicb57a4712a1: Gained IPv6LL Sep 12 23:58:24.265793 containerd[1466]: time="2025-09-12T23:58:24.265511039Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-6c4999d64-82t4c,Uid:94d742ff-488d-49c2-b166-9848bac8f37e,Namespace:calico-system,Attempt:1,} returns sandbox id \"7aea2aa7848c44875e8f00feb031c6d89d7731ad1fa2a8376fd3054b3b18a753\"" Sep 12 23:58:24.374954 kernel: bpftool[4429]: memfd_create() called without MFD_EXEC or MFD_NOEXEC_SEAL set Sep 12 23:58:24.460978 containerd[1466]: time="2025-09-12T23:58:24.460443276Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:58:24.463485 containerd[1466]: time="2025-09-12T23:58:24.463426989Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.3: active requests=0, bytes read=30823700" Sep 12 23:58:24.465863 containerd[1466]: time="2025-09-12T23:58:24.464958167Z" level=info msg="ImageCreate event name:\"sha256:e210e86234bc99f018431b30477c5ca2ad6f7ecf67ef011498f7beb48fb0b21f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:58:24.468908 containerd[1466]: time="2025-09-12T23:58:24.467878238Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend@sha256:29becebc47401da9997a2a30f4c25c511a5f379d17275680b048224829af71a5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:58:24.469648 containerd[1466]: time="2025-09-12T23:58:24.468785672Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" with image id \"sha256:e210e86234bc99f018431b30477c5ca2ad6f7ecf67ef011498f7beb48fb0b21f\", repo tag \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/whisker-backend@sha256:29becebc47401da9997a2a30f4c25c511a5f379d17275680b048224829af71a5\", size \"30823530\" in 3.089706473s" Sep 12 
23:58:24.469804 containerd[1466]: time="2025-09-12T23:58:24.469787190Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" returns image reference \"sha256:e210e86234bc99f018431b30477c5ca2ad6f7ecf67ef011498f7beb48fb0b21f\"" Sep 12 23:58:24.478352 containerd[1466]: time="2025-09-12T23:58:24.478124027Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\"" Sep 12 23:58:24.507211 systemd-networkd[1378]: cali83c6c543473: Gained IPv6LL Sep 12 23:58:24.525673 containerd[1466]: time="2025-09-12T23:58:24.525154051Z" level=info msg="CreateContainer within sandbox \"dfa2da7c6f6034002d64086cb7320d1c3b9f756d2c4a6b02dc1f1dd5b041de0f\" for container &ContainerMetadata{Name:whisker-backend,Attempt:0,}" Sep 12 23:58:24.543384 containerd[1466]: time="2025-09-12T23:58:24.542830562Z" level=info msg="CreateContainer within sandbox \"dfa2da7c6f6034002d64086cb7320d1c3b9f756d2c4a6b02dc1f1dd5b041de0f\" for &ContainerMetadata{Name:whisker-backend,Attempt:0,} returns container id \"c9e75a57e736fcad33d3f216b843fa42bc09be9911807587442b963087db36cd\"" Sep 12 23:58:24.544933 containerd[1466]: time="2025-09-12T23:58:24.544899841Z" level=info msg="StartContainer for \"c9e75a57e736fcad33d3f216b843fa42bc09be9911807587442b963087db36cd\"" Sep 12 23:58:24.593966 systemd[1]: Started cri-containerd-c9e75a57e736fcad33d3f216b843fa42bc09be9911807587442b963087db36cd.scope - libcontainer container c9e75a57e736fcad33d3f216b843fa42bc09be9911807587442b963087db36cd. Sep 12 23:58:24.699080 containerd[1466]: time="2025-09-12T23:58:24.698780400Z" level=info msg="StartContainer for \"c9e75a57e736fcad33d3f216b843fa42bc09be9911807587442b963087db36cd\" returns successfully" Sep 12 23:58:24.808616 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1350849795.mount: Deactivated successfully. 
Sep 12 23:58:25.000247 systemd-networkd[1378]: vxlan.calico: Link UP Sep 12 23:58:25.000262 systemd-networkd[1378]: vxlan.calico: Gained carrier Sep 12 23:58:25.641022 containerd[1466]: time="2025-09-12T23:58:25.640954077Z" level=info msg="StopPodSandbox for \"348f3835785bdb15cdb5abb8b74a97bc344aad17949ebf75bb73ba1d63b96b70\"" Sep 12 23:58:25.726914 kubelet[2585]: I0912 23:58:25.726344 2585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/whisker-58478cff6c-kqfx8" podStartSLOduration=2.98722677 podStartE2EDuration="7.72632301s" podCreationTimestamp="2025-09-12 23:58:18 +0000 UTC" firstStartedPulling="2025-09-12 23:58:19.738539968 +0000 UTC m=+42.217773055" lastFinishedPulling="2025-09-12 23:58:24.477636168 +0000 UTC m=+46.956869295" observedRunningTime="2025-09-12 23:58:24.9665486 +0000 UTC m=+47.445781807" watchObservedRunningTime="2025-09-12 23:58:25.72632301 +0000 UTC m=+48.205556137" Sep 12 23:58:25.769258 containerd[1466]: 2025-09-12 23:58:25.723 [INFO][4593] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="348f3835785bdb15cdb5abb8b74a97bc344aad17949ebf75bb73ba1d63b96b70" Sep 12 23:58:25.769258 containerd[1466]: 2025-09-12 23:58:25.723 [INFO][4593] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="348f3835785bdb15cdb5abb8b74a97bc344aad17949ebf75bb73ba1d63b96b70" iface="eth0" netns="/var/run/netns/cni-04b3452a-1959-b95b-b036-46c9de2ce69b" Sep 12 23:58:25.769258 containerd[1466]: 2025-09-12 23:58:25.729 [INFO][4593] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="348f3835785bdb15cdb5abb8b74a97bc344aad17949ebf75bb73ba1d63b96b70" iface="eth0" netns="/var/run/netns/cni-04b3452a-1959-b95b-b036-46c9de2ce69b" Sep 12 23:58:25.769258 containerd[1466]: 2025-09-12 23:58:25.729 [INFO][4593] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. 
ContainerID="348f3835785bdb15cdb5abb8b74a97bc344aad17949ebf75bb73ba1d63b96b70" iface="eth0" netns="/var/run/netns/cni-04b3452a-1959-b95b-b036-46c9de2ce69b" Sep 12 23:58:25.769258 containerd[1466]: 2025-09-12 23:58:25.729 [INFO][4593] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="348f3835785bdb15cdb5abb8b74a97bc344aad17949ebf75bb73ba1d63b96b70" Sep 12 23:58:25.769258 containerd[1466]: 2025-09-12 23:58:25.729 [INFO][4593] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="348f3835785bdb15cdb5abb8b74a97bc344aad17949ebf75bb73ba1d63b96b70" Sep 12 23:58:25.769258 containerd[1466]: 2025-09-12 23:58:25.750 [INFO][4602] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="348f3835785bdb15cdb5abb8b74a97bc344aad17949ebf75bb73ba1d63b96b70" HandleID="k8s-pod-network.348f3835785bdb15cdb5abb8b74a97bc344aad17949ebf75bb73ba1d63b96b70" Workload="ci--4081--3--5--n--326e2e5946-k8s-coredns--668d6bf9bc--hvkxc-eth0" Sep 12 23:58:25.769258 containerd[1466]: 2025-09-12 23:58:25.750 [INFO][4602] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 23:58:25.769258 containerd[1466]: 2025-09-12 23:58:25.750 [INFO][4602] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 12 23:58:25.769258 containerd[1466]: 2025-09-12 23:58:25.760 [WARNING][4602] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="348f3835785bdb15cdb5abb8b74a97bc344aad17949ebf75bb73ba1d63b96b70" HandleID="k8s-pod-network.348f3835785bdb15cdb5abb8b74a97bc344aad17949ebf75bb73ba1d63b96b70" Workload="ci--4081--3--5--n--326e2e5946-k8s-coredns--668d6bf9bc--hvkxc-eth0" Sep 12 23:58:25.769258 containerd[1466]: 2025-09-12 23:58:25.760 [INFO][4602] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="348f3835785bdb15cdb5abb8b74a97bc344aad17949ebf75bb73ba1d63b96b70" HandleID="k8s-pod-network.348f3835785bdb15cdb5abb8b74a97bc344aad17949ebf75bb73ba1d63b96b70" Workload="ci--4081--3--5--n--326e2e5946-k8s-coredns--668d6bf9bc--hvkxc-eth0" Sep 12 23:58:25.769258 containerd[1466]: 2025-09-12 23:58:25.764 [INFO][4602] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 23:58:25.769258 containerd[1466]: 2025-09-12 23:58:25.767 [INFO][4593] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="348f3835785bdb15cdb5abb8b74a97bc344aad17949ebf75bb73ba1d63b96b70" Sep 12 23:58:25.769961 containerd[1466]: time="2025-09-12T23:58:25.769896930Z" level=info msg="TearDown network for sandbox \"348f3835785bdb15cdb5abb8b74a97bc344aad17949ebf75bb73ba1d63b96b70\" successfully" Sep 12 23:58:25.769990 containerd[1466]: time="2025-09-12T23:58:25.769962533Z" level=info msg="StopPodSandbox for \"348f3835785bdb15cdb5abb8b74a97bc344aad17949ebf75bb73ba1d63b96b70\" returns successfully" Sep 12 23:58:25.772506 containerd[1466]: time="2025-09-12T23:58:25.772370984Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-hvkxc,Uid:dc685a7f-f1eb-41f5-8dc9-3a23d11d38d9,Namespace:kube-system,Attempt:1,}" Sep 12 23:58:25.773387 systemd[1]: run-netns-cni\x2d04b3452a\x2d1959\x2db95b\x2db036\x2d46c9de2ce69b.mount: Deactivated successfully. 
Sep 12 23:58:25.851855 systemd-networkd[1378]: calide756206fe3: Gained IPv6LL Sep 12 23:58:25.931449 systemd-networkd[1378]: cali1501d727d39: Link UP Sep 12 23:58:25.932209 systemd-networkd[1378]: cali1501d727d39: Gained carrier Sep 12 23:58:25.955468 containerd[1466]: 2025-09-12 23:58:25.835 [INFO][4608] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4081--3--5--n--326e2e5946-k8s-coredns--668d6bf9bc--hvkxc-eth0 coredns-668d6bf9bc- kube-system dc685a7f-f1eb-41f5-8dc9-3a23d11d38d9 968 0 2025-09-12 23:57:42 +0000 UTC map[k8s-app:kube-dns pod-template-hash:668d6bf9bc projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4081-3-5-n-326e2e5946 coredns-668d6bf9bc-hvkxc eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali1501d727d39 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="61b7c26ac2ac572cc822328456aeb46ccdc65ede3e5a56c1b2feb7180430733e" Namespace="kube-system" Pod="coredns-668d6bf9bc-hvkxc" WorkloadEndpoint="ci--4081--3--5--n--326e2e5946-k8s-coredns--668d6bf9bc--hvkxc-" Sep 12 23:58:25.955468 containerd[1466]: 2025-09-12 23:58:25.835 [INFO][4608] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="61b7c26ac2ac572cc822328456aeb46ccdc65ede3e5a56c1b2feb7180430733e" Namespace="kube-system" Pod="coredns-668d6bf9bc-hvkxc" WorkloadEndpoint="ci--4081--3--5--n--326e2e5946-k8s-coredns--668d6bf9bc--hvkxc-eth0" Sep 12 23:58:25.955468 containerd[1466]: 2025-09-12 23:58:25.868 [INFO][4620] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="61b7c26ac2ac572cc822328456aeb46ccdc65ede3e5a56c1b2feb7180430733e" HandleID="k8s-pod-network.61b7c26ac2ac572cc822328456aeb46ccdc65ede3e5a56c1b2feb7180430733e" Workload="ci--4081--3--5--n--326e2e5946-k8s-coredns--668d6bf9bc--hvkxc-eth0" Sep 12 23:58:25.955468 containerd[1466]: 2025-09-12 23:58:25.868 
[INFO][4620] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="61b7c26ac2ac572cc822328456aeb46ccdc65ede3e5a56c1b2feb7180430733e" HandleID="k8s-pod-network.61b7c26ac2ac572cc822328456aeb46ccdc65ede3e5a56c1b2feb7180430733e" Workload="ci--4081--3--5--n--326e2e5946-k8s-coredns--668d6bf9bc--hvkxc-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400024b270), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4081-3-5-n-326e2e5946", "pod":"coredns-668d6bf9bc-hvkxc", "timestamp":"2025-09-12 23:58:25.86819866 +0000 UTC"}, Hostname:"ci-4081-3-5-n-326e2e5946", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 23:58:25.955468 containerd[1466]: 2025-09-12 23:58:25.868 [INFO][4620] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 23:58:25.955468 containerd[1466]: 2025-09-12 23:58:25.868 [INFO][4620] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 12 23:58:25.955468 containerd[1466]: 2025-09-12 23:58:25.868 [INFO][4620] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4081-3-5-n-326e2e5946' Sep 12 23:58:25.955468 containerd[1466]: 2025-09-12 23:58:25.880 [INFO][4620] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.61b7c26ac2ac572cc822328456aeb46ccdc65ede3e5a56c1b2feb7180430733e" host="ci-4081-3-5-n-326e2e5946" Sep 12 23:58:25.955468 containerd[1466]: 2025-09-12 23:58:25.887 [INFO][4620] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4081-3-5-n-326e2e5946" Sep 12 23:58:25.955468 containerd[1466]: 2025-09-12 23:58:25.893 [INFO][4620] ipam/ipam.go 511: Trying affinity for 192.168.36.64/26 host="ci-4081-3-5-n-326e2e5946" Sep 12 23:58:25.955468 containerd[1466]: 2025-09-12 23:58:25.896 [INFO][4620] ipam/ipam.go 158: Attempting to load block cidr=192.168.36.64/26 host="ci-4081-3-5-n-326e2e5946" Sep 12 23:58:25.955468 containerd[1466]: 2025-09-12 23:58:25.899 [INFO][4620] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.36.64/26 host="ci-4081-3-5-n-326e2e5946" Sep 12 23:58:25.955468 containerd[1466]: 2025-09-12 23:58:25.899 [INFO][4620] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.36.64/26 handle="k8s-pod-network.61b7c26ac2ac572cc822328456aeb46ccdc65ede3e5a56c1b2feb7180430733e" host="ci-4081-3-5-n-326e2e5946" Sep 12 23:58:25.955468 containerd[1466]: 2025-09-12 23:58:25.901 [INFO][4620] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.61b7c26ac2ac572cc822328456aeb46ccdc65ede3e5a56c1b2feb7180430733e Sep 12 23:58:25.955468 containerd[1466]: 2025-09-12 23:58:25.911 [INFO][4620] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.36.64/26 handle="k8s-pod-network.61b7c26ac2ac572cc822328456aeb46ccdc65ede3e5a56c1b2feb7180430733e" host="ci-4081-3-5-n-326e2e5946" Sep 12 23:58:25.955468 containerd[1466]: 2025-09-12 23:58:25.920 [INFO][4620] ipam/ipam.go 1256: 
Successfully claimed IPs: [192.168.36.69/26] block=192.168.36.64/26 handle="k8s-pod-network.61b7c26ac2ac572cc822328456aeb46ccdc65ede3e5a56c1b2feb7180430733e" host="ci-4081-3-5-n-326e2e5946" Sep 12 23:58:25.955468 containerd[1466]: 2025-09-12 23:58:25.920 [INFO][4620] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.36.69/26] handle="k8s-pod-network.61b7c26ac2ac572cc822328456aeb46ccdc65ede3e5a56c1b2feb7180430733e" host="ci-4081-3-5-n-326e2e5946" Sep 12 23:58:25.955468 containerd[1466]: 2025-09-12 23:58:25.920 [INFO][4620] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 23:58:25.955468 containerd[1466]: 2025-09-12 23:58:25.920 [INFO][4620] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.36.69/26] IPv6=[] ContainerID="61b7c26ac2ac572cc822328456aeb46ccdc65ede3e5a56c1b2feb7180430733e" HandleID="k8s-pod-network.61b7c26ac2ac572cc822328456aeb46ccdc65ede3e5a56c1b2feb7180430733e" Workload="ci--4081--3--5--n--326e2e5946-k8s-coredns--668d6bf9bc--hvkxc-eth0" Sep 12 23:58:25.956158 containerd[1466]: 2025-09-12 23:58:25.923 [INFO][4608] cni-plugin/k8s.go 418: Populated endpoint ContainerID="61b7c26ac2ac572cc822328456aeb46ccdc65ede3e5a56c1b2feb7180430733e" Namespace="kube-system" Pod="coredns-668d6bf9bc-hvkxc" WorkloadEndpoint="ci--4081--3--5--n--326e2e5946-k8s-coredns--668d6bf9bc--hvkxc-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--5--n--326e2e5946-k8s-coredns--668d6bf9bc--hvkxc-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"dc685a7f-f1eb-41f5-8dc9-3a23d11d38d9", ResourceVersion:"968", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 23, 57, 42, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-5-n-326e2e5946", ContainerID:"", Pod:"coredns-668d6bf9bc-hvkxc", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.36.69/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali1501d727d39", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 23:58:25.956158 containerd[1466]: 2025-09-12 23:58:25.925 [INFO][4608] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.36.69/32] ContainerID="61b7c26ac2ac572cc822328456aeb46ccdc65ede3e5a56c1b2feb7180430733e" Namespace="kube-system" Pod="coredns-668d6bf9bc-hvkxc" WorkloadEndpoint="ci--4081--3--5--n--326e2e5946-k8s-coredns--668d6bf9bc--hvkxc-eth0" Sep 12 23:58:25.956158 containerd[1466]: 2025-09-12 23:58:25.926 [INFO][4608] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali1501d727d39 ContainerID="61b7c26ac2ac572cc822328456aeb46ccdc65ede3e5a56c1b2feb7180430733e" Namespace="kube-system" Pod="coredns-668d6bf9bc-hvkxc" WorkloadEndpoint="ci--4081--3--5--n--326e2e5946-k8s-coredns--668d6bf9bc--hvkxc-eth0" Sep 12 23:58:25.956158 containerd[1466]: 2025-09-12 23:58:25.931 [INFO][4608] 
cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="61b7c26ac2ac572cc822328456aeb46ccdc65ede3e5a56c1b2feb7180430733e" Namespace="kube-system" Pod="coredns-668d6bf9bc-hvkxc" WorkloadEndpoint="ci--4081--3--5--n--326e2e5946-k8s-coredns--668d6bf9bc--hvkxc-eth0" Sep 12 23:58:25.956158 containerd[1466]: 2025-09-12 23:58:25.933 [INFO][4608] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="61b7c26ac2ac572cc822328456aeb46ccdc65ede3e5a56c1b2feb7180430733e" Namespace="kube-system" Pod="coredns-668d6bf9bc-hvkxc" WorkloadEndpoint="ci--4081--3--5--n--326e2e5946-k8s-coredns--668d6bf9bc--hvkxc-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--5--n--326e2e5946-k8s-coredns--668d6bf9bc--hvkxc-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"dc685a7f-f1eb-41f5-8dc9-3a23d11d38d9", ResourceVersion:"968", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 23, 57, 42, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-5-n-326e2e5946", ContainerID:"61b7c26ac2ac572cc822328456aeb46ccdc65ede3e5a56c1b2feb7180430733e", Pod:"coredns-668d6bf9bc-hvkxc", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.36.69/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali1501d727d39", 
MAC:"9e:dd:a6:dd:38:5c", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 23:58:25.956158 containerd[1466]: 2025-09-12 23:58:25.950 [INFO][4608] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="61b7c26ac2ac572cc822328456aeb46ccdc65ede3e5a56c1b2feb7180430733e" Namespace="kube-system" Pod="coredns-668d6bf9bc-hvkxc" WorkloadEndpoint="ci--4081--3--5--n--326e2e5946-k8s-coredns--668d6bf9bc--hvkxc-eth0" Sep 12 23:58:25.980404 containerd[1466]: time="2025-09-12T23:58:25.979951188Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Sep 12 23:58:25.980404 containerd[1466]: time="2025-09-12T23:58:25.980017591Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Sep 12 23:58:25.980404 containerd[1466]: time="2025-09-12T23:58:25.980105873Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 12 23:58:25.980404 containerd[1466]: time="2025-09-12T23:58:25.980217757Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 12 23:58:26.015025 systemd[1]: Started cri-containerd-61b7c26ac2ac572cc822328456aeb46ccdc65ede3e5a56c1b2feb7180430733e.scope - libcontainer container 61b7c26ac2ac572cc822328456aeb46ccdc65ede3e5a56c1b2feb7180430733e. 
Sep 12 23:58:26.064032 containerd[1466]: time="2025-09-12T23:58:26.063958105Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-hvkxc,Uid:dc685a7f-f1eb-41f5-8dc9-3a23d11d38d9,Namespace:kube-system,Attempt:1,} returns sandbox id \"61b7c26ac2ac572cc822328456aeb46ccdc65ede3e5a56c1b2feb7180430733e\"" Sep 12 23:58:26.070053 containerd[1466]: time="2025-09-12T23:58:26.069989974Z" level=info msg="CreateContainer within sandbox \"61b7c26ac2ac572cc822328456aeb46ccdc65ede3e5a56c1b2feb7180430733e\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Sep 12 23:58:26.093543 containerd[1466]: time="2025-09-12T23:58:26.093401781Z" level=info msg="CreateContainer within sandbox \"61b7c26ac2ac572cc822328456aeb46ccdc65ede3e5a56c1b2feb7180430733e\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"cfb6ab104e140591a3dbf631e3e625b0ea4746b1f15f4fe30b146bb2892c5130\"" Sep 12 23:58:26.095247 containerd[1466]: time="2025-09-12T23:58:26.095211873Z" level=info msg="StartContainer for \"cfb6ab104e140591a3dbf631e3e625b0ea4746b1f15f4fe30b146bb2892c5130\"" Sep 12 23:58:26.133220 systemd[1]: Started cri-containerd-cfb6ab104e140591a3dbf631e3e625b0ea4746b1f15f4fe30b146bb2892c5130.scope - libcontainer container cfb6ab104e140591a3dbf631e3e625b0ea4746b1f15f4fe30b146bb2892c5130. 
Sep 12 23:58:26.201996 containerd[1466]: time="2025-09-12T23:58:26.200130931Z" level=info msg="StartContainer for \"cfb6ab104e140591a3dbf631e3e625b0ea4746b1f15f4fe30b146bb2892c5130\" returns successfully" Sep 12 23:58:26.617939 systemd-networkd[1378]: vxlan.calico: Gained IPv6LL Sep 12 23:58:26.641324 containerd[1466]: time="2025-09-12T23:58:26.640168213Z" level=info msg="StopPodSandbox for \"b9df369dd46643a45a5c1476773d2818afe85e6d27637181179c4b15ffbb6933\"" Sep 12 23:58:26.641324 containerd[1466]: time="2025-09-12T23:58:26.640423129Z" level=info msg="StopPodSandbox for \"282c32aff9fb2fe91803704b8dfa8fe772bfc540faa928a4e1c8e6d09d4264d0\"" Sep 12 23:58:26.783564 containerd[1466]: 2025-09-12 23:58:26.714 [INFO][4732] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="282c32aff9fb2fe91803704b8dfa8fe772bfc540faa928a4e1c8e6d09d4264d0" Sep 12 23:58:26.783564 containerd[1466]: 2025-09-12 23:58:26.714 [INFO][4732] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="282c32aff9fb2fe91803704b8dfa8fe772bfc540faa928a4e1c8e6d09d4264d0" iface="eth0" netns="/var/run/netns/cni-2e322483-b141-616a-8c31-7b2cee0c9b6b" Sep 12 23:58:26.783564 containerd[1466]: 2025-09-12 23:58:26.716 [INFO][4732] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="282c32aff9fb2fe91803704b8dfa8fe772bfc540faa928a4e1c8e6d09d4264d0" iface="eth0" netns="/var/run/netns/cni-2e322483-b141-616a-8c31-7b2cee0c9b6b" Sep 12 23:58:26.783564 containerd[1466]: 2025-09-12 23:58:26.717 [INFO][4732] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. 
ContainerID="282c32aff9fb2fe91803704b8dfa8fe772bfc540faa928a4e1c8e6d09d4264d0" iface="eth0" netns="/var/run/netns/cni-2e322483-b141-616a-8c31-7b2cee0c9b6b" Sep 12 23:58:26.783564 containerd[1466]: 2025-09-12 23:58:26.717 [INFO][4732] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="282c32aff9fb2fe91803704b8dfa8fe772bfc540faa928a4e1c8e6d09d4264d0" Sep 12 23:58:26.783564 containerd[1466]: 2025-09-12 23:58:26.717 [INFO][4732] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="282c32aff9fb2fe91803704b8dfa8fe772bfc540faa928a4e1c8e6d09d4264d0" Sep 12 23:58:26.783564 containerd[1466]: 2025-09-12 23:58:26.761 [INFO][4747] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="282c32aff9fb2fe91803704b8dfa8fe772bfc540faa928a4e1c8e6d09d4264d0" HandleID="k8s-pod-network.282c32aff9fb2fe91803704b8dfa8fe772bfc540faa928a4e1c8e6d09d4264d0" Workload="ci--4081--3--5--n--326e2e5946-k8s-goldmane--54d579b49d--bjt98-eth0" Sep 12 23:58:26.783564 containerd[1466]: 2025-09-12 23:58:26.761 [INFO][4747] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 23:58:26.783564 containerd[1466]: 2025-09-12 23:58:26.761 [INFO][4747] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 12 23:58:26.783564 containerd[1466]: 2025-09-12 23:58:26.773 [WARNING][4747] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="282c32aff9fb2fe91803704b8dfa8fe772bfc540faa928a4e1c8e6d09d4264d0" HandleID="k8s-pod-network.282c32aff9fb2fe91803704b8dfa8fe772bfc540faa928a4e1c8e6d09d4264d0" Workload="ci--4081--3--5--n--326e2e5946-k8s-goldmane--54d579b49d--bjt98-eth0" Sep 12 23:58:26.783564 containerd[1466]: 2025-09-12 23:58:26.773 [INFO][4747] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="282c32aff9fb2fe91803704b8dfa8fe772bfc540faa928a4e1c8e6d09d4264d0" HandleID="k8s-pod-network.282c32aff9fb2fe91803704b8dfa8fe772bfc540faa928a4e1c8e6d09d4264d0" Workload="ci--4081--3--5--n--326e2e5946-k8s-goldmane--54d579b49d--bjt98-eth0" Sep 12 23:58:26.783564 containerd[1466]: 2025-09-12 23:58:26.776 [INFO][4747] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 23:58:26.783564 containerd[1466]: 2025-09-12 23:58:26.781 [INFO][4732] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="282c32aff9fb2fe91803704b8dfa8fe772bfc540faa928a4e1c8e6d09d4264d0" Sep 12 23:58:26.784848 containerd[1466]: time="2025-09-12T23:58:26.784678513Z" level=info msg="TearDown network for sandbox \"282c32aff9fb2fe91803704b8dfa8fe772bfc540faa928a4e1c8e6d09d4264d0\" successfully" Sep 12 23:58:26.784848 containerd[1466]: time="2025-09-12T23:58:26.784747712Z" level=info msg="StopPodSandbox for \"282c32aff9fb2fe91803704b8dfa8fe772bfc540faa928a4e1c8e6d09d4264d0\" returns successfully" Sep 12 23:58:26.785598 containerd[1466]: time="2025-09-12T23:58:26.785567739Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-bjt98,Uid:e2914591-bb15-4357-8b0d-6d29e5119ec7,Namespace:calico-system,Attempt:1,}" Sep 12 23:58:26.816808 containerd[1466]: 2025-09-12 23:58:26.737 [INFO][4733] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="b9df369dd46643a45a5c1476773d2818afe85e6d27637181179c4b15ffbb6933" Sep 12 23:58:26.816808 containerd[1466]: 2025-09-12 23:58:26.737 [INFO][4733] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. 
ContainerID="b9df369dd46643a45a5c1476773d2818afe85e6d27637181179c4b15ffbb6933" iface="eth0" netns="/var/run/netns/cni-abbcfcde-cba2-11c5-82cd-2ede963d2c37" Sep 12 23:58:26.816808 containerd[1466]: 2025-09-12 23:58:26.737 [INFO][4733] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="b9df369dd46643a45a5c1476773d2818afe85e6d27637181179c4b15ffbb6933" iface="eth0" netns="/var/run/netns/cni-abbcfcde-cba2-11c5-82cd-2ede963d2c37" Sep 12 23:58:26.816808 containerd[1466]: 2025-09-12 23:58:26.739 [INFO][4733] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. ContainerID="b9df369dd46643a45a5c1476773d2818afe85e6d27637181179c4b15ffbb6933" iface="eth0" netns="/var/run/netns/cni-abbcfcde-cba2-11c5-82cd-2ede963d2c37" Sep 12 23:58:26.816808 containerd[1466]: 2025-09-12 23:58:26.740 [INFO][4733] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="b9df369dd46643a45a5c1476773d2818afe85e6d27637181179c4b15ffbb6933" Sep 12 23:58:26.816808 containerd[1466]: 2025-09-12 23:58:26.740 [INFO][4733] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="b9df369dd46643a45a5c1476773d2818afe85e6d27637181179c4b15ffbb6933" Sep 12 23:58:26.816808 containerd[1466]: 2025-09-12 23:58:26.774 [INFO][4753] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="b9df369dd46643a45a5c1476773d2818afe85e6d27637181179c4b15ffbb6933" HandleID="k8s-pod-network.b9df369dd46643a45a5c1476773d2818afe85e6d27637181179c4b15ffbb6933" Workload="ci--4081--3--5--n--326e2e5946-k8s-calico--apiserver--6c79dd94d7--qdfzm-eth0" Sep 12 23:58:26.816808 containerd[1466]: 2025-09-12 23:58:26.774 [INFO][4753] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 23:58:26.816808 containerd[1466]: 2025-09-12 23:58:26.778 [INFO][4753] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 12 23:58:26.816808 containerd[1466]: 2025-09-12 23:58:26.793 [WARNING][4753] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="b9df369dd46643a45a5c1476773d2818afe85e6d27637181179c4b15ffbb6933" HandleID="k8s-pod-network.b9df369dd46643a45a5c1476773d2818afe85e6d27637181179c4b15ffbb6933" Workload="ci--4081--3--5--n--326e2e5946-k8s-calico--apiserver--6c79dd94d7--qdfzm-eth0" Sep 12 23:58:26.816808 containerd[1466]: 2025-09-12 23:58:26.794 [INFO][4753] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="b9df369dd46643a45a5c1476773d2818afe85e6d27637181179c4b15ffbb6933" HandleID="k8s-pod-network.b9df369dd46643a45a5c1476773d2818afe85e6d27637181179c4b15ffbb6933" Workload="ci--4081--3--5--n--326e2e5946-k8s-calico--apiserver--6c79dd94d7--qdfzm-eth0" Sep 12 23:58:26.816808 containerd[1466]: 2025-09-12 23:58:26.798 [INFO][4753] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 23:58:26.816808 containerd[1466]: 2025-09-12 23:58:26.803 [INFO][4733] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="b9df369dd46643a45a5c1476773d2818afe85e6d27637181179c4b15ffbb6933" Sep 12 23:58:26.817928 containerd[1466]: time="2025-09-12T23:58:26.817729134Z" level=info msg="TearDown network for sandbox \"b9df369dd46643a45a5c1476773d2818afe85e6d27637181179c4b15ffbb6933\" successfully" Sep 12 23:58:26.817928 containerd[1466]: time="2025-09-12T23:58:26.817768134Z" level=info msg="StopPodSandbox for \"b9df369dd46643a45a5c1476773d2818afe85e6d27637181179c4b15ffbb6933\" returns successfully" Sep 12 23:58:26.819199 containerd[1466]: time="2025-09-12T23:58:26.818942836Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6c79dd94d7-qdfzm,Uid:49f008dc-1da3-46cf-ac99-9976edc0d9dc,Namespace:calico-apiserver,Attempt:1,}" Sep 12 23:58:26.991361 systemd[1]: run-netns-cni\x2dabbcfcde\x2dcba2\x2d11c5\x2d82cd\x2d2ede963d2c37.mount: Deactivated successfully. 
Sep 12 23:58:26.992749 systemd[1]: run-netns-cni\x2d2e322483\x2db141\x2d616a\x2d8c31\x2d7b2cee0c9b6b.mount: Deactivated successfully. Sep 12 23:58:26.998058 kubelet[2585]: I0912 23:58:26.997963 2585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-668d6bf9bc-hvkxc" podStartSLOduration=44.997933816 podStartE2EDuration="44.997933816s" podCreationTimestamp="2025-09-12 23:57:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-12 23:58:26.97045779 +0000 UTC m=+49.449690957" watchObservedRunningTime="2025-09-12 23:58:26.997933816 +0000 UTC m=+49.477166903" Sep 12 23:58:27.031441 systemd-networkd[1378]: cali685a3bbb20d: Link UP Sep 12 23:58:27.031695 systemd-networkd[1378]: cali685a3bbb20d: Gained carrier Sep 12 23:58:27.067849 containerd[1466]: 2025-09-12 23:58:26.846 [INFO][4761] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4081--3--5--n--326e2e5946-k8s-goldmane--54d579b49d--bjt98-eth0 goldmane-54d579b49d- calico-system e2914591-bb15-4357-8b0d-6d29e5119ec7 980 0 2025-09-12 23:58:00 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:54d579b49d projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s ci-4081-3-5-n-326e2e5946 goldmane-54d579b49d-bjt98 eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] cali685a3bbb20d [] [] }} ContainerID="b468fa4ff2b5b8dc7cf1c81c07626ef41f069d487f08bc09a37255e851f48d4a" Namespace="calico-system" Pod="goldmane-54d579b49d-bjt98" WorkloadEndpoint="ci--4081--3--5--n--326e2e5946-k8s-goldmane--54d579b49d--bjt98-" Sep 12 23:58:27.067849 containerd[1466]: 2025-09-12 23:58:26.847 [INFO][4761] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s 
ContainerID="b468fa4ff2b5b8dc7cf1c81c07626ef41f069d487f08bc09a37255e851f48d4a" Namespace="calico-system" Pod="goldmane-54d579b49d-bjt98" WorkloadEndpoint="ci--4081--3--5--n--326e2e5946-k8s-goldmane--54d579b49d--bjt98-eth0" Sep 12 23:58:27.067849 containerd[1466]: 2025-09-12 23:58:26.898 [INFO][4784] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="b468fa4ff2b5b8dc7cf1c81c07626ef41f069d487f08bc09a37255e851f48d4a" HandleID="k8s-pod-network.b468fa4ff2b5b8dc7cf1c81c07626ef41f069d487f08bc09a37255e851f48d4a" Workload="ci--4081--3--5--n--326e2e5946-k8s-goldmane--54d579b49d--bjt98-eth0" Sep 12 23:58:27.067849 containerd[1466]: 2025-09-12 23:58:26.899 [INFO][4784] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="b468fa4ff2b5b8dc7cf1c81c07626ef41f069d487f08bc09a37255e851f48d4a" HandleID="k8s-pod-network.b468fa4ff2b5b8dc7cf1c81c07626ef41f069d487f08bc09a37255e851f48d4a" Workload="ci--4081--3--5--n--326e2e5946-k8s-goldmane--54d579b49d--bjt98-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002d2f50), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4081-3-5-n-326e2e5946", "pod":"goldmane-54d579b49d-bjt98", "timestamp":"2025-09-12 23:58:26.898957869 +0000 UTC"}, Hostname:"ci-4081-3-5-n-326e2e5946", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 23:58:27.067849 containerd[1466]: 2025-09-12 23:58:26.899 [INFO][4784] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 23:58:27.067849 containerd[1466]: 2025-09-12 23:58:26.899 [INFO][4784] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 12 23:58:27.067849 containerd[1466]: 2025-09-12 23:58:26.899 [INFO][4784] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4081-3-5-n-326e2e5946' Sep 12 23:58:27.067849 containerd[1466]: 2025-09-12 23:58:26.913 [INFO][4784] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.b468fa4ff2b5b8dc7cf1c81c07626ef41f069d487f08bc09a37255e851f48d4a" host="ci-4081-3-5-n-326e2e5946" Sep 12 23:58:27.067849 containerd[1466]: 2025-09-12 23:58:26.926 [INFO][4784] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4081-3-5-n-326e2e5946" Sep 12 23:58:27.067849 containerd[1466]: 2025-09-12 23:58:26.940 [INFO][4784] ipam/ipam.go 511: Trying affinity for 192.168.36.64/26 host="ci-4081-3-5-n-326e2e5946" Sep 12 23:58:27.067849 containerd[1466]: 2025-09-12 23:58:26.949 [INFO][4784] ipam/ipam.go 158: Attempting to load block cidr=192.168.36.64/26 host="ci-4081-3-5-n-326e2e5946" Sep 12 23:58:27.067849 containerd[1466]: 2025-09-12 23:58:26.961 [INFO][4784] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.36.64/26 host="ci-4081-3-5-n-326e2e5946" Sep 12 23:58:27.067849 containerd[1466]: 2025-09-12 23:58:26.961 [INFO][4784] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.36.64/26 handle="k8s-pod-network.b468fa4ff2b5b8dc7cf1c81c07626ef41f069d487f08bc09a37255e851f48d4a" host="ci-4081-3-5-n-326e2e5946" Sep 12 23:58:27.067849 containerd[1466]: 2025-09-12 23:58:26.966 [INFO][4784] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.b468fa4ff2b5b8dc7cf1c81c07626ef41f069d487f08bc09a37255e851f48d4a Sep 12 23:58:27.067849 containerd[1466]: 2025-09-12 23:58:26.972 [INFO][4784] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.36.64/26 handle="k8s-pod-network.b468fa4ff2b5b8dc7cf1c81c07626ef41f069d487f08bc09a37255e851f48d4a" host="ci-4081-3-5-n-326e2e5946" Sep 12 23:58:27.067849 containerd[1466]: 2025-09-12 23:58:27.003 [INFO][4784] ipam/ipam.go 1256: 
Successfully claimed IPs: [192.168.36.70/26] block=192.168.36.64/26 handle="k8s-pod-network.b468fa4ff2b5b8dc7cf1c81c07626ef41f069d487f08bc09a37255e851f48d4a" host="ci-4081-3-5-n-326e2e5946" Sep 12 23:58:27.067849 containerd[1466]: 2025-09-12 23:58:27.003 [INFO][4784] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.36.70/26] handle="k8s-pod-network.b468fa4ff2b5b8dc7cf1c81c07626ef41f069d487f08bc09a37255e851f48d4a" host="ci-4081-3-5-n-326e2e5946" Sep 12 23:58:27.067849 containerd[1466]: 2025-09-12 23:58:27.003 [INFO][4784] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 23:58:27.067849 containerd[1466]: 2025-09-12 23:58:27.003 [INFO][4784] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.36.70/26] IPv6=[] ContainerID="b468fa4ff2b5b8dc7cf1c81c07626ef41f069d487f08bc09a37255e851f48d4a" HandleID="k8s-pod-network.b468fa4ff2b5b8dc7cf1c81c07626ef41f069d487f08bc09a37255e851f48d4a" Workload="ci--4081--3--5--n--326e2e5946-k8s-goldmane--54d579b49d--bjt98-eth0" Sep 12 23:58:27.068409 containerd[1466]: 2025-09-12 23:58:27.006 [INFO][4761] cni-plugin/k8s.go 418: Populated endpoint ContainerID="b468fa4ff2b5b8dc7cf1c81c07626ef41f069d487f08bc09a37255e851f48d4a" Namespace="calico-system" Pod="goldmane-54d579b49d-bjt98" WorkloadEndpoint="ci--4081--3--5--n--326e2e5946-k8s-goldmane--54d579b49d--bjt98-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--5--n--326e2e5946-k8s-goldmane--54d579b49d--bjt98-eth0", GenerateName:"goldmane-54d579b49d-", Namespace:"calico-system", SelfLink:"", UID:"e2914591-bb15-4357-8b0d-6d29e5119ec7", ResourceVersion:"980", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 23, 58, 0, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"54d579b49d", 
"projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-5-n-326e2e5946", ContainerID:"", Pod:"goldmane-54d579b49d-bjt98", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.36.70/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali685a3bbb20d", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 23:58:27.068409 containerd[1466]: 2025-09-12 23:58:27.006 [INFO][4761] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.36.70/32] ContainerID="b468fa4ff2b5b8dc7cf1c81c07626ef41f069d487f08bc09a37255e851f48d4a" Namespace="calico-system" Pod="goldmane-54d579b49d-bjt98" WorkloadEndpoint="ci--4081--3--5--n--326e2e5946-k8s-goldmane--54d579b49d--bjt98-eth0" Sep 12 23:58:27.068409 containerd[1466]: 2025-09-12 23:58:27.006 [INFO][4761] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali685a3bbb20d ContainerID="b468fa4ff2b5b8dc7cf1c81c07626ef41f069d487f08bc09a37255e851f48d4a" Namespace="calico-system" Pod="goldmane-54d579b49d-bjt98" WorkloadEndpoint="ci--4081--3--5--n--326e2e5946-k8s-goldmane--54d579b49d--bjt98-eth0" Sep 12 23:58:27.068409 containerd[1466]: 2025-09-12 23:58:27.030 [INFO][4761] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="b468fa4ff2b5b8dc7cf1c81c07626ef41f069d487f08bc09a37255e851f48d4a" Namespace="calico-system" Pod="goldmane-54d579b49d-bjt98" WorkloadEndpoint="ci--4081--3--5--n--326e2e5946-k8s-goldmane--54d579b49d--bjt98-eth0" Sep 12 23:58:27.068409 containerd[1466]: 2025-09-12 23:58:27.032 [INFO][4761] 
cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="b468fa4ff2b5b8dc7cf1c81c07626ef41f069d487f08bc09a37255e851f48d4a" Namespace="calico-system" Pod="goldmane-54d579b49d-bjt98" WorkloadEndpoint="ci--4081--3--5--n--326e2e5946-k8s-goldmane--54d579b49d--bjt98-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--5--n--326e2e5946-k8s-goldmane--54d579b49d--bjt98-eth0", GenerateName:"goldmane-54d579b49d-", Namespace:"calico-system", SelfLink:"", UID:"e2914591-bb15-4357-8b0d-6d29e5119ec7", ResourceVersion:"980", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 23, 58, 0, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"54d579b49d", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-5-n-326e2e5946", ContainerID:"b468fa4ff2b5b8dc7cf1c81c07626ef41f069d487f08bc09a37255e851f48d4a", Pod:"goldmane-54d579b49d-bjt98", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.36.70/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali685a3bbb20d", MAC:"2a:75:8a:d6:70:e2", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 23:58:27.068409 containerd[1466]: 2025-09-12 23:58:27.063 [INFO][4761] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore 
ContainerID="b468fa4ff2b5b8dc7cf1c81c07626ef41f069d487f08bc09a37255e851f48d4a" Namespace="calico-system" Pod="goldmane-54d579b49d-bjt98" WorkloadEndpoint="ci--4081--3--5--n--326e2e5946-k8s-goldmane--54d579b49d--bjt98-eth0" Sep 12 23:58:27.108450 containerd[1466]: time="2025-09-12T23:58:27.108275575Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Sep 12 23:58:27.108450 containerd[1466]: time="2025-09-12T23:58:27.108340215Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Sep 12 23:58:27.108450 containerd[1466]: time="2025-09-12T23:58:27.108356214Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 12 23:58:27.109312 containerd[1466]: time="2025-09-12T23:58:27.108594451Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 12 23:58:27.131113 systemd-networkd[1378]: cali1501d727d39: Gained IPv6LL Sep 12 23:58:27.161278 systemd[1]: run-containerd-runc-k8s.io-b468fa4ff2b5b8dc7cf1c81c07626ef41f069d487f08bc09a37255e851f48d4a-runc.IBebLY.mount: Deactivated successfully. Sep 12 23:58:27.172814 systemd-networkd[1378]: calibb1b8d41740: Link UP Sep 12 23:58:27.173113 systemd-networkd[1378]: calibb1b8d41740: Gained carrier Sep 12 23:58:27.178907 systemd[1]: Started cri-containerd-b468fa4ff2b5b8dc7cf1c81c07626ef41f069d487f08bc09a37255e851f48d4a.scope - libcontainer container b468fa4ff2b5b8dc7cf1c81c07626ef41f069d487f08bc09a37255e851f48d4a. 
Sep 12 23:58:27.195321 containerd[1466]: 2025-09-12 23:58:26.906 [INFO][4773] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4081--3--5--n--326e2e5946-k8s-calico--apiserver--6c79dd94d7--qdfzm-eth0 calico-apiserver-6c79dd94d7- calico-apiserver 49f008dc-1da3-46cf-ac99-9976edc0d9dc 981 0 2025-09-12 23:57:50 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:6c79dd94d7 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4081-3-5-n-326e2e5946 calico-apiserver-6c79dd94d7-qdfzm eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calibb1b8d41740 [] [] }} ContainerID="db6961ef6fb7399026dac80a18eefe324112284ac1828ccb8b3cce4891e57fba" Namespace="calico-apiserver" Pod="calico-apiserver-6c79dd94d7-qdfzm" WorkloadEndpoint="ci--4081--3--5--n--326e2e5946-k8s-calico--apiserver--6c79dd94d7--qdfzm-" Sep 12 23:58:27.195321 containerd[1466]: 2025-09-12 23:58:26.906 [INFO][4773] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="db6961ef6fb7399026dac80a18eefe324112284ac1828ccb8b3cce4891e57fba" Namespace="calico-apiserver" Pod="calico-apiserver-6c79dd94d7-qdfzm" WorkloadEndpoint="ci--4081--3--5--n--326e2e5946-k8s-calico--apiserver--6c79dd94d7--qdfzm-eth0" Sep 12 23:58:27.195321 containerd[1466]: 2025-09-12 23:58:26.987 [INFO][4795] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="db6961ef6fb7399026dac80a18eefe324112284ac1828ccb8b3cce4891e57fba" HandleID="k8s-pod-network.db6961ef6fb7399026dac80a18eefe324112284ac1828ccb8b3cce4891e57fba" Workload="ci--4081--3--5--n--326e2e5946-k8s-calico--apiserver--6c79dd94d7--qdfzm-eth0" Sep 12 23:58:27.195321 containerd[1466]: 2025-09-12 23:58:26.987 [INFO][4795] ipam/ipam_plugin.go 265: Auto assigning IP 
ContainerID="db6961ef6fb7399026dac80a18eefe324112284ac1828ccb8b3cce4891e57fba" HandleID="k8s-pod-network.db6961ef6fb7399026dac80a18eefe324112284ac1828ccb8b3cce4891e57fba" Workload="ci--4081--3--5--n--326e2e5946-k8s-calico--apiserver--6c79dd94d7--qdfzm-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40003316f0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4081-3-5-n-326e2e5946", "pod":"calico-apiserver-6c79dd94d7-qdfzm", "timestamp":"2025-09-12 23:58:26.983206238 +0000 UTC"}, Hostname:"ci-4081-3-5-n-326e2e5946", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 23:58:27.195321 containerd[1466]: 2025-09-12 23:58:26.987 [INFO][4795] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 23:58:27.195321 containerd[1466]: 2025-09-12 23:58:27.003 [INFO][4795] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 12 23:58:27.195321 containerd[1466]: 2025-09-12 23:58:27.003 [INFO][4795] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4081-3-5-n-326e2e5946' Sep 12 23:58:27.195321 containerd[1466]: 2025-09-12 23:58:27.057 [INFO][4795] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.db6961ef6fb7399026dac80a18eefe324112284ac1828ccb8b3cce4891e57fba" host="ci-4081-3-5-n-326e2e5946" Sep 12 23:58:27.195321 containerd[1466]: 2025-09-12 23:58:27.070 [INFO][4795] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4081-3-5-n-326e2e5946" Sep 12 23:58:27.195321 containerd[1466]: 2025-09-12 23:58:27.083 [INFO][4795] ipam/ipam.go 511: Trying affinity for 192.168.36.64/26 host="ci-4081-3-5-n-326e2e5946" Sep 12 23:58:27.195321 containerd[1466]: 2025-09-12 23:58:27.088 [INFO][4795] ipam/ipam.go 158: Attempting to load block cidr=192.168.36.64/26 host="ci-4081-3-5-n-326e2e5946" Sep 12 23:58:27.195321 containerd[1466]: 2025-09-12 23:58:27.095 [INFO][4795] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.36.64/26 host="ci-4081-3-5-n-326e2e5946" Sep 12 23:58:27.195321 containerd[1466]: 2025-09-12 23:58:27.095 [INFO][4795] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.36.64/26 handle="k8s-pod-network.db6961ef6fb7399026dac80a18eefe324112284ac1828ccb8b3cce4891e57fba" host="ci-4081-3-5-n-326e2e5946" Sep 12 23:58:27.195321 containerd[1466]: 2025-09-12 23:58:27.101 [INFO][4795] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.db6961ef6fb7399026dac80a18eefe324112284ac1828ccb8b3cce4891e57fba Sep 12 23:58:27.195321 containerd[1466]: 2025-09-12 23:58:27.121 [INFO][4795] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.36.64/26 handle="k8s-pod-network.db6961ef6fb7399026dac80a18eefe324112284ac1828ccb8b3cce4891e57fba" host="ci-4081-3-5-n-326e2e5946" Sep 12 23:58:27.195321 containerd[1466]: 2025-09-12 23:58:27.143 [INFO][4795] ipam/ipam.go 1256: 
Successfully claimed IPs: [192.168.36.71/26] block=192.168.36.64/26 handle="k8s-pod-network.db6961ef6fb7399026dac80a18eefe324112284ac1828ccb8b3cce4891e57fba" host="ci-4081-3-5-n-326e2e5946" Sep 12 23:58:27.195321 containerd[1466]: 2025-09-12 23:58:27.144 [INFO][4795] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.36.71/26] handle="k8s-pod-network.db6961ef6fb7399026dac80a18eefe324112284ac1828ccb8b3cce4891e57fba" host="ci-4081-3-5-n-326e2e5946" Sep 12 23:58:27.195321 containerd[1466]: 2025-09-12 23:58:27.144 [INFO][4795] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 23:58:27.195321 containerd[1466]: 2025-09-12 23:58:27.145 [INFO][4795] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.36.71/26] IPv6=[] ContainerID="db6961ef6fb7399026dac80a18eefe324112284ac1828ccb8b3cce4891e57fba" HandleID="k8s-pod-network.db6961ef6fb7399026dac80a18eefe324112284ac1828ccb8b3cce4891e57fba" Workload="ci--4081--3--5--n--326e2e5946-k8s-calico--apiserver--6c79dd94d7--qdfzm-eth0" Sep 12 23:58:27.196662 containerd[1466]: 2025-09-12 23:58:27.162 [INFO][4773] cni-plugin/k8s.go 418: Populated endpoint ContainerID="db6961ef6fb7399026dac80a18eefe324112284ac1828ccb8b3cce4891e57fba" Namespace="calico-apiserver" Pod="calico-apiserver-6c79dd94d7-qdfzm" WorkloadEndpoint="ci--4081--3--5--n--326e2e5946-k8s-calico--apiserver--6c79dd94d7--qdfzm-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--5--n--326e2e5946-k8s-calico--apiserver--6c79dd94d7--qdfzm-eth0", GenerateName:"calico-apiserver-6c79dd94d7-", Namespace:"calico-apiserver", SelfLink:"", UID:"49f008dc-1da3-46cf-ac99-9976edc0d9dc", ResourceVersion:"981", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 23, 57, 50, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", 
"app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6c79dd94d7", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-5-n-326e2e5946", ContainerID:"", Pod:"calico-apiserver-6c79dd94d7-qdfzm", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.36.71/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calibb1b8d41740", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 23:58:27.196662 containerd[1466]: 2025-09-12 23:58:27.163 [INFO][4773] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.36.71/32] ContainerID="db6961ef6fb7399026dac80a18eefe324112284ac1828ccb8b3cce4891e57fba" Namespace="calico-apiserver" Pod="calico-apiserver-6c79dd94d7-qdfzm" WorkloadEndpoint="ci--4081--3--5--n--326e2e5946-k8s-calico--apiserver--6c79dd94d7--qdfzm-eth0" Sep 12 23:58:27.196662 containerd[1466]: 2025-09-12 23:58:27.163 [INFO][4773] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calibb1b8d41740 ContainerID="db6961ef6fb7399026dac80a18eefe324112284ac1828ccb8b3cce4891e57fba" Namespace="calico-apiserver" Pod="calico-apiserver-6c79dd94d7-qdfzm" WorkloadEndpoint="ci--4081--3--5--n--326e2e5946-k8s-calico--apiserver--6c79dd94d7--qdfzm-eth0" Sep 12 23:58:27.196662 containerd[1466]: 2025-09-12 23:58:27.174 [INFO][4773] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="db6961ef6fb7399026dac80a18eefe324112284ac1828ccb8b3cce4891e57fba" Namespace="calico-apiserver" 
Pod="calico-apiserver-6c79dd94d7-qdfzm" WorkloadEndpoint="ci--4081--3--5--n--326e2e5946-k8s-calico--apiserver--6c79dd94d7--qdfzm-eth0" Sep 12 23:58:27.196662 containerd[1466]: 2025-09-12 23:58:27.175 [INFO][4773] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="db6961ef6fb7399026dac80a18eefe324112284ac1828ccb8b3cce4891e57fba" Namespace="calico-apiserver" Pod="calico-apiserver-6c79dd94d7-qdfzm" WorkloadEndpoint="ci--4081--3--5--n--326e2e5946-k8s-calico--apiserver--6c79dd94d7--qdfzm-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--5--n--326e2e5946-k8s-calico--apiserver--6c79dd94d7--qdfzm-eth0", GenerateName:"calico-apiserver-6c79dd94d7-", Namespace:"calico-apiserver", SelfLink:"", UID:"49f008dc-1da3-46cf-ac99-9976edc0d9dc", ResourceVersion:"981", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 23, 57, 50, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6c79dd94d7", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-5-n-326e2e5946", ContainerID:"db6961ef6fb7399026dac80a18eefe324112284ac1828ccb8b3cce4891e57fba", Pod:"calico-apiserver-6c79dd94d7-qdfzm", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.36.71/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, 
InterfaceName:"calibb1b8d41740", MAC:"ae:ec:03:0e:c9:74", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 23:58:27.196662 containerd[1466]: 2025-09-12 23:58:27.191 [INFO][4773] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="db6961ef6fb7399026dac80a18eefe324112284ac1828ccb8b3cce4891e57fba" Namespace="calico-apiserver" Pod="calico-apiserver-6c79dd94d7-qdfzm" WorkloadEndpoint="ci--4081--3--5--n--326e2e5946-k8s-calico--apiserver--6c79dd94d7--qdfzm-eth0" Sep 12 23:58:27.234974 containerd[1466]: time="2025-09-12T23:58:27.233661932Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Sep 12 23:58:27.235934 containerd[1466]: time="2025-09-12T23:58:27.235305149Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Sep 12 23:58:27.236127 containerd[1466]: time="2025-09-12T23:58:27.235765463Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 12 23:58:27.236127 containerd[1466]: time="2025-09-12T23:58:27.235895701Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 12 23:58:27.260469 containerd[1466]: time="2025-09-12T23:58:27.260333245Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-bjt98,Uid:e2914591-bb15-4357-8b0d-6d29e5119ec7,Namespace:calico-system,Attempt:1,} returns sandbox id \"b468fa4ff2b5b8dc7cf1c81c07626ef41f069d487f08bc09a37255e851f48d4a\"" Sep 12 23:58:27.280916 systemd[1]: Started cri-containerd-db6961ef6fb7399026dac80a18eefe324112284ac1828ccb8b3cce4891e57fba.scope - libcontainer container db6961ef6fb7399026dac80a18eefe324112284ac1828ccb8b3cce4891e57fba. 
Sep 12 23:58:27.354397 containerd[1466]: time="2025-09-12T23:58:27.354326313Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6c79dd94d7-qdfzm,Uid:49f008dc-1da3-46cf-ac99-9976edc0d9dc,Namespace:calico-apiserver,Attempt:1,} returns sandbox id \"db6961ef6fb7399026dac80a18eefe324112284ac1828ccb8b3cce4891e57fba\"" Sep 12 23:58:28.063591 containerd[1466]: time="2025-09-12T23:58:28.061668029Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:58:28.063591 containerd[1466]: time="2025-09-12T23:58:28.063198250Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.3: active requests=0, bytes read=44530807" Sep 12 23:58:28.065123 containerd[1466]: time="2025-09-12T23:58:28.065082226Z" level=info msg="ImageCreate event name:\"sha256:632fbde00b1918016ac07458e79cc438ccda83cb762bfd5fc50a26721abced08\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:58:28.068786 containerd[1466]: time="2025-09-12T23:58:28.068725541Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:58:28.070424 containerd[1466]: time="2025-09-12T23:58:28.070373721Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" with image id \"sha256:632fbde00b1918016ac07458e79cc438ccda83cb762bfd5fc50a26721abced08\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\", size \"45900064\" in 3.592166092s" Sep 12 23:58:28.070637 containerd[1466]: time="2025-09-12T23:58:28.070575678Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" returns image reference 
\"sha256:632fbde00b1918016ac07458e79cc438ccda83cb762bfd5fc50a26721abced08\"" Sep 12 23:58:28.074205 containerd[1466]: time="2025-09-12T23:58:28.074140034Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.3\"" Sep 12 23:58:28.075233 containerd[1466]: time="2025-09-12T23:58:28.075193101Z" level=info msg="CreateContainer within sandbox \"a487103d5eff70b8f13840377734d309c2ce664230a6e5eaa595b8997abac4de\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Sep 12 23:58:28.105519 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount571119869.mount: Deactivated successfully. Sep 12 23:58:28.113012 containerd[1466]: time="2025-09-12T23:58:28.112963910Z" level=info msg="CreateContainer within sandbox \"a487103d5eff70b8f13840377734d309c2ce664230a6e5eaa595b8997abac4de\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"36ec79aeb70be718a28464d378e0c351ab8874c3c8da7aa97d71528145a37894\"" Sep 12 23:58:28.115097 containerd[1466]: time="2025-09-12T23:58:28.115021965Z" level=info msg="StartContainer for \"36ec79aeb70be718a28464d378e0c351ab8874c3c8da7aa97d71528145a37894\"" Sep 12 23:58:28.158972 systemd[1]: Started cri-containerd-36ec79aeb70be718a28464d378e0c351ab8874c3c8da7aa97d71528145a37894.scope - libcontainer container 36ec79aeb70be718a28464d378e0c351ab8874c3c8da7aa97d71528145a37894. 
Sep 12 23:58:28.197233 containerd[1466]: time="2025-09-12T23:58:28.197172022Z" level=info msg="StartContainer for \"36ec79aeb70be718a28464d378e0c351ab8874c3c8da7aa97d71528145a37894\" returns successfully" Sep 12 23:58:28.603270 systemd-networkd[1378]: cali685a3bbb20d: Gained IPv6LL Sep 12 23:58:28.641977 containerd[1466]: time="2025-09-12T23:58:28.640100388Z" level=info msg="StopPodSandbox for \"5b2dbe2aea561ebb825363ab4fd4162e4976fcfbef7dac50de1e5877f15bbce8\"" Sep 12 23:58:28.764204 containerd[1466]: 2025-09-12 23:58:28.710 [INFO][4967] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="5b2dbe2aea561ebb825363ab4fd4162e4976fcfbef7dac50de1e5877f15bbce8" Sep 12 23:58:28.764204 containerd[1466]: 2025-09-12 23:58:28.711 [INFO][4967] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="5b2dbe2aea561ebb825363ab4fd4162e4976fcfbef7dac50de1e5877f15bbce8" iface="eth0" netns="/var/run/netns/cni-046a08ad-4835-8dbc-ca0e-9f6ad6395559" Sep 12 23:58:28.764204 containerd[1466]: 2025-09-12 23:58:28.711 [INFO][4967] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="5b2dbe2aea561ebb825363ab4fd4162e4976fcfbef7dac50de1e5877f15bbce8" iface="eth0" netns="/var/run/netns/cni-046a08ad-4835-8dbc-ca0e-9f6ad6395559" Sep 12 23:58:28.764204 containerd[1466]: 2025-09-12 23:58:28.711 [INFO][4967] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. 
ContainerID="5b2dbe2aea561ebb825363ab4fd4162e4976fcfbef7dac50de1e5877f15bbce8" iface="eth0" netns="/var/run/netns/cni-046a08ad-4835-8dbc-ca0e-9f6ad6395559" Sep 12 23:58:28.764204 containerd[1466]: 2025-09-12 23:58:28.711 [INFO][4967] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="5b2dbe2aea561ebb825363ab4fd4162e4976fcfbef7dac50de1e5877f15bbce8" Sep 12 23:58:28.764204 containerd[1466]: 2025-09-12 23:58:28.712 [INFO][4967] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="5b2dbe2aea561ebb825363ab4fd4162e4976fcfbef7dac50de1e5877f15bbce8" Sep 12 23:58:28.764204 containerd[1466]: 2025-09-12 23:58:28.741 [INFO][4975] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="5b2dbe2aea561ebb825363ab4fd4162e4976fcfbef7dac50de1e5877f15bbce8" HandleID="k8s-pod-network.5b2dbe2aea561ebb825363ab4fd4162e4976fcfbef7dac50de1e5877f15bbce8" Workload="ci--4081--3--5--n--326e2e5946-k8s-coredns--668d6bf9bc--th8mg-eth0" Sep 12 23:58:28.764204 containerd[1466]: 2025-09-12 23:58:28.742 [INFO][4975] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 23:58:28.764204 containerd[1466]: 2025-09-12 23:58:28.742 [INFO][4975] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 12 23:58:28.764204 containerd[1466]: 2025-09-12 23:58:28.755 [WARNING][4975] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="5b2dbe2aea561ebb825363ab4fd4162e4976fcfbef7dac50de1e5877f15bbce8" HandleID="k8s-pod-network.5b2dbe2aea561ebb825363ab4fd4162e4976fcfbef7dac50de1e5877f15bbce8" Workload="ci--4081--3--5--n--326e2e5946-k8s-coredns--668d6bf9bc--th8mg-eth0" Sep 12 23:58:28.764204 containerd[1466]: 2025-09-12 23:58:28.755 [INFO][4975] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="5b2dbe2aea561ebb825363ab4fd4162e4976fcfbef7dac50de1e5877f15bbce8" HandleID="k8s-pod-network.5b2dbe2aea561ebb825363ab4fd4162e4976fcfbef7dac50de1e5877f15bbce8" Workload="ci--4081--3--5--n--326e2e5946-k8s-coredns--668d6bf9bc--th8mg-eth0" Sep 12 23:58:28.764204 containerd[1466]: 2025-09-12 23:58:28.758 [INFO][4975] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 23:58:28.764204 containerd[1466]: 2025-09-12 23:58:28.761 [INFO][4967] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="5b2dbe2aea561ebb825363ab4fd4162e4976fcfbef7dac50de1e5877f15bbce8" Sep 12 23:58:28.765262 containerd[1466]: time="2025-09-12T23:58:28.764828435Z" level=info msg="TearDown network for sandbox \"5b2dbe2aea561ebb825363ab4fd4162e4976fcfbef7dac50de1e5877f15bbce8\" successfully" Sep 12 23:58:28.765262 containerd[1466]: time="2025-09-12T23:58:28.764870234Z" level=info msg="StopPodSandbox for \"5b2dbe2aea561ebb825363ab4fd4162e4976fcfbef7dac50de1e5877f15bbce8\" returns successfully" Sep 12 23:58:28.766267 containerd[1466]: time="2025-09-12T23:58:28.766175858Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-th8mg,Uid:62ed9b16-b2ad-4c43-931a-9ec3cc4358d1,Namespace:kube-system,Attempt:1,}" Sep 12 23:58:28.930515 systemd-networkd[1378]: cali7acdeea3a11: Link UP Sep 12 23:58:28.931313 systemd-networkd[1378]: cali7acdeea3a11: Gained carrier Sep 12 23:58:28.959351 containerd[1466]: 2025-09-12 23:58:28.829 [INFO][4983] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} 
{ci--4081--3--5--n--326e2e5946-k8s-coredns--668d6bf9bc--th8mg-eth0 coredns-668d6bf9bc- kube-system 62ed9b16-b2ad-4c43-931a-9ec3cc4358d1 1006 0 2025-09-12 23:57:42 +0000 UTC map[k8s-app:kube-dns pod-template-hash:668d6bf9bc projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4081-3-5-n-326e2e5946 coredns-668d6bf9bc-th8mg eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali7acdeea3a11 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="72ad11ab37b25761274441baffbca605fea0949c341300ef9c36b8660adab32e" Namespace="kube-system" Pod="coredns-668d6bf9bc-th8mg" WorkloadEndpoint="ci--4081--3--5--n--326e2e5946-k8s-coredns--668d6bf9bc--th8mg-" Sep 12 23:58:28.959351 containerd[1466]: 2025-09-12 23:58:28.829 [INFO][4983] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="72ad11ab37b25761274441baffbca605fea0949c341300ef9c36b8660adab32e" Namespace="kube-system" Pod="coredns-668d6bf9bc-th8mg" WorkloadEndpoint="ci--4081--3--5--n--326e2e5946-k8s-coredns--668d6bf9bc--th8mg-eth0" Sep 12 23:58:28.959351 containerd[1466]: 2025-09-12 23:58:28.860 [INFO][4994] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="72ad11ab37b25761274441baffbca605fea0949c341300ef9c36b8660adab32e" HandleID="k8s-pod-network.72ad11ab37b25761274441baffbca605fea0949c341300ef9c36b8660adab32e" Workload="ci--4081--3--5--n--326e2e5946-k8s-coredns--668d6bf9bc--th8mg-eth0" Sep 12 23:58:28.959351 containerd[1466]: 2025-09-12 23:58:28.860 [INFO][4994] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="72ad11ab37b25761274441baffbca605fea0949c341300ef9c36b8660adab32e" HandleID="k8s-pod-network.72ad11ab37b25761274441baffbca605fea0949c341300ef9c36b8660adab32e" Workload="ci--4081--3--5--n--326e2e5946-k8s-coredns--668d6bf9bc--th8mg-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002d3660), 
Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4081-3-5-n-326e2e5946", "pod":"coredns-668d6bf9bc-th8mg", "timestamp":"2025-09-12 23:58:28.860626802 +0000 UTC"}, Hostname:"ci-4081-3-5-n-326e2e5946", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 23:58:28.959351 containerd[1466]: 2025-09-12 23:58:28.860 [INFO][4994] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 23:58:28.959351 containerd[1466]: 2025-09-12 23:58:28.860 [INFO][4994] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 12 23:58:28.959351 containerd[1466]: 2025-09-12 23:58:28.861 [INFO][4994] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4081-3-5-n-326e2e5946' Sep 12 23:58:28.959351 containerd[1466]: 2025-09-12 23:58:28.873 [INFO][4994] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.72ad11ab37b25761274441baffbca605fea0949c341300ef9c36b8660adab32e" host="ci-4081-3-5-n-326e2e5946" Sep 12 23:58:28.959351 containerd[1466]: 2025-09-12 23:58:28.881 [INFO][4994] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4081-3-5-n-326e2e5946" Sep 12 23:58:28.959351 containerd[1466]: 2025-09-12 23:58:28.888 [INFO][4994] ipam/ipam.go 511: Trying affinity for 192.168.36.64/26 host="ci-4081-3-5-n-326e2e5946" Sep 12 23:58:28.959351 containerd[1466]: 2025-09-12 23:58:28.892 [INFO][4994] ipam/ipam.go 158: Attempting to load block cidr=192.168.36.64/26 host="ci-4081-3-5-n-326e2e5946" Sep 12 23:58:28.959351 containerd[1466]: 2025-09-12 23:58:28.895 [INFO][4994] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.36.64/26 host="ci-4081-3-5-n-326e2e5946" Sep 12 23:58:28.959351 containerd[1466]: 2025-09-12 23:58:28.895 [INFO][4994] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.36.64/26 
handle="k8s-pod-network.72ad11ab37b25761274441baffbca605fea0949c341300ef9c36b8660adab32e" host="ci-4081-3-5-n-326e2e5946" Sep 12 23:58:28.959351 containerd[1466]: 2025-09-12 23:58:28.898 [INFO][4994] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.72ad11ab37b25761274441baffbca605fea0949c341300ef9c36b8660adab32e Sep 12 23:58:28.959351 containerd[1466]: 2025-09-12 23:58:28.904 [INFO][4994] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.36.64/26 handle="k8s-pod-network.72ad11ab37b25761274441baffbca605fea0949c341300ef9c36b8660adab32e" host="ci-4081-3-5-n-326e2e5946" Sep 12 23:58:28.959351 containerd[1466]: 2025-09-12 23:58:28.916 [INFO][4994] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.36.72/26] block=192.168.36.64/26 handle="k8s-pod-network.72ad11ab37b25761274441baffbca605fea0949c341300ef9c36b8660adab32e" host="ci-4081-3-5-n-326e2e5946" Sep 12 23:58:28.959351 containerd[1466]: 2025-09-12 23:58:28.916 [INFO][4994] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.36.72/26] handle="k8s-pod-network.72ad11ab37b25761274441baffbca605fea0949c341300ef9c36b8660adab32e" host="ci-4081-3-5-n-326e2e5946" Sep 12 23:58:28.959351 containerd[1466]: 2025-09-12 23:58:28.916 [INFO][4994] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 12 23:58:28.959351 containerd[1466]: 2025-09-12 23:58:28.916 [INFO][4994] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.36.72/26] IPv6=[] ContainerID="72ad11ab37b25761274441baffbca605fea0949c341300ef9c36b8660adab32e" HandleID="k8s-pod-network.72ad11ab37b25761274441baffbca605fea0949c341300ef9c36b8660adab32e" Workload="ci--4081--3--5--n--326e2e5946-k8s-coredns--668d6bf9bc--th8mg-eth0" Sep 12 23:58:28.963275 containerd[1466]: 2025-09-12 23:58:28.919 [INFO][4983] cni-plugin/k8s.go 418: Populated endpoint ContainerID="72ad11ab37b25761274441baffbca605fea0949c341300ef9c36b8660adab32e" Namespace="kube-system" Pod="coredns-668d6bf9bc-th8mg" WorkloadEndpoint="ci--4081--3--5--n--326e2e5946-k8s-coredns--668d6bf9bc--th8mg-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--5--n--326e2e5946-k8s-coredns--668d6bf9bc--th8mg-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"62ed9b16-b2ad-4c43-931a-9ec3cc4358d1", ResourceVersion:"1006", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 23, 57, 42, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-5-n-326e2e5946", ContainerID:"", Pod:"coredns-668d6bf9bc-th8mg", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.36.72/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, 
InterfaceName:"cali7acdeea3a11", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 23:58:28.963275 containerd[1466]: 2025-09-12 23:58:28.919 [INFO][4983] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.36.72/32] ContainerID="72ad11ab37b25761274441baffbca605fea0949c341300ef9c36b8660adab32e" Namespace="kube-system" Pod="coredns-668d6bf9bc-th8mg" WorkloadEndpoint="ci--4081--3--5--n--326e2e5946-k8s-coredns--668d6bf9bc--th8mg-eth0" Sep 12 23:58:28.963275 containerd[1466]: 2025-09-12 23:58:28.919 [INFO][4983] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali7acdeea3a11 ContainerID="72ad11ab37b25761274441baffbca605fea0949c341300ef9c36b8660adab32e" Namespace="kube-system" Pod="coredns-668d6bf9bc-th8mg" WorkloadEndpoint="ci--4081--3--5--n--326e2e5946-k8s-coredns--668d6bf9bc--th8mg-eth0" Sep 12 23:58:28.963275 containerd[1466]: 2025-09-12 23:58:28.930 [INFO][4983] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="72ad11ab37b25761274441baffbca605fea0949c341300ef9c36b8660adab32e" Namespace="kube-system" Pod="coredns-668d6bf9bc-th8mg" WorkloadEndpoint="ci--4081--3--5--n--326e2e5946-k8s-coredns--668d6bf9bc--th8mg-eth0" Sep 12 23:58:28.963275 containerd[1466]: 2025-09-12 23:58:28.930 [INFO][4983] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="72ad11ab37b25761274441baffbca605fea0949c341300ef9c36b8660adab32e" Namespace="kube-system" Pod="coredns-668d6bf9bc-th8mg" 
WorkloadEndpoint="ci--4081--3--5--n--326e2e5946-k8s-coredns--668d6bf9bc--th8mg-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--5--n--326e2e5946-k8s-coredns--668d6bf9bc--th8mg-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"62ed9b16-b2ad-4c43-931a-9ec3cc4358d1", ResourceVersion:"1006", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 23, 57, 42, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-5-n-326e2e5946", ContainerID:"72ad11ab37b25761274441baffbca605fea0949c341300ef9c36b8660adab32e", Pod:"coredns-668d6bf9bc-th8mg", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.36.72/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali7acdeea3a11", MAC:"56:8a:b2:b4:d1:f2", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 23:58:28.963275 
containerd[1466]: 2025-09-12 23:58:28.952 [INFO][4983] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="72ad11ab37b25761274441baffbca605fea0949c341300ef9c36b8660adab32e" Namespace="kube-system" Pod="coredns-668d6bf9bc-th8mg" WorkloadEndpoint="ci--4081--3--5--n--326e2e5946-k8s-coredns--668d6bf9bc--th8mg-eth0" Sep 12 23:58:28.990564 systemd[1]: run-containerd-runc-k8s.io-36ec79aeb70be718a28464d378e0c351ab8874c3c8da7aa97d71528145a37894-runc.Qze9vE.mount: Deactivated successfully. Sep 12 23:58:28.990674 systemd[1]: run-netns-cni\x2d046a08ad\x2d4835\x2d8dbc\x2dca0e\x2d9f6ad6395559.mount: Deactivated successfully. Sep 12 23:58:29.008547 containerd[1466]: time="2025-09-12T23:58:29.007913498Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Sep 12 23:58:29.008857 containerd[1466]: time="2025-09-12T23:58:29.008755768Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Sep 12 23:58:29.008974 containerd[1466]: time="2025-09-12T23:58:29.008933486Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 12 23:58:29.009412 containerd[1466]: time="2025-09-12T23:58:29.009355322Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 12 23:58:29.047432 systemd[1]: run-containerd-runc-k8s.io-72ad11ab37b25761274441baffbca605fea0949c341300ef9c36b8660adab32e-runc.DjkhC1.mount: Deactivated successfully. Sep 12 23:58:29.062985 systemd[1]: Started cri-containerd-72ad11ab37b25761274441baffbca605fea0949c341300ef9c36b8660adab32e.scope - libcontainer container 72ad11ab37b25761274441baffbca605fea0949c341300ef9c36b8660adab32e. 
Sep 12 23:58:29.113802 containerd[1466]: time="2025-09-12T23:58:29.113750873Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-th8mg,Uid:62ed9b16-b2ad-4c43-931a-9ec3cc4358d1,Namespace:kube-system,Attempt:1,} returns sandbox id \"72ad11ab37b25761274441baffbca605fea0949c341300ef9c36b8660adab32e\"" Sep 12 23:58:29.119453 containerd[1466]: time="2025-09-12T23:58:29.119274611Z" level=info msg="CreateContainer within sandbox \"72ad11ab37b25761274441baffbca605fea0949c341300ef9c36b8660adab32e\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Sep 12 23:58:29.137506 containerd[1466]: time="2025-09-12T23:58:29.137374729Z" level=info msg="CreateContainer within sandbox \"72ad11ab37b25761274441baffbca605fea0949c341300ef9c36b8660adab32e\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"da7c9706613a8a09fbd9fb092bc5349b2f4c8c808f5fb41127e48791654f4634\"" Sep 12 23:58:29.138851 containerd[1466]: time="2025-09-12T23:58:29.138822592Z" level=info msg="StartContainer for \"da7c9706613a8a09fbd9fb092bc5349b2f4c8c808f5fb41127e48791654f4634\"" Sep 12 23:58:29.173189 systemd[1]: Started cri-containerd-da7c9706613a8a09fbd9fb092bc5349b2f4c8c808f5fb41127e48791654f4634.scope - libcontainer container da7c9706613a8a09fbd9fb092bc5349b2f4c8c808f5fb41127e48791654f4634. 
Sep 12 23:58:29.177909 systemd-networkd[1378]: calibb1b8d41740: Gained IPv6LL Sep 12 23:58:29.229432 containerd[1466]: time="2025-09-12T23:58:29.229381219Z" level=info msg="StartContainer for \"da7c9706613a8a09fbd9fb092bc5349b2f4c8c808f5fb41127e48791654f4634\" returns successfully" Sep 12 23:58:29.959809 containerd[1466]: time="2025-09-12T23:58:29.959671124Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:58:29.962351 containerd[1466]: time="2025-09-12T23:58:29.962214896Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.3: active requests=0, bytes read=8227489" Sep 12 23:58:29.966207 containerd[1466]: time="2025-09-12T23:58:29.964264553Z" level=info msg="ImageCreate event name:\"sha256:5e2b30128ce4b607acd97d3edef62ce1a90be0259903090a51c360adbe4a8f3b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:58:29.969255 containerd[1466]: time="2025-09-12T23:58:29.969148738Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:f22c88018d8b58c4ef0052f594b216a13bd6852166ac131a538c5ab2fba23bb2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:58:29.971728 containerd[1466]: time="2025-09-12T23:58:29.971594191Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.30.3\" with image id \"sha256:5e2b30128ce4b607acd97d3edef62ce1a90be0259903090a51c360adbe4a8f3b\", repo tag \"ghcr.io/flatcar/calico/csi:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:f22c88018d8b58c4ef0052f594b216a13bd6852166ac131a538c5ab2fba23bb2\", size \"9596730\" in 1.897394558s" Sep 12 23:58:29.971964 containerd[1466]: time="2025-09-12T23:58:29.971921547Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.3\" returns image reference \"sha256:5e2b30128ce4b607acd97d3edef62ce1a90be0259903090a51c360adbe4a8f3b\"" Sep 12 23:58:29.975808 containerd[1466]: time="2025-09-12T23:58:29.975504747Z" level=info 
msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\"" Sep 12 23:58:29.977792 containerd[1466]: time="2025-09-12T23:58:29.977681403Z" level=info msg="CreateContainer within sandbox \"cec3f72e2a154c18e795f75ae44c72769e391b4cd7e152a22dedbe89193ce5e1\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" Sep 12 23:58:29.988776 kubelet[2585]: I0912 23:58:29.988647 2585 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 12 23:58:30.020360 kubelet[2585]: I0912 23:58:30.020285 2585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-6c79dd94d7-k7nnj" podStartSLOduration=35.286247818 podStartE2EDuration="40.02026451s" podCreationTimestamp="2025-09-12 23:57:50 +0000 UTC" firstStartedPulling="2025-09-12 23:58:23.337563814 +0000 UTC m=+45.816796941" lastFinishedPulling="2025-09-12 23:58:28.071580506 +0000 UTC m=+50.550813633" observedRunningTime="2025-09-12 23:58:28.99015067 +0000 UTC m=+51.469383797" watchObservedRunningTime="2025-09-12 23:58:30.02026451 +0000 UTC m=+52.499497637" Sep 12 23:58:30.039397 containerd[1466]: time="2025-09-12T23:58:30.039336840Z" level=info msg="CreateContainer within sandbox \"cec3f72e2a154c18e795f75ae44c72769e391b4cd7e152a22dedbe89193ce5e1\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"8a0669c80c1fdca86ed3b3ff7819eed693bac7eba71ec02fb67268baa414d1d8\"" Sep 12 23:58:30.043832 containerd[1466]: time="2025-09-12T23:58:30.042782205Z" level=info msg="StartContainer for \"8a0669c80c1fdca86ed3b3ff7819eed693bac7eba71ec02fb67268baa414d1d8\"" Sep 12 23:58:30.062504 kubelet[2585]: I0912 23:58:30.061821 2585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-668d6bf9bc-th8mg" podStartSLOduration=48.061800776 podStartE2EDuration="48.061800776s" podCreationTimestamp="2025-09-12 23:57:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2025-09-12 23:58:30.021817334 +0000 UTC m=+52.501050461" watchObservedRunningTime="2025-09-12 23:58:30.061800776 +0000 UTC m=+52.541033863" Sep 12 23:58:30.118956 systemd[1]: Started cri-containerd-8a0669c80c1fdca86ed3b3ff7819eed693bac7eba71ec02fb67268baa414d1d8.scope - libcontainer container 8a0669c80c1fdca86ed3b3ff7819eed693bac7eba71ec02fb67268baa414d1d8. Sep 12 23:58:30.198565 containerd[1466]: time="2025-09-12T23:58:30.198507092Z" level=info msg="StartContainer for \"8a0669c80c1fdca86ed3b3ff7819eed693bac7eba71ec02fb67268baa414d1d8\" returns successfully" Sep 12 23:58:30.716075 systemd-networkd[1378]: cali7acdeea3a11: Gained IPv6LL Sep 12 23:58:32.931894 containerd[1466]: time="2025-09-12T23:58:32.931828923Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:58:32.934303 containerd[1466]: time="2025-09-12T23:58:32.933875508Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.3: active requests=0, bytes read=48134957" Sep 12 23:58:32.936209 containerd[1466]: time="2025-09-12T23:58:32.935726573Z" level=info msg="ImageCreate event name:\"sha256:34117caf92350e1565610f2254377d7455b11e36666b5ce11b4a13670720432a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:58:32.939233 containerd[1466]: time="2025-09-12T23:58:32.939181267Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:27c4187717f08f0a5727019d8beb7597665eb47e69eaa1d7d091a7e28913e577\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:58:32.940024 containerd[1466]: time="2025-09-12T23:58:32.939990821Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" with image id \"sha256:34117caf92350e1565610f2254377d7455b11e36666b5ce11b4a13670720432a\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\", repo digest 
\"ghcr.io/flatcar/calico/kube-controllers@sha256:27c4187717f08f0a5727019d8beb7597665eb47e69eaa1d7d091a7e28913e577\", size \"49504166\" in 2.964396674s" Sep 12 23:58:32.940127 containerd[1466]: time="2025-09-12T23:58:32.940111620Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" returns image reference \"sha256:34117caf92350e1565610f2254377d7455b11e36666b5ce11b4a13670720432a\"" Sep 12 23:58:32.941546 containerd[1466]: time="2025-09-12T23:58:32.941517329Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.3\"" Sep 12 23:58:32.973701 containerd[1466]: time="2025-09-12T23:58:32.973660403Z" level=info msg="CreateContainer within sandbox \"7aea2aa7848c44875e8f00feb031c6d89d7731ad1fa2a8376fd3054b3b18a753\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" Sep 12 23:58:32.996480 containerd[1466]: time="2025-09-12T23:58:32.996428749Z" level=info msg="CreateContainer within sandbox \"7aea2aa7848c44875e8f00feb031c6d89d7731ad1fa2a8376fd3054b3b18a753\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"ea8513d99f9393489a1fbf2932b3975999bf81b2b87e8fd50bec7a496582a4bd\"" Sep 12 23:58:32.998859 containerd[1466]: time="2025-09-12T23:58:32.997968617Z" level=info msg="StartContainer for \"ea8513d99f9393489a1fbf2932b3975999bf81b2b87e8fd50bec7a496582a4bd\"" Sep 12 23:58:33.073982 systemd[1]: Started cri-containerd-ea8513d99f9393489a1fbf2932b3975999bf81b2b87e8fd50bec7a496582a4bd.scope - libcontainer container ea8513d99f9393489a1fbf2932b3975999bf81b2b87e8fd50bec7a496582a4bd. 
Sep 12 23:58:33.122236 containerd[1466]: time="2025-09-12T23:58:33.122114441Z" level=info msg="StartContainer for \"ea8513d99f9393489a1fbf2932b3975999bf81b2b87e8fd50bec7a496582a4bd\" returns successfully" Sep 12 23:58:34.060780 kubelet[2585]: I0912 23:58:34.059899 2585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-6c4999d64-82t4c" podStartSLOduration=24.433679978 podStartE2EDuration="33.059879763s" podCreationTimestamp="2025-09-12 23:58:01 +0000 UTC" firstStartedPulling="2025-09-12 23:58:24.314767388 +0000 UTC m=+46.794000515" lastFinishedPulling="2025-09-12 23:58:32.940967213 +0000 UTC m=+55.420200300" observedRunningTime="2025-09-12 23:58:34.053832876 +0000 UTC m=+56.533066003" watchObservedRunningTime="2025-09-12 23:58:34.059879763 +0000 UTC m=+56.539112850" Sep 12 23:58:35.500371 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4252056191.mount: Deactivated successfully. Sep 12 23:58:35.907136 containerd[1466]: time="2025-09-12T23:58:35.907063547Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.3: active requests=0, bytes read=61845332" Sep 12 23:58:35.913156 containerd[1466]: time="2025-09-12T23:58:35.912989880Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/goldmane:v3.30.3\" with image id \"sha256:14088376331a0622b7f6a2fbc2f2932806a6eafdd7b602f6139d3b985bf1e685\", repo tag \"ghcr.io/flatcar/calico/goldmane:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/goldmane@sha256:46297703ab3739331a00a58f0d6a5498c8d3b6523ad947eed68592ee0f3e79f0\", size \"61845178\" in 2.971338632s" Sep 12 23:58:35.913156 containerd[1466]: time="2025-09-12T23:58:35.913039560Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.3\" returns image reference \"sha256:14088376331a0622b7f6a2fbc2f2932806a6eafdd7b602f6139d3b985bf1e685\"" Sep 12 23:58:35.919802 containerd[1466]: time="2025-09-12T23:58:35.918631255Z" level=info msg="ImageCreate event 
name:\"ghcr.io/flatcar/calico/goldmane:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:58:35.919802 containerd[1466]: time="2025-09-12T23:58:35.919569651Z" level=info msg="ImageCreate event name:\"sha256:14088376331a0622b7f6a2fbc2f2932806a6eafdd7b602f6139d3b985bf1e685\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:58:35.920616 containerd[1466]: time="2025-09-12T23:58:35.920578807Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane@sha256:46297703ab3739331a00a58f0d6a5498c8d3b6523ad947eed68592ee0f3e79f0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:58:35.931039 containerd[1466]: time="2025-09-12T23:58:35.930983041Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\"" Sep 12 23:58:35.936723 containerd[1466]: time="2025-09-12T23:58:35.936655295Z" level=info msg="CreateContainer within sandbox \"b468fa4ff2b5b8dc7cf1c81c07626ef41f069d487f08bc09a37255e851f48d4a\" for container &ContainerMetadata{Name:goldmane,Attempt:0,}" Sep 12 23:58:35.957872 containerd[1466]: time="2025-09-12T23:58:35.957822281Z" level=info msg="CreateContainer within sandbox \"b468fa4ff2b5b8dc7cf1c81c07626ef41f069d487f08bc09a37255e851f48d4a\" for &ContainerMetadata{Name:goldmane,Attempt:0,} returns container id \"e3a11417fa8dc4433e7e0c5f5a2d1992f0edcfc00bc14edd804cc5527b99f09c\"" Sep 12 23:58:35.960029 containerd[1466]: time="2025-09-12T23:58:35.959970592Z" level=info msg="StartContainer for \"e3a11417fa8dc4433e7e0c5f5a2d1992f0edcfc00bc14edd804cc5527b99f09c\"" Sep 12 23:58:36.000986 systemd[1]: Started cri-containerd-e3a11417fa8dc4433e7e0c5f5a2d1992f0edcfc00bc14edd804cc5527b99f09c.scope - libcontainer container e3a11417fa8dc4433e7e0c5f5a2d1992f0edcfc00bc14edd804cc5527b99f09c. 
Sep 12 23:58:36.044552 containerd[1466]: time="2025-09-12T23:58:36.044507181Z" level=info msg="StartContainer for \"e3a11417fa8dc4433e7e0c5f5a2d1992f0edcfc00bc14edd804cc5527b99f09c\" returns successfully" Sep 12 23:58:36.056460 systemd[1]: run-containerd-runc-k8s.io-e3a11417fa8dc4433e7e0c5f5a2d1992f0edcfc00bc14edd804cc5527b99f09c-runc.F6np1d.mount: Deactivated successfully. Sep 12 23:58:36.377363 containerd[1466]: time="2025-09-12T23:58:36.377238078Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:58:36.379168 containerd[1466]: time="2025-09-12T23:58:36.377798676Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.3: active requests=0, bytes read=77" Sep 12 23:58:36.381832 containerd[1466]: time="2025-09-12T23:58:36.381791583Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" with image id \"sha256:632fbde00b1918016ac07458e79cc438ccda83cb762bfd5fc50a26721abced08\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\", size \"45900064\" in 450.752943ms" Sep 12 23:58:36.381832 containerd[1466]: time="2025-09-12T23:58:36.381835223Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" returns image reference \"sha256:632fbde00b1918016ac07458e79cc438ccda83cb762bfd5fc50a26721abced08\"" Sep 12 23:58:36.383061 containerd[1466]: time="2025-09-12T23:58:36.382898419Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\"" Sep 12 23:58:36.386317 containerd[1466]: time="2025-09-12T23:58:36.386278567Z" level=info msg="CreateContainer within sandbox \"db6961ef6fb7399026dac80a18eefe324112284ac1828ccb8b3cce4891e57fba\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Sep 12 23:58:36.421092 containerd[1466]: 
time="2025-09-12T23:58:36.420989848Z" level=info msg="CreateContainer within sandbox \"db6961ef6fb7399026dac80a18eefe324112284ac1828ccb8b3cce4891e57fba\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"db33830b61a966deccd6ca1a2cdec592008f3dd32185f0ad171dc741cfa3f91e\"" Sep 12 23:58:36.423751 containerd[1466]: time="2025-09-12T23:58:36.422413043Z" level=info msg="StartContainer for \"db33830b61a966deccd6ca1a2cdec592008f3dd32185f0ad171dc741cfa3f91e\"" Sep 12 23:58:36.482147 systemd[1]: Started cri-containerd-db33830b61a966deccd6ca1a2cdec592008f3dd32185f0ad171dc741cfa3f91e.scope - libcontainer container db33830b61a966deccd6ca1a2cdec592008f3dd32185f0ad171dc741cfa3f91e. Sep 12 23:58:36.532251 containerd[1466]: time="2025-09-12T23:58:36.531777908Z" level=info msg="StartContainer for \"db33830b61a966deccd6ca1a2cdec592008f3dd32185f0ad171dc741cfa3f91e\" returns successfully" Sep 12 23:58:37.068784 kubelet[2585]: I0912 23:58:37.068056 2585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-6c79dd94d7-qdfzm" podStartSLOduration=38.042054475 podStartE2EDuration="47.068036893s" podCreationTimestamp="2025-09-12 23:57:50 +0000 UTC" firstStartedPulling="2025-09-12 23:58:27.356589962 +0000 UTC m=+49.835823049" lastFinishedPulling="2025-09-12 23:58:36.38257234 +0000 UTC m=+58.861805467" observedRunningTime="2025-09-12 23:58:37.065266019 +0000 UTC m=+59.544499186" watchObservedRunningTime="2025-09-12 23:58:37.068036893 +0000 UTC m=+59.547270020" Sep 12 23:58:37.098830 kubelet[2585]: I0912 23:58:37.098735 2585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/goldmane-54d579b49d-bjt98" podStartSLOduration=28.433706307 podStartE2EDuration="37.098695657s" podCreationTimestamp="2025-09-12 23:58:00 +0000 UTC" firstStartedPulling="2025-09-12 23:58:27.265673732 +0000 UTC m=+49.744906859" lastFinishedPulling="2025-09-12 23:58:35.930663002 +0000 UTC m=+58.409896209" 
observedRunningTime="2025-09-12 23:58:37.0976593 +0000 UTC m=+59.576892427" watchObservedRunningTime="2025-09-12 23:58:37.098695657 +0000 UTC m=+59.577928784" Sep 12 23:58:37.654295 containerd[1466]: time="2025-09-12T23:58:37.653306134Z" level=info msg="StopPodSandbox for \"7e321eb757593a56c9ed12f777eae66d167b04d882e2b07864e347691e3c3f98\"" Sep 12 23:58:37.791211 containerd[1466]: 2025-09-12 23:58:37.723 [WARNING][5343] cni-plugin/k8s.go 604: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="7e321eb757593a56c9ed12f777eae66d167b04d882e2b07864e347691e3c3f98" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--5--n--326e2e5946-k8s-calico--kube--controllers--6c4999d64--82t4c-eth0", GenerateName:"calico-kube-controllers-6c4999d64-", Namespace:"calico-system", SelfLink:"", UID:"94d742ff-488d-49c2-b166-9848bac8f37e", ResourceVersion:"1047", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 23, 58, 1, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"6c4999d64", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-5-n-326e2e5946", ContainerID:"7aea2aa7848c44875e8f00feb031c6d89d7731ad1fa2a8376fd3054b3b18a753", Pod:"calico-kube-controllers-6c4999d64-82t4c", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.36.68/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", 
Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calide756206fe3", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 23:58:37.791211 containerd[1466]: 2025-09-12 23:58:37.723 [INFO][5343] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="7e321eb757593a56c9ed12f777eae66d167b04d882e2b07864e347691e3c3f98" Sep 12 23:58:37.791211 containerd[1466]: 2025-09-12 23:58:37.723 [INFO][5343] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="7e321eb757593a56c9ed12f777eae66d167b04d882e2b07864e347691e3c3f98" iface="eth0" netns="" Sep 12 23:58:37.791211 containerd[1466]: 2025-09-12 23:58:37.723 [INFO][5343] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="7e321eb757593a56c9ed12f777eae66d167b04d882e2b07864e347691e3c3f98" Sep 12 23:58:37.791211 containerd[1466]: 2025-09-12 23:58:37.723 [INFO][5343] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="7e321eb757593a56c9ed12f777eae66d167b04d882e2b07864e347691e3c3f98" Sep 12 23:58:37.791211 containerd[1466]: 2025-09-12 23:58:37.770 [INFO][5350] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="7e321eb757593a56c9ed12f777eae66d167b04d882e2b07864e347691e3c3f98" HandleID="k8s-pod-network.7e321eb757593a56c9ed12f777eae66d167b04d882e2b07864e347691e3c3f98" Workload="ci--4081--3--5--n--326e2e5946-k8s-calico--kube--controllers--6c4999d64--82t4c-eth0" Sep 12 23:58:37.791211 containerd[1466]: 2025-09-12 23:58:37.771 [INFO][5350] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 23:58:37.791211 containerd[1466]: 2025-09-12 23:58:37.771 [INFO][5350] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 12 23:58:37.791211 containerd[1466]: 2025-09-12 23:58:37.781 [WARNING][5350] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="7e321eb757593a56c9ed12f777eae66d167b04d882e2b07864e347691e3c3f98" HandleID="k8s-pod-network.7e321eb757593a56c9ed12f777eae66d167b04d882e2b07864e347691e3c3f98" Workload="ci--4081--3--5--n--326e2e5946-k8s-calico--kube--controllers--6c4999d64--82t4c-eth0" Sep 12 23:58:37.791211 containerd[1466]: 2025-09-12 23:58:37.781 [INFO][5350] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="7e321eb757593a56c9ed12f777eae66d167b04d882e2b07864e347691e3c3f98" HandleID="k8s-pod-network.7e321eb757593a56c9ed12f777eae66d167b04d882e2b07864e347691e3c3f98" Workload="ci--4081--3--5--n--326e2e5946-k8s-calico--kube--controllers--6c4999d64--82t4c-eth0" Sep 12 23:58:37.791211 containerd[1466]: 2025-09-12 23:58:37.785 [INFO][5350] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 23:58:37.791211 containerd[1466]: 2025-09-12 23:58:37.788 [INFO][5343] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="7e321eb757593a56c9ed12f777eae66d167b04d882e2b07864e347691e3c3f98" Sep 12 23:58:37.791211 containerd[1466]: time="2025-09-12T23:58:37.790752796Z" level=info msg="TearDown network for sandbox \"7e321eb757593a56c9ed12f777eae66d167b04d882e2b07864e347691e3c3f98\" successfully" Sep 12 23:58:37.791211 containerd[1466]: time="2025-09-12T23:58:37.790778716Z" level=info msg="StopPodSandbox for \"7e321eb757593a56c9ed12f777eae66d167b04d882e2b07864e347691e3c3f98\" returns successfully" Sep 12 23:58:37.792081 containerd[1466]: time="2025-09-12T23:58:37.791411154Z" level=info msg="RemovePodSandbox for \"7e321eb757593a56c9ed12f777eae66d167b04d882e2b07864e347691e3c3f98\"" Sep 12 23:58:37.796746 containerd[1466]: time="2025-09-12T23:58:37.796040943Z" level=info msg="Forcibly stopping sandbox \"7e321eb757593a56c9ed12f777eae66d167b04d882e2b07864e347691e3c3f98\"" Sep 12 23:58:37.920099 containerd[1466]: 2025-09-12 23:58:37.849 [WARNING][5364] cni-plugin/k8s.go 604: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="7e321eb757593a56c9ed12f777eae66d167b04d882e2b07864e347691e3c3f98" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--5--n--326e2e5946-k8s-calico--kube--controllers--6c4999d64--82t4c-eth0", GenerateName:"calico-kube-controllers-6c4999d64-", Namespace:"calico-system", SelfLink:"", UID:"94d742ff-488d-49c2-b166-9848bac8f37e", ResourceVersion:"1047", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 23, 58, 1, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"6c4999d64", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-5-n-326e2e5946", ContainerID:"7aea2aa7848c44875e8f00feb031c6d89d7731ad1fa2a8376fd3054b3b18a753", Pod:"calico-kube-controllers-6c4999d64-82t4c", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.36.68/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calide756206fe3", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 23:58:37.920099 containerd[1466]: 2025-09-12 23:58:37.852 [INFO][5364] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="7e321eb757593a56c9ed12f777eae66d167b04d882e2b07864e347691e3c3f98" Sep 12 23:58:37.920099 containerd[1466]: 2025-09-12 23:58:37.852 [INFO][5364] 
cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="7e321eb757593a56c9ed12f777eae66d167b04d882e2b07864e347691e3c3f98" iface="eth0" netns="" Sep 12 23:58:37.920099 containerd[1466]: 2025-09-12 23:58:37.852 [INFO][5364] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="7e321eb757593a56c9ed12f777eae66d167b04d882e2b07864e347691e3c3f98" Sep 12 23:58:37.920099 containerd[1466]: 2025-09-12 23:58:37.852 [INFO][5364] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="7e321eb757593a56c9ed12f777eae66d167b04d882e2b07864e347691e3c3f98" Sep 12 23:58:37.920099 containerd[1466]: 2025-09-12 23:58:37.888 [INFO][5371] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="7e321eb757593a56c9ed12f777eae66d167b04d882e2b07864e347691e3c3f98" HandleID="k8s-pod-network.7e321eb757593a56c9ed12f777eae66d167b04d882e2b07864e347691e3c3f98" Workload="ci--4081--3--5--n--326e2e5946-k8s-calico--kube--controllers--6c4999d64--82t4c-eth0" Sep 12 23:58:37.920099 containerd[1466]: 2025-09-12 23:58:37.888 [INFO][5371] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 23:58:37.920099 containerd[1466]: 2025-09-12 23:58:37.888 [INFO][5371] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 12 23:58:37.920099 containerd[1466]: 2025-09-12 23:58:37.909 [WARNING][5371] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="7e321eb757593a56c9ed12f777eae66d167b04d882e2b07864e347691e3c3f98" HandleID="k8s-pod-network.7e321eb757593a56c9ed12f777eae66d167b04d882e2b07864e347691e3c3f98" Workload="ci--4081--3--5--n--326e2e5946-k8s-calico--kube--controllers--6c4999d64--82t4c-eth0" Sep 12 23:58:37.920099 containerd[1466]: 2025-09-12 23:58:37.909 [INFO][5371] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="7e321eb757593a56c9ed12f777eae66d167b04d882e2b07864e347691e3c3f98" HandleID="k8s-pod-network.7e321eb757593a56c9ed12f777eae66d167b04d882e2b07864e347691e3c3f98" Workload="ci--4081--3--5--n--326e2e5946-k8s-calico--kube--controllers--6c4999d64--82t4c-eth0" Sep 12 23:58:37.920099 containerd[1466]: 2025-09-12 23:58:37.913 [INFO][5371] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 23:58:37.920099 containerd[1466]: 2025-09-12 23:58:37.916 [INFO][5364] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="7e321eb757593a56c9ed12f777eae66d167b04d882e2b07864e347691e3c3f98" Sep 12 23:58:37.920099 containerd[1466]: time="2025-09-12T23:58:37.919895838Z" level=info msg="TearDown network for sandbox \"7e321eb757593a56c9ed12f777eae66d167b04d882e2b07864e347691e3c3f98\" successfully" Sep 12 23:58:37.931749 containerd[1466]: time="2025-09-12T23:58:37.930864651Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"7e321eb757593a56c9ed12f777eae66d167b04d882e2b07864e347691e3c3f98\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Sep 12 23:58:37.931749 containerd[1466]: time="2025-09-12T23:58:37.931021691Z" level=info msg="RemovePodSandbox \"7e321eb757593a56c9ed12f777eae66d167b04d882e2b07864e347691e3c3f98\" returns successfully" Sep 12 23:58:37.932363 containerd[1466]: time="2025-09-12T23:58:37.932271008Z" level=info msg="StopPodSandbox for \"282c32aff9fb2fe91803704b8dfa8fe772bfc540faa928a4e1c8e6d09d4264d0\"" Sep 12 23:58:38.113339 containerd[1466]: 2025-09-12 23:58:38.024 [WARNING][5385] cni-plugin/k8s.go 604: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="282c32aff9fb2fe91803704b8dfa8fe772bfc540faa928a4e1c8e6d09d4264d0" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--5--n--326e2e5946-k8s-goldmane--54d579b49d--bjt98-eth0", GenerateName:"goldmane-54d579b49d-", Namespace:"calico-system", SelfLink:"", UID:"e2914591-bb15-4357-8b0d-6d29e5119ec7", ResourceVersion:"1065", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 23, 58, 0, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"54d579b49d", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-5-n-326e2e5946", ContainerID:"b468fa4ff2b5b8dc7cf1c81c07626ef41f069d487f08bc09a37255e851f48d4a", Pod:"goldmane-54d579b49d-bjt98", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.36.70/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, 
InterfaceName:"cali685a3bbb20d", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 23:58:38.113339 containerd[1466]: 2025-09-12 23:58:38.025 [INFO][5385] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="282c32aff9fb2fe91803704b8dfa8fe772bfc540faa928a4e1c8e6d09d4264d0" Sep 12 23:58:38.113339 containerd[1466]: 2025-09-12 23:58:38.025 [INFO][5385] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="282c32aff9fb2fe91803704b8dfa8fe772bfc540faa928a4e1c8e6d09d4264d0" iface="eth0" netns="" Sep 12 23:58:38.113339 containerd[1466]: 2025-09-12 23:58:38.025 [INFO][5385] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="282c32aff9fb2fe91803704b8dfa8fe772bfc540faa928a4e1c8e6d09d4264d0" Sep 12 23:58:38.113339 containerd[1466]: 2025-09-12 23:58:38.025 [INFO][5385] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="282c32aff9fb2fe91803704b8dfa8fe772bfc540faa928a4e1c8e6d09d4264d0" Sep 12 23:58:38.113339 containerd[1466]: 2025-09-12 23:58:38.079 [INFO][5393] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="282c32aff9fb2fe91803704b8dfa8fe772bfc540faa928a4e1c8e6d09d4264d0" HandleID="k8s-pod-network.282c32aff9fb2fe91803704b8dfa8fe772bfc540faa928a4e1c8e6d09d4264d0" Workload="ci--4081--3--5--n--326e2e5946-k8s-goldmane--54d579b49d--bjt98-eth0" Sep 12 23:58:38.113339 containerd[1466]: 2025-09-12 23:58:38.079 [INFO][5393] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 23:58:38.113339 containerd[1466]: 2025-09-12 23:58:38.080 [INFO][5393] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 12 23:58:38.113339 containerd[1466]: 2025-09-12 23:58:38.096 [WARNING][5393] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="282c32aff9fb2fe91803704b8dfa8fe772bfc540faa928a4e1c8e6d09d4264d0" HandleID="k8s-pod-network.282c32aff9fb2fe91803704b8dfa8fe772bfc540faa928a4e1c8e6d09d4264d0" Workload="ci--4081--3--5--n--326e2e5946-k8s-goldmane--54d579b49d--bjt98-eth0" Sep 12 23:58:38.113339 containerd[1466]: 2025-09-12 23:58:38.096 [INFO][5393] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="282c32aff9fb2fe91803704b8dfa8fe772bfc540faa928a4e1c8e6d09d4264d0" HandleID="k8s-pod-network.282c32aff9fb2fe91803704b8dfa8fe772bfc540faa928a4e1c8e6d09d4264d0" Workload="ci--4081--3--5--n--326e2e5946-k8s-goldmane--54d579b49d--bjt98-eth0" Sep 12 23:58:38.113339 containerd[1466]: 2025-09-12 23:58:38.103 [INFO][5393] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 23:58:38.113339 containerd[1466]: 2025-09-12 23:58:38.111 [INFO][5385] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="282c32aff9fb2fe91803704b8dfa8fe772bfc540faa928a4e1c8e6d09d4264d0" Sep 12 23:58:38.115779 containerd[1466]: time="2025-09-12T23:58:38.113863348Z" level=info msg="TearDown network for sandbox \"282c32aff9fb2fe91803704b8dfa8fe772bfc540faa928a4e1c8e6d09d4264d0\" successfully" Sep 12 23:58:38.115779 containerd[1466]: time="2025-09-12T23:58:38.113919508Z" level=info msg="StopPodSandbox for \"282c32aff9fb2fe91803704b8dfa8fe772bfc540faa928a4e1c8e6d09d4264d0\" returns successfully" Sep 12 23:58:38.116219 containerd[1466]: time="2025-09-12T23:58:38.116184745Z" level=info msg="RemovePodSandbox for \"282c32aff9fb2fe91803704b8dfa8fe772bfc540faa928a4e1c8e6d09d4264d0\"" Sep 12 23:58:38.116315 containerd[1466]: time="2025-09-12T23:58:38.116229345Z" level=info msg="Forcibly stopping sandbox \"282c32aff9fb2fe91803704b8dfa8fe772bfc540faa928a4e1c8e6d09d4264d0\"" Sep 12 23:58:38.327360 containerd[1466]: 2025-09-12 23:58:38.215 [WARNING][5432] cni-plugin/k8s.go 604: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="282c32aff9fb2fe91803704b8dfa8fe772bfc540faa928a4e1c8e6d09d4264d0" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--5--n--326e2e5946-k8s-goldmane--54d579b49d--bjt98-eth0", GenerateName:"goldmane-54d579b49d-", Namespace:"calico-system", SelfLink:"", UID:"e2914591-bb15-4357-8b0d-6d29e5119ec7", ResourceVersion:"1065", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 23, 58, 0, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"54d579b49d", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-5-n-326e2e5946", ContainerID:"b468fa4ff2b5b8dc7cf1c81c07626ef41f069d487f08bc09a37255e851f48d4a", Pod:"goldmane-54d579b49d-bjt98", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.36.70/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali685a3bbb20d", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 23:58:38.327360 containerd[1466]: 2025-09-12 23:58:38.216 [INFO][5432] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="282c32aff9fb2fe91803704b8dfa8fe772bfc540faa928a4e1c8e6d09d4264d0" Sep 12 23:58:38.327360 containerd[1466]: 2025-09-12 23:58:38.216 [INFO][5432] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. 
ContainerID="282c32aff9fb2fe91803704b8dfa8fe772bfc540faa928a4e1c8e6d09d4264d0" iface="eth0" netns="" Sep 12 23:58:38.327360 containerd[1466]: 2025-09-12 23:58:38.216 [INFO][5432] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="282c32aff9fb2fe91803704b8dfa8fe772bfc540faa928a4e1c8e6d09d4264d0" Sep 12 23:58:38.327360 containerd[1466]: 2025-09-12 23:58:38.216 [INFO][5432] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="282c32aff9fb2fe91803704b8dfa8fe772bfc540faa928a4e1c8e6d09d4264d0" Sep 12 23:58:38.327360 containerd[1466]: 2025-09-12 23:58:38.291 [INFO][5445] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="282c32aff9fb2fe91803704b8dfa8fe772bfc540faa928a4e1c8e6d09d4264d0" HandleID="k8s-pod-network.282c32aff9fb2fe91803704b8dfa8fe772bfc540faa928a4e1c8e6d09d4264d0" Workload="ci--4081--3--5--n--326e2e5946-k8s-goldmane--54d579b49d--bjt98-eth0" Sep 12 23:58:38.327360 containerd[1466]: 2025-09-12 23:58:38.292 [INFO][5445] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 23:58:38.327360 containerd[1466]: 2025-09-12 23:58:38.292 [INFO][5445] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 12 23:58:38.327360 containerd[1466]: 2025-09-12 23:58:38.306 [WARNING][5445] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="282c32aff9fb2fe91803704b8dfa8fe772bfc540faa928a4e1c8e6d09d4264d0" HandleID="k8s-pod-network.282c32aff9fb2fe91803704b8dfa8fe772bfc540faa928a4e1c8e6d09d4264d0" Workload="ci--4081--3--5--n--326e2e5946-k8s-goldmane--54d579b49d--bjt98-eth0" Sep 12 23:58:38.327360 containerd[1466]: 2025-09-12 23:58:38.306 [INFO][5445] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="282c32aff9fb2fe91803704b8dfa8fe772bfc540faa928a4e1c8e6d09d4264d0" HandleID="k8s-pod-network.282c32aff9fb2fe91803704b8dfa8fe772bfc540faa928a4e1c8e6d09d4264d0" Workload="ci--4081--3--5--n--326e2e5946-k8s-goldmane--54d579b49d--bjt98-eth0" Sep 12 23:58:38.327360 containerd[1466]: 2025-09-12 23:58:38.318 [INFO][5445] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 23:58:38.327360 containerd[1466]: 2025-09-12 23:58:38.323 [INFO][5432] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="282c32aff9fb2fe91803704b8dfa8fe772bfc540faa928a4e1c8e6d09d4264d0" Sep 12 23:58:38.328416 containerd[1466]: time="2025-09-12T23:58:38.327984264Z" level=info msg="TearDown network for sandbox \"282c32aff9fb2fe91803704b8dfa8fe772bfc540faa928a4e1c8e6d09d4264d0\" successfully" Sep 12 23:58:38.352921 containerd[1466]: time="2025-09-12T23:58:38.352834626Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"282c32aff9fb2fe91803704b8dfa8fe772bfc540faa928a4e1c8e6d09d4264d0\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Sep 12 23:58:38.353696 containerd[1466]: time="2025-09-12T23:58:38.352969546Z" level=info msg="RemovePodSandbox \"282c32aff9fb2fe91803704b8dfa8fe772bfc540faa928a4e1c8e6d09d4264d0\" returns successfully" Sep 12 23:58:38.355110 containerd[1466]: time="2025-09-12T23:58:38.354210064Z" level=info msg="StopPodSandbox for \"c3dad0ac9ee4c998847493e6792a88312db0dbe2f675e68f2090ab9dd9714fa2\"" Sep 12 23:58:38.468398 containerd[1466]: time="2025-09-12T23:58:38.467388693Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:58:38.475963 containerd[1466]: time="2025-09-12T23:58:38.475892920Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3: active requests=0, bytes read=13761208" Sep 12 23:58:38.479787 containerd[1466]: time="2025-09-12T23:58:38.479384555Z" level=info msg="ImageCreate event name:\"sha256:a319b5bdc1001e98875b68e2943279adb74bcb19d09f1db857bc27959a078a65\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:58:38.482755 containerd[1466]: time="2025-09-12T23:58:38.482270550Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:731ab232ca708102ab332340b1274d5cd656aa896ecc5368ee95850b811df86f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:58:38.484818 containerd[1466]: time="2025-09-12T23:58:38.484515627Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" with image id \"sha256:a319b5bdc1001e98875b68e2943279adb74bcb19d09f1db857bc27959a078a65\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:731ab232ca708102ab332340b1274d5cd656aa896ecc5368ee95850b811df86f\", size \"15130401\" in 2.101545128s" Sep 12 23:58:38.484818 containerd[1466]: time="2025-09-12T23:58:38.484582027Z" level=info msg="PullImage 
\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" returns image reference \"sha256:a319b5bdc1001e98875b68e2943279adb74bcb19d09f1db857bc27959a078a65\"" Sep 12 23:58:38.492667 containerd[1466]: time="2025-09-12T23:58:38.492613775Z" level=info msg="CreateContainer within sandbox \"cec3f72e2a154c18e795f75ae44c72769e391b4cd7e152a22dedbe89193ce5e1\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" Sep 12 23:58:38.531273 containerd[1466]: time="2025-09-12T23:58:38.531222556Z" level=info msg="CreateContainer within sandbox \"cec3f72e2a154c18e795f75ae44c72769e391b4cd7e152a22dedbe89193ce5e1\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"ca217522025d1f72d7eb5081da4eabcf22fdd28add47bdbdb2c89083e08d58e1\"" Sep 12 23:58:38.532964 containerd[1466]: time="2025-09-12T23:58:38.532863954Z" level=info msg="StartContainer for \"ca217522025d1f72d7eb5081da4eabcf22fdd28add47bdbdb2c89083e08d58e1\"" Sep 12 23:58:38.549774 containerd[1466]: 2025-09-12 23:58:38.454 [WARNING][5461] cni-plugin/k8s.go 604: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="c3dad0ac9ee4c998847493e6792a88312db0dbe2f675e68f2090ab9dd9714fa2" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--5--n--326e2e5946-k8s-csi--node--driver--vdhzb-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"ed6d6d1f-7445-4602-8d9e-b0dd54215b8f", ResourceVersion:"941", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 23, 58, 1, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"6c96d95cc7", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-5-n-326e2e5946", ContainerID:"cec3f72e2a154c18e795f75ae44c72769e391b4cd7e152a22dedbe89193ce5e1", Pod:"csi-node-driver-vdhzb", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.36.67/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calicb57a4712a1", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 23:58:38.549774 containerd[1466]: 2025-09-12 23:58:38.454 [INFO][5461] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="c3dad0ac9ee4c998847493e6792a88312db0dbe2f675e68f2090ab9dd9714fa2" Sep 12 23:58:38.549774 containerd[1466]: 2025-09-12 23:58:38.454 [INFO][5461] cni-plugin/dataplane_linux.go 555: CleanUpNamespace 
called with no netns name, ignoring. ContainerID="c3dad0ac9ee4c998847493e6792a88312db0dbe2f675e68f2090ab9dd9714fa2" iface="eth0" netns="" Sep 12 23:58:38.549774 containerd[1466]: 2025-09-12 23:58:38.454 [INFO][5461] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="c3dad0ac9ee4c998847493e6792a88312db0dbe2f675e68f2090ab9dd9714fa2" Sep 12 23:58:38.549774 containerd[1466]: 2025-09-12 23:58:38.454 [INFO][5461] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="c3dad0ac9ee4c998847493e6792a88312db0dbe2f675e68f2090ab9dd9714fa2" Sep 12 23:58:38.549774 containerd[1466]: 2025-09-12 23:58:38.514 [INFO][5468] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="c3dad0ac9ee4c998847493e6792a88312db0dbe2f675e68f2090ab9dd9714fa2" HandleID="k8s-pod-network.c3dad0ac9ee4c998847493e6792a88312db0dbe2f675e68f2090ab9dd9714fa2" Workload="ci--4081--3--5--n--326e2e5946-k8s-csi--node--driver--vdhzb-eth0" Sep 12 23:58:38.549774 containerd[1466]: 2025-09-12 23:58:38.516 [INFO][5468] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 23:58:38.549774 containerd[1466]: 2025-09-12 23:58:38.516 [INFO][5468] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 12 23:58:38.549774 containerd[1466]: 2025-09-12 23:58:38.537 [WARNING][5468] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="c3dad0ac9ee4c998847493e6792a88312db0dbe2f675e68f2090ab9dd9714fa2" HandleID="k8s-pod-network.c3dad0ac9ee4c998847493e6792a88312db0dbe2f675e68f2090ab9dd9714fa2" Workload="ci--4081--3--5--n--326e2e5946-k8s-csi--node--driver--vdhzb-eth0" Sep 12 23:58:38.549774 containerd[1466]: 2025-09-12 23:58:38.537 [INFO][5468] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="c3dad0ac9ee4c998847493e6792a88312db0dbe2f675e68f2090ab9dd9714fa2" HandleID="k8s-pod-network.c3dad0ac9ee4c998847493e6792a88312db0dbe2f675e68f2090ab9dd9714fa2" Workload="ci--4081--3--5--n--326e2e5946-k8s-csi--node--driver--vdhzb-eth0" Sep 12 23:58:38.549774 containerd[1466]: 2025-09-12 23:58:38.544 [INFO][5468] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 23:58:38.549774 containerd[1466]: 2025-09-12 23:58:38.546 [INFO][5461] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="c3dad0ac9ee4c998847493e6792a88312db0dbe2f675e68f2090ab9dd9714fa2" Sep 12 23:58:38.549774 containerd[1466]: time="2025-09-12T23:58:38.549360289Z" level=info msg="TearDown network for sandbox \"c3dad0ac9ee4c998847493e6792a88312db0dbe2f675e68f2090ab9dd9714fa2\" successfully" Sep 12 23:58:38.549774 containerd[1466]: time="2025-09-12T23:58:38.549430169Z" level=info msg="StopPodSandbox for \"c3dad0ac9ee4c998847493e6792a88312db0dbe2f675e68f2090ab9dd9714fa2\" returns successfully" Sep 12 23:58:38.551740 containerd[1466]: time="2025-09-12T23:58:38.551215606Z" level=info msg="RemovePodSandbox for \"c3dad0ac9ee4c998847493e6792a88312db0dbe2f675e68f2090ab9dd9714fa2\"" Sep 12 23:58:38.551740 containerd[1466]: time="2025-09-12T23:58:38.551263446Z" level=info msg="Forcibly stopping sandbox \"c3dad0ac9ee4c998847493e6792a88312db0dbe2f675e68f2090ab9dd9714fa2\"" Sep 12 23:58:38.592977 systemd[1]: Started cri-containerd-ca217522025d1f72d7eb5081da4eabcf22fdd28add47bdbdb2c89083e08d58e1.scope - libcontainer container ca217522025d1f72d7eb5081da4eabcf22fdd28add47bdbdb2c89083e08d58e1. 
Sep 12 23:58:38.637420 containerd[1466]: time="2025-09-12T23:58:38.637235676Z" level=info msg="StartContainer for \"ca217522025d1f72d7eb5081da4eabcf22fdd28add47bdbdb2c89083e08d58e1\" returns successfully" Sep 12 23:58:38.669462 containerd[1466]: 2025-09-12 23:58:38.612 [WARNING][5489] cni-plugin/k8s.go 604: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="c3dad0ac9ee4c998847493e6792a88312db0dbe2f675e68f2090ab9dd9714fa2" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--5--n--326e2e5946-k8s-csi--node--driver--vdhzb-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"ed6d6d1f-7445-4602-8d9e-b0dd54215b8f", ResourceVersion:"941", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 23, 58, 1, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"6c96d95cc7", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-5-n-326e2e5946", ContainerID:"cec3f72e2a154c18e795f75ae44c72769e391b4cd7e152a22dedbe89193ce5e1", Pod:"csi-node-driver-vdhzb", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.36.67/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calicb57a4712a1", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), 
AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 23:58:38.669462 containerd[1466]: 2025-09-12 23:58:38.613 [INFO][5489] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="c3dad0ac9ee4c998847493e6792a88312db0dbe2f675e68f2090ab9dd9714fa2" Sep 12 23:58:38.669462 containerd[1466]: 2025-09-12 23:58:38.613 [INFO][5489] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="c3dad0ac9ee4c998847493e6792a88312db0dbe2f675e68f2090ab9dd9714fa2" iface="eth0" netns="" Sep 12 23:58:38.669462 containerd[1466]: 2025-09-12 23:58:38.614 [INFO][5489] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="c3dad0ac9ee4c998847493e6792a88312db0dbe2f675e68f2090ab9dd9714fa2" Sep 12 23:58:38.669462 containerd[1466]: 2025-09-12 23:58:38.614 [INFO][5489] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="c3dad0ac9ee4c998847493e6792a88312db0dbe2f675e68f2090ab9dd9714fa2" Sep 12 23:58:38.669462 containerd[1466]: 2025-09-12 23:58:38.649 [INFO][5514] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="c3dad0ac9ee4c998847493e6792a88312db0dbe2f675e68f2090ab9dd9714fa2" HandleID="k8s-pod-network.c3dad0ac9ee4c998847493e6792a88312db0dbe2f675e68f2090ab9dd9714fa2" Workload="ci--4081--3--5--n--326e2e5946-k8s-csi--node--driver--vdhzb-eth0" Sep 12 23:58:38.669462 containerd[1466]: 2025-09-12 23:58:38.649 [INFO][5514] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 23:58:38.669462 containerd[1466]: 2025-09-12 23:58:38.650 [INFO][5514] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 12 23:58:38.669462 containerd[1466]: 2025-09-12 23:58:38.663 [WARNING][5514] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="c3dad0ac9ee4c998847493e6792a88312db0dbe2f675e68f2090ab9dd9714fa2" HandleID="k8s-pod-network.c3dad0ac9ee4c998847493e6792a88312db0dbe2f675e68f2090ab9dd9714fa2" Workload="ci--4081--3--5--n--326e2e5946-k8s-csi--node--driver--vdhzb-eth0" Sep 12 23:58:38.669462 containerd[1466]: 2025-09-12 23:58:38.663 [INFO][5514] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="c3dad0ac9ee4c998847493e6792a88312db0dbe2f675e68f2090ab9dd9714fa2" HandleID="k8s-pod-network.c3dad0ac9ee4c998847493e6792a88312db0dbe2f675e68f2090ab9dd9714fa2" Workload="ci--4081--3--5--n--326e2e5946-k8s-csi--node--driver--vdhzb-eth0" Sep 12 23:58:38.669462 containerd[1466]: 2025-09-12 23:58:38.666 [INFO][5514] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 23:58:38.669462 containerd[1466]: 2025-09-12 23:58:38.667 [INFO][5489] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="c3dad0ac9ee4c998847493e6792a88312db0dbe2f675e68f2090ab9dd9714fa2" Sep 12 23:58:38.670204 containerd[1466]: time="2025-09-12T23:58:38.669504787Z" level=info msg="TearDown network for sandbox \"c3dad0ac9ee4c998847493e6792a88312db0dbe2f675e68f2090ab9dd9714fa2\" successfully" Sep 12 23:58:38.674023 containerd[1466]: time="2025-09-12T23:58:38.673919060Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"c3dad0ac9ee4c998847493e6792a88312db0dbe2f675e68f2090ab9dd9714fa2\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Sep 12 23:58:38.674199 containerd[1466]: time="2025-09-12T23:58:38.674036900Z" level=info msg="RemovePodSandbox \"c3dad0ac9ee4c998847493e6792a88312db0dbe2f675e68f2090ab9dd9714fa2\" returns successfully" Sep 12 23:58:38.674660 containerd[1466]: time="2025-09-12T23:58:38.674594699Z" level=info msg="StopPodSandbox for \"3b93add247b5e858d9d88970724082df69dd56fa1aca05988269da465b1395a8\"" Sep 12 23:58:38.769253 containerd[1466]: 2025-09-12 23:58:38.721 [WARNING][5541] cni-plugin/k8s.go 604: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="3b93add247b5e858d9d88970724082df69dd56fa1aca05988269da465b1395a8" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--5--n--326e2e5946-k8s-calico--apiserver--6c79dd94d7--k7nnj-eth0", GenerateName:"calico-apiserver-6c79dd94d7-", Namespace:"calico-apiserver", SelfLink:"", UID:"66998075-6768-4629-a394-a7e649462c77", ResourceVersion:"1011", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 23, 57, 50, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6c79dd94d7", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-5-n-326e2e5946", ContainerID:"a487103d5eff70b8f13840377734d309c2ce664230a6e5eaa595b8997abac4de", Pod:"calico-apiserver-6c79dd94d7-k7nnj", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.36.66/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", 
IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali83c6c543473", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 23:58:38.769253 containerd[1466]: 2025-09-12 23:58:38.722 [INFO][5541] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="3b93add247b5e858d9d88970724082df69dd56fa1aca05988269da465b1395a8" Sep 12 23:58:38.769253 containerd[1466]: 2025-09-12 23:58:38.722 [INFO][5541] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="3b93add247b5e858d9d88970724082df69dd56fa1aca05988269da465b1395a8" iface="eth0" netns="" Sep 12 23:58:38.769253 containerd[1466]: 2025-09-12 23:58:38.722 [INFO][5541] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="3b93add247b5e858d9d88970724082df69dd56fa1aca05988269da465b1395a8" Sep 12 23:58:38.769253 containerd[1466]: 2025-09-12 23:58:38.722 [INFO][5541] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="3b93add247b5e858d9d88970724082df69dd56fa1aca05988269da465b1395a8" Sep 12 23:58:38.769253 containerd[1466]: 2025-09-12 23:58:38.748 [INFO][5548] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="3b93add247b5e858d9d88970724082df69dd56fa1aca05988269da465b1395a8" HandleID="k8s-pod-network.3b93add247b5e858d9d88970724082df69dd56fa1aca05988269da465b1395a8" Workload="ci--4081--3--5--n--326e2e5946-k8s-calico--apiserver--6c79dd94d7--k7nnj-eth0" Sep 12 23:58:38.769253 containerd[1466]: 2025-09-12 23:58:38.748 [INFO][5548] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 23:58:38.769253 containerd[1466]: 2025-09-12 23:58:38.748 [INFO][5548] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 12 23:58:38.769253 containerd[1466]: 2025-09-12 23:58:38.762 [WARNING][5548] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="3b93add247b5e858d9d88970724082df69dd56fa1aca05988269da465b1395a8" HandleID="k8s-pod-network.3b93add247b5e858d9d88970724082df69dd56fa1aca05988269da465b1395a8" Workload="ci--4081--3--5--n--326e2e5946-k8s-calico--apiserver--6c79dd94d7--k7nnj-eth0" Sep 12 23:58:38.769253 containerd[1466]: 2025-09-12 23:58:38.762 [INFO][5548] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="3b93add247b5e858d9d88970724082df69dd56fa1aca05988269da465b1395a8" HandleID="k8s-pod-network.3b93add247b5e858d9d88970724082df69dd56fa1aca05988269da465b1395a8" Workload="ci--4081--3--5--n--326e2e5946-k8s-calico--apiserver--6c79dd94d7--k7nnj-eth0" Sep 12 23:58:38.769253 containerd[1466]: 2025-09-12 23:58:38.765 [INFO][5548] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 23:58:38.769253 containerd[1466]: 2025-09-12 23:58:38.767 [INFO][5541] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="3b93add247b5e858d9d88970724082df69dd56fa1aca05988269da465b1395a8" Sep 12 23:58:38.770520 containerd[1466]: time="2025-09-12T23:58:38.769762635Z" level=info msg="TearDown network for sandbox \"3b93add247b5e858d9d88970724082df69dd56fa1aca05988269da465b1395a8\" successfully" Sep 12 23:58:38.770520 containerd[1466]: time="2025-09-12T23:58:38.769811355Z" level=info msg="StopPodSandbox for \"3b93add247b5e858d9d88970724082df69dd56fa1aca05988269da465b1395a8\" returns successfully" Sep 12 23:58:38.770520 containerd[1466]: time="2025-09-12T23:58:38.770452154Z" level=info msg="RemovePodSandbox for \"3b93add247b5e858d9d88970724082df69dd56fa1aca05988269da465b1395a8\"" Sep 12 23:58:38.770520 containerd[1466]: time="2025-09-12T23:58:38.770483354Z" level=info msg="Forcibly stopping sandbox \"3b93add247b5e858d9d88970724082df69dd56fa1aca05988269da465b1395a8\"" Sep 12 23:58:38.783050 kubelet[2585]: I0912 23:58:38.782856 2585 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: 
/var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 Sep 12 23:58:38.787484 kubelet[2585]: I0912 23:58:38.787347 2585 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock Sep 12 23:58:38.875876 containerd[1466]: 2025-09-12 23:58:38.828 [WARNING][5562] cni-plugin/k8s.go 604: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="3b93add247b5e858d9d88970724082df69dd56fa1aca05988269da465b1395a8" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--5--n--326e2e5946-k8s-calico--apiserver--6c79dd94d7--k7nnj-eth0", GenerateName:"calico-apiserver-6c79dd94d7-", Namespace:"calico-apiserver", SelfLink:"", UID:"66998075-6768-4629-a394-a7e649462c77", ResourceVersion:"1011", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 23, 57, 50, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6c79dd94d7", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-5-n-326e2e5946", ContainerID:"a487103d5eff70b8f13840377734d309c2ce664230a6e5eaa595b8997abac4de", Pod:"calico-apiserver-6c79dd94d7-k7nnj", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.36.66/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, 
InterfaceName:"cali83c6c543473", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 23:58:38.875876 containerd[1466]: 2025-09-12 23:58:38.829 [INFO][5562] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="3b93add247b5e858d9d88970724082df69dd56fa1aca05988269da465b1395a8" Sep 12 23:58:38.875876 containerd[1466]: 2025-09-12 23:58:38.829 [INFO][5562] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="3b93add247b5e858d9d88970724082df69dd56fa1aca05988269da465b1395a8" iface="eth0" netns="" Sep 12 23:58:38.875876 containerd[1466]: 2025-09-12 23:58:38.829 [INFO][5562] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="3b93add247b5e858d9d88970724082df69dd56fa1aca05988269da465b1395a8" Sep 12 23:58:38.875876 containerd[1466]: 2025-09-12 23:58:38.829 [INFO][5562] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="3b93add247b5e858d9d88970724082df69dd56fa1aca05988269da465b1395a8" Sep 12 23:58:38.875876 containerd[1466]: 2025-09-12 23:58:38.855 [INFO][5569] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="3b93add247b5e858d9d88970724082df69dd56fa1aca05988269da465b1395a8" HandleID="k8s-pod-network.3b93add247b5e858d9d88970724082df69dd56fa1aca05988269da465b1395a8" Workload="ci--4081--3--5--n--326e2e5946-k8s-calico--apiserver--6c79dd94d7--k7nnj-eth0" Sep 12 23:58:38.875876 containerd[1466]: 2025-09-12 23:58:38.855 [INFO][5569] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 23:58:38.875876 containerd[1466]: 2025-09-12 23:58:38.855 [INFO][5569] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 12 23:58:38.875876 containerd[1466]: 2025-09-12 23:58:38.868 [WARNING][5569] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="3b93add247b5e858d9d88970724082df69dd56fa1aca05988269da465b1395a8" HandleID="k8s-pod-network.3b93add247b5e858d9d88970724082df69dd56fa1aca05988269da465b1395a8" Workload="ci--4081--3--5--n--326e2e5946-k8s-calico--apiserver--6c79dd94d7--k7nnj-eth0" Sep 12 23:58:38.875876 containerd[1466]: 2025-09-12 23:58:38.868 [INFO][5569] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="3b93add247b5e858d9d88970724082df69dd56fa1aca05988269da465b1395a8" HandleID="k8s-pod-network.3b93add247b5e858d9d88970724082df69dd56fa1aca05988269da465b1395a8" Workload="ci--4081--3--5--n--326e2e5946-k8s-calico--apiserver--6c79dd94d7--k7nnj-eth0" Sep 12 23:58:38.875876 containerd[1466]: 2025-09-12 23:58:38.871 [INFO][5569] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 23:58:38.875876 containerd[1466]: 2025-09-12 23:58:38.873 [INFO][5562] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="3b93add247b5e858d9d88970724082df69dd56fa1aca05988269da465b1395a8" Sep 12 23:58:38.875876 containerd[1466]: time="2025-09-12T23:58:38.875334955Z" level=info msg="TearDown network for sandbox \"3b93add247b5e858d9d88970724082df69dd56fa1aca05988269da465b1395a8\" successfully" Sep 12 23:58:38.880825 containerd[1466]: time="2025-09-12T23:58:38.880683307Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"3b93add247b5e858d9d88970724082df69dd56fa1aca05988269da465b1395a8\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Sep 12 23:58:38.880987 containerd[1466]: time="2025-09-12T23:58:38.880856467Z" level=info msg="RemovePodSandbox \"3b93add247b5e858d9d88970724082df69dd56fa1aca05988269da465b1395a8\" returns successfully" Sep 12 23:58:38.881850 containerd[1466]: time="2025-09-12T23:58:38.881438226Z" level=info msg="StopPodSandbox for \"5b2dbe2aea561ebb825363ab4fd4162e4976fcfbef7dac50de1e5877f15bbce8\"" Sep 12 23:58:38.968794 containerd[1466]: 2025-09-12 23:58:38.929 [WARNING][5584] cni-plugin/k8s.go 604: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="5b2dbe2aea561ebb825363ab4fd4162e4976fcfbef7dac50de1e5877f15bbce8" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--5--n--326e2e5946-k8s-coredns--668d6bf9bc--th8mg-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"62ed9b16-b2ad-4c43-931a-9ec3cc4358d1", ResourceVersion:"1028", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 23, 57, 42, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-5-n-326e2e5946", ContainerID:"72ad11ab37b25761274441baffbca605fea0949c341300ef9c36b8660adab32e", Pod:"coredns-668d6bf9bc-th8mg", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.36.72/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali7acdeea3a11", MAC:"", 
Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 23:58:38.968794 containerd[1466]: 2025-09-12 23:58:38.930 [INFO][5584] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="5b2dbe2aea561ebb825363ab4fd4162e4976fcfbef7dac50de1e5877f15bbce8" Sep 12 23:58:38.968794 containerd[1466]: 2025-09-12 23:58:38.930 [INFO][5584] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="5b2dbe2aea561ebb825363ab4fd4162e4976fcfbef7dac50de1e5877f15bbce8" iface="eth0" netns="" Sep 12 23:58:38.968794 containerd[1466]: 2025-09-12 23:58:38.930 [INFO][5584] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="5b2dbe2aea561ebb825363ab4fd4162e4976fcfbef7dac50de1e5877f15bbce8" Sep 12 23:58:38.968794 containerd[1466]: 2025-09-12 23:58:38.930 [INFO][5584] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="5b2dbe2aea561ebb825363ab4fd4162e4976fcfbef7dac50de1e5877f15bbce8" Sep 12 23:58:38.968794 containerd[1466]: 2025-09-12 23:58:38.952 [INFO][5591] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="5b2dbe2aea561ebb825363ab4fd4162e4976fcfbef7dac50de1e5877f15bbce8" HandleID="k8s-pod-network.5b2dbe2aea561ebb825363ab4fd4162e4976fcfbef7dac50de1e5877f15bbce8" Workload="ci--4081--3--5--n--326e2e5946-k8s-coredns--668d6bf9bc--th8mg-eth0" Sep 12 23:58:38.968794 containerd[1466]: 2025-09-12 23:58:38.953 [INFO][5591] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. 
Sep 12 23:58:38.968794 containerd[1466]: 2025-09-12 23:58:38.953 [INFO][5591] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 12 23:58:38.968794 containerd[1466]: 2025-09-12 23:58:38.963 [WARNING][5591] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="5b2dbe2aea561ebb825363ab4fd4162e4976fcfbef7dac50de1e5877f15bbce8" HandleID="k8s-pod-network.5b2dbe2aea561ebb825363ab4fd4162e4976fcfbef7dac50de1e5877f15bbce8" Workload="ci--4081--3--5--n--326e2e5946-k8s-coredns--668d6bf9bc--th8mg-eth0" Sep 12 23:58:38.968794 containerd[1466]: 2025-09-12 23:58:38.963 [INFO][5591] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="5b2dbe2aea561ebb825363ab4fd4162e4976fcfbef7dac50de1e5877f15bbce8" HandleID="k8s-pod-network.5b2dbe2aea561ebb825363ab4fd4162e4976fcfbef7dac50de1e5877f15bbce8" Workload="ci--4081--3--5--n--326e2e5946-k8s-coredns--668d6bf9bc--th8mg-eth0" Sep 12 23:58:38.968794 containerd[1466]: 2025-09-12 23:58:38.965 [INFO][5591] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 23:58:38.968794 containerd[1466]: 2025-09-12 23:58:38.967 [INFO][5584] cni-plugin/k8s.go 653: Teardown processing complete. 
ContainerID="5b2dbe2aea561ebb825363ab4fd4162e4976fcfbef7dac50de1e5877f15bbce8" Sep 12 23:58:38.969620 containerd[1466]: time="2025-09-12T23:58:38.969432172Z" level=info msg="TearDown network for sandbox \"5b2dbe2aea561ebb825363ab4fd4162e4976fcfbef7dac50de1e5877f15bbce8\" successfully" Sep 12 23:58:38.969620 containerd[1466]: time="2025-09-12T23:58:38.969467492Z" level=info msg="StopPodSandbox for \"5b2dbe2aea561ebb825363ab4fd4162e4976fcfbef7dac50de1e5877f15bbce8\" returns successfully" Sep 12 23:58:38.970849 containerd[1466]: time="2025-09-12T23:58:38.970568971Z" level=info msg="RemovePodSandbox for \"5b2dbe2aea561ebb825363ab4fd4162e4976fcfbef7dac50de1e5877f15bbce8\"" Sep 12 23:58:38.970849 containerd[1466]: time="2025-09-12T23:58:38.970606931Z" level=info msg="Forcibly stopping sandbox \"5b2dbe2aea561ebb825363ab4fd4162e4976fcfbef7dac50de1e5877f15bbce8\"" Sep 12 23:58:39.095573 kubelet[2585]: I0912 23:58:39.095502 2585 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-vdhzb" podStartSLOduration=23.050511759 podStartE2EDuration="38.095483108s" podCreationTimestamp="2025-09-12 23:58:01 +0000 UTC" firstStartedPulling="2025-09-12 23:58:23.441603475 +0000 UTC m=+45.920836602" lastFinishedPulling="2025-09-12 23:58:38.486574784 +0000 UTC m=+60.965807951" observedRunningTime="2025-09-12 23:58:39.095323228 +0000 UTC m=+61.574556435" watchObservedRunningTime="2025-09-12 23:58:39.095483108 +0000 UTC m=+61.574716235" Sep 12 23:58:39.118921 containerd[1466]: 2025-09-12 23:58:39.015 [WARNING][5606] cni-plugin/k8s.go 604: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="5b2dbe2aea561ebb825363ab4fd4162e4976fcfbef7dac50de1e5877f15bbce8" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--5--n--326e2e5946-k8s-coredns--668d6bf9bc--th8mg-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"62ed9b16-b2ad-4c43-931a-9ec3cc4358d1", ResourceVersion:"1028", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 23, 57, 42, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-5-n-326e2e5946", ContainerID:"72ad11ab37b25761274441baffbca605fea0949c341300ef9c36b8660adab32e", Pod:"coredns-668d6bf9bc-th8mg", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.36.72/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali7acdeea3a11", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 23:58:39.118921 containerd[1466]: 
2025-09-12 23:58:39.015 [INFO][5606] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="5b2dbe2aea561ebb825363ab4fd4162e4976fcfbef7dac50de1e5877f15bbce8" Sep 12 23:58:39.118921 containerd[1466]: 2025-09-12 23:58:39.015 [INFO][5606] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="5b2dbe2aea561ebb825363ab4fd4162e4976fcfbef7dac50de1e5877f15bbce8" iface="eth0" netns="" Sep 12 23:58:39.118921 containerd[1466]: 2025-09-12 23:58:39.015 [INFO][5606] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="5b2dbe2aea561ebb825363ab4fd4162e4976fcfbef7dac50de1e5877f15bbce8" Sep 12 23:58:39.118921 containerd[1466]: 2025-09-12 23:58:39.015 [INFO][5606] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="5b2dbe2aea561ebb825363ab4fd4162e4976fcfbef7dac50de1e5877f15bbce8" Sep 12 23:58:39.118921 containerd[1466]: 2025-09-12 23:58:39.060 [INFO][5613] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="5b2dbe2aea561ebb825363ab4fd4162e4976fcfbef7dac50de1e5877f15bbce8" HandleID="k8s-pod-network.5b2dbe2aea561ebb825363ab4fd4162e4976fcfbef7dac50de1e5877f15bbce8" Workload="ci--4081--3--5--n--326e2e5946-k8s-coredns--668d6bf9bc--th8mg-eth0" Sep 12 23:58:39.118921 containerd[1466]: 2025-09-12 23:58:39.060 [INFO][5613] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 23:58:39.118921 containerd[1466]: 2025-09-12 23:58:39.061 [INFO][5613] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 12 23:58:39.118921 containerd[1466]: 2025-09-12 23:58:39.093 [WARNING][5613] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="5b2dbe2aea561ebb825363ab4fd4162e4976fcfbef7dac50de1e5877f15bbce8" HandleID="k8s-pod-network.5b2dbe2aea561ebb825363ab4fd4162e4976fcfbef7dac50de1e5877f15bbce8" Workload="ci--4081--3--5--n--326e2e5946-k8s-coredns--668d6bf9bc--th8mg-eth0" Sep 12 23:58:39.118921 containerd[1466]: 2025-09-12 23:58:39.093 [INFO][5613] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="5b2dbe2aea561ebb825363ab4fd4162e4976fcfbef7dac50de1e5877f15bbce8" HandleID="k8s-pod-network.5b2dbe2aea561ebb825363ab4fd4162e4976fcfbef7dac50de1e5877f15bbce8" Workload="ci--4081--3--5--n--326e2e5946-k8s-coredns--668d6bf9bc--th8mg-eth0" Sep 12 23:58:39.118921 containerd[1466]: 2025-09-12 23:58:39.109 [INFO][5613] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 23:58:39.118921 containerd[1466]: 2025-09-12 23:58:39.113 [INFO][5606] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="5b2dbe2aea561ebb825363ab4fd4162e4976fcfbef7dac50de1e5877f15bbce8" Sep 12 23:58:39.118921 containerd[1466]: time="2025-09-12T23:58:39.118648814Z" level=info msg="TearDown network for sandbox \"5b2dbe2aea561ebb825363ab4fd4162e4976fcfbef7dac50de1e5877f15bbce8\" successfully" Sep 12 23:58:39.126116 containerd[1466]: time="2025-09-12T23:58:39.125756250Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"5b2dbe2aea561ebb825363ab4fd4162e4976fcfbef7dac50de1e5877f15bbce8\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Sep 12 23:58:39.126116 containerd[1466]: time="2025-09-12T23:58:39.125848250Z" level=info msg="RemovePodSandbox \"5b2dbe2aea561ebb825363ab4fd4162e4976fcfbef7dac50de1e5877f15bbce8\" returns successfully" Sep 12 23:58:39.128494 containerd[1466]: time="2025-09-12T23:58:39.127765009Z" level=info msg="StopPodSandbox for \"b9df369dd46643a45a5c1476773d2818afe85e6d27637181179c4b15ffbb6933\"" Sep 12 23:58:39.229350 containerd[1466]: 2025-09-12 23:58:39.182 [WARNING][5643] cni-plugin/k8s.go 604: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="b9df369dd46643a45a5c1476773d2818afe85e6d27637181179c4b15ffbb6933" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--5--n--326e2e5946-k8s-calico--apiserver--6c79dd94d7--qdfzm-eth0", GenerateName:"calico-apiserver-6c79dd94d7-", Namespace:"calico-apiserver", SelfLink:"", UID:"49f008dc-1da3-46cf-ac99-9976edc0d9dc", ResourceVersion:"1071", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 23, 57, 50, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6c79dd94d7", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-5-n-326e2e5946", ContainerID:"db6961ef6fb7399026dac80a18eefe324112284ac1828ccb8b3cce4891e57fba", Pod:"calico-apiserver-6c79dd94d7-qdfzm", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.36.71/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", 
IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calibb1b8d41740", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 23:58:39.229350 containerd[1466]: 2025-09-12 23:58:39.182 [INFO][5643] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="b9df369dd46643a45a5c1476773d2818afe85e6d27637181179c4b15ffbb6933" Sep 12 23:58:39.229350 containerd[1466]: 2025-09-12 23:58:39.182 [INFO][5643] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="b9df369dd46643a45a5c1476773d2818afe85e6d27637181179c4b15ffbb6933" iface="eth0" netns="" Sep 12 23:58:39.229350 containerd[1466]: 2025-09-12 23:58:39.182 [INFO][5643] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="b9df369dd46643a45a5c1476773d2818afe85e6d27637181179c4b15ffbb6933" Sep 12 23:58:39.229350 containerd[1466]: 2025-09-12 23:58:39.182 [INFO][5643] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="b9df369dd46643a45a5c1476773d2818afe85e6d27637181179c4b15ffbb6933" Sep 12 23:58:39.229350 containerd[1466]: 2025-09-12 23:58:39.211 [INFO][5655] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="b9df369dd46643a45a5c1476773d2818afe85e6d27637181179c4b15ffbb6933" HandleID="k8s-pod-network.b9df369dd46643a45a5c1476773d2818afe85e6d27637181179c4b15ffbb6933" Workload="ci--4081--3--5--n--326e2e5946-k8s-calico--apiserver--6c79dd94d7--qdfzm-eth0" Sep 12 23:58:39.229350 containerd[1466]: 2025-09-12 23:58:39.212 [INFO][5655] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 23:58:39.229350 containerd[1466]: 2025-09-12 23:58:39.212 [INFO][5655] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 12 23:58:39.229350 containerd[1466]: 2025-09-12 23:58:39.222 [WARNING][5655] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="b9df369dd46643a45a5c1476773d2818afe85e6d27637181179c4b15ffbb6933" HandleID="k8s-pod-network.b9df369dd46643a45a5c1476773d2818afe85e6d27637181179c4b15ffbb6933" Workload="ci--4081--3--5--n--326e2e5946-k8s-calico--apiserver--6c79dd94d7--qdfzm-eth0" Sep 12 23:58:39.229350 containerd[1466]: 2025-09-12 23:58:39.223 [INFO][5655] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="b9df369dd46643a45a5c1476773d2818afe85e6d27637181179c4b15ffbb6933" HandleID="k8s-pod-network.b9df369dd46643a45a5c1476773d2818afe85e6d27637181179c4b15ffbb6933" Workload="ci--4081--3--5--n--326e2e5946-k8s-calico--apiserver--6c79dd94d7--qdfzm-eth0" Sep 12 23:58:39.229350 containerd[1466]: 2025-09-12 23:58:39.225 [INFO][5655] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 23:58:39.229350 containerd[1466]: 2025-09-12 23:58:39.227 [INFO][5643] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="b9df369dd46643a45a5c1476773d2818afe85e6d27637181179c4b15ffbb6933" Sep 12 23:58:39.230445 containerd[1466]: time="2025-09-12T23:58:39.230030867Z" level=info msg="TearDown network for sandbox \"b9df369dd46643a45a5c1476773d2818afe85e6d27637181179c4b15ffbb6933\" successfully" Sep 12 23:58:39.230445 containerd[1466]: time="2025-09-12T23:58:39.230064747Z" level=info msg="StopPodSandbox for \"b9df369dd46643a45a5c1476773d2818afe85e6d27637181179c4b15ffbb6933\" returns successfully" Sep 12 23:58:39.231685 containerd[1466]: time="2025-09-12T23:58:39.230803547Z" level=info msg="RemovePodSandbox for \"b9df369dd46643a45a5c1476773d2818afe85e6d27637181179c4b15ffbb6933\"" Sep 12 23:58:39.231685 containerd[1466]: time="2025-09-12T23:58:39.230840747Z" level=info msg="Forcibly stopping sandbox \"b9df369dd46643a45a5c1476773d2818afe85e6d27637181179c4b15ffbb6933\"" Sep 12 23:58:39.361674 containerd[1466]: 2025-09-12 23:58:39.304 [WARNING][5670] cni-plugin/k8s.go 604: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="b9df369dd46643a45a5c1476773d2818afe85e6d27637181179c4b15ffbb6933" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--5--n--326e2e5946-k8s-calico--apiserver--6c79dd94d7--qdfzm-eth0", GenerateName:"calico-apiserver-6c79dd94d7-", Namespace:"calico-apiserver", SelfLink:"", UID:"49f008dc-1da3-46cf-ac99-9976edc0d9dc", ResourceVersion:"1071", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 23, 57, 50, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6c79dd94d7", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-5-n-326e2e5946", ContainerID:"db6961ef6fb7399026dac80a18eefe324112284ac1828ccb8b3cce4891e57fba", Pod:"calico-apiserver-6c79dd94d7-qdfzm", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.36.71/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calibb1b8d41740", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 23:58:39.361674 containerd[1466]: 2025-09-12 23:58:39.305 [INFO][5670] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="b9df369dd46643a45a5c1476773d2818afe85e6d27637181179c4b15ffbb6933" Sep 12 23:58:39.361674 containerd[1466]: 2025-09-12 23:58:39.305 [INFO][5670] cni-plugin/dataplane_linux.go 555: 
CleanUpNamespace called with no netns name, ignoring. ContainerID="b9df369dd46643a45a5c1476773d2818afe85e6d27637181179c4b15ffbb6933" iface="eth0" netns="" Sep 12 23:58:39.361674 containerd[1466]: 2025-09-12 23:58:39.305 [INFO][5670] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="b9df369dd46643a45a5c1476773d2818afe85e6d27637181179c4b15ffbb6933" Sep 12 23:58:39.361674 containerd[1466]: 2025-09-12 23:58:39.305 [INFO][5670] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="b9df369dd46643a45a5c1476773d2818afe85e6d27637181179c4b15ffbb6933" Sep 12 23:58:39.361674 containerd[1466]: 2025-09-12 23:58:39.337 [INFO][5677] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="b9df369dd46643a45a5c1476773d2818afe85e6d27637181179c4b15ffbb6933" HandleID="k8s-pod-network.b9df369dd46643a45a5c1476773d2818afe85e6d27637181179c4b15ffbb6933" Workload="ci--4081--3--5--n--326e2e5946-k8s-calico--apiserver--6c79dd94d7--qdfzm-eth0" Sep 12 23:58:39.361674 containerd[1466]: 2025-09-12 23:58:39.337 [INFO][5677] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 23:58:39.361674 containerd[1466]: 2025-09-12 23:58:39.337 [INFO][5677] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 12 23:58:39.361674 containerd[1466]: 2025-09-12 23:58:39.353 [WARNING][5677] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="b9df369dd46643a45a5c1476773d2818afe85e6d27637181179c4b15ffbb6933" HandleID="k8s-pod-network.b9df369dd46643a45a5c1476773d2818afe85e6d27637181179c4b15ffbb6933" Workload="ci--4081--3--5--n--326e2e5946-k8s-calico--apiserver--6c79dd94d7--qdfzm-eth0" Sep 12 23:58:39.361674 containerd[1466]: 2025-09-12 23:58:39.353 [INFO][5677] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="b9df369dd46643a45a5c1476773d2818afe85e6d27637181179c4b15ffbb6933" HandleID="k8s-pod-network.b9df369dd46643a45a5c1476773d2818afe85e6d27637181179c4b15ffbb6933" Workload="ci--4081--3--5--n--326e2e5946-k8s-calico--apiserver--6c79dd94d7--qdfzm-eth0" Sep 12 23:58:39.361674 containerd[1466]: 2025-09-12 23:58:39.356 [INFO][5677] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 23:58:39.361674 containerd[1466]: 2025-09-12 23:58:39.358 [INFO][5670] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="b9df369dd46643a45a5c1476773d2818afe85e6d27637181179c4b15ffbb6933" Sep 12 23:58:39.361674 containerd[1466]: time="2025-09-12T23:58:39.361634708Z" level=info msg="TearDown network for sandbox \"b9df369dd46643a45a5c1476773d2818afe85e6d27637181179c4b15ffbb6933\" successfully" Sep 12 23:58:39.367479 containerd[1466]: time="2025-09-12T23:58:39.367373145Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"b9df369dd46643a45a5c1476773d2818afe85e6d27637181179c4b15ffbb6933\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Sep 12 23:58:39.367650 containerd[1466]: time="2025-09-12T23:58:39.367553505Z" level=info msg="RemovePodSandbox \"b9df369dd46643a45a5c1476773d2818afe85e6d27637181179c4b15ffbb6933\" returns successfully" Sep 12 23:58:39.368501 containerd[1466]: time="2025-09-12T23:58:39.368467504Z" level=info msg="StopPodSandbox for \"348f3835785bdb15cdb5abb8b74a97bc344aad17949ebf75bb73ba1d63b96b70\"" Sep 12 23:58:39.481834 containerd[1466]: 2025-09-12 23:58:39.433 [WARNING][5691] cni-plugin/k8s.go 604: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="348f3835785bdb15cdb5abb8b74a97bc344aad17949ebf75bb73ba1d63b96b70" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--5--n--326e2e5946-k8s-coredns--668d6bf9bc--hvkxc-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"dc685a7f-f1eb-41f5-8dc9-3a23d11d38d9", ResourceVersion:"986", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 23, 57, 42, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-5-n-326e2e5946", ContainerID:"61b7c26ac2ac572cc822328456aeb46ccdc65ede3e5a56c1b2feb7180430733e", Pod:"coredns-668d6bf9bc-hvkxc", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.36.69/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali1501d727d39", MAC:"", 
Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 23:58:39.481834 containerd[1466]: 2025-09-12 23:58:39.433 [INFO][5691] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="348f3835785bdb15cdb5abb8b74a97bc344aad17949ebf75bb73ba1d63b96b70" Sep 12 23:58:39.481834 containerd[1466]: 2025-09-12 23:58:39.433 [INFO][5691] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="348f3835785bdb15cdb5abb8b74a97bc344aad17949ebf75bb73ba1d63b96b70" iface="eth0" netns="" Sep 12 23:58:39.481834 containerd[1466]: 2025-09-12 23:58:39.433 [INFO][5691] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="348f3835785bdb15cdb5abb8b74a97bc344aad17949ebf75bb73ba1d63b96b70" Sep 12 23:58:39.481834 containerd[1466]: 2025-09-12 23:58:39.433 [INFO][5691] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="348f3835785bdb15cdb5abb8b74a97bc344aad17949ebf75bb73ba1d63b96b70" Sep 12 23:58:39.481834 containerd[1466]: 2025-09-12 23:58:39.460 [INFO][5698] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="348f3835785bdb15cdb5abb8b74a97bc344aad17949ebf75bb73ba1d63b96b70" HandleID="k8s-pod-network.348f3835785bdb15cdb5abb8b74a97bc344aad17949ebf75bb73ba1d63b96b70" Workload="ci--4081--3--5--n--326e2e5946-k8s-coredns--668d6bf9bc--hvkxc-eth0" Sep 12 23:58:39.481834 containerd[1466]: 2025-09-12 23:58:39.460 [INFO][5698] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. 
Sep 12 23:58:39.481834 containerd[1466]: 2025-09-12 23:58:39.460 [INFO][5698] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 12 23:58:39.481834 containerd[1466]: 2025-09-12 23:58:39.470 [WARNING][5698] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="348f3835785bdb15cdb5abb8b74a97bc344aad17949ebf75bb73ba1d63b96b70" HandleID="k8s-pod-network.348f3835785bdb15cdb5abb8b74a97bc344aad17949ebf75bb73ba1d63b96b70" Workload="ci--4081--3--5--n--326e2e5946-k8s-coredns--668d6bf9bc--hvkxc-eth0" Sep 12 23:58:39.481834 containerd[1466]: 2025-09-12 23:58:39.471 [INFO][5698] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="348f3835785bdb15cdb5abb8b74a97bc344aad17949ebf75bb73ba1d63b96b70" HandleID="k8s-pod-network.348f3835785bdb15cdb5abb8b74a97bc344aad17949ebf75bb73ba1d63b96b70" Workload="ci--4081--3--5--n--326e2e5946-k8s-coredns--668d6bf9bc--hvkxc-eth0" Sep 12 23:58:39.481834 containerd[1466]: 2025-09-12 23:58:39.474 [INFO][5698] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 23:58:39.481834 containerd[1466]: 2025-09-12 23:58:39.478 [INFO][5691] cni-plugin/k8s.go 653: Teardown processing complete. 
ContainerID="348f3835785bdb15cdb5abb8b74a97bc344aad17949ebf75bb73ba1d63b96b70" Sep 12 23:58:39.483208 containerd[1466]: time="2025-09-12T23:58:39.482472836Z" level=info msg="TearDown network for sandbox \"348f3835785bdb15cdb5abb8b74a97bc344aad17949ebf75bb73ba1d63b96b70\" successfully" Sep 12 23:58:39.483208 containerd[1466]: time="2025-09-12T23:58:39.482509676Z" level=info msg="StopPodSandbox for \"348f3835785bdb15cdb5abb8b74a97bc344aad17949ebf75bb73ba1d63b96b70\" returns successfully" Sep 12 23:58:39.484014 containerd[1466]: time="2025-09-12T23:58:39.483603435Z" level=info msg="RemovePodSandbox for \"348f3835785bdb15cdb5abb8b74a97bc344aad17949ebf75bb73ba1d63b96b70\"" Sep 12 23:58:39.484014 containerd[1466]: time="2025-09-12T23:58:39.483641075Z" level=info msg="Forcibly stopping sandbox \"348f3835785bdb15cdb5abb8b74a97bc344aad17949ebf75bb73ba1d63b96b70\"" Sep 12 23:58:39.601517 containerd[1466]: 2025-09-12 23:58:39.545 [WARNING][5712] cni-plugin/k8s.go 604: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="348f3835785bdb15cdb5abb8b74a97bc344aad17949ebf75bb73ba1d63b96b70" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--5--n--326e2e5946-k8s-coredns--668d6bf9bc--hvkxc-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"dc685a7f-f1eb-41f5-8dc9-3a23d11d38d9", ResourceVersion:"986", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 23, 57, 42, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-5-n-326e2e5946", ContainerID:"61b7c26ac2ac572cc822328456aeb46ccdc65ede3e5a56c1b2feb7180430733e", Pod:"coredns-668d6bf9bc-hvkxc", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.36.69/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali1501d727d39", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 23:58:39.601517 containerd[1466]: 
2025-09-12 23:58:39.546 [INFO][5712] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="348f3835785bdb15cdb5abb8b74a97bc344aad17949ebf75bb73ba1d63b96b70" Sep 12 23:58:39.601517 containerd[1466]: 2025-09-12 23:58:39.546 [INFO][5712] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="348f3835785bdb15cdb5abb8b74a97bc344aad17949ebf75bb73ba1d63b96b70" iface="eth0" netns="" Sep 12 23:58:39.601517 containerd[1466]: 2025-09-12 23:58:39.546 [INFO][5712] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="348f3835785bdb15cdb5abb8b74a97bc344aad17949ebf75bb73ba1d63b96b70" Sep 12 23:58:39.601517 containerd[1466]: 2025-09-12 23:58:39.546 [INFO][5712] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="348f3835785bdb15cdb5abb8b74a97bc344aad17949ebf75bb73ba1d63b96b70" Sep 12 23:58:39.601517 containerd[1466]: 2025-09-12 23:58:39.582 [INFO][5719] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="348f3835785bdb15cdb5abb8b74a97bc344aad17949ebf75bb73ba1d63b96b70" HandleID="k8s-pod-network.348f3835785bdb15cdb5abb8b74a97bc344aad17949ebf75bb73ba1d63b96b70" Workload="ci--4081--3--5--n--326e2e5946-k8s-coredns--668d6bf9bc--hvkxc-eth0" Sep 12 23:58:39.601517 containerd[1466]: 2025-09-12 23:58:39.583 [INFO][5719] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 23:58:39.601517 containerd[1466]: 2025-09-12 23:58:39.583 [INFO][5719] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 12 23:58:39.601517 containerd[1466]: 2025-09-12 23:58:39.594 [WARNING][5719] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="348f3835785bdb15cdb5abb8b74a97bc344aad17949ebf75bb73ba1d63b96b70" HandleID="k8s-pod-network.348f3835785bdb15cdb5abb8b74a97bc344aad17949ebf75bb73ba1d63b96b70" Workload="ci--4081--3--5--n--326e2e5946-k8s-coredns--668d6bf9bc--hvkxc-eth0" Sep 12 23:58:39.601517 containerd[1466]: 2025-09-12 23:58:39.594 [INFO][5719] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="348f3835785bdb15cdb5abb8b74a97bc344aad17949ebf75bb73ba1d63b96b70" HandleID="k8s-pod-network.348f3835785bdb15cdb5abb8b74a97bc344aad17949ebf75bb73ba1d63b96b70" Workload="ci--4081--3--5--n--326e2e5946-k8s-coredns--668d6bf9bc--hvkxc-eth0" Sep 12 23:58:39.601517 containerd[1466]: 2025-09-12 23:58:39.596 [INFO][5719] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 23:58:39.601517 containerd[1466]: 2025-09-12 23:58:39.599 [INFO][5712] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="348f3835785bdb15cdb5abb8b74a97bc344aad17949ebf75bb73ba1d63b96b70" Sep 12 23:58:39.603460 containerd[1466]: time="2025-09-12T23:58:39.602087564Z" level=info msg="TearDown network for sandbox \"348f3835785bdb15cdb5abb8b74a97bc344aad17949ebf75bb73ba1d63b96b70\" successfully" Sep 12 23:58:39.614102 containerd[1466]: time="2025-09-12T23:58:39.613940917Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"348f3835785bdb15cdb5abb8b74a97bc344aad17949ebf75bb73ba1d63b96b70\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Sep 12 23:58:39.614416 containerd[1466]: time="2025-09-12T23:58:39.614392957Z" level=info msg="RemovePodSandbox \"348f3835785bdb15cdb5abb8b74a97bc344aad17949ebf75bb73ba1d63b96b70\" returns successfully" Sep 12 23:58:39.616228 containerd[1466]: time="2025-09-12T23:58:39.615935196Z" level=info msg="StopPodSandbox for \"2d0a7e0dfa539a0b63dc73f5bb6fcdf7eef5bcd276b776fe1f97bf8253a16573\"" Sep 12 23:58:39.741898 containerd[1466]: 2025-09-12 23:58:39.672 [WARNING][5733] cni-plugin/k8s.go 598: WorkloadEndpoint does not exist in the datastore, moving forward with the clean up ContainerID="2d0a7e0dfa539a0b63dc73f5bb6fcdf7eef5bcd276b776fe1f97bf8253a16573" WorkloadEndpoint="ci--4081--3--5--n--326e2e5946-k8s-whisker--7bbf9bf896--96lc9-eth0" Sep 12 23:58:39.741898 containerd[1466]: 2025-09-12 23:58:39.673 [INFO][5733] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="2d0a7e0dfa539a0b63dc73f5bb6fcdf7eef5bcd276b776fe1f97bf8253a16573" Sep 12 23:58:39.741898 containerd[1466]: 2025-09-12 23:58:39.673 [INFO][5733] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. 
ContainerID="2d0a7e0dfa539a0b63dc73f5bb6fcdf7eef5bcd276b776fe1f97bf8253a16573" iface="eth0" netns="" Sep 12 23:58:39.741898 containerd[1466]: 2025-09-12 23:58:39.673 [INFO][5733] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="2d0a7e0dfa539a0b63dc73f5bb6fcdf7eef5bcd276b776fe1f97bf8253a16573" Sep 12 23:58:39.741898 containerd[1466]: 2025-09-12 23:58:39.673 [INFO][5733] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="2d0a7e0dfa539a0b63dc73f5bb6fcdf7eef5bcd276b776fe1f97bf8253a16573" Sep 12 23:58:39.741898 containerd[1466]: 2025-09-12 23:58:39.712 [INFO][5740] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="2d0a7e0dfa539a0b63dc73f5bb6fcdf7eef5bcd276b776fe1f97bf8253a16573" HandleID="k8s-pod-network.2d0a7e0dfa539a0b63dc73f5bb6fcdf7eef5bcd276b776fe1f97bf8253a16573" Workload="ci--4081--3--5--n--326e2e5946-k8s-whisker--7bbf9bf896--96lc9-eth0" Sep 12 23:58:39.741898 containerd[1466]: 2025-09-12 23:58:39.716 [INFO][5740] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 23:58:39.741898 containerd[1466]: 2025-09-12 23:58:39.716 [INFO][5740] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 12 23:58:39.741898 containerd[1466]: 2025-09-12 23:58:39.734 [WARNING][5740] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="2d0a7e0dfa539a0b63dc73f5bb6fcdf7eef5bcd276b776fe1f97bf8253a16573" HandleID="k8s-pod-network.2d0a7e0dfa539a0b63dc73f5bb6fcdf7eef5bcd276b776fe1f97bf8253a16573" Workload="ci--4081--3--5--n--326e2e5946-k8s-whisker--7bbf9bf896--96lc9-eth0" Sep 12 23:58:39.741898 containerd[1466]: 2025-09-12 23:58:39.734 [INFO][5740] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="2d0a7e0dfa539a0b63dc73f5bb6fcdf7eef5bcd276b776fe1f97bf8253a16573" HandleID="k8s-pod-network.2d0a7e0dfa539a0b63dc73f5bb6fcdf7eef5bcd276b776fe1f97bf8253a16573" Workload="ci--4081--3--5--n--326e2e5946-k8s-whisker--7bbf9bf896--96lc9-eth0" Sep 12 23:58:39.741898 containerd[1466]: 2025-09-12 23:58:39.737 [INFO][5740] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 23:58:39.741898 containerd[1466]: 2025-09-12 23:58:39.740 [INFO][5733] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="2d0a7e0dfa539a0b63dc73f5bb6fcdf7eef5bcd276b776fe1f97bf8253a16573" Sep 12 23:58:39.743576 containerd[1466]: time="2025-09-12T23:58:39.742596480Z" level=info msg="TearDown network for sandbox \"2d0a7e0dfa539a0b63dc73f5bb6fcdf7eef5bcd276b776fe1f97bf8253a16573\" successfully" Sep 12 23:58:39.743576 containerd[1466]: time="2025-09-12T23:58:39.742630520Z" level=info msg="StopPodSandbox for \"2d0a7e0dfa539a0b63dc73f5bb6fcdf7eef5bcd276b776fe1f97bf8253a16573\" returns successfully" Sep 12 23:58:39.744526 containerd[1466]: time="2025-09-12T23:58:39.744184559Z" level=info msg="RemovePodSandbox for \"2d0a7e0dfa539a0b63dc73f5bb6fcdf7eef5bcd276b776fe1f97bf8253a16573\"" Sep 12 23:58:39.744526 containerd[1466]: time="2025-09-12T23:58:39.744224199Z" level=info msg="Forcibly stopping sandbox \"2d0a7e0dfa539a0b63dc73f5bb6fcdf7eef5bcd276b776fe1f97bf8253a16573\"" Sep 12 23:58:39.839757 containerd[1466]: 2025-09-12 23:58:39.792 [WARNING][5754] cni-plugin/k8s.go 598: WorkloadEndpoint does not exist in the datastore, moving forward with the clean up 
ContainerID="2d0a7e0dfa539a0b63dc73f5bb6fcdf7eef5bcd276b776fe1f97bf8253a16573" WorkloadEndpoint="ci--4081--3--5--n--326e2e5946-k8s-whisker--7bbf9bf896--96lc9-eth0" Sep 12 23:58:39.839757 containerd[1466]: 2025-09-12 23:58:39.792 [INFO][5754] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="2d0a7e0dfa539a0b63dc73f5bb6fcdf7eef5bcd276b776fe1f97bf8253a16573" Sep 12 23:58:39.839757 containerd[1466]: 2025-09-12 23:58:39.792 [INFO][5754] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="2d0a7e0dfa539a0b63dc73f5bb6fcdf7eef5bcd276b776fe1f97bf8253a16573" iface="eth0" netns="" Sep 12 23:58:39.839757 containerd[1466]: 2025-09-12 23:58:39.792 [INFO][5754] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="2d0a7e0dfa539a0b63dc73f5bb6fcdf7eef5bcd276b776fe1f97bf8253a16573" Sep 12 23:58:39.839757 containerd[1466]: 2025-09-12 23:58:39.792 [INFO][5754] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="2d0a7e0dfa539a0b63dc73f5bb6fcdf7eef5bcd276b776fe1f97bf8253a16573" Sep 12 23:58:39.839757 containerd[1466]: 2025-09-12 23:58:39.816 [INFO][5761] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="2d0a7e0dfa539a0b63dc73f5bb6fcdf7eef5bcd276b776fe1f97bf8253a16573" HandleID="k8s-pod-network.2d0a7e0dfa539a0b63dc73f5bb6fcdf7eef5bcd276b776fe1f97bf8253a16573" Workload="ci--4081--3--5--n--326e2e5946-k8s-whisker--7bbf9bf896--96lc9-eth0" Sep 12 23:58:39.839757 containerd[1466]: 2025-09-12 23:58:39.816 [INFO][5761] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 23:58:39.839757 containerd[1466]: 2025-09-12 23:58:39.816 [INFO][5761] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 12 23:58:39.839757 containerd[1466]: 2025-09-12 23:58:39.831 [WARNING][5761] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="2d0a7e0dfa539a0b63dc73f5bb6fcdf7eef5bcd276b776fe1f97bf8253a16573" HandleID="k8s-pod-network.2d0a7e0dfa539a0b63dc73f5bb6fcdf7eef5bcd276b776fe1f97bf8253a16573" Workload="ci--4081--3--5--n--326e2e5946-k8s-whisker--7bbf9bf896--96lc9-eth0" Sep 12 23:58:39.839757 containerd[1466]: 2025-09-12 23:58:39.831 [INFO][5761] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="2d0a7e0dfa539a0b63dc73f5bb6fcdf7eef5bcd276b776fe1f97bf8253a16573" HandleID="k8s-pod-network.2d0a7e0dfa539a0b63dc73f5bb6fcdf7eef5bcd276b776fe1f97bf8253a16573" Workload="ci--4081--3--5--n--326e2e5946-k8s-whisker--7bbf9bf896--96lc9-eth0" Sep 12 23:58:39.839757 containerd[1466]: 2025-09-12 23:58:39.834 [INFO][5761] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 23:58:39.839757 containerd[1466]: 2025-09-12 23:58:39.836 [INFO][5754] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="2d0a7e0dfa539a0b63dc73f5bb6fcdf7eef5bcd276b776fe1f97bf8253a16573" Sep 12 23:58:39.839757 containerd[1466]: time="2025-09-12T23:58:39.838931022Z" level=info msg="TearDown network for sandbox \"2d0a7e0dfa539a0b63dc73f5bb6fcdf7eef5bcd276b776fe1f97bf8253a16573\" successfully" Sep 12 23:58:39.844487 containerd[1466]: time="2025-09-12T23:58:39.844408739Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"2d0a7e0dfa539a0b63dc73f5bb6fcdf7eef5bcd276b776fe1f97bf8253a16573\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Sep 12 23:58:39.844606 containerd[1466]: time="2025-09-12T23:58:39.844509139Z" level=info msg="RemovePodSandbox \"2d0a7e0dfa539a0b63dc73f5bb6fcdf7eef5bcd276b776fe1f97bf8253a16573\" returns successfully" Sep 12 23:58:48.842060 update_engine[1454]: I20250912 23:58:48.840998 1454 prefs.cc:52] certificate-report-to-send-update not present in /var/lib/update_engine/prefs Sep 12 23:58:48.842060 update_engine[1454]: I20250912 23:58:48.841078 1454 prefs.cc:52] certificate-report-to-send-download not present in /var/lib/update_engine/prefs Sep 12 23:58:48.842060 update_engine[1454]: I20250912 23:58:48.841549 1454 prefs.cc:52] aleph-version not present in /var/lib/update_engine/prefs Sep 12 23:58:48.844496 update_engine[1454]: I20250912 23:58:48.844200 1454 omaha_request_params.cc:62] Current group set to lts Sep 12 23:58:48.844496 update_engine[1454]: I20250912 23:58:48.844344 1454 update_attempter.cc:499] Already updated boot flags. Skipping. Sep 12 23:58:48.844496 update_engine[1454]: I20250912 23:58:48.844355 1454 update_attempter.cc:643] Scheduling an action processor start. 
Sep 12 23:58:48.844496 update_engine[1454]: I20250912 23:58:48.844377 1454 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction Sep 12 23:58:48.846465 update_engine[1454]: I20250912 23:58:48.846431 1454 prefs.cc:52] previous-version not present in /var/lib/update_engine/prefs Sep 12 23:58:48.847015 update_engine[1454]: I20250912 23:58:48.846617 1454 omaha_request_action.cc:271] Posting an Omaha request to disabled Sep 12 23:58:48.847015 update_engine[1454]: I20250912 23:58:48.846629 1454 omaha_request_action.cc:272] Request: Sep 12 23:58:48.847015 update_engine[1454]: Sep 12 23:58:48.847015 update_engine[1454]: Sep 12 23:58:48.847015 update_engine[1454]: Sep 12 23:58:48.847015 update_engine[1454]: Sep 12 23:58:48.847015 update_engine[1454]: Sep 12 23:58:48.847015 update_engine[1454]: Sep 12 23:58:48.847015 update_engine[1454]: Sep 12 23:58:48.847015 update_engine[1454]: Sep 12 23:58:48.847015 update_engine[1454]: I20250912 23:58:48.846637 1454 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Sep 12 23:58:48.852827 locksmithd[1488]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_CHECKING_FOR_UPDATE" NewVersion=0.0.0 NewSize=0 Sep 12 23:58:48.853771 update_engine[1454]: I20250912 23:58:48.853528 1454 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Sep 12 23:58:48.853978 update_engine[1454]: I20250912 23:58:48.853926 1454 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. 
Sep 12 23:58:48.857877 update_engine[1454]: E20250912 23:58:48.857789 1454 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Sep 12 23:58:48.858000 update_engine[1454]: I20250912 23:58:48.857903 1454 libcurl_http_fetcher.cc:283] No HTTP response, retry 1 Sep 12 23:58:54.037112 kubelet[2585]: I0912 23:58:54.036985 2585 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 12 23:58:58.845761 update_engine[1454]: I20250912 23:58:58.844978 1454 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Sep 12 23:58:58.845761 update_engine[1454]: I20250912 23:58:58.845202 1454 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Sep 12 23:58:58.845761 update_engine[1454]: I20250912 23:58:58.845398 1454 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. Sep 12 23:58:58.846542 update_engine[1454]: E20250912 23:58:58.846498 1454 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Sep 12 23:58:58.846598 update_engine[1454]: I20250912 23:58:58.846570 1454 libcurl_http_fetcher.cc:283] No HTTP response, retry 2 Sep 12 23:59:08.842850 update_engine[1454]: I20250912 23:59:08.842758 1454 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Sep 12 23:59:08.843427 update_engine[1454]: I20250912 23:59:08.843385 1454 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Sep 12 23:59:08.843776 update_engine[1454]: I20250912 23:59:08.843690 1454 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. 
Sep 12 23:59:08.844655 update_engine[1454]: E20250912 23:59:08.844605 1454 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Sep 12 23:59:08.844756 update_engine[1454]: I20250912 23:59:08.844723 1454 libcurl_http_fetcher.cc:283] No HTTP response, retry 3 Sep 12 23:59:18.844899 update_engine[1454]: I20250912 23:59:18.844760 1454 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Sep 12 23:59:18.845263 update_engine[1454]: I20250912 23:59:18.845077 1454 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Sep 12 23:59:18.845402 update_engine[1454]: I20250912 23:59:18.845366 1454 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. Sep 12 23:59:18.846185 update_engine[1454]: E20250912 23:59:18.846123 1454 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Sep 12 23:59:18.846283 update_engine[1454]: I20250912 23:59:18.846205 1454 libcurl_http_fetcher.cc:297] Transfer resulted in an error (0), 0 bytes downloaded Sep 12 23:59:18.846283 update_engine[1454]: I20250912 23:59:18.846216 1454 omaha_request_action.cc:617] Omaha request response: Sep 12 23:59:18.846332 update_engine[1454]: E20250912 23:59:18.846298 1454 omaha_request_action.cc:636] Omaha request network transfer failed. Sep 12 23:59:18.846332 update_engine[1454]: I20250912 23:59:18.846316 1454 action_processor.cc:68] ActionProcessor::ActionComplete: OmahaRequestAction action failed. Aborting processing. Sep 12 23:59:18.846332 update_engine[1454]: I20250912 23:59:18.846321 1454 action_processor.cc:73] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction Sep 12 23:59:18.846332 update_engine[1454]: I20250912 23:59:18.846328 1454 update_attempter.cc:306] Processing Done. Sep 12 23:59:18.846413 update_engine[1454]: E20250912 23:59:18.846342 1454 update_attempter.cc:619] Update failed. 
Sep 12 23:59:18.846413 update_engine[1454]: I20250912 23:59:18.846349 1454 utils.cc:600] Converting error code 2000 to kActionCodeOmahaErrorInHTTPResponse Sep 12 23:59:18.846413 update_engine[1454]: I20250912 23:59:18.846352 1454 payload_state.cc:97] Updating payload state for error code: 37 (kActionCodeOmahaErrorInHTTPResponse) Sep 12 23:59:18.846413 update_engine[1454]: I20250912 23:59:18.846357 1454 payload_state.cc:103] Ignoring failures until we get a valid Omaha response. Sep 12 23:59:18.847051 update_engine[1454]: I20250912 23:59:18.846427 1454 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction Sep 12 23:59:18.847051 update_engine[1454]: I20250912 23:59:18.846453 1454 omaha_request_action.cc:271] Posting an Omaha request to disabled Sep 12 23:59:18.847051 update_engine[1454]: I20250912 23:59:18.846458 1454 omaha_request_action.cc:272] Request: Sep 12 23:59:18.847051 update_engine[1454]: Sep 12 23:59:18.847051 update_engine[1454]: Sep 12 23:59:18.847051 update_engine[1454]: Sep 12 23:59:18.847051 update_engine[1454]: Sep 12 23:59:18.847051 update_engine[1454]: Sep 12 23:59:18.847051 update_engine[1454]: Sep 12 23:59:18.847051 update_engine[1454]: I20250912 23:59:18.846464 1454 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Sep 12 23:59:18.847051 update_engine[1454]: I20250912 23:59:18.846602 1454 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Sep 12 23:59:18.847301 locksmithd[1488]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_REPORTING_ERROR_EVENT" NewVersion=0.0.0 NewSize=0 Sep 12 23:59:18.849254 update_engine[1454]: I20250912 23:59:18.848865 1454 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. 
Sep 12 23:59:18.849503 update_engine[1454]: E20250912 23:59:18.849452 1454 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Sep 12 23:59:18.849542 update_engine[1454]: I20250912 23:59:18.849510 1454 libcurl_http_fetcher.cc:297] Transfer resulted in an error (0), 0 bytes downloaded Sep 12 23:59:18.849542 update_engine[1454]: I20250912 23:59:18.849520 1454 omaha_request_action.cc:617] Omaha request response: Sep 12 23:59:18.849542 update_engine[1454]: I20250912 23:59:18.849526 1454 action_processor.cc:65] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction Sep 12 23:59:18.849542 update_engine[1454]: I20250912 23:59:18.849531 1454 action_processor.cc:73] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction Sep 12 23:59:18.849542 update_engine[1454]: I20250912 23:59:18.849535 1454 update_attempter.cc:306] Processing Done. Sep 12 23:59:18.849542 update_engine[1454]: I20250912 23:59:18.849542 1454 update_attempter.cc:310] Error event sent. Sep 12 23:59:18.849685 update_engine[1454]: I20250912 23:59:18.849551 1454 update_check_scheduler.cc:74] Next update check in 48m52s Sep 12 23:59:18.850366 locksmithd[1488]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_IDLE" NewVersion=0.0.0 NewSize=0 Sep 12 23:59:34.047820 systemd[1]: run-containerd-runc-k8s.io-ea8513d99f9393489a1fbf2932b3975999bf81b2b87e8fd50bec7a496582a4bd-runc.aDSzAA.mount: Deactivated successfully. Sep 12 23:59:49.915532 systemd[1]: run-containerd-runc-k8s.io-37c8f1dce7d6f0f90821b3a4a8cb4e9f43195649843d83df054b1d6fdbc2be2c-runc.4vjRFr.mount: Deactivated successfully. Sep 13 00:00:04.052916 systemd[1]: Started logrotate.service - Rotate and Compress System Logs. Sep 13 00:00:04.071508 systemd[1]: logrotate.service: Deactivated successfully. Sep 13 00:00:12.821188 systemd[1]: Started sshd@7-91.99.152.252:22-147.75.109.163:37806.service - OpenSSH per-connection server daemon (147.75.109.163:37806). 
Sep 13 00:00:13.809111 sshd[6094]: Accepted publickey for core from 147.75.109.163 port 37806 ssh2: RSA SHA256:bk/7TLrptUsRlsRU8kT0ooDVsm6tbA2jrK7QjRZsxaM Sep 13 00:00:13.811960 sshd[6094]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 13 00:00:13.817600 systemd-logind[1453]: New session 8 of user core. Sep 13 00:00:13.821998 systemd[1]: Started session-8.scope - Session 8 of User core. Sep 13 00:00:14.625311 sshd[6094]: pam_unix(sshd:session): session closed for user core Sep 13 00:00:14.629306 systemd-logind[1453]: Session 8 logged out. Waiting for processes to exit. Sep 13 00:00:14.629999 systemd[1]: sshd@7-91.99.152.252:22-147.75.109.163:37806.service: Deactivated successfully. Sep 13 00:00:14.634352 systemd[1]: session-8.scope: Deactivated successfully. Sep 13 00:00:14.637521 systemd-logind[1453]: Removed session 8. Sep 13 00:00:19.807614 systemd[1]: Started sshd@8-91.99.152.252:22-147.75.109.163:37810.service - OpenSSH per-connection server daemon (147.75.109.163:37810). Sep 13 00:00:20.824158 sshd[6113]: Accepted publickey for core from 147.75.109.163 port 37810 ssh2: RSA SHA256:bk/7TLrptUsRlsRU8kT0ooDVsm6tbA2jrK7QjRZsxaM Sep 13 00:00:20.826450 sshd[6113]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 13 00:00:20.834825 systemd-logind[1453]: New session 9 of user core. Sep 13 00:00:20.841893 systemd[1]: Started session-9.scope - Session 9 of User core. Sep 13 00:00:21.594606 sshd[6113]: pam_unix(sshd:session): session closed for user core Sep 13 00:00:21.600941 systemd[1]: sshd@8-91.99.152.252:22-147.75.109.163:37810.service: Deactivated successfully. Sep 13 00:00:21.603404 systemd[1]: session-9.scope: Deactivated successfully. Sep 13 00:00:21.604945 systemd-logind[1453]: Session 9 logged out. Waiting for processes to exit. Sep 13 00:00:21.606485 systemd-logind[1453]: Removed session 9. 
Sep 13 00:00:26.770319 systemd[1]: Started sshd@9-91.99.152.252:22-147.75.109.163:53090.service - OpenSSH per-connection server daemon (147.75.109.163:53090). Sep 13 00:00:27.770454 sshd[6150]: Accepted publickey for core from 147.75.109.163 port 53090 ssh2: RSA SHA256:bk/7TLrptUsRlsRU8kT0ooDVsm6tbA2jrK7QjRZsxaM Sep 13 00:00:27.773133 sshd[6150]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 13 00:00:27.778702 systemd-logind[1453]: New session 10 of user core. Sep 13 00:00:27.785979 systemd[1]: Started session-10.scope - Session 10 of User core. Sep 13 00:00:28.542116 sshd[6150]: pam_unix(sshd:session): session closed for user core Sep 13 00:00:28.547552 systemd[1]: sshd@9-91.99.152.252:22-147.75.109.163:53090.service: Deactivated successfully. Sep 13 00:00:28.550589 systemd[1]: session-10.scope: Deactivated successfully. Sep 13 00:00:28.551527 systemd-logind[1453]: Session 10 logged out. Waiting for processes to exit. Sep 13 00:00:28.552977 systemd-logind[1453]: Removed session 10. Sep 13 00:00:33.722136 systemd[1]: Started sshd@10-91.99.152.252:22-147.75.109.163:51648.service - OpenSSH per-connection server daemon (147.75.109.163:51648). Sep 13 00:00:34.717936 sshd[6164]: Accepted publickey for core from 147.75.109.163 port 51648 ssh2: RSA SHA256:bk/7TLrptUsRlsRU8kT0ooDVsm6tbA2jrK7QjRZsxaM Sep 13 00:00:34.719734 sshd[6164]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 13 00:00:34.726430 systemd-logind[1453]: New session 11 of user core. Sep 13 00:00:34.734140 systemd[1]: Started session-11.scope - Session 11 of User core. Sep 13 00:00:35.493778 sshd[6164]: pam_unix(sshd:session): session closed for user core Sep 13 00:00:35.499583 systemd[1]: sshd@10-91.99.152.252:22-147.75.109.163:51648.service: Deactivated successfully. Sep 13 00:00:35.506607 systemd[1]: session-11.scope: Deactivated successfully. Sep 13 00:00:35.509161 systemd-logind[1453]: Session 11 logged out. 
Waiting for processes to exit. Sep 13 00:00:35.510335 systemd-logind[1453]: Removed session 11. Sep 13 00:00:35.666092 systemd[1]: Started sshd@11-91.99.152.252:22-147.75.109.163:51664.service - OpenSSH per-connection server daemon (147.75.109.163:51664). Sep 13 00:00:36.659541 sshd[6196]: Accepted publickey for core from 147.75.109.163 port 51664 ssh2: RSA SHA256:bk/7TLrptUsRlsRU8kT0ooDVsm6tbA2jrK7QjRZsxaM Sep 13 00:00:36.660541 sshd[6196]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 13 00:00:36.671618 systemd-logind[1453]: New session 12 of user core. Sep 13 00:00:36.686310 systemd[1]: Started session-12.scope - Session 12 of User core. Sep 13 00:00:37.459030 sshd[6196]: pam_unix(sshd:session): session closed for user core Sep 13 00:00:37.464358 systemd[1]: sshd@11-91.99.152.252:22-147.75.109.163:51664.service: Deactivated successfully. Sep 13 00:00:37.467491 systemd[1]: session-12.scope: Deactivated successfully. Sep 13 00:00:37.468824 systemd-logind[1453]: Session 12 logged out. Waiting for processes to exit. Sep 13 00:00:37.471361 systemd-logind[1453]: Removed session 12. Sep 13 00:00:37.636196 systemd[1]: Started sshd@12-91.99.152.252:22-147.75.109.163:51676.service - OpenSSH per-connection server daemon (147.75.109.163:51676). Sep 13 00:00:38.649529 sshd[6207]: Accepted publickey for core from 147.75.109.163 port 51676 ssh2: RSA SHA256:bk/7TLrptUsRlsRU8kT0ooDVsm6tbA2jrK7QjRZsxaM Sep 13 00:00:38.651918 sshd[6207]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 13 00:00:38.657305 systemd-logind[1453]: New session 13 of user core. Sep 13 00:00:38.665042 systemd[1]: Started session-13.scope - Session 13 of User core. Sep 13 00:00:39.425939 sshd[6207]: pam_unix(sshd:session): session closed for user core Sep 13 00:00:39.435819 systemd[1]: sshd@12-91.99.152.252:22-147.75.109.163:51676.service: Deactivated successfully. 
Sep 13 00:00:39.438312 systemd[1]: session-13.scope: Deactivated successfully. Sep 13 00:00:39.448465 systemd-logind[1453]: Session 13 logged out. Waiting for processes to exit. Sep 13 00:00:39.451465 systemd-logind[1453]: Removed session 13. Sep 13 00:00:44.601327 systemd[1]: Started sshd@13-91.99.152.252:22-147.75.109.163:59688.service - OpenSSH per-connection server daemon (147.75.109.163:59688). Sep 13 00:00:45.609278 sshd[6249]: Accepted publickey for core from 147.75.109.163 port 59688 ssh2: RSA SHA256:bk/7TLrptUsRlsRU8kT0ooDVsm6tbA2jrK7QjRZsxaM Sep 13 00:00:45.611550 sshd[6249]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 13 00:00:45.617567 systemd-logind[1453]: New session 14 of user core. Sep 13 00:00:45.625077 systemd[1]: Started session-14.scope - Session 14 of User core. Sep 13 00:00:46.405920 sshd[6249]: pam_unix(sshd:session): session closed for user core Sep 13 00:00:46.414620 systemd[1]: sshd@13-91.99.152.252:22-147.75.109.163:59688.service: Deactivated successfully. Sep 13 00:00:46.419441 systemd[1]: session-14.scope: Deactivated successfully. Sep 13 00:00:46.421935 systemd-logind[1453]: Session 14 logged out. Waiting for processes to exit. Sep 13 00:00:46.423381 systemd-logind[1453]: Removed session 14. Sep 13 00:00:49.922316 systemd[1]: run-containerd-runc-k8s.io-37c8f1dce7d6f0f90821b3a4a8cb4e9f43195649843d83df054b1d6fdbc2be2c-runc.iqQ527.mount: Deactivated successfully. Sep 13 00:00:51.586622 systemd[1]: Started sshd@14-91.99.152.252:22-147.75.109.163:59906.service - OpenSSH per-connection server daemon (147.75.109.163:59906). Sep 13 00:00:52.571320 sshd[6283]: Accepted publickey for core from 147.75.109.163 port 59906 ssh2: RSA SHA256:bk/7TLrptUsRlsRU8kT0ooDVsm6tbA2jrK7QjRZsxaM Sep 13 00:00:52.573594 sshd[6283]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 13 00:00:52.580772 systemd-logind[1453]: New session 15 of user core. 
Sep 13 00:00:52.582915 systemd[1]: Started session-15.scope - Session 15 of User core. Sep 13 00:00:53.339794 sshd[6283]: pam_unix(sshd:session): session closed for user core Sep 13 00:00:53.345339 systemd[1]: sshd@14-91.99.152.252:22-147.75.109.163:59906.service: Deactivated successfully. Sep 13 00:00:53.348303 systemd[1]: session-15.scope: Deactivated successfully. Sep 13 00:00:53.349452 systemd-logind[1453]: Session 15 logged out. Waiting for processes to exit. Sep 13 00:00:53.350590 systemd-logind[1453]: Removed session 15. Sep 13 00:00:58.517864 systemd[1]: Started sshd@15-91.99.152.252:22-147.75.109.163:59922.service - OpenSSH per-connection server daemon (147.75.109.163:59922). Sep 13 00:00:59.503751 sshd[6316]: Accepted publickey for core from 147.75.109.163 port 59922 ssh2: RSA SHA256:bk/7TLrptUsRlsRU8kT0ooDVsm6tbA2jrK7QjRZsxaM Sep 13 00:00:59.505820 sshd[6316]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 13 00:00:59.516842 systemd-logind[1453]: New session 16 of user core. Sep 13 00:00:59.521115 systemd[1]: Started session-16.scope - Session 16 of User core. Sep 13 00:01:00.269069 sshd[6316]: pam_unix(sshd:session): session closed for user core Sep 13 00:01:00.276520 systemd[1]: sshd@15-91.99.152.252:22-147.75.109.163:59922.service: Deactivated successfully. Sep 13 00:01:00.280826 systemd[1]: session-16.scope: Deactivated successfully. Sep 13 00:01:00.283932 systemd-logind[1453]: Session 16 logged out. Waiting for processes to exit. Sep 13 00:01:00.286998 systemd-logind[1453]: Removed session 16. Sep 13 00:01:04.068881 systemd[1]: run-containerd-runc-k8s.io-ea8513d99f9393489a1fbf2932b3975999bf81b2b87e8fd50bec7a496582a4bd-runc.tKfvrT.mount: Deactivated successfully. Sep 13 00:01:05.446116 systemd[1]: Started sshd@16-91.99.152.252:22-147.75.109.163:44746.service - OpenSSH per-connection server daemon (147.75.109.163:44746). 
Sep 13 00:01:06.443429 sshd[6354]: Accepted publickey for core from 147.75.109.163 port 44746 ssh2: RSA SHA256:bk/7TLrptUsRlsRU8kT0ooDVsm6tbA2jrK7QjRZsxaM Sep 13 00:01:06.446426 sshd[6354]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 13 00:01:06.452751 systemd-logind[1453]: New session 17 of user core. Sep 13 00:01:06.458041 systemd[1]: Started session-17.scope - Session 17 of User core. Sep 13 00:01:07.210893 sshd[6354]: pam_unix(sshd:session): session closed for user core Sep 13 00:01:07.217964 systemd[1]: sshd@16-91.99.152.252:22-147.75.109.163:44746.service: Deactivated successfully. Sep 13 00:01:07.220911 systemd[1]: session-17.scope: Deactivated successfully. Sep 13 00:01:07.222415 systemd-logind[1453]: Session 17 logged out. Waiting for processes to exit. Sep 13 00:01:07.224402 systemd-logind[1453]: Removed session 17. Sep 13 00:01:07.394605 systemd[1]: Started sshd@17-91.99.152.252:22-147.75.109.163:44756.service - OpenSSH per-connection server daemon (147.75.109.163:44756). Sep 13 00:01:08.377444 sshd[6386]: Accepted publickey for core from 147.75.109.163 port 44756 ssh2: RSA SHA256:bk/7TLrptUsRlsRU8kT0ooDVsm6tbA2jrK7QjRZsxaM Sep 13 00:01:08.380394 sshd[6386]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 13 00:01:08.390788 systemd-logind[1453]: New session 18 of user core. Sep 13 00:01:08.394952 systemd[1]: Started session-18.scope - Session 18 of User core. Sep 13 00:01:09.316448 sshd[6386]: pam_unix(sshd:session): session closed for user core Sep 13 00:01:09.324439 systemd[1]: sshd@17-91.99.152.252:22-147.75.109.163:44756.service: Deactivated successfully. Sep 13 00:01:09.330437 systemd[1]: session-18.scope: Deactivated successfully. Sep 13 00:01:09.332929 systemd-logind[1453]: Session 18 logged out. Waiting for processes to exit. Sep 13 00:01:09.335638 systemd-logind[1453]: Removed session 18. 
Sep 13 00:01:09.489240 systemd[1]: Started sshd@18-91.99.152.252:22-147.75.109.163:44758.service - OpenSSH per-connection server daemon (147.75.109.163:44758). Sep 13 00:01:10.485933 sshd[6418]: Accepted publickey for core from 147.75.109.163 port 44758 ssh2: RSA SHA256:bk/7TLrptUsRlsRU8kT0ooDVsm6tbA2jrK7QjRZsxaM Sep 13 00:01:10.489693 sshd[6418]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 13 00:01:10.494773 systemd-logind[1453]: New session 19 of user core. Sep 13 00:01:10.504776 systemd[1]: Started session-19.scope - Session 19 of User core. Sep 13 00:01:11.998421 sshd[6418]: pam_unix(sshd:session): session closed for user core Sep 13 00:01:12.004275 systemd[1]: sshd@18-91.99.152.252:22-147.75.109.163:44758.service: Deactivated successfully. Sep 13 00:01:12.006887 systemd[1]: session-19.scope: Deactivated successfully. Sep 13 00:01:12.008361 systemd-logind[1453]: Session 19 logged out. Waiting for processes to exit. Sep 13 00:01:12.009643 systemd-logind[1453]: Removed session 19. Sep 13 00:01:12.171340 systemd[1]: Started sshd@19-91.99.152.252:22-147.75.109.163:44484.service - OpenSSH per-connection server daemon (147.75.109.163:44484). Sep 13 00:01:13.160547 sshd[6436]: Accepted publickey for core from 147.75.109.163 port 44484 ssh2: RSA SHA256:bk/7TLrptUsRlsRU8kT0ooDVsm6tbA2jrK7QjRZsxaM Sep 13 00:01:13.163039 sshd[6436]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 13 00:01:13.168854 systemd-logind[1453]: New session 20 of user core. Sep 13 00:01:13.179993 systemd[1]: Started session-20.scope - Session 20 of User core. Sep 13 00:01:14.070046 sshd[6436]: pam_unix(sshd:session): session closed for user core Sep 13 00:01:14.075804 systemd[1]: sshd@19-91.99.152.252:22-147.75.109.163:44484.service: Deactivated successfully. Sep 13 00:01:14.079637 systemd[1]: session-20.scope: Deactivated successfully. Sep 13 00:01:14.081493 systemd-logind[1453]: Session 20 logged out. 
Waiting for processes to exit. Sep 13 00:01:14.082648 systemd-logind[1453]: Removed session 20. Sep 13 00:01:14.247055 systemd[1]: Started sshd@20-91.99.152.252:22-147.75.109.163:44494.service - OpenSSH per-connection server daemon (147.75.109.163:44494). Sep 13 00:01:15.242997 sshd[6449]: Accepted publickey for core from 147.75.109.163 port 44494 ssh2: RSA SHA256:bk/7TLrptUsRlsRU8kT0ooDVsm6tbA2jrK7QjRZsxaM Sep 13 00:01:15.245149 sshd[6449]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 13 00:01:15.250666 systemd-logind[1453]: New session 21 of user core. Sep 13 00:01:15.256136 systemd[1]: Started session-21.scope - Session 21 of User core. Sep 13 00:01:16.003364 sshd[6449]: pam_unix(sshd:session): session closed for user core Sep 13 00:01:16.008696 systemd-logind[1453]: Session 21 logged out. Waiting for processes to exit. Sep 13 00:01:16.009353 systemd[1]: sshd@20-91.99.152.252:22-147.75.109.163:44494.service: Deactivated successfully. Sep 13 00:01:16.011405 systemd[1]: session-21.scope: Deactivated successfully. Sep 13 00:01:16.013382 systemd-logind[1453]: Removed session 21. Sep 13 00:01:21.182170 systemd[1]: Started sshd@21-91.99.152.252:22-147.75.109.163:57170.service - OpenSSH per-connection server daemon (147.75.109.163:57170). Sep 13 00:01:22.162432 sshd[6485]: Accepted publickey for core from 147.75.109.163 port 57170 ssh2: RSA SHA256:bk/7TLrptUsRlsRU8kT0ooDVsm6tbA2jrK7QjRZsxaM Sep 13 00:01:22.164222 sshd[6485]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 13 00:01:22.175850 systemd-logind[1453]: New session 22 of user core. Sep 13 00:01:22.181959 systemd[1]: Started session-22.scope - Session 22 of User core. Sep 13 00:01:22.932661 sshd[6485]: pam_unix(sshd:session): session closed for user core Sep 13 00:01:22.938254 systemd-logind[1453]: Session 22 logged out. Waiting for processes to exit. 
Sep 13 00:01:22.938798 systemd[1]: sshd@21-91.99.152.252:22-147.75.109.163:57170.service: Deactivated successfully. Sep 13 00:01:22.945855 systemd[1]: session-22.scope: Deactivated successfully. Sep 13 00:01:22.947423 systemd-logind[1453]: Removed session 22. Sep 13 00:01:28.112044 systemd[1]: Started sshd@22-91.99.152.252:22-147.75.109.163:57180.service - OpenSSH per-connection server daemon (147.75.109.163:57180). Sep 13 00:01:29.100759 sshd[6501]: Accepted publickey for core from 147.75.109.163 port 57180 ssh2: RSA SHA256:bk/7TLrptUsRlsRU8kT0ooDVsm6tbA2jrK7QjRZsxaM Sep 13 00:01:29.104052 sshd[6501]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 13 00:01:29.111185 systemd-logind[1453]: New session 23 of user core. Sep 13 00:01:29.116007 systemd[1]: Started session-23.scope - Session 23 of User core. Sep 13 00:01:29.859979 sshd[6501]: pam_unix(sshd:session): session closed for user core Sep 13 00:01:29.867383 systemd-logind[1453]: Session 23 logged out. Waiting for processes to exit. Sep 13 00:01:29.867595 systemd[1]: sshd@22-91.99.152.252:22-147.75.109.163:57180.service: Deactivated successfully. Sep 13 00:01:29.872591 systemd[1]: session-23.scope: Deactivated successfully. Sep 13 00:01:29.875181 systemd-logind[1453]: Removed session 23. Sep 13 00:01:35.045586 systemd[1]: Started sshd@23-91.99.152.252:22-147.75.109.163:42624.service - OpenSSH per-connection server daemon (147.75.109.163:42624). Sep 13 00:01:36.040628 sshd[6547]: Accepted publickey for core from 147.75.109.163 port 42624 ssh2: RSA SHA256:bk/7TLrptUsRlsRU8kT0ooDVsm6tbA2jrK7QjRZsxaM Sep 13 00:01:36.043307 sshd[6547]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 13 00:01:36.049384 systemd-logind[1453]: New session 24 of user core. Sep 13 00:01:36.054968 systemd[1]: Started session-24.scope - Session 24 of User core. 
Sep 13 00:01:36.809763 sshd[6547]: pam_unix(sshd:session): session closed for user core
Sep 13 00:01:36.814513 systemd[1]: sshd@23-91.99.152.252:22-147.75.109.163:42624.service: Deactivated successfully.
Sep 13 00:01:36.817686 systemd[1]: session-24.scope: Deactivated successfully.
Sep 13 00:01:36.819012 systemd-logind[1453]: Session 24 logged out. Waiting for processes to exit.
Sep 13 00:01:36.821298 systemd-logind[1453]: Removed session 24.
Sep 13 00:01:51.568937 systemd[1]: cri-containerd-80056097e8513360e60e718873c9e9449add05ec256867a1eb1be4f5d2a69e4b.scope: Deactivated successfully.
Sep 13 00:01:51.571186 systemd[1]: cri-containerd-80056097e8513360e60e718873c9e9449add05ec256867a1eb1be4f5d2a69e4b.scope: Consumed 6.380s CPU time, 18.1M memory peak, 0B memory swap peak.
Sep 13 00:01:51.603498 containerd[1466]: time="2025-09-13T00:01:51.603294443Z" level=info msg="shim disconnected" id=80056097e8513360e60e718873c9e9449add05ec256867a1eb1be4f5d2a69e4b namespace=k8s.io
Sep 13 00:01:51.603498 containerd[1466]: time="2025-09-13T00:01:51.603418043Z" level=warning msg="cleaning up after shim disconnected" id=80056097e8513360e60e718873c9e9449add05ec256867a1eb1be4f5d2a69e4b namespace=k8s.io
Sep 13 00:01:51.603498 containerd[1466]: time="2025-09-13T00:01:51.603474883Z" level=info msg="cleaning up dead shim" namespace=k8s.io
Sep 13 00:01:51.605174 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-80056097e8513360e60e718873c9e9449add05ec256867a1eb1be4f5d2a69e4b-rootfs.mount: Deactivated successfully.
Sep 13 00:01:51.646998 kubelet[2585]: I0913 00:01:51.646804    2585 scope.go:117] "RemoveContainer" containerID="80056097e8513360e60e718873c9e9449add05ec256867a1eb1be4f5d2a69e4b"
Sep 13 00:01:51.651688 containerd[1466]: time="2025-09-13T00:01:51.651405408Z" level=info msg="CreateContainer within sandbox \"e8e8d41df0140b4b135eafe49358d0de7193fa327b38d9f23e771e008055395a\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:1,}"
Sep 13 00:01:51.674657 containerd[1466]: time="2025-09-13T00:01:51.674514311Z" level=info msg="CreateContainer within sandbox \"e8e8d41df0140b4b135eafe49358d0de7193fa327b38d9f23e771e008055395a\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:1,} returns container id \"7205507cece24b2b37461457c60a4e45e0e1172517b1fee66a33d7eba4ed4b0a\""
Sep 13 00:01:51.675742 containerd[1466]: time="2025-09-13T00:01:51.675402550Z" level=info msg="StartContainer for \"7205507cece24b2b37461457c60a4e45e0e1172517b1fee66a33d7eba4ed4b0a\""
Sep 13 00:01:51.716734 systemd[1]: Started cri-containerd-7205507cece24b2b37461457c60a4e45e0e1172517b1fee66a33d7eba4ed4b0a.scope - libcontainer container 7205507cece24b2b37461457c60a4e45e0e1172517b1fee66a33d7eba4ed4b0a.
Sep 13 00:01:51.760071 containerd[1466]: time="2025-09-13T00:01:51.759962447Z" level=info msg="StartContainer for \"7205507cece24b2b37461457c60a4e45e0e1172517b1fee66a33d7eba4ed4b0a\" returns successfully"
Sep 13 00:01:51.999781 kubelet[2585]: E0913 00:01:51.999521    2585 controller.go:195] "Failed to update lease" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.0.3:33324->10.0.0.2:2379: read: connection timed out"
Sep 13 00:01:52.010571 systemd[1]: cri-containerd-39cbfc4ed1703a11461788e254170fff464333fbe3251ef2079c81db41d4ce14.scope: Deactivated successfully.
Sep 13 00:01:52.010880 systemd[1]: cri-containerd-39cbfc4ed1703a11461788e254170fff464333fbe3251ef2079c81db41d4ce14.scope: Consumed 4.615s CPU time, 15.6M memory peak, 0B memory swap peak.
Sep 13 00:01:52.036095 containerd[1466]: time="2025-09-13T00:01:52.035940613Z" level=info msg="shim disconnected" id=39cbfc4ed1703a11461788e254170fff464333fbe3251ef2079c81db41d4ce14 namespace=k8s.io
Sep 13 00:01:52.036095 containerd[1466]: time="2025-09-13T00:01:52.035998293Z" level=warning msg="cleaning up after shim disconnected" id=39cbfc4ed1703a11461788e254170fff464333fbe3251ef2079c81db41d4ce14 namespace=k8s.io
Sep 13 00:01:52.036095 containerd[1466]: time="2025-09-13T00:01:52.036021773Z" level=info msg="cleaning up dead shim" namespace=k8s.io
Sep 13 00:01:52.606060 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-39cbfc4ed1703a11461788e254170fff464333fbe3251ef2079c81db41d4ce14-rootfs.mount: Deactivated successfully.
Sep 13 00:01:52.663654 kubelet[2585]: I0913 00:01:52.663026    2585 scope.go:117] "RemoveContainer" containerID="39cbfc4ed1703a11461788e254170fff464333fbe3251ef2079c81db41d4ce14"
Sep 13 00:01:52.665453 containerd[1466]: time="2025-09-13T00:01:52.665409624Z" level=info msg="CreateContainer within sandbox \"88edc675682a63e2069128978a3f0daf8d826d3fd1cc641dcae29457e9fc4898\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:1,}"
Sep 13 00:01:52.687234 containerd[1466]: time="2025-09-13T00:01:52.687187414Z" level=info msg="CreateContainer within sandbox \"88edc675682a63e2069128978a3f0daf8d826d3fd1cc641dcae29457e9fc4898\" for &ContainerMetadata{Name:kube-scheduler,Attempt:1,} returns container id \"e64ccc6975f6d95ae5a8ec7852e3cd750c84c7346b12af557c77127b8e17d75e\""
Sep 13 00:01:52.687802 containerd[1466]: time="2025-09-13T00:01:52.687776254Z" level=info msg="StartContainer for \"e64ccc6975f6d95ae5a8ec7852e3cd750c84c7346b12af557c77127b8e17d75e\""
Sep 13 00:01:52.734954 systemd[1]: Started cri-containerd-e64ccc6975f6d95ae5a8ec7852e3cd750c84c7346b12af557c77127b8e17d75e.scope - libcontainer container e64ccc6975f6d95ae5a8ec7852e3cd750c84c7346b12af557c77127b8e17d75e.
Sep 13 00:01:52.775527 containerd[1466]: time="2025-09-13T00:01:52.775065937Z" level=info msg="StartContainer for \"e64ccc6975f6d95ae5a8ec7852e3cd750c84c7346b12af557c77127b8e17d75e\" returns successfully"
Sep 13 00:01:52.836981 systemd[1]: cri-containerd-af616cc4cf459d10914859f7e902fe0176edb59d6465e7b3a83262548dbd05c7.scope: Deactivated successfully.
Sep 13 00:01:52.837575 systemd[1]: cri-containerd-af616cc4cf459d10914859f7e902fe0176edb59d6465e7b3a83262548dbd05c7.scope: Consumed 27.631s CPU time.
Sep 13 00:01:52.875649 containerd[1466]: time="2025-09-13T00:01:52.875456254Z" level=info msg="shim disconnected" id=af616cc4cf459d10914859f7e902fe0176edb59d6465e7b3a83262548dbd05c7 namespace=k8s.io
Sep 13 00:01:52.875649 containerd[1466]: time="2025-09-13T00:01:52.875522374Z" level=warning msg="cleaning up after shim disconnected" id=af616cc4cf459d10914859f7e902fe0176edb59d6465e7b3a83262548dbd05c7 namespace=k8s.io
Sep 13 00:01:52.875649 containerd[1466]: time="2025-09-13T00:01:52.875532734Z" level=info msg="cleaning up dead shim" namespace=k8s.io
Sep 13 00:01:53.606883 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-af616cc4cf459d10914859f7e902fe0176edb59d6465e7b3a83262548dbd05c7-rootfs.mount: Deactivated successfully.
Sep 13 00:01:53.668105 kubelet[2585]: I0913 00:01:53.667601    2585 scope.go:117] "RemoveContainer" containerID="af616cc4cf459d10914859f7e902fe0176edb59d6465e7b3a83262548dbd05c7"
Sep 13 00:01:53.671002 containerd[1466]: time="2025-09-13T00:01:53.669986840Z" level=info msg="CreateContainer within sandbox \"2113b2e3e5c70c8fb3ec3c27c0133fda6a3b87ff46ec94954b2ff416879b196d\" for container &ContainerMetadata{Name:tigera-operator,Attempt:1,}"
Sep 13 00:01:53.694055 containerd[1466]: time="2025-09-13T00:01:53.693986557Z" level=info msg="CreateContainer within sandbox \"2113b2e3e5c70c8fb3ec3c27c0133fda6a3b87ff46ec94954b2ff416879b196d\" for &ContainerMetadata{Name:tigera-operator,Attempt:1,} returns container id \"8f8937fbf1843d296657501d1580b3176525df86df4b2c73cb19fd8b907aa92c\""
Sep 13 00:01:53.694762 containerd[1466]: time="2025-09-13T00:01:53.694732117Z" level=info msg="StartContainer for \"8f8937fbf1843d296657501d1580b3176525df86df4b2c73cb19fd8b907aa92c\""
Sep 13 00:01:53.742935 systemd[1]: Started cri-containerd-8f8937fbf1843d296657501d1580b3176525df86df4b2c73cb19fd8b907aa92c.scope - libcontainer container 8f8937fbf1843d296657501d1580b3176525df86df4b2c73cb19fd8b907aa92c.
Sep 13 00:01:53.972799 containerd[1466]: time="2025-09-13T00:01:53.972746563Z" level=info msg="StartContainer for \"8f8937fbf1843d296657501d1580b3176525df86df4b2c73cb19fd8b907aa92c\" returns successfully"
Sep 13 00:01:54.608606 systemd[1]: run-containerd-runc-k8s.io-8f8937fbf1843d296657501d1580b3176525df86df4b2c73cb19fd8b907aa92c-runc.PQuuNq.mount: Deactivated successfully.
Sep 13 00:01:56.156673 kubelet[2585]: E0913 00:01:56.156354 2585 event.go:359] "Server rejected event (will not retry!)" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.0.3:33102->10.0.0.2:2379: read: connection timed out" event="&Event{ObjectMeta:{kube-apiserver-ci-4081-3-5-n-326e2e5946.1864ae941b3b79d0 kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:kube-apiserver-ci-4081-3-5-n-326e2e5946,UID:d337cf5b029a73c04e03d8ba88efb8af,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Readiness probe failed: HTTP probe failed with statuscode: 500,Source:EventSource{Component:kubelet,Host:ci-4081-3-5-n-326e2e5946,},FirstTimestamp:2025-09-13 00:01:45.716611536 +0000 UTC m=+248.195844663,LastTimestamp:2025-09-13 00:01:45.716611536 +0000 UTC m=+248.195844663,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4081-3-5-n-326e2e5946,}"