Sep 12 17:13:12.925527 kernel: Booting Linux on physical CPU 0x0000000000 [0x413fd0c1]
Sep 12 17:13:12.925565 kernel: Linux version 6.6.106-flatcar (build@pony-truck.infra.kinvolk.io) (aarch64-cros-linux-gnu-gcc (Gentoo Hardened 13.3.1_p20240614 p17) 13.3.1 20240614, GNU ld (Gentoo 2.42 p3) 2.42.0) #1 SMP PREEMPT Fri Sep 12 15:59:19 -00 2025
Sep 12 17:13:12.925581 kernel: KASLR enabled
Sep 12 17:13:12.925590 kernel: efi: EFI v2.7 by Ubuntu distribution of EDK II
Sep 12 17:13:12.925599 kernel: efi: SMBIOS 3.0=0x139ed0000 MEMATTR=0x1390c1018 ACPI 2.0=0x136760018 RNG=0x13676e918 MEMRESERVE=0x136b43d18
Sep 12 17:13:12.925607 kernel: random: crng init done
Sep 12 17:13:12.925618 kernel: ACPI: Early table checksum verification disabled
Sep 12 17:13:12.925627 kernel: ACPI: RSDP 0x0000000136760018 000024 (v02 BOCHS )
Sep 12 17:13:12.925637 kernel: ACPI: XSDT 0x000000013676FE98 00006C (v01 BOCHS BXPC 00000001 01000013)
Sep 12 17:13:12.925648 kernel: ACPI: FACP 0x000000013676FA98 000114 (v06 BOCHS BXPC 00000001 BXPC 00000001)
Sep 12 17:13:12.925658 kernel: ACPI: DSDT 0x0000000136767518 001468 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Sep 12 17:13:12.925667 kernel: ACPI: APIC 0x000000013676FC18 000108 (v04 BOCHS BXPC 00000001 BXPC 00000001)
Sep 12 17:13:12.925676 kernel: ACPI: PPTT 0x000000013676FD98 000060 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Sep 12 17:13:12.925685 kernel: ACPI: GTDT 0x000000013676D898 000060 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Sep 12 17:13:12.925696 kernel: ACPI: MCFG 0x000000013676FF98 00003C (v01 BOCHS BXPC 00000001 BXPC 00000001)
Sep 12 17:13:12.925709 kernel: ACPI: SPCR 0x000000013676E818 000050 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Sep 12 17:13:12.925719 kernel: ACPI: DBG2 0x000000013676E898 000057 (v00 BOCHS BXPC 00000001 BXPC 00000001)
Sep 12 17:13:12.925728 kernel: ACPI: IORT 0x000000013676E418 000080 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Sep 12 17:13:12.925738 kernel: ACPI: BGRT 0x000000013676E798 000038 (v01 INTEL EDK2 00000002 01000013)
Sep 12 17:13:12.925747 kernel: ACPI: SPCR: console: pl011,mmio32,0x9000000,9600
Sep 12 17:13:12.925757 kernel: NUMA: Failed to initialise from firmware
Sep 12 17:13:12.925766 kernel: NUMA: Faking a node at [mem 0x0000000040000000-0x0000000139ffffff]
Sep 12 17:13:12.925776 kernel: NUMA: NODE_DATA [mem 0x13966f800-0x139674fff]
Sep 12 17:13:12.925785 kernel: Zone ranges:
Sep 12 17:13:12.925795 kernel: DMA [mem 0x0000000040000000-0x00000000ffffffff]
Sep 12 17:13:12.925807 kernel: DMA32 empty
Sep 12 17:13:12.925817 kernel: Normal [mem 0x0000000100000000-0x0000000139ffffff]
Sep 12 17:13:12.925826 kernel: Movable zone start for each node
Sep 12 17:13:12.925836 kernel: Early memory node ranges
Sep 12 17:13:12.925845 kernel: node 0: [mem 0x0000000040000000-0x000000013676ffff]
Sep 12 17:13:12.925855 kernel: node 0: [mem 0x0000000136770000-0x0000000136b3ffff]
Sep 12 17:13:12.925865 kernel: node 0: [mem 0x0000000136b40000-0x0000000139e1ffff]
Sep 12 17:13:12.925874 kernel: node 0: [mem 0x0000000139e20000-0x0000000139eaffff]
Sep 12 17:13:12.925884 kernel: node 0: [mem 0x0000000139eb0000-0x0000000139ebffff]
Sep 12 17:13:12.925893 kernel: node 0: [mem 0x0000000139ec0000-0x0000000139fdffff]
Sep 12 17:13:12.925903 kernel: node 0: [mem 0x0000000139fe0000-0x0000000139ffffff]
Sep 12 17:13:12.925912 kernel: Initmem setup node 0 [mem 0x0000000040000000-0x0000000139ffffff]
Sep 12 17:13:12.925924 kernel: On node 0, zone Normal: 24576 pages in unavailable ranges
Sep 12 17:13:12.925934 kernel: psci: probing for conduit method from ACPI.
Sep 12 17:13:12.925944 kernel: psci: PSCIv1.1 detected in firmware.
Sep 12 17:13:12.925957 kernel: psci: Using standard PSCI v0.2 function IDs
Sep 12 17:13:12.925968 kernel: psci: Trusted OS migration not required
Sep 12 17:13:12.925978 kernel: psci: SMC Calling Convention v1.1
Sep 12 17:13:12.925990 kernel: smccc: KVM: hypervisor services detected (0x00000000 0x00000000 0x00000000 0x00000003)
Sep 12 17:13:12.926001 kernel: percpu: Embedded 31 pages/cpu s86632 r8192 d32152 u126976
Sep 12 17:13:12.926011 kernel: pcpu-alloc: s86632 r8192 d32152 u126976 alloc=31*4096
Sep 12 17:13:12.926021 kernel: pcpu-alloc: [0] 0 [0] 1
Sep 12 17:13:12.926031 kernel: Detected PIPT I-cache on CPU0
Sep 12 17:13:12.926042 kernel: CPU features: detected: GIC system register CPU interface
Sep 12 17:13:12.926052 kernel: CPU features: detected: Hardware dirty bit management
Sep 12 17:13:12.926082 kernel: CPU features: detected: Spectre-v4
Sep 12 17:13:12.926094 kernel: CPU features: detected: Spectre-BHB
Sep 12 17:13:12.926105 kernel: CPU features: kernel page table isolation forced ON by KASLR
Sep 12 17:13:12.926118 kernel: CPU features: detected: Kernel page table isolation (KPTI)
Sep 12 17:13:12.926141 kernel: CPU features: detected: ARM erratum 1418040
Sep 12 17:13:12.926152 kernel: CPU features: detected: SSBS not fully self-synchronizing
Sep 12 17:13:12.926162 kernel: alternatives: applying boot alternatives
Sep 12 17:13:12.926174 kernel: Kernel command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyAMA0,115200n8 flatcar.first_boot=detected acpi=force flatcar.oem.id=hetzner verity.usrhash=1e63d3057914877efa0eb5f75703bd3a3d4c120bdf4a7ab97f41083e29183e56
Sep 12 17:13:12.926185 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space.
Sep 12 17:13:12.926195 kernel: Dentry cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Sep 12 17:13:12.926209 kernel: Inode-cache hash table entries: 262144 (order: 9, 2097152 bytes, linear)
Sep 12 17:13:12.926220 kernel: Fallback order for Node 0: 0
Sep 12 17:13:12.926230 kernel: Built 1 zonelists, mobility grouping on. Total pages: 1008000
Sep 12 17:13:12.926241 kernel: Policy zone: Normal
Sep 12 17:13:12.926255 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Sep 12 17:13:12.926266 kernel: software IO TLB: area num 2.
Sep 12 17:13:12.926276 kernel: software IO TLB: mapped [mem 0x00000000fbfff000-0x00000000fffff000] (64MB)
Sep 12 17:13:12.926288 kernel: Memory: 3882744K/4096000K available (10304K kernel code, 2186K rwdata, 8108K rodata, 39488K init, 897K bss, 213256K reserved, 0K cma-reserved)
Sep 12 17:13:12.926299 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=2, Nodes=1
Sep 12 17:13:12.926310 kernel: rcu: Preemptible hierarchical RCU implementation.
Sep 12 17:13:12.926322 kernel: rcu: RCU event tracing is enabled.
Sep 12 17:13:12.926333 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=2.
Sep 12 17:13:12.926344 kernel: Trampoline variant of Tasks RCU enabled.
Sep 12 17:13:12.926354 kernel: Tracing variant of Tasks RCU enabled.
Sep 12 17:13:12.926364 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Sep 12 17:13:12.926377 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=2
Sep 12 17:13:12.926388 kernel: NR_IRQS: 64, nr_irqs: 64, preallocated irqs: 0
Sep 12 17:13:12.926398 kernel: GICv3: 256 SPIs implemented
Sep 12 17:13:12.926408 kernel: GICv3: 0 Extended SPIs implemented
Sep 12 17:13:12.926418 kernel: Root IRQ handler: gic_handle_irq
Sep 12 17:13:12.926429 kernel: GICv3: GICv3 features: 16 PPIs, DirectLPI
Sep 12 17:13:12.926439 kernel: GICv3: CPU0: found redistributor 0 region 0:0x00000000080a0000
Sep 12 17:13:12.926449 kernel: ITS [mem 0x08080000-0x0809ffff]
Sep 12 17:13:12.926460 kernel: ITS@0x0000000008080000: allocated 8192 Devices @1000c0000 (indirect, esz 8, psz 64K, shr 1)
Sep 12 17:13:12.926471 kernel: ITS@0x0000000008080000: allocated 8192 Interrupt Collections @1000d0000 (flat, esz 8, psz 64K, shr 1)
Sep 12 17:13:12.926482 kernel: GICv3: using LPI property table @0x00000001000e0000
Sep 12 17:13:12.926492 kernel: GICv3: CPU0: using allocated LPI pending table @0x00000001000f0000
Sep 12 17:13:12.926505 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Sep 12 17:13:12.926515 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
Sep 12 17:13:12.926526 kernel: arch_timer: cp15 timer(s) running at 25.00MHz (virt).
Sep 12 17:13:12.926536 kernel: clocksource: arch_sys_counter: mask: 0xffffffffffffff max_cycles: 0x5c40939b5, max_idle_ns: 440795202646 ns
Sep 12 17:13:12.926546 kernel: sched_clock: 56 bits at 25MHz, resolution 40ns, wraps every 4398046511100ns
Sep 12 17:13:12.926557 kernel: Console: colour dummy device 80x25
Sep 12 17:13:12.926568 kernel: ACPI: Core revision 20230628
Sep 12 17:13:12.926579 kernel: Calibrating delay loop (skipped), value calculated using timer frequency.. 50.00 BogoMIPS (lpj=25000)
Sep 12 17:13:12.926590 kernel: pid_max: default: 32768 minimum: 301
Sep 12 17:13:12.926600 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,integrity
Sep 12 17:13:12.926613 kernel: landlock: Up and running.
Sep 12 17:13:12.926623 kernel: SELinux: Initializing.
Sep 12 17:13:12.926634 kernel: Mount-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
Sep 12 17:13:12.926644 kernel: Mountpoint-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
Sep 12 17:13:12.926655 kernel: RCU Tasks: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Sep 12 17:13:12.926666 kernel: RCU Tasks Trace: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Sep 12 17:13:12.926676 kernel: rcu: Hierarchical SRCU implementation.
Sep 12 17:13:12.926687 kernel: rcu: Max phase no-delay instances is 400.
Sep 12 17:13:12.926698 kernel: Platform MSI: ITS@0x8080000 domain created
Sep 12 17:13:12.926711 kernel: PCI/MSI: ITS@0x8080000 domain created
Sep 12 17:13:12.926721 kernel: Remapping and enabling EFI services.
Sep 12 17:13:12.926732 kernel: smp: Bringing up secondary CPUs ...
Sep 12 17:13:12.926742 kernel: Detected PIPT I-cache on CPU1
Sep 12 17:13:12.926752 kernel: GICv3: CPU1: found redistributor 1 region 0:0x00000000080c0000
Sep 12 17:13:12.926763 kernel: GICv3: CPU1: using allocated LPI pending table @0x0000000100100000
Sep 12 17:13:12.926773 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
Sep 12 17:13:12.926784 kernel: CPU1: Booted secondary processor 0x0000000001 [0x413fd0c1]
Sep 12 17:13:12.926794 kernel: smp: Brought up 1 node, 2 CPUs
Sep 12 17:13:12.926804 kernel: SMP: Total of 2 processors activated.
Sep 12 17:13:12.926817 kernel: CPU features: detected: 32-bit EL0 Support
Sep 12 17:13:12.926828 kernel: CPU features: detected: Data cache clean to the PoU not required for I/D coherence
Sep 12 17:13:12.926846 kernel: CPU features: detected: Common not Private translations
Sep 12 17:13:12.926860 kernel: CPU features: detected: CRC32 instructions
Sep 12 17:13:12.926871 kernel: CPU features: detected: Enhanced Virtualization Traps
Sep 12 17:13:12.926883 kernel: CPU features: detected: RCpc load-acquire (LDAPR)
Sep 12 17:13:12.926894 kernel: CPU features: detected: LSE atomic instructions
Sep 12 17:13:12.926905 kernel: CPU features: detected: Privileged Access Never
Sep 12 17:13:12.926916 kernel: CPU features: detected: RAS Extension Support
Sep 12 17:13:12.926929 kernel: CPU features: detected: Speculative Store Bypassing Safe (SSBS)
Sep 12 17:13:12.926940 kernel: CPU: All CPU(s) started at EL1
Sep 12 17:13:12.926951 kernel: alternatives: applying system-wide alternatives
Sep 12 17:13:12.926962 kernel: devtmpfs: initialized
Sep 12 17:13:12.926973 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Sep 12 17:13:12.926984 kernel: futex hash table entries: 512 (order: 3, 32768 bytes, linear)
Sep 12 17:13:12.926996 kernel: pinctrl core: initialized pinctrl subsystem
Sep 12 17:13:12.927009 kernel: SMBIOS 3.0.0 present.
Sep 12 17:13:12.927020 kernel: DMI: Hetzner vServer/KVM Virtual Machine, BIOS 20171111 11/11/2017
Sep 12 17:13:12.927031 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Sep 12 17:13:12.927042 kernel: DMA: preallocated 512 KiB GFP_KERNEL pool for atomic allocations
Sep 12 17:13:12.927053 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Sep 12 17:13:12.927084 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Sep 12 17:13:12.927096 kernel: audit: initializing netlink subsys (disabled)
Sep 12 17:13:12.927107 kernel: audit: type=2000 audit(0.016:1): state=initialized audit_enabled=0 res=1
Sep 12 17:13:12.927118 kernel: thermal_sys: Registered thermal governor 'step_wise'
Sep 12 17:13:12.927176 kernel: cpuidle: using governor menu
Sep 12 17:13:12.927188 kernel: hw-breakpoint: found 6 breakpoint and 4 watchpoint registers.
Sep 12 17:13:12.927199 kernel: ASID allocator initialised with 32768 entries
Sep 12 17:13:12.927210 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Sep 12 17:13:12.927221 kernel: Serial: AMBA PL011 UART driver
Sep 12 17:13:12.927232 kernel: Modules: 2G module region forced by RANDOMIZE_MODULE_REGION_FULL
Sep 12 17:13:12.927246 kernel: Modules: 0 pages in range for non-PLT usage
Sep 12 17:13:12.927258 kernel: Modules: 508992 pages in range for PLT usage
Sep 12 17:13:12.927265 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Sep 12 17:13:12.927275 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 1.00 GiB page
Sep 12 17:13:12.927283 kernel: HugeTLB: registered 32.0 MiB page size, pre-allocated 0 pages
Sep 12 17:13:12.927290 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 32.0 MiB page
Sep 12 17:13:12.927297 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Sep 12 17:13:12.927317 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 2.00 MiB page
Sep 12 17:13:12.927325 kernel: HugeTLB: registered 64.0 KiB page size, pre-allocated 0 pages
Sep 12 17:13:12.927333 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 64.0 KiB page
Sep 12 17:13:12.927340 kernel: ACPI: Added _OSI(Module Device)
Sep 12 17:13:12.927347 kernel: ACPI: Added _OSI(Processor Device)
Sep 12 17:13:12.927357 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Sep 12 17:13:12.927365 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Sep 12 17:13:12.927372 kernel: ACPI: Interpreter enabled
Sep 12 17:13:12.927380 kernel: ACPI: Using GIC for interrupt routing
Sep 12 17:13:12.927387 kernel: ACPI: MCFG table detected, 1 entries
Sep 12 17:13:12.927394 kernel: ARMH0011:00: ttyAMA0 at MMIO 0x9000000 (irq = 12, base_baud = 0) is a SBSA
Sep 12 17:13:12.927402 kernel: printk: console [ttyAMA0] enabled
Sep 12 17:13:12.927409 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Sep 12 17:13:12.927612 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3]
Sep 12 17:13:12.927699 kernel: acpi PNP0A08:00: _OSC: platform does not support [LTR]
Sep 12 17:13:12.927767 kernel: acpi PNP0A08:00: _OSC: OS now controls [PCIeHotplug PME AER PCIeCapability]
Sep 12 17:13:12.927834 kernel: acpi PNP0A08:00: ECAM area [mem 0x4010000000-0x401fffffff] reserved by PNP0C02:00
Sep 12 17:13:12.927898 kernel: acpi PNP0A08:00: ECAM at [mem 0x4010000000-0x401fffffff] for [bus 00-ff]
Sep 12 17:13:12.927908 kernel: ACPI: Remapped I/O 0x000000003eff0000 to [io 0x0000-0xffff window]
Sep 12 17:13:12.927915 kernel: PCI host bridge to bus 0000:00
Sep 12 17:13:12.927991 kernel: pci_bus 0000:00: root bus resource [mem 0x10000000-0x3efeffff window]
Sep 12 17:13:12.928056 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0xffff window]
Sep 12 17:13:12.928493 kernel: pci_bus 0000:00: root bus resource [mem 0x8000000000-0xffffffffff window]
Sep 12 17:13:12.928561 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Sep 12 17:13:12.928651 kernel: pci 0000:00:00.0: [1b36:0008] type 00 class 0x060000
Sep 12 17:13:12.928737 kernel: pci 0000:00:01.0: [1af4:1050] type 00 class 0x038000
Sep 12 17:13:12.928807 kernel: pci 0000:00:01.0: reg 0x14: [mem 0x11289000-0x11289fff]
Sep 12 17:13:12.928886 kernel: pci 0000:00:01.0: reg 0x20: [mem 0x8000600000-0x8000603fff 64bit pref]
Sep 12 17:13:12.928969 kernel: pci 0000:00:02.0: [1b36:000c] type 01 class 0x060400
Sep 12 17:13:12.929039 kernel: pci 0000:00:02.0: reg 0x10: [mem 0x11288000-0x11288fff]
Sep 12 17:13:12.929702 kernel: pci 0000:00:02.1: [1b36:000c] type 01 class 0x060400
Sep 12 17:13:12.929798 kernel: pci 0000:00:02.1: reg 0x10: [mem 0x11287000-0x11287fff]
Sep 12 17:13:12.929883 kernel: pci 0000:00:02.2: [1b36:000c] type 01 class 0x060400
Sep 12 17:13:12.929976 kernel: pci 0000:00:02.2: reg 0x10: [mem 0x11286000-0x11286fff]
Sep 12 17:13:12.930060 kernel: pci 0000:00:02.3: [1b36:000c] type 01 class 0x060400
Sep 12 17:13:12.930180 kernel: pci 0000:00:02.3: reg 0x10: [mem 0x11285000-0x11285fff]
Sep 12 17:13:12.930262 kernel: pci 0000:00:02.4: [1b36:000c] type 01 class 0x060400
Sep 12 17:13:12.930332 kernel: pci 0000:00:02.4: reg 0x10: [mem 0x11284000-0x11284fff]
Sep 12 17:13:12.930412 kernel: pci 0000:00:02.5: [1b36:000c] type 01 class 0x060400
Sep 12 17:13:12.930486 kernel: pci 0000:00:02.5: reg 0x10: [mem 0x11283000-0x11283fff]
Sep 12 17:13:12.930563 kernel: pci 0000:00:02.6: [1b36:000c] type 01 class 0x060400
Sep 12 17:13:12.930631 kernel: pci 0000:00:02.6: reg 0x10: [mem 0x11282000-0x11282fff]
Sep 12 17:13:12.930707 kernel: pci 0000:00:02.7: [1b36:000c] type 01 class 0x060400
Sep 12 17:13:12.930775 kernel: pci 0000:00:02.7: reg 0x10: [mem 0x11281000-0x11281fff]
Sep 12 17:13:12.930859 kernel: pci 0000:00:03.0: [1b36:000c] type 01 class 0x060400
Sep 12 17:13:12.930928 kernel: pci 0000:00:03.0: reg 0x10: [mem 0x11280000-0x11280fff]
Sep 12 17:13:12.931012 kernel: pci 0000:00:04.0: [1b36:0002] type 00 class 0x070002
Sep 12 17:13:12.931172 kernel: pci 0000:00:04.0: reg 0x10: [io 0x0000-0x0007]
Sep 12 17:13:12.931276 kernel: pci 0000:01:00.0: [1af4:1041] type 00 class 0x020000
Sep 12 17:13:12.931354 kernel: pci 0000:01:00.0: reg 0x14: [mem 0x11000000-0x11000fff]
Sep 12 17:13:12.931427 kernel: pci 0000:01:00.0: reg 0x20: [mem 0x8000000000-0x8000003fff 64bit pref]
Sep 12 17:13:12.931498 kernel: pci 0000:01:00.0: reg 0x30: [mem 0xfff80000-0xffffffff pref]
Sep 12 17:13:12.931592 kernel: pci 0000:02:00.0: [1b36:000d] type 00 class 0x0c0330
Sep 12 17:13:12.931663 kernel: pci 0000:02:00.0: reg 0x10: [mem 0x10e00000-0x10e03fff 64bit]
Sep 12 17:13:12.931758 kernel: pci 0000:03:00.0: [1af4:1043] type 00 class 0x078000
Sep 12 17:13:12.931831 kernel: pci 0000:03:00.0: reg 0x14: [mem 0x10c00000-0x10c00fff]
Sep 12 17:13:12.931902 kernel: pci 0000:03:00.0: reg 0x20: [mem 0x8000100000-0x8000103fff 64bit pref]
Sep 12 17:13:12.931986 kernel: pci 0000:04:00.0: [1af4:1045] type 00 class 0x00ff00
Sep 12 17:13:12.932057 kernel: pci 0000:04:00.0: reg 0x20: [mem 0x8000200000-0x8000203fff 64bit pref]
Sep 12 17:13:12.932185 kernel: pci 0000:05:00.0: [1af4:1044] type 00 class 0x00ff00
Sep 12 17:13:12.932261 kernel: pci 0000:05:00.0: reg 0x20: [mem 0x8000300000-0x8000303fff 64bit pref]
Sep 12 17:13:12.932339 kernel: pci 0000:06:00.0: [1af4:1048] type 00 class 0x010000
Sep 12 17:13:12.932410 kernel: pci 0000:06:00.0: reg 0x14: [mem 0x10600000-0x10600fff]
Sep 12 17:13:12.932480 kernel: pci 0000:06:00.0: reg 0x20: [mem 0x8000400000-0x8000403fff 64bit pref]
Sep 12 17:13:12.932565 kernel: pci 0000:07:00.0: [1af4:1041] type 00 class 0x020000
Sep 12 17:13:12.932642 kernel: pci 0000:07:00.0: reg 0x14: [mem 0x10400000-0x10400fff]
Sep 12 17:13:12.932711 kernel: pci 0000:07:00.0: reg 0x20: [mem 0x8000500000-0x8000503fff 64bit pref]
Sep 12 17:13:12.932780 kernel: pci 0000:07:00.0: reg 0x30: [mem 0xfff80000-0xffffffff pref]
Sep 12 17:13:12.932856 kernel: pci 0000:00:02.0: bridge window [io 0x1000-0x0fff] to [bus 01] add_size 1000
Sep 12 17:13:12.932925 kernel: pci 0000:00:02.0: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 01] add_size 100000 add_align 100000
Sep 12 17:13:12.932992 kernel: pci 0000:00:02.0: bridge window [mem 0x00100000-0x001fffff] to [bus 01] add_size 100000 add_align 100000
Sep 12 17:13:12.934300 kernel: pci 0000:00:02.1: bridge window [io 0x1000-0x0fff] to [bus 02] add_size 1000
Sep 12 17:13:12.934429 kernel: pci 0000:00:02.1: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 02] add_size 200000 add_align 100000
Sep 12 17:13:12.934498 kernel: pci 0000:00:02.1: bridge window [mem 0x00100000-0x001fffff] to [bus 02] add_size 100000 add_align 100000
Sep 12 17:13:12.934573 kernel: pci 0000:00:02.2: bridge window [io 0x1000-0x0fff] to [bus 03] add_size 1000
Sep 12 17:13:12.934642 kernel: pci 0000:00:02.2: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 03] add_size 100000 add_align 100000
Sep 12 17:13:12.934707 kernel: pci 0000:00:02.2: bridge window [mem 0x00100000-0x001fffff] to [bus 03] add_size 100000 add_align 100000
Sep 12 17:13:12.934781 kernel: pci 0000:00:02.3: bridge window [io 0x1000-0x0fff] to [bus 04] add_size 1000
Sep 12 17:13:12.934850 kernel: pci 0000:00:02.3: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 04] add_size 100000 add_align 100000
Sep 12 17:13:12.934927 kernel: pci 0000:00:02.3: bridge window [mem 0x00100000-0x000fffff] to [bus 04] add_size 200000 add_align 100000
Sep 12 17:13:12.935003 kernel: pci 0000:00:02.4: bridge window [io 0x1000-0x0fff] to [bus 05] add_size 1000
Sep 12 17:13:12.935095 kernel: pci 0000:00:02.4: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 05] add_size 100000 add_align 100000
Sep 12 17:13:12.935180 kernel: pci 0000:00:02.4: bridge window [mem 0x00100000-0x000fffff] to [bus 05] add_size 200000 add_align 100000
Sep 12 17:13:12.935255 kernel: pci 0000:00:02.5: bridge window [io 0x1000-0x0fff] to [bus 06] add_size 1000
Sep 12 17:13:12.935324 kernel: pci 0000:00:02.5: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 06] add_size 100000 add_align 100000
Sep 12 17:13:12.935392 kernel: pci 0000:00:02.5: bridge window [mem 0x00100000-0x001fffff] to [bus 06] add_size 100000 add_align 100000
Sep 12 17:13:12.935469 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x0fff] to [bus 07] add_size 1000
Sep 12 17:13:12.935535 kernel: pci 0000:00:02.6: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 07] add_size 100000 add_align 100000
Sep 12 17:13:12.935603 kernel: pci 0000:00:02.6: bridge window [mem 0x00100000-0x001fffff] to [bus 07] add_size 100000 add_align 100000
Sep 12 17:13:12.935685 kernel: pci 0000:00:02.7: bridge window [io 0x1000-0x0fff] to [bus 08] add_size 1000
Sep 12 17:13:12.935752 kernel: pci 0000:00:02.7: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 08] add_size 200000 add_align 100000
Sep 12 17:13:12.935818 kernel: pci 0000:00:02.7: bridge window [mem 0x00100000-0x000fffff] to [bus 08] add_size 200000 add_align 100000
Sep 12 17:13:12.935892 kernel: pci 0000:00:03.0: bridge window [io 0x1000-0x0fff] to [bus 09] add_size 1000
Sep 12 17:13:12.935969 kernel: pci 0000:00:03.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 09] add_size 200000 add_align 100000
Sep 12 17:13:12.936041 kernel: pci 0000:00:03.0: bridge window [mem 0x00100000-0x000fffff] to [bus 09] add_size 200000 add_align 100000
Sep 12 17:13:12.937370 kernel: pci 0000:00:02.0: BAR 14: assigned [mem 0x10000000-0x101fffff]
Sep 12 17:13:12.937463 kernel: pci 0000:00:02.0: BAR 15: assigned [mem 0x8000000000-0x80001fffff 64bit pref]
Sep 12 17:13:12.937535 kernel: pci 0000:00:02.1: BAR 14: assigned [mem 0x10200000-0x103fffff]
Sep 12 17:13:12.937617 kernel: pci 0000:00:02.1: BAR 15: assigned [mem 0x8000200000-0x80003fffff 64bit pref]
Sep 12 17:13:12.937691 kernel: pci 0000:00:02.2: BAR 14: assigned [mem 0x10400000-0x105fffff]
Sep 12 17:13:12.937761 kernel: pci 0000:00:02.2: BAR 15: assigned [mem 0x8000400000-0x80005fffff 64bit pref]
Sep 12 17:13:12.937845 kernel: pci 0000:00:02.3: BAR 14: assigned [mem 0x10600000-0x107fffff]
Sep 12 17:13:12.937914 kernel: pci 0000:00:02.3: BAR 15: assigned [mem 0x8000600000-0x80007fffff 64bit pref]
Sep 12 17:13:12.937986 kernel: pci 0000:00:02.4: BAR 14: assigned [mem 0x10800000-0x109fffff]
Sep 12 17:13:12.938057 kernel: pci 0000:00:02.4: BAR 15: assigned [mem 0x8000800000-0x80009fffff 64bit pref]
Sep 12 17:13:12.938158 kernel: pci 0000:00:02.5: BAR 14: assigned [mem 0x10a00000-0x10bfffff]
Sep 12 17:13:12.938228 kernel: pci 0000:00:02.5: BAR 15: assigned [mem 0x8000a00000-0x8000bfffff 64bit pref]
Sep 12 17:13:12.938303 kernel: pci 0000:00:02.6: BAR 14: assigned [mem 0x10c00000-0x10dfffff]
Sep 12 17:13:12.938376 kernel: pci 0000:00:02.6: BAR 15: assigned [mem 0x8000c00000-0x8000dfffff 64bit pref]
Sep 12 17:13:12.938448 kernel: pci 0000:00:02.7: BAR 14: assigned [mem 0x10e00000-0x10ffffff]
Sep 12 17:13:12.938516 kernel: pci 0000:00:02.7: BAR 15: assigned [mem 0x8000e00000-0x8000ffffff 64bit pref]
Sep 12 17:13:12.938586 kernel: pci 0000:00:03.0: BAR 14: assigned [mem 0x11000000-0x111fffff]
Sep 12 17:13:12.938654 kernel: pci 0000:00:03.0: BAR 15: assigned [mem 0x8001000000-0x80011fffff 64bit pref]
Sep 12 17:13:12.938730 kernel: pci 0000:00:01.0: BAR 4: assigned [mem 0x8001200000-0x8001203fff 64bit pref]
Sep 12 17:13:12.938803 kernel: pci 0000:00:01.0: BAR 1: assigned [mem 0x11200000-0x11200fff]
Sep 12 17:13:12.938871 kernel: pci 0000:00:02.0: BAR 0: assigned [mem 0x11201000-0x11201fff]
Sep 12 17:13:12.938939 kernel: pci 0000:00:02.0: BAR 13: assigned [io 0x1000-0x1fff]
Sep 12 17:13:12.939006 kernel: pci 0000:00:02.1: BAR 0: assigned [mem 0x11202000-0x11202fff]
Sep 12 17:13:12.941166 kernel: pci 0000:00:02.1: BAR 13: assigned [io 0x2000-0x2fff]
Sep 12 17:13:12.941299 kernel: pci 0000:00:02.2: BAR 0: assigned [mem 0x11203000-0x11203fff]
Sep 12 17:13:12.941374 kernel: pci 0000:00:02.2: BAR 13: assigned [io 0x3000-0x3fff]
Sep 12 17:13:12.941449 kernel: pci 0000:00:02.3: BAR 0: assigned [mem 0x11204000-0x11204fff]
Sep 12 17:13:12.941531 kernel: pci 0000:00:02.3: BAR 13: assigned [io 0x4000-0x4fff]
Sep 12 17:13:12.941606 kernel: pci 0000:00:02.4: BAR 0: assigned [mem 0x11205000-0x11205fff]
Sep 12 17:13:12.941675 kernel: pci 0000:00:02.4: BAR 13: assigned [io 0x5000-0x5fff]
Sep 12 17:13:12.941748 kernel: pci 0000:00:02.5: BAR 0: assigned [mem 0x11206000-0x11206fff]
Sep 12 17:13:12.941816 kernel: pci 0000:00:02.5: BAR 13: assigned [io 0x6000-0x6fff]
Sep 12 17:13:12.941892 kernel: pci 0000:00:02.6: BAR 0: assigned [mem 0x11207000-0x11207fff]
Sep 12 17:13:12.941960 kernel: pci 0000:00:02.6: BAR 13: assigned [io 0x7000-0x7fff]
Sep 12 17:13:12.942032 kernel: pci 0000:00:02.7: BAR 0: assigned [mem 0x11208000-0x11208fff]
Sep 12 17:13:12.942170 kernel: pci 0000:00:02.7: BAR 13: assigned [io 0x8000-0x8fff]
Sep 12 17:13:12.942258 kernel: pci 0000:00:03.0: BAR 0: assigned [mem 0x11209000-0x11209fff]
Sep 12 17:13:12.942328 kernel: pci 0000:00:03.0: BAR 13: assigned [io 0x9000-0x9fff]
Sep 12 17:13:12.942402 kernel: pci 0000:00:04.0: BAR 0: assigned [io 0xa000-0xa007]
Sep 12 17:13:12.942484 kernel: pci 0000:01:00.0: BAR 6: assigned [mem 0x10000000-0x1007ffff pref]
Sep 12 17:13:12.942558 kernel: pci 0000:01:00.0: BAR 4: assigned [mem 0x8000000000-0x8000003fff 64bit pref]
Sep 12 17:13:12.942629 kernel: pci 0000:01:00.0: BAR 1: assigned [mem 0x10080000-0x10080fff]
Sep 12 17:13:12.942703 kernel: pci 0000:00:02.0: PCI bridge to [bus 01]
Sep 12 17:13:12.942779 kernel: pci 0000:00:02.0: bridge window [io 0x1000-0x1fff]
Sep 12 17:13:12.942846 kernel: pci 0000:00:02.0: bridge window [mem 0x10000000-0x101fffff]
Sep 12 17:13:12.942914 kernel: pci 0000:00:02.0: bridge window [mem 0x8000000000-0x80001fffff 64bit pref]
Sep 12 17:13:12.942999 kernel: pci 0000:02:00.0: BAR 0: assigned [mem 0x10200000-0x10203fff 64bit]
Sep 12 17:13:12.944301 kernel: pci 0000:00:02.1: PCI bridge to [bus 02]
Sep 12 17:13:12.944423 kernel: pci 0000:00:02.1: bridge window [io 0x2000-0x2fff]
Sep 12 17:13:12.944492 kernel: pci 0000:00:02.1: bridge window [mem 0x10200000-0x103fffff]
Sep 12 17:13:12.944558 kernel: pci 0000:00:02.1: bridge window [mem 0x8000200000-0x80003fffff 64bit pref]
Sep 12 17:13:12.944641 kernel: pci 0000:03:00.0: BAR 4: assigned [mem 0x8000400000-0x8000403fff 64bit pref]
Sep 12 17:13:12.944730 kernel: pci 0000:03:00.0: BAR 1: assigned [mem 0x10400000-0x10400fff]
Sep 12 17:13:12.944804 kernel: pci 0000:00:02.2: PCI bridge to [bus 03]
Sep 12 17:13:12.944872 kernel: pci 0000:00:02.2: bridge window [io 0x3000-0x3fff]
Sep 12 17:13:12.944947 kernel: pci 0000:00:02.2: bridge window [mem 0x10400000-0x105fffff]
Sep 12 17:13:12.945015 kernel: pci 0000:00:02.2: bridge window [mem 0x8000400000-0x80005fffff 64bit pref]
Sep 12 17:13:12.945306 kernel: pci 0000:04:00.0: BAR 4: assigned [mem 0x8000600000-0x8000603fff 64bit pref]
Sep 12 17:13:12.945394 kernel: pci 0000:00:02.3: PCI bridge to [bus 04]
Sep 12 17:13:12.945469 kernel: pci 0000:00:02.3: bridge window [io 0x4000-0x4fff]
Sep 12 17:13:12.945540 kernel: pci 0000:00:02.3: bridge window [mem 0x10600000-0x107fffff]
Sep 12 17:13:12.945607 kernel: pci 0000:00:02.3: bridge window [mem 0x8000600000-0x80007fffff 64bit pref]
Sep 12 17:13:12.945683 kernel: pci 0000:05:00.0: BAR 4: assigned [mem 0x8000800000-0x8000803fff 64bit pref]
Sep 12 17:13:12.945758 kernel: pci 0000:00:02.4: PCI bridge to [bus 05]
Sep 12 17:13:12.945825 kernel: pci 0000:00:02.4: bridge window [io 0x5000-0x5fff]
Sep 12 17:13:12.945898 kernel: pci 0000:00:02.4: bridge window [mem 0x10800000-0x109fffff]
Sep 12 17:13:12.945964 kernel: pci 0000:00:02.4: bridge window [mem 0x8000800000-0x80009fffff 64bit pref]
Sep 12 17:13:12.946042 kernel: pci 0000:06:00.0: BAR 4: assigned [mem 0x8000a00000-0x8000a03fff 64bit pref]
Sep 12 17:13:12.946147 kernel: pci 0000:06:00.0: BAR 1: assigned [mem 0x10a00000-0x10a00fff]
Sep 12 17:13:12.946226 kernel: pci 0000:00:02.5: PCI bridge to [bus 06]
Sep 12 17:13:12.946293 kernel: pci 0000:00:02.5: bridge window [io 0x6000-0x6fff]
Sep 12 17:13:12.946374 kernel: pci 0000:00:02.5: bridge window [mem 0x10a00000-0x10bfffff]
Sep 12 17:13:12.946441 kernel: pci 0000:00:02.5: bridge window [mem 0x8000a00000-0x8000bfffff 64bit pref]
Sep 12 17:13:12.946521 kernel: pci 0000:07:00.0: BAR 6: assigned [mem 0x10c00000-0x10c7ffff pref]
Sep 12 17:13:12.946591 kernel: pci 0000:07:00.0: BAR 4: assigned [mem 0x8000c00000-0x8000c03fff 64bit pref]
Sep 12 17:13:12.946659 kernel: pci 0000:07:00.0: BAR 1: assigned [mem 0x10c80000-0x10c80fff]
Sep 12 17:13:12.946732 kernel: pci 0000:00:02.6: PCI bridge to [bus 07]
Sep 12 17:13:12.946798 kernel: pci 0000:00:02.6: bridge window [io 0x7000-0x7fff]
Sep 12 17:13:12.946863 kernel: pci 0000:00:02.6: bridge window [mem 0x10c00000-0x10dfffff]
Sep 12 17:13:12.946934 kernel: pci 0000:00:02.6: bridge window [mem 0x8000c00000-0x8000dfffff 64bit pref]
Sep 12 17:13:12.947006 kernel: pci 0000:00:02.7: PCI bridge to [bus 08]
Sep 12 17:13:12.947083 kernel: pci 0000:00:02.7: bridge window [io 0x8000-0x8fff]
Sep 12 17:13:12.947199 kernel: pci 0000:00:02.7: bridge window [mem 0x10e00000-0x10ffffff]
Sep 12 17:13:12.947281 kernel: pci 0000:00:02.7: bridge window [mem 0x8000e00000-0x8000ffffff 64bit pref]
Sep 12 17:13:12.947355 kernel: pci 0000:00:03.0: PCI bridge to [bus 09]
Sep 12 17:13:12.947423 kernel: pci 0000:00:03.0: bridge window [io 0x9000-0x9fff]
Sep 12 17:13:12.947491 kernel: pci 0000:00:03.0: bridge window [mem 0x11000000-0x111fffff]
Sep 12 17:13:12.947563 kernel: pci 0000:00:03.0: bridge window [mem 0x8001000000-0x80011fffff 64bit pref]
Sep 12 17:13:12.947639 kernel: pci_bus 0000:00: resource 4 [mem 0x10000000-0x3efeffff window]
Sep 12 17:13:12.947703 kernel: pci_bus 0000:00: resource 5 [io 0x0000-0xffff window]
Sep 12 17:13:12.947763 kernel: pci_bus 0000:00: resource 6 [mem 0x8000000000-0xffffffffff window]
Sep 12 17:13:12.947844 kernel: pci_bus 0000:01: resource 0 [io 0x1000-0x1fff]
Sep 12 17:13:12.947908 kernel: pci_bus 0000:01: resource 1 [mem 0x10000000-0x101fffff]
Sep 12 17:13:12.947971 kernel: pci_bus 0000:01: resource 2 [mem 0x8000000000-0x80001fffff 64bit pref]
Sep 12 17:13:12.948054 kernel: pci_bus 0000:02: resource 0 [io 0x2000-0x2fff]
Sep 12 17:13:12.948234 kernel: pci_bus 0000:02: resource 1 [mem 0x10200000-0x103fffff]
Sep 12 17:13:12.948296 kernel: pci_bus 0000:02: resource 2 [mem 0x8000200000-0x80003fffff 64bit pref]
Sep 12 17:13:12.948379 kernel: pci_bus 0000:03: resource 0 [io 0x3000-0x3fff]
Sep 12 17:13:12.948445 kernel: pci_bus 0000:03: resource 1 [mem 0x10400000-0x105fffff]
Sep 12 17:13:12.948530 kernel: pci_bus 0000:03: resource 2 [mem 0x8000400000-0x80005fffff 64bit pref]
Sep 12 17:13:12.948626 kernel: pci_bus 0000:04: resource 0 [io 0x4000-0x4fff]
Sep 12 17:13:12.948691 kernel: pci_bus 0000:04: resource 1 [mem 0x10600000-0x107fffff]
Sep 12 17:13:12.948784 kernel: pci_bus 0000:04: resource 2 [mem 0x8000600000-0x80007fffff 64bit pref]
Sep 12 17:13:12.949227 kernel: pci_bus 0000:05: resource 0 [io 0x5000-0x5fff]
Sep 12 17:13:12.949318 kernel: pci_bus 0000:05: resource 1 [mem 0x10800000-0x109fffff]
Sep 12 17:13:12.949380 kernel: pci_bus 0000:05: resource 2 [mem 0x8000800000-0x80009fffff 64bit pref]
Sep 12 17:13:12.949466 kernel: pci_bus 0000:06: resource 0 [io 0x6000-0x6fff]
Sep 12 17:13:12.949533 kernel: pci_bus 0000:06: resource 1 [mem 0x10a00000-0x10bfffff]
Sep 12 17:13:12.949596 kernel: pci_bus 0000:06: resource 2 [mem 0x8000a00000-0x8000bfffff 64bit pref]
Sep 12 17:13:12.949680 kernel: pci_bus 0000:07: resource 0 [io 0x7000-0x7fff]
Sep 12 17:13:12.949746 kernel: pci_bus 0000:07: resource 1 [mem 0x10c00000-0x10dfffff]
Sep 12 17:13:12.949811 kernel: pci_bus 0000:07: resource 2 [mem 0x8000c00000-0x8000dfffff 64bit pref]
Sep 12 17:13:12.949885 kernel: pci_bus 0000:08: resource 0 [io 0x8000-0x8fff]
Sep 12 17:13:12.949947 kernel: pci_bus 0000:08: resource 1 [mem 0x10e00000-0x10ffffff]
Sep 12 17:13:12.950010 kernel: pci_bus 0000:08: resource 2 [mem 0x8000e00000-0x8000ffffff 64bit pref]
Sep 12 17:13:12.950111 kernel: pci_bus 0000:09: resource 0 [io 0x9000-0x9fff]
Sep 12 17:13:12.950233 kernel: pci_bus 0000:09: resource 1 [mem 0x11000000-0x111fffff]
Sep 12 17:13:12.950298 kernel: pci_bus 0000:09: resource 2 [mem 0x8001000000-0x80011fffff 64bit pref]
Sep 12 17:13:12.950314 kernel: ACPI: PCI: Interrupt link GSI0 configured for IRQ 35
Sep 12 17:13:12.950322 kernel: ACPI: PCI: Interrupt link GSI1 configured for IRQ 36
Sep 12 17:13:12.950330 kernel: ACPI: PCI: Interrupt link GSI2 configured for IRQ 37
Sep 12 17:13:12.950338 kernel: ACPI: PCI: Interrupt link GSI3 configured for IRQ 38
Sep 12 17:13:12.950346 kernel: iommu: Default domain type: Translated
Sep 12 17:13:12.950354 kernel: iommu: DMA domain TLB invalidation policy: strict mode
Sep 12 17:13:12.950361 kernel: efivars: Registered efivars operations
Sep 12 17:13:12.950370 kernel: vgaarb: loaded
Sep 12 17:13:12.950378 kernel: clocksource: Switched to clocksource arch_sys_counter
Sep 12 17:13:12.950388 kernel: VFS: Disk quotas dquot_6.6.0
Sep 12 17:13:12.950396 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Sep 12 17:13:12.950404 kernel: pnp: PnP ACPI init
Sep 12 17:13:12.950498 kernel: system 00:00: [mem 0x4010000000-0x401fffffff window] could not be reserved
Sep 12 17:13:12.950510 kernel: pnp: PnP ACPI: found 1 devices
Sep 12 17:13:12.950518 kernel: NET: Registered PF_INET protocol family
Sep 12 17:13:12.950526 kernel: IP idents hash table entries: 65536 (order: 7, 524288 bytes, linear)
Sep 12 17:13:12.950534 kernel: tcp_listen_portaddr_hash hash table entries: 2048 (order: 3, 32768 bytes, linear)
Sep 12 17:13:12.950545 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Sep 12 17:13:12.950553 kernel: TCP established hash table entries: 32768 (order: 6, 262144 bytes, linear)
Sep 12 17:13:12.950573 kernel: TCP bind hash table entries: 32768 (order: 8, 1048576 bytes, linear)
Sep 12 17:13:12.950585 kernel: TCP: Hash tables configured (established 32768 bind 32768)
Sep 12 17:13:12.950596 kernel: UDP hash table entries: 2048 (order: 4, 65536 bytes, linear)
Sep 12 17:13:12.950604 kernel: UDP-Lite hash table entries: 2048 (order: 4, 65536 bytes, linear)
Sep 12 17:13:12.950613 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Sep 12 17:13:12.950705 kernel: pci 0000:02:00.0: enabling device (0000 -> 0002)
Sep 12 17:13:12.950719 kernel: PCI: CLS 0 bytes, default 64
Sep 12 17:13:12.950730 kernel: kvm [1]: HYP mode not available
Sep 12 17:13:12.950738 kernel: Initialise system trusted keyrings
Sep 12 17:13:12.950746 kernel: workingset: timestamp_bits=39 max_order=20 bucket_order=0
Sep 12 17:13:12.950754 kernel: Key type asymmetric registered
Sep 12 17:13:12.950762 kernel: Asymmetric key parser 'x509' registered
Sep 12 17:13:12.950770 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 250)
Sep 12 17:13:12.950778 kernel: io scheduler mq-deadline registered
Sep 12 17:13:12.950786 kernel: io scheduler kyber registered
Sep 12 17:13:12.950794 kernel: io scheduler bfq registered
Sep 12 17:13:12.950806 kernel: ACPI: \_SB_.PCI0.GSI2: Enabled at IRQ 37
Sep 12 17:13:12.950884 kernel: pcieport 0000:00:02.0: PME: Signaling with IRQ 50
Sep 12 17:13:12.950955 kernel: pcieport 0000:00:02.0: AER: enabled with IRQ 50
Sep 12 17:13:12.951024 kernel: pcieport 0000:00:02.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+
Sep 12 17:13:12.951714 kernel: pcieport 0000:00:02.1: PME: Signaling with IRQ 51
Sep 12 17:13:12.951798 kernel: pcieport 0000:00:02.1: AER: enabled with IRQ 51
Sep 12 17:13:12.951873 kernel: pcieport 0000:00:02.1: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+
Sep 12 17:13:12.951946 kernel: pcieport 0000:00:02.2: PME: Signaling with IRQ 52
Sep 12 17:13:12.952014 kernel: pcieport 0000:00:02.2: AER: enabled with IRQ 52
Sep 12 17:13:12.952651 kernel: pcieport 0000:00:02.2: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+
Sep 12 17:13:12.952750 kernel: pcieport 0000:00:02.3: PME: Signaling with IRQ 53
Sep 12 17:13:12.952820 kernel: pcieport 0000:00:02.3: AER: enabled with IRQ 53
Sep 12 17:13:12.952898 kernel: pcieport 0000:00:02.3: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+
Sep 12 17:13:12.952973 kernel: pcieport 0000:00:02.4: PME: Signaling with IRQ 54
Sep 12 17:13:12.953041 kernel: pcieport 0000:00:02.4: AER: enabled with IRQ 54
Sep 12 17:13:12.953154 kernel: pcieport 0000:00:02.4: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+
Sep 12 17:13:12.953231 kernel: pcieport 0000:00:02.5: PME: Signaling with IRQ 55
Sep 12 17:13:12.953299 kernel: pcieport 0000:00:02.5: AER: enabled with IRQ 55
Sep 12 17:13:12.953373 kernel: pcieport 0000:00:02.5: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+
Sep 12 17:13:12.953449 kernel: pcieport 0000:00:02.6: PME: Signaling with IRQ 56
Sep 12 17:13:12.953518 kernel: pcieport 0000:00:02.6: AER: enabled with IRQ 56
Sep 12 17:13:12.953584 kernel: pcieport 0000:00:02.6: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+
Sep 12 17:13:12.953657 kernel: pcieport 0000:00:02.7: PME: Signaling with IRQ 57
Sep 12 17:13:12.953725 kernel: pcieport 0000:00:02.7: AER: enabled with IRQ 57
Sep 12 17:13:12.953795 kernel: pcieport 0000:00:02.7: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+
Sep 12 17:13:12.953806 kernel: ACPI: \_SB_.PCI0.GSI3: Enabled at IRQ 38
Sep 12 17:13:12.953878 kernel: pcieport 0000:00:03.0: PME: Signaling with IRQ 58
Sep 12 17:13:12.953947 kernel: pcieport 0000:00:03.0: AER: enabled with IRQ 58
Sep 12 17:13:12.954015 kernel: pcieport 0000:00:03.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+
Sep 12 17:13:12.954029 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXSYBUS:00/PNP0C0C:00/input/input0
Sep 12 17:13:12.954037 kernel: ACPI: button: Power Button [PWRB]
Sep 12 17:13:12.954045 kernel: ACPI: \_SB_.PCI0.GSI1: Enabled at IRQ 36
Sep 12 17:13:12.954214 kernel: virtio-pci 0000:04:00.0: enabling device (0000 -> 0002)
Sep 12 17:13:12.954298 kernel: virtio-pci 0000:07:00.0: enabling device (0000 -> 0002)
Sep 12 17:13:12.954311 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Sep 12 17:13:12.954319 kernel: ACPI: \_SB_.PCI0.GSI0: Enabled at IRQ 35
Sep 12 17:13:12.954388 kernel: serial 0000:00:04.0: enabling device (0000 -> 0001)
Sep 12 17:13:12.954398 kernel: 0000:00:04.0: ttyS0 at I/O 0xa000 (irq = 45, base_baud = 115200) is a 16550A
Sep 12 17:13:12.954406 kernel: thunder_xcv, ver 1.0
Sep 12 17:13:12.954414 kernel: thunder_bgx, ver 1.0
Sep 12 17:13:12.954426 kernel: nicpf, ver 1.0
Sep 12 17:13:12.954434 kernel: nicvf, ver 1.0
Sep 12 17:13:12.954518 kernel: rtc-efi rtc-efi.0: registered as rtc0
Sep 12 17:13:12.954585 kernel: rtc-efi rtc-efi.0: setting system clock to 2025-09-12T17:13:12 UTC (1757697192)
Sep 12 17:13:12.954595 kernel: hid: raw HID events driver (C) Jiri Kosina
Sep 12 17:13:12.954604 kernel: hw perfevents: enabled with armv8_pmuv3_0 PMU driver, 7 counters available
Sep 12 17:13:12.954612 kernel: watchdog: Delayed init of the lockup detector failed: -19
Sep 12 17:13:12.954620 kernel: watchdog: Hard watchdog permanently disabled
Sep 12 17:13:12.954630 kernel: NET: Registered PF_INET6 protocol family
Sep 12 17:13:12.954638 kernel: Segment Routing with IPv6
Sep 12 17:13:12.954646 kernel: In-situ OAM (IOAM) with IPv6
Sep 12 17:13:12.954654 kernel: NET: Registered PF_PACKET protocol family
Sep 12 17:13:12.954662 kernel: Key type dns_resolver registered
Sep 12 17:13:12.954669 kernel: registered taskstats version 1
Sep 12 17:13:12.954677 kernel: Loading compiled-in X.509 certificates
Sep 12 17:13:12.954685 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.6.106-flatcar: 2d576b5e69e6c5de2f731966fe8b55173c144d02'
Sep 12 17:13:12.954693 kernel: Key type .fscrypt registered
Sep 12 17:13:12.954702 kernel: Key type fscrypt-provisioning registered
Sep 12 17:13:12.954710 kernel: ima: No TPM chip found, activating TPM-bypass!
Sep 12 17:13:12.954718 kernel: ima: Allocated hash algorithm: sha1
Sep 12 17:13:12.954726 kernel: ima: No architecture policies found
Sep 12 17:13:12.954734 kernel: alg: No test for fips(ansi_cprng) (fips_ansi_cprng)
Sep 12 17:13:12.954741 kernel: clk: Disabling unused clocks
Sep 12 17:13:12.954749 kernel: Freeing unused kernel memory: 39488K
Sep 12 17:13:12.954756 kernel: Run /init as init process
Sep 12 17:13:12.954764 kernel: with arguments:
Sep 12 17:13:12.954774 kernel: /init
Sep 12 17:13:12.954781 kernel: with environment:
Sep 12 17:13:12.954789 kernel: HOME=/
Sep 12 17:13:12.954796 kernel: TERM=linux
Sep 12 17:13:12.954804 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a
Sep 12 17:13:12.954814 systemd[1]: systemd 255 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT default-hierarchy=unified)
Sep 12 17:13:12.954824 systemd[1]: Detected virtualization kvm.
Sep 12 17:13:12.954833 systemd[1]: Detected architecture arm64.
Sep 12 17:13:12.954843 systemd[1]: Running in initrd.
Sep 12 17:13:12.954852 systemd[1]: No hostname configured, using default hostname.
Sep 12 17:13:12.954860 systemd[1]: Hostname set to .
Sep 12 17:13:12.954868 systemd[1]: Initializing machine ID from VM UUID.
Sep 12 17:13:12.954877 systemd[1]: Queued start job for default target initrd.target.
Sep 12 17:13:12.954885 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Sep 12 17:13:12.954894 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Sep 12 17:13:12.954903 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM...
Sep 12 17:13:12.954913 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Sep 12 17:13:12.954922 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT...
Sep 12 17:13:12.954931 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A...
Sep 12 17:13:12.954941 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132...
Sep 12 17:13:12.954949 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr...
Sep 12 17:13:12.954958 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Sep 12 17:13:12.954968 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Sep 12 17:13:12.954979 systemd[1]: Reached target paths.target - Path Units.
Sep 12 17:13:12.954987 systemd[1]: Reached target slices.target - Slice Units.
Sep 12 17:13:12.954995 systemd[1]: Reached target swap.target - Swaps.
Sep 12 17:13:12.955003 systemd[1]: Reached target timers.target - Timer Units.
Sep 12 17:13:12.955012 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket.
Sep 12 17:13:12.955020 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Sep 12 17:13:12.955029 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log).
Sep 12 17:13:12.955038 systemd[1]: Listening on systemd-journald.socket - Journal Socket.
Sep 12 17:13:12.955048 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Sep 12 17:13:12.955057 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Sep 12 17:13:12.955254 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Sep 12 17:13:12.955266 systemd[1]: Reached target sockets.target - Socket Units.
Sep 12 17:13:12.955275 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup...
Sep 12 17:13:12.955283 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Sep 12 17:13:12.955292 systemd[1]: Finished network-cleanup.service - Network Cleanup.
Sep 12 17:13:12.955301 systemd[1]: Starting systemd-fsck-usr.service...
Sep 12 17:13:12.955310 systemd[1]: Starting systemd-journald.service - Journal Service...
Sep 12 17:13:12.955326 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Sep 12 17:13:12.955334 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Sep 12 17:13:12.955343 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup.
Sep 12 17:13:12.955394 systemd-journald[236]: Collecting audit messages is disabled.
Sep 12 17:13:12.955420 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Sep 12 17:13:12.955429 systemd[1]: Finished systemd-fsck-usr.service.
Sep 12 17:13:12.955438 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
Sep 12 17:13:12.955449 systemd-journald[236]: Journal started
Sep 12 17:13:12.955474 systemd-journald[236]: Runtime Journal (/run/log/journal/a6f48d7de750421784bf4638b2eb75c0) is 8.0M, max 76.6M, 68.6M free.
Sep 12 17:13:12.949366 systemd-modules-load[237]: Inserted module 'overlay'
Sep 12 17:13:12.958093 systemd[1]: Started systemd-journald.service - Journal Service.
Sep 12 17:13:12.963174 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Sep 12 17:13:12.973096 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Sep 12 17:13:12.974618 systemd-modules-load[237]: Inserted module 'br_netfilter'
Sep 12 17:13:12.975309 kernel: Bridge firewalling registered
Sep 12 17:13:12.976494 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Sep 12 17:13:12.979769 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Sep 12 17:13:12.980762 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Sep 12 17:13:12.982004 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Sep 12 17:13:12.991449 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Sep 12 17:13:12.997015 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Sep 12 17:13:13.012559 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Sep 12 17:13:13.017136 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Sep 12 17:13:13.033702 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Sep 12 17:13:13.037634 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Sep 12 17:13:13.039626 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Sep 12 17:13:13.047476 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook...
Sep 12 17:13:13.066131 systemd-resolved[269]: Positive Trust Anchors:
Sep 12 17:13:13.066150 systemd-resolved[269]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Sep 12 17:13:13.066184 systemd-resolved[269]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Sep 12 17:13:13.071708 systemd-resolved[269]: Defaulting to hostname 'linux'.
Sep 12 17:13:13.075672 dracut-cmdline[273]: dracut-dracut-053
Sep 12 17:13:13.073022 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Sep 12 17:13:13.077541 dracut-cmdline[273]: Using kernel command line parameters: rd.driver.pre=btrfs BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyAMA0,115200n8 flatcar.first_boot=detected acpi=force flatcar.oem.id=hetzner verity.usrhash=1e63d3057914877efa0eb5f75703bd3a3d4c120bdf4a7ab97f41083e29183e56
Sep 12 17:13:13.073786 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Sep 12 17:13:13.168144 kernel: SCSI subsystem initialized
Sep 12 17:13:13.173119 kernel: Loading iSCSI transport class v2.0-870.
Sep 12 17:13:13.181148 kernel: iscsi: registered transport (tcp)
Sep 12 17:13:13.195146 kernel: iscsi: registered transport (qla4xxx)
Sep 12 17:13:13.195220 kernel: QLogic iSCSI HBA Driver
Sep 12 17:13:13.253139 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook.
Sep 12 17:13:13.263389 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook...
Sep 12 17:13:13.286702 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Sep 12 17:13:13.286785 kernel: device-mapper: uevent: version 1.0.3
Sep 12 17:13:13.286797 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@redhat.com
Sep 12 17:13:13.339140 kernel: raid6: neonx8 gen() 15678 MB/s
Sep 12 17:13:13.356150 kernel: raid6: neonx4 gen() 11975 MB/s
Sep 12 17:13:13.373111 kernel: raid6: neonx2 gen() 13149 MB/s
Sep 12 17:13:13.390138 kernel: raid6: neonx1 gen() 10454 MB/s
Sep 12 17:13:13.407108 kernel: raid6: int64x8 gen() 6922 MB/s
Sep 12 17:13:13.424132 kernel: raid6: int64x4 gen() 7313 MB/s
Sep 12 17:13:13.441112 kernel: raid6: int64x2 gen() 6099 MB/s
Sep 12 17:13:13.458103 kernel: raid6: int64x1 gen() 5030 MB/s
Sep 12 17:13:13.458175 kernel: raid6: using algorithm neonx8 gen() 15678 MB/s
Sep 12 17:13:13.475141 kernel: raid6: .... xor() 11947 MB/s, rmw enabled
Sep 12 17:13:13.475199 kernel: raid6: using neon recovery algorithm
Sep 12 17:13:13.480094 kernel: xor: measuring software checksum speed
Sep 12 17:13:13.481316 kernel: 8regs : 16424 MB/sec
Sep 12 17:13:13.481360 kernel: 32regs : 19660 MB/sec
Sep 12 17:13:13.481379 kernel: arm64_neon : 26848 MB/sec
Sep 12 17:13:13.481397 kernel: xor: using function: arm64_neon (26848 MB/sec)
Sep 12 17:13:13.533131 kernel: Btrfs loaded, zoned=no, fsverity=no
Sep 12 17:13:13.551005 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook.
Sep 12 17:13:13.560430 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Sep 12 17:13:13.574626 systemd-udevd[455]: Using default interface naming scheme 'v255'.
Sep 12 17:13:13.578193 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Sep 12 17:13:13.588938 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook...
Sep 12 17:13:13.605930 dracut-pre-trigger[460]: rd.md=0: removing MD RAID activation
Sep 12 17:13:13.649363 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook.
Sep 12 17:13:13.658338 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Sep 12 17:13:13.712000 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Sep 12 17:13:13.723387 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook...
Sep 12 17:13:13.744905 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook.
Sep 12 17:13:13.752161 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems.
Sep 12 17:13:13.754568 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Sep 12 17:13:13.755396 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Sep 12 17:13:13.761409 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook...
Sep 12 17:13:13.788472 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook.
Sep 12 17:13:13.819658 kernel: scsi host0: Virtio SCSI HBA
Sep 12 17:13:13.819847 kernel: scsi 0:0:0:0: CD-ROM QEMU QEMU CD-ROM 2.5+ PQ: 0 ANSI: 5
Sep 12 17:13:13.824454 kernel: scsi 0:0:0:1: Direct-Access QEMU QEMU HARDDISK 2.5+ PQ: 0 ANSI: 5
Sep 12 17:13:13.874545 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
Sep 12 17:13:13.874689 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Sep 12 17:13:13.877515 systemd[1]: Stopping dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Sep 12 17:13:13.879200 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Sep 12 17:13:13.879404 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Sep 12 17:13:13.880687 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup...
Sep 12 17:13:13.891750 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Sep 12 17:13:13.904617 kernel: ACPI: bus type USB registered
Sep 12 17:13:13.904710 kernel: usbcore: registered new interface driver usbfs
Sep 12 17:13:13.904736 kernel: sr 0:0:0:0: Power-on or device reset occurred
Sep 12 17:13:13.905166 kernel: usbcore: registered new interface driver hub
Sep 12 17:13:13.905180 kernel: usbcore: registered new device driver usb
Sep 12 17:13:13.913147 kernel: sr 0:0:0:0: [sr0] scsi3-mmc drive: 16x/50x cd/rw xa/form2 cdda tray
Sep 12 17:13:13.913457 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
Sep 12 17:13:13.922108 kernel: sr 0:0:0:0: Attached scsi CD-ROM sr0
Sep 12 17:13:13.928543 kernel: sd 0:0:0:1: Power-on or device reset occurred
Sep 12 17:13:13.928837 kernel: sd 0:0:0:1: [sda] 80003072 512-byte logical blocks: (41.0 GB/38.1 GiB)
Sep 12 17:13:13.930428 kernel: sd 0:0:0:1: [sda] Write Protect is off
Sep 12 17:13:13.930699 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller
Sep 12 17:13:13.930849 kernel: sd 0:0:0:1: [sda] Mode Sense: 63 00 00 08
Sep 12 17:13:13.932829 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 1
Sep 12 17:13:13.932987 kernel: xhci_hcd 0000:02:00.0: hcc params 0x00087001 hci version 0x100 quirks 0x0000000000000010
Sep 12 17:13:13.934758 kernel: sd 0:0:0:1: [sda] Write cache: enabled, read cache: enabled, doesn't support DPO or FUA
Sep 12 17:13:13.934874 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller
Sep 12 17:13:13.934969 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 2
Sep 12 17:13:13.935057 kernel: xhci_hcd 0000:02:00.0: Host supports USB 3.0 SuperSpeed
Sep 12 17:13:13.939747 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Sep 12 17:13:13.945335 kernel: hub 1-0:1.0: USB hub found
Sep 12 17:13:13.945599 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk.
Sep 12 17:13:13.945611 kernel: GPT:17805311 != 80003071
Sep 12 17:13:13.945620 kernel: GPT:Alternate GPT header not at the end of the disk.
Sep 12 17:13:13.945629 kernel: hub 1-0:1.0: 4 ports detected
Sep 12 17:13:13.945715 kernel: GPT:17805311 != 80003071
Sep 12 17:13:13.945724 kernel: usb usb2: We don't know the algorithms for LPM for this host, disabling LPM.
Sep 12 17:13:13.949197 kernel: hub 2-0:1.0: USB hub found
Sep 12 17:13:13.949462 kernel: hub 2-0:1.0: 4 ports detected
Sep 12 17:13:13.949558 kernel: GPT: Use GNU Parted to correct GPT errors.
Sep 12 17:13:13.949571 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
Sep 12 17:13:13.949341 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Sep 12 17:13:13.954551 kernel: sd 0:0:0:1: [sda] Attached SCSI disk
Sep 12 17:13:13.998804 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Sep 12 17:13:14.018166 kernel: BTRFS: device fsid 5a23a06a-00d4-4606-89bf-13e31a563129 devid 1 transid 36 /dev/sda3 scanned by (udev-worker) (514)
Sep 12 17:13:14.022281 kernel: BTRFS: device label OEM devid 1 transid 9 /dev/sda6 scanned by (udev-worker) (505)
Sep 12 17:13:14.025994 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - QEMU_HARDDISK EFI-SYSTEM.
Sep 12 17:13:14.038774 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - QEMU_HARDDISK ROOT.
Sep 12 17:13:14.045521 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - QEMU_HARDDISK USR-A.
Sep 12 17:13:14.046506 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - QEMU_HARDDISK USR-A.
Sep 12 17:13:14.056162 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - QEMU_HARDDISK OEM.
Sep 12 17:13:14.064797 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary...
Sep 12 17:13:14.079172 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
Sep 12 17:13:14.080618 disk-uuid[570]: Primary Header is updated.
Sep 12 17:13:14.080618 disk-uuid[570]: Secondary Entries is updated.
Sep 12 17:13:14.080618 disk-uuid[570]: Secondary Header is updated.
Sep 12 17:13:14.187104 kernel: usb 1-1: new high-speed USB device number 2 using xhci_hcd
Sep 12 17:13:14.322970 kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:02.1/0000:02:00.0/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input1
Sep 12 17:13:14.323104 kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:02:00.0-1/input0
Sep 12 17:13:14.323514 kernel: usbcore: registered new interface driver usbhid
Sep 12 17:13:14.323542 kernel: usbhid: USB HID core driver
Sep 12 17:13:14.429204 kernel: usb 1-2: new high-speed USB device number 3 using xhci_hcd
Sep 12 17:13:14.559124 kernel: input: QEMU QEMU USB Keyboard as /devices/pci0000:00/0000:00:02.1/0000:02:00.0/usb1/1-2/1-2:1.0/0003:0627:0001.0002/input/input2
Sep 12 17:13:14.613104 kernel: hid-generic 0003:0627:0001.0002: input,hidraw1: USB HID v1.11 Keyboard [QEMU QEMU USB Keyboard] on usb-0000:02:00.0-2/input0
Sep 12 17:13:15.105195 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
Sep 12 17:13:15.106678 disk-uuid[571]: The operation has completed successfully.
Sep 12 17:13:15.160419 systemd[1]: disk-uuid.service: Deactivated successfully.
Sep 12 17:13:15.161327 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary.
Sep 12 17:13:15.179368 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr...
Sep 12 17:13:15.194988 sh[590]: Success
Sep 12 17:13:15.213160 kernel: device-mapper: verity: sha256 using implementation "sha256-ce"
Sep 12 17:13:15.289884 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr.
Sep 12 17:13:15.293196 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr...
Sep 12 17:13:15.295039 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr.
Sep 12 17:13:15.312707 kernel: BTRFS info (device dm-0): first mount of filesystem 5a23a06a-00d4-4606-89bf-13e31a563129
Sep 12 17:13:15.312801 kernel: BTRFS info (device dm-0): using crc32c (crc32c-generic) checksum algorithm
Sep 12 17:13:15.312828 kernel: BTRFS warning (device dm-0): 'nologreplay' is deprecated, use 'rescue=nologreplay' instead
Sep 12 17:13:15.312865 kernel: BTRFS info (device dm-0): disabling log replay at mount time
Sep 12 17:13:15.314098 kernel: BTRFS info (device dm-0): using free space tree
Sep 12 17:13:15.319098 kernel: BTRFS info (device dm-0): enabling ssd optimizations
Sep 12 17:13:15.321017 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr.
Sep 12 17:13:15.322999 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met.
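The earlier GPT warnings are expected on first boot: the backup GPT header must sit in the disk's last sector, but the image was written for a smaller disk and then grown, so the primary header still points at LBA 17805311 while the 80003072-sector disk ends at LBA 80003071. The disk-uuid.service run above rewrites the headers ("Primary Header is updated" etc.); on other systems the kernel's hint applies and GNU Parted or sgdisk can relocate the backup header. The arithmetic behind the complaint, as a small sketch:

```python
# The disk reports 80003072 512-byte logical blocks (see the sd 0:0:0:1 line),
# so the backup GPT header belongs in the last addressable sector.
total_sectors = 80003072
expected_alt_lba = total_sectors - 1   # 80003071, the true last LBA
recorded_alt_lba = 17805311            # what the stale primary header records

assert expected_alt_lba == 80003071
print(f"GPT:{recorded_alt_lba} != {expected_alt_lba}")  # the kernel's message
```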
Sep 12 17:13:15.330315 systemd[1]: Starting ignition-setup.service - Ignition (setup)...
Sep 12 17:13:15.334528 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline...
Sep 12 17:13:15.349099 kernel: BTRFS info (device sda6): first mount of filesystem daec7f45-8bde-44bd-bec0-4b8eac931d0c
Sep 12 17:13:15.349323 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm
Sep 12 17:13:15.350096 kernel: BTRFS info (device sda6): using free space tree
Sep 12 17:13:15.357492 kernel: BTRFS info (device sda6): enabling ssd optimizations
Sep 12 17:13:15.357584 kernel: BTRFS info (device sda6): auto enabling async discard
Sep 12 17:13:15.370879 kernel: BTRFS info (device sda6): last unmount of filesystem daec7f45-8bde-44bd-bec0-4b8eac931d0c
Sep 12 17:13:15.370559 systemd[1]: mnt-oem.mount: Deactivated successfully.
Sep 12 17:13:15.378928 systemd[1]: Finished ignition-setup.service - Ignition (setup).
Sep 12 17:13:15.385523 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)...
Sep 12 17:13:15.493213 ignition[678]: Ignition 2.19.0
Sep 12 17:13:15.493225 ignition[678]: Stage: fetch-offline
Sep 12 17:13:15.493272 ignition[678]: no configs at "/usr/lib/ignition/base.d"
Sep 12 17:13:15.493282 ignition[678]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Sep 12 17:13:15.497114 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline).
Sep 12 17:13:15.493719 ignition[678]: parsed url from cmdline: ""
Sep 12 17:13:15.493723 ignition[678]: no config URL provided
Sep 12 17:13:15.493728 ignition[678]: reading system config file "/usr/lib/ignition/user.ign"
Sep 12 17:13:15.493737 ignition[678]: no config at "/usr/lib/ignition/user.ign"
Sep 12 17:13:15.493743 ignition[678]: failed to fetch config: resource requires networking
Sep 12 17:13:15.494093 ignition[678]: Ignition finished successfully
Sep 12 17:13:15.505266 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Sep 12 17:13:15.511310 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Sep 12 17:13:15.533588 systemd-networkd[779]: lo: Link UP
Sep 12 17:13:15.533601 systemd-networkd[779]: lo: Gained carrier
Sep 12 17:13:15.535427 systemd-networkd[779]: Enumeration completed
Sep 12 17:13:15.535914 systemd-networkd[779]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Sep 12 17:13:15.535917 systemd-networkd[779]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
Sep 12 17:13:15.537875 systemd[1]: Started systemd-networkd.service - Network Configuration.
Sep 12 17:13:15.538626 systemd-networkd[779]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Sep 12 17:13:15.538629 systemd-networkd[779]: eth1: Configuring with /usr/lib/systemd/network/zz-default.network.
Sep 12 17:13:15.539256 systemd-networkd[779]: eth0: Link UP
Sep 12 17:13:15.539259 systemd-networkd[779]: eth0: Gained carrier
Sep 12 17:13:15.539268 systemd-networkd[779]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Sep 12 17:13:15.540244 systemd[1]: Reached target network.target - Network.
Sep 12 17:13:15.545449 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)...
Sep 12 17:13:15.546528 systemd-networkd[779]: eth1: Link UP
Sep 12 17:13:15.546531 systemd-networkd[779]: eth1: Gained carrier
Sep 12 17:13:15.546544 systemd-networkd[779]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Sep 12 17:13:15.564982 ignition[782]: Ignition 2.19.0
Sep 12 17:13:15.564995 ignition[782]: Stage: fetch
Sep 12 17:13:15.565387 ignition[782]: no configs at "/usr/lib/ignition/base.d"
Sep 12 17:13:15.565404 ignition[782]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Sep 12 17:13:15.565554 ignition[782]: parsed url from cmdline: ""
Sep 12 17:13:15.565560 ignition[782]: no config URL provided
Sep 12 17:13:15.565568 ignition[782]: reading system config file "/usr/lib/ignition/user.ign"
Sep 12 17:13:15.565579 ignition[782]: no config at "/usr/lib/ignition/user.ign"
Sep 12 17:13:15.565612 ignition[782]: GET http://169.254.169.254/hetzner/v1/userdata: attempt #1
Sep 12 17:13:15.566455 ignition[782]: GET error: Get "http://169.254.169.254/hetzner/v1/userdata": dial tcp 169.254.169.254:80: connect: network is unreachable
Sep 12 17:13:15.577164 systemd-networkd[779]: eth1: DHCPv4 address 10.0.0.3/32 acquired from 10.0.0.1
Sep 12 17:13:15.602188 systemd-networkd[779]: eth0: DHCPv4 address 49.13.6.100/32, gateway 172.31.1.1 acquired from 172.31.1.1
Sep 12 17:13:15.767573 ignition[782]: GET http://169.254.169.254/hetzner/v1/userdata: attempt #2
Sep 12 17:13:15.772440 ignition[782]: GET result: OK
Sep 12 17:13:15.772547 ignition[782]: parsing config with SHA512: 7cf851f506e098f476d6cd78aef6175726f3aceea85a5025d2c59dfddbcdbf29eaf6617911df565b446e81e912d299e603b1f3b7ac1a919864a8770b9fbfe0a1
Sep 12 17:13:15.780003 unknown[782]: fetched base config from "system"
Sep 12 17:13:15.780501 ignition[782]: fetch: fetch complete
Sep 12 17:13:15.780015 unknown[782]: fetched base config from "system"
Sep 12 17:13:15.780507 ignition[782]: fetch: fetch passed
Sep 12 17:13:15.780021 unknown[782]: fetched user config from "hetzner"
Sep 12 17:13:15.780565 ignition[782]: Ignition finished successfully
Sep 12 17:13:15.782554 systemd[1]: Finished ignition-fetch.service - Ignition (fetch).
Sep 12 17:13:15.789331 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)...
Sep 12 17:13:15.804672 ignition[789]: Ignition 2.19.0
Sep 12 17:13:15.804692 ignition[789]: Stage: kargs
Sep 12 17:13:15.804940 ignition[789]: no configs at "/usr/lib/ignition/base.d"
Sep 12 17:13:15.804951 ignition[789]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Sep 12 17:13:15.806298 ignition[789]: kargs: kargs passed
Sep 12 17:13:15.806369 ignition[789]: Ignition finished successfully
Sep 12 17:13:15.808800 systemd[1]: Finished ignition-kargs.service - Ignition (kargs).
Sep 12 17:13:15.819525 systemd[1]: Starting ignition-disks.service - Ignition (disks)...
Sep 12 17:13:15.833535 ignition[796]: Ignition 2.19.0
Sep 12 17:13:15.833547 ignition[796]: Stage: disks
Sep 12 17:13:15.833766 ignition[796]: no configs at "/usr/lib/ignition/base.d"
Sep 12 17:13:15.833777 ignition[796]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Sep 12 17:13:15.837055 systemd[1]: Finished ignition-disks.service - Ignition (disks).
Sep 12 17:13:15.834883 ignition[796]: disks: disks passed
Sep 12 17:13:15.839523 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device.
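Attempt #1 above fails with "network is unreachable" simply because Ignition's fetch stage races DHCP; once eth0/eth1 acquire addresses, attempt #2 succeeds. A rough, hypothetical re-implementation of that fetch-with-retry pattern against the Hetzner link-local metadata endpoint (this is not Ignition's actual code, and the timeout/backoff values are invented):

```python
import time
import urllib.request

USERDATA_URL = "http://169.254.169.254/hetzner/v1/userdata"

def fetch_userdata(max_attempts: int = 5, delay: float = 0.2) -> bytes:
    """Retry the metadata GET until networking is up, with linear backoff."""
    for attempt in range(1, max_attempts + 1):
        try:
            with urllib.request.urlopen(USERDATA_URL, timeout=5) as resp:
                return resp.read()
        except OSError as err:  # covers "network is unreachable", timeouts, ...
            print(f"GET {USERDATA_URL}: attempt #{attempt} failed: {err}")
            time.sleep(delay * attempt)
    raise RuntimeError("failed to fetch config: resource requires networking")
```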
Sep 12 17:13:15.834944 ignition[796]: Ignition finished successfully
Sep 12 17:13:15.842778 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems.
Sep 12 17:13:15.844216 systemd[1]: Reached target local-fs.target - Local File Systems.
Sep 12 17:13:15.844795 systemd[1]: Reached target sysinit.target - System Initialization.
Sep 12 17:13:15.845931 systemd[1]: Reached target basic.target - Basic System.
Sep 12 17:13:15.853385 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT...
Sep 12 17:13:15.876593 systemd-fsck[804]: ROOT: clean, 14/1628000 files, 120691/1617920 blocks
Sep 12 17:13:15.880592 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT.
Sep 12 17:13:15.890235 systemd[1]: Mounting sysroot.mount - /sysroot...
Sep 12 17:13:15.943110 kernel: EXT4-fs (sda9): mounted filesystem fc6c61a7-153d-4e7f-95c0-bffdb4824d71 r/w with ordered data mode. Quota mode: none.
Sep 12 17:13:15.944493 systemd[1]: Mounted sysroot.mount - /sysroot.
Sep 12 17:13:15.945689 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System.
Sep 12 17:13:15.955260 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Sep 12 17:13:15.959220 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr...
Sep 12 17:13:15.961259 systemd[1]: Starting flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent...
Sep 12 17:13:15.964210 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot).
Sep 12 17:13:15.964247 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup.
Sep 12 17:13:15.972123 kernel: BTRFS: device label OEM devid 1 transid 10 /dev/sda6 scanned by mount (812)
Sep 12 17:13:15.974521 kernel: BTRFS info (device sda6): first mount of filesystem daec7f45-8bde-44bd-bec0-4b8eac931d0c
Sep 12 17:13:15.974584 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm
Sep 12 17:13:15.974597 kernel: BTRFS info (device sda6): using free space tree
Sep 12 17:13:15.983146 kernel: BTRFS info (device sda6): enabling ssd optimizations
Sep 12 17:13:15.983208 kernel: BTRFS info (device sda6): auto enabling async discard
Sep 12 17:13:15.984629 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr.
Sep 12 17:13:15.990478 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Sep 12 17:13:16.001676 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup...
Sep 12 17:13:16.047364 initrd-setup-root[840]: cut: /sysroot/etc/passwd: No such file or directory
Sep 12 17:13:16.051263 coreos-metadata[814]: Sep 12 17:13:16.051 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/hostname: Attempt #1
Sep 12 17:13:16.054537 coreos-metadata[814]: Sep 12 17:13:16.054 INFO Fetch successful
Sep 12 17:13:16.054537 coreos-metadata[814]: Sep 12 17:13:16.054 INFO wrote hostname ci-4081-3-6-2-0999f1dc3d to /sysroot/etc/hostname
Sep 12 17:13:16.056055 systemd[1]: Finished flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent.
Sep 12 17:13:16.061502 initrd-setup-root[848]: cut: /sysroot/etc/group: No such file or directory
Sep 12 17:13:16.067624 initrd-setup-root[855]: cut: /sysroot/etc/shadow: No such file or directory
Sep 12 17:13:16.072122 initrd-setup-root[862]: cut: /sysroot/etc/gshadow: No such file or directory
Sep 12 17:13:16.174172 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup.
Sep 12 17:13:16.189370 systemd[1]: Starting ignition-mount.service - Ignition (mount)...
Sep 12 17:13:16.193573 systemd[1]: Starting sysroot-boot.service - /sysroot/boot...
Sep 12 17:13:16.203179 kernel: BTRFS info (device sda6): last unmount of filesystem daec7f45-8bde-44bd-bec0-4b8eac931d0c
Sep 12 17:13:16.227160 ignition[929]: INFO : Ignition 2.19.0
Sep 12 17:13:16.227160 ignition[929]: INFO : Stage: mount
Sep 12 17:13:16.227160 ignition[929]: INFO : no configs at "/usr/lib/ignition/base.d"
Sep 12 17:13:16.227160 ignition[929]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Sep 12 17:13:16.232223 ignition[929]: INFO : mount: mount passed
Sep 12 17:13:16.232223 ignition[929]: INFO : Ignition finished successfully
Sep 12 17:13:16.229996 systemd[1]: Finished ignition-mount.service - Ignition (mount).
Sep 12 17:13:16.242307 systemd[1]: Starting ignition-files.service - Ignition (files)...
Sep 12 17:13:16.245789 systemd[1]: Finished sysroot-boot.service - /sysroot/boot.
Sep 12 17:13:16.313877 systemd[1]: sysroot-oem.mount: Deactivated successfully.
Sep 12 17:13:16.324475 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Sep 12 17:13:16.335105 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/sda6 scanned by mount (941)
Sep 12 17:13:16.337375 kernel: BTRFS info (device sda6): first mount of filesystem daec7f45-8bde-44bd-bec0-4b8eac931d0c
Sep 12 17:13:16.337426 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm
Sep 12 17:13:16.337451 kernel: BTRFS info (device sda6): using free space tree
Sep 12 17:13:16.341169 kernel: BTRFS info (device sda6): enabling ssd optimizations
Sep 12 17:13:16.341239 kernel: BTRFS info (device sda6): auto enabling async discard
Sep 12 17:13:16.343946 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
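The files stage that follows applies the userdata fetched earlier: it writes the core user's SSH keys, several files under /home/core and /opt, and the prepare-helm.service unit, then enables its preset. For orientation, a minimal spec-v3 JSON config of the kind Ignition 2.19.0 consumes; the key, file contents, and unit body below are placeholders, not what this machine actually received:

```json
{
  "ignition": { "version": "3.4.0" },
  "passwd": {
    "users": [
      { "name": "core", "sshAuthorizedKeys": ["ssh-ed25519 AAAA... placeholder"] }
    ]
  },
  "storage": {
    "files": [
      {
        "path": "/home/core/install.sh",
        "mode": 493,
        "contents": { "source": "data:,%23%21%2Fbin%2Fbash%0A" }
      }
    ]
  },
  "systemd": {
    "units": [
      {
        "name": "prepare-helm.service",
        "enabled": true,
        "contents": "[Unit]\nDescription=Unpack helm (placeholder)\n[Service]\nType=oneshot\nExecStart=/usr/bin/tar xf /opt/helm-v3.17.0-linux-arm64.tar.gz -C /opt/bin\n[Install]\nWantedBy=multi-user.target\n"
      }
    ]
  }
}
```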
Sep 12 17:13:16.372089 ignition[958]: INFO : Ignition 2.19.0
Sep 12 17:13:16.372089 ignition[958]: INFO : Stage: files
Sep 12 17:13:16.374166 ignition[958]: INFO : no configs at "/usr/lib/ignition/base.d"
Sep 12 17:13:16.374166 ignition[958]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Sep 12 17:13:16.374166 ignition[958]: DEBUG : files: compiled without relabeling support, skipping
Sep 12 17:13:16.377791 ignition[958]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core"
Sep 12 17:13:16.377791 ignition[958]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core"
Sep 12 17:13:16.382749 ignition[958]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core"
Sep 12 17:13:16.384310 ignition[958]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core"
Sep 12 17:13:16.385711 unknown[958]: wrote ssh authorized keys file for user: core
Sep 12 17:13:16.386906 ignition[958]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core"
Sep 12 17:13:16.389187 ignition[958]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.17.0-linux-arm64.tar.gz"
Sep 12 17:13:16.391496 ignition[958]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.17.0-linux-arm64.tar.gz: attempt #1
Sep 12 17:13:16.495048 ignition[958]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK
Sep 12 17:13:16.849060 ignition[958]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.0-linux-arm64.tar.gz"
Sep 12 17:13:16.850508 ignition[958]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh"
Sep 12 17:13:16.850508 ignition[958]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh"
Sep 12 17:13:16.850508 ignition[958]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml"
Sep 12 17:13:16.850508 ignition[958]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml"
Sep 12 17:13:16.850508 ignition[958]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml"
Sep 12 17:13:16.850508 ignition[958]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml"
Sep 12 17:13:16.850508 ignition[958]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml"
Sep 12 17:13:16.850508 ignition[958]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml"
Sep 12 17:13:16.859009 ignition[958]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf"
Sep 12 17:13:16.859009 ignition[958]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf"
Sep 12 17:13:16.859009 ignition[958]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.32.4-arm64.raw"
Sep 12 17:13:16.859009 ignition[958]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.32.4-arm64.raw"
Sep 12 17:13:16.859009 ignition[958]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.32.4-arm64.raw"
Sep 12 17:13:16.859009 ignition[958]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.32.4-arm64.raw: attempt #1
Sep 12 17:13:16.977948 ignition[958]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK
Sep 12 17:13:17.165234 systemd-networkd[779]: eth0: Gained IPv6LL
Sep 12 17:13:17.421673 systemd-networkd[779]: eth1: Gained IPv6LL
Sep 12 17:13:17.850499 ignition[958]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.32.4-arm64.raw"
Sep 12 17:13:17.850499 ignition[958]: INFO : files: op(b): [started] processing unit "prepare-helm.service"
Sep 12 17:13:17.855640 ignition[958]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Sep 12 17:13:17.855640 ignition[958]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Sep 12 17:13:17.855640 ignition[958]: INFO : files: op(b): [finished] processing unit "prepare-helm.service"
Sep 12 17:13:17.855640 ignition[958]: INFO : files: op(d): [started] processing unit "coreos-metadata.service"
Sep 12 17:13:17.855640 ignition[958]: INFO : files: op(d): op(e): [started] writing systemd drop-in "00-custom-metadata.conf" at "/sysroot/etc/systemd/system/coreos-metadata.service.d/00-custom-metadata.conf"
Sep 12 17:13:17.855640 ignition[958]: INFO : files: op(d): op(e): [finished] writing systemd drop-in "00-custom-metadata.conf" at "/sysroot/etc/systemd/system/coreos-metadata.service.d/00-custom-metadata.conf"
Sep 12 17:13:17.855640 ignition[958]: INFO : files: op(d): [finished] processing unit "coreos-metadata.service"
Sep 12 17:13:17.855640 ignition[958]: INFO : files: op(f): [started] setting preset to enabled for "prepare-helm.service"
Sep 12 17:13:17.855640 ignition[958]: INFO : files: op(f): [finished] setting preset to enabled for "prepare-helm.service"
Sep 12 17:13:17.855640 ignition[958]: INFO : files: createResultFile: createFiles: op(10): [started] writing file "/sysroot/etc/.ignition-result.json"
Sep 12 17:13:17.855640 ignition[958]: INFO : files: createResultFile: createFiles: op(10): [finished] writing file "/sysroot/etc/.ignition-result.json"
Sep 12 17:13:17.855640 ignition[958]: INFO : files: files passed
Sep 12 17:13:17.855640 ignition[958]: INFO : Ignition finished successfully
Sep 12 17:13:17.856029 systemd[1]: Finished ignition-files.service - Ignition (files).
Sep 12 17:13:17.862353 systemd[1]: Starting ignition-quench.service - Ignition (record completion)...
Sep 12 17:13:17.868736 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion...
Sep 12 17:13:17.871997 systemd[1]: ignition-quench.service: Deactivated successfully.
Sep 12 17:13:17.872142 systemd[1]: Finished ignition-quench.service - Ignition (record completion).
Sep 12 17:13:17.881330 initrd-setup-root-after-ignition[986]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Sep 12 17:13:17.881330 initrd-setup-root-after-ignition[986]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory
Sep 12 17:13:17.884576 initrd-setup-root-after-ignition[990]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Sep 12 17:13:17.886822 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion.
Sep 12 17:13:17.888692 systemd[1]: Reached target ignition-complete.target - Ignition Complete.
Sep 12 17:13:17.896257 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root...
Sep 12 17:13:17.929810 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Sep 12 17:13:17.929958 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root.
Sep 12 17:13:17.932379 systemd[1]: Reached target initrd-fs.target - Initrd File Systems.
Sep 12 17:13:17.933850 systemd[1]: Reached target initrd.target - Initrd Default Target.
Sep 12 17:13:17.935397 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met.
Sep 12 17:13:17.942364 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook...
Sep 12 17:13:17.957814 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Sep 12 17:13:17.962281 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons...
Sep 12 17:13:17.981978 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups.
Sep 12 17:13:17.982872 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes.
Sep 12 17:13:17.984019 systemd[1]: Stopped target timers.target - Timer Units.
Sep 12 17:13:17.985167 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Sep 12 17:13:17.985308 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Sep 12 17:13:17.986692 systemd[1]: Stopped target initrd.target - Initrd Default Target.
Sep 12 17:13:17.987408 systemd[1]: Stopped target basic.target - Basic System.
Sep 12 17:13:17.988554 systemd[1]: Stopped target ignition-complete.target - Ignition Complete.
Sep 12 17:13:17.989733 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup.
Sep 12 17:13:17.990749 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device.
Sep 12 17:13:17.991872 systemd[1]: Stopped target remote-fs.target - Remote File Systems.
Sep 12 17:13:17.992960 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems.
Sep 12 17:13:17.994196 systemd[1]: Stopped target sysinit.target - System Initialization.
Sep 12 17:13:17.995250 systemd[1]: Stopped target local-fs.target - Local File Systems.
Sep 12 17:13:17.996404 systemd[1]: Stopped target swap.target - Swaps.
Sep 12 17:13:17.997396 systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Sep 12 17:13:17.997533 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook.
Sep 12 17:13:17.998885 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes.
Sep 12 17:13:17.999664 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Sep 12 17:13:18.000842 systemd[1]: clevis-luks-askpass.path: Deactivated successfully.
Sep 12 17:13:18.000925 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Sep 12 17:13:18.002060 systemd[1]: dracut-initqueue.service: Deactivated successfully.
Sep 12 17:13:18.002254 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook.
Sep 12 17:13:18.003826 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully.
Sep 12 17:13:18.003948 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion.
Sep 12 17:13:18.005218 systemd[1]: ignition-files.service: Deactivated successfully.
Sep 12 17:13:18.005315 systemd[1]: Stopped ignition-files.service - Ignition (files).
Sep 12 17:13:18.006482 systemd[1]: flatcar-metadata-hostname.service: Deactivated successfully.
Sep 12 17:13:18.006578 systemd[1]: Stopped flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent.
Sep 12 17:13:18.013423 systemd[1]: Stopping ignition-mount.service - Ignition (mount)...
Sep 12 17:13:18.013942 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Sep 12 17:13:18.014116 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes.
Sep 12 17:13:18.018341 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot...
Sep 12 17:13:18.018851 systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Sep 12 17:13:18.018989 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices.
Sep 12 17:13:18.022902 systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Sep 12 17:13:18.023014 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook.
Sep 12 17:13:18.036370 systemd[1]: initrd-cleanup.service: Deactivated successfully.
Sep 12 17:13:18.036539 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons.
Sep 12 17:13:18.041725 ignition[1010]: INFO : Ignition 2.19.0
Sep 12 17:13:18.041725 ignition[1010]: INFO : Stage: umount
Sep 12 17:13:18.041725 ignition[1010]: INFO : no configs at "/usr/lib/ignition/base.d"
Sep 12 17:13:18.041725 ignition[1010]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Sep 12 17:13:18.041725 ignition[1010]: INFO : umount: umount passed
Sep 12 17:13:18.041725 ignition[1010]: INFO : Ignition finished successfully
Sep 12 17:13:18.049428 systemd[1]: sysroot-boot.mount: Deactivated successfully.
Sep 12 17:13:18.050008 systemd[1]: ignition-mount.service: Deactivated successfully.
Sep 12 17:13:18.050153 systemd[1]: Stopped ignition-mount.service - Ignition (mount).
Sep 12 17:13:18.050930 systemd[1]: ignition-disks.service: Deactivated successfully.
Sep 12 17:13:18.050971 systemd[1]: Stopped ignition-disks.service - Ignition (disks).
Sep 12 17:13:18.051694 systemd[1]: ignition-kargs.service: Deactivated successfully.
Sep 12 17:13:18.051741 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs).
Sep 12 17:13:18.052721 systemd[1]: ignition-fetch.service: Deactivated successfully.
Sep 12 17:13:18.052765 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch).
Sep 12 17:13:18.053731 systemd[1]: Stopped target network.target - Network.
Sep 12 17:13:18.054753 systemd[1]: ignition-fetch-offline.service: Deactivated successfully.
Sep 12 17:13:18.054815 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline).
Sep 12 17:13:18.055800 systemd[1]: Stopped target paths.target - Path Units.
Sep 12 17:13:18.056760 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Sep 12 17:13:18.060170 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Sep 12 17:13:18.062013 systemd[1]: Stopped target slices.target - Slice Units.
Sep 12 17:13:18.063627 systemd[1]: Stopped target sockets.target - Socket Units.
Sep 12 17:13:18.064740 systemd[1]: iscsid.socket: Deactivated successfully.
Sep 12 17:13:18.064793 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket.
Sep 12 17:13:18.065720 systemd[1]: iscsiuio.socket: Deactivated successfully.
Sep 12 17:13:18.065761 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Sep 12 17:13:18.066971 systemd[1]: ignition-setup.service: Deactivated successfully.
Sep 12 17:13:18.067032 systemd[1]: Stopped ignition-setup.service - Ignition (setup).
Sep 12 17:13:18.067968 systemd[1]: ignition-setup-pre.service: Deactivated successfully.
Sep 12 17:13:18.068017 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup.
Sep 12 17:13:18.069093 systemd[1]: Stopping systemd-networkd.service - Network Configuration...
Sep 12 17:13:18.069854 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution...
Sep 12 17:13:18.070946 systemd[1]: sysroot-boot.service: Deactivated successfully.
Sep 12 17:13:18.071160 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot.
Sep 12 17:13:18.072321 systemd[1]: initrd-setup-root.service: Deactivated successfully.
Sep 12 17:13:18.072418 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup.
Sep 12 17:13:18.078171 systemd-networkd[779]: eth1: DHCPv6 lease lost
Sep 12 17:13:18.079214 systemd[1]: systemd-resolved.service: Deactivated successfully.
Sep 12 17:13:18.079350 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution.
Sep 12 17:13:18.081832 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Sep 12 17:13:18.081897 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories.
Sep 12 17:13:18.084305 systemd-networkd[779]: eth0: DHCPv6 lease lost
Sep 12 17:13:18.086013 systemd[1]: systemd-networkd.service: Deactivated successfully.
Sep 12 17:13:18.086480 systemd[1]: Stopped systemd-networkd.service - Network Configuration.
Sep 12 17:13:18.087728 systemd[1]: systemd-networkd.socket: Deactivated successfully.
Sep 12 17:13:18.087766 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket.
Sep 12 17:13:18.099345 systemd[1]: Stopping network-cleanup.service - Network Cleanup...
Sep 12 17:13:18.100609 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully.
Sep 12 17:13:18.100740 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Sep 12 17:13:18.102508 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Sep 12 17:13:18.102568 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables.
Sep 12 17:13:18.104518 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Sep 12 17:13:18.104578 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules.
Sep 12 17:13:18.106191 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files...
Sep 12 17:13:18.122328 systemd[1]: network-cleanup.service: Deactivated successfully.
Sep 12 17:13:18.122587 systemd[1]: Stopped network-cleanup.service - Network Cleanup.
Sep 12 17:13:18.126902 systemd[1]: systemd-udevd.service: Deactivated successfully.
Sep 12 17:13:18.127057 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files.
Sep 12 17:13:18.128947 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Sep 12 17:13:18.128993 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket.
Sep 12 17:13:18.132297 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Sep 12 17:13:18.132350 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket.
Sep 12 17:13:18.133595 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Sep 12 17:13:18.133654 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook.
Sep 12 17:13:18.136947 systemd[1]: dracut-cmdline.service: Deactivated successfully.
Sep 12 17:13:18.137122 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook.
Sep 12 17:13:18.139753 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
Sep 12 17:13:18.139820 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Sep 12 17:13:18.146323 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database...
Sep 12 17:13:18.147462 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Sep 12 17:13:18.147564 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Sep 12 17:13:18.148875 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Sep 12 17:13:18.148928 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Sep 12 17:13:18.159695 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Sep 12 17:13:18.159917 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database.
Sep 12 17:13:18.161917 systemd[1]: Reached target initrd-switch-root.target - Switch Root.
Sep 12 17:13:18.170567 systemd[1]: Starting initrd-switch-root.service - Switch Root...
Sep 12 17:13:18.182906 systemd[1]: Switching root.
Sep 12 17:13:18.215867 systemd-journald[236]: Journal stopped
Sep 12 17:13:19.136384 systemd-journald[236]: Received SIGTERM from PID 1 (systemd).
Sep 12 17:13:19.136459 kernel: SELinux: policy capability network_peer_controls=1
Sep 12 17:13:19.136482 kernel: SELinux: policy capability open_perms=1
Sep 12 17:13:19.136492 kernel: SELinux: policy capability extended_socket_class=1
Sep 12 17:13:19.136501 kernel: SELinux: policy capability always_check_network=0
Sep 12 17:13:19.136514 kernel: SELinux: policy capability cgroup_seclabel=1
Sep 12 17:13:19.136524 kernel: SELinux: policy capability nnp_nosuid_transition=1
Sep 12 17:13:19.136533 kernel: SELinux: policy capability genfs_seclabel_symlinks=0
Sep 12 17:13:19.136543 kernel: SELinux: policy capability ioctl_skip_cloexec=0
Sep 12 17:13:19.136552 kernel: audit: type=1403 audit(1757697198.370:2): auid=4294967295 ses=4294967295 lsm=selinux res=1
Sep 12 17:13:19.136567 systemd[1]: Successfully loaded SELinux policy in 35.925ms.
Sep 12 17:13:19.136589 systemd[1]: Relabeled /dev, /dev/shm, /run, /sys/fs/cgroup in 10.918ms.
Sep 12 17:13:19.136600 systemd[1]: systemd 255 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT default-hierarchy=unified)
Sep 12 17:13:19.136611 systemd[1]: Detected virtualization kvm.
Sep 12 17:13:19.136621 systemd[1]: Detected architecture arm64.
Sep 12 17:13:19.136632 systemd[1]: Detected first boot.
Sep 12 17:13:19.136642 systemd[1]: Hostname set to <ci-4081-3-6-2-0999f1dc3d>.
Sep 12 17:13:19.136651 systemd[1]: Initializing machine ID from VM UUID.
Sep 12 17:13:19.136662 zram_generator::config[1052]: No configuration found.
Sep 12 17:13:19.136679 systemd[1]: Populated /etc with preset unit settings.
Sep 12 17:13:19.136689 systemd[1]: initrd-switch-root.service: Deactivated successfully.
Sep 12 17:13:19.136700 systemd[1]: Stopped initrd-switch-root.service - Switch Root.
Sep 12 17:13:19.136710 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Sep 12 17:13:19.136721 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config.
Sep 12 17:13:19.136731 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run.
Sep 12 17:13:19.136742 systemd[1]: Created slice system-getty.slice - Slice /system/getty.
Sep 12 17:13:19.136752 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe.
Sep 12 17:13:19.136764 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty.
Sep 12 17:13:19.136775 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit.
Sep 12 17:13:19.136785 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck.
Sep 12 17:13:19.136795 systemd[1]: Created slice user.slice - User and Session Slice.
Sep 12 17:13:19.136806 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Sep 12 17:13:19.136817 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Sep 12 17:13:19.136828 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch.
Sep 12 17:13:19.136838 systemd[1]: Set up automount boot.automount - Boot partition Automount Point.
Sep 12 17:13:19.136849 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point.
Sep 12 17:13:19.136864 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Sep 12 17:13:19.136875 systemd[1]: Expecting device dev-ttyAMA0.device - /dev/ttyAMA0...
Sep 12 17:13:19.136886 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Sep 12 17:13:19.136896 systemd[1]: Stopped target initrd-switch-root.target - Switch Root.
Sep 12 17:13:19.136907 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems.
Sep 12 17:13:19.136918 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System.
Sep 12 17:13:19.136934 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes.
Sep 12 17:13:19.136947 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Sep 12 17:13:19.136964 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Sep 12 17:13:19.136974 systemd[1]: Reached target slices.target - Slice Units.
Sep 12 17:13:19.136986 systemd[1]: Reached target swap.target - Swaps.
Sep 12 17:13:19.136998 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes.
Sep 12 17:13:19.137009 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket.
Sep 12 17:13:19.137019 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Sep 12 17:13:19.137030 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Sep 12 17:13:19.137041 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Sep 12 17:13:19.137055 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket.
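The zram_generator line above is informational, not an error: it means no /etc/systemd/zram-generator.conf (or equivalent) exists, so no zram swap devices are generated. If compressed swap were wanted, the config would look roughly like the following (an illustrative sketch of zram-generator syntax; no such file is present on this host):

```ini
# /etc/systemd/zram-generator.conf (hypothetical; absent on this machine)
[zram0]
zram-size = min(ram / 2, 4096)
compression-algorithm = zstd
```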
Sep 12 17:13:19.138173 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System...
Sep 12 17:13:19.138208 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System...
Sep 12 17:13:19.138220 systemd[1]: Mounting media.mount - External Media Directory...
Sep 12 17:13:19.138231 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System...
Sep 12 17:13:19.138243 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System...
Sep 12 17:13:19.138254 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp...
Sep 12 17:13:19.138266 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Sep 12 17:13:19.138284 systemd[1]: Reached target machines.target - Containers.
Sep 12 17:13:19.138296 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files...
Sep 12 17:13:19.138306 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Sep 12 17:13:19.138317 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Sep 12 17:13:19.138328 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs...
Sep 12 17:13:19.138338 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Sep 12 17:13:19.138349 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Sep 12 17:13:19.138364 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Sep 12 17:13:19.138376 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse...
Sep 12 17:13:19.138387 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Sep 12 17:13:19.138398 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf).
Sep 12 17:13:19.138409 systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Sep 12 17:13:19.138419 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device.
Sep 12 17:13:19.138430 systemd[1]: systemd-fsck-usr.service: Deactivated successfully.
Sep 12 17:13:19.138442 systemd[1]: Stopped systemd-fsck-usr.service.
Sep 12 17:13:19.138455 systemd[1]: Starting systemd-journald.service - Journal Service...
Sep 12 17:13:19.138466 kernel: loop: module loaded
Sep 12 17:13:19.138477 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Sep 12 17:13:19.138488 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
Sep 12 17:13:19.138499 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems...
Sep 12 17:13:19.138510 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Sep 12 17:13:19.138522 systemd[1]: verity-setup.service: Deactivated successfully.
Sep 12 17:13:19.138533 systemd[1]: Stopped verity-setup.service.
Sep 12 17:13:19.138545 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System.
Sep 12 17:13:19.138556 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System.
Sep 12 17:13:19.138566 systemd[1]: Mounted media.mount - External Media Directory.
Sep 12 17:13:19.138577 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System.
Sep 12 17:13:19.138588 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System.
Sep 12 17:13:19.138598 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp.
Sep 12 17:13:19.138611 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Sep 12 17:13:19.138622 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Sep 12 17:13:19.138633 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs.
Sep 12 17:13:19.138643 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Sep 12 17:13:19.138686 systemd-journald[1122]: Collecting audit messages is disabled.
Sep 12 17:13:19.138709 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Sep 12 17:13:19.138722 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Sep 12 17:13:19.138733 kernel: fuse: init (API version 7.39)
Sep 12 17:13:19.138743 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Sep 12 17:13:19.138753 systemd[1]: modprobe@loop.service: Deactivated successfully.
Sep 12 17:13:19.138765 systemd-journald[1122]: Journal started
Sep 12 17:13:19.138791 systemd-journald[1122]: Runtime Journal (/run/log/journal/a6f48d7de750421784bf4638b2eb75c0) is 8.0M, max 76.6M, 68.6M free.
Sep 12 17:13:18.872140 systemd[1]: Queued start job for default target multi-user.target.
Sep 12 17:13:18.898826 systemd[1]: Unnecessary job was removed for dev-sda6.device - /dev/sda6.
Sep 12 17:13:18.899272 systemd[1]: systemd-journald.service: Deactivated successfully.
Sep 12 17:13:19.142117 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Sep 12 17:13:19.142161 systemd[1]: Started systemd-journald.service - Journal Service.
Sep 12 17:13:19.143657 systemd[1]: modprobe@fuse.service: Deactivated successfully.
Sep 12 17:13:19.144298 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse.
Sep 12 17:13:19.145656 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Sep 12 17:13:19.147230 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Sep 12 17:13:19.148263 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems.
Sep 12 17:13:19.162127 kernel: ACPI: bus type drm_connector registered
Sep 12 17:13:19.164200 systemd[1]: modprobe@drm.service: Deactivated successfully.
Sep 12 17:13:19.164378 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Sep 12 17:13:19.167705 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files.
Sep 12 17:13:19.169949 systemd[1]: Reached target network-pre.target - Preparation for Network.
Sep 12 17:13:19.176353 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System...
Sep 12 17:13:19.188227 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System...
Sep 12 17:13:19.188906 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/).
Sep 12 17:13:19.188954 systemd[1]: Reached target local-fs.target - Local File Systems.
Sep 12 17:13:19.193964 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management (Varlink).
Sep 12 17:13:19.199345 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown...
Sep 12 17:13:19.204588 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache...
Sep 12 17:13:19.206479 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Sep 12 17:13:19.209353 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database...
Sep 12 17:13:19.213309 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage...
Sep 12 17:13:19.215798 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Sep 12 17:13:19.218886 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed...
Sep 12 17:13:19.220511 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Sep 12 17:13:19.224327 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Sep 12 17:13:19.226285 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/...
Sep 12 17:13:19.234408 systemd[1]: Starting systemd-sysusers.service - Create System Users...
Sep 12 17:13:19.236625 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System.
Sep 12 17:13:19.237489 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System.
Sep 12 17:13:19.240159 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown.
Sep 12 17:13:19.260839 systemd-journald[1122]: Time spent on flushing to /var/log/journal/a6f48d7de750421784bf4638b2eb75c0 is 38.725ms for 1122 entries.
Sep 12 17:13:19.260839 systemd-journald[1122]: System Journal (/var/log/journal/a6f48d7de750421784bf4638b2eb75c0) is 8.0M, max 584.8M, 576.8M free.
Sep 12 17:13:19.315042 systemd-journald[1122]: Received client request to flush runtime journal.
Sep 12 17:13:19.315283 kernel: loop0: detected capacity change from 0 to 8
Sep 12 17:13:19.315316 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher
Sep 12 17:13:19.268172 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed.
Sep 12 17:13:19.269379 systemd[1]: Reached target first-boot-complete.target - First Boot Complete.
Sep 12 17:13:19.279333 systemd[1]: Starting systemd-machine-id-commit.service - Commit a transient machine-id on disk...
Sep 12 17:13:19.316218 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Sep 12 17:13:19.327373 systemd[1]: Starting systemd-udev-settle.service - Wait for udev To Complete Device Initialization...
Sep 12 17:13:19.328404 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage.
Sep 12 17:13:19.331865 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Sep 12 17:13:19.334158 kernel: loop1: detected capacity change from 0 to 114328
Sep 12 17:13:19.339866 systemd[1]: etc-machine\x2did.mount: Deactivated successfully.
Sep 12 17:13:19.345759 systemd[1]: Finished systemd-machine-id-commit.service - Commit a transient machine-id on disk.
Sep 12 17:13:19.362619 systemd[1]: Finished systemd-sysusers.service - Create System Users.
Sep 12 17:13:19.373626 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Sep 12 17:13:19.376807 udevadm[1180]: systemd-udev-settle.service is deprecated. Please fix lvm2-activation-early.service, lvm2-activation.service not to pull it in.
Sep 12 17:13:19.379211 kernel: loop2: detected capacity change from 0 to 114432
Sep 12 17:13:19.419144 kernel: loop3: detected capacity change from 0 to 207008
Sep 12 17:13:19.429015 systemd-tmpfiles[1186]: ACLs are not supported, ignoring.
Sep 12 17:13:19.429039 systemd-tmpfiles[1186]: ACLs are not supported, ignoring.
Sep 12 17:13:19.437555 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Sep 12 17:13:19.469528 kernel: loop4: detected capacity change from 0 to 8
Sep 12 17:13:19.473153 kernel: loop5: detected capacity change from 0 to 114328
Sep 12 17:13:19.485131 kernel: loop6: detected capacity change from 0 to 114432
Sep 12 17:13:19.500493 kernel: loop7: detected capacity change from 0 to 207008
Sep 12 17:13:19.520736 (sd-merge)[1191]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-hetzner'.
Sep 12 17:13:19.521318 (sd-merge)[1191]: Merged extensions into '/usr'.
Sep 12 17:13:19.530109 systemd[1]: Reloading requested from client PID 1166 ('systemd-sysext') (unit systemd-sysext.service)...
Sep 12 17:13:19.530135 systemd[1]: Reloading...
Sep 12 17:13:19.643498 zram_generator::config[1215]: No configuration found.
Sep 12 17:13:19.782239 ldconfig[1161]: /sbin/ldconfig: /lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start.
Sep 12 17:13:19.822954 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Sep 12 17:13:19.870473 systemd[1]: Reloading finished in 337 ms.
Sep 12 17:13:19.899135 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache.
Sep 12 17:13:19.900222 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/.
Sep 12 17:13:19.909460 systemd[1]: Starting ensure-sysext.service...
Sep 12 17:13:19.916608 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Sep 12 17:13:19.939544 systemd[1]: Reloading requested from client PID 1254 ('systemctl') (unit ensure-sysext.service)...
Sep 12 17:13:19.939565 systemd[1]: Reloading...
Sep 12 17:13:19.989790 systemd-tmpfiles[1255]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring.
Sep 12 17:13:19.990122 systemd-tmpfiles[1255]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring.
Sep 12 17:13:19.990818 systemd-tmpfiles[1255]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring.
Sep 12 17:13:19.991055 systemd-tmpfiles[1255]: ACLs are not supported, ignoring.
Sep 12 17:13:19.994269 systemd-tmpfiles[1255]: ACLs are not supported, ignoring.
Sep 12 17:13:19.999358 systemd-tmpfiles[1255]: Detected autofs mount point /boot during canonicalization of boot.
Sep 12 17:13:19.999370 systemd-tmpfiles[1255]: Skipping /boot
Sep 12 17:13:20.010819 systemd-tmpfiles[1255]: Detected autofs mount point /boot during canonicalization of boot.
Sep 12 17:13:20.010843 systemd-tmpfiles[1255]: Skipping /boot
Sep 12 17:13:20.030098 zram_generator::config[1282]: No configuration found.
Sep 12 17:13:20.130750 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Sep 12 17:13:20.177650 systemd[1]: Reloading finished in 237 ms.
Sep 12 17:13:20.193630 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database.
Sep 12 17:13:20.195879 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
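The (sd-merge) lines above are systemd-sysext overlaying the four extension images onto /usr, which is why a daemon reload and an ldconfig rebuild follow immediately; the kubernetes image is the one Ignition downloaded to /opt/extensions and symlinked into /etc/extensions earlier. An image is only merged if it ships an extension-release file matching the host OS, roughly of this shape (illustrative values, not the actual file contents):

```ini
# usr/lib/extension-release.d/extension-release.kubernetes (inside the image)
ID=flatcar
SYSEXT_LEVEL=1.0
```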
Sep 12 17:13:20.226842 systemd[1]: Starting audit-rules.service - Load Security Auditing Rules...
Sep 12 17:13:20.232007 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs...
Sep 12 17:13:20.237418 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog...
Sep 12 17:13:20.242168 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Sep 12 17:13:20.246411 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Sep 12 17:13:20.253033 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP...
Sep 12 17:13:20.256873 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Sep 12 17:13:20.264558 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Sep 12 17:13:20.269863 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Sep 12 17:13:20.273865 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Sep 12 17:13:20.274727 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Sep 12 17:13:20.284375 systemd[1]: Starting systemd-userdbd.service - User Database Manager...
Sep 12 17:13:20.286641 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Sep 12 17:13:20.286801 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Sep 12 17:13:20.292597 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Sep 12 17:13:20.292765 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Sep 12 17:13:20.312453 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Sep 12 17:13:20.315488 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Sep 12 17:13:20.316266 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Sep 12 17:13:20.321560 systemd[1]: Finished ensure-sysext.service.
Sep 12 17:13:20.323417 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog.
Sep 12 17:13:20.328667 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization...
Sep 12 17:13:20.343433 systemd[1]: Starting systemd-update-done.service - Update is Completed...
Sep 12 17:13:20.346307 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP.
Sep 12 17:13:20.350377 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Sep 12 17:13:20.350555 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Sep 12 17:13:20.354678 systemd[1]: modprobe@loop.service: Deactivated successfully.
Sep 12 17:13:20.354879 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Sep 12 17:13:20.364464 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Sep 12 17:13:20.364564 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Sep 12 17:13:20.368935 systemd-udevd[1330]: Using default interface naming scheme 'v255'.
Sep 12 17:13:20.378813 augenrules[1355]: No rules Sep 12 17:13:20.385457 systemd[1]: Finished audit-rules.service - Load Security Auditing Rules. Sep 12 17:13:20.387290 systemd[1]: Finished systemd-update-done.service - Update is Completed. Sep 12 17:13:20.389703 systemd[1]: modprobe@drm.service: Deactivated successfully. Sep 12 17:13:20.389938 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Sep 12 17:13:20.396181 systemd[1]: Started systemd-userdbd.service - User Database Manager. Sep 12 17:13:20.413520 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs. Sep 12 17:13:20.414526 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). Sep 12 17:13:20.425159 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Sep 12 17:13:20.433450 systemd[1]: Starting systemd-networkd.service - Network Configuration... Sep 12 17:13:20.476975 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization. Sep 12 17:13:20.479396 systemd[1]: Reached target time-set.target - System Time Set. Sep 12 17:13:20.523013 systemd-resolved[1329]: Positive Trust Anchors: Sep 12 17:13:20.523037 systemd-resolved[1329]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Sep 12 17:13:20.523088 systemd-resolved[1329]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Sep 12 17:13:20.531127 systemd-resolved[1329]: Using system hostname 'ci-4081-3-6-2-0999f1dc3d'. Sep 12 17:13:20.533115 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Sep 12 17:13:20.535238 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Sep 12 17:13:20.540829 systemd[1]: Condition check resulted in dev-ttyAMA0.device - /dev/ttyAMA0 being skipped. Sep 12 17:13:20.560181 systemd-networkd[1374]: lo: Link UP Sep 12 17:13:20.560527 systemd-networkd[1374]: lo: Gained carrier Sep 12 17:13:20.561376 systemd-networkd[1374]: Enumeration completed Sep 12 17:13:20.561672 systemd[1]: Started systemd-networkd.service - Network Configuration. Sep 12 17:13:20.562778 systemd[1]: Reached target network.target - Network. Sep 12 17:13:20.570440 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured... Sep 12 17:13:20.613048 systemd-networkd[1374]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Sep 12 17:13:20.613361 systemd-networkd[1374]: eth1: Configuring with /usr/lib/systemd/network/zz-default.network. Sep 12 17:13:20.614398 systemd-networkd[1374]: eth1: Link UP Sep 12 17:13:20.614475 systemd-networkd[1374]: eth1: Gained carrier Sep 12 17:13:20.614551 systemd-networkd[1374]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. 
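[Annotation] The 'Positive Trust Anchors' record logged by systemd-resolved is the DNSSEC root zone DS record (key tag 20326, the 2017 root KSK), and the long negative-trust-anchor list is the built-in set of private and reverse zones exempted from validation, which is why RFC 1918 reverse lookups are never failed on DNSSEC grounds. A sketch of the related knob in resolved.conf (the defaults shown in the log require no configuration):

    # /etc/systemd/resolved.conf
    [Resolve]
    DNSSEC=allow-downgrade

    # inspect resolution and validation state at runtime:
    resolvectl query example.com
    resolvectl statistics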
Sep 12 17:13:20.626845 systemd-networkd[1374]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Sep 12 17:13:20.627301 systemd-networkd[1374]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Sep 12 17:13:20.628414 systemd-networkd[1374]: eth0: Link UP Sep 12 17:13:20.628513 systemd-networkd[1374]: eth0: Gained carrier Sep 12 17:13:20.628579 systemd-networkd[1374]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Sep 12 17:13:20.646224 kernel: mousedev: PS/2 mouse device common for all mice Sep 12 17:13:20.649199 systemd-networkd[1374]: eth1: DHCPv4 address 10.0.0.3/32 acquired from 10.0.0.1 Sep 12 17:13:20.650216 systemd-timesyncd[1348]: Network configuration changed, trying to establish connection. Sep 12 17:13:20.685135 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 36 scanned by (udev-worker) (1375) Sep 12 17:13:20.700326 systemd-networkd[1374]: eth0: DHCPv4 address 49.13.6.100/32, gateway 172.31.1.1 acquired from 172.31.1.1 Sep 12 17:13:20.700923 systemd-timesyncd[1348]: Network configuration changed, trying to establish connection. Sep 12 17:13:20.702197 systemd-timesyncd[1348]: Network configuration changed, trying to establish connection. Sep 12 17:13:20.720565 kernel: [drm] pci: virtio-gpu-pci detected at 0000:00:01.0 Sep 12 17:13:20.720661 kernel: [drm] features: -virgl +edid -resource_blob -host_visible Sep 12 17:13:20.720675 kernel: [drm] features: -context_init Sep 12 17:13:20.723145 kernel: [drm] number of scanouts: 1 Sep 12 17:13:20.725219 kernel: [drm] number of cap sets: 0 Sep 12 17:13:20.726102 kernel: [drm] Initialized virtio_gpu 0.1.0 0 for 0000:00:01.0 on minor 0 Sep 12 17:13:20.737662 systemd[1]: Condition check resulted in dev-virtio\x2dports-org.qemu.guest_agent.0.device - /dev/virtio-ports/org.qemu.guest_agent.0 being skipped. Sep 12 17:13:20.737802 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Sep 12 17:13:20.744809 kernel: Console: switching to colour frame buffer device 160x50 Sep 12 17:13:20.746040 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Sep 12 17:13:20.753111 kernel: virtio-pci 0000:00:01.0: [drm] fb0: virtio_gpudrmfb frame buffer device Sep 12 17:13:20.759301 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Sep 12 17:13:20.765394 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Sep 12 17:13:20.766099 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Sep 12 17:13:20.766139 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). Sep 12 17:13:20.769585 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Sep 12 17:13:20.771112 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Sep 12 17:13:20.774578 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Sep 12 17:13:20.775344 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. 
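[Annotation] Both interfaces above matched the catch-all /usr/lib/systemd/network/zz-default.network, hence the repeated 'potentially unpredictable interface name' warnings. Pinning the match to the hardware address avoids relying on the kernel's enumeration order; a minimal sketch (MAC address hypothetical):

    # /etc/systemd/network/10-uplink.network
    [Match]
    MACAddress=96:00:aa:bb:cc:dd

    [Network]
    DHCP=ipv4

networkd applies the first .network file that matches, in lexical order with /etc taking precedence, so a 10-prefixed file in /etc/systemd/network wins over the zz-default fallback.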
Sep 12 17:13:20.777016 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Sep 12 17:13:20.780664 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - QEMU_HARDDISK OEM. Sep 12 17:13:20.799565 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... Sep 12 17:13:20.805774 systemd[1]: modprobe@loop.service: Deactivated successfully. Sep 12 17:13:20.806009 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Sep 12 17:13:20.806996 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Sep 12 17:13:20.832611 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Sep 12 17:13:20.835128 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. Sep 12 17:13:20.849646 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Sep 12 17:13:20.850652 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Sep 12 17:13:20.859630 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Sep 12 17:13:20.930965 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Sep 12 17:13:20.990349 systemd[1]: Finished systemd-udev-settle.service - Wait for udev To Complete Device Initialization. Sep 12 17:13:20.996568 systemd[1]: Starting lvm2-activation-early.service - Activation of LVM2 logical volumes... Sep 12 17:13:21.014711 lvm[1435]: WARNING: Failed to connect to lvmetad. Falling back to device scanning. Sep 12 17:13:21.043042 systemd[1]: Finished lvm2-activation-early.service - Activation of LVM2 logical volumes. Sep 12 17:13:21.045279 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Sep 12 17:13:21.046024 systemd[1]: Reached target sysinit.target - System Initialization. Sep 12 17:13:21.046889 systemd[1]: Started motdgen.path - Watch for update engine configuration changes. Sep 12 17:13:21.047767 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data. Sep 12 17:13:21.048930 systemd[1]: Started logrotate.timer - Daily rotation of log files. Sep 12 17:13:21.049786 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information.. Sep 12 17:13:21.050590 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories. Sep 12 17:13:21.051359 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). Sep 12 17:13:21.051397 systemd[1]: Reached target paths.target - Path Units. Sep 12 17:13:21.051974 systemd[1]: Reached target timers.target - Timer Units. Sep 12 17:13:21.053750 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket. Sep 12 17:13:21.056285 systemd[1]: Starting docker.socket - Docker Socket for the API... Sep 12 17:13:21.062585 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket. Sep 12 17:13:21.065346 systemd[1]: Starting lvm2-activation.service - Activation of LVM2 logical volumes... Sep 12 17:13:21.066931 systemd[1]: Listening on docker.socket - Docker Socket for the API. Sep 12 17:13:21.067805 systemd[1]: Reached target sockets.target - Socket Units. 
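[Annotation] The lvm2-activation warning 'Failed to connect to lvmetad. Falling back to device scanning.' is expected here: no lvmetad daemon runs, so LVM scans block devices directly. On LVM versions that still ship the daemon, the scanning fallback can be made the explicit default (sketch of the relevant lvm.conf excerpt):

    # /etc/lvm/lvm.conf
    global {
        use_lvmetad = 0
    }

Newer LVM releases removed lvmetad entirely, in which case the warning is purely informational and there is nothing to configure.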
Sep 12 17:13:21.068443 systemd[1]: Reached target basic.target - Basic System. Sep 12 17:13:21.069031 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met. Sep 12 17:13:21.069137 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met. Sep 12 17:13:21.071255 systemd[1]: Starting containerd.service - containerd container runtime... Sep 12 17:13:21.076952 lvm[1439]: WARNING: Failed to connect to lvmetad. Falling back to device scanning. Sep 12 17:13:21.077416 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent... Sep 12 17:13:21.082477 systemd[1]: Starting dbus.service - D-Bus System Message Bus... Sep 12 17:13:21.087298 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... Sep 12 17:13:21.095885 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... Sep 12 17:13:21.096662 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). Sep 12 17:13:21.100357 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... Sep 12 17:13:21.105894 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin... Sep 12 17:13:21.117257 systemd[1]: Started qemu-guest-agent.service - QEMU Guest Agent. Sep 12 17:13:21.120333 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... Sep 12 17:13:21.125534 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... Sep 12 17:13:21.131542 systemd[1]: Starting systemd-logind.service - User Login Management... Sep 12 17:13:21.133961 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0). Sep 12 17:13:21.134781 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. Sep 12 17:13:21.137271 systemd[1]: Starting update-engine.service - Update Engine... Sep 12 17:13:21.140450 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... Sep 12 17:13:21.144346 systemd[1]: Finished lvm2-activation.service - Activation of LVM2 logical volumes. Sep 12 17:13:21.155980 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. Sep 12 17:13:21.156430 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. Sep 12 17:13:21.170470 jq[1443]: false Sep 12 17:13:21.178533 jq[1454]: true Sep 12 17:13:21.187619 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. Sep 12 17:13:21.187821 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. Sep 12 17:13:21.192704 dbus-daemon[1442]: [system] SELinux support is enabled Sep 12 17:13:21.192875 systemd[1]: Started dbus.service - D-Bus System Message Bus. Sep 12 17:13:21.196777 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). Sep 12 17:13:21.196808 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. 
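[Annotation] Several units in this stretch are skipped via declarative condition checks rather than failing (flatcar-setup-environment, tcsd, and the cloud-config units). This is how such guards look inside a unit file; a sketch assembled from the conditions quoted in these entries:

    [Unit]
    # run only if the OEM helper exists:
    ConditionPathExists=/oem/bin/flatcar-setup-environment
    # run only if a TPM device is present:
    ConditionPathExists=/dev/tpm0
    # run only if the kernel cmdline carries cloud-config-url:
    ConditionKernelCommandLine=cloud-config-url

An unmet condition marks the unit as skipped, not failed, which keeps the boot green while still recording the decision in the journal.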
Sep 12 17:13:21.198584 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). Sep 12 17:13:21.198608 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. Sep 12 17:13:21.212574 jq[1466]: true Sep 12 17:13:21.231332 (ntainerd)[1474]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR Sep 12 17:13:21.231809 extend-filesystems[1444]: Found loop4 Sep 12 17:13:21.234610 extend-filesystems[1444]: Found loop5 Sep 12 17:13:21.234610 extend-filesystems[1444]: Found loop6 Sep 12 17:13:21.234610 extend-filesystems[1444]: Found loop7 Sep 12 17:13:21.234610 extend-filesystems[1444]: Found sda Sep 12 17:13:21.234610 extend-filesystems[1444]: Found sda1 Sep 12 17:13:21.234610 extend-filesystems[1444]: Found sda2 Sep 12 17:13:21.234610 extend-filesystems[1444]: Found sda3 Sep 12 17:13:21.234610 extend-filesystems[1444]: Found usr Sep 12 17:13:21.234610 extend-filesystems[1444]: Found sda4 Sep 12 17:13:21.234610 extend-filesystems[1444]: Found sda6 Sep 12 17:13:21.259178 extend-filesystems[1444]: Found sda7 Sep 12 17:13:21.259178 extend-filesystems[1444]: Found sda9 Sep 12 17:13:21.259178 extend-filesystems[1444]: Checking size of /dev/sda9 Sep 12 17:13:21.264140 tar[1457]: linux-arm64/LICENSE Sep 12 17:13:21.264140 tar[1457]: linux-arm64/helm Sep 12 17:13:21.274429 coreos-metadata[1441]: Sep 12 17:13:21.235 INFO Fetching http://169.254.169.254/hetzner/v1/metadata: Attempt #1 Sep 12 17:13:21.274429 coreos-metadata[1441]: Sep 12 17:13:21.243 INFO Fetch successful Sep 12 17:13:21.274429 coreos-metadata[1441]: Sep 12 17:13:21.243 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/private-networks: Attempt #1 Sep 12 17:13:21.274429 coreos-metadata[1441]: Sep 12 17:13:21.249 INFO Fetch successful Sep 12 17:13:21.275112 systemd[1]: motdgen.service: Deactivated successfully. Sep 12 17:13:21.275464 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. Sep 12 17:13:21.289232 extend-filesystems[1444]: Resized partition /dev/sda9 Sep 12 17:13:21.299546 extend-filesystems[1488]: resize2fs 1.47.1 (20-May-2024) Sep 12 17:13:21.307541 kernel: EXT4-fs (sda9): resizing filesystem from 1617920 to 9393147 blocks Sep 12 17:13:21.330200 update_engine[1453]: I20250912 17:13:21.325726 1453 main.cc:92] Flatcar Update Engine starting Sep 12 17:13:21.342850 systemd[1]: Started update-engine.service - Update Engine. Sep 12 17:13:21.345865 systemd[1]: Started locksmithd.service - Cluster reboot manager. Sep 12 17:13:21.347242 update_engine[1453]: I20250912 17:13:21.345867 1453 update_check_scheduler.cc:74] Next update check in 4m53s Sep 12 17:13:21.392878 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent. Sep 12 17:13:21.394259 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. Sep 12 17:13:21.407994 systemd-logind[1452]: New seat seat0. Sep 12 17:13:21.412777 systemd-logind[1452]: Watching system buttons on /dev/input/event0 (Power Button) Sep 12 17:13:21.412808 systemd-logind[1452]: Watching system buttons on /dev/input/event2 (QEMU QEMU USB Keyboard) Sep 12 17:13:21.414196 systemd[1]: Started systemd-logind.service - User Login Management. 
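[Annotation] update-engine schedules its first check ('Next update check in 4m53s') and locksmithd decides when a finished update is allowed to reboot the node. On Flatcar the reboot policy is a one-line setting; a sketch matching the strategy locksmithd logs shortly after:

    # /etc/flatcar/update.conf
    REBOOT_STRATEGY=reboot    # alternatives include etcd-lock and off

With 'reboot' the node restarts as soon as update-engine applies an update; 'etcd-lock' serializes reboots across a cluster by taking a lock in etcd first.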
Sep 12 17:13:21.449316 bash[1514]: Updated "/home/core/.ssh/authorized_keys" Sep 12 17:13:21.452305 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 36 scanned by (udev-worker) (1384) Sep 12 17:13:21.454123 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. Sep 12 17:13:21.478742 systemd[1]: Starting sshkeys.service... Sep 12 17:13:21.536368 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys. Sep 12 17:13:21.547000 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)... Sep 12 17:13:21.556796 kernel: EXT4-fs (sda9): resized filesystem to 9393147 Sep 12 17:13:21.598536 coreos-metadata[1519]: Sep 12 17:13:21.582 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/public-keys: Attempt #1 Sep 12 17:13:21.598536 coreos-metadata[1519]: Sep 12 17:13:21.584 INFO Fetch successful Sep 12 17:13:21.603330 locksmithd[1503]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Sep 12 17:13:21.607736 extend-filesystems[1488]: Filesystem at /dev/sda9 is mounted on /; on-line resizing required Sep 12 17:13:21.607736 extend-filesystems[1488]: old_desc_blocks = 1, new_desc_blocks = 5 Sep 12 17:13:21.607736 extend-filesystems[1488]: The filesystem on /dev/sda9 is now 9393147 (4k) blocks long. Sep 12 17:13:21.607653 unknown[1519]: wrote ssh authorized keys file for user: core Sep 12 17:13:21.618843 extend-filesystems[1444]: Resized filesystem in /dev/sda9 Sep 12 17:13:21.618843 extend-filesystems[1444]: Found sr0 Sep 12 17:13:21.609106 systemd[1]: extend-filesystems.service: Deactivated successfully. Sep 12 17:13:21.610768 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. Sep 12 17:13:21.666301 update-ssh-keys[1527]: Updated "/home/core/.ssh/authorized_keys" Sep 12 17:13:21.662537 systemd[1]: Finished coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys). Sep 12 17:13:21.674120 systemd[1]: Finished sshkeys.service. Sep 12 17:13:21.683432 containerd[1474]: time="2025-09-12T17:13:21.683293480Z" level=info msg="starting containerd" revision=174e0d1785eeda18dc2beba45e1d5a188771636b version=v1.7.21 Sep 12 17:13:21.736298 containerd[1474]: time="2025-09-12T17:13:21.735827280Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.aufs\"..." type=io.containerd.snapshotter.v1 Sep 12 17:13:21.744108 containerd[1474]: time="2025-09-12T17:13:21.742565560Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.aufs\"..." error="aufs is not supported (modprobe aufs failed: exit status 1 \"modprobe: FATAL: Module aufs not found in directory /lib/modules/6.6.106-flatcar\\n\"): skip plugin" type=io.containerd.snapshotter.v1 Sep 12 17:13:21.744108 containerd[1474]: time="2025-09-12T17:13:21.742628640Z" level=info msg="loading plugin \"io.containerd.event.v1.exchange\"..." type=io.containerd.event.v1 Sep 12 17:13:21.744108 containerd[1474]: time="2025-09-12T17:13:21.742652840Z" level=info msg="loading plugin \"io.containerd.internal.v1.opt\"..." type=io.containerd.internal.v1 Sep 12 17:13:21.744108 containerd[1474]: time="2025-09-12T17:13:21.742879320Z" level=info msg="loading plugin \"io.containerd.warning.v1.deprecations\"..." type=io.containerd.warning.v1 Sep 12 17:13:21.744108 containerd[1474]: time="2025-09-12T17:13:21.742900760Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.blockfile\"..." 
type=io.containerd.snapshotter.v1 Sep 12 17:13:21.744108 containerd[1474]: time="2025-09-12T17:13:21.742998000Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.blockfile\"..." error="no scratch file generator: skip plugin" type=io.containerd.snapshotter.v1 Sep 12 17:13:21.744108 containerd[1474]: time="2025-09-12T17:13:21.743026840Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.btrfs\"..." type=io.containerd.snapshotter.v1 Sep 12 17:13:21.744108 containerd[1474]: time="2025-09-12T17:13:21.743392480Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.btrfs\"..." error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" type=io.containerd.snapshotter.v1 Sep 12 17:13:21.744108 containerd[1474]: time="2025-09-12T17:13:21.743417160Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.devmapper\"..." type=io.containerd.snapshotter.v1 Sep 12 17:13:21.744108 containerd[1474]: time="2025-09-12T17:13:21.743445160Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.devmapper\"..." error="devmapper not configured: skip plugin" type=io.containerd.snapshotter.v1 Sep 12 17:13:21.744108 containerd[1474]: time="2025-09-12T17:13:21.743503080Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.native\"..." type=io.containerd.snapshotter.v1 Sep 12 17:13:21.744507 containerd[1474]: time="2025-09-12T17:13:21.743612800Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.overlayfs\"..." type=io.containerd.snapshotter.v1 Sep 12 17:13:21.744507 containerd[1474]: time="2025-09-12T17:13:21.743864160Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.zfs\"..." type=io.containerd.snapshotter.v1 Sep 12 17:13:21.744507 containerd[1474]: time="2025-09-12T17:13:21.744004280Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.zfs\"..." error="path /var/lib/containerd/io.containerd.snapshotter.v1.zfs must be a zfs filesystem to be used with the zfs snapshotter: skip plugin" type=io.containerd.snapshotter.v1 Sep 12 17:13:21.744507 containerd[1474]: time="2025-09-12T17:13:21.744026120Z" level=info msg="loading plugin \"io.containerd.content.v1.content\"..." type=io.containerd.content.v1 Sep 12 17:13:21.745126 containerd[1474]: time="2025-09-12T17:13:21.745097760Z" level=info msg="loading plugin \"io.containerd.metadata.v1.bolt\"..." type=io.containerd.metadata.v1 Sep 12 17:13:21.745333 containerd[1474]: time="2025-09-12T17:13:21.745314240Z" level=info msg="metadata content store policy set" policy=shared Sep 12 17:13:21.750537 containerd[1474]: time="2025-09-12T17:13:21.750475600Z" level=info msg="loading plugin \"io.containerd.gc.v1.scheduler\"..." type=io.containerd.gc.v1 Sep 12 17:13:21.751027 containerd[1474]: time="2025-09-12T17:13:21.750886760Z" level=info msg="loading plugin \"io.containerd.differ.v1.walking\"..." type=io.containerd.differ.v1 Sep 12 17:13:21.751236 containerd[1474]: time="2025-09-12T17:13:21.751211160Z" level=info msg="loading plugin \"io.containerd.lease.v1.manager\"..." type=io.containerd.lease.v1 Sep 12 17:13:21.751339 containerd[1474]: time="2025-09-12T17:13:21.751318520Z" level=info msg="loading plugin \"io.containerd.streaming.v1.manager\"..." type=io.containerd.streaming.v1 Sep 12 17:13:21.751462 containerd[1474]: time="2025-09-12T17:13:21.751441840Z" level=info msg="loading plugin \"io.containerd.runtime.v1.linux\"..." 
type=io.containerd.runtime.v1 Sep 12 17:13:21.751808 containerd[1474]: time="2025-09-12T17:13:21.751786240Z" level=info msg="loading plugin \"io.containerd.monitor.v1.cgroups\"..." type=io.containerd.monitor.v1 Sep 12 17:13:21.752371 containerd[1474]: time="2025-09-12T17:13:21.752340760Z" level=info msg="loading plugin \"io.containerd.runtime.v2.task\"..." type=io.containerd.runtime.v2 Sep 12 17:13:21.752689 containerd[1474]: time="2025-09-12T17:13:21.752670240Z" level=info msg="loading plugin \"io.containerd.runtime.v2.shim\"..." type=io.containerd.runtime.v2 Sep 12 17:13:21.752774 containerd[1474]: time="2025-09-12T17:13:21.752760360Z" level=info msg="loading plugin \"io.containerd.sandbox.store.v1.local\"..." type=io.containerd.sandbox.store.v1 Sep 12 17:13:21.752887 containerd[1474]: time="2025-09-12T17:13:21.752867120Z" level=info msg="loading plugin \"io.containerd.sandbox.controller.v1.local\"..." type=io.containerd.sandbox.controller.v1 Sep 12 17:13:21.753036 containerd[1474]: time="2025-09-12T17:13:21.753019800Z" level=info msg="loading plugin \"io.containerd.service.v1.containers-service\"..." type=io.containerd.service.v1 Sep 12 17:13:21.753252 containerd[1474]: time="2025-09-12T17:13:21.753234200Z" level=info msg="loading plugin \"io.containerd.service.v1.content-service\"..." type=io.containerd.service.v1 Sep 12 17:13:21.753368 containerd[1474]: time="2025-09-12T17:13:21.753306480Z" level=info msg="loading plugin \"io.containerd.service.v1.diff-service\"..." type=io.containerd.service.v1 Sep 12 17:13:21.753434 containerd[1474]: time="2025-09-12T17:13:21.753422080Z" level=info msg="loading plugin \"io.containerd.service.v1.images-service\"..." type=io.containerd.service.v1 Sep 12 17:13:21.753500 containerd[1474]: time="2025-09-12T17:13:21.753486640Z" level=info msg="loading plugin \"io.containerd.service.v1.introspection-service\"..." type=io.containerd.service.v1 Sep 12 17:13:21.753584 containerd[1474]: time="2025-09-12T17:13:21.753572120Z" level=info msg="loading plugin \"io.containerd.service.v1.namespaces-service\"..." type=io.containerd.service.v1 Sep 12 17:13:21.753703 containerd[1474]: time="2025-09-12T17:13:21.753686000Z" level=info msg="loading plugin \"io.containerd.service.v1.snapshots-service\"..." type=io.containerd.service.v1 Sep 12 17:13:21.753775 containerd[1474]: time="2025-09-12T17:13:21.753761200Z" level=info msg="loading plugin \"io.containerd.service.v1.tasks-service\"..." type=io.containerd.service.v1 Sep 12 17:13:21.753855 containerd[1474]: time="2025-09-12T17:13:21.753842800Z" level=info msg="loading plugin \"io.containerd.grpc.v1.containers\"..." type=io.containerd.grpc.v1 Sep 12 17:13:21.754421 containerd[1474]: time="2025-09-12T17:13:21.753935480Z" level=info msg="loading plugin \"io.containerd.grpc.v1.content\"..." type=io.containerd.grpc.v1 Sep 12 17:13:21.754421 containerd[1474]: time="2025-09-12T17:13:21.753970040Z" level=info msg="loading plugin \"io.containerd.grpc.v1.diff\"..." type=io.containerd.grpc.v1 Sep 12 17:13:21.754421 containerd[1474]: time="2025-09-12T17:13:21.753987400Z" level=info msg="loading plugin \"io.containerd.grpc.v1.events\"..." type=io.containerd.grpc.v1 Sep 12 17:13:21.754421 containerd[1474]: time="2025-09-12T17:13:21.754000840Z" level=info msg="loading plugin \"io.containerd.grpc.v1.images\"..." type=io.containerd.grpc.v1 Sep 12 17:13:21.754421 containerd[1474]: time="2025-09-12T17:13:21.754015000Z" level=info msg="loading plugin \"io.containerd.grpc.v1.introspection\"..." 
type=io.containerd.grpc.v1 Sep 12 17:13:21.754421 containerd[1474]: time="2025-09-12T17:13:21.754032560Z" level=info msg="loading plugin \"io.containerd.grpc.v1.leases\"..." type=io.containerd.grpc.v1 Sep 12 17:13:21.754421 containerd[1474]: time="2025-09-12T17:13:21.754100960Z" level=info msg="loading plugin \"io.containerd.grpc.v1.namespaces\"..." type=io.containerd.grpc.v1 Sep 12 17:13:21.754421 containerd[1474]: time="2025-09-12T17:13:21.754118240Z" level=info msg="loading plugin \"io.containerd.grpc.v1.sandbox-controllers\"..." type=io.containerd.grpc.v1 Sep 12 17:13:21.754421 containerd[1474]: time="2025-09-12T17:13:21.754134720Z" level=info msg="loading plugin \"io.containerd.grpc.v1.sandboxes\"..." type=io.containerd.grpc.v1 Sep 12 17:13:21.754421 containerd[1474]: time="2025-09-12T17:13:21.754149200Z" level=info msg="loading plugin \"io.containerd.grpc.v1.snapshots\"..." type=io.containerd.grpc.v1 Sep 12 17:13:21.754421 containerd[1474]: time="2025-09-12T17:13:21.754164120Z" level=info msg="loading plugin \"io.containerd.grpc.v1.streaming\"..." type=io.containerd.grpc.v1 Sep 12 17:13:21.754421 containerd[1474]: time="2025-09-12T17:13:21.754180080Z" level=info msg="loading plugin \"io.containerd.grpc.v1.tasks\"..." type=io.containerd.grpc.v1 Sep 12 17:13:21.754421 containerd[1474]: time="2025-09-12T17:13:21.754197760Z" level=info msg="loading plugin \"io.containerd.transfer.v1.local\"..." type=io.containerd.transfer.v1 Sep 12 17:13:21.754421 containerd[1474]: time="2025-09-12T17:13:21.754227080Z" level=info msg="loading plugin \"io.containerd.grpc.v1.transfer\"..." type=io.containerd.grpc.v1 Sep 12 17:13:21.754421 containerd[1474]: time="2025-09-12T17:13:21.754272280Z" level=info msg="loading plugin \"io.containerd.grpc.v1.version\"..." type=io.containerd.grpc.v1 Sep 12 17:13:21.754741 containerd[1474]: time="2025-09-12T17:13:21.754285760Z" level=info msg="loading plugin \"io.containerd.internal.v1.restart\"..." type=io.containerd.internal.v1 Sep 12 17:13:21.754859 containerd[1474]: time="2025-09-12T17:13:21.754840640Z" level=info msg="loading plugin \"io.containerd.tracing.processor.v1.otlp\"..." type=io.containerd.tracing.processor.v1 Sep 12 17:13:21.755135 containerd[1474]: time="2025-09-12T17:13:21.755112880Z" level=info msg="skip loading plugin \"io.containerd.tracing.processor.v1.otlp\"..." error="skip plugin: tracing endpoint not configured" type=io.containerd.tracing.processor.v1 Sep 12 17:13:21.755195 containerd[1474]: time="2025-09-12T17:13:21.755181680Z" level=info msg="loading plugin \"io.containerd.internal.v1.tracing\"..." type=io.containerd.internal.v1 Sep 12 17:13:21.755263 containerd[1474]: time="2025-09-12T17:13:21.755248760Z" level=info msg="skip loading plugin \"io.containerd.internal.v1.tracing\"..." error="skip plugin: tracing endpoint not configured" type=io.containerd.internal.v1 Sep 12 17:13:21.755324 containerd[1474]: time="2025-09-12T17:13:21.755311560Z" level=info msg="loading plugin \"io.containerd.grpc.v1.healthcheck\"..." type=io.containerd.grpc.v1 Sep 12 17:13:21.755387 containerd[1474]: time="2025-09-12T17:13:21.755374960Z" level=info msg="loading plugin \"io.containerd.nri.v1.nri\"..." type=io.containerd.nri.v1 Sep 12 17:13:21.755443 containerd[1474]: time="2025-09-12T17:13:21.755431360Z" level=info msg="NRI interface is disabled by configuration." Sep 12 17:13:21.755497 containerd[1474]: time="2025-09-12T17:13:21.755485840Z" level=info msg="loading plugin \"io.containerd.grpc.v1.cri\"..." 
type=io.containerd.grpc.v1 Sep 12 17:13:21.756705 containerd[1474]: time="2025-09-12T17:13:21.755969200Z" level=info msg="Start cri plugin with config {PluginConfig:{ContainerdConfig:{Snapshotter:overlayfs DefaultRuntimeName:runc DefaultRuntime:{Type: Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:} UntrustedWorkloadRuntime:{Type: Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:} Runtimes:map[runc:{Type:io.containerd.runc.v2 Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[SystemdCgroup:true] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:podsandbox}] NoPivot:false DisableSnapshotAnnotations:true DiscardUnpackedLayers:false IgnoreBlockIONotEnabledErrors:false IgnoreRdtNotEnabledErrors:false} CniConfig:{NetworkPluginBinDir:/opt/cni/bin NetworkPluginConfDir:/etc/cni/net.d NetworkPluginMaxConfNum:1 NetworkPluginSetupSerially:false NetworkPluginConfTemplate: IPPreference:} Registry:{ConfigPath: Mirrors:map[] Configs:map[] Auths:map[] Headers:map[]} ImageDecryption:{KeyModel:node} DisableTCPService:true StreamServerAddress:127.0.0.1 StreamServerPort:0 StreamIdleTimeout:4h0m0s EnableSelinux:true SelinuxCategoryRange:1024 SandboxImage:registry.k8s.io/pause:3.8 StatsCollectPeriod:10 SystemdCgroup:false EnableTLSStreaming:false X509KeyPairStreaming:{TLSCertFile: TLSKeyFile:} MaxContainerLogLineSize:16384 DisableCgroup:false DisableApparmor:false RestrictOOMScoreAdj:false MaxConcurrentDownloads:3 DisableProcMount:false UnsetSeccompProfile: TolerateMissingHugetlbController:true DisableHugetlbController:true DeviceOwnershipFromSecurityContext:false IgnoreImageDefinedVolumes:false NetNSMountsUnderStateDir:false EnableUnprivilegedPorts:false EnableUnprivilegedICMP:false EnableCDI:false CDISpecDirs:[/etc/cdi /var/run/cdi] ImagePullProgressTimeout:5m0s DrainExecSyncIOTimeout:0s ImagePullWithSyncFs:false IgnoreDeprecationWarnings:[]} ContainerdRootDir:/var/lib/containerd ContainerdEndpoint:/run/containerd/containerd.sock RootDir:/var/lib/containerd/io.containerd.grpc.v1.cri StateDir:/run/containerd/io.containerd.grpc.v1.cri}" Sep 12 17:13:21.756705 containerd[1474]: time="2025-09-12T17:13:21.756037320Z" level=info msg="Connect containerd service" Sep 12 17:13:21.756705 containerd[1474]: time="2025-09-12T17:13:21.756133360Z" level=info msg="using legacy CRI server" Sep 12 17:13:21.756705 containerd[1474]: time="2025-09-12T17:13:21.756144240Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Sep 12 17:13:21.756705 containerd[1474]: time="2025-09-12T17:13:21.756245280Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\"" Sep 12 17:13:21.757592 containerd[1474]: time="2025-09-12T17:13:21.757566960Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Sep 12 17:13:21.757901 
containerd[1474]: time="2025-09-12T17:13:21.757860720Z" level=info msg="Start subscribing containerd event" Sep 12 17:13:21.758098 containerd[1474]: time="2025-09-12T17:13:21.758034360Z" level=info msg="Start recovering state" Sep 12 17:13:21.758354 containerd[1474]: time="2025-09-12T17:13:21.758336200Z" level=info msg="Start event monitor" Sep 12 17:13:21.758529 containerd[1474]: time="2025-09-12T17:13:21.758512720Z" level=info msg="Start snapshots syncer" Sep 12 17:13:21.758588 containerd[1474]: time="2025-09-12T17:13:21.758576840Z" level=info msg="Start cni network conf syncer for default" Sep 12 17:13:21.758643 containerd[1474]: time="2025-09-12T17:13:21.758631800Z" level=info msg="Start streaming server" Sep 12 17:13:21.759614 containerd[1474]: time="2025-09-12T17:13:21.759592360Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Sep 12 17:13:21.759757 containerd[1474]: time="2025-09-12T17:13:21.759734560Z" level=info msg=serving... address=/run/containerd/containerd.sock Sep 12 17:13:21.760012 systemd[1]: Started containerd.service - containerd container runtime. Sep 12 17:13:21.763176 containerd[1474]: time="2025-09-12T17:13:21.763139480Z" level=info msg="containerd successfully booted in 0.083233s" Sep 12 17:13:22.050775 tar[1457]: linux-arm64/README.md Sep 12 17:13:22.062557 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. Sep 12 17:13:22.097795 sshd_keygen[1465]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Sep 12 17:13:22.126978 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. Sep 12 17:13:22.136629 systemd[1]: Starting issuegen.service - Generate /run/issue... Sep 12 17:13:22.147773 systemd[1]: issuegen.service: Deactivated successfully. Sep 12 17:13:22.148011 systemd[1]: Finished issuegen.service - Generate /run/issue. Sep 12 17:13:22.156687 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... Sep 12 17:13:22.169459 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. Sep 12 17:13:22.180664 systemd[1]: Started getty@tty1.service - Getty on tty1. Sep 12 17:13:22.184619 systemd[1]: Started serial-getty@ttyAMA0.service - Serial Getty on ttyAMA0. Sep 12 17:13:22.185534 systemd[1]: Reached target getty.target - Login Prompts. Sep 12 17:13:22.349302 systemd-networkd[1374]: eth1: Gained IPv6LL Sep 12 17:13:22.350594 systemd-timesyncd[1348]: Network configuration changed, trying to establish connection. Sep 12 17:13:22.356227 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Sep 12 17:13:22.358328 systemd[1]: Reached target network-online.target - Network is Online. Sep 12 17:13:22.368503 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 12 17:13:22.372873 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Sep 12 17:13:22.407929 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. Sep 12 17:13:22.605462 systemd-networkd[1374]: eth0: Gained IPv6LL Sep 12 17:13:22.606171 systemd-timesyncd[1348]: Network configuration changed, trying to establish connection. Sep 12 17:13:23.216029 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 12 17:13:23.218139 systemd[1]: Reached target multi-user.target - Multi-User System. Sep 12 17:13:23.225215 systemd[1]: Startup finished in 837ms (kernel) + 5.675s (initrd) + 4.890s (userspace) = 11.403s. 
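[Annotation] The CRI config dump above records the settings containerd booted with: the overlayfs snapshotter, runc via io.containerd.runc.v2 with SystemdCgroup:true, and pause:3.8 as the sandbox image. Expressed as a config file, those options look roughly like this (a sketch, not the file actually on this host):

    # /etc/containerd/config.toml
    version = 2

    [plugins."io.containerd.grpc.v1.cri".containerd]
      snapshotter = "overlayfs"
      default_runtime_name = "runc"

    [plugins."io.containerd.grpc.v1.cri".containerd.runtimes.runc]
      runtime_type = "io.containerd.runc.v2"

    [plugins."io.containerd.grpc.v1.cri".containerd.runtimes.runc.options]
      SystemdCgroup = true

The 'failed to load cni during init' error is likewise expected at this stage: nothing has written a network config into /etc/cni/net.d yet, and the CRI plugin keeps retrying until a CNI plugin (typically installed later by Kubernetes) drops one there.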
Sep 12 17:13:23.226616 (kubelet)[1571]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 12 17:13:23.730901 kubelet[1571]: E0912 17:13:23.730821 1571 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 12 17:13:23.735777 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 12 17:13:23.735986 systemd[1]: kubelet.service: Failed with result 'exit-code'. Sep 12 17:13:33.841852 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. Sep 12 17:13:33.853422 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 12 17:13:33.999433 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 12 17:13:33.999610 (kubelet)[1590]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 12 17:13:34.056607 kubelet[1590]: E0912 17:13:34.056551 1590 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 12 17:13:34.062681 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 12 17:13:34.063009 systemd[1]: kubelet.service: Failed with result 'exit-code'. Sep 12 17:13:36.981510 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Sep 12 17:13:36.986436 systemd[1]: Started sshd@0-49.13.6.100:22-139.178.89.65:33536.service - OpenSSH per-connection server daemon (139.178.89.65:33536). Sep 12 17:13:37.978033 sshd[1598]: Accepted publickey for core from 139.178.89.65 port 33536 ssh2: RSA SHA256:1SAgMiZlCHBr0Vs456OmR0PXIyT7CPtESXBBc/039go Sep 12 17:13:37.981363 sshd[1598]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 17:13:37.999593 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Sep 12 17:13:38.000144 systemd-logind[1452]: New session 1 of user core. Sep 12 17:13:38.012045 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Sep 12 17:13:38.027681 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Sep 12 17:13:38.040688 systemd[1]: Starting user@500.service - User Manager for UID 500... Sep 12 17:13:38.048853 (systemd)[1602]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) Sep 12 17:13:38.168654 systemd[1602]: Queued start job for default target default.target. Sep 12 17:13:38.178061 systemd[1602]: Created slice app.slice - User Application Slice. Sep 12 17:13:38.178119 systemd[1602]: Reached target paths.target - Paths. Sep 12 17:13:38.178134 systemd[1602]: Reached target timers.target - Timers. Sep 12 17:13:38.179838 systemd[1602]: Starting dbus.socket - D-Bus User Message Bus Socket... Sep 12 17:13:38.209008 systemd[1602]: Listening on dbus.socket - D-Bus User Message Bus Socket. Sep 12 17:13:38.209318 systemd[1602]: Reached target sockets.target - Sockets. 
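[Annotation] The kubelet crash loop above is the normal pre-bootstrap state: kubelet.service starts, finds no /var/lib/kubelet/config.yaml, exits with status 1, and systemd's restart logic retries (the climbing 'restart counter' messages). That file is ordinarily written by kubeadm during init/join; a minimal hand-written sketch of its shape would be:

    # /var/lib/kubelet/config.yaml (normally generated by kubeadm)
    apiVersion: kubelet.config.k8s.io/v1beta1
    kind: KubeletConfiguration
    cgroupDriver: systemd    # consistent with SystemdCgroup=true in containerd above

Until kubeadm or an equivalent provisioner runs, the loop is harmless and the unit simply keeps cycling at its configured restart interval.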
Sep 12 17:13:38.209337 systemd[1602]: Reached target basic.target - Basic System. Sep 12 17:13:38.209441 systemd[1602]: Reached target default.target - Main User Target. Sep 12 17:13:38.209478 systemd[1602]: Startup finished in 151ms. Sep 12 17:13:38.210007 systemd[1]: Started user@500.service - User Manager for UID 500. Sep 12 17:13:38.221519 systemd[1]: Started session-1.scope - Session 1 of User core. Sep 12 17:13:38.912032 systemd[1]: Started sshd@1-49.13.6.100:22-139.178.89.65:33550.service - OpenSSH per-connection server daemon (139.178.89.65:33550). Sep 12 17:13:39.897974 sshd[1613]: Accepted publickey for core from 139.178.89.65 port 33550 ssh2: RSA SHA256:1SAgMiZlCHBr0Vs456OmR0PXIyT7CPtESXBBc/039go Sep 12 17:13:39.900174 sshd[1613]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 17:13:39.906824 systemd-logind[1452]: New session 2 of user core. Sep 12 17:13:39.912447 systemd[1]: Started session-2.scope - Session 2 of User core. Sep 12 17:13:40.587118 sshd[1613]: pam_unix(sshd:session): session closed for user core Sep 12 17:13:40.593179 systemd-logind[1452]: Session 2 logged out. Waiting for processes to exit. Sep 12 17:13:40.594268 systemd[1]: sshd@1-49.13.6.100:22-139.178.89.65:33550.service: Deactivated successfully. Sep 12 17:13:40.598038 systemd[1]: session-2.scope: Deactivated successfully. Sep 12 17:13:40.599369 systemd-logind[1452]: Removed session 2. Sep 12 17:13:40.770545 systemd[1]: Started sshd@2-49.13.6.100:22-139.178.89.65:50992.service - OpenSSH per-connection server daemon (139.178.89.65:50992). Sep 12 17:13:41.746697 sshd[1620]: Accepted publickey for core from 139.178.89.65 port 50992 ssh2: RSA SHA256:1SAgMiZlCHBr0Vs456OmR0PXIyT7CPtESXBBc/039go Sep 12 17:13:41.749248 sshd[1620]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 17:13:41.755335 systemd-logind[1452]: New session 3 of user core. Sep 12 17:13:41.760882 systemd[1]: Started session-3.scope - Session 3 of User core. Sep 12 17:13:42.425540 sshd[1620]: pam_unix(sshd:session): session closed for user core Sep 12 17:13:42.429801 systemd[1]: sshd@2-49.13.6.100:22-139.178.89.65:50992.service: Deactivated successfully. Sep 12 17:13:42.431744 systemd[1]: session-3.scope: Deactivated successfully. Sep 12 17:13:42.434492 systemd-logind[1452]: Session 3 logged out. Waiting for processes to exit. Sep 12 17:13:42.436113 systemd-logind[1452]: Removed session 3. Sep 12 17:13:42.595642 systemd[1]: Started sshd@3-49.13.6.100:22-139.178.89.65:51002.service - OpenSSH per-connection server daemon (139.178.89.65:51002). Sep 12 17:13:43.577212 sshd[1627]: Accepted publickey for core from 139.178.89.65 port 51002 ssh2: RSA SHA256:1SAgMiZlCHBr0Vs456OmR0PXIyT7CPtESXBBc/039go Sep 12 17:13:43.579707 sshd[1627]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 17:13:43.586179 systemd-logind[1452]: New session 4 of user core. Sep 12 17:13:43.595434 systemd[1]: Started session-4.scope - Session 4 of User core. Sep 12 17:13:44.091328 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. Sep 12 17:13:44.101054 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 12 17:13:44.242365 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
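[Annotation] The user@500.service startup above is systemd spawning a dedicated per-user manager for the 'core' login; that instance reaches its own default.target in 151ms before session-1.scope opens. Session and user-manager state can be inspected with loginctl, e.g.:

    loginctl list-sessions
    loginctl show-user core
    loginctl enable-linger core   # the usual way to keep a user manager alive
                                  # without an open session (not used in this boot)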
Sep 12 17:13:44.246101 (kubelet)[1639]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 12 17:13:44.260409 sshd[1627]: pam_unix(sshd:session): session closed for user core Sep 12 17:13:44.267152 systemd[1]: sshd@3-49.13.6.100:22-139.178.89.65:51002.service: Deactivated successfully. Sep 12 17:13:44.270982 systemd[1]: session-4.scope: Deactivated successfully. Sep 12 17:13:44.274733 systemd-logind[1452]: Session 4 logged out. Waiting for processes to exit. Sep 12 17:13:44.278342 systemd-logind[1452]: Removed session 4. Sep 12 17:13:44.302822 kubelet[1639]: E0912 17:13:44.302729 1639 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 12 17:13:44.306016 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 12 17:13:44.306298 systemd[1]: kubelet.service: Failed with result 'exit-code'. Sep 12 17:13:44.435833 systemd[1]: Started sshd@4-49.13.6.100:22-139.178.89.65:51010.service - OpenSSH per-connection server daemon (139.178.89.65:51010). Sep 12 17:13:45.407810 sshd[1649]: Accepted publickey for core from 139.178.89.65 port 51010 ssh2: RSA SHA256:1SAgMiZlCHBr0Vs456OmR0PXIyT7CPtESXBBc/039go Sep 12 17:13:45.410171 sshd[1649]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 17:13:45.415607 systemd-logind[1452]: New session 5 of user core. Sep 12 17:13:45.423486 systemd[1]: Started session-5.scope - Session 5 of User core. Sep 12 17:13:45.939415 sudo[1652]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Sep 12 17:13:45.939747 sudo[1652]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 12 17:13:45.961188 sudo[1652]: pam_unix(sudo:session): session closed for user root Sep 12 17:13:46.121263 sshd[1649]: pam_unix(sshd:session): session closed for user core Sep 12 17:13:46.126805 systemd[1]: sshd@4-49.13.6.100:22-139.178.89.65:51010.service: Deactivated successfully. Sep 12 17:13:46.129712 systemd[1]: session-5.scope: Deactivated successfully. Sep 12 17:13:46.132372 systemd-logind[1452]: Session 5 logged out. Waiting for processes to exit. Sep 12 17:13:46.134215 systemd-logind[1452]: Removed session 5. Sep 12 17:13:46.311659 systemd[1]: Started sshd@5-49.13.6.100:22-139.178.89.65:51012.service - OpenSSH per-connection server daemon (139.178.89.65:51012). Sep 12 17:13:47.363154 sshd[1657]: Accepted publickey for core from 139.178.89.65 port 51012 ssh2: RSA SHA256:1SAgMiZlCHBr0Vs456OmR0PXIyT7CPtESXBBc/039go Sep 12 17:13:47.366061 sshd[1657]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 17:13:47.373171 systemd-logind[1452]: New session 6 of user core. Sep 12 17:13:47.376282 systemd[1]: Started session-6.scope - Session 6 of User core. 
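[Annotation] The 'Referenced but unset environment variable' notes for KUBELET_EXTRA_ARGS and KUBELET_KUBEADM_ARGS come from the kubelet unit expanding variables that no EnvironmentFile has populated yet. The kubeadm-style drop-in that wires these up looks approximately like this (a sketch of the common upstream drop-in, not necessarily this host's file):

    # /etc/systemd/system/kubelet.service.d/10-kubeadm.conf
    [Service]
    EnvironmentFile=-/var/lib/kubelet/kubeadm-flags.env
    EnvironmentFile=-/etc/default/kubelet
    ExecStart=
    ExecStart=/usr/bin/kubelet $KUBELET_KUBEADM_ARGS $KUBELET_EXTRA_ARGS

The leading '-' makes both environment files optional, so an empty expansion is logged as informational rather than as an error.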
Sep 12 17:13:47.926048 sudo[1661]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/bin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Sep 12 17:13:47.926413 sudo[1661]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 12 17:13:47.930465 sudo[1661]: pam_unix(sudo:session): session closed for user root Sep 12 17:13:47.937509 sudo[1660]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/bin/systemctl restart audit-rules Sep 12 17:13:47.937790 sudo[1660]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 12 17:13:47.958653 systemd[1]: Stopping audit-rules.service - Load Security Auditing Rules... Sep 12 17:13:47.960357 auditctl[1664]: No rules Sep 12 17:13:47.960727 systemd[1]: audit-rules.service: Deactivated successfully. Sep 12 17:13:47.960930 systemd[1]: Stopped audit-rules.service - Load Security Auditing Rules. Sep 12 17:13:47.964492 systemd[1]: Starting audit-rules.service - Load Security Auditing Rules... Sep 12 17:13:48.016108 augenrules[1682]: No rules Sep 12 17:13:48.018278 systemd[1]: Finished audit-rules.service - Load Security Auditing Rules. Sep 12 17:13:48.021588 sudo[1660]: pam_unix(sudo:session): session closed for user root Sep 12 17:13:48.193468 sshd[1657]: pam_unix(sshd:session): session closed for user core Sep 12 17:13:48.199870 systemd[1]: sshd@5-49.13.6.100:22-139.178.89.65:51012.service: Deactivated successfully. Sep 12 17:13:48.202723 systemd[1]: session-6.scope: Deactivated successfully. Sep 12 17:13:48.205006 systemd-logind[1452]: Session 6 logged out. Waiting for processes to exit. Sep 12 17:13:48.206517 systemd-logind[1452]: Removed session 6. Sep 12 17:13:48.370104 systemd[1]: Started sshd@6-49.13.6.100:22-139.178.89.65:51022.service - OpenSSH per-connection server daemon (139.178.89.65:51022). Sep 12 17:13:49.346407 sshd[1690]: Accepted publickey for core from 139.178.89.65 port 51022 ssh2: RSA SHA256:1SAgMiZlCHBr0Vs456OmR0PXIyT7CPtESXBBc/039go Sep 12 17:13:49.348636 sshd[1690]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 17:13:49.353475 systemd-logind[1452]: New session 7 of user core. Sep 12 17:13:49.364439 systemd[1]: Started session-7.scope - Session 7 of User core. Sep 12 17:13:49.871510 sudo[1693]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Sep 12 17:13:49.871813 sudo[1693]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 12 17:13:50.202601 systemd[1]: Starting docker.service - Docker Application Container Engine... Sep 12 17:13:50.202611 (dockerd)[1708]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU Sep 12 17:13:50.475851 dockerd[1708]: time="2025-09-12T17:13:50.475609360Z" level=info msg="Starting up" Sep 12 17:13:52.361585 dockerd[1708]: time="2025-09-12T17:13:52.361499040Z" level=info msg="Loading containers: start." Sep 12 17:13:52.970482 systemd-timesyncd[1348]: Contacted time server 176.9.44.212:123 (2.flatcar.pool.ntp.org). Sep 12 17:13:52.970569 systemd-timesyncd[1348]: Initial clock synchronization to Fri 2025-09-12 17:13:52.860867 UTC. Sep 12 17:13:53.408085 kernel: Initializing XFRM netlink socket Sep 12 17:13:53.497707 systemd-networkd[1374]: docker0: Link UP Sep 12 17:13:53.538761 dockerd[1708]: time="2025-09-12T17:13:53.538691639Z" level=info msg="Loading containers: done." 
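[Annotation] The sudo commands above delete the shipped audit rule files and restart audit-rules.service, after which both auditctl and augenrules report 'No rules'. For reference, rules normally live as one fragment per concern under /etc/audit/rules.d/ (example rule hypothetical):

    # /etc/audit/rules.d/10-identity.rules
    -w /etc/passwd -p wa -k identity

    # compile and load all fragments under rules.d:
    augenrules --load

augenrules concatenates the rules.d fragments into /etc/audit/audit.rules before handing them to auditctl, which is why removing the fragments yields an empty rule set.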
Sep 12 17:13:53.557906 systemd[1]: var-lib-docker-overlay2-opaque\x2dbug\x2dcheck360174450-merged.mount: Deactivated successfully. Sep 12 17:13:53.577527 dockerd[1708]: time="2025-09-12T17:13:53.577412384Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Sep 12 17:13:53.577798 dockerd[1708]: time="2025-09-12T17:13:53.577623070Z" level=info msg="Docker daemon" commit=061aa95809be396a6b5542618d8a34b02a21ff77 containerd-snapshotter=false storage-driver=overlay2 version=26.1.0 Sep 12 17:13:53.577881 dockerd[1708]: time="2025-09-12T17:13:53.577856363Z" level=info msg="Daemon has completed initialization" Sep 12 17:13:53.675269 dockerd[1708]: time="2025-09-12T17:13:53.674875777Z" level=info msg="API listen on /run/docker.sock" Sep 12 17:13:53.675394 systemd[1]: Started docker.service - Docker Application Container Engine. Sep 12 17:13:54.341914 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3. Sep 12 17:13:54.361131 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 12 17:13:54.510359 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 12 17:13:54.521736 (kubelet)[1854]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 12 17:13:54.578431 kubelet[1854]: E0912 17:13:54.578318 1854 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 12 17:13:54.583417 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 12 17:13:54.584128 systemd[1]: kubelet.service: Failed with result 'exit-code'. Sep 12 17:13:54.877706 containerd[1474]: time="2025-09-12T17:13:54.877247143Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.32.9\"" Sep 12 17:13:55.511478 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1099391241.mount: Deactivated successfully. 
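[Annotation] dockerd comes up on overlay2 and warns that native diff is disabled because the kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled; per the message itself this only degrades image-build performance, not correctness. The auto-selected storage driver can be pinned explicitly if desired (sketch):

    # /etc/docker/daemon.json
    {
      "storage-driver": "overlay2"
    }

    # confirm the running driver:
    docker info --format '{{.Driver}}'

The closing 'API listen on /run/docker.sock' line pairs with the earlier docker.socket rewrite of the legacy /var/run path to /run.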
Sep 12 17:13:56.422839 containerd[1474]: time="2025-09-12T17:13:56.422775801Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.32.9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 17:13:56.424390 containerd[1474]: time="2025-09-12T17:13:56.424336959Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.32.9: active requests=0, bytes read=26363783"
Sep 12 17:13:56.426098 containerd[1474]: time="2025-09-12T17:13:56.424926917Z" level=info msg="ImageCreate event name:\"sha256:02ea53851f07db91ed471dab1ab11541f5c294802371cd8f0cfd423cd5c71002\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 17:13:56.429117 containerd[1474]: time="2025-09-12T17:13:56.428785004Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:6df11cc2ad9679b1117be34d3a0230add88bc0a08fd7a3ebc26b680575e8de97\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 17:13:56.431842 containerd[1474]: time="2025-09-12T17:13:56.430299671Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.32.9\" with image id \"sha256:02ea53851f07db91ed471dab1ab11541f5c294802371cd8f0cfd423cd5c71002\", repo tag \"registry.k8s.io/kube-apiserver:v1.32.9\", repo digest \"registry.k8s.io/kube-apiserver@sha256:6df11cc2ad9679b1117be34d3a0230add88bc0a08fd7a3ebc26b680575e8de97\", size \"26360284\" in 1.552990915s"
Sep 12 17:13:56.431842 containerd[1474]: time="2025-09-12T17:13:56.430371409Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.32.9\" returns image reference \"sha256:02ea53851f07db91ed471dab1ab11541f5c294802371cd8f0cfd423cd5c71002\""
Sep 12 17:13:56.432359 containerd[1474]: time="2025-09-12T17:13:56.432303068Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.32.9\""
Sep 12 17:13:57.547313 containerd[1474]: time="2025-09-12T17:13:57.547215389Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.32.9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 17:13:57.549346 containerd[1474]: time="2025-09-12T17:13:57.549288355Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.32.9: active requests=0, bytes read=22531220"
Sep 12 17:13:57.550455 containerd[1474]: time="2025-09-12T17:13:57.549904864Z" level=info msg="ImageCreate event name:\"sha256:f0bcbad5082c944520b370596a2384affda710b9d7daf84e8a48352699af8e4b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 17:13:57.553732 containerd[1474]: time="2025-09-12T17:13:57.553678160Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:243c4b8e3bce271fcb1b78008ab996ab6976b1a20096deac08338fcd17979922\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 17:13:57.555081 containerd[1474]: time="2025-09-12T17:13:57.555012640Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.32.9\" with image id \"sha256:f0bcbad5082c944520b370596a2384affda710b9d7daf84e8a48352699af8e4b\", repo tag \"registry.k8s.io/kube-controller-manager:v1.32.9\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:243c4b8e3bce271fcb1b78008ab996ab6976b1a20096deac08338fcd17979922\", size \"24099975\" in 1.12254793s"
Sep 12 17:13:57.555172 containerd[1474]: time="2025-09-12T17:13:57.555088151Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.32.9\" returns image reference \"sha256:f0bcbad5082c944520b370596a2384affda710b9d7daf84e8a48352699af8e4b\""
Sep 12 17:13:57.555711 containerd[1474]: time="2025-09-12T17:13:57.555651965Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.32.9\""
Sep 12 17:13:58.543124 containerd[1474]: time="2025-09-12T17:13:58.542632273Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.32.9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 17:13:58.544244 containerd[1474]: time="2025-09-12T17:13:58.544188584Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.32.9: active requests=0, bytes read=17484344"
Sep 12 17:13:58.546004 containerd[1474]: time="2025-09-12T17:13:58.545908422Z" level=info msg="ImageCreate event name:\"sha256:1d625baf81b59592006d97a6741bc947698ed222b612ac10efa57b7aa96d2a27\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 17:13:58.551632 containerd[1474]: time="2025-09-12T17:13:58.551561460Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:50c49520dbd0e8b4076b6a5c77d8014df09ea3d59a73e8bafd2678d51ebb92d5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 17:13:58.553078 containerd[1474]: time="2025-09-12T17:13:58.553005800Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.32.9\" with image id \"sha256:1d625baf81b59592006d97a6741bc947698ed222b612ac10efa57b7aa96d2a27\", repo tag \"registry.k8s.io/kube-scheduler:v1.32.9\", repo digest \"registry.k8s.io/kube-scheduler@sha256:50c49520dbd0e8b4076b6a5c77d8014df09ea3d59a73e8bafd2678d51ebb92d5\", size \"19053117\" in 997.311961ms"
Sep 12 17:13:58.553636 containerd[1474]: time="2025-09-12T17:13:58.553579477Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.32.9\" returns image reference \"sha256:1d625baf81b59592006d97a6741bc947698ed222b612ac10efa57b7aa96d2a27\""
Sep 12 17:13:58.554398 containerd[1474]: time="2025-09-12T17:13:58.554363868Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.32.9\""
Sep 12 17:13:59.522987 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3633283678.mount: Deactivated successfully.
Sep 12 17:13:59.854338 containerd[1474]: time="2025-09-12T17:13:59.854214260Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.32.9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 17:13:59.856161 containerd[1474]: time="2025-09-12T17:13:59.856090499Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.32.9: active requests=0, bytes read=27417843"
Sep 12 17:13:59.857416 containerd[1474]: time="2025-09-12T17:13:59.857326484Z" level=info msg="ImageCreate event name:\"sha256:72b57ec14d31e8422925ef4c3eff44822cdc04a11fd30d13824f1897d83a16d4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 17:13:59.859674 containerd[1474]: time="2025-09-12T17:13:59.859572564Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:886af02535dc34886e4618b902f8c140d89af57233a245621d29642224516064\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 17:13:59.861099 containerd[1474]: time="2025-09-12T17:13:59.860336935Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.32.9\" with image id \"sha256:72b57ec14d31e8422925ef4c3eff44822cdc04a11fd30d13824f1897d83a16d4\", repo tag \"registry.k8s.io/kube-proxy:v1.32.9\", repo digest \"registry.k8s.io/kube-proxy@sha256:886af02535dc34886e4618b902f8c140d89af57233a245621d29642224516064\", size \"27416836\" in 1.305929083s"
Sep 12 17:13:59.861099 containerd[1474]: time="2025-09-12T17:13:59.860388139Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.32.9\" returns image reference \"sha256:72b57ec14d31e8422925ef4c3eff44822cdc04a11fd30d13824f1897d83a16d4\""
Sep 12 17:13:59.861979 containerd[1474]: time="2025-09-12T17:13:59.861668649Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\""
Sep 12 17:14:00.466509 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2661280673.mount: Deactivated successfully.
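Each "Pulled image ... in <duration>" line above prints a Go time.Duration, which is why the unit flips between seconds ("1.305929083s") and milliseconds ("997.311961ms") depending on magnitude. A small sketch of that formatting, independent of containerd internals:

package main

import (
	"fmt"
	"time"
)

func main() {
	start := time.Now()
	time.Sleep(25 * time.Millisecond) // stand-in for an actual image pull
	// time.Duration's String() picks the unit automatically, producing
	// values like "997.311961ms" or "1.552990915s" as seen in the log.
	fmt.Printf("Pulled image in %s\n", time.Since(start))
}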
Sep 12 17:14:01.278473 containerd[1474]: time="2025-09-12T17:14:01.278393878Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.11.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 17:14:01.279870 containerd[1474]: time="2025-09-12T17:14:01.279666444Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.11.3: active requests=0, bytes read=16951714"
Sep 12 17:14:01.281268 containerd[1474]: time="2025-09-12T17:14:01.281180870Z" level=info msg="ImageCreate event name:\"sha256:2f6c962e7b8311337352d9fdea917da2184d9919f4da7695bc2a6517cf392fe4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 17:14:01.288266 containerd[1474]: time="2025-09-12T17:14:01.288170862Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 17:14:01.289858 containerd[1474]: time="2025-09-12T17:14:01.289778410Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.11.3\" with image id \"sha256:2f6c962e7b8311337352d9fdea917da2184d9919f4da7695bc2a6517cf392fe4\", repo tag \"registry.k8s.io/coredns/coredns:v1.11.3\", repo digest \"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\", size \"16948420\" in 1.42805654s"
Sep 12 17:14:01.289858 containerd[1474]: time="2025-09-12T17:14:01.289857796Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\" returns image reference \"sha256:2f6c962e7b8311337352d9fdea917da2184d9919f4da7695bc2a6517cf392fe4\""
Sep 12 17:14:01.291080 containerd[1474]: time="2025-09-12T17:14:01.290791478Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\""
Sep 12 17:14:01.803299 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3314981995.mount: Deactivated successfully.
Sep 12 17:14:01.810239 containerd[1474]: time="2025-09-12T17:14:01.810175249Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 17:14:01.812308 containerd[1474]: time="2025-09-12T17:14:01.812255091Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=268723"
Sep 12 17:14:01.813258 containerd[1474]: time="2025-09-12T17:14:01.813201115Z" level=info msg="ImageCreate event name:\"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 17:14:01.817246 containerd[1474]: time="2025-09-12T17:14:01.817172408Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 17:14:01.819108 containerd[1474]: time="2025-09-12T17:14:01.818752962Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"267933\" in 527.916457ms"
Sep 12 17:14:01.819108 containerd[1474]: time="2025-09-12T17:14:01.818816224Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\""
Sep 12 17:14:01.819625 containerd[1474]: time="2025-09-12T17:14:01.819556537Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.16-0\""
Sep 12 17:14:02.415141 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2179250729.mount: Deactivated successfully.
Sep 12 17:14:03.894038 containerd[1474]: time="2025-09-12T17:14:03.893943386Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.16-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 17:14:03.896105 containerd[1474]: time="2025-09-12T17:14:03.895672636Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.16-0: active requests=0, bytes read=67943239"
Sep 12 17:14:03.897471 containerd[1474]: time="2025-09-12T17:14:03.897401328Z" level=info msg="ImageCreate event name:\"sha256:7fc9d4aa817aa6a3e549f3cd49d1f7b496407be979fc36dd5f356d59ce8c3a82\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 17:14:03.905398 containerd[1474]: time="2025-09-12T17:14:03.905306289Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:c6a9d11cc5c04b114ccdef39a9265eeef818e3d02f5359be035ae784097fdec5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 17:14:03.907223 containerd[1474]: time="2025-09-12T17:14:03.907015651Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.16-0\" with image id \"sha256:7fc9d4aa817aa6a3e549f3cd49d1f7b496407be979fc36dd5f356d59ce8c3a82\", repo tag \"registry.k8s.io/etcd:3.5.16-0\", repo digest \"registry.k8s.io/etcd@sha256:c6a9d11cc5c04b114ccdef39a9265eeef818e3d02f5359be035ae784097fdec5\", size \"67941650\" in 2.087394482s"
Sep 12 17:14:03.907223 containerd[1474]: time="2025-09-12T17:14:03.907085998Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.16-0\" returns image reference \"sha256:7fc9d4aa817aa6a3e549f3cd49d1f7b496407be979fc36dd5f356d59ce8c3a82\""
Sep 12 17:14:04.591872 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 4.
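The recurring "var-lib-containerd-tmpmounts-containerd\x2dmount<N>.mount" units are systemd mount units named after their mount points: "/" becomes "-", and a literal "-" in the path is escaped as \x2d. A simplified sketch of that mapping, covering only the characters that appear in these paths (the real systemd-escape rules handle more cases):

package main

import (
	"fmt"
	"strings"
)

// escapePath applies the subset of systemd unit-name escaping visible in
// the log: strip the leading "/", map "/" to "-", escape "-" as \x2d.
func escapePath(p string) string {
	var b strings.Builder
	for _, c := range strings.Trim(p, "/") {
		switch c {
		case '/':
			b.WriteByte('-')
		case '-':
			b.WriteString(`\x2d`)
		default:
			b.WriteRune(c)
		}
	}
	return b.String()
}

func main() {
	fmt.Println(escapePath("/var/lib/containerd/tmpmounts/containerd-mount2179250729") + ".mount")
	// prints: var-lib-containerd-tmpmounts-containerd\x2dmount2179250729.mount
}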
Sep 12 17:14:04.601227 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 12 17:14:04.766312 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 12 17:14:04.768001 (kubelet)[2073]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Sep 12 17:14:04.817089 kubelet[2073]: E0912 17:14:04.816160 2073 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Sep 12 17:14:04.819088 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Sep 12 17:14:04.819268 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Sep 12 17:14:06.836220 update_engine[1453]: I20250912 17:14:06.836103 1453 update_attempter.cc:509] Updating boot flags...
Sep 12 17:14:06.907120 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 36 scanned by (udev-worker) (2089)
Sep 12 17:14:07.017049 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 36 scanned by (udev-worker) (2091)
Sep 12 17:14:09.662468 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 12 17:14:09.671655 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 12 17:14:09.711024 systemd[1]: Reloading requested from client PID 2105 ('systemctl') (unit session-7.scope)...
Sep 12 17:14:09.711055 systemd[1]: Reloading...
Sep 12 17:14:09.859117 zram_generator::config[2157]: No configuration found.
Sep 12 17:14:09.963497 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Sep 12 17:14:10.036680 systemd[1]: Reloading finished in 325 ms.
Sep 12 17:14:10.099643 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM
Sep 12 17:14:10.099788 systemd[1]: kubelet.service: Failed with result 'signal'.
Sep 12 17:14:10.101045 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 12 17:14:10.108515 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 12 17:14:10.258774 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 12 17:14:10.272732 (kubelet)[2194]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS
Sep 12 17:14:10.329455 kubelet[2194]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Sep 12 17:14:10.329916 kubelet[2194]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Sep 12 17:14:10.329977 kubelet[2194]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Sep 12 17:14:10.330212 kubelet[2194]: I0912 17:14:10.330168 2194 server.go:215] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Sep 12 17:14:11.044378 kubelet[2194]: I0912 17:14:11.044303 2194 server.go:520] "Kubelet version" kubeletVersion="v1.32.4"
Sep 12 17:14:11.044648 kubelet[2194]: I0912 17:14:11.044630 2194 server.go:522] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Sep 12 17:14:11.045236 kubelet[2194]: I0912 17:14:11.045213 2194 server.go:954] "Client rotation is on, will bootstrap in background"
Sep 12 17:14:11.075387 kubelet[2194]: E0912 17:14:11.075312 2194 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://49.13.6.100:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 49.13.6.100:6443: connect: connection refused" logger="UnhandledError"
Sep 12 17:14:11.077184 kubelet[2194]: I0912 17:14:11.077140 2194 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
Sep 12 17:14:11.085276 kubelet[2194]: E0912 17:14:11.084919 2194 log.go:32] "RuntimeConfig from runtime service failed" err="rpc error: code = Unimplemented desc = unknown method RuntimeConfig for service runtime.v1.RuntimeService"
Sep 12 17:14:11.085276 kubelet[2194]: I0912 17:14:11.084962 2194 server.go:1421] "CRI implementation should be updated to support RuntimeConfig when KubeletCgroupDriverFromCRI feature gate has been enabled. Falling back to using cgroupDriver from kubelet config."
Sep 12 17:14:11.091503 kubelet[2194]: I0912 17:14:11.091392 2194 server.go:772] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /"
Sep 12 17:14:11.092870 kubelet[2194]: I0912 17:14:11.092778 2194 container_manager_linux.go:268] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Sep 12 17:14:11.093694 kubelet[2194]: I0912 17:14:11.092863 2194 container_manager_linux.go:273] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4081-3-6-2-0999f1dc3d","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Sep 12 17:14:11.093852 kubelet[2194]: I0912 17:14:11.093779 2194 topology_manager.go:138] "Creating topology manager with none policy"
Sep 12 17:14:11.093852 kubelet[2194]: I0912 17:14:11.093797 2194 container_manager_linux.go:304] "Creating device plugin manager"
Sep 12 17:14:11.094163 kubelet[2194]: I0912 17:14:11.094146 2194 state_mem.go:36] "Initialized new in-memory state store"
Sep 12 17:14:11.098347 kubelet[2194]: I0912 17:14:11.098291 2194 kubelet.go:446] "Attempting to sync node with API server"
Sep 12 17:14:11.098347 kubelet[2194]: I0912 17:14:11.098342 2194 kubelet.go:341] "Adding static pod path" path="/etc/kubernetes/manifests"
Sep 12 17:14:11.098516 kubelet[2194]: I0912 17:14:11.098379 2194 kubelet.go:352] "Adding apiserver pod source"
Sep 12 17:14:11.098516 kubelet[2194]: I0912 17:14:11.098396 2194 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Sep 12 17:14:11.103761 kubelet[2194]: W0912 17:14:11.103423 2194 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://49.13.6.100:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 49.13.6.100:6443: connect: connection refused
Sep 12 17:14:11.103761 kubelet[2194]: E0912 17:14:11.103523 2194 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://49.13.6.100:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 49.13.6.100:6443: connect: connection refused" logger="UnhandledError"
Sep 12 17:14:11.103761 kubelet[2194]: W0912 17:14:11.103674 2194 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://49.13.6.100:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4081-3-6-2-0999f1dc3d&limit=500&resourceVersion=0": dial tcp 49.13.6.100:6443: connect: connection refused
Sep 12 17:14:11.103761 kubelet[2194]: E0912 17:14:11.103716 2194 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://49.13.6.100:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4081-3-6-2-0999f1dc3d&limit=500&resourceVersion=0\": dial tcp 49.13.6.100:6443: connect: connection refused" logger="UnhandledError"
Sep 12 17:14:11.104459 kubelet[2194]: I0912 17:14:11.104384 2194 kuberuntime_manager.go:269] "Container runtime initialized" containerRuntime="containerd" version="v1.7.21" apiVersion="v1"
Sep 12 17:14:11.107114 kubelet[2194]: I0912 17:14:11.106085 2194 kubelet.go:890] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Sep 12 17:14:11.107114 kubelet[2194]: W0912 17:14:11.106254 2194 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating.
Sep 12 17:14:11.108318 kubelet[2194]: I0912 17:14:11.108291 2194 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Sep 12 17:14:11.108769 kubelet[2194]: I0912 17:14:11.108752 2194 server.go:1287] "Started kubelet"
Sep 12 17:14:11.114906 kubelet[2194]: I0912 17:14:11.114862 2194 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Sep 12 17:14:11.116607 kubelet[2194]: E0912 17:14:11.116287 2194 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://49.13.6.100:6443/api/v1/namespaces/default/events\": dial tcp 49.13.6.100:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4081-3-6-2-0999f1dc3d.186498565303da11 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4081-3-6-2-0999f1dc3d,UID:ci-4081-3-6-2-0999f1dc3d,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4081-3-6-2-0999f1dc3d,},FirstTimestamp:2025-09-12 17:14:11.108706833 +0000 UTC m=+0.829291717,LastTimestamp:2025-09-12 17:14:11.108706833 +0000 UTC m=+0.829291717,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4081-3-6-2-0999f1dc3d,}"
Sep 12 17:14:11.122246 kubelet[2194]: I0912 17:14:11.122123 2194 server.go:169] "Starting to listen" address="0.0.0.0" port=10250
Sep 12 17:14:11.123320 kubelet[2194]: I0912 17:14:11.123277 2194 server.go:479] "Adding debug handlers to kubelet server"
Sep 12 17:14:11.123793 kubelet[2194]: I0912 17:14:11.123773 2194 volume_manager.go:297] "Starting Kubelet Volume Manager"
Sep 12 17:14:11.124406 kubelet[2194]: E0912 17:14:11.124364 2194 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4081-3-6-2-0999f1dc3d\" not found"
Sep 12 17:14:11.125315 kubelet[2194]: I0912 17:14:11.125229 2194 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Sep 12 17:14:11.126129 kubelet[2194]: I0912 17:14:11.125650 2194 server.go:243] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Sep 12 17:14:11.126129 kubelet[2194]: I0912 17:14:11.125937 2194 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key"
Sep 12 17:14:11.127159 kubelet[2194]: E0912 17:14:11.127112 2194 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://49.13.6.100:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4081-3-6-2-0999f1dc3d?timeout=10s\": dial tcp 49.13.6.100:6443: connect: connection refused" interval="200ms"
Sep 12 17:14:11.127681 kubelet[2194]: I0912 17:14:11.127644 2194 factory.go:221] Registration of the systemd container factory successfully
Sep 12 17:14:11.127851 kubelet[2194]: I0912 17:14:11.127818 2194 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory
Sep 12 17:14:11.129736 kubelet[2194]: E0912 17:14:11.129699 2194 kubelet.go:1555] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem"
Sep 12 17:14:11.129894 kubelet[2194]: I0912 17:14:11.129870 2194 factory.go:221] Registration of the containerd container factory successfully
Sep 12 17:14:11.131450 kubelet[2194]: I0912 17:14:11.131418 2194 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Sep 12 17:14:11.131841 kubelet[2194]: I0912 17:14:11.131651 2194 reconciler.go:26] "Reconciler: start to sync state"
Sep 12 17:14:11.141492 kubelet[2194]: I0912 17:14:11.141425 2194 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4"
Sep 12 17:14:11.143595 kubelet[2194]: I0912 17:14:11.143126 2194 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6"
Sep 12 17:14:11.143595 kubelet[2194]: I0912 17:14:11.143167 2194 status_manager.go:227] "Starting to sync pod status with apiserver"
Sep 12 17:14:11.143595 kubelet[2194]: I0912 17:14:11.143195 2194 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Sep 12 17:14:11.143595 kubelet[2194]: I0912 17:14:11.143203 2194 kubelet.go:2382] "Starting kubelet main sync loop"
Sep 12 17:14:11.143595 kubelet[2194]: E0912 17:14:11.143270 2194 kubelet.go:2406] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Sep 12 17:14:11.153656 kubelet[2194]: W0912 17:14:11.153579 2194 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://49.13.6.100:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 49.13.6.100:6443: connect: connection refused
Sep 12 17:14:11.154390 kubelet[2194]: E0912 17:14:11.154281 2194 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://49.13.6.100:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 49.13.6.100:6443: connect: connection refused" logger="UnhandledError"
Sep 12 17:14:11.156137 kubelet[2194]: W0912 17:14:11.155903 2194 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://49.13.6.100:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 49.13.6.100:6443: connect: connection refused
Sep 12 17:14:11.158871 kubelet[2194]: E0912 17:14:11.158797 2194 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://49.13.6.100:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 49.13.6.100:6443: connect: connection refused" logger="UnhandledError"
Sep 12 17:14:11.165316 kubelet[2194]: I0912 17:14:11.165271 2194 cpu_manager.go:221] "Starting CPU manager" policy="none"
Sep 12 17:14:11.165316 kubelet[2194]: I0912 17:14:11.165301 2194 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s"
Sep 12 17:14:11.165548 kubelet[2194]: I0912 17:14:11.165344 2194 state_mem.go:36] "Initialized new in-memory state store"
Sep 12 17:14:11.170770 kubelet[2194]: I0912 17:14:11.170718 2194 policy_none.go:49] "None policy: Start"
Sep 12 17:14:11.170770 kubelet[2194]: I0912 17:14:11.170775 2194 memory_manager.go:186] "Starting memorymanager" policy="None"
Sep 12 17:14:11.170883 kubelet[2194]: I0912 17:14:11.170800 2194 state_mem.go:35] "Initializing new in-memory state store"
Sep 12 17:14:11.180348 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice.
Sep 12 17:14:11.198681 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice.
Sep 12 17:14:11.203687 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice.
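Every client call in this stretch fails with "dial tcp 49.13.6.100:6443: connect: connection refused" for the same reason: this kubelet is the component that will start the kube-apiserver (as a static pod), so nothing is listening on 6443 yet. A stdlib-only sketch of the probe behind those errors; the address is taken from the log, nothing else is assumed:

package main

import (
	"fmt"
	"net"
	"time"
)

func main() {
	// Nothing serves 49.13.6.100:6443 until the kube-apiserver static pod is up.
	conn, err := net.DialTimeout("tcp", "49.13.6.100:6443", 2*time.Second)
	if err != nil {
		fmt.Println("apiserver not reachable yet:", err) // "... connect: connection refused"
		return
	}
	defer conn.Close()
	fmt.Println("apiserver reachable")
}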
Sep 12 17:14:11.214029 kubelet[2194]: I0912 17:14:11.213982 2194 manager.go:519] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found"
Sep 12 17:14:11.214459 kubelet[2194]: I0912 17:14:11.214435 2194 eviction_manager.go:189] "Eviction manager: starting control loop"
Sep 12 17:14:11.214534 kubelet[2194]: I0912 17:14:11.214469 2194 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Sep 12 17:14:11.216955 kubelet[2194]: I0912 17:14:11.215640 2194 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Sep 12 17:14:11.218307 kubelet[2194]: E0912 17:14:11.218109 2194 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime"
Sep 12 17:14:11.218421 kubelet[2194]: E0912 17:14:11.218349 2194 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-4081-3-6-2-0999f1dc3d\" not found"
Sep 12 17:14:11.262671 systemd[1]: Created slice kubepods-burstable-podb78a3622f27d83743c5fd6b9d1da6a02.slice - libcontainer container kubepods-burstable-podb78a3622f27d83743c5fd6b9d1da6a02.slice.
Sep 12 17:14:11.275525 kubelet[2194]: E0912 17:14:11.275440 2194 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4081-3-6-2-0999f1dc3d\" not found" node="ci-4081-3-6-2-0999f1dc3d"
Sep 12 17:14:11.279588 systemd[1]: Created slice kubepods-burstable-pod2ac9fd08e57e29da06b427dfb7a8825d.slice - libcontainer container kubepods-burstable-pod2ac9fd08e57e29da06b427dfb7a8825d.slice.
Sep 12 17:14:11.290346 kubelet[2194]: E0912 17:14:11.290294 2194 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4081-3-6-2-0999f1dc3d\" not found" node="ci-4081-3-6-2-0999f1dc3d"
Sep 12 17:14:11.297127 systemd[1]: Created slice kubepods-burstable-pod4e8d03837e04485b98184582c07a9787.slice - libcontainer container kubepods-burstable-pod4e8d03837e04485b98184582c07a9787.slice.
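The three kubepods-burstable-pod<hash>.slice units correspond to the control-plane static pods the kubelet reads from /etc/kubernetes/manifests (the "Adding static pod path" line earlier). A minimal sketch of that discovery step; the manifest file names in the comment are assumptions, only the directory comes from the log:

package main

import (
	"fmt"
	"os"
)

func main() {
	// The kubelet watches this directory and runs each manifest as a static pod.
	entries, err := os.ReadDir("/etc/kubernetes/manifests")
	if err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
	for _, e := range entries {
		// Typically kube-apiserver.yaml, kube-controller-manager.yaml and
		// kube-scheduler.yaml (plus etcd.yaml on kubeadm control planes).
		fmt.Println(e.Name())
	}
}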
Sep 12 17:14:11.301518 kubelet[2194]: E0912 17:14:11.301460 2194 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4081-3-6-2-0999f1dc3d\" not found" node="ci-4081-3-6-2-0999f1dc3d"
Sep 12 17:14:11.318625 kubelet[2194]: I0912 17:14:11.318533 2194 kubelet_node_status.go:75] "Attempting to register node" node="ci-4081-3-6-2-0999f1dc3d"
Sep 12 17:14:11.319504 kubelet[2194]: E0912 17:14:11.319451 2194 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://49.13.6.100:6443/api/v1/nodes\": dial tcp 49.13.6.100:6443: connect: connection refused" node="ci-4081-3-6-2-0999f1dc3d"
Sep 12 17:14:11.328281 kubelet[2194]: E0912 17:14:11.328215 2194 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://49.13.6.100:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4081-3-6-2-0999f1dc3d?timeout=10s\": dial tcp 49.13.6.100:6443: connect: connection refused" interval="400ms"
Sep 12 17:14:11.332806 kubelet[2194]: I0912 17:14:11.332685 2194 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/b78a3622f27d83743c5fd6b9d1da6a02-k8s-certs\") pod \"kube-apiserver-ci-4081-3-6-2-0999f1dc3d\" (UID: \"b78a3622f27d83743c5fd6b9d1da6a02\") " pod="kube-system/kube-apiserver-ci-4081-3-6-2-0999f1dc3d"
Sep 12 17:14:11.332806 kubelet[2194]: I0912 17:14:11.332741 2194 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/2ac9fd08e57e29da06b427dfb7a8825d-ca-certs\") pod \"kube-controller-manager-ci-4081-3-6-2-0999f1dc3d\" (UID: \"2ac9fd08e57e29da06b427dfb7a8825d\") " pod="kube-system/kube-controller-manager-ci-4081-3-6-2-0999f1dc3d"
Sep 12 17:14:11.332806 kubelet[2194]: I0912 17:14:11.332777 2194 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/2ac9fd08e57e29da06b427dfb7a8825d-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4081-3-6-2-0999f1dc3d\" (UID: \"2ac9fd08e57e29da06b427dfb7a8825d\") " pod="kube-system/kube-controller-manager-ci-4081-3-6-2-0999f1dc3d"
Sep 12 17:14:11.332806 kubelet[2194]: I0912 17:14:11.332821 2194 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/b78a3622f27d83743c5fd6b9d1da6a02-ca-certs\") pod \"kube-apiserver-ci-4081-3-6-2-0999f1dc3d\" (UID: \"b78a3622f27d83743c5fd6b9d1da6a02\") " pod="kube-system/kube-apiserver-ci-4081-3-6-2-0999f1dc3d"
Sep 12 17:14:11.333809 kubelet[2194]: I0912 17:14:11.332865 2194 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/b78a3622f27d83743c5fd6b9d1da6a02-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4081-3-6-2-0999f1dc3d\" (UID: \"b78a3622f27d83743c5fd6b9d1da6a02\") " pod="kube-system/kube-apiserver-ci-4081-3-6-2-0999f1dc3d"
Sep 12 17:14:11.333809 kubelet[2194]: I0912 17:14:11.332894 2194 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/2ac9fd08e57e29da06b427dfb7a8825d-flexvolume-dir\") pod \"kube-controller-manager-ci-4081-3-6-2-0999f1dc3d\" (UID: \"2ac9fd08e57e29da06b427dfb7a8825d\") " pod="kube-system/kube-controller-manager-ci-4081-3-6-2-0999f1dc3d"
Sep 12 17:14:11.333809 kubelet[2194]: I0912 17:14:11.332929 2194 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/2ac9fd08e57e29da06b427dfb7a8825d-k8s-certs\") pod \"kube-controller-manager-ci-4081-3-6-2-0999f1dc3d\" (UID: \"2ac9fd08e57e29da06b427dfb7a8825d\") " pod="kube-system/kube-controller-manager-ci-4081-3-6-2-0999f1dc3d"
Sep 12 17:14:11.333809 kubelet[2194]: I0912 17:14:11.332958 2194 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/2ac9fd08e57e29da06b427dfb7a8825d-kubeconfig\") pod \"kube-controller-manager-ci-4081-3-6-2-0999f1dc3d\" (UID: \"2ac9fd08e57e29da06b427dfb7a8825d\") " pod="kube-system/kube-controller-manager-ci-4081-3-6-2-0999f1dc3d"
Sep 12 17:14:11.333809 kubelet[2194]: I0912 17:14:11.332987 2194 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/4e8d03837e04485b98184582c07a9787-kubeconfig\") pod \"kube-scheduler-ci-4081-3-6-2-0999f1dc3d\" (UID: \"4e8d03837e04485b98184582c07a9787\") " pod="kube-system/kube-scheduler-ci-4081-3-6-2-0999f1dc3d"
Sep 12 17:14:11.524498 kubelet[2194]: I0912 17:14:11.524009 2194 kubelet_node_status.go:75] "Attempting to register node" node="ci-4081-3-6-2-0999f1dc3d"
Sep 12 17:14:11.524908 kubelet[2194]: E0912 17:14:11.524859 2194 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://49.13.6.100:6443/api/v1/nodes\": dial tcp 49.13.6.100:6443: connect: connection refused" node="ci-4081-3-6-2-0999f1dc3d"
Sep 12 17:14:11.577960 containerd[1474]: time="2025-09-12T17:14:11.577736118Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4081-3-6-2-0999f1dc3d,Uid:b78a3622f27d83743c5fd6b9d1da6a02,Namespace:kube-system,Attempt:0,}"
Sep 12 17:14:11.591985 containerd[1474]: time="2025-09-12T17:14:11.591921639Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4081-3-6-2-0999f1dc3d,Uid:2ac9fd08e57e29da06b427dfb7a8825d,Namespace:kube-system,Attempt:0,}"
Sep 12 17:14:11.603556 containerd[1474]: time="2025-09-12T17:14:11.603400503Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4081-3-6-2-0999f1dc3d,Uid:4e8d03837e04485b98184582c07a9787,Namespace:kube-system,Attempt:0,}"
Sep 12 17:14:11.729840 kubelet[2194]: E0912 17:14:11.729615 2194 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://49.13.6.100:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4081-3-6-2-0999f1dc3d?timeout=10s\": dial tcp 49.13.6.100:6443: connect: connection refused" interval="800ms"
Sep 12 17:14:11.931362 kubelet[2194]: I0912 17:14:11.929939 2194 kubelet_node_status.go:75] "Attempting to register node" node="ci-4081-3-6-2-0999f1dc3d"
Sep 12 17:14:11.931362 kubelet[2194]: E0912 17:14:11.931584 2194 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://49.13.6.100:6443/api/v1/nodes\": dial tcp 49.13.6.100:6443: connect: connection refused" node="ci-4081-3-6-2-0999f1dc3d"
Sep 12 17:14:11.987861 kubelet[2194]: W0912 17:14:11.987749 2194 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://49.13.6.100:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4081-3-6-2-0999f1dc3d&limit=500&resourceVersion=0": dial tcp 49.13.6.100:6443: connect: connection refused
Sep 12 17:14:11.988035 kubelet[2194]: E0912 17:14:11.987881 2194 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://49.13.6.100:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4081-3-6-2-0999f1dc3d&limit=500&resourceVersion=0\": dial tcp 49.13.6.100:6443: connect: connection refused" logger="UnhandledError"
Sep 12 17:14:12.161743 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount592894033.mount: Deactivated successfully.
Sep 12 17:14:12.170369 containerd[1474]: time="2025-09-12T17:14:12.170213301Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Sep 12 17:14:12.174632 containerd[1474]: time="2025-09-12T17:14:12.174401735Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=269193"
Sep 12 17:14:12.176746 containerd[1474]: time="2025-09-12T17:14:12.175747921Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Sep 12 17:14:12.177679 containerd[1474]: time="2025-09-12T17:14:12.177613345Z" level=info msg="ImageCreate event name:\"sha256:4e42fb3c9d90ed7895bc04a9d96fe3102a65b521f485cc5a4f3dd818afef9cef\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Sep 12 17:14:12.179430 containerd[1474]: time="2025-09-12T17:14:12.179227082Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Sep 12 17:14:12.180840 containerd[1474]: time="2025-09-12T17:14:12.180663930Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=0"
Sep 12 17:14:12.183164 containerd[1474]: time="2025-09-12T17:14:12.181745041Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=0"
Sep 12 17:14:12.185334 containerd[1474]: time="2025-09-12T17:14:12.185268515Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Sep 12 17:14:12.188631 containerd[1474]: time="2025-09-12T17:14:12.188537263Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4e42fb3c9d90ed7895bc04a9d96fe3102a65b521f485cc5a4f3dd818afef9cef\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"268403\" in 610.635067ms"
Sep 12 17:14:12.190921 containerd[1474]: time="2025-09-12T17:14:12.190841054Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4e42fb3c9d90ed7895bc04a9d96fe3102a65b521f485cc5a4f3dd818afef9cef\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"268403\" in 598.624816ms"
Sep 12 17:14:12.191488 containerd[1474]: time="2025-09-12T17:14:12.191442164Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4e42fb3c9d90ed7895bc04a9d96fe3102a65b521f485cc5a4f3dd818afef9cef\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"268403\" in 587.879699ms"
Sep 12 17:14:12.251083 kubelet[2194]: W0912 17:14:12.250965 2194 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://49.13.6.100:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 49.13.6.100:6443: connect: connection refused
Sep 12 17:14:12.251294 kubelet[2194]: E0912 17:14:12.251098 2194 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://49.13.6.100:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 49.13.6.100:6443: connect: connection refused" logger="UnhandledError"
Sep 12 17:14:12.318282 containerd[1474]: time="2025-09-12T17:14:12.318122855Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Sep 12 17:14:12.318282 containerd[1474]: time="2025-09-12T17:14:12.318225424Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Sep 12 17:14:12.318557 containerd[1474]: time="2025-09-12T17:14:12.318261505Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Sep 12 17:14:12.318800 containerd[1474]: time="2025-09-12T17:14:12.318762604Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Sep 12 17:14:12.320189 containerd[1474]: time="2025-09-12T17:14:12.319891664Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Sep 12 17:14:12.320189 containerd[1474]: time="2025-09-12T17:14:12.319986521Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Sep 12 17:14:12.320189 containerd[1474]: time="2025-09-12T17:14:12.320007499Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Sep 12 17:14:12.320460 containerd[1474]: time="2025-09-12T17:14:12.320254831Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Sep 12 17:14:12.329448 containerd[1474]: time="2025-09-12T17:14:12.329239764Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Sep 12 17:14:12.329448 containerd[1474]: time="2025-09-12T17:14:12.329327909Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Sep 12 17:14:12.329448 containerd[1474]: time="2025-09-12T17:14:12.329341494Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Sep 12 17:14:12.330324 containerd[1474]: time="2025-09-12T17:14:12.329921467Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Sep 12 17:14:12.349433 systemd[1]: Started cri-containerd-9c681ab4e8d0f8b39edca0d6eaed857f41f7aa29dd2e44ba61865c4d9cce59ac.scope - libcontainer container 9c681ab4e8d0f8b39edca0d6eaed857f41f7aa29dd2e44ba61865c4d9cce59ac.
Sep 12 17:14:12.364414 systemd[1]: Started cri-containerd-df022c1f940687dc693a95621fa10ebf6f829cad92d0ab0c9a5a4e31fda9c245.scope - libcontainer container df022c1f940687dc693a95621fa10ebf6f829cad92d0ab0c9a5a4e31fda9c245.
Sep 12 17:14:12.377369 systemd[1]: Started cri-containerd-51a3fa984ebcecfd7d63b13116b415fa2ef51a650700ec86b987abee85d07a27.scope - libcontainer container 51a3fa984ebcecfd7d63b13116b415fa2ef51a650700ec86b987abee85d07a27.
Sep 12 17:14:12.392721 kubelet[2194]: W0912 17:14:12.392646 2194 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://49.13.6.100:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 49.13.6.100:6443: connect: connection refused
Sep 12 17:14:12.392721 kubelet[2194]: E0912 17:14:12.392730 2194 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://49.13.6.100:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 49.13.6.100:6443: connect: connection refused" logger="UnhandledError"
Sep 12 17:14:12.419353 containerd[1474]: time="2025-09-12T17:14:12.419193335Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4081-3-6-2-0999f1dc3d,Uid:b78a3622f27d83743c5fd6b9d1da6a02,Namespace:kube-system,Attempt:0,} returns sandbox id \"9c681ab4e8d0f8b39edca0d6eaed857f41f7aa29dd2e44ba61865c4d9cce59ac\""
Sep 12 17:14:12.431092 containerd[1474]: time="2025-09-12T17:14:12.430905242Z" level=info msg="CreateContainer within sandbox \"9c681ab4e8d0f8b39edca0d6eaed857f41f7aa29dd2e44ba61865c4d9cce59ac\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}"
Sep 12 17:14:12.453006 containerd[1474]: time="2025-09-12T17:14:12.452610511Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4081-3-6-2-0999f1dc3d,Uid:2ac9fd08e57e29da06b427dfb7a8825d,Namespace:kube-system,Attempt:0,} returns sandbox id \"51a3fa984ebcecfd7d63b13116b415fa2ef51a650700ec86b987abee85d07a27\""
Sep 12 17:14:12.460945 containerd[1474]: time="2025-09-12T17:14:12.460809932Z" level=info msg="CreateContainer within sandbox \"51a3fa984ebcecfd7d63b13116b415fa2ef51a650700ec86b987abee85d07a27\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}"
Sep 12 17:14:12.465485 containerd[1474]: time="2025-09-12T17:14:12.465421509Z" level=info msg="CreateContainer within sandbox \"9c681ab4e8d0f8b39edca0d6eaed857f41f7aa29dd2e44ba61865c4d9cce59ac\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"061357c39b0ce259f3b072501d330b0f84e3eeaa0a0a3bdfa9230a62dded958c\""
Sep 12 17:14:12.467251 containerd[1474]: time="2025-09-12T17:14:12.467202185Z" level=info msg="StartContainer for \"061357c39b0ce259f3b072501d330b0f84e3eeaa0a0a3bdfa9230a62dded958c\""
Sep 12 17:14:12.468204 containerd[1474]: time="2025-09-12T17:14:12.467417553Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4081-3-6-2-0999f1dc3d,Uid:4e8d03837e04485b98184582c07a9787,Namespace:kube-system,Attempt:0,} returns sandbox id \"df022c1f940687dc693a95621fa10ebf6f829cad92d0ab0c9a5a4e31fda9c245\""
Sep 12 17:14:12.471260 containerd[1474]: time="2025-09-12T17:14:12.471181486Z" level=info msg="CreateContainer within sandbox \"df022c1f940687dc693a95621fa10ebf6f829cad92d0ab0c9a5a4e31fda9c245\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}"
Sep 12 17:14:12.503593 containerd[1474]: time="2025-09-12T17:14:12.503504843Z" level=info msg="CreateContainer within sandbox \"51a3fa984ebcecfd7d63b13116b415fa2ef51a650700ec86b987abee85d07a27\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"a13a73d4831314da2f237a685be10d25adf0e7716808bdbdbd0122a7cb74084a\""
Sep 12 17:14:12.504994 containerd[1474]: time="2025-09-12T17:14:12.504933699Z" level=info msg="StartContainer for \"a13a73d4831314da2f237a685be10d25adf0e7716808bdbdbd0122a7cb74084a\""
Sep 12 17:14:12.514327 containerd[1474]: time="2025-09-12T17:14:12.514188620Z" level=info msg="CreateContainer within sandbox \"df022c1f940687dc693a95621fa10ebf6f829cad92d0ab0c9a5a4e31fda9c245\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"9cade8ef149aa73c072485c98ab4f0156d4036231e8627e7e375103307b567ec\""
Sep 12 17:14:12.514857 containerd[1474]: time="2025-09-12T17:14:12.514825771Z" level=info msg="StartContainer for \"9cade8ef149aa73c072485c98ab4f0156d4036231e8627e7e375103307b567ec\""
Sep 12 17:14:12.519533 systemd[1]: Started cri-containerd-061357c39b0ce259f3b072501d330b0f84e3eeaa0a0a3bdfa9230a62dded958c.scope - libcontainer container 061357c39b0ce259f3b072501d330b0f84e3eeaa0a0a3bdfa9230a62dded958c.
Sep 12 17:14:12.531929 kubelet[2194]: E0912 17:14:12.531865 2194 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://49.13.6.100:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4081-3-6-2-0999f1dc3d?timeout=10s\": dial tcp 49.13.6.100:6443: connect: connection refused" interval="1.6s"
Sep 12 17:14:12.559645 kubelet[2194]: W0912 17:14:12.559572 2194 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://49.13.6.100:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 49.13.6.100:6443: connect: connection refused
Sep 12 17:14:12.560352 kubelet[2194]: E0912 17:14:12.559650 2194 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://49.13.6.100:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 49.13.6.100:6443: connect: connection refused" logger="UnhandledError"
Sep 12 17:14:12.573542 systemd[1]: Started cri-containerd-9cade8ef149aa73c072485c98ab4f0156d4036231e8627e7e375103307b567ec.scope - libcontainer container 9cade8ef149aa73c072485c98ab4f0156d4036231e8627e7e375103307b567ec.
Sep 12 17:14:12.576834 systemd[1]: Started cri-containerd-a13a73d4831314da2f237a685be10d25adf0e7716808bdbdbd0122a7cb74084a.scope - libcontainer container a13a73d4831314da2f237a685be10d25adf0e7716808bdbdbd0122a7cb74084a.
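The "Failed to ensure lease exists, will retry" interval doubles across these attempts: 200ms, 400ms, 800ms, now 1.6s. A sketch of that doubling backoff, using only the intervals printed in the log (any cap on the interval is not visible here):

package main

import (
	"fmt"
	"time"
)

func main() {
	interval := 200 * time.Millisecond
	for attempt := 1; attempt <= 4; attempt++ {
		// In the log: interval="200ms", "400ms", "800ms", "1.6s".
		fmt.Printf("attempt %d failed, retrying in %s\n", attempt, interval)
		interval *= 2
	}
}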
Sep 12 17:14:12.594940 containerd[1474]: time="2025-09-12T17:14:12.594798407Z" level=info msg="StartContainer for \"061357c39b0ce259f3b072501d330b0f84e3eeaa0a0a3bdfa9230a62dded958c\" returns successfully"
Sep 12 17:14:12.665363 containerd[1474]: time="2025-09-12T17:14:12.665260278Z" level=info msg="StartContainer for \"a13a73d4831314da2f237a685be10d25adf0e7716808bdbdbd0122a7cb74084a\" returns successfully"
Sep 12 17:14:12.666428 containerd[1474]: time="2025-09-12T17:14:12.666238980Z" level=info msg="StartContainer for \"9cade8ef149aa73c072485c98ab4f0156d4036231e8627e7e375103307b567ec\" returns successfully"
Sep 12 17:14:12.737263 kubelet[2194]: I0912 17:14:12.735725 2194 kubelet_node_status.go:75] "Attempting to register node" node="ci-4081-3-6-2-0999f1dc3d"
Sep 12 17:14:12.737263 kubelet[2194]: E0912 17:14:12.736288 2194 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://49.13.6.100:6443/api/v1/nodes\": dial tcp 49.13.6.100:6443: connect: connection refused" node="ci-4081-3-6-2-0999f1dc3d"
Sep 12 17:14:13.176654 kubelet[2194]: E0912 17:14:13.176116 2194 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4081-3-6-2-0999f1dc3d\" not found" node="ci-4081-3-6-2-0999f1dc3d"
Sep 12 17:14:13.179121 kubelet[2194]: E0912 17:14:13.178721 2194 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4081-3-6-2-0999f1dc3d\" not found" node="ci-4081-3-6-2-0999f1dc3d"
Sep 12 17:14:13.196378 kubelet[2194]: E0912 17:14:13.196338 2194 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4081-3-6-2-0999f1dc3d\" not found" node="ci-4081-3-6-2-0999f1dc3d"
Sep 12 17:14:14.191632 kubelet[2194]: E0912 17:14:14.191376 2194 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4081-3-6-2-0999f1dc3d\" not found" node="ci-4081-3-6-2-0999f1dc3d"
Sep 12 17:14:14.192846 kubelet[2194]: E0912 17:14:14.192338 2194 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4081-3-6-2-0999f1dc3d\" not found" node="ci-4081-3-6-2-0999f1dc3d"
Sep 12 17:14:14.339111 kubelet[2194]: I0912 17:14:14.338813 2194 kubelet_node_status.go:75] "Attempting to register node" node="ci-4081-3-6-2-0999f1dc3d"
Sep 12 17:14:15.421212 kubelet[2194]: E0912 17:14:15.421101 2194 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ci-4081-3-6-2-0999f1dc3d\" not found" node="ci-4081-3-6-2-0999f1dc3d"
Sep 12 17:14:15.519046 kubelet[2194]: I0912 17:14:15.518958 2194 kubelet_node_status.go:78] "Successfully registered node" node="ci-4081-3-6-2-0999f1dc3d"
Sep 12 17:14:15.527185 kubelet[2194]: I0912 17:14:15.526738 2194 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4081-3-6-2-0999f1dc3d"
Sep 12 17:14:15.596817 kubelet[2194]: E0912 17:14:15.596757 2194 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-controller-manager-ci-4081-3-6-2-0999f1dc3d\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-controller-manager-ci-4081-3-6-2-0999f1dc3d"
Sep 12 17:14:15.596817 kubelet[2194]: I0912 17:14:15.596808 2194 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4081-3-6-2-0999f1dc3d"
Sep 12 17:14:15.599564 kubelet[2194]: E0912 17:14:15.599513 2194 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4081-3-6-2-0999f1dc3d\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-ci-4081-3-6-2-0999f1dc3d"
Sep 12 17:14:15.599564 kubelet[2194]: I0912 17:14:15.599558 2194 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4081-3-6-2-0999f1dc3d"
Sep 12 17:14:15.602685 kubelet[2194]: E0912 17:14:15.602621 2194 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4081-3-6-2-0999f1dc3d\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-ci-4081-3-6-2-0999f1dc3d"
Sep 12 17:14:16.105300 kubelet[2194]: I0912 17:14:16.104824 2194 apiserver.go:52] "Watching apiserver"
Sep 12 17:14:16.132092 kubelet[2194]: I0912 17:14:16.132005 2194 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Sep 12 17:14:17.393291 kubelet[2194]: I0912 17:14:17.393137 2194 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4081-3-6-2-0999f1dc3d"
Sep 12 17:14:17.726289 systemd[1]: Reloading requested from client PID 2471 ('systemctl') (unit session-7.scope)...
Sep 12 17:14:17.726315 systemd[1]: Reloading...
Sep 12 17:14:17.866145 zram_generator::config[2514]: No configuration found.
Sep 12 17:14:17.980976 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Sep 12 17:14:18.068787 systemd[1]: Reloading finished in 342 ms.
Sep 12 17:14:18.116107 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 12 17:14:18.129614 systemd[1]: kubelet.service: Deactivated successfully.
Sep 12 17:14:18.130176 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 12 17:14:18.130312 systemd[1]: kubelet.service: Consumed 1.384s CPU time, 127.4M memory peak, 0B memory swap peak.
Sep 12 17:14:18.135626 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 12 17:14:18.326578 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 12 17:14:18.326686 (kubelet)[2556]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS
Sep 12 17:14:18.425826 kubelet[2556]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Sep 12 17:14:18.425826 kubelet[2556]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Sep 12 17:14:18.425826 kubelet[2556]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Sep 12 17:14:18.425826 kubelet[2556]: I0912 17:14:18.425161 2556 server.go:215] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Sep 12 17:14:18.442988 kubelet[2556]: I0912 17:14:18.442909 2556 server.go:520] "Kubelet version" kubeletVersion="v1.32.4"
Sep 12 17:14:18.442988 kubelet[2556]: I0912 17:14:18.443029 2556 server.go:522] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Sep 12 17:14:18.444756 kubelet[2556]: I0912 17:14:18.444193 2556 server.go:954] "Client rotation is on, will bootstrap in background"
Sep 12 17:14:18.450332 kubelet[2556]: I0912 17:14:18.450280 2556 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem".
Sep 12 17:14:18.459486 kubelet[2556]: I0912 17:14:18.459416 2556 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
Sep 12 17:14:18.469955 kubelet[2556]: E0912 17:14:18.469879 2556 log.go:32] "RuntimeConfig from runtime service failed" err="rpc error: code = Unimplemented desc = unknown method RuntimeConfig for service runtime.v1.RuntimeService"
Sep 12 17:14:18.469955 kubelet[2556]: I0912 17:14:18.469947 2556 server.go:1421] "CRI implementation should be updated to support RuntimeConfig when KubeletCgroupDriverFromCRI feature gate has been enabled. Falling back to using cgroupDriver from kubelet config."
Sep 12 17:14:18.477132 kubelet[2556]: I0912 17:14:18.476048 2556 server.go:772] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /"
Sep 12 17:14:18.477464 kubelet[2556]: I0912 17:14:18.477398 2556 container_manager_linux.go:268] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Sep 12 17:14:18.477763 kubelet[2556]: I0912 17:14:18.477464 2556 container_manager_linux.go:273] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4081-3-6-2-0999f1dc3d","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Sep 12 17:14:18.477870 kubelet[2556]: I0912 17:14:18.477778 2556 topology_manager.go:138] "Creating topology manager with none policy"
Sep 12 17:14:18.477870 kubelet[2556]: I0912 17:14:18.477792 2556 container_manager_linux.go:304] "Creating device plugin manager"
Sep 12 17:14:18.477870 kubelet[2556]: I0912 17:14:18.477867 2556 state_mem.go:36] "Initialized new in-memory state store"
Sep 12 17:14:18.478097 kubelet[2556]: I0912 17:14:18.478060 2556 kubelet.go:446] "Attempting to sync node with API server"
Sep 12 17:14:18.478151 kubelet[2556]: I0912 17:14:18.478107 2556 kubelet.go:341] "Adding static pod path" path="/etc/kubernetes/manifests"
Sep 12 17:14:18.478182 kubelet[2556]: I0912 17:14:18.478162 2556 kubelet.go:352] "Adding apiserver pod source"
Sep 12 17:14:18.478182 kubelet[2556]: I0912 17:14:18.478180 2556 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Sep 12 17:14:18.484369 kubelet[2556]: I0912 17:14:18.484309 2556 kuberuntime_manager.go:269] "Container runtime initialized" containerRuntime="containerd" version="v1.7.21" apiVersion="v1"
Sep 12 17:14:18.485080 kubelet[2556]: I0912 17:14:18.485039 2556 kubelet.go:890] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Sep 12 17:14:18.485726 kubelet[2556]: I0912 17:14:18.485696 2556 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Sep 12 17:14:18.485793 kubelet[2556]: I0912 17:14:18.485747 2556 server.go:1287] "Started kubelet"
Sep 12 17:14:18.503059 kubelet[2556]: I0912 17:14:18.502992 2556 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Sep 12 17:14:18.507450 kubelet[2556]: I0912 17:14:18.507352 2556 server.go:169] "Starting to listen" address="0.0.0.0" port=10250
Sep 12 17:14:18.533122 kubelet[2556]: I0912 17:14:18.532666 2556 server.go:479] "Adding debug handlers to kubelet server"
Sep 12 17:14:18.540307 kubelet[2556]: E0912 17:14:18.539535 2556 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4081-3-6-2-0999f1dc3d\" not found"
Sep 12 17:14:18.541287 kubelet[2556]: I0912 17:14:18.540967 2556 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Sep 12 17:14:18.542954 kubelet[2556]: I0912 17:14:18.542512 2556 server.go:243] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Sep 12 17:14:18.542954 kubelet[2556]: I0912 17:14:18.542922 2556 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key"
Sep 12 17:14:18.552617 kubelet[2556]: I0912 17:14:18.552554 2556 volume_manager.go:297] "Starting Kubelet Volume Manager"
Sep 12 17:14:18.561957 kubelet[2556]: I0912 17:14:18.559620 2556 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Sep 12 17:14:18.561957 kubelet[2556]: I0912 17:14:18.559863 2556 reconciler.go:26] "Reconciler: start to sync state"
Sep 12 17:14:18.573151 kubelet[2556]: I0912 17:14:18.572737 2556 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4"
Sep 12 17:14:18.575224 kubelet[2556]: I0912 17:14:18.574367 2556 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6"
Sep 12 17:14:18.575224 kubelet[2556]: I0912 17:14:18.574408 2556 status_manager.go:227] "Starting to sync pod status with apiserver"
Sep 12 17:14:18.575224 kubelet[2556]: I0912 17:14:18.574442 2556 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
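The nodeConfig dump above and the deprecated-flag warnings from the restart describe the same thing: this kubelet's effective configuration. A minimal KubeletConfiguration sketch that reproduces the values visible in the dump (cgroup driver and the five hard-eviction thresholds) and moves --container-runtime-endpoint into the config file, as the deprecation notice requests, could look like this; the file path and the containerd socket are assumptions, not values taken from this log:

    # Sketch of a kubelet config file; path and socket are assumed.
    apiVersion: kubelet.config.k8s.io/v1beta1
    kind: KubeletConfiguration
    cgroupDriver: systemd                  # matches "CgroupDriver":"systemd" in the dump
    containerRuntimeEndpoint: unix:///run/containerd/containerd.sock  # assumed; replaces the deprecated flag
    evictionHard:                          # same thresholds as HardEvictionThresholds above
      memory.available: "100Mi"
      nodefs.available: "10%"
      nodefs.inodesFree: "5%"
      imagefs.available: "15%"
      imagefs.inodesFree: "5%"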
Sep 12 17:14:18.575224 kubelet[2556]: I0912 17:14:18.574450 2556 kubelet.go:2382] "Starting kubelet main sync loop"
Sep 12 17:14:18.575224 kubelet[2556]: E0912 17:14:18.574525 2556 kubelet.go:2406] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Sep 12 17:14:18.578683 kubelet[2556]: I0912 17:14:18.578523 2556 factory.go:221] Registration of the systemd container factory successfully
Sep 12 17:14:18.582171 kubelet[2556]: I0912 17:14:18.580117 2556 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory
Sep 12 17:14:18.589167 kubelet[2556]: E0912 17:14:18.589053 2556 kubelet.go:1555] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem"
Sep 12 17:14:18.590357 kubelet[2556]: I0912 17:14:18.590317 2556 factory.go:221] Registration of the containerd container factory successfully
Sep 12 17:14:18.678181 kubelet[2556]: E0912 17:14:18.678017 2556 kubelet.go:2406] "Skipping pod synchronization" err="container runtime status check may not have completed yet"
Sep 12 17:14:18.702225 kubelet[2556]: I0912 17:14:18.701725 2556 cpu_manager.go:221] "Starting CPU manager" policy="none"
Sep 12 17:14:18.702225 kubelet[2556]: I0912 17:14:18.701761 2556 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s"
Sep 12 17:14:18.702225 kubelet[2556]: I0912 17:14:18.701803 2556 state_mem.go:36] "Initialized new in-memory state store"
Sep 12 17:14:18.702966 kubelet[2556]: I0912 17:14:18.702938 2556 state_mem.go:88] "Updated default CPUSet" cpuSet=""
Sep 12 17:14:18.703411 kubelet[2556]: I0912 17:14:18.703348 2556 state_mem.go:96] "Updated CPUSet assignments" assignments={}
Sep 12 17:14:18.704176 kubelet[2556]: I0912 17:14:18.703476 2556 policy_none.go:49] "None policy: Start"
Sep 12 17:14:18.704176 kubelet[2556]: I0912 17:14:18.703502 2556 memory_manager.go:186] "Starting memorymanager" policy="None"
Sep 12 17:14:18.704176 kubelet[2556]: I0912 17:14:18.703528 2556 state_mem.go:35] "Initializing new in-memory state store"
Sep 12 17:14:18.704176 kubelet[2556]: I0912 17:14:18.703741 2556 state_mem.go:75] "Updated machine memory state"
Sep 12 17:14:18.716625 kubelet[2556]: I0912 17:14:18.715584 2556 manager.go:519] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found"
Sep 12 17:14:18.718194 kubelet[2556]: I0912 17:14:18.717802 2556 eviction_manager.go:189] "Eviction manager: starting control loop"
Sep 12 17:14:18.718194 kubelet[2556]: I0912 17:14:18.717832 2556 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Sep 12 17:14:18.719811 kubelet[2556]: I0912 17:14:18.718883 2556 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Sep 12 17:14:18.721473 kubelet[2556]: E0912 17:14:18.721433 2556 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime"
Sep 12 17:14:18.857097 kubelet[2556]: I0912 17:14:18.856425 2556 kubelet_node_status.go:75] "Attempting to register node" node="ci-4081-3-6-2-0999f1dc3d"
Sep 12 17:14:18.879933 kubelet[2556]: I0912 17:14:18.879029 2556 kubelet_node_status.go:124] "Node was previously registered" node="ci-4081-3-6-2-0999f1dc3d"
Sep 12 17:14:18.879933 kubelet[2556]: I0912 17:14:18.879221 2556 kubelet_node_status.go:78] "Successfully registered node" node="ci-4081-3-6-2-0999f1dc3d"
Sep 12 17:14:18.881960 kubelet[2556]: I0912 17:14:18.881869 2556 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4081-3-6-2-0999f1dc3d"
Sep 12 17:14:18.885206 kubelet[2556]: I0912 17:14:18.882477 2556 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4081-3-6-2-0999f1dc3d"
Sep 12 17:14:18.885206 kubelet[2556]: I0912 17:14:18.882992 2556 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4081-3-6-2-0999f1dc3d"
Sep 12 17:14:18.908044 kubelet[2556]: E0912 17:14:18.907102 2556 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4081-3-6-2-0999f1dc3d\" already exists" pod="kube-system/kube-apiserver-ci-4081-3-6-2-0999f1dc3d"
Sep 12 17:14:18.964915 kubelet[2556]: I0912 17:14:18.963903 2556 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/b78a3622f27d83743c5fd6b9d1da6a02-ca-certs\") pod \"kube-apiserver-ci-4081-3-6-2-0999f1dc3d\" (UID: \"b78a3622f27d83743c5fd6b9d1da6a02\") " pod="kube-system/kube-apiserver-ci-4081-3-6-2-0999f1dc3d"
Sep 12 17:14:18.964915 kubelet[2556]: I0912 17:14:18.963981 2556 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/b78a3622f27d83743c5fd6b9d1da6a02-k8s-certs\") pod \"kube-apiserver-ci-4081-3-6-2-0999f1dc3d\" (UID: \"b78a3622f27d83743c5fd6b9d1da6a02\") " pod="kube-system/kube-apiserver-ci-4081-3-6-2-0999f1dc3d"
Sep 12 17:14:18.964915 kubelet[2556]: I0912 17:14:18.964046 2556 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/2ac9fd08e57e29da06b427dfb7a8825d-ca-certs\") pod \"kube-controller-manager-ci-4081-3-6-2-0999f1dc3d\" (UID: \"2ac9fd08e57e29da06b427dfb7a8825d\") " pod="kube-system/kube-controller-manager-ci-4081-3-6-2-0999f1dc3d"
Sep 12 17:14:18.964915 kubelet[2556]: I0912 17:14:18.964111 2556 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/2ac9fd08e57e29da06b427dfb7a8825d-flexvolume-dir\") pod \"kube-controller-manager-ci-4081-3-6-2-0999f1dc3d\" (UID: \"2ac9fd08e57e29da06b427dfb7a8825d\") " pod="kube-system/kube-controller-manager-ci-4081-3-6-2-0999f1dc3d"
Sep 12 17:14:18.964915 kubelet[2556]: I0912 17:14:18.964145 2556 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/2ac9fd08e57e29da06b427dfb7a8825d-kubeconfig\") pod \"kube-controller-manager-ci-4081-3-6-2-0999f1dc3d\" (UID: \"2ac9fd08e57e29da06b427dfb7a8825d\") " pod="kube-system/kube-controller-manager-ci-4081-3-6-2-0999f1dc3d"
Sep 12 17:14:18.965276 kubelet[2556]: I0912 17:14:18.964181 2556 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/b78a3622f27d83743c5fd6b9d1da6a02-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4081-3-6-2-0999f1dc3d\" (UID: \"b78a3622f27d83743c5fd6b9d1da6a02\") " pod="kube-system/kube-apiserver-ci-4081-3-6-2-0999f1dc3d"
Sep 12 17:14:18.965276 kubelet[2556]: I0912 17:14:18.964210 2556 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/2ac9fd08e57e29da06b427dfb7a8825d-k8s-certs\") pod \"kube-controller-manager-ci-4081-3-6-2-0999f1dc3d\" (UID: \"2ac9fd08e57e29da06b427dfb7a8825d\") " pod="kube-system/kube-controller-manager-ci-4081-3-6-2-0999f1dc3d"
Sep 12 17:14:18.965276 kubelet[2556]: I0912 17:14:18.964248 2556 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/2ac9fd08e57e29da06b427dfb7a8825d-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4081-3-6-2-0999f1dc3d\" (UID: \"2ac9fd08e57e29da06b427dfb7a8825d\") " pod="kube-system/kube-controller-manager-ci-4081-3-6-2-0999f1dc3d"
Sep 12 17:14:18.965276 kubelet[2556]: I0912 17:14:18.964281 2556 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/4e8d03837e04485b98184582c07a9787-kubeconfig\") pod \"kube-scheduler-ci-4081-3-6-2-0999f1dc3d\" (UID: \"4e8d03837e04485b98184582c07a9787\") " pod="kube-system/kube-scheduler-ci-4081-3-6-2-0999f1dc3d"
Sep 12 17:14:19.481170 kubelet[2556]: I0912 17:14:19.481058 2556 apiserver.go:52] "Watching apiserver"
Sep 12 17:14:19.560212 kubelet[2556]: I0912 17:14:19.560132 2556 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Sep 12 17:14:19.635502 kubelet[2556]: I0912 17:14:19.635435 2556 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4081-3-6-2-0999f1dc3d"
Sep 12 17:14:19.636130 kubelet[2556]: I0912 17:14:19.636103 2556 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4081-3-6-2-0999f1dc3d"
Sep 12 17:14:19.659095 kubelet[2556]: E0912 17:14:19.658196 2556 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4081-3-6-2-0999f1dc3d\" already exists" pod="kube-system/kube-scheduler-ci-4081-3-6-2-0999f1dc3d"
Sep 12 17:14:19.663108 kubelet[2556]: E0912 17:14:19.661036 2556 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-controller-manager-ci-4081-3-6-2-0999f1dc3d\" already exists" pod="kube-system/kube-controller-manager-ci-4081-3-6-2-0999f1dc3d"
Sep 12 17:14:19.696644 kubelet[2556]: I0912 17:14:19.696531 2556 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-ci-4081-3-6-2-0999f1dc3d" podStartSLOduration=2.696497174 podStartE2EDuration="2.696497174s" podCreationTimestamp="2025-09-12 17:14:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-12 17:14:19.695530624 +0000 UTC m=+1.358271983" watchObservedRunningTime="2025-09-12 17:14:19.696497174 +0000 UTC m=+1.359238533"
Sep 12 17:14:19.696973 kubelet[2556]: I0912 17:14:19.696793 2556 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-ci-4081-3-6-2-0999f1dc3d" podStartSLOduration=1.696785652 podStartE2EDuration="1.696785652s" podCreationTimestamp="2025-09-12 17:14:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-12 17:14:19.678702999 +0000 UTC m=+1.341444358" watchObservedRunningTime="2025-09-12 17:14:19.696785652 +0000 UTC m=+1.359527011"
Sep 12 17:14:19.750056 kubelet[2556]: I0912 17:14:19.749619 2556 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ci-4081-3-6-2-0999f1dc3d" podStartSLOduration=1.7495879840000002 podStartE2EDuration="1.749587984s" podCreationTimestamp="2025-09-12 17:14:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-12 17:14:19.717424901 +0000 UTC m=+1.380166260" watchObservedRunningTime="2025-09-12 17:14:19.749587984 +0000 UTC m=+1.412329343"
Sep 12 17:14:23.052732 kubelet[2556]: I0912 17:14:23.052454 2556 kuberuntime_manager.go:1702] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24"
Sep 12 17:14:23.053545 kubelet[2556]: I0912 17:14:23.053409 2556 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24"
Sep 12 17:14:23.053673 containerd[1474]: time="2025-09-12T17:14:23.053081481Z" level=info msg="No cni config template is specified, wait for other system components to drop the config."
Sep 12 17:14:23.825719 systemd[1]: Created slice kubepods-besteffort-podcf4fde15_4ecc_4159_a28a_99e03b6fd23c.slice - libcontainer container kubepods-besteffort-podcf4fde15_4ecc_4159_a28a_99e03b6fd23c.slice.
Sep 12 17:14:23.897140 kubelet[2556]: I0912 17:14:23.896753 2556 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/cf4fde15-4ecc-4159-a28a-99e03b6fd23c-xtables-lock\") pod \"kube-proxy-98spd\" (UID: \"cf4fde15-4ecc-4159-a28a-99e03b6fd23c\") " pod="kube-system/kube-proxy-98spd"
Sep 12 17:14:23.897140 kubelet[2556]: I0912 17:14:23.896844 2556 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/cf4fde15-4ecc-4159-a28a-99e03b6fd23c-lib-modules\") pod \"kube-proxy-98spd\" (UID: \"cf4fde15-4ecc-4159-a28a-99e03b6fd23c\") " pod="kube-system/kube-proxy-98spd"
Sep 12 17:14:23.897140 kubelet[2556]: I0912 17:14:23.896887 2556 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c2nwj\" (UniqueName: \"kubernetes.io/projected/cf4fde15-4ecc-4159-a28a-99e03b6fd23c-kube-api-access-c2nwj\") pod \"kube-proxy-98spd\" (UID: \"cf4fde15-4ecc-4159-a28a-99e03b6fd23c\") " pod="kube-system/kube-proxy-98spd"
Sep 12 17:14:23.897140 kubelet[2556]: I0912 17:14:23.896938 2556 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/cf4fde15-4ecc-4159-a28a-99e03b6fd23c-kube-proxy\") pod \"kube-proxy-98spd\" (UID: \"cf4fde15-4ecc-4159-a28a-99e03b6fd23c\") " pod="kube-system/kube-proxy-98spd"
Sep 12 17:14:24.136862 containerd[1474]: time="2025-09-12T17:14:24.136785199Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-98spd,Uid:cf4fde15-4ecc-4159-a28a-99e03b6fd23c,Namespace:kube-system,Attempt:0,}"
Sep 12 17:14:24.192032 containerd[1474]: time="2025-09-12T17:14:24.191298626Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Sep 12 17:14:24.192032 containerd[1474]: time="2025-09-12T17:14:24.191410562Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Sep 12 17:14:24.192032 containerd[1474]: time="2025-09-12T17:14:24.191447034Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Sep 12 17:14:24.192032 containerd[1474]: time="2025-09-12T17:14:24.191643551Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Sep 12 17:14:24.232514 systemd[1]: Started cri-containerd-5fe6499f64f2fd265e9b5e3b3480ed50ea7a57d7544518d134ec7b1041c8f50a.scope - libcontainer container 5fe6499f64f2fd265e9b5e3b3480ed50ea7a57d7544518d134ec7b1041c8f50a.
Sep 12 17:14:24.297117 systemd[1]: Created slice kubepods-besteffort-poda9f05d29_9dc5_428d_a958_c4b88601e4e6.slice - libcontainer container kubepods-besteffort-poda9f05d29_9dc5_428d_a958_c4b88601e4e6.slice.
Sep 12 17:14:24.300977 kubelet[2556]: I0912 17:14:24.300661 2556 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9gq96\" (UniqueName: \"kubernetes.io/projected/a9f05d29-9dc5-428d-a958-c4b88601e4e6-kube-api-access-9gq96\") pod \"tigera-operator-755d956888-25brs\" (UID: \"a9f05d29-9dc5-428d-a958-c4b88601e4e6\") " pod="tigera-operator/tigera-operator-755d956888-25brs"
Sep 12 17:14:24.300977 kubelet[2556]: I0912 17:14:24.300725 2556 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/a9f05d29-9dc5-428d-a958-c4b88601e4e6-var-lib-calico\") pod \"tigera-operator-755d956888-25brs\" (UID: \"a9f05d29-9dc5-428d-a958-c4b88601e4e6\") " pod="tigera-operator/tigera-operator-755d956888-25brs"
Sep 12 17:14:24.355143 containerd[1474]: time="2025-09-12T17:14:24.355039344Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-98spd,Uid:cf4fde15-4ecc-4159-a28a-99e03b6fd23c,Namespace:kube-system,Attempt:0,} returns sandbox id \"5fe6499f64f2fd265e9b5e3b3480ed50ea7a57d7544518d134ec7b1041c8f50a\""
Sep 12 17:14:24.361145 containerd[1474]: time="2025-09-12T17:14:24.361056076Z" level=info msg="CreateContainer within sandbox \"5fe6499f64f2fd265e9b5e3b3480ed50ea7a57d7544518d134ec7b1041c8f50a\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}"
Sep 12 17:14:24.401252 containerd[1474]: time="2025-09-12T17:14:24.397952174Z" level=info msg="CreateContainer within sandbox \"5fe6499f64f2fd265e9b5e3b3480ed50ea7a57d7544518d134ec7b1041c8f50a\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"edbb75e55ca24531e524adfa24f74297d90170d6f8b74eb43009b100908585b8\""
Sep 12 17:14:24.404543 containerd[1474]: time="2025-09-12T17:14:24.403484451Z" level=info msg="StartContainer for \"edbb75e55ca24531e524adfa24f74297d90170d6f8b74eb43009b100908585b8\""
Sep 12 17:14:24.453527 systemd[1]: Started cri-containerd-edbb75e55ca24531e524adfa24f74297d90170d6f8b74eb43009b100908585b8.scope - libcontainer container edbb75e55ca24531e524adfa24f74297d90170d6f8b74eb43009b100908585b8.
Sep 12 17:14:24.504477 containerd[1474]: time="2025-09-12T17:14:24.504364596Z" level=info msg="StartContainer for \"edbb75e55ca24531e524adfa24f74297d90170d6f8b74eb43009b100908585b8\" returns successfully"
Sep 12 17:14:24.605925 containerd[1474]: time="2025-09-12T17:14:24.605774867Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-755d956888-25brs,Uid:a9f05d29-9dc5-428d-a958-c4b88601e4e6,Namespace:tigera-operator,Attempt:0,}"
Sep 12 17:14:24.650227 containerd[1474]: time="2025-09-12T17:14:24.649260212Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Sep 12 17:14:24.650227 containerd[1474]: time="2025-09-12T17:14:24.649356551Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Sep 12 17:14:24.650227 containerd[1474]: time="2025-09-12T17:14:24.649377026Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Sep 12 17:14:24.650227 containerd[1474]: time="2025-09-12T17:14:24.649525634Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Sep 12 17:14:24.691912 systemd[1]: Started cri-containerd-a8f3ca42cfb4b60e00a12c13d8fffc36d3766888554744a1e33b91ca2831f793.scope - libcontainer container a8f3ca42cfb4b60e00a12c13d8fffc36d3766888554744a1e33b91ca2831f793.
Sep 12 17:14:24.766450 containerd[1474]: time="2025-09-12T17:14:24.766290926Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-755d956888-25brs,Uid:a9f05d29-9dc5-428d-a958-c4b88601e4e6,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"a8f3ca42cfb4b60e00a12c13d8fffc36d3766888554744a1e33b91ca2831f793\""
Sep 12 17:14:24.771364 containerd[1474]: time="2025-09-12T17:14:24.771286680Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.6\""
Sep 12 17:14:25.108483 kubelet[2556]: I0912 17:14:25.108386 2556 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-98spd" podStartSLOduration=2.108352083 podStartE2EDuration="2.108352083s" podCreationTimestamp="2025-09-12 17:14:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-12 17:14:24.691148504 +0000 UTC m=+6.353889983" watchObservedRunningTime="2025-09-12 17:14:25.108352083 +0000 UTC m=+6.771093442"
Sep 12 17:14:28.049884 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount347113655.mount: Deactivated successfully.
Sep 12 17:14:32.112039 containerd[1474]: time="2025-09-12T17:14:32.111565206Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.38.6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 17:14:32.113866 containerd[1474]: time="2025-09-12T17:14:32.113664590Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.38.6: active requests=0, bytes read=22152365"
Sep 12 17:14:32.113866 containerd[1474]: time="2025-09-12T17:14:32.113792141Z" level=info msg="ImageCreate event name:\"sha256:dd2e197838b00861b08ae5f480dfbfb9a519722e35ced99346315722309cbe9f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 17:14:32.117328 containerd[1474]: time="2025-09-12T17:14:32.117250716Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:00a7a9b62f9b9a4e0856128b078539783b8352b07f707bff595cb604cc580f6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 17:14:32.118521 containerd[1474]: time="2025-09-12T17:14:32.118060064Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.38.6\" with image id \"sha256:dd2e197838b00861b08ae5f480dfbfb9a519722e35ced99346315722309cbe9f\", repo tag \"quay.io/tigera/operator:v1.38.6\", repo digest \"quay.io/tigera/operator@sha256:00a7a9b62f9b9a4e0856128b078539783b8352b07f707bff595cb604cc580f6e\", size \"22148360\" in 7.346706237s"
Sep 12 17:14:32.118521 containerd[1474]: time="2025-09-12T17:14:32.118138138Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.6\" returns image reference \"sha256:dd2e197838b00861b08ae5f480dfbfb9a519722e35ced99346315722309cbe9f\""
Sep 12 17:14:32.124739 containerd[1474]: time="2025-09-12T17:14:32.124431009Z" level=info msg="CreateContainer within sandbox \"a8f3ca42cfb4b60e00a12c13d8fffc36d3766888554744a1e33b91ca2831f793\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}"
Sep 12 17:14:32.145238 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2922972488.mount: Deactivated successfully.
Sep 12 17:14:32.147671 containerd[1474]: time="2025-09-12T17:14:32.147485748Z" level=info msg="CreateContainer within sandbox \"a8f3ca42cfb4b60e00a12c13d8fffc36d3766888554744a1e33b91ca2831f793\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"2a6634121269b87912849f7839284ccd259b93fa9dda530aee83149a64895cc0\""
Sep 12 17:14:32.149960 containerd[1474]: time="2025-09-12T17:14:32.149879392Z" level=info msg="StartContainer for \"2a6634121269b87912849f7839284ccd259b93fa9dda530aee83149a64895cc0\""
Sep 12 17:14:32.190671 systemd[1]: run-containerd-runc-k8s.io-2a6634121269b87912849f7839284ccd259b93fa9dda530aee83149a64895cc0-runc.kwz2RB.mount: Deactivated successfully.
Sep 12 17:14:32.202826 systemd[1]: Started cri-containerd-2a6634121269b87912849f7839284ccd259b93fa9dda530aee83149a64895cc0.scope - libcontainer container 2a6634121269b87912849f7839284ccd259b93fa9dda530aee83149a64895cc0.
Sep 12 17:14:32.240713 containerd[1474]: time="2025-09-12T17:14:32.240576446Z" level=info msg="StartContainer for \"2a6634121269b87912849f7839284ccd259b93fa9dda530aee83149a64895cc0\" returns successfully"
Sep 12 17:14:39.587498 sudo[1693]: pam_unix(sudo:session): session closed for user root
Sep 12 17:14:39.751454 sshd[1690]: pam_unix(sshd:session): session closed for user core
Sep 12 17:14:39.758585 systemd[1]: sshd@6-49.13.6.100:22-139.178.89.65:51022.service: Deactivated successfully.
Sep 12 17:14:39.764974 systemd[1]: session-7.scope: Deactivated successfully.
Sep 12 17:14:39.765998 systemd[1]: session-7.scope: Consumed 7.833s CPU time, 152.1M memory peak, 0B memory swap peak.
Sep 12 17:14:39.776494 systemd-logind[1452]: Session 7 logged out. Waiting for processes to exit.
Sep 12 17:14:39.783799 systemd-logind[1452]: Removed session 7.
Sep 12 17:14:46.898336 kubelet[2556]: I0912 17:14:46.898184 2556 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-755d956888-25brs" podStartSLOduration=15.547661276 podStartE2EDuration="22.898149349s" podCreationTimestamp="2025-09-12 17:14:24 +0000 UTC" firstStartedPulling="2025-09-12 17:14:24.769520464 +0000 UTC m=+6.432261783" lastFinishedPulling="2025-09-12 17:14:32.120008497 +0000 UTC m=+13.782749856" observedRunningTime="2025-09-12 17:14:32.709160055 +0000 UTC m=+14.371901414" watchObservedRunningTime="2025-09-12 17:14:46.898149349 +0000 UTC m=+28.560890708"
Sep 12 17:14:46.913179 systemd[1]: Created slice kubepods-besteffort-podf57ffba9_8aaa_480f_b4b7_51042075d625.slice - libcontainer container kubepods-besteffort-podf57ffba9_8aaa_480f_b4b7_51042075d625.slice.
Sep 12 17:14:46.963744 kubelet[2556]: I0912 17:14:46.963647 2556 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6kx9v\" (UniqueName: \"kubernetes.io/projected/f57ffba9-8aaa-480f-b4b7-51042075d625-kube-api-access-6kx9v\") pod \"calico-typha-7b767c467d-5wkhj\" (UID: \"f57ffba9-8aaa-480f-b4b7-51042075d625\") " pod="calico-system/calico-typha-7b767c467d-5wkhj"
Sep 12 17:14:46.963744 kubelet[2556]: I0912 17:14:46.963726 2556 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/f57ffba9-8aaa-480f-b4b7-51042075d625-typha-certs\") pod \"calico-typha-7b767c467d-5wkhj\" (UID: \"f57ffba9-8aaa-480f-b4b7-51042075d625\") " pod="calico-system/calico-typha-7b767c467d-5wkhj"
Sep 12 17:14:46.963744 kubelet[2556]: I0912 17:14:46.963750 2556 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f57ffba9-8aaa-480f-b4b7-51042075d625-tigera-ca-bundle\") pod \"calico-typha-7b767c467d-5wkhj\" (UID: \"f57ffba9-8aaa-480f-b4b7-51042075d625\") " pod="calico-system/calico-typha-7b767c467d-5wkhj"
Sep 12 17:14:47.214243 systemd[1]: Created slice kubepods-besteffort-podebf14c92_8e49_4823_860b_9f0c85880bd5.slice - libcontainer container kubepods-besteffort-podebf14c92_8e49_4823_860b_9f0c85880bd5.slice.
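The tigera-operator record above also shows what podStartSLOduration excludes: podStartE2EDuration minus podStartSLOduration is 22.898149349 s − 15.547661276 s ≈ 7.3505 s, which matches lastFinishedPulling minus firstStartedPulling (17:14:32.120008497 − 17:14:24.769520464 ≈ 7.3505 s), i.e. the image-pull window. containerd's own figure of 7.346706237 s is slightly smaller because it is measured inside the pull rather than around it. The earlier kube-proxy and static-pod records report identical SLO and E2E values because their pull timestamps are the zero time 0001-01-01: nothing was pulled.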
Sep 12 17:14:47.222708 containerd[1474]: time="2025-09-12T17:14:47.222621624Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-7b767c467d-5wkhj,Uid:f57ffba9-8aaa-480f-b4b7-51042075d625,Namespace:calico-system,Attempt:0,}"
Sep 12 17:14:47.267669 kubelet[2556]: I0912 17:14:47.266564 2556 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/ebf14c92-8e49-4823-860b-9f0c85880bd5-var-run-calico\") pod \"calico-node-ftnhk\" (UID: \"ebf14c92-8e49-4823-860b-9f0c85880bd5\") " pod="calico-system/calico-node-ftnhk"
Sep 12 17:14:47.267669 kubelet[2556]: I0912 17:14:47.266631 2556 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/ebf14c92-8e49-4823-860b-9f0c85880bd5-xtables-lock\") pod \"calico-node-ftnhk\" (UID: \"ebf14c92-8e49-4823-860b-9f0c85880bd5\") " pod="calico-system/calico-node-ftnhk"
Sep 12 17:14:47.267669 kubelet[2556]: I0912 17:14:47.266659 2556 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/ebf14c92-8e49-4823-860b-9f0c85880bd5-policysync\") pod \"calico-node-ftnhk\" (UID: \"ebf14c92-8e49-4823-860b-9f0c85880bd5\") " pod="calico-system/calico-node-ftnhk"
Sep 12 17:14:47.267669 kubelet[2556]: I0912 17:14:47.266683 2556 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rqgs2\" (UniqueName: \"kubernetes.io/projected/ebf14c92-8e49-4823-860b-9f0c85880bd5-kube-api-access-rqgs2\") pod \"calico-node-ftnhk\" (UID: \"ebf14c92-8e49-4823-860b-9f0c85880bd5\") " pod="calico-system/calico-node-ftnhk"
Sep 12 17:14:47.267669 kubelet[2556]: I0912 17:14:47.266716 2556 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/ebf14c92-8e49-4823-860b-9f0c85880bd5-cni-log-dir\") pod \"calico-node-ftnhk\" (UID: \"ebf14c92-8e49-4823-860b-9f0c85880bd5\") " pod="calico-system/calico-node-ftnhk"
Sep 12 17:14:47.268024 kubelet[2556]: I0912 17:14:47.266734 2556 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/ebf14c92-8e49-4823-860b-9f0c85880bd5-flexvol-driver-host\") pod \"calico-node-ftnhk\" (UID: \"ebf14c92-8e49-4823-860b-9f0c85880bd5\") " pod="calico-system/calico-node-ftnhk"
Sep 12 17:14:47.268024 kubelet[2556]: I0912 17:14:47.266754 2556 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ebf14c92-8e49-4823-860b-9f0c85880bd5-tigera-ca-bundle\") pod \"calico-node-ftnhk\" (UID: \"ebf14c92-8e49-4823-860b-9f0c85880bd5\") " pod="calico-system/calico-node-ftnhk"
Sep 12 17:14:47.268024 kubelet[2556]: I0912 17:14:47.266773 2556 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/ebf14c92-8e49-4823-860b-9f0c85880bd5-var-lib-calico\") pod \"calico-node-ftnhk\" (UID: \"ebf14c92-8e49-4823-860b-9f0c85880bd5\") " pod="calico-system/calico-node-ftnhk"
Sep 12 17:14:47.268024 kubelet[2556]: I0912 17:14:47.266798 2556 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ebf14c92-8e49-4823-860b-9f0c85880bd5-lib-modules\") pod \"calico-node-ftnhk\" (UID: \"ebf14c92-8e49-4823-860b-9f0c85880bd5\") " pod="calico-system/calico-node-ftnhk"
Sep 12 17:14:47.268024 kubelet[2556]: I0912 17:14:47.266820 2556 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/ebf14c92-8e49-4823-860b-9f0c85880bd5-cni-net-dir\") pod \"calico-node-ftnhk\" (UID: \"ebf14c92-8e49-4823-860b-9f0c85880bd5\") " pod="calico-system/calico-node-ftnhk"
Sep 12 17:14:47.268599 kubelet[2556]: I0912 17:14:47.266838 2556 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/ebf14c92-8e49-4823-860b-9f0c85880bd5-node-certs\") pod \"calico-node-ftnhk\" (UID: \"ebf14c92-8e49-4823-860b-9f0c85880bd5\") " pod="calico-system/calico-node-ftnhk"
Sep 12 17:14:47.268599 kubelet[2556]: I0912 17:14:47.266888 2556 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/ebf14c92-8e49-4823-860b-9f0c85880bd5-cni-bin-dir\") pod \"calico-node-ftnhk\" (UID: \"ebf14c92-8e49-4823-860b-9f0c85880bd5\") " pod="calico-system/calico-node-ftnhk"
Sep 12 17:14:47.291002 containerd[1474]: time="2025-09-12T17:14:47.290021374Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Sep 12 17:14:47.291002 containerd[1474]: time="2025-09-12T17:14:47.290174690Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Sep 12 17:14:47.291002 containerd[1474]: time="2025-09-12T17:14:47.290191249Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Sep 12 17:14:47.291002 containerd[1474]: time="2025-09-12T17:14:47.290333805Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Sep 12 17:14:47.350447 systemd[1]: Started cri-containerd-c72f95e6b3df1c904309a66ca7dcf6085fe166d3b66d7c1eb570199c4a80b3b1.scope - libcontainer container c72f95e6b3df1c904309a66ca7dcf6085fe166d3b66d7c1eb570199c4a80b3b1.
Sep 12 17:14:47.378043 kubelet[2556]: E0912 17:14:47.377842 2556 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 17:14:47.378043 kubelet[2556]: W0912 17:14:47.377898 2556 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 17:14:47.378043 kubelet[2556]: E0912 17:14:47.377949 2556 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 17:14:47.396188 kubelet[2556]: E0912 17:14:47.396004 2556 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 17:14:47.396188 kubelet[2556]: W0912 17:14:47.396050 2556 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 17:14:47.396188 kubelet[2556]: E0912 17:14:47.396102 2556 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 17:14:47.476428 containerd[1474]: time="2025-09-12T17:14:47.476230280Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-7b767c467d-5wkhj,Uid:f57ffba9-8aaa-480f-b4b7-51042075d625,Namespace:calico-system,Attempt:0,} returns sandbox id \"c72f95e6b3df1c904309a66ca7dcf6085fe166d3b66d7c1eb570199c4a80b3b1\""
Sep 12 17:14:47.482526 containerd[1474]: time="2025-09-12T17:14:47.482058700Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.3\""
Sep 12 17:14:47.520043 containerd[1474]: time="2025-09-12T17:14:47.519289545Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-ftnhk,Uid:ebf14c92-8e49-4823-860b-9f0c85880bd5,Namespace:calico-system,Attempt:0,}"
Sep 12 17:14:47.522694 kubelet[2556]: E0912 17:14:47.522426 2556 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-krpgf" podUID="7c11d9b3-8068-4598-8721-3a4e4f793c52"
Sep 12 17:14:47.558090 kubelet[2556]: E0912 17:14:47.557999 2556 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 17:14:47.558090 kubelet[2556]: W0912 17:14:47.558052 2556 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 17:14:47.558386 kubelet[2556]: E0912 17:14:47.558123 2556 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 17:14:47.559124 kubelet[2556]: E0912 17:14:47.558516 2556 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 17:14:47.559124 kubelet[2556]: W0912 17:14:47.558541 2556 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 17:14:47.559124 kubelet[2556]: E0912 17:14:47.558630 2556 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 17:14:47.559385 kubelet[2556]: E0912 17:14:47.559362 2556 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 17:14:47.559385 kubelet[2556]: W0912 17:14:47.559380 2556 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 17:14:47.559446 kubelet[2556]: E0912 17:14:47.559395 2556 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 17:14:47.560044 kubelet[2556]: E0912 17:14:47.560014 2556 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 17:14:47.560044 kubelet[2556]: W0912 17:14:47.560037 2556 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 17:14:47.560170 kubelet[2556]: E0912 17:14:47.560053 2556 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 17:14:47.562487 kubelet[2556]: E0912 17:14:47.562428 2556 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 17:14:47.562487 kubelet[2556]: W0912 17:14:47.562476 2556 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 17:14:47.562671 kubelet[2556]: E0912 17:14:47.562501 2556 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 17:14:47.563005 kubelet[2556]: E0912 17:14:47.562976 2556 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 17:14:47.563005 kubelet[2556]: W0912 17:14:47.562999 2556 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 17:14:47.563117 kubelet[2556]: E0912 17:14:47.563011 2556 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 17:14:47.563958 kubelet[2556]: E0912 17:14:47.563904 2556 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 17:14:47.563958 kubelet[2556]: W0912 17:14:47.563950 2556 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 17:14:47.564084 kubelet[2556]: E0912 17:14:47.563973 2556 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 17:14:47.565486 kubelet[2556]: E0912 17:14:47.565434 2556 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 17:14:47.565486 kubelet[2556]: W0912 17:14:47.565477 2556 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 17:14:47.565486 kubelet[2556]: E0912 17:14:47.565500 2556 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 17:14:47.568604 kubelet[2556]: E0912 17:14:47.566236 2556 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 17:14:47.568604 kubelet[2556]: W0912 17:14:47.566258 2556 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 17:14:47.568604 kubelet[2556]: E0912 17:14:47.566272 2556 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 17:14:47.568865 kubelet[2556]: E0912 17:14:47.568733 2556 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 17:14:47.568865 kubelet[2556]: W0912 17:14:47.568748 2556 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 17:14:47.568865 kubelet[2556]: E0912 17:14:47.568770 2556 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 17:14:47.571202 kubelet[2556]: E0912 17:14:47.571047 2556 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 17:14:47.571202 kubelet[2556]: W0912 17:14:47.571186 2556 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 17:14:47.571459 kubelet[2556]: E0912 17:14:47.571223 2556 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 17:14:47.574214 kubelet[2556]: E0912 17:14:47.574155 2556 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 17:14:47.574214 kubelet[2556]: W0912 17:14:47.574189 2556 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 17:14:47.574214 kubelet[2556]: E0912 17:14:47.574223 2556 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 17:14:47.578454 kubelet[2556]: E0912 17:14:47.578189 2556 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 17:14:47.578454 kubelet[2556]: W0912 17:14:47.578242 2556 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 17:14:47.578454 kubelet[2556]: E0912 17:14:47.578281 2556 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 17:14:47.580261 kubelet[2556]: E0912 17:14:47.579619 2556 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 17:14:47.580261 kubelet[2556]: W0912 17:14:47.579672 2556 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 17:14:47.580261 kubelet[2556]: E0912 17:14:47.579709 2556 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 17:14:47.584368 kubelet[2556]: E0912 17:14:47.584272 2556 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 17:14:47.584368 kubelet[2556]: W0912 17:14:47.584328 2556 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 17:14:47.584368 kubelet[2556]: E0912 17:14:47.584375 2556 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 17:14:47.587252 kubelet[2556]: E0912 17:14:47.586962 2556 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 17:14:47.587252 kubelet[2556]: W0912 17:14:47.587011 2556 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 17:14:47.587252 kubelet[2556]: E0912 17:14:47.587052 2556 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 17:14:47.588974 kubelet[2556]: E0912 17:14:47.588910 2556 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 17:14:47.588974 kubelet[2556]: W0912 17:14:47.588962 2556 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 17:14:47.589233 kubelet[2556]: E0912 17:14:47.589004 2556 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 17:14:47.593300 kubelet[2556]: E0912 17:14:47.592889 2556 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 17:14:47.593300 kubelet[2556]: W0912 17:14:47.592935 2556 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 17:14:47.593300 kubelet[2556]: E0912 17:14:47.592974 2556 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 17:14:47.595190 kubelet[2556]: E0912 17:14:47.594214 2556 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 17:14:47.595190 kubelet[2556]: W0912 17:14:47.594257 2556 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 17:14:47.595190 kubelet[2556]: E0912 17:14:47.594285 2556 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 17:14:47.596868 containerd[1474]: time="2025-09-12T17:14:47.592146606Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Sep 12 17:14:47.596868 containerd[1474]: time="2025-09-12T17:14:47.592262962Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Sep 12 17:14:47.596868 containerd[1474]: time="2025-09-12T17:14:47.592300321Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Sep 12 17:14:47.596868 containerd[1474]: time="2025-09-12T17:14:47.592647470Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Sep 12 17:14:47.597314 kubelet[2556]: E0912 17:14:47.596444 2556 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 17:14:47.597314 kubelet[2556]: W0912 17:14:47.596481 2556 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 17:14:47.597314 kubelet[2556]: E0912 17:14:47.596512 2556 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 17:14:47.598825 kubelet[2556]: E0912 17:14:47.598492 2556 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 17:14:47.598825 kubelet[2556]: W0912 17:14:47.598531 2556 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 17:14:47.598825 kubelet[2556]: E0912 17:14:47.598560 2556 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 17:14:47.598825 kubelet[2556]: I0912 17:14:47.598606 2556 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7c11d9b3-8068-4598-8721-3a4e4f793c52-kubelet-dir\") pod \"csi-node-driver-krpgf\" (UID: \"7c11d9b3-8068-4598-8721-3a4e4f793c52\") " pod="calico-system/csi-node-driver-krpgf"
Sep 12 17:14:47.599169 kubelet[2556]: E0912 17:14:47.598927 2556 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 17:14:47.599169 kubelet[2556]: W0912 17:14:47.598942 2556 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 17:14:47.599169 kubelet[2556]: E0912 17:14:47.598955 2556 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 17:14:47.599169 kubelet[2556]: I0912 17:14:47.598975 2556 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/7c11d9b3-8068-4598-8721-3a4e4f793c52-registration-dir\") pod \"csi-node-driver-krpgf\" (UID: \"7c11d9b3-8068-4598-8721-3a4e4f793c52\") " pod="calico-system/csi-node-driver-krpgf"
Sep 12 17:14:47.599674 kubelet[2556]: E0912 17:14:47.599359 2556 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 17:14:47.599674 kubelet[2556]: W0912 17:14:47.599386 2556 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 17:14:47.599674 kubelet[2556]: E0912 17:14:47.599438 2556 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 17:14:47.600338 kubelet[2556]: I0912 17:14:47.600140 2556 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/7c11d9b3-8068-4598-8721-3a4e4f793c52-socket-dir\") pod \"csi-node-driver-krpgf\" (UID: \"7c11d9b3-8068-4598-8721-3a4e4f793c52\") " pod="calico-system/csi-node-driver-krpgf"
Sep 12 17:14:47.600573 kubelet[2556]: E0912 17:14:47.600385 2556 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 17:14:47.600573 kubelet[2556]: W0912 17:14:47.600408 2556 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 17:14:47.600573 kubelet[2556]: E0912 17:14:47.600428 2556 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 17:14:47.602540 kubelet[2556]: E0912 17:14:47.602278 2556 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 17:14:47.602540 kubelet[2556]: W0912 17:14:47.602317 2556 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 17:14:47.602540 kubelet[2556]: E0912 17:14:47.602380 2556 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 17:14:47.602827 kubelet[2556]: E0912 17:14:47.602749 2556 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 17:14:47.602827 kubelet[2556]: W0912 17:14:47.602764 2556 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 17:14:47.603304 kubelet[2556]: E0912 17:14:47.602918 2556 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 17:14:47.603304 kubelet[2556]: E0912 17:14:47.602946 2556 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 17:14:47.603304 kubelet[2556]: W0912 17:14:47.602956 2556 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 17:14:47.603304 kubelet[2556]: E0912 17:14:47.603045 2556 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 17:14:47.603304 kubelet[2556]: I0912 17:14:47.603166 2556 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/7c11d9b3-8068-4598-8721-3a4e4f793c52-varrun\") pod \"csi-node-driver-krpgf\" (UID: \"7c11d9b3-8068-4598-8721-3a4e4f793c52\") " pod="calico-system/csi-node-driver-krpgf"
Sep 12 17:14:47.603502 kubelet[2556]: E0912 17:14:47.603336 2556 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 17:14:47.603502 kubelet[2556]: W0912 17:14:47.603349 2556 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 17:14:47.603951 kubelet[2556]: E0912 17:14:47.603625 2556 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 17:14:47.603951 kubelet[2556]: E0912 17:14:47.603707 2556 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 17:14:47.603951 kubelet[2556]: W0912 17:14:47.603716 2556 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 17:14:47.603951 kubelet[2556]: E0912 17:14:47.603728 2556 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 17:14:47.604592 kubelet[2556]: E0912 17:14:47.604361 2556 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 17:14:47.604592 kubelet[2556]: W0912 17:14:47.604385 2556 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 17:14:47.604592 kubelet[2556]: E0912 17:14:47.604534 2556 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 17:14:47.604592 kubelet[2556]: I0912 17:14:47.604587 2556 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zdwmz\" (UniqueName: \"kubernetes.io/projected/7c11d9b3-8068-4598-8721-3a4e4f793c52-kube-api-access-zdwmz\") pod \"csi-node-driver-krpgf\" (UID: \"7c11d9b3-8068-4598-8721-3a4e4f793c52\") " pod="calico-system/csi-node-driver-krpgf"
Sep 12 17:14:47.606024 kubelet[2556]: E0912 17:14:47.605535 2556 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 17:14:47.606024 kubelet[2556]: W0912 17:14:47.605570 2556 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 17:14:47.606024 kubelet[2556]: E0912 17:14:47.605809 2556 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 17:14:47.607786 kubelet[2556]: E0912 17:14:47.607546 2556 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 17:14:47.607786 kubelet[2556]: W0912 17:14:47.607579 2556 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 17:14:47.607786 kubelet[2556]: E0912 17:14:47.607612 2556 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 17:14:47.608518 kubelet[2556]: E0912 17:14:47.607967 2556 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 17:14:47.608518 kubelet[2556]: W0912 17:14:47.607990 2556 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 17:14:47.608518 kubelet[2556]: E0912 17:14:47.608006 2556 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 17:14:47.608518 kubelet[2556]: E0912 17:14:47.608289 2556 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 17:14:47.608518 kubelet[2556]: W0912 17:14:47.608300 2556 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 17:14:47.608518 kubelet[2556]: E0912 17:14:47.608310 2556 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 17:14:47.609154 kubelet[2556]: E0912 17:14:47.608741 2556 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 17:14:47.609154 kubelet[2556]: W0912 17:14:47.608771 2556 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 17:14:47.609154 kubelet[2556]: E0912 17:14:47.608794 2556 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 17:14:47.642596 systemd[1]: Started cri-containerd-4095577192810c9a3b69e806108a255a713b810f3464738ac2c9862d4a3c1de5.scope - libcontainer container 4095577192810c9a3b69e806108a255a713b810f3464738ac2c9862d4a3c1de5.
Sep 12 17:14:47.713104 kubelet[2556]: E0912 17:14:47.712056 2556 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 17:14:47.713104 kubelet[2556]: W0912 17:14:47.712121 2556 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 17:14:47.713104 kubelet[2556]: E0912 17:14:47.712169 2556 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input" Sep 12 17:14:47.714731 kubelet[2556]: E0912 17:14:47.713599 2556 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:14:47.714731 kubelet[2556]: W0912 17:14:47.714788 2556 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:14:47.715526 kubelet[2556]: E0912 17:14:47.715143 2556 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:14:47.715526 kubelet[2556]: E0912 17:14:47.715427 2556 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:14:47.715526 kubelet[2556]: W0912 17:14:47.715446 2556 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:14:47.715526 kubelet[2556]: E0912 17:14:47.715479 2556 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:14:47.717720 kubelet[2556]: E0912 17:14:47.717654 2556 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:14:47.717720 kubelet[2556]: W0912 17:14:47.717700 2556 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:14:47.717720 kubelet[2556]: E0912 17:14:47.717780 2556 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:14:47.719731 kubelet[2556]: E0912 17:14:47.719674 2556 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:14:47.719731 kubelet[2556]: W0912 17:14:47.719727 2556 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:14:47.720098 kubelet[2556]: E0912 17:14:47.719869 2556 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:14:47.720098 kubelet[2556]: E0912 17:14:47.720046 2556 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:14:47.720098 kubelet[2556]: W0912 17:14:47.720056 2556 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:14:47.720098 kubelet[2556]: E0912 17:14:47.720124 2556 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:14:47.720098 kubelet[2556]: E0912 17:14:47.720251 2556 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:14:47.720098 kubelet[2556]: W0912 17:14:47.720261 2556 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:14:47.720547 kubelet[2556]: E0912 17:14:47.720520 2556 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:14:47.722977 kubelet[2556]: E0912 17:14:47.720608 2556 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:14:47.722977 kubelet[2556]: W0912 17:14:47.720632 2556 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:14:47.722977 kubelet[2556]: E0912 17:14:47.720650 2556 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:14:47.722977 kubelet[2556]: E0912 17:14:47.722136 2556 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:14:47.722977 kubelet[2556]: W0912 17:14:47.722171 2556 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:14:47.722977 kubelet[2556]: E0912 17:14:47.722206 2556 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:14:47.722977 kubelet[2556]: E0912 17:14:47.722618 2556 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:14:47.722977 kubelet[2556]: W0912 17:14:47.722633 2556 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:14:47.722977 kubelet[2556]: E0912 17:14:47.722648 2556 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:14:47.723439 kubelet[2556]: E0912 17:14:47.723391 2556 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:14:47.723439 kubelet[2556]: W0912 17:14:47.723410 2556 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:14:47.723439 kubelet[2556]: E0912 17:14:47.723432 2556 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:14:47.724305 kubelet[2556]: E0912 17:14:47.724268 2556 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:14:47.724305 kubelet[2556]: W0912 17:14:47.724295 2556 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:14:47.724439 kubelet[2556]: E0912 17:14:47.724323 2556 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:14:47.725363 kubelet[2556]: E0912 17:14:47.725328 2556 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:14:47.725363 kubelet[2556]: W0912 17:14:47.725353 2556 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:14:47.725614 kubelet[2556]: E0912 17:14:47.725384 2556 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:14:47.726036 kubelet[2556]: E0912 17:14:47.726011 2556 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:14:47.726036 kubelet[2556]: W0912 17:14:47.726033 2556 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:14:47.726271 kubelet[2556]: E0912 17:14:47.726229 2556 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:14:47.727710 kubelet[2556]: E0912 17:14:47.727510 2556 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:14:47.727710 kubelet[2556]: W0912 17:14:47.727538 2556 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:14:47.727710 kubelet[2556]: E0912 17:14:47.727586 2556 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:14:47.729430 kubelet[2556]: E0912 17:14:47.729386 2556 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:14:47.729430 kubelet[2556]: W0912 17:14:47.729429 2556 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:14:47.729771 kubelet[2556]: E0912 17:14:47.729566 2556 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:14:47.729771 kubelet[2556]: E0912 17:14:47.729714 2556 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:14:47.729771 kubelet[2556]: W0912 17:14:47.729725 2556 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:14:47.731237 kubelet[2556]: E0912 17:14:47.730237 2556 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:14:47.731237 kubelet[2556]: E0912 17:14:47.730353 2556 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:14:47.731237 kubelet[2556]: W0912 17:14:47.730401 2556 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:14:47.731237 kubelet[2556]: E0912 17:14:47.730453 2556 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:14:47.731237 kubelet[2556]: E0912 17:14:47.731184 2556 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:14:47.731237 kubelet[2556]: W0912 17:14:47.731207 2556 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:14:47.733713 kubelet[2556]: E0912 17:14:47.731856 2556 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:14:47.735087 kubelet[2556]: E0912 17:14:47.734917 2556 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:14:47.736354 kubelet[2556]: W0912 17:14:47.736135 2556 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:14:47.736354 kubelet[2556]: E0912 17:14:47.736218 2556 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:14:47.740540 kubelet[2556]: E0912 17:14:47.740215 2556 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:14:47.740540 kubelet[2556]: W0912 17:14:47.740268 2556 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:14:47.740540 kubelet[2556]: E0912 17:14:47.740370 2556 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:14:47.742455 kubelet[2556]: E0912 17:14:47.742179 2556 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:14:47.742455 kubelet[2556]: W0912 17:14:47.742219 2556 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:14:47.743666 kubelet[2556]: E0912 17:14:47.743190 2556 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:14:47.744735 kubelet[2556]: E0912 17:14:47.744139 2556 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:14:47.744735 kubelet[2556]: W0912 17:14:47.744172 2556 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:14:47.744735 kubelet[2556]: E0912 17:14:47.744256 2556 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:14:47.748813 kubelet[2556]: E0912 17:14:47.748355 2556 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:14:47.748813 kubelet[2556]: W0912 17:14:47.748420 2556 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:14:47.748813 kubelet[2556]: E0912 17:14:47.748497 2556 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:14:47.751434 kubelet[2556]: E0912 17:14:47.751393 2556 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:14:47.752405 kubelet[2556]: W0912 17:14:47.751921 2556 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:14:47.752405 kubelet[2556]: E0912 17:14:47.752005 2556 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:14:47.809325 kubelet[2556]: E0912 17:14:47.809269 2556 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:14:47.809699 kubelet[2556]: W0912 17:14:47.809575 2556 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:14:47.809699 kubelet[2556]: E0912 17:14:47.809625 2556 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
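The two kubelet errors in the repeated triplet come from one probe: kubelet execs the FlexVolume driver binary with `init`, the binary is missing, and the resulting empty stdout then fails JSON decoding. Below is a standalone Go sketch of that failure mode; it only approximates the shape of kubelet's driver-call path (it is not kubelet's actual driver-call.go), and the `driverStatus` struct is an assumed minimal subset of the driver's response format.

```go
package main

import (
	"encoding/json"
	"fmt"
	"os/exec"
)

// driverStatus is a hypothetical minimal subset of the JSON a FlexVolume
// driver is expected to print on stdout.
type driverStatus struct {
	Status       string          `json:"status"`
	Capabilities map[string]bool `json:"capabilities,omitempty"`
}

// probe mimics the shape of kubelet's FlexVolume probe: run the driver
// binary with "init" and decode its stdout as JSON.
func probe(executable string) (*driverStatus, error) {
	// When the binary is absent, output is empty and execErr is non-nil
	// (the exact exec error text differs between plain os/exec and
	// kubelet's exec wrapper, which reports "not found in $PATH").
	output, execErr := exec.Command(executable, "init").CombinedOutput()

	var status driverStatus
	// json.Unmarshal on empty input fails with
	// "unexpected end of JSON input" -- the error flooding this log.
	if err := json.Unmarshal(output, &status); err != nil {
		return nil, fmt.Errorf("failed to unmarshal output %q: %v (exec error: %v)", output, err, execErr)
	}
	return &status, nil
}

func main() {
	_, err := probe("/opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds")
	fmt.Println(err)
}
```

Run against a path with no binary behind it, this prints both halves of the log's complaint: the empty-output unmarshal failure plus the underlying exec error.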
Sep 12 17:14:47.983161 containerd[1474]: time="2025-09-12T17:14:47.980166734Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-ftnhk,Uid:ebf14c92-8e49-4823-860b-9f0c85880bd5,Namespace:calico-system,Attempt:0,} returns sandbox id \"4095577192810c9a3b69e806108a255a713b810f3464738ac2c9862d4a3c1de5\""
Sep 12 17:14:49.036034 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1493720521.mount: Deactivated successfully.
Sep 12 17:14:49.576466 kubelet[2556]: E0912 17:14:49.576217 2556 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-krpgf" podUID="7c11d9b3-8068-4598-8721-3a4e4f793c52"
Sep 12 17:14:49.757115 containerd[1474]: time="2025-09-12T17:14:49.755844487Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 17:14:49.757747 containerd[1474]: time="2025-09-12T17:14:49.757545279Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.30.3: active requests=0, bytes read=33105775"
Sep 12 17:14:49.758574 containerd[1474]: time="2025-09-12T17:14:49.758524891Z" level=info msg="ImageCreate event name:\"sha256:6a1496fdc48cc0b9ab3c10aef777497484efac5df9efbfbbdf9775e9583645cb\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 17:14:49.762836 containerd[1474]: time="2025-09-12T17:14:49.762759770Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:f4a3d61ffda9c98a53adeb412c5af404ca3727a3cc2d0b4ef28d197bdd47ecaa\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 17:14:49.765728 containerd[1474]: time="2025-09-12T17:14:49.765643448Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.30.3\" with image id \"sha256:6a1496fdc48cc0b9ab3c10aef777497484efac5df9efbfbbdf9775e9583645cb\", repo tag \"ghcr.io/flatcar/calico/typha:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:f4a3d61ffda9c98a53adeb412c5af404ca3727a3cc2d0b4ef28d197bdd47ecaa\", size \"33105629\" in 2.283438313s"
Sep 12 17:14:49.766109 containerd[1474]: time="2025-09-12T17:14:49.766044197Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.3\" returns image reference \"sha256:6a1496fdc48cc0b9ab3c10aef777497484efac5df9efbfbbdf9775e9583645cb\""
Sep 12 17:14:49.771983 containerd[1474]: time="2025-09-12T17:14:49.771914949Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\""
Sep 12 17:14:49.797685 containerd[1474]: time="2025-09-12T17:14:49.797588338Z" level=info msg="CreateContainer within sandbox \"c72f95e6b3df1c904309a66ca7dcf6085fe166d3b66d7c1eb570199c4a80b3b1\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}"
Sep 12 17:14:49.821933 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3701380938.mount: Deactivated successfully.
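As a side note, the "stop pulling" and "Pulled image ... in 2.283438313s" entries above give enough data to estimate the effective pull rate for the typha image. A small sketch of that arithmetic (figures copied from the entries; "effective" because the byte count includes all layers read during the pull):

```go
package main

import "fmt"

func main() {
	// Figures taken from the containerd entries above.
	const bytesRead = 33105775      // "active requests=0, bytes read=33105775"
	const pullSeconds = 2.283438313 // "in 2.283438313s"

	mibPerSec := float64(bytesRead) / pullSeconds / (1 << 20)
	fmt.Printf("effective pull rate: %.1f MiB/s\n", mibPerSec) // ~13.8 MiB/s
}
```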
Sep 12 17:14:49.828419 containerd[1474]: time="2025-09-12T17:14:49.828173907Z" level=info msg="CreateContainer within sandbox \"c72f95e6b3df1c904309a66ca7dcf6085fe166d3b66d7c1eb570199c4a80b3b1\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"c4abf7678d8928a879905d7bfb6238c66bc111924fa3d6b98d29b416ea75a48a\""
Sep 12 17:14:49.831143 containerd[1474]: time="2025-09-12T17:14:49.829960936Z" level=info msg="StartContainer for \"c4abf7678d8928a879905d7bfb6238c66bc111924fa3d6b98d29b416ea75a48a\""
Sep 12 17:14:49.894562 systemd[1]: Started cri-containerd-c4abf7678d8928a879905d7bfb6238c66bc111924fa3d6b98d29b416ea75a48a.scope - libcontainer container c4abf7678d8928a879905d7bfb6238c66bc111924fa3d6b98d29b416ea75a48a.
Sep 12 17:14:49.966291 containerd[1474]: time="2025-09-12T17:14:49.965883625Z" level=info msg="StartContainer for \"c4abf7678d8928a879905d7bfb6238c66bc111924fa3d6b98d29b416ea75a48a\" returns successfully"
Sep 12 17:14:50.801222 kubelet[2556]: I0912 17:14:50.801104 2556 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-7b767c467d-5wkhj" podStartSLOduration=2.5128525059999998 podStartE2EDuration="4.801030758s" podCreationTimestamp="2025-09-12 17:14:46 +0000 UTC" firstStartedPulling="2025-09-12 17:14:47.480156199 +0000 UTC m=+29.142897558" lastFinishedPulling="2025-09-12 17:14:49.768334451 +0000 UTC m=+31.431075810" observedRunningTime="2025-09-12 17:14:50.79915173 +0000 UTC m=+32.461893089" watchObservedRunningTime="2025-09-12 17:14:50.801030758 +0000 UTC m=+32.463772117"
[identical FlexVolume probe-failure triplets (driver-call.go:262 / driver-call.go:149 / plugins.go:695, same text as the 17:14:47.598 triplet above) repeated between 17:14:50.822 and 17:14:50.868; omitted]
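The pod_startup_latency_tracker entry above carries two derived figures that can be reproduced from its own timestamps: podStartE2EDuration is observedRunningTime minus podCreationTimestamp, and podStartSLOduration appears to be the E2E duration minus the image-pull window (lastFinishedPulling minus firstStartedPulling). A sketch checking that arithmetic; the timestamp strings are copied from the entry, while the SLO formula is inferred from the values and should be treated as an assumption:

```go
package main

import (
	"fmt"
	"time"
)

const layout = "2006-01-02 15:04:05.999999999 -0700 MST"

func mustParse(s string) time.Time {
	t, err := time.Parse(layout, s)
	if err != nil {
		panic(err)
	}
	return t
}

func main() {
	// Timestamps copied from the pod_startup_latency_tracker entry.
	created := mustParse("2025-09-12 17:14:46 +0000 UTC")
	firstPull := mustParse("2025-09-12 17:14:47.480156199 +0000 UTC")
	lastPull := mustParse("2025-09-12 17:14:49.768334451 +0000 UTC")
	running := mustParse("2025-09-12 17:14:50.801030758 +0000 UTC")

	e2e := running.Sub(created)          // 4.801030758s, matching podStartE2EDuration
	slo := e2e - lastPull.Sub(firstPull) // 2.512852506s, matching podStartSLOduration

	fmt.Println("podStartE2EDuration:", e2e)
	fmt.Println("podStartSLOduration:", slo)
}
```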
Sep 12 17:14:51.575035 kubelet[2556]: E0912 17:14:51.574902 2556 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-krpgf" podUID="7c11d9b3-8068-4598-8721-3a4e4f793c52"
Sep 12 17:14:51.781254 kubelet[2556]: I0912 17:14:51.780475 2556 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Sep 12 17:14:51.795176 containerd[1474]: time="2025-09-12T17:14:51.793451286Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 17:14:51.796191 containerd[1474]: time="2025-09-12T17:14:51.796111736Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3: active requests=0, bytes read=4266814"
Sep 12 17:14:51.796434 containerd[1474]: time="2025-09-12T17:14:51.796384689Z" level=info msg="ImageCreate event name:\"sha256:29e6f31ad72882b1b817dd257df6b7981e4d7d31d872b7fe2cf102c6e2af27a5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 17:14:51.801675 containerd[1474]: time="2025-09-12T17:14:51.801588152Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:81bdfcd9dbd36624dc35354e8c181c75631ba40e6c7df5820f5f56cea36f0ef9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 17:14:51.802491 containerd[1474]: time="2025-09-12T17:14:51.802443730Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" with image id \"sha256:29e6f31ad72882b1b817dd257df6b7981e4d7d31d872b7fe2cf102c6e2af27a5\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:81bdfcd9dbd36624dc35354e8c181c75631ba40e6c7df5820f5f56cea36f0ef9\", size \"5636015\" in 2.029941997s"
Sep 12 17:14:51.803672 containerd[1474]: time="2025-09-12T17:14:51.803625739Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" returns image reference \"sha256:29e6f31ad72882b1b817dd257df6b7981e4d7d31d872b7fe2cf102c6e2af27a5\""
Sep 12 17:14:51.810248 containerd[1474]: time="2025-09-12T17:14:51.810176607Z" level=info msg="CreateContainer within sandbox \"4095577192810c9a3b69e806108a255a713b810f3464738ac2c9862d4a3c1de5\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}"
[identical FlexVolume probe-failure triplets repeated between 17:14:51.837 and 17:14:51.841; omitted]
Error: unexpected end of JSON input" Sep 12 17:14:51.846460 containerd[1474]: time="2025-09-12T17:14:51.846363256Z" level=info msg="CreateContainer within sandbox \"4095577192810c9a3b69e806108a255a713b810f3464738ac2c9862d4a3c1de5\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"60df05c85bd1e59ec33bb44c93c07062fe98d7147e5bfefa505718b7e4d00054\"" Sep 12 17:14:51.847840 containerd[1474]: time="2025-09-12T17:14:51.847723941Z" level=info msg="StartContainer for \"60df05c85bd1e59ec33bb44c93c07062fe98d7147e5bfefa505718b7e4d00054\"" Sep 12 17:14:51.869484 kubelet[2556]: E0912 17:14:51.869421 2556 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:14:51.869484 kubelet[2556]: W0912 17:14:51.869468 2556 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:14:51.869484 kubelet[2556]: E0912 17:14:51.869516 2556 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:14:51.871757 kubelet[2556]: E0912 17:14:51.870719 2556 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:14:51.871757 kubelet[2556]: W0912 17:14:51.870756 2556 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:14:51.871757 kubelet[2556]: E0912 17:14:51.870788 2556 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:14:51.871757 kubelet[2556]: E0912 17:14:51.871161 2556 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:14:51.871757 kubelet[2556]: W0912 17:14:51.871172 2556 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:14:51.871757 kubelet[2556]: E0912 17:14:51.871184 2556 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:14:51.871757 kubelet[2556]: E0912 17:14:51.871579 2556 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:14:51.871757 kubelet[2556]: W0912 17:14:51.871594 2556 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:14:51.871757 kubelet[2556]: E0912 17:14:51.871663 2556 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:14:51.871984 kubelet[2556]: E0912 17:14:51.871874 2556 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:14:51.871984 kubelet[2556]: W0912 17:14:51.871884 2556 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:14:51.871984 kubelet[2556]: E0912 17:14:51.871898 2556 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:14:51.873527 kubelet[2556]: E0912 17:14:51.872087 2556 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:14:51.873527 kubelet[2556]: W0912 17:14:51.872104 2556 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:14:51.873527 kubelet[2556]: E0912 17:14:51.872122 2556 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:14:51.873527 kubelet[2556]: E0912 17:14:51.872354 2556 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:14:51.873527 kubelet[2556]: W0912 17:14:51.872364 2556 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:14:51.873527 kubelet[2556]: E0912 17:14:51.872399 2556 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:14:51.877030 kubelet[2556]: E0912 17:14:51.874948 2556 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:14:51.877030 kubelet[2556]: W0912 17:14:51.874995 2556 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:14:51.877030 kubelet[2556]: E0912 17:14:51.875029 2556 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:14:51.877030 kubelet[2556]: E0912 17:14:51.876378 2556 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:14:51.877030 kubelet[2556]: W0912 17:14:51.876406 2556 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:14:51.877030 kubelet[2556]: E0912 17:14:51.876530 2556 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:14:51.881116 kubelet[2556]: E0912 17:14:51.880928 2556 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:14:51.881116 kubelet[2556]: W0912 17:14:51.880974 2556 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:14:51.881727 kubelet[2556]: E0912 17:14:51.881483 2556 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:14:51.882125 kubelet[2556]: E0912 17:14:51.882031 2556 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:14:51.882125 kubelet[2556]: W0912 17:14:51.882054 2556 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:14:51.882375 kubelet[2556]: E0912 17:14:51.882288 2556 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:14:51.882824 kubelet[2556]: E0912 17:14:51.882714 2556 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:14:51.882824 kubelet[2556]: W0912 17:14:51.882746 2556 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:14:51.883100 kubelet[2556]: E0912 17:14:51.882999 2556 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:14:51.883691 kubelet[2556]: E0912 17:14:51.883506 2556 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:14:51.883691 kubelet[2556]: W0912 17:14:51.883529 2556 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:14:51.883691 kubelet[2556]: E0912 17:14:51.883572 2556 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:14:51.884057 kubelet[2556]: E0912 17:14:51.883920 2556 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:14:51.884057 kubelet[2556]: W0912 17:14:51.883934 2556 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:14:51.884057 kubelet[2556]: E0912 17:14:51.883953 2556 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:14:51.884674 kubelet[2556]: E0912 17:14:51.884526 2556 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:14:51.884674 kubelet[2556]: W0912 17:14:51.884542 2556 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:14:51.884674 kubelet[2556]: E0912 17:14:51.884561 2556 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:14:51.885644 kubelet[2556]: E0912 17:14:51.885205 2556 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:14:51.885644 kubelet[2556]: W0912 17:14:51.885238 2556 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:14:51.885644 kubelet[2556]: E0912 17:14:51.885261 2556 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:14:51.887376 kubelet[2556]: E0912 17:14:51.887327 2556 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:14:51.887376 kubelet[2556]: W0912 17:14:51.887365 2556 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:14:51.887565 kubelet[2556]: E0912 17:14:51.887408 2556 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:14:51.888620 kubelet[2556]: E0912 17:14:51.888587 2556 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:14:51.888620 kubelet[2556]: W0912 17:14:51.888618 2556 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:14:51.888781 kubelet[2556]: E0912 17:14:51.888645 2556 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:14:51.921543 systemd[1]: Started cri-containerd-60df05c85bd1e59ec33bb44c93c07062fe98d7147e5bfefa505718b7e4d00054.scope - libcontainer container 60df05c85bd1e59ec33bb44c93c07062fe98d7147e5bfefa505718b7e4d00054. Sep 12 17:14:51.970747 containerd[1474]: time="2025-09-12T17:14:51.970491477Z" level=info msg="StartContainer for \"60df05c85bd1e59ec33bb44c93c07062fe98d7147e5bfefa505718b7e4d00054\" returns successfully" Sep 12 17:14:51.997193 systemd[1]: cri-containerd-60df05c85bd1e59ec33bb44c93c07062fe98d7147e5bfefa505718b7e4d00054.scope: Deactivated successfully. 
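
The burst of kubelet errors above is the FlexVolume exec probe failing: kubelet shells out to the driver binary (here /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds) with the argument init and unmarshals whatever the process prints to stdout as JSON. Because the binary does not exist, stdout is empty and the parse fails with "unexpected end of JSON input", once per probe cycle. Below is a minimal stub of that handshake, sketched in Go under the assumption of the standard FlexVolume calling convention — it is an illustration, not the actual nodeagent~uds driver:

package main

import (
	"encoding/json"
	"fmt"
	"os"
)

// driverStatus is the reply shape kubelet's driver-call unmarshals.
type driverStatus struct {
	Status       string          `json:"status"`
	Message      string          `json:"message,omitempty"`
	Capabilities map[string]bool `json:"capabilities,omitempty"`
}

func reply(s driverStatus, code int) {
	out, _ := json.Marshal(s)
	fmt.Println(string(out))
	os.Exit(code)
}

func main() {
	if len(os.Args) > 1 && os.Args[1] == "init" {
		// An empty stdout here is exactly what produces
		// "unexpected end of JSON input" in the log above.
		reply(driverStatus{
			Status:       "Success",
			Capabilities: map[string]bool{"attach": false},
		}, 0)
	}
	// FlexVolume convention for calls the driver does not implement.
	reply(driverStatus{Status: "Not supported"}, 1)
}

Dropping a stub like this into the nodeagent~uds plugin directory would silence the probe errors, though the real fix is installing the driver binary the directory advertises.
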
Sep 12 17:14:52.127968 containerd[1474]: time="2025-09-12T17:14:52.127638479Z" level=info msg="shim disconnected" id=60df05c85bd1e59ec33bb44c93c07062fe98d7147e5bfefa505718b7e4d00054 namespace=k8s.io Sep 12 17:14:52.127968 containerd[1474]: time="2025-09-12T17:14:52.127805714Z" level=warning msg="cleaning up after shim disconnected" id=60df05c85bd1e59ec33bb44c93c07062fe98d7147e5bfefa505718b7e4d00054 namespace=k8s.io Sep 12 17:14:52.127968 containerd[1474]: time="2025-09-12T17:14:52.127821994Z" level=info msg="cleaning up dead shim" namespace=k8s.io Sep 12 17:14:52.787268 containerd[1474]: time="2025-09-12T17:14:52.787190906Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.3\"" Sep 12 17:14:52.828586 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-60df05c85bd1e59ec33bb44c93c07062fe98d7147e5bfefa505718b7e4d00054-rootfs.mount: Deactivated successfully. Sep 12 17:14:53.575292 kubelet[2556]: E0912 17:14:53.575123 2556 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-krpgf" podUID="7c11d9b3-8068-4598-8721-3a4e4f793c52" Sep 12 17:14:55.575571 kubelet[2556]: E0912 17:14:55.575043 2556 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-krpgf" podUID="7c11d9b3-8068-4598-8721-3a4e4f793c52" Sep 12 17:14:56.020610 containerd[1474]: time="2025-09-12T17:14:56.019038446Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:14:56.022403 containerd[1474]: time="2025-09-12T17:14:56.022310255Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.30.3: active requests=0, bytes read=65913477" Sep 12 17:14:56.025279 containerd[1474]: time="2025-09-12T17:14:56.025153113Z" level=info msg="ImageCreate event name:\"sha256:7077a1dc632ee598cbfa626f9e3c9bca5b20c0d1e1e557995890125b2e8d2e23\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:14:56.029001 containerd[1474]: time="2025-09-12T17:14:56.027725257Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:73d1e391050490d54e5bee8ff2b1a50a8be1746c98dc530361b00e8c0ab63f87\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:14:56.029001 containerd[1474]: time="2025-09-12T17:14:56.028638557Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.30.3\" with image id \"sha256:7077a1dc632ee598cbfa626f9e3c9bca5b20c0d1e1e557995890125b2e8d2e23\", repo tag \"ghcr.io/flatcar/calico/cni:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:73d1e391050490d54e5bee8ff2b1a50a8be1746c98dc530361b00e8c0ab63f87\", size \"67282718\" in 3.241376692s" Sep 12 17:14:56.029001 containerd[1474]: time="2025-09-12T17:14:56.028687876Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.3\" returns image reference \"sha256:7077a1dc632ee598cbfa626f9e3c9bca5b20c0d1e1e557995890125b2e8d2e23\"" Sep 12 17:14:56.038555 containerd[1474]: time="2025-09-12T17:14:56.038424943Z" level=info msg="CreateContainer within sandbox \"4095577192810c9a3b69e806108a255a713b810f3464738ac2c9862d4a3c1de5\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Sep 12 
17:14:56.062766 containerd[1474]: time="2025-09-12T17:14:56.062688454Z" level=info msg="CreateContainer within sandbox \"4095577192810c9a3b69e806108a255a713b810f3464738ac2c9862d4a3c1de5\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"438d48979ee3e70bca3989e1e6695adf284b9d225b145109e7db19d0541c8723\"" Sep 12 17:14:56.064719 containerd[1474]: time="2025-09-12T17:14:56.064657891Z" level=info msg="StartContainer for \"438d48979ee3e70bca3989e1e6695adf284b9d225b145109e7db19d0541c8723\"" Sep 12 17:14:56.116502 systemd[1]: Started cri-containerd-438d48979ee3e70bca3989e1e6695adf284b9d225b145109e7db19d0541c8723.scope - libcontainer container 438d48979ee3e70bca3989e1e6695adf284b9d225b145109e7db19d0541c8723. Sep 12 17:14:56.169712 containerd[1474]: time="2025-09-12T17:14:56.168592945Z" level=info msg="StartContainer for \"438d48979ee3e70bca3989e1e6695adf284b9d225b145109e7db19d0541c8723\" returns successfully" Sep 12 17:14:56.885238 containerd[1474]: time="2025-09-12T17:14:56.883867947Z" level=error msg="failed to reload cni configuration after receiving fs change event(WRITE \"/etc/cni/net.d/calico-kubeconfig\")" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Sep 12 17:14:56.888201 systemd[1]: cri-containerd-438d48979ee3e70bca3989e1e6695adf284b9d225b145109e7db19d0541c8723.scope: Deactivated successfully. Sep 12 17:14:56.920741 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-438d48979ee3e70bca3989e1e6695adf284b9d225b145109e7db19d0541c8723-rootfs.mount: Deactivated successfully. Sep 12 17:14:56.928800 kubelet[2556]: I0912 17:14:56.928741 2556 kubelet_node_status.go:501] "Fast updating node status as it just became ready" Sep 12 17:14:56.990103 containerd[1474]: time="2025-09-12T17:14:56.989660640Z" level=info msg="shim disconnected" id=438d48979ee3e70bca3989e1e6695adf284b9d225b145109e7db19d0541c8723 namespace=k8s.io Sep 12 17:14:56.990103 containerd[1474]: time="2025-09-12T17:14:56.989777557Z" level=warning msg="cleaning up after shim disconnected" id=438d48979ee3e70bca3989e1e6695adf284b9d225b145109e7db19d0541c8723 namespace=k8s.io Sep 12 17:14:56.990103 containerd[1474]: time="2025-09-12T17:14:56.989788877Z" level=info msg="cleaning up dead shim" namespace=k8s.io Sep 12 17:14:57.034873 systemd[1]: Created slice kubepods-besteffort-pod5b09b4dd_b380_4298_bca4_9ca9795e60bc.slice - libcontainer container kubepods-besteffort-pod5b09b4dd_b380_4298_bca4_9ca9795e60bc.slice. 
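
The "failed to reload cni configuration" entry above shows containerd's CRI plugin reacting to a filesystem event under /etc/cni/net.d: the install-cni container has written calico-kubeconfig, but no loadable network config exists yet, so the runtime stays NetworkReady=false and the earlier "cni plugin not initialized" pod errors persist. A rough sketch of that readiness check follows; the directory path is taken from the log, while the file-extension rule is an assumption based on libcni's conventions rather than containerd's exact code path:

package main

import (
	"fmt"
	"os"
	"path/filepath"
)

// cniConfigPresent reports whether a loadable CNI config file exists.
func cniConfigPresent(dir string) (bool, error) {
	entries, err := os.ReadDir(dir)
	if err != nil {
		return false, err
	}
	for _, e := range entries {
		// A kubeconfig or other helper file does not count; only
		// *.conf, *.conflist, and *.json are candidate network configs.
		switch filepath.Ext(e.Name()) {
		case ".conf", ".conflist", ".json":
			return true, nil
		}
	}
	return false, nil
}

func main() {
	ok, err := cniConfigPresent("/etc/cni/net.d")
	if err != nil || !ok {
		fmt.Println("cni plugin not initialized: no network config found")
		return
	}
	fmt.Println("cni config present")
}
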
Sep 12 17:14:57.048195 kubelet[2556]: W0912 17:14:57.045301 2556 reflector.go:569] object-"calico-system"/"whisker-ca-bundle": failed to list *v1.ConfigMap: configmaps "whisker-ca-bundle" is forbidden: User "system:node:ci-4081-3-6-2-0999f1dc3d" cannot list resource "configmaps" in API group "" in the namespace "calico-system": no relationship found between node 'ci-4081-3-6-2-0999f1dc3d' and this object Sep 12 17:14:57.048195 kubelet[2556]: E0912 17:14:57.045389 2556 reflector.go:166] "Unhandled Error" err="object-\"calico-system\"/\"whisker-ca-bundle\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"whisker-ca-bundle\" is forbidden: User \"system:node:ci-4081-3-6-2-0999f1dc3d\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"calico-system\": no relationship found between node 'ci-4081-3-6-2-0999f1dc3d' and this object" logger="UnhandledError" Sep 12 17:14:57.052772 systemd[1]: Created slice kubepods-besteffort-poda51ebaed_92f6_4627_9397_05d2c27efa46.slice - libcontainer container kubepods-besteffort-poda51ebaed_92f6_4627_9397_05d2c27efa46.slice. Sep 12 17:14:57.069843 kubelet[2556]: W0912 17:14:57.069423 2556 reflector.go:569] object-"kube-system"/"coredns": failed to list *v1.ConfigMap: configmaps "coredns" is forbidden: User "system:node:ci-4081-3-6-2-0999f1dc3d" cannot list resource "configmaps" in API group "" in the namespace "kube-system": no relationship found between node 'ci-4081-3-6-2-0999f1dc3d' and this object Sep 12 17:14:57.070786 kubelet[2556]: E0912 17:14:57.070289 2556 reflector.go:166] "Unhandled Error" err="object-\"kube-system\"/\"coredns\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"coredns\" is forbidden: User \"system:node:ci-4081-3-6-2-0999f1dc3d\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"kube-system\": no relationship found between node 'ci-4081-3-6-2-0999f1dc3d' and this object" logger="UnhandledError" Sep 12 17:14:57.071038 kubelet[2556]: W0912 17:14:57.069732 2556 reflector.go:569] object-"calico-apiserver"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:ci-4081-3-6-2-0999f1dc3d" cannot list resource "configmaps" in API group "" in the namespace "calico-apiserver": no relationship found between node 'ci-4081-3-6-2-0999f1dc3d' and this object Sep 12 17:14:57.071199 kubelet[2556]: E0912 17:14:57.071169 2556 reflector.go:166] "Unhandled Error" err="object-\"calico-apiserver\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:ci-4081-3-6-2-0999f1dc3d\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"calico-apiserver\": no relationship found between node 'ci-4081-3-6-2-0999f1dc3d' and this object" logger="UnhandledError" Sep 12 17:14:57.081789 systemd[1]: Created slice kubepods-besteffort-pod3b22a780_bb41_4d2d_9dad_2beb095f9e2c.slice - libcontainer container kubepods-besteffort-pod3b22a780_bb41_4d2d_9dad_2beb095f9e2c.slice. 
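
The reflector.go denials above come from the node authorizer rather than a missing RBAC rule: a kubelet may read a ConfigMap or Secret only once a pod bound to its node references it, and these pods are still being set up, so the first list attempts race the binding and are rejected with "no relationship found between node ... and this object". A hedged client-go sketch of the per-object, field-selector-scoped fetch kubelet performs is below; the kubeconfig path is an assumption for illustration:

package main

import (
	"context"
	"fmt"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	// Assumed kubelet credential location; adjust for the distro.
	cfg, err := clientcmd.BuildConfigFromFlags("", "/var/lib/kubelet/kubeconfig")
	if err != nil {
		panic(err)
	}
	client := kubernetes.NewForConfigOrDie(cfg)

	// Kubelet lists with a metadata.name field selector rather than
	// listing the whole namespace, so authorization is per object.
	cms, err := client.CoreV1().ConfigMaps("kube-system").List(context.TODO(),
		metav1.ListOptions{FieldSelector: "metadata.name=coredns"})
	if err != nil {
		fmt.Println("list denied:", err) // Forbidden until a pod on this node references it
		return
	}
	fmt.Println("items:", len(cms.Items))
}

Once the pod-to-node binding propagates, the same request succeeds, which is why these warnings are transient during pod startup.
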
Sep 12 17:14:57.086668 kubelet[2556]: W0912 17:14:57.085103 2556 reflector.go:569] object-"calico-system"/"goldmane-key-pair": failed to list *v1.Secret: secrets "goldmane-key-pair" is forbidden: User "system:node:ci-4081-3-6-2-0999f1dc3d" cannot list resource "secrets" in API group "" in the namespace "calico-system": no relationship found between node 'ci-4081-3-6-2-0999f1dc3d' and this object Sep 12 17:14:57.086947 kubelet[2556]: E0912 17:14:57.086837 2556 reflector.go:166] "Unhandled Error" err="object-\"calico-system\"/\"goldmane-key-pair\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"goldmane-key-pair\" is forbidden: User \"system:node:ci-4081-3-6-2-0999f1dc3d\" cannot list resource \"secrets\" in API group \"\" in the namespace \"calico-system\": no relationship found between node 'ci-4081-3-6-2-0999f1dc3d' and this object" logger="UnhandledError" Sep 12 17:14:57.093201 systemd[1]: Created slice kubepods-besteffort-pod8a36e632_808a_47ed_b4e3_b9bcf0459096.slice - libcontainer container kubepods-besteffort-pod8a36e632_808a_47ed_b4e3_b9bcf0459096.slice. Sep 12 17:14:57.108546 systemd[1]: Created slice kubepods-burstable-podecddc12a_94e3_4f27_a97d_480a79c66c1a.slice - libcontainer container kubepods-burstable-podecddc12a_94e3_4f27_a97d_480a79c66c1a.slice. Sep 12 17:14:57.121969 kubelet[2556]: I0912 17:14:57.117416 2556 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r5v9n\" (UniqueName: \"kubernetes.io/projected/59f64a22-a697-45df-a931-7e5ec9919a4c-kube-api-access-r5v9n\") pod \"coredns-668d6bf9bc-j49g9\" (UID: \"59f64a22-a697-45df-a931-7e5ec9919a4c\") " pod="kube-system/coredns-668d6bf9bc-j49g9" Sep 12 17:14:57.121969 kubelet[2556]: I0912 17:14:57.117559 2556 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5b09b4dd-b380-4298-bca4-9ca9795e60bc-whisker-ca-bundle\") pod \"whisker-796fc4c7cd-sr5c6\" (UID: \"5b09b4dd-b380-4298-bca4-9ca9795e60bc\") " pod="calico-system/whisker-796fc4c7cd-sr5c6" Sep 12 17:14:57.121969 kubelet[2556]: I0912 17:14:57.117589 2556 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dwfsp\" (UniqueName: \"kubernetes.io/projected/0d01a44b-0670-4021-80bc-06b979ee8824-kube-api-access-dwfsp\") pod \"calico-apiserver-7df9fb7db-kpktm\" (UID: \"0d01a44b-0670-4021-80bc-06b979ee8824\") " pod="calico-apiserver/calico-apiserver-7df9fb7db-kpktm" Sep 12 17:14:57.121969 kubelet[2556]: I0912 17:14:57.117633 2556 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9hvlr\" (UniqueName: \"kubernetes.io/projected/3b22a780-bb41-4d2d-9dad-2beb095f9e2c-kube-api-access-9hvlr\") pod \"goldmane-54d579b49d-fnsjh\" (UID: \"3b22a780-bb41-4d2d-9dad-2beb095f9e2c\") " pod="calico-system/goldmane-54d579b49d-fnsjh" Sep 12 17:14:57.121969 kubelet[2556]: I0912 17:14:57.117661 2556 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xfz92\" (UniqueName: \"kubernetes.io/projected/ecddc12a-94e3-4f27-a97d-480a79c66c1a-kube-api-access-xfz92\") pod \"coredns-668d6bf9bc-pc7pt\" (UID: \"ecddc12a-94e3-4f27-a97d-480a79c66c1a\") " pod="kube-system/coredns-668d6bf9bc-pc7pt" Sep 12 17:14:57.122489 kubelet[2556]: I0912 17:14:57.117681 2556 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/0d01a44b-0670-4021-80bc-06b979ee8824-calico-apiserver-certs\") pod \"calico-apiserver-7df9fb7db-kpktm\" (UID: \"0d01a44b-0670-4021-80bc-06b979ee8824\") " pod="calico-apiserver/calico-apiserver-7df9fb7db-kpktm" Sep 12 17:14:57.122489 kubelet[2556]: I0912 17:14:57.117714 2556 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/5b09b4dd-b380-4298-bca4-9ca9795e60bc-whisker-backend-key-pair\") pod \"whisker-796fc4c7cd-sr5c6\" (UID: \"5b09b4dd-b380-4298-bca4-9ca9795e60bc\") " pod="calico-system/whisker-796fc4c7cd-sr5c6" Sep 12 17:14:57.122489 kubelet[2556]: I0912 17:14:57.117742 2556 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p2jl8\" (UniqueName: \"kubernetes.io/projected/5b09b4dd-b380-4298-bca4-9ca9795e60bc-kube-api-access-p2jl8\") pod \"whisker-796fc4c7cd-sr5c6\" (UID: \"5b09b4dd-b380-4298-bca4-9ca9795e60bc\") " pod="calico-system/whisker-796fc4c7cd-sr5c6" Sep 12 17:14:57.122489 kubelet[2556]: I0912 17:14:57.117783 2556 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/3b22a780-bb41-4d2d-9dad-2beb095f9e2c-goldmane-key-pair\") pod \"goldmane-54d579b49d-fnsjh\" (UID: \"3b22a780-bb41-4d2d-9dad-2beb095f9e2c\") " pod="calico-system/goldmane-54d579b49d-fnsjh" Sep 12 17:14:57.122489 kubelet[2556]: I0912 17:14:57.117818 2556 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dnlz6\" (UniqueName: \"kubernetes.io/projected/8a36e632-808a-47ed-b4e3-b9bcf0459096-kube-api-access-dnlz6\") pod \"calico-apiserver-7df9fb7db-7p65c\" (UID: \"8a36e632-808a-47ed-b4e3-b9bcf0459096\") " pod="calico-apiserver/calico-apiserver-7df9fb7db-7p65c" Sep 12 17:14:57.122609 kubelet[2556]: I0912 17:14:57.117842 2556 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a51ebaed-92f6-4627-9397-05d2c27efa46-tigera-ca-bundle\") pod \"calico-kube-controllers-6fff4fc9-pkdxs\" (UID: \"a51ebaed-92f6-4627-9397-05d2c27efa46\") " pod="calico-system/calico-kube-controllers-6fff4fc9-pkdxs" Sep 12 17:14:57.122609 kubelet[2556]: I0912 17:14:57.117871 2556 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b22a780-bb41-4d2d-9dad-2beb095f9e2c-config\") pod \"goldmane-54d579b49d-fnsjh\" (UID: \"3b22a780-bb41-4d2d-9dad-2beb095f9e2c\") " pod="calico-system/goldmane-54d579b49d-fnsjh" Sep 12 17:14:57.122609 kubelet[2556]: I0912 17:14:57.117894 2556 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ecddc12a-94e3-4f27-a97d-480a79c66c1a-config-volume\") pod \"coredns-668d6bf9bc-pc7pt\" (UID: \"ecddc12a-94e3-4f27-a97d-480a79c66c1a\") " pod="kube-system/coredns-668d6bf9bc-pc7pt" Sep 12 17:14:57.122609 kubelet[2556]: I0912 17:14:57.117918 2556 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/59f64a22-a697-45df-a931-7e5ec9919a4c-config-volume\") pod \"coredns-668d6bf9bc-j49g9\" (UID: \"59f64a22-a697-45df-a931-7e5ec9919a4c\") " 
pod="kube-system/coredns-668d6bf9bc-j49g9" Sep 12 17:14:57.122609 kubelet[2556]: I0912 17:14:57.117954 2556 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/8a36e632-808a-47ed-b4e3-b9bcf0459096-calico-apiserver-certs\") pod \"calico-apiserver-7df9fb7db-7p65c\" (UID: \"8a36e632-808a-47ed-b4e3-b9bcf0459096\") " pod="calico-apiserver/calico-apiserver-7df9fb7db-7p65c" Sep 12 17:14:57.122733 kubelet[2556]: I0912 17:14:57.117980 2556 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58tkf\" (UniqueName: \"kubernetes.io/projected/a51ebaed-92f6-4627-9397-05d2c27efa46-kube-api-access-58tkf\") pod \"calico-kube-controllers-6fff4fc9-pkdxs\" (UID: \"a51ebaed-92f6-4627-9397-05d2c27efa46\") " pod="calico-system/calico-kube-controllers-6fff4fc9-pkdxs" Sep 12 17:14:57.122733 kubelet[2556]: I0912 17:14:57.118007 2556 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3b22a780-bb41-4d2d-9dad-2beb095f9e2c-goldmane-ca-bundle\") pod \"goldmane-54d579b49d-fnsjh\" (UID: \"3b22a780-bb41-4d2d-9dad-2beb095f9e2c\") " pod="calico-system/goldmane-54d579b49d-fnsjh" Sep 12 17:14:57.132217 systemd[1]: Created slice kubepods-besteffort-pod0d01a44b_0670_4021_80bc_06b979ee8824.slice - libcontainer container kubepods-besteffort-pod0d01a44b_0670_4021_80bc_06b979ee8824.slice. Sep 12 17:14:57.149276 systemd[1]: Created slice kubepods-burstable-pod59f64a22_a697_45df_a931_7e5ec9919a4c.slice - libcontainer container kubepods-burstable-pod59f64a22_a697_45df_a931_7e5ec9919a4c.slice. Sep 12 17:14:57.370595 containerd[1474]: time="2025-09-12T17:14:57.369714141Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-6fff4fc9-pkdxs,Uid:a51ebaed-92f6-4627-9397-05d2c27efa46,Namespace:calico-system,Attempt:0,}" Sep 12 17:14:57.483060 containerd[1474]: time="2025-09-12T17:14:57.482586162Z" level=error msg="Failed to destroy network for sandbox \"ce2cd271b4fe965a5586a193e0d10b93b9e9edf084b1af2e80f36b93949f8ed8\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:14:57.486118 containerd[1474]: time="2025-09-12T17:14:57.484048971Z" level=error msg="encountered an error cleaning up failed sandbox \"ce2cd271b4fe965a5586a193e0d10b93b9e9edf084b1af2e80f36b93949f8ed8\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:14:57.486118 containerd[1474]: time="2025-09-12T17:14:57.484252167Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-6fff4fc9-pkdxs,Uid:a51ebaed-92f6-4627-9397-05d2c27efa46,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"ce2cd271b4fe965a5586a193e0d10b93b9e9edf084b1af2e80f36b93949f8ed8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:14:57.486582 kubelet[2556]: E0912 17:14:57.484691 2556 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = 
failed to setup network for sandbox \"ce2cd271b4fe965a5586a193e0d10b93b9e9edf084b1af2e80f36b93949f8ed8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:14:57.486582 kubelet[2556]: E0912 17:14:57.484834 2556 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ce2cd271b4fe965a5586a193e0d10b93b9e9edf084b1af2e80f36b93949f8ed8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-6fff4fc9-pkdxs" Sep 12 17:14:57.486582 kubelet[2556]: E0912 17:14:57.484865 2556 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ce2cd271b4fe965a5586a193e0d10b93b9e9edf084b1af2e80f36b93949f8ed8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-6fff4fc9-pkdxs" Sep 12 17:14:57.487457 kubelet[2556]: E0912 17:14:57.484939 2556 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-6fff4fc9-pkdxs_calico-system(a51ebaed-92f6-4627-9397-05d2c27efa46)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-6fff4fc9-pkdxs_calico-system(a51ebaed-92f6-4627-9397-05d2c27efa46)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"ce2cd271b4fe965a5586a193e0d10b93b9e9edf084b1af2e80f36b93949f8ed8\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-6fff4fc9-pkdxs" podUID="a51ebaed-92f6-4627-9397-05d2c27efa46" Sep 12 17:14:57.586343 systemd[1]: Created slice kubepods-besteffort-pod7c11d9b3_8068_4598_8721_3a4e4f793c52.slice - libcontainer container kubepods-besteffort-pod7c11d9b3_8068_4598_8721_3a4e4f793c52.slice. 
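
Every sandbox failure in this stretch carries the same root cause string: the Calico CNI binary checks /var/lib/calico/nodename, a file that the calico/node container writes on startup, and refuses pod network setup and teardown until it exists. A simplified illustration of that guard follows — the real check lives in projectcalico's CNI plugin, and the error text here merely mirrors the log:

package main

import (
	"fmt"
	"os"
	"strings"
)

const nodenameFile = "/var/lib/calico/nodename"

// nodename returns the node identity calico/node recorded, or the same
// style of guidance the log shows when the file is absent.
func nodename() (string, error) {
	data, err := os.ReadFile(nodenameFile)
	if err != nil {
		return "", fmt.Errorf("stat %s: %w: check that the calico/node "+
			"container is running and has mounted /var/lib/calico/",
			nodenameFile, err)
	}
	return strings.TrimSpace(string(data)), nil
}

func main() {
	name, err := nodename()
	if err != nil {
		fmt.Println("CNI ADD would fail:", err)
		return
	}
	fmt.Println("node:", name)
}

This is also why the failures clear on their own once the calico-node DaemonSet pod finishes starting: the file appears and subsequent sandbox attempts succeed.
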
Sep 12 17:14:57.592789 containerd[1474]: time="2025-09-12T17:14:57.592429567Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-krpgf,Uid:7c11d9b3-8068-4598-8721-3a4e4f793c52,Namespace:calico-system,Attempt:0,}" Sep 12 17:14:57.680793 containerd[1474]: time="2025-09-12T17:14:57.680380513Z" level=error msg="Failed to destroy network for sandbox \"4144508f92b8377437d05d3e3f54b9c688ba69ea087844d5f28e60164de8e266\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:14:57.681661 containerd[1474]: time="2025-09-12T17:14:57.681524289Z" level=error msg="encountered an error cleaning up failed sandbox \"4144508f92b8377437d05d3e3f54b9c688ba69ea087844d5f28e60164de8e266\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:14:57.681801 containerd[1474]: time="2025-09-12T17:14:57.681707845Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-krpgf,Uid:7c11d9b3-8068-4598-8721-3a4e4f793c52,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"4144508f92b8377437d05d3e3f54b9c688ba69ea087844d5f28e60164de8e266\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:14:57.683208 kubelet[2556]: E0912 17:14:57.682150 2556 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4144508f92b8377437d05d3e3f54b9c688ba69ea087844d5f28e60164de8e266\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:14:57.683208 kubelet[2556]: E0912 17:14:57.682253 2556 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4144508f92b8377437d05d3e3f54b9c688ba69ea087844d5f28e60164de8e266\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-krpgf" Sep 12 17:14:57.683208 kubelet[2556]: E0912 17:14:57.682281 2556 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4144508f92b8377437d05d3e3f54b9c688ba69ea087844d5f28e60164de8e266\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-krpgf" Sep 12 17:14:57.683660 kubelet[2556]: E0912 17:14:57.682353 2556 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-krpgf_calico-system(7c11d9b3-8068-4598-8721-3a4e4f793c52)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-krpgf_calico-system(7c11d9b3-8068-4598-8721-3a4e4f793c52)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox 
\\\"4144508f92b8377437d05d3e3f54b9c688ba69ea087844d5f28e60164de8e266\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-krpgf" podUID="7c11d9b3-8068-4598-8721-3a4e4f793c52" Sep 12 17:14:57.821341 containerd[1474]: time="2025-09-12T17:14:57.820436281Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.3\"" Sep 12 17:14:57.824181 kubelet[2556]: I0912 17:14:57.823679 2556 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4144508f92b8377437d05d3e3f54b9c688ba69ea087844d5f28e60164de8e266" Sep 12 17:14:57.829606 containerd[1474]: time="2025-09-12T17:14:57.828170998Z" level=info msg="StopPodSandbox for \"4144508f92b8377437d05d3e3f54b9c688ba69ea087844d5f28e60164de8e266\"" Sep 12 17:14:57.829606 containerd[1474]: time="2025-09-12T17:14:57.829340934Z" level=info msg="Ensure that sandbox 4144508f92b8377437d05d3e3f54b9c688ba69ea087844d5f28e60164de8e266 in task-service has been cleanup successfully" Sep 12 17:14:57.833622 kubelet[2556]: I0912 17:14:57.833534 2556 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ce2cd271b4fe965a5586a193e0d10b93b9e9edf084b1af2e80f36b93949f8ed8" Sep 12 17:14:57.836608 containerd[1474]: time="2025-09-12T17:14:57.836506463Z" level=info msg="StopPodSandbox for \"ce2cd271b4fe965a5586a193e0d10b93b9e9edf084b1af2e80f36b93949f8ed8\"" Sep 12 17:14:57.836890 containerd[1474]: time="2025-09-12T17:14:57.836836976Z" level=info msg="Ensure that sandbox ce2cd271b4fe965a5586a193e0d10b93b9e9edf084b1af2e80f36b93949f8ed8 in task-service has been cleanup successfully" Sep 12 17:14:57.914098 containerd[1474]: time="2025-09-12T17:14:57.912737776Z" level=error msg="StopPodSandbox for \"4144508f92b8377437d05d3e3f54b9c688ba69ea087844d5f28e60164de8e266\" failed" error="failed to destroy network for sandbox \"4144508f92b8377437d05d3e3f54b9c688ba69ea087844d5f28e60164de8e266\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:14:57.914409 kubelet[2556]: E0912 17:14:57.913384 2556 log.go:32] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"4144508f92b8377437d05d3e3f54b9c688ba69ea087844d5f28e60164de8e266\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="4144508f92b8377437d05d3e3f54b9c688ba69ea087844d5f28e60164de8e266" Sep 12 17:14:57.914409 kubelet[2556]: E0912 17:14:57.913523 2556 kuberuntime_manager.go:1546] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"4144508f92b8377437d05d3e3f54b9c688ba69ea087844d5f28e60164de8e266"} Sep 12 17:14:57.914409 kubelet[2556]: E0912 17:14:57.913641 2556 kuberuntime_manager.go:1146] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"7c11d9b3-8068-4598-8721-3a4e4f793c52\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"4144508f92b8377437d05d3e3f54b9c688ba69ea087844d5f28e60164de8e266\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Sep 12 
17:14:57.914409 kubelet[2556]: E0912 17:14:57.913673 2556 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"7c11d9b3-8068-4598-8721-3a4e4f793c52\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"4144508f92b8377437d05d3e3f54b9c688ba69ea087844d5f28e60164de8e266\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-krpgf" podUID="7c11d9b3-8068-4598-8721-3a4e4f793c52" Sep 12 17:14:57.917695 containerd[1474]: time="2025-09-12T17:14:57.917597273Z" level=error msg="StopPodSandbox for \"ce2cd271b4fe965a5586a193e0d10b93b9e9edf084b1af2e80f36b93949f8ed8\" failed" error="failed to destroy network for sandbox \"ce2cd271b4fe965a5586a193e0d10b93b9e9edf084b1af2e80f36b93949f8ed8\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:14:57.918211 kubelet[2556]: E0912 17:14:57.918048 2556 log.go:32] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"ce2cd271b4fe965a5586a193e0d10b93b9e9edf084b1af2e80f36b93949f8ed8\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="ce2cd271b4fe965a5586a193e0d10b93b9e9edf084b1af2e80f36b93949f8ed8" Sep 12 17:14:57.918510 kubelet[2556]: E0912 17:14:57.918356 2556 kuberuntime_manager.go:1546] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"ce2cd271b4fe965a5586a193e0d10b93b9e9edf084b1af2e80f36b93949f8ed8"} Sep 12 17:14:57.918510 kubelet[2556]: E0912 17:14:57.918416 2556 kuberuntime_manager.go:1146] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"a51ebaed-92f6-4627-9397-05d2c27efa46\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce2cd271b4fe965a5586a193e0d10b93b9e9edf084b1af2e80f36b93949f8ed8\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Sep 12 17:14:57.918510 kubelet[2556]: E0912 17:14:57.918445 2556 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"a51ebaed-92f6-4627-9397-05d2c27efa46\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ce2cd271b4fe965a5586a193e0d10b93b9e9edf084b1af2e80f36b93949f8ed8\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-6fff4fc9-pkdxs" podUID="a51ebaed-92f6-4627-9397-05d2c27efa46" Sep 12 17:14:57.989742 containerd[1474]: time="2025-09-12T17:14:57.989566917Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-fnsjh,Uid:3b22a780-bb41-4d2d-9dad-2beb095f9e2c,Namespace:calico-system,Attempt:0,}" Sep 12 17:14:58.111946 containerd[1474]: time="2025-09-12T17:14:58.111837975Z" level=error msg="Failed to destroy network for sandbox \"13b1e6e1079aeb74408a4c106a5914b00d680c225078fed150adc2cc99d95c7a\"" 
error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:14:58.112727 containerd[1474]: time="2025-09-12T17:14:58.112658919Z" level=error msg="encountered an error cleaning up failed sandbox \"13b1e6e1079aeb74408a4c106a5914b00d680c225078fed150adc2cc99d95c7a\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:14:58.113213 containerd[1474]: time="2025-09-12T17:14:58.112758397Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-fnsjh,Uid:3b22a780-bb41-4d2d-9dad-2beb095f9e2c,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"13b1e6e1079aeb74408a4c106a5914b00d680c225078fed150adc2cc99d95c7a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:14:58.114001 kubelet[2556]: E0912 17:14:58.113355 2556 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"13b1e6e1079aeb74408a4c106a5914b00d680c225078fed150adc2cc99d95c7a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:14:58.114001 kubelet[2556]: E0912 17:14:58.113486 2556 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"13b1e6e1079aeb74408a4c106a5914b00d680c225078fed150adc2cc99d95c7a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-54d579b49d-fnsjh" Sep 12 17:14:58.114001 kubelet[2556]: E0912 17:14:58.113517 2556 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"13b1e6e1079aeb74408a4c106a5914b00d680c225078fed150adc2cc99d95c7a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-54d579b49d-fnsjh" Sep 12 17:14:58.114728 kubelet[2556]: E0912 17:14:58.113595 2556 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-54d579b49d-fnsjh_calico-system(3b22a780-bb41-4d2d-9dad-2beb095f9e2c)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-54d579b49d-fnsjh_calico-system(3b22a780-bb41-4d2d-9dad-2beb095f9e2c)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"13b1e6e1079aeb74408a4c106a5914b00d680c225078fed150adc2cc99d95c7a\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-54d579b49d-fnsjh" podUID="3b22a780-bb41-4d2d-9dad-2beb095f9e2c" Sep 12 17:14:58.222814 kubelet[2556]: E0912 17:14:58.221804 2556 configmap.go:193] Couldn't get configMap kube-system/coredns: failed to sync 
configmap cache: timed out waiting for the condition Sep 12 17:14:58.222814 kubelet[2556]: E0912 17:14:58.221969 2556 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/59f64a22-a697-45df-a931-7e5ec9919a4c-config-volume podName:59f64a22-a697-45df-a931-7e5ec9919a4c nodeName:}" failed. No retries permitted until 2025-09-12 17:14:58.72193329 +0000 UTC m=+40.384674649 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-volume" (UniqueName: "kubernetes.io/configmap/59f64a22-a697-45df-a931-7e5ec9919a4c-config-volume") pod "coredns-668d6bf9bc-j49g9" (UID: "59f64a22-a697-45df-a931-7e5ec9919a4c") : failed to sync configmap cache: timed out waiting for the condition Sep 12 17:14:58.222814 kubelet[2556]: E0912 17:14:58.221833 2556 configmap.go:193] Couldn't get configMap kube-system/coredns: failed to sync configmap cache: timed out waiting for the condition Sep 12 17:14:58.222814 kubelet[2556]: E0912 17:14:58.222382 2556 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ecddc12a-94e3-4f27-a97d-480a79c66c1a-config-volume podName:ecddc12a-94e3-4f27-a97d-480a79c66c1a nodeName:}" failed. No retries permitted until 2025-09-12 17:14:58.722359601 +0000 UTC m=+40.385100960 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-volume" (UniqueName: "kubernetes.io/configmap/ecddc12a-94e3-4f27-a97d-480a79c66c1a-config-volume") pod "coredns-668d6bf9bc-pc7pt" (UID: "ecddc12a-94e3-4f27-a97d-480a79c66c1a") : failed to sync configmap cache: timed out waiting for the condition Sep 12 17:14:58.254762 containerd[1474]: time="2025-09-12T17:14:58.254690902Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-796fc4c7cd-sr5c6,Uid:5b09b4dd-b380-4298-bca4-9ca9795e60bc,Namespace:calico-system,Attempt:0,}" Sep 12 17:14:58.286544 kubelet[2556]: E0912 17:14:58.285802 2556 projected.go:288] Couldn't get configMap calico-apiserver/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Sep 12 17:14:58.286544 kubelet[2556]: E0912 17:14:58.285880 2556 projected.go:194] Error preparing data for projected volume kube-api-access-dwfsp for pod calico-apiserver/calico-apiserver-7df9fb7db-kpktm: failed to sync configmap cache: timed out waiting for the condition Sep 12 17:14:58.286544 kubelet[2556]: E0912 17:14:58.285993 2556 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0d01a44b-0670-4021-80bc-06b979ee8824-kube-api-access-dwfsp podName:0d01a44b-0670-4021-80bc-06b979ee8824 nodeName:}" failed. No retries permitted until 2025-09-12 17:14:58.785963104 +0000 UTC m=+40.448704463 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-dwfsp" (UniqueName: "kubernetes.io/projected/0d01a44b-0670-4021-80bc-06b979ee8824-kube-api-access-dwfsp") pod "calico-apiserver-7df9fb7db-kpktm" (UID: "0d01a44b-0670-4021-80bc-06b979ee8824") : failed to sync configmap cache: timed out waiting for the condition Sep 12 17:14:58.303046 kubelet[2556]: E0912 17:14:58.301270 2556 projected.go:288] Couldn't get configMap calico-apiserver/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Sep 12 17:14:58.303046 kubelet[2556]: E0912 17:14:58.301354 2556 projected.go:194] Error preparing data for projected volume kube-api-access-dnlz6 for pod calico-apiserver/calico-apiserver-7df9fb7db-7p65c: failed to sync configmap cache: timed out waiting for the condition Sep 12 17:14:58.303046 kubelet[2556]: E0912 17:14:58.301454 2556 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8a36e632-808a-47ed-b4e3-b9bcf0459096-kube-api-access-dnlz6 podName:8a36e632-808a-47ed-b4e3-b9bcf0459096 nodeName:}" failed. No retries permitted until 2025-09-12 17:14:58.801419229 +0000 UTC m=+40.464160588 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-dnlz6" (UniqueName: "kubernetes.io/projected/8a36e632-808a-47ed-b4e3-b9bcf0459096-kube-api-access-dnlz6") pod "calico-apiserver-7df9fb7db-7p65c" (UID: "8a36e632-808a-47ed-b4e3-b9bcf0459096") : failed to sync configmap cache: timed out waiting for the condition Sep 12 17:14:58.356164 containerd[1474]: time="2025-09-12T17:14:58.356057395Z" level=error msg="Failed to destroy network for sandbox \"fc4688e0616bdad68c47e9ca8a2cf2f34b3b14bb5c0411451690ef4df39a3ca7\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:14:58.359718 containerd[1474]: time="2025-09-12T17:14:58.359628602Z" level=error msg="encountered an error cleaning up failed sandbox \"fc4688e0616bdad68c47e9ca8a2cf2f34b3b14bb5c0411451690ef4df39a3ca7\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:14:58.359915 containerd[1474]: time="2025-09-12T17:14:58.359765799Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-796fc4c7cd-sr5c6,Uid:5b09b4dd-b380-4298-bca4-9ca9795e60bc,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"fc4688e0616bdad68c47e9ca8a2cf2f34b3b14bb5c0411451690ef4df39a3ca7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:14:58.361050 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-fc4688e0616bdad68c47e9ca8a2cf2f34b3b14bb5c0411451690ef4df39a3ca7-shm.mount: Deactivated successfully. 
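
The "No retries permitted until ... (durationBeforeRetry 500ms)" lines above show kubelet's volume operation backoff: after a MountVolume.SetUp failure, the operation is locked out for an exponentially growing window before the reconciler may try again. A toy model of that pacing is sketched below; the 500ms initial delay matches the log, while the doubling factor and the roughly two-minute cap are assumptions taken from upstream kubelet defaults rather than read out of this system:

package main

import (
	"fmt"
	"time"
)

// backoff mimics the per-operation retry gate in kubelet's volume manager.
type backoff struct {
	last  time.Time     // when the operation last failed
	delay time.Duration // current lockout window
}

func (b *backoff) fail(now time.Time) {
	const (
		initial = 500 * time.Millisecond
		maxWait = 2*time.Minute + 2*time.Second // assumed upstream cap
	)
	switch {
	case b.delay == 0:
		b.delay = initial
	default:
		b.delay *= 2
		if b.delay > maxWait {
			b.delay = maxWait
		}
	}
	b.last = now
}

func (b *backoff) retryAllowed(now time.Time) bool {
	return now.After(b.last.Add(b.delay))
}

func main() {
	var b backoff
	now := time.Now()
	for i := 1; i <= 5; i++ {
		b.fail(now)
		fmt.Printf("failure %d: locked out for %v\n", i, b.delay)
	}
	fmt.Println("retry allowed after 1s?", b.retryAllowed(now.Add(time.Second)))
}

In this log the first retry at +500ms succeeds anyway, because the failures stem from the configmap cache not yet being synced rather than from anything wrong with the volumes themselves.
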
Sep 12 17:14:58.364852 kubelet[2556]: E0912 17:14:58.362217 2556 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"fc4688e0616bdad68c47e9ca8a2cf2f34b3b14bb5c0411451690ef4df39a3ca7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:14:58.364852 kubelet[2556]: E0912 17:14:58.362310 2556 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"fc4688e0616bdad68c47e9ca8a2cf2f34b3b14bb5c0411451690ef4df39a3ca7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-796fc4c7cd-sr5c6" Sep 12 17:14:58.364852 kubelet[2556]: E0912 17:14:58.362343 2556 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"fc4688e0616bdad68c47e9ca8a2cf2f34b3b14bb5c0411451690ef4df39a3ca7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-796fc4c7cd-sr5c6" Sep 12 17:14:58.365152 kubelet[2556]: E0912 17:14:58.362411 2556 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-796fc4c7cd-sr5c6_calico-system(5b09b4dd-b380-4298-bca4-9ca9795e60bc)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-796fc4c7cd-sr5c6_calico-system(5b09b4dd-b380-4298-bca4-9ca9795e60bc)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"fc4688e0616bdad68c47e9ca8a2cf2f34b3b14bb5c0411451690ef4df39a3ca7\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-796fc4c7cd-sr5c6" podUID="5b09b4dd-b380-4298-bca4-9ca9795e60bc" Sep 12 17:14:58.874614 kubelet[2556]: I0912 17:14:58.874553 2556 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fc4688e0616bdad68c47e9ca8a2cf2f34b3b14bb5c0411451690ef4df39a3ca7" Sep 12 17:14:58.884379 kubelet[2556]: I0912 17:14:58.882657 2556 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="13b1e6e1079aeb74408a4c106a5914b00d680c225078fed150adc2cc99d95c7a" Sep 12 17:14:58.889278 containerd[1474]: time="2025-09-12T17:14:58.887509438Z" level=info msg="StopPodSandbox for \"fc4688e0616bdad68c47e9ca8a2cf2f34b3b14bb5c0411451690ef4df39a3ca7\"" Sep 12 17:14:58.889278 containerd[1474]: time="2025-09-12T17:14:58.888305701Z" level=info msg="Ensure that sandbox fc4688e0616bdad68c47e9ca8a2cf2f34b3b14bb5c0411451690ef4df39a3ca7 in task-service has been cleanup successfully" Sep 12 17:14:58.895764 containerd[1474]: time="2025-09-12T17:14:58.895664031Z" level=info msg="StopPodSandbox for \"13b1e6e1079aeb74408a4c106a5914b00d680c225078fed150adc2cc99d95c7a\"" Sep 12 17:14:58.896942 containerd[1474]: time="2025-09-12T17:14:58.896142742Z" level=info msg="Ensure that sandbox 13b1e6e1079aeb74408a4c106a5914b00d680c225078fed150adc2cc99d95c7a in task-service has been cleanup successfully" Sep 12 17:14:58.904686 containerd[1474]: 
time="2025-09-12T17:14:58.904522931Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7df9fb7db-7p65c,Uid:8a36e632-808a-47ed-b4e3-b9bcf0459096,Namespace:calico-apiserver,Attempt:0,}" Sep 12 17:14:58.921656 containerd[1474]: time="2025-09-12T17:14:58.921282629Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-pc7pt,Uid:ecddc12a-94e3-4f27-a97d-480a79c66c1a,Namespace:kube-system,Attempt:0,}" Sep 12 17:14:58.944302 containerd[1474]: time="2025-09-12T17:14:58.943748291Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7df9fb7db-kpktm,Uid:0d01a44b-0670-4021-80bc-06b979ee8824,Namespace:calico-apiserver,Attempt:0,}" Sep 12 17:14:58.958469 containerd[1474]: time="2025-09-12T17:14:58.958220996Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-j49g9,Uid:59f64a22-a697-45df-a931-7e5ec9919a4c,Namespace:kube-system,Attempt:0,}" Sep 12 17:14:58.997773 containerd[1474]: time="2025-09-12T17:14:58.997674031Z" level=error msg="StopPodSandbox for \"fc4688e0616bdad68c47e9ca8a2cf2f34b3b14bb5c0411451690ef4df39a3ca7\" failed" error="failed to destroy network for sandbox \"fc4688e0616bdad68c47e9ca8a2cf2f34b3b14bb5c0411451690ef4df39a3ca7\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:14:58.999020 kubelet[2556]: E0912 17:14:58.998938 2556 log.go:32] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"fc4688e0616bdad68c47e9ca8a2cf2f34b3b14bb5c0411451690ef4df39a3ca7\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="fc4688e0616bdad68c47e9ca8a2cf2f34b3b14bb5c0411451690ef4df39a3ca7" Sep 12 17:14:59.000345 kubelet[2556]: E0912 17:14:58.999048 2556 kuberuntime_manager.go:1546] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"fc4688e0616bdad68c47e9ca8a2cf2f34b3b14bb5c0411451690ef4df39a3ca7"} Sep 12 17:14:59.000345 kubelet[2556]: E0912 17:14:58.999140 2556 kuberuntime_manager.go:1146] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"5b09b4dd-b380-4298-bca4-9ca9795e60bc\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"fc4688e0616bdad68c47e9ca8a2cf2f34b3b14bb5c0411451690ef4df39a3ca7\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Sep 12 17:14:59.000345 kubelet[2556]: E0912 17:14:58.999173 2556 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"5b09b4dd-b380-4298-bca4-9ca9795e60bc\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"fc4688e0616bdad68c47e9ca8a2cf2f34b3b14bb5c0411451690ef4df39a3ca7\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-796fc4c7cd-sr5c6" podUID="5b09b4dd-b380-4298-bca4-9ca9795e60bc" Sep 12 17:14:59.009399 containerd[1474]: time="2025-09-12T17:14:59.009284960Z" level=error msg="StopPodSandbox for 
\"13b1e6e1079aeb74408a4c106a5914b00d680c225078fed150adc2cc99d95c7a\" failed" error="failed to destroy network for sandbox \"13b1e6e1079aeb74408a4c106a5914b00d680c225078fed150adc2cc99d95c7a\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:14:59.010138 kubelet[2556]: E0912 17:14:59.009778 2556 log.go:32] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"13b1e6e1079aeb74408a4c106a5914b00d680c225078fed150adc2cc99d95c7a\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="13b1e6e1079aeb74408a4c106a5914b00d680c225078fed150adc2cc99d95c7a" Sep 12 17:14:59.010138 kubelet[2556]: E0912 17:14:59.009892 2556 kuberuntime_manager.go:1546] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"13b1e6e1079aeb74408a4c106a5914b00d680c225078fed150adc2cc99d95c7a"} Sep 12 17:14:59.010138 kubelet[2556]: E0912 17:14:59.009963 2556 kuberuntime_manager.go:1146] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"3b22a780-bb41-4d2d-9dad-2beb095f9e2c\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"13b1e6e1079aeb74408a4c106a5914b00d680c225078fed150adc2cc99d95c7a\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Sep 12 17:14:59.010138 kubelet[2556]: E0912 17:14:59.009999 2556 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"3b22a780-bb41-4d2d-9dad-2beb095f9e2c\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"13b1e6e1079aeb74408a4c106a5914b00d680c225078fed150adc2cc99d95c7a\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-54d579b49d-fnsjh" podUID="3b22a780-bb41-4d2d-9dad-2beb095f9e2c" Sep 12 17:14:59.109912 containerd[1474]: time="2025-09-12T17:14:59.109668497Z" level=error msg="Failed to destroy network for sandbox \"2667df64706e8d1b6bd4eb93f79e0d8cd65a3f485ac3cd76b9174af9b4f5592a\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:14:59.112998 containerd[1474]: time="2025-09-12T17:14:59.112685197Z" level=error msg="encountered an error cleaning up failed sandbox \"2667df64706e8d1b6bd4eb93f79e0d8cd65a3f485ac3cd76b9174af9b4f5592a\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:14:59.113967 containerd[1474]: time="2025-09-12T17:14:59.113670778Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7df9fb7db-7p65c,Uid:8a36e632-808a-47ed-b4e3-b9bcf0459096,Namespace:calico-apiserver,Attempt:0,} failed, error" error="failed to setup network for sandbox \"2667df64706e8d1b6bd4eb93f79e0d8cd65a3f485ac3cd76b9174af9b4f5592a\": plugin 
type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:14:59.117657 kubelet[2556]: E0912 17:14:59.115500 2556 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2667df64706e8d1b6bd4eb93f79e0d8cd65a3f485ac3cd76b9174af9b4f5592a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:14:59.117657 kubelet[2556]: E0912 17:14:59.115667 2556 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2667df64706e8d1b6bd4eb93f79e0d8cd65a3f485ac3cd76b9174af9b4f5592a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7df9fb7db-7p65c" Sep 12 17:14:59.117657 kubelet[2556]: E0912 17:14:59.115706 2556 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2667df64706e8d1b6bd4eb93f79e0d8cd65a3f485ac3cd76b9174af9b4f5592a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7df9fb7db-7p65c" Sep 12 17:14:59.118457 kubelet[2556]: E0912 17:14:59.115792 2556 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-7df9fb7db-7p65c_calico-apiserver(8a36e632-808a-47ed-b4e3-b9bcf0459096)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-7df9fb7db-7p65c_calico-apiserver(8a36e632-808a-47ed-b4e3-b9bcf0459096)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"2667df64706e8d1b6bd4eb93f79e0d8cd65a3f485ac3cd76b9174af9b4f5592a\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-7df9fb7db-7p65c" podUID="8a36e632-808a-47ed-b4e3-b9bcf0459096" Sep 12 17:14:59.232154 containerd[1474]: time="2025-09-12T17:14:59.229736766Z" level=error msg="Failed to destroy network for sandbox \"de20c281e9be8ceb5ac227d9407ee1c1a9f9de1e010ed5f1af1dde4010b0c2d7\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:14:59.232154 containerd[1474]: time="2025-09-12T17:14:59.231646168Z" level=error msg="encountered an error cleaning up failed sandbox \"de20c281e9be8ceb5ac227d9407ee1c1a9f9de1e010ed5f1af1dde4010b0c2d7\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:14:59.232154 containerd[1474]: time="2025-09-12T17:14:59.231800805Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7df9fb7db-kpktm,Uid:0d01a44b-0670-4021-80bc-06b979ee8824,Namespace:calico-apiserver,Attempt:0,} failed, 
error" error="failed to setup network for sandbox \"de20c281e9be8ceb5ac227d9407ee1c1a9f9de1e010ed5f1af1dde4010b0c2d7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:14:59.235457 kubelet[2556]: E0912 17:14:59.232431 2556 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"de20c281e9be8ceb5ac227d9407ee1c1a9f9de1e010ed5f1af1dde4010b0c2d7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:14:59.235457 kubelet[2556]: E0912 17:14:59.232577 2556 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"de20c281e9be8ceb5ac227d9407ee1c1a9f9de1e010ed5f1af1dde4010b0c2d7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7df9fb7db-kpktm" Sep 12 17:14:59.235457 kubelet[2556]: E0912 17:14:59.232622 2556 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"de20c281e9be8ceb5ac227d9407ee1c1a9f9de1e010ed5f1af1dde4010b0c2d7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7df9fb7db-kpktm" Sep 12 17:14:59.235729 kubelet[2556]: E0912 17:14:59.232688 2556 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-7df9fb7db-kpktm_calico-apiserver(0d01a44b-0670-4021-80bc-06b979ee8824)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-7df9fb7db-kpktm_calico-apiserver(0d01a44b-0670-4021-80bc-06b979ee8824)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"de20c281e9be8ceb5ac227d9407ee1c1a9f9de1e010ed5f1af1dde4010b0c2d7\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-7df9fb7db-kpktm" podUID="0d01a44b-0670-4021-80bc-06b979ee8824" Sep 12 17:14:59.241631 containerd[1474]: time="2025-09-12T17:14:59.237221858Z" level=error msg="Failed to destroy network for sandbox \"22e3cdc6e9d2964a355f2abb708065e2ade653e106e6d210166694bab564d208\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:14:59.246163 containerd[1474]: time="2025-09-12T17:14:59.244817268Z" level=error msg="encountered an error cleaning up failed sandbox \"22e3cdc6e9d2964a355f2abb708065e2ade653e106e6d210166694bab564d208\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:14:59.246163 containerd[1474]: time="2025-09-12T17:14:59.244968985Z" level=error msg="RunPodSandbox for 
&PodSandboxMetadata{Name:coredns-668d6bf9bc-pc7pt,Uid:ecddc12a-94e3-4f27-a97d-480a79c66c1a,Namespace:kube-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"22e3cdc6e9d2964a355f2abb708065e2ade653e106e6d210166694bab564d208\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:14:59.250680 kubelet[2556]: E0912 17:14:59.247655 2556 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"22e3cdc6e9d2964a355f2abb708065e2ade653e106e6d210166694bab564d208\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:14:59.250680 kubelet[2556]: E0912 17:14:59.247762 2556 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"22e3cdc6e9d2964a355f2abb708065e2ade653e106e6d210166694bab564d208\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-pc7pt" Sep 12 17:14:59.250680 kubelet[2556]: E0912 17:14:59.247788 2556 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"22e3cdc6e9d2964a355f2abb708065e2ade653e106e6d210166694bab564d208\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-pc7pt" Sep 12 17:14:59.250984 kubelet[2556]: E0912 17:14:59.247867 2556 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-668d6bf9bc-pc7pt_kube-system(ecddc12a-94e3-4f27-a97d-480a79c66c1a)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-668d6bf9bc-pc7pt_kube-system(ecddc12a-94e3-4f27-a97d-480a79c66c1a)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"22e3cdc6e9d2964a355f2abb708065e2ade653e106e6d210166694bab564d208\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-pc7pt" podUID="ecddc12a-94e3-4f27-a97d-480a79c66c1a" Sep 12 17:14:59.253810 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-22e3cdc6e9d2964a355f2abb708065e2ade653e106e6d210166694bab564d208-shm.mount: Deactivated successfully. 
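
Every RunPodSandbox and StopPodSandbox in this stretch dies on the same stat: the Calico CNI plugin resolves which node it is running on from /var/lib/calico/nodename, a host path that the calico/node container populates once it starts, which is exactly what the error text suggests checking. calico-node is still being pulled at this point in the log, so both network setup (add) and teardown (delete) fail for every pod that needs the pod network. A minimal sketch of that guard in Go, assuming only what the message shows; the function name is ours:

    package main

    import (
        "errors"
        "fmt"
        "os"
        "strings"
    )

    // The path and error text mirror the log; calico/node writes this file
    // at startup, and the CNI plugin cannot proceed without it.
    const nodenameFile = "/var/lib/calico/nodename"

    func nodenameFromFile() (string, error) {
        data, err := os.ReadFile(nodenameFile)
        if errors.Is(err, os.ErrNotExist) {
            return "", fmt.Errorf("stat %s: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/", nodenameFile)
        }
        if err != nil {
            return "", err
        }
        return strings.TrimSpace(string(data)), nil
    }

    func main() {
        name, err := nodenameFromFile()
        if err != nil {
            fmt.Println("every CNI ADD/DEL fails here:", err)
            return
        }
        fmt.Println("node:", name)
    }
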
Sep 12 17:14:59.280403 containerd[1474]: time="2025-09-12T17:14:59.280255368Z" level=error msg="Failed to destroy network for sandbox \"dfd10de400f5da19bed4ce9b6c95b063f73831955d4dfd785ae24c739e1e6a3b\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:14:59.282683 containerd[1474]: time="2025-09-12T17:14:59.282567842Z" level=error msg="encountered an error cleaning up failed sandbox \"dfd10de400f5da19bed4ce9b6c95b063f73831955d4dfd785ae24c739e1e6a3b\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:14:59.282908 containerd[1474]: time="2025-09-12T17:14:59.282714079Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-j49g9,Uid:59f64a22-a697-45df-a931-7e5ec9919a4c,Namespace:kube-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"dfd10de400f5da19bed4ce9b6c95b063f73831955d4dfd785ae24c739e1e6a3b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:14:59.284361 kubelet[2556]: E0912 17:14:59.284276 2556 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"dfd10de400f5da19bed4ce9b6c95b063f73831955d4dfd785ae24c739e1e6a3b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:14:59.284711 kubelet[2556]: E0912 17:14:59.284387 2556 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"dfd10de400f5da19bed4ce9b6c95b063f73831955d4dfd785ae24c739e1e6a3b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-j49g9" Sep 12 17:14:59.284711 kubelet[2556]: E0912 17:14:59.284417 2556 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"dfd10de400f5da19bed4ce9b6c95b063f73831955d4dfd785ae24c739e1e6a3b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-j49g9" Sep 12 17:14:59.284711 kubelet[2556]: E0912 17:14:59.284491 2556 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-668d6bf9bc-j49g9_kube-system(59f64a22-a697-45df-a931-7e5ec9919a4c)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-668d6bf9bc-j49g9_kube-system(59f64a22-a697-45df-a931-7e5ec9919a4c)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"dfd10de400f5da19bed4ce9b6c95b063f73831955d4dfd785ae24c739e1e6a3b\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-j49g9" 
podUID="59f64a22-a697-45df-a931-7e5ec9919a4c" Sep 12 17:14:59.287795 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-dfd10de400f5da19bed4ce9b6c95b063f73831955d4dfd785ae24c739e1e6a3b-shm.mount: Deactivated successfully. Sep 12 17:14:59.889841 kubelet[2556]: I0912 17:14:59.889772 2556 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dfd10de400f5da19bed4ce9b6c95b063f73831955d4dfd785ae24c739e1e6a3b" Sep 12 17:14:59.893038 containerd[1474]: time="2025-09-12T17:14:59.892963707Z" level=info msg="StopPodSandbox for \"dfd10de400f5da19bed4ce9b6c95b063f73831955d4dfd785ae24c739e1e6a3b\"" Sep 12 17:14:59.893618 containerd[1474]: time="2025-09-12T17:14:59.893325099Z" level=info msg="Ensure that sandbox dfd10de400f5da19bed4ce9b6c95b063f73831955d4dfd785ae24c739e1e6a3b in task-service has been cleanup successfully" Sep 12 17:14:59.900121 kubelet[2556]: I0912 17:14:59.899327 2556 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="22e3cdc6e9d2964a355f2abb708065e2ade653e106e6d210166694bab564d208" Sep 12 17:14:59.901572 containerd[1474]: time="2025-09-12T17:14:59.900699394Z" level=info msg="StopPodSandbox for \"22e3cdc6e9d2964a355f2abb708065e2ade653e106e6d210166694bab564d208\"" Sep 12 17:14:59.902518 containerd[1474]: time="2025-09-12T17:14:59.902380121Z" level=info msg="Ensure that sandbox 22e3cdc6e9d2964a355f2abb708065e2ade653e106e6d210166694bab564d208 in task-service has been cleanup successfully" Sep 12 17:14:59.907814 kubelet[2556]: I0912 17:14:59.907655 2556 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2667df64706e8d1b6bd4eb93f79e0d8cd65a3f485ac3cd76b9174af9b4f5592a" Sep 12 17:14:59.916704 containerd[1474]: time="2025-09-12T17:14:59.916630399Z" level=info msg="StopPodSandbox for \"2667df64706e8d1b6bd4eb93f79e0d8cd65a3f485ac3cd76b9174af9b4f5592a\"" Sep 12 17:14:59.916962 containerd[1474]: time="2025-09-12T17:14:59.916930273Z" level=info msg="Ensure that sandbox 2667df64706e8d1b6bd4eb93f79e0d8cd65a3f485ac3cd76b9174af9b4f5592a in task-service has been cleanup successfully" Sep 12 17:14:59.923018 kubelet[2556]: I0912 17:14:59.921333 2556 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="de20c281e9be8ceb5ac227d9407ee1c1a9f9de1e010ed5f1af1dde4010b0c2d7" Sep 12 17:14:59.925758 containerd[1474]: time="2025-09-12T17:14:59.925674221Z" level=info msg="StopPodSandbox for \"de20c281e9be8ceb5ac227d9407ee1c1a9f9de1e010ed5f1af1dde4010b0c2d7\"" Sep 12 17:14:59.930766 containerd[1474]: time="2025-09-12T17:14:59.930603283Z" level=info msg="Ensure that sandbox de20c281e9be8ceb5ac227d9407ee1c1a9f9de1e010ed5f1af1dde4010b0c2d7 in task-service has been cleanup successfully" Sep 12 17:15:00.041324 containerd[1474]: time="2025-09-12T17:15:00.041241842Z" level=error msg="StopPodSandbox for \"2667df64706e8d1b6bd4eb93f79e0d8cd65a3f485ac3cd76b9174af9b4f5592a\" failed" error="failed to destroy network for sandbox \"2667df64706e8d1b6bd4eb93f79e0d8cd65a3f485ac3cd76b9174af9b4f5592a\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:15:00.042346 kubelet[2556]: E0912 17:15:00.041972 2556 log.go:32] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"2667df64706e8d1b6bd4eb93f79e0d8cd65a3f485ac3cd76b9174af9b4f5592a\": plugin type=\"calico\" failed (delete): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2667df64706e8d1b6bd4eb93f79e0d8cd65a3f485ac3cd76b9174af9b4f5592a" Sep 12 17:15:00.042346 kubelet[2556]: E0912 17:15:00.042176 2556 kuberuntime_manager.go:1546] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"2667df64706e8d1b6bd4eb93f79e0d8cd65a3f485ac3cd76b9174af9b4f5592a"} Sep 12 17:15:00.042346 kubelet[2556]: E0912 17:15:00.042243 2556 kuberuntime_manager.go:1146] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"8a36e632-808a-47ed-b4e3-b9bcf0459096\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2667df64706e8d1b6bd4eb93f79e0d8cd65a3f485ac3cd76b9174af9b4f5592a\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Sep 12 17:15:00.042346 kubelet[2556]: E0912 17:15:00.042285 2556 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"8a36e632-808a-47ed-b4e3-b9bcf0459096\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2667df64706e8d1b6bd4eb93f79e0d8cd65a3f485ac3cd76b9174af9b4f5592a\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-7df9fb7db-7p65c" podUID="8a36e632-808a-47ed-b4e3-b9bcf0459096" Sep 12 17:15:00.045307 containerd[1474]: time="2025-09-12T17:15:00.045227006Z" level=error msg="StopPodSandbox for \"22e3cdc6e9d2964a355f2abb708065e2ade653e106e6d210166694bab564d208\" failed" error="failed to destroy network for sandbox \"22e3cdc6e9d2964a355f2abb708065e2ade653e106e6d210166694bab564d208\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:15:00.046221 kubelet[2556]: E0912 17:15:00.045937 2556 log.go:32] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"22e3cdc6e9d2964a355f2abb708065e2ade653e106e6d210166694bab564d208\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="22e3cdc6e9d2964a355f2abb708065e2ade653e106e6d210166694bab564d208" Sep 12 17:15:00.046221 kubelet[2556]: E0912 17:15:00.046035 2556 kuberuntime_manager.go:1546] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"22e3cdc6e9d2964a355f2abb708065e2ade653e106e6d210166694bab564d208"} Sep 12 17:15:00.046221 kubelet[2556]: E0912 17:15:00.046135 2556 kuberuntime_manager.go:1146] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"ecddc12a-94e3-4f27-a97d-480a79c66c1a\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"22e3cdc6e9d2964a355f2abb708065e2ade653e106e6d210166694bab564d208\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Sep 12 17:15:00.046221 kubelet[2556]: E0912 17:15:00.046169 2556 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"ecddc12a-94e3-4f27-a97d-480a79c66c1a\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"22e3cdc6e9d2964a355f2abb708065e2ade653e106e6d210166694bab564d208\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-pc7pt" podUID="ecddc12a-94e3-4f27-a97d-480a79c66c1a" Sep 12 17:15:00.056410 containerd[1474]: time="2025-09-12T17:15:00.055485049Z" level=error msg="StopPodSandbox for \"dfd10de400f5da19bed4ce9b6c95b063f73831955d4dfd785ae24c739e1e6a3b\" failed" error="failed to destroy network for sandbox \"dfd10de400f5da19bed4ce9b6c95b063f73831955d4dfd785ae24c739e1e6a3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:15:00.056658 kubelet[2556]: E0912 17:15:00.055913 2556 log.go:32] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"dfd10de400f5da19bed4ce9b6c95b063f73831955d4dfd785ae24c739e1e6a3b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="dfd10de400f5da19bed4ce9b6c95b063f73831955d4dfd785ae24c739e1e6a3b" Sep 12 17:15:00.056658 kubelet[2556]: E0912 17:15:00.056006 2556 kuberuntime_manager.go:1546] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"dfd10de400f5da19bed4ce9b6c95b063f73831955d4dfd785ae24c739e1e6a3b"} Sep 12 17:15:00.057112 kubelet[2556]: E0912 17:15:00.056874 2556 kuberuntime_manager.go:1146] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"59f64a22-a697-45df-a931-7e5ec9919a4c\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"dfd10de400f5da19bed4ce9b6c95b063f73831955d4dfd785ae24c739e1e6a3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Sep 12 17:15:00.057112 kubelet[2556]: E0912 17:15:00.056934 2556 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"59f64a22-a697-45df-a931-7e5ec9919a4c\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"dfd10de400f5da19bed4ce9b6c95b063f73831955d4dfd785ae24c739e1e6a3b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-j49g9" podUID="59f64a22-a697-45df-a931-7e5ec9919a4c" Sep 12 17:15:00.057808 containerd[1474]: time="2025-09-12T17:15:00.057743886Z" level=error msg="StopPodSandbox for \"de20c281e9be8ceb5ac227d9407ee1c1a9f9de1e010ed5f1af1dde4010b0c2d7\" failed" error="failed to destroy network for sandbox \"de20c281e9be8ceb5ac227d9407ee1c1a9f9de1e010ed5f1af1dde4010b0c2d7\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:15:00.058994 kubelet[2556]: E0912 17:15:00.058650 2556 
log.go:32] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"de20c281e9be8ceb5ac227d9407ee1c1a9f9de1e010ed5f1af1dde4010b0c2d7\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="de20c281e9be8ceb5ac227d9407ee1c1a9f9de1e010ed5f1af1dde4010b0c2d7" Sep 12 17:15:00.058994 kubelet[2556]: E0912 17:15:00.058743 2556 kuberuntime_manager.go:1546] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"de20c281e9be8ceb5ac227d9407ee1c1a9f9de1e010ed5f1af1dde4010b0c2d7"} Sep 12 17:15:00.058994 kubelet[2556]: E0912 17:15:00.058792 2556 kuberuntime_manager.go:1146] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"0d01a44b-0670-4021-80bc-06b979ee8824\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"de20c281e9be8ceb5ac227d9407ee1c1a9f9de1e010ed5f1af1dde4010b0c2d7\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Sep 12 17:15:00.058994 kubelet[2556]: E0912 17:15:00.058819 2556 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"0d01a44b-0670-4021-80bc-06b979ee8824\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"de20c281e9be8ceb5ac227d9407ee1c1a9f9de1e010ed5f1af1dde4010b0c2d7\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-7df9fb7db-kpktm" podUID="0d01a44b-0670-4021-80bc-06b979ee8824" Sep 12 17:15:02.511751 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3249416805.mount: Deactivated successfully. 
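
One reading aid for the stretch above: a single root cause surfaces four or five times per pod because each layer wraps the error from the layer below and logs it again. The CNI plugin's stat failure becomes containerd's "failed to destroy network for sandbox", crosses the CRI boundary as "rpc error: code = Unknown", and is wrapped by kubelet into KillPodSandboxError and finally "Error syncing pod, skipping"; the deepening backslash escaping (\" becoming \\\") is an artifact of that wrapping, not corruption. A sketch of the chain with Go's standard error wrapping, messages abbreviated:

    package main

    import (
        "errors"
        "fmt"
    )

    // Each layer wraps the one below with %w, so the same root cause is
    // printed once per layer but remains detectable with errors.Is.
    func main() {
        root := errors.New("stat /var/lib/calico/nodename: no such file or directory")
        cni := fmt.Errorf("plugin type=%q failed (delete): %w", "calico", root)
        cri := fmt.Errorf("rpc error: code = Unknown desc = failed to destroy network for sandbox: %w", cni)
        kubelet := fmt.Errorf("failed to %q with KillPodSandboxError: %w", "KillPodSandbox", cri)

        fmt.Println(kubelet)
        fmt.Println("root cause still detectable:", errors.Is(kubelet, root))
    }
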
Sep 12 17:15:02.548032 containerd[1474]: time="2025-09-12T17:15:02.547864445Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:15:02.550244 containerd[1474]: time="2025-09-12T17:15:02.550155724Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.30.3: active requests=0, bytes read=151100457" Sep 12 17:15:02.551346 containerd[1474]: time="2025-09-12T17:15:02.551274944Z" level=info msg="ImageCreate event name:\"sha256:2b8abd2140fc4464ed664d225fe38e5b90bbfcf62996b484b0fc0e0537b6a4a9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:15:02.555742 containerd[1474]: time="2025-09-12T17:15:02.555638745Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:bcb8146fcaeced1e1c88fad3eaa697f1680746bd23c3e7e8d4535bc484c6f2a1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:15:02.557032 containerd[1474]: time="2025-09-12T17:15:02.556735605Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.30.3\" with image id \"sha256:2b8abd2140fc4464ed664d225fe38e5b90bbfcf62996b484b0fc0e0537b6a4a9\", repo tag \"ghcr.io/flatcar/calico/node:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/node@sha256:bcb8146fcaeced1e1c88fad3eaa697f1680746bd23c3e7e8d4535bc484c6f2a1\", size \"151100319\" in 4.736226965s" Sep 12 17:15:02.557032 containerd[1474]: time="2025-09-12T17:15:02.556805804Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.3\" returns image reference \"sha256:2b8abd2140fc4464ed664d225fe38e5b90bbfcf62996b484b0fc0e0537b6a4a9\"" Sep 12 17:15:02.586283 containerd[1474]: time="2025-09-12T17:15:02.586205473Z" level=info msg="CreateContainer within sandbox \"4095577192810c9a3b69e806108a255a713b810f3464738ac2c9862d4a3c1de5\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Sep 12 17:15:02.619846 containerd[1474]: time="2025-09-12T17:15:02.619741067Z" level=info msg="CreateContainer within sandbox \"4095577192810c9a3b69e806108a255a713b810f3464738ac2c9862d4a3c1de5\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"8ce444d94d3e2a12112c11e86c3d8794da59c25f3e34b0ca19e1fa2a35a38549\"" Sep 12 17:15:02.622053 containerd[1474]: time="2025-09-12T17:15:02.621916188Z" level=info msg="StartContainer for \"8ce444d94d3e2a12112c11e86c3d8794da59c25f3e34b0ca19e1fa2a35a38549\"" Sep 12 17:15:02.661496 systemd[1]: Started cri-containerd-8ce444d94d3e2a12112c11e86c3d8794da59c25f3e34b0ca19e1fa2a35a38549.scope - libcontainer container 8ce444d94d3e2a12112c11e86c3d8794da59c25f3e34b0ca19e1fa2a35a38549. Sep 12 17:15:02.720575 containerd[1474]: time="2025-09-12T17:15:02.720492888Z" level=info msg="StartContainer for \"8ce444d94d3e2a12112c11e86c3d8794da59c25f3e34b0ca19e1fa2a35a38549\" returns successfully" Sep 12 17:15:02.900742 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Sep 12 17:15:02.900952 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld <Jason@zx2c4.com>. All Rights Reserved.
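
This is the turning point of the log: the calico/node image finishes pulling and the container starts. The WireGuard module load that immediately follows is consistent with calico-node probing kernel support for its optional WireGuard encryption at startup, though that is an inference from ordering, not something the log states. As a quick sanity check on the pull line itself, the reported size and duration work out to roughly 30 MiB/s:

    package main

    import (
        "fmt"
        "time"
    )

    // Back-of-envelope check on the "Pulled image ... size 151100319 in
    // 4.736226965s" entry above.
    func main() {
        const imageBytes = 151100319
        elapsed := 4736226965 * time.Nanosecond
        mib := float64(imageBytes) / (1 << 20)
        fmt.Printf("%.1f MiB in %v -> %.1f MiB/s\n", mib, elapsed, mib/elapsed.Seconds())
    }
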
Sep 12 17:15:03.147150 kubelet[2556]: I0912 17:15:03.145527 2556 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-ftnhk" podStartSLOduration=1.5749541649999999 podStartE2EDuration="16.145493006s" podCreationTimestamp="2025-09-12 17:14:47 +0000 UTC" firstStartedPulling="2025-09-12 17:14:47.987998371 +0000 UTC m=+29.650739730" lastFinishedPulling="2025-09-12 17:15:02.558537212 +0000 UTC m=+44.221278571" observedRunningTime="2025-09-12 17:15:02.978841504 +0000 UTC m=+44.641582903" watchObservedRunningTime="2025-09-12 17:15:03.145493006 +0000 UTC m=+44.808234325" Sep 12 17:15:03.151904 containerd[1474]: time="2025-09-12T17:15:03.149841050Z" level=info msg="StopPodSandbox for \"fc4688e0616bdad68c47e9ca8a2cf2f34b3b14bb5c0411451690ef4df39a3ca7\"" Sep 12 17:15:03.488589 containerd[1474]: 2025-09-12 17:15:03.367 [INFO][3766] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="fc4688e0616bdad68c47e9ca8a2cf2f34b3b14bb5c0411451690ef4df39a3ca7" Sep 12 17:15:03.488589 containerd[1474]: 2025-09-12 17:15:03.368 [INFO][3766] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="fc4688e0616bdad68c47e9ca8a2cf2f34b3b14bb5c0411451690ef4df39a3ca7" iface="eth0" netns="/var/run/netns/cni-71b49d40-80e2-8468-bec8-4023451250d3" Sep 12 17:15:03.488589 containerd[1474]: 2025-09-12 17:15:03.368 [INFO][3766] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="fc4688e0616bdad68c47e9ca8a2cf2f34b3b14bb5c0411451690ef4df39a3ca7" iface="eth0" netns="/var/run/netns/cni-71b49d40-80e2-8468-bec8-4023451250d3" Sep 12 17:15:03.488589 containerd[1474]: 2025-09-12 17:15:03.369 [INFO][3766] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. ContainerID="fc4688e0616bdad68c47e9ca8a2cf2f34b3b14bb5c0411451690ef4df39a3ca7" iface="eth0" netns="/var/run/netns/cni-71b49d40-80e2-8468-bec8-4023451250d3" Sep 12 17:15:03.488589 containerd[1474]: 2025-09-12 17:15:03.369 [INFO][3766] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="fc4688e0616bdad68c47e9ca8a2cf2f34b3b14bb5c0411451690ef4df39a3ca7" Sep 12 17:15:03.488589 containerd[1474]: 2025-09-12 17:15:03.369 [INFO][3766] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="fc4688e0616bdad68c47e9ca8a2cf2f34b3b14bb5c0411451690ef4df39a3ca7" Sep 12 17:15:03.488589 containerd[1474]: 2025-09-12 17:15:03.455 [INFO][3775] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="fc4688e0616bdad68c47e9ca8a2cf2f34b3b14bb5c0411451690ef4df39a3ca7" HandleID="k8s-pod-network.fc4688e0616bdad68c47e9ca8a2cf2f34b3b14bb5c0411451690ef4df39a3ca7" Workload="ci--4081--3--6--2--0999f1dc3d-k8s-whisker--796fc4c7cd--sr5c6-eth0" Sep 12 17:15:03.488589 containerd[1474]: 2025-09-12 17:15:03.455 [INFO][3775] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 17:15:03.488589 containerd[1474]: 2025-09-12 17:15:03.455 [INFO][3775] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 12 17:15:03.488589 containerd[1474]: 2025-09-12 17:15:03.475 [WARNING][3775] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="fc4688e0616bdad68c47e9ca8a2cf2f34b3b14bb5c0411451690ef4df39a3ca7" HandleID="k8s-pod-network.fc4688e0616bdad68c47e9ca8a2cf2f34b3b14bb5c0411451690ef4df39a3ca7" Workload="ci--4081--3--6--2--0999f1dc3d-k8s-whisker--796fc4c7cd--sr5c6-eth0" Sep 12 17:15:03.488589 containerd[1474]: 2025-09-12 17:15:03.476 [INFO][3775] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="fc4688e0616bdad68c47e9ca8a2cf2f34b3b14bb5c0411451690ef4df39a3ca7" HandleID="k8s-pod-network.fc4688e0616bdad68c47e9ca8a2cf2f34b3b14bb5c0411451690ef4df39a3ca7" Workload="ci--4081--3--6--2--0999f1dc3d-k8s-whisker--796fc4c7cd--sr5c6-eth0" Sep 12 17:15:03.488589 containerd[1474]: 2025-09-12 17:15:03.481 [INFO][3775] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 17:15:03.488589 containerd[1474]: 2025-09-12 17:15:03.484 [INFO][3766] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="fc4688e0616bdad68c47e9ca8a2cf2f34b3b14bb5c0411451690ef4df39a3ca7" Sep 12 17:15:03.491183 containerd[1474]: time="2025-09-12T17:15:03.488714219Z" level=info msg="TearDown network for sandbox \"fc4688e0616bdad68c47e9ca8a2cf2f34b3b14bb5c0411451690ef4df39a3ca7\" successfully" Sep 12 17:15:03.491183 containerd[1474]: time="2025-09-12T17:15:03.488756738Z" level=info msg="StopPodSandbox for \"fc4688e0616bdad68c47e9ca8a2cf2f34b3b14bb5c0411451690ef4df39a3ca7\" returns successfully" Sep 12 17:15:03.512815 systemd[1]: run-netns-cni\x2d71b49d40\x2d80e2\x2d8468\x2dbec8\x2d4023451250d3.mount: Deactivated successfully. Sep 12 17:15:03.588535 kubelet[2556]: I0912 17:15:03.586897 2556 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5b09b4dd-b380-4298-bca4-9ca9795e60bc-whisker-ca-bundle\") pod \"5b09b4dd-b380-4298-bca4-9ca9795e60bc\" (UID: \"5b09b4dd-b380-4298-bca4-9ca9795e60bc\") " Sep 12 17:15:03.588535 kubelet[2556]: I0912 17:15:03.587129 2556 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/5b09b4dd-b380-4298-bca4-9ca9795e60bc-whisker-backend-key-pair\") pod \"5b09b4dd-b380-4298-bca4-9ca9795e60bc\" (UID: \"5b09b4dd-b380-4298-bca4-9ca9795e60bc\") " Sep 12 17:15:03.588535 kubelet[2556]: I0912 17:15:03.587222 2556 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p2jl8\" (UniqueName: \"kubernetes.io/projected/5b09b4dd-b380-4298-bca4-9ca9795e60bc-kube-api-access-p2jl8\") pod \"5b09b4dd-b380-4298-bca4-9ca9795e60bc\" (UID: \"5b09b4dd-b380-4298-bca4-9ca9795e60bc\") " Sep 12 17:15:03.588535 kubelet[2556]: I0912 17:15:03.587701 2556 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b09b4dd-b380-4298-bca4-9ca9795e60bc-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "5b09b4dd-b380-4298-bca4-9ca9795e60bc" (UID: "5b09b4dd-b380-4298-bca4-9ca9795e60bc"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Sep 12 17:15:03.601111 kubelet[2556]: I0912 17:15:03.599732 2556 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b09b4dd-b380-4298-bca4-9ca9795e60bc-kube-api-access-p2jl8" (OuterVolumeSpecName: "kube-api-access-p2jl8") pod "5b09b4dd-b380-4298-bca4-9ca9795e60bc" (UID: "5b09b4dd-b380-4298-bca4-9ca9795e60bc"). InnerVolumeSpecName "kube-api-access-p2jl8". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Sep 12 17:15:03.601722 kubelet[2556]: I0912 17:15:03.601632 2556 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b09b4dd-b380-4298-bca4-9ca9795e60bc-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "5b09b4dd-b380-4298-bca4-9ca9795e60bc" (UID: "5b09b4dd-b380-4298-bca4-9ca9795e60bc"). InnerVolumeSpecName "whisker-backend-key-pair". PluginName "kubernetes.io/secret", VolumeGIDValue "" Sep 12 17:15:03.602611 systemd[1]: var-lib-kubelet-pods-5b09b4dd\x2db380\x2d4298\x2dbca4\x2d9ca9795e60bc-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dp2jl8.mount: Deactivated successfully. Sep 12 17:15:03.602769 systemd[1]: var-lib-kubelet-pods-5b09b4dd\x2db380\x2d4298\x2dbca4\x2d9ca9795e60bc-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. Sep 12 17:15:03.688505 kubelet[2556]: I0912 17:15:03.688432 2556 reconciler_common.go:299] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5b09b4dd-b380-4298-bca4-9ca9795e60bc-whisker-ca-bundle\") on node \"ci-4081-3-6-2-0999f1dc3d\" DevicePath \"\"" Sep 12 17:15:03.688505 kubelet[2556]: I0912 17:15:03.688489 2556 reconciler_common.go:299] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/5b09b4dd-b380-4298-bca4-9ca9795e60bc-whisker-backend-key-pair\") on node \"ci-4081-3-6-2-0999f1dc3d\" DevicePath \"\"" Sep 12 17:15:03.688505 kubelet[2556]: I0912 17:15:03.688509 2556 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-p2jl8\" (UniqueName: \"kubernetes.io/projected/5b09b4dd-b380-4298-bca4-9ca9795e60bc-kube-api-access-p2jl8\") on node \"ci-4081-3-6-2-0999f1dc3d\" DevicePath \"\"" Sep 12 17:15:03.948024 kubelet[2556]: I0912 17:15:03.947316 2556 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 12 17:15:03.954494 systemd[1]: Removed slice kubepods-besteffort-pod5b09b4dd_b380_4298_bca4_9ca9795e60bc.slice - libcontainer container kubepods-besteffort-pod5b09b4dd_b380_4298_bca4_9ca9795e60bc.slice. Sep 12 17:15:04.058248 systemd[1]: Created slice kubepods-besteffort-podb7169a5d_9351_4a54_9654_bf00214c5b5a.slice - libcontainer container kubepods-besteffort-podb7169a5d_9351_4a54_9654_bf00214c5b5a.slice. 
Sep 12 17:15:04.093397 kubelet[2556]: I0912 17:15:04.091694 2556 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b7169a5d-9351-4a54-9654-bf00214c5b5a-whisker-ca-bundle\") pod \"whisker-78fb8c774f-mnjlm\" (UID: \"b7169a5d-9351-4a54-9654-bf00214c5b5a\") " pod="calico-system/whisker-78fb8c774f-mnjlm" Sep 12 17:15:04.093397 kubelet[2556]: I0912 17:15:04.091784 2556 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mqxmr\" (UniqueName: \"kubernetes.io/projected/b7169a5d-9351-4a54-9654-bf00214c5b5a-kube-api-access-mqxmr\") pod \"whisker-78fb8c774f-mnjlm\" (UID: \"b7169a5d-9351-4a54-9654-bf00214c5b5a\") " pod="calico-system/whisker-78fb8c774f-mnjlm" Sep 12 17:15:04.093397 kubelet[2556]: I0912 17:15:04.091831 2556 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/b7169a5d-9351-4a54-9654-bf00214c5b5a-whisker-backend-key-pair\") pod \"whisker-78fb8c774f-mnjlm\" (UID: \"b7169a5d-9351-4a54-9654-bf00214c5b5a\") " pod="calico-system/whisker-78fb8c774f-mnjlm" Sep 12 17:15:04.368519 containerd[1474]: time="2025-09-12T17:15:04.368157747Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-78fb8c774f-mnjlm,Uid:b7169a5d-9351-4a54-9654-bf00214c5b5a,Namespace:calico-system,Attempt:0,}" Sep 12 17:15:04.571685 systemd-networkd[1374]: caliaef140e81b1: Link UP Sep 12 17:15:04.573534 systemd-networkd[1374]: caliaef140e81b1: Gained carrier Sep 12 17:15:04.587114 kubelet[2556]: I0912 17:15:04.587006 2556 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b09b4dd-b380-4298-bca4-9ca9795e60bc" path="/var/lib/kubelet/pods/5b09b4dd-b380-4298-bca4-9ca9795e60bc/volumes" Sep 12 17:15:04.611166 containerd[1474]: 2025-09-12 17:15:04.421 [INFO][3796] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 12 17:15:04.611166 containerd[1474]: 2025-09-12 17:15:04.444 [INFO][3796] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4081--3--6--2--0999f1dc3d-k8s-whisker--78fb8c774f--mnjlm-eth0 whisker-78fb8c774f- calico-system b7169a5d-9351-4a54-9654-bf00214c5b5a 892 0 2025-09-12 17:15:04 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:78fb8c774f projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s ci-4081-3-6-2-0999f1dc3d whisker-78fb8c774f-mnjlm eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] caliaef140e81b1 [] [] }} ContainerID="b2e211a53accc6fddc12dedc4c1a07dd048e0f1b40d2f7384c63668d54bc004a" Namespace="calico-system" Pod="whisker-78fb8c774f-mnjlm" WorkloadEndpoint="ci--4081--3--6--2--0999f1dc3d-k8s-whisker--78fb8c774f--mnjlm-" Sep 12 17:15:04.611166 containerd[1474]: 2025-09-12 17:15:04.444 [INFO][3796] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="b2e211a53accc6fddc12dedc4c1a07dd048e0f1b40d2f7384c63668d54bc004a" Namespace="calico-system" Pod="whisker-78fb8c774f-mnjlm" WorkloadEndpoint="ci--4081--3--6--2--0999f1dc3d-k8s-whisker--78fb8c774f--mnjlm-eth0" Sep 12 17:15:04.611166 containerd[1474]: 2025-09-12 17:15:04.483 [INFO][3808] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="b2e211a53accc6fddc12dedc4c1a07dd048e0f1b40d2f7384c63668d54bc004a" 
HandleID="k8s-pod-network.b2e211a53accc6fddc12dedc4c1a07dd048e0f1b40d2f7384c63668d54bc004a" Workload="ci--4081--3--6--2--0999f1dc3d-k8s-whisker--78fb8c774f--mnjlm-eth0" Sep 12 17:15:04.611166 containerd[1474]: 2025-09-12 17:15:04.483 [INFO][3808] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="b2e211a53accc6fddc12dedc4c1a07dd048e0f1b40d2f7384c63668d54bc004a" HandleID="k8s-pod-network.b2e211a53accc6fddc12dedc4c1a07dd048e0f1b40d2f7384c63668d54bc004a" Workload="ci--4081--3--6--2--0999f1dc3d-k8s-whisker--78fb8c774f--mnjlm-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002d3730), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4081-3-6-2-0999f1dc3d", "pod":"whisker-78fb8c774f-mnjlm", "timestamp":"2025-09-12 17:15:04.483404976 +0000 UTC"}, Hostname:"ci-4081-3-6-2-0999f1dc3d", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 17:15:04.611166 containerd[1474]: 2025-09-12 17:15:04.483 [INFO][3808] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 17:15:04.611166 containerd[1474]: 2025-09-12 17:15:04.483 [INFO][3808] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 12 17:15:04.611166 containerd[1474]: 2025-09-12 17:15:04.483 [INFO][3808] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4081-3-6-2-0999f1dc3d' Sep 12 17:15:04.611166 containerd[1474]: 2025-09-12 17:15:04.497 [INFO][3808] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.b2e211a53accc6fddc12dedc4c1a07dd048e0f1b40d2f7384c63668d54bc004a" host="ci-4081-3-6-2-0999f1dc3d" Sep 12 17:15:04.611166 containerd[1474]: 2025-09-12 17:15:04.507 [INFO][3808] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4081-3-6-2-0999f1dc3d" Sep 12 17:15:04.611166 containerd[1474]: 2025-09-12 17:15:04.520 [INFO][3808] ipam/ipam.go 511: Trying affinity for 192.168.48.64/26 host="ci-4081-3-6-2-0999f1dc3d" Sep 12 17:15:04.611166 containerd[1474]: 2025-09-12 17:15:04.524 [INFO][3808] ipam/ipam.go 158: Attempting to load block cidr=192.168.48.64/26 host="ci-4081-3-6-2-0999f1dc3d" Sep 12 17:15:04.611166 containerd[1474]: 2025-09-12 17:15:04.529 [INFO][3808] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.48.64/26 host="ci-4081-3-6-2-0999f1dc3d" Sep 12 17:15:04.611166 containerd[1474]: 2025-09-12 17:15:04.529 [INFO][3808] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.48.64/26 handle="k8s-pod-network.b2e211a53accc6fddc12dedc4c1a07dd048e0f1b40d2f7384c63668d54bc004a" host="ci-4081-3-6-2-0999f1dc3d" Sep 12 17:15:04.611166 containerd[1474]: 2025-09-12 17:15:04.532 [INFO][3808] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.b2e211a53accc6fddc12dedc4c1a07dd048e0f1b40d2f7384c63668d54bc004a Sep 12 17:15:04.611166 containerd[1474]: 2025-09-12 17:15:04.541 [INFO][3808] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.48.64/26 handle="k8s-pod-network.b2e211a53accc6fddc12dedc4c1a07dd048e0f1b40d2f7384c63668d54bc004a" host="ci-4081-3-6-2-0999f1dc3d" Sep 12 17:15:04.611166 containerd[1474]: 2025-09-12 17:15:04.551 [INFO][3808] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.48.65/26] block=192.168.48.64/26 handle="k8s-pod-network.b2e211a53accc6fddc12dedc4c1a07dd048e0f1b40d2f7384c63668d54bc004a" host="ci-4081-3-6-2-0999f1dc3d" Sep 12 17:15:04.611166 containerd[1474]: 2025-09-12 17:15:04.551 
[INFO][3808] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.48.65/26] handle="k8s-pod-network.b2e211a53accc6fddc12dedc4c1a07dd048e0f1b40d2f7384c63668d54bc004a" host="ci-4081-3-6-2-0999f1dc3d" Sep 12 17:15:04.611166 containerd[1474]: 2025-09-12 17:15:04.552 [INFO][3808] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 17:15:04.611166 containerd[1474]: 2025-09-12 17:15:04.552 [INFO][3808] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.48.65/26] IPv6=[] ContainerID="b2e211a53accc6fddc12dedc4c1a07dd048e0f1b40d2f7384c63668d54bc004a" HandleID="k8s-pod-network.b2e211a53accc6fddc12dedc4c1a07dd048e0f1b40d2f7384c63668d54bc004a" Workload="ci--4081--3--6--2--0999f1dc3d-k8s-whisker--78fb8c774f--mnjlm-eth0" Sep 12 17:15:04.612306 containerd[1474]: 2025-09-12 17:15:04.555 [INFO][3796] cni-plugin/k8s.go 418: Populated endpoint ContainerID="b2e211a53accc6fddc12dedc4c1a07dd048e0f1b40d2f7384c63668d54bc004a" Namespace="calico-system" Pod="whisker-78fb8c774f-mnjlm" WorkloadEndpoint="ci--4081--3--6--2--0999f1dc3d-k8s-whisker--78fb8c774f--mnjlm-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--2--0999f1dc3d-k8s-whisker--78fb8c774f--mnjlm-eth0", GenerateName:"whisker-78fb8c774f-", Namespace:"calico-system", SelfLink:"", UID:"b7169a5d-9351-4a54-9654-bf00214c5b5a", ResourceVersion:"892", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 15, 4, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"78fb8c774f", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-2-0999f1dc3d", ContainerID:"", Pod:"whisker-78fb8c774f-mnjlm", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.48.65/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"caliaef140e81b1", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:15:04.612306 containerd[1474]: 2025-09-12 17:15:04.555 [INFO][3796] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.48.65/32] ContainerID="b2e211a53accc6fddc12dedc4c1a07dd048e0f1b40d2f7384c63668d54bc004a" Namespace="calico-system" Pod="whisker-78fb8c774f-mnjlm" WorkloadEndpoint="ci--4081--3--6--2--0999f1dc3d-k8s-whisker--78fb8c774f--mnjlm-eth0" Sep 12 17:15:04.612306 containerd[1474]: 2025-09-12 17:15:04.555 [INFO][3796] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to caliaef140e81b1 ContainerID="b2e211a53accc6fddc12dedc4c1a07dd048e0f1b40d2f7384c63668d54bc004a" Namespace="calico-system" Pod="whisker-78fb8c774f-mnjlm" WorkloadEndpoint="ci--4081--3--6--2--0999f1dc3d-k8s-whisker--78fb8c774f--mnjlm-eth0" Sep 12 17:15:04.612306 containerd[1474]: 2025-09-12 17:15:04.569 [INFO][3796] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="b2e211a53accc6fddc12dedc4c1a07dd048e0f1b40d2f7384c63668d54bc004a" Namespace="calico-system" Pod="whisker-78fb8c774f-mnjlm" 
WorkloadEndpoint="ci--4081--3--6--2--0999f1dc3d-k8s-whisker--78fb8c774f--mnjlm-eth0" Sep 12 17:15:04.612306 containerd[1474]: 2025-09-12 17:15:04.572 [INFO][3796] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="b2e211a53accc6fddc12dedc4c1a07dd048e0f1b40d2f7384c63668d54bc004a" Namespace="calico-system" Pod="whisker-78fb8c774f-mnjlm" WorkloadEndpoint="ci--4081--3--6--2--0999f1dc3d-k8s-whisker--78fb8c774f--mnjlm-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--2--0999f1dc3d-k8s-whisker--78fb8c774f--mnjlm-eth0", GenerateName:"whisker-78fb8c774f-", Namespace:"calico-system", SelfLink:"", UID:"b7169a5d-9351-4a54-9654-bf00214c5b5a", ResourceVersion:"892", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 15, 4, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"78fb8c774f", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-2-0999f1dc3d", ContainerID:"b2e211a53accc6fddc12dedc4c1a07dd048e0f1b40d2f7384c63668d54bc004a", Pod:"whisker-78fb8c774f-mnjlm", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.48.65/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"caliaef140e81b1", MAC:"7a:58:c9:3f:fd:ff", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:15:04.612306 containerd[1474]: 2025-09-12 17:15:04.605 [INFO][3796] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="b2e211a53accc6fddc12dedc4c1a07dd048e0f1b40d2f7384c63668d54bc004a" Namespace="calico-system" Pod="whisker-78fb8c774f-mnjlm" WorkloadEndpoint="ci--4081--3--6--2--0999f1dc3d-k8s-whisker--78fb8c774f--mnjlm-eth0" Sep 12 17:15:04.636684 containerd[1474]: time="2025-09-12T17:15:04.636385241Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Sep 12 17:15:04.636684 containerd[1474]: time="2025-09-12T17:15:04.636558878Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Sep 12 17:15:04.636684 containerd[1474]: time="2025-09-12T17:15:04.636588518Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 12 17:15:04.637035 containerd[1474]: time="2025-09-12T17:15:04.636738835Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 12 17:15:04.670905 systemd[1]: Started cri-containerd-b2e211a53accc6fddc12dedc4c1a07dd048e0f1b40d2f7384c63668d54bc004a.scope - libcontainer container b2e211a53accc6fddc12dedc4c1a07dd048e0f1b40d2f7384c63668d54bc004a. 
Sep 12 17:15:04.760728 containerd[1474]: time="2025-09-12T17:15:04.760604678Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-78fb8c774f-mnjlm,Uid:b7169a5d-9351-4a54-9654-bf00214c5b5a,Namespace:calico-system,Attempt:0,} returns sandbox id \"b2e211a53accc6fddc12dedc4c1a07dd048e0f1b40d2f7384c63668d54bc004a\"" Sep 12 17:15:04.769363 containerd[1474]: time="2025-09-12T17:15:04.769263130Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.3\"" Sep 12 17:15:05.709386 systemd-networkd[1374]: caliaef140e81b1: Gained IPv6LL Sep 12 17:15:08.232007 containerd[1474]: time="2025-09-12T17:15:08.231918608Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:15:08.233327 containerd[1474]: time="2025-09-12T17:15:08.233260787Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.3: active requests=0, bytes read=4605606" Sep 12 17:15:08.236130 containerd[1474]: time="2025-09-12T17:15:08.234143813Z" level=info msg="ImageCreate event name:\"sha256:270a0129ec34c3ad6ae6d56c0afce111eb0baa25dfdacb63722ec5887bafd3c5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:15:08.240435 containerd[1474]: time="2025-09-12T17:15:08.240345277Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker@sha256:e7113761fc7633d515882f0d48b5c8d0b8e62f3f9d34823f2ee194bb16d2ec44\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:15:08.242861 containerd[1474]: time="2025-09-12T17:15:08.242766360Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker:v3.30.3\" with image id \"sha256:270a0129ec34c3ad6ae6d56c0afce111eb0baa25dfdacb63722ec5887bafd3c5\", repo tag \"ghcr.io/flatcar/calico/whisker:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/whisker@sha256:e7113761fc7633d515882f0d48b5c8d0b8e62f3f9d34823f2ee194bb16d2ec44\", size \"5974839\" in 3.473406512s" Sep 12 17:15:08.242861 containerd[1474]: time="2025-09-12T17:15:08.242856958Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.3\" returns image reference \"sha256:270a0129ec34c3ad6ae6d56c0afce111eb0baa25dfdacb63722ec5887bafd3c5\"" Sep 12 17:15:08.262768 containerd[1474]: time="2025-09-12T17:15:08.262672931Z" level=info msg="CreateContainer within sandbox \"b2e211a53accc6fddc12dedc4c1a07dd048e0f1b40d2f7384c63668d54bc004a\" for container &ContainerMetadata{Name:whisker,Attempt:0,}" Sep 12 17:15:08.282878 containerd[1474]: time="2025-09-12T17:15:08.282761059Z" level=info msg="CreateContainer within sandbox \"b2e211a53accc6fddc12dedc4c1a07dd048e0f1b40d2f7384c63668d54bc004a\" for &ContainerMetadata{Name:whisker,Attempt:0,} returns container id \"3efe0b3c992f7206d1becc954d8f1d9d932f80eb7c459d1251a491a260c8e98d\"" Sep 12 17:15:08.289968 containerd[1474]: time="2025-09-12T17:15:08.288379412Z" level=info msg="StartContainer for \"3efe0b3c992f7206d1becc954d8f1d9d932f80eb7c459d1251a491a260c8e98d\"" Sep 12 17:15:08.336306 systemd[1]: run-containerd-runc-k8s.io-3efe0b3c992f7206d1becc954d8f1d9d932f80eb7c459d1251a491a260c8e98d-runc.tQISse.mount: Deactivated successfully. Sep 12 17:15:08.346433 systemd[1]: Started cri-containerd-3efe0b3c992f7206d1becc954d8f1d9d932f80eb7c459d1251a491a260c8e98d.scope - libcontainer container 3efe0b3c992f7206d1becc954d8f1d9d932f80eb7c459d1251a491a260c8e98d. 
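[Editor's note] The whisker pod runs through the full CRI lifecycle here: RunPodSandbox, PullImage, CreateContainer, then StartContainer (which returns on the next line). The reported pull time of 3.473406512s matches the gap between the PullImage entry at 17:15:04.769 and the Pulled entry at 17:15:08.242. A sketch of the equivalent pull with the standard containerd Go client, assuming the "k8s.io" namespace the CRI plugin uses:

package main

import (
	"context"
	"fmt"
	"log"

	"github.com/containerd/containerd"
	"github.com/containerd/containerd/namespaces"
)

func main() {
	client, err := containerd.New("/run/containerd/containerd.sock")
	if err != nil {
		log.Fatal(err)
	}
	defer client.Close()

	// CRI-managed images live in containerd's "k8s.io" namespace.
	ctx := namespaces.WithNamespace(context.Background(), "k8s.io")

	img, err := client.Pull(ctx, "ghcr.io/flatcar/calico/whisker:v3.30.3",
		containerd.WithPullUnpack)
	if err != nil {
		log.Fatal(err)
	}
	fmt.Println("pulled:", img.Name()) // the log reports ~5.9 MB in ~3.5 s
}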
Sep 12 17:15:08.395470 containerd[1474]: time="2025-09-12T17:15:08.395160476Z" level=info msg="StartContainer for \"3efe0b3c992f7206d1becc954d8f1d9d932f80eb7c459d1251a491a260c8e98d\" returns successfully" Sep 12 17:15:08.398811 containerd[1474]: time="2025-09-12T17:15:08.398760860Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\"" Sep 12 17:15:09.072728 kubelet[2556]: I0912 17:15:09.072655 2556 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 12 17:15:09.455134 kernel: bpftool[4079]: memfd_create() called without MFD_EXEC or MFD_NOEXEC_SEAL set Sep 12 17:15:09.578144 containerd[1474]: time="2025-09-12T17:15:09.577739849Z" level=info msg="StopPodSandbox for \"ce2cd271b4fe965a5586a193e0d10b93b9e9edf084b1af2e80f36b93949f8ed8\"" Sep 12 17:15:09.802648 containerd[1474]: 2025-09-12 17:15:09.702 [INFO][4091] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="ce2cd271b4fe965a5586a193e0d10b93b9e9edf084b1af2e80f36b93949f8ed8" Sep 12 17:15:09.802648 containerd[1474]: 2025-09-12 17:15:09.704 [INFO][4091] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="ce2cd271b4fe965a5586a193e0d10b93b9e9edf084b1af2e80f36b93949f8ed8" iface="eth0" netns="/var/run/netns/cni-afb73e72-03d1-c976-7f6b-d1081e8d0283" Sep 12 17:15:09.802648 containerd[1474]: 2025-09-12 17:15:09.705 [INFO][4091] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="ce2cd271b4fe965a5586a193e0d10b93b9e9edf084b1af2e80f36b93949f8ed8" iface="eth0" netns="/var/run/netns/cni-afb73e72-03d1-c976-7f6b-d1081e8d0283" Sep 12 17:15:09.802648 containerd[1474]: 2025-09-12 17:15:09.705 [INFO][4091] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. ContainerID="ce2cd271b4fe965a5586a193e0d10b93b9e9edf084b1af2e80f36b93949f8ed8" iface="eth0" netns="/var/run/netns/cni-afb73e72-03d1-c976-7f6b-d1081e8d0283" Sep 12 17:15:09.802648 containerd[1474]: 2025-09-12 17:15:09.705 [INFO][4091] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="ce2cd271b4fe965a5586a193e0d10b93b9e9edf084b1af2e80f36b93949f8ed8" Sep 12 17:15:09.802648 containerd[1474]: 2025-09-12 17:15:09.705 [INFO][4091] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="ce2cd271b4fe965a5586a193e0d10b93b9e9edf084b1af2e80f36b93949f8ed8" Sep 12 17:15:09.802648 containerd[1474]: 2025-09-12 17:15:09.775 [INFO][4121] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="ce2cd271b4fe965a5586a193e0d10b93b9e9edf084b1af2e80f36b93949f8ed8" HandleID="k8s-pod-network.ce2cd271b4fe965a5586a193e0d10b93b9e9edf084b1af2e80f36b93949f8ed8" Workload="ci--4081--3--6--2--0999f1dc3d-k8s-calico--kube--controllers--6fff4fc9--pkdxs-eth0" Sep 12 17:15:09.802648 containerd[1474]: 2025-09-12 17:15:09.777 [INFO][4121] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 17:15:09.802648 containerd[1474]: 2025-09-12 17:15:09.777 [INFO][4121] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 12 17:15:09.802648 containerd[1474]: 2025-09-12 17:15:09.791 [WARNING][4121] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="ce2cd271b4fe965a5586a193e0d10b93b9e9edf084b1af2e80f36b93949f8ed8" HandleID="k8s-pod-network.ce2cd271b4fe965a5586a193e0d10b93b9e9edf084b1af2e80f36b93949f8ed8" Workload="ci--4081--3--6--2--0999f1dc3d-k8s-calico--kube--controllers--6fff4fc9--pkdxs-eth0" Sep 12 17:15:09.802648 containerd[1474]: 2025-09-12 17:15:09.791 [INFO][4121] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="ce2cd271b4fe965a5586a193e0d10b93b9e9edf084b1af2e80f36b93949f8ed8" HandleID="k8s-pod-network.ce2cd271b4fe965a5586a193e0d10b93b9e9edf084b1af2e80f36b93949f8ed8" Workload="ci--4081--3--6--2--0999f1dc3d-k8s-calico--kube--controllers--6fff4fc9--pkdxs-eth0" Sep 12 17:15:09.802648 containerd[1474]: 2025-09-12 17:15:09.795 [INFO][4121] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 17:15:09.802648 containerd[1474]: 2025-09-12 17:15:09.798 [INFO][4091] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="ce2cd271b4fe965a5586a193e0d10b93b9e9edf084b1af2e80f36b93949f8ed8" Sep 12 17:15:09.806842 containerd[1474]: time="2025-09-12T17:15:09.806503618Z" level=info msg="TearDown network for sandbox \"ce2cd271b4fe965a5586a193e0d10b93b9e9edf084b1af2e80f36b93949f8ed8\" successfully" Sep 12 17:15:09.806842 containerd[1474]: time="2025-09-12T17:15:09.806591377Z" level=info msg="StopPodSandbox for \"ce2cd271b4fe965a5586a193e0d10b93b9e9edf084b1af2e80f36b93949f8ed8\" returns successfully" Sep 12 17:15:09.810299 containerd[1474]: time="2025-09-12T17:15:09.809549852Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-6fff4fc9-pkdxs,Uid:a51ebaed-92f6-4627-9397-05d2c27efa46,Namespace:calico-system,Attempt:1,}" Sep 12 17:15:09.808963 systemd[1]: run-netns-cni\x2dafb73e72\x2d03d1\x2dc976\x2d7f6b\x2dd1081e8d0283.mount: Deactivated successfully. 
Sep 12 17:15:09.942219 systemd-networkd[1374]: vxlan.calico: Link UP Sep 12 17:15:09.942239 systemd-networkd[1374]: vxlan.calico: Gained carrier Sep 12 17:15:10.257333 systemd-networkd[1374]: cali82790c92204: Link UP Sep 12 17:15:10.260125 systemd-networkd[1374]: cali82790c92204: Gained carrier Sep 12 17:15:10.302395 containerd[1474]: 2025-09-12 17:15:10.023 [INFO][4145] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4081--3--6--2--0999f1dc3d-k8s-calico--kube--controllers--6fff4fc9--pkdxs-eth0 calico-kube-controllers-6fff4fc9- calico-system a51ebaed-92f6-4627-9397-05d2c27efa46 917 0 2025-09-12 17:14:47 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:6fff4fc9 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ci-4081-3-6-2-0999f1dc3d calico-kube-controllers-6fff4fc9-pkdxs eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali82790c92204 [] [] }} ContainerID="1fda1c833dd0c298abd21be0cc5394b58067328fa4156066fc6617adad0b9cf5" Namespace="calico-system" Pod="calico-kube-controllers-6fff4fc9-pkdxs" WorkloadEndpoint="ci--4081--3--6--2--0999f1dc3d-k8s-calico--kube--controllers--6fff4fc9--pkdxs-" Sep 12 17:15:10.302395 containerd[1474]: 2025-09-12 17:15:10.023 [INFO][4145] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="1fda1c833dd0c298abd21be0cc5394b58067328fa4156066fc6617adad0b9cf5" Namespace="calico-system" Pod="calico-kube-controllers-6fff4fc9-pkdxs" WorkloadEndpoint="ci--4081--3--6--2--0999f1dc3d-k8s-calico--kube--controllers--6fff4fc9--pkdxs-eth0" Sep 12 17:15:10.302395 containerd[1474]: 2025-09-12 17:15:10.131 [INFO][4164] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="1fda1c833dd0c298abd21be0cc5394b58067328fa4156066fc6617adad0b9cf5" HandleID="k8s-pod-network.1fda1c833dd0c298abd21be0cc5394b58067328fa4156066fc6617adad0b9cf5" Workload="ci--4081--3--6--2--0999f1dc3d-k8s-calico--kube--controllers--6fff4fc9--pkdxs-eth0" Sep 12 17:15:10.302395 containerd[1474]: 2025-09-12 17:15:10.133 [INFO][4164] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="1fda1c833dd0c298abd21be0cc5394b58067328fa4156066fc6617adad0b9cf5" HandleID="k8s-pod-network.1fda1c833dd0c298abd21be0cc5394b58067328fa4156066fc6617adad0b9cf5" Workload="ci--4081--3--6--2--0999f1dc3d-k8s-calico--kube--controllers--6fff4fc9--pkdxs-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400024a7a0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4081-3-6-2-0999f1dc3d", "pod":"calico-kube-controllers-6fff4fc9-pkdxs", "timestamp":"2025-09-12 17:15:10.12803566 +0000 UTC"}, Hostname:"ci-4081-3-6-2-0999f1dc3d", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 17:15:10.302395 containerd[1474]: 2025-09-12 17:15:10.133 [INFO][4164] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 17:15:10.302395 containerd[1474]: 2025-09-12 17:15:10.134 [INFO][4164] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
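[Editor's note] At the top of this burst, systemd-networkd reports vxlan.calico Link UP: the VXLAN device Calico's felix creates for its overlay (VNI 4096 on UDP port 4789 by default). A sketch that reads the device back, assuming the vishvananda/netlink package; the IPAM entries for calico-kube-controllers continue below:

package main

import (
	"fmt"
	"log"

	"github.com/vishvananda/netlink"
)

func main() {
	link, err := netlink.LinkByName("vxlan.calico")
	if err != nil {
		log.Fatal(err)
	}
	vx, ok := link.(*netlink.Vxlan)
	if !ok {
		log.Fatalf("vxlan.calico is a %s, not a vxlan device", link.Type())
	}
	// Calico's defaults are VNI 4096 and UDP 4789; the actual values are
	// whatever felix configured on this node.
	fmt.Printf("vxlan %s: VNI=%d port=%d\n", vx.Attrs().Name, vx.VxlanId, vx.Port)
}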
Sep 12 17:15:10.302395 containerd[1474]: 2025-09-12 17:15:10.134 [INFO][4164] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4081-3-6-2-0999f1dc3d' Sep 12 17:15:10.302395 containerd[1474]: 2025-09-12 17:15:10.157 [INFO][4164] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.1fda1c833dd0c298abd21be0cc5394b58067328fa4156066fc6617adad0b9cf5" host="ci-4081-3-6-2-0999f1dc3d" Sep 12 17:15:10.302395 containerd[1474]: 2025-09-12 17:15:10.170 [INFO][4164] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4081-3-6-2-0999f1dc3d" Sep 12 17:15:10.302395 containerd[1474]: 2025-09-12 17:15:10.187 [INFO][4164] ipam/ipam.go 511: Trying affinity for 192.168.48.64/26 host="ci-4081-3-6-2-0999f1dc3d" Sep 12 17:15:10.302395 containerd[1474]: 2025-09-12 17:15:10.191 [INFO][4164] ipam/ipam.go 158: Attempting to load block cidr=192.168.48.64/26 host="ci-4081-3-6-2-0999f1dc3d" Sep 12 17:15:10.302395 containerd[1474]: 2025-09-12 17:15:10.205 [INFO][4164] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.48.64/26 host="ci-4081-3-6-2-0999f1dc3d" Sep 12 17:15:10.302395 containerd[1474]: 2025-09-12 17:15:10.206 [INFO][4164] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.48.64/26 handle="k8s-pod-network.1fda1c833dd0c298abd21be0cc5394b58067328fa4156066fc6617adad0b9cf5" host="ci-4081-3-6-2-0999f1dc3d" Sep 12 17:15:10.302395 containerd[1474]: 2025-09-12 17:15:10.216 [INFO][4164] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.1fda1c833dd0c298abd21be0cc5394b58067328fa4156066fc6617adad0b9cf5 Sep 12 17:15:10.302395 containerd[1474]: 2025-09-12 17:15:10.226 [INFO][4164] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.48.64/26 handle="k8s-pod-network.1fda1c833dd0c298abd21be0cc5394b58067328fa4156066fc6617adad0b9cf5" host="ci-4081-3-6-2-0999f1dc3d" Sep 12 17:15:10.302395 containerd[1474]: 2025-09-12 17:15:10.246 [INFO][4164] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.48.66/26] block=192.168.48.64/26 handle="k8s-pod-network.1fda1c833dd0c298abd21be0cc5394b58067328fa4156066fc6617adad0b9cf5" host="ci-4081-3-6-2-0999f1dc3d" Sep 12 17:15:10.302395 containerd[1474]: 2025-09-12 17:15:10.248 [INFO][4164] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.48.66/26] handle="k8s-pod-network.1fda1c833dd0c298abd21be0cc5394b58067328fa4156066fc6617adad0b9cf5" host="ci-4081-3-6-2-0999f1dc3d" Sep 12 17:15:10.302395 containerd[1474]: 2025-09-12 17:15:10.248 [INFO][4164] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 12 17:15:10.302395 containerd[1474]: 2025-09-12 17:15:10.248 [INFO][4164] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.48.66/26] IPv6=[] ContainerID="1fda1c833dd0c298abd21be0cc5394b58067328fa4156066fc6617adad0b9cf5" HandleID="k8s-pod-network.1fda1c833dd0c298abd21be0cc5394b58067328fa4156066fc6617adad0b9cf5" Workload="ci--4081--3--6--2--0999f1dc3d-k8s-calico--kube--controllers--6fff4fc9--pkdxs-eth0" Sep 12 17:15:10.307386 containerd[1474]: 2025-09-12 17:15:10.251 [INFO][4145] cni-plugin/k8s.go 418: Populated endpoint ContainerID="1fda1c833dd0c298abd21be0cc5394b58067328fa4156066fc6617adad0b9cf5" Namespace="calico-system" Pod="calico-kube-controllers-6fff4fc9-pkdxs" WorkloadEndpoint="ci--4081--3--6--2--0999f1dc3d-k8s-calico--kube--controllers--6fff4fc9--pkdxs-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--2--0999f1dc3d-k8s-calico--kube--controllers--6fff4fc9--pkdxs-eth0", GenerateName:"calico-kube-controllers-6fff4fc9-", Namespace:"calico-system", SelfLink:"", UID:"a51ebaed-92f6-4627-9397-05d2c27efa46", ResourceVersion:"917", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 14, 47, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"6fff4fc9", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-2-0999f1dc3d", ContainerID:"", Pod:"calico-kube-controllers-6fff4fc9-pkdxs", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.48.66/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali82790c92204", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:15:10.307386 containerd[1474]: 2025-09-12 17:15:10.251 [INFO][4145] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.48.66/32] ContainerID="1fda1c833dd0c298abd21be0cc5394b58067328fa4156066fc6617adad0b9cf5" Namespace="calico-system" Pod="calico-kube-controllers-6fff4fc9-pkdxs" WorkloadEndpoint="ci--4081--3--6--2--0999f1dc3d-k8s-calico--kube--controllers--6fff4fc9--pkdxs-eth0" Sep 12 17:15:10.307386 containerd[1474]: 2025-09-12 17:15:10.251 [INFO][4145] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali82790c92204 ContainerID="1fda1c833dd0c298abd21be0cc5394b58067328fa4156066fc6617adad0b9cf5" Namespace="calico-system" Pod="calico-kube-controllers-6fff4fc9-pkdxs" WorkloadEndpoint="ci--4081--3--6--2--0999f1dc3d-k8s-calico--kube--controllers--6fff4fc9--pkdxs-eth0" Sep 12 17:15:10.307386 containerd[1474]: 2025-09-12 17:15:10.260 [INFO][4145] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="1fda1c833dd0c298abd21be0cc5394b58067328fa4156066fc6617adad0b9cf5" Namespace="calico-system" Pod="calico-kube-controllers-6fff4fc9-pkdxs" WorkloadEndpoint="ci--4081--3--6--2--0999f1dc3d-k8s-calico--kube--controllers--6fff4fc9--pkdxs-eth0" Sep 
12 17:15:10.307386 containerd[1474]: 2025-09-12 17:15:10.261 [INFO][4145] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="1fda1c833dd0c298abd21be0cc5394b58067328fa4156066fc6617adad0b9cf5" Namespace="calico-system" Pod="calico-kube-controllers-6fff4fc9-pkdxs" WorkloadEndpoint="ci--4081--3--6--2--0999f1dc3d-k8s-calico--kube--controllers--6fff4fc9--pkdxs-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--2--0999f1dc3d-k8s-calico--kube--controllers--6fff4fc9--pkdxs-eth0", GenerateName:"calico-kube-controllers-6fff4fc9-", Namespace:"calico-system", SelfLink:"", UID:"a51ebaed-92f6-4627-9397-05d2c27efa46", ResourceVersion:"917", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 14, 47, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"6fff4fc9", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-2-0999f1dc3d", ContainerID:"1fda1c833dd0c298abd21be0cc5394b58067328fa4156066fc6617adad0b9cf5", Pod:"calico-kube-controllers-6fff4fc9-pkdxs", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.48.66/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali82790c92204", MAC:"22:78:5a:2a:c2:27", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:15:10.307386 containerd[1474]: 2025-09-12 17:15:10.295 [INFO][4145] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="1fda1c833dd0c298abd21be0cc5394b58067328fa4156066fc6617adad0b9cf5" Namespace="calico-system" Pod="calico-kube-controllers-6fff4fc9-pkdxs" WorkloadEndpoint="ci--4081--3--6--2--0999f1dc3d-k8s-calico--kube--controllers--6fff4fc9--pkdxs-eth0" Sep 12 17:15:10.358198 containerd[1474]: time="2025-09-12T17:15:10.356896300Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Sep 12 17:15:10.358198 containerd[1474]: time="2025-09-12T17:15:10.357003138Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Sep 12 17:15:10.358198 containerd[1474]: time="2025-09-12T17:15:10.357019378Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 12 17:15:10.361272 containerd[1474]: time="2025-09-12T17:15:10.360223211Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 12 17:15:10.418572 systemd[1]: Started cri-containerd-1fda1c833dd0c298abd21be0cc5394b58067328fa4156066fc6617adad0b9cf5.scope - libcontainer container 1fda1c833dd0c298abd21be0cc5394b58067328fa4156066fc6617adad0b9cf5. 
Sep 12 17:15:10.507296 containerd[1474]: time="2025-09-12T17:15:10.507189267Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-6fff4fc9-pkdxs,Uid:a51ebaed-92f6-4627-9397-05d2c27efa46,Namespace:calico-system,Attempt:1,} returns sandbox id \"1fda1c833dd0c298abd21be0cc5394b58067328fa4156066fc6617adad0b9cf5\"" Sep 12 17:15:10.584292 containerd[1474]: time="2025-09-12T17:15:10.583536653Z" level=info msg="StopPodSandbox for \"4144508f92b8377437d05d3e3f54b9c688ba69ea087844d5f28e60164de8e266\"" Sep 12 17:15:10.586376 containerd[1474]: time="2025-09-12T17:15:10.585565783Z" level=info msg="StopPodSandbox for \"13b1e6e1079aeb74408a4c106a5914b00d680c225078fed150adc2cc99d95c7a\"" Sep 12 17:15:10.769412 containerd[1474]: 2025-09-12 17:15:10.681 [INFO][4260] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="4144508f92b8377437d05d3e3f54b9c688ba69ea087844d5f28e60164de8e266" Sep 12 17:15:10.769412 containerd[1474]: 2025-09-12 17:15:10.687 [INFO][4260] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="4144508f92b8377437d05d3e3f54b9c688ba69ea087844d5f28e60164de8e266" iface="eth0" netns="/var/run/netns/cni-0cbd13fa-27fe-8153-018d-346e18ad9e35" Sep 12 17:15:10.769412 containerd[1474]: 2025-09-12 17:15:10.689 [INFO][4260] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="4144508f92b8377437d05d3e3f54b9c688ba69ea087844d5f28e60164de8e266" iface="eth0" netns="/var/run/netns/cni-0cbd13fa-27fe-8153-018d-346e18ad9e35" Sep 12 17:15:10.769412 containerd[1474]: 2025-09-12 17:15:10.691 [INFO][4260] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. ContainerID="4144508f92b8377437d05d3e3f54b9c688ba69ea087844d5f28e60164de8e266" iface="eth0" netns="/var/run/netns/cni-0cbd13fa-27fe-8153-018d-346e18ad9e35" Sep 12 17:15:10.769412 containerd[1474]: 2025-09-12 17:15:10.691 [INFO][4260] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="4144508f92b8377437d05d3e3f54b9c688ba69ea087844d5f28e60164de8e266" Sep 12 17:15:10.769412 containerd[1474]: 2025-09-12 17:15:10.691 [INFO][4260] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="4144508f92b8377437d05d3e3f54b9c688ba69ea087844d5f28e60164de8e266" Sep 12 17:15:10.769412 containerd[1474]: 2025-09-12 17:15:10.742 [INFO][4278] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="4144508f92b8377437d05d3e3f54b9c688ba69ea087844d5f28e60164de8e266" HandleID="k8s-pod-network.4144508f92b8377437d05d3e3f54b9c688ba69ea087844d5f28e60164de8e266" Workload="ci--4081--3--6--2--0999f1dc3d-k8s-csi--node--driver--krpgf-eth0" Sep 12 17:15:10.769412 containerd[1474]: 2025-09-12 17:15:10.744 [INFO][4278] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 17:15:10.769412 containerd[1474]: 2025-09-12 17:15:10.744 [INFO][4278] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 12 17:15:10.769412 containerd[1474]: 2025-09-12 17:15:10.758 [WARNING][4278] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="4144508f92b8377437d05d3e3f54b9c688ba69ea087844d5f28e60164de8e266" HandleID="k8s-pod-network.4144508f92b8377437d05d3e3f54b9c688ba69ea087844d5f28e60164de8e266" Workload="ci--4081--3--6--2--0999f1dc3d-k8s-csi--node--driver--krpgf-eth0" Sep 12 17:15:10.769412 containerd[1474]: 2025-09-12 17:15:10.759 [INFO][4278] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="4144508f92b8377437d05d3e3f54b9c688ba69ea087844d5f28e60164de8e266" HandleID="k8s-pod-network.4144508f92b8377437d05d3e3f54b9c688ba69ea087844d5f28e60164de8e266" Workload="ci--4081--3--6--2--0999f1dc3d-k8s-csi--node--driver--krpgf-eth0" Sep 12 17:15:10.769412 containerd[1474]: 2025-09-12 17:15:10.762 [INFO][4278] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 17:15:10.769412 containerd[1474]: 2025-09-12 17:15:10.765 [INFO][4260] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="4144508f92b8377437d05d3e3f54b9c688ba69ea087844d5f28e60164de8e266" Sep 12 17:15:10.772537 containerd[1474]: time="2025-09-12T17:15:10.771675858Z" level=info msg="TearDown network for sandbox \"4144508f92b8377437d05d3e3f54b9c688ba69ea087844d5f28e60164de8e266\" successfully" Sep 12 17:15:10.772537 containerd[1474]: time="2025-09-12T17:15:10.771763176Z" level=info msg="StopPodSandbox for \"4144508f92b8377437d05d3e3f54b9c688ba69ea087844d5f28e60164de8e266\" returns successfully" Sep 12 17:15:10.778466 containerd[1474]: time="2025-09-12T17:15:10.777327374Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-krpgf,Uid:7c11d9b3-8068-4598-8721-3a4e4f793c52,Namespace:calico-system,Attempt:1,}" Sep 12 17:15:10.806770 systemd[1]: run-netns-cni\x2d0cbd13fa\x2d27fe\x2d8153\x2d018d\x2d346e18ad9e35.mount: Deactivated successfully. Sep 12 17:15:10.840293 containerd[1474]: 2025-09-12 17:15:10.730 [INFO][4268] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="13b1e6e1079aeb74408a4c106a5914b00d680c225078fed150adc2cc99d95c7a" Sep 12 17:15:10.840293 containerd[1474]: 2025-09-12 17:15:10.730 [INFO][4268] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="13b1e6e1079aeb74408a4c106a5914b00d680c225078fed150adc2cc99d95c7a" iface="eth0" netns="/var/run/netns/cni-142c914c-ea54-c6e1-99ed-8f314261f06b" Sep 12 17:15:10.840293 containerd[1474]: 2025-09-12 17:15:10.736 [INFO][4268] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="13b1e6e1079aeb74408a4c106a5914b00d680c225078fed150adc2cc99d95c7a" iface="eth0" netns="/var/run/netns/cni-142c914c-ea54-c6e1-99ed-8f314261f06b" Sep 12 17:15:10.840293 containerd[1474]: 2025-09-12 17:15:10.736 [INFO][4268] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. 
ContainerID="13b1e6e1079aeb74408a4c106a5914b00d680c225078fed150adc2cc99d95c7a" iface="eth0" netns="/var/run/netns/cni-142c914c-ea54-c6e1-99ed-8f314261f06b" Sep 12 17:15:10.840293 containerd[1474]: 2025-09-12 17:15:10.736 [INFO][4268] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="13b1e6e1079aeb74408a4c106a5914b00d680c225078fed150adc2cc99d95c7a" Sep 12 17:15:10.840293 containerd[1474]: 2025-09-12 17:15:10.736 [INFO][4268] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="13b1e6e1079aeb74408a4c106a5914b00d680c225078fed150adc2cc99d95c7a" Sep 12 17:15:10.840293 containerd[1474]: 2025-09-12 17:15:10.793 [INFO][4287] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="13b1e6e1079aeb74408a4c106a5914b00d680c225078fed150adc2cc99d95c7a" HandleID="k8s-pod-network.13b1e6e1079aeb74408a4c106a5914b00d680c225078fed150adc2cc99d95c7a" Workload="ci--4081--3--6--2--0999f1dc3d-k8s-goldmane--54d579b49d--fnsjh-eth0" Sep 12 17:15:10.840293 containerd[1474]: 2025-09-12 17:15:10.793 [INFO][4287] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 17:15:10.840293 containerd[1474]: 2025-09-12 17:15:10.793 [INFO][4287] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 12 17:15:10.840293 containerd[1474]: 2025-09-12 17:15:10.811 [WARNING][4287] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="13b1e6e1079aeb74408a4c106a5914b00d680c225078fed150adc2cc99d95c7a" HandleID="k8s-pod-network.13b1e6e1079aeb74408a4c106a5914b00d680c225078fed150adc2cc99d95c7a" Workload="ci--4081--3--6--2--0999f1dc3d-k8s-goldmane--54d579b49d--fnsjh-eth0" Sep 12 17:15:10.840293 containerd[1474]: 2025-09-12 17:15:10.811 [INFO][4287] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="13b1e6e1079aeb74408a4c106a5914b00d680c225078fed150adc2cc99d95c7a" HandleID="k8s-pod-network.13b1e6e1079aeb74408a4c106a5914b00d680c225078fed150adc2cc99d95c7a" Workload="ci--4081--3--6--2--0999f1dc3d-k8s-goldmane--54d579b49d--fnsjh-eth0" Sep 12 17:15:10.840293 containerd[1474]: 2025-09-12 17:15:10.817 [INFO][4287] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 17:15:10.840293 containerd[1474]: 2025-09-12 17:15:10.831 [INFO][4268] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="13b1e6e1079aeb74408a4c106a5914b00d680c225078fed150adc2cc99d95c7a" Sep 12 17:15:10.842569 containerd[1474]: time="2025-09-12T17:15:10.842364727Z" level=info msg="TearDown network for sandbox \"13b1e6e1079aeb74408a4c106a5914b00d680c225078fed150adc2cc99d95c7a\" successfully" Sep 12 17:15:10.842569 containerd[1474]: time="2025-09-12T17:15:10.842427766Z" level=info msg="StopPodSandbox for \"13b1e6e1079aeb74408a4c106a5914b00d680c225078fed150adc2cc99d95c7a\" returns successfully" Sep 12 17:15:10.847203 systemd[1]: run-netns-cni\x2d142c914c\x2dea54\x2dc6e1\x2d99ed\x2d8f314261f06b.mount: Deactivated successfully. 
Sep 12 17:15:10.848671 containerd[1474]: time="2025-09-12T17:15:10.848425077Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-fnsjh,Uid:3b22a780-bb41-4d2d-9dad-2beb095f9e2c,Namespace:calico-system,Attempt:1,}" Sep 12 17:15:11.181215 systemd-networkd[1374]: cali556a08f9179: Link UP Sep 12 17:15:11.183290 systemd-networkd[1374]: cali556a08f9179: Gained carrier Sep 12 17:15:11.221481 containerd[1474]: 2025-09-12 17:15:10.982 [INFO][4302] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4081--3--6--2--0999f1dc3d-k8s-csi--node--driver--krpgf-eth0 csi-node-driver- calico-system 7c11d9b3-8068-4598-8721-3a4e4f793c52 927 0 2025-09-12 17:14:47 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:6c96d95cc7 k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s ci-4081-3-6-2-0999f1dc3d csi-node-driver-krpgf eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] cali556a08f9179 [] [] }} ContainerID="3003849266b76e7168cdfde98b1fd3830e01312968a4c53470e7af0342164f4b" Namespace="calico-system" Pod="csi-node-driver-krpgf" WorkloadEndpoint="ci--4081--3--6--2--0999f1dc3d-k8s-csi--node--driver--krpgf-" Sep 12 17:15:11.221481 containerd[1474]: 2025-09-12 17:15:10.983 [INFO][4302] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="3003849266b76e7168cdfde98b1fd3830e01312968a4c53470e7af0342164f4b" Namespace="calico-system" Pod="csi-node-driver-krpgf" WorkloadEndpoint="ci--4081--3--6--2--0999f1dc3d-k8s-csi--node--driver--krpgf-eth0" Sep 12 17:15:11.221481 containerd[1474]: 2025-09-12 17:15:11.079 [INFO][4351] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="3003849266b76e7168cdfde98b1fd3830e01312968a4c53470e7af0342164f4b" HandleID="k8s-pod-network.3003849266b76e7168cdfde98b1fd3830e01312968a4c53470e7af0342164f4b" Workload="ci--4081--3--6--2--0999f1dc3d-k8s-csi--node--driver--krpgf-eth0" Sep 12 17:15:11.221481 containerd[1474]: 2025-09-12 17:15:11.079 [INFO][4351] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="3003849266b76e7168cdfde98b1fd3830e01312968a4c53470e7af0342164f4b" HandleID="k8s-pod-network.3003849266b76e7168cdfde98b1fd3830e01312968a4c53470e7af0342164f4b" Workload="ci--4081--3--6--2--0999f1dc3d-k8s-csi--node--driver--krpgf-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x4000323970), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4081-3-6-2-0999f1dc3d", "pod":"csi-node-driver-krpgf", "timestamp":"2025-09-12 17:15:11.079088154 +0000 UTC"}, Hostname:"ci-4081-3-6-2-0999f1dc3d", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 17:15:11.221481 containerd[1474]: 2025-09-12 17:15:11.079 [INFO][4351] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 17:15:11.221481 containerd[1474]: 2025-09-12 17:15:11.079 [INFO][4351] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 12 17:15:11.221481 containerd[1474]: 2025-09-12 17:15:11.079 [INFO][4351] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4081-3-6-2-0999f1dc3d' Sep 12 17:15:11.221481 containerd[1474]: 2025-09-12 17:15:11.103 [INFO][4351] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.3003849266b76e7168cdfde98b1fd3830e01312968a4c53470e7af0342164f4b" host="ci-4081-3-6-2-0999f1dc3d" Sep 12 17:15:11.221481 containerd[1474]: 2025-09-12 17:15:11.117 [INFO][4351] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4081-3-6-2-0999f1dc3d" Sep 12 17:15:11.221481 containerd[1474]: 2025-09-12 17:15:11.127 [INFO][4351] ipam/ipam.go 511: Trying affinity for 192.168.48.64/26 host="ci-4081-3-6-2-0999f1dc3d" Sep 12 17:15:11.221481 containerd[1474]: 2025-09-12 17:15:11.131 [INFO][4351] ipam/ipam.go 158: Attempting to load block cidr=192.168.48.64/26 host="ci-4081-3-6-2-0999f1dc3d" Sep 12 17:15:11.221481 containerd[1474]: 2025-09-12 17:15:11.138 [INFO][4351] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.48.64/26 host="ci-4081-3-6-2-0999f1dc3d" Sep 12 17:15:11.221481 containerd[1474]: 2025-09-12 17:15:11.138 [INFO][4351] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.48.64/26 handle="k8s-pod-network.3003849266b76e7168cdfde98b1fd3830e01312968a4c53470e7af0342164f4b" host="ci-4081-3-6-2-0999f1dc3d" Sep 12 17:15:11.221481 containerd[1474]: 2025-09-12 17:15:11.142 [INFO][4351] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.3003849266b76e7168cdfde98b1fd3830e01312968a4c53470e7af0342164f4b Sep 12 17:15:11.221481 containerd[1474]: 2025-09-12 17:15:11.150 [INFO][4351] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.48.64/26 handle="k8s-pod-network.3003849266b76e7168cdfde98b1fd3830e01312968a4c53470e7af0342164f4b" host="ci-4081-3-6-2-0999f1dc3d" Sep 12 17:15:11.221481 containerd[1474]: 2025-09-12 17:15:11.163 [INFO][4351] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.48.67/26] block=192.168.48.64/26 handle="k8s-pod-network.3003849266b76e7168cdfde98b1fd3830e01312968a4c53470e7af0342164f4b" host="ci-4081-3-6-2-0999f1dc3d" Sep 12 17:15:11.221481 containerd[1474]: 2025-09-12 17:15:11.164 [INFO][4351] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.48.67/26] handle="k8s-pod-network.3003849266b76e7168cdfde98b1fd3830e01312968a4c53470e7af0342164f4b" host="ci-4081-3-6-2-0999f1dc3d" Sep 12 17:15:11.221481 containerd[1474]: 2025-09-12 17:15:11.164 [INFO][4351] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
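[Editor's note] Every allocation in this log is bracketed by "About to acquire host-wide IPAM lock" and "Released host-wide IPAM lock", which serializes concurrently invoked CNI plugin processes on one node. A sketch of that pattern with an advisory file lock; the lock-file path here is hypothetical and not claimed to be Calico's actual mechanism:

package main

import (
	"log"
	"os"

	"golang.org/x/sys/unix"
)

func main() {
	// Hypothetical lock file; the point is the flock pattern, not the path.
	f, err := os.OpenFile("/var/run/example-ipam.lock", os.O_CREATE|os.O_RDWR, 0o600)
	if err != nil {
		log.Fatal(err)
	}
	defer f.Close()

	// Blocks until no other invocation holds the exclusive lock.
	if err := unix.Flock(int(f.Fd()), unix.LOCK_EX); err != nil {
		log.Fatal(err)
	}
	defer unix.Flock(int(f.Fd()), unix.LOCK_UN)

	// ...look up block affinity, claim an address, write the block back,
	// mirroring the sequence between acquire and release in the log.
}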
Sep 12 17:15:11.221481 containerd[1474]: 2025-09-12 17:15:11.164 [INFO][4351] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.48.67/26] IPv6=[] ContainerID="3003849266b76e7168cdfde98b1fd3830e01312968a4c53470e7af0342164f4b" HandleID="k8s-pod-network.3003849266b76e7168cdfde98b1fd3830e01312968a4c53470e7af0342164f4b" Workload="ci--4081--3--6--2--0999f1dc3d-k8s-csi--node--driver--krpgf-eth0" Sep 12 17:15:11.222320 containerd[1474]: 2025-09-12 17:15:11.172 [INFO][4302] cni-plugin/k8s.go 418: Populated endpoint ContainerID="3003849266b76e7168cdfde98b1fd3830e01312968a4c53470e7af0342164f4b" Namespace="calico-system" Pod="csi-node-driver-krpgf" WorkloadEndpoint="ci--4081--3--6--2--0999f1dc3d-k8s-csi--node--driver--krpgf-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--2--0999f1dc3d-k8s-csi--node--driver--krpgf-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"7c11d9b3-8068-4598-8721-3a4e4f793c52", ResourceVersion:"927", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 14, 47, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"6c96d95cc7", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-2-0999f1dc3d", ContainerID:"", Pod:"csi-node-driver-krpgf", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.48.67/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali556a08f9179", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:15:11.222320 containerd[1474]: 2025-09-12 17:15:11.173 [INFO][4302] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.48.67/32] ContainerID="3003849266b76e7168cdfde98b1fd3830e01312968a4c53470e7af0342164f4b" Namespace="calico-system" Pod="csi-node-driver-krpgf" WorkloadEndpoint="ci--4081--3--6--2--0999f1dc3d-k8s-csi--node--driver--krpgf-eth0" Sep 12 17:15:11.222320 containerd[1474]: 2025-09-12 17:15:11.174 [INFO][4302] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali556a08f9179 ContainerID="3003849266b76e7168cdfde98b1fd3830e01312968a4c53470e7af0342164f4b" Namespace="calico-system" Pod="csi-node-driver-krpgf" WorkloadEndpoint="ci--4081--3--6--2--0999f1dc3d-k8s-csi--node--driver--krpgf-eth0" Sep 12 17:15:11.222320 containerd[1474]: 2025-09-12 17:15:11.182 [INFO][4302] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="3003849266b76e7168cdfde98b1fd3830e01312968a4c53470e7af0342164f4b" Namespace="calico-system" Pod="csi-node-driver-krpgf" WorkloadEndpoint="ci--4081--3--6--2--0999f1dc3d-k8s-csi--node--driver--krpgf-eth0" Sep 12 17:15:11.222320 containerd[1474]: 2025-09-12 17:15:11.183 [INFO][4302] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint 
ContainerID="3003849266b76e7168cdfde98b1fd3830e01312968a4c53470e7af0342164f4b" Namespace="calico-system" Pod="csi-node-driver-krpgf" WorkloadEndpoint="ci--4081--3--6--2--0999f1dc3d-k8s-csi--node--driver--krpgf-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--2--0999f1dc3d-k8s-csi--node--driver--krpgf-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"7c11d9b3-8068-4598-8721-3a4e4f793c52", ResourceVersion:"927", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 14, 47, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"6c96d95cc7", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-2-0999f1dc3d", ContainerID:"3003849266b76e7168cdfde98b1fd3830e01312968a4c53470e7af0342164f4b", Pod:"csi-node-driver-krpgf", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.48.67/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali556a08f9179", MAC:"1a:8e:d1:bb:6b:85", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:15:11.222320 containerd[1474]: 2025-09-12 17:15:11.216 [INFO][4302] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="3003849266b76e7168cdfde98b1fd3830e01312968a4c53470e7af0342164f4b" Namespace="calico-system" Pod="csi-node-driver-krpgf" WorkloadEndpoint="ci--4081--3--6--2--0999f1dc3d-k8s-csi--node--driver--krpgf-eth0" Sep 12 17:15:11.264269 containerd[1474]: time="2025-09-12T17:15:11.264078860Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Sep 12 17:15:11.265690 containerd[1474]: time="2025-09-12T17:15:11.264289337Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Sep 12 17:15:11.265690 containerd[1474]: time="2025-09-12T17:15:11.264347256Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 12 17:15:11.267184 containerd[1474]: time="2025-09-12T17:15:11.266488065Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 12 17:15:11.306476 systemd[1]: Started cri-containerd-3003849266b76e7168cdfde98b1fd3830e01312968a4c53470e7af0342164f4b.scope - libcontainer container 3003849266b76e7168cdfde98b1fd3830e01312968a4c53470e7af0342164f4b. 
Sep 12 17:15:11.315806 systemd-networkd[1374]: calia8e96e15d55: Link UP Sep 12 17:15:11.321617 systemd-networkd[1374]: calia8e96e15d55: Gained carrier Sep 12 17:15:11.363482 containerd[1474]: 2025-09-12 17:15:11.023 [INFO][4318] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4081--3--6--2--0999f1dc3d-k8s-goldmane--54d579b49d--fnsjh-eth0 goldmane-54d579b49d- calico-system 3b22a780-bb41-4d2d-9dad-2beb095f9e2c 928 0 2025-09-12 17:14:47 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:54d579b49d projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s ci-4081-3-6-2-0999f1dc3d goldmane-54d579b49d-fnsjh eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] calia8e96e15d55 [] [] }} ContainerID="591a64a3fdf996e8c3d680f0698e12606385018435cee65b50db99b9a3df99dc" Namespace="calico-system" Pod="goldmane-54d579b49d-fnsjh" WorkloadEndpoint="ci--4081--3--6--2--0999f1dc3d-k8s-goldmane--54d579b49d--fnsjh-" Sep 12 17:15:11.363482 containerd[1474]: 2025-09-12 17:15:11.024 [INFO][4318] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="591a64a3fdf996e8c3d680f0698e12606385018435cee65b50db99b9a3df99dc" Namespace="calico-system" Pod="goldmane-54d579b49d-fnsjh" WorkloadEndpoint="ci--4081--3--6--2--0999f1dc3d-k8s-goldmane--54d579b49d--fnsjh-eth0" Sep 12 17:15:11.363482 containerd[1474]: 2025-09-12 17:15:11.117 [INFO][4362] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="591a64a3fdf996e8c3d680f0698e12606385018435cee65b50db99b9a3df99dc" HandleID="k8s-pod-network.591a64a3fdf996e8c3d680f0698e12606385018435cee65b50db99b9a3df99dc" Workload="ci--4081--3--6--2--0999f1dc3d-k8s-goldmane--54d579b49d--fnsjh-eth0" Sep 12 17:15:11.363482 containerd[1474]: 2025-09-12 17:15:11.118 [INFO][4362] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="591a64a3fdf996e8c3d680f0698e12606385018435cee65b50db99b9a3df99dc" HandleID="k8s-pod-network.591a64a3fdf996e8c3d680f0698e12606385018435cee65b50db99b9a3df99dc" Workload="ci--4081--3--6--2--0999f1dc3d-k8s-goldmane--54d579b49d--fnsjh-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400004c2e0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4081-3-6-2-0999f1dc3d", "pod":"goldmane-54d579b49d-fnsjh", "timestamp":"2025-09-12 17:15:11.117728071 +0000 UTC"}, Hostname:"ci-4081-3-6-2-0999f1dc3d", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 17:15:11.363482 containerd[1474]: 2025-09-12 17:15:11.118 [INFO][4362] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 17:15:11.363482 containerd[1474]: 2025-09-12 17:15:11.164 [INFO][4362] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 12 17:15:11.363482 containerd[1474]: 2025-09-12 17:15:11.165 [INFO][4362] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4081-3-6-2-0999f1dc3d' Sep 12 17:15:11.363482 containerd[1474]: 2025-09-12 17:15:11.204 [INFO][4362] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.591a64a3fdf996e8c3d680f0698e12606385018435cee65b50db99b9a3df99dc" host="ci-4081-3-6-2-0999f1dc3d" Sep 12 17:15:11.363482 containerd[1474]: 2025-09-12 17:15:11.225 [INFO][4362] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4081-3-6-2-0999f1dc3d" Sep 12 17:15:11.363482 containerd[1474]: 2025-09-12 17:15:11.238 [INFO][4362] ipam/ipam.go 511: Trying affinity for 192.168.48.64/26 host="ci-4081-3-6-2-0999f1dc3d" Sep 12 17:15:11.363482 containerd[1474]: 2025-09-12 17:15:11.242 [INFO][4362] ipam/ipam.go 158: Attempting to load block cidr=192.168.48.64/26 host="ci-4081-3-6-2-0999f1dc3d" Sep 12 17:15:11.363482 containerd[1474]: 2025-09-12 17:15:11.248 [INFO][4362] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.48.64/26 host="ci-4081-3-6-2-0999f1dc3d" Sep 12 17:15:11.363482 containerd[1474]: 2025-09-12 17:15:11.248 [INFO][4362] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.48.64/26 handle="k8s-pod-network.591a64a3fdf996e8c3d680f0698e12606385018435cee65b50db99b9a3df99dc" host="ci-4081-3-6-2-0999f1dc3d" Sep 12 17:15:11.363482 containerd[1474]: 2025-09-12 17:15:11.257 [INFO][4362] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.591a64a3fdf996e8c3d680f0698e12606385018435cee65b50db99b9a3df99dc Sep 12 17:15:11.363482 containerd[1474]: 2025-09-12 17:15:11.268 [INFO][4362] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.48.64/26 handle="k8s-pod-network.591a64a3fdf996e8c3d680f0698e12606385018435cee65b50db99b9a3df99dc" host="ci-4081-3-6-2-0999f1dc3d" Sep 12 17:15:11.363482 containerd[1474]: 2025-09-12 17:15:11.291 [INFO][4362] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.48.68/26] block=192.168.48.64/26 handle="k8s-pod-network.591a64a3fdf996e8c3d680f0698e12606385018435cee65b50db99b9a3df99dc" host="ci-4081-3-6-2-0999f1dc3d" Sep 12 17:15:11.363482 containerd[1474]: 2025-09-12 17:15:11.291 [INFO][4362] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.48.68/26] handle="k8s-pod-network.591a64a3fdf996e8c3d680f0698e12606385018435cee65b50db99b9a3df99dc" host="ci-4081-3-6-2-0999f1dc3d" Sep 12 17:15:11.363482 containerd[1474]: 2025-09-12 17:15:11.292 [INFO][4362] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
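[Editor's note] By this point the node's affine block 192.168.48.64/26 has handed out .65 (whisker), .66 (calico-kube-controllers), .67 (csi-node-driver) and .68 (goldmane) in sequence. A quick standard-library check that those all sit inside the /26:

package main

import (
	"fmt"
	"net/netip"
)

func main() {
	// Node-affine IPAM block from the log; a /26 spans 64 addresses,
	// 192.168.48.64 through 192.168.48.127.
	block := netip.MustParsePrefix("192.168.48.64/26")
	assigned := []string{
		"192.168.48.65", // whisker-78fb8c774f-mnjlm
		"192.168.48.66", // calico-kube-controllers-6fff4fc9-pkdxs
		"192.168.48.67", // csi-node-driver-krpgf
		"192.168.48.68", // goldmane-54d579b49d-fnsjh
	}
	for _, s := range assigned {
		fmt.Printf("%s in %s: %v\n", s, block, block.Contains(netip.MustParseAddr(s)))
	}
}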
Sep 12 17:15:11.363482 containerd[1474]: 2025-09-12 17:15:11.292 [INFO][4362] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.48.68/26] IPv6=[] ContainerID="591a64a3fdf996e8c3d680f0698e12606385018435cee65b50db99b9a3df99dc" HandleID="k8s-pod-network.591a64a3fdf996e8c3d680f0698e12606385018435cee65b50db99b9a3df99dc" Workload="ci--4081--3--6--2--0999f1dc3d-k8s-goldmane--54d579b49d--fnsjh-eth0" Sep 12 17:15:11.365857 containerd[1474]: 2025-09-12 17:15:11.307 [INFO][4318] cni-plugin/k8s.go 418: Populated endpoint ContainerID="591a64a3fdf996e8c3d680f0698e12606385018435cee65b50db99b9a3df99dc" Namespace="calico-system" Pod="goldmane-54d579b49d-fnsjh" WorkloadEndpoint="ci--4081--3--6--2--0999f1dc3d-k8s-goldmane--54d579b49d--fnsjh-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--2--0999f1dc3d-k8s-goldmane--54d579b49d--fnsjh-eth0", GenerateName:"goldmane-54d579b49d-", Namespace:"calico-system", SelfLink:"", UID:"3b22a780-bb41-4d2d-9dad-2beb095f9e2c", ResourceVersion:"928", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 14, 47, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"54d579b49d", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-2-0999f1dc3d", ContainerID:"", Pod:"goldmane-54d579b49d-fnsjh", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.48.68/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"calia8e96e15d55", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:15:11.365857 containerd[1474]: 2025-09-12 17:15:11.307 [INFO][4318] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.48.68/32] ContainerID="591a64a3fdf996e8c3d680f0698e12606385018435cee65b50db99b9a3df99dc" Namespace="calico-system" Pod="goldmane-54d579b49d-fnsjh" WorkloadEndpoint="ci--4081--3--6--2--0999f1dc3d-k8s-goldmane--54d579b49d--fnsjh-eth0" Sep 12 17:15:11.365857 containerd[1474]: 2025-09-12 17:15:11.307 [INFO][4318] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calia8e96e15d55 ContainerID="591a64a3fdf996e8c3d680f0698e12606385018435cee65b50db99b9a3df99dc" Namespace="calico-system" Pod="goldmane-54d579b49d-fnsjh" WorkloadEndpoint="ci--4081--3--6--2--0999f1dc3d-k8s-goldmane--54d579b49d--fnsjh-eth0" Sep 12 17:15:11.365857 containerd[1474]: 2025-09-12 17:15:11.325 [INFO][4318] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="591a64a3fdf996e8c3d680f0698e12606385018435cee65b50db99b9a3df99dc" Namespace="calico-system" Pod="goldmane-54d579b49d-fnsjh" WorkloadEndpoint="ci--4081--3--6--2--0999f1dc3d-k8s-goldmane--54d579b49d--fnsjh-eth0" Sep 12 17:15:11.365857 containerd[1474]: 2025-09-12 17:15:11.326 [INFO][4318] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="591a64a3fdf996e8c3d680f0698e12606385018435cee65b50db99b9a3df99dc" 
Namespace="calico-system" Pod="goldmane-54d579b49d-fnsjh" WorkloadEndpoint="ci--4081--3--6--2--0999f1dc3d-k8s-goldmane--54d579b49d--fnsjh-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--2--0999f1dc3d-k8s-goldmane--54d579b49d--fnsjh-eth0", GenerateName:"goldmane-54d579b49d-", Namespace:"calico-system", SelfLink:"", UID:"3b22a780-bb41-4d2d-9dad-2beb095f9e2c", ResourceVersion:"928", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 14, 47, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"54d579b49d", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-2-0999f1dc3d", ContainerID:"591a64a3fdf996e8c3d680f0698e12606385018435cee65b50db99b9a3df99dc", Pod:"goldmane-54d579b49d-fnsjh", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.48.68/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"calia8e96e15d55", MAC:"aa:c7:be:8d:f2:ef", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:15:11.365857 containerd[1474]: 2025-09-12 17:15:11.351 [INFO][4318] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="591a64a3fdf996e8c3d680f0698e12606385018435cee65b50db99b9a3df99dc" Namespace="calico-system" Pod="goldmane-54d579b49d-fnsjh" WorkloadEndpoint="ci--4081--3--6--2--0999f1dc3d-k8s-goldmane--54d579b49d--fnsjh-eth0" Sep 12 17:15:11.378826 containerd[1474]: time="2025-09-12T17:15:11.378363836Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-krpgf,Uid:7c11d9b3-8068-4598-8721-3a4e4f793c52,Namespace:calico-system,Attempt:1,} returns sandbox id \"3003849266b76e7168cdfde98b1fd3830e01312968a4c53470e7af0342164f4b\"" Sep 12 17:15:11.404093 containerd[1474]: time="2025-09-12T17:15:11.403664747Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Sep 12 17:15:11.404093 containerd[1474]: time="2025-09-12T17:15:11.403777146Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Sep 12 17:15:11.404093 containerd[1474]: time="2025-09-12T17:15:11.403796865Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 12 17:15:11.404093 containerd[1474]: time="2025-09-12T17:15:11.403914744Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 12 17:15:11.434385 systemd[1]: Started cri-containerd-591a64a3fdf996e8c3d680f0698e12606385018435cee65b50db99b9a3df99dc.scope - libcontainer container 591a64a3fdf996e8c3d680f0698e12606385018435cee65b50db99b9a3df99dc. 
Sep 12 17:15:11.490568 containerd[1474]: time="2025-09-12T17:15:11.490500483Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-fnsjh,Uid:3b22a780-bb41-4d2d-9dad-2beb095f9e2c,Namespace:calico-system,Attempt:1,} returns sandbox id \"591a64a3fdf996e8c3d680f0698e12606385018435cee65b50db99b9a3df99dc\"" Sep 12 17:15:11.535695 systemd-networkd[1374]: vxlan.calico: Gained IPv6LL Sep 12 17:15:11.586162 containerd[1474]: time="2025-09-12T17:15:11.586099331Z" level=info msg="StopPodSandbox for \"dfd10de400f5da19bed4ce9b6c95b063f73831955d4dfd785ae24c739e1e6a3b\"" Sep 12 17:15:11.725434 containerd[1474]: 2025-09-12 17:15:11.663 [INFO][4483] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="dfd10de400f5da19bed4ce9b6c95b063f73831955d4dfd785ae24c739e1e6a3b" Sep 12 17:15:11.725434 containerd[1474]: 2025-09-12 17:15:11.664 [INFO][4483] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="dfd10de400f5da19bed4ce9b6c95b063f73831955d4dfd785ae24c739e1e6a3b" iface="eth0" netns="/var/run/netns/cni-755eae99-9f94-460c-efad-f6a3dd1fbeed" Sep 12 17:15:11.725434 containerd[1474]: 2025-09-12 17:15:11.665 [INFO][4483] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="dfd10de400f5da19bed4ce9b6c95b063f73831955d4dfd785ae24c739e1e6a3b" iface="eth0" netns="/var/run/netns/cni-755eae99-9f94-460c-efad-f6a3dd1fbeed" Sep 12 17:15:11.725434 containerd[1474]: 2025-09-12 17:15:11.666 [INFO][4483] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. ContainerID="dfd10de400f5da19bed4ce9b6c95b063f73831955d4dfd785ae24c739e1e6a3b" iface="eth0" netns="/var/run/netns/cni-755eae99-9f94-460c-efad-f6a3dd1fbeed" Sep 12 17:15:11.725434 containerd[1474]: 2025-09-12 17:15:11.667 [INFO][4483] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="dfd10de400f5da19bed4ce9b6c95b063f73831955d4dfd785ae24c739e1e6a3b" Sep 12 17:15:11.725434 containerd[1474]: 2025-09-12 17:15:11.667 [INFO][4483] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="dfd10de400f5da19bed4ce9b6c95b063f73831955d4dfd785ae24c739e1e6a3b" Sep 12 17:15:11.725434 containerd[1474]: 2025-09-12 17:15:11.701 [INFO][4491] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="dfd10de400f5da19bed4ce9b6c95b063f73831955d4dfd785ae24c739e1e6a3b" HandleID="k8s-pod-network.dfd10de400f5da19bed4ce9b6c95b063f73831955d4dfd785ae24c739e1e6a3b" Workload="ci--4081--3--6--2--0999f1dc3d-k8s-coredns--668d6bf9bc--j49g9-eth0" Sep 12 17:15:11.725434 containerd[1474]: 2025-09-12 17:15:11.702 [INFO][4491] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 17:15:11.725434 containerd[1474]: 2025-09-12 17:15:11.702 [INFO][4491] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 12 17:15:11.725434 containerd[1474]: 2025-09-12 17:15:11.717 [WARNING][4491] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="dfd10de400f5da19bed4ce9b6c95b063f73831955d4dfd785ae24c739e1e6a3b" HandleID="k8s-pod-network.dfd10de400f5da19bed4ce9b6c95b063f73831955d4dfd785ae24c739e1e6a3b" Workload="ci--4081--3--6--2--0999f1dc3d-k8s-coredns--668d6bf9bc--j49g9-eth0" Sep 12 17:15:11.725434 containerd[1474]: 2025-09-12 17:15:11.717 [INFO][4491] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="dfd10de400f5da19bed4ce9b6c95b063f73831955d4dfd785ae24c739e1e6a3b" HandleID="k8s-pod-network.dfd10de400f5da19bed4ce9b6c95b063f73831955d4dfd785ae24c739e1e6a3b" Workload="ci--4081--3--6--2--0999f1dc3d-k8s-coredns--668d6bf9bc--j49g9-eth0" Sep 12 17:15:11.725434 containerd[1474]: 2025-09-12 17:15:11.720 [INFO][4491] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 17:15:11.725434 containerd[1474]: 2025-09-12 17:15:11.722 [INFO][4483] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="dfd10de400f5da19bed4ce9b6c95b063f73831955d4dfd785ae24c739e1e6a3b" Sep 12 17:15:11.725434 containerd[1474]: time="2025-09-12T17:15:11.725296224Z" level=info msg="TearDown network for sandbox \"dfd10de400f5da19bed4ce9b6c95b063f73831955d4dfd785ae24c739e1e6a3b\" successfully" Sep 12 17:15:11.725434 containerd[1474]: time="2025-09-12T17:15:11.725339583Z" level=info msg="StopPodSandbox for \"dfd10de400f5da19bed4ce9b6c95b063f73831955d4dfd785ae24c739e1e6a3b\" returns successfully" Sep 12 17:15:11.727210 containerd[1474]: time="2025-09-12T17:15:11.726558886Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-j49g9,Uid:59f64a22-a697-45df-a931-7e5ec9919a4c,Namespace:kube-system,Attempt:1,}" Sep 12 17:15:11.812636 systemd[1]: run-netns-cni\x2d755eae99\x2d9f94\x2d460c\x2defad\x2df6a3dd1fbeed.mount: Deactivated successfully. Sep 12 17:15:11.980192 systemd-networkd[1374]: cali6ecd77912b3: Link UP Sep 12 17:15:11.980510 systemd-networkd[1374]: cali6ecd77912b3: Gained carrier Sep 12 17:15:12.030120 containerd[1474]: 2025-09-12 17:15:11.815 [INFO][4497] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4081--3--6--2--0999f1dc3d-k8s-coredns--668d6bf9bc--j49g9-eth0 coredns-668d6bf9bc- kube-system 59f64a22-a697-45df-a931-7e5ec9919a4c 939 0 2025-09-12 17:14:24 +0000 UTC map[k8s-app:kube-dns pod-template-hash:668d6bf9bc projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4081-3-6-2-0999f1dc3d coredns-668d6bf9bc-j49g9 eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali6ecd77912b3 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="64f8a67d06a906d37de2f14a2f5f7f7247fdbe2a09aae7f83662535c97f07dfd" Namespace="kube-system" Pod="coredns-668d6bf9bc-j49g9" WorkloadEndpoint="ci--4081--3--6--2--0999f1dc3d-k8s-coredns--668d6bf9bc--j49g9-" Sep 12 17:15:12.030120 containerd[1474]: 2025-09-12 17:15:11.816 [INFO][4497] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="64f8a67d06a906d37de2f14a2f5f7f7247fdbe2a09aae7f83662535c97f07dfd" Namespace="kube-system" Pod="coredns-668d6bf9bc-j49g9" WorkloadEndpoint="ci--4081--3--6--2--0999f1dc3d-k8s-coredns--668d6bf9bc--j49g9-eth0" Sep 12 17:15:12.030120 containerd[1474]: 2025-09-12 17:15:11.875 [INFO][4510] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="64f8a67d06a906d37de2f14a2f5f7f7247fdbe2a09aae7f83662535c97f07dfd" HandleID="k8s-pod-network.64f8a67d06a906d37de2f14a2f5f7f7247fdbe2a09aae7f83662535c97f07dfd" 
Workload="ci--4081--3--6--2--0999f1dc3d-k8s-coredns--668d6bf9bc--j49g9-eth0" Sep 12 17:15:12.030120 containerd[1474]: 2025-09-12 17:15:11.875 [INFO][4510] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="64f8a67d06a906d37de2f14a2f5f7f7247fdbe2a09aae7f83662535c97f07dfd" HandleID="k8s-pod-network.64f8a67d06a906d37de2f14a2f5f7f7247fdbe2a09aae7f83662535c97f07dfd" Workload="ci--4081--3--6--2--0999f1dc3d-k8s-coredns--668d6bf9bc--j49g9-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400024b220), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4081-3-6-2-0999f1dc3d", "pod":"coredns-668d6bf9bc-j49g9", "timestamp":"2025-09-12 17:15:11.874323974 +0000 UTC"}, Hostname:"ci-4081-3-6-2-0999f1dc3d", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 17:15:12.030120 containerd[1474]: 2025-09-12 17:15:11.876 [INFO][4510] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 17:15:12.030120 containerd[1474]: 2025-09-12 17:15:11.876 [INFO][4510] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 12 17:15:12.030120 containerd[1474]: 2025-09-12 17:15:11.876 [INFO][4510] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4081-3-6-2-0999f1dc3d' Sep 12 17:15:12.030120 containerd[1474]: 2025-09-12 17:15:11.895 [INFO][4510] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.64f8a67d06a906d37de2f14a2f5f7f7247fdbe2a09aae7f83662535c97f07dfd" host="ci-4081-3-6-2-0999f1dc3d" Sep 12 17:15:12.030120 containerd[1474]: 2025-09-12 17:15:11.911 [INFO][4510] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4081-3-6-2-0999f1dc3d" Sep 12 17:15:12.030120 containerd[1474]: 2025-09-12 17:15:11.923 [INFO][4510] ipam/ipam.go 511: Trying affinity for 192.168.48.64/26 host="ci-4081-3-6-2-0999f1dc3d" Sep 12 17:15:12.030120 containerd[1474]: 2025-09-12 17:15:11.928 [INFO][4510] ipam/ipam.go 158: Attempting to load block cidr=192.168.48.64/26 host="ci-4081-3-6-2-0999f1dc3d" Sep 12 17:15:12.030120 containerd[1474]: 2025-09-12 17:15:11.935 [INFO][4510] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.48.64/26 host="ci-4081-3-6-2-0999f1dc3d" Sep 12 17:15:12.030120 containerd[1474]: 2025-09-12 17:15:11.935 [INFO][4510] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.48.64/26 handle="k8s-pod-network.64f8a67d06a906d37de2f14a2f5f7f7247fdbe2a09aae7f83662535c97f07dfd" host="ci-4081-3-6-2-0999f1dc3d" Sep 12 17:15:12.030120 containerd[1474]: 2025-09-12 17:15:11.941 [INFO][4510] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.64f8a67d06a906d37de2f14a2f5f7f7247fdbe2a09aae7f83662535c97f07dfd Sep 12 17:15:12.030120 containerd[1474]: 2025-09-12 17:15:11.952 [INFO][4510] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.48.64/26 handle="k8s-pod-network.64f8a67d06a906d37de2f14a2f5f7f7247fdbe2a09aae7f83662535c97f07dfd" host="ci-4081-3-6-2-0999f1dc3d" Sep 12 17:15:12.030120 containerd[1474]: 2025-09-12 17:15:11.970 [INFO][4510] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.48.69/26] block=192.168.48.64/26 handle="k8s-pod-network.64f8a67d06a906d37de2f14a2f5f7f7247fdbe2a09aae7f83662535c97f07dfd" host="ci-4081-3-6-2-0999f1dc3d" Sep 12 17:15:12.030120 containerd[1474]: 2025-09-12 17:15:11.970 [INFO][4510] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.48.69/26] 
handle="k8s-pod-network.64f8a67d06a906d37de2f14a2f5f7f7247fdbe2a09aae7f83662535c97f07dfd" host="ci-4081-3-6-2-0999f1dc3d" Sep 12 17:15:12.030120 containerd[1474]: 2025-09-12 17:15:11.970 [INFO][4510] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 17:15:12.030120 containerd[1474]: 2025-09-12 17:15:11.970 [INFO][4510] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.48.69/26] IPv6=[] ContainerID="64f8a67d06a906d37de2f14a2f5f7f7247fdbe2a09aae7f83662535c97f07dfd" HandleID="k8s-pod-network.64f8a67d06a906d37de2f14a2f5f7f7247fdbe2a09aae7f83662535c97f07dfd" Workload="ci--4081--3--6--2--0999f1dc3d-k8s-coredns--668d6bf9bc--j49g9-eth0" Sep 12 17:15:12.031545 containerd[1474]: 2025-09-12 17:15:11.974 [INFO][4497] cni-plugin/k8s.go 418: Populated endpoint ContainerID="64f8a67d06a906d37de2f14a2f5f7f7247fdbe2a09aae7f83662535c97f07dfd" Namespace="kube-system" Pod="coredns-668d6bf9bc-j49g9" WorkloadEndpoint="ci--4081--3--6--2--0999f1dc3d-k8s-coredns--668d6bf9bc--j49g9-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--2--0999f1dc3d-k8s-coredns--668d6bf9bc--j49g9-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"59f64a22-a697-45df-a931-7e5ec9919a4c", ResourceVersion:"939", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 14, 24, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-2-0999f1dc3d", ContainerID:"", Pod:"coredns-668d6bf9bc-j49g9", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.48.69/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali6ecd77912b3", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:15:12.031545 containerd[1474]: 2025-09-12 17:15:11.974 [INFO][4497] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.48.69/32] ContainerID="64f8a67d06a906d37de2f14a2f5f7f7247fdbe2a09aae7f83662535c97f07dfd" Namespace="kube-system" Pod="coredns-668d6bf9bc-j49g9" WorkloadEndpoint="ci--4081--3--6--2--0999f1dc3d-k8s-coredns--668d6bf9bc--j49g9-eth0" Sep 12 17:15:12.031545 containerd[1474]: 2025-09-12 17:15:11.974 [INFO][4497] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali6ecd77912b3 ContainerID="64f8a67d06a906d37de2f14a2f5f7f7247fdbe2a09aae7f83662535c97f07dfd" Namespace="kube-system" Pod="coredns-668d6bf9bc-j49g9" 
WorkloadEndpoint="ci--4081--3--6--2--0999f1dc3d-k8s-coredns--668d6bf9bc--j49g9-eth0" Sep 12 17:15:12.031545 containerd[1474]: 2025-09-12 17:15:11.982 [INFO][4497] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="64f8a67d06a906d37de2f14a2f5f7f7247fdbe2a09aae7f83662535c97f07dfd" Namespace="kube-system" Pod="coredns-668d6bf9bc-j49g9" WorkloadEndpoint="ci--4081--3--6--2--0999f1dc3d-k8s-coredns--668d6bf9bc--j49g9-eth0" Sep 12 17:15:12.031545 containerd[1474]: 2025-09-12 17:15:11.987 [INFO][4497] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="64f8a67d06a906d37de2f14a2f5f7f7247fdbe2a09aae7f83662535c97f07dfd" Namespace="kube-system" Pod="coredns-668d6bf9bc-j49g9" WorkloadEndpoint="ci--4081--3--6--2--0999f1dc3d-k8s-coredns--668d6bf9bc--j49g9-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--2--0999f1dc3d-k8s-coredns--668d6bf9bc--j49g9-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"59f64a22-a697-45df-a931-7e5ec9919a4c", ResourceVersion:"939", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 14, 24, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-2-0999f1dc3d", ContainerID:"64f8a67d06a906d37de2f14a2f5f7f7247fdbe2a09aae7f83662535c97f07dfd", Pod:"coredns-668d6bf9bc-j49g9", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.48.69/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali6ecd77912b3", MAC:"aa:01:04:9b:0b:69", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:15:12.031545 containerd[1474]: 2025-09-12 17:15:12.017 [INFO][4497] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="64f8a67d06a906d37de2f14a2f5f7f7247fdbe2a09aae7f83662535c97f07dfd" Namespace="kube-system" Pod="coredns-668d6bf9bc-j49g9" WorkloadEndpoint="ci--4081--3--6--2--0999f1dc3d-k8s-coredns--668d6bf9bc--j49g9-eth0" Sep 12 17:15:12.045525 systemd-networkd[1374]: cali82790c92204: Gained IPv6LL Sep 12 17:15:12.083017 containerd[1474]: time="2025-09-12T17:15:12.081499020Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Sep 12 17:15:12.083017 containerd[1474]: time="2025-09-12T17:15:12.081587618Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Sep 12 17:15:12.083017 containerd[1474]: time="2025-09-12T17:15:12.081619898Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 12 17:15:12.083017 containerd[1474]: time="2025-09-12T17:15:12.081767736Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 12 17:15:12.138006 systemd[1]: Started cri-containerd-64f8a67d06a906d37de2f14a2f5f7f7247fdbe2a09aae7f83662535c97f07dfd.scope - libcontainer container 64f8a67d06a906d37de2f14a2f5f7f7247fdbe2a09aae7f83662535c97f07dfd. Sep 12 17:15:12.211890 containerd[1474]: time="2025-09-12T17:15:12.211398044Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-j49g9,Uid:59f64a22-a697-45df-a931-7e5ec9919a4c,Namespace:kube-system,Attempt:1,} returns sandbox id \"64f8a67d06a906d37de2f14a2f5f7f7247fdbe2a09aae7f83662535c97f07dfd\"" Sep 12 17:15:12.220951 containerd[1474]: time="2025-09-12T17:15:12.220686632Z" level=info msg="CreateContainer within sandbox \"64f8a67d06a906d37de2f14a2f5f7f7247fdbe2a09aae7f83662535c97f07dfd\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Sep 12 17:15:12.246324 kubelet[2556]: I0912 17:15:12.245958 2556 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 12 17:15:12.256016 containerd[1474]: time="2025-09-12T17:15:12.255606373Z" level=info msg="CreateContainer within sandbox \"64f8a67d06a906d37de2f14a2f5f7f7247fdbe2a09aae7f83662535c97f07dfd\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"a708c5cf67757309cfa2bcd96c50f71614d67669d4f695c1665bc3764a984b83\"" Sep 12 17:15:12.259229 containerd[1474]: time="2025-09-12T17:15:12.257561265Z" level=info msg="StartContainer for \"a708c5cf67757309cfa2bcd96c50f71614d67669d4f695c1665bc3764a984b83\"" Sep 12 17:15:12.327489 systemd[1]: Started cri-containerd-a708c5cf67757309cfa2bcd96c50f71614d67669d4f695c1665bc3764a984b83.scope - libcontainer container a708c5cf67757309cfa2bcd96c50f71614d67669d4f695c1665bc3764a984b83. Sep 12 17:15:12.366513 systemd-networkd[1374]: calia8e96e15d55: Gained IPv6LL Sep 12 17:15:12.422281 containerd[1474]: time="2025-09-12T17:15:12.422110794Z" level=info msg="StartContainer for \"a708c5cf67757309cfa2bcd96c50f71614d67669d4f695c1665bc3764a984b83\" returns successfully" Sep 12 17:15:12.577954 containerd[1474]: time="2025-09-12T17:15:12.576935183Z" level=info msg="StopPodSandbox for \"2667df64706e8d1b6bd4eb93f79e0d8cd65a3f485ac3cd76b9174af9b4f5592a\"" Sep 12 17:15:12.917187 containerd[1474]: 2025-09-12 17:15:12.792 [INFO][4657] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="2667df64706e8d1b6bd4eb93f79e0d8cd65a3f485ac3cd76b9174af9b4f5592a" Sep 12 17:15:12.917187 containerd[1474]: 2025-09-12 17:15:12.794 [INFO][4657] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="2667df64706e8d1b6bd4eb93f79e0d8cd65a3f485ac3cd76b9174af9b4f5592a" iface="eth0" netns="/var/run/netns/cni-07312799-89c5-1750-d116-35b2e37f6fa6" Sep 12 17:15:12.917187 containerd[1474]: 2025-09-12 17:15:12.796 [INFO][4657] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. 
ContainerID="2667df64706e8d1b6bd4eb93f79e0d8cd65a3f485ac3cd76b9174af9b4f5592a" iface="eth0" netns="/var/run/netns/cni-07312799-89c5-1750-d116-35b2e37f6fa6" Sep 12 17:15:12.917187 containerd[1474]: 2025-09-12 17:15:12.796 [INFO][4657] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. ContainerID="2667df64706e8d1b6bd4eb93f79e0d8cd65a3f485ac3cd76b9174af9b4f5592a" iface="eth0" netns="/var/run/netns/cni-07312799-89c5-1750-d116-35b2e37f6fa6" Sep 12 17:15:12.917187 containerd[1474]: 2025-09-12 17:15:12.796 [INFO][4657] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="2667df64706e8d1b6bd4eb93f79e0d8cd65a3f485ac3cd76b9174af9b4f5592a" Sep 12 17:15:12.917187 containerd[1474]: 2025-09-12 17:15:12.797 [INFO][4657] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="2667df64706e8d1b6bd4eb93f79e0d8cd65a3f485ac3cd76b9174af9b4f5592a" Sep 12 17:15:12.917187 containerd[1474]: 2025-09-12 17:15:12.871 [INFO][4680] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="2667df64706e8d1b6bd4eb93f79e0d8cd65a3f485ac3cd76b9174af9b4f5592a" HandleID="k8s-pod-network.2667df64706e8d1b6bd4eb93f79e0d8cd65a3f485ac3cd76b9174af9b4f5592a" Workload="ci--4081--3--6--2--0999f1dc3d-k8s-calico--apiserver--7df9fb7db--7p65c-eth0" Sep 12 17:15:12.917187 containerd[1474]: 2025-09-12 17:15:12.872 [INFO][4680] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 17:15:12.917187 containerd[1474]: 2025-09-12 17:15:12.872 [INFO][4680] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 12 17:15:12.917187 containerd[1474]: 2025-09-12 17:15:12.897 [WARNING][4680] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="2667df64706e8d1b6bd4eb93f79e0d8cd65a3f485ac3cd76b9174af9b4f5592a" HandleID="k8s-pod-network.2667df64706e8d1b6bd4eb93f79e0d8cd65a3f485ac3cd76b9174af9b4f5592a" Workload="ci--4081--3--6--2--0999f1dc3d-k8s-calico--apiserver--7df9fb7db--7p65c-eth0" Sep 12 17:15:12.917187 containerd[1474]: 2025-09-12 17:15:12.897 [INFO][4680] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="2667df64706e8d1b6bd4eb93f79e0d8cd65a3f485ac3cd76b9174af9b4f5592a" HandleID="k8s-pod-network.2667df64706e8d1b6bd4eb93f79e0d8cd65a3f485ac3cd76b9174af9b4f5592a" Workload="ci--4081--3--6--2--0999f1dc3d-k8s-calico--apiserver--7df9fb7db--7p65c-eth0" Sep 12 17:15:12.917187 containerd[1474]: 2025-09-12 17:15:12.903 [INFO][4680] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 17:15:12.917187 containerd[1474]: 2025-09-12 17:15:12.912 [INFO][4657] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="2667df64706e8d1b6bd4eb93f79e0d8cd65a3f485ac3cd76b9174af9b4f5592a" Sep 12 17:15:12.918212 containerd[1474]: time="2025-09-12T17:15:12.917330161Z" level=info msg="TearDown network for sandbox \"2667df64706e8d1b6bd4eb93f79e0d8cd65a3f485ac3cd76b9174af9b4f5592a\" successfully" Sep 12 17:15:12.918212 containerd[1474]: time="2025-09-12T17:15:12.917372600Z" level=info msg="StopPodSandbox for \"2667df64706e8d1b6bd4eb93f79e0d8cd65a3f485ac3cd76b9174af9b4f5592a\" returns successfully" Sep 12 17:15:12.923568 systemd[1]: run-netns-cni\x2d07312799\x2d89c5\x2d1750\x2dd116\x2d35b2e37f6fa6.mount: Deactivated successfully. 
Sep 12 17:15:12.924578 containerd[1474]: time="2025-09-12T17:15:12.923864428Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7df9fb7db-7p65c,Uid:8a36e632-808a-47ed-b4e3-b9bcf0459096,Namespace:calico-apiserver,Attempt:1,}" Sep 12 17:15:13.005570 systemd-networkd[1374]: cali556a08f9179: Gained IPv6LL Sep 12 17:15:13.113100 kubelet[2556]: I0912 17:15:13.112924 2556 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-668d6bf9bc-j49g9" podStartSLOduration=49.112888437 podStartE2EDuration="49.112888437s" podCreationTimestamp="2025-09-12 17:14:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-12 17:15:13.100878325 +0000 UTC m=+54.763619684" watchObservedRunningTime="2025-09-12 17:15:13.112888437 +0000 UTC m=+54.775629756" Sep 12 17:15:13.135840 systemd-networkd[1374]: cali6ecd77912b3: Gained IPv6LL Sep 12 17:15:13.193800 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4289584852.mount: Deactivated successfully. Sep 12 17:15:13.244983 containerd[1474]: time="2025-09-12T17:15:13.244904585Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:15:13.247959 containerd[1474]: time="2025-09-12T17:15:13.247862784Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.3: active requests=0, bytes read=30823700" Sep 12 17:15:13.251387 containerd[1474]: time="2025-09-12T17:15:13.250993100Z" level=info msg="ImageCreate event name:\"sha256:e210e86234bc99f018431b30477c5ca2ad6f7ecf67ef011498f7beb48fb0b21f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:15:13.262683 containerd[1474]: time="2025-09-12T17:15:13.261964146Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend@sha256:29becebc47401da9997a2a30f4c25c511a5f379d17275680b048224829af71a5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:15:13.265537 containerd[1474]: time="2025-09-12T17:15:13.265441457Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" with image id \"sha256:e210e86234bc99f018431b30477c5ca2ad6f7ecf67ef011498f7beb48fb0b21f\", repo tag \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/whisker-backend@sha256:29becebc47401da9997a2a30f4c25c511a5f379d17275680b048224829af71a5\", size \"30823530\" in 4.866041447s" Sep 12 17:15:13.266101 containerd[1474]: time="2025-09-12T17:15:13.265607575Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" returns image reference \"sha256:e210e86234bc99f018431b30477c5ca2ad6f7ecf67ef011498f7beb48fb0b21f\"" Sep 12 17:15:13.275820 containerd[1474]: time="2025-09-12T17:15:13.275739753Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\"" Sep 12 17:15:13.279778 containerd[1474]: time="2025-09-12T17:15:13.279248064Z" level=info msg="CreateContainer within sandbox \"b2e211a53accc6fddc12dedc4c1a07dd048e0f1b40d2f7384c63668d54bc004a\" for container &ContainerMetadata{Name:whisker-backend,Attempt:0,}" Sep 12 17:15:13.406547 containerd[1474]: time="2025-09-12T17:15:13.406465920Z" level=info msg="CreateContainer within sandbox \"b2e211a53accc6fddc12dedc4c1a07dd048e0f1b40d2f7384c63668d54bc004a\" for &ContainerMetadata{Name:whisker-backend,Attempt:0,} returns container id 
\"9ea9b93059466ef78e187cadb52c10a504272cb3ee36b9d7fe09d26853b9a859\"" Sep 12 17:15:13.410668 containerd[1474]: time="2025-09-12T17:15:13.410567902Z" level=info msg="StartContainer for \"9ea9b93059466ef78e187cadb52c10a504272cb3ee36b9d7fe09d26853b9a859\"" Sep 12 17:15:13.491554 systemd[1]: Started cri-containerd-9ea9b93059466ef78e187cadb52c10a504272cb3ee36b9d7fe09d26853b9a859.scope - libcontainer container 9ea9b93059466ef78e187cadb52c10a504272cb3ee36b9d7fe09d26853b9a859. Sep 12 17:15:13.498244 systemd-networkd[1374]: calib44e249471e: Link UP Sep 12 17:15:13.501486 systemd-networkd[1374]: calib44e249471e: Gained carrier Sep 12 17:15:13.539905 containerd[1474]: 2025-09-12 17:15:13.234 [INFO][4686] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4081--3--6--2--0999f1dc3d-k8s-calico--apiserver--7df9fb7db--7p65c-eth0 calico-apiserver-7df9fb7db- calico-apiserver 8a36e632-808a-47ed-b4e3-b9bcf0459096 952 0 2025-09-12 17:14:41 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:7df9fb7db projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4081-3-6-2-0999f1dc3d calico-apiserver-7df9fb7db-7p65c eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calib44e249471e [] [] }} ContainerID="dc8199d51d223fa210f25174d49c024ce3c37f44fc96832e965fbabdf016e744" Namespace="calico-apiserver" Pod="calico-apiserver-7df9fb7db-7p65c" WorkloadEndpoint="ci--4081--3--6--2--0999f1dc3d-k8s-calico--apiserver--7df9fb7db--7p65c-" Sep 12 17:15:13.539905 containerd[1474]: 2025-09-12 17:15:13.235 [INFO][4686] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="dc8199d51d223fa210f25174d49c024ce3c37f44fc96832e965fbabdf016e744" Namespace="calico-apiserver" Pod="calico-apiserver-7df9fb7db-7p65c" WorkloadEndpoint="ci--4081--3--6--2--0999f1dc3d-k8s-calico--apiserver--7df9fb7db--7p65c-eth0" Sep 12 17:15:13.539905 containerd[1474]: 2025-09-12 17:15:13.330 [INFO][4704] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="dc8199d51d223fa210f25174d49c024ce3c37f44fc96832e965fbabdf016e744" HandleID="k8s-pod-network.dc8199d51d223fa210f25174d49c024ce3c37f44fc96832e965fbabdf016e744" Workload="ci--4081--3--6--2--0999f1dc3d-k8s-calico--apiserver--7df9fb7db--7p65c-eth0" Sep 12 17:15:13.539905 containerd[1474]: 2025-09-12 17:15:13.332 [INFO][4704] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="dc8199d51d223fa210f25174d49c024ce3c37f44fc96832e965fbabdf016e744" HandleID="k8s-pod-network.dc8199d51d223fa210f25174d49c024ce3c37f44fc96832e965fbabdf016e744" Workload="ci--4081--3--6--2--0999f1dc3d-k8s-calico--apiserver--7df9fb7db--7p65c-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400030e3c0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4081-3-6-2-0999f1dc3d", "pod":"calico-apiserver-7df9fb7db-7p65c", "timestamp":"2025-09-12 17:15:13.330320627 +0000 UTC"}, Hostname:"ci-4081-3-6-2-0999f1dc3d", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 17:15:13.539905 containerd[1474]: 2025-09-12 17:15:13.332 [INFO][4704] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. 
Sep 12 17:15:13.539905 containerd[1474]: 2025-09-12 17:15:13.332 [INFO][4704] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 12 17:15:13.539905 containerd[1474]: 2025-09-12 17:15:13.332 [INFO][4704] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4081-3-6-2-0999f1dc3d' Sep 12 17:15:13.539905 containerd[1474]: 2025-09-12 17:15:13.365 [INFO][4704] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.dc8199d51d223fa210f25174d49c024ce3c37f44fc96832e965fbabdf016e744" host="ci-4081-3-6-2-0999f1dc3d" Sep 12 17:15:13.539905 containerd[1474]: 2025-09-12 17:15:13.378 [INFO][4704] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4081-3-6-2-0999f1dc3d" Sep 12 17:15:13.539905 containerd[1474]: 2025-09-12 17:15:13.398 [INFO][4704] ipam/ipam.go 511: Trying affinity for 192.168.48.64/26 host="ci-4081-3-6-2-0999f1dc3d" Sep 12 17:15:13.539905 containerd[1474]: 2025-09-12 17:15:13.405 [INFO][4704] ipam/ipam.go 158: Attempting to load block cidr=192.168.48.64/26 host="ci-4081-3-6-2-0999f1dc3d" Sep 12 17:15:13.539905 containerd[1474]: 2025-09-12 17:15:13.422 [INFO][4704] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.48.64/26 host="ci-4081-3-6-2-0999f1dc3d" Sep 12 17:15:13.539905 containerd[1474]: 2025-09-12 17:15:13.422 [INFO][4704] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.48.64/26 handle="k8s-pod-network.dc8199d51d223fa210f25174d49c024ce3c37f44fc96832e965fbabdf016e744" host="ci-4081-3-6-2-0999f1dc3d" Sep 12 17:15:13.539905 containerd[1474]: 2025-09-12 17:15:13.427 [INFO][4704] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.dc8199d51d223fa210f25174d49c024ce3c37f44fc96832e965fbabdf016e744 Sep 12 17:15:13.539905 containerd[1474]: 2025-09-12 17:15:13.439 [INFO][4704] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.48.64/26 handle="k8s-pod-network.dc8199d51d223fa210f25174d49c024ce3c37f44fc96832e965fbabdf016e744" host="ci-4081-3-6-2-0999f1dc3d" Sep 12 17:15:13.539905 containerd[1474]: 2025-09-12 17:15:13.462 [INFO][4704] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.48.70/26] block=192.168.48.64/26 handle="k8s-pod-network.dc8199d51d223fa210f25174d49c024ce3c37f44fc96832e965fbabdf016e744" host="ci-4081-3-6-2-0999f1dc3d" Sep 12 17:15:13.539905 containerd[1474]: 2025-09-12 17:15:13.462 [INFO][4704] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.48.70/26] handle="k8s-pod-network.dc8199d51d223fa210f25174d49c024ce3c37f44fc96832e965fbabdf016e744" host="ci-4081-3-6-2-0999f1dc3d" Sep 12 17:15:13.539905 containerd[1474]: 2025-09-12 17:15:13.462 [INFO][4704] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 12 17:15:13.539905 containerd[1474]: 2025-09-12 17:15:13.462 [INFO][4704] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.48.70/26] IPv6=[] ContainerID="dc8199d51d223fa210f25174d49c024ce3c37f44fc96832e965fbabdf016e744" HandleID="k8s-pod-network.dc8199d51d223fa210f25174d49c024ce3c37f44fc96832e965fbabdf016e744" Workload="ci--4081--3--6--2--0999f1dc3d-k8s-calico--apiserver--7df9fb7db--7p65c-eth0" Sep 12 17:15:13.540645 containerd[1474]: 2025-09-12 17:15:13.483 [INFO][4686] cni-plugin/k8s.go 418: Populated endpoint ContainerID="dc8199d51d223fa210f25174d49c024ce3c37f44fc96832e965fbabdf016e744" Namespace="calico-apiserver" Pod="calico-apiserver-7df9fb7db-7p65c" WorkloadEndpoint="ci--4081--3--6--2--0999f1dc3d-k8s-calico--apiserver--7df9fb7db--7p65c-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--2--0999f1dc3d-k8s-calico--apiserver--7df9fb7db--7p65c-eth0", GenerateName:"calico-apiserver-7df9fb7db-", Namespace:"calico-apiserver", SelfLink:"", UID:"8a36e632-808a-47ed-b4e3-b9bcf0459096", ResourceVersion:"952", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 14, 41, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"7df9fb7db", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-2-0999f1dc3d", ContainerID:"", Pod:"calico-apiserver-7df9fb7db-7p65c", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.48.70/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calib44e249471e", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:15:13.540645 containerd[1474]: 2025-09-12 17:15:13.483 [INFO][4686] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.48.70/32] ContainerID="dc8199d51d223fa210f25174d49c024ce3c37f44fc96832e965fbabdf016e744" Namespace="calico-apiserver" Pod="calico-apiserver-7df9fb7db-7p65c" WorkloadEndpoint="ci--4081--3--6--2--0999f1dc3d-k8s-calico--apiserver--7df9fb7db--7p65c-eth0" Sep 12 17:15:13.540645 containerd[1474]: 2025-09-12 17:15:13.483 [INFO][4686] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calib44e249471e ContainerID="dc8199d51d223fa210f25174d49c024ce3c37f44fc96832e965fbabdf016e744" Namespace="calico-apiserver" Pod="calico-apiserver-7df9fb7db-7p65c" WorkloadEndpoint="ci--4081--3--6--2--0999f1dc3d-k8s-calico--apiserver--7df9fb7db--7p65c-eth0" Sep 12 17:15:13.540645 containerd[1474]: 2025-09-12 17:15:13.501 [INFO][4686] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="dc8199d51d223fa210f25174d49c024ce3c37f44fc96832e965fbabdf016e744" Namespace="calico-apiserver" Pod="calico-apiserver-7df9fb7db-7p65c" WorkloadEndpoint="ci--4081--3--6--2--0999f1dc3d-k8s-calico--apiserver--7df9fb7db--7p65c-eth0" Sep 12 17:15:13.540645 containerd[1474]: 2025-09-12 17:15:13.508 
[INFO][4686] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="dc8199d51d223fa210f25174d49c024ce3c37f44fc96832e965fbabdf016e744" Namespace="calico-apiserver" Pod="calico-apiserver-7df9fb7db-7p65c" WorkloadEndpoint="ci--4081--3--6--2--0999f1dc3d-k8s-calico--apiserver--7df9fb7db--7p65c-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--2--0999f1dc3d-k8s-calico--apiserver--7df9fb7db--7p65c-eth0", GenerateName:"calico-apiserver-7df9fb7db-", Namespace:"calico-apiserver", SelfLink:"", UID:"8a36e632-808a-47ed-b4e3-b9bcf0459096", ResourceVersion:"952", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 14, 41, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"7df9fb7db", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-2-0999f1dc3d", ContainerID:"dc8199d51d223fa210f25174d49c024ce3c37f44fc96832e965fbabdf016e744", Pod:"calico-apiserver-7df9fb7db-7p65c", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.48.70/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calib44e249471e", MAC:"42:16:3d:e0:f8:0e", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:15:13.540645 containerd[1474]: 2025-09-12 17:15:13.534 [INFO][4686] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="dc8199d51d223fa210f25174d49c024ce3c37f44fc96832e965fbabdf016e744" Namespace="calico-apiserver" Pod="calico-apiserver-7df9fb7db-7p65c" WorkloadEndpoint="ci--4081--3--6--2--0999f1dc3d-k8s-calico--apiserver--7df9fb7db--7p65c-eth0" Sep 12 17:15:13.579501 containerd[1474]: time="2025-09-12T17:15:13.577124526Z" level=info msg="StopPodSandbox for \"22e3cdc6e9d2964a355f2abb708065e2ade653e106e6d210166694bab564d208\"" Sep 12 17:15:13.622131 containerd[1474]: time="2025-09-12T17:15:13.621930258Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Sep 12 17:15:13.622131 containerd[1474]: time="2025-09-12T17:15:13.622042656Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Sep 12 17:15:13.623513 containerd[1474]: time="2025-09-12T17:15:13.622102736Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 12 17:15:13.623513 containerd[1474]: time="2025-09-12T17:15:13.622249974Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 12 17:15:13.720942 systemd[1]: Started cri-containerd-dc8199d51d223fa210f25174d49c024ce3c37f44fc96832e965fbabdf016e744.scope - libcontainer container dc8199d51d223fa210f25174d49c024ce3c37f44fc96832e965fbabdf016e744. Sep 12 17:15:13.726001 containerd[1474]: time="2025-09-12T17:15:13.725929280Z" level=info msg="StartContainer for \"9ea9b93059466ef78e187cadb52c10a504272cb3ee36b9d7fe09d26853b9a859\" returns successfully" Sep 12 17:15:13.922819 containerd[1474]: time="2025-09-12T17:15:13.922603121Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7df9fb7db-7p65c,Uid:8a36e632-808a-47ed-b4e3-b9bcf0459096,Namespace:calico-apiserver,Attempt:1,} returns sandbox id \"dc8199d51d223fa210f25174d49c024ce3c37f44fc96832e965fbabdf016e744\"" Sep 12 17:15:13.934998 containerd[1474]: 2025-09-12 17:15:13.807 [INFO][4772] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="22e3cdc6e9d2964a355f2abb708065e2ade653e106e6d210166694bab564d208" Sep 12 17:15:13.934998 containerd[1474]: 2025-09-12 17:15:13.807 [INFO][4772] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="22e3cdc6e9d2964a355f2abb708065e2ade653e106e6d210166694bab564d208" iface="eth0" netns="/var/run/netns/cni-eb7ead88-0957-4d71-0143-f0d6ef2c3977" Sep 12 17:15:13.934998 containerd[1474]: 2025-09-12 17:15:13.810 [INFO][4772] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="22e3cdc6e9d2964a355f2abb708065e2ade653e106e6d210166694bab564d208" iface="eth0" netns="/var/run/netns/cni-eb7ead88-0957-4d71-0143-f0d6ef2c3977" Sep 12 17:15:13.934998 containerd[1474]: 2025-09-12 17:15:13.811 [INFO][4772] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. ContainerID="22e3cdc6e9d2964a355f2abb708065e2ade653e106e6d210166694bab564d208" iface="eth0" netns="/var/run/netns/cni-eb7ead88-0957-4d71-0143-f0d6ef2c3977" Sep 12 17:15:13.934998 containerd[1474]: 2025-09-12 17:15:13.813 [INFO][4772] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="22e3cdc6e9d2964a355f2abb708065e2ade653e106e6d210166694bab564d208" Sep 12 17:15:13.934998 containerd[1474]: 2025-09-12 17:15:13.813 [INFO][4772] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="22e3cdc6e9d2964a355f2abb708065e2ade653e106e6d210166694bab564d208" Sep 12 17:15:13.934998 containerd[1474]: 2025-09-12 17:15:13.885 [INFO][4808] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="22e3cdc6e9d2964a355f2abb708065e2ade653e106e6d210166694bab564d208" HandleID="k8s-pod-network.22e3cdc6e9d2964a355f2abb708065e2ade653e106e6d210166694bab564d208" Workload="ci--4081--3--6--2--0999f1dc3d-k8s-coredns--668d6bf9bc--pc7pt-eth0" Sep 12 17:15:13.934998 containerd[1474]: 2025-09-12 17:15:13.885 [INFO][4808] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 17:15:13.934998 containerd[1474]: 2025-09-12 17:15:13.885 [INFO][4808] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 12 17:15:13.934998 containerd[1474]: 2025-09-12 17:15:13.914 [WARNING][4808] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="22e3cdc6e9d2964a355f2abb708065e2ade653e106e6d210166694bab564d208" HandleID="k8s-pod-network.22e3cdc6e9d2964a355f2abb708065e2ade653e106e6d210166694bab564d208" Workload="ci--4081--3--6--2--0999f1dc3d-k8s-coredns--668d6bf9bc--pc7pt-eth0" Sep 12 17:15:13.934998 containerd[1474]: 2025-09-12 17:15:13.914 [INFO][4808] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="22e3cdc6e9d2964a355f2abb708065e2ade653e106e6d210166694bab564d208" HandleID="k8s-pod-network.22e3cdc6e9d2964a355f2abb708065e2ade653e106e6d210166694bab564d208" Workload="ci--4081--3--6--2--0999f1dc3d-k8s-coredns--668d6bf9bc--pc7pt-eth0" Sep 12 17:15:13.934998 containerd[1474]: 2025-09-12 17:15:13.923 [INFO][4808] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 17:15:13.934998 containerd[1474]: 2025-09-12 17:15:13.932 [INFO][4772] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="22e3cdc6e9d2964a355f2abb708065e2ade653e106e6d210166694bab564d208" Sep 12 17:15:13.936842 containerd[1474]: time="2025-09-12T17:15:13.936653124Z" level=info msg="TearDown network for sandbox \"22e3cdc6e9d2964a355f2abb708065e2ade653e106e6d210166694bab564d208\" successfully" Sep 12 17:15:13.937032 containerd[1474]: time="2025-09-12T17:15:13.936941880Z" level=info msg="StopPodSandbox for \"22e3cdc6e9d2964a355f2abb708065e2ade653e106e6d210166694bab564d208\" returns successfully" Sep 12 17:15:13.939814 containerd[1474]: time="2025-09-12T17:15:13.938436579Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-pc7pt,Uid:ecddc12a-94e3-4f27-a97d-480a79c66c1a,Namespace:kube-system,Attempt:1,}" Sep 12 17:15:13.943949 systemd[1]: run-netns-cni\x2deb7ead88\x2d0957\x2d4d71\x2d0143\x2df0d6ef2c3977.mount: Deactivated successfully. Sep 12 17:15:14.274163 systemd-networkd[1374]: cali6f2608cc972: Link UP Sep 12 17:15:14.274804 systemd-networkd[1374]: cali6f2608cc972: Gained carrier Sep 12 17:15:14.309951 kubelet[2556]: I0912 17:15:14.307051 2556 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/whisker-78fb8c774f-mnjlm" podStartSLOduration=1.798808938 podStartE2EDuration="10.307008005s" podCreationTimestamp="2025-09-12 17:15:04 +0000 UTC" firstStartedPulling="2025-09-12 17:15:04.765446555 +0000 UTC m=+46.428187914" lastFinishedPulling="2025-09-12 17:15:13.273645622 +0000 UTC m=+54.936386981" observedRunningTime="2025-09-12 17:15:14.162385238 +0000 UTC m=+55.825126597" watchObservedRunningTime="2025-09-12 17:15:14.307008005 +0000 UTC m=+55.969749364" Sep 12 17:15:14.316374 containerd[1474]: 2025-09-12 17:15:14.030 [INFO][4822] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4081--3--6--2--0999f1dc3d-k8s-coredns--668d6bf9bc--pc7pt-eth0 coredns-668d6bf9bc- kube-system ecddc12a-94e3-4f27-a97d-480a79c66c1a 969 0 2025-09-12 17:14:24 +0000 UTC map[k8s-app:kube-dns pod-template-hash:668d6bf9bc projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4081-3-6-2-0999f1dc3d coredns-668d6bf9bc-pc7pt eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali6f2608cc972 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="9d315990e24b7bbb64ef9cdbbe513f65aec60e348d9e5395c24af791792f01bc" Namespace="kube-system" Pod="coredns-668d6bf9bc-pc7pt" WorkloadEndpoint="ci--4081--3--6--2--0999f1dc3d-k8s-coredns--668d6bf9bc--pc7pt-" Sep 12 17:15:14.316374 containerd[1474]: 2025-09-12 17:15:14.031 
[INFO][4822] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="9d315990e24b7bbb64ef9cdbbe513f65aec60e348d9e5395c24af791792f01bc" Namespace="kube-system" Pod="coredns-668d6bf9bc-pc7pt" WorkloadEndpoint="ci--4081--3--6--2--0999f1dc3d-k8s-coredns--668d6bf9bc--pc7pt-eth0" Sep 12 17:15:14.316374 containerd[1474]: 2025-09-12 17:15:14.089 [INFO][4834] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="9d315990e24b7bbb64ef9cdbbe513f65aec60e348d9e5395c24af791792f01bc" HandleID="k8s-pod-network.9d315990e24b7bbb64ef9cdbbe513f65aec60e348d9e5395c24af791792f01bc" Workload="ci--4081--3--6--2--0999f1dc3d-k8s-coredns--668d6bf9bc--pc7pt-eth0" Sep 12 17:15:14.316374 containerd[1474]: 2025-09-12 17:15:14.089 [INFO][4834] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="9d315990e24b7bbb64ef9cdbbe513f65aec60e348d9e5395c24af791792f01bc" HandleID="k8s-pod-network.9d315990e24b7bbb64ef9cdbbe513f65aec60e348d9e5395c24af791792f01bc" Workload="ci--4081--3--6--2--0999f1dc3d-k8s-coredns--668d6bf9bc--pc7pt-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002d3640), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4081-3-6-2-0999f1dc3d", "pod":"coredns-668d6bf9bc-pc7pt", "timestamp":"2025-09-12 17:15:14.089195847 +0000 UTC"}, Hostname:"ci-4081-3-6-2-0999f1dc3d", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 17:15:14.316374 containerd[1474]: 2025-09-12 17:15:14.089 [INFO][4834] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 17:15:14.316374 containerd[1474]: 2025-09-12 17:15:14.089 [INFO][4834] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 12 17:15:14.316374 containerd[1474]: 2025-09-12 17:15:14.089 [INFO][4834] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4081-3-6-2-0999f1dc3d' Sep 12 17:15:14.316374 containerd[1474]: 2025-09-12 17:15:14.111 [INFO][4834] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.9d315990e24b7bbb64ef9cdbbe513f65aec60e348d9e5395c24af791792f01bc" host="ci-4081-3-6-2-0999f1dc3d" Sep 12 17:15:14.316374 containerd[1474]: 2025-09-12 17:15:14.171 [INFO][4834] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4081-3-6-2-0999f1dc3d" Sep 12 17:15:14.316374 containerd[1474]: 2025-09-12 17:15:14.198 [INFO][4834] ipam/ipam.go 511: Trying affinity for 192.168.48.64/26 host="ci-4081-3-6-2-0999f1dc3d" Sep 12 17:15:14.316374 containerd[1474]: 2025-09-12 17:15:14.204 [INFO][4834] ipam/ipam.go 158: Attempting to load block cidr=192.168.48.64/26 host="ci-4081-3-6-2-0999f1dc3d" Sep 12 17:15:14.316374 containerd[1474]: 2025-09-12 17:15:14.214 [INFO][4834] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.48.64/26 host="ci-4081-3-6-2-0999f1dc3d" Sep 12 17:15:14.316374 containerd[1474]: 2025-09-12 17:15:14.214 [INFO][4834] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.48.64/26 handle="k8s-pod-network.9d315990e24b7bbb64ef9cdbbe513f65aec60e348d9e5395c24af791792f01bc" host="ci-4081-3-6-2-0999f1dc3d" Sep 12 17:15:14.316374 containerd[1474]: 2025-09-12 17:15:14.218 [INFO][4834] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.9d315990e24b7bbb64ef9cdbbe513f65aec60e348d9e5395c24af791792f01bc Sep 12 17:15:14.316374 containerd[1474]: 2025-09-12 17:15:14.227 [INFO][4834] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.48.64/26 handle="k8s-pod-network.9d315990e24b7bbb64ef9cdbbe513f65aec60e348d9e5395c24af791792f01bc" host="ci-4081-3-6-2-0999f1dc3d" Sep 12 17:15:14.316374 containerd[1474]: 2025-09-12 17:15:14.259 [INFO][4834] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.48.71/26] block=192.168.48.64/26 handle="k8s-pod-network.9d315990e24b7bbb64ef9cdbbe513f65aec60e348d9e5395c24af791792f01bc" host="ci-4081-3-6-2-0999f1dc3d" Sep 12 17:15:14.316374 containerd[1474]: 2025-09-12 17:15:14.259 [INFO][4834] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.48.71/26] handle="k8s-pod-network.9d315990e24b7bbb64ef9cdbbe513f65aec60e348d9e5395c24af791792f01bc" host="ci-4081-3-6-2-0999f1dc3d" Sep 12 17:15:14.316374 containerd[1474]: 2025-09-12 17:15:14.259 [INFO][4834] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 12 17:15:14.316374 containerd[1474]: 2025-09-12 17:15:14.259 [INFO][4834] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.48.71/26] IPv6=[] ContainerID="9d315990e24b7bbb64ef9cdbbe513f65aec60e348d9e5395c24af791792f01bc" HandleID="k8s-pod-network.9d315990e24b7bbb64ef9cdbbe513f65aec60e348d9e5395c24af791792f01bc" Workload="ci--4081--3--6--2--0999f1dc3d-k8s-coredns--668d6bf9bc--pc7pt-eth0" Sep 12 17:15:14.317140 containerd[1474]: 2025-09-12 17:15:14.264 [INFO][4822] cni-plugin/k8s.go 418: Populated endpoint ContainerID="9d315990e24b7bbb64ef9cdbbe513f65aec60e348d9e5395c24af791792f01bc" Namespace="kube-system" Pod="coredns-668d6bf9bc-pc7pt" WorkloadEndpoint="ci--4081--3--6--2--0999f1dc3d-k8s-coredns--668d6bf9bc--pc7pt-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--2--0999f1dc3d-k8s-coredns--668d6bf9bc--pc7pt-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"ecddc12a-94e3-4f27-a97d-480a79c66c1a", ResourceVersion:"969", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 14, 24, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-2-0999f1dc3d", ContainerID:"", Pod:"coredns-668d6bf9bc-pc7pt", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.48.71/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali6f2608cc972", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:15:14.317140 containerd[1474]: 2025-09-12 17:15:14.265 [INFO][4822] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.48.71/32] ContainerID="9d315990e24b7bbb64ef9cdbbe513f65aec60e348d9e5395c24af791792f01bc" Namespace="kube-system" Pod="coredns-668d6bf9bc-pc7pt" WorkloadEndpoint="ci--4081--3--6--2--0999f1dc3d-k8s-coredns--668d6bf9bc--pc7pt-eth0" Sep 12 17:15:14.317140 containerd[1474]: 2025-09-12 17:15:14.265 [INFO][4822] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali6f2608cc972 ContainerID="9d315990e24b7bbb64ef9cdbbe513f65aec60e348d9e5395c24af791792f01bc" Namespace="kube-system" Pod="coredns-668d6bf9bc-pc7pt" WorkloadEndpoint="ci--4081--3--6--2--0999f1dc3d-k8s-coredns--668d6bf9bc--pc7pt-eth0" Sep 12 17:15:14.317140 containerd[1474]: 2025-09-12 17:15:14.271 [INFO][4822] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="9d315990e24b7bbb64ef9cdbbe513f65aec60e348d9e5395c24af791792f01bc" Namespace="kube-system" 
Pod="coredns-668d6bf9bc-pc7pt" WorkloadEndpoint="ci--4081--3--6--2--0999f1dc3d-k8s-coredns--668d6bf9bc--pc7pt-eth0" Sep 12 17:15:14.317140 containerd[1474]: 2025-09-12 17:15:14.278 [INFO][4822] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="9d315990e24b7bbb64ef9cdbbe513f65aec60e348d9e5395c24af791792f01bc" Namespace="kube-system" Pod="coredns-668d6bf9bc-pc7pt" WorkloadEndpoint="ci--4081--3--6--2--0999f1dc3d-k8s-coredns--668d6bf9bc--pc7pt-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--2--0999f1dc3d-k8s-coredns--668d6bf9bc--pc7pt-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"ecddc12a-94e3-4f27-a97d-480a79c66c1a", ResourceVersion:"969", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 14, 24, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-2-0999f1dc3d", ContainerID:"9d315990e24b7bbb64ef9cdbbe513f65aec60e348d9e5395c24af791792f01bc", Pod:"coredns-668d6bf9bc-pc7pt", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.48.71/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali6f2608cc972", MAC:"6a:8b:ec:b8:e2:7f", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:15:14.317140 containerd[1474]: 2025-09-12 17:15:14.309 [INFO][4822] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="9d315990e24b7bbb64ef9cdbbe513f65aec60e348d9e5395c24af791792f01bc" Namespace="kube-system" Pod="coredns-668d6bf9bc-pc7pt" WorkloadEndpoint="ci--4081--3--6--2--0999f1dc3d-k8s-coredns--668d6bf9bc--pc7pt-eth0" Sep 12 17:15:14.366162 containerd[1474]: time="2025-09-12T17:15:14.365492159Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Sep 12 17:15:14.366162 containerd[1474]: time="2025-09-12T17:15:14.366050272Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Sep 12 17:15:14.366162 containerd[1474]: time="2025-09-12T17:15:14.366078431Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 12 17:15:14.368512 containerd[1474]: time="2025-09-12T17:15:14.366226989Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 12 17:15:14.401547 systemd[1]: Started cri-containerd-9d315990e24b7bbb64ef9cdbbe513f65aec60e348d9e5395c24af791792f01bc.scope - libcontainer container 9d315990e24b7bbb64ef9cdbbe513f65aec60e348d9e5395c24af791792f01bc. Sep 12 17:15:14.469878 containerd[1474]: time="2025-09-12T17:15:14.469813562Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-pc7pt,Uid:ecddc12a-94e3-4f27-a97d-480a79c66c1a,Namespace:kube-system,Attempt:1,} returns sandbox id \"9d315990e24b7bbb64ef9cdbbe513f65aec60e348d9e5395c24af791792f01bc\"" Sep 12 17:15:14.479762 containerd[1474]: time="2025-09-12T17:15:14.479463269Z" level=info msg="CreateContainer within sandbox \"9d315990e24b7bbb64ef9cdbbe513f65aec60e348d9e5395c24af791792f01bc\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Sep 12 17:15:14.506206 containerd[1474]: time="2025-09-12T17:15:14.504460364Z" level=info msg="CreateContainer within sandbox \"9d315990e24b7bbb64ef9cdbbe513f65aec60e348d9e5395c24af791792f01bc\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"9f864381301c26f83630ae46c2b8f51ed2891503cb70399786a2e70fdc8e2c46\"" Sep 12 17:15:14.508420 containerd[1474]: time="2025-09-12T17:15:14.506445617Z" level=info msg="StartContainer for \"9f864381301c26f83630ae46c2b8f51ed2891503cb70399786a2e70fdc8e2c46\"" Sep 12 17:15:14.585182 containerd[1474]: time="2025-09-12T17:15:14.582996682Z" level=info msg="StopPodSandbox for \"de20c281e9be8ceb5ac227d9407ee1c1a9f9de1e010ed5f1af1dde4010b0c2d7\"" Sep 12 17:15:14.586357 systemd[1]: Started cri-containerd-9f864381301c26f83630ae46c2b8f51ed2891503cb70399786a2e70fdc8e2c46.scope - libcontainer container 9f864381301c26f83630ae46c2b8f51ed2891503cb70399786a2e70fdc8e2c46. Sep 12 17:15:14.704854 containerd[1474]: time="2025-09-12T17:15:14.704399409Z" level=info msg="StartContainer for \"9f864381301c26f83630ae46c2b8f51ed2891503cb70399786a2e70fdc8e2c46\" returns successfully" Sep 12 17:15:14.937933 containerd[1474]: 2025-09-12 17:15:14.825 [INFO][4929] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="de20c281e9be8ceb5ac227d9407ee1c1a9f9de1e010ed5f1af1dde4010b0c2d7" Sep 12 17:15:14.937933 containerd[1474]: 2025-09-12 17:15:14.826 [INFO][4929] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="de20c281e9be8ceb5ac227d9407ee1c1a9f9de1e010ed5f1af1dde4010b0c2d7" iface="eth0" netns="/var/run/netns/cni-65dd9ffd-5dea-14fd-814a-b53985ef29fb" Sep 12 17:15:14.937933 containerd[1474]: 2025-09-12 17:15:14.827 [INFO][4929] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="de20c281e9be8ceb5ac227d9407ee1c1a9f9de1e010ed5f1af1dde4010b0c2d7" iface="eth0" netns="/var/run/netns/cni-65dd9ffd-5dea-14fd-814a-b53985ef29fb" Sep 12 17:15:14.937933 containerd[1474]: 2025-09-12 17:15:14.832 [INFO][4929] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. 
ContainerID="de20c281e9be8ceb5ac227d9407ee1c1a9f9de1e010ed5f1af1dde4010b0c2d7" iface="eth0" netns="/var/run/netns/cni-65dd9ffd-5dea-14fd-814a-b53985ef29fb" Sep 12 17:15:14.937933 containerd[1474]: 2025-09-12 17:15:14.832 [INFO][4929] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="de20c281e9be8ceb5ac227d9407ee1c1a9f9de1e010ed5f1af1dde4010b0c2d7" Sep 12 17:15:14.937933 containerd[1474]: 2025-09-12 17:15:14.832 [INFO][4929] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="de20c281e9be8ceb5ac227d9407ee1c1a9f9de1e010ed5f1af1dde4010b0c2d7" Sep 12 17:15:14.937933 containerd[1474]: 2025-09-12 17:15:14.907 [INFO][4948] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="de20c281e9be8ceb5ac227d9407ee1c1a9f9de1e010ed5f1af1dde4010b0c2d7" HandleID="k8s-pod-network.de20c281e9be8ceb5ac227d9407ee1c1a9f9de1e010ed5f1af1dde4010b0c2d7" Workload="ci--4081--3--6--2--0999f1dc3d-k8s-calico--apiserver--7df9fb7db--kpktm-eth0" Sep 12 17:15:14.937933 containerd[1474]: 2025-09-12 17:15:14.908 [INFO][4948] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 17:15:14.937933 containerd[1474]: 2025-09-12 17:15:14.908 [INFO][4948] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 12 17:15:14.937933 containerd[1474]: 2025-09-12 17:15:14.927 [WARNING][4948] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="de20c281e9be8ceb5ac227d9407ee1c1a9f9de1e010ed5f1af1dde4010b0c2d7" HandleID="k8s-pod-network.de20c281e9be8ceb5ac227d9407ee1c1a9f9de1e010ed5f1af1dde4010b0c2d7" Workload="ci--4081--3--6--2--0999f1dc3d-k8s-calico--apiserver--7df9fb7db--kpktm-eth0" Sep 12 17:15:14.937933 containerd[1474]: 2025-09-12 17:15:14.927 [INFO][4948] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="de20c281e9be8ceb5ac227d9407ee1c1a9f9de1e010ed5f1af1dde4010b0c2d7" HandleID="k8s-pod-network.de20c281e9be8ceb5ac227d9407ee1c1a9f9de1e010ed5f1af1dde4010b0c2d7" Workload="ci--4081--3--6--2--0999f1dc3d-k8s-calico--apiserver--7df9fb7db--kpktm-eth0" Sep 12 17:15:14.937933 containerd[1474]: 2025-09-12 17:15:14.931 [INFO][4948] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 17:15:14.937933 containerd[1474]: 2025-09-12 17:15:14.934 [INFO][4929] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="de20c281e9be8ceb5ac227d9407ee1c1a9f9de1e010ed5f1af1dde4010b0c2d7" Sep 12 17:15:14.946649 systemd[1]: run-netns-cni\x2d65dd9ffd\x2d5dea\x2d14fd\x2d814a\x2db53985ef29fb.mount: Deactivated successfully. 
Sep 12 17:15:14.948535 containerd[1474]: time="2025-09-12T17:15:14.948472006Z" level=info msg="TearDown network for sandbox \"de20c281e9be8ceb5ac227d9407ee1c1a9f9de1e010ed5f1af1dde4010b0c2d7\" successfully" Sep 12 17:15:14.948977 containerd[1474]: time="2025-09-12T17:15:14.948767522Z" level=info msg="StopPodSandbox for \"de20c281e9be8ceb5ac227d9407ee1c1a9f9de1e010ed5f1af1dde4010b0c2d7\" returns successfully" Sep 12 17:15:14.953602 containerd[1474]: time="2025-09-12T17:15:14.952730787Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7df9fb7db-kpktm,Uid:0d01a44b-0670-4021-80bc-06b979ee8824,Namespace:calico-apiserver,Attempt:1,}" Sep 12 17:15:15.119119 systemd-networkd[1374]: calib44e249471e: Gained IPv6LL Sep 12 17:15:15.238340 kubelet[2556]: I0912 17:15:15.238098 2556 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-668d6bf9bc-pc7pt" podStartSLOduration=51.23805191 podStartE2EDuration="51.23805191s" podCreationTimestamp="2025-09-12 17:14:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-12 17:15:15.169092964 +0000 UTC m=+56.831834323" watchObservedRunningTime="2025-09-12 17:15:15.23805191 +0000 UTC m=+56.900793309" Sep 12 17:15:15.340334 systemd-networkd[1374]: calif6b4ab77080: Link UP Sep 12 17:15:15.342841 systemd-networkd[1374]: calif6b4ab77080: Gained carrier Sep 12 17:15:15.430402 containerd[1474]: 2025-09-12 17:15:15.089 [INFO][4955] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4081--3--6--2--0999f1dc3d-k8s-calico--apiserver--7df9fb7db--kpktm-eth0 calico-apiserver-7df9fb7db- calico-apiserver 0d01a44b-0670-4021-80bc-06b979ee8824 986 0 2025-09-12 17:14:41 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:7df9fb7db projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4081-3-6-2-0999f1dc3d calico-apiserver-7df9fb7db-kpktm eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calif6b4ab77080 [] [] }} ContainerID="36cb49f4ec250ee9635802c5d120c6cb2059151fbfd4b1779672f177c7ef3604" Namespace="calico-apiserver" Pod="calico-apiserver-7df9fb7db-kpktm" WorkloadEndpoint="ci--4081--3--6--2--0999f1dc3d-k8s-calico--apiserver--7df9fb7db--kpktm-" Sep 12 17:15:15.430402 containerd[1474]: 2025-09-12 17:15:15.089 [INFO][4955] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="36cb49f4ec250ee9635802c5d120c6cb2059151fbfd4b1779672f177c7ef3604" Namespace="calico-apiserver" Pod="calico-apiserver-7df9fb7db-kpktm" WorkloadEndpoint="ci--4081--3--6--2--0999f1dc3d-k8s-calico--apiserver--7df9fb7db--kpktm-eth0" Sep 12 17:15:15.430402 containerd[1474]: 2025-09-12 17:15:15.182 [INFO][4966] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="36cb49f4ec250ee9635802c5d120c6cb2059151fbfd4b1779672f177c7ef3604" HandleID="k8s-pod-network.36cb49f4ec250ee9635802c5d120c6cb2059151fbfd4b1779672f177c7ef3604" Workload="ci--4081--3--6--2--0999f1dc3d-k8s-calico--apiserver--7df9fb7db--kpktm-eth0" Sep 12 17:15:15.430402 containerd[1474]: 2025-09-12 17:15:15.183 [INFO][4966] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="36cb49f4ec250ee9635802c5d120c6cb2059151fbfd4b1779672f177c7ef3604" 
HandleID="k8s-pod-network.36cb49f4ec250ee9635802c5d120c6cb2059151fbfd4b1779672f177c7ef3604" Workload="ci--4081--3--6--2--0999f1dc3d-k8s-calico--apiserver--7df9fb7db--kpktm-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002abf40), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4081-3-6-2-0999f1dc3d", "pod":"calico-apiserver-7df9fb7db-kpktm", "timestamp":"2025-09-12 17:15:15.182817298 +0000 UTC"}, Hostname:"ci-4081-3-6-2-0999f1dc3d", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 17:15:15.430402 containerd[1474]: 2025-09-12 17:15:15.183 [INFO][4966] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 17:15:15.430402 containerd[1474]: 2025-09-12 17:15:15.183 [INFO][4966] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 12 17:15:15.430402 containerd[1474]: 2025-09-12 17:15:15.183 [INFO][4966] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4081-3-6-2-0999f1dc3d' Sep 12 17:15:15.430402 containerd[1474]: 2025-09-12 17:15:15.233 [INFO][4966] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.36cb49f4ec250ee9635802c5d120c6cb2059151fbfd4b1779672f177c7ef3604" host="ci-4081-3-6-2-0999f1dc3d" Sep 12 17:15:15.430402 containerd[1474]: 2025-09-12 17:15:15.266 [INFO][4966] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4081-3-6-2-0999f1dc3d" Sep 12 17:15:15.430402 containerd[1474]: 2025-09-12 17:15:15.285 [INFO][4966] ipam/ipam.go 511: Trying affinity for 192.168.48.64/26 host="ci-4081-3-6-2-0999f1dc3d" Sep 12 17:15:15.430402 containerd[1474]: 2025-09-12 17:15:15.289 [INFO][4966] ipam/ipam.go 158: Attempting to load block cidr=192.168.48.64/26 host="ci-4081-3-6-2-0999f1dc3d" Sep 12 17:15:15.430402 containerd[1474]: 2025-09-12 17:15:15.300 [INFO][4966] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.48.64/26 host="ci-4081-3-6-2-0999f1dc3d" Sep 12 17:15:15.430402 containerd[1474]: 2025-09-12 17:15:15.300 [INFO][4966] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.48.64/26 handle="k8s-pod-network.36cb49f4ec250ee9635802c5d120c6cb2059151fbfd4b1779672f177c7ef3604" host="ci-4081-3-6-2-0999f1dc3d" Sep 12 17:15:15.430402 containerd[1474]: 2025-09-12 17:15:15.304 [INFO][4966] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.36cb49f4ec250ee9635802c5d120c6cb2059151fbfd4b1779672f177c7ef3604 Sep 12 17:15:15.430402 containerd[1474]: 2025-09-12 17:15:15.315 [INFO][4966] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.48.64/26 handle="k8s-pod-network.36cb49f4ec250ee9635802c5d120c6cb2059151fbfd4b1779672f177c7ef3604" host="ci-4081-3-6-2-0999f1dc3d" Sep 12 17:15:15.430402 containerd[1474]: 2025-09-12 17:15:15.328 [INFO][4966] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.48.72/26] block=192.168.48.64/26 handle="k8s-pod-network.36cb49f4ec250ee9635802c5d120c6cb2059151fbfd4b1779672f177c7ef3604" host="ci-4081-3-6-2-0999f1dc3d" Sep 12 17:15:15.430402 containerd[1474]: 2025-09-12 17:15:15.328 [INFO][4966] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.48.72/26] handle="k8s-pod-network.36cb49f4ec250ee9635802c5d120c6cb2059151fbfd4b1779672f177c7ef3604" host="ci-4081-3-6-2-0999f1dc3d" Sep 12 17:15:15.430402 containerd[1474]: 2025-09-12 17:15:15.328 [INFO][4966] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 12 17:15:15.430402 containerd[1474]: 2025-09-12 17:15:15.328 [INFO][4966] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.48.72/26] IPv6=[] ContainerID="36cb49f4ec250ee9635802c5d120c6cb2059151fbfd4b1779672f177c7ef3604" HandleID="k8s-pod-network.36cb49f4ec250ee9635802c5d120c6cb2059151fbfd4b1779672f177c7ef3604" Workload="ci--4081--3--6--2--0999f1dc3d-k8s-calico--apiserver--7df9fb7db--kpktm-eth0" Sep 12 17:15:15.431815 containerd[1474]: 2025-09-12 17:15:15.331 [INFO][4955] cni-plugin/k8s.go 418: Populated endpoint ContainerID="36cb49f4ec250ee9635802c5d120c6cb2059151fbfd4b1779672f177c7ef3604" Namespace="calico-apiserver" Pod="calico-apiserver-7df9fb7db-kpktm" WorkloadEndpoint="ci--4081--3--6--2--0999f1dc3d-k8s-calico--apiserver--7df9fb7db--kpktm-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--2--0999f1dc3d-k8s-calico--apiserver--7df9fb7db--kpktm-eth0", GenerateName:"calico-apiserver-7df9fb7db-", Namespace:"calico-apiserver", SelfLink:"", UID:"0d01a44b-0670-4021-80bc-06b979ee8824", ResourceVersion:"986", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 14, 41, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"7df9fb7db", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-2-0999f1dc3d", ContainerID:"", Pod:"calico-apiserver-7df9fb7db-kpktm", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.48.72/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calif6b4ab77080", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:15:15.431815 containerd[1474]: 2025-09-12 17:15:15.333 [INFO][4955] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.48.72/32] ContainerID="36cb49f4ec250ee9635802c5d120c6cb2059151fbfd4b1779672f177c7ef3604" Namespace="calico-apiserver" Pod="calico-apiserver-7df9fb7db-kpktm" WorkloadEndpoint="ci--4081--3--6--2--0999f1dc3d-k8s-calico--apiserver--7df9fb7db--kpktm-eth0" Sep 12 17:15:15.431815 containerd[1474]: 2025-09-12 17:15:15.333 [INFO][4955] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calif6b4ab77080 ContainerID="36cb49f4ec250ee9635802c5d120c6cb2059151fbfd4b1779672f177c7ef3604" Namespace="calico-apiserver" Pod="calico-apiserver-7df9fb7db-kpktm" WorkloadEndpoint="ci--4081--3--6--2--0999f1dc3d-k8s-calico--apiserver--7df9fb7db--kpktm-eth0" Sep 12 17:15:15.431815 containerd[1474]: 2025-09-12 17:15:15.342 [INFO][4955] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="36cb49f4ec250ee9635802c5d120c6cb2059151fbfd4b1779672f177c7ef3604" Namespace="calico-apiserver" Pod="calico-apiserver-7df9fb7db-kpktm" WorkloadEndpoint="ci--4081--3--6--2--0999f1dc3d-k8s-calico--apiserver--7df9fb7db--kpktm-eth0" Sep 12 17:15:15.431815 containerd[1474]: 2025-09-12 17:15:15.344 
[INFO][4955] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="36cb49f4ec250ee9635802c5d120c6cb2059151fbfd4b1779672f177c7ef3604" Namespace="calico-apiserver" Pod="calico-apiserver-7df9fb7db-kpktm" WorkloadEndpoint="ci--4081--3--6--2--0999f1dc3d-k8s-calico--apiserver--7df9fb7db--kpktm-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--2--0999f1dc3d-k8s-calico--apiserver--7df9fb7db--kpktm-eth0", GenerateName:"calico-apiserver-7df9fb7db-", Namespace:"calico-apiserver", SelfLink:"", UID:"0d01a44b-0670-4021-80bc-06b979ee8824", ResourceVersion:"986", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 14, 41, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"7df9fb7db", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-2-0999f1dc3d", ContainerID:"36cb49f4ec250ee9635802c5d120c6cb2059151fbfd4b1779672f177c7ef3604", Pod:"calico-apiserver-7df9fb7db-kpktm", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.48.72/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calif6b4ab77080", MAC:"3a:a2:23:c8:a3:ce", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:15:15.431815 containerd[1474]: 2025-09-12 17:15:15.426 [INFO][4955] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="36cb49f4ec250ee9635802c5d120c6cb2059151fbfd4b1779672f177c7ef3604" Namespace="calico-apiserver" Pod="calico-apiserver-7df9fb7db-kpktm" WorkloadEndpoint="ci--4081--3--6--2--0999f1dc3d-k8s-calico--apiserver--7df9fb7db--kpktm-eth0" Sep 12 17:15:15.488234 containerd[1474]: time="2025-09-12T17:15:15.482938911Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Sep 12 17:15:15.488234 containerd[1474]: time="2025-09-12T17:15:15.483036550Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Sep 12 17:15:15.488234 containerd[1474]: time="2025-09-12T17:15:15.483050710Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 12 17:15:15.488234 containerd[1474]: time="2025-09-12T17:15:15.486307825Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 12 17:15:15.550693 systemd[1]: Started cri-containerd-36cb49f4ec250ee9635802c5d120c6cb2059151fbfd4b1779672f177c7ef3604.scope - libcontainer container 36cb49f4ec250ee9635802c5d120c6cb2059151fbfd4b1779672f177c7ef3604. 
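The endpoint dumps in these ADD traces are v3.WorkloadEndpoint objects as written to the Calico datastore: k8s.go 418 logs the endpoint before the veth exists, and k8s.go 446 logs it again once the MAC and active container ID have been filled in. A sketch of the final object for the apiserver pod, with every field value copied from the dump above (import paths assumed):

    package main

    import (
        "fmt"

        v3 "github.com/projectcalico/api/pkg/apis/projectcalico/v3"
        metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
    )

    func main() {
        wep := v3.WorkloadEndpoint{
            TypeMeta: metav1.TypeMeta{Kind: "WorkloadEndpoint", APIVersion: "projectcalico.org/v3"},
            ObjectMeta: metav1.ObjectMeta{
                // Dashes inside the node and pod names are escaped as "--".
                Name:      "ci--4081--3--6--2--0999f1dc3d-k8s-calico--apiserver--7df9fb7db--kpktm-eth0",
                Namespace: "calico-apiserver",
            },
            Spec: v3.WorkloadEndpointSpec{
                Orchestrator:       "k8s",
                Node:               "ci-4081-3-6-2-0999f1dc3d",
                ContainerID:        "36cb49f4ec250ee9635802c5d120c6cb2059151fbfd4b1779672f177c7ef3604",
                Pod:                "calico-apiserver-7df9fb7db-kpktm",
                Endpoint:           "eth0",
                ServiceAccountName: "calico-apiserver",
                IPNetworks:         []string{"192.168.48.72/32"},
                Profiles:           []string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"},
                InterfaceName:      "calif6b4ab77080",
                MAC:                "3a:a2:23:c8:a3:ce",
            },
        }
        fmt.Printf("%s/%s -> %v via %s\n", wep.Namespace, wep.Name, wep.Spec.IPNetworks, wep.Spec.InterfaceName)
    }
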
Sep 12 17:15:15.749692 containerd[1474]: time="2025-09-12T17:15:15.749516539Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7df9fb7db-kpktm,Uid:0d01a44b-0670-4021-80bc-06b979ee8824,Namespace:calico-apiserver,Attempt:1,} returns sandbox id \"36cb49f4ec250ee9635802c5d120c6cb2059151fbfd4b1779672f177c7ef3604\"" Sep 12 17:15:16.333389 systemd-networkd[1374]: cali6f2608cc972: Gained IPv6LL Sep 12 17:15:16.909715 systemd-networkd[1374]: calif6b4ab77080: Gained IPv6LL Sep 12 17:15:17.224562 containerd[1474]: time="2025-09-12T17:15:17.224357219Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:15:17.226411 containerd[1474]: time="2025-09-12T17:15:17.225878079Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.3: active requests=0, bytes read=48134957" Sep 12 17:15:17.230126 containerd[1474]: time="2025-09-12T17:15:17.227399539Z" level=info msg="ImageCreate event name:\"sha256:34117caf92350e1565610f2254377d7455b11e36666b5ce11b4a13670720432a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:15:17.234609 containerd[1474]: time="2025-09-12T17:15:17.234509406Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:27c4187717f08f0a5727019d8beb7597665eb47e69eaa1d7d091a7e28913e577\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:15:17.235232 containerd[1474]: time="2025-09-12T17:15:17.235180397Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" with image id \"sha256:34117caf92350e1565610f2254377d7455b11e36666b5ce11b4a13670720432a\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:27c4187717f08f0a5727019d8beb7597665eb47e69eaa1d7d091a7e28913e577\", size \"49504166\" in 3.959361725s" Sep 12 17:15:17.235320 containerd[1474]: time="2025-09-12T17:15:17.235232916Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" returns image reference \"sha256:34117caf92350e1565610f2254377d7455b11e36666b5ce11b4a13670720432a\"" Sep 12 17:15:17.240366 containerd[1474]: time="2025-09-12T17:15:17.240297370Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.3\"" Sep 12 17:15:17.269393 containerd[1474]: time="2025-09-12T17:15:17.265749156Z" level=info msg="CreateContainer within sandbox \"1fda1c833dd0c298abd21be0cc5394b58067328fa4156066fc6617adad0b9cf5\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" Sep 12 17:15:17.308704 containerd[1474]: time="2025-09-12T17:15:17.308586233Z" level=info msg="CreateContainer within sandbox \"1fda1c833dd0c298abd21be0cc5394b58067328fa4156066fc6617adad0b9cf5\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"6ba95ad4134da62b2e02861ad037e954b0a6c55d866760cf60eaeca8b560bb19\"" Sep 12 17:15:17.314164 containerd[1474]: time="2025-09-12T17:15:17.314096040Z" level=info msg="StartContainer for \"6ba95ad4134da62b2e02861ad037e954b0a6c55d866760cf60eaeca8b560bb19\"" Sep 12 17:15:17.386933 systemd[1]: Started cri-containerd-6ba95ad4134da62b2e02861ad037e954b0a6c55d866760cf60eaeca8b560bb19.scope - libcontainer container 6ba95ad4134da62b2e02861ad037e954b0a6c55d866760cf60eaeca8b560bb19. 
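The kubelet pod_startup_latency_tracker entries in this log (the coredns one earlier, and the calico-kube-controllers one just below) relate their fields as: podStartSLOduration equals podStartE2EDuration minus the image-pull window (lastFinishedPulling minus firstStartedPulling). For coredns both pull timestamps are the zero time because the image was already present, so SLO equals E2E. A small check against the kube-controllers numbers that follow:

    package main

    import (
        "fmt"
        "time"
    )

    func main() {
        // Timestamps copied from the pod_startup_latency_tracker entry for
        // calico-kube-controllers-6fff4fc9-pkdxs below.
        layout := "2006-01-02 15:04:05.000000000 -0700 MST"
        first, _ := time.Parse(layout, "2025-09-12 17:15:10.512975821 +0000 UTC")
        last, _ := time.Parse(layout, "2025-09-12 17:15:17.238272837 +0000 UTC")
        e2e := 31.472359274 // podStartE2EDuration, seconds

        pull := last.Sub(first).Seconds()
        fmt.Printf("pull window:  %.9fs\n", pull)     // 6.725297016s
        fmt.Printf("SLO duration: %.9fs\n", e2e-pull) // ~24.747062258s, matching podStartSLOduration
    }
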
Sep 12 17:15:17.505523 containerd[1474]: time="2025-09-12T17:15:17.505323648Z" level=info msg="StartContainer for \"6ba95ad4134da62b2e02861ad037e954b0a6c55d866760cf60eaeca8b560bb19\" returns successfully" Sep 12 17:15:18.472504 kubelet[2556]: I0912 17:15:18.472384 2556 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-6fff4fc9-pkdxs" podStartSLOduration=24.747062258 podStartE2EDuration="31.472359274s" podCreationTimestamp="2025-09-12 17:14:47 +0000 UTC" firstStartedPulling="2025-09-12 17:15:10.512975821 +0000 UTC m=+52.175717180" lastFinishedPulling="2025-09-12 17:15:17.238272837 +0000 UTC m=+58.901014196" observedRunningTime="2025-09-12 17:15:18.207253226 +0000 UTC m=+59.869994585" watchObservedRunningTime="2025-09-12 17:15:18.472359274 +0000 UTC m=+60.135100633" Sep 12 17:15:18.606946 containerd[1474]: time="2025-09-12T17:15:18.606773453Z" level=info msg="StopPodSandbox for \"2667df64706e8d1b6bd4eb93f79e0d8cd65a3f485ac3cd76b9174af9b4f5592a\"" Sep 12 17:15:18.882441 containerd[1474]: 2025-09-12 17:15:18.727 [WARNING][5118] cni-plugin/k8s.go 604: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="2667df64706e8d1b6bd4eb93f79e0d8cd65a3f485ac3cd76b9174af9b4f5592a" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--2--0999f1dc3d-k8s-calico--apiserver--7df9fb7db--7p65c-eth0", GenerateName:"calico-apiserver-7df9fb7db-", Namespace:"calico-apiserver", SelfLink:"", UID:"8a36e632-808a-47ed-b4e3-b9bcf0459096", ResourceVersion:"966", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 14, 41, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"7df9fb7db", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-2-0999f1dc3d", ContainerID:"dc8199d51d223fa210f25174d49c024ce3c37f44fc96832e965fbabdf016e744", Pod:"calico-apiserver-7df9fb7db-7p65c", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.48.70/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calib44e249471e", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:15:18.882441 containerd[1474]: 2025-09-12 17:15:18.728 [INFO][5118] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="2667df64706e8d1b6bd4eb93f79e0d8cd65a3f485ac3cd76b9174af9b4f5592a" Sep 12 17:15:18.882441 containerd[1474]: 2025-09-12 17:15:18.728 [INFO][5118] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. 
ContainerID="2667df64706e8d1b6bd4eb93f79e0d8cd65a3f485ac3cd76b9174af9b4f5592a" iface="eth0" netns="" Sep 12 17:15:18.882441 containerd[1474]: 2025-09-12 17:15:18.728 [INFO][5118] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="2667df64706e8d1b6bd4eb93f79e0d8cd65a3f485ac3cd76b9174af9b4f5592a" Sep 12 17:15:18.882441 containerd[1474]: 2025-09-12 17:15:18.728 [INFO][5118] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="2667df64706e8d1b6bd4eb93f79e0d8cd65a3f485ac3cd76b9174af9b4f5592a" Sep 12 17:15:18.882441 containerd[1474]: 2025-09-12 17:15:18.832 [INFO][5126] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="2667df64706e8d1b6bd4eb93f79e0d8cd65a3f485ac3cd76b9174af9b4f5592a" HandleID="k8s-pod-network.2667df64706e8d1b6bd4eb93f79e0d8cd65a3f485ac3cd76b9174af9b4f5592a" Workload="ci--4081--3--6--2--0999f1dc3d-k8s-calico--apiserver--7df9fb7db--7p65c-eth0" Sep 12 17:15:18.882441 containerd[1474]: 2025-09-12 17:15:18.832 [INFO][5126] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 17:15:18.882441 containerd[1474]: 2025-09-12 17:15:18.832 [INFO][5126] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 12 17:15:18.882441 containerd[1474]: 2025-09-12 17:15:18.864 [WARNING][5126] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="2667df64706e8d1b6bd4eb93f79e0d8cd65a3f485ac3cd76b9174af9b4f5592a" HandleID="k8s-pod-network.2667df64706e8d1b6bd4eb93f79e0d8cd65a3f485ac3cd76b9174af9b4f5592a" Workload="ci--4081--3--6--2--0999f1dc3d-k8s-calico--apiserver--7df9fb7db--7p65c-eth0" Sep 12 17:15:18.882441 containerd[1474]: 2025-09-12 17:15:18.865 [INFO][5126] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="2667df64706e8d1b6bd4eb93f79e0d8cd65a3f485ac3cd76b9174af9b4f5592a" HandleID="k8s-pod-network.2667df64706e8d1b6bd4eb93f79e0d8cd65a3f485ac3cd76b9174af9b4f5592a" Workload="ci--4081--3--6--2--0999f1dc3d-k8s-calico--apiserver--7df9fb7db--7p65c-eth0" Sep 12 17:15:18.882441 containerd[1474]: 2025-09-12 17:15:18.872 [INFO][5126] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 17:15:18.882441 containerd[1474]: 2025-09-12 17:15:18.878 [INFO][5118] cni-plugin/k8s.go 653: Teardown processing complete. 
ContainerID="2667df64706e8d1b6bd4eb93f79e0d8cd65a3f485ac3cd76b9174af9b4f5592a" Sep 12 17:15:18.883708 containerd[1474]: time="2025-09-12T17:15:18.883203994Z" level=info msg="TearDown network for sandbox \"2667df64706e8d1b6bd4eb93f79e0d8cd65a3f485ac3cd76b9174af9b4f5592a\" successfully" Sep 12 17:15:18.884635 containerd[1474]: time="2025-09-12T17:15:18.883249434Z" level=info msg="StopPodSandbox for \"2667df64706e8d1b6bd4eb93f79e0d8cd65a3f485ac3cd76b9174af9b4f5592a\" returns successfully" Sep 12 17:15:18.887057 containerd[1474]: time="2025-09-12T17:15:18.886978065Z" level=info msg="RemovePodSandbox for \"2667df64706e8d1b6bd4eb93f79e0d8cd65a3f485ac3cd76b9174af9b4f5592a\"" Sep 12 17:15:18.892798 containerd[1474]: time="2025-09-12T17:15:18.891901202Z" level=info msg="Forcibly stopping sandbox \"2667df64706e8d1b6bd4eb93f79e0d8cd65a3f485ac3cd76b9174af9b4f5592a\"" Sep 12 17:15:18.905886 containerd[1474]: time="2025-09-12T17:15:18.905788062Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:15:18.909125 containerd[1474]: time="2025-09-12T17:15:18.907584319Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.3: active requests=0, bytes read=8227489" Sep 12 17:15:18.910917 containerd[1474]: time="2025-09-12T17:15:18.910842356Z" level=info msg="ImageCreate event name:\"sha256:5e2b30128ce4b607acd97d3edef62ce1a90be0259903090a51c360adbe4a8f3b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:15:18.922398 containerd[1474]: time="2025-09-12T17:15:18.922271688Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:f22c88018d8b58c4ef0052f594b216a13bd6852166ac131a538c5ab2fba23bb2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:15:18.923108 containerd[1474]: time="2025-09-12T17:15:18.922750762Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.30.3\" with image id \"sha256:5e2b30128ce4b607acd97d3edef62ce1a90be0259903090a51c360adbe4a8f3b\", repo tag \"ghcr.io/flatcar/calico/csi:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:f22c88018d8b58c4ef0052f594b216a13bd6852166ac131a538c5ab2fba23bb2\", size \"9596730\" in 1.682386673s" Sep 12 17:15:18.923108 containerd[1474]: time="2025-09-12T17:15:18.922815921Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.3\" returns image reference \"sha256:5e2b30128ce4b607acd97d3edef62ce1a90be0259903090a51c360adbe4a8f3b\"" Sep 12 17:15:18.928369 containerd[1474]: time="2025-09-12T17:15:18.926729791Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.3\"" Sep 12 17:15:18.932573 containerd[1474]: time="2025-09-12T17:15:18.932497396Z" level=info msg="CreateContainer within sandbox \"3003849266b76e7168cdfde98b1fd3830e01312968a4c53470e7af0342164f4b\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" Sep 12 17:15:19.040462 containerd[1474]: time="2025-09-12T17:15:19.039557937Z" level=info msg="CreateContainer within sandbox \"3003849266b76e7168cdfde98b1fd3830e01312968a4c53470e7af0342164f4b\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"76679aa24f79886c0295900602c5ab83eadcca86354f21554b48ead682f94eae\"" Sep 12 17:15:19.040965 containerd[1474]: time="2025-09-12T17:15:19.040826560Z" level=info msg="StartContainer for \"76679aa24f79886c0295900602c5ab83eadcca86354f21554b48ead682f94eae\"" Sep 12 17:15:19.087819 containerd[1474]: 2025-09-12 17:15:18.986 [WARNING][5140] cni-plugin/k8s.go 604: CNI_CONTAINERID does not match 
WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="2667df64706e8d1b6bd4eb93f79e0d8cd65a3f485ac3cd76b9174af9b4f5592a" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--2--0999f1dc3d-k8s-calico--apiserver--7df9fb7db--7p65c-eth0", GenerateName:"calico-apiserver-7df9fb7db-", Namespace:"calico-apiserver", SelfLink:"", UID:"8a36e632-808a-47ed-b4e3-b9bcf0459096", ResourceVersion:"966", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 14, 41, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"7df9fb7db", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-2-0999f1dc3d", ContainerID:"dc8199d51d223fa210f25174d49c024ce3c37f44fc96832e965fbabdf016e744", Pod:"calico-apiserver-7df9fb7db-7p65c", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.48.70/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calib44e249471e", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:15:19.087819 containerd[1474]: 2025-09-12 17:15:18.987 [INFO][5140] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="2667df64706e8d1b6bd4eb93f79e0d8cd65a3f485ac3cd76b9174af9b4f5592a" Sep 12 17:15:19.087819 containerd[1474]: 2025-09-12 17:15:18.987 [INFO][5140] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="2667df64706e8d1b6bd4eb93f79e0d8cd65a3f485ac3cd76b9174af9b4f5592a" iface="eth0" netns="" Sep 12 17:15:19.087819 containerd[1474]: 2025-09-12 17:15:18.987 [INFO][5140] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="2667df64706e8d1b6bd4eb93f79e0d8cd65a3f485ac3cd76b9174af9b4f5592a" Sep 12 17:15:19.087819 containerd[1474]: 2025-09-12 17:15:18.987 [INFO][5140] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="2667df64706e8d1b6bd4eb93f79e0d8cd65a3f485ac3cd76b9174af9b4f5592a" Sep 12 17:15:19.087819 containerd[1474]: 2025-09-12 17:15:19.031 [INFO][5147] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="2667df64706e8d1b6bd4eb93f79e0d8cd65a3f485ac3cd76b9174af9b4f5592a" HandleID="k8s-pod-network.2667df64706e8d1b6bd4eb93f79e0d8cd65a3f485ac3cd76b9174af9b4f5592a" Workload="ci--4081--3--6--2--0999f1dc3d-k8s-calico--apiserver--7df9fb7db--7p65c-eth0" Sep 12 17:15:19.087819 containerd[1474]: 2025-09-12 17:15:19.034 [INFO][5147] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 17:15:19.087819 containerd[1474]: 2025-09-12 17:15:19.034 [INFO][5147] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 12 17:15:19.087819 containerd[1474]: 2025-09-12 17:15:19.069 [WARNING][5147] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="2667df64706e8d1b6bd4eb93f79e0d8cd65a3f485ac3cd76b9174af9b4f5592a" HandleID="k8s-pod-network.2667df64706e8d1b6bd4eb93f79e0d8cd65a3f485ac3cd76b9174af9b4f5592a" Workload="ci--4081--3--6--2--0999f1dc3d-k8s-calico--apiserver--7df9fb7db--7p65c-eth0" Sep 12 17:15:19.087819 containerd[1474]: 2025-09-12 17:15:19.069 [INFO][5147] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="2667df64706e8d1b6bd4eb93f79e0d8cd65a3f485ac3cd76b9174af9b4f5592a" HandleID="k8s-pod-network.2667df64706e8d1b6bd4eb93f79e0d8cd65a3f485ac3cd76b9174af9b4f5592a" Workload="ci--4081--3--6--2--0999f1dc3d-k8s-calico--apiserver--7df9fb7db--7p65c-eth0" Sep 12 17:15:19.087819 containerd[1474]: 2025-09-12 17:15:19.075 [INFO][5147] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 17:15:19.087819 containerd[1474]: 2025-09-12 17:15:19.082 [INFO][5140] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="2667df64706e8d1b6bd4eb93f79e0d8cd65a3f485ac3cd76b9174af9b4f5592a" Sep 12 17:15:19.090142 containerd[1474]: time="2025-09-12T17:15:19.088449912Z" level=info msg="TearDown network for sandbox \"2667df64706e8d1b6bd4eb93f79e0d8cd65a3f485ac3cd76b9174af9b4f5592a\" successfully" Sep 12 17:15:19.117204 containerd[1474]: time="2025-09-12T17:15:19.117120986Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"2667df64706e8d1b6bd4eb93f79e0d8cd65a3f485ac3cd76b9174af9b4f5592a\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Sep 12 17:15:19.117393 containerd[1474]: time="2025-09-12T17:15:19.117256544Z" level=info msg="RemovePodSandbox \"2667df64706e8d1b6bd4eb93f79e0d8cd65a3f485ac3cd76b9174af9b4f5592a\" returns successfully" Sep 12 17:15:19.121305 containerd[1474]: time="2025-09-12T17:15:19.120156587Z" level=info msg="StopPodSandbox for \"ce2cd271b4fe965a5586a193e0d10b93b9e9edf084b1af2e80f36b93949f8ed8\"" Sep 12 17:15:19.152463 systemd[1]: Started cri-containerd-76679aa24f79886c0295900602c5ab83eadcca86354f21554b48ead682f94eae.scope - libcontainer container 76679aa24f79886c0295900602c5ab83eadcca86354f21554b48ead682f94eae. Sep 12 17:15:19.277128 containerd[1474]: time="2025-09-12T17:15:19.276013237Z" level=info msg="StartContainer for \"76679aa24f79886c0295900602c5ab83eadcca86354f21554b48ead682f94eae\" returns successfully" Sep 12 17:15:19.373994 containerd[1474]: 2025-09-12 17:15:19.265 [WARNING][5181] cni-plugin/k8s.go 604: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="ce2cd271b4fe965a5586a193e0d10b93b9e9edf084b1af2e80f36b93949f8ed8" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--2--0999f1dc3d-k8s-calico--kube--controllers--6fff4fc9--pkdxs-eth0", GenerateName:"calico-kube-controllers-6fff4fc9-", Namespace:"calico-system", SelfLink:"", UID:"a51ebaed-92f6-4627-9397-05d2c27efa46", ResourceVersion:"1020", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 14, 47, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"6fff4fc9", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-2-0999f1dc3d", ContainerID:"1fda1c833dd0c298abd21be0cc5394b58067328fa4156066fc6617adad0b9cf5", Pod:"calico-kube-controllers-6fff4fc9-pkdxs", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.48.66/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali82790c92204", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:15:19.373994 containerd[1474]: 2025-09-12 17:15:19.271 [INFO][5181] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="ce2cd271b4fe965a5586a193e0d10b93b9e9edf084b1af2e80f36b93949f8ed8" Sep 12 17:15:19.373994 containerd[1474]: 2025-09-12 17:15:19.271 [INFO][5181] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="ce2cd271b4fe965a5586a193e0d10b93b9e9edf084b1af2e80f36b93949f8ed8" iface="eth0" netns="" Sep 12 17:15:19.373994 containerd[1474]: 2025-09-12 17:15:19.271 [INFO][5181] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="ce2cd271b4fe965a5586a193e0d10b93b9e9edf084b1af2e80f36b93949f8ed8" Sep 12 17:15:19.373994 containerd[1474]: 2025-09-12 17:15:19.271 [INFO][5181] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="ce2cd271b4fe965a5586a193e0d10b93b9e9edf084b1af2e80f36b93949f8ed8" Sep 12 17:15:19.373994 containerd[1474]: 2025-09-12 17:15:19.328 [INFO][5202] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="ce2cd271b4fe965a5586a193e0d10b93b9e9edf084b1af2e80f36b93949f8ed8" HandleID="k8s-pod-network.ce2cd271b4fe965a5586a193e0d10b93b9e9edf084b1af2e80f36b93949f8ed8" Workload="ci--4081--3--6--2--0999f1dc3d-k8s-calico--kube--controllers--6fff4fc9--pkdxs-eth0" Sep 12 17:15:19.373994 containerd[1474]: 2025-09-12 17:15:19.328 [INFO][5202] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 17:15:19.373994 containerd[1474]: 2025-09-12 17:15:19.328 [INFO][5202] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 12 17:15:19.373994 containerd[1474]: 2025-09-12 17:15:19.356 [WARNING][5202] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="ce2cd271b4fe965a5586a193e0d10b93b9e9edf084b1af2e80f36b93949f8ed8" HandleID="k8s-pod-network.ce2cd271b4fe965a5586a193e0d10b93b9e9edf084b1af2e80f36b93949f8ed8" Workload="ci--4081--3--6--2--0999f1dc3d-k8s-calico--kube--controllers--6fff4fc9--pkdxs-eth0" Sep 12 17:15:19.373994 containerd[1474]: 2025-09-12 17:15:19.358 [INFO][5202] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="ce2cd271b4fe965a5586a193e0d10b93b9e9edf084b1af2e80f36b93949f8ed8" HandleID="k8s-pod-network.ce2cd271b4fe965a5586a193e0d10b93b9e9edf084b1af2e80f36b93949f8ed8" Workload="ci--4081--3--6--2--0999f1dc3d-k8s-calico--kube--controllers--6fff4fc9--pkdxs-eth0" Sep 12 17:15:19.373994 containerd[1474]: 2025-09-12 17:15:19.364 [INFO][5202] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 17:15:19.373994 containerd[1474]: 2025-09-12 17:15:19.370 [INFO][5181] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="ce2cd271b4fe965a5586a193e0d10b93b9e9edf084b1af2e80f36b93949f8ed8" Sep 12 17:15:19.374564 containerd[1474]: time="2025-09-12T17:15:19.374059144Z" level=info msg="TearDown network for sandbox \"ce2cd271b4fe965a5586a193e0d10b93b9e9edf084b1af2e80f36b93949f8ed8\" successfully" Sep 12 17:15:19.374564 containerd[1474]: time="2025-09-12T17:15:19.374112824Z" level=info msg="StopPodSandbox for \"ce2cd271b4fe965a5586a193e0d10b93b9e9edf084b1af2e80f36b93949f8ed8\" returns successfully" Sep 12 17:15:19.376320 containerd[1474]: time="2025-09-12T17:15:19.376224957Z" level=info msg="RemovePodSandbox for \"ce2cd271b4fe965a5586a193e0d10b93b9e9edf084b1af2e80f36b93949f8ed8\"" Sep 12 17:15:19.376320 containerd[1474]: time="2025-09-12T17:15:19.376312276Z" level=info msg="Forcibly stopping sandbox \"ce2cd271b4fe965a5586a193e0d10b93b9e9edf084b1af2e80f36b93949f8ed8\"" Sep 12 17:15:19.526755 containerd[1474]: 2025-09-12 17:15:19.440 [WARNING][5220] cni-plugin/k8s.go 604: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="ce2cd271b4fe965a5586a193e0d10b93b9e9edf084b1af2e80f36b93949f8ed8" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--2--0999f1dc3d-k8s-calico--kube--controllers--6fff4fc9--pkdxs-eth0", GenerateName:"calico-kube-controllers-6fff4fc9-", Namespace:"calico-system", SelfLink:"", UID:"a51ebaed-92f6-4627-9397-05d2c27efa46", ResourceVersion:"1020", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 14, 47, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"6fff4fc9", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-2-0999f1dc3d", ContainerID:"1fda1c833dd0c298abd21be0cc5394b58067328fa4156066fc6617adad0b9cf5", Pod:"calico-kube-controllers-6fff4fc9-pkdxs", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.48.66/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali82790c92204", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:15:19.526755 containerd[1474]: 2025-09-12 17:15:19.441 [INFO][5220] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="ce2cd271b4fe965a5586a193e0d10b93b9e9edf084b1af2e80f36b93949f8ed8" Sep 12 17:15:19.526755 containerd[1474]: 2025-09-12 17:15:19.441 [INFO][5220] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="ce2cd271b4fe965a5586a193e0d10b93b9e9edf084b1af2e80f36b93949f8ed8" iface="eth0" netns="" Sep 12 17:15:19.526755 containerd[1474]: 2025-09-12 17:15:19.441 [INFO][5220] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="ce2cd271b4fe965a5586a193e0d10b93b9e9edf084b1af2e80f36b93949f8ed8" Sep 12 17:15:19.526755 containerd[1474]: 2025-09-12 17:15:19.441 [INFO][5220] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="ce2cd271b4fe965a5586a193e0d10b93b9e9edf084b1af2e80f36b93949f8ed8" Sep 12 17:15:19.526755 containerd[1474]: 2025-09-12 17:15:19.483 [INFO][5227] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="ce2cd271b4fe965a5586a193e0d10b93b9e9edf084b1af2e80f36b93949f8ed8" HandleID="k8s-pod-network.ce2cd271b4fe965a5586a193e0d10b93b9e9edf084b1af2e80f36b93949f8ed8" Workload="ci--4081--3--6--2--0999f1dc3d-k8s-calico--kube--controllers--6fff4fc9--pkdxs-eth0" Sep 12 17:15:19.526755 containerd[1474]: 2025-09-12 17:15:19.485 [INFO][5227] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 17:15:19.526755 containerd[1474]: 2025-09-12 17:15:19.485 [INFO][5227] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 12 17:15:19.526755 containerd[1474]: 2025-09-12 17:15:19.510 [WARNING][5227] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="ce2cd271b4fe965a5586a193e0d10b93b9e9edf084b1af2e80f36b93949f8ed8" HandleID="k8s-pod-network.ce2cd271b4fe965a5586a193e0d10b93b9e9edf084b1af2e80f36b93949f8ed8" Workload="ci--4081--3--6--2--0999f1dc3d-k8s-calico--kube--controllers--6fff4fc9--pkdxs-eth0" Sep 12 17:15:19.526755 containerd[1474]: 2025-09-12 17:15:19.511 [INFO][5227] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="ce2cd271b4fe965a5586a193e0d10b93b9e9edf084b1af2e80f36b93949f8ed8" HandleID="k8s-pod-network.ce2cd271b4fe965a5586a193e0d10b93b9e9edf084b1af2e80f36b93949f8ed8" Workload="ci--4081--3--6--2--0999f1dc3d-k8s-calico--kube--controllers--6fff4fc9--pkdxs-eth0" Sep 12 17:15:19.526755 containerd[1474]: 2025-09-12 17:15:19.514 [INFO][5227] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 17:15:19.526755 containerd[1474]: 2025-09-12 17:15:19.519 [INFO][5220] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="ce2cd271b4fe965a5586a193e0d10b93b9e9edf084b1af2e80f36b93949f8ed8" Sep 12 17:15:19.526755 containerd[1474]: time="2025-09-12T17:15:19.526668315Z" level=info msg="TearDown network for sandbox \"ce2cd271b4fe965a5586a193e0d10b93b9e9edf084b1af2e80f36b93949f8ed8\" successfully" Sep 12 17:15:19.532043 containerd[1474]: time="2025-09-12T17:15:19.531252137Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"ce2cd271b4fe965a5586a193e0d10b93b9e9edf084b1af2e80f36b93949f8ed8\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Sep 12 17:15:19.532043 containerd[1474]: time="2025-09-12T17:15:19.531364775Z" level=info msg="RemovePodSandbox \"ce2cd271b4fe965a5586a193e0d10b93b9e9edf084b1af2e80f36b93949f8ed8\" returns successfully" Sep 12 17:15:19.534089 containerd[1474]: time="2025-09-12T17:15:19.533782625Z" level=info msg="StopPodSandbox for \"fc4688e0616bdad68c47e9ca8a2cf2f34b3b14bb5c0411451690ef4df39a3ca7\"" Sep 12 17:15:19.714821 containerd[1474]: 2025-09-12 17:15:19.637 [WARNING][5242] cni-plugin/k8s.go 598: WorkloadEndpoint does not exist in the datastore, moving forward with the clean up ContainerID="fc4688e0616bdad68c47e9ca8a2cf2f34b3b14bb5c0411451690ef4df39a3ca7" WorkloadEndpoint="ci--4081--3--6--2--0999f1dc3d-k8s-whisker--796fc4c7cd--sr5c6-eth0" Sep 12 17:15:19.714821 containerd[1474]: 2025-09-12 17:15:19.637 [INFO][5242] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="fc4688e0616bdad68c47e9ca8a2cf2f34b3b14bb5c0411451690ef4df39a3ca7" Sep 12 17:15:19.714821 containerd[1474]: 2025-09-12 17:15:19.637 [INFO][5242] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. 
ContainerID="fc4688e0616bdad68c47e9ca8a2cf2f34b3b14bb5c0411451690ef4df39a3ca7" iface="eth0" netns="" Sep 12 17:15:19.714821 containerd[1474]: 2025-09-12 17:15:19.637 [INFO][5242] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="fc4688e0616bdad68c47e9ca8a2cf2f34b3b14bb5c0411451690ef4df39a3ca7" Sep 12 17:15:19.714821 containerd[1474]: 2025-09-12 17:15:19.637 [INFO][5242] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="fc4688e0616bdad68c47e9ca8a2cf2f34b3b14bb5c0411451690ef4df39a3ca7" Sep 12 17:15:19.714821 containerd[1474]: 2025-09-12 17:15:19.681 [INFO][5249] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="fc4688e0616bdad68c47e9ca8a2cf2f34b3b14bb5c0411451690ef4df39a3ca7" HandleID="k8s-pod-network.fc4688e0616bdad68c47e9ca8a2cf2f34b3b14bb5c0411451690ef4df39a3ca7" Workload="ci--4081--3--6--2--0999f1dc3d-k8s-whisker--796fc4c7cd--sr5c6-eth0" Sep 12 17:15:19.714821 containerd[1474]: 2025-09-12 17:15:19.682 [INFO][5249] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 17:15:19.714821 containerd[1474]: 2025-09-12 17:15:19.682 [INFO][5249] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 12 17:15:19.714821 containerd[1474]: 2025-09-12 17:15:19.704 [WARNING][5249] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="fc4688e0616bdad68c47e9ca8a2cf2f34b3b14bb5c0411451690ef4df39a3ca7" HandleID="k8s-pod-network.fc4688e0616bdad68c47e9ca8a2cf2f34b3b14bb5c0411451690ef4df39a3ca7" Workload="ci--4081--3--6--2--0999f1dc3d-k8s-whisker--796fc4c7cd--sr5c6-eth0" Sep 12 17:15:19.714821 containerd[1474]: 2025-09-12 17:15:19.704 [INFO][5249] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="fc4688e0616bdad68c47e9ca8a2cf2f34b3b14bb5c0411451690ef4df39a3ca7" HandleID="k8s-pod-network.fc4688e0616bdad68c47e9ca8a2cf2f34b3b14bb5c0411451690ef4df39a3ca7" Workload="ci--4081--3--6--2--0999f1dc3d-k8s-whisker--796fc4c7cd--sr5c6-eth0" Sep 12 17:15:19.714821 containerd[1474]: 2025-09-12 17:15:19.709 [INFO][5249] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 17:15:19.714821 containerd[1474]: 2025-09-12 17:15:19.712 [INFO][5242] cni-plugin/k8s.go 653: Teardown processing complete. 
ContainerID="fc4688e0616bdad68c47e9ca8a2cf2f34b3b14bb5c0411451690ef4df39a3ca7" Sep 12 17:15:19.717968 containerd[1474]: time="2025-09-12T17:15:19.714882072Z" level=info msg="TearDown network for sandbox \"fc4688e0616bdad68c47e9ca8a2cf2f34b3b14bb5c0411451690ef4df39a3ca7\" successfully" Sep 12 17:15:19.717968 containerd[1474]: time="2025-09-12T17:15:19.714945231Z" level=info msg="StopPodSandbox for \"fc4688e0616bdad68c47e9ca8a2cf2f34b3b14bb5c0411451690ef4df39a3ca7\" returns successfully" Sep 12 17:15:19.717968 containerd[1474]: time="2025-09-12T17:15:19.717446399Z" level=info msg="RemovePodSandbox for \"fc4688e0616bdad68c47e9ca8a2cf2f34b3b14bb5c0411451690ef4df39a3ca7\"" Sep 12 17:15:19.717968 containerd[1474]: time="2025-09-12T17:15:19.717509878Z" level=info msg="Forcibly stopping sandbox \"fc4688e0616bdad68c47e9ca8a2cf2f34b3b14bb5c0411451690ef4df39a3ca7\"" Sep 12 17:15:19.841142 containerd[1474]: 2025-09-12 17:15:19.782 [WARNING][5263] cni-plugin/k8s.go 598: WorkloadEndpoint does not exist in the datastore, moving forward with the clean up ContainerID="fc4688e0616bdad68c47e9ca8a2cf2f34b3b14bb5c0411451690ef4df39a3ca7" WorkloadEndpoint="ci--4081--3--6--2--0999f1dc3d-k8s-whisker--796fc4c7cd--sr5c6-eth0" Sep 12 17:15:19.841142 containerd[1474]: 2025-09-12 17:15:19.782 [INFO][5263] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="fc4688e0616bdad68c47e9ca8a2cf2f34b3b14bb5c0411451690ef4df39a3ca7" Sep 12 17:15:19.841142 containerd[1474]: 2025-09-12 17:15:19.782 [INFO][5263] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="fc4688e0616bdad68c47e9ca8a2cf2f34b3b14bb5c0411451690ef4df39a3ca7" iface="eth0" netns="" Sep 12 17:15:19.841142 containerd[1474]: 2025-09-12 17:15:19.782 [INFO][5263] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="fc4688e0616bdad68c47e9ca8a2cf2f34b3b14bb5c0411451690ef4df39a3ca7" Sep 12 17:15:19.841142 containerd[1474]: 2025-09-12 17:15:19.782 [INFO][5263] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="fc4688e0616bdad68c47e9ca8a2cf2f34b3b14bb5c0411451690ef4df39a3ca7" Sep 12 17:15:19.841142 containerd[1474]: 2025-09-12 17:15:19.818 [INFO][5270] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="fc4688e0616bdad68c47e9ca8a2cf2f34b3b14bb5c0411451690ef4df39a3ca7" HandleID="k8s-pod-network.fc4688e0616bdad68c47e9ca8a2cf2f34b3b14bb5c0411451690ef4df39a3ca7" Workload="ci--4081--3--6--2--0999f1dc3d-k8s-whisker--796fc4c7cd--sr5c6-eth0" Sep 12 17:15:19.841142 containerd[1474]: 2025-09-12 17:15:19.818 [INFO][5270] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 17:15:19.841142 containerd[1474]: 2025-09-12 17:15:19.818 [INFO][5270] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 12 17:15:19.841142 containerd[1474]: 2025-09-12 17:15:19.832 [WARNING][5270] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="fc4688e0616bdad68c47e9ca8a2cf2f34b3b14bb5c0411451690ef4df39a3ca7" HandleID="k8s-pod-network.fc4688e0616bdad68c47e9ca8a2cf2f34b3b14bb5c0411451690ef4df39a3ca7" Workload="ci--4081--3--6--2--0999f1dc3d-k8s-whisker--796fc4c7cd--sr5c6-eth0" Sep 12 17:15:19.841142 containerd[1474]: 2025-09-12 17:15:19.832 [INFO][5270] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="fc4688e0616bdad68c47e9ca8a2cf2f34b3b14bb5c0411451690ef4df39a3ca7" HandleID="k8s-pod-network.fc4688e0616bdad68c47e9ca8a2cf2f34b3b14bb5c0411451690ef4df39a3ca7" Workload="ci--4081--3--6--2--0999f1dc3d-k8s-whisker--796fc4c7cd--sr5c6-eth0" Sep 12 17:15:19.841142 containerd[1474]: 2025-09-12 17:15:19.835 [INFO][5270] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 17:15:19.841142 containerd[1474]: 2025-09-12 17:15:19.838 [INFO][5263] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="fc4688e0616bdad68c47e9ca8a2cf2f34b3b14bb5c0411451690ef4df39a3ca7" Sep 12 17:15:19.842274 containerd[1474]: time="2025-09-12T17:15:19.841524534Z" level=info msg="TearDown network for sandbox \"fc4688e0616bdad68c47e9ca8a2cf2f34b3b14bb5c0411451690ef4df39a3ca7\" successfully" Sep 12 17:15:19.847855 containerd[1474]: time="2025-09-12T17:15:19.847587897Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"fc4688e0616bdad68c47e9ca8a2cf2f34b3b14bb5c0411451690ef4df39a3ca7\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Sep 12 17:15:19.847855 containerd[1474]: time="2025-09-12T17:15:19.847698095Z" level=info msg="RemovePodSandbox \"fc4688e0616bdad68c47e9ca8a2cf2f34b3b14bb5c0411451690ef4df39a3ca7\" returns successfully" Sep 12 17:15:19.849630 containerd[1474]: time="2025-09-12T17:15:19.849303835Z" level=info msg="StopPodSandbox for \"de20c281e9be8ceb5ac227d9407ee1c1a9f9de1e010ed5f1af1dde4010b0c2d7\"" Sep 12 17:15:19.988440 containerd[1474]: 2025-09-12 17:15:19.918 [WARNING][5284] cni-plugin/k8s.go 604: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="de20c281e9be8ceb5ac227d9407ee1c1a9f9de1e010ed5f1af1dde4010b0c2d7" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--2--0999f1dc3d-k8s-calico--apiserver--7df9fb7db--kpktm-eth0", GenerateName:"calico-apiserver-7df9fb7db-", Namespace:"calico-apiserver", SelfLink:"", UID:"0d01a44b-0670-4021-80bc-06b979ee8824", ResourceVersion:"997", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 14, 41, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"7df9fb7db", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-2-0999f1dc3d", ContainerID:"36cb49f4ec250ee9635802c5d120c6cb2059151fbfd4b1779672f177c7ef3604", Pod:"calico-apiserver-7df9fb7db-kpktm", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.48.72/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calif6b4ab77080", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:15:19.988440 containerd[1474]: 2025-09-12 17:15:19.919 [INFO][5284] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="de20c281e9be8ceb5ac227d9407ee1c1a9f9de1e010ed5f1af1dde4010b0c2d7" Sep 12 17:15:19.988440 containerd[1474]: 2025-09-12 17:15:19.919 [INFO][5284] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="de20c281e9be8ceb5ac227d9407ee1c1a9f9de1e010ed5f1af1dde4010b0c2d7" iface="eth0" netns="" Sep 12 17:15:19.988440 containerd[1474]: 2025-09-12 17:15:19.919 [INFO][5284] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="de20c281e9be8ceb5ac227d9407ee1c1a9f9de1e010ed5f1af1dde4010b0c2d7" Sep 12 17:15:19.988440 containerd[1474]: 2025-09-12 17:15:19.919 [INFO][5284] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="de20c281e9be8ceb5ac227d9407ee1c1a9f9de1e010ed5f1af1dde4010b0c2d7" Sep 12 17:15:19.988440 containerd[1474]: 2025-09-12 17:15:19.958 [INFO][5291] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="de20c281e9be8ceb5ac227d9407ee1c1a9f9de1e010ed5f1af1dde4010b0c2d7" HandleID="k8s-pod-network.de20c281e9be8ceb5ac227d9407ee1c1a9f9de1e010ed5f1af1dde4010b0c2d7" Workload="ci--4081--3--6--2--0999f1dc3d-k8s-calico--apiserver--7df9fb7db--kpktm-eth0" Sep 12 17:15:19.988440 containerd[1474]: 2025-09-12 17:15:19.960 [INFO][5291] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 17:15:19.988440 containerd[1474]: 2025-09-12 17:15:19.960 [INFO][5291] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 12 17:15:19.988440 containerd[1474]: 2025-09-12 17:15:19.979 [WARNING][5291] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="de20c281e9be8ceb5ac227d9407ee1c1a9f9de1e010ed5f1af1dde4010b0c2d7" HandleID="k8s-pod-network.de20c281e9be8ceb5ac227d9407ee1c1a9f9de1e010ed5f1af1dde4010b0c2d7" Workload="ci--4081--3--6--2--0999f1dc3d-k8s-calico--apiserver--7df9fb7db--kpktm-eth0" Sep 12 17:15:19.988440 containerd[1474]: 2025-09-12 17:15:19.980 [INFO][5291] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="de20c281e9be8ceb5ac227d9407ee1c1a9f9de1e010ed5f1af1dde4010b0c2d7" HandleID="k8s-pod-network.de20c281e9be8ceb5ac227d9407ee1c1a9f9de1e010ed5f1af1dde4010b0c2d7" Workload="ci--4081--3--6--2--0999f1dc3d-k8s-calico--apiserver--7df9fb7db--kpktm-eth0" Sep 12 17:15:19.988440 containerd[1474]: 2025-09-12 17:15:19.983 [INFO][5291] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 17:15:19.988440 containerd[1474]: 2025-09-12 17:15:19.986 [INFO][5284] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="de20c281e9be8ceb5ac227d9407ee1c1a9f9de1e010ed5f1af1dde4010b0c2d7" Sep 12 17:15:19.990026 containerd[1474]: time="2025-09-12T17:15:19.989240288Z" level=info msg="TearDown network for sandbox \"de20c281e9be8ceb5ac227d9407ee1c1a9f9de1e010ed5f1af1dde4010b0c2d7\" successfully" Sep 12 17:15:19.990026 containerd[1474]: time="2025-09-12T17:15:19.989312047Z" level=info msg="StopPodSandbox for \"de20c281e9be8ceb5ac227d9407ee1c1a9f9de1e010ed5f1af1dde4010b0c2d7\" returns successfully" Sep 12 17:15:19.990379 containerd[1474]: time="2025-09-12T17:15:19.990198555Z" level=info msg="RemovePodSandbox for \"de20c281e9be8ceb5ac227d9407ee1c1a9f9de1e010ed5f1af1dde4010b0c2d7\"" Sep 12 17:15:19.990379 containerd[1474]: time="2025-09-12T17:15:19.990240395Z" level=info msg="Forcibly stopping sandbox \"de20c281e9be8ceb5ac227d9407ee1c1a9f9de1e010ed5f1af1dde4010b0c2d7\"" Sep 12 17:15:20.141006 containerd[1474]: 2025-09-12 17:15:20.053 [WARNING][5305] cni-plugin/k8s.go 604: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="de20c281e9be8ceb5ac227d9407ee1c1a9f9de1e010ed5f1af1dde4010b0c2d7" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--2--0999f1dc3d-k8s-calico--apiserver--7df9fb7db--kpktm-eth0", GenerateName:"calico-apiserver-7df9fb7db-", Namespace:"calico-apiserver", SelfLink:"", UID:"0d01a44b-0670-4021-80bc-06b979ee8824", ResourceVersion:"997", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 14, 41, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"7df9fb7db", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-2-0999f1dc3d", ContainerID:"36cb49f4ec250ee9635802c5d120c6cb2059151fbfd4b1779672f177c7ef3604", Pod:"calico-apiserver-7df9fb7db-kpktm", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.48.72/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calif6b4ab77080", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:15:20.141006 containerd[1474]: 2025-09-12 17:15:20.054 [INFO][5305] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="de20c281e9be8ceb5ac227d9407ee1c1a9f9de1e010ed5f1af1dde4010b0c2d7" Sep 12 17:15:20.141006 containerd[1474]: 2025-09-12 17:15:20.054 [INFO][5305] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="de20c281e9be8ceb5ac227d9407ee1c1a9f9de1e010ed5f1af1dde4010b0c2d7" iface="eth0" netns="" Sep 12 17:15:20.141006 containerd[1474]: 2025-09-12 17:15:20.054 [INFO][5305] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="de20c281e9be8ceb5ac227d9407ee1c1a9f9de1e010ed5f1af1dde4010b0c2d7" Sep 12 17:15:20.141006 containerd[1474]: 2025-09-12 17:15:20.054 [INFO][5305] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="de20c281e9be8ceb5ac227d9407ee1c1a9f9de1e010ed5f1af1dde4010b0c2d7" Sep 12 17:15:20.141006 containerd[1474]: 2025-09-12 17:15:20.106 [INFO][5312] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="de20c281e9be8ceb5ac227d9407ee1c1a9f9de1e010ed5f1af1dde4010b0c2d7" HandleID="k8s-pod-network.de20c281e9be8ceb5ac227d9407ee1c1a9f9de1e010ed5f1af1dde4010b0c2d7" Workload="ci--4081--3--6--2--0999f1dc3d-k8s-calico--apiserver--7df9fb7db--kpktm-eth0" Sep 12 17:15:20.141006 containerd[1474]: 2025-09-12 17:15:20.107 [INFO][5312] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 17:15:20.141006 containerd[1474]: 2025-09-12 17:15:20.107 [INFO][5312] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 12 17:15:20.141006 containerd[1474]: 2025-09-12 17:15:20.126 [WARNING][5312] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="de20c281e9be8ceb5ac227d9407ee1c1a9f9de1e010ed5f1af1dde4010b0c2d7" HandleID="k8s-pod-network.de20c281e9be8ceb5ac227d9407ee1c1a9f9de1e010ed5f1af1dde4010b0c2d7" Workload="ci--4081--3--6--2--0999f1dc3d-k8s-calico--apiserver--7df9fb7db--kpktm-eth0" Sep 12 17:15:20.141006 containerd[1474]: 2025-09-12 17:15:20.126 [INFO][5312] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="de20c281e9be8ceb5ac227d9407ee1c1a9f9de1e010ed5f1af1dde4010b0c2d7" HandleID="k8s-pod-network.de20c281e9be8ceb5ac227d9407ee1c1a9f9de1e010ed5f1af1dde4010b0c2d7" Workload="ci--4081--3--6--2--0999f1dc3d-k8s-calico--apiserver--7df9fb7db--kpktm-eth0" Sep 12 17:15:20.141006 containerd[1474]: 2025-09-12 17:15:20.130 [INFO][5312] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 17:15:20.141006 containerd[1474]: 2025-09-12 17:15:20.137 [INFO][5305] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="de20c281e9be8ceb5ac227d9407ee1c1a9f9de1e010ed5f1af1dde4010b0c2d7" Sep 12 17:15:20.141006 containerd[1474]: time="2025-09-12T17:15:20.140959733Z" level=info msg="TearDown network for sandbox \"de20c281e9be8ceb5ac227d9407ee1c1a9f9de1e010ed5f1af1dde4010b0c2d7\" successfully" Sep 12 17:15:20.150316 containerd[1474]: time="2025-09-12T17:15:20.148718275Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"de20c281e9be8ceb5ac227d9407ee1c1a9f9de1e010ed5f1af1dde4010b0c2d7\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Sep 12 17:15:20.150316 containerd[1474]: time="2025-09-12T17:15:20.149000472Z" level=info msg="RemovePodSandbox \"de20c281e9be8ceb5ac227d9407ee1c1a9f9de1e010ed5f1af1dde4010b0c2d7\" returns successfully" Sep 12 17:15:20.151447 containerd[1474]: time="2025-09-12T17:15:20.150691170Z" level=info msg="StopPodSandbox for \"13b1e6e1079aeb74408a4c106a5914b00d680c225078fed150adc2cc99d95c7a\"" Sep 12 17:15:20.344374 containerd[1474]: 2025-09-12 17:15:20.247 [WARNING][5326] cni-plugin/k8s.go 604: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="13b1e6e1079aeb74408a4c106a5914b00d680c225078fed150adc2cc99d95c7a" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--2--0999f1dc3d-k8s-goldmane--54d579b49d--fnsjh-eth0", GenerateName:"goldmane-54d579b49d-", Namespace:"calico-system", SelfLink:"", UID:"3b22a780-bb41-4d2d-9dad-2beb095f9e2c", ResourceVersion:"935", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 14, 47, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"54d579b49d", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-2-0999f1dc3d", ContainerID:"591a64a3fdf996e8c3d680f0698e12606385018435cee65b50db99b9a3df99dc", Pod:"goldmane-54d579b49d-fnsjh", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.48.68/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"calia8e96e15d55", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:15:20.344374 containerd[1474]: 2025-09-12 17:15:20.248 [INFO][5326] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="13b1e6e1079aeb74408a4c106a5914b00d680c225078fed150adc2cc99d95c7a" Sep 12 17:15:20.344374 containerd[1474]: 2025-09-12 17:15:20.248 [INFO][5326] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="13b1e6e1079aeb74408a4c106a5914b00d680c225078fed150adc2cc99d95c7a" iface="eth0" netns="" Sep 12 17:15:20.344374 containerd[1474]: 2025-09-12 17:15:20.248 [INFO][5326] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="13b1e6e1079aeb74408a4c106a5914b00d680c225078fed150adc2cc99d95c7a" Sep 12 17:15:20.344374 containerd[1474]: 2025-09-12 17:15:20.248 [INFO][5326] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="13b1e6e1079aeb74408a4c106a5914b00d680c225078fed150adc2cc99d95c7a" Sep 12 17:15:20.344374 containerd[1474]: 2025-09-12 17:15:20.286 [INFO][5334] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="13b1e6e1079aeb74408a4c106a5914b00d680c225078fed150adc2cc99d95c7a" HandleID="k8s-pod-network.13b1e6e1079aeb74408a4c106a5914b00d680c225078fed150adc2cc99d95c7a" Workload="ci--4081--3--6--2--0999f1dc3d-k8s-goldmane--54d579b49d--fnsjh-eth0" Sep 12 17:15:20.344374 containerd[1474]: 2025-09-12 17:15:20.286 [INFO][5334] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 17:15:20.344374 containerd[1474]: 2025-09-12 17:15:20.286 [INFO][5334] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 12 17:15:20.344374 containerd[1474]: 2025-09-12 17:15:20.311 [WARNING][5334] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="13b1e6e1079aeb74408a4c106a5914b00d680c225078fed150adc2cc99d95c7a" HandleID="k8s-pod-network.13b1e6e1079aeb74408a4c106a5914b00d680c225078fed150adc2cc99d95c7a" Workload="ci--4081--3--6--2--0999f1dc3d-k8s-goldmane--54d579b49d--fnsjh-eth0" Sep 12 17:15:20.344374 containerd[1474]: 2025-09-12 17:15:20.311 [INFO][5334] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="13b1e6e1079aeb74408a4c106a5914b00d680c225078fed150adc2cc99d95c7a" HandleID="k8s-pod-network.13b1e6e1079aeb74408a4c106a5914b00d680c225078fed150adc2cc99d95c7a" Workload="ci--4081--3--6--2--0999f1dc3d-k8s-goldmane--54d579b49d--fnsjh-eth0" Sep 12 17:15:20.344374 containerd[1474]: 2025-09-12 17:15:20.332 [INFO][5334] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 17:15:20.344374 containerd[1474]: 2025-09-12 17:15:20.338 [INFO][5326] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="13b1e6e1079aeb74408a4c106a5914b00d680c225078fed150adc2cc99d95c7a" Sep 12 17:15:20.348108 containerd[1474]: time="2025-09-12T17:15:20.344610166Z" level=info msg="TearDown network for sandbox \"13b1e6e1079aeb74408a4c106a5914b00d680c225078fed150adc2cc99d95c7a\" successfully" Sep 12 17:15:20.348108 containerd[1474]: time="2025-09-12T17:15:20.347686007Z" level=info msg="StopPodSandbox for \"13b1e6e1079aeb74408a4c106a5914b00d680c225078fed150adc2cc99d95c7a\" returns successfully" Sep 12 17:15:20.350541 containerd[1474]: time="2025-09-12T17:15:20.350409253Z" level=info msg="RemovePodSandbox for \"13b1e6e1079aeb74408a4c106a5914b00d680c225078fed150adc2cc99d95c7a\"" Sep 12 17:15:20.350541 containerd[1474]: time="2025-09-12T17:15:20.350547771Z" level=info msg="Forcibly stopping sandbox \"13b1e6e1079aeb74408a4c106a5914b00d680c225078fed150adc2cc99d95c7a\"" Sep 12 17:15:20.595958 containerd[1474]: 2025-09-12 17:15:20.497 [WARNING][5348] cni-plugin/k8s.go 604: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="13b1e6e1079aeb74408a4c106a5914b00d680c225078fed150adc2cc99d95c7a" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--2--0999f1dc3d-k8s-goldmane--54d579b49d--fnsjh-eth0", GenerateName:"goldmane-54d579b49d-", Namespace:"calico-system", SelfLink:"", UID:"3b22a780-bb41-4d2d-9dad-2beb095f9e2c", ResourceVersion:"935", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 14, 47, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"54d579b49d", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-2-0999f1dc3d", ContainerID:"591a64a3fdf996e8c3d680f0698e12606385018435cee65b50db99b9a3df99dc", Pod:"goldmane-54d579b49d-fnsjh", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.48.68/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"calia8e96e15d55", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:15:20.595958 containerd[1474]: 2025-09-12 17:15:20.503 [INFO][5348] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="13b1e6e1079aeb74408a4c106a5914b00d680c225078fed150adc2cc99d95c7a" Sep 12 17:15:20.595958 containerd[1474]: 2025-09-12 17:15:20.503 [INFO][5348] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="13b1e6e1079aeb74408a4c106a5914b00d680c225078fed150adc2cc99d95c7a" iface="eth0" netns="" Sep 12 17:15:20.595958 containerd[1474]: 2025-09-12 17:15:20.503 [INFO][5348] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="13b1e6e1079aeb74408a4c106a5914b00d680c225078fed150adc2cc99d95c7a" Sep 12 17:15:20.595958 containerd[1474]: 2025-09-12 17:15:20.503 [INFO][5348] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="13b1e6e1079aeb74408a4c106a5914b00d680c225078fed150adc2cc99d95c7a" Sep 12 17:15:20.595958 containerd[1474]: 2025-09-12 17:15:20.555 [INFO][5356] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="13b1e6e1079aeb74408a4c106a5914b00d680c225078fed150adc2cc99d95c7a" HandleID="k8s-pod-network.13b1e6e1079aeb74408a4c106a5914b00d680c225078fed150adc2cc99d95c7a" Workload="ci--4081--3--6--2--0999f1dc3d-k8s-goldmane--54d579b49d--fnsjh-eth0" Sep 12 17:15:20.595958 containerd[1474]: 2025-09-12 17:15:20.555 [INFO][5356] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 17:15:20.595958 containerd[1474]: 2025-09-12 17:15:20.555 [INFO][5356] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 12 17:15:20.595958 containerd[1474]: 2025-09-12 17:15:20.576 [WARNING][5356] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="13b1e6e1079aeb74408a4c106a5914b00d680c225078fed150adc2cc99d95c7a" HandleID="k8s-pod-network.13b1e6e1079aeb74408a4c106a5914b00d680c225078fed150adc2cc99d95c7a" Workload="ci--4081--3--6--2--0999f1dc3d-k8s-goldmane--54d579b49d--fnsjh-eth0" Sep 12 17:15:20.595958 containerd[1474]: 2025-09-12 17:15:20.577 [INFO][5356] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="13b1e6e1079aeb74408a4c106a5914b00d680c225078fed150adc2cc99d95c7a" HandleID="k8s-pod-network.13b1e6e1079aeb74408a4c106a5914b00d680c225078fed150adc2cc99d95c7a" Workload="ci--4081--3--6--2--0999f1dc3d-k8s-goldmane--54d579b49d--fnsjh-eth0" Sep 12 17:15:20.595958 containerd[1474]: 2025-09-12 17:15:20.585 [INFO][5356] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 17:15:20.595958 containerd[1474]: 2025-09-12 17:15:20.587 [INFO][5348] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="13b1e6e1079aeb74408a4c106a5914b00d680c225078fed150adc2cc99d95c7a" Sep 12 17:15:20.595958 containerd[1474]: time="2025-09-12T17:15:20.593315111Z" level=info msg="TearDown network for sandbox \"13b1e6e1079aeb74408a4c106a5914b00d680c225078fed150adc2cc99d95c7a\" successfully" Sep 12 17:15:20.602244 containerd[1474]: time="2025-09-12T17:15:20.602154639Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"13b1e6e1079aeb74408a4c106a5914b00d680c225078fed150adc2cc99d95c7a\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Sep 12 17:15:20.602492 containerd[1474]: time="2025-09-12T17:15:20.602287158Z" level=info msg="RemovePodSandbox \"13b1e6e1079aeb74408a4c106a5914b00d680c225078fed150adc2cc99d95c7a\" returns successfully" Sep 12 17:15:20.603334 containerd[1474]: time="2025-09-12T17:15:20.602958149Z" level=info msg="StopPodSandbox for \"4144508f92b8377437d05d3e3f54b9c688ba69ea087844d5f28e60164de8e266\"" Sep 12 17:15:20.780620 containerd[1474]: 2025-09-12 17:15:20.688 [WARNING][5370] cni-plugin/k8s.go 604: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="4144508f92b8377437d05d3e3f54b9c688ba69ea087844d5f28e60164de8e266" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--2--0999f1dc3d-k8s-csi--node--driver--krpgf-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"7c11d9b3-8068-4598-8721-3a4e4f793c52", ResourceVersion:"932", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 14, 47, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"6c96d95cc7", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-2-0999f1dc3d", ContainerID:"3003849266b76e7168cdfde98b1fd3830e01312968a4c53470e7af0342164f4b", Pod:"csi-node-driver-krpgf", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.48.67/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali556a08f9179", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:15:20.780620 containerd[1474]: 2025-09-12 17:15:20.689 [INFO][5370] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="4144508f92b8377437d05d3e3f54b9c688ba69ea087844d5f28e60164de8e266" Sep 12 17:15:20.780620 containerd[1474]: 2025-09-12 17:15:20.689 [INFO][5370] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="4144508f92b8377437d05d3e3f54b9c688ba69ea087844d5f28e60164de8e266" iface="eth0" netns="" Sep 12 17:15:20.780620 containerd[1474]: 2025-09-12 17:15:20.689 [INFO][5370] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="4144508f92b8377437d05d3e3f54b9c688ba69ea087844d5f28e60164de8e266" Sep 12 17:15:20.780620 containerd[1474]: 2025-09-12 17:15:20.689 [INFO][5370] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="4144508f92b8377437d05d3e3f54b9c688ba69ea087844d5f28e60164de8e266" Sep 12 17:15:20.780620 containerd[1474]: 2025-09-12 17:15:20.745 [INFO][5377] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="4144508f92b8377437d05d3e3f54b9c688ba69ea087844d5f28e60164de8e266" HandleID="k8s-pod-network.4144508f92b8377437d05d3e3f54b9c688ba69ea087844d5f28e60164de8e266" Workload="ci--4081--3--6--2--0999f1dc3d-k8s-csi--node--driver--krpgf-eth0" Sep 12 17:15:20.780620 containerd[1474]: 2025-09-12 17:15:20.745 [INFO][5377] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 17:15:20.780620 containerd[1474]: 2025-09-12 17:15:20.745 [INFO][5377] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 12 17:15:20.780620 containerd[1474]: 2025-09-12 17:15:20.768 [WARNING][5377] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="4144508f92b8377437d05d3e3f54b9c688ba69ea087844d5f28e60164de8e266" HandleID="k8s-pod-network.4144508f92b8377437d05d3e3f54b9c688ba69ea087844d5f28e60164de8e266" Workload="ci--4081--3--6--2--0999f1dc3d-k8s-csi--node--driver--krpgf-eth0" Sep 12 17:15:20.780620 containerd[1474]: 2025-09-12 17:15:20.768 [INFO][5377] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="4144508f92b8377437d05d3e3f54b9c688ba69ea087844d5f28e60164de8e266" HandleID="k8s-pod-network.4144508f92b8377437d05d3e3f54b9c688ba69ea087844d5f28e60164de8e266" Workload="ci--4081--3--6--2--0999f1dc3d-k8s-csi--node--driver--krpgf-eth0" Sep 12 17:15:20.780620 containerd[1474]: 2025-09-12 17:15:20.774 [INFO][5377] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 17:15:20.780620 containerd[1474]: 2025-09-12 17:15:20.776 [INFO][5370] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="4144508f92b8377437d05d3e3f54b9c688ba69ea087844d5f28e60164de8e266" Sep 12 17:15:20.783648 containerd[1474]: time="2025-09-12T17:15:20.780631229Z" level=info msg="TearDown network for sandbox \"4144508f92b8377437d05d3e3f54b9c688ba69ea087844d5f28e60164de8e266\" successfully" Sep 12 17:15:20.783648 containerd[1474]: time="2025-09-12T17:15:20.780671429Z" level=info msg="StopPodSandbox for \"4144508f92b8377437d05d3e3f54b9c688ba69ea087844d5f28e60164de8e266\" returns successfully" Sep 12 17:15:20.783648 containerd[1474]: time="2025-09-12T17:15:20.782721923Z" level=info msg="RemovePodSandbox for \"4144508f92b8377437d05d3e3f54b9c688ba69ea087844d5f28e60164de8e266\"" Sep 12 17:15:20.783648 containerd[1474]: time="2025-09-12T17:15:20.782811482Z" level=info msg="Forcibly stopping sandbox \"4144508f92b8377437d05d3e3f54b9c688ba69ea087844d5f28e60164de8e266\"" Sep 12 17:15:21.035862 containerd[1474]: 2025-09-12 17:15:20.881 [WARNING][5394] cni-plugin/k8s.go 604: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="4144508f92b8377437d05d3e3f54b9c688ba69ea087844d5f28e60164de8e266" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--2--0999f1dc3d-k8s-csi--node--driver--krpgf-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"7c11d9b3-8068-4598-8721-3a4e4f793c52", ResourceVersion:"932", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 14, 47, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"6c96d95cc7", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-2-0999f1dc3d", ContainerID:"3003849266b76e7168cdfde98b1fd3830e01312968a4c53470e7af0342164f4b", Pod:"csi-node-driver-krpgf", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.48.67/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali556a08f9179", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:15:21.035862 containerd[1474]: 2025-09-12 17:15:20.881 [INFO][5394] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="4144508f92b8377437d05d3e3f54b9c688ba69ea087844d5f28e60164de8e266" Sep 12 17:15:21.035862 containerd[1474]: 2025-09-12 17:15:20.881 [INFO][5394] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="4144508f92b8377437d05d3e3f54b9c688ba69ea087844d5f28e60164de8e266" iface="eth0" netns="" Sep 12 17:15:21.035862 containerd[1474]: 2025-09-12 17:15:20.881 [INFO][5394] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="4144508f92b8377437d05d3e3f54b9c688ba69ea087844d5f28e60164de8e266" Sep 12 17:15:21.035862 containerd[1474]: 2025-09-12 17:15:20.881 [INFO][5394] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="4144508f92b8377437d05d3e3f54b9c688ba69ea087844d5f28e60164de8e266" Sep 12 17:15:21.035862 containerd[1474]: 2025-09-12 17:15:20.987 [INFO][5406] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="4144508f92b8377437d05d3e3f54b9c688ba69ea087844d5f28e60164de8e266" HandleID="k8s-pod-network.4144508f92b8377437d05d3e3f54b9c688ba69ea087844d5f28e60164de8e266" Workload="ci--4081--3--6--2--0999f1dc3d-k8s-csi--node--driver--krpgf-eth0" Sep 12 17:15:21.035862 containerd[1474]: 2025-09-12 17:15:20.987 [INFO][5406] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 17:15:21.035862 containerd[1474]: 2025-09-12 17:15:20.987 [INFO][5406] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 12 17:15:21.035862 containerd[1474]: 2025-09-12 17:15:21.010 [WARNING][5406] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="4144508f92b8377437d05d3e3f54b9c688ba69ea087844d5f28e60164de8e266" HandleID="k8s-pod-network.4144508f92b8377437d05d3e3f54b9c688ba69ea087844d5f28e60164de8e266" Workload="ci--4081--3--6--2--0999f1dc3d-k8s-csi--node--driver--krpgf-eth0" Sep 12 17:15:21.035862 containerd[1474]: 2025-09-12 17:15:21.011 [INFO][5406] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="4144508f92b8377437d05d3e3f54b9c688ba69ea087844d5f28e60164de8e266" HandleID="k8s-pod-network.4144508f92b8377437d05d3e3f54b9c688ba69ea087844d5f28e60164de8e266" Workload="ci--4081--3--6--2--0999f1dc3d-k8s-csi--node--driver--krpgf-eth0" Sep 12 17:15:21.035862 containerd[1474]: 2025-09-12 17:15:21.021 [INFO][5406] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 17:15:21.035862 containerd[1474]: 2025-09-12 17:15:21.030 [INFO][5394] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="4144508f92b8377437d05d3e3f54b9c688ba69ea087844d5f28e60164de8e266" Sep 12 17:15:21.037902 containerd[1474]: time="2025-09-12T17:15:21.036325172Z" level=info msg="TearDown network for sandbox \"4144508f92b8377437d05d3e3f54b9c688ba69ea087844d5f28e60164de8e266\" successfully" Sep 12 17:15:21.050560 containerd[1474]: time="2025-09-12T17:15:21.049793964Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"4144508f92b8377437d05d3e3f54b9c688ba69ea087844d5f28e60164de8e266\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Sep 12 17:15:21.050560 containerd[1474]: time="2025-09-12T17:15:21.050246318Z" level=info msg="RemovePodSandbox \"4144508f92b8377437d05d3e3f54b9c688ba69ea087844d5f28e60164de8e266\" returns successfully" Sep 12 17:15:21.052217 containerd[1474]: time="2025-09-12T17:15:21.052091495Z" level=info msg="StopPodSandbox for \"dfd10de400f5da19bed4ce9b6c95b063f73831955d4dfd785ae24c739e1e6a3b\"" Sep 12 17:15:21.406402 containerd[1474]: 2025-09-12 17:15:21.215 [WARNING][5421] cni-plugin/k8s.go 604: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="dfd10de400f5da19bed4ce9b6c95b063f73831955d4dfd785ae24c739e1e6a3b" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--2--0999f1dc3d-k8s-coredns--668d6bf9bc--j49g9-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"59f64a22-a697-45df-a931-7e5ec9919a4c", ResourceVersion:"957", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 14, 24, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-2-0999f1dc3d", ContainerID:"64f8a67d06a906d37de2f14a2f5f7f7247fdbe2a09aae7f83662535c97f07dfd", Pod:"coredns-668d6bf9bc-j49g9", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.48.69/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali6ecd77912b3", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:15:21.406402 containerd[1474]: 2025-09-12 17:15:21.216 [INFO][5421] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="dfd10de400f5da19bed4ce9b6c95b063f73831955d4dfd785ae24c739e1e6a3b" Sep 12 17:15:21.406402 containerd[1474]: 2025-09-12 17:15:21.217 [INFO][5421] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="dfd10de400f5da19bed4ce9b6c95b063f73831955d4dfd785ae24c739e1e6a3b" iface="eth0" netns="" Sep 12 17:15:21.406402 containerd[1474]: 2025-09-12 17:15:21.217 [INFO][5421] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="dfd10de400f5da19bed4ce9b6c95b063f73831955d4dfd785ae24c739e1e6a3b" Sep 12 17:15:21.406402 containerd[1474]: 2025-09-12 17:15:21.217 [INFO][5421] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="dfd10de400f5da19bed4ce9b6c95b063f73831955d4dfd785ae24c739e1e6a3b" Sep 12 17:15:21.406402 containerd[1474]: 2025-09-12 17:15:21.315 [INFO][5429] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="dfd10de400f5da19bed4ce9b6c95b063f73831955d4dfd785ae24c739e1e6a3b" HandleID="k8s-pod-network.dfd10de400f5da19bed4ce9b6c95b063f73831955d4dfd785ae24c739e1e6a3b" Workload="ci--4081--3--6--2--0999f1dc3d-k8s-coredns--668d6bf9bc--j49g9-eth0" Sep 12 17:15:21.406402 containerd[1474]: 2025-09-12 17:15:21.319 [INFO][5429] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 17:15:21.406402 containerd[1474]: 2025-09-12 17:15:21.319 [INFO][5429] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 12 17:15:21.406402 containerd[1474]: 2025-09-12 17:15:21.364 [WARNING][5429] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="dfd10de400f5da19bed4ce9b6c95b063f73831955d4dfd785ae24c739e1e6a3b" HandleID="k8s-pod-network.dfd10de400f5da19bed4ce9b6c95b063f73831955d4dfd785ae24c739e1e6a3b" Workload="ci--4081--3--6--2--0999f1dc3d-k8s-coredns--668d6bf9bc--j49g9-eth0" Sep 12 17:15:21.406402 containerd[1474]: 2025-09-12 17:15:21.364 [INFO][5429] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="dfd10de400f5da19bed4ce9b6c95b063f73831955d4dfd785ae24c739e1e6a3b" HandleID="k8s-pod-network.dfd10de400f5da19bed4ce9b6c95b063f73831955d4dfd785ae24c739e1e6a3b" Workload="ci--4081--3--6--2--0999f1dc3d-k8s-coredns--668d6bf9bc--j49g9-eth0" Sep 12 17:15:21.406402 containerd[1474]: 2025-09-12 17:15:21.395 [INFO][5429] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 17:15:21.406402 containerd[1474]: 2025-09-12 17:15:21.401 [INFO][5421] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="dfd10de400f5da19bed4ce9b6c95b063f73831955d4dfd785ae24c739e1e6a3b" Sep 12 17:15:21.407005 containerd[1474]: time="2025-09-12T17:15:21.406672161Z" level=info msg="TearDown network for sandbox \"dfd10de400f5da19bed4ce9b6c95b063f73831955d4dfd785ae24c739e1e6a3b\" successfully" Sep 12 17:15:21.407005 containerd[1474]: time="2025-09-12T17:15:21.406734600Z" level=info msg="StopPodSandbox for \"dfd10de400f5da19bed4ce9b6c95b063f73831955d4dfd785ae24c739e1e6a3b\" returns successfully" Sep 12 17:15:21.410103 containerd[1474]: time="2025-09-12T17:15:21.409226009Z" level=info msg="RemovePodSandbox for \"dfd10de400f5da19bed4ce9b6c95b063f73831955d4dfd785ae24c739e1e6a3b\"" Sep 12 17:15:21.410426 containerd[1474]: time="2025-09-12T17:15:21.410388234Z" level=info msg="Forcibly stopping sandbox \"dfd10de400f5da19bed4ce9b6c95b063f73831955d4dfd785ae24c739e1e6a3b\"" Sep 12 17:15:21.720629 containerd[1474]: 2025-09-12 17:15:21.567 [WARNING][5449] cni-plugin/k8s.go 604: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="dfd10de400f5da19bed4ce9b6c95b063f73831955d4dfd785ae24c739e1e6a3b" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--2--0999f1dc3d-k8s-coredns--668d6bf9bc--j49g9-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"59f64a22-a697-45df-a931-7e5ec9919a4c", ResourceVersion:"957", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 14, 24, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-2-0999f1dc3d", ContainerID:"64f8a67d06a906d37de2f14a2f5f7f7247fdbe2a09aae7f83662535c97f07dfd", Pod:"coredns-668d6bf9bc-j49g9", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.48.69/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali6ecd77912b3", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:15:21.720629 containerd[1474]: 2025-09-12 17:15:21.568 [INFO][5449] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="dfd10de400f5da19bed4ce9b6c95b063f73831955d4dfd785ae24c739e1e6a3b" Sep 12 17:15:21.720629 containerd[1474]: 2025-09-12 17:15:21.568 [INFO][5449] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="dfd10de400f5da19bed4ce9b6c95b063f73831955d4dfd785ae24c739e1e6a3b" iface="eth0" netns="" Sep 12 17:15:21.720629 containerd[1474]: 2025-09-12 17:15:21.568 [INFO][5449] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="dfd10de400f5da19bed4ce9b6c95b063f73831955d4dfd785ae24c739e1e6a3b" Sep 12 17:15:21.720629 containerd[1474]: 2025-09-12 17:15:21.568 [INFO][5449] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="dfd10de400f5da19bed4ce9b6c95b063f73831955d4dfd785ae24c739e1e6a3b" Sep 12 17:15:21.720629 containerd[1474]: 2025-09-12 17:15:21.662 [INFO][5457] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="dfd10de400f5da19bed4ce9b6c95b063f73831955d4dfd785ae24c739e1e6a3b" HandleID="k8s-pod-network.dfd10de400f5da19bed4ce9b6c95b063f73831955d4dfd785ae24c739e1e6a3b" Workload="ci--4081--3--6--2--0999f1dc3d-k8s-coredns--668d6bf9bc--j49g9-eth0" Sep 12 17:15:21.720629 containerd[1474]: 2025-09-12 17:15:21.662 [INFO][5457] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 17:15:21.720629 containerd[1474]: 2025-09-12 17:15:21.662 [INFO][5457] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 12 17:15:21.720629 containerd[1474]: 2025-09-12 17:15:21.697 [WARNING][5457] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="dfd10de400f5da19bed4ce9b6c95b063f73831955d4dfd785ae24c739e1e6a3b" HandleID="k8s-pod-network.dfd10de400f5da19bed4ce9b6c95b063f73831955d4dfd785ae24c739e1e6a3b" Workload="ci--4081--3--6--2--0999f1dc3d-k8s-coredns--668d6bf9bc--j49g9-eth0" Sep 12 17:15:21.720629 containerd[1474]: 2025-09-12 17:15:21.697 [INFO][5457] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="dfd10de400f5da19bed4ce9b6c95b063f73831955d4dfd785ae24c739e1e6a3b" HandleID="k8s-pod-network.dfd10de400f5da19bed4ce9b6c95b063f73831955d4dfd785ae24c739e1e6a3b" Workload="ci--4081--3--6--2--0999f1dc3d-k8s-coredns--668d6bf9bc--j49g9-eth0" Sep 12 17:15:21.720629 containerd[1474]: 2025-09-12 17:15:21.702 [INFO][5457] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 17:15:21.720629 containerd[1474]: 2025-09-12 17:15:21.710 [INFO][5449] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="dfd10de400f5da19bed4ce9b6c95b063f73831955d4dfd785ae24c739e1e6a3b" Sep 12 17:15:21.721190 containerd[1474]: time="2025-09-12T17:15:21.720876808Z" level=info msg="TearDown network for sandbox \"dfd10de400f5da19bed4ce9b6c95b063f73831955d4dfd785ae24c739e1e6a3b\" successfully" Sep 12 17:15:21.730992 containerd[1474]: time="2025-09-12T17:15:21.730735086Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"dfd10de400f5da19bed4ce9b6c95b063f73831955d4dfd785ae24c739e1e6a3b\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Sep 12 17:15:21.731391 containerd[1474]: time="2025-09-12T17:15:21.731202440Z" level=info msg="RemovePodSandbox \"dfd10de400f5da19bed4ce9b6c95b063f73831955d4dfd785ae24c739e1e6a3b\" returns successfully" Sep 12 17:15:21.738686 containerd[1474]: time="2025-09-12T17:15:21.738603788Z" level=info msg="StopPodSandbox for \"22e3cdc6e9d2964a355f2abb708065e2ade653e106e6d210166694bab564d208\"" Sep 12 17:15:21.782515 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3813717480.mount: Deactivated successfully. Sep 12 17:15:22.011600 containerd[1474]: 2025-09-12 17:15:21.879 [WARNING][5471] cni-plugin/k8s.go 604: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="22e3cdc6e9d2964a355f2abb708065e2ade653e106e6d210166694bab564d208" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--2--0999f1dc3d-k8s-coredns--668d6bf9bc--pc7pt-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"ecddc12a-94e3-4f27-a97d-480a79c66c1a", ResourceVersion:"990", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 14, 24, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-2-0999f1dc3d", ContainerID:"9d315990e24b7bbb64ef9cdbbe513f65aec60e348d9e5395c24af791792f01bc", Pod:"coredns-668d6bf9bc-pc7pt", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.48.71/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali6f2608cc972", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:15:22.011600 containerd[1474]: 2025-09-12 17:15:21.880 [INFO][5471] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="22e3cdc6e9d2964a355f2abb708065e2ade653e106e6d210166694bab564d208" Sep 12 17:15:22.011600 containerd[1474]: 2025-09-12 17:15:21.880 [INFO][5471] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="22e3cdc6e9d2964a355f2abb708065e2ade653e106e6d210166694bab564d208" iface="eth0" netns="" Sep 12 17:15:22.011600 containerd[1474]: 2025-09-12 17:15:21.880 [INFO][5471] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="22e3cdc6e9d2964a355f2abb708065e2ade653e106e6d210166694bab564d208" Sep 12 17:15:22.011600 containerd[1474]: 2025-09-12 17:15:21.881 [INFO][5471] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="22e3cdc6e9d2964a355f2abb708065e2ade653e106e6d210166694bab564d208" Sep 12 17:15:22.011600 containerd[1474]: 2025-09-12 17:15:21.970 [INFO][5483] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="22e3cdc6e9d2964a355f2abb708065e2ade653e106e6d210166694bab564d208" HandleID="k8s-pod-network.22e3cdc6e9d2964a355f2abb708065e2ade653e106e6d210166694bab564d208" Workload="ci--4081--3--6--2--0999f1dc3d-k8s-coredns--668d6bf9bc--pc7pt-eth0" Sep 12 17:15:22.011600 containerd[1474]: 2025-09-12 17:15:21.971 [INFO][5483] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 17:15:22.011600 containerd[1474]: 2025-09-12 17:15:21.971 [INFO][5483] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 12 17:15:22.011600 containerd[1474]: 2025-09-12 17:15:21.993 [WARNING][5483] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="22e3cdc6e9d2964a355f2abb708065e2ade653e106e6d210166694bab564d208" HandleID="k8s-pod-network.22e3cdc6e9d2964a355f2abb708065e2ade653e106e6d210166694bab564d208" Workload="ci--4081--3--6--2--0999f1dc3d-k8s-coredns--668d6bf9bc--pc7pt-eth0" Sep 12 17:15:22.011600 containerd[1474]: 2025-09-12 17:15:21.993 [INFO][5483] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="22e3cdc6e9d2964a355f2abb708065e2ade653e106e6d210166694bab564d208" HandleID="k8s-pod-network.22e3cdc6e9d2964a355f2abb708065e2ade653e106e6d210166694bab564d208" Workload="ci--4081--3--6--2--0999f1dc3d-k8s-coredns--668d6bf9bc--pc7pt-eth0" Sep 12 17:15:22.011600 containerd[1474]: 2025-09-12 17:15:21.998 [INFO][5483] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 17:15:22.011600 containerd[1474]: 2025-09-12 17:15:22.003 [INFO][5471] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="22e3cdc6e9d2964a355f2abb708065e2ade653e106e6d210166694bab564d208" Sep 12 17:15:22.011600 containerd[1474]: time="2025-09-12T17:15:22.010247767Z" level=info msg="TearDown network for sandbox \"22e3cdc6e9d2964a355f2abb708065e2ade653e106e6d210166694bab564d208\" successfully" Sep 12 17:15:22.011600 containerd[1474]: time="2025-09-12T17:15:22.010288366Z" level=info msg="StopPodSandbox for \"22e3cdc6e9d2964a355f2abb708065e2ade653e106e6d210166694bab564d208\" returns successfully" Sep 12 17:15:22.014652 containerd[1474]: time="2025-09-12T17:15:22.013394048Z" level=info msg="RemovePodSandbox for \"22e3cdc6e9d2964a355f2abb708065e2ade653e106e6d210166694bab564d208\"" Sep 12 17:15:22.014652 containerd[1474]: time="2025-09-12T17:15:22.013887242Z" level=info msg="Forcibly stopping sandbox \"22e3cdc6e9d2964a355f2abb708065e2ade653e106e6d210166694bab564d208\"" Sep 12 17:15:22.214241 containerd[1474]: 2025-09-12 17:15:22.134 [WARNING][5497] cni-plugin/k8s.go 604: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="22e3cdc6e9d2964a355f2abb708065e2ade653e106e6d210166694bab564d208" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--6--2--0999f1dc3d-k8s-coredns--668d6bf9bc--pc7pt-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"ecddc12a-94e3-4f27-a97d-480a79c66c1a", ResourceVersion:"990", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 14, 24, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-6-2-0999f1dc3d", ContainerID:"9d315990e24b7bbb64ef9cdbbe513f65aec60e348d9e5395c24af791792f01bc", Pod:"coredns-668d6bf9bc-pc7pt", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.48.71/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali6f2608cc972", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:15:22.214241 containerd[1474]: 2025-09-12 17:15:22.134 [INFO][5497] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="22e3cdc6e9d2964a355f2abb708065e2ade653e106e6d210166694bab564d208" Sep 12 17:15:22.214241 containerd[1474]: 2025-09-12 17:15:22.134 [INFO][5497] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="22e3cdc6e9d2964a355f2abb708065e2ade653e106e6d210166694bab564d208" iface="eth0" netns="" Sep 12 17:15:22.214241 containerd[1474]: 2025-09-12 17:15:22.134 [INFO][5497] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="22e3cdc6e9d2964a355f2abb708065e2ade653e106e6d210166694bab564d208" Sep 12 17:15:22.214241 containerd[1474]: 2025-09-12 17:15:22.135 [INFO][5497] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="22e3cdc6e9d2964a355f2abb708065e2ade653e106e6d210166694bab564d208" Sep 12 17:15:22.214241 containerd[1474]: 2025-09-12 17:15:22.181 [INFO][5504] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="22e3cdc6e9d2964a355f2abb708065e2ade653e106e6d210166694bab564d208" HandleID="k8s-pod-network.22e3cdc6e9d2964a355f2abb708065e2ade653e106e6d210166694bab564d208" Workload="ci--4081--3--6--2--0999f1dc3d-k8s-coredns--668d6bf9bc--pc7pt-eth0" Sep 12 17:15:22.214241 containerd[1474]: 2025-09-12 17:15:22.182 [INFO][5504] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 17:15:22.214241 containerd[1474]: 2025-09-12 17:15:22.183 [INFO][5504] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 12 17:15:22.214241 containerd[1474]: 2025-09-12 17:15:22.202 [WARNING][5504] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="22e3cdc6e9d2964a355f2abb708065e2ade653e106e6d210166694bab564d208" HandleID="k8s-pod-network.22e3cdc6e9d2964a355f2abb708065e2ade653e106e6d210166694bab564d208" Workload="ci--4081--3--6--2--0999f1dc3d-k8s-coredns--668d6bf9bc--pc7pt-eth0" Sep 12 17:15:22.214241 containerd[1474]: 2025-09-12 17:15:22.202 [INFO][5504] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="22e3cdc6e9d2964a355f2abb708065e2ade653e106e6d210166694bab564d208" HandleID="k8s-pod-network.22e3cdc6e9d2964a355f2abb708065e2ade653e106e6d210166694bab564d208" Workload="ci--4081--3--6--2--0999f1dc3d-k8s-coredns--668d6bf9bc--pc7pt-eth0" Sep 12 17:15:22.214241 containerd[1474]: 2025-09-12 17:15:22.209 [INFO][5504] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 17:15:22.214241 containerd[1474]: 2025-09-12 17:15:22.211 [INFO][5497] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="22e3cdc6e9d2964a355f2abb708065e2ade653e106e6d210166694bab564d208" Sep 12 17:15:22.215400 containerd[1474]: time="2025-09-12T17:15:22.214895089Z" level=info msg="TearDown network for sandbox \"22e3cdc6e9d2964a355f2abb708065e2ade653e106e6d210166694bab564d208\" successfully" Sep 12 17:15:22.224098 containerd[1474]: time="2025-09-12T17:15:22.222320837Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"22e3cdc6e9d2964a355f2abb708065e2ade653e106e6d210166694bab564d208\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Sep 12 17:15:22.224098 containerd[1474]: time="2025-09-12T17:15:22.222440636Z" level=info msg="RemovePodSandbox \"22e3cdc6e9d2964a355f2abb708065e2ade653e106e6d210166694bab564d208\" returns successfully" Sep 12 17:15:22.723096 containerd[1474]: time="2025-09-12T17:15:22.719436840Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.3: active requests=0, bytes read=61845332" Sep 12 17:15:22.730137 containerd[1474]: time="2025-09-12T17:15:22.726863869Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:15:22.738594 containerd[1474]: time="2025-09-12T17:15:22.738112250Z" level=info msg="ImageCreate event name:\"sha256:14088376331a0622b7f6a2fbc2f2932806a6eafdd7b602f6139d3b985bf1e685\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:15:22.751378 containerd[1474]: time="2025-09-12T17:15:22.747730612Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane@sha256:46297703ab3739331a00a58f0d6a5498c8d3b6523ad947eed68592ee0f3e79f0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:15:22.751378 containerd[1474]: time="2025-09-12T17:15:22.748721560Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/goldmane:v3.30.3\" with image id \"sha256:14088376331a0622b7f6a2fbc2f2932806a6eafdd7b602f6139d3b985bf1e685\", repo tag \"ghcr.io/flatcar/calico/goldmane:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/goldmane@sha256:46297703ab3739331a00a58f0d6a5498c8d3b6523ad947eed68592ee0f3e79f0\", size \"61845178\" in 3.818504494s" Sep 12 17:15:22.751378 containerd[1474]: time="2025-09-12T17:15:22.748761759Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.3\" returns image reference \"sha256:14088376331a0622b7f6a2fbc2f2932806a6eafdd7b602f6139d3b985bf1e685\"" Sep 12 
17:15:22.751378 containerd[1474]: time="2025-09-12T17:15:22.751050731Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\"" Sep 12 17:15:22.770774 containerd[1474]: time="2025-09-12T17:15:22.770687449Z" level=info msg="CreateContainer within sandbox \"591a64a3fdf996e8c3d680f0698e12606385018435cee65b50db99b9a3df99dc\" for container &ContainerMetadata{Name:goldmane,Attempt:0,}" Sep 12 17:15:22.812327 containerd[1474]: time="2025-09-12T17:15:22.812265378Z" level=info msg="CreateContainer within sandbox \"591a64a3fdf996e8c3d680f0698e12606385018435cee65b50db99b9a3df99dc\" for &ContainerMetadata{Name:goldmane,Attempt:0,} returns container id \"0c1848f8f275feb8f30f25af4b0fcbb91b47a7f77b960fc9978548f95aefa78e\"" Sep 12 17:15:22.813710 containerd[1474]: time="2025-09-12T17:15:22.813655641Z" level=info msg="StartContainer for \"0c1848f8f275feb8f30f25af4b0fcbb91b47a7f77b960fc9978548f95aefa78e\"" Sep 12 17:15:22.918667 systemd[1]: Started cri-containerd-0c1848f8f275feb8f30f25af4b0fcbb91b47a7f77b960fc9978548f95aefa78e.scope - libcontainer container 0c1848f8f275feb8f30f25af4b0fcbb91b47a7f77b960fc9978548f95aefa78e. Sep 12 17:15:22.990494 containerd[1474]: time="2025-09-12T17:15:22.990326747Z" level=info msg="StartContainer for \"0c1848f8f275feb8f30f25af4b0fcbb91b47a7f77b960fc9978548f95aefa78e\" returns successfully" Sep 12 17:15:24.360747 systemd[1]: run-containerd-runc-k8s.io-0c1848f8f275feb8f30f25af4b0fcbb91b47a7f77b960fc9978548f95aefa78e-runc.F39ll0.mount: Deactivated successfully. Sep 12 17:15:25.446055 containerd[1474]: time="2025-09-12T17:15:25.445973020Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:15:25.448369 containerd[1474]: time="2025-09-12T17:15:25.448050956Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.3: active requests=0, bytes read=44530807" Sep 12 17:15:25.452131 containerd[1474]: time="2025-09-12T17:15:25.450825442Z" level=info msg="ImageCreate event name:\"sha256:632fbde00b1918016ac07458e79cc438ccda83cb762bfd5fc50a26721abced08\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:15:25.458263 containerd[1474]: time="2025-09-12T17:15:25.458184635Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:15:25.460093 containerd[1474]: time="2025-09-12T17:15:25.459997973Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" with image id \"sha256:632fbde00b1918016ac07458e79cc438ccda83cb762bfd5fc50a26721abced08\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\", size \"45900064\" in 2.708882123s" Sep 12 17:15:25.460093 containerd[1474]: time="2025-09-12T17:15:25.460060812Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" returns image reference \"sha256:632fbde00b1918016ac07458e79cc438ccda83cb762bfd5fc50a26721abced08\"" Sep 12 17:15:25.462551 containerd[1474]: time="2025-09-12T17:15:25.462492503Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\"" Sep 12 17:15:25.469734 containerd[1474]: time="2025-09-12T17:15:25.469309742Z" level=info msg="CreateContainer within sandbox \"dc8199d51d223fa210f25174d49c024ce3c37f44fc96832e965fbabdf016e744\" for container 
&ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Sep 12 17:15:25.488770 containerd[1474]: time="2025-09-12T17:15:25.488703071Z" level=info msg="CreateContainer within sandbox \"dc8199d51d223fa210f25174d49c024ce3c37f44fc96832e965fbabdf016e744\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"2226f4095e992446fc4754fe9f5ff84e69cb36187e02e991ede5a568c555bded\"" Sep 12 17:15:25.491779 containerd[1474]: time="2025-09-12T17:15:25.491723875Z" level=info msg="StartContainer for \"2226f4095e992446fc4754fe9f5ff84e69cb36187e02e991ede5a568c555bded\"" Sep 12 17:15:25.566419 systemd[1]: Started cri-containerd-2226f4095e992446fc4754fe9f5ff84e69cb36187e02e991ede5a568c555bded.scope - libcontainer container 2226f4095e992446fc4754fe9f5ff84e69cb36187e02e991ede5a568c555bded. Sep 12 17:15:25.710993 containerd[1474]: time="2025-09-12T17:15:25.710724264Z" level=info msg="StartContainer for \"2226f4095e992446fc4754fe9f5ff84e69cb36187e02e991ede5a568c555bded\" returns successfully" Sep 12 17:15:25.854752 containerd[1474]: time="2025-09-12T17:15:25.854656788Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:15:25.867596 containerd[1474]: time="2025-09-12T17:15:25.859578970Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.3: active requests=0, bytes read=77" Sep 12 17:15:25.870546 containerd[1474]: time="2025-09-12T17:15:25.870464480Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" with image id \"sha256:632fbde00b1918016ac07458e79cc438ccda83cb762bfd5fc50a26721abced08\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\", size \"45900064\" in 407.917257ms" Sep 12 17:15:25.870546 containerd[1474]: time="2025-09-12T17:15:25.870534359Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" returns image reference \"sha256:632fbde00b1918016ac07458e79cc438ccda83cb762bfd5fc50a26721abced08\"" Sep 12 17:15:25.876751 containerd[1474]: time="2025-09-12T17:15:25.876661166Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\"" Sep 12 17:15:25.880546 containerd[1474]: time="2025-09-12T17:15:25.880465801Z" level=info msg="CreateContainer within sandbox \"36cb49f4ec250ee9635802c5d120c6cb2059151fbfd4b1779672f177c7ef3604\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Sep 12 17:15:25.926917 containerd[1474]: time="2025-09-12T17:15:25.925302826Z" level=info msg="CreateContainer within sandbox \"36cb49f4ec250ee9635802c5d120c6cb2059151fbfd4b1779672f177c7ef3604\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"08b59a20ba498baf8ac446ffb5b01b94c64c7982198af4d2ef70550b0d438cb0\"" Sep 12 17:15:25.927804 containerd[1474]: time="2025-09-12T17:15:25.927574719Z" level=info msg="StartContainer for \"08b59a20ba498baf8ac446ffb5b01b94c64c7982198af4d2ef70550b0d438cb0\"" Sep 12 17:15:26.036500 systemd[1]: Started cri-containerd-08b59a20ba498baf8ac446ffb5b01b94c64c7982198af4d2ef70550b0d438cb0.scope - libcontainer container 08b59a20ba498baf8ac446ffb5b01b94c64c7982198af4d2ef70550b0d438cb0. 
Sep 12 17:15:26.172698 containerd[1474]: time="2025-09-12T17:15:26.172625737Z" level=info msg="StartContainer for \"08b59a20ba498baf8ac446ffb5b01b94c64c7982198af4d2ef70550b0d438cb0\" returns successfully"
Sep 12 17:15:26.332137 kubelet[2556]: I0912 17:15:26.331891 2556 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-7df9fb7db-kpktm" podStartSLOduration=35.212035814 podStartE2EDuration="45.331860896s" podCreationTimestamp="2025-09-12 17:14:41 +0000 UTC" firstStartedPulling="2025-09-12 17:15:15.754209595 +0000 UTC m=+57.416950914" lastFinishedPulling="2025-09-12 17:15:25.874034677 +0000 UTC m=+67.536775996" observedRunningTime="2025-09-12 17:15:26.3315277 +0000 UTC m=+67.994269099" watchObservedRunningTime="2025-09-12 17:15:26.331860896 +0000 UTC m=+67.994602255"
Sep 12 17:15:26.333297 kubelet[2556]: I0912 17:15:26.332308 2556 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/goldmane-54d579b49d-fnsjh" podStartSLOduration=28.075015557 podStartE2EDuration="39.332300491s" podCreationTimestamp="2025-09-12 17:14:47 +0000 UTC" firstStartedPulling="2025-09-12 17:15:11.49274325 +0000 UTC m=+53.155484609" lastFinishedPulling="2025-09-12 17:15:22.750028184 +0000 UTC m=+64.412769543" observedRunningTime="2025-09-12 17:15:23.329403219 +0000 UTC m=+64.992144578" watchObservedRunningTime="2025-09-12 17:15:26.332300491 +0000 UTC m=+67.995041810"
Sep 12 17:15:26.363323 kubelet[2556]: I0912 17:15:26.363180 2556 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-7df9fb7db-7p65c" podStartSLOduration=33.832439658 podStartE2EDuration="45.363118207s" podCreationTimestamp="2025-09-12 17:14:41 +0000 UTC" firstStartedPulling="2025-09-12 17:15:13.930827766 +0000 UTC m=+55.593569125" lastFinishedPulling="2025-09-12 17:15:25.461506315 +0000 UTC m=+67.124247674" observedRunningTime="2025-09-12 17:15:26.362704852 +0000 UTC m=+68.025446211" watchObservedRunningTime="2025-09-12 17:15:26.363118207 +0000 UTC m=+68.025859566"
Sep 12 17:15:27.318583 kubelet[2556]: I0912 17:15:27.317082 2556 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Sep 12 17:15:27.318583 kubelet[2556]: I0912 17:15:27.317050 2556 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Sep 12 17:15:27.666758 containerd[1474]: time="2025-09-12T17:15:27.666688964Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 17:15:27.671613 containerd[1474]: time="2025-09-12T17:15:27.671467908Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3: active requests=0, bytes read=13761208"
Sep 12 17:15:27.674087 containerd[1474]: time="2025-09-12T17:15:27.673503724Z" level=info msg="ImageCreate event name:\"sha256:a319b5bdc1001e98875b68e2943279adb74bcb19d09f1db857bc27959a078a65\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 17:15:27.684819 containerd[1474]: time="2025-09-12T17:15:27.682486899Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:731ab232ca708102ab332340b1274d5cd656aa896ecc5368ee95850b811df86f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 17:15:27.687950 containerd[1474]: time="2025-09-12T17:15:27.684044801Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" with image id \"sha256:a319b5bdc1001e98875b68e2943279adb74bcb19d09f1db857bc27959a078a65\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:731ab232ca708102ab332340b1274d5cd656aa896ecc5368ee95850b811df86f\", size \"15130401\" in 1.807297876s"
Sep 12 17:15:27.688265 containerd[1474]: time="2025-09-12T17:15:27.688241231Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" returns image reference \"sha256:a319b5bdc1001e98875b68e2943279adb74bcb19d09f1db857bc27959a078a65\""
Sep 12 17:15:27.696629 containerd[1474]: time="2025-09-12T17:15:27.696416096Z" level=info msg="CreateContainer within sandbox \"3003849266b76e7168cdfde98b1fd3830e01312968a4c53470e7af0342164f4b\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}"
Sep 12 17:15:27.725816 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2001603400.mount: Deactivated successfully.
Sep 12 17:15:27.731429 containerd[1474]: time="2025-09-12T17:15:27.731134809Z" level=info msg="CreateContainer within sandbox \"3003849266b76e7168cdfde98b1fd3830e01312968a4c53470e7af0342164f4b\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"3b3d1354e330d55391d7d01e15de68722903f3870db7f45d48e123de2abde35f\""
Sep 12 17:15:27.733562 containerd[1474]: time="2025-09-12T17:15:27.733156786Z" level=info msg="StartContainer for \"3b3d1354e330d55391d7d01e15de68722903f3870db7f45d48e123de2abde35f\""
Sep 12 17:15:27.803453 systemd[1]: Started cri-containerd-3b3d1354e330d55391d7d01e15de68722903f3870db7f45d48e123de2abde35f.scope - libcontainer container 3b3d1354e330d55391d7d01e15de68722903f3870db7f45d48e123de2abde35f.
Sep 12 17:15:27.909533 containerd[1474]: time="2025-09-12T17:15:27.909342684Z" level=info msg="StartContainer for \"3b3d1354e330d55391d7d01e15de68722903f3870db7f45d48e123de2abde35f\" returns successfully"
Sep 12 17:15:28.856556 kubelet[2556]: I0912 17:15:28.856369 2556 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0
Sep 12 17:15:28.864205 kubelet[2556]: I0912 17:15:28.863977 2556 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock
Sep 12 17:15:32.835527 systemd[1]: run-containerd-runc-k8s.io-0c1848f8f275feb8f30f25af4b0fcbb91b47a7f77b960fc9978548f95aefa78e-runc.xOuSep.mount: Deactivated successfully.
Sep 12 17:15:42.613596 systemd[1]: run-containerd-runc-k8s.io-8ce444d94d3e2a12112c11e86c3d8794da59c25f3e34b0ca19e1fa2a35a38549-runc.eloobS.mount: Deactivated successfully.
Sep 12 17:15:44.382747 kubelet[2556]: I0912 17:15:44.382257 2556 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Sep 12 17:15:44.419101 kubelet[2556]: I0912 17:15:44.416779 2556 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-krpgf" podStartSLOduration=41.109665215 podStartE2EDuration="57.416729863s" podCreationTimestamp="2025-09-12 17:14:47 +0000 UTC" firstStartedPulling="2025-09-12 17:15:11.385275855 +0000 UTC m=+53.048017214" lastFinishedPulling="2025-09-12 17:15:27.692340503 +0000 UTC m=+69.355081862" observedRunningTime="2025-09-12 17:15:28.343464077 +0000 UTC m=+70.006205476" watchObservedRunningTime="2025-09-12 17:15:44.416729863 +0000 UTC m=+86.079471222"
Sep 12 17:15:45.566612 systemd[1]: run-containerd-runc-k8s.io-6ba95ad4134da62b2e02861ad037e954b0a6c55d866760cf60eaeca8b560bb19-runc.zPGBF4.mount: Deactivated successfully.
Sep 12 17:15:53.284992 kubelet[2556]: I0912 17:15:53.284931 2556 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Sep 12 17:16:12.601589 systemd[1]: run-containerd-runc-k8s.io-8ce444d94d3e2a12112c11e86c3d8794da59c25f3e34b0ca19e1fa2a35a38549-runc.plVi0U.mount: Deactivated successfully.
Sep 12 17:17:02.025771 systemd[1]: Started sshd@7-49.13.6.100:22-139.178.89.65:37452.service - OpenSSH per-connection server daemon (139.178.89.65:37452).
Sep 12 17:17:03.014165 sshd[6088]: Accepted publickey for core from 139.178.89.65 port 37452 ssh2: RSA SHA256:1SAgMiZlCHBr0Vs456OmR0PXIyT7CPtESXBBc/039go
Sep 12 17:17:03.015658 sshd[6088]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 12 17:17:03.024108 systemd-logind[1452]: New session 8 of user core.
Sep 12 17:17:03.029304 systemd[1]: Started session-8.scope - Session 8 of User core.
Sep 12 17:17:03.823405 sshd[6088]: pam_unix(sshd:session): session closed for user core
Sep 12 17:17:03.829596 systemd[1]: sshd@7-49.13.6.100:22-139.178.89.65:37452.service: Deactivated successfully.
Sep 12 17:17:03.831868 systemd[1]: session-8.scope: Deactivated successfully.
Sep 12 17:17:03.834117 systemd-logind[1452]: Session 8 logged out. Waiting for processes to exit.
Sep 12 17:17:03.836316 systemd-logind[1452]: Removed session 8.
Sep 12 17:17:09.003648 systemd[1]: Started sshd@8-49.13.6.100:22-139.178.89.65:37454.service - OpenSSH per-connection server daemon (139.178.89.65:37454).
Sep 12 17:17:09.987877 sshd[6102]: Accepted publickey for core from 139.178.89.65 port 37454 ssh2: RSA SHA256:1SAgMiZlCHBr0Vs456OmR0PXIyT7CPtESXBBc/039go
Sep 12 17:17:09.989582 sshd[6102]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 12 17:17:09.995366 systemd-logind[1452]: New session 9 of user core.
Sep 12 17:17:10.003368 systemd[1]: Started session-9.scope - Session 9 of User core.
Sep 12 17:17:10.749948 sshd[6102]: pam_unix(sshd:session): session closed for user core
Sep 12 17:17:10.755668 systemd[1]: sshd@8-49.13.6.100:22-139.178.89.65:37454.service: Deactivated successfully.
Sep 12 17:17:10.758044 systemd[1]: session-9.scope: Deactivated successfully.
Sep 12 17:17:10.759027 systemd-logind[1452]: Session 9 logged out. Waiting for processes to exit.
Sep 12 17:17:10.760125 systemd-logind[1452]: Removed session 9.
Sep 12 17:17:10.927532 systemd[1]: Started sshd@9-49.13.6.100:22-139.178.89.65:35728.service - OpenSSH per-connection server daemon (139.178.89.65:35728).
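The pod_startup_latency_tracker records encode a simple relationship: podStartE2EDuration is watchObservedRunningTime minus podCreationTimestamp, and podStartSLOduration is that same span minus the image-pull window (lastFinishedPulling minus firstStartedPulling). Re-deriving the csi-node-driver-krpgf numbers from the record above (an arithmetic reconstruction from the logged fields, not kubelet's code):

package main

import (
	"fmt"
	"time"
)

func main() {
	parse := func(s string) time.Time {
		// Layout matching the timestamps as kubelet prints them above.
		t, err := time.Parse("2006-01-02 15:04:05.999999999 -0700 MST", s)
		if err != nil {
			panic(err)
		}
		return t
	}

	// Values for pod csi-node-driver-krpgf, copied from the log record.
	created := parse("2025-09-12 17:14:47 +0000 UTC")
	firstPull := parse("2025-09-12 17:15:11.385275855 +0000 UTC")
	lastPull := parse("2025-09-12 17:15:27.692340503 +0000 UTC")
	watchObserved := parse("2025-09-12 17:15:44.416729863 +0000 UTC")

	e2e := watchObserved.Sub(created)    // 57.416729863s, as logged
	slo := e2e - lastPull.Sub(firstPull) // 41.109665215s, as logged
	fmt.Println(e2e, slo)
}

Note that the E2E figure is anchored to the moment the running state was observed via the watch (17:15:44 here), not to observedRunningTime, which is why it exceeds the gap between creation and the first running observation.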
Sep 12 17:17:11.916838 sshd[6115]: Accepted publickey for core from 139.178.89.65 port 35728 ssh2: RSA SHA256:1SAgMiZlCHBr0Vs456OmR0PXIyT7CPtESXBBc/039go
Sep 12 17:17:11.919055 sshd[6115]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 12 17:17:11.924495 systemd-logind[1452]: New session 10 of user core.
Sep 12 17:17:11.929258 systemd[1]: Started session-10.scope - Session 10 of User core.
Sep 12 17:17:12.720035 sshd[6115]: pam_unix(sshd:session): session closed for user core
Sep 12 17:17:12.727785 systemd[1]: sshd@9-49.13.6.100:22-139.178.89.65:35728.service: Deactivated successfully.
Sep 12 17:17:12.731286 systemd[1]: session-10.scope: Deactivated successfully.
Sep 12 17:17:12.732802 systemd-logind[1452]: Session 10 logged out. Waiting for processes to exit.
Sep 12 17:17:12.733928 systemd-logind[1452]: Removed session 10.
Sep 12 17:17:12.901984 systemd[1]: Started sshd@10-49.13.6.100:22-139.178.89.65:35736.service - OpenSSH per-connection server daemon (139.178.89.65:35736).
Sep 12 17:17:13.878441 sshd[6148]: Accepted publickey for core from 139.178.89.65 port 35736 ssh2: RSA SHA256:1SAgMiZlCHBr0Vs456OmR0PXIyT7CPtESXBBc/039go
Sep 12 17:17:13.880947 sshd[6148]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 12 17:17:13.886474 systemd-logind[1452]: New session 11 of user core.
Sep 12 17:17:13.890365 systemd[1]: Started session-11.scope - Session 11 of User core.
Sep 12 17:17:14.629333 sshd[6148]: pam_unix(sshd:session): session closed for user core
Sep 12 17:17:14.634494 systemd-logind[1452]: Session 11 logged out. Waiting for processes to exit.
Sep 12 17:17:14.635483 systemd[1]: sshd@10-49.13.6.100:22-139.178.89.65:35736.service: Deactivated successfully.
Sep 12 17:17:14.640357 systemd[1]: session-11.scope: Deactivated successfully.
Sep 12 17:17:14.641848 systemd-logind[1452]: Removed session 11.
Sep 12 17:17:19.808444 systemd[1]: Started sshd@11-49.13.6.100:22-139.178.89.65:35740.service - OpenSSH per-connection server daemon (139.178.89.65:35740).
Sep 12 17:17:20.800527 sshd[6187]: Accepted publickey for core from 139.178.89.65 port 35740 ssh2: RSA SHA256:1SAgMiZlCHBr0Vs456OmR0PXIyT7CPtESXBBc/039go
Sep 12 17:17:20.801899 sshd[6187]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 12 17:17:20.808371 systemd-logind[1452]: New session 12 of user core.
Sep 12 17:17:20.813390 systemd[1]: Started session-12.scope - Session 12 of User core.
Sep 12 17:17:21.562478 sshd[6187]: pam_unix(sshd:session): session closed for user core
Sep 12 17:17:21.568051 systemd-logind[1452]: Session 12 logged out. Waiting for processes to exit.
Sep 12 17:17:21.568467 systemd[1]: sshd@11-49.13.6.100:22-139.178.89.65:35740.service: Deactivated successfully.
Sep 12 17:17:21.571037 systemd[1]: session-12.scope: Deactivated successfully.
Sep 12 17:17:21.572715 systemd-logind[1452]: Removed session 12.
Sep 12 17:17:26.738680 systemd[1]: Started sshd@12-49.13.6.100:22-139.178.89.65:57712.service - OpenSSH per-connection server daemon (139.178.89.65:57712).
Sep 12 17:17:27.705552 sshd[6222]: Accepted publickey for core from 139.178.89.65 port 57712 ssh2: RSA SHA256:1SAgMiZlCHBr0Vs456OmR0PXIyT7CPtESXBBc/039go
Sep 12 17:17:27.708851 sshd[6222]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 12 17:17:27.714235 systemd-logind[1452]: New session 13 of user core.
Sep 12 17:17:27.718388 systemd[1]: Started session-13.scope - Session 13 of User core.
Sep 12 17:17:28.476333 sshd[6222]: pam_unix(sshd:session): session closed for user core
Sep 12 17:17:28.481840 systemd-logind[1452]: Session 13 logged out. Waiting for processes to exit.
Sep 12 17:17:28.482386 systemd[1]: sshd@12-49.13.6.100:22-139.178.89.65:57712.service: Deactivated successfully.
Sep 12 17:17:28.484146 systemd[1]: session-13.scope: Deactivated successfully.
Sep 12 17:17:28.487692 systemd-logind[1452]: Removed session 13.
Sep 12 17:17:33.651435 systemd[1]: Started sshd@13-49.13.6.100:22-139.178.89.65:39620.service - OpenSSH per-connection server daemon (139.178.89.65:39620).
Sep 12 17:17:34.623862 sshd[6253]: Accepted publickey for core from 139.178.89.65 port 39620 ssh2: RSA SHA256:1SAgMiZlCHBr0Vs456OmR0PXIyT7CPtESXBBc/039go
Sep 12 17:17:34.626709 sshd[6253]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 12 17:17:34.634772 systemd-logind[1452]: New session 14 of user core.
Sep 12 17:17:34.641330 systemd[1]: Started session-14.scope - Session 14 of User core.
Sep 12 17:17:35.387497 sshd[6253]: pam_unix(sshd:session): session closed for user core
Sep 12 17:17:35.394459 systemd[1]: sshd@13-49.13.6.100:22-139.178.89.65:39620.service: Deactivated successfully.
Sep 12 17:17:35.397369 systemd[1]: session-14.scope: Deactivated successfully.
Sep 12 17:17:35.398737 systemd-logind[1452]: Session 14 logged out. Waiting for processes to exit.
Sep 12 17:17:35.399906 systemd-logind[1452]: Removed session 14.
Sep 12 17:17:35.562437 systemd[1]: Started sshd@14-49.13.6.100:22-139.178.89.65:39628.service - OpenSSH per-connection server daemon (139.178.89.65:39628).
Sep 12 17:17:36.559550 sshd[6266]: Accepted publickey for core from 139.178.89.65 port 39628 ssh2: RSA SHA256:1SAgMiZlCHBr0Vs456OmR0PXIyT7CPtESXBBc/039go
Sep 12 17:17:36.563101 sshd[6266]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 12 17:17:36.569189 systemd-logind[1452]: New session 15 of user core.
Sep 12 17:17:36.574453 systemd[1]: Started session-15.scope - Session 15 of User core.
Sep 12 17:17:37.461339 sshd[6266]: pam_unix(sshd:session): session closed for user core
Sep 12 17:17:37.467677 systemd[1]: sshd@14-49.13.6.100:22-139.178.89.65:39628.service: Deactivated successfully.
Sep 12 17:17:37.470562 systemd[1]: session-15.scope: Deactivated successfully.
Sep 12 17:17:37.472620 systemd-logind[1452]: Session 15 logged out. Waiting for processes to exit.
Sep 12 17:17:37.474582 systemd-logind[1452]: Removed session 15.
Sep 12 17:17:37.646475 systemd[1]: Started sshd@15-49.13.6.100:22-139.178.89.65:39632.service - OpenSSH per-connection server daemon (139.178.89.65:39632).
Sep 12 17:17:38.698690 sshd[6277]: Accepted publickey for core from 139.178.89.65 port 39632 ssh2: RSA SHA256:1SAgMiZlCHBr0Vs456OmR0PXIyT7CPtESXBBc/039go
Sep 12 17:17:38.700841 sshd[6277]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 12 17:17:38.707519 systemd-logind[1452]: New session 16 of user core.
Sep 12 17:17:38.710325 systemd[1]: Started session-16.scope - Session 16 of User core.
Sep 12 17:17:39.975031 sshd[6277]: pam_unix(sshd:session): session closed for user core
Sep 12 17:17:39.980853 systemd-logind[1452]: Session 16 logged out. Waiting for processes to exit.
Sep 12 17:17:39.981404 systemd[1]: sshd@15-49.13.6.100:22-139.178.89.65:39632.service: Deactivated successfully.
Sep 12 17:17:39.987503 systemd[1]: session-16.scope: Deactivated successfully.
Sep 12 17:17:39.989741 systemd-logind[1452]: Removed session 16.
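Each login above follows the same five-step pattern: sshd accepts the key, PAM opens the session, systemd-logind allocates session N, a session-N.scope unit runs it, and teardown happens in reverse. A small sketch that pairs the open/close records from a journal dump like this one and prints how long each connection lasted (assumes this log's "Sep 12 17:17:03.015658" timestamp shape; illustrative only):

package main

import (
	"bufio"
	"fmt"
	"os"
	"regexp"
	"time"
)

var (
	// Matches lines like:
	//   Sep 12 17:17:03.015658 sshd[6088]: pam_unix(sshd:session): session opened ...
	re    = regexp.MustCompile(`^(\w+ \d+ [\d:.]+) sshd\[(\d+)\]: pam_unix\(sshd:session\): session (opened|closed)`)
	stamp = "Jan 2 15:04:05.000000"
)

func main() {
	opened := map[string]time.Time{} // sshd PID -> time the session opened
	sc := bufio.NewScanner(os.Stdin)
	for sc.Scan() {
		m := re.FindStringSubmatch(sc.Text())
		if m == nil {
			continue
		}
		t, err := time.Parse(stamp, m[1])
		if err != nil {
			continue
		}
		switch m[3] {
		case "opened":
			opened[m[2]] = t
		case "closed":
			if t0, ok := opened[m[2]]; ok {
				fmt.Printf("sshd[%s]: session lasted %s\n", m[2], t.Sub(t0))
				delete(opened, m[2])
			}
		}
	}
}

Fed the records above, this would report, for example, that sshd[6088] (session 8) lasted roughly 0.81s, which is consistent with short scripted logins rather than interactive use.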
Sep 12 17:17:40.154576 systemd[1]: Started sshd@16-49.13.6.100:22-139.178.89.65:45198.service - OpenSSH per-connection server daemon (139.178.89.65:45198).
Sep 12 17:17:41.127002 sshd[6294]: Accepted publickey for core from 139.178.89.65 port 45198 ssh2: RSA SHA256:1SAgMiZlCHBr0Vs456OmR0PXIyT7CPtESXBBc/039go
Sep 12 17:17:41.128032 sshd[6294]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 12 17:17:41.133262 systemd-logind[1452]: New session 17 of user core.
Sep 12 17:17:41.140415 systemd[1]: Started session-17.scope - Session 17 of User core.
Sep 12 17:17:42.009010 sshd[6294]: pam_unix(sshd:session): session closed for user core
Sep 12 17:17:42.014211 systemd[1]: sshd@16-49.13.6.100:22-139.178.89.65:45198.service: Deactivated successfully.
Sep 12 17:17:42.016582 systemd[1]: session-17.scope: Deactivated successfully.
Sep 12 17:17:42.019672 systemd-logind[1452]: Session 17 logged out. Waiting for processes to exit.
Sep 12 17:17:42.021450 systemd-logind[1452]: Removed session 17.
Sep 12 17:17:42.181327 systemd[1]: Started sshd@17-49.13.6.100:22-139.178.89.65:45204.service - OpenSSH per-connection server daemon (139.178.89.65:45204).
Sep 12 17:17:43.174115 sshd[6305]: Accepted publickey for core from 139.178.89.65 port 45204 ssh2: RSA SHA256:1SAgMiZlCHBr0Vs456OmR0PXIyT7CPtESXBBc/039go
Sep 12 17:17:43.175626 sshd[6305]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 12 17:17:43.180127 systemd-logind[1452]: New session 18 of user core.
Sep 12 17:17:43.187376 systemd[1]: Started session-18.scope - Session 18 of User core.
Sep 12 17:17:43.931759 sshd[6305]: pam_unix(sshd:session): session closed for user core
Sep 12 17:17:43.936915 systemd[1]: sshd@17-49.13.6.100:22-139.178.89.65:45204.service: Deactivated successfully.
Sep 12 17:17:43.942059 systemd[1]: session-18.scope: Deactivated successfully.
Sep 12 17:17:43.943325 systemd-logind[1452]: Session 18 logged out. Waiting for processes to exit.
Sep 12 17:17:43.944609 systemd-logind[1452]: Removed session 18.
Sep 12 17:17:49.113406 systemd[1]: Started sshd@18-49.13.6.100:22-139.178.89.65:45208.service - OpenSSH per-connection server daemon (139.178.89.65:45208).
Sep 12 17:17:50.104046 sshd[6376]: Accepted publickey for core from 139.178.89.65 port 45208 ssh2: RSA SHA256:1SAgMiZlCHBr0Vs456OmR0PXIyT7CPtESXBBc/039go
Sep 12 17:17:50.106894 sshd[6376]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 12 17:17:50.113000 systemd-logind[1452]: New session 19 of user core.
Sep 12 17:17:50.117340 systemd[1]: Started session-19.scope - Session 19 of User core.
Sep 12 17:17:50.870904 sshd[6376]: pam_unix(sshd:session): session closed for user core
Sep 12 17:17:50.876699 systemd[1]: sshd@18-49.13.6.100:22-139.178.89.65:45208.service: Deactivated successfully.
Sep 12 17:17:50.876973 systemd-logind[1452]: Session 19 logged out. Waiting for processes to exit.
Sep 12 17:17:50.880681 systemd[1]: session-19.scope: Deactivated successfully.
Sep 12 17:17:50.882041 systemd-logind[1452]: Removed session 19.
Sep 12 17:17:56.051374 systemd[1]: Started sshd@19-49.13.6.100:22-139.178.89.65:40616.service - OpenSSH per-connection server daemon (139.178.89.65:40616).
Sep 12 17:17:57.047507 sshd[6419]: Accepted publickey for core from 139.178.89.65 port 40616 ssh2: RSA SHA256:1SAgMiZlCHBr0Vs456OmR0PXIyT7CPtESXBBc/039go
Sep 12 17:17:57.048597 sshd[6419]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 12 17:17:57.053840 systemd-logind[1452]: New session 20 of user core.
Sep 12 17:17:57.059252 systemd[1]: Started session-20.scope - Session 20 of User core.
Sep 12 17:17:57.809385 sshd[6419]: pam_unix(sshd:session): session closed for user core
Sep 12 17:17:57.814412 systemd[1]: sshd@19-49.13.6.100:22-139.178.89.65:40616.service: Deactivated successfully.
Sep 12 17:17:57.820209 systemd[1]: session-20.scope: Deactivated successfully.
Sep 12 17:17:57.821587 systemd-logind[1452]: Session 20 logged out. Waiting for processes to exit.
Sep 12 17:17:57.824997 systemd-logind[1452]: Removed session 20.
Sep 12 17:18:12.910147 kubelet[2556]: E0912 17:18:12.909745 2556 controller.go:195] "Failed to update lease" err="Put \"https://49.13.6.100:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4081-3-6-2-0999f1dc3d?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Sep 12 17:18:13.041183 kubelet[2556]: E0912 17:18:13.040720 2556 controller.go:195] "Failed to update lease" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.0.3:53724->10.0.0.2:2379: read: connection timed out"
Sep 12 17:18:13.049617 systemd[1]: cri-containerd-9cade8ef149aa73c072485c98ab4f0156d4036231e8627e7e375103307b567ec.scope: Deactivated successfully.
Sep 12 17:18:13.049878 systemd[1]: cri-containerd-9cade8ef149aa73c072485c98ab4f0156d4036231e8627e7e375103307b567ec.scope: Consumed 4.040s CPU time, 15.6M memory peak, 0B memory swap peak.
Sep 12 17:18:13.084526 containerd[1474]: time="2025-09-12T17:18:13.084299769Z" level=info msg="shim disconnected" id=9cade8ef149aa73c072485c98ab4f0156d4036231e8627e7e375103307b567ec namespace=k8s.io
Sep 12 17:18:13.084526 containerd[1474]: time="2025-09-12T17:18:13.084364050Z" level=warning msg="cleaning up after shim disconnected" id=9cade8ef149aa73c072485c98ab4f0156d4036231e8627e7e375103307b567ec namespace=k8s.io
Sep 12 17:18:13.084526 containerd[1474]: time="2025-09-12T17:18:13.084375410Z" level=info msg="cleaning up dead shim" namespace=k8s.io
Sep 12 17:18:13.087387 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-9cade8ef149aa73c072485c98ab4f0156d4036231e8627e7e375103307b567ec-rootfs.mount: Deactivated successfully.
Sep 12 17:18:13.194259 systemd[1]: cri-containerd-2a6634121269b87912849f7839284ccd259b93fa9dda530aee83149a64895cc0.scope: Deactivated successfully.
Sep 12 17:18:13.194889 systemd[1]: cri-containerd-2a6634121269b87912849f7839284ccd259b93fa9dda530aee83149a64895cc0.scope: Consumed 21.970s CPU time.
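The two "Failed to update lease" errors are kubelet's node heartbeat: each renewal is a PUT of a coordination.k8s.io Lease object, which the API server persists to etcd, so the etcd read timeout (10.0.0.2:2379) surfaces here first. A minimal client-go sketch of the same renewal (illustrative usage, not kubelet's nodelease controller; the node name and 10s timeout are taken from the URL in the log):

package main

import (
	"context"
	"time"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/rest"
)

func main() {
	cfg, err := rest.InClusterConfig()
	if err != nil {
		panic(err)
	}
	cs, err := kubernetes.NewForConfig(cfg)
	if err != nil {
		panic(err)
	}

	// Matches the ?timeout=10s on the failing request above.
	ctx, cancel := context.WithTimeout(context.Background(), 10*time.Second)
	defer cancel()

	leases := cs.CoordinationV1().Leases("kube-node-lease")
	lease, err := leases.Get(ctx, "ci-4081-3-6-2-0999f1dc3d", metav1.GetOptions{})
	if err != nil {
		panic(err)
	}
	now := metav1.NewMicroTime(time.Now())
	lease.Spec.RenewTime = &now
	if _, err := leases.Update(ctx, lease, metav1.UpdateOptions{}); err != nil {
		// This is the path that produces "Failed to update lease" in the log.
		panic(err)
	}
}

The scope deactivations that follow are consistent with the same etcd stall: the control-plane containers backed by etcd exit, and containerd's shims disconnect.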
Sep 12 17:18:13.224251 containerd[1474]: time="2025-09-12T17:18:13.224164001Z" level=info msg="shim disconnected" id=2a6634121269b87912849f7839284ccd259b93fa9dda530aee83149a64895cc0 namespace=k8s.io
Sep 12 17:18:13.224251 containerd[1474]: time="2025-09-12T17:18:13.224218002Z" level=warning msg="cleaning up after shim disconnected" id=2a6634121269b87912849f7839284ccd259b93fa9dda530aee83149a64895cc0 namespace=k8s.io
Sep 12 17:18:13.224251 containerd[1474]: time="2025-09-12T17:18:13.224226122Z" level=info msg="cleaning up dead shim" namespace=k8s.io
Sep 12 17:18:13.228352 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-2a6634121269b87912849f7839284ccd259b93fa9dda530aee83149a64895cc0-rootfs.mount: Deactivated successfully.
Sep 12 17:18:13.462509 systemd[1]: cri-containerd-a13a73d4831314da2f237a685be10d25adf0e7716808bdbdbd0122a7cb74084a.scope: Deactivated successfully.
Sep 12 17:18:13.464776 systemd[1]: cri-containerd-a13a73d4831314da2f237a685be10d25adf0e7716808bdbdbd0122a7cb74084a.scope: Consumed 5.743s CPU time, 17.6M memory peak, 0B memory swap peak.
Sep 12 17:18:13.490008 containerd[1474]: time="2025-09-12T17:18:13.489910774Z" level=info msg="shim disconnected" id=a13a73d4831314da2f237a685be10d25adf0e7716808bdbdbd0122a7cb74084a namespace=k8s.io
Sep 12 17:18:13.490211 containerd[1474]: time="2025-09-12T17:18:13.490002575Z" level=warning msg="cleaning up after shim disconnected" id=a13a73d4831314da2f237a685be10d25adf0e7716808bdbdbd0122a7cb74084a namespace=k8s.io
Sep 12 17:18:13.490211 containerd[1474]: time="2025-09-12T17:18:13.490024015Z" level=info msg="cleaning up dead shim" namespace=k8s.io
Sep 12 17:18:13.492617 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-a13a73d4831314da2f237a685be10d25adf0e7716808bdbdbd0122a7cb74084a-rootfs.mount: Deactivated successfully.
Sep 12 17:18:13.827583 kubelet[2556]: I0912 17:18:13.826948 2556 scope.go:117] "RemoveContainer" containerID="2a6634121269b87912849f7839284ccd259b93fa9dda530aee83149a64895cc0"
Sep 12 17:18:13.829203 kubelet[2556]: I0912 17:18:13.829009 2556 scope.go:117] "RemoveContainer" containerID="9cade8ef149aa73c072485c98ab4f0156d4036231e8627e7e375103307b567ec"
Sep 12 17:18:13.830440 containerd[1474]: time="2025-09-12T17:18:13.830240313Z" level=info msg="CreateContainer within sandbox \"a8f3ca42cfb4b60e00a12c13d8fffc36d3766888554744a1e33b91ca2831f793\" for container &ContainerMetadata{Name:tigera-operator,Attempt:1,}"
Sep 12 17:18:13.831684 kubelet[2556]: I0912 17:18:13.831609 2556 scope.go:117] "RemoveContainer" containerID="a13a73d4831314da2f237a685be10d25adf0e7716808bdbdbd0122a7cb74084a"
Sep 12 17:18:13.832802 containerd[1474]: time="2025-09-12T17:18:13.832654709Z" level=info msg="CreateContainer within sandbox \"df022c1f940687dc693a95621fa10ebf6f829cad92d0ab0c9a5a4e31fda9c245\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:1,}"
Sep 12 17:18:13.843162 containerd[1474]: time="2025-09-12T17:18:13.843121627Z" level=info msg="CreateContainer within sandbox \"51a3fa984ebcecfd7d63b13116b415fa2ef51a650700ec86b987abee85d07a27\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:1,}"
Sep 12 17:18:13.866167 containerd[1474]: time="2025-09-12T17:18:13.866128175Z" level=info msg="CreateContainer within sandbox \"51a3fa984ebcecfd7d63b13116b415fa2ef51a650700ec86b987abee85d07a27\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:1,} returns container id \"a3bf0e54835733b6ab9f99a3842aaea1cb7a33ae8289c01a403e138c4fb95e74\""
Sep 12 17:18:13.867121 containerd[1474]: time="2025-09-12T17:18:13.866945867Z" level=info msg="StartContainer for \"a3bf0e54835733b6ab9f99a3842aaea1cb7a33ae8289c01a403e138c4fb95e74\""
Sep 12 17:18:13.871629 containerd[1474]: time="2025-09-12T17:18:13.871576697Z" level=info msg="CreateContainer within sandbox \"a8f3ca42cfb4b60e00a12c13d8fffc36d3766888554744a1e33b91ca2831f793\" for &ContainerMetadata{Name:tigera-operator,Attempt:1,} returns container id \"d02e675bc9add0168689668c811fdece2ebcba02ce306a7856b2c845399cd2e9\""
Sep 12 17:18:13.873143 containerd[1474]: time="2025-09-12T17:18:13.872311548Z" level=info msg="StartContainer for \"d02e675bc9add0168689668c811fdece2ebcba02ce306a7856b2c845399cd2e9\""
Sep 12 17:18:13.906488 containerd[1474]: time="2025-09-12T17:18:13.906433303Z" level=info msg="CreateContainer within sandbox \"df022c1f940687dc693a95621fa10ebf6f829cad92d0ab0c9a5a4e31fda9c245\" for &ContainerMetadata{Name:kube-scheduler,Attempt:1,} returns container id \"ffa734d743dd57ad1b8a8619368c817b0efcfb935eff29b8faf8dfe9bfbfa993\""
Sep 12 17:18:13.906987 containerd[1474]: time="2025-09-12T17:18:13.906884430Z" level=info msg="StartContainer for \"ffa734d743dd57ad1b8a8619368c817b0efcfb935eff29b8faf8dfe9bfbfa993\""
Sep 12 17:18:13.912296 systemd[1]: Started cri-containerd-d02e675bc9add0168689668c811fdece2ebcba02ce306a7856b2c845399cd2e9.scope - libcontainer container d02e675bc9add0168689668c811fdece2ebcba02ce306a7856b2c845399cd2e9.
Sep 12 17:18:13.922880 systemd[1]: Started cri-containerd-a3bf0e54835733b6ab9f99a3842aaea1cb7a33ae8289c01a403e138c4fb95e74.scope - libcontainer container a3bf0e54835733b6ab9f99a3842aaea1cb7a33ae8289c01a403e138c4fb95e74.
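After the teardown, kubelet removes the dead containers and containerd recreates them in the same sandboxes with Attempt:1; the attempt counter in ContainerMetadata is what distinguishes a restart from a brand-new container. A sketch that lists containers and their attempt counts over CRI, which would show the 0 to 1 bump above (same assumed socket as the earlier sketch):

package main

import (
	"context"
	"fmt"

	"google.golang.org/grpc"
	"google.golang.org/grpc/credentials/insecure"
	runtimeapi "k8s.io/cri-api/pkg/apis/runtime/v1"
)

func main() {
	conn, err := grpc.NewClient("unix:///run/containerd/containerd.sock",
		grpc.WithTransportCredentials(insecure.NewCredentials()))
	if err != nil {
		panic(err)
	}
	defer conn.Close()

	rt := runtimeapi.NewRuntimeServiceClient(conn)
	resp, err := rt.ListContainers(context.Background(), &runtimeapi.ListContainersRequest{})
	if err != nil {
		panic(err)
	}
	for _, c := range resp.Containers {
		// Metadata carries the same Name/Attempt pair printed in the
		// CreateContainer records above.
		fmt.Printf("%s attempt=%d state=%s\n", c.Metadata.Name, c.Metadata.Attempt, c.State)
	}
}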
Sep 12 17:18:13.946230 systemd[1]: Started cri-containerd-ffa734d743dd57ad1b8a8619368c817b0efcfb935eff29b8faf8dfe9bfbfa993.scope - libcontainer container ffa734d743dd57ad1b8a8619368c817b0efcfb935eff29b8faf8dfe9bfbfa993.
Sep 12 17:18:13.984827 containerd[1474]: time="2025-09-12T17:18:13.983025260Z" level=info msg="StartContainer for \"a3bf0e54835733b6ab9f99a3842aaea1cb7a33ae8289c01a403e138c4fb95e74\" returns successfully"
Sep 12 17:18:13.984827 containerd[1474]: time="2025-09-12T17:18:13.983363585Z" level=info msg="StartContainer for \"d02e675bc9add0168689668c811fdece2ebcba02ce306a7856b2c845399cd2e9\" returns successfully"
Sep 12 17:18:14.012087 containerd[1474]: time="2025-09-12T17:18:14.012039815Z" level=info msg="StartContainer for \"ffa734d743dd57ad1b8a8619368c817b0efcfb935eff29b8faf8dfe9bfbfa993\" returns successfully"
Sep 12 17:18:14.824902 update_engine[1453]: I20250912 17:18:14.823143 1453 prefs.cc:52] certificate-report-to-send-update not present in /var/lib/update_engine/prefs
Sep 12 17:18:14.824902 update_engine[1453]: I20250912 17:18:14.823199 1453 prefs.cc:52] certificate-report-to-send-download not present in /var/lib/update_engine/prefs
Sep 12 17:18:14.824902 update_engine[1453]: I20250912 17:18:14.823449 1453 prefs.cc:52] aleph-version not present in /var/lib/update_engine/prefs
Sep 12 17:18:14.824902 update_engine[1453]: I20250912 17:18:14.823845 1453 omaha_request_params.cc:62] Current group set to lts
Sep 12 17:18:14.824902 update_engine[1453]: I20250912 17:18:14.823937 1453 update_attempter.cc:499] Already updated boot flags. Skipping.
Sep 12 17:18:14.824902 update_engine[1453]: I20250912 17:18:14.823946 1453 update_attempter.cc:643] Scheduling an action processor start.
Sep 12 17:18:14.824902 update_engine[1453]: I20250912 17:18:14.823961 1453 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction
Sep 12 17:18:14.825703 locksmithd[1503]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_CHECKING_FOR_UPDATE" NewVersion=0.0.0 NewSize=0
Sep 12 17:18:14.830076 update_engine[1453]: I20250912 17:18:14.828550 1453 prefs.cc:52] previous-version not present in /var/lib/update_engine/prefs
Sep 12 17:18:14.830076 update_engine[1453]: I20250912 17:18:14.828637 1453 omaha_request_action.cc:271] Posting an Omaha request to disabled
Sep 12 17:18:14.830076 update_engine[1453]: I20250912 17:18:14.828646 1453 omaha_request_action.cc:272] Request:
Sep 12 17:18:14.830076 update_engine[1453]:
Sep 12 17:18:14.830076 update_engine[1453]:
Sep 12 17:18:14.830076 update_engine[1453]:
Sep 12 17:18:14.830076 update_engine[1453]:
Sep 12 17:18:14.830076 update_engine[1453]:
Sep 12 17:18:14.830076 update_engine[1453]:
Sep 12 17:18:14.830076 update_engine[1453]:
Sep 12 17:18:14.830076 update_engine[1453]:
Sep 12 17:18:14.830076 update_engine[1453]: I20250912 17:18:14.828651 1453 libcurl_http_fetcher.cc:47] Starting/Resuming transfer
Sep 12 17:18:14.835572 update_engine[1453]: I20250912 17:18:14.832752 1453 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP
Sep 12 17:18:14.836310 update_engine[1453]: I20250912 17:18:14.836277 1453 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds.
Sep 12 17:18:14.837374 update_engine[1453]: E20250912 17:18:14.837340 1453 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled
Sep 12 17:18:14.837555 update_engine[1453]: I20250912 17:18:14.837530 1453 libcurl_http_fetcher.cc:283] No HTTP response, retry 1
Sep 12 17:18:18.163756 kubelet[2556]: E0912 17:18:18.163450 2556 event.go:359] "Server rejected event (will not retry!)" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.0.3:53528->10.0.0.2:2379: read: connection timed out" event="&Event{ObjectMeta:{kube-apiserver-ci-4081-3-6-2-0999f1dc3d.1864988d684c29be kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:kube-apiserver-ci-4081-3-6-2-0999f1dc3d,UID:b78a3622f27d83743c5fd6b9d1da6a02,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Readiness probe failed: HTTP probe failed with statuscode: 500,Source:EventSource{Component:kubelet,Host:ci-4081-3-6-2-0999f1dc3d,},FirstTimestamp:2025-09-12 17:18:07.688968638 +0000 UTC m=+229.351709997,LastTimestamp:2025-09-12 17:18:07.688968638 +0000 UTC m=+229.351709997,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4081-3-6-2-0999f1dc3d,}"
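"Posting an Omaha request to disabled" and the curl error "Could not resolve host: disabled" are expected when automatic updates are switched off: the update server URL is literally the string disabled, so the fetch can never resolve. On Flatcar this normally comes from the update configuration file; a typical snippet consistent with the "Current group set to lts" line above (an assumption about this host's configuration, not something the log shows directly):

# /etc/flatcar/update.conf
GROUP=lts
SERVER=disabled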