Sep 5 23:57:49.913553 kernel: Booting Linux on physical CPU 0x0000000000 [0x413fd0c1]
Sep 5 23:57:49.913577 kernel: Linux version 6.6.103-flatcar (build@pony-truck.infra.kinvolk.io) (aarch64-cros-linux-gnu-gcc (Gentoo Hardened 13.3.1_p20240614 p17) 13.3.1 20240614, GNU ld (Gentoo 2.42 p3) 2.42.0) #1 SMP PREEMPT Fri Sep 5 22:30:47 -00 2025
Sep 5 23:57:49.913587 kernel: KASLR enabled
Sep 5 23:57:49.913593 kernel: efi: EFI v2.7 by Ubuntu distribution of EDK II
Sep 5 23:57:49.913598 kernel: efi: SMBIOS 3.0=0x139ed0000 MEMATTR=0x1390c1018 ACPI 2.0=0x136760018 RNG=0x13676e918 MEMRESERVE=0x136b43d18
Sep 5 23:57:49.913604 kernel: random: crng init done
Sep 5 23:57:49.913611 kernel: ACPI: Early table checksum verification disabled
Sep 5 23:57:49.913617 kernel: ACPI: RSDP 0x0000000136760018 000024 (v02 BOCHS )
Sep 5 23:57:49.913623 kernel: ACPI: XSDT 0x000000013676FE98 00006C (v01 BOCHS BXPC 00000001 01000013)
Sep 5 23:57:49.913631 kernel: ACPI: FACP 0x000000013676FA98 000114 (v06 BOCHS BXPC 00000001 BXPC 00000001)
Sep 5 23:57:49.913637 kernel: ACPI: DSDT 0x0000000136767518 001468 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Sep 5 23:57:49.913643 kernel: ACPI: APIC 0x000000013676FC18 000108 (v04 BOCHS BXPC 00000001 BXPC 00000001)
Sep 5 23:57:49.913648 kernel: ACPI: PPTT 0x000000013676FD98 000060 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Sep 5 23:57:49.913654 kernel: ACPI: GTDT 0x000000013676D898 000060 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Sep 5 23:57:49.913662 kernel: ACPI: MCFG 0x000000013676FF98 00003C (v01 BOCHS BXPC 00000001 BXPC 00000001)
Sep 5 23:57:49.913670 kernel: ACPI: SPCR 0x000000013676E818 000050 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Sep 5 23:57:49.913676 kernel: ACPI: DBG2 0x000000013676E898 000057 (v00 BOCHS BXPC 00000001 BXPC 00000001)
Sep 5 23:57:49.913682 kernel: ACPI: IORT 0x000000013676E418 000080 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Sep 5 23:57:49.913689 kernel: ACPI: BGRT 0x000000013676E798 000038 (v01 INTEL EDK2 00000002 01000013)
Sep 5 23:57:49.913695 kernel: ACPI: SPCR: console: pl011,mmio32,0x9000000,9600
Sep 5 23:57:49.913701 kernel: NUMA: Failed to initialise from firmware
Sep 5 23:57:49.913708 kernel: NUMA: Faking a node at [mem 0x0000000040000000-0x0000000139ffffff]
Sep 5 23:57:49.913714 kernel: NUMA: NODE_DATA [mem 0x13966e800-0x139673fff]
Sep 5 23:57:49.913720 kernel: Zone ranges:
Sep 5 23:57:49.913726 kernel:   DMA    [mem 0x0000000040000000-0x00000000ffffffff]
Sep 5 23:57:49.913734 kernel:   DMA32  empty
Sep 5 23:57:49.913740 kernel:   Normal [mem 0x0000000100000000-0x0000000139ffffff]
Sep 5 23:57:49.913746 kernel: Movable zone start for each node
Sep 5 23:57:49.913753 kernel: Early memory node ranges
Sep 5 23:57:49.913759 kernel:   node   0: [mem 0x0000000040000000-0x000000013676ffff]
Sep 5 23:57:49.913765 kernel:   node   0: [mem 0x0000000136770000-0x0000000136b3ffff]
Sep 5 23:57:49.913772 kernel:   node   0: [mem 0x0000000136b40000-0x0000000139e1ffff]
Sep 5 23:57:49.913778 kernel:   node   0: [mem 0x0000000139e20000-0x0000000139eaffff]
Sep 5 23:57:49.913784 kernel:   node   0: [mem 0x0000000139eb0000-0x0000000139ebffff]
Sep 5 23:57:49.913790 kernel:   node   0: [mem 0x0000000139ec0000-0x0000000139fdffff]
Sep 5 23:57:49.913796 kernel:   node   0: [mem 0x0000000139fe0000-0x0000000139ffffff]
Sep 5 23:57:49.913803 kernel: Initmem setup node 0 [mem 0x0000000040000000-0x0000000139ffffff]
Sep 5 23:57:49.913810 kernel: On node 0, zone Normal: 24576 pages in unavailable ranges
Sep 5 23:57:49.913817 kernel: psci: probing for conduit method from ACPI.
Sep 5 23:57:49.913823 kernel: psci: PSCIv1.1 detected in firmware.
Sep 5 23:57:49.913833 kernel: psci: Using standard PSCI v0.2 function IDs
Sep 5 23:57:49.913839 kernel: psci: Trusted OS migration not required
Sep 5 23:57:49.913846 kernel: psci: SMC Calling Convention v1.1
Sep 5 23:57:49.913854 kernel: smccc: KVM: hypervisor services detected (0x00000000 0x00000000 0x00000000 0x00000003)
Sep 5 23:57:49.913861 kernel: percpu: Embedded 31 pages/cpu s86632 r8192 d32152 u126976
Sep 5 23:57:49.913868 kernel: pcpu-alloc: s86632 r8192 d32152 u126976 alloc=31*4096
Sep 5 23:57:49.913875 kernel: pcpu-alloc: [0] 0 [0] 1
Sep 5 23:57:49.913881 kernel: Detected PIPT I-cache on CPU0
Sep 5 23:57:49.913888 kernel: CPU features: detected: GIC system register CPU interface
Sep 5 23:57:49.913895 kernel: CPU features: detected: Hardware dirty bit management
Sep 5 23:57:49.913901 kernel: CPU features: detected: Spectre-v4
Sep 5 23:57:49.913908 kernel: CPU features: detected: Spectre-BHB
Sep 5 23:57:49.913915 kernel: CPU features: kernel page table isolation forced ON by KASLR
Sep 5 23:57:49.913923 kernel: CPU features: detected: Kernel page table isolation (KPTI)
Sep 5 23:57:49.913930 kernel: CPU features: detected: ARM erratum 1418040
Sep 5 23:57:49.913936 kernel: CPU features: detected: SSBS not fully self-synchronizing
Sep 5 23:57:49.913943 kernel: alternatives: applying boot alternatives
Sep 5 23:57:49.913951 kernel: Kernel command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyAMA0,115200n8 flatcar.first_boot=detected acpi=force flatcar.oem.id=hetzner verity.usrhash=ac831c89fe9ee7829b7371dadfb138f8d0e2b31ae3a5a920e0eba13bbab016c3
Sep 5 23:57:49.913958 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space.
Sep 5 23:57:49.913965 kernel: Dentry cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Sep 5 23:57:49.913972 kernel: Inode-cache hash table entries: 262144 (order: 9, 2097152 bytes, linear)
Sep 5 23:57:49.913979 kernel: Fallback order for Node 0: 0
Sep 5 23:57:49.913985 kernel: Built 1 zonelists, mobility grouping on.  Total pages: 1008000
Sep 5 23:57:49.913992 kernel: Policy zone: Normal
Sep 5 23:57:49.914000 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Sep 5 23:57:49.914007 kernel: software IO TLB: area num 2.
Sep 5 23:57:49.914013 kernel: software IO TLB: mapped [mem 0x00000000fbfff000-0x00000000fffff000] (64MB)
Sep 5 23:57:49.914021 kernel: Memory: 3882804K/4096000K available (10304K kernel code, 2186K rwdata, 8108K rodata, 39424K init, 897K bss, 213196K reserved, 0K cma-reserved)
Sep 5 23:57:49.914028 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=2, Nodes=1
Sep 5 23:57:49.914035 kernel: rcu: Preemptible hierarchical RCU implementation.
Sep 5 23:57:49.914055 kernel: rcu: RCU event tracing is enabled.
Sep 5 23:57:49.914065 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=2.
Sep 5 23:57:49.914072 kernel: Trampoline variant of Tasks RCU enabled.
Sep 5 23:57:49.914078 kernel: Tracing variant of Tasks RCU enabled.
Sep 5 23:57:49.914085 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Sep 5 23:57:49.914094 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=2
Sep 5 23:57:49.914101 kernel: NR_IRQS: 64, nr_irqs: 64, preallocated irqs: 0
Sep 5 23:57:49.914108 kernel: GICv3: 256 SPIs implemented
Sep 5 23:57:49.914114 kernel: GICv3: 0 Extended SPIs implemented
Sep 5 23:57:49.914121 kernel: Root IRQ handler: gic_handle_irq
Sep 5 23:57:49.914128 kernel: GICv3: GICv3 features: 16 PPIs, DirectLPI
Sep 5 23:57:49.914134 kernel: GICv3: CPU0: found redistributor 0 region 0:0x00000000080a0000
Sep 5 23:57:49.914141 kernel: ITS [mem 0x08080000-0x0809ffff]
Sep 5 23:57:49.914148 kernel: ITS@0x0000000008080000: allocated 8192 Devices @1000c0000 (indirect, esz 8, psz 64K, shr 1)
Sep 5 23:57:49.914167 kernel: ITS@0x0000000008080000: allocated 8192 Interrupt Collections @1000d0000 (flat, esz 8, psz 64K, shr 1)
Sep 5 23:57:49.914175 kernel: GICv3: using LPI property table @0x00000001000e0000
Sep 5 23:57:49.914181 kernel: GICv3: CPU0: using allocated LPI pending table @0x00000001000f0000
Sep 5 23:57:49.914192 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Sep 5 23:57:49.914199 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
Sep 5 23:57:49.914206 kernel: arch_timer: cp15 timer(s) running at 25.00MHz (virt).
Sep 5 23:57:49.914212 kernel: clocksource: arch_sys_counter: mask: 0xffffffffffffff max_cycles: 0x5c40939b5, max_idle_ns: 440795202646 ns
Sep 5 23:57:49.914219 kernel: sched_clock: 56 bits at 25MHz, resolution 40ns, wraps every 4398046511100ns
Sep 5 23:57:49.914226 kernel: Console: colour dummy device 80x25
Sep 5 23:57:49.914233 kernel: ACPI: Core revision 20230628
Sep 5 23:57:49.914240 kernel: Calibrating delay loop (skipped), value calculated using timer frequency.. 50.00 BogoMIPS (lpj=25000)
Sep 5 23:57:49.914247 kernel: pid_max: default: 32768 minimum: 301
Sep 5 23:57:49.914254 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,integrity
Sep 5 23:57:49.914262 kernel: landlock: Up and running.
Sep 5 23:57:49.914269 kernel: SELinux: Initializing.
Sep 5 23:57:49.914276 kernel: Mount-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
Sep 5 23:57:49.914283 kernel: Mountpoint-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
Sep 5 23:57:49.914290 kernel: RCU Tasks: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Sep 5 23:57:49.914297 kernel: RCU Tasks Trace: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Sep 5 23:57:49.914303 kernel: rcu: Hierarchical SRCU implementation.
Sep 5 23:57:49.914310 kernel: rcu: Max phase no-delay instances is 400.
Sep 5 23:57:49.914317 kernel: Platform MSI: ITS@0x8080000 domain created
Sep 5 23:57:49.914326 kernel: PCI/MSI: ITS@0x8080000 domain created
Sep 5 23:57:49.914333 kernel: Remapping and enabling EFI services.
Sep 5 23:57:49.914340 kernel: smp: Bringing up secondary CPUs ...
Sep 5 23:57:49.914347 kernel: Detected PIPT I-cache on CPU1
Sep 5 23:57:49.914354 kernel: GICv3: CPU1: found redistributor 1 region 0:0x00000000080c0000
Sep 5 23:57:49.914360 kernel: GICv3: CPU1: using allocated LPI pending table @0x0000000100100000
Sep 5 23:57:49.914367 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
Sep 5 23:57:49.914374 kernel: CPU1: Booted secondary processor 0x0000000001 [0x413fd0c1]
Sep 5 23:57:49.914381 kernel: smp: Brought up 1 node, 2 CPUs
Sep 5 23:57:49.914388 kernel: SMP: Total of 2 processors activated.
Sep 5 23:57:49.914396 kernel: CPU features: detected: 32-bit EL0 Support
Sep 5 23:57:49.914403 kernel: CPU features: detected: Data cache clean to the PoU not required for I/D coherence
Sep 5 23:57:49.914415 kernel: CPU features: detected: Common not Private translations
Sep 5 23:57:49.914424 kernel: CPU features: detected: CRC32 instructions
Sep 5 23:57:49.914431 kernel: CPU features: detected: Enhanced Virtualization Traps
Sep 5 23:57:49.914438 kernel: CPU features: detected: RCpc load-acquire (LDAPR)
Sep 5 23:57:49.914445 kernel: CPU features: detected: LSE atomic instructions
Sep 5 23:57:49.914453 kernel: CPU features: detected: Privileged Access Never
Sep 5 23:57:49.914460 kernel: CPU features: detected: RAS Extension Support
Sep 5 23:57:49.914469 kernel: CPU features: detected: Speculative Store Bypassing Safe (SSBS)
Sep 5 23:57:49.914476 kernel: CPU: All CPU(s) started at EL1
Sep 5 23:57:49.914483 kernel: alternatives: applying system-wide alternatives
Sep 5 23:57:49.914491 kernel: devtmpfs: initialized
Sep 5 23:57:49.914498 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Sep 5 23:57:49.914506 kernel: futex hash table entries: 512 (order: 3, 32768 bytes, linear)
Sep 5 23:57:49.914513 kernel: pinctrl core: initialized pinctrl subsystem
Sep 5 23:57:49.914522 kernel: SMBIOS 3.0.0 present.
Sep 5 23:57:49.914529 kernel: DMI: Hetzner vServer/KVM Virtual Machine, BIOS 20171111 11/11/2017
Sep 5 23:57:49.914536 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Sep 5 23:57:49.914543 kernel: DMA: preallocated 512 KiB GFP_KERNEL pool for atomic allocations
Sep 5 23:57:49.914551 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Sep 5 23:57:49.914558 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Sep 5 23:57:49.914567 kernel: audit: initializing netlink subsys (disabled)
Sep 5 23:57:49.914575 kernel: audit: type=2000 audit(0.012:1): state=initialized audit_enabled=0 res=1
Sep 5 23:57:49.914582 kernel: thermal_sys: Registered thermal governor 'step_wise'
Sep 5 23:57:49.914591 kernel: cpuidle: using governor menu
Sep 5 23:57:49.914598 kernel: hw-breakpoint: found 6 breakpoint and 4 watchpoint registers.
Sep 5 23:57:49.914605 kernel: ASID allocator initialised with 32768 entries
Sep 5 23:57:49.914612 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Sep 5 23:57:49.914620 kernel: Serial: AMBA PL011 UART driver
Sep 5 23:57:49.914627 kernel: Modules: 2G module region forced by RANDOMIZE_MODULE_REGION_FULL
Sep 5 23:57:49.914635 kernel: Modules: 0 pages in range for non-PLT usage
Sep 5 23:57:49.914642 kernel: Modules: 509008 pages in range for PLT usage
Sep 5 23:57:49.914649 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Sep 5 23:57:49.914663 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 1.00 GiB page
Sep 5 23:57:49.914671 kernel: HugeTLB: registered 32.0 MiB page size, pre-allocated 0 pages
Sep 5 23:57:49.914678 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 32.0 MiB page
Sep 5 23:57:49.914686 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Sep 5 23:57:49.914693 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 2.00 MiB page
Sep 5 23:57:49.914700 kernel: HugeTLB: registered 64.0 KiB page size, pre-allocated 0 pages
Sep 5 23:57:49.914707 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 64.0 KiB page
Sep 5 23:57:49.914714 kernel: ACPI: Added _OSI(Module Device)
Sep 5 23:57:49.914722 kernel: ACPI: Added _OSI(Processor Device)
Sep 5 23:57:49.914730 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Sep 5 23:57:49.914737 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Sep 5 23:57:49.914745 kernel: ACPI: Interpreter enabled
Sep 5 23:57:49.914752 kernel: ACPI: Using GIC for interrupt routing
Sep 5 23:57:49.914760 kernel: ACPI: MCFG table detected, 1 entries
Sep 5 23:57:49.914767 kernel: ARMH0011:00: ttyAMA0 at MMIO 0x9000000 (irq = 12, base_baud = 0) is a SBSA
Sep 5 23:57:49.914774 kernel: printk: console [ttyAMA0] enabled
Sep 5 23:57:49.914781 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Sep 5 23:57:49.914948 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3]
Sep 5 23:57:49.915027 kernel: acpi PNP0A08:00: _OSC: platform does not support [LTR]
Sep 5 23:57:49.915114 kernel: acpi PNP0A08:00: _OSC: OS now controls [PCIeHotplug PME AER PCIeCapability]
Sep 5 23:57:49.915204 kernel: acpi PNP0A08:00: ECAM area [mem 0x4010000000-0x401fffffff] reserved by PNP0C02:00
Sep 5 23:57:49.915272 kernel: acpi PNP0A08:00: ECAM at [mem 0x4010000000-0x401fffffff] for [bus 00-ff]
Sep 5 23:57:49.915282 kernel: ACPI: Remapped I/O 0x000000003eff0000 to [io 0x0000-0xffff window]
Sep 5 23:57:49.915289 kernel: PCI host bridge to bus 0000:00
Sep 5 23:57:49.915361 kernel: pci_bus 0000:00: root bus resource [mem 0x10000000-0x3efeffff window]
Sep 5 23:57:49.915425 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0xffff window]
Sep 5 23:57:49.915482 kernel: pci_bus 0000:00: root bus resource [mem 0x8000000000-0xffffffffff window]
Sep 5 23:57:49.915539 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Sep 5 23:57:49.915620 kernel: pci 0000:00:00.0: [1b36:0008] type 00 class 0x060000
Sep 5 23:57:49.915696 kernel: pci 0000:00:01.0: [1af4:1050] type 00 class 0x038000
Sep 5 23:57:49.915762 kernel: pci 0000:00:01.0: reg 0x14: [mem 0x11289000-0x11289fff]
Sep 5 23:57:49.915831 kernel: pci 0000:00:01.0: reg 0x20: [mem 0x8000600000-0x8000603fff 64bit pref]
Sep 5 23:57:49.915903 kernel: pci 0000:00:02.0: [1b36:000c] type 01 class 0x060400
Sep 5 23:57:49.915969 kernel: pci 0000:00:02.0: reg 0x10: [mem 0x11288000-0x11288fff]
Sep 5 23:57:49.916075 kernel: pci 0000:00:02.1: [1b36:000c] type 01 class 0x060400
Sep 5 23:57:49.916220 kernel: pci 0000:00:02.1: reg 0x10: [mem 0x11287000-0x11287fff]
Sep 5 23:57:49.916312 kernel: pci 0000:00:02.2: [1b36:000c] type 01 class 0x060400
Sep 5 23:57:49.916382 kernel: pci 0000:00:02.2: reg 0x10: [mem 0x11286000-0x11286fff]
Sep 5 23:57:49.916466 kernel: pci 0000:00:02.3: [1b36:000c] type 01 class 0x060400
Sep 5 23:57:49.916533 kernel: pci 0000:00:02.3: reg 0x10: [mem 0x11285000-0x11285fff]
Sep 5 23:57:49.916605 kernel: pci 0000:00:02.4: [1b36:000c] type 01 class 0x060400
Sep 5 23:57:49.916671 kernel: pci 0000:00:02.4: reg 0x10: [mem 0x11284000-0x11284fff]
Sep 5 23:57:49.916742 kernel: pci 0000:00:02.5: [1b36:000c] type 01 class 0x060400
Sep 5 23:57:49.916810 kernel: pci 0000:00:02.5: reg 0x10: [mem 0x11283000-0x11283fff]
Sep 5 23:57:49.916881 kernel: pci 0000:00:02.6: [1b36:000c] type 01 class 0x060400
Sep 5 23:57:49.916947 kernel: pci 0000:00:02.6: reg 0x10: [mem 0x11282000-0x11282fff]
Sep 5 23:57:49.917018 kernel: pci 0000:00:02.7: [1b36:000c] type 01 class 0x060400
Sep 5 23:57:49.917106 kernel: pci 0000:00:02.7: reg 0x10: [mem 0x11281000-0x11281fff]
Sep 5 23:57:49.919297 kernel: pci 0000:00:03.0: [1b36:000c] type 01 class 0x060400
Sep 5 23:57:49.919389 kernel: pci 0000:00:03.0: reg 0x10: [mem 0x11280000-0x11280fff]
Sep 5 23:57:49.919472 kernel: pci 0000:00:04.0: [1b36:0002] type 00 class 0x070002
Sep 5 23:57:49.919537 kernel: pci 0000:00:04.0: reg 0x10: [io 0x0000-0x0007]
Sep 5 23:57:49.919613 kernel: pci 0000:01:00.0: [1af4:1041] type 00 class 0x020000
Sep 5 23:57:49.919719 kernel: pci 0000:01:00.0: reg 0x14: [mem 0x11000000-0x11000fff]
Sep 5 23:57:49.919810 kernel: pci 0000:01:00.0: reg 0x20: [mem 0x8000000000-0x8000003fff 64bit pref]
Sep 5 23:57:49.919882 kernel: pci 0000:01:00.0: reg 0x30: [mem 0xfff80000-0xffffffff pref]
Sep 5 23:57:49.920056 kernel: pci 0000:02:00.0: [1b36:000d] type 00 class 0x0c0330
Sep 5 23:57:49.920146 kernel: pci 0000:02:00.0: reg 0x10: [mem 0x10e00000-0x10e03fff 64bit]
Sep 5 23:57:49.920255 kernel: pci 0000:03:00.0: [1af4:1043] type 00 class 0x078000
Sep 5 23:57:49.920327 kernel: pci 0000:03:00.0: reg 0x14: [mem 0x10c00000-0x10c00fff]
Sep 5 23:57:49.920393 kernel: pci 0000:03:00.0: reg 0x20: [mem 0x8000100000-0x8000103fff 64bit pref]
Sep 5 23:57:49.920471 kernel: pci 0000:04:00.0: [1af4:1045] type 00 class 0x00ff00
Sep 5 23:57:49.920550 kernel: pci 0000:04:00.0: reg 0x20: [mem 0x8000200000-0x8000203fff 64bit pref]
Sep 5 23:57:49.920640 kernel: pci 0000:05:00.0: [1af4:1044] type 00 class 0x00ff00
Sep 5 23:57:49.920710 kernel: pci 0000:05:00.0: reg 0x20: [mem 0x8000300000-0x8000303fff 64bit pref]
Sep 5 23:57:49.920787 kernel: pci 0000:06:00.0: [1af4:1048] type 00 class 0x010000
Sep 5 23:57:49.920855 kernel: pci 0000:06:00.0: reg 0x14: [mem 0x10600000-0x10600fff]
Sep 5 23:57:49.920923 kernel: pci 0000:06:00.0: reg 0x20: [mem 0x8000400000-0x8000403fff 64bit pref]
Sep 5 23:57:49.920998 kernel: pci 0000:07:00.0: [1af4:1041] type 00 class 0x020000
Sep 5 23:57:49.921111 kernel: pci 0000:07:00.0: reg 0x14: [mem 0x10400000-0x10400fff]
Sep 5 23:57:49.921202 kernel: pci 0000:07:00.0: reg 0x20: [mem 0x8000500000-0x8000503fff 64bit pref]
Sep 5 23:57:49.921290 kernel: pci 0000:07:00.0: reg 0x30: [mem 0xfff80000-0xffffffff pref]
Sep 5 23:57:49.921366 kernel: pci 0000:00:02.0: bridge window [io 0x1000-0x0fff] to [bus 01] add_size 1000
Sep 5 23:57:49.921432 kernel: pci 0000:00:02.0: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 01] add_size 100000 add_align 100000
Sep 5 23:57:49.921497 kernel: pci 0000:00:02.0: bridge window [mem 0x00100000-0x001fffff] to [bus 01] add_size 100000 add_align 100000
Sep 5 23:57:49.921571 kernel: pci 0000:00:02.1: bridge window [io 0x1000-0x0fff] to [bus 02] add_size 1000
Sep 5 23:57:49.921637 kernel: pci 0000:00:02.1: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 02] add_size 200000 add_align 100000
Sep 5 23:57:49.921701 kernel: pci 0000:00:02.1: bridge window [mem 0x00100000-0x001fffff] to [bus 02] add_size 100000 add_align 100000
Sep 5 23:57:49.921769 kernel: pci 0000:00:02.2: bridge window [io 0x1000-0x0fff] to [bus 03] add_size 1000
Sep 5 23:57:49.921834 kernel: pci 0000:00:02.2: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 03] add_size 100000 add_align 100000
Sep 5 23:57:49.921899 kernel: pci 0000:00:02.2: bridge window [mem 0x00100000-0x001fffff] to [bus 03] add_size 100000 add_align 100000
Sep 5 23:57:49.921968 kernel: pci 0000:00:02.3: bridge window [io 0x1000-0x0fff] to [bus 04] add_size 1000
Sep 5 23:57:49.922032 kernel: pci 0000:00:02.3: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 04] add_size 100000 add_align 100000
Sep 5 23:57:49.922128 kernel: pci 0000:00:02.3: bridge window [mem 0x00100000-0x000fffff] to [bus 04] add_size 200000 add_align 100000
Sep 5 23:57:49.924317 kernel: pci 0000:00:02.4: bridge window [io 0x1000-0x0fff] to [bus 05] add_size 1000
Sep 5 23:57:49.924479 kernel: pci 0000:00:02.4: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 05] add_size 100000 add_align 100000
Sep 5 23:57:49.924555 kernel: pci 0000:00:02.4: bridge window [mem 0x00100000-0x000fffff] to [bus 05] add_size 200000 add_align 100000
Sep 5 23:57:49.924655 kernel: pci 0000:00:02.5: bridge window [io 0x1000-0x0fff] to [bus 06] add_size 1000
Sep 5 23:57:49.924724 kernel: pci 0000:00:02.5: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 06] add_size 100000 add_align 100000
Sep 5 23:57:49.924788 kernel: pci 0000:00:02.5: bridge window [mem 0x00100000-0x001fffff] to [bus 06] add_size 100000 add_align 100000
Sep 5 23:57:49.924863 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x0fff] to [bus 07] add_size 1000
Sep 5 23:57:49.924926 kernel: pci 0000:00:02.6: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 07] add_size 100000 add_align 100000
Sep 5 23:57:49.924989 kernel: pci 0000:00:02.6: bridge window [mem 0x00100000-0x001fffff] to [bus 07] add_size 100000 add_align 100000
Sep 5 23:57:49.925069 kernel: pci 0000:00:02.7: bridge window [io 0x1000-0x0fff] to [bus 08] add_size 1000
Sep 5 23:57:49.925137 kernel: pci 0000:00:02.7: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 08] add_size 200000 add_align 100000
Sep 5 23:57:49.925214 kernel: pci 0000:00:02.7: bridge window [mem 0x00100000-0x000fffff] to [bus 08] add_size 200000 add_align 100000
Sep 5 23:57:49.925283 kernel: pci 0000:00:03.0: bridge window [io 0x1000-0x0fff] to [bus 09] add_size 1000
Sep 5 23:57:49.925348 kernel: pci 0000:00:03.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 09] add_size 200000 add_align 100000
Sep 5 23:57:49.925415 kernel: pci 0000:00:03.0: bridge window [mem 0x00100000-0x000fffff] to [bus 09] add_size 200000 add_align 100000
Sep 5 23:57:49.925485 kernel: pci 0000:00:02.0: BAR 14: assigned [mem 0x10000000-0x101fffff]
Sep 5 23:57:49.925550 kernel: pci 0000:00:02.0: BAR 15: assigned [mem 0x8000000000-0x80001fffff 64bit pref]
Sep 5 23:57:49.925617 kernel: pci 0000:00:02.1: BAR 14: assigned [mem 0x10200000-0x103fffff]
Sep 5 23:57:49.925680 kernel: pci 0000:00:02.1: BAR 15: assigned [mem 0x8000200000-0x80003fffff 64bit pref]
Sep 5 23:57:49.925747 kernel: pci 0000:00:02.2: BAR 14: assigned [mem 0x10400000-0x105fffff]
Sep 5 23:57:49.925813 kernel: pci 0000:00:02.2: BAR 15: assigned [mem 0x8000400000-0x80005fffff 64bit pref]
Sep 5 23:57:49.925882 kernel: pci 0000:00:02.3: BAR 14: assigned [mem 0x10600000-0x107fffff]
Sep 5 23:57:49.925947 kernel: pci 0000:00:02.3: BAR 15: assigned [mem 0x8000600000-0x80007fffff 64bit pref]
Sep 5 23:57:49.926012 kernel: pci 0000:00:02.4: BAR 14: assigned [mem 0x10800000-0x109fffff]
Sep 5 23:57:49.926126 kernel: pci 0000:00:02.4: BAR 15: assigned [mem 0x8000800000-0x80009fffff 64bit pref]
Sep 5 23:57:49.926958 kernel: pci 0000:00:02.5: BAR 14: assigned [mem 0x10a00000-0x10bfffff]
Sep 5 23:57:49.927039 kernel: pci 0000:00:02.5: BAR 15: assigned [mem 0x8000a00000-0x8000bfffff 64bit pref]
Sep 5 23:57:49.927145 kernel: pci 0000:00:02.6: BAR 14: assigned [mem 0x10c00000-0x10dfffff]
Sep 5 23:57:49.927232 kernel: pci 0000:00:02.6: BAR 15: assigned [mem 0x8000c00000-0x8000dfffff 64bit pref]
Sep 5 23:57:49.927299 kernel: pci 0000:00:02.7: BAR 14: assigned [mem 0x10e00000-0x10ffffff]
Sep 5 23:57:49.927364 kernel: pci 0000:00:02.7: BAR 15: assigned [mem 0x8000e00000-0x8000ffffff 64bit pref]
Sep 5 23:57:49.927431 kernel: pci 0000:00:03.0: BAR 14: assigned [mem 0x11000000-0x111fffff]
Sep 5 23:57:49.927495 kernel: pci 0000:00:03.0: BAR 15: assigned [mem 0x8001000000-0x80011fffff 64bit pref]
Sep 5 23:57:49.927565 kernel: pci 0000:00:01.0: BAR 4: assigned [mem 0x8001200000-0x8001203fff 64bit pref]
Sep 5 23:57:49.927659 kernel: pci 0000:00:01.0: BAR 1: assigned [mem 0x11200000-0x11200fff]
Sep 5 23:57:49.927734 kernel: pci 0000:00:02.0: BAR 0: assigned [mem 0x11201000-0x11201fff]
Sep 5 23:57:49.927800 kernel: pci 0000:00:02.0: BAR 13: assigned [io 0x1000-0x1fff]
Sep 5 23:57:49.927867 kernel: pci 0000:00:02.1: BAR 0: assigned [mem 0x11202000-0x11202fff]
Sep 5 23:57:49.927932 kernel: pci 0000:00:02.1: BAR 13: assigned [io 0x2000-0x2fff]
Sep 5 23:57:49.927999 kernel: pci 0000:00:02.2: BAR 0: assigned [mem 0x11203000-0x11203fff]
Sep 5 23:57:49.928079 kernel: pci 0000:00:02.2: BAR 13: assigned [io 0x3000-0x3fff]
Sep 5 23:57:49.928148 kernel: pci 0000:00:02.3: BAR 0: assigned [mem 0x11204000-0x11204fff]
Sep 5 23:57:49.928237 kernel: pci 0000:00:02.3: BAR 13: assigned [io 0x4000-0x4fff]
Sep 5 23:57:49.928306 kernel: pci 0000:00:02.4: BAR 0: assigned [mem 0x11205000-0x11205fff]
Sep 5 23:57:49.928370 kernel: pci 0000:00:02.4: BAR 13: assigned [io 0x5000-0x5fff]
Sep 5 23:57:49.928435 kernel: pci 0000:00:02.5: BAR 0: assigned [mem 0x11206000-0x11206fff]
Sep 5 23:57:49.928498 kernel: pci 0000:00:02.5: BAR 13: assigned [io 0x6000-0x6fff]
Sep 5 23:57:49.928563 kernel: pci 0000:00:02.6: BAR 0: assigned [mem 0x11207000-0x11207fff]
Sep 5 23:57:49.928626 kernel: pci 0000:00:02.6: BAR 13: assigned [io 0x7000-0x7fff]
Sep 5 23:57:49.928690 kernel: pci 0000:00:02.7: BAR 0: assigned [mem 0x11208000-0x11208fff]
Sep 5 23:57:49.928756 kernel: pci 0000:00:02.7: BAR 13: assigned [io 0x8000-0x8fff]
Sep 5 23:57:49.928821 kernel: pci 0000:00:03.0: BAR 0: assigned [mem 0x11209000-0x11209fff]
Sep 5 23:57:49.928885 kernel: pci 0000:00:03.0: BAR 13: assigned [io 0x9000-0x9fff]
Sep 5 23:57:49.928955 kernel: pci 0000:00:04.0: BAR 0: assigned [io 0xa000-0xa007]
Sep 5 23:57:49.929027 kernel: pci 0000:01:00.0: BAR 6: assigned [mem 0x10000000-0x1007ffff pref]
Sep 5 23:57:49.929151 kernel: pci 0000:01:00.0: BAR 4: assigned [mem 0x8000000000-0x8000003fff 64bit pref]
Sep 5 23:57:49.929255 kernel: pci 0000:01:00.0: BAR 1: assigned [mem 0x10080000-0x10080fff]
Sep 5 23:57:49.929322 kernel: pci 0000:00:02.0: PCI bridge to [bus 01]
Sep 5 23:57:49.929392 kernel: pci 0000:00:02.0: bridge window [io 0x1000-0x1fff]
Sep 5 23:57:49.929457 kernel: pci 0000:00:02.0: bridge window [mem 0x10000000-0x101fffff]
Sep 5 23:57:49.929521 kernel: pci 0000:00:02.0: bridge window [mem 0x8000000000-0x80001fffff 64bit pref]
Sep 5 23:57:49.929594 kernel: pci 0000:02:00.0: BAR 0: assigned [mem 0x10200000-0x10203fff 64bit]
Sep 5 23:57:49.929660 kernel: pci 0000:00:02.1: PCI bridge to [bus 02]
Sep 5 23:57:49.929729 kernel: pci 0000:00:02.1: bridge window [io 0x2000-0x2fff]
Sep 5 23:57:49.929793 kernel: pci 0000:00:02.1: bridge window [mem 0x10200000-0x103fffff]
Sep 5 23:57:49.929857 kernel: pci 0000:00:02.1: bridge window [mem 0x8000200000-0x80003fffff 64bit pref]
Sep 5 23:57:49.929928 kernel: pci 0000:03:00.0: BAR 4: assigned [mem 0x8000400000-0x8000403fff 64bit pref]
Sep 5 23:57:49.929996 kernel: pci 0000:03:00.0: BAR 1: assigned [mem 0x10400000-0x10400fff]
Sep 5 23:57:49.930079 kernel: pci 0000:00:02.2: PCI bridge to [bus 03]
Sep 5 23:57:49.930147 kernel: pci 0000:00:02.2: bridge window [io 0x3000-0x3fff]
Sep 5 23:57:49.932193 kernel: pci 0000:00:02.2: bridge window [mem 0x10400000-0x105fffff]
Sep 5 23:57:49.932268 kernel: pci 0000:00:02.2: bridge window [mem 0x8000400000-0x80005fffff 64bit pref]
Sep 5 23:57:49.932340 kernel: pci 0000:04:00.0: BAR 4: assigned [mem 0x8000600000-0x8000603fff 64bit pref]
Sep 5 23:57:49.932405 kernel: pci 0000:00:02.3: PCI bridge to [bus 04]
Sep 5 23:57:49.932469 kernel: pci 0000:00:02.3: bridge window [io 0x4000-0x4fff]
Sep 5 23:57:49.932532 kernel: pci 0000:00:02.3: bridge window [mem 0x10600000-0x107fffff]
Sep 5 23:57:49.932597 kernel: pci 0000:00:02.3: bridge window [mem 0x8000600000-0x80007fffff 64bit pref]
Sep 5 23:57:49.932670 kernel: pci 0000:05:00.0: BAR 4: assigned [mem 0x8000800000-0x8000803fff 64bit pref]
Sep 5 23:57:49.932743 kernel: pci 0000:00:02.4: PCI bridge to [bus 05]
Sep 5 23:57:49.932808 kernel: pci 0000:00:02.4: bridge window [io 0x5000-0x5fff]
Sep 5 23:57:49.932872 kernel: pci 0000:00:02.4: bridge window [mem 0x10800000-0x109fffff]
Sep 5 23:57:49.932936 kernel: pci 0000:00:02.4: bridge window [mem 0x8000800000-0x80009fffff 64bit pref]
Sep 5 23:57:49.933007 kernel: pci 0000:06:00.0: BAR 4: assigned [mem 0x8000a00000-0x8000a03fff 64bit pref]
Sep 5 23:57:49.933092 kernel: pci 0000:06:00.0: BAR 1: assigned [mem 0x10a00000-0x10a00fff]
Sep 5 23:57:49.934682 kernel: pci 0000:00:02.5: PCI bridge to [bus 06]
Sep 5 23:57:49.934787 kernel: pci 0000:00:02.5: bridge window [io 0x6000-0x6fff]
Sep 5 23:57:49.934861 kernel: pci 0000:00:02.5: bridge window [mem 0x10a00000-0x10bfffff]
Sep 5 23:57:49.934925 kernel: pci 0000:00:02.5: bridge window [mem 0x8000a00000-0x8000bfffff 64bit pref]
Sep 5 23:57:49.934998 kernel: pci 0000:07:00.0: BAR 6: assigned [mem 0x10c00000-0x10c7ffff pref]
Sep 5 23:57:49.935113 kernel: pci 0000:07:00.0: BAR 4: assigned [mem 0x8000c00000-0x8000c03fff 64bit pref]
Sep 5 23:57:49.935957 kernel: pci 0000:07:00.0: BAR 1: assigned [mem 0x10c80000-0x10c80fff]
Sep 5 23:57:49.936077 kernel: pci 0000:00:02.6: PCI bridge to [bus 07]
Sep 5 23:57:49.936207 kernel: pci 0000:00:02.6: bridge window [io 0x7000-0x7fff]
Sep 5 23:57:49.936294 kernel: pci 0000:00:02.6: bridge window [mem 0x10c00000-0x10dfffff]
Sep 5 23:57:49.936365 kernel: pci 0000:00:02.6: bridge window [mem 0x8000c00000-0x8000dfffff 64bit pref]
Sep 5 23:57:49.936448 kernel: pci 0000:00:02.7: PCI bridge to [bus 08]
Sep 5 23:57:49.936529 kernel: pci 0000:00:02.7: bridge window [io 0x8000-0x8fff]
Sep 5 23:57:49.936596 kernel: pci 0000:00:02.7: bridge window [mem 0x10e00000-0x10ffffff]
Sep 5 23:57:49.936659 kernel: pci 0000:00:02.7: bridge window [mem 0x8000e00000-0x8000ffffff 64bit pref]
Sep 5 23:57:49.936725 kernel: pci 0000:00:03.0: PCI bridge to [bus 09]
Sep 5 23:57:49.936787 kernel: pci 0000:00:03.0: bridge window [io 0x9000-0x9fff]
Sep 5 23:57:49.936850 kernel: pci 0000:00:03.0: bridge window [mem 0x11000000-0x111fffff]
Sep 5 23:57:49.936917 kernel: pci 0000:00:03.0: bridge window [mem 0x8001000000-0x80011fffff 64bit pref]
Sep 5 23:57:49.936985 kernel: pci_bus 0000:00: resource 4 [mem 0x10000000-0x3efeffff window]
Sep 5 23:57:49.937052 kernel: pci_bus 0000:00: resource 5 [io 0x0000-0xffff window]
Sep 5 23:57:49.937119 kernel: pci_bus 0000:00: resource 6 [mem 0x8000000000-0xffffffffff window]
Sep 5 23:57:49.939327 kernel: pci_bus 0000:01: resource 0 [io 0x1000-0x1fff]
Sep 5 23:57:49.939406 kernel: pci_bus 0000:01: resource 1 [mem 0x10000000-0x101fffff]
Sep 5 23:57:49.939466 kernel: pci_bus 0000:01: resource 2 [mem 0x8000000000-0x80001fffff 64bit pref]
Sep 5 23:57:49.939542 kernel: pci_bus 0000:02: resource 0 [io 0x2000-0x2fff]
Sep 5 23:57:49.939629 kernel: pci_bus 0000:02: resource 1 [mem 0x10200000-0x103fffff]
Sep 5 23:57:49.939691 kernel: pci_bus 0000:02: resource 2 [mem 0x8000200000-0x80003fffff 64bit pref]
Sep 5 23:57:49.939767 kernel: pci_bus 0000:03: resource 0 [io 0x3000-0x3fff]
Sep 5 23:57:49.939833 kernel: pci_bus 0000:03: resource 1 [mem 0x10400000-0x105fffff]
Sep 5 23:57:49.939892 kernel: pci_bus 0000:03: resource 2 [mem 0x8000400000-0x80005fffff 64bit pref]
Sep 5 23:57:49.939969 kernel: pci_bus 0000:04: resource 0 [io 0x4000-0x4fff]
Sep 5 23:57:49.940030 kernel: pci_bus 0000:04: resource 1 [mem 0x10600000-0x107fffff]
Sep 5 23:57:49.940111 kernel: pci_bus 0000:04: resource 2 [mem 0x8000600000-0x80007fffff 64bit pref]
Sep 5 23:57:49.940210 kernel: pci_bus 0000:05: resource 0 [io 0x5000-0x5fff]
Sep 5 23:57:49.940277 kernel: pci_bus 0000:05: resource 1 [mem 0x10800000-0x109fffff]
Sep 5 23:57:49.940336 kernel: pci_bus 0000:05: resource 2 [mem 0x8000800000-0x80009fffff 64bit pref]
Sep 5 23:57:49.940402 kernel: pci_bus 0000:06: resource 0 [io 0x6000-0x6fff]
Sep 5 23:57:49.940464 kernel: pci_bus 0000:06: resource 1 [mem 0x10a00000-0x10bfffff]
Sep 5 23:57:49.940523 kernel: pci_bus 0000:06: resource 2 [mem 0x8000a00000-0x8000bfffff 64bit pref]
Sep 5 23:57:49.940589 kernel: pci_bus 0000:07: resource 0 [io 0x7000-0x7fff]
Sep 5 23:57:49.940650 kernel: pci_bus 0000:07: resource 1 [mem 0x10c00000-0x10dfffff]
Sep 5 23:57:49.940723 kernel: pci_bus 0000:07: resource 2 [mem 0x8000c00000-0x8000dfffff 64bit pref]
Sep 5 23:57:49.940804 kernel: pci_bus 0000:08: resource 0 [io 0x8000-0x8fff]
Sep 5 23:57:49.940935 kernel: pci_bus 0000:08: resource 1 [mem 0x10e00000-0x10ffffff]
Sep 5 23:57:49.941003 kernel: pci_bus 0000:08: resource 2 [mem 0x8000e00000-0x8000ffffff 64bit pref]
Sep 5 23:57:49.941126 kernel: pci_bus 0000:09: resource 0 [io 0x9000-0x9fff]
Sep 5 23:57:49.942300 kernel: pci_bus 0000:09: resource 1 [mem 0x11000000-0x111fffff]
Sep 5 23:57:49.942378 kernel: pci_bus 0000:09: resource 2 [mem 0x8001000000-0x80011fffff 64bit pref]
Sep 5 23:57:49.942395 kernel: ACPI: PCI: Interrupt link GSI0 configured for IRQ 35
Sep 5 23:57:49.942404 kernel: ACPI: PCI: Interrupt link GSI1 configured for IRQ 36
Sep 5 23:57:49.942412 kernel: ACPI: PCI: Interrupt link GSI2 configured for IRQ 37
Sep 5 23:57:49.942420 kernel: ACPI: PCI: Interrupt link GSI3 configured for IRQ 38
Sep 5 23:57:49.942430 kernel: iommu: Default domain type: Translated
Sep 5 23:57:49.942438 kernel: iommu: DMA domain TLB invalidation policy: strict mode
Sep 5 23:57:49.942446 kernel: efivars: Registered efivars operations
Sep 5 23:57:49.942454 kernel: vgaarb: loaded
Sep 5 23:57:49.942461 kernel: clocksource: Switched to clocksource arch_sys_counter
Sep 5 23:57:49.942471 kernel: VFS: Disk quotas dquot_6.6.0
Sep 5 23:57:49.942479 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Sep 5 23:57:49.942487 kernel: pnp: PnP ACPI init
Sep 5 23:57:49.942564 kernel: system 00:00: [mem 0x4010000000-0x401fffffff window] could not be reserved
Sep 5 23:57:49.942576 kernel: pnp: PnP ACPI: found 1 devices
Sep 5 23:57:49.942587 kernel: NET: Registered PF_INET protocol family
Sep 5 23:57:49.942597 kernel: IP idents hash table entries: 65536 (order: 7, 524288 bytes, linear)
Sep 5 23:57:49.942606 kernel: tcp_listen_portaddr_hash hash table entries: 2048 (order: 3, 32768 bytes, linear)
Sep 5 23:57:49.942618 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Sep 5 23:57:49.942626 kernel: TCP established hash table entries: 32768 (order: 6, 262144 bytes, linear)
Sep 5 23:57:49.942634 kernel: TCP bind hash table entries: 32768 (order: 8, 1048576 bytes, linear)
Sep 5 23:57:49.942642 kernel: TCP: Hash tables configured (established 32768 bind 32768)
Sep 5 23:57:49.942650 kernel: UDP hash table entries: 2048 (order: 4, 65536 bytes, linear)
Sep 5 23:57:49.942658 kernel: UDP-Lite hash table entries: 2048 (order: 4, 65536 bytes, linear)
Sep 5 23:57:49.942666 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Sep 5 23:57:49.942743 kernel: pci 0000:02:00.0: enabling device (0000 -> 0002)
Sep 5 23:57:49.942755 kernel: PCI: CLS 0 bytes, default 64
Sep 5 23:57:49.942765 kernel: kvm [1]: HYP mode not available
Sep 5 23:57:49.942772 kernel: Initialise system trusted keyrings
Sep 5 23:57:49.942780 kernel: workingset: timestamp_bits=39 max_order=20 bucket_order=0
Sep 5 23:57:49.942788 kernel: Key type asymmetric registered
Sep 5 23:57:49.942796 kernel: Asymmetric key parser 'x509' registered
Sep 5 23:57:49.942803 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 250)
Sep 5 23:57:49.942811 kernel: io scheduler mq-deadline registered
Sep 5 23:57:49.942818 kernel: io scheduler kyber registered
Sep 5 23:57:49.942826 kernel: io scheduler bfq registered
Sep 5 23:57:49.942836 kernel: ACPI: \_SB_.PCI0.GSI2: Enabled at IRQ 37
Sep 5 23:57:49.942905 kernel: pcieport 0000:00:02.0: PME: Signaling with IRQ 50
Sep 5 23:57:49.942972 kernel: pcieport 0000:00:02.0: AER: enabled with IRQ 50
Sep 5 23:57:49.943037 kernel: pcieport 0000:00:02.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+
Sep 5 23:57:49.943180 kernel: pcieport 0000:00:02.1: PME: Signaling with IRQ 51
Sep 5 23:57:49.943258 kernel: pcieport 0000:00:02.1: AER: enabled with IRQ 51
Sep 5 23:57:49.943326 kernel: pcieport 0000:00:02.1: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+
Sep 5 23:57:49.943402 kernel: pcieport 0000:00:02.2: PME: Signaling with IRQ 52
Sep 5 23:57:49.943469 kernel: pcieport 0000:00:02.2: AER: enabled with IRQ 52
Sep 5 23:57:49.943535 kernel: pcieport 0000:00:02.2: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+
Sep 5 23:57:49.943604 kernel: pcieport 0000:00:02.3: PME: Signaling with IRQ 53
Sep 5 23:57:49.943669 kernel: pcieport 0000:00:02.3: AER: enabled with IRQ 53
Sep 5 23:57:49.943734
kernel: pcieport 0000:00:02.3: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Sep 5 23:57:49.943804 kernel: pcieport 0000:00:02.4: PME: Signaling with IRQ 54 Sep 5 23:57:49.943869 kernel: pcieport 0000:00:02.4: AER: enabled with IRQ 54 Sep 5 23:57:49.943934 kernel: pcieport 0000:00:02.4: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Sep 5 23:57:49.944001 kernel: pcieport 0000:00:02.5: PME: Signaling with IRQ 55 Sep 5 23:57:49.944090 kernel: pcieport 0000:00:02.5: AER: enabled with IRQ 55 Sep 5 23:57:49.944178 kernel: pcieport 0000:00:02.5: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Sep 5 23:57:49.944267 kernel: pcieport 0000:00:02.6: PME: Signaling with IRQ 56 Sep 5 23:57:49.944335 kernel: pcieport 0000:00:02.6: AER: enabled with IRQ 56 Sep 5 23:57:49.944402 kernel: pcieport 0000:00:02.6: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Sep 5 23:57:49.944470 kernel: pcieport 0000:00:02.7: PME: Signaling with IRQ 57 Sep 5 23:57:49.944536 kernel: pcieport 0000:00:02.7: AER: enabled with IRQ 57 Sep 5 23:57:49.944606 kernel: pcieport 0000:00:02.7: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Sep 5 23:57:49.944616 kernel: ACPI: \_SB_.PCI0.GSI3: Enabled at IRQ 38 Sep 5 23:57:49.944682 kernel: pcieport 0000:00:03.0: PME: Signaling with IRQ 58 Sep 5 23:57:49.944747 kernel: pcieport 0000:00:03.0: AER: enabled with IRQ 58 Sep 5 23:57:49.944812 kernel: pcieport 0000:00:03.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Sep 5 23:57:49.944823 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXSYBUS:00/PNP0C0C:00/input/input0 Sep 5 23:57:49.944831 kernel: ACPI: 
button: Power Button [PWRB] Sep 5 23:57:49.944839 kernel: ACPI: \_SB_.PCI0.GSI1: Enabled at IRQ 36 Sep 5 23:57:49.944913 kernel: virtio-pci 0000:04:00.0: enabling device (0000 -> 0002) Sep 5 23:57:49.944986 kernel: virtio-pci 0000:07:00.0: enabling device (0000 -> 0002) Sep 5 23:57:49.944997 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Sep 5 23:57:49.945005 kernel: ACPI: \_SB_.PCI0.GSI0: Enabled at IRQ 35 Sep 5 23:57:49.945120 kernel: serial 0000:00:04.0: enabling device (0000 -> 0001) Sep 5 23:57:49.945134 kernel: 0000:00:04.0: ttyS0 at I/O 0xa000 (irq = 45, base_baud = 115200) is a 16550A Sep 5 23:57:49.945142 kernel: thunder_xcv, ver 1.0 Sep 5 23:57:49.945150 kernel: thunder_bgx, ver 1.0 Sep 5 23:57:49.945229 kernel: nicpf, ver 1.0 Sep 5 23:57:49.945237 kernel: nicvf, ver 1.0 Sep 5 23:57:49.945335 kernel: rtc-efi rtc-efi.0: registered as rtc0 Sep 5 23:57:49.945397 kernel: rtc-efi rtc-efi.0: setting system clock to 2025-09-05T23:57:49 UTC (1757116669) Sep 5 23:57:49.945408 kernel: hid: raw HID events driver (C) Jiri Kosina Sep 5 23:57:49.945416 kernel: hw perfevents: enabled with armv8_pmuv3_0 PMU driver, 7 counters available Sep 5 23:57:49.945424 kernel: watchdog: Delayed init of the lockup detector failed: -19 Sep 5 23:57:49.945432 kernel: watchdog: Hard watchdog permanently disabled Sep 5 23:57:49.945442 kernel: NET: Registered PF_INET6 protocol family Sep 5 23:57:49.945450 kernel: Segment Routing with IPv6 Sep 5 23:57:49.945457 kernel: In-situ OAM (IOAM) with IPv6 Sep 5 23:57:49.945465 kernel: NET: Registered PF_PACKET protocol family Sep 5 23:57:49.945473 kernel: Key type dns_resolver registered Sep 5 23:57:49.945480 kernel: registered taskstats version 1 Sep 5 23:57:49.945488 kernel: Loading compiled-in X.509 certificates Sep 5 23:57:49.945496 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.6.103-flatcar: 5b16e1dfa86dac534548885fd675b87757ff9e20' Sep 5 23:57:49.945503 kernel: Key type .fscrypt registered Sep 5 
23:57:49.945511 kernel: Key type fscrypt-provisioning registered Sep 5 23:57:49.945521 kernel: ima: No TPM chip found, activating TPM-bypass! Sep 5 23:57:49.945528 kernel: ima: Allocated hash algorithm: sha1 Sep 5 23:57:49.945536 kernel: ima: No architecture policies found Sep 5 23:57:49.945544 kernel: alg: No test for fips(ansi_cprng) (fips_ansi_cprng) Sep 5 23:57:49.945551 kernel: clk: Disabling unused clocks Sep 5 23:57:49.945559 kernel: Freeing unused kernel memory: 39424K Sep 5 23:57:49.945567 kernel: Run /init as init process Sep 5 23:57:49.945575 kernel: with arguments: Sep 5 23:57:49.945584 kernel: /init Sep 5 23:57:49.945591 kernel: with environment: Sep 5 23:57:49.945599 kernel: HOME=/ Sep 5 23:57:49.945606 kernel: TERM=linux Sep 5 23:57:49.945614 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a Sep 5 23:57:49.945623 systemd[1]: systemd 255 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT default-hierarchy=unified) Sep 5 23:57:49.945633 systemd[1]: Detected virtualization kvm. Sep 5 23:57:49.945642 systemd[1]: Detected architecture arm64. Sep 5 23:57:49.945651 systemd[1]: Running in initrd. Sep 5 23:57:49.945659 systemd[1]: No hostname configured, using default hostname. Sep 5 23:57:49.945667 systemd[1]: Hostname set to . Sep 5 23:57:49.945675 systemd[1]: Initializing machine ID from VM UUID. Sep 5 23:57:49.945684 systemd[1]: Queued start job for default target initrd.target. Sep 5 23:57:49.945692 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Sep 5 23:57:49.945700 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. 
Sep 5 23:57:49.945710 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM...
Sep 5 23:57:49.945720 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Sep 5 23:57:49.945729 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT...
Sep 5 23:57:49.945737 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A...
Sep 5 23:57:49.945747 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132...
Sep 5 23:57:49.945755 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr...
Sep 5 23:57:49.945764 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Sep 5 23:57:49.945772 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Sep 5 23:57:49.945781 systemd[1]: Reached target paths.target - Path Units.
Sep 5 23:57:49.945790 systemd[1]: Reached target slices.target - Slice Units.
Sep 5 23:57:49.945798 systemd[1]: Reached target swap.target - Swaps.
Sep 5 23:57:49.945806 systemd[1]: Reached target timers.target - Timer Units.
Sep 5 23:57:49.945814 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket.
Sep 5 23:57:49.945822 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Sep 5 23:57:49.945832 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log).
Sep 5 23:57:49.945840 systemd[1]: Listening on systemd-journald.socket - Journal Socket.
Sep 5 23:57:49.945850 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Sep 5 23:57:49.945859 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Sep 5 23:57:49.945867 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Sep 5 23:57:49.945875 systemd[1]: Reached target sockets.target - Socket Units.
Sep 5 23:57:49.945884 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup...
Sep 5 23:57:49.945892 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Sep 5 23:57:49.945901 systemd[1]: Finished network-cleanup.service - Network Cleanup.
Sep 5 23:57:49.945909 systemd[1]: Starting systemd-fsck-usr.service...
Sep 5 23:57:49.945917 systemd[1]: Starting systemd-journald.service - Journal Service...
Sep 5 23:57:49.945927 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Sep 5 23:57:49.945935 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Sep 5 23:57:49.945943 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup.
Sep 5 23:57:49.945952 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Sep 5 23:57:49.945960 systemd[1]: Finished systemd-fsck-usr.service.
Sep 5 23:57:49.945989 systemd-journald[236]: Collecting audit messages is disabled.
Sep 5 23:57:49.946012 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
Sep 5 23:57:49.946021 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Sep 5 23:57:49.946031 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Sep 5 23:57:49.946039 kernel: Bridge firewalling registered
Sep 5 23:57:49.946062 systemd-journald[236]: Journal started
Sep 5 23:57:49.946082 systemd-journald[236]: Runtime Journal (/run/log/journal/a482337add7d4c7ebb9ddff3338b76fa) is 8.0M, max 76.6M, 68.6M free.
Sep 5 23:57:49.948430 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Sep 5 23:57:49.920151 systemd-modules-load[237]: Inserted module 'overlay'
Sep 5 23:57:49.951715 systemd[1]: Started systemd-journald.service - Journal Service.
Sep 5 23:57:49.941579 systemd-modules-load[237]: Inserted module 'br_netfilter'
Sep 5 23:57:49.952180 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Sep 5 23:57:49.953108 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Sep 5 23:57:49.957357 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Sep 5 23:57:49.959482 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Sep 5 23:57:49.965801 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Sep 5 23:57:49.975590 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Sep 5 23:57:49.978193 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Sep 5 23:57:49.983119 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook...
Sep 5 23:57:49.991641 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Sep 5 23:57:49.993500 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Sep 5 23:57:50.002379 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Sep 5 23:57:50.004789 dracut-cmdline[269]: dracut-dracut-053
Sep 5 23:57:50.007250 dracut-cmdline[269]: Using kernel command line parameters: rd.driver.pre=btrfs BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyAMA0,115200n8 flatcar.first_boot=detected acpi=force flatcar.oem.id=hetzner verity.usrhash=ac831c89fe9ee7829b7371dadfb138f8d0e2b31ae3a5a920e0eba13bbab016c3
Sep 5 23:57:50.037985 systemd-resolved[275]: Positive Trust Anchors:
Sep 5 23:57:50.038004 systemd-resolved[275]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Sep 5 23:57:50.038037 systemd-resolved[275]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Sep 5 23:57:50.043726 systemd-resolved[275]: Defaulting to hostname 'linux'.
Sep 5 23:57:50.045205 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Sep 5 23:57:50.046839 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Sep 5 23:57:50.103180 kernel: SCSI subsystem initialized
Sep 5 23:57:50.107176 kernel: Loading iSCSI transport class v2.0-870.
Sep 5 23:57:50.115180 kernel: iscsi: registered transport (tcp)
Sep 5 23:57:50.129181 kernel: iscsi: registered transport (qla4xxx)
Sep 5 23:57:50.129244 kernel: QLogic iSCSI HBA Driver
Sep 5 23:57:50.169240 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook.
Sep 5 23:57:50.173368 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook...
Sep 5 23:57:50.195217 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Sep 5 23:57:50.195349 kernel: device-mapper: uevent: version 1.0.3
Sep 5 23:57:50.195378 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@redhat.com
Sep 5 23:57:50.244211 kernel: raid6: neonx8 gen() 15637 MB/s
Sep 5 23:57:50.261203 kernel: raid6: neonx4 gen() 15612 MB/s
Sep 5 23:57:50.278193 kernel: raid6: neonx2 gen() 13138 MB/s
Sep 5 23:57:50.295211 kernel: raid6: neonx1 gen() 10409 MB/s
Sep 5 23:57:50.312193 kernel: raid6: int64x8 gen() 6924 MB/s
Sep 5 23:57:50.329205 kernel: raid6: int64x4 gen() 7296 MB/s
Sep 5 23:57:50.346238 kernel: raid6: int64x2 gen() 6106 MB/s
Sep 5 23:57:50.363240 kernel: raid6: int64x1 gen() 5036 MB/s
Sep 5 23:57:50.363325 kernel: raid6: using algorithm neonx8 gen() 15637 MB/s
Sep 5 23:57:50.380211 kernel: raid6: .... xor() 11971 MB/s, rmw enabled
Sep 5 23:57:50.380284 kernel: raid6: using neon recovery algorithm
Sep 5 23:57:50.385432 kernel: xor: measuring software checksum speed
Sep 5 23:57:50.385492 kernel: 8regs : 19826 MB/sec
Sep 5 23:57:50.385513 kernel: 32regs : 19017 MB/sec
Sep 5 23:57:50.386185 kernel: arm64_neon : 26919 MB/sec
Sep 5 23:57:50.386216 kernel: xor: using function: arm64_neon (26919 MB/sec)
Sep 5 23:57:50.436207 kernel: Btrfs loaded, zoned=no, fsverity=no
Sep 5 23:57:50.450037 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook.
Sep 5 23:57:50.457422 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Sep 5 23:57:50.480121 systemd-udevd[455]: Using default interface naming scheme 'v255'.
Sep 5 23:57:50.483570 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Sep 5 23:57:50.490472 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook...
Sep 5 23:57:50.505874 dracut-pre-trigger[462]: rd.md=0: removing MD RAID activation
Sep 5 23:57:50.543396 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook.
Sep 5 23:57:50.548358 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Sep 5 23:57:50.597803 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Sep 5 23:57:50.608315 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook...
Sep 5 23:57:50.622905 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook.
Sep 5 23:57:50.624141 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems.
Sep 5 23:57:50.626852 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Sep 5 23:57:50.627577 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Sep 5 23:57:50.636463 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook...
Sep 5 23:57:50.653627 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook.
Sep 5 23:57:50.719864 kernel: scsi host0: Virtio SCSI HBA
Sep 5 23:57:50.722239 kernel: scsi 0:0:0:0: CD-ROM QEMU QEMU CD-ROM 2.5+ PQ: 0 ANSI: 5
Sep 5 23:57:50.722277 kernel: scsi 0:0:0:1: Direct-Access QEMU QEMU HARDDISK 2.5+ PQ: 0 ANSI: 5
Sep 5 23:57:50.731116 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
Sep 5 23:57:50.731266 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Sep 5 23:57:50.733396 systemd[1]: Stopping dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Sep 5 23:57:50.734266 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Sep 5 23:57:50.734413 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Sep 5 23:57:50.735925 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup...
Sep 5 23:57:50.743492 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Sep 5 23:57:50.764205 kernel: ACPI: bus type USB registered
Sep 5 23:57:50.765701 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Sep 5 23:57:50.769537 kernel: sr 0:0:0:0: Power-on or device reset occurred
Sep 5 23:57:50.771174 kernel: usbcore: registered new interface driver usbfs
Sep 5 23:57:50.771213 kernel: usbcore: registered new interface driver hub
Sep 5 23:57:50.771224 kernel: usbcore: registered new device driver usb
Sep 5 23:57:50.774195 kernel: sr 0:0:0:0: [sr0] scsi3-mmc drive: 16x/50x cd/rw xa/form2 cdda tray
Sep 5 23:57:50.774424 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
Sep 5 23:57:50.775623 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Sep 5 23:57:50.780199 kernel: sr 0:0:0:0: Attached scsi CD-ROM sr0
Sep 5 23:57:50.794482 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller
Sep 5 23:57:50.794699 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 1
Sep 5 23:57:50.795609 kernel: xhci_hcd 0000:02:00.0: hcc params 0x00087001 hci version 0x100 quirks 0x0000000000000010
Sep 5 23:57:50.797593 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller
Sep 5 23:57:50.797764 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 2
Sep 5 23:57:50.799194 kernel: xhci_hcd 0000:02:00.0: Host supports USB 3.0 SuperSpeed
Sep 5 23:57:50.801232 kernel: hub 1-0:1.0: USB hub found
Sep 5 23:57:50.801496 kernel: hub 1-0:1.0: 4 ports detected
Sep 5 23:57:50.801588 kernel: sd 0:0:0:1: Power-on or device reset occurred
Sep 5 23:57:50.801688 kernel: sd 0:0:0:1: [sda] 80003072 512-byte logical blocks: (41.0 GB/38.1 GiB)
Sep 5 23:57:50.803010 kernel: sd 0:0:0:1: [sda] Write Protect is off
Sep 5 23:57:50.803193 kernel: usb usb2: We don't know the algorithms for LPM for this host, disabling LPM.
Sep 5 23:57:50.803231 kernel: sd 0:0:0:1: [sda] Mode Sense: 63 00 00 08
Sep 5 23:57:50.804471 kernel: sd 0:0:0:1: [sda] Write cache: enabled, read cache: enabled, doesn't support DPO or FUA
Sep 5 23:57:50.804646 kernel: hub 2-0:1.0: USB hub found
Sep 5 23:57:50.806195 kernel: hub 2-0:1.0: 4 ports detected
Sep 5 23:57:50.810331 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk.
Sep 5 23:57:50.810386 kernel: GPT:17805311 != 80003071
Sep 5 23:57:50.810396 kernel: GPT:Alternate GPT header not at the end of the disk.
Sep 5 23:57:50.812338 kernel: GPT:17805311 != 80003071
Sep 5 23:57:50.812388 kernel: GPT: Use GNU Parted to correct GPT errors.
Sep 5 23:57:50.812432 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
Sep 5 23:57:50.813254 kernel: sd 0:0:0:1: [sda] Attached SCSI disk
Sep 5 23:57:50.813209 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Sep 5 23:57:50.867178 kernel: BTRFS: device label OEM devid 1 transid 9 /dev/sda6 scanned by (udev-worker) (500)
Sep 5 23:57:50.871203 kernel: BTRFS: device fsid 045c118e-b098-46f0-884a-43665575c70e devid 1 transid 37 /dev/sda3 scanned by (udev-worker) (520)
Sep 5 23:57:50.872311 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - QEMU_HARDDISK ROOT.
Sep 5 23:57:50.879476 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - QEMU_HARDDISK EFI-SYSTEM.
Sep 5 23:57:50.892756 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - QEMU_HARDDISK OEM.
Sep 5 23:57:50.897050 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - QEMU_HARDDISK USR-A.
Sep 5 23:57:50.897890 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - QEMU_HARDDISK USR-A.
Sep 5 23:57:50.903414 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary...
Sep 5 23:57:50.912972 disk-uuid[572]: Primary Header is updated.
Sep 5 23:57:50.912972 disk-uuid[572]: Secondary Entries is updated.
Sep 5 23:57:50.912972 disk-uuid[572]: Secondary Header is updated.
Sep 5 23:57:51.044232 kernel: usb 1-1: new high-speed USB device number 2 using xhci_hcd
Sep 5 23:57:51.182197 kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:02.1/0000:02:00.0/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input1
Sep 5 23:57:51.183569 kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:02:00.0-1/input0
Sep 5 23:57:51.183893 kernel: usbcore: registered new interface driver usbhid
Sep 5 23:57:51.183914 kernel: usbhid: USB HID core driver
Sep 5 23:57:51.286202 kernel: usb 1-2: new high-speed USB device number 3 using xhci_hcd
Sep 5 23:57:51.416243 kernel: input: QEMU QEMU USB Keyboard as /devices/pci0000:00/0000:00:02.1/0000:02:00.0/usb1/1-2/1-2:1.0/0003:0627:0001.0002/input/input2
Sep 5 23:57:51.469246 kernel: hid-generic 0003:0627:0001.0002: input,hidraw1: USB HID v1.11 Keyboard [QEMU QEMU USB Keyboard] on usb-0000:02:00.0-2/input0
Sep 5 23:57:51.927196 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
Sep 5 23:57:51.927755 disk-uuid[573]: The operation has completed successfully.
Sep 5 23:57:51.979708 systemd[1]: disk-uuid.service: Deactivated successfully.
Sep 5 23:57:51.980613 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary.
Sep 5 23:57:51.994386 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr...
Sep 5 23:57:52.014139 sh[581]: Success
Sep 5 23:57:52.032197 kernel: device-mapper: verity: sha256 using implementation "sha256-ce"
Sep 5 23:57:52.083858 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr.
Sep 5 23:57:52.100590 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr...
Sep 5 23:57:52.106076 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr.
Sep 5 23:57:52.129858 kernel: BTRFS info (device dm-0): first mount of filesystem 045c118e-b098-46f0-884a-43665575c70e
Sep 5 23:57:52.129925 kernel: BTRFS info (device dm-0): using crc32c (crc32c-generic) checksum algorithm
Sep 5 23:57:52.129941 kernel: BTRFS warning (device dm-0): 'nologreplay' is deprecated, use 'rescue=nologreplay' instead
Sep 5 23:57:52.131689 kernel: BTRFS info (device dm-0): disabling log replay at mount time
Sep 5 23:57:52.131744 kernel: BTRFS info (device dm-0): using free space tree
Sep 5 23:57:52.139191 kernel: BTRFS info (device dm-0): enabling ssd optimizations
Sep 5 23:57:52.141274 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr.
Sep 5 23:57:52.143935 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met.
Sep 5 23:57:52.152413 systemd[1]: Starting ignition-setup.service - Ignition (setup)...
Sep 5 23:57:52.157322 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline...
Sep 5 23:57:52.170281 kernel: BTRFS info (device sda6): first mount of filesystem 7395d4d5-ecb1-4acb-b5a4-3e846eddb858
Sep 5 23:57:52.170384 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm
Sep 5 23:57:52.170430 kernel: BTRFS info (device sda6): using free space tree
Sep 5 23:57:52.177427 kernel: BTRFS info (device sda6): enabling ssd optimizations
Sep 5 23:57:52.177506 kernel: BTRFS info (device sda6): auto enabling async discard
Sep 5 23:57:52.190185 kernel: BTRFS info (device sda6): last unmount of filesystem 7395d4d5-ecb1-4acb-b5a4-3e846eddb858
Sep 5 23:57:52.190403 systemd[1]: mnt-oem.mount: Deactivated successfully.
Sep 5 23:57:52.196978 systemd[1]: Finished ignition-setup.service - Ignition (setup).
Sep 5 23:57:52.202441 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)...
Sep 5 23:57:52.278508 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Sep 5 23:57:52.287430 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Sep 5 23:57:52.310247 ignition[673]: Ignition 2.19.0
Sep 5 23:57:52.310830 ignition[673]: Stage: fetch-offline
Sep 5 23:57:52.310330 systemd-networkd[767]: lo: Link UP
Sep 5 23:57:52.310878 ignition[673]: no configs at "/usr/lib/ignition/base.d"
Sep 5 23:57:52.310340 systemd-networkd[767]: lo: Gained carrier
Sep 5 23:57:52.310886 ignition[673]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Sep 5 23:57:52.312003 systemd-networkd[767]: Enumeration completed
Sep 5 23:57:52.311302 ignition[673]: parsed url from cmdline: ""
Sep 5 23:57:52.313319 systemd[1]: Started systemd-networkd.service - Network Configuration.
Sep 5 23:57:52.311306 ignition[673]: no config URL provided
Sep 5 23:57:52.315110 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline).
Sep 5 23:57:52.311317 ignition[673]: reading system config file "/usr/lib/ignition/user.ign"
Sep 5 23:57:52.316134 systemd-networkd[767]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Sep 5 23:57:52.311328 ignition[673]: no config at "/usr/lib/ignition/user.ign"
Sep 5 23:57:52.316137 systemd-networkd[767]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
Sep 5 23:57:52.311334 ignition[673]: failed to fetch config: resource requires networking
Sep 5 23:57:52.317533 systemd-networkd[767]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Sep 5 23:57:52.311533 ignition[673]: Ignition finished successfully
Sep 5 23:57:52.317536 systemd-networkd[767]: eth1: Configuring with /usr/lib/systemd/network/zz-default.network.
Sep 5 23:57:52.318798 systemd[1]: Reached target network.target - Network.
Sep 5 23:57:52.320174 systemd-networkd[767]: eth0: Link UP Sep 5 23:57:52.320178 systemd-networkd[767]: eth0: Gained carrier Sep 5 23:57:52.320187 systemd-networkd[767]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Sep 5 23:57:52.324521 systemd-networkd[767]: eth1: Link UP Sep 5 23:57:52.324525 systemd-networkd[767]: eth1: Gained carrier Sep 5 23:57:52.324536 systemd-networkd[767]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Sep 5 23:57:52.326432 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)... Sep 5 23:57:52.342357 ignition[770]: Ignition 2.19.0 Sep 5 23:57:52.342996 ignition[770]: Stage: fetch Sep 5 23:57:52.343237 ignition[770]: no configs at "/usr/lib/ignition/base.d" Sep 5 23:57:52.343248 ignition[770]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Sep 5 23:57:52.343344 ignition[770]: parsed url from cmdline: "" Sep 5 23:57:52.343348 ignition[770]: no config URL provided Sep 5 23:57:52.343353 ignition[770]: reading system config file "/usr/lib/ignition/user.ign" Sep 5 23:57:52.343362 ignition[770]: no config at "/usr/lib/ignition/user.ign" Sep 5 23:57:52.343382 ignition[770]: GET http://169.254.169.254/hetzner/v1/userdata: attempt #1 Sep 5 23:57:52.347277 ignition[770]: GET error: Get "http://169.254.169.254/hetzner/v1/userdata": dial tcp 169.254.169.254:80: connect: network is unreachable Sep 5 23:57:52.358258 systemd-networkd[767]: eth1: DHCPv4 address 10.0.0.3/32 acquired from 10.0.0.1 Sep 5 23:57:52.387264 systemd-networkd[767]: eth0: DHCPv4 address 128.140.56.156/32, gateway 172.31.1.1 acquired from 172.31.1.1 Sep 5 23:57:52.548233 ignition[770]: GET http://169.254.169.254/hetzner/v1/userdata: attempt #2 Sep 5 23:57:52.553989 ignition[770]: GET result: OK Sep 5 23:57:52.554119 ignition[770]: parsing config with SHA512: 
37c9ed143eb7be823c6d24ef6a5237896bb3ff8f909fcaef21d3f8340337030afcdbc299328c291f2c64e13db97f0dde896eeb46247a70588b687f4cae5dbeb1
Sep 5 23:57:52.558993 unknown[770]: fetched base config from "system"
Sep 5 23:57:52.559002 unknown[770]: fetched base config from "system"
Sep 5 23:57:52.559007 unknown[770]: fetched user config from "hetzner"
Sep 5 23:57:52.559441 ignition[770]: fetch: fetch complete
Sep 5 23:57:52.559446 ignition[770]: fetch: fetch passed
Sep 5 23:57:52.559490 ignition[770]: Ignition finished successfully
Sep 5 23:57:52.564434 systemd[1]: Finished ignition-fetch.service - Ignition (fetch).
Sep 5 23:57:52.572439 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)...
Sep 5 23:57:52.588473 ignition[777]: Ignition 2.19.0
Sep 5 23:57:52.588489 ignition[777]: Stage: kargs
Sep 5 23:57:52.588675 ignition[777]: no configs at "/usr/lib/ignition/base.d"
Sep 5 23:57:52.588686 ignition[777]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Sep 5 23:57:52.589754 ignition[777]: kargs: kargs passed
Sep 5 23:57:52.589807 ignition[777]: Ignition finished successfully
Sep 5 23:57:52.593493 systemd[1]: Finished ignition-kargs.service - Ignition (kargs).
Sep 5 23:57:52.602447 systemd[1]: Starting ignition-disks.service - Ignition (disks)...
Sep 5 23:57:52.617533 ignition[783]: Ignition 2.19.0
Sep 5 23:57:52.617544 ignition[783]: Stage: disks
Sep 5 23:57:52.617728 ignition[783]: no configs at "/usr/lib/ignition/base.d"
Sep 5 23:57:52.617737 ignition[783]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Sep 5 23:57:52.618733 ignition[783]: disks: disks passed
Sep 5 23:57:52.618781 ignition[783]: Ignition finished successfully
Sep 5 23:57:52.620677 systemd[1]: Finished ignition-disks.service - Ignition (disks).
Sep 5 23:57:52.621767 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device.
Sep 5 23:57:52.623062 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems.
Sep 5 23:57:52.624788 systemd[1]: Reached target local-fs.target - Local File Systems.
Sep 5 23:57:52.625753 systemd[1]: Reached target sysinit.target - System Initialization.
Sep 5 23:57:52.626730 systemd[1]: Reached target basic.target - Basic System.
Sep 5 23:57:52.634540 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT...
Sep 5 23:57:52.652263 systemd-fsck[791]: ROOT: clean, 14/1628000 files, 120691/1617920 blocks
Sep 5 23:57:52.657284 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT.
Sep 5 23:57:52.666353 systemd[1]: Mounting sysroot.mount - /sysroot...
Sep 5 23:57:52.718177 kernel: EXT4-fs (sda9): mounted filesystem 72e55cb0-8368-4871-a3a0-8637412e72e8 r/w with ordered data mode. Quota mode: none.
Sep 5 23:57:52.718723 systemd[1]: Mounted sysroot.mount - /sysroot.
Sep 5 23:57:52.719814 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System.
Sep 5 23:57:52.731398 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Sep 5 23:57:52.736172 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr...
Sep 5 23:57:52.739340 systemd[1]: Starting flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent...
Sep 5 23:57:52.741932 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot).
Sep 5 23:57:52.743388 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup.
Sep 5 23:57:52.748050 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr.
Sep 5 23:57:52.750241 kernel: BTRFS: device label OEM devid 1 transid 10 /dev/sda6 scanned by mount (799)
Sep 5 23:57:52.750574 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup...
Sep 5 23:57:52.755481 kernel: BTRFS info (device sda6): first mount of filesystem 7395d4d5-ecb1-4acb-b5a4-3e846eddb858
Sep 5 23:57:52.755534 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm
Sep 5 23:57:52.756504 kernel: BTRFS info (device sda6): using free space tree
Sep 5 23:57:52.767526 kernel: BTRFS info (device sda6): enabling ssd optimizations
Sep 5 23:57:52.767596 kernel: BTRFS info (device sda6): auto enabling async discard
Sep 5 23:57:52.774478 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Sep 5 23:57:52.806962 initrd-setup-root[827]: cut: /sysroot/etc/passwd: No such file or directory
Sep 5 23:57:52.814908 initrd-setup-root[834]: cut: /sysroot/etc/group: No such file or directory
Sep 5 23:57:52.817451 coreos-metadata[801]: Sep 05 23:57:52.817 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/hostname: Attempt #1
Sep 5 23:57:52.820065 coreos-metadata[801]: Sep 05 23:57:52.819 INFO Fetch successful
Sep 5 23:57:52.822202 coreos-metadata[801]: Sep 05 23:57:52.821 INFO wrote hostname ci-4081-3-5-n-8aba32846f to /sysroot/etc/hostname
Sep 5 23:57:52.825254 systemd[1]: Finished flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent.
Sep 5 23:57:52.827303 initrd-setup-root[841]: cut: /sysroot/etc/shadow: No such file or directory
Sep 5 23:57:52.832273 initrd-setup-root[849]: cut: /sysroot/etc/gshadow: No such file or directory
Sep 5 23:57:52.944938 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup.
Sep 5 23:57:52.951347 systemd[1]: Starting ignition-mount.service - Ignition (mount)...
Sep 5 23:57:52.958410 systemd[1]: Starting sysroot-boot.service - /sysroot/boot...
Sep 5 23:57:52.969214 kernel: BTRFS info (device sda6): last unmount of filesystem 7395d4d5-ecb1-4acb-b5a4-3e846eddb858
Sep 5 23:57:52.997368 systemd[1]: Finished sysroot-boot.service - /sysroot/boot.
Sep 5 23:57:53.003609 ignition[916]: INFO : Ignition 2.19.0
Sep 5 23:57:53.003609 ignition[916]: INFO : Stage: mount
Sep 5 23:57:53.005216 ignition[916]: INFO : no configs at "/usr/lib/ignition/base.d"
Sep 5 23:57:53.005216 ignition[916]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Sep 5 23:57:53.007604 systemd[1]: Finished ignition-mount.service - Ignition (mount).
Sep 5 23:57:53.008141 ignition[916]: INFO : mount: mount passed
Sep 5 23:57:53.008141 ignition[916]: INFO : Ignition finished successfully
Sep 5 23:57:53.012408 systemd[1]: Starting ignition-files.service - Ignition (files)...
Sep 5 23:57:53.129327 systemd[1]: sysroot-oem.mount: Deactivated successfully.
Sep 5 23:57:53.138537 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Sep 5 23:57:53.148441 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/sda6 scanned by mount (928)
Sep 5 23:57:53.150221 kernel: BTRFS info (device sda6): first mount of filesystem 7395d4d5-ecb1-4acb-b5a4-3e846eddb858
Sep 5 23:57:53.150329 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm
Sep 5 23:57:53.150354 kernel: BTRFS info (device sda6): using free space tree
Sep 5 23:57:53.153210 kernel: BTRFS info (device sda6): enabling ssd optimizations
Sep 5 23:57:53.153259 kernel: BTRFS info (device sda6): auto enabling async discard
Sep 5 23:57:53.157664 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Sep 5 23:57:53.199478 ignition[946]: INFO : Ignition 2.19.0 Sep 5 23:57:53.199478 ignition[946]: INFO : Stage: files Sep 5 23:57:53.200539 ignition[946]: INFO : no configs at "/usr/lib/ignition/base.d" Sep 5 23:57:53.200539 ignition[946]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Sep 5 23:57:53.202001 ignition[946]: DEBUG : files: compiled without relabeling support, skipping Sep 5 23:57:53.202001 ignition[946]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core" Sep 5 23:57:53.202001 ignition[946]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core" Sep 5 23:57:53.205093 ignition[946]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core" Sep 5 23:57:53.206027 ignition[946]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core" Sep 5 23:57:53.207001 unknown[946]: wrote ssh authorized keys file for user: core Sep 5 23:57:53.208112 ignition[946]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core" Sep 5 23:57:53.208977 ignition[946]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/etc/flatcar-cgroupv1" Sep 5 23:57:53.210075 ignition[946]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/etc/flatcar-cgroupv1" Sep 5 23:57:53.210075 ignition[946]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/opt/helm-v3.13.2-linux-arm64.tar.gz" Sep 5 23:57:53.210075 ignition[946]: INFO : files: createFilesystemsFiles: createFiles: op(4): GET https://get.helm.sh/helm-v3.13.2-linux-arm64.tar.gz: attempt #1 Sep 5 23:57:53.318178 ignition[946]: INFO : files: createFilesystemsFiles: createFiles: op(4): GET result: OK Sep 5 23:57:53.493678 ignition[946]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/opt/helm-v3.13.2-linux-arm64.tar.gz" Sep 5 23:57:53.496043 
ignition[946]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/install.sh" Sep 5 23:57:53.496043 ignition[946]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/install.sh" Sep 5 23:57:53.496043 ignition[946]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nginx.yaml" Sep 5 23:57:53.496043 ignition[946]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nginx.yaml" Sep 5 23:57:53.496043 ignition[946]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pod.yaml" Sep 5 23:57:53.496043 ignition[946]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pod.yaml" Sep 5 23:57:53.496043 ignition[946]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/home/core/nfs-pvc.yaml" Sep 5 23:57:53.496043 ignition[946]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml" Sep 5 23:57:53.496043 ignition[946]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing file "/sysroot/etc/flatcar/update.conf" Sep 5 23:57:53.496043 ignition[946]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing file "/sysroot/etc/flatcar/update.conf" Sep 5 23:57:53.496043 ignition[946]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.31.8-arm64.raw" Sep 5 23:57:53.496043 ignition[946]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.31.8-arm64.raw" Sep 5 23:57:53.496043 ignition[946]: INFO : files: 
createFilesystemsFiles: createFiles: op(b): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.31.8-arm64.raw" Sep 5 23:57:53.496043 ignition[946]: INFO : files: createFilesystemsFiles: createFiles: op(b): GET https://extensions.flatcar.org/extensions/kubernetes-v1.31.8-arm64.raw: attempt #1 Sep 5 23:57:53.576746 systemd-networkd[767]: eth0: Gained IPv6LL Sep 5 23:57:53.577261 systemd-networkd[767]: eth1: Gained IPv6LL Sep 5 23:57:53.606364 ignition[946]: INFO : files: createFilesystemsFiles: createFiles: op(b): GET result: OK Sep 5 23:57:53.820134 ignition[946]: INFO : files: createFilesystemsFiles: createFiles: op(b): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.31.8-arm64.raw" Sep 5 23:57:53.820134 ignition[946]: INFO : files: op(c): [started] processing unit "containerd.service" Sep 5 23:57:53.822530 ignition[946]: INFO : files: op(c): op(d): [started] writing systemd drop-in "10-use-cgroupfs.conf" at "/sysroot/etc/systemd/system/containerd.service.d/10-use-cgroupfs.conf" Sep 5 23:57:53.822530 ignition[946]: INFO : files: op(c): op(d): [finished] writing systemd drop-in "10-use-cgroupfs.conf" at "/sysroot/etc/systemd/system/containerd.service.d/10-use-cgroupfs.conf" Sep 5 23:57:53.822530 ignition[946]: INFO : files: op(c): [finished] processing unit "containerd.service" Sep 5 23:57:53.822530 ignition[946]: INFO : files: op(e): [started] processing unit "prepare-helm.service" Sep 5 23:57:53.827884 ignition[946]: INFO : files: op(e): op(f): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Sep 5 23:57:53.827884 ignition[946]: INFO : files: op(e): op(f): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Sep 5 23:57:53.827884 ignition[946]: INFO : files: op(e): [finished] processing unit "prepare-helm.service" Sep 5 23:57:53.827884 ignition[946]: INFO : files: op(10): [started] processing unit 
"coreos-metadata.service" Sep 5 23:57:53.827884 ignition[946]: INFO : files: op(10): op(11): [started] writing systemd drop-in "00-custom-metadata.conf" at "/sysroot/etc/systemd/system/coreos-metadata.service.d/00-custom-metadata.conf" Sep 5 23:57:53.827884 ignition[946]: INFO : files: op(10): op(11): [finished] writing systemd drop-in "00-custom-metadata.conf" at "/sysroot/etc/systemd/system/coreos-metadata.service.d/00-custom-metadata.conf" Sep 5 23:57:53.827884 ignition[946]: INFO : files: op(10): [finished] processing unit "coreos-metadata.service" Sep 5 23:57:53.827884 ignition[946]: INFO : files: op(12): [started] setting preset to enabled for "prepare-helm.service" Sep 5 23:57:53.827884 ignition[946]: INFO : files: op(12): [finished] setting preset to enabled for "prepare-helm.service" Sep 5 23:57:53.827884 ignition[946]: INFO : files: createResultFile: createFiles: op(13): [started] writing file "/sysroot/etc/.ignition-result.json" Sep 5 23:57:53.827884 ignition[946]: INFO : files: createResultFile: createFiles: op(13): [finished] writing file "/sysroot/etc/.ignition-result.json" Sep 5 23:57:53.827884 ignition[946]: INFO : files: files passed Sep 5 23:57:53.827884 ignition[946]: INFO : Ignition finished successfully Sep 5 23:57:53.827537 systemd[1]: Finished ignition-files.service - Ignition (files). Sep 5 23:57:53.833453 systemd[1]: Starting ignition-quench.service - Ignition (record completion)... Sep 5 23:57:53.837692 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion... Sep 5 23:57:53.852136 systemd[1]: ignition-quench.service: Deactivated successfully. Sep 5 23:57:53.852363 systemd[1]: Finished ignition-quench.service - Ignition (record completion). 
Sep 5 23:57:53.863189 initrd-setup-root-after-ignition[973]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Sep 5 23:57:53.863189 initrd-setup-root-after-ignition[973]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory Sep 5 23:57:53.865900 initrd-setup-root-after-ignition[977]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Sep 5 23:57:53.870210 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion. Sep 5 23:57:53.871900 systemd[1]: Reached target ignition-complete.target - Ignition Complete. Sep 5 23:57:53.881463 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root... Sep 5 23:57:53.924573 systemd[1]: initrd-parse-etc.service: Deactivated successfully. Sep 5 23:57:53.924774 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root. Sep 5 23:57:53.928217 systemd[1]: Reached target initrd-fs.target - Initrd File Systems. Sep 5 23:57:53.929372 systemd[1]: Reached target initrd.target - Initrd Default Target. Sep 5 23:57:53.930501 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met. Sep 5 23:57:53.935422 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook... Sep 5 23:57:53.951765 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Sep 5 23:57:53.958596 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons... Sep 5 23:57:53.973336 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups. Sep 5 23:57:53.974748 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes. Sep 5 23:57:53.975515 systemd[1]: Stopped target timers.target - Timer Units. Sep 5 23:57:53.976583 systemd[1]: dracut-pre-pivot.service: Deactivated successfully. 
Sep 5 23:57:53.976714 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Sep 5 23:57:53.978833 systemd[1]: Stopped target initrd.target - Initrd Default Target. Sep 5 23:57:53.979497 systemd[1]: Stopped target basic.target - Basic System. Sep 5 23:57:53.980555 systemd[1]: Stopped target ignition-complete.target - Ignition Complete. Sep 5 23:57:53.981589 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup. Sep 5 23:57:53.982905 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device. Sep 5 23:57:53.984233 systemd[1]: Stopped target remote-fs.target - Remote File Systems. Sep 5 23:57:53.985340 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems. Sep 5 23:57:53.986516 systemd[1]: Stopped target sysinit.target - System Initialization. Sep 5 23:57:53.987665 systemd[1]: Stopped target local-fs.target - Local File Systems. Sep 5 23:57:53.988756 systemd[1]: Stopped target swap.target - Swaps. Sep 5 23:57:53.989617 systemd[1]: dracut-pre-mount.service: Deactivated successfully. Sep 5 23:57:53.989741 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook. Sep 5 23:57:53.991066 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes. Sep 5 23:57:53.991815 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Sep 5 23:57:53.992852 systemd[1]: clevis-luks-askpass.path: Deactivated successfully. Sep 5 23:57:53.992928 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Sep 5 23:57:53.994029 systemd[1]: dracut-initqueue.service: Deactivated successfully. Sep 5 23:57:53.994183 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook. Sep 5 23:57:53.995667 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully. Sep 5 23:57:53.995795 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion. 
Sep 5 23:57:53.997064 systemd[1]: ignition-files.service: Deactivated successfully. Sep 5 23:57:53.997180 systemd[1]: Stopped ignition-files.service - Ignition (files). Sep 5 23:57:53.998047 systemd[1]: flatcar-metadata-hostname.service: Deactivated successfully. Sep 5 23:57:53.998142 systemd[1]: Stopped flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent. Sep 5 23:57:54.008551 systemd[1]: Stopping ignition-mount.service - Ignition (mount)... Sep 5 23:57:54.014503 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot... Sep 5 23:57:54.019322 systemd[1]: systemd-udev-trigger.service: Deactivated successfully. Sep 5 23:57:54.019495 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices. Sep 5 23:57:54.020469 systemd[1]: dracut-pre-trigger.service: Deactivated successfully. Sep 5 23:57:54.020561 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook. Sep 5 23:57:54.027808 systemd[1]: initrd-cleanup.service: Deactivated successfully. Sep 5 23:57:54.028370 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons. Sep 5 23:57:54.036984 ignition[998]: INFO : Ignition 2.19.0 Sep 5 23:57:54.036984 ignition[998]: INFO : Stage: umount Sep 5 23:57:54.040779 ignition[998]: INFO : no configs at "/usr/lib/ignition/base.d" Sep 5 23:57:54.040779 ignition[998]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Sep 5 23:57:54.040779 ignition[998]: INFO : umount: umount passed Sep 5 23:57:54.040779 ignition[998]: INFO : Ignition finished successfully Sep 5 23:57:54.041721 systemd[1]: ignition-mount.service: Deactivated successfully. Sep 5 23:57:54.041881 systemd[1]: Stopped ignition-mount.service - Ignition (mount). Sep 5 23:57:54.046525 systemd[1]: sysroot-boot.mount: Deactivated successfully. Sep 5 23:57:54.047851 systemd[1]: sysroot-boot.service: Deactivated successfully. Sep 5 23:57:54.048231 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot. 
Sep 5 23:57:54.049274 systemd[1]: ignition-disks.service: Deactivated successfully. Sep 5 23:57:54.049371 systemd[1]: Stopped ignition-disks.service - Ignition (disks). Sep 5 23:57:54.050424 systemd[1]: ignition-kargs.service: Deactivated successfully. Sep 5 23:57:54.050480 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs). Sep 5 23:57:54.051118 systemd[1]: ignition-fetch.service: Deactivated successfully. Sep 5 23:57:54.051241 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch). Sep 5 23:57:54.052071 systemd[1]: Stopped target network.target - Network. Sep 5 23:57:54.052838 systemd[1]: ignition-fetch-offline.service: Deactivated successfully. Sep 5 23:57:54.052885 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline). Sep 5 23:57:54.053852 systemd[1]: Stopped target paths.target - Path Units. Sep 5 23:57:54.054653 systemd[1]: systemd-ask-password-console.path: Deactivated successfully. Sep 5 23:57:54.058222 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Sep 5 23:57:54.059836 systemd[1]: Stopped target slices.target - Slice Units. Sep 5 23:57:54.061333 systemd[1]: Stopped target sockets.target - Socket Units. Sep 5 23:57:54.062520 systemd[1]: iscsid.socket: Deactivated successfully. Sep 5 23:57:54.062584 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket. Sep 5 23:57:54.063569 systemd[1]: iscsiuio.socket: Deactivated successfully. Sep 5 23:57:54.063605 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket. Sep 5 23:57:54.064463 systemd[1]: ignition-setup.service: Deactivated successfully. Sep 5 23:57:54.064517 systemd[1]: Stopped ignition-setup.service - Ignition (setup). Sep 5 23:57:54.065396 systemd[1]: ignition-setup-pre.service: Deactivated successfully. Sep 5 23:57:54.065438 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup. Sep 5 23:57:54.066314 systemd[1]: initrd-setup-root.service: Deactivated successfully. 
Sep 5 23:57:54.066356 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup. Sep 5 23:57:54.067596 systemd[1]: Stopping systemd-networkd.service - Network Configuration... Sep 5 23:57:54.068679 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution... Sep 5 23:57:54.071459 systemd-networkd[767]: eth0: DHCPv6 lease lost Sep 5 23:57:54.074995 systemd[1]: systemd-resolved.service: Deactivated successfully. Sep 5 23:57:54.075151 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution. Sep 5 23:57:54.076781 systemd-networkd[767]: eth1: DHCPv6 lease lost Sep 5 23:57:54.079678 systemd[1]: systemd-networkd.service: Deactivated successfully. Sep 5 23:57:54.079807 systemd[1]: Stopped systemd-networkd.service - Network Configuration. Sep 5 23:57:54.081719 systemd[1]: systemd-networkd.socket: Deactivated successfully. Sep 5 23:57:54.081795 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket. Sep 5 23:57:54.094524 systemd[1]: Stopping network-cleanup.service - Network Cleanup... Sep 5 23:57:54.095808 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully. Sep 5 23:57:54.095928 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Sep 5 23:57:54.098777 systemd[1]: systemd-sysctl.service: Deactivated successfully. Sep 5 23:57:54.098850 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables. Sep 5 23:57:54.100395 systemd[1]: systemd-modules-load.service: Deactivated successfully. Sep 5 23:57:54.100492 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules. Sep 5 23:57:54.102758 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully. Sep 5 23:57:54.102845 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories. Sep 5 23:57:54.103932 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files... 
Sep 5 23:57:54.120993 systemd[1]: network-cleanup.service: Deactivated successfully. Sep 5 23:57:54.122330 systemd[1]: Stopped network-cleanup.service - Network Cleanup. Sep 5 23:57:54.124035 systemd[1]: systemd-udevd.service: Deactivated successfully. Sep 5 23:57:54.124231 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files. Sep 5 23:57:54.125715 systemd[1]: systemd-udevd-control.socket: Deactivated successfully. Sep 5 23:57:54.125764 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket. Sep 5 23:57:54.126928 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully. Sep 5 23:57:54.126971 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket. Sep 5 23:57:54.128033 systemd[1]: dracut-pre-udev.service: Deactivated successfully. Sep 5 23:57:54.128080 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook. Sep 5 23:57:54.129553 systemd[1]: dracut-cmdline.service: Deactivated successfully. Sep 5 23:57:54.129601 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook. Sep 5 23:57:54.130961 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Sep 5 23:57:54.131007 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Sep 5 23:57:54.142033 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database... Sep 5 23:57:54.143662 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully. Sep 5 23:57:54.143776 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Sep 5 23:57:54.145496 systemd[1]: systemd-tmpfiles-setup-dev-early.service: Deactivated successfully. Sep 5 23:57:54.145579 systemd[1]: Stopped systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Sep 5 23:57:54.151354 systemd[1]: kmod-static-nodes.service: Deactivated successfully. 
Sep 5 23:57:54.151449 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes. Sep 5 23:57:54.154574 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Sep 5 23:57:54.154635 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Sep 5 23:57:54.156285 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully. Sep 5 23:57:54.156409 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database. Sep 5 23:57:54.158572 systemd[1]: Reached target initrd-switch-root.target - Switch Root. Sep 5 23:57:54.166407 systemd[1]: Starting initrd-switch-root.service - Switch Root... Sep 5 23:57:54.177901 systemd[1]: Switching root. Sep 5 23:57:54.214620 systemd-journald[236]: Journal stopped Sep 5 23:57:55.194373 systemd-journald[236]: Received SIGTERM from PID 1 (systemd). Sep 5 23:57:55.194461 kernel: SELinux: policy capability network_peer_controls=1 Sep 5 23:57:55.194474 kernel: SELinux: policy capability open_perms=1 Sep 5 23:57:55.194484 kernel: SELinux: policy capability extended_socket_class=1 Sep 5 23:57:55.194495 kernel: SELinux: policy capability always_check_network=0 Sep 5 23:57:55.194504 kernel: SELinux: policy capability cgroup_seclabel=1 Sep 5 23:57:55.194515 kernel: SELinux: policy capability nnp_nosuid_transition=1 Sep 5 23:57:55.194535 kernel: SELinux: policy capability genfs_seclabel_symlinks=0 Sep 5 23:57:55.194548 kernel: SELinux: policy capability ioctl_skip_cloexec=0 Sep 5 23:57:55.194559 kernel: audit: type=1403 audit(1757116674.442:2): auid=4294967295 ses=4294967295 lsm=selinux res=1 Sep 5 23:57:55.194570 systemd[1]: Successfully loaded SELinux policy in 36.015ms. Sep 5 23:57:55.194594 systemd[1]: Relabeled /dev, /dev/shm, /run, /sys/fs/cgroup in 10.845ms. 
Sep 5 23:57:55.194607 systemd[1]: systemd 255 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT default-hierarchy=unified) Sep 5 23:57:55.194618 systemd[1]: Detected virtualization kvm. Sep 5 23:57:55.194629 systemd[1]: Detected architecture arm64. Sep 5 23:57:55.194640 systemd[1]: Detected first boot. Sep 5 23:57:55.194653 systemd[1]: Hostname set to . Sep 5 23:57:55.194664 systemd[1]: Initializing machine ID from VM UUID. Sep 5 23:57:55.194674 zram_generator::config[1060]: No configuration found. Sep 5 23:57:55.194686 systemd[1]: Populated /etc with preset unit settings. Sep 5 23:57:55.194696 systemd[1]: Queued start job for default target multi-user.target. Sep 5 23:57:55.194707 systemd[1]: Unnecessary job was removed for dev-sda6.device - /dev/sda6. Sep 5 23:57:55.194719 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config. Sep 5 23:57:55.194730 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run. Sep 5 23:57:55.194742 systemd[1]: Created slice system-getty.slice - Slice /system/getty. Sep 5 23:57:55.194753 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe. Sep 5 23:57:55.194768 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty. Sep 5 23:57:55.194780 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit. Sep 5 23:57:55.194790 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck. Sep 5 23:57:55.194802 systemd[1]: Created slice user.slice - User and Session Slice. Sep 5 23:57:55.194813 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. 
Sep 5 23:57:55.194824 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Sep 5 23:57:55.194835 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch. Sep 5 23:57:55.194848 systemd[1]: Set up automount boot.automount - Boot partition Automount Point. Sep 5 23:57:55.194859 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point. Sep 5 23:57:55.194870 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Sep 5 23:57:55.194881 systemd[1]: Expecting device dev-ttyAMA0.device - /dev/ttyAMA0... Sep 5 23:57:55.194892 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Sep 5 23:57:55.194904 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes. Sep 5 23:57:55.194915 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Sep 5 23:57:55.194932 systemd[1]: Reached target remote-fs.target - Remote File Systems. Sep 5 23:57:55.194943 systemd[1]: Reached target slices.target - Slice Units. Sep 5 23:57:55.194954 systemd[1]: Reached target swap.target - Swaps. Sep 5 23:57:55.194965 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes. Sep 5 23:57:55.194976 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket. Sep 5 23:57:55.194987 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). Sep 5 23:57:55.194997 systemd[1]: Listening on systemd-journald.socket - Journal Socket. Sep 5 23:57:55.195020 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Sep 5 23:57:55.195036 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Sep 5 23:57:55.195047 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. 
Sep 5 23:57:55.195060 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket.
Sep 5 23:57:55.195071 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System...
Sep 5 23:57:55.195081 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System...
Sep 5 23:57:55.195092 systemd[1]: Mounting media.mount - External Media Directory...
Sep 5 23:57:55.195103 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System...
Sep 5 23:57:55.195114 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System...
Sep 5 23:57:55.195124 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp...
Sep 5 23:57:55.195137 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files...
Sep 5 23:57:55.195148 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Sep 5 23:57:55.195851 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Sep 5 23:57:55.195879 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs...
Sep 5 23:57:55.195890 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Sep 5 23:57:55.195901 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Sep 5 23:57:55.195914 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Sep 5 23:57:55.195925 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse...
Sep 5 23:57:55.195935 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Sep 5 23:57:55.195946 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf).
Sep 5 23:57:55.195956 systemd[1]: systemd-journald.service: unit configures an IP firewall, but the local system does not support BPF/cgroup firewalling.
Sep 5 23:57:55.195969 systemd[1]: systemd-journald.service: (This warning is only shown for the first unit using IP firewalling.)
Sep 5 23:57:55.195979 systemd[1]: Starting systemd-journald.service - Journal Service...
Sep 5 23:57:55.195989 kernel: fuse: init (API version 7.39)
Sep 5 23:57:55.196049 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Sep 5 23:57:55.196063 kernel: loop: module loaded
Sep 5 23:57:55.196074 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
Sep 5 23:57:55.196084 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems...
Sep 5 23:57:55.196095 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Sep 5 23:57:55.196144 systemd-journald[1144]: Collecting audit messages is disabled.
Sep 5 23:57:55.196188 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System.
Sep 5 23:57:55.196205 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System.
Sep 5 23:57:55.196215 systemd[1]: Mounted media.mount - External Media Directory.
Sep 5 23:57:55.196227 systemd-journald[1144]: Journal started
Sep 5 23:57:55.196250 systemd-journald[1144]: Runtime Journal (/run/log/journal/a482337add7d4c7ebb9ddff3338b76fa) is 8.0M, max 76.6M, 68.6M free.
Sep 5 23:57:55.199383 systemd[1]: Started systemd-journald.service - Journal Service.
Sep 5 23:57:55.201629 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System.
Sep 5 23:57:55.202615 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System.
Sep 5 23:57:55.203298 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp.
Sep 5 23:57:55.204777 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Sep 5 23:57:55.205818 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Sep 5 23:57:55.205986 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs.
Sep 5 23:57:55.209481 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Sep 5 23:57:55.209669 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Sep 5 23:57:55.210601 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Sep 5 23:57:55.210764 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Sep 5 23:57:55.212614 systemd[1]: modprobe@fuse.service: Deactivated successfully.
Sep 5 23:57:55.212890 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse.
Sep 5 23:57:55.214679 systemd[1]: modprobe@loop.service: Deactivated successfully.
Sep 5 23:57:55.214838 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Sep 5 23:57:55.217509 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Sep 5 23:57:55.218551 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Sep 5 23:57:55.224568 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems.
Sep 5 23:57:55.233184 kernel: ACPI: bus type drm_connector registered
Sep 5 23:57:55.235498 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files.
Sep 5 23:57:55.236578 systemd[1]: modprobe@drm.service: Deactivated successfully.
Sep 5 23:57:55.236742 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Sep 5 23:57:55.246992 systemd[1]: Reached target network-pre.target - Preparation for Network.
Sep 5 23:57:55.252347 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System...
Sep 5 23:57:55.259360 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System...
Sep 5 23:57:55.261785 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/).
Sep 5 23:57:55.271354 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database...
Sep 5 23:57:55.282928 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage...
Sep 5 23:57:55.284511 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Sep 5 23:57:55.294455 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed...
Sep 5 23:57:55.295121 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Sep 5 23:57:55.300700 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Sep 5 23:57:55.305804 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
Sep 5 23:57:55.316316 systemd-journald[1144]: Time spent on flushing to /var/log/journal/a482337add7d4c7ebb9ddff3338b76fa is 49.967ms for 1109 entries.
Sep 5 23:57:55.316316 systemd-journald[1144]: System Journal (/var/log/journal/a482337add7d4c7ebb9ddff3338b76fa) is 8.0M, max 584.8M, 576.8M free.
Sep 5 23:57:55.378260 systemd-journald[1144]: Received client request to flush runtime journal.
Sep 5 23:57:55.312761 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System.
Sep 5 23:57:55.317656 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System.
Sep 5 23:57:55.328103 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Sep 5 23:57:55.339376 systemd[1]: Starting systemd-udev-settle.service - Wait for udev To Complete Device Initialization...
Sep 5 23:57:55.352723 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed.
Sep 5 23:57:55.353869 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Sep 5 23:57:55.354991 systemd[1]: Reached target first-boot-complete.target - First Boot Complete.
Sep 5 23:57:55.374536 udevadm[1201]: systemd-udev-settle.service is deprecated. Please fix lvm2-activation-early.service, lvm2-activation.service not to pull it in.
Sep 5 23:57:55.383821 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage.
Sep 5 23:57:55.389783 systemd-tmpfiles[1195]: ACLs are not supported, ignoring.
Sep 5 23:57:55.389800 systemd-tmpfiles[1195]: ACLs are not supported, ignoring.
Sep 5 23:57:55.398073 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Sep 5 23:57:55.403422 systemd[1]: Starting systemd-sysusers.service - Create System Users...
Sep 5 23:57:55.441751 systemd[1]: Finished systemd-sysusers.service - Create System Users.
Sep 5 23:57:55.450465 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Sep 5 23:57:55.472184 systemd-tmpfiles[1217]: ACLs are not supported, ignoring.
Sep 5 23:57:55.472514 systemd-tmpfiles[1217]: ACLs are not supported, ignoring.
Sep 5 23:57:55.477203 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Sep 5 23:57:55.875692 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database.
Sep 5 23:57:55.883404 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Sep 5 23:57:55.921069 systemd-udevd[1223]: Using default interface naming scheme 'v255'.
Sep 5 23:57:55.943215 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Sep 5 23:57:55.980437 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Sep 5 23:57:55.995370 systemd[1]: Starting systemd-userdbd.service - User Database Manager...
Sep 5 23:57:56.019040 systemd[1]: Found device dev-ttyAMA0.device - /dev/ttyAMA0.
Sep 5 23:57:56.069106 systemd[1]: Started systemd-userdbd.service - User Database Manager.
Sep 5 23:57:56.165235 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 37 scanned by (udev-worker) (1244)
Sep 5 23:57:56.181327 systemd-networkd[1232]: lo: Link UP
Sep 5 23:57:56.181336 systemd-networkd[1232]: lo: Gained carrier
Sep 5 23:57:56.182905 systemd-networkd[1232]: Enumeration completed
Sep 5 23:57:56.183067 systemd[1]: Started systemd-networkd.service - Network Configuration.
Sep 5 23:57:56.186408 systemd-networkd[1232]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Sep 5 23:57:56.186419 systemd-networkd[1232]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
Sep 5 23:57:56.187278 systemd-networkd[1232]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Sep 5 23:57:56.187281 systemd-networkd[1232]: eth1: Configuring with /usr/lib/systemd/network/zz-default.network.
Sep 5 23:57:56.187798 systemd-networkd[1232]: eth0: Link UP
Sep 5 23:57:56.187809 systemd-networkd[1232]: eth0: Gained carrier
Sep 5 23:57:56.187822 systemd-networkd[1232]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Sep 5 23:57:56.190371 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured...
Sep 5 23:57:56.194503 systemd-networkd[1232]: eth1: Link UP
Sep 5 23:57:56.194516 systemd-networkd[1232]: eth1: Gained carrier
Sep 5 23:57:56.194537 systemd-networkd[1232]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Sep 5 23:57:56.231384 systemd-networkd[1232]: eth1: DHCPv4 address 10.0.0.3/32 acquired from 10.0.0.1
Sep 5 23:57:56.250261 systemd-networkd[1232]: eth0: DHCPv4 address 128.140.56.156/32, gateway 172.31.1.1 acquired from 172.31.1.1
Sep 5 23:57:56.251177 kernel: mousedev: PS/2 mouse device common for all mice
Sep 5 23:57:56.255942 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Sep 5 23:57:56.268386 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Sep 5 23:57:56.273920 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Sep 5 23:57:56.279696 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Sep 5 23:57:56.281579 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/).
Sep 5 23:57:56.281639 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf).
Sep 5 23:57:56.282135 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Sep 5 23:57:56.282725 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Sep 5 23:57:56.300572 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - QEMU_HARDDISK OEM.
Sep 5 23:57:56.303456 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Sep 5 23:57:56.303646 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Sep 5 23:57:56.306663 systemd[1]: modprobe@loop.service: Deactivated successfully.
Sep 5 23:57:56.307358 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Sep 5 23:57:56.316230 kernel: [drm] pci: virtio-gpu-pci detected at 0000:00:01.0
Sep 5 23:57:56.316305 kernel: [drm] features: -virgl +edid -resource_blob -host_visible
Sep 5 23:57:56.316319 kernel: [drm] features: -context_init
Sep 5 23:57:56.315544 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Sep 5 23:57:56.315600 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Sep 5 23:57:56.320607 kernel: [drm] number of scanouts: 1
Sep 5 23:57:56.320679 kernel: [drm] number of cap sets: 0
Sep 5 23:57:56.326178 kernel: [drm] Initialized virtio_gpu 0.1.0 0 for 0000:00:01.0 on minor 0
Sep 5 23:57:56.335341 kernel: Console: switching to colour frame buffer device 160x50
Sep 5 23:57:56.341953 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Sep 5 23:57:56.343461 kernel: virtio-pci 0000:00:01.0: [drm] fb0: virtio_gpudrmfb frame buffer device
Sep 5 23:57:56.355474 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Sep 5 23:57:56.355818 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Sep 5 23:57:56.368525 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Sep 5 23:57:56.438510 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Sep 5 23:57:56.486320 systemd[1]: Finished systemd-udev-settle.service - Wait for udev To Complete Device Initialization.
Sep 5 23:57:56.495402 systemd[1]: Starting lvm2-activation-early.service - Activation of LVM2 logical volumes...
Sep 5 23:57:56.509208 lvm[1293]: WARNING: Failed to connect to lvmetad. Falling back to device scanning.
Sep 5 23:57:56.538867 systemd[1]: Finished lvm2-activation-early.service - Activation of LVM2 logical volumes.
Sep 5 23:57:56.541396 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Sep 5 23:57:56.551475 systemd[1]: Starting lvm2-activation.service - Activation of LVM2 logical volumes...
Sep 5 23:57:56.557249 lvm[1296]: WARNING: Failed to connect to lvmetad. Falling back to device scanning.
Sep 5 23:57:56.583649 systemd[1]: Finished lvm2-activation.service - Activation of LVM2 logical volumes.
Sep 5 23:57:56.584823 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems.
Sep 5 23:57:56.585721 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Sep 5 23:57:56.585821 systemd[1]: Reached target local-fs.target - Local File Systems.
Sep 5 23:57:56.586585 systemd[1]: Reached target machines.target - Containers.
Sep 5 23:57:56.588559 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management (Varlink).
Sep 5 23:57:56.594402 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown...
Sep 5 23:57:56.604435 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache...
Sep 5 23:57:56.609780 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Sep 5 23:57:56.612441 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM...
Sep 5 23:57:56.617491 systemd[1]: Starting systemd-machine-id-commit.service - Commit a transient machine-id on disk...
Sep 5 23:57:56.620983 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/...
Sep 5 23:57:56.630707 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown.
Sep 5 23:57:56.654868 systemd[1]: etc-machine\x2did.mount: Deactivated successfully.
Sep 5 23:57:56.659704 kernel: loop0: detected capacity change from 0 to 203944
Sep 5 23:57:56.658087 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM.
Sep 5 23:57:56.660859 systemd[1]: Finished systemd-machine-id-commit.service - Commit a transient machine-id on disk.
Sep 5 23:57:56.678205 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher
Sep 5 23:57:56.695273 kernel: loop1: detected capacity change from 0 to 8
Sep 5 23:57:56.716275 kernel: loop2: detected capacity change from 0 to 114432
Sep 5 23:57:56.758217 kernel: loop3: detected capacity change from 0 to 114328
Sep 5 23:57:56.795220 kernel: loop4: detected capacity change from 0 to 203944
Sep 5 23:57:56.818200 kernel: loop5: detected capacity change from 0 to 8
Sep 5 23:57:56.821194 kernel: loop6: detected capacity change from 0 to 114432
Sep 5 23:57:56.845282 kernel: loop7: detected capacity change from 0 to 114328
Sep 5 23:57:56.859063 (sd-merge)[1318]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-hetzner'.
Sep 5 23:57:56.859585 (sd-merge)[1318]: Merged extensions into '/usr'.
Sep 5 23:57:56.864287 systemd[1]: Reloading requested from client PID 1304 ('systemd-sysext') (unit systemd-sysext.service)...
Sep 5 23:57:56.864306 systemd[1]: Reloading...
Sep 5 23:57:56.960199 zram_generator::config[1346]: No configuration found.
Sep 5 23:57:57.051419 ldconfig[1300]: /sbin/ldconfig: /lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start.
Sep 5 23:57:57.066702 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Sep 5 23:57:57.127468 systemd[1]: Reloading finished in 262 ms.
Sep 5 23:57:57.143777 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache.
Sep 5 23:57:57.144873 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/.
Sep 5 23:57:57.154461 systemd[1]: Starting ensure-sysext.service...
Sep 5 23:57:57.157352 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Sep 5 23:57:57.163130 systemd[1]: Reloading requested from client PID 1390 ('systemctl') (unit ensure-sysext.service)...
Sep 5 23:57:57.163173 systemd[1]: Reloading...
Sep 5 23:57:57.189334 systemd-tmpfiles[1391]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring.
Sep 5 23:57:57.189645 systemd-tmpfiles[1391]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring.
Sep 5 23:57:57.190480 systemd-tmpfiles[1391]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring.
Sep 5 23:57:57.190757 systemd-tmpfiles[1391]: ACLs are not supported, ignoring.
Sep 5 23:57:57.190810 systemd-tmpfiles[1391]: ACLs are not supported, ignoring.
Sep 5 23:57:57.197328 systemd-tmpfiles[1391]: Detected autofs mount point /boot during canonicalization of boot.
Sep 5 23:57:57.197340 systemd-tmpfiles[1391]: Skipping /boot
Sep 5 23:57:57.209528 systemd-tmpfiles[1391]: Detected autofs mount point /boot during canonicalization of boot.
Sep 5 23:57:57.209677 systemd-tmpfiles[1391]: Skipping /boot
Sep 5 23:57:57.249250 zram_generator::config[1420]: No configuration found.
Sep 5 23:57:57.366301 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Sep 5 23:57:57.427638 systemd[1]: Reloading finished in 264 ms.
Sep 5 23:57:57.451677 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Sep 5 23:57:57.472447 systemd[1]: Starting audit-rules.service - Load Security Auditing Rules...
Sep 5 23:57:57.488409 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs...
Sep 5 23:57:57.497457 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog...
Sep 5 23:57:57.503617 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Sep 5 23:57:57.512428 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP...
Sep 5 23:57:57.525759 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Sep 5 23:57:57.530503 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Sep 5 23:57:57.533691 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Sep 5 23:57:57.545601 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Sep 5 23:57:57.546803 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Sep 5 23:57:57.550895 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Sep 5 23:57:57.552504 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Sep 5 23:57:57.566664 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Sep 5 23:57:57.566878 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Sep 5 23:57:57.573051 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog.
Sep 5 23:57:57.593321 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP.
Sep 5 23:57:57.595893 systemd[1]: modprobe@loop.service: Deactivated successfully.
Sep 5 23:57:57.598442 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Sep 5 23:57:57.605296 augenrules[1497]: No rules
Sep 5 23:57:57.606359 systemd[1]: Finished audit-rules.service - Load Security Auditing Rules.
Sep 5 23:57:57.615519 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Sep 5 23:57:57.624310 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Sep 5 23:57:57.637509 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Sep 5 23:57:57.642410 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Sep 5 23:57:57.650909 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Sep 5 23:57:57.652405 systemd-resolved[1472]: Positive Trust Anchors:
Sep 5 23:57:57.652418 systemd-resolved[1472]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Sep 5 23:57:57.652449 systemd-resolved[1472]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Sep 5 23:57:57.652704 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Sep 5 23:57:57.655469 systemd[1]: Starting systemd-update-done.service - Update is Completed...
Sep 5 23:57:57.661265 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs.
Sep 5 23:57:57.667317 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Sep 5 23:57:57.667528 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Sep 5 23:57:57.669496 systemd[1]: modprobe@drm.service: Deactivated successfully.
Sep 5 23:57:57.669849 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Sep 5 23:57:57.671434 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Sep 5 23:57:57.671625 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Sep 5 23:57:57.672237 systemd-resolved[1472]: Using system hostname 'ci-4081-3-5-n-8aba32846f'.
Sep 5 23:57:57.678816 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Sep 5 23:57:57.679168 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt).
Sep 5 23:57:57.685219 systemd[1]: Finished ensure-sysext.service.
Sep 5 23:57:57.697606 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization...
Sep 5 23:57:57.698403 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Sep 5 23:57:57.699567 systemd[1]: modprobe@loop.service: Deactivated successfully.
Sep 5 23:57:57.700201 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Sep 5 23:57:57.701393 systemd[1]: Finished systemd-update-done.service - Update is Completed.
Sep 5 23:57:57.703554 systemd[1]: Reached target network.target - Network.
Sep 5 23:57:57.706777 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Sep 5 23:57:57.707540 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Sep 5 23:57:57.752448 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization.
Sep 5 23:57:57.754650 systemd[1]: Reached target sysinit.target - System Initialization.
Sep 5 23:57:57.755369 systemd[1]: Started motdgen.path - Watch for update engine configuration changes.
Sep 5 23:57:57.756112 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data.
Sep 5 23:57:57.757062 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories.
Sep 5 23:57:57.757999 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate).
Sep 5 23:57:57.758041 systemd[1]: Reached target paths.target - Path Units.
Sep 5 23:57:57.758637 systemd[1]: Reached target time-set.target - System Time Set.
Sep 5 23:57:57.759431 systemd[1]: Started logrotate.timer - Daily rotation of log files.
Sep 5 23:57:57.760125 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information..
Sep 5 23:57:57.760768 systemd[1]: Reached target timers.target - Timer Units.
Sep 5 23:57:57.762421 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket.
Sep 5 23:57:57.764774 systemd[1]: Starting docker.socket - Docker Socket for the API...
Sep 5 23:57:57.766763 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket.
Sep 5 23:57:57.771029 systemd[1]: Listening on docker.socket - Docker Socket for the API.
Sep 5 23:57:57.772652 systemd[1]: Reached target sockets.target - Socket Units.
Sep 5 23:57:57.774244 systemd[1]: Reached target basic.target - Basic System.
Sep 5 23:57:57.775839 systemd[1]: System is tainted: cgroupsv1
Sep 5 23:57:57.775930 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met.
Sep 5 23:57:57.776001 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met.
Sep 5 23:57:57.778244 systemd[1]: Starting containerd.service - containerd container runtime...
Sep 5 23:57:57.782381 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent...
Sep 5 23:57:58.234995 systemd-timesyncd[1523]: Contacted time server 46.38.241.235:123 (0.flatcar.pool.ntp.org).
Sep 5 23:57:58.235350 systemd-resolved[1472]: Clock change detected. Flushing caches.
Sep 5 23:57:58.235696 systemd-timesyncd[1523]: Initial clock synchronization to Fri 2025-09-05 23:57:58.234889 UTC.
Sep 5 23:57:58.235836 systemd[1]: Starting dbus.service - D-Bus System Message Bus...
Sep 5 23:57:58.241145 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit...
Sep 5 23:57:58.247734 systemd[1]: Starting extend-filesystems.service - Extend Filesystems...
Sep 5 23:57:58.248296 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment).
Sep 5 23:57:58.255274 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd...
Sep 5 23:57:58.259773 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin...
Sep 5 23:57:58.264910 systemd[1]: Started qemu-guest-agent.service - QEMU Guest Agent.
Sep 5 23:57:58.273587 jq[1534]: false
Sep 5 23:57:58.274845 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline...
Sep 5 23:57:58.290053 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys...
Sep 5 23:57:58.310927 coreos-metadata[1531]: Sep 05 23:57:58.310 INFO Fetching http://169.254.169.254/hetzner/v1/metadata: Attempt #1
Sep 5 23:57:58.319890 coreos-metadata[1531]: Sep 05 23:57:58.314 INFO Fetch successful
Sep 5 23:57:58.319890 coreos-metadata[1531]: Sep 05 23:57:58.318 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/private-networks: Attempt #1
Sep 5 23:57:58.317513 dbus-daemon[1532]: [system] SELinux support is enabled
Sep 5 23:57:58.316333 systemd[1]: Starting systemd-logind.service - User Login Management...
Sep 5 23:57:58.318618 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0).
Sep 5 23:57:58.323074 systemd[1]: Starting update-engine.service - Update Engine...
Sep 5 23:57:58.330088 coreos-metadata[1531]: Sep 05 23:57:58.323 INFO Fetch successful
Sep 5 23:57:58.330669 extend-filesystems[1535]: Found loop4
Sep 5 23:57:58.334539 extend-filesystems[1535]: Found loop5
Sep 5 23:57:58.334539 extend-filesystems[1535]: Found loop6
Sep 5 23:57:58.334539 extend-filesystems[1535]: Found loop7
Sep 5 23:57:58.334539 extend-filesystems[1535]: Found sda
Sep 5 23:57:58.334539 extend-filesystems[1535]: Found sda1
Sep 5 23:57:58.334539 extend-filesystems[1535]: Found sda2
Sep 5 23:57:58.334539 extend-filesystems[1535]: Found sda3
Sep 5 23:57:58.334539 extend-filesystems[1535]: Found usr
Sep 5 23:57:58.334539 extend-filesystems[1535]: Found sda4
Sep 5 23:57:58.334539 extend-filesystems[1535]: Found sda6
Sep 5 23:57:58.334539 extend-filesystems[1535]: Found sda7
Sep 5 23:57:58.334539 extend-filesystems[1535]: Found sda9
Sep 5 23:57:58.334539 extend-filesystems[1535]: Checking size of /dev/sda9
Sep 5 23:57:58.341076 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition...
Sep 5 23:57:58.344604 systemd[1]: Started dbus.service - D-Bus System Message Bus.
Sep 5 23:57:58.372253 update_engine[1554]: I20250905 23:57:58.371501 1554 main.cc:92] Flatcar Update Engine starting
Sep 5 23:57:58.377703 systemd-networkd[1232]: eth1: Gained IPv6LL
Sep 5 23:57:58.381028 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'.
Sep 5 23:57:58.381295 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped.
Sep 5 23:57:58.381615 systemd[1]: motdgen.service: Deactivated successfully.
Sep 5 23:57:58.381867 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd.
Sep 5 23:57:58.384647 jq[1555]: true
Sep 5 23:57:58.391721 update_engine[1554]: I20250905 23:57:58.388036 1554 update_check_scheduler.cc:74] Next update check in 8m26s
Sep 5 23:57:58.400929 extend-filesystems[1535]: Resized partition /dev/sda9
Sep 5 23:57:58.407590 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured.
Sep 5 23:57:58.408443 extend-filesystems[1571]: resize2fs 1.47.1 (20-May-2024)
Sep 5 23:57:58.408931 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully.
Sep 5 23:57:58.409158 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline.
Sep 5 23:57:58.420306 kernel: EXT4-fs (sda9): resizing filesystem from 1617920 to 9393147 blocks
Sep 5 23:57:58.445907 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 37 scanned by (udev-worker) (1230)
Sep 5 23:57:58.469098 (ntainerd)[1576]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR
Sep 5 23:57:58.486809 jq[1574]: true
Sep 5 23:57:58.487140 tar[1570]: linux-arm64/helm
Sep 5 23:57:58.504315 systemd[1]: Started update-engine.service - Update Engine.
Sep 5 23:57:58.510924 systemd[1]: Reached target network-online.target - Network is Online.
Sep 5 23:57:58.522777 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 5 23:57:58.535867 systemd[1]: Starting nvidia.service - NVIDIA Configure Service...
Sep 5 23:57:58.537418 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml).
Sep 5 23:57:58.537472 systemd[1]: Reached target system-config.target - Load system-provided cloud configs.
Sep 5 23:57:58.539159 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url).
Sep 5 23:57:58.539176 systemd[1]: Reached target user-config.target - Load user-provided cloud configs.
Sep 5 23:57:58.541605 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details.
Sep 5 23:57:58.557127 systemd[1]: Started locksmithd.service - Cluster reboot manager.
Sep 5 23:57:58.619651 kernel: EXT4-fs (sda9): resized filesystem to 9393147
Sep 5 23:57:58.642821 extend-filesystems[1571]: Filesystem at /dev/sda9 is mounted on /; on-line resizing required
Sep 5 23:57:58.642821 extend-filesystems[1571]: old_desc_blocks = 1, new_desc_blocks = 5
Sep 5 23:57:58.642821 extend-filesystems[1571]: The filesystem on /dev/sda9 is now 9393147 (4k) blocks long.
Sep 5 23:57:58.675596 bash[1623]: Updated "/home/core/.ssh/authorized_keys"
Sep 5 23:57:58.623631 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent.
Sep 5 23:57:58.679279 extend-filesystems[1535]: Resized filesystem in /dev/sda9
Sep 5 23:57:58.679279 extend-filesystems[1535]: Found sr0
Sep 5 23:57:58.629988 systemd[1]: Finished nvidia.service - NVIDIA Configure Service.
Sep 5 23:57:58.634150 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met.
Sep 5 23:57:58.635113 systemd-networkd[1232]: eth0: Gained IPv6LL
Sep 5 23:57:58.644050 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition.
Sep 5 23:57:58.666871 systemd[1]: Starting sshkeys.service...
Sep 5 23:57:58.674740 systemd[1]: extend-filesystems.service: Deactivated successfully.
Sep 5 23:57:58.674998 systemd[1]: Finished extend-filesystems.service - Extend Filesystems.
Sep 5 23:57:58.690244 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys.
Sep 5 23:57:58.701925 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)...
Sep 5 23:57:58.710760 systemd-logind[1551]: New seat seat0.
Sep 5 23:57:58.714807 systemd-logind[1551]: Watching system buttons on /dev/input/event0 (Power Button)
Sep 5 23:57:58.714828 systemd-logind[1551]: Watching system buttons on /dev/input/event2 (QEMU QEMU USB Keyboard)
Sep 5 23:57:58.715162 systemd[1]: Started systemd-logind.service - User Login Management.
Sep 5 23:57:58.760219 coreos-metadata[1638]: Sep 05 23:57:58.759 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/public-keys: Attempt #1
Sep 5 23:57:58.762522 coreos-metadata[1638]: Sep 05 23:57:58.761 INFO Fetch successful
Sep 5 23:57:58.766935 unknown[1638]: wrote ssh authorized keys file for user: core
Sep 5 23:57:58.806666 update-ssh-keys[1643]: Updated "/home/core/.ssh/authorized_keys"
Sep 5 23:57:58.811853 systemd[1]: Finished coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys).
Sep 5 23:57:58.817097 systemd[1]: Finished sshkeys.service.
Sep 5 23:57:58.899937 containerd[1576]: time="2025-09-05T23:57:58.898183214Z" level=info msg="starting containerd" revision=174e0d1785eeda18dc2beba45e1d5a188771636b version=v1.7.21
Sep 5 23:57:58.976633 containerd[1576]: time="2025-09-05T23:57:58.975313534Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.aufs\"..." type=io.containerd.snapshotter.v1
Sep 5 23:57:58.979654 containerd[1576]: time="2025-09-05T23:57:58.978844934Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.aufs\"..." error="aufs is not supported (modprobe aufs failed: exit status 1 \"modprobe: FATAL: Module aufs not found in directory /lib/modules/6.6.103-flatcar\\n\"): skip plugin" type=io.containerd.snapshotter.v1
Sep 5 23:57:58.979654 containerd[1576]: time="2025-09-05T23:57:58.978895534Z" level=info msg="loading plugin \"io.containerd.event.v1.exchange\"..." type=io.containerd.event.v1
Sep 5 23:57:58.979654 containerd[1576]: time="2025-09-05T23:57:58.978913814Z" level=info msg="loading plugin \"io.containerd.internal.v1.opt\"..." type=io.containerd.internal.v1
Sep 5 23:57:58.979654 containerd[1576]: time="2025-09-05T23:57:58.979087534Z" level=info msg="loading plugin \"io.containerd.warning.v1.deprecations\"..." type=io.containerd.warning.v1
Sep 5 23:57:58.979654 containerd[1576]: time="2025-09-05T23:57:58.979105774Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.blockfile\"..." type=io.containerd.snapshotter.v1
Sep 5 23:57:58.979654 containerd[1576]: time="2025-09-05T23:57:58.979165134Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.blockfile\"..." error="no scratch file generator: skip plugin" type=io.containerd.snapshotter.v1
Sep 5 23:57:58.979654 containerd[1576]: time="2025-09-05T23:57:58.979180134Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.btrfs\"..." type=io.containerd.snapshotter.v1
Sep 5 23:57:58.979654 containerd[1576]: time="2025-09-05T23:57:58.979405854Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.btrfs\"..." error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" type=io.containerd.snapshotter.v1
Sep 5 23:57:58.979654 containerd[1576]: time="2025-09-05T23:57:58.979420854Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.devmapper\"..." type=io.containerd.snapshotter.v1
Sep 5 23:57:58.979654 containerd[1576]: time="2025-09-05T23:57:58.979436694Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.devmapper\"..." error="devmapper not configured: skip plugin" type=io.containerd.snapshotter.v1
Sep 5 23:57:58.979654 containerd[1576]: time="2025-09-05T23:57:58.979490974Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.native\"..." type=io.containerd.snapshotter.v1
Sep 5 23:57:58.979931 containerd[1576]: time="2025-09-05T23:57:58.979577094Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.overlayfs\"..." type=io.containerd.snapshotter.v1
Sep 5 23:57:58.980895 containerd[1576]: time="2025-09-05T23:57:58.980840894Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.zfs\"..." type=io.containerd.snapshotter.v1
Sep 5 23:57:58.981071 containerd[1576]: time="2025-09-05T23:57:58.981044294Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.zfs\"..." error="path /var/lib/containerd/io.containerd.snapshotter.v1.zfs must be a zfs filesystem to be used with the zfs snapshotter: skip plugin" type=io.containerd.snapshotter.v1
Sep 5 23:57:58.981071 containerd[1576]: time="2025-09-05T23:57:58.981064694Z" level=info msg="loading plugin \"io.containerd.content.v1.content\"..." type=io.containerd.content.v1
Sep 5 23:57:58.981173 containerd[1576]: time="2025-09-05T23:57:58.981153574Z" level=info msg="loading plugin \"io.containerd.metadata.v1.bolt\"..." type=io.containerd.metadata.v1
Sep 5 23:57:58.981221 containerd[1576]: time="2025-09-05T23:57:58.981205894Z" level=info msg="metadata content store policy set" policy=shared
Sep 5 23:57:58.985916 containerd[1576]: time="2025-09-05T23:57:58.985864334Z" level=info msg="loading plugin \"io.containerd.gc.v1.scheduler\"..." type=io.containerd.gc.v1
Sep 5 23:57:58.986023 containerd[1576]: time="2025-09-05T23:57:58.985935214Z" level=info msg="loading plugin \"io.containerd.differ.v1.walking\"..." type=io.containerd.differ.v1
Sep 5 23:57:58.986023 containerd[1576]: time="2025-09-05T23:57:58.985961734Z" level=info msg="loading plugin \"io.containerd.lease.v1.manager\"..." type=io.containerd.lease.v1
Sep 5 23:57:58.986023 containerd[1576]: time="2025-09-05T23:57:58.985979334Z" level=info msg="loading plugin \"io.containerd.streaming.v1.manager\"..." type=io.containerd.streaming.v1
Sep 5 23:57:58.986023 containerd[1576]: time="2025-09-05T23:57:58.985993854Z" level=info msg="loading plugin \"io.containerd.runtime.v1.linux\"..." type=io.containerd.runtime.v1
Sep 5 23:57:58.987576 containerd[1576]: time="2025-09-05T23:57:58.986168454Z" level=info msg="loading plugin \"io.containerd.monitor.v1.cgroups\"..." type=io.containerd.monitor.v1
Sep 5 23:57:58.987576 containerd[1576]: time="2025-09-05T23:57:58.986523574Z" level=info msg="loading plugin \"io.containerd.runtime.v2.task\"..." type=io.containerd.runtime.v2
Sep 5 23:57:58.987576 containerd[1576]: time="2025-09-05T23:57:58.986657414Z" level=info msg="loading plugin \"io.containerd.runtime.v2.shim\"..." type=io.containerd.runtime.v2
Sep 5 23:57:58.987576 containerd[1576]: time="2025-09-05T23:57:58.986677334Z" level=info msg="loading plugin \"io.containerd.sandbox.store.v1.local\"..." type=io.containerd.sandbox.store.v1
Sep 5 23:57:58.987576 containerd[1576]: time="2025-09-05T23:57:58.986697094Z" level=info msg="loading plugin \"io.containerd.sandbox.controller.v1.local\"..." type=io.containerd.sandbox.controller.v1
Sep 5 23:57:58.987576 containerd[1576]: time="2025-09-05T23:57:58.986711054Z" level=info msg="loading plugin \"io.containerd.service.v1.containers-service\"..." type=io.containerd.service.v1
Sep 5 23:57:58.987576 containerd[1576]: time="2025-09-05T23:57:58.986728534Z" level=info msg="loading plugin \"io.containerd.service.v1.content-service\"..." type=io.containerd.service.v1
Sep 5 23:57:58.987576 containerd[1576]: time="2025-09-05T23:57:58.986741734Z" level=info msg="loading plugin \"io.containerd.service.v1.diff-service\"..." type=io.containerd.service.v1
Sep 5 23:57:58.987576 containerd[1576]: time="2025-09-05T23:57:58.986756054Z" level=info msg="loading plugin \"io.containerd.service.v1.images-service\"..." type=io.containerd.service.v1
Sep 5 23:57:58.987576 containerd[1576]: time="2025-09-05T23:57:58.986771574Z" level=info msg="loading plugin \"io.containerd.service.v1.introspection-service\"..." type=io.containerd.service.v1
Sep 5 23:57:58.987576 containerd[1576]: time="2025-09-05T23:57:58.986784174Z" level=info msg="loading plugin \"io.containerd.service.v1.namespaces-service\"..." type=io.containerd.service.v1
Sep 5 23:57:58.987576 containerd[1576]: time="2025-09-05T23:57:58.986796014Z" level=info msg="loading plugin \"io.containerd.service.v1.snapshots-service\"..." type=io.containerd.service.v1
Sep 5 23:57:58.987576 containerd[1576]: time="2025-09-05T23:57:58.986807814Z" level=info msg="loading plugin \"io.containerd.service.v1.tasks-service\"..." type=io.containerd.service.v1
Sep 5 23:57:58.987576 containerd[1576]: time="2025-09-05T23:57:58.986828614Z" level=info msg="loading plugin \"io.containerd.grpc.v1.containers\"..." type=io.containerd.grpc.v1
Sep 5 23:57:58.987954 containerd[1576]: time="2025-09-05T23:57:58.986847814Z" level=info msg="loading plugin \"io.containerd.grpc.v1.content\"..." type=io.containerd.grpc.v1
Sep 5 23:57:58.987954 containerd[1576]: time="2025-09-05T23:57:58.986859934Z" level=info msg="loading plugin \"io.containerd.grpc.v1.diff\"..." type=io.containerd.grpc.v1
Sep 5 23:57:58.987954 containerd[1576]: time="2025-09-05T23:57:58.986885134Z" level=info msg="loading plugin \"io.containerd.grpc.v1.events\"..." type=io.containerd.grpc.v1
Sep 5 23:57:58.987954 containerd[1576]: time="2025-09-05T23:57:58.986902454Z" level=info msg="loading plugin \"io.containerd.grpc.v1.images\"..." type=io.containerd.grpc.v1
Sep 5 23:57:58.987954 containerd[1576]: time="2025-09-05T23:57:58.986916414Z" level=info msg="loading plugin \"io.containerd.grpc.v1.introspection\"..." type=io.containerd.grpc.v1
Sep 5 23:57:58.987954 containerd[1576]: time="2025-09-05T23:57:58.986928574Z" level=info msg="loading plugin \"io.containerd.grpc.v1.leases\"..." type=io.containerd.grpc.v1
Sep 5 23:57:58.987954 containerd[1576]: time="2025-09-05T23:57:58.986943494Z" level=info msg="loading plugin \"io.containerd.grpc.v1.namespaces\"..." type=io.containerd.grpc.v1
Sep 5 23:57:58.987954 containerd[1576]: time="2025-09-05T23:57:58.986956454Z" level=info msg="loading plugin \"io.containerd.grpc.v1.sandbox-controllers\"..." type=io.containerd.grpc.v1
Sep 5 23:57:58.987954 containerd[1576]: time="2025-09-05T23:57:58.986978214Z" level=info msg="loading plugin \"io.containerd.grpc.v1.sandboxes\"..." type=io.containerd.grpc.v1
Sep 5 23:57:58.987954 containerd[1576]: time="2025-09-05T23:57:58.986990294Z" level=info msg="loading plugin \"io.containerd.grpc.v1.snapshots\"..." type=io.containerd.grpc.v1
Sep 5 23:57:58.987954 containerd[1576]: time="2025-09-05T23:57:58.987001614Z" level=info msg="loading plugin \"io.containerd.grpc.v1.streaming\"..." type=io.containerd.grpc.v1
Sep 5 23:57:58.987954 containerd[1576]: time="2025-09-05T23:57:58.987014734Z" level=info msg="loading plugin \"io.containerd.grpc.v1.tasks\"..." type=io.containerd.grpc.v1
Sep 5 23:57:58.987954 containerd[1576]: time="2025-09-05T23:57:58.987030894Z" level=info msg="loading plugin \"io.containerd.transfer.v1.local\"..." type=io.containerd.transfer.v1
Sep 5 23:57:58.987954 containerd[1576]: time="2025-09-05T23:57:58.987052574Z" level=info msg="loading plugin \"io.containerd.grpc.v1.transfer\"..." type=io.containerd.grpc.v1
Sep 5 23:57:58.987954 containerd[1576]: time="2025-09-05T23:57:58.987065774Z" level=info msg="loading plugin \"io.containerd.grpc.v1.version\"..." type=io.containerd.grpc.v1
Sep 5 23:57:58.988413 containerd[1576]: time="2025-09-05T23:57:58.987078534Z" level=info msg="loading plugin \"io.containerd.internal.v1.restart\"..." type=io.containerd.internal.v1
Sep 5 23:57:58.988413 containerd[1576]: time="2025-09-05T23:57:58.987195014Z" level=info msg="loading plugin \"io.containerd.tracing.processor.v1.otlp\"..." type=io.containerd.tracing.processor.v1
Sep 5 23:57:58.988413 containerd[1576]: time="2025-09-05T23:57:58.987213894Z" level=info msg="skip loading plugin \"io.containerd.tracing.processor.v1.otlp\"..." error="skip plugin: tracing endpoint not configured" type=io.containerd.tracing.processor.v1
Sep 5 23:57:58.988413 containerd[1576]: time="2025-09-05T23:57:58.987225694Z" level=info msg="loading plugin \"io.containerd.internal.v1.tracing\"..." type=io.containerd.internal.v1
Sep 5 23:57:58.988413 containerd[1576]: time="2025-09-05T23:57:58.987238494Z" level=info msg="skip loading plugin \"io.containerd.internal.v1.tracing\"..." error="skip plugin: tracing endpoint not configured" type=io.containerd.internal.v1
Sep 5 23:57:58.988413 containerd[1576]: time="2025-09-05T23:57:58.987247654Z" level=info msg="loading plugin \"io.containerd.grpc.v1.healthcheck\"..." type=io.containerd.grpc.v1
Sep 5 23:57:58.988413 containerd[1576]: time="2025-09-05T23:57:58.987261734Z" level=info msg="loading plugin \"io.containerd.nri.v1.nri\"..." type=io.containerd.nri.v1
Sep 5 23:57:58.988413 containerd[1576]: time="2025-09-05T23:57:58.987272214Z" level=info msg="NRI interface is disabled by configuration."
Sep 5 23:57:58.988413 containerd[1576]: time="2025-09-05T23:57:58.987282734Z" level=info msg="loading plugin \"io.containerd.grpc.v1.cri\"..." type=io.containerd.grpc.v1
Sep 5 23:57:58.991803 containerd[1576]: time="2025-09-05T23:57:58.991535574Z" level=info msg="Start cri plugin with config {PluginConfig:{ContainerdConfig:{Snapshotter:overlayfs DefaultRuntimeName:runc DefaultRuntime:{Type: Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:} UntrustedWorkloadRuntime:{Type: Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:} Runtimes:map[runc:{Type:io.containerd.runc.v2 Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[SystemdCgroup:false] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:podsandbox}] NoPivot:false DisableSnapshotAnnotations:true DiscardUnpackedLayers:false IgnoreBlockIONotEnabledErrors:false IgnoreRdtNotEnabledErrors:false} CniConfig:{NetworkPluginBinDir:/opt/cni/bin NetworkPluginConfDir:/etc/cni/net.d NetworkPluginMaxConfNum:1 NetworkPluginSetupSerially:false NetworkPluginConfTemplate: IPPreference:} Registry:{ConfigPath: Mirrors:map[] Configs:map[] Auths:map[] Headers:map[]} ImageDecryption:{KeyModel:node} DisableTCPService:true StreamServerAddress:127.0.0.1 StreamServerPort:0 StreamIdleTimeout:4h0m0s EnableSelinux:false SelinuxCategoryRange:1024 SandboxImage:registry.k8s.io/pause:3.8 StatsCollectPeriod:10 SystemdCgroup:false EnableTLSStreaming:false X509KeyPairStreaming:{TLSCertFile: TLSKeyFile:} MaxContainerLogLineSize:16384 DisableCgroup:false DisableApparmor:false RestrictOOMScoreAdj:false MaxConcurrentDownloads:3 DisableProcMount:false UnsetSeccompProfile: TolerateMissingHugetlbController:true DisableHugetlbController:true DeviceOwnershipFromSecurityContext:false IgnoreImageDefinedVolumes:false NetNSMountsUnderStateDir:false EnableUnprivilegedPorts:false EnableUnprivilegedICMP:false EnableCDI:false CDISpecDirs:[/etc/cdi /var/run/cdi] ImagePullProgressTimeout:5m0s DrainExecSyncIOTimeout:0s ImagePullWithSyncFs:false IgnoreDeprecationWarnings:[]} ContainerdRootDir:/var/lib/containerd ContainerdEndpoint:/run/containerd/containerd.sock RootDir:/var/lib/containerd/io.containerd.grpc.v1.cri StateDir:/run/containerd/io.containerd.grpc.v1.cri}"
Sep 5 23:57:58.992552 containerd[1576]: time="2025-09-05T23:57:58.991839494Z" level=info msg="Connect containerd service"
Sep 5 23:57:58.992552 containerd[1576]: time="2025-09-05T23:57:58.991897414Z" level=info msg="using legacy CRI server"
Sep 5 23:57:58.992552 containerd[1576]: time="2025-09-05T23:57:58.991907494Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this"
Sep 5 23:57:58.992552 containerd[1576]: time="2025-09-05T23:57:58.992047374Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\""
Sep 5 23:57:58.999646 containerd[1576]: time="2025-09-05T23:57:58.999039974Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config"
Sep 5 23:57:59.001652 containerd[1576]: time="2025-09-05T23:57:59.000727294Z" level=info msg="Start subscribing containerd event"
Sep 5 23:57:59.001652 containerd[1576]: time="2025-09-05T23:57:59.000811294Z" level=info msg="Start recovering state"
Sep 5 23:57:59.001652 containerd[1576]: time="2025-09-05T23:57:59.000909134Z" level=info msg="Start event monitor"
Sep 5 23:57:59.001652 containerd[1576]: time="2025-09-05T23:57:59.000923174Z" level=info msg="Start snapshots syncer"
Sep 5 23:57:59.001652 containerd[1576]: time="2025-09-05T23:57:59.000933814Z" level=info msg="Start cni network conf syncer for default"
Sep 5 23:57:59.001652 containerd[1576]: time="2025-09-05T23:57:59.000941574Z" level=info msg="Start streaming server"
Sep 5 23:57:59.001869 containerd[1576]: time="2025-09-05T23:57:59.001816094Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc
Sep 5 23:57:59.001950 containerd[1576]: time="2025-09-05T23:57:59.001899854Z" level=info msg=serving... address=/run/containerd/containerd.sock
Sep 5 23:57:59.002131 systemd[1]: Started containerd.service - containerd container runtime.
Sep 5 23:57:59.009328 containerd[1576]: time="2025-09-05T23:57:59.001997214Z" level=info msg="containerd successfully booted in 0.109196s"
Sep 5 23:57:59.055753 locksmithd[1604]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot"
Sep 5 23:57:59.312694 tar[1570]: linux-arm64/LICENSE
Sep 5 23:57:59.312694 tar[1570]: linux-arm64/README.md
Sep 5 23:57:59.332135 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin.
Sep 5 23:57:59.657810 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 5 23:57:59.660139 (kubelet)[1672]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Sep 5 23:58:00.043040 sshd_keygen[1572]: ssh-keygen: generating new host keys: RSA ECDSA ED25519
Sep 5 23:58:00.068090 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys.
Sep 5 23:58:00.079134 systemd[1]: Starting issuegen.service - Generate /run/issue...
Sep 5 23:58:00.088877 systemd[1]: issuegen.service: Deactivated successfully.
Sep 5 23:58:00.089142 systemd[1]: Finished issuegen.service - Generate /run/issue.
Sep 5 23:58:00.100548 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions...
Sep 5 23:58:00.117221 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions.
Sep 5 23:58:00.128041 systemd[1]: Started getty@tty1.service - Getty on tty1.
Sep 5 23:58:00.131852 systemd[1]: Started serial-getty@ttyAMA0.service - Serial Getty on ttyAMA0.
Sep 5 23:58:00.133459 systemd[1]: Reached target getty.target - Login Prompts.
Sep 5 23:58:00.134401 systemd[1]: Reached target multi-user.target - Multi-User System.
Sep 5 23:58:00.135276 systemd[1]: Startup finished in 5.533s (kernel) + 5.280s (userspace) = 10.813s.
Sep 5 23:58:00.227370 kubelet[1672]: E0905 23:58:00.227289 1672 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Sep 5 23:58:00.231276 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Sep 5 23:58:00.231745 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Sep 5 23:58:10.353139 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1.
Sep 5 23:58:10.361974 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 5 23:58:10.502001 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 5 23:58:10.503080 (kubelet)[1716]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Sep 5 23:58:10.551936 kubelet[1716]: E0905 23:58:10.551869 1716 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Sep 5 23:58:10.559114 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Sep 5 23:58:10.559415 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Sep 5 23:58:20.603159 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2.
Sep 5 23:58:20.610200 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 5 23:58:20.764996 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 5 23:58:20.766593 (kubelet)[1737]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Sep 5 23:58:20.815192 kubelet[1737]: E0905 23:58:20.815102 1737 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Sep 5 23:58:20.820849 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Sep 5 23:58:20.821023 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Sep 5 23:58:30.853132 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3.
Sep 5 23:58:30.863972 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 5 23:58:30.995874 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 5 23:58:31.000322 (kubelet)[1757]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Sep 5 23:58:31.052880 kubelet[1757]: E0905 23:58:31.052786 1757 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Sep 5 23:58:31.056613 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Sep 5 23:58:31.057529 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Sep 5 23:58:39.665916 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd.
Sep 5 23:58:39.673197 systemd[1]: Started sshd@0-128.140.56.156:22-139.178.68.195:33578.service - OpenSSH per-connection server daemon (139.178.68.195:33578).
Sep 5 23:58:40.669140 sshd[1765]: Accepted publickey for core from 139.178.68.195 port 33578 ssh2: RSA SHA256:+hHHVborSkWo7/0A1ohHVzFaxSLc/9IisClzOe0fYVI
Sep 5 23:58:40.671714 sshd[1765]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 5 23:58:40.686725 systemd-logind[1551]: New session 1 of user core.
Sep 5 23:58:40.688507 systemd[1]: Created slice user-500.slice - User Slice of UID 500.
Sep 5 23:58:40.698222 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500...
Sep 5 23:58:40.714868 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500.
Sep 5 23:58:40.729801 systemd[1]: Starting user@500.service - User Manager for UID 500...
Sep 5 23:58:40.733806 (systemd)[1771]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0)
Sep 5 23:58:40.850490 systemd[1771]: Queued start job for default target default.target.
Sep 5 23:58:40.851708 systemd[1771]: Created slice app.slice - User Application Slice.
Sep 5 23:58:40.851841 systemd[1771]: Reached target paths.target - Paths.
Sep 5 23:58:40.851914 systemd[1771]: Reached target timers.target - Timers.
Sep 5 23:58:40.859829 systemd[1771]: Starting dbus.socket - D-Bus User Message Bus Socket...
Sep 5 23:58:40.870087 systemd[1771]: Listening on dbus.socket - D-Bus User Message Bus Socket.
Sep 5 23:58:40.871460 systemd[1771]: Reached target sockets.target - Sockets.
Sep 5 23:58:40.871937 systemd[1771]: Reached target basic.target - Basic System.
Sep 5 23:58:40.872209 systemd[1]: Started user@500.service - User Manager for UID 500.
Sep 5 23:58:40.874136 systemd[1771]: Reached target default.target - Main User Target.
Sep 5 23:58:40.874457 systemd[1771]: Startup finished in 132ms.
Sep 5 23:58:40.881272 systemd[1]: Started session-1.scope - Session 1 of User core.
Sep 5 23:58:41.103088 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 4.
Sep 5 23:58:41.116420 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 5 23:58:41.246892 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 5 23:58:41.252424 (kubelet)[1794]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Sep 5 23:58:41.301552 kubelet[1794]: E0905 23:58:41.301473 1794 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Sep 5 23:58:41.304239 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Sep 5 23:58:41.304553 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Sep 5 23:58:41.580098 systemd[1]: Started sshd@1-128.140.56.156:22-139.178.68.195:54504.service - OpenSSH per-connection server daemon (139.178.68.195:54504).
Sep 5 23:58:42.571560 sshd[1804]: Accepted publickey for core from 139.178.68.195 port 54504 ssh2: RSA SHA256:+hHHVborSkWo7/0A1ohHVzFaxSLc/9IisClzOe0fYVI
Sep 5 23:58:42.573737 sshd[1804]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 5 23:58:42.580002 systemd-logind[1551]: New session 2 of user core.
Sep 5 23:58:42.589247 systemd[1]: Started session-2.scope - Session 2 of User core.
Sep 5 23:58:43.263860 sshd[1804]: pam_unix(sshd:session): session closed for user core
Sep 5 23:58:43.268729 systemd[1]: sshd@1-128.140.56.156:22-139.178.68.195:54504.service: Deactivated successfully.
Sep 5 23:58:43.271862 systemd[1]: session-2.scope: Deactivated successfully.
Sep 5 23:58:43.272130 systemd-logind[1551]: Session 2 logged out. Waiting for processes to exit.
Sep 5 23:58:43.275087 systemd-logind[1551]: Removed session 2.
Sep 5 23:58:43.323744 update_engine[1554]: I20250905 23:58:43.322690 1554 update_attempter.cc:509] Updating boot flags...
Sep 5 23:58:43.380653 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 37 scanned by (udev-worker) (1821)
Sep 5 23:58:43.441197 systemd[1]: Started sshd@2-128.140.56.156:22-139.178.68.195:54510.service - OpenSSH per-connection server daemon (139.178.68.195:54510).
Sep 5 23:58:43.444669 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 37 scanned by (udev-worker) (1824)
Sep 5 23:58:44.505084 sshd[1829]: Accepted publickey for core from 139.178.68.195 port 54510 ssh2: RSA SHA256:+hHHVborSkWo7/0A1ohHVzFaxSLc/9IisClzOe0fYVI
Sep 5 23:58:44.507903 sshd[1829]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 5 23:58:44.514243 systemd-logind[1551]: New session 3 of user core.
Sep 5 23:58:44.528159 systemd[1]: Started session-3.scope - Session 3 of User core.
Sep 5 23:58:45.228891 sshd[1829]: pam_unix(sshd:session): session closed for user core
Sep 5 23:58:45.233510 systemd-logind[1551]: Session 3 logged out. Waiting for processes to exit.
Sep 5 23:58:45.234287 systemd[1]: sshd@2-128.140.56.156:22-139.178.68.195:54510.service: Deactivated successfully.
Sep 5 23:58:45.239249 systemd[1]: session-3.scope: Deactivated successfully.
Sep 5 23:58:45.240341 systemd-logind[1551]: Removed session 3.
Sep 5 23:58:45.399469 systemd[1]: Started sshd@3-128.140.56.156:22-139.178.68.195:54520.service - OpenSSH per-connection server daemon (139.178.68.195:54520).
Sep 5 23:58:46.390834 sshd[1838]: Accepted publickey for core from 139.178.68.195 port 54520 ssh2: RSA SHA256:+hHHVborSkWo7/0A1ohHVzFaxSLc/9IisClzOe0fYVI
Sep 5 23:58:46.393101 sshd[1838]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 5 23:58:46.397470 systemd-logind[1551]: New session 4 of user core.
Sep 5 23:58:46.406349 systemd[1]: Started session-4.scope - Session 4 of User core.
Sep 5 23:58:47.081021 sshd[1838]: pam_unix(sshd:session): session closed for user core
Sep 5 23:58:47.089148 systemd[1]: sshd@3-128.140.56.156:22-139.178.68.195:54520.service: Deactivated successfully.
Sep 5 23:58:47.092360 systemd-logind[1551]: Session 4 logged out. Waiting for processes to exit.
Sep 5 23:58:47.092553 systemd[1]: session-4.scope: Deactivated successfully.
Sep 5 23:58:47.093896 systemd-logind[1551]: Removed session 4.
Sep 5 23:58:47.255080 systemd[1]: Started sshd@4-128.140.56.156:22-139.178.68.195:54532.service - OpenSSH per-connection server daemon (139.178.68.195:54532).
Sep 5 23:58:48.246516 sshd[1846]: Accepted publickey for core from 139.178.68.195 port 54532 ssh2: RSA SHA256:+hHHVborSkWo7/0A1ohHVzFaxSLc/9IisClzOe0fYVI
Sep 5 23:58:48.249163 sshd[1846]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 5 23:58:48.255368 systemd-logind[1551]: New session 5 of user core.
Sep 5 23:58:48.262300 systemd[1]: Started session-5.scope - Session 5 of User core.
Sep 5 23:58:48.784641 sudo[1850]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1
Sep 5 23:58:48.784981 sudo[1850]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Sep 5 23:58:48.800829 sudo[1850]: pam_unix(sudo:session): session closed for user root
Sep 5 23:58:48.964183 sshd[1846]: pam_unix(sshd:session): session closed for user core
Sep 5 23:58:48.969992 systemd[1]: sshd@4-128.140.56.156:22-139.178.68.195:54532.service: Deactivated successfully.
Sep 5 23:58:48.974312 systemd[1]: session-5.scope: Deactivated successfully.
Sep 5 23:58:48.975349 systemd-logind[1551]: Session 5 logged out. Waiting for processes to exit.
Sep 5 23:58:48.976996 systemd-logind[1551]: Removed session 5.
Sep 5 23:58:49.141164 systemd[1]: Started sshd@5-128.140.56.156:22-139.178.68.195:54540.service - OpenSSH per-connection server daemon (139.178.68.195:54540).
Sep 5 23:58:50.202984 sshd[1855]: Accepted publickey for core from 139.178.68.195 port 54540 ssh2: RSA SHA256:+hHHVborSkWo7/0A1ohHVzFaxSLc/9IisClzOe0fYVI
Sep 5 23:58:50.204952 sshd[1855]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 5 23:58:50.211014 systemd-logind[1551]: New session 6 of user core.
Sep 5 23:58:50.222400 systemd[1]: Started session-6.scope - Session 6 of User core.
Sep 5 23:58:50.762356 sudo[1860]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/bin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules
Sep 5 23:58:50.762775 sudo[1860]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Sep 5 23:58:50.767120 sudo[1860]: pam_unix(sudo:session): session closed for user root
Sep 5 23:58:50.773578 sudo[1859]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/bin/systemctl restart audit-rules
Sep 5 23:58:50.774049 sudo[1859]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Sep 5 23:58:50.794091 systemd[1]: Stopping audit-rules.service - Load Security Auditing Rules...
Sep 5 23:58:50.797279 auditctl[1863]: No rules
Sep 5 23:58:50.796618 systemd[1]: audit-rules.service: Deactivated successfully.
Sep 5 23:58:50.797055 systemd[1]: Stopped audit-rules.service - Load Security Auditing Rules.
Sep 5 23:58:50.812254 systemd[1]: Starting audit-rules.service - Load Security Auditing Rules...
Sep 5 23:58:50.839435 augenrules[1882]: No rules
Sep 5 23:58:50.841325 systemd[1]: Finished audit-rules.service - Load Security Auditing Rules.
Sep 5 23:58:50.845110 sudo[1859]: pam_unix(sudo:session): session closed for user root
Sep 5 23:58:51.016493 sshd[1855]: pam_unix(sshd:session): session closed for user core
Sep 5 23:58:51.020662 systemd-logind[1551]: Session 6 logged out. Waiting for processes to exit.
Sep 5 23:58:51.021300 systemd[1]: sshd@5-128.140.56.156:22-139.178.68.195:54540.service: Deactivated successfully.
Sep 5 23:58:51.025568 systemd[1]: session-6.scope: Deactivated successfully.
Sep 5 23:58:51.027095 systemd-logind[1551]: Removed session 6.
Sep 5 23:58:51.185075 systemd[1]: Started sshd@6-128.140.56.156:22-139.178.68.195:44060.service - OpenSSH per-connection server daemon (139.178.68.195:44060).
Sep 5 23:58:51.353000 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 5.
Sep 5 23:58:51.359902 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 5 23:58:51.499987 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 5 23:58:51.504435 (kubelet)[1905]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Sep 5 23:58:51.553190 kubelet[1905]: E0905 23:58:51.553143 1905 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Sep 5 23:58:51.557257 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Sep 5 23:58:51.557471 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Sep 5 23:58:52.171800 sshd[1891]: Accepted publickey for core from 139.178.68.195 port 44060 ssh2: RSA SHA256:+hHHVborSkWo7/0A1ohHVzFaxSLc/9IisClzOe0fYVI
Sep 5 23:58:52.174155 sshd[1891]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 5 23:58:52.180016 systemd-logind[1551]: New session 7 of user core.
Sep 5 23:58:52.186274 systemd[1]: Started session-7.scope - Session 7 of User core.
Sep 5 23:58:52.701052 sudo[1915]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh
Sep 5 23:58:52.702028 sudo[1915]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Sep 5 23:58:53.006075 systemd[1]: Starting docker.service - Docker Application Container Engine...
Sep 5 23:58:53.007871 (dockerd)[1930]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU
Sep 5 23:58:53.261674 dockerd[1930]: time="2025-09-05T23:58:53.260583293Z" level=info msg="Starting up"
Sep 5 23:58:53.365996 dockerd[1930]: time="2025-09-05T23:58:53.365736571Z" level=info msg="Loading containers: start."
Sep 5 23:58:53.473667 kernel: Initializing XFRM netlink socket
Sep 5 23:58:53.565227 systemd-networkd[1232]: docker0: Link UP
Sep 5 23:58:53.590352 dockerd[1930]: time="2025-09-05T23:58:53.590107422Z" level=info msg="Loading containers: done."
Sep 5 23:58:53.607934 systemd[1]: var-lib-docker-overlay2-opaque\x2dbug\x2dcheck879329596-merged.mount: Deactivated successfully.
Sep 5 23:58:53.613185 dockerd[1930]: time="2025-09-05T23:58:53.612711406Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2
Sep 5 23:58:53.613185 dockerd[1930]: time="2025-09-05T23:58:53.612855328Z" level=info msg="Docker daemon" commit=061aa95809be396a6b5542618d8a34b02a21ff77 containerd-snapshotter=false storage-driver=overlay2 version=26.1.0
Sep 5 23:58:53.613185 dockerd[1930]: time="2025-09-05T23:58:53.612983050Z" level=info msg="Daemon has completed initialization"
Sep 5 23:58:53.653829 dockerd[1930]: time="2025-09-05T23:58:53.653648988Z" level=info msg="API listen on /run/docker.sock"
Sep 5 23:58:53.654108 systemd[1]: Started docker.service - Docker Application Container Engine.
Sep 5 23:58:54.684376 containerd[1576]: time="2025-09-05T23:58:54.684324407Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.31.12\""
Sep 5 23:58:55.333158 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2899649794.mount: Deactivated successfully.
Sep 5 23:58:56.629945 containerd[1576]: time="2025-09-05T23:58:56.628434940Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.31.12\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 5 23:58:56.629945 containerd[1576]: time="2025-09-05T23:58:56.629873878Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.31.12: active requests=0, bytes read=25652533"
Sep 5 23:58:56.630910 containerd[1576]: time="2025-09-05T23:58:56.630846771Z" level=info msg="ImageCreate event name:\"sha256:25d00c9505e8a4a7a6c827030f878b50e58bbf63322e01a7d92807bcb4db6b3d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 5 23:58:56.634334 containerd[1576]: time="2025-09-05T23:58:56.634281414Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:e9011c3bee8c06ecabd7816e119dca4e448c92f7a78acd891de3d2db1dc6c234\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 5 23:58:56.636143 containerd[1576]: time="2025-09-05T23:58:56.636089796Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.31.12\" with image id \"sha256:25d00c9505e8a4a7a6c827030f878b50e58bbf63322e01a7d92807bcb4db6b3d\", repo tag \"registry.k8s.io/kube-apiserver:v1.31.12\", repo digest \"registry.k8s.io/kube-apiserver@sha256:e9011c3bee8c06ecabd7816e119dca4e448c92f7a78acd891de3d2db1dc6c234\", size \"25649241\" in 1.951715788s"
Sep 5 23:58:56.636143 containerd[1576]: time="2025-09-05T23:58:56.636141157Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.31.12\" returns image reference \"sha256:25d00c9505e8a4a7a6c827030f878b50e58bbf63322e01a7d92807bcb4db6b3d\""
Sep 5 23:58:56.638428 containerd[1576]: time="2025-09-05T23:58:56.638386705Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.31.12\""
Sep 5 23:58:58.036927 containerd[1576]: time="2025-09-05T23:58:58.035663772Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.31.12\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 5 23:58:58.036927 containerd[1576]: time="2025-09-05T23:58:58.036876186Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.31.12: active requests=0, bytes read=22460329"
Sep 5 23:58:58.037453 containerd[1576]: time="2025-09-05T23:58:58.037426232Z" level=info msg="ImageCreate event name:\"sha256:04df324666956d4cb57096c0edff6bfe1d75e71fb8f508dec8818f2842f821e1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 5 23:58:58.040721 containerd[1576]: time="2025-09-05T23:58:58.040674787Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:d2862f94d87320267fddbd55db26556a267aa802e51d6b60f25786b4c428afc8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 5 23:58:58.045247 containerd[1576]: time="2025-09-05T23:58:58.045196157Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.31.12\" with image id \"sha256:04df324666956d4cb57096c0edff6bfe1d75e71fb8f508dec8818f2842f821e1\", repo tag \"registry.k8s.io/kube-controller-manager:v1.31.12\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:d2862f94d87320267fddbd55db26556a267aa802e51d6b60f25786b4c428afc8\", size \"23997423\" in 1.406762332s"
Sep 5 23:58:58.045433 containerd[1576]: time="2025-09-05T23:58:58.045416640Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.31.12\" returns image reference \"sha256:04df324666956d4cb57096c0edff6bfe1d75e71fb8f508dec8818f2842f821e1\""
Sep 5 23:58:58.045962 containerd[1576]: time="2025-09-05T23:58:58.045930525Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.31.12\""
Sep 5 23:58:59.102039 containerd[1576]: time="2025-09-05T23:58:59.101978922Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.31.12\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 5 23:58:59.104182 containerd[1576]: time="2025-09-05T23:58:59.103767701Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.31.12: active requests=0, bytes read=17125923"
Sep 5 23:58:59.105756 containerd[1576]: time="2025-09-05T23:58:59.105713161Z" level=info msg="ImageCreate event name:\"sha256:00b0619122c2d4fd3b5e102e9850d8c732e08a386b9c172c409b3a5cd552e07d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 5 23:58:59.112012 containerd[1576]: time="2025-09-05T23:58:59.111942625Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:152943b7e30244f4415fd0a5860a2dccd91660fe983d30a28a10edb0cc8f6756\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 5 23:58:59.114407 containerd[1576]: time="2025-09-05T23:58:59.114208009Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.31.12\" with image id \"sha256:00b0619122c2d4fd3b5e102e9850d8c732e08a386b9c172c409b3a5cd552e07d\", repo tag \"registry.k8s.io/kube-scheduler:v1.31.12\", repo digest \"registry.k8s.io/kube-scheduler@sha256:152943b7e30244f4415fd0a5860a2dccd91660fe983d30a28a10edb0cc8f6756\", size \"18663035\" in 1.068230562s"
Sep 5 23:58:59.114407 containerd[1576]: time="2025-09-05T23:58:59.114272649Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.31.12\" returns image reference \"sha256:00b0619122c2d4fd3b5e102e9850d8c732e08a386b9c172c409b3a5cd552e07d\""
Sep 5 23:58:59.114975 containerd[1576]: time="2025-09-05T23:58:59.114862335Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.31.12\""
Sep 5 23:59:00.118363 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2363933324.mount: Deactivated successfully.
Sep 5 23:59:00.422337 containerd[1576]: time="2025-09-05T23:59:00.422185677Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.31.12\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 5 23:59:00.425026 containerd[1576]: time="2025-09-05T23:59:00.424955984Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.31.12: active requests=0, bytes read=26916121"
Sep 5 23:59:00.426404 containerd[1576]: time="2025-09-05T23:59:00.426355197Z" level=info msg="ImageCreate event name:\"sha256:25c7652bd0d893b147dce9135dc6a68c37da76f9a20dceec1d520782031b2f36\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 5 23:59:00.429434 containerd[1576]: time="2025-09-05T23:59:00.429358426Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:90aa6b5f4065937521ff8438bc705317485d0be3f8b00a07145e697d92cc2cc6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 5 23:59:00.430598 containerd[1576]: time="2025-09-05T23:59:00.430552318Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.31.12\" with image id \"sha256:25c7652bd0d893b147dce9135dc6a68c37da76f9a20dceec1d520782031b2f36\", repo tag \"registry.k8s.io/kube-proxy:v1.31.12\", repo digest \"registry.k8s.io/kube-proxy@sha256:90aa6b5f4065937521ff8438bc705317485d0be3f8b00a07145e697d92cc2cc6\", size \"26915114\" in 1.315648622s"
Sep 5 23:59:00.430768 containerd[1576]: time="2025-09-05T23:59:00.430746920Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.31.12\" returns image reference \"sha256:25c7652bd0d893b147dce9135dc6a68c37da76f9a20dceec1d520782031b2f36\""
Sep 5 23:59:00.431576 containerd[1576]: time="2025-09-05T23:59:00.431528967Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\""
Sep 5 23:59:01.034713 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3499387050.mount: Deactivated successfully.
Sep 5 23:59:01.602495 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 6.
Sep 5 23:59:01.610236 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 5 23:59:01.778688 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 5 23:59:01.789206 (kubelet)[2204]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Sep 5 23:59:01.842197 kubelet[2204]: E0905 23:59:01.842105 2204 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Sep 5 23:59:01.845397 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Sep 5 23:59:01.845919 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Sep 5 23:59:01.906757 containerd[1576]: time="2025-09-05T23:59:01.906565333Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.11.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 5 23:59:01.909291 containerd[1576]: time="2025-09-05T23:59:01.909205796Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.11.3: active requests=0, bytes read=16951714"
Sep 5 23:59:01.911016 containerd[1576]: time="2025-09-05T23:59:01.910454368Z" level=info msg="ImageCreate event name:\"sha256:2f6c962e7b8311337352d9fdea917da2184d9919f4da7695bc2a6517cf392fe4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 5 23:59:01.914852 containerd[1576]: time="2025-09-05T23:59:01.914793767Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 5 23:59:01.916863 containerd[1576]: time="2025-09-05T23:59:01.916779945Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.11.3\" with image id \"sha256:2f6c962e7b8311337352d9fdea917da2184d9919f4da7695bc2a6517cf392fe4\", repo tag \"registry.k8s.io/coredns/coredns:v1.11.3\", repo digest \"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\", size \"16948420\" in 1.485203697s"
Sep 5 23:59:01.916926 containerd[1576]: time="2025-09-05T23:59:01.916873506Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\" returns image reference \"sha256:2f6c962e7b8311337352d9fdea917da2184d9919f4da7695bc2a6517cf392fe4\""
Sep 5 23:59:01.917338 containerd[1576]: time="2025-09-05T23:59:01.917316550Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\""
Sep 5 23:59:02.481384 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1777787343.mount: Deactivated successfully.
Sep 5 23:59:02.491374 containerd[1576]: time="2025-09-05T23:59:02.489470424Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 5 23:59:02.491374 containerd[1576]: time="2025-09-05T23:59:02.491120358Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=268723"
Sep 5 23:59:02.491374 containerd[1576]: time="2025-09-05T23:59:02.491291879Z" level=info msg="ImageCreate event name:\"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 5 23:59:02.494342 containerd[1576]: time="2025-09-05T23:59:02.494291425Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 5 23:59:02.495101 containerd[1576]: time="2025-09-05T23:59:02.495056991Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"267933\" in 577.707921ms"
Sep 5 23:59:02.495101 containerd[1576]: time="2025-09-05T23:59:02.495100192Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\""
Sep 5 23:59:02.496034 containerd[1576]: time="2025-09-05T23:59:02.495986519Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.15-0\""
Sep 5 23:59:03.033023 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3971469172.mount: Deactivated successfully.
Sep 5 23:59:04.978660 containerd[1576]: time="2025-09-05T23:59:04.976465998Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.15-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 5 23:59:04.978660 containerd[1576]: time="2025-09-05T23:59:04.977974890Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.15-0: active requests=0, bytes read=66537235"
Sep 5 23:59:04.980493 containerd[1576]: time="2025-09-05T23:59:04.980447348Z" level=info msg="ImageCreate event name:\"sha256:27e3830e1402783674d8b594038967deea9d51f0d91b34c93c8f39d2f68af7da\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 5 23:59:04.986059 containerd[1576]: time="2025-09-05T23:59:04.986008110Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:a6dc63e6e8cfa0307d7851762fa6b629afb18f28d8aa3fab5a6e91b4af60026a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 5 23:59:04.987654 containerd[1576]: time="2025-09-05T23:59:04.987016357Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.15-0\" with image id \"sha256:27e3830e1402783674d8b594038967deea9d51f0d91b34c93c8f39d2f68af7da\", repo tag \"registry.k8s.io/etcd:3.5.15-0\", repo digest \"registry.k8s.io/etcd@sha256:a6dc63e6e8cfa0307d7851762fa6b629afb18f28d8aa3fab5a6e91b4af60026a\", size \"66535646\" in 2.490858517s"
Sep 5 23:59:04.987800 containerd[1576]: time="2025-09-05T23:59:04.987783643Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.15-0\" returns image reference \"sha256:27e3830e1402783674d8b594038967deea9d51f0d91b34c93c8f39d2f68af7da\""
Sep 5 23:59:06.623028 systemd[1]: Started sshd@7-128.140.56.156:22-103.99.206.83:37274.service - OpenSSH per-connection server daemon (103.99.206.83:37274).
Sep 5 23:59:06.961654 sshd[2290]: Connection closed by 103.99.206.83 port 37274 [preauth]
Sep 5 23:59:06.966132 systemd[1]: sshd@7-128.140.56.156:22-103.99.206.83:37274.service: Deactivated successfully.
Sep 5 23:59:09.920954 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 5 23:59:09.932421 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 5 23:59:09.966935 systemd[1]: Reloading requested from client PID 2303 ('systemctl') (unit session-7.scope)...
Sep 5 23:59:09.967100 systemd[1]: Reloading...
Sep 5 23:59:10.082650 zram_generator::config[2344]: No configuration found.
Sep 5 23:59:10.189721 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Sep 5 23:59:10.259591 systemd[1]: Reloading finished in 292 ms.
Sep 5 23:59:10.311575 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM
Sep 5 23:59:10.312018 systemd[1]: kubelet.service: Failed with result 'signal'.
Sep 5 23:59:10.312387 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 5 23:59:10.319095 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 5 23:59:10.462978 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 5 23:59:10.464065 (kubelet)[2403]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS
Sep 5 23:59:10.508648 kubelet[2403]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Sep 5 23:59:10.508648 kubelet[2403]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI.
Sep 5 23:59:10.508648 kubelet[2403]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Sep 5 23:59:10.508648 kubelet[2403]: I0905 23:59:10.507911 2403 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Sep 5 23:59:11.576656 kubelet[2403]: I0905 23:59:11.575942 2403 server.go:491] "Kubelet version" kubeletVersion="v1.31.8"
Sep 5 23:59:11.576656 kubelet[2403]: I0905 23:59:11.575985 2403 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Sep 5 23:59:11.576656 kubelet[2403]: I0905 23:59:11.576418 2403 server.go:934] "Client rotation is on, will bootstrap in background"
Sep 5 23:59:11.605844 kubelet[2403]: E0905 23:59:11.605805 2403 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://128.140.56.156:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 128.140.56.156:6443: connect: connection refused" logger="UnhandledError"
Sep 5 23:59:11.606890 kubelet[2403]: I0905 23:59:11.606851 2403 dynamic_cafile_content.go:160] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
Sep 5 23:59:11.618256 kubelet[2403]: E0905 23:59:11.618209 2403 log.go:32] "RuntimeConfig from runtime service failed" err="rpc error: code = Unimplemented desc = unknown method RuntimeConfig for service runtime.v1.RuntimeService"
Sep 5 23:59:11.618256 kubelet[2403]: I0905 23:59:11.618253 2403 server.go:1408] "CRI implementation should be updated to support RuntimeConfig when KubeletCgroupDriverFromCRI feature gate has been enabled. Falling back to using cgroupDriver from kubelet config."
Sep 5 23:59:11.623394 kubelet[2403]: I0905 23:59:11.623323 2403 server.go:749] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /"
Sep 5 23:59:11.623965 kubelet[2403]: I0905 23:59:11.623912 2403 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Sep 5 23:59:11.624106 kubelet[2403]: I0905 23:59:11.624051 2403 container_manager_linux.go:264] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Sep 5 23:59:11.624326 kubelet[2403]: I0905 23:59:11.624090 2403 container_manager_linux.go:269] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4081-3-5-n-8aba32846f","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"cgroupfs","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":1}
Sep 5 23:59:11.624488 kubelet[2403]: I0905 23:59:11.624366 2403 topology_manager.go:138] "Creating topology manager with none policy"
Sep 5 23:59:11.624488 kubelet[2403]: I0905 23:59:11.624379 2403 container_manager_linux.go:300] "Creating device plugin manager"
Sep 5 23:59:11.624611 kubelet[2403]: I0905 23:59:11.624590 2403 state_mem.go:36] "Initialized new in-memory state store"
Sep 5 23:59:11.628527 kubelet[2403]: I0905 23:59:11.628069 2403 kubelet.go:408] "Attempting to sync node with API server"
Sep 5 23:59:11.628527 kubelet[2403]: I0905 23:59:11.628120 2403 kubelet.go:303] "Adding static pod path" path="/etc/kubernetes/manifests"
Sep 5 23:59:11.628527 kubelet[2403]: I0905 23:59:11.628152 2403 kubelet.go:314] "Adding apiserver pod source"
Sep 5 23:59:11.628527 kubelet[2403]: I0905 23:59:11.628236 2403 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Sep 5 23:59:11.634025 kubelet[2403]: I0905 23:59:11.633953 2403 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="containerd" version="v1.7.21" apiVersion="v1"
Sep 5 23:59:11.634718 kubelet[2403]: I0905 23:59:11.634686 2403 kubelet.go:837] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Sep 5 23:59:11.634903 kubelet[2403]: W0905 23:59:11.634886 2403 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating.
Sep 5 23:59:11.636673 kubelet[2403]: I0905 23:59:11.635918 2403 server.go:1274] "Started kubelet"
Sep 5 23:59:11.636673 kubelet[2403]: W0905 23:59:11.636090 2403 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://128.140.56.156:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4081-3-5-n-8aba32846f&limit=500&resourceVersion=0": dial tcp 128.140.56.156:6443: connect: connection refused
Sep 5 23:59:11.636673 kubelet[2403]: E0905 23:59:11.636142 2403 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://128.140.56.156:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4081-3-5-n-8aba32846f&limit=500&resourceVersion=0\": dial tcp 128.140.56.156:6443: connect: connection refused" logger="UnhandledError"
Sep 5 23:59:11.640807 kubelet[2403]: I0905 23:59:11.640765 2403 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Sep 5 23:59:11.645209 kubelet[2403]: W0905 23:59:11.645131 2403 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://128.140.56.156:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 128.140.56.156:6443: connect: connection refused
Sep 5 23:59:11.645209 kubelet[2403]: E0905 23:59:11.645212 2403 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://128.140.56.156:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 128.140.56.156:6443: connect: connection refused" logger="UnhandledError"
Sep 5 23:59:11.647232 kubelet[2403]: I0905 23:59:11.647186 2403 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Sep 5 23:59:11.649455 kubelet[2403]: I0905 23:59:11.649407 2403 server.go:449] "Adding debug handlers to kubelet server"
Sep 5 23:59:11.650278 kubelet[2403]: I0905 23:59:11.650255 2403 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key"
Sep 5 23:59:11.650771 kubelet[2403]: I0905 23:59:11.650713 2403 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Sep 5 23:59:11.651038 kubelet[2403]: I0905 23:59:11.651013 2403 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Sep 5 23:59:11.653194 kubelet[2403]: I0905 23:59:11.653170 2403 volume_manager.go:289] "Starting Kubelet Volume Manager"
Sep 5 23:59:11.653715 kubelet[2403]: E0905 23:59:11.653690 2403 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"ci-4081-3-5-n-8aba32846f\" not found"
Sep 5 23:59:11.657495 kubelet[2403]: I0905 23:59:11.657445 2403 desired_state_of_world_populator.go:147] "Desired state populator starts to run"
Sep 5 23:59:11.659445 kubelet[2403]: I0905 23:59:11.658368 2403 reconciler.go:26] "Reconciler: start to sync state"
Sep 5 23:59:11.659822 kubelet[2403]: W0905 23:59:11.659758 2403 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://128.140.56.156:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 128.140.56.156:6443: connect: connection refused
Sep 5 23:59:11.659961 kubelet[2403]: E0905 23:59:11.659938 2403 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://128.140.56.156:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 128.140.56.156:6443: connect: connection refused" logger="UnhandledError"
Sep 5 23:59:11.662159 kubelet[2403]: E0905 23:59:11.660094 2403 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://128.140.56.156:6443/api/v1/namespaces/default/events\": dial tcp 128.140.56.156:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4081-3-5-n-8aba32846f.1862886042254041 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4081-3-5-n-8aba32846f,UID:ci-4081-3-5-n-8aba32846f,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4081-3-5-n-8aba32846f,},FirstTimestamp:2025-09-05 23:59:11.635890241 +0000 UTC m=+1.165197673,LastTimestamp:2025-09-05 23:59:11.635890241 +0000 UTC m=+1.165197673,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4081-3-5-n-8aba32846f,}"
Sep 5 23:59:11.662676 kubelet[2403]: E0905 23:59:11.662606 2403 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://128.140.56.156:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4081-3-5-n-8aba32846f?timeout=10s\": dial tcp 128.140.56.156:6443: connect: connection refused" interval="200ms"
Sep 5 23:59:11.663574 kubelet[2403]: I0905 23:59:11.663540 2403 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory
Sep 5 23:59:11.666426 kubelet[2403]: I0905 23:59:11.666403 2403 factory.go:221] Registration of the containerd container factory successfully
Sep 5 23:59:11.666638 kubelet[2403]: I0905 23:59:11.666611 2403 factory.go:221] Registration of the systemd container factory successfully
Sep 5 23:59:11.674702 kubelet[2403]: I0905 23:59:11.674132 2403 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4"
Sep 5 23:59:11.676069 kubelet[2403]: I0905 23:59:11.676041 2403 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6"
Sep 5 23:59:11.676177 kubelet[2403]: I0905 23:59:11.676168 2403 status_manager.go:217] "Starting to sync pod status with apiserver"
Sep 5 23:59:11.676257 kubelet[2403]: I0905 23:59:11.676249 2403 kubelet.go:2321] "Starting kubelet main sync loop"
Sep 5 23:59:11.676359 kubelet[2403]: E0905 23:59:11.676328 2403 kubelet.go:2345] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Sep 5 23:59:11.680240 kubelet[2403]: E0905 23:59:11.680204 2403 kubelet.go:1478] "Image garbage collection failed once.
Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Sep 5 23:59:11.688464 kubelet[2403]: W0905 23:59:11.688120 2403 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://128.140.56.156:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 128.140.56.156:6443: connect: connection refused Sep 5 23:59:11.688464 kubelet[2403]: E0905 23:59:11.688196 2403 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://128.140.56.156:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 128.140.56.156:6443: connect: connection refused" logger="UnhandledError" Sep 5 23:59:11.703102 kubelet[2403]: I0905 23:59:11.702786 2403 cpu_manager.go:214] "Starting CPU manager" policy="none" Sep 5 23:59:11.703102 kubelet[2403]: I0905 23:59:11.702832 2403 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s" Sep 5 23:59:11.703102 kubelet[2403]: I0905 23:59:11.702854 2403 state_mem.go:36] "Initialized new in-memory state store" Sep 5 23:59:11.705375 kubelet[2403]: I0905 23:59:11.705346 2403 policy_none.go:49] "None policy: Start" Sep 5 23:59:11.706375 kubelet[2403]: I0905 23:59:11.706355 2403 memory_manager.go:170] "Starting memorymanager" policy="None" Sep 5 23:59:11.706467 kubelet[2403]: I0905 23:59:11.706388 2403 state_mem.go:35] "Initializing new in-memory state store" Sep 5 23:59:11.711466 kubelet[2403]: I0905 23:59:11.711434 2403 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Sep 5 23:59:11.712852 kubelet[2403]: I0905 23:59:11.712830 2403 eviction_manager.go:189] "Eviction manager: starting control loop" Sep 5 23:59:11.714653 kubelet[2403]: I0905 23:59:11.712853 2403 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 
monitorPeriod="10s" Sep 5 23:59:11.714653 kubelet[2403]: I0905 23:59:11.714486 2403 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Sep 5 23:59:11.715673 kubelet[2403]: E0905 23:59:11.715647 2403 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-4081-3-5-n-8aba32846f\" not found" Sep 5 23:59:11.818718 kubelet[2403]: I0905 23:59:11.818296 2403 kubelet_node_status.go:72] "Attempting to register node" node="ci-4081-3-5-n-8aba32846f" Sep 5 23:59:11.819495 kubelet[2403]: E0905 23:59:11.819465 2403 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://128.140.56.156:6443/api/v1/nodes\": dial tcp 128.140.56.156:6443: connect: connection refused" node="ci-4081-3-5-n-8aba32846f" Sep 5 23:59:11.859752 kubelet[2403]: I0905 23:59:11.859570 2403 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/d8f42a34ba66a7b41021f548558ce545-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4081-3-5-n-8aba32846f\" (UID: \"d8f42a34ba66a7b41021f548558ce545\") " pod="kube-system/kube-controller-manager-ci-4081-3-5-n-8aba32846f" Sep 5 23:59:11.859752 kubelet[2403]: I0905 23:59:11.859646 2403 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/4ba089ccbafa93e097d8d8214ca67960-ca-certs\") pod \"kube-apiserver-ci-4081-3-5-n-8aba32846f\" (UID: \"4ba089ccbafa93e097d8d8214ca67960\") " pod="kube-system/kube-apiserver-ci-4081-3-5-n-8aba32846f" Sep 5 23:59:11.859752 kubelet[2403]: I0905 23:59:11.859673 2403 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/4ba089ccbafa93e097d8d8214ca67960-k8s-certs\") pod \"kube-apiserver-ci-4081-3-5-n-8aba32846f\" (UID: 
\"4ba089ccbafa93e097d8d8214ca67960\") " pod="kube-system/kube-apiserver-ci-4081-3-5-n-8aba32846f" Sep 5 23:59:11.859752 kubelet[2403]: I0905 23:59:11.859696 2403 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/d8f42a34ba66a7b41021f548558ce545-flexvolume-dir\") pod \"kube-controller-manager-ci-4081-3-5-n-8aba32846f\" (UID: \"d8f42a34ba66a7b41021f548558ce545\") " pod="kube-system/kube-controller-manager-ci-4081-3-5-n-8aba32846f" Sep 5 23:59:11.859752 kubelet[2403]: I0905 23:59:11.859721 2403 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/d8f42a34ba66a7b41021f548558ce545-k8s-certs\") pod \"kube-controller-manager-ci-4081-3-5-n-8aba32846f\" (UID: \"d8f42a34ba66a7b41021f548558ce545\") " pod="kube-system/kube-controller-manager-ci-4081-3-5-n-8aba32846f" Sep 5 23:59:11.860064 kubelet[2403]: I0905 23:59:11.859771 2403 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/d8f42a34ba66a7b41021f548558ce545-kubeconfig\") pod \"kube-controller-manager-ci-4081-3-5-n-8aba32846f\" (UID: \"d8f42a34ba66a7b41021f548558ce545\") " pod="kube-system/kube-controller-manager-ci-4081-3-5-n-8aba32846f" Sep 5 23:59:11.860064 kubelet[2403]: I0905 23:59:11.859842 2403 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/4ba089ccbafa93e097d8d8214ca67960-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4081-3-5-n-8aba32846f\" (UID: \"4ba089ccbafa93e097d8d8214ca67960\") " pod="kube-system/kube-apiserver-ci-4081-3-5-n-8aba32846f" Sep 5 23:59:11.860064 kubelet[2403]: I0905 23:59:11.859877 2403 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ca-certs\" (UniqueName: \"kubernetes.io/host-path/d8f42a34ba66a7b41021f548558ce545-ca-certs\") pod \"kube-controller-manager-ci-4081-3-5-n-8aba32846f\" (UID: \"d8f42a34ba66a7b41021f548558ce545\") " pod="kube-system/kube-controller-manager-ci-4081-3-5-n-8aba32846f" Sep 5 23:59:11.860064 kubelet[2403]: I0905 23:59:11.859915 2403 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/92f21c11552e4489b30dd575b7a2de9f-kubeconfig\") pod \"kube-scheduler-ci-4081-3-5-n-8aba32846f\" (UID: \"92f21c11552e4489b30dd575b7a2de9f\") " pod="kube-system/kube-scheduler-ci-4081-3-5-n-8aba32846f" Sep 5 23:59:11.864170 kubelet[2403]: E0905 23:59:11.864113 2403 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://128.140.56.156:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4081-3-5-n-8aba32846f?timeout=10s\": dial tcp 128.140.56.156:6443: connect: connection refused" interval="400ms" Sep 5 23:59:12.023943 kubelet[2403]: I0905 23:59:12.023427 2403 kubelet_node_status.go:72] "Attempting to register node" node="ci-4081-3-5-n-8aba32846f" Sep 5 23:59:12.023943 kubelet[2403]: E0905 23:59:12.023787 2403 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://128.140.56.156:6443/api/v1/nodes\": dial tcp 128.140.56.156:6443: connect: connection refused" node="ci-4081-3-5-n-8aba32846f" Sep 5 23:59:12.089812 containerd[1576]: time="2025-09-05T23:59:12.089686334Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4081-3-5-n-8aba32846f,Uid:4ba089ccbafa93e097d8d8214ca67960,Namespace:kube-system,Attempt:0,}" Sep 5 23:59:12.092336 containerd[1576]: time="2025-09-05T23:59:12.091596702Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4081-3-5-n-8aba32846f,Uid:d8f42a34ba66a7b41021f548558ce545,Namespace:kube-system,Attempt:0,}" Sep 5 23:59:12.095128 
containerd[1576]: time="2025-09-05T23:59:12.094671396Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4081-3-5-n-8aba32846f,Uid:92f21c11552e4489b30dd575b7a2de9f,Namespace:kube-system,Attempt:0,}" Sep 5 23:59:12.265121 kubelet[2403]: E0905 23:59:12.264920 2403 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://128.140.56.156:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4081-3-5-n-8aba32846f?timeout=10s\": dial tcp 128.140.56.156:6443: connect: connection refused" interval="800ms" Sep 5 23:59:12.427089 kubelet[2403]: I0905 23:59:12.427027 2403 kubelet_node_status.go:72] "Attempting to register node" node="ci-4081-3-5-n-8aba32846f" Sep 5 23:59:12.427578 kubelet[2403]: E0905 23:59:12.427493 2403 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://128.140.56.156:6443/api/v1/nodes\": dial tcp 128.140.56.156:6443: connect: connection refused" node="ci-4081-3-5-n-8aba32846f" Sep 5 23:59:12.511163 kubelet[2403]: W0905 23:59:12.511067 2403 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://128.140.56.156:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 128.140.56.156:6443: connect: connection refused Sep 5 23:59:12.511327 kubelet[2403]: E0905 23:59:12.511174 2403 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://128.140.56.156:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 128.140.56.156:6443: connect: connection refused" logger="UnhandledError" Sep 5 23:59:12.562346 kubelet[2403]: W0905 23:59:12.562164 2403 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://128.140.56.156:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4081-3-5-n-8aba32846f&limit=500&resourceVersion=0": 
dial tcp 128.140.56.156:6443: connect: connection refused Sep 5 23:59:12.562346 kubelet[2403]: E0905 23:59:12.562241 2403 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://128.140.56.156:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4081-3-5-n-8aba32846f&limit=500&resourceVersion=0\": dial tcp 128.140.56.156:6443: connect: connection refused" logger="UnhandledError" Sep 5 23:59:12.659016 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4165999152.mount: Deactivated successfully. Sep 5 23:59:12.669028 containerd[1576]: time="2025-09-05T23:59:12.668093714Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Sep 5 23:59:12.671407 containerd[1576]: time="2025-09-05T23:59:12.671367288Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=269193" Sep 5 23:59:12.672275 containerd[1576]: time="2025-09-05T23:59:12.672246732Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Sep 5 23:59:12.673700 containerd[1576]: time="2025-09-05T23:59:12.673668458Z" level=info msg="ImageCreate event name:\"sha256:4e42fb3c9d90ed7895bc04a9d96fe3102a65b521f485cc5a4f3dd818afef9cef\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Sep 5 23:59:12.676646 containerd[1576]: time="2025-09-05T23:59:12.676567551Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Sep 5 23:59:12.679080 containerd[1576]: time="2025-09-05T23:59:12.678737121Z" level=info msg="stop pulling image 
registry.k8s.io/pause:3.8: active requests=0, bytes read=0" Sep 5 23:59:12.679080 containerd[1576]: time="2025-09-05T23:59:12.679032562Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=0" Sep 5 23:59:12.681435 containerd[1576]: time="2025-09-05T23:59:12.681374773Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Sep 5 23:59:12.683215 kubelet[2403]: W0905 23:59:12.683001 2403 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://128.140.56.156:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 128.140.56.156:6443: connect: connection refused Sep 5 23:59:12.683215 kubelet[2403]: E0905 23:59:12.683071 2403 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://128.140.56.156:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 128.140.56.156:6443: connect: connection refused" logger="UnhandledError" Sep 5 23:59:12.686653 containerd[1576]: time="2025-09-05T23:59:12.685387871Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4e42fb3c9d90ed7895bc04a9d96fe3102a65b521f485cc5a4f3dd818afef9cef\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"268403\" in 595.462376ms" Sep 5 23:59:12.687011 containerd[1576]: time="2025-09-05T23:59:12.686965998Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4e42fb3c9d90ed7895bc04a9d96fe3102a65b521f485cc5a4f3dd818afef9cef\", repo tag \"registry.k8s.io/pause:3.8\", repo digest 
\"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"268403\" in 595.255655ms" Sep 5 23:59:12.696907 containerd[1576]: time="2025-09-05T23:59:12.696850482Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4e42fb3c9d90ed7895bc04a9d96fe3102a65b521f485cc5a4f3dd818afef9cef\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"268403\" in 602.051565ms" Sep 5 23:59:12.717701 kubelet[2403]: W0905 23:59:12.717608 2403 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://128.140.56.156:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 128.140.56.156:6443: connect: connection refused Sep 5 23:59:12.717853 kubelet[2403]: E0905 23:59:12.717706 2403 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://128.140.56.156:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 128.140.56.156:6443: connect: connection refused" logger="UnhandledError" Sep 5 23:59:12.825603 containerd[1576]: time="2025-09-05T23:59:12.825367655Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Sep 5 23:59:12.825900 containerd[1576]: time="2025-09-05T23:59:12.825458095Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Sep 5 23:59:12.826974 containerd[1576]: time="2025-09-05T23:59:12.826347659Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 5 23:59:12.826974 containerd[1576]: time="2025-09-05T23:59:12.826469900Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 5 23:59:12.831798 containerd[1576]: time="2025-09-05T23:59:12.831556003Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Sep 5 23:59:12.833327 containerd[1576]: time="2025-09-05T23:59:12.832330686Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Sep 5 23:59:12.833654 containerd[1576]: time="2025-09-05T23:59:12.833557692Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 5 23:59:12.834371 containerd[1576]: time="2025-09-05T23:59:12.834198254Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Sep 5 23:59:12.834371 containerd[1576]: time="2025-09-05T23:59:12.834291575Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Sep 5 23:59:12.834371 containerd[1576]: time="2025-09-05T23:59:12.834328535Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 5 23:59:12.834712 containerd[1576]: time="2025-09-05T23:59:12.834478816Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 5 23:59:12.836281 containerd[1576]: time="2025-09-05T23:59:12.835685621Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 5 23:59:12.913594 containerd[1576]: time="2025-09-05T23:59:12.913382928Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4081-3-5-n-8aba32846f,Uid:92f21c11552e4489b30dd575b7a2de9f,Namespace:kube-system,Attempt:0,} returns sandbox id \"4df81ee4cfb7f49a52fde4f7fbfd70b746b3dba0a0969167e43128470b2f1253\"" Sep 5 23:59:12.921751 containerd[1576]: time="2025-09-05T23:59:12.921311603Z" level=info msg="CreateContainer within sandbox \"4df81ee4cfb7f49a52fde4f7fbfd70b746b3dba0a0969167e43128470b2f1253\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Sep 5 23:59:12.927135 containerd[1576]: time="2025-09-05T23:59:12.927074789Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4081-3-5-n-8aba32846f,Uid:d8f42a34ba66a7b41021f548558ce545,Namespace:kube-system,Attempt:0,} returns sandbox id \"6b9a5da773a547cfd9fc54155abdaf3a6e73c6f00abf03c59222b0ca5b3b77dd\"" Sep 5 23:59:12.930326 containerd[1576]: time="2025-09-05T23:59:12.930263523Z" level=info msg="CreateContainer within sandbox \"6b9a5da773a547cfd9fc54155abdaf3a6e73c6f00abf03c59222b0ca5b3b77dd\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Sep 5 23:59:12.935733 containerd[1576]: time="2025-09-05T23:59:12.935692107Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4081-3-5-n-8aba32846f,Uid:4ba089ccbafa93e097d8d8214ca67960,Namespace:kube-system,Attempt:0,} returns sandbox id \"a46d70131525d94c0b20e7b01b3d52251975e3e464b9617f4a14cc24bf0ecb11\"" Sep 5 23:59:12.939218 containerd[1576]: time="2025-09-05T23:59:12.939167603Z" level=info msg="CreateContainer within sandbox \"a46d70131525d94c0b20e7b01b3d52251975e3e464b9617f4a14cc24bf0ecb11\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Sep 5 23:59:12.953666 containerd[1576]: time="2025-09-05T23:59:12.953598107Z" level=info msg="CreateContainer within sandbox 
\"4df81ee4cfb7f49a52fde4f7fbfd70b746b3dba0a0969167e43128470b2f1253\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"d18077f8a74891589a28489baecc1cf79dfcb67fb977737dc56149cf618d0d1b\"" Sep 5 23:59:12.954493 containerd[1576]: time="2025-09-05T23:59:12.954451671Z" level=info msg="StartContainer for \"d18077f8a74891589a28489baecc1cf79dfcb67fb977737dc56149cf618d0d1b\"" Sep 5 23:59:12.955850 containerd[1576]: time="2025-09-05T23:59:12.955783077Z" level=info msg="CreateContainer within sandbox \"6b9a5da773a547cfd9fc54155abdaf3a6e73c6f00abf03c59222b0ca5b3b77dd\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"3b3e214261afc585e291c28a6e47b4197bac762af6a953ab69a61bfccb52be32\"" Sep 5 23:59:12.956477 containerd[1576]: time="2025-09-05T23:59:12.956448520Z" level=info msg="StartContainer for \"3b3e214261afc585e291c28a6e47b4197bac762af6a953ab69a61bfccb52be32\"" Sep 5 23:59:12.959538 containerd[1576]: time="2025-09-05T23:59:12.959492933Z" level=info msg="CreateContainer within sandbox \"a46d70131525d94c0b20e7b01b3d52251975e3e464b9617f4a14cc24bf0ecb11\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"2c351d4738fb64ea7d050dcff5c5bef0b98b0724aefb6ed3e9002e4967dba61b\"" Sep 5 23:59:12.960299 containerd[1576]: time="2025-09-05T23:59:12.960156216Z" level=info msg="StartContainer for \"2c351d4738fb64ea7d050dcff5c5bef0b98b0724aefb6ed3e9002e4967dba61b\"" Sep 5 23:59:13.064994 containerd[1576]: time="2025-09-05T23:59:13.064821385Z" level=info msg="StartContainer for \"3b3e214261afc585e291c28a6e47b4197bac762af6a953ab69a61bfccb52be32\" returns successfully" Sep 5 23:59:13.067604 kubelet[2403]: E0905 23:59:13.067550 2403 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://128.140.56.156:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4081-3-5-n-8aba32846f?timeout=10s\": dial tcp 128.140.56.156:6443: connect: connection refused" interval="1.6s" 
Sep 5 23:59:13.080776 containerd[1576]: time="2025-09-05T23:59:13.080602771Z" level=info msg="StartContainer for \"2c351d4738fb64ea7d050dcff5c5bef0b98b0724aefb6ed3e9002e4967dba61b\" returns successfully" Sep 5 23:59:13.080776 containerd[1576]: time="2025-09-05T23:59:13.080771812Z" level=info msg="StartContainer for \"d18077f8a74891589a28489baecc1cf79dfcb67fb977737dc56149cf618d0d1b\" returns successfully" Sep 5 23:59:13.231961 kubelet[2403]: I0905 23:59:13.231916 2403 kubelet_node_status.go:72] "Attempting to register node" node="ci-4081-3-5-n-8aba32846f" Sep 5 23:59:15.576333 kubelet[2403]: E0905 23:59:15.576294 2403 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ci-4081-3-5-n-8aba32846f\" not found" node="ci-4081-3-5-n-8aba32846f" Sep 5 23:59:15.634505 kubelet[2403]: I0905 23:59:15.634452 2403 kubelet_node_status.go:75] "Successfully registered node" node="ci-4081-3-5-n-8aba32846f" Sep 5 23:59:15.642397 kubelet[2403]: I0905 23:59:15.642344 2403 apiserver.go:52] "Watching apiserver" Sep 5 23:59:15.659521 kubelet[2403]: I0905 23:59:15.659484 2403 desired_state_of_world_populator.go:155] "Finished populating initial desired state of world" Sep 5 23:59:17.939285 systemd[1]: Reloading requested from client PID 2679 ('systemctl') (unit session-7.scope)... Sep 5 23:59:17.939341 systemd[1]: Reloading... Sep 5 23:59:18.064706 zram_generator::config[2719]: No configuration found. Sep 5 23:59:18.164160 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Sep 5 23:59:18.241219 systemd[1]: Reloading finished in 301 ms. Sep 5 23:59:18.279704 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Sep 5 23:59:18.291503 systemd[1]: kubelet.service: Deactivated successfully. 
Sep 5 23:59:18.293056 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Sep 5 23:59:18.309929 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 5 23:59:18.431920 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 5 23:59:18.432063 (kubelet)[2774]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Sep 5 23:59:18.499958 kubelet[2774]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 5 23:59:18.500311 kubelet[2774]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Sep 5 23:59:18.500373 kubelet[2774]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 5 23:59:18.500679 kubelet[2774]: I0905 23:59:18.500471 2774 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Sep 5 23:59:18.513272 kubelet[2774]: I0905 23:59:18.513222 2774 server.go:491] "Kubelet version" kubeletVersion="v1.31.8" Sep 5 23:59:18.513451 kubelet[2774]: I0905 23:59:18.513440 2774 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Sep 5 23:59:18.513820 kubelet[2774]: I0905 23:59:18.513803 2774 server.go:934] "Client rotation is on, will bootstrap in background" Sep 5 23:59:18.521087 kubelet[2774]: I0905 23:59:18.517726 2774 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
Sep 5 23:59:18.526265 kubelet[2774]: I0905 23:59:18.526075 2774 dynamic_cafile_content.go:160] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Sep 5 23:59:18.533606 kubelet[2774]: E0905 23:59:18.533464 2774 log.go:32] "RuntimeConfig from runtime service failed" err="rpc error: code = Unimplemented desc = unknown method RuntimeConfig for service runtime.v1.RuntimeService" Sep 5 23:59:18.534312 kubelet[2774]: I0905 23:59:18.533831 2774 server.go:1408] "CRI implementation should be updated to support RuntimeConfig when KubeletCgroupDriverFromCRI feature gate has been enabled. Falling back to using cgroupDriver from kubelet config." Sep 5 23:59:18.537528 kubelet[2774]: I0905 23:59:18.537502 2774 server.go:749] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /" Sep 5 23:59:18.538196 kubelet[2774]: I0905 23:59:18.538179 2774 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Sep 5 23:59:18.538421 kubelet[2774]: I0905 23:59:18.538386 2774 container_manager_linux.go:264] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Sep 5 23:59:18.538679 kubelet[2774]: I0905 23:59:18.538474 2774 container_manager_linux.go:269] "Creating Container Manager object based on Node Config" 
nodeConfig={"NodeName":"ci-4081-3-5-n-8aba32846f","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"cgroupfs","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":1} Sep 5 23:59:18.539087 kubelet[2774]: I0905 23:59:18.538816 2774 topology_manager.go:138] "Creating topology manager with none policy" Sep 5 23:59:18.539087 kubelet[2774]: I0905 23:59:18.538834 2774 container_manager_linux.go:300] "Creating device plugin manager" Sep 5 23:59:18.539087 kubelet[2774]: I0905 23:59:18.538875 2774 state_mem.go:36] "Initialized new in-memory state store" Sep 5 23:59:18.539087 kubelet[2774]: I0905 23:59:18.538994 2774 kubelet.go:408] 
"Attempting to sync node with API server" Sep 5 23:59:18.539087 kubelet[2774]: I0905 23:59:18.539006 2774 kubelet.go:303] "Adding static pod path" path="/etc/kubernetes/manifests" Sep 5 23:59:18.539087 kubelet[2774]: I0905 23:59:18.539025 2774 kubelet.go:314] "Adding apiserver pod source" Sep 5 23:59:18.539087 kubelet[2774]: I0905 23:59:18.539039 2774 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Sep 5 23:59:18.543677 kubelet[2774]: I0905 23:59:18.542800 2774 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="containerd" version="v1.7.21" apiVersion="v1" Sep 5 23:59:18.543677 kubelet[2774]: I0905 23:59:18.543300 2774 kubelet.go:837] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Sep 5 23:59:18.544462 kubelet[2774]: I0905 23:59:18.544220 2774 server.go:1274] "Started kubelet" Sep 5 23:59:18.551607 kubelet[2774]: I0905 23:59:18.551566 2774 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Sep 5 23:59:18.565372 kubelet[2774]: I0905 23:59:18.564679 2774 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Sep 5 23:59:18.566847 kubelet[2774]: I0905 23:59:18.566549 2774 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Sep 5 23:59:18.570442 kubelet[2774]: I0905 23:59:18.570402 2774 volume_manager.go:289] "Starting Kubelet Volume Manager" Sep 5 23:59:18.570964 kubelet[2774]: E0905 23:59:18.570939 2774 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"ci-4081-3-5-n-8aba32846f\" not found" Sep 5 23:59:18.571686 kubelet[2774]: I0905 23:59:18.571669 2774 desired_state_of_world_populator.go:147] "Desired state populator starts to run" Sep 5 23:59:18.571997 kubelet[2774]: I0905 23:59:18.571987 2774 reconciler.go:26] "Reconciler: start to sync state" Sep 5 23:59:18.586656 kubelet[2774]: I0905 23:59:18.586591 2774 
kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Sep 5 23:59:18.588409 kubelet[2774]: I0905 23:59:18.588379 2774 server.go:449] "Adding debug handlers to kubelet server" Sep 5 23:59:18.589443 kubelet[2774]: I0905 23:59:18.589370 2774 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Sep 5 23:59:18.589711 kubelet[2774]: I0905 23:59:18.589425 2774 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Sep 5 23:59:18.589711 kubelet[2774]: I0905 23:59:18.589535 2774 status_manager.go:217] "Starting to sync pod status with apiserver" Sep 5 23:59:18.589711 kubelet[2774]: I0905 23:59:18.589553 2774 kubelet.go:2321] "Starting kubelet main sync loop" Sep 5 23:59:18.589711 kubelet[2774]: I0905 23:59:18.589587 2774 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Sep 5 23:59:18.589711 kubelet[2774]: E0905 23:59:18.589596 2774 kubelet.go:2345] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Sep 5 23:59:18.599150 kubelet[2774]: I0905 23:59:18.599108 2774 factory.go:221] Registration of the containerd container factory successfully Sep 5 23:59:18.599150 kubelet[2774]: I0905 23:59:18.599133 2774 factory.go:221] Registration of the systemd container factory successfully Sep 5 23:59:18.599326 kubelet[2774]: I0905 23:59:18.599219 2774 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Sep 5 23:59:18.605528 kubelet[2774]: E0905 23:59:18.605299 2774 kubelet.go:1478] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Sep 5 23:59:18.679159 kubelet[2774]: I0905 23:59:18.678854 2774 cpu_manager.go:214] "Starting CPU manager" policy="none" Sep 5 23:59:18.679159 kubelet[2774]: I0905 23:59:18.678875 2774 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s" Sep 5 23:59:18.679159 kubelet[2774]: I0905 23:59:18.678898 2774 state_mem.go:36] "Initialized new in-memory state store" Sep 5 23:59:18.679159 kubelet[2774]: I0905 23:59:18.679048 2774 state_mem.go:88] "Updated default CPUSet" cpuSet="" Sep 5 23:59:18.679159 kubelet[2774]: I0905 23:59:18.679058 2774 state_mem.go:96] "Updated CPUSet assignments" assignments={} Sep 5 23:59:18.679159 kubelet[2774]: I0905 23:59:18.679077 2774 policy_none.go:49] "None policy: Start" Sep 5 23:59:18.680654 kubelet[2774]: I0905 23:59:18.679900 2774 memory_manager.go:170] "Starting memorymanager" policy="None" Sep 5 23:59:18.680654 kubelet[2774]: I0905 23:59:18.679923 2774 state_mem.go:35] "Initializing new in-memory state store" Sep 5 23:59:18.680654 kubelet[2774]: I0905 23:59:18.680070 2774 state_mem.go:75] "Updated machine memory state" Sep 5 23:59:18.681454 kubelet[2774]: I0905 23:59:18.681433 2774 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Sep 5 23:59:18.681710 kubelet[2774]: I0905 23:59:18.681694 2774 eviction_manager.go:189] "Eviction manager: starting control loop" Sep 5 23:59:18.681816 kubelet[2774]: I0905 23:59:18.681782 2774 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Sep 5 23:59:18.682710 kubelet[2774]: I0905 23:59:18.682692 2774 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Sep 5 23:59:18.792994 kubelet[2774]: I0905 23:59:18.792878 2774 kubelet_node_status.go:72] "Attempting to register node" node="ci-4081-3-5-n-8aba32846f" Sep 5 23:59:18.805395 kubelet[2774]: I0905 23:59:18.805340 2774 
kubelet_node_status.go:111] "Node was previously registered" node="ci-4081-3-5-n-8aba32846f" Sep 5 23:59:18.805515 kubelet[2774]: I0905 23:59:18.805437 2774 kubelet_node_status.go:75] "Successfully registered node" node="ci-4081-3-5-n-8aba32846f" Sep 5 23:59:18.873027 kubelet[2774]: I0905 23:59:18.872929 2774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/d8f42a34ba66a7b41021f548558ce545-flexvolume-dir\") pod \"kube-controller-manager-ci-4081-3-5-n-8aba32846f\" (UID: \"d8f42a34ba66a7b41021f548558ce545\") " pod="kube-system/kube-controller-manager-ci-4081-3-5-n-8aba32846f" Sep 5 23:59:18.873027 kubelet[2774]: I0905 23:59:18.873002 2774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/d8f42a34ba66a7b41021f548558ce545-k8s-certs\") pod \"kube-controller-manager-ci-4081-3-5-n-8aba32846f\" (UID: \"d8f42a34ba66a7b41021f548558ce545\") " pod="kube-system/kube-controller-manager-ci-4081-3-5-n-8aba32846f" Sep 5 23:59:18.873325 kubelet[2774]: I0905 23:59:18.873044 2774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/d8f42a34ba66a7b41021f548558ce545-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4081-3-5-n-8aba32846f\" (UID: \"d8f42a34ba66a7b41021f548558ce545\") " pod="kube-system/kube-controller-manager-ci-4081-3-5-n-8aba32846f" Sep 5 23:59:18.873325 kubelet[2774]: I0905 23:59:18.873079 2774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/4ba089ccbafa93e097d8d8214ca67960-ca-certs\") pod \"kube-apiserver-ci-4081-3-5-n-8aba32846f\" (UID: \"4ba089ccbafa93e097d8d8214ca67960\") " pod="kube-system/kube-apiserver-ci-4081-3-5-n-8aba32846f" Sep 5 
23:59:18.873325 kubelet[2774]: I0905 23:59:18.873111 2774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/4ba089ccbafa93e097d8d8214ca67960-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4081-3-5-n-8aba32846f\" (UID: \"4ba089ccbafa93e097d8d8214ca67960\") " pod="kube-system/kube-apiserver-ci-4081-3-5-n-8aba32846f" Sep 5 23:59:18.873325 kubelet[2774]: I0905 23:59:18.873139 2774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/d8f42a34ba66a7b41021f548558ce545-ca-certs\") pod \"kube-controller-manager-ci-4081-3-5-n-8aba32846f\" (UID: \"d8f42a34ba66a7b41021f548558ce545\") " pod="kube-system/kube-controller-manager-ci-4081-3-5-n-8aba32846f" Sep 5 23:59:18.873325 kubelet[2774]: I0905 23:59:18.873168 2774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/92f21c11552e4489b30dd575b7a2de9f-kubeconfig\") pod \"kube-scheduler-ci-4081-3-5-n-8aba32846f\" (UID: \"92f21c11552e4489b30dd575b7a2de9f\") " pod="kube-system/kube-scheduler-ci-4081-3-5-n-8aba32846f" Sep 5 23:59:18.873585 kubelet[2774]: I0905 23:59:18.873194 2774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/4ba089ccbafa93e097d8d8214ca67960-k8s-certs\") pod \"kube-apiserver-ci-4081-3-5-n-8aba32846f\" (UID: \"4ba089ccbafa93e097d8d8214ca67960\") " pod="kube-system/kube-apiserver-ci-4081-3-5-n-8aba32846f" Sep 5 23:59:18.873585 kubelet[2774]: I0905 23:59:18.873222 2774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/d8f42a34ba66a7b41021f548558ce545-kubeconfig\") pod 
\"kube-controller-manager-ci-4081-3-5-n-8aba32846f\" (UID: \"d8f42a34ba66a7b41021f548558ce545\") " pod="kube-system/kube-controller-manager-ci-4081-3-5-n-8aba32846f" Sep 5 23:59:19.540436 kubelet[2774]: I0905 23:59:19.539988 2774 apiserver.go:52] "Watching apiserver" Sep 5 23:59:19.572201 kubelet[2774]: I0905 23:59:19.572130 2774 desired_state_of_world_populator.go:155] "Finished populating initial desired state of world" Sep 5 23:59:19.695076 kubelet[2774]: I0905 23:59:19.694969 2774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ci-4081-3-5-n-8aba32846f" podStartSLOduration=1.69494285 podStartE2EDuration="1.69494285s" podCreationTimestamp="2025-09-05 23:59:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-05 23:59:19.677076519 +0000 UTC m=+1.240700750" watchObservedRunningTime="2025-09-05 23:59:19.69494285 +0000 UTC m=+1.258567121" Sep 5 23:59:19.708249 kubelet[2774]: I0905 23:59:19.708138 2774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-ci-4081-3-5-n-8aba32846f" podStartSLOduration=1.708112527 podStartE2EDuration="1.708112527s" podCreationTimestamp="2025-09-05 23:59:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-05 23:59:19.695882732 +0000 UTC m=+1.259507003" watchObservedRunningTime="2025-09-05 23:59:19.708112527 +0000 UTC m=+1.271736758" Sep 5 23:59:19.728030 kubelet[2774]: I0905 23:59:19.725563 2774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-ci-4081-3-5-n-8aba32846f" podStartSLOduration=1.725547417 podStartE2EDuration="1.725547417s" podCreationTimestamp="2025-09-05 23:59:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-09-05 23:59:19.708602048 +0000 UTC m=+1.272226279" watchObservedRunningTime="2025-09-05 23:59:19.725547417 +0000 UTC m=+1.289171648" Sep 5 23:59:24.044680 kubelet[2774]: I0905 23:59:24.044648 2774 kuberuntime_manager.go:1635] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Sep 5 23:59:24.046519 containerd[1576]: time="2025-09-05T23:59:24.046408182Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." Sep 5 23:59:24.047810 kubelet[2774]: I0905 23:59:24.046861 2774 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Sep 5 23:59:24.608580 kubelet[2774]: I0905 23:59:24.608518 2774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/f8288178-67e5-45aa-8705-8c0fdc33b50f-xtables-lock\") pod \"kube-proxy-mlls8\" (UID: \"f8288178-67e5-45aa-8705-8c0fdc33b50f\") " pod="kube-system/kube-proxy-mlls8" Sep 5 23:59:24.608580 kubelet[2774]: I0905 23:59:24.608592 2774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f8288178-67e5-45aa-8705-8c0fdc33b50f-lib-modules\") pod \"kube-proxy-mlls8\" (UID: \"f8288178-67e5-45aa-8705-8c0fdc33b50f\") " pod="kube-system/kube-proxy-mlls8" Sep 5 23:59:24.608900 kubelet[2774]: I0905 23:59:24.608654 2774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/f8288178-67e5-45aa-8705-8c0fdc33b50f-kube-proxy\") pod \"kube-proxy-mlls8\" (UID: \"f8288178-67e5-45aa-8705-8c0fdc33b50f\") " pod="kube-system/kube-proxy-mlls8" Sep 5 23:59:24.608900 kubelet[2774]: I0905 23:59:24.608690 2774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7f9v7\" (UniqueName: 
\"kubernetes.io/projected/f8288178-67e5-45aa-8705-8c0fdc33b50f-kube-api-access-7f9v7\") pod \"kube-proxy-mlls8\" (UID: \"f8288178-67e5-45aa-8705-8c0fdc33b50f\") " pod="kube-system/kube-proxy-mlls8" Sep 5 23:59:24.728363 kubelet[2774]: E0905 23:59:24.727911 2774 projected.go:288] Couldn't get configMap kube-system/kube-root-ca.crt: configmap "kube-root-ca.crt" not found Sep 5 23:59:24.728363 kubelet[2774]: E0905 23:59:24.727957 2774 projected.go:194] Error preparing data for projected volume kube-api-access-7f9v7 for pod kube-system/kube-proxy-mlls8: configmap "kube-root-ca.crt" not found Sep 5 23:59:24.728363 kubelet[2774]: E0905 23:59:24.728024 2774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f8288178-67e5-45aa-8705-8c0fdc33b50f-kube-api-access-7f9v7 podName:f8288178-67e5-45aa-8705-8c0fdc33b50f nodeName:}" failed. No retries permitted until 2025-09-05 23:59:25.228001863 +0000 UTC m=+6.791626094 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-7f9v7" (UniqueName: "kubernetes.io/projected/f8288178-67e5-45aa-8705-8c0fdc33b50f-kube-api-access-7f9v7") pod "kube-proxy-mlls8" (UID: "f8288178-67e5-45aa-8705-8c0fdc33b50f") : configmap "kube-root-ca.crt" not found Sep 5 23:59:25.111218 kubelet[2774]: I0905 23:59:25.111134 2774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6gxfs\" (UniqueName: \"kubernetes.io/projected/23a0e9d1-17a8-405c-8175-553cd60cb6ee-kube-api-access-6gxfs\") pod \"tigera-operator-58fc44c59b-rxksr\" (UID: \"23a0e9d1-17a8-405c-8175-553cd60cb6ee\") " pod="tigera-operator/tigera-operator-58fc44c59b-rxksr" Sep 5 23:59:25.111791 kubelet[2774]: I0905 23:59:25.111236 2774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/23a0e9d1-17a8-405c-8175-553cd60cb6ee-var-lib-calico\") pod \"tigera-operator-58fc44c59b-rxksr\" 
(UID: \"23a0e9d1-17a8-405c-8175-553cd60cb6ee\") " pod="tigera-operator/tigera-operator-58fc44c59b-rxksr" Sep 5 23:59:25.384502 containerd[1576]: time="2025-09-05T23:59:25.384369163Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-58fc44c59b-rxksr,Uid:23a0e9d1-17a8-405c-8175-553cd60cb6ee,Namespace:tigera-operator,Attempt:0,}" Sep 5 23:59:25.412802 containerd[1576]: time="2025-09-05T23:59:25.412536657Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Sep 5 23:59:25.412802 containerd[1576]: time="2025-09-05T23:59:25.412645778Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Sep 5 23:59:25.412802 containerd[1576]: time="2025-09-05T23:59:25.412668818Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 5 23:59:25.413482 containerd[1576]: time="2025-09-05T23:59:25.413358099Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 5 23:59:25.457157 containerd[1576]: time="2025-09-05T23:59:25.457037463Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-mlls8,Uid:f8288178-67e5-45aa-8705-8c0fdc33b50f,Namespace:kube-system,Attempt:0,}" Sep 5 23:59:25.463149 containerd[1576]: time="2025-09-05T23:59:25.463109675Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-58fc44c59b-rxksr,Uid:23a0e9d1-17a8-405c-8175-553cd60cb6ee,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"e665bdb44818108392062ed3a80cc01ef969347b158429872911365b3c3914ee\"" Sep 5 23:59:25.465463 containerd[1576]: time="2025-09-05T23:59:25.465305799Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.6\"" Sep 5 23:59:25.485152 containerd[1576]: time="2025-09-05T23:59:25.484935357Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Sep 5 23:59:25.485152 containerd[1576]: time="2025-09-05T23:59:25.485048597Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Sep 5 23:59:25.485152 containerd[1576]: time="2025-09-05T23:59:25.485068957Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 5 23:59:25.486655 containerd[1576]: time="2025-09-05T23:59:25.485849879Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 5 23:59:25.524919 containerd[1576]: time="2025-09-05T23:59:25.524807514Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-mlls8,Uid:f8288178-67e5-45aa-8705-8c0fdc33b50f,Namespace:kube-system,Attempt:0,} returns sandbox id \"9d3cd40058e0d67adac85f827c048416bfdbcf57d6918be2be1e488534adab2e\"" Sep 5 23:59:25.529268 containerd[1576]: time="2025-09-05T23:59:25.529217242Z" level=info msg="CreateContainer within sandbox \"9d3cd40058e0d67adac85f827c048416bfdbcf57d6918be2be1e488534adab2e\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Sep 5 23:59:25.543933 containerd[1576]: time="2025-09-05T23:59:25.543802230Z" level=info msg="CreateContainer within sandbox \"9d3cd40058e0d67adac85f827c048416bfdbcf57d6918be2be1e488534adab2e\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"4707ab1be2389c980f05e477a0f6a86237a3fff7ce0c73051e2f8a07564800e7\"" Sep 5 23:59:25.544817 containerd[1576]: time="2025-09-05T23:59:25.544759592Z" level=info msg="StartContainer for \"4707ab1be2389c980f05e477a0f6a86237a3fff7ce0c73051e2f8a07564800e7\"" Sep 5 23:59:25.608813 containerd[1576]: time="2025-09-05T23:59:25.608768236Z" level=info msg="StartContainer for \"4707ab1be2389c980f05e477a0f6a86237a3fff7ce0c73051e2f8a07564800e7\" returns successfully" Sep 5 23:59:26.056745 kubelet[2774]: I0905 23:59:26.055869 2774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-mlls8" podStartSLOduration=2.055851051 podStartE2EDuration="2.055851051s" podCreationTimestamp="2025-09-05 23:59:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-05 23:59:25.676027285 +0000 UTC m=+7.239651516" watchObservedRunningTime="2025-09-05 23:59:26.055851051 +0000 UTC m=+7.619475282" Sep 5 23:59:27.209334 systemd[1]: 
var-lib-containerd-tmpmounts-containerd\x2dmount4030874432.mount: Deactivated successfully. Sep 5 23:59:27.586699 containerd[1576]: time="2025-09-05T23:59:27.586537431Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.38.6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 5 23:59:27.588254 containerd[1576]: time="2025-09-05T23:59:27.587962633Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.38.6: active requests=0, bytes read=22152365" Sep 5 23:59:27.588254 containerd[1576]: time="2025-09-05T23:59:27.588209074Z" level=info msg="ImageCreate event name:\"sha256:dd2e197838b00861b08ae5f480dfbfb9a519722e35ced99346315722309cbe9f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 5 23:59:27.591086 containerd[1576]: time="2025-09-05T23:59:27.591020158Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:00a7a9b62f9b9a4e0856128b078539783b8352b07f707bff595cb604cc580f6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 5 23:59:27.592250 containerd[1576]: time="2025-09-05T23:59:27.592115280Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.38.6\" with image id \"sha256:dd2e197838b00861b08ae5f480dfbfb9a519722e35ced99346315722309cbe9f\", repo tag \"quay.io/tigera/operator:v1.38.6\", repo digest \"quay.io/tigera/operator@sha256:00a7a9b62f9b9a4e0856128b078539783b8352b07f707bff595cb604cc580f6e\", size \"22148360\" in 2.126771081s" Sep 5 23:59:27.592250 containerd[1576]: time="2025-09-05T23:59:27.592153720Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.6\" returns image reference \"sha256:dd2e197838b00861b08ae5f480dfbfb9a519722e35ced99346315722309cbe9f\"" Sep 5 23:59:27.596453 containerd[1576]: time="2025-09-05T23:59:27.596344607Z" level=info msg="CreateContainer within sandbox \"e665bdb44818108392062ed3a80cc01ef969347b158429872911365b3c3914ee\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" Sep 5 23:59:27.614171 containerd[1576]: 
time="2025-09-05T23:59:27.614112677Z" level=info msg="CreateContainer within sandbox \"e665bdb44818108392062ed3a80cc01ef969347b158429872911365b3c3914ee\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"3b8b67645073c617c28d809da380ce1c94e68b38c3dcbfee485e83b86a155eb8\"" Sep 5 23:59:27.616667 containerd[1576]: time="2025-09-05T23:59:27.616468441Z" level=info msg="StartContainer for \"3b8b67645073c617c28d809da380ce1c94e68b38c3dcbfee485e83b86a155eb8\"" Sep 5 23:59:27.676370 containerd[1576]: time="2025-09-05T23:59:27.676264303Z" level=info msg="StartContainer for \"3b8b67645073c617c28d809da380ce1c94e68b38c3dcbfee485e83b86a155eb8\" returns successfully" Sep 5 23:59:28.757103 kubelet[2774]: I0905 23:59:28.756946 2774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-58fc44c59b-rxksr" podStartSLOduration=1.628351049 podStartE2EDuration="3.756928373s" podCreationTimestamp="2025-09-05 23:59:25 +0000 UTC" firstStartedPulling="2025-09-05 23:59:25.464570838 +0000 UTC m=+7.028195069" lastFinishedPulling="2025-09-05 23:59:27.593148162 +0000 UTC m=+9.156772393" observedRunningTime="2025-09-05 23:59:28.687966824 +0000 UTC m=+10.251591095" watchObservedRunningTime="2025-09-05 23:59:28.756928373 +0000 UTC m=+10.320552564" Sep 5 23:59:34.021718 sudo[1915]: pam_unix(sudo:session): session closed for user root Sep 5 23:59:34.183697 sshd[1891]: pam_unix(sshd:session): session closed for user core Sep 5 23:59:34.193665 systemd-logind[1551]: Session 7 logged out. Waiting for processes to exit. Sep 5 23:59:34.197192 systemd[1]: sshd@6-128.140.56.156:22-139.178.68.195:44060.service: Deactivated successfully. Sep 5 23:59:34.204981 systemd[1]: session-7.scope: Deactivated successfully. Sep 5 23:59:34.213132 systemd-logind[1551]: Removed session 7. 
Sep 5 23:59:41.814607 kubelet[2774]: W0905 23:59:41.814561 2774 reflector.go:561] object-"calico-system"/"typha-certs": failed to list *v1.Secret: secrets "typha-certs" is forbidden: User "system:node:ci-4081-3-5-n-8aba32846f" cannot list resource "secrets" in API group "" in the namespace "calico-system": no relationship found between node 'ci-4081-3-5-n-8aba32846f' and this object Sep 5 23:59:41.814607 kubelet[2774]: E0905 23:59:41.814609 2774 reflector.go:158] "Unhandled Error" err="object-\"calico-system\"/\"typha-certs\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"typha-certs\" is forbidden: User \"system:node:ci-4081-3-5-n-8aba32846f\" cannot list resource \"secrets\" in API group \"\" in the namespace \"calico-system\": no relationship found between node 'ci-4081-3-5-n-8aba32846f' and this object" logger="UnhandledError" Sep 5 23:59:41.814607 kubelet[2774]: W0905 23:59:41.814674 2774 reflector.go:561] object-"calico-system"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:ci-4081-3-5-n-8aba32846f" cannot list resource "configmaps" in API group "" in the namespace "calico-system": no relationship found between node 'ci-4081-3-5-n-8aba32846f' and this object Sep 5 23:59:41.814607 kubelet[2774]: E0905 23:59:41.814685 2774 reflector.go:158] "Unhandled Error" err="object-\"calico-system\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:ci-4081-3-5-n-8aba32846f\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"calico-system\": no relationship found between node 'ci-4081-3-5-n-8aba32846f' and this object" logger="UnhandledError" Sep 5 23:59:41.814607 kubelet[2774]: W0905 23:59:41.814698 2774 reflector.go:561] object-"calico-system"/"tigera-ca-bundle": failed to list *v1.ConfigMap: configmaps "tigera-ca-bundle" is forbidden: User 
"system:node:ci-4081-3-5-n-8aba32846f" cannot list resource "configmaps" in API group "" in the namespace "calico-system": no relationship found between node 'ci-4081-3-5-n-8aba32846f' and this object Sep 5 23:59:41.815597 kubelet[2774]: E0905 23:59:41.814730 2774 reflector.go:158] "Unhandled Error" err="object-\"calico-system\"/\"tigera-ca-bundle\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"tigera-ca-bundle\" is forbidden: User \"system:node:ci-4081-3-5-n-8aba32846f\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"calico-system\": no relationship found between node 'ci-4081-3-5-n-8aba32846f' and this object" logger="UnhandledError" Sep 5 23:59:41.826645 kubelet[2774]: I0905 23:59:41.826577 2774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gl8kl\" (UniqueName: \"kubernetes.io/projected/465a82c7-673e-4948-ab39-de8c30afe602-kube-api-access-gl8kl\") pod \"calico-typha-8645b78b65-fhvz7\" (UID: \"465a82c7-673e-4948-ab39-de8c30afe602\") " pod="calico-system/calico-typha-8645b78b65-fhvz7" Sep 5 23:59:41.826827 kubelet[2774]: I0905 23:59:41.826676 2774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/465a82c7-673e-4948-ab39-de8c30afe602-tigera-ca-bundle\") pod \"calico-typha-8645b78b65-fhvz7\" (UID: \"465a82c7-673e-4948-ab39-de8c30afe602\") " pod="calico-system/calico-typha-8645b78b65-fhvz7" Sep 5 23:59:41.826827 kubelet[2774]: I0905 23:59:41.826700 2774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/465a82c7-673e-4948-ab39-de8c30afe602-typha-certs\") pod \"calico-typha-8645b78b65-fhvz7\" (UID: \"465a82c7-673e-4948-ab39-de8c30afe602\") " pod="calico-system/calico-typha-8645b78b65-fhvz7" Sep 5 23:59:42.027437 kubelet[2774]: I0905 23:59:42.027205 
2774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/b6a7ce10-9d2f-4625-8dd2-b969ffd82644-var-lib-calico\") pod \"calico-node-j42nm\" (UID: \"b6a7ce10-9d2f-4625-8dd2-b969ffd82644\") " pod="calico-system/calico-node-j42nm"
Sep 5 23:59:42.027437 kubelet[2774]: I0905 23:59:42.027251 2774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/b6a7ce10-9d2f-4625-8dd2-b969ffd82644-xtables-lock\") pod \"calico-node-j42nm\" (UID: \"b6a7ce10-9d2f-4625-8dd2-b969ffd82644\") " pod="calico-system/calico-node-j42nm"
Sep 5 23:59:42.027437 kubelet[2774]: I0905 23:59:42.027271 2774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/b6a7ce10-9d2f-4625-8dd2-b969ffd82644-cni-log-dir\") pod \"calico-node-j42nm\" (UID: \"b6a7ce10-9d2f-4625-8dd2-b969ffd82644\") " pod="calico-system/calico-node-j42nm"
Sep 5 23:59:42.027437 kubelet[2774]: I0905 23:59:42.027286 2774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b6a7ce10-9d2f-4625-8dd2-b969ffd82644-tigera-ca-bundle\") pod \"calico-node-j42nm\" (UID: \"b6a7ce10-9d2f-4625-8dd2-b969ffd82644\") " pod="calico-system/calico-node-j42nm"
Sep 5 23:59:42.027437 kubelet[2774]: I0905 23:59:42.027305 2774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x6qxc\" (UniqueName: \"kubernetes.io/projected/b6a7ce10-9d2f-4625-8dd2-b969ffd82644-kube-api-access-x6qxc\") pod \"calico-node-j42nm\" (UID: \"b6a7ce10-9d2f-4625-8dd2-b969ffd82644\") " pod="calico-system/calico-node-j42nm"
Sep 5 23:59:42.027687 kubelet[2774]: I0905 23:59:42.027323 2774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/b6a7ce10-9d2f-4625-8dd2-b969ffd82644-policysync\") pod \"calico-node-j42nm\" (UID: \"b6a7ce10-9d2f-4625-8dd2-b969ffd82644\") " pod="calico-system/calico-node-j42nm"
Sep 5 23:59:42.027687 kubelet[2774]: I0905 23:59:42.027338 2774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b6a7ce10-9d2f-4625-8dd2-b969ffd82644-lib-modules\") pod \"calico-node-j42nm\" (UID: \"b6a7ce10-9d2f-4625-8dd2-b969ffd82644\") " pod="calico-system/calico-node-j42nm"
Sep 5 23:59:42.027687 kubelet[2774]: I0905 23:59:42.027353 2774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/b6a7ce10-9d2f-4625-8dd2-b969ffd82644-cni-net-dir\") pod \"calico-node-j42nm\" (UID: \"b6a7ce10-9d2f-4625-8dd2-b969ffd82644\") " pod="calico-system/calico-node-j42nm"
Sep 5 23:59:42.027687 kubelet[2774]: I0905 23:59:42.027367 2774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/b6a7ce10-9d2f-4625-8dd2-b969ffd82644-var-run-calico\") pod \"calico-node-j42nm\" (UID: \"b6a7ce10-9d2f-4625-8dd2-b969ffd82644\") " pod="calico-system/calico-node-j42nm"
Sep 5 23:59:42.027687 kubelet[2774]: I0905 23:59:42.027436 2774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/b6a7ce10-9d2f-4625-8dd2-b969ffd82644-cni-bin-dir\") pod \"calico-node-j42nm\" (UID: \"b6a7ce10-9d2f-4625-8dd2-b969ffd82644\") " pod="calico-system/calico-node-j42nm"
Sep 5 23:59:42.027828 kubelet[2774]: I0905 23:59:42.027483 2774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/b6a7ce10-9d2f-4625-8dd2-b969ffd82644-node-certs\") pod \"calico-node-j42nm\" (UID: \"b6a7ce10-9d2f-4625-8dd2-b969ffd82644\") " pod="calico-system/calico-node-j42nm"
Sep 5 23:59:42.027828 kubelet[2774]: I0905 23:59:42.027501 2774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/b6a7ce10-9d2f-4625-8dd2-b969ffd82644-flexvol-driver-host\") pod \"calico-node-j42nm\" (UID: \"b6a7ce10-9d2f-4625-8dd2-b969ffd82644\") " pod="calico-system/calico-node-j42nm"
Sep 5 23:59:42.137678 kubelet[2774]: E0905 23:59:42.136667 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 5 23:59:42.137678 kubelet[2774]: W0905 23:59:42.136690 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 5 23:59:42.137678 kubelet[2774]: E0905 23:59:42.136725 2774 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 5 23:59:42.226664 kubelet[2774]: E0905 23:59:42.224354 2774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-ps2ks" podUID="4ff97107-cfe4-4d25-918b-36fd4176bf0c"
Sep 5 23:59:42.230776 kubelet[2774]: E0905 23:59:42.230739 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 5 23:59:42.230776 kubelet[2774]: W0905 23:59:42.230766 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 5 23:59:42.231002 kubelet[2774]: E0905 23:59:42.230807 2774 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 5 23:59:42.232406 kubelet[2774]: E0905 23:59:42.232380 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 5 23:59:42.232406 kubelet[2774]: W0905 23:59:42.232401 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 5 23:59:42.232608 kubelet[2774]: E0905 23:59:42.232483 2774 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 5 23:59:42.232886 kubelet[2774]: E0905 23:59:42.232866 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 5 23:59:42.232886 kubelet[2774]: W0905 23:59:42.232883 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 5 23:59:42.232973 kubelet[2774]: E0905 23:59:42.232901 2774 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 5 23:59:42.233322 kubelet[2774]: E0905 23:59:42.233302 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 5 23:59:42.233772 kubelet[2774]: W0905 23:59:42.233323 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 5 23:59:42.233772 kubelet[2774]: E0905 23:59:42.233341 2774 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 5 23:59:42.233772 kubelet[2774]: E0905 23:59:42.233599 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 5 23:59:42.233772 kubelet[2774]: W0905 23:59:42.233610 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 5 23:59:42.233772 kubelet[2774]: E0905 23:59:42.233650 2774 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 5 23:59:42.234637 kubelet[2774]: E0905 23:59:42.233875 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 5 23:59:42.234637 kubelet[2774]: W0905 23:59:42.233885 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 5 23:59:42.234637 kubelet[2774]: E0905 23:59:42.234021 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 5 23:59:42.234637 kubelet[2774]: W0905 23:59:42.234029 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 5 23:59:42.234637 kubelet[2774]: E0905 23:59:42.234038 2774 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 5 23:59:42.234637 kubelet[2774]: E0905 23:59:42.234521 2774 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 5 23:59:42.236227 kubelet[2774]: E0905 23:59:42.234925 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 5 23:59:42.236227 kubelet[2774]: W0905 23:59:42.234948 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 5 23:59:42.236227 kubelet[2774]: E0905 23:59:42.234968 2774 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 5 23:59:42.236227 kubelet[2774]: E0905 23:59:42.235214 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 5 23:59:42.236227 kubelet[2774]: W0905 23:59:42.235223 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 5 23:59:42.236227 kubelet[2774]: E0905 23:59:42.235233 2774 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 5 23:59:42.236227 kubelet[2774]: E0905 23:59:42.235365 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 5 23:59:42.236227 kubelet[2774]: W0905 23:59:42.235373 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 5 23:59:42.236227 kubelet[2774]: E0905 23:59:42.235381 2774 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 5 23:59:42.236227 kubelet[2774]: E0905 23:59:42.235982 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 5 23:59:42.237253 kubelet[2774]: W0905 23:59:42.235994 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 5 23:59:42.237253 kubelet[2774]: E0905 23:59:42.236007 2774 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 5 23:59:42.237253 kubelet[2774]: E0905 23:59:42.236147 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 5 23:59:42.237253 kubelet[2774]: W0905 23:59:42.236154 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 5 23:59:42.237253 kubelet[2774]: E0905 23:59:42.236163 2774 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 5 23:59:42.237253 kubelet[2774]: E0905 23:59:42.236836 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 5 23:59:42.237253 kubelet[2774]: W0905 23:59:42.236850 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 5 23:59:42.237253 kubelet[2774]: E0905 23:59:42.236863 2774 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 5 23:59:42.240914 kubelet[2774]: E0905 23:59:42.237518 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 5 23:59:42.240914 kubelet[2774]: W0905 23:59:42.237533 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 5 23:59:42.240914 kubelet[2774]: E0905 23:59:42.237546 2774 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 5 23:59:42.240914 kubelet[2774]: E0905 23:59:42.238779 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 5 23:59:42.240914 kubelet[2774]: W0905 23:59:42.238833 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 5 23:59:42.240914 kubelet[2774]: E0905 23:59:42.238851 2774 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 5 23:59:42.240914 kubelet[2774]: E0905 23:59:42.239002 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 5 23:59:42.240914 kubelet[2774]: W0905 23:59:42.239009 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 5 23:59:42.240914 kubelet[2774]: E0905 23:59:42.239017 2774 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 5 23:59:42.240914 kubelet[2774]: E0905 23:59:42.239146 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 5 23:59:42.241132 kubelet[2774]: W0905 23:59:42.239153 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 5 23:59:42.241132 kubelet[2774]: E0905 23:59:42.239160 2774 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 5 23:59:42.241132 kubelet[2774]: E0905 23:59:42.239273 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 5 23:59:42.241132 kubelet[2774]: W0905 23:59:42.239280 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 5 23:59:42.241132 kubelet[2774]: E0905 23:59:42.239289 2774 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 5 23:59:42.241132 kubelet[2774]: E0905 23:59:42.239741 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 5 23:59:42.241132 kubelet[2774]: W0905 23:59:42.239754 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 5 23:59:42.241132 kubelet[2774]: E0905 23:59:42.239766 2774 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 5 23:59:42.241132 kubelet[2774]: E0905 23:59:42.240498 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 5 23:59:42.241132 kubelet[2774]: W0905 23:59:42.240511 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 5 23:59:42.241330 kubelet[2774]: E0905 23:59:42.240522 2774 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 5 23:59:42.241330 kubelet[2774]: E0905 23:59:42.241204 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 5 23:59:42.241330 kubelet[2774]: W0905 23:59:42.241226 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 5 23:59:42.241330 kubelet[2774]: E0905 23:59:42.241239 2774 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 5 23:59:42.241589 kubelet[2774]: E0905 23:59:42.241421 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 5 23:59:42.241589 kubelet[2774]: W0905 23:59:42.241442 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 5 23:59:42.241589 kubelet[2774]: E0905 23:59:42.241453 2774 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 5 23:59:42.242531 kubelet[2774]: E0905 23:59:42.242492 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 5 23:59:42.242531 kubelet[2774]: W0905 23:59:42.242515 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 5 23:59:42.242531 kubelet[2774]: E0905 23:59:42.242530 2774 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 5 23:59:42.242719 kubelet[2774]: E0905 23:59:42.242700 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 5 23:59:42.242719 kubelet[2774]: W0905 23:59:42.242715 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 5 23:59:42.243237 kubelet[2774]: E0905 23:59:42.242853 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 5 23:59:42.243237 kubelet[2774]: W0905 23:59:42.242860 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 5 23:59:42.243237 kubelet[2774]: E0905 23:59:42.242869 2774 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 5 23:59:42.243237 kubelet[2774]: E0905 23:59:42.242892 2774 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 5 23:59:42.344418 kubelet[2774]: E0905 23:59:42.344337 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 5 23:59:42.344418 kubelet[2774]: W0905 23:59:42.344362 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 5 23:59:42.344418 kubelet[2774]: E0905 23:59:42.344389 2774 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 5 23:59:42.347236 kubelet[2774]: E0905 23:59:42.346748 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 5 23:59:42.347236 kubelet[2774]: W0905 23:59:42.346780 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 5 23:59:42.347236 kubelet[2774]: E0905 23:59:42.346830 2774 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 5 23:59:42.347236 kubelet[2774]: E0905 23:59:42.347196 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 5 23:59:42.347236 kubelet[2774]: W0905 23:59:42.347211 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 5 23:59:42.347236 kubelet[2774]: E0905 23:59:42.347228 2774 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 5 23:59:42.348529 kubelet[2774]: I0905 23:59:42.347266 2774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4ff97107-cfe4-4d25-918b-36fd4176bf0c-kubelet-dir\") pod \"csi-node-driver-ps2ks\" (UID: \"4ff97107-cfe4-4d25-918b-36fd4176bf0c\") " pod="calico-system/csi-node-driver-ps2ks"
Sep 5 23:59:42.348529 kubelet[2774]: E0905 23:59:42.347578 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 5 23:59:42.348529 kubelet[2774]: W0905 23:59:42.347594 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 5 23:59:42.348529 kubelet[2774]: E0905 23:59:42.347707 2774 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 5 23:59:42.348529 kubelet[2774]: I0905 23:59:42.347952 2774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/4ff97107-cfe4-4d25-918b-36fd4176bf0c-socket-dir\") pod \"csi-node-driver-ps2ks\" (UID: \"4ff97107-cfe4-4d25-918b-36fd4176bf0c\") " pod="calico-system/csi-node-driver-ps2ks"
Sep 5 23:59:42.348529 kubelet[2774]: E0905 23:59:42.348377 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 5 23:59:42.348529 kubelet[2774]: W0905 23:59:42.348390 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 5 23:59:42.348529 kubelet[2774]: E0905 23:59:42.348408 2774 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 5 23:59:42.351227 kubelet[2774]: E0905 23:59:42.348747 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 5 23:59:42.351227 kubelet[2774]: W0905 23:59:42.348803 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 5 23:59:42.351227 kubelet[2774]: E0905 23:59:42.348865 2774 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 5 23:59:42.351227 kubelet[2774]: E0905 23:59:42.349237 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 5 23:59:42.351227 kubelet[2774]: W0905 23:59:42.349250 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 5 23:59:42.351227 kubelet[2774]: E0905 23:59:42.349276 2774 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 5 23:59:42.351227 kubelet[2774]: I0905 23:59:42.349296 2774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/4ff97107-cfe4-4d25-918b-36fd4176bf0c-varrun\") pod \"csi-node-driver-ps2ks\" (UID: \"4ff97107-cfe4-4d25-918b-36fd4176bf0c\") " pod="calico-system/csi-node-driver-ps2ks"
Sep 5 23:59:42.351227 kubelet[2774]: E0905 23:59:42.349682 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 5 23:59:42.351227 kubelet[2774]: W0905 23:59:42.349697 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 5 23:59:42.353949 kubelet[2774]: E0905 23:59:42.349748 2774 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 5 23:59:42.353949 kubelet[2774]: I0905 23:59:42.349769 2774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4vz6g\" (UniqueName: \"kubernetes.io/projected/4ff97107-cfe4-4d25-918b-36fd4176bf0c-kube-api-access-4vz6g\") pod \"csi-node-driver-ps2ks\" (UID: \"4ff97107-cfe4-4d25-918b-36fd4176bf0c\") " pod="calico-system/csi-node-driver-ps2ks"
Sep 5 23:59:42.353949 kubelet[2774]: E0905 23:59:42.350193 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 5 23:59:42.353949 kubelet[2774]: W0905 23:59:42.350206 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 5 23:59:42.353949 kubelet[2774]: E0905 23:59:42.350253 2774 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 5 23:59:42.353949 kubelet[2774]: E0905 23:59:42.350577 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 5 23:59:42.353949 kubelet[2774]: W0905 23:59:42.350589 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 5 23:59:42.353949 kubelet[2774]: E0905 23:59:42.350604 2774 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 5 23:59:42.353949 kubelet[2774]: E0905 23:59:42.350926 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 5 23:59:42.354612 kubelet[2774]: W0905 23:59:42.350938 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 5 23:59:42.354612 kubelet[2774]: E0905 23:59:42.350966 2774 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 5 23:59:42.354612 kubelet[2774]: I0905 23:59:42.350989 2774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/4ff97107-cfe4-4d25-918b-36fd4176bf0c-registration-dir\") pod \"csi-node-driver-ps2ks\" (UID: \"4ff97107-cfe4-4d25-918b-36fd4176bf0c\") " pod="calico-system/csi-node-driver-ps2ks"
Sep 5 23:59:42.354612 kubelet[2774]: E0905 23:59:42.351289 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 5 23:59:42.354612 kubelet[2774]: W0905 23:59:42.351303 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 5 23:59:42.354612 kubelet[2774]: E0905 23:59:42.351412 2774 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 5 23:59:42.354612 kubelet[2774]: E0905 23:59:42.351724 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 5 23:59:42.354612 kubelet[2774]: W0905 23:59:42.351735 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 5 23:59:42.354612 kubelet[2774]: E0905 23:59:42.351747 2774 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 5 23:59:42.354833 kubelet[2774]: E0905 23:59:42.352203 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 5 23:59:42.354833 kubelet[2774]: W0905 23:59:42.352217 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 5 23:59:42.354833 kubelet[2774]: E0905 23:59:42.352233 2774 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 5 23:59:42.354833 kubelet[2774]: E0905 23:59:42.352516 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 5 23:59:42.354833 kubelet[2774]: W0905 23:59:42.352528 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 5 23:59:42.354833 kubelet[2774]: E0905 23:59:42.352567 2774 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 5 23:59:42.354833 kubelet[2774]: E0905 23:59:42.352967 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 5 23:59:42.354833 kubelet[2774]: W0905 23:59:42.352978 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 5 23:59:42.354833 kubelet[2774]: E0905 23:59:42.353026 2774 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 5 23:59:42.354833 kubelet[2774]: E0905 23:59:42.353385 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 5 23:59:42.355032 kubelet[2774]: W0905 23:59:42.353397 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 5 23:59:42.355032 kubelet[2774]: E0905 23:59:42.353415 2774 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 5 23:59:42.355032 kubelet[2774]: E0905 23:59:42.353658 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 5 23:59:42.355032 kubelet[2774]: W0905 23:59:42.353668 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 5 23:59:42.355032 kubelet[2774]: E0905 23:59:42.353678 2774 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 5 23:59:42.355032 kubelet[2774]: E0905 23:59:42.354028 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 5 23:59:42.355032 kubelet[2774]: W0905 23:59:42.354040 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 5 23:59:42.355032 kubelet[2774]: E0905 23:59:42.354051 2774 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 5 23:59:42.355032 kubelet[2774]: E0905 23:59:42.354338 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 5 23:59:42.355032 kubelet[2774]: W0905 23:59:42.354364 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 5 23:59:42.356807 kubelet[2774]: E0905 23:59:42.354376 2774 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 5 23:59:42.456076 kubelet[2774]: E0905 23:59:42.455581 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 5 23:59:42.456076 kubelet[2774]: W0905 23:59:42.455639 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 5 23:59:42.456076 kubelet[2774]: E0905 23:59:42.455673 2774 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 5 23:59:42.461104 kubelet[2774]: E0905 23:59:42.460880 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 5 23:59:42.461104 kubelet[2774]: W0905 23:59:42.460918 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 5 23:59:42.461104 kubelet[2774]: E0905 23:59:42.460966 2774 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 5 23:59:42.462556 kubelet[2774]: E0905 23:59:42.461913 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 23:59:42.462556 kubelet[2774]: W0905 23:59:42.461943 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 23:59:42.462556 kubelet[2774]: E0905 23:59:42.462010 2774 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 23:59:42.462556 kubelet[2774]: E0905 23:59:42.462178 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 23:59:42.462556 kubelet[2774]: W0905 23:59:42.462187 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 23:59:42.463177 kubelet[2774]: E0905 23:59:42.462885 2774 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 5 23:59:42.465836 kubelet[2774]: E0905 23:59:42.463364 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 23:59:42.465836 kubelet[2774]: W0905 23:59:42.463379 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 23:59:42.465836 kubelet[2774]: E0905 23:59:42.463520 2774 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 23:59:42.466511 kubelet[2774]: E0905 23:59:42.466321 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 23:59:42.466511 kubelet[2774]: W0905 23:59:42.466342 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 23:59:42.466511 kubelet[2774]: E0905 23:59:42.466454 2774 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 5 23:59:42.467069 kubelet[2774]: E0905 23:59:42.466929 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 23:59:42.467069 kubelet[2774]: W0905 23:59:42.466942 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 23:59:42.467069 kubelet[2774]: E0905 23:59:42.466968 2774 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 23:59:42.468333 kubelet[2774]: E0905 23:59:42.467853 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 23:59:42.468333 kubelet[2774]: W0905 23:59:42.467868 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 23:59:42.468333 kubelet[2774]: E0905 23:59:42.467886 2774 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 5 23:59:42.470503 kubelet[2774]: E0905 23:59:42.469918 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 23:59:42.470503 kubelet[2774]: W0905 23:59:42.469937 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 23:59:42.470503 kubelet[2774]: E0905 23:59:42.470410 2774 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 23:59:42.472589 kubelet[2774]: E0905 23:59:42.471382 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 23:59:42.472589 kubelet[2774]: W0905 23:59:42.471398 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 23:59:42.472589 kubelet[2774]: E0905 23:59:42.471418 2774 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 5 23:59:42.474136 kubelet[2774]: E0905 23:59:42.473044 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 23:59:42.474136 kubelet[2774]: W0905 23:59:42.473064 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 23:59:42.474136 kubelet[2774]: E0905 23:59:42.473081 2774 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 23:59:42.475901 kubelet[2774]: E0905 23:59:42.475665 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 23:59:42.475901 kubelet[2774]: W0905 23:59:42.475685 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 23:59:42.475901 kubelet[2774]: E0905 23:59:42.475746 2774 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 5 23:59:42.476608 kubelet[2774]: E0905 23:59:42.476507 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 23:59:42.476608 kubelet[2774]: W0905 23:59:42.476528 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 23:59:42.476608 kubelet[2774]: E0905 23:59:42.476550 2774 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 23:59:42.478123 kubelet[2774]: E0905 23:59:42.478098 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 23:59:42.478123 kubelet[2774]: W0905 23:59:42.478120 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 23:59:42.478349 kubelet[2774]: E0905 23:59:42.478201 2774 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 5 23:59:42.479612 kubelet[2774]: E0905 23:59:42.479580 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 23:59:42.479612 kubelet[2774]: W0905 23:59:42.479607 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 23:59:42.480149 kubelet[2774]: E0905 23:59:42.479987 2774 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 23:59:42.481977 kubelet[2774]: E0905 23:59:42.481731 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 23:59:42.481977 kubelet[2774]: W0905 23:59:42.481750 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 23:59:42.482206 kubelet[2774]: E0905 23:59:42.482188 2774 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 5 23:59:42.482520 kubelet[2774]: E0905 23:59:42.482405 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 23:59:42.482520 kubelet[2774]: W0905 23:59:42.482422 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 23:59:42.482957 kubelet[2774]: E0905 23:59:42.482885 2774 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 23:59:42.483279 kubelet[2774]: E0905 23:59:42.483248 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 23:59:42.483279 kubelet[2774]: W0905 23:59:42.483262 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 23:59:42.483507 kubelet[2774]: E0905 23:59:42.483406 2774 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 5 23:59:42.483739 kubelet[2774]: E0905 23:59:42.483728 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 23:59:42.483885 kubelet[2774]: W0905 23:59:42.483825 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 23:59:42.484161 kubelet[2774]: E0905 23:59:42.484062 2774 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 23:59:42.484858 kubelet[2774]: E0905 23:59:42.484801 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 23:59:42.484858 kubelet[2774]: W0905 23:59:42.484837 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 23:59:42.485118 kubelet[2774]: E0905 23:59:42.485019 2774 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 5 23:59:42.485950 kubelet[2774]: E0905 23:59:42.485892 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 23:59:42.485950 kubelet[2774]: W0905 23:59:42.485929 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 23:59:42.486312 kubelet[2774]: E0905 23:59:42.486088 2774 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 23:59:42.486538 kubelet[2774]: E0905 23:59:42.486526 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 23:59:42.486538 kubelet[2774]: W0905 23:59:42.486569 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 23:59:42.486972 kubelet[2774]: E0905 23:59:42.486816 2774 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 5 23:59:42.487187 kubelet[2774]: E0905 23:59:42.487097 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 23:59:42.487187 kubelet[2774]: W0905 23:59:42.487109 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 23:59:42.487330 kubelet[2774]: E0905 23:59:42.487299 2774 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 23:59:42.487617 kubelet[2774]: E0905 23:59:42.487591 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 23:59:42.487739 kubelet[2774]: W0905 23:59:42.487681 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 23:59:42.487953 kubelet[2774]: E0905 23:59:42.487776 2774 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 5 23:59:42.488149 kubelet[2774]: E0905 23:59:42.488138 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 23:59:42.488209 kubelet[2774]: W0905 23:59:42.488199 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 23:59:42.488367 kubelet[2774]: E0905 23:59:42.488268 2774 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 23:59:42.488656 kubelet[2774]: E0905 23:59:42.488571 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 23:59:42.488656 kubelet[2774]: W0905 23:59:42.488605 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 23:59:42.488875 kubelet[2774]: E0905 23:59:42.488760 2774 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 5 23:59:42.489247 kubelet[2774]: E0905 23:59:42.489109 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 23:59:42.489247 kubelet[2774]: W0905 23:59:42.489121 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 23:59:42.489247 kubelet[2774]: E0905 23:59:42.489188 2774 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 23:59:42.489647 kubelet[2774]: E0905 23:59:42.489493 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 23:59:42.489647 kubelet[2774]: W0905 23:59:42.489505 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 23:59:42.489647 kubelet[2774]: E0905 23:59:42.489592 2774 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 5 23:59:42.490192 kubelet[2774]: E0905 23:59:42.489878 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 23:59:42.490192 kubelet[2774]: W0905 23:59:42.489890 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 23:59:42.490192 kubelet[2774]: E0905 23:59:42.489903 2774 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 23:59:42.490446 kubelet[2774]: E0905 23:59:42.490434 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 23:59:42.490511 kubelet[2774]: W0905 23:59:42.490500 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 23:59:42.490584 kubelet[2774]: E0905 23:59:42.490556 2774 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 5 23:59:42.584915 kubelet[2774]: E0905 23:59:42.584521 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 23:59:42.584915 kubelet[2774]: W0905 23:59:42.584552 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 23:59:42.584915 kubelet[2774]: E0905 23:59:42.584577 2774 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 23:59:42.586247 kubelet[2774]: E0905 23:59:42.585896 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 23:59:42.586247 kubelet[2774]: W0905 23:59:42.585921 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 23:59:42.586247 kubelet[2774]: E0905 23:59:42.585942 2774 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 5 23:59:42.586905 kubelet[2774]: E0905 23:59:42.586691 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 23:59:42.586905 kubelet[2774]: W0905 23:59:42.586710 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 23:59:42.586905 kubelet[2774]: E0905 23:59:42.586726 2774 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 23:59:42.587193 kubelet[2774]: E0905 23:59:42.587175 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 23:59:42.588009 kubelet[2774]: W0905 23:59:42.587985 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 23:59:42.588267 kubelet[2774]: E0905 23:59:42.588121 2774 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 5 23:59:42.588722 kubelet[2774]: E0905 23:59:42.588398 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 23:59:42.588722 kubelet[2774]: W0905 23:59:42.588412 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 23:59:42.588722 kubelet[2774]: E0905 23:59:42.588422 2774 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 23:59:42.589085 kubelet[2774]: E0905 23:59:42.589001 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 23:59:42.589190 kubelet[2774]: W0905 23:59:42.589166 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 23:59:42.589247 kubelet[2774]: E0905 23:59:42.589235 2774 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 5 23:59:42.690311 kubelet[2774]: E0905 23:59:42.690283 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 23:59:42.690759 kubelet[2774]: W0905 23:59:42.690468 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 23:59:42.690759 kubelet[2774]: E0905 23:59:42.690513 2774 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 23:59:42.691147 kubelet[2774]: E0905 23:59:42.691033 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 23:59:42.691147 kubelet[2774]: W0905 23:59:42.691047 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 23:59:42.691147 kubelet[2774]: E0905 23:59:42.691060 2774 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 5 23:59:42.691345 kubelet[2774]: E0905 23:59:42.691328 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 23:59:42.691526 kubelet[2774]: W0905 23:59:42.691392 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 23:59:42.691526 kubelet[2774]: E0905 23:59:42.691406 2774 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 23:59:42.691721 kubelet[2774]: E0905 23:59:42.691708 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 23:59:42.692055 kubelet[2774]: W0905 23:59:42.691925 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 23:59:42.692055 kubelet[2774]: E0905 23:59:42.691943 2774 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 5 23:59:42.692312 kubelet[2774]: E0905 23:59:42.692300 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 23:59:42.692491 kubelet[2774]: W0905 23:59:42.692383 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 23:59:42.692491 kubelet[2774]: E0905 23:59:42.692402 2774 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 23:59:42.692762 kubelet[2774]: E0905 23:59:42.692694 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 23:59:42.692762 kubelet[2774]: W0905 23:59:42.692705 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 23:59:42.692762 kubelet[2774]: E0905 23:59:42.692715 2774 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input"
Sep 5 23:59:42.793983 kubelet[2774]: E0905 23:59:42.793865 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 5 23:59:42.793983 kubelet[2774]: W0905 23:59:42.793900 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 5 23:59:42.793983 kubelet[2774]: E0905 23:59:42.793929 2774 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 5 23:59:42.795693 kubelet[2774]: E0905 23:59:42.794237 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 5 23:59:42.795693 kubelet[2774]: W0905 23:59:42.794259 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 5 23:59:42.795693 kubelet[2774]: E0905 23:59:42.794276 2774 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 5 23:59:42.795693 kubelet[2774]: E0905 23:59:42.794508 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 5 23:59:42.795693 kubelet[2774]: W0905 23:59:42.794520 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 5 23:59:42.795693 kubelet[2774]: E0905 23:59:42.794534 2774 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 5 23:59:42.795693 kubelet[2774]: E0905 23:59:42.794826 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 5 23:59:42.795693 kubelet[2774]: W0905 23:59:42.794840 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 5 23:59:42.795693 kubelet[2774]: E0905 23:59:42.794856 2774 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 5 23:59:42.795693 kubelet[2774]: E0905 23:59:42.795104 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 5 23:59:42.796702 kubelet[2774]: W0905 23:59:42.795116 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 5 23:59:42.796702 kubelet[2774]: E0905 23:59:42.795130 2774 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 5 23:59:42.796702 kubelet[2774]: E0905 23:59:42.795366 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 5 23:59:42.796702 kubelet[2774]: W0905 23:59:42.795378 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 5 23:59:42.796702 kubelet[2774]: E0905 23:59:42.795391 2774 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 5 23:59:42.897255 kubelet[2774]: E0905 23:59:42.896976 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 5 23:59:42.897255 kubelet[2774]: W0905 23:59:42.897004 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 5 23:59:42.897255 kubelet[2774]: E0905 23:59:42.897071 2774 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 5 23:59:42.898574 kubelet[2774]: E0905 23:59:42.897436 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 5 23:59:42.898574 kubelet[2774]: W0905 23:59:42.897448 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 5 23:59:42.898574 kubelet[2774]: E0905 23:59:42.897462 2774 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 5 23:59:42.898574 kubelet[2774]: E0905 23:59:42.897748 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 5 23:59:42.898574 kubelet[2774]: W0905 23:59:42.897760 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 5 23:59:42.898574 kubelet[2774]: E0905 23:59:42.897773 2774 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 5 23:59:42.898574 kubelet[2774]: E0905 23:59:42.898204 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 5 23:59:42.898574 kubelet[2774]: W0905 23:59:42.898221 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 5 23:59:42.898574 kubelet[2774]: E0905 23:59:42.898234 2774 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 5 23:59:42.899233 kubelet[2774]: E0905 23:59:42.899102 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 5 23:59:42.899233 kubelet[2774]: W0905 23:59:42.899116 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 5 23:59:42.899233 kubelet[2774]: E0905 23:59:42.899130 2774 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 5 23:59:42.899489 kubelet[2774]: E0905 23:59:42.899360 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 5 23:59:42.899489 kubelet[2774]: W0905 23:59:42.899370 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 5 23:59:42.899489 kubelet[2774]: E0905 23:59:42.899381 2774 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 5 23:59:42.928240 kubelet[2774]: E0905 23:59:42.927880 2774 secret.go:189] Couldn't get secret calico-system/typha-certs: failed to sync secret cache: timed out waiting for the condition
Sep 5 23:59:42.928240 kubelet[2774]: E0905 23:59:42.927976 2774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/465a82c7-673e-4948-ab39-de8c30afe602-typha-certs podName:465a82c7-673e-4948-ab39-de8c30afe602 nodeName:}" failed. No retries permitted until 2025-09-05 23:59:43.427950495 +0000 UTC m=+24.991574686 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "typha-certs" (UniqueName: "kubernetes.io/secret/465a82c7-673e-4948-ab39-de8c30afe602-typha-certs") pod "calico-typha-8645b78b65-fhvz7" (UID: "465a82c7-673e-4948-ab39-de8c30afe602") : failed to sync secret cache: timed out waiting for the condition
Sep 5 23:59:42.928490 kubelet[2774]: E0905 23:59:42.928472 2774 configmap.go:193] Couldn't get configMap calico-system/tigera-ca-bundle: failed to sync configmap cache: timed out waiting for the condition
Sep 5 23:59:42.928605 kubelet[2774]: E0905 23:59:42.928594 2774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/465a82c7-673e-4948-ab39-de8c30afe602-tigera-ca-bundle podName:465a82c7-673e-4948-ab39-de8c30afe602 nodeName:}" failed. No retries permitted until 2025-09-05 23:59:43.428580629 +0000 UTC m=+24.992204860 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tigera-ca-bundle" (UniqueName: "kubernetes.io/configmap/465a82c7-673e-4948-ab39-de8c30afe602-tigera-ca-bundle") pod "calico-typha-8645b78b65-fhvz7" (UID: "465a82c7-673e-4948-ab39-de8c30afe602") : failed to sync configmap cache: timed out waiting for the condition
Sep 5 23:59:42.943245 kubelet[2774]: E0905 23:59:42.942890 2774 projected.go:288] Couldn't get configMap calico-system/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition
Sep 5 23:59:42.944917 kubelet[2774]: E0905 23:59:42.943709 2774 projected.go:194] Error preparing data for projected volume kube-api-access-gl8kl for pod calico-system/calico-typha-8645b78b65-fhvz7: failed to sync configmap cache: timed out waiting for the condition
Sep 5 23:59:42.944917 kubelet[2774]: E0905 23:59:42.943787 2774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/465a82c7-673e-4948-ab39-de8c30afe602-kube-api-access-gl8kl podName:465a82c7-673e-4948-ab39-de8c30afe602 nodeName:}" failed. No retries permitted until 2025-09-05 23:59:43.443767084 +0000 UTC m=+25.007391315 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-gl8kl" (UniqueName: "kubernetes.io/projected/465a82c7-673e-4948-ab39-de8c30afe602-kube-api-access-gl8kl") pod "calico-typha-8645b78b65-fhvz7" (UID: "465a82c7-673e-4948-ab39-de8c30afe602") : failed to sync configmap cache: timed out waiting for the condition
Sep 5 23:59:43.000592 kubelet[2774]: E0905 23:59:43.000399 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 5 23:59:43.000592 kubelet[2774]: W0905 23:59:43.000429 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 5 23:59:43.000592 kubelet[2774]: E0905 23:59:43.000456 2774 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 5 23:59:43.001113 kubelet[2774]: E0905 23:59:43.000736 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 5 23:59:43.001113 kubelet[2774]: W0905 23:59:43.000748 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 5 23:59:43.001113 kubelet[2774]: E0905 23:59:43.000762 2774 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 5 23:59:43.001509 kubelet[2774]: E0905 23:59:43.001341 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 5 23:59:43.001509 kubelet[2774]: W0905 23:59:43.001360 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 5 23:59:43.001509 kubelet[2774]: E0905 23:59:43.001377 2774 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 5 23:59:43.001771 kubelet[2774]: E0905 23:59:43.001755 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 5 23:59:43.002012 kubelet[2774]: W0905 23:59:43.001861 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 5 23:59:43.002012 kubelet[2774]: E0905 23:59:43.001883 2774 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 5 23:59:43.002302 kubelet[2774]: E0905 23:59:43.002286 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 5 23:59:43.002516 kubelet[2774]: W0905 23:59:43.002376 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 5 23:59:43.002516 kubelet[2774]: E0905 23:59:43.002397 2774 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 5 23:59:43.002716 kubelet[2774]: E0905 23:59:43.002701 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 5 23:59:43.002788 kubelet[2774]: W0905 23:59:43.002775 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 5 23:59:43.002888 kubelet[2774]: E0905 23:59:43.002874 2774 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 5 23:59:43.103990 kubelet[2774]: E0905 23:59:43.103949 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 5 23:59:43.103990 kubelet[2774]: W0905 23:59:43.103981 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 5 23:59:43.104174 kubelet[2774]: E0905 23:59:43.104011 2774 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 5 23:59:43.104287 kubelet[2774]: E0905 23:59:43.104270 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 5 23:59:43.104287 kubelet[2774]: W0905 23:59:43.104287 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 5 23:59:43.104373 kubelet[2774]: E0905 23:59:43.104302 2774 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 5 23:59:43.104540 kubelet[2774]: E0905 23:59:43.104524 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 5 23:59:43.104540 kubelet[2774]: W0905 23:59:43.104540 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 5 23:59:43.104615 kubelet[2774]: E0905 23:59:43.104553 2774 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 5 23:59:43.104857 kubelet[2774]: E0905 23:59:43.104838 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 5 23:59:43.104905 kubelet[2774]: W0905 23:59:43.104859 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 5 23:59:43.104905 kubelet[2774]: E0905 23:59:43.104874 2774 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 5 23:59:43.105119 kubelet[2774]: E0905 23:59:43.105105 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 5 23:59:43.105160 kubelet[2774]: W0905 23:59:43.105120 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 5 23:59:43.105160 kubelet[2774]: E0905 23:59:43.105135 2774 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 5 23:59:43.105358 kubelet[2774]: E0905 23:59:43.105344 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 5 23:59:43.105395 kubelet[2774]: W0905 23:59:43.105359 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 5 23:59:43.105395 kubelet[2774]: E0905 23:59:43.105372 2774 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 5 23:59:43.129049 kubelet[2774]: E0905 23:59:43.128961 2774 configmap.go:193] Couldn't get configMap calico-system/tigera-ca-bundle: failed to sync configmap cache: timed out waiting for the condition
Sep 5 23:59:43.129236 kubelet[2774]: E0905 23:59:43.129070 2774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b6a7ce10-9d2f-4625-8dd2-b969ffd82644-tigera-ca-bundle podName:b6a7ce10-9d2f-4625-8dd2-b969ffd82644 nodeName:}" failed. No retries permitted until 2025-09-05 23:59:43.629044736 +0000 UTC m=+25.192668967 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tigera-ca-bundle" (UniqueName: "kubernetes.io/configmap/b6a7ce10-9d2f-4625-8dd2-b969ffd82644-tigera-ca-bundle") pod "calico-node-j42nm" (UID: "b6a7ce10-9d2f-4625-8dd2-b969ffd82644") : failed to sync configmap cache: timed out waiting for the condition
Sep 5 23:59:43.143184 kubelet[2774]: E0905 23:59:43.143116 2774 projected.go:288] Couldn't get configMap calico-system/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition
Sep 5 23:59:43.143184 kubelet[2774]: E0905 23:59:43.143180 2774 projected.go:194] Error preparing data for projected volume kube-api-access-x6qxc for pod calico-system/calico-node-j42nm: failed to sync configmap cache: timed out waiting for the condition
Sep 5 23:59:43.143825 kubelet[2774]: E0905 23:59:43.143772 2774 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b6a7ce10-9d2f-4625-8dd2-b969ffd82644-kube-api-access-x6qxc podName:b6a7ce10-9d2f-4625-8dd2-b969ffd82644 nodeName:}" failed. No retries permitted until 2025-09-05 23:59:43.643274201 +0000 UTC m=+25.206898512 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-x6qxc" (UniqueName: "kubernetes.io/projected/b6a7ce10-9d2f-4625-8dd2-b969ffd82644-kube-api-access-x6qxc") pod "calico-node-j42nm" (UID: "b6a7ce10-9d2f-4625-8dd2-b969ffd82644") : failed to sync configmap cache: timed out waiting for the condition
Sep 5 23:59:43.207650 kubelet[2774]: E0905 23:59:43.206136 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 5 23:59:43.207650 kubelet[2774]: W0905 23:59:43.206173 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 5 23:59:43.207650 kubelet[2774]: E0905 23:59:43.206203 2774 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 5 23:59:43.207650 kubelet[2774]: E0905 23:59:43.206451 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 5 23:59:43.207650 kubelet[2774]: W0905 23:59:43.206464 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 5 23:59:43.207650 kubelet[2774]: E0905 23:59:43.206480 2774 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 5 23:59:43.207650 kubelet[2774]: E0905 23:59:43.206752 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 5 23:59:43.207650 kubelet[2774]: W0905 23:59:43.206765 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 5 23:59:43.207650 kubelet[2774]: E0905 23:59:43.206780 2774 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 5 23:59:43.207650 kubelet[2774]: E0905 23:59:43.207033 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 5 23:59:43.208084 kubelet[2774]: W0905 23:59:43.207046 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 5 23:59:43.208084 kubelet[2774]: E0905 23:59:43.207059 2774 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 5 23:59:43.208084 kubelet[2774]: E0905 23:59:43.207312 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 5 23:59:43.208084 kubelet[2774]: W0905 23:59:43.207330 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 5 23:59:43.208084 kubelet[2774]: E0905 23:59:43.207344 2774 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 5 23:59:43.208084 kubelet[2774]: E0905 23:59:43.207575 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 5 23:59:43.208084 kubelet[2774]: W0905 23:59:43.207587 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 5 23:59:43.208084 kubelet[2774]: E0905 23:59:43.207600 2774 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 5 23:59:43.215076 kubelet[2774]: E0905 23:59:43.215038 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 5 23:59:43.215305 kubelet[2774]: W0905 23:59:43.215278 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 5 23:59:43.215369 kubelet[2774]: E0905 23:59:43.215321 2774 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 5 23:59:43.308471 kubelet[2774]: E0905 23:59:43.308432 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 5 23:59:43.308471 kubelet[2774]: W0905 23:59:43.308456 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 5 23:59:43.308471 kubelet[2774]: E0905 23:59:43.308480 2774 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 5 23:59:43.310769 kubelet[2774]: E0905 23:59:43.310732 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 5 23:59:43.310769 kubelet[2774]: W0905 23:59:43.310758 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 5 23:59:43.310769 kubelet[2774]: E0905 23:59:43.310782 2774 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 5 23:59:43.311138 kubelet[2774]: E0905 23:59:43.311118 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 5 23:59:43.311138 kubelet[2774]: W0905 23:59:43.311134 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 5 23:59:43.311233 kubelet[2774]: E0905 23:59:43.311146 2774 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 5 23:59:43.311413 kubelet[2774]: E0905 23:59:43.311386 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 5 23:59:43.311413 kubelet[2774]: W0905 23:59:43.311408 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 5 23:59:43.311503 kubelet[2774]: E0905 23:59:43.311440 2774 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 5 23:59:43.311792 kubelet[2774]: E0905 23:59:43.311762 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 5 23:59:43.311900 kubelet[2774]: W0905 23:59:43.311817 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 5 23:59:43.311900 kubelet[2774]: E0905 23:59:43.311832 2774 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 5 23:59:43.414108 kubelet[2774]: E0905 23:59:43.412903 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 5 23:59:43.414108 kubelet[2774]: W0905 23:59:43.412926 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 5 23:59:43.414108 kubelet[2774]: E0905 23:59:43.412950 2774 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 5 23:59:43.414108 kubelet[2774]: E0905 23:59:43.413197 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 5 23:59:43.414108 kubelet[2774]: W0905 23:59:43.413206 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 5 23:59:43.414108 kubelet[2774]: E0905 23:59:43.413217 2774 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 5 23:59:43.414108 kubelet[2774]: E0905 23:59:43.413641 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 5 23:59:43.414108 kubelet[2774]: W0905 23:59:43.413652 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 5 23:59:43.414108 kubelet[2774]: E0905 23:59:43.413663 2774 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 5 23:59:43.414108 kubelet[2774]: E0905 23:59:43.413909 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 5 23:59:43.415400 kubelet[2774]: W0905 23:59:43.413919 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 5 23:59:43.415400 kubelet[2774]: E0905 23:59:43.413930 2774 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 5 23:59:43.415400 kubelet[2774]: E0905 23:59:43.415039 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 5 23:59:43.415400 kubelet[2774]: W0905 23:59:43.415052 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 5 23:59:43.415400 kubelet[2774]: E0905 23:59:43.415065 2774 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 5 23:59:43.516652 kubelet[2774]: E0905 23:59:43.516471 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 5 23:59:43.516652 kubelet[2774]: W0905 23:59:43.516512 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 5 23:59:43.516652 kubelet[2774]: E0905 23:59:43.516556 2774 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 5 23:59:43.517463 kubelet[2774]: E0905 23:59:43.516967 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 5 23:59:43.517463 kubelet[2774]: W0905 23:59:43.516981 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 5 23:59:43.517463 kubelet[2774]: E0905 23:59:43.516996 2774 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 5 23:59:43.518244 kubelet[2774]: E0905 23:59:43.518223 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 5 23:59:43.518313 kubelet[2774]: W0905 23:59:43.518246 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 5 23:59:43.518313 kubelet[2774]: E0905 23:59:43.518273 2774 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 5 23:59:43.518666 kubelet[2774]: E0905 23:59:43.518647 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 5 23:59:43.518828 kubelet[2774]: W0905 23:59:43.518667 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 5 23:59:43.518828 kubelet[2774]: E0905 23:59:43.518729 2774 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 5 23:59:43.519502 kubelet[2774]: E0905 23:59:43.519138 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 5 23:59:43.519502 kubelet[2774]: W0905 23:59:43.519177 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 5 23:59:43.519502 kubelet[2774]: E0905 23:59:43.519218 2774 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 5 23:59:43.519502 kubelet[2774]: E0905 23:59:43.519445 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 5 23:59:43.519502 kubelet[2774]: W0905 23:59:43.519457 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 5 23:59:43.519957 kubelet[2774]: E0905 23:59:43.519912 2774 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 5 23:59:43.520484 kubelet[2774]: E0905 23:59:43.520464 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 5 23:59:43.520546 kubelet[2774]: W0905 23:59:43.520486 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 5 23:59:43.520546 kubelet[2774]: E0905 23:59:43.520515 2774 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input" Sep 5 23:59:43.521100 kubelet[2774]: E0905 23:59:43.521063 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 23:59:43.521100 kubelet[2774]: W0905 23:59:43.521086 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 23:59:43.521100 kubelet[2774]: E0905 23:59:43.521122 2774 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 23:59:43.521271 kubelet[2774]: E0905 23:59:43.521256 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 23:59:43.521368 kubelet[2774]: W0905 23:59:43.521271 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 23:59:43.521368 kubelet[2774]: E0905 23:59:43.521294 2774 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 5 23:59:43.521433 kubelet[2774]: E0905 23:59:43.521398 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 23:59:43.521433 kubelet[2774]: W0905 23:59:43.521406 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 23:59:43.521534 kubelet[2774]: E0905 23:59:43.521483 2774 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 23:59:43.521572 kubelet[2774]: E0905 23:59:43.521558 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 23:59:43.521572 kubelet[2774]: W0905 23:59:43.521567 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 23:59:43.521673 kubelet[2774]: E0905 23:59:43.521650 2774 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 5 23:59:43.521761 kubelet[2774]: E0905 23:59:43.521749 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 23:59:43.521761 kubelet[2774]: W0905 23:59:43.521761 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 23:59:43.521834 kubelet[2774]: E0905 23:59:43.521777 2774 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 23:59:43.521983 kubelet[2774]: E0905 23:59:43.521971 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 23:59:43.522021 kubelet[2774]: W0905 23:59:43.521986 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 23:59:43.522021 kubelet[2774]: E0905 23:59:43.522002 2774 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 5 23:59:43.522392 kubelet[2774]: E0905 23:59:43.522276 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 23:59:43.522392 kubelet[2774]: W0905 23:59:43.522293 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 23:59:43.522392 kubelet[2774]: E0905 23:59:43.522308 2774 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 23:59:43.522635 kubelet[2774]: E0905 23:59:43.522605 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 23:59:43.522974 kubelet[2774]: W0905 23:59:43.522688 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 23:59:43.522974 kubelet[2774]: E0905 23:59:43.522706 2774 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 5 23:59:43.528013 kubelet[2774]: E0905 23:59:43.527763 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 23:59:43.528013 kubelet[2774]: W0905 23:59:43.527794 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 23:59:43.528013 kubelet[2774]: E0905 23:59:43.527863 2774 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 23:59:43.532313 kubelet[2774]: E0905 23:59:43.532281 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 23:59:43.532313 kubelet[2774]: W0905 23:59:43.532307 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 23:59:43.532484 kubelet[2774]: E0905 23:59:43.532339 2774 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 5 23:59:43.533911 kubelet[2774]: E0905 23:59:43.533881 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 23:59:43.533911 kubelet[2774]: W0905 23:59:43.533906 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 23:59:43.534038 kubelet[2774]: E0905 23:59:43.533933 2774 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 23:59:43.537230 kubelet[2774]: E0905 23:59:43.536507 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 23:59:43.537556 kubelet[2774]: W0905 23:59:43.537360 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 23:59:43.537556 kubelet[2774]: E0905 23:59:43.537407 2774 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 5 23:59:43.537795 kubelet[2774]: E0905 23:59:43.537757 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 23:59:43.537895 kubelet[2774]: W0905 23:59:43.537882 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 23:59:43.537952 kubelet[2774]: E0905 23:59:43.537941 2774 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 23:59:43.591147 kubelet[2774]: E0905 23:59:43.591047 2774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-ps2ks" podUID="4ff97107-cfe4-4d25-918b-36fd4176bf0c" Sep 5 23:59:43.610442 containerd[1576]: time="2025-09-05T23:59:43.609737935Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-8645b78b65-fhvz7,Uid:465a82c7-673e-4948-ab39-de8c30afe602,Namespace:calico-system,Attempt:0,}" Sep 5 23:59:43.624685 kubelet[2774]: E0905 23:59:43.624656 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 23:59:43.624896 kubelet[2774]: W0905 23:59:43.624877 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 23:59:43.624965 kubelet[2774]: E0905 23:59:43.624953 2774 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 5 23:59:43.626519 kubelet[2774]: E0905 23:59:43.626501 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 23:59:43.626660 kubelet[2774]: W0905 23:59:43.626647 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 23:59:43.626722 kubelet[2774]: E0905 23:59:43.626710 2774 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 23:59:43.643639 containerd[1576]: time="2025-09-05T23:59:43.643512220Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Sep 5 23:59:43.644325 containerd[1576]: time="2025-09-05T23:59:43.644068992Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Sep 5 23:59:43.644325 containerd[1576]: time="2025-09-05T23:59:43.644091072Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 5 23:59:43.644325 containerd[1576]: time="2025-09-05T23:59:43.644188754Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 5 23:59:43.701368 containerd[1576]: time="2025-09-05T23:59:43.701234739Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-8645b78b65-fhvz7,Uid:465a82c7-673e-4948-ab39-de8c30afe602,Namespace:calico-system,Attempt:0,} returns sandbox id \"cc77606d314d916aa1ab46984c387f6005bacc20f43fc57bec0d5a46a34fdf00\"" Sep 5 23:59:43.706310 containerd[1576]: time="2025-09-05T23:59:43.705894679Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.3\"" Sep 5 23:59:43.728380 kubelet[2774]: E0905 23:59:43.728341 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 23:59:43.728380 kubelet[2774]: W0905 23:59:43.728374 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 23:59:43.728380 kubelet[2774]: E0905 23:59:43.728400 2774 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 23:59:43.729004 kubelet[2774]: E0905 23:59:43.728977 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 23:59:43.729226 kubelet[2774]: W0905 23:59:43.729006 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 23:59:43.729226 kubelet[2774]: E0905 23:59:43.729033 2774 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 5 23:59:43.729449 kubelet[2774]: E0905 23:59:43.729429 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 23:59:43.729641 kubelet[2774]: W0905 23:59:43.729524 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 23:59:43.729641 kubelet[2774]: E0905 23:59:43.729561 2774 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 23:59:43.729906 kubelet[2774]: E0905 23:59:43.729887 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 23:59:43.729906 kubelet[2774]: W0905 23:59:43.729906 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 23:59:43.730179 kubelet[2774]: E0905 23:59:43.730066 2774 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 5 23:59:43.730341 kubelet[2774]: E0905 23:59:43.730325 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 23:59:43.730396 kubelet[2774]: W0905 23:59:43.730343 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 23:59:43.730396 kubelet[2774]: E0905 23:59:43.730366 2774 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 23:59:43.730667 kubelet[2774]: E0905 23:59:43.730649 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 23:59:43.730728 kubelet[2774]: W0905 23:59:43.730670 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 23:59:43.730728 kubelet[2774]: E0905 23:59:43.730689 2774 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 5 23:59:43.734298 kubelet[2774]: E0905 23:59:43.734271 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 23:59:43.734298 kubelet[2774]: W0905 23:59:43.734293 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 23:59:43.734426 kubelet[2774]: E0905 23:59:43.734320 2774 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 23:59:43.735185 kubelet[2774]: E0905 23:59:43.735163 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 23:59:43.735420 kubelet[2774]: W0905 23:59:43.735265 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 23:59:43.735420 kubelet[2774]: E0905 23:59:43.735299 2774 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 5 23:59:43.736020 kubelet[2774]: E0905 23:59:43.735663 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 23:59:43.736020 kubelet[2774]: W0905 23:59:43.735677 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 23:59:43.736185 kubelet[2774]: E0905 23:59:43.735698 2774 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 23:59:43.738069 kubelet[2774]: E0905 23:59:43.738046 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 23:59:43.738354 kubelet[2774]: W0905 23:59:43.738167 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 23:59:43.738354 kubelet[2774]: E0905 23:59:43.738198 2774 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 5 23:59:43.741710 kubelet[2774]: E0905 23:59:43.741294 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 23:59:43.741710 kubelet[2774]: W0905 23:59:43.741493 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 23:59:43.741710 kubelet[2774]: E0905 23:59:43.741516 2774 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 23:59:43.750703 kubelet[2774]: E0905 23:59:43.750663 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 23:59:43.750703 kubelet[2774]: W0905 23:59:43.750694 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 23:59:43.751020 kubelet[2774]: E0905 23:59:43.750724 2774 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 23:59:43.751416 containerd[1576]: time="2025-09-05T23:59:43.751381775Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-j42nm,Uid:b6a7ce10-9d2f-4625-8dd2-b969ffd82644,Namespace:calico-system,Attempt:0,}" Sep 5 23:59:43.784044 containerd[1576]: time="2025-09-05T23:59:43.782946653Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Sep 5 23:59:43.784044 containerd[1576]: time="2025-09-05T23:59:43.783048015Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Sep 5 23:59:43.784044 containerd[1576]: time="2025-09-05T23:59:43.783059215Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 5 23:59:43.784044 containerd[1576]: time="2025-09-05T23:59:43.783153617Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 5 23:59:43.823411 containerd[1576]: time="2025-09-05T23:59:43.823344560Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-j42nm,Uid:b6a7ce10-9d2f-4625-8dd2-b969ffd82644,Namespace:calico-system,Attempt:0,} returns sandbox id \"c207f498ce7317ea226193d8f353f693e77cc568655affb684b6563138ee9a14\"" Sep 5 23:59:45.037519 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3001921795.mount: Deactivated successfully. Sep 5 23:59:45.530252 containerd[1576]: time="2025-09-05T23:59:45.530185861Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 5 23:59:45.531617 containerd[1576]: time="2025-09-05T23:59:45.531555929Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.30.3: active requests=0, bytes read=33105775" Sep 5 23:59:45.533977 containerd[1576]: time="2025-09-05T23:59:45.533766654Z" level=info msg="ImageCreate event name:\"sha256:6a1496fdc48cc0b9ab3c10aef777497484efac5df9efbfbbdf9775e9583645cb\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 5 23:59:45.538317 containerd[1576]: time="2025-09-05T23:59:45.537676574Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:f4a3d61ffda9c98a53adeb412c5af404ca3727a3cc2d0b4ef28d197bdd47ecaa\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 5 23:59:45.538838 containerd[1576]: time="2025-09-05T23:59:45.538793596Z" level=info msg="Pulled image 
\"ghcr.io/flatcar/calico/typha:v3.30.3\" with image id \"sha256:6a1496fdc48cc0b9ab3c10aef777497484efac5df9efbfbbdf9775e9583645cb\", repo tag \"ghcr.io/flatcar/calico/typha:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:f4a3d61ffda9c98a53adeb412c5af404ca3727a3cc2d0b4ef28d197bdd47ecaa\", size \"33105629\" in 1.832856756s" Sep 5 23:59:45.538912 containerd[1576]: time="2025-09-05T23:59:45.538842157Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.3\" returns image reference \"sha256:6a1496fdc48cc0b9ab3c10aef777497484efac5df9efbfbbdf9775e9583645cb\"" Sep 5 23:59:45.540208 containerd[1576]: time="2025-09-05T23:59:45.540173864Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\"" Sep 5 23:59:45.555579 containerd[1576]: time="2025-09-05T23:59:45.555541097Z" level=info msg="CreateContainer within sandbox \"cc77606d314d916aa1ab46984c387f6005bacc20f43fc57bec0d5a46a34fdf00\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Sep 5 23:59:45.572945 containerd[1576]: time="2025-09-05T23:59:45.572880049Z" level=info msg="CreateContainer within sandbox \"cc77606d314d916aa1ab46984c387f6005bacc20f43fc57bec0d5a46a34fdf00\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"f367af8c0bb1c08b69035875e8471d6cc323d93f9cab635b73fba378b09f46aa\"" Sep 5 23:59:45.573819 containerd[1576]: time="2025-09-05T23:59:45.573792308Z" level=info msg="StartContainer for \"f367af8c0bb1c08b69035875e8471d6cc323d93f9cab635b73fba378b09f46aa\"" Sep 5 23:59:45.590684 kubelet[2774]: E0905 23:59:45.589987 2774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-ps2ks" podUID="4ff97107-cfe4-4d25-918b-36fd4176bf0c" Sep 5 23:59:45.656600 containerd[1576]: time="2025-09-05T23:59:45.655958298Z" level=info 
msg="StartContainer for \"f367af8c0bb1c08b69035875e8471d6cc323d93f9cab635b73fba378b09f46aa\" returns successfully" Sep 5 23:59:45.743324 kubelet[2774]: I0905 23:59:45.742039 2774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-8645b78b65-fhvz7" podStartSLOduration=2.906488755 podStartE2EDuration="4.742023048s" podCreationTimestamp="2025-09-05 23:59:41 +0000 UTC" firstStartedPulling="2025-09-05 23:59:43.704506609 +0000 UTC m=+25.268130840" lastFinishedPulling="2025-09-05 23:59:45.540040902 +0000 UTC m=+27.103665133" observedRunningTime="2025-09-05 23:59:45.740715981 +0000 UTC m=+27.304340252" watchObservedRunningTime="2025-09-05 23:59:45.742023048 +0000 UTC m=+27.305647279" Sep 5 23:59:45.764867 kubelet[2774]: E0905 23:59:45.764783 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 23:59:45.764867 kubelet[2774]: W0905 23:59:45.764823 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 23:59:45.764867 kubelet[2774]: E0905 23:59:45.764879 2774 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 5 23:59:45.768286 kubelet[2774]: E0905 23:59:45.766466 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 23:59:45.768286 kubelet[2774]: W0905 23:59:45.766710 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 23:59:45.768286 kubelet[2774]: E0905 23:59:45.766944 2774 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 23:59:45.768580 kubelet[2774]: E0905 23:59:45.768371 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 23:59:45.768580 kubelet[2774]: W0905 23:59:45.768518 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 23:59:45.768580 kubelet[2774]: E0905 23:59:45.768540 2774 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 5 23:59:45.770043 kubelet[2774]: E0905 23:59:45.769811 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 23:59:45.770043 kubelet[2774]: W0905 23:59:45.769996 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 23:59:45.770043 kubelet[2774]: E0905 23:59:45.770017 2774 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 23:59:45.771460 kubelet[2774]: E0905 23:59:45.770778 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 23:59:45.771460 kubelet[2774]: W0905 23:59:45.770791 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 23:59:45.771460 kubelet[2774]: E0905 23:59:45.770839 2774 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input"
Sep 5 23:59:46.545427 systemd[1]: run-containerd-runc-k8s.io-f367af8c0bb1c08b69035875e8471d6cc323d93f9cab635b73fba378b09f46aa-runc.7Fu3Un.mount: Deactivated successfully.
Sep 5 23:59:46.725255 kubelet[2774]: I0905 23:59:46.725216 2774 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Sep 5 23:59:46.782637 kubelet[2774]: E0905 23:59:46.782580 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 5 23:59:46.782756 kubelet[2774]: W0905 23:59:46.782655 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 5 23:59:46.782756 kubelet[2774]: E0905 23:59:46.782713 2774 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 5 23:59:46.783341 kubelet[2774]: E0905 23:59:46.783301 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 5 23:59:46.783341 kubelet[2774]: W0905 23:59:46.783333 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 5 23:59:46.783464 kubelet[2774]: E0905 23:59:46.783357 2774 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input"
Error: unexpected end of JSON input" Sep 5 23:59:46.863771 kubelet[2774]: E0905 23:59:46.863703 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 23:59:46.863771 kubelet[2774]: W0905 23:59:46.863721 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 23:59:46.863771 kubelet[2774]: E0905 23:59:46.863746 2774 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 23:59:46.864886 kubelet[2774]: E0905 23:59:46.864384 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 23:59:46.864886 kubelet[2774]: W0905 23:59:46.864404 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 23:59:46.864886 kubelet[2774]: E0905 23:59:46.864426 2774 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 5 23:59:46.865167 kubelet[2774]: E0905 23:59:46.865148 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 23:59:46.865167 kubelet[2774]: W0905 23:59:46.865166 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 23:59:46.865273 kubelet[2774]: E0905 23:59:46.865185 2774 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 23:59:46.865439 kubelet[2774]: E0905 23:59:46.865423 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 23:59:46.865439 kubelet[2774]: W0905 23:59:46.865436 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 23:59:46.865750 kubelet[2774]: E0905 23:59:46.865468 2774 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 5 23:59:46.865750 kubelet[2774]: E0905 23:59:46.865651 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 23:59:46.865750 kubelet[2774]: W0905 23:59:46.865661 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 23:59:46.865750 kubelet[2774]: E0905 23:59:46.865683 2774 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 23:59:46.866450 kubelet[2774]: E0905 23:59:46.866310 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 23:59:46.866450 kubelet[2774]: W0905 23:59:46.866329 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 23:59:46.866450 kubelet[2774]: E0905 23:59:46.866351 2774 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 5 23:59:46.866986 kubelet[2774]: E0905 23:59:46.866919 2774 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 5 23:59:46.866986 kubelet[2774]: W0905 23:59:46.866934 2774 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 5 23:59:46.866986 kubelet[2774]: E0905 23:59:46.866948 2774 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 5 23:59:46.875945 containerd[1576]: time="2025-09-05T23:59:46.875864024Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 5 23:59:46.877323 containerd[1576]: time="2025-09-05T23:59:46.877271892Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3: active requests=0, bytes read=4266814" Sep 5 23:59:46.878507 containerd[1576]: time="2025-09-05T23:59:46.878447755Z" level=info msg="ImageCreate event name:\"sha256:29e6f31ad72882b1b817dd257df6b7981e4d7d31d872b7fe2cf102c6e2af27a5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 5 23:59:46.881881 containerd[1576]: time="2025-09-05T23:59:46.881815702Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:81bdfcd9dbd36624dc35354e8c181c75631ba40e6c7df5820f5f56cea36f0ef9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 5 23:59:46.883662 containerd[1576]: time="2025-09-05T23:59:46.883571097Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" with image id \"sha256:29e6f31ad72882b1b817dd257df6b7981e4d7d31d872b7fe2cf102c6e2af27a5\", repo tag 
\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:81bdfcd9dbd36624dc35354e8c181c75631ba40e6c7df5820f5f56cea36f0ef9\", size \"5636015\" in 1.343323311s" Sep 5 23:59:46.883662 containerd[1576]: time="2025-09-05T23:59:46.883657018Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" returns image reference \"sha256:29e6f31ad72882b1b817dd257df6b7981e4d7d31d872b7fe2cf102c6e2af27a5\"" Sep 5 23:59:46.888671 containerd[1576]: time="2025-09-05T23:59:46.888540795Z" level=info msg="CreateContainer within sandbox \"c207f498ce7317ea226193d8f353f693e77cc568655affb684b6563138ee9a14\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Sep 5 23:59:46.907130 containerd[1576]: time="2025-09-05T23:59:46.907073642Z" level=info msg="CreateContainer within sandbox \"c207f498ce7317ea226193d8f353f693e77cc568655affb684b6563138ee9a14\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"362888c0eec4817eb9fe5592f7ac9ae0ef50b65d386e6e13adeccae3c08c2e2b\"" Sep 5 23:59:46.909042 containerd[1576]: time="2025-09-05T23:59:46.908893798Z" level=info msg="StartContainer for \"362888c0eec4817eb9fe5592f7ac9ae0ef50b65d386e6e13adeccae3c08c2e2b\"" Sep 5 23:59:46.977406 containerd[1576]: time="2025-09-05T23:59:46.977358112Z" level=info msg="StartContainer for \"362888c0eec4817eb9fe5592f7ac9ae0ef50b65d386e6e13adeccae3c08c2e2b\" returns successfully" Sep 5 23:59:47.149788 containerd[1576]: time="2025-09-05T23:59:47.149183874Z" level=info msg="shim disconnected" id=362888c0eec4817eb9fe5592f7ac9ae0ef50b65d386e6e13adeccae3c08c2e2b namespace=k8s.io Sep 5 23:59:47.149788 containerd[1576]: time="2025-09-05T23:59:47.149298277Z" level=warning msg="cleaning up after shim disconnected" id=362888c0eec4817eb9fe5592f7ac9ae0ef50b65d386e6e13adeccae3c08c2e2b namespace=k8s.io Sep 5 23:59:47.149788 containerd[1576]: time="2025-09-05T23:59:47.149315157Z" level=info msg="cleaning up dead shim" 
namespace=k8s.io Sep 5 23:59:47.547797 systemd[1]: run-containerd-runc-k8s.io-362888c0eec4817eb9fe5592f7ac9ae0ef50b65d386e6e13adeccae3c08c2e2b-runc.93WhVK.mount: Deactivated successfully. Sep 5 23:59:47.548478 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-362888c0eec4817eb9fe5592f7ac9ae0ef50b65d386e6e13adeccae3c08c2e2b-rootfs.mount: Deactivated successfully. Sep 5 23:59:47.590154 kubelet[2774]: E0905 23:59:47.590039 2774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-ps2ks" podUID="4ff97107-cfe4-4d25-918b-36fd4176bf0c" Sep 5 23:59:47.733650 containerd[1576]: time="2025-09-05T23:59:47.732741835Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.3\"" Sep 5 23:59:49.590978 kubelet[2774]: E0905 23:59:49.590869 2774 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-ps2ks" podUID="4ff97107-cfe4-4d25-918b-36fd4176bf0c" Sep 5 23:59:50.180670 containerd[1576]: time="2025-09-05T23:59:50.179918157Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 5 23:59:50.181283 containerd[1576]: time="2025-09-05T23:59:50.181214180Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.30.3: active requests=0, bytes read=65913477" Sep 5 23:59:50.183130 containerd[1576]: time="2025-09-05T23:59:50.181830751Z" level=info msg="ImageCreate event name:\"sha256:7077a1dc632ee598cbfa626f9e3c9bca5b20c0d1e1e557995890125b2e8d2e23\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 5 23:59:50.184780 containerd[1576]: 
time="2025-09-05T23:59:50.184745083Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:73d1e391050490d54e5bee8ff2b1a50a8be1746c98dc530361b00e8c0ab63f87\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 5 23:59:50.185778 containerd[1576]: time="2025-09-05T23:59:50.185740941Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.30.3\" with image id \"sha256:7077a1dc632ee598cbfa626f9e3c9bca5b20c0d1e1e557995890125b2e8d2e23\", repo tag \"ghcr.io/flatcar/calico/cni:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:73d1e391050490d54e5bee8ff2b1a50a8be1746c98dc530361b00e8c0ab63f87\", size \"67282718\" in 2.452959345s" Sep 5 23:59:50.185857 containerd[1576]: time="2025-09-05T23:59:50.185779222Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.3\" returns image reference \"sha256:7077a1dc632ee598cbfa626f9e3c9bca5b20c0d1e1e557995890125b2e8d2e23\"" Sep 5 23:59:50.190117 containerd[1576]: time="2025-09-05T23:59:50.190078258Z" level=info msg="CreateContainer within sandbox \"c207f498ce7317ea226193d8f353f693e77cc568655affb684b6563138ee9a14\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Sep 5 23:59:50.210593 containerd[1576]: time="2025-09-05T23:59:50.210533982Z" level=info msg="CreateContainer within sandbox \"c207f498ce7317ea226193d8f353f693e77cc568655affb684b6563138ee9a14\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"33eb7703b77426d85c2e4d1cc563ac0f26a732dc919726b8b6d6fcbb95acebe8\"" Sep 5 23:59:50.211535 containerd[1576]: time="2025-09-05T23:59:50.211399477Z" level=info msg="StartContainer for \"33eb7703b77426d85c2e4d1cc563ac0f26a732dc919726b8b6d6fcbb95acebe8\"" Sep 5 23:59:50.274997 containerd[1576]: time="2025-09-05T23:59:50.274900366Z" level=info msg="StartContainer for \"33eb7703b77426d85c2e4d1cc563ac0f26a732dc919726b8b6d6fcbb95acebe8\" returns successfully" Sep 5 23:59:50.846908 systemd[1]: 
run-containerd-io.containerd.runtime.v2.task-k8s.io-33eb7703b77426d85c2e4d1cc563ac0f26a732dc919726b8b6d6fcbb95acebe8-rootfs.mount: Deactivated successfully. Sep 5 23:59:50.929089 kubelet[2774]: I0905 23:59:50.929057 2774 kubelet_node_status.go:488] "Fast updating node status as it just became ready" Sep 5 23:59:50.941645 containerd[1576]: time="2025-09-05T23:59:50.941539981Z" level=info msg="shim disconnected" id=33eb7703b77426d85c2e4d1cc563ac0f26a732dc919726b8b6d6fcbb95acebe8 namespace=k8s.io Sep 5 23:59:50.941645 containerd[1576]: time="2025-09-05T23:59:50.941612823Z" level=warning msg="cleaning up after shim disconnected" id=33eb7703b77426d85c2e4d1cc563ac0f26a732dc919726b8b6d6fcbb95acebe8 namespace=k8s.io Sep 5 23:59:50.941645 containerd[1576]: time="2025-09-05T23:59:50.941642863Z" level=info msg="cleaning up dead shim" namespace=k8s.io Sep 5 23:59:51.093383 kubelet[2774]: I0905 23:59:51.093328 2774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5x5mj\" (UniqueName: \"kubernetes.io/projected/7f869a7e-1df4-4450-bbcf-ba5d46557d8b-kube-api-access-5x5mj\") pod \"coredns-7c65d6cfc9-mm682\" (UID: \"7f869a7e-1df4-4450-bbcf-ba5d46557d8b\") " pod="kube-system/coredns-7c65d6cfc9-mm682" Sep 5 23:59:51.093555 kubelet[2774]: I0905 23:59:51.093413 2774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e137dec7-0dff-4776-b673-c2d533bf21f9-config-volume\") pod \"coredns-7c65d6cfc9-8dbj7\" (UID: \"e137dec7-0dff-4776-b673-c2d533bf21f9\") " pod="kube-system/coredns-7c65d6cfc9-8dbj7" Sep 5 23:59:51.093555 kubelet[2774]: I0905 23:59:51.093459 2774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/040258ac-5a18-4787-bc03-5dba35b78258-config\") pod \"goldmane-7988f88666-2kb4z\" (UID: \"040258ac-5a18-4787-bc03-5dba35b78258\") " 
pod="calico-system/goldmane-7988f88666-2kb4z" Sep 5 23:59:51.093555 kubelet[2774]: I0905 23:59:51.093496 2774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wplzp\" (UniqueName: \"kubernetes.io/projected/040258ac-5a18-4787-bc03-5dba35b78258-kube-api-access-wplzp\") pod \"goldmane-7988f88666-2kb4z\" (UID: \"040258ac-5a18-4787-bc03-5dba35b78258\") " pod="calico-system/goldmane-7988f88666-2kb4z" Sep 5 23:59:51.093555 kubelet[2774]: I0905 23:59:51.093535 2774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dsbdq\" (UniqueName: \"kubernetes.io/projected/8fd1cf87-0065-4b36-b757-559cbde7316b-kube-api-access-dsbdq\") pod \"calico-apiserver-76445ff9b6-n8cgl\" (UID: \"8fd1cf87-0065-4b36-b757-559cbde7316b\") " pod="calico-apiserver/calico-apiserver-76445ff9b6-n8cgl" Sep 5 23:59:51.093682 kubelet[2774]: I0905 23:59:51.093574 2774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wr95m\" (UniqueName: \"kubernetes.io/projected/55586fdc-5c0e-4b46-a8dd-484764d828e4-kube-api-access-wr95m\") pod \"calico-apiserver-76445ff9b6-kh72c\" (UID: \"55586fdc-5c0e-4b46-a8dd-484764d828e4\") " pod="calico-apiserver/calico-apiserver-76445ff9b6-kh72c" Sep 5 23:59:51.093682 kubelet[2774]: I0905 23:59:51.093609 2774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/040258ac-5a18-4787-bc03-5dba35b78258-goldmane-key-pair\") pod \"goldmane-7988f88666-2kb4z\" (UID: \"040258ac-5a18-4787-bc03-5dba35b78258\") " pod="calico-system/goldmane-7988f88666-2kb4z" Sep 5 23:59:51.093682 kubelet[2774]: I0905 23:59:51.093671 2774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: 
\"kubernetes.io/secret/6155f6a9-1ed8-4cf8-a81d-799ccb4af958-whisker-backend-key-pair\") pod \"whisker-7cd7c86f75-6qzr6\" (UID: \"6155f6a9-1ed8-4cf8-a81d-799ccb4af958\") " pod="calico-system/whisker-7cd7c86f75-6qzr6" Sep 5 23:59:51.093763 kubelet[2774]: I0905 23:59:51.093716 2774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/8fd1cf87-0065-4b36-b757-559cbde7316b-calico-apiserver-certs\") pod \"calico-apiserver-76445ff9b6-n8cgl\" (UID: \"8fd1cf87-0065-4b36-b757-559cbde7316b\") " pod="calico-apiserver/calico-apiserver-76445ff9b6-n8cgl" Sep 5 23:59:51.093763 kubelet[2774]: I0905 23:59:51.093752 2774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/040258ac-5a18-4787-bc03-5dba35b78258-goldmane-ca-bundle\") pod \"goldmane-7988f88666-2kb4z\" (UID: \"040258ac-5a18-4787-bc03-5dba35b78258\") " pod="calico-system/goldmane-7988f88666-2kb4z" Sep 5 23:59:51.093813 kubelet[2774]: I0905 23:59:51.093791 2774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7f869a7e-1df4-4450-bbcf-ba5d46557d8b-config-volume\") pod \"coredns-7c65d6cfc9-mm682\" (UID: \"7f869a7e-1df4-4450-bbcf-ba5d46557d8b\") " pod="kube-system/coredns-7c65d6cfc9-mm682" Sep 5 23:59:51.093842 kubelet[2774]: I0905 23:59:51.093827 2774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0d045d3d-688c-447d-9931-b4f33f413acf-tigera-ca-bundle\") pod \"calico-kube-controllers-cbc856df4-cgnd8\" (UID: \"0d045d3d-688c-447d-9931-b4f33f413acf\") " pod="calico-system/calico-kube-controllers-cbc856df4-cgnd8" Sep 5 23:59:51.093935 kubelet[2774]: I0905 23:59:51.093866 2774 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ktvsc\" (UniqueName: \"kubernetes.io/projected/0d045d3d-688c-447d-9931-b4f33f413acf-kube-api-access-ktvsc\") pod \"calico-kube-controllers-cbc856df4-cgnd8\" (UID: \"0d045d3d-688c-447d-9931-b4f33f413acf\") " pod="calico-system/calico-kube-controllers-cbc856df4-cgnd8" Sep 5 23:59:51.094006 kubelet[2774]: I0905 23:59:51.093977 2774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/55586fdc-5c0e-4b46-a8dd-484764d828e4-calico-apiserver-certs\") pod \"calico-apiserver-76445ff9b6-kh72c\" (UID: \"55586fdc-5c0e-4b46-a8dd-484764d828e4\") " pod="calico-apiserver/calico-apiserver-76445ff9b6-kh72c" Sep 5 23:59:51.094070 kubelet[2774]: I0905 23:59:51.094043 2774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dpmsj\" (UniqueName: \"kubernetes.io/projected/e137dec7-0dff-4776-b673-c2d533bf21f9-kube-api-access-dpmsj\") pod \"coredns-7c65d6cfc9-8dbj7\" (UID: \"e137dec7-0dff-4776-b673-c2d533bf21f9\") " pod="kube-system/coredns-7c65d6cfc9-8dbj7" Sep 5 23:59:51.094123 kubelet[2774]: I0905 23:59:51.094097 2774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6155f6a9-1ed8-4cf8-a81d-799ccb4af958-whisker-ca-bundle\") pod \"whisker-7cd7c86f75-6qzr6\" (UID: \"6155f6a9-1ed8-4cf8-a81d-799ccb4af958\") " pod="calico-system/whisker-7cd7c86f75-6qzr6" Sep 5 23:59:51.094155 kubelet[2774]: I0905 23:59:51.094139 2774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h9kk9\" (UniqueName: \"kubernetes.io/projected/6155f6a9-1ed8-4cf8-a81d-799ccb4af958-kube-api-access-h9kk9\") pod \"whisker-7cd7c86f75-6qzr6\" (UID: \"6155f6a9-1ed8-4cf8-a81d-799ccb4af958\") " 
pod="calico-system/whisker-7cd7c86f75-6qzr6" Sep 5 23:59:51.278674 containerd[1576]: time="2025-09-05T23:59:51.278157079Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-mm682,Uid:7f869a7e-1df4-4450-bbcf-ba5d46557d8b,Namespace:kube-system,Attempt:0,}" Sep 5 23:59:51.309936 containerd[1576]: time="2025-09-05T23:59:51.309560583Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-cbc856df4-cgnd8,Uid:0d045d3d-688c-447d-9931-b4f33f413acf,Namespace:calico-system,Attempt:0,}" Sep 5 23:59:51.311845 containerd[1576]: time="2025-09-05T23:59:51.311569098Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7988f88666-2kb4z,Uid:040258ac-5a18-4787-bc03-5dba35b78258,Namespace:calico-system,Attempt:0,}" Sep 5 23:59:51.314407 containerd[1576]: time="2025-09-05T23:59:51.313777216Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-76445ff9b6-n8cgl,Uid:8fd1cf87-0065-4b36-b757-559cbde7316b,Namespace:calico-apiserver,Attempt:0,}" Sep 5 23:59:51.314715 containerd[1576]: time="2025-09-05T23:59:51.313909578Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-8dbj7,Uid:e137dec7-0dff-4776-b673-c2d533bf21f9,Namespace:kube-system,Attempt:0,}" Sep 5 23:59:51.316336 containerd[1576]: time="2025-09-05T23:59:51.316304420Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-76445ff9b6-kh72c,Uid:55586fdc-5c0e-4b46-a8dd-484764d828e4,Namespace:calico-apiserver,Attempt:0,}" Sep 5 23:59:51.318254 containerd[1576]: time="2025-09-05T23:59:51.318037690Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-7cd7c86f75-6qzr6,Uid:6155f6a9-1ed8-4cf8-a81d-799ccb4af958,Namespace:calico-system,Attempt:0,}" Sep 5 23:59:51.372507 containerd[1576]: time="2025-09-05T23:59:51.372399751Z" level=error msg="Failed to destroy network for sandbox \"ae180cb95b0cddc1db318cea1b2c734250e7c876f248c12b77adb8766503705e\"" error="plugin type=\"calico\" 
failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 5 23:59:51.373005 containerd[1576]: time="2025-09-05T23:59:51.372856959Z" level=error msg="encountered an error cleaning up failed sandbox \"ae180cb95b0cddc1db318cea1b2c734250e7c876f248c12b77adb8766503705e\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 5 23:59:51.373005 containerd[1576]: time="2025-09-05T23:59:51.372941441Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-mm682,Uid:7f869a7e-1df4-4450-bbcf-ba5d46557d8b,Namespace:kube-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"ae180cb95b0cddc1db318cea1b2c734250e7c876f248c12b77adb8766503705e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 5 23:59:51.373616 kubelet[2774]: E0905 23:59:51.373501 2774 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ae180cb95b0cddc1db318cea1b2c734250e7c876f248c12b77adb8766503705e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 5 23:59:51.373705 kubelet[2774]: E0905 23:59:51.373657 2774 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ae180cb95b0cddc1db318cea1b2c734250e7c876f248c12b77adb8766503705e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and 
has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-mm682" Sep 5 23:59:51.373705 kubelet[2774]: E0905 23:59:51.373693 2774 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ae180cb95b0cddc1db318cea1b2c734250e7c876f248c12b77adb8766503705e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-mm682" Sep 5 23:59:51.374005 kubelet[2774]: E0905 23:59:51.373754 2774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7c65d6cfc9-mm682_kube-system(7f869a7e-1df4-4450-bbcf-ba5d46557d8b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7c65d6cfc9-mm682_kube-system(7f869a7e-1df4-4450-bbcf-ba5d46557d8b)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"ae180cb95b0cddc1db318cea1b2c734250e7c876f248c12b77adb8766503705e\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7c65d6cfc9-mm682" podUID="7f869a7e-1df4-4450-bbcf-ba5d46557d8b" Sep 5 23:59:51.523438 containerd[1576]: time="2025-09-05T23:59:51.523164483Z" level=error msg="Failed to destroy network for sandbox \"5f76ebcb19d4e387a5e60f763189030d513100ad4d02f2bb2825e5fb67469098\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 5 23:59:51.524787 containerd[1576]: time="2025-09-05T23:59:51.524744990Z" level=error msg="encountered an error cleaning up failed sandbox \"5f76ebcb19d4e387a5e60f763189030d513100ad4d02f2bb2825e5fb67469098\", marking sandbox state as 
SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 5 23:59:51.525829 containerd[1576]: time="2025-09-05T23:59:51.525702967Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7988f88666-2kb4z,Uid:040258ac-5a18-4787-bc03-5dba35b78258,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"5f76ebcb19d4e387a5e60f763189030d513100ad4d02f2bb2825e5fb67469098\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 5 23:59:51.526241 kubelet[2774]: E0905 23:59:51.526194 2774 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5f76ebcb19d4e387a5e60f763189030d513100ad4d02f2bb2825e5fb67469098\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 5 23:59:51.526322 kubelet[2774]: E0905 23:59:51.526268 2774 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5f76ebcb19d4e387a5e60f763189030d513100ad4d02f2bb2825e5fb67469098\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-7988f88666-2kb4z" Sep 5 23:59:51.526322 kubelet[2774]: E0905 23:59:51.526290 2774 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5f76ebcb19d4e387a5e60f763189030d513100ad4d02f2bb2825e5fb67469098\": plugin type=\"calico\" failed (add): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-7988f88666-2kb4z" Sep 5 23:59:51.526494 kubelet[2774]: E0905 23:59:51.526329 2774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-7988f88666-2kb4z_calico-system(040258ac-5a18-4787-bc03-5dba35b78258)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-7988f88666-2kb4z_calico-system(040258ac-5a18-4787-bc03-5dba35b78258)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"5f76ebcb19d4e387a5e60f763189030d513100ad4d02f2bb2825e5fb67469098\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-7988f88666-2kb4z" podUID="040258ac-5a18-4787-bc03-5dba35b78258" Sep 5 23:59:51.545486 containerd[1576]: time="2025-09-05T23:59:51.545361067Z" level=error msg="Failed to destroy network for sandbox \"53969e86fe5a8a6a8831c6b4f2b68d2302e067d9414418b76ab2596ce90d5856\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 5 23:59:51.546021 containerd[1576]: time="2025-09-05T23:59:51.545910637Z" level=error msg="encountered an error cleaning up failed sandbox \"53969e86fe5a8a6a8831c6b4f2b68d2302e067d9414418b76ab2596ce90d5856\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 5 23:59:51.546021 containerd[1576]: time="2025-09-05T23:59:51.545983638Z" level=error msg="RunPodSandbox for 
&PodSandboxMetadata{Name:whisker-7cd7c86f75-6qzr6,Uid:6155f6a9-1ed8-4cf8-a81d-799ccb4af958,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"53969e86fe5a8a6a8831c6b4f2b68d2302e067d9414418b76ab2596ce90d5856\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 5 23:59:51.547525 kubelet[2774]: E0905 23:59:51.547058 2774 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"53969e86fe5a8a6a8831c6b4f2b68d2302e067d9414418b76ab2596ce90d5856\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 5 23:59:51.547525 kubelet[2774]: E0905 23:59:51.547121 2774 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"53969e86fe5a8a6a8831c6b4f2b68d2302e067d9414418b76ab2596ce90d5856\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-7cd7c86f75-6qzr6" Sep 5 23:59:51.547525 kubelet[2774]: E0905 23:59:51.547141 2774 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"53969e86fe5a8a6a8831c6b4f2b68d2302e067d9414418b76ab2596ce90d5856\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-7cd7c86f75-6qzr6" Sep 5 23:59:51.547772 kubelet[2774]: E0905 23:59:51.547185 2774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"CreatePodSandbox\" for \"whisker-7cd7c86f75-6qzr6_calico-system(6155f6a9-1ed8-4cf8-a81d-799ccb4af958)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-7cd7c86f75-6qzr6_calico-system(6155f6a9-1ed8-4cf8-a81d-799ccb4af958)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"53969e86fe5a8a6a8831c6b4f2b68d2302e067d9414418b76ab2596ce90d5856\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-7cd7c86f75-6qzr6" podUID="6155f6a9-1ed8-4cf8-a81d-799ccb4af958" Sep 5 23:59:51.550039 containerd[1576]: time="2025-09-05T23:59:51.549983907Z" level=error msg="Failed to destroy network for sandbox \"7de5f6b50a372a2489937cdf42ce9de88194b70f9ab1f53b5785bf6d715f8f61\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 5 23:59:51.551093 containerd[1576]: time="2025-09-05T23:59:51.551047886Z" level=error msg="encountered an error cleaning up failed sandbox \"7de5f6b50a372a2489937cdf42ce9de88194b70f9ab1f53b5785bf6d715f8f61\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 5 23:59:51.551189 containerd[1576]: time="2025-09-05T23:59:51.551138887Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-cbc856df4-cgnd8,Uid:0d045d3d-688c-447d-9931-b4f33f413acf,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"7de5f6b50a372a2489937cdf42ce9de88194b70f9ab1f53b5785bf6d715f8f61\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the 
calico/node container is running and has mounted /var/lib/calico/" Sep 5 23:59:51.551480 kubelet[2774]: E0905 23:59:51.551437 2774 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7de5f6b50a372a2489937cdf42ce9de88194b70f9ab1f53b5785bf6d715f8f61\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 5 23:59:51.551655 kubelet[2774]: E0905 23:59:51.551508 2774 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7de5f6b50a372a2489937cdf42ce9de88194b70f9ab1f53b5785bf6d715f8f61\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-cbc856df4-cgnd8" Sep 5 23:59:51.551655 kubelet[2774]: E0905 23:59:51.551531 2774 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7de5f6b50a372a2489937cdf42ce9de88194b70f9ab1f53b5785bf6d715f8f61\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-cbc856df4-cgnd8" Sep 5 23:59:51.551655 kubelet[2774]: E0905 23:59:51.551587 2774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-cbc856df4-cgnd8_calico-system(0d045d3d-688c-447d-9931-b4f33f413acf)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-cbc856df4-cgnd8_calico-system(0d045d3d-688c-447d-9931-b4f33f413acf)\\\": rpc error: code = Unknown desc = failed to setup network for 
sandbox \\\"7de5f6b50a372a2489937cdf42ce9de88194b70f9ab1f53b5785bf6d715f8f61\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-cbc856df4-cgnd8" podUID="0d045d3d-688c-447d-9931-b4f33f413acf" Sep 5 23:59:51.553129 containerd[1576]: time="2025-09-05T23:59:51.553056440Z" level=error msg="Failed to destroy network for sandbox \"1d8fb26fb70218a9684f0fa3cd08055d57f41f1fa4c6ef7ec10ecc78fb147481\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 5 23:59:51.554183 containerd[1576]: time="2025-09-05T23:59:51.554149699Z" level=error msg="encountered an error cleaning up failed sandbox \"1d8fb26fb70218a9684f0fa3cd08055d57f41f1fa4c6ef7ec10ecc78fb147481\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 5 23:59:51.554408 containerd[1576]: time="2025-09-05T23:59:51.554385103Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-76445ff9b6-n8cgl,Uid:8fd1cf87-0065-4b36-b757-559cbde7316b,Namespace:calico-apiserver,Attempt:0,} failed, error" error="failed to setup network for sandbox \"1d8fb26fb70218a9684f0fa3cd08055d57f41f1fa4c6ef7ec10ecc78fb147481\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 5 23:59:51.555132 kubelet[2774]: E0905 23:59:51.555096 2774 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"1d8fb26fb70218a9684f0fa3cd08055d57f41f1fa4c6ef7ec10ecc78fb147481\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 5 23:59:51.555214 kubelet[2774]: E0905 23:59:51.555154 2774 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1d8fb26fb70218a9684f0fa3cd08055d57f41f1fa4c6ef7ec10ecc78fb147481\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-76445ff9b6-n8cgl" Sep 5 23:59:51.555214 kubelet[2774]: E0905 23:59:51.555171 2774 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1d8fb26fb70218a9684f0fa3cd08055d57f41f1fa4c6ef7ec10ecc78fb147481\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-76445ff9b6-n8cgl" Sep 5 23:59:51.555276 kubelet[2774]: E0905 23:59:51.555208 2774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-76445ff9b6-n8cgl_calico-apiserver(8fd1cf87-0065-4b36-b757-559cbde7316b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-76445ff9b6-n8cgl_calico-apiserver(8fd1cf87-0065-4b36-b757-559cbde7316b)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"1d8fb26fb70218a9684f0fa3cd08055d57f41f1fa4c6ef7ec10ecc78fb147481\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" 
pod="calico-apiserver/calico-apiserver-76445ff9b6-n8cgl" podUID="8fd1cf87-0065-4b36-b757-559cbde7316b" Sep 5 23:59:51.578919 containerd[1576]: time="2025-09-05T23:59:51.578781246Z" level=error msg="Failed to destroy network for sandbox \"dff34d639c23db62cbab5d00bb41314fd2337243dc0d51252d062136b8636847\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 5 23:59:51.579642 containerd[1576]: time="2025-09-05T23:59:51.579505379Z" level=error msg="encountered an error cleaning up failed sandbox \"dff34d639c23db62cbab5d00bb41314fd2337243dc0d51252d062136b8636847\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 5 23:59:51.579642 containerd[1576]: time="2025-09-05T23:59:51.579577580Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-8dbj7,Uid:e137dec7-0dff-4776-b673-c2d533bf21f9,Namespace:kube-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"dff34d639c23db62cbab5d00bb41314fd2337243dc0d51252d062136b8636847\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 5 23:59:51.580384 kubelet[2774]: E0905 23:59:51.579929 2774 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"dff34d639c23db62cbab5d00bb41314fd2337243dc0d51252d062136b8636847\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 5 23:59:51.580384 kubelet[2774]: E0905 23:59:51.579990 2774 
kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"dff34d639c23db62cbab5d00bb41314fd2337243dc0d51252d062136b8636847\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-8dbj7" Sep 5 23:59:51.580384 kubelet[2774]: E0905 23:59:51.580007 2774 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"dff34d639c23db62cbab5d00bb41314fd2337243dc0d51252d062136b8636847\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-8dbj7" Sep 5 23:59:51.580567 kubelet[2774]: E0905 23:59:51.580052 2774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7c65d6cfc9-8dbj7_kube-system(e137dec7-0dff-4776-b673-c2d533bf21f9)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7c65d6cfc9-8dbj7_kube-system(e137dec7-0dff-4776-b673-c2d533bf21f9)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"dff34d639c23db62cbab5d00bb41314fd2337243dc0d51252d062136b8636847\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7c65d6cfc9-8dbj7" podUID="e137dec7-0dff-4776-b673-c2d533bf21f9" Sep 5 23:59:51.582546 containerd[1576]: time="2025-09-05T23:59:51.582510311Z" level=error msg="Failed to destroy network for sandbox \"163c661c831b254f802d8292519fe2430e7877ab5ce0bd52311fca24b758818a\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no 
such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 5 23:59:51.583195 containerd[1576]: time="2025-09-05T23:59:51.583141361Z" level=error msg="encountered an error cleaning up failed sandbox \"163c661c831b254f802d8292519fe2430e7877ab5ce0bd52311fca24b758818a\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 5 23:59:51.583346 containerd[1576]: time="2025-09-05T23:59:51.583278084Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-76445ff9b6-kh72c,Uid:55586fdc-5c0e-4b46-a8dd-484764d828e4,Namespace:calico-apiserver,Attempt:0,} failed, error" error="failed to setup network for sandbox \"163c661c831b254f802d8292519fe2430e7877ab5ce0bd52311fca24b758818a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 5 23:59:51.583954 kubelet[2774]: E0905 23:59:51.583757 2774 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"163c661c831b254f802d8292519fe2430e7877ab5ce0bd52311fca24b758818a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 5 23:59:51.583954 kubelet[2774]: E0905 23:59:51.583820 2774 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"163c661c831b254f802d8292519fe2430e7877ab5ce0bd52311fca24b758818a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
pod="calico-apiserver/calico-apiserver-76445ff9b6-kh72c" Sep 5 23:59:51.583954 kubelet[2774]: E0905 23:59:51.583841 2774 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"163c661c831b254f802d8292519fe2430e7877ab5ce0bd52311fca24b758818a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-76445ff9b6-kh72c" Sep 5 23:59:51.584078 kubelet[2774]: E0905 23:59:51.583914 2774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-76445ff9b6-kh72c_calico-apiserver(55586fdc-5c0e-4b46-a8dd-484764d828e4)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-76445ff9b6-kh72c_calico-apiserver(55586fdc-5c0e-4b46-a8dd-484764d828e4)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"163c661c831b254f802d8292519fe2430e7877ab5ce0bd52311fca24b758818a\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-76445ff9b6-kh72c" podUID="55586fdc-5c0e-4b46-a8dd-484764d828e4" Sep 5 23:59:51.594966 containerd[1576]: time="2025-09-05T23:59:51.594797243Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-ps2ks,Uid:4ff97107-cfe4-4d25-918b-36fd4176bf0c,Namespace:calico-system,Attempt:0,}" Sep 5 23:59:51.667706 containerd[1576]: time="2025-09-05T23:59:51.664808656Z" level=error msg="Failed to destroy network for sandbox \"1e4bb5df844a20df2c4fe926e353f7c7f74664aea15c1506b34b0eb7d0212063\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and 
has mounted /var/lib/calico/" Sep 5 23:59:51.667706 containerd[1576]: time="2025-09-05T23:59:51.665273224Z" level=error msg="encountered an error cleaning up failed sandbox \"1e4bb5df844a20df2c4fe926e353f7c7f74664aea15c1506b34b0eb7d0212063\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 5 23:59:51.667706 containerd[1576]: time="2025-09-05T23:59:51.665333145Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-ps2ks,Uid:4ff97107-cfe4-4d25-918b-36fd4176bf0c,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"1e4bb5df844a20df2c4fe926e353f7c7f74664aea15c1506b34b0eb7d0212063\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 5 23:59:51.668541 kubelet[2774]: E0905 23:59:51.667821 2774 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1e4bb5df844a20df2c4fe926e353f7c7f74664aea15c1506b34b0eb7d0212063\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 5 23:59:51.668541 kubelet[2774]: E0905 23:59:51.667876 2774 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1e4bb5df844a20df2c4fe926e353f7c7f74664aea15c1506b34b0eb7d0212063\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-ps2ks" Sep 5 23:59:51.668541 kubelet[2774]: E0905 
23:59:51.667933 2774 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1e4bb5df844a20df2c4fe926e353f7c7f74664aea15c1506b34b0eb7d0212063\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-ps2ks" Sep 5 23:59:51.669224 kubelet[2774]: E0905 23:59:51.667973 2774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-ps2ks_calico-system(4ff97107-cfe4-4d25-918b-36fd4176bf0c)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-ps2ks_calico-system(4ff97107-cfe4-4d25-918b-36fd4176bf0c)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"1e4bb5df844a20df2c4fe926e353f7c7f74664aea15c1506b34b0eb7d0212063\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-ps2ks" podUID="4ff97107-cfe4-4d25-918b-36fd4176bf0c" Sep 5 23:59:51.746470 kubelet[2774]: I0905 23:59:51.746425 2774 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dff34d639c23db62cbab5d00bb41314fd2337243dc0d51252d062136b8636847" Sep 5 23:59:51.748165 containerd[1576]: time="2025-09-05T23:59:51.748058818Z" level=info msg="StopPodSandbox for \"dff34d639c23db62cbab5d00bb41314fd2337243dc0d51252d062136b8636847\"" Sep 5 23:59:51.748427 containerd[1576]: time="2025-09-05T23:59:51.748410064Z" level=info msg="Ensure that sandbox dff34d639c23db62cbab5d00bb41314fd2337243dc0d51252d062136b8636847 in task-service has been cleanup successfully" Sep 5 23:59:51.758601 containerd[1576]: time="2025-09-05T23:59:51.758545720Z" level=info msg="PullImage 
\"ghcr.io/flatcar/calico/node:v3.30.3\"" Sep 5 23:59:51.761616 kubelet[2774]: I0905 23:59:51.761566 2774 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1e4bb5df844a20df2c4fe926e353f7c7f74664aea15c1506b34b0eb7d0212063" Sep 5 23:59:51.764264 containerd[1576]: time="2025-09-05T23:59:51.762614870Z" level=info msg="StopPodSandbox for \"1e4bb5df844a20df2c4fe926e353f7c7f74664aea15c1506b34b0eb7d0212063\"" Sep 5 23:59:51.764264 containerd[1576]: time="2025-09-05T23:59:51.763241041Z" level=info msg="Ensure that sandbox 1e4bb5df844a20df2c4fe926e353f7c7f74664aea15c1506b34b0eb7d0212063 in task-service has been cleanup successfully" Sep 5 23:59:51.764629 kubelet[2774]: I0905 23:59:51.764212 2774 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ae180cb95b0cddc1db318cea1b2c734250e7c876f248c12b77adb8766503705e" Sep 5 23:59:51.769333 containerd[1576]: time="2025-09-05T23:59:51.767395793Z" level=info msg="StopPodSandbox for \"ae180cb95b0cddc1db318cea1b2c734250e7c876f248c12b77adb8766503705e\"" Sep 5 23:59:51.769333 containerd[1576]: time="2025-09-05T23:59:51.767555876Z" level=info msg="Ensure that sandbox ae180cb95b0cddc1db318cea1b2c734250e7c876f248c12b77adb8766503705e in task-service has been cleanup successfully" Sep 5 23:59:51.773398 kubelet[2774]: I0905 23:59:51.773351 2774 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="53969e86fe5a8a6a8831c6b4f2b68d2302e067d9414418b76ab2596ce90d5856" Sep 5 23:59:51.774909 containerd[1576]: time="2025-09-05T23:59:51.774401234Z" level=info msg="StopPodSandbox for \"53969e86fe5a8a6a8831c6b4f2b68d2302e067d9414418b76ab2596ce90d5856\"" Sep 5 23:59:51.774909 containerd[1576]: time="2025-09-05T23:59:51.774581917Z" level=info msg="Ensure that sandbox 53969e86fe5a8a6a8831c6b4f2b68d2302e067d9414418b76ab2596ce90d5856 in task-service has been cleanup successfully" Sep 5 23:59:51.786120 kubelet[2774]: I0905 23:59:51.785546 2774 
pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7de5f6b50a372a2489937cdf42ce9de88194b70f9ab1f53b5785bf6d715f8f61" Sep 5 23:59:51.793964 containerd[1576]: time="2025-09-05T23:59:51.793803610Z" level=info msg="StopPodSandbox for \"7de5f6b50a372a2489937cdf42ce9de88194b70f9ab1f53b5785bf6d715f8f61\"" Sep 5 23:59:51.794096 containerd[1576]: time="2025-09-05T23:59:51.794027734Z" level=info msg="Ensure that sandbox 7de5f6b50a372a2489937cdf42ce9de88194b70f9ab1f53b5785bf6d715f8f61 in task-service has been cleanup successfully" Sep 5 23:59:51.805176 kubelet[2774]: I0905 23:59:51.803889 2774 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="163c661c831b254f802d8292519fe2430e7877ab5ce0bd52311fca24b758818a" Sep 5 23:59:51.808882 containerd[1576]: time="2025-09-05T23:59:51.808082378Z" level=info msg="StopPodSandbox for \"163c661c831b254f802d8292519fe2430e7877ab5ce0bd52311fca24b758818a\"" Sep 5 23:59:51.808882 containerd[1576]: time="2025-09-05T23:59:51.808267261Z" level=info msg="Ensure that sandbox 163c661c831b254f802d8292519fe2430e7877ab5ce0bd52311fca24b758818a in task-service has been cleanup successfully" Sep 5 23:59:51.809793 kubelet[2774]: I0905 23:59:51.809767 2774 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5f76ebcb19d4e387a5e60f763189030d513100ad4d02f2bb2825e5fb67469098" Sep 5 23:59:51.811609 containerd[1576]: time="2025-09-05T23:59:51.811256593Z" level=info msg="StopPodSandbox for \"5f76ebcb19d4e387a5e60f763189030d513100ad4d02f2bb2825e5fb67469098\"" Sep 5 23:59:51.811609 containerd[1576]: time="2025-09-05T23:59:51.811439716Z" level=info msg="Ensure that sandbox 5f76ebcb19d4e387a5e60f763189030d513100ad4d02f2bb2825e5fb67469098 in task-service has been cleanup successfully" Sep 5 23:59:51.821807 kubelet[2774]: I0905 23:59:51.821768 2774 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="1d8fb26fb70218a9684f0fa3cd08055d57f41f1fa4c6ef7ec10ecc78fb147481" Sep 5 23:59:51.826191 containerd[1576]: time="2025-09-05T23:59:51.826145290Z" level=info msg="StopPodSandbox for \"1d8fb26fb70218a9684f0fa3cd08055d57f41f1fa4c6ef7ec10ecc78fb147481\"" Sep 5 23:59:51.827114 containerd[1576]: time="2025-09-05T23:59:51.826329734Z" level=info msg="Ensure that sandbox 1d8fb26fb70218a9684f0fa3cd08055d57f41f1fa4c6ef7ec10ecc78fb147481 in task-service has been cleanup successfully" Sep 5 23:59:51.834089 containerd[1576]: time="2025-09-05T23:59:51.833936505Z" level=error msg="StopPodSandbox for \"dff34d639c23db62cbab5d00bb41314fd2337243dc0d51252d062136b8636847\" failed" error="failed to destroy network for sandbox \"dff34d639c23db62cbab5d00bb41314fd2337243dc0d51252d062136b8636847\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 5 23:59:51.834251 kubelet[2774]: E0905 23:59:51.834164 2774 log.go:32] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"dff34d639c23db62cbab5d00bb41314fd2337243dc0d51252d062136b8636847\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="dff34d639c23db62cbab5d00bb41314fd2337243dc0d51252d062136b8636847" Sep 5 23:59:51.834312 kubelet[2774]: E0905 23:59:51.834224 2774 kuberuntime_manager.go:1479] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"dff34d639c23db62cbab5d00bb41314fd2337243dc0d51252d062136b8636847"} Sep 5 23:59:51.834312 kubelet[2774]: E0905 23:59:51.834282 2774 kuberuntime_manager.go:1079] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"e137dec7-0dff-4776-b673-c2d533bf21f9\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to 
destroy network for sandbox \\\"dff34d639c23db62cbab5d00bb41314fd2337243dc0d51252d062136b8636847\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Sep 5 23:59:51.834312 kubelet[2774]: E0905 23:59:51.834303 2774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"e137dec7-0dff-4776-b673-c2d533bf21f9\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"dff34d639c23db62cbab5d00bb41314fd2337243dc0d51252d062136b8636847\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7c65d6cfc9-8dbj7" podUID="e137dec7-0dff-4776-b673-c2d533bf21f9" Sep 5 23:59:51.877265 containerd[1576]: time="2025-09-05T23:59:51.877190335Z" level=error msg="StopPodSandbox for \"53969e86fe5a8a6a8831c6b4f2b68d2302e067d9414418b76ab2596ce90d5856\" failed" error="failed to destroy network for sandbox \"53969e86fe5a8a6a8831c6b4f2b68d2302e067d9414418b76ab2596ce90d5856\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 5 23:59:51.877652 kubelet[2774]: E0905 23:59:51.877406 2774 log.go:32] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"53969e86fe5a8a6a8831c6b4f2b68d2302e067d9414418b76ab2596ce90d5856\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="53969e86fe5a8a6a8831c6b4f2b68d2302e067d9414418b76ab2596ce90d5856" Sep 5 23:59:51.877652 kubelet[2774]: E0905 
23:59:51.877464 2774 kuberuntime_manager.go:1479] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"53969e86fe5a8a6a8831c6b4f2b68d2302e067d9414418b76ab2596ce90d5856"} Sep 5 23:59:51.877652 kubelet[2774]: E0905 23:59:51.877499 2774 kuberuntime_manager.go:1079] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"6155f6a9-1ed8-4cf8-a81d-799ccb4af958\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"53969e86fe5a8a6a8831c6b4f2b68d2302e067d9414418b76ab2596ce90d5856\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Sep 5 23:59:51.877652 kubelet[2774]: E0905 23:59:51.877522 2774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"6155f6a9-1ed8-4cf8-a81d-799ccb4af958\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"53969e86fe5a8a6a8831c6b4f2b68d2302e067d9414418b76ab2596ce90d5856\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-7cd7c86f75-6qzr6" podUID="6155f6a9-1ed8-4cf8-a81d-799ccb4af958" Sep 5 23:59:51.892891 containerd[1576]: time="2025-09-05T23:59:51.892826445Z" level=error msg="StopPodSandbox for \"163c661c831b254f802d8292519fe2430e7877ab5ce0bd52311fca24b758818a\" failed" error="failed to destroy network for sandbox \"163c661c831b254f802d8292519fe2430e7877ab5ce0bd52311fca24b758818a\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 5 23:59:51.893230 kubelet[2774]: E0905 23:59:51.893108 2774 log.go:32] "StopPodSandbox from runtime service failed" 
err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"163c661c831b254f802d8292519fe2430e7877ab5ce0bd52311fca24b758818a\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="163c661c831b254f802d8292519fe2430e7877ab5ce0bd52311fca24b758818a" Sep 5 23:59:51.893230 kubelet[2774]: E0905 23:59:51.893180 2774 kuberuntime_manager.go:1479] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"163c661c831b254f802d8292519fe2430e7877ab5ce0bd52311fca24b758818a"} Sep 5 23:59:51.893358 kubelet[2774]: E0905 23:59:51.893251 2774 kuberuntime_manager.go:1079] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"55586fdc-5c0e-4b46-a8dd-484764d828e4\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"163c661c831b254f802d8292519fe2430e7877ab5ce0bd52311fca24b758818a\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Sep 5 23:59:51.893358 kubelet[2774]: E0905 23:59:51.893274 2774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"55586fdc-5c0e-4b46-a8dd-484764d828e4\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"163c661c831b254f802d8292519fe2430e7877ab5ce0bd52311fca24b758818a\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-76445ff9b6-kh72c" podUID="55586fdc-5c0e-4b46-a8dd-484764d828e4" Sep 5 23:59:51.896881 containerd[1576]: time="2025-09-05T23:59:51.896275385Z" level=error msg="StopPodSandbox for 
\"ae180cb95b0cddc1db318cea1b2c734250e7c876f248c12b77adb8766503705e\" failed" error="failed to destroy network for sandbox \"ae180cb95b0cddc1db318cea1b2c734250e7c876f248c12b77adb8766503705e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 5 23:59:51.897083 kubelet[2774]: E0905 23:59:51.896508 2774 log.go:32] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"ae180cb95b0cddc1db318cea1b2c734250e7c876f248c12b77adb8766503705e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="ae180cb95b0cddc1db318cea1b2c734250e7c876f248c12b77adb8766503705e" Sep 5 23:59:51.897083 kubelet[2774]: E0905 23:59:51.896555 2774 kuberuntime_manager.go:1479] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"ae180cb95b0cddc1db318cea1b2c734250e7c876f248c12b77adb8766503705e"} Sep 5 23:59:51.897083 kubelet[2774]: E0905 23:59:51.896587 2774 kuberuntime_manager.go:1079] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"7f869a7e-1df4-4450-bbcf-ba5d46557d8b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ae180cb95b0cddc1db318cea1b2c734250e7c876f248c12b77adb8766503705e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Sep 5 23:59:51.897083 kubelet[2774]: E0905 23:59:51.896608 2774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"7f869a7e-1df4-4450-bbcf-ba5d46557d8b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox 
\\\"ae180cb95b0cddc1db318cea1b2c734250e7c876f248c12b77adb8766503705e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7c65d6cfc9-mm682" podUID="7f869a7e-1df4-4450-bbcf-ba5d46557d8b" Sep 5 23:59:51.911691 containerd[1576]: time="2025-09-05T23:59:51.910307788Z" level=error msg="StopPodSandbox for \"1e4bb5df844a20df2c4fe926e353f7c7f74664aea15c1506b34b0eb7d0212063\" failed" error="failed to destroy network for sandbox \"1e4bb5df844a20df2c4fe926e353f7c7f74664aea15c1506b34b0eb7d0212063\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 5 23:59:51.911841 kubelet[2774]: E0905 23:59:51.910549 2774 log.go:32] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"1e4bb5df844a20df2c4fe926e353f7c7f74664aea15c1506b34b0eb7d0212063\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="1e4bb5df844a20df2c4fe926e353f7c7f74664aea15c1506b34b0eb7d0212063" Sep 5 23:59:51.911841 kubelet[2774]: E0905 23:59:51.910604 2774 kuberuntime_manager.go:1479] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"1e4bb5df844a20df2c4fe926e353f7c7f74664aea15c1506b34b0eb7d0212063"} Sep 5 23:59:51.911841 kubelet[2774]: E0905 23:59:51.910710 2774 kuberuntime_manager.go:1079] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"4ff97107-cfe4-4d25-918b-36fd4176bf0c\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1e4bb5df844a20df2c4fe926e353f7c7f74664aea15c1506b34b0eb7d0212063\\\": plugin type=\\\"calico\\\" failed 
(delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Sep 5 23:59:51.911841 kubelet[2774]: E0905 23:59:51.910741 2774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"4ff97107-cfe4-4d25-918b-36fd4176bf0c\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1e4bb5df844a20df2c4fe926e353f7c7f74664aea15c1506b34b0eb7d0212063\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-ps2ks" podUID="4ff97107-cfe4-4d25-918b-36fd4176bf0c" Sep 5 23:59:51.918239 containerd[1576]: time="2025-09-05T23:59:51.917419391Z" level=error msg="StopPodSandbox for \"7de5f6b50a372a2489937cdf42ce9de88194b70f9ab1f53b5785bf6d715f8f61\" failed" error="failed to destroy network for sandbox \"7de5f6b50a372a2489937cdf42ce9de88194b70f9ab1f53b5785bf6d715f8f61\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 5 23:59:51.918822 kubelet[2774]: E0905 23:59:51.918760 2774 log.go:32] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"7de5f6b50a372a2489937cdf42ce9de88194b70f9ab1f53b5785bf6d715f8f61\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="7de5f6b50a372a2489937cdf42ce9de88194b70f9ab1f53b5785bf6d715f8f61" Sep 5 23:59:51.919015 kubelet[2774]: E0905 23:59:51.918991 2774 kuberuntime_manager.go:1479] "Failed to stop sandbox" 
podSandboxID={"Type":"containerd","ID":"7de5f6b50a372a2489937cdf42ce9de88194b70f9ab1f53b5785bf6d715f8f61"} Sep 5 23:59:51.919113 kubelet[2774]: E0905 23:59:51.919099 2774 kuberuntime_manager.go:1079] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"0d045d3d-688c-447d-9931-b4f33f413acf\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"7de5f6b50a372a2489937cdf42ce9de88194b70f9ab1f53b5785bf6d715f8f61\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Sep 5 23:59:51.919250 kubelet[2774]: E0905 23:59:51.919217 2774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"0d045d3d-688c-447d-9931-b4f33f413acf\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"7de5f6b50a372a2489937cdf42ce9de88194b70f9ab1f53b5785bf6d715f8f61\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-cbc856df4-cgnd8" podUID="0d045d3d-688c-447d-9931-b4f33f413acf" Sep 5 23:59:51.929836 containerd[1576]: time="2025-09-05T23:59:51.929776445Z" level=error msg="StopPodSandbox for \"1d8fb26fb70218a9684f0fa3cd08055d57f41f1fa4c6ef7ec10ecc78fb147481\" failed" error="failed to destroy network for sandbox \"1d8fb26fb70218a9684f0fa3cd08055d57f41f1fa4c6ef7ec10ecc78fb147481\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 5 23:59:51.930238 kubelet[2774]: E0905 23:59:51.930047 2774 log.go:32] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network 
for sandbox \"1d8fb26fb70218a9684f0fa3cd08055d57f41f1fa4c6ef7ec10ecc78fb147481\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="1d8fb26fb70218a9684f0fa3cd08055d57f41f1fa4c6ef7ec10ecc78fb147481" Sep 5 23:59:51.930238 kubelet[2774]: E0905 23:59:51.930111 2774 kuberuntime_manager.go:1479] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"1d8fb26fb70218a9684f0fa3cd08055d57f41f1fa4c6ef7ec10ecc78fb147481"} Sep 5 23:59:51.930238 kubelet[2774]: E0905 23:59:51.930147 2774 kuberuntime_manager.go:1079] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"8fd1cf87-0065-4b36-b757-559cbde7316b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1d8fb26fb70218a9684f0fa3cd08055d57f41f1fa4c6ef7ec10ecc78fb147481\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Sep 5 23:59:51.930238 kubelet[2774]: E0905 23:59:51.930170 2774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"8fd1cf87-0065-4b36-b757-559cbde7316b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1d8fb26fb70218a9684f0fa3cd08055d57f41f1fa4c6ef7ec10ecc78fb147481\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-76445ff9b6-n8cgl" podUID="8fd1cf87-0065-4b36-b757-559cbde7316b" Sep 5 23:59:51.931086 containerd[1576]: time="2025-09-05T23:59:51.931044227Z" level=error msg="StopPodSandbox for \"5f76ebcb19d4e387a5e60f763189030d513100ad4d02f2bb2825e5fb67469098\" failed" error="failed to 
destroy network for sandbox \"5f76ebcb19d4e387a5e60f763189030d513100ad4d02f2bb2825e5fb67469098\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 5 23:59:51.931284 kubelet[2774]: E0905 23:59:51.931251 2774 log.go:32] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"5f76ebcb19d4e387a5e60f763189030d513100ad4d02f2bb2825e5fb67469098\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="5f76ebcb19d4e387a5e60f763189030d513100ad4d02f2bb2825e5fb67469098" Sep 5 23:59:51.931342 kubelet[2774]: E0905 23:59:51.931299 2774 kuberuntime_manager.go:1479] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"5f76ebcb19d4e387a5e60f763189030d513100ad4d02f2bb2825e5fb67469098"} Sep 5 23:59:51.931342 kubelet[2774]: E0905 23:59:51.931327 2774 kuberuntime_manager.go:1079] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"040258ac-5a18-4787-bc03-5dba35b78258\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"5f76ebcb19d4e387a5e60f763189030d513100ad4d02f2bb2825e5fb67469098\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Sep 5 23:59:51.931461 kubelet[2774]: E0905 23:59:51.931355 2774 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"040258ac-5a18-4787-bc03-5dba35b78258\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"5f76ebcb19d4e387a5e60f763189030d513100ad4d02f2bb2825e5fb67469098\\\": plugin type=\\\"calico\\\" failed 
(delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-7988f88666-2kb4z" podUID="040258ac-5a18-4787-bc03-5dba35b78258" Sep 5 23:59:55.686504 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount200031809.mount: Deactivated successfully. Sep 5 23:59:55.714179 containerd[1576]: time="2025-09-05T23:59:55.714086498Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 5 23:59:55.716070 containerd[1576]: time="2025-09-05T23:59:55.715871005Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.30.3: active requests=0, bytes read=151100457" Sep 5 23:59:55.717331 containerd[1576]: time="2025-09-05T23:59:55.717160666Z" level=info msg="ImageCreate event name:\"sha256:2b8abd2140fc4464ed664d225fe38e5b90bbfcf62996b484b0fc0e0537b6a4a9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 5 23:59:55.720101 containerd[1576]: time="2025-09-05T23:59:55.720032950Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:bcb8146fcaeced1e1c88fad3eaa697f1680746bd23c3e7e8d4535bc484c6f2a1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 5 23:59:55.720979 containerd[1576]: time="2025-09-05T23:59:55.720741721Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.30.3\" with image id \"sha256:2b8abd2140fc4464ed664d225fe38e5b90bbfcf62996b484b0fc0e0537b6a4a9\", repo tag \"ghcr.io/flatcar/calico/node:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/node@sha256:bcb8146fcaeced1e1c88fad3eaa697f1680746bd23c3e7e8d4535bc484c6f2a1\", size \"151100319\" in 3.962150001s" Sep 5 23:59:55.720979 containerd[1576]: time="2025-09-05T23:59:55.720780162Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.3\" returns image reference \"sha256:2b8abd2140fc4464ed664d225fe38e5b90bbfcf62996b484b0fc0e0537b6a4a9\"" 
Sep 5 23:59:55.739869 containerd[1576]: time="2025-09-05T23:59:55.739817059Z" level=info msg="CreateContainer within sandbox \"c207f498ce7317ea226193d8f353f693e77cc568655affb684b6563138ee9a14\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Sep 5 23:59:55.773846 containerd[1576]: time="2025-09-05T23:59:55.773685308Z" level=info msg="CreateContainer within sandbox \"c207f498ce7317ea226193d8f353f693e77cc568655affb684b6563138ee9a14\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"937e17db81a6fff5243530016606a68a723f2d72c16524a924db17eb54345458\"" Sep 5 23:59:55.776228 containerd[1576]: time="2025-09-05T23:59:55.774781725Z" level=info msg="StartContainer for \"937e17db81a6fff5243530016606a68a723f2d72c16524a924db17eb54345458\"" Sep 5 23:59:55.856500 containerd[1576]: time="2025-09-05T23:59:55.856373679Z" level=info msg="StartContainer for \"937e17db81a6fff5243530016606a68a723f2d72c16524a924db17eb54345458\" returns successfully" Sep 5 23:59:56.001969 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Sep 5 23:59:56.002190 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. Sep 5 23:59:56.153918 containerd[1576]: time="2025-09-05T23:59:56.153874222Z" level=info msg="StopPodSandbox for \"53969e86fe5a8a6a8831c6b4f2b68d2302e067d9414418b76ab2596ce90d5856\"" Sep 5 23:59:56.362917 containerd[1576]: 2025-09-05 23:59:56.272 [INFO][4092] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="53969e86fe5a8a6a8831c6b4f2b68d2302e067d9414418b76ab2596ce90d5856" Sep 5 23:59:56.362917 containerd[1576]: 2025-09-05 23:59:56.273 [INFO][4092] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. 
ContainerID="53969e86fe5a8a6a8831c6b4f2b68d2302e067d9414418b76ab2596ce90d5856" iface="eth0" netns="/var/run/netns/cni-cfcc05aa-3ef7-1caa-29fb-f6db07556716" Sep 5 23:59:56.362917 containerd[1576]: 2025-09-05 23:59:56.274 [INFO][4092] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="53969e86fe5a8a6a8831c6b4f2b68d2302e067d9414418b76ab2596ce90d5856" iface="eth0" netns="/var/run/netns/cni-cfcc05aa-3ef7-1caa-29fb-f6db07556716" Sep 5 23:59:56.362917 containerd[1576]: 2025-09-05 23:59:56.274 [INFO][4092] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. ContainerID="53969e86fe5a8a6a8831c6b4f2b68d2302e067d9414418b76ab2596ce90d5856" iface="eth0" netns="/var/run/netns/cni-cfcc05aa-3ef7-1caa-29fb-f6db07556716" Sep 5 23:59:56.362917 containerd[1576]: 2025-09-05 23:59:56.275 [INFO][4092] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="53969e86fe5a8a6a8831c6b4f2b68d2302e067d9414418b76ab2596ce90d5856" Sep 5 23:59:56.362917 containerd[1576]: 2025-09-05 23:59:56.275 [INFO][4092] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="53969e86fe5a8a6a8831c6b4f2b68d2302e067d9414418b76ab2596ce90d5856" Sep 5 23:59:56.362917 containerd[1576]: 2025-09-05 23:59:56.339 [INFO][4101] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="53969e86fe5a8a6a8831c6b4f2b68d2302e067d9414418b76ab2596ce90d5856" HandleID="k8s-pod-network.53969e86fe5a8a6a8831c6b4f2b68d2302e067d9414418b76ab2596ce90d5856" Workload="ci--4081--3--5--n--8aba32846f-k8s-whisker--7cd7c86f75--6qzr6-eth0" Sep 5 23:59:56.362917 containerd[1576]: 2025-09-05 23:59:56.339 [INFO][4101] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 5 23:59:56.362917 containerd[1576]: 2025-09-05 23:59:56.339 [INFO][4101] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 5 23:59:56.362917 containerd[1576]: 2025-09-05 23:59:56.351 [WARNING][4101] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="53969e86fe5a8a6a8831c6b4f2b68d2302e067d9414418b76ab2596ce90d5856" HandleID="k8s-pod-network.53969e86fe5a8a6a8831c6b4f2b68d2302e067d9414418b76ab2596ce90d5856" Workload="ci--4081--3--5--n--8aba32846f-k8s-whisker--7cd7c86f75--6qzr6-eth0" Sep 5 23:59:56.362917 containerd[1576]: 2025-09-05 23:59:56.351 [INFO][4101] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="53969e86fe5a8a6a8831c6b4f2b68d2302e067d9414418b76ab2596ce90d5856" HandleID="k8s-pod-network.53969e86fe5a8a6a8831c6b4f2b68d2302e067d9414418b76ab2596ce90d5856" Workload="ci--4081--3--5--n--8aba32846f-k8s-whisker--7cd7c86f75--6qzr6-eth0" Sep 5 23:59:56.362917 containerd[1576]: 2025-09-05 23:59:56.353 [INFO][4101] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 5 23:59:56.362917 containerd[1576]: 2025-09-05 23:59:56.356 [INFO][4092] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="53969e86fe5a8a6a8831c6b4f2b68d2302e067d9414418b76ab2596ce90d5856" Sep 5 23:59:56.364283 containerd[1576]: time="2025-09-05T23:59:56.363184727Z" level=info msg="TearDown network for sandbox \"53969e86fe5a8a6a8831c6b4f2b68d2302e067d9414418b76ab2596ce90d5856\" successfully" Sep 5 23:59:56.364283 containerd[1576]: time="2025-09-05T23:59:56.363215088Z" level=info msg="StopPodSandbox for \"53969e86fe5a8a6a8831c6b4f2b68d2302e067d9414418b76ab2596ce90d5856\" returns successfully" Sep 5 23:59:56.434745 kubelet[2774]: I0905 23:59:56.433832 2774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h9kk9\" (UniqueName: \"kubernetes.io/projected/6155f6a9-1ed8-4cf8-a81d-799ccb4af958-kube-api-access-h9kk9\") pod \"6155f6a9-1ed8-4cf8-a81d-799ccb4af958\" (UID: \"6155f6a9-1ed8-4cf8-a81d-799ccb4af958\") " Sep 5 23:59:56.434745 kubelet[2774]: I0905 23:59:56.433885 2774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: 
\"kubernetes.io/secret/6155f6a9-1ed8-4cf8-a81d-799ccb4af958-whisker-backend-key-pair\") pod \"6155f6a9-1ed8-4cf8-a81d-799ccb4af958\" (UID: \"6155f6a9-1ed8-4cf8-a81d-799ccb4af958\") " Sep 5 23:59:56.434745 kubelet[2774]: I0905 23:59:56.433905 2774 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6155f6a9-1ed8-4cf8-a81d-799ccb4af958-whisker-ca-bundle\") pod \"6155f6a9-1ed8-4cf8-a81d-799ccb4af958\" (UID: \"6155f6a9-1ed8-4cf8-a81d-799ccb4af958\") " Sep 5 23:59:56.438349 kubelet[2774]: I0905 23:59:56.438041 2774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6155f6a9-1ed8-4cf8-a81d-799ccb4af958-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "6155f6a9-1ed8-4cf8-a81d-799ccb4af958" (UID: "6155f6a9-1ed8-4cf8-a81d-799ccb4af958"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 5 23:59:56.441891 kubelet[2774]: I0905 23:59:56.441835 2774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6155f6a9-1ed8-4cf8-a81d-799ccb4af958-kube-api-access-h9kk9" (OuterVolumeSpecName: "kube-api-access-h9kk9") pod "6155f6a9-1ed8-4cf8-a81d-799ccb4af958" (UID: "6155f6a9-1ed8-4cf8-a81d-799ccb4af958"). InnerVolumeSpecName "kube-api-access-h9kk9". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 5 23:59:56.444331 kubelet[2774]: I0905 23:59:56.444268 2774 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6155f6a9-1ed8-4cf8-a81d-799ccb4af958-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "6155f6a9-1ed8-4cf8-a81d-799ccb4af958" (UID: "6155f6a9-1ed8-4cf8-a81d-799ccb4af958"). InnerVolumeSpecName "whisker-backend-key-pair". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 5 23:59:56.535091 kubelet[2774]: I0905 23:59:56.534907 2774 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h9kk9\" (UniqueName: \"kubernetes.io/projected/6155f6a9-1ed8-4cf8-a81d-799ccb4af958-kube-api-access-h9kk9\") on node \"ci-4081-3-5-n-8aba32846f\" DevicePath \"\"" Sep 5 23:59:56.535091 kubelet[2774]: I0905 23:59:56.535019 2774 reconciler_common.go:293] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/6155f6a9-1ed8-4cf8-a81d-799ccb4af958-whisker-backend-key-pair\") on node \"ci-4081-3-5-n-8aba32846f\" DevicePath \"\"" Sep 5 23:59:56.535091 kubelet[2774]: I0905 23:59:56.535050 2774 reconciler_common.go:293] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6155f6a9-1ed8-4cf8-a81d-799ccb4af958-whisker-ca-bundle\") on node \"ci-4081-3-5-n-8aba32846f\" DevicePath \"\"" Sep 5 23:59:56.688041 systemd[1]: run-netns-cni\x2dcfcc05aa\x2d3ef7\x2d1caa\x2d29fb\x2df6db07556716.mount: Deactivated successfully. Sep 5 23:59:56.688257 systemd[1]: var-lib-kubelet-pods-6155f6a9\x2d1ed8\x2d4cf8\x2da81d\x2d799ccb4af958-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dh9kk9.mount: Deactivated successfully. Sep 5 23:59:56.688382 systemd[1]: var-lib-kubelet-pods-6155f6a9\x2d1ed8\x2d4cf8\x2da81d\x2d799ccb4af958-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. 
Sep 5 23:59:56.870790 kubelet[2774]: I0905 23:59:56.869467 2774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-j42nm" podStartSLOduration=3.972127917 podStartE2EDuration="15.86943959s" podCreationTimestamp="2025-09-05 23:59:41 +0000 UTC" firstStartedPulling="2025-09-05 23:59:43.824505025 +0000 UTC m=+25.388129256" lastFinishedPulling="2025-09-05 23:59:55.721816698 +0000 UTC m=+37.285440929" observedRunningTime="2025-09-05 23:59:56.869364709 +0000 UTC m=+38.432988980" watchObservedRunningTime="2025-09-05 23:59:56.86943959 +0000 UTC m=+38.433063861" Sep 5 23:59:57.037877 kubelet[2774]: I0905 23:59:57.037690 2774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q2fsg\" (UniqueName: \"kubernetes.io/projected/c9432df2-44f2-485f-b4e4-be8cff6d49e6-kube-api-access-q2fsg\") pod \"whisker-7bfdf4954b-9mxlp\" (UID: \"c9432df2-44f2-485f-b4e4-be8cff6d49e6\") " pod="calico-system/whisker-7bfdf4954b-9mxlp" Sep 5 23:59:57.037877 kubelet[2774]: I0905 23:59:57.037749 2774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c9432df2-44f2-485f-b4e4-be8cff6d49e6-whisker-ca-bundle\") pod \"whisker-7bfdf4954b-9mxlp\" (UID: \"c9432df2-44f2-485f-b4e4-be8cff6d49e6\") " pod="calico-system/whisker-7bfdf4954b-9mxlp" Sep 5 23:59:57.037877 kubelet[2774]: I0905 23:59:57.037770 2774 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/c9432df2-44f2-485f-b4e4-be8cff6d49e6-whisker-backend-key-pair\") pod \"whisker-7bfdf4954b-9mxlp\" (UID: \"c9432df2-44f2-485f-b4e4-be8cff6d49e6\") " pod="calico-system/whisker-7bfdf4954b-9mxlp" Sep 5 23:59:57.250526 containerd[1576]: time="2025-09-05T23:59:57.250437691Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:whisker-7bfdf4954b-9mxlp,Uid:c9432df2-44f2-485f-b4e4-be8cff6d49e6,Namespace:calico-system,Attempt:0,}" Sep 5 23:59:57.418582 systemd-networkd[1232]: calif5c4adf8117: Link UP Sep 5 23:59:57.419925 systemd-networkd[1232]: calif5c4adf8117: Gained carrier Sep 5 23:59:57.447158 containerd[1576]: 2025-09-05 23:59:57.297 [INFO][4124] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 5 23:59:57.447158 containerd[1576]: 2025-09-05 23:59:57.316 [INFO][4124] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4081--3--5--n--8aba32846f-k8s-whisker--7bfdf4954b--9mxlp-eth0 whisker-7bfdf4954b- calico-system c9432df2-44f2-485f-b4e4-be8cff6d49e6 908 0 2025-09-05 23:59:56 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:7bfdf4954b projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s ci-4081-3-5-n-8aba32846f whisker-7bfdf4954b-9mxlp eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] calif5c4adf8117 [] [] }} ContainerID="54c1ccc5a55378d0cf631b6ce668b97f3524df41d638527e9f6899899ab209be" Namespace="calico-system" Pod="whisker-7bfdf4954b-9mxlp" WorkloadEndpoint="ci--4081--3--5--n--8aba32846f-k8s-whisker--7bfdf4954b--9mxlp-" Sep 5 23:59:57.447158 containerd[1576]: 2025-09-05 23:59:57.316 [INFO][4124] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="54c1ccc5a55378d0cf631b6ce668b97f3524df41d638527e9f6899899ab209be" Namespace="calico-system" Pod="whisker-7bfdf4954b-9mxlp" WorkloadEndpoint="ci--4081--3--5--n--8aba32846f-k8s-whisker--7bfdf4954b--9mxlp-eth0" Sep 5 23:59:57.447158 containerd[1576]: 2025-09-05 23:59:57.347 [INFO][4135] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="54c1ccc5a55378d0cf631b6ce668b97f3524df41d638527e9f6899899ab209be" 
HandleID="k8s-pod-network.54c1ccc5a55378d0cf631b6ce668b97f3524df41d638527e9f6899899ab209be" Workload="ci--4081--3--5--n--8aba32846f-k8s-whisker--7bfdf4954b--9mxlp-eth0" Sep 5 23:59:57.447158 containerd[1576]: 2025-09-05 23:59:57.348 [INFO][4135] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="54c1ccc5a55378d0cf631b6ce668b97f3524df41d638527e9f6899899ab209be" HandleID="k8s-pod-network.54c1ccc5a55378d0cf631b6ce668b97f3524df41d638527e9f6899899ab209be" Workload="ci--4081--3--5--n--8aba32846f-k8s-whisker--7bfdf4954b--9mxlp-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002d3600), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4081-3-5-n-8aba32846f", "pod":"whisker-7bfdf4954b-9mxlp", "timestamp":"2025-09-05 23:59:57.347968018 +0000 UTC"}, Hostname:"ci-4081-3-5-n-8aba32846f", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 5 23:59:57.447158 containerd[1576]: 2025-09-05 23:59:57.348 [INFO][4135] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 5 23:59:57.447158 containerd[1576]: 2025-09-05 23:59:57.348 [INFO][4135] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 5 23:59:57.447158 containerd[1576]: 2025-09-05 23:59:57.348 [INFO][4135] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4081-3-5-n-8aba32846f' Sep 5 23:59:57.447158 containerd[1576]: 2025-09-05 23:59:57.359 [INFO][4135] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.54c1ccc5a55378d0cf631b6ce668b97f3524df41d638527e9f6899899ab209be" host="ci-4081-3-5-n-8aba32846f" Sep 5 23:59:57.447158 containerd[1576]: 2025-09-05 23:59:57.368 [INFO][4135] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4081-3-5-n-8aba32846f" Sep 5 23:59:57.447158 containerd[1576]: 2025-09-05 23:59:57.379 [INFO][4135] ipam/ipam.go 511: Trying affinity for 192.168.74.0/26 host="ci-4081-3-5-n-8aba32846f" Sep 5 23:59:57.447158 containerd[1576]: 2025-09-05 23:59:57.385 [INFO][4135] ipam/ipam.go 158: Attempting to load block cidr=192.168.74.0/26 host="ci-4081-3-5-n-8aba32846f" Sep 5 23:59:57.447158 containerd[1576]: 2025-09-05 23:59:57.388 [INFO][4135] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.74.0/26 host="ci-4081-3-5-n-8aba32846f" Sep 5 23:59:57.447158 containerd[1576]: 2025-09-05 23:59:57.388 [INFO][4135] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.74.0/26 handle="k8s-pod-network.54c1ccc5a55378d0cf631b6ce668b97f3524df41d638527e9f6899899ab209be" host="ci-4081-3-5-n-8aba32846f" Sep 5 23:59:57.447158 containerd[1576]: 2025-09-05 23:59:57.390 [INFO][4135] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.54c1ccc5a55378d0cf631b6ce668b97f3524df41d638527e9f6899899ab209be Sep 5 23:59:57.447158 containerd[1576]: 2025-09-05 23:59:57.395 [INFO][4135] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.74.0/26 handle="k8s-pod-network.54c1ccc5a55378d0cf631b6ce668b97f3524df41d638527e9f6899899ab209be" host="ci-4081-3-5-n-8aba32846f" Sep 5 23:59:57.447158 containerd[1576]: 2025-09-05 23:59:57.405 [INFO][4135] ipam/ipam.go 1256: Successfully claimed IPs: 
[192.168.74.1/26] block=192.168.74.0/26 handle="k8s-pod-network.54c1ccc5a55378d0cf631b6ce668b97f3524df41d638527e9f6899899ab209be" host="ci-4081-3-5-n-8aba32846f" Sep 5 23:59:57.447158 containerd[1576]: 2025-09-05 23:59:57.405 [INFO][4135] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.74.1/26] handle="k8s-pod-network.54c1ccc5a55378d0cf631b6ce668b97f3524df41d638527e9f6899899ab209be" host="ci-4081-3-5-n-8aba32846f" Sep 5 23:59:57.447158 containerd[1576]: 2025-09-05 23:59:57.406 [INFO][4135] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 5 23:59:57.447158 containerd[1576]: 2025-09-05 23:59:57.406 [INFO][4135] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.74.1/26] IPv6=[] ContainerID="54c1ccc5a55378d0cf631b6ce668b97f3524df41d638527e9f6899899ab209be" HandleID="k8s-pod-network.54c1ccc5a55378d0cf631b6ce668b97f3524df41d638527e9f6899899ab209be" Workload="ci--4081--3--5--n--8aba32846f-k8s-whisker--7bfdf4954b--9mxlp-eth0" Sep 5 23:59:57.448400 containerd[1576]: 2025-09-05 23:59:57.408 [INFO][4124] cni-plugin/k8s.go 418: Populated endpoint ContainerID="54c1ccc5a55378d0cf631b6ce668b97f3524df41d638527e9f6899899ab209be" Namespace="calico-system" Pod="whisker-7bfdf4954b-9mxlp" WorkloadEndpoint="ci--4081--3--5--n--8aba32846f-k8s-whisker--7bfdf4954b--9mxlp-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--5--n--8aba32846f-k8s-whisker--7bfdf4954b--9mxlp-eth0", GenerateName:"whisker-7bfdf4954b-", Namespace:"calico-system", SelfLink:"", UID:"c9432df2-44f2-485f-b4e4-be8cff6d49e6", ResourceVersion:"908", Generation:0, CreationTimestamp:time.Date(2025, time.September, 5, 23, 59, 56, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"7bfdf4954b", "projectcalico.org/namespace":"calico-system", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-5-n-8aba32846f", ContainerID:"", Pod:"whisker-7bfdf4954b-9mxlp", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.74.1/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"calif5c4adf8117", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 5 23:59:57.448400 containerd[1576]: 2025-09-05 23:59:57.409 [INFO][4124] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.74.1/32] ContainerID="54c1ccc5a55378d0cf631b6ce668b97f3524df41d638527e9f6899899ab209be" Namespace="calico-system" Pod="whisker-7bfdf4954b-9mxlp" WorkloadEndpoint="ci--4081--3--5--n--8aba32846f-k8s-whisker--7bfdf4954b--9mxlp-eth0" Sep 5 23:59:57.448400 containerd[1576]: 2025-09-05 23:59:57.409 [INFO][4124] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calif5c4adf8117 ContainerID="54c1ccc5a55378d0cf631b6ce668b97f3524df41d638527e9f6899899ab209be" Namespace="calico-system" Pod="whisker-7bfdf4954b-9mxlp" WorkloadEndpoint="ci--4081--3--5--n--8aba32846f-k8s-whisker--7bfdf4954b--9mxlp-eth0" Sep 5 23:59:57.448400 containerd[1576]: 2025-09-05 23:59:57.421 [INFO][4124] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="54c1ccc5a55378d0cf631b6ce668b97f3524df41d638527e9f6899899ab209be" Namespace="calico-system" Pod="whisker-7bfdf4954b-9mxlp" WorkloadEndpoint="ci--4081--3--5--n--8aba32846f-k8s-whisker--7bfdf4954b--9mxlp-eth0" Sep 5 23:59:57.448400 containerd[1576]: 2025-09-05 23:59:57.422 [INFO][4124] cni-plugin/k8s.go 446: Added Mac, interface name, and active container 
ID to endpoint ContainerID="54c1ccc5a55378d0cf631b6ce668b97f3524df41d638527e9f6899899ab209be" Namespace="calico-system" Pod="whisker-7bfdf4954b-9mxlp" WorkloadEndpoint="ci--4081--3--5--n--8aba32846f-k8s-whisker--7bfdf4954b--9mxlp-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--5--n--8aba32846f-k8s-whisker--7bfdf4954b--9mxlp-eth0", GenerateName:"whisker-7bfdf4954b-", Namespace:"calico-system", SelfLink:"", UID:"c9432df2-44f2-485f-b4e4-be8cff6d49e6", ResourceVersion:"908", Generation:0, CreationTimestamp:time.Date(2025, time.September, 5, 23, 59, 56, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"7bfdf4954b", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-5-n-8aba32846f", ContainerID:"54c1ccc5a55378d0cf631b6ce668b97f3524df41d638527e9f6899899ab209be", Pod:"whisker-7bfdf4954b-9mxlp", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.74.1/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"calif5c4adf8117", MAC:"72:74:87:21:fd:c4", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 5 23:59:57.448400 containerd[1576]: 2025-09-05 23:59:57.442 [INFO][4124] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="54c1ccc5a55378d0cf631b6ce668b97f3524df41d638527e9f6899899ab209be" Namespace="calico-system" 
Pod="whisker-7bfdf4954b-9mxlp" WorkloadEndpoint="ci--4081--3--5--n--8aba32846f-k8s-whisker--7bfdf4954b--9mxlp-eth0" Sep 5 23:59:57.469281 containerd[1576]: time="2025-09-05T23:59:57.469159976Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Sep 5 23:59:57.469281 containerd[1576]: time="2025-09-05T23:59:57.469231697Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Sep 5 23:59:57.469281 containerd[1576]: time="2025-09-05T23:59:57.469242897Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 5 23:59:57.469567 containerd[1576]: time="2025-09-05T23:59:57.469372059Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 5 23:59:57.610528 containerd[1576]: time="2025-09-05T23:59:57.610484072Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-7bfdf4954b-9mxlp,Uid:c9432df2-44f2-485f-b4e4-be8cff6d49e6,Namespace:calico-system,Attempt:0,} returns sandbox id \"54c1ccc5a55378d0cf631b6ce668b97f3524df41d638527e9f6899899ab209be\"" Sep 5 23:59:57.615049 containerd[1576]: time="2025-09-05T23:59:57.614992179Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.3\"" Sep 5 23:59:58.209395 kubelet[2774]: I0905 23:59:58.207885 2774 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 5 23:59:58.593959 kubelet[2774]: I0905 23:59:58.593830 2774 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6155f6a9-1ed8-4cf8-a81d-799ccb4af958" path="/var/lib/kubelet/pods/6155f6a9-1ed8-4cf8-a81d-799ccb4af958/volumes" Sep 5 23:59:58.697137 systemd-networkd[1232]: calif5c4adf8117: Gained IPv6LL Sep 5 23:59:59.242737 containerd[1576]: time="2025-09-05T23:59:59.241211195Z" level=info msg="ImageCreate event 
name:\"ghcr.io/flatcar/calico/whisker:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 5 23:59:59.244381 containerd[1576]: time="2025-09-05T23:59:59.242919379Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.3: active requests=0, bytes read=4605606" Sep 5 23:59:59.245910 containerd[1576]: time="2025-09-05T23:59:59.245868421Z" level=info msg="ImageCreate event name:\"sha256:270a0129ec34c3ad6ae6d56c0afce111eb0baa25dfdacb63722ec5887bafd3c5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 5 23:59:59.251219 containerd[1576]: time="2025-09-05T23:59:59.251166415Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker@sha256:e7113761fc7633d515882f0d48b5c8d0b8e62f3f9d34823f2ee194bb16d2ec44\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 5 23:59:59.252468 containerd[1576]: time="2025-09-05T23:59:59.252430073Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker:v3.30.3\" with image id \"sha256:270a0129ec34c3ad6ae6d56c0afce111eb0baa25dfdacb63722ec5887bafd3c5\", repo tag \"ghcr.io/flatcar/calico/whisker:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/whisker@sha256:e7113761fc7633d515882f0d48b5c8d0b8e62f3f9d34823f2ee194bb16d2ec44\", size \"5974839\" in 1.636324838s" Sep 5 23:59:59.253035 containerd[1576]: time="2025-09-05T23:59:59.252973721Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.3\" returns image reference \"sha256:270a0129ec34c3ad6ae6d56c0afce111eb0baa25dfdacb63722ec5887bafd3c5\"" Sep 5 23:59:59.294028 containerd[1576]: time="2025-09-05T23:59:59.293970299Z" level=info msg="CreateContainer within sandbox \"54c1ccc5a55378d0cf631b6ce668b97f3524df41d638527e9f6899899ab209be\" for container &ContainerMetadata{Name:whisker,Attempt:0,}" Sep 5 23:59:59.313345 containerd[1576]: time="2025-09-05T23:59:59.313297652Z" level=info msg="CreateContainer within sandbox \"54c1ccc5a55378d0cf631b6ce668b97f3524df41d638527e9f6899899ab209be\" for 
&ContainerMetadata{Name:whisker,Attempt:0,} returns container id \"db11db440d980f056d77cbd9970e927abeab131137525d04a78526d809db94df\"" Sep 5 23:59:59.323117 containerd[1576]: time="2025-09-05T23:59:59.323073749Z" level=info msg="StartContainer for \"db11db440d980f056d77cbd9970e927abeab131137525d04a78526d809db94df\"" Sep 5 23:59:59.353644 kernel: bpftool[4403]: memfd_create() called without MFD_EXEC or MFD_NOEXEC_SEAL set Sep 5 23:59:59.411030 containerd[1576]: time="2025-09-05T23:59:59.410859868Z" level=info msg="StartContainer for \"db11db440d980f056d77cbd9970e927abeab131137525d04a78526d809db94df\" returns successfully" Sep 5 23:59:59.413802 containerd[1576]: time="2025-09-05T23:59:59.413764068Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\"" Sep 5 23:59:59.569543 systemd-networkd[1232]: vxlan.calico: Link UP Sep 5 23:59:59.569550 systemd-networkd[1232]: vxlan.calico: Gained carrier Sep 6 00:00:01.449079 systemd-networkd[1232]: vxlan.calico: Gained IPv6LL Sep 6 00:00:02.607132 containerd[1576]: time="2025-09-06T00:00:02.605459880Z" level=info msg="StopPodSandbox for \"1d8fb26fb70218a9684f0fa3cd08055d57f41f1fa4c6ef7ec10ecc78fb147481\"" Sep 6 00:00:02.608732 containerd[1576]: time="2025-09-06T00:00:02.608431839Z" level=info msg="StopPodSandbox for \"dff34d639c23db62cbab5d00bb41314fd2337243dc0d51252d062136b8636847\"" Sep 6 00:00:02.766913 containerd[1576]: 2025-09-06 00:00:02.706 [INFO][4519] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="1d8fb26fb70218a9684f0fa3cd08055d57f41f1fa4c6ef7ec10ecc78fb147481" Sep 6 00:00:02.766913 containerd[1576]: 2025-09-06 00:00:02.707 [INFO][4519] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. 
ContainerID="1d8fb26fb70218a9684f0fa3cd08055d57f41f1fa4c6ef7ec10ecc78fb147481" iface="eth0" netns="/var/run/netns/cni-35d87c84-8338-cf61-1acc-8a8a3299a6e8" Sep 6 00:00:02.766913 containerd[1576]: 2025-09-06 00:00:02.708 [INFO][4519] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="1d8fb26fb70218a9684f0fa3cd08055d57f41f1fa4c6ef7ec10ecc78fb147481" iface="eth0" netns="/var/run/netns/cni-35d87c84-8338-cf61-1acc-8a8a3299a6e8" Sep 6 00:00:02.766913 containerd[1576]: 2025-09-06 00:00:02.709 [INFO][4519] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. ContainerID="1d8fb26fb70218a9684f0fa3cd08055d57f41f1fa4c6ef7ec10ecc78fb147481" iface="eth0" netns="/var/run/netns/cni-35d87c84-8338-cf61-1acc-8a8a3299a6e8" Sep 6 00:00:02.766913 containerd[1576]: 2025-09-06 00:00:02.710 [INFO][4519] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="1d8fb26fb70218a9684f0fa3cd08055d57f41f1fa4c6ef7ec10ecc78fb147481" Sep 6 00:00:02.766913 containerd[1576]: 2025-09-06 00:00:02.711 [INFO][4519] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="1d8fb26fb70218a9684f0fa3cd08055d57f41f1fa4c6ef7ec10ecc78fb147481" Sep 6 00:00:02.766913 containerd[1576]: 2025-09-06 00:00:02.738 [INFO][4535] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="1d8fb26fb70218a9684f0fa3cd08055d57f41f1fa4c6ef7ec10ecc78fb147481" HandleID="k8s-pod-network.1d8fb26fb70218a9684f0fa3cd08055d57f41f1fa4c6ef7ec10ecc78fb147481" Workload="ci--4081--3--5--n--8aba32846f-k8s-calico--apiserver--76445ff9b6--n8cgl-eth0" Sep 6 00:00:02.766913 containerd[1576]: 2025-09-06 00:00:02.739 [INFO][4535] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 6 00:00:02.766913 containerd[1576]: 2025-09-06 00:00:02.739 [INFO][4535] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 6 00:00:02.766913 containerd[1576]: 2025-09-06 00:00:02.753 [WARNING][4535] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="1d8fb26fb70218a9684f0fa3cd08055d57f41f1fa4c6ef7ec10ecc78fb147481" HandleID="k8s-pod-network.1d8fb26fb70218a9684f0fa3cd08055d57f41f1fa4c6ef7ec10ecc78fb147481" Workload="ci--4081--3--5--n--8aba32846f-k8s-calico--apiserver--76445ff9b6--n8cgl-eth0" Sep 6 00:00:02.766913 containerd[1576]: 2025-09-06 00:00:02.753 [INFO][4535] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="1d8fb26fb70218a9684f0fa3cd08055d57f41f1fa4c6ef7ec10ecc78fb147481" HandleID="k8s-pod-network.1d8fb26fb70218a9684f0fa3cd08055d57f41f1fa4c6ef7ec10ecc78fb147481" Workload="ci--4081--3--5--n--8aba32846f-k8s-calico--apiserver--76445ff9b6--n8cgl-eth0" Sep 6 00:00:02.766913 containerd[1576]: 2025-09-06 00:00:02.760 [INFO][4535] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 6 00:00:02.766913 containerd[1576]: 2025-09-06 00:00:02.764 [INFO][4519] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="1d8fb26fb70218a9684f0fa3cd08055d57f41f1fa4c6ef7ec10ecc78fb147481" Sep 6 00:00:02.768244 containerd[1576]: time="2025-09-06T00:00:02.767828006Z" level=info msg="TearDown network for sandbox \"1d8fb26fb70218a9684f0fa3cd08055d57f41f1fa4c6ef7ec10ecc78fb147481\" successfully" Sep 6 00:00:02.768244 containerd[1576]: time="2025-09-06T00:00:02.767879887Z" level=info msg="StopPodSandbox for \"1d8fb26fb70218a9684f0fa3cd08055d57f41f1fa4c6ef7ec10ecc78fb147481\" returns successfully" Sep 6 00:00:02.772597 systemd[1]: run-netns-cni\x2d35d87c84\x2d8338\x2dcf61\x2d1acc\x2d8a8a3299a6e8.mount: Deactivated successfully. 
Sep 6 00:00:02.780479 containerd[1576]: time="2025-09-06T00:00:02.780076766Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-76445ff9b6-n8cgl,Uid:8fd1cf87-0065-4b36-b757-559cbde7316b,Namespace:calico-apiserver,Attempt:1,}" Sep 6 00:00:02.782193 systemd[1]: Started logrotate.service - Rotate and Compress System Logs. Sep 6 00:00:02.797065 systemd[1]: logrotate.service: Deactivated successfully. Sep 6 00:00:02.804149 containerd[1576]: 2025-09-06 00:00:02.707 [INFO][4520] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="dff34d639c23db62cbab5d00bb41314fd2337243dc0d51252d062136b8636847" Sep 6 00:00:02.804149 containerd[1576]: 2025-09-06 00:00:02.710 [INFO][4520] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="dff34d639c23db62cbab5d00bb41314fd2337243dc0d51252d062136b8636847" iface="eth0" netns="/var/run/netns/cni-398b5652-0c7d-6b8c-ac3a-26382037f75d" Sep 6 00:00:02.804149 containerd[1576]: 2025-09-06 00:00:02.711 [INFO][4520] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="dff34d639c23db62cbab5d00bb41314fd2337243dc0d51252d062136b8636847" iface="eth0" netns="/var/run/netns/cni-398b5652-0c7d-6b8c-ac3a-26382037f75d" Sep 6 00:00:02.804149 containerd[1576]: 2025-09-06 00:00:02.711 [INFO][4520] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. 
ContainerID="dff34d639c23db62cbab5d00bb41314fd2337243dc0d51252d062136b8636847" iface="eth0" netns="/var/run/netns/cni-398b5652-0c7d-6b8c-ac3a-26382037f75d" Sep 6 00:00:02.804149 containerd[1576]: 2025-09-06 00:00:02.711 [INFO][4520] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="dff34d639c23db62cbab5d00bb41314fd2337243dc0d51252d062136b8636847" Sep 6 00:00:02.804149 containerd[1576]: 2025-09-06 00:00:02.711 [INFO][4520] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="dff34d639c23db62cbab5d00bb41314fd2337243dc0d51252d062136b8636847" Sep 6 00:00:02.804149 containerd[1576]: 2025-09-06 00:00:02.739 [INFO][4536] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="dff34d639c23db62cbab5d00bb41314fd2337243dc0d51252d062136b8636847" HandleID="k8s-pod-network.dff34d639c23db62cbab5d00bb41314fd2337243dc0d51252d062136b8636847" Workload="ci--4081--3--5--n--8aba32846f-k8s-coredns--7c65d6cfc9--8dbj7-eth0" Sep 6 00:00:02.804149 containerd[1576]: 2025-09-06 00:00:02.739 [INFO][4536] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 6 00:00:02.804149 containerd[1576]: 2025-09-06 00:00:02.760 [INFO][4536] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 6 00:00:02.804149 containerd[1576]: 2025-09-06 00:00:02.787 [WARNING][4536] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="dff34d639c23db62cbab5d00bb41314fd2337243dc0d51252d062136b8636847" HandleID="k8s-pod-network.dff34d639c23db62cbab5d00bb41314fd2337243dc0d51252d062136b8636847" Workload="ci--4081--3--5--n--8aba32846f-k8s-coredns--7c65d6cfc9--8dbj7-eth0" Sep 6 00:00:02.804149 containerd[1576]: 2025-09-06 00:00:02.787 [INFO][4536] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="dff34d639c23db62cbab5d00bb41314fd2337243dc0d51252d062136b8636847" HandleID="k8s-pod-network.dff34d639c23db62cbab5d00bb41314fd2337243dc0d51252d062136b8636847" Workload="ci--4081--3--5--n--8aba32846f-k8s-coredns--7c65d6cfc9--8dbj7-eth0" Sep 6 00:00:02.804149 containerd[1576]: 2025-09-06 00:00:02.792 [INFO][4536] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 6 00:00:02.804149 containerd[1576]: 2025-09-06 00:00:02.799 [INFO][4520] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="dff34d639c23db62cbab5d00bb41314fd2337243dc0d51252d062136b8636847" Sep 6 00:00:02.809376 containerd[1576]: time="2025-09-06T00:00:02.804715369Z" level=info msg="TearDown network for sandbox \"dff34d639c23db62cbab5d00bb41314fd2337243dc0d51252d062136b8636847\" successfully" Sep 6 00:00:02.809376 containerd[1576]: time="2025-09-06T00:00:02.804760250Z" level=info msg="StopPodSandbox for \"dff34d639c23db62cbab5d00bb41314fd2337243dc0d51252d062136b8636847\" returns successfully" Sep 6 00:00:02.808323 systemd[1]: run-netns-cni\x2d398b5652\x2d0c7d\x2d6b8c\x2dac3a\x2d26382037f75d.mount: Deactivated successfully. 
Sep 6 00:00:02.811749 containerd[1576]: time="2025-09-06T00:00:02.811712861Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-8dbj7,Uid:e137dec7-0dff-4776-b673-c2d533bf21f9,Namespace:kube-system,Attempt:1,}" Sep 6 00:00:03.027587 systemd-networkd[1232]: calib992151005b: Link UP Sep 6 00:00:03.031864 systemd-networkd[1232]: calib992151005b: Gained carrier Sep 6 00:00:03.067354 containerd[1576]: 2025-09-06 00:00:02.874 [INFO][4552] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4081--3--5--n--8aba32846f-k8s-calico--apiserver--76445ff9b6--n8cgl-eth0 calico-apiserver-76445ff9b6- calico-apiserver 8fd1cf87-0065-4b36-b757-559cbde7316b 939 0 2025-09-05 23:59:36 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:76445ff9b6 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4081-3-5-n-8aba32846f calico-apiserver-76445ff9b6-n8cgl eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calib992151005b [] [] }} ContainerID="aec6df2d8f9fd860a7fd99b4ffd502270992208348b6984be61845d928b36bce" Namespace="calico-apiserver" Pod="calico-apiserver-76445ff9b6-n8cgl" WorkloadEndpoint="ci--4081--3--5--n--8aba32846f-k8s-calico--apiserver--76445ff9b6--n8cgl-" Sep 6 00:00:03.067354 containerd[1576]: 2025-09-06 00:00:02.875 [INFO][4552] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="aec6df2d8f9fd860a7fd99b4ffd502270992208348b6984be61845d928b36bce" Namespace="calico-apiserver" Pod="calico-apiserver-76445ff9b6-n8cgl" WorkloadEndpoint="ci--4081--3--5--n--8aba32846f-k8s-calico--apiserver--76445ff9b6--n8cgl-eth0" Sep 6 00:00:03.067354 containerd[1576]: 2025-09-06 00:00:02.930 [INFO][4573] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 
ContainerID="aec6df2d8f9fd860a7fd99b4ffd502270992208348b6984be61845d928b36bce" HandleID="k8s-pod-network.aec6df2d8f9fd860a7fd99b4ffd502270992208348b6984be61845d928b36bce" Workload="ci--4081--3--5--n--8aba32846f-k8s-calico--apiserver--76445ff9b6--n8cgl-eth0" Sep 6 00:00:03.067354 containerd[1576]: 2025-09-06 00:00:02.930 [INFO][4573] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="aec6df2d8f9fd860a7fd99b4ffd502270992208348b6984be61845d928b36bce" HandleID="k8s-pod-network.aec6df2d8f9fd860a7fd99b4ffd502270992208348b6984be61845d928b36bce" Workload="ci--4081--3--5--n--8aba32846f-k8s-calico--apiserver--76445ff9b6--n8cgl-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002d3600), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4081-3-5-n-8aba32846f", "pod":"calico-apiserver-76445ff9b6-n8cgl", "timestamp":"2025-09-06 00:00:02.930362694 +0000 UTC"}, Hostname:"ci-4081-3-5-n-8aba32846f", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 6 00:00:03.067354 containerd[1576]: 2025-09-06 00:00:02.930 [INFO][4573] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 6 00:00:03.067354 containerd[1576]: 2025-09-06 00:00:02.930 [INFO][4573] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 6 00:00:03.067354 containerd[1576]: 2025-09-06 00:00:02.930 [INFO][4573] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4081-3-5-n-8aba32846f' Sep 6 00:00:03.067354 containerd[1576]: 2025-09-06 00:00:02.944 [INFO][4573] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.aec6df2d8f9fd860a7fd99b4ffd502270992208348b6984be61845d928b36bce" host="ci-4081-3-5-n-8aba32846f" Sep 6 00:00:03.067354 containerd[1576]: 2025-09-06 00:00:02.958 [INFO][4573] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4081-3-5-n-8aba32846f" Sep 6 00:00:03.067354 containerd[1576]: 2025-09-06 00:00:02.968 [INFO][4573] ipam/ipam.go 511: Trying affinity for 192.168.74.0/26 host="ci-4081-3-5-n-8aba32846f" Sep 6 00:00:03.067354 containerd[1576]: 2025-09-06 00:00:02.974 [INFO][4573] ipam/ipam.go 158: Attempting to load block cidr=192.168.74.0/26 host="ci-4081-3-5-n-8aba32846f" Sep 6 00:00:03.067354 containerd[1576]: 2025-09-06 00:00:02.981 [INFO][4573] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.74.0/26 host="ci-4081-3-5-n-8aba32846f" Sep 6 00:00:03.067354 containerd[1576]: 2025-09-06 00:00:02.981 [INFO][4573] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.74.0/26 handle="k8s-pod-network.aec6df2d8f9fd860a7fd99b4ffd502270992208348b6984be61845d928b36bce" host="ci-4081-3-5-n-8aba32846f" Sep 6 00:00:03.067354 containerd[1576]: 2025-09-06 00:00:02.984 [INFO][4573] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.aec6df2d8f9fd860a7fd99b4ffd502270992208348b6984be61845d928b36bce Sep 6 00:00:03.067354 containerd[1576]: 2025-09-06 00:00:02.993 [INFO][4573] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.74.0/26 handle="k8s-pod-network.aec6df2d8f9fd860a7fd99b4ffd502270992208348b6984be61845d928b36bce" host="ci-4081-3-5-n-8aba32846f" Sep 6 00:00:03.067354 containerd[1576]: 2025-09-06 00:00:03.012 [INFO][4573] ipam/ipam.go 1256: Successfully claimed IPs: 
[192.168.74.2/26] block=192.168.74.0/26 handle="k8s-pod-network.aec6df2d8f9fd860a7fd99b4ffd502270992208348b6984be61845d928b36bce" host="ci-4081-3-5-n-8aba32846f" Sep 6 00:00:03.067354 containerd[1576]: 2025-09-06 00:00:03.013 [INFO][4573] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.74.2/26] handle="k8s-pod-network.aec6df2d8f9fd860a7fd99b4ffd502270992208348b6984be61845d928b36bce" host="ci-4081-3-5-n-8aba32846f" Sep 6 00:00:03.067354 containerd[1576]: 2025-09-06 00:00:03.013 [INFO][4573] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 6 00:00:03.067354 containerd[1576]: 2025-09-06 00:00:03.013 [INFO][4573] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.74.2/26] IPv6=[] ContainerID="aec6df2d8f9fd860a7fd99b4ffd502270992208348b6984be61845d928b36bce" HandleID="k8s-pod-network.aec6df2d8f9fd860a7fd99b4ffd502270992208348b6984be61845d928b36bce" Workload="ci--4081--3--5--n--8aba32846f-k8s-calico--apiserver--76445ff9b6--n8cgl-eth0" Sep 6 00:00:03.069505 containerd[1576]: 2025-09-06 00:00:03.016 [INFO][4552] cni-plugin/k8s.go 418: Populated endpoint ContainerID="aec6df2d8f9fd860a7fd99b4ffd502270992208348b6984be61845d928b36bce" Namespace="calico-apiserver" Pod="calico-apiserver-76445ff9b6-n8cgl" WorkloadEndpoint="ci--4081--3--5--n--8aba32846f-k8s-calico--apiserver--76445ff9b6--n8cgl-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--5--n--8aba32846f-k8s-calico--apiserver--76445ff9b6--n8cgl-eth0", GenerateName:"calico-apiserver-76445ff9b6-", Namespace:"calico-apiserver", SelfLink:"", UID:"8fd1cf87-0065-4b36-b757-559cbde7316b", ResourceVersion:"939", Generation:0, CreationTimestamp:time.Date(2025, time.September, 5, 23, 59, 36, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", 
"k8s-app":"calico-apiserver", "pod-template-hash":"76445ff9b6", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-5-n-8aba32846f", ContainerID:"", Pod:"calico-apiserver-76445ff9b6-n8cgl", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.74.2/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calib992151005b", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 6 00:00:03.069505 containerd[1576]: 2025-09-06 00:00:03.016 [INFO][4552] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.74.2/32] ContainerID="aec6df2d8f9fd860a7fd99b4ffd502270992208348b6984be61845d928b36bce" Namespace="calico-apiserver" Pod="calico-apiserver-76445ff9b6-n8cgl" WorkloadEndpoint="ci--4081--3--5--n--8aba32846f-k8s-calico--apiserver--76445ff9b6--n8cgl-eth0" Sep 6 00:00:03.069505 containerd[1576]: 2025-09-06 00:00:03.016 [INFO][4552] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calib992151005b ContainerID="aec6df2d8f9fd860a7fd99b4ffd502270992208348b6984be61845d928b36bce" Namespace="calico-apiserver" Pod="calico-apiserver-76445ff9b6-n8cgl" WorkloadEndpoint="ci--4081--3--5--n--8aba32846f-k8s-calico--apiserver--76445ff9b6--n8cgl-eth0" Sep 6 00:00:03.069505 containerd[1576]: 2025-09-06 00:00:03.031 [INFO][4552] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="aec6df2d8f9fd860a7fd99b4ffd502270992208348b6984be61845d928b36bce" Namespace="calico-apiserver" Pod="calico-apiserver-76445ff9b6-n8cgl" 
WorkloadEndpoint="ci--4081--3--5--n--8aba32846f-k8s-calico--apiserver--76445ff9b6--n8cgl-eth0" Sep 6 00:00:03.069505 containerd[1576]: 2025-09-06 00:00:03.036 [INFO][4552] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="aec6df2d8f9fd860a7fd99b4ffd502270992208348b6984be61845d928b36bce" Namespace="calico-apiserver" Pod="calico-apiserver-76445ff9b6-n8cgl" WorkloadEndpoint="ci--4081--3--5--n--8aba32846f-k8s-calico--apiserver--76445ff9b6--n8cgl-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--5--n--8aba32846f-k8s-calico--apiserver--76445ff9b6--n8cgl-eth0", GenerateName:"calico-apiserver-76445ff9b6-", Namespace:"calico-apiserver", SelfLink:"", UID:"8fd1cf87-0065-4b36-b757-559cbde7316b", ResourceVersion:"939", Generation:0, CreationTimestamp:time.Date(2025, time.September, 5, 23, 59, 36, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"76445ff9b6", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-5-n-8aba32846f", ContainerID:"aec6df2d8f9fd860a7fd99b4ffd502270992208348b6984be61845d928b36bce", Pod:"calico-apiserver-76445ff9b6-n8cgl", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.74.2/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calib992151005b", MAC:"ce:eb:c7:de:cd:00", 
Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 6 00:00:03.069505 containerd[1576]: 2025-09-06 00:00:03.059 [INFO][4552] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="aec6df2d8f9fd860a7fd99b4ffd502270992208348b6984be61845d928b36bce" Namespace="calico-apiserver" Pod="calico-apiserver-76445ff9b6-n8cgl" WorkloadEndpoint="ci--4081--3--5--n--8aba32846f-k8s-calico--apiserver--76445ff9b6--n8cgl-eth0" Sep 6 00:00:03.101511 containerd[1576]: time="2025-09-06T00:00:03.101360661Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Sep 6 00:00:03.101511 containerd[1576]: time="2025-09-06T00:00:03.101455302Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Sep 6 00:00:03.101511 containerd[1576]: time="2025-09-06T00:00:03.101476943Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 6 00:00:03.101969 containerd[1576]: time="2025-09-06T00:00:03.101594864Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 6 00:00:03.144780 systemd-networkd[1232]: calif4039490265: Link UP Sep 6 00:00:03.147450 systemd-networkd[1232]: calif4039490265: Gained carrier Sep 6 00:00:03.185656 containerd[1576]: 2025-09-06 00:00:02.940 [INFO][4568] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4081--3--5--n--8aba32846f-k8s-coredns--7c65d6cfc9--8dbj7-eth0 coredns-7c65d6cfc9- kube-system e137dec7-0dff-4776-b673-c2d533bf21f9 940 0 2025-09-05 23:59:25 +0000 UTC map[k8s-app:kube-dns pod-template-hash:7c65d6cfc9 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4081-3-5-n-8aba32846f coredns-7c65d6cfc9-8dbj7 eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] calif4039490265 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="681e4504d7d3f5533eee66e14d99af48d1fb110f75de8e10c58e0721cc0cbb1d" Namespace="kube-system" Pod="coredns-7c65d6cfc9-8dbj7" WorkloadEndpoint="ci--4081--3--5--n--8aba32846f-k8s-coredns--7c65d6cfc9--8dbj7-" Sep 6 00:00:03.185656 containerd[1576]: 2025-09-06 00:00:02.940 [INFO][4568] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="681e4504d7d3f5533eee66e14d99af48d1fb110f75de8e10c58e0721cc0cbb1d" Namespace="kube-system" Pod="coredns-7c65d6cfc9-8dbj7" WorkloadEndpoint="ci--4081--3--5--n--8aba32846f-k8s-coredns--7c65d6cfc9--8dbj7-eth0" Sep 6 00:00:03.185656 containerd[1576]: 2025-09-06 00:00:03.016 [INFO][4583] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="681e4504d7d3f5533eee66e14d99af48d1fb110f75de8e10c58e0721cc0cbb1d" HandleID="k8s-pod-network.681e4504d7d3f5533eee66e14d99af48d1fb110f75de8e10c58e0721cc0cbb1d" Workload="ci--4081--3--5--n--8aba32846f-k8s-coredns--7c65d6cfc9--8dbj7-eth0" Sep 6 00:00:03.185656 containerd[1576]: 2025-09-06 00:00:03.016 [INFO][4583] 
ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="681e4504d7d3f5533eee66e14d99af48d1fb110f75de8e10c58e0721cc0cbb1d" HandleID="k8s-pod-network.681e4504d7d3f5533eee66e14d99af48d1fb110f75de8e10c58e0721cc0cbb1d" Workload="ci--4081--3--5--n--8aba32846f-k8s-coredns--7c65d6cfc9--8dbj7-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002d2ff0), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4081-3-5-n-8aba32846f", "pod":"coredns-7c65d6cfc9-8dbj7", "timestamp":"2025-09-06 00:00:03.016245654 +0000 UTC"}, Hostname:"ci-4081-3-5-n-8aba32846f", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 6 00:00:03.185656 containerd[1576]: 2025-09-06 00:00:03.016 [INFO][4583] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 6 00:00:03.185656 containerd[1576]: 2025-09-06 00:00:03.016 [INFO][4583] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 6 00:00:03.185656 containerd[1576]: 2025-09-06 00:00:03.016 [INFO][4583] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4081-3-5-n-8aba32846f' Sep 6 00:00:03.185656 containerd[1576]: 2025-09-06 00:00:03.051 [INFO][4583] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.681e4504d7d3f5533eee66e14d99af48d1fb110f75de8e10c58e0721cc0cbb1d" host="ci-4081-3-5-n-8aba32846f" Sep 6 00:00:03.185656 containerd[1576]: 2025-09-06 00:00:03.072 [INFO][4583] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4081-3-5-n-8aba32846f" Sep 6 00:00:03.185656 containerd[1576]: 2025-09-06 00:00:03.083 [INFO][4583] ipam/ipam.go 511: Trying affinity for 192.168.74.0/26 host="ci-4081-3-5-n-8aba32846f" Sep 6 00:00:03.185656 containerd[1576]: 2025-09-06 00:00:03.087 [INFO][4583] ipam/ipam.go 158: Attempting to load block cidr=192.168.74.0/26 host="ci-4081-3-5-n-8aba32846f" Sep 6 00:00:03.185656 containerd[1576]: 2025-09-06 00:00:03.097 [INFO][4583] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.74.0/26 host="ci-4081-3-5-n-8aba32846f" Sep 6 00:00:03.185656 containerd[1576]: 2025-09-06 00:00:03.098 [INFO][4583] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.74.0/26 handle="k8s-pod-network.681e4504d7d3f5533eee66e14d99af48d1fb110f75de8e10c58e0721cc0cbb1d" host="ci-4081-3-5-n-8aba32846f" Sep 6 00:00:03.185656 containerd[1576]: 2025-09-06 00:00:03.101 [INFO][4583] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.681e4504d7d3f5533eee66e14d99af48d1fb110f75de8e10c58e0721cc0cbb1d Sep 6 00:00:03.185656 containerd[1576]: 2025-09-06 00:00:03.113 [INFO][4583] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.74.0/26 handle="k8s-pod-network.681e4504d7d3f5533eee66e14d99af48d1fb110f75de8e10c58e0721cc0cbb1d" host="ci-4081-3-5-n-8aba32846f" Sep 6 00:00:03.185656 containerd[1576]: 2025-09-06 00:00:03.127 [INFO][4583] ipam/ipam.go 1256: Successfully claimed IPs: 
[192.168.74.3/26] block=192.168.74.0/26 handle="k8s-pod-network.681e4504d7d3f5533eee66e14d99af48d1fb110f75de8e10c58e0721cc0cbb1d" host="ci-4081-3-5-n-8aba32846f" Sep 6 00:00:03.185656 containerd[1576]: 2025-09-06 00:00:03.127 [INFO][4583] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.74.3/26] handle="k8s-pod-network.681e4504d7d3f5533eee66e14d99af48d1fb110f75de8e10c58e0721cc0cbb1d" host="ci-4081-3-5-n-8aba32846f" Sep 6 00:00:03.185656 containerd[1576]: 2025-09-06 00:00:03.127 [INFO][4583] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 6 00:00:03.185656 containerd[1576]: 2025-09-06 00:00:03.127 [INFO][4583] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.74.3/26] IPv6=[] ContainerID="681e4504d7d3f5533eee66e14d99af48d1fb110f75de8e10c58e0721cc0cbb1d" HandleID="k8s-pod-network.681e4504d7d3f5533eee66e14d99af48d1fb110f75de8e10c58e0721cc0cbb1d" Workload="ci--4081--3--5--n--8aba32846f-k8s-coredns--7c65d6cfc9--8dbj7-eth0" Sep 6 00:00:03.187908 containerd[1576]: 2025-09-06 00:00:03.132 [INFO][4568] cni-plugin/k8s.go 418: Populated endpoint ContainerID="681e4504d7d3f5533eee66e14d99af48d1fb110f75de8e10c58e0721cc0cbb1d" Namespace="kube-system" Pod="coredns-7c65d6cfc9-8dbj7" WorkloadEndpoint="ci--4081--3--5--n--8aba32846f-k8s-coredns--7c65d6cfc9--8dbj7-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--5--n--8aba32846f-k8s-coredns--7c65d6cfc9--8dbj7-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"e137dec7-0dff-4776-b673-c2d533bf21f9", ResourceVersion:"940", Generation:0, CreationTimestamp:time.Date(2025, time.September, 5, 23, 59, 25, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", 
"projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-5-n-8aba32846f", ContainerID:"", Pod:"coredns-7c65d6cfc9-8dbj7", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.74.3/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calif4039490265", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 6 00:00:03.187908 containerd[1576]: 2025-09-06 00:00:03.133 [INFO][4568] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.74.3/32] ContainerID="681e4504d7d3f5533eee66e14d99af48d1fb110f75de8e10c58e0721cc0cbb1d" Namespace="kube-system" Pod="coredns-7c65d6cfc9-8dbj7" WorkloadEndpoint="ci--4081--3--5--n--8aba32846f-k8s-coredns--7c65d6cfc9--8dbj7-eth0" Sep 6 00:00:03.187908 containerd[1576]: 2025-09-06 00:00:03.133 [INFO][4568] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calif4039490265 ContainerID="681e4504d7d3f5533eee66e14d99af48d1fb110f75de8e10c58e0721cc0cbb1d" Namespace="kube-system" Pod="coredns-7c65d6cfc9-8dbj7" WorkloadEndpoint="ci--4081--3--5--n--8aba32846f-k8s-coredns--7c65d6cfc9--8dbj7-eth0" Sep 6 00:00:03.187908 containerd[1576]: 2025-09-06 00:00:03.145 [INFO][4568] cni-plugin/dataplane_linux.go 508: Disabling IPv4 
forwarding ContainerID="681e4504d7d3f5533eee66e14d99af48d1fb110f75de8e10c58e0721cc0cbb1d" Namespace="kube-system" Pod="coredns-7c65d6cfc9-8dbj7" WorkloadEndpoint="ci--4081--3--5--n--8aba32846f-k8s-coredns--7c65d6cfc9--8dbj7-eth0" Sep 6 00:00:03.187908 containerd[1576]: 2025-09-06 00:00:03.146 [INFO][4568] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="681e4504d7d3f5533eee66e14d99af48d1fb110f75de8e10c58e0721cc0cbb1d" Namespace="kube-system" Pod="coredns-7c65d6cfc9-8dbj7" WorkloadEndpoint="ci--4081--3--5--n--8aba32846f-k8s-coredns--7c65d6cfc9--8dbj7-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--5--n--8aba32846f-k8s-coredns--7c65d6cfc9--8dbj7-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"e137dec7-0dff-4776-b673-c2d533bf21f9", ResourceVersion:"940", Generation:0, CreationTimestamp:time.Date(2025, time.September, 5, 23, 59, 25, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-5-n-8aba32846f", ContainerID:"681e4504d7d3f5533eee66e14d99af48d1fb110f75de8e10c58e0721cc0cbb1d", Pod:"coredns-7c65d6cfc9-8dbj7", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.74.3/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calif4039490265", MAC:"e6:8d:34:8b:ea:1a", 
Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 6 00:00:03.187908 containerd[1576]: 2025-09-06 00:00:03.173 [INFO][4568] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="681e4504d7d3f5533eee66e14d99af48d1fb110f75de8e10c58e0721cc0cbb1d" Namespace="kube-system" Pod="coredns-7c65d6cfc9-8dbj7" WorkloadEndpoint="ci--4081--3--5--n--8aba32846f-k8s-coredns--7c65d6cfc9--8dbj7-eth0" Sep 6 00:00:03.225799 containerd[1576]: time="2025-09-06T00:00:03.225686330Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-76445ff9b6-n8cgl,Uid:8fd1cf87-0065-4b36-b757-559cbde7316b,Namespace:calico-apiserver,Attempt:1,} returns sandbox id \"aec6df2d8f9fd860a7fd99b4ffd502270992208348b6984be61845d928b36bce\"" Sep 6 00:00:03.235063 containerd[1576]: time="2025-09-06T00:00:03.232989783Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Sep 6 00:00:03.235063 containerd[1576]: time="2025-09-06T00:00:03.233108104Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Sep 6 00:00:03.235063 containerd[1576]: time="2025-09-06T00:00:03.233123585Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 6 00:00:03.235063 containerd[1576]: time="2025-09-06T00:00:03.233231226Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 6 00:00:03.320345 containerd[1576]: time="2025-09-06T00:00:03.320237458Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-8dbj7,Uid:e137dec7-0dff-4776-b673-c2d533bf21f9,Namespace:kube-system,Attempt:1,} returns sandbox id \"681e4504d7d3f5533eee66e14d99af48d1fb110f75de8e10c58e0721cc0cbb1d\"" Sep 6 00:00:03.325722 containerd[1576]: time="2025-09-06T00:00:03.325395364Z" level=info msg="CreateContainer within sandbox \"681e4504d7d3f5533eee66e14d99af48d1fb110f75de8e10c58e0721cc0cbb1d\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Sep 6 00:00:03.339373 containerd[1576]: time="2025-09-06T00:00:03.339231580Z" level=info msg="CreateContainer within sandbox \"681e4504d7d3f5533eee66e14d99af48d1fb110f75de8e10c58e0721cc0cbb1d\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"a50f8ddb3c23214f2f977deaacfbfc33e1b9bda5d9d646bd62d21812e2c8a40d\"" Sep 6 00:00:03.343449 containerd[1576]: time="2025-09-06T00:00:03.340560237Z" level=info msg="StartContainer for \"a50f8ddb3c23214f2f977deaacfbfc33e1b9bda5d9d646bd62d21812e2c8a40d\"" Sep 6 00:00:03.412644 containerd[1576]: time="2025-09-06T00:00:03.412584918Z" level=info msg="StartContainer for \"a50f8ddb3c23214f2f977deaacfbfc33e1b9bda5d9d646bd62d21812e2c8a40d\" returns successfully" Sep 6 00:00:03.592265 containerd[1576]: time="2025-09-06T00:00:03.592099811Z" level=info msg="StopPodSandbox for \"1e4bb5df844a20df2c4fe926e353f7c7f74664aea15c1506b34b0eb7d0212063\"" Sep 6 00:00:03.782510 containerd[1576]: 2025-09-06 00:00:03.692 [INFO][4741] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="1e4bb5df844a20df2c4fe926e353f7c7f74664aea15c1506b34b0eb7d0212063" Sep 6 00:00:03.782510 containerd[1576]: 2025-09-06 
00:00:03.694 [INFO][4741] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="1e4bb5df844a20df2c4fe926e353f7c7f74664aea15c1506b34b0eb7d0212063" iface="eth0" netns="/var/run/netns/cni-413b9a19-134e-b1dc-93b9-2265763701d2" Sep 6 00:00:03.782510 containerd[1576]: 2025-09-06 00:00:03.694 [INFO][4741] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="1e4bb5df844a20df2c4fe926e353f7c7f74664aea15c1506b34b0eb7d0212063" iface="eth0" netns="/var/run/netns/cni-413b9a19-134e-b1dc-93b9-2265763701d2" Sep 6 00:00:03.782510 containerd[1576]: 2025-09-06 00:00:03.694 [INFO][4741] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. ContainerID="1e4bb5df844a20df2c4fe926e353f7c7f74664aea15c1506b34b0eb7d0212063" iface="eth0" netns="/var/run/netns/cni-413b9a19-134e-b1dc-93b9-2265763701d2" Sep 6 00:00:03.782510 containerd[1576]: 2025-09-06 00:00:03.694 [INFO][4741] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="1e4bb5df844a20df2c4fe926e353f7c7f74664aea15c1506b34b0eb7d0212063" Sep 6 00:00:03.782510 containerd[1576]: 2025-09-06 00:00:03.694 [INFO][4741] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="1e4bb5df844a20df2c4fe926e353f7c7f74664aea15c1506b34b0eb7d0212063" Sep 6 00:00:03.782510 containerd[1576]: 2025-09-06 00:00:03.747 [INFO][4748] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="1e4bb5df844a20df2c4fe926e353f7c7f74664aea15c1506b34b0eb7d0212063" HandleID="k8s-pod-network.1e4bb5df844a20df2c4fe926e353f7c7f74664aea15c1506b34b0eb7d0212063" Workload="ci--4081--3--5--n--8aba32846f-k8s-csi--node--driver--ps2ks-eth0" Sep 6 00:00:03.782510 containerd[1576]: 2025-09-06 00:00:03.747 [INFO][4748] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 6 00:00:03.782510 containerd[1576]: 2025-09-06 00:00:03.748 [INFO][4748] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 6 00:00:03.782510 containerd[1576]: 2025-09-06 00:00:03.764 [WARNING][4748] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="1e4bb5df844a20df2c4fe926e353f7c7f74664aea15c1506b34b0eb7d0212063" HandleID="k8s-pod-network.1e4bb5df844a20df2c4fe926e353f7c7f74664aea15c1506b34b0eb7d0212063" Workload="ci--4081--3--5--n--8aba32846f-k8s-csi--node--driver--ps2ks-eth0" Sep 6 00:00:03.782510 containerd[1576]: 2025-09-06 00:00:03.764 [INFO][4748] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="1e4bb5df844a20df2c4fe926e353f7c7f74664aea15c1506b34b0eb7d0212063" HandleID="k8s-pod-network.1e4bb5df844a20df2c4fe926e353f7c7f74664aea15c1506b34b0eb7d0212063" Workload="ci--4081--3--5--n--8aba32846f-k8s-csi--node--driver--ps2ks-eth0" Sep 6 00:00:03.782510 containerd[1576]: 2025-09-06 00:00:03.769 [INFO][4748] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 6 00:00:03.782510 containerd[1576]: 2025-09-06 00:00:03.777 [INFO][4741] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="1e4bb5df844a20df2c4fe926e353f7c7f74664aea15c1506b34b0eb7d0212063" Sep 6 00:00:03.785545 containerd[1576]: time="2025-09-06T00:00:03.783323935Z" level=info msg="TearDown network for sandbox \"1e4bb5df844a20df2c4fe926e353f7c7f74664aea15c1506b34b0eb7d0212063\" successfully" Sep 6 00:00:03.785545 containerd[1576]: time="2025-09-06T00:00:03.784778873Z" level=info msg="StopPodSandbox for \"1e4bb5df844a20df2c4fe926e353f7c7f74664aea15c1506b34b0eb7d0212063\" returns successfully" Sep 6 00:00:03.787171 containerd[1576]: time="2025-09-06T00:00:03.786066530Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-ps2ks,Uid:4ff97107-cfe4-4d25-918b-36fd4176bf0c,Namespace:calico-system,Attempt:1,}" Sep 6 00:00:03.786698 systemd[1]: run-netns-cni\x2d413b9a19\x2d134e\x2db1dc\x2d93b9\x2d2265763701d2.mount: Deactivated successfully. 
Sep 6 00:00:03.940658 kubelet[2774]: I0906 00:00:03.937523 2774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-7c65d6cfc9-8dbj7" podStartSLOduration=38.937505945 podStartE2EDuration="38.937505945s" podCreationTimestamp="2025-09-05 23:59:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-06 00:00:03.932580122 +0000 UTC m=+45.496204353" watchObservedRunningTime="2025-09-06 00:00:03.937505945 +0000 UTC m=+45.501130136" Sep 6 00:00:04.047786 systemd-networkd[1232]: cali64b00cc420c: Link UP Sep 6 00:00:04.048078 systemd-networkd[1232]: cali64b00cc420c: Gained carrier Sep 6 00:00:04.077298 containerd[1576]: 2025-09-06 00:00:03.866 [INFO][4754] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4081--3--5--n--8aba32846f-k8s-csi--node--driver--ps2ks-eth0 csi-node-driver- calico-system 4ff97107-cfe4-4d25-918b-36fd4176bf0c 954 0 2025-09-05 23:59:42 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:856c6b598f k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s ci-4081-3-5-n-8aba32846f csi-node-driver-ps2ks eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] cali64b00cc420c [] [] }} ContainerID="59eeb191e9847ded5f0b0c50561a32db880df7b37af27b8bfebc65eb26a73e70" Namespace="calico-system" Pod="csi-node-driver-ps2ks" WorkloadEndpoint="ci--4081--3--5--n--8aba32846f-k8s-csi--node--driver--ps2ks-" Sep 6 00:00:04.077298 containerd[1576]: 2025-09-06 00:00:03.866 [INFO][4754] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="59eeb191e9847ded5f0b0c50561a32db880df7b37af27b8bfebc65eb26a73e70" Namespace="calico-system" Pod="csi-node-driver-ps2ks" 
WorkloadEndpoint="ci--4081--3--5--n--8aba32846f-k8s-csi--node--driver--ps2ks-eth0" Sep 6 00:00:04.077298 containerd[1576]: 2025-09-06 00:00:03.907 [INFO][4766] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="59eeb191e9847ded5f0b0c50561a32db880df7b37af27b8bfebc65eb26a73e70" HandleID="k8s-pod-network.59eeb191e9847ded5f0b0c50561a32db880df7b37af27b8bfebc65eb26a73e70" Workload="ci--4081--3--5--n--8aba32846f-k8s-csi--node--driver--ps2ks-eth0" Sep 6 00:00:04.077298 containerd[1576]: 2025-09-06 00:00:03.907 [INFO][4766] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="59eeb191e9847ded5f0b0c50561a32db880df7b37af27b8bfebc65eb26a73e70" HandleID="k8s-pod-network.59eeb191e9847ded5f0b0c50561a32db880df7b37af27b8bfebc65eb26a73e70" Workload="ci--4081--3--5--n--8aba32846f-k8s-csi--node--driver--ps2ks-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002d3130), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4081-3-5-n-8aba32846f", "pod":"csi-node-driver-ps2ks", "timestamp":"2025-09-06 00:00:03.907193557 +0000 UTC"}, Hostname:"ci-4081-3-5-n-8aba32846f", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 6 00:00:04.077298 containerd[1576]: 2025-09-06 00:00:03.908 [INFO][4766] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 6 00:00:04.077298 containerd[1576]: 2025-09-06 00:00:03.908 [INFO][4766] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 6 00:00:04.077298 containerd[1576]: 2025-09-06 00:00:03.910 [INFO][4766] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4081-3-5-n-8aba32846f' Sep 6 00:00:04.077298 containerd[1576]: 2025-09-06 00:00:03.947 [INFO][4766] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.59eeb191e9847ded5f0b0c50561a32db880df7b37af27b8bfebc65eb26a73e70" host="ci-4081-3-5-n-8aba32846f" Sep 6 00:00:04.077298 containerd[1576]: 2025-09-06 00:00:03.972 [INFO][4766] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4081-3-5-n-8aba32846f" Sep 6 00:00:04.077298 containerd[1576]: 2025-09-06 00:00:04.000 [INFO][4766] ipam/ipam.go 511: Trying affinity for 192.168.74.0/26 host="ci-4081-3-5-n-8aba32846f" Sep 6 00:00:04.077298 containerd[1576]: 2025-09-06 00:00:04.012 [INFO][4766] ipam/ipam.go 158: Attempting to load block cidr=192.168.74.0/26 host="ci-4081-3-5-n-8aba32846f" Sep 6 00:00:04.077298 containerd[1576]: 2025-09-06 00:00:04.016 [INFO][4766] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.74.0/26 host="ci-4081-3-5-n-8aba32846f" Sep 6 00:00:04.077298 containerd[1576]: 2025-09-06 00:00:04.016 [INFO][4766] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.74.0/26 handle="k8s-pod-network.59eeb191e9847ded5f0b0c50561a32db880df7b37af27b8bfebc65eb26a73e70" host="ci-4081-3-5-n-8aba32846f" Sep 6 00:00:04.077298 containerd[1576]: 2025-09-06 00:00:04.019 [INFO][4766] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.59eeb191e9847ded5f0b0c50561a32db880df7b37af27b8bfebc65eb26a73e70 Sep 6 00:00:04.077298 containerd[1576]: 2025-09-06 00:00:04.027 [INFO][4766] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.74.0/26 handle="k8s-pod-network.59eeb191e9847ded5f0b0c50561a32db880df7b37af27b8bfebc65eb26a73e70" host="ci-4081-3-5-n-8aba32846f" Sep 6 00:00:04.077298 containerd[1576]: 2025-09-06 00:00:04.039 [INFO][4766] ipam/ipam.go 1256: Successfully claimed IPs: 
[192.168.74.4/26] block=192.168.74.0/26 handle="k8s-pod-network.59eeb191e9847ded5f0b0c50561a32db880df7b37af27b8bfebc65eb26a73e70" host="ci-4081-3-5-n-8aba32846f" Sep 6 00:00:04.077298 containerd[1576]: 2025-09-06 00:00:04.039 [INFO][4766] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.74.4/26] handle="k8s-pod-network.59eeb191e9847ded5f0b0c50561a32db880df7b37af27b8bfebc65eb26a73e70" host="ci-4081-3-5-n-8aba32846f" Sep 6 00:00:04.077298 containerd[1576]: 2025-09-06 00:00:04.039 [INFO][4766] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 6 00:00:04.077298 containerd[1576]: 2025-09-06 00:00:04.039 [INFO][4766] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.74.4/26] IPv6=[] ContainerID="59eeb191e9847ded5f0b0c50561a32db880df7b37af27b8bfebc65eb26a73e70" HandleID="k8s-pod-network.59eeb191e9847ded5f0b0c50561a32db880df7b37af27b8bfebc65eb26a73e70" Workload="ci--4081--3--5--n--8aba32846f-k8s-csi--node--driver--ps2ks-eth0" Sep 6 00:00:04.077935 containerd[1576]: 2025-09-06 00:00:04.042 [INFO][4754] cni-plugin/k8s.go 418: Populated endpoint ContainerID="59eeb191e9847ded5f0b0c50561a32db880df7b37af27b8bfebc65eb26a73e70" Namespace="calico-system" Pod="csi-node-driver-ps2ks" WorkloadEndpoint="ci--4081--3--5--n--8aba32846f-k8s-csi--node--driver--ps2ks-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--5--n--8aba32846f-k8s-csi--node--driver--ps2ks-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"4ff97107-cfe4-4d25-918b-36fd4176bf0c", ResourceVersion:"954", Generation:0, CreationTimestamp:time.Date(2025, time.September, 5, 23, 59, 42, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"856c6b598f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", 
"pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-5-n-8aba32846f", ContainerID:"", Pod:"csi-node-driver-ps2ks", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.74.4/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali64b00cc420c", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 6 00:00:04.077935 containerd[1576]: 2025-09-06 00:00:04.042 [INFO][4754] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.74.4/32] ContainerID="59eeb191e9847ded5f0b0c50561a32db880df7b37af27b8bfebc65eb26a73e70" Namespace="calico-system" Pod="csi-node-driver-ps2ks" WorkloadEndpoint="ci--4081--3--5--n--8aba32846f-k8s-csi--node--driver--ps2ks-eth0" Sep 6 00:00:04.077935 containerd[1576]: 2025-09-06 00:00:04.042 [INFO][4754] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali64b00cc420c ContainerID="59eeb191e9847ded5f0b0c50561a32db880df7b37af27b8bfebc65eb26a73e70" Namespace="calico-system" Pod="csi-node-driver-ps2ks" WorkloadEndpoint="ci--4081--3--5--n--8aba32846f-k8s-csi--node--driver--ps2ks-eth0" Sep 6 00:00:04.077935 containerd[1576]: 2025-09-06 00:00:04.049 [INFO][4754] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="59eeb191e9847ded5f0b0c50561a32db880df7b37af27b8bfebc65eb26a73e70" Namespace="calico-system" Pod="csi-node-driver-ps2ks" WorkloadEndpoint="ci--4081--3--5--n--8aba32846f-k8s-csi--node--driver--ps2ks-eth0" Sep 6 00:00:04.077935 containerd[1576]: 2025-09-06 00:00:04.051 
[INFO][4754] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="59eeb191e9847ded5f0b0c50561a32db880df7b37af27b8bfebc65eb26a73e70" Namespace="calico-system" Pod="csi-node-driver-ps2ks" WorkloadEndpoint="ci--4081--3--5--n--8aba32846f-k8s-csi--node--driver--ps2ks-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--5--n--8aba32846f-k8s-csi--node--driver--ps2ks-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"4ff97107-cfe4-4d25-918b-36fd4176bf0c", ResourceVersion:"954", Generation:0, CreationTimestamp:time.Date(2025, time.September, 5, 23, 59, 42, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"856c6b598f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-5-n-8aba32846f", ContainerID:"59eeb191e9847ded5f0b0c50561a32db880df7b37af27b8bfebc65eb26a73e70", Pod:"csi-node-driver-ps2ks", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.74.4/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali64b00cc420c", MAC:"6a:ec:c0:86:f8:b0", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 6 00:00:04.077935 containerd[1576]: 2025-09-06 00:00:04.068 [INFO][4754] 
cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="59eeb191e9847ded5f0b0c50561a32db880df7b37af27b8bfebc65eb26a73e70" Namespace="calico-system" Pod="csi-node-driver-ps2ks" WorkloadEndpoint="ci--4081--3--5--n--8aba32846f-k8s-csi--node--driver--ps2ks-eth0" Sep 6 00:00:04.117565 containerd[1576]: time="2025-09-06T00:00:04.116921361Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Sep 6 00:00:04.117565 containerd[1576]: time="2025-09-06T00:00:04.117026483Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Sep 6 00:00:04.117565 containerd[1576]: time="2025-09-06T00:00:04.117045963Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 6 00:00:04.117565 containerd[1576]: time="2025-09-06T00:00:04.117150684Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 6 00:00:04.175286 containerd[1576]: time="2025-09-06T00:00:04.175226488Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 6 00:00:04.177374 containerd[1576]: time="2025-09-06T00:00:04.176890269Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.3: active requests=0, bytes read=30823700" Sep 6 00:00:04.179681 containerd[1576]: time="2025-09-06T00:00:04.177834961Z" level=info msg="ImageCreate event name:\"sha256:e210e86234bc99f018431b30477c5ca2ad6f7ecf67ef011498f7beb48fb0b21f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 6 00:00:04.182219 containerd[1576]: time="2025-09-06T00:00:04.182171175Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend@sha256:29becebc47401da9997a2a30f4c25c511a5f379d17275680b048224829af71a5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 6 00:00:04.184569 containerd[1576]: time="2025-09-06T00:00:04.184517684Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" with image id \"sha256:e210e86234bc99f018431b30477c5ca2ad6f7ecf67ef011498f7beb48fb0b21f\", repo tag \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/whisker-backend@sha256:29becebc47401da9997a2a30f4c25c511a5f379d17275680b048224829af71a5\", size \"30823530\" in 4.770708855s" Sep 6 00:00:04.184569 containerd[1576]: time="2025-09-06T00:00:04.184574525Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" returns image reference \"sha256:e210e86234bc99f018431b30477c5ca2ad6f7ecf67ef011498f7beb48fb0b21f\"" Sep 6 00:00:04.187936 containerd[1576]: time="2025-09-06T00:00:04.187888166Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\"" Sep 6 00:00:04.190371 containerd[1576]: time="2025-09-06T00:00:04.190322037Z" level=info 
msg="CreateContainer within sandbox \"54c1ccc5a55378d0cf631b6ce668b97f3524df41d638527e9f6899899ab209be\" for container &ContainerMetadata{Name:whisker-backend,Attempt:0,}" Sep 6 00:00:04.201540 systemd-networkd[1232]: calib992151005b: Gained IPv6LL Sep 6 00:00:04.211009 containerd[1576]: time="2025-09-06T00:00:04.210942054Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-ps2ks,Uid:4ff97107-cfe4-4d25-918b-36fd4176bf0c,Namespace:calico-system,Attempt:1,} returns sandbox id \"59eeb191e9847ded5f0b0c50561a32db880df7b37af27b8bfebc65eb26a73e70\"" Sep 6 00:00:04.215386 containerd[1576]: time="2025-09-06T00:00:04.215336829Z" level=info msg="CreateContainer within sandbox \"54c1ccc5a55378d0cf631b6ce668b97f3524df41d638527e9f6899899ab209be\" for &ContainerMetadata{Name:whisker-backend,Attempt:0,} returns container id \"ea0d1f1879196f522c6699830ae9708e7b5f1548b6350ae3c66c33a19763135c\"" Sep 6 00:00:04.216309 containerd[1576]: time="2025-09-06T00:00:04.216190959Z" level=info msg="StartContainer for \"ea0d1f1879196f522c6699830ae9708e7b5f1548b6350ae3c66c33a19763135c\"" Sep 6 00:00:04.286042 containerd[1576]: time="2025-09-06T00:00:04.285883828Z" level=info msg="StartContainer for \"ea0d1f1879196f522c6699830ae9708e7b5f1548b6350ae3c66c33a19763135c\" returns successfully" Sep 6 00:00:04.458296 systemd-networkd[1232]: calif4039490265: Gained IPv6LL Sep 6 00:00:04.778667 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount417719678.mount: Deactivated successfully. 
Sep 6 00:00:04.932377 kubelet[2774]: I0906 00:00:04.931778 2774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/whisker-7bfdf4954b-9mxlp" podStartSLOduration=2.359565502 podStartE2EDuration="8.931753683s" podCreationTimestamp="2025-09-05 23:59:56 +0000 UTC" firstStartedPulling="2025-09-05 23:59:57.613932563 +0000 UTC m=+39.177556754" lastFinishedPulling="2025-09-06 00:00:04.186120704 +0000 UTC m=+45.749744935" observedRunningTime="2025-09-06 00:00:04.930354506 +0000 UTC m=+46.493978737" watchObservedRunningTime="2025-09-06 00:00:04.931753683 +0000 UTC m=+46.495377914" Sep 6 00:00:06.056818 systemd-networkd[1232]: cali64b00cc420c: Gained IPv6LL Sep 6 00:00:06.596811 containerd[1576]: time="2025-09-06T00:00:06.595553108Z" level=info msg="StopPodSandbox for \"163c661c831b254f802d8292519fe2430e7877ab5ce0bd52311fca24b758818a\"" Sep 6 00:00:06.596811 containerd[1576]: time="2025-09-06T00:00:06.595905232Z" level=info msg="StopPodSandbox for \"ae180cb95b0cddc1db318cea1b2c734250e7c876f248c12b77adb8766503705e\"" Sep 6 00:00:06.612780 containerd[1576]: time="2025-09-06T00:00:06.610954531Z" level=info msg="StopPodSandbox for \"5f76ebcb19d4e387a5e60f763189030d513100ad4d02f2bb2825e5fb67469098\"" Sep 6 00:00:06.614116 containerd[1576]: time="2025-09-06T00:00:06.614056328Z" level=info msg="StopPodSandbox for \"7de5f6b50a372a2489937cdf42ce9de88194b70f9ab1f53b5785bf6d715f8f61\"" Sep 6 00:00:06.843082 containerd[1576]: 2025-09-06 00:00:06.724 [INFO][4911] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="163c661c831b254f802d8292519fe2430e7877ab5ce0bd52311fca24b758818a" Sep 6 00:00:06.843082 containerd[1576]: 2025-09-06 00:00:06.724 [INFO][4911] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. 
ContainerID="163c661c831b254f802d8292519fe2430e7877ab5ce0bd52311fca24b758818a" iface="eth0" netns="/var/run/netns/cni-07d111ef-346d-5ecc-695b-ae3709c5115c" Sep 6 00:00:06.843082 containerd[1576]: 2025-09-06 00:00:06.726 [INFO][4911] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="163c661c831b254f802d8292519fe2430e7877ab5ce0bd52311fca24b758818a" iface="eth0" netns="/var/run/netns/cni-07d111ef-346d-5ecc-695b-ae3709c5115c" Sep 6 00:00:06.843082 containerd[1576]: 2025-09-06 00:00:06.730 [INFO][4911] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. ContainerID="163c661c831b254f802d8292519fe2430e7877ab5ce0bd52311fca24b758818a" iface="eth0" netns="/var/run/netns/cni-07d111ef-346d-5ecc-695b-ae3709c5115c" Sep 6 00:00:06.843082 containerd[1576]: 2025-09-06 00:00:06.730 [INFO][4911] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="163c661c831b254f802d8292519fe2430e7877ab5ce0bd52311fca24b758818a" Sep 6 00:00:06.843082 containerd[1576]: 2025-09-06 00:00:06.730 [INFO][4911] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="163c661c831b254f802d8292519fe2430e7877ab5ce0bd52311fca24b758818a" Sep 6 00:00:06.843082 containerd[1576]: 2025-09-06 00:00:06.812 [INFO][4929] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="163c661c831b254f802d8292519fe2430e7877ab5ce0bd52311fca24b758818a" HandleID="k8s-pod-network.163c661c831b254f802d8292519fe2430e7877ab5ce0bd52311fca24b758818a" Workload="ci--4081--3--5--n--8aba32846f-k8s-calico--apiserver--76445ff9b6--kh72c-eth0" Sep 6 00:00:06.843082 containerd[1576]: 2025-09-06 00:00:06.812 [INFO][4929] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 6 00:00:06.843082 containerd[1576]: 2025-09-06 00:00:06.812 [INFO][4929] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 6 00:00:06.843082 containerd[1576]: 2025-09-06 00:00:06.832 [WARNING][4929] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="163c661c831b254f802d8292519fe2430e7877ab5ce0bd52311fca24b758818a" HandleID="k8s-pod-network.163c661c831b254f802d8292519fe2430e7877ab5ce0bd52311fca24b758818a" Workload="ci--4081--3--5--n--8aba32846f-k8s-calico--apiserver--76445ff9b6--kh72c-eth0" Sep 6 00:00:06.843082 containerd[1576]: 2025-09-06 00:00:06.832 [INFO][4929] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="163c661c831b254f802d8292519fe2430e7877ab5ce0bd52311fca24b758818a" HandleID="k8s-pod-network.163c661c831b254f802d8292519fe2430e7877ab5ce0bd52311fca24b758818a" Workload="ci--4081--3--5--n--8aba32846f-k8s-calico--apiserver--76445ff9b6--kh72c-eth0" Sep 6 00:00:06.843082 containerd[1576]: 2025-09-06 00:00:06.837 [INFO][4929] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 6 00:00:06.843082 containerd[1576]: 2025-09-06 00:00:06.841 [INFO][4911] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="163c661c831b254f802d8292519fe2430e7877ab5ce0bd52311fca24b758818a" Sep 6 00:00:06.844053 containerd[1576]: time="2025-09-06T00:00:06.843924660Z" level=info msg="TearDown network for sandbox \"163c661c831b254f802d8292519fe2430e7877ab5ce0bd52311fca24b758818a\" successfully" Sep 6 00:00:06.845788 containerd[1576]: time="2025-09-06T00:00:06.845746522Z" level=info msg="StopPodSandbox for \"163c661c831b254f802d8292519fe2430e7877ab5ce0bd52311fca24b758818a\" returns successfully" Sep 6 00:00:06.849465 systemd[1]: run-netns-cni\x2d07d111ef\x2d346d\x2d5ecc\x2d695b\x2dae3709c5115c.mount: Deactivated successfully. 
Sep 6 00:00:06.851527 containerd[1576]: time="2025-09-06T00:00:06.850965624Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-76445ff9b6-kh72c,Uid:55586fdc-5c0e-4b46-a8dd-484764d828e4,Namespace:calico-apiserver,Attempt:1,}" Sep 6 00:00:06.880660 containerd[1576]: 2025-09-06 00:00:06.763 [INFO][4912] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="5f76ebcb19d4e387a5e60f763189030d513100ad4d02f2bb2825e5fb67469098" Sep 6 00:00:06.880660 containerd[1576]: 2025-09-06 00:00:06.763 [INFO][4912] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="5f76ebcb19d4e387a5e60f763189030d513100ad4d02f2bb2825e5fb67469098" iface="eth0" netns="/var/run/netns/cni-f9d4c19a-2b3e-5f0e-0e0b-c178c4fe8ec4" Sep 6 00:00:06.880660 containerd[1576]: 2025-09-06 00:00:06.764 [INFO][4912] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="5f76ebcb19d4e387a5e60f763189030d513100ad4d02f2bb2825e5fb67469098" iface="eth0" netns="/var/run/netns/cni-f9d4c19a-2b3e-5f0e-0e0b-c178c4fe8ec4" Sep 6 00:00:06.880660 containerd[1576]: 2025-09-06 00:00:06.764 [INFO][4912] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. 
ContainerID="5f76ebcb19d4e387a5e60f763189030d513100ad4d02f2bb2825e5fb67469098" iface="eth0" netns="/var/run/netns/cni-f9d4c19a-2b3e-5f0e-0e0b-c178c4fe8ec4" Sep 6 00:00:06.880660 containerd[1576]: 2025-09-06 00:00:06.764 [INFO][4912] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="5f76ebcb19d4e387a5e60f763189030d513100ad4d02f2bb2825e5fb67469098" Sep 6 00:00:06.880660 containerd[1576]: 2025-09-06 00:00:06.764 [INFO][4912] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="5f76ebcb19d4e387a5e60f763189030d513100ad4d02f2bb2825e5fb67469098" Sep 6 00:00:06.880660 containerd[1576]: 2025-09-06 00:00:06.838 [INFO][4937] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="5f76ebcb19d4e387a5e60f763189030d513100ad4d02f2bb2825e5fb67469098" HandleID="k8s-pod-network.5f76ebcb19d4e387a5e60f763189030d513100ad4d02f2bb2825e5fb67469098" Workload="ci--4081--3--5--n--8aba32846f-k8s-goldmane--7988f88666--2kb4z-eth0" Sep 6 00:00:06.880660 containerd[1576]: 2025-09-06 00:00:06.838 [INFO][4937] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 6 00:00:06.880660 containerd[1576]: 2025-09-06 00:00:06.838 [INFO][4937] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 6 00:00:06.880660 containerd[1576]: 2025-09-06 00:00:06.855 [WARNING][4937] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="5f76ebcb19d4e387a5e60f763189030d513100ad4d02f2bb2825e5fb67469098" HandleID="k8s-pod-network.5f76ebcb19d4e387a5e60f763189030d513100ad4d02f2bb2825e5fb67469098" Workload="ci--4081--3--5--n--8aba32846f-k8s-goldmane--7988f88666--2kb4z-eth0" Sep 6 00:00:06.880660 containerd[1576]: 2025-09-06 00:00:06.855 [INFO][4937] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="5f76ebcb19d4e387a5e60f763189030d513100ad4d02f2bb2825e5fb67469098" HandleID="k8s-pod-network.5f76ebcb19d4e387a5e60f763189030d513100ad4d02f2bb2825e5fb67469098" Workload="ci--4081--3--5--n--8aba32846f-k8s-goldmane--7988f88666--2kb4z-eth0" Sep 6 00:00:06.880660 containerd[1576]: 2025-09-06 00:00:06.858 [INFO][4937] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 6 00:00:06.880660 containerd[1576]: 2025-09-06 00:00:06.875 [INFO][4912] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="5f76ebcb19d4e387a5e60f763189030d513100ad4d02f2bb2825e5fb67469098" Sep 6 00:00:06.882041 containerd[1576]: time="2025-09-06T00:00:06.881324505Z" level=info msg="TearDown network for sandbox \"5f76ebcb19d4e387a5e60f763189030d513100ad4d02f2bb2825e5fb67469098\" successfully" Sep 6 00:00:06.882041 containerd[1576]: time="2025-09-06T00:00:06.881360105Z" level=info msg="StopPodSandbox for \"5f76ebcb19d4e387a5e60f763189030d513100ad4d02f2bb2825e5fb67469098\" returns successfully" Sep 6 00:00:06.884416 containerd[1576]: time="2025-09-06T00:00:06.883926896Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7988f88666-2kb4z,Uid:040258ac-5a18-4787-bc03-5dba35b78258,Namespace:calico-system,Attempt:1,}" Sep 6 00:00:06.885867 systemd[1]: run-netns-cni\x2df9d4c19a\x2d2b3e\x2d5f0e\x2d0e0b\x2dc178c4fe8ec4.mount: Deactivated successfully. 
Sep 6 00:00:06.914502 containerd[1576]: 2025-09-06 00:00:06.762 [INFO][4894] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="ae180cb95b0cddc1db318cea1b2c734250e7c876f248c12b77adb8766503705e" Sep 6 00:00:06.914502 containerd[1576]: 2025-09-06 00:00:06.763 [INFO][4894] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="ae180cb95b0cddc1db318cea1b2c734250e7c876f248c12b77adb8766503705e" iface="eth0" netns="/var/run/netns/cni-ac55c055-898f-0dfa-8f1c-bcd49c5c7460" Sep 6 00:00:06.914502 containerd[1576]: 2025-09-06 00:00:06.764 [INFO][4894] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="ae180cb95b0cddc1db318cea1b2c734250e7c876f248c12b77adb8766503705e" iface="eth0" netns="/var/run/netns/cni-ac55c055-898f-0dfa-8f1c-bcd49c5c7460" Sep 6 00:00:06.914502 containerd[1576]: 2025-09-06 00:00:06.765 [INFO][4894] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. ContainerID="ae180cb95b0cddc1db318cea1b2c734250e7c876f248c12b77adb8766503705e" iface="eth0" netns="/var/run/netns/cni-ac55c055-898f-0dfa-8f1c-bcd49c5c7460" Sep 6 00:00:06.914502 containerd[1576]: 2025-09-06 00:00:06.765 [INFO][4894] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="ae180cb95b0cddc1db318cea1b2c734250e7c876f248c12b77adb8766503705e" Sep 6 00:00:06.914502 containerd[1576]: 2025-09-06 00:00:06.766 [INFO][4894] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="ae180cb95b0cddc1db318cea1b2c734250e7c876f248c12b77adb8766503705e" Sep 6 00:00:06.914502 containerd[1576]: 2025-09-06 00:00:06.836 [INFO][4943] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="ae180cb95b0cddc1db318cea1b2c734250e7c876f248c12b77adb8766503705e" HandleID="k8s-pod-network.ae180cb95b0cddc1db318cea1b2c734250e7c876f248c12b77adb8766503705e" Workload="ci--4081--3--5--n--8aba32846f-k8s-coredns--7c65d6cfc9--mm682-eth0" Sep 6 00:00:06.914502 containerd[1576]: 2025-09-06 00:00:06.837 [INFO][4943] 
ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 6 00:00:06.914502 containerd[1576]: 2025-09-06 00:00:06.858 [INFO][4943] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 6 00:00:06.914502 containerd[1576]: 2025-09-06 00:00:06.886 [WARNING][4943] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="ae180cb95b0cddc1db318cea1b2c734250e7c876f248c12b77adb8766503705e" HandleID="k8s-pod-network.ae180cb95b0cddc1db318cea1b2c734250e7c876f248c12b77adb8766503705e" Workload="ci--4081--3--5--n--8aba32846f-k8s-coredns--7c65d6cfc9--mm682-eth0" Sep 6 00:00:06.914502 containerd[1576]: 2025-09-06 00:00:06.886 [INFO][4943] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="ae180cb95b0cddc1db318cea1b2c734250e7c876f248c12b77adb8766503705e" HandleID="k8s-pod-network.ae180cb95b0cddc1db318cea1b2c734250e7c876f248c12b77adb8766503705e" Workload="ci--4081--3--5--n--8aba32846f-k8s-coredns--7c65d6cfc9--mm682-eth0" Sep 6 00:00:06.914502 containerd[1576]: 2025-09-06 00:00:06.890 [INFO][4943] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 6 00:00:06.914502 containerd[1576]: 2025-09-06 00:00:06.901 [INFO][4894] cni-plugin/k8s.go 653: Teardown processing complete. 
ContainerID="ae180cb95b0cddc1db318cea1b2c734250e7c876f248c12b77adb8766503705e" Sep 6 00:00:06.915474 containerd[1576]: time="2025-09-06T00:00:06.914742502Z" level=info msg="TearDown network for sandbox \"ae180cb95b0cddc1db318cea1b2c734250e7c876f248c12b77adb8766503705e\" successfully" Sep 6 00:00:06.915474 containerd[1576]: time="2025-09-06T00:00:06.914772743Z" level=info msg="StopPodSandbox for \"ae180cb95b0cddc1db318cea1b2c734250e7c876f248c12b77adb8766503705e\" returns successfully" Sep 6 00:00:06.917313 containerd[1576]: time="2025-09-06T00:00:06.917240652Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-mm682,Uid:7f869a7e-1df4-4450-bbcf-ba5d46557d8b,Namespace:kube-system,Attempt:1,}" Sep 6 00:00:06.924399 containerd[1576]: 2025-09-06 00:00:06.793 [INFO][4920] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="7de5f6b50a372a2489937cdf42ce9de88194b70f9ab1f53b5785bf6d715f8f61" Sep 6 00:00:06.924399 containerd[1576]: 2025-09-06 00:00:06.794 [INFO][4920] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="7de5f6b50a372a2489937cdf42ce9de88194b70f9ab1f53b5785bf6d715f8f61" iface="eth0" netns="/var/run/netns/cni-5e555f1d-f496-947c-1843-148097b1a52e" Sep 6 00:00:06.924399 containerd[1576]: 2025-09-06 00:00:06.795 [INFO][4920] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="7de5f6b50a372a2489937cdf42ce9de88194b70f9ab1f53b5785bf6d715f8f61" iface="eth0" netns="/var/run/netns/cni-5e555f1d-f496-947c-1843-148097b1a52e" Sep 6 00:00:06.924399 containerd[1576]: 2025-09-06 00:00:06.795 [INFO][4920] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. 
ContainerID="7de5f6b50a372a2489937cdf42ce9de88194b70f9ab1f53b5785bf6d715f8f61" iface="eth0" netns="/var/run/netns/cni-5e555f1d-f496-947c-1843-148097b1a52e" Sep 6 00:00:06.924399 containerd[1576]: 2025-09-06 00:00:06.795 [INFO][4920] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="7de5f6b50a372a2489937cdf42ce9de88194b70f9ab1f53b5785bf6d715f8f61" Sep 6 00:00:06.924399 containerd[1576]: 2025-09-06 00:00:06.795 [INFO][4920] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="7de5f6b50a372a2489937cdf42ce9de88194b70f9ab1f53b5785bf6d715f8f61" Sep 6 00:00:06.924399 containerd[1576]: 2025-09-06 00:00:06.855 [INFO][4948] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="7de5f6b50a372a2489937cdf42ce9de88194b70f9ab1f53b5785bf6d715f8f61" HandleID="k8s-pod-network.7de5f6b50a372a2489937cdf42ce9de88194b70f9ab1f53b5785bf6d715f8f61" Workload="ci--4081--3--5--n--8aba32846f-k8s-calico--kube--controllers--cbc856df4--cgnd8-eth0" Sep 6 00:00:06.924399 containerd[1576]: 2025-09-06 00:00:06.855 [INFO][4948] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 6 00:00:06.924399 containerd[1576]: 2025-09-06 00:00:06.890 [INFO][4948] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 6 00:00:06.924399 containerd[1576]: 2025-09-06 00:00:06.910 [WARNING][4948] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="7de5f6b50a372a2489937cdf42ce9de88194b70f9ab1f53b5785bf6d715f8f61" HandleID="k8s-pod-network.7de5f6b50a372a2489937cdf42ce9de88194b70f9ab1f53b5785bf6d715f8f61" Workload="ci--4081--3--5--n--8aba32846f-k8s-calico--kube--controllers--cbc856df4--cgnd8-eth0" Sep 6 00:00:06.924399 containerd[1576]: 2025-09-06 00:00:06.910 [INFO][4948] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="7de5f6b50a372a2489937cdf42ce9de88194b70f9ab1f53b5785bf6d715f8f61" HandleID="k8s-pod-network.7de5f6b50a372a2489937cdf42ce9de88194b70f9ab1f53b5785bf6d715f8f61" Workload="ci--4081--3--5--n--8aba32846f-k8s-calico--kube--controllers--cbc856df4--cgnd8-eth0" Sep 6 00:00:06.924399 containerd[1576]: 2025-09-06 00:00:06.913 [INFO][4948] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 6 00:00:06.924399 containerd[1576]: 2025-09-06 00:00:06.921 [INFO][4920] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="7de5f6b50a372a2489937cdf42ce9de88194b70f9ab1f53b5785bf6d715f8f61" Sep 6 00:00:06.924822 containerd[1576]: time="2025-09-06T00:00:06.924635940Z" level=info msg="TearDown network for sandbox \"7de5f6b50a372a2489937cdf42ce9de88194b70f9ab1f53b5785bf6d715f8f61\" successfully" Sep 6 00:00:06.924822 containerd[1576]: time="2025-09-06T00:00:06.924675340Z" level=info msg="StopPodSandbox for \"7de5f6b50a372a2489937cdf42ce9de88194b70f9ab1f53b5785bf6d715f8f61\" returns successfully" Sep 6 00:00:06.926443 containerd[1576]: time="2025-09-06T00:00:06.926100477Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-cbc856df4-cgnd8,Uid:0d045d3d-688c-447d-9931-b4f33f413acf,Namespace:calico-system,Attempt:1,}" Sep 6 00:00:07.146370 systemd-networkd[1232]: calif64776222e5: Link UP Sep 6 00:00:07.146577 systemd-networkd[1232]: calif64776222e5: Gained carrier Sep 6 00:00:07.180820 containerd[1576]: 2025-09-06 00:00:06.967 [INFO][4963] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint 
projectcalico.org/v3} {ci--4081--3--5--n--8aba32846f-k8s-calico--apiserver--76445ff9b6--kh72c-eth0 calico-apiserver-76445ff9b6- calico-apiserver 55586fdc-5c0e-4b46-a8dd-484764d828e4 984 0 2025-09-05 23:59:36 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:76445ff9b6 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4081-3-5-n-8aba32846f calico-apiserver-76445ff9b6-kh72c eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calif64776222e5 [] [] }} ContainerID="b3d73fc337479040a4a30615e586489c3394daa44cd2ca6d876324676a90e1a4" Namespace="calico-apiserver" Pod="calico-apiserver-76445ff9b6-kh72c" WorkloadEndpoint="ci--4081--3--5--n--8aba32846f-k8s-calico--apiserver--76445ff9b6--kh72c-" Sep 6 00:00:07.180820 containerd[1576]: 2025-09-06 00:00:06.968 [INFO][4963] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="b3d73fc337479040a4a30615e586489c3394daa44cd2ca6d876324676a90e1a4" Namespace="calico-apiserver" Pod="calico-apiserver-76445ff9b6-kh72c" WorkloadEndpoint="ci--4081--3--5--n--8aba32846f-k8s-calico--apiserver--76445ff9b6--kh72c-eth0" Sep 6 00:00:07.180820 containerd[1576]: 2025-09-06 00:00:07.037 [INFO][5009] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="b3d73fc337479040a4a30615e586489c3394daa44cd2ca6d876324676a90e1a4" HandleID="k8s-pod-network.b3d73fc337479040a4a30615e586489c3394daa44cd2ca6d876324676a90e1a4" Workload="ci--4081--3--5--n--8aba32846f-k8s-calico--apiserver--76445ff9b6--kh72c-eth0" Sep 6 00:00:07.180820 containerd[1576]: 2025-09-06 00:00:07.037 [INFO][5009] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="b3d73fc337479040a4a30615e586489c3394daa44cd2ca6d876324676a90e1a4" HandleID="k8s-pod-network.b3d73fc337479040a4a30615e586489c3394daa44cd2ca6d876324676a90e1a4" 
Workload="ci--4081--3--5--n--8aba32846f-k8s-calico--apiserver--76445ff9b6--kh72c-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002d3ae0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4081-3-5-n-8aba32846f", "pod":"calico-apiserver-76445ff9b6-kh72c", "timestamp":"2025-09-06 00:00:07.037351189 +0000 UTC"}, Hostname:"ci-4081-3-5-n-8aba32846f", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 6 00:00:07.180820 containerd[1576]: 2025-09-06 00:00:07.037 [INFO][5009] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 6 00:00:07.180820 containerd[1576]: 2025-09-06 00:00:07.037 [INFO][5009] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 6 00:00:07.180820 containerd[1576]: 2025-09-06 00:00:07.037 [INFO][5009] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4081-3-5-n-8aba32846f' Sep 6 00:00:07.180820 containerd[1576]: 2025-09-06 00:00:07.062 [INFO][5009] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.b3d73fc337479040a4a30615e586489c3394daa44cd2ca6d876324676a90e1a4" host="ci-4081-3-5-n-8aba32846f" Sep 6 00:00:07.180820 containerd[1576]: 2025-09-06 00:00:07.076 [INFO][5009] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4081-3-5-n-8aba32846f" Sep 6 00:00:07.180820 containerd[1576]: 2025-09-06 00:00:07.090 [INFO][5009] ipam/ipam.go 511: Trying affinity for 192.168.74.0/26 host="ci-4081-3-5-n-8aba32846f" Sep 6 00:00:07.180820 containerd[1576]: 2025-09-06 00:00:07.095 [INFO][5009] ipam/ipam.go 158: Attempting to load block cidr=192.168.74.0/26 host="ci-4081-3-5-n-8aba32846f" Sep 6 00:00:07.180820 containerd[1576]: 2025-09-06 00:00:07.103 [INFO][5009] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.74.0/26 host="ci-4081-3-5-n-8aba32846f" Sep 6 
00:00:07.180820 containerd[1576]: 2025-09-06 00:00:07.105 [INFO][5009] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.74.0/26 handle="k8s-pod-network.b3d73fc337479040a4a30615e586489c3394daa44cd2ca6d876324676a90e1a4" host="ci-4081-3-5-n-8aba32846f" Sep 6 00:00:07.180820 containerd[1576]: 2025-09-06 00:00:07.110 [INFO][5009] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.b3d73fc337479040a4a30615e586489c3394daa44cd2ca6d876324676a90e1a4 Sep 6 00:00:07.180820 containerd[1576]: 2025-09-06 00:00:07.124 [INFO][5009] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.74.0/26 handle="k8s-pod-network.b3d73fc337479040a4a30615e586489c3394daa44cd2ca6d876324676a90e1a4" host="ci-4081-3-5-n-8aba32846f" Sep 6 00:00:07.180820 containerd[1576]: 2025-09-06 00:00:07.137 [INFO][5009] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.74.5/26] block=192.168.74.0/26 handle="k8s-pod-network.b3d73fc337479040a4a30615e586489c3394daa44cd2ca6d876324676a90e1a4" host="ci-4081-3-5-n-8aba32846f" Sep 6 00:00:07.180820 containerd[1576]: 2025-09-06 00:00:07.137 [INFO][5009] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.74.5/26] handle="k8s-pod-network.b3d73fc337479040a4a30615e586489c3394daa44cd2ca6d876324676a90e1a4" host="ci-4081-3-5-n-8aba32846f" Sep 6 00:00:07.180820 containerd[1576]: 2025-09-06 00:00:07.137 [INFO][5009] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 6 00:00:07.180820 containerd[1576]: 2025-09-06 00:00:07.137 [INFO][5009] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.74.5/26] IPv6=[] ContainerID="b3d73fc337479040a4a30615e586489c3394daa44cd2ca6d876324676a90e1a4" HandleID="k8s-pod-network.b3d73fc337479040a4a30615e586489c3394daa44cd2ca6d876324676a90e1a4" Workload="ci--4081--3--5--n--8aba32846f-k8s-calico--apiserver--76445ff9b6--kh72c-eth0" Sep 6 00:00:07.181437 containerd[1576]: 2025-09-06 00:00:07.141 [INFO][4963] cni-plugin/k8s.go 418: Populated endpoint ContainerID="b3d73fc337479040a4a30615e586489c3394daa44cd2ca6d876324676a90e1a4" Namespace="calico-apiserver" Pod="calico-apiserver-76445ff9b6-kh72c" WorkloadEndpoint="ci--4081--3--5--n--8aba32846f-k8s-calico--apiserver--76445ff9b6--kh72c-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--5--n--8aba32846f-k8s-calico--apiserver--76445ff9b6--kh72c-eth0", GenerateName:"calico-apiserver-76445ff9b6-", Namespace:"calico-apiserver", SelfLink:"", UID:"55586fdc-5c0e-4b46-a8dd-484764d828e4", ResourceVersion:"984", Generation:0, CreationTimestamp:time.Date(2025, time.September, 5, 23, 59, 36, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"76445ff9b6", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-5-n-8aba32846f", ContainerID:"", Pod:"calico-apiserver-76445ff9b6-kh72c", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", 
IPNetworks:[]string{"192.168.74.5/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calif64776222e5", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 6 00:00:07.181437 containerd[1576]: 2025-09-06 00:00:07.141 [INFO][4963] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.74.5/32] ContainerID="b3d73fc337479040a4a30615e586489c3394daa44cd2ca6d876324676a90e1a4" Namespace="calico-apiserver" Pod="calico-apiserver-76445ff9b6-kh72c" WorkloadEndpoint="ci--4081--3--5--n--8aba32846f-k8s-calico--apiserver--76445ff9b6--kh72c-eth0" Sep 6 00:00:07.181437 containerd[1576]: 2025-09-06 00:00:07.141 [INFO][4963] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calif64776222e5 ContainerID="b3d73fc337479040a4a30615e586489c3394daa44cd2ca6d876324676a90e1a4" Namespace="calico-apiserver" Pod="calico-apiserver-76445ff9b6-kh72c" WorkloadEndpoint="ci--4081--3--5--n--8aba32846f-k8s-calico--apiserver--76445ff9b6--kh72c-eth0" Sep 6 00:00:07.181437 containerd[1576]: 2025-09-06 00:00:07.150 [INFO][4963] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="b3d73fc337479040a4a30615e586489c3394daa44cd2ca6d876324676a90e1a4" Namespace="calico-apiserver" Pod="calico-apiserver-76445ff9b6-kh72c" WorkloadEndpoint="ci--4081--3--5--n--8aba32846f-k8s-calico--apiserver--76445ff9b6--kh72c-eth0" Sep 6 00:00:07.181437 containerd[1576]: 2025-09-06 00:00:07.151 [INFO][4963] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="b3d73fc337479040a4a30615e586489c3394daa44cd2ca6d876324676a90e1a4" Namespace="calico-apiserver" Pod="calico-apiserver-76445ff9b6-kh72c" WorkloadEndpoint="ci--4081--3--5--n--8aba32846f-k8s-calico--apiserver--76445ff9b6--kh72c-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", 
APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--5--n--8aba32846f-k8s-calico--apiserver--76445ff9b6--kh72c-eth0", GenerateName:"calico-apiserver-76445ff9b6-", Namespace:"calico-apiserver", SelfLink:"", UID:"55586fdc-5c0e-4b46-a8dd-484764d828e4", ResourceVersion:"984", Generation:0, CreationTimestamp:time.Date(2025, time.September, 5, 23, 59, 36, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"76445ff9b6", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-5-n-8aba32846f", ContainerID:"b3d73fc337479040a4a30615e586489c3394daa44cd2ca6d876324676a90e1a4", Pod:"calico-apiserver-76445ff9b6-kh72c", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.74.5/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calif64776222e5", MAC:"42:7b:0e:b8:fd:18", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 6 00:00:07.181437 containerd[1576]: 2025-09-06 00:00:07.174 [INFO][4963] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="b3d73fc337479040a4a30615e586489c3394daa44cd2ca6d876324676a90e1a4" Namespace="calico-apiserver" Pod="calico-apiserver-76445ff9b6-kh72c" WorkloadEndpoint="ci--4081--3--5--n--8aba32846f-k8s-calico--apiserver--76445ff9b6--kh72c-eth0" Sep 6 00:00:07.260502 containerd[1576]: time="2025-09-06T00:00:07.258652518Z" 
level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Sep 6 00:00:07.263855 containerd[1576]: time="2025-09-06T00:00:07.263514735Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Sep 6 00:00:07.263855 containerd[1576]: time="2025-09-06T00:00:07.263549935Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 6 00:00:07.264403 containerd[1576]: time="2025-09-06T00:00:07.264015461Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 6 00:00:07.300231 systemd-networkd[1232]: cali72f4e08d14e: Link UP Sep 6 00:00:07.304935 systemd-networkd[1232]: cali72f4e08d14e: Gained carrier Sep 6 00:00:07.348391 containerd[1576]: 2025-09-06 00:00:06.989 [INFO][4969] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4081--3--5--n--8aba32846f-k8s-goldmane--7988f88666--2kb4z-eth0 goldmane-7988f88666- calico-system 040258ac-5a18-4787-bc03-5dba35b78258 985 0 2025-09-05 23:59:41 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:7988f88666 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s ci-4081-3-5-n-8aba32846f goldmane-7988f88666-2kb4z eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] cali72f4e08d14e [] [] }} ContainerID="28315c1659efcb8886450b61571b272e89bb977f1548bffe403a305b272287ea" Namespace="calico-system" Pod="goldmane-7988f88666-2kb4z" WorkloadEndpoint="ci--4081--3--5--n--8aba32846f-k8s-goldmane--7988f88666--2kb4z-" Sep 6 00:00:07.348391 containerd[1576]: 2025-09-06 00:00:06.989 [INFO][4969] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s 
ContainerID="28315c1659efcb8886450b61571b272e89bb977f1548bffe403a305b272287ea" Namespace="calico-system" Pod="goldmane-7988f88666-2kb4z" WorkloadEndpoint="ci--4081--3--5--n--8aba32846f-k8s-goldmane--7988f88666--2kb4z-eth0" Sep 6 00:00:07.348391 containerd[1576]: 2025-09-06 00:00:07.112 [INFO][5016] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="28315c1659efcb8886450b61571b272e89bb977f1548bffe403a305b272287ea" HandleID="k8s-pod-network.28315c1659efcb8886450b61571b272e89bb977f1548bffe403a305b272287ea" Workload="ci--4081--3--5--n--8aba32846f-k8s-goldmane--7988f88666--2kb4z-eth0" Sep 6 00:00:07.348391 containerd[1576]: 2025-09-06 00:00:07.112 [INFO][5016] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="28315c1659efcb8886450b61571b272e89bb977f1548bffe403a305b272287ea" HandleID="k8s-pod-network.28315c1659efcb8886450b61571b272e89bb977f1548bffe403a305b272287ea" Workload="ci--4081--3--5--n--8aba32846f-k8s-goldmane--7988f88666--2kb4z-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002d3390), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4081-3-5-n-8aba32846f", "pod":"goldmane-7988f88666-2kb4z", "timestamp":"2025-09-06 00:00:07.111951055 +0000 UTC"}, Hostname:"ci-4081-3-5-n-8aba32846f", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 6 00:00:07.348391 containerd[1576]: 2025-09-06 00:00:07.113 [INFO][5016] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 6 00:00:07.348391 containerd[1576]: 2025-09-06 00:00:07.137 [INFO][5016] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 6 00:00:07.348391 containerd[1576]: 2025-09-06 00:00:07.137 [INFO][5016] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4081-3-5-n-8aba32846f' Sep 6 00:00:07.348391 containerd[1576]: 2025-09-06 00:00:07.164 [INFO][5016] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.28315c1659efcb8886450b61571b272e89bb977f1548bffe403a305b272287ea" host="ci-4081-3-5-n-8aba32846f" Sep 6 00:00:07.348391 containerd[1576]: 2025-09-06 00:00:07.185 [INFO][5016] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4081-3-5-n-8aba32846f" Sep 6 00:00:07.348391 containerd[1576]: 2025-09-06 00:00:07.209 [INFO][5016] ipam/ipam.go 511: Trying affinity for 192.168.74.0/26 host="ci-4081-3-5-n-8aba32846f" Sep 6 00:00:07.348391 containerd[1576]: 2025-09-06 00:00:07.221 [INFO][5016] ipam/ipam.go 158: Attempting to load block cidr=192.168.74.0/26 host="ci-4081-3-5-n-8aba32846f" Sep 6 00:00:07.348391 containerd[1576]: 2025-09-06 00:00:07.230 [INFO][5016] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.74.0/26 host="ci-4081-3-5-n-8aba32846f" Sep 6 00:00:07.348391 containerd[1576]: 2025-09-06 00:00:07.230 [INFO][5016] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.74.0/26 handle="k8s-pod-network.28315c1659efcb8886450b61571b272e89bb977f1548bffe403a305b272287ea" host="ci-4081-3-5-n-8aba32846f" Sep 6 00:00:07.348391 containerd[1576]: 2025-09-06 00:00:07.235 [INFO][5016] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.28315c1659efcb8886450b61571b272e89bb977f1548bffe403a305b272287ea Sep 6 00:00:07.348391 containerd[1576]: 2025-09-06 00:00:07.250 [INFO][5016] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.74.0/26 handle="k8s-pod-network.28315c1659efcb8886450b61571b272e89bb977f1548bffe403a305b272287ea" host="ci-4081-3-5-n-8aba32846f" Sep 6 00:00:07.348391 containerd[1576]: 2025-09-06 00:00:07.269 [INFO][5016] ipam/ipam.go 1256: Successfully claimed IPs: 
[192.168.74.6/26] block=192.168.74.0/26 handle="k8s-pod-network.28315c1659efcb8886450b61571b272e89bb977f1548bffe403a305b272287ea" host="ci-4081-3-5-n-8aba32846f" Sep 6 00:00:07.348391 containerd[1576]: 2025-09-06 00:00:07.269 [INFO][5016] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.74.6/26] handle="k8s-pod-network.28315c1659efcb8886450b61571b272e89bb977f1548bffe403a305b272287ea" host="ci-4081-3-5-n-8aba32846f" Sep 6 00:00:07.348391 containerd[1576]: 2025-09-06 00:00:07.270 [INFO][5016] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 6 00:00:07.348391 containerd[1576]: 2025-09-06 00:00:07.271 [INFO][5016] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.74.6/26] IPv6=[] ContainerID="28315c1659efcb8886450b61571b272e89bb977f1548bffe403a305b272287ea" HandleID="k8s-pod-network.28315c1659efcb8886450b61571b272e89bb977f1548bffe403a305b272287ea" Workload="ci--4081--3--5--n--8aba32846f-k8s-goldmane--7988f88666--2kb4z-eth0" Sep 6 00:00:07.351985 containerd[1576]: 2025-09-06 00:00:07.280 [INFO][4969] cni-plugin/k8s.go 418: Populated endpoint ContainerID="28315c1659efcb8886450b61571b272e89bb977f1548bffe403a305b272287ea" Namespace="calico-system" Pod="goldmane-7988f88666-2kb4z" WorkloadEndpoint="ci--4081--3--5--n--8aba32846f-k8s-goldmane--7988f88666--2kb4z-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--5--n--8aba32846f-k8s-goldmane--7988f88666--2kb4z-eth0", GenerateName:"goldmane-7988f88666-", Namespace:"calico-system", SelfLink:"", UID:"040258ac-5a18-4787-bc03-5dba35b78258", ResourceVersion:"985", Generation:0, CreationTimestamp:time.Date(2025, time.September, 5, 23, 59, 41, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"7988f88666", "projectcalico.org/namespace":"calico-system", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-5-n-8aba32846f", ContainerID:"", Pod:"goldmane-7988f88666-2kb4z", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.74.6/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali72f4e08d14e", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 6 00:00:07.351985 containerd[1576]: 2025-09-06 00:00:07.281 [INFO][4969] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.74.6/32] ContainerID="28315c1659efcb8886450b61571b272e89bb977f1548bffe403a305b272287ea" Namespace="calico-system" Pod="goldmane-7988f88666-2kb4z" WorkloadEndpoint="ci--4081--3--5--n--8aba32846f-k8s-goldmane--7988f88666--2kb4z-eth0" Sep 6 00:00:07.351985 containerd[1576]: 2025-09-06 00:00:07.281 [INFO][4969] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali72f4e08d14e ContainerID="28315c1659efcb8886450b61571b272e89bb977f1548bffe403a305b272287ea" Namespace="calico-system" Pod="goldmane-7988f88666-2kb4z" WorkloadEndpoint="ci--4081--3--5--n--8aba32846f-k8s-goldmane--7988f88666--2kb4z-eth0" Sep 6 00:00:07.351985 containerd[1576]: 2025-09-06 00:00:07.308 [INFO][4969] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="28315c1659efcb8886450b61571b272e89bb977f1548bffe403a305b272287ea" Namespace="calico-system" Pod="goldmane-7988f88666-2kb4z" WorkloadEndpoint="ci--4081--3--5--n--8aba32846f-k8s-goldmane--7988f88666--2kb4z-eth0" Sep 6 00:00:07.351985 containerd[1576]: 2025-09-06 00:00:07.314 [INFO][4969] cni-plugin/k8s.go 446: Added Mac, interface name, and active 
container ID to endpoint ContainerID="28315c1659efcb8886450b61571b272e89bb977f1548bffe403a305b272287ea" Namespace="calico-system" Pod="goldmane-7988f88666-2kb4z" WorkloadEndpoint="ci--4081--3--5--n--8aba32846f-k8s-goldmane--7988f88666--2kb4z-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--5--n--8aba32846f-k8s-goldmane--7988f88666--2kb4z-eth0", GenerateName:"goldmane-7988f88666-", Namespace:"calico-system", SelfLink:"", UID:"040258ac-5a18-4787-bc03-5dba35b78258", ResourceVersion:"985", Generation:0, CreationTimestamp:time.Date(2025, time.September, 5, 23, 59, 41, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"7988f88666", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-5-n-8aba32846f", ContainerID:"28315c1659efcb8886450b61571b272e89bb977f1548bffe403a305b272287ea", Pod:"goldmane-7988f88666-2kb4z", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.74.6/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali72f4e08d14e", MAC:"9a:9f:99:47:2c:5c", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 6 00:00:07.351985 containerd[1576]: 2025-09-06 00:00:07.343 [INFO][4969] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="28315c1659efcb8886450b61571b272e89bb977f1548bffe403a305b272287ea" 
Namespace="calico-system" Pod="goldmane-7988f88666-2kb4z" WorkloadEndpoint="ci--4081--3--5--n--8aba32846f-k8s-goldmane--7988f88666--2kb4z-eth0" Sep 6 00:00:07.416315 containerd[1576]: time="2025-09-06T00:00:07.415542340Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Sep 6 00:00:07.418363 containerd[1576]: time="2025-09-06T00:00:07.415615581Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Sep 6 00:00:07.419425 containerd[1576]: time="2025-09-06T00:00:07.418132730Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 6 00:00:07.419425 containerd[1576]: time="2025-09-06T00:00:07.418312572Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 6 00:00:07.443485 systemd-networkd[1232]: calic9e07cbfc1c: Link UP Sep 6 00:00:07.448156 systemd-networkd[1232]: calic9e07cbfc1c: Gained carrier Sep 6 00:00:07.495941 containerd[1576]: 2025-09-06 00:00:07.026 [INFO][4992] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4081--3--5--n--8aba32846f-k8s-calico--kube--controllers--cbc856df4--cgnd8-eth0 calico-kube-controllers-cbc856df4- calico-system 0d045d3d-688c-447d-9931-b4f33f413acf 987 0 2025-09-05 23:59:42 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:cbc856df4 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ci-4081-3-5-n-8aba32846f calico-kube-controllers-cbc856df4-cgnd8 eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] calic9e07cbfc1c [] [] }} 
ContainerID="bbc184507886a72b1c880a2a42d73250baf8226c68e72bd198bfa30293acdff5" Namespace="calico-system" Pod="calico-kube-controllers-cbc856df4-cgnd8" WorkloadEndpoint="ci--4081--3--5--n--8aba32846f-k8s-calico--kube--controllers--cbc856df4--cgnd8-" Sep 6 00:00:07.495941 containerd[1576]: 2025-09-06 00:00:07.026 [INFO][4992] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="bbc184507886a72b1c880a2a42d73250baf8226c68e72bd198bfa30293acdff5" Namespace="calico-system" Pod="calico-kube-controllers-cbc856df4-cgnd8" WorkloadEndpoint="ci--4081--3--5--n--8aba32846f-k8s-calico--kube--controllers--cbc856df4--cgnd8-eth0" Sep 6 00:00:07.495941 containerd[1576]: 2025-09-06 00:00:07.119 [INFO][5024] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="bbc184507886a72b1c880a2a42d73250baf8226c68e72bd198bfa30293acdff5" HandleID="k8s-pod-network.bbc184507886a72b1c880a2a42d73250baf8226c68e72bd198bfa30293acdff5" Workload="ci--4081--3--5--n--8aba32846f-k8s-calico--kube--controllers--cbc856df4--cgnd8-eth0" Sep 6 00:00:07.495941 containerd[1576]: 2025-09-06 00:00:07.119 [INFO][5024] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="bbc184507886a72b1c880a2a42d73250baf8226c68e72bd198bfa30293acdff5" HandleID="k8s-pod-network.bbc184507886a72b1c880a2a42d73250baf8226c68e72bd198bfa30293acdff5" Workload="ci--4081--3--5--n--8aba32846f-k8s-calico--kube--controllers--cbc856df4--cgnd8-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002cb600), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4081-3-5-n-8aba32846f", "pod":"calico-kube-controllers-cbc856df4-cgnd8", "timestamp":"2025-09-06 00:00:07.119017017 +0000 UTC"}, Hostname:"ci-4081-3-5-n-8aba32846f", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 6 00:00:07.495941 containerd[1576]: 2025-09-06 
00:00:07.119 [INFO][5024] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 6 00:00:07.495941 containerd[1576]: 2025-09-06 00:00:07.270 [INFO][5024] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 6 00:00:07.495941 containerd[1576]: 2025-09-06 00:00:07.271 [INFO][5024] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4081-3-5-n-8aba32846f' Sep 6 00:00:07.495941 containerd[1576]: 2025-09-06 00:00:07.318 [INFO][5024] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.bbc184507886a72b1c880a2a42d73250baf8226c68e72bd198bfa30293acdff5" host="ci-4081-3-5-n-8aba32846f" Sep 6 00:00:07.495941 containerd[1576]: 2025-09-06 00:00:07.342 [INFO][5024] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4081-3-5-n-8aba32846f" Sep 6 00:00:07.495941 containerd[1576]: 2025-09-06 00:00:07.366 [INFO][5024] ipam/ipam.go 511: Trying affinity for 192.168.74.0/26 host="ci-4081-3-5-n-8aba32846f" Sep 6 00:00:07.495941 containerd[1576]: 2025-09-06 00:00:07.374 [INFO][5024] ipam/ipam.go 158: Attempting to load block cidr=192.168.74.0/26 host="ci-4081-3-5-n-8aba32846f" Sep 6 00:00:07.495941 containerd[1576]: 2025-09-06 00:00:07.387 [INFO][5024] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.74.0/26 host="ci-4081-3-5-n-8aba32846f" Sep 6 00:00:07.495941 containerd[1576]: 2025-09-06 00:00:07.388 [INFO][5024] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.74.0/26 handle="k8s-pod-network.bbc184507886a72b1c880a2a42d73250baf8226c68e72bd198bfa30293acdff5" host="ci-4081-3-5-n-8aba32846f" Sep 6 00:00:07.495941 containerd[1576]: 2025-09-06 00:00:07.393 [INFO][5024] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.bbc184507886a72b1c880a2a42d73250baf8226c68e72bd198bfa30293acdff5 Sep 6 00:00:07.495941 containerd[1576]: 2025-09-06 00:00:07.408 [INFO][5024] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.74.0/26 
handle="k8s-pod-network.bbc184507886a72b1c880a2a42d73250baf8226c68e72bd198bfa30293acdff5" host="ci-4081-3-5-n-8aba32846f" Sep 6 00:00:07.495941 containerd[1576]: 2025-09-06 00:00:07.423 [INFO][5024] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.74.7/26] block=192.168.74.0/26 handle="k8s-pod-network.bbc184507886a72b1c880a2a42d73250baf8226c68e72bd198bfa30293acdff5" host="ci-4081-3-5-n-8aba32846f" Sep 6 00:00:07.495941 containerd[1576]: 2025-09-06 00:00:07.423 [INFO][5024] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.74.7/26] handle="k8s-pod-network.bbc184507886a72b1c880a2a42d73250baf8226c68e72bd198bfa30293acdff5" host="ci-4081-3-5-n-8aba32846f" Sep 6 00:00:07.495941 containerd[1576]: 2025-09-06 00:00:07.423 [INFO][5024] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 6 00:00:07.495941 containerd[1576]: 2025-09-06 00:00:07.423 [INFO][5024] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.74.7/26] IPv6=[] ContainerID="bbc184507886a72b1c880a2a42d73250baf8226c68e72bd198bfa30293acdff5" HandleID="k8s-pod-network.bbc184507886a72b1c880a2a42d73250baf8226c68e72bd198bfa30293acdff5" Workload="ci--4081--3--5--n--8aba32846f-k8s-calico--kube--controllers--cbc856df4--cgnd8-eth0" Sep 6 00:00:07.497251 containerd[1576]: 2025-09-06 00:00:07.432 [INFO][4992] cni-plugin/k8s.go 418: Populated endpoint ContainerID="bbc184507886a72b1c880a2a42d73250baf8226c68e72bd198bfa30293acdff5" Namespace="calico-system" Pod="calico-kube-controllers-cbc856df4-cgnd8" WorkloadEndpoint="ci--4081--3--5--n--8aba32846f-k8s-calico--kube--controllers--cbc856df4--cgnd8-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--5--n--8aba32846f-k8s-calico--kube--controllers--cbc856df4--cgnd8-eth0", GenerateName:"calico-kube-controllers-cbc856df4-", Namespace:"calico-system", SelfLink:"", UID:"0d045d3d-688c-447d-9931-b4f33f413acf", ResourceVersion:"987", 
Generation:0, CreationTimestamp:time.Date(2025, time.September, 5, 23, 59, 42, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"cbc856df4", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-5-n-8aba32846f", ContainerID:"", Pod:"calico-kube-controllers-cbc856df4-cgnd8", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.74.7/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calic9e07cbfc1c", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 6 00:00:07.497251 containerd[1576]: 2025-09-06 00:00:07.433 [INFO][4992] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.74.7/32] ContainerID="bbc184507886a72b1c880a2a42d73250baf8226c68e72bd198bfa30293acdff5" Namespace="calico-system" Pod="calico-kube-controllers-cbc856df4-cgnd8" WorkloadEndpoint="ci--4081--3--5--n--8aba32846f-k8s-calico--kube--controllers--cbc856df4--cgnd8-eth0" Sep 6 00:00:07.497251 containerd[1576]: 2025-09-06 00:00:07.433 [INFO][4992] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calic9e07cbfc1c ContainerID="bbc184507886a72b1c880a2a42d73250baf8226c68e72bd198bfa30293acdff5" Namespace="calico-system" Pod="calico-kube-controllers-cbc856df4-cgnd8" WorkloadEndpoint="ci--4081--3--5--n--8aba32846f-k8s-calico--kube--controllers--cbc856df4--cgnd8-eth0" Sep 6 00:00:07.497251 
containerd[1576]: 2025-09-06 00:00:07.452 [INFO][4992] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="bbc184507886a72b1c880a2a42d73250baf8226c68e72bd198bfa30293acdff5" Namespace="calico-system" Pod="calico-kube-controllers-cbc856df4-cgnd8" WorkloadEndpoint="ci--4081--3--5--n--8aba32846f-k8s-calico--kube--controllers--cbc856df4--cgnd8-eth0" Sep 6 00:00:07.497251 containerd[1576]: 2025-09-06 00:00:07.464 [INFO][4992] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="bbc184507886a72b1c880a2a42d73250baf8226c68e72bd198bfa30293acdff5" Namespace="calico-system" Pod="calico-kube-controllers-cbc856df4-cgnd8" WorkloadEndpoint="ci--4081--3--5--n--8aba32846f-k8s-calico--kube--controllers--cbc856df4--cgnd8-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--5--n--8aba32846f-k8s-calico--kube--controllers--cbc856df4--cgnd8-eth0", GenerateName:"calico-kube-controllers-cbc856df4-", Namespace:"calico-system", SelfLink:"", UID:"0d045d3d-688c-447d-9931-b4f33f413acf", ResourceVersion:"987", Generation:0, CreationTimestamp:time.Date(2025, time.September, 5, 23, 59, 42, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"cbc856df4", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-5-n-8aba32846f", ContainerID:"bbc184507886a72b1c880a2a42d73250baf8226c68e72bd198bfa30293acdff5", Pod:"calico-kube-controllers-cbc856df4-cgnd8", 
Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.74.7/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calic9e07cbfc1c", MAC:"de:53:be:89:d2:f4", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 6 00:00:07.497251 containerd[1576]: 2025-09-06 00:00:07.485 [INFO][4992] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="bbc184507886a72b1c880a2a42d73250baf8226c68e72bd198bfa30293acdff5" Namespace="calico-system" Pod="calico-kube-controllers-cbc856df4-cgnd8" WorkloadEndpoint="ci--4081--3--5--n--8aba32846f-k8s-calico--kube--controllers--cbc856df4--cgnd8-eth0" Sep 6 00:00:07.511219 containerd[1576]: time="2025-09-06T00:00:07.511159210Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-76445ff9b6-kh72c,Uid:55586fdc-5c0e-4b46-a8dd-484764d828e4,Namespace:calico-apiserver,Attempt:1,} returns sandbox id \"b3d73fc337479040a4a30615e586489c3394daa44cd2ca6d876324676a90e1a4\"" Sep 6 00:00:07.547737 containerd[1576]: time="2025-09-06T00:00:07.547314829Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Sep 6 00:00:07.547737 containerd[1576]: time="2025-09-06T00:00:07.547376590Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Sep 6 00:00:07.547737 containerd[1576]: time="2025-09-06T00:00:07.547395310Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 6 00:00:07.547737 containerd[1576]: time="2025-09-06T00:00:07.547493391Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 6 00:00:07.568596 systemd-networkd[1232]: calif7eeb23cccd: Link UP Sep 6 00:00:07.578144 systemd-networkd[1232]: calif7eeb23cccd: Gained carrier Sep 6 00:00:07.601659 containerd[1576]: 2025-09-06 00:00:07.040 [INFO][4986] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4081--3--5--n--8aba32846f-k8s-coredns--7c65d6cfc9--mm682-eth0 coredns-7c65d6cfc9- kube-system 7f869a7e-1df4-4450-bbcf-ba5d46557d8b 986 0 2025-09-05 23:59:25 +0000 UTC map[k8s-app:kube-dns pod-template-hash:7c65d6cfc9 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4081-3-5-n-8aba32846f coredns-7c65d6cfc9-mm682 eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] calif7eeb23cccd [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="4aab30ceb6b2b66f8983e1ecd01d27a695c76635a6f758ec53a1b5f0e6ebbbf6" Namespace="kube-system" Pod="coredns-7c65d6cfc9-mm682" WorkloadEndpoint="ci--4081--3--5--n--8aba32846f-k8s-coredns--7c65d6cfc9--mm682-" Sep 6 00:00:07.601659 containerd[1576]: 2025-09-06 00:00:07.041 [INFO][4986] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="4aab30ceb6b2b66f8983e1ecd01d27a695c76635a6f758ec53a1b5f0e6ebbbf6" Namespace="kube-system" Pod="coredns-7c65d6cfc9-mm682" WorkloadEndpoint="ci--4081--3--5--n--8aba32846f-k8s-coredns--7c65d6cfc9--mm682-eth0" Sep 6 00:00:07.601659 containerd[1576]: 2025-09-06 00:00:07.126 [INFO][5031] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="4aab30ceb6b2b66f8983e1ecd01d27a695c76635a6f758ec53a1b5f0e6ebbbf6" HandleID="k8s-pod-network.4aab30ceb6b2b66f8983e1ecd01d27a695c76635a6f758ec53a1b5f0e6ebbbf6" Workload="ci--4081--3--5--n--8aba32846f-k8s-coredns--7c65d6cfc9--mm682-eth0" Sep 6 00:00:07.601659 containerd[1576]: 2025-09-06 00:00:07.126 [INFO][5031] 
ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="4aab30ceb6b2b66f8983e1ecd01d27a695c76635a6f758ec53a1b5f0e6ebbbf6" HandleID="k8s-pod-network.4aab30ceb6b2b66f8983e1ecd01d27a695c76635a6f758ec53a1b5f0e6ebbbf6" Workload="ci--4081--3--5--n--8aba32846f-k8s-coredns--7c65d6cfc9--mm682-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400034f560), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4081-3-5-n-8aba32846f", "pod":"coredns-7c65d6cfc9-mm682", "timestamp":"2025-09-06 00:00:07.126069459 +0000 UTC"}, Hostname:"ci-4081-3-5-n-8aba32846f", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 6 00:00:07.601659 containerd[1576]: 2025-09-06 00:00:07.126 [INFO][5031] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 6 00:00:07.601659 containerd[1576]: 2025-09-06 00:00:07.426 [INFO][5031] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 6 00:00:07.601659 containerd[1576]: 2025-09-06 00:00:07.426 [INFO][5031] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4081-3-5-n-8aba32846f' Sep 6 00:00:07.601659 containerd[1576]: 2025-09-06 00:00:07.472 [INFO][5031] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.4aab30ceb6b2b66f8983e1ecd01d27a695c76635a6f758ec53a1b5f0e6ebbbf6" host="ci-4081-3-5-n-8aba32846f" Sep 6 00:00:07.601659 containerd[1576]: 2025-09-06 00:00:07.495 [INFO][5031] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4081-3-5-n-8aba32846f" Sep 6 00:00:07.601659 containerd[1576]: 2025-09-06 00:00:07.509 [INFO][5031] ipam/ipam.go 511: Trying affinity for 192.168.74.0/26 host="ci-4081-3-5-n-8aba32846f" Sep 6 00:00:07.601659 containerd[1576]: 2025-09-06 00:00:07.515 [INFO][5031] ipam/ipam.go 158: Attempting to load block cidr=192.168.74.0/26 host="ci-4081-3-5-n-8aba32846f" Sep 6 00:00:07.601659 containerd[1576]: 2025-09-06 00:00:07.521 [INFO][5031] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.74.0/26 host="ci-4081-3-5-n-8aba32846f" Sep 6 00:00:07.601659 containerd[1576]: 2025-09-06 00:00:07.521 [INFO][5031] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.74.0/26 handle="k8s-pod-network.4aab30ceb6b2b66f8983e1ecd01d27a695c76635a6f758ec53a1b5f0e6ebbbf6" host="ci-4081-3-5-n-8aba32846f" Sep 6 00:00:07.601659 containerd[1576]: 2025-09-06 00:00:07.527 [INFO][5031] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.4aab30ceb6b2b66f8983e1ecd01d27a695c76635a6f758ec53a1b5f0e6ebbbf6 Sep 6 00:00:07.601659 containerd[1576]: 2025-09-06 00:00:07.537 [INFO][5031] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.74.0/26 handle="k8s-pod-network.4aab30ceb6b2b66f8983e1ecd01d27a695c76635a6f758ec53a1b5f0e6ebbbf6" host="ci-4081-3-5-n-8aba32846f" Sep 6 00:00:07.601659 containerd[1576]: 2025-09-06 00:00:07.551 [INFO][5031] ipam/ipam.go 1256: Successfully claimed IPs: 
[192.168.74.8/26] block=192.168.74.0/26 handle="k8s-pod-network.4aab30ceb6b2b66f8983e1ecd01d27a695c76635a6f758ec53a1b5f0e6ebbbf6" host="ci-4081-3-5-n-8aba32846f" Sep 6 00:00:07.601659 containerd[1576]: 2025-09-06 00:00:07.551 [INFO][5031] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.74.8/26] handle="k8s-pod-network.4aab30ceb6b2b66f8983e1ecd01d27a695c76635a6f758ec53a1b5f0e6ebbbf6" host="ci-4081-3-5-n-8aba32846f" Sep 6 00:00:07.601659 containerd[1576]: 2025-09-06 00:00:07.551 [INFO][5031] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 6 00:00:07.601659 containerd[1576]: 2025-09-06 00:00:07.551 [INFO][5031] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.74.8/26] IPv6=[] ContainerID="4aab30ceb6b2b66f8983e1ecd01d27a695c76635a6f758ec53a1b5f0e6ebbbf6" HandleID="k8s-pod-network.4aab30ceb6b2b66f8983e1ecd01d27a695c76635a6f758ec53a1b5f0e6ebbbf6" Workload="ci--4081--3--5--n--8aba32846f-k8s-coredns--7c65d6cfc9--mm682-eth0" Sep 6 00:00:07.604134 containerd[1576]: 2025-09-06 00:00:07.561 [INFO][4986] cni-plugin/k8s.go 418: Populated endpoint ContainerID="4aab30ceb6b2b66f8983e1ecd01d27a695c76635a6f758ec53a1b5f0e6ebbbf6" Namespace="kube-system" Pod="coredns-7c65d6cfc9-mm682" WorkloadEndpoint="ci--4081--3--5--n--8aba32846f-k8s-coredns--7c65d6cfc9--mm682-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--5--n--8aba32846f-k8s-coredns--7c65d6cfc9--mm682-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"7f869a7e-1df4-4450-bbcf-ba5d46557d8b", ResourceVersion:"986", Generation:0, CreationTimestamp:time.Date(2025, time.September, 5, 23, 59, 25, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", 
"projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-5-n-8aba32846f", ContainerID:"", Pod:"coredns-7c65d6cfc9-mm682", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.74.8/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calif7eeb23cccd", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 6 00:00:07.604134 containerd[1576]: 2025-09-06 00:00:07.561 [INFO][4986] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.74.8/32] ContainerID="4aab30ceb6b2b66f8983e1ecd01d27a695c76635a6f758ec53a1b5f0e6ebbbf6" Namespace="kube-system" Pod="coredns-7c65d6cfc9-mm682" WorkloadEndpoint="ci--4081--3--5--n--8aba32846f-k8s-coredns--7c65d6cfc9--mm682-eth0" Sep 6 00:00:07.604134 containerd[1576]: 2025-09-06 00:00:07.561 [INFO][4986] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calif7eeb23cccd ContainerID="4aab30ceb6b2b66f8983e1ecd01d27a695c76635a6f758ec53a1b5f0e6ebbbf6" Namespace="kube-system" Pod="coredns-7c65d6cfc9-mm682" WorkloadEndpoint="ci--4081--3--5--n--8aba32846f-k8s-coredns--7c65d6cfc9--mm682-eth0" Sep 6 00:00:07.604134 containerd[1576]: 2025-09-06 00:00:07.576 [INFO][4986] cni-plugin/dataplane_linux.go 508: Disabling IPv4 
forwarding ContainerID="4aab30ceb6b2b66f8983e1ecd01d27a695c76635a6f758ec53a1b5f0e6ebbbf6" Namespace="kube-system" Pod="coredns-7c65d6cfc9-mm682" WorkloadEndpoint="ci--4081--3--5--n--8aba32846f-k8s-coredns--7c65d6cfc9--mm682-eth0" Sep 6 00:00:07.604134 containerd[1576]: 2025-09-06 00:00:07.583 [INFO][4986] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="4aab30ceb6b2b66f8983e1ecd01d27a695c76635a6f758ec53a1b5f0e6ebbbf6" Namespace="kube-system" Pod="coredns-7c65d6cfc9-mm682" WorkloadEndpoint="ci--4081--3--5--n--8aba32846f-k8s-coredns--7c65d6cfc9--mm682-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--5--n--8aba32846f-k8s-coredns--7c65d6cfc9--mm682-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"7f869a7e-1df4-4450-bbcf-ba5d46557d8b", ResourceVersion:"986", Generation:0, CreationTimestamp:time.Date(2025, time.September, 5, 23, 59, 25, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-5-n-8aba32846f", ContainerID:"4aab30ceb6b2b66f8983e1ecd01d27a695c76635a6f758ec53a1b5f0e6ebbbf6", Pod:"coredns-7c65d6cfc9-mm682", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.74.8/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calif7eeb23cccd", MAC:"ea:d8:56:51:0e:7c", 
Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 6 00:00:07.604134 containerd[1576]: 2025-09-06 00:00:07.598 [INFO][4986] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="4aab30ceb6b2b66f8983e1ecd01d27a695c76635a6f758ec53a1b5f0e6ebbbf6" Namespace="kube-system" Pod="coredns-7c65d6cfc9-mm682" WorkloadEndpoint="ci--4081--3--5--n--8aba32846f-k8s-coredns--7c65d6cfc9--mm682-eth0" Sep 6 00:00:07.644258 containerd[1576]: time="2025-09-06T00:00:07.643774949Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Sep 6 00:00:07.644258 containerd[1576]: time="2025-09-06T00:00:07.643899511Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Sep 6 00:00:07.644258 containerd[1576]: time="2025-09-06T00:00:07.643918791Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 6 00:00:07.644258 containerd[1576]: time="2025-09-06T00:00:07.644037032Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Sep 6 00:00:07.680098 containerd[1576]: time="2025-09-06T00:00:07.679872488Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7988f88666-2kb4z,Uid:040258ac-5a18-4787-bc03-5dba35b78258,Namespace:calico-system,Attempt:1,} returns sandbox id \"28315c1659efcb8886450b61571b272e89bb977f1548bffe403a305b272287ea\"" Sep 6 00:00:07.685252 containerd[1576]: time="2025-09-06T00:00:07.684400861Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-cbc856df4-cgnd8,Uid:0d045d3d-688c-447d-9931-b4f33f413acf,Namespace:calico-system,Attempt:1,} returns sandbox id \"bbc184507886a72b1c880a2a42d73250baf8226c68e72bd198bfa30293acdff5\"" Sep 6 00:00:07.730094 containerd[1576]: time="2025-09-06T00:00:07.729985230Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-mm682,Uid:7f869a7e-1df4-4450-bbcf-ba5d46557d8b,Namespace:kube-system,Attempt:1,} returns sandbox id \"4aab30ceb6b2b66f8983e1ecd01d27a695c76635a6f758ec53a1b5f0e6ebbbf6\"" Sep 6 00:00:07.734046 containerd[1576]: time="2025-09-06T00:00:07.733988236Z" level=info msg="CreateContainer within sandbox \"4aab30ceb6b2b66f8983e1ecd01d27a695c76635a6f758ec53a1b5f0e6ebbbf6\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Sep 6 00:00:07.748767 containerd[1576]: time="2025-09-06T00:00:07.748610126Z" level=info msg="CreateContainer within sandbox \"4aab30ceb6b2b66f8983e1ecd01d27a695c76635a6f758ec53a1b5f0e6ebbbf6\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"ee70edbdc10c146a7957076bed12e22cf54e590901265b3418d34cfe4c782ca3\"" Sep 6 00:00:07.751345 containerd[1576]: time="2025-09-06T00:00:07.750168824Z" level=info msg="StartContainer for \"ee70edbdc10c146a7957076bed12e22cf54e590901265b3418d34cfe4c782ca3\"" Sep 6 00:00:07.815414 containerd[1576]: time="2025-09-06T00:00:07.815342141Z" level=info msg="StartContainer for 
\"ee70edbdc10c146a7957076bed12e22cf54e590901265b3418d34cfe4c782ca3\" returns successfully" Sep 6 00:00:07.862582 systemd[1]: run-netns-cni\x2d5e555f1d\x2df496\x2d947c\x2d1843\x2d148097b1a52e.mount: Deactivated successfully. Sep 6 00:00:07.864848 systemd[1]: run-netns-cni\x2dac55c055\x2d898f\x2d0dfa\x2d8f1c\x2dbcd49c5c7460.mount: Deactivated successfully. Sep 6 00:00:07.951096 kubelet[2774]: I0906 00:00:07.950088 2774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-7c65d6cfc9-mm682" podStartSLOduration=42.950067265 podStartE2EDuration="42.950067265s" podCreationTimestamp="2025-09-05 23:59:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-06 00:00:07.948288324 +0000 UTC m=+49.511912595" watchObservedRunningTime="2025-09-06 00:00:07.950067265 +0000 UTC m=+49.513691536" Sep 6 00:00:08.361656 systemd-networkd[1232]: cali72f4e08d14e: Gained IPv6LL Sep 6 00:00:08.745313 systemd-networkd[1232]: calif64776222e5: Gained IPv6LL Sep 6 00:00:09.257372 systemd-networkd[1232]: calic9e07cbfc1c: Gained IPv6LL Sep 6 00:00:09.322044 systemd-networkd[1232]: calif7eeb23cccd: Gained IPv6LL Sep 6 00:00:11.819595 containerd[1576]: time="2025-09-06T00:00:11.819515756Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 6 00:00:11.821863 containerd[1576]: time="2025-09-06T00:00:11.821151893Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.3: active requests=0, bytes read=44530807" Sep 6 00:00:11.823332 containerd[1576]: time="2025-09-06T00:00:11.823105434Z" level=info msg="ImageCreate event name:\"sha256:632fbde00b1918016ac07458e79cc438ccda83cb762bfd5fc50a26721abced08\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 6 00:00:11.830534 containerd[1576]: time="2025-09-06T00:00:11.830373791Z" level=info msg="Pulled 
image \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" with image id \"sha256:632fbde00b1918016ac07458e79cc438ccda83cb762bfd5fc50a26721abced08\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\", size \"45900064\" in 7.642434984s" Sep 6 00:00:11.830534 containerd[1576]: time="2025-09-06T00:00:11.830425272Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" returns image reference \"sha256:632fbde00b1918016ac07458e79cc438ccda83cb762bfd5fc50a26721abced08\"" Sep 6 00:00:11.834606 containerd[1576]: time="2025-09-06T00:00:11.833151500Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.3\"" Sep 6 00:00:11.841153 containerd[1576]: time="2025-09-06T00:00:11.841096184Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 6 00:00:11.848475 containerd[1576]: time="2025-09-06T00:00:11.848297661Z" level=info msg="CreateContainer within sandbox \"aec6df2d8f9fd860a7fd99b4ffd502270992208348b6984be61845d928b36bce\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Sep 6 00:00:11.890465 containerd[1576]: time="2025-09-06T00:00:11.890402106Z" level=info msg="CreateContainer within sandbox \"aec6df2d8f9fd860a7fd99b4ffd502270992208348b6984be61845d928b36bce\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"53ad8859cbfed070e16fabc3a307ae4326ca6e6549da9c45cb5c4fe1f5fa3c03\"" Sep 6 00:00:11.895144 containerd[1576]: time="2025-09-06T00:00:11.893311097Z" level=info msg="StartContainer for \"53ad8859cbfed070e16fabc3a307ae4326ca6e6549da9c45cb5c4fe1f5fa3c03\"" Sep 6 00:00:12.023390 containerd[1576]: time="2025-09-06T00:00:12.022606540Z" level=info msg="StartContainer for 
\"53ad8859cbfed070e16fabc3a307ae4326ca6e6549da9c45cb5c4fe1f5fa3c03\" returns successfully" Sep 6 00:00:12.992986 kubelet[2774]: I0906 00:00:12.992911 2774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-76445ff9b6-n8cgl" podStartSLOduration=28.388728123 podStartE2EDuration="36.992872134s" podCreationTimestamp="2025-09-05 23:59:36 +0000 UTC" firstStartedPulling="2025-09-06 00:00:03.227826677 +0000 UTC m=+44.791450908" lastFinishedPulling="2025-09-06 00:00:11.831970568 +0000 UTC m=+53.395594919" observedRunningTime="2025-09-06 00:00:12.991283518 +0000 UTC m=+54.554907749" watchObservedRunningTime="2025-09-06 00:00:12.992872134 +0000 UTC m=+54.556496365" Sep 6 00:00:13.976367 kubelet[2774]: I0906 00:00:13.975545 2774 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 6 00:00:18.594855 containerd[1576]: time="2025-09-06T00:00:18.594733239Z" level=info msg="StopPodSandbox for \"163c661c831b254f802d8292519fe2430e7877ab5ce0bd52311fca24b758818a\"" Sep 6 00:00:18.686872 containerd[1576]: 2025-09-06 00:00:18.637 [WARNING][5354] cni-plugin/k8s.go 604: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="163c661c831b254f802d8292519fe2430e7877ab5ce0bd52311fca24b758818a" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--5--n--8aba32846f-k8s-calico--apiserver--76445ff9b6--kh72c-eth0", GenerateName:"calico-apiserver-76445ff9b6-", Namespace:"calico-apiserver", SelfLink:"", UID:"55586fdc-5c0e-4b46-a8dd-484764d828e4", ResourceVersion:"993", Generation:0, CreationTimestamp:time.Date(2025, time.September, 5, 23, 59, 36, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"76445ff9b6", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-5-n-8aba32846f", ContainerID:"b3d73fc337479040a4a30615e586489c3394daa44cd2ca6d876324676a90e1a4", Pod:"calico-apiserver-76445ff9b6-kh72c", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.74.5/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calif64776222e5", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 6 00:00:18.686872 containerd[1576]: 2025-09-06 00:00:18.637 [INFO][5354] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="163c661c831b254f802d8292519fe2430e7877ab5ce0bd52311fca24b758818a" Sep 6 00:00:18.686872 containerd[1576]: 2025-09-06 00:00:18.637 [INFO][5354] cni-plugin/dataplane_linux.go 555: 
CleanUpNamespace called with no netns name, ignoring. ContainerID="163c661c831b254f802d8292519fe2430e7877ab5ce0bd52311fca24b758818a" iface="eth0" netns="" Sep 6 00:00:18.686872 containerd[1576]: 2025-09-06 00:00:18.637 [INFO][5354] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="163c661c831b254f802d8292519fe2430e7877ab5ce0bd52311fca24b758818a" Sep 6 00:00:18.686872 containerd[1576]: 2025-09-06 00:00:18.637 [INFO][5354] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="163c661c831b254f802d8292519fe2430e7877ab5ce0bd52311fca24b758818a" Sep 6 00:00:18.686872 containerd[1576]: 2025-09-06 00:00:18.660 [INFO][5362] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="163c661c831b254f802d8292519fe2430e7877ab5ce0bd52311fca24b758818a" HandleID="k8s-pod-network.163c661c831b254f802d8292519fe2430e7877ab5ce0bd52311fca24b758818a" Workload="ci--4081--3--5--n--8aba32846f-k8s-calico--apiserver--76445ff9b6--kh72c-eth0" Sep 6 00:00:18.686872 containerd[1576]: 2025-09-06 00:00:18.660 [INFO][5362] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 6 00:00:18.686872 containerd[1576]: 2025-09-06 00:00:18.660 [INFO][5362] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 6 00:00:18.686872 containerd[1576]: 2025-09-06 00:00:18.679 [WARNING][5362] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="163c661c831b254f802d8292519fe2430e7877ab5ce0bd52311fca24b758818a" HandleID="k8s-pod-network.163c661c831b254f802d8292519fe2430e7877ab5ce0bd52311fca24b758818a" Workload="ci--4081--3--5--n--8aba32846f-k8s-calico--apiserver--76445ff9b6--kh72c-eth0" Sep 6 00:00:18.686872 containerd[1576]: 2025-09-06 00:00:18.679 [INFO][5362] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="163c661c831b254f802d8292519fe2430e7877ab5ce0bd52311fca24b758818a" HandleID="k8s-pod-network.163c661c831b254f802d8292519fe2430e7877ab5ce0bd52311fca24b758818a" Workload="ci--4081--3--5--n--8aba32846f-k8s-calico--apiserver--76445ff9b6--kh72c-eth0" Sep 6 00:00:18.686872 containerd[1576]: 2025-09-06 00:00:18.681 [INFO][5362] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 6 00:00:18.686872 containerd[1576]: 2025-09-06 00:00:18.684 [INFO][5354] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="163c661c831b254f802d8292519fe2430e7877ab5ce0bd52311fca24b758818a" Sep 6 00:00:18.688330 containerd[1576]: time="2025-09-06T00:00:18.687799002Z" level=info msg="TearDown network for sandbox \"163c661c831b254f802d8292519fe2430e7877ab5ce0bd52311fca24b758818a\" successfully" Sep 6 00:00:18.688330 containerd[1576]: time="2025-09-06T00:00:18.687838843Z" level=info msg="StopPodSandbox for \"163c661c831b254f802d8292519fe2430e7877ab5ce0bd52311fca24b758818a\" returns successfully" Sep 6 00:00:18.688525 containerd[1576]: time="2025-09-06T00:00:18.688473848Z" level=info msg="RemovePodSandbox for \"163c661c831b254f802d8292519fe2430e7877ab5ce0bd52311fca24b758818a\"" Sep 6 00:00:18.690969 containerd[1576]: time="2025-09-06T00:00:18.690926791Z" level=info msg="Forcibly stopping sandbox \"163c661c831b254f802d8292519fe2430e7877ab5ce0bd52311fca24b758818a\"" Sep 6 00:00:18.775712 containerd[1576]: 2025-09-06 00:00:18.732 [WARNING][5376] cni-plugin/k8s.go 604: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="163c661c831b254f802d8292519fe2430e7877ab5ce0bd52311fca24b758818a" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--5--n--8aba32846f-k8s-calico--apiserver--76445ff9b6--kh72c-eth0", GenerateName:"calico-apiserver-76445ff9b6-", Namespace:"calico-apiserver", SelfLink:"", UID:"55586fdc-5c0e-4b46-a8dd-484764d828e4", ResourceVersion:"993", Generation:0, CreationTimestamp:time.Date(2025, time.September, 5, 23, 59, 36, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"76445ff9b6", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-5-n-8aba32846f", ContainerID:"b3d73fc337479040a4a30615e586489c3394daa44cd2ca6d876324676a90e1a4", Pod:"calico-apiserver-76445ff9b6-kh72c", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.74.5/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calif64776222e5", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 6 00:00:18.775712 containerd[1576]: 2025-09-06 00:00:18.732 [INFO][5376] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="163c661c831b254f802d8292519fe2430e7877ab5ce0bd52311fca24b758818a" Sep 6 00:00:18.775712 containerd[1576]: 2025-09-06 00:00:18.732 [INFO][5376] cni-plugin/dataplane_linux.go 555: 
CleanUpNamespace called with no netns name, ignoring. ContainerID="163c661c831b254f802d8292519fe2430e7877ab5ce0bd52311fca24b758818a" iface="eth0" netns="" Sep 6 00:00:18.775712 containerd[1576]: 2025-09-06 00:00:18.732 [INFO][5376] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="163c661c831b254f802d8292519fe2430e7877ab5ce0bd52311fca24b758818a" Sep 6 00:00:18.775712 containerd[1576]: 2025-09-06 00:00:18.732 [INFO][5376] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="163c661c831b254f802d8292519fe2430e7877ab5ce0bd52311fca24b758818a" Sep 6 00:00:18.775712 containerd[1576]: 2025-09-06 00:00:18.757 [INFO][5383] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="163c661c831b254f802d8292519fe2430e7877ab5ce0bd52311fca24b758818a" HandleID="k8s-pod-network.163c661c831b254f802d8292519fe2430e7877ab5ce0bd52311fca24b758818a" Workload="ci--4081--3--5--n--8aba32846f-k8s-calico--apiserver--76445ff9b6--kh72c-eth0" Sep 6 00:00:18.775712 containerd[1576]: 2025-09-06 00:00:18.757 [INFO][5383] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 6 00:00:18.775712 containerd[1576]: 2025-09-06 00:00:18.757 [INFO][5383] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 6 00:00:18.775712 containerd[1576]: 2025-09-06 00:00:18.769 [WARNING][5383] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="163c661c831b254f802d8292519fe2430e7877ab5ce0bd52311fca24b758818a" HandleID="k8s-pod-network.163c661c831b254f802d8292519fe2430e7877ab5ce0bd52311fca24b758818a" Workload="ci--4081--3--5--n--8aba32846f-k8s-calico--apiserver--76445ff9b6--kh72c-eth0" Sep 6 00:00:18.775712 containerd[1576]: 2025-09-06 00:00:18.769 [INFO][5383] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="163c661c831b254f802d8292519fe2430e7877ab5ce0bd52311fca24b758818a" HandleID="k8s-pod-network.163c661c831b254f802d8292519fe2430e7877ab5ce0bd52311fca24b758818a" Workload="ci--4081--3--5--n--8aba32846f-k8s-calico--apiserver--76445ff9b6--kh72c-eth0" Sep 6 00:00:18.775712 containerd[1576]: 2025-09-06 00:00:18.772 [INFO][5383] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 6 00:00:18.775712 containerd[1576]: 2025-09-06 00:00:18.773 [INFO][5376] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="163c661c831b254f802d8292519fe2430e7877ab5ce0bd52311fca24b758818a" Sep 6 00:00:18.775712 containerd[1576]: time="2025-09-06T00:00:18.775691919Z" level=info msg="TearDown network for sandbox \"163c661c831b254f802d8292519fe2430e7877ab5ce0bd52311fca24b758818a\" successfully" Sep 6 00:00:18.781037 containerd[1576]: time="2025-09-06T00:00:18.780973687Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"163c661c831b254f802d8292519fe2430e7877ab5ce0bd52311fca24b758818a\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Sep 6 00:00:18.781178 containerd[1576]: time="2025-09-06T00:00:18.781095608Z" level=info msg="RemovePodSandbox \"163c661c831b254f802d8292519fe2430e7877ab5ce0bd52311fca24b758818a\" returns successfully" Sep 6 00:00:18.781999 containerd[1576]: time="2025-09-06T00:00:18.781711574Z" level=info msg="StopPodSandbox for \"53969e86fe5a8a6a8831c6b4f2b68d2302e067d9414418b76ab2596ce90d5856\"" Sep 6 00:00:18.866919 containerd[1576]: 2025-09-06 00:00:18.820 [WARNING][5397] cni-plugin/k8s.go 598: WorkloadEndpoint does not exist in the datastore, moving forward with the clean up ContainerID="53969e86fe5a8a6a8831c6b4f2b68d2302e067d9414418b76ab2596ce90d5856" WorkloadEndpoint="ci--4081--3--5--n--8aba32846f-k8s-whisker--7cd7c86f75--6qzr6-eth0" Sep 6 00:00:18.866919 containerd[1576]: 2025-09-06 00:00:18.820 [INFO][5397] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="53969e86fe5a8a6a8831c6b4f2b68d2302e067d9414418b76ab2596ce90d5856" Sep 6 00:00:18.866919 containerd[1576]: 2025-09-06 00:00:18.820 [INFO][5397] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. 
ContainerID="53969e86fe5a8a6a8831c6b4f2b68d2302e067d9414418b76ab2596ce90d5856" iface="eth0" netns="" Sep 6 00:00:18.866919 containerd[1576]: 2025-09-06 00:00:18.820 [INFO][5397] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="53969e86fe5a8a6a8831c6b4f2b68d2302e067d9414418b76ab2596ce90d5856" Sep 6 00:00:18.866919 containerd[1576]: 2025-09-06 00:00:18.820 [INFO][5397] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="53969e86fe5a8a6a8831c6b4f2b68d2302e067d9414418b76ab2596ce90d5856" Sep 6 00:00:18.866919 containerd[1576]: 2025-09-06 00:00:18.847 [INFO][5404] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="53969e86fe5a8a6a8831c6b4f2b68d2302e067d9414418b76ab2596ce90d5856" HandleID="k8s-pod-network.53969e86fe5a8a6a8831c6b4f2b68d2302e067d9414418b76ab2596ce90d5856" Workload="ci--4081--3--5--n--8aba32846f-k8s-whisker--7cd7c86f75--6qzr6-eth0" Sep 6 00:00:18.866919 containerd[1576]: 2025-09-06 00:00:18.847 [INFO][5404] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 6 00:00:18.866919 containerd[1576]: 2025-09-06 00:00:18.847 [INFO][5404] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 6 00:00:18.866919 containerd[1576]: 2025-09-06 00:00:18.860 [WARNING][5404] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="53969e86fe5a8a6a8831c6b4f2b68d2302e067d9414418b76ab2596ce90d5856" HandleID="k8s-pod-network.53969e86fe5a8a6a8831c6b4f2b68d2302e067d9414418b76ab2596ce90d5856" Workload="ci--4081--3--5--n--8aba32846f-k8s-whisker--7cd7c86f75--6qzr6-eth0" Sep 6 00:00:18.866919 containerd[1576]: 2025-09-06 00:00:18.860 [INFO][5404] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="53969e86fe5a8a6a8831c6b4f2b68d2302e067d9414418b76ab2596ce90d5856" HandleID="k8s-pod-network.53969e86fe5a8a6a8831c6b4f2b68d2302e067d9414418b76ab2596ce90d5856" Workload="ci--4081--3--5--n--8aba32846f-k8s-whisker--7cd7c86f75--6qzr6-eth0" Sep 6 00:00:18.866919 containerd[1576]: 2025-09-06 00:00:18.863 [INFO][5404] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 6 00:00:18.866919 containerd[1576]: 2025-09-06 00:00:18.864 [INFO][5397] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="53969e86fe5a8a6a8831c6b4f2b68d2302e067d9414418b76ab2596ce90d5856" Sep 6 00:00:18.867674 containerd[1576]: time="2025-09-06T00:00:18.867442711Z" level=info msg="TearDown network for sandbox \"53969e86fe5a8a6a8831c6b4f2b68d2302e067d9414418b76ab2596ce90d5856\" successfully" Sep 6 00:00:18.867674 containerd[1576]: time="2025-09-06T00:00:18.867498231Z" level=info msg="StopPodSandbox for \"53969e86fe5a8a6a8831c6b4f2b68d2302e067d9414418b76ab2596ce90d5856\" returns successfully" Sep 6 00:00:18.868279 containerd[1576]: time="2025-09-06T00:00:18.868237518Z" level=info msg="RemovePodSandbox for \"53969e86fe5a8a6a8831c6b4f2b68d2302e067d9414418b76ab2596ce90d5856\"" Sep 6 00:00:18.868279 containerd[1576]: time="2025-09-06T00:00:18.868275838Z" level=info msg="Forcibly stopping sandbox \"53969e86fe5a8a6a8831c6b4f2b68d2302e067d9414418b76ab2596ce90d5856\"" Sep 6 00:00:18.976253 containerd[1576]: 2025-09-06 00:00:18.911 [WARNING][5418] cni-plugin/k8s.go 598: WorkloadEndpoint does not exist in the datastore, moving forward with the clean up 
ContainerID="53969e86fe5a8a6a8831c6b4f2b68d2302e067d9414418b76ab2596ce90d5856" WorkloadEndpoint="ci--4081--3--5--n--8aba32846f-k8s-whisker--7cd7c86f75--6qzr6-eth0" Sep 6 00:00:18.976253 containerd[1576]: 2025-09-06 00:00:18.912 [INFO][5418] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="53969e86fe5a8a6a8831c6b4f2b68d2302e067d9414418b76ab2596ce90d5856" Sep 6 00:00:18.976253 containerd[1576]: 2025-09-06 00:00:18.912 [INFO][5418] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="53969e86fe5a8a6a8831c6b4f2b68d2302e067d9414418b76ab2596ce90d5856" iface="eth0" netns="" Sep 6 00:00:18.976253 containerd[1576]: 2025-09-06 00:00:18.912 [INFO][5418] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="53969e86fe5a8a6a8831c6b4f2b68d2302e067d9414418b76ab2596ce90d5856" Sep 6 00:00:18.976253 containerd[1576]: 2025-09-06 00:00:18.912 [INFO][5418] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="53969e86fe5a8a6a8831c6b4f2b68d2302e067d9414418b76ab2596ce90d5856" Sep 6 00:00:18.976253 containerd[1576]: 2025-09-06 00:00:18.958 [INFO][5425] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="53969e86fe5a8a6a8831c6b4f2b68d2302e067d9414418b76ab2596ce90d5856" HandleID="k8s-pod-network.53969e86fe5a8a6a8831c6b4f2b68d2302e067d9414418b76ab2596ce90d5856" Workload="ci--4081--3--5--n--8aba32846f-k8s-whisker--7cd7c86f75--6qzr6-eth0" Sep 6 00:00:18.976253 containerd[1576]: 2025-09-06 00:00:18.958 [INFO][5425] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 6 00:00:18.976253 containerd[1576]: 2025-09-06 00:00:18.958 [INFO][5425] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 6 00:00:18.976253 containerd[1576]: 2025-09-06 00:00:18.970 [WARNING][5425] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="53969e86fe5a8a6a8831c6b4f2b68d2302e067d9414418b76ab2596ce90d5856" HandleID="k8s-pod-network.53969e86fe5a8a6a8831c6b4f2b68d2302e067d9414418b76ab2596ce90d5856" Workload="ci--4081--3--5--n--8aba32846f-k8s-whisker--7cd7c86f75--6qzr6-eth0" Sep 6 00:00:18.976253 containerd[1576]: 2025-09-06 00:00:18.970 [INFO][5425] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="53969e86fe5a8a6a8831c6b4f2b68d2302e067d9414418b76ab2596ce90d5856" HandleID="k8s-pod-network.53969e86fe5a8a6a8831c6b4f2b68d2302e067d9414418b76ab2596ce90d5856" Workload="ci--4081--3--5--n--8aba32846f-k8s-whisker--7cd7c86f75--6qzr6-eth0" Sep 6 00:00:18.976253 containerd[1576]: 2025-09-06 00:00:18.973 [INFO][5425] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 6 00:00:18.976253 containerd[1576]: 2025-09-06 00:00:18.974 [INFO][5418] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="53969e86fe5a8a6a8831c6b4f2b68d2302e067d9414418b76ab2596ce90d5856" Sep 6 00:00:18.976695 containerd[1576]: time="2025-09-06T00:00:18.976284657Z" level=info msg="TearDown network for sandbox \"53969e86fe5a8a6a8831c6b4f2b68d2302e067d9414418b76ab2596ce90d5856\" successfully" Sep 6 00:00:18.981026 containerd[1576]: time="2025-09-06T00:00:18.980874819Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"53969e86fe5a8a6a8831c6b4f2b68d2302e067d9414418b76ab2596ce90d5856\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Sep 6 00:00:18.981176 containerd[1576]: time="2025-09-06T00:00:18.981088781Z" level=info msg="RemovePodSandbox \"53969e86fe5a8a6a8831c6b4f2b68d2302e067d9414418b76ab2596ce90d5856\" returns successfully" Sep 6 00:00:18.981973 containerd[1576]: time="2025-09-06T00:00:18.981883188Z" level=info msg="StopPodSandbox for \"5f76ebcb19d4e387a5e60f763189030d513100ad4d02f2bb2825e5fb67469098\"" Sep 6 00:00:19.071373 containerd[1576]: 2025-09-06 00:00:19.030 [WARNING][5439] cni-plugin/k8s.go 604: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="5f76ebcb19d4e387a5e60f763189030d513100ad4d02f2bb2825e5fb67469098" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--5--n--8aba32846f-k8s-goldmane--7988f88666--2kb4z-eth0", GenerateName:"goldmane-7988f88666-", Namespace:"calico-system", SelfLink:"", UID:"040258ac-5a18-4787-bc03-5dba35b78258", ResourceVersion:"997", Generation:0, CreationTimestamp:time.Date(2025, time.September, 5, 23, 59, 41, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"7988f88666", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-5-n-8aba32846f", ContainerID:"28315c1659efcb8886450b61571b272e89bb977f1548bffe403a305b272287ea", Pod:"goldmane-7988f88666-2kb4z", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.74.6/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, 
InterfaceName:"cali72f4e08d14e", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 6 00:00:19.071373 containerd[1576]: 2025-09-06 00:00:19.031 [INFO][5439] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="5f76ebcb19d4e387a5e60f763189030d513100ad4d02f2bb2825e5fb67469098" Sep 6 00:00:19.071373 containerd[1576]: 2025-09-06 00:00:19.031 [INFO][5439] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="5f76ebcb19d4e387a5e60f763189030d513100ad4d02f2bb2825e5fb67469098" iface="eth0" netns="" Sep 6 00:00:19.071373 containerd[1576]: 2025-09-06 00:00:19.031 [INFO][5439] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="5f76ebcb19d4e387a5e60f763189030d513100ad4d02f2bb2825e5fb67469098" Sep 6 00:00:19.071373 containerd[1576]: 2025-09-06 00:00:19.031 [INFO][5439] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="5f76ebcb19d4e387a5e60f763189030d513100ad4d02f2bb2825e5fb67469098" Sep 6 00:00:19.071373 containerd[1576]: 2025-09-06 00:00:19.053 [INFO][5446] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="5f76ebcb19d4e387a5e60f763189030d513100ad4d02f2bb2825e5fb67469098" HandleID="k8s-pod-network.5f76ebcb19d4e387a5e60f763189030d513100ad4d02f2bb2825e5fb67469098" Workload="ci--4081--3--5--n--8aba32846f-k8s-goldmane--7988f88666--2kb4z-eth0" Sep 6 00:00:19.071373 containerd[1576]: 2025-09-06 00:00:19.053 [INFO][5446] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 6 00:00:19.071373 containerd[1576]: 2025-09-06 00:00:19.053 [INFO][5446] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 6 00:00:19.071373 containerd[1576]: 2025-09-06 00:00:19.064 [WARNING][5446] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="5f76ebcb19d4e387a5e60f763189030d513100ad4d02f2bb2825e5fb67469098" HandleID="k8s-pod-network.5f76ebcb19d4e387a5e60f763189030d513100ad4d02f2bb2825e5fb67469098" Workload="ci--4081--3--5--n--8aba32846f-k8s-goldmane--7988f88666--2kb4z-eth0" Sep 6 00:00:19.071373 containerd[1576]: 2025-09-06 00:00:19.064 [INFO][5446] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="5f76ebcb19d4e387a5e60f763189030d513100ad4d02f2bb2825e5fb67469098" HandleID="k8s-pod-network.5f76ebcb19d4e387a5e60f763189030d513100ad4d02f2bb2825e5fb67469098" Workload="ci--4081--3--5--n--8aba32846f-k8s-goldmane--7988f88666--2kb4z-eth0" Sep 6 00:00:19.071373 containerd[1576]: 2025-09-06 00:00:19.067 [INFO][5446] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 6 00:00:19.071373 containerd[1576]: 2025-09-06 00:00:19.069 [INFO][5439] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="5f76ebcb19d4e387a5e60f763189030d513100ad4d02f2bb2825e5fb67469098" Sep 6 00:00:19.071373 containerd[1576]: time="2025-09-06T00:00:19.071260145Z" level=info msg="TearDown network for sandbox \"5f76ebcb19d4e387a5e60f763189030d513100ad4d02f2bb2825e5fb67469098\" successfully" Sep 6 00:00:19.071373 containerd[1576]: time="2025-09-06T00:00:19.071288025Z" level=info msg="StopPodSandbox for \"5f76ebcb19d4e387a5e60f763189030d513100ad4d02f2bb2825e5fb67469098\" returns successfully" Sep 6 00:00:19.072864 containerd[1576]: time="2025-09-06T00:00:19.072344955Z" level=info msg="RemovePodSandbox for \"5f76ebcb19d4e387a5e60f763189030d513100ad4d02f2bb2825e5fb67469098\"" Sep 6 00:00:19.072864 containerd[1576]: time="2025-09-06T00:00:19.072381555Z" level=info msg="Forcibly stopping sandbox \"5f76ebcb19d4e387a5e60f763189030d513100ad4d02f2bb2825e5fb67469098\"" Sep 6 00:00:19.161905 containerd[1576]: 2025-09-06 00:00:19.117 [WARNING][5460] cni-plugin/k8s.go 604: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="5f76ebcb19d4e387a5e60f763189030d513100ad4d02f2bb2825e5fb67469098" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--5--n--8aba32846f-k8s-goldmane--7988f88666--2kb4z-eth0", GenerateName:"goldmane-7988f88666-", Namespace:"calico-system", SelfLink:"", UID:"040258ac-5a18-4787-bc03-5dba35b78258", ResourceVersion:"997", Generation:0, CreationTimestamp:time.Date(2025, time.September, 5, 23, 59, 41, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"7988f88666", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-5-n-8aba32846f", ContainerID:"28315c1659efcb8886450b61571b272e89bb977f1548bffe403a305b272287ea", Pod:"goldmane-7988f88666-2kb4z", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.74.6/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali72f4e08d14e", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 6 00:00:19.161905 containerd[1576]: 2025-09-06 00:00:19.117 [INFO][5460] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="5f76ebcb19d4e387a5e60f763189030d513100ad4d02f2bb2825e5fb67469098" Sep 6 00:00:19.161905 containerd[1576]: 2025-09-06 00:00:19.117 [INFO][5460] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. 
ContainerID="5f76ebcb19d4e387a5e60f763189030d513100ad4d02f2bb2825e5fb67469098" iface="eth0" netns="" Sep 6 00:00:19.161905 containerd[1576]: 2025-09-06 00:00:19.117 [INFO][5460] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="5f76ebcb19d4e387a5e60f763189030d513100ad4d02f2bb2825e5fb67469098" Sep 6 00:00:19.161905 containerd[1576]: 2025-09-06 00:00:19.117 [INFO][5460] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="5f76ebcb19d4e387a5e60f763189030d513100ad4d02f2bb2825e5fb67469098" Sep 6 00:00:19.161905 containerd[1576]: 2025-09-06 00:00:19.140 [INFO][5467] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="5f76ebcb19d4e387a5e60f763189030d513100ad4d02f2bb2825e5fb67469098" HandleID="k8s-pod-network.5f76ebcb19d4e387a5e60f763189030d513100ad4d02f2bb2825e5fb67469098" Workload="ci--4081--3--5--n--8aba32846f-k8s-goldmane--7988f88666--2kb4z-eth0" Sep 6 00:00:19.161905 containerd[1576]: 2025-09-06 00:00:19.140 [INFO][5467] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 6 00:00:19.161905 containerd[1576]: 2025-09-06 00:00:19.140 [INFO][5467] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 6 00:00:19.161905 containerd[1576]: 2025-09-06 00:00:19.151 [WARNING][5467] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="5f76ebcb19d4e387a5e60f763189030d513100ad4d02f2bb2825e5fb67469098" HandleID="k8s-pod-network.5f76ebcb19d4e387a5e60f763189030d513100ad4d02f2bb2825e5fb67469098" Workload="ci--4081--3--5--n--8aba32846f-k8s-goldmane--7988f88666--2kb4z-eth0" Sep 6 00:00:19.161905 containerd[1576]: 2025-09-06 00:00:19.151 [INFO][5467] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="5f76ebcb19d4e387a5e60f763189030d513100ad4d02f2bb2825e5fb67469098" HandleID="k8s-pod-network.5f76ebcb19d4e387a5e60f763189030d513100ad4d02f2bb2825e5fb67469098" Workload="ci--4081--3--5--n--8aba32846f-k8s-goldmane--7988f88666--2kb4z-eth0" Sep 6 00:00:19.161905 containerd[1576]: 2025-09-06 00:00:19.153 [INFO][5467] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 6 00:00:19.161905 containerd[1576]: 2025-09-06 00:00:19.156 [INFO][5460] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="5f76ebcb19d4e387a5e60f763189030d513100ad4d02f2bb2825e5fb67469098" Sep 6 00:00:19.161905 containerd[1576]: time="2025-09-06T00:00:19.160370376Z" level=info msg="TearDown network for sandbox \"5f76ebcb19d4e387a5e60f763189030d513100ad4d02f2bb2825e5fb67469098\" successfully" Sep 6 00:00:19.167119 containerd[1576]: time="2025-09-06T00:00:19.167051115Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"5f76ebcb19d4e387a5e60f763189030d513100ad4d02f2bb2825e5fb67469098\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Sep 6 00:00:19.167412 containerd[1576]: time="2025-09-06T00:00:19.167378798Z" level=info msg="RemovePodSandbox \"5f76ebcb19d4e387a5e60f763189030d513100ad4d02f2bb2825e5fb67469098\" returns successfully" Sep 6 00:00:19.169109 containerd[1576]: time="2025-09-06T00:00:19.169056453Z" level=info msg="StopPodSandbox for \"ae180cb95b0cddc1db318cea1b2c734250e7c876f248c12b77adb8766503705e\"" Sep 6 00:00:19.260534 containerd[1576]: 2025-09-06 00:00:19.212 [WARNING][5482] cni-plugin/k8s.go 604: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="ae180cb95b0cddc1db318cea1b2c734250e7c876f248c12b77adb8766503705e" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--5--n--8aba32846f-k8s-coredns--7c65d6cfc9--mm682-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"7f869a7e-1df4-4450-bbcf-ba5d46557d8b", ResourceVersion:"1015", Generation:0, CreationTimestamp:time.Date(2025, time.September, 5, 23, 59, 25, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-5-n-8aba32846f", ContainerID:"4aab30ceb6b2b66f8983e1ecd01d27a695c76635a6f758ec53a1b5f0e6ebbbf6", Pod:"coredns-7c65d6cfc9-mm682", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.74.8/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calif7eeb23cccd", MAC:"", 
Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 6 00:00:19.260534 containerd[1576]: 2025-09-06 00:00:19.213 [INFO][5482] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="ae180cb95b0cddc1db318cea1b2c734250e7c876f248c12b77adb8766503705e" Sep 6 00:00:19.260534 containerd[1576]: 2025-09-06 00:00:19.213 [INFO][5482] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="ae180cb95b0cddc1db318cea1b2c734250e7c876f248c12b77adb8766503705e" iface="eth0" netns="" Sep 6 00:00:19.260534 containerd[1576]: 2025-09-06 00:00:19.213 [INFO][5482] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="ae180cb95b0cddc1db318cea1b2c734250e7c876f248c12b77adb8766503705e" Sep 6 00:00:19.260534 containerd[1576]: 2025-09-06 00:00:19.213 [INFO][5482] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="ae180cb95b0cddc1db318cea1b2c734250e7c876f248c12b77adb8766503705e" Sep 6 00:00:19.260534 containerd[1576]: 2025-09-06 00:00:19.239 [INFO][5489] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="ae180cb95b0cddc1db318cea1b2c734250e7c876f248c12b77adb8766503705e" HandleID="k8s-pod-network.ae180cb95b0cddc1db318cea1b2c734250e7c876f248c12b77adb8766503705e" Workload="ci--4081--3--5--n--8aba32846f-k8s-coredns--7c65d6cfc9--mm682-eth0" Sep 6 00:00:19.260534 containerd[1576]: 2025-09-06 00:00:19.239 [INFO][5489] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. 
Sep 6 00:00:19.260534 containerd[1576]: 2025-09-06 00:00:19.239 [INFO][5489] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 6 00:00:19.260534 containerd[1576]: 2025-09-06 00:00:19.253 [WARNING][5489] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="ae180cb95b0cddc1db318cea1b2c734250e7c876f248c12b77adb8766503705e" HandleID="k8s-pod-network.ae180cb95b0cddc1db318cea1b2c734250e7c876f248c12b77adb8766503705e" Workload="ci--4081--3--5--n--8aba32846f-k8s-coredns--7c65d6cfc9--mm682-eth0" Sep 6 00:00:19.260534 containerd[1576]: 2025-09-06 00:00:19.253 [INFO][5489] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="ae180cb95b0cddc1db318cea1b2c734250e7c876f248c12b77adb8766503705e" HandleID="k8s-pod-network.ae180cb95b0cddc1db318cea1b2c734250e7c876f248c12b77adb8766503705e" Workload="ci--4081--3--5--n--8aba32846f-k8s-coredns--7c65d6cfc9--mm682-eth0" Sep 6 00:00:19.260534 containerd[1576]: 2025-09-06 00:00:19.256 [INFO][5489] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 6 00:00:19.260534 containerd[1576]: 2025-09-06 00:00:19.258 [INFO][5482] cni-plugin/k8s.go 653: Teardown processing complete. 
ContainerID="ae180cb95b0cddc1db318cea1b2c734250e7c876f248c12b77adb8766503705e" Sep 6 00:00:19.260534 containerd[1576]: time="2025-09-06T00:00:19.260395464Z" level=info msg="TearDown network for sandbox \"ae180cb95b0cddc1db318cea1b2c734250e7c876f248c12b77adb8766503705e\" successfully" Sep 6 00:00:19.260534 containerd[1576]: time="2025-09-06T00:00:19.260423024Z" level=info msg="StopPodSandbox for \"ae180cb95b0cddc1db318cea1b2c734250e7c876f248c12b77adb8766503705e\" returns successfully" Sep 6 00:00:19.262001 containerd[1576]: time="2025-09-06T00:00:19.261481513Z" level=info msg="RemovePodSandbox for \"ae180cb95b0cddc1db318cea1b2c734250e7c876f248c12b77adb8766503705e\"" Sep 6 00:00:19.262001 containerd[1576]: time="2025-09-06T00:00:19.261527634Z" level=info msg="Forcibly stopping sandbox \"ae180cb95b0cddc1db318cea1b2c734250e7c876f248c12b77adb8766503705e\"" Sep 6 00:00:19.356139 containerd[1576]: 2025-09-06 00:00:19.303 [WARNING][5503] cni-plugin/k8s.go 604: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="ae180cb95b0cddc1db318cea1b2c734250e7c876f248c12b77adb8766503705e" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--5--n--8aba32846f-k8s-coredns--7c65d6cfc9--mm682-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"7f869a7e-1df4-4450-bbcf-ba5d46557d8b", ResourceVersion:"1015", Generation:0, CreationTimestamp:time.Date(2025, time.September, 5, 23, 59, 25, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-5-n-8aba32846f", ContainerID:"4aab30ceb6b2b66f8983e1ecd01d27a695c76635a6f758ec53a1b5f0e6ebbbf6", Pod:"coredns-7c65d6cfc9-mm682", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.74.8/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calif7eeb23cccd", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 6 00:00:19.356139 containerd[1576]: 
2025-09-06 00:00:19.303 [INFO][5503] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="ae180cb95b0cddc1db318cea1b2c734250e7c876f248c12b77adb8766503705e" Sep 6 00:00:19.356139 containerd[1576]: 2025-09-06 00:00:19.303 [INFO][5503] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="ae180cb95b0cddc1db318cea1b2c734250e7c876f248c12b77adb8766503705e" iface="eth0" netns="" Sep 6 00:00:19.356139 containerd[1576]: 2025-09-06 00:00:19.303 [INFO][5503] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="ae180cb95b0cddc1db318cea1b2c734250e7c876f248c12b77adb8766503705e" Sep 6 00:00:19.356139 containerd[1576]: 2025-09-06 00:00:19.303 [INFO][5503] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="ae180cb95b0cddc1db318cea1b2c734250e7c876f248c12b77adb8766503705e" Sep 6 00:00:19.356139 containerd[1576]: 2025-09-06 00:00:19.329 [INFO][5510] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="ae180cb95b0cddc1db318cea1b2c734250e7c876f248c12b77adb8766503705e" HandleID="k8s-pod-network.ae180cb95b0cddc1db318cea1b2c734250e7c876f248c12b77adb8766503705e" Workload="ci--4081--3--5--n--8aba32846f-k8s-coredns--7c65d6cfc9--mm682-eth0" Sep 6 00:00:19.356139 containerd[1576]: 2025-09-06 00:00:19.329 [INFO][5510] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 6 00:00:19.356139 containerd[1576]: 2025-09-06 00:00:19.329 [INFO][5510] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 6 00:00:19.356139 containerd[1576]: 2025-09-06 00:00:19.344 [WARNING][5510] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="ae180cb95b0cddc1db318cea1b2c734250e7c876f248c12b77adb8766503705e" HandleID="k8s-pod-network.ae180cb95b0cddc1db318cea1b2c734250e7c876f248c12b77adb8766503705e" Workload="ci--4081--3--5--n--8aba32846f-k8s-coredns--7c65d6cfc9--mm682-eth0" Sep 6 00:00:19.356139 containerd[1576]: 2025-09-06 00:00:19.345 [INFO][5510] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="ae180cb95b0cddc1db318cea1b2c734250e7c876f248c12b77adb8766503705e" HandleID="k8s-pod-network.ae180cb95b0cddc1db318cea1b2c734250e7c876f248c12b77adb8766503705e" Workload="ci--4081--3--5--n--8aba32846f-k8s-coredns--7c65d6cfc9--mm682-eth0" Sep 6 00:00:19.356139 containerd[1576]: 2025-09-06 00:00:19.348 [INFO][5510] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 6 00:00:19.356139 containerd[1576]: 2025-09-06 00:00:19.352 [INFO][5503] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="ae180cb95b0cddc1db318cea1b2c734250e7c876f248c12b77adb8766503705e" Sep 6 00:00:19.356139 containerd[1576]: time="2025-09-06T00:00:19.355977392Z" level=info msg="TearDown network for sandbox \"ae180cb95b0cddc1db318cea1b2c734250e7c876f248c12b77adb8766503705e\" successfully" Sep 6 00:00:19.361529 containerd[1576]: time="2025-09-06T00:00:19.361461200Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"ae180cb95b0cddc1db318cea1b2c734250e7c876f248c12b77adb8766503705e\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Sep 6 00:00:19.361831 containerd[1576]: time="2025-09-06T00:00:19.361554361Z" level=info msg="RemovePodSandbox \"ae180cb95b0cddc1db318cea1b2c734250e7c876f248c12b77adb8766503705e\" returns successfully" Sep 6 00:00:19.362236 containerd[1576]: time="2025-09-06T00:00:19.362198687Z" level=info msg="StopPodSandbox for \"1e4bb5df844a20df2c4fe926e353f7c7f74664aea15c1506b34b0eb7d0212063\"" Sep 6 00:00:19.453750 containerd[1576]: 2025-09-06 00:00:19.409 [WARNING][5524] cni-plugin/k8s.go 604: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="1e4bb5df844a20df2c4fe926e353f7c7f74664aea15c1506b34b0eb7d0212063" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--5--n--8aba32846f-k8s-csi--node--driver--ps2ks-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"4ff97107-cfe4-4d25-918b-36fd4176bf0c", ResourceVersion:"965", Generation:0, CreationTimestamp:time.Date(2025, time.September, 5, 23, 59, 42, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"856c6b598f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-5-n-8aba32846f", ContainerID:"59eeb191e9847ded5f0b0c50561a32db880df7b37af27b8bfebc65eb26a73e70", Pod:"csi-node-driver-ps2ks", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.74.4/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", 
IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali64b00cc420c", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 6 00:00:19.453750 containerd[1576]: 2025-09-06 00:00:19.409 [INFO][5524] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="1e4bb5df844a20df2c4fe926e353f7c7f74664aea15c1506b34b0eb7d0212063" Sep 6 00:00:19.453750 containerd[1576]: 2025-09-06 00:00:19.410 [INFO][5524] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="1e4bb5df844a20df2c4fe926e353f7c7f74664aea15c1506b34b0eb7d0212063" iface="eth0" netns="" Sep 6 00:00:19.453750 containerd[1576]: 2025-09-06 00:00:19.410 [INFO][5524] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="1e4bb5df844a20df2c4fe926e353f7c7f74664aea15c1506b34b0eb7d0212063" Sep 6 00:00:19.453750 containerd[1576]: 2025-09-06 00:00:19.410 [INFO][5524] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="1e4bb5df844a20df2c4fe926e353f7c7f74664aea15c1506b34b0eb7d0212063" Sep 6 00:00:19.453750 containerd[1576]: 2025-09-06 00:00:19.436 [INFO][5532] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="1e4bb5df844a20df2c4fe926e353f7c7f74664aea15c1506b34b0eb7d0212063" HandleID="k8s-pod-network.1e4bb5df844a20df2c4fe926e353f7c7f74664aea15c1506b34b0eb7d0212063" Workload="ci--4081--3--5--n--8aba32846f-k8s-csi--node--driver--ps2ks-eth0" Sep 6 00:00:19.453750 containerd[1576]: 2025-09-06 00:00:19.436 [INFO][5532] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 6 00:00:19.453750 containerd[1576]: 2025-09-06 00:00:19.436 [INFO][5532] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 6 00:00:19.453750 containerd[1576]: 2025-09-06 00:00:19.447 [WARNING][5532] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="1e4bb5df844a20df2c4fe926e353f7c7f74664aea15c1506b34b0eb7d0212063" HandleID="k8s-pod-network.1e4bb5df844a20df2c4fe926e353f7c7f74664aea15c1506b34b0eb7d0212063" Workload="ci--4081--3--5--n--8aba32846f-k8s-csi--node--driver--ps2ks-eth0" Sep 6 00:00:19.453750 containerd[1576]: 2025-09-06 00:00:19.447 [INFO][5532] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="1e4bb5df844a20df2c4fe926e353f7c7f74664aea15c1506b34b0eb7d0212063" HandleID="k8s-pod-network.1e4bb5df844a20df2c4fe926e353f7c7f74664aea15c1506b34b0eb7d0212063" Workload="ci--4081--3--5--n--8aba32846f-k8s-csi--node--driver--ps2ks-eth0" Sep 6 00:00:19.453750 containerd[1576]: 2025-09-06 00:00:19.449 [INFO][5532] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 6 00:00:19.453750 containerd[1576]: 2025-09-06 00:00:19.451 [INFO][5524] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="1e4bb5df844a20df2c4fe926e353f7c7f74664aea15c1506b34b0eb7d0212063" Sep 6 00:00:19.453750 containerd[1576]: time="2025-09-06T00:00:19.453382256Z" level=info msg="TearDown network for sandbox \"1e4bb5df844a20df2c4fe926e353f7c7f74664aea15c1506b34b0eb7d0212063\" successfully" Sep 6 00:00:19.453750 containerd[1576]: time="2025-09-06T00:00:19.453427137Z" level=info msg="StopPodSandbox for \"1e4bb5df844a20df2c4fe926e353f7c7f74664aea15c1506b34b0eb7d0212063\" returns successfully" Sep 6 00:00:19.456227 containerd[1576]: time="2025-09-06T00:00:19.455992039Z" level=info msg="RemovePodSandbox for \"1e4bb5df844a20df2c4fe926e353f7c7f74664aea15c1506b34b0eb7d0212063\"" Sep 6 00:00:19.456227 containerd[1576]: time="2025-09-06T00:00:19.456075640Z" level=info msg="Forcibly stopping sandbox \"1e4bb5df844a20df2c4fe926e353f7c7f74664aea15c1506b34b0eb7d0212063\"" Sep 6 00:00:19.552890 containerd[1576]: 2025-09-06 00:00:19.504 [WARNING][5547] cni-plugin/k8s.go 604: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="1e4bb5df844a20df2c4fe926e353f7c7f74664aea15c1506b34b0eb7d0212063" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--5--n--8aba32846f-k8s-csi--node--driver--ps2ks-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"4ff97107-cfe4-4d25-918b-36fd4176bf0c", ResourceVersion:"965", Generation:0, CreationTimestamp:time.Date(2025, time.September, 5, 23, 59, 42, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"856c6b598f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-5-n-8aba32846f", ContainerID:"59eeb191e9847ded5f0b0c50561a32db880df7b37af27b8bfebc65eb26a73e70", Pod:"csi-node-driver-ps2ks", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.74.4/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali64b00cc420c", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 6 00:00:19.552890 containerd[1576]: 2025-09-06 00:00:19.504 [INFO][5547] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="1e4bb5df844a20df2c4fe926e353f7c7f74664aea15c1506b34b0eb7d0212063" Sep 6 00:00:19.552890 containerd[1576]: 2025-09-06 00:00:19.504 [INFO][5547] cni-plugin/dataplane_linux.go 555: CleanUpNamespace 
called with no netns name, ignoring. ContainerID="1e4bb5df844a20df2c4fe926e353f7c7f74664aea15c1506b34b0eb7d0212063" iface="eth0" netns="" Sep 6 00:00:19.552890 containerd[1576]: 2025-09-06 00:00:19.504 [INFO][5547] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="1e4bb5df844a20df2c4fe926e353f7c7f74664aea15c1506b34b0eb7d0212063" Sep 6 00:00:19.552890 containerd[1576]: 2025-09-06 00:00:19.504 [INFO][5547] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="1e4bb5df844a20df2c4fe926e353f7c7f74664aea15c1506b34b0eb7d0212063" Sep 6 00:00:19.552890 containerd[1576]: 2025-09-06 00:00:19.531 [INFO][5554] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="1e4bb5df844a20df2c4fe926e353f7c7f74664aea15c1506b34b0eb7d0212063" HandleID="k8s-pod-network.1e4bb5df844a20df2c4fe926e353f7c7f74664aea15c1506b34b0eb7d0212063" Workload="ci--4081--3--5--n--8aba32846f-k8s-csi--node--driver--ps2ks-eth0" Sep 6 00:00:19.552890 containerd[1576]: 2025-09-06 00:00:19.532 [INFO][5554] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 6 00:00:19.552890 containerd[1576]: 2025-09-06 00:00:19.532 [INFO][5554] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 6 00:00:19.552890 containerd[1576]: 2025-09-06 00:00:19.544 [WARNING][5554] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="1e4bb5df844a20df2c4fe926e353f7c7f74664aea15c1506b34b0eb7d0212063" HandleID="k8s-pod-network.1e4bb5df844a20df2c4fe926e353f7c7f74664aea15c1506b34b0eb7d0212063" Workload="ci--4081--3--5--n--8aba32846f-k8s-csi--node--driver--ps2ks-eth0" Sep 6 00:00:19.552890 containerd[1576]: 2025-09-06 00:00:19.544 [INFO][5554] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="1e4bb5df844a20df2c4fe926e353f7c7f74664aea15c1506b34b0eb7d0212063" HandleID="k8s-pod-network.1e4bb5df844a20df2c4fe926e353f7c7f74664aea15c1506b34b0eb7d0212063" Workload="ci--4081--3--5--n--8aba32846f-k8s-csi--node--driver--ps2ks-eth0" Sep 6 00:00:19.552890 containerd[1576]: 2025-09-06 00:00:19.547 [INFO][5554] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 6 00:00:19.552890 containerd[1576]: 2025-09-06 00:00:19.549 [INFO][5547] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="1e4bb5df844a20df2c4fe926e353f7c7f74664aea15c1506b34b0eb7d0212063" Sep 6 00:00:19.552890 containerd[1576]: time="2025-09-06T00:00:19.552756858Z" level=info msg="TearDown network for sandbox \"1e4bb5df844a20df2c4fe926e353f7c7f74664aea15c1506b34b0eb7d0212063\" successfully" Sep 6 00:00:19.558901 containerd[1576]: time="2025-09-06T00:00:19.558804472Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"1e4bb5df844a20df2c4fe926e353f7c7f74664aea15c1506b34b0eb7d0212063\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Sep 6 00:00:19.558901 containerd[1576]: time="2025-09-06T00:00:19.558882712Z" level=info msg="RemovePodSandbox \"1e4bb5df844a20df2c4fe926e353f7c7f74664aea15c1506b34b0eb7d0212063\" returns successfully" Sep 6 00:00:19.559579 containerd[1576]: time="2025-09-06T00:00:19.559482558Z" level=info msg="StopPodSandbox for \"7de5f6b50a372a2489937cdf42ce9de88194b70f9ab1f53b5785bf6d715f8f61\"" Sep 6 00:00:19.651289 containerd[1576]: 2025-09-06 00:00:19.610 [WARNING][5568] cni-plugin/k8s.go 604: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="7de5f6b50a372a2489937cdf42ce9de88194b70f9ab1f53b5785bf6d715f8f61" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--5--n--8aba32846f-k8s-calico--kube--controllers--cbc856df4--cgnd8-eth0", GenerateName:"calico-kube-controllers-cbc856df4-", Namespace:"calico-system", SelfLink:"", UID:"0d045d3d-688c-447d-9931-b4f33f413acf", ResourceVersion:"1001", Generation:0, CreationTimestamp:time.Date(2025, time.September, 5, 23, 59, 42, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"cbc856df4", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-5-n-8aba32846f", ContainerID:"bbc184507886a72b1c880a2a42d73250baf8226c68e72bd198bfa30293acdff5", Pod:"calico-kube-controllers-cbc856df4-cgnd8", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.74.7/32"}, IPNATs:[]v3.IPNAT(nil), 
IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calic9e07cbfc1c", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 6 00:00:19.651289 containerd[1576]: 2025-09-06 00:00:19.611 [INFO][5568] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="7de5f6b50a372a2489937cdf42ce9de88194b70f9ab1f53b5785bf6d715f8f61" Sep 6 00:00:19.651289 containerd[1576]: 2025-09-06 00:00:19.611 [INFO][5568] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="7de5f6b50a372a2489937cdf42ce9de88194b70f9ab1f53b5785bf6d715f8f61" iface="eth0" netns="" Sep 6 00:00:19.651289 containerd[1576]: 2025-09-06 00:00:19.611 [INFO][5568] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="7de5f6b50a372a2489937cdf42ce9de88194b70f9ab1f53b5785bf6d715f8f61" Sep 6 00:00:19.651289 containerd[1576]: 2025-09-06 00:00:19.611 [INFO][5568] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="7de5f6b50a372a2489937cdf42ce9de88194b70f9ab1f53b5785bf6d715f8f61" Sep 6 00:00:19.651289 containerd[1576]: 2025-09-06 00:00:19.633 [INFO][5575] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="7de5f6b50a372a2489937cdf42ce9de88194b70f9ab1f53b5785bf6d715f8f61" HandleID="k8s-pod-network.7de5f6b50a372a2489937cdf42ce9de88194b70f9ab1f53b5785bf6d715f8f61" Workload="ci--4081--3--5--n--8aba32846f-k8s-calico--kube--controllers--cbc856df4--cgnd8-eth0" Sep 6 00:00:19.651289 containerd[1576]: 2025-09-06 00:00:19.634 [INFO][5575] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 6 00:00:19.651289 containerd[1576]: 2025-09-06 00:00:19.634 [INFO][5575] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 6 00:00:19.651289 containerd[1576]: 2025-09-06 00:00:19.644 [WARNING][5575] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="7de5f6b50a372a2489937cdf42ce9de88194b70f9ab1f53b5785bf6d715f8f61" HandleID="k8s-pod-network.7de5f6b50a372a2489937cdf42ce9de88194b70f9ab1f53b5785bf6d715f8f61" Workload="ci--4081--3--5--n--8aba32846f-k8s-calico--kube--controllers--cbc856df4--cgnd8-eth0" Sep 6 00:00:19.651289 containerd[1576]: 2025-09-06 00:00:19.645 [INFO][5575] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="7de5f6b50a372a2489937cdf42ce9de88194b70f9ab1f53b5785bf6d715f8f61" HandleID="k8s-pod-network.7de5f6b50a372a2489937cdf42ce9de88194b70f9ab1f53b5785bf6d715f8f61" Workload="ci--4081--3--5--n--8aba32846f-k8s-calico--kube--controllers--cbc856df4--cgnd8-eth0" Sep 6 00:00:19.651289 containerd[1576]: 2025-09-06 00:00:19.647 [INFO][5575] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 6 00:00:19.651289 containerd[1576]: 2025-09-06 00:00:19.649 [INFO][5568] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="7de5f6b50a372a2489937cdf42ce9de88194b70f9ab1f53b5785bf6d715f8f61" Sep 6 00:00:19.652335 containerd[1576]: time="2025-09-06T00:00:19.651333333Z" level=info msg="TearDown network for sandbox \"7de5f6b50a372a2489937cdf42ce9de88194b70f9ab1f53b5785bf6d715f8f61\" successfully" Sep 6 00:00:19.652335 containerd[1576]: time="2025-09-06T00:00:19.651359773Z" level=info msg="StopPodSandbox for \"7de5f6b50a372a2489937cdf42ce9de88194b70f9ab1f53b5785bf6d715f8f61\" returns successfully" Sep 6 00:00:19.653151 containerd[1576]: time="2025-09-06T00:00:19.653116229Z" level=info msg="RemovePodSandbox for \"7de5f6b50a372a2489937cdf42ce9de88194b70f9ab1f53b5785bf6d715f8f61\"" Sep 6 00:00:19.653151 containerd[1576]: time="2025-09-06T00:00:19.653158829Z" level=info msg="Forcibly stopping sandbox \"7de5f6b50a372a2489937cdf42ce9de88194b70f9ab1f53b5785bf6d715f8f61\"" Sep 6 00:00:19.758783 containerd[1576]: 2025-09-06 00:00:19.699 [WARNING][5589] cni-plugin/k8s.go 604: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="7de5f6b50a372a2489937cdf42ce9de88194b70f9ab1f53b5785bf6d715f8f61" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--5--n--8aba32846f-k8s-calico--kube--controllers--cbc856df4--cgnd8-eth0", GenerateName:"calico-kube-controllers-cbc856df4-", Namespace:"calico-system", SelfLink:"", UID:"0d045d3d-688c-447d-9931-b4f33f413acf", ResourceVersion:"1001", Generation:0, CreationTimestamp:time.Date(2025, time.September, 5, 23, 59, 42, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"cbc856df4", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-5-n-8aba32846f", ContainerID:"bbc184507886a72b1c880a2a42d73250baf8226c68e72bd198bfa30293acdff5", Pod:"calico-kube-controllers-cbc856df4-cgnd8", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.74.7/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calic9e07cbfc1c", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 6 00:00:19.758783 containerd[1576]: 2025-09-06 00:00:19.700 [INFO][5589] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="7de5f6b50a372a2489937cdf42ce9de88194b70f9ab1f53b5785bf6d715f8f61" Sep 6 00:00:19.758783 containerd[1576]: 2025-09-06 00:00:19.700 [INFO][5589] 
cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="7de5f6b50a372a2489937cdf42ce9de88194b70f9ab1f53b5785bf6d715f8f61" iface="eth0" netns="" Sep 6 00:00:19.758783 containerd[1576]: 2025-09-06 00:00:19.700 [INFO][5589] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="7de5f6b50a372a2489937cdf42ce9de88194b70f9ab1f53b5785bf6d715f8f61" Sep 6 00:00:19.758783 containerd[1576]: 2025-09-06 00:00:19.700 [INFO][5589] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="7de5f6b50a372a2489937cdf42ce9de88194b70f9ab1f53b5785bf6d715f8f61" Sep 6 00:00:19.758783 containerd[1576]: 2025-09-06 00:00:19.731 [INFO][5596] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="7de5f6b50a372a2489937cdf42ce9de88194b70f9ab1f53b5785bf6d715f8f61" HandleID="k8s-pod-network.7de5f6b50a372a2489937cdf42ce9de88194b70f9ab1f53b5785bf6d715f8f61" Workload="ci--4081--3--5--n--8aba32846f-k8s-calico--kube--controllers--cbc856df4--cgnd8-eth0" Sep 6 00:00:19.758783 containerd[1576]: 2025-09-06 00:00:19.731 [INFO][5596] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 6 00:00:19.758783 containerd[1576]: 2025-09-06 00:00:19.731 [INFO][5596] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 6 00:00:19.758783 containerd[1576]: 2025-09-06 00:00:19.750 [WARNING][5596] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="7de5f6b50a372a2489937cdf42ce9de88194b70f9ab1f53b5785bf6d715f8f61" HandleID="k8s-pod-network.7de5f6b50a372a2489937cdf42ce9de88194b70f9ab1f53b5785bf6d715f8f61" Workload="ci--4081--3--5--n--8aba32846f-k8s-calico--kube--controllers--cbc856df4--cgnd8-eth0" Sep 6 00:00:19.758783 containerd[1576]: 2025-09-06 00:00:19.750 [INFO][5596] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="7de5f6b50a372a2489937cdf42ce9de88194b70f9ab1f53b5785bf6d715f8f61" HandleID="k8s-pod-network.7de5f6b50a372a2489937cdf42ce9de88194b70f9ab1f53b5785bf6d715f8f61" Workload="ci--4081--3--5--n--8aba32846f-k8s-calico--kube--controllers--cbc856df4--cgnd8-eth0" Sep 6 00:00:19.758783 containerd[1576]: 2025-09-06 00:00:19.753 [INFO][5596] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 6 00:00:19.758783 containerd[1576]: 2025-09-06 00:00:19.756 [INFO][5589] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="7de5f6b50a372a2489937cdf42ce9de88194b70f9ab1f53b5785bf6d715f8f61" Sep 6 00:00:19.758783 containerd[1576]: time="2025-09-06T00:00:19.758660485Z" level=info msg="TearDown network for sandbox \"7de5f6b50a372a2489937cdf42ce9de88194b70f9ab1f53b5785bf6d715f8f61\" successfully" Sep 6 00:00:19.764949 containerd[1576]: time="2025-09-06T00:00:19.764883621Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"7de5f6b50a372a2489937cdf42ce9de88194b70f9ab1f53b5785bf6d715f8f61\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Sep 6 00:00:19.765104 containerd[1576]: time="2025-09-06T00:00:19.764963301Z" level=info msg="RemovePodSandbox \"7de5f6b50a372a2489937cdf42ce9de88194b70f9ab1f53b5785bf6d715f8f61\" returns successfully" Sep 6 00:00:19.765964 containerd[1576]: time="2025-09-06T00:00:19.765604827Z" level=info msg="StopPodSandbox for \"dff34d639c23db62cbab5d00bb41314fd2337243dc0d51252d062136b8636847\"" Sep 6 00:00:19.864516 containerd[1576]: 2025-09-06 00:00:19.803 [WARNING][5611] cni-plugin/k8s.go 604: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="dff34d639c23db62cbab5d00bb41314fd2337243dc0d51252d062136b8636847" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--5--n--8aba32846f-k8s-coredns--7c65d6cfc9--8dbj7-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"e137dec7-0dff-4776-b673-c2d533bf21f9", ResourceVersion:"958", Generation:0, CreationTimestamp:time.Date(2025, time.September, 5, 23, 59, 25, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-5-n-8aba32846f", ContainerID:"681e4504d7d3f5533eee66e14d99af48d1fb110f75de8e10c58e0721cc0cbb1d", Pod:"coredns-7c65d6cfc9-8dbj7", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.74.3/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calif4039490265", MAC:"", 
Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 6 00:00:19.864516 containerd[1576]: 2025-09-06 00:00:19.804 [INFO][5611] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="dff34d639c23db62cbab5d00bb41314fd2337243dc0d51252d062136b8636847" Sep 6 00:00:19.864516 containerd[1576]: 2025-09-06 00:00:19.804 [INFO][5611] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="dff34d639c23db62cbab5d00bb41314fd2337243dc0d51252d062136b8636847" iface="eth0" netns="" Sep 6 00:00:19.864516 containerd[1576]: 2025-09-06 00:00:19.804 [INFO][5611] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="dff34d639c23db62cbab5d00bb41314fd2337243dc0d51252d062136b8636847" Sep 6 00:00:19.864516 containerd[1576]: 2025-09-06 00:00:19.804 [INFO][5611] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="dff34d639c23db62cbab5d00bb41314fd2337243dc0d51252d062136b8636847" Sep 6 00:00:19.864516 containerd[1576]: 2025-09-06 00:00:19.841 [INFO][5618] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="dff34d639c23db62cbab5d00bb41314fd2337243dc0d51252d062136b8636847" HandleID="k8s-pod-network.dff34d639c23db62cbab5d00bb41314fd2337243dc0d51252d062136b8636847" Workload="ci--4081--3--5--n--8aba32846f-k8s-coredns--7c65d6cfc9--8dbj7-eth0" Sep 6 00:00:19.864516 containerd[1576]: 2025-09-06 00:00:19.841 [INFO][5618] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. 
Sep 6 00:00:19.864516 containerd[1576]: 2025-09-06 00:00:19.841 [INFO][5618] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 6 00:00:19.864516 containerd[1576]: 2025-09-06 00:00:19.854 [WARNING][5618] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="dff34d639c23db62cbab5d00bb41314fd2337243dc0d51252d062136b8636847" HandleID="k8s-pod-network.dff34d639c23db62cbab5d00bb41314fd2337243dc0d51252d062136b8636847" Workload="ci--4081--3--5--n--8aba32846f-k8s-coredns--7c65d6cfc9--8dbj7-eth0" Sep 6 00:00:19.864516 containerd[1576]: 2025-09-06 00:00:19.854 [INFO][5618] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="dff34d639c23db62cbab5d00bb41314fd2337243dc0d51252d062136b8636847" HandleID="k8s-pod-network.dff34d639c23db62cbab5d00bb41314fd2337243dc0d51252d062136b8636847" Workload="ci--4081--3--5--n--8aba32846f-k8s-coredns--7c65d6cfc9--8dbj7-eth0" Sep 6 00:00:19.864516 containerd[1576]: 2025-09-06 00:00:19.857 [INFO][5618] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 6 00:00:19.864516 containerd[1576]: 2025-09-06 00:00:19.858 [INFO][5611] cni-plugin/k8s.go 653: Teardown processing complete. 
ContainerID="dff34d639c23db62cbab5d00bb41314fd2337243dc0d51252d062136b8636847" Sep 6 00:00:19.865687 containerd[1576]: time="2025-09-06T00:00:19.864558025Z" level=info msg="TearDown network for sandbox \"dff34d639c23db62cbab5d00bb41314fd2337243dc0d51252d062136b8636847\" successfully" Sep 6 00:00:19.865687 containerd[1576]: time="2025-09-06T00:00:19.864602946Z" level=info msg="StopPodSandbox for \"dff34d639c23db62cbab5d00bb41314fd2337243dc0d51252d062136b8636847\" returns successfully" Sep 6 00:00:19.865687 containerd[1576]: time="2025-09-06T00:00:19.865198591Z" level=info msg="RemovePodSandbox for \"dff34d639c23db62cbab5d00bb41314fd2337243dc0d51252d062136b8636847\"" Sep 6 00:00:19.865687 containerd[1576]: time="2025-09-06T00:00:19.865231351Z" level=info msg="Forcibly stopping sandbox \"dff34d639c23db62cbab5d00bb41314fd2337243dc0d51252d062136b8636847\"" Sep 6 00:00:19.959173 containerd[1576]: 2025-09-06 00:00:19.913 [WARNING][5638] cni-plugin/k8s.go 604: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="dff34d639c23db62cbab5d00bb41314fd2337243dc0d51252d062136b8636847" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--5--n--8aba32846f-k8s-coredns--7c65d6cfc9--8dbj7-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"e137dec7-0dff-4776-b673-c2d533bf21f9", ResourceVersion:"958", Generation:0, CreationTimestamp:time.Date(2025, time.September, 5, 23, 59, 25, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-5-n-8aba32846f", ContainerID:"681e4504d7d3f5533eee66e14d99af48d1fb110f75de8e10c58e0721cc0cbb1d", Pod:"coredns-7c65d6cfc9-8dbj7", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.74.3/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calif4039490265", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 6 00:00:19.959173 containerd[1576]: 
2025-09-06 00:00:19.914 [INFO][5638] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="dff34d639c23db62cbab5d00bb41314fd2337243dc0d51252d062136b8636847" Sep 6 00:00:19.959173 containerd[1576]: 2025-09-06 00:00:19.914 [INFO][5638] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="dff34d639c23db62cbab5d00bb41314fd2337243dc0d51252d062136b8636847" iface="eth0" netns="" Sep 6 00:00:19.959173 containerd[1576]: 2025-09-06 00:00:19.914 [INFO][5638] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="dff34d639c23db62cbab5d00bb41314fd2337243dc0d51252d062136b8636847" Sep 6 00:00:19.959173 containerd[1576]: 2025-09-06 00:00:19.914 [INFO][5638] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="dff34d639c23db62cbab5d00bb41314fd2337243dc0d51252d062136b8636847" Sep 6 00:00:19.959173 containerd[1576]: 2025-09-06 00:00:19.939 [INFO][5646] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="dff34d639c23db62cbab5d00bb41314fd2337243dc0d51252d062136b8636847" HandleID="k8s-pod-network.dff34d639c23db62cbab5d00bb41314fd2337243dc0d51252d062136b8636847" Workload="ci--4081--3--5--n--8aba32846f-k8s-coredns--7c65d6cfc9--8dbj7-eth0" Sep 6 00:00:19.959173 containerd[1576]: 2025-09-06 00:00:19.939 [INFO][5646] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 6 00:00:19.959173 containerd[1576]: 2025-09-06 00:00:19.939 [INFO][5646] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 6 00:00:19.959173 containerd[1576]: 2025-09-06 00:00:19.951 [WARNING][5646] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="dff34d639c23db62cbab5d00bb41314fd2337243dc0d51252d062136b8636847" HandleID="k8s-pod-network.dff34d639c23db62cbab5d00bb41314fd2337243dc0d51252d062136b8636847" Workload="ci--4081--3--5--n--8aba32846f-k8s-coredns--7c65d6cfc9--8dbj7-eth0" Sep 6 00:00:19.959173 containerd[1576]: 2025-09-06 00:00:19.951 [INFO][5646] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="dff34d639c23db62cbab5d00bb41314fd2337243dc0d51252d062136b8636847" HandleID="k8s-pod-network.dff34d639c23db62cbab5d00bb41314fd2337243dc0d51252d062136b8636847" Workload="ci--4081--3--5--n--8aba32846f-k8s-coredns--7c65d6cfc9--8dbj7-eth0" Sep 6 00:00:19.959173 containerd[1576]: 2025-09-06 00:00:19.954 [INFO][5646] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 6 00:00:19.959173 containerd[1576]: 2025-09-06 00:00:19.956 [INFO][5638] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="dff34d639c23db62cbab5d00bb41314fd2337243dc0d51252d062136b8636847" Sep 6 00:00:19.959173 containerd[1576]: time="2025-09-06T00:00:19.958468219Z" level=info msg="TearDown network for sandbox \"dff34d639c23db62cbab5d00bb41314fd2337243dc0d51252d062136b8636847\" successfully" Sep 6 00:00:19.972391 containerd[1576]: time="2025-09-06T00:00:19.972088820Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"dff34d639c23db62cbab5d00bb41314fd2337243dc0d51252d062136b8636847\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Sep 6 00:00:19.972856 containerd[1576]: time="2025-09-06T00:00:19.972412102Z" level=info msg="RemovePodSandbox \"dff34d639c23db62cbab5d00bb41314fd2337243dc0d51252d062136b8636847\" returns successfully" Sep 6 00:00:19.973230 containerd[1576]: time="2025-09-06T00:00:19.973188749Z" level=info msg="StopPodSandbox for \"1d8fb26fb70218a9684f0fa3cd08055d57f41f1fa4c6ef7ec10ecc78fb147481\"" Sep 6 00:00:20.061207 containerd[1576]: 2025-09-06 00:00:20.020 [WARNING][5660] cni-plugin/k8s.go 604: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="1d8fb26fb70218a9684f0fa3cd08055d57f41f1fa4c6ef7ec10ecc78fb147481" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--5--n--8aba32846f-k8s-calico--apiserver--76445ff9b6--n8cgl-eth0", GenerateName:"calico-apiserver-76445ff9b6-", Namespace:"calico-apiserver", SelfLink:"", UID:"8fd1cf87-0065-4b36-b757-559cbde7316b", ResourceVersion:"1031", Generation:0, CreationTimestamp:time.Date(2025, time.September, 5, 23, 59, 36, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"76445ff9b6", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-5-n-8aba32846f", ContainerID:"aec6df2d8f9fd860a7fd99b4ffd502270992208348b6984be61845d928b36bce", Pod:"calico-apiserver-76445ff9b6-n8cgl", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.74.2/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", 
IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calib992151005b", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 6 00:00:20.061207 containerd[1576]: 2025-09-06 00:00:20.021 [INFO][5660] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="1d8fb26fb70218a9684f0fa3cd08055d57f41f1fa4c6ef7ec10ecc78fb147481" Sep 6 00:00:20.061207 containerd[1576]: 2025-09-06 00:00:20.021 [INFO][5660] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="1d8fb26fb70218a9684f0fa3cd08055d57f41f1fa4c6ef7ec10ecc78fb147481" iface="eth0" netns="" Sep 6 00:00:20.061207 containerd[1576]: 2025-09-06 00:00:20.021 [INFO][5660] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="1d8fb26fb70218a9684f0fa3cd08055d57f41f1fa4c6ef7ec10ecc78fb147481" Sep 6 00:00:20.061207 containerd[1576]: 2025-09-06 00:00:20.021 [INFO][5660] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="1d8fb26fb70218a9684f0fa3cd08055d57f41f1fa4c6ef7ec10ecc78fb147481" Sep 6 00:00:20.061207 containerd[1576]: 2025-09-06 00:00:20.041 [INFO][5667] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="1d8fb26fb70218a9684f0fa3cd08055d57f41f1fa4c6ef7ec10ecc78fb147481" HandleID="k8s-pod-network.1d8fb26fb70218a9684f0fa3cd08055d57f41f1fa4c6ef7ec10ecc78fb147481" Workload="ci--4081--3--5--n--8aba32846f-k8s-calico--apiserver--76445ff9b6--n8cgl-eth0" Sep 6 00:00:20.061207 containerd[1576]: 2025-09-06 00:00:20.041 [INFO][5667] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 6 00:00:20.061207 containerd[1576]: 2025-09-06 00:00:20.041 [INFO][5667] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 6 00:00:20.061207 containerd[1576]: 2025-09-06 00:00:20.054 [WARNING][5667] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="1d8fb26fb70218a9684f0fa3cd08055d57f41f1fa4c6ef7ec10ecc78fb147481" HandleID="k8s-pod-network.1d8fb26fb70218a9684f0fa3cd08055d57f41f1fa4c6ef7ec10ecc78fb147481" Workload="ci--4081--3--5--n--8aba32846f-k8s-calico--apiserver--76445ff9b6--n8cgl-eth0" Sep 6 00:00:20.061207 containerd[1576]: 2025-09-06 00:00:20.054 [INFO][5667] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="1d8fb26fb70218a9684f0fa3cd08055d57f41f1fa4c6ef7ec10ecc78fb147481" HandleID="k8s-pod-network.1d8fb26fb70218a9684f0fa3cd08055d57f41f1fa4c6ef7ec10ecc78fb147481" Workload="ci--4081--3--5--n--8aba32846f-k8s-calico--apiserver--76445ff9b6--n8cgl-eth0" Sep 6 00:00:20.061207 containerd[1576]: 2025-09-06 00:00:20.057 [INFO][5667] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 6 00:00:20.061207 containerd[1576]: 2025-09-06 00:00:20.059 [INFO][5660] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="1d8fb26fb70218a9684f0fa3cd08055d57f41f1fa4c6ef7ec10ecc78fb147481" Sep 6 00:00:20.061768 containerd[1576]: time="2025-09-06T00:00:20.061184519Z" level=info msg="TearDown network for sandbox \"1d8fb26fb70218a9684f0fa3cd08055d57f41f1fa4c6ef7ec10ecc78fb147481\" successfully" Sep 6 00:00:20.061768 containerd[1576]: time="2025-09-06T00:00:20.061706284Z" level=info msg="StopPodSandbox for \"1d8fb26fb70218a9684f0fa3cd08055d57f41f1fa4c6ef7ec10ecc78fb147481\" returns successfully" Sep 6 00:00:20.063335 containerd[1576]: time="2025-09-06T00:00:20.063277217Z" level=info msg="RemovePodSandbox for \"1d8fb26fb70218a9684f0fa3cd08055d57f41f1fa4c6ef7ec10ecc78fb147481\"" Sep 6 00:00:20.063335 containerd[1576]: time="2025-09-06T00:00:20.063319658Z" level=info msg="Forcibly stopping sandbox \"1d8fb26fb70218a9684f0fa3cd08055d57f41f1fa4c6ef7ec10ecc78fb147481\"" Sep 6 00:00:20.139686 containerd[1576]: 2025-09-06 00:00:20.102 [WARNING][5681] cni-plugin/k8s.go 604: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="1d8fb26fb70218a9684f0fa3cd08055d57f41f1fa4c6ef7ec10ecc78fb147481" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4081--3--5--n--8aba32846f-k8s-calico--apiserver--76445ff9b6--n8cgl-eth0", GenerateName:"calico-apiserver-76445ff9b6-", Namespace:"calico-apiserver", SelfLink:"", UID:"8fd1cf87-0065-4b36-b757-559cbde7316b", ResourceVersion:"1031", Generation:0, CreationTimestamp:time.Date(2025, time.September, 5, 23, 59, 36, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"76445ff9b6", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4081-3-5-n-8aba32846f", ContainerID:"aec6df2d8f9fd860a7fd99b4ffd502270992208348b6984be61845d928b36bce", Pod:"calico-apiserver-76445ff9b6-n8cgl", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.74.2/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calib992151005b", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 6 00:00:20.139686 containerd[1576]: 2025-09-06 00:00:20.103 [INFO][5681] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="1d8fb26fb70218a9684f0fa3cd08055d57f41f1fa4c6ef7ec10ecc78fb147481" Sep 6 00:00:20.139686 containerd[1576]: 2025-09-06 00:00:20.103 [INFO][5681] cni-plugin/dataplane_linux.go 555: 
CleanUpNamespace called with no netns name, ignoring. ContainerID="1d8fb26fb70218a9684f0fa3cd08055d57f41f1fa4c6ef7ec10ecc78fb147481" iface="eth0" netns="" Sep 6 00:00:20.139686 containerd[1576]: 2025-09-06 00:00:20.103 [INFO][5681] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="1d8fb26fb70218a9684f0fa3cd08055d57f41f1fa4c6ef7ec10ecc78fb147481" Sep 6 00:00:20.139686 containerd[1576]: 2025-09-06 00:00:20.103 [INFO][5681] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="1d8fb26fb70218a9684f0fa3cd08055d57f41f1fa4c6ef7ec10ecc78fb147481" Sep 6 00:00:20.139686 containerd[1576]: 2025-09-06 00:00:20.124 [INFO][5688] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="1d8fb26fb70218a9684f0fa3cd08055d57f41f1fa4c6ef7ec10ecc78fb147481" HandleID="k8s-pod-network.1d8fb26fb70218a9684f0fa3cd08055d57f41f1fa4c6ef7ec10ecc78fb147481" Workload="ci--4081--3--5--n--8aba32846f-k8s-calico--apiserver--76445ff9b6--n8cgl-eth0" Sep 6 00:00:20.139686 containerd[1576]: 2025-09-06 00:00:20.124 [INFO][5688] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 6 00:00:20.139686 containerd[1576]: 2025-09-06 00:00:20.124 [INFO][5688] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 6 00:00:20.139686 containerd[1576]: 2025-09-06 00:00:20.134 [WARNING][5688] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="1d8fb26fb70218a9684f0fa3cd08055d57f41f1fa4c6ef7ec10ecc78fb147481" HandleID="k8s-pod-network.1d8fb26fb70218a9684f0fa3cd08055d57f41f1fa4c6ef7ec10ecc78fb147481" Workload="ci--4081--3--5--n--8aba32846f-k8s-calico--apiserver--76445ff9b6--n8cgl-eth0" Sep 6 00:00:20.139686 containerd[1576]: 2025-09-06 00:00:20.134 [INFO][5688] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="1d8fb26fb70218a9684f0fa3cd08055d57f41f1fa4c6ef7ec10ecc78fb147481" HandleID="k8s-pod-network.1d8fb26fb70218a9684f0fa3cd08055d57f41f1fa4c6ef7ec10ecc78fb147481" Workload="ci--4081--3--5--n--8aba32846f-k8s-calico--apiserver--76445ff9b6--n8cgl-eth0" Sep 6 00:00:20.139686 containerd[1576]: 2025-09-06 00:00:20.136 [INFO][5688] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 6 00:00:20.139686 containerd[1576]: 2025-09-06 00:00:20.138 [INFO][5681] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="1d8fb26fb70218a9684f0fa3cd08055d57f41f1fa4c6ef7ec10ecc78fb147481" Sep 6 00:00:20.140133 containerd[1576]: time="2025-09-06T00:00:20.139754402Z" level=info msg="TearDown network for sandbox \"1d8fb26fb70218a9684f0fa3cd08055d57f41f1fa4c6ef7ec10ecc78fb147481\" successfully" Sep 6 00:00:20.143976 containerd[1576]: time="2025-09-06T00:00:20.143918718Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"1d8fb26fb70218a9684f0fa3cd08055d57f41f1fa4c6ef7ec10ecc78fb147481\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Sep 6 00:00:20.144093 containerd[1576]: time="2025-09-06T00:00:20.144040879Z" level=info msg="RemovePodSandbox \"1d8fb26fb70218a9684f0fa3cd08055d57f41f1fa4c6ef7ec10ecc78fb147481\" returns successfully" Sep 6 00:00:22.037135 systemd[1]: Started sshd@8-128.140.56.156:22-103.99.206.83:59824.service - OpenSSH per-connection server daemon (103.99.206.83:59824). 
Sep 6 00:00:22.381543 sshd[5694]: Connection closed by 103.99.206.83 port 59824 [preauth] Sep 6 00:00:22.385451 systemd[1]: sshd@8-128.140.56.156:22-103.99.206.83:59824.service: Deactivated successfully. Sep 6 00:00:24.941299 containerd[1576]: time="2025-09-06T00:00:24.941231120Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 6 00:00:24.942686 containerd[1576]: time="2025-09-06T00:00:24.942593771Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.3: active requests=0, bytes read=8227489" Sep 6 00:00:24.943967 containerd[1576]: time="2025-09-06T00:00:24.943278296Z" level=info msg="ImageCreate event name:\"sha256:5e2b30128ce4b607acd97d3edef62ce1a90be0259903090a51c360adbe4a8f3b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 6 00:00:24.946201 containerd[1576]: time="2025-09-06T00:00:24.946163719Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:f22c88018d8b58c4ef0052f594b216a13bd6852166ac131a538c5ab2fba23bb2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 6 00:00:24.946880 containerd[1576]: time="2025-09-06T00:00:24.946841205Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.30.3\" with image id \"sha256:5e2b30128ce4b607acd97d3edef62ce1a90be0259903090a51c360adbe4a8f3b\", repo tag \"ghcr.io/flatcar/calico/csi:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:f22c88018d8b58c4ef0052f594b216a13bd6852166ac131a538c5ab2fba23bb2\", size \"9596730\" in 13.113539383s" Sep 6 00:00:24.946931 containerd[1576]: time="2025-09-06T00:00:24.946884085Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.3\" returns image reference \"sha256:5e2b30128ce4b607acd97d3edef62ce1a90be0259903090a51c360adbe4a8f3b\"" Sep 6 00:00:24.948991 containerd[1576]: time="2025-09-06T00:00:24.948937542Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\"" Sep 6 00:00:24.951463 
containerd[1576]: time="2025-09-06T00:00:24.951412441Z" level=info msg="CreateContainer within sandbox \"59eeb191e9847ded5f0b0c50561a32db880df7b37af27b8bfebc65eb26a73e70\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}"
Sep 6 00:00:24.978696 containerd[1576]: time="2025-09-06T00:00:24.978522459Z" level=info msg="CreateContainer within sandbox \"59eeb191e9847ded5f0b0c50561a32db880df7b37af27b8bfebc65eb26a73e70\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"d85201c1f1532c8177425e6827854139d0309d6145797551fab406f12af6d943\""
Sep 6 00:00:24.979387 containerd[1576]: time="2025-09-06T00:00:24.979354385Z" level=info msg="StartContainer for \"d85201c1f1532c8177425e6827854139d0309d6145797551fab406f12af6d943\""
Sep 6 00:00:25.057153 containerd[1576]: time="2025-09-06T00:00:25.056982798Z" level=info msg="StartContainer for \"d85201c1f1532c8177425e6827854139d0309d6145797551fab406f12af6d943\" returns successfully"
Sep 6 00:00:28.013346 containerd[1576]: time="2025-09-06T00:00:28.013234194Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 6 00:00:28.018951 containerd[1576]: time="2025-09-06T00:00:28.016330617Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.3: active requests=0, bytes read=77"
Sep 6 00:00:28.022092 containerd[1576]: time="2025-09-06T00:00:28.021925738Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" with image id \"sha256:632fbde00b1918016ac07458e79cc438ccda83cb762bfd5fc50a26721abced08\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\", size \"45900064\" in 3.071901588s"
Sep 6 00:00:28.022092 containerd[1576]: time="2025-09-06T00:00:28.021989299Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" returns image reference \"sha256:632fbde00b1918016ac07458e79cc438ccda83cb762bfd5fc50a26721abced08\""
Sep 6 00:00:28.029218 containerd[1576]: time="2025-09-06T00:00:28.029161232Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.3\""
Sep 6 00:00:28.029712 containerd[1576]: time="2025-09-06T00:00:28.029680876Z" level=info msg="CreateContainer within sandbox \"b3d73fc337479040a4a30615e586489c3394daa44cd2ca6d876324676a90e1a4\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}"
Sep 6 00:00:28.050679 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount933819313.mount: Deactivated successfully.
Sep 6 00:00:28.058259 containerd[1576]: time="2025-09-06T00:00:28.058209927Z" level=info msg="CreateContainer within sandbox \"b3d73fc337479040a4a30615e586489c3394daa44cd2ca6d876324676a90e1a4\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"f0703df8aa2039e382b8c3a286c8e98ff24686f39f07559636fc8f433b43dbba\""
Sep 6 00:00:28.069702 containerd[1576]: time="2025-09-06T00:00:28.064857296Z" level=info msg="StartContainer for \"f0703df8aa2039e382b8c3a286c8e98ff24686f39f07559636fc8f433b43dbba\""
Sep 6 00:00:28.261157 containerd[1576]: time="2025-09-06T00:00:28.260995390Z" level=info msg="StartContainer for \"f0703df8aa2039e382b8c3a286c8e98ff24686f39f07559636fc8f433b43dbba\" returns successfully"
Sep 6 00:00:30.581314 kubelet[2774]: I0906 00:00:30.581243 2774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-76445ff9b6-kh72c" podStartSLOduration=34.075210148 podStartE2EDuration="54.581186405s" podCreationTimestamp="2025-09-05 23:59:36 +0000 UTC" firstStartedPulling="2025-09-06 00:00:07.519925431 +0000 UTC m=+49.083549662" lastFinishedPulling="2025-09-06 00:00:28.025901648 +0000 UTC m=+69.589525919" observedRunningTime="2025-09-06 00:00:29.098372541 +0000 UTC m=+70.661996772" watchObservedRunningTime="2025-09-06 00:00:30.581186405 +0000 UTC m=+72.144810636"
Sep 6 00:00:37.009190 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4017452229.mount: Deactivated successfully.
Sep 6 00:00:37.775679 containerd[1576]: time="2025-09-06T00:00:37.774707193Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 6 00:00:37.778926 containerd[1576]: time="2025-09-06T00:00:37.778874659Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.3: active requests=0, bytes read=61845332"
Sep 6 00:00:37.783084 containerd[1576]: time="2025-09-06T00:00:37.782864684Z" level=info msg="ImageCreate event name:\"sha256:14088376331a0622b7f6a2fbc2f2932806a6eafdd7b602f6139d3b985bf1e685\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 6 00:00:37.793680 containerd[1576]: time="2025-09-06T00:00:37.791279377Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane@sha256:46297703ab3739331a00a58f0d6a5498c8d3b6523ad947eed68592ee0f3e79f0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 6 00:00:37.795973 containerd[1576]: time="2025-09-06T00:00:37.795912527Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/goldmane:v3.30.3\" with image id \"sha256:14088376331a0622b7f6a2fbc2f2932806a6eafdd7b602f6139d3b985bf1e685\", repo tag \"ghcr.io/flatcar/calico/goldmane:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/goldmane@sha256:46297703ab3739331a00a58f0d6a5498c8d3b6523ad947eed68592ee0f3e79f0\", size \"61845178\" in 9.766695415s"
Sep 6 00:00:37.796121 containerd[1576]: time="2025-09-06T00:00:37.795978567Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.3\" returns image reference \"sha256:14088376331a0622b7f6a2fbc2f2932806a6eafdd7b602f6139d3b985bf1e685\""
Sep 6 00:00:37.798657 containerd[1576]: time="2025-09-06T00:00:37.797926059Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\""
Sep 6 00:00:37.801157 containerd[1576]: time="2025-09-06T00:00:37.801108239Z" level=info msg="CreateContainer within sandbox \"28315c1659efcb8886450b61571b272e89bb977f1548bffe403a305b272287ea\" for container &ContainerMetadata{Name:goldmane,Attempt:0,}"
Sep 6 00:00:37.836099 containerd[1576]: time="2025-09-06T00:00:37.835928059Z" level=info msg="CreateContainer within sandbox \"28315c1659efcb8886450b61571b272e89bb977f1548bffe403a305b272287ea\" for &ContainerMetadata{Name:goldmane,Attempt:0,} returns container id \"66ebe0ca124ef0164fceadd60242fca2c1e976a74e7b1c16fc5e37518743b754\""
Sep 6 00:00:37.838644 containerd[1576]: time="2025-09-06T00:00:37.838043473Z" level=info msg="StartContainer for \"66ebe0ca124ef0164fceadd60242fca2c1e976a74e7b1c16fc5e37518743b754\""
Sep 6 00:00:38.062185 containerd[1576]: time="2025-09-06T00:00:38.061781558Z" level=info msg="StartContainer for \"66ebe0ca124ef0164fceadd60242fca2c1e976a74e7b1c16fc5e37518743b754\" returns successfully"
Sep 6 00:00:45.794196 systemd[1]: run-containerd-runc-k8s.io-66ebe0ca124ef0164fceadd60242fca2c1e976a74e7b1c16fc5e37518743b754-runc.qlKLu2.mount: Deactivated successfully.
Sep 6 00:00:46.684942 containerd[1576]: time="2025-09-06T00:00:46.684878112Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 6 00:00:46.690064 containerd[1576]: time="2025-09-06T00:00:46.689946020Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.3: active requests=0, bytes read=48134957"
Sep 6 00:00:46.692459 containerd[1576]: time="2025-09-06T00:00:46.692391873Z" level=info msg="ImageCreate event name:\"sha256:34117caf92350e1565610f2254377d7455b11e36666b5ce11b4a13670720432a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 6 00:00:46.699203 containerd[1576]: time="2025-09-06T00:00:46.697587942Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:27c4187717f08f0a5727019d8beb7597665eb47e69eaa1d7d091a7e28913e577\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 6 00:00:46.699641 containerd[1576]: time="2025-09-06T00:00:46.699392791Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" with image id \"sha256:34117caf92350e1565610f2254377d7455b11e36666b5ce11b4a13670720432a\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:27c4187717f08f0a5727019d8beb7597665eb47e69eaa1d7d091a7e28913e577\", size \"49504166\" in 8.901417131s"
Sep 6 00:00:46.699781 containerd[1576]: time="2025-09-06T00:00:46.699755073Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" returns image reference \"sha256:34117caf92350e1565610f2254377d7455b11e36666b5ce11b4a13670720432a\""
Sep 6 00:00:46.703881 containerd[1576]: time="2025-09-06T00:00:46.703831496Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\""
Sep 6 00:00:46.720903 containerd[1576]: time="2025-09-06T00:00:46.718138334Z" level=info msg="CreateContainer within sandbox \"bbc184507886a72b1c880a2a42d73250baf8226c68e72bd198bfa30293acdff5\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}"
Sep 6 00:00:46.750902 containerd[1576]: time="2025-09-06T00:00:46.750670513Z" level=info msg="CreateContainer within sandbox \"bbc184507886a72b1c880a2a42d73250baf8226c68e72bd198bfa30293acdff5\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"805cf5395a5de155be3dfa0ef5fc84b67cca6c96d54f6dbafaed6031ff849593\""
Sep 6 00:00:46.751769 containerd[1576]: time="2025-09-06T00:00:46.751725238Z" level=info msg="StartContainer for \"805cf5395a5de155be3dfa0ef5fc84b67cca6c96d54f6dbafaed6031ff849593\""
Sep 6 00:00:46.859107 containerd[1576]: time="2025-09-06T00:00:46.859034786Z" level=info msg="StartContainer for \"805cf5395a5de155be3dfa0ef5fc84b67cca6c96d54f6dbafaed6031ff849593\" returns successfully"
Sep 6 00:00:46.956690 kubelet[2774]: I0906 00:00:46.954593 2774 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Sep 6 00:00:46.995456 kubelet[2774]: I0906 00:00:46.994774 2774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/goldmane-7988f88666-2kb4z" podStartSLOduration=35.883007146 podStartE2EDuration="1m5.99475653s" podCreationTimestamp="2025-09-05 23:59:41 +0000 UTC" firstStartedPulling="2025-09-06 00:00:07.685155669 +0000 UTC m=+49.248779900" lastFinishedPulling="2025-09-06 00:00:37.796905093 +0000 UTC m=+79.360529284" observedRunningTime="2025-09-06 00:00:38.132379716 +0000 UTC m=+79.696003947" watchObservedRunningTime="2025-09-06 00:00:46.99475653 +0000 UTC m=+88.558380721"
Sep 6 00:00:47.172965 kubelet[2774]: I0906 00:00:47.172780 2774 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-cbc856df4-cgnd8" podStartSLOduration=26.158366668 podStartE2EDuration="1m5.172757493s" podCreationTimestamp="2025-09-05 23:59:42 +0000 UTC" firstStartedPulling="2025-09-06 00:00:07.688013463 +0000 UTC m=+49.251637694" lastFinishedPulling="2025-09-06 00:00:46.702404288 +0000 UTC m=+88.266028519" observedRunningTime="2025-09-06 00:00:47.170396 +0000 UTC m=+88.734020231" watchObservedRunningTime="2025-09-06 00:00:47.172757493 +0000 UTC m=+88.736381684"
Sep 6 00:00:53.862440 containerd[1576]: time="2025-09-06T00:00:53.861611820Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 6 00:00:53.865796 containerd[1576]: time="2025-09-06T00:00:53.865746480Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3: active requests=0, bytes read=13761208"
Sep 6 00:00:53.867271 containerd[1576]: time="2025-09-06T00:00:53.867229648Z" level=info msg="ImageCreate event name:\"sha256:a319b5bdc1001e98875b68e2943279adb74bcb19d09f1db857bc27959a078a65\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 6 00:00:53.873721 containerd[1576]: time="2025-09-06T00:00:53.873675240Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:731ab232ca708102ab332340b1274d5cd656aa896ecc5368ee95850b811df86f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 6 00:00:53.875654 containerd[1576]: time="2025-09-06T00:00:53.874238483Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" with image id \"sha256:a319b5bdc1001e98875b68e2943279adb74bcb19d09f1db857bc27959a078a65\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:731ab232ca708102ab332340b1274d5cd656aa896ecc5368ee95850b811df86f\", size \"15130401\" in 7.170354747s"
Sep 6 00:00:53.875654 containerd[1576]: time="2025-09-06T00:00:53.874278443Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" returns image reference \"sha256:a319b5bdc1001e98875b68e2943279adb74bcb19d09f1db857bc27959a078a65\""
Sep 6 00:00:53.883280 containerd[1576]: time="2025-09-06T00:00:53.882959806Z" level=info msg="CreateContainer within sandbox \"59eeb191e9847ded5f0b0c50561a32db880df7b37af27b8bfebc65eb26a73e70\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}"
Sep 6 00:00:53.920490 containerd[1576]: time="2025-09-06T00:00:53.917125896Z" level=info msg="CreateContainer within sandbox \"59eeb191e9847ded5f0b0c50561a32db880df7b37af27b8bfebc65eb26a73e70\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"04b1e239134efb035b072bfed023f026d4d1b99fcc0ed5e3e6283d19cf94e8c8\""
Sep 6 00:00:53.920490 containerd[1576]: time="2025-09-06T00:00:53.919821230Z" level=info msg="StartContainer for \"04b1e239134efb035b072bfed023f026d4d1b99fcc0ed5e3e6283d19cf94e8c8\""
Sep 6 00:00:54.063346 containerd[1576]: time="2025-09-06T00:00:54.062985619Z" level=info msg="StartContainer for \"04b1e239134efb035b072bfed023f026d4d1b99fcc0ed5e3e6283d19cf94e8c8\" returns successfully"
Sep 6 00:00:54.775512 kubelet[2774]: I0906 00:00:54.775447 2774 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0
Sep 6 00:00:54.776577 kubelet[2774]: I0906 00:00:54.775552 2774 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock
Sep 6 00:01:21.347670 systemd[1]: run-containerd-runc-k8s.io-66ebe0ca124ef0164fceadd60242fca2c1e976a74e7b1c16fc5e37518743b754-runc.Zkg8W8.mount: Deactivated successfully.
Sep 6 00:01:53.604840 systemd[1]: run-containerd-runc-k8s.io-937e17db81a6fff5243530016606a68a723f2d72c16524a924db17eb54345458-runc.ICI0Wj.mount: Deactivated successfully.
Sep 6 00:02:10.075890 systemd[1]: Started sshd@9-128.140.56.156:22-139.178.68.195:34154.service - OpenSSH per-connection server daemon (139.178.68.195:34154).
Sep 6 00:02:11.087821 sshd[6322]: Accepted publickey for core from 139.178.68.195 port 34154 ssh2: RSA SHA256:+hHHVborSkWo7/0A1ohHVzFaxSLc/9IisClzOe0fYVI
Sep 6 00:02:11.091714 sshd[6322]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 6 00:02:11.099773 systemd-logind[1551]: New session 8 of user core.
Sep 6 00:02:11.105178 systemd[1]: Started session-8.scope - Session 8 of User core.
Sep 6 00:02:11.879953 sshd[6322]: pam_unix(sshd:session): session closed for user core
Sep 6 00:02:11.885979 systemd[1]: sshd@9-128.140.56.156:22-139.178.68.195:34154.service: Deactivated successfully.
Sep 6 00:02:11.889316 systemd[1]: session-8.scope: Deactivated successfully.
Sep 6 00:02:11.890423 systemd-logind[1551]: Session 8 logged out. Waiting for processes to exit.
Sep 6 00:02:11.892573 systemd-logind[1551]: Removed session 8.
Sep 6 00:02:17.051081 systemd[1]: Started sshd@10-128.140.56.156:22-139.178.68.195:47784.service - OpenSSH per-connection server daemon (139.178.68.195:47784).
Sep 6 00:02:18.050407 sshd[6339]: Accepted publickey for core from 139.178.68.195 port 47784 ssh2: RSA SHA256:+hHHVborSkWo7/0A1ohHVzFaxSLc/9IisClzOe0fYVI
Sep 6 00:02:18.053813 sshd[6339]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 6 00:02:18.060963 systemd-logind[1551]: New session 9 of user core.
Sep 6 00:02:18.064969 systemd[1]: Started session-9.scope - Session 9 of User core.
Sep 6 00:02:18.843898 sshd[6339]: pam_unix(sshd:session): session closed for user core
Sep 6 00:02:18.849933 systemd[1]: sshd@10-128.140.56.156:22-139.178.68.195:47784.service: Deactivated successfully.
Sep 6 00:02:18.850201 systemd-logind[1551]: Session 9 logged out. Waiting for processes to exit.
Sep 6 00:02:18.855586 systemd[1]: session-9.scope: Deactivated successfully.
Sep 6 00:02:18.856999 systemd-logind[1551]: Removed session 9.
Sep 6 00:02:24.013126 systemd[1]: Started sshd@11-128.140.56.156:22-139.178.68.195:44494.service - OpenSSH per-connection server daemon (139.178.68.195:44494).
Sep 6 00:02:25.011299 sshd[6419]: Accepted publickey for core from 139.178.68.195 port 44494 ssh2: RSA SHA256:+hHHVborSkWo7/0A1ohHVzFaxSLc/9IisClzOe0fYVI
Sep 6 00:02:25.013288 sshd[6419]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 6 00:02:25.019400 systemd-logind[1551]: New session 10 of user core.
Sep 6 00:02:25.023947 systemd[1]: Started session-10.scope - Session 10 of User core.
Sep 6 00:02:25.775256 sshd[6419]: pam_unix(sshd:session): session closed for user core
Sep 6 00:02:25.779473 systemd[1]: sshd@11-128.140.56.156:22-139.178.68.195:44494.service: Deactivated successfully.
Sep 6 00:02:25.785944 systemd-logind[1551]: Session 10 logged out. Waiting for processes to exit.
Sep 6 00:02:25.787401 systemd[1]: session-10.scope: Deactivated successfully.
Sep 6 00:02:25.789142 systemd-logind[1551]: Removed session 10.
Sep 6 00:02:25.944983 systemd[1]: Started sshd@12-128.140.56.156:22-139.178.68.195:44508.service - OpenSSH per-connection server daemon (139.178.68.195:44508).
Sep 6 00:02:26.937470 sshd[6436]: Accepted publickey for core from 139.178.68.195 port 44508 ssh2: RSA SHA256:+hHHVborSkWo7/0A1ohHVzFaxSLc/9IisClzOe0fYVI
Sep 6 00:02:26.940586 sshd[6436]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 6 00:02:26.948042 systemd-logind[1551]: New session 11 of user core.
Sep 6 00:02:26.952933 systemd[1]: Started session-11.scope - Session 11 of User core.
Sep 6 00:02:27.769821 sshd[6436]: pam_unix(sshd:session): session closed for user core
Sep 6 00:02:27.777134 systemd[1]: sshd@12-128.140.56.156:22-139.178.68.195:44508.service: Deactivated successfully.
Sep 6 00:02:27.780817 systemd-logind[1551]: Session 11 logged out. Waiting for processes to exit.
Sep 6 00:02:27.781533 systemd[1]: session-11.scope: Deactivated successfully.
Sep 6 00:02:27.784611 systemd-logind[1551]: Removed session 11.
Sep 6 00:02:27.938081 systemd[1]: Started sshd@13-128.140.56.156:22-139.178.68.195:44510.service - OpenSSH per-connection server daemon (139.178.68.195:44510).
Sep 6 00:02:28.941381 sshd[6448]: Accepted publickey for core from 139.178.68.195 port 44510 ssh2: RSA SHA256:+hHHVborSkWo7/0A1ohHVzFaxSLc/9IisClzOe0fYVI
Sep 6 00:02:28.944116 sshd[6448]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 6 00:02:28.950171 systemd-logind[1551]: New session 12 of user core.
Sep 6 00:02:28.958121 systemd[1]: Started session-12.scope - Session 12 of User core.
Sep 6 00:02:29.705358 sshd[6448]: pam_unix(sshd:session): session closed for user core
Sep 6 00:02:29.710236 systemd[1]: sshd@13-128.140.56.156:22-139.178.68.195:44510.service: Deactivated successfully.
Sep 6 00:02:29.714462 systemd[1]: session-12.scope: Deactivated successfully.
Sep 6 00:02:29.714676 systemd-logind[1551]: Session 12 logged out. Waiting for processes to exit.
Sep 6 00:02:29.717730 systemd-logind[1551]: Removed session 12.
Sep 6 00:02:33.673016 systemd[1]: Started sshd@14-128.140.56.156:22-117.72.32.238:41170.service - OpenSSH per-connection server daemon (117.72.32.238:41170).
Sep 6 00:02:34.873025 systemd[1]: Started sshd@15-128.140.56.156:22-139.178.68.195:36516.service - OpenSSH per-connection server daemon (139.178.68.195:36516).
Sep 6 00:02:35.866460 sshd[6463]: Accepted publickey for core from 139.178.68.195 port 36516 ssh2: RSA SHA256:+hHHVborSkWo7/0A1ohHVzFaxSLc/9IisClzOe0fYVI
Sep 6 00:02:35.868516 sshd[6463]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 6 00:02:35.874043 systemd-logind[1551]: New session 13 of user core.
Sep 6 00:02:35.879285 systemd[1]: Started session-13.scope - Session 13 of User core.
Sep 6 00:02:36.631968 sshd[6463]: pam_unix(sshd:session): session closed for user core
Sep 6 00:02:36.637173 systemd[1]: sshd@15-128.140.56.156:22-139.178.68.195:36516.service: Deactivated successfully.
Sep 6 00:02:36.644326 systemd[1]: session-13.scope: Deactivated successfully.
Sep 6 00:02:36.645602 systemd-logind[1551]: Session 13 logged out. Waiting for processes to exit.
Sep 6 00:02:36.647456 systemd-logind[1551]: Removed session 13.
Sep 6 00:02:36.801326 systemd[1]: Started sshd@16-128.140.56.156:22-139.178.68.195:36526.service - OpenSSH per-connection server daemon (139.178.68.195:36526).
Sep 6 00:02:37.799087 sshd[6477]: Accepted publickey for core from 139.178.68.195 port 36526 ssh2: RSA SHA256:+hHHVborSkWo7/0A1ohHVzFaxSLc/9IisClzOe0fYVI
Sep 6 00:02:37.800608 sshd[6477]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 6 00:02:37.806111 systemd-logind[1551]: New session 14 of user core.
Sep 6 00:02:37.812138 systemd[1]: Started session-14.scope - Session 14 of User core.
Sep 6 00:02:38.730394 sshd[6477]: pam_unix(sshd:session): session closed for user core
Sep 6 00:02:38.742057 systemd[1]: sshd@16-128.140.56.156:22-139.178.68.195:36526.service: Deactivated successfully.
Sep 6 00:02:38.751858 systemd[1]: session-14.scope: Deactivated successfully.
Sep 6 00:02:38.753138 systemd-logind[1551]: Session 14 logged out. Waiting for processes to exit.
Sep 6 00:02:38.754570 systemd-logind[1551]: Removed session 14.
Sep 6 00:02:38.901818 systemd[1]: Started sshd@17-128.140.56.156:22-139.178.68.195:36536.service - OpenSSH per-connection server daemon (139.178.68.195:36536).
Sep 6 00:02:39.906864 sshd[6489]: Accepted publickey for core from 139.178.68.195 port 36536 ssh2: RSA SHA256:+hHHVborSkWo7/0A1ohHVzFaxSLc/9IisClzOe0fYVI
Sep 6 00:02:39.910303 sshd[6489]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 6 00:02:39.916122 systemd-logind[1551]: New session 15 of user core.
Sep 6 00:02:39.923479 systemd[1]: Started session-15.scope - Session 15 of User core.
Sep 6 00:02:42.962403 sshd[6489]: pam_unix(sshd:session): session closed for user core
Sep 6 00:02:42.972856 systemd-logind[1551]: Session 15 logged out. Waiting for processes to exit.
Sep 6 00:02:42.974046 systemd[1]: sshd@17-128.140.56.156:22-139.178.68.195:36536.service: Deactivated successfully.
Sep 6 00:02:42.984103 systemd[1]: session-15.scope: Deactivated successfully.
Sep 6 00:02:42.991854 systemd-logind[1551]: Removed session 15.
Sep 6 00:02:43.127087 systemd[1]: Started sshd@18-128.140.56.156:22-139.178.68.195:51402.service - OpenSSH per-connection server daemon (139.178.68.195:51402).
Sep 6 00:02:44.127035 sshd[6516]: Accepted publickey for core from 139.178.68.195 port 51402 ssh2: RSA SHA256:+hHHVborSkWo7/0A1ohHVzFaxSLc/9IisClzOe0fYVI
Sep 6 00:02:44.131616 sshd[6516]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 6 00:02:44.143077 systemd-logind[1551]: New session 16 of user core.
Sep 6 00:02:44.144946 systemd[1]: Started session-16.scope - Session 16 of User core.
Sep 6 00:02:45.071612 sshd[6516]: pam_unix(sshd:session): session closed for user core
Sep 6 00:02:45.078057 systemd-logind[1551]: Session 16 logged out. Waiting for processes to exit.
Sep 6 00:02:45.079641 systemd[1]: sshd@18-128.140.56.156:22-139.178.68.195:51402.service: Deactivated successfully.
Sep 6 00:02:45.086801 systemd[1]: session-16.scope: Deactivated successfully.
Sep 6 00:02:45.092733 systemd-logind[1551]: Removed session 16.
Sep 6 00:02:45.262269 systemd[1]: Started sshd@19-128.140.56.156:22-139.178.68.195:51418.service - OpenSSH per-connection server daemon (139.178.68.195:51418).
Sep 6 00:02:46.320222 sshd[6549]: Accepted publickey for core from 139.178.68.195 port 51418 ssh2: RSA SHA256:+hHHVborSkWo7/0A1ohHVzFaxSLc/9IisClzOe0fYVI
Sep 6 00:02:46.323852 sshd[6549]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 6 00:02:46.331174 systemd-logind[1551]: New session 17 of user core.
Sep 6 00:02:46.338032 systemd[1]: Started session-17.scope - Session 17 of User core.
Sep 6 00:02:47.121671 sshd[6549]: pam_unix(sshd:session): session closed for user core
Sep 6 00:02:47.127729 systemd-logind[1551]: Session 17 logged out. Waiting for processes to exit.
Sep 6 00:02:47.128403 systemd[1]: sshd@19-128.140.56.156:22-139.178.68.195:51418.service: Deactivated successfully.
Sep 6 00:02:47.132843 systemd[1]: session-17.scope: Deactivated successfully.
Sep 6 00:02:47.134031 systemd-logind[1551]: Removed session 17.
Sep 6 00:02:52.285454 systemd[1]: Started sshd@20-128.140.56.156:22-139.178.68.195:60256.service - OpenSSH per-connection server daemon (139.178.68.195:60256).
Sep 6 00:02:53.283519 sshd[6624]: Accepted publickey for core from 139.178.68.195 port 60256 ssh2: RSA SHA256:+hHHVborSkWo7/0A1ohHVzFaxSLc/9IisClzOe0fYVI
Sep 6 00:02:53.285954 sshd[6624]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 6 00:02:53.292320 systemd-logind[1551]: New session 18 of user core.
Sep 6 00:02:53.299565 systemd[1]: Started session-18.scope - Session 18 of User core.
Sep 6 00:02:54.050611 sshd[6624]: pam_unix(sshd:session): session closed for user core
Sep 6 00:02:54.056608 systemd[1]: sshd@20-128.140.56.156:22-139.178.68.195:60256.service: Deactivated successfully.
Sep 6 00:02:54.061796 systemd[1]: session-18.scope: Deactivated successfully.
Sep 6 00:02:54.065290 systemd-logind[1551]: Session 18 logged out. Waiting for processes to exit.
Sep 6 00:02:54.066345 systemd-logind[1551]: Removed session 18.
Sep 6 00:02:59.218986 systemd[1]: Started sshd@21-128.140.56.156:22-139.178.68.195:60266.service - OpenSSH per-connection server daemon (139.178.68.195:60266).
Sep 6 00:03:00.208706 sshd[6663]: Accepted publickey for core from 139.178.68.195 port 60266 ssh2: RSA SHA256:+hHHVborSkWo7/0A1ohHVzFaxSLc/9IisClzOe0fYVI
Sep 6 00:03:00.210874 sshd[6663]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 6 00:03:00.218080 systemd-logind[1551]: New session 19 of user core.
Sep 6 00:03:00.225176 systemd[1]: Started session-19.scope - Session 19 of User core.
Sep 6 00:03:00.971504 sshd[6663]: pam_unix(sshd:session): session closed for user core
Sep 6 00:03:00.977574 systemd[1]: sshd@21-128.140.56.156:22-139.178.68.195:60266.service: Deactivated successfully.
Sep 6 00:03:00.978446 systemd-logind[1551]: Session 19 logged out. Waiting for processes to exit.
Sep 6 00:03:00.983901 systemd[1]: session-19.scope: Deactivated successfully.
Sep 6 00:03:00.986029 systemd-logind[1551]: Removed session 19.
Sep 6 00:03:32.168397 containerd[1576]: time="2025-09-06T00:03:32.168318519Z" level=info msg="shim disconnected" id=3b8b67645073c617c28d809da380ce1c94e68b38c3dcbfee485e83b86a155eb8 namespace=k8s.io
Sep 6 00:03:32.169175 containerd[1576]: time="2025-09-06T00:03:32.168377519Z" level=warning msg="cleaning up after shim disconnected" id=3b8b67645073c617c28d809da380ce1c94e68b38c3dcbfee485e83b86a155eb8 namespace=k8s.io
Sep 6 00:03:32.169175 containerd[1576]: time="2025-09-06T00:03:32.168701520Z" level=info msg="cleaning up dead shim" namespace=k8s.io
Sep 6 00:03:32.171429 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-3b8b67645073c617c28d809da380ce1c94e68b38c3dcbfee485e83b86a155eb8-rootfs.mount: Deactivated successfully.
Sep 6 00:03:32.338464 kubelet[2774]: E0906 00:03:32.338401 2774 controller.go:195] "Failed to update lease" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.0.3:38538->10.0.0.2:2379: read: connection timed out"
Sep 6 00:03:32.360686 containerd[1576]: time="2025-09-06T00:03:32.359910015Z" level=info msg="shim disconnected" id=3b3e214261afc585e291c28a6e47b4197bac762af6a953ab69a61bfccb52be32 namespace=k8s.io
Sep 6 00:03:32.360686 containerd[1576]: time="2025-09-06T00:03:32.360120095Z" level=warning msg="cleaning up after shim disconnected" id=3b3e214261afc585e291c28a6e47b4197bac762af6a953ab69a61bfccb52be32 namespace=k8s.io
Sep 6 00:03:32.360686 containerd[1576]: time="2025-09-06T00:03:32.360143215Z" level=info msg="cleaning up dead shim" namespace=k8s.io
Sep 6 00:03:32.362090 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-3b3e214261afc585e291c28a6e47b4197bac762af6a953ab69a61bfccb52be32-rootfs.mount: Deactivated successfully.
Sep 6 00:03:32.370724 containerd[1576]: time="2025-09-06T00:03:32.369855224Z" level=info msg="shim disconnected" id=d18077f8a74891589a28489baecc1cf79dfcb67fb977737dc56149cf618d0d1b namespace=k8s.io
Sep 6 00:03:32.370724 containerd[1576]: time="2025-09-06T00:03:32.369912824Z" level=warning msg="cleaning up after shim disconnected" id=d18077f8a74891589a28489baecc1cf79dfcb67fb977737dc56149cf618d0d1b namespace=k8s.io
Sep 6 00:03:32.370724 containerd[1576]: time="2025-09-06T00:03:32.369920944Z" level=info msg="cleaning up dead shim" namespace=k8s.io
Sep 6 00:03:32.373059 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-d18077f8a74891589a28489baecc1cf79dfcb67fb977737dc56149cf618d0d1b-rootfs.mount: Deactivated successfully.
Sep 6 00:03:32.390988 containerd[1576]: time="2025-09-06T00:03:32.390332403Z" level=warning msg="cleanup warnings time=\"2025-09-06T00:03:32Z\" level=warning msg=\"failed to remove runc container\" error=\"runc did not terminate successfully: exit status 255: \" runtime=io.containerd.runc.v2\n" namespace=k8s.io
Sep 6 00:03:32.667911 kubelet[2774]: I0906 00:03:32.667723 2774 scope.go:117] "RemoveContainer" containerID="3b3e214261afc585e291c28a6e47b4197bac762af6a953ab69a61bfccb52be32"
Sep 6 00:03:32.671821 containerd[1576]: time="2025-09-06T00:03:32.671358821Z" level=info msg="CreateContainer within sandbox \"6b9a5da773a547cfd9fc54155abdaf3a6e73c6f00abf03c59222b0ca5b3b77dd\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:1,}"
Sep 6 00:03:32.672806 kubelet[2774]: I0906 00:03:32.672445 2774 scope.go:117] "RemoveContainer" containerID="3b8b67645073c617c28d809da380ce1c94e68b38c3dcbfee485e83b86a155eb8"
Sep 6 00:03:32.674820 kubelet[2774]: I0906 00:03:32.674686 2774 scope.go:117] "RemoveContainer" containerID="d18077f8a74891589a28489baecc1cf79dfcb67fb977737dc56149cf618d0d1b"
Sep 6 00:03:32.676424 containerd[1576]: time="2025-09-06T00:03:32.675867905Z" level=info msg="CreateContainer within sandbox \"e665bdb44818108392062ed3a80cc01ef969347b158429872911365b3c3914ee\" for container &ContainerMetadata{Name:tigera-operator,Attempt:1,}"
Sep 6 00:03:32.679242 containerd[1576]: time="2025-09-06T00:03:32.679172268Z" level=info msg="CreateContainer within sandbox \"4df81ee4cfb7f49a52fde4f7fbfd70b746b3dba0a0969167e43128470b2f1253\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:1,}"
Sep 6 00:03:32.698808 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1809734825.mount: Deactivated successfully.
Sep 6 00:03:32.703788 containerd[1576]: time="2025-09-06T00:03:32.703727411Z" level=info msg="CreateContainer within sandbox \"6b9a5da773a547cfd9fc54155abdaf3a6e73c6f00abf03c59222b0ca5b3b77dd\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:1,} returns container id \"406c7ed5d58d52c7d3ae68973259216d545dc90bf21323877819f9a825ae9732\""
Sep 6 00:03:32.704671 containerd[1576]: time="2025-09-06T00:03:32.704645332Z" level=info msg="StartContainer for \"406c7ed5d58d52c7d3ae68973259216d545dc90bf21323877819f9a825ae9732\""
Sep 6 00:03:32.710886 containerd[1576]: time="2025-09-06T00:03:32.710737257Z" level=info msg="CreateContainer within sandbox \"e665bdb44818108392062ed3a80cc01ef969347b158429872911365b3c3914ee\" for &ContainerMetadata{Name:tigera-operator,Attempt:1,} returns container id \"c2fc777fcf70aeadbdf1ef77e558059e5c7d10178554d9e02e8c79ddab665954\""
Sep 6 00:03:32.711432 containerd[1576]: time="2025-09-06T00:03:32.711403498Z" level=info msg="StartContainer for \"c2fc777fcf70aeadbdf1ef77e558059e5c7d10178554d9e02e8c79ddab665954\""
Sep 6 00:03:32.713539 containerd[1576]: time="2025-09-06T00:03:32.713433580Z" level=info msg="CreateContainer within sandbox \"4df81ee4cfb7f49a52fde4f7fbfd70b746b3dba0a0969167e43128470b2f1253\" for &ContainerMetadata{Name:kube-scheduler,Attempt:1,} returns container id \"52b9b41f2cb3c46c6b160c093a3f38792503662be57930fc4270740eda5dd6a5\""
Sep 6 00:03:32.714254 containerd[1576]: time="2025-09-06T00:03:32.714219620Z" level=info msg="StartContainer for \"52b9b41f2cb3c46c6b160c093a3f38792503662be57930fc4270740eda5dd6a5\""
Sep 6 00:03:32.815081 containerd[1576]: time="2025-09-06T00:03:32.815012553Z" level=info msg="StartContainer for \"406c7ed5d58d52c7d3ae68973259216d545dc90bf21323877819f9a825ae9732\" returns successfully"
Sep 6 00:03:32.824258 containerd[1576]: time="2025-09-06T00:03:32.824209641Z" level=info msg="StartContainer for \"c2fc777fcf70aeadbdf1ef77e558059e5c7d10178554d9e02e8c79ddab665954\" returns successfully"
Sep 6 00:03:32.838834 containerd[1576]: time="2025-09-06T00:03:32.838222894Z" level=info msg="StartContainer for \"52b9b41f2cb3c46c6b160c093a3f38792503662be57930fc4270740eda5dd6a5\" returns successfully"
Sep 6 00:03:35.659367 kubelet[2774]: I0906 00:03:35.659193 2774 status_manager.go:851] "Failed to get status for pod" podUID="d8f42a34ba66a7b41021f548558ce545" pod="kube-system/kube-controller-manager-ci-4081-3-5-n-8aba32846f" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.0.3:38454->10.0.0.2:2379: read: connection timed out"
Sep 6 00:03:35.667192 kubelet[2774]: E0906 00:03:35.665298 2774 event.go:359] "Server rejected event (will not retry!)" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.0.3:38332->10.0.0.2:2379: read: connection timed out" event="&Event{ObjectMeta:{kube-apiserver-ci-4081-3-5-n-8aba32846f.1862889b4a76081a kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:kube-apiserver-ci-4081-3-5-n-8aba32846f,UID:4ba089ccbafa93e097d8d8214ca67960,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Liveness probe failed: HTTP probe failed with statuscode: 500,Source:EventSource{Component:kubelet,Host:ci-4081-3-5-n-8aba32846f,},FirstTimestamp:2025-09-06 00:03:25.178472474 +0000 UTC m=+246.742096745,LastTimestamp:2025-09-06 00:03:25.178472474 +0000 UTC m=+246.742096745,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4081-3-5-n-8aba32846f,}"